# Changes

## 2020/01/21

* There used to be a distinction between two slightly different ways of installing Menhir, namely with and without `ocamlfind`. This distinction disappears. The command line switch `--suggest-ocamlfind` is deprecated and causes Menhir to print `false`.

* Menhir is now built and installed by dune. This should make life easier for developers: in particular, `make test` and `make speed` can be run straight away and do not require installing Menhir first. This should also make compilation much faster on multi-core machines. (Contributed by Nicolás Ojeda Bär, to whom many thanks are due.)

## 2019/09/24

* Build Menhir's standard library into the Menhir executable instead of storing it in a separate file `standard.mly`. This removes the need to hardcode the path to this file into the Menhir executable. This also removes the need for the command line switch `--stdlib`, which remains supported but is now ignored, and for the environment variable `$MENHIR_STDLIB`, which is now ignored. A positive side effect of this change is that the full path of the file `standard.mly` no longer appears in generated parsers; this removes a source of spurious variation. (Suggested and implemented by Nicolás Ojeda Bär.)
## 2019/06/20

* When compiled with OCaml 4.02.3, Menhir could produce OCaml code containing invalid string literals. This was due to a problem in `String.escaped`. Fixed. (Reported by ELLIOTCABLE.)

## 2019/06/13

* Relax the syntax of point-free actions to allow `< >` (with arbitrary whitespace inside the angle brackets) instead of just `<>`. (Suggested by Lélio Brun.)

* When a cycle of `%inline` nonterminal symbols is encountered, the error message now shows the entire cycle, as opposed to just one symbol that participates in the cycle.

* Fix the treatment of the `error` token when printing the grammar for `ocamlyacc`. Its semantic value must not be referred to; a unit value must be used instead. The switch `--only-preprocess-for-ocamlyacc` remains undocumented. (Reported by kris.)

* Coq back-end: multiple changes to stay up-to-date with respect to coq-menhirlib. See [coq-menhirlib/CHANGES.md](coq-menhirlib/CHANGES.md).

* Coq back-end: the generated parser now contains a dedicated inductive type for tokens. This removes the need for `Obj.magic` in client code when the parser is used via extraction.

* Coq back-end: the generated parser checks that the version of MenhirLib matches. This check can be disabled with `--coq-no-version-check`.

* Coq back-end: the fuel parameter is now given as the *logarithm* of the maximum number of steps to perform. Therefore, using, e.g., 50 makes sure we will not run out of fuel in any reasonable computation time.

## 2018/11/13

* In `.mly` files, a new syntax for rules has been introduced, which is slightly more pleasant than the old syntax. (A rule is the definition of a nonterminal symbol.) The old syntax remains available; the user chooses between the two syntaxes on a per-rule basis. The new syntax is fully documented in the manual; [a brief summary of the differences](doc/new-rule-syntax-summary.md) with respect to the old syntax is also available.
  **The new syntax is considered experimental** and is subject to change in the near future.

* In the Coq back-end, avoid pattern-matching at type `int31`, which will disappear in future versions of Coq. Instead, convert `int31` to `Z`, and perform pattern matching in `Z`. (Reported by Vincent Laporte, implemented by Jacques-Henri Jourdan.)

* Implement a more economical renaming scheme for OCaml variables during the elimination of `%inline` symbols. This leads to slightly more readable code (more reasonable variable names, fewer `let` bindings).

* Another attempt at removing all trailing spaces in auto-generated `.messages` files. (I hope I got it right, this time.)

## 2018/10/26

* A new syntactic sugar facility, "token aliases", has been added. The declaration of a terminal symbol may also declare an alias, which takes the form of a name between double quotes, as in `%token PLUS "+"`. Thereafter, `"+"` may be used freely in place of `PLUS` throughout the grammar. This makes it slightly easier to read grammars. (Contributed by Perry E. Metzger.)

* Until today, the semicolon character `;` was insignificant: it was considered as whitespace by Menhir. It is now accepted only in a few specific places, namely: after a declaration; after a rule; after a producer. If Menhir suddenly complains about a semicolon, just remove it. This change is being made in preparation for further syntactic changes.

* New flag `--no-dollars`, which disallows the use of `$i` in semantic actions. The default behavior remains to allow the use of `$i`.

* When generating OCaml code, include all record fields in record patterns, even when bound to a wildcard pattern. Thus, avoid triggering OCaml's warning 9.

## 2018/10/06

* Standard library: add `rev`, `flatten`, `append`. Add a link from the manual to `standard.mly` in the repository.

* Update the manual to explain how to use `dune` and `menhir` together.

* Install `.cmxs` files for menhirLib and menhirSdk.
* Remove all references to `Pervasives` in the generated OCaml code. These references were infrequent anyway, and `Pervasives` is about to be deprecated in OCaml 4.08, it seems.

* In `--interpret` mode, print `Ready!` once ready to accept input.

* At verbosity level `-lg 2`, for each nonterminal symbol `N`, display a sentence (composed of terminal symbols) of minimal length generated by `N`.

* When writing a `.cmly` file, open it in binary mode. This should eliminate the failure that was observed under Windows: `output_value: not a binary channel`. (Reported by Bryan Phelps. A fix was present in the `mingw` patches for Menhir.)

* Change the logic used in the root `Makefile` to deal with Unix and Windows in a uniform way. (Also inspired by the `mingw` patches for Menhir.)

* Coq back-end: add a few newlines in the generated file for readability. (Suggested by Bernhard Schommer.)

* Remove the trailing space at the end of every sentence in auto-generated `.messages` files. (Suggested by Xavier Leroy.)

## 2018/09/05

* When `--explain` is enabled, always create a fresh `.conflicts` file (wiping out any pre-existing file), even if there are in fact no conflicts. This should avoid confusion with outdated `.conflicts` files.

* Fix several bugs in the treatment of `--strict`. No `.conflicts` file was created when `--strict` and `--explain` were both enabled. Also, some warnings were not properly turned into errors by `--strict`.

## 2018/07/04

* Update the `man` page, which was woefully out of date.

## 2018/07/03

* New location keywords. `$loc` is sugar for the pair `($startpos, $endpos)`. `$loc(x)` is sugar for the pair `($startpos(x), $endpos(x))`. `$sloc` is sugar for the pair `($symbolstartpos, $endpos)`. (Contributed by Nicolás Ojeda Bär.)

## 2018/06/08

* Add two new parameterized nonterminal symbols, `endrule(X)` and `midrule(X)`, to the standard library.
  These symbols have been available since 2015/02/11 under the names `anonymous(X)` and `embedded(X)`, but were not yet documented. `endrule(X)` and `midrule(X)` are now documented, while `anonymous(X)` and `embedded(X)` remain present but are deprecated.

## 2018/05/30

* In `--coq` mode, Menhir now produces references to `MenhirLib.Grammar` instead of just `Grammar`, and similarly for all modules in Menhir's support library.

* New command line option `--coq-lib-no-path` to suppress the above behavior and retain the previous (now-deprecated) behavior, that is, produce unqualified references to the modules in Menhir's support library.

* New command line option `--coq-lib-path <path>` to indicate under what name (or path) the support library has been installed. Its default value is `MenhirLib`.

## 2018/05/23

* New commands `--infer-write-query`, `--infer-read-reply`, and `--infer-protocol-supported`. These commands remove the need for Menhir to invoke `ocamlc` and `ocamldep` behind the scenes, and make it easier to write correct build rules for Menhir projects. The command line options `--infer`, `--raw-depend` and `--depend` remain supported, but are no longer preferred. (Suggested by Fabrice Le Fessant.)

* Remove the warning that was issued when `%inline` was used but `--infer` was turned off. Most people should use a build system that knows how to enable OCaml type inference, such as `ocamlbuild` or `dune`.

* New HTML rendering of the manual, available both online and as part of Menhir's installation. (Implemented by Gabriel Scherer.)

## 2017/12/22

* Add a flag `--unused-precedence-levels` to suppress all warnings about useless `%left`, `%right`, `%nonassoc` and `%prec` declarations. (Suggested by Zachary Tatlock.)

## 2017/12/06

* Fix the termination test that takes place before parameterized symbols are expanded away.
  The previous test was both unsound (it would accept grammars whose expansion did not terminate) and incomplete (it would reject grammars whose expansion did terminate). The new test is believed to be sound and complete. (Thanks to Martin Bodin for prompting us to look into this issue.)

## 2017/11/12

* Documentation: clarify the fact that `%type` declarations should carry types whose meaning does not depend on the headers `%{ ... %}`.

## 2017/10/13

* Remove the OCaml version check at installation time, for greater simplicity, and because for some reason it did not work properly under Cygwin. (Reported by Andrew Appel.)

## 2017/09/26

* `Makefile` fix: when determining whether the suffix `.exe` should be used, one should test whether the OS is Windows, not whether the compiler is MSVC. (Suggested by Jonathan Protzenko.)

## 2017/07/12

* Include the LaTeX sources of the manual in the official `.tar.gz` archive. This should allow the manual to be included as part of the Debian package.

* Mention [Obelisk](https://github.com/Lelio-Brun/Obelisk), a pretty-printer for `.mly` files, in the manual.

## 2017/06/07

* Removed an undeclared dependency of MenhirSdk on Unix. (Reported and fixed by Frédéric Bour.)

## 2017/05/09

* Menhir now always places OCaml line number directives in the generated `.ml` file. (Until now, this was done only when `--infer` was off.) Thus, if a semantic action contains an `assert` statement, the file name and line number information carried by the `Assert_failure` exception should now be correct. (Reported by Helmut Brandl.)

## 2017/04/18

* Changed Menhir's license from QPL to GPLv2. MenhirLib remains under LGPLv2, with a linking exception.

* Moved the repository to [gitlab.inria.fr](https://gitlab.inria.fr/fpottier/menhir/).

* Introduced a new command line switch, `--cmly`, which causes Menhir to create a `.cmly` file, containing a description of the grammar and automaton. (Suggested by Frédéric Bour.)
* Introduced a new library, MenhirSdk, which allows reading a `.cmly` file. The purpose of this library is to allow external tools to take advantage of the work performed by Menhir's front-end. (Suggested by Frédéric Bour.)

* Introduced new syntax for attributes in a `.mly` file. Attributes are ignored by Menhir's back-ends, but are written to `.cmly` files, thus can be exploited by external tools via MenhirSdk. (Suggested by Frédéric Bour.)

* The definition of a `%public` nonterminal symbol can now be split into several parts within a single `.mly` file. (This used to be permitted only over multiple `.mly` files.) (Suggested by Frédéric Bour.)

* New functions in the incremental API: `shifts`, `acceptable`, `current_state_number`.

* New functions in the incremental API and inspection API: `top`, `pop`, `pop_many`, `get`, `equal`, `force_reduction`, `feed`, `input_needed`, `state_has_default_reduction`, `production_index`, `find_production`. (Suggested by Frédéric Bour.)

* New module `MenhirLib.ErrorReports`. This module is supposed to offer auxiliary functions that help produce good syntax error messages. This module does not yet contain much functionality and is expected to evolve in the future.

* Incompatible change in the incremental API: the type `env` becomes `'a env`.

* Incompatible change in the incremental API: the function `has_default_reduction` is renamed `env_has_default_reduction`.

* The type `stack` and the function `stack` in the incremental API are deprecated. The new functions `top` and `pop` can be used instead to inspect the parser's stack. The module `MenhirLib.General` is deprecated as well. Deprecated functionality will be removed in the future.

* Incompatible change in the incremental API: the type of the function `print_stack` in the result signature of the functor `MenhirLib.Printers.Make` changes to `'a env -> unit`. (Anyway, as of now, `MenhirLib.Printers` remains undocumented.)
* Improved the syntax error message that is displayed when a `.mly` file is incorrect: the previous and next token are shown.

* Fixed a bug where the module name `Basics` was shadowed (that is, if the user's project happened to contain a toplevel module by this name, then it could not be referred to from a `.mly` file). (Reported by François Thiré.)

## 2017/01/01

* Add `$MENHIR_STDLIB` as a way of controlling where Menhir looks for the file `standard.mly`. This environment variable overrides the installation-time default setting, and is itself overridden by the `--stdlib` command line switch. (Requested by Jonathan Protzenko.)

* `Makefile` fix: filter out `'\r'` in the output of `menhir --suggest-ocamlfind`, so that the `Makefile` works when Menhir is compiled as a Windows executable. (Suggested by Jonathan Protzenko.)

## 2016/12/01

* Updated the Coq back-end for compatibility with Coq 8.6. (Jacques-Henri Jourdan.)

## 2016/11/15

* Fix in `--only-preprocess-for-ocamlyacc` mode: avoid printing newline characters inside a `%type` declaration, as this is forbidden by `ocamlyacc`. (Reported by Kenji Maillard.)

* Fix in `--only-preprocess-for-ocamlyacc` mode: avoid variable capture caused by `ocamlyacc` internally translating `$i` to `_i`. (Reported by Kenji Maillard.)

## 2016/09/01

* New command line switch `--only-preprocess-for-ocamlyacc`, supposed to print the grammar in a form that `ocamlyacc` can accept. As of now, this feature is incomplete (in particular, support for Menhir's position keywords is missing), untested, and undocumented. It could be removed in the future.

## 2016/08/26

* Fixes in the output of `--only-preprocess`:
  * The order of productions is now preserved. (It was not. This matters if there are reduce/reduce conflicts.)
  * `%parameter` directives are now printed. (They were not.)
  * `%on_error_reduce` directives are now printed. (They were not.)
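As a small illustration of the token-alias facility described in the 2018/10/26 entry above, here is a sketch of a grammar fragment; all token and nonterminal names are made up for the example:

```
%token <int> INT
%token PLUS "+"    (* declares the alias "+" for the token PLUS *)
%token EOF

%start <int> main

%%

main:
| e = expr EOF           { e }

expr:
| i = INT                { i }
| e = expr "+" i = INT   { e + i }   (* "+" stands for PLUS here *)
```

The alias is pure syntactic sugar: the generated parser still refers to the token as `PLUS`.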
## 2016/08/25

* `Makefile` fix, undoing a change made on 2016/03/03, which caused installation to fail under (some versions of?) Windows where dynamic linking is not supported. (Reported by Andrew Appel.)

## 2016/08/05

* `%on_error_reduce` declarations now have implicit priority levels, so as to tell Menhir what to do when two such declarations are applicable. Also, the well-formedness checks on `%type` and `%on_error_reduce` declarations have been reinforced.

## 2016/06/23

* A small change in the generated code (both in the code and table back-ends) so as to avoid OCaml's warning 41. The warning would arise (when compiling a generated parser with OCaml 4.03) because Menhir's exception `Error` has the same name as the data constructor `Error` in OCaml's pervasives library. (Reported by Bernhard Schommer.)

## 2016/05/18

* Anonymous rules now work also when used inside a parameterized rule. (This did not work until now.) When an anonymous rule is hoisted out of a parameterized rule, it may itself become parameterized. Menhir parameterizes it only over the parameters that it actually needs.

## 2016/05/04

* In the Coq back-end, split the largest definitions into smaller ones. This circumvents a limitation of `vm_compute` on 32-bit machines. This also enables us to perform sharing between definitions, so that the generated files are much smaller.

## 2016/04/10

* When printing a grammar (which is done by the `--only-preprocess` options), remove the leading bar `|`, for compatibility with `yacc` and `bison`.

## 2016/03/11

* In the code back-end, generate type annotations when extracting a semantic value out of the stack. When working with a semantic value of some function type, OCaml would incorrectly warn that this function does not use its argument. This warning should now be gone.

## 2016/03/03

* `Makefile` changes, so as to support `ocamlbuild` 4.03, which seems to have stricter hygiene rules than previous versions.
## 2015/12/30

* Prevented an incorrect installation that would take place if `USE_OCAMLFIND` was given during `make all` but not during `make install`. Added a command line directive `--suggest-ocamlfind`.

## 2015/11/11

* Fixed a severe bug in Menhir 20151110 which (when using the code back-end) could cause a generated parser to crash. Thanks to ygrek for reporting the bug.

* The code produced by version `XXXXXXXX` of `menhir --table` can now be linked only against a matching version of MenhirLib. If an incorrect version of MenhirLib is installed, the OCaml compiler should complain that `MenhirLib.StaticVersion.require_XXXXXXXX` is undefined.

## 2015/11/10

* Optimized the computation of `$symbolstartpos`, based on a couple of assumptions about the lexer. (See the manual.)

## 2015/11/04

* Modified the treatment of `%inline` so that the positions that are computed are the same, regardless of whether `%inline` is used. This property did not hold until now. It now does. Of course, this means that the positions computed by the new Menhir are not the same as those computed by older versions of Menhir.

* Fixed a bug in the treatment of `%inline` that would lead to an incorrect position being computed when the caller and callee had a variable by the same name.

* Modified Menhir so as to compute the start and end positions in the exact same way as `ocamlyacc`. (There used to be a difference in the treatment of epsilon productions.) Of course, this means that the positions computed by the new Menhir are not the same as those computed by older versions of Menhir. Added the keyword `$symbolstartpos` so as to simulate `Parsing.symbol_start_pos()` in the `ocamlyacc` world. The keyword `$startpos` sometimes produces a position that is too far off to the left; `$symbolstartpos` produces a more accurate position.
* Incompatible change of the incremental API: instead of a unit argument, the entry points (which are named after the start symbols) now require an initial position, which typically should be `lexbuf.lex_curr_p`.

## 2015/11/03

* Fix-fix-and-re-fix the `Makefile` in an attempt to allow installation under opam/Windows. Thanks to Daniel Weil for patient explanations and testing.

## 2015/10/29

* MenhirLib is now installed in both binary and source forms. `menhir --suggest-menhirLib` reports where MenhirLib is installed. This can be used to retrieve a snapshot of MenhirLib in source form and include it in your project (if you wish to use `--table` mode, yet do not wish to have a dependency on MenhirLib).

## 2015/10/26

* Allow `--list-errors` to work on 32-bit machines (with low hard limits). This should fix a problem whereby the 2015/10/23 release could not bootstrap on a 32-bit machine.

## 2015/10/23

* New declaration `%on_error_reduce foo`, where `foo` is a nonterminal symbol. This modifies the automaton as follows. In every state where a production of the form `foo -> ...` is ready to be reduced, every error action is replaced with a reduction of this production. (If there is a conflict between several productions that could be reduced in this manner, nothing is done.) This does not affect the language that is accepted by the automaton, but delays the detection of an error: more reductions take place before the error is detected.

* Fixed a bug whereby Menhir would warn about a useless `%prec` declaration, even though it was useful. This would happen when the declaration was duplicated (by inlining or by macro-expansion) and some but not all of the copies were useful.

* Added `has_default_reduction` to the incremental API.

* Modified the meaning of `--canonical` to allow default reductions to take place. This implies no loss of precision in terms of lookahead sets, and should allow gaining more contextual information when a syntax error is encountered.
  (It should also lead to a smaller automaton.)

* A brand new set of tools to work on syntax errors:
  * New command `--list-errors`, which produces a list of input sentences which are representative of all possible syntax errors. (Costly.)
  * New command `--interpret-error`, which confirms that one particular input sentence ends in a syntax error, and prints the number of the state in which this error occurs.
  * New command `--compile-errors`, which compiles a list of erroneous sentences (together with error messages) to OCaml code.
  * New command `--compare-errors`, which compares two lists of erroneous sentences to check if they cover the same error states.
  * New command `--update-errors`, which updates the auto-generated comments in a list of erroneous sentences.
  * New command `--echo-errors`, which removes all comments and messages from a list of erroneous sentences, and echoes just the sentences.

## 2015/10/16

* Additions to the incremental API:
  * A `supplier` is a function that produces tokens on demand.
  * `lexer_lexbuf_to_supplier` turns a lexer and a lexbuf into a supplier.
  * `loop` is a ready-made main parsing loop.
  * `loop_handle` is a variant that lets the user do her own error handling.
  * `loop_handle_undo` is a variant that additionally allows undoing the last few "spurious" reductions.
  * `number` maps a state of the LR(1) automaton to its number.

* Incompatible change of the incremental API: renamed the type `'a result` to `'a checkpoint`. This is a better name anyway, and should help avoid confusion with the type `'a result` introduced in OCaml 4.03.

## 2015/10/12

* Avoid using `$(shell pwd)` in `Makefile`, for better Windows compatibility.

## 2015/10/05

* Fixed a bug where inconsistent OCaml code was generated when `--table` and `--external-tokens` were used together. (Reported by Darin Morrison.)

* In `--infer` mode, leave the `.ml` file around (instead of removing it) if `ocamlc` fails, so we have a chance to understand what's wrong.
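To make the `%on_error_reduce` declaration from the 2015/10/23 entry above concrete, a grammar might contain a declaration like the following; the nonterminal name `expr` is hypothetical:

```
%on_error_reduce expr

(* In every state where a production of the form expr -> ... is
   ready to be reduced, error actions are replaced with that
   reduction, so a syntax error is detected later, after more
   reductions have taken place. The accepted language is
   unchanged; only the point of error detection moves. *)
```

This is typically used to ensure that, when an error occurs, the stack holds a fully reduced `expr`, which makes error messages easier to write.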
## 2015/09/21

* Re-established some error messages concerning the mis-use of `$i` which had disappeared on 2015/06/29.

## 2015/09/11

* Fixed the mysterious message that would appear when a nonterminal symbol begins with an uppercase letter and `--infer` is turned on. Clarified the documentation to indicate that a (non-start) nonterminal symbol can begin with an uppercase letter, but this is not recommended.

## 2015/08/27

* New option `--inspection` (added last January, documented only now). This generates an inspection API which allows inspecting the automaton's stack, among other things. This API can in principle be used to write custom code for error reporting, error recovery, etc. It is not yet mature and may change in the future.

## 2015/07/20

* Added the command line options `--unused-token <token>` and `--unused-tokens`.

## 2015/06/29

* Changed the treatment of the positional keywords `$i`. They are now rewritten into variables of the form `_i` where `i` is an integer. Users are advised not to use variables of this form inside semantic actions.

## 2015/02/11

* Added support for anonymous rules. This allows writing, e.g., `list(e = expression SEMI { e })` whereas previously one should have written `list(terminated(e, SEMI))`.

## 2015/02/09

* Moved all of the demos to `ocamlbuild` (instead of `make`).

## 2015/01/18

* Incompatible change of the incremental API. The incremental API now exposes shift events too.

## 2015/01/16

* Fixed a couple of bugs in `Makefile` and `src/Makefile` which would cause compilation and installation to fail with `TARGET=byte`. (Reported by Jérémie Courrèges-Anglas and Daniel Dickman.)

## 2015/01/01

* Incompatible change of the incremental API. The entry point `main_incremental` is now named `Incremental.main`.

## 2014/12/29

* Incompatible change of the incremental API:
  * The API now exposes reduction events.
  * The type `'a result` is now private.
  * The type `env` is no longer parameterized.
  * `handle` is renamed to `resume`.
  * `offer` and `resume` now expect a result, not an environment.

## 2014/12/22

* Documented the Coq back-end (designed and implemented by Jacques-Henri Jourdan).

## 2014/12/15

* New incremental API (in `--table` mode only), inspired by Frédéric Bour.

## 2014/12/12

* A speed improvement in the code back-end.

## 2014/12/11

* Menhir now reports an error if one of the start symbols produces either the empty language or the singleton language {epsilon}.

* Although some people out there actually define a start symbol that recognizes {epsilon} (and use it as a way of initializing or re-initializing some global state), this is considered bad style. Furthermore, by ruling out this case, we are able to simplify the table back-end a little bit.

## 2014/12/08

* Menhir now requires OCaml 4.02 (instead of 3.09).

## 2014/12/02

* Removed support for the `$previouserror` keyword.

* Removed support for `--error-recovery` mode.

## 2014/02/18

* In the Coq back-end, use `'` instead of `_` as separator in identifiers. Also, correct a serious bug that was inadvertently introduced on 2013/03/01 (r319).

## 2014/02/14

* Lexer fix so as to support an open variant type `[> ...]` within a `%type<...>` declaration.

## 2013/12/16

* Updated the `Makefile` so that `install` no longer depends on `all`.

* Updated the demos so that the lexer does not invoke `exit 0` when encountering `eof`. (This should be more intuitive.)

## 2013/09/11

* Fixed a newline conversion problem that would prevent Menhir from building on Windows when using OCaml 4.01.

## 2013/03/02

* Switched to `ocamlbuild`. Many thanks to Daniel Weil for offering very useful guidance.

## 2013/01/16

* `menhir --depend` was broken since someone added new whitespace in the output of `ocamldep`. Fixed.

## 2012/12/19

* Fixed a compilation problem that would arise when a file produced by Menhir on a 64-bit platform was compiled by OCaml on a 32-bit platform.
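The anonymous-rule syntax from the 2015/02/11 entry above lets one write a one-off rule inline, where a symbol is expected. For instance, under hypothetical names `stmts`, `expr`, and `SEMI`, one may write:

```
stmts:
| l = list(e = expr SEMI { e })   { l }
  (* equivalent to: l = list(terminated(expr, SEMI)) *)
```

The anonymous rule `e = expr SEMI { e }` is hoisted into a fresh nonterminal symbol behind the scenes, so no separate named rule needs to be declared.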
## 2012/08/25

* Performance improvements in the computation of various information about the automaton (module `Invariant`). The improvements will be noticeable only for very large automata.

## 2012/06/07

* The option `--log-grammar 3` (and above) now causes the `FOLLOW` sets for terminal symbols to be computed and displayed.

## 2012/05/25

* Added the flag `--canonical`, which causes Menhir to produce a canonical LR(1) automaton in the style of Knuth. This means that no merging of states takes place during the construction of the automaton, and that no default reductions are allowed.

## 2012/01/23

* Fixed a bug whereby a `%nonassoc` declaration was not respected. This declaration requests that a shift/reduce conflict be reduced in favor of neither shifting nor reducing, that is, a syntax error must occur. However, due to an unforeseen interaction with the default reduction mechanism, this declaration was sometimes ignored and reduction would take place.

## 2012/01/09

* Changes in the (undocumented) Coq back-end so as to match the ESOP 2012 paper.

## 2011/10/19

* The `Makefile` now tests whether Unix or Windows is used (the test is performed by evaluating `Sys.os_type` under `ocaml`) and changes a couple of settings accordingly:
  * the executable file name is either `menhir` or `menhir.exe`;
  * the object file suffix is either `.o` or `.obj`.

* Added `--strict`, which causes many warnings about the grammar and about the automaton to be considered errors.

* The `#` annotations that are inserted in the generated `.ml` file now retain their full path. (That is, we no longer use `Filename.basename`.) This implies that the `#` annotations depend on how Menhir is invoked -- e.g., `menhir foo/bar.mly` and `cd foo && menhir bar.mly` will produce different results. Nevertheless, this seems reasonable and useful (e.g., in conjunction with `ocamlbuild` and a hierarchy of files). Thanks to Daniel Weil.
## 2011/10/06

* With the `-lg 1` switch, Menhir now indicates whether the grammar is SLR(1).

## 2011/05/24

* Removed the lock in `ocamldep.wrapper`. It is the responsibility of the user to avoid interferences with other processes (or other instances of the script) that create and/or remove files.

## 2011/04/28

* The (internal) computation of the automaton's invariant was broken and has been fixed. Surprisingly, this does not seem to affect the generated code (which was correct), so no observable bug is fixed. Hopefully no bug is introduced!

## 2011/04/07

* The grammar description files (`.mly`) are now read in up front and stored in memory while they are parsed. This allows us to avoid the use of `pos_in` and `seek_in`, which do not work correctly when CRLF conversion is being performed.

## 2011/04/05

* Fixed a bug in the type inference module (for parameterized non-terminals) which would cause an infinite loop.

## 2011/01/24

* Fixed a bug that would cause an assertion failure in the generated parser in some situations where the input stream was incorrect and the grammar involved the `error` token. The fix might cause grammars that use the `error` token to behave differently (hopefully more accurately) as of now.

## 2009/06/18

* `Makefile` changes: build and install only the bytecode version of MenhirLib when `TARGET=byte` is set.

## 2009/02/06

* Fixed `ocamldep.wrapper` to avoid quoting the name of the `ocaml` command. This is hoped to fix a compilation problem under MinGW.

## 2009/02/04

* A `Makefile` fix to avoid a problem under Windows/Cygwin.

* Renamed the `ocaml-check-version` script so as to avoid a warning.

## 2008/09/05

* OCaml summer project: added `--interpret`, `--table`, and `--suggest-*`.

## 2008/08/06

* Fixed a problem that would cause the code inliner to abort when a semantic value and a non-terminal symbol happened to have the same name.

* Removed code sharing.

## 2008/06/20

* Removed an incorrect assertion that caused failures (`lr1.ml`, line 134).
## 2007/12/05

* Disabled code sharing by default, as it is currently broken. (See Yann's message; assertion failure at runtime.)

## 2007/12/01

* Added an optimization to share code among states that have identical outgoing transition tables.

## 2007/08/30

* Small `Makefile` change: create an executable file for `check-ocaml-version` in order to work around the absence of dynamic loading on some platforms.

## 2007/05/20

* Made a fundamental change in the construction of the LR(1) automaton in order to eliminate a bug that could lead to spurious conflicts -- thanks to Ketti for submitting a bug report.

## 2007/05/18

* Added `--follow-construction` to help understand the construction of the LR(1) automaton (very verbose).

## 2007/05/11

* Code generation: more explicit qualifications with `Pervasives` so as to avoid capture when the user redefines some of the built-in operators, such as `(+)`.

* Added a new demo (`calc-param`) that shows how to use `%parameter`.

## 2007/03/22

* `Makefile` improvements (check for `PREFIX`; bootstrap in bytecode now also available). Slight changes to `OMakefile.shared`.

## 2007/02/15

* Portability fix in `Makefile` and `Makefile.shared` (avoided `which`).

## 2006/12/15

* Portability fix in `Makefile.shared` (replaced `&>` with `2>&1 >`).

## 2006/06/23

* Made a slight restriction to Pager's criterion so as to never introduce fake conflict tokens (see `Lr0.compatible`). This might help make conflict explanations more accurate in the future.

## 2006/06/16

* Fixed a bug that would cause positions to become invalid after inlining.

## 2006/06/15

* Fixed `--depend` to be more lenient when analyzing `ocamldep`'s output.

* Added `--raw-depend`, which transmits `ocamldep`'s output unchanged (for use in conjunction with `omake`).

## 2006/06/12

* Fixed a bug that would cause `--only-preprocess` to print `%token` declarations also for pseudo-tokens.

* Fixed a bug that caused some precedence declarations to be incorrectly reported as useless.
* Improved things so that useless pseudo-tokens now also cause warnings.
* Fixed bug that would cause `%type` directives for terminal symbols to be
  incorrectly accepted.
* Fixed bug that would occur when a semantic action containing `$i` keywords
  was inlined.

## 2006/05/05

* Fixed problem that caused some end-of-stream conflicts not to be reported.
* Fixed Pager's compatibility criterion to avoid creating end-of-stream
  conflicts.

## 2006/04/21

* Fixed problem that allowed generating incorrect but apparently well-typed
  Objective Caml code when a semantic action was ill-typed and `--infer` was
  omitted.

## 2006/03/29

* Improved conflict reports by factoring out maximal common derivation
  contexts.

## 2006/03/28

* Fixed bug that could arise when explaining a conflict in a non-LALR(1)
  grammar.

## 2006/03/27

* Changed count of reduce/reduce conflicts to allow a comparison with
  `ocamlyacc`'s diagnostics.
* When refusing to resolve a conflict, report all diagnostics before dying.

## 2006/03/18

* Added display of `FOLLOW` sets when using `--log-grammar 2`.
* Added `--graph` option.
* Fixed behavior of `--depend` option.

## 2006/01/06

* Removed reversed lists from the standard library.

menhir-20200123/INSTALLATION.md

# Installation

## Requirements

You need OCaml (version 4.02.3 or later) and `dune` (version 2.0 or later).

## Compilation and Installation

Compile and install as follows:

```
make all      # or: dune build @install
make install  # or: dune install
```

The executable file `menhir` and the libraries `MenhirLib` and `MenhirSdk` are
installed by `dune`. `dune` usually figures out by itself where they should be
installed. If desired, a `--prefix` option can be passed to `dune`.

## Coq support

If you wish to use Menhir's Coq back-end, which produces verified parsers, then
you must install the Coq library `coq-menhirlib`.
This is normally done via the following commands:

```
opam repo add coq-released https://coq.inria.fr/opam/released
opam install coq-menhirlib
```

The library can also be manually installed as follows:

```
cd coq-menhirlib
make
make install
```

menhir-20200123/LICENSE

In the following, "THE LIBRARY" refers to the following files:

1- the file src/standard.mly;
2- the OCaml source files whose basename appears in the file
   src/menhirLib.mlpack and whose extension is ".ml" or ".mli".

"THE COQ LIBRARY" refers to the files in the subdirectory coq-menhirlib/.

"THE GENERATOR" refers to the files of this archive which are neither part of
THE LIBRARY, nor part of THE COQ LIBRARY, nor in the subdirectory test/. Files
in the subdirectory test/ are not covered by this license.

THE GENERATOR is distributed under the terms of the GNU General Public License
version 2 (included below).

THE LIBRARY is distributed under the terms of the GNU Library General Public
License version 2 (included below).

THE COQ LIBRARY is distributed under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version. Version 3 of the GNU Lesser
General Public License is included in the file coq-menhirlib/LICENSE.

As a special exception to the GNU Library General Public License, you may link,
statically or dynamically, a "work that uses the Library" with a publicly
distributed version of the Library to produce an executable file containing
portions of the Library, and distribute that executable file under terms of
your choice, without any of the additional requirements listed in clause 6 of
the GNU Library General Public License.
By "a publicly distributed version of the Library", we mean either the
unmodified Library as distributed by INRIA, or a modified version of the
Library that is distributed under the conditions defined in clause 3 of the GNU
Library General Public License. This exception does not however invalidate any
other reasons why the executable file might be covered by the GNU Library
General Public License.

----------------------------------------------------------------------

GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

Everyone is permitted to copy and distribute verbatim copies of this license
document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your freedom to share
and change it. By contrast, the GNU General Public License is intended to
guarantee your freedom to share and change free software--to make sure the
software is free for all its users. This General Public License applies to most
of the Free Software Foundation's software and to any other program whose
authors commit to using it. (Some other Free Software Foundation software is
covered by the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not price. Our
General Public Licenses are designed to make sure that you have the freedom to
distribute copies of free software (and charge for this service if you wish),
that you receive source code or can get it if you want it, that you can change
the software or use pieces of it in new free programs; and that you know you
can do these things.

To protect your rights, we need to make restrictions that forbid anyone to deny
you these rights or to ask you to surrender the rights. These restrictions
translate to certain responsibilities for you if you distribute copies of the
software, or if you modify it.
For example, if you distribute copies of such a program, whether gratis or for
a fee, you must give the recipients all the rights that you have. You must make
sure that they, too, receive or can get the source code. And you must show them
these terms so they know their rights.

We protect your rights with two steps: (1) copyright the software, and (2)
offer you this license which gives you legal permission to copy, distribute
and/or modify the software.

Also, for each author's protection and ours, we want to make certain that
everyone understands that there is no warranty for this free software. If the
software is modified by someone else and passed on, we want its recipients to
know that what they have is not the original, so that any problems introduced
by others will not reflect on the original authors' reputations.

Finally, any free program is threatened constantly by software patents. We wish
to avoid the danger that redistributors of a free program will individually
obtain patent licenses, in effect making the program proprietary. To prevent
this, we have made it clear that any patent must be licensed for everyone's
free use or not licensed at all.

The precise terms and conditions for copying, distribution and modification
follow.

GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License applies to any program or other work which contains a notice
placed by the copyright holder saying it may be distributed under the terms of
this General Public License. The "Program", below, refers to any such program
or work, and a "work based on the Program" means either the Program or any
derivative work under copyright law: that is to say, a work containing the
Program or a portion of it, either verbatim or with modifications and/or
translated into another language. (Hereinafter, translation is included without
limitation in the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. 
c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. 
You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. 
If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. 
If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. 
The Free Software Foundation may publish revised and/or new versions of the
General Public License from time to time. Such new versions will be similar in
spirit to the present version, but may differ in detail to address new problems
or concerns.

Each version is given a distinguishing version number. If the Program specifies
a version number of this License which applies to it and "any later version",
you have the option of following the terms and conditions either of that
version or of any later version published by the Free Software Foundation. If
the Program does not specify a version number of this License, you may choose
any version ever published by the Free Software Foundation.

10. If you wish to incorporate parts of the Program into other free programs
whose distribution conditions are different, write to the author to ask for
permission. For software which is copyrighted by the Free Software Foundation,
write to the Free Software Foundation; we sometimes make exceptions for this.
Our decision will be guided by the two goals of preserving the free status of
all derivatives of our free software and of promoting the sharing and reuse of
software generally.

NO WARRANTY

11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR
THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE
STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE
PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND
PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU
ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

12.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY
COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE
PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR
INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA
BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER
OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest possible
use to the public, the best way to achieve this is to make it free software
which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest to attach
them to the start of each source file to most effectively convey the exclusion
of warranty; and each file should have at least the "copyright" line and a
pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 2 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License along
    with this program; if not, write to the Free Software Foundation, Inc.,
    51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this when it
starts in an interactive mode:

    Gnomovision version 69, Copyright (C) year name of author
    Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may be
called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.

You should also get your employer (if you work as a programmer) or your school,
if any, to sign a "copyright disclaimer" for the program, if necessary. Here is
a sample; alter the names:

    Yoyodyne, Inc., hereby disclaims all copyright interest in the program
    `Gnomovision' (which makes passes at compilers) written by James Hacker.

    <signature of Ty Coon>, 1 April 1989
    Ty Coon, President of Vice

This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may consider
it more useful to permit linking proprietary applications with the library. If
this is what you want to do, use the GNU Lesser General Public License instead
of this License.

----------------------------------------------------------------------

GNU LIBRARY GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1991 Free Software Foundation, Inc.
59 Temple Place - Suite 330, Boston, MA 02111-1307, USA

Everyone is permitted to copy and distribute verbatim copies of this license
document, but changing it is not allowed.

[This is the first released version of the library GPL. It is numbered 2
because it goes with version 2 of the ordinary GPL.]

Preamble

The licenses for most software are designed to take away your freedom to share
and change it.
By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This license, the Library General Public License, applies to some specially designated Free Software Foundation software, and to any other libraries whose authors decide to use it. You can use it for your libraries, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things. To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library, or if you modify it. For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link a program with the library, you must provide complete object files to the recipients so that they can relink them with the library, after making changes to the library and recompiling it. And you must show them these terms so they know their rights. Our method of protecting your rights has two steps: (1) copyright the library, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the library. Also, for each distributor's protection, we want to make certain that everyone understands that there is no warranty for this free library. 
If the library is modified by someone else and passed on, we want its recipients to know that what they have is not the original version, so that any problems introduced by others will not reflect on the original authors' reputations. Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that companies distributing free software will individually obtain patent licenses, thus in effect transforming the program into proprietary software. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License, which was designed for utility programs. This license, the GNU Library General Public License, applies to certain designated libraries. This license is quite different from the ordinary one; be sure to read it in full, and don't assume that anything in it is the same as in the ordinary license. The reason we have a separate public license for some libraries is that they blur the distinction we usually make between modifying or adding to a program and simply using it. Linking a program with a library, without changing the library, is in some sense simply using the library, and is analogous to running a utility program or application program. However, in a textual and legal sense, the linked executable is a combined work, a derivative of the original library, and the ordinary General Public License treats it as such. Because of this blurred distinction, using the ordinary General Public License for libraries did not effectively promote software sharing, because most developers did not use the libraries. We concluded that weaker conditions might promote sharing better. However, unrestricted linking of non-free programs would deprive the users of those programs of all benefit from the free status of the libraries themselves. 
This Library General Public License is intended to permit developers of
non-free programs to use free libraries, while preserving your freedom as a
user of such programs to change the free libraries that are incorporated in
them. (We have not seen how to achieve this as regards changes in header files,
but we have achieved it as regards changes in the actual functions of the
Library.) The hope is that this will lead to faster development of free
libraries.

The precise terms and conditions for copying, distribution and modification
follow. Pay close attention to the difference between a "work based on the
library" and a "work that uses the library". The former contains code derived
from the library, while the latter only works together with the library.

Note that it is possible for a library to be covered by the ordinary General
Public License rather than by this special one.

GNU LIBRARY GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License Agreement applies to any software library which contains a
notice placed by the copyright holder or other authorized party saying it may
be distributed under the terms of this Library General Public License (also
called "this License"). Each licensee is addressed as "you".

A "library" means a collection of software functions and/or data prepared so as
to be conveniently linked with application programs (which use some of those
functions and data) to form executables.

The "Library", below, refers to any such software library or work which has
been distributed under these terms. A "work based on the Library" means either
the Library or any derivative work under copyright law: that is to say, a work
containing the Library or a portion of it, either verbatim or with
modifications and/or translated straightforwardly into another language.
(Hereinafter, translation is included without limitation in the term
"modification".)
"Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. 
d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. (For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. 
You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. 
However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. The threshold for this to be true is not precisely defined by law. If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also compile or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. 
If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. c) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. d) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. 
It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. 
Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances. 
It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 13. The Free Software Foundation may publish revised and/or new versions of the Library General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation. 14. 
If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS Appendix: How to Apply These Terms to Your New Libraries If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. 
You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License). To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. <one line to give the library's name and a brief idea of what it does.> Copyright (C) <year> <name of author> This library is free software; you can redistribute it and/or modify it under the terms of the GNU Library General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Library General Public License for more details. You should have received a copy of the GNU Library General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA Also add information on how to contact you by electronic and paper mail. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the library, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the library `Frob' (a library for tweaking knobs) written by James Random Hacker. <signature of Ty Coon>, 1 April 1990 Ty Coon, President of Vice That's all there is to it! menhir-20200123/README.md000066400000000000000000000012271361226111300145040ustar00rootroot00000000000000# Menhir Menhir is an LR(1) parser generator for OCaml. Menhir has a [home page](http://cambium.inria.fr/~fpottier/menhir/). ## Installation The latest released version of Menhir can be easily installed via `opam`, OCaml's package manager. Just type `opam install menhir`. 
For manual installation, see [INSTALLATION.md](INSTALLATION.md). Some instructions for developers can be found in [HOWTO.md](HOWTO.md). ## Authors * [François Pottier](Francois.Pottier@inria.fr) * [Yann Régis-Gianas](Yann.Regis-Gianas@pps.jussieu.fr) ## Contributors * Frédéric Bour (incremental engine, inspection API, attributes, SDK) * Jacques-Henri Jourdan (Coq back-end) menhir-20200123/check-tarball.sh000077500000000000000000000026331361226111300162620ustar00rootroot00000000000000#!/bin/bash set -euo pipefail IFS=$'\n\t' # This script checks that a Menhir tarball can be compiled and installed. # The command line argument should be the tarball's name without .tar.gz. PACKAGE="$1" TARBALL=$PACKAGE.tar.gz # We use a dedicated opam switch where it is permitted to uninstall/reinstall # Menhir. echo "Now switching to test-menhir..." eval $(opam env --set-switch --switch test-menhir) OPAMYES=true opam install coq dune # Uninstall Menhir if it is installed. echo "Removing menhir if already installed..." # read -p "Can I remove it [Enter/^C]?" -n 1 -r ; opam remove menhir || /bin/true # Create a temporary directory; extract into it. # Build and install; then uninstall. TEMPDIR=`mktemp -d /tmp/menhir-test.XXXXXX` INSTALL=$TEMPDIR/install COQCONTRIB=$INSTALL/coq-contrib DUNE=dune cp $TARBALL $TEMPDIR echo " * Extracting. " (cd $TEMPDIR && tar xfz $TARBALL) echo " * Compiling and installing." mkdir $INSTALL (cd $TEMPDIR/$PACKAGE && $DUNE build @install && $DUNE install --prefix=$INSTALL menhir && make -C coq-menhirlib all && make -C coq-menhirlib CONTRIB=$COQCONTRIB install ) > $TEMPDIR/install.log 2>&1 || (cat $TEMPDIR/install.log; exit 1) echo " * Uninstalling." 
(cd $TEMPDIR/$PACKAGE && $DUNE uninstall --prefix=$INSTALL menhir && make -C coq-menhirlib CONTRIB=$COQCONTRIB uninstall ) > $TEMPDIR/uninstall.log 2>&1 || (cat $TEMPDIR/uninstall.log; exit 1) rm -rf $TEMPDIR menhir-20200123/coq-menhirlib/000077500000000000000000000000001361226111300157545ustar00rootroot00000000000000menhir-20200123/coq-menhirlib/CHANGES.md000066400000000000000000000023461361226111300173530ustar00rootroot00000000000000# Changes ## 2019/09/24 * Fix compatibility with Coq 8.10, and some warnings. ## 2019/06/26 * Fix compatibility with Coq 8.7 and Coq 8.9: * In Coq 8.7, in the syntax `{ x : T & T' }` for the `sigT` types, it was not possible to omit the type `T`. * An anomaly in Coq 8.7 has been worked around. * In Coq 8.9, the numeral notation for positives moved from `Coq.Numbers.BinNums` to `Coq.PArith.BinPos`. ## 2019/06/13 * The Coq development is now free of any axiom (it used to use axiom `K`), and the parsers can now be executed directly within Coq, without using extraction. * The parser interpreter is now written using dependent types, so that no dynamic checks are needed anymore at parsing time. When running the extracted code, this should give a performance boost. Moreover, efficient extraction of `int31` is no longer needed. This required some refactoring of the type of parse trees. * Instead of a dependent pair of a terminal and a semantic value, tokens are now a user-defined (inductive) type. ## 2018/08/27 * Avoid an undocumented mode of use of the `fix` tactic, which would cause an incompatibility with Coq > 8.8.1. (Reported and corrected by Michael Soegtrop.) ## 2018/05/30 * Initial release. menhir-20200123/coq-menhirlib/LICENSE000066400000000000000000001243441361226111300167670ustar00rootroot00000000000000All files in this directory are distributed under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
Version 3 of the GNU Lesser General Public License is included below. ---------------------------------------------------------------------- GNU LESSER GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below. 0. Additional Definitions. As used herein, "this License" refers to version 3 of the GNU Lesser General Public License, and the "GNU GPL" refers to version 3 of the GNU General Public License. "The Library" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below. An "Application" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library. A "Combined Work" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the "Linked Version". The "Minimal Corresponding Source" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version. The "Corresponding Application Code" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work. 1. Exception to Section 3 of the GNU GPL. 
You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL. 2. Conveying Modified Versions. If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version: a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy. 3. Object Code Incorporating Material from Library Header Files. The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following: a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the object code with a copy of the GNU GPL and this license document. 4. Combined Works. You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following: a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License. 
b) Accompany the Combined Work with a copy of the GNU GPL and this license document. c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document. d) Do one of the following: 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source. 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version. e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.) 5. Combined Libraries. 
You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License. b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 6. Revised Versions of the GNU Lesser General Public License. The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation. If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library. 
---------------------------------------------------------------------- GNU GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The GNU General Public License is a free, copyleft license for software and other kinds of works. The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things. To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others. For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. 
Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it. For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions. Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users. Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free. The precise terms and conditions for copying, distribution and modification follow. TERMS AND CONDITIONS 0. Definitions. "This License" refers to version 3 of the GNU General Public License. "Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks. "The Program" refers to any copyrightable work licensed under this License. 
Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations. To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work. A "covered work" means either the unmodified Program or a work based on the Program. To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well. To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying. An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion. 1. Source Code. The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work. 
A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language. The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it. The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work. The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source. The Corresponding Source for a work in source code form is that same work. 2. Basic Permissions. 
All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law. You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you. Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary. 3. Protecting Users' Legal Rights From Anti-Circumvention Law. No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures. 
When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures. 4. Conveying Verbatim Copies. You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program. You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee. 5. Conveying Modified Source Versions. You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions: a) The work must carry prominent notices stating that you modified it, and giving a relevant date. b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices". c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. 
This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it. d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so. A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate. 6. Conveying Non-Source Forms. You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways: a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange. 
b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge. c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b. d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements. e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d. 
A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work. A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product. "Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made. If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. 
But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM). The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network. Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying. 7. Additional Terms. "Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions. When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission. 
Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms: a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or d) Limiting the use for publicity purposes of names of licensors or authors of the material; or e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors. All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying. 
If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms. Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way. 8. Termination. You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11). However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation. Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice. Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10. 9. Acceptance Not Required for Having Copies. You are not required to accept this License in order to receive or run a copy of the Program. 
Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so. 10. Automatic Licensing of Downstream Recipients. Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License. An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts. You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it. 11. Patents. A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. 
The work thus licensed is called the contributor's "contributor version". A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License. Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version. In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party. If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. 
"Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid. If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it. A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007. Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law. 12. No Surrender of Others' Freedom. 
If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program. 13. Use with the GNU Affero General Public License. Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such. 14. Revised Versions of this License. The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation. 
If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. 15. Disclaimer of Warranty. THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. Limitation of Liability. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 17. Interpretation of Sections 15 and 16. 
If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . Also add information on how to contact you by electronic and paper mail. If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode: Copyright (C) This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. 
The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an "about box". You should also get your employer (if you work as a programmer) or school, if any, to sign a "copyright disclaimer" for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see . The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read .

menhir-20200123/coq-menhirlib/Makefile

# Delegate the following commands:
.PHONY: all clean install uninstall
all clean install uninstall:
	@ $(MAKE) -C src --no-print-directory $@

menhir-20200123/coq-menhirlib/README.md

# A support library for verified Coq parsers produced by Menhir

The [Menhir](http://gallium.inria.fr/~fpottier/menhir/) parser generator, in `--coq` mode, can produce [Coq](https://coq.inria.fr/) parsers. These parsers must be linked against this library, which provides both an interpreter (which allows running the generated parser) and a validator (which allows verifying, at parser construction time, that the generated parser is correct and complete with respect to the grammar).

## Installation

To install the latest released version, use `opam install coq-menhirlib`.

To install from the sources, use `make` followed by `make install`.
## Authors

* [Jacques-Henri Jourdan](jacques-henri.jourdan@lri.fr)

menhir-20200123/coq-menhirlib/src/.gitignore

*.vo
*.glob
*.v.d
.*.aux
_CoqProject

menhir-20200123/coq-menhirlib/src/Alphabet.v

(****************************************************************************)
(*                                                                          *)
(*                                   Menhir                                 *)
(*                                                                          *)
(*          Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud          *)
(*                                                                          *)
(*  Copyright Inria. All rights reserved. This file is distributed under    *)
(*  the terms of the GNU Lesser General Public License as published by the  *)
(*  Free Software Foundation, either version 3 of the License, or (at your  *)
(*  option) any later version, as described in the file LICENSE.            *)
(*                                                                          *)
(****************************************************************************)

From Coq Require Import Omega List Syntax Relations RelationClasses.

Local Obligation Tactic := intros.

(** A comparable type is equipped with a [compare] function, that defines
    an order relation. **)
Class Comparable (A:Type) := {
  compare : A -> A -> comparison;
  compare_antisym : forall x y, CompOpp (compare x y) = compare y x;
  compare_trans : forall x y z c,
    (compare x y) = c -> (compare y z) = c -> (compare x z) = c
}.

Theorem compare_refl {A:Type} (C: Comparable A) :
  forall x, compare x x = Eq.
Proof.
  intros.
  pose proof (compare_antisym x x).
  destruct (compare x x); intuition; try discriminate.
Qed.

(** The corresponding order is a strict order. **)
Definition comparableLt {A:Type} (C: Comparable A) : relation A :=
  fun x y => compare x y = Lt.

Instance ComparableLtStrictOrder {A:Type} (C: Comparable A) :
  StrictOrder (comparableLt C).
Proof.
  apply Build_StrictOrder.
  unfold Irreflexive, Reflexive, complement, comparableLt.
  intros.
  pose proof H.
  rewrite <- compare_antisym in H.
  rewrite H0 in H.
  discriminate H.
  unfold Transitive, comparableLt.
  intros x y z.
  apply compare_trans.
Qed.

(** nat is comparable. **)
Program Instance natComparable : Comparable nat :=
  { compare := Nat.compare }.
Next Obligation.
  symmetry.
  destruct (Nat.compare x y) as [] eqn:?.
  rewrite Nat.compare_eq_iff in Heqc. destruct Heqc.
  rewrite Nat.compare_eq_iff. trivial.
  rewrite <- nat_compare_lt in *.
  rewrite <- nat_compare_gt in *.
  trivial.
  rewrite <- nat_compare_lt in *.
  rewrite <- nat_compare_gt in *.
  trivial.
Qed.
Next Obligation.
  destruct c.
  rewrite Nat.compare_eq_iff in *; destruct H; assumption.
  rewrite <- nat_compare_lt in *.
  apply (Nat.lt_trans _ _ _ H H0).
  rewrite <- nat_compare_gt in *.
  apply (gt_trans _ _ _ H H0).
Qed.

(** A pair of comparable is comparable. **)
Program Instance PairComparable {A:Type} (CA:Comparable A) {B:Type} (CB:Comparable B) :
  Comparable (A*B) :=
  { compare := fun x y =>
      let (xa, xb) := x in
      let (ya, yb) := y in
      match compare xa ya return comparison with
      | Eq => compare xb yb
      | x => x
      end }.
Next Obligation.
  destruct x, y.
  rewrite <- (compare_antisym a a0).
  rewrite <- (compare_antisym b b0).
  destruct (compare a a0); intuition.
Qed.
Next Obligation.
  destruct x, y, z.
  destruct (compare a a0) as [] eqn:?, (compare a0 a1) as [] eqn:?;
  try (rewrite <- H0 in H; discriminate);
  try (destruct (compare a a1) as [] eqn:?;
    try (rewrite <- compare_antisym in Heqc0;
         rewrite CompOpp_iff in Heqc0;
         rewrite (compare_trans _ _ _ _ Heqc0 Heqc2) in Heqc1;
         discriminate);
    try (rewrite <- compare_antisym in Heqc1;
         rewrite CompOpp_iff in Heqc1;
         rewrite (compare_trans _ _ _ _ Heqc2 Heqc1) in Heqc0;
         discriminate);
    assumption);
  rewrite (compare_trans _ _ _ _ Heqc0 Heqc1); try assumption.
  apply (compare_trans _ _ _ _ H H0).
Qed.

(** Special case of comparable, where equality is Leibniz equality. **)
Class ComparableLeibnizEq {A:Type} (C: Comparable A) :=
  compare_eq : forall x y, compare x y = Eq -> x = y.
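To illustrate how these instances compose, here is a small hedged sketch (not part of the library): the `PairComparable` instance derived from two `natComparable` instances compares pairs lexicographically, first on the left component, then on the right. It assumes `Alphabet.v` has been compiled and is visible under the name `Alphabet`.

```coq
(* Hypothetical usage sketch of the Comparable instances above. *)
Require Import Alphabet.

(* The first components differ, so the pair comparison is decided there. *)
Example pair_lt : compare (1, 5) (2, 0) = Lt.
Proof. reflexivity. Qed.

(* Equal first components fall through to the second components. *)
Example pair_eq : compare (3, 4) (3, 4) = Eq.
Proof. reflexivity. Qed.
```

Both proofs go through by computation, since `Nat.compare` and the pair `compare` function reduce on closed values.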
(** Boolean equality for a [Comparable]. **)
Definition compare_eqb {A:Type} {C:Comparable A} (x y:A) :=
  match compare x y with
  | Eq => true
  | _ => false
  end.

Theorem compare_eqb_iff {A:Type} {C:Comparable A} {U:ComparableLeibnizEq C} :
  forall x y, compare_eqb x y = true <-> x = y.
Proof.
  unfold compare_eqb.
  intuition.
  apply compare_eq.
  destruct (compare x y); intuition; discriminate.
  destruct H.
  rewrite compare_refl; intuition.
Qed.

Instance NComparableLeibnizEq : ComparableLeibnizEq natComparable := Nat.compare_eq.

(** A pair of ComparableLeibnizEq is ComparableLeibnizEq **)
Instance PairComparableLeibnizEq
  {A:Type} {CA:Comparable A} (UA:ComparableLeibnizEq CA)
  {B:Type} {CB:Comparable B} (UB:ComparableLeibnizEq CB) :
  ComparableLeibnizEq (PairComparable CA CB).
Proof.
  intros x y; destruct x, y; simpl.
  pose proof (compare_eq a a0); pose proof (compare_eq b b0).
  destruct (compare a a0); try discriminate.
  intuition.
  destruct H2, H0.
  reflexivity.
Qed.

(** A [Finite] type is a type with the list of all elements. **)
Class Finite (A:Type) := {
  all_list : list A;
  all_list_forall : forall x:A, In x all_list
}.

(** An alphabet is both [ComparableLeibnizEq] and [Finite]. **)
Class Alphabet (A:Type) := {
  AlphabetComparable :> Comparable A;
  AlphabetComparableLeibnizEq :> ComparableLeibnizEq AlphabetComparable;
  AlphabetFinite :> Finite A
}.

(** The [Numbered] class provides a convenient way to build [Alphabet]
    instances, with a good computational complexity. It is mainly an
    injection from it into [positive] **)
Class Numbered (A:Type) := {
  inj : A -> positive;
  surj : positive -> A;
  surj_inj_compat : forall x, surj (inj x) = x;
  inj_bound : positive;
  inj_bound_spec : forall x, (inj x < Pos.succ inj_bound)%positive
}.
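As a hedged sketch of how a client might use [Numbered] (the type `tok` and the instance `tokNumbered` below are hypothetical, not part of the library; the snippet assumes `Alphabet.v` is visible as `Alphabet`), it suffices to number the constructors of a finite type and bound the numbering:

```coq
(* Hypothetical sketch: numbering a two-token type so that the library
   can derive an [Alphabet] instance for it. *)
Require Import Alphabet.

Inductive tok := TA | TB.

Program Instance tokNumbered : Numbered tok :=
  { inj := fun t => match t with TA => 1%positive | TB => 2%positive end;
    surj := fun p => match p with 1%positive => TA | _ => TB end;
    inj_bound := 2%positive }.
(* surj_inj_compat: surj cancels inj on both constructors. *)
Next Obligation. destruct x; reflexivity. Qed.
(* inj_bound_spec: both images are below Pos.succ 2 = 3. *)
Next Obligation. destruct x; reflexivity. Qed.
```

Both obligations are closed by computation, which matters here: the comments in the next instance stress that these proofs must remain executable.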
Program Instance NumberedAlphabet {A:Type} (N:Numbered A) : Alphabet A :=
  { AlphabetComparable := {| compare := fun x y => Pos.compare (inj x) (inj y) |};
    AlphabetFinite :=
      {| all_list := fst (Pos.iter
           (fun '(l, p) => (surj p::l, Pos.succ p))
           ([], 1%positive) inj_bound) |} }.
Next Obligation. simpl. now rewrite <- Pos.compare_antisym. Qed.
Next Obligation.
  match goal with c : comparison |- _ => destruct c end.
  - rewrite Pos.compare_eq_iff in *. congruence.
  - rewrite Pos.compare_lt_iff in *. eauto using Pos.lt_trans.
  - rewrite Pos.compare_gt_iff in *. eauto using Pos.lt_trans.
Qed.
Next Obligation.
  intros x y. unfold compare. intros Hxy.
  assert (Hxy' : inj x = inj y).
  (* We do not use [Pos.compare_eq_iff] directly to make sure the
     proof is executable. *)
  { destruct (Pos.eq_dec (inj x) (inj y)) as [|[]]; [now auto|].
    now apply Pos.compare_eq_iff. }
  (* Using rewrite here leads to non-executable proofs. *)
  transitivity (surj (inj x)). { apply eq_sym, surj_inj_compat. }
  transitivity (surj (inj y)); cycle 1. { apply surj_inj_compat. }
  apply f_equal, Hxy'.
Defined.
Next Obligation.
  rewrite <-(surj_inj_compat x).
  generalize (inj_bound_spec x). generalize (inj x). clear x. intros x.
  match goal with |- ?Hx -> In ?s (fst ?p) =>
    assert ((Hx -> In s (fst p)) /\ snd p = Pos.succ inj_bound); [|now intuition]
  end.
  rewrite Pos.lt_succ_r.
  induction inj_bound as [|y [IH1 IH2]] using Pos.peano_ind;
    (split; [intros Hx|]); simpl.
  - rewrite (Pos.le_antisym _ _ Hx); auto using Pos.le_1_l.
  - auto.
  - rewrite Pos.iter_succ. destruct Pos.iter; simpl in *. subst.
    rewrite Pos.le_lteq in Hx.
    destruct Hx as [?%Pos.lt_succ_r| ->]; now auto.
  - rewrite Pos.iter_succ. destruct Pos.iter. simpl in IH2. subst. reflexivity.
Qed.

(** Definitions of [FSet]/[FMap] from [Comparable] **)
Require Import OrderedTypeAlt.
Require FSetAVL.
Require FMapAVL.
Import OrderedType.

Module Type ComparableM.
  Parameter t : Type.
  Declare Instance tComparable : Comparable t.
End ComparableM.
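A hedged sketch of how the [ComparableM] module type is meant to be instantiated (the module name `NatM` is hypothetical; the snippet assumes `Alphabet.v` is visible as `Alphabet`): any type with a [Comparable] instance can be packaged as a module, ready to be fed to the OrderedType functors that follow.

```coq
(* Hypothetical sketch: packaging natComparable as a ComparableM module. *)
Require Import Alphabet.

Module NatM <: ComparableM.
  Definition t := nat.
  Instance tComparable : Comparable t := natComparable.
End NatM.
```

Such a module can then be passed to a functor expecting a `ComparableM`, which is how the library bridges its typeclass-based interface with the module-based `FSetAVL`/`FMapAVL` libraries.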
Module OrderedTypeAlt_from_ComparableM (C:ComparableM) <: OrderedTypeAlt. Definition t := C.t. Definition compare : t -> t -> comparison := compare. Infix "?=" := compare (at level 70, no associativity). Lemma compare_sym x y : (y?=x) = CompOpp (x?=y). Proof. exact (Logic.eq_sym (compare_antisym x y)). Qed. Lemma compare_trans c x y z : (x?=y) = c -> (y?=z) = c -> (x?=z) = c. Proof. apply compare_trans. Qed. End OrderedTypeAlt_from_ComparableM. Module OrderedType_from_ComparableM (C:ComparableM) <: OrderedType. Module Alt := OrderedTypeAlt_from_ComparableM C. Include (OrderedType_from_Alt Alt). End OrderedType_from_ComparableM. menhir-20200123/coq-menhirlib/src/Automaton.v000066400000000000000000000147341361226111300207120ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) Require Grammar. Require Export Alphabet. From Coq Require Import Orders. From Coq Require Export List Syntax. Module Type AutInit. (** The grammar of the automaton. **) Declare Module Gram:Grammar.T. Export Gram. (** The set of non initial state is considered as an alphabet. **) Parameter noninitstate : Type. Declare Instance NonInitStateAlph : Alphabet noninitstate. Parameter initstate : Type. Declare Instance InitStateAlph : Alphabet initstate. (** When we are at this state, we know that this symbol is the top of the stack. **) Parameter last_symb_of_non_init_state: noninitstate -> symbol. End AutInit. Module Types(Import Init:AutInit). 
(** In many ways, the behaviour of the initial state is different from
    the behaviour of the other states. So we have chosen to explicitly
    separate them: the user has to provide the type of non initial
    states. **)
Inductive state :=
  | Init: initstate -> state
  | Ninit: noninitstate -> state.

Program Instance StateAlph : Alphabet state :=
  { AlphabetComparable := {| compare := fun x y =>
      match x, y return comparison with
      | Init _, Ninit _ => Lt
      | Init x, Init y => compare x y
      | Ninit _, Init _ => Gt
      | Ninit x, Ninit y => compare x y
      end |};
    AlphabetFinite := {| all_list := map Init all_list ++ map Ninit all_list |} }.
Local Obligation Tactic := intros.
Next Obligation.
destruct x, y; intuition; apply compare_antisym.
Qed.
Next Obligation.
destruct x, y, z; intuition.
apply (compare_trans _ i0); intuition.
congruence.
congruence.
apply (compare_trans _ n0); intuition.
Qed.
Next Obligation.
intros x y.
destruct x, y; intuition; try discriminate.
rewrite (compare_eq i i0); intuition.
rewrite (compare_eq n n0); intuition.
Qed.
Next Obligation.
apply in_or_app; destruct x; intuition;
  [left|right]; apply in_map; apply all_list_forall.
Qed.

Coercion Ninit : noninitstate >-> state.
Coercion Init : initstate >-> state.

(** For an LR automaton, there are four kinds of actions that can be done at
    a given state:
      - Shifting, that is reading a token and putting it into the stack,
      - Reducing a production, that is popping the right hand side of the
        production from the stack, and pushing the left hand side,
      - Failing
      - Accepting the word (special case of reduction)

    As in the menhir parser generator, we do not want our parser to read after
    the end of stream. That means that once the parser has read a word in the
    grammar language, it should stop without peeking the input stream. So, for
    the automaton to be complete, the grammar must be particular: if a word is
    in its language, then it is not a prefix of another word of the language
    (otherwise, menhir reports an end of stream conflict).
    As a consequence of that, there are two notions of action: the first one
    is an action performed before having read the stream, the second one is
    after. **)
Inductive lookahead_action (term:terminal) :=
| Shift_act: forall s:noninitstate,
    T term = last_symb_of_non_init_state s -> lookahead_action term
| Reduce_act: production -> lookahead_action term
| Fail_act: lookahead_action term.
Arguments Shift_act {term}.
Arguments Reduce_act {term}.
Arguments Fail_act {term}.

Inductive action :=
| Default_reduce_act: production -> action
| Lookahead_act : (forall term:terminal, lookahead_action term) -> action.

(** Types used for the annotations of the automaton. **)

(** An item is a part of the annotations given to the validator.
    It is actually a set of LR(1) items sharing the same core. It is needed
    to validate completeness. **)
Record item := {
  (** The pseudo-production of the item. **)
  prod_item: production;

  (** The position of the dot. **)
  dot_pos_item: nat;

  (** The lookahead symbol of the item. We are using a list, so we can store
      together multiple LR(1) items sharing the same core. **)
  lookaheads_item: list terminal
}.
End Types.

Module Type T.
  Include AutInit <+ Types.
  Module Export GramDefs := Grammar.Defs Gram.

  (** For each initial state, the non terminal it recognizes. **)
  Parameter start_nt: initstate -> nonterminal.

  (** The action table maps a state to an action: either a default reduction,
      or a map from terminals to lookahead actions. **)
  Parameter action_table: state -> action.
  (** The goto table of an LR(1) automaton. **)
  Parameter goto_table: state -> forall nt:nonterminal,
    option { s:noninitstate | NT nt = last_symb_of_non_init_state s }.

  (** Some annotations on the automaton to help the validation. **)

  (** When we are at this state, we know that these symbols are just below
      the top of the stack. The list is ordered such that the head corresponds
      to the (almost) top of the stack. **)
  Parameter past_symb_of_non_init_state: noninitstate -> list symbol.
(** When we are at this state, the (strictly) previous states verify these predicates. **) Parameter past_state_of_non_init_state: noninitstate -> list (state -> bool). (** The items of the state. **) Parameter items_of_state: state -> list item. (** The nullable predicate for non terminals : true if and only if the symbol produces the empty string **) Parameter nullable_nterm: nonterminal -> bool. (** The first predicates for non terminals, symbols or words of symbols. A terminal is in the returned list if, and only if the parameter produces a word that begins with the given terminal **) Parameter first_nterm: nonterminal -> list terminal. End T. menhir-20200123/coq-menhirlib/src/Grammar.v000066400000000000000000000137371361226111300203330ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List Syntax Orders. Require Import Alphabet. (** The terminal non-terminal alphabets of the grammar. **) Module Type Alphs. Parameters terminal nonterminal : Type. Declare Instance TerminalAlph: Alphabet terminal. Declare Instance NonTerminalAlph: Alphabet nonterminal. End Alphs. (** Definition of the alphabet of symbols, given the alphabet of terminals and the alphabet of non terminals **) Module Symbol(Import A:Alphs). Inductive symbol := | T: terminal -> symbol | NT: nonterminal -> symbol. 
Program Instance SymbolAlph : Alphabet symbol :=
  { AlphabetComparable := {| compare := fun x y =>
      match x, y return comparison with
      | T _, NT _ => Gt
      | NT _, T _ => Lt
      | T x, T y => compare x y
      | NT x, NT y => compare x y
      end |};
    AlphabetFinite := {| all_list := map T all_list ++ map NT all_list |} }.
Next Obligation.
  destruct x; destruct y; intuition; apply compare_antisym.
Qed.
Next Obligation.
  destruct x; destruct y; destruct z; intuition; try discriminate.
  apply (compare_trans _ t0); intuition.
  apply (compare_trans _ n0); intuition.
Qed.
Next Obligation.
  intros x y.
  destruct x; destruct y; try discriminate; intros.
  rewrite (compare_eq t t0); now intuition.
  rewrite (compare_eq n n0); now intuition.
Defined.
Next Obligation.
  rewrite in_app_iff.
  destruct x; [left | right]; apply in_map; apply all_list_forall.
Qed.
End Symbol.

(** A curried function with multiple parameters **)
Definition arrows_right: Type -> list Type -> Type :=
  fold_right (fun A B => A -> B).

Module Type T.
  Include Alphs <+ Symbol.

  (** [symbol_semantic_type] maps a symbol to the type of its semantic
      values. **)
  Parameter symbol_semantic_type: symbol -> Type.

  (** The type of production identifiers **)
  Parameter production : Type.
  Declare Instance ProductionAlph : Alphabet production.

  (** Accessors for productions: left hand side, right hand side, and
      semantic action. The semantic actions are given in the form of
      curried functions that take their arguments in the reverse order. **)
  Parameter prod_lhs: production -> nonterminal.
  (* The RHS of a production is given in reversed order, matching the
     (reversed) argument order of the semantic action. *)
  Parameter prod_rhs_rev: production -> list symbol.
  Parameter prod_action:
    forall p:production,
      arrows_right
        (symbol_semantic_type (NT (prod_lhs p)))
        (map symbol_semantic_type (prod_rhs_rev p)).

  (** Tokens are the atomic elements of the input stream: they contain
      a terminal and a semantic value of the type corresponding to this
      terminal. *)
  Parameter token : Type.
  Parameter token_term : token -> terminal.
  Parameter token_sem :
    forall tok : token, symbol_semantic_type (T (token_term tok)).
End T.

Module Defs(Import G:T).

  (** The semantics of a grammar is defined in two stages. First, we
    define the notion of parse tree, which represents one way of
    recognizing a word with a head symbol. Semantic values are stored
    at the leaves.

    This notion is defined in two mutually recursive flavours:
    either for a single head symbol, or for a list of head symbols. *)
  Inductive parse_tree:
    forall (head_symbol:symbol) (word:list token), Type :=

  (** Parse tree for a terminal symbol. *)
  | Terminal_pt:
      forall (tok:token), parse_tree (T (token_term tok)) [tok]

  (** Parse tree for a non-terminal symbol. *)
  | Non_terminal_pt:
      forall (prod:production) {word:list token},
        parse_tree_list (prod_rhs_rev prod) word ->
        parse_tree (NT (prod_lhs prod)) word

  (* Note : the list head_symbols_rev is reversed. *)
  with parse_tree_list:
    forall (head_symbols_rev:list symbol) (word:list token), Type :=
  | Nil_ptl: parse_tree_list [] []
  | Cons_ptl:
      forall {head_symbolsq:list symbol} {wordq:list token},
        parse_tree_list head_symbolsq wordq ->
      forall {head_symbolt:symbol} {wordt:list token},
        parse_tree head_symbolt wordt ->
        parse_tree_list (head_symbolt::head_symbolsq) (wordq++wordt).

  (** We can now finish the definition of the semantics of a grammar,
    by giving the semantic value associated with a parse tree. *)
  Fixpoint pt_sem {head_symbol word} (tree:parse_tree head_symbol word) :
    symbol_semantic_type head_symbol :=
    match tree with
    | Terminal_pt tok => token_sem tok
    | Non_terminal_pt prod ptl => ptl_sem ptl (prod_action prod)
    end
  with ptl_sem {A head_symbols word} (tree:parse_tree_list head_symbols word) :
    arrows_right A (map symbol_semantic_type head_symbols) -> A :=
    match tree with
    | Nil_ptl => fun act => act
    | Cons_ptl q t => fun act => ptl_sem q (act (pt_sem t))
    end.
Fixpoint pt_size {head_symbol word} (tree:parse_tree head_symbol word) := match tree with | Terminal_pt _ => 1 | Non_terminal_pt _ l => S (ptl_size l) end with ptl_size {head_symbols word} (tree:parse_tree_list head_symbols word) := match tree with | Nil_ptl => 0 | Cons_ptl q t => pt_size t + ptl_size q end. End Defs. menhir-20200123/coq-menhirlib/src/Interpreter.v000066400000000000000000000437221361226111300212450ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List Syntax. From Coq.ssr Require Import ssreflect. Require Automaton. Require Import Alphabet Grammar Validator_safe. Module Make(Import A:Automaton.T). Module Import ValidSafe := Validator_safe.Make A. (** A few helpers for dependent types. *) (** Decidable propositions. *) Class Decidable (P : Prop) := decide : {P} + {~P}. Arguments decide _ {_}. (** A [Comparable] type has decidable equality. *) Instance comparable_decidable_eq T `{ComparableLeibnizEq T} (x y : T) : Decidable (x = y). Proof. unfold Decidable. destruct (compare x y) eqn:EQ; [left; apply compare_eq; intuition | ..]; right; intros ->; by rewrite compare_refl in EQ. Defined. Instance list_decidable_eq T : (forall x y : T, Decidable (x = y)) -> (forall l1 l2 : list T, Decidable (l1 = l2)). Proof. unfold Decidable. decide equality. Defined. 
Ltac subst_existT := repeat match goal with | _ => progress subst | H : @existT ?A ?P ?x ?y1 = @existT ?A ?P ?x ?y2 |- _ => let DEC := fresh in assert (DEC : forall u1 u2 : A, Decidable (u1 = u2)) by apply _; apply Eqdep_dec.inj_pair2_eq_dec in H; [|by apply DEC]; clear DEC end. (** The interpreter is written using dependent types. In order to avoid reducing proof terms while executing the parser, we thunk all the propositions behind an arrow. Note that thunkP is still in Prop so that it is erased by extraction. *) Definition thunkP (P : Prop) : Prop := True -> P. (** Sometimes, we actually need a reduced proof in a program (for example when using an equality to cast a value). In that case, instead of reducing the proof we already have, we reprove the assertion by using decidability. *) Definition reprove {P} `{Decidable P} (p : thunkP P) : P := match decide P with | left p => p | right np => False_ind _ (np (p I)) end. (** Combination of reprove with eq_rect. *) Definition cast {T : Type} (F : T -> Type) {x y : T} (eq : thunkP (x = y)) {DEC : unit -> Decidable (x = y)}: F x -> F y := fun a => eq_rect x F a y (@reprove _ (DEC ()) eq). Lemma cast_eq T F (x : T) (eq : thunkP (x = x)) `{forall x y, Decidable (x = y)} a : cast F eq a = a. Proof. by rewrite /cast -Eqdep_dec.eq_rect_eq_dec. Qed. (** Input buffers and operations on them. **) CoInductive buffer : Type := Buf_cons { buf_head : token; buf_tail : buffer }. Delimit Scope buffer_scope with buf. Bind Scope buffer_scope with buffer. Infix "::" := Buf_cons (at level 60, right associativity) : buffer_scope. (** Concatenation of a list and an input buffer **) Fixpoint app_buf (l:list token) (buf:buffer) := match l with | nil => buf | cons t q => (t :: app_buf q buf)%buf end. Infix "++" := app_buf (at level 60, right associativity) : buffer_scope. Lemma app_buf_assoc (l1 l2:list token) (buf:buffer) : (l1 ++ (l2 ++ buf) = (l1 ++ l2) ++ buf)%buf. Proof. induction l1 as [|?? IH]=>//=. rewrite IH //. Qed. 
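(* Illustrative sketch (not part of the library): since [buffer] is
   coinductive, a finite list of tokens is typically turned into a buffer by
   appending an infinite padding stream, e.g. one repeating an end-of-file
   token. The helpers [pad_buf] and [buffer_of_list] below are hypothetical
   names, not library definitions. *)
CoFixpoint pad_buf (tok : token) : buffer :=
  (* The corecursive call is guarded under [Buf_cons]. *)
  (tok :: pad_buf tok)%buf.

Definition buffer_of_list (word : list token) (eof_tok : token) : buffer :=
  (* Feed the finite word first, then repeat [eof_tok] forever. *)
  (word ++ pad_buf eof_tok)%buf.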
(** The type of a non initial state: the type of semantic values associated
    with the last symbol of this state. *)
Definition noninitstate_type state :=
  symbol_semantic_type (last_symb_of_non_init_state state).

(** The stack of the automaton : it is either nil, or it contains a non
    initial state, a semantic value for the symbol associated with this
    state, and a nested stack. **)
Definition stack := list (sigT noninitstate_type). (* eg. list {state & state_type state} *)

Section Interpreter.

Hypothesis safe: safe.

(* Properties of the automaton deduced from safety validation. *)
Proposition shift_head_symbs: shift_head_symbs.
Proof. pose proof safe; unfold ValidSafe.safe in H; intuition. Qed.
Proposition goto_head_symbs: goto_head_symbs.
Proof. pose proof safe; unfold ValidSafe.safe in H; intuition. Qed.
Proposition shift_past_state: shift_past_state.
Proof. pose proof safe; unfold ValidSafe.safe in H; intuition. Qed.
Proposition goto_past_state: goto_past_state.
Proof. pose proof safe; unfold ValidSafe.safe in H; intuition. Qed.
Proposition reduce_ok: reduce_ok.
Proof. pose proof safe; unfold ValidSafe.safe in H; intuition. Qed.

Variable init : initstate.

(** The top state of a stack **)
Definition state_of_stack (stack:stack): state :=
  match stack with
  | [] => init
  | existT _ s _::_ => s
  end.

(** The stack of states of an automaton stack **)
Definition state_stack_of_stack (stack:stack) :=
  (List.map
    (fun cell:sigT noninitstate_type => singleton_state_pred (projT1 cell))
    stack ++ [singleton_state_pred init])%list.

(** The stack of symbols of an automaton stack **)
Definition symb_stack_of_stack (stack:stack) :=
  List.map
    (fun cell:sigT noninitstate_type => last_symb_of_non_init_state (projT1 cell))
    stack.

(** The stack invariant : it basically states that the assumptions on the
    states are true.
**) Inductive stack_invariant: stack -> Prop := | stack_invariant_constr: forall stack, prefix (head_symbs_of_state (state_of_stack stack)) (symb_stack_of_stack stack) -> prefix_pred (head_states_of_state (state_of_stack stack)) (state_stack_of_stack stack) -> stack_invariant_next stack -> stack_invariant stack with stack_invariant_next: stack -> Prop := | stack_invariant_next_nil: stack_invariant_next [] | stack_invariant_next_cons: forall state_cur st stack_rec, stack_invariant stack_rec -> stack_invariant_next (existT _ state_cur st::stack_rec). (** [pop] pops some symbols from the stack. It returns the popped semantic values using [sem_popped] as an accumulator and discards the popped states.**) Fixpoint pop (symbols_to_pop:list symbol) {A:Type} (stk:stack) : thunkP (prefix symbols_to_pop (symb_stack_of_stack stk)) -> forall (action:arrows_right A (map symbol_semantic_type symbols_to_pop)), stack * A. unshelve refine (match symbols_to_pop return (thunkP (prefix symbols_to_pop (symb_stack_of_stack stk))) -> forall (action:arrows_right A (map _ symbols_to_pop)), stack * A with | [] => fun _ action => (stk, action) | t::q => fun Hp action => match stk return thunkP (prefix (t::q) (symb_stack_of_stack stk)) -> stack * A with | existT _ state_cur sem::stack_rec => fun Hp => let sem_conv := cast symbol_semantic_type _ sem in pop q _ stack_rec _ (action sem_conv) | [] => fun Hp => False_rect _ _ end Hp end). Proof. - simpl in Hp. clear -Hp. abstract (intros _ ; specialize (Hp I); now inversion Hp). - clear -Hp. abstract (specialize (Hp I); now inversion Hp). - simpl in Hp. clear -Hp. abstract (intros _ ; specialize (Hp I); now inversion Hp). Defined. (* Equivalent declarative specification for pop, so that we avoid (part of) the dependent types nightmare. 
*) Inductive pop_spec {A:Type} : forall (symbols_to_pop:list symbol) (stk : stack) (action : arrows_right A (map symbol_semantic_type symbols_to_pop)) (stk' : stack) (sem : A), Prop := | Nil_pop_spec stk sem : pop_spec [] stk sem stk sem | Cons_pop_spec symbols_to_pop st stk action sem stk' res : pop_spec symbols_to_pop stk (action sem) stk' res -> pop_spec (last_symb_of_non_init_state st::symbols_to_pop) (existT _ st sem :: stk) action stk' res. Lemma pop_spec_ok {A:Type} symbols_to_pop stk Hp action stk' res: pop symbols_to_pop stk Hp action = (stk', res) <-> pop_spec (A:=A) symbols_to_pop stk action stk' res. Proof. revert stk Hp action. induction symbols_to_pop as [|t symbols_to_pop IH]=>stk Hp action /=. - split. + intros [= <- <-]. constructor. + intros H. inversion H. by subst_existT. - destruct stk as [|[st sem]]=>/=; [by destruct pop_subproof0|]. remember (pop_subproof t symbols_to_pop stk st Hp) as EQ eqn:eq. clear eq. generalize EQ. revert Hp action. rewrite <-(EQ I)=>Hp action ?. rewrite cast_eq. rewrite IH. split. + intros. by constructor. + intros H. inversion H. by subst_existT. Qed. Lemma pop_preserves_invariant symbols_to_pop stk Hp A action : stack_invariant stk -> stack_invariant (fst (pop symbols_to_pop stk Hp (A:=A) action)). Proof. revert stk Hp A action. induction symbols_to_pop as [|t q IH]=>//=. intros stk Hp A action Hi. destruct Hi as [stack Hp' Hpp [|state st stk']]. - destruct pop_subproof0. - now apply IH. Qed. Lemma pop_state_valid symbols_to_pop stk Hp A action lpred : prefix_pred lpred (state_stack_of_stack stk) -> let stk' := fst (pop symbols_to_pop stk Hp (A:=A) action) in state_valid_after_pop (state_of_stack stk') symbols_to_pop lpred. Proof. revert stk Hp A action lpred. induction symbols_to_pop as [|t q IH]=>/=. - intros stk Hp A a lpred Hpp. destruct lpred as [|pred lpred]; constructor. inversion Hpp as [|? lpred' ? pred' Himpl Hpp' eq1 eq2]; subst. specialize (Himpl (state_of_stack stk)). 
    destruct (pred' (state_of_stack stk)) as [] eqn:Heqpred'=>//.
    destruct stk as [|[]]; simpl in *.
    + inversion eq2; subst; clear eq2.
      unfold singleton_state_pred in Heqpred'.
      now rewrite compare_refl in Heqpred'; discriminate.
    + inversion eq2; subst; clear eq2.
      unfold singleton_state_pred in Heqpred'.
      now rewrite compare_refl in Heqpred'; discriminate.
  - intros stk Hp A a lpred Hpp. destruct stk as [|[] stk]=>//=.
    + destruct pop_subproof0.
    + destruct lpred as [|pred lpred]; [by constructor|].
      constructor. apply IH. by inversion Hpp.
Qed.

(** [step_result] represents the result of one step of the automaton : it can
    fail, accept or progress. [Fail_sr] means that the input is incorrect.
    [Accept_sr] means that this is the last step of the automaton, and it
    returns the semantic value of the input word. [Progress_sr] means that
    some progress has been made, but new steps are needed in order to accept
    a word.

    For [Accept_sr] and [Progress_sr], the result contains the new input
    buffer.

    [Fail_sr] means that the input word is rejected by the automaton. It is
    different from [Err] (from the error monad), which means that the
    automaton is bogus and has performed a forbidden action. **)
Inductive step_result :=
  | Fail_sr: step_result
  | Accept_sr: symbol_semantic_type (NT (start_nt init)) -> buffer -> step_result
  | Progress_sr: stack -> buffer -> step_result.

(** [reduce_step] does a reduce action :
   - pops some elements from the stack
   - executes the action of the production
   - follows the goto for the produced non terminal symbol **)
Definition reduce_step stk prod (buffer : buffer)
        (Hval : thunkP (valid_for_reduce (state_of_stack stk) prod))
        (Hi : thunkP (stack_invariant stk))
  : step_result.
refine ((let '(stk', sem) as ss := pop (prod_rhs_rev prod) stk _ (prod_action prod) return thunkP (state_valid_after_pop (state_of_stack (fst ss)) _ (head_states_of_state (state_of_stack stk))) -> _ in fun Hval' => match goto_table (state_of_stack stk') (prod_lhs prod) as goto return (thunkP (goto = None -> match state_of_stack stk' with | Init i => prod_lhs prod = start_nt i | Ninit _ => False end)) -> _ with | Some (exist _ state_new e) => fun _ => let sem := eq_rect _ _ sem _ e in Progress_sr (existT noninitstate_type state_new sem::stk') buffer | None => fun Hval => let sem := cast symbol_semantic_type _ sem in Accept_sr sem buffer end (fun _ => _)) (fun _ => pop_state_valid _ _ _ _ _ _ _)). Proof. - clear -Hi Hval. abstract (intros _; destruct Hi=>//; eapply prefix_trans; [by apply Hval|eassumption]). - clear -Hval. abstract (intros _; f_equal; specialize (Hval I eq_refl); destruct stk' as [|[]]=>//). - simpl in Hval'. clear -Hval Hval'. abstract (move : Hval => /(_ I) [_ /(_ _ (Hval' I))] Hval2 Hgoto; by rewrite Hgoto in Hval2). - clear -Hi. abstract by destruct Hi. Defined. Lemma reduce_step_stack_invariant_preserved stk prod buffer Hv Hi stk' buffer': reduce_step stk prod buffer Hv Hi = Progress_sr stk' buffer' -> stack_invariant stk'. Proof. unfold reduce_step. match goal with | |- context [pop ?symbols_to_pop stk ?Hp ?action] => assert (Hi':=pop_preserves_invariant symbols_to_pop stk Hp _ action (Hi I)); generalize (pop_state_valid symbols_to_pop stk Hp _ action) end. destruct pop as [stk0 sem]=>/=. simpl in Hi'. intros Hv'. assert (Hgoto1:=goto_head_symbs (state_of_stack stk0) (prod_lhs prod)). assert (Hgoto2:=goto_past_state (state_of_stack stk0) (prod_lhs prod)). match goal with | |- context [fun _ : True => ?X] => generalize X end. destruct goto_table as [[state_new e]|] eqn:EQgoto=>//. intros _ [= <- <-]. constructor=>/=. - constructor. eapply prefix_trans. apply Hgoto1. by destruct Hi'. - unfold state_stack_of_stack; simpl; constructor. + intros ?. 
by destruct singleton_state_pred. + eapply prefix_pred_trans. apply Hgoto2. by destruct Hi'. - by constructor. Qed. (** One step of parsing. **) Definition step stk buffer (Hi : thunkP (stack_invariant stk)): step_result := match action_table (state_of_stack stk) as a return thunkP match a return Prop with | Default_reduce_act prod => _ | Lookahead_act awt => forall t : terminal, match awt t with | Reduce_act p => _ | _ => True end end -> _ with | Default_reduce_act prod => fun Hv => reduce_step stk prod buffer Hv Hi | Lookahead_act awt => fun Hv => match buf_head buffer with | tok => match awt (token_term tok) as a return thunkP match a return Prop with Reduce_act p => _ | _ => _ end -> _ with | Shift_act state_new e => fun _ => let sem_conv := eq_rect _ symbol_semantic_type (token_sem tok) _ e in Progress_sr (existT noninitstate_type state_new sem_conv::stk) (buf_tail buffer) | Reduce_act prod => fun Hv => reduce_step stk prod buffer Hv Hi | Fail_act => fun _ => Fail_sr end (fun _ => Hv I (token_term tok)) end end (fun _ => reduce_ok _). Lemma step_stack_invariant_preserved stk buffer Hi stk' buffer': step stk buffer Hi = Progress_sr stk' buffer' -> stack_invariant stk'. Proof. unfold step. generalize (reduce_ok (state_of_stack stk))=>Hred. assert (Hshift1 := shift_head_symbs (state_of_stack stk)). assert (Hshift2 := shift_past_state (state_of_stack stk)). destruct action_table as [prod|awt]=>/=. - eauto using reduce_step_stack_invariant_preserved. - set (term := token_term (buf_head buffer)). generalize (Hred term). clear Hred. intros Hred. specialize (Hshift1 term). specialize (Hshift2 term). destruct (awt term) as [state_new e|prod|]=>//. + intros [= <- <-]. constructor=>/=. * constructor. eapply prefix_trans. apply Hshift1. by destruct Hi. * unfold state_stack_of_stack; simpl; constructor. -- intros ?. by destruct singleton_state_pred. -- eapply prefix_pred_trans. apply Hshift2. by destruct Hi. * constructor; by apply Hi. 
+ eauto using reduce_step_stack_invariant_preserved. Qed. (** The parsing use a [nat] fuel parameter [log_n_steps], so that we do not have to prove terminaison, which is difficult. Note that [log_n_steps] is *not* the fuel in the conventionnal sense: this parameter contains the logarithm (in base 2) of the number of steps to perform. Hence, a value of, e.g., 50 will usually be enough to ensure termination. *) Fixpoint parse_fix stk buffer (log_n_steps : nat) (Hi : thunkP (stack_invariant stk)): { sr : step_result | forall stk' buffer', sr = Progress_sr stk' buffer' -> stack_invariant stk' } := match log_n_steps with | O => exist _ (step stk buffer Hi) (step_stack_invariant_preserved _ _ Hi) | S log_n_steps => match parse_fix stk buffer log_n_steps Hi with | exist _ (Progress_sr stk buffer) Hi' => parse_fix stk buffer log_n_steps (fun _ => Hi' _ buffer eq_refl) | sr => sr end end. (** The final result of a parsing is either a failure (the automaton has rejected the input word), either a timeout (the automaton has spent all the given [2^log_n_steps]), either a parsed semantic value with a rest of the input buffer. Note that we do not make parse_result depend on start_nt for the result type, so that this inductive is extracted without the use of Obj.t in OCaml. **) Inductive parse_result {A : Type} := | Fail_pr: parse_result | Timeout_pr: parse_result | Parsed_pr: A -> buffer -> parse_result. Global Arguments parse_result _ : clear implicits. Definition parse (buffer : buffer) (log_n_steps : nat): parse_result (symbol_semantic_type (NT (start_nt init))). refine (match proj1_sig (parse_fix [] buffer log_n_steps _) with | Fail_sr => Fail_pr | Accept_sr sem buffer' => Parsed_pr sem buffer' | Progress_sr _ _ => Timeout_pr end). Proof. abstract (repeat constructor; intros; by destruct singleton_state_pred). Defined. End Interpreter. Arguments Fail_sr {init}. Arguments Accept_sr {init} _ _. Arguments Progress_sr {init} _ _. End Make. Module Type T(A:Automaton.T). 
Include (Make A). End T. menhir-20200123/coq-menhirlib/src/Interpreter_complete.v000066400000000000000000001015121361226111300231250ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List Syntax Arith. From Coq.ssr Require Import ssreflect. Require Import Alphabet Grammar. Require Automaton Interpreter Validator_complete. Module Make(Import A:Automaton.T) (Import Inter:Interpreter.T A). Module Import Valid := Validator_complete.Make A. (** * Completeness Proof **) Section Completeness_Proof. Hypothesis safe: Inter.ValidSafe.safe. Hypothesis complete: complete. (* Properties of the automaton deduced from completeness validation. *) Proposition nullable_stable: nullable_stable. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition first_stable: first_stable. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition start_future: start_future. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition terminal_shift: terminal_shift. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition end_reduce: end_reduce. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition start_goto: start_goto. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. Proposition non_terminal_goto: non_terminal_goto. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. 
Proposition non_terminal_closed: non_terminal_closed. Proof. pose proof complete; unfold Valid.complete in H; intuition. Qed. (** If the nullable predicate has been validated, then it is correct. **) Lemma nullable_correct head word : word = [] -> parse_tree head word -> nullable_symb head = true with nullable_correct_list heads word : word = [] -> parse_tree_list heads word -> nullable_word heads = true. Proof. - destruct 2=>//. assert (Hnull := nullable_stable prod). erewrite nullable_correct_list in Hnull; eauto. - intros Hword. destruct 1=>//=. destruct (app_eq_nil _ _ Hword). eauto using andb_true_intro. Qed. (** Auxiliary lemma for first_correct. *) Lemma first_word_set_app t word1 word2 : TerminalSet.In t (first_word_set (word1 ++ word2)) <-> TerminalSet.In t (first_word_set word1) \/ TerminalSet.In t (first_word_set word2) /\ nullable_word (rev word1) = true. Proof. induction word1 as [|s word1 IH]=>/=. - split; [tauto|]. move=>[/TerminalSet.empty_1 ?|[? _]]//. - rewrite /nullable_word forallb_app /=. destruct nullable_symb=>/=. + rewrite Bool.andb_true_r. split. * move=>/TerminalSet.union_1. rewrite IH. move=>[?|[?|[??]]]; auto using TerminalSet.union_2, TerminalSet.union_3. * destruct IH. move=>[/TerminalSet.union_1 [?|?]|[??]]; auto using TerminalSet.union_2, TerminalSet.union_3. + rewrite Bool.andb_false_r. by intuition. Qed. (** If the first predicate has been validated, then it is correct. **) Lemma first_correct head word t q : word = t::q -> parse_tree head word -> TerminalSet.In (token_term t) (first_symb_set head) with first_correct_list heads word t q : word = t::q -> parse_tree_list heads word -> TerminalSet.In (token_term t) (first_word_set (rev' heads)). Proof. - intros Hword. destruct 1=>//. + inversion Hword. subst. apply TerminalSet.singleton_2, compare_refl. + eapply first_stable. eauto. - intros Hword. destruct 1 as [|symq wordq ptl symt wordt pt]=>//=. rewrite /rev' -rev_alt /= first_word_set_app /= rev_involutive rev_alt. 
destruct wordq; [right|left]. + destruct nullable_symb; eauto using TerminalSet.union_2, nullable_correct_list. + inversion Hword. subst. fold (rev' symq). eauto. Qed. (** A PTL is compatible with a stack if the top of the stack contains data corresponding to this PTL. *) Fixpoint ptl_stack_compat {symbs word} (stk0 : stack) (ptl : parse_tree_list symbs word) (stk : stack) : Prop := match ptl with | Nil_ptl => stk0 = stk | @Cons_ptl _ _ ptl sym _ pt => match stk with | [] => False | existT _ _ sem::stk => ptl_stack_compat stk0 ptl stk /\ exists e, sem = eq_rect _ symbol_semantic_type (pt_sem pt) _ e end end. (** .. and when a PTL is compatible with a stack, then calling the pop function returns the semantic value of this PTL. *) Lemma pop_stack_compat_pop_spec {A symbs word} (ptl:parse_tree_list symbs word) (stk:stack) (stk0:stack) action : ptl_stack_compat stk0 ptl stk -> pop_spec symbs stk action stk0 (ptl_sem (A:=A) ptl action). Proof. revert stk. induction ptl=>stk /= Hstk. - subst. constructor. - destruct stk as [|[st sem] stk]=>//. destruct Hstk as [Hstk [??]]. subst. simpl. constructor. eauto. Qed. Variable init: initstate. (** In order to prove completeness, we first fix a word to be parsed together with the content of the parser at the end of the parsing. *) Variable full_word: list token. Variable buffer_end: buffer. (** Completeness is proved by following the traversal of the parse tree which is performed by the parser. Each step of parsing corresponds to one step of traversal. In order to represent the state of the traversal, we define the notion of "dotted" parse tree, which is a parse tree with one dot on one of its nodes. The place of the dot represents the place of the next action to be executed. Such a dotted parse tree is decomposed into two parts: a "regular" parse tree, which is the parse tree placed under the dot, and a "parse tree zipper", which is the part of the parse tree placed above the dot.
Therefore, a parse tree zipper is a parse tree with a hole. Moreover, for easier manipulation, a parse tree zipper is represented "upside down". That is, the root of the parse tree is actually a leaf of the zipper, while the root of the zipper is the hole. *) Inductive pt_zipper: forall (hole_symb:symbol) (hole_word:list token), Type := | Top_ptz: pt_zipper (NT (start_nt init)) full_word | Cons_ptl_ptz: forall {head_symbolsq:list symbol} {wordq:list token}, parse_tree_list head_symbolsq wordq -> forall {head_symbolt:symbol} {wordt:list token}, ptl_zipper (head_symbolt::head_symbolsq) (wordq++wordt) -> pt_zipper head_symbolt wordt with ptl_zipper: forall (hole_symbs:list symbol) (hole_word:list token), Type := | Non_terminal_pt_ptlz: forall {p:production} {word:list token}, pt_zipper (NT (prod_lhs p)) word -> ptl_zipper (prod_rhs_rev p) word | Cons_ptl_ptlz: forall {head_symbolsq:list symbol} {wordq:list token}, forall {head_symbolt:symbol} {wordt:list token}, parse_tree head_symbolt wordt -> ptl_zipper (head_symbolt::head_symbolsq) (wordq++wordt) -> ptl_zipper head_symbolsq wordq. (** A dotted parse tree is the combination of a parse tree zipper with a parse tree. It comes in two flavors, depending on which action (shift or reduce) is to be executed next. *) Inductive pt_dot: Type := | Reduce_ptd: forall {prod word}, parse_tree_list (prod_rhs_rev prod) word -> pt_zipper (NT (prod_lhs prod)) word -> pt_dot | Shift_ptd: forall (tok : token) {symbolsq wordq}, parse_tree_list symbolsq wordq -> ptl_zipper (T (token_term tok)::symbolsq) (wordq++[tok]) -> pt_dot. (** We can compute the full semantic value of a parse tree when it is represented as a dotted parse tree.
*) Fixpoint ptlz_sem {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word) : (forall A, arrows_right A (map symbol_semantic_type hole_symbs) -> A) -> (symbol_semantic_type (NT (start_nt init))) := match ptlz with | @Non_terminal_pt_ptlz prod _ ptz => fun k => ptz_sem ptz (k _ (prod_action prod)) | Cons_ptl_ptlz pt ptlz => fun k => ptlz_sem ptlz (fun _ f => k _ (f (pt_sem pt))) end with ptz_sem {hole_symb hole_word} (ptz:pt_zipper hole_symb hole_word): symbol_semantic_type hole_symb -> symbol_semantic_type (NT (start_nt init)) := match ptz with | Top_ptz => fun sem => sem | Cons_ptl_ptz ptl ptlz => fun sem => ptlz_sem ptlz (fun _ f => ptl_sem ptl (f sem)) end. Definition ptd_sem (ptd : pt_dot) := match ptd with | @Reduce_ptd prod _ ptl ptz => ptz_sem ptz (ptl_sem ptl (prod_action prod)) | Shift_ptd tok ptl ptlz => ptlz_sem ptlz (fun _ f => ptl_sem ptl (f (token_sem tok))) end. (** The buffer associated with a dotted parse tree corresponds to the buffer left to be read by the parser when at the state represented by the dotted parse tree. *) Fixpoint ptlz_buffer {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word): buffer := match ptlz with | Non_terminal_pt_ptlz ptz => ptz_buffer ptz | @Cons_ptl_ptlz _ _ _ wordt _ ptlz' => wordt ++ ptlz_buffer ptlz' end with ptz_buffer {hole_symb hole_word} (ptz:pt_zipper hole_symb hole_word): buffer := match ptz with | Top_ptz => buffer_end | Cons_ptl_ptz _ ptlz => ptlz_buffer ptlz end. Definition ptd_buffer (ptd:pt_dot) := match ptd with | Reduce_ptd _ ptz => ptz_buffer ptz | @Shift_ptd tok _ wordq _ ptlz => (tok::ptlz_buffer ptlz)%buf end. (** We are now ready to define the main invariant of the proof of completeness: we need to specify when a stack is compatible with a dotted parse tree. Informally, a stack is compatible with a dotted parse tree when it is the concatenation of stack fragments which are compatible with each of the partially recognized productions appearing in the parse tree zipper.
Moreover, the head of each of these stack fragments contains a state which has an item predicted by the corresponding zipper. More formally, the compatibility relation first needs the following auxiliary definitions: *) Fixpoint ptlz_prod {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word): production := match ptlz with | @Non_terminal_pt_ptlz prod _ _ => prod | Cons_ptl_ptlz _ ptlz' => ptlz_prod ptlz' end. Fixpoint ptlz_future {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word): list symbol := match ptlz with | Non_terminal_pt_ptlz _ => [] | @Cons_ptl_ptlz _ _ s _ _ ptlz' => s::ptlz_future ptlz' end. Fixpoint ptlz_lookahead {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word) : terminal := match ptlz with | Non_terminal_pt_ptlz ptz => token_term (buf_head (ptz_buffer ptz)) | Cons_ptl_ptlz _ ptlz' => ptlz_lookahead ptlz' end. Fixpoint ptz_stack_compat {hole_symb hole_word} (stk : stack) (ptz : pt_zipper hole_symb hole_word) : Prop := match ptz with | Top_ptz => stk = [] | Cons_ptl_ptz ptl ptlz => exists stk0, state_has_future (state_of_stack init stk) (ptlz_prod ptlz) (hole_symb::ptlz_future ptlz) (ptlz_lookahead ptlz) /\ ptl_stack_compat stk0 ptl stk /\ ptlz_stack_compat stk0 ptlz end with ptlz_stack_compat {hole_symbs hole_word} (stk : stack) (ptlz : ptl_zipper hole_symbs hole_word) : Prop := match ptlz with | Non_terminal_pt_ptlz ptz => ptz_stack_compat stk ptz | Cons_ptl_ptlz _ ptlz => ptlz_stack_compat stk ptlz end. Definition ptd_stack_compat (ptd:pt_dot) (stk:stack): Prop := match ptd with | @Reduce_ptd prod _ ptl ptz => exists stk0, state_has_future (state_of_stack init stk) prod [] (token_term (buf_head (ptz_buffer ptz))) /\ ptl_stack_compat stk0 ptl stk /\ ptz_stack_compat stk0 ptz | Shift_ptd tok ptl ptlz => exists stk0, state_has_future (state_of_stack init stk) (ptlz_prod ptlz) (T (token_term tok) :: ptlz_future ptlz) (ptlz_lookahead ptlz) /\ ptl_stack_compat stk0 ptl stk /\ ptlz_stack_compat stk0 ptlz end.
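The zipper structures used above follow the standard functional "upside down" encoding. As a simplified, self-contained sketch (hypothetical names, not part of the library), a zipper over plain lists, together with the function plugging a list back into the hole, looks like:

```coq
(* Illustrative sketch only: a zipper over plain lists, stored
   "upside down" as in [pt_zipper]/[ptl_zipper] above: the root of
   the original structure is a leaf of the zipper. *)
Inductive list_zipper (A : Type) : Type :=
| ZTop  : list_zipper A
| ZDown : A -> list_zipper A -> list_zipper A.

Arguments ZTop {A}.
Arguments ZDown {A}.

(* Rebuild the full list by plugging [l] into the hole. *)
Fixpoint plug {A : Type} (z : list_zipper A) (l : list A) : list A :=
  match z with
  | ZTop => l
  | ZDown x z' => plug z' (x :: l)
  end.
```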
Lemma ptz_stack_compat_cons_state_has_future {symbsq wordq symbt wordt} stk (ptl : parse_tree_list symbsq wordq) (ptlz : ptl_zipper (symbt :: symbsq) (wordq ++ wordt)) : ptz_stack_compat stk (Cons_ptl_ptz ptl ptlz) -> state_has_future (state_of_stack init stk) (ptlz_prod ptlz) (symbt::ptlz_future ptlz) (ptlz_lookahead ptlz). Proof. move=>[stk0 [? [? ?]]] //. Qed. Lemma ptlz_future_ptlz_prod hole_symbs hole_word (ptlz:ptl_zipper hole_symbs hole_word) : rev_append (ptlz_future ptlz) hole_symbs = prod_rhs_rev (ptlz_prod ptlz). Proof. induction ptlz=>//=. Qed. Lemma ptlz_future_first {symbs word} (ptlz : ptl_zipper symbs word) : TerminalSet.In (token_term (buf_head (ptlz_buffer ptlz))) (first_word_set (ptlz_future ptlz)) \/ token_term (buf_head (ptlz_buffer ptlz)) = ptlz_lookahead ptlz /\ nullable_word (ptlz_future ptlz) = true. Proof. induction ptlz as [|??? [|tok] pt ptlz IH]; [by auto| |]=>/=. - rewrite (nullable_correct _ _ eq_refl pt). destruct IH as [|[??]]; [left|right]=>/=; auto using TerminalSet.union_3. - left. destruct nullable_symb; eauto using TerminalSet.union_2, first_correct. Qed. (** We now want to define the next dotted parse tree to be handled after one action. Such a dotted parse tree is built in two steps: not only do we have to perform the action by completing the parse tree, but we also have to prepare for the following step by moving the dot down, placing it in front of the next action to be performed.
*) Fixpoint build_pt_dot_from_pt {symb word} (pt : parse_tree symb word) (ptz : pt_zipper symb word) : pt_dot := match pt in parse_tree symb word return pt_zipper symb word -> pt_dot with | Terminal_pt tok => fun ptz => let X := match ptz in pt_zipper symb word return match symb with T term => True | NT _ => False end -> { symbsq : list symbol & { wordq : list token & (parse_tree_list symbsq wordq * ptl_zipper (symb :: symbsq) (wordq ++ word))%type } } with | Top_ptz => fun F => False_rect _ F | Cons_ptl_ptz ptl ptlz => fun _ => existT _ _ (existT _ _ (ptl, ptlz)) end I in Shift_ptd tok (fst (projT2 (projT2 X))) (snd (projT2 (projT2 X))) | Non_terminal_pt prod ptl => fun ptz => let is_notnil := match ptl in parse_tree_list w _ return option (match w return Prop with [] => False | _ => True end) with | Nil_ptl => None | _ => Some I end in match is_notnil with | None => Reduce_ptd ptl ptz | Some H => build_pt_dot_from_pt_rec ptl H (Non_terminal_pt_ptlz ptz) end end ptz with build_pt_dot_from_pt_rec {symbs word} (ptl : parse_tree_list symbs word) (Hsymbs : match symbs with [] => False | _ => True end) (ptlz : ptl_zipper symbs word) : pt_dot := match ptl in parse_tree_list symbs word return match symbs with [] => False | _ => True end -> ptl_zipper symbs word -> pt_dot with | Nil_ptl => fun Hsymbs _ => False_rect _ Hsymbs | Cons_ptl ptl' pt => fun _ => match ptl' in parse_tree_list symbsq wordq return parse_tree_list symbsq wordq -> ptl_zipper (_ :: symbsq) (wordq ++ _) -> pt_dot with | Nil_ptl => fun _ ptlz => build_pt_dot_from_pt pt (Cons_ptl_ptz Nil_ptl ptlz) | _ => fun ptl' ptlz => build_pt_dot_from_pt_rec ptl' I (Cons_ptl_ptlz pt ptlz) end ptl' end Hsymbs ptlz. 
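The dependent matches above rely on the "convoy pattern": the `return` clause abstracts over extra arguments so that the match refines their types, and impossible branches are discharged with `False_rect`. A minimal sketch of the same idiom (hypothetical example, not from the library):

```coq
(* Illustrative sketch only: the [return] clause abstracts over a
   proof argument; in the [O] branch that proof is refuted with
   [False_rect], just like the impossible branches above. *)
Definition pred_safe (n : nat) : n <> 0 -> nat :=
  match n return n <> 0 -> nat with
  | O => fun H => False_rect _ (H eq_refl)
  | S m => fun _ => m
  end.
```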
Definition build_pt_dot_from_ptl {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) : pt_dot := match ptlz in ptl_zipper symbs word return parse_tree_list symbs word -> pt_dot with | Non_terminal_pt_ptlz ptz => fun ptl => Reduce_ptd ptl ptz | Cons_ptl_ptlz pt ptlz => fun ptl => build_pt_dot_from_pt pt (Cons_ptl_ptz ptl ptlz) end ptl. Definition next_ptd (ptd:pt_dot) : option pt_dot := match ptd with | Shift_ptd tok ptl ptlz => Some (build_pt_dot_from_ptl (Cons_ptl ptl (Terminal_pt tok)) ptlz) | Reduce_ptd ptl ptz => match ptz in pt_zipper symb word return parse_tree symb word -> _ with | Top_ptz => fun _ => None | Cons_ptl_ptz ptl' ptlz => fun pt => Some (build_pt_dot_from_ptl (Cons_ptl ptl' pt) ptlz) end (Non_terminal_pt _ ptl) end. Fixpoint next_ptd_iter (ptd:pt_dot) (log_n_steps:nat) : option pt_dot := match log_n_steps with | O => next_ptd ptd | S log_n_steps => match next_ptd_iter ptd log_n_steps with | None => None | Some ptd => next_ptd_iter ptd log_n_steps end end. (** We prove that these functions behave well w.r.t. semantic values. *) Lemma sem_build_from_pt {symb word} (pt : parse_tree symb word) (ptz : pt_zipper symb word) : ptz_sem ptz (pt_sem pt) = ptd_sem (build_pt_dot_from_pt pt ptz) with sem_build_from_pt_rec {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) Hsymbs : ptlz_sem ptlz (fun _ f => ptl_sem ptl f) = ptd_sem (build_pt_dot_from_pt_rec ptl Hsymbs ptlz). Proof. - destruct pt as [tok|prod word ptl]=>/=. + revert ptz. generalize [tok]. generalize (token_sem tok). generalize I. change True with (match T (token_term tok) with T _ => True | NT _ => False end) at 1. generalize (T (token_term tok)) => symb HT sem word ptz. by destruct ptz. + match goal with | |- context [match ?X with Some H => _ | None => _ end] => destruct X=>// end. by rewrite -sem_build_from_pt_rec. - destruct ptl; [contradiction|]. specialize (sem_build_from_pt_rec _ _ ptl)=>/=. destruct ptl. 
+ by rewrite -sem_build_from_pt. + by rewrite -sem_build_from_pt_rec. Qed. Lemma sem_build_from_ptl {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) : ptlz_sem ptlz (fun _ f => ptl_sem ptl f) = ptd_sem (build_pt_dot_from_ptl ptl ptlz). Proof. destruct ptlz=>//=. by rewrite -sem_build_from_pt. Qed. Lemma sem_next_ptd (ptd : pt_dot) : match next_ptd ptd with | None => True | Some ptd' => ptd_sem ptd = ptd_sem ptd' end. Proof. destruct ptd as [prod word ptl ptz|tok symbs word ptl ptlz] =>/=. - change (ptl_sem ptl (prod_action prod)) with (pt_sem (Non_terminal_pt prod ptl)). generalize (Non_terminal_pt prod ptl). clear ptl. destruct ptz as [|?? ptl ?? ptlz]=>// pt. by rewrite -sem_build_from_ptl. - by rewrite -sem_build_from_ptl. Qed. Lemma sem_next_ptd_iter (ptd : pt_dot) (log_n_steps : nat) : match next_ptd_iter ptd log_n_steps with | None => True | Some ptd' => ptd_sem ptd = ptd_sem ptd' end. Proof. revert ptd. induction log_n_steps as [|log_n_steps IH]; [by apply sem_next_ptd|]=>/= ptd. assert (IH1 := IH ptd). destruct next_ptd_iter as [ptd'|]=>//. specialize (IH ptd'). destruct next_ptd_iter=>//. congruence. Qed. (** We prove that these functions behave well w.r.t. xxx_buffer. *) Lemma ptd_buffer_build_from_pt {symb word} (pt : parse_tree symb word) (ptz : pt_zipper symb word) : (word ++ ptz_buffer ptz)%buf = ptd_buffer (build_pt_dot_from_pt pt ptz) with ptd_buffer_build_from_pt_rec {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) Hsymbs : (word ++ ptlz_buffer ptlz)%buf = ptd_buffer (build_pt_dot_from_pt_rec ptl Hsymbs ptlz). Proof. - destruct pt as [tok|prod word ptl]=>/=. + f_equal. revert ptz. generalize [tok]. generalize (token_sem tok). generalize I. change True with (match T (token_term tok) with T _ => True | NT _ => False end) at 1. generalize (T (token_term tok)) => symb HT sem word ptz. by destruct ptz. 
+ match goal with | |- context [match ?X with Some H => _ | None => _ end] => destruct X eqn:EQ end. * by rewrite -ptd_buffer_build_from_pt_rec. * rewrite [X in (X ++ _)%buf](_ : word = []) //. clear -EQ. by destruct ptl. - destruct ptl as [|?? ptl ?? pt]; [contradiction|]. specialize (ptd_buffer_build_from_pt_rec _ _ ptl). destruct ptl. + by rewrite /= -ptd_buffer_build_from_pt. + by rewrite -ptd_buffer_build_from_pt_rec //= app_buf_assoc. Qed. Lemma ptd_buffer_build_from_ptl {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) : ptlz_buffer ptlz = ptd_buffer (build_pt_dot_from_ptl ptl ptlz). Proof. destruct ptlz as [|???? pt]=>//=. by rewrite -ptd_buffer_build_from_pt. Qed. (** We prove that these functions behave well w.r.t. xxx_stack_compat. *) Lemma ptd_stack_compat_build_from_pt {symb word} (pt : parse_tree symb word) (ptz : pt_zipper symb word) (stk: stack) : ptz_stack_compat stk ptz -> ptd_stack_compat (build_pt_dot_from_pt pt ptz) stk with ptd_stack_compat_build_from_pt_rec {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) (stk : stack) Hsymbs : ptlz_stack_compat stk ptlz -> state_has_future (state_of_stack init stk) (ptlz_prod ptlz) (rev' (prod_rhs_rev (ptlz_prod ptlz))) (ptlz_lookahead ptlz) -> ptd_stack_compat (build_pt_dot_from_pt_rec ptl Hsymbs ptlz) stk. Proof. - intros Hstk. destruct pt as [tok|prod word ptl]=>/=. + revert ptz Hstk. generalize [tok]. generalize (token_sem tok). generalize I. change True with (match T (token_term tok) with T _ => True | NT _ => False end) at 1. generalize (T (token_term tok)) => symb HT sem word ptz. by destruct ptz. + assert (state_has_future (state_of_stack init stk) prod (rev' (prod_rhs_rev prod)) (token_term (buf_head (ptz_buffer ptz)))). { revert ptz Hstk. remember (NT (prod_lhs prod)) eqn:EQ=>ptz. destruct ptz as [|?? ptl0 ?? ptlz0]. - intros ->. apply start_future. congruence. - subst. intros (stk0 & Hfut & _). apply non_terminal_closed in Hfut. 
specialize (Hfut prod eq_refl). destruct (ptlz_future_first ptlz0) as [Hfirst|[Hfirst Hnull]]. + destruct Hfut as [_ Hfut]. auto. + destruct Hfut as [Hfut _]. by rewrite Hnull -Hfirst in Hfut. } match goal with | |- context [match ?X with Some H => _ | None => _ end] => destruct X eqn:EQ end. * by apply ptd_stack_compat_build_from_pt_rec. * exists stk. destruct ptl=>//. - intros Hstk Hfut. destruct ptl as [|?? ptl ?? pt]; [contradiction|]. specialize (ptd_stack_compat_build_from_pt_rec _ _ ptl). destruct ptl. + eapply ptd_stack_compat_build_from_pt=>//. exists stk. split; [|split]=>//; []. by rewrite -ptlz_future_ptlz_prod rev_append_rev /rev' -rev_alt rev_app_distr rev_involutive in Hfut. + by apply ptd_stack_compat_build_from_pt_rec. Qed. Lemma ptd_stack_compat_build_from_ptl {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) (stk stk0: stack) : ptlz_stack_compat stk0 ptlz -> ptl_stack_compat stk0 ptl stk -> state_has_future (state_of_stack init stk) (ptlz_prod ptlz) (ptlz_future ptlz) (ptlz_lookahead ptlz) -> ptd_stack_compat (build_pt_dot_from_ptl ptl ptlz) stk. Proof. intros Hstk0 Hstk Hfut. destruct ptlz=>/=. - eauto. - apply ptd_stack_compat_build_from_pt=>/=. eauto. Qed. (** We can now proceed by proving that the invariant is preserved by each step of parsing. We also prove that each step of parsing follows next_ptd. We start with reduce steps: *) Lemma reduce_step_next_ptd (prod : production) (word : list token) (ptl : parse_tree_list (prod_rhs_rev prod) word) (ptz : pt_zipper (NT (prod_lhs prod)) word) (stk : stack) Hval Hi : ptd_stack_compat (Reduce_ptd ptl ptz) stk -> match next_ptd (Reduce_ptd ptl ptz) with | None => reduce_step init stk prod (ptz_buffer ptz) Hval Hi = Accept_sr (ptd_sem (Reduce_ptd ptl ptz)) buffer_end | Some ptd => exists stk', reduce_step init stk prod (ptz_buffer ptz) Hval Hi = Progress_sr stk' (ptd_buffer ptd) /\ ptd_stack_compat ptd stk' end. Proof. intros (stk0 & _ & Hstk & Hstk0). 
apply pop_stack_compat_pop_spec with (action := prod_action prod) in Hstk. rewrite <-pop_spec_ok with (Hp := reduce_step_subproof init stk prod Hval Hi) in Hstk. unfold reduce_step. match goal with | |- context [pop_state_valid init ?A stk ?B ?C ?D ?E ?F] => generalize (pop_state_valid init A stk B C D E F) end. rewrite Hstk /=. intros Hv. generalize (reduce_step_subproof1 init stk prod Hval stk0 (fun _ : True => Hv)). clear Hval Hstk Hi Hv stk. assert (Hgoto := fun fut prod' => non_terminal_goto (state_of_stack init stk0) prod' (NT (prod_lhs prod)::fut)). simpl in Hgoto. destruct goto_table as [[st Hst]|] eqn:Hgoto'. - intros _. assert (match ptz with Top_ptz => False | _ => True end). { revert ptz Hst Hstk0 Hgoto'. generalize (eq_refl (NT (prod_lhs prod))). generalize (NT (prod_lhs prod)) at 1 3 5. intros nt Hnt ptz. destruct ptz=>//. injection Hnt=> <- /= Hst -> /= Hg. assert (Hsg := start_goto init). by rewrite Hg in Hsg. } clear Hgoto'. change (ptl_sem ptl (prod_action prod)) with (pt_sem (Non_terminal_pt prod ptl)). generalize (Non_terminal_pt prod ptl). clear ptl. destruct ptz as [|?? ptl ? ? ptlz]=>// pt. subst=>/=. eexists _. split. + f_equal. apply ptd_buffer_build_from_ptl. + destruct Hstk0 as (stk0' & Hfut & Hstk0' & Hstk0). apply (ptd_stack_compat_build_from_ptl _ _ _ stk0'); auto; []. split=>//. by exists eq_refl. - intros Hv. generalize (reduce_step_subproof0 _ prod _ (fun _ => Hv)). intros EQnt. clear Hv Hgoto'. change (ptl_sem ptl (prod_action prod)) with (pt_sem (Non_terminal_pt prod ptl)). generalize (Non_terminal_pt prod ptl). clear ptl. destruct ptz. + intros pt. f_equal. by rewrite cast_eq. + edestruct Hgoto. eapply ptz_stack_compat_cons_state_has_future, Hstk0. Qed. 
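The simulation lemmas in this section relate one parser step to one `next_ptd` step; recall that `next_ptd_iter` (and, on the parser side, `parse_fix`) composes 2^log_n_steps such steps by doubling rather than with a unary fuel counter. The recursion scheme can be sketched generically as follows (hypothetical names, not part of the library):

```coq
(* Illustrative sketch only: iterating a partial step function
   [f] exactly 2^n times, by running two copies of the (n-1)-fold
   iterate, mirroring the structure of [next_ptd_iter]. *)
Fixpoint iter_pow2 {A : Type} (f : A -> option A) (n : nat) (x : A)
  : option A :=
  match n with
  | O => f x
  | S n' =>
    match iter_pow2 f n' x with
    | None => None
    | Some y => iter_pow2 f n' y
    end
  end.
```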
Lemma step_next_ptd (ptd : pt_dot) (stk : stack) Hi : ptd_stack_compat ptd stk -> match next_ptd ptd with | None => step safe init stk (ptd_buffer ptd) Hi = Accept_sr (ptd_sem ptd) buffer_end | Some ptd' => exists stk', step safe init stk (ptd_buffer ptd) Hi = Progress_sr stk' (ptd_buffer ptd') /\ ptd_stack_compat ptd' stk' end. Proof. intros Hstk. unfold step. generalize (reduce_ok safe (state_of_stack init stk)). destruct ptd as [prod word ptl ptz|tok symbs word ptl ptlz]. - assert (Hfut : state_has_future (state_of_stack init stk) prod [] (token_term (buf_head (ptz_buffer ptz)))). { destruct Hstk as (? & ? & ?)=>//. } assert (Hact := end_reduce _ _ _ _ Hfut). destruct action_table as [?|awt]=>Hval /=. + subst. by apply reduce_step_next_ptd. + set (term := token_term (buf_head (ptz_buffer ptz))) in *. generalize (Hval term). clear Hval. destruct (awt term)=>//. subst. intros Hval. by apply reduce_step_next_ptd. - destruct Hstk as (stk0 & Hfut & Hstk & Hstk0). assert (Hact := terminal_shift _ _ _ _ Hfut). simpl in Hact. clear Hfut. destruct action_table as [?|awt]=>//= /(_ (token_term tok)). destruct awt as [st' EQ| |]=>// _. eexists. split. + f_equal. rewrite -ptd_buffer_build_from_ptl //. + apply (ptd_stack_compat_build_from_ptl _ _ _ stk0); simpl; eauto. Qed. (** We prove the completeness of the parser main loop. *) Lemma parse_fix_next_ptd_iter (ptd : pt_dot) (stk : stack) (log_n_steps : nat) Hi : ptd_stack_compat ptd stk -> match next_ptd_iter ptd log_n_steps with | None => proj1_sig (parse_fix safe init stk (ptd_buffer ptd) log_n_steps Hi) = Accept_sr (ptd_sem ptd) buffer_end | Some ptd' => exists stk', proj1_sig (parse_fix safe init stk (ptd_buffer ptd) log_n_steps Hi) = Progress_sr stk' (ptd_buffer ptd') /\ ptd_stack_compat ptd' stk' end. Proof. revert ptd stk Hi. induction log_n_steps as [|log_n_steps IH]; [by apply step_next_ptd|]. move => /= ptd stk Hi Hstk. assert (IH1 := IH ptd stk Hi Hstk). assert (EQsem := sem_next_ptd_iter ptd log_n_steps). 
destruct parse_fix as [sr Hi']. simpl in IH1. destruct next_ptd_iter as [ptd'|]. - rewrite EQsem. destruct IH1 as (stk' & -> & Hstk'). by apply IH. - by subst. Qed. (** The parser is defined by recursion over a fuel parameter. In the completeness proof, we need to predict how much fuel is going to be needed in order to prove that enough fuel gives rise to a successful parsing. To do so, we define the cost of a dotted parse tree, which is the number of actions left to be executed before parsing is complete when the current state is represented by the dotted parse tree. *) Fixpoint ptlz_cost {hole_symbs hole_word} (ptlz:ptl_zipper hole_symbs hole_word) := match ptlz with | Non_terminal_pt_ptlz ptz => ptz_cost ptz | Cons_ptl_ptlz pt ptlz' => pt_size pt + ptlz_cost ptlz' end with ptz_cost {hole_symb hole_word} (ptz:pt_zipper hole_symb hole_word) := match ptz with | Top_ptz => 0 | Cons_ptl_ptz ptl ptlz' => 1 + ptlz_cost ptlz' end. Definition ptd_cost (ptd:pt_dot) := match ptd with | Reduce_ptd ptl ptz => ptz_cost ptz | Shift_ptd _ ptl ptlz => 1 + ptlz_cost ptlz end. Lemma ptd_cost_build_from_pt {symb word} (pt : parse_tree symb word) (ptz : pt_zipper symb word) : pt_size pt + ptz_cost ptz = S (ptd_cost (build_pt_dot_from_pt pt ptz)) with ptd_cost_build_from_pt_rec {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) Hsymbs : ptl_size ptl + ptlz_cost ptlz = ptd_cost (build_pt_dot_from_pt_rec ptl Hsymbs ptlz). Proof. - destruct pt as [tok|prod word ptl']=>/=. + revert ptz. generalize [tok]. generalize (token_sem tok). generalize I. change True with (match T (token_term tok) with T _ => True | NT _ => False end) at 1. generalize (T (token_term tok)) => symb HT sem word ptz. by destruct ptz. + match goal with | |- context [match ?X with Some H => _ | None => _ end] => destruct X eqn:EQ end. * rewrite -ptd_cost_build_from_pt_rec /= plus_n_Sm //. * simpl. by destruct ptl'. - destruct ptl as [|?? ptl ?? pt]; [contradiction|].
specialize (ptd_cost_build_from_pt_rec _ _ ptl). destruct ptl. + apply eq_add_S. rewrite -ptd_cost_build_from_pt /=. ring. + rewrite -ptd_cost_build_from_pt_rec //=. ring. Qed. Lemma ptd_cost_build_from_ptl {symbs word} (ptl : parse_tree_list symbs word) (ptlz : ptl_zipper symbs word) : ptlz_cost ptlz = ptd_cost (build_pt_dot_from_ptl ptl ptlz). Proof. destruct ptlz=>//. apply eq_add_S. rewrite -ptd_cost_build_from_pt /=. ring. Qed. Lemma next_ptd_cost ptd: match next_ptd ptd with | None => ptd_cost ptd = 0 | Some ptd' => ptd_cost ptd = S (ptd_cost ptd') end. Proof. destruct ptd as [prod word ptl ptz|tok symbq wordq ptl ptlz] =>/=. - generalize (Non_terminal_pt prod ptl). clear ptl. destruct ptz as [|?? ptl ?? ptlz]=>// pt. by rewrite -ptd_cost_build_from_ptl. - by rewrite -ptd_cost_build_from_ptl. Qed. Lemma next_ptd_iter_cost ptd log_n_steps : match next_ptd_iter ptd log_n_steps with | None => ptd_cost ptd < 2^log_n_steps | Some ptd' => ptd_cost ptd = 2^log_n_steps + ptd_cost ptd' end. Proof. revert ptd. induction log_n_steps as [|log_n_steps IH]=>ptd /=. - assert (Hptd := next_ptd_cost ptd). destruct next_ptd=>//. by rewrite Hptd. - rewrite Nat.add_0_r. assert (IH1 := IH ptd). destruct next_ptd_iter as [ptd'|]. + specialize (IH ptd'). destruct next_ptd_iter as [ptd''|]. * by rewrite IH1 IH -!plus_assoc. * rewrite IH1. by apply plus_lt_compat_l. + by apply lt_plus_trans. Qed. (** We now prove the completeness of the top-level parsing function. The only thing that is left to be done is the initialization. To do so, we define the initial dotted parse tree, depending on a full (top-level) parse tree. *) Variable full_pt : parse_tree (NT (start_nt init)) full_word. Theorem parse_complete log_n_steps: match parse safe init (full_word ++ buffer_end) log_n_steps with | Parsed_pr sem buff => sem = pt_sem full_pt /\ buff = buffer_end /\ pt_size full_pt <= 2^log_n_steps | Timeout_pr => 2^log_n_steps < pt_size full_pt | Fail_pr => False end. Proof.
assert (Hstk : ptd_stack_compat (build_pt_dot_from_pt full_pt Top_ptz) []) by by apply ptd_stack_compat_build_from_pt. unfold parse. assert (Hparse := parse_fix_next_ptd_iter _ _ log_n_steps (parse_subproof init) Hstk). rewrite -ptd_buffer_build_from_pt -sem_build_from_pt /= in Hparse. assert (Hcost := next_ptd_iter_cost (build_pt_dot_from_pt full_pt Top_ptz) log_n_steps). destruct next_ptd_iter. - destruct Hparse as (? & -> & ?). apply (f_equal S) in Hcost. rewrite -ptd_cost_build_from_pt Nat.add_0_r in Hcost. rewrite Hcost. apply le_lt_n_Sm, le_plus_l. - rewrite Hparse. split; [|split]=>//. apply lt_le_S in Hcost. by rewrite -ptd_cost_build_from_pt Nat.add_0_r in Hcost. Qed. End Completeness_Proof. End Make. menhir-20200123/coq-menhirlib/src/Interpreter_correct.v (****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List Syntax. Require Import Alphabet. Require Grammar Automaton Interpreter. From Coq.ssr Require Import ssreflect. Module Make(Import A:Automaton.T) (Import Inter:Interpreter.T A). (** * Correctness of the interpreter **) (** We prove that, in any case, if the interpreter accepts returning a semantic value, then this is a semantic value of the input **) Section Init. Variable init:initstate. (** [word_has_stack_semantics] relates a word with a stack, stating that the word is a concatenation of words that have the semantic values stored in the stack.
**) Inductive word_has_stack_semantics: forall (word:list token) (stack:stack), Prop := | Nil_stack_whss: word_has_stack_semantics [] [] | Cons_stack_whss: forall (wordq:list token) (stackq:stack), word_has_stack_semantics wordq stackq -> forall (wordt:list token) (s:noninitstate) (pt:parse_tree (last_symb_of_non_init_state s) wordt), word_has_stack_semantics (wordq++wordt) (existT noninitstate_type s (pt_sem pt)::stackq). (** [pop] preserves the invariant **) Lemma pop_spec_ptl A symbols_to_pop action word_stk stk (res : A) stk' : pop_spec symbols_to_pop stk action stk' res -> word_has_stack_semantics word_stk stk -> exists word_stk' word_res (ptl:parse_tree_list symbols_to_pop word_res), (word_stk' ++ word_res = word_stk)%list /\ word_has_stack_semantics word_stk' stk' /\ ptl_sem ptl action = res. Proof. intros Hspec. revert word_stk. induction Hspec as [stk sem|symbols_to_pop st stk action sem stk' res Hspec IH]; intros word_stk Hword_stk. - exists word_stk, [], Nil_ptl. rewrite -app_nil_end. eauto. - inversion Hword_stk. subst_existT. edestruct IH as (word_stk' & word_res & ptl & ? & Hword_stk'' & ?); [eassumption|]. subst. eexists word_stk', (word_res ++ _)%list, (Cons_ptl ptl _). split; [|split]=>//. rewrite app_assoc //. Qed. (** [reduce_step] preserves the invariant **) Lemma reduce_step_invariant (stk:stack) (prod:production) Hv Hi word buffer : word_has_stack_semantics word stk -> match reduce_step init stk prod buffer Hv Hi with | Accept_sr sem buffer_new => exists pt : parse_tree (NT (start_nt init)) word, buffer = buffer_new /\ pt_sem pt = sem | Progress_sr stk' buffer_new => buffer = buffer_new /\ word_has_stack_semantics word stk' | Fail_sr => True end. Proof. intros Hword_stk. unfold reduce_step. match goal with | |- context [pop_state_valid init ?stp stk ?x1 ?x2 ?x3 ?x4 ?x5] => generalize (pop_state_valid init stp stk x1 x2 x3 x4 x5) end. destruct pop as [stk' sem] eqn:Hpop=>/= Hv'. apply pop_spec_ok in Hpop. 
apply pop_spec_ptl with (word_stk := word) in Hpop=>//. destruct Hpop as (word1 & word2 & ptl & <- & Hword1 & <-). generalize (reduce_step_subproof1 init stk prod Hv stk' (fun _ : True => Hv')). destruct goto_table as [[st' EQ]|]. - intros _. split=>//. change (ptl_sem ptl (prod_action prod)) with (pt_sem (Non_terminal_pt prod ptl)). generalize (Non_terminal_pt prod ptl). rewrite ->EQ. intros pt. by constructor. - intros Hstk'. destruct Hword1; [|by destruct Hstk']. generalize (reduce_step_subproof0 init prod [] (fun _ : True => Hstk')). simpl in Hstk'. rewrite -Hstk' // => EQ. rewrite cast_eq. exists (Non_terminal_pt prod ptl). by split. Qed. (** [step] preserves the invariant **) Lemma step_invariant stk word buffer safe Hi : word_has_stack_semantics word stk -> match step safe init stk buffer Hi with | Accept_sr sem buffer_new => exists word_new (pt:parse_tree (NT (start_nt init)) word_new), (word ++ buffer = word_new ++ buffer_new)%buf /\ pt_sem pt = sem | Progress_sr stk_new buffer_new => exists word_new, (word ++ buffer = word_new ++ buffer_new)%buf /\ word_has_stack_semantics word_new stk_new | Fail_sr => True end. Proof. intros Hword_stk. unfold step. generalize (reduce_ok safe (state_of_stack init stk)). destruct action_table as [prod|awt]. - intros Hv. apply (reduce_step_invariant stk prod (fun _ => Hv) Hi word buffer) in Hword_stk. destruct reduce_step=>//. + destruct Hword_stk as (pt & <- & <-); eauto. + destruct Hword_stk as [<- ?]; eauto. - destruct buffer as [tok buffer]=>/=. move=> /(_ (token_term tok)) Hv. destruct (awt (token_term tok)) as [st EQ|prod|]=>//. + eexists _. split; [by apply app_buf_assoc with (l2 := [_])|]. change (token_sem tok) with (pt_sem (Terminal_pt tok)). generalize (Terminal_pt tok). generalize [tok]. rewrite -> EQ=>word' pt /=. by constructor. + apply (reduce_step_invariant stk prod (fun _ => Hv) Hi word (tok::buffer)) in Hword_stk. destruct reduce_step=>//. * destruct Hword_stk as (pt & <- & <-); eauto. 
  * destruct Hword_stk as [<- ?]; eauto.
Qed.

(** [parse_fix] preserves the invariant **)
Lemma parse_fix_invariant stk word buffer safe log_n_steps Hi :
  word_has_stack_semantics word stk ->
  match proj1_sig (parse_fix safe init stk buffer log_n_steps Hi) with
  | Accept_sr sem buffer_new =>
    exists word_new (pt:parse_tree (NT (start_nt init)) word_new),
      (word ++ buffer = word_new ++ buffer_new)%buf /\ pt_sem pt = sem
  | Progress_sr stk_new buffer_new =>
    exists word_new,
      (word ++ buffer = word_new ++ buffer_new)%buf /\
      word_has_stack_semantics word_new stk_new
  | Fail_sr => True
  end.
Proof.
  revert stk word buffer Hi.
  induction log_n_steps as [|log_n_steps IH]=>/= stk word buffer Hi Hstk;
    [by apply step_invariant|].
  assert (IH1 := IH stk word buffer Hi Hstk).
  destruct parse_fix as [[] Hi']=>/=; try by apply IH1.
  destruct IH1 as (word' & -> & Hstk')=>//.
  by apply IH.
Qed.

(** The interpreter is correct: if it returns a semantic value, then the
    input word has this semantic value. **)
Theorem parse_correct safe buffer log_n_steps:
  match parse safe init buffer log_n_steps with
  | Parsed_pr sem buffer_new =>
    exists word_new (pt:parse_tree (NT (start_nt init)) word_new),
      buffer = (word_new ++ buffer_new)%buf /\ pt_sem pt = sem
  | _ => True
  end.
Proof.
  unfold parse.
  assert (Hparse := parse_fix_invariant [] [] buffer safe log_n_steps
                                        (parse_subproof init)).
  destruct proj1_sig=>//.
  apply Hparse. constructor.
Qed.

End Init.

End Make.
menhir-20200123/coq-menhirlib/src/Main.v000066400000000000000000000065441361226111300176260ustar00rootroot00000000000000(****************************************************************************)
(*                                                                          *)
(*                                   Menhir                                 *)
(*                                                                          *)
(*          Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud          *)
(*                                                                          *)
(*  Copyright Inria. All rights reserved.
This file is distributed under   *)
(*  the terms of the GNU Lesser General Public License as published by the  *)
(*  Free Software Foundation, either version 3 of the License, or (at your  *)
(*  option) any later version, as described in the file LICENSE.            *)
(*                                                                          *)
(****************************************************************************)

Require Grammar Automaton Interpreter_correct Interpreter_complete.
From Coq Require Import Syntax Arith.

Module Make(Export Aut:Automaton.T).
Export Aut.Gram.
Export Aut.GramDefs.

Module Import Inter := Interpreter.Make Aut.
Module Correct := Interpreter_correct.Make Aut Inter.
Module Complete := Interpreter_complete.Make Aut Inter.

Definition complete_validator:unit->bool := Complete.Valid.is_complete.
Definition safe_validator:unit->bool := ValidSafe.is_safe.
Definition parse (safe:safe_validator ()=true) init log_n_steps buffer :
  parse_result (symbol_semantic_type (NT (start_nt init))):=
  parse (ValidSafe.safe_is_validator safe) init buffer log_n_steps.

(** Correctness theorem. **)
Theorem parse_correct
  (safe:safe_validator ()= true) init log_n_steps buffer:
  match parse safe init log_n_steps buffer with
  | Parsed_pr sem buffer_new =>
    exists word (pt : parse_tree (NT (start_nt init)) word),
      buffer = (word ++ buffer_new)%buf /\ pt_sem pt = sem
  | _ => True
  end.
Proof. apply Correct.parse_correct. Qed.

(** Completeness theorem. **)
Theorem parse_complete (safe:safe_validator () = true) init log_n_steps word buffer_end:
  complete_validator () = true ->
  forall tree:parse_tree (NT (start_nt init)) word,
  match parse safe init log_n_steps (word ++ buffer_end) with
  | Fail_pr => False
  | Parsed_pr sem_res buffer_end_res =>
    sem_res = pt_sem tree /\ buffer_end_res = buffer_end /\
    pt_size tree <= 2^log_n_steps
  | Timeout_pr => 2^log_n_steps < pt_size tree
  end.
Proof.
  intros. now apply Complete.parse_complete, Complete.Valid.complete_is_validator.
Qed.

(** Unambiguity theorem.
**) Theorem unambiguity: safe_validator () = true -> complete_validator () = true -> inhabited token -> forall init word, forall (tree1 tree2:parse_tree (NT (start_nt init)) word), pt_sem tree1 = pt_sem tree2. Proof. intros Hsafe Hcomp [tok] init word tree1 tree2. pose (buf_end := cofix buf_end := (tok :: buf_end)%buf). assert (Hcomp1 := parse_complete Hsafe init (pt_size tree1) word buf_end Hcomp tree1). assert (Hcomp2 := parse_complete Hsafe init (pt_size tree1) word buf_end Hcomp tree2). destruct parse. - destruct Hcomp1. - exfalso. eapply PeanoNat.Nat.lt_irrefl. etransitivity; [|apply Hcomp1]. eapply Nat.pow_gt_lin_r. constructor. - destruct Hcomp1 as [-> _], Hcomp2 as [-> _]. reflexivity. Qed. End Make. menhir-20200123/coq-menhirlib/src/Makefile000066400000000000000000000014421361226111300202040ustar00rootroot00000000000000PWD := $(shell pwd) COQINCLUDE := -R $(PWD) MenhirLib \ export COQINCLUDE .PHONY: all clean install uninstall all: _CoqProject @ $(MAKE) -f Makefile.coq --no-print-directory all _CoqProject: @ $(MAKE) -f Makefile.coq --no-print-directory _CoqProject clean: @ $(MAKE) -f Makefile.coq --no-print-directory clean @ rm -f _CoqProject # The role of DESTDIR is explained here: # https://www.gnu.org/prep/standards/html_node/DESTDIR.html # Basically, it is empty in a normal installation. # A nonempty value can be used to perform a dummy installation # in a different location. CONTRIB = $(shell $(COQBIN)coqc -where)/user-contrib TARGET = $(DESTDIR)$(CONTRIB)/MenhirLib install: all rm -rf $(TARGET) mkdir -p $(TARGET) install -m 644 *.v *.vo *.glob $(TARGET) uninstall: rm -rf $(TARGET) menhir-20200123/coq-menhirlib/src/Makefile.coq000066400000000000000000000135031361226111300207660ustar00rootroot00000000000000############################################################################ # Requirements. # We need bash. We use the pipefail option to control the exit code of a # pipeline. 
SHELL := /usr/bin/env bash ############################################################################ # Configuration # # # This Makefile relies on the following variables: # ROOTDIR (default: `pwd`) # COQBIN (default: empty) # COQINCLUDE (default: empty) # VV (default: *.v) # V_AUX (default: undefined/empty) # SERIOUS (default: 1) # (if 0, we produce .vio files) # (if 1, we produce .vo files in the old way) # VERBOSE (default: undefined) # (if defined, commands are displayed) # We usually refer to the .v files using relative paths (such as Foo.v) # but [coqdep -R] produces dependencies that refer to absolute paths # (such as /bar/Foo.v). This confuses [make], which does not recognize # that these files are the same. As a result, [make] does not respect # the dependencies. # We fix this by using ABSOLUTE PATHS EVERYWHERE. The paths used in targets, # in -R options, etc., must be absolute paths. ifndef ROOTDIR ROOTDIR := $(shell pwd) endif ifndef VV VV := $(wildcard $(ROOTDIR)/*.v) endif # Typically, $(VV) should list only the .v files that we are ultimately # interested in checking. $(V_AUX) should list every other .v file in the # project. $(VD) is obtained from $(VV) and $(V_AUX), so [make] sees all # dependencies and can rebuild files anywhere in the project, if needed, and # only if needed. 
ifndef VD VD := $(patsubst %.v,%.v.d,$(VV) $(V_AUX)) endif VIO := $(patsubst %.v,%.vio,$(VV)) VQ := $(patsubst %.v,%.vq,$(VV)) VO := $(patsubst %.v,%.vo,$(VV)) SERIOUS := 1 ############################################################################ # Binaries COQC := $(COQBIN)coqc $(COQFLAGS) COQDEP := $(COQBIN)coqdep COQIDE := $(COQBIN)coqide $(COQFLAGS) COQCHK := $(COQBIN)coqchk ############################################################################ # Targets .PHONY: all proof depend quick proof_vo proof_vq all: proof ifeq ($(SERIOUS),0) proof: proof_vq else proof: proof_vo endif proof_vq: $(VQ) depend: $(VD) quick: $(VIO) proof_vo: $(VO) ############################################################################ # Verbosity control. # Our commands are pretty long (due, among other things, to the use of # absolute paths everywhere). So, we hide them by default, and echo a short # message instead. However, sometimes one wants to see the command. # By default, VERBOSE is undefined, so the .SILENT directive is read, so no # commands are echoed. If VERBOSE is defined by the user, then the .SILENT # directive is ignored, so commands are echoed, unless they begin with an # explicit @. ifndef VERBOSE .SILENT: endif ############################################################################ # Verbosity filter. # Coq is way too verbose when using one of the -schedule-* commands. # So, we grep its output and remove any line that contains 'Checking task'. # We need a pipe that keeps the exit code of the *first* process. In # bash, when the pipefail option is set, the exit code is the logical # conjunction of the exit codes of the two processes. If we make sure # that the second process always succeeds, then we get the exit code # of the first process, as desired. 
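# For example (illustrative only, not used by any rule in this Makefile):
# under "set -o pipefail", the pipeline
#     ( exit 3 ) | ( grep -v 'Checking task' || true )
# exits with code 3. The right-hand side always succeeds thanks to
# "|| true", so the pipeline's exit code is that of the left-hand side.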
############################################################################ # Rules # If B uses A, then the dependencies produced by coqdep are: # B.vo: B.v A.vo # B.vio: B.v A.vio %.v.d: %.v $(COQDEP) $(COQINCLUDE) $< > $@ ifeq ($(SERIOUS),0) %.vo: %.vio @echo "Compiling `basename $*`..." set -o pipefail; ( \ $(COQC) $(COQINCLUDE) -schedule-vio2vo 1 $* \ 2>&1 | (grep -v 'Checking task' || true)) # The recipe for producing %.vio destroys %.vo. In other words, we do not # allow a young .vio file to co-exist with an old (possibly out-of-date) .vo # file, because this seems to lead Coq into various kinds of problems # ("inconsistent assumption" errors, "undefined universe" errors, warnings # about the existence of both files, and so on). Destroying %.vo should be OK # as long as the user does not try to build a mixture of .vo and .vio files in # one invocation of make. %.vio: %.v @echo "Digesting `basename $*`..." rm -f $*.vo $(COQC) $(COQINCLUDE) -quick $< %.vq: %.vio @echo "Checking `basename $*`..." set -o pipefail; ( \ $(COQC) $(COQINCLUDE) -schedule-vio-checking 1 $< \ 2>&1 | (grep -v 'Checking task' || true)) touch $@ endif ifeq ($(SERIOUS),1) %.vo: %.v @echo "Compiling `basename $*`..." $(COQC) $(COQINCLUDE) $< # @echo "$(COQC) $(COQINCLUDE) $<" endif _CoqProject: .FORCE @echo $(COQINCLUDE) > $@ .FORCE: ############################################################################ # Dependencies ifeq ($(findstring $(MAKECMDGOALS),depend clean),) -include $(VD) endif ############################################################################ # IDE .PHONY: ide .coqide: @echo '$(COQIDE) $(COQINCLUDE) $$*' > .coqide @chmod +x .coqide ide: _CoqProject $(COQIDE) $(COQINCLUDE) ############################################################################ # Clean .PHONY: clean # In a multi-directory setting, it is not entirely clear how to find the # files that we wish to remove. 
# One approach would be to view $(VV) as the authoritative list of source files # and remove just the derived files $(VO), etc. # Another approach is to scan all subdirectories of $(ROOTDIR) and remove all # object files in them. We follow this approach. # Be careful to use regular expressions that work both with GNU find # and with BSD find (MacOS). clean:: for d in `find $(ROOTDIR) -type d -not -regex ".*\\.git.*"` ; do \ (cd $$d && \ rm -f *~ && \ rm -f .*.aux && \ rm -f *.{vo,vio,vq,v.d,aux,glob,cache,crashcoqide} && \ rm -rf *.coq-native *.coqide && \ true) ; \ done menhir-20200123/coq-menhirlib/src/Validator_classes.v000066400000000000000000000060671361226111300224050ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List. From Coq.ssr Require Import ssreflect. Require Import Alphabet. Class IsValidator (P : Prop) (b : bool) := is_validator : b = true -> P. Hint Mode IsValidator + - : typeclass_instances. Instance is_validator_true : IsValidator True true. Proof. done. Qed. Instance is_validator_false : IsValidator False false. Proof. done. Qed. Instance is_validator_eq_true b : IsValidator (b = true) b. Proof. done. Qed. Instance is_validator_and P1 b1 P2 b2 `{IsValidator P1 b1} `{IsValidator P2 b2}: IsValidator (P1 /\ P2) (if b1 then b2 else false). Proof. by split; destruct b1, b2; apply is_validator. Qed. 
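(* Example (illustrative only; not part of the library): with the instances
   above, the conjunction [True /\ negb false = true] gets the synthesized
   validator [if true then negb false else false]: [is_validator_true]
   handles the left conjunct, [is_validator_eq_true] the right one, and
   [is_validator_and] combines them. One could then write, hypothetically:

     Goal True /\ negb false = true.
     Proof. apply is_validator. reflexivity. Qed.

   where typeclass resolution synthesizes the boolean, and [reflexivity]
   checks that it evaluates to [true]. *)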
Instance is_validator_comparable_leibniz_eq A (C:Comparable A) (x y : A) :
  ComparableLeibnizEq C -> IsValidator (x = y) (compare_eqb x y).
Proof. intros ??. by apply compare_eqb_iff. Qed.

Instance is_validator_comparable_eq_impl A `(Comparable A) (x y : A) P b :
  IsValidator P b ->
  IsValidator (x = y -> P) (if compare_eqb x y then b else true).
Proof.
  intros Hval Val ->. rewrite /compare_eqb compare_refl in Val. auto.
Qed.

Lemma is_validator_forall_finite A P b `(Finite A) :
  (forall (x : A), IsValidator (P x) (b x)) ->
  IsValidator (forall (x : A), P x) (forallb b all_list).
Proof.
  move=> ? /forallb_forall Hb ?.
  apply is_validator, Hb, all_list_forall.
Qed.

(* We do not use an instance directly here, because we somehow need to
   force Coq to instantiate b with a lambda. *)
Hint Extern 2 (IsValidator (forall x : ?A, _) _) =>
  eapply (is_validator_forall_finite _ _ (fun (x:A) => _))
  : typeclass_instances.

(* Hint for synthesizing pattern-matching. *)
Hint Extern 2 (IsValidator (match ?u with _ => _ end) ?b0) =>
  let b := fresh "b" in
  unshelve notypeclasses refine (let b : bool := _ in _);
  [destruct u; intros; shelve|]; (* Synthesize `match .. with` in the validator. *)
  unify b b0; unfold b; destruct u; clear b
  : typeclass_instances.

(* Hint for unfolding definitions. This is necessary because many hints for
   IsValidator use [Hint Extern], which do not automatically unfold
   identifiers. *)
Hint Extern 100 (IsValidator ?X _) => unfold X : typeclass_instances.
menhir-20200123/coq-menhirlib/src/Validator_complete.v000066400000000000000000000357221361226111300225560ustar00rootroot00000000000000(****************************************************************************)
(*                                                                          *)
(*                                   Menhir                                 *)
(*                                                                          *)
(*          Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud          *)
(*                                                                          *)
(*  Copyright Inria. All rights reserved.
This file is distributed under   *)
(*  the terms of the GNU Lesser General Public License as published by the  *)
(*  Free Software Foundation, either version 3 of the License, or (at your  *)
(*  option) any later version, as described in the file LICENSE.            *)
(*                                                                          *)
(****************************************************************************)

From Coq Require Import List Syntax Derive.
From Coq.ssr Require Import ssreflect.
Require Automaton.
Require Import Alphabet Validator_classes.

Module Make(Import A:Automaton.T).

(** We instantiate some sets/maps. **)
Module TerminalComparableM <: ComparableM.
  Definition t := terminal.
  Instance tComparable : Comparable t := _.
End TerminalComparableM.
Module TerminalOrderedType := OrderedType_from_ComparableM TerminalComparableM.
Module StateProdPosComparableM <: ComparableM.
  Definition t := (state*production*nat)%type.
  Instance tComparable : Comparable t := _.
End StateProdPosComparableM.
Module StateProdPosOrderedType :=
  OrderedType_from_ComparableM StateProdPosComparableM.

Module TerminalSet := FSetAVL.Make TerminalOrderedType.
Module StateProdPosMap := FMapAVL.Make StateProdPosOrderedType.

(** The nullable predicate for symbols and lists of symbols. **)
Definition nullable_symb (symbol:symbol) :=
  match symbol with
  | NT nt => nullable_nterm nt
  | _ => false
  end.

Definition nullable_word (word:list symbol) :=
  forallb nullable_symb word.

(** The first predicate for non-terminals, symbols and lists of symbols,
    given as FSets. **)
Definition first_nterm_set (nterm:nonterminal) :=
  fold_left (fun acc t => TerminalSet.add t acc)
            (first_nterm nterm) TerminalSet.empty.

Definition first_symb_set (symbol:symbol) :=
  match symbol with
  | NT nt => first_nterm_set nt
  | T t => TerminalSet.singleton t
  end.

Fixpoint first_word_set (word:list symbol) :=
  match word with
  | [] => TerminalSet.empty
  | t::q =>
    if nullable_symb t then
      TerminalSet.union (first_symb_set t) (first_word_set q)
    else
      first_symb_set t
  end.
(** A small helper for finding the part of an item that is after the dot. **)
Definition future_of_prod prod dot_pos : list symbol :=
  (fix loop n lst :=
     match n with
     | O => lst
     | S x => match loop x lst with [] => [] | _::q => q end
     end)
    dot_pos (rev' (prod_rhs_rev prod)).

(** We build a fast map to store all the items of all the states. **)
Definition items_map (_:unit): StateProdPosMap.t TerminalSet.t :=
  fold_left (fun acc state =>
    fold_left (fun acc item =>
      let key := (state, prod_item item, dot_pos_item item) in
      let data := fold_left (fun acc t => TerminalSet.add t acc)
                            (lookaheads_item item) TerminalSet.empty
      in
      let old :=
        match StateProdPosMap.find key acc with
        | Some x => x | None => TerminalSet.empty
        end
      in
      StateProdPosMap.add key (TerminalSet.union data old) acc
    ) (items_of_state state) acc
  ) all_list (StateProdPosMap.empty TerminalSet.t).

(** We need to avoid computing items_map each time we need it. For that
    purpose, we declare a typeclass specifying that some map is equal to
    items_map. *)
Class IsItemsMap m := is_items_map : m = items_map ().

(** Accessor. **)
Definition find_items_map items_map state prod dot_pos : TerminalSet.t :=
  match StateProdPosMap.find (state, prod, dot_pos) items_map with
  | None => TerminalSet.empty
  | Some x => x
  end.

Definition state_has_future state prod (fut:list symbol) (lookahead:terminal) :=
  exists dot_pos:nat,
    fut = future_of_prod prod dot_pos /\
    TerminalSet.In lookahead (find_items_map (items_map ()) state prod dot_pos).

(** Iterator over items. **)
Definition forallb_items items_map
  (P:state -> production -> nat -> TerminalSet.t -> bool): bool :=
  StateProdPosMap.fold (fun key set acc =>
    match key with (st, p, pos) => (acc && P st p pos set)%bool end
  ) items_map true.

(** Typeclass instances for synthesizing the validator. *)
Instance is_validator_subset S1 S2 :
  IsValidator (TerminalSet.Subset S1 S2) (TerminalSet.subset S1 S2).
Proof. intros ?. by apply TerminalSet.subset_2. Qed.
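(* Worked example (illustrative only; [A], [X], [Y], [Z] stand for
   hypothetical grammar symbols, not identifiers of this library): for a
   production A -> X Y Z, [prod_rhs_rev] is [[Z; Y; X]], so
     [future_of_prod prod 0 = [X; Y; Z]],
     [future_of_prod prod 1 = [Y; Z]], and
     [future_of_prod prod 3 = []].
   Hence [state_has_future st prod [Y; Z] a] says that the state [st]
   contains the item A -> X . Y Z [[a]]. *)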
(* While the specification of the validator always quantifies over possible
   lookahead tokens individually, the validator usually handles lookahead
   sets directly instead, for better performance. For instance, the
   validator for [state_has_future], which speaks about a single lookahead
   token, is a subset operation: *)
Lemma is_validator_state_has_future_subset st prod pos lookahead lset im fut :
  TerminalSet.In lookahead lset ->
  fut = future_of_prod prod pos ->
  IsItemsMap im ->
  IsValidator (state_has_future st prod fut lookahead)
              (TerminalSet.subset lset (find_items_map im st prod pos)).
Proof.
  intros ? -> -> HSS%TerminalSet.subset_2. exists pos. split=>//. by apply HSS.
Qed.
(* We do not declare this lemma as an instance, and use [Hint Extern]
   instead, because the typeclass mechanism has trouble instantiating some
   evars if we do not explicitly call [eassumption]. *)
Hint Extern 2 (IsValidator (state_has_future _ _ _ _) _) =>
  eapply is_validator_state_has_future_subset;
    [eassumption|eassumption || reflexivity|]
  : typeclass_instances.

(* As said previously, we manipulate lookahead terminal sets instead of
   lookaheads individually. Hence, when we quantify over a lookahead set in
   the specification, we do not do anything in the executable validator.
   This instance is used for [non_terminal_closed]. *)
Instance is_validator_forall_lookahead_set lset P b:
  (forall lookahead, TerminalSet.In lookahead lset -> IsValidator (P lookahead) b) ->
  IsValidator (forall lookahead, TerminalSet.In lookahead lset -> P lookahead) b.
Proof. unfold IsValidator. firstorder. Qed.

(* Dually, we sometimes still need to explicitly iterate over a lookahead
   set. This is what this lemma allows. Used only in [end_reduce]. *)
Lemma is_validator_iterate_lset P b lookahead lset :
  TerminalSet.In lookahead lset ->
  IsValidator P (b lookahead) ->
  IsValidator P (TerminalSet.fold
                   (fun lookahead acc => if acc then b lookahead else false)
                   lset true).
Proof.
  intros Hlset%TerminalSet.elements_1 Hval Val.
  apply Hval. revert Val. rewrite TerminalSet.fold_1. generalize true at 1.
  clear -Hlset. induction Hlset as [? l <-%compare_eq|? l ? IH]=> /= b' Val.
  - destruct (b lookahead). by destruct b'. exfalso. by induction l; destruct b'.
  - eauto.
Qed.
Hint Extern 100 (IsValidator _ _) =>
  match goal with
  | H : TerminalSet.In ?lookahead ?lset |- _ =>
    eapply (is_validator_iterate_lset _ (fun lookahead => _) _ _ H); clear H
  end
  : typeclass_instances.

(* We often quantify over all the items of all the states of the automaton.
   This lemma and the accompanying [Hint Extern] declaration allow
   generating the corresponding executable validator.

   Note that it turns out that, in all the uses of this pattern, the first
   thing we do for each item is pattern-matching over the future. This
   lemma also embeds this pattern-matching, which makes it possible to get
   the hypothesis [fut' = future_of_prod prod (S pos)] in the non-nil
   branch.

   Moreover, note, again, that while the specification quantifies over
   lookahead terminals individually, the code provides lookahead sets
   instead. *)
Lemma is_validator_forall_items P1 b1 P2 b2 im :
  IsItemsMap im ->
  (forall st prod lookahead lset pos,
      TerminalSet.In lookahead lset ->
      [] = future_of_prod prod pos ->
      IsValidator (P1 st prod lookahead) (b1 st prod pos lset)) ->
  (forall st prod pos lookahead lset s fut',
      TerminalSet.In lookahead lset ->
      fut' = future_of_prod prod (S pos) ->
      IsValidator (P2 st prod lookahead s fut') (b2 st prod pos lset s fut')) ->
  IsValidator
    (forall st prod fut lookahead,
        state_has_future st prod fut lookahead ->
        match fut with
        | [] => P1 st prod lookahead
        | s :: fut' => P2 st prod lookahead s fut'
        end)
    (forallb_items im (fun st prod pos lset =>
       match future_of_prod prod pos with
       | [] => b1 st prod pos lset
       | s :: fut' => b2 st prod pos lset s fut'
       end)).
Proof.
  intros -> Hval1 Hval2 Val st prod fut lookahead (pos & -> & Hlookahead).
  rewrite /forallb_items StateProdPosMap.fold_1 in Val.
  assert (match future_of_prod prod pos with
          | [] => b1 st prod pos (find_items_map (items_map ()) st prod pos)
          | s :: fut' =>
            b2 st prod pos (find_items_map (items_map ()) st prod pos) s fut'
          end = true).
  - unfold find_items_map in *.
    assert (Hfind := @StateProdPosMap.find_2 _ (items_map ()) (st, prod, pos)).
    destruct StateProdPosMap.find as [lset|];
      [|by edestruct (TerminalSet.empty_1); eauto].
    specialize (Hfind _ eq_refl). apply StateProdPosMap.elements_1 in Hfind.
    revert Val. generalize true at 1.
    induction Hfind as [[? ?] l [?%compare_eq ?]|??? IH]=>?.
    + simpl in *; subst.
      match goal with |- _ -> ?X = true => destruct X end; [done|].
      rewrite Bool.andb_false_r. clear.
      induction l as [|[[[??]?]?] l IH]=>//.
    + apply IH.
  - destruct future_of_prod eqn:EQ. by eapply Hval1; eauto.
    eapply Hval2 with (pos := pos); eauto; [].
    revert EQ. unfold future_of_prod=>-> //.
Qed.

(* We need a hint for explicitly instantiating b1 and b2 with lambdas. *)
Hint Extern 0 (IsValidator
                 (forall st prod fut lookahead,
                     state_has_future st prod fut lookahead -> _)
                 _) =>
  eapply (is_validator_forall_items _ (fun st prod pos lset => _)
                                    _ (fun st prod pos lset s fut' => _))
  : typeclass_instances.

(* Used in [start_future] only. *)
Instance is_validator_forall_state_has_future im st prod :
  IsItemsMap im ->
  IsValidator
    (forall look, state_has_future st prod (rev' (prod_rhs_rev prod)) look)
    (let lookaheads := find_items_map im st prod 0 in
     forallb (fun t => TerminalSet.mem t lookaheads) all_list).
Proof.
  move=> -> /forallb_forall Val look.
  specialize (Val look (all_list_forall _)). exists 0. split=>//.
  by apply TerminalSet.mem_2.
Qed.

(** * Validation for completeness **)

(** The nullable predicate is a fixpoint: it is correct. **)
Definition nullable_stable :=
  forall p:production,
    if nullable_word (prod_rhs_rev p) then
      nullable_nterm (prod_lhs p) = true
    else True.

(** The first predicate is a fixpoint: it is correct.
**)
Definition first_stable:=
  forall (p:production),
    TerminalSet.Subset (first_word_set (rev' (prod_rhs_rev p)))
                       (first_nterm_set (prod_lhs p)).

(** The initial state has all the S=>.u items, where S is the start
    non-terminal **)
Definition start_future :=
  forall (init:initstate) (p:production),
    prod_lhs p = start_nt init ->
    forall (t:terminal), state_has_future init p (rev' (prod_rhs_rev p)) t.

(** If a state contains an item of the form A->_.av[[b]], where a is a
    terminal, then reading an a does a [Shift_act] to a state containing
    an item of the form A->_.v[[b]]. **)
Definition terminal_shift :=
  forall (s1:state) prod fut lookahead,
    state_has_future s1 prod fut lookahead ->
    match fut with
    | T t::q =>
      match action_table s1 with
      | Lookahead_act awp =>
        match awp t with
        | Shift_act s2 _ => state_has_future s2 prod q lookahead
        | _ => False
        end
      | _ => False
      end
    | _ => True
    end.

(** If a state contains an item of the form A->_.[[a]], then either we do
    a [Default_reduce_act] of the corresponding production, or a is a
    terminal (i.e., there is a lookahead terminal) and reading a does a
    [Reduce_act] of the corresponding production. **)
Definition end_reduce :=
  forall (s:state) prod fut lookahead,
    state_has_future s prod fut lookahead ->
    match fut with
    | [] =>
      match action_table s with
      | Default_reduce_act p => p = prod
      | Lookahead_act awt =>
        match awt lookahead with
        | Reduce_act p => p = prod
        | _ => False
        end
      end
    | _ => True
    end.

Definition is_end_reduce items_map :=
  forallb_items items_map (fun s prod pos lset =>
    match future_of_prod prod pos with
    | [] =>
      match action_table s with
      | Default_reduce_act p => compare_eqb p prod
      | Lookahead_act awt =>
        TerminalSet.fold (fun lookahead acc =>
          match awt lookahead with
          | Reduce_act p => (acc && compare_eqb p prod)%bool
          | _ => false
          end) lset true
      end
    | _ => true
    end).
(** If a state contains an item of the form A->_.Bv[[b]], where B is a
    non-terminal, then the goto table says we have to go to a state
    containing an item of the form A->_.v[[b]]. **)
Definition non_terminal_goto :=
  forall (s1:state) prod fut lookahead,
    state_has_future s1 prod fut lookahead ->
    match fut with
    | NT nt::q =>
      match goto_table s1 nt with
      | Some (exist _ s2 _) => state_has_future s2 prod q lookahead
      | None => False
      end
    | _ => True
    end.

Definition start_goto :=
  forall (init:initstate),
    match goto_table init (start_nt init) with
    | None => True
    | Some _ => False
    end.

(** Closure property of item sets: if a state contains an item of the form
    A->_.Bv[[b]], then for each production B->u and each terminal a of
    first(vb), the state contains an item of the form B->_.u[[a]] **)
Definition non_terminal_closed :=
  forall s1 prod fut lookahead,
    state_has_future s1 prod fut lookahead ->
    match fut with
    | NT nt::q =>
      forall p, prod_lhs p = nt ->
        (if nullable_word q then
           state_has_future s1 p (future_of_prod p 0) lookahead
         else True) /\
        (forall lookahead2,
            TerminalSet.In lookahead2 (first_word_set q) ->
            state_has_future s1 p (future_of_prod p 0) lookahead2)
    | _ => True
    end.

(** The automaton is complete **)
Definition complete :=
  nullable_stable /\ first_stable /\ start_future /\ terminal_shift /\
  end_reduce /\ non_terminal_goto /\ start_goto /\ non_terminal_closed.

Derive is_complete_0
  SuchThat (forall im, IsItemsMap im -> IsValidator complete (is_complete_0 im))
  As complete_0_is_validator.
Proof.
  intros im. subst is_complete_0. instantiate (1:=fun im => _). apply _.
Qed.

Definition is_complete (_:unit) := is_complete_0 (items_map ()).
Lemma complete_is_validator : IsValidator complete (is_complete ()).
Proof. by apply complete_0_is_validator. Qed.

End Make.
menhir-20200123/coq-menhirlib/src/Validator_safe.v000066400000000000000000000202521361226111300216560ustar00rootroot00000000000000(****************************************************************************) (* *) (* Menhir *) (* *) (* Jacques-Henri Jourdan, CNRS, LRI, Université Paris Sud *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under *) (* the terms of the GNU Lesser General Public License as published by the *) (* Free Software Foundation, either version 3 of the License, or (at your *) (* option) any later version, as described in the file LICENSE. *) (* *) (****************************************************************************) From Coq Require Import List Syntax Derive. From Coq.ssr Require Import ssreflect. Require Automaton. Require Import Alphabet Validator_classes. Module Make(Import A:Automaton.T). (** The singleton predicate for states **) Definition singleton_state_pred (state:state) := (fun state' => match compare state state' with Eq => true |_ => false end). (** [past_state_of_non_init_state], extended for all states. **) Definition past_state_of_state (state:state) := match state with | Init _ => [] | Ninit nis => past_state_of_non_init_state nis end. (** Concatenations of last and past **) Definition head_symbs_of_state (state:state) := match state with | Init _ => [] | Ninit s => last_symb_of_non_init_state s::past_symb_of_non_init_state s end. Definition head_states_of_state (state:state) := singleton_state_pred state::past_state_of_state state. (** * Validation for correctness **) (** Prefix predicate between two lists of symbols. **) Inductive prefix: list symbol -> list symbol -> Prop := | prefix_nil: forall l, prefix [] l | prefix_cons: forall l1 l2 x, prefix l1 l2 -> prefix (x::l1) (x::l2). (** [prefix] is transitive **) Lemma prefix_trans: forall (l1 l2 l3:list symbol), prefix l1 l2 -> prefix l2 l3 -> prefix l1 l3. Proof. intros l1 l2 l3 H1 H2. revert l3 H2. induction H1; [now constructor|]. 
  inversion 1. subst. constructor. eauto.
Qed.

Fixpoint is_prefix (l1 l2:list symbol) :=
  match l1, l2 with
  | [], _ => true
  | t1::q1, t2::q2 => (compare_eqb t1 t2 && is_prefix q1 q2)%bool
  | _::_, [] => false
  end.

Instance prefix_is_validator l1 l2 : IsValidator (prefix l1 l2) (is_prefix l1 l2).
Proof.
  revert l2. induction l1 as [|x1 l1 IH]=>l2 Hpref.
  - constructor.
  - destruct l2 as [|x2 l2]=>//.
    move: Hpref=> /andb_prop [/compare_eqb_iff -> /IH ?]. by constructor.
Qed.

(** If we shift, then the known top symbols of the destination state form
    a prefix of the known top symbols of the source state, with the new
    symbol added. **)
Definition shift_head_symbs :=
  forall s,
    match action_table s with
    | Lookahead_act awp => forall t,
      match awp t with
      | Shift_act s2 _ =>
        prefix (past_symb_of_non_init_state s2) (head_symbs_of_state s)
      | _ => True
      end
    | _ => True
    end.

(** When a goto happens, the known top symbols of the destination state
    form a prefix of the known top symbols of the source state, with the
    new symbol added. **)
Definition goto_head_symbs :=
  forall s nt,
    match goto_table s nt with
    | Some (exist _ s2 _) =>
      prefix (past_symb_of_non_init_state s2) (head_symbs_of_state s)
    | None => True
    end.

(** We have to state the same kind of checks for the assumptions about the
    state stack. However, these assumptions are predicates. So we define a
    notion of "prefix" over lists of predicates, which means, basically,
    that an assumption entails another. **)
Inductive prefix_pred: list (state->bool) -> list (state->bool) -> Prop :=
  | prefix_pred_nil: forall l, prefix_pred [] l
  | prefix_pred_cons: forall l1 l2 f1 f2,
      (forall x, implb (f2 x) (f1 x) = true) ->
      prefix_pred l1 l2 -> prefix_pred (f1::l1) (f2::l2).

(** [prefix_pred] is transitive **)
Lemma prefix_pred_trans:
  forall (l1 l2 l3:list (state->bool)),
    prefix_pred l1 l2 -> prefix_pred l2 l3 -> prefix_pred l1 l3.
Proof.
  intros l1 l2 l3 H1 H2. revert l3 H2.
  induction H1 as [|l1 l2 f1 f2 Hf2f1]; [now constructor|].
  intros l3.
  inversion 1 as [|??? f3 Hf3f2]. subst.
  constructor; [|now eauto].
  intros x. specialize (Hf3f2 x). specialize (Hf2f1 x).
  repeat destruct (_ x); auto.
Qed.

Fixpoint is_prefix_pred (l1 l2:list (state->bool)) :=
  match l1, l2 with
  | [], _ => true
  | f1::q1, f2::q2 =>
    (forallb (fun x => implb (f2 x) (f1 x)) all_list && is_prefix_pred q1 q2)%bool
  | _::_, [] => false
  end.

Instance prefix_pred_is_validator l1 l2 :
  IsValidator (prefix_pred l1 l2) (is_prefix_pred l1 l2).
Proof.
  revert l2. induction l1 as [|x1 l1 IH]=>l2 Hpref.
  - constructor.
  - destruct l2 as [|x2 l2]=>//.
    move: Hpref=> /andb_prop [/forallb_forall ? /IH ?].
    constructor; auto using all_list_forall.
Qed.

(** The assumptions about the state stack are preserved when we shift **)
Definition shift_past_state :=
  forall s,
    match action_table s with
    | Lookahead_act awp => forall t,
      match awp t with
      | Shift_act s2 _ =>
        prefix_pred (past_state_of_non_init_state s2) (head_states_of_state s)
      | _ => True
      end
    | _ => True
    end.

(** The assumptions about the state stack are preserved when we do a goto **)
Definition goto_past_state :=
  forall s nt,
    match goto_table s nt with
    | Some (exist _ s2 _) =>
      prefix_pred (past_state_of_non_init_state s2) (head_states_of_state s)
    | None => True
    end.

(** What states are possible after having popped these symbols from the
    stack, given the annotation of the current state? **)
Inductive state_valid_after_pop (s:state):
  list symbol -> list (state -> bool) -> Prop :=
  | state_valid_after_pop_nil1:
      forall p pl, p s = true -> state_valid_after_pop s [] (p::pl)
  | state_valid_after_pop_nil2:
      forall sl, state_valid_after_pop s sl []
  | state_valid_after_pop_cons:
      forall st sq p pl, state_valid_after_pop s sq pl ->
        state_valid_after_pop s (st::sq) (p::pl).

Fixpoint is_state_valid_after_pop (state:state) (to_pop:list symbol) annot :=
  match annot, to_pop with
  | [], _ => true
  | p::_, [] => p state
  | p::pl, s::sl => is_state_valid_after_pop state sl pl
  end.
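(* Example (illustrative only): take [annot = [p0; p1; p2]], where [p0]
   constrains the current state, [p1] the state below it, and so on, and
   [to_pop = [s1; s2]]. Popping the two symbols uncovers the state
   constrained by [p2], and indeed
     [is_state_valid_after_pop st [s1; s2] [p0; p1; p2]]
   computes to [p2 st], matching the [state_valid_after_pop] derivation
   built from two [_cons] steps followed by [_nil1]. *)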
Instance impl_is_state_valid_after_pop_is_validator state sl pl P b : IsValidator P b -> IsValidator (state_valid_after_pop state sl pl -> P) (if is_state_valid_after_pop state sl pl then b else true). Proof. destruct (is_state_valid_after_pop _ sl pl) eqn:EQ. - intros ???. by eapply is_validator. - intros _ _ Hsvap. exfalso. induction Hsvap=>//; [simpl in EQ; congruence|]. by destruct sl. Qed. (** A state is valid for reducing a production when: - the assumptions on the state are such that we will find the right-hand side of the production on the stack; - we will be able to do a goto after having popped the right-hand side. **) Definition valid_for_reduce (state:state) prod := prefix (prod_rhs_rev prod) (head_symbs_of_state state) /\ forall state_new, state_valid_after_pop state_new (prod_rhs_rev prod) (head_states_of_state state) -> match goto_table state_new (prod_lhs prod) with | None => match state_new with | Init i => prod_lhs prod = start_nt i | Ninit _ => False end | _ => True end. (** All the states that perform a reduce are valid for reduction **) Definition reduce_ok := forall s, match action_table s with | Lookahead_act awp => forall t, match awp t with | Reduce_act p => valid_for_reduce s p | _ => True end | Default_reduce_act p => valid_for_reduce s p end. (** The automaton is safe **) Definition safe := shift_head_symbs /\ goto_head_symbs /\ shift_past_state /\ goto_past_state /\ reduce_ok. Derive is_safe SuchThat (IsValidator safe (is_safe ())) As safe_is_validator. Proof. subst is_safe. instantiate (1:=fun _ => _). apply _. Qed. End Make. menhir-20200123/coq-menhirlib/src/Version.v000066400000000000000000000000431361226111300203540ustar00rootroot00000000000000Definition require_20200123 := tt.
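
The boolean check `is_prefix` above has a direct functional reading. As a cross-reference for readers more at home in OCaml than in Coq, here is a sketch of the same computation; `string` stands in for the `symbol` type and ordinary structural equality stands in for `compare_eqb` (both are assumptions of this sketch, not definitions from coq-menhirlib):

```ocaml
(* Sketch of the [is_prefix] validator check, transcribed from Coq to OCaml.
   [symbol] and the equality test are stand-ins for the library's definitions. *)
type symbol = string

let rec is_prefix (l1 : symbol list) (l2 : symbol list) : bool =
  match l1, l2 with
  | [], _ -> true                       (* the empty list is a prefix of anything *)
  | t1 :: q1, t2 :: q2 -> t1 = t2 && is_prefix q1 q2
  | _ :: _, [] -> false                 (* l1 is longer than l2 *)

let () =
  assert (is_prefix ["a"; "b"] ["a"; "b"; "c"]);
  assert (not (is_prefix ["a"; "c"] ["a"; "b"; "c"]))
```

The `IsValidator` instance above then shows, once and for all, that when this boolean check returns `true`, the propositional statement `prefix l1 l2` holds.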
menhir-20200123/doc/000077500000000000000000000000001361226111300137705ustar00rootroot00000000000000menhir-20200123/doc/.gitignore000066400000000000000000000002161361226111300157570ustar00rootroot00000000000000*.dvi *.aux *.bbl *.blg *.log *.out *.toc *.fdb_latexmk *.fls manual.pdf *.haux *.htoc manual.image.tex manual[0-9][0-9][0-9].png manual.html menhir-20200123/doc/Makefile000066400000000000000000000014571361226111300154370ustar00rootroot00000000000000.PHONY: all loop clean export TEXINPUTS=.: DEPS = $(wildcard *.tex) $(wildcard *.bib) $(wildcard *.sty) $(wildcard *.mly) all: manual.pdf manual.html %.pdf: %.tex $(DEPS) pdflatex $* bibtex $* pdflatex $* pdflatex $* manual.html: manual.tex $(DEPS) $(wildcard *.hva) hevea -fix manual.tex # # Hevea interprets 'tabbing' environment in a way # that creates spacing errors in the rendered output # of "textual version of derivation trees": it # asks for (padding:0px;) while the TeX rendering # inserts spacing between columns. Change this # to {padding:1px;} sed -i.bak -e "s/cellpadding0/cellpadding1/" manual.html && rm manual.html.bak # # Note: hevea generates images manual00{1,2,3}.png for the tikz pictures # present in the manual. 
loop: latexmk -pdf -pvc manual clean: @ rm -f `cat .gitignore` menhir-20200123/doc/declarations-onerrorreduce.mly000066400000000000000000000004051361226111300220360ustar00rootroot00000000000000%token ID ARROW LPAREN RPAREN COLON SEMICOLON %start program %on_error_reduce typ1 %% typ0: ID | LPAREN typ1 RPAREN {} typ1: typ0 | typ0 ARROW typ1 {} declaration: ID COLON typ1 {} program: | LPAREN declaration RPAREN | declaration SEMICOLON {} menhir-20200123/doc/declarations-phantom.mly000066400000000000000000000004731361226111300206330ustar00rootroot00000000000000%token ID ARROW LPAREN RPAREN COLON SEMICOLON %start program %% typ0: ID | LPAREN typ1(RPAREN) RPAREN {} typ1(phantom): typ0 | typ0 ARROW typ1(phantom) {} declaration(phantom): ID COLON typ1(phantom) {} program: | LPAREN declaration(RPAREN) RPAREN | declaration(SEMICOLON) SEMICOLON {} menhir-20200123/doc/declarations.mly000066400000000000000000000003571361226111300171700ustar00rootroot00000000000000%token ID ARROW LPAREN RPAREN COLON SEMICOLON %start program %% typ0: ID | LPAREN typ1 RPAREN {} typ1: typ0 | typ0 ARROW typ1 {} declaration: ID COLON typ1 {} program: | LPAREN declaration RPAREN | declaration SEMICOLON {} menhir-20200123/doc/dune000066400000000000000000000001601361226111300146430ustar00rootroot00000000000000;; Install the man page. (install (section man) (files menhir.1) (package menhir) ) (include dune.manual) menhir-20200123/doc/dune.manual000066400000000000000000000005601361226111300161230ustar00rootroot00000000000000;; This file is concatenated at the end of the file "dune" ;; by the release script, so the documentation is installed ;; only on release branches. ;; The documentation is currently built outside of dune's control ;; by doc/Makefile. 
(install (section doc) (files manual.pdf manual.html manual001.png manual002.png manual003.png ) (package menhir) ) menhir-20200123/doc/fppdf.sty000066400000000000000000000024401361226111300156300ustar00rootroot00000000000000% This tiny package invokes ``hyperref'' with appropriate options. % Three modes are provided: % if \fppdf is defined, we configure ``hyperref'' for PDF output. % otherwise, if WhizzyTeX is active, we do configure ``softref'' for producing DVI output % containing ``advi''-style hyperlinks. % otherwise, we configure nothing. \ProvidesPackage{fppdf} \@ifundefined{fppdf}{ \newcommand{\texorpdfstring}[2]{#1} \newcommand{\href}[2]{#2} \@ifundefined{WhizzyTeX}{ % PostScript output. \typeout{No hyperlinks.} }{ % WhizzyTeX output. \typeout{Hyperlinks in advi style.} % % Dfinissons les commandes \softlink et \softtarget, employes par locallabel, % de faon ce que les labels de preuves deviennent des hyperliens. % \edef\hyper@quote{\string"} \edef\hyper@sharp{\string#} \def \softlink #1#2{\special {html:}#2\special {html:}} \def \softtarget #1#2{\special {html:}#2\special {html:}} } }{ % PDF output. \typeout{Hyperlinks in pdflatex style.} \usepackage[bookmarks=true,bookmarksopen=true,colorlinks=true,linkcolor=blue,citecolor=blue,urlcolor=blue]{hyperref} \let\softlink\hyperlink \let\softtarget\hypertarget } menhir-20200123/doc/hevea.sty000066400000000000000000000057621361226111300156330ustar00rootroot00000000000000% hevea : hevea.sty % This is a very basic style file for latex document to be processed % with hevea. It contains definitions of LaTeX environment which are % processed in a special way by the translator. % Mostly : % - latexonly, not processed by hevea, processed by latex. % - htmlonly , the reverse. % - rawhtml, to include raw HTML in hevea output. % - toimage, to send text to the image file. % The package also provides hevea logos, html related commands (ahref % etc.), void cutting and image commands. 
\NeedsTeXFormat{LaTeX2e} \ProvidesPackage{hevea}[2002/01/11] \RequirePackage{comment} \newif\ifhevea\heveafalse \@ifundefined{ifimagen}{\newif\ifimagen\imagenfalse} \makeatletter% \newcommand{\heveasmup}[2]{% \raise #1\hbox{$\m@th$% \csname S@\f@size\endcsname \fontsize\sf@size 0% \math@fontsfalse\selectfont #2% }}% \DeclareRobustCommand{\hevea}{H\kern-.15em\heveasmup{.2ex}{E}\kern-.15emV\kern-.15em\heveasmup{.2ex}{E}\kern-.15emA}% \DeclareRobustCommand{\hacha}{H\kern-.15em\heveasmup{.2ex}{A}\kern-.15emC\kern-.1em\heveasmup{.2ex}{H}\kern-.15emA}% \DeclareRobustCommand{\html}{\protect\heveasmup{0.ex}{HTML}} %%%%%%%%% Hyperlinks hevea style \newcommand{\ahref}[2]{{#2}} \newcommand{\ahrefloc}[2]{{#2}} \newcommand{\aname}[2]{{#2}} \newcommand{\ahrefurl}[1]{\texttt{#1}} \newcommand{\footahref}[2]{#2\footnote{\texttt{#1}}} \newcommand{\mailto}[1]{\texttt{#1}} \newcommand{\imgsrc}[2][]{} \newcommand{\home}[1]{\protect\raisebox{-.75ex}{\char126}#1} \AtBeginDocument {\@ifundefined{url} {%url package is not loaded \let\url\ahref\let\oneurl\ahrefurl\let\footurl\footahref} {}} %% Void cutting instructions \newcounter{cuttingdepth} \newcommand{\tocnumber}{} \newcommand{\notocnumber}{} \newcommand{\cuttingunit}{} \newcommand{\cutdef}[2][]{} \newcommand{\cuthere}[2]{} \newcommand{\cutend}{} \newcommand{\htmlhead}[1]{} \newcommand{\htmlfoot}[1]{} \newcommand{\htmlprefix}[1]{} \newenvironment{cutflow}[1]{}{} \newcommand{\cutname}[1]{} \newcommand{\toplinks}[3]{} \newcommand{\setlinkstext}[3]{} \newcommand{\flushdef}[1]{} \newcommand{\footnoteflush}[1]{} %%%% Html only \excludecomment{rawhtml} \newcommand{\rawhtmlinput}[1]{} \excludecomment{htmlonly} %%%% Latex only \newenvironment{latexonly}{}{} \newenvironment{verblatex}{}{} %%%% Image file stuff \def\toimage{\endgroup} \def\endtoimage{\begingroup\def\@currenvir{toimage}} \def\verbimage{\endgroup} \def\endverbimage{\begingroup\def\@currenvir{verbimage}} \newcommand{\imageflush}[1][]{} %%% Bgcolor definition 
\newsavebox{\@bgcolorbin} \newenvironment{bgcolor}[2][] {\newcommand{\@mycolor}{#2}\begin{lrbox}{\@bgcolorbin}\vbox\bgroup} {\egroup\end{lrbox}% \begin{flushleft}% \colorbox{\@mycolor}{\usebox{\@bgcolorbin}}% \end{flushleft}} %%% Style sheets macros, defined as no-ops \newcommand{\newstyle}[2]{} \newcommand{\addstyle}[1]{} \newcommand{\setenvclass}[2]{} \newcommand{\getenvclass}[1]{} \newcommand{\loadcssfile}[1]{} \newenvironment{divstyle}[1]{}{} \newenvironment{cellstyle}[2]{}{} \newif\ifexternalcss %%% Postlude \makeatother menhir-20200123/doc/local.bib000066400000000000000000000165441361226111300155520ustar00rootroot00000000000000@String{acta = "Acta Informatica"} @String{aw = "Addison-Wesley"} @String{cacm = "Communications of the {ACM}"} @String{cc = "Compiler Construction (CC)"} @String{cup = "Cambridge University Press"} @String{entcs = "Electronic Notes in Theoretical Computer Science"} @String{spe = "Software: Practice and Experience"} @String{toplas = "ACM Transactions on Programming Languages and Systems"} @Misc{compcert-github, author = "Xavier Leroy", title = "The {CompCert C} verified compiler", year = "2014", howpublished = "\url{https://github.com/AbsInt/CompCert}", } @Misc{obelisk, author = {L\'elio Brun}, title = {Obelisk}, howpublished = {\url{https://github.com/Lelio-Brun/Obelisk}}, year = {2017}, } @Book{aho-86, author = "Alfred V. Aho and Ravi Sethi and Jeffrey D. Ullman", title = "Compilers: Principles, Techniques, and Tools", publisher = aw, year = "1986", } @Book{appel-tiger-98, author = "Andrew Appel", title = "Modern Compiler Implementation in {ML}", publisher = cup, year = "1998", URL = "http://www.cs.princeton.edu/~appel/modern/ml/", } @Article{bhamidipaty-proebsting-98, author = "Achyutram Bhamidipaty and Todd A. 
Proebsting", title = "Very Fast {YACC}-Compatible Parsers (For Very Little Effort)", journal = spe, year = "1998", volume = "28", number = "2", pages = "181--190", URL = "http://www.cs.arizona.edu/people/todd/papers/TR95-09.ps", } @Article{dencker-84, author = "Peter Dencker and Karl Dürre and Johannes Heuft", title = "Optimization of parser tables for portable compilers", journal = toplas, volume = "6", number = "4", year = "1984", pages = "546--572", URL = "http://doi.acm.org/10.1145/1780.1802", } @Article{deremer-pennello-82, author = "Frank DeRemer and Thomas Pennello", title = "Efficient Computation of ${LALR}(1)$ Look-Ahead Sets", journal = toplas, volume = "4", number = "4", year = "1982", pages = "615--649", URL = "http://doi.acm.org/10.1145/69622.357187", } @Manual{bison, title = "Bison", author = "Charles Donnelly and Richard Stallman", year = "2015", URL = "http://www.gnu.org/software/bison/manual/", } @Book{hopcroft-motwani-ullman-00, author = "John E. Hopcroft and Rajeev Motwani and Jeffrey D. Ullman", title = "Introduction to Automata Theory, Languages, and Computation", publisher = aw, year = "2000", URL = "http://www-db.stanford.edu/~ullman/ialc.html", } @Article{horspool-faster-90, author = "R. Nigel Horspool and Michael Whitney", title = "Even Faster {LR} Parsing", journal = spe, year = "1990", volume = "20", number = "6", pages = "515--535", URL = "http://www.cs.uvic.ca/~nigelh/Publications/fastparse.pdf", } @Article{jeffery-03, author = "Clinton L. Jeffery", title = "Generating {LR} syntax error messages from examples", journal = toplas, volume = "25", number = "5", year = "2003", pages = "631--640", URL = "http://doi.acm.org/10.1145/937563.937566", } @InCollection{johnson-yacc-79, author = "Steven C. 
Johnson", title = "{Yacc}: Yet Another Compiler Compiler", booktitle = "{UNIX} Programmer's Manual", volume = "2", publisher = "Holt, Rinehart, and Winston", pages = "353--387", year = "1979", URL = "http://dinosaur.compilertools.net/", } @InProceedings{jourdan-leroy-pottier-12, author = "Jacques-Henri Jourdan and François Pottier and Xavier Leroy", title = "Validating ${LR}(1)$ Parsers", year = "2012", booktitle = esop, publisher = springer, series = lncs, volume = "7211", pages = "397--416", URL = "http://gallium.inria.fr/~fpottier/publis/jourdan-leroy-pottier-validating-parsers.pdf", } @Article{klint-laemmel-verhoef-05, author = "Paul Klint and Ralf L{\"a}mmel and Chris Verhoef", title = "Toward an engineering discipline for grammarware", journal = tosem, volume = "14", number = "3", year = "2005", pages = "331--380", URL = "http://www.few.vu.nl/~x/gw/gw.pdf", } @Article{knuth-lr-65, author = "Donald E. Knuth", title = "On the translation of languages from left to right", journal = "Information \& Control", year = "1965", volume = "8", number = "6", pages = "607--639", URL = "http://www.sciencedirect.com/science/article/pii/S0019995865904262", } @Misc{compcert, author = "Xavier Leroy", title = "The {CompCert C} compiler", year = "2015", howpublished = "\url{http://compcert.inria.fr/}", } @Misc{ocaml, author = "Xavier Leroy and Damien Doligez and Alain Frisch and Jacques Garrigue and Didier Rémy and Jérôme Vouillon", title = "The {OCaml} system: documentation and user's manual", year = "2016", URL = "http://caml.inria.fr/", } @Article{pager-77, author = "David Pager", title = "A Practical General Method for Constructing ${LR}(k)$ Parsers", journal = acta, year = "1977", volume = "7", pages = "249--268", URL = "http://dx.doi.org/10.1007/BF00290336", } @InProceedings{pottier-reachability-cc-2016, author = "François Pottier", title = "Reachability and error diagnosis in {LR}(1) parsers", booktitle = cc, year = "2016", pages = "88--98", URL = 
"http://gallium.inria.fr/~fpottier/publis/fpottier-reachability-cc2016.pdf", } @Article{pottier-regis-gianas-typed-lr, author = "François Pottier and Yann {Régis-Gianas}", title = "Towards efficient, typed {LR} parsers", URL = "http://gallium.inria.fr/~fpottier/publis/fpottier-regis-gianas-typed-lr.pdf", year = "2006", pages = "155--180", journal = entcs, volume = "148", number = "2", } @Manual{tarditi-appel-00, title = "{ML-Yacc} User's Manual", author = "David R. Tarditi and Andrew W. Appel", year = "2000", URL = "http://www.smlnj.org/doc/ML-Yacc/", } @Article{tarjan-yao-79, author = "Robert Endre Tarjan and Andrew Chi-Chih Yao", title = "Storing a sparse table", journal = cacm, volume = "22", number = "11", year = "1979", pages = "606--611", URL = "http://doi.acm.org/10.1145/359168.359175", } @Article{jourdan-pottier-17, author = "Jacques-Henri Jourdan and François Pottier", title = "A simple, possibly correct {LR} parser for {C11}", journal = "ACM Transactions on Programming Languages and Systems", month = aug, year = "2017", volume = "39", number = "4", pages = "14:1--14:36", URL = "http://gallium.inria.fr/~fpottier/publis/jourdan-fpottier-2016.pdf", } menhir-20200123/doc/macros.tex000066400000000000000000000177551361226111300160150ustar00rootroot00000000000000% EBNF syntax. \let\nt\textit % Nonterminal. \newcommand{\is}{& ${} ::= {}$ &} \newcommand{\newprod}{\\\hspace{1cm}\barre\hspace{2mm}} \newcommand{\phaprod}{\\\hspace{1cm}\phantom\barre\hspace{2mm}} % Options and choices. \newcommand{\optional}[1]{% \ifmmode [\,#1\,]% \else $[\,\text{#1}\,]$% \fi } \newcommand{\metachoice}{$\,\mid\,$} % Lists. \newcommand{\seplist}[2]{#2#1${}\ldots{}$#1#2} \newcommand{\sepspacelist}[1]{\seplist{\ }{#1}} \newcommand{\sepcommalist}[1]{\seplist{,\ }{#1}} \newcommand{\precseplist}[2]{% \optional{#1} \seplist{\ #1}{#2}% } % Optional parameters. 
\newcommand{\tuple}[1]{\dlpar\sepcommalist{#1}\drpar} \newcommand{\oparams}[1]{\optional{\tuple{#1}}} % Some nonterminal symbols used in the grammar. \newcommand{\expression}{\nt{expression}\xspace} \newcommand{\expressionsub}[1]{\nt{expression}${}_{#1}$} \newcommand{\pattern}{\nt{pattern}\xspace} % Concrete syntax. \newcommand{\percentpercent}{\kw{\%\%}\xspace} \newcommand{\deuxpoints}{\kw{:}\xspace} \newcommand{\barre}{\kw{\textbar}\xspace} \newcommand{\kangle}[1]{\kw{\textless} #1 \kw{\textgreater}} \newcommand{\ocamltype}{\kangle{\textit{\ocaml type}}\xspace} \newcommand{\ocamlparam}{\kangle{\nt{uid} \deuxpoints \textit{\ocaml module type}}\xspace} \newcommand{\dheader}[1]{\kw{\%\{} #1 \kw{\%\}}} \newcommand{\dtoken}{\kw{\%token}\xspace} \newcommand{\dstart}{\kw{\%start}\xspace} \newcommand{\dtype}{\kw{\%type}\xspace} \newcommand{\dnonassoc}{\kw{\%nonassoc}\xspace} \newcommand{\dleft}{\kw{\%left}\xspace} \newcommand{\dright}{\kw{\%right}\xspace} \newcommand{\dparameter}{\kw{\%parameter}\xspace} \newcommand{\dpublic}{\kw{\%public}\xspace} \newcommand{\dinline}{\kw{\%inline}\xspace} \newcommand{\donerrorreduce}{\kw{\%on\_error\_reduce}\xspace} \newcommand{\dattribute}{\kw{\%attribute}\xspace} \newcommand{\dpaction}[1]{\kw{\{} #1 \kw{\}}\xspace} \newcommand{\daction}{\dpaction{\textit{\ocaml code}}\xspace} \newcommand{\dpfaction}[1]{\kw{<} #1 \kw{>}\xspace} \newcommand{\dpfidentityaction}{\kw{<>}\xspace} \newcommand{\dprec}{\kw{\%prec}\xspace} \newcommand{\dequal}{\kw{\small =}\xspace} \newcommand{\dquestion}{\kw{?}\xspace} \newcommand{\dplus}{\raisebox{2pt}{\kw{\small +}}\xspace} \newcommand{\dcolonequal}{\kw{\small :=}\xspace} \newcommand{\dequalequal}{\kw{\small ==}\xspace} \newcommand{\dstar}{\kw{*}\xspace} \newcommand{\dlpar}{\kw{(}\,\xspace} \newcommand{\drpar}{\,\kw{)}\xspace} \newcommand{\eos}{\kw{\#}\xspace} \newcommand{\dnewline}{\kw{\textbackslash n}\xspace} \newcommand{\dlet}{\kw{let}\xspace} \newcommand{\dsemi}{\kw{;}\xspace} 
\newcommand{\dunderscore}{\kw{\_}\xspace} \newcommand{\dtilde}{\raisebox{-4pt}{\kw{\textasciitilde}}\xspace} % Stylistic conventions. \newcommand{\kw}[1]{\text{\upshape\sf\bfseries #1}} \newcommand{\inlinesidecomment}[1]{\textit{\textbf{\footnotesize // #1}}} \newcommand{\sidecomment}[1]{\hspace{2cm}\inlinesidecomment{#1}} \newcommand{\docskip}{\vspace{1mm plus 1mm}} \newcommand{\docswitch}[1]{\docskip#1.\hspace{3mm}} \newcommand{\error}{\kw{error}\xspace} % Links to Menhir's repository. \newcommand{\repo}[2]{\href{https://gitlab.inria.fr/fpottier/menhir/blob/master/#1}{#2}} \newcommand{\menhirlibconvert}{\repo{src/Convert.mli}{\texttt{MenhirLib.Convert}}\xspace} \newcommand{\menhirlibincrementalengine}{\repo{src/IncrementalEngine.ml}{\texttt{MenhirLib.IncrementalEngine}}\xspace} \newcommand{\standardmly}{\repo{src/standard.mly}{\texttt{standard.mly}}\xspace} \newcommand{\distrib}[1]{\repo{#1}{\texttt{#1}}} % Links to CompCert's repository. \newcommand{\compcertgithub}{https://github.com/AbsInt/CompCert/tree/master} \newcommand{\compcertgithubfile}[1]{\href{\compcertgithub/#1}{\texttt{#1}}} % Abbreviations. 
\newcommand{\menhir}{Menhir\xspace} \newcommand{\menhirlib}{\texttt{MenhirLib}\xspace} \newcommand{\menhirsdk}{\texttt{MenhirSdk}\xspace} \newcommand{\menhirinterpreter}{\texttt{MenhirInterpreter}\xspace} \newcommand{\cmenhir}{\texttt{menhir}\xspace} \newcommand{\ml}{\texttt{.ml}\xspace} \newcommand{\mli}{\texttt{.mli}\xspace} \newcommand{\mly}{\texttt{.mly}\xspace} \newcommand{\cmly}{\texttt{.cmly}\xspace} \newcommand{\vy}{\texttt{.vy}\xspace} \newcommand{\ocaml}{OCaml\xspace} \newcommand{\ocamlc}{\texttt{ocamlc}\xspace} \newcommand{\ocamlopt}{\texttt{ocamlopt}\xspace} \newcommand{\ocamldep}{\texttt{ocamldep}\xspace} \newcommand{\make}{\texttt{make}\xspace} \newcommand{\omake}{\texttt{omake}\xspace} \newcommand{\ocamlbuild}{\texttt{ocamlbuild}\xspace} \newcommand{\dune}{\texttt{dune}\xspace} \newcommand{\Makefile}{\texttt{Makefile}\xspace} \newcommand{\yacc}{\texttt{yacc}\xspace} \newcommand{\bison}{\texttt{bison}\xspace} \newcommand{\ocamlyacc}{\texttt{ocamlyacc}\xspace} \newcommand{\ocamllex}{\texttt{ocamllex}\xspace} \newcommand{\token}{\texttt{token}\xspace} \newcommand{\automaton}{\texttt{.automaton}\xspace} \newcommand{\conflicts}{\texttt{.conflicts}\xspace} \newcommand{\dott}{\texttt{.dot}\xspace} % Environments. \newcommand{\question}[1]{\vspace{3mm}$\diamond$ \textbf{#1}} % Ocamlweb settings. \newcommand{\basic}[1]{\textit{#1}} \let\ocwkw\kw \let\ocwbt\basic \let\ocwupperid\basic \let\ocwlowerid\basic \let\ocwtv\basic \newcommand{\ocwbar}{\vskip 2mm plus 2mm \hrule \vskip 2mm plus 2mm} \newcommand{\tcup}{${}\cup{}$} \newcommand{\tcap}{${}\cap{}$} \newcommand{\tminus}{${}\setminus{}$} % Command line options. \newcommand{\oo}[1]{\texttt{-{}-#1}\xspace} \newcommand{\obase}{\oo{base}} \newcommand{\ocanonical}{\oo{canonical}} % undocumented! 
\newcommand{\ocomment}{\oo{comment}} \newcommand{\ocmly}{\oo{cmly}} \newcommand{\odepend}{\oo{depend}} \newcommand{\orawdepend}{\oo{raw-depend}} \newcommand{\odump}{\oo{dump}} \newcommand{\oerrorrecovery}{\oo{error-recovery}} \newcommand{\oexplain}{\oo{explain}} \newcommand{\oexternaltokens}{\oo{external-tokens}} \newcommand{\ofixedexc}{\oo{fixed-exception}} \newcommand{\ograph}{\oo{graph}} \newcommand{\oignoreone}{\oo{unused-token}} \newcommand{\oignoreall}{\oo{unused-tokens}} \newcommand{\oignoreprec}{\oo{unused-precedence-levels}} \newcommand{\oinfer}{\oo{infer}} \newcommand{\oinferwrite}{\oo{infer-write-query}} \newcommand{\oinferread}{\oo{infer-read-reply}} \newcommand{\oinferprotocolsupported}{\oo{infer-protocol-supported}} \newcommand{\oinspection}{\oo{inspection}} \newcommand{\ointerpret}{\oo{interpret}} \newcommand{\ointerpretshowcst}{\oo{interpret-show-cst}} \newcommand{\ologautomaton}{\oo{log-automaton}} \newcommand{\ologcode}{\oo{log-code}} \newcommand{\ologgrammar}{\oo{log-grammar}} \newcommand{\onodollars}{\oo{no-dollars}} \newcommand{\onoinline}{\oo{no-inline}} \newcommand{\onostdlib}{\oo{no-stdlib}} \newcommand{\oocamlc}{\oo{ocamlc}} \newcommand{\oocamldep}{\oo{ocamldep}} \newcommand{\oonlypreprocess}{\oo{only-preprocess}} \newcommand{\oonlytokens}{\oo{only-tokens}} \newcommand{\ostrict}{\oo{strict}} \newcommand{\osuggestcomp}{\oo{suggest-comp-flags}} \newcommand{\osuggestlinkb}{\oo{suggest-link-flags-byte}} \newcommand{\osuggestlinko}{\oo{suggest-link-flags-opt}} \newcommand{\osuggestmenhirlib}{\oo{suggest-menhirLib}} \newcommand{\osuggestocamlfind}{\oo{suggest-ocamlfind}} \newcommand{\otable}{\oo{table}} \newcommand{\otimings}{\oo{timings}} \newcommand{\otrace}{\oo{trace}} \newcommand{\ostdlib}{\oo{stdlib}} \newcommand{\oversion}{\oo{version}} \newcommand{\ocoq}{\oo{coq}} \newcommand{\ocoqlibpath}{\oo{coq-lib-path}} \newcommand{\ocoqlibnopath}{\oo{coq-lib-no-path}} \newcommand{\ocoqnocomplete}{\oo{coq-no-complete}} 
\newcommand{\ocoqnoactions}{\oo{coq-no-actions}} \newcommand{\olisterrors}{\oo{list-errors}} \newcommand{\ointerpreterror}{\oo{interpret-error}} \newcommand{\ocompileerrors}{\oo{compile-errors}} \newcommand{\ocompareerrors}{\oo{compare-errors}} \newcommand{\oupdateerrors}{\oo{update-errors}} \newcommand{\oechoerrors}{\oo{echo-errors}} % The .messages file format. \newcommand{\messages}{\texttt{.messages}\xspace} % Adding mathstruts to ensure a common baseline. \newcommand{\mycommonbaseline}{ \let\oldnt\nt \renewcommand{\nt}[1]{$\mathstrut$\oldnt{##1}} \let\oldbasic\basic \renewcommand{\basic}[1]{$\mathstrut$\oldbasic{##1}} } % Position keywords. \newcommand{\ksymbolstartpos}{\texttt{\$symbolstartpos}\xspace} menhir-20200123/doc/manual.html000066400000000000000000013033261361226111300161430ustar00rootroot00000000000000 Menhir Reference Manual (version 20200123)

Menhir Reference Manual
(version 20200123)

François Pottier and Yann Régis-Gianas
INRIA
{Francois.Pottier, Yann.Regis-Gianas}@inria.fr


1  Foreword

Menhir is a parser generator. It turns high-level grammar specifications, decorated with semantic actions expressed in the OCaml programming language [18], into parsers, again expressed in OCaml. It is based on Knuth’s LR(1) parser construction technique [15]. It is strongly inspired by its precursors: yacc [11], ML-Yacc [22], and ocamlyacc [18], but offers a large number of minor and major improvements that make it a more modern tool.

This brief reference manual explains how to use Menhir. It does not attempt to explain context-free grammars, parsing, or the LR technique. Readers who have never used a parser generator are encouraged to read about these ideas first [1,2,8]. They are also invited to have a look at the demos directory in Menhir’s distribution.

Potential users of Menhir should be warned that Menhir’s feature set is not completely stable. There is a tension between preserving a measure of compatibility with ocamlyacc, on the one hand, and introducing new ideas, on the other hand. Some aspects of the tool, such as the error handling mechanism, are still potentially subject to incompatible changes: for instance, in the future, the current error handling mechanism (which is based on the error token, see §10) could be removed and replaced with an entirely different mechanism.

There is room for improvement in the tool and in this reference manual. Bug reports and suggestions are welcome!

2  Usage

Menhir is invoked as follows:

menhir option … option filename … filename

Each of the file names must end with .mly (unless --coq is used, in which case it must end with .vy) and denotes a partial grammar specification. These partial grammar specifications are joined (§5.1) to form a single, self-contained grammar specification, which is then processed. The following optional command line switches allow controlling many aspects of the process.
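
For instance, one partial specification can declare the tokens while another defines the grammar rules; passing both file names on the command line joins them into one specification. The following two-file sketch is hypothetical, in the spirit of the demos/calc-two demo; the file names and the invocation shown in the comment are illustrative:

```
/* tokens.mly -- token declarations only (hypothetical) */
%token <int> INT
%token PLUS EOF
%%

/* parser.mly -- rules only; joined with the file above by a command such as:
   menhir --base parser tokens.mly parser.mly */
%start <int> main
%%
main: i = INT EOF { i }
```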

--base basename.  This switch controls the base name of the .ml and .mli files that are produced. That is, the tool will produce files named basename.ml and basename.mli. Note that basename can contain occurrences of the / character, so it really specifies a path and a base name. When only one filename is provided on the command line, the default basename is obtained by depriving filename of its final .mly suffix. When multiple file names are provided on the command line, no default base name exists, so that the --base switch must be used.
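
The default-basename rule can be stated in one line of OCaml (a sketch of the rule described above, not Menhir's actual code):

```ocaml
(* Default base name: the input file name deprived of its ".mly" suffix. *)
let default_base (filename : string) : string =
  Filename.chop_suffix filename ".mly"

let () = assert (default_base "src/parser.mly" = "src/parser")
```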

--cmly.  This switch causes Menhir to produce a .cmly file in addition to its normal operation. This file contains a (binary-form) representation of the grammar and automaton (see §13.1).

--comment.  This switch causes a few comments to be inserted into the OCaml code that is written to the .ml file.

--compare-errors filename1 --compare-errors filename2.  Two such switches must always be used in conjunction so as to specify the names of two .messages files, filename1 and filename2. Each file is read and internally translated to a mapping of states to messages. Menhir then checks that the left-hand mapping is a subset of the right-hand mapping. This feature is typically used in conjunction with --list-errors to check that filename2 is complete (that is, covers all states where an error can occur). For more information, see §11.

--compile-errors filename.  This switch causes Menhir to read the file filename, which must obey the .messages file format, and to compile it to an OCaml function that maps a state number to a message. The OCaml code is sent to the standard output channel. At the same time, Menhir checks that the collection of input sentences in the file filename is correct and irredundant. For more information, see §11.
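
An entry in a .messages file pairs an erroneous input sentence with a handwritten message, roughly as follows. This fragment is hypothetical (it reuses the token names of the small declarations grammar found in doc/declarations.mly); §11 specifies the exact format, including the auto-generated ## comments inserted by --list-errors:

```
program: LPAREN ID SEMICOLON

A colon is expected after the declared identifier.
```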

--coq.  This switch causes Menhir to produce Coq code. See §12.

--coq-lib-path path.  This switch allows specifying under what name (or path) the Coq support library MenhirLib is known to Coq. When Menhir runs in --coq mode, the generated parser contains references to several modules in this library. This path is used to qualify these references. Its default value is MenhirLib.

--coq-lib-no-path.  This switch indicates that references to the Coq library MenhirLib should not be qualified. This was the default behavior of Menhir prior to 2018/05/30. This switch is provided for compatibility, but normally should not be used.

--coq-no-actions.  (Used in conjunction with --coq.) This switch causes the semantic actions present in the .vy file to be ignored and replaced with tt, the unique inhabitant of Coq’s unit type. This feature can be used to test the Coq back-end with a standard grammar, that is, a grammar that contains OCaml semantic actions. Just rename the file from .mly to .vy and set this switch.

--coq-no-complete.  (Used in conjunction with --coq.) This switch disables the generation of the proof of completeness of the parser (§12). This can be necessary because the proof of completeness is possible only if the grammar has no conflict (not even a benign one, in the sense of §6.1). This can be desirable also because, for a complex grammar, completeness may require a heavy certificate and its validation by Coq may take time.

--depend.  See §14.

--dump.  This switch causes a description of the automaton to be written to the file basename.automaton.

--echo-errors filename.  This switch causes Menhir to read the .messages file filename and to produce on the standard output channel just the input sentences. (That is, all messages, blank lines, and comments are filtered out.) For more information, see §11.

--explain.  This switch causes conflict explanations to be written to the file basename.conflicts. See also §6.

--external-tokens T.  This switch causes the definition of the token type to be omitted in basename.ml and basename.mli. Instead, the generated parser relies on the type T.token, where T is an OCaml module name. It is up to the user to define module T and to make sure that it exports a suitable token type. Module T can be hand-written. It can also be automatically generated out of a grammar specification using the --only-tokens switch.
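
For example, a hand-written module T could be as small as the following sketch (the module and constructor names here are illustrative assumptions, not part of Menhir's API):

```ocaml
(* A hand-written tokens module; with --external-tokens MyTokens, the
   generated parser refers to the type MyTokens.token instead of defining
   its own token type. *)
module MyTokens = struct
  type token =
    | INT of int
    | PLUS
    | EOF
end

let () =
  (* Tokens are ordinary OCaml data, normally produced by the lexer. *)
  match MyTokens.INT 42 with
  | MyTokens.INT n -> assert (n = 42)
  | _ -> assert false
```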

--fixed-exception.  This switch causes the exception Error to be internally defined as a synonym for Parsing.Parse_error. This means that an exception handler that catches Parsing.Parse_error will also catch the generated parser’s Error. This helps increase Menhir’s compatibility with ocamlyacc. There is otherwise no reason to use this switch.

--graph.  This switch causes a description of the grammar’s dependency graph to be written to the file basename.dot. The graph’s vertices are the grammar’s nonterminal symbols. There is a directed edge from vertex A to vertex B if the definition of A refers to B. The file is in a format that is suitable for processing by the graphviz toolkit.

--infer, --infer-write-query, --infer-read-reply.  See §14.

--inspection.  This switch requires --table. It causes Menhir to generate not only the monolithic and incremental APIs (§9.1, §9.2), but also the inspection API (§9.3). Activating this switch causes a few more tables to be produced, resulting in somewhat larger code size.

--interpret.  This switch causes Menhir to act as an interpreter, rather than as a compiler. No OCaml code is generated. Instead, Menhir reads sentences off the standard input channel, parses them, and displays outcomes. This switch can be usefully combined with --trace. For more information, see §8.

--interpret-error.  This switch is analogous to --interpret, except Menhir expects every sentence to cause an error on its last token, and displays information about the state in which the error is detected, in the .messages file format. For more information, see §11.

--interpret-show-cst.  This switch, used in conjunction with --interpret, causes Menhir to display a concrete syntax tree when a sentence is successfully parsed. For more information, see §8.

--list-errors.  This switch causes Menhir to produce (on the standard output channel) a complete list of input sentences that cause an error, in the .messages file format. For more information, see §11.

--log-automaton level.  When level is nonzero, this switch causes some information about the automaton to be logged to the standard error channel.

--log-code level.  When level is nonzero, this switch causes some information about the generated OCaml code to be logged to the standard error channel.

--log-grammar level.  When level is nonzero, this switch causes some information about the grammar to be logged to the standard error channel. When level is 2, the nullable, FIRST, and FOLLOW tables are displayed.

--no-dollars.  This switch disallows the use of positional keywords of the form $i.

--no-inline.  This switch causes all %inline keywords in the grammar specification to be ignored. This is especially useful in order to understand whether these keywords help solve any conflicts.

--no-stdlib.  This switch instructs Menhir to not use its standard library (§5.4).

--ocamlc command.  See §14.

--ocamldep command.  See §14.

--only-preprocess.  This switch causes the grammar specifications to be transformed up to the point where the automaton’s construction can begin. The grammar specifications whose names are provided on the command line are joined (§5.1); all parameterized nonterminal symbols are expanded away (§5.2); type inference is performed, if --infer is enabled; all nonterminal symbols marked %inline are expanded away (§5.3). This yields a single, monolithic grammar specification, which is printed on the standard output channel.

--only-tokens.  This switch causes the %token declarations in the grammar specification to be translated into a definition of the token type, which is written to the files basename.ml and basename.mli. No code is generated. This is useful when a single set of tokens is to be shared between several parsers. The directory demos/calc-two contains a demo that illustrates the use of this switch.

--raw-depend.  See §14.

--stdlib directory.  This switch exists only for backwards compatibility and is ignored. It may be removed in the future.

--strict.  This switch causes several warnings about the grammar and about the automaton to be considered errors. This includes warnings about useless precedence declarations, non-terminal symbols that produce the empty language, unreachable non-terminal symbols, productions that are never reduced, conflicts that are not resolved by precedence declarations, and end-of-stream conflicts.

--suggest-*.  See §14.

--table.  This switch causes Menhir to use its table-based back-end, as opposed to its (default) code-based back-end. When --table is used, Menhir produces significantly more compact and somewhat slower parsers. See §16 for a speed comparison.

The table-based back-end produces rather compact tables, which are analogous to those produced by yacc, bison, or ocamlyacc. These tables are not quite stand-alone: they are exploited by an interpreter, which is shipped as part of the support library MenhirLib. For this reason, when --table is used, MenhirLib must be made visible to the OCaml compilers, and must be linked into your executable program. The --suggest-* switches, described above, help do this.

The code-based back-end compiles the LR automaton directly into a nest of mutually recursive OCaml functions. In that case, MenhirLib is not required.

The incremental API (§9.2) and the inspection API (§9.3) are made available only by the table-based back-end.

--timings.  This switch causes internal timing information to be sent to the standard error channel.

--trace.  This switch causes tracing code to be inserted into the generated parser, so that, when the parser is run, its actions are logged to the standard error channel. This is analogous to ocamlrun’s p=1 parameter, except this switch must be enabled at compile time: one cannot selectively enable or disable tracing at runtime.

--unused-precedence-levels.  This switch suppresses all warnings about useless %left, %right, %nonassoc and %prec declarations.

--unused-token symbol.  This switch suppresses the warning that is normally emitted when Menhir finds that the terminal symbol symbol is unused.

--unused-tokens.  This switch suppresses all of the warnings that are normally emitted when Menhir finds that some terminal symbols are unused.

--update-errors filename.  This switch causes Menhir to read the .messages file filename and to produce on the standard output channel a new .messages file that is identical, except the auto-generated comments have been re-generated. For more information, see §11.

--version.  This switch causes Menhir to print its own version number and exit.

3  Lexical conventions

A semicolon character (;) may appear after a declaration (§4.1).

An old-style rule (§4.2) may be terminated with a semicolon. Also, within an old-style rule, each producer (§4.2.3) may be terminated with a semicolon.

A new-style rule (§4.3) must not be terminated with a semicolon. Within such a rule, the elements of a sequence must be separated with semicolons.

Semicolons are not allowed to appear anywhere except in the places mentioned above. This is in contrast with ocamlyacc, which views semicolons as insignificant, just like whitespace.

Identifiers (id) coincide with OCaml identifiers, except they are not allowed to contain the quote (') character. Following OCaml, identifiers that begin with a lowercase letter (lid) or with an uppercase letter (uid) are distinguished.

A quoted identifier qid is a string enclosed in double quotes. Such a string cannot contain a double quote or a backslash. Quoted identifiers are used as token aliases (§4.1.3).

Comments are C-style (surrounded with /* and */, cannot be nested), C++-style (announced by // and extending until the end of the line), or OCaml-style (surrounded with (* and *), can be nested). Of course, inside OCaml code, only OCaml-style comments are allowed.

OCaml type expressions are surrounded with < and >. Within such expressions, all references to type constructors (other than the built-in list, option, etc.) must be fully qualified.

4  Syntax of grammar specifications


specification ::= declaration … declaration %% rule … rule [ %% OCaml code ]
declaration ::= %{ OCaml code %}
  %parameter < uid : OCaml module type >
  %token [ < OCaml type > ] uid [ qid ] … uid [ qid ]
  %nonassoc uid … uid
  %left uid … uid
  %right uid … uid
  %type < OCaml type > lid … lid
  %start [ < OCaml type > ] lid … lid
  %attribute actual … actual attribute … attribute
  % attribute
  %on_error_reduce lid … lid
attribute ::= [@ name payload ]
(old syntax) rule ::= [ %public ] [ %inline ] lid [ ( id, …, id ) ] : [ | ] group | … | group
group ::= production | … | production { OCaml code } [ %prec id ]
production ::= producer … producer [ %prec id ]
producer ::= [ lid = ] actual
actual ::= id [ ( actual, …, actual ) ]
  actual ( ? ∣ + ∣ * )
  group | … | group
(new syntax) rule ::= [ %public ] let lid [ ( id, …, id ) ] ( := ∣ == ) expression
expression ::= [ | ] expression | … | expression
  [ pattern = ] expression ; expression
  id [ ( expression, …, expression ) ]
  expression ( ? ∣ + ∣ * )
  { OCaml code } [ %prec id ]
  < OCaml id > [ %prec id ]
pattern ::= lid ∣ _ ∣ ~ ∣ ( pattern, …, pattern )
Figure 1: Syntax of grammar specifications

The syntax of grammar specifications appears in Figure 1. The places where attributes can be attached are not shown; they are documented separately (§13.2). A grammar specification begins with a sequence of declarations (§4.1), ended by a mandatory %% keyword. Following this keyword, a sequence of rules is expected. Each rule defines a nonterminal symbol lid, whose name must begin with a lowercase letter. A rule is expressed either in the “old syntax” (§4.2) or in the “new syntax” (§4.3), which is slightly more elegant and powerful.

4.1  Declarations

4.1.1  Headers

A header is a piece of OCaml code, surrounded with %{ and %}. It is copied verbatim at the beginning of the .ml file. It typically contains OCaml open directives and function definitions for use by the semantic actions. If a single grammar specification file contains multiple headers, their order is preserved. However, when two headers originate in distinct grammar specification files, the order in which they are copied to the .ml file is unspecified.

4.1.2  Parameters

A declaration of the form:

%parameter < uid : OCaml module type >

causes the entire parser to become parameterized over the OCaml module uid, that is, to become an OCaml functor. The directory demos/calc-param contains a demo that illustrates the use of this switch.

If a single specification file contains multiple %parameter declarations, their order is preserved, so that the module name uid introduced by one declaration is effectively in scope in the declarations that follow. When two %parameter declarations originate in distinct grammar specification files, the order in which they are processed is unspecified. Last, %parameter declarations take effect before %{%}, %token, %type, or %start declarations are considered, so that the module name uid introduced by a %parameter declaration is effectively in scope in all %{%}, %token, %type, or %start declarations, regardless of whether they precede or follow the %parameter declaration. This means, in particular, that the side effects of an OCaml header are observed only when the functor is applied, not when it is defined.

4.1.3  Tokens

A declaration of the form:

%token [ < OCaml type > ] uid1 [ qid1 ] … uidn [ qidn ]

defines the identifiers uid1, …, uidn as tokens, that is, as terminal symbols in the grammar specification and as data constructors in the token type.

If an OCaml type t is present, then these tokens are considered to carry a semantic value of type t, otherwise they are considered to carry no semantic value.

If a quoted identifier qidi is present, then it is considered an alias for the terminal symbol uidi. (This feature, known as “token aliases”, is borrowed from Bison.) Throughout the grammar, the quoted identifier qidi is then synonymous with the identifier uidi. For example, if one declares:

%token PLUS "+"

then the quoted identifier "+" stands for the terminal symbol PLUS throughout the grammar. An example of the use of token aliases appears in the directory demos/calc-alias. Token aliases can be used to improve the readability of a grammar. One must keep in mind, however, that they are just syntactic sugar: they are not interpreted in any way by Menhir or conveyed to tools like ocamllex. They could be considered confusing by a reader who mistakenly believes that they are interpreted as string literals.
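Once declared, an alias may appear anywhere the terminal symbol itself could. The following minimal sketch (the rule expr and the token INT are hypothetical) uses "+" inside a production:

```
%token <int> INT
%token PLUS "+"
%%
expr:
   |  i = INT { i }
   |  i = INT; "+"; e = expr { i + e }
```

Because the alias is mere syntactic sugar, this grammar is exactly equivalent to one that writes PLUS in place of "+".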

4.1.4  Priority and associativity

A declaration of one of the following forms:

%nonassoc uid1 … uidn
%left uid1 … uidn
%right uid1 … uidn

assigns both a priority level and an associativity status to the symbols uid1, …, uidn. The priority level assigned to uid1, …, uidn is not defined explicitly: instead, it is defined to be higher than the priority level assigned by the previous %nonassoc, %left, or %right declaration, and lower than that assigned by the next %nonassoc, %left, or %right declaration. The symbols uid1, …, uidn can be tokens (defined elsewhere by a %token declaration) or dummies (not defined anywhere). Both can be referred to as part of %prec annotations. Associativity status and priority levels allow shift/reduce conflicts to be silently resolved (§6).
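For instance, in the following sequence of declarations (the token names are hypothetical), PLUS and MINUS share the lowest priority level, TIMES and DIV sit one level above them, and UMINUS, a dummy symbol intended for use in %prec annotations, occupies the highest level:

```
%left PLUS MINUS
%left TIMES DIV
%nonassoc UMINUS
```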

4.1.5  Types

A declaration of the form:

%type < OCaml type > lid1 … lidn

assigns an OCaml type to each of the nonterminal symbols lid1, …, lidn. For start symbols, providing an OCaml type is mandatory, but is usually done as part of the %start declaration. For other symbols, it is optional. Providing type information can improve the quality of OCaml’s type error messages.

A %type declaration may concern not only a nonterminal symbol, such as, say, expression, but also a fully applied parameterized nonterminal symbol, such as list(expression) or separated_list(COMMA, option(expression)).

The types provided as part of %type declarations are copied verbatim to the .ml and .mli files. In contrast, headers (§4.1.1) are copied to the .ml file only. For this reason, the types provided as part of %type declarations must make sense both in the presence and in the absence of these headers. They should typically be fully qualified types.
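As an illustration, the following hypothetical declarations assign a type to an ordinary nonterminal symbol and to a fully applied parameterized symbol; both types are fully qualified, so they remain valid in the .mli file, where headers are not copied:

```
%type <Syntax.expression> expression
%type <Syntax.expression list> separated_list(COMMA, expression)
```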

4.1.6  Start symbols

A declaration of the form:

%start [ < OCaml type > ] lid1 … lidn

declares the nonterminal symbols lid1, …, lidn to be start symbols. Each such symbol must be assigned an OCaml type either as part of the %start declaration or via separate %type declarations. Each of lid1, …, lidn becomes the name of a function whose signature is published in the .mli file and that can be used to invoke the parser.
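For instance, assuming a hypothetical type Syntax.program, the declaration below makes program a start symbol; the .mli file then publishes a function named program that, in the monolithic API (§9.1), takes a lexer and a lexing buffer and returns a semantic value of this type:

```
%start <Syntax.program> program

(* The generated .mli file then contains:
   val program : (Lexing.lexbuf -> token) -> Lexing.lexbuf -> Syntax.program *)
```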

4.1.7  Attribute declarations

Attribute declarations of the form %attribute actual … actual attribute … attribute and % attribute are explained in §13.2.

4.1.8  Extra reductions on error

A declaration of the form:

%on_error_reduce lid1 … lidn

marks the nonterminal symbols lid1, …, lidn as potentially eligible for reduction when an invalid token is found. This may cause one or more extra reduction steps to be performed before the error is detected.

More precisely, this declaration affects the automaton as follows. Let us say that a production lid → … is “reducible on error” if its left-hand symbol lid appears in a %on_error_reduce declaration. After the automaton has been constructed and after any conflicts have been resolved, in every state s, the following algorithm is applied:

  1. Construct the set of all productions that are ready to be reduced in state s and are reducible on error;
  2. Test if one of them, say p, has higher “on-error-reduce-priority” than every other production in this set;
  3. If so, in state s, replace every error action with a reduction of the production p. (In other words, for every terminal symbol t, if the action table says: “in state s, when the next input symbol is t, fail”, then this entry is replaced with: “in state s, when the next input symbol is t, reduce production p”.)

If step 3 above is executed in state s, then an error can never be detected in state s, since all error actions in state s are replaced with reduce actions. Error detection is deferred: at least one reduction takes place before the error is detected. It is a “spurious” reduction: in a canonical LR(1) automaton, it would not take place.

An %on_error_reduce declaration does not affect the language that is accepted by the automaton. It does not affect the location where an error is detected. It is used to control in which state an error is detected. If used wisely, it can make errors easier to report, because they are detected in a state for which it is easier to write an accurate diagnostic message (§11.3).

Like a %type declaration, an %on_error_reduce declaration may concern not only a nonterminal symbol, such as, say, expression, but also a fully applied parameterized nonterminal symbol, such as list(expression) or separated_list(COMMA, option(expression)).

The “on-error-reduce-priority” of a production is that of its left-hand symbol. The “on-error-reduce-priority” of a nonterminal symbol is determined implicitly by the order of %on_error_reduce declarations. In the declaration %on_error_reduce lid1 … lidn, the symbols lid1, …, lidn have the same “on-error-reduce-priority”. They have higher “on-error-reduce-priority” than the symbols listed in previous %on_error_reduce declarations, and lower “on-error-reduce-priority” than those listed in later %on_error_reduce declarations.
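For example, in a grammar where expression and statement are nonterminal symbols (a hypothetical fragment), one might write the following declarations. Because statement is declared last, it has higher “on-error-reduce-priority” than expression:

```
%on_error_reduce expression
%on_error_reduce statement
```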

4.2  Rules—old syntax

In its simplest form, a rule begins with the nonterminal symbol lid, followed by a colon character (:), and continues with a sequence of production groups (§4.2.1). Each production group is preceded with a vertical bar character (|); the very first bar is optional. The meaning of the bar is choice: the nonterminal symbol lid develops to either of the production groups. We defer explanations of the keyword %public (§5.1), of the keyword %inline (§5.3), and of the optional formal parameters ( id, …, id ) (§5.2).

4.2.1  Production groups

In its simplest form, a production group consists of a single production (§4.2.2), followed by an OCaml semantic action (§4.2.1) and an optional %prec annotation (§4.2.1). A production specifies a sequence of terminal and nonterminal symbols that should be recognized, and optionally binds identifiers to their semantic values.

Semantic actions

A semantic action is a piece of OCaml code that is executed in order to assign a semantic value to the nonterminal symbol with which this production group is associated. A semantic action can refer to the (already computed) semantic values of the terminal or nonterminal symbols that appear in the production via the semantic value identifiers bound by the production.

For compatibility with ocamlyacc, semantic actions can also refer to unnamed semantic values via positional keywords of the form $1, $2, etc. This style is discouraged. (It is in fact forbidden if --no-dollars is turned on.) Furthermore, as a positional keyword of the form $i is internally rewritten as _i, the user should not use identifiers of the form _i.

%prec annotations

An annotation of the form %prec id indicates that the precedence level of the production group is the level assigned to the symbol id via a previous %nonassoc, %left, or %right declaration (§4.1.4). In the absence of a %prec annotation, the precedence level assigned to each production is the level assigned to the rightmost terminal symbol that appears in it. It is undefined if the rightmost terminal symbol has an undefined precedence level or if the production mentions no terminal symbols at all. The precedence level assigned to a production is used when resolving shift/reduce conflicts (§6).
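The classic use of %prec is unary minus. In the hypothetical fragment below, the production for unary minus mentions MINUS as its rightmost terminal symbol, which would give it the low priority level of binary subtraction; the %prec UMINUS annotation overrides this with the higher level of the dummy symbol UMINUS:

```
%token <int> INT
%token PLUS MINUS
%left PLUS MINUS
%nonassoc UMINUS
%%
expr:
   |  i = INT { i }
   |  e = expr; PLUS; f = expr { e + f }
   |  e = expr; MINUS; f = expr { e - f }
   |  MINUS; e = expr { - e } %prec UMINUS
```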

Multiple productions in a group

If multiple productions are present in a single group, then the semantic action and precedence annotation are shared between them. This short-hand effectively allows several productions to share a semantic action and precedence annotation without requiring textual duplication. It is legal only when every production binds exactly the same set of semantic value identifiers and when no positional semantic value keywords ($1, etc.) are used.
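For instance, in the hypothetical group below, both productions bind the identifier x, so the shared semantic action is well-defined for each of them:

```
declaration:
   |  x = function_declaration
   |  x = variable_declaration
        { x }
```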

4.2.2  Productions

A production is a sequence of producers (§4.2.3), optionally followed by a %prec annotation (§4.2.1). If a precedence annotation is present, it applies to this production alone, not to other productions in the production group. It is illegal for a production and its production group to both carry %prec annotations.

4.2.3  Producers

A producer is an actual (§4.2.4), optionally preceded with a binding of a semantic value identifier, of the form lid =. The actual specifies which construction should be recognized and how a semantic value should be computed for that construction. The identifier lid, if present, becomes bound to that semantic value in the semantic action that follows. Otherwise, the semantic value can be referred to via a positional keyword ($1, etc.).

4.2.4  Actuals

In its simplest form, an actual is just a terminal or nonterminal symbol id. If it is a parameterized non-terminal symbol (see §5.2), then it should be applied: id ( actual, …, actual ).

An actual may be followed with a modifier (?, +, or *). This is explained further on (see §5.2 and Figure 2).

An actual may also be an “anonymous rule”. In that case, one writes just the rule’s right-hand side, which takes the form group | … | group. (This form is allowed only as an argument in an application.) This form is expanded on the fly to a definition of a fresh non-terminal symbol, which is declared %inline. For instance, providing an anonymous rule as an argument to list:

list (  e = expression; SEMICOLON { e }  )

is equivalent to writing this:

list (  expression_SEMICOLON  )

where the non-terminal symbol expression_SEMICOLON is chosen fresh and is defined as follows:

%inline expression_SEMICOLON:
   |  e = expression; SEMICOLON { e }

4.3  Rules—new syntax

Please be warned that the new syntax is considered experimental and is subject to change in the future.

In its simplest form, a rule takes the form let lid := expression. Its left-hand side lid is a nonterminal symbol; its right-hand side is an expression. Such a rule defines an ordinary nonterminal symbol, while the alternate form let lid == expression defines an %inline nonterminal symbol (§5.3), that is, a macro. A rule can be preceded with the keyword %public (§5.1) and can be parameterized with a tuple of formal parameters ( id, …, id ) (§5.2). The various forms of expressions, listed in Figure 1, are:

  • A choice between several expressions, [ | ] expression1 | … | expressionn. The leading bar is optional.
  • A sequence of two expressions, pattern = expression1 ; expression2. The semantic value produced by expression1 is decomposed according to the pattern pattern. The OCaml variables introduced by pattern may appear in a semantic action that ends the sequence expression2.
  • A sequence ~ = id1 ; expression2, which is sugar for id1 = id1 ; expression2. This is a pun.
  • A sequence expression1 ; expression2, which is sugar for _ = expression1 ; expression2.
  • A symbol id, possibly applied to a tuple of expressions (  expression1, …, expressionn  ). It is worth noting that such an expression can form the end of a sequence: id at the end of a sequence stands for x = id ; { x } for some fresh variable x. Thus, a sequence need not end with a semantic action.
  • An expression followed with ?, +, or *. This is sugar for the previous form: see §5.2 and Figure 2.
  • A semantic action { OCaml code } , possibly followed with a precedence annotation %prec id. This OCaml code can refer to the variables that have been bound earlier in the sequence that this semantic action ends. These include all variables named by the user as well as all variables introduced by a ~ pattern as part of a pun. The notation $i, where i is an integer, is forbidden.
  • A point-free semantic action < OCaml id >, possibly followed with a precedence annotation %prec id. The OCaml identifier id must denote a function or a data constructor. It is applied to a tuple of the variables that have been bound earlier in the sequence that this semantic action ends. Thus, < id > is sugar for { id (x1, …, xn) }, where x1, …, xn are the variables bound earlier. These include all variables named by the user as well as all variables introduced by a ~ pattern.
  • An identity semantic action <>. This is sugar for < identity >, where identity is OCaml’s identity function. Therefore, it is sugar for { (x1, …, xn) }, where x1, …, xn are the variables bound earlier.

The syntax of expressions, as presented in Figure 1, seems more permissive than it really is. In reality, a choice cannot be nested inside a sequence; a sequence cannot be nested in the left-hand side of a sequence; a semantic action cannot appear in the left-hand side of a sequence. (Thus, there is a stratification in three levels: choice expressions, sequence expressions, and atomic expressions, which corresponds roughly to the stratification of rules, productions, and producers in the old syntax.) Furthermore, an expression between parentheses (  expression  ) is not a valid expression. To surround an expression with parentheses, one must write either midrule  (  expression  ) or endrule  (  expression  ) ; see §5.4 and Figure 3.

When a complex expression (e.g., a choice or a sequence) is placed in parentheses, as in id (  expression  ), this is equivalent to using id (  s ) , where the fresh symbol s is declared as a synonym for this expression, via the declaration let s == expression. This idiom is also known as an anonymous rule (§4.2.4).
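For instance, in the new syntax, a sequence can be passed directly as an argument to list (a hypothetical fragment); this is sugar for introducing a fresh %inline symbol that stands for the sequence:

```
let declarations :=
  list( ~ = declaration ; SEMICOLON ; <> )
```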

Examples

As an example of a rule in the new syntax, the parameterized nonterminal symbol option, which is part of Menhir’s standard library (§5.4), can be defined as follows:

let option(x) :=
  | { None }
  | x = x ; { Some x }

Using a pun, it can also be written as follows:

let option(x) :=
  | { None }
  | ~ = x ; { Some x }

Using a pun and a point-free semantic action, it can also be expressed as follows:

let option(x) :=
  | { None }
  | ~ = x ; < Some >

As another example, the parameterized symbol delimited, also part of Menhir’s standard library (§5.4), can be defined in the new syntax as follows:

let delimited(opening, x, closing) ==
  opening ; ~ = x ; closing ; <>

The use of == indicates that this is a macro, i.e., an %inline nonterminal symbol (see §5.3). The identity semantic action <> is here synonymous with { x }.

Other illustrations of the new syntax can be found in the directories demos/calc-new-syntax-dune and demos/calc-ast-dune.

5  Advanced features

5.1  Splitting specifications over multiple files

Modules

Grammar specifications can be split over multiple files. When Menhir is invoked with multiple argument file names, it considers each of these files as a partial grammar specification, and joins these partial specifications in order to obtain a single, complete specification.

This feature is intended to promote a form of modularity. It is hoped that, by splitting large grammar specifications into several “modules”, they can be made more manageable. It is also hoped that this mechanism, in conjunction with parameterization (§5.2), will promote sharing and reuse. It should be noted, however, that this is only a weak form of modularity. Indeed, partial specifications cannot be independently processed (say, checked for conflicts). It is necessary to first join them, so as to form a complete grammar specification, before any kind of grammar analysis can be done.

This mechanism is, in fact, how Menhir’s standard library (§5.4) is made available: even though its name does not appear on the command line, it is automatically joined with the user’s explicitly-provided grammar specifications, making the standard library’s definitions globally visible.

A partial grammar specification, or module, contains declarations and rules, just like a complete one: there is no visible difference. Of course, it can consist of only declarations, or only rules, if the user so chooses. (Don’t forget the mandatory %% keyword that separates declarations and rules. It must be present, even if one of the two sections is empty.)

Private and public nonterminal symbols

It should be noted that joining is not a purely textual process. If two modules happen to define a nonterminal symbol by the same name, then it is considered, by default, that this is an accidental name clash. In that case, each of the two nonterminal symbols is silently renamed so as to avoid the clash. In other words, by default, a nonterminal symbol defined in module A is considered private, and cannot be defined again, or referred to, in module B.

Naturally, it is sometimes desirable to define a nonterminal symbol N in module A and to refer to it in module B. This is permitted if N is public, that is, if either its definition carries the keyword %public or N is declared to be a start symbol. A public nonterminal symbol is never renamed, so it can be referred to by modules other than its defining module.

In fact, it is permitted to split the definition of a public nonterminal symbol, over multiple modules and/or within a single module. That is, a public nonterminal symbol N can have multiple definitions, within one module and/or in distinct modules. All of these definitions are joined using the choice (|) operator. For instance, in the grammar of a programming language, the definition of the nonterminal symbol expression could be split into multiple modules, where one module groups the expression forms that have to do with arithmetic, one module groups those that concern function definitions and function calls, one module groups those that concern object definitions and method calls, and so on.
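As a sketch, the definition of expression could be split across two modules as follows (the token names and the data constructors Add and Call are hypothetical); the two definitions are joined with the choice operator:

```
(* arith.mly *)
%public expression:
   |  e = expression; PLUS; f = expression { Add (e, f) }

(* calls.mly *)
%public expression:
   |  f = expression; LPAREN; es = separated_list(COMMA, expression); RPAREN
        { Call (f, es) }
```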

Tokens aside

Another use of modularity consists in placing all %token declarations in one module, and the actual grammar specification in another module. The module that contains the token definitions can then be shared, making it easier to define multiple parsers that accept the same type of tokens. (On this topic, see demos/calc-two.)

5.2  Parameterizing rules

A rule (that is, the definition of a nonterminal symbol) can be parameterized over an arbitrary number of symbols, which are referred to as formal parameters.

Example

For instance, here is the definition of the parameterized nonterminal symbol option, taken from the standard library (§5.4):

%public option(X):
   |  { None }
   |  x = X { Some x }

This definition states that option(X) expands to either the empty string, producing the semantic value None, or to the string X, producing the semantic value Some x, where x is the semantic value of X. In this definition, the symbol X is abstract: it stands for an arbitrary terminal or nonterminal symbol. The definition is made public, so option can be referred to within client modules.

A client who wishes to use option simply refers to it, together with an actual parameter – a symbol that is intended to replace X. For instance, here is how one might define a sequence of declarations, preceded with optional commas:

declarations:
   |  { [] }
   |  ds = declarations; option(COMMA); d = declaration { d :: ds }

This definition states that declarations expands either to the empty string or to declarations followed by an optional comma followed by declaration. (Here, COMMA is presumably a terminal symbol.) When this rule is encountered, the definition of option is instantiated: that is, a copy of the definition, where COMMA replaces X, is produced. Things behave exactly as if one had written:

optional_comma:
   |  { None }
   |  x = COMMA { Some x }
declarations:
   |  { [] }
   |  ds = declarations; optional_comma; d = declaration { d :: ds }

Note that, even though COMMA presumably has been declared as a token with no semantic value, writing x = COMMA is legal, and binds x to the unit value. This design choice ensures that the definition of option makes sense regardless of the nature of X: that is, X can be instantiated with a terminal symbol, with or without a semantic value, or with a nonterminal symbol.

Parameterization in general

In general, the definition of a nonterminal symbol N can be parameterized with an arbitrary number of formal parameters. When N is referred to within a production, it must be applied to the same number of actuals. In general, an actual is:

  • either a single symbol, which can be a terminal symbol, a nonterminal symbol, or a formal parameter;
  • or an application of such a symbol to a number of actuals.

For instance, here is a rule whose single production consists of a single producer, which contains several, nested actuals. (This example is discussed again in §5.4.)

plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

actual?  is syntactic sugar for option(actual)
actual+  is syntactic sugar for nonempty_list(actual)
actual*  is syntactic sugar for list(actual)
Figure 2: Syntactic sugar for simulating regular expressions, also known as EBNF

Applications of the parameterized nonterminal symbols option, nonempty_list, and list, which are defined in the standard library (§5.4), can be written using a familiar, regular-expression like syntax (Figure 2).
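For instance, the following hypothetical rules use all three shorthands; each is expanded into an application of the corresponding standard-library symbol:

```
program:
   |  ds = declaration+; EOF { ds }    (* declaration+ is nonempty_list(declaration) *)
signature:
   |  ds = declaration*; EOF { ds }    (* declaration* is list(declaration) *)
initializer_opt:
   |  e = expression? { e }            (* expression? is option(expression) *)
```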

Higher-order parameters

A formal parameter can itself expect parameters. For instance, here is a rule that defines the syntax of procedures in an imaginary programming language:

procedure(list):
   |  PROCEDURE ID list(formal) SEMICOLON block SEMICOLON {}

This rule states that the token ID, which represents the name of the procedure, should be followed with a list of formal parameters. (The definitions of the nonterminal symbols formal and block are not shown.) However, because list is a formal parameter, as opposed to a concrete nonterminal symbol defined elsewhere, this definition does not specify how the list is laid out: which token, if any, is used to separate, or terminate, list elements? is the list allowed to be empty? and so on. A more concrete notion of procedure is obtained by instantiating the formal parameter list: for instance, procedure(plist), where plist is the parameterized nonterminal symbol defined earlier, is a valid application.

Consistency

Definitions and uses of parameterized nonterminal symbols are checked for consistency before they are expanded away. In short, it is checked that, wherever a nonterminal symbol is used, it is supplied with actual arguments in appropriate number and of appropriate nature. This guarantees that expansion of parameterized definitions terminates and produces a well-formed grammar as its outcome.

5.3  Inlining

It is well-known that the following grammar of arithmetic expressions does not work as expected: that is, in spite of the priority declarations, it has shift/reduce conflicts.

%token < int > INT
%token PLUS TIMES
%left PLUS
%left TIMES
 
%%
 
expression:
   |  i = INT { i }
   |  e = expression; o = op; f = expression { o e f }
op:
   |  PLUS { ( + ) }
   |  TIMES { ( * ) }

The trouble is, the precedence level of the production expression → expression op expression is undefined, and there is no sensible way of defining it via a %prec declaration, since the desired level really depends upon the symbol that was recognized by op: was it PLUS or TIMES?

The standard workaround is to abandon the definition of op as a separate nonterminal symbol, and to inline its definition into the definition of expression, like this:

expression:
   |  i = INT { i }
   |  e = expression; PLUS; f = expression { e + f }
   |  e = expression; TIMES; f = expression { e * f }

This avoids the shift/reduce conflict, but gives up some of the original specification’s structure, which, in realistic situations, can be damageable. Fortunately, Menhir offers a way of avoiding the conflict without manually transforming the grammar, by declaring that the nonterminal symbol op should be inlined:

expression:
   |  i = INT { i }
   |  e = expression; o = op; f = expression { o e f }
%inline op:
   |  PLUS { ( + ) }
   |  TIMES { ( * ) }

The %inline keyword causes all references to op to be replaced with its definition. In this example, the definition of op involves two productions, one that develops to PLUS and one that develops to TIMES, so every production that refers to op is effectively turned into two productions, one that refers to PLUS and one that refers to TIMES. After inlining, op disappears and expression has three productions: that is, the result of inlining is exactly the manual workaround shown above.

In some situations, inlining can also help recover a slight efficiency margin. For instance, the definition:

%inline plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

effectively makes plist(X) an alias for the right-hand side loption(…). Without the %inline keyword, the language recognized by the grammar would be the same, but the LR automaton would probably have one more state and would perform one more reduction at run time.

The %inline keyword does not affect the computation of positions (§7). The same positions are computed, regardless of where %inline keywords are placed.

If the semantic actions have side effects, the %inline keyword can affect the order in which these side effects take place. In the example of op and expression above, if for some reason the semantic action associated with op has a side effect (such as updating a global variable, or printing a message), then, by inlining op, we delay this side effect, which takes place after the second operand has been recognized, whereas in the absence of inlining it takes place as soon as the operator has been recognized.
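This can be observed by placing a print statement in the semantic action. The following is a sketch; the two definitions of op are alternatives, not meant to coexist:

```
(* Without %inline, "saw op" is printed as soon as the operator has
   been recognized and op is reduced: *)
op:
   |  PLUS { print_endline "saw op"; ( + ) }

(* With %inline, this action is spliced into expression's production,
   so "saw op" is printed only after the second operand has been
   recognized: *)
%inline op:
   |  PLUS { print_endline "saw op"; ( + ) }
```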

5.4  The standard library


Name                              Recognizes                                             Produces                      Comment

endrule(X)                        X                                                      α, if X : α                   (inlined)
midrule(X)                        X                                                      α, if X : α

option(X)                         є | X                                                  α option, if X : α            (also X?)
ioption(X)                        є | X                                                  α option, if X : α            (inlined)
boption(X)                        є | X                                                  bool
loption(X)                        є | X                                                  α list, if X : α list

pair(X, Y)                        X Y                                                    α × β, if X : α and Y : β
separated_pair(X, sep, Y)         X sep Y                                                α × β, if X : α and Y : β
preceded(opening, X)              opening X                                              α, if X : α
terminated(X, closing)            X closing                                              α, if X : α
delimited(opening, X, closing)    opening X closing                                      α, if X : α

list(X)                           a possibly empty sequence of X’s                       α list, if X : α              (also X*)
nonempty_list(X)                  a nonempty sequence of X’s                             α list, if X : α              (also X+)
separated_list(sep, X)            a possibly empty sequence of X’s separated with sep’s  α list, if X : α
separated_nonempty_list(sep, X)   a nonempty sequence of X’s separated with sep’s        α list, if X : α

rev(X)                            X                                                      α list, if X : α list         (inlined)
flatten(X)                        X                                                      α list, if X : α list list    (inlined)
append(X, Y)                      X Y                                                    α list, if X, Y : α list      (inlined)

Figure 3: Summary of the standard library; see standard.mly for details

Once equipped with a rudimentary module system (§5.1), parameterization (§5.2), and inlining (§5.3), it is straightforward to propose a collection of commonly used definitions, such as options, sequences, lists, and so on. This standard library is joined, by default, with every grammar specification. A summary of the nonterminal symbols offered by the standard library appears in Figure 3. See also the short-hands documented in Figure 2.

By relying on the standard library, a client module can concisely define more elaborate notions. For instance, the following rule:

%inline plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

causes plist(X) to recognize a list of X’s, where the empty list is represented by the empty string, and a non-empty list is delimited with parentheses and comma-separated.

The standard library is stored in a file named standard.mly, which is embedded inside Menhir when it is built. The command line switch --no-stdlib instructs Menhir to not load the standard library.

The meaning of the symbols defined in the standard library (Figure 3) should be clear in most cases. Yet, the symbols endrule(X) and midrule(X) deserve an explanation. Both take an argument X, which typically will be instantiated with an anonymous rule (§4.2.4). Both are defined as a synonym for X. In both cases, this allows placing an anonymous subrule in the middle of a rule.

For instance, the following is a well-formed production:

  cat    endrule(dog    { OCaml code1 })    cow    { OCaml code2 }

This production consists of three producers, namely cat and endrule(dog { OCaml code1 }) and cow, and a semantic action { OCaml code2 }. Because endrule(X) is declared as an %inline synonym for X, the expansion of anonymous rules (§4.2.4), followed with the expansion of %inline symbols (§5.3), transforms the above production into the following:

  cat    dog    cow    { OCaml code1; OCaml code2 }

Note that OCaml code1 moves to the end of the rule, which means that this code is executed only after cat, dog and cow have been recognized. In this example, the use of endrule is rather pointless, as the expanded code is more concise and clearer than the original code. Still, endrule can be useful when its actual argument is an anonymous rule with multiple branches.
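For instance, here is a sketch where the actual argument of endrule is an anonymous rule with two branches; this reformulates the op example of §5.3. Because endrule is declared %inline, each branch is inlined into expression, so the conflicts are resolved by the tokens' precedence levels:

```
expression:
   |  i = INT { i }
   |  e = expression
      o = endrule( PLUS { ( + ) } | TIMES { ( * ) } )
      f = expression
      { o e f }
```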

midrule is used in exactly the same way as endrule, but its expansion is different. For instance, the following is a well-formed production:

  cat    midrule({ OCaml code1 })    cow    { OCaml code2 }

(There is no dog in this example; this is intentional.) Because midrule(X) is a synonym for X, but is not declared %inline, the expansion of anonymous rules (§4.2.4), followed with the expansion of %inline symbols (§5.3), transforms the above production into the following:

  cat    xxx    cow    { OCaml code2 }

where the fresh nonterminal symbol xxx is separately defined by the rule xxx: { OCaml code1 } . Thus, xxx recognizes the empty string, and as soon as it is recognized, OCaml code1 is executed. This is known as a “mid-rule action”.
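Mid-rule actions are typically used to perform a side effect at a precise point within a production, before the remaining symbols are recognized. The following sketch assumes hypothetical functions enter_scope and exit_scope that manage a symbol table:

```
block:
   |  LBRACE
      midrule({ enter_scope () })
      ss = statement*
      RBRACE
      { exit_scope (); ss }
```

Here, enter_scope is executed as soon as the opening brace has been recognized, rather than after the entire block has been parsed.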

6  Conflicts

When a shift/reduce or reduce/reduce conflict is detected, it is classified as either benign, if it can be resolved by consulting user-supplied precedence declarations, or severe, if it cannot. Benign conflicts are not reported. Severe conflicts are reported and, if the --explain switch is on, explained.

6.1  When is a conflict benign?

A shift/reduce conflict involves a single token (the one that one might wish to shift) and one or more productions (those that one might wish to reduce). When such a conflict is detected, the precedence levels (§4.1.4, §4.2.1) of these entities are looked up and compared as follows:

  1. if only one production is involved, and if it has higher priority than the token, then the conflict is resolved in favor of reduction.
  2. if only one production is involved, and if it has the same priority as the token, then the associativity status of the token is looked up:
    1. if the token was declared nonassociative, then the conflict is resolved in favor of neither action, that is, a syntax error will be signaled if this token shows up when this production is about to be reduced;
    2. if the token was declared left-associative, then the conflict is resolved in favor of reduction;
    3. if the token was declared right-associative, then the conflict is resolved in favor of shifting.
  3. if multiple productions are involved, and if, considered one by one, they all cause the conflict to be resolved in the same way (that is, either in favor of shifting, or in favor of neither), then the conflict is resolved in that way.

In any of these cases, the conflict is considered benign. Otherwise, it is considered severe. Note that a reduce/reduce conflict is always considered severe, unless it happens to be subsumed by a benign multi-way shift/reduce conflict (item 3 above).

6.2  How are severe conflicts explained?

When the --dump switch is on, a description of the automaton is written to the .automaton file. Severe conflicts are shown as part of this description. Fortunately, there is also a way of understanding conflicts in terms of the grammar, rather than in terms of the automaton. When the --explain switch is on, a textual explanation is written to the .conflicts file.

Not all conflicts are explained in this file: instead, only one conflict per automaton state is explained. This is done partly in the interest of brevity, but also because Pager’s algorithm can create artificial conflicts in a state that already contains a true LR(1) conflict; thus, one cannot hope in general to explain all of the conflicts that appear in the automaton. As a result of this policy, once all conflicts explained in the .conflicts file have been fixed, one might need to run Menhir again to produce yet more conflict explanations.


%token IF THEN ELSE
%start < expression > expression
 
%%
 
expression:
   |  …
   |  IF b = expression THEN e = expression {}
   |  IF b = expression THEN e = expression ELSE f = expression {}
   |  …
Figure 4: Basic example of a shift/reduce conflict

How the conflict state is reached

Figure 4 shows a grammar specification with a typical shift/reduce conflict. When this specification is analyzed, the conflict is detected, and an explanation is written to the .conflicts file. The explanation first indicates in which state the conflict lies by showing how that state is reached. Here, it is reached after recognizing the following string of terminal and nonterminal symbols—the conflict string:

IF expression THEN IF expression THEN expression

Allowing the conflict string to contain both nonterminal and terminal symbols usually makes it shorter and more readable. If desired, a conflict string composed purely of terminal symbols could be obtained by replacing each occurrence of a nonterminal symbol N with an arbitrary N-sentence.

The conflict string can be thought of as a path that leads from one of the automaton’s start states to the conflict state. When multiple such paths exist, the shortest one is displayed. Nevertheless, it may sometimes be quite long. In that case, artificially (and temporarily) declaring some existing nonterminal symbols to be start symbols has the effect of adding new start states to the automaton and can help produce shorter conflict strings. Here, expression was declared to be a start symbol, which is why the conflict string is quite short.

In addition to the conflict string, the .conflicts file also states that the conflict token is ELSE. That is, when the automaton has recognized the conflict string and when the lookahead token (the next token on the input stream) is ELSE, a conflict arises. A conflict corresponds to a choice: the automaton is faced with several possible actions, and does not know which one should be taken. This indicates that the grammar is not LR(1). The grammar may or may not be inherently ambiguous.

In our example, the conflict string and the conflict token are enough to understand why there is a conflict: when two IF constructs are nested, it is ambiguous which of the two constructs the ELSE branch should be associated with. Nevertheless, the .conflicts file provides further information: it explicitly shows that there exists a conflict, by proving that two distinct actions are possible. Here, one of these actions consists in shifting, while the other consists in reducing: this is a shift/reduce conflict.

A proof takes the form of a partial derivation tree whose fringe begins with the conflict string, followed by the conflict token. A derivation tree is a tree whose nodes are labeled with symbols. The root node carries a start symbol. A node that carries a terminal symbol is considered a leaf, and has no children. A node that carries a nonterminal symbol N either is considered a leaf, and has no children; or is not considered a leaf, and has n children, where n ≥ 0, labeled x1, …, xn, where N → x1 … xn is a production. The fringe of a partial derivation tree is the string of terminal and nonterminal symbols carried by the tree’s leaves. A string of terminal and nonterminal symbols that is the fringe of some partial derivation tree is a sentential form.

Why shifting is legal


Figure 5: A partial derivation tree that justifies shifting


expression
IF expression THEN expression
IF expression THEN expression . ELSE expression
Figure 6: A textual version of the tree in Figure 5

In our example, the proof that shifting is possible is the derivation tree shown in Figures 5 and 6. At the root of the tree is the grammar’s start symbol, expression. This symbol develops into the string IF expression THEN expression, which forms the tree’s second level. The second occurrence of expression in that string develops into IF expression THEN expression ELSE expression, which forms the tree’s last level. The tree’s fringe, a sentential form, is the string IF expression THEN IF expression THEN expression ELSE expression. As announced earlier, it begins with the conflict string IF expression THEN IF expression THEN expression, followed with the conflict token ELSE.

In Figure 6, the end of the conflict string is materialized with a dot. Note that this dot does not occupy the rightmost position in the tree’s last level. In other words, the conflict token (ELSE) itself occurs on the tree’s last level. In practical terms, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to shift that token.

Why reducing is legal


Figure 7: A partial derivation tree that justifies reducing


expression
IF expression THEN expression ELSE expression       // lookahead token appears
IF expression THEN expression .
Figure 8: A textual version of the tree in Figure 7

In our example, the proof that reducing is possible is the derivation tree shown in Figures 7 and 8. Again, the sentential form found at the fringe of the tree begins with the conflict string, followed with the conflict token.

Again, in Figure 8, the end of the conflict string is materialized with a dot. Note that, this time, the dot occupies the rightmost position in the tree’s last level. In other words, the conflict token (ELSE) appeared on an earlier level (here, on the second level). This fact is emphasized by the comment // lookahead token appears found at the second level. In practical terms, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to reduce the production that corresponds to the tree’s last level—here, the production is expression → IF expression THEN expression.

An example of a more complex derivation tree

Figures 9 and 10 show a partial derivation tree that justifies reduction in a more complex situation. (This derivation tree is relative to a grammar that is not shown.) Here, the conflict string is DATA UIDENT EQUALS UIDENT; the conflict token is LIDENT. It is quite clear that the fringe of the tree begins with the conflict string. However, in this case, the fringe does not explicitly exhibit the conflict token. Let us examine the tree more closely and answer the question: following UIDENT, what’s the next terminal symbol on the fringe?


Figure 9: A partial derivation tree that justifies reducing


decls
decl opt_semi decls       // lookahead token appears because opt_semi can vanish and decls can begin with LIDENT
DATA UIDENT EQUALS tycon_expr       // lookahead token is inherited
tycon_item       // lookahead token is inherited
UIDENT opt_type_exprs       // lookahead token is inherited
.
Figure 10: A textual version of the tree in Figure 9

First, note that opt_type_exprs is not a leaf node, even though it has no children. The grammar contains the production opt_type_exprs → є: the nonterminal symbol opt_type_exprs develops to the empty string. (This is made clear in Figure 10, where a single dot appears immediately below opt_type_exprs.) Thus, opt_type_exprs is not part of the fringe.

Next, note that opt_type_exprs is the rightmost symbol within its level. Thus, in order to find the next symbol on the fringe, we have to look up one level. This is the meaning of the comment // lookahead token is inherited. Similarly, tycon_item and tycon_expr appear rightmost within their level, so we again have to look further up.

This brings us back to the tree’s second level. There, decl is not the rightmost symbol: next to it, we find opt_semi and decls. Does this mean that opt_semi is the next symbol on the fringe? Yes and no. opt_semi is a nonterminal symbol, but we are really interested in finding out what the next terminal symbol on the fringe could be. The partial derivation tree shown in Figures 9 and 10 does not explicitly answer this question. In order to answer it, we need to know more about opt_semi and decls.

Here, opt_semi stands (as one might have guessed) for an optional semicolon, so the grammar contains a production opt_semi → є. This is indicated by the comment // opt_semi can vanish. (Nonterminal symbols that generate є are also said to be nullable.) Thus, one could choose to turn this partial derivation tree into a larger one by developing opt_semi into є, making it a non-leaf node. That would yield a new partial derivation tree where the next symbol on the fringe, following UIDENT, is decls.

Now, what about decls? Again, it is a nonterminal symbol, and we are really interested in finding out what the next terminal symbol on the fringe could be. Again, we need to imagine how this partial derivation tree could be turned into a larger one by developing decls. Here, the grammar happens to contain a production of the form declsLIDENT … This is indicated by the comment // decls can begin with LIDENT. Thus, by developing decls, it is possible to construct a partial derivation tree where the next symbol on the fringe, following UIDENT, is LIDENT. This is precisely the conflict token.

To sum up, there exists a partial derivation tree whose fringe begins with the conflict string, followed with the conflict token. Furthermore, in that derivation tree, the dot occupies the rightmost position in the last level. As in our previous example, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to reduce the production that corresponds to the tree’s last level—here, the production is opt_type_exprs → є.

Greatest common factor among derivation trees

Understanding conflicts requires comparing two (or more) derivation trees. It is frequent for these trees to exhibit a common factor, that is, to exhibit identical structure near the top of the tree, and to differ only below a specific node. Manual identification of that node can be tedious, so Menhir performs this work automatically. When explaining an n-way conflict, it first displays the greatest common factor of the n derivation trees. A question mark symbol (?) is used to identify the node where the trees begin to differ. Then, Menhir displays each of the n derivation trees, without their common factor – that is, it displays n sub-trees that actually begin to differ at the root. This should make visual comparisons significantly easier.

6.3  How are severe conflicts resolved in the end?

It is unspecified how severe conflicts are resolved. Menhir attempts to mimic ocamlyacc’s specification, that is, to resolve shift/reduce conflicts in favor of shifting, and to resolve reduce/reduce conflicts in favor of the production that textually appears earliest in the grammar specification. However, this specification is inconsistent in case of three-way conflicts, that is, conflicts that simultaneously involve a shift action and several reduction actions. Furthermore, textual precedence can be undefined when the grammar specification is split over multiple modules. In short, Menhir’s philosophy is that

severe conflicts should not be tolerated,

so you should not care how they are resolved.

6.4  End-of-stream conflicts

Menhir’s treatment of the end of the token stream is (believed to be) fully compatible with ocamlyacc’s. Yet, Menhir attempts to be more user-friendly by warning about a class of so-called “end-of-stream conflicts”.

How the end of stream is handled

In many textbooks on parsing, it is assumed that the lexical analyzer, which produces the token stream, produces a special token, written #, to signal that the end of the token stream has been reached. A parser generator can take advantage of this by transforming the grammar: for each start symbol S in the original grammar, a new start symbol S′ is defined, together with the production S′ → S #. The symbol S is no longer a start symbol in the new grammar. This means that the parser will accept a sentence derived from S only if it is immediately followed by the end of the token stream.

This approach has the advantage of simplicity. However, ocamlyacc and Menhir do not follow it, for several reasons. Perhaps the most convincing one is that it is not flexible enough: sometimes, it is desirable to recognize a sentence derived from S, without requiring that it be followed by the end of the token stream: this is the case, for instance, when reading commands, one by one, on the standard input channel. In that case, there is no end of stream: the token stream is conceptually infinite. Furthermore, after a command has been recognized, we do not wish to examine the next token, because doing so might cause the program to block, waiting for more input.

In short, ocamlyacc and Menhir’s approach is to recognize a sentence derived from S and to not look, if possible, at what follows. However, this is possible only if the definition of S is such that the end of an S-sentence is identifiable without knowledge of the lookahead token. When the definition of S does not satisfy this criterion, an end-of-stream conflict arises: after a potential S-sentence has been read, there can be a tension between consulting the next token, in order to determine whether the sentence is continued, and not consulting the next token, because the sentence might be over and whatever follows should not be read. Menhir warns about end-of-stream conflicts, whereas ocamlyacc does not.

A definition of end-of-stream conflicts

Technically, Menhir proceeds as follows. A # symbol is introduced. It is, however, only a pseudo-token: it is never produced by the lexical analyzer. For each start symbol S in the original grammar, a new start symbol S′ is defined, together with the production S′ → S. The corresponding start state of the LR(1) automaton is composed of the LR(1) item S′ → . S [ # ]. That is, the pseudo-token # initially appears in the lookahead set, indicating that we expect to be done after recognizing an S-sentence. During the construction of the LR(1) automaton, this lookahead set is inherited by other items, with the effect that, in the end, the automaton has:

  • shift actions only on physical tokens; and
  • reduce actions either on physical tokens or on the pseudo-token #.

A state of the automaton has a reduce action on # if, in that state, an S-sentence has been read, so that the job is potentially finished. A state has a shift or reduce action on a physical token if, in that state, more tokens potentially need to be read before an S-sentence is recognized. If a state has a reduce action on #, then that action should be taken without requesting the next token from the lexical analyzer. On the other hand, if a state has a shift or reduce action on a physical token, then the lookahead token must be consulted in order to determine if that action should be taken.


%token < int > INT
%token PLUS TIMES
%left PLUS
%left TIMES
%start < int > expr
%%
expr:
   |  i = INT { i }
   |  e1 = expr PLUS e2 = expr { e1 + e2 }
   |  e1 = expr TIMES e2 = expr { e1 * e2 }
Figure 11: Basic example of an end-of-stream conflict


State 6:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr PLUS expr . [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On # PLUS reduce production expr -> expr PLUS expr

State 4:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
expr -> expr TIMES expr . [ # TIMES PLUS ]
-- On # TIMES PLUS reduce production expr -> expr TIMES expr

State 2:
expr' -> expr . [ # ]
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On PLUS shift to state 5
-- On # accept expr
Figure 12: Part of an LR automaton for the grammar in Figure 11


%token END
%start < int > main     // instead of expr
%%
main:
   |  e = expr END { e }
expr:
   |  …
Figure 13: Fixing the grammar specification in Figure 11

An end-of-stream conflict arises when a state has distinct actions on # and on at least one physical token. In short, this means that the end of an S-sentence cannot be unambiguously identified without examining one extra token. Menhir’s default behavior, in that case, is to suppress the action on #, so that more input is always requested.

Example

Figure 11 shows a grammar that has end-of-stream conflicts. When this grammar is processed, Menhir warns about these conflicts, and further warns that expr is never accepted. Let us explain.

Part of the corresponding automaton, as described in the .automaton file, is shown in Figure 12. Explanations at the end of the .automaton file (not shown) point out that states 6 and 2 have an end-of-stream conflict. Indeed, both states have distinct actions on # and on the physical token TIMES. It is interesting to note that, even though state 4 has actions on # and on physical tokens, it does not have an end-of-stream conflict. This is because the action taken in state 4 is always to reduce the production exprexpr TIMES expr, regardless of the lookahead token.

By default, Menhir produces a parser where end-of-stream conflicts are resolved in favor of looking ahead: that is, the problematic reduce actions on # are suppressed. This means, in particular, that the accept action in state 2, which corresponds to reducing the production expr′ → expr, is suppressed. This explains why the symbol expr is never accepted: because expressions do not have an unambiguous end marker, the parser will always request one more token and will never stop.

In order to avoid this end-of-stream conflict, the standard solution is to introduce a new token, say END, and to use it as an end marker for expressions. The END token could be generated by the lexical analyzer when it encounters the actual end of stream, or it could correspond to a piece of concrete syntax, say, a line feed character, a semicolon, or an end keyword. The solution is shown in Figure 13.

7  Positions

When an ocamllex-generated lexical analyzer produces a token, it updates two fields, named lex_start_p and lex_curr_p, in its environment record, whose type is Lexing.lexbuf. Each of these fields holds a value of type Lexing.position. Together, they represent the token’s start and end positions within the text that is being scanned. These fields are read by Menhir after calling the lexical analyzer, so it is the lexical analyzer’s responsibility to correctly set these fields.

A position consists mainly of an offset (the position’s pos_cnum field), but also holds information about the current file name, the current line number, and the current offset within the current line. (Not all ocamllex-generated analyzers keep this extra information up to date. This must be explicitly programmed by the author of the lexical analyzer.)
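For instance, in an ocamllex lexer, the line-tracking information is typically kept up to date by calling Lexing.new_line whenever a newline character is consumed. A minimal sketch follows; the tokens INT and EOF are hypothetical:

```
rule token = parse
| '\n'          { Lexing.new_line lexbuf; token lexbuf }
| [' ' '\t']+   { token lexbuf }
| ['0'-'9']+    { INT (int_of_string (Lexing.lexeme lexbuf)) }
| eof           { EOF }
```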


$startpos              start position of the first symbol in the production’s right-hand side, if there is one;
                       end position of the most recently parsed symbol, otherwise
$endpos                end position of the last symbol in the production’s right-hand side, if there is one;
                       end position of the most recently parsed symbol, otherwise
$startpos( $i | id )   start position of the symbol named $i or id
$endpos( $i | id )     end position of the symbol named $i or id
$symbolstartpos        start position of the leftmost symbol id such that $startpos(id) != $endpos(id);
                       if there is no such symbol, $endpos
$startofs
$endofs
$startofs( $i | id )   same as above, but produce an integer offset instead of a position
$endofs( $i | id )
$symbolstartofs
$loc                   stands for the pair ($startpos, $endpos)
$loc( id )             stands for the pair ($startpos( id ), $endpos( id ))
$sloc                  stands for the pair ($symbolstartpos, $endpos)
Figure 14: Position-related keywords


symbol_start_pos()    $symbolstartpos
symbol_end_pos()      $endpos
rhs_start_pos i       $startpos($i)     (1 ≤ i ≤ n)
rhs_end_pos i         $endpos($i)       (1 ≤ i ≤ n)
symbol_start()        $symbolstartofs
symbol_end()          $endofs
rhs_start i           $startofs($i)     (1 ≤ i ≤ n)
rhs_end i             $endofs($i)       (1 ≤ i ≤ n)
Figure 15: Translating position-related incantations from ocamlyacc to Menhir

This mechanism allows associating pairs of positions with terminal symbols. If desired, Menhir automatically extends it to nonterminal symbols as well. That is, it offers a mechanism for associating pairs of positions with terminal or nonterminal symbols. This is done by making a set of keywords available to semantic actions (Figure 14). These keywords are not available outside of a semantic action: in particular, they cannot be used within an OCaml header.

OCaml’s standard library module Parsing is deprecated. The functions that it offers can be called, but will return dummy positions.

We remark that, if the current production has an empty right-hand side, then $startpos and $endpos are equal, and (by convention) are the end position of the most recently parsed symbol (that is, the symbol that happens to be on top of the automaton’s stack when this production is reduced). If the current production has a nonempty right-hand side, then $startpos is the same as $startpos($1) and $endpos is the same as $endpos($n), where n is the length of the right-hand side.

More generally, if the current production has matched a sentence of length zero, then $startpos and $endpos will be equal, and conversely.

The position $startpos is sometimes “further towards the left” than one would like. For example, in the following production:

  declaration: modifier? variable { $startpos }

the keyword $startpos represents the start position of the optional modifier modifier?. If this modifier turns out to be absent, then its start position is (by definition) the end position of the most recently parsed symbol. This may not be what is desired: perhaps the user would prefer in this case to use the start position of the symbol variable. This is achieved by using $symbolstartpos instead of $startpos. By definition, $symbolstartpos is the start position of the leftmost symbol whose start and end positions differ. In this example, the computation of $symbolstartpos skips the absent modifier, whose start and end positions coincide, and returns the start position of the symbol variable (assuming this symbol has distinct start and end positions).

There is no keyword $symbolendpos. Indeed, the problem with $startpos is due to the asymmetry in the definition of $startpos and $endpos in the case of an empty right-hand side, and does not affect $endpos.

The positions computed by Menhir are exactly the same as those computed by ocamlyacc. More precisely, Figure 15 sums up how to translate a call to the Parsing module, as used in an ocamlyacc grammar, to a Menhir keyword.

We note that Menhir’s $startpos does not appear in the right-hand column in Figure 15. In other words, Menhir’s $startpos does not correspond exactly to any of the ocamlyacc function calls. An exact ocamlyacc equivalent of $startpos is rhs_start_pos 1 if the current production has a nonempty right-hand side and symbol_start_pos() if it has an empty right-hand side.

Finally, we remark that Menhir’s %inline keyword (§5.3) does not affect the computation of positions. The same positions are computed, regardless of where %inline keywords are placed.
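For instance, here is a hypothetical rule (the symbols modifier and variable, and the record fields, are made up for illustration) that records a declaration's location while skipping the start position of an absent modifier:

```
declaration:
  m = modifier? v = variable
    { (* If the modifier is absent, $symbolstartpos is the start
         position of [variable], not the end position of the most
         recently parsed symbol. *)
      { modifier = m; name = v;
        loc = ($symbolstartpos, $endpos) } }
```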

8  Using Menhir as an interpreter

When --interpret is set, Menhir no longer behaves as a compiler. Instead, it acts as an interpreter. That is, it repeatedly:

  • reads a sentence off the standard input channel;
  • parses this sentence, according to the grammar;
  • displays an outcome.

This process stops when the end of the input channel is reached.

8.1  Sentences

The syntax of sentences is as follows:

sentence ::= [ lid : ] uid … uid \n

Less formally, a sentence is a sequence of zero or more terminal symbols (uid’s), separated with whitespace, terminated with a newline character, and optionally preceded with a non-terminal start symbol (lid). This non-terminal symbol can be omitted if, and only if, the grammar only has one start symbol.

For instance, here are four valid sentences for the grammar of arithmetic expressions found in the directory demos/calc:

main: INT PLUS INT EOL
INT PLUS INT
INT PLUS PLUS INT EOL
INT PLUS PLUS

In the first sentence, the start symbol main was explicitly specified. In the other sentences, it was omitted, which is permitted, because this grammar has no start symbol other than main. The first sentence is a stream of four terminal symbols, namely INT, PLUS, INT, and EOL. These terminal symbols must be provided under their symbolic names. Writing, say, “12+32\n” instead of INT PLUS INT EOL is not permitted. Menhir would not be able to make sense of such a concrete notation, since it does not have a lexer for it.

8.2  Outcomes

As soon as Menhir is able to read a complete sentence off the standard input channel (that is, as soon as it finds the newline character that ends the sentence), it parses the sentence according to whichever grammar was specified on the command line, and displays an outcome.

An outcome is one of the following:

  • ACCEPT: a prefix of the sentence was successfully parsed; a parser generated by Menhir would successfully stop and produce a semantic value;
  • OVERSHOOT: the end of the sentence was reached before it could be accepted; a parser generated by Menhir would request a non-existent “next token” from the lexer, causing it to fail or block;
  • REJECT: the sentence was not accepted; a parser generated by Menhir would raise the exception Error.

When --interpret-show-cst is set, each ACCEPT outcome is followed with a concrete syntax tree. A concrete syntax tree is either a leaf or a node. A leaf is either a terminal symbol or error. A node is annotated with a non-terminal symbol, and carries a sequence of immediate descendants that correspond to a valid expansion of this non-terminal symbol. Menhir’s notation for concrete syntax trees is as follows:

cst ::= uid
  | error
  | [ lid : cst … cst ]

For instance, if one wished to parse the example sentences of §8.1 using the grammar of arithmetic expressions in demos/calc, one could invoke Menhir as follows:

$ menhir --interpret --interpret-show-cst demos/calc/parser.mly
main: INT PLUS INT EOL
ACCEPT
[main: [expr: [expr: INT] PLUS [expr: INT]] EOL]
INT PLUS INT
OVERSHOOT
INT PLUS PLUS INT EOL
REJECT
INT PLUS PLUS
REJECT

(Here, Menhir’s input—the sentences provided by the user on the standard input channel— is shown intermixed with Menhir’s output—the outcomes printed by Menhir on the standard output channel.) The first sentence is valid, and accepted; a concrete syntax tree is displayed. The second sentence is incomplete, because the grammar specifies that a valid expansion of main ends with the terminal symbol EOL; hence, the outcome is OVERSHOOT. The third sentence is invalid, because of the repeated occurrence of the terminal symbol PLUS; the outcome is REJECT. The fourth sentence, a prefix of the third one, is rejected for the same reason.

8.3  Remarks

Using Menhir as an interpreter offers an easy way of debugging your grammar. For instance, if one wished to check that addition is considered left-associative, as requested by the %left directive found in the file demos/calc/parser.mly, one could submit the following sentence:

$ ./menhir --interpret --interpret-show-cst ../demos/calc/parser.mly
INT PLUS INT PLUS INT EOL
ACCEPT
[main:
  [expr: [expr: [expr: INT] PLUS [expr: INT]] PLUS [expr: INT]]
  EOL
]

The concrete syntax tree displayed by Menhir is skewed towards the left, as desired.

The switches --interpret and --trace can be used in conjunction. When --trace is set, the interpreter logs its actions to the standard error channel.

9  Generated API

When Menhir processes a grammar specification, say parser.mly, it produces one OCaml module, Parser, whose code resides in the file parser.ml and whose signature resides in the file parser.mli. We now review this signature. For simplicity, we assume that the grammar specification has just one start symbol main, whose OCaml type is thing.

9.1  Monolithic API

The monolithic API defines the type token, the exception Error, and the parsing function main, named after the start symbol of the grammar.

The type token is an algebraic data type. A value of type token represents a terminal symbol and its semantic value. For instance, if the grammar contains the declarations %token A and %token<int> B, then the generated file parser.mli contains the following definition:

  type token =
  | A
  | B of int

If --only-tokens is specified on the command line, the type token is generated, and the rest is omitted. On the contrary, if --external-tokens is used, the type token is omitted, but the rest (described below) is generated.

The exception Error carries no argument. It is raised by the parsing function main (described below) when a syntax error is detected.

  exception Error

Next comes one parsing function for each start symbol of the grammar. Here, we have assumed that there is one start symbol, named main, so the generated file parser.mli contains the following declaration:

  val main: (Lexing.lexbuf -> token) -> Lexing.lexbuf -> thing

This function expects two arguments, namely: a lexer, which typically is produced by ocamllex and has type Lexing.lexbuf -> token; and a lexing buffer, which has type Lexing.lexbuf. This API is compatible with ocamlyacc. (For information on using Menhir without ocamllex, please consult §16.) This API is “monolithic” in the sense that there is just one function, which does everything: it pulls tokens from the lexer, parses, and eventually returns a semantic value (or fails by throwing the exception Error).
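As a sketch, here is how the monolithic API might be used; the module names Parser and Lexer, the entry point Lexer.token, and the result type thing are assumptions, standing for whatever your project generates:

```ocaml
(* Hypothetical driver; Parser and Lexer are assumed to be generated
   by Menhir and ocamllex, respectively. *)
let parse_channel (ic : in_channel) : thing =
  let lexbuf = Lexing.from_channel ic in
  try
    Parser.main Lexer.token lexbuf
  with Parser.Error ->
    (* Obtain the error position from the lexing buffer. *)
    let p = lexbuf.Lexing.lex_curr_p in
    Printf.eprintf "Syntax error at line %d, character %d.\n"
      p.Lexing.pos_lnum (p.Lexing.pos_cnum - p.Lexing.pos_bol);
    exit 1
```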

9.2  Incremental API

If --table is set, Menhir offers an incremental API in addition to the monolithic API. In this API, control is inverted. The parser does not have access to the lexer. Instead, when the parser needs the next token, it stops and returns its current state to the user. The user is then responsible for obtaining this token (typically by invoking the lexer) and resuming the parser from that state. The directory demos/calc-incremental contains a demo that illustrates the use of the incremental API.

This API is “incremental” in the sense that the user has access to a sequence of the intermediate states of the parser. Assuming that semantic values are immutable, a parser state is a persistent data structure: it can be stored and used multiple times, if desired. This enables applications such as “live parsing”, where a buffer is continuously parsed while it is being edited. The parser can be re-started in the middle of the buffer whenever the user edits a character. Because two successive parser states share most of their data in memory, a list of n successive parser states occupies only O(n) space in memory.

9.2.1  Starting the parser

In this API, the parser is started by invoking Incremental.main. (Recall that we assume that main is the name of the start symbol.) The generated file parser.mli contains the following declaration:

  module Incremental : sig
    val main: position -> thing MenhirInterpreter.checkpoint
  end

The argument is the initial position. If the lexer is based on an OCaml lexing buffer, this argument should be lexbuf.lex_curr_p. In §9.2 and §9.3, the type position is a synonym for Lexing.position.

We emphasize that the function Incremental.main does not parse anything. It constructs a checkpoint which serves as a starting point. The functions offer and resume, described below, are used to drive the parser.

9.2.2  Driving the parser

The sub-module MenhirInterpreter is also part of the incremental API. Its declaration, which appears in the generated file parser.mli, is as follows:

  module MenhirInterpreter : MenhirLib.IncrementalEngine.INCREMENTAL_ENGINE
    with type token = token

The signature INCREMENTAL_ENGINE, defined in the module MenhirLib.IncrementalEngine, contains many types and functions, which are described in the rest of this section (§9.2.2) and in the following sections (§9.2.3, §9.2.4).

Please keep in mind that, from the outside, these types and functions should be referred to with an appropriate prefix. For instance, the type checkpoint should be referred to as MenhirInterpreter.checkpoint, or Parser.MenhirInterpreter.checkpoint, depending on which modules the user chooses to open.

  type 'a env

The abstract type 'a env represents the current state of the parser. (That is, it contains the current state and stack of the LR automaton.) Assuming that semantic values are immutable, it is a persistent data structure: it can be stored and used multiple times, if desired. The parameter 'a is the type of the semantic value that will eventually be produced if the parser succeeds.

  type production

The abstract type production represents a production of the grammar. The “start productions” (which do not exist in an .mly file, but are constructed by Menhir internally) are not part of this type.

  type 'a checkpoint = private
    | InputNeeded of 'a env
    | Shifting of 'a env * 'a env * bool
    | AboutToReduce of 'a env * production
    | HandlingError of 'a env
    | Accepted of 'a
    | Rejected

The type 'a checkpoint represents an intermediate or final state of the parser. An intermediate checkpoint is a suspension: it records the parser’s current state, and allows parsing to be resumed. The parameter 'a is the type of the semantic value that will eventually be produced if the parser succeeds.

Accepted and Rejected are final checkpoints. Accepted carries a semantic value.

InputNeeded is an intermediate checkpoint. It means that the parser wishes to read one token before continuing.

Shifting is an intermediate checkpoint. It means that the parser is taking a shift transition. It exposes the state of the parser before and after the transition. The Boolean parameter tells whether the parser intends to request a new token after this transition. (It always does, except when it is about to accept.)

AboutToReduce is an intermediate checkpoint: it means that the parser is about to perform a reduction step. HandlingError is also an intermediate checkpoint: it means that the parser has detected an error and is about to handle it. (Error handling is typically performed in several steps, so the next checkpoint is likely to be HandlingError again.) In these two cases, the parser does not need more input. The parser suspends itself at this point only in order to give the user an opportunity to observe the parser’s transitions and possibly handle errors in a different manner, if desired.

  val offer:
    'a checkpoint ->
    token * position * position ->
    'a checkpoint

The function offer allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form InputNeeded env. This function expects the previous checkpoint checkpoint as well as a new token (together with the start and end positions of this token). It produces a new checkpoint, which again can be an intermediate checkpoint or a final checkpoint. It does not raise any exception. (The exception Error is used only in the monolithic API.)

  val resume:
    'a checkpoint ->
    'a checkpoint

The function resume allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form AboutToReduce (env, prod) or HandlingError env. This function expects just the previous checkpoint checkpoint. It produces a new checkpoint. It does not raise any exception.

The incremental API subsumes the monolithic API. Indeed, main can be (and is in fact) implemented by first using Incremental.main, then calling offer and resume in a loop, until a final checkpoint is obtained.
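As a sketch, such a loop might be written by hand as follows (the module names Parser and Lexer and the type thing are assumptions):

```ocaml
(* A minimal driver built on offer and resume. *)
let rec drive lexbuf
    (checkpoint : thing Parser.MenhirInterpreter.checkpoint) : thing =
  let open Parser.MenhirInterpreter in
  match checkpoint with
  | InputNeeded _ ->
      (* The parser needs a token: invoke the lexer and offer the
         token, together with its start and end positions. *)
      let token = Lexer.token lexbuf in
      let startp = lexbuf.Lexing.lex_start_p
      and endp = lexbuf.Lexing.lex_curr_p in
      drive lexbuf (offer checkpoint (token, startp, endp))
  | Shifting _ | AboutToReduce _ | HandlingError _ ->
      (* No input is needed: let the parser take one step. *)
      drive lexbuf (resume checkpoint)
  | Accepted v ->
      v
  | Rejected ->
      raise Parser.Error

let parse lexbuf : thing =
  drive lexbuf (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```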

  type supplier =
    unit -> token * position * position

A token supplier is a function of no arguments which delivers a new token (together with its start and end positions) every time it is called. The function loop and its variants, described below, expect a supplier as an argument.

  val lexer_lexbuf_to_supplier:
    (Lexing.lexbuf -> token) -> Lexing.lexbuf -> supplier

The function lexer_lexbuf_to_supplier, applied to a lexer and to a lexing buffer, produces a fresh supplier.

The functions offer and resume, documented above, are sufficient to write a parser loop. One can imagine many variations of such a loop, which is why we expose offer and resume in the first place. Nevertheless, some variations are so common that it is worth providing them, ready for use. The following functions are implemented on top of offer and resume.

  val loop: supplier -> 'a checkpoint -> 'a

loop supplier checkpoint begins parsing from checkpoint, reading tokens from supplier. It continues parsing until it reaches a checkpoint of the form Accepted v or Rejected. In the former case, it returns v. In the latter case, it raises the exception Error. (By the way, this is how we implement the monolithic API on top of the incremental API.)
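Combined with lexer_lexbuf_to_supplier, this yields a compact replacement for the monolithic API; as before, the module names Parser and Lexer and the type thing are assumptions:

```ocaml
(* A sketch: parsing with loop and a supplier built out of an
   ocamllex-generated lexer. *)
let parse lexbuf : thing =
  let supplier =
    Parser.MenhirInterpreter.lexer_lexbuf_to_supplier Lexer.token lexbuf in
  Parser.MenhirInterpreter.loop supplier
    (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```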

  val loop_handle:
    ('a -> 'answer) ->
    ('a checkpoint -> 'answer) ->
    supplier -> 'a checkpoint -> 'answer

loop_handle succeed fail supplier checkpoint begins parsing from checkpoint, reading tokens from supplier. It continues until it reaches a checkpoint of the form Accepted v or HandlingError _ (or Rejected, but that should not happen, as HandlingError _ will be observed first). In the former case, it calls succeed v. In the latter case, it calls fail with this checkpoint. It cannot raise Error.

This means that Menhir’s traditional error-handling procedure (which pops the stack until a state that can act on the error token is found) does not get a chance to run. Instead, the user can implement her own error handling code, in the fail continuation.
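For instance, a custom error report might be produced in the fail continuation as follows; this is only a sketch, and the module names Parser and Lexer and the type thing are assumptions:

```ocaml
(* Hypothetical custom error reporting, built on loop_handle. *)
let fail (checkpoint : thing Parser.MenhirInterpreter.checkpoint) : thing =
  match checkpoint with
  | Parser.MenhirInterpreter.HandlingError env ->
      (* Report the automaton's current state; a real application
         might map this number to a prepared message (see §11). *)
      let state = Parser.MenhirInterpreter.current_state_number env in
      Printf.eprintf "Syntax error in state %d.\n" state;
      exit 1
  | _ ->
      assert false (* loop_handle calls fail only on HandlingError _ *)

let parse lexbuf : thing =
  let supplier =
    Parser.MenhirInterpreter.lexer_lexbuf_to_supplier Lexer.token lexbuf in
  Parser.MenhirInterpreter.loop_handle
    (fun v -> v) fail supplier
    (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```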

  val loop_handle_undo:
    ('a -> 'answer) ->
    ('a checkpoint -> 'a checkpoint -> 'answer) ->
    supplier -> 'a checkpoint -> 'answer

loop_handle_undo is analogous to loop_handle, but passes a pair of checkpoints (instead of a single checkpoint) to the failure continuation. The first (and oldest) checkpoint that is passed to the failure continuation is the last InputNeeded checkpoint that was encountered before the error was detected. The second (and newest) checkpoint is where the error was detected. (This is the same checkpoint that loop_handle would pass to its failure continuation.) Going back to the first checkpoint can be thought of as undoing any reductions that were performed after seeing the problematic token. (These reductions must be default reductions or spurious reductions.) This can be useful to someone who wishes to implement an error explanation or error recovery mechanism.

loop_handle_undo must be applied to an InputNeeded checkpoint. The initial checkpoint produced by Incremental.main is of this form.

  val shifts: 'a checkpoint -> 'a env option

shifts checkpoint assumes that checkpoint has been obtained by submitting a token to the parser. It runs the parser from checkpoint, through an arbitrary number of reductions, until the parser either accepts this token (i.e., shifts) or rejects it (i.e., signals an error). If the parser decides to shift, then Some env is returned, where env is the parser’s state just before shifting. Otherwise, None is returned. This can be used to test whether the parser is willing to accept a certain token. This function should be used with caution, though, as it causes semantic actions to be executed. It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless.

  val acceptable: 'a checkpoint -> token -> position -> bool

acceptable checkpoint token pos requires checkpoint to be an InputNeeded checkpoint. It returns true iff the parser is willing to shift this token. This can be used to test, after an error has been detected, which tokens would have been accepted at this point. To do this, one would typically use loop_handle_undo to get access to the last InputNeeded checkpoint that was encountered before the error was detected, and apply acceptable to that checkpoint.

acceptable is implemented using shifts, so, like shifts, it causes certain semantic actions to be executed. It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless.

9.2.3  Inspecting the parser’s state

Although the type env is opaque, a parser state can be inspected via a few accessor functions, which are described in this section. The following types and functions are contained in the MenhirInterpreter sub-module.

  type 'a lr1state

The abstract type 'a lr1state describes a (non-initial) state of the LR(1) automaton. If s is such a state, then s should have at least one incoming transition, and all of its incoming transitions carry the same (terminal or non-terminal) symbol, say A. We say that A is the incoming symbol of the state s. The index 'a is the type of the semantic values associated with A. The role played by 'a is clarified in the definition of the type element, which appears further on.

  val number: _ lr1state -> int

The states of the LR(1) automaton are numbered (from 0 and up). The function number maps a state to its number.

  val production_index: production -> int
  val find_production: int -> production

Productions are numbered. (The set of indices of all productions forms an interval, which does not necessarily begin at 0.) The function production_index converts a production to an integer number, whereas the function find_production carries out the reverse conversion. It is an error to apply find_production to an invalid index.

  type element =
    | Element: 'a lr1state * 'a * position * position -> element

The type element describes one entry in the stack of the LR(1) automaton. In a stack element of the form Element (s, v, startp, endp), s is a (non-initial) state and v is a semantic value. The value v is associated with the incoming symbol A of the state s. In other words, the value v was pushed onto the stack just before the state s was entered. Thus, for some type 'a, the state s has type 'a lr1state and the value v has type 'a. The positions startp and endp delimit the fragment of the input text that was reduced to the symbol A.

In order to do anything useful with the value v, one must gain information about the type 'a, by inspection of the state s. So far, the type 'a lr1state is abstract, so there is no way of inspecting s. The inspection API (§9.3) offers further tools for this purpose.

  val top: 'a env -> element option

top env returns the parser’s top stack element. The state contained in this stack element is the current state of the automaton. If the stack is empty, None is returned. In that case, the current state of the automaton must be an initial state.

  val pop_many: int -> 'a env -> 'a env option

pop_many i env pops i elements off the automaton’s stack. This is done via i successive invocations of pop. Thus, pop_many 1 is pop. The index i must be nonnegative. The time complexity is O(i).

  val get: int -> 'a env -> element option

get i env returns the parser’s i-th stack element. The index i is 0-based: thus, get 0 is top. If i is greater than or equal to the number of elements in the stack, None is returned. get is implemented using pop_many and top: its time complexity is O(i).
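As a small illustration (assuming a generated module Parser), the height of the stack can be measured with get, although, since get i has cost O(i), this scan costs O(n²) overall:

```ocaml
(* A sketch: count the elements on the automaton's stack. *)
let stack_height (env : _ Parser.MenhirInterpreter.env) : int =
  let rec count i =
    match Parser.MenhirInterpreter.get i env with
    | Some _ -> count (i + 1)
    | None -> i
  in
  count 0
```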

  val current_state_number: 'a env -> int

current_state_number env is the integer number of the automaton’s current state. Although this number might conceivably be obtained via the functions top and number, using current_state_number is preferable, because this method works even when the automaton’s stack is empty (in which case the current state is an initial state, and top returns None). This number can be passed as an argument to a message function generated by menhir --compile-errors.

  val equal: 'a env -> 'a env -> bool

equal env1 env2 tells whether the parser configurations env1 and env2 are equal in the sense that the automaton’s current state is the same in env1 and env2 and the stack is physically the same in env1 and env2. If equal env1 env2 is true, then the sequence of the stack elements, as observed via pop and top, must be the same in env1 and env2. Also, if equal env1 env2 holds, then the checkpoints input_needed env1 and input_needed env2 must be equivalent. (The function input_needed is documented in §9.2.4.) The function equal has time complexity O(1).

  val positions: 'a env -> position * position

The function positions returns the start and end positions of the current lookahead token. If invoked in an initial state, this function returns a pair whose components are both the initial position that was passed as an argument to Incremental.main.

  val env_has_default_reduction: 'a env -> bool
  val state_has_default_reduction: _ lr1state -> bool

When applied to an environment env taken from a checkpoint of the form AboutToReduce (env, prod), the function env_has_default_reduction tells whether the reduction that is about to take place is a default reduction.

state_has_default_reduction s tells whether the state s has a default reduction. This includes the case where s is an accepting state.

9.2.4  Updating the parser’s state

The functions presented in the previous section (§9.2.3) allow inspecting parser states of type 'a checkpoint and 'a env. However, so far, there are no functions for manufacturing new parser states, except offer and resume, which create new checkpoints by feeding tokens, one by one, to the parser.

In this section, a small number of functions are provided for manufacturing new parser states of type 'a env and 'a checkpoint. These functions allow going far back into the past and jumping ahead into the future, so to speak. In other words, they allow driving the parser in other ways than by feeding tokens into it. The functions pop, force_reduction and feed (the last of which is part of the inspection API; see §9.3) construct values of type 'a env. The function input_needed constructs values of type 'a checkpoint and thereby allows resuming parsing in normal mode (via offer). Together, these functions can be used to implement error handling and error recovery strategies.

  val pop: 'a env -> 'a env option

pop env returns a new environment, where the parser’s top stack cell has been popped off. (If the stack is empty, None is returned.) This amounts to pretending that the (terminal or nonterminal) symbol that corresponds to this stack cell has not been read.

  val force_reduction: production -> 'a env -> 'a env

force_reduction prod env can be called only if in the state env the parser is capable of reducing the production prod. If this condition is satisfied, then this production is reduced, which means that its semantic action is executed (this can have side effects!) and the automaton makes a goto (nonterminal) transition. If this condition is not satisfied, an Invalid_argument exception is raised.

  val input_needed: 'a env -> 'a checkpoint

input_needed env returns InputNeeded env. Thus, out of a parser state that might have been obtained via a series of calls to the functions pop, force_reduction, feed, and so on, it produces a checkpoint, which can be used to resume normal parsing, by supplying this checkpoint as an argument to offer.

This function should be used with some care. It could “mess up the lookahead” in the sense that it allows parsing to resume in an arbitrary state s with an arbitrary lookahead symbol t, even though Menhir’s reachability analysis (which is carried out via the --list-errors switch) might well think that it is impossible to reach this particular configuration. If one is using Menhir’s new error reporting facility (§11), this could cause the parser to reach an error state for which no error message has been prepared.

9.3  Inspection API

If --inspection is set, Menhir offers an inspection API in addition to the monolithic and incremental APIs. (The reason why this is not done by default is that this requires more tables to be generated, thus making the generated parser larger.) Like the incremental API, the inspection API is found in the sub-module MenhirInterpreter. It offers the following types and functions.

The type 'a terminal is a generalized algebraic data type (GADT). A value of type 'a terminal represents a terminal symbol (without a semantic value). The index 'a is the type of the semantic values associated with this symbol. For instance, if the grammar contains the declarations %token A and %token<int> B, then the generated module MenhirInterpreter contains the following definition:

  type _ terminal =
  | T_A : unit terminal
  | T_B : int terminal

The data constructors are named after the terminal symbols, prefixed with “T_”.

The type 'a nonterminal is also a GADT. A value of type 'a nonterminal represents a nonterminal symbol (without a semantic value). The index 'a is the type of the semantic values associated with this symbol. For instance, if main is the only nonterminal symbol, then the generated module MenhirInterpreter contains the following definition:

  type _ nonterminal =
  | N_main : thing nonterminal

The data constructors are named after the nonterminal symbols, prefixed with “N_”.

The type 'a symbol is the disjoint union of the types 'a terminal and 'a nonterminal. In other words, a value of type 'a symbol represents a terminal or nonterminal symbol (without a semantic value). This type is (always) defined as follows:

  type 'a symbol =
    | T : 'a terminal -> 'a symbol
    | N : 'a nonterminal -> 'a symbol

The type xsymbol is an existentially quantified version of the type 'a symbol. It is useful in situations where the index 'a is not statically known. It is (always) defined as follows:

  type xsymbol =
    | X : 'a symbol -> xsymbol

The type item describes an LR(0) item, that is, a pair of a production prod and an index i into the right-hand side of this production. If the length of the right-hand side is n, then i is comprised between 0 and n, inclusive.

  type item =
      production * int

The following functions implement total orderings on the types _ terminal, _ nonterminal, xsymbol, production, and item.

  val compare_terminals: _ terminal -> _ terminal -> int
  val compare_nonterminals: _ nonterminal -> _ nonterminal -> int
  val compare_symbols: xsymbol -> xsymbol -> int
  val compare_productions: production -> production -> int
  val compare_items: item -> item -> int

The function incoming_symbol maps a (non-initial) LR(1) state s to its incoming symbol, that is, the symbol that the parser must recognize before it enters the state s.

  val incoming_symbol: 'a lr1state -> 'a symbol

This function can be used to gain access to the semantic value v in a stack element Element (s, v, _, _). Indeed, by case analysis on the symbol incoming_symbol s, one gains information about the type 'a, hence one obtains the ability to do something useful with the value v.
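As a sketch, here is one way to perform this case analysis, assuming the token declarations %token A and %token<int> B shown earlier, a single start symbol main of type thing, and a generated module Parser:

```ocaml
open Parser.MenhirInterpreter

(* Hypothetical: turn a stack element into a printable string. *)
let describe (e : element) : string =
  match e with
  | Element (s, v, _, _) ->
      (match incoming_symbol s with
       | T T_B -> "B " ^ string_of_int v  (* here, v : int *)
       | T T_A -> "A"                     (* here, v : unit *)
       | N N_main -> "main"               (* here, v : thing *)
       | _ -> "<some other symbol>")
```

Matching on incoming_symbol s refines the type of v in each branch, which is what makes it possible to do something useful with it.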

The function items maps a (non-initial) LR(1) state s to its LR(0) core, that is, to the underlying set of LR(0) items. This set is represented as a list, whose elements appear in an arbitrary order. This set is not closed under ε-transitions.

  val items: _ lr1state -> item list

The functions lhs and rhs map a production prod to its left-hand side and right-hand side, respectively. The left-hand side is always a nonterminal symbol, hence always of the form N _. The right-hand side is a (possibly empty) sequence of (terminal or nonterminal) symbols.

  val lhs: production -> xsymbol
  val rhs: production -> xsymbol list

The function nullable, applied to a non-terminal symbol, tells whether this symbol is nullable. A nonterminal symbol is nullable if and only if it produces the empty word ε.

  val nullable: _ nonterminal -> bool

A call first nt t tells whether the FIRST set of the nonterminal symbol nt contains the terminal symbol t. That is, it returns true if and only if nt produces a word that begins with t. The function xfirst is identical to first, except it expects a first argument of type xsymbol instead of _ nonterminal.

  val first: _ nonterminal -> _ terminal -> bool
  val xfirst: xsymbol -> _ terminal -> bool

The function foreach_terminal enumerates the terminal symbols, including the special symbol error. The function foreach_terminal_but_error enumerates the terminal symbols, excluding error.

  val foreach_terminal:           (xsymbol -> 'a -> 'a) -> 'a -> 'a
  val foreach_terminal_but_error: (xsymbol -> 'a -> 'a) -> 'a -> 'a

feed symbol startp semv endp env causes the parser to consume the (terminal or nonterminal) symbol symbol, accompanied with the semantic value semv and with the start and end positions startp and endp. Thus, the automaton makes a transition, and reaches a new state. The stack grows by one cell. This operation is permitted only if the current state (as determined by env) has an outgoing transition labeled with symbol. Otherwise, an Invalid_argument exception is raised.

  val feed: 'a symbol -> position -> 'a -> position -> 'b env -> 'b env

10  Error handling: the traditional way

Menhir’s traditional error handling mechanism is considered deprecated: although it is still supported for the time being, it might be removed in the future. We recommend setting up an error handling mechanism using the new tools offered by Menhir (§11).

Error handling

Menhir’s traditional error handling mechanism is inspired by that of yacc and ocamlyacc, but is not identical. A special error token is made available for use within productions. The LR automaton is constructed exactly as if error was a regular terminal symbol. However, error is never produced by the lexical analyzer. Instead, when an error is detected, the current lookahead token is discarded and replaced with the error token, which becomes the current lookahead token. At this point, the parser enters error handling mode.

In error handling mode, automaton states are popped off the automaton’s stack until a state that can act on error is found. This includes both shift and reduce actions. (yacc and ocamlyacc do not trigger reduce actions on error. It is somewhat unclear why this is so.)

When a state that can reduce on error is found, reduction is performed. Since the lookahead token is still error, the automaton remains in error handling mode.

When a state that can shift on error is found, the error token is shifted. At this point, the parser returns to normal mode.

When no state that can act on error is found on the automaton’s stack, the parser stops and raises the exception Error. This exception carries no information. The position of the error can be obtained by reading the lexical analyzer’s environment record.
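As a minimal sketch of this pattern, the fragment below catches the exception and reads the position out of the lexer's environment record. The Error exception is declared locally here, standing in for the one raised by a generated parser, and the parser call is simulated by a stub so that the fragment is self-contained:

```ocaml
(* Sketch: catching a syntax error and reporting its position.
   [Error] stands in for the exception raised by a generated parser. *)
exception Error

let report (lexbuf : Lexing.lexbuf) : string =
  let pos = lexbuf.Lexing.lex_curr_p in
  Printf.sprintf "Syntax error at line %d, column %d"
    pos.Lexing.pos_lnum
    (pos.Lexing.pos_cnum - pos.Lexing.pos_bol)

let () =
  let lexbuf = Lexing.from_string "let x =" in
  try raise Error (* in real code: ignore (Parser.main Lexer.token lexbuf) *)
  with Error -> print_endline (report lexbuf)
```

In real code, the call to the generated entry point (here imagined as Parser.main) replaces the stub raise.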

Error recovery

ocamlyacc offers an error recovery mode, which is entered immediately after an error token was successfully shifted. In this mode, tokens are repeatedly taken off the input stream and discarded until an acceptable token is found. This feature is no longer offered by Menhir.

Error-related keywords

The following keyword is made available to semantic actions.

When the $syntaxerror keyword is evaluated, evaluation of the semantic action is aborted, so that the current reduction is abandoned; the current lookahead token is discarded and replaced with the error token; and error handling mode is entered. Note that there is no mechanism for inserting an error token in front of the current lookahead token, even though this might also be desirable. It is unclear whether this keyword is useful; it might be suppressed in the future.

11  Error handling: the new way

Menhir’s incremental API (§9.2) allows taking control when an error is detected. Indeed, as soon as an invalid token is detected, the parser produces a checkpoint of the form HandlingError _. At this point, if one decides to let the parser proceed, by just calling resume, then Menhir enters its traditional error handling mode (§10). Instead, however, one can decide to take control and perform error handling or error recovery in any way one pleases. One can, for instance, build and display a diagnostic message, based on the automaton’s current stack and/or state. Or, one could modify the input stream, by inserting or deleting tokens, so as to suppress the error, and resume normal parsing. In principle, the possibilities are endless.

An apparently simple-minded approach to error reporting, proposed by Jeffery [10] and further explored by Pottier [20], consists in selecting a diagnostic message (or a template for a diagnostic message) based purely on the current state of the automaton.

In this approach, one determines, ahead of time, which are the “error states” (that is, the states in which an error can be detected), and one prepares, for each error state, a diagnostic message. Because state numbers are fragile (they change when the grammar evolves), an error state is identified not by its number, but by an input sentence that leads to it: more precisely, by an input sentence which causes an error to be detected in this state. Thus, one maintains a set of pairs of an erroneous input sentence and a diagnostic message.

Menhir defines a file format, the .messages file format, for representing this information (§11.1), and offers a set of tools for creating, maintaining, and exploiting .messages files (§11.2). Once one understands these tools, there remains to write a collection of diagnostic messages, a more subtle task than one might think (§11.3), and to glue everything together (§11.4).

In this approach to error handling, as in any other approach, one must understand exactly when (that is, in which states) errors are detected. This in turn requires understanding how the automaton is constructed. Menhir’s construction technique is not Knuth’s canonical LR(1) technique [15], which is usually too expensive to be practical. Instead, Menhir merges states [19] and introduces so-called default reductions. These techniques defer error detection by allowing extra reductions to take place before an error is detected. The impact of these alterations must be taken into account when writing diagnostic messages (§11.3).

In this approach to error handling, the special error token is not used. It should not appear in the grammar. Similarly, the $syntaxerror keyword should not be used.

11.1  The .messages file format

A .messages file is a text file. Comment lines, which begin with a # character, are ignored everywhere. As is evident in the following description, blank lines are significant: they are used as separators between entries and within an entry.

A .messages file is composed of a list of entries. Two entries are separated by one or more blank lines. Each entry consists of one or more input sentences, followed with one or more blank lines, followed with a message. The syntax of an input sentence is described in §8.1. A message is arbitrary text, but cannot contain a blank line. We stress that there cannot be a blank line between two sentences (if there is one, Menhir becomes confused and may complain about some word not being “a known non-terminal symbol”).


grammar: TYPE UID
grammar: TYPE OCAMLTYPE UID PREC

# A (handwritten) comment.

Ill-formed declaration.
Examples of well-formed declarations:
  %type <Syntax.expression> expression
  %type <int> date time
Figure 16: An entry in a .messages file


grammar: TYPE UID
##
## Ends in an error in state: 1.
##
## declaration -> TYPE . OCAMLTYPE separated_nonempty_list(option(COMMA),
##   strict_actual) [ TYPE TOKEN START RIGHT PUBLIC PERCENTPERCENT PARAMETER
##   ON_ERROR_REDUCE NONASSOC LEFT INLINE HEADER EOF COLON ]
##
## The known suffix of the stack is as follows:
## TYPE
##
grammar: TYPE OCAMLTYPE UID PREC
##
## Ends in an error in state: 5.
##
## strict_actual -> symbol . loption(delimited(LPAREN,separated_nonempty_list
##   (COMMA,strict_actual),RPAREN)) [ UID TYPE TOKEN START STAR RIGHT QUESTION
##   PUBLIC PLUS PERCENTPERCENT PARAMETER ON_ERROR_REDUCE NONASSOC LID LEFT
##   INLINE HEADER EOF COMMA COLON ]
##
## The known suffix of the stack is as follows:
## symbol
##

# A (handwritten) comment.

Ill-formed declaration.
Examples of well-formed declarations:
  %type <Syntax.expression> expression
  %type <int> date time
Figure 17: An entry in a .messages file, decorated with auto-generated comments

As an example, Figure 16 shows a valid entry, taken from Menhir’s own .messages file. This entry contains two input sentences, which lead to errors in two distinct states. A single message is associated with these two error states.

Several commands, described next (§11.2), produce .messages files where each input sentence is followed with an auto-generated comment, marked with ##. This special comment indicates in which state the error is detected, and is supposed to help the reader understand what it means to be in this state: What has been read so far? What is expected next?

As an example, the previous entry, decorated with auto-generated comments, is shown in Figure 17. (We have manually wrapped the lines that did not fit in this document.)

An auto-generated comment begins with the number of the error state that is reached via this input sentence.

Then, the auto-generated comment shows the LR(1) items that compose this state, in the same format as in an .automaton file. These items offer a description of the past (that is, what has been read so far) and the future (that is, which terminal symbols are allowed next).

Finally, the auto-generated comment shows what is known about the stack when the automaton is in this state. (This can be deduced from the LR(1) items, but is more readable if shown separately.)

In a canonical LR(1) automaton, the LR(1) items offer an exact description of the past and future. However, in a noncanonical automaton, which is by default what Menhir produces, the situation is more subtle. The lookahead sets can be over-approximated, so the automaton can perform one or more “spurious reductions” before an error is detected. As a result, the LR(1) items in the error state offer a description of the future that may be both incorrect (that is, a terminal symbol that appears in a lookahead set is not necessarily a valid continuation) and incomplete (that is, a terminal symbol that does not appear in any lookahead set may nevertheless be a valid continuation). More details appear further on (§11.3).

In order to attract the user’s attention to this issue, if an input sentence causes one or more spurious reductions, then the auto-generated comment contains a warning about this fact. This mechanism is not completely foolproof, though, as it may be the case that one particular sentence does not cause any spurious reductions (hence, no warning appears), yet leads to an error state that can be reached via other sentences that do involve spurious reductions.

11.2  Maintaining .messages files

Ideally, the set of input sentences in a .messages file should be correct (that is, every sentence causes an error on its last token), irredundant (that is, no two sentences lead to the same error state), and complete (that is, every error state is reached by some sentence).

Correctness and irredundancy are checked by the command --compile-errors filename, where filename is the name of a .messages file. This command fails if a sentence does not cause an error at all, or causes an error too early. It also fails if two sentences lead to the same error state. If the file is correct and irredundant, then (as its name suggests) this command compiles the .messages file down to an OCaml function, whose code is printed on the standard output channel. This function, named message, has type int -> string, and maps a state number to a message. It raises the exception Not_found if its argument is not the number of a state for which a message has been defined.

Completeness is checked via the commands --list-errors and --compare-errors. The former produces, from scratch, a complete set of input sentences, that is, a set of input sentences that reaches all error states. The latter compares two sets of sentences (more precisely, the two underlying sets of error states) for inclusion.

The command --list-errors first computes all possible ways of causing an error. From this information, it deduces a list of all error states, that is, all states where an error can be detected. For each of these states, it computes a (minimal) input sentence that causes an error in this state. Finally, it prints these sentences, in the .messages file format, on the standard output channel. Each sentence is followed with an auto-generated comment and with a dummy diagnostic message. The user should be warned that this algorithm may require large amounts of time (typically in the tens of seconds, possibly more) and memory (typically in the gigabytes, possibly more). It requires a 64-bit machine. (On a 32-bit machine, it works, but quickly hits a built-in size limit.) At the verbosity level --log-automaton 2, it displays some progress information and internal statistics on the standard error channel.

The command --compare-errors filename1 --compare-errors filename2 compares the .messages files filename1 and filename2. Each file is read and internally translated to a mapping of states to messages. Menhir then checks that the left-hand mapping is a subset of the right-hand mapping. That is, if a state s is reached by some sentence in filename1, then it should also be reached by some sentence in filename2. Furthermore, if the message associated with s in filename1 is not a dummy message, then the same message should be associated with s in filename2.

To check that the sentences in filename2 cover all error states, it suffices to (1) use --list-errors to produce a complete set of sentences, which one stores in filename1, then (2) use --compare-errors to compare filename1 and filename2.

In the case of a grammar that evolves fairly often, it can take significant human time and effort to update the .messages file and ensure correctness, irredundancy, and completeness. A way of reducing this effort is to abandon completeness. This implies that the auto-generated message function can raise Not_found and that a generic “syntax error” message must be produced in that case. We prefer to discourage this approach, as it implies that the end user is exposed to a mixture of specific and generic syntax error messages, and there is no guarantee that the specific (hand-written) messages will appear in all situations where they are expected to appear. Instead, we recommend waiting for the grammar to become stable and enforcing completeness.

The command --update-errors filename is used to update the auto-generated comments in the .messages file filename. It is typically used after a change in the grammar (or in the command line options that affect the construction of the automaton). A new .messages file is produced on the standard output channel. It is identical to filename, except the auto-generated comments, identified by ##, have been removed and re-generated.

The command --echo-errors filename is used to filter out all comments, blank lines, and messages from the .messages file filename. The input sentences, and nothing else, are echoed on the standard output channel. As an example application, one could then translate the sentences to concrete syntax and create a collection of source files that trigger every possible syntax error.

The command --interpret-error is analogous to --interpret. It causes Menhir to act as an interpreter. Menhir reads sentences off the standard input channel, parses them, and displays the outcome. This switch can be usefully combined with --trace. The main difference between --interpret and --interpret-error is that, when the latter command is used, Menhir expects the input sentence to cause an error on its last token, and displays information about the state in which the error is detected, in the form of a .messages file entry. This can be used to quickly find out exactly what error is caused by one particular input sentence.

11.3  Writing accurate diagnostic messages

One might think that writing a diagnostic message for each error state is a straightforward (if lengthy) task. In reality, it is not so simple.

A state, not a sentence

The first thing to keep in mind is that a diagnostic message is associated with a state s, as opposed to a sentence. An entry in a .messages file contains a sentence w that leads to an error in state s. This sentence is just one way of causing an error in state s; there may exist many other sentences that also cause an error in this state. The diagnostic message should not be specific to the sentence w: it should make sense regardless of how the state s is reached.

As a rule of thumb, when writing a diagnostic message, one should (as much as possible) ignore the example sentence w altogether, and concentrate on the description of the state s, which appears as part of the auto-generated comment.

The LR(1) items that compose the state s offer a description of the past (that is, what has been read so far) and the future (that is, which terminal symbols are allowed next). A diagnostic message should be designed based on this description.


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%%
typ0: ID | LPAREN typ1 RPAREN {}
typ1: typ0 | typ0 ARROW typ1  {}
declaration: ID COLON typ1    {}
program:
| LPAREN declaration RPAREN
| declaration SEMICOLON       {}
Figure 18: A grammar where one error state is difficult to explain


program: ID COLON ID LPAREN
##
## Ends in an error in state: 8.
##
## typ1 -> typ0 . [ SEMICOLON RPAREN ]
## typ1 -> typ0 . ARROW typ1 [ SEMICOLON RPAREN ]
##
## The known suffix of the stack is as follows:
## typ0
##
Figure 19: A problematic error state in the grammar of Figure 18, due to over-approximation

The problem of over-approximated lookahead sets

As pointed out earlier (§11.1), in a noncanonical automaton, the lookahead sets in the LR(1) items can be both over- and under-approximated. One must be aware of this phenomenon, otherwise one runs the risk of writing a diagnostic message that proposes too many or too few continuations.

As an example, let us consider the grammar in Figure 18. According to this grammar, a “program” is either a declaration between parentheses or a declaration followed with a semicolon. A “declaration” is an identifier, followed with a colon, followed with a type. A “type” is an identifier, a type between parentheses, or a function type in the style of OCaml.

The (noncanonical) automaton produced by Menhir for this grammar has 17 states. Using --list-errors, we find that an error can be detected in 10 of these 17 states. By manual inspection of the auto-generated comments, we find that for 9 out of these 10 states, writing an accurate diagnostic message is easy. However, one problematic state remains, namely state 8, shown in Figure 19.

In this state, a (level-0) type has just been read. One valid continuation, which corresponds to the second LR(1) item in Figure 19, is to continue this type: the terminal symbol ARROW, followed with a (level-1) type, is a valid continuation. Now, the question is, what other valid continuations are there? By examining the first LR(1) item in Figure 19, it may look as if both SEMICOLON and RPAREN are valid continuations. However, this cannot be the case. A moment’s thought reveals that either we have seen an opening parenthesis LPAREN at the very beginning of the program, in which case we definitely expect a closing parenthesis RPAREN; or we have not seen one, in which case we definitely expect a semicolon SEMICOLON. It is never the case that both SEMICOLON and RPAREN are valid continuations!

In fact, the lookahead set in the first LR(1) item in Figure 19 is over-approximated. State 8 in the noncanonical automaton results from merging two states in the canonical automaton.

In such a situation, one cannot write an accurate diagnostic message. Knowing that the automaton is in state 8 does not give us a precise view of the valid continuations. Some valuable information (that is, whether we have seen an opening parenthesis LPAREN at the very beginning of the program) is buried in the automaton’s stack.


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%%
typ0: ID | LPAREN typ1(RPAREN) RPAREN          {}
typ1(phantom): typ0 | typ0 ARROW typ1(phantom) {}
declaration(phantom): ID COLON typ1(phantom)   {}
program:
| LPAREN declaration(RPAREN) RPAREN
| declaration(SEMICOLON)  SEMICOLON            {}
Figure 20: Splitting the problematic state of Figure 19 via selective duplication


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%on_error_reduce typ1
%%
typ0: ID | LPAREN typ1 RPAREN {}
typ1: typ0 | typ0 ARROW typ1  {}
declaration: ID COLON typ1    {}
program:
| LPAREN declaration RPAREN
| declaration SEMICOLON       {}
Figure 21: Avoiding the problematic state of Figure 19 via reductions on error


program: ID COLON ID LPAREN
##
## Ends in an error in state: 15.
##
## program -> declaration . SEMICOLON [ # ]
##
## The known suffix of the stack is as follows:
## declaration
##
## WARNING: This example involves spurious reductions.
## This implies that, although the LR(1) items shown above provide an
## accurate view of the past (what has been recognized so far), they
## may provide an INCOMPLETE view of the future (what was expected next).
## In state 8, spurious reduction of production typ1 -> typ0
## In state 11, spurious reduction of production declaration -> ID COLON typ1
##
Figure 22: A problematic error state in the grammar of Figure 21, due to under-approximation

How can one work around this problem? Let us suggest three options.

Blind duplication of states

One option would be to build a canonical automaton by using the --canonical switch. In this example, one would obtain a 27-state automaton, where the problem has disappeared. However, this option is rarely viable, as it duplicates many states without good reason.

Selective duplication of states

A second option is to manually cause just enough duplication to remove the problematic over-approximation. In our example, we wish to distinguish two kinds of types and declarations, namely those that must be followed with a closing parenthesis, and those that must be followed with a semicolon. We create such a distinction by parameterizing typ1 and declaration with a phantom parameter. The modified grammar is shown in Figure 20. The phantom parameter does not affect the language that is accepted: for instance, the nonterminal symbols declaration(SEMICOLON) and declaration(RPAREN) generate the same language as declaration in the grammar of Figure 18. Yet, by giving distinct names to these two symbols, we force the construction of an automaton where more states are distinguished. In this example, Menhir produces a 23-state automaton. Using --list-errors, we find that an error can be detected in 11 of these 23 states, and by manual inspection of the auto-generated comments, we find that for each of these 11 states, writing an accurate diagnostic message is easy. In summary, we have selectively duplicated just enough states so as to split the problematic error state into two non-problematic error states.

Reductions on error

A third and last option is to introduce an %on_error_reduce declaration (§4.1.8) so as to prevent the detection of an error in the problematic state 8. We see in Figure 19 that, in state 8, the production typ1 -> typ0 is ready to be reduced. If we could force this reduction to take place, then the automaton would move to some other state where it would be clear which of SEMICOLON and RPAREN is expected. We achieve this by marking typ1 as “reducible on error”. The modified grammar is shown in Figure 21. For this grammar, Menhir produces a 17-state automaton. (This is the exact same automaton as for the grammar of Figure 18, except 2 of the 17 states have received extra reduction actions.) Using --list-errors, we find that an error can be detected in 9 of these 17 states. The problematic state, namely state 8, is no longer an error state! The problem has vanished.

The problem of under-approximated lookahead sets

The third option seems by far the simplest of all, and is recommended in many situations. However, it comes with a caveat. There may now exist states whose lookahead sets are under-approximated, in a certain sense. Because of this, there is a danger of writing an incomplete diagnostic message, one that does not list all valid continuations.

To see this, let us look again at the sentence ID COLON ID LPAREN. In the grammar and automaton of Figure 18, this sentence takes us to the problematic state 8, shown in Figure 19. In the grammar and automaton of Figure 21, because more reduction actions are carried out before the error is detected, this sentence takes us to state 15, shown in Figure 22.

When writing a diagnostic message for state 15, one might be tempted to write: “Up to this point, a declaration has been recognized. At this point, a semicolon is expected”. Indeed, by examining the sole LR(1) item in state 15, it looks as if SEMICOLON is the only permitted continuation. However, this is not the case. Another valid continuation is ARROW: indeed, the sentence ID COLON ID ARROW ID SEMICOLON forms a valid program. In fact, if the first token following ID COLON ID is ARROW, then in state 8 this token is shifted, so the two reductions that take us from state 8 through state 11 to state 15 never take place. This is why, even though ARROW does not appear in state 15 as a valid continuation, it nevertheless is a valid continuation of ID COLON ID. The warning produced by Menhir, shown in Figure 22, is supposed to attract attention to this issue.

Another way to explain this issue is to point out that, by declaring %on_error_reduce typ1, we make a choice. When the parser reads a type and finds an invalid token, it decides that this type is finished, even though, in reality, this type could be continued with ARROW …. This in turn causes the parser to perform another reduction and consider the current declaration finished, even though, in reality, this declaration could be continued with ARROW ….

In summary, when writing a diagnostic message for state 15, one should take into account the fact that this state can be reached via spurious reductions and (therefore) SEMICOLON may not be the only permitted continuation. One way of doing this, without explicitly listing all permitted continuations, is to write: “Up to this point, a declaration has been recognized. If this declaration is complete, then at this point, a semicolon is expected”.

11.4  A working example

The CompCert verified compiler offers a real-world example of this approach to error handling. The “pre-parser” is where syntax errors are detected: see cparser/pre_parser.mly. A database of erroneous input sentences and (templates for) diagnostic messages is stored in cparser/handcrafted.messages. It is compiled, using --compile-errors, to an OCaml file named cparser/pre_parser_messages.ml. The function Pre_parser_messages.message, which maps a state number to (a template for) a diagnostic message, is called from cparser/ErrorReports.ml, where we construct and display a full-fledged diagnostic message.

In CompCert, we allow a template for a diagnostic message to contain the special form $i, where i is an integer constant, understood as an index into the parser’s stack. The code in cparser/ErrorReports.ml automatically replaces this special form with the fragment of the source text that corresponds to this stack entry. This mechanism is not built into Menhir; it is implemented in CompCert using Menhir’s incremental API.
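A template expansion of this kind can be sketched as follows. This is not CompCert's actual code; fragment is a stand-in for the function that retrieves the source text corresponding to stack entry i:

```ocaml
(* Sketch, in the style of CompCert's ErrorReports: expanding the
   special form [$i] in a message template. [fragment] stands in for
   the function that retrieves the source text of stack entry [i]. *)
let expand (fragment : int -> string) (template : string) : string =
  let buf = Buffer.create (String.length template) in
  let n = String.length template in
  let i = ref 0 in
  while !i < n do
    if template.[!i] = '$' && !i + 1 < n
       && template.[!i + 1] >= '0' && template.[!i + 1] <= '9'
    then begin
      (* scan the digits that follow the '$' sign *)
      let j = ref (!i + 1) in
      while !j < n && template.[!j] >= '0' && template.[!j] <= '9' do
        incr j
      done;
      let index = int_of_string (String.sub template (!i + 1) (!j - !i - 1)) in
      Buffer.add_string buf (fragment index);
      i := !j
    end else begin
      Buffer.add_char buf template.[!i];
      incr i
    end
  done;
  Buffer.contents buf

let () =
  let fragment i = Printf.sprintf "'entry %d'" i in
  print_endline (expand fragment "After $0, expecting a semicolon.")
```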

12  Coq back-end

Menhir is able to generate a parser whose correctness can be formally verified using the Coq proof assistant [13]. This feature is used to construct the parser of the CompCert verified compiler [17].

Setting the --coq switch on the command line enables the Coq back-end. When this switch is set, Menhir expects an input file whose name ends in .vy and generates a Coq file whose name ends in .v.

Like a .mly file, a .vy file is a grammar specification, with embedded semantic actions. The only difference is that the semantic actions in a .vy file are expressed in Coq instead of OCaml. A .vy file otherwise uses the same syntax as a .mly file. CompCert’s cparser/Parser.vy serves as an example.

Several restrictions are imposed when Menhir is used in --coq mode:

  • The error handling mechanism (§10) is absent. The $syntaxerror keyword and the error token are not supported.
  • Location information is not propagated. The $start* and $end* keywords (Figure 14) are not supported.
  • %parameter (§4.1.2) is not supported.
  • %inline (§5.3) is not supported.
  • The standard library (§5.4) is not supported, of course, because its semantic actions are expressed in OCaml. If desired, the user can define an analogous library, whose semantic actions are expressed in Coq.
  • Because Coq’s type inference algorithm is rather unpredictable, the Coq type of every nonterminal symbol must be provided via a %type or %start declaration (§4.1.5, §4.1.6).
  • Unless the proof of completeness has been deactivated using --coq-no-complete, the grammar must not have a conflict (not even a benign one, in the sense of §6.1). That is, the grammar must be LR(1). Conflict resolution via priority and associativity declarations (§4.1.4) is not supported. The reason is that there is no simple formal specification of how conflict resolution should work.

The generated file contains several modules:

  • The module Gram defines the terminal and non-terminal symbols, the grammar, and the semantic actions.
  • The module Aut contains the automaton generated by Menhir, together with a certificate that is checked by Coq while establishing the soundness and completeness of the parser.

The type terminal of the terminal symbols is an inductive type, with one constructor for each terminal symbol. A terminal symbol named Foo in the .vy file is named Foo't in Coq. A terminal symbol per se does not carry a semantic value.

We also define the type token of tokens, that is, dependent pairs of a terminal symbol and a semantic value of an appropriate type for this symbol. We model the lexer as an object of type Streams.Stream token, that is, an infinite stream of tokens.

The type nonterminal of the non-terminal symbols is an inductive type, with one constructor for each non-terminal symbol. A non-terminal symbol named Bar in the .vy file is named Bar'nt in Coq.

The proof of termination of an LR(1) parser in the case of invalid input seems far from obvious. We did not find such a proof in the literature. In an application such as CompCert [17], this question is not considered crucial. For this reason, we did not formally establish the termination of the parser. Instead, in order to satisfy Coq’s termination requirements, we use the “fuel” technique: the parser takes an additional parameter log_fuel of type nat such that 2^log_fuel is the maximum number of steps the parser is allowed to perform. In practice, one can use a value of, e.g., 40 or 50 to make sure the parser will never run out of fuel in a reasonable time.

Parsing can have three different outcomes, represented by the type parse_result. (This definition is implicitly parameterized over the initial state init. We omit the details here.)

  Inductive parse_result :=
  | Fail_pr:    parse_result
  | Timeout_pr: parse_result
  | Parsed_pr:
      symbol_semantic_type (NT (start_nt init)) ->
      Stream token ->
      parse_result.

The outcome Fail_pr means that parsing has failed because of a syntax error. (If the completeness of the parser with respect to the grammar has been proved, this implies that the input is invalid). The outcome Timeout_pr means that the fuel has been exhausted. Of course, this cannot happen if the parser was given an infinite amount of fuel, as suggested above. The outcome Parsed_pr means that the parser has succeeded in parsing a prefix of the input stream. It carries the semantic value that has been constructed for this prefix, as well as the remainder of the input stream.

For each entry point entry of the grammar, Menhir generates a parsing function entry, whose type is nat -> Stream token -> parse_result.

Two theorems are provided, named entry_point_correct and entry_point_complete. The correctness theorem states that, if a word (a prefix of the input stream) is accepted, then this word is valid (with respect to the grammar) and the semantic value that is constructed by the parser is valid as well (with respect to the grammar). The completeness theorem states that if a word (a prefix of the input stream) is valid (with respect to the grammar), then (given sufficient fuel) it is accepted by the parser.

These results imply that the grammar is unambiguous: for every input, there is at most one valid interpretation. This is proved by another generated theorem, named Parser.unambiguous.

The parsers produced by Menhir’s Coq back-end must be linked with a Coq library. This library can be installed via the command opam install coq-menhirlib. The Coq sources of this library can be found in the coq-menhirlib directory of the Menhir repository.

The CompCert verified compiler [17,16] can be used as an example if one wishes to use Menhir to generate a formally verified parser as part of some other project. See in particular the directory cparser.

13  Building grammarware on top of Menhir

It is possible to build a variety of grammar-processing tools, also known as “grammarware” [14], on top of Menhir’s front-end. Indeed, Menhir offers a facility for dumping a .cmly file, which contains a (binary-form) representation of the grammar and automaton, as well as a library, MenhirSdk, for (programmatically) reading and exploiting a .cmly file. These facilities are described in §13.1. Furthermore, Menhir allows decorating a grammar with “attributes”, which are ignored by Menhir’s back-ends, yet are written to the .cmly file, thus can be exploited by other tools, via MenhirSdk. Attributes are described in §13.2.

13.1  Menhir’s SDK

The command line option --cmly causes Menhir to produce a .cmly file in addition to its normal operation. This file contains a (binary-form) representation of the grammar and automaton. This is the grammar that is obtained after the following steps have been carried out:

  • joining multiple .mly files, if necessary;
  • eliminating anonymous rules;
  • expanding away parameterized nonterminal symbols;
  • removing unreachable nonterminal symbols;
  • performing OCaml type inference, if the --infer switch is used;
  • inlining away nonterminal symbols that are decorated with %inline.

The library MenhirSdk offers an API for reading a .cmly file. The functor MenhirSdk.Cmly_read.Read reads such a file and produces a module whose signature is MenhirSdk.Cmly_api.GRAMMAR. This API is not explained in this document; for details, the reader is expected to follow the above links.
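As an illustration, here is a minimal sketch (not part of the manual) of a program that uses MenhirSdk. It assumes that a file parser.cmly has been produced via menhir --cmly, and that the module produced by the functor offers Nonterminal.iter and Nonterminal.name, as per the MenhirSdk.Cmly_api.GRAMMAR signature:

```ocaml
(* A sketch: read the file parser.cmly (assumed to exist) and print
   the name of every nonterminal symbol in the grammar. *)
module G =
  MenhirSdk.Cmly_read.Read (struct let filename = "parser.cmly" end)

let () =
  G.Nonterminal.iter (fun nt ->
    print_endline (G.Nonterminal.name nt))
```

This must be compiled and linked against the menhirSdk package, e.g. via ocamlfind or a dune `libraries` field.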

13.2  Attributes

Attributes are decorations that can be placed in .mly files. They are ignored by Menhir’s back-ends, but are written to .cmly files, thus can be exploited by other tools, via MenhirSdk.

An attribute consists of a name and a payload. An attribute name is an OCaml identifier, such as cost, or a list of OCaml identifiers, separated with dots, such as my.name. An attribute payload is an OCaml expression of arbitrary type, such as 1 or "&&" or print_int. Following the syntax of OCaml’s attributes, an attribute’s name and payload are separated with one or more spaces, and are delimited by [@ and ]. Thus, [@cost 1] and [@printer print_int] are examples of attributes.

An attribute can be attached at one of four levels:

  1. An attribute can be attached with the grammar. Such an attribute must be preceded with a % sign and must appear in the declarations section (§4.1). For example, the following is a valid declaration:
      %[@trace true]
    
  2. An attribute can be attached with a terminal symbol. Such an attribute must follow the declaration of this symbol. For example, the following is a valid declaration of the terminal symbol INT:
      %token<int> INT [@cost 0] [@printer print_int]
    
  3. An attribute can be attached with a nonterminal symbol. Such an attribute must appear inside the rule that defines this symbol, immediately after the name of this symbol. For instance, the following is a valid definition of the nonterminal symbol expr:
      expr [@default EConst 0]:
        i = INT                  { EConst i }
      | e1 = expr PLUS e2 = expr { EAdd (e1, e2) }
    
    An attribute can be attached with a parameterized nonterminal symbol:
      option [@default None] (X):
              { None }
      | x = X { Some x }
    
    An attribute cannot be attached with a nonterminal symbol that is decorated with the %inline keyword.
  4. An attribute can be attached with a producer (§4.2.3), that is, with an occurrence of a terminal or nonterminal symbol in the right-hand side of a production. Such an attribute must appear immediately after the producer. For instance, in the following rule, an attribute is attached with the producer expr*:
      exprs:
        LPAREN es = expr* [@list true] RPAREN { es }
    

As a convenience, it is possible to attach many attributes with many (terminal and nonterminal) symbols in one go, via an %attribute declaration, which must be placed in the declarations section (§4.1). For instance, the following declaration attaches both of the attributes [@cost 0] and [@precious false] with each of the symbols INT and id:

  %attribute INT id [@cost 0] [@precious false]

An %attribute declaration can be considered syntactic sugar: it is desugared away in terms of the four forms of attributes presented earlier. (The command line switch --only-preprocess can be used to see how it is desugared.)

If an attribute is attached with a parameterized nonterminal symbol, then, when this symbol is expanded away, the attribute is transmitted to every instance. For instance, in an earlier example, the attribute [@default None] was attached with the parameterized symbol option. Then, every instance of option, such as option(expr), option(COMMA), and so on, inherits this attribute. To attach an attribute with one specific instance only, one can use an %attribute declaration. For instance, the declaration %attribute option(expr) [@cost 10] attaches an attribute with the nonterminal symbol option(expr), but not with the symbol option(COMMA).

14  Interaction with build systems

This section explains some details of the compilation workflow, including OCaml type inference and its repercussions on dependency analysis (§14.1) and compilation flags (§14.2). This material should be of interest only to authors of build systems who wish to build support for Menhir into their system. Ordinary users should skip this section and use a build system that knows about Menhir, such as dune (preferred) or ocamlbuild.

14.1  OCaml type inference and dependency analysis

In an ideal world, the semantic actions in a .mly file should be well-typed according to the OCaml type discipline, and their types should be known to Menhir, which may need this knowledge. (When --inspection is set, Menhir needs to know the OCaml type of every nonterminal symbol.) To address this problem, three approaches exist:

  • Ignore the problem and let Menhir run without OCaml type information (§14.1.1).
  • Let Menhir obtain OCaml type information by invoking the OCaml compiler (§14.1.2).
  • Let Menhir request and receive OCaml type information without invoking the OCaml compiler (§14.1.3).

14.1.1  Running without OCaml type information

The simplest thing to do is to run Menhir without any of the flags described in the following (§14.1.2, §14.1.3). Then, the semantic actions are not type-checked, and their OCaml type is not inferred. (This is analogous to using ocamlyacc.) The drawbacks of this approach are as follows:

  • A type error in a semantic action is detected only when the .ml file produced by Menhir is type-checked. The location of the type error, as reported by the OCaml compiler, can be suboptimal.
  • Unless a %type declaration for every nonterminal symbol is given, the inspection API cannot be generated, that is, --inspection must be turned off.

14.1.2  Obtaining OCaml type information by calling the OCaml compiler

The second approach is to let Menhir invoke the OCaml compiler so as to type-check the semantic actions and infer their types. This is done by invoking Menhir with the --infer switch, as follows.

--infer.  This switch causes the semantic actions to be checked for type consistency before the parser is generated. To do so, Menhir generates a mock .ml file, which contains just the semantic actions, and invokes the OCaml compiler, in the form ocamlc -i, so as to type-check this file and infer the types of the semantic actions. Menhir then reads this information and produces real .ml and .mli files.

--ocamlc command.  This switch controls how ocamlc is invoked. It allows setting both the name of the executable and the command line options that are passed to it.

One difficulty with this approach is that the OCaml compiler usually needs to consult a few .cm[iox] files. Indeed, if the .mly file contains a reference to an external OCaml module, say A, then the OCaml compiler typically needs to read one or more files named A.cm[iox].

This implies that these files must have been created first. But how is one supposed to know, exactly, which files should be created first? One must scan the .mly file so as to find out which external modules it depends upon. In other words, a dependency analysis is required. This analysis can be carried out by invoking Menhir with the --depend switch, as follows.

--depend.  This switch causes Menhir to generate dependency information for use in conjunction with make. When invoked in this mode, Menhir does not generate a parser. Instead, it examines the grammar specification and prints a list of prerequisites for the targets basename.cm[iox], basename.ml, and basename.mli. This list is intended to be textually included within a Makefile. To produce this list, Menhir generates a mock .ml file, which contains just the semantic actions, invokes ocamldep, and postprocesses its output.

--raw-depend.  This switch is analogous to --depend. However, in this case, ocamldep’s output is not postprocessed by Menhir: it is echoed without change. This switch is not suitable for direct use with make; it is intended for use with omake or ocamlbuild, which perform their own postprocessing.

--ocamldep command.  This switch controls how ocamldep is invoked. It allows setting both the name of the executable and the command line options that are passed to it.
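To make this concrete, here is a minimal (hypothetical) Makefile sketch that combines --infer and --depend; the file names parser.mly and .depend are assumptions, not prescribed by Menhir:

```make
# Generate the parser, letting Menhir infer the types of semantic actions.
parser.ml parser.mli: parser.mly
	menhir --infer parser.mly

# Compute the prerequisites of parser.cm[iox], parser.ml, and parser.mli.
.depend: parser.mly
	menhir --depend parser.mly > $@

# Textually include the dependency information computed above, so that
# make builds the required .cm[iox] files before invoking menhir --infer.
-include .depend
```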

14.1.3  Obtaining OCaml type information without calling the OCaml compiler

The third approach is to let Menhir request and receive OCaml type information without allowing Menhir to invoke the OCaml compiler. There is nothing magic about this: to achieve this, Menhir must be invoked twice, and the OCaml compiler must be invoked (by the user, or by the build system) in between. This is done as follows.

--infer-write-query mockfilename.  When invoked in this mode, Menhir does not generate a parser. Instead, it generates a mock .ml file, named mockfilename, which contains just the semantic actions. Then, it stops.

It is then up to the user (or to the build system) to invoke ocamlc -i so as to type-check the mock .ml file and infer its signature. The output of this command should be redirected to some file sigfilename. Then, Menhir can be invoked again, as follows.

--infer-read-reply sigfilename.  When invoked in this mode, Menhir assumes that the file sigfilename contains the result of running ocamlc -i on the file mockfilename. It reads and parses this file, so as to obtain the OCaml type of every semantic action, then proceeds normally to generate a parser.

This protocol was introduced on 2018/05/23; earlier versions of Menhir do not support it. Its existence can be tested as follows:

--infer-protocol-supported.  When invoked with this switch, Menhir immediately terminates with exit code 0. An earlier version of Menhir, which does not support this protocol, would display a help message and terminate with a nonzero exit code.
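Putting the pieces together, the protocol can be sketched as the following command sequence. The file names mock.ml and sig.inferred are hypothetical, and the -I flags passed to ocamlc must be adapted so that the external modules mentioned in the grammar are visible:

```sh
# Step 1: ask Menhir to write a mock .ml file containing the semantic actions.
menhir --infer-write-query mock.ml parser.mly

# Step 2: let the OCaml compiler infer the types of the semantic actions.
ocamlc -I . -i mock.ml > sig.inferred

# Step 3: feed the inferred types back to Menhir; a parser is generated.
menhir --infer-read-reply sig.inferred parser.mly
```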

14.2  Compilation flags

The following switches allow querying Menhir so as to find out which compilation flags should be passed to the OCaml compiler and linker.

--suggest-comp-flags.  This switch causes Menhir to print a set of suggested compilation flags, and exit. These flags are intended to be passed to the OCaml compilers (ocamlc or ocamlopt) when compiling and linking the parser generated by Menhir. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, a -I flag is suggested, so as to ensure that MenhirLib is visible to the OCaml compiler.

--suggest-link-flags-byte.  This switch causes Menhir to print a set of suggested link flags, and exit. These flags are intended to be passed to ocamlc when producing a bytecode executable. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, the object file menhirLib.cmo is suggested, so as to ensure that MenhirLib is linked in.

--suggest-link-flags-opt.  This switch causes Menhir to print a set of suggested link flags, and exit. These flags are intended to be passed to ocamlopt when producing a native code executable. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, the object file menhirLib.cmx is suggested, so as to ensure that MenhirLib is linked in.

--suggest-menhirLib.  This switch causes Menhir to print (the absolute path of) the directory where MenhirLib was installed.

--suggest-ocamlfind.  This switch is deprecated and may be removed in the future. It always prints false.
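For instance, a build script for a table-based parser might query Menhir as follows. This is a sketch; the file names parser.* and main.* are hypothetical:

```sh
# Query Menhir for the flags needed to compile and link against MenhirLib.
COMPFLAGS=$(menhir --table --suggest-comp-flags)
LINKFLAGS=$(menhir --table --suggest-link-flags-byte)

# Compile the generated parser, then link a bytecode executable.
ocamlc $COMPFLAGS -c parser.mli parser.ml
ocamlc $LINKFLAGS parser.cmo main.cmo -o main
```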

15  Comparison with ocamlyacc

Roughly speaking, Menhir is 90% compatible with ocamlyacc. Legacy ocamlyacc grammar specifications are accepted and compiled by Menhir. The resulting parsers run and produce correct parse trees. However, parsers that explicitly invoke functions in the module Parsing behave slightly incorrectly. For instance, the functions that provide access to positions return a dummy position when invoked by a Menhir parser. Porting a grammar specification from ocamlyacc to Menhir requires replacing all calls to Parsing with new Menhir-specific keywords (§7).

Here is an incomplete list of the differences between ocamlyacc and Menhir. The list is roughly sorted by decreasing order of importance.

  • Menhir allows the definition of a nonterminal symbol to be parameterized (§5.2). A formal parameter can be instantiated with a terminal symbol, a nonterminal symbol, or an anonymous rule (§4.2.4). A library of standard parameterized definitions (§5.4), including options, sequences, and lists, is bundled with Menhir. EBNF syntax is supported: the modifiers ?, +, and * are sugar for options, nonempty lists, and arbitrary lists (Figure 2).
  • ocamlyacc only accepts LALR(1) grammars. Menhir accepts LR(1) grammars, thus avoiding certain artificial conflicts.
  • Menhir’s %inline keyword (§5.3) helps avoid or resolve some LR(1) conflicts without artificial modification of the grammar.
  • Menhir explains conflicts (§6) in terms of the grammar, not just in terms of the automaton. Menhir’s explanations are believed to be understandable by mere humans.
  • Menhir offers an incremental API (in --table mode only) (§9.2). This means that the state of the parser can be saved at any point (at no cost) and that parsing can later be resumed from a saved state.
  • Menhir offers a set of tools for building a (complete, irredundant) set of invalid input sentences, mapping each such sentence to a (hand-written) error message, and maintaining this set as the grammar evolves (§11).
  • In --coq mode, Menhir produces a parser whose correctness and completeness with respect to the grammar can be checked by Coq (§12).
  • Menhir offers an interpreter (§8) that helps debug grammars interactively.
  • Menhir allows grammar specifications to be split over multiple files (§5.1). It also allows several grammars to share a single set of tokens.
  • Menhir produces reentrant parsers.
  • Menhir is able to produce parsers that are parameterized by OCaml modules.
  • ocamlyacc requires semantic values to be referred to via keywords: $1, $2, and so on. Menhir allows semantic values to be explicitly named.
  • Menhir warns about end-of-stream conflicts (§6.4), whereas ocamlyacc does not. Menhir warns about productions that are never reduced, whereas, at least in some cases, ocamlyacc does not.
  • Menhir offers an option to typecheck semantic actions before a parser is generated: see --infer.
  • ocamlyacc produces tables that are interpreted by a piece of C code, requiring semantic actions to be encapsulated as OCaml closures and invoked by C code. Menhir offers a choice between producing tables and producing code. In either case, no C code is involved.
  • Menhir makes OCaml’s standard library module Parsing entirely obsolete. Access to locations is now via keywords (§7). Uses of raise Parse_error within semantic actions are deprecated. The function parse_error is deprecated. They are replaced with keywords (§10).
  • Menhir’s error handling mechanism (§10) is inspired by ocamlyacc’s, but is not guaranteed to be fully compatible. Error recovery, also known as re-synchronization, is not supported by Menhir.
  • The way in which severe conflicts (§6) are resolved is not guaranteed to be fully compatible with ocamlyacc.
  • Menhir warns about unused %token, %nonassoc, %left, and %right declarations. It also warns about %prec annotations that do not help resolve a conflict.
  • Menhir accepts OCaml-style comments.
  • Menhir allows %start and %type declarations to be condensed.
  • Menhir allows two (or more) productions to share a single semantic action.
  • Menhir produces better error messages when a semantic action contains ill-balanced parentheses.
  • ocamlyacc ignores semicolons and commas everywhere. Menhir regards semicolons and commas as significant, and allows them, or requires them, in certain well-defined places.
  • ocamlyacc allows %type declarations to refer to terminal or non-terminal symbols, whereas Menhir requires them to refer to non-terminal symbols. Types can be assigned to terminal symbols with a %token declaration.

16  Questions and Answers


Is Menhir faster than ocamlyacc? What is the speed difference between menhir and menhir --table? A (not quite scientific) benchmark suggests that the parsers produced by ocamlyacc and menhir --table have comparable speed, whereas those produced by menhir are between 2 and 5 times faster. This benchmark excludes the time spent in the lexer and in the semantic actions.


How do I write Makefile rules for Menhir? This can be a bit tricky. If you must do this, see §14. It is recommended instead to use a build system with built-in support for Menhir, such as dune (preferred) or ocamlbuild.


How do I use Menhir with ocamlbuild? Pass -use-menhir to ocamlbuild. To pass options to Menhir, pass -menhir "menhir <options>" to ocamlbuild. To use Menhir’s table-based back-end, pass -menhir "menhir --table" to ocamlbuild, and either pass -package menhirLib to ocamlbuild or add the tag package(menhirLib) in the _tags file. To combine multiple .mly files, say a.mly and b.mly, into a single parser, say parser.{ml,mli}, create a file named parser.mlypack that contains the module names A B. See the demos directory for examples. To deal with .messages files (§11), use the rules provided in the file demos/ocamlbuild/myocamlbuild.ml.


How do I use Menhir with dune? Please use dune version 1.4.0 or newer, as it has appropriate built-in rules for Menhir parsers. In the simplest scenario, where the parser resides in a single source file parser.mly, the dune file should contain a “stanza” along the following lines:

(menhir
  (modules parser)
  (flags --explain --dump)
  (infer true)
)

The --infer switch has special status and should not be used directly; instead, write (infer true) or (infer false), as done above. (The default is true.) Ordinary command line switches, like --explain and --dump, are passed as part of the flags line, as done above. The directory demos/calc-dune offers an example. For more details, see dune’s documentation. To deal with .messages files (§11), use and adapt the rules found in the file src/stage2/dune.


Menhir reports more shift/reduce conflicts than ocamlyacc! How come? ocamlyacc sometimes merges two states of the automaton that Menhir considers distinct. This happens when the grammar is not LALR(1). If these two states happen to contain a shift/reduce conflict, then Menhir reports two conflicts, while ocamlyacc only reports one. Of course, the two conflicts are very similar, so fixing one will usually fix the other as well.


I do not use ocamllex. Is there an API that does not involve lexing buffers? Like ocamlyacc, Menhir produces parsers whose monolithic API (§9.1) is intended for use with ocamllex. However, it is possible to convert them, after the fact, to a simpler, revised API. In the revised API, there are no lexing buffers, and a lexer is just a function from unit to tokens. Converters are provided by the library module MenhirLib.Convert. This can be useful, for instance, for users of ulex, the Unicode lexer generator. Also, please note that Menhir’s incremental API (§9.2) does not mention the type Lexing.lexbuf. In this API, the parser expects to be supplied with triples of a token and start/end positions of type Lexing.position.
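For instance, assuming a generated parser Parser with entry point main and an ocamllex-style lexer Lexer.token (these module and function names are assumptions), the conversion can be sketched with MenhirLib.Convert.Simplified.traditional2revised as follows:

```ocaml
(* [revised_main] is a parser in the revised API: it expects a lexer of
   type [unit -> token * Lexing.position * Lexing.position]. *)
let revised_main =
  MenhirLib.Convert.Simplified.traditional2revised Parser.main

(* A revised lexer built on top of a traditional ocamllex lexer:
   each call produces a token together with its start/end positions. *)
let revised_lexer lexbuf () =
  let token = Lexer.token lexbuf in
  (token, lexbuf.Lexing.lex_start_p, lexbuf.Lexing.lex_curr_p)

let parse lexbuf =
  revised_main (revised_lexer lexbuf)
```

With a non-ocamllex lexer, such as one based on ulex, one simply supplies revised_main with any function of type unit -> token * position * position; no lexing buffer is involved.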


I need both %inline and non-%inline versions of a non-terminal symbol. Is this possible? Define an %inline version first, then use it to define a non-%inline version, like this:

%inline ioption(X):  (* nothing *) { None } | x = X { Some x }
         option(X): o = ioption(X) { o }

This can work even in the presence of recursion, as illustrated by the following definition of (reversed, left-recursive, possibly empty) lists:

%inline irevlist(X):    (* nothing *) { [] } | xs = revlist(X) x = X { x :: xs }
         revlist(X): xs = irevlist(X) { xs }

The definition of irevlist is expanded into the definition of revlist, so in the end, revlist receives its normal, recursive definition. One can then view irevlist as a variant of revlist that is inlined one level deep.
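Concretely, after %inline expansion, revlist receives roughly the following recursive definition:

```
revlist(X):
    (* nothing *)           { [] }
  | xs = revlist(X) x = X   { x :: xs }
```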


Can I ship a generated parser while avoiding a dependency on MenhirLib? Yes. One option is to use the code-based back-end (that is, to not use --table). In this case, the generated parser is self-contained. Another option is to use the table-based back-end (that is, use --table) and include a copy of the files menhirLib.{ml,mli} together with the generated parser. The command menhir --suggest-menhirLib will tell you where to find these source files.


Why is $startpos off towards the left? It seems to include some leading whitespace. Indeed, as of 2015/11/04, the computation of positions has changed so as to match ocamlyacc’s behavior. As a result, $startpos can now appear to be too far off to the left. This is explained in §7. In short, the solution is to use $symbolstartpos instead.


Can I pretty-print a grammar in ASCII, HTML, or LATEX format? Yes. Have a look at obelisk [4].


Does Menhir support mid-rule actions? Yes. See midrule and its explanation in §5.4.

17  Technical background

After experimenting with Knuth’s canonical LR(1) technique [15], we found that it really is not practical, even on today’s computers. For this reason, Menhir implements a slightly modified version of Pager’s algorithm [19], which merges states on the fly if it can be proved that no reduce/reduce conflicts will arise as a consequence of this decision. This is how Menhir avoids the so-called mysterious conflicts created by LALR(1) parser generators [7, section 5.7].

Menhir’s algorithm for explaining conflicts is inspired by DeRemer and Pennello’s [6] and adapted for use with Pager’s construction technique.

By default, Menhir produces code, as opposed to tables. This approach has been explored before [3,9]. Menhir performs some static analysis of the automaton in order to produce more compact code.

When asked to produce tables, Menhir performs compression via first-fit row displacement, as described by Tarjan and Yao [23]. Double displacement is not used. The action table is made sparse by factoring out an error matrix, as suggested by Dencker, Dürre, and Heuft [5].

The type-theoretic tricks that triggered our interest in LR parsers [21] are not implemented in Menhir. In the beginning, we did not implement them because the OCaml compiler did not at the time offer generalized algebraic data types (GADTs). Today, OCaml has GADTs, but, as the saying goes, “if it ain’t broken, don’t fix it”.

The main ideas behind the Coq back-end are described in a paper by Jourdan, Pottier and Leroy [13]. The C11 parser in the CompCert compiler [17] is constructed by Menhir and verified by Coq, following this technique. How to construct a correct C11 parser using Menhir is described by Jourdan and Pottier [12].

The approach to error reports presented in §11 was proposed by Jeffery [10] and further explored by Pottier [20].

18  Acknowledgements

Menhir’s interpreter (--interpret) and table-based back-end (--table) were implemented by Guillaume Bau, Raja Boujbel, and François Pottier. The project was generously funded by Jane Street Capital, LLC through the “OCaml Summer Project” initiative.

Frédéric Bour provided motivation and an initial implementation for the incremental API, for the inspection API, for attributes, and for MenhirSdk. Merlin, an emacs mode for OCaml, contains an impressive incremental, syntax-error-tolerant OCaml parser, which is based on Menhir and has been a driving force for Menhir’s APIs.

Jacques-Henri Jourdan designed and implemented the Coq back-end and did the Coq proofs for it.

Gabriel Scherer provided motivation for investigating Jeffery’s technique.

References

[1]
Alfred V. Aho, Ravi Sethi, and Jeffrey D. Ullman. Compilers: Principles, Techniques, and Tools. Addison-Wesley, 1986.
[2]
Andrew Appel. Modern Compiler Implementation in ML. Cambridge University Press, 1998.
[3]
Achyutram Bhamidipaty and Todd A. Proebsting. Very fast YACC-compatible parsers (for very little effort). Software: Practice and Experience, 28(2):181–190, 1998.
[4]
Lélio Brun. Obelisk. https://github.com/Lelio-Brun/Obelisk, 2017.
[5]
Peter Dencker, Karl Dürre, and Johannes Heuft. Optimization of parser tables for portable compilers. ACM Transactions on Programming Languages and Systems, 6(4):546–572, 1984.
[6]
Frank DeRemer and Thomas Pennello. Efficient computation of LALR(1) look-ahead sets. ACM Transactions on Programming Languages and Systems, 4(4):615–649, 1982.
[7]
Charles Donnelly and Richard Stallman. Bison, 2015.
[8]
John E. Hopcroft, Rajeev Motwani, and Jeffrey D. Ullman. Introduction to Automata Theory, Languages, and Computation. Addison-Wesley, 2000.
[9]
R. Nigel Horspool and Michael Whitney. Even faster LR parsing. Software: Practice and Experience, 20(6):515–535, 1990.
[10]
Clinton L. Jeffery. Generating LR syntax error messages from examples. ACM Transactions on Programming Languages and Systems, 25(5):631–640, 2003.
[11]
Steven C. Johnson. Yacc: Yet another compiler compiler. In UNIX Programmer’s Manual, volume 2, pages 353–387. Holt, Rinehart, and Winston, 1979.
[12]
Jacques-Henri Jourdan and François Pottier. A simple, possibly correct LR parser for C11. ACM Transactions on Programming Languages and Systems, 39(4):14:1–14:36, August 2017.
[13]
Jacques-Henri Jourdan, François Pottier, and Xavier Leroy. Validating LR(1) parsers. In European Symposium on Programming (ESOP), volume 7211 of Lecture Notes in Computer Science, pages 397–416. Springer, 2012.
[14]
Paul Klint, Ralf Lämmel, and Chris Verhoef. Toward an engineering discipline for grammarware. ACM Transactions on Software Engineering and Methodology, 14(3):331–380, 2005.
[15]
Donald E. Knuth. On the translation of languages from left to right. Information & Control, 8(6):607–639, 1965.
[16]
Xavier Leroy. The CompCert C verified compiler. https://github.com/AbsInt/CompCert, 2014.
[17]
Xavier Leroy. The CompCert C compiler. http://compcert.inria.fr/, 2015.
[18]
Xavier Leroy, Damien Doligez, Alain Frisch, Jacques Garrigue, Didier Rémy, and Jérôme Vouillon. The OCaml system: documentation and user’s manual, 2016.
[19]
David Pager. A practical general method for constructing LR(k) parsers. Acta Informatica, 7:249–268, 1977.
[20]
François Pottier. Reachability and error diagnosis in LR(1) parsers. In Compiler Construction (CC), pages 88–98, 2016.
[21]
François Pottier and Yann Régis-Gianas. Towards efficient, typed LR parsers. Electronic Notes in Theoretical Computer Science, 148(2):155–180, 2006.
[22]
David R. Tarditi and Andrew W. Appel. ML-Yacc User’s Manual, 2000.
[23]
Robert Endre Tarjan and Andrew Chi-Chih Yao. Storing a sparse table. Communications of the ACM, 22(11):606–611, 1979.

1
The computation of $symbolstartpos is optimized by Menhir under two assumptions about the lexer. First, Menhir assumes that the lexer never produces a token whose start and end positions are equal. Second, Menhir assumes that two positions produced by the lexer are equal if and only if they are physically equal. If the lexer violates either of these assumptions, the computation of $symbolstartpos could produce a result that differs from Parsing.symbol_start_pos().
2
This assumes that you have installed opam, the OCaml package manager, and that you have run the command opam repo add coq-released https://coq.inria.fr/opam/released.

This document was translated from LATEX by HEVEA.
KQ5*PH:Fur{akPryAa5֡Tk~|MpE:+77?>"ɭ,Y[A$Sm@'lzDG! ,e@ s낹FfnכaZO.q2ȉ.Ũl Cʝ ,Ѻ0o5s@}SgF*E2cLkeC'- X9fZglō0@ jLVOțRI3- 8NVx> stream xn}\,bg !ԲHqOUW5MlO /Ruӫ7)Ûމ*K*+,KJ]]}^i5vY+ӌvv4#8}w: Mq]+^5;Ĭ3q_U[?=b?WTc.eYĿ<͆VwT*ٴcӏB(0z$6X'l)&(n2hiMkbh*Ѱ3Ж:]?[_So>}}0؅JR}y剪[0idLTY%B'Ve$V2īHt/Z0Md_0V$*>DB<#"  ̞7Y dhG|{ݥ&`)Uf14 #Yӟ/طO u38@}CMT;KmﻵЫG'>xm}3*P#) KO kV%_VTY B,00Z@Uy1+V4%'۶ li*qɴCaǡ!bI`e O&B]-&w~py;Yݱum=uLh-lXޥwtflz>q}B'3j,~UT"K0dyK˽LL 3Kw88n˓LBYd1hg[ot㐥 8 =O 4EwHLb팏Zw#8!< NzP6M Kh-⽫ 3=.uDSVV}o l)J@fG#q`f%b T=m ȅf|clǣ{jΠGȵ>5A ;9{Z϶"tgRd1`Y/,}3",@r#pG,z!x7QӤak)N^ 2"@ѼcNm ƥDd8vC3C9'x`oA:f\Еt7"dcdLòIU}!^ŢeMB~ a| -42xS_ߓ܎BkTil'y.)B*'jܴlEr9d?aTZ/l\;T\H=?y},B-2DZ^|v$=9 _4^ ( A6ivb7ֽqD2:n%zd_/.sl-g .QⲀ'-BGYQ+*s_x4OrVg`ͅd\sIwEFiJtwE2\]:CxiLY"x60:!>^G8ly޲'wɜw6 Lr!C^pbš%ٿ Wv8P] s(zڄOӝv.ͱڠ@NV/rYL+ܛ$W-X+`f! RzOfױz]DbN-# if;% x]`&j,T^/&숨;;559Ng&9Y /#mV+3E\Pp-@4rsLq ^>Am*a^@BX< na{./;[TݪQjv'W"B.V>7²Q?.UGз:VMh EԼ'œ:!Ǥ1)ZTLj{w7~6f326&l(i?6GY7.M "bӷ;vxz c)얤=ASgGRDŞ$_TPM~101<@7ݙX(jnWwxǀlvxtY6Eԧ -ȭY6-w)0-d(>NO'6tܣbROT!-khk-9*|RQ}.׵`.03@<-3R*JTAPݥ 0|0"-D> :r )k-Kζ@}!$gV1כYwsy3bΜhTRsR/_ kιP&8 z60+x&H%U^d^Kor)׌8+ VK|2} SytGrHj w0x;ɩ4c@M렛3"!NC"We:mnU^s4+}ca^e$PE%u*D>[o'y8J<ƅl_b) lˮ[q4LtL6ށ:t}Viv1*DɃT!Dn_ĭ 8x6D eTJ5aH }IUñxW"\a-7H(=o:tmҞKfXTu=.mGz\~Ugq8YXC 2)<ׁ}LixBzYiE&\9H2_=f4{T*v]4TplM>R&7ܕ JI :N+Vx^ؼFƇW+ '9Em YgYr]?X_.ҩ B̈́9=*f.-҃`c_M?S&duT Ty"ӪO>ʑcDB_[T,*'7* dADualz\U3CP?yoG`-*;N%}UϦ*N[HOŊmp3dlV'и*栔fOMj -BM8kn' n/A.rߢA㽑5l˥}@'TٔloӁG6g+s}.O/P˴;x| t#l`J:Ǽoqm]nAځ+@a6acĿ-U0'yRTK <vcvY|jS@j'y;#=,J~f~'ٍe/DQNt_G7V"RU"pĿh2) H-,@ l0/2EcZw?~1,L HpoTVL ]1♲j2[eJ"M3?, endstream endobj 358 0 obj << /Type /ObjStm /N 100 /First 895 /Length 2733 /Filter /FlateDecode >> stream xڽZ]o7|_A6: pkymEV#$di6d˜pj]䄐3!#VK1AѠhČ븖]BLV8ǎшԈd$7 1T81j$cHɸx'$4< xb/%E`؄Ip)es |C_D_?{b9`t^<2E,F*dT"FY%PK%M3j4Z=/%]lԂ4m(D%LL[axƛX٧;y>$5ixIYդ9JT+0Ux)wRUx ,T* й\. Qm:EwbJxv S8sQz$_OÉUdH>=€hԝ%0g` ̃tL )=F CAS8t1#ˮ? 
iVg/wzsr9?]f#W_Ny`ÉYzyMܶkOG 9@>m@e5Cf>=Yxצ{7wgb~ꯠ+oכ ƇSZ|\ο[jJ\o ao+<%Pnںmkַ6V[DI6"e4@?Ztp G)ڜXISfь (e 6 Dء պTAEs䓋e Ϩ5!gqci#$K6ZfDm(nB:Au )p?)#S' $LP(XZG NR&ZT%ݖYG{Qܥ\g.wWQsc4,!Kcrӯ1ni[Ƹ1nm6^):?SfchTB-9=W QPSpzAH\d 4m|(LnW=S& PV *|8w |*-[8j9A&J~#+RA R:LXc2!ϡ.eVS.rqhFxƂ jqau֡\nC,tӣt+/8^rp X@r> )xvD.L .Rm PH# Ҕ:0pWrp"I-܀ y_ ᮂp jG4ȩ[ OTP*N> JE*U@ܮ۶ZZmmlmjmnmimR)yU5Ld|+!@{0ى:_I=0% ȕbnЌ 8SQܰ&TX 0SV|ndmE(ÙT IM[\JH ك@e;ȅʝb#7u|YLߑU#-rw0lnƴ 1%:,iIaV14/+n"7t/BFv:  GX~\*,U$آx6A~62sxBoDJܿFekՃ+,xNm1gyࡠ7W+rwko=Ww}^셾)OߔWRDYE!!]j@q8/}@AȥfMÒbP[35ۄ\\=?m||;Pᗸ5h?ej#$nh? ;Ѹ;) KدO*9ԧshehaZ419h\.A`~̘gL~UD*j1`(: ek7k):+C󎬥VkSy_:-Ө_qhF)*@S=0줏)3ƎNxj@|m)u֝v^oηַb[j_,il(t̨1M㧔&W&¯kȑTn.{Ɵ_uXhrˤe[]]v07G3gw endstream endobj 480 0 obj << /Length 4191 /Filter /FlateDecode >> stream x\Ys~C\%;帼M%qVU.5Cip$ڍt̡nʋH618__/>>z!,㙸HquqfV62]y)ُ'_UKvCׇleۛ?u׏^h9M,ŵ,RrU6-a$Ѝk'ܷ z̘Yר|z[_*;+T}_~埾KtaY6APCHY#<E]t7զnz]V(͛fPSV?pRS۶rEۼZMnQ\V'|Y63`T% g|XV*bE'"}E+^$ŶZH 3pX3jst *&ѲĊDZNRYOO ;7 EO9 L篥e\xj.|nU4U ~c30_WEsdų]mlZeZw8I?TM(Kκۢ.נUTX3q45ۿ󉆜/Ϩ&n`T3dľğH3+[DYSwz}E/r԰DhEC/hPL9K^VE=- ] jPm43z`  G 8*% q{3=(~VPz4zǡzNۨaw.$I1Wt-.E!n0"~/}3m t|ͦ^<7Μw3˲덄YdR{Xv>):)耍1.{1fȬȚebP5 kfa y {/ުx5T hӱr*ݕ':FO^Ν8ɩV\͋E a:U>-ȚRA)>#}fRԠ?nW<)0\foD74KwpnH-|](|aq_n e0袉-Xq0*09 l'jb rc<3̨tgGQ&3œY[u"sUƶ% u0^[3gPѵ|g_`zNʵٓ6_^!.'#%4!Nda'? `Rƻư<ph:`TQw\"Zcՙ,սtpo$VjfJ;G\;0]dK4J+'k|nN+((j~jzhV/z6Cĺٶ0~bǤ 4RmZB=]QExe/xⵔ,>b<Hfdy'DRu@xӢ9:jUЯN:2E9,d,duS>>+q[o9\|}Bg&8A׀b7߬ơxs]7wuNJPH,JӞBNwo,_3s-g(>*1kUGYoun^QxwtGdJE3qsS=yk}oN:)Kbw(Xz3 "Xr0= ||~Ȝ\Η`% F#y;16~wc&]#t.DD6҇8:M5o{7q]O*1oo{q1%QyQԹ|!mfvA'Vs5N^EZypa;%O[f> h*>k{DQXc(Ձt(挶eK{< F֐\m\jg(0c g[TDy~$w] EJ7HJ T<-(v z:ĸ=mA=t. 
(1irJ",9Mlh3$~*&L KY{idbSqOC?.O7v1FFśOxR-OU,7M@4zԐ-HdXo'3.q L?c蜧{;Ŷ((jQ3a$5 H?׷zl;,8'J>cjظ OEsaF֑kX, rD5,8^*y%ejpf@jLȗW`II Pusu_3ɉtx<<92\mN,|kݼIu?}xC|ΓKxx`3Bˁ,J] ]d9Őgg9-N";~|8rNEN~E@@\EOS> ~f[, 5D3׺Krˊ:ėxMҍ=smY2~OEӪ]`sthyznD# [?ۮ` Aǎ<ֽ2%lSbKi:Mi(|wWtNɷiE[jT{jNa;j3ޛm/Ǜvv;؝75n\z fw$1jj-8 ҨSn+wBvZR͋]&Gқifu%БMقZD́R--sj%iO[09D3X~F!'pѓW}fZY'G2ux+qhh@K$;)#^eSȇ3^iu,Jheۺ\^&QhmHb寎-D?,A.d&Kr oKC9OH^'@C]orvZ]pmxթgMHqlW@`7M"4(=Y2 cygI Ҽ:Iԏ*EI^SP ?-|WK5wsu.h?:(),v21uQW ^*ԼmƏ~*NpE,s-`874<Ҩjb)UďWdڧ~XV,zRjW¨&agz_&D}y8:. ZPԟ0e&3%B endstream endobj 491 0 obj << /Length 2965 /Filter /FlateDecode >> stream xZYܶ~ׯgf $xv풫M!1Er$mߞnt,VZI/CF 6w`ݳ_<{Rdh*3wlیWZ^CwGPc>H_ H.R(JVon~bԯQ* ZwtfFhJT=H(X{= L婦V&dLŷ[m$m@]وř)`OS !E(0R`bbNv^4fy>e QeӢuG*Y+܇̸j3ʯ}0 GM%b0*]Ab3H;|,2POqYm="ְ+1_GOQ;R()n+"M\I*Kot>| xƴ_>wb@RB!I!MAl{vp*I_U8"O޴)ؖx*FW2-+el{UTO.3F4;4wi\] -l,͝j* egg)Pxu}A?b88dbؙA8ѓS½x 8 2F`9. Sb[Sחa @M"pc|<(wfs|^L˱ <[K m-m ˢ.Cu!]p$D7L X8I&('2O`!HtuB0F˕  ( `zKou1^̶Sn l$UQ(I&ӹ{tκ4K 6#%-RRv6SO}hH|uXSjqy vѰ0X(i13ǂj9i&E؋f@c~xcPQt[S~vҔݥBrfٶ9p  eZta3C ]+@Ud4PS^DlwHp#zmafT?Ýԫa d1#JL, > ˓.' ǂuCdq` V*gN[jgɬ#*=5RV4G 4_[4f:1.I螚VÇsQfFUZZS9vI})M)cNKNaj8/}a9-1=ݝȄ0?E"|DAgƮ/ ,VaL2v QkЅg![ɂ2ܭMi`MDXFñhS,b.99d151-|>] Vf@kǽBj.0a |xT1XhN˶aXj%cr{]Q ʉ+Ep-4[ݥ$Wβ!MX ^Rλf*+Ca*|U6ju؏F7shfKC T\'}<8'5>HsO ;,vv ҠL VT"eE"g7P:=؞jk/-V@leI$HF߽K`_ݜ90/҅0n7o (&c l\bE} ǓS:%$GȎ8St%]09,E<_vIE>ED ݺR\x]ek~uK%s`[9o<jZAPaJAxNÔ5+ =-K^68pT~ţ! 
E WRJ0x NV fȈ5pP]Lp$:tH$/iO؈{/??as( I| hs.c2ex=0h"Z*˗O97iΙ]/Ez||tC枱;XػAc_D_o_vlXKEsv'y 0dvs?q endstream endobj 501 0 obj << /Length 3196 /Filter /FlateDecode >> stream xZݏ_6`/Iܢ"@]rA!ڵrJrv7:zhI"H~-"r/oxN$ 1[Ě ,Ywn|Zz^mdE->+̻;>o&ۖ9MjzQ$*GQU7ǢJ"Om]~ӛwRQD}L)EEk I .)U驚.oܞKi4pɴ2ef~/C<,`oDɒ|O.LEB,{۲حiunvf;,s/;U(h.k"w5>xW\/Y7{bPEc6L!BGQp_f݄$93FJ"UÄtz|Wf.svYӹUF4ǒ&2@m{7ؘ˟PnV0Uvkp$$il?,]:vDNu:QQګ{:̮RU&#/\}Ե9y  ٞJBacxHh)DY.^Ɉź׈SCV%9krp'2lyCXMTL`bѨQhAƸ7-kWa{r\W18X_3X]MJDNDA9 $7% d3S%'D6,PvEdhmY BmNDi?Ee4= hC]BLܙǠ19;X;fΩw89ji-w4gI嶰2ۑ eDI"$;.%1xB1xc{=쑃j4 ZE!l] ]p[[¦W&$` De=0(n A8}@1@7m+2d.!tዽ{)kڼi tSdsk#0|W C}' 9#ѢRGi0cݾm keŖ;J. 59o& k\@R=ĊT6jZY,{}{h*ߛPh"'j%EL%[p_buz{C{@^RQ/߅JL@S& uO1Hs7\,Q1$QA*<uݨAaVĞ`-N{TJ'BH8>JΥ3U-',d9ꨁld J:h0c' )!B2:===pEB L)9ҳ`FɄVW,3.vyi!2Tz- =c[Ocf!Y)H@D]AE (Fv[ap|a&Q{*6IaH{uk,$SJ0%^ԅ~u,-E^u(,G΢tR¿Ç~>*br<8ڗ`AW -l5۬r-sq_WW@P#{!b7Mul[)|,?ޖMIsP )ǡ`y1ZWx*5Rjԡ4Qs*nO69vh;Mswyh+u+ tnC] x{)kHAYA+&NMĀU,M?1bfm^t%6FT d\ &QT72KAi އd$y9ܝ ,tg/$GkCp͸CgL-IYj=Cgl_ Dq*5Kɡ>vnH+z^ڹB1_]]CY5T]5! ruJ~R_ GDftz s G=-Fgk$syvĀ+#W ÖB'L3B@Xο /p"cEۮx%/EXB?a928!K^eh7Bv&"UO9q;ʽ=.xtMI.OC/IG9(ꕠ$Sd(o#IX/ #U8-ij5]mq~w<+] _b69a6ZX&΁t؆$V?*?XzdQir=(~8r {~2EHԩj3DQt9cmWׅ#&e:eic ^ƶP0~4ܿ>8W/z1;|_ lU{n֯ugy/*a KnURSKAꕛ+*6b)"E\"_~oF endstream endobj 510 0 obj << /Length 2814 /Filter /FlateDecode >> stream xڽZoܸb?6/\\r[/''߽r&8+x!f3K&eЗtv}aW֋_.|<]fL yESZ)U8 wd%֋d^vvvX[2bvfjE`*3'|b ;GLF3,bIrh"f+Rz&XQ(Ii& IiwU?L- yy:'!Q,G/ͲI2HM~N..Oy"1K,Maha9 gJ[7k39U=pWt|Ē,4IXfj"dSl,lCjˡjޙHk ˶3}^ޟboY->7B&ۆH?yCXVٙ.Xɑ6煀UgZ{v:8W5,3k|74)]gE&YB&cJv:8ٹ_p =e>SEU˅3e釲r0+l8e0ݦj.Q>w>&sMiw!ЉEǑn6vY2߬zgH6*> Vqk!cL"33řBPt)h(>צ1ݴ T874h'eSp6O } EUa) Y $we"u D~Nt҂Iqu U5tewnse:lΏ7bPϰ7T²}<;\6](COŔ,xni5pՀͮv֕# {Vd9$P$ 2eNqXxd,DbϣbKFXACFT!ж@dR̙N2R3 +ϰwF;1y{S-+};厧W/CZy5-jzF Qb~c.J5p,A>Eȗ6Xv60*%9C=>Kv5 ޴=5^턽?R) b%}I\Y&;covhn$@czL807O? 
+R~ڛHݚ{ 1>8gYG#V!(|~>1<`Z8\g@K4"?Oɡ Xx%O+"V6)Y^dOE_LU0.&XLO<-t"J@zR=ώj G04I1A%2(;i͘0Si"ɝʜϳ}hw$؜z܇LaK)S"fپz|B?D7uEmTCoklNc+a>K|{;xS#,=4+P@coզ\O{k&XG]6]6 @f2t*(ē٣ϝeD>'{o{" Vȟ92ȩY0SB"p?}cmA.'^6J1JT3D#Oӄ@Rd wkŶCxX-Z Lͤ^ FZ /FߒX16fn\p $rMWؾ9m]㱞H4ᩒ*4Zk!A;RЁM؎`aVj4zH"p2Ka.+ _pþ fpST]c?d9"M\>s"o5[@8?H<>1 0q{Z:Ѝۂ~f9vm;q) qӤa8 7mGm@䨈+m_<<=:pW+ףr 7V=.RyXéQxxuX4r)ZɃ%0`a҃z cH:R0KeWWf\ 2V sL40㋃9֑{l9+xCq<$.79z7fsFFP蕹n;:&'y4;h,-Ai{n]bÉ>!cu)1{YRh.({!E퓁%(yd~GA4OT݆O\5↶rhXUYIَxvJ7 G 'r[%.BG0TE,ɽ~F/TNqnM]٘)_j$e1SDm oK]56RObVUTmRَ(35F.!ر;;0IHA B ܡ>>*o:at@O/< ~L\>&v=ݫ:cK.m倗 ) 8Wx*Sȹs!Ui endstream endobj 518 0 obj << /Length 2747 /Filter /FlateDecode >> stream xZ۸TDA)}!-r-!E 4Łk˶Y$9wC=Zks8$()j8/f$wuX3yͷ.ݘn+)X,╄Ӧx+o3<(84o̢u3EႹo\cs%cqou=]&h?,ssŏ-z ?6{dɾ1Gm4Ͳ9hKTHƑ!F5Z·Z{I:-i.d.Ycc.g묛 vFĢ^SݿTd/Uj#2d俸`ursTΠ}:)Y bAanX@F#b<*J \ޮ/.!K r((%ǸeJJxkg NG{\TpO t \Jpחy?.p|b1+.B_A8=jW *ηH'7fhf*&$ߚRλ5$bOKdY={-\,t@yqbW>@-eay=D7+d!r*ߑn"79(BՄ1Q#'/h\'ya~]GV4[hTф|?OKD`KxdJ6ɔ5WuZ%;LVʲvm仚R:bĉt[dQ5tEiWmBEiK͆&nsCC㯚HKU'C`%UglSLu[ ~U7sqC'=0<)IЎiݤ[is֤2 oߩ:$ 0oQj9@ag_%1IUX_sov qVDԩHcq{RÔpR@\ҕ}(0JNZnoZhH{Drg& *"р" QhY !x-8pp|}ֽO>U7ӛ{ڗ;"B E#^k^07_xK/Yd_[?w\S!5d'ϛ7g<'˨Y>*<[ uMxӐi7QuO"O: p6ފc5pA!z^$:{rg2XTq>!@UϚ;63c4~G[ӵ d A@j%7ܙUM g;c`bl!:#X "ƶ-,F}4/38lZetZ  bN24*Bڛc߹xm@thYCp(}fMT6kx pqjAs{X7Nu[ }fٰk^Tjaו] ?AV0tLoVf54bGPllY~6<-FY?}+cBJA؈A!οہ.lۑ|񡶿-¸!a53' ]Ivru:_`+N>9O^ï1z|A32J941C#g˫{9ۇ@Z=}C!\[+OwE?KUM{VTW Nw ƃy1'؛ 2#%m-:r1<ǜCLmuzWF!т>\[Jhjxf2i=86 SއSndqNM*zMKGU#ꌀ&&+-a YewdUEՑ T> stream x[۶~tJ'ot:v&L&NR>4g2;H|4߻x")s܋? /..vL |_`L'#$&A*|tFHLqJ-ʛ"EY(*BW%QYZxWo/̄'d""D)X"*dGq^4@%0Ѐ\FG2\!R?) 
gu[Hq[.q8ύl^'Y]V8?%Rl!*`_51oɝ^11"=gS/-S㰠l'Ee0ܙ(۵Z'Ք iT>JPts^?=JFQ_a4ʀq, B*,\U7^jtľ+ad0d6Ze6fyd܎-އ brv * >]MU}Q{Q|MF=l!d gy$ FuY<5պzQBZU ?=Tk({J(Yql(#7'J+v 7&dUAO {α|dLDa-C#h$1ELF'4(K*v fvП}3FhzRȌyW <>K5%yƗİLZ!0fW5 ͚׿O 9i:ĢI'{1žG.~J h {[׶xֵ=xʝ=Gg 6CnB$PĜ߅es4ܔ2{1KF*g"Bx]o!!&)C=@`04͑Ҋy5CK1G (f;h}qD>*2g3DmLEv, \О,jTw]lj~P7I&Bl^ؿay~vaWVݸ%eYܐtSD ܮK>c[_2+e kU tfـq[)/3RZ,<]4c ZMjU&3MȥJ_m8h~:7`geenB0cbt*P,i!%Ym?WO_!֚k#M޸]ݧY{Q~Cюji- ܚ6kUzp{5>n5/;~wʉL fR9J'P{k1tj͗ΰXU3[3n´\l򷜟e@K?#l e` s=\n~h5{xշ+_:ub~uP%tpDz7?<if7>fLjK8=]H6`Zi ]1`m5?6ϣK{r똂ޟq]e3ܗ|ƤK'er]da~sp*]h ֻ v.}hig(S$TudX' Kfݽ1e/0! 1L¶%Q*OOLPNJn2H"@Hrha6,ŭ*M[6XfmXnYMdW4-#ͧ,ogTFEw;p76\>Ͷg6+Yݤ }o2k,z_ 2pH/Q}CIz nrv$T̥# ˻l oP&먣 A,7yj92a@UitUw&Ȗ]8w> stream xZKϯ%hsė3 .CdHLjn +K$俧*J,=*hV?Eˢv+V ,˼7ۻw\_ͩ]Ѻ=U?G:,Tr?ºъZ}O` S4lA'|R Dtk>G=KyGO|tCͮi-?*1ݪ#Ɠt[ 8 bǁ"LwҳSU͝R^mdTՆKU4U^FIDmWwzDFI=u6;"s p2ˀx=ϨӲ y6U1؎clOE_6%g V1R݆G BSo^݇\0Gfk<tФmsǣ͋Dx/Ye~ƟnhĠV ˒̇2XҾʊ%ggi$8q=hw2S%; l# cT R4%)&d`pMݕ]OV߷*az]ڶlq q5LIdL&O 8&%KEU76cHggkrl5)X/zV@( ygǡpGsJKG*px9xVnKQDLMLxn0Ra0JKni6 KvOw|mԙE }[oF`ԭ]n owtg,xMYWe;xh\Sbo+G k/\O!f97^G0#1\;4 cvL1o1Fj W1xl7|cK} Z|Ahz?6v#?`u5I@7oE­i)=G-CcBUN8<˩cdsƳo% iS>ؗ,[edaPrtk#֐ >?bO܀NoG] ԱQzP "]oU: K[ ?f'b,^H+]p&`$ND!Gci.e \nO¤&9懣;#:}98T6cQa96uwFNsW: =^n"gӖ,jR]nKQOQ] ݨҴSY畧5 ~˪ZTPBml#q ewyoh쩖ptt:i~E^&ओ/c.YˣSMx!Mٯ\>|P ;#WcH =[;_w!en{H۩'nX\n˝ ;±0 ncaAqWaaXq{"P>< B4Ѽp=N,x)]ME-E}7*dsdG p=V: !u,;LzCpa3^ >shLG`/y(AOs8I p`Lp1ӳ$n"a'I,?ncH$f# < [c OO9yy@r/=+.JץtB [[<9m uݶ)b:i0ϛ9 ~14\ @ĿZ]UސmFXV}F(ceךnʐY `Zz!,QsۢkOt:cpt3T xc xmaFZW[Xas-cYQsؿb_׷`O؏UW鋮qX =eOǬm9HaǗ%Ipn6ʹ$6C.a[љ̶Gf8ĚBHӛ13+J, lBX#Y1#|E,nikzI(w%jݾ!J 0}!wBkasL{ڛ(a]WZ+v1*{Y4%fGSO=}'ԐT]whspA6{ד aӝǪċb !fk4=E' 0Mkhkz|0674k~ۚc:[Lh:m#?RpE2٘ci5/o淋+Gvz>q>B,'rA9;iMs6Mw" F;)p6l_قWGr"zV}д4O¢ߴP>ﹴMx$bKp9~Rqe6nIՅT7$!#(.ST9>** <֞ڥ&_a-chZ:KPD[(0 K3ɪ! <|8栵2e: JPc37RxC)k>\хp.:o }NAuw M[.U𘜏s律zx×G&h٦ɰtlwՆ{V3 nZhqŬb/ff%3b!CdbV7x(Si+(Y@=1)JR]'"G?lߐ^p-`b? 
|r_ZyKV&VG;lLyMY'{" 0Ah3-#%UNāu!Jh ui[eUHW^#(Cjz"Y Yb&M Ls.FD6]!X-kn7VhSޭg&Ŝ hmuJ¡R3*"@or t),@D"KKx_N endstream endobj 465 0 obj << /Type /ObjStm /N 100 /First 891 /Length 2473 /Filter /FlateDecode >> stream xZ]o[}Ǥ~B'-`X.-юP4( }Y^)ev/9ܝ33{Cƙ1V0.D5zG#aJFb$$N8 h~K4PH$Rl$2W#y R1p*5MA0QQ8Fp`l!GpA8 bLω g+,qU8*Dž? Cm +( 4h"f(pJؤ)tXej11cLUNnRBrk?+&`Puɤ88&`ɤ"HBESOq` % &dL ް0۶d9UIx -8B-XS( Q){Ah&+*)SgaU!ZA#|YhgR 7BZy`X&'=\Gg _$,s3mc9pL]DLW?Lt>_&U{?||M&ʼb+Ef8W듂991ݩ~YZ ?Xb{= -f_o}#^_m{E]j[3cjYDA 8|bӈ O`, Y-mka/DŽf@7 !؊G0IFh=XJELS1#ڋ$*h')XW˽ 1]:j=/rpXO=*/LƱwr3UK'(`}D/08}9:H# l] I6Lr<8=&1]f˹_?~9]MW3#I|1_yִ{L@rCȴQoȪjb8;j߳{52o l|uBsZ\/fW- lcv~1i4Mr%.ۖ#7fWؕ*yaζz̮}?~VʛW`en&<DӰ?xONv :yˑ`ǚyxI*^mEv yd9'nz 1T~^ 7:r{BJBڋ:]>.{x-/5ZxZ`۳ mFcHr%m`p[XH{cp,bm/F"'LGl ߣ&@nA6@ _(:3oX< GRњhXz|,oʘx9Pzn=E`,8:D)ޑM=l* 0jbMZ]U`YlQe- G]BZk@u,R:nNEV$"= 7 Cu{&!* Acɒkdj=l6-A HՃASw "UZ盅тhq2D^ >Dd|1;oBo dCtlP-J CDtCxs𺉼i=Q y JL$Dx@LAenD\߫t3Kɲe}+4LyqmI 0)U+_n7E O $eN-Wtl0+(6| vt9)rǯ%GN"L-=MF8KbzFHYu;i <=]\;3̫Go9>0'@ xojy{4jy]~<^B'xP"(RTQĴtKPZt{2f[nyϋn[SawR HKx7\B4ф»Ulk(u:SxWdĽp)+6`nZ޺i8Na @6C* `Z|gk=cqT؄ALFXoa7:<ob< &؃-7mTb1"<_)|eNl޽*5 e> stream xۮ|^JCryMopNm|>y="pJܖC\wgggfw._뇻/_D*σ* ţ߽~.&q(/ v 3ϕc7~n7*]kkMS/1=i 4mdy 6LALyt~NN<_%NEH$[rF|x ?-ʼc@.܄Ӏҁ ԡqZe^a]zKxyKkc\ ߈j?\{\aq},.÷fo|o󇻟+VY*Owo~W{=ad(Tz}a< D@0A s ۤEo-#ׅ5Ӗ ^/NJ7-cK)eS}eʎ:\Rx#_&ȫ7[ooW>]Έ6u (ZZIצDgT%{<0 rZ :/4];5\TU bAyaܝ^$I6*90IF6'oE3cy llHbG܇73YNM} I[3G'%ar3$.'72wOA=MH8ofW*Q 0!,Ƌ0Ʋ"*^gӵ9/ 3V 5$5#j:)@11SL9oQ}ɥv'3RWy=糮ߝa/;{99hJ$C1=+гt$}i{=9RsBkA+f$Ӊ}+eZcj9[qUaj?)9UĜ9q'[ACvKtdigߙ<2a2G= -[s8# KלDŽSl"7x[0μ0\@r }t[ +$zz|KxmCPc4ޡ):`myx.vG#k{x<@9E+WnvBVjǒq`Pi{p֌>,X ݞh;|ɡ-N'bhmLwpcQWhiAdXy'1Y6hG4s&ќQRZs5"~>VR&^ێPW-$OM{#?9z+5xP5䱔/9OjI\RʘĈ}WcESM?I \r"< oKvqMIF势!Ӟj""QwGOx#-+T_H;xluNC.>bf`;Ya?Q~NB  qQ[ <]ik=PZ6f3m֕a).p 'p:sQD3ca^r)HrF>vu9D&6U5ǮτIUr-,ꦞiTm;phrͽ0_xsHMZOZO2(JIƳ >֥tO9b QtI< FFUDi'B&Xbp:sFx"CFv{@>{ WXfJ`B"{DI{H t黁L=S{J%}Ktarne Z68^^QI*$(\ji} 1fnFxL}vUl8IO'o\|{g 9[c59<[)Ȅ칩+hSr2wG>m0J +gI#17O&?ֱ 4^ l<w endstream endobj 569 0 obj << /Length 3790 
/Filter /FlateDecode >> stream xn6=m5|RrC-閼-y<_U$j3U,zn%Vߟ+RZdBY)p }g"eaW+_Xy4,9p`Ǜ*:yVjuDB!\Ψq$W׷Osu7]{Ogﯧ%ˀ/=#012Ҭ.u92YQ@̙LswR\.RHe,{iMKa`*^Wg1reQi,-w.e? aE#mfEN%ɟ *|\v%H.?DXq*%Ak`DrF98 ^ԏ.KHm'E$g:e fl\Vqo[|٨LZt!Z:BL߉]HTy ¨,9̡hi+Pir tx4"1j͗ * ZsEqh=lcs#W (g5Sv@=CPwдwj+KLyާj_JYSY|z|8E"j74ӫn(]u"  m¤Bf(Ⓐs*Pn- OCP^'AvATF׬exV,*!ZgL1qB }ojvh'k6;vmM]_'mF׻ 7T;VCud?0XH'*r>SхxXÞ \EnRJ+"_w1~@ PAsFpZ( Gt(~o/Xƶc#mp=Zu0}ܝg$0Jwȑ;ρ|D~%f:lz"MA[Cfg 7:(Πywu̫IwCChQUG>jzĺ9Vm"ojฌeZ}i61fe4PS݃SD-Ķ.#}Iva9r}pӏ[,w iioo2ГllȖWL%@*sǫR*|579rvL*/[ ٴ){8N^A~fyCrֻt${^8#Y'v\q 0O5 *sz V*Ȃ s3ְhp̳<{QH>lNqř9&l݄-Y- @fߍՋ>dLH 8FrH<]wp˖nxK!y'bh"/d t.}C~)$E9H2#G J[x|>tr7Ή#8T; z{~por(A "l!c8M_x)iʬÇ TLg an{nnN wLuD`xl~)Gf T\<݌W`RFCX Lf%L(Vy!% 0;6JGI8rmN*́,LéFhEub~s`7:Eqe eUPG^Qk0F[ 7t%(>6뤝e IyZĢyW%?U_?1gb@A"+|+tD̰$.<]bEHXϲ.gX/]6YY$zT SKEf^aR^hsL a BGTȍb7'O;_՛{6Ԑrʠ)aldiPi | &dOCrY1@@|@A@@&@QC~j̡#%Y{/k"KO47՟nN!rܜI*ZuSY_nm.t ?A|ǟU`!!5%<$/s n;|2]%c:kv/}6ÓXy8ztQ%d1dcDd`xj&!z*nܣ-vCvK@Q'O~86w!i#OWf ;Ta퓨7L(u"!wƏV+FEuOSn%gC_o9h' 'g Y*d,Pߐ.U4*4sbxg @|r^ݝ%vG hwUc=- fg8)EJ/cI; }I<|&9m'_x[,<C4bo }USqC=> -]L/%,d|j!;$t(\o>YN?@ ?n[BEajx☤|Mw8yyn,(wDk>S !/Fu0UH#c&xf _ MhYp2#4S@jB;[&?+ JnW`JG_pET endstream endobj 585 0 obj << /Length 3164 /Filter /FlateDecode >> stream xڵZ[s6~>4c!ĕ@a;uzLgqNh[ITI:N}"@Jp·s,}R?ӜP5iFLffw۫?g@R^e-gO3vaqQ WgRovzXBe 6[&T3\?Wo˜ng,Pݮfe/fyٶU[zAd,g^RJ%W$uҜlrx#@em*XJ >D(&d΀ݥ 噂s mfWaM/(;'8AU´b#p) (5Q4\™ˠa" d4?<0s<CIbDHɟ%PIDdg$PIiYSXPq2BK.:LTMcx@uĉd1NEpqdf']H!$:+z&G\IF0;{I9EU> 9F?/pG\' TUyiG^5L_ۧ=,Ca)yB]O ]18FN:/d#IH^e/$af5GsI~!֪ )#CWt@Qߺ`1TƉF25m8!R|#xdmk|p -,2_Lۭ3Xυ>B~dW YZش®\9B2a:H0 Ks-zw>dQݖj qX8| fZXb8QM ϓ8Q92r}I'p M Em]&\M堺9ڧp1˦N'"Q*s#-)TD~ٿ,TeasN4:Z&}[аy7{ ]yww5}qh a5 槇늴'" E4:o5*S$Խ$#S;p74u^aКXG\ wz< "mrcm=Z{v]iUճ恲)7`mCSlEy!kj?0` );Xᾥ{۔tgت~p 1 µ{ZV׿+:`.=S:*VTwU:[WWa );Jߕ?n-\l]gÍ ݦnK (0]H+Xh߸zq aN7AS+48~ tn'P=V]gtw}";ۥ&2RF.mմDKwgxBbȃ LHPQO5L[0 4~ު%\Efݺڬx<0ջp&vR? 
+ShfI2c<=ɧ 6ߠ @΃״wmQ {6W$޷:a SC?Oe[yK~'!=bS`إM`rnSmtTgv' Jc] ?cJ\VÕrG>9EF+:?0xhܱ8/߮ݶn;;cfl*QDܘyYa[鬭BaA`8ό5 xn %$vWhnyT0$rƈoU|[;fEro]7ׁ_ c];\w`UW& nXSJ̣VcUr>i7}קq;<4Ui`5:^oaOX+<:lTa*o*09;M?6qOb* C5fCEe,(ќ۫3* endstream endobj 592 0 obj << /Length 3743 /Filter /FlateDecode >> stream xڵZ[~_>f"ub&[$8(l6m+K$d{nÙEa. _͇X/Ta"T.ֻ/˝Va$;esb-]HUqLB߼U|.qo>pij)ێ+-/|m)2M9wܘew~Úy N'U,߭l^w<<]W (Z emJ?έGPZ uA[Mk{"̂B6!syb13qp6@^J :3 9vyFBդ: o<|Լ7֥WpOyJȖt+)˕ZRnMYUnqӡǤ6r@g=1C%{ ^P<=K2/@m"&K7*t*ͥ aoL5¹l{K*;iُۄRwl^|ًvϸQ4wBC|z }vdlXBD[ӁݽiedXg Y|M4- 8%a@_EKJmfYqԴҳܠO*^h\J3˃8/F_Fztx`}2dA'D 8G Sp! C Sp@ń}r|nLHN;![ۦj[!5ܑ=#ٖS)@K[ S%K&Az%s.[p '#a:/e~} U2Yt"`X _hے|V73@`{(M c3Ƌ S\\<|dLQT&~qJ#wS0SepEWѲP]ԕM=ԁ18TUox2N㥻*9M y@/m-vz<:zx=B(Jqw*M0igNKMqksx0yG1CaUFrl( TrWrj;X\xk\a$`o#fQIܚ )bC؈ ý1HUJy{ ew9qrB'EzaB)axR074sa""pa9bxta…pf*(+_"ˮDȨåE`z{-g!sU2hXoȠf-^z.,8'>]Uٻn ,P\ vy:Wat쥽Kߜʾ m7q~t|2Y-| :-e|Nlp'|Vp z\`k֋ -oU*8,&kLtZ@熲2\uJ J9>i x4r6~*dpLf g:gUGͪ={Rd{B ސ>cG嶧z ڿ !5 t:`kI95UZŶ xסw7Xy:'Ii`Ya:*ǞiACSdh)6܍nV }p/Ny[ss~ 4Ն`Hdhܒ84~iAZߝݣM[6r7 1/q[&4}_(dt)N;I.28yz7;EM%.7Ѷ\>=N1& {;{M%ԇ{gfp5j E6EOKo퀕p=16d!G#5UNj+Qtg"{vc=O[",z$D}f'U&[<7#JQLW_F)I E9? $Q",F'- '6AB5* d:l0>o r{nd:aԑ4 i#+? y\240(ohѸ- >10]ݩ{̓,v)a+S#< t?+(3A͝;O5,lɭ(NfszYRԏ&Kizq69uGq۱C%WAA'4'*: nݻ[);?CcuD4ED;9grova5 HԻAHvq<wr|yXScӐRn f,bҵ't˾7ԓ=٭(^1 b5xf[rZCP!`.R1k#E,6Rp 1;}e O]gIjrgPQ3r3Ʌ/[o$sϩGiEM(U'#.PP)8[OTρAvLoƤz¬ZKPUN1o5b F=2Q._J)M6^Ntt@_'bw,bC)cP/nY*s:9IqICD8Q"&eC0v# \9 > stream xڥێ=_El%R:}JsSh,PIh^ǖ\If M`ކùww$ٟ?.erp˳;Bߤ\=7W*tzSnz*ltl|D|Lاu]s [^! 
6I)LVZdDJz_7a}R=MOgU}:_+fϮ 8fń|O$t 8FxAS˭;UiA`L]sؾڞx?U9a&SQQfgϧg-#¨kf- }\{V@ÖOH=Un+=َG6l6;{:9TK=2f<&͇ :~88>*P ߣF TORTiA@p5/-?ۦrb@ٓlN4wD1=='OtvG:?$/m.Z&<5FU̚*!xe.nW xS[kDLR t|R$ ]IƮzi]܁g{s9K*mGTK `lP5&M՗ău-]Zx CvVv2{訃;=S@'/_imL$ <|;F< |6'>t.@#.nL t*T35+:3V1RCh:ͷQKJ),.aE"4Lkg-zELpڄfV陼WM11EJ&H#F*)(sNurOA妲D*IzNmL,5~o` (_8=5#˖>|H2~q)`%;AcX@>`Ϧ īmu~k'J{6%x~svyy:Y4)M k&b#Z`#*ۻМNd 6:Gv7!cA|Q"x@m]@TXh<=fvӬSΗS{lGG) !s6OI %Se=%`^`Ca )uuhhy c<Rg7߸h/UXdu=I%x:韪zGz>E灻Uq7x-7tU `nwT隳뫳K3]Z :YCǺ=Ϭam.ELoJL"X0C*)`b"Q)G_ta& mjlUlt,D2WE.8ЕE.a-S:Ƚ H̓Ua;۹uprL25t((W >l&EV1 {oMPI{ P=$cjC߅oD3̋RzMHrL3UVDBaKw|=kbiG0|ZQAD}EP4{^(KvB e$ClqAKH?ҡu; U {)e t\eP ER hhm'KuJ*gb,l  .>CiГhrCk{?%Fޑo?З<ڴfLd|ur4&e[bOb=U*_MӮ Yl_@A Զ4>Z UbcPxۈzt1 "XDz[rT 36a&<7'j&y@HS ؘWMp>Tbp>_cqQD^ymM]CbΒW׃2W$Mj02uB"\a@acs>{{A!? x41{d=D)Ud֘ yß O.I~_Rz&HrΎJ!UfBy,5l"3`'0 di?NbޝCƸ*胣b_)T/s(sXC`^ (N?Ȓ?25`t cDcuXtZO$<>_=wl_xs (` @-n?tUC*ԋ{rvgQE& z4}QDwX)5$ņ$ܦ`_X&vTolԕϳ=,n+ k_xe Tۅ8897 e HY%02)bE!I Ǘ"Bwlr!@ķ '\jozlqG(ufڹo"ILbF!Sr'8kȣ I|A ,X@B\14E@Pz$yYy- ֏p b#ݟ,_ԋq*oK3i1KsJ$P6rӸ,/2 779ךt2 ݨ) FpJ6,-}`A LH+Pȩ`S)X@(}+>klnmy-Nk9^ ജ9NˑVSi:IT "Mߝ7,%  3KtS1d}XC$PWȦqxXYXye: i_~{8x\&Q>y ]=곻7,WI2qIzbw )3#*!fF_A>πb9ޮԘVۄnC}$s6:kɂ0?u0B7tKhNRpjɛKLr bãr(pUp֢(f, v\ko,AӐPZnV[ 51O,O|§!ZWpʘ8E%v'/: b=^]84Ig} WM7Hbz3AjV B$Xg%JZ !A1Bv*l.MUs+1|(7i aeYw\ǍN.se6a b hh$.ER&PyS endstream endobj 611 0 obj << /Length 1066 /Filter /FlateDecode >> stream xWo6_A(*ooևbI"{ Ȳ+,<[ {(Œi[NҮ~w4C#BD 6P4"Ͱai°feEא ] C&If66lQ8:X^ PcN 攣r,1Ӄ+8jW3㷿_yumy43XJڂa:`7tT:u[.VŁLtHYqXѮ}-FmD 4)ʬn!~B+OĺI}#D_?cHdېV 1aMBys e>p%0tp S! 
ޖgjkrC/kd~jp`k1tmz32, pa-k4H")4Fd>!h!9J91id-^4uh-qmGJܢJE6[ )W v%k\952ȧ..&a> *է$_Iϒmgܴ'"Wꯊ$Ձ*k_]:;|U_]T04(}K7Qu%Z\Ǒ۠mM>~)_C[j@eNc o+-k xg҉}Ku.)|%FviŜ˝(8SKI->w ɒzO*t|]2`{S Ur?0ɽ~`a@yBNvgˢC3:]UXi,5}X'ž&,Fˬ>$/sL~b;P-b=gi㋘9C:jO=*EChL Ez@;4LF0 / endstream endobj 618 0 obj << /Length 3120 /Filter /FlateDecode >> stream xێ}A ԵE8E>$A@ܵYr$٢߹Q|vyYCr836X=_n~|Wa WU(HV)0Ym֟vmS_tz c@G[iF},}_㽲T%Izic(&\A-q2K*ٝK,G(Q3w23fzQD'Da:S%͕>E( IJYyA[/RtxI2{0ʺB Ͻ姿2w p+6̻[4SLK~C;۝06)0[ ;a#V8M*z0XW{zaZb Qd A1޹=z{:,zT/-M/y8w]an (3&Tr!vL19N[`+ [J[QozbAc_J!j^y2m'j*swhX4L!tD"~$ y̘r]LD-Z ({^7N4\#Xئt}쑗>LYl ,% ‰\aϽKG֫MKa$+p?Y}>,N҇p+X%JXPޟ;`0OY  ▋Xy(r Nh.haȐ%! 7|- <'9!_P 40+2xNG4$zZN8 '(=b\,gׄm; d" ˪bYř"=ճ(Aٟ7`zhH)nV\L+uGVB*wt%*LF/V}rN$z370 QE&Dih#Uc+>we/1v9D 9WB1-v-'~_ViTLem9B[s57c&!6-uzz-cyz*]!y̠EzvOuo=MIeLUY{vXq-03IsDzhWڡTTRdUb7^߰ZMݡ!tRa>H̫AqXHHU/yaoȻpF!C7R]@+AP4=V{8Z*Li35>ynyy^ 礧{>M@_g,\٭|b\3pcYWʪ쟖T#lڌ16b?|~.&TbYZM?¬@z:6ҙ[ NAEܛٷce2A*ʷ`(ڤjL`E}>.zTOė Wu ;w 0=vCs';SlY&+cw<)o450-𮞜~.i聛mU-NǶGNɔL&aU[K1;Lʋ\K^ܷOg9XŠxZ `;ǿ ʢdKɩIN%׾CοL|K0Py5|Xń A:YݑduMQZn#?]DSϭC* eMLd({ו-u:W7xw 7QRn/e)6z!> stream xZ_ O-&:I-{_^-n>G31ֱsӹ߽(ٲGyXMIIHo6|IM665?cv ?O ?Dt!ĴQBؾիw`G]`5) aQ)|:\L˴V(2HCeLNPșYTILB':&'c:/~6 4^l. xYܒTN,~7jjt,'"K=F}PB+ M^2"ށBx_L24ph*sV͟o~@'H5STǛ~=bY"ͳ:nT<7oAG\YH΄E4pӹHYld۾lބWuzd'jVe;11\L&~*KYU$LeNIδ'2VL'f' b6 rQ?V4#vk;f0(n t]!4bcRyV[n˺)C#tqH*9#h\*܆Ԩ[?As!V>h1;n;s'2M,s=pܖOx0iֆR@XG4 v_{j5C_/c?7&fUB,ѓ^ZlDJ2xtoNО`is[yx(Gz?գ釨 dàL Iʦ1;jmH@1$"6|n bI3"kFz[ y-_?X(roqD?9£#Gyc9vF>Q3{vmO n|\7YjLZjda"w-Z ˅q{]InWdIM$R1.:7E.ƆE0u'[x^knL6~$(6Yؽbr1L~M"}vׄ1}uãƩbÓqK3k}BL=bXtˁs[W@ڌ!e v QC?r.oXy4|OI.B3>Nݤ4 [rޢ"|} f..PwD>@N)H r8 mopNm`5-@7A$ѴTzhz5h [c (B^@@~!%l +y&Dϻ7C =QO6P9i^Oˑ7? aVޯ6_@rG %@N"JIEhE ]wX|O!CN0/Cm0PL~=*H&gW `bF0BL bd١. 
-l =|wvi|JZ-^ZdKML<;/s{9O֔h4npL8Mn 0Kѩay>~52[cwrx:Sz['JEΨhT[ Qg/˞7iXN@s zۊ2# 'i}`E3gFXOI EXi.|$aG0' #q_#Mx{WYUYHHdB(_T9S}%($p @x%WA >{3բh+ {> 0!ͮ_4Ϯ"6O(aAz@JH,d`Vl+3dکK3L\Wn _bB8x2_WW93}g\6js YJ%$CfDp.OBVð-UBaNet]_\xʁѐlWO\o{s`1:ú\SF+"q (g(֪<)\J\Jf]nK6]v ["vpiťmk?Y|YZ䠮B+'D{=KRyflWkT|}y.4mScmMYk\HV q*=%O"gp%]vzq~MۘaXck ]:]s1Eb t^_@/_H/zҙ-Y, |:W9K2IW[)Ph8U}Ikԓ7-l 24~=u?`Mt=j;QC<2NW|FA,Jm#Q;xޑ||NuE0M4(o*8"´F5b:0)n d- @C4A8G2 ]k+,28n/C](;e a?`jBScnҁ]K_ ( eYv+*GϷA.> f r߫s=P@,Y庑5hLk9U^dJ.92Y endstream endobj 638 0 obj << /Length 2678 /Filter /FlateDecode >> stream xڭM۶_[+o&ϩMs%_(R!)3APZu}> =yǕ+ۿ߽|_dQ$WwU&q+Wשj##ojvUG綡j]Zd{"2V?Q)TC˂lL"XWm6S8ŕF,YjE;^7-m:?]~%+?Z+ޞ+]æ(Z |N5bi];ނ6Y 8" 3`*TY?E#S*Y,cJ(4HӔv{~2R2aǡ9,/ \)핌[ ;퇼)4q8-vyӨ,B$_9 !zĪTmW>ƻPJK5?)x@;U Vhܙ=ߘmzI}=7CE9HcmE|]ݰllmE(Y߹b(3IHK]RPfm]spHQbVtR$h ܛ"I'aHͱ3$B& }72%a ~>8Z_TNg<ڏZFpյM^NMUht?~&4IqmP&G*Tyxl&$\5ix(Ì>8Qc|cǸ+  U ˆ13x}5AѐlC c36 ;cnBv.0`уJ rٶG^ 9#Eکx!|vUP4VgIa>jfP dhhg6`%V\> 6S/վ+xuF1Yd.qjF>r/o0}Vssvfe+ct#\GQ$#{@,EVwI}s?" uʫ0;@A,`uUT> B&ص 4kVr~ V6)t}@+BAW4XzV`@a@ktAo@'.֫ԗU`R -{!swԓM1S+kĄ2H򽪟&EI+ˉE6^9epsEH FR />|?AkZSƓc?f G=Tpi |-~F|,Yrkc\ /ȳ`A3@tTE4Dx4luݞ )u(3Zo:u}AAƦ=֥C"3f3Z>1,9jdsm 1㰤l`lMD-q8k2CgҨ,ϓ罣l4-tcۢx9Cqx3HD鴜}4Z'ocBCR_\1&8+45cȀ=nJq$v3q;) &Ϙlc\;-}$, C'+4.vnbY |ڏQ6^|y_PY o܅n`OdxI ߐ_s/&DjkLx(ՅBqg[$,ؙFc? m(Q&_3141CV3 !(56XY$r3 Q}ᗸ߷?ݻEaj b\Swb;\Nc2-U*`d+,h: 3F_wqC Kuϣ5èU75QӺv?S>pB'GP jUWPߢN% y ש;us+S'=嬥Y,=L8~>}s1L|a˿x)H/l׵93FTFϽmEվ@D6}\-CHAgZgB0`W̲ѭ\@SRKTv)so8L|ӥlUE%sl*D; AJ<7إgI~BPf*A͠HUௗ Bso"MS]90FPSO"E꧎AUJy)35.%`ȻZlNi•3*\0,U_#ś9KQN֜IHF2UomRTLmDb%=3~ cvkO],$ƒŌ3qtl#O*7`らH &͗Pi0=/_oR),Ӭ¼-XpKT: @=iF]혰CRc&GrȀe4TW&)&6BTénnV8q2z*|_2º|@#q3|qVoP`#3m Vz|{n^z endstream endobj 643 0 obj << /Length 2183 /Filter /FlateDecode >> stream xڭXKs6WaS;S@n풝Q6[H9Xtp$y/4&_. 
~?]x<WE,bO0/Ň?VЋ[uUjL l殺_gƭj[LNFոmGb, R+_&˟N|p%îu'zN gv^/_^^=}Njs>8QoD<ܘ8 hGUVk!6J å!JwqsbĊΔ۰.ʑ.oҝۘQ'K:u>FNjJ q\^G p hVkuH<{Hʪrv5 i|vM4w^ ,ؙj}\kL3{ƨ/Y9\P824U:׮h9F>JS+E#t0ٕrFt$6NLc\+AmQwZFne=g٩@o 1R$FDhMABg9Q^NjSՆ3O@`އ/vҹӃ%RjxZD6MUJCӖ{S^RO# wⳉXD8}&¸6US fRA ӏDIa, 0}իB],rZ픴/.gپ{GG5NO=#5͗G.l?{S\3ӰD20 6վp㝋h*_ͬ{۴ +5Z}Rʪ'ܪv4f΀pd0""; /+xͩ wa)-*i&5};Ģj"_Uwm4_y@>t\ݬ_oo;6 ThL*JZ+'0w5V@DPTp$hlޒZAJˡ44/զYKcM51qj^=%)0d^q WHq@2 j[^`V¢Foڏ"؇h C;@|WIY(hU=-l$ae Oli.`9 ,g|q%Ķ3Bc"N?aF֘j]jlX$AamgLRFX٫$"񉶵Ydd+4["΢`qO>PH/6ڍfrT*ͱ}ȉRh%| &*ͣ>_ApN HƕƔk([w9Ek;ݾ,]C_ "`QДViUˮĿ|c؅u!頿IVP?"@j[(-&gnADģm<QJCC/gzTAr[m J(0;$\;]yWYanT+,&6afJ 2c[-}WRH!޿yv]s*YLQP>t^xbZ(F:"+* m[7v]):ۅ2 qgN:>MCWF֎$!lI9IJ7k.m9TWt~S< @gN 5F}QO, qO2e+]Fפ%WbxǃЈqVH+3jb2|6U]oJ?mc6`@5f^؄\!z+# }y!KxVL?'Xny~0}}ݪ @HG0p~ppɋ9f N B +oM-m9ʈ(ީFg}+ l*{ 6:H1m,&ӌٙb-yGN7˃A mL*3ڃ0+e- UjY7 _ⱘA4K&bjq3'3D3|?cUE3gHB>щg<ɿڨ?ub}yE`HY$N$Q*C{z'T* endstream endobj 655 0 obj << /Length 3305 /Filter /FlateDecode >> stream xڽZ[~?6fDRS@n "]M2wYrEq}g8C,ݠ@;ù|3Tz^Wߩd%\=>pH%zܯ~x񇯾xP&8ܐ˶k>VkK3&RvURAuSSGw0T(1*ks7]NfD*K@i4'Ԧ;Ѿ9ͱ`ϣRE" 3ڮh&ָratܙΫ-rܝm밥JpҹTǁ[lLH-\U8߈9{6*X;ޏ6= msSLlT~>ߐ`T$rGKFFυ9ueS/3Doۦ]<2"oXڢn蛷GSw:;jC.;݆qBn`c_S \r*.TԜf. @ tVZ"?VTúT,6THp( 8/R"֊1 4B{B|=a  7 (z^]B J-h>PS{<*a:Ee3wɋbQ}PhDHC^Y.kǜG42H`hb%"M7,Y5T=W„MI~̖YTb=݉_UcS7,2IHԔ`6 p3 }VFu8OMhQ;K$i%U{q=[Uey#~'Ԇ6ŭ2!SȀ`8FݙtK ]YP̫>) ;CEwԫv8T'TR04BCH d9:P'8Kj>2Y&~f8JJG#.==L&[ݧ%ִ* Lͱ_y +?]DH6Ж3e4K(!!|. Fs~\pӸgj ȎQpCՠ@hh00h1Qi@ME{ޘKaW`76Zﳩ ,V֝i!+;bڱ0E^ tQ/U,+dzw#d69bF5/0ռ>\0O83а3KO1d\l L=fۻHx}Ƃ⠑hyө[>[ (?xz9_0©};]A*ssd$q}Oebd3OuoYb6jt{sgѷ5[97 XXr蜟g;ĥewϔRcNb#AڼoLɀpo{A=2ݫ f53f`<: pzC֓ ^IhV_jDĺxa"glhK|*Z]]q?98s`B ^Z&cptXdqPS^􎀾ߗ $O ~ ᙉo@hReC*|q~S%q3%'a\#l/О &@ S1Hr8bD I T'0ApzbUcilxF( / GLٟc2j2UF5婱I&])'@-O ȼ޿`Z\C AT}@i4Yp(vj㆛#g\^EA{Lom~;-*x$R dP쿘\4-EzzԺiͬ%w<:铚\ Ϊ91i MyzN3@OLT]EZ̴,[KR=_>@@9*멷3/=b;rig]n &ux-h]@Zb"YuՅ0A~mL;=ekp$$9DY!NI Uf K? 
#0۲ZRIKAAwNB8D*+G-JϞE/:!B:DčۈȾJY7"^_L,7ҋ%yC:wx6ێ95fԎ ->47Sp=cY<-㟎?cц 2œ+bhak{^IYח 0}f&Re-2 TH5%w\i endstream endobj 549 0 obj << /Type /ObjStm /N 100 /First 889 /Length 2399 /Filter /FlateDecode >> stream xZMoGW19{!p(1 ]Zb%R(j88Dr1s5=կ^}\p)\$\ݏ.eGu?*bGr9(iqUMBG0"T ^ڟ#mJr8ȕ NpR(evqGpRVj1d` e{?e'ǥb%`uBj#''Lw8LI5DF*9I0bTpi ;xC )E"Ԉ7#GvЦj.S-.*B! <;V `db!X(N ^2Fي)V[Oe=^XK2'L6hՕp3%PJW \"#źj]=rH2(G1 =81aҴI5&גG C w+XT(8S`%܁:q e Wӈ S!K^) Qٔ!6 dl5fW[FS~!08;|qq;8ur9^]er8jol\sn7T ?_xgu{Dy&c вQb~u1YK5&_?Aݟd9̖NaԼW[|q>Cb/,Jl6P-.vxu+7j.?,Og5?'}[ S !-W>VO!5k~rrfAt OGu]~xz"lD9F/[~u:mS7hQtuLQ!'f31oD`Wwa5 1Ig1o rd# H̞mKyi\Gd٥@vsDL LO,PEWz>*e! ZW[\>@2Q|d X$&/ٖ%rFZu!Ɣ32[- ޷gcNaPelŇy׉#wGʅz+ G%m~ci_W[:n7 >h=/ oEsF`$gXK2*-D0$G}FTWV*_Fp-WKY\bQBP9Sb_X"wϓz>/!:hWP\ˠ ,oN1+iޗ X.oXWk_w*^5IpoiR_+$D۔#k>ϴvu0\?=IB[bc]kFAP㵉C2N)Vs.ID@.jb00!.ĈNOYQr' G0q s7I³2m@yaFl9c@tgd C ZvT_v z_>CsIñm|Xii]tӐ)HZ g+v6vq@L>fd'вmoh}Z>K ؍D0t> X3ΔNk!ah$Y3lmLO 튒H"yuEwxx:yI/| Q6kw bmRTH= V=yԶ_74d"9cGv1S 0&3"bDmD[64Z<5?߾X.o4?-ci|/ɢ.|к*GxaGeis BllY\>4WC[@W&WL [}]~:ƭ$l7HOa϶..* ʈ8FSQ˩܎ws/zBLeS݁ 9BbNOyǂ-mmGi%Qc;}~p)>:KmBwjh؛m{CsN}?_O-?LgW Db"e†vYJٛ2V^Aۊ1x.5F&xu@Q)V<~[\5olN.wyrߓVqGm endstream endobj 660 0 obj << /Length 2619 /Filter /FlateDecode >> stream xڵˎ_A+kv6`[&٢Ȓ#;@>>U^-r"YU,inCߟ?!6Q(6&MD$,E(7jc^owq_Qh}YTnf{jo7~v=) [nњaxno5`gZ>:9h lFeh׮vz_7(rwER$^;9 vB|FIv41Mo̫0;!YG?}TLD jx/46/,^wՊxOG%hٯ;Io9CHOd"WCD DBmL\ˮ*Hizsڸr~  8 J_uS扦mC9ixi[4 &v0+4:۶[Xr1j" B( ~,髛k EGt4f1YvNeHKwTdY7wLS>@& *n< b=6IAat0640LjaeDNuq'l{)m,}8AQ/ec͑f_q 5eǨrFG>WZ8J+𪷣ut2y""Vౖ|ZCqE˗[m͵條alNtof7,/hk(wI>+d\,0BF]~[,}..(T%݉T4QH.V_W-;Bs.:Joyw@;bˢ@JGC}W)E<ԔI.K(8 @XAaqqzf9 J&\+9Ôʱr9~ t.&"N"4A4 sZL׆lӓʻ/,%O[pjƷvC_ơ&Dz'S}ct4 FJqd0( PV[1.c{]h\rnw{g "q!$ /t$Y%%ҟ^c1F0l_/2owlp'PyjF n;e@J'=v˦@20OR?EٽWv9U<&u2P.pJY)'J/VrLcVRDTM*$@bR9PKf1 gh.@-F=9\j5KAŷ"Lދ˝?r@QFhxG@, s6n/-aX'XR0&&}3X%Ri]|SsH6Ip˶O>EդTI`Q@ f4]D4"AkiV-Cac7:1'K^AG> F`W^+P}de9."HlxCc10VM[* !~He鼛!zѝzByFdHuxqFe8m endstream endobj 664 0 obj << /Length 2654 /Filter /FlateDecode >> stream xYY6~_ȋ6#:`ۋ[4dn+#u:k?h 
On4q/ﺼ[WIedsʫfM2~4*AWIڂRY-c]t ' sMW/u:#O"&>tWJsvuǯ2mH]eBhxnW1y 5u`L߀& Ɯ-CW#/LƕbAJ$k/y-"yfg s]G !hCSM΋|DcOܜMW1ʌ_~x?{$P- - $~&_vӀf= k늀 _g(+dRyҠ+vOr>o{{$~ԬaD]ɱccT>""Fy 7[: "PEh goE<_zzcR 'd4o|Jm%kPlȳK&kr=8NڻD"/d7Yi}B#D%g>~8<"qUcA $p+l մ 7/w1icKc!+G36R{]{5QSxrڀ1.8MO+Hbi"~+}'y;zS@KPoEpE(nUJ4y@@ƯxrH7{ ,MA@nޠhCWX{JY|l۪ sn\}D 8NuʟQ0*c b=їR3`B$ V ju@˪Mmc5cLWeKǃ#xe<QNRT@Oh<>y/9,Q뮻1|QRН9 !v:qtrBmUn{FOhZw0#%JT<:V˘{q21S2nCљJw+7(>z!kA(zI&$YL4~սRؿ|Tx$ D^ 8۵ک"9VwXjpZ@1 '41;ZyH ]jL{zit}^|䮅"ܗ{.Us*IEYHD [Me6քlJeU8(EP?9cۆΓĆvhq7݁:CeG'䎉)-.Ni6$⟳I+M9̖#_g Wo)}/Z$84T/#h% > stream xڭێ}bޢlFy(M0E}h,keee7=7wm_l<7‡ÏBo0كU6* q g40>s11ұ:?Y{G޼%*,2:㵫*G&QA),ضv=$^`Ww=WMpF{T#%L²}4I#5Ja.`$.^e1,ʢ"w"&!l+p y>;D2A<%Jt4tDmzRfb#5vhV+LHκHxڱnrVniߏ #aaK~ZzXVMβڝtS{\HI/< *dophX1\mxAiy+šBC;!`{{ T}D-V0?:p,2ȪXS# [)t9S_wӰrLlXDD0 45 خ \ r5X>2 #~mʖ̅e_~ۿd L.BV\u-^H^jDddSFIDeYڏ*&ENim=eͫ9X#\:[{OS[tZR֭W?pDM4&<;!9@WZyDgpOO)-6خ+YJs%¸_I]8)5 =uI,Ez[ϥ>X$w>^a:\qdR8rwFVp-|ZW#MUOs)BwA"PۤO w"z|Mr F"@-^OAQ7%Z[XUe,o=HfeSDkeR-b1- {+$oGYC[6 ӉY"ZdVOh|벢$Gɉ{Qseí'8`;)X٪3* eFy.B6d1CTEߠ*?,J^Pri'BI?/f\Bd6 -~l\҃e-BZf?F%-bJчˣ1!iyDހy_e{t?n*d0:Mm% g3a L88 N vdR z֛ ix3s%ԬR4rj4O#r OݗFNqCț_h Aqg?x b_d *d'.Kcm#w.?JƠ Np vC͖bnRB,E"^אe*fo~gtfp~e{"qM',妢RHfZ0ui rᚬ|F7@r۹qTJLj_$F4v rX\z,@%Xu&T \-Ϯ  LhC!s, 2-;w " a^u.WAQ^Kêe1U̽Zg$6 ybbǼR)!aM&%тM yV^^l xI)V6iPDh}L8Dj~_*AGM>bQH"xF!Tī Qy5SBtC\X2.Q(~q:`\X̕`{`auԾPA"sΟ ebVdWz PFEvBB.@q:6X2D1qZ0C5?łd{Gq6s]&/"U @;XYp|8T4'a;^BVs @/1׉,!V O]ta<ʔCh㋷ǎk%rsV+efW۞O;r闤K3k{)_TܸKwߧdKT}||nS=@bNBs;w?]tl֨^wIFW]#kS09}TRR( UMǠ6[9d-l/]JG HzGr$W Ur&2k ?4k.7h|׸ fe1p =CL+\x,N׹],|};lvJEw{ q_ʫVd;գ #|e|Ou I2-SZq}  {PU#oоZ6 %āxSΓz !ಮB39(zK.:zSAع6A pJm}%~XO>~Hλ@(C{BKE!ĭ'- ,gG@Y#ϹÓ^JGY> 1>唂u 5 HT3X endstream endobj 675 0 obj << /Length 3216 /Filter /FlateDecode >> stream xZKϯmA˧$`w1AelmelɑȏOEݓr)*(-Ï? `he"ePx.|)󩩷Ê-,%:I}i7MY+>w!ݾxjԲhG7@XҾ!8se GƮt:\*l3Ŀ/Rᢥʭ!XI%DIojֿ OeT.VTJrR,]XT>֮j_P5~ ks3Ln^8o ƬED~SNBp"&9"pTtRY:y5{t M\#zYEϊ7M>8*;/]ldrf}2uY ]EcR0C1Jq%; H974X}u 'էI2TT4&nrl'Kz$8X)_-s27x3|$3҄%
@`;0r99y`0'[Gܛ[Hx9BN2!P'md p@=A' d ePR3Pv ?p\O5t,WYO,^,K7U= OB?4AT{j vE xB`("N@  9S#<ᏘN~;'?ۃ]wϪjO+\j+T"N+AmavNPqvN뀝9{^5^N~3~޿_`~m@u[Tf^>ňV ,+ke9TsB8QG !P ޺{aFm&v(gYSY=GnC N/vtV%cQgԓ)`0}?~(C|Pמ K}Rk*„HQ'yW/nEwcRB;?kСUV2"W̿H@`^mmN+zE#f^AV"zx@C۵ӂ,[o"(1bCn=6ov3۫vIωPk 9g{)# ._k26̅2]_C.f ɗ[D>79,B7/lz-48!fտ?-5*9wnu`|,״+(;b =}$Mӕyٷ=#l%BETB-z7[-`; XX m πIesL^ O_e~ޔ wݍZͥ$̰縊-?>*3P!0;{O&pTb/H" 5]iJ#^TT95ŔG*י RHIdɵ3og!RW2-}MY|sT=e BU[e=-G DqAJ?dG(V;SۮYUU :۵|7wȄ{Mipkz>;, ďK1ZTXgCp P/*~Cd 8} b!|5PoYZL{z&UИG㵣"L}+CRQJsq7;8~i#hZj8}ֈb^Ul؜ǧ o;Tp϶ kDӧȭ!ۯ_fRYD/+i^E{͞Ç;T5bA=* J؇^3f-v4u`|83xG)Kh}ik 9>ڸl:b@1-RvHb0V C]x?2n4|< %Okfv!c.hT3丗Wzrn5>SݧMĖD/qT} -d׮mLiȖt47RcbÔnnm4/b Z*lN~A; 5A#K#(ځw}Chba1~D(]d?K*ճwk2He!yK{c2v7,at{\'E 2JMΎF(̗ 9t>$ꉹ_ЫxĚy] N9[A"\W{PƲ?"ry$wbE!gՆ'Ed|K'}¨wu̝mc^<`-vpIW!S'wzjdyfG"tC55D@)f [qEdBMӹ0 q[ɣ 8-_b0z\Z"CRAKᅻ(u)ojܒ_=hętɂ<("i(US?=D|jZ&ZM=wZ&^)M.E]@K 8`$F)QVM4`?մ ht[}ﹾF\fs.^: p`]͢jXp:~8z'in#{:sXTDBY a12 6z]C=}URQX7jknOj"YDty#52`GSGRKeLZBQ4ոt8ưEaf$ӹ{31)ݩ?wh:`gw`v#C&1YPEB@gך//@;Kd.1ntq#k g/Zl*q h9L iH:e>BG"]j@2"2߶naˈdh6r8٪͏绿E_}4{Bkउ ~;k(6L9O'I<&qkVj$A\~ '_ˊ*/W<Ig|zKTGJl'C|=Kۄ#@aؑ&ԍ 5!bj&c ($=Ň]"בQ Uz"[v&4㟄KM>b߿荲.l =.fd3Z,k]BޭpR|Q ȡH#X[ݳpd/'kM֣ c <#Skm$Yֺ MիLqL,,P0>=ctWǑ_UIB1ӘkxRaI\o6m{LpDJ>}AXܧ~SNW\(R`55l+ ў{99DYo:`W0.HKUJ;Grg//w 췻͏m$pְK~\]!@beO$=HӽشE"Ttcj3Sv%N`?oXP,$rP\Ʉ+| "?4EZ"8A`!SR#_a} 6&$ґPם76+VnJs,RMˢTr` [qt-1(g曝_;GLaغ7tqD[so:CIL7gؗtE opPiݢ K,[S2)vhStaD,/—907T̹D1bo_NA!%ѵED^32tgڰX>ݷ {&S.I5<^ߢMe$r+@X> $I72mԥ6Q?]j%p$2{(%}p_`RJUp.>~LM,&G&(/]=zˠ< `%LvF 3<*!5Xf|􉟊m7>dLRmNy9v/;tF۝@GZ5t:y?d hYhi'UJL~#&9W]Rš=VT)0yHgHAW09c+~.=Ic{;aY5'}|&xqn6>REǬ% ſCrqr]/Ku$gAYUf%vQ]]`ܸ݀ q6 ŐkE!#IN ג0{x}![%/h8"VŚeх jM_ɽuWow*;1xs=D8 )BŗwBp yb󦕸*时mTB\ jQGQ~5GDiR>11PԶlgSP;ݒ0w}bI*=ז\.[-[8<66~&L}Qܿ7697Πs/ϐ~#E<([)BYI0XbّDn#ÂҾ\ܧ0F*IAI㿳]A>[W hJa\ KAΤ)͆,QVO'efGR_,98 nTD"25;&؇5Rm:tpgm描mJs]܃Bn w-_r^2 hBib\U[z3LFLN$ ގ>[Q]C FuE6j@ݾ؏q?i6b+_$φ4lb1njtY*t;(hTgQ=QAB5J&?u$P]sb˳w0?N^e|Q K"k8>z'}e s? 
endstream endobj 986 0 obj << /Length1 738 /Length2 19516 /Length3 0 /Length 20096 /Filter /FlateDecode >> stream xlzc&m{m۶m۞Ӷm۶mcڶm}9|7'w"rʬڻ*PIHHE,J@T`dnk#b"P(ڹ;9PQ[Ife`lnm@fkebndF"LGO2@d 01+hJʉP˩lV ΆVF2FG%F6Ho6.8ZȊ˩ ӫHn7_8de?HcupDȉ`jn_IژGml_?G@ ¶vNY[c ?8Z9l 46r_I'1L Nob`k֫Ӫ%O?iMO]NG/,'-,H?o)?8fwZ܍@_뿒ueb$eff"`dgc%dd_p6N?wFזm,R[B}D g+gR/3LF443?ºWs7ݽܑo&2OLF.cWl+Շ]aZɁ݄9(PMdHٗ"3F)G؎[]x%GSW#̚\ܐbj>Rq_ΠQ ?;1ængBDMY%;m 3a͊EAтJmR`X9Tlla+R#G4(}"ˤf6GHE55X Գ^*MѠ 7v``F]K+R>rwJ/q35z$K8U` V&nΚg<_4Ŀ~7\R zO.ʥ0 (:%@?3Jr:ɸ"Bqd4m=N[K~}iƉXZn^m.j=1o d-`){e$8|KbRI@K Rhʱؕs 4b@((k. K2^=NEf/b.ˑ\xAFФ4.+[~(EB.Sw\}k4X KJ04T!9GB;&ÈX6|*!-{7bNsf1{F!|Uޤ"PvñA0#͊omn. \ ] 1 \܉5vwm's`׶*ۣd=,6{'[IǐYi^xm: HTf4=xxD!(T+{?%NqJBk/kNAJ7'"HjNϯBi+6y_1:b 6x ơaV6.D12uvʪ'&5SC{Br+̨eVqOkoH,9[k&;lIk/ o0ЃwT%wK0O/h|yG >Ύ/A\5%c0a=_ ;I.$ RB IjLxҗIV_(>OM1qĐ XpiR$g\#'Tj:ty_ 1YQkJMҽ鞨@ ɼJVZL16.{b <;M[ ~lb6q#$_*"O;gJ3QMU#}MS(\gKkNrYhJ \*+BAɢ^Frd$Hz\ԈɄ._Sj9FfI_6,Z8w#l;W =O&8洵ees6T4p8ēpKݐ d.#w'sSVMǽ/oHtQdbEng 䁴ȪOIhQpѥjhRL*)WMB u?)KK{Ab6QcIgT7lQ XU437oZ zs5UA-`%&b3v$c.>F7=H"U= $}-ݟ(VxI M-)pkM$fΚOv҉Hgr4;#10^`94$?g q2Iv,گv :_-DD!8ѽb214I2dpd%a숷\P~fT;?)WPT"d,+:s[?($t0j}".mpŽo`@Zh6][P NqwV14wXt?Lmr-+Y4a0qZXctb2뱗vڤ_&io 7ו uUSlDA|h0|ۋMU=AX&%&]}{&Qx|Я]9 (ڏG?PK:F>0U^ji"{W,G.Ja dT;᎞Rڡ7/)_⭇ni6ZWeo'y<W7pQ̗_"oXК"ktPve1#X#n5wHV;zV?>W)ißE">E;Xm^ ^VfAp[H!f ΫDEE) e4?9k@Oא4۽oMn7M!Hw-%Q5f}mx+96f ;cp0BRq:p%"1($(I:ԥI3/]teU#9c}Xk7ZfK?CWWY37*(mJ e^7?lNI 0ܼw`ӓA~=>de~όpI4D^I5 >շc.iz$p|P̳ Y|!MPTc۞b5JEVP&z&ąf]syC׎!y=S٪XNc~c5L<ȇ0yB4氰z %qjfe] %%:5lNdncgfj(@2A<$b O:8=Od(b}^XYϥsP\f~? 
Lff=wܥ)]/4عƓբH%Na@9ltuֺRohX#rPREh~?'TQQy)| 4̧S=Rsk傥';VcYlO׋m,b/1>A⿓hi3 ErYOIR>i]W%1<u34uaIVC\v.OE>C3m  mdfe*RiZve&|P FȒpDŐ#P-l_g@3ߏ ?Rh܆6n&юiXtj~N\_lRobodq|ǬF=i5=={Z[ P/"eC=# zƀ /Me'I\"#ͺu\?Cк1 XxH KE/;UVUy].\%=EXy=χ4jox4Ӣ`hQ+$Pawd<%khA@kyFnU*INٚNnC p Je\q/H;shIʒul'~5ߖ!W&1y}8" 3YmhLlD5*jӦjڬcrTuٱ33 _l\yY ǭ{F1g.f!ibɔ,):`Et,gՌ8EI"w|j[}dAzɣ3ʜI=OOS;!Uu*grN:_\' dTIo|%S MMV:M;1vB{ql ̡ǻ1EI;Rl4c ?)#aK!Z̗W;y1 q&vCrF !KBNL V3->JSўOuqbj$^_% Wœfևιv+\ˊ1Ɵu<9ؾKW¿toI sqĝ7?Muѓ,6TrXOlx]~L:dKqS6s; n]O66Ox2KVa,n+6I2s/>fJyHs#&`gܞev)}kMGt3ۓB}< _bu.5,01;Ixg5'8]$G)cLHk{wP;/jrT';zov]Eݰ ൐reS}ܵWօ8͸AZ,Fη *0h3ωAA*H#!,%@GQ#YkB!SdP(w|jN*O qxw)67 CJWVHSj&cLEaskƨXNj]vth^.a<sTcW{)tZmp-FpzG1*b-~e A=XtɼxU~ `9E~I"2H)_!@0Lj'{ A_N}~xztS~$c]h ! R<,V&K]KWu+m3Ȉ•j2cQն!HyUxUk`Irgډ Bcb͌k|]BwH%Ö(0,[+r)?HTQ ?0) ]J]gI {UBKQNk:0l UT]%os!O&_Qc WUDV"-' ɲDYZ'DQ-vSyN)2}rc`)azְr/:Qmdɉn'@ĉ"2\XGY#V'kXӶS d)d }yn&8N}vuD1 EeL6i(=iOthWiyѣ^PhSZk֔C{@n'6p'yo AaI&x}XԬ_OLE05A{] x#Fz+V 6"qO sjOQex8_ 5}tVݮU>D{!Up5^~kۄ k=R;N:K&o:CjftGЇ|ۊ4Z*{/ܰđӂ^=eD;lyIB>)Ԑ> e- bP(yPU>s9Y8> H_ʍSXj.?kŞPv4gW$VNܑ ys.TiKzS1 FN{mM}c$_žx7%DM jI}s)ost"7V֝qKS !bR&H\$׈<rϕѧ#_<9yM.61D(Ad ~l#n+GSr6d[R3T궢aI--4Sm|Un/WkHj38,F+ώU<&GU Ϧ40m]u8cNwylZo4N4QzaJEˠ|_+eFf̒\ ZeBeb= ម{Yh ь@tdbr8m7zs0ćߦ/ZS7,a+ E*Y3^sH R"ۿc;@HJ~C#v͆rP9+w>Y?fK´J ,U2fw TKW49F[+9pu35tؘԗbn|b<[zY]COGP@9h\VǬl6OEtDRUjR9AGyҞe `%ZkI8wC1,-j7/c JΎ$X;}~_l4+mw͕S\RQDX7ryqtuxUioFkOCn;mw#HK>Y5ۉkn jg{glE V]DGĝIn=x/2WsDhTt YSQuF;@2XTWGWASw_eS6,cvKrp^o:CuSӢN)>%!vF.lWkNVqPR]pp|@ p(g&ԡ ѮI#o:ksK_S-&Ǘ'UNF$H3ׁ* ę>yZ]x[ajK&"ߟ#I"]f13Sӥ N.(g[kXAT{nu2:Ybx`(P@ @sF@%f־2'ù'فs+ƪ\vK9d#`*:G`;3 s?dS,aeһHfIyq􁔣ETKoW)"0X/Et%)Sw\hͬUh-\Flb3)?K21KF/w $WܰxiH3a#ERP}0Bu=&. ~5AIkzBsu_ .E֜C_4BuKкF}xVZgp ;YQy+rB״}B^mA?G9d~ S ,͋ON(">K/M0ÖXIFnbߚz 1+VͅzBdzX u6r(y)KO;< `GQޥUg+R <qзv;i5yqQh5<r7z`;h؄dСّ-x< HEXx#JXW+>G_t~'坻4ےB]ĩ1ߴC3OBrC᮪e2X܈'lβYIe0bHoi@#B|=8KjeF/M#mx@ 0TgC ve|Ҵ s{m$؍ۅWZ IvtsM[M9ꃤ+z'6`+Hر?BT(R8++X,OK+ GxP-a&K%f󋠜_\پ:=ޠ^U!qmiyVv@9msǫsI~0ÅrNvsPiqQU^L. 
eʈj B\oei~"3Vy Ҭeꪈh;XF=MR-9Ԙ1z-5^ `{UxQz*C&Fuҹji7/?&5:CU\viT v_RM]; u̜eSf^\A{g7\eKO<@nHL7 xmGTmxEڽx11dTt[n ] PrqW ᫲50<{Cllf~g 21DvJ;zvщmM%woN4gkW:b?f;ҥ̇hB13~+v/6Cf!Y Vӆ{ՁȨkg;/Q)|a׀4: 7^e` µqYeҘyl'Kd!pm954Tr}ޛũmNY rY*.̄p#^cQ](va;\=Q L| dL!.R;43QO6+pa] USb|189Cop_}|*o1y]Ta"ˮ8Vkh_-=,Ie1&D? %S@-\$3i7.Z%?X_B`O?cP_o;{ |qBJ@.ĒrA%?+3<|6s[ّ+>B&Fj଺p;Hck9p.I%?0WߢUΌ[ta U=G?GWD}< "TB^ͿSevLOC`.#3,DUH4yՔ/ꨚHcBsqywϭħ\A ljMtr mJ8n=.|?C/N'%Q}4fk+ߌޛ7\&ѴըFb4CPN vc789X}/wcvH5 eQC.Q/nK8S}>Orn- `Oȿ?' 鈥, Y&+0>jkChb-Bvvȕf=/EokS'psQ 5vdF ؤ6 cp^2}NLȷ]ڀۻB\\s7 |?PKœ7k P`.oD)qn<,˃mq#(0(:m=wp9єj%'|ɣ!RlZO_$I0 ef/*x8)Fծt0~'fXk)FQ5#%PPu<8JLHrp-ΣoS=WNJMAflSշ!0S . ` !Q9ySP#sh“7!tS銰sqհ# 2\YLuEb {~OxᢨQl35Vg7@wo`;}BDh%cmKMko=j83y+CU ~}}wej 0?;0X%݋B/^LCU־\2C_qQǞs^<*$ORY`cY כ~@S[oHv$M8uzrrZx$ƒ Ta}B$LOwy(ou{o9ȍ~L}eavKPEQX-6%[<[e-r KbI5o 33獆+rƃꂲ_?N]'>,[@;3fЛ%o6m|՘e1 .c, c\_Z@(Y$ұY  8:*Ҭ,mzHѲӂ $DvuN_Br♄{ ]ld)BQFTU.%|Jw1Wr19Gz~4>L .Q5X}BkJ {uS*8#,ў'vAQIEjWc2h$i)䪋.#qeşyB@-6{ ¼yrп?̼^FG5+\3H]÷]BNR꺓ugNP,LA!ˌ}w?5q'(&ZN2&;Fv5>r%}3QrԻFsKu^*HaK[ΊlJZ_ MY?p$G/f¢_KS KͿf=Y 'sӉ{NHBLqw4nwǓ DǑ١}; 'FwC=lzE6 ݓ[h%_=u#] q|.4edi;=TpVΈ:,YX0(|Os{G9v'iΣ`n(]X`f힋"+:>dz+#11ɱx%,"~'EܯA(kHj,KIp oe},[2U2d"f?y{2y<-&(O ,Ϥt\QVqO .5KLgEWѝΑ+)oC'A=))^+ i.H}5 ǟzkz@*X@l(*|ш;~̖KɨC}ѿ5;;ya9&3!i4[h" Zgj?:GD_Z`T ^;F06)_䱽%jS”P#6Ho <9Q~yKm66]{eC}o1i-Z JQ 5jz yX뷜U[v~Ki-Nх5+i?7püN5}_BSwev'-'i@Z+PfQ28!n9]>6=͢‡ܾWF ep5b@=eN`6֟USA]*M*ƥ= )Gb+,ҥ*_>QJO}\WG54uCmr :)}Y 5feR )Jي6F}dUKBѳ=.uXxĴ1ĚЎE`&Y8A~$R+7K0 9 +rXjG̝l[Ea,\{l5tf}ʨ+IJN"h;2Fo?63g%-nmYÀ~J|$_x{,VOA]I@w%*w豔w^U2LPǎd0gNHoa:+DPW"(O8Ugn/&b&T%\\_:67ޘCΐB SHȩ j>*0ŭ3b]Ʌ쯾G {ח aA9/d",5)U:1^pQ9IIѝDiH䐲oO kVA1WK@Z#:Iˮ/~ 9,X'!ެtwQ1-qCV 04YLJ}S"e:i.0߷ؒj˟#*o L" ˤSc{7EFeHN=58u ҋԪ&OӸ   {_K /ΑmP<:iTK+6@ -쇸~>rァF:+,OVwyȢZAG>H'<3-VuF 2W:I-pϫH﹏> xH:{P`0 w Qrr3oQcZjZq?vtDJy.·INdbR:xO/)9"z_ewޞ}i%7jc?,:ϯpVvw0MX:|AFb.[ڰj~J\Q]~MV_YG?Q_p@H_)+Ҕo-IzU]3RhA`7QZx匧%4${mn?B 2jxH _YEׇc9DWp B=`UrE̊huW^ž@Jz `"tv7rVF4%Ӯ˾dQy7kfoc"y&&GpF8̄F\3 CubfߟS[ C(7~abct5J0-tx'ʼ?I/gLZ3#W,,t֮HS_LT%tZqa+K`i6|iLV5ca' 
E,0E_S%}};yZelCPXc1Ofm0n?vO~ҍq:GYa%6+)Jv+]#`StI_rq[mjeS34鍶̑[d{CHo!o߃E:dU*^JTuUmtDH:k)xPp̩w`S/Ip0 s]BK6~su4M>X{jױ<{!s7yMEKdćGȽy6F,4B)wu᛻}B];ti Q6iV19L`'oV^@֖3ajH4te{C}̺j;Jÿ?Y¬ӿ7OyHJP?DUŵWġפ5L9>dy K\(ibDk|u"5Tx8$]۴h/I^/1;c ?1m ""FOx:}#qEVpy▐.O(&0;VDzQhEu1v@NS𻝕(B]K0&\̴v-2TM i7v>&U\: (A@5KGK8X(@iodTF3#<4RCr<'/*jSRI8Ra uT6Ex&e# endstream endobj 988 0 obj << /Length1 725 /Length2 27938 /Length3 0 /Length 28472 /Filter /FlateDecode >> stream xlc.]-\]me۶˶mvm۶ݷύg2֘c͌") #-#@ILY@ MJ*hblag+blP31(L Фa;{G 3sg忍Ukc  G9UɅх$%  ,'!)+Uؚ8X] -F&N&S;Gſ09?Ll2ʂbrQazea1@Zt[g'MU $k G$:hFF3_Iښc6v@NP!%hkkY#@/-77FH:C?1d$fnb,oΎ.&1godL-\ln'쪱]Ro)iEm-lJih? ,Aߎ0?]@a_IE ٹ{212hY9l? In&w#e;# Ԗr pi}eF-$Asc+{e<7XCf"X(0eb 3w0k0U,#Y;yI̜)R^$q(~pՅG[qJr oX}jYkC׽5м]R,=rP+Ka- 9jTH!kbJS*R=15#=x;OsA*K[yPH޵DR9{v_`\|(as3>(T![2 ~;?2 z_y:n8D O exFrh.z fsF*rNi^r}}1If: hUl zF&<*-#6 ]_oH]@Hi^+ C~j넇.t|)H;}NXzZ"QCg%3]%>l>/_]tjej/e(YVbذ G;ŋnß/42<񡠙@,u1F*ͮYͮgPF8Y3cd[g&a\LB'(ق<5T?),߭;81cxs}S ?|D[=`jK.1Sk%4 tp+35 5SҜboQ{d([Npx^Mz{W-@҇ΣN)jpS{tኤ[ 3N[K_N݋XaCp~L*N|i2@0=r 萔9@JpM%JA0e۹"?XISزRJtoE(ZѼ83ڮASrp\},#c##虫XA .rΰ; QZzJw˖D턔|@8eo< 񨔁Iȫb@ < :=RĖhUAU[4XՓNW]'ךD6 BB7yi M(hPiz JBKe\\$fG|A;: ݛU>I^,poQ_3˜[mf7t rԝ$S$vڏW53ů%>$7׌ƸylQ9=}7`n?ͱw`զ:A@Z7d|XoFP*Ov/UӅ2@=QV(q 'Aa)¹- J6 'YU&}rז;];Z9bPY"*^WszΊ+@T]sK3 ]X:aޯ4b)v[" O A#T;w_J!V\wuӞ钟i.l?8Lp]l)oʚ \i|B B.iY*&w:|xbsI"ߙfPS?@*nM9R?/R Cѓ7^ݯ9e"``:|'6\M|B.5d3̈́Mᐉ|?=9`6Oũiv\bƄc)&fE-n/0 ?_I/EK\ӫ. <uDQ tC2{ɦ!/ٵ\SFbӒ6-2LJ J9ޑq 9+*n DgWj¢>*.[NG}WU8݆c;Ӏ iToH%(4Z^ qKPr;yMNG Q=/S'?ͫ#ACHA2k, raע!5L;SW؁~~{D95Z\oʲo?,dꈩqٛKO 3eU՘",E268D(+OֵťQB@~sv7[E 輤Rs^ tFhuGLVC62&qTu`Q1ar (+~ ilQN͵|}? 
LBgN30*Ѷ4Y^Y꜎Ih!֐; l\ ]B3V$2(bPO 9B:BYw_f^;:jJ`H8T>Dfߩt]5ab;r]6ō.'d P{bq7n|Hw.to_p7Б /p%54ll*VwӀP%є0gOAl?UQ%٤HYRnش-)B̳t~.B㩡gj֑`LnR.Bh )c+җտBU;rfnhw=pjCF*cs>/ɺ{`*xn,6OA]9^B;7 ߟ@) J<; n[Qdl<ȩe Γ+ 6ˀ^@{`6c!OS3/kp u0q9ٲMkn5ބ6ʼ1AUCtP:-qTU:;b+#u,%a:[O20pSvq2I5i[IN.(ٜ>;~\̤Ӣ 67~4=YYD^HWqzbѿS_Kݾ;_W"6@=;ऱ,@p ph{k(=~ǎnDTIoDJcAB ۇ3} 3 tNOx\ȑۃ;~Rt)=p/Yr(7D=0[no/xlQb"X%m7HfԆ|JQK+#ɀْN 7.&*09-{9HW>83 n!s3w;%\ ߲ٕ$ ߮64u{ Lsׁȋv=%DDW4Wn 6ܙ}Zsf\~5YC2&"uiIȶ h)qQÚv?: @cr-s5vhd/d(0( !܇iN+3leRZtoCccب9;L)FY:)X-X\&akH4՝7yAcZU4*b+Iq'ke^q?;ns+0H3|M!sD[Ul$uM dȺ$hǀ {`~ы2 F_U<詗AU. Xuvȫ@߽%fn^š.p4YMM)W J~-t833G|hM܍SKH!t j/4vy FJ0&Gh9[t5,MWzc"-L*desvR4Vˡ7f3dK eOQ7C!)EA&m.&Ř9Wm QvYeEUeGog}㭗,݄MӥU[X¯OIsU|D,RŜ #-fd %DHwGi7#2>,^7X%s~Ij!MmG2eҷTo+ |S⪺mpˁ m2Ln#Zp;# Pa.SIBKX>JmN/"2 +EW(ʊXr\RU:EQ!XHt]\(u2٨n͍QQP%bBEhM hYh@&0n,zm5=%2^q,GverLrb.GvyA5<hP/z_MhWMV4EڒF3z 7Z`u_(ʴW!p J62eIzqozԴxBđMG黂埖YgSvkUP| Ivj:cBQK|!m? {zqy|b +#Q@@Xy] hTcM{>wr&>̱E'"'~GcJy|>P%̙zm rdMl{>RPh<#ů(%7d&P'6gUg>m#x3pÆn:Ϋy~]}[Lk#h|ey>m8g@*V\ԥzF\v{[Y$/Y}e%P-Y.v=l{掤L5Q k#Ә,3uNRe'T`~R)F jPFqVQhɑ)[YMkXT&}.vYfz/O넪2aDX@7{CߕSLܳz_:3\DT@@{( n) `NJ>Q/ j)]869a?_D}1ztT%2 aR^N}a7Pe)dvUk@q_|PȚA# 7"oj{Ojz7!YFdr D:eUmkӸW׃vtZhe6m⑹o>/~5zq%NӌP-\Z47r$n˼7 9gҖ;>2Rip?K zE+ M1C R( Uj!s#ps]jBĐA(l ˭D{(ɭ9B6<8SFS!bg;|u;9s=W]T~~q"O_ф.LqQO8Vc7Z@T)ٮCluGw x8v/ $\J >7&8 QW* h)h**Ni}`0ꏐi4CKo'I*,Ŝ Jgkljb?H ?qV{Q<VxO+Ig䧯bR\*aqWIWvx'[JBZ-Գ9_+c"ӍpSÚ[^ʄ0PY6cz5Ĝq+7GľT5Chn)Jif8Geh+`< &ǐw1tz3Q1s؄&'h}&RuX7eYtSR(s MCBMw\qGԃ%|! 
_/`|eg?aDgl1T\"ȷI)R3zSN4L26WmH[]g< H!0yLB$ (2iԦ}u8>nh_t@h(d[dAu+˫nC[&V*JWx~$)C|}gfR.sg(M*Rz+:XT)23-u梗09sk~lٖ Ɓ4OIjtcy\Etn&>qD6 pPZ$|jV̰K;8 MBhۏ3^X|k-esuPd(%ݟG#<P+5G"woG̖gcn@~'3{nG,9@Ja!J(VtJ BCOuo}}jZ#Le[\w4 @{#Py)1NPx~D!1 EפN!ߒn/ jx&yy:}x?HTU_HX !z.uLy $⻄{c> ]Rz;Wb@YVZ:LkH֭TñظZbNآ,sڪx;Z77Z6+LBݚpijBOpB9:Tf9CxXFI^%N8e$Ң,Yi饷dXko}˭DJ] {Fm͑dVx\%2cp%VN𒬿.s@P;/e[Џ׵TKݭA#ZoRy%+gtpf_r]Z:4iߞM u_K9{YKLgkBl4I-aoݫ(F=yc,ք?*Ed yɉt@tݦqJւfR]!R¦'ޭb {䔳%NEvsNll=B)."yA?jJ^̦y 78IV-3Y^0 _X pqC9(X,z{o)4ySťa\A/r=+[Pt82cQ }:x![`Mz#jrzzP=2hmr{ ߅18w.Qr1vצ#UG1} N&XHwRv;ۻ_LHz#m9_(=~AȤmPhuk˅].V-]gOMyOwr?oN(0bg2D{Ȝ`HGF;t%VC ?X^vF;?FB,O qŎJc,m,JR-{|8!gC͑ԪsOXR` vEئA,?uRК`tK9i#ciRf$AޅSyve5Aȇ= B qTH}0 ˆ<ށl+tK}曐 \. D~nߍR}B>+(g_^~4NYY0l1r9Fm~5cJT@[_Á"x*'[ E ?!|.k:-=u  %lAoT'V}vbIG7q4lY(}WfeDtbKdNPM<;ݘi X!K|@u7̞'PD,t`GEyUjtM1{-~QwOC=b#:0ZfaDJ7VZyLuv l;i^棇c5 G6pӧrs-t2nwgK^]~pvPkAqeߋPt_u=,V??Z-q^xIIj' bR9%`6J-R4䥣nNGgGز%O!oۺNjHпʨhӾ_RvSo1gQAM7xkxmCH\UeIlD^ )點E Gՠ90RMPBn<fQ:2 \GY ^}+WgS(T64}6;g1>#p젌zˁKJ @J8iv5{ _2ci,8|Dzb.=<#p $ d2S|?4...cj9iq'ߑm"E!I޺B5bCj@GG;2["f&<&YB3m {M$1l+u2i^iy@nCWL3l܊7L0:z O֏dH$ԝj|6^YްbuZA([&l@v ng]ֿ7砶𵳟$-Ym[dR>WqfY{~ْіng]mt/mڸT4Na>ysL3v1 8ؖ!宪70%'R&J?gb茭1s(_[kbm%"yݖ$ l[eepE{i 3DO8іduC?x$7v{68:_tu2Q}J܎09ާ=**=B{|*Dˆv rӲoPbNX>}g}rA|AʹqvWyC1 bFɊ4cS䰍z;br6@u-de&j8:C/O?yCoܥBјR@4C? t#UrF$Xju@3ݏDRyU0!clTj, L2;1`y|de!a SN<&g"pi1"o4)`GR:FZɣ S~Hw뚽;5aBADU{Z&% _HjBv#ޅ:=`9yL|_&O4ޔ̒p KM< YZޘ~\T·@GfY8-SG5UFHuYz5C0>tK!$rfCκ+齱"*)G-n,zsx#A2yr6\oE5c,oÎ⼌۬86TKIHOX`VPm #P nt)h餅C~^K>Ѯгu/)ןc#"Ss:dQފ>!d|[,ǁ|U!3!%6Xxn~ Q/p`^O8l NΣ DEPlcJQ-e]9XWA(ct3<;qu_;R&Ɲ'oz^*P<Ul(Ü{ 1ON/@gNVI1ůٶ6FT&W]RLj׵0IGq^rxf``Ǯ4e؄L- CSYe `Yf8/.ܷFb顑5>@ np唔u8'tL(Ԟ_F0Gw:2AgWnfVTٚmZK#eH(=̴H*A㳱B+$)}&e /tS7N"ZչɷȚ6ESy|UO3mw{~WZh#2nq3zVk ƅtu͕0\mcv:m ;ӱm۶m3pVծ]{u\@n_ #U(Cąl_`RPB ,}v c_*&+b.}W8f+q_#@R-/8ӝ{YߋK٢ՆbvSofIlOH62Ago^BСh] SC}U>Y~;ġ4+ 㧴_..u.Gl« [+Kw08Vz]M7̧Ҟ8@k.g>  t y7n^hj0\oxs tran)htacs%|e9lS{'UxX5F kW@߱ɭ 0. 
nwb&9.{Iji%Eڹ5qf~\QXDpOWuwm[z`@zׯrڣQ/wmCh66MCRYwXKQX085+`q)O^j2B( L-P:kGKBA_K=m$]wv_لVcU8{A6%gg= &uU;Id8ytRlJ?|Rz{wF=iV$[BQCdj+4G'ڒwhA}?t)Lq84ZH.Kɼ0 J ^kۃM~5CRPy=kEP*8+R1yVlڀWe)R;yOH}f7Dmsn?lc#e1!?eW@T=oKOka<,Az+ ~al˔N9 0<{LQw$ -ܯBUUSVD#4j0Tel0HDÌ?gKa=W-> edtR [-8Mwҧ_i65EC+z>*ұ:2 W\ؑZ˞cŻc@3x1^C(G|_ fb-#t@v?,<,hp[.dvjWS\v1{2&`$KlˮM9K4Z}ѨCtqak]Q_Uȓ[_mX3Ç޲ Nƙ Y"?<*iixmWQ+/ RMalh/eo&&*Ȓ2e¾<1"lFx2^x#x `P`/v&uocUFљ }a NVE"PxM-C>W#؋^Wg1D>vښ?0dwlbb}j l˸wZ,4 rWPv,TfA EeHZ*`pKlaJ݆}ySfHEly4J^LqJ#R.: CaM(rUB.M `XN1́쒒T#~ CN 6dIʰmA8tqN=\8L=N0:}O 9=)4D:#ZveFzU_1ɼZm|ML-мo9. uYV;2f( pMZ&g$%4_/xyR :Ȝa7?+C6{,&zc0.T @,QV},5qgvU jk곆s\Y񷳽WJmuo"2a>m5:Č~Mߤ'Ћu/76 84a_9kp(@; z}@,/e9NqPi^<Ǩ5,CDH/~%bVT`\h_%0aw[EL+(ft*i73/UI8 I PZYG>(;&-1wi,F;c˜rQ+b'QXa4$ v~ΐ. Y[*ꌠ,02u&PF 'UemC-.֐5x5ֆ$Αn0fvjRʈf}DŽ$iҜh#׵<56mp2Ovw#}6z;^}h.BW+x3˽T8 b3me '/wdG No=3ׅp/ F%z&ӑ_=oܲCPh(YAkugĦ+\'^Q?IAa8Rڲ~aT7}UQn&(BYiv-.dpЕE(߯ŽBtGKe-)S>Z?ij8PbZ@lC0uVw8iW<d2o؊SMAB2ϋ,Gܩ2& 4vC%gr2 *ȝ5ꖈӝ%?ą%hvيC^5 s*KHsc^ 6>/ INt5K)4mU/Arg} ٢zoHc1Ho\]={! ET*Gk_jDb):tw&sT <gX8Fvh/8QRJJx[xZld cJ͠ݪVDRP-i}ҭ>+|v1A%J\&&qDd4͝^yGPIʚA֘j#1v,;zSnxoB>~lHWf;y@EV-\\a{s[ԙ^tD .I%Kt)c /BYؠUB8ǠB0DYXϞiW6WcR-;fo xa]TGH60dC?FNlA>!"9$v6wRS?bWqBuNg̱@"iLlv"}reqcM^>?T!@q8gw}Ww`|#(t l9~A7GXLi/Ͻ6^R#!*8Smd6[?[{ ËndAsJk#.f4ee*XRغGp(#[PtN*܅?z%ޚa644 䟢pT;S9ZH_2džܝT:/\R_)lo@R"mV ۞9 $st:492,pIn{,$,;w ؖR^ ôZ?XmgCeo_Cأdx,:!_ݴ߄)o/8dLO!_J? 
/ң}KfӮr7r-c8@25--BWI2'1.lw羉R;*r#ڥ <'a:C9dx !\$\"=a[0f,tS\H7Q6T1zag:El;dzs y$ljpX'bA[{Wq{'/GE;Q$ MQ.D",BjsqU{1Z y_oî8Ebݢ"7bzT5`DUhG`]ZIi{EsӍq\ϏD!s2/aB{gk·ZK_+jXJ1UEN':5ٽ= BVAѠS4gHHNm,$UQGr4~~b\rA2xӒbY!=i4?H4S-~,W9e&,j7yX wuS׉=*I-u`q ㏲@LDbM5|NZ;y1w'ijdބuu/^*rfr{A='܅Kddcu D2ߓ%,*gdp7=-C]KdF ˱ #NlYi%ކ9k`Jvl AKHeR Lzr%%cSAgȞkD>GDc`MAlWQH~g_6/%F|҇W'l+ J`5[$'1' 9`r/zucEhдT|< C7e%H&o+^3f.xƯ#æ^ i.-Gb4jOz/exa?m{82kE`ǧ9 $d5n+TPhFFG!TEaK$ jg_HaTb |iQe"JbWr> pRW;[ɬ{E,E԰1:0dy;;Rӏ1U+"@p't~F7Jޗa,x&àP_lh!L?+̅ZW}Z BukrH-vY?"^E|4pg%צV1KRṜsY Fgt T/NIXa{߲m~ڃmQf`ԭi崠L|ꭚa-"15>9W5ѡ%!Eɹ]4lv!`-Dh?YY_x Smj-ܫ@vF@Fd3Kg1b0!swmGfoҺ`?lȀqVY J* :dL4$Yj>xD DŽC^lpY)2!yr}eE0dɛ୵9O@8}/J|nI(C23Yu衧3"C r.hnA7f+-4TB*jl6?[OnU7ı8p-@k^=p coON$!L=hS/-Ӕ]3@cXbcu]c#%bE$z0hz2/4>Vdʹ#w/EUX?E?ߵDǾ&vk}0VqpC:ؖt" `ߪ]s%Q ;,k";gc.gUP?5?l 8 8)~OXp>WB67EwQ9P>'5-ęh(\r/28~Tti`~2 ,gF蒙V ]U x>}(Ϲ˷nY;xbڀETw)¥8ǵ<5X2̩- t)k*wk|VbtYVDpO2zeq5>.0 pؙ5 Ϭ$h]K8.KT <>oq\کo,1G뾢j;H&TB2D'K|K\h=lUa#/H(}dX>tS#aZ[%YŞzH~$m&*?{ٷXړ̦ddԸkL'_ImB!D5)@U{b+nYIRhF\yJ!̐0OQ,*\@4$(Z#JRi 9yθ8_9qRG_duqUA\iB^U?k%w0g-Ҳ/Lg,Cc~"?p4,+OKS3}dbh dp/=0r!/ˣYW0WS}wkZ.@ YLlWÍ7;N<dҘÓ'AݦTo !-~ V~Sk@/+m\CPiAlGNŸ1jm + (m=mQ\/26EsKv`ǧ3~fcO&?{|XڂB{>EI^V܇][Aa$k/͢\bau_CEnF!#9vS/v*]ʞ'T, 'nvۿKRG[@,7֩9`T̽_^X9/5G,4>5,"}B /pP 0MGw 7zD Ζ8[O0-X^{5F=$6Vs?m`7Hm4?""@k=sL>{Y"xVґr5WɅܡ`ӵo$&B}\+g(seAARaf$x1:GW9Lp_E2px *.9Iˋ%O1)1KQ/y7澦x' |nP<#ѴGi[vgR.7iX-7l@= -A0K_X6 x(pѹԈTęO)yd{Xd1ԇ% mM[ue=J&[%)k v!AD,I}r=‰F&3L.3H͙O8>Qtd6V-TbfS6FVu6z\irJE3>u4ΰaVfܭ!a S8] C^DᐱZٝ>yNZ󾻳>A֏z(NA1Ljiyuc7si2 *c J8 <wZ70Nަ51 glK3{ƒ१0/GցluAůH섊{b~_ye~b ( RXs6CjNȪA%~&5ɼ)J #Gw@ ۅ)n_%A A]Kc.1]vʧ9vzg].\ԫ/; Vs<9Ոsث5=+t1YQW*ݖ76ut؟Wkc} f:h-nиDj:=eV+ IP+Gf?) vQ Ǫ>~\ղi|"0hi& wV'oY`43&R^( k  SY2լ™(b"pa]{ -]J;O?Tvu~ )⊳^nfe5hob 89`8MD߬3bӑı8dS 3mpHr w!TUD%%"ٟ6O} `Ae-t3 endstream endobj 990 0 obj << /Length1 725 /Length2 22309 /Length3 0 /Length 22859 /Filter /FlateDecode >> stream xlspo-v~Ɏm۶mٱm۶mNvl;9zj3{9=WUwZDbv.*t \e1F3 ))@l`d0100<,-\Ɣj6&N5{{7Kc ++?Iʦ S)@X^ASRN@!. 
73u2(Xd,MM)fN v&LovnN.3sȊ˩DUv&\65vqWu4Z6k`bi025nvf&?hH 015'F@!lob71uK{w' 5 .!hg| ,,=LM,]?F?hϪx)ы*H S 373( -A: Ʋ.Nmÿt_-`_t]L\-?djaj jolZ+Z4_ N=?kJ6{l9lat󠂓?`Ļw+Blg w25MVܷvC!\L X6%cs#Pj%+VʱY1ڠsOғ8.9J_8y# =[Rl}Š0k+amMn 8c:ON B_*X3Xe+(U Wp̓qbbۿ޺[e0mZkA^`g)x{@-~%Ϧ{6zK t:O*(i= 2Ch5-I/42l^Xа^~<.&M**̚^Ʂ[PoӍhl־31c\RDd"RS%Nޅҟ #Y?4fVADl- =(H&wYgWJ%VO1;Ƞ;.g\)Ò2DI˔ i1훱j8p켦c }MLEWE39k7Ze+IfInms&fN6ppSt}z)Ʉ*( i'IV_@81}Gf$֦ŗ,q;i\B3(mykߦ|{ٛ936}xQe>:dUFk@ؖ4~0 {Ԑ"jTGuy ˮ渨!F%/v:=Mxi6mKTŸ/ Op-;RyFߵO2289I@e q}e`>@WdE6uYhZ $%>omi4]FcYX,o_@cfÙSAH-CzLy`uřmb~bmxZ-# S%ʚJփnں5]=WP2SYÙD"8ZdTmr' \oWN||{gxq) Ov[ ,>Z#eu)kGy[I $+ bؐ>.q{=,r_hOtoH3Nڏ@ fH4T_s 6W ѐ]Qބ0N*y6?bL`+B_jR9HR(PP{{F/sZː篳WrnuP_w93i0͏駅4 "$b`oܩO@O0Ű()E^A\Xcuqg.g f,'3#Eڗ~bL n͕m9O)̜&(EvW/7{ݨŌCc`' _}?G7eZ&P! ~P ^54+ߴ3C_E,/!poKFNRXR {G@ad>&+1dm )h[\JH_ Ylqh%Dfj}V"Hiɇ!l^BhI~=΅I6wBQ :w-Ix'߾nVdOs'P'ji;JңF0> Ńh08$ C4BңI0S[uT pݭ"HMjWY&c &I?;bZ-r0ag 9Q.V5>gkZx@ꇵ6a WavO]"aRP~c|4xŒ⒨ٰAefLQ/? Va/ Ntld&^ kEoCX]ym*Z]FmDg}wo65(O~?$smooJCYGdp SWǔ4,ʱn'D#>(2d-"" r—-1fr927 Z9`@G_${uS 3깇rh*` :L(g=ZD&pbq])!L>TKFdHW⤏ڐ6|㴭^$X  RR[ԸۭrmvtiD\Cezk~O5lywcWn?hn$EK:k ӄEW kw<|Ǔ )Ðg[!ކ'8O#oV(Ǥ+ܧޗ# (({JQɕ fm|Xo|փl||@CS=Ҟ%)"PN|~TnzVWT+ @D/ v~V6v1b/ 'n @CەCV ʰؑb5LRU>aI(#chmZ-˵$ar0!KAʌ {HEň idU}Tiõ"!Gg< f\,pEɎAJq 䰼4ai0xY4v`Jx5D ~:G# &ZT;a-KMIcö$S7g0^ՅTޖx/qCV9Wv2%EZ/1P "& $%R}sl[jǁ. Пj !qg-ffg7bą6袂y+%]h J?N0q!(FLŮ>7DQ&`fZXP+4?qcƧ+`9.%Zn,:L{w)jy i-kd})#M$[S䡸F1S$*Nʭb$(Ł@'a8$1ĘT,WFwM:R\.ذh**0w`Px!1r@u 0c|/`ӄHI:kw1*1pbc7gT"8Dn@~!+%G'ġIEY{~ VÉ.I*6Ҙ9Zzɰ:s\쿷}7`辌ۛ#(Z?KQvRU]…/T)jC7s˨W*#aPb`7.$nz3GkBG38lfX)b%ƥ3ٲ*UI{8zZUܵR~ZU,pa@ |<9(}ZJ.i6>S_H}DZV9;cj-) >HZͰXr QMo> v]11BKȔ]RJG@qAhk٤L2uG*EA. Wcx C@Fd_KF}Nl8~j#/\78ӝ "#ZlV{ 0X҇ G8vi/GnM=ěL?IcuʸǺk4}DPAI|> Cb7!rVis=_[1_3a%j8=?t*y`slrT4n&Ww֤k,m}IA56 5P=zH|3>תhi=/szAaDT$M+8uEl3'{qNu9 J {&xepRk S1pAY+g\!Xd) asmLU{ X"e4" Z(oP\)ж#U1oǹZ (\+Zs~Zw] PʇjII`q0yyrؗ띄>lNJToÔ4ÀB憌LS)偤} xLO ?Z*gS]r_aOJ:•")!7۫,aU9Lး&nAXCeu  . 
?/Dj9mH1a1YlbZ}@i>*;*ũZB_y>sy2K ~NgiJVy^+ݲl4RaftC ?\|J8^jN_'}[{>7L*mz+a4!r<0Ѓ]+JtFlTZ/bjL_oA3Tbweгbe?][ HX#Yل<+ު@ EFN\KM7ТfP+Tj\2=LP}m^}^"9sU&~kK+=O覝>|H2TB{)c_ Gf€sJ!TȀOHz)\ɝylM 8ƉJPRnܧS2vǙPA򖗷ti^E^nC<5Juۆ@+6#D=)--"_w8mGW$ `j]e3j.]!f竞^5b}pW*.>~,C4@:Ղ;+#6ߥ:8kj\U/Ը#F*X -UU1[u,5K.*Ua 2. na:N[2sMCY%|pח;ߞ+VZ"f|5_Y%hq {}FlIEz!a&r cSƴg짉eݎu&qYۭ bt !Z#Kۖrߗ/nz߳ T=k]mԩش,AJoEAޔl{ves)6A܇J1^<4BB$6*'o3M1t?8)گJmkgFs !IɈ1M0ͯڻ위/Jb@A"y27Õ %-iD'Ř?ЋgLTMc%vLu|~A2}%C_ s$~ٷox!%^lq􂪩m[բY6o ?{9duL[vD$a[Y -3ߵ\>>d;C/sGFA&D,e9w+IܘTSUƘZ,yn5 Ѥ10^H>斉WzR*G/Gځ%4V\T| 7;%[c '#i+VY1Q H 2C `(}ouK?꧚QF4~;2`O?^^Las,N*`27 wE@-}J5݅: 4.넭.W3Y!ƉNf^:Ni{8(֨ xa?8`&wGǮ:CE@q~~,oJ2D>F94P#]"Sx十n.*(o3-,7NV;Vzl8sۑ "\LXxY3 =IBm+hN;7q8~m:h`D avnY/աUBP0e\;n KrC@tA~n{%Fp/6& >4oq27/ι5VpHi))tU 1 %kf-uohmL}QYUZ],cYNT69v$V ܂@n׍rgPݮŮw*;`tKu1XȺD{\`(]N0 0EoocL]pd]QDAw=Z1YS <YsK%{{Q}JXAX1ru9q?`T\qb2n њ@$̢-J"}{Qab^*50D[nuAj7BycF4ƁD/*.{tN@`3spFn(5+麒}}$}$1@Fe ;w#KK߃j:S~Qt{֚ vB|T(1ұ^ yF?O[~LbJ:3 TGE]L shh3** GWTCl;mhrX©XvaS#N: ӁE iOX An_žy!!4eP). &ZwtA NuR?rPZ'9ÕSSkd%%`Td}}g{2-_ޮOVx`8S0&Ŧ-m􄊨wX]g{> (}Z+I};O虡~-a~m G fQeS˓e/;)R5ȨM%c&nj+yȳɈ;ΛJJfӢ QlG XEwV$qGsp:)8 5BqKF2&y?/+/#eN $:;u*a< mYѹ+P ,#TıLˊr4^d%abXjKѕlԿv(;6Y8u6|;e12+ D: { jd+6i/}ճ޹е+RAb4F:gN@O4]΅p-T Dq cc~J'/oHRLvKV`*s]6D(ڕzu_=D(D_zDZT/\ڐJ֨IN4ӊJ=,i,?923Szhv8_{U+\Eo?i1eѬ$pVy54ڿ%4DQhn,r)[NmC c:eDB WgUBmř9uHo;BƮ6T9lAN!#( '6u^j"4J<h])P*?uքvc^i*ʏu 0HUð&WRV[,D1;o)k$Ȓ [3lc"xbƷjQ P ~h}5uǚvC.|gr\%F1gmyre{| [ٮQcN`D=E*+<}[MU]"~YWI=ǎeMiワIl:s0!L11O$9yzGTSQy^ORbzrLē/,H8.# #HP1Ojl< @B2]1׃1W;OlC텦 E<7ڏ6[uxDz jqs2R@lƪ`/ה7dVGm ߴMد)ZtSXfdoG/) JכaPK#*^r"%er/JFbk#5}R 7yC\x^N¢ԃ&`릍4wH;]`-@z~~ZOWO3_4na@qa96aCPԪ=6S''V{[[NOrNv+|NQA'r`GC 9BG8̃gQ FTc=8>ی6`?8Of$F3~oIu|)s键"Dzr&jIp O@)WtH;5-r[^s+0 ς2V$f݃QYAA[YAYx}0vLmdS#\2JQŗ g/Ȇ'J h+tտr1r%{wI?6)/`v5 OK)~WFyؼ(:|R+.M; ,/dۀȌ+xrbtr1WNɈPV>SSc dܗ|3 Y݉p> ^+myq* m8׺ ؇al"#y|]j*1g̨FgHo,e;MX^He~C*ȔH T>vؖ"WOA0͸|%;qܽ*6j[" ci&5#2Nb|K[&Lhݚd q՜AշP=0=fO7 {A-]kGy9:`ӸEھz8%4y =*O Gz 7X{e\Km* A"y:rhzkATo*f͑XgmO> hy@ǤvU)虋0 ]fkIkzz IgE25]߷97/gއ\{m q/ł5%1**2(qeȚ 
o+Jljy{|nNJgUٙjB+&e.+h־27P=Dͺ +N2 ʩ+qf˖91ozMCQ{\W#*-]gUqn|7WQ;!xZv%(HU[{ܴߪl-h6nbUe!j`8]⎼^%+x[;J )'Tɑ.f<+صyDt }#76·pwTd-&CZjPlYd&-tMyB60:3̌"6^Tٜ ΥE|Au DƾBoS{eERώuhFJ@w;Mr-N PSQj.!gq_KqIT" 8@-הN{r,fre`$hh}[*GmyP[%hM[ Ͼ )d炟$ dt?!8eqn0q4 hAU Z =QdXALVm`(ĕ3-`IʮƎe'faOSI=SqA8 }5lTmSm% n>?NΧm)Nȝ.RZ'Jm7(6*pՆ0r2zLVwWʨ'0Hᇪ͢(1 雒=3z"bv`4˿FlсJA;Pvq(@%:T'׾Uzk).5[QEZ2߫wҹ,Eٶ.ǼoL qUt=L*\ !YJ̗Ϭa<Ԝ4 9&RA^?xYUw}@.~9<"uhD*q>nDAo8 &8*U(q`)dƞR$J/}#ݐ<.v&Y˨Y>(NgFsDKwSB$Xަ{0?^Yf_|"j>H-mMPm~wSY t(jd@M ԽPk 850\pWmEWo{78H\FJ01lW@V8kK5`:N -7 3C˅YBF T *m tYɼ`ũeƘ\4` k\¯w'85_VݕnXS3C [#pNF&xf2CB~ ٪uvu(KQ4-C{Ots¥-g4Wh">6|JQ/UCPkW?`D+O|~;<8]6,A|,4LU4PMz`7Ya*-]KH^Sf9ȣnx%pÇVZƾw F<W0&v='83r8irlopU`45kO-u187byd`*iK#. A.r] .^ .Z \֐Q!XgUvJ}I^r_rjZ qN =@)o&L #,,A޹ {=S[ JUu|[#W3bh'z Q=4ЩYRő.qCN_vʛ[.w*0 UZ\j*>tWaU=f+xb:Z[(lQ-# pbbJcy1- YQ Y.Fq3 8"p*^J@EzzX̃ Le ޷+I>N)dC`ѻJ R!?8 'Ihk[`=(^lIm%DG3A.ŒkGbmQ8~).B=kpxظ{J,gg#аuV[e>rxx)#=b~ADo }ȔaN e0)9Mr}9&[ ɦB v&űZlQgyq&^"a]qpQm9kH8D ;q5K-#_&, z2q(K-R+y0z\%{3ZdIN7[JЫ]~hpQ8XDZ %e _>X3?Z:Lc4Jf)ˡfQ,P P# A~Xҡ;IRqHV|"ռ6z5It=NDvZVF ~׻t<8|0 Jp:OCj*yٹ~$Zf,I3ƻi#=C4! 'TEϷ Jt/ug7vzA%#ȩA;uT\s=q;Qy`FQý4^ f"K._J@(*l3P"}FO@'jo}Ma;r1klFDFV0t!i>,5S㠌^I*-wlއ`o?+xNiFŤYA% ]f h3Pr ŝ&fOs |.&kx=?}t }pʹAhg̩v7ChsS>o(e|r5|7* D iv|90VAuL=:Lqt'Ʈ|yo x_cDD"=>_8t E}_%׊%؞P. p4UG#8GO7xrA6)hgX0\Dޒ؍wE ו6/BGr$9w r:S:qdIX.@Peq%+~Vc`x"U6$7.V9.Rre?v3( r^|isɪ<,úJɃ[4*AO:\+*6@ @b(cO%٨&i˜~ g.\*}̞ߖDz֚"z)r)?HDf I. 䀛NoSG8|ݠycix~ Kcn WnY~+ Sڒ~+hQ1Eaۦ{;S@'j\;e$1?#cF;^V_}>_u<7ȐSF㍁Ish~nb ^Цbq\.]yVμ5wPK'Mvz7HL씼o)>(@1~8,XϘ;iP4X1=$J|-S8%NiYA?ΎxD=YZiL}rxUY/'!T=2n< EeE6Ձal7m.xv"oLR[qU͝2uI56)f7xZ3gV R]&W{:c%kY әF7ћEahR@Άr{ʔM]g!0=/٫MZOw{}mldH,ο;gH y6n'ki6R,Y^p|Ȭ>oi`JJ_vk^_FL+ȧn{GB {BԺCfStxn7,5Xo&<%q$Lw4+Nу^fiO[ܞXFG_&3W@:}"p}D qJ09{lo.P-1e+ jqSD7,WP8<;=($>aO.\"rZ 7[0Vڃ XLjQSlh[+P18Z}yQ3>M #Rk`Hh !3!B2{{!a⤴_s܄+V͢2nx[ؐҀAp}3_<t=/FE @VFdS$_IJ8 ζ T\Ӡž8}Z=Q MAҍw BD#xHRx80V~]c3y |b/ӱbN]*2jZ.J@'ZO9s#U ֢ɴDavg(M8rpW4:hG _u*Ib 4gCH]O0Be*+! 
>Dע6&K}@`|(,ڳ6||@P~edH-N#^]^{Rot+d5P@"j]Q>²?P?)q$;-OQ~MGEf(VUn=K}K8i10˔ۼ-9)gbGsz5?}FKw\}qӞa]kG.oLjGcH7En1GΘQy}m>T6 *ᢖ@1J؛.e x*V$_b0|U]͎zNOn,hD5G}+$OR=!{^Bí=Uky2/]p~ta1l |i5r3A:1Rح\Z#J|95@ ѓ/nmB&K44-9fؒF i퐠ִ&1=GqU;$W٫MGoҷarnš`j>gu 67(Пs`.{N@%'.3ݽy4n"ES*5.}ϡSst"TE X*qҊh S\ѣTD8]6̖~ZƦ ~Uh>ecGk<=U݌9@'V+?EvFM/-,E`8 Xw6Y׊[o:H@p'^^OLD_b~ʫB\x"l+Wp"O#Orɋ紟1A"St,~"]2ltmRdJ.O=mr Mszmvcu'ueۤ5M* %l 썪25U 9Fӭ=I<+fC `Ktea7DjA *_7]|Ñ+L_⑖F9ݿFS`U?(Z^:[5frU7? J,J9Q&3r6Ƈ;6ȫ ? (}q7=҆Cjsa]uoxUr#px ?W?H9e %,D@0=pyxu; SٴPԹF3/a/$ 1 f}.6D"_ĽǕ 5!5?t}IמqDyt.B̌cز cܵ!o5g5Uu~rf>nQYq(wc_᭗%d ℑ2[fs f%P1gcxS!pmM@y KqI<};Ch[h#x̭cj 5(Pok9B>xKvиz͎;r C_BotBU4ur Оo8I ΓDo'܆DA! H@TLtQ9WIr.|lʲtL[n;KvlOQk#M@&JC$.}Lx՝y#s/z SF$2@Б}$aOi@[#Gɪr 1駟s|7X/5|9AuY*Y"H1ߛll@z=s+kK1lNj ju3|&s?j,J҉;Ɯ b*WT?HtΣ}G^ Ly?t8mdKeJ 8=;T݂]U(QMذS|5_5V= |s>(r!"%+j6\qemD# o=¾(л!wLFg<5DO` ܨ"iu{Ԅ .D&t^ܿW؇d۶A=rՖe^~zGB$h>QJY_Ӥ2dPR|xE}4[MYgӋVsP6O\©tDy|>~IR9ª-J vy*6^E{^>"|8x# ib#Mm'Q}+Pу^TJ8g}, ٴUٳT"QǮ1L`8 =9@uYn":@pEmx\KyK8} hƮc;D䘛kÜDM6᯵y:DyCc y@1#Խs-S}}:5ҧil} M6qٸl~{R]F.c}uݍ'Neʣ%DJu? Dk.oҶUeNyh|Q)Z]Ⱦ|u}JG.E˶9Nes2Ao⏈u2Z_0tw:fևDQ~^ ΰ;n-Aゼ;GXU ƻyPik|;jKhq(Y^I^KJ7Ē.mѡL '4! 3B.v/f̅Y{O,d,-bB e>pOnŗ*M0W̐@OP# .uc+,%z.uỵ:2 {$f5*"J'4Fım GJK~gڂ]ϿqC+A\{w ׃gd(iGpU^Ŕy\qT.AmYnrf08i{85*?կaw1f+ԗ7u6Kjhe\pJzu91ۗ dE9dt: Z`9?4d-2 infY eg;D1J]^.]m^GEgd$0qc"P!8RxL28fWgnnLNG QnW%EvcheG]>:R{C*aMKtFU3Íi-X7;Gcw0)"dg,P*5KxyS(WM%  !5J}BQ 'iw[+3Aћ kTNXJdU~EPp\#()M[|>BOXXAjg\q%g pL#Q1V"IxbԈk*)n-_V>ct΁e9-[> stream xlcn%ZeOٶm۶m6vٵe۶m>_X2Gf9fFs-RB1{;OSFZF:. #'+TNŔ njP6u02IN. 
c5CK[K'Mٕɕ"eSS) ,))'Sڙ:\l,2ƦvΦ3{'89?̜m*br*Qaza @Fr;gM]\ ,,:hFF _Iڙ:7!(ѐ`bjO)B kobdN M5& I3G>@b& .jX?{#kjbjr4gWMl<׊Eըg373(+Zg`߾@_#翖eb`21Y ]cW''S;O/=Lk!V-a~EԳˌ+S7H&W*x3F;D1د%ĕ~5[oz`jXr`7w.kA T,ٹ5ReH1V)uɑaMg-B _@waoH- +õ ꯬5;%0Dȶ*7ﯘ~el .<*œYQze<ܝq{VW!w p&ݖ%b9]^rq[UG«*lɜ7-/¶bFJK$q!G]qVy81疩QEV֗_9-Bp׸ϣ^\ Н21q w,geRwyBĚ*'dXXkLݪ6ܫ~;j1UY{-B iWv G@?~Y7e+;L?It&CУ];;.U"tuD](]i;D827EBƍ`Fߑbu4%VkWljlCڬQ"TV`#|$pڃF !0L_1mٹ=9a;Irf~tB9eP18#;2640 fT@xO\Z+[gs%hܫCަA7X5ɱ;eR<&LjbֹnkMZ(am5M>X>@:Ql In'qmdVO݊mg$ڵ$OY8{D&c x;v.|GyZ+2'LxN;S&ͼzFuzpC\%\3!/TRi)P+ g@]qaڌ˪ N T-?ڒ07;)N>!?IQ$) r7wv8DtqC)Mh(9,Ũ IC;E34I,Lpfڳ4cG7pTf9Bо'OҸ[_aB&/re%f :.#=";)¬D/:cGمcJ`_/ݼRuEݐX*s #sX@ N7.<kCn''9ot2Ԟ#)GL\^KlwP쐶)|9$AϿq VhuY~Ae8vJV^QfA&ʇ . A]A5`I>[ *)BH;޷0.E`~Bp@QAYRᬙ$]SK|ކe-v(bIvqړ ;m02\;lSPcF7Wr?(@&K,yeII ʨ6BZ,qÙ|6H=灿N VL,b KvB:+(3ďɆʽ^10/wT4x{/ s>AgZhW#>yu#Y%P,x#Hm̗Up8{PAߜ7e&:k778 ys+^( v`P(mOx:Mҵ?;$M -ЉBN2yau}Al\i[D{ B>mU덌r79Q:i hH: l)FKeB^&)@2h.[pϴ«.mF?"p_N)i&Ē;][Yd].xy'Z z쏗í`H J-AsʢsWB]anjXd[v[p/?ᝦyq2Db04ŮE:U9hkM/ i5hs{Ĭx?B_ VAv LjmG:䉆#q@S<vՇCF <ġ1͡j2aN DgUg^߱']7\^`WppU!٠ 'c^t٩֪( Aj<"!IGSRp /C_캥r~E/% =9GSZarkg*ʀYi)i<] TD9_Wkư+/}@GE~SET: dž;퍉Hvn R԰.\ ro\f1Ү(ߡSz@l3\֋PׯPIc /[~<xm|)V=HloyYZU^q8UA{*1:_LRZn%;<"|z2j'gxE:A;G2hmcixI5ZUǜtc7{7F=z5j3OV˂f+橾X9U{$3_8'7-Ho&"zo<88Ca uӆ~;xhvFY0/TAkゅ氨 TjUCꗩ %ּf+{|;yTї-*o6LO~ˠ ʖb@/_ziЃW%4kw4s% ˻"R#moDT^1Q eFlV\U.i82dvrUyH끐YYxlLܸ(*\Z'!vujnn>e }:SWEݑ{cuŀ|jFXxZb| bX'~Gx‚爵yhX6g[&6߾HІ)A&u]oʝ`.1ph ErZ(~=3k5ia Y̻c(L)Y?b s̯5x6p BZ?AJA7>` pUd\'2ofrAԤ*=_CK jG0z)Vx99S6 ZׅQ/4o«'kNj- >?V}Hwb4qVAWX6ʈhk*[Ps׊_{ W5Q1DaeK_WQꈿ(p>6ЭGD!"%7ug DZ%ڽxώqhkVrt ӊ;a(\5 8X"}(|+1M(fc4֓t{MR*owe*l1Z`eYG_*Jv~E?k Ma]?GE]2i*lMCd3 ?}Y`:CG֡Ҩ;Z\%EhDu>zҔwxqPC[d6x*cҲ{ M;@IqÃO)aHt$at)E)RŠ!1rQ`S%+nW9E7|-v3(aZJm ~K\wlAxFPzKȆQsrp"&/Sy!J{C53Tqqftu'^9}(/,@0ERp.$grmԷNA̺1JV;$ ;#iIM FQU >‚-e&%_6JV EH_'GFsuy\XJNK0`b 3gH0h+^k CT(y2BI7VmSκx;2c'oCeG~K"Ϛ?t˃r[82E6HL_Υ={zߣ9V*oQ4F *zfw3I }aZ%QReIYEjɗoЛK RUk=*Y  }Ű9AvǠTlO dT`RbF;5ZNQ@K"}\j/T % )f=ێ5P&pP2\@r$-\\`#*hk;91@ysjza@p6ܨ4*av3Z@MBJMa+m%|}[瓷jYUƽ}pBawPc7}pe.u*'Q_ 
0Wt&HЗp+kS³эOjG[X_ n&@ǷRfA#}V%tVNၕw$)㌪9tī_rv\h]@諆,1 &{OeJEWYsh6F*Y'Vܵ\.;}߇a+Z==oŹ{B oYtU]v4+L/-{:3SQ\H9ST+`aYb`"Z<=UsEi/)@_f%3[h/lH| 涺L5A|yQ>18O=sDSҌN6b-Ask6ԓuyiA;4Ɲ=maeʃ5F*7>S& TÞ!*=CaWf)=MhX^e:emGW:jEj_Őr5Vzu+|_?XD.yi(s{^2G֦ #`Zs;߲cР# b+ R&&Շ 6QN9K6 WF5/!%xCXF$!0cq>qٗ=^>'!2 L{?兜'xtNj4bTUktKgoF T@i+ڐ閶Xj/ꚜOQ~x,Ãh&)J!B_P[KrM=8XWA]N7c^ȋNZadB_M2aFK'_ \MM,ƮzDZ`k,7pvԙ]9@GѱĞA9#Rqv`z8RUh@cF$WNߵ!523-cO fsKKU(ej@:C]+ip'v6GS{؃rO|~qHQ]?[̂E5 9ct{Nȴ1=Aqx" ͨPnOZҖSό򱡷tl݉!O~|_0E~ =GZE1-w`Jҧ\ ejn[h XӋ,0ftw{9K&֘֬`We^} +~iKoIvJiqW(9 '`?  _h:XAQOv=Ð@gV)՘AmU[ Y2P)_r 5 Sv1M;Mj>J[;aNsp9I㕨hwٷg^ΑJ#pEYvSl5j?TeÎ6}@qg'xLƔ/cx>8'`zط9|@oOzXeXh1$ٯ6\9A9\$0OU$I>ݤU,74=kci!ξX ΠXqEfDLGTᇁMg`da]hd $ y|Q%[Pt$U-Κu8b|%a2^ChT @P~^餧.ZL26n7_0E>-ԩI`Q {o;T%d*SfQw퓫ފn ލK=u?dNoF$#l1쓴ȩno0Ec)_;D-XϬϪ{`v"c?K xZܨh=D8*eNg`D ڢ oAX#?I?nەM,9z%&POSFM^_љSL ] Ôڛ N9OA(*:;6,g䡾>1?J5;ԂeʚyeSt'嘤/R[K0\GMCx7dg٫"9Kr/ 1R: a^_ (qnli8dɋU"n:܇6Ij󣚍?z{*8{K iA[$cG(]Z,b|1XS 2]Փ.tݟRB/힨({طHꂏd,i8C`֚VCd|UQs l:-Q4$ny& 4zgC܀T&8x#!\,?3oyչӯsgPCEf ܍Cc~! E(IS2e[:  T󤄚Uբ$b}Rw&1XJ;|n:ٌG[ Pgݾ'_'IPsLJ8'g h:uPސ q9?qA\V*@酩_i+('"X Sj+?UN3Ŀ Ps =䡂AosեWSs32Uz͑{CnmcĠ\ɏ;0/Di,_v'Hw5k| =ch]){Iӎ @qJ*V^Ӽiq^L 7j?.Na!pښ4C^GcJ F~@S jDzDV\q~tLoi3L:;8 bRpXLCM( WtP>Vc Km&p%}ǯ+8 * `hyܝ.Ns{$0+i "Cs'3yA#3X,a#X HzkbʉT(+AOW($)؂ѷYO R`GE&gM, ]_;=pyd\r#<&L|7t!-)dv+ᔪiaC'xaЙ6h~X_nc*T=y(ݳ00P؁Bmq˱X"}Y>gzKyȎʐ "V'_pGy.³)r3VdKodYaXga$E[WP7 AbaD p!bW.7V;U=.`BÛcv{ٱaATK3+;RW ȕB{  &/$#P9?juX`\) 򠻹QK>3hd$͂j`ʩbrxk\,7 R,{:ZL 9)0_^k4еl4|RѳÓm;vtm?JRoXwMX#wZj~6P] ۍ:4ã|~f' 7R`5 }/CJs l: 6mXj#FOvSwfӳnuH]JPn^r  S.geV$*#',r,w~wc}kqGQ- &9sJ胭E4cb`.L5hRƐ84sOq`Ac):rHh2n- W)Ծd$"dFa+cݹհs1+Mz8W.eiNJ$M*ZQ-\ᤛgZN K: v}>0[5_XŶ. 
\begin{figure}
\begin{center}
\begin{tabbing}
\= \nt{expression} \\
\> \basic{IF} \nt{expression} \basic{THEN} \= \nt{expression} \\
\> \> \basic{IF} \nt{expression} \basic{THEN} \basic{expression} . \basic{ELSE} \nt{expression}
\end{tabbing}
\end{center}
\caption{A textual version of the tree in \fref{fig:shifting:tree}}
\label{fig:shifting:text}
\end{figure}

In our example, the proof that shifting is possible is the derivation tree
shown in Figures~\ref{fig:shifting:tree} and~\ref{fig:shifting:text}. At the
root of the tree is the grammar's start symbol, \nt{expression}. This symbol
develops into the string \nt{IF expression THEN expression}, which forms the
tree's second level. The second occurrence of \nt{expression} in that string
develops into \nt{IF expression THEN expression ELSE expression}, which forms
the tree's last level. The tree's fringe, a sentential form, is the string
\nt{IF expression THEN IF expression THEN expression ELSE expression}. As
announced earlier, it begins with the conflict string
\nt{IF expression THEN IF expression THEN expression}, followed with the
conflict token \nt{ELSE}.

In \fref{fig:shifting:text}, the end of the conflict string is materialized
with a dot. Note that this dot does not occupy the rightmost position in the
tree's last level. In other words, the conflict token (\basic{ELSE}) itself
occurs on the tree's last level. In practical terms, this means that, after
the automaton has recognized the conflict string and peeked at the conflict
token, it makes sense for it to \emph{shift} that token.
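The situation discussed above can be reproduced with a tiny grammar. The
following sketch, whose token and symbol names are purely illustrative,
exhibits the classic ``dangling else'' shift/reduce conflict; when \menhir
processes it, a shift/reduce conflict involving \basic{ELSE} should be
reported:

\begin{verbatim}
%token IF THEN ELSE EXPR
%start <unit> expression
%%
expression:
| EXPR
    { () }
| IF expression THEN expression
    { () }
| IF expression THEN expression ELSE expression
    { () }
\end{verbatim}

If shifting is the desired behavior, one common way of silencing such a
conflict is to declare suitable precedence levels for the tokens involved;
the sections on conflict resolution discuss this further.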
\paragraph{Why reducing is legal}

\begin{figure}
\mycommonbaseline
\begin{center}
\begin{heveapicture}
\begin{tikzpicture}[level distance=12mm]
\node { \nt{expression} }
  child { node {\basic{IF}} }
  child { node {\nt{expression}} }
  child { node {\basic{THEN}} }
  child { node {\nt{expression}}
    child { node {\basic{IF}} }
    child { node {\nt{expression}} }
    child { node {\basic{THEN}} }
    child { node {\nt{expression}} }
  }
  child { node {\basic{ELSE}} }
  child { node {\nt{expression}} }
;
\end{tikzpicture}
\end{heveapicture}
\end{center}
\caption{A partial derivation tree that justifies reducing}
\label{fig:reducing:tree}
\end{figure}

\begin{figure}
\begin{center}
\begin{tabbing}
\= \nt{expression} \\
\> \basic{IF} \nt{expression} \basic{THEN} \= \nt{expression} \basic{ELSE} \nt{expression}
   \sidecomment{lookahead token appears} \\
\> \> \basic{IF} \nt{expression} \basic{THEN} \basic{expression} .
\end{tabbing}
\end{center}
\caption{A textual version of the tree in \fref{fig:reducing:tree}}
\label{fig:reducing:text}
\end{figure}

In our example, the proof that reducing is possible is the derivation tree
shown in Figures~\ref{fig:reducing:tree} and~\ref{fig:reducing:text}. Again,
the sentential form found at the fringe of the tree begins with the conflict
string, followed with the conflict token.

Again, in \fref{fig:reducing:text}, the end of the conflict string is
materialized with a dot. Note that, this time, the dot occupies the rightmost
position in the tree's last level. In other words, the conflict token
(\basic{ELSE}) appeared on an earlier level (here, on the second level). This
fact is emphasized by the comment \inlinesidecomment{lookahead token appears}
found at the second level.
In practical terms, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to \emph{reduce} the production that corresponds to the tree's last level---here, the production is \nt{expression} $\rightarrow$ \basic{IF} \nt{expression} \basic{THEN} \basic{expression}. \paragraph{An example of a more complex derivation tree} Figures~\ref{fig:xreducing:tree} and~\ref{fig:xreducing:text} show a partial derivation tree that justifies reduction in a more complex situation. (This derivation tree is relative to a grammar that is not shown.) Here, the conflict string is \basic{DATA UIDENT EQUALS UIDENT}; the conflict token is \basic{LIDENT}. It is quite clear that the fringe of the tree begins with the conflict string. However, in this case, the fringe does not explicitly exhibit the conflict token. Let us examine the tree more closely and answer the question: following \basic{UIDENT}, what's the next terminal symbol on the fringe? 
\begin{figure}
\mycommonbaseline
\begin{center}
\begin{heveapicture}
\begin{tikzpicture}[level distance=12mm,level 1/.style={sibling distance=18mm},
                    level 2/.style={sibling distance=18mm},
                    level 4/.style={sibling distance=24mm}]
\node { \nt{decls} }
  child { node {\nt{decl}}
    child { node {\basic{DATA}} }
    child { node {\basic{UIDENT}} }
    child { node {\basic{EQUALS}} }
    child { node {\nt{tycon\_expr}}
      child { node {\nt{tycon\_item}}
        child { node {\basic{UIDENT}} }
        child { node {\nt{opt\_type\_exprs}}
          child { node {} edge from parent [dashed] }
        }
      }
    }
  }
  child { node {\nt{opt\_semi}} }
  child { node {\nt{decls}} }
;
\end{tikzpicture}
\end{heveapicture}
\end{center}
\caption{A partial derivation tree that justifies reducing}
\label{fig:xreducing:tree}
\end{figure}

\begin{figure}
\begin{center}
\begin{tabbing}
\= \nt{decls} \\
\> \nt{decl} \nt{opt\_semi} \nt{decls}
   \sidecomment{lookahead token appears because \nt{opt\_semi} can vanish
   and \nt{decls} can begin with \basic{LIDENT}} \\
\> \basic{DATA UIDENT} \basic{EQUALS} \= \nt{tycon\_expr}
   \sidecomment{lookahead token is inherited} \\
\> \> \nt{tycon\_item} \sidecomment{lookahead token is inherited} \\
\> \> \basic{UIDENT} \= \nt{opt\_type\_exprs} \sidecomment{lookahead token is inherited} \\
\> \> \> .
\end{tabbing}
\end{center}
\caption{A textual version of the tree in \fref{fig:xreducing:tree}}
\label{fig:xreducing:text}
\end{figure}
% TEMPORARY the HTML rendering of this figure isn't good

First, note that \nt{opt\_type\_exprs} is \emph{not} a leaf node, even though
it has no children. The grammar contains the production
$\nt{opt\_type\_exprs} \rightarrow \epsilon$: the nonterminal symbol
\nt{opt\_type\_exprs} develops to the empty string. (This is made clear in
\fref{fig:xreducing:text}, where a single dot appears immediately below
\nt{opt\_type\_exprs}.) Thus, \nt{opt\_type\_exprs} is not part of the fringe.
Next, note that \nt{opt\_type\_exprs} is the rightmost symbol within its
level.
Thus, in order to find the next symbol on the fringe, we have to look up one level. This is the meaning of the comment \inlinesidecomment{lookahead token is inherited}. Similarly, \nt{tycon\_item} and \nt{tycon\_expr} appear rightmost within their level, so we again have to look further up. This brings us back to the tree's second level. There, \nt{decl} is \emph{not} the rightmost symbol: next to it, we find \nt{opt\_semi} and \nt{decls}. Does this mean that \nt{opt\_semi} is the next symbol on the fringe? Yes and no. \nt{opt\_semi} is a \emph{nonterminal} symbol, but we are really interested in finding out what the next \emph{terminal} symbol on the fringe could be. The partial derivation tree shown in Figures~\ref{fig:xreducing:tree} and~\ref{fig:xreducing:text} does not explicitly answer this question. In order to answer it, we need to know more about \nt{opt\_semi} and \nt{decls}. Here, \nt{opt\_semi} stands (as one might have guessed) for an optional semicolon, so the grammar contains a production $\nt{opt\_semi} \rightarrow \epsilon$. This is indicated by the comment \inlinesidecomment{\nt{opt\_semi} can vanish}. (Nonterminal symbols that generate $\epsilon$ are also said to be \emph{nullable}.) Thus, one could choose to turn this partial derivation tree into a larger one by developing \nt{opt\_semi} into $\epsilon$, making it a non-leaf node. That would yield a new partial derivation tree where the next symbol on the fringe, following \basic{UIDENT}, is \nt{decls}. Now, what about \nt{decls}? Again, it is a \emph{nonterminal} symbol, and we are really interested in finding out what the next \emph{terminal} symbol on the fringe could be. Again, we need to imagine how this partial derivation tree could be turned into a larger one by developing \nt{decls}. Here, the grammar happens to contain a production of the form $\nt{decls} \rightarrow \basic{LIDENT} \ldots$ This is indicated by the comment \inlinesidecomment{\nt{decls} can begin with \basic{LIDENT}}. 
Thus, by developing \nt{decls}, it is possible to construct a partial
derivation tree where the next symbol on the fringe, following \basic{UIDENT},
is \basic{LIDENT}. This is precisely the conflict token.

To sum up, there exists a partial derivation tree whose fringe begins with the
conflict string, followed with the conflict token. Furthermore, in that
derivation tree, the dot occupies the rightmost position in the last level. As
in our previous example, this means that, after the automaton has recognized
the conflict string and peeked at the conflict token, it makes sense for it to
\emph{reduce} the production that corresponds to the tree's last level---here,
the production is $\nt{opt\_type\_exprs} \rightarrow \epsilon$.

\paragraph{Greatest common factor among derivation trees}

Understanding conflicts requires comparing two (or more) derivation trees. It
is frequent for these trees to exhibit a common factor, that is, to exhibit
identical structure near the top of the tree, and to differ only below a
specific node. Manual identification of that node can be tedious, so \menhir
performs this work automatically. When explaining an $n$-way conflict, it
first displays the greatest common factor of the $n$ derivation trees. A
question mark symbol $\basic{(?)}$ is used to identify the node where the
trees begin to differ. Then, \menhir displays each of the $n$ derivation
trees, \emph{without their common factor} -- that is, it displays $n$
sub-trees that actually begin to differ at the root. This should make visual
comparisons significantly easier.

\subsection{How are severe conflicts resolved in the end?}

It is unspecified how severe conflicts are resolved. \menhir attempts to mimic
\ocamlyacc's specification, that is, to resolve shift/reduce conflicts in
favor of shifting, and to resolve reduce/reduce conflicts in favor of the
production that textually appears earliest in the grammar specification.
However, this specification is inconsistent in case of three-way conflicts,
that is, conflicts that simultaneously involve a shift action and several
reduction actions. Furthermore, textual precedence can be undefined when the
grammar specification is split over multiple modules. In short, \menhir's
philosophy is that
\begin{center}
severe conflicts should not be tolerated,
\end{center}
so you should not care how they are resolved.

% If a shift/reduce conflict is resolved in favor of reduction, then there can
% exist words of terminal symbols that are accepted by the canonical LR(1)
% automaton without traversing any conflict state and which are rejected by our
% automaton (constructed by Pager's method followed by conflict
% resolution). Same problem when a shift/reduce conflict is resolved in favor of
% neither action (via \dnonassoc) or when a reduce/reduce conflict is resolved
% arbitrarily.

\subsection{End-of-stream conflicts}
\label{sec:eos}

\menhir's treatment of the end of the token stream is (believed to be) fully
compatible with \ocamlyacc's. Yet, \menhir attempts to be more user-friendly
by warning about a class of so-called ``end-of-stream conflicts''.

% TEMPORARY it should be noted that \menhir does not conform to ocamlyacc in
% the presence of end-of-stream conflicts; apparently it runs into a wall by
% always demanding the next token, whereas ocamlyacc is able to stop (how?);
% cf. the problem reported by S. Hinderer (April 2015).

\paragraph{How the end of stream is handled}

In many textbooks on parsing, it is assumed that the lexical analyzer, which
produces the token stream, produces a special token, written \eos, to signal
that the end of the token stream has been reached. A parser generator can take
advantage of this by transforming the grammar: for each start symbol $\nt{S}$
in the original grammar, a new start symbol $\nt{S'}$ is defined, together
with the production $S'\rightarrow S\eos$.
The symbol $S$ is no longer a start symbol in the new grammar. This means that
the parser will accept a sentence derived from $S$ only if it is immediately
followed by the end of the token stream.

This approach has the advantage of simplicity. However, \ocamlyacc and
\menhir do not follow it, for several reasons. Perhaps the most convincing one
is that it is not flexible enough: sometimes, it is desirable to recognize a
sentence derived from $S$, \emph{without} requiring that it be followed by the
end of the token stream: this is the case, for instance, when reading
commands, one by one, on the standard input channel. In that case, there is no
end of stream: the token stream is conceptually infinite. Furthermore, after a
command has been recognized, we do \emph{not} wish to examine the next token,
because doing so might cause the program to block, waiting for more input.

In short, \ocamlyacc and \menhir's approach is to recognize a sentence derived
from $S$ and to \emph{not look}, if possible, at what follows. However, this
is possible only if the definition of $S$ is such that the end of an
$S$-sentence is identifiable without knowledge of the lookahead token. When
the definition of $S$ does not satisfy this criterion, an
\emph{end-of-stream conflict} arises: after a potential $S$-sentence has been
read, there can be a tension between consulting the next token, in order to
determine whether the sentence is continued, and \emph{not} consulting the
next token, because the sentence might be over and whatever follows should not
be read. \menhir warns about end-of-stream conflicts, whereas \ocamlyacc does
not.

\paragraph{A definition of end-of-stream conflicts}

Technically, \menhir proceeds as follows. A \eos symbol is introduced. It is,
however, only a \emph{pseudo-}token: it is never produced by the lexical
analyzer. For each start symbol $\nt{S}$ in the original grammar, a new start
symbol $\nt{S'}$ is defined, together with the production $S'\rightarrow S$.
The corresponding start state of the LR(1) automaton is composed of the LR(1)
item $S' \rightarrow . \;S\; [\eos]$. That is, the pseudo-token \eos initially
appears in the lookahead set, indicating that we expect to be done after
recognizing an $S$-sentence. During the construction of the LR(1) automaton,
this lookahead set is inherited by other items, with the effect that, in the
end, the automaton has:
\begin{itemize}
\item \emph{shift} actions only on physical tokens; and
\item \emph{reduce} actions either on physical tokens or on the pseudo-token \eos.
\end{itemize}
A state of the automaton has a reduce action on \eos if, in that state, an
$S$-sentence has been read, so that the job is potentially finished. A state
has a shift or reduce action on a physical token if, in that state, more
tokens potentially need to be read before an $S$-sentence is recognized. If a
state has a reduce action on \eos, then that action should be taken
\emph{without} requesting the next token from the lexical analyzer. On the
other hand, if a state has a shift or reduce action on a physical token, then
the lookahead token \emph{must} be consulted in order to determine if that
action should be taken.

\begin{figure}[p]
\begin{quote}
\begin{tabular}{l}
\dtoken \kangle{\basic{int}} \basic{INT} \\
\dtoken \basic{PLUS TIMES} \\
\dleft PLUS \\
\dleft TIMES \\
\dstart \kangle{\basic{int}} \nt{expr} \\
\percentpercent \\
\nt{expr}:
\newprod \basic{i} = \basic{INT} \dpaction{\basic{i}}
\newprod \basic{e1} = \nt{expr} \basic{PLUS} \basic{e2} = \nt{expr} \dpaction{\basic{e1 + e2}}
\newprod \basic{e1} = \nt{expr} \basic{TIMES} \basic{e2} = \nt{expr} \dpaction{\basic{e1 * e2}}
\end{tabular}
\end{quote}
\caption{Basic example of an end-of-stream conflict}
\label{fig:basiceos}
\end{figure}

\begin{figure}[p]
\begin{verbatim}
State 6:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr PLUS expr . [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On # PLUS reduce production expr -> expr PLUS expr

State 4:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
expr -> expr TIMES expr . [ # TIMES PLUS ]
-- On # TIMES PLUS reduce production expr -> expr TIMES expr

State 2:
expr' -> expr . [ # ]
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On PLUS shift to state 5
-- On # accept expr
\end{verbatim}
\caption{Part of an LR automaton for the grammar in \fref{fig:basiceos}}
\label{fig:basiceosdump}
\end{figure}

\begin{figure}[p]
\begin{quote}
\begin{tabular}{l}
\ldots \\
\dtoken \basic{END} \\
\dstart \kangle{\basic{int}} \nt{main} \hspace{1cm} \textit{// instead of \nt{expr}} \\
\percentpercent \\
\nt{main}:
\newprod \basic{e} = \nt{expr} \basic{END} \dpaction{\basic{e}} \\
\nt{expr}:
\newprod \ldots
\end{tabular}
\end{quote}
\caption{Fixing the grammar specification in \fref{fig:basiceos}}
\label{fig:basiceos:sol}
\end{figure}

An end-of-stream conflict arises when a state has distinct actions on \eos and
on at least one physical token. In short, this means that the end of an
$S$-sentence cannot be unambiguously identified without examining one extra
token. \menhir's default behavior, in that case, is to suppress the action on
\eos, so that more input is \emph{always} requested.

\paragraph{Example}

\fref{fig:basiceos} shows a grammar that has end-of-stream conflicts. When
this grammar is processed, \menhir warns about these conflicts, and further
warns that \nt{expr} is never accepted. Let us explain.

Part of the corresponding automaton, as described in the \automaton file, is
shown in \fref{fig:basiceosdump}. Explanations at the end of the \automaton
file (not shown) point out that states 6 and 2 have an end-of-stream conflict.
Indeed, both states have distinct actions on \eos and on the physical token
\basic{TIMES}.
%
It is interesting to note that, even though state 4 has actions on \eos and on
physical tokens, it does not have an end-of-stream conflict. This is because
the action taken in state 4 is always to reduce the production
$\nt{expr} \rightarrow \nt{expr}$ \basic{TIMES} \nt{expr}, regardless of the
lookahead token.

By default, \menhir produces a parser where end-of-stream conflicts are
resolved in favor of looking ahead: that is, the problematic reduce actions on
\eos are suppressed. This means, in particular, that the \emph{accept} action
in state 2, which corresponds to reducing the production
$\nt{expr'} \rightarrow \nt{expr}$, is suppressed. This explains why the
symbol \nt{expr} is never accepted: because expressions do not have an
unambiguous end marker, the parser will always request one more token and will
never stop.

In order to avoid this end-of-stream conflict, the standard solution is to
introduce a new token, say \basic{END}, and to use it as an end marker for
expressions. The \basic{END} token could be generated by the lexical analyzer
when it encounters the actual end of stream, or it could correspond to a piece
of concrete syntax, say, a line feed character, a semicolon, or an
\texttt{end} keyword. The solution is shown in \fref{fig:basiceos:sol}.

% ------------------------------------------------------------------------------

\section{Positions}
\label{sec:positions}

When an \ocamllex-generated lexical analyzer produces a token, it updates two
fields, named \verb+lex_start_p+ and \verb+lex_curr_p+, in its environment
record, whose type is \verb+Lexing.lexbuf+. Each of these fields holds a value
of type \verb+Lexing.position+. Together, they represent the token's start and
end positions within the text that is being scanned. These fields are read by
\menhir after calling the lexical analyzer, so \textbf{it is the lexical
analyzer's responsibility} to correctly set these fields.
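As an illustration, here is how an \ocamllex lexer might keep these fields up
to date. This fragment is a sketch, not taken from the \menhir distribution;
the token names \texttt{INT}, \texttt{PLUS}, and \texttt{EOF} are assumed to
be declared by the grammar. \ocamllex updates character offsets automatically;
the lexer's author is responsible for registering line breaks, typically by
calling \texttt{Lexing.new\_line}:

\begin{verbatim}
rule token = parse
| [' ' '\t']+
    { token lexbuf }                          (* skip blanks *)
| '\n'
    { Lexing.new_line lexbuf; token lexbuf }  (* record the line break *)
| ['0'-'9']+ as i
    { INT (int_of_string i) }
| '+'
    { PLUS }
| eof
    { EOF }
\end{verbatim}

If \texttt{Lexing.new\_line} is never called, the \verb+pos_lnum+ and
\verb+pos_bol+ fields of the positions remain stale, so any line and column
information derived from them will be wrong, even though character offsets
remain correct.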
A position consists mainly of an offset (the position's \verb+pos_cnum+
field), but also holds information about the current file name, the current
line number, and the current offset within the current line. (Not all
\ocamllex-generated analyzers keep this extra information up to date. This
must be explicitly programmed by the author of the lexical analyzer.)

\begin{figure}
\begin{center}
\begin{tabular}{@{}l@{\hspace{7.0mm}}l@{}}
\verb+$startpos+ & start position of the first symbol in the production's right-hand side, if there is one; \\&
                   end position of the most recently parsed symbol, otherwise \\
\verb+$endpos+ & end position of the last symbol in the production's right-hand side, if there is one; \\&
                 end position of the most recently parsed symbol, otherwise \\
\verb+$startpos(+ \verb+$+\nt{i} \barre \nt{id} \verb+)+ & start position of the symbol named \verb+$+\nt{i} or \nt{id} \\
\verb+$endpos(+ \verb+$+\nt{i} \barre \nt{id} \verb+)+ & end position of the symbol named \verb+$+\nt{i} or \nt{id} \\
\ksymbolstartpos & start position of the leftmost symbol \nt{id} such that \verb+$startpos(+\nt{id}\verb+)+ \verb+!=+\, \verb+$endpos(+\nt{id}\verb+)+; \\&
                   if there is no such symbol, \verb+$endpos+ \\[2mm]
%
\verb+$startofs+ \\
\verb+$endofs+ \\
\verb+$startofs(+ \verb+$+\nt{i} \barre \nt{id} \verb+)+ & same as above, but produce an integer offset instead of a position \\
\verb+$endofs(+ \verb+$+\nt{i} \barre \nt{id} \verb+)+ \\
\verb+$symbolstartofs+ \\[2mm]
%
\verb+$loc+ & stands for the pair \verb+($startpos, $endpos)+ \\
\verb+$loc(+ \nt{id} \verb+)+ & stands for the pair \verb+($startpos(+ \nt{id} \verb+), $endpos(+ \nt{id} \verb+))+ \\
% $loc($i)$ works too,
% but is not documented,
% as that would be visually heavy
% and its use is not encouraged anyway.
\verb+$sloc+ & stands for the pair \verb+($symbolstartpos, $endpos)+ \\
\end{tabular}
\end{center}
\caption{Position-related keywords}
\label{fig:pos}
\end{figure}
% We could document $endpos($0).
% Not sure whether that would be a good thing.

\begin{figure}
\begin{tabular}{@{}ll@{\hspace{2cm}}l}
% Positions.
\verb+symbol_start_pos()+ & \ksymbolstartpos \\
\verb+symbol_end_pos()+ & \verb+$endpos+ \\
\verb+rhs_start_pos i+ & \verb+$startpos($i)+ & ($1 \leq i \leq n$) \\
\verb+rhs_end_pos i+ & \verb+$endpos($i)+ & ($1 \leq i \leq n$) \\
% i = 0 permitted, really
% Offsets.
\verb+symbol_start()+ & \verb+$symbolstartofs+ \\
\verb+symbol_end()+ & \verb+$endofs+ \\
\verb+rhs_start i+ & \verb+$startofs($i)+ & ($1 \leq i \leq n$) \\
\verb+rhs_end i+ & \verb+$endofs($i)+ & ($1 \leq i \leq n$) \\
% i = 0 permitted, really
\end{tabular}
\caption{Translating position-related incantations from \ocamlyacc to \menhir}
\label{fig:pos:mapping}
\end{figure}

This mechanism allows associating pairs of positions with terminal symbols. If
desired, \menhir automatically extends it to nonterminal symbols as well. That
is, it offers a mechanism for associating pairs of positions with terminal or
nonterminal symbols. This is done by making a set of keywords available to
semantic actions (\fref{fig:pos}). These keywords are \emph{not} available
outside of a semantic action: in particular, they cannot be used within an
\ocaml header. \ocaml's standard library module \texttt{Parsing} is
deprecated. The functions that it offers \emph{can} be called, but will return
dummy positions.

We remark that, if the current production has an empty right-hand side, then
\verb+$startpos+ and \verb+$endpos+ are equal, and (by convention) are the end
position of the most recently parsed symbol (that is, the symbol that happens
to be on top of the automaton's stack when this production is reduced). If the
current production has a nonempty right-hand side, then \verb+$startpos+ is
the same as \verb+$startpos($1)+ and \verb+$endpos+ is the same as
\verb+$endpos($+\nt{n}\verb+)+, where \nt{n} is the length of the right-hand
side.
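As a small illustration, here is a hypothetical grammar fragment whose
semantic actions build abstract syntax nodes annotated with source locations,
using the keywords of \fref{fig:pos}. The constructors \texttt{Int} and
\texttt{Add} are assumed to carry a location of type
\texttt{Lexing.position * Lexing.position}:

\begin{verbatim}
expr:
| i = INT
    { Int (i, $loc) }
| e1 = expr PLUS e2 = expr
    { Add (e1, e2, $loc) }
\end{verbatim}

Here, \verb+$loc+ is just a shorthand: writing \verb+($startpos, $endpos)+
instead would be equivalent.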
More generally, if the current production has matched a sentence of length
zero, then \verb+$startpos+ and \verb+$endpos+ will be equal, and conversely.
% (provided the lexer is reasonable and never produces a token whose start and
% end positions are equal).

The position \verb+$startpos+ is sometimes ``further towards the left'' than
one would like. For example, in the following production:
\begin{verbatim}
declaration: modifier? variable { $startpos }
\end{verbatim}
the keyword \verb+$startpos+ represents the start position of the optional
modifier \verb+modifier?+. If this modifier turns out to be absent, then its
start position is (by definition) the end position of the most recently
parsed symbol. This may not be what is desired: perhaps the user would prefer
in this case to use the start position of the symbol \verb+variable+. This is
achieved by using \ksymbolstartpos instead of \verb+$startpos+. By definition,
\ksymbolstartpos is the start position of the leftmost symbol whose start and
end positions differ. In this example, the computation of \ksymbolstartpos
skips the absent \verb+modifier+, whose start and end positions coincide, and
returns the start position of the symbol \verb+variable+ (assuming this symbol
has distinct start and end positions).

% One could point out that $symbolstartpos returns the $startpos of the first
% nonempty symbol, and not the $symbolstartpos of the first nonempty symbol.
% So it can remain somewhat counter-intuitive, and may not correspond exactly
% to what one expects. Moreover, the computation of $symbolstartpos is
% preserved by %inline (this is obtained very easily by eliminating
% $symbolstartpos before inlining), but it does not correspond to what
% $symbolstartpos would give after a manual inlining. Fundamentally, this
% notion of $symbolstartpos does not quite work right.

There is no keyword \verb+$symbolendpos+.
Indeed, the problem with \verb+$startpos+ is due to the asymmetry in the
definition of \verb+$startpos+ and \verb+$endpos+ in the case of an empty
right-hand side, and does not affect \verb+$endpos+.

\newcommand{\fineprint}{\footnote{%
  The computation of \ksymbolstartpos is optimized by \menhir under two
  assumptions about the lexer. First, \menhir assumes that the lexer never
  produces a token whose start and end positions are equal. Second, \menhir
  assumes that two positions produced by the lexer are equal if and only if
  they are physically equal. If the lexer violates either of these
  assumptions, the computation of \ksymbolstartpos could produce a result
  that differs from \texttt{Parsing.symbol\_start\_pos()}.
}}

The positions computed by \menhir are exactly the same as those computed by
\verb+ocamlyacc+\fineprint. More precisely, \fref{fig:pos:mapping} sums up how
to translate a call to the \texttt{Parsing} module, as used in an \ocamlyacc
grammar, to a \menhir keyword. We note that \menhir's \verb+$startpos+ does
not appear in the right-hand column in \fref{fig:pos:mapping}. In other words,
\menhir's \verb+$startpos+ does not correspond exactly to any of the
\ocamlyacc function calls. An exact \ocamlyacc equivalent of \verb+$startpos+
is \verb+rhs_start_pos 1+ if the current production has a nonempty right-hand
side and \verb+symbol_start_pos()+ if it has an empty right-hand side.

Finally, we remark that \menhir's \dinline keyword (\sref{sec:inline}) does
not affect the computation of positions. The same positions are computed,
regardless of where \dinline keywords are placed.

% ------------------------------------------------------------------------------

\section{Using \menhir as an interpreter}
\label{sec:interpret}

When \ointerpret is set, \menhir no longer behaves as a compiler. Instead, it
acts as an interpreter.
That is, it repeatedly:
\begin{itemize}
\item reads a sentence off the standard input channel;
\item parses this sentence, according to the grammar;
\item displays an outcome.
\end{itemize}
This process stops when the end of the input channel is reached.

\subsection{Sentences}
\label{sec:sentences}

The syntax of sentences is as follows:
\begin{center}
\begin{tabular}{r@{}c@{}l}
\nt{sentence} \is \optional{\nt{lid}\,\deuxpoints} \sepspacelist{\nt{uid}} \,\dnewline
\end{tabular}
\end{center}

Less formally, a sentence is a sequence of zero or more terminal symbols
(\nt{uid}'s), separated with whitespace, terminated with a newline character,
and optionally preceded with a non-terminal start symbol (\nt{lid}). This
non-terminal symbol can be omitted if, and only if, the grammar only has one
start symbol.

For instance, here are four valid sentences for the grammar of arithmetic
expressions found in the directory \distrib{demos/calc}:
%
\begin{verbatim}
main: INT PLUS INT EOL
INT PLUS INT
INT PLUS PLUS INT EOL
INT PLUS PLUS
\end{verbatim}
%
In the first sentence, the start symbol \texttt{main} was explicitly
specified. In the other sentences, it was omitted, which is permitted, because
this grammar has no start symbol other than \texttt{main}. The first sentence
is a stream of four terminal symbols, namely \texttt{INT}, \texttt{PLUS},
\texttt{INT}, and \texttt{EOL}. These terminal symbols must be provided under
their symbolic names. Writing, say, ``\texttt{12+32\textbackslash n}'' instead
of \texttt{INT PLUS INT EOL} is not permitted. \menhir would not be able to
make sense of such a concrete notation, since it does not have a lexer for it.

% One could document the fact that a finite sentence is transformed by \menhir
% into a potentially infinite stream of tokens, with an infinite suffix of EOF ...
% But this is a hack, which could change in the future.
\subsection{Outcomes} \label{sec:outcomes} As soon as \menhir is able to read a complete sentence off the standard input channel (that is, as soon as it finds the newline character that ends the sentence), it parses the sentence according to whichever grammar was specified on the command line, and displays an outcome. An outcome is one of the following: \begin{itemize} \item \texttt{ACCEPT}: a prefix of the sentence was successfully parsed; a parser generated by \menhir would successfully stop and produce a semantic value; \item \texttt{OVERSHOOT}: the end of the sentence was reached before it could be accepted; a parser generated by \menhir would request a non-existent ``next token'' from the lexer, causing it to fail or block; \item \texttt{REJECT}: the sentence was not accepted; a parser generated by \menhir would raise the exception \texttt{Error}. \end{itemize} When \ointerpretshowcst is set, each \texttt{ACCEPT} outcome is followed with a concrete syntax tree. A concrete syntax tree is either a leaf or a node. A leaf is either a terminal symbol or \error. A node is annotated with a non-terminal symbol, and carries a sequence of immediate descendants that correspond to a valid expansion of this non-terminal symbol. \menhir's notation for concrete syntax trees is as follows: \begin{center} \begin{tabular}{r@{}c@{}l} \nt{cst} \is \nt{uid} \\ && \error \\ && \texttt{[} \nt{lid}\,\deuxpoints \sepspacelist{\nt{cst}} \texttt{]} \end{tabular} \end{center} % This notation is not quite unambiguous (it is ambiguous if several % productions are identical). 
For instance, if one wished to parse the example sentences of \sref{sec:sentences} using the grammar of arithmetic expressions in \distrib{demos/calc}, one could invoke \menhir as follows: \begin{verbatim} $ menhir --interpret --interpret-show-cst demos/calc/parser.mly main: INT PLUS INT EOL ACCEPT [main: [expr: [expr: INT] PLUS [expr: INT]] EOL] INT PLUS INT OVERSHOOT INT PLUS PLUS INT EOL REJECT INT PLUS PLUS REJECT \end{verbatim} (Here, \menhir's input---the sentences provided by the user on the standard input channel---is shown intermixed with \menhir's output---the outcomes printed by \menhir on the standard output channel.) The first sentence is valid, and accepted; a concrete syntax tree is displayed. The second sentence is incomplete, because the grammar specifies that a valid expansion of \texttt{main} ends with the terminal symbol \texttt{EOL}; hence, the outcome is \texttt{OVERSHOOT}. The third sentence is invalid, because of the repeated occurrence of the terminal symbol \texttt{PLUS}; the outcome is \texttt{REJECT}. The fourth sentence, a prefix of the third one, is rejected for the same reason. \subsection{Remarks} Using \menhir as an interpreter offers an easy way of debugging your grammar. For instance, if one wished to check that addition is considered left-associative, as requested by the \dleft directive found in the file \distrib{demos/calc/parser.mly}, one could submit the following sentence: \begin{verbatim} $ ./menhir --interpret --interpret-show-cst ../demos/calc/parser.mly INT PLUS INT PLUS INT EOL ACCEPT [main: [expr: [expr: [expr: INT] PLUS [expr: INT]] PLUS [expr: INT]] EOL ] \end{verbatim} %$ The concrete syntax tree displayed by \menhir is skewed towards the left, as desired. The switches \ointerpret and \otrace can be used in conjunction. When \otrace is set, the interpreter logs its actions to the standard error channel.
% ------------------------------------------------------------------------------ \section{Generated API} When \menhir processes a grammar specification, say \texttt{parser.mly}, it produces one \ocaml module, \texttt{Parser}, whose code resides in the file \texttt{parser.ml} and whose signature resides in the file \texttt{parser.mli}. We now review this signature. For simplicity, we assume that the grammar specification has just one start symbol \verb+main+, whose \ocaml type is \verb+thing+. % ------------------------------------------------------------------------------ \subsection{Monolithic API} \label{sec:monolithic} The monolithic API defines the type \verb+token+, the exception \verb+Error+, and the parsing function \verb+main+, named after the start symbol of the grammar. %% type token The type \verb+token+ is an algebraic data type. A value of type \verb+token+ represents a terminal symbol and its semantic value. For instance, if the grammar contains the declarations \verb+%token A+ and \verb+%token<int> B+, then the generated file \texttt{parser.mli} contains the following definition: \begin{verbatim} type token = | A | B of int \end{verbatim} % If \oonlytokens is specified on the command line, the type \verb+token+ is generated, and the rest is omitted. On the contrary, if \oexternaltokens is used, the type \verb+token+ is omitted, but the rest (described below) is generated. %% exception Error The exception \verb+Error+ carries no argument. It is raised by the parsing function \verb+main+ (described below) when a syntax error is detected. % \begin{verbatim} exception Error \end{verbatim} %% val main Next comes one parsing function for each start symbol of the grammar.
Here, we have assumed that there is one start symbol, named \verb+main+, so the generated file \texttt{parser.mli} contains the following declaration: \begin{verbatim} val main: (Lexing.lexbuf -> token) -> Lexing.lexbuf -> thing \end{verbatim} % On ne montre pas la définition de l'exception Error. This function expects two arguments, namely: a lexer, which typically is produced by \ocamllex and has type \verb+Lexing.lexbuf -> token+; and a lexing buffer, which has type \verb+Lexing.lexbuf+. This API is compatible with \ocamlyacc. (For information on using \menhir without \ocamllex, please consult \sref{sec:qa}.) % This API is ``monolithic'' in the sense that there is just one function, which does everything: it pulls tokens from the lexer, parses, and eventually returns a semantic value (or fails by throwing the exception \texttt{Error}). % ------------------------------------------------------------------------------ \subsection{Incremental API} \label{sec:incremental} If \otable is set, \menhir offers an incremental API in addition to the monolithic API. In this API, control is inverted. The parser does not have access to the lexer. Instead, when the parser needs the next token, it stops and returns its current state to the user. The user is then responsible for obtaining this token (typically by invoking the lexer) and resuming the parser from that state. % The directory \distrib{demos/calc-incremental} contains a demo that illustrates the use of the incremental API. This API is ``incremental'' in the sense that the user has access to a sequence of the intermediate states of the parser. Assuming that semantic values are immutable, a parser state is a persistent data structure: it can be stored and used multiple times, if desired. This enables applications such as ``live parsing'', where a buffer is continuously parsed while it is being edited. The parser can be re-started in the middle of the buffer whenever the user edits a character. 
Because two successive parser states share most of their data in memory, a list of $n$ successive parser states occupies only $O(n)$ space in memory. % One could point out that semantic actions should be side-effect free. % But that is an absolute requirement. Semantic actions can have side % effects, if the user knows what they are doing. % TEMPORARY actually, live parsing also requires a way of performing % error recovery, up to a complete parse... as in Merlin. % ------------------------------------------------------------------------------ \subsubsection{Starting the parser} In this API, the parser is started by invoking \verb+Incremental.main+. (Recall that we assume that \verb+main+ is the name of the start symbol.) The generated file \texttt{parser.mli} contains the following declaration: \begin{verbatim} module Incremental : sig val main: position -> thing MenhirInterpreter.checkpoint end \end{verbatim} The argument is the initial position. If the lexer is based on an \ocaml lexing buffer, this argument should be \verb+lexbuf.lex_curr_p+. In \sref{sec:incremental} and \sref{sec:inspection}, the type \verb+position+ is a synonym for \verb+Lexing.position+. We emphasize that the function \verb+Incremental.main+ does not parse anything. It constructs a checkpoint which serves as a \emph{starting} point. The functions \verb+offer+ and \verb+resume+, described below, are used to drive the parser. % ------------------------------------------------------------------------------ \subsubsection{Driving the parser} \label{sec:incremental:driving} The sub-module \menhirinterpreter is also part of the incremental API. 
Its declaration, which appears in the generated file \texttt{parser.mli}, is as follows: \begin{verbatim} module MenhirInterpreter : MenhirLib.IncrementalEngine.INCREMENTAL_ENGINE with type token = token \end{verbatim} The signature \verb+INCREMENTAL_ENGINE+, defined in the module \menhirlibincrementalengine, contains many types and functions, which are described in the rest of this section (\sref{sec:incremental:driving}) and in the following sections (\sref{sec:incremental:inspecting}, \sref{sec:incremental:updating}). Please keep in mind that, from the outside, these types and functions should be referred to with an appropriate prefix. For instance, the type \verb+checkpoint+ should be referred to as \verb+MenhirInterpreter.checkpoint+, or \verb+Parser.MenhirInterpreter.checkpoint+, depending on which modules the user chooses to open. %% type token % Passons-le sous silence. %% type 'a env \begin{verbatim} type 'a env \end{verbatim} The abstract type \verb+'a env+ represents the current state of the parser. (That is, it contains the current state and stack of the LR automaton.) Assuming that semantic values are immutable, it is a persistent data structure: it can be stored and used multiple times, if desired. The parameter \verb+'a+ is the type of the semantic value that will eventually be produced if the parser succeeds. %% type production \begin{verbatim} type production \end{verbatim} The abstract type \verb+production+ represents a production of the grammar. % The ``start productions'' (which do not exist in an \mly file, but are constructed by \menhir internally) are \emph{not} part of this type. %% type 'a checkpoint \begin{verbatim} type 'a checkpoint = private | InputNeeded of 'a env | Shifting of 'a env * 'a env * bool | AboutToReduce of 'a env * production | HandlingError of 'a env | Accepted of 'a | Rejected \end{verbatim} The type \verb+'a checkpoint+ represents an intermediate or final state of the parser. 
An intermediate checkpoint is a suspension: it records the parser's current state, and allows parsing to be resumed. The parameter \verb+'a+ is the type of the semantic value that will eventually be produced if the parser succeeds. \verb+Accepted+ and \verb+Rejected+ are final checkpoints. \verb+Accepted+ carries a semantic value. \verb+InputNeeded+ is an intermediate checkpoint. It means that the parser wishes to read one token before continuing. \verb+Shifting+ is an intermediate checkpoint. It means that the parser is taking a shift transition. It exposes the state of the parser before and after the transition. The Boolean parameter tells whether the parser intends to request a new token after this transition. (It always does, except when it is about to accept.) \verb+AboutToReduce+ is an intermediate checkpoint: it means that the parser is about to perform a reduction step. \verb+HandlingError+ is also an intermediate checkpoint: it means that the parser has detected an error and is about to handle it. (Error handling is typically performed in several steps, so the next checkpoint is likely to be \verb+HandlingError+ again.) In these two cases, the parser does not need more input. The parser suspends itself at this point only in order to give the user an opportunity to observe the parser's transitions and possibly handle errors in a different manner, if desired. %% val offer \begin{verbatim} val offer: 'a checkpoint -> token * position * position -> 'a checkpoint \end{verbatim} The function \verb+offer+ allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form \verb+InputNeeded env+. This function expects the previous checkpoint \verb+checkpoint+ as well as a new token (together with the start and end positions of this token). It produces a new checkpoint, which again can be an intermediate checkpoint or a final checkpoint. It does not raise any exception. 
(The exception \texttt{Error} is used only in the monolithic API.) %% val resume \begin{verbatim} val resume: 'a checkpoint -> 'a checkpoint \end{verbatim} The function \verb+resume+ allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form \verb+AboutToReduce (env, prod)+ or \verb+HandlingError env+. This function expects just the previous checkpoint \verb+checkpoint+. It produces a new checkpoint. It does not raise any exception. The incremental API subsumes the monolithic API. Indeed, \verb+main+ can be (and is in fact) implemented by first using \verb+Incremental.main+, then calling \verb+offer+ and \verb+resume+ in a loop, until a final checkpoint is obtained. %% type supplier \begin{verbatim} type supplier = unit -> token * position * position \end{verbatim} A token supplier is a function of no arguments which delivers a new token (together with its start and end positions) every time it is called. The function \verb+loop+ and its variants, described below, expect a supplier as an argument. %% val lexer_lexbuf_to_supplier \begin{verbatim} val lexer_lexbuf_to_supplier: (Lexing.lexbuf -> token) -> Lexing.lexbuf -> supplier \end{verbatim} The function \verb+lexer_lexbuf_to_supplier+, applied to a lexer and to a lexing buffer, produces a fresh supplier. %% (remark about the loop* functions) The functions \verb+offer+ and \verb+resume+, documented above, are sufficient to write a parser loop. One can imagine many variations of such a loop, which is why we expose \verb+offer+ and \verb+resume+ in the first place. Nevertheless, some variations are so common that it is worth providing them, ready for use. The following functions are implemented on top of \verb+offer+ and \verb+resume+. %% val loop \begin{verbatim} val loop: supplier -> 'a checkpoint -> 'a \end{verbatim} \verb+loop supplier checkpoint+ begins parsing from \verb+checkpoint+, reading tokens from \verb+supplier+. 
It continues parsing until it reaches a checkpoint of the form \verb+Accepted v+ or \verb+Rejected+. In the former case, it returns \verb+v+. In the latter case, it raises the exception \verb+Error+. (By the way, this is how we implement the monolithic API on top of the incremental API.) \begin{verbatim} val loop_handle: ('a -> 'answer) -> ('a checkpoint -> 'answer) -> supplier -> 'a checkpoint -> 'answer \end{verbatim} \verb+loop_handle succeed fail supplier checkpoint+ begins parsing from \verb+checkpoint+, reading tokens from \verb+supplier+. It continues until it reaches a checkpoint of the form \verb+Accepted v+ or \verb+HandlingError _+ (or~\verb+Rejected+, but that should not happen, as \verb+HandlingError _+ will be observed first). In the former case, it calls \verb+succeed v+. In the latter case, it calls \verb+fail+ with this checkpoint. It cannot raise \verb+Error+. This means that \menhir's traditional error-handling procedure (which pops the stack until a state that can act on the \error token is found) does not get a chance to run. Instead, the user can implement her own error handling code, in the \verb+fail+ continuation. %% val loop_handle_undo \begin{verbatim} val loop_handle_undo: ('a -> 'answer) -> ('a checkpoint -> 'a checkpoint -> 'answer) -> supplier -> 'a checkpoint -> 'answer \end{verbatim} \verb+loop_handle_undo+ is analogous to \verb+loop_handle+, but passes a pair of checkpoints (instead of a single checkpoint) to the failure continuation. % The first (and oldest) checkpoint that is passed to the failure continuation is the last \verb+InputNeeded+ checkpoint that was encountered before the error was detected. The second (and newest) checkpoint is where the error was detected. (This is the same checkpoint that \verb+loop_handle+ would pass to its failure continuation.) Going back to the first checkpoint can be thought of as undoing any reductions that were performed after seeing the problematic token. 
(These reductions must be default reductions or spurious reductions.) This can be useful to someone who wishes to implement an error explanation or error recovery mechanism. \verb+loop_handle_undo+ must be applied to an \verb+InputNeeded+ checkpoint. The initial checkpoint produced by \verb+Incremental.main+ is of this form. %% val shifts \begin{verbatim} val shifts: 'a checkpoint -> 'a env option \end{verbatim} \verb+shifts checkpoint+ assumes that \verb+checkpoint+ has been obtained by submitting a token to the parser. It runs the parser from \verb+checkpoint+, through an arbitrary number of reductions, until the parser either accepts this token (i.e., shifts) or rejects it (i.e., signals an error). If the parser decides to shift, then \verb+Some env+ is returned, where \verb+env+ is the parser's state just before shifting. Otherwise, \verb+None+ is returned. This can be used to test whether the parser is willing to accept a certain token. This function should be used with caution, though, as it causes semantic actions to be executed. It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless. %% val acceptable \begin{verbatim} val acceptable: 'a checkpoint -> token -> position -> bool \end{verbatim} \verb+acceptable checkpoint token pos+ requires \verb+checkpoint+ to be an \verb+InputNeeded+ checkpoint. It returns \verb+true+ iff the parser is willing to shift this token. % This can be used to test, after an error has been detected, which tokens would have been accepted at this point. To do this, one would typically use \verb+loop_handle_undo+ to get access to the last \verb+InputNeeded+ checkpoint that was encountered before the error was detected, and apply \verb+acceptable+ to that checkpoint. \verb+acceptable+ is implemented using \verb+shifts+, so, like \verb+shifts+, it causes certain semantic actions to be executed. 
It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless. % ------------------------------------------------------------------------------ \subsubsection{Inspecting the parser's state} \label{sec:incremental:inspecting} Although the type \verb+env+ is opaque, a parser state can be inspected via a few accessor functions, which are described in this section. The following types and functions are contained in the \verb+MenhirInterpreter+ sub-module. %% type 'a lr1state \begin{verbatim} type 'a lr1state \end{verbatim} The abstract type \verb+'a lr1state+ describes a (non-initial) state of the LR(1) automaton. % If \verb+s+ is such a state, then \verb+s+ should have at least one incoming transition, and all of its incoming transitions carry the same (terminal or non-terminal) symbol, say $A$. We say that $A$ is the \emph{incoming symbol} of the state~\verb+s+. % The index \verb+'a+ is the type of the semantic values associated with $A$. The role played by \verb+'a+ is clarified in the definition of the type \verb+element+, which appears further on. %% val number \begin{verbatim} val number: _ lr1state -> int \end{verbatim} The states of the LR(1) automaton are numbered (from 0 and up). The function \verb+number+ maps a state to its number. %% val production_index %% val find_production \begin{verbatim} val production_index: production -> int val find_production: int -> production \end{verbatim} Productions are numbered. (The set of indices of all productions forms an interval, which does \emph{not} necessarily begin at 0.) % The function \verb+production_index+ converts a production to an integer number, whereas the function \verb+find_production+ carries out the reverse conversion. It is an error to apply \verb+find_production+ to an invalid index. 
%% type element \begin{verbatim} type element = | Element: 'a lr1state * 'a * position * position -> element \end{verbatim} The type \verb+element+ describes one entry in the stack of the LR(1) automaton. In a stack element of the form \verb+Element (s, v, startp, endp)+, \verb+s+ is a (non-initial) state and \verb+v+ is a semantic value. The value~\verb+v+ is associated with the incoming symbol~$A$ of the state~\verb+s+. In other words, the value \verb+v+ was pushed onto the stack just before the state \verb+s+ was entered. Thus, for some type \verb+'a+, the state~\verb+s+ has type \verb+'a lr1state+ and the value~\verb+v+ has type~\verb+'a+. The positions \verb+startp+ and \verb+endp+ delimit the fragment of the input text that was reduced to the symbol $A$. In order to do anything useful with the value \verb+v+, one must gain information about the type \verb+'a+, by inspection of the state~\verb+s+. So far, the type \verb+'a lr1state+ is abstract, so there is no way of inspecting~\verb+s+. The inspection API (\sref{sec:inspection}) offers further tools for this purpose. %% val top \begin{verbatim} val top: 'a env -> element option \end{verbatim} \verb+top env+ returns the parser's top stack element. The state contained in this stack element is the current state of the automaton. If the stack is empty, \verb+None+ is returned. In that case, the current state of the automaton must be an initial state. %% val pop_many \begin{verbatim} val pop_many: int -> 'a env -> 'a env option \end{verbatim} \verb+pop_many i env+ pops \verb+i+ elements off the automaton's stack. This is done via \verb+i+ successive invocations of \verb+pop+. Thus, \verb+pop_many 1+ is \verb+pop+. The index \verb+i+ must be nonnegative. The time complexity is $O(i)$. %% val get \begin{verbatim} val get: int -> 'a env -> element option \end{verbatim} \verb+get i env+ returns the parser's \verb+i+-th stack element. The index \verb+i+ is 0-based: thus, \verb+get 0+ is \verb+top+. 
If \verb+i+ is greater than or equal to the number of elements in the stack, \verb+None+ is returned. \verb+get+ is implemented using \verb+pop_many+ and \verb+top+: its time complexity is $O(i)$. %% val current_state_number \begin{verbatim} val current_state_number: 'a env -> int \end{verbatim} \verb+current_state_number env+ is the integer number of the automaton's current state. Although this number might conceivably be obtained via the functions~\verb+top+ and \verb+number+, using \verb+current_state_number+ is preferable, because this method works even when the automaton's stack is empty (in which case the current state is an initial state, and \verb+top+ returns \verb+None+). This number can be passed as an argument to a \verb+message+ function generated by \verb+menhir --compile-errors+. %% val equal \begin{verbatim} val equal: 'a env -> 'a env -> bool \end{verbatim} \verb+equal env1 env2+ tells whether the parser configurations \verb+env1+ and \verb+env2+ are equal in the sense that the automaton's current state is the same in \verb+env1+ and \verb+env2+ and the stack is \emph{physically} the same in \verb+env1+ and \verb+env2+. If \verb+equal env1 env2+ is \verb+true+, then the sequence of the stack elements, as observed via \verb+pop+ and \verb+top+, must be the same in \verb+env1+ and \verb+env2+. Also, if \verb+equal env1 env2+ holds, then the checkpoints \verb+input_needed env1+ and \verb+input_needed env2+ must be equivalent. (The function \verb+input_needed+ is documented in \sref{sec:incremental:updating}.) The function \verb+equal+ has time complexity $O(1)$. %% val positions \begin{verbatim} val positions: 'a env -> position * position \end{verbatim} The function \verb+positions+ returns the start and end positions of the current lookahead token. If invoked in an initial state, this function returns a pair of twice the initial position that was passed as an argument to \verb+main+. 
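The physical-equality convention used by \verb+equal+, and the earlier claim that a sequence of parser states occupies only $O(n)$ space, both rest on the fact that the automaton's stack is a persistent data structure: pushing allocates one new cell and shares the rest. The following self-contained \ocaml fragment is a generic illustration of this structural sharing, using ordinary immutable lists as a stand-in for the parser's stack; it is not actual MenhirLib code.

```ocaml
(* Ordinary immutable lists stand in for the parser's stack; this is a
   generic illustration of structural sharing, not actual MenhirLib code. *)
let s0 = [ "bottom" ]        (* an initial stack *)
let s1 = "cell1" :: s0       (* push: one new cell; s0 is shared *)
let s2 = "cell2" :: s1       (* push again: s1 is shared *)

(* No copying took place: the tail of each stack is physically (==) the
   previous stack. A constant-time equality test can exploit this. *)
let shares = (List.tl s2 == s1) && (List.tl s1 == s0)
let () = assert shares
```

Retaining $n$ successive stacks in this style costs $O(n)$ extra cells in total, and two stacks obtained by pushing onto a common ancestor can be compared for physical equality in constant time.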
%% val has_default_reduction %% val state_has_default_reduction \begin{verbatim} val env_has_default_reduction: 'a env -> bool val state_has_default_reduction: _ lr1state -> bool \end{verbatim} When applied to an environment \verb+env+ taken from a checkpoint of the form \verb+AboutToReduce (env, prod)+, the function \verb+env_has_default_reduction+ tells whether the reduction that is about to take place is a default reduction. \verb+state_has_default_reduction s+ tells whether the state \verb+s+ has a default reduction. This includes the case where \verb+s+ is an accepting state. % ------------------------------------------------------------------------------ \subsubsection{Updating the parser's state} \label{sec:incremental:updating} The functions presented in the previous section (\sref{sec:incremental:inspecting}) allow inspecting parser states of type \verb+'a checkpoint+ and \verb+'a env+. However, so far, there are no functions for manufacturing new parser states, except \verb+offer+ and \verb+resume+, which create new checkpoints by feeding tokens, one by one, to the parser. In this section, a small number of functions are provided for manufacturing new parser states of type \verb+'a env+ and \verb+'a checkpoint+. These functions allow going far back into the past and jumping ahead into the future, so to speak. In other words, they allow driving the parser in other ways than by feeding tokens into it. The functions \verb+pop+, \verb+force_reduction+ and \verb+feed+ (part of the inspection API; see \sref{sec:inspection}) construct values of type \verb+'a env+. The function \verb+input_needed+ constructs values of type \verb+'a checkpoint+ and thereby allows resuming parsing in normal mode (via \verb+offer+). Together, these functions can be used to implement error handling and error recovery strategies. 
%% val pop \begin{verbatim} val pop: 'a env -> 'a env option \end{verbatim} \verb+pop env+ returns a new environment, where the parser's top stack cell has been popped off. (If the stack is empty, \verb+None+ is returned.) This amounts to pretending that the (terminal or nonterminal) symbol that corresponds to this stack cell has not been read. %% val force_reduction \begin{verbatim} val force_reduction: production -> 'a env -> 'a env \end{verbatim} \verb+force_reduction prod env+ can be called only if in the state \verb+env+ the parser is capable of reducing the production \verb+prod+. If this condition is satisfied, then this production is reduced, which means that its semantic action is executed (this can have side effects!) and the automaton makes a goto (nonterminal) transition. If this condition is not satisfied, an \verb+Invalid_argument+ exception is raised. %% val input_needed \begin{verbatim} val input_needed: 'a env -> 'a checkpoint \end{verbatim} \verb+input_needed env+ returns \verb+InputNeeded env+. Thus, out of a parser state that might have been obtained via a series of calls to the functions \verb+pop+, \verb+force_reduction+, \verb+feed+, and so on, it produces a checkpoint, which can be used to resume normal parsing, by supplying this checkpoint as an argument to \verb+offer+. This function should be used with some care. It could ``mess up the lookahead'' in the sense that it allows parsing to resume in an arbitrary state \verb+s+ with an arbitrary lookahead symbol \verb+t+, even though \menhir's reachability analysis (which is carried out via the \olisterrors switch) might well think that it is impossible to reach this particular configuration. If one is using \menhir's new error reporting facility (\sref{sec:errors:new}), this could cause the parser to reach an error state for which no error message has been prepared. 
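To summarize the incremental API in executable form, here is a self-contained sketch of the checkpoint protocol. The checkpoint type below is a deliberately simplified stand-in for \verb+MenhirInterpreter.checkpoint+: it omits environments, positions, and the \verb+Shifting+ and \verb+HandlingError+ cases, and the toy ``parser'' merely sums integer tokens up to an end-of-line marker. Only the shape of the driver loop mirrors real usage.

```ocaml
type token = INT of int | EOL

(* A simplified stand-in for MenhirInterpreter.checkpoint. *)
type checkpoint =
  | InputNeeded of int      (* running sum: the toy parser's whole "state" *)
  | AboutToReduce of int    (* a pending internal step *)
  | Accepted of int
  | Rejected

(* offer: resume a parser that has requested a token. *)
let offer checkpoint token =
  match checkpoint, token with
  | InputNeeded sum, INT n -> InputNeeded (sum + n)
  | InputNeeded sum, EOL   -> AboutToReduce sum
  | _, _                   -> Rejected

(* resume: let the parser perform a pending internal step. *)
let resume = function
  | AboutToReduce sum -> Accepted sum
  | checkpoint        -> checkpoint

(* The driver loop: this shape is the one used with a real generated
   parser, which has more checkpoint cases to dispatch on. *)
let rec loop supplier checkpoint =
  match checkpoint with
  | InputNeeded _   -> loop supplier (offer checkpoint (supplier ()))
  | AboutToReduce _ -> loop supplier (resume checkpoint)
  | Accepted v      -> v
  | Rejected        -> failwith "syntax error"

(* A supplier that reads tokens from a list. *)
let parse tokens =
  let input = ref tokens in
  let supplier () =
    match !input with [] -> EOL | t :: rest -> input := rest; t in
  loop supplier (InputNeeded 0)

let () = assert (parse [INT 1; INT 2; INT 3; EOL] = 6)
```

With a real generated parser, the supplier would typically be built with \verb+lexer_lexbuf_to_supplier+, the initial checkpoint with \verb+Incremental.main+, and the loop itself is provided ready-made as \verb+loop+ or \verb+loop_handle+.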
% ------------------------------------------------------------------------------ \subsection{Inspection API} \label{sec:inspection} If \oinspection is set, \menhir offers an inspection API in addition to the monolithic and incremental APIs. (The reason why this is not done by default is that this requires more tables to be generated, thus making the generated parser larger.) Like the incremental API, the inspection API is found in the sub-module \menhirinterpreter. It offers the following types and functions. %% type _ terminal The type \verb+'a terminal+ is a generalized algebraic data type (GADT). A value of type \verb+'a terminal+ represents a terminal symbol (without a semantic value). The index \verb+'a+ is the type of the semantic values associated with this symbol. For instance, if the grammar contains the declarations \verb+%token A+ and \verb+%token<int> B+, then the generated module \menhirinterpreter contains the following definition: % \begin{verbatim} type _ terminal = | T_A : unit terminal | T_B : int terminal \end{verbatim} % The data constructors are named after the terminal symbols, prefixed with ``\verb+T_+''. %% type _ nonterminal The type \verb+'a nonterminal+ is also a GADT. A value of type \verb+'a nonterminal+ represents a nonterminal symbol (without a semantic value). The index \verb+'a+ is the type of the semantic values associated with this symbol. For instance, if \verb+main+ is the only nonterminal symbol, then the generated module \menhirinterpreter contains the following definition: % \begin{verbatim} type _ nonterminal = | N_main : thing nonterminal \end{verbatim} % The data constructors are named after the nonterminal symbols, prefixed with ``\verb+N_+''. %% type 'a symbol The type \verb+'a symbol+ % (an algebraic data type) is the disjoint union of the types \verb+'a terminal+ and \verb+'a nonterminal+. In other words, a value of type \verb+'a symbol+ represents a terminal or nonterminal symbol (without a semantic value).
This type is (always) defined as follows: % \begin{verbatim} type 'a symbol = | T : 'a terminal -> 'a symbol | N : 'a nonterminal -> 'a symbol \end{verbatim} %% type xsymbol The type \verb+xsymbol+ is an existentially quantified version of the type \verb+'a symbol+. It is useful in situations where the index \verb+'a+ is not statically known. It is (always) defined as follows: % \begin{verbatim} type xsymbol = | X : 'a symbol -> xsymbol \end{verbatim} %% type item The type \verb+item+ describes an LR(0) item, that is, a pair of a production \verb+prod+ and an index \verb+i+ into the right-hand side of this production. If the length of the right-hand side is \verb+n+, then \verb+i+ is comprised between 0 and \verb+n+, inclusive. \begin{verbatim} type item = production * int \end{verbatim} %% Comparison functions. The following functions implement total orderings on the types \verb+_ terminal+, \verb+_ nonterminal+, \verb+xsymbol+, \verb+production+, and \verb+item+. \begin{verbatim} val compare_terminals: _ terminal -> _ terminal -> int val compare_nonterminals: _ nonterminal -> _ nonterminal -> int val compare_symbols: xsymbol -> xsymbol -> int val compare_productions: production -> production -> int val compare_items: item -> item -> int \end{verbatim} %% val incoming_symbol The function \verb+incoming_symbol+ maps a (non-initial) LR(1) state~\verb+s+ to its incoming symbol, that is, the symbol that the parser must recognize before it enters the state \verb+s+. % \begin{verbatim} val incoming_symbol: 'a lr1state -> 'a symbol \end{verbatim} % This function can be used to gain access to the semantic value \verb+v+ in a stack element \verb+Element (s, v, _, _)+. Indeed, by case analysis on the symbol \verb+incoming_symbol s+, one gains information about the type \verb+'a+, hence one obtains the ability to do something useful with the value~\verb+v+. 
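To make this case-analysis pattern concrete, here is a self-contained sketch for the running example (tokens \verb+A+ and \verb+B+, start symbol \verb+main+). The GADT definitions mirror those shown above, with \verb+string+ standing in for the type \verb+thing+; the type \verb+element+ is simplified, in that the real type carries an \verb+'a lr1state+ from which \verb+incoming_symbol+ recovers the \verb+'a symbol+, whereas here the symbol is stored directly.

```ocaml
(* Simplified mirrors of the inspection API's GADTs. *)
type _ terminal =
  | T_A : unit terminal
  | T_B : int terminal

type _ nonterminal =
  | N_main : string nonterminal   (* string stands in for the type thing *)

type 'a symbol =
  | T : 'a terminal -> 'a symbol
  | N : 'a nonterminal -> 'a symbol

(* Simplified: the real element carries an 'a lr1state, and one would
   obtain the symbol by applying incoming_symbol to that state. *)
type element =
  | Element : 'a symbol * 'a -> element

(* Matching on the symbol refines the existential type 'a, so the
   semantic value v becomes usable at its precise type. *)
let describe (Element (sym, v)) =
  match sym with
  | T T_A    -> "A"
  | T T_B    -> "B " ^ string_of_int v
  | N N_main -> "main " ^ v
```

In real code, one would match on \verb+Element (s, v, _, _)+ and dispatch on \verb+incoming_symbol s+ in the same way.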
%% val items The function \verb+items+ maps a (non-initial) LR(1) state~\verb+s+ to its LR(0) \emph{core}, that is, to the underlying set of LR(0) items. This set is represented as a list, whose elements appear in an arbitrary order. This set is \emph{not} closed under $\epsilon$-transitions. % \begin{verbatim} val items: _ lr1state -> item list \end{verbatim} %% val lhs %% val rhs The functions \verb+lhs+ and \verb+rhs+ map a production \verb+prod+ to its left-hand side and right-hand side, respectively. The left-hand side is always a nonterminal symbol, hence always of the form \verb+N _+. The right-hand side is a (possibly empty) sequence of (terminal or nonterminal) symbols. % \begin{verbatim} val lhs: production -> xsymbol val rhs: production -> xsymbol list \end{verbatim} % %% val nullable The function \verb+nullable+, applied to a non-terminal symbol, tells whether this symbol is nullable. A nonterminal symbol is nullable if and only if it produces the empty word $\epsilon$. % \begin{verbatim} val nullable: _ nonterminal -> bool \end{verbatim} %% val first %% val xfirst The function call \verb+first nt t+ tells whether the \emph{FIRST} set of the nonterminal symbol \verb+nt+ contains the terminal symbol \verb+t+. That is, it returns \verb+true+ if and only if \verb+nt+ produces a word that begins with \verb+t+. The function \verb+xfirst+ is identical to \verb+first+, except it expects a first argument of type \verb+xsymbol+ instead of \verb+_ terminal+. % \begin{verbatim} val first: _ nonterminal -> _ terminal -> bool val xfirst: xsymbol -> _ terminal -> bool \end{verbatim} %% val foreach_terminal %% val foreach_terminal_but_error The function \verb+foreach_terminal+ enumerates the terminal symbols, including the special symbol \error. The function \verb+foreach_terminal_but_error+ enumerates the terminal symbols, excluding \error. 
\begin{verbatim}
val foreach_terminal:           (xsymbol -> 'a -> 'a) -> 'a -> 'a
val foreach_terminal_but_error: (xsymbol -> 'a -> 'a) -> 'a -> 'a
\end{verbatim}

%% val feed

\verb+feed symbol startp semv endp env+ causes the parser to consume the
(terminal or nonterminal) symbol \verb+symbol+, accompanied with the semantic
value \verb+semv+ and with the start and end positions \verb+startp+ and
\verb+endp+. Thus, the automaton makes a transition, and reaches a new state.
The stack grows by one cell. This operation is permitted only if the current
state (as determined by \verb+env+) has an outgoing transition labeled with
\verb+symbol+. Otherwise, an \verb+Invalid_argument+ exception is raised.

\begin{verbatim}
val feed: 'a symbol -> position -> 'a -> position -> 'b env -> 'b env
\end{verbatim}

% TEMPORARY
% document the modules that use the inspection API: Printers
% document MenhirLib.General?
% The directory \distrib{demos/calc-inspection} contains a demo that
% illustrates the use of the inspection API.
% review it / clean it up!

% ------------------------------------------------------------------------------

\section{Error handling: the traditional way}
\label{sec:errors}

\menhir's traditional error handling mechanism is considered deprecated:
although it is still supported for the time being, it might be removed in the
future. We recommend setting up an error handling mechanism using the new
tools offered by \menhir (\sref{sec:errors:new}).

\paragraph{Error handling}

\menhir's traditional error handling mechanism is inspired by that of \yacc
and \ocamlyacc, but is not identical. A special \error token is made
available for use within productions. The LR automaton is constructed exactly
as if \error was a regular terminal symbol. However, \error is never produced
by the lexical analyzer. Instead, when an error is detected, the current
lookahead token is discarded and replaced with the \error token, which
becomes the current lookahead token.
At this point, the parser enters \emph{error handling} mode. In error handling mode, automaton states are popped off the automaton's stack until a state that can \emph{act} on \error is found. This includes \emph{both} shift \emph{and} reduce actions. (\yacc and \ocamlyacc do not trigger reduce actions on \error. It is somewhat unclear why this is so.) When a state that can reduce on \error is found, reduction is performed. Since the lookahead token is still \error, the automaton remains in error handling mode. When a state that can shift on \error is found, the \error token is shifted. At this point, the parser returns to normal mode. When no state that can act on \error is found on the automaton's stack, the parser stops and raises the exception \texttt{Error}. This exception carries no information. The position of the error can be obtained by reading the lexical analyzer's environment record. \paragraph{Error recovery} \ocamlyacc offers an error recovery mode, which is entered immediately after an \error token was successfully shifted. In this mode, tokens are repeatedly taken off the input stream and discarded until an acceptable token is found. This feature is no longer offered by \menhir. \paragraph{Error-related keywords} The following keyword is made available to semantic actions. When the \verb+$syntaxerror+ keyword is evaluated, evaluation of the semantic action is aborted, so that the current reduction is abandoned; the current lookahead token is discarded and replaced with the \error token; and error handling mode is entered. Note that there is no mechanism for inserting an \error token \emph{in front of} the current lookahead token, even though this might also be desirable. It is unclear whether this keyword is useful; it might be suppressed in the future. 
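The pop-until-an-action loop described above can be sketched as follows. This is a simplified model, not Menhir's implementation: states are plain integers, the table \verb+shift_on_error+ is hypothetical, and reduce actions on \error are omitted for brevity.

```ocaml
exception Error

(* Hypothetical table: states that have an outgoing transition on the
   error token, mapped to the target state of that transition. *)
let shift_on_error (s : int) : int option =
  match s with
  | 2 -> Some 7
  | 5 -> Some 9
  | _ -> None

(* Error handling mode: pop states off the stack until one that can
   shift the error token is found; if the stack is exhausted, raise
   [Error]. (The real mechanism also triggers reduce actions on the
   error token; this sketch omits them.) *)
let rec handle_error (stack : int list) : int list =
  match stack with
  | [] -> raise Error
  | s :: rest ->
      (match shift_on_error s with
       | Some s' -> s' :: stack  (* shift the error token; resume normal mode *)
       | None -> handle_error rest)
```

For instance, with the hypothetical table above, a stack \verb+[1; 5; 0]+ is popped down to \verb+[5; 0]+, then the \error token is shifted, yielding \verb+[9; 5; 0]+.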
% ------------------------------------------------------------------------------ \section{Error handling: the new way} \label{sec:errors:new} \menhir's incremental API (\sref{sec:incremental}) allows taking control when an error is detected. Indeed, as soon as an invalid token is detected, the parser produces a checkpoint of the form \verb+HandlingError _+. At this point, if one decides to let the parser proceed, by just calling \verb+resume+, then \menhir enters its traditional error handling mode (\sref{sec:errors}). Instead, however, one can decide to take control and perform error handling or error recovery in any way one pleases. One can, for instance, build and display a diagnostic message, based on the automaton's current stack and/or state. Or, one could modify the input stream, by inserting or deleting tokens, so as to suppress the error, and resume normal parsing. In principle, the possibilities are endless. An apparently simple-minded approach to error reporting, proposed by Jeffery~\cite{jeffery-03} and further explored by Pottier~\cite{pottier-reachability-cc-2016}, consists in selecting a diagnostic message (or a template for a diagnostic message) based purely on the current state of the automaton. In this approach, one determines, ahead of time, which are the ``error states'' (that is, the states in which an error can be detected), and one prepares, for each error state, a diagnostic message. Because state numbers are fragile (they change when the grammar evolves), an error state is identified not by its number, but by an input sentence that leads to it: more precisely, by an input sentence which causes an error to be detected in this state. Thus, one maintains a set of pairs of an erroneous input sentence and a diagnostic message. 
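Concretely, with the incremental API, one can intercept the \verb+HandlingError+ checkpoint and select a message based purely on the number of the current state. The sketch below is self-contained: the checkpoint type is a simplified stand-in for the one in Menhir's incremental API, and the state numbers and messages are hypothetical.

```ocaml
(* A simplified stand-in for the checkpoint type of Menhir's incremental
   API; the state numbers and messages below are hypothetical. *)
type 'a checkpoint =
  | Accepted of 'a
  | HandlingError of int  (* the number of the state where the error is detected *)

(* One message per error state, in the style advocated by Jeffery. *)
let message (s : int) : string =
  match s with
  | 8  -> "A type is expected at this point.\n"
  | 15 -> "A semicolon is expected after this declaration.\n"
  | _  -> "Syntax error.\n"

(* Taking control when an error is detected, instead of calling [resume]. *)
let report (c : 'a checkpoint) : string =
  match c with
  | Accepted _      -> "accepted"
  | HandlingError s -> message s
```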
\menhir defines a file format, the \messages file format, for representing
this information (\sref{sec:messages:format}), and offers a set of tools for
creating, maintaining, and exploiting \messages files
(\sref{sec:messages:tools}). Once one understands these tools, it remains to
write a collection of diagnostic messages, a more subtle task than one might
think (\sref{sec:errors:diagnostics}), and to glue everything together
(\sref{sec:errors:example}).

In this approach to error handling, as in any other approach, one must
understand exactly when (that is, in which states) errors are detected. This
in turn requires understanding how the automaton is constructed. \menhir's
construction technique is not Knuth's canonical LR(1)
technique~\cite{knuth-lr-65}, which is usually too expensive to be practical.
Instead, \menhir \emph{merges} states~\cite{pager-77} and introduces
so-called \emph{default reductions}. These techniques \emph{defer} error
detection by allowing extra reductions to take place before an error is
detected.
% Furthermore, \menhir supports \donerrorreduce declarations,
% which also introduce extra reductions.
The impact of these alterations must be taken into account when writing
diagnostic messages (\sref{sec:errors:diagnostics}).

In this approach to error handling, the special \error token is not used. It
should not appear in the grammar. Similarly, the \verb+$syntaxerror+ keyword
should not be used.

% ------------------------------------------------------------------------------

\subsection{The \messages file format}
\label{sec:messages:format}

A \messages file is a text file. Comment lines, which begin with a \verb+#+
character, are ignored everywhere. As is evident in the following
description, blank lines are significant: they are used as separators between
entries and within an entry.

A~\messages file is composed of a list of entries. Two entries are separated
by one or more blank lines.
Each entry consists of one or more input sentences, followed with one or more blank lines, followed with a message. The syntax of an input sentence is described in \sref{sec:sentences}. A message is arbitrary text, but cannot contain a blank line. We stress that there cannot be a blank line between two sentences (if there is one, \menhir becomes confused and may complain about some word not being ``a known non-terminal symbol''). \begin{figure} \begin{verbatim} grammar: TYPE UID grammar: TYPE OCAMLTYPE UID PREC # A (handwritten) comment. Ill-formed declaration. Examples of well-formed declarations: %type expression %type date time \end{verbatim} \caption{An entry in a \messages file} \label{fig:messages:entry} \end{figure} \begin{figure} \begin{verbatim} grammar: TYPE UID ## ## Ends in an error in state: 1. ## ## declaration -> TYPE . OCAMLTYPE separated_nonempty_list(option(COMMA), ## strict_actual) [ TYPE TOKEN START RIGHT PUBLIC PERCENTPERCENT PARAMETER ## ON_ERROR_REDUCE NONASSOC LEFT INLINE HEADER EOF COLON ] ## ## The known suffix of the stack is as follows: ## TYPE ## grammar: TYPE OCAMLTYPE UID PREC ## ## Ends in an error in state: 5. ## ## strict_actual -> symbol . loption(delimited(LPAREN,separated_nonempty_list ## (COMMA,strict_actual),RPAREN)) [ UID TYPE TOKEN START STAR RIGHT QUESTION ## PUBLIC PLUS PERCENTPERCENT PARAMETER ON_ERROR_REDUCE NONASSOC LID LEFT ## INLINE HEADER EOF COMMA COLON ] ## ## The known suffix of the stack is as follows: ## symbol ## # A (handwritten) comment. Ill-formed declaration. Examples of well-formed declarations: %type expression %type date time \end{verbatim} \caption{An entry in a \messages file, decorated with auto-generated comments} \label{fig:messages:entry:decorated} \end{figure} As an example, \fref{fig:messages:entry} shows a valid entry, taken from \menhir's own \messages file. This entry contains two input sentences, which lead to errors in two distinct states. 
A single message is associated with these two error states.

Several commands, described next (\sref{sec:messages:tools}), produce
\messages files where each input sentence is followed with an auto-generated
comment, marked with \verb+##+. This special comment indicates in which state
the error is detected, and is supposed to help the reader understand what it
means to be in this state: What has been read so far? What is expected next?

As an example, the previous entry, decorated with auto-generated comments, is
shown in \fref{fig:messages:entry:decorated}. (We have manually wrapped the
lines that did not fit in this document.)

An auto-generated comment begins with the number of the error state that is
reached via this input sentence. Then, the auto-generated comment shows the
LR(1) items that compose this state, in the same format as in an \automaton
file. These items offer a description of the past (that is, what has been
read so far) and the future (that is, which terminal symbols are allowed
next). Finally, the auto-generated comment shows what is known about the
stack when the automaton is in this state. (This can be deduced from the
LR(1) items, but is more readable if shown separately.)
% Plus, there might be cases where the known suffix is longer than what
% the LR(1) items suggest. But I have never seen this yet.

In a canonical LR(1) automaton, the LR(1) items offer an exact description of
the past and future. However, in a noncanonical automaton, which is by
default what \menhir produces, the situation is more subtle. The lookahead
sets can be over-approximated, so the automaton can perform one or more
``spurious reductions'' before an error is detected.
As a result, the LR(1) items in the error state offer a description of the
future that may be both incorrect (that is, a terminal symbol that appears in
a lookahead set is not necessarily a valid continuation) and incomplete (that
is, a terminal symbol that does not appear in any lookahead set may
nevertheless be a valid continuation). More details appear further on
(\sref{sec:errors:diagnostics}).

In order to attract the user's attention to this issue, if an input sentence
causes one or more spurious reductions, then the auto-generated comment
contains a warning about this fact. This mechanism is not completely
foolproof, though, as it may be the case that one particular sentence does
not cause any spurious reductions (hence, no warning appears), yet leads to
an error state that can be reached via other sentences that do involve
spurious reductions.
% Not sure what to conclude about this issue...

% ------------------------------------------------------------------------------

\subsection{Maintaining \messages files}
\label{sec:messages:tools}

Ideally, the set of input sentences in a \messages file should be correct
(that is, every sentence causes an error on its last token), irredundant
(that is, no two sentences lead to the same error state), and complete (that
is, every error state is reached by some sentence).

Correctness and irredundancy are checked by the command \ocompileerrors
\nt{filename}, where \nt{filename} is the name of a \messages file. This
command fails if a sentence does not cause an error at all, or causes an
error too early. It also fails if two sentences lead to the same error state.
%
If the file is correct and irredundant, then (as its name suggests) this
command compiles the \messages file down to an \ocaml function, whose code is
printed on the standard output channel. This function, named \verb+message+,
has type \verb+int -> string+, and maps a state number to a message.
It raises the exception \verb+Not_found+ if its argument is not the number of a state for which a message has been defined. Completeness is checked via the commands \olisterrors and \ocompareerrors. The former produces, from scratch, a complete set of input sentences, that is, a set of input sentences that reaches all error states. The latter compares two sets of sentences (more precisely, the two underlying sets of error states) for inclusion. The command \olisterrors first computes all possible ways of causing an error. From this information, it deduces a list of all error states, that is, all states where an error can be detected. For each of these states, it computes a (minimal) input sentence that causes an error in this state. Finally, it prints these sentences, in the \messages file format, on the standard output channel. Each sentence is followed with an auto-generated comment and with a dummy diagnostic message. The user should be warned that this algorithm may require large amounts of time (typically in the tens of seconds, possibly more) and memory (typically in the gigabytes, possibly more). It requires a 64-bit machine. (On a 32-bit machine, it works, but quickly hits a built-in size limit.) At the verbosity level \ologautomaton~\texttt{2}, it displays some progress information and internal statistics on the standard error channel. The command \ocompareerrors \nt{filename1} \ocompareerrors \nt{filename2} compares the \messages files \nt{filename1} and \nt{filename2}. Each file is read and internally translated to a mapping of states to messages. \menhir then checks that the left-hand mapping is a subset of the right-hand mapping. That is, if a state~$s$ is reached by some sentence in \nt{filename1}, then it should also be reached by some sentence in \nt{filename2}. Furthermore, if the message associated with $s$ in \nt{filename1} is not a dummy message, then the same message should be associated with $s$ in \nt{filename2}. 
To check that the sentences in \nt{filename2} cover all error states, it
suffices to (1)~use \olisterrors to produce a complete set of sentences,
which one stores in \nt{filename1}, then (2)~use \ocompareerrors to compare
\nt{filename1} and \nt{filename2}.

In the case of a grammar that evolves fairly often, it can take significant
human time and effort to update the \messages file and ensure correctness,
irredundancy, and completeness. A way of reducing this effort is to abandon
completeness. This implies that the auto-generated \verb+message+ function
can raise \verb+Not_found+ and that a generic ``syntax error'' message must
be produced in that case. We prefer to discourage this approach, as it
implies that the end user is exposed to a mixture of specific and generic
syntax error messages, and there is no guarantee that the specific
(hand-written) messages will appear in \emph{all} situations where they are
expected to appear. Instead, we recommend waiting for the grammar to become
stable and enforcing completeness.

The command \oupdateerrors \nt{filename} is used to update the
auto-generated comments in the \messages file \nt{filename}. It is typically
used after a change in the grammar (or in the command line options that
affect the construction of the automaton). A new \messages file is produced
on the standard output channel. It is identical to \nt{filename}, except the
auto-generated comments, identified by \verb+##+, have been removed and
re-generated.

The command \oechoerrors \nt{filename} is used to filter out all comments,
blank lines, and messages from the \messages file \nt{filename}. The input
sentences, and nothing else, are echoed on the standard output channel. As an
example application, one could then translate the sentences to concrete
syntax and create a collection of source files that trigger every possible
syntax error.

The command \ointerpreterror is analogous to \ointerpret. It causes \menhir
to act as an interpreter.
\menhir reads sentences off the standard input channel, parses them, and
displays the outcome. This switch can be usefully combined with \otrace. The
main difference between \ointerpret and \ointerpreterror is that, when the
latter command is used, \menhir expects the input sentence to cause an error
on its last token, and displays information about the state in which the
error is detected, in the form of a \messages file entry. This can be used to
quickly find out exactly what error is caused by one particular input
sentence.

% ------------------------------------------------------------------------------

\subsection{Writing accurate diagnostic messages}
\label{sec:errors:diagnostics}

One might think that writing a diagnostic message for each error state is a
straightforward (if lengthy) task. In reality, it is not so simple.
% Here are a few guidelines.
% The reader is referred to Pottier's
% paper~\cite{pottier-reachability-cc-2016} for more details.

\paragraph{A state, not a sentence}

The first thing to keep in mind is that a diagnostic message is associated
with a \emph{state}~$s$, as opposed to a sentence. An entry in a \messages
file contains a sentence~$w$ that leads to an error in state~$s$. This
sentence is just one way of causing an error in state~$s$; there may exist
many other sentences that also cause an error in this state. The diagnostic
message should not be specific to the sentence~$w$: it should make sense
regardless of how the state~$s$ is reached.

As a rule of thumb, when writing a diagnostic message, one should (as much as
possible) ignore the example sentence~$w$ altogether, and concentrate on the
description of the state~$s$, which appears as part of the auto-generated
comment. The LR(1) items that compose the state~$s$ offer a description of
the past (that is, what has been read so far) and the future (that is, which
terminal symbols are allowed next). A diagnostic message should be designed
based on this description.
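To make the description of \ocompileerrors in the previous subsection concrete, the function that it prints has roughly the following shape. This is a hypothetical sketch: the state numbers 1 and 5 come from the entry shown in \fref{fig:messages:entry:decorated}, and the message text is abridged. If completeness is abandoned, the caller must be prepared to catch \verb+Not_found+ and fall back on a generic message, as shown by the second function.

```ocaml
(* Hypothetical sketch of the function printed by --compile-errors:
   it maps an error state number to its message, and raises Not_found
   for states that have no entry in the .messages file. *)
let message (s : int) : string =
  match s with
  | 1 | 5 -> "Ill-formed declaration.\n"
  | _ -> raise Not_found

(* If the .messages file is incomplete, a generic fallback is needed. *)
let message_or_generic (s : int) : string =
  try message s with Not_found -> "Syntax error.\n"
```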
\begin{figure} \verbatiminput{declarations.mly} \caption{A grammar where one error state is difficult to explain} \label{fig:declarations} \end{figure} \begin{figure} \begin{verbatim} program: ID COLON ID LPAREN ## ## Ends in an error in state: 8. ## ## typ1 -> typ0 . [ SEMICOLON RPAREN ] ## typ1 -> typ0 . ARROW typ1 [ SEMICOLON RPAREN ] ## ## The known suffix of the stack is as follows: ## typ0 ## \end{verbatim} \caption{A problematic error state in the grammar of \fref{fig:declarations}, due to over-approximation} \label{fig:declarations:over} \end{figure} \paragraph{The problem of over-approximated lookahead sets} As pointed out earlier (\sref{sec:messages:format}), in a noncanonical automaton, the lookahead sets in the LR(1) items can be both over- and under-approximated. One must be aware of this phenomenon, otherwise one runs the risk of writing a diagnostic message that proposes too many or too few continuations. As an example, let us consider the grammar in \fref{fig:declarations}. According to this grammar, a ``program'' is either a declaration between parentheses or a declaration followed with a semicolon. A ``declaration'' is an identifier, followed with a colon, followed with a type. A ``type'' is an identifier, a type between parentheses, or a function type in the style of \ocaml. The (noncanonical) automaton produced by \menhir for this grammar has 17~states. Using \olisterrors, we find that an error can be detected in 10 of these 17~states. By manual inspection of the auto-generated comments, we find that for 9 out of these 10~states, writing an accurate diagnostic message is easy. However, one problematic state remains, namely state~8, shown in \fref{fig:declarations:over}. In this state, a (level-0) type has just been read. One valid continuation, which corresponds to the second LR(1) item in \fref{fig:declarations:over}, is to continue this type: the terminal symbol \verb+ARROW+, followed with a (level-1) type, is a valid continuation. 
Now, the question is, what other valid continuations are there? By examining the first LR(1) item in \fref{fig:declarations:over}, it may look as if both \verb+SEMICOLON+ and \verb+RPAREN+ are valid continuations. However, this cannot be the case. A moment's thought reveals that \emph{either} we have seen an opening parenthesis \verb+LPAREN+ at the very beginning of the program, in which case we definitely expect a closing parenthesis \verb+RPAREN+; \emph{or} we have not seen one, in which case we definitely expect a semicolon \verb+SEMICOLON+. It is \emph{never} the case that \emph{both} \verb+SEMICOLON+ and \verb+RPAREN+ are valid continuations! In fact, the lookahead set in the first LR(1) item in \fref{fig:declarations:over} is over-approximated. State~8 in the noncanonical automaton results from merging two states in the canonical automaton. In such a situation, one cannot write an accurate diagnostic message. % by lack of ``static context''. Knowing that the automaton is in state~8 does not give us a precise view of the valid continuations. Some valuable information (that is, whether we have seen an opening parenthesis \verb+LPAREN+ at the very beginning of the program) is buried in the automaton's stack. \begin{figure} \verbatiminput{declarations-phantom.mly} \caption{Splitting the problematic state of \fref{fig:declarations:over} via selective duplication} \label{fig:declarations:phantom} \end{figure} \begin{figure} \verbatiminput{declarations-onerrorreduce.mly} \caption{Avoiding the problematic state of \fref{fig:declarations:over} via reductions on error} \label{fig:declarations:onerrorreduce} \end{figure} \begin{figure} \begin{verbatim} program: ID COLON ID LPAREN ## ## Ends in an error in state: 15. ## ## program -> declaration . SEMICOLON [ # ] ## ## The known suffix of the stack is as follows: ## declaration ## ## WARNING: This example involves spurious reductions. 
## This implies that, although the LR(1) items shown above provide an ## accurate view of the past (what has been recognized so far), they ## may provide an INCOMPLETE view of the future (what was expected next). ## In state 8, spurious reduction of production typ1 -> typ0 ## In state 11, spurious reduction of production declaration -> ID COLON typ1 ## \end{verbatim} \caption{A problematic error state in the grammar of \fref{fig:declarations:onerrorreduce}, due to under-approximation} \label{fig:declarations:under} \end{figure} How can one work around this problem? Let us suggest three options. \paragraph{Blind duplication of states} One option would be to build a canonical automaton by using the % (undocumented!) \ocanonical switch. In this example, one would obtain a 27-state automaton, where the problem has disappeared. However, this option is rarely viable, as it duplicates many states without good reason. \paragraph{Selective duplication of states} A second option is to manually cause just enough duplication to remove the problematic over-approximation. In our example, we wish to distinguish two kinds of types and declarations, namely those that must be followed with a closing parenthesis, and those that must be followed with a semicolon. We create such a distinction by parameterizing \verb+typ1+ and \verb+declaration+ with a phantom parameter. The modified grammar is shown in \fref{fig:declarations:phantom}. The phantom parameter does not affect the language that is accepted: for instance, the nonterminal symbols \texttt{declaration(SEMICOLON)} and \texttt{declaration(RPAREN)} generate the same language as \texttt{declaration} in the grammar of \fref{fig:declarations}. Yet, by giving distinct names to these two symbols, we force the construction of an automaton where more states are distinguished. In this example, \menhir produces a 23-state automaton. 
Using \olisterrors, we find that an error can be detected in 11 of these
23~states, and by manual inspection of the auto-generated comments, we find
that for each of these 11~states, writing an accurate diagnostic message is
easy. In summary, we have selectively duplicated just enough states so as to
split the problematic error state into two non-problematic error states.
% I wonder whether there is a link with the translation of LR(k+1) to LR(k)...
% We can see that the FOLLOW set is built into the nonterminal symbol.

\paragraph{Reductions on error}

A third and last option is to introduce an \donerrorreduce declaration
(\sref{sec:onerrorreduce}) so as to prevent the detection of an error in the
problematic state~8. We see in \fref{fig:declarations:over} that, in
state~8, the production $\texttt{typ1} \rightarrow \texttt{typ0}$ is ready
to be reduced. If we could force this reduction to take place, then the
automaton would move to some other state where it would be clear which of
\verb+SEMICOLON+ and \verb+RPAREN+ is expected. We achieve this by marking
\verb+typ1+ as ``reducible on error''. The modified grammar is shown in
\fref{fig:declarations:onerrorreduce}. For this grammar, \menhir produces a
17-state automaton. (This is the exact same automaton as for the grammar of
\fref{fig:declarations}, except 2 of the 17 states have received extra
reduction actions.) Using \olisterrors, we find that an error can be
detected in 9 of these~17 states. The problematic state, namely state~8, is
no longer an error state! The problem has vanished.

\paragraph{The problem of under-approximated lookahead sets}

The third option seems by far the simplest of all, and is recommended in
many situations. However, it comes with a caveat. There may now exist states
whose lookahead sets are under-approximated, in a certain sense. Because of
this, there is a danger of writing an incomplete diagnostic message, one
that does not list all valid continuations.
To see this, let us look again at the sentence \texttt{ID COLON ID LPAREN}. In the grammar and automaton of \fref{fig:declarations}, this sentence takes us to the problematic state~8, shown in \fref{fig:declarations:over}. In the grammar and automaton of \fref{fig:declarations:onerrorreduce}, because more reduction actions are carried out before the error is detected, this sentence takes us to state~15, shown in \fref{fig:declarations:under}. When writing a diagnostic message for state~15, one might be tempted to write: ``Up to this point, a declaration has been recognized. At this point, a semicolon is expected''. Indeed, by examining the sole LR(1) item in state~15, it looks as if \verb+SEMICOLON+ is the only permitted continuation. However, this is not the case. Another valid continuation is \verb+ARROW+: indeed, the sentence \texttt{ID COLON ID ARROW ID SEMICOLON} forms a valid program. In fact, if the first token following \texttt{ID COLON ID} is \texttt{ARROW}, then in state~8 this token is shifted, so the two reductions that take us from state~8 through state~11 to state~15 never take place. This is why, even though \texttt{ARROW} does not appear in state~15 as a valid continuation, it nevertheless is a valid continuation of \texttt{ID COLON ID}. The warning produced by \menhir, shown in \fref{fig:declarations:under}, is supposed to attract attention to this issue. Another way to explain this issue is to point out that, by declaring \verb+%on_error_reduce typ1+, we make a choice. When the parser reads a type and finds an invalid token, it decides that this type is finished, even though, in reality, this type could be continued with \verb+ARROW+ \ldots. This in turn causes the parser to perform another reduction and consider the current declaration finished, even though, in reality, this declaration could be continued with \verb+ARROW+ \ldots. 
In summary, when writing a diagnostic message for state~15, one should take into account the fact that this state can be reached via spurious reductions and (therefore) \verb+SEMICOLON+ may not be the only permitted continuation. One way of doing this, without explicitly listing all permitted continuations, is to write: ``Up to this point, a declaration has been recognized. If this declaration is complete, then at this point, a semicolon is expected''. % ------------------------------------------------------------------------------ \subsection{A working example} \label{sec:errors:example} The CompCert verified compiler offers a real-world example of this approach to error handling. The ``pre-parser'' is where syntax errors are detected: see \compcertgithubfile{cparser/pre\_parser.mly}. % (The pre-parser is also in charge of distinguishing type names versus variable % names, but that is an independent issue.) A database of erroneous input sentences and (templates for) diagnostic messages is stored in \compcertgithubfile{cparser/handcrafted.messages}. It is compiled, using \ocompileerrors, to an \ocaml file named \texttt{cparser/pre\_parser\_messages.ml}. The function \verb+Pre_parser_messages.message+, which maps a state number to (a template for) a diagnostic message, is called from \compcertgithubfile{cparser/ErrorReports.ml}, where we construct and display a full-fledged diagnostic message. In CompCert, we allow a template for a diagnostic message to contain the special form \verb+$i+, where \verb+i+ is an integer constant, understood as an index into the parser's stack. The code in \compcertgithubfile{cparser/ErrorReports.ml} automatically replaces this special form with the fragment of the source text that corresponds to this stack entry. This mechanism is not built into \menhir; it is implemented in CompCert using \menhir's incremental API. 
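The substitution mechanism described above can be sketched as follows. This is a self-contained illustration, not CompCert's actual code: the function \verb+fragment+, which maps a stack index to the corresponding fragment of the source text, is assumed given.

```ocaml
(* A sketch of template expansion in the style described above: every
   occurrence of $i in a message template, where i is an integer constant,
   is replaced with the text fragment associated with stack entry i.
   [fragment] is a hypothetical function provided by the caller. *)
let expand (fragment : int -> string) (template : string) : string =
  let b = Buffer.create (String.length template) in
  let n = String.length template in
  let i = ref 0 in
  while !i < n do
    if template.[!i] = '$' && !i + 1 < n
       && template.[!i + 1] >= '0' && template.[!i + 1] <= '9'
    then begin
      (* Read the integer constant that follows the dollar sign. *)
      let j = ref (!i + 1) in
      while !j < n && template.[!j] >= '0' && template.[!j] <= '9' do
        incr j
      done;
      let k = int_of_string (String.sub template (!i + 1) (!j - !i - 1)) in
      Buffer.add_string b (fragment k);
      i := !j
    end else begin
      Buffer.add_char b template.[!i];
      incr i
    end
  done;
  Buffer.contents b
```

For example, expanding the template \verb+"got $0 before $1."+ replaces \verb+$0+ and \verb+$1+ with the source text of the corresponding stack entries.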
% ------------------------------------------------------------------------------

\section{Coq back-end}
\label{sec:coq}

\menhir is able to generate a parser whose correctness can be formally
verified using the Coq proof assistant~\cite{jourdan-leroy-pottier-12}. This
feature is used to construct the parser of the CompCert verified
compiler~\cite{compcert}.

Setting the \ocoq switch on the command line enables the Coq back-end. When
this switch is set, \menhir expects an input file whose name ends in \vy and
generates a Coq file whose name ends in \texttt{.v}.

Like a \mly file, a \vy file is a grammar specification, with embedded
semantic actions. The only difference is that the semantic actions in a \vy
file are expressed in Coq instead of \ocaml. A \vy file otherwise uses the
same syntax as a \mly file. CompCert's
\compcertgithubfile{cparser/Parser.vy} serves as an example.

Several restrictions are imposed when \menhir is used in \ocoq mode:
%
\begin{itemize}
\item
The error handling mechanism (\sref{sec:errors}) is absent. The
\verb+$syntaxerror+ keyword and the \error token are not supported.
\item
Location information is not propagated. The \verb+$start*+ and \verb+$end*+
keywords (\fref{fig:pos}) are not supported.
\item
\dparameter (\sref{sec:parameter}) is not supported.
\item
\dinline (\sref{sec:inline}) is not supported.
\item
The standard library (\sref{sec:library}) is not supported, of course,
because its semantic actions are expressed in \ocaml. If desired, the user
can define an analogous library, whose semantic actions are expressed in
Coq.
\item
Because Coq's type inference algorithm is rather unpredictable, the Coq type
of every nonterminal symbol must be provided via a \dtype or \dstart
declaration (\sref{sec:type}, \sref{sec:start}).
\item
Unless the proof of completeness has been deactivated using
\ocoqnocomplete, the grammar must not have a conflict (not even a benign
one, in the sense of \sref{sec:conflicts:benign}).
That is, the grammar must be LR(1). Conflict resolution via priority and associativity declarations (\sref{sec:assoc}) is not supported. The reason is that there is no simple formal specification of how conflict resolution should work.
\end{itemize}

The generated file contains several modules:
\begin{itemize}
\item The module \verb+Gram+ defines the terminal and non-terminal symbols, the grammar, and the semantic actions.
\item The module \verb+Aut+ contains the automaton generated by \menhir, together with a certificate that is checked by Coq while establishing the soundness and completeness of the parser.
\end{itemize}

The type~\verb+terminal+ of the terminal symbols is an inductive type, with one constructor for each terminal symbol. A terminal symbol named \verb+Foo+ in the \verb+.vy+ file is named \verb+Foo't+ in Coq. A~terminal symbol per se does not carry a semantic value. We also define the type \verb+token+ of tokens, that is, dependent pairs of a terminal symbol and a semantic value of an appropriate type for this symbol. We model the lexer as an object of type \verb+Streams.Stream token+, that is, an infinite stream of tokens.
% TEMPORARY document that, consequently, after extraction, the only way for an
% OCaml lexer to produce tokens is to use Obj.magic
% cf. the function compute_token_stream in CompCert's Lexer.mll:
% Cons (Coq_existT (t, Obj.magic v), Lazy.from_fun compute_token_stream)

The type~\verb+nonterminal+ of the non-terminal symbols is an inductive type, with one constructor for each non-terminal symbol. A non-terminal symbol named \verb+Bar+ in the \verb+.vy+ file is named \verb+Bar'nt+ in Coq.

The proof of termination of an LR(1) parser in the case of invalid input seems far from obvious. We did not find such a proof in the literature. In an application such as CompCert~\cite{compcert}, this question is not considered crucial. For this reason, we did not formally establish the termination of the parser.
Instead, in order to satisfy Coq's termination requirements, we use the ``fuel'' technique: the parser takes an additional parameter \verb+log_fuel+ of type \verb+nat+ such that $2^{\verb+log_fuel+}$ is the maximum number of steps the parser is allowed to perform. In practice, one can use a value of, say, 40 or 50 to make sure that the parser will never run out of fuel in a reasonable time.

Parsing can have three different outcomes, represented by the type \verb+parse_result+.
%
(This definition is implicitly parameterized over the initial state~\verb+init+. We omit the details here.)
%
\begin{verbatim}
Inductive parse_result :=
  | Fail_pr: parse_result
  | Timeout_pr: parse_result
  | Parsed_pr: symbol_semantic_type (NT (start_nt init)) ->
               Stream token -> parse_result.
\end{verbatim}
The outcome \verb+Fail_pr+ means that parsing has failed because of a syntax error. (If the completeness of the parser with respect to the grammar has been proved, this implies that the input is invalid.) The outcome \verb+Timeout_pr+ means that the fuel has been exhausted. Of course, this cannot happen in practice if the parser is given a sufficient amount of fuel, as suggested above. The outcome \verb+Parsed_pr+ means that the parser has succeeded in parsing a prefix of the input stream. It carries the semantic value that has been constructed for this prefix, as well as the remainder of the input stream.

For each entry point \verb+entry+ of the grammar, \menhir generates a parsing function \verb+entry+, whose type is \verb+nat -> Stream token -> parse_result+.
% jh: I am a little bothered, because init actually has type initstate,
% but I do not want to discuss that in the documentation. All that
% matters is that the first parameter of Parsed_pr has a type that is
% compatible with the type supplied by the user.

Two theorems are provided, named \verb+entry_point_correct+ and \verb+entry_point_complete+.
The correctness theorem states that, if a word (a prefix of the input stream) is accepted, then this word is valid (with respect to the grammar) and the semantic value that is constructed by the parser is valid as well (with respect to the grammar). The completeness theorem states that, if a word (a prefix of the input stream) is valid (with respect to the grammar), then (given sufficient fuel) it is accepted by the parser.

These results imply that the grammar is unambiguous: for every input, there is at most one valid interpretation. This is proved by another generated theorem, named \verb+Parser.unambiguous+.
% jh: There is no need to prove termination in order to obtain unambiguity,
% since the cases of nontermination involve only invalid inputs.
% fp: good point!
% fp: it would be interesting to have a certificate that the grammar really
% is LR(1), but perhaps we do not care. It is good to know that it is
% unambiguous.
% jh: I do not know what a certificate that the grammar is LR(1) would look
% like, in practice...
% fp: It would be a proof of a theorem, expressed purely in terms of the
% grammar, stating that the grammar is LR(1). There is a definition of this
% property in Aho and Ullman's textbook, if I remember correctly. But never
% mind.
% fp: One might also wish for a theorem stating that the parser does not
% read too far ahead into the stream...
% jh: in order to really prove that, one would have to invert the control.
% Otherwise, as a somewhat weaker result, in the current version, we return
% the remaining stream, and we prove that it indeed corresponds to the end
% of the Stream.

The parsers produced by \menhir's Coq back-end must be linked with a Coq library.
This library can be installed via the command \verb+opam install coq-menhirlib+.% % \footnote{This assumes that you have installed \texttt{opam}, the OCaml package manager, and that you have run the command \texttt{opam repo add coq-released https://coq.inria.fr/opam/released}.} % The Coq sources of this library can be found in the \texttt{coq-menhirlib} directory of the Menhir repository. The CompCert verified compiler~\cite{compcert,compcert-github} can be used as an example if one wishes to use \menhir to generate a formally verified parser as part of some other project. See in particular the directory \compcertgithubfile{cparser}. % ------------------------------------------------------------------------------ \section{Building grammarware on top of \menhir} \label{sec:grammarware} It is possible to build a variety of grammar-processing tools, also known as ``grammarware''~\cite{klint-laemmel-verhoef-05}, on top of \menhir's front-end. Indeed, \menhir offers a facility for dumping a \cmly file, which contains a (binary-form) representation of the grammar and automaton, as well as a library, \menhirsdk, for (programmatically) reading and exploiting a \cmly file. These facilities are described in \sref{sec:sdk}. % Furthermore, \menhir allows decorating a grammar with ``attributes'', which are ignored by \menhir's back-ends, yet are written to the \cmly file, thus can be exploited by other tools, via \menhirsdk. % Attributes are described in \sref{sec:attributes}. \subsection{\menhir's SDK} \label{sec:sdk} The command line option \ocmly causes \menhir to produce a \cmly file in addition to its normal operation. This file contains a (binary-form) representation of the grammar and automaton. 
This is the grammar that is obtained after the following steps have been carried out: \begin{itemize} \item joining multiple \mly files, if necessary; % in fact, always (due to standard.mly) \item eliminating anonymous rules; \item expanding away parameterized nonterminal symbols; \item removing unreachable nonterminal symbols; \item performing \ocaml type inference, if the \oinfer switch is used; \item inlining away nonterminal symbols that are decorated with \dinline. \end{itemize} The library \menhirsdk offers an API for reading a \cmly file. The functor \repo{src/cmly_read.mli}{\texttt{MenhirSdk.Cmly\_read.Read}} reads such a file and produces a module whose signature is \repo{src/cmly_api.ml}{\texttt{MenhirSdk.Cmly\_api.GRAMMAR}}. This API is not explained in this document; for details, the reader is expected to follow the above links. % TEMPORARY mention the demo generate-printers % as an example of both the SDK and attributes % (possibly make it an independent package) \subsection{Attributes} \label{sec:attributes} Attributes are decorations that can be placed in \mly files. They are ignored by \menhir's back-ends, but are written to \cmly files, thus can be exploited by other tools, via \menhirsdk. An attribute consists of a name and a payload. An attribute name is an \ocaml identifier, such as \texttt{cost}, or a list of \ocaml identifiers, separated with dots, such as \texttt{my.name}. An attribute payload is an \ocaml expression of arbitrary type, such as \texttt{1} or \verb+"&&"+ or \verb+print_int+. Following the syntax of \ocaml's attributes, an attribute's name and payload are separated with one or more spaces, and are delimited by \verb+[@+ and \verb+]+. Thus, \verb+[@cost 1]+ and \verb+[@printer print_int]+ are examples of attributes. An attribute can be attached at one of four levels: % grammar-level attributes, %[@foo ...] % terminal attribute, %token BAR [@foo ...] % nonterminal attribute, bar [@foo ...]: ... 
% producer attribute, e = expr [@foo ...] \begin{enumerate} \item An attribute can be attached with the grammar. Such an attribute must be preceded with a \verb+%+ sign and must appear in the declarations section (\sref{sec:decls}). For example, the following is a valid declaration: \begin{verbatim} %[@trace true] \end{verbatim} \item An attribute can be attached with a terminal symbol. Such an attribute must follow the declaration of this symbol. For example, the following is a valid declaration of the terminal symbol \verb+INT+: \begin{verbatim} %token INT [@cost 0] [@printer print_int] \end{verbatim} \item An attribute can be attached with a nonterminal symbol. Such an attribute must appear inside the rule that defines this symbol, immediately after the name of this symbol. For instance, the following is a valid definition of the nonterminal symbol \verb+expr+: \begin{verbatim} expr [@default EConst 0]: i = INT { EConst i } | e1 = expr PLUS e2 = expr { EAdd (e1, e2) } \end{verbatim} An attribute can be attached with a parameterized nonterminal symbol: \begin{verbatim} option [@default None] (X): { None } | x = X { Some x } \end{verbatim} An attribute cannot be attached with a nonterminal symbol that is decorated with the \dinline keyword. \item An attribute can be attached with a producer (\sref{sec:producers}), that is, with an occurrence of a terminal or nonterminal symbol in the right-hand side of a production. Such an attribute must appear immediately after the producer. For instance, in the following rule, an attribute is attached with the producer \verb+expr*+: \begin{verbatim} exprs: LPAREN es = expr* [@list true] RPAREN { es } \end{verbatim} \end{enumerate} % %attribute declarations: As a convenience, it is possible to attach many attributes with many (terminal and nonterminal) symbols in one go, via an \dattribute declaration, which must be placed in the declarations section (\sref{sec:decls}). 
For instance, the following declaration attaches both of the attributes \verb+[@cost 0]+ and \verb+[@precious false]+ with each of the symbols \verb+INT+ and \verb+id+: \begin{verbatim} %attribute INT id [@cost 0] [@precious false] \end{verbatim} An \dattribute declaration can be considered syntactic sugar: it is desugared away in terms of the four forms of attributes presented earlier. (The command line switch \oonlypreprocess can be used to see how it is desugared.) % Interaction of %attribute declarations and parameterized nonterminals: If an attribute is attached with a parameterized nonterminal symbol, then, when this symbol is expanded away, the attribute is transmitted to every instance. For instance, in an earlier example, the attribute \verb+[@default None]+ was attached with the parameterized symbol \verb+option+. Then, every instance of \verb+option+, such as \verb+option(expr)+, \verb+option(COMMA)+, and so on, inherits this attribute. To attach an attribute with one specific instance only, one can use an \dattribute declaration. For instance, the declaration \verb+%attribute option(expr) [@cost 10]+ attaches an attribute with the nonterminal symbol \verb+option(expr)+, but not with the symbol \verb+option(COMMA)+. % ------------------------------------------------------------------------------ \section{Interaction with build systems} \label{sec:build} This section explains some details of the compilation workflow, including \ocaml type inference and its repercussions on dependency analysis (\sref{sec:build:infer}) and compilation flags (\sref{sec:build:flags}). % This material should be of interest only to authors of build systems who wish to build support for \menhir into their system. % Ordinary users should skip this section and use a build system that knows about \menhir, such as \dune (preferred) or \ocamlbuild. 
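As a taste of what follows, the indirect type-inference protocol described in \sref{sec:build:infer:indirect} boils down to three commands, which can be sketched as follows. (This is only a sketch: the file names \texttt{mock.ml} and \texttt{sig.inferred} are placeholders, and in a real build, \verb+ocamlc+ may need additional \texttt{-I} flags so that the modules that the grammar depends upon can be found.)
\begin{verbatim}
menhir --infer-write-query mock.ml parser.mly
ocamlc -i mock.ml > sig.inferred
menhir --infer-read-reply sig.inferred parser.mly
\end{verbatim}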
\subsection{\ocaml type inference and dependency analysis} \label{sec:build:infer} In an ideal world, the semantic actions in a \mly file should be well-typed according to the \ocaml type discipline, and their types should be known to \menhir, which may need this knowledge. (When \oinspection is set, \menhir needs to know the \ocaml type of every nonterminal symbol.) % To address this problem, three approaches exist: \begin{itemize} \item Ignore the problem and let \menhir run without \ocaml type information (\sref{sec:build:infer:none}). \item Let \menhir obtain \ocaml type information by invoking the \ocaml compiler (\sref{sec:build:infer:direct}). \item Let \menhir request and receive \ocaml type information without invoking the \ocaml compiler (\sref{sec:build:infer:indirect}). \end{itemize} \subsubsection{Running without \ocaml type information} \label{sec:build:infer:none} The simplest thing to do is to run \menhir \emph{without} any of the flags described in the following (\sref{sec:build:infer:direct}, \sref{sec:build:infer:indirect}). % Then, the semantic actions are \emph{not} type-checked, and their \ocaml type is \emph{not} inferred. % (This is analogous to using \ocamlyacc.) % The drawbacks of this approach are as follows: \begin{itemize} \item A type error in a semantic action is detected only when the \ml file produced by \menhir is type-checked. The location of the type error, as reported by the \ocaml compiler, can be suboptimal. % I think that the type error should be reported inside a semantic % action (we produce # directives for this purpose). Yet I am not % certain that this will be the case. Plus, the type error could be % reported inside Menhir's standard library, whereas when --infer is % used, we place the standard library first, so as to ensure that no % type error is found inside it. (See [infer.ml].) 
\item Unless a \dtype declaration for every nonterminal symbol is given, the inspection API cannot be generated, that is, \oinspection must be turned off.
\end{itemize}

\subsubsection{Obtaining \ocaml type information by calling the \ocaml compiler}
\label{sec:build:infer:direct}

The second approach is to let \menhir invoke the \ocaml compiler so as to type-check the semantic actions and infer their types. This is done by invoking \menhir with the \oinfer switch, as follows.

\docswitch{\oinfer} This switch causes the semantic actions to be checked for type consistency \emph{before} the parser is generated. To do so, \menhir generates a mock \ml file, which contains just the semantic actions, and invokes the \ocaml compiler, under the form \verb+ocamlc -i+, so as to type-check this file and infer the types of the semantic actions. \menhir then reads this information and produces real \ml and \mli files.
% There is a slight catch with \oinfer. The types inferred by \ocamlc are valid
% in the toplevel context, but can change meaning when inserted into a local
% context.

\docswitch{\oocamlc \nt{command}} This switch controls how \ocamlc is invoked. It allows setting both the name of the executable and the command line options that are passed to it.

\docskip One difficulty with this approach is that the \ocaml compiler usually needs to consult a few \texttt{.cm[iox]} files. Indeed, if the \mly file contains a reference to an external \ocaml module, say \texttt{A}, then the \ocaml compiler typically needs to read one or more files named \texttt{A.cm[iox]}. This implies that these files must have been created first. But how is one supposed to know, exactly, which files should be created first? One must scan the \mly file so as to find out which external modules it depends upon. In other words, a dependency analysis is required. This analysis can be carried out by invoking \menhir with the \odepend switch, as follows.
\docswitch{\odepend} This switch causes \menhir to generate dependency information for use in conjunction with \make. When invoked in this mode, \menhir does not generate a parser. Instead, it examines the grammar specification and prints a list of prerequisites for the targets \nt{basename}\texttt{.cm[iox]}, \nt{basename}\texttt{.ml}, and \nt{basename}\texttt{.mli}. This list is intended to be textually included within a \Makefile. % % It is important to note that \nt{basename}\texttt{.ml} and % \nt{basename}\texttt{.mli} can have \texttt{.cm[iox]} prerequisites. This is % because, when the \oinfer switch is used, \menhir infers types by invoking % \ocamlc, and \ocamlc itself requires the \ocaml modules that the grammar % specification depends upon to have been compiled first. % To produce this list, \menhir generates a mock \ml file, which contains just the semantic actions, invokes \ocamldep, and postprocesses its output. \docswitch{\orawdepend} This switch is analogous to \odepend. However, in this case, \ocamldep's output is \emph{not} postprocessed by \menhir: it is echoed without change. This switch is not suitable for direct use with \make; it is intended for use with \omake or \ocamlbuild, which perform their own postprocessing. \docswitch{\oocamldep \nt{command}} This switch controls how \ocamldep is invoked. It allows setting both the name of the executable and the command line options that are passed to it. \subsubsection{Obtaining \ocaml type information without calling the \ocaml compiler} \label{sec:build:infer:indirect} The third approach is to let \menhir request and receive \ocaml type information \emph{without} allowing \menhir to invoke the \ocaml compiler. There is nothing magic about this: to achieve this, \menhir must be invoked twice, and the \ocaml compiler must be invoked (by the user, or by the build system) in between. This is done as follows. 
\docswitch{\oinferwrite \nt{mockfilename}} When invoked in this mode, \menhir does not generate a parser. Instead, it generates a mock \ml file, named \nt{mockfilename}, which contains just the semantic actions. Then, it stops.

\docskip It is then up to the user (or to the build system) to invoke \verb+ocamlc -i+ so as to type-check the mock \ml file and infer its signature. The output of this command should be redirected to some file \nt{sigfilename}. Then, \menhir can be invoked again, as follows.

\docswitch{\oinferread \nt{sigfilename}} When invoked in this mode, \menhir assumes that the file \nt{sigfilename} contains the result of running \verb+ocamlc -i+ on the file \nt{mockfilename}. It reads and parses this file, so as to obtain the \ocaml type of every semantic action, then proceeds normally to generate a parser.

\docskip This protocol was introduced on 2018/05/23; earlier versions of \menhir do not support it. Its existence can be tested as follows:

\docswitch{\oinferprotocolsupported} When invoked with this switch, \menhir immediately terminates with exit code 0. An earlier version of \menhir, which does not support this protocol, would display a help message and terminate with a nonzero exit code.

\subsection{Compilation flags}
\label{sec:build:flags}

The following switches allow querying \menhir so as to find out which compilation flags should be passed to the \ocaml compiler and linker.

\docswitch{\osuggestcomp} This switch causes \menhir to print a set of suggested compilation flags, and exit. These flags are intended to be passed to the \ocaml compilers (\ocamlc or \ocamlopt) when compiling and linking the parser generated by \menhir. What flags are suggested? In the absence of the \otable switch, no flags are suggested. When \otable is set, a \texttt{-I} flag is suggested, so as to ensure that \menhirlib is visible to the \ocaml compiler.

\docswitch{\osuggestlinkb} This switch causes \menhir to print a set of suggested link flags, and exit.
These flags are intended to be passed to \texttt{ocamlc} when producing a bytecode executable. What flags are suggested? In the absence of the \otable switch, no flags are suggested. When \otable is set, the object file \texttt{menhirLib.cmo} is suggested, so as to ensure that \menhirlib is linked in. \docswitch{\osuggestlinko} This switch causes \menhir to print a set of suggested link flags, and exit. These flags are intended to be passed to \texttt{ocamlopt} when producing a native code executable. What flags are suggested? In the absence of the \otable switch, no flags are suggested. When \otable is set, the object file \texttt{menhirLib.cmx} is suggested, so as to ensure that \menhirlib is linked in. \docswitch{\osuggestmenhirlib} This switch causes \menhir to print (the absolute path of) the directory where \menhirlib was installed. \docswitch{\osuggestocamlfind} This switch is deprecated and may be removed in the future. It always prints \texttt{false}. % ------------------------------------------------------------------------------ \section{Comparison with \ocamlyacc} % TEMPORARY idéalement, il faudrait documenter la différence de comportement % sur les réductions par défaut (sur des symboles autres que #). Roughly speaking, Menhir is 90\% compatible with \ocamlyacc. Legacy \ocamlyacc grammar specifications are accepted and compiled by Menhir. The resulting parsers run and produce correct parse trees. However, parsers that explicitly invoke functions in the module \texttt{Parsing} behave slightly incorrectly. For instance, the functions that provide access to positions return a dummy position when invoked by a Menhir parser. Porting a grammar specification from ocamlyacc to Menhir requires replacing all calls to \texttt{Parsing} with new Menhir-specific keywords (\sref{sec:positions}). Here is an incomplete list of the differences between \ocamlyacc and \menhir. The list is roughly sorted by decreasing order of importance. 
\begin{itemize} \item \menhir allows the definition of a nonterminal symbol to be parameterized (\sref{sec:templates}). A formal parameter can be instantiated with a terminal symbol, a nonterminal symbol, or an anonymous rule (\sref{sec:actual}). A library of standard parameterized definitions (\sref{sec:library}), including options, sequences, and lists, is bundled with Menhir. EBNF syntax is supported: the modifiers \dquestion, \dplus, and \dstar are sugar for options, nonempty lists, and arbitrary lists (\fref{fig:sugar}). \item \ocamlyacc only accepts LALR(1) grammars. \menhir accepts LR(1) grammars, thus avoiding certain artificial conflicts. \item \menhir's \dinline keyword (\sref{sec:inline}) helps avoid or resolve some LR(1) conflicts without artificial modification of the grammar. \item \menhir explains conflicts (\sref{sec:conflicts}) in terms of the grammar, not just in terms of the automaton. \menhir's explanations are believed to be understandable by mere humans. \item \menhir offers an incremental API (in \otable mode only) (\sref{sec:incremental}). This means that the state of the parser can be saved at any point (at no cost) and that parsing can later be resumed from a saved state. \item \menhir offers a set of tools for building a (complete, irredundant) set of invalid input sentences, mapping each such sentence to a (hand-written) error message, and maintaining this set as the grammar evolves (\sref{sec:errors:new}). \item In \ocoq mode, \menhir produces a parser whose correctness and completeness with respect to the grammar can be checked by Coq (\sref{sec:coq}). \item \menhir offers an interpreter (\sref{sec:interpret}) that helps debug grammars interactively. \item \menhir allows grammar specifications to be split over multiple files (\sref{sec:split}). It also allows several grammars to share a single set of tokens. \item \menhir produces reentrant parsers. \item \menhir is able to produce parsers that are parameterized by \ocaml modules. 
\item \ocamlyacc requires semantic values to be referred to via keywords: \verb+$1+, \verb+$2+, and so on. \menhir allows semantic values to be explicitly named. \item \menhir warns about end-of-stream conflicts (\sref{sec:eos}), whereas \ocamlyacc does not. \menhir warns about productions that are never reduced, whereas, at least in some cases, \ocamlyacc does not. \item \menhir offers an option to typecheck semantic actions \emph{before} a parser is generated: see \oinfer. \item \ocamlyacc produces tables that are interpreted by a piece of C code, requiring semantic actions to be encapsulated as \ocaml closures and invoked by C code. \menhir offers a choice between producing tables and producing code. In either case, no C code is involved. \item \menhir makes \ocaml's standard library module \texttt{Parsing} entirely obsolete. Access to locations is now via keywords (\sref{sec:positions}). Uses of \verb+raise Parse_error+ within semantic actions are deprecated. The function \verb+parse_error+ is deprecated. They are replaced with keywords (\sref{sec:errors}). \item \menhir's error handling mechanism (\sref{sec:errors}) is inspired by \ocamlyacc's, but is not guaranteed to be fully compatible. Error recovery, also known as re-synchronization, is not supported by \menhir. \item The way in which severe conflicts (\sref{sec:conflicts}) are resolved is not guaranteed to be fully compatible with \ocamlyacc. \item \menhir warns about unused \dtoken, \dnonassoc, \dleft, and \dright declarations. It also warns about \dprec annotations that do not help resolve a conflict. \item \menhir accepts \ocaml-style comments. \item \menhir allows \dstart and \dtype declarations to be condensed. \item \menhir allows two (or more) productions to share a single semantic action. \item \menhir produces better error messages when a semantic action contains ill-balanced parentheses. 
% \item \ocamlyacc allows nonterminal start symbols to start with an uppercase
% letter, and produces invalid \ocaml code in that case. \menhir disallows this.
\item \ocamlyacc ignores semicolons and commas everywhere. \menhir regards semicolons and commas as significant, and allows them, or requires them, in certain well-defined places.
% \item \ocamlyacc ignores multiple definitions of a token, even when two of them are at
% different types. \menhir rejects this.
\item \ocamlyacc allows \dtype declarations to refer to terminal or non-terminal symbols, whereas \menhir requires them to refer to non-terminal symbols. Types can be assigned to terminal symbols with a \dtoken declaration.
\end{itemize}

% ------------------------------------------------------------------------------
\section{Questions and Answers}
\label{sec:qa}

$\mathstrut$ % Ensure correct indentation of the first question. Ugly.
\vspace{-\baselineskip}

\question{Is \menhir faster than \ocamlyacc? What is the speed difference between \texttt{menhir} and \texttt{menhir -{}-table}?} A (not quite scientific) benchmark suggests that the parsers produced by \ocamlyacc and \texttt{menhir -{}-table} have comparable speed, whereas those produced by \texttt{menhir} are between 2 and 5 times faster. This benchmark excludes the time spent in the lexer and in the semantic actions.

\question{How do I write \Makefile rules for \menhir?} This can be a bit tricky. % understatement
If you must do this, see \sref{sec:build}. It is recommended instead to use a build system with built-in support for \menhir, such as \dune (preferred) or \ocamlbuild.

\question{How do I use \menhir with \ocamlbuild?} Pass \verb+-use-menhir+ to \ocamlbuild. To pass options to \menhir, pass \verb+-menhir "menhir <options>"+ to \ocamlbuild. To use \menhir's table-based back-end, pass \verb+-menhir "menhir --table"+ to \ocamlbuild, and either pass \verb+-package menhirLib+ to \ocamlbuild or add the tag \verb+package(menhirLib)+ in the \verb+_tags+ file.
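Putting these pieces together, a complete invocation of \ocamlbuild using \menhir's table-based back-end might be sketched as follows. (This is only a sketch: the target name \texttt{main.native} is a placeholder, and \texttt{-use-ocamlfind} is required for \texttt{-package} to be understood.)
\begin{verbatim}
ocamlbuild -use-ocamlfind -use-menhir \
  -menhir "menhir --table" \
  -package menhirLib main.native
\end{verbatim}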
To combine multiple \mly files, say \verb+a.mly+ and \verb+b.mly+, into a single parser, say \verb+parser.{ml,mli}+, create a file named \verb+parser.mlypack+ that contains the module names \verb+A B+. See the \distrib{demos} directory for examples. To deal with \messages files (\sref{sec:errors:new}), use the rules provided in the file \distrib{demos/ocamlbuild/myocamlbuild.ml}.
% Advanced scenario: to use --only-tokens and -external-tokens,
% use .mlypack + _tags + myocamlbuild.ml. Not explained here,
% but \distrib{demos/calc-two} contains an example.

\question{How do I use \menhir with \dune?} Please use \dune version 1.4.0 or newer, as it has appropriate built-in rules for Menhir parsers. In the simplest scenario, where the parser resides in a single source file \texttt{parser.mly}, the \texttt{dune} file should contain a ``stanza'' along the following lines:
\begin{verbatim}
(menhir
  (modules parser)
  (flags --explain --dump)
  (infer true)
)
\end{verbatim}
The \oinfer switch has special status and should not be used directly; instead, write \texttt{(infer true)} or \texttt{(infer false)}, as done above. (The default is \texttt{true}.) Ordinary command line switches, like \oexplain and \odump, are passed as part of the \texttt{flags} line, as done above.
% The directory \distrib{demos/calc-dune}
% (and others like it) offers an example.
%
For more details, see \href{https://jbuilder.readthedocs.io/en/latest/menhir.html}{\dune's documentation}.
%
To deal with \messages files (\sref{sec:errors:new}), use and adapt the rules found in the file \distrib{src/stage2/dune}.
% It may be necessary to specify which version of the Menhir build rules
% one wishes to use. This is done by writing, e.g.
% \begin{verbatim}
% (using menhir 2.0)
% \end{verbatim}
% at the top level of the \texttt{dune-project} file.
% However, my understanding is that this is usually not necessary.
% \dune will automatically add this line for us
% when a project is first compiled.
\question{\menhir reports \emph{more} shift/reduce conflicts than \ocamlyacc! How come?} \ocamlyacc sometimes merges two states of the automaton that \menhir considers distinct. This happens when the grammar is not LALR(1). If these two states happen to contain a shift/reduce conflict, then \menhir reports two conflicts, while \ocamlyacc only reports one. Of course, the two conflicts are very similar, so fixing one will usually fix the other as well. \question{I do not use \ocamllex. Is there an API that does not involve lexing buffers?} Like \ocamlyacc, \menhir produces parsers whose monolithic API (\sref{sec:monolithic}) is intended for use with \ocamllex. However, it is possible to convert them, after the fact, to a simpler, revised API. In the revised API, there are no lexing buffers, and a lexer is just a function from unit to tokens. Converters are provided by the library module \menhirlibconvert. This can be useful, for instance, for users of \texttt{ulex}, the Unicode lexer generator. Also, please note that \menhir's incremental API (\sref{sec:incremental}) does not mention the type \verb+Lexing.lexbuf+. In this API, the parser expects to be supplied with triples of a token and start/end positions of type \verb+Lexing.position+. \question{I need both \dinline and non-\dinline versions of a non-terminal symbol. 
Is this possible?} Define an \dinline version first, then use it to define a non-\dinline version, like this: \begin{verbatim} %inline ioption(X): (* nothing *) { None } | x = X { Some x } option(X): o = ioption(X) { o } \end{verbatim} This can work even in the presence of recursion, as illustrated by the following definition of (reversed, left-recursive, possibly empty) lists: \begin{verbatim} %inline irevlist(X): (* nothing *) { [] } | xs = revlist(X) x = X { x :: xs } revlist(X): xs = irevlist(X) { xs } \end{verbatim} The definition of \verb+irevlist+ is expanded into the definition of \verb+revlist+, so in the end, \verb+revlist+ receives its normal, recursive definition. One can then view \verb+irevlist+ as a variant of \verb+revlist+ that is inlined one level deep. % Intentionally do not call this "list", because people may copy-paste this % definition, and will end up unintentionally redefining the meaning of *. \question{Can I ship a generated parser while avoiding a dependency on \menhirlib?} Yes. One option is to use the code-based back-end (that is, to not use \otable). In this case, the generated parser is self-contained. Another option is to use the table-based back-end (that is, use \otable) and include a copy of the files \verb+menhirLib.{ml,mli}+ together with the generated parser. The command \texttt{menhir \osuggestmenhirlib} will tell you where to find these source files. \question{Why is \texttt{\$startpos} off towards the left? It seems to include some leading whitespace.} Indeed, as of 2015/11/04, the computation of positions has changed so as to match \ocamlyacc's behavior. As a result, \texttt{\$startpos} can now appear to be too far off to the left. This is explained in \sref{sec:positions}. In short, the solution is to use \verb+$symbolstartpos+ instead. \question{Can I pretty-print a grammar in ASCII, HTML, or \LaTeX{} format?} Yes. Have a look at \texttt{obelisk} \cite{obelisk}. \question{Does \menhir support mid-rule actions?} Yes. 
See \nt{midrule} and its explanation in \sref{sec:library}.

% ------------------------------------------------------------------------------

\section{Technical background}

After experimenting with Knuth's canonical LR(1)
technique~\cite{knuth-lr-65}, we found that it \emph{really} is not
practical, even on today's computers. For this reason, \menhir implements a
slightly modified version of Pager's algorithm~\cite{pager-77}, which merges
states on the fly if it can be proved that no reduce/reduce conflicts will
arise as a consequence of this decision. This is how \menhir avoids the
so-called \emph{mysterious} conflicts created by LALR(1) parser
generators~\cite[section 5.7]{bison}.

\menhir's algorithm for explaining conflicts is inspired by DeRemer and
Pennello's~\cite{deremer-pennello-82} and adapted for use with Pager's
construction technique.

By default, \menhir produces code, as opposed to tables. This approach has
been explored before~\cite{bhamidipaty-proebsting-98,horspool-faster-90}.
\menhir performs some static analysis of the automaton in order to produce
more compact code.

When asked to produce tables, \menhir performs compression via first-fit row
displacement, as described by Tarjan and Yao~\cite{tarjan-yao-79}. Double
displacement is not used. The action table is made sparse by factoring out
an error matrix, as suggested by Dencker, Dürre, and
Heuft~\cite{dencker-84}.

The type-theoretic tricks that triggered our interest in LR
parsers~\cite{pottier-regis-gianas-typed-lr} are not implemented in
\menhir. In the beginning, we did not implement them because the \ocaml
compiler did not at the time offer generalized algebraic data types
(GADTs). Today, \ocaml has GADTs, but, as the saying goes, ``if it ain't
broken, don't fix it''.

The main ideas behind the Coq back-end are described in a paper by Jourdan,
Pottier and Leroy~\cite{jourdan-leroy-pottier-12}.
The C11 parser in the CompCert compiler~\cite{compcert} is constructed by
Menhir and verified by Coq, following this technique. How to construct a
correct C11 parser using Menhir is described by Jourdan and
Pottier~\cite{jourdan-pottier-17}.

The approach to error reports presented in \sref{sec:errors:new} was
proposed by Jeffery~\cite{jeffery-03} and further explored by
Pottier~\cite{pottier-reachability-cc-2016}.

% ------------------------------------------------------------------------------

\section{Acknowledgements}

\menhir's interpreter (\ointerpret) and table-based back-end (\otable) were
implemented by Guillaume Bau, Raja Boujbel, and François Pottier. The
project was generously funded by Jane Street Capital, LLC through the
``OCaml Summer Project'' initiative.

Frédéric Bour provided motivation and an initial implementation for the
incremental API, for the inspection API, for attributes, and for
\menhirsdk. \href{https://github.com/ocaml/merlin}{Merlin}, an emacs mode
for \ocaml, contains an impressive incremental, syntax-error-tolerant
\ocaml parser, which is based on \menhir and has been a driving force for
\menhir's APIs.

Jacques-Henri Jourdan designed and implemented the Coq back-end and did the
Coq proofs for it. Gabriel Scherer provided motivation for investigating
Jeffery's technique.

% ------------------------------------------------------------------------------

% Bibliography.
\bibliographystyle{plain}
\bibliography{local}

\end{document}

% LocalWords: Yann Régis Gianas Regis inria Menhir filename mly basename Coq
% LocalWords: coq vy tt Coq's iox Menhir's nonterminal graphviz nullable calc
% LocalWords: inline postprocessed postprocessing ocamlc bytecode linkpkg cmo
% LocalWords: menhirLib ocamlopt cmx qa ocamlrun runtime uid productiongroups
% LocalWords: prec Actuals parameterization Parameterizing ds actuals plist xs
% LocalWords: loption LPAREN RPAREN Inlining inlined inlining lp ioption bool
% LocalWords: boption sep nonassociative multi basicshiftreduce lookahead decl
% LocalWords: UIDENT LIDENT decls tycon expr exprs basiceos basiceosdump lex
% LocalWords: curr Lexing lexbuf pos cnum startpos endpos startofs endofs LALR
% LocalWords: syntaxerror whitespace EOL cst API lexing MenhirInterpreter pc
% LocalWords: InputNeeded HandlingError env CompCert Aut se nat init cparser
% LocalWords: validator subdirectory EBNF reentrant eos typecheck menhir ulex
% LocalWords: DeRemer Pennello's Tarjan Yao Dencker Dürre Heuft Bau Raja LLC
% LocalWords: Acknowledgements Boujbel Frédéric Bour
menhir-20200123/doc/manual001.png [binary PNG image data omitted]
menhir-20200123/doc/manual002.png [binary PNG image data omitted]
.\" .sp <n>    insert n+1 empty lines
.\" for manpage-specific macros, see man(7)
.SH NAME
menhir \- an LR(1) parser generator for OCaml
.SH SYNOPSIS
.B menhir
.RI [ options ] " files"
.SH DESCRIPTION
.B menhir
is an LR(1) parser generator for the OCaml programming language. That is,
Menhir compiles LR(1) grammar specifications down to OCaml code. It is
mostly compatible with
.BR ocamlyacc (1).
.SH OPTIONS
.TP
.B \-h, \-\-help
Show summary of options.
.TP
.BI \-b,\ \-\-base\ basename
Specifies a base name for the output file(s).
.TP
.B \-\-canonical
Construct a canonical Knuth LR(1) automaton.
.TP
.B \-\-cmly
Write the grammar and automaton to
.IR basename .cmly.
.TP
.B \-\-comment
Include comments in the generated code.
.TP
.BI \-\-compare\-errors\ file1\ \-\-compare\-errors\ file2
Compare two .messages files.
.TP
.BI \-\-compile\-errors\ file
Compile a .messages file to OCaml code.
.TP
.B \-\-coq
Generate a formally verified parser, in Coq.
.TP
.BI \-\-coq\-lib\-path\ path
How to qualify references to MenhirLib.
.TP
.B \-\-coq\-lib\-no\-path
Do not qualify references to MenhirLib.
.TP
.B \-\-coq\-no\-actions
Ignore semantic actions in the Coq output.
.TP
.B \-\-coq\-no\-complete
Do not generate a proof of completeness.
.TP
.B \-\-depend
Invoke ocamldep and display dependencies.
.TP
.B \-\-dump
Describe the automaton in
.IR basename .automaton.
.TP
.BI \-\-echo\-errors\ file
Echo the sentences in a .messages file.
.TP
.B \-\-explain
Explain conflicts in
.IR basename .conflicts.
.TP
.BI \-\-external\-tokens\ module
Import token type definition from
.IR module .
.TP
.B \-\-fixed\-exception
Declares Error = Parsing.Parse_error.
.TP
.B \-\-graph
Write grammar's dependency graph to
.IR basename .dot.
.TP
.B \-\-infer
Invoke ocamlc for ahead of time type inference.
.TP
.B \-\-infer\-protocol\-supported
Stop with exit code 0.
.TP
.BI \-\-infer\-write\-query\ file
Write mock .ml file.
.TP
.BI \-\-infer\-read\-reply\ file
Read inferred .mli file.
.TP
.B \-\-inspection
Generate the inspection API.
.TP
.B \-\-interpret
Interpret the sentences provided on stdin.
.TP
.B \-\-interpret\-show\-cst
Show a concrete syntax tree upon acceptance.
.TP
.B \-\-interpret\-error
Interpret an error sentence provided on stdin.
.TP
.B \-\-lalr
Construct an LALR(1) automaton.
.TP
.BI \-la,\ \-\-log\-automaton\ level
Log information about the automaton.
.TP
.BI \-lc,\ \-\-log\-code\ level
Log information about the generated code.
.TP
.BI \-lg,\ \-\-log\-grammar\ level
Log information about the grammar.
.TP
.B \-\-list\-errors
Produce a list of erroneous inputs.
.TP
.B \-\-no\-dollars
Disallow the use of $i notation.
.TP
.B \-\-no\-inline
Ignore the %inline keyword.
.TP
.B \-\-no\-stdlib
Do not load the standard library.
.TP
.BI \-\-ocamlc\ command
Specifies how ocamlc should be invoked.
.TP
.BI \-\-ocamldep\ command
Specifies how ocamldep should be invoked.
.TP
.B \-\-only\-preprocess
Print a simplified grammar and exit.
.TP
.B \-\-only\-preprocess\-for\-ocamlyacc
Print grammar in ocamlyacc format and exit.
.TP
.B \-\-only\-preprocess\-u
Print grammar with unit actions and exit.
.TP
.B \-\-only\-preprocess\-uu
Print grammar with unit actions and tokens and exit.
.TP
.B \-\-only\-tokens
Generate token type definition only, no code.
.TP
.B \-\-raw\-depend
Invoke ocamldep and echo its raw output.
.TP
.BI \-\-stdlib\ directory
Specify where the standard library lies.
.TP
.B \-\-strict
Warnings about the grammar are errors.
.TP
.B \-\-suggest\-comp\-flags
Suggest compilation flags for ocaml{c,opt}.
.TP
.B \-\-suggest\-link\-flags-byte
Suggest link flags for ocamlc.
.TP
.B \-\-suggest\-link\-flags-opt
Suggest link flags for ocamlopt.
.TP
.B \-\-suggest\-menhirLib
Suggest where MenhirLib was installed in source form.
.TP
.B \-\-suggest\-ocamlfind
Deprecated.
.TP
.B \-t, \-\-table
Use the table-based back-end.
.TP
.B \-\-timings
Display internal timings.
.TP
.B \-\-trace
Include tracing instructions in the generated code.
.TP
.B \-\-unused\-precedence\-levels
Do not warn about unused precedence levels.
.TP
.BI \-\-unused\-token\ token
Do not warn that
.IR token
is unused.
.TP
.B \-\-unused\-tokens
Do not warn about any unused token.
.TP
.BI \-\-update\-errors\ file
Update auto-comments in a .messages file.
.TP
.B \-\-version
Show version number and exit.
.TP
.B \-v
Synonymous with
.BR \-\-dump\ \-\-explain .
.SH SEE ALSO
.BR ocaml (1).
.SH AUTHOR
.B menhir
was written by Fran\(,cois Pottier and Yann R\('egis-Gianas.
.PP
This manual page was originally written by Samuel Mimram for the Debian
project (but may be used by others).
menhir-20200123/doc/mymacros.hva000066400000000000000000000001601361226111300163210ustar00rootroot00000000000000
\input{mymacros.sty}

% Ignore \raisebox and \phantom.
\newcommand{\raisebox}[2]{#2}
\newcommand{\phantom}[1]{}
menhir-20200123/doc/mymacros.sty000066400000000000000000000014461361226111300163700ustar00rootroot00000000000000
%; whizzy -macros main.tex

% References to sections, lemmas, theorems, etc.

\newcommand{\sref}[1]{\S\ref{#1}}
\newcommand{\tref}[1]{Theorem~\ref{#1}}
\newcommand{\lemref}[1]{Lemma~\ref{#1}}
\newcommand{\dref}[1]{Definition~\ref{#1}}
\newcommand{\eref}[1]{Example~\ref{#1}}
\newcommand{\fref}[1]{Figure~\ref{#1}}
\newcommand{\aref}[1]{Appendix~\ref{#1}}

% Abbreviations.

\def\etal.{\emph{et al.}}

% Define \citeyear in addition to \cite, if not already defined.

\@ifundefined{citeyear}{
  \@ifundefined{shortcite}{
    \let\citeyear\cite
  }{
    \let\citeyear\shortcite
  }
}{}

% Lambda-calculus syntax.

\newcommand{\ekw}[1]{\mathsf{#1}}
\newcommand{\expr}{e}
\newcommand{\evar}{x}
\newcommand{\eabs}[2]{\lambda#1.#2}
\newcommand{\eapp}[2]{#1\;#2}
\newcommand{\elet}[3]{\ekw{let}\;#1=#2\;\ekw{in}\;#3}
menhir-20200123/doc/new-rule-syntax-blog-post.md000066400000000000000000000325001361226111300213000ustar00rootroot00000000000000
# Parser Construction With Menhir: A Couple Appetizers

This post is a shameless advertisement for Menhir, a parser generator for
OCaml. It illustrates Menhir's new input syntax, which was introduced on
November 13, 2018. The code fragments shown below are excerpts of valid
`.mly` files.

## Ingredients

Suppose I would like to parse and evaluate our good old friends, the
arithmetic expressions. For instance, the string `"(3 + 4) * 5 - 9"` should
be accepted and evaluated to the value `26`. I assume that I have a lexical
analyzer that can chop up this string into a stream of basic tokens, or
terminal symbols.
My alphabet of terminal symbols is the following:

```
%token <int> INT
%token PLUS MINUS TIMES DIV LPAREN RPAREN EOL
```

Based on this alphabet, I wish to define the syntax of (and obtain a parser
for) arithmetic expressions. This exercise may seem old and tired, but let
me try and see if I can add some new spice and style to it. In fact, let me
do it twice, in two slightly different ways. So, how would you like your
arithmetic expressions cooked?

## First Flavor: Hot Off the Oven, With On-The-Fly Evaluation

In this first demo, I wish to evaluate an arithmetic expression, that is,
find out which integer value it represents. Thus, I am eventually
interested in just an integer result.

```
%start <int> main

%%
```

I wish to recognize an expression followed with an end-of-line symbol:

```
let main :=
  ~ = expr; EOL; <>
```

Here, `~ = expr` is a **pun**, a shorthand for `expr = expr`. It can be read
as follows: "read an expression; evaluate it; let the variable `expr` stand
for its value". `<>` is a **point-free semantic action**. In general, it is
a shorthand for a semantic action that builds a tuple of the variables that
have been bound earlier in the sequence. Thus, in this case, it is a
shorthand for the semantic action `{ expr }`.

It is now time to define `expr` and thereby describe the syntax and the
meaning of arithmetic expressions. To do this in a nonambiguous manner, one
of several traditional approaches is to stratify the syntax in several
levels, namely additive expressions, multiplicative expressions, and atomic
expressions. These levels are also traditionally known as *expressions*,
*terms*, and *factors*.

The topmost level is the level of additive expressions. In other words, an
expression is just an additive expression:

```
let expr ==
  additive_expr
```

This definition has no runtime cost: it makes `expr` a synonym for
`additive_expr`. In traditional Menhir speak, `expr` is an `%inline`
nonterminal symbol.
This definition introduces a useful level of indirection: if in the future I
decide to introduce a new level in the syntax of expressions, all I have to
do is update the definition of `expr`; the places where `expr` is used do
not need to be updated. In other words, the fact that "an expression is just
an additive expression" is an implementation detail, and should not be
revealed.

An additive expression is a nonempty, left-associative list of
multiplicative expressions, separated with additive operators:

```
let additive_expr ==
  fold_left(additive_op, multiplicative_expr)
```

What does this mean? Well, quite obviously, the additive operators are
`PLUS` and `MINUS`, which respectively denote addition or subtraction:

```
let additive_op ==
  | PLUS;  { ( + ) }
  | MINUS; { ( - ) }
```

Furthermore, a nonempty list of elements `elem` separated by operators `op`
is: either a single element; or a (smaller) such list, followed with an
operator, followed with an element. In the second case, the operator must be
applied to the sum of the left-hand list and to the right-hand element:

```
let fold_left(op, elem) :=
  | elem
  | sum = fold_left(op, elem); ~ = op; ~ = elem; { op sum elem }
```

This is a **parameterized definition**. Because this definition is
recursive, it cannot be macro-expanded away: we cannot use `==` and must
instead use `:=`.

So much for additive expressions. This scheme can now be reproduced, one
level down: a multiplicative expression is a nonempty, left-associative list
of atomic expressions, separated with multiplicative operators.

```
let multiplicative_expr ==
  fold_left(multiplicative_op, atomic_expr)

let multiplicative_op ==
  | TIMES; { ( * ) }
  | DIV;   { ( / ) }
```

There remains to define atomic expressions. In this demo, I wish to allow
the use of `MINUS` as a unary operator.
Thus, an atomic expression shall be one of the following: an integer
literal; an arbitrary expression between parentheses; or an application of a
unary operator to an atomic expression.

```
let atomic_expr :=
  | INT
  | delimited(LPAREN, expr, RPAREN)
  | app(unary_op, atomic_expr)
```

There is just one unary operator, `MINUS`, whose meaning is integer
negation:

```
let unary_op ==
  | MINUS; { (~- ) }
```

There remains to explain `delimited(left, x, right)` and `app(f, x)`. My
main motivation for introducing these auxiliary parameterized symbols is to
make the definition of `atomic_expr` prettier.

`delimited(left, x, right)` is in fact part of Menhir's standard library,
where it is defined as follows:

```
%public let delimited(left, x, right) ==
  left; ~ = x; right; <>
```

`app(f, x)` recognizes the sequence `f; x`. Its value is the application of
the value of `f` to the value of `x`. It is defined as follows:

```
let app(f, x) ==
  ~ = f; ~ = x; { f x }
```

At this point, the arithmetic-expression parser-and-evaluator is complete.
Menhir accepts it without complaining, which means that this grammar is in
the class LR(1), therefore is **unambiguous**. From it, Menhir generates an
LR(1) parser, a deterministic pushdown automaton, whose **performance is
predictable**: provided each semantic action takes constant time, its time
complexity is linear in the size of the input. Compared with other parsing
techniques, guaranteed unambiguity and efficiency are two important
strengths of LR(1) parsers.

## Second Flavor: As An Abstract-Syntax-and-Location Millefeuille

Let me now be more ambitious. Instead of evaluating arithmetic expressions
on the fly, let me build Abstract Syntax Trees. This opens the door to all
kinds of symbolic computation: compilation down to native code,
simplification, automatic differentiation, and so on.
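Before diving into this second recipe, a quick note on serving the first
one. The following driver is my own sketch, not part of the original demo:
it assumes that the grammar above lives in a file `parser.mly` (so that
Menhir generates a module `Parser` with a start function `main` and an
exception `Error`), and that an `ocamllex`-generated module `Lexer` offers a
`token` rule that maps `'\n'` to `EOL`.

```
(* A hypothetical driver for the first demo. The module names Parser and
   Lexer are assumptions, not part of this post. *)
let () =
  let lexbuf = Lexing.from_string "(3 + 4) * 5 - 9\n" in
  match Parser.main Lexer.token lexbuf with
  | n ->
      (* The start symbol [main] directly produces an integer. *)
      Printf.printf "%d\n" n
  | exception Parser.Error ->
      prerr_endline "syntax error"
```

With such a lexer in place, running this program should print `26`.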
In a separate file, say `syntax.ml`, I define the types of the ASTs that I
wish to build:

```
type unop =
  | OpNeg

type binop =
  | OpPlus
  | OpMinus
  | OpTimes
  | OpDiv

type 'a located =
  { loc: Lexing.position * Lexing.position; value: 'a }

type expr =
  raw_expr located

and raw_expr =
  | ELiteral of int
  | EUnOp of unop * expr
  | EBinOp of expr * binop * expr
```

The types `unop` and `binop` are simple enumerated types. In the definition
of the type `raw_expr`, one recognizes three kinds of expressions: integer
literals, applications of unary operators, and applications of binary
operators. There is no data constructor for expressions in parentheses:
although parentheses are a necessary feature of the concrete syntax, there
is no need to record them in the abstract syntax.

In an abstract syntax tree, I would like every subtree to be annotated with
its location in the input text. This would be important, in a real-world
programming language implementation, in order to produce error messages
that carry a source code location. To achieve this, I use a traditional
technique: I define two types, `expr` and `raw_expr`, in a mutually
recursive manner. An expression is a raw expression annotated with a
location (a pair of a start position and an end position). A raw expression
is an integer literal, an application of a unary operator to an expression,
or an application of a binary operator to two expressions.

Thus, like a cake, an abstract syntax tree has layered structure: one layer
of location information, one layer of structural information, one layer of
location information, one layer of structural information, and so on.

Let me now move on to the description of the parser. This time, I am
eventually interested in producing an abstract syntax tree.
```
%start <Syntax.expr> main

%{ open Syntax %}

%%
```

The first few definitions are unchanged:

```
let main :=
  ~ = expr; EOL; <>

let expr ==
  additive_expr
```

This time around, I won't use a generic definition along the lines of
`fold_left(op, elem)`. It can be done, though; this is left as an exercise
for the reader! Here is a direct definition of additive expressions:

```
let additive_expr :=
  | multiplicative_expr
  | located(
      ~ = additive_expr; ~ = additive_op; ~ = multiplicative_expr; <EBinOp>
    )

let additive_op ==
  | PLUS;  { OpPlus }
  | MINUS; { OpMinus }
```

In short, an additive expression is either a multiplicative expression, or
an additive expression followed with an additive operator followed with a
multiplicative expression.

In the second production, I use three `~` patterns in order to avoid the
chore of naming the three semantic values. I again use **a point-free
semantic action**: `<EBinOp>` means that the data constructor `EBinOp`
should be applied to a tuple of the three semantic values. At the cost of
greater verbosity, one could equivalently write `e1 = additive_expr; op =
additive_op; e2 = multiplicative_expr; { EBinOp (e1, op, e2) }`.

Now, `EBinOp(e1, op, e2)` has type `raw_expr`, but I would like the semantic
value of the nonterminal symbol `additive_expr` to have type `expr`.
Therefore, I need to wrap this semantic value in a record of type `raw_expr
located`. This can be done in a lightweight and elegant manner just by
wrapping the second production with `located(...)`, where the parameterized
nonterminal symbol `located(x)` is defined once and for all as follows:

```
let located(x) ==
  ~ = x; { { loc = $loc; value = x } }
```

`located(x)` recognizes the same input as `x`, and wraps the semantic value
of type `'a` produced by `x` in a record of type `'a located`.
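As an aside, here is a sketch of how the `loc` field stored by `located(x)`
might later be put to use, say, to point at a subexpression in an error
message. This code is mine, not part of the original post; it relies only
on the `Syntax` types defined above and on the standard `Lexing.position`
record.

```
open Syntax

(* Render a source range, e.g. "line 1, characters 1-8". *)
let show_loc ((startp, endp) : Lexing.position * Lexing.position) =
  Printf.sprintf "line %d, characters %d-%d"
    startp.Lexing.pos_lnum
    (startp.Lexing.pos_cnum - startp.Lexing.pos_bol)
    (endp.Lexing.pos_cnum - startp.Lexing.pos_bol)

(* Report a problem with a subexpression, pointing at its range. *)
let complain (e : expr) (msg : string) =
  Printf.eprintf "At %s: %s\n" (show_loc e.loc) msg
```

Because every layer of the AST carries a `loc` field, such a report can be
produced for any subtree, however deep.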
One level down, multiplicative expressions are described via the same
pattern:

```
let multiplicative_expr :=
  | atomic_expr
  | located(
      ~ = multiplicative_expr; ~ = multiplicative_op; ~ = atomic_expr; <EBinOp>
    )

let multiplicative_op ==
  | TIMES; { OpTimes }
  | DIV;   { OpDiv }
```

Finally, as earlier, an atomic expression is one of: an expression between
parentheses; an integer literal; an application of a unary operator to an
atomic expression.

```
let atomic_expr :=
  | LPAREN; ~ = expr; RPAREN; <>
  | located(
      | ~ = INT; <ELiteral>
      | ~ = unary_op; ~ = atomic_expr; <EUnOp>
    )

let unary_op ==
  | MINUS; { OpNeg }
```

Only the last two cases in the definition of `atomic_expr` are wrapped in
`located(...)`: in the first case, this is not necessary, as the expression
already carries a location. Things are formulated in such a way that the
computed locations are tight: the source code range associated with a
parenthesized subexpression does not include the parentheses. One could of
course easily adopt the reverse convention: this is left as another exercise
for the reader!

## Behind The Scenes, Or: In The Kitchen

If one expands away all symbols introduced by `==`, expands away all
parameterized symbols, and strips away all semantic actions, one finds that
the two descriptions presented above represent the same LR(1) grammar,
therefore give rise to the same deterministic pushdown automaton. This
bare-bones grammar is printed by `menhir --only-preprocess-u`, a useful
inspection tool. It is printed in Menhir's traditional syntax.
Once manually translated to the modern syntax used in this article, it is as follows: ``` %token DIV EOL INT LPAREN MINUS PLUS RPAREN TIMES %start main %% let main := additive_expr; EOL let additive_expr := | multiplicative_expr | additive_expr; PLUS; multiplicative_expr | additive_expr; MINUS; multiplicative_expr let multiplicative_expr := | atomic_expr | multiplicative_expr; TIMES; atomic_expr | multiplicative_expr; DIV; atomic_expr let atomic_expr := | INT | LPAREN; additive_expr; RPAREN | MINUS; atomic_expr ``` ## Spilling the Sauce: A Syntax Error Suppose my fingers slip, and I make a syntax error in my grammar description: ``` let main := ~ = expr; EOL; <>; ``` Not to worry. Menhir's parser for `.mly` files is a Menhir-generated parser, and produces reasonable syntax error messages. Here, the semicolon that follows the semantic action is invalid: ``` File "parser.mly", line 30, characters 19-20: Error: syntax error after '<>' and before ';'. At this point, one of the following is expected: a bar '|' followed with an expression, or another rule. ``` Yes, **LR(1) parsers can produce good syntax error messages**. ## References The full source code of [the first demo](https://gitlab.inria.fr/fpottier/menhir/blob/master/demos/calc-new-syntax-dune/parser.mly) and [the second demo](https://gitlab.inria.fr/fpottier/menhir/blob/master/demos/calc-ast-dune/parser.mly) is available online. [A summary of the changes](https://gitlab.inria.fr/fpottier/menhir/blob/master/doc/new-rule-syntax-summary.md) between the old and new syntaxes is also available. The syntax of Menhir is of course also documented in the [reference manual](http://gallium.inria.fr/~fpottier/menhir/manual.html#sec5). menhir-20200123/doc/new-rule-syntax-summary.md000066400000000000000000000072711361226111300210760ustar00rootroot00000000000000# Differences between the old and new rule syntax This presentation of the new rule syntax is meant to be useful to a reader who is familiar with the old rule syntax. 
For a direct, self-contained presentation of the new rule syntax, please
consult Menhir's manual.

## Rules

A rule begins with `let`.

|                            |         |                          |
|----------------------------|---------|--------------------------|
| `foo: ...`                 | becomes | `let foo := ...`         |
| `%inline foo: ...`         | becomes | `let foo == ...`         |
| `%public foo: ...`         | becomes | `%public let foo := ...` |
| `%public %inline foo: ...` | becomes | `%public let foo == ...` |

A rule **cannot** be terminated with a semicolon.

## Sums

Productions are separated with `|`. A leading `|` is permitted, and ignored.
For instance, the rule `let a := | A; { () }` has only one production, which
recognizes the symbol `A`.

In contrast with the old syntax, two productions **cannot** share a semantic
action.

## Sequences

In a sequence `p1 = e1; e2`, the semicolon is **mandatory**. The pattern
`p1` binds the semantic values produced by `e1`.

|                 |       |                               |
|-----------------|-------|-------------------------------|
| `x = foo;`      | means | the same as in the old syntax |
| `foo;`          | means | `_ = foo;`                    |
| `~ = foo;`      | means | `foo = foo;`                  |
| `(x, y) = foo;` | means | `_xy = foo;` where the following semantic action is wrapped in `let (x, y) = _xy in ...` |

In contrast with the old syntax, when a sequence ends with a semantic
action, the semicolon that precedes the semantic action is still mandatory.
For instance, in `let literal := i = INT; { i }`, the semicolon is required.

In contrast with the old syntax, **a sequence need not end with a semantic
action**. A sequence can also end with a symbol, whose semantic value is
then implicitly returned. For instance,

|                                |       |                  |
|--------------------------------|-------|------------------|
| `foo` at the end of a sequence | means | `x = foo; { x }` |

This implies that **it becomes easy to define a symbol as a synonym for
another symbol** or for a complex expression. For instance,
For instance, | | | | |---------------------------------|---------|-------------------| | `%inline foo: xs = bar* { xs }` | becomes | `let foo == bar*` | ## Semantic actions Traditional semantic actions, such as `{ (x, y) }`, remain available. In addition, so-called **point-free semantic actions** appear. They take the form of a single OCaml identifier between angle brackets. This identifier, which should denote a function or a data constructor, is implicitly **applied** to a tuple of the variables that have been bound earlier in the sequence. If this identifier is omitted, the identity function is assumed. Thus, | | | | |-----------------------------------------------------|-------|--------------------------------------------------------| | `let pair(x, y) := ~ = x; ~ = y; ` | means | `let pair(x, y) := x = x; y = y; { Pair (x, y) }` | | `let pair(x, y) := ~ = x; ~ = y; <>` | means | `let pair(x, y) := x = x; y = y; { (x, y) }` | | `let parenthesized(x) := LPAREN; ~ = x; RPAREN; <>` | means | `let parenthesized(x) := LPAREN; x = x; RPAREN; { x }` | This often removes the need to invent names for semantic values. `$1`, `$2`, etc. are forbidden. Semantic values must be named. A semantic value can be named either explicitly or via a `~` pattern. menhir-20200123/doc/plain.bst000066400000000000000000000460631361226111300156160ustar00rootroot00000000000000% BibTeX standard bibliography style `plain' % version 0.99a for BibTeX versions 0.99a or later, LaTeX version 2.09. % Copyright (C) 1985, all rights reserved. % Copying of this file is authorized only if either % (1) you make absolutely no changes to your copy, including name, or % (2) if you do make changes, you name it something other than % btxbst.doc, plain.bst, unsrt.bst, alpha.bst, and abbrv.bst. % This restriction helps ensure that all standard styles are identical. % The file btxbst.doc has the documentation for this style. % Modified by Francois.Pottier@inria.fr with support for url field. 
ENTRY { address author booktitle chapter edition editor howpublished institution journal key month note number organization pages publisher school series title type url volume year } {} { label } INTEGERS { output.state before.all mid.sentence after.sentence after.block } FUNCTION {init.state.consts} { #0 'before.all := #1 'mid.sentence := #2 'after.sentence := #3 'after.block := } STRINGS { s t } FUNCTION {output.nonnull} { 's := output.state mid.sentence = { ", " * write$ } { output.state after.block = { add.period$ write$ newline$ "\newblock " write$ } { output.state before.all = 'write$ { add.period$ " " * write$ } if$ } if$ mid.sentence 'output.state := } if$ s } FUNCTION {output} { duplicate$ empty$ 'pop$ 'output.nonnull if$ } FUNCTION {output.check} { 't := duplicate$ empty$ { pop$ "empty " t * " in " * cite$ * warning$ } 'output.nonnull if$ } FUNCTION {output.bibitem} { newline$ "\bibitem{" write$ cite$ write$ "}" write$ newline$ "" before.all 'output.state := } FUNCTION {fin.entry} { add.period$ write$ newline$ } FUNCTION {new.block} { output.state before.all = 'skip$ { after.block 'output.state := } if$ } FUNCTION {new.sentence} { output.state after.block = 'skip$ { output.state before.all = 'skip$ { after.sentence 'output.state := } if$ } if$ } FUNCTION {not} { { #0 } { #1 } if$ } FUNCTION {and} { 'skip$ { pop$ #0 } if$ } FUNCTION {or} { { pop$ #1 } 'skip$ if$ } FUNCTION {new.block.checka} { empty$ 'skip$ 'new.block if$ } FUNCTION {new.block.checkb} { empty$ swap$ empty$ and 'skip$ 'new.block if$ } FUNCTION {new.sentence.checka} { empty$ 'skip$ 'new.sentence if$ } FUNCTION {new.sentence.checkb} { empty$ swap$ empty$ and 'skip$ 'new.sentence if$ } FUNCTION {field.or.null} { duplicate$ empty$ { pop$ "" } 'skip$ if$ } FUNCTION {emphasize} { duplicate$ empty$ { pop$ "" } { "{\em " swap$ * "}" * } if$ } INTEGERS { nameptr namesleft numnames } FUNCTION {format.names} { 's := #1 'nameptr := s num.names$ 'numnames := numnames 'namesleft := { namesleft #0 > } { s 
nameptr "{ff~}{vv~}{ll}{, jj}" format.name$ 't := nameptr #1 > { namesleft #1 > { ", " * t * } { numnames #2 > { "," * } 'skip$ if$ t "others" = { " et~al." * } { " and " * t * } if$ } if$ } 't if$ nameptr #1 + 'nameptr := namesleft #1 - 'namesleft := } while$ } FUNCTION {format.authors} { author empty$ { "" } { author format.names } if$ } FUNCTION {format.editors} { editor empty$ { "" } { editor format.names editor num.names$ #1 > { ", editors" * } { ", editor" * } if$ } if$ } FUNCTION {format.title} { title empty$ { "" } { url empty$ { title "t" change.case$ } { "\href{" url "}{" title "t" change.case$ "}" * * * * } if$ } if$ } FUNCTION {n.dashify} { 't := "" { t empty$ not } { t #1 #1 substring$ "-" = { t #1 #2 substring$ "--" = not { "--" * t #2 global.max$ substring$ 't := } { { t #1 #1 substring$ "-" = } { "-" * t #2 global.max$ substring$ 't := } while$ } if$ } { t #1 #1 substring$ * t #2 global.max$ substring$ 't := } if$ } while$ } FUNCTION {format.date} { year empty$ { month empty$ { "" } { "there's a month but no year in " cite$ * warning$ month } if$ } { month empty$ 'year { month " " * year * } if$ } if$ } FUNCTION {format.btitle} { url empty$ { title emphasize } { "\href{" url "}{" title emphasize "}" * * * * } if$ } FUNCTION {tie.or.space.connect} { duplicate$ text.length$ #3 < { "~" } { " " } if$ swap$ * * } FUNCTION {either.or.check} { empty$ 'pop$ { "can't use both " swap$ * " fields in " * cite$ * warning$ } if$ } FUNCTION {format.bvolume} { volume empty$ { "" } { "volume" volume tie.or.space.connect series empty$ 'skip$ { " of " * series emphasize * } if$ "volume and number" number either.or.check } if$ } FUNCTION {format.number.series} { volume empty$ { number empty$ { series field.or.null } { output.state mid.sentence = { "number" } { "Number" } if$ number tie.or.space.connect series empty$ { "there's a number but no series in " cite$ * warning$ } { " in " * series * } if$ } if$ } { "" } if$ } FUNCTION {format.edition} { edition empty$ { "" } 
{ output.state mid.sentence = { edition "l" change.case$ " edition" * } { edition "t" change.case$ " edition" * } if$ } if$ } INTEGERS { multiresult } FUNCTION {multi.page.check} { 't := #0 'multiresult := { multiresult not t empty$ not and } { t #1 #1 substring$ duplicate$ "-" = swap$ duplicate$ "," = swap$ "+" = or or { #1 'multiresult := } { t #2 global.max$ substring$ 't := } if$ } while$ multiresult } FUNCTION {format.pages} { pages empty$ { "" } { pages multi.page.check { "pages" pages n.dashify tie.or.space.connect } { "page" pages tie.or.space.connect } if$ } if$ } FUNCTION {format.vol.num.pages} { volume field.or.null number empty$ 'skip$ { "(" number * ")" * * volume empty$ { "there's a number but no volume in " cite$ * warning$ } 'skip$ if$ } if$ pages empty$ 'skip$ { duplicate$ empty$ { pop$ format.pages } { ":" * pages n.dashify * } if$ } if$ } FUNCTION {format.chapter.pages} { chapter empty$ 'format.pages { type empty$ { "chapter" } { type "l" change.case$ } if$ chapter tie.or.space.connect pages empty$ 'skip$ { ", " * format.pages * } if$ } if$ } FUNCTION {format.in.ed.booktitle} { booktitle empty$ { "" } { editor empty$ { "In " booktitle emphasize * } { "In " format.editors * ", " * booktitle emphasize * } if$ } if$ } FUNCTION {empty.misc.check} { author empty$ title empty$ howpublished empty$ month empty$ year empty$ note empty$ and and and and and key empty$ not and { "all relevant fields are empty in " cite$ * warning$ } 'skip$ if$ } FUNCTION {format.thesis.type} { type empty$ 'skip$ { pop$ type "t" change.case$ } if$ } FUNCTION {format.tr.number} { type empty$ { "Technical Report" } 'type if$ number empty$ { "t" change.case$ } { number tie.or.space.connect } if$ } FUNCTION {format.article.crossref} { key empty$ { journal empty$ { "need key or journal for " cite$ * " to crossref " * crossref * warning$ "" } { "In {\em " journal * "\/}" * } if$ } { "In " key * } if$ " \cite{" * crossref * "}" * } FUNCTION {format.crossref.editor} { editor #1 
"{vv~}{ll}" format.name$ editor num.names$ duplicate$ #2 > { pop$ " et~al." * } { #2 < 'skip$ { editor #2 "{ff }{vv }{ll}{ jj}" format.name$ "others" = { " et~al." * } { " and " * editor #2 "{vv~}{ll}" format.name$ * } if$ } if$ } if$ } FUNCTION {format.book.crossref} { volume empty$ { "empty volume in " cite$ * "'s crossref of " * crossref * warning$ "In " } { "Volume" volume tie.or.space.connect " of " * } if$ editor empty$ editor field.or.null author field.or.null = or { key empty$ { series empty$ { "need editor, key, or series for " cite$ * " to crossref " * crossref * warning$ "" * } { "{\em " * series * "\/}" * } if$ } { key * } if$ } { format.crossref.editor * } if$ " \cite{" * crossref * "}" * } FUNCTION {format.incoll.inproc.crossref} { editor empty$ editor field.or.null author field.or.null = or { key empty$ { booktitle empty$ { "need editor, key, or booktitle for " cite$ * " to crossref " * crossref * warning$ "" } { "In {\em " booktitle * "\/}" * } if$ } { "In " key * } if$ } { "In " format.crossref.editor * } if$ " \cite{" * crossref * "}" * } FUNCTION {article} { output.bibitem format.authors "author" output.check new.block format.title "title" output.check new.block crossref missing$ { journal emphasize "journal" output.check format.vol.num.pages output format.date "year" output.check } { format.article.crossref output.nonnull format.pages output } if$ new.block note output fin.entry } FUNCTION {book} { output.bibitem author empty$ { format.editors "author and editor" output.check } { format.authors output.nonnull crossref missing$ { "author and editor" editor either.or.check } 'skip$ if$ } if$ new.block format.btitle "title" output.check crossref missing$ { format.bvolume output new.block format.number.series output new.sentence publisher "publisher" output.check address output } { new.block format.book.crossref output.nonnull } if$ format.edition output format.date "year" output.check new.block note output fin.entry } FUNCTION {booklet} { 
output.bibitem format.authors output new.block format.title "title" output.check howpublished address new.block.checkb howpublished output address output format.date output new.block note output fin.entry } FUNCTION {inbook} { output.bibitem author empty$ { format.editors "author and editor" output.check } { format.authors output.nonnull crossref missing$ { "author and editor" editor either.or.check } 'skip$ if$ } if$ new.block format.btitle "title" output.check crossref missing$ { format.bvolume output format.chapter.pages "chapter and pages" output.check new.block format.number.series output new.sentence publisher "publisher" output.check address output } { format.chapter.pages "chapter and pages" output.check new.block format.book.crossref output.nonnull } if$ format.edition output format.date "year" output.check new.block note output fin.entry } FUNCTION {incollection} { output.bibitem format.authors "author" output.check new.block format.title "title" output.check new.block crossref missing$ { format.in.ed.booktitle "booktitle" output.check format.bvolume output format.number.series output format.chapter.pages output new.sentence publisher "publisher" output.check address output format.edition output format.date "year" output.check } { format.incoll.inproc.crossref output.nonnull format.chapter.pages output } if$ new.block note output fin.entry } FUNCTION {inproceedings} { output.bibitem format.authors "author" output.check new.block format.title "title" output.check new.block crossref missing$ { format.in.ed.booktitle "booktitle" output.check format.bvolume output format.number.series output format.pages output address empty$ { organization publisher new.sentence.checkb organization output publisher output format.date "year" output.check } { address output.nonnull format.date "year" output.check new.sentence organization output publisher output } if$ } { format.incoll.inproc.crossref output.nonnull format.pages output } if$ new.block note output fin.entry } 
FUNCTION {conference} { inproceedings } FUNCTION {manual} { output.bibitem author empty$ { organization empty$ 'skip$ { organization output.nonnull address output } if$ } { format.authors output.nonnull } if$ new.block format.btitle "title" output.check author empty$ { organization empty$ { address new.block.checka address output } 'skip$ if$ } { organization address new.block.checkb organization output address output } if$ format.edition output format.date output new.block note output fin.entry } FUNCTION {mastersthesis} { output.bibitem format.authors "author" output.check new.block format.title "title" output.check new.block "Master's thesis" format.thesis.type output.nonnull school "school" output.check address output format.date "year" output.check new.block note output fin.entry } FUNCTION {misc} { output.bibitem format.authors output title howpublished new.block.checkb format.title output howpublished new.block.checka howpublished output format.date output new.block note output fin.entry empty.misc.check } FUNCTION {phdthesis} { output.bibitem format.authors "author" output.check new.block format.btitle "title" output.check new.block "PhD thesis" format.thesis.type output.nonnull school "school" output.check address output format.date "year" output.check new.block note output fin.entry } FUNCTION {proceedings} { output.bibitem editor empty$ { organization output } { format.editors output.nonnull } if$ new.block format.btitle "title" output.check format.bvolume output format.number.series output address empty$ { editor empty$ { publisher new.sentence.checka } { organization publisher new.sentence.checkb organization output } if$ publisher output format.date "year" output.check } { address output.nonnull format.date "year" output.check new.sentence editor empty$ 'skip$ { organization output } if$ publisher output } if$ new.block note output fin.entry } FUNCTION {techreport} { output.bibitem format.authors "author" output.check new.block format.title "title" 
output.check new.block format.tr.number output.nonnull institution "institution" output.check address output format.date "year" output.check new.block note output fin.entry } FUNCTION {unpublished} { output.bibitem format.authors "author" output.check new.block format.title "title" output.check new.block note "note" output.check format.date output fin.entry } FUNCTION {default.type} { misc } MACRO {jan} {"January"} MACRO {feb} {"February"} MACRO {mar} {"March"} MACRO {apr} {"April"} MACRO {may} {"May"} MACRO {jun} {"June"} MACRO {jul} {"July"} MACRO {aug} {"August"} MACRO {sep} {"September"} MACRO {oct} {"October"} MACRO {nov} {"November"} MACRO {dec} {"December"} MACRO {acmcs} {"ACM Computing Surveys"} MACRO {acta} {"Acta Informatica"} MACRO {cacm} {"Communications of the ACM"} MACRO {ibmjrd} {"IBM Journal of Research and Development"} MACRO {ibmsj} {"IBM Systems Journal"} MACRO {ieeese} {"IEEE Transactions on Software Engineering"} MACRO {ieeetc} {"IEEE Transactions on Computers"} MACRO {ieeetcad} {"IEEE Transactions on Computer-Aided Design of Integrated Circuits"} MACRO {ipl} {"Information Processing Letters"} MACRO {jacm} {"Journal of the ACM"} MACRO {jcss} {"Journal of Computer and System Sciences"} MACRO {scp} {"Science of Computer Programming"} MACRO {sicomp} {"SIAM Journal on Computing"} MACRO {tocs} {"ACM Transactions on Computer Systems"} MACRO {tods} {"ACM Transactions on Database Systems"} MACRO {tog} {"ACM Transactions on Graphics"} MACRO {toms} {"ACM Transactions on Mathematical Software"} MACRO {toois} {"ACM Transactions on Office Information Systems"} MACRO {toplas} {"ACM Transactions on Programming Languages and Systems"} MACRO {tcs} {"Theoretical Computer Science"} READ FUNCTION {sortify} { purify$ "l" change.case$ } INTEGERS { len } FUNCTION {chop.word} { 's := 'len := s #1 len substring$ = { s len #1 + global.max$ substring$ } 's if$ } FUNCTION {sort.format.names} { 's := #1 'nameptr := "" s num.names$ 'numnames := numnames 'namesleft := { 
namesleft #0 > } { nameptr #1 > { " " * } 'skip$ if$ s nameptr "{vv{ } }{ll{ }}{ ff{ }}{ jj{ }}" format.name$ 't := nameptr numnames = t "others" = and { "et al" * } { t sortify * } if$ nameptr #1 + 'nameptr := namesleft #1 - 'namesleft := } while$ } FUNCTION {sort.format.title} { 't := "A " #2 "An " #3 "The " #4 t chop.word chop.word chop.word sortify #1 global.max$ substring$ } FUNCTION {author.sort} { author empty$ { key empty$ { "to sort, need author or key in " cite$ * warning$ "" } { key sortify } if$ } { author sort.format.names } if$ } FUNCTION {author.editor.sort} { author empty$ { editor empty$ { key empty$ { "to sort, need author, editor, or key in " cite$ * warning$ "" } { key sortify } if$ } { editor sort.format.names } if$ } { author sort.format.names } if$ } FUNCTION {author.organization.sort} { author empty$ { organization empty$ { key empty$ { "to sort, need author, organization, or key in " cite$ * warning$ "" } { key sortify } if$ } { "The " #4 organization chop.word sortify } if$ } { author sort.format.names } if$ } FUNCTION {editor.organization.sort} { editor empty$ { organization empty$ { key empty$ { "to sort, need editor, organization, or key in " cite$ * warning$ "" } { key sortify } if$ } { "The " #4 organization chop.word sortify } if$ } { editor sort.format.names } if$ } FUNCTION {presort} { type$ "book" = type$ "inbook" = or 'author.editor.sort { type$ "proceedings" = 'editor.organization.sort { type$ "manual" = 'author.organization.sort 'author.sort if$ } if$ } if$ " " * year field.or.null sortify * " " * title field.or.null sort.format.title * #1 entry.max$ substring$ 'sort.key$ := } ITERATE {presort} SORT STRINGS { longest.label } INTEGERS { number.label longest.label.width } FUNCTION {initialize.longest.label} { "" 'longest.label := #1 'number.label := #0 'longest.label.width := } FUNCTION {longest.label.pass} { number.label int.to.str$ 'label := number.label #1 + 'number.label := label width$ longest.label.width > { label 
'longest.label := label width$ 'longest.label.width := } 'skip$ if$ } EXECUTE {initialize.longest.label} ITERATE {longest.label.pass} FUNCTION {begin.bib} { preamble$ empty$ 'skip$ { preamble$ write$ newline$ } if$ "\begin{thebibliography}{" longest.label * "}" * write$ newline$ } EXECUTE {begin.bib} EXECUTE {init.state.consts} ITERATE {call.type$} FUNCTION {end.bib} { newline$ "\end{thebibliography}" write$ newline$ } EXECUTE {end.bib}

menhir-20200123/doc/sigplanconf.cls

%-----------------------------------------------------------------------------
%
%          LaTeX Class/Style File
%
% Name:         sigplanconf.cls
% Purpose:      A LaTeX 2e class file for SIGPLAN conference proceedings.
%               This class file supercedes acm_proc_article-sp,
%               sig-alternate, and sigplan-proc.
%
% Author:       Paul C. Anagnostopoulos
%               Windfall Software
%               978 371-2316
%               paul@windfall.com
%
% Created:      12 September 2004
%
% Revisions:    See end of file.
%
%-----------------------------------------------------------------------------

\NeedsTeXFormat{LaTeX2e}[1995/12/01]
\ProvidesClass{sigplanconf}[2005/03/07 v0.93 ACM SIGPLAN Proceedings]

% The following few pages contain LaTeX programming extensions adapted
% from the ZzTeX macro package.

% Token Hackery
% ----- -------

\def \@expandaftertwice {\expandafter\expandafter\expandafter}
\def \@expandafterthrice {\expandafter\expandafter\expandafter\expandafter
                          \expandafter\expandafter\expandafter}

% This macro discards the next token.

\def \@discardtok #1{}%                                 token

% This macro removes the `pt' following a dimension.

{\catcode `\p = 12 \catcode `\t = 12

\gdef \@remover #1pt{#1}

} % \catcode

% This macro extracts the contents of a macro and returns it as plain text.
% Usage:  \expandafter\@defof \meaning\macro\@mark

\def \@defof #1:->#2\@mark{#2}

% Control Sequence Names
% ------- -------- -----

\def \@name #1{%                                        {\tokens}
  \csname \expandafter\@discardtok \string#1\endcsname}

\def \@withname #1#2{%                                  {\command}{\tokens}
  \expandafter#1\csname \expandafter\@discardtok \string#2\endcsname}

% Flags (Booleans)
% ----- ----------

% The boolean literals \@true and \@false are appropriate for use with
% the \if command, which tests the codes of the next two characters.

\def \@true {TT}
\def \@false {FL}

\def \@setflag #1=#2{\edef #1{#2}}%                     \flag = boolean

% IF and Predicates
% -- --- ----------

% A "predicate" is a macro that returns \@true or \@false as its value.
% Such values are suitable for use with the \if conditional.  For example:
%
%   \if \oddp{\x} <then-text> \else <else-text> \fi
%
% A predicate can be used with \@setflag as follows:
%
%   \@setflag \flag = {<predicate>}

% Here are the predicates for TeX's repertoire of conditional
% commands.  These might be more appropriately interspersed with
% other definitions in this module, but what the heck.
% Some additional "obvious" predicates are defined.
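A concrete use of this flag/predicate machinery, kept in TeX-comment form so the class file remains valid. This is an illustrative sketch only: the flag name \@wide and the dimension threshold are invented for the example and are not part of sigplanconf.cls; the angle-bracketed branches are placeholders in the style of this file's own comments.

```latex
% For instance (hypothetical flag \@wide, not defined by this class):
% \@setflag \edef's the predicate's result, so the flag expands to the
% two-character literal TT or FL, which \if then compares.
%
%   \@setflag \@wide = {\dimgtrp{\textwidth}{30pc}}
%   ...
%   \if \@wide
%     <code for the wide layout>
%   \else
%     <code for the narrow layout>
%   \fi
```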
\def \eqlp    #1#2{\ifnum #1 = #2\@true \else \@false \fi}
\def \neqlp   #1#2{\ifnum #1 = #2\@false \else \@true \fi}
\def \lssp    #1#2{\ifnum #1 < #2\@true \else \@false \fi}
\def \gtrp    #1#2{\ifnum #1 > #2\@true \else \@false \fi}
\def \zerop   #1{\ifnum #1 = 0\@true \else \@false \fi}
\def \onep    #1{\ifnum #1 = 1\@true \else \@false \fi}
\def \posp    #1{\ifnum #1 > 0\@true \else \@false \fi}
\def \negp    #1{\ifnum #1 < 0\@true \else \@false \fi}
\def \oddp    #1{\ifodd #1\@true \else \@false \fi}
\def \evenp   #1{\ifodd #1\@false \else \@true \fi}
\def \rangep  #1#2#3{\if \orp{\lssp{#1}{#2}}{\gtrp{#1}{#3}}\@false \else
                       \@true \fi}
\def \tensp   #1{\rangep{#1}{10}{19}}

\def \dimeqlp   #1#2{\ifdim #1 = #2\@true \else \@false \fi}
\def \dimneqlp  #1#2{\ifdim #1 = #2\@false \else \@true \fi}
\def \dimlssp   #1#2{\ifdim #1 < #2\@true \else \@false \fi}
\def \dimgtrp   #1#2{\ifdim #1 > #2\@true \else \@false \fi}
\def \dimzerop  #1{\ifdim #1 = 0pt\@true \else \@false \fi}
\def \dimposp   #1{\ifdim #1 > 0pt\@true \else \@false \fi}
\def \dimnegp   #1{\ifdim #1 < 0pt\@true \else \@false \fi}

\def \vmodep     {\ifvmode \@true \else \@false \fi}
\def \hmodep     {\ifhmode \@true \else \@false \fi}
\def \mathmodep  {\ifmmode \@true \else \@false \fi}
\def \textmodep  {\ifmmode \@false \else \@true \fi}
\def \innermodep {\ifinner \@true \else \@false \fi}

\long\def \codeeqlp #1#2{\if #1#2\@true \else \@false \fi}

\long\def \cateqlp #1#2{\ifcat #1#2\@true \else \@false \fi}

\long\def \tokeqlp  #1#2{\ifx #1#2\@true \else \@false \fi}
\long\def \xtokeqlp #1#2{\expandafter\ifx #1#2\@true \else \@false \fi}

\long\def \definedp #1{%
  \expandafter\ifx \csname \expandafter\@discardtok \string#1\endcsname
                   \relax \@false \else \@true \fi}

\long\def \undefinedp #1{%
  \expandafter\ifx \csname \expandafter\@discardtok \string#1\endcsname
                   \relax \@true \else \@false \fi}

\def \emptydefp #1{\ifx #1\@empty \@true \else \@false \fi}%    {\name}

\let \emptylistp = \emptydefp

\long\def \emptyargp #1{%                               {#n}
  \@empargp #1\@empargq\@mark}
\long\def \@empargp #1#2\@mark{%
  \ifx #1\@empargq \@true \else \@false \fi}
\def \@empargq {\@empargq}

\def \emptytoksp #1{%                                   {\tokenreg}
  \expandafter\@emptoksp \the#1\@mark}

\long\def \@emptoksp #1\@mark{\emptyargp{#1}}

\def \voidboxp #1{\ifvoid #1\@true \else \@false \fi}
\def \hboxp #1{\ifhbox #1\@true \else \@false \fi}
\def \vboxp #1{\ifvbox #1\@true \else \@false \fi}

\def \eofp #1{\ifeof #1\@true \else \@false \fi}

% Flags can also be used as predicates, as in:
%
%   \if \flaga <then-text> \else <else-text> \fi

% Now here we have predicates for the common logical operators.

\def \notp #1{\if #1\@false \else \@true \fi}

\def \andp #1#2{\if #1%
                  \if #2\@true \else \@false \fi
                \else
                  \@false
                \fi}

\def \orp #1#2{\if #1%
                 \@true
               \else
                 \if #2\@true \else \@false \fi
               \fi}

\def \xorp #1#2{\if #1%
                  \if #2\@false \else \@true \fi
                \else
                  \if #2\@true \else \@false \fi
                \fi}

% Arithmetic
% ----------

\def \@increment #1{\advance #1 by 1\relax}%            {\count}

\def \@decrement #1{\advance #1 by -1\relax}%           {\count}

% Options
% -------

\@setflag \@blockstyle = \@false
\@setflag \@copyrightwanted = \@true
\@setflag \@explicitsize = \@false
\@setflag \@mathtime = \@false
\@setflag \@ninepoint = \@true
\@setflag \@onecolumn = \@false
\@setflag \@preprint = \@false
\newcount{\@numheaddepth} \@numheaddepth = 3
\@setflag \@times = \@false

% Note that all the dangerous article class options are trapped.
\DeclareOption{9pt}{\@setflag \@ninepoint = \@true \@setflag \@explicitsize = \@true} \DeclareOption{10pt}{\PassOptionsToClass{10pt}{article}% \@setflag \@ninepoint = \@false \@setflag \@explicitsize = \@true} \DeclareOption{11pt}{\PassOptionsToClass{11pt}{article}% \@setflag \@ninepoint = \@false \@setflag \@explicitsize = \@true} \DeclareOption{12pt}{\@unsupportedoption{12pt}} \DeclareOption{a4paper}{\@unsupportedoption{a4paper}} \DeclareOption{a5paper}{\@unsupportedoption{a5paper}} \DeclareOption{b5paper}{\@unsupportedoption{b5paper}} \DeclareOption{blockstyle}{\@setflag \@blockstyle = \@true} \DeclareOption{cm}{\@setflag \@times = \@false} \DeclareOption{computermodern}{\@setflag \@times = \@false} \DeclareOption{executivepaper}{\@unsupportedoption{executivepaper}} \DeclareOption{indentedstyle}{\@setflag \@blockstyle = \@false} \DeclareOption{landscape}{\@unsupportedoption{landscape}} \DeclareOption{legalpaper}{\@unsupportedoption{legalpaper}} \DeclareOption{letterpaper}{\@unsupportedoption{letterpaper}} \DeclareOption{mathtime}{\@setflag \@mathtime = \@true} \DeclareOption{nocopyrightspace}{\@setflag \@copyrightwanted = \@false} \DeclareOption{notitlepage}{\@unsupportedoption{notitlepage}} \DeclareOption{numberedpars}{\@numheaddepth = 4} \DeclareOption{onecolumn}{\@setflag \@onecolumn = \@true} \DeclareOption{preprint}{\@setflag \@preprint = \@true} \DeclareOption{times}{\@setflag \@times = \@true} \DeclareOption{titlepage}{\@unsupportedoption{titlepage}} \DeclareOption{twocolumn}{\@setflag \@onecolumn = \@false} \DeclareOption*{\PassOptionsToClass{\CurrentOption}{article}} \ExecuteOptions{9pt,indentedstyle,times} \@setflag \@explicitsize = \@false \ProcessOptions \if \@onecolumn \if \notp{\@explicitsize}% \@setflag \@ninepoint = \@false \PassOptionsToClass{11pt}{article}% \fi \PassOptionsToClass{twoside,onecolumn}{article} \else \PassOptionsToClass{twoside,twocolumn}{article} \fi \LoadClass{article} \def \@unsupportedoption #1{% \ClassError{proc}{The standard 
'#1' option is not supported.}} % Utilities % --------- \newcommand{\setvspace}[2]{% #1 = #2 \advance #1 by -1\parskip} % Document Parameters % -------- ---------- % Page: \setlength{\hoffset}{-1in} \setlength{\voffset}{-1in} \setlength{\topmargin}{1in} \setlength{\headheight}{0pt} \setlength{\headsep}{0pt} \if \@onecolumn \setlength{\evensidemargin}{.75in} \setlength{\oddsidemargin}{.75in} \else \setlength{\evensidemargin}{.75in} \setlength{\oddsidemargin}{.75in} \fi % Text area: \newdimen{\standardtextwidth} \setlength{\standardtextwidth}{42pc} \if \@onecolumn \setlength{\textwidth}{40.5pc} \else \setlength{\textwidth}{\standardtextwidth} \fi \setlength{\topskip}{8pt} \setlength{\columnsep}{2pc} \setlength{\textheight}{54.5pc} % Running foot: \setlength{\footskip}{30pt} % Paragraphs: \if \@blockstyle \setlength{\parskip}{5pt plus .1pt minus .5pt} \setlength{\parindent}{0pt} \else \setlength{\parskip}{0pt} \setlength{\parindent}{12pt} \fi \setlength{\lineskip}{.5pt} \setlength{\lineskiplimit}{\lineskip} \frenchspacing \pretolerance = 400 \tolerance = \pretolerance \setlength{\emergencystretch}{5pt} \clubpenalty = 10000 \widowpenalty = 10000 \setlength{\hfuzz}{.5pt} % Standard vertical spaces: \newskip{\standardvspace} \setvspace{\standardvspace}{5pt plus 1pt minus .5pt} % Margin paragraphs: \setlength{\marginparwidth}{0pt} \setlength{\marginparsep}{0pt} \setlength{\marginparpush}{0pt} \setlength{\skip\footins}{8pt plus 3pt minus 1pt} \setlength{\footnotesep}{9pt} \renewcommand{\footnoterule}{% \hrule width .5\columnwidth height .33pt depth 0pt} \renewcommand{\@makefntext}[1]{% \noindent \@makefnmark \hspace{1pt}#1} % Floats: \setcounter{topnumber}{4} \setcounter{bottomnumber}{1} \setcounter{totalnumber}{4} \renewcommand{\fps@figure}{tp} \renewcommand{\fps@table}{tp} \renewcommand{\topfraction}{0.90} \renewcommand{\bottomfraction}{0.30} \renewcommand{\textfraction}{0.10} \renewcommand{\floatpagefraction}{0.75} \setcounter{dbltopnumber}{4} 
\renewcommand{\dbltopfraction}{\topfraction} \renewcommand{\dblfloatpagefraction}{\floatpagefraction} \setlength{\floatsep}{18pt plus 4pt minus 2pt} \setlength{\textfloatsep}{18pt plus 4pt minus 3pt} \setlength{\intextsep}{10pt plus 4pt minus 3pt} \setlength{\dblfloatsep}{18pt plus 4pt minus 2pt} \setlength{\dbltextfloatsep}{20pt plus 4pt minus 3pt} % Miscellaneous: \errorcontextlines = 5 % Fonts % ----- \if \@times \renewcommand{\rmdefault}{ptm}% \if \@mathtime \usepackage[mtbold,noTS1]{mathtime}% \else %%% \usepackage{mathptm}% \fi \else \relax \fi \if \@ninepoint \renewcommand{\normalsize}{% \@setfontsize{\normalsize}{9pt}{10pt}% \setlength{\abovedisplayskip}{5pt plus 1pt minus .5pt}% \setlength{\belowdisplayskip}{\abovedisplayskip}% \setlength{\abovedisplayshortskip}{3pt plus 1pt minus 2pt}% \setlength{\belowdisplayshortskip}{\abovedisplayshortskip}} \renewcommand{\tiny}{\@setfontsize{\tiny}{5pt}{6pt}} \renewcommand{\scriptsize}{\@setfontsize{\scriptsize}{7pt}{8pt}} \renewcommand{\small}{% \@setfontsize{\small}{8pt}{9pt}% \setlength{\abovedisplayskip}{4pt plus 1pt minus 1pt}% \setlength{\belowdisplayskip}{\abovedisplayskip}% \setlength{\abovedisplayshortskip}{2pt plus 1pt}% \setlength{\belowdisplayshortskip}{\abovedisplayshortskip}} \renewcommand{\footnotesize}{% \@setfontsize{\footnotesize}{8pt}{9pt}% \setlength{\abovedisplayskip}{4pt plus 1pt minus .5pt}% \setlength{\belowdisplayskip}{\abovedisplayskip}% \setlength{\abovedisplayshortskip}{2pt plus 1pt}% \setlength{\belowdisplayshortskip}{\abovedisplayshortskip}} \renewcommand{\large}{\@setfontsize{\large}{11pt}{13pt}} \renewcommand{\Large}{\@setfontsize{\Large}{14pt}{18pt}} \renewcommand{\LARGE}{\@setfontsize{\LARGE}{18pt}{20pt}} \renewcommand{\huge}{\@setfontsize{\huge}{20pt}{25pt}} \renewcommand{\Huge}{\@setfontsize{\Huge}{25pt}{30pt}} \fi % Abstract % -------- \renewenvironment{abstract}{% \section*{Abstract}% \normalsize}{% } % Bibliography % ------------ \renewenvironment{thebibliography}[1] 
{\section*{\refname \@mkboth{\MakeUppercase\refname}{\MakeUppercase\refname}}% \list{\@biblabel{\@arabic\c@enumiv}}% {\settowidth\labelwidth{\@biblabel{#1}}% \leftmargin\labelwidth \advance\leftmargin\labelsep \@openbib@code \usecounter{enumiv}% \let\p@enumiv\@empty \renewcommand\theenumiv{\@arabic\c@enumiv}}% \small \softraggedright%%%\sloppy \clubpenalty4000 \@clubpenalty \clubpenalty \widowpenalty4000% \sfcode`\.\@m} {\def\@noitemerr {\@latex@warning{Empty `thebibliography' environment}}% \endlist} % Categories % ---------- \@setflag \@firstcategory = \@true \newcommand{\category}[3]{% \if \@firstcategory \paragraph*{Categories and Subject Descriptors}% \@setflag \@firstcategory = \@false \else \unskip ;\hspace{.75em}% \fi \@ifnextchar [{\@category{#1}{#2}{#3}}{\@category{#1}{#2}{#3}[]}} \def \@category #1#2#3[#4]{% {\let \and = \relax #1 [\textit{#2}]% \if \emptyargp{#4}% \if \notp{\emptyargp{#3}}: #3\fi \else :\space \if \notp{\emptyargp{#3}}#3---\fi \textrm{#4}% \fi}} % Copyright Notice % --------- ------ \def \ftype@copyrightbox {8} \def \@toappear {} \def \@permission {} \def \@copyrightspace {% \@float{copyrightbox}[b]% \vbox to 1.25in{% \vfill \begin{center}% \@toappear \end{center}}% \end@float} \long\def \toappear #1{% \def \@toappear {\parbox[b]{20pc}{\scriptsize #1}}} %%%\def \toappearbox #1{% %%% \def \@toappear {\raisebox{5pt}{\framebox[20pc]{\parbox[b]{19pc}{#1}}}}} \toappear{% \noindent \@permission \par \vspace{2pt} \noindent \textsl{\@conferencename}\quad \@conferenceinfo \par \@copyrightinfo} \newcommand{\permission}[1]{% \gdef \@permission {#1}} \permission{% Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. 
To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.} \def \@copyrightinfo {% \if \notp{\emptydefp{\copyrightinfo}}% Copyright \copyright\ \@copyrightyear\ ACM \@copyrightdata\dots \$5.00. \fi} % Enunciations % ------------ \def \@begintheorem #1#2{% {name}{number} \trivlist \item[\hskip \labelsep \textsc{#1 #2.}]% \itshape\selectfont \ignorespaces} \def \@opargbegintheorem #1#2#3{% {name}{number}{title} \trivlist \item[% \hskip\labelsep \textsc{#1\ #2}% \if \notp{\emptyargp{#3}}\nut (#3).\fi]% \itshape\selectfont \ignorespaces} \@setflag \@qeddone = \@false \newenvironment{proof}{% \global\@setflag \@qeddone = \@false \@ifnextchar[{\@titledproof}{\@titledproof[]}}{% \if \notp{\@qeddone}\qed \fi \endtrivlist} \def \@titledproof [#1]{% \trivlist \item[\hskip \labelsep \textsc{Proof% \if \notp{\emptyargp{#1}}\space #1\fi .}]% \ignorespaces} \newcommand{\qed}{% \unskip \kern 6pt {\linethickness{.5pt}\framebox(4,4){}}% \global\@setflag \@qeddone = \@true} \newcommand{\newdef}[2]{% {type}{name} \@withname\@ifdefinable {#1}{% \@definecounter{#1}% \@withname\xdef {the#1}{\@thmcounter{#1}}% \global\@namedef{#1}{\@begindef{#1}{#2}}% \global\@namedef{end#1}{\@endtheorem}}} \def \@begindef #1#2{% {type}{name} \refstepcounter{#1}% \@ifnextchar[{\@titleddef{#1}{#2}}{\@titleddef{#1}{#2}[]}} \def \@titleddef #1#2[#3]{% {type}{name}[title] \trivlist \item[\hskip \labelsep \itshape{#2% \if \notp{\emptyargp{#3}}\space #3\fi .}]% \ignorespaces} % Figures % ------- \@setflag \@caprule = \@true \long\def \@makecaption #1#2{% \addvspace{4pt} \if \@caprule \hrule width \hsize height .33pt \vspace{4pt} \fi \setbox \@tempboxa = \hbox{\@setfigurenumber{#1.}\nut #2}% \if \dimgtrp{\wd\@tempboxa}{\hsize}% \noindent \@setfigurenumber{#1.}\nut #2\par \else \centerline{\box\@tempboxa}% \fi} \newcommand{\nocaptionrule}{% \@setflag \@caprule = \@false} \def \@setfigurenumber #1{% {\rmfamily \bfseries \selectfont #1}} % 
Hierarchy % --------- \setcounter{secnumdepth}{\@numheaddepth} \newskip{\@sectionaboveskip} \setvspace{\@sectionaboveskip}{10pt plus 3pt minus 2pt} \newskip{\@sectionbelowskip} \if \@blockstyle \setlength{\@sectionbelowskip}{0.1pt}% \else \setlength{\@sectionbelowskip}{4pt}% \fi \renewcommand{\section}{% \@startsection {section}% {1}% {0pt}% {-\@sectionaboveskip}% {\@sectionbelowskip}% {\large \bfseries \raggedright}} \newskip{\@subsectionaboveskip} \setvspace{\@subsectionaboveskip}{8pt plus 2pt minus 2pt} \newskip{\@subsectionbelowskip} \if \@blockstyle \setlength{\@subsectionbelowskip}{0.1pt}% \else \setlength{\@subsectionbelowskip}{4pt}% \fi \renewcommand{\subsection}{% \@startsection% {subsection}% {2}% {0pt}% {-\@subsectionaboveskip}% {\@subsectionbelowskip}% {\normalsize \bfseries \raggedright}} \renewcommand{\subsubsection}{% \@startsection% {subsubsection}% {3}% {0pt}% {-\@subsectionaboveskip} {\@subsectionbelowskip}% {\normalsize \bfseries \raggedright}} \newskip{\@paragraphaboveskip} \setvspace{\@paragraphaboveskip}{6pt plus 2pt minus 2pt} \renewcommand{\paragraph}{% \@startsection% {paragraph}% {4}% {0pt}% {\@paragraphaboveskip} {-1em}% {\normalsize \bfseries \if \@times \itshape \fi}} % Standard headings: \newcommand{\acks}{\section*{Acknowledgments}} \newcommand{\keywords}{\paragraph*{Keywords}} \newcommand{\terms}{\paragraph*{General Terms}} % Identification % -------------- \def \@conferencename {} \def \@conferenceinfo {} \def \@copyrightyear {} \def \@copyrightdata {[to be supplied]} \newcommand{\conferenceinfo}[2]{% \gdef \@conferencename {#1}% \gdef \@conferenceinfo {#2}} \newcommand{\copyrightyear}[1]{% \gdef \@copyrightyear {#1}} \let \CopyrightYear = \copyrightyear \newcommand{\copyrightdata}[1]{% \gdef \@copyrightdata {#1}} \let \crdata = \copyrightdata % Lists % ----- \setlength{\leftmargini}{13pt} \setlength\leftmarginii{13pt} \setlength\leftmarginiii{13pt} \setlength\leftmarginiv{13pt} \setlength{\labelsep}{3.5pt} 
\setlength{\topsep}{\standardvspace} \if \@blockstyle \setlength{\itemsep}{0pt} \setlength{\parsep}{4pt} \else \setlength{\itemsep}{2pt} \setlength{\parsep}{0pt} \fi \renewcommand{\labelitemi}{{\small \centeroncapheight{\textbullet}}} \renewcommand{\labelitemii}{\centeroncapheight{\rule{2.5pt}{2.5pt}}} \renewcommand{\labelitemiii}{$-$} \renewcommand{\labelitemiv}{{\Large \textperiodcentered}} \renewcommand{\@listi}{% \leftmargin = \leftmargini \listparindent = \parindent} \let \@listI = \@listi \renewcommand{\@listii}{% \leftmargin = \leftmarginii \labelwidth = \leftmarginii \advance \labelwidth by -\labelsep \listparindent = \parindent} \renewcommand{\@listiii}{% \leftmargin = \leftmarginiii \labelwidth = \leftmarginiii \advance \labelwidth by -\labelsep \listparindent = \parindent} \renewcommand{\@listiv}{% \leftmargin = \leftmarginiv \labelwidth = \leftmarginiv \advance \labelwidth by -\labelsep \listparindent = \parindent} % Mathematics % ----------- \def \theequation {\arabic{equation}} % Miscellaneous % ------------- \newcommand{\balancecolumns}{% \vfill\eject \global\@colht = \textheight \global\ht\@cclv = \textheight} \newcommand{\nut}{\hspace{.5em}} \newcommand{\softraggedright}{% \let \\ = \@centercr \leftskip = 0pt \rightskip = 0pt plus 10pt} % Program Code % ------- ---- \newcommand{\mono}[1]{% {\@tempdima = \fontdimen2\font \texttt{\spaceskip = 1.1\@tempdima #1}}} % Running Heads and Feet % ------- ----- --- ---- \if \@preprint \def \ps@plain {% \let \@mkboth = \@gobbletwo \let \@evenhead = \@empty \def \@evenfoot {% \reset@font \@conferencename \hfil \thepage \hfil \@formatyear}% \let \@oddhead = \@empty \let \@oddfoot = \@evenfoot} \else \let \ps@plain = \ps@empty \let \ps@headings = \ps@empty \let \ps@myheadings = \ps@empty \fi \def \@formatyear {% \number\year/\number\month/\number\day} % Title Page % ----- ---- \@setflag \@addauthorsdone = \@false \def \@titletext {\@latex@error{No title was provided}{}} \def \@subtitletext {} 
\newcount{\@authorcount} \newcount{\@titlenotecount} \newtoks{\@titlenotetext} \renewcommand{\title}[1]{% \gdef \@titletext {#1}} \newcommand{\subtitle}[1]{% \gdef \@subtitletext {#1}} \newcommand{\authorinfo}[3]{% {names}{affiliation}{email/URL} \global\@increment \@authorcount \@withname\gdef {\@authorname\romannumeral\@authorcount}{#1}% \@withname\gdef {\@authoraffil\romannumeral\@authorcount}{#2}% \@withname\gdef {\@authoremail\romannumeral\@authorcount}{#3}} \renewcommand{\author}[1]{% \@latex@error{The \string\author\space command is obsolete; use \string\authorinfo}{}} \renewcommand{\maketitle}{% \pagestyle{plain}% \if \@onecolumn {\hsize = \standardtextwidth \@maketitle}% \else \twocolumn[\@maketitle]% \fi \@placetitlenotes \if \@copyrightwanted \@copyrightspace \fi} \def \@maketitle {% \begin{center} \let \thanks = \titlenote \noindent \LARGE \bfseries \@titletext \par \vskip 6pt \noindent \Large \@subtitletext \par \vskip 12pt \ifcase \@authorcount \@latex@error{No authors were specified for this paper}{}\or \@titleauthors{i}{}{}\or \@titleauthors{i}{ii}{}\or \@titleauthors{i}{ii}{iii}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{}{}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{}{}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{viii}{}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{viii}{ix}\or \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{viii}{ix}\@titleauthors{x}{}{}% \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{viii}{ix}\@titleauthors{x}{xi}{}% \@titleauthors{i}{ii}{iii}\@titleauthors{iv}{v}{vi}% \@titleauthors{vii}{viii}{ix}\@titleauthors{x}{xi}{xii}% \else \@latex@error{Cannot handle more than 12 authors}{}% \fi \vspace{1.75pc} \end{center}} \def \@titleauthors #1#2#3{% \if 
\andp{\emptyargp{#2}}{\emptyargp{#3}}% \noindent \@setauthor{40pc}{#1}{\@false}\par \else\if \emptyargp{#3}% \noindent \@setauthor{17pc}{#1}{\@false}\hspace{3pc}% \@setauthor{17pc}{#2}{\@false}\par \else \noindent \@setauthor{12.5pc}{#1}{\@false}\hspace{2pc}% \@setauthor{12.5pc}{#2}{\@false}\hspace{2pc}% \@setauthor{12.5pc}{#3}{\@true}\par \relax \fi\fi \vspace{20pt}} \def \@setauthor #1#2#3{% \vtop{% \def \and {% \hspace{16pt}} \hsize = #1 \normalfont \centering \large \@name{\@authorname#2}\par \vspace{5pt} \normalsize \@name{\@authoraffil#2}\par \vspace{2pt} \textsf{\@name{\@authoremail#2}}\par}} \def \@maybetitlenote #1{% \if \andp{#1}{\gtrp{\@authorcount}{3}}% \titlenote{See page~\pageref{@addauthors} for additional authors.}% \fi} \newtoks{\@fnmark} \newcommand{\titlenote}[1]{% \global\@increment \@titlenotecount \ifcase \@titlenotecount \relax \or \@fnmark = {\ast}\or \@fnmark = {\dagger}\or \@fnmark = {\ddagger}\or \@fnmark = {\S}\or \@fnmark = {\P}\or \@fnmark = {\ast\ast}% \fi \,$^{\the\@fnmark}$% \edef \reserved@a {\noexpand\@appendtotext{% \noexpand\@titlefootnote{\the\@fnmark}}}% \reserved@a{#1}} \def \@appendtotext #1#2{% \global\@titlenotetext = \expandafter{\the\@titlenotetext #1{#2}}} \newcount{\@authori} \iffalse \def \additionalauthors {% \if \gtrp{\@authorcount}{3}% \section{Additional Authors}% \label{@addauthors}% \noindent \@authori = 4 {\let \\ = ,% \loop \textbf{\@name{\@authorname\romannumeral\@authori}}, \@name{\@authoraffil\romannumeral\@authori}, email: \@name{\@authoremail\romannumeral\@authori}.% \@increment \@authori \if \notp{\gtrp{\@authori}{\@authorcount}} \repeat}% \par \fi \global\@setflag \@addauthorsdone = \@true} \fi \let \addauthorsection = \additionalauthors \def \@placetitlenotes { \the\@titlenotetext} % Utilities % --------- \newcommand{\centeroncapheight}[1]{% {\setbox\@tempboxa = \hbox{#1}% \@measurecapheight{\@tempdima}% % Calculate ht(CAP) - ht(text) \advance \@tempdima by -\ht\@tempboxa % ------------------ \divide 
\@tempdima by 2 % 2 \raise \@tempdima \box\@tempboxa}} \newbox{\@measbox} \def \@measurecapheight #1{% {\dimen} \setbox\@measbox = \hbox{ABCDEFGHIJKLMNOPQRSTUVWXYZ}% #1 = \ht\@measbox} \long\def \@titlefootnote #1#2{% \insert\footins{% \reset@font\footnotesize \interlinepenalty\interfootnotelinepenalty \splittopskip\footnotesep \splitmaxdepth \dp\strutbox \floatingpenalty \@MM \hsize\columnwidth \@parboxrestore %%% \protected@edef\@currentlabel{% %%% \csname p@footnote\endcsname\@thefnmark}% \color@begingroup \def \@makefnmark {$^{#1}$}% \@makefntext{% \rule\z@\footnotesep\ignorespaces#2\@finalstrut\strutbox}% \color@endgroup}} % LaTeX Modifications % ----- ------------- \def \@seccntformat #1{% \@name{\the#1}% \@expandaftertwice\@seccntformata \csname the#1\endcsname.\@mark \quad} \def \@seccntformata #1.#2\@mark{% \if \emptyargp{#2}.\fi} % Revision History % -------- ------- % Date Person Ver. Change % ---- ------ ---- ------ % 2004.09.12 PCA 0.1--5 Preliminary development. % 2004.11.18 PCA 0.5 Start beta testing. % 2004.11.19 PCA 0.6 Obsolete \author and replace with % \authorinfo. % Add 'nocopyrightspace' option. % Compress article opener spacing. % Add 'mathtime' option. % Increase text height by 6 points. % 2004.11.28 PCA 0.7 Add 'cm/computermodern' options. % Change default to Times text. % 2004.12.14 PCA 0.8 Remove use of mathptm.sty; it cannot % coexist with latexym or amssymb. % 2005.01.20 PCA 0.9 Rename class file to sigplanconf.cls. % 2005.03.05 PCA 0.91 Change default copyright data. % 2005.03.06 PCA 0.92 Add at-signs to some macro names. % 2005.03.07 PCA 0.93 The 'onecolumn' option defaults to '11pt', % and it uses the full type width. 
--- menhir-20200123/doc/sigplanconf.hva ---
\input{article.hva}
\usepackage{hyperref}
\usepackage{natbib}
\usepackage{amsmath}
\newcommand{\authorinfo}[3]{\author{\renewcommand\and{ and }#1\\#2\\\texttt{#3}}}
\newcommand{\mathstrut}[1]{#1}
\usepackage{style}

--- menhir-20200123/doc/style.hva ---
% Compact layout
\newstyle{body}{
  max-width:800px;
  width: 85\%;
  margin: auto;
  font-size: 1rem;
}
\newstyle{pre, .quote}{
  margin-left: 2em;
  font-size: 1rem;
}

--- menhir-20200123/doc/version.tex ---
\gdef\menhirversion{20200123}

--- menhir-20200123/doc/whizzy.el ---
(whizzy-add-configuration ".*\.\\(tex\\|sty\\)"
  '((whizzy-master . "main.tex")))
(whizzy-add-configuration "main\.tex"
  '((whizzy . "section -advi \"advi -geometry 1270x1024 -fullwidth -html Start-Document\" -dvicopy dvicopy")))

--- menhir-20200123/doc/whizzy.sh ---
# Include TEXINPUTS setting from Makefile.local.
# Do not include all of Makefile.local, because both whizzytex and Makefile
# rely on NAME (for different purposes).
if [ -f Makefile.local ]
then
  echo "Extracting TEXINPUTS setting from Makefile.local..."
  `grep TEXINPUTS Makefile.local`
fi

--- menhir-20200123/doc/whizzy.sty ---
% Use small pages when whizzytex'ing.
\makeatletter
\setlength\textheight{340\p@}
\makeatother

--- menhir-20200123/dune-project ---
(lang dune 2.0)
(name menhir)
(using menhir 2.0)
(version 20200123)
(package (name menhirLib))
(package (name menhirSdk))
(package (name menhir))
(package (name coq-menhirlib))

--- menhir-20200123/dune-workspace.versions ---
(lang dune 2.0)
(context (opam (switch 4.02.3)))
(context (opam (switch 4.03.0)))
(context (opam (switch 4.04.2)))
(context (opam (switch 4.05.0)))
(context (opam (switch 4.06.1)))
(context (opam (switch 4.07.1)))
(context (opam (switch 4.08.1)))
(context (opam (switch 4.09.0)))
(context (opam (switch 4.09.0+bytecode-only)))

--- menhir-20200123/lib/Convert.ml ---
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the *)
(*  terms of the GNU Library General Public License version 2, with a        *)
(*  special exception on linking, as described in the file LICENSE.          *)
(*                                                                            *)
(******************************************************************************)

(* An ocamlyacc-style, or Menhir-style, parser requires access to the lexer,
   which must be parameterized with a lexing buffer, and to the lexing buffer
   itself, where it reads position information. *)

(* This traditional API is convenient when used with ocamllex, but inelegant
   when used with other lexer generators. *)

type ('token, 'semantic_value) traditional =
  (Lexing.lexbuf -> 'token) -> Lexing.lexbuf -> 'semantic_value

(* This revised API is independent of any lexer generator.
   Here, the parser only requires access to the lexer, and the lexer takes no
   parameters. The tokens returned by the lexer may contain position
   information. *)

type ('token, 'semantic_value) revised =
  (unit -> 'token) -> 'semantic_value

(* --------------------------------------------------------------------------- *)

(* Converting a traditional parser, produced by ocamlyacc or Menhir, into a
   revised parser. *)

(* A token of the revised lexer is essentially a triple of a token of the
   traditional lexer (or raw token), a start position, and an end position.
   The three [get] functions are accessors. *)

(* We do not require the type ['token] to actually be a triple type. This
   enables complex applications where it is a record type with more than
   three fields. It also enables simple applications where positions are of
   no interest, so ['token] is just ['raw_token] and [get_startp] and
   [get_endp] return dummy positions. *)

let traditional2revised
  (get_raw_token : 'token -> 'raw_token)
  (get_startp : 'token -> Lexing.position)
  (get_endp : 'token -> Lexing.position)
  (parser : ('raw_token, 'semantic_value) traditional)
: ('token, 'semantic_value) revised =

  (* Accept a revised lexer. *)

  fun (lexer : unit -> 'token) ->

    (* Create a dummy lexing buffer. *)

    let lexbuf : Lexing.lexbuf =
      Lexing.from_string ""
    in

    (* Wrap the revised lexer as a traditional lexer. A traditional lexer
       returns a raw token and updates the fields of the lexing buffer with
       new positions, which will be read by the parser. *)

    let lexer (lexbuf : Lexing.lexbuf) : 'raw_token =
      let token : 'token = lexer() in
      lexbuf.Lexing.lex_start_p <- get_startp token;
      lexbuf.Lexing.lex_curr_p <- get_endp token;
      get_raw_token token
    in

    (* Invoke the traditional parser. *)

    parser lexer lexbuf

(* --------------------------------------------------------------------------- *)

(* Converting a revised parser back to a traditional parser.
*)

let revised2traditional
  (make_token : 'raw_token -> Lexing.position -> Lexing.position -> 'token)
  (parser : ('token, 'semantic_value) revised)
: ('raw_token, 'semantic_value) traditional =

  (* Accept a traditional lexer and a lexing buffer. *)

  fun (lexer : Lexing.lexbuf -> 'raw_token) (lexbuf : Lexing.lexbuf) ->

    (* Wrap the traditional lexer as a revised lexer. *)

    let lexer () : 'token =
      let token : 'raw_token = lexer lexbuf in
      make_token token lexbuf.Lexing.lex_start_p lexbuf.Lexing.lex_curr_p
    in

    (* Invoke the revised parser. *)

    parser lexer

(* --------------------------------------------------------------------------- *)

(* Simplified versions of the above, where concrete triples are used. *)

module Simplified = struct

  let traditional2revised parser =
    traditional2revised
      (fun (token, _, _)  -> token)
      (fun (_, startp, _) -> startp)
      (fun (_, _, endp)   -> endp)
      parser

  let revised2traditional parser =
    revised2traditional
      (fun token startp endp -> (token, startp, endp))
      parser

end

--- menhir-20200123/lib/Convert.mli ---
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the *)
(*  terms of the GNU Library General Public License version 2, with a        *)
(*  special exception on linking, as described in the file LICENSE.          *)
(*                                                                            *)
(******************************************************************************)

(* An ocamlyacc-style, or Menhir-style, parser requires access to the lexer,
   which must be parameterized with a lexing buffer, and to the lexing buffer
   itself, where it reads position information. *)

(* This traditional API is convenient when used with ocamllex, but inelegant
   when used with other lexer generators.
*)

type ('token, 'semantic_value) traditional =
  (Lexing.lexbuf -> 'token) -> Lexing.lexbuf -> 'semantic_value

(* This revised API is independent of any lexer generator. Here, the parser
   only requires access to the lexer, and the lexer takes no parameters. The
   tokens returned by the lexer may contain position information. *)

type ('token, 'semantic_value) revised =
  (unit -> 'token) -> 'semantic_value

(* --------------------------------------------------------------------------- *)

(* Converting a traditional parser, produced by ocamlyacc or Menhir, into a
   revised parser. *)

(* A token of the revised lexer is essentially a triple of a token of the
   traditional lexer (or raw token), a start position, and an end position.
   The three [get] functions are accessors. *)

(* We do not require the type ['token] to actually be a triple type. This
   enables complex applications where it is a record type with more than
   three fields. It also enables simple applications where positions are of
   no interest, so ['token] is just ['raw_token] and [get_startp] and
   [get_endp] return dummy positions. *)

val traditional2revised:
  ('token -> 'raw_token) ->
  ('token -> Lexing.position) ->
  ('token -> Lexing.position) ->
  ('raw_token, 'semantic_value) traditional ->
  ('token, 'semantic_value) revised

(* --------------------------------------------------------------------------- *)

(* Converting a revised parser back to a traditional parser. *)

val revised2traditional:
  ('raw_token -> Lexing.position -> Lexing.position -> 'token) ->
  ('token, 'semantic_value) revised ->
  ('raw_token, 'semantic_value) traditional

(* --------------------------------------------------------------------------- *)

(* Simplified versions of the above, where concrete triples are used.
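*)

(* Illustration, not part of the original file: a self-contained, hedged
   sketch of how the two APIs relate. It does not depend on MenhirLib: the
   [token] type, the toy parser, and the local [traditional2revised] below
   are invented or copied here purely for illustration. A toy "traditional"
   parser that sums INT tokens is wrapped as a "revised" parser driven by a
   plain [unit -> _] supplier. *)

```ocaml
(* Local copies of the two API types. *)
type token = INT of int | EOF

type ('tok, 'a) traditional = (Lexing.lexbuf -> 'tok) -> Lexing.lexbuf -> 'a
type ('tok, 'a) revised = (unit -> 'tok) -> 'a

(* A local copy of the triple-based conversion: feed the raw token to the
   traditional parser and store positions in a dummy lexing buffer. *)
let traditional2revised (p : ('tok, 'a) traditional)
  : ('tok * Lexing.position * Lexing.position, 'a) revised =
  fun lexer ->
    let lexbuf = Lexing.from_string "" in
    let wrapped lexbuf =
      let (tok, startp, endp) = lexer () in
      lexbuf.Lexing.lex_start_p <- startp;
      lexbuf.Lexing.lex_curr_p <- endp;
      tok
    in
    p wrapped lexbuf

(* A toy "traditional" parser: it sums INT tokens until EOF. *)
let rec sum lexer lexbuf acc =
  match lexer lexbuf with
  | INT n -> sum lexer lexbuf (acc + n)
  | EOF -> acc

let toy : (token, int) traditional = fun lexer lexbuf -> sum lexer lexbuf 0

(* A supplier built from a list; it returns EOF forever once exhausted. *)
let supplier_of_list toks =
  let r = ref toks in
  fun () ->
    match !r with
    | [] -> (EOF, Lexing.dummy_pos, Lexing.dummy_pos)
    | t :: rest -> r := rest; (t, Lexing.dummy_pos, Lexing.dummy_pos)

let () =
  let supply = supplier_of_list [INT 1; INT 2; INT 3] in
  assert (traditional2revised toy supply = 6)
```

(*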
*)

module Simplified : sig

  val traditional2revised:
    ('token, 'semantic_value) traditional ->
    ('token * Lexing.position * Lexing.position, 'semantic_value) revised

  val revised2traditional:
    ('token * Lexing.position * Lexing.position, 'semantic_value) revised ->
    ('token, 'semantic_value) traditional

end

--- menhir-20200123/lib/Engine.ml ---
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the *)
(*  terms of the GNU Library General Public License version 2, with a        *)
(*  special exception on linking, as described in the file LICENSE.          *)
(*                                                                            *)
(******************************************************************************)

type position = Lexing.position
open EngineTypes

(* The LR parsing engine. *)

(* This module is used:

   - at compile time, if so requested by the user, via the --interpret options;
   - at run time, in the table-based back-end. *)

module Make (T : TABLE) = struct

  (* This propagates type and exception definitions. The functions [number],
     [production_index], [find_production], too, are defined by this
     [include] declaration. *)

  include T

  type 'a env =
      (state, semantic_value, token) EngineTypes.env

  (* ------------------------------------------------------------------------ *)

  (* The type [checkpoint] represents an intermediate or final result of the
     parser. See [EngineTypes]. *)

  (* The type [checkpoint] is presented to the user as a private type (see
     [IncrementalEngine]). This prevents the user from manufacturing
     checkpoints (i.e., continuations) that do not make sense. (Such
     continuations could potentially violate the LR invariant and lead to
     crashes.) *)

  (* 2017/03/29 Although [checkpoint] is a private type, we now expose a
     constructor function, [input_needed].
This function allows manufacturing a checkpoint out of an environment. For this reason, the type [env] must also be parameterized with ['a]. *) type 'a checkpoint = | InputNeeded of 'a env | Shifting of 'a env * 'a env * bool | AboutToReduce of 'a env * production | HandlingError of 'a env | Accepted of 'a | Rejected (* ------------------------------------------------------------------------ *) (* In the code-based back-end, the [run] function is sometimes responsible for pushing a new cell on the stack. This is motivated by code sharing concerns. In this interpreter, there is no such concern; [run]'s caller is always responsible for updating the stack. *) (* In the code-based back-end, there is a [run] function for each state [s]. This function can behave in two slightly different ways, depending on when it is invoked, or (equivalently) depending on [s]. If [run] is invoked after shifting a terminal symbol (or, equivalently, if [s] has a terminal incoming symbol), then [run] discards a token, unless [s] has a default reduction on [#]. (Indeed, in that case, requesting the next token might drive the lexer off the end of the input stream.) If, on the other hand, [run] is invoked after performing a goto transition, or invoked directly by an entry point, then there is nothing to discard. These two cases are reflected in [CodeBackend.gettoken]. Here, the code is structured in a slightly different way. It is up to the caller of [run] to indicate whether to discard a token, via the parameter [please_discard]. This flag is set when [s] is being entered by shifting a terminal symbol and [s] does not have a default reduction on [#]. *) (* The following recursive group of functions are tail recursive, produce a checkpoint of type [semantic_value checkpoint], and cannot raise an exception. A semantic action can raise [Error], but this exception is immediately caught within [reduce]. 
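*)

(* Illustration, not part of the original file: a toy model of the protocol
   that the [checkpoint] type supports, independent of MenhirLib. This is
   NOT the real type above: the real [InputNeeded] carries an [env] and
   tokens are offered via [offer]; here a continuation stands in for that
   machinery. The client drives the parse by pattern matching: it supplies a
   token on [InputNeeded], resumes on intermediate checkpoints, and stops on
   [Accepted] or [Rejected]. *)

```ocaml
(* A toy checkpoint type: a simplified stand-in, invented for illustration. *)
type 'a toy_checkpoint =
  | InputNeeded of (int -> 'a toy_checkpoint)  (* parser wants a token *)
  | AboutToReduce of 'a toy_checkpoint         (* resume to continue *)
  | Accepted of 'a
  | Rejected

(* The generic driving loop: this mirrors the shape of the main loop that
   appears further down in this file. *)
let rec drive supply = function
  | InputNeeded k -> drive supply (k (supply ()))
  | AboutToReduce next -> drive supply next
  | Accepted v -> Some v
  | Rejected -> None

(* A toy "parser" that accepts exactly two tokens and returns their sum. *)
let toy_parser : int toy_checkpoint =
  InputNeeded (fun a ->
    AboutToReduce (InputNeeded (fun b -> Accepted (a + b))))

let () =
  let tokens = ref [3; 4] in
  let supply () =
    match !tokens with [] -> 0 | t :: r -> tokens := r; t in
  assert (drive supply toy_parser = Some 7)
```

(*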
*) let rec run env please_discard : semantic_value checkpoint = (* Log the fact that we just entered this state. *) if log then Log.state env.current; (* If [please_discard] is set, we discard the current lookahead token and fetch the next one. In order to request a token from the user, we return an [InputNeeded] continuation, which, when invoked by the user, will take us to [discard]. If [please_discard] is not set, we skip this step and jump directly to [check_for_default_reduction]. *) if please_discard then InputNeeded env else check_for_default_reduction env (* [discard env triple] stores [triple] into [env], overwriting the previous token. It is invoked by [offer], which itself is invoked by the user in response to an [InputNeeded] checkpoint. *) and discard env triple = if log then begin let (token, startp, endp) = triple in Log.lookahead_token (T.token2terminal token) startp endp end; let env = { env with error = false; triple } in check_for_default_reduction env and check_for_default_reduction env = (* Examine what situation we are in. This case analysis is analogous to that performed in [CodeBackend.gettoken], in the sub-case where we do not have a terminal incoming symbol. *) T.default_reduction env.current announce_reduce (* there is a default reduction; perform it *) check_for_error_token (* there is none; continue below *) env and check_for_error_token env = (* There is no default reduction. Consult the current lookahead token so as to determine which action should be taken. *) (* Peeking at the first input token, without taking it off the input stream, is done by reading [env.triple]. We are careful to first check [env.error]. *) (* Note that, if [please_discard] was true, then we have just called [discard], so the lookahead token cannot be [error]. *) (* Returning [HandlingError env] is equivalent to calling [error env] directly, except it allows the user to regain control. 
*) if env.error then begin if log then Log.resuming_error_handling(); HandlingError env end else let (token, _, _) = env.triple in (* We consult the two-dimensional action table, indexed by the current state and the current lookahead token, in order to determine which action should be taken. *) T.action env.current (* determines a row *) (T.token2terminal token) (* determines a column *) (T.token2value token) shift (* shift continuation *) announce_reduce (* reduce continuation *) initiate (* failure continuation *) env (* ------------------------------------------------------------------------ *) (* This function takes care of shift transitions along a terminal symbol. (Goto transitions are taken care of within [reduce] below.) The symbol can be either an actual token or the [error] pseudo-token. *) (* Here, the lookahead token CAN be [error]. *) and shift env (please_discard : bool) (terminal : terminal) (value : semantic_value) (s' : state) = (* Log the transition. *) if log then Log.shift terminal s'; (* Push a new cell onto the stack, containing the identity of the state that we are leaving. *) let (_, startp, endp) = env.triple in let stack = { state = env.current; semv = value; startp; endp; next = env.stack; } in (* Switch to state [s']. *) let new_env = { env with stack; current = s' } in (* Expose the transition to the user. (In principle, we have a choice between exposing the transition before we take it, after we take it, or at some point in between. This affects the number and type of the parameters carried by [Shifting]. Here, we choose to expose the transition after we take it; this allows [Shifting] to carry only three parameters, whose meaning is simple.) *) Shifting (env, new_env, please_discard) (* ------------------------------------------------------------------------ *) (* The function [announce_reduce] stops the parser and returns a checkpoint which allows the parser to be resumed by calling [reduce]. 
*) (* Only ordinary productions are exposed to the user. Start productions are not exposed to the user. Reducing a start production simply leads to the successful termination of the parser. *) and announce_reduce env (prod : production) = if T.is_start prod then accept env prod else AboutToReduce (env, prod) (* The function [reduce] takes care of reductions. It is invoked by [resume] after an [AboutToReduce] event has been produced. *) (* Here, the lookahead token CAN be [error]. *) (* The production [prod] CANNOT be a start production. *) and reduce env (prod : production) = (* Log a reduction event. *) if log then Log.reduce_or_accept prod; (* Invoke the semantic action. The semantic action is responsible for truncating the stack and pushing a new cell onto the stack, which contains a new semantic value. It can raise [Error]. *) (* If the semantic action terminates normally, it returns a new stack, which becomes the current stack. *) (* If the semantic action raises [Error], we catch it and initiate error handling. *) (* This [match/with/exception] construct requires OCaml 4.02. *) match T.semantic_action prod env with | stack -> (* By our convention, the semantic action has produced an updated stack. The state now found in the top stack cell is the return state. *) (* Perform a goto transition. The target state is determined by consulting the goto table at the return state and at production [prod]. *) let current = T.goto_prod stack.state prod in let env = { env with stack; current } in run env false | exception Error -> initiate env and accept env prod = (* Log an accept event. *) if log then Log.reduce_or_accept prod; (* Extract the semantic value out of the stack. *) let v = env.stack.semv in (* Finish. *) Accepted v (* ------------------------------------------------------------------------ *) (* The following functions deal with errors. *) (* [initiate] initiates or resumes error handling. *) (* Here, the lookahead token CAN be [error]. 
*) and initiate env = if log then Log.initiating_error_handling(); let env = { env with error = true } in HandlingError env (* [error] handles errors. *) and error env = assert env.error; (* Consult the column associated with the [error] pseudo-token in the action table. *) T.action env.current (* determines a row *) T.error_terminal (* determines a column *) T.error_value error_shift (* shift continuation *) error_reduce (* reduce continuation *) error_fail (* failure continuation *) env and error_shift env please_discard terminal value s' = (* Here, [terminal] is [T.error_terminal], and [value] is [T.error_value]. *) assert (terminal = T.error_terminal && value = T.error_value); (* This state is capable of shifting the [error] token. *) if log then Log.handling_error env.current; shift env please_discard terminal value s' and error_reduce env prod = (* This state is capable of performing a reduction on [error]. *) if log then Log.handling_error env.current; reduce env prod (* Intentionally calling [reduce] instead of [announce_reduce]. It does not seem very useful, and it could be confusing, to expose the reduction steps taken during error handling. *) and error_fail env = (* This state is unable to handle errors. Attempt to pop a stack cell. *) let cell = env.stack in let next = cell.next in if next == cell then (* The stack is empty. Die. *) Rejected else begin (* The stack is nonempty. Pop a cell, updating the current state with that found in the popped cell, and try again. *) let env = { env with stack = next; current = cell.state } in HandlingError env end (* End of the nest of tail recursive functions. *) (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* The incremental interface. See [EngineTypes]. *) (* [start s] begins the parsing process. *) let start (s : state) (initial : position) : semantic_value checkpoint = (* Build an empty stack. 
This is a dummy cell, which is its own successor. Its [next] field WILL be accessed by [error_fail] if an error occurs and is propagated all the way until the stack is empty. Its [endp] field WILL be accessed (by a semantic action) if an epsilon production is reduced when the stack is empty. *) let rec empty = { state = s; (* dummy *) semv = T.error_value; (* dummy *) startp = initial; (* dummy *) endp = initial; next = empty; } in (* Build an initial environment. *) (* Unfortunately, there is no type-safe way of constructing a dummy token. Tokens carry semantic values, which in general we cannot manufacture. This instance of [Obj.magic] could be avoided by adopting a different representation (e.g., no [env.error] field, and an option in the first component of [env.triple]), but I like this representation better. *) let dummy_token = Obj.magic () in let env = { error = false; triple = (dummy_token, initial, initial); (* dummy *) stack = empty; current = s; } in (* Begin parsing. *) (* The parameter [please_discard] here is [true], which means we know that we must read at least one token. This claim relies on the fact that we have ruled out the two special cases where a start symbol recognizes the empty language or the singleton language {epsilon}. *) run env true (* [offer checkpoint triple] is invoked by the user in response to a checkpoint of the form [InputNeeded env]. It checks that [checkpoint] is indeed of this form, and invokes [discard]. *) (* [resume checkpoint] is invoked by the user in response to a checkpoint of the form [AboutToReduce (env, prod)] or [HandlingError env]. It checks that [checkpoint] is indeed of this form, and invokes [reduce] or [error], as appropriate. *) (* In reality, [offer] and [resume] accept an argument of type [semantic_value checkpoint] and produce a checkpoint of the same type. The choice of [semantic_value] is forced by the fact that this is the parameter of the checkpoint [Accepted]. *) (* We change this as follows. 
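*)

(* Illustration, not part of the original file: the "dummy cell which is its
   own successor" trick used by [start] above, shown in isolation. The
   [cell] type and its fields are simplified stand-ins for the real stack
   cells. Emptiness is detected with physical equality on the [next] field,
   so no option type (and no extra allocation) is needed. *)

```ocaml
(* A simplified stack cell: the empty stack is a self-referential dummy
   cell; pushing allocates a new cell in front of it. *)
type cell = { state : int; next : cell }

(* The empty stack: a recursive value whose [next] field points to itself.
   The [state] field is a dummy. *)
let rec empty = { state = min_int (* dummy *); next = empty }

let push s stack = { state = s; next = stack }

(* A stack is empty iff the cell is its own successor. *)
let is_empty c = c.next == c

let () =
  assert (is_empty empty);
  let st = push 42 (push 7 empty) in
  assert (not (is_empty st));
  assert (st.state = 42 && st.next.state = 7 && is_empty st.next.next)
```

(*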
*) (* We change the argument and result type of [offer] and [resume] from [semantic_value checkpoint] to ['a checkpoint]. This is safe, in this case, because we give the user access to values of type [t checkpoint] only if [t] is indeed the type of the eventual semantic value for this run. (More precisely, by examining the signatures [INCREMENTAL_ENGINE] and [INCREMENTAL_ENGINE_START], one finds that the user can build a value of type ['a checkpoint] only if ['a] is [semantic_value]. The table back-end goes further than this and produces versions of [start] composed with a suitable cast, which give the user access to a value of type [t checkpoint] where [t] is the type of the start symbol.) *) let offer : 'a . 'a checkpoint -> token * position * position -> 'a checkpoint = function | InputNeeded env -> Obj.magic discard env | _ -> invalid_arg "offer expects InputNeeded" let resume : 'a . 'a checkpoint -> 'a checkpoint = function | HandlingError env -> Obj.magic error env | Shifting (_, env, please_discard) -> Obj.magic run env please_discard | AboutToReduce (env, prod) -> Obj.magic reduce env prod | _ -> invalid_arg "resume expects HandlingError | Shifting | AboutToReduce" (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* The traditional interface. See [EngineTypes]. *) (* ------------------------------------------------------------------------ *) (* Wrapping a lexer and lexbuf as a token supplier. 
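*)

(* Illustration, not part of the original file: a supplier need not come
   from an ocamllex lexing buffer; any function of the supplier type works.
   A hedged sketch (the [token] type here is an invented stand-in, not the
   generated one): a supplier that replays a fixed list of tokens and then
   returns EOF forever. *)

```ocaml
(* A stand-in token type, for illustration only. *)
type token = INT of int | EOF

(* Build a supplier from a list instead of a lexing buffer. Positions are
   dummies, since this input has no source locations. *)
let supplier_of_list (toks : token list)
  : unit -> token * Lexing.position * Lexing.position =
  let rest = ref toks in
  fun () ->
    let tok =
      match !rest with
      | [] -> EOF
      | t :: r -> rest := r; t
    in
    (tok, Lexing.dummy_pos, Lexing.dummy_pos)

let () =
  let supply = supplier_of_list [INT 1; EOF] in
  let (t1, _, _) = supply () in
  let (t2, _, _) = supply () in
  let (t3, _, _) = supply () in
  assert (t1 = INT 1 && t2 = EOF && t3 = EOF)
```

(*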
*) type supplier = unit -> token * position * position let lexer_lexbuf_to_supplier (lexer : Lexing.lexbuf -> token) (lexbuf : Lexing.lexbuf) : supplier = fun () -> let token = lexer lexbuf in let startp = lexbuf.Lexing.lex_start_p and endp = lexbuf.Lexing.lex_curr_p in token, startp, endp (* ------------------------------------------------------------------------ *) (* The main loop repeatedly handles intermediate checkpoints, until a final checkpoint is obtained. This allows implementing the monolithic interface ([entry]) in terms of the incremental interface ([start], [offer], [handle], [reduce]). *) (* By convention, acceptance is reported by returning a semantic value, whereas rejection is reported by raising [Error]. *) (* [loop] is polymorphic in ['a]. No cheating is involved in achieving this. All of the cheating resides in the types assigned to [offer] and [handle] above. *) let rec loop : 'a . supplier -> 'a checkpoint -> 'a = fun read checkpoint -> match checkpoint with | InputNeeded _ -> (* The parser needs a token. Request one from the lexer, and offer it to the parser, which will produce a new checkpoint. Then, repeat. *) let triple = read() in let checkpoint = offer checkpoint triple in loop read checkpoint | Shifting _ | AboutToReduce _ | HandlingError _ -> (* The parser has suspended itself, but does not need new input. Just resume the parser. Then, repeat. *) let checkpoint = resume checkpoint in loop read checkpoint | Accepted v -> (* The parser has succeeded and produced a semantic value. Return this semantic value to the user. *) v | Rejected -> (* The parser rejects this input. Raise an exception. 
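The structure of [loop] -- repeatedly resume intermediate checkpoints, feed tokens on [InputNeeded], stop on acceptance or rejection -- can be modelled on a toy checkpoint type. Everything below is invented for illustration; the real checkpoint type has more cases:

```ocaml
(* A "parser" that sums integer tokens and accepts on token 0. *)
type toy_checkpoint =
  | NeedsInput of int   (* accumulator; waiting for the next token *)
  | Done of int         (* final semantic value *)

let toy_offer checkpoint token =
  match checkpoint with
  | NeedsInput acc ->
      if token = 0 then Done acc else NeedsInput (acc + token)
  | Done _ ->
      invalid_arg "toy_offer expects NeedsInput"

(* The monolithic loop: pull tokens from a supplier until acceptance. *)
let rec toy_loop read checkpoint =
  match checkpoint with
  | NeedsInput _ -> toy_loop read (toy_offer checkpoint (read ()))
  | Done v -> v

let () =
  let input = ref [1; 2; 3; 0] in
  let read () = match !input with [] -> 0 | t :: ts -> input := ts; t in
  assert (toy_loop read (NeedsInput 0) = 6)
```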
*) raise Error let entry (s : state) lexer lexbuf : semantic_value = let initial = lexbuf.Lexing.lex_curr_p in loop (lexer_lexbuf_to_supplier lexer lexbuf) (start s initial) (* ------------------------------------------------------------------------ *) (* [loop_handle] stops if it encounters an error, and at this point, invokes its failure continuation, without letting Menhir do its own traditional error-handling (which involves popping the stack, etc.). *) let rec loop_handle succeed fail read checkpoint = match checkpoint with | InputNeeded _ -> let triple = read() in let checkpoint = offer checkpoint triple in loop_handle succeed fail read checkpoint | Shifting _ | AboutToReduce _ -> let checkpoint = resume checkpoint in loop_handle succeed fail read checkpoint | HandlingError _ | Rejected -> (* The parser has detected an error. Invoke the failure continuation. *) fail checkpoint | Accepted v -> (* The parser has succeeded and produced a semantic value. Invoke the success continuation. *) succeed v (* ------------------------------------------------------------------------ *) (* [loop_handle_undo] is analogous to [loop_handle], except it passes a pair of checkpoints to the failure continuation. The first (and oldest) checkpoint is the last [InputNeeded] checkpoint that was encountered before the error was detected. The second (and newest) checkpoint is where the error was detected, as in [loop_handle]. Going back to the first checkpoint can be thought of as undoing any reductions that were performed after seeing the problematic token. (These reductions must be default reductions or spurious reductions.) *) let rec loop_handle_undo succeed fail read (inputneeded, checkpoint) = match checkpoint with | InputNeeded _ -> (* Update the last recorded [InputNeeded] checkpoint. 
*) let inputneeded = checkpoint in let triple = read() in let checkpoint = offer checkpoint triple in loop_handle_undo succeed fail read (inputneeded, checkpoint) | Shifting _ | AboutToReduce _ -> let checkpoint = resume checkpoint in loop_handle_undo succeed fail read (inputneeded, checkpoint) | HandlingError _ | Rejected -> fail inputneeded checkpoint | Accepted v -> succeed v (* For simplicity, we publish a version of [loop_handle_undo] that takes a single checkpoint as an argument, instead of a pair of checkpoints. We check that the argument is [InputNeeded _], and duplicate it. *) (* The parser cannot accept or reject before it asks for the very first character of input. (Indeed, we statically reject a symbol that generates the empty language or the singleton language {epsilon}.) So, the [start] checkpoint must match [InputNeeded _]. Hence, it is permitted to call [loop_handle_undo] with a [start] checkpoint. *) let loop_handle_undo succeed fail read checkpoint = assert (match checkpoint with InputNeeded _ -> true | _ -> false); loop_handle_undo succeed fail read (checkpoint, checkpoint) (* ------------------------------------------------------------------------ *) let rec shifts checkpoint = match checkpoint with | Shifting (env, _, _) -> (* The parser is about to shift, which means it is willing to consume the terminal symbol that we have fed it. Return the state just before this transition. *) Some env | AboutToReduce _ -> (* The parser wishes to reduce. Just follow. *) shifts (resume checkpoint) | HandlingError _ -> (* The parser fails, which means it rejects the terminal symbol that we have fed it. *) None | InputNeeded _ | Accepted _ | Rejected -> (* None of these cases can arise. Indeed, after a token is submitted to it, the parser must shift, reduce, or signal an error, before it can request another token or terminate. 
*) assert false let acceptable checkpoint token pos = let triple = (token, pos, pos) in let checkpoint = offer checkpoint triple in match shifts checkpoint with | None -> false | Some _env -> true (* ------------------------------------------------------------------------ *) (* The type ['a lr1state] describes the (non-initial) states of the LR(1) automaton. The index ['a] represents the type of the semantic value associated with the state's incoming symbol. *) (* The type ['a lr1state] is defined as an alias for [state], which itself is usually defined as [int] (see [TableInterpreter]). So, ['a lr1state] is technically a phantom type, but should really be thought of as a GADT whose data constructors happen to be represented as integers. It is presented to the user as an abstract type (see [IncrementalEngine]). *) type 'a lr1state = state (* ------------------------------------------------------------------------ *) (* Stack inspection. *) (* We offer a read-only view of the parser's state as a stream of elements. Each element contains a pair of a (non-initial) state and a semantic value associated with (the incoming symbol of) this state. Note that the type [element] is an existential type. *) (* As of 2017/03/31, the type [stack] and the function [stack] are DEPRECATED. If desired, they could now be implemented outside Menhir, by relying on the functions [top] and [pop]. *) type element = | Element: 'a lr1state * 'a * position * position -> element open General type stack = element stream (* If [current] is the current state and [cell] is the top stack cell, then [stack cell current] is a view of the parser's state as a stream of elements. *) let rec stack cell current : element stream = lazy ( (* The stack is empty iff the top stack cell is its own successor. In that case, the current state [current] should be an initial state (which has no incoming symbol). We do not allow the user to inspect this state. 
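The read-only view built by [stack] relies on the lazy [stream] type from [General]: each [Cons] cell is produced only when forced. In isolation, streaming the elements of a list rather than stack cells:

```ocaml
(* Same shape as [General.stream]. *)
type 'a stream = 'a head Lazy.t
and 'a head = Nil | Cons of 'a * 'a stream

let rec of_list (xs : 'a list) : 'a stream =
  lazy (match xs with
        | [] -> Nil
        | x :: xs -> Cons (x, of_list xs))

let rec to_list (xs : 'a stream) : 'a list =
  match Lazy.force xs with
  | Nil -> []
  | Cons (x, xs) -> x :: to_list xs

let () =
  assert (to_list (of_list [1; 2; 3]) = [1; 2; 3])
```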
*) let next = cell.next in if next == cell then Nil else (* Construct an element containing the current state [current] as well as the semantic value contained in the top stack cell. This semantic value is associated with the incoming symbol of this state, so it makes sense to pair them together. The state has type ['a state] and the semantic value has type ['a], for some type ['a]. Here, the OCaml type-checker thinks ['a] is [semantic_value] and considers this code well-typed. Outside, we will use magic to provide the user with a way of inspecting states and recovering the value of ['a]. *) let element = Element ( current, cell.semv, cell.startp, cell.endp ) in Cons (element, stack next cell.state) ) let stack env : element stream = stack env.stack env.current (* As explained above, the function [top] allows access to the top stack element only if the stack is nonempty, i.e., only if the current state is not an initial state. *) let top env : element option = let cell = env.stack in let next = cell.next in if next == cell then None else Some (Element (env.current, cell.semv, cell.startp, cell.endp)) (* [equal] compares the stacks for physical equality, and compares the current states via their numbers (this seems cleaner than using OCaml's polymorphic equality). *) (* The two fields that are not compared by [equal], namely [error] and [triple], are overwritten by the function [discard], which handles [InputNeeded] checkpoints. Thus, if [equal env1 env2] holds, then the checkpoints [input_needed env1] and [input_needed env2] are equivalent: they lead the parser to behave in the same way. *) let equal env1 env2 = env1.stack == env2.stack && number env1.current = number env2.current let current_state_number env = number env.current (* ------------------------------------------------------------------------ *) (* Access to the position of the lookahead token. 
*) let positions { triple = (_, startp, endp); _ } = startp, endp (* ------------------------------------------------------------------------ *) (* Access to information about default reductions. *) (* This can be a function of states, or a function of environments. We offer both. *) (* Instead of a Boolean result, we could return a [production option]. However, we would have to explicitly test whether [prod] is a start production, and in that case, return [None], I suppose. Indeed, we have decided not to expose the start productions. *) let state_has_default_reduction (state : _ lr1state) : bool = T.default_reduction state (fun _env _prod -> true) (fun _env -> false) () let env_has_default_reduction env = state_has_default_reduction env.current (* ------------------------------------------------------------------------ *) (* The following functions work at the level of environments (as opposed to checkpoints). The function [pop] causes the automaton to go back into the past, pretending that the last input symbol has never been read. The function [force_reduction] causes the automaton to re-interpret the past, by recognizing the right-hand side of a production and reducing this production. The function [feed] causes the automaton to progress into the future by pretending that a (terminal or nonterminal) symbol has been read. *) (* The function [feed] would ideally be defined here. However, for this function to be type-safe, the GADT ['a symbol] is needed. For this reason, we move its definition to [InspectionTableInterpreter], where the inspection API is available. *) (* [pop] pops one stack cell. It cannot go wrong. *) let pop (env : 'a env) : 'a env option = let cell = env.stack in let next = cell.next in if next == cell then (* The stack is empty. *) None else (* The stack is nonempty. Pop off one cell. 
*) Some { env with stack = next; current = cell.state } (* [force_reduction] is analogous to [reduce], except that it does not continue by calling [run env] or [initiate env]. Instead, it returns [env] to the user. *) (* [force_reduction] is dangerous insofar as it executes a semantic action. This semantic action could have side effects: nontermination, state, exceptions, input/output, etc. *) let force_reduction prod (env : 'a env) : 'a env = (* Check if this reduction is permitted. This check is REALLY important. The stack must have the correct shape: that is, it must be sufficiently high, and must contain semantic values of appropriate types, otherwise the semantic action will crash and burn. *) (* We currently check whether the current state is WILLING to reduce this production (i.e., there is a reduction action in the action table row associated with this state), whereas it would be more liberal to check whether this state is CAPABLE of reducing this production (i.e., the stack has an appropriate shape). We currently have no means of performing such a check. *) if not (T.may_reduce env.current prod) then invalid_arg "force_reduction: this reduction is not permitted in this state" else begin (* We do not expose the start productions to the user, so this cannot be a start production. Hence, it has a semantic action. *) assert (not (T.is_start prod)); (* Invoke the semantic action. *) let stack = T.semantic_action prod env in (* Perform a goto transition. *) let current = T.goto_prod stack.state prod in { env with stack; current } end (* The environment manipulation functions -- [pop] and [force_reduction] above, plus [feed] -- manipulate the automaton's stack and current state, but do not affect the automaton's lookahead symbol. When the function [input_needed] is used to go back from an environment to a checkpoint (and therefore, resume normal parsing), the lookahead symbol is clobbered anyway, since the only action that the user can take is to call [offer]. 
So far, so good. One problem, though, is that this call to [offer] may well
place the automaton in a configuration of a state [s] and a lookahead symbol
[t] that is normally unreachable. Also, perhaps the state [s] is a state
where an input symbol normally is never demanded, so this [InputNeeded]
checkpoint is fishy. There does not seem to be a deep problem here, but,
when programming an error recovery strategy, one should pay some attention
to this issue. Ideally, perhaps, one should use [input_needed] only in a
state [s] where an input symbol is normally demanded, that is, a state [s]
whose incoming symbol is a terminal symbol and which does not have a
default reduction on [#]. *)

  let input_needed (env : 'a env) : 'a checkpoint =
    InputNeeded env

  (* The following functions are compositions of [top] and [pop]. *)

  let rec pop_many i env =
    if i = 0 then
      Some env
    else
      match pop env with
      | None ->
          None
      | Some env ->
          pop_many (i - 1) env

  let get i env =
    match pop_many i env with
    | None ->
        None
    | Some env ->
        top env

end
menhir-20200123/lib/Engine.mli
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU Library General Public License version 2, with a         *)
(*  special exception on linking, as described in the file LICENSE.           *)
(*                                                                            *)
(******************************************************************************)

open EngineTypes

(* The LR parsing engine. *)

module Make (T : TABLE)
: ENGINE
  with type state = T.state
   and type token = T.token
   and type semantic_value = T.semantic_value
   and type production = T.production
   and type 'a env = (T.state, T.semantic_value, T.token) EngineTypes.env

(* We would prefer not to expose the definition of the type [env].
However, it must be exposed because some of the code in the inspection API
needs access to the engine's internals; see [InspectionTableInterpreter].
Everything would be simpler if --inspection was always ON, but that would
lead to bigger parse tables for everybody. *)
menhir-20200123/lib/EngineTypes.ml

(* This file defines several types and module types that are used in the
   specification of module [Engine]. *)

(* --------------------------------------------------------------------------- *)

(* It would be nice if we could keep the structure of stacks and environments
   hidden. However, stacks and environments must be accessible to semantic
   actions, so the following data structure definitions must be public. *)

(* --------------------------------------------------------------------------- *)

(* A stack is a linked list of cells. A sentinel cell -- which is its own
   successor -- is used to mark the bottom of the stack. The sentinel cell
   itself is not significant -- it contains dummy values. *)

type ('state, 'semantic_value) stack = {

  (* The state that we should go back to if we pop this stack cell. *)

  (* This convention means that the state contained in the top stack cell is
     not the current state [env.current]. It also means that the state found
     within the sentinel is a dummy -- it is never consulted. This convention
     is the same as that adopted by the code-based back-end.
*) state: 'state; (* The semantic value associated with the chunk of input that this cell represents. *) semv: 'semantic_value; (* The start and end positions of the chunk of input that this cell represents. *) startp: Lexing.position; endp: Lexing.position; (* The next cell down in the stack. If this is a self-pointer, then this cell is the sentinel, and the stack is conceptually empty. *) next: ('state, 'semantic_value) stack; } (* --------------------------------------------------------------------------- *) (* A parsing environment contains all of the parser's state (except for the current program point). *) type ('state, 'semantic_value, 'token) env = { (* If this flag is true, then the first component of [env.triple] should be ignored, as it has been logically overwritten with the [error] pseudo-token. *) error: bool; (* The last token that was obtained from the lexer, together with its start and end positions. Warning: before the first call to the lexer has taken place, a dummy (and possibly invalid) token is stored here. *) triple: 'token * Lexing.position * Lexing.position; (* The stack. In [CodeBackend], it is passed around on its own, whereas, here, it is accessed via the environment. *) stack: ('state, 'semantic_value) stack; (* The current state. In [CodeBackend], it is passed around on its own, whereas, here, it is accessed via the environment. *) current: 'state; } (* --------------------------------------------------------------------------- *) (* This signature describes the parameters that must be supplied to the LR engine. *) module type TABLE = sig (* The type of automaton states. *) type state (* States are numbered. *) val number: state -> int (* The type of tokens. These can be thought of as real tokens, that is, tokens returned by the lexer. They carry a semantic value. This type does not include the [error] pseudo-token. *) type token (* The type of terminal symbols. These can be thought of as integer codes. 
They do not carry a semantic value. This type does include the [error] pseudo-token. *) type terminal (* The type of nonterminal symbols. *) type nonterminal (* The type of semantic values. *) type semantic_value (* A token is conceptually a pair of a (non-[error]) terminal symbol and a semantic value. The following two functions are the pair projections. *) val token2terminal: token -> terminal val token2value: token -> semantic_value (* Even though the [error] pseudo-token is not a real token, it is a terminal symbol. Furthermore, for regularity, it must have a semantic value. *) val error_terminal: terminal val error_value: semantic_value (* [foreach_terminal] allows iterating over all terminal symbols. *) val foreach_terminal: (terminal -> 'a -> 'a) -> 'a -> 'a (* The type of productions. *) type production val production_index: production -> int val find_production: int -> production (* If a state [s] has a default reduction on production [prod], then, upon entering [s], the automaton should reduce [prod] without consulting the lookahead token. The following function allows determining which states have default reductions. *) (* Instead of returning a value of a sum type -- either [DefRed prod], or [NoDefRed] -- it accepts two continuations, and invokes just one of them. This mechanism allows avoiding a memory allocation. *) val default_reduction: state -> ('env -> production -> 'answer) -> ('env -> 'answer) -> 'env -> 'answer (* An LR automaton can normally take three kinds of actions: shift, reduce, or fail. (Acceptance is a particular case of reduction: it consists in reducing a start production.) *) (* There are two variants of the shift action. [shift/discard s] instructs the automaton to discard the current token, request a new one from the lexer, and move to state [s]. [shift/nodiscard s] instructs it to move to state [s] without requesting a new token. This instruction should be used when [s] has a default reduction on [#]. 
See [CodeBackend.gettoken] for details. *) (* This is the automaton's action table. It maps a pair of a state and a terminal symbol to an action. *) (* Instead of returning a value of a sum type -- one of shift/discard, shift/nodiscard, reduce, or fail -- this function accepts three continuations, and invokes just one them. This mechanism allows avoiding a memory allocation. *) (* In summary, the parameters to [action] are as follows: - the first two parameters, a state and a terminal symbol, are used to look up the action table; - the next parameter is the semantic value associated with the above terminal symbol; it is not used, only passed along to the shift continuation, as explained below; - the shift continuation expects an environment; a flag that tells whether to discard the current token; the terminal symbol that is being shifted; its semantic value; and the target state of the transition; - the reduce continuation expects an environment and a production; - the fail continuation expects an environment; - the last parameter is the environment; it is not used, only passed along to the selected continuation. *) val action: state -> terminal -> semantic_value -> ('env -> bool -> terminal -> semantic_value -> state -> 'answer) -> ('env -> production -> 'answer) -> ('env -> 'answer) -> 'env -> 'answer (* This is the automaton's goto table. This table maps a pair of a state and a nonterminal symbol to a new state. By extension, it also maps a pair of a state and a production to a new state. *) (* The function [goto_nt] can be applied to [s] and [nt] ONLY if the state [s] has an outgoing transition labeled [nt]. Otherwise, its result is undefined. Similarly, the call [goto_prod prod s] is permitted ONLY if the state [s] has an outgoing transition labeled with the nonterminal symbol [lhs prod]. The function [maybe_goto_nt] involves an additional dynamic check and CAN be called even if there is no outgoing transition. 
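The continuation-passing convention used by [default_reduction] and [action] can be illustrated on a toy lookup. The sketch below only mimics the interface (names are invented); the real generated tables branch directly rather than materializing an intermediate option:

```ocaml
(* Instead of returning a ['a option], the lookup accepts two continuations
   and invokes exactly one of them, passing [env] along unexamined. *)
let lookup (table : (int * string) list) (key : int)
    (found : 'env -> string -> 'answer)
    (absent : 'env -> 'answer)
    (env : 'env) : 'answer =
  match List.assoc_opt key table with
  | Some v -> found env v
  | None -> absent env

let () =
  let table = [ (1, "shift"); (2, "reduce") ] in
  assert (lookup table 1 (fun () v -> v) (fun () -> "fail") () = "shift");
  assert (lookup table 9 (fun () v -> v) (fun () -> "fail") () = "fail")
```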
*) val goto_nt : state -> nonterminal -> state val goto_prod: state -> production -> state val maybe_goto_nt: state -> nonterminal -> state option (* [is_start prod] tells whether the production [prod] is a start production. *) val is_start: production -> bool (* By convention, a semantic action is responsible for: 1. fetching whatever semantic values and positions it needs off the stack; 2. popping an appropriate number of cells off the stack, as dictated by the length of the right-hand side of the production; 3. computing a new semantic value, as well as new start and end positions; 4. pushing a new stack cell, which contains the three values computed in step 3; 5. returning the new stack computed in steps 2 and 4. Point 1 is essentially forced upon us: if semantic values were fetched off the stack by this interpreter, then the calling convention for semantic actions would be variadic: not all semantic actions would have the same number of arguments. The rest follows rather naturally. *) (* Semantic actions are allowed to raise [Error]. *) exception Error type semantic_action = (state, semantic_value, token) env -> (state, semantic_value) stack val semantic_action: production -> semantic_action (* [may_reduce state prod] tests whether the state [state] is capable of reducing the production [prod]. This function is currently costly and is not used by the core LR engine. It is used in the implementation of certain functions, such as [force_reduction], which allow the engine to be driven programmatically. *) val may_reduce: state -> production -> bool (* The LR engine requires a number of hooks, which are used for logging. *) (* The comments below indicate the conventional messages that correspond to these hooks in the code-based back-end; see [CodeBackend]. *) (* If the flag [log] is false, then the logging functions are not called. If it is [true], then they are called. 
*) val log : bool module Log : sig (* State %d: *) val state: state -> unit (* Shifting () to state *) val shift: terminal -> state -> unit (* Reducing a production should be logged either as a reduction event (for regular productions) or as an acceptance event (for start productions). *) (* Reducing production / Accepting *) val reduce_or_accept: production -> unit (* Lookahead token is now (-) *) val lookahead_token: terminal -> Lexing.position -> Lexing.position -> unit (* Initiating error handling *) val initiating_error_handling: unit -> unit (* Resuming error handling *) val resuming_error_handling: unit -> unit (* Handling error in state *) val handling_error: state -> unit end end (* --------------------------------------------------------------------------- *) (* This signature describes the monolithic (traditional) LR engine. *) (* In this interface, the parser controls the lexer. *) module type MONOLITHIC_ENGINE = sig type state type token type semantic_value (* An entry point to the engine requires a start state, a lexer, and a lexing buffer. It either succeeds and produces a semantic value, or fails and raises [Error]. *) exception Error val entry: state -> (Lexing.lexbuf -> token) -> Lexing.lexbuf -> semantic_value end (* --------------------------------------------------------------------------- *) (* The following signatures describe the incremental LR engine. *) (* First, see [INCREMENTAL_ENGINE] in the file [IncrementalEngine.ml]. *) (* The [start] function is set apart because we do not wish to publish it as part of the generated [parser.mli] file. Instead, the table back-end will publish specialized versions of it, with a suitable type cast. *) module type INCREMENTAL_ENGINE_START = sig (* [start] is an entry point. It requires a start state and a start position and begins the parsing process. If the lexer is based on an OCaml lexing buffer, the start position should be [lexbuf.lex_curr_p]. 
   [start] produces a checkpoint, which usually will be an [InputNeeded]
   checkpoint. (It could be [Accepted] if this starting state accepts only
   the empty word. It could be [Rejected] if this starting state accepts
   no word at all.) It does not raise any exception. *)

(* [start s pos] should really produce a checkpoint of type ['a checkpoint],
   for a fixed ['a] that depends on the state [s]. We cannot express this,
   so we use [semantic_value checkpoint], which is safe. The table back-end
   uses [Obj.magic] to produce safe specialized versions of [start]. *)

  type state
  type semantic_value
  type 'a checkpoint

  val start:
    state ->
    Lexing.position ->
    semantic_value checkpoint

end

(* --------------------------------------------------------------------------- *)

(* This signature describes the LR engine, which combines the monolithic and
   incremental interfaces. *)

module type ENGINE = sig

  include MONOLITHIC_ENGINE

  include IncrementalEngine.INCREMENTAL_ENGINE
    with type token := token
     and type 'a lr1state = state (* useful for us; hidden from the end user *)

  include INCREMENTAL_ENGINE_START
    with type state := state
     and type semantic_value := semantic_value
     and type 'a checkpoint := 'a checkpoint

end
menhir-20200123/lib/ErrorReports.ml

(* -------------------------------------------------------------------------- *)

(* A two-place buffer stores zero, one, or two elements.
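The sliding behaviour can be exercised on its own. The following re-states the [content], [buffer] and [update] definitions that appear just below:

```ocaml
type 'a content = Zero | One of 'a | Two of 'a * 'a  (* second component: most recent *)
type 'a buffer = 'a content ref

(* Pushing into a full buffer discards the older element. *)
let update buffer x =
  buffer := (match !buffer, x with
             | Zero, _ -> One x
             | One x1, x2 | Two (_, x1), x2 -> Two (x1, x2))

let () =
  let b : int buffer = ref Zero in
  update b 1; assert (!b = One 1);
  update b 2; assert (!b = Two (1, 2));
  update b 3; assert (!b = Two (2, 3))
```

After any two [update]s, the buffer holds exactly the last two elements pushed, which is what [show] relies on to report "after ... and before ...".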
*)

type 'a content =
  | Zero
  | One of 'a
  | Two of 'a * (* most recent: *) 'a

type 'a buffer =
  'a content ref

(* [update buffer x] pushes [x] into [buffer], causing the buffer to slide. *)

let update buffer x =
  buffer :=
    match !buffer, x with
    | Zero, _ ->
        One x
    | One x1, x2
    | Two (_, x1), x2 ->
        Two (x1, x2)

(* [show f buffer] prints the contents of the buffer. The function [f] is
   used to print an element. *)

let show f buffer : string =
  match !buffer with
  | Zero ->
      (* The buffer cannot be empty. If we have read no tokens,
         we cannot have detected a syntax error. *)
      assert false
  | One invalid ->
      (* It is unlikely, but possible, that we have read just one token. *)
      Printf.sprintf "before '%s'" (f invalid)
  | Two (valid, invalid) ->
      (* In the most likely case, we have read two tokens. *)
      Printf.sprintf "after '%s' and before '%s'" (f valid) (f invalid)

(* [last buffer] returns the last element of the buffer (that is, the
   invalid token). *)

let last buffer =
  match !buffer with
  | Zero ->
      (* The buffer cannot be empty. If we have read no tokens,
         we cannot have detected a syntax error. *)
      assert false
  | One invalid
  | Two (_, invalid) ->
      invalid

(* [wrap buffer lexer] *)

open Lexing

let wrap lexer =
  let buffer = ref Zero in
  buffer,
  fun lexbuf ->
    let token = lexer lexbuf in
    update buffer (lexbuf.lex_start_p, lexbuf.lex_curr_p);
    token

(* -------------------------------------------------------------------------- *)
menhir-20200123/lib/ErrorReports.mli
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU Library General Public License version 2, with a         *)
(*  special exception on linking, as described in the file LICENSE.
*)
(*                                                                            *)
(******************************************************************************)

(* -------------------------------------------------------------------------- *)

(* The following functions help keep track of the start and end positions of
   the last two tokens in a two-place buffer. This is used to nicely display
   where a syntax error took place. *)

type 'a buffer

(* [wrap lexer] returns a pair of a new (initially empty) buffer and a lexer
   which internally relies on [lexer] and updates [buffer] on the fly
   whenever a token is demanded. *)

open Lexing

val wrap:
  (lexbuf -> 'token) ->
  (position * position) buffer * (lexbuf -> 'token)

(* [show f buffer] prints the contents of the buffer, producing a string that
   is typically of the form "after '%s' and before '%s'". The function [f] is
   used to print an element. The buffer MUST be nonempty. *)

val show: ('a -> string) -> 'a buffer -> string

(* [last buffer] returns the last element of the buffer. The buffer MUST be
   nonempty. *)

val last: 'a buffer -> 'a

(* -------------------------------------------------------------------------- *)
menhir-20200123/lib/General.ml

(* --------------------------------------------------------------------------- *)

(* Lists.
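The tolerant behaviour of [take] and [drop] on short lists can be checked directly. Simplified restatements (without the physical-sharing optimization that the real [take] performs):

```ocaml
(* [take n xs]: the [n] first elements, or all of [xs] if it is shorter. *)
let rec take n xs =
  match n, xs with
  | 0, _ | _, [] -> []
  | _, x :: xs -> x :: take (n - 1) xs

(* [drop n xs]: [xs] without its [n] first elements; [] if [xs] is shorter. *)
let rec drop n xs =
  match n, xs with
  | 0, _ -> xs
  | _, [] -> []
  | _, _ :: xs -> drop (n - 1) xs

let () =
  assert (take 2 [1; 2; 3] = [1; 2]);
  assert (take 5 [1; 2] = [1; 2]);   (* [xs] shorter than [n]: [xs] itself *)
  assert (drop 2 [1; 2; 3] = [3]);
  assert (drop 5 [1; 2] = [])
```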
*) let rec take n xs = match n, xs with | 0, _ | _, [] -> [] | _, (x :: xs as input) -> let xs' = take (n - 1) xs in if xs == xs' then input else x :: xs' let rec drop n xs = match n, xs with | 0, _ -> xs | _, [] -> [] | _, _ :: xs -> drop (n - 1) xs let rec uniq1 cmp x ys = match ys with | [] -> [] | y :: ys -> if cmp x y = 0 then uniq1 compare x ys else y :: uniq1 cmp y ys let uniq cmp xs = match xs with | [] -> [] | x :: xs -> x :: uniq1 cmp x xs let weed cmp xs = uniq cmp (List.sort cmp xs) (* --------------------------------------------------------------------------- *) (* Streams. *) type 'a stream = 'a head Lazy.t and 'a head = | Nil | Cons of 'a * 'a stream (* The length of a stream. *) let rec length xs = match Lazy.force xs with | Nil -> 0 | Cons (_, xs) -> 1 + length xs (* Folding over a stream. *) let rec foldr f xs accu = match Lazy.force xs with | Nil -> accu | Cons (x, xs) -> f x (foldr f xs accu) menhir-20200123/lib/General.mli000066400000000000000000000045361361226111300160610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This module offers general-purpose functions on lists and streams. *) (* As of 2017/03/31, this module is DEPRECATED. It might be removed in the future. *) (* --------------------------------------------------------------------------- *) (* Lists. *) (* [take n xs] returns the [n] first elements of the list [xs]. It is acceptable for the list [xs] to have length less than [n], in which case [xs] itself is returned. 
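The list helpers above can be tested in isolation. The definitions below mirror those in General.ml (with one deliberate difference: the recursive call in [uniq1] passes [cmp] rather than the polymorphic [compare], which appears to be the intended behavior); the assertions illustrate the documented corner cases:

```ocaml
let rec take n xs =
  match n, xs with
  | 0, _ | _, [] -> []
  | _, (x :: xs as input) ->
      let xs' = take (n - 1) xs in
      if xs == xs' then input else x :: xs'

let rec drop n xs =
  match n, xs with
  | 0, _ -> xs
  | _, [] -> []
  | _, _ :: xs -> drop (n - 1) xs

let rec uniq1 cmp x ys =
  match ys with
  | [] -> []
  | y :: ys -> if cmp x y = 0 then uniq1 cmp x ys else y :: uniq1 cmp y ys

let uniq cmp xs =
  match xs with [] -> [] | x :: xs -> x :: uniq1 cmp x xs

let weed cmp xs = uniq cmp (List.sort cmp xs)

let () =
  assert (take 2 [1; 2; 3] = [1; 2]);
  assert (take 5 [1; 2] = [1; 2]);          (* a short list is returned as is *)
  assert (drop 2 [1; 2; 3] = [3]);
  assert (drop 5 [1; 2] = []);
  assert (weed compare [3; 1; 3; 2; 1] = [1; 2; 3])
```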
*) val take: int -> 'a list -> 'a list (* [drop n xs] returns the list [xs], deprived of its [n] first elements. It is acceptable for the list [xs] to have length less than [n], in which case an empty list is returned. *) val drop: int -> 'a list -> 'a list (* [uniq cmp xs] assumes that the list [xs] is sorted according to the ordering [cmp] and returns the list [xs] deprived of any duplicate elements. *) val uniq: ('a -> 'a -> int) -> 'a list -> 'a list (* [weed cmp xs] returns the list [xs] deprived of any duplicate elements. *) val weed: ('a -> 'a -> int) -> 'a list -> 'a list (* --------------------------------------------------------------------------- *) (* A stream is a list whose elements are produced on demand. *) type 'a stream = 'a head Lazy.t and 'a head = | Nil | Cons of 'a * 'a stream (* The length of a stream. *) val length: 'a stream -> int (* Folding over a stream. *) val foldr: ('a -> 'b -> 'b) -> 'a stream -> 'b -> 'b menhir-20200123/lib/IncrementalEngine.ml000066400000000000000000000464441361226111300177260ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) type position = Lexing.position open General (* This signature describes the incremental LR engine. *) (* In this mode, the user controls the lexer, and the parser suspends itself when it needs to read a new token. *) module type INCREMENTAL_ENGINE = sig type token (* A value of type [production] is (an index for) a production. 
The start productions (which do not exist in an .mly file, but are constructed by Menhir internally) are not part of this type. *) type production (* The type ['a checkpoint] represents an intermediate or final state of the parser. An intermediate checkpoint is a suspension: it records the parser's current state, and allows parsing to be resumed. The parameter ['a] is the type of the semantic value that will eventually be produced if the parser succeeds. *) (* [Accepted] and [Rejected] are final checkpoints. [Accepted] carries a semantic value. *) (* [InputNeeded] is an intermediate checkpoint. It means that the parser wishes to read one token before continuing. *) (* [Shifting] is an intermediate checkpoint. It means that the parser is taking a shift transition. It exposes the state of the parser before and after the transition. The Boolean parameter tells whether the parser intends to request a new token after this transition. (It always does, except when it is about to accept.) *) (* [AboutToReduce] is an intermediate checkpoint. It means that the parser is about to perform a reduction step. It exposes the parser's current state as well as the production that is about to be reduced. *) (* [HandlingError] is an intermediate checkpoint. It means that the parser has detected an error and is currently handling it, in several steps. *) (* A value of type ['a env] represents a configuration of the automaton: current state, stack, lookahead token, etc. The parameter ['a] is the type of the semantic value that will eventually be produced if the parser succeeds. *) (* In normal operation, the parser works with checkpoints: see the functions [offer] and [resume]. However, it is also possible to work directly with environments (see the functions [pop], [force_reduction], and [feed]) and to reconstruct a checkpoint out of an environment (see [input_needed]).
This is considered advanced functionality; its purpose is to allow error recovery strategies to be programmed by the user. *) type 'a env type 'a checkpoint = private | InputNeeded of 'a env | Shifting of 'a env * 'a env * bool | AboutToReduce of 'a env * production | HandlingError of 'a env | Accepted of 'a | Rejected (* [offer] allows the user to resume the parser after it has suspended itself with a checkpoint of the form [InputNeeded env]. [offer] expects the old checkpoint as well as a new token and produces a new checkpoint. It does not raise any exception. *) val offer: 'a checkpoint -> token * position * position -> 'a checkpoint (* [resume] allows the user to resume the parser after it has suspended itself with a checkpoint of the form [AboutToReduce (env, prod)] or [HandlingError env]. [resume] expects the old checkpoint and produces a new checkpoint. It does not raise any exception. *) val resume: 'a checkpoint -> 'a checkpoint (* A token supplier is a function of no arguments which delivers a new token (together with its start and end positions) every time it is called. *) type supplier = unit -> token * position * position (* A pair of a lexer and a lexing buffer can be easily turned into a supplier. *) val lexer_lexbuf_to_supplier: (Lexing.lexbuf -> token) -> Lexing.lexbuf -> supplier (* The functions [offer] and [resume] are sufficient to write a parser loop. One can imagine many variations (which is why we expose these functions in the first place!). Here, we expose a few variations of the main loop, ready for use. *) (* [loop supplier checkpoint] begins parsing from [checkpoint], reading tokens from [supplier]. It continues parsing until it reaches a checkpoint of the form [Accepted v] or [Rejected]. In the former case, it returns [v]. In the latter case, it raises the exception [Error]. *) val loop: supplier -> 'a checkpoint -> 'a (* [loop_handle succeed fail supplier checkpoint] begins parsing from [checkpoint], reading tokens from [supplier]. 
It continues parsing until it reaches a checkpoint of the form [Accepted v] or [HandlingError env] (or [Rejected], but that should not happen, as [HandlingError _] will be observed first). In the former case, it calls [succeed v]. In the latter case, it calls [fail] with this checkpoint. It cannot raise [Error]. This means that Menhir's traditional error-handling procedure (which pops the stack until a state that can act on the [error] token is found) does not get a chance to run. Instead, the user can implement her own error handling code, in the [fail] continuation. *) val loop_handle: ('a -> 'answer) -> ('a checkpoint -> 'answer) -> supplier -> 'a checkpoint -> 'answer (* [loop_handle_undo] is analogous to [loop_handle], except it passes a pair of checkpoints to the failure continuation. The first (and oldest) checkpoint is the last [InputNeeded] checkpoint that was encountered before the error was detected. The second (and newest) checkpoint is where the error was detected, as in [loop_handle]. Going back to the first checkpoint can be thought of as undoing any reductions that were performed after seeing the problematic token. (These reductions must be default reductions or spurious reductions.) [loop_handle_undo] must initially be applied to an [InputNeeded] checkpoint. The parser's initial checkpoints satisfy this constraint. *) val loop_handle_undo: ('a -> 'answer) -> ('a checkpoint -> 'a checkpoint -> 'answer) -> supplier -> 'a checkpoint -> 'answer (* [shifts checkpoint] assumes that [checkpoint] has been obtained by submitting a token to the parser. It runs the parser from [checkpoint], through an arbitrary number of reductions, until the parser either accepts this token (i.e., shifts) or rejects it (i.e., signals an error). If the parser decides to shift, then [Some env] is returned, where [env] is the parser's state just before shifting. Otherwise, [None] is returned. 
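A typical entry point built on [loop_handle] might look like the sketch below. The module names [Parser], [Lexer], the start symbol [main], and the result type [Parser.ast] are all hypothetical placeholders for a grammar compiled with `menhir --table` and an ocamllex lexer; only [lexer_lexbuf_to_supplier] and [loop_handle] come from the API above.

```ocaml
module I = Parser.MenhirInterpreter  (* hypothetical generated module *)

let parse (lexbuf : Lexing.lexbuf) : (Parser.ast, string) result =
  (* Turn the lexer and lexing buffer into a token supplier. *)
  let supplier = I.lexer_lexbuf_to_supplier Lexer.token lexbuf in
  (* Obtain the initial checkpoint for the (hypothetical) start symbol. *)
  let checkpoint = Parser.Incremental.main lexbuf.Lexing.lex_curr_p in
  I.loop_handle
    (fun v -> Ok v)                            (* [Accepted v]: success *)
    (fun _checkpoint -> Error "syntax error")  (* [HandlingError _]: custom report *)
    supplier checkpoint
```

The [fail] continuation receives the [HandlingError] checkpoint, so it can inspect the automaton's state (e.g. via [current_state_number]) to produce a more precise message.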
*) (* It is desirable that the semantic actions be side-effect free, or that their side-effects be harmless (replayable). *) val shifts: 'a checkpoint -> 'a env option (* The function [acceptable] allows testing, after an error has been detected, which tokens would have been accepted at this point. It is implemented using [shifts]. Its argument should be an [InputNeeded] checkpoint. *) (* For completeness, one must undo any spurious reductions before carrying out this test -- that is, one must apply [acceptable] to the FIRST checkpoint that is passed by [loop_handle_undo] to its failure continuation. *) (* This test causes some semantic actions to be run! The semantic actions should be side-effect free, or their side-effects should be harmless. *) (* The position [pos] is used as the start and end positions of the hypothetical token, and may be picked up by the semantic actions. We suggest using the position where the error was detected. *) val acceptable: 'a checkpoint -> token -> position -> bool (* The abstract type ['a lr1state] describes the non-initial states of the LR(1) automaton. The index ['a] represents the type of the semantic value associated with this state's incoming symbol. *) type 'a lr1state (* The states of the LR(1) automaton are numbered (from 0 and up). *) val number: _ lr1state -> int (* Productions are numbered. *) (* [find_production i] requires the index [i] to be valid. Use with care. *) val production_index: production -> int val find_production: int -> production (* An element is a pair of a non-initial state [s] and a semantic value [v] associated with the incoming symbol of this state. The idea is, the value [v] was pushed onto the stack just before the state [s] was entered. Thus, for some type ['a], the state [s] has type ['a lr1state] and the value [v] has type ['a]. In other words, the type [element] is an existential type. 
*) type element = | Element: 'a lr1state * 'a * position * position -> element (* The parser's stack is (or, more precisely, can be viewed as) a stream of elements. The type [stream] is defined by the module [General]. *) (* As of 2017/03/31, the types [stream] and [stack] and the function [stack] are DEPRECATED. They might be removed in the future. An alternative way of inspecting the stack is via the functions [top] and [pop]. *) type stack = (* DEPRECATED *) element stream (* This is the parser's stack, a stream of elements. This stream is empty if the parser is in an initial state; otherwise, it is non-empty. The LR(1) automaton's current state is the one found in the top element of the stack. *) val stack: 'a env -> stack (* DEPRECATED *) (* [top env] returns the parser's top stack element. The state contained in this stack element is the current state of the automaton. If the stack is empty, [None] is returned. In that case, the current state of the automaton must be an initial state. *) val top: 'a env -> element option (* [pop_many i env] pops [i] cells off the automaton's stack. This is done via [i] successive invocations of [pop]. Thus, [pop_many 1] is [pop]. The index [i] must be nonnegative. The time complexity is O(i). *) val pop_many: int -> 'a env -> 'a env option (* [get i env] returns the parser's [i]-th stack element. The index [i] is 0-based: thus, [get 0] is [top]. If [i] is greater than or equal to the number of elements in the stack, [None] is returned. The time complexity is O(i). *) val get: int -> 'a env -> element option (* [current_state_number env] is (the integer number of) the automaton's current state. This works even if the automaton's stack is empty, in which case the current state is an initial state. This number can be passed as an argument to a [message] function generated by [menhir --compile-errors]. 
*) val current_state_number: 'a env -> int (* [equal env1 env2] tells whether the parser configurations [env1] and [env2] are equal in the sense that the automaton's current state is the same in [env1] and [env2] and the stack is *physically* the same in [env1] and [env2]. If [equal env1 env2] is [true], then the sequence of the stack elements, as observed via [pop] and [top], must be the same in [env1] and [env2]. Also, if [equal env1 env2] holds, then the checkpoints [input_needed env1] and [input_needed env2] must be equivalent. The function [equal] has time complexity O(1). *) val equal: 'a env -> 'a env -> bool (* These are the start and end positions of the current lookahead token. If invoked in an initial state, this function returns a pair of twice the initial position. *) val positions: 'a env -> position * position (* When applied to an environment taken from a checkpoint of the form [AboutToReduce (env, prod)], the function [env_has_default_reduction] tells whether the reduction that is about to take place is a default reduction. *) val env_has_default_reduction: 'a env -> bool (* [state_has_default_reduction s] tells whether the state [s] has a default reduction. This includes the case where [s] is an accepting state. *) val state_has_default_reduction: _ lr1state -> bool (* [pop env] returns a new environment, where the parser's top stack cell has been popped off. (If the stack is empty, [None] is returned.) This amounts to pretending that the (terminal or nonterminal) symbol that corresponds to this stack cell has not been read. *) val pop: 'a env -> 'a env option (* [force_reduction prod env] should be called only if in the state [env] the parser is capable of reducing the production [prod]. If this condition is satisfied, then this production is reduced, which means that its semantic action is executed (this can have side effects!) and the automaton makes a goto (nonterminal) transition. 
If this condition is not satisfied, [Invalid_argument _] is raised. *) val force_reduction: production -> 'a env -> 'a env (* [input_needed env] returns [InputNeeded env]. That is, out of an [env] that might have been obtained via a series of calls to the functions [pop], [force_reduction], [feed], etc., it produces a checkpoint, which can be used to resume normal parsing, by supplying this checkpoint as an argument to [offer]. *) (* This function should be used with some care. It could "mess up the lookahead" in the sense that it allows parsing to resume in an arbitrary state [s] with an arbitrary lookahead symbol [t], even though Menhir's reachability analysis (menhir --list-errors) might well think that it is impossible to reach this particular configuration. If one is using Menhir's new error reporting facility, this could cause the parser to reach an error state for which no error message has been prepared. *) val input_needed: 'a env -> 'a checkpoint end (* This signature is a fragment of the inspection API that is made available to the user when [--inspection] is used. This fragment contains type definitions for symbols. *) module type SYMBOLS = sig (* The type ['a terminal] represents a terminal symbol. The type ['a nonterminal] represents a nonterminal symbol. In both cases, the index ['a] represents the type of the semantic values associated with this symbol. The concrete definitions of these types are generated. *) type 'a terminal type 'a nonterminal (* The type ['a symbol] represents a terminal or nonterminal symbol. It is the disjoint union of the types ['a terminal] and ['a nonterminal]. *) type 'a symbol = | T : 'a terminal -> 'a symbol | N : 'a nonterminal -> 'a symbol (* The type [xsymbol] is an existentially quantified version of the type ['a symbol]. This type is useful in situations where the index ['a] is not statically known. 
*) type xsymbol = | X : 'a symbol -> xsymbol end (* This signature describes the inspection API that is made available to the user when [--inspection] is used. *) module type INSPECTION = sig (* The types of symbols are described above. *) include SYMBOLS (* The type ['a lr1state] is meant to be the same as in [INCREMENTAL_ENGINE]. *) type 'a lr1state (* The type [production] is meant to be the same as in [INCREMENTAL_ENGINE]. It represents a production of the grammar. A production can be examined via the functions [lhs] and [rhs] below. *) type production (* An LR(0) item is a pair of a production [prod] and a valid index [i] into this production. That is, if the length of [rhs prod] is [n], then [i] is comprised between 0 and [n], inclusive. *) type item = production * int (* Ordering functions. *) val compare_terminals: _ terminal -> _ terminal -> int val compare_nonterminals: _ nonterminal -> _ nonterminal -> int val compare_symbols: xsymbol -> xsymbol -> int val compare_productions: production -> production -> int val compare_items: item -> item -> int (* [incoming_symbol s] is the incoming symbol of the state [s], that is, the symbol that the parser must recognize before (has recognized when) it enters the state [s]. This function gives access to the semantic value [v] stored in a stack element [Element (s, v, _, _)]. Indeed, by case analysis on the symbol [incoming_symbol s], one discovers the type ['a] of the value [v]. *) val incoming_symbol: 'a lr1state -> 'a symbol (* [items s] is the set of the LR(0) items in the LR(0) core of the LR(1) state [s]. This set is not epsilon-closed. This set is presented as a list, in an arbitrary order. *) val items: _ lr1state -> item list (* [lhs prod] is the left-hand side of the production [prod]. This is always a non-terminal symbol. *) val lhs: production -> xsymbol (* [rhs prod] is the right-hand side of the production [prod]. This is a (possibly empty) sequence of (terminal or nonterminal) symbols. 
*) val rhs: production -> xsymbol list (* [nullable nt] tells whether the non-terminal symbol [nt] is nullable. That is, it is true if and only if this symbol produces the empty word [epsilon]. *) val nullable: _ nonterminal -> bool (* [first nt t] tells whether the FIRST set of the nonterminal symbol [nt] contains the terminal symbol [t]. That is, it is true if and only if [nt] produces a word that begins with [t]. *) val first: _ nonterminal -> _ terminal -> bool (* [xfirst] is analogous to [first], but expects a first argument of type [xsymbol] instead of [_ terminal]. *) val xfirst: xsymbol -> _ terminal -> bool (* [foreach_terminal] enumerates the terminal symbols, including [error]. [foreach_terminal_but_error] enumerates the terminal symbols, excluding [error]. *) val foreach_terminal: (xsymbol -> 'a -> 'a) -> 'a -> 'a val foreach_terminal_but_error: (xsymbol -> 'a -> 'a) -> 'a -> 'a (* The type [env] is meant to be the same as in [INCREMENTAL_ENGINE]. *) type 'a env (* [feed symbol startp semv endp env] causes the parser to consume the (terminal or nonterminal) symbol [symbol], accompanied with the semantic value [semv] and with the start and end positions [startp] and [endp]. Thus, the automaton makes a transition, and reaches a new state. The stack grows by one cell. This operation is permitted only if the current state (as determined by [env]) has an outgoing transition labeled with [symbol]. Otherwise, [Invalid_argument _] is raised. *) val feed: 'a symbol -> position -> 'a -> position -> 'b env -> 'b env end (* This signature combines the incremental API and the inspection API. 
*) module type EVERYTHING = sig include INCREMENTAL_ENGINE include INSPECTION with type 'a lr1state := 'a lr1state with type production := production with type 'a env := 'a env end menhir-20200123/lib/InfiniteArray.ml000066400000000000000000000036601361226111300170740ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (** This module implements infinite arrays, that is, arrays that grow transparently upon demand. *) type 'a t = { default: 'a; mutable table: 'a array; mutable extent: int; (* the index of the greatest [set] ever, plus one *) } let default_size = 16384 (* must be non-zero *) let make x = { default = x; table = Array.make default_size x; extent = 0; } let rec new_length length i = if i < length then length else new_length (2 * length) i let ensure a i = assert (0 <= i); let table = a.table in let length = Array.length table in if i >= length then begin let table' = Array.make (new_length (2 * length) i) a.default in Array.blit table 0 table' 0 length; a.table <- table' end let get a i = ensure a i; Array.unsafe_get a.table (i) let set a i x = ensure a i; Array.unsafe_set a.table (i) x; if a.extent <= i then a.extent <- i + 1 let extent a = a.extent let domain a = Array.sub a.table 0 a.extent menhir-20200123/lib/InfiniteArray.mli000066400000000000000000000034311361226111300172410ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, 
Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (** This module implements infinite arrays. **) type 'a t (** [make x] creates an infinite array, where every slot contains [x]. **) val make: 'a -> 'a t (** [get a i] returns the element contained at offset [i] in the array [a]. Slots are numbered 0 and up. **) val get: 'a t -> int -> 'a (** [set a i x] sets the element contained at offset [i] in the array [a] to [x]. Slots are numbered 0 and up. **) val set: 'a t -> int -> 'a -> unit (** [extent a] is the length of an initial segment of the array [a] that is sufficiently large to contain all [set] operations ever performed. In other words, all elements beyond that segment have the default value. *) val extent: 'a t -> int (** [domain a] is a fresh copy of an initial segment of the array [a] whose length is [extent a]. *) val domain: 'a t -> 'a array menhir-20200123/lib/InspectionTableFormat.ml000066400000000000000000000060651361226111300205660ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This signature defines the format of the tables that are produced (in addition to the tables described in [TableFormat]) when the command line switch [--inspection] is enabled. 
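The infinite-array interface above can be exercised as follows; this standalone sketch mirrors the doubling strategy of InfiniteArray.ml (with a smaller initial size, for illustration) and shows that slots are grown transparently and that unwritten slots hold the default value:

```ocaml
(* Mirrors InfiniteArray: accessing an index beyond the current capacity
   transparently grows the backing store. *)
type 'a t = { default: 'a; mutable table: 'a array; mutable extent: int }

let make x = { default = x; table = Array.make 16 x; extent = 0 }

let rec new_length length i =
  if i < length then length else new_length (2 * length) i

let ensure a i =
  assert (0 <= i);
  let n = Array.length a.table in
  if i >= n then begin
    let table' = Array.make (new_length (2 * n) i) a.default in
    Array.blit a.table 0 table' 0 n;
    a.table <- table'
  end

let get a i = ensure a i; a.table.(i)
let set a i x = ensure a i; a.table.(i) <- x; if a.extent <= i then a.extent <- i + 1

let () =
  let a = make 0 in
  set a 100_000 7;
  assert (get a 100_000 = 7);
  assert (get a 42 = 0);          (* unwritten slots hold the default *)
  assert (a.extent = 100_001)
```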
It is used as an argument to [InspectionTableInterpreter.Make]. *) module type TABLES = sig (* The types of symbols. *) include IncrementalEngine.SYMBOLS (* The type ['a lr1state] describes an LR(1) state. The generated parser defines it internally as [int]. *) type 'a lr1state (* Some of the tables that follow use encodings of (terminal and nonterminal) symbols as integers. So, we need functions that map the integer encoding of a symbol to its algebraic encoding. *) val terminal: int -> xsymbol val nonterminal: int -> xsymbol (* The left-hand side of every production already appears in the signature [TableFormat.TABLES], so we need not repeat it here. *) (* The right-hand side of every production. This is a linearized array of arrays of integers, whose [data] and [entry] components have been packed. The encoding of symbols as integers is described in [TableBackend]. *) val rhs: PackedIntArray.t * PackedIntArray.t (* A mapping of every (non-initial) state to its LR(0) core. *) val lr0_core: PackedIntArray.t (* A mapping of every LR(0) state to its set of LR(0) items. Each item is represented in its packed form (see [Item]) as an integer. Thus the mapping is an array of arrays of integers, which is linearized and packed, like [rhs]. *) val lr0_items: PackedIntArray.t * PackedIntArray.t (* A mapping of every LR(0) state to its incoming symbol, if it has one. *) val lr0_incoming: PackedIntArray.t (* A table that tells which non-terminal symbols are nullable. *) val nullable: string (* This is a packed int array of bit width 1. It can be read using [PackedIntArray.get1]. *) (* A two-dimensional table, indexed by a nonterminal symbol and by a terminal symbol (other than [#]), encodes the FIRST sets.
*) val first: int (* width of the bitmap *) * string (* second component of [PackedIntArray.t] *) end menhir-20200123/lib/InspectionTableInterpreter.ml000066400000000000000000000252061361226111300216370ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* -------------------------------------------------------------------------- *) (* The type functor. *) module Symbols (T : sig type 'a terminal type 'a nonterminal end) = struct open T (* This should be the only place in the whole library (and generator!) where these types are defined. *) type 'a symbol = | T : 'a terminal -> 'a symbol | N : 'a nonterminal -> 'a symbol type xsymbol = | X : 'a symbol -> xsymbol end (* -------------------------------------------------------------------------- *) (* The code functor. *) module Make (TT : TableFormat.TABLES) (IT : InspectionTableFormat.TABLES with type 'a lr1state = int) (ET : EngineTypes.TABLE with type terminal = int and type nonterminal = int and type semantic_value = Obj.t) (E : sig type 'a env = (ET.state, ET.semantic_value, ET.token) EngineTypes.env end) = struct (* Including [IT] is an easy way of inheriting the definitions of the types [symbol] and [xsymbol]. *) include IT (* This auxiliary function decodes a packed linearized array, as created by [TableBackend.linearize_and_marshal1]. Here, we read a row all at once. 
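The linearized encoding decoded here can be illustrated without the packing layer. In this simplified sketch (which ignores PackedIntArray compression, and for simplicity stores n+1 offsets so that each row's length is the difference of two consecutive entries), the rows of a ragged array are concatenated into one [data] array, and [entry.(i)] gives the offset where row [i] starts:

```ocaml
(* rows = [[1;2;3]; []; [4;5]], linearized: *)
let data  = [| 1; 2; 3; 4; 5 |]
let entry = [| 0; 3; 3; 5 |]   (* row offsets; the last entry is the total length *)

let read_row i =
  let lo = entry.(i) and hi = entry.(i + 1) in
  Array.to_list (Array.sub data lo (hi - lo))

let () =
  assert (read_row 0 = [1; 2; 3]);
  assert (read_row 1 = []);
  assert (read_row 2 = [4; 5])
```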
*) let read_packed_linearized (data, entry : PackedIntArray.t * PackedIntArray.t) (i : int) : int list = LinearizedArray.read_row_via (PackedIntArray.get data) (PackedIntArray.get entry) i (* This auxiliary function decodes a symbol. The encoding was done by [encode_symbol] or [encode_symbol_option] in the table back-end. *) let decode_symbol (symbol : int) : IT.xsymbol = (* If [symbol] is 0, then we have no symbol. This could mean e.g. that the function [incoming_symbol] has been applied to an initial state. In principle, this cannot happen. *) assert (symbol > 0); (* The low-order bit distinguishes terminal and nonterminal symbols. *) let kind = symbol land 1 in let symbol = symbol lsr 1 in if kind = 0 then IT.terminal (symbol - 1) else IT.nonterminal symbol (* These auxiliary functions convert a symbol to its integer code. For speed and for convenience, we use an unsafe type cast. This relies on the fact that the data constructors of the [terminal] and [nonterminal] GADTs are declared in an order that reflects their internal code. In the case of nonterminal symbols, we add [start] to account for the presence of the start symbols. *) let n2i (nt : 'a IT.nonterminal) : int = let answer = TT.start + Obj.magic nt in (* For safety, check that the above cast produced a correct result. *) assert (IT.nonterminal answer = X (N nt)); answer let t2i (t : 'a IT.terminal) : int = let answer = Obj.magic t in (* For safety, check that the above cast produced a correct result. *) assert (IT.terminal answer = X (T t)); answer (* Ordering functions. *) let compare_terminals t1 t2 = (* Subtraction is safe because overflow is impossible. *) t2i t1 - t2i t2 let compare_nonterminals nt1 nt2 = (* Subtraction is safe because overflow is impossible. 
*) n2i nt1 - n2i nt2 let compare_symbols symbol1 symbol2 = match symbol1, symbol2 with | X (T _), X (N _) -> -1 | X (N _), X (T _) -> 1 | X (T t1), X (T t2) -> compare_terminals t1 t2 | X (N nt1), X (N nt2) -> compare_nonterminals nt1 nt2 let compare_productions prod1 prod2 = (* Subtraction is safe because overflow is impossible. *) prod1 - prod2 let compare_items (prod1, index1) (prod2, index2) = let c = compare_productions prod1 prod2 in (* Subtraction is safe because overflow is impossible. *) if c <> 0 then c else index1 - index2 (* The function [incoming_symbol] goes through the tables [IT.lr0_core] and [IT.lr0_incoming]. This yields a representation of type [xsymbol], out of which we strip the [X] quantifier, so as to get a naked symbol. This last step is ill-typed and potentially dangerous. It is safe only because this function is used at type ['a lr1state -> 'a symbol], which forces an appropriate choice of ['a]. *) let incoming_symbol (s : 'a IT.lr1state) : 'a IT.symbol = let core = PackedIntArray.get IT.lr0_core s in let symbol = decode_symbol (PackedIntArray.get IT.lr0_incoming core) in match symbol with | IT.X symbol -> Obj.magic symbol (* The function [lhs] reads the table [TT.lhs] and uses [IT.nonterminal] to decode the symbol. *) let lhs prod = IT.nonterminal (PackedIntArray.get TT.lhs prod) (* The function [rhs] reads the table [IT.rhs] and uses [decode_symbol] to decode the symbol. *) let rhs prod = List.map decode_symbol (read_packed_linearized IT.rhs prod) (* The function [items] maps the LR(1) state [s] to its LR(0) core, then uses [core] as an index into the table [IT.lr0_items]. The items are then decoded by the function [export] below, which is essentially a copy of [Item.export]. *) type item = int * int let export t : item = (t lsr 7, t mod 128) let items s = (* Map [s] to its LR(0) core. *) let core = PackedIntArray.get IT.lr0_core s in (* Now use [core] to look up the table [IT.lr0_items]. 
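The item encoding used by [export] above packs a production index and a position into a single integer, with the position in the low 7 bits. A standalone round trip illustrates this; the inverse function [import] is shown here for illustration only and is not part of this file:

```ocaml
let export t = (t lsr 7, t mod 128)                 (* as in the code above *)
let import (prod, index) = (prod lsl 7) lor index   (* hypothetical inverse *)

let () =
  let item = (42, 5) in
  assert (export (import item) = item);
  (* the position must fit in 7 bits, i.e. 0 <= index < 128 *)
  assert (import (1, 0) = 128)
```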
*) List.map export (read_packed_linearized IT.lr0_items core) (* The function [nullable] maps the nonterminal symbol [nt] to its integer code, which it uses to look up the array [IT.nullable]. This yields 0 or 1, which we map back to a Boolean result. *) let decode_bool i = assert (i = 0 || i = 1); i = 1 let nullable nt = decode_bool (PackedIntArray.get1 IT.nullable (n2i nt)) (* The function [first] maps the symbols [nt] and [t] to their integer codes, which it uses to look up the matrix [IT.first]. *) let first nt t = decode_bool (PackedIntArray.unflatten1 IT.first (n2i nt) (t2i t)) let xfirst symbol t = match symbol with | X (T t') -> compare_terminals t t' = 0 | X (N nt) -> first nt t (* The function [foreach_terminal] exploits the fact that the first component of [TT.error] is [Terminal.n - 1], i.e., the number of terminal symbols, including [error] but not [#]. *) let rec foldij i j f accu = if i = j then accu else foldij (i + 1) j f (f i accu) let foreach_terminal f accu = let n, _ = TT.error in foldij 0 n (fun i accu -> f (IT.terminal i) accu ) accu let foreach_terminal_but_error f accu = let n, _ = TT.error in foldij 0 n (fun i accu -> if i = TT.error_terminal then accu else f (IT.terminal i) accu ) accu (* ------------------------------------------------------------------------ *) (* The following is the implementation of the function [feed]. This function is logically part of the LR engine, so it would be nice if it were placed in the module [Engine], but it must be placed here because, to ensure type safety, its arguments must be a symbol of type ['a symbol] and a semantic value of type ['a]. The type ['a symbol] is not available in [Engine]. It is available here. *) open EngineTypes open ET open E (* [feed] fails if the current state does not have an outgoing transition labeled with the desired symbol. This check is carried out at runtime. 
*) let feed_failure () = invalid_arg "feed: outgoing transition does not exist" (* Feeding a nonterminal symbol [nt]. Here, [nt] has type [nonterminal], which is a synonym for [int], and [semv] has type [semantic_value], which is a synonym for [Obj.t]. This type is unsafe, because pushing a semantic value of arbitrary type into the stack can later cause a semantic action to crash and burn. The function [feed] is given a safe type below. *) let feed_nonterminal (nt : nonterminal) startp (semv : semantic_value) endp (env : 'b env) : 'b env = (* Check if the source state has an outgoing transition labeled [nt]. This is done by consulting the [goto] table. *) let source = env.current in match ET.maybe_goto_nt source nt with | None -> feed_failure() | Some target -> (* Push a new cell onto the stack, containing the identity of the state that we are leaving. The semantic value [semv] and positions [startp] and [endp] contained in the new cell are provided by the caller. *) let stack = { state = source; semv; startp; endp; next = env.stack } in (* Move to the target state. *) { env with stack; current = target } let reduce _env _prod = feed_failure() let initiate _env = feed_failure() let feed_terminal (terminal : terminal) startp (semv : semantic_value) endp (env : 'b env) : 'b env = (* Check if the source state has an outgoing transition labeled [terminal]. This is done by consulting the [action] table. *) let source = env.current in ET.action source terminal semv (fun env _please_discard _terminal semv target -> (* There is indeed a transition toward the state [target]. Push a new cell onto the stack and move to the target state. *) let stack = { state = source; semv; startp; endp; next = env.stack } in { env with stack; current = target } ) reduce initiate env (* The type assigned to [feed] ensures that the type of the semantic value [semv] is appropriate: it must be the semantic-value type of the symbol [symbol]. 
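The typing discipline that makes [feed] safe — pairing an ['a symbol] witness with an ['a] semantic value — can be illustrated with a toy GADT. The two symbols below are invented for the example and are not MenhirLib's; the point is that matching on the witness refines the type of the value, so a mismatched value is rejected at compile time.

```ocaml
(* A toy illustration of the ['a symbol -> 'a -> ...] discipline. *)
type _ symbol =
  | Int_symbol : int symbol
  | String_symbol : string symbol

let show : type a. a symbol -> a -> string = fun symbol semv ->
  match symbol with
  | Int_symbol -> string_of_int semv   (* here [semv : int] *)
  | String_symbol -> semv              (* here [semv : string] *)

let () =
  assert (show Int_symbol 42 = "42");
  assert (show String_symbol "id" = "id")
  (* [show Int_symbol "id"] would be a compile-time type error. *)
```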
*) let feed (symbol : 'a symbol) startp (semv : 'a) endp env = let semv : semantic_value = Obj.repr semv in match symbol with | N nt -> feed_nonterminal (n2i nt) startp semv endp env | T terminal -> feed_terminal (t2i terminal) startp semv endp env end menhir-20200123/lib/InspectionTableInterpreter.mli000066400000000000000000000041401361226111300220020ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This functor is invoked inside the generated parser, in [--table] mode. It produces no code! It simply constructs the types [symbol] and [xsymbol] on top of the generated types [terminal] and [nonterminal]. *) module Symbols (T : sig type 'a terminal type 'a nonterminal end) : IncrementalEngine.SYMBOLS with type 'a terminal := 'a T.terminal and type 'a nonterminal := 'a T.nonterminal (* This functor is invoked inside the generated parser, in [--table] mode. It constructs the inspection API on top of the inspection tables described in [InspectionTableFormat]. 
*) module Make (TT : TableFormat.TABLES) (IT : InspectionTableFormat.TABLES with type 'a lr1state = int) (ET : EngineTypes.TABLE with type terminal = int and type nonterminal = int and type semantic_value = Obj.t) (E : sig type 'a env = (ET.state, ET.semantic_value, ET.token) EngineTypes.env end) : IncrementalEngine.INSPECTION with type 'a terminal := 'a IT.terminal and type 'a nonterminal := 'a IT.nonterminal and type 'a lr1state := 'a IT.lr1state and type production := int and type 'a env := 'a E.env menhir-20200123/lib/LinearizedArray.ml000066400000000000000000000053061361226111300174140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* The [entry] array contains offsets into the [data] array. It has [n+1] elements if the original (unencoded) array has [n] elements. The value of [entry.(n)] is the length of the [data] array. This convention is natural and allows avoiding a special case. *) type 'a t = (* data: *) 'a array * (* entry: *) int array let make (a : 'a array array) : 'a t = let n = Array.length a in (* Build the entry array. *) let size = ref 0 in let entry = Array.init (n + 1) (fun i -> let s = !size in if i < n then size := s + Array.length a.(i); s ) in assert (entry.(n) = !size); (* Build the data array. 
*) let i = ref 0 and j = ref 0 in let data = Array.init !size (fun _ -> while !j = Array.length a.(!i) do i := !i + 1; j := 0; done; let x = a.(!i).(!j) in j := !j + 1; x ) in data, entry let length ((_, entry) : 'a t) : int = Array.length entry let row_length ((_, entry) : 'a t) i : int = entry.(i + 1) - entry.(i) let row_length_via get_entry i = get_entry (i + 1) - get_entry i let read ((data, entry) as la : 'a t) i j : 'a = assert (0 <= j && j < row_length la i); data.(entry.(i) + j) let read_via get_data get_entry i j = assert (0 <= j && j < row_length_via get_entry i); get_data (get_entry i + j) let write ((data, entry) as la : 'a t) i j (v : 'a) : unit = assert (0 <= j && j < row_length la i); data.(entry.(i) + j) <- v let rec read_interval_via get_data i j = if i = j then [] else get_data i :: read_interval_via get_data (i + 1) j let read_row_via get_data get_entry i = read_interval_via get_data (get_entry i) (get_entry (i + 1)) let read_row ((data, entry) : 'a t) i : 'a list = read_row_via (Array.get data) (Array.get entry) i menhir-20200123/lib/LinearizedArray.mli000066400000000000000000000053001361226111300175570ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* An array of arrays (of possibly different lengths!) can be ``linearized'', i.e., encoded as a data array (by concatenating all of the little arrays) and an entry array (which contains offsets into the data array). 
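A small worked instance of the linearized encoding just described, written against plain arrays rather than MenhirLib itself: the rows of [a] are concatenated into [data], [entry.(i)] gives the offset of row [i], and the extra final entry equals the length of [data].

```ocaml
(* For a = [| [|1; 2|]; [||]; [|3|] |]:
   data  = [|1; 2; 3|]
   entry = [|0; 2; 2; 3|]   (note entry.(n) = Array.length data) *)
let a = [| [|1; 2|]; [||]; [|3|] |]

let data = Array.concat (Array.to_list a)

let entry =
  let n = Array.length a in
  let size = ref 0 in
  Array.init (n + 1) (fun i ->
    let s = !size in
    if i < n then size := s + Array.length a.(i);
    s)

let row_length i = entry.(i + 1) - entry.(i)
let read i j = data.(entry.(i) + j)

let () =
  assert (data = [|1; 2; 3|]);
  assert (entry = [|0; 2; 2; 3|]);
  assert (row_length 0 = 2 && row_length 1 = 0 && row_length 2 = 1);
  assert (read 0 1 = 2 && read 2 0 = 3)
```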
*) type 'a t = (* data: *) 'a array * (* entry: *) int array (* [make a] turns the array of arrays [a] into a linearized array. *) val make: 'a array array -> 'a t (* [read la i j] reads the linearized array [la] at indices [i] and [j]. Thus, [read (make a) i j] is equivalent to [a.(i).(j)]. *) val read: 'a t -> int -> int -> 'a (* [write la i j v] writes the value [v] into the linearized array [la] at indices [i] and [j]. *) val write: 'a t -> int -> int -> 'a -> unit (* [length la] is the number of rows of the array [la]. Thus, [length (make a)] is equivalent to [Array.length a]. *) val length: 'a t -> int (* [row_length la i] is the length of the row at index [i] in the linearized array [la]. Thus, [row_length (make a) i] is equivalent to [Array.length a.(i)]. *) val row_length: 'a t -> int -> int (* [read_row la i] reads the row at index [i], producing a list. Thus, [read_row (make a) i] is equivalent to [Array.to_list a.(i)]. *) val read_row: 'a t -> int -> 'a list (* The following variants read the linearized array via accessors [get_data : int -> 'a] and [get_entry : int -> int]. *) val row_length_via: (* get_entry: *) (int -> int) -> (* i: *) int -> int val read_via: (* get_data: *) (int -> 'a) -> (* get_entry: *) (int -> int) -> (* i: *) int -> (* j: *) int -> 'a val read_row_via: (* get_data: *) (int -> 'a) -> (* get_entry: *) (int -> int) -> (* i: *) int -> 'a list menhir-20200123/lib/PackedIntArray.ml000066400000000000000000000141171361226111300171700ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. 
*) (* *) (******************************************************************************) (* A packed integer array is represented as a pair of an integer [k] and a string [s]. The integer [k] is the number of bits per integer that we use. The string [s] is just an array of bits, which is read in 8-bit chunks. *) (* The ocaml programming language treats string literals and array literals in slightly different ways: the former are statically allocated, while the latter are dynamically allocated. (This is rather arbitrary.) In the context of Menhir's table-based back-end, where compact, immutable integer arrays are needed, ocaml strings are preferable to ocaml arrays. *) type t = int * string (* The magnitude [k] of an integer [v] is the number of bits required to represent [v]. It is rounded up to the nearest power of two, so that [k] divides [Sys.word_size]. *) let magnitude (v : int) = if v < 0 then Sys.word_size else let rec check k max = (* [max] equals [2^k] *) if (max <= 0) || (v < max) then k (* if [max] just overflew, then [v] requires a full ocaml integer, and [k] is the number of bits in an ocaml integer plus one, that is, [Sys.word_size]. *) else check (2 * k) (max * max) in check 1 2 (* [pack a] turns an array of integers into a packed integer array. *) (* Because the sign bit is the most significant bit, the magnitude of any negative number is the word size. In other words, [pack] does not achieve any space savings as soon as [a] contains any negative numbers, even if they are ``small''. *) let pack (a : int array) : t = let m = Array.length a in (* Compute the maximum magnitude of the array elements. This tells us how many bits per element we are going to use. *) let k = Array.fold_left (fun k v -> max k (magnitude v) ) 1 a in (* Because access to ocaml strings is performed on an 8-bit basis, two cases arise. If [k] is less than 8, then we can pack multiple array entries into a single character. 
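The behavior of [magnitude], defined above, is worth checking on a few values: the bit width is rounded up to a power of two (1, 2, 4, 8, 16, ...), and any negative value forces a full-width representation because the sign bit is the most significant bit. The definition below is copied verbatim, with some spot checks appended.

```ocaml
let magnitude (v : int) =
  if v < 0 then Sys.word_size
  else
    let rec check k max = (* [max] equals [2^k] *)
      if (max <= 0) || (v < max) then k
      else check (2 * k) (max * max)
    in
    check 1 2

let () =
  assert (magnitude 0 = 1 && magnitude 1 = 1);
  assert (magnitude 2 = 2 && magnitude 3 = 2);
  assert (magnitude 4 = 4 && magnitude 15 = 4);
  assert (magnitude 16 = 8 && magnitude 255 = 8);
  assert (magnitude 256 = 16);
  assert (magnitude (-1) = Sys.word_size)
```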
If [k] is greater than 8, then we must use multiple characters to represent a single array entry. *) if k <= 8 then begin (* [w] is the number of array entries that we pack in a character. *) assert (8 mod k = 0); let w = 8 / k in (* [n] is the length of the string that we allocate. *) let n = if m mod w = 0 then m / w else m / w + 1 in let s = Bytes.create n in (* Define a reader for the source array. The reader might run off the end if [w] does not divide [m]. *) let i = ref 0 in let next () = let ii = !i in if ii = m then 0 (* ran off the end, pad with zeroes *) else let v = a.(ii) in i := ii + 1; v in (* Fill up the string. *) for j = 0 to n - 1 do let c = ref 0 in for _x = 1 to w do c := (!c lsl k) lor next() done; Bytes.set s j (Char.chr !c) done; (* Done. *) k, Bytes.unsafe_to_string s end else begin (* k > 8 *) (* [w] is the number of characters that we use to encode an array entry. *) assert (k mod 8 = 0); let w = k / 8 in (* [n] is the length of the string that we allocate. *) let n = m * w in let s = Bytes.create n in (* Fill up the string. *) for i = 0 to m - 1 do let v = ref a.(i) in for x = 1 to w do Bytes.set s ((i + 1) * w - x) (Char.chr (!v land 255)); v := !v lsr 8 done done; (* Done. *) k, Bytes.unsafe_to_string s end (* Access to a string. *) let read (s : string) (i : int) : int = Char.code (String.unsafe_get s i) (* [get1 t i] returns the integer stored in the packed array [t] at index [i]. It assumes (and does not check) that the array's bit width is [1]. The parameter [t] is just a string. *) let get1 (s : string) (i : int) : int = let c = read s (i lsr 3) in let c = c lsr ((lnot i) land 0b111) in let c = c land 0b1 in c (* [get t i] returns the integer stored in the packed array [t] at index [i]. *) (* Together, [pack] and [get] satisfy the following property: if the index [i] is within bounds, then [get (pack a) i] equals [a.(i)]. 
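The shift-and-mask reads performed by [get] can be checked by hand on a tiny example. Below, two 4-bit values, 3 and 10, are packed into the single byte [(3 lsl 4) lor 10], and each nibble is recovered with exactly the formula used in the [k = 4] case of [get].

```ocaml
(* Hand-packed string for bit width 4: values [3; 10] in one byte. *)
let s = String.make 1 (Char.chr ((3 lsl 4) lor 10))

let read s i = Char.code s.[i]

(* The [k = 4] branch of [get], extracted verbatim. *)
let get4 s i =
  let c = read s (i lsr 1) in
  let c = c lsr (4 * ((lnot i) land 0b1)) in
  c land 0b1111

let () =
  assert (get4 s 0 = 3);
  assert (get4 s 1 = 10)
```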
*) let get ((k, s) : t) (i : int) : int = match k with | 1 -> get1 s i | 2 -> let c = read s (i lsr 2) in let c = c lsr (2 * ((lnot i) land 0b11)) in let c = c land 0b11 in c | 4 -> let c = read s (i lsr 1) in let c = c lsr (4 * ((lnot i) land 0b1)) in let c = c land 0b1111 in c | 8 -> read s i | 16 -> let j = 2 * i in (read s j) lsl 8 + read s (j + 1) | _ -> assert (k = 32); (* 64 bits unlikely, not supported *) let j = 4 * i in (((read s j lsl 8) + read s (j + 1)) lsl 8 + read s (j + 2)) lsl 8 + read s (j + 3) (* [unflatten1 (n, data) i j] accesses the two-dimensional bitmap represented by [(n, data)] at indices [i] and [j]. The integer [n] is the width of the bitmap; the string [data] is the second component of the packed array obtained by encoding the table as a one-dimensional array. *) let unflatten1 (n, data) i j = get1 data (n * i + j) menhir-20200123/lib/PackedIntArray.mli000066400000000000000000000053421361226111300173410ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* A packed integer array is represented as a pair of an integer [k] and a string [s]. The integer [k] is the number of bits per integer that we use. The string [s] is just an array of bits, which is read in 8-bit chunks. *) (* The ocaml programming language treats string literals and array literals in slightly different ways: the former are statically allocated, while the latter are dynamically allocated. (This is rather arbitrary.) 
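The bitmap accessors [get1] and [unflatten1] defined earlier can likewise be checked by hand. Here a 4x2 Boolean matrix is flattened row by row into a single byte, with the first matrix entry stored in the most significant bit; the definitions are copied verbatim.

```ocaml
(* Matrix [| 1 0; 0 1; 1 1; 0 0 |] flattened into the byte 0b10011100. *)
let data = String.make 1 (Char.chr 0b10011100)

let get1 (s : string) (i : int) : int =
  let c = Char.code s.[i lsr 3] in
  let c = c lsr ((lnot i) land 0b111) in
  c land 0b1

let unflatten1 (n, data) i j =
  get1 data (n * i + j)

let bitmap = (2, data) (* width 2 *)

let () =
  assert (unflatten1 bitmap 0 0 = 1 && unflatten1 bitmap 0 1 = 0);
  assert (unflatten1 bitmap 1 0 = 0 && unflatten1 bitmap 1 1 = 1);
  assert (unflatten1 bitmap 2 0 = 1 && unflatten1 bitmap 2 1 = 1);
  assert (unflatten1 bitmap 3 0 = 0 && unflatten1 bitmap 3 1 = 0)
```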
In the context of Menhir's table-based back-end, where compact, immutable integer arrays are needed, ocaml strings are preferable to ocaml arrays. *) type t = int * string (* [pack a] turns an array of integers into a packed integer array. *) (* Because the sign bit is the most significant bit, the magnitude of any negative number is the word size. In other words, [pack] does not achieve any space savings as soon as [a] contains any negative numbers, even if they are ``small''. *) val pack: int array -> t (* [get t i] returns the integer stored in the packed array [t] at index [i]. *) (* Together, [pack] and [get] satisfy the following property: if the index [i] is within bounds, then [get (pack a) i] equals [a.(i)]. *) val get: t -> int -> int (* [get1 t i] returns the integer stored in the packed array [t] at index [i]. It assumes (and does not check) that the array's bit width is [1]. The parameter [t] is just a string. *) val get1: string -> int -> int (* [unflatten1 (n, data) i j] accesses the two-dimensional bitmap represented by [(n, data)] at indices [i] and [j]. The integer [n] is the width of the bitmap; the string [data] is the second component of the packed array obtained by encoding the table as a one-dimensional array. *) val unflatten1: int * string -> int -> int -> int menhir-20200123/lib/Printers.ml000066400000000000000000000067061361226111300161420ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. 
*) (* *) (******************************************************************************) module Make (I : IncrementalEngine.EVERYTHING) (User : sig val print: string -> unit val print_symbol: I.xsymbol -> unit val print_element: (I.element -> unit) option end) = struct let arrow = " -> " let dot = "." let space = " " let newline = "\n" open User open I (* Printing a list of symbols. An optional dot is printed at offset [i] into the list [symbols], if this offset lies between [0] and the length of the list (included). *) let rec print_symbols i symbols = if i = 0 then begin print dot; print space; print_symbols (-1) symbols end else begin match symbols with | [] -> () | symbol :: symbols -> print_symbol symbol; print space; print_symbols (i - 1) symbols end (* Printing an element as a symbol. *) let print_element_as_symbol element = match element with | Element (s, _, _, _) -> print_symbol (X (incoming_symbol s)) (* Some of the functions that follow need an element printer. They use [print_element] if provided by the user; otherwise they use [print_element_as_symbol]. *) let print_element = match print_element with | Some print_element -> print_element | None -> print_element_as_symbol (* Printing a stack as a list of symbols. Stack bottom on the left, stack top on the right. *) let rec print_stack env = match top env, pop env with | Some element, Some env -> print_stack env; print space; print_element element | _, _ -> () let print_stack env = print_stack env; print newline (* Printing an item. *) let print_item (prod, i) = print_symbol (lhs prod); print arrow; print_symbols i (rhs prod); print newline (* Printing a list of symbols (public version). *) let print_symbols symbols = print_symbols (-1) symbols (* Printing a production (without a dot). *) let print_production prod = print_item (prod, -1) (* Printing the current LR(1) state. 
*) let print_current_state env = print "Current LR(1) state: "; match top env with | None -> print ""; (* TEMPORARY unsatisfactory *) print newline | Some (Element (current, _, _, _)) -> print (string_of_int (number current)); print newline; List.iter print_item (items current) let print_env env = print_stack env; print_current_state env; print newline end menhir-20200123/lib/Printers.mli000066400000000000000000000052011361226111300163000ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This module is part of MenhirLib. *) module Make (I : IncrementalEngine.EVERYTHING) (User : sig (* [print s] is supposed to send the string [s] to some output channel. *) val print: string -> unit (* [print_symbol s] is supposed to print a representation of the symbol [s]. *) val print_symbol: I.xsymbol -> unit (* [print_element e] is supposed to print a representation of the element [e]. This function is optional; if it is not provided, [print_element_as_symbol] (defined below) is used instead. *) val print_element: (I.element -> unit) option end) : sig open I (* Printing a list of symbols. *) val print_symbols: xsymbol list -> unit (* Printing an element as a symbol. This prints just the symbol that this element represents; nothing more. *) val print_element_as_symbol: element -> unit (* Printing a stack as a list of elements. This function needs an element printer. It uses [print_element] if provided by the user; otherwise it uses [print_element_as_symbol]. (Ending with a newline.) 
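The dot-placement logic used by the item printer above can be sketched in a standalone form, with symbols as plain strings and the output accumulated in a buffer rather than sent to a channel. The function names are illustrative; as in the real printer, every symbol (and the dot) is followed by a space.

```ocaml
(* Prints [symbols] with a dot at offset [i]; [i = List.length symbols]
   places the dot at the end. *)
let print_symbols_with_dot i symbols =
  let buf = Buffer.create 16 in
  List.iteri (fun j symbol ->
    if j = i then Buffer.add_string buf ". ";
    Buffer.add_string buf symbol;
    Buffer.add_char buf ' '
  ) symbols;
  if i = List.length symbols then Buffer.add_string buf ". ";
  Buffer.contents buf

let print_item lhs i rhs =
  lhs ^ " -> " ^ print_symbols_with_dot i rhs

let () =
  assert (print_item "E" 1 ["E"; "+"; "E"] = "E -> E . + E ");
  assert (print_item "E" 3 ["E"; "+"; "E"] = "E -> E + E . ")
```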
*) val print_stack: 'a env -> unit (* Printing an item. (Ending with a newline.) *) val print_item: item -> unit (* Printing a production. (Ending with a newline.) *) val print_production: production -> unit (* Printing the current LR(1) state. The current state is first displayed as a number; then the list of its LR(0) items is printed. (Ending with a newline.) *) val print_current_state: 'a env -> unit (* Printing a summary of the stack and current state. This function just calls [print_stack] and [print_current_state] in succession. *) val print_env: 'a env -> unit end menhir-20200123/lib/RowDisplacement.ml000066400000000000000000000214211361226111300174230ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This module compresses a two-dimensional table, where some values are considered insignificant, via row displacement. *) (* This idea reportedly appears in Aho and Ullman's ``Principles of Compiler Design'' (1977). It is evaluated in Tarjan and Yao's ``Storing a Sparse Table'' (1979) and in Dencker, Dürre, and Heuft's ``Optimization of Parser Tables for Portable Compilers'' (1984). *) (* A compressed table is represented as a pair of arrays. The displacement array is an array of offsets into the data array. *) type 'a table = int array * (* displacement *) 'a array (* data *) (* In a natural version of this algorithm, displacements would be greater than (or equal to) [-n]. 
However, in the particular setting of Menhir, both arrays are intended to be compressed with [PackedIntArray], which does not efficiently support negative numbers. For this reason, we are careful not to produce negative displacements. *) (* In order to avoid producing negative displacements, we simply use the least significant bit as the sign bit. This is implemented by [encode] and [decode] below. *) (* One could also think, say, of adding [n] to every displacement, so as to ensure that all displacements are nonnegative. This would work, but would require [n] to be published, for use by the decoder. *) let encode (displacement : int) : int = if displacement >= 0 then displacement lsl 1 else (-displacement) lsl 1 + 1 let decode (displacement : int) : int = if displacement land 1 = 0 then displacement lsr 1 else -(displacement lsr 1) (* It is reasonable to assume that, as matrices grow large, their density becomes low, i.e., they have many insignificant entries. As a result, it is important to work with a sparse data structure for rows. We internally represent a row as a list of its significant entries, where each entry is a pair of a [j] index and an element. *) type 'a row = (int * 'a) list (* [compress equal insignificant dummy m n t] turns the two-dimensional table [t] into a compressed table. The parameter [equal] is equality of data values. The parameter [wildcard] tells which data values are insignificant, and can thus be overwritten with other values. The parameter [dummy] is used to fill holes in the data array. [m] and [n] are the integer dimensions of the table [t]. *) let compress (equal : 'a -> 'a -> bool) (insignificant : 'a -> bool) (dummy : 'a) (m : int) (n : int) (t : 'a array array) : 'a table = (* Be defensive. *) assert (Array.length t = m); assert begin for i = 0 to m - 1 do assert (Array.length t.(i) = n) done; true end; (* This turns a row-as-array into a row-as-sparse-list. 
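The sign-in-the-least-significant-bit trick implemented by [encode] and [decode] above keeps every stored displacement nonnegative, as required by [PackedIntArray]. The round trip can be verified directly; the definitions are copied verbatim.

```ocaml
let encode (displacement : int) : int =
  if displacement >= 0
  then displacement lsl 1
  else (-displacement) lsl 1 + 1

let decode (displacement : int) : int =
  if displacement land 1 = 0
  then displacement lsr 1
  else -(displacement lsr 1)

let () =
  (* round trip on a few values, and nonnegativity of every encoding *)
  List.iter (fun d ->
    assert (decode (encode d) = d);
    assert (encode d >= 0)
  ) [-3; -1; 0; 1; 42]
```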
The row is accompanied by its index [i] and by its rank (the number of its significant entries, that is, the length of the row-as-a-list. *) let sparse (i : int) (line : 'a array) : int * int * 'a row (* index, rank, row *) = let rec loop (j : int) (rank : int) (row : 'a row) = if j < 0 then i, rank, row else let x = line.(j) in if insignificant x then loop (j - 1) rank row else loop (j - 1) (1 + rank) ((j, x) :: row) in loop (n - 1) 0 [] in (* Construct an array of all rows, together with their index and rank. *) let rows : (int * int * 'a row) array = (* index, rank, row *) Array.mapi sparse t in (* Sort this array by decreasing rank. This does not have any impact on correctness, but reportedly improves compression. The intuitive idea is that rows with few significant elements are easy to fit, so they should be inserted last, after the problem has become quite constrained by fitting the heavier rows. This heuristic is attributed to Ziegler. *) Array.fast_sort (fun (_, rank1, _) (_, rank2, _) -> compare rank2 rank1 ) rows; (* Allocate a one-dimensional array of displacements. *) let displacement : int array = Array.make m 0 in (* Allocate a one-dimensional, infinite array of values. Indices into this array are written [k]. *) let data : 'a InfiniteArray.t = InfiniteArray.make dummy in (* Determine whether [row] fits at offset [k] within the current [data] array, up to extension of this array. *) (* Note that this check always succeeds when [k] equals the length of the [data] array. Indeed, the loop is then skipped. This property guarantees the termination of the recursive function [fit] below. *) let fits k (row : 'a row) : bool = let d = InfiniteArray.extent data in let rec loop = function | [] -> true | (j, x) :: row -> (* [x] is a significant element. *) (* By hypothesis, [k + j] is nonnegative. If it is greater than or equal to the current length of the data array, stop -- the row fits. 
*) assert (k + j >= 0); if k + j >= d then true (* We now know that [k + j] is within bounds of the data array. Check whether it is compatible with the element [y] found there. If it is, continue. If it isn't, stop -- the row does not fit. *) else let y = InfiniteArray.get data (k + j) in if insignificant y || equal x y then loop row else false in loop row in (* Find the leftmost position where a row fits. *) (* If the leftmost significant element in this row is at offset [j], then we can hope to fit as far left as [-j] -- so this element lands at offset [0] in the data array. *) (* Note that displacements may be negative. This means that, for insignificant elements, accesses to the data array could fail: they could be out of bounds, either towards the left or towards the right. This is not a problem, as long as [get] is invoked only at significant elements. *) let rec fit k row : int = if fits k row then k else fit (k + 1) row in let fit row = match row with | [] -> 0 (* irrelevant *) | (j, _) :: _ -> fit (-j) row in (* Write [row] at (compatible) offset [k]. *) let rec write k = function | [] -> () | (j, x) :: row -> InfiniteArray.set data (k + j) x; write k row in (* Iterate over the sorted array of rows. Fit and write each row at the leftmost compatible offset. Update the displacement table. *) Array.iter (fun (i, _, row) -> let k = fit row in (* if [row] has leading insignificant elements, then [k] can be negative *) write k row; displacement.(i) <- encode k ) rows; (* Return the compressed tables. *) displacement, InfiniteArray.domain data (* [get ct i j] returns the value found at indices [i] and [j] in the compressed table [ct]. This function call is permitted only if the value found at indices [i] and [j] in the original table is significant -- otherwise, it could fail abruptly. 
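A hand-compressed instance makes the lookup concrete. Taking 0 as the insignificant value, the table with rows [1 0 2] and [0 3 0] compresses so that row 1's single significant entry fills the hole of row 0: both rows get displacement 0 and the data array is [|1; 3; 2|]. Looking up an insignificant entry (e.g. row 0, column 1) would return garbage, which is why [get] may only be applied to significant entries. The assertions below omit defensive bounds checks for brevity.

```ocaml
let decode (displacement : int) : int =
  if displacement land 1 = 0
  then displacement lsr 1
  else -(displacement lsr 1)

let get (displacement, data) i j =
  data.(decode displacement.(i) + j)

(* Hand-built compressed form of [| [|1;0;2|]; [|0;3;0|] |],
   0 insignificant: both displacements encode 0. *)
let table = ([|0; 0|], [|1; 3; 2|])

let () =
  assert (get table 0 0 = 1);
  assert (get table 0 2 = 2);
  assert (get table 1 1 = 3)
```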
*) (* Together, [compress] and [get] have the property that, if the value found at indices [i] and [j] in an uncompressed table [t] is significant, then [get (compress t) i j] is equal to that value. *) let get (displacement, data) i j = assert (0 <= i && i < Array.length displacement); let k = decode displacement.(i) in assert (0 <= k + j && k + j < Array.length data); (* failure of this assertion indicates an attempt to access an insignificant element that happens to be mapped out of the bounds of the [data] array. *) data.(k + j) (* [getget] is a variant of [get] which only requires read access, via accessors, to the two components of the table. *) let getget get_displacement get_data (displacement, data) i j = let k = decode (get_displacement displacement i) in get_data data (k + j) menhir-20200123/lib/RowDisplacement.mli000066400000000000000000000050451361226111300176000ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This module compresses a two-dimensional table, where some values are considered insignificant, via row displacement. *) (* A compressed table is represented as a pair of arrays. The displacement array is an array of offsets into the data array. *) type 'a table = int array * (* displacement *) 'a array (* data *) (* [compress equal insignificant dummy m n t] turns the two-dimensional table [t] into a compressed table. The parameter [equal] is equality of data values. 
The parameter [wildcard] tells which data values are insignificant, and can thus be overwritten with other values. The parameter [dummy] is used to fill holes in the data array. [m] and [n] are the integer dimensions of the table [t]. *) val compress: ('a -> 'a -> bool) -> ('a -> bool) -> 'a -> int -> int -> 'a array array -> 'a table (* [get ct i j] returns the value found at indices [i] and [j] in the compressed table [ct]. This function call is permitted only if the value found at indices [i] and [j] in the original table is significant -- otherwise, it could fail abruptly. *) (* Together, [compress] and [get] have the property that, if the value found at indices [i] and [j] in an uncompressed table [t] is significant, then [get (compress t) i j] is equal to that value. *) val get: 'a table -> int -> int -> 'a (* [getget] is a variant of [get] which only requires read access, via accessors, to the two components of the table. *) val getget: ('displacement -> int -> int) -> ('data -> int -> 'a) -> 'displacement * 'data -> int -> int -> 'a menhir-20200123/lib/TableFormat.ml000066400000000000000000000140271361226111300165270ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This signature defines the format of the parse tables. It is used as an argument to [TableInterpreter.Make]. *) module type TABLES = sig (* This is the parser's type of tokens. *) type token (* This maps a token to its internal (generation-time) integer code. 
*) val token2terminal: token -> int (* This is the integer code for the error pseudo-token. *) val error_terminal: int (* This maps a token to its semantic value. *) val token2value: token -> Obj.t (* Traditionally, an LR automaton is described by two tables, namely, an action table and a goto table. See, for instance, the Dragon book. The action table is a two-dimensional matrix that maps a state and a lookahead token to an action. An action is one of: shift to a certain state, reduce a certain production, accept, or fail. The goto table is a two-dimensional matrix that maps a state and a non-terminal symbol to either a state or undefined. By construction, this table is sparse: its undefined entries are never looked up. A compression technique is free to overlap them with other entries. In Menhir, things are slightly different. If a state has a default reduction on token [#], then that reduction must be performed without consulting the lookahead token. As a result, we must first determine whether that is the case, before we can obtain a lookahead token and use it as an index in the action table. Thus, Menhir's tables are as follows. A one-dimensional default reduction table maps a state to either ``no default reduction'' (encoded as: 0) or ``by default, reduce prod'' (encoded as: 1 + prod). The action table is looked up only when there is no default reduction. *) val default_reduction: PackedIntArray.t (* Menhir follows Dencker, Dürre and Heuft, who point out that, although the action table is not sparse by nature (i.e., the error entries are significant), it can be made sparse by first factoring out a binary error matrix, then replacing the error entries in the action table with undefined entries. Thus: A two-dimensional error bitmap maps a state and a terminal to either ``fail'' (encoded as: 0) or ``do not fail'' (encoded as: 1). The action table, which is now sparse, is looked up only in the latter case. 
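The default-reduction convention described above (0 means "no default reduction"; 1 + prod means "by default, reduce prod") can be sketched as a standalone decoder; this is an illustration, not library code:

```ocaml
(* Decode one entry of the default-reduction table. *)
let decode_default_reduction (code : int) : int option =
  if code = 0 then
    None             (* no default reduction *)
  else
    Some (code - 1)  (* by default, reduce production [code - 1] *)

let () =
  assert (decode_default_reduction 0 = None);
  assert (decode_default_reduction 5 = Some 4)
```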
*) (* The error bitmap is flattened into a one-dimensional table; its width is recorded so as to allow indexing. The table is then compressed via [PackedIntArray]. The bit width of the resulting packed array must be [1], so it is not explicitly recorded. *) (* The error bitmap does not contain a column for the [#] pseudo-terminal. Thus, its width is [Terminal.n - 1]. We exploit the fact that the integer code assigned to [#] is greatest: the fact that the right-most column in the bitmap is missing does not affect the code for accessing it. *) val error: int (* width of the bitmap *) * string (* second component of [PackedIntArray.t] *) (* A two-dimensional action table maps a state and a terminal to one of ``shift to state s and discard the current token'' (encoded as: s | 10), ``shift to state s without discarding the current token'' (encoded as: s | 11), or ``reduce prod'' (encoded as: prod | 01). *) (* The action table is first compressed via [RowDisplacement], then packed via [PackedIntArray]. *) (* Like the error bitmap, the action table does not contain a column for the [#] pseudo-terminal. *) val action: PackedIntArray.t * PackedIntArray.t (* A one-dimensional lhs table maps a production to its left-hand side (a non-terminal symbol). *) val lhs: PackedIntArray.t (* A two-dimensional goto table maps a state and a non-terminal symbol to either undefined (encoded as: 0) or a new state s (encoded as: 1 + s). *) (* The goto table is first compressed via [RowDisplacement], then packed via [PackedIntArray]. *) val goto: PackedIntArray.t * PackedIntArray.t (* The number of start productions. A production [prod] is a start production if and only if [prod < start] holds. This is also the number of start symbols. A nonterminal symbol [nt] is a start symbol if and only if [nt < start] holds. *) val start: int (* A one-dimensional semantic action table maps productions to semantic actions. The calling convention for semantic actions is described in [EngineTypes]. 
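The 2-bit action encoding described above can be sketched as a standalone decoder (an illustration; the real decoding is performed inline by the engine, after unpacking and row displacement):

```ocaml
(* An action word is [param lsl 2 lor opcode], where the low two bits
   select the kind of action. *)
type action =
  | ShiftDiscard of int    (* opcode 0b10: shift to state, discard token *)
  | ShiftNoDiscard of int  (* opcode 0b11: shift to state, keep token *)
  | Reduce of int          (* opcode 0b01: reduce production *)

let decode_action (word : int) : action =
  let opcode = word land 0b11 and param = word lsr 2 in
  match opcode with
  | 0b10 -> ShiftDiscard param
  | 0b11 -> ShiftNoDiscard param
  | 0b01 -> Reduce param
  | _ -> assert false (* 0b00 cannot appear in a valid table *)

let () =
  assert (decode_action ((7 lsl 2) lor 0b10) = ShiftDiscard 7)
```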
This table contains ONLY NON-START PRODUCTIONS, so the indexing is off by [start]. Be careful. *) val semantic_action: ((int, Obj.t, token) EngineTypes.env -> (int, Obj.t) EngineTypes.stack) array (* The parser defines its own [Error] exception. This exception can be raised by semantic actions and caught by the engine, and raised by the engine towards the final user. *) exception Error (* The parser indicates whether to generate a trace. Generating a trace requires two extra tables, which respectively map a terminal symbol and a production to a string. *) val trace: (string array * string array) option end menhir-20200123/lib/TableInterpreter.ml000066400000000000000000000161541361226111300176050ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) module MakeEngineTable (T : TableFormat.TABLES) = struct type state = int let number s = s type token = T.token type terminal = int type nonterminal = int type semantic_value = Obj.t let token2terminal = T.token2terminal let token2value = T.token2value let error_terminal = T.error_terminal let error_value = Obj.repr () (* The function [foreach_terminal] exploits the fact that the first component of [T.error] is [Terminal.n - 1], i.e., the number of terminal symbols, including [error] but not [#]. *) (* There is similar code in [InspectionTableInterpreter]. The code there contains an additional conversion of the type [terminal] to the type [xsymbol]. 
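The [foldij] combinator defined just below folds over a half-open integer interval; here is a standalone copy with two small uses, for illustration:

```ocaml
(* [foldij i j f accu] folds [f] over the interval [i..j), left to right. *)
let rec foldij i j f accu =
  if i = j then accu else foldij (i + 1) j f (f i accu)

let () =
  (* Summing 0 + 1 + 2 + 3: *)
  assert (foldij 0 4 (fun i accu -> accu + i) 0 = 6);
  (* Collecting the indices; note the left-to-right order of visits: *)
  assert (foldij 0 3 (fun i accu -> i :: accu) [] = [2; 1; 0])
```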
*) let rec foldij i j f accu = if i = j then accu else foldij (i + 1) j f (f i accu) let foreach_terminal f accu = let n, _ = T.error in foldij 0 n (fun i accu -> f i accu ) accu type production = int (* In principle, only non-start productions are exposed to the user, at type [production] or at type [int]. This is checked dynamically. *) let non_start_production i = assert (T.start <= i && i - T.start < Array.length T.semantic_action) let production_index i = non_start_production i; i let find_production i = non_start_production i; i let default_reduction state defred nodefred env = let code = PackedIntArray.get T.default_reduction state in if code = 0 then nodefred env else defred env (code - 1) let is_start prod = prod < T.start (* This auxiliary function helps access a compressed, two-dimensional matrix, like the action and goto tables. *) let unmarshal2 table i j = RowDisplacement.getget PackedIntArray.get PackedIntArray.get table i j let action state terminal value shift reduce fail env = match PackedIntArray.unflatten1 T.error state terminal with | 1 -> let action = unmarshal2 T.action state terminal in let opcode = action land 0b11 and param = action lsr 2 in if opcode >= 0b10 then (* 0b10 : shift/discard *) (* 0b11 : shift/nodiscard *) let please_discard = (opcode = 0b10) in shift env please_discard terminal value param else (* 0b01 : reduce *) (* 0b00 : cannot happen *) reduce env param | c -> assert (c = 0); fail env let goto_nt state nt = let code = unmarshal2 T.goto state nt in (* code = 1 + state *) code - 1 let goto_prod state prod = goto_nt state (PackedIntArray.get T.lhs prod) let maybe_goto_nt state nt = let code = unmarshal2 T.goto state nt in (* If [code] is 0, there is no outgoing transition. If [code] is [1 + state], there is a transition towards [state]. 
*) assert (0 <= code); if code = 0 then None else Some (code - 1) exception Error = T.Error type semantic_action = (state, semantic_value, token) EngineTypes.env -> (state, semantic_value) EngineTypes.stack let semantic_action prod = (* Indexing into the array [T.semantic_action] is off by [T.start], because the start productions do not have entries in this array. *) T.semantic_action.(prod - T.start) (* [may_reduce state prod] tests whether the state [state] is capable of reducing the production [prod]. This information could be determined in constant time if we were willing to create a bitmap for it, but that would take up a lot of space. Instead, we obtain this information by iterating over a line in the action table. This is costly, but this function is not normally used by the LR engine anyway; it is supposed to be used only by programmers who wish to develop error recovery strategies. *) (* In the future, if desired, we could memoize this function, so as to pay the cost in (memory) space only if and where this function is actually used. We could also replace [foreach_terminal] with a function [exists_terminal] which stops as soon as the accumulator is [true]. *) let may_reduce state prod = (* Test if there is a default reduction of [prod]. *) default_reduction state (fun () prod' -> prod = prod') (fun () -> (* If not, then for each terminal [t], ... *) foreach_terminal (fun t accu -> accu || (* ... test if there is a reduction of [prod] on [t]. *) action state t () (* shift: *) (fun () _ _ () _ -> false) (* reduce: *) (fun () prod' -> prod = prod') (* fail: *) (fun () -> false) () ) false ) () (* If [T.trace] is [None], then the logging functions do nothing. *) let log = match T.trace with Some _ -> true | None -> false module Log = struct open Printf let state state = match T.trace with | Some _ -> fprintf stderr "State %d:\n%!" 
state | None -> () let shift terminal state = match T.trace with | Some (terminals, _) -> fprintf stderr "Shifting (%s) to state %d\n%!" terminals.(terminal) state | None -> () let reduce_or_accept prod = match T.trace with | Some (_, productions) -> fprintf stderr "%s\n%!" productions.(prod) | None -> () let lookahead_token token startp endp = match T.trace with | Some (terminals, _) -> fprintf stderr "Lookahead token is now %s (%d-%d)\n%!" terminals.(token) startp.Lexing.pos_cnum endp.Lexing.pos_cnum | None -> () let initiating_error_handling () = match T.trace with | Some _ -> fprintf stderr "Initiating error handling\n%!" | None -> () let resuming_error_handling () = match T.trace with | Some _ -> fprintf stderr "Resuming error handling\n%!" | None -> () let handling_error state = match T.trace with | Some _ -> fprintf stderr "Handling error in state %d\n%!" state | None -> () end end menhir-20200123/lib/TableInterpreter.mli000066400000000000000000000032531361226111300177520ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU Library General Public License version 2, with a *) (* special exception on linking, as described in the file LICENSE. *) (* *) (******************************************************************************) (* This module provides a thin decoding layer for the generated tables, thus providing an API that is suitable for use by [Engine.Make]. It is part of [MenhirLib]. *) (* The exception [Error] is declared within the generated parser. This is preferable to pre-declaring it here, as it ensures that each parser gets its own, distinct [Error] exception. This is consistent with the code-based back-end. *) (* This functor is invoked by the generated parser. 
*) module MakeEngineTable (T : TableFormat.TABLES) : EngineTypes.TABLE with type state = int and type token = T.token and type semantic_value = Obj.t and type production = int and type terminal = int and type nonterminal = int menhir-20200123/lib/dune000066400000000000000000000015121361226111300146460ustar00rootroot00000000000000;; Note: the library MenhirLib is not built here; it is built in lib/pack. ;; These rules generate the module [StaticVersion]. This module defines a ;; value of type [unit] whose name is [require_XXXXXXXX], where [XXXXXXXX] ;; is our 8-digit version number. This number is set in the [dune-project] ;; file. ;; When the [--table] switch is passed, Menhir produces a reference to ;; [MenhirLib.StaticVersion.require_XXXXXXXX] in the generated code. This ;; ensures that the generated code can be linked only with an appropriate ;; version of MenhirLib. This is important because we use unsafe casts: a ;; version mismatch could cause a crash. (rule (with-stdout-to StaticVersion.mli (echo "val require_%{version:menhir}: unit\n") ) ) (rule (with-stdout-to StaticVersion.ml (echo "let require_%{version:menhir} = ()\n") ) ) menhir-20200123/lib/pack/000077500000000000000000000000001361226111300147075ustar00rootroot00000000000000menhir-20200123/lib/pack/dune000066400000000000000000000010001361226111300155540ustar00rootroot00000000000000;; The helper script pack.ml generates menhirLib.{ml,mli} ;; by concatenating the modules listed in menhirLib.mlpack. (executable (name pack) (modules pack) ) (rule (targets menhirLib.ml menhirLib.mli) (deps (glob_files ../*.{ml,mli}) menhirLib.mlpack) (action (run ./pack.exe)) ) ;; We can then compile MenhirLib from menhirLib.{ml,mli} ;; in this directory. 
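For concreteness, here is a sketch of what the generated StaticVersion.ml looks like for a hypothetical 8-digit version number 20200123 (the number is made up for the example; the actual number is substituted by dune from the [dune-project] file):

```ocaml
(* Sketch of a generated lib/StaticVersion.ml: *)
let require_20200123 = ()

(* A table-based parser generated by this hypothetical version would
   contain a reference like the following, so that linking against a
   MenhirLib of any other version fails: *)
let () = require_20200123
```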
(library (name menhirLib) (public_name menhirLib) (synopsis "Runtime support for code generated by Menhir") (modules menhirLib) ) menhir-20200123/lib/pack/menhirLib.mlpack000066400000000000000000000006211361226111300200100ustar00rootroot00000000000000# This is the list of modules that must go into MenhirLib. # They must be listed in dependency order, as this list is # used to construct menhirLib.ml at installation time. General Convert IncrementalEngine EngineTypes Engine ErrorReports Printers InfiniteArray PackedIntArray RowDisplacement LinearizedArray TableFormat InspectionTableFormat InspectionTableInterpreter TableInterpreter StaticVersion menhir-20200123/lib/pack/pack.ml000066400000000000000000000052731361226111300161660ustar00rootroot00000000000000(* This script finds the names of the modules in MenhirLib by reading the file menhirLib.mlpack. It then finds the source files for these modules in the parent directory (lib/), and concatenates them to create menhirLib.{ml,mli} in the current directory (lib/pack). *) (* ------------------------------------------------------------------------- *) (* [up fn] is [../fn]. *) let up fn = Filename.concat Filename.parent_dir_name fn (* ------------------------------------------------------------------------- *) (* [cat_file oc fn] prints the content of the file [fn] on the channel [oc]. *) let cat_file oc fn = let ic = open_in fn in let rec loop () = match input_line ic with | s -> output_string oc s; output_char oc '\n'; loop () | exception End_of_file -> () in loop (); close_in ic (* ------------------------------------------------------------------------- *) (* The names of the modules in MenhirLib are obtained by reading the non-comment lines in the file menhirLib.mlpack. 
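The filtering of menhirLib.mlpack performed by the loop below can be sketched on a list of lines; this standalone version (an illustration, not the script itself) keeps exactly the trimmed, non-empty, non-comment lines:

```ocaml
(* Keep the non-empty, non-comment lines of an .mlpack-style listing. *)
let modules_of_lines (lines : string list) : string list =
  lines
  |> List.map String.trim
  |> List.filter (fun s -> s <> "" && s.[0] <> '#')

let () =
  assert (modules_of_lines ["# a comment"; ""; "General "; "Convert"]
          = ["General"; "Convert"])
```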
*) let menhirLib_modules : string list = let ic = open_in "menhirLib.mlpack" in let rec loop accu = match input_line ic with | exception End_of_file -> List.rev accu | s -> let s = String.trim s in let accu = if s <> "" && s.[0] <> '#' then s :: accu else accu in loop accu in let r = loop [] in close_in ic; r (* ------------------------------------------------------------------------- *) (* The source file menhirLib.ml is created by concatenating all of the source files that make up MenhirLib. This file is not needed to compile Menhir or MenhirLib. It is installed at the same time as MenhirLib and is copied by Menhir when the user requests a self-contained parser (one that is not dependent on MenhirLib). *) let () = print_endline "Creating menhirLib.ml..."; let oc = open_out "menhirLib.ml" in List.iter (fun m -> Printf.fprintf oc "module %s = struct\n" m; cat_file oc (up (m ^ ".ml")); Printf.fprintf oc "end\n" ) menhirLib_modules; close_out oc (* The source file menhirLib.mli is created in the same way. If a module does not have an .mli file, then we assume that its .ml file contains type (and module type) definitions only, so we copy it instead of the (non-existent) .mli file. *) let () = print_endline "Creating menhirLib.mli..."; let oc = open_out "menhirLib.mli" in List.iter (fun m -> Printf.fprintf oc "module %s : sig\n" m; if Sys.file_exists (up (m ^ ".mli")) then cat_file oc (up (m ^ ".mli")) else cat_file oc (up (m ^ ".ml")); Printf.fprintf oc "end\n" ) menhirLib_modules; close_out oc menhir-20200123/sdk/000077500000000000000000000000001361226111300140045ustar00rootroot00000000000000menhir-20200123/sdk/cmly_api.ml000066400000000000000000000115541361226111300161410ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The following signatures describe the API offered by the functor [Cfmly_read.Read]. This functor reads in a .cmly file and gives access to the description of the grammar and automaton contained in this file. *) (* This API is currently entirely self-contained, except for a reference to the module [Keyword], which is also part of [MenhirSdk]. *) (* The module type [INDEXED] describes a type [t] whose elements are in a bijection with an integer interval of the form [0..count). *) module type INDEXED = sig type t val count : int val of_int : int -> t val to_int : t -> int val iter : (t -> unit) -> unit val fold : (t -> 'a -> 'a) -> 'a -> 'a val tabulate : (t -> 'a) -> t -> 'a end (* The module type [GRAMMAR] describes the grammar and automaton. *) module type GRAMMAR = sig type terminal = private int type nonterminal = private int type production = private int type lr0 = private int type lr1 = private int type item = production * int type ocamltype = string type ocamlexpr = string module Range : sig type t val startp: t -> Lexing.position val endp: t -> Lexing.position end module Attribute : sig type t val label : t -> string val has_label : string -> t -> bool val payload : t -> string val position : t -> Range.t end module Grammar : sig val basename : string val preludes : string list val postludes : string list val parameters : string list val entry_points : (nonterminal * production * lr1) list val attributes : Attribute.t list end module Terminal : sig include INDEXED with type t = terminal val name : t -> string val kind : t -> [`REGULAR | `ERROR | `EOF | `PSEUDO] val typ : t -> ocamltype option val attributes : t -> Attribute.t list end module Nonterminal : sig include INDEXED with type t = nonterminal val name : t -> string val 
mangled_name : t -> string val kind : t -> [`REGULAR | `START] val typ : t -> ocamltype option val positions : t -> Range.t list val nullable : t -> bool val first : t -> terminal list val attributes : t -> Attribute.t list end type symbol = | T of terminal | N of nonterminal val symbol_name : ?mangled:bool -> symbol -> string type identifier = string module Action : sig type t val expr : t -> ocamlexpr val keywords : t -> Keyword.keyword list end module Production : sig include INDEXED with type t = production val kind : t -> [`REGULAR | `START] val lhs : t -> nonterminal val rhs : t -> (symbol * identifier * Attribute.t list) array val positions : t -> Range.t list val action : t -> Action.t option val attributes : t -> Attribute.t list end module Lr0 : sig include INDEXED with type t = lr0 val incoming : t -> symbol option val items : t -> item list end module Lr1 : sig include INDEXED with type t = lr1 val lr0 : t -> lr0 val transitions : t -> (symbol * t) list val reductions : t -> (terminal * production list) list end module Print : sig open Format val terminal : formatter -> terminal -> unit val nonterminal : formatter -> nonterminal -> unit val symbol : formatter -> symbol -> unit val mangled_nonterminal : formatter -> nonterminal -> unit val mangled_symbol : formatter -> symbol -> unit val production : formatter -> production -> unit val item : formatter -> item -> unit val itemset : formatter -> item list -> unit val annot_item : string list -> formatter -> item -> unit val annot_itemset : string list list -> formatter -> item list -> unit end end menhir-20200123/sdk/cmly_format.ml000066400000000000000000000064151361226111300166600ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module defines the data that is stored in .cmly files. In short, a .cmly file contains a value of type [grammar], defined below. *) (* The type definitions in this module are used by [Cmly_write], which writes a .cmly file, and by [Cmly_read], which reads a .cmly file. They should not be used anywhere else. *) (* All entities (terminal symbols, nonterminal symbols, and so on) are represented as integers. These integers serve as indices into arrays. This enables simple and efficient hashing, comparison, indexing, etc. *) type terminal = int type nonterminal = int type production = int type lr0 = int type lr1 = int type ocamltype = string type ocamlexpr = string type range = { r_start: Lexing.position; r_end: Lexing.position; } type attribute = { a_label: string; a_payload: string; a_position: range; } type attributes = attribute list type terminal_def = { t_name: string; t_kind: [`REGULAR | `ERROR | `EOF | `PSEUDO]; t_type: ocamltype option; t_attributes: attributes; } type nonterminal_def = { n_name: string; n_kind: [`REGULAR | `START]; n_mangled_name: string; n_type: ocamltype option; n_positions: range list; n_nullable: bool; n_first: terminal list; n_attributes: attributes; } type symbol = | T of terminal | N of nonterminal type identifier = string type action = { a_expr: ocamlexpr; a_keywords: Keyword.keyword list; } type producer_def = symbol * identifier * attributes type production_def = { p_kind: [`REGULAR | `START]; p_lhs: nonterminal; p_rhs: producer_def array; p_positions: range list; p_action: action option; p_attributes: attributes; } type lr0_state_def = { lr0_incoming: symbol option; lr0_items: (production * int) list; } type lr1_state_def = { lr1_lr0: lr0; lr1_transitions: (symbol * lr1) list; lr1_reductions: (terminal * 
production list) list; } type grammar = { g_basename : string; g_preludes : string list; g_postludes : string list; g_terminals : terminal_def array; g_nonterminals : nonterminal_def array; g_productions : production_def array; g_lr0_states : lr0_state_def array; g_lr1_states : lr1_state_def array; g_entry_points : (nonterminal * production * lr1) list; g_attributes : attributes; g_parameters : string list; } menhir-20200123/sdk/cmly_read.ml000066400000000000000000000217641361226111300163070ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Cmly_format open Cmly_api (* ------------------------------------------------------------------------ *) (* Reading a .cmly file. *) exception Error of string let read (ic : in_channel) : grammar = (* .cmly file format: CMLY ++ version string ++ grammar *) let magic = "CMLY" ^ Version.version in try let m = really_input_string ic (String.length magic) in if m <> magic then raise (Error (Printf.sprintf "Invalid magic string in .cmly file.\n\ Expecting %S, but got %S." magic m)) else (input_value ic : grammar) with | End_of_file (* [really_input_string], [input_value] *) | Failure _ -> (* [input_value] *) raise (Error (Printf.sprintf "Invalid or damaged .cmly file.")) let read (filename : string) : grammar = let ic = open_in_bin filename in match read ic with | x -> close_in_noerr ic; x | exception exn -> close_in_noerr ic; raise exn (* ------------------------------------------------------------------------ *) (* Packaging the interval [0..count) as a module of type [INDEXED]. 
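The magic-string framing checked by [read] above can be illustrated in isolation. In this sketch the version string "20200123" is made up for the example; the real magic string is "CMLY" followed by [Version.version]:

```ocaml
(* A .cmly file begins with the magic string, then the marshalled grammar. *)
let magic = "CMLY" ^ "20200123"

(* Does [contents] start with the expected magic string? *)
let has_magic (contents : string) : bool =
  String.length contents >= String.length magic
  && String.sub contents 0 (String.length magic) = magic

let () =
  assert (has_magic (magic ^ "<marshalled grammar>"));
  (* A file written by another version is rejected: *)
  assert (not (has_magic "CMLY19990101..."))
```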
*) module Index (P : sig val name: string (* for error messages only *) val count: int end) : INDEXED with type t = int = struct type t = int let count = P.count let of_int n = if 0 <= n && n < count then n else invalid_arg (P.name ^ ".of_int: index out of bounds") let to_int n = n let iter f = for i = 0 to count - 1 do f i done let fold f x = let r = ref x in for i = 0 to count - 1 do r := f i !r done; !r let tabulate f = let a = Array.init count f in Array.get a end (* ------------------------------------------------------------------------ *) (* Packaging a data structure of type [Cmly_format.grammar] as a module of type [Cmly_api.GRAMMAR]. *) module Make (G : sig val grammar : grammar end) : GRAMMAR = struct open G type terminal = int type nonterminal = int type production = int type lr0 = int type lr1 = int type item = production * int type ocamltype = string type ocamlexpr = string module Range = struct type t = Cmly_format.range let startp range = range.r_start let endp range = range.r_end end module Attribute = struct type t = Cmly_format.attribute let label attr = attr.a_label let has_label label attr = label = attr.a_label let payload attr = attr.a_payload let position attr = attr.a_position end module Grammar = struct let basename = grammar.g_basename let preludes = grammar.g_preludes let postludes = grammar.g_postludes let entry_points = grammar.g_entry_points let attributes = grammar.g_attributes let parameters = grammar.g_parameters end module Terminal = struct let table = grammar.g_terminals let name i = table.(i).t_name let kind i = table.(i).t_kind let typ i = table.(i).t_type let attributes i = table.(i).t_attributes include Index(struct let name = "Terminal" let count = Array.length table end) end module Nonterminal = struct let table = grammar.g_nonterminals let name i = table.(i).n_name let mangled_name i = table.(i).n_mangled_name let kind i = table.(i).n_kind let typ i = table.(i).n_type let positions i = table.(i).n_positions let nullable i 
= table.(i).n_nullable let first i = table.(i).n_first let attributes i = table.(i).n_attributes include Index(struct let name = "Nonterminal" let count = Array.length table end) end type symbol = Cmly_format.symbol = | T of terminal | N of nonterminal let symbol_name ?(mangled=false) = function | T t -> Terminal.name t | N n -> if mangled then Nonterminal.mangled_name n else Nonterminal.name n type identifier = string module Action = struct type t = action let expr t = t.a_expr let keywords t = t.a_keywords end module Production = struct let table = grammar.g_productions let kind i = table.(i).p_kind let lhs i = table.(i).p_lhs let rhs i = table.(i).p_rhs let positions i = table.(i).p_positions let action i = table.(i).p_action let attributes i = table.(i).p_attributes include Index(struct let name = "Production" let count = Array.length table end) end module Lr0 = struct let table = grammar.g_lr0_states let incoming i = table.(i).lr0_incoming let items i = table.(i).lr0_items include Index(struct let name = "Lr0" let count = Array.length table end) end module Lr1 = struct let table = grammar.g_lr1_states let lr0 i = table.(i).lr1_lr0 let transitions i = table.(i).lr1_transitions let reductions i = table.(i).lr1_reductions include Index(struct let name = "Lr1" let count = Array.length table end) end module Print = struct let terminal ppf t = Format.pp_print_string ppf (Terminal.name t) let nonterminal ppf t = Format.pp_print_string ppf (Nonterminal.name t) let symbol ppf = function | T t -> terminal ppf t | N n -> nonterminal ppf n let mangled_nonterminal ppf t = Format.pp_print_string ppf (Nonterminal.name t) let mangled_symbol ppf = function | T t -> terminal ppf t | N n -> mangled_nonterminal ppf n let rec lengths l acc = function | [] -> if l = -1 then [] else l :: lengths (-1) [] acc | [] :: rows -> lengths l acc rows | (col :: cols) :: rows -> lengths (max l (String.length col)) (cols :: acc) rows let rec adjust_length lengths cols = match lengths, cols with 
| l :: ls, c :: cs -> let pad = l - String.length c in let c = if pad = 0 then c else c ^ String.make pad ' ' in c :: adjust_length ls cs | _, [] -> [] | [], _ -> assert false let align_tabular rows = let lengths = lengths (-1) [] rows in List.map (adjust_length lengths) rows let print_line ppf = function | [] -> () | x :: xs -> Format.fprintf ppf "%s" x; List.iter (Format.fprintf ppf " %s") xs let print_table ppf table = let table = align_tabular table in List.iter (Format.fprintf ppf "%a\n" print_line) table let annot_itemset annots ppf items = let last_lhs = ref (-1) in let prepare (p, pos) annot = let rhs = Array.map (fun (sym, id, _) -> if id <> "" && id.[0] <> '_' then "(" ^ id ^ " = " ^ symbol_name sym ^ ")" else symbol_name sym ) (Production.rhs p) in if pos >= 0 && pos < Array.length rhs then rhs.(pos) <- ". " ^ rhs.(pos) else if pos > 0 && pos = Array.length rhs then rhs.(pos - 1) <- rhs.(pos - 1) ^ " ."; let lhs = Production.lhs p in let rhs = Array.to_list rhs in let rhs = if !last_lhs = lhs then "" :: " |" :: rhs else begin last_lhs := lhs; Nonterminal.name lhs :: "::=" :: rhs end in if annot = [] then [rhs] else [rhs; ("" :: "" :: annot)] in let rec prepare_all xs ys = match xs, ys with | [], _ -> [] | (x :: xs), (y :: ys) -> let z = prepare x y in z :: prepare_all xs ys | (x :: xs), [] -> let z = prepare x [] in z :: prepare_all xs [] in print_table ppf (List.concat (prepare_all items annots)) let itemset ppf t = annot_itemset [] ppf t let annot_item annot ppf item = annot_itemset [annot] ppf [item] let item ppf t = annot_item [] ppf t let production ppf t = item ppf (t, -1) end end module Read (X : sig val filename : string end) = Make (struct let grammar = read X.filename end) menhir-20200123/sdk/cmly_read.mli000066400000000000000000000024461361226111300164540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, 
PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The functor [Read] reads a .cmly file. If the file is unreadable, the exception [Error] is raised. Otherwise, the functor builds a module of type [Cmly_api.GRAMMAR], which gives access to a description of the grammar and automaton. *) exception Error of string module Read (X : sig val filename : string end) : Cmly_api.GRAMMAR menhir-20200123/sdk/dune000066400000000000000000000010361361226111300146620ustar00rootroot00000000000000;; The library MenhirSdk is built here. ;; This rule generates the module [Version]. This module defines the value ;; [version] of type [string]. Its value is a string representation of our ;; 8-digit version number [XXXXXXXX]. This number is set in the [dune-project] ;; file. (rule (with-stdout-to version.ml (echo "let version = \"%{version:menhir}\"\n") ) ) ;; Compile MenhirSdk in this directory. (library (name menhirSdk) (public_name menhirSdk) (synopsis "Toolkit for postprocessing Menhir automata (.cmly files)") ) menhir-20200123/sdk/keyword.ml000066400000000000000000000063521361226111300160300ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides some type and function definitions that help deal with the keywords that we recognize within semantic actions. 
*) (* ------------------------------------------------------------------------- *) (* Types. *) (* The user can request position information either at type [int] (a simple offset) or at type [Lexing.position]. *) type flavor = | FlavorOffset | FlavorPosition | FlavorLocation (* The user can request position information about the $start or $end of a symbol. Also, $symbolstart requests the computation of the start position of the first nonempty element in a production. *) type where = | WhereSymbolStart | WhereStart | WhereEnd (* The user can request position information about a production's left-hand side or about one of the symbols in its right-hand side, which he can refer to by position or by name. *) type subject = | Before | Left | RightNamed of string (* Keywords inside semantic actions. They allow access to semantic values or to position information. *) type keyword = | Position of subject * where * flavor | SyntaxError (* ------------------------------------------------------------------------- *) (* These auxiliary functions help map a [Position] keyword to the name of the variable that the keyword is replaced with. *) let where = function | WhereSymbolStart -> "symbolstart" | WhereStart -> "start" | WhereEnd -> "end" let subject = function | Before -> "__0_" | Left -> "" | RightNamed id -> Printf.sprintf "_%s_" id let flavor = function | FlavorPosition -> "pos" | FlavorOffset -> "ofs" | FlavorLocation -> "loc" let posvar s w f = match w, f with | _, (FlavorOffset | FlavorPosition) -> Printf.sprintf "_%s%s%s" (where w) (flavor f) (subject s) | WhereSymbolStart, FlavorLocation -> "_sloc" | WhereStart, FlavorLocation -> Printf.sprintf "_loc%s" (subject s) | _ -> assert false (* ------------------------------------------------------------------------- *) (* Sets of keywords. 
*) module KeywordSet = struct include Set.Make (struct type t = keyword let compare = compare end) let map f keywords = fold (fun keyword accu -> add (f keyword) accu ) keywords empty end menhir-20200123/sdk/keyword.mli000066400000000000000000000062371361226111300162030ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides some type and function definitions that help deal with the keywords that we recognize within semantic actions. *) (* The user can request position information either at several types: - a simple offset of type [int], e.g., via $startofs; - a position of type [Lexing.position], e.g., via $startpos; - a location, e.g., via $loc. A location is currently represented as a pair of positions, but this might change in the future; we may allow the user to choose a custom type of locations. *) type flavor = | FlavorOffset | FlavorPosition | FlavorLocation (* The user can request position information about the $start or $end of a symbol. Also, $symbolstart requests the computation of the start position of the first nonempty element in a production. *) type where = | WhereSymbolStart | WhereStart | WhereEnd (* The user can request position information about a production's left-hand side or about one of the symbols in its right-hand side, which he must refer to by name. (Referring to its symbol by its position, using [$i], is permitted in the concrete syntax, but the lexer eliminates this form.) We add a new subject, [Before], which corresponds to [$endpos($0)] in concrete syntax. 
We adopt the (slightly awkward) convention that when the subject is [Before], the [where] component must be [WhereEnd]. If [flavor] is [FlavorLocation], then [where] must be [WhereSymbolStart] or [WhereStart]. In the former case, [subject] must be [Left]; this corresponds to $sloc in concrete syntax. In the latter case, [subject] must be [Left] or [RightNamed _]; this corresponds to $loc and $loc(x) in concrete syntax. *) type subject = | Before | Left | RightNamed of string (* Keywords inside semantic actions. They allow access to semantic values or to position information. *) type keyword = | Position of subject * where * flavor | SyntaxError (* This maps a [Position] keyword to the name of the variable that the keyword is replaced with. *) val posvar: subject -> where -> flavor -> string (* Sets of keywords. *) module KeywordSet : sig include Set.S with type elt = keyword val map: (keyword -> keyword) -> t -> t end menhir-20200123/src/000077500000000000000000000000001361226111300140125ustar00rootroot00000000000000menhir-20200123/src/Boolean.ml000066400000000000000000000022231361226111300157220ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The Boolean lattice. 
*) type property = bool let bottom = false let equal (b1 : bool) (b2 : bool) = b1 = b2 let is_maximal b = b let union (b1 : bool) (b2 : bool) = b1 || b2 menhir-20200123/src/Boolean.mli000066400000000000000000000020551361226111300160760ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include Fix.PROPERTY with type property = bool val union: property -> property -> property menhir-20200123/src/CheckSafeParameterizedGrammar.ml000066400000000000000000000156601361226111300222140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let value = Positions.value open Syntax (* This test accepts a parameterized grammar, with the restriction that all parameters must have sort [*]. This implies that the head of every application must be a toplevel nonterminal symbol: it cannot be a formal parameter of the current rule. *) (* -------------------------------------------------------------------------- *) (* This flag causes graph edges to be logged on the standard error channel. 
*) let debug = false (* -------------------------------------------------------------------------- *) (* For syntactic convenience, the code is wrapped in a functor. *) module Run (G : sig val g : grammar end) = struct open G (* -------------------------------------------------------------------------- *) (* We build a graph whose vertices are all formal parameters of all rules. A formal parameter is represented as a pair of a nonterminal symbol and a 0-based integer index (the number of this parameter within this rule). We use OCaml's generic equality and hash functions at this type. *) type formal = symbol * int let formals (nt, rule) : formal list = let arity = List.length rule.pr_parameters in Misc.mapi arity (fun i -> nt, i) let formals : formal array = StringMap.bindings g.p_rules |> List.map formals |> List.concat |> Array.of_list (* -------------------------------------------------------------------------- *) (* The graph edges are as follows. First, for every rule of the following form: F(..., X, ...): # where X is the i-th formal parameter of F ... G(..., X, ...) ... # where X is the j-th actual parameter of G there is a "safe" edge from the formal parameter F/i to the formal G/j. This reflects the fact that there is a flow from F/i to G/j. It is "safe" in the sense that it is not size-increasing: the same parameter X is passed from F to G. Second, for every rule of the following form: F(..., X, ...): # where X is the i-th formal parameter of F ... G(..., H(..., X, ...) , ...) ... # where H(...) is the j-th actual parameter of G there is a "dangerous" edge from the formal parameter F/i to the formal G/j. This reflects the fact that there is a flow from F/i to G/j. This flow is "dangerous" in the sense that it is size-increasing: X is transformed to H(..., X, ...). *) type edge = | Safe | Dangerous let successors_parameter (f : edge -> formal -> unit) x (param : parameter) = match param with | ParameterVar _ -> (* This is not an application. No successors. 
*) () | ParameterApp (sym, params) -> let nt = value sym in (* If [x] occurs in the [i]-th actual parameter of this application, then there is an edge to the formal [nt, i]. Whether it is a safe or dangerous edge depends on whether [x] occurs shallow or deep. *) List.iteri (fun i param -> if Parameters.occurs_shallow x param then f Safe (nt, i) else if Parameters.occurs_deep x param then f Dangerous (nt, i) ) params | ParameterAnonymous _ -> assert false let successors_producer f x ((_, param, _) : producer) = successors_parameter f x param let successors_branch f x (branch : parameterized_branch) = List.iter (successors_producer f x) branch.pr_producers let successors f ((nt, i) : formal) = let rule = try StringMap.find nt g.p_rules with Not_found -> assert false in let x = try List.nth rule.pr_parameters i with Failure _ -> assert false in List.iter (successors_branch f x) rule.pr_branches (* -------------------------------------------------------------------------- *) (* We now have a full description of the graph. *) module G = struct type node = formal let n = Array.length formals let index = Misc.inverse formals let successors f = successors (fun _ target -> f target) let iter f = Array.iter f formals end (* -------------------------------------------------------------------------- *) (* Display the graph. *) let () = if debug then G.iter (fun (x, i) -> successors (fun edge (y, j) -> let kind = match edge with Safe -> "safe" | Dangerous -> "dangerous" in Printf.eprintf "%s/%d ->(%s) %s/%d\n" x i kind y j ) (x, i) ) (* -------------------------------------------------------------------------- *) (* Compute its strongly connected components, ignoring the distinction between safe and dangerous edges. *) module T = Tarjan.Run(G) (* -------------------------------------------------------------------------- *) (* The safety criterion is: no dangerous edge is part of a cycle. 
Indeed, if this criterion is satisfied, then expansion must terminate: only a finite number of well-sorted terms (involving toplevel symbols and applications) can arise. (This sentence is not a proof!) Conversely, if a dangerous edge appears in a cycle, then expansion will not terminate. (That is, unless the dangerous cycle is unreachable. We choose to reject it anyway in that case.) In other words, this criterion is sound and complete. *) (* Checking that no dangerous edge is part of a cycle is done by examining the source and destination of every dangerous edge and ensuring that they lie in distinct components. *) let () = G.iter (fun source -> successors (fun edge target -> match edge with | Safe -> () | Dangerous -> if T.representative source = T.representative target then let (nt, i) = source in Error.error [] "the parameterized nonterminal symbols in this grammar\n\ cannot be expanded away: expansion would not terminate.\n\ The %s formal parameter of \"%s\" grows without bound." (Misc.nth (i + 1)) nt ) source ) end (* of the functor *) (* -------------------------------------------------------------------------- *) (* Re-package the above functor as a function. *) let check g = let module T = Run(struct let g = g end) in () menhir-20200123/src/CheckSafeParameterizedGrammar.mli000066400000000000000000000024621361226111300223610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This test accepts a parameterized grammar, with the restriction that all parameters must have sort [*]. 
Parameters of higher sort must be eliminated prior to running this test: see [SelectiveExpansion]. *) (* This test succeeds if and only if the expansion of this grammar is safe, that is, terminates. *) val check: Syntax.grammar -> unit menhir-20200123/src/Compatibility.ml000066400000000000000000000051351361226111300171610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) module Bytes = struct include Bytes let escaped s = let n = ref 0 in for i = 0 to length s - 1 do n := !n + (match unsafe_get s i with | '\"' | '\\' | '\n' | '\t' | '\r' | '\b' -> 2 | ' ' .. '~' -> 1 | _ -> 4) done; if !n = length s then copy s else begin let s' = create !n in n := 0; for i = 0 to length s - 1 do begin match unsafe_get s i with | ('\"' | '\\') as c -> unsafe_set s' !n '\\'; incr n; unsafe_set s' !n c | '\n' -> unsafe_set s' !n '\\'; incr n; unsafe_set s' !n 'n' | '\t' -> unsafe_set s' !n '\\'; incr n; unsafe_set s' !n 't' | '\r' -> unsafe_set s' !n '\\'; incr n; unsafe_set s' !n 'r' | '\b' -> unsafe_set s' !n '\\'; incr n; unsafe_set s' !n 'b' | (' ' .. '~') as c -> unsafe_set s' !n c | c -> let a = Char.code c in unsafe_set s' !n '\\'; incr n; unsafe_set s' !n (Char.chr (48 + a / 100)); incr n; unsafe_set s' !n (Char.chr (48 + (a / 10) mod 10)); incr n; unsafe_set s' !n (Char.chr (48 + a mod 10)); end; incr n done; s' end end module String = struct open String let escaped s = let rec escape_if_needed s n i = if i >= n then s else match unsafe_get s i with | '\"' | '\\' | '\000'..'\031' | '\127'.. 
'\255' -> Bytes.unsafe_to_string (Bytes.escaped (Bytes.unsafe_of_string s)) | _ -> escape_if_needed s n (i+1) in escape_if_needed s (length s) 0 end menhir-20200123/src/Compatibility.mli000066400000000000000000000026061361226111300173320ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The standard library function [String.escaped] in OCaml 4.02.3 depends on the operating system function [isprint] and therefore can have OS- dependent, locale-dependent behavior. This issue has been fixed in OCaml 4.03. We use a copy of the code found in OCaml 4.03 and higher, so as to avoid this issue. *) module Bytes : sig val escaped: bytes -> bytes end module String : sig val escaped: string -> string end menhir-20200123/src/CompletedNatWitness.ml000066400000000000000000000043271361226111300203060ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) type 'a t = | Finite of int * 'a Seq.seq | Infinity let equal p1 p2 = match p1, p2 with | Finite (i1, _), Finite (i2, _) -> i1 = i2 | Infinity, Infinity -> true | _, _ -> false let bottom = Infinity let epsilon = Finite (0, Seq.empty) let singleton x = Finite (1, Seq.singleton x) let is_maximal p = match p with | Finite (0, _) -> true | _ -> false let min p1 p2 = match p1, p2 with | Finite (i1, _), Finite (i2, _) -> if i1 <= i2 then p1 else p2 | p, Infinity | Infinity, p -> p let min_lazy p1 p2 = match p1 with | Finite (0, _) -> p1 | _ -> min p1 (p2()) let add p1 p2 = match p1, p2 with | Finite (i1, xs1), Finite (i2, xs2) -> Finite (i1 + i2, Seq.append xs1 xs2) | _, _ -> Infinity let add_lazy p1 p2 = match p1 with | Infinity -> Infinity | _ -> add p1 (p2()) let print conv p = match p with | Finite (i, xs) -> Printf.sprintf "(* %d *) " i ^ String.concat " " (List.map conv (Seq.elements xs)) | Infinity -> "infinity" let to_int p = match p with | Finite (i, _) -> i | Infinity -> max_int let extract p = match p with | Finite (_, xs) -> Seq.elements xs | Infinity -> assert false menhir-20200123/src/CompletedNatWitness.mli000066400000000000000000000037651361226111300204640ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is the lattice of the natural numbers, completed with [Infinity], and ordered towards zero (i.e. [Infinity] is [bottom], [Finite 0] is [top]). *) (* These numbers are further enriched with sequences of matching length. 
Thus, a lattice element is either [Finite (n, xs)], where [n] is a natural number and [xs] is a sequence of length [n]; or [Infinity]. The sequences [xs] are ignored by the ordering (e.g., [compare] ignores them) but are nevertheless constructed (e.g., [add] concatenates two sequences). They should be thought of as witnesses, or proofs, that explain why the number [n] was obtained. *) type 'a t = | Finite of int * 'a Seq.seq | Infinity val bottom: 'a t val equal: 'a t -> 'b t -> bool val is_maximal: 'a t -> bool val epsilon: 'a t val singleton: 'a -> 'a t val min: 'a t -> 'a t -> 'a t val add: 'a t -> 'a t -> 'a t val min_lazy: 'a t -> (unit -> 'a t) -> 'a t val add_lazy: 'a t -> (unit -> 'a t) -> 'a t val print: ('a -> string) -> 'a t -> string val to_int: 'a t -> int val extract: 'a t -> 'a list menhir-20200123/src/DFS.ml000066400000000000000000000040101361226111300147530ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) module Run (G : sig type node type label val foreach_outgoing_edge: node -> (label -> node -> unit) -> unit val foreach_root: (node -> unit) -> unit end) (M : sig val mark: G.node -> unit val is_marked: G.node -> bool end) (D : sig val discover: G.node -> unit val traverse: G.node -> G.label -> G.node -> unit end) = struct open G open M open D let rec visit node = if not (is_marked node) then begin mark node; discover node; foreach_outgoing_edge node (fun label target -> traverse node label target; visit target ) end let () = foreach_root visit end module MarkSet (S : Set.S) = struct let marked = ref S.empty let is_marked x = S.mem x !marked let mark x = marked := S.add x !marked let marked () = !marked end module MarkArray (G : sig type node val n: int val number: node -> int end) = struct let marked = Array.make G.n false let is_marked x = marked.(G.number x) let mark x = marked.(G.number x) <- true let marked () = marked end menhir-20200123/src/DFS.mli000066400000000000000000000057571361226111300151470ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A generic implementation of depth-first search. *) (* The graph [G] must be equipped with ways of iterating over the outgoing edges of a node and over the root nodes. Edges can be labeled. If no labels are needed, then the type [label] should be defined as [unit]. *) (* The module [M] must offer a mechanism for marking a node and testing whether a node is marked.
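For instance, under the signatures described above, a toy client might look as follows. This is an illustrative, self-contained sketch (not part of Menhir): it inlines a condensed copy of the [Run] functor together with an array-based mark store in the style of [MarkArray], and checks the discovery order on a small diamond-shaped graph.

```ocaml
(* A condensed copy of the [Run] functor, for illustration. *)
module Run (G : sig
  type node
  type label
  val foreach_outgoing_edge: node -> (label -> node -> unit) -> unit
  val foreach_root: (node -> unit) -> unit
end) (M : sig
  val mark: G.node -> unit
  val is_marked: G.node -> bool
end) (D : sig
  val discover: G.node -> unit
  val traverse: G.node -> G.label -> G.node -> unit
end) = struct
  let rec visit node =
    if not (M.is_marked node) then begin
      M.mark node;
      D.discover node;
      G.foreach_outgoing_edge node (fun label target ->
        D.traverse node label target;
        visit target)
    end
  let () = G.foreach_root visit
end

(* A diamond-shaped graph: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3, unlabeled edges. *)
module G = struct
  type node = int
  type label = unit
  let edges = [| [1; 2]; [3]; [3]; [] |]
  let foreach_outgoing_edge n f = List.iter (f ()) edges.(n)
  let foreach_root f = f 0
end

(* An array-based mark store, as in [MarkArray]. *)
module M = struct
  let marked = Array.make 4 false
  let mark n = marked.(n) <- true
  let is_marked n = marked.(n)
end

let order = ref []

(* Instantiating the functor performs the search as a side effect. *)
module R = Run (G) (M) (struct
  let discover n = order := n :: !order
  let traverse _ _ _ = ()
end)

let () =
  (* Each node is discovered exactly once: node 3 is reachable along two
     paths, yet appears only once in the discovery order. *)
  assert (List.rev !order = [0; 1; 3; 2])
```

Node 3 is discovered only once even though two paths reach it; that is precisely the role of the marking mechanism [M].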
The functors [MarkSet] and [MarkArray] (below) can help implement it. *) (* The function [D.discover] is invoked at most once per node, when this node is newly discovered (after this node has been marked and before its outgoing edges are traversed). The function [D.traverse] is invoked at most once per edge, when this edge is traversed. *) (* The functor application [Run(G)(M)(D)] performs the search. No result is returned. *) module Run (G : sig type node type label val foreach_outgoing_edge: node -> (label -> node -> unit) -> unit val foreach_root: (node -> unit) -> unit end) (M : sig val mark: G.node -> unit val is_marked: G.node -> bool end) (D : sig val discover: G.node -> unit val traverse: G.node -> G.label -> G.node -> unit end) : sig end (* The module [MarkSet(S)] provides a fresh marking mechanism for elements of type [S.elt], where [S] is a set implementation. The functions [mark] and [is_marked] allow marking an element and testing whether an element is marked. The function [marked] returns the set of all marked elements. *) module MarkSet (S : Set.S) : sig val mark: S.elt -> unit val is_marked: S.elt -> bool val marked: unit -> S.t end (* The module [MarkArray(S)] provides a fresh marking mechanism for nodes of type [G.node], where [G] is a graph whose nodes are numbered. The functions [mark] and [is_marked] allow marking a node and testing whether a node is marked. The function [marked] returns an array of marks. *) module MarkArray (G : sig type node val n: int val number: node -> int end) : sig val mark: G.node -> unit val is_marked: G.node -> bool val marked: unit -> bool array end menhir-20200123/src/DependencyGraph.ml000066400000000000000000000042331361226111300174060ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar let print_dependency_graph() = (* Allocate. *) let forward : NonterminalSet.t NonterminalMap.t ref = ref NonterminalMap.empty in let successors nt = try NonterminalMap.find nt !forward with Not_found -> NonterminalSet.empty in (* Populate. *) Production.iter (fun prod -> let nt1 = Production.nt prod and rhs = Production.rhs prod in Array.iter (function | Symbol.T _ -> () | Symbol.N nt2 -> forward := NonterminalMap.add nt1 (NonterminalSet.add nt2 (successors nt1)) !forward ) rhs ); (* Print. *) let module P = Dot.Print (struct type vertex = Nonterminal.t let name nt = Printf.sprintf "nt%d" (Nonterminal.n2i nt) let successors (f : ?style:Dot.style -> label:string -> vertex -> unit) nt = NonterminalSet.iter (fun successor -> f ~label:"" successor ) (successors nt) let iter (f : ?shape:Dot.shape -> ?style:Dot.style -> label:string -> vertex -> unit) = Nonterminal.iter (fun nt -> f ~label:(Nonterminal.print false nt) nt ) end) in let f = open_out (Settings.base ^ ".dot") in P.print f; close_out f menhir-20200123/src/DependencyGraph.mli000066400000000000000000000022741361226111300175620ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Build and print the forward reference graph of the grammar. 
There is an edge from a nonterminal symbol [nt1] to every nonterminal symbol [nt2] that occurs in the definition of [nt1]. *) val print_dependency_graph: unit -> unit menhir-20200123/src/Driver.mli000066400000000000000000000022561361226111300157550ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The module [Driver] serves to offer a unified API to the parser, which could be produced by either ocamlyacc or Menhir. *) val grammar : (Lexing.lexbuf -> Parser.token) -> Lexing.lexbuf -> Syntax.partial_grammar menhir-20200123/src/Drop.ml000066400000000000000000000133211361226111300152500ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let value = Positions.value (* The source. *) module S = Syntax (* The target. *) module T = BasicSyntax (* -------------------------------------------------------------------------- *) (* Most of the translation is straightforward. *) let drop_parameter (param : S.parameter) : S.symbol = match param with | S.ParameterVar sym -> value sym | S.ParameterApp _ -> (* The grammar should not have any parameterized symbols.
*) assert false | S.ParameterAnonymous _ -> assert false let drop_producer ((id, param, attrs) : S.producer) : T.producer = { T.producer_identifier = value id; T.producer_symbol = drop_parameter param; T.producer_attributes = attrs } let drop_branch (branch : S.parameterized_branch) : T.branch = { T.branch_position = branch.S.pr_branch_position; T.producers = List.map drop_producer branch.S.pr_producers; T.action = branch.S.pr_action; T.branch_prec_annotation = branch.S.pr_branch_prec_annotation; T.branch_production_level = branch.S.pr_branch_production_level } let drop_rule (rule : S.parameterized_rule) : T.rule = (* The grammar should not have any parameterized symbols. *) assert (rule.S.pr_parameters = []); (* The [%public] flag is dropped. *) { T.branches = List.map drop_branch rule.S.pr_branches; T.positions = rule.S.pr_positions; T.inline_flag = rule.S.pr_inline_flag; T.attributes = rule.S.pr_attributes; } (* -------------------------------------------------------------------------- *) (* We must store [%type] declarations and [%on_error_reduce] declarations in StringMaps, whereas so far they were represented as lists. *) let drop_declarations (kind : string) (f : 'info1 -> 'info2) (decls : (S.parameter * 'info1) list) : 'info2 StringMap.t = (* Now is as good a time as any to check against multiple declarations concerning a single nonterminal symbol. Indeed, if we did not rule out this situation, then we would have to keep only one (arbitrarily chosen) declaration. To do this, we first build a map of symbols to info *and* position... *) List.fold_left (fun accu (param, info) -> let symbol = drop_parameter param in begin match StringMap.find symbol accu with | exception Not_found -> () | (_, position) -> Error.error [position; Parameters.position param] "there are multiple %s declarations for the symbol %s." kind symbol end; StringMap.add symbol (f info, Parameters.position param) accu ) StringMap.empty decls (* ... then drop the positions. 
*) |> StringMap.map (fun (info, _) -> info) let drop_type_declarations = drop_declarations "%type" value let drop_on_error_reduce_declarations = drop_declarations "%on_error_reduce" (fun x -> x) (* -------------------------------------------------------------------------- *) (* We must eliminate (that is, desugar) [%attribute] declarations. We examine them one by one and attach these attributes with terminal or nonterminal symbols, as appropriate. This is entirely straightforward. *) let add_attribute (g : T.grammar) param attr : T.grammar = let symbol = drop_parameter param in match StringMap.find symbol g.T.tokens with | props -> (* This is a terminal symbol. *) let props = { props with S.tk_attributes = attr :: props.S.tk_attributes } in { g with T.tokens = StringMap.add symbol props g.T.tokens } | exception Not_found -> match StringMap.find symbol g.T.rules with | rule -> (* This is a nonterminal symbol. *) let rule = { rule with T.attributes = attr :: rule.T.attributes } in { g with T.rules = StringMap.add symbol rule g.T.rules } | exception Not_found -> (* This is an unknown symbol. This should not happen. *) assert false let add_attributes g (params, attrs) = List.fold_left (fun g param -> List.fold_left (fun g attr -> add_attribute g param attr ) g attrs ) g params let add_attributes (decls : (S.parameter list * S.attributes) list) g = List.fold_left add_attributes g decls (* -------------------------------------------------------------------------- *) (* Putting it all together. 
*) let drop (g : S.grammar) : T.grammar = { T.preludes = g.S.p_preludes; T.postludes = g.S.p_postludes; T.parameters = g.S.p_parameters; T.start_symbols = StringMap.domain g.S.p_start_symbols; T.types = drop_type_declarations g.S.p_types; T.tokens = g.S.p_tokens; T.on_error_reduce = drop_on_error_reduce_declarations g.S.p_on_error_reduce; T.gr_attributes = g.S.p_grammar_attributes; T.rules = StringMap.map drop_rule g.S.p_rules } |> add_attributes g.S.p_symbol_attributes menhir-20200123/src/Drop.mli000066400000000000000000000023331361226111300154220ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This function translates a grammar from the [Syntax] format to the [BasicSyntax] format. Naturally, the grammar must not have any parameterized symbols, since these are not allowed by the latter format. *) val drop: Syntax.grammar -> BasicSyntax.grammar menhir-20200123/src/Fix.ml000066400000000000000000000443471361226111300151060ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* -------------------------------------------------------------------------- *) (* Maps. 
*) (* We require imperative maps, that is, maps that can be updated in place. An implementation of persistent maps, such as the one offered by ocaml's standard library, can easily be turned into an implementation of imperative maps, so this is a weak requirement. *) module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val clear: 'data t -> unit val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* -------------------------------------------------------------------------- *) (* Properties. *) (* Properties must form a partial order, equipped with a least element, and must satisfy the ascending chain condition: every monotone sequence eventually stabilizes. *) (* [is_maximal] determines whether a property [p] is maximal with respect to the partial order. Only a conservative check is required: in any event, it is permitted for [is_maximal p] to return [false]. If [is_maximal p] returns [true], then [p] must have no upper bound other than itself. In particular, if properties form a lattice, then [p] must be the top element. This feature, not described in the paper, enables a couple of minor optimizations. *) module type PROPERTY = sig type property val bottom: property val equal: property -> property -> bool val is_maximal: property -> bool end (* -------------------------------------------------------------------------- *) (* The dynamic dependency graph. *) (* An edge from [node1] to [node2] means that [node1] depends on [node2], or (equivalently) that [node1] observes [node2]. Then, an update of the current property at [node2] causes a signal to be sent to [node1]. A node can observe itself. *) (* This module could be placed in a separate file, but is included here in order to make [Fix] self-contained. *) module Graph : sig (* This module provides a data structure for maintaining and modifying a directed graph. 
Each node is allowed to carry a piece of client data. There are functions for creating a new node, looking up a node's data, looking up a node's predecessors, and setting or clearing a node's successors (all at once). *) type 'data node (* [create data] creates a new node, with no incident edges, with client information [data]. Time complexity: constant. *) val create: 'data -> 'data node (* [data node] returns the client information associated with the node [node]. Time complexity: constant. *) val data: 'data node -> 'data (* [predecessors node] returns a list of [node]'s predecessors. Amortized time complexity: linear in the length of the output list. *) val predecessors: 'data node -> 'data node list (* [set_successors src dsts] creates an edge from the node [src] to each of the nodes in the list [dsts]. Duplicate elements in the list [dsts] are removed, so that no duplicate edges are created. It is assumed that [src] initially has no successors. Time complexity: linear in the length of the input list. *) val set_successors: 'data node -> 'data node list -> unit (* [clear_successors node] removes all of [node]'s outgoing edges. Time complexity: linear in the number of edges that are removed. *) val clear_successors: 'data node -> unit (* That's it. *) end = struct (* Using doubly-linked adjacency lists, one could implement [predecessors] in worst-case linear time with respect to the length of the output list, [set_successors] in worst-case linear time with respect to the length of the input list, and [clear_successors] in worst-case linear time with respect to the number of edges that are removed. We use a simpler implementation, based on singly-linked adjacency lists, with deferred removal of edges. It achieves the same complexity bounds, except [predecessors] only offers an amortized complexity bound. This is good enough for our purposes, and, in practice, is more efficient by a constant factor. This simplification was suggested by Arthur Charguéraud. 
*) type 'data node = { (* The client information associated with this node. *) data: 'data; (* This node's incoming and outgoing edges. *) mutable outgoing: 'data edge list; mutable incoming: 'data edge list; (* A transient mark, always set to [false], except when checking against duplicate elements in a successor list. *) mutable marked: bool; } and 'data edge = { (* This edge's nodes. Edges are symmetric: source and destination are not distinguished. Thus, an edge appears both in the outgoing edge list of its source node and in the incoming edge list of its destination node. This allows edges to be easily marked as destroyed. *) node1: 'data node; node2: 'data node; (* Edges that are destroyed are marked as such, but are not immediately removed from the adjacency lists. *) mutable destroyed: bool; } let create (data : 'data) : 'data node = { data = data; outgoing = []; incoming = []; marked = false; } let data (node : 'data node) : 'data = node.data (* [follow src edge] returns the node that is connected to [src] by [edge]. Time complexity: constant. *) let follow src edge = if edge.node1 == src then edge.node2 else begin assert (edge.node2 == src); edge.node1 end (* The [predecessors] function removes edges that have been marked destroyed. The cost of removing these has already been paid for, so the amortized time complexity of [predecessors] is linear in the length of the output list. *) let predecessors (node : 'data node) : 'data node list = let predecessors = List.filter (fun edge -> not edge.destroyed) node.incoming in node.incoming <- predecessors; List.map (follow node) predecessors (* [link src dst] creates a new edge from [src] to [dst], together with its reverse edge. Time complexity: constant. 
*) let link (src : 'data node) (dst : 'data node) : unit = let edge = { node1 = src; node2 = dst; destroyed = false; } in src.outgoing <- edge :: src.outgoing; dst.incoming <- edge :: dst.incoming let set_successors (src : 'data node) (dsts : 'data node list) : unit = assert (src.outgoing = []); let rec loop = function | [] -> () | dst :: dsts -> if dst.marked then loop dsts (* skip duplicate elements *) else begin dst.marked <- true; link src dst; loop dsts; dst.marked <- false end in loop dsts let clear_successors (node : 'data node) : unit = List.iter (fun edge -> assert (not edge.destroyed); edge.destroyed <- true; ) node.outgoing; node.outgoing <- [] end (* -------------------------------------------------------------------------- *) (* The code is parametric in an implementation of maps over variables and in an implementation of properties. *) module Make (M : IMPERATIVE_MAPS) (P : PROPERTY) = struct type variable = M.key type property = P.property type valuation = variable -> property type rhs = valuation -> property type equations = variable -> rhs (* -------------------------------------------------------------------------- *) (* Data. *) (* Each node in the dependency graph carries information about a fixed variable [v]. *) type node = data Graph.node and data = { (* This is the result of the application of [rhs] to the variable [v]. It must be stored in order to guarantee that this application is performed at most once. *) rhs: rhs; (* This is the current property at [v]. It evolves monotonically with time. *) mutable property: property; (* That's it! *) } (* [property node] returns the current property at [node]. *) let property node = (Graph.data node).property (* -------------------------------------------------------------------------- *) (* Many definitions must be made within the body of the function [lfp]. For greater syntactic convenience, we place them in a local module. 
*) let lfp (eqs : equations) : valuation = let module LFP = struct (* -------------------------------------------------------------------------- *) (* The workset. *) (* When the algorithm is inactive, the workset is empty. *) (* Our workset is based on a Queue, but it could just as well be based on a Stack. A textual replacement is possible. It could also be based on a priority queue, provided a sensible way of assigning priorities could be found. *) module Workset : sig (* [insert node] inserts [node] into the workset. [node] must have no successors. *) val insert: node -> unit (* [repeat f] repeatedly applies [f] to a node extracted out of the workset, until the workset becomes empty. [f] is allowed to use [insert]. *) val repeat: (node -> unit) -> unit (* That's it! *) end = struct (* Initialize the workset. *) let workset = Queue.create() let insert node = Queue.push node workset let repeat f = while not (Queue.is_empty workset) do f (Queue.pop workset) done end (* -------------------------------------------------------------------------- *) (* Signals. *) (* A node in the workset has no successors. (It can have predecessors.) In other words, a predecessor (an observer) of some node is never in the workset. Furthermore, a node never appears twice in the workset. *) (* When a variable broadcasts a signal, all of its predecessors (observers) receive the signal. Any variable that receives the signal loses all of its successors (that is, it ceases to observe anything) and is inserted into the workset. This preserves the above invariant. *) let signal subject = List.iter (fun observer -> Graph.clear_successors observer; Workset.insert observer ) (Graph.predecessors subject) (* At this point, [subject] has no predecessors. This plays no role in the correctness proof, though. *) (* -------------------------------------------------------------------------- *) (* Tables. *) (* The permanent table maps variables that have reached a fixed point to properties. 
It persists forever. *) let permanent : property M.t = M.create() (* The transient table maps variables that have not yet reached a fixed point to nodes. (A node contains not only a property, but also a memoized right-hand side, and carries edges.) At the beginning of a run, it is empty. It fills up during a run. At the end of a run, it is copied into the permanent table and cleared. *) let transient : node M.t = M.create() (* [freeze()] copies the transient table into the permanent table, and empties the transient table. This allows all nodes to be reclaimed by the garbage collector. *) let freeze () = M.iter (fun v node -> M.add v (property node) permanent ) transient; M.clear transient (* -------------------------------------------------------------------------- *) (* Workset processing. *) (* [solve node] re-evaluates the right-hand side at [node]. If this leads to a change, then the current property is updated, and [node] emits a signal towards its observers. *) (* When [solve node] is invoked, [node] has no subjects. Indeed, when [solve] is invoked by [node_for], [node] is newly created; when [solve] is invoked by [Workset.repeat], [node] has just been extracted out of the workset, and a node in the workset has no subjects. *) (* [node] must not be in the workset. *) (* In short, when [solve node] is invoked, [node] is neither awake nor asleep. When [solve node] finishes, [node] is either awake or asleep again. (Chances are, it is asleep, unless it is its own observer; then, it is awakened by the final call to [signal node].) *) let rec solve (node : node) : unit = (* Retrieve the data record carried by this node. *) let data = Graph.data node in (* Prepare to compute an updated value at this node. This is done by invoking the client's right-hand side function. *) (* The flag [alive] is used to prevent the client from invoking [request] after this interaction phase is over. 
In theory, this dynamic check seems required in order to argue that [request] behaves like a pure function. In practice, this check is not very useful: only a bizarre client would store a [request] function and invoke it after it has become stale. *) let alive = ref true and subjects = ref [] in (* We supply the client with [request], a function that provides access to the current valuation, and dynamically records dependencies. This yields a set of dependencies that is correct by construction. *) let request (v : variable) : property = assert !alive; try M.find v permanent with Not_found -> let subject = node_for v in let p = property subject in if not (P.is_maximal p) then subjects := subject :: !subjects; p in (* Give control to the client. *) let new_property = data.rhs request in (* From now on, prevent any invocation of this instance of [request] by the client. *) alive := false; (* At this point, [node] has no subjects, as noted above. Thus, the precondition of [set_successors] is met. We can install the contents of [subjects] as the new set of subjects for this node. *) (* If we have gathered no subjects in the list [subjects], then this node must have stabilized. If [new_property] is maximal, then this node must have stabilized. *) (* If this node has stabilized, then it need not observe any more, so the call to [set_successors] is skipped. In practice, this seems to be a minor optimization. In the particular case where every node stabilizes at the very first call to [rhs], this means that no edges are ever built. This particular case is unlikely, as it means that we are just doing memoization, not a true fixed point computation. *) (* One could go further and note that, if this node has stabilized, then it could immediately be taken out of the transient table and copied into the permanent table. This would have the beneficial effect of allowing the detection of further nodes that have stabilized. 
Furthermore, it would enforce the property that no node in the transient table has a maximal value, hence the call to [is_maximal] above would become useless. *) if not (!subjects = [] || P.is_maximal new_property) then Graph.set_successors node !subjects; (* If the updated value differs from the previous value, record the updated value and send a signal to all observers of [node]. *) if not (P.equal data.property new_property) then begin data.property <- new_property; signal node end (* Note that equality of the two values does not imply that this node has stabilized forever. *) (* -------------------------------------------------------------------------- *) (* [node_for v] returns the graph node associated with the variable [v]. It is assumed that [v] does not appear in the permanent table. If [v] appears in the transient table, the associated node is returned. Otherwise, [v] is a newly discovered variable: a new node is created on the fly, and the transient table is grown. The new node can either be inserted into the workset (it is then awake) or handled immediately via a recursive call to [solve] (it is then asleep, unless it observes itself). *) (* The recursive call to [solve node] can be replaced, if desired, by a call to [Workset.insert node]. Using a recursive call to [solve] permits eager top-down discovery of new nodes. This can save a constant factor, because it allows new nodes to move directly from [bottom] to a good first approximation, without sending any signals, since [node] has no observers when [solve node] is invoked. In fact, if the dependency graph is acyclic, the algorithm discovers nodes top-down, performs computation on the way back up, and runs without ever inserting a node into the workset! Unfortunately, this causes the stack to grow as deep as the longest path in the dependency graph, which can blow up the stack. 
*) and node_for (v : variable) : node = try M.find v transient with Not_found -> let node = Graph.create { rhs = eqs v; property = P.bottom } in (* Adding this node to the transient table prior to calling [solve] recursively is mandatory, otherwise [solve] might loop, creating an infinite number of nodes for the same variable. *) M.add v node transient; solve node; (* or: Workset.insert node *) node (* -------------------------------------------------------------------------- *) (* Invocations of [get] trigger the fixed point computation. *) (* The flag [inactive] prevents reentrant calls by the client. *) let inactive = ref true let get (v : variable) : property = try M.find v permanent with Not_found -> assert !inactive; inactive := false; let node = node_for v in Workset.repeat solve; freeze(); inactive := true; property node (* -------------------------------------------------------------------------- *) (* Close the local module [LFP]. *) end in LFP.get end menhir-20200123/src/Fix.mli000066400000000000000000000101541361226111300152440ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This code is described in the paper ``Lazy Least Fixed Points in ML''. *) (* -------------------------------------------------------------------------- *) (* Maps. *) (* We require imperative maps, that is, maps that can be updated in place. An implementation of persistent maps, such as the one offered by ocaml's standard library, can easily be turned into an implementation of imperative maps, so this is a weak requirement. 
*) module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val clear: 'data t -> unit val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* -------------------------------------------------------------------------- *) (* Properties. *) (* Properties must form a partial order, equipped with a least element, and must satisfy the ascending chain condition: every monotone sequence eventually stabilizes. *) (* [is_maximal] determines whether a property [p] is maximal with respect to the partial order. Only a conservative check is required: in any event, it is permitted for [is_maximal p] to return [false]. If [is_maximal p] returns [true], then [p] must have no upper bound other than itself. In particular, if properties form a lattice, then [p] must be the top element. This feature, not described in the paper, enables a couple of minor optimizations. *) module type PROPERTY = sig type property val bottom: property val equal: property -> property -> bool val is_maximal: property -> bool end (* -------------------------------------------------------------------------- *) (* The code is parametric in an implementation of maps over variables and in an implementation of properties. *) module Make (M : IMPERATIVE_MAPS) (P : PROPERTY) : sig type variable = M.key type property = P.property (* A valuation is a mapping of variables to properties. *) type valuation = variable -> property (* A right-hand side, when supplied with a valuation that gives meaning to its free variables, evaluates to a property. More precisely, a right-hand side is a monotone function of valuations to properties. *) type rhs = valuation -> property (* A system of equations is a mapping of variables to right-hand sides. *) type equations = variable -> rhs (* [lfp eqs] produces the least solution of the system of monotone equations [eqs]. 
*) (* It is guaranteed that, for each variable [v], the application [eqs v] is performed at most once (whereas the right-hand side produced by this application is, in general, evaluated multiple times). This guarantee can be used to perform costly pre-computation, or memory allocation, when [eqs] is applied to its first argument. *) (* When [lfp] is applied to a system of equations [eqs], it performs no actual computation. It produces a valuation, [get], which represents the least solution of the system of equations. The actual fixed point computation takes place, on demand, when [get] is applied. *) val lfp: equations -> valuation end menhir-20200123/src/FixSolver.ml000066400000000000000000000047231361226111300162730ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) module Make (M : Fix.IMPERATIVE_MAPS) (P : sig include Fix.PROPERTY val union: property -> property -> property end) = struct type variable = M.key type property = P.property (* A constraint is represented as a mapping of each variable to an expression, which represents its lower bound. We could represent an expression as a list of constants and variables; we can also represent it as a binary tree, as follows. *) type expression = | EBottom | ECon of property | EVar of variable | EJoin of expression * expression type constraint_ = expression M.t (* Looking up a variable's lower bound. *) let consult (m : constraint_) (x : variable) : expression = try M.find x m with Not_found -> EBottom (* Evaluation of an expression in an environment. 
*) let rec evaluate get e = match e with | EBottom -> P.bottom | ECon p -> p | EVar x -> get x | EJoin (e1, e2) -> P.union (evaluate get e1) (evaluate get e2) (* Solving a constraint. *) let solve (m : constraint_) : variable -> property = let module F = Fix.Make(M)(P) in F.lfp (fun x get -> evaluate get (consult m x) ) (* The imperative interface. *) let create () = let m = M.create() in let record_ConVar p y = M.add y (EJoin (ECon p, consult m y)) m and record_VarVar x y = M.add y (EJoin (EVar x, consult m y)) m in record_ConVar, record_VarVar, fun () -> solve m end menhir-20200123/src/FixSolver.mli000066400000000000000000000032001361226111300164310ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) module Make (M : Fix.IMPERATIVE_MAPS) (P : sig include Fix.PROPERTY val union: property -> property -> property end) : sig (* Variables and constraints. A constraint is an inequality between a constant or a variable, on the left-hand side, and a variable, on the right-hand side. *) type variable = M.key type property = P.property (* An imperative interface, where we create a new constraint system, and are given three functions to add constraints and (once we are done adding) to solve the system. 
*) val create: unit -> (property -> variable -> unit) * (variable -> variable -> unit) * (unit -> (variable -> property)) end menhir-20200123/src/Generic.ml000066400000000000000000000023771361226111300157310ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Because the generic comparison function is named [Pervasives.compare] in early versions of OCaml and [Stdlib.compare] in recent versions, we cannot refer to it under either name. The following definition allows us to refer to it under the name [Generic.compare]. *) let compare = compare menhir-20200123/src/GroundSort.ml000066400000000000000000000021041361226111300164470ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) type sort = | GArrow of sort list let star = GArrow [] let domain sort = let GArrow sorts = sort in sorts menhir-20200123/src/GroundSort.mli000066400000000000000000000023051361226111300166230ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The syntax of sorts is: sort ::= (sort, ..., sort) -> * where the arity (the number of sorts on the left-hand side of the arrow) can be zero. *) type sort = | GArrow of sort list val star: sort val domain: sort -> sort list menhir-20200123/src/IL.ml000066400000000000000000000144311361226111300146530ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Abstract syntax of the language used for code production. *) type interface = interface_item list and interface_item = (* Functor. Called [Make]. No functor if no parameters. Very ad hoc! *) | IIFunctor of Stretch.t list * interface (* Exception declarations. *) | IIExcDecls of excdef list (* Algebraic data type declarations (mutually recursive). *) | IITypeDecls of typedef list (* Value declarations. 
*) | IIValDecls of (string * typescheme) list (* Include directive. *) | IIInclude of module_type (* Submodule. *) | IIModule of string * module_type (* Comment. *) | IIComment of string and module_type = | MTNamedModuleType of string | MTWithType of module_type * string list * string * with_kind * typ | MTSigEnd of interface and with_kind = | WKNonDestructive (* = *) | WKDestructive (* := *) and excdef = { (* Name of the exception. *) excname: string; (* Optional equality. *) exceq: string option; } and typedef = { (* Name of the algebraic data type. *) typename: string; (* Type parameters. This is a list of type variable names, without the leading quote, which will be added by the pretty-printer. Can also be "_". *) typeparams: string list; (* Data constructors. *) typerhs: typedefrhs; (* Constraint. *) typeconstraint: (typ * typ) option } and typedefrhs = | TDefRecord of fielddef list | TDefSum of datadef list | TAbbrev of typ and fielddef = { (* Whether the field is mutable. *) modifiable: bool; (* Name of the field. *) fieldname: string; (* Type of the field. *) fieldtype: typescheme } and datadef = { (* Name of the data constructor. *) dataname: string; (* Types of the value parameters. *) datavalparams: typ list; (* Instantiated type parameters, if this is a GADT -- [None] if this is an ordinary ADT. *) datatypeparams: typ list option; } and typ = (* Textual OCaml type. *) | TypTextual of Stretch.ocamltype (* Type variable, without its leading quote. Can also be "_". *) | TypVar of string (* Application of an algebraic data type constructor. *) | TypApp of string * typ list (* Anonymous tuple. *) | TypTuple of typ list (* Arrow type. *) | TypArrow of typ * typ and typescheme = { (* Universal quantifiers, without leading quotes. *) quantifiers: string list; (* Body. *) body: typ; } and valdef = { (* Whether the value is public. Public values cannot be suppressed by the inliner. They serve as seeds for the dead code analysis. 
*) valpublic: bool; (* Definition's left-hand side. *) valpat: pattern; (* Value to which it is bound. *) valval: expr } and expr = (* Variable. *) | EVar of string (* Function. *) | EFun of pattern list * expr (* Function call. *) | EApp of expr * expr list (* Local definitions. This is a nested sequence of [let] definitions. *) | ELet of (pattern * expr) list * expr (* Case analysis. *) | EMatch of expr * branch list | EIfThen of expr * expr | EIfThenElse of expr * expr * expr (* Raising exceptions. *) | ERaise of expr (* Exception analysis. *) | ETry of expr * branch list (* Data construction. Tuples of length 1 are considered nonexistent, that is, [ETuple [e]] is considered the same expression as [e]. *) | EUnit | EIntConst of int | EStringConst of string | EData of string * expr list | ETuple of expr list (* Type annotation. *) | EAnnot of expr * typescheme (* Cheating on the typechecker. *) | EMagic of expr (* Obj.magic *) | ERepr of expr (* Obj.repr *) (* Records. *) | ERecord of (string * expr) list | ERecordAccess of expr * string | ERecordWrite of expr * string * expr (* Textual OCaml code. *) | ETextual of Stretch.t (* Comments. *) | EComment of string * expr | EPatComment of string * pattern * expr (* Arrays. *) | EArray of expr list | EArrayAccess of expr * expr and branch = { (* Branch pattern. *) branchpat: pattern; (* Branch body. *) branchbody: expr; } and pattern = (* Wildcard. *) | PWildcard (* Variable. *) | PVar of string (* Data deconstruction. Tuples of length 1 are considered nonexistent, that is, [PTuple [p]] is considered the same pattern as [p]. *) | PUnit | PData of string * pattern list | PTuple of pattern list | PRecord of (string * pattern) list (* Disjunction. *) | POr of pattern list (* Type annotation. *) | PAnnot of pattern * typ (* Module expressions. *) and modexpr = | MVar of string | MStruct of structure | MApp of modexpr * modexpr (* Structures. 
*) and program = structure and structure = structure_item list and structure_item = (* Functor. Called [Make]. No functor if no parameters. Very ad hoc! *) | SIFunctor of Stretch.t list * structure (* Exception definitions. *) | SIExcDefs of excdef list (* Algebraic data type definitions (mutually recursive). *) | SITypeDefs of typedef list (* Value definitions (mutually recursive or not, as per the flag). *) | SIValDefs of bool * valdef list (* Raw OCaml code. *) | SIStretch of Stretch.t list (* Sub-module definition. *) | SIModuleDef of string * modexpr (* Module inclusion. *) | SIInclude of modexpr (* Comment. *) | SIComment of string menhir-20200123/src/IO.ml000066400000000000000000000106551361226111300146620ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Input-output utilities. *) (* ------------------------------------------------------------------------- *) (* [try/finally] has the same semantics as in Java. *) let try_finally action handler = let result = try action() with e -> handler(); raise e in handler(); result (* ------------------------------------------------------------------------- *) (* [moving_away filename action] moves the file [filename] away (if it exists), performs [action], then moves the file back into place (if it was moved away). 
*) let moving_away filename action = if Sys.file_exists filename then let newname = filename ^ ".moved_by_menhir" in Sys.rename filename newname; try_finally action (fun () -> Sys.rename newname filename ) else action() (* ------------------------------------------------------------------------- *) (* [with_file filename creation action] creates the file [filename] by running [creation], then runs [action], and ensures that the file is removed in the end. *) let with_file filename creation action = creation(); try_finally action (fun () -> Sys.remove filename) (* ------------------------------------------------------------------------- *) (* [exhaust channel] reads all of the data that's available on [channel]. It does not assume that the length of the data is known ahead of time. It does not close the channel. *) let chunk_size = 16384 let exhaust channel = let buffer = Buffer.create chunk_size in let chunk = Bytes.create chunk_size in let rec loop () = let length = input channel chunk 0 chunk_size in if length = 0 then Buffer.contents buffer else begin Buffer.add_subbytes buffer chunk 0 length; loop() end in loop() (* ------------------------------------------------------------------------- *) (* [invoke command] invokes an external command (which expects no input) and returns its output, if the command succeeds. It returns [None] if the command fails. *) let invoke command = let ic = Unix.open_process_in command in (* 20130911 Be careful to read in text mode, so as to avoid newline translation problems (which would manifest themselves on Windows). *) set_binary_mode_in ic false; let result = exhaust ic in match Unix.close_process_in ic with | Unix.WEXITED 0 -> Some result | _ -> None (* ------------------------------------------------------------------------- *) (* [read_whole_file filename] reads the file [filename] in text mode and returns its contents as a string. 
*) let read_whole_file filename = (* Open the file in text mode, so that (under Windows) CRLF is converted to LF. This guarantees that one byte is one character and seems to be required in order to report accurate positions. *) let channel = open_in filename in (* The standard library functions [pos_in] and [seek_in] do not work correctly when CRLF conversion is being performed, so we abandon their use. (They were used to go and extract the text of semantic actions.) Instead we load the entire file into memory up front, and work with a string. *) (* The standard library function [in_channel_length] does not work correctly when CRLF conversion is being performed, so we do not use it to read the whole file. And the standard library function [Buffer.add_channel] uses [really_input] internally, so we cannot use it either. Bummer. *) let s = exhaust channel in close_in channel; s

(* File: menhir-20200123/src/IO.mli *)

(* Input-output utilities. *) (* [try/finally] has the same semantics as in Java. *) val try_finally : (unit -> 'a) -> (unit -> unit) -> 'a (* [moving_away filename action] moves the file [filename] away (if it exists), performs [action], then moves the file back into place (if it was moved away). *) val moving_away: string -> (unit -> 'a) -> 'a (* [with_file filename creation action] creates the file [filename] by running [creation], then runs [action], and ensures that the file is removed in the end.
*) val with_file: string -> (unit -> unit) -> (unit -> 'a) -> 'a (* [exhaust channel] reads all of the data that's available on [channel]. It does not assume that the length of the data is known ahead of time. It does not close the channel. *) val exhaust: in_channel -> string (* [invoke command] invokes an external command (which expects no input) and returns its output, if the command succeeds. It returns [None] if the command fails. *) val invoke: string -> string option (* [read_whole_file filename] reads the file [filename] in text mode and returns its contents as a string. *) val read_whole_file: string -> string

(* File: menhir-20200123/src/InputFile.ml *)

(* ---------------------------------------------------------------------------- *) (* The identity of the current input file. *) (* 2011/10/19: do not use [Filename.basename]. The [#] annotations that we insert in the [.ml] file must retain their full path. This does mean that the [#] annotations depend on how menhir is invoked -- e.g. [menhir foo/bar.mly] and [cd foo && menhir bar.mly] will produce different files. Nevertheless, this seems useful/reasonable. *) (* This also influences the type error messages produced by [--infer]. *) (* 2016/08/25: in principle, the order in which file names appear on the command line (when there are several of them) does not matter. It is however used in [BasicPrinter] (see the problem description there).
For this reason, we define a type [input_file] which includes the file's name as well as its index on the command line. *) type input_file = { input_file_name: string; input_file_index: int } let builtin_input_file = { input_file_name = ""; input_file_index = -1 } let dummy_input_file = { input_file_name = ""; input_file_index = 0 } let same_input_file file1 file2 = file1.input_file_index = file2.input_file_index (* could also use physical equality [file1 == file2] *) let compare_input_files file1 file2 = Generic.compare file1.input_file_index file2.input_file_index (* Ideally, this function should NOT be used, as it reflects the order of the input files on the command line. As of 2016/08/25, it is used by [BasicPrinter], for lack of a better solution. *) let current_input_file = ref dummy_input_file (* This declares that a new file is being processed. *) let new_input_file name : unit = current_input_file := { input_file_name = name; input_file_index = !current_input_file.input_file_index + 1 } let get_input_file () : input_file = assert (!current_input_file != dummy_input_file); !current_input_file let get_input_file_name () : string = (get_input_file()).input_file_name (* ---------------------------------------------------------------------------- *) (* The contents of the current input file. 
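As an illustration of the two functions defined in this module, [with_file_contents] installs a string as the current file contents, and [chunk] extracts the substring delimited by two [Lexing] positions (only [pos_cnum] matters here). The positions in this sketch are hypothetical.

```ocaml
(* Sketch: extract "bar" out of "foo bar baz" via [chunk]. *)
let () =
  with_file_contents "foo bar baz" (fun () ->
    let open Lexing in
    let mkpos ofs = { dummy_pos with pos_cnum = ofs } in
    assert (chunk (mkpos 4, mkpos 7) = "bar")
  )
```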
*) let get_initialized_ref ref = match !ref with | None -> assert false | Some contents -> contents let file_contents = ref (None : string option) let get_file_contents () = get_initialized_ref file_contents let with_file_contents contents f = file_contents := Some contents; let result = f() in file_contents := None; (* avoid memory leak *) result open Lexing let chunk (pos1, pos2) = let ofs1 = pos1.pos_cnum and ofs2 = pos2.pos_cnum in let contents = get_file_contents() in let len = ofs2 - ofs1 in String.sub contents ofs1 len

(* File: menhir-20200123/src/InputFile.mli *)

(* This module keeps track of which input file is currently being read. It defines a type [input_file] of input files, which is used to record the origin of certain elements (productions, declarations, etc.). *) (* ---------------------------------------------------------------------------- *) (* The identity of the current input file. *) type input_file (* [new_input_file filename] must be called when a new input file is about to be read. *) val new_input_file: string -> unit (* [get_input_file()] indicates which input file is currently being read. [get_input_file_name()] is the name of this file. *) val get_input_file: unit -> input_file val get_input_file_name: unit -> string (* This fictitious "built-in" input file is used as the origin of the start productions. This technical detail is probably entirely irrelevant.
*) val builtin_input_file: input_file (* This equality test for input files is used (for instance) when determining which of two productions has greater priority. *) val same_input_file: input_file -> input_file -> bool (* This ordering between input files reflects their ordering on the command line. Ideally, it should NOT be used. *) val compare_input_files: input_file -> input_file -> int (* ---------------------------------------------------------------------------- *) (* The contents of the current input file. *) (* [with_file_contents contents f] records that the contents of the current input file is [contents] while the action [f] runs. The function [f] can then call [chunk] (below) to retrieve certain segments of [contents]. *) val with_file_contents: string -> (unit -> 'a) -> 'a (* [chunk pos1 pos2] extracts a chunk out of the current input file, delimited by the positions [pos1] and [pos2]. *) val chunk: (Lexing.position * Lexing.position) -> string

(* File: menhir-20200123/src/LR1Sigs.ml *)

open Grammar (* The output signature of several LR(1) automaton construction algorithms. *) module type LR1_AUTOMATON = sig (* An abstract type of nodes, that is, states in the LR(1) automaton. *) type node (* The number of nodes. *) val n: int (* Nodes are numbered from 0 to [n-1]. *) val number: node -> int val node: int -> node (* To each start production corresponds an entry node.
*) val entry : node ProductionMap.t (* Each node carries outgoing transitions towards other nodes. *) val transitions: node -> node SymbolMap.t (* Each node represents an LR(1) state, that is, a set of LR(1) items. *) val state: node -> Lr0.lr1state end

(* File: menhir-20200123/src/LRijkstra.ml *)

(* This module implements [--list-errors]. Its purpose is to find, for each pair of a state [s] and a terminal symbol [z] such that looking at [z] in state [s] causes an error, a minimal path (starting in some initial state) that actually triggers this error. *) (* This is potentially useful for grammar designers who wish to better understand the properties of their grammar, or who wish to produce a list of all possible syntax errors (or, at least, one syntax error in each automaton state where an error may occur). *) (* The problem seems rather tricky. One might think that it suffices to compute shortest paths in the automaton, and to use [Analysis.minimal] to replace each non-terminal symbol in a path with a minimal word that this symbol generates. One can indeed do so, but this yields only a lower bound on the actual shortest path to the error at [s, z].
Indeed, several difficulties arise, including the fact that reductions are subject to a lookahead hypothesis; the fact that some states have a default reduction, hence will never trigger an error; the fact that conflict resolution removes some (shift or reduce) actions, hence may suppress the shortest path. *) (* ------------------------------------------------------------------------ *) (* To delay the side effects performed by this module, we wrap everything in a big functor. The functor also serves to pass verbosity parameters. *) module Run (X : sig (* If [verbose] is set, produce various messages on [stderr]. *) val verbose: bool (* If [statistics] is defined, it is interpreted as the name of a file to which one line of statistics is appended. *) val statistics: string option end) = struct open Grammar open Default (* ------------------------------------------------------------------------ *) (* Record our start time. *) let now () = match X.statistics with | Some _ -> Unix.((times()).tms_utime) | None -> 0.0 let start = now() (* ------------------------------------------------------------------------ *) (* Run the core reachability analysis, which finds out exactly under what conditions each nonterminal transition in the automaton can be taken. *) module Core = LRijkstraCore.Run(X) module W = Core.W (* ------------------------------------------------------------------------ *) (* The following code validates the fact that an error can be triggered in state [s'] by beginning at the start symbol [nt] and reading the sequence of terminal symbols [w]. We use this for debugging purposes. Furthermore, this gives us a list of spurious reductions, which we use to produce a comment. *) let fail msg = Printf.eprintf "LRijkstra: internal error: %s.\n%!"
msg; exit 1 let fail format = Printf.ksprintf fail format let validate nt s' w : ReferenceInterpreter.target = let open ReferenceInterpreter in match check_error_path false nt (W.elements w) with | OInputReadPastEnd -> fail "input was read past its end" | OInputNotFullyConsumed -> fail "input was not fully consumed" | OUnexpectedAccept -> fail "input was unexpectedly accepted" | OK ((state, _) as target) -> if Lr1.Node.compare state s' <> 0 then fail "error occurred in state %d instead of %d" (Lr1.number state) (Lr1.number s') else target (* ------------------------------------------------------------------------ *) (* We now wish to determine, given a state [s'] and a terminal symbol [z], a minimal path that takes us from some entry state to state [s'] with [z] as the next (unconsumed) symbol. *) (* This can be formulated as a search for a shortest path in a graph. The graph is not just the automaton, though. It is a (much) larger graph whose vertices are pairs [s, z] and whose edges are obtained by querying the module [E] above. For this purpose, we use Dijkstra's algorithm, unmodified. Experiments show that the running time of this phase is typically 10x shorter than the running time of the main loop above. *) module A = Astar.Make(struct (* A vertex is a pair [s, z], where [z] is a real terminal symbol. *) type node = Lr1.node * Terminal.t let equal (s'1, z1) (s'2, z2) = Lr1.Node.compare s'1 s'2 = 0 && Terminal.compare z1 z2 = 0 let hash (s, z) = Hashtbl.hash (Lr1.number s, z) (* An edge is labeled with a word. *) type label = W.word (* We search forward from every [s, z], where [s] is an initial state. *) let sources f = Terminal.iter_real (fun z -> ProductionMap.iter (fun _ s -> f (s, z) ) Lr1.entry ) (* The successors of [s, z] are defined as follows. *) let successors (s, z) edge = assert (Terminal.real z); (* For every transition out of [s], labeled [sym], leading to [s']... 
*) Lr1.transitions s |> SymbolMap.iter (fun sym s' -> match sym with | Symbol.T t -> if Terminal.equal z t then (* If [sym] is the terminal symbol [z], then this transition matches our lookahead assumption, so we can take it. For every [z'], we have an edge to [s', z'], labeled with the singleton word [z]. *) let w = W.singleton z in Terminal.iter_real (fun z' -> edge w 1 (s', z') ) | Symbol.N nt -> (* If [sym] is a nonterminal symbol [nt], then we query [E] in order to find out which (minimal) words [w] allow us to take this transition. We must again try every [z'], and must respect the constraint that the first symbol of the word [w.z'] is [z]. For every [z'] and [w] that fulfill these requirements, we have an edge to [s', z'], labeled with the word [w]. *) Core.query s nt z (fun w z' -> edge w (W.length w) (s', z') ) ) (* Algorithm A*, used with a zero estimate, is Dijkstra's algorithm. We have experimented with a non-zero estimate, but the performance increase was minimal. *) let estimate _ = 0 end) (* ------------------------------------------------------------------------ *) (* [explored] counts how many graph nodes we have discovered during the search. *) let explored = ref 0 (* We wish to store a set of triples [nt, w, (s', spurious)], meaning that an error can be triggered in state [s'] by beginning in the initial state that corresponds to [nt] and by reading the sequence of terminal symbols [w]. We wish to store at most one such triple for every state [s'], so we organize the data as a set [domain] of states [s'] and a list [data] of triples [nt, w, (s', spurious)]. The list [spurious] documents the spurious reductions that are performed by the parser at the end. *) (* We could print this data as we go, which would naturally result in sorting the output by increasing word sizes. However, it seems preferable to sort the sentences lexicographically, so that similar sentences end up close to one another. (We could also sort them by state number. 
The result would be roughly similar.) This is why we store a list of triples and sort it before printing it out. *) let domain = ref Lr1.NodeSet.empty let data : (Nonterminal.t * W.word * ReferenceInterpreter.target) list ref = ref [] (* The set [reachable] stores every reachable state (regardless of whether an error can be triggered in that state). *) let reachable = ref Lr1.NodeSet.empty (* Perform the forward search. *) let _, _ = A.search (fun ((s', z), path) -> incr explored; reachable := Lr1.NodeSet.add s' !reachable; (* If [z] causes an error in state [s'] and this is the first time we are able to trigger an error in this state, ... *) if causes_an_error s' z && not (Lr1.NodeSet.mem s' !domain) then begin (* Reconstruct the initial state [s] and the word [w] that lead to this error. *) let (s, _), ws = A.reverse path in let w = List.fold_right W.append ws (W.singleton z) in (* Check that the reference interpreter confirms our finding. At the same time, compute a list of spurious reductions. *) let nt = Lr1.nt_of_entry s in let target = validate nt s' w in (* Store this new data. *) domain := Lr1.NodeSet.add s' !domain; data := (nt, w, target) :: !data end ) (* Sort and output the data. *) let () = !data |> List.fast_sort (fun (nt1, w1, _) (nt2, w2, _) -> let c = Nonterminal.compare nt1 nt2 in if c <> 0 then c else W.compare w2 w1 ) |> List.map (fun (nt, w, target) -> (nt, W.elements w, target)) |> List.iter Interpret.print_messages_item (* ------------------------------------------------------------------------ *) (* Verbosity. *) let max_heap_size = if X.verbose || X.statistics <> None then let stat = Gc.quick_stat() in (stat.Gc.top_heap_words * (Sys.word_size / 8) / 1024 / 1024) else 0 (* dummy *) let () = Time.tick "Forward search"; if X.verbose then begin Printf.eprintf "%d graph nodes explored by forward search.\n\ %d out of %d states are reachable.\n\ Found %d states where an error can occur.\n\ Maximum size reached by the major heap: %dM\n%!" 
!explored (Lr1.NodeSet.cardinal !reachable) Lr1.n (Lr1.NodeSet.cardinal !domain) max_heap_size end (* ------------------------------------------------------------------------ *) (* If requested by the client, write one line of statistics to a .csv file. *) let stop = now() let () = X.statistics |> Option.iter (fun filename -> let c = open_out_gen [ Open_creat; Open_append; Open_text ] 0o644 filename in Printf.fprintf c "%s,%d,%d,%d,%d,%d,%d,%d,%.2f,%d\n%!" (* Grammar name. *) Settings.base (* Number of terminal symbols. *) Terminal.n (* Number of nonterminal symbols. *) Nonterminal.n (* Grammar size (not counting the error productions). *) begin Production.foldx (fun prod accu -> let rhs = Production.rhs prod in if List.mem (Symbol.T Terminal.error) (Array.to_list rhs) then accu else accu + Array.length rhs ) 0 end (* Automaton size (i.e., number of states). *) Lr1.n (* Total trie size. *) Core.total_trie_size (* Size of [F]. *) Core.facts (* Size of [E]. *) Core.edge_facts (* Elapsed user time, in seconds. *) (stop -. start) (* Max heap size, in megabytes. *) max_heap_size ; close_out c ) (* ------------------------------------------------------------------------ *) end

(* File: menhir-20200123/src/LRijkstra.mli *)

(* This module implements [--list-errors].
Its purpose is to find, for each pair of a state [s] and a terminal symbol [z] such that looking at [z] in state [s] causes an error, a minimal path (starting in some initial state) that actually triggers this error. *) (* In this analysis, we explicitly ignore the [error] token. (We display a warning if the grammar uses this token.) Thus, we disregard any reductions or transitions that take place when the lookahead symbol is [error]. As a result, any state whose incoming symbol is [error] is found unreachable. It would be too complicated to have to create a first error in order to be able to take certain transitions or drop certain parts of the input. *) module Run (X : sig (* If [verbose] is set, produce various messages on [stderr]. *) val verbose: bool (* If [statistics] is defined, it is interpreted as the name of a file to which one line of statistics is appended. *) val statistics: string option end) : sig (* The result of this analysis is a [.messages] file. It is written to the standard output channel. No result is returned. *) end

(* File: menhir-20200123/src/LRijkstraCore.ml *)

(* As announced in our specification, we ignore the [error] token. We never work with the terminal symbol [#] either. This symbol never appears in the maps returned by [Lr1.transitions] and [Lr1.reductions]. Thus, in principle, we work with real terminal symbols only. However, we encode [any] as [#] -- see below.
*) (* NOTE: Because the performance impact of the assertions in this file is about 10%, they are turned off by default. Change the value of [debug] to [true] if you wish to enable assertions. *) let debug = false open Grammar open Default (* ------------------------------------------------------------------------ *) (* We introduce a pseudo-terminal symbol [any]. It is used in several places later on, in particular in the [lookahead] field of a fact, to encode the absence of a lookahead hypothesis -- i.e., any terminal symbol will do. *) (* We choose to encode [any] as [#]. There is no risk of confusion, since we do not use [#] anywhere. Thus, the assertion [Terminal.real z] implies [z <> any]. *) let any = Terminal.sharp (* [foreach_terminal f] applies the function [f] to every terminal symbol in turn, except [error] and [#]. *) let foreach_terminal = Terminal.iter_real (* [foreach_terminal_not_causing_an_error s f] applies the function [f] to every terminal symbol [z] such that [causes_an_error s z] is false. This could be implemented in a naive manner using [foreach_terminal] and [causes_an_error]. This implementation is significantly more efficient. *) let foreach_terminal_not_causing_an_error s f = match has_default_reduction s with | Some _ -> (* There is a default reduction. No symbol causes an error. *) foreach_terminal f | None -> (* Enumerate every terminal symbol [z] for which there is a reduction. *) TerminalMap.iter (fun z _ -> (* A reduction on [#] is always a default reduction. (See [lr1.ml].) *) if debug then assert (not (Terminal.equal z Terminal.sharp)); if Terminal.non_error z then f z ) (Lr1.reductions s); (* Enumerate every terminal symbol [z] for which there is a transition.
*) SymbolMap.iter (fun sym _ -> match sym with | Symbol.T z -> if debug then assert (not (Terminal.equal z Terminal.sharp)); if Terminal.non_error z then f z | Symbol.N _ -> () ) (Lr1.transitions s) (* Let us say a state [s] is solid if its incoming symbol is a terminal symbol (or if it has no incoming symbol at all, i.e., it is an initial state). It is fragile if its incoming symbol is a non-terminal symbol. *) let is_solid s = match Lr1.incoming_symbol s with | None | Some (Symbol.T _) -> true | Some (Symbol.N _) -> false (* ------------------------------------------------------------------------ *) (* To delay the side effects performed by this module, we wrap everything in a big functor. The functor also serves to pass verbosity parameters. *) module Run (X : sig (* If [verbose] is set, produce various messages on [stderr]. *) val verbose: bool end) = struct (* ------------------------------------------------------------------------ *) (* Because of our encoding of terminal symbols as 8-bit characters, this algorithm supports at most 256 terminal symbols. *) let () = if Terminal.n > 256 then Error.error [] "the reachability analysis supports at most 256 terminal symbols.\n\ The grammar has %d terminal symbols." Terminal.n (* ------------------------------------------------------------------------ *) (* Produce a warning if the grammar uses the [error] pseudo-token. *) let () = if grammar_uses_error_token then Error.warning [] "The reachability analysis ignores all productions that involve the error token." (* ------------------------------------------------------------------------ *) (* Build a module that represents words as (hash-consed) strings. Note: this functor application has a side effect (it allocates memory, and more importantly, it may fail). *) module W = Terminal.Word(struct end) (* ------------------------------------------------------------------------ *) (* Instantiate [Trie]. This allocates fresh mutable state, but otherwise has no effect.
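The solid/fragile distinction above can be illustrated on a toy representation of incoming symbols. (The real code queries [Lr1.incoming_symbol]; the constructors below are hypothetical stand-ins.)

```ocaml
(* Toy model of the solid/fragile test defined above. *)
type toy_symbol = T of char | N of string

let toy_is_solid incoming =
  match incoming with
  | None | Some (T _) -> true   (* initial state, or entered by a terminal *)
  | Some (N _) -> false         (* entered by a nonterminal: fragile *)

let () =
  assert (toy_is_solid None);
  assert (toy_is_solid (Some (T 'a')));
  assert (not (toy_is_solid (Some (N "expr"))))
```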
The construction of the tries actually takes place when [Trie.stars] is invoked below. *) module Trie = Trie.Make(struct end) (* ------------------------------------------------------------------------ *) (* The main algorithm, [LRijkstra], accumulates facts. A fact is a triple of a [position] (that is, a sub-trie), a [word], and a [lookahead] assumption. Such a fact means that this [position] can be reached, from the source state [Trie.source position], by consuming [word], under the assumption that the next input symbol is [lookahead]. *) (* We allow [lookahead] to be [any] so as to indicate that this fact does not have a lookahead assumption. *) (* type fact = { position: Trie.trie; word: W.word; lookahead: Terminal.t (* may be [any] *) } *) (* To save memory (and therefore time), we encode a fact in a single OCaml integer value. This is made possible by the fact that tries, words, and terminal symbols are represented as (or can be encoded as) integers. This admittedly horrible hack allows us to save roughly a factor of 2 in space, and to gain 10% in time. *) type fact = int let dummy : fact = -1 (* should never be accessed! *) (* Encoding and decoding facts. *) (* We encode [position|word|lookahead] in a single word of memory. *) (* The lookahead symbol fits in 8 bits. *) (* In the largest grammars that we have seen, the number of unique words is about 3.10^5, so a word should fit in about 19 bits (2^19 = 524288). In the largest grammars that we have seen, the total star size is about 64000, so a trie should fit in about 17 bits (2^17 = 131072). *) (* On a 64-bit machine, we have ample space in a 63-bit word! We allocate 30 bits for [word] and the rest (i.e., 25 bits) for [position]. *) (* On a 32-bit machine, we are a bit more cramped! In Menhir's own fancy-parser, the number of terminal symbols is 27, the number of unique words is 566, and the total star size is 546. We allocate 12 bits for [word] and 11 bits for [position]. 
This is better than refusing to work altogether, but still not great. A more satisfactory approach might be to revert to heap allocation of facts when in 32-bit mode, but that would make the code somewhat ugly. *) let w_lookahead = 8 let w_word = if Sys.word_size < 64 then 12 else 30 let w_position = Sys.word_size - 1 - (w_word + w_lookahead) (* 25, on a 64-bit machine *) let identity (fact : fact) : int = if debug then assert (fact <> dummy); fact lsr (w_word + w_lookahead) let position (fact : fact) : Trie.trie = if debug then assert (fact <> dummy); Trie.decode (identity fact) let word (fact : fact) : W.word = if debug then assert (fact <> dummy); (fact lsr w_lookahead) land (1 lsl w_word - 1) let lookahead (fact : fact) : Terminal.t = Terminal.i2t (fact land (1 lsl w_lookahead - 1)) let mkfact position (word : W.word) lookahead = let position : int = Trie.encode position and word : int = word and lookahead : int = Terminal.t2i lookahead in if debug then begin assert (0 <= position && 0 <= word && 0 <= lookahead); assert (lookahead < 1 lsl w_lookahead); end; if position < 1 lsl w_position && word < 1 lsl w_word then (* [lsl] binds tighter than [lor] *) (position lsl w_word lor word) lsl w_lookahead lor lookahead else let advice = if Sys.word_size < 64 then "Please use a 64-bit machine." else "Please report this error to Menhir's developers." in Error.error [] "an internal limit was exceeded.\n\ Sys.word_size = %d. Position = %d. Word = %d.\n\ %s%!" Sys.word_size position word advice let mkfact p w l = let fact = mkfact p w l in if debug then begin assert (word fact == w); (* round-trip property *) assert (lookahead fact == l); (* round-trip property *) assert (position fact == p); (* round-trip property *) end; fact (* Two invariants reduce the number of facts that we consider: 1. If [lookahead] is a real terminal symbol [z] (i.e., not [any]), then [z] does not cause an error in the [current] state. 
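The encoding above can be checked in isolation. The following standalone sketch reproduces the 64-bit field layout (8 bits of lookahead, 30 bits of word, the remainder for the position) and verifies the round-trip property; the sample field values are arbitrary.

```ocaml
(* Standalone sketch of the [position|word|lookahead] packing. *)
let w_lookahead = 8
let w_word = 30

let encode position word lookahead =
  (position lsl w_word lor word) lsl w_lookahead lor lookahead

let decode fact =
  ( fact lsr (w_word + w_lookahead),                 (* position *)
    (fact lsr w_lookahead) land (1 lsl w_word - 1),  (* word *)
    fact land (1 lsl w_lookahead - 1) )              (* lookahead *)

let () =
  let p, w, z = 12345, 678, 42 in
  assert (decode (encode p w z) = (p, w, z))
```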
It would be useless to consider a fact that violates this property; this cannot possibly lead to a successful reduction. In practice, this refinement allows reducing the number of facts that go through the queue by a factor of two. 2. [lookahead] is [any] iff the [current] state is solid. This sounds rather reasonable (when a state is entered by shifting, it is entered regardless of which symbol follows) and simplifies the implementation of the sub-module [F]. *) let invariant1 position _word lookahead = let current = Trie.current position in lookahead = any || not (causes_an_error current lookahead) let invariant2 position _word lookahead = let current = Trie.current position in (lookahead = any) = is_solid current (* [compatible z a] checks whether the terminal symbol [a] satisfies the lookahead assumption [z] -- which can be [any]. *) let compatible z a = if debug then begin assert (Terminal.non_error z); assert (Terminal.real a); end; z = any || z = a (* ------------------------------------------------------------------------ *) (* As in Dijkstra's algorithm, a priority queue contains the facts that await examination. The length of [word fact] serves as the priority of a fact. This guarantees that we discover shortest paths. (We never insert into the queue a fact whose priority is less than the priority of the last fact extracted out of the queue.) *) (* [LowIntegerPriorityQueue] offers very efficient operations (essentially constant time, for a small constant). It exploits the fact that priorities are low nonnegative integers. *) module Q = LowIntegerPriorityQueue let q = Q.create dummy (* In principle, there is no need to insert the fact into the queue if [F] already stores a comparable fact. We could perform this test in [enqueue]. However, a few experiments suggest that this is not worthwhile.
The running time increases (because membership in [F] is tested twice, upon inserting and upon extracting) and the memory consumption does not seem to go down significantly. *) let enqueue position word lookahead = (* [lookahead] can be [any], but cannot be [error] *) if debug then begin assert (Terminal.non_error lookahead); assert (invariant1 position word lookahead); assert (invariant2 position word lookahead); end; (* The length of [word] serves as the priority of this fact. *) let priority = W.length word in (* Encode and enqueue this fact. *) Q.add q (mkfact position word lookahead) priority (* ------------------------------------------------------------------------ *) (* Construct the [star] of every state [s]. Initialize the priority queue. *) let () = (* For every state [s], if the trie rooted at [s] is nontrivial, ... *) Trie.stars (fun s position -> (* ...then insert an initial fact into the priority queue. *) (* In order to respect invariants 1 and 2, we must distinguish two cases. If [s] is solid, then we insert a single fact, whose lookahead assumption is [any]. Otherwise, we must insert one initial fact for every terminal symbol [z] that does not cause an error in state [s]. *) let word = W.epsilon in if is_solid s then enqueue position word any else foreach_terminal_not_causing_an_error s (fun z -> enqueue position word z ) ); if X.verbose then Trie.verbose() (* ------------------------------------------------------------------------ *) (* The module [F] maintains a set of known facts. *) (* Three aspects of a fact are of particular interest: - its position [position], given by [position fact]; - its first symbol [a], given by [W.first (word fact) (lookahead fact)]; - its lookahead assumption [z], given by [lookahead fact]. For every triple of [position], [a], and [z], we store at most one fact (whose word has minimal length). Indeed, we are not interested in keeping track of several words that produce the same effect.
Only the shortest such word is of interest. Thus, the total number of facts accumulated by the algorithm is at most [T.n^2], where [T] is the total size of the tries that we have constructed, and [n] is the number of terminal symbols. (This number can be quite large. [T] can be in the tens of thousands, and [n] can be over one hundred. These figures lead to a theoretical upper bound of 100M. In practice, for T=25K and n=108, we observe that the algorithm gathers about 7M facts.) *) module F : sig (* [register fact] registers the fact [fact]. It returns [true] if this fact is new, i.e., no fact concerning the same triple of [position], [a], and [z] was previously known. *) val register: fact -> bool (* [query current z f] enumerates all known facts whose current state is [current] and whose lookahead assumption is compatible with [z]. The symbol [z] must be a real terminal symbol, i.e., cannot be [any]. *) val query: Lr1.node -> Terminal.t -> (fact -> unit) -> unit (* [size()] returns the number of facts currently stored in the set. *) val size: unit -> int (* [verbose()] outputs debugging & performance information. *) val verbose: unit -> unit end = struct (* We need to query the set of facts in two ways. In [register], we must test whether a proposed triple of [position], [a], [z] already appears in the set. In [query], we must find all facts that match a pair [current, z], where [current] is a state. (Note that [position] determines [current], but the converse is not true: a position contains more information besides the current state.) To address these needs, we use a two-level table. The first level is a matrix indexed by [current] and [z]. At the second level, we find sets of facts, where two facts are considered equal if they have the same triple of [position], [a], and [z]. In fact, we know at this level that all facts have the same [z] component, so only [position] and [a] are compared.
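The first-level matrix just described is flattened into a one-dimensional array. The indexing scheme can be illustrated in isolation as follows; the bounds below (5 states, 3 terminal symbols) are made up for illustration and do not come from Menhir.

```ocaml
(* A sketch of the first-level indexing used by [F]: the matrix indexed by
   [current] and [z] is flattened into a one-dimensional array. *)
let n_states = 5
let n_terminals = 3
let index current z = n_terminals * current + z
let table = Array.make (n_states * n_terminals) []

let () =
  (* Distinct (state, symbol) pairs map to distinct cells. *)
  assert (index 2 1 = 7);
  table.(index 2 1) <- ["a fact"];
  assert (table.(index 2 1) = ["a fact"]);
  assert (table.(index 2 0) = [])
```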
Because our facts satisfy invariant 2, [z] is [any] if and only if the state [current] is solid. This means that we are wasting quite a lot of space in the matrix (for a solid state, the whole line is empty, except for the [any] column). *) (* The level-2 sets. *) module M = MySet.Make(struct type t = fact let compare fact1 fact2 = if debug then assert (lookahead fact1 = lookahead fact2); (* Compare the two positions first. This can be done without going through [Trie.decode], by directly comparing the two integer identities. *) let c = Generic.compare (identity fact1) (identity fact2) in if debug then assert (c = Trie.compare (position fact1) (position fact2)); if c <> 0 then c else let z = lookahead fact1 in let a1 = W.first (word fact1) z and a2 = W.first (word fact2) z in (* note: [a1] and [a2] can be [any] here *) Terminal.compare a1 a2 end) (* The level-1 matrix. *) let table = Array.make (Lr1.n * Terminal.n) M.empty let index current z = Terminal.n * (Lr1.number current) + Terminal.t2i z let count = ref 0 let register fact = let current = Trie.current (position fact) in let z = lookahead fact in let i = index current z in let m = table.(i) in (* We crucially rely on the fact that [M.add] guarantees not to change the set if an ``equal'' fact already exists. Thus, a later, longer path is ignored in favor of an earlier, shorter path. *) let m' = M.add fact m in m != m' && begin incr count; table.(i) <- m'; true end let query current z f = if debug then assert (not (Terminal.equal z any)); (* If the state [current] is solid then the facts that concern it are stored in the column [any], and all of them are compatible with [z]. Otherwise, they are stored in all columns except [any], and only those stored in the column [z] are compatible with [z]. *) let i = index current (if is_solid current then any else z) in let m = table.(i) in M.iter f m let size () = !count let verbose () = Printf.eprintf "F stores %d facts.\n%!" 
(size()) end (* ------------------------------------------------------------------------ *) (* The module [E] is in charge of recording the non-terminal edges that we have discovered, or more precisely, the conditions under which these edges can be taken. It maintains a set of quadruples [s, nt, w, z], where such a quadruple means that in the state [s], the outgoing edge labeled [nt] can be taken by consuming the word [w], under the assumption that the next symbol is [z]. Again, the terminal symbol [a], given by [W.first w z], plays a role. For each quadruple [s, nt, a, z], we store at most one quadruple [s, nt, w, z]. Thus, internally, we maintain a mapping of [s, nt, a, z] to [w]. For greater simplicity, we do not allow [z] to be [any] in [register] or [query]. Allowing it would complicate things significantly, it seems. *) module E : sig (* [register s nt w z] records that, in state [s], the outgoing edge labeled [nt] can be taken by consuming the word [w], if the next symbol is [z]. It returns [true] if this information is new, i.e., if the underlying quadruple [s, nt, a, z] is new. The symbol [z] cannot be [any]. *) val register: Lr1.node -> Nonterminal.t -> W.word -> Terminal.t -> bool (* [query s nt a foreach] enumerates all words [w] and all real symbols [z] such that, in state [s], the outgoing edge labeled [nt] can be taken by consuming the word [w], under the assumption that the next symbol is [z], and the first symbol of the word [w.z] is [a]. The symbol [a] can be [any]. The function [foreach] can be either [foreach_terminal] or of the form [foreach_terminal_not_causing_an_error _]. It limits the symbols [z] that are considered. *) val query: Lr1.node -> Nonterminal.t -> Terminal.t -> (* foreach: *) ((Terminal.t -> unit) -> unit) -> (W.word -> Terminal.t -> unit) -> unit (* [size()] returns the number of edges currently stored in the set. *) val size: unit -> int (* [verbose()] outputs debugging & performance information. 
*) val verbose: unit -> unit end = struct (* At a high level, we must implement a mapping of [s, nt, a, z] to [w]. In practice, we can implement this specification using any combination of arrays, hash tables, balanced binary trees, and perfect hashing (i.e., packing several of [s], [nt], [a], [z] in one word.) Here, we choose to use an array, indexed by [s], of hash tables, indexed by a key that packs [nt], [a], and [z] in one word. According to a quick experiment, the final population of the hash table [table.(index s)] seems to be roughly [Terminal.n * Trie.size s]. We note that using an initial capacity of 0 and relying on the hash table's resizing mechanism has a significant cost, which is why we try to guess a good initial capacity. *) module H = Hashtbl let table = Array.init Lr1.n (fun i -> let size = Trie.size i in H.create (if size = 1 then 0 else Terminal.n * size) ) let index s = Lr1.number s let pack nt a z : int = (* We rely on the fact that we have at most 256 terminal symbols. *) (Nonterminal.n2i nt lsl 16) lor (Terminal.t2i a lsl 8) lor (Terminal.t2i z) let count = ref 0 let register s nt w z = if debug then assert (Terminal.real z); let i = index s in let m = table.(i) in let a = W.first w z in (* Note that looking at [a] in state [s] cannot cause an error. *) if debug then assert (not (causes_an_error s a)); let key = pack nt a z in if H.mem m key then false else begin incr count; H.add m key w; true end let rec query s nt a foreach f = if Terminal.equal a any then begin (* If [a] is [any], we query the table for every real symbol [a]. We can limit ourselves to symbols that do not cause an error in state [s]. Those that do certainly do not have an entry; see the assertion in [register] above. 
*) foreach_terminal_not_causing_an_error s (fun a -> query s nt a foreach f ) end else let i = index s in let m = table.(i) in foreach (fun z -> if debug then assert (Terminal.real z); let key = pack nt a z in match H.find m key with | w -> f w z | exception Not_found -> () ) let size () = !count let verbose () = Printf.eprintf "E stores %d edges.\n%!" (size()) end (* ------------------------------------------------------------------------ *) (* [new_edge s nt w z] is invoked when we discover that in the state [s], the outgoing edge labeled [nt] can be taken by consuming the word [w], under the assumption that the next symbol is [z]. We check whether this quadruple already exists in the set [E]. If not, then we add it, and we compute its consequences, in the form of new facts, which we insert into the priority queue for later examination. *) let new_edge s nt w z = if debug then assert (Terminal.real z); if E.register s nt w z then let sym = Symbol.N nt in (* Query [F] for existing facts which could be extended by following this newly discovered edge. They must be facts whose current state is [s] and whose lookahead assumption is compatible with [a]. For each such fact, ... *) F.query s (W.first w z) (fun fact -> if debug then assert (compatible (lookahead fact) (W.first w z)); (* ... try to take one step in the trie along an edge labeled [nt]. *) match Trie.step sym (position fact) with | position -> (* This takes us to a new state whose incoming symbol is [nt]. Hence, this state is not solid. In order to satisfy invariant 2, we must create a fact whose lookahead assumption is not [any]. That's fine, since our lookahead assumption is [z]. In order to satisfy invariant 1, we must check that [z] does not cause an error in this state.
*) if debug then assert (not (is_solid (Trie.current position))); if not (causes_an_error (Trie.current position) z) then let word = W.append (word fact) w in enqueue position word z | exception Not_found -> (* Could not take a step in the trie. This means this branch leads nowhere of interest, and was pruned when the trie was constructed. *) () ) (* ------------------------------------------------------------------------ *) (* [new_fact fact] is invoked when we discover a new fact (i.e., one that was not previously known). It studies the consequences of this fact. These consequences are of two kinds: - As in Dijkstra's algorithm, the new fact can be viewed as a newly discovered vertex. We study its (currently known) outgoing edges, and enqueue new facts in the priority queue. - Sometimes, a fact can also be viewed as a newly discovered edge. This is the case when the word that took us from [source] to [current] represents a production of the grammar and [current] is willing to reduce this production. We record the existence of this edge, and re-inspect any previously discovered vertices which are interested in this outgoing edge. *) let new_fact fact = (* Throughout this rather long function, there is just one [fact]. Let's name its components right now, so as to avoid accessing them several times. (That could be costly, as it requires decoding the fact.) *) let position = position fact and lookahead = lookahead fact and word = word fact in let source = Trie.source position and current = Trie.current position in (* 1. View [fact] as a vertex. Examine the transitions out of [current]. For every transition labeled by a symbol [sym] and into a state [target], ... *) Lr1.transitions current |> SymbolMap.iter (fun sym target -> (* ... try to follow this transition in the trie [position], down to a child which we call [child]. *) match Trie.step sym position, sym with | exception Not_found -> (* Could not take a step in the trie. 
This means this transition leads nowhere of interest. *) () | child, Symbol.T t -> (* 1a. The transition exists in the trie, and [sym] is in fact a terminal symbol [t]. We note that [t] cannot be the [error] token, because the trie does not have any edges labeled [error]. *) if debug then begin assert (Lr1.Node.compare (Trie.current child) target = 0); assert (is_solid target); assert (Terminal.non_error t); end; (* If the lookahead assumption [lookahead] is compatible with [t], then we derive a new fact, where one more edge has been taken, and enqueue this new fact for later examination. *) (* The state [target] is solid, i.e., its incoming symbol is terminal. This state is always entered without consideration for the next lookahead symbol. Thus, we can use [any] as the lookahead assumption in the new fact that we produce. If we did not have [any], we would have to produce one fact for every possible lookahead symbol. *) if compatible lookahead t then let word = W.append word (W.singleton t) in enqueue child word any | child, Symbol.N nt -> (* 1b. The transition exists in the trie, and [sym] is in fact a nonterminal symbol [nt]. *) if debug then begin assert (Lr1.Node.compare (Trie.current child) target = 0); assert (not (is_solid target)); end; (* We need to know how this nonterminal edge can be taken. We query [E] for a word [w] that allows us to take this edge. In general, the answer depends on the terminal symbol [z] that comes *after* this word: we try all such symbols. We must make sure that the first symbol of the word [w.z] satisfies the lookahead assumption [lookahead]; this is ensured by passing this information to [E.query]. *) (* It could be the case that, due to a default reduction, the answer to our query does not depend on [z], and we are wasting work. However, allowing [z] to be [any] in [E.query], and taking advantage of this to increase performance, seems difficult. 
*) let foreach = foreach_terminal_not_causing_an_error target in E.query current nt lookahead foreach (fun w z -> if debug then assert (compatible lookahead (W.first w z)); let word = W.append word w in enqueue child word z ) ); (* 2. View [fact] as a possible edge. This is possible if the path from [source] to the [current] state represents a production [prod] and [current] is willing to reduce this production. Then, reducing [prod] takes us all the way back to [source]. Thus, this production gives rise to an edge labeled [nt] -- the left-hand side of [prod] -- out of [source]. *) let z = lookahead in if not (Terminal.equal z any) then begin (* 2a. The lookahead assumption [z] is a real terminal symbol. We check whether [current] is willing to reduce some production [prod] on [z], and whether the sub-trie [position] accepts [prod], which means that this reduction takes us back to the root of the trie. If so, we have discovered a new edge. *) match has_reduction current z with | Some prod when Trie.accepts prod position -> new_edge source (Production.nt prod) word z | _ -> () end else begin (* 2b. The lookahead assumption is [any]. We must consider every pair [prod, z] such that the [current] state can reduce [prod] on [z] and [position] accepts [prod]. *) match has_default_reduction current with | Some (prod, _) -> if Trie.accepts prod position then (* [new_edge] does not accept [any] as its 4th parameter, so we must iterate over all terminal symbols. *) foreach_terminal (fun z -> new_edge source (Production.nt prod) word z ) | None -> TerminalMap.iter (fun z prods -> if Terminal.non_error z then let prod = Misc.single prods in if Trie.accepts prod position then new_edge source (Production.nt prod) word z ) (Lr1.reductions current) end (* ------------------------------------------------------------------------ *) (* The main loop of the algorithm. *) (* [level] is the length of [word fact] for the facts that we are examining at the moment. 
[extracted] counts how many facts we have extracted out of the priority queue. [considered] counts how many of these were found to be new, and subsequently passed to [new_fact]. *) let level, extracted, considered = ref 0, ref 0, ref 0 let done_with_level () = Printf.eprintf "Done with level %d.\n" !level; W.verbose(); F.verbose(); E.verbose(); Printf.eprintf "Q stores %d facts.\n" (Q.cardinal q); Printf.eprintf "%d facts extracted out of Q, of which %d considered.\n%!" !extracted !considered let () = Q.repeat q (fun fact -> incr extracted; if F.register fact then begin if X.verbose && W.length (word fact) > !level then begin done_with_level(); level := W.length (word fact); end; incr considered; new_fact fact end ); if X.verbose then done_with_level(); Time.tick "Running LRijkstra" (* ------------------------------------------------------------------------ *) (* We are done. Expose accessor functions. *) (* We expose [E.query], but simplify its interface by specializing it with [foreach_terminal]. We also restrict it to the case where [a] is real. *) let query s nt a = if debug then assert (Terminal.real a); E.query s nt a foreach_terminal (* Expose some numbers. *) let facts, edge_facts = F.size(), E.size() let total_trie_size = Trie.total_size() end

menhir-20200123/src/LRijkstraCore.mli

(******************************************************************************) (* Menhir *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (******************************************************************************) open Grammar (* This is the core of the reachability analysis.
After the automaton has been constructed, this (expensive) analysis determines exactly under which conditions each nonterminal edge in the automaton can be taken. This information can then be used to determine how to reach certain states in the automaton; see, e.g., [LRijkstra]. *) (* In this analysis, we explicitly ignore the [error] token. (We display a warning if the grammar uses this token.) Thus, we disregard any reductions or transitions that take place when the lookahead symbol is [error]. As a result, any state whose incoming symbol is [error] is found unreachable. It would be too complicated to have to create a first error in order to be able to take certain transitions or drop certain parts of the input. *) module Run (X : sig (* If [verbose] is set, produce various messages on [stderr]. *) val verbose: bool end) : sig (* A representation of words of terminal symbols. See [GrammarFunctor]. *) module W : sig type word val singleton: Terminal.t -> word val append: word -> word -> word val length: word -> int val elements: word -> Terminal.t list val compare: word -> word -> int end (* [query s nt a] enumerates all words [w] and all symbols [z] such that, in state [s], the outgoing edge labeled [nt] can be taken by consuming the word [w], under the assumption that the next symbol is [z], and the first symbol of the word [w.z] is [a]. *) val query: (* s: *) Lr1.node -> (* nt: *) Nonterminal.t -> (* a: *) Terminal.t -> (* f: *) (W.word -> Terminal.t -> unit) -> unit (* [facts] is the total number of facts discovered. [edge_facts] is the total number of edge facts discovered. [total_trie_size] is the sum of the sizes of the tries that are internally constructed in the module [Trie]. These numbers are provided for information only. 
val facts: int val edge_facts: int val total_trie_size: int end

menhir-20200123/src/LowIntegerPriorityQueue.ml

(* This module implements a simple-minded priority queue, under the assumption that priorities are low nonnegative integers. *) module MyArray = ResizableArray module MyStack = ResizableArray type 'a t = { (* A priority queue is represented as a resizable array, indexed by priorities, of stacks (implemented as resizable arrays). There is no a priori bound on the size of the main array -- its size is increased if needed. It is up to the user to use priorities of reasonable magnitude. *) a: 'a MyStack.t MyArray.t; (* Index of lowest nonempty stack, if there is one; or lower (sub-optimal, but safe). If the queue is empty, [best] is arbitrary. *) mutable best: int; (* Current number of elements in the queue. Used in [remove] to stop the search for a nonempty bucket. *) mutable cardinal: int; } let create default = (* Set up the main array so that it initially has 16 priority levels and, whenever new levels are added, each of them is initialized with a fresh empty stack. The dummy stack is never accessed; it is used to fill empty physical slots in the main array. *) let dummy = MyStack.make_ 0 default in let a = MyArray.make 16 dummy (fun _ -> MyStack.make_ 1024 default) in { a; best = 0; cardinal = 0 } let add q x priority = assert (0 <= priority); q.cardinal <- q.cardinal + 1; (* Grow the main array if necessary. *) if MyArray.length q.a <= priority then MyArray.resize q.a (priority + 1); (* Find out which stack we should push into. *) let xs = MyArray.get q.a priority in (* assert (xs != MyArray.default q.a); *) (* Push. *) MyStack.push xs x; (* Decrease [q.best], if necessary, so as not to miss the new element. In the special case of Dijkstra's algorithm or A*, this never happens. *) if priority < q.best then q.best <- priority let is_empty q = q.cardinal = 0 let cardinal q = q.cardinal let rec remove_nonempty q = (* Look for the next nonempty bucket. We know there is one. This may seem inefficient, because it is a linear search. However, in applications where [q.best] never decreases, the cumulated cost of this loop is the maximum priority ever used, which is good. *) let xs = MyArray.get q.a q.best in if MyStack.length xs = 0 then begin (* As noted below, [MyStack.pop] does not physically shrink the stack. When we find that a priority level has become empty, we physically empty it, so as to free the (possibly large) space that it takes up. This strategy is good when the client is Dijkstra's algorithm or A*. *) let dummy = MyArray.default q.a in MyArray.set q.a q.best dummy; q.best <- q.best + 1; remove_nonempty q end else begin q.cardinal <- q.cardinal - 1; Some (MyStack.pop xs) (* Note: [MyStack.pop] does not shrink the physical array underlying the stack. This is good, because we are likely to push new elements into this stack. *) end let remove q = if q.cardinal = 0 then None else remove_nonempty q let rec repeat q f = match remove q with | None -> () | Some x -> f x; repeat q f

menhir-20200123/src/LowIntegerPriorityQueue.mli

(** This module implements a simple-minded priority queue, under the assumption that priorities are low nonnegative integers. *) (** The type of priority queues. *) type 'a t (** [create default] creates an empty priority queue. The [default] value is used to fill empty physical slots, but is otherwise irrelevant. *) val create: 'a -> 'a t (** [add q x p] inserts the element [x], with priority [p], into the queue [q]. *) val add: 'a t -> 'a -> int -> unit (** [remove q] extracts out of [q] and returns an element with minimum priority. *) val remove: 'a t -> 'a option (** [is_empty q] tests whether the queue [q] is empty. *) val is_empty: 'a t -> bool (** [cardinal q] returns the number of elements in the queue [q]. *) val cardinal: 'a t -> int (** [repeat q f] repeatedly extracts an element with minimum priority out of [q] and passes it to [f] (which may insert new elements into [q]), until [q] is exhausted. *) val repeat: 'a t -> ('a -> unit) -> unit

menhir-20200123/src/Makefile

# [make] compiles Menhir. .PHONY: all all: @ make -C .. $@

menhir-20200123/src/Maps.ml

(* BEGIN PERSISTENT_MAPS *) module type PERSISTENT_MAPS = sig type key type 'data t val empty: 'data t val add: key -> 'data -> 'data t -> 'data t val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* END PERSISTENT_MAPS *) (* BEGIN IMPERATIVE_MAPS *) module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val clear: 'data t -> unit val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* END IMPERATIVE_MAPS *) (* BEGIN IMPERATIVE_MAP *) module type IMPERATIVE_MAP = sig type key type data val set: key -> data -> unit val get: key -> data option end (* END IMPERATIVE_MAP *) module PersistentMapsToImperativeMaps (M : PERSISTENT_MAPS) : IMPERATIVE_MAPS with type key = M.key and type 'data t = 'data M.t ref = struct type key = M.key type 'data t = 'data M.t ref let create () = ref M.empty let clear t = t := M.empty let add k d t = t := M.add k d !t let find k t = M.find k !t let iter f t = M.iter f !t end module ImperativeMapsToImperativeMap (M : IMPERATIVE_MAPS) (D : sig type data end) : IMPERATIVE_MAP with type key = M.key and type data = D.data = struct type key = M.key type data = D.data let m = M.create() let set k d = M.add k d m let get k = try Some (M.find k m) with Not_found -> None end module ArrayAsImperativeMaps (K : sig val n: int end) : IMPERATIVE_MAPS with type key = int and type 'data t = 'data option array = struct open K type key = int type 'data t = 'data option array let create () = Array.make n None let clear m = Array.fill m 0 n None let add key data m = m.(key) <- Some data let find key m = match m.(key) with | None -> raise Not_found | Some data -> data let iter f m = Array.iteri (fun key data -> match data with | None -> () | Some data -> f key data ) m end module HashTableAsImperativeMaps (H : Hashtbl.HashedType)
: IMPERATIVE_MAPS with type key = H.t = struct include Hashtbl.Make(H) let create () = create 1023 let add key data table = add table key data let find table key = find key table end module TrivialHashedType (T : sig type t end) : Hashtbl.HashedType with type t = T.t = struct include T let equal = (=) let hash = Hashtbl.hash end

menhir-20200123/src/Maps.mli

(* This module defines three signatures for association maps, together with a number of conversion functors. *) (* Following the convention of the OCaml standard library, the [find] functions raise [Not_found] when the key is not a member of the domain of the map. By contrast, [get] returns an option.
*) (* BEGIN PERSISTENT_MAPS *) module type PERSISTENT_MAPS = sig type key type 'data t val empty: 'data t val add: key -> 'data -> 'data t -> 'data t val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* END PERSISTENT_MAPS *) (* BEGIN IMPERATIVE_MAPS *) module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val clear: 'data t -> unit val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end (* END IMPERATIVE_MAPS *) (* BEGIN IMPERATIVE_MAP *) module type IMPERATIVE_MAP = sig type key type data val set: key -> data -> unit val get: key -> data option end (* END IMPERATIVE_MAP *) (* An implementation of persistent maps can be made to satisfy the interface of imperative maps. An imperative map is represented as a persistent map, wrapped within a reference cell. *) module PersistentMapsToImperativeMaps (M : PERSISTENT_MAPS) : IMPERATIVE_MAPS with type key = M.key and type 'data t = 'data M.t ref (* An implementation of imperative maps can be made to satisfy the interface of a single imperative map. This map is obtained via a single call to [create]. *) module ImperativeMapsToImperativeMap (M : IMPERATIVE_MAPS) (D : sig type data end) : IMPERATIVE_MAP with type key = M.key and type data = D.data (* An implementation of imperative maps as arrays is possible if keys are consecutive integers. *) module ArrayAsImperativeMaps (K : sig val n: int end) : IMPERATIVE_MAPS with type key = int and type 'data t = 'data option array (* An implementation of imperative maps as a hash table. *) module HashTableAsImperativeMaps (H : Hashtbl.HashedType) : IMPERATIVE_MAPS with type key = H.t (* A trivial implementation of equality and hashing. 
*) module TrivialHashedType (T : sig type t end) : Hashtbl.HashedType with type t = T.t

menhir-20200123/src/Memoize.ml

(* Sigs. *) module type TYPE = sig type t end module type PERSISTENT_MAPS = sig type key type 'data t val empty: 'data t val add: key -> 'data -> 'data t -> 'data t val find: key -> 'data t -> 'data val iter: (key -> 'data -> unit) -> 'data t -> unit end module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val clear: 'data t -> unit val iter: (key -> 'data -> unit) -> 'data t -> unit end module type MEMOIZER = sig (* A type of keys. *) type key (* A memoization combinator for this type. *) val memoize: (key -> 'a) -> (key -> 'a) (* A recursive memoization combinator for this type. *) val fix: ((key -> 'a) -> (key -> 'a)) -> (key -> 'a) (* [defensive_fix] works like [fix], except it additionally detects circular dependencies, which can arise if the second-order function supplied by the user does not follow a well-founded recursion pattern. When the user invokes [f x], where [f] is the function returned by [defensive_fix], if a cyclic dependency is detected, then [Cycle (zs, z)] is raised, where the list [zs] begins with [z] and continues with a series of intermediate keys, leading back to [z]. Note that undetected divergence remains possible; this corresponds to an infinite dependency chain, without a cycle.
*) exception Cycle of key list * key val defensive_fix: ((key -> 'a) -> (key -> 'a)) -> (key -> 'a) end (* -------------------------------------------------------------------------- *) (* Glue. *) module INT = struct type t = int end module STRING = struct type t = string end module TrivialHashedType (T : TYPE) = struct include T let equal = (=) let hash = Hashtbl.hash end module PersistentMapsToImperativeMaps (M : PERSISTENT_MAPS) = struct type key = M.key type 'data t = 'data M.t ref let create () = ref M.empty let clear t = t := M.empty let add k d t = t := M.add k d !t let find k t = M.find k !t let iter f t = M.iter f !t end module Adapt (T : Hashtbl.S) = struct include T (* types: [key], ['data t] *) (* values: [clear], [iter] *) let create () = T.create 1023 let add key data table = T.add table key data let find table key = T.find key table end module HashTablesAsImperativeMaps (H : Hashtbl.HashedType) = Adapt(Hashtbl.Make(H)) (* -------------------------------------------------------------------------- *) (* Memoize. *) (* [rev_take accu n xs] is [accu @ rev (take n xs)], where [take n xs] takes the first [n] elements of the list [xs]. The length of [xs] must be at least [n]. *) let rec rev_take accu n xs = match n, xs with | 0, _ -> accu | _, [] -> (* The list is too short. This cannot happen. *) assert false | _, x :: xs -> rev_take (x :: accu) (n - 1) xs module Make (M : IMPERATIVE_MAPS) = struct type key = M.key let add x y table = M.add x y table; y (* [memoize] could be defined as a special case of [fix] via the declaration [let memoize f = fix (fun _ x -> f x)]. The following direct definition is perhaps easier to understand and may give rise to more efficient code. 
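The [fix] combinator described above can be demonstrated in a self-contained way, with [Hashtbl] standing in for the map functor [M]: an open-recursive Fibonacci function is closed and memoized in one step. This mirrors the definition of [fix], but it is an illustrative sketch rather than Menhir's code.

```ocaml
(* A standalone version of [fix], specialized to integer keys. *)
let fix (ff : (int -> 'a) -> (int -> 'a)) : int -> 'a =
  let table = Hashtbl.create 16 in
  let rec f x =
    match Hashtbl.find_opt table x with
    | Some y -> y
    | None -> let y = ff f x in Hashtbl.add table x y; y
  in
  f

(* Recursive calls go through the memoizing function [f], so each
   subproblem is computed at most once. *)
let fib = fix (fun fib n -> if n < 2 then n else fib (n - 1) + fib (n - 2))

let () = assert (fib 40 = 102334155)
```

Without memoization, this definition of [fib 40] would take exponential time; through [fix], it runs in linear time.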
*) let memoize (f : key -> 'a) : key -> 'a = let table = M.create() in fun x -> try M.find x table with Not_found -> add x (f x) table let fix (ff : (key -> 'a) -> (key -> 'a)) : key -> 'a = let table = M.create() in let rec f x = try M.find x table with Not_found -> add x (ff f x) table in f (* In the implementation of [defensive_fix], we choose to use two tables. A permanent table, [table] maps keys to values. Once a pair [x, y] has been added to this table, it remains present forever: [x] is stable, and a call to [f x] returns [y] immediately. A transient table, [marked], is used only while a call is in progress. This table maps keys to integers: for each key [x], it records the depth of the stack at the time [x] was pushed onto the stack. Finally, [stack] is a list of the keys currently under examination (most recent key first), and [depth] is the length of the list [stack]. Recording integer depths in the table [marked] allows us to identify the desired cycle, a prefix of the list [stack], without requiring an equality test on keys. *) exception Cycle of key list * key let defensive_fix (ff : (key -> 'a) -> (key -> 'a)) : key -> 'a = (* Create the permanent table. *) let table = M.create() in (* Define the main recursive function. *) let rec f stack depth marked (x : key) : 'a = try M.find x table with Not_found -> match M.find x marked with | i -> (* [x] is marked, and was pushed onto the stack at a time when the stack depth was [i]. We have found a cycle. Fail. Cut a prefix of the reversed stack, which represents the cycle that we have detected, and reverse it on the fly. *) raise (Cycle (rev_take [] (depth - i) stack, x)) | exception Not_found -> (* [x] is not marked. Mark it while we work on it. There is no need to unmark [x] afterwards; inserting it into [table] indicates that it has stabilized. There also is no need to catch and re-raise the exception [Cycle]; we just let it escape. 
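The depth-marking scheme described above can be sketched in isolation. This is a simplified, hypothetical variant of [defensive_fix], specialized to integer keys and plain hash tables:

```ocaml
(* [rev_take accu n xs] is [accu @ rev (take n xs)]; same helper as above. *)
let rec rev_take accu n xs =
  if n = 0 then accu
  else match xs with
    | [] -> assert false
    | x :: xs -> rev_take (x :: accu) (n - 1) xs

exception Cycle of int list * int

let defensive_fix (ff : (int -> 'a) -> (int -> 'a)) : int -> 'a =
  let table = Hashtbl.create 31 in
  let rec f stack depth marked x =
    try Hashtbl.find table x with Not_found ->
      match Hashtbl.find_opt marked x with
      | Some i ->
          (* [x] was pushed when the stack had depth [i]: we have found a
             cycle, recovered as a prefix of the reversed stack. *)
          raise (Cycle (rev_take [] (depth - i) stack, x))
      | None ->
          Hashtbl.add marked x depth;
          let y = ff (f (x :: stack) (depth + 1) marked) x in
          Hashtbl.add table x y;
          y
  in
  (* A fresh transient table [marked] is created for each toplevel query. *)
  fun x -> f [] 0 (Hashtbl.create 31) x

(* A function whose dependency graph contains the cycle 0 -> 1 -> 0. *)
let diverging = defensive_fix (fun f x -> f ((x + 1) mod 2))

let () =
  match diverging 0 with
  | _ -> assert false
  | exception Cycle (zs, z) -> assert (z = 0 && zs = [0; 1])
```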
*) M.add x depth marked; let stack = x :: stack and depth = depth + 1 in let y = ff (f stack depth marked) x in add x y table in fun x -> (* Create the transient table. *) let marked = M.create() and stack = [] and depth = 0 in (* Answer this query. *) f stack depth marked x end module ForOrderedType (T : Map.OrderedType) = Make(PersistentMapsToImperativeMaps(Map.Make(T))) module ForHashedType (T : Hashtbl.HashedType) = Make(HashTablesAsImperativeMaps(T)) module ForType (T : TYPE) = ForHashedType(TrivialHashedType(T)) module Int = ForType(INT) module String = ForType(STRING) menhir-20200123/src/Memoize.mli000066400000000000000000000070001361226111300161170ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This code is copied from the library Fix by François Pottier, with manual tweaks. We prefer to avoid any dependency on an external library. *) module type TYPE = sig type t end module type IMPERATIVE_MAPS = sig type key type 'data t val create: unit -> 'data t val add: key -> 'data -> 'data t -> unit val find: key -> 'data t -> 'data val clear: 'data t -> unit val iter: (key -> 'data -> unit) -> 'data t -> unit end module type MEMOIZER = sig (* A type of keys. *) type key (* A memoization combinator for this type. *) val memoize: (key -> 'a) -> (key -> 'a) (* A recursive memoization combinator for this type. 
*) val fix: ((key -> 'a) -> (key -> 'a)) -> (key -> 'a) (* [defensive_fix] works like [fix], except it additionally detects circular dependencies, which can arise if the second-order function supplied by the user does not follow a well-founded recursion pattern. When the user invokes [f x], where [f] is the function returned by [defensive_fix], if a cyclic dependency is detected, then [Cycle (zs, z)] is raised, where the list [zs] begins with [z] and continues with a series of intermediate keys, leading back to [z]. Note that undetected divergence remains possible; this corresponds to an infinite dependency chain, without a cycle. *) exception Cycle of key list * key val defensive_fix: ((key -> 'a) -> (key -> 'a)) -> (key -> 'a) end (* [Make] constructs a memoizer for a type [key] that is equipped with an implementation of imperative maps. *) module Make (M : IMPERATIVE_MAPS) : MEMOIZER with type key = M.key (* [ForOrderedType] is a special case of [Make] where it suffices to pass an ordered type [T] as an argument. A reference to a persistent map is used to hold the memoization table. *) module ForOrderedType (T : Map.OrderedType) : MEMOIZER with type key = T.t (* [ForHashedType] is a special case of [Make] where it suffices to pass a hashed type [T] as an argument. A hash table is used to hold the memoization table. *) module ForHashedType (T : Hashtbl.HashedType) : MEMOIZER with type key = T.t (* [ForType] is a special case of [Make] where it suffices to pass an arbitrary type [T] as an argument. A hash table is used to hold the memoization table. OCaml's built-in generic equality and hash functions are used. *) module ForType (T : TYPE) : MEMOIZER with type key = T.t (* Memoizers for some common types. 
*) module Int : MEMOIZER with type key = int module String : MEMOIZER with type key = string menhir-20200123/src/MySet.ml000066400000000000000000000075511361226111300154150ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) module Make (Ord: Map.OrderedType) = struct type elt = Ord.t type t = Empty | Node of t * elt * t * int (* Sets are represented by balanced binary trees (the heights of the children differ by at most 2 *) let height = function Empty -> 0 | Node(_, _, _, h) -> h (* Creates a new node with left son l, value v and right son r. We must have all elements of l < v < all elements of r. l and r must be balanced and | height l - height r | <= 2. Inline expansion of height for better speed. *) let create l v r = let hl = match l with Empty -> 0 | Node(_,_,_,h) -> h in let hr = match r with Empty -> 0 | Node(_,_,_,h) -> h in Node(l, v, r, (if hl >= hr then hl + 1 else hr + 1)) (* Same as create, but performs one step of rebalancing if necessary. Assumes l and r balanced and | height l - height r | <= 3. Inline expansion of create for better speed in the most frequent case where no rebalancing is required. 
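As a sanity check on the invariant stated above (the children's heights differ by at most 2, and each node caches its height), here is a small standalone sketch; the [balanced] checker is illustrative and not part of this module:

```ocaml
(* The same representation as above, on a generic element type. *)
type 'a tree = Empty | Node of 'a tree * 'a * 'a tree * int

let height = function Empty -> 0 | Node (_, _, _, h) -> h

(* Check that, at every node, the balance invariant holds and the cached
   height field is correct. *)
let rec balanced = function
  | Empty -> true
  | Node (l, _, r, h) ->
      abs (height l - height r) <= 2
      && h = 1 + max (height l) (height r)
      && balanced l && balanced r

let () =
  let t =
    Node (Node (Empty, 1, Empty, 1), 2, Node (Empty, 3, Empty, 1), 2)
  in
  assert (balanced t)
```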
*) let bal l v r = let hl = match l with Empty -> 0 | Node(_,_,_,h) -> h in let hr = match r with Empty -> 0 | Node(_,_,_,h) -> h in if hl > hr + 2 then begin match l with Empty -> invalid_arg "Set.bal" | Node(ll, lv, lr, _) -> if height ll >= height lr then create ll lv (create lr v r) else begin match lr with Empty -> invalid_arg "Set.bal" | Node(lrl, lrv, lrr, _)-> create (create ll lv lrl) lrv (create lrr v r) end end else if hr > hl + 2 then begin match r with Empty -> invalid_arg "Set.bal" | Node(rl, rv, rr, _) -> if height rr >= height rl then create (create l v rl) rv rr else begin match rl with Empty -> invalid_arg "Set.bal" | Node(rll, rlv, rlr, _) -> create (create l v rll) rlv (create rlr rv rr) end end else Node(l, v, r, (if hl >= hr then hl + 1 else hr + 1)) (* [add x t] guarantees that it returns [t] (physically unchanged) if [x] is already a member of [t]. *) let rec add x = function Empty -> Node(Empty, x, Empty, 1) | Node(l, v, r, _) as t -> let c = Ord.compare x v in if c = 0 then t else if c < 0 then let l' = add x l in if l == l' then t else bal l' v r else let r' = add x r in if r == r' then t else bal l v r' let empty = Empty let rec find x = function Empty -> raise Not_found | Node(l, v, r, _) -> let c = Ord.compare x v in if c = 0 then v else find x (if c < 0 then l else r) let rec iter f = function Empty -> () | Node(l, v, r, _) -> iter f l; f v; iter f r end menhir-20200123/src/MySet.mli000066400000000000000000000030151361226111300155550ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*)
(*                                                                            *)
(******************************************************************************)

(* This is a stripped-down copy of the [Set] module from OCaml's standard
   library. The only difference is that [add x t] guarantees that it returns
   [t] (physically unchanged) if [x] is already a member of [t]. This yields
   fewer memory allocations and an easy way of testing whether the element
   was already present in the set before it was added. *)

module Make (Ord: Map.OrderedType) : sig
  type elt = Ord.t
  type t
  val empty: t
  val add: elt -> t -> t
  val find: elt -> t -> elt (* may raise [Not_found] *)
  val iter: (elt -> unit) -> t -> unit
end

menhir-20200123/src/SelectiveExpansion.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

let value = Positions.value
let unknown = Positions.unknown_pos

open Syntax
open GroundSort

(* -------------------------------------------------------------------------- *)

(* Expansion modes. *)

type mode =
  | ExpandHigherSort
  | ExpandAll

(* -------------------------------------------------------------------------- *)

(* Expansion can be understood as traversing a graph where every vertex is
   labeled with a pair of a nonterminal symbol [nt] and an instantiation of
   the formal parameters of [nt]. *)

(* We allow partial instantiations, where some of the formal parameters of
   [nt] are instantiated, while others remain present. For this reason, we
   represent an instantiation as a list of *optional* actual parameters.
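A toy illustration of this representation, with hypothetical simplified types (not the actual [Syntax.parameter]): the instantiation of F(X,_,Z) pins the first and third parameters and leaves the middle one in place:

```ocaml
(* A simplified stand-in for Menhir's parameters. *)
type parameter = Var of string | App of string * parameter list

(* [None] marks a formal parameter left uninstantiated; [Some p] marks a
   formal parameter instantiated with the actual parameter [p]. *)
type instantiation = parameter option list

(* The label of the vertex F(X,_,Z): F partially specialized. *)
let f_x_hole_z : string * instantiation =
  ("F", [ Some (Var "X"); None; Some (Var "Z") ])

let () =
  let (nt, inst) = f_x_hole_z in
  assert (nt = "F");
  (* Exactly one formal parameter remains present. *)
  assert (List.length (List.filter (fun po -> po = None) inst) = 1)
```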
*) (* The actual parameters that appear in an instantiation make sense *in the source namespace* (at the toplevel). That is, they refer to (terminal and nonterminal) symbols that exist (at the toplevel) in the original grammar. *) type instantiation = parameter option list type label = nonterminal * instantiation (* Equality and hashing for labels. *) module Label = struct type t = label let equal (nt1, inst1) (nt2, inst2) = nt1 = nt2 && List.for_all2 (Option.equal Parameters.equal) inst1 inst2 let hash (nt, inst) = Hashtbl.hash (nt, Misc.ListExtras.hash (Option.hash Parameters.hash) inst) end (* -------------------------------------------------------------------------- *) (* [mangle label] chooses a concrete name for the new nonterminal symbol that corresponds to the label [label]. *) (* We include parentheses and commas in this name, because that is readable and acceptable in many situations. We replace them with underscores in situations where these characters are not valid; see [Misc.normalize]. *) let mangle_po (po : parameter option) = match po with | None -> (* When a parameter remains uninstantiated, we put an underscore in its place. *) "_" | Some p -> Parameters.print false p let mangle ((nt, pos) : label) : nonterminal = if pos = [] then nt else Printf.sprintf "%s(%s)" nt (Misc.separated_list_to_string mangle_po "," pos) (* -------------------------------------------------------------------------- *) (* An environment maps all of the formal parameters of a rule to actual parameters, which make sense in the source namespace. *) module Env = StringMap type env = parameter Env.t let subst_symbol env sym : parameter = try Env.find (value sym) env with Not_found -> (* [x] is not a formal parameter. It is a toplevel symbol. 
*)
*) ParameterVar sym let apply (param : parameter) (params : parameter list) : parameter = match param with | ParameterVar sym -> assert (params <> []); ParameterApp (sym, params) | ParameterApp _ -> (* In a well-sorted grammar, only a variable can have higher sort. Here, [param] must have higher sort, so [param] must be a variable. This case cannot arise. *) assert false | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on. *) assert false let rec subst_parameter env param : parameter = match param with | ParameterVar sym -> subst_symbol env sym | ParameterApp (sym, params) -> assert (params <> []); apply (subst_symbol env sym) (subst_parameters env params) | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on. *) assert false and subst_parameters env params = List.map (subst_parameter env) params (* -------------------------------------------------------------------------- *) (* -------------------------------------------------------------------------- *) (* For syntactic convenience, the rest of this file is a functor. *) module Run (G : sig (* Expansion mode. *) val mode: mode (* Sort information. *) val sorts: SortInference.sorts (* The grammar [g] whose expansion is desired. *) val g : grammar end) = struct open G (* -------------------------------------------------------------------------- *) (* Determining the sort of a symbol or parameter. *) (* Be careful: these functions use the toplevel sort environment [sorts], so they must not be used within a rule. (The sort environment would have to be extended with information about the formal parameters.) *) let sort symbol = try StringMap.find (value symbol) sorts with Not_found -> assert false let sort param = match param with | ParameterVar sym -> sort sym | ParameterApp (_, params) -> assert (params <> []); (* An application always has sort [*]. 
*)
*) star | ParameterAnonymous _ -> assert false (* -------------------------------------------------------------------------- *) (* Looking up the [%attribute] declarations, looking for attributes attached with a nonterminal symbol [nt]. This is used when we create a specialized version of this symbol. *) (* We use an inefficient linear search, but that shouldn't be a problem. *) let global_attributes (nt : symbol) : attribute list = let param = ParameterVar (unknown nt) in List.concat (List.map (fun (params, attrs) -> if List.exists (Parameters.equal param) params then attrs else [] ) g.p_symbol_attributes) (* -------------------------------------------------------------------------- *) (* A queue keeps track of the graph vertices that have been discovered but not yet visited. *) let enqueue, repeatedly = let queue = Queue.create() in let enqueue label = Queue.add label queue and repeatedly visit = Misc.qiter visit queue in enqueue, repeatedly (* -------------------------------------------------------------------------- *) (* A hash table is used to mark the graph vertices that have been discovered. *) let mark, marked = let module H = Hashtbl.Make(Label) in let table = H.create 128 in let mark label = H.add table label () and marked label = H.mem table label in mark, marked (* -------------------------------------------------------------------------- *) (* The rules of the expanded grammar are gradually collected. *) let emit, rules = let rules = ref StringMap.empty in let emit rule = assert (not (StringMap.mem rule.pr_nt !rules)); rules := StringMap.add rule.pr_nt rule !rules and rules() = !rules in emit, rules (* -------------------------------------------------------------------------- *) (* On top of the function [mangle], we set up a mechanism that checks that every (normalized) mangled name is unique. (Indeed, in principle, there could be clashes, although in practice this is unlikely.) 
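A toy sketch of why this freshness check matters: mangled names contain parentheses and commas, and a normalized variant replaces them with underscores, so distinct labels could in principle collide after normalization. The functions below are simplified stand-ins for [mangle] and [Misc.normalize], for illustration only:

```ocaml
(* A simplified mangler: a readable name for a specialized symbol. *)
let mangle nt actuals =
  if actuals = [] then nt
  else Printf.sprintf "%s(%s)" nt (String.concat "," actuals)

(* A simplified normalizer: replace the characters that are not valid in
   every context with underscores. *)
let normalize name =
  String.map (function '(' | ')' | ',' -> '_' | c -> c) name

let () =
  assert (mangle "list" ["expr"] = "list(expr)");
  assert (normalize (mangle "list" ["expr"]) = "list_expr_")
```

After normalization, the label list(expr) and a hypothetical toplevel symbol literally named list_expr_ would clash, which is why each normalized mangled name is claimed exactly once.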
We must check that every application of [mangle] to a *new* argument yields a *new* (normalized) result. This is succinctly expressed by combining a claim and a memoizer. *) let mangle : label -> nonterminal = let ensure_fresh = Misc.new_claim() in let module M = Memoize.ForHashedType(Label) in M.memoize (fun label -> let name = mangle label in ensure_fresh (Misc.normalize name); name ) (* -------------------------------------------------------------------------- *) (* [recognize] receives an actual parameter [param] that makes sense in the source namespace and transforms it into a parameter that makes sense in the target namespace. This involves examining each application and "recognizing" it as an application of a label to a sequence of residual actual parameters, as explained next. All labels thus recognized are enqueued. *) (* [recognize] governs how much specialization is performed. For instance, [F(X, Y, Z)] could be recognized as: - an application of the symbol [F] to the residual arguments [X, Y, Z]. Then, no specialization at all takes place. - an application of the symbol [F(X,Y,Z)] to no residual arguments. Then, [F] is fully specialized for [X, Y, Z]. - in between these extremes, say, an application of the symbol [F(X,_,Z)] to the residual argument [Y]. Then, [F] is partially specialized. If there are any residual arguments, then they must be recursively recognized. For instance, [F(X,G(Y),Z)] could be recognized as an application of the symbol [F(X,_,Z)] to [G(Y)], which itself could be recognized as an application of the symbol [G(Y)] to no residual arguments. *) let rec recognize (param : parameter) : parameter = (* [param] must have sort [star], in an appropriate sort environment. *) match param with | ParameterAnonymous _ -> assert false | ParameterVar _ -> param | ParameterApp (sym, ps) -> assert (ps <> []); let x = value sym in (* This symbol is applied to at least one argument, so cannot be a terminal symbol. 
It must be either a nonterminal symbol or an (uninstantiated) formal parameter of the current rule. *) (* Actually, in both modes, formal parameters of higher sort are expanded away, so [sym] cannot be an uninstantiated parameter of the current rule. It must be a nonterminal symbol. We can therefore look up its sort in the toplevel environment [sorts]. *) let inst, residuals = match mode with | ExpandAll -> (* Expansion of all parameters. *) let inst = List.map (fun p -> Some p) ps and residuals = [] in inst, residuals | ExpandHigherSort -> (* Expansion of only the parameters of higher sort. *) let ss : sort list = domain (sort (ParameterVar sym)) in assert (List.length ps = List.length ss); let pss = List.combine ps ss in let inst = pss |> List.map (fun (param, sort) -> if sort = star then None else Some param) in let residuals = pss |> List.filter (fun (_, sort) -> sort = star) |> List.map (fun (param, _) -> recognize param) in inst, residuals in let label = (x, inst) in enqueue label; let sym = mangle label in Parameters.app (unknown sym) residuals (* -------------------------------------------------------------------------- *) (* The following functions take arguments in the source namespace and produce results in the target namespace. *) let subst_parameter env param = (* [param] must have sort [star], in an appropriate sort environment. *) recognize (subst_parameter env param) let subst_producer env (id, param, attrs) = let param = subst_parameter env param in (id, param, attrs) let subst_producers env producers = List.map (subst_producer env) producers let subst_branch env branch = { branch with pr_producers = subst_producers env branch.pr_producers } let subst_branches env branches = List.map (subst_branch env) branches (* -------------------------------------------------------------------------- *) (* A quick and dirty way of mapping a name to a fresh name. 
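The ExpandHigherSort case above splits the arguments of an application into an instantiation and residual arguments according to their sorts. A toy sketch, with hypothetical simplified types:

```ocaml
(* A simplified stand-in for Menhir's sorts. *)
type sort = Star | Arrow of sort list

(* Arguments of sort [*] remain residual; arguments of higher sort are
   pinned in the instantiation (and will be expanded away). *)
let split (args : (string * sort) list) =
  let inst =
    List.map (fun (a, s) -> if s = Star then None else Some a) args
  and residuals =
    List.filter_map (fun (a, s) -> if s = Star then Some a else None) args
  in
  inst, residuals

let () =
  let inst, residuals =
    split [ ("X", Star); ("G", Arrow [Star]); ("Z", Star) ]
  in
  assert (inst = [None; Some "G"; None]);
  assert (residuals = ["X"; "Z"])
```

In this example, F(X, G, Z) is recognized as an application of the specialized symbol F(_,G,_) to the residual arguments X and Z.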
*)

let freshen : string -> string =
  let c = ref 0 in
  fun x ->
    Printf.sprintf "%s__menhir__%d" x (Misc.postincrement c)

(* -------------------------------------------------------------------------- *)

(* [instantiation_env] expects the formal parameters of a rule, [formals],
   and an instantiation [inst] that dictates how this rule must be
   specialized. It returns an environment [env] that can be used to perform
   specialization and a list of residual formal parameters (those that are
   not specialized). *)

let instantiation_env formals inst : env * symbol list =
  assert (List.length formals = List.length inst);
  let env, residuals =
    List.fold_right2 (fun formal po (env, residuals) ->
      let param, residuals =
        match po with
        | Some param ->
            (* This formal parameter is instantiated. *)
            param, residuals
        | None ->
            (* This formal parameter is not instantiated. *)
            (* We would like to map it to itself. *)
            (* However, we must in principle be a bit careful: if a toplevel
               symbol by the same name as [formal] appears free in the
               codomain of the environment that we are building, then we
               will run into trouble. We avoid this problem by
               systematically renaming every formal parameter to a fresh
               unlikely name. *)
            let formal = freshen formal in
            ParameterVar (unknown formal), formal :: residuals
      in
      Env.add formal param env, residuals
    ) formals inst (Env.empty, [])
  in
  env, residuals

(* -------------------------------------------------------------------------- *)

(* [visit label] visits a vertex labeled [label] in the graph. This label is
   a pair of a nonterminal symbol [nt] and an instantiation [inst]. Unless
   this vertex has been visited already, we create a specialized copy of
   [nt] for this instantiation. This involves a call to [subst_branches],
   which can cause more vertices to be discovered and enqueued. *)

(* The specialized symbol retains any attributes carried by the original
   parameterized symbol.
These attributes could be either attached with this rule ([rule.pr_attributes]) or specified via an [%attribute] declaration. We have to look up [%attribute] declarations now (as opposed to letting [Drop] handle them) if this is a parameterized symbol, as the connection between the original parameterized symbol and its specialized version is evident here but is lost afterwards. *) let visit label = if not (marked label) then begin mark label; let (nt, inst) = label in let rule = StringMap.find nt g.p_rules in let formals = rule.pr_parameters in let env, residuals = instantiation_env formals inst in emit { rule with pr_nt = mangle label; pr_parameters = residuals; pr_branches = subst_branches env rule.pr_branches; pr_attributes = (if formals = [] then [] else global_attributes nt) @ rule.pr_attributes } end (* -------------------------------------------------------------------------- *) (* The entry points of the graph traversal include the nonterminal symbols of sort [*]. (Not just the start symbols, as we haven't run the reachability analysis, and the grammar may contain unreachable parts, which we still want to expand.) Because a start symbol must have sort [*], this includes the start symbols. *) let () = StringMap.iter (fun nt prule -> if prule.pr_parameters = [] then let label = (nt, []) in enqueue label ) g.p_rules (* -------------------------------------------------------------------------- *) (* The parameters that appear in [%type] declarations and [%on_error_reduce] declarations are also considered entry points. They have sort [*]. *) let subst_parameter param = subst_parameter Env.empty param let subst_declaration (param, info) = assert (sort param = star); (subst_parameter param, info) let subst_declarations decls = List.map subst_declaration decls (* -------------------------------------------------------------------------- *) (* An [%attribute] declaration for a parameter of sort [*] is treated as an entry point. 
   An [%attribute] declaration for a symbol of higher sort is not regarded
   as an entry point, and at the end, is kept only if this symbol still
   appears in the expanded grammar. *)

(* This is done in two passes over the list of [%attribute] declarations,
   named [thingify] and [unthingify]. The first pass runs as part of the
   discovery of entry points, before the graph traversal. The second pass
   runs after the graph traversal is complete. *)

type thing =
  | TargetParameterOfSortStar of parameter
  | SourceParameterOfHigherSort of parameter

let thingify_parameter param : thing =
  if sort param = star then
    TargetParameterOfSortStar (subst_parameter param)
  else
    SourceParameterOfHigherSort param

let thingify_attribute_declaration (params, attrs) =
  (List.map thingify_parameter params, attrs)

let thingify_attribute_declarations decls =
  List.map thingify_attribute_declaration decls

let unthingify_parameter rules thing =
  match thing with
  | TargetParameterOfSortStar param ->
      (* This parameter has sort [star]. Keep it. *)
      Some param
  | SourceParameterOfHigherSort param ->
      (* This parameter has higher sort. It must be a symbol. Keep it if it
         still appears in the expanded grammar. *)
      let symbol = value (Parameters.unvar param) in
      if StringMap.mem symbol rules then Some param else None

let unthingify_attribute_declaration rules (params, attrs) =
  (Misc.map_opt (unthingify_parameter rules) params, attrs)

let unthingify_attribute_declarations rules decls =
  List.map (unthingify_attribute_declaration rules) decls

(* -------------------------------------------------------------------------- *)

(* Put everything together and construct a new grammar. *)

let g =
  (* Discovery of entry points. *)
  let p_types = subst_declarations g.p_types
  and p_on_error_reduce = subst_declarations g.p_on_error_reduce
  and things = thingify_attribute_declarations g.p_symbol_attributes in
  (* Graph traversal. *)
  repeatedly visit;
  (* Construction of the new grammar.
*) let p_rules = rules() in let p_symbol_attributes = unthingify_attribute_declarations p_rules things in { g with p_types; p_on_error_reduce; p_symbol_attributes; p_rules } end (* of the functor *) (* -------------------------------------------------------------------------- *) (* Re-package the above functor as a function. *) let expand mode sorts g = let module G = Run(struct let mode = mode let sorts = sorts let g = g end) in G.g menhir-20200123/src/SelectiveExpansion.mli000066400000000000000000000037761361226111300203420ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax open SortInference (* [expand sorts g] expands away some or all of the parameterized nonterminal symbols in the grammar [g], producing a new grammar. [sorts] is the sort environment produced by [SortInference]. *) (* The mode [ExpandHigherSort] causes a partial expansion: only the parameters of higher sort (i.e., of sort other than [*]) are expanded away. This mode is safe, in the sense that expansion always terminates. A proof sketch is as follows: 1- an application always has sort [*]; 2- therefore, only a variable can have higher sort; 3- therefore, only a finite number of terms can appear during expansion. *) (* The mode [ExpandAll] causes a complete expansion: all parameters are expanded away. This process is potentially nonterminating. One must first run the termination test in [CheckSafeParameterizedGrammar] (which itself is applicable only after the parameters of higher sort have been expanded away). 
*) type mode = | ExpandHigherSort | ExpandAll val expand: mode -> sorts -> grammar -> grammar menhir-20200123/src/Seq.ml000066400000000000000000000037211361226111300150770ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Sequences with constant time concatenation and linear-time conversion to an ordinary list. *) (* We maintain the invariant that the left-hand side of [SConcat] is never an empty sequence. This allows a slight improvement in [first]. *) type 'a seq = | SZero | SOne of 'a | SConcat of 'a seq * 'a seq let empty = SZero let singleton x = SOne x let append xs ys = match xs with | SZero -> ys | SOne _ | SConcat _ -> SConcat (xs, ys) let rec elements xs accu = match xs with | SZero -> accu | SOne x -> x :: accu | SConcat (xs1, xs2) -> elements xs1 (elements xs2 accu) let elements xs = elements xs [] let rec concat xss = match xss with | [] -> empty | xs :: xss -> append xs (concat xss) let rec first xs = match xs with | SZero -> (* We disallow applying [first] to an empty sequence. *) assert false | SOne x -> x | SConcat (xs1, _) -> (* Our invariant guarantees [xs1] is nonempty. *) first xs1 menhir-20200123/src/Seq.mli000066400000000000000000000024211361226111300152440ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Sequences with constant time concatenation and linear-time conversion to an ordinary list. *) type 'a seq val empty: 'a seq val singleton: 'a -> 'a seq val append: 'a seq -> 'a seq -> 'a seq val elements: 'a seq -> 'a list val concat: 'a seq list -> 'a seq val first: 'a seq -> 'a (* sequence must be nonempty *) menhir-20200123/src/SortInference.ml000066400000000000000000000223761361226111300171240ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let value = Positions.value let position = Positions.position let error = Error.error open Syntax open SortUnification (* -------------------------------------------------------------------------- *) (* Error handling. *) (* In [check_arity], in principle, [arity1] is the expected arity and [arity2] is the actual arity. This distinction does not make much sense, though, as we do not know which is wrong, the declaration site or the use site. So, we display a neutral error message. *) let check_arity sym arity1 arity2 = let plural = max arity1 arity2 > 1 in if arity1 <> arity2 then error [position sym] "does the symbol \"%s\" expect %d or %d argument%s?" (value sym) (min arity1 arity2) (max arity1 arity2) (if plural then "s" else "") (* This variant of [unify] is used when no unification error can arise. 
*) let unify_cannot_fail sort1 sort2 = try unify sort1 sort2 with | Unify _ | Occurs _ -> (* If the caller is right, this unification step cannot fail! *) assert false (* In [unify], in principle, [sort1] is the expected sort and [sort2] is the actual sort. Again, this distinction does not make much sense, so we display a neutral error message. *) let unify sym sort1 sort2 = try unify sort1 sort2 with | Unify (v1, v2) -> let print v = print (decode v) in error [position sym] "how is the symbol \"%s\" parameterized?\n\ It is used at sorts %s and %s.\n\ The sort %s is not compatible with the sort %s." (value sym) (print sort1) (print sort2) (print v1) (print v2) | Occurs (v1, v2) -> let print v = print (decode v) in error [position sym] "how is the symbol \"%s\" parameterized?\n\ It is used at sorts %s and %s.\n\ The sort %s cannot be unified with the sort %s." (value sym) (print sort1) (print sort2) (print v1) (print v2) (* -------------------------------------------------------------------------- *) (* An environment maps (terminal and nonterminal) symbols to unification variables. *) type symbol = string module Env = StringMap type env = variable Env.t let find x env : variable = try Env.find x env with Not_found -> assert false (* unbound terminal or nonterminal symbol *) let extend env (xvs : (symbol * variable) list) = List.fold_left (fun env (x, v) -> Env.add x v env ) env xvs (* -------------------------------------------------------------------------- *) (* [allocate xs] allocates a fresh unification variable [v] for every element [x] of the list [xs]. It returns the lists [xvs] and [vs]. *) let allocate (xs : 'a list) : ('a * variable) list * variable list = let xvs = List.map (fun x -> x, fresh()) xs in let vs = List.map snd xvs in xvs, vs (* -------------------------------------------------------------------------- *) (* [check_parameter env param expected] checks that the parameter [param] has sort [expected]. 
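As a toy illustration of the sorts involved (not Menhir's actual [SortUnification] representation): a symbol with n parameters, each of sort [*], receives an arrow sort, while an application itself always has sort [*]:

```ocaml
(* A simplified, ground representation of sorts: [Star] is the sort of
   ordinary symbols; [Arrow (domain, codomain)] is the sort of a
   parameterized symbol. (The real code uses unification variables, and
   every codomain is [*].) *)
type sort = Star | Arrow of sort list * sort

(* The sort initially assigned to a symbol with [arity] parameters, all of
   which are assumed here to have sort [*]. *)
let sort_of_symbol (arity : int) : sort =
  if arity = 0 then Star
  else Arrow (List.init arity (fun _ -> Star), Star)

let () =
  assert (sort_of_symbol 0 = Star);
  (* e.g. separated_list(sep, elem) has sort (*, *) -> *. *)
  assert (sort_of_symbol 2 = Arrow ([Star; Star], Star))
```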
A parameter is either a symbol or an application of a symbol to a number of parameters. Every application is total -- the language does not have partial applications. The sort of every application is [star], but the sort of a variable is unrestricted. *) let rec check_parameter env (param : parameter) (expected : variable) = match param with | ParameterVar sym -> let x = value sym in unify sym expected (find x env) | ParameterApp (sym, actuals) -> let x = value sym in (* This application has sort [star]. *) unify sym expected star; (* Retrieve the expected sort of each parameter. Two cases arise: if [x] has already been assigned an arrow sort, then we can retrieve its domain, which gives us the expected sort of each actual parameter; otherwise, we just make up a fresh arrow sort of appropriate arity. We could avoid this case distinction and always use the latter method, but the former method, when applicable, yields better error messages. If [sym] is a toplevel (nonterminal or terminal) symbol, then we will be in the first case, as we have been careful to initially assign an arrow sort of appropriate arity to each such symbol. *) let v = find x env in let expected = match domain v with | Some expected -> check_arity sym (List.length expected) (List.length actuals); expected | None -> let _, expected = allocate actuals in unify_cannot_fail v (arrow expected); expected in (* Check the sort of each actual parameter. *) List.iter2 (check_parameter env) actuals expected | ParameterAnonymous _ -> (* Anonymous rules have been eliminated already. *) assert false (* -------------------------------------------------------------------------- *) (* The following functions respectively check that a producer, a branch, a rule, and a grammar are well-sorted under an environment [env]. *) let check_producer env (producer : producer) = let (_, param, _) = producer in (* A producer must have sort [star]. 
*) check_parameter env param star let check_branch env (branch : parameterized_branch) = List.iter (check_producer env) branch.pr_producers let enter_rule env (nt : symbol) (rule : parameterized_rule) : env = (* For each formal parameter, allocate a fresh variable. *) let formals, domain = allocate rule.pr_parameters in (* Connect these variables with the sort of the symbol [nt]. *) (* Because it is performed first, this particular unification cannot fail. *) unify_cannot_fail (find nt env) (arrow domain); (* Extend the environment. *) extend env formals let check_rule env (nt : symbol) (rule : parameterized_rule) = (* Extend the environment within this rule. *) let env = enter_rule env nt rule in (* Check each branch in this extended environment. *) List.iter (check_branch env) rule.pr_branches let check_grammar env g = (* Each rule must be well-sorted. *) StringMap.iter (check_rule env) g.p_rules; (* The start symbols must have sort [star]. *) StringMap.iter (fun nt position -> let sym = Positions.with_pos position nt in unify sym star (find nt env) ) g.p_start_symbols; (* Every symbol that appears in a [%type] declaration must have sort [star]. *) List.iter (fun (param, _) -> check_parameter env param star ) g.p_types; (* Same rule for [%on_error_reduce] declarations. *) List.iter (fun (param, _) -> check_parameter env param star ) g.p_on_error_reduce; (* The symbols that appear in [%attribute] declarations must be well-sorted. Their sort is not necessarily [star]: it is legal to attach an attribute with a parameterized symbol. *) List.iter (fun (params, _) -> List.iter (fun param -> check_parameter env param (fresh()) ) params ) g.p_symbol_attributes (* -------------------------------------------------------------------------- *) type sorts = GroundSort.sort Env.t let infer (g : grammar) : sorts = (* For each (terminal or nonterminal) symbol, allocate a unification variable. The terminal symbols have sort [star], so we can use this particular variable. 
*) let env = StringMap.fold (fun tok _ env -> Env.add tok star env ) g.p_tokens Env.empty in let env = Env.add "error" star env in let env = StringMap.fold (fun nt rule env -> let env = Env.add nt (fresh()) env in (* The following line unifies the sort of [nt] with an arrow of appropriate arity. It cannot fail. This strategy should lead to slightly better unification error messages. *) let _ : env = enter_rule env nt rule in env ) g.p_rules env in (* Impose sort equality constraints. *) check_grammar env g; (* Decode the environment, so our user doesn't have to deal with unification variables. *) let env = Env.map decode env in (* Ground any unassigned sort variables. (These should occur only in unreachable parts of the grammar.) This guarantees that the user does not have to deal with sort variables. *) let env = Env.map ground env in (* At log level 3, display the inferred sort of every symbol. *) Error.logG 3 (fun f -> Env.iter (fun x gsort -> Printf.fprintf f "%s :: %s\n" x (print (unground gsort)) ) env ); env menhir-20200123/src/SortInference.mli000066400000000000000000000023471361226111300172710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax open GroundSort (* [infer_grammar g] performs sort inference for the grammar [g], rejecting the grammar if it is ill-sorted. It returns a map of (terminal and nonterminal) symbols to ground sorts. 
*) type sorts = sort StringMap.t val infer: grammar -> sorts menhir-20200123/src/SortUnification.ml000066400000000000000000000076251361226111300174760ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module implements sort inference. *) (* -------------------------------------------------------------------------- *) (* The syntax of sorts is: sort ::= (sort, ..., sort) -> * where the arity (the number of sorts on the left-hand side of the arrow) can be zero. *) module S = struct type 'a structure = | Arrow of 'a list let map f (Arrow xs) = Arrow (List.map f xs) let iter f (Arrow xs) = List.iter f xs exception Iter2 let iter2 f (Arrow xs1) (Arrow xs2) = let n1 = List.length xs1 and n2 = List.length xs2 in if n1 = n2 then List.iter2 f xs1 xs2 else raise Iter2 end include S (* -------------------------------------------------------------------------- *) (* Instantiate the unification algorithm with the above signature. *) include Unifier.Make(S) type sort = term = | TVar of int | TNode of sort structure (* -------------------------------------------------------------------------- *) (* Sort constructors. *) let arrow (args : variable list) : variable = fresh (Some (Arrow args)) let star : variable = arrow [] let fresh () = fresh None (* Sort accessors. *) let domain (x : variable) : variable list option = match structure x with | Some (Arrow xs) -> Some xs | None -> None (* -------------------------------------------------------------------------- *) (* Converting between sorts and ground sorts. 
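As a standalone, hedged sketch of this grounding step (the types below are miniature stand-ins for illustration, not the module's actual [sort] and [GroundSort.sort] definitions):

```ocaml
(* A miniature model of sorts: a sort is either a variable or an
   n-ary arrow whose codomain is the base sort. Ground sorts carry
   no variables. Illustrative definitions only. *)
type sort =
  | TVar of int
  | TArrow of sort list         (* (s1, ..., sn) -> star *)

type ground = GArrow of ground list

(* Grounding maps every leftover variable to the base sort, that
   is, to a zero-ary arrow, exactly as described above. *)
let rec ground_of_sort (s : sort) : ground =
  match s with
  | TVar _ -> GArrow []
  | TArrow ss -> GArrow (List.map ground_of_sort ss)

(* The inverse injection: a ground sort is already a sort. *)
let rec sort_of_ground (GArrow ss : ground) : sort =
  TArrow (List.map sort_of_ground ss)

let () =
  (* The sort (a) -> star, with a variable a, grounds to
     (star) -> star. *)
  assert (ground_of_sort (TArrow [TVar 0]) = GArrow [GArrow []]);
  assert (sort_of_ground (GArrow []) = TArrow []);
  print_endline "ok"
```

Round-tripping through [unground] then [ground] is the identity on ground sorts; the converse direction loses which variables were present.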
*) let rec ground s = match s with | TVar _ -> (* All variables are replaced with [*]. *) GroundSort.GArrow [] | TNode (Arrow ss) -> GroundSort.GArrow (List.map ground ss) let rec unground (GroundSort.GArrow ss) = TNode (Arrow (List.map unground ss)) (* -------------------------------------------------------------------------- *) (* A name generator for unification variables. *) let make_gensym () : unit -> string = let c = ref 0 in let gensym () = let n = Misc.postincrement c in Printf.sprintf "%c%s" (char_of_int (Char.code 'a' + n mod 26)) (let d = n / 26 in if d = 0 then "" else string_of_int d) in gensym (* A memoized name generator. *) let make_name () : int -> string = let gensym = make_gensym() in Memoize.Int.memoize (fun _x -> gensym()) (* -------------------------------------------------------------------------- *) (* A printer. *) let rec print name (b : Buffer.t) (sort : sort) = match sort with | TVar x -> Printf.bprintf b "%s" (name x) | TNode (S.Arrow []) -> Printf.bprintf b "*" | TNode (S.Arrow (sort :: sorts)) -> (* Always parenthesize the domain, so there is no ambiguity. *) Printf.bprintf b "(%a%a) -> *" (print name) sort (print_comma_sorts name) sorts and print_comma_sorts name b sorts = List.iter (print_comma_sort name b) sorts and print_comma_sort name b sort = Printf.bprintf b ", %a" (print name) sort let print sort : string = let b = Buffer.create 32 in print (make_name()) b sort; Buffer.contents b menhir-20200123/src/SortUnification.mli000066400000000000000000000044241361226111300176410ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* This module implements sort inference. *) (* -------------------------------------------------------------------------- *) (* The syntax of sorts is: sort ::= (sort, ..., sort) -> * where the arity (the number of sorts on the left-hand side of the arrow) can be zero. See [GroundSort]. *) type 'a structure = | Arrow of 'a list type sort = | TVar of int | TNode of sort structure (* -------------------------------------------------------------------------- *) (* Sort unification. *) type variable val star: variable val arrow: variable list -> variable val fresh: unit -> variable (* [domain] is the opposite of [arrow]. If [x] has been unified with an arrow, then [domain x] returns its domain. Otherwise, it returns [None]. Use with caution. *) val domain: variable -> variable list option exception Unify of variable * variable exception Occurs of variable * variable val unify: variable -> variable -> unit (* Once unification is over, a unification variable can be decoded as a sort. *) val decode: variable -> sort (* Grounding a sort replaces all sort variables with the sort [*]. *) val ground: sort -> GroundSort.sort val unground: GroundSort.sort -> sort (* -------------------------------------------------------------------------- *) (* A sort can be printed. *) val print: sort -> string menhir-20200123/src/Trie.ml000066400000000000000000000202711361226111300152510ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open Grammar (* -------------------------------------------------------------------------- *) (* We begin with a number of auxiliary functions that provide information about the LR(1) automaton. These functions could perhaps be moved elsewhere, e.g., inside [Default]. We keep them here, for now, because they are not used anywhere else. *) (* [can_reduce s prod] indicates whether state [s] is able to reduce production [prod] (either as a default reduction, or as a normal reduction). *) let can_reduce s prod = match Default.has_default_reduction s with | Some (prod', _) when prod = prod' -> true | _ -> TerminalMap.fold (fun z prods accu -> (* A reduction on [#] is always a default reduction. (See [lr1.ml].) *) assert (not (Terminal.equal z Terminal.sharp)); accu || Terminal.non_error z && List.mem prod prods ) (Lr1.reductions s) false (* [reduction_path_exists s w prod] tests whether the path determined by the sequence of symbols [w] out of the state [s] exists in the automaton and leads to a state where [prod] can be reduced. It further requires [w] to not contain the [error] token. *) let rec reduction_path_exists s (w : Symbol.t list) prod : bool = match w with | [] -> can_reduce s prod | a :: w -> Symbol.non_error a && match SymbolMap.find a (Lr1.transitions s) with | s -> reduction_path_exists s w prod | exception Not_found -> false (* -------------------------------------------------------------------------- *) (* Tries. *) module Make (X : sig end) = struct (* A trie has the following structure. *) type trie = { (* A unique identity, used by [compare]. The trie construction code ensures that these numbers are indeed unique: see [fresh], [insert], [star]. *) identity: int; (* The root state of this star: "where we come from". *) source: Lr1.node; (* The current state, i.e., the root of this sub-trie: "where we are". 
*) current: Lr1.node; (* The productions that we can reduce in the current state. In other words, if this list is nonempty, then the current state is the end of one (or several) branches. It can nonetheless have children. *) mutable productions: Production.index list; (* The children, or sub-tries. *) mutable transitions: trie SymbolMap.t (* The two fields above are written only during the construction of a trie. Once every trie has been constructed, they are frozen. *) } (* This counter is used by [mktrie] to produce unique identities. *) let c = ref 0 (* We keep a mapping of integer identities to tries. Whenever a new identity is assigned, this mapping must be updated. *) let tries = let s : Lr1.node = Obj.magic () in (* yes, this hurts *) let dummy = { identity = -1; source = s; current = s; productions = []; transitions = SymbolMap.empty } in MenhirLib.InfiniteArray.make dummy (* This smart constructor creates a new trie with a unique identity. *) let mktrie source current productions transitions = let identity = Misc.postincrement c in let t = { identity; source; current; productions; transitions } in MenhirLib.InfiniteArray.set tries identity t; t (* [insert t w prod] updates the trie (in place) by adding a new branch, corresponding to the sequence of symbols [w], and ending with a reduction of production [prod]. We assume [reduction_path_exists w prod t.current] holds, so we need not worry about this being a dead branch, and we can use destructive updates without having to set up an undo mechanism. *) let rec insert (t : trie) (w : Symbol.t list) prod : unit = match w with | [] -> assert (can_reduce t.current prod); t.productions <- prod :: t.productions | a :: w -> match SymbolMap.find a (Lr1.transitions t.current) with | exception Not_found -> assert false | successor -> (* Find our child at [a], or create it. 
*) let t' = try SymbolMap.find a t.transitions with Not_found -> let t' = mktrie t.source successor [] SymbolMap.empty in t.transitions <- SymbolMap.add a t' t.transitions; t' in (* Update our child. *) insert t' w prod (* [insert t prod] inserts a new branch, corresponding to production [prod], into the trie [t], which is updated in place. *) let insert t prod : unit = let w = Array.to_list (Production.rhs prod) in (* Check whether the path [w] leads to a state where [prod] can be reduced. If not, then some transition or reduction action must have been suppressed by conflict resolution; or the path [w] involves the [error] token. In that case, the branch is dead, and is not added. This test is superfluous (i.e., it would be OK to add a dead branch) but allows us to build a slightly smaller star in some cases. *) if reduction_path_exists t.current w prod then insert t w prod (* [fresh s] creates a new empty trie whose source is [s]. *) let fresh source = mktrie source source [] SymbolMap.empty (* The star at [s] is obtained by starting with a fresh empty trie and inserting into it every production [prod] whose left-hand side [nt] is the label of an outgoing edge at [s]. *) let star s = let t = fresh s in SymbolMap.iter (fun sym _ -> match sym with | Symbol.T _ -> () | Symbol.N nt -> Production.iternt nt (insert t) ) (Lr1.transitions s); t (* A trie [t] is nontrivial if it has at least one branch, i.e., contains at least one sub-trie whose [productions] field is nonempty. Trivia: a trie of size greater than 1 is necessarily nontrivial, but the converse is not true: a nontrivial trie can have size 1. (This occurs if all productions have zero length.) *) let trivial t = t.productions = [] && SymbolMap.is_empty t.transitions (* Redefine [star] to record the size of the newly built trie. 
*) let size = Array.make Lr1.n (-1) let star s = let initial = !c in let t = star s in let final = !c in size.(Lr1.number s) <- final - initial; t (* Define [stars] to build all stars and pass all nontrivial ones to [f]. *) let stars f = (* For every state [s]... *) Lr1.iter (fun s -> (* Build the trie rooted at [s]. If it is nontrivial, invoke [f]. *) let t = star s in if not (trivial t) then f s t ) let size s = assert (size.(s) >= 0); size.(s) let total_size () = !c let compare t1 t2 = Generic.compare t1.identity t2.identity let source t = t.source let current t = t.current let accepts prod t = List.mem prod t.productions let step a t = SymbolMap.find a t.transitions (* careful: may raise [Not_found] *) let verbose () = Printf.eprintf "Total star size: %d\n%!" (total_size()) let decode i = let t = MenhirLib.InfiniteArray.get tries i in assert (t.identity = i); (* ensure we do not get the [dummy] trie *) t let encode t = assert (decode t.identity == t); (* round-trip property *) t.identity end menhir-20200123/src/Trie.mli000066400000000000000000000100651361226111300154220ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* Suppose [s] is a state that carries an outgoing edge labeled with a non-terminal symbol [nt]. We are interested in finding out how this edge can be taken. In order to do that, we must determine how, by starting in [s], one can follow a path that corresponds to (the right-hand side of) a production [prod] associated with [nt]. There are in general several such productions. 
The paths that they determine in the automaton form a "star". We represent the star rooted at [s] as a trie. A point in a trie (that is, a sub-trie) tells us where we come from, where we are, and which production(s) we are hoping to reduce in the future. *) (* This module depends on [Grammar], [Lr1], [Default]: that is, we assume that the automaton has been fully constructed. It is used by [LRijkstra]. *) module Make (X : sig end) : sig type trie (* [stars f] constructs the trie rooted at every state [s]. (There is one branch for every production [prod] associated with every non-terminal symbol [nt] for which [s] carries an outgoing edge.) If this trie [t] is nontrivial (i.e., it has at least one branch, leading to a state where a production can be reduced), then [f s t] is invoked. *) val stars: (Lr1.node -> trie -> unit) -> unit (* After [stars] has been called, [size (Lr1.number s)] reports the size of the trie that has been constructed for state [s]. *) val size: int -> int (* After [stars] has been called, [total_size()] reports the total size of the tries that have been constructed. *) val total_size: unit -> int (* Every (sub-)trie has a unique identity. (One can think of it as its address.) [compare] compares the identity of two tries. This can be used, e.g., to set up a map whose keys are tries. *) val compare: trie -> trie -> int (* [source t] returns the source state of the (sub-)trie [t]. This is the root of the star of which [t] is a sub-trie. In other words, this tells us "where we come from". *) val source: trie -> Lr1.node (* [current t] returns the current state of the (sub-)trie [t]. This is the root of the sub-trie [t]. In other words, this tells us "where we are". *) val current: trie -> Lr1.node (* [accepts prod t] tells whether the current state of the trie [t] is the end of a branch associated with production [prod]. If so, this means that we have successfully followed a path that corresponds to the right-hand side of production [prod]. 
*) val accepts: Production.index -> trie -> bool (* [step sym t] is the immediate sub-trie of [t] along the symbol [sym]. This function raises [Not_found] if [t] has no child labeled [sym]. *) val step: Symbol.t -> trie -> trie (* [verbose()] outputs debugging & performance information. *) val verbose: unit -> unit (* Since every (sub-)trie has a unique identity, its identity can serve as a unique integer code for this (sub-)trie. We allow this conversion, both ways. This mechanism is used only as a way of saving space in the encoding of facts. *) val encode: trie -> int val decode: int -> trie end menhir-20200123/src/Unifier.ml000066400000000000000000000144671361226111300157610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides a simple-minded implementation of first-order unification over an arbitrary signature. *) (* -------------------------------------------------------------------------- *) (* The signature must be described by the client, as follows. *) module type STRUCTURE = sig (* The type ['a structure] should be understood as a type of shallow terms whose leaves have type ['a]. *) type 'a structure val map: ('a -> 'b) -> 'a structure -> 'b structure val iter: ('a -> unit) -> 'a structure -> unit (* [iter2] fails if the head constructors differ. *) exception Iter2 val iter2: ('a -> 'b -> unit) -> 'a structure -> 'b structure -> unit end (* -------------------------------------------------------------------------- *) (* The unifier. 
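Before the implementation proper, here is a hedged, self-contained sketch of the same technique: first-order unification with an eager occurs check, specialized to a tiny arrow signature instead of the abstract [STRUCTURE] used below, and with mutable cells as a crude stand-in for union-find. All names in the sketch are illustrative.

```ocaml
(* Terms over a tiny signature: variables and n-ary arrows. A
   variable is a mutable cell that is either unbound or has been
   equated with a term. *)
type term =
  | Var of cell
  | Arrow of term list
and cell = { id : int; mutable def : term option }

let counter = ref 0
let fresh () =
  incr counter;
  Var { id = !counter; def = None }

(* Follow definitions until reaching an unbound variable or a
   constructor. *)
let rec repr (t : term) : term =
  match t with
  | Var { def = Some t'; _ } -> repr t'
  | _ -> t

exception Occurs
exception Clash

(* Eager occurs check: the cell [c] must not appear in [t]. *)
let rec occurs (c : cell) (t : term) : unit =
  match repr t with
  | Var c' -> if c == c' then raise Occurs
  | Arrow ts -> List.iter (occurs c) ts

let rec unify (t1 : term) (t2 : term) : unit =
  match repr t1, repr t2 with
  | Var c1, Var c2 when c1 == c2 -> ()
  | Var c, t | t, Var c -> occurs c t; c.def <- Some t
  | Arrow ts1, Arrow ts2 ->
      if List.length ts1 <> List.length ts2 then raise Clash;
      List.iter2 unify ts1 ts2

let () =
  let x = fresh () and y = fresh () in
  (* Equate x with (y) -> star, then y with star: both become
     fully defined. *)
  unify x (Arrow [y]);
  unify y (Arrow []);
  assert (repr y = Arrow []);
  (* The occurs check rejects the cyclic equation z = (z) -> star. *)
  let z = fresh () in
  assert (try unify z (Arrow [z]); false with Occurs -> true);
  print_endline "ok"
```

The module below refines this scheme with true union-find (so chains of equated variables stay short) and with mark-based traversal in the occurs check.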
*) module Make (S : STRUCTURE) = struct type 'a structure = 'a S.structure (* The data structure maintained by the unifier is as follows. *) (* A unifier variable is a point of the union-find algorithm. *) type variable = descriptor UnionFind.point and descriptor = { (* Every equivalence class carries a globally unique identifier. When a new equivalence class is created, a fresh identifier is chosen, and when two classes are merged, one of the two identifiers is kept. This identifier can be used as a key in a hash table. One should be aware, though, that identifiers are stable only as long as no unions are performed. *) id : int; (* Every equivalence class carries a structure, which is either [None], which means that the variable is just that, a variable; or [Some t], which means that the variable represents (has been equated with) the term [t]. *) structure : variable structure option; (* Every equivalence class carries a mutable mark, which is used only by the occurs check. We could also remove this field altogether and use a separate hash table, where [id]s serve as keys, but this should be faster. The occurs check is performed eagerly, so this could matter. *) mutable mark : Mark.t; } (* -------------------------------------------------------------------------- *) (* Accessors. *) let id v = (UnionFind.get v).id let structure v = (UnionFind.get v).structure (* -------------------------------------------------------------------------- *) (* [fresh] creates a fresh variable with specified structure. *) let fresh = let c = ref 0 in fun structure -> let id = Misc.postincrement c in let mark = Mark.none in UnionFind.fresh { id; structure; mark } (* -------------------------------------------------------------------------- *) (* [occurs_check x y] checks that [x] does not occur within [y]. *) exception Occurs of variable * variable let occurs_check x y = (* Generate a fresh color for this particular traversal. 
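A hedged, standalone sketch of this marking idiom, detached from the unifier: each node carries a mutable mark, and each traversal uses a fresh stamp instead of allocating a fresh visited set, so a node counts as visited exactly when its mark equals the current stamp. Illustrative names only.

```ocaml
(* Graph nodes with a mutable mark. Successors are lazy so that
   cyclic graphs can be built with let rec. *)
type node = { mutable mark : int; succs : node list Lazy.t }

let stamp = ref 0
let fresh_stamp () = incr stamp; !stamp

(* Depth-first search using a fresh stamp as the color for this
   particular traversal. *)
let reachable_count (root : node) : int =
  let black = fresh_stamp () in
  let count = ref 0 in
  let rec visit n =
    if n.mark <> black then begin
      n.mark <- black;
      incr count;
      List.iter visit (Lazy.force n.succs)
    end
  in
  visit root;
  !count

let () =
  (* A two-node cycle: a -> b -> a. The marks stop the search. *)
  let rec a = { mark = 0; succs = lazy [b] }
  and b = { mark = 0; succs = lazy [a] } in
  assert (reachable_count a = 2);
  (* A second traversal simply picks a fresh stamp; no cleanup of
     the old marks is needed. *)
  assert (reachable_count a = 2);
  print_endline "ok"
```

The trade-off noted in the comment above applies here too: a per-node mark avoids hash-table lookups, at the cost of one mutable field per node.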
*) let black = Mark.fresh () in (* The traversal code -- a depth-first search. *) let rec visit z = let desc = UnionFind.get z in if not (Mark.same desc.mark black) then begin desc.mark <- black; (* We are looking for [x]. *) if UnionFind.equivalent x z then raise (Occurs (x, y)) else Option.iter (S.iter visit) desc.structure end in (* The root is [y]. *) visit y (* -------------------------------------------------------------------------- *) (* The internal function [unify v1 v2] equates the variables [v1] and [v2] and propagates the consequences of this equation until a cycle is detected, an inconsistency is found, or a solved form is reached. The exceptions that can be raised are [Occurs] and [S.Iter2]. *) let rec unify (v1 : variable) (v2 : variable) : unit = if not (UnionFind.equivalent v1 v2) then begin let desc1 = UnionFind.get v1 and desc2 = UnionFind.get v2 in (* Unify the two descriptors. *) let desc = match desc1.structure, desc2.structure with | None, None -> (* variable/variable *) desc1 | None, Some _ -> (* variable/term *) occurs_check v1 v2; desc2 | Some _, None -> (* term/variable *) occurs_check v2 v1; desc1 | Some s1, Some s2 -> (* term/term *) S.iter2 unify s1 s2; { desc1 with structure = Some s1 } in (* Merge the equivalence classes. Do this last, so we get more meaningful output if the recursive call (above) fails and we have to print the two terms. *) UnionFind.union v1 v2; UnionFind.set v1 desc end (* -------------------------------------------------------------------------- *) (* The public version of [unify]. *) exception Unify of variable * variable let unify v1 v2 = try unify v1 v2 with S.Iter2 -> raise (Unify (v1, v2)) (* -------------------------------------------------------------------------- *) (* Decoding an acyclic graph as a deep term. *) (* This is a simple-minded version of the code, where sharing is lost. Its cost could be exponential if there is a lot of sharing. 
In practice, its use is usually appropriate, especially in the scenario where the term is meant to be printed as a tree. *) type term = | TVar of int | TNode of term structure let rec decode (v : variable) : term = match structure v with | None -> TVar (id v) | Some t -> TNode (S.map decode t) (* -------------------------------------------------------------------------- *) end menhir-20200123/src/Unifier.mli000066400000000000000000000053651361226111300161270ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides a simple-minded implementation of first-order unification over an arbitrary signature. *) (* -------------------------------------------------------------------------- *) (* The signature must be described by the client, as follows. *) module type STRUCTURE = sig (* The type ['a structure] should be understood as a type of shallow terms whose leaves have type ['a]. *) type 'a structure val map: ('a -> 'b) -> 'a structure -> 'b structure val iter: ('a -> unit) -> 'a structure -> unit (* [iter2] fails if the head constructors differ. *) exception Iter2 val iter2: ('a -> 'b -> unit) -> 'a structure -> 'b structure -> unit end (* -------------------------------------------------------------------------- *) (* The unifier. *) module Make (S : STRUCTURE) : sig (* The type of unification variables. *) type variable (* [fresh s] creates a fresh variable that carries the structure [s]. 
*) val fresh: variable S.structure option -> variable (* [structure x] returns the structure (currently) carried by variable [x]. *) val structure: variable -> variable S.structure option (* [unify x y] attempts to unify the terms represented by the variables [x] and [y]. The creation of cycles is not permitted; an eager occurs check rules them out. *) exception Unify of variable * variable exception Occurs of variable * variable val unify: variable -> variable -> unit (* This is the type of deep terms over the signature [S]. *) type term = | TVar of int (* the variable's unique identity *) | TNode of term S.structure (* [decode x] turns the variable [x] into the term that it represents. Sharing is lost, so this operation can in the worst case have exponential cost. *) val decode: variable -> term end menhir-20200123/src/action.ml000066400000000000000000000130141361226111300156200ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Keyword type t = { (* The code for this semantic action. *) expr: IL.expr; (* The files where this semantic action originates. Via inlining, several semantic actions can be combined into one, so there can be several files. *) filenames: string list; (* The set of keywords that appear in this semantic action. They can be thought of as free variables that refer to positions. They must be renamed during inlining. *) keywords : KeywordSet.t; } (* Creation. 
*) let from_stretch s = { expr = IL.ETextual s; filenames = [ s.Stretch.stretch_filename ]; keywords = KeywordSet.of_list s.Stretch.stretch_keywords } let from_il_expr e = { expr = e; filenames = []; keywords = KeywordSet.empty; } (* Defining a keyword in terms of other keywords. *) let define keyword keywords f action = assert (KeywordSet.mem keyword action.keywords); { action with expr = f action.expr; keywords = KeywordSet.union keywords (KeywordSet.remove keyword action.keywords) } (* Composition, used during inlining. *) let compose x a1 a2 = (* 2015/07/20: there used to be a call to [parenthesize_stretch] here, which would insert parentheses around every stretch in [a1]. This is not necessary, as far as I can see, since every stretch that represents a semantic action is already parenthesized by the lexer. *) { expr = CodeBits.blet ([ IL.PVar x, a1.expr ], a2.expr); keywords = KeywordSet.union a1.keywords a2.keywords; filenames = a1.filenames @ a2.filenames; } (* Binding an OCaml pattern to an OCaml variable in a semantic action. *) let bind p x a = { expr = CodeBits.blet ([ p, IL.EVar x ], a.expr); keywords = a.keywords; filenames = a.filenames; } (* Substitutions, represented as association lists. In principle, no name appears twice in the domain. *) type subst = (string * string) list let apply (phi : subst) (s : string) : string = try List.assoc s phi with Not_found -> s let apply_subject (phi : subst) (subject : subject) : subject = match subject with | Before | Left -> subject | RightNamed s -> RightNamed (apply phi s) let extend x y (phi : subst ref) = assert (not (List.mem_assoc x !phi)); if x <> y then phi := (x, y) :: !phi (* Renaming of keywords, used during inlining. *) type sw = Keyword.subject * Keyword.where (* [rename_keyword f phi keyword] applies the function [f] to possibly change the keyword [keyword]. If [f] decides to change this keyword (by returning [Some _]) then this decision is obeyed. 
Otherwise, the keyword is renamed by the substitution [phi]. In either case, [phi] is extended with a renaming decision. *) let rename_keyword (f : sw -> sw option) (phi : subst ref) keyword : keyword = match keyword with | SyntaxError -> SyntaxError | Position (subject, where, flavor) -> let subject', where' = match f (subject, where) with | Some (subject', where') -> subject', where' | None -> apply_subject !phi subject, where in extend (Keyword.posvar subject where flavor) (Keyword.posvar subject' where' flavor) phi; Position (subject', where', flavor) (* [rename f phi a] applies to the semantic action [a] the renaming [phi] as well as the transformations decided by the function [f]. The function [f] is applied to each (not-yet-renamed) keyword and may decide to transform it, by returning [Some _], or to not transform it, by returning [None]. (In the latter case, [phi] still applies to the keyword.) *) let rename f phi a = (* Rename all keywords, growing [phi] as we go. *) let keywords = a.keywords in let phi = ref phi in let keywords = KeywordSet.map (rename_keyword f phi) keywords in let phi = !phi in (* Construct a new semantic action, where [phi] is translated into a set of *simultaneous* [let] bindings. (We cannot use a series of nested [let] bindings, as that would cause a capture if the domain and codomain of [phi] have a nonempty intersection.) 
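The capture problem mentioned here can be observed in isolation. The following standalone sketch (hypothetical names, not part of Menhir's sources) contrasts nested and simultaneous bindings on a swap substitution, which is exactly the case where a cascade of nested [let]s goes wrong:

```ocaml
(* A standalone illustration of why the renaming [phi] must be applied
   with *simultaneous* bindings. With the swap substitution
   { x -> y; y -> x }, nested [let]s capture: the second binding sees
   the result of the first one. *)

let nested_swap (x, y) =
  (* [let x = y in let y = x in ...]: the inner [x] is already [y]. *)
  let x = y in
  let y = x in
  (x, y)

let simultaneous_swap (x, y) =
  (* [let x = y and y = x in ...]: both right-hand sides are evaluated
     in the outer scope, which is what [eletand] achieves. *)
  let x = y and y = x in
  (x, y)

let () =
  assert (nested_swap (1, 2) = (2, 2));       (* capture! *)
  assert (simultaneous_swap (1, 2) = (2, 1))  (* correct renaming *)
```

The `let … and …` form is thus not a stylistic choice: it is what makes the renaming well-defined when the domain and codomain of the substitution overlap.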
*)
  let phi = List.map (fun (x, y) -> IL.PVar x, IL.EVar y) phi in
  let expr = CodeBits.eletand (phi, a.expr) in
  { expr = expr;
    filenames = a.filenames;
    keywords = keywords; }

let to_il_expr action =
  action.expr

let filenames action =
  action.filenames

let keywords action =
  action.keywords

let has_syntaxerror action =
  KeywordSet.mem SyntaxError (keywords action)

let has_beforeend action =
  KeywordSet.mem (Position (Before, WhereEnd, FlavorPosition)) action.keywords

menhir-20200123/src/action.mli

open Keyword

(** Semantic action's type. *)
type t

(** [compose x a1 a2] builds the action [let x = a1 in a2]. This feature
    is used during the processing of the %inline keyword. *)
val compose : string -> t -> t -> t

(** [bind p x a] binds the OCaml pattern [p] to the OCaml variable [x] in
    the semantic action [a]. Therefore, it builds the action
    [let p = x in a]. *)
val bind: IL.pattern -> string -> t -> t

(* [define keyword keywords f action] defines away the keyword [keyword].
   It is removed from the set of keywords of this semantic action; the set
   [keywords] is added in its place. The body of the semantic action is
   transformed by the function [f], which typically wraps it in some new
   [let] bindings. *)
val define: keyword -> KeywordSet.t -> (IL.expr -> IL.expr) -> t -> t

(* Variable-to-variable substitutions, used by [rename], below.
*)
type subst =
  (string * string) list

(* [Subject/where] pairs, as defined in [Keyword], encode a position
   keyword. *)
type sw =
  subject * where

(** [rename f phi a] applies to the semantic action [a] the renaming [phi]
    as well as the transformations decided by the function [f]. The
    function [f] is applied to each (not-yet-renamed) keyword and may
    decide to transform it, by returning [Some _], or to not transform it,
    by returning [None]. (In the latter case, [phi] still applies to the
    keyword.) *)
val rename: (sw -> sw option) -> subst -> t -> t

(** Semantic actions are translated into [IL] code using the [IL.ETextual]
    and [IL.ELet] constructors. *)
val to_il_expr: t -> IL.expr

(** A semantic action might be the inlining of several others. The
    filenames of the different parts are given by [filenames a]. This can
    be used, for instance, to check whether all parts come from the
    standard library. *)
val filenames: t -> string list

(** [keywords a] is the set of keywords used in the semantic action [a]. *)
val keywords: t -> KeywordSet.t

(** [from_stretch s] builds an action out of a textual piece of code. *)
val from_stretch: Stretch.t -> t

(** [from_il_expr] converts an [IL] expression into a semantic action. *)
val from_il_expr: IL.expr -> t

(** Test whether the keyword [$syntaxerror] is used in the action. *)
val has_syntaxerror: t -> bool

(** Test whether the keyword [$endpos($0)] is used in the action. *)
val has_beforeend: t -> bool

menhir-20200123/src/anonymous.ml

open Syntax

(* For each anonymous rule, we define a fresh nonterminal symbol, and
   replace the anonymous rule with a reference to this symbol. If the
   anonymous rule appears inside a parameterized rule, then we must define
   a parameterized nonterminal symbol. *)

(* ------------------------------------------------------------------------ *)

(* Computing the free names of some syntactic categories. *)

let rec fn_parameter accu (p : parameter) =
  (* [p] cannot be [ParameterAnonymous _]. *)
  let x, ps = Parameters.unapp p in
  let accu = StringSet.add (Positions.value x) accu in
  fn_parameters accu ps

and fn_parameters accu ps =
  List.fold_left fn_parameter accu ps

let fn_producer accu ((_, p, _) : producer) =
  fn_parameter accu p

let fn_branch accu branch =
  List.fold_left fn_producer accu branch.pr_producers

let fn_branches accu branches =
  List.fold_left fn_branch accu branches

(* ------------------------------------------------------------------------ *)

(* This functor makes it easy to share mutable internal state between the
   functions that follow. *)

module Run (X : sig end) = struct

(* ------------------------------------------------------------------------ *)

(* A fresh name generator. *)

let fresh : unit -> string =
  let next = ref 0 in
  fun () ->
    Printf.sprintf "__anonymous_%d" (Misc.postincrement next)

(* ------------------------------------------------------------------------ *)

(* A rule accumulator. Used to collect the fresh definitions that we
   produce. *)

let rules =
  ref []

(* ------------------------------------------------------------------------ *)

(* [anonymous pos parameters branches] deals with an anonymous rule, at
   position [pos], which appears inside a possibly-parameterized rule whose
   parameters are [parameters], and whose body is [branches]. We assume
   that [branches] does not itself contain any anonymous rules.
As a side effect, we create a fresh definition, and return its name. *) let var (symbol : symbol) : parameter = ParameterVar (Positions.unknown_pos symbol) let anonymous pos (parameters : symbol list) (branches : parameterized_branch list) : parameter = (* Compute the free symbols of [branches]. They should form a subset of [parameters], although we have not yet checked this. We create a definition that is parameterized only over the parameters that actually occur free in the definition -- i.e., a definition without useless parameters. This seems important, as (in some situations) it avoids duplication and leads to fewer states in the automaton. *) let used = fn_branches StringSet.empty branches in let parameters = List.filter (fun x -> StringSet.mem x used) parameters in (* Generate a fresh non-terminal symbol. *) let symbol = fresh() in (* Construct its definition. Note that it is implicitly marked %inline. Also, it does not carry any attributes; this is consistent with the fact that %inline symbols cannot carry attributes. *) let rule = { pr_public_flag = false; pr_inline_flag = true; pr_nt = symbol; pr_positions = [ pos ]; (* this list is not allowed to be empty *) pr_attributes = []; pr_parameters = parameters; pr_branches = branches } in (* Record this definition. *) rules := rule :: !rules; (* Return the symbol that stands for it. *) Parameters.app (Positions.with_pos pos symbol) (List.map var parameters) (* ------------------------------------------------------------------------ *) (* Traversal code. *) let rec transform_parameter (parameters : symbol list) (p : parameter) : parameter = match p with | ParameterVar _ -> p | ParameterApp (x, ps) -> ParameterApp (x, List.map (transform_parameter parameters) ps) | ParameterAnonymous branches -> let pos = Positions.position branches and branches = Positions.value branches in (* Do not forget the recursive invocation! 
*)
      let branches =
        List.map (transform_parameterized_branch parameters) branches
      in
      (* This is where the real work is done. *)
      anonymous pos parameters branches

and transform_producer parameters ((x, p, attrs) : producer) =
  x, transform_parameter parameters p, attrs

and transform_parameterized_branch parameters branch =
  let pr_producers =
    List.map (transform_producer parameters) branch.pr_producers
  in
  { branch with pr_producers }

let transform_parameterized_rule rule =
  let pr_branches =
    List.map (transform_parameterized_branch rule.pr_parameters) rule.pr_branches
  in
  { rule with pr_branches }

end

(* ------------------------------------------------------------------------ *)

(* The main entry point invokes the functor and reads its result. *)

let transform_partial_grammar g =
  let module R = Run(struct end) in
  let pg_rules = List.map R.transform_parameterized_rule g.pg_rules in
  let pg_rules = !R.rules @ pg_rules in
  { g with pg_rules }

menhir-20200123/src/anonymous.mli

open Syntax

val transform_partial_grammar: partial_grammar -> partial_grammar

menhir-20200123/src/astar.ml

(* This module implements A* search, following Hart, Nilsson, and Raphael
   (1968).

   To each visited graph node, the algorithm associates an internal record,
   carrying various information. For this reason, the algorithm's space
   complexity is, in the worst case, linear in the size of the graph.

   The mapping of nodes to internal records is implemented via a hash
   table, while the converse mapping is direct (via a record field).

   Nodes that remain to be examined are kept in a priority queue, where the
   priority of a node is the cost of the shortest known path from the start
   node to it plus the estimated cost of a path from this node to a goal
   node. (Lower priority nodes are considered first). It is the use of the
   second summand that makes A* more efficient than Dijkstra's standard
   algorithm for finding shortest paths in an arbitrary graph. In fact,
   when [G.estimate] is the constant zero function, A* coincides with
   Dijkstra's algorithm. One should note that A* is faster than Dijkstra's
   algorithm only when a path to some goal node exists. Otherwise, both
   algorithms explore the entire graph, and have similar time requirements.

   The priority queue is implemented as an array of doubly linked lists. *)

module Make (G : sig

  (* Graph nodes. *)
  type node

  include Hashtbl.HashedType with type t := node

  (* Edge labels. *)
  type label

  (* The source node(s). *)
  val sources: (node -> unit) -> unit

  (* [successors n f] presents each of [n]'s successors, in an arbitrary
     order, to [f], together with the cost of the edge that was followed. *)
  val successors: node -> (label -> int -> node -> unit) -> unit

  (* An estimate of the cost of the shortest path from the supplied node to
     some goal node.
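As a standalone illustration of the admissible-estimate idea discussed here, the following miniature A* (a self-contained sketch, not Menhir's actual functor; all names are made up) searches a 4-connected grid with unit edge costs, using the Manhattan distance to the goal as the estimate. For simplicity the open set is a plain list scanned for the minimum f = g + h, rather than a bucketed priority queue:

```ocaml
(* A miniature A* on a grid: [astar] returns [Some d] where [d] is the
   length of a shortest path from [start] to [goal], or [None] if the
   goal is unreachable. The Manhattan distance never overestimates the
   true cost on a 4-connected grid, so it is a valid estimate. *)
let astar ~width ~height ~blocked ~start ~goal =
  (* The admissible heuristic: Manhattan distance to the goal. *)
  let h (x, y) = abs (fst goal - x) + abs (snd goal - y) in
  let module M = Map.Make (struct
    type t = int * int
    let compare = compare
  end) in
  (* [g] maps each discovered node to its best known cost;
     [frontier] lists the open nodes. *)
  let rec loop g frontier =
    match frontier with
    | [] -> None
    | _ ->
      (* Pick the open node with the lowest f = g + h. *)
      let best =
        List.fold_left
          (fun acc n ->
             let f n = M.find n g + h n in
             match acc with
             | Some m when f m <= f n -> acc
             | _ -> Some n)
          None frontier
      in
      let n = Option.get best in
      if n = goal then Some (M.find n g)
      else
        let frontier = List.filter (fun m -> m <> n) frontier in
        let (x, y) = n in
        (* In-bounds, unblocked neighbors; every edge costs 1. *)
        let succs =
          List.filter
            (fun (x, y) ->
               0 <= x && x < width && 0 <= y && y < height
               && not (List.mem (x, y) blocked))
            [ (x + 1, y); (x - 1, y); (x, y + 1); (x, y - 1) ]
        in
        (* Relax each successor, opening it if its cost improved. *)
        let g, frontier =
          List.fold_left
            (fun (g, frontier) s ->
               let cost = M.find n g + 1 in
               match M.find_opt s g with
               | Some c when c <= cost -> (g, frontier)
               | _ -> (M.add s cost g, s :: frontier))
            (g, frontier) succs
        in
        loop g frontier
  in
  loop (M.add start 0 M.empty) [ start ]
```

For example, on a 3x3 grid with the cells (1,0) and (1,1) blocked, `astar ~width:3 ~height:3 ~blocked:[(1,0); (1,1)] ~start:(0,0) ~goal:(2,0)` must detour through (1,2) and returns `Some 6`, even though the Manhattan estimate at the start is only 2.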
For algorithms such as A* and IDA* to find shortest paths, this estimate
     must be a correct under-approximation of the actual cost. *)
  val estimate: node -> int

end) = struct

  type cost = int

  (* Nodes with low priorities are dealt with first. *)
  type priority = cost

  (* Paths back to a source (visible by the user). *)
  type path =
    | Edge of G.label * path
    | Source of G.node

  let rec follow labels path =
    match path with
    | Source node ->
        node, labels
    | Edge (label, path) ->
        follow (label :: labels) path

  let reverse path =
    follow [] path

  type inode = {
    (* Graph node associated with this internal record. *)
    this: G.node;
    (* Cost of the best known path from a source node to this node. (ghat) *)
    mutable cost: cost;
    (* Estimated cost of the best path from this node to a goal node. (hhat) *)
    estimate: cost;
    (* Best known path from a source node to this node. *)
    mutable path: path;
    (* Previous node on doubly linked priority list *)
    mutable prev: inode;
    (* Next node on doubly linked priority list *)
    mutable next: inode;
    (* The node's priority, if the node is in the queue; -1 otherwise *)
    mutable priority: priority;
  }

  (* This auxiliary module maintains a mapping of graph nodes to internal
     records. *)

  module M : sig

    (* Adds a binding to the mapping. *)
    val add: G.node -> inode -> unit

    (* Retrieves the internal record for this node. Raises [Not_found] if
       no such record exists. *)
    val get: G.node -> inode

  end = struct

    module H = Hashtbl.Make(struct
      include G
      type t = node
    end)

    let t = H.create 100003

    let add node inode =
      H.add t node inode

    let get node =
      H.find t node

  end

  (* This auxiliary module maintains a priority queue of internal
     records. *)

  module P : sig

    (* Adds this node to the queue. *)
    val add: inode -> priority -> unit

    (* Adds this node to the queue, or changes its priority, if it already
       was in the queue. It is assumed, in the second case, that the
       priority can only decrease.
*) val add_or_decrease: inode -> priority -> unit (* Retrieve a node with lowest priority of the queue. *) val get: unit -> inode option end = struct module InfiniteArray = MenhirLib.InfiniteArray (* Array of pointers to the doubly linked lists, indexed by priorities. There is no a priori bound on the size of this array -- its size is increased if needed. It is up to the user to use a graph where paths have reasonable lengths. *) let a = InfiniteArray.make None (* Index of lowest nonempty list, if there is one; or lower (sub-optimal, but safe). If the queue is empty, [best] is arbitrary. *) let best = ref 0 (* Current number of elements in the queue. Used in [get] to stop the search for a nonempty bucket. *) let cardinal = ref 0 (* Adjust node's priority and insert into doubly linked list. *) let add inode priority = assert (0 <= priority); cardinal := !cardinal + 1; inode.priority <- priority; match InfiniteArray.get a priority with | None -> InfiniteArray.set a priority (Some inode); (* Decrease [best], if necessary, so as not to miss the new element. In the special case of A*, this never happens. *) assert (!best <= priority); (* if priority < !best then best := priority *) | Some inode' -> inode.next <- inode'; inode.prev <- inode'.prev; inode'.prev.next <- inode; inode'.prev <- inode (* Takes a node off its doubly linked list. Does not adjust [best], as this is not necessary in order to preserve the invariant. *) let remove inode = cardinal := !cardinal - 1; if inode.next == inode then InfiniteArray.set a inode.priority None else begin InfiniteArray.set a inode.priority (Some inode.next); inode.next.prev <- inode.prev; inode.prev.next <- inode.next; inode.next <- inode; inode.prev <- inode end; inode.priority <- -1 let rec get () = if !cardinal = 0 then None else get_nonempty() and get_nonempty () = (* Look for next nonempty bucket. We know there is one. This may seem inefficient, because it is a linear search. 
However, in A*, [best] never decreases, so the total cost of this loop is the maximum priority ever used. *) match InfiniteArray.get a !best with | None -> best := !best + 1; get_nonempty() | Some inode as result -> remove inode; result let add_or_decrease inode priority = if inode.priority >= 0 then remove inode; add inode priority end (* Initialization. *) let estimate node = let e = G.estimate node in assert (0 <= e); (* failure means user error *) e let () = G.sources (fun node -> let rec inode = { this = node; cost = 0; estimate = estimate node; path = Source node; prev = inode; next = inode; priority = -1 } in M.add node inode; P.add inode inode.estimate ) (* Access to the search results (after the search is over). *) let distance node = try (M.get node).cost with Not_found -> max_int let path node = (M.get node).path (* let [Not_found] escape if no path was found *) (* Search. *) let rec search f = (* Pick the open node that currently has lowest fhat, that is, lowest estimated distance to a goal node. *) match P.get() with | None -> (* Finished. *) distance, path | Some inode -> let node = inode.this in (* Let the user know about this newly discovered node. *) f (node, inode.path); (* Otherwise, examine its successors. *) G.successors node (fun label edge_cost son -> assert (0 <= edge_cost); (* failure means user error *) (* Determine the cost of the best known path from the start node, through this node, to this son. *) let new_cost = inode.cost + edge_cost in assert (0 <= new_cost); (* failure means overflow *) try let ison = M.get son in if new_cost < ison.cost then begin (* This son has been visited before, but this new path to it is shorter. If it was already open and waiting in the priority queue, increase its priority; otherwise, mark it as open and insert it into the queue. 
*)
            let new_fhat = new_cost + ison.estimate in
            assert (0 <= new_fhat); (* failure means overflow *)
            P.add_or_decrease ison new_fhat;
            ison.cost <- new_cost;
            ison.path <- Edge (label, inode.path)
          end
        with Not_found ->

          (* This son was never visited before. Allocate a new status
             record for it and mark it as open. *)

          let rec ison = {
            this = son;
            cost = new_cost;
            estimate = estimate son;
            path = Edge (label, inode.path);
            prev = ison;
            next = ison;
            priority = -1
          } in
          M.add son ison;
          let fhat = new_cost + ison.estimate in
          assert (0 <= fhat); (* failure means overflow *)
          P.add ison fhat
      );
      search f

end

menhir-20200123/src/astar.mli

(* This signature defines an implicit representation for graphs where
   edges have integer costs, there is a distinguished start node, and
   there is a set of distinguished goal nodes. It is also assumed that
   some geometric knowledge of the graph allows safely estimating the
   cost of shortest paths to goal nodes. If no such knowledge is
   available, [estimate] should be the constant zero function. *)

module Make (G : sig

  (* Graph nodes. *)
  type node

  include Hashtbl.HashedType with type t := node

  (* Edge labels. *)
  type label

  (* The source node(s). *)
  val sources: (node -> unit) -> unit

  (* [successors n f] presents each of [n]'s successors, in an arbitrary
     order, to [f], together with the cost of the edge that was followed.
*)
  val successors: node -> (label -> int -> node -> unit) -> unit

  (* An estimate of the cost of the shortest path from the supplied node
     to some goal node. This estimate must be a correct
     under-approximation of the actual cost. *)
  val estimate: node -> int

end) : sig

  (* A path (from a target node back to some source node) is described by
     a series of labels and ends in a source node. *)
  type path =
    | Edge of G.label * path
    | Source of G.node

  (* A path can also be presented as a pair of a source node and a list of
     labels, which describe the edges from the source node to a target
     node. *)
  val reverse: path -> G.node * G.label list

  (* Search. Newly discovered nodes are presented to the user, in order of
     increasing distance from the source nodes, by invoking the
     user-supplied function [f]. At the end, a mapping of nodes to
     distances to the source nodes and a mapping of nodes to shortest
     paths are returned. *)
  val search: (G.node * path -> unit) -> (G.node -> int) * (G.node -> path)

end

menhir-20200123/src/back.ml

(* Driver for the back-end. *)

(* Let [Interpret] handle the command line options [--interpret],
   [--interpret-error], [--compile-errors], [--compare-errors]. *)

let () =
  Interpret.run()

(* If [--list-errors] is set, produce a list of erroneous input sentences,
   then stop. *)

let () =
  if Settings.list_errors then begin
    let module L = LRijkstra.Run(struct
      (* Undocumented: if [--log-automaton 2] is set, be verbose.
*)
      let verbose = Settings.logA >= 2
      (* For my own purposes, LRijkstra can print one line of statistics
         to a .csv file. *)
      let statistics = if false then Some "lr.csv" else None
    end) in
    exit 0
  end

(* Define an .ml file writer. *)

let write program =
  let module P = Printer.Make (struct
    let filename = Settings.base ^ ".ml"
    let f = open_out filename
    let locate_stretches =
      (* 2017/05/09: always include line number directives in generated
         .ml files. Indeed, they affect the semantics of [assert]
         instructions in the semantic actions. *)
      (* 2011/10/19: do not use [Filename.basename]. The line number
         directives that we insert in the [.ml] file must retain their
         full path. This does mean that the line number directives depend
         on how menhir is invoked -- e.g. [menhir foo/bar.mly] and
         [cd foo && menhir bar.mly] will produce different files.
         Nevertheless, this seems useful/reasonable. *)
      Some filename
  end) in
  P.program program

(* If requested, generate a .cmly file. *)

let () =
  if Settings.cmly then
    Cmly_write.write (Settings.base ^ ".cmly")

(* The following DEAD code forces [Cmly_read] to be typechecked. *)

let () =
  if false then
    let module R = Cmly_read.Read (struct let filename = "" end) in
    ()

(* Construct the code, using either the table-based or the code-based
   back-end, and pass it on to the printer. (This continuation-passing
   style is imposed by the fact that there is no conditional in ocaml's
   module language.) *)

let () =
  if Settings.coq then
    let module B = CoqBackend.Run (struct end) in
    let filename = Settings.base ^ ".v" in
    let f = open_out filename in
    B.write_all f;
    exit 0
  else if Settings.table then
    let module B = TableBackend.Run (struct end) in
    write B.program
  else
    let module B = CodeBackend.Run (struct end) in
    write (CodeInliner.inline B.program)

(* Write the interface file.
*)

let () =
  Interface.write Front.grammar ()

let () =
  Time.tick "Printing"

menhir-20200123/src/back.mli

(* This module drives the back-end. No functionality is offered by this
   module. *)

menhir-20200123/src/basicPrinter.ml

open Printf
open Positions
open Syntax
open Stretch
open BasicSyntax
open Settings

(* When the original grammar is split over several files, it may be
   IMPOSSIBLE to print it out into a single file, as that would introduce
   a total ordering (between rules, between priority declarations, between
   %on_error_reduce declarations) that did not exist originally. We
   currently do not warn about this problem. Nobody has ever complained
   about it. *)

(* -------------------------------------------------------------------------- *)

(* The printing mode. *)

(* [PrintNormal] is the normal mode: the result is a Menhir grammar.
[PrintForOCamlyacc] is close to the normal mode, but attempts to produce ocamlyacc-compatible output. This means, in particular, that we cannot bind identifiers to semantic values, but must use [$i] instead. [PrintUnitActions _] causes all OCaml code to be suppressed: the semantic actions are replaced with unit actions, preludes and postludes disappear, %parameter declarations disappear. Every %type declaration carries the [unit] type. [PrintUnitActions true] in addition declares that every token carries a semantic value of type [unit]. *) module Print (X : sig val mode : Settings.print_mode end) = struct open X (* -------------------------------------------------------------------------- *) (* Printing an OCaml type. *) let print_ocamltype ty : string = Printf.sprintf " <%s>" ( match ty with | Declared stretch -> stretch.stretch_raw_content | Inferred t -> t ) let print_ocamltype ty : string = let s = print_ocamltype ty in match mode with | PrintForOCamlyacc -> (* ocamlyacc does not allow a %type declaration to contain a new line. Replace it with a space. *) String.map (function '\r' | '\n' -> ' ' | c -> c) s | PrintNormal | PrintUnitActions _ -> s (* -------------------------------------------------------------------------- *) (* Printing the type of a terminal symbol. *) let print_token_type (prop : token_properties) = match mode with | PrintNormal | PrintForOCamlyacc | PrintUnitActions false -> Misc.o2s prop.tk_ocamltype print_ocamltype | PrintUnitActions true -> "" (* omitted ocamltype after %token means *) (* -------------------------------------------------------------------------- *) (* Printing the type of a nonterminal symbol. *) let print_nonterminal_type ty = match mode with | PrintNormal | PrintForOCamlyacc -> print_ocamltype ty | PrintUnitActions _ -> " " (* -------------------------------------------------------------------------- *) (* Printing a binding for a semantic value. 
*) let print_binding id = match mode with | PrintNormal -> id ^ " = " | PrintForOCamlyacc | PrintUnitActions _ -> (* need not, or must not, bind a semantic value *) "" (* -------------------------------------------------------------------------- *) (* Testing whether it is permitted to print OCaml code (semantic actions, prelude, postlude). *) let if_ocaml_code_permitted f x = match mode with | PrintNormal | PrintForOCamlyacc -> f x | PrintUnitActions _ -> (* In these modes, all OCaml code is omitted: semantic actions, preludes, postludes, etc. *) () (* -------------------------------------------------------------------------- *) (* Testing whether attributes should be printed. *) let attributes_printed : bool = match mode with | PrintNormal | PrintUnitActions _ -> true | PrintForOCamlyacc -> false (* -------------------------------------------------------------------------- *) (* Printing a semantic action. *) let print_semantic_action f g branch = let e = Action.to_il_expr branch.action in match mode with | PrintUnitActions _ -> (* In the unit-action modes, we print a pair of empty braces, which is fine. *) () | PrintNormal -> Printer.print_expr f e | PrintForOCamlyacc -> (* In ocamlyacc-compatibility mode, the code must be wrapped in [let]-bindings whose right-hand side uses the [$i] keywords. *) let bindings = List.mapi (fun i producer -> let id = producer_identifier producer and symbol = producer_symbol producer in (* Test if [symbol] is a terminal symbol whose type is [unit]. *) let is_unit_token = try let prop = StringMap.find symbol g.tokens in prop.tk_ocamltype = None with Not_found -> symbol = "error" in (* Define the variable [id] as a synonym for [$(i+1)]. *) (* As an exception to this rule, if [symbol] is a terminal symbol which has been declared *not* to carry a semantic value, then we cannot use [$(i+1)] -- ocamlyacc does not allow it -- so we use the unit value instead. 
*) IL.PVar id, if is_unit_token then IL.EUnit else IL.EVar (sprintf "$%d" (i + 1)) ) branch.producers in (* The identifiers that we bind are pairwise distinct. *) (* We must use simultaneous bindings (that is, a [let/and] form), as opposed to a cascade of [let] bindings. Indeed, ocamlyacc internally translates [$i] to [_i] (just like us!), so name captures will occur unless we restrict the use of [$i] to the outermost scope. (Reported by Kenji Maillard.) *) let e = CodeBits.eletand (bindings, e) in Printer.print_expr f e (* -------------------------------------------------------------------------- *) (* Printing preludes and postludes. *) let print_preludes f g = List.iter (fun prelude -> fprintf f "%%{%s%%}\n" prelude.stretch_raw_content ) g.preludes let print_postludes f g = List.iter (fun postlude -> fprintf f "%s\n" postlude.stretch_raw_content ) g.postludes (* -------------------------------------------------------------------------- *) (* Printing %start declarations. *) let print_start_symbols f g = StringSet.iter (fun symbol -> fprintf f "%%start %s\n" (Misc.normalize symbol) ) g.start_symbols (* -------------------------------------------------------------------------- *) (* Printing %parameter declarations. *) let print_parameter f stretch = fprintf f "%%parameter<%s>\n" stretch.stretch_raw_content let print_parameters f g = match mode with | PrintNormal -> List.iter (print_parameter f) g.parameters | PrintForOCamlyacc | PrintUnitActions _ -> (* %parameter declarations are not supported by ocamlyacc, and presumably become useless when the semantic actions are removed. *) () (* -------------------------------------------------------------------------- *) (* Printing attributes. 
*) let print_attribute f ((name, payload) : attribute) = if attributes_printed then fprintf f " [@%s %s]" (Positions.value name) payload.stretch_raw_content let print_attributes f attrs = List.iter (print_attribute f) attrs (* -------------------------------------------------------------------------- *) (* Printing token declarations and precedence declarations. *) let print_assoc = function | LeftAssoc -> Printf.sprintf "%%left" | RightAssoc -> Printf.sprintf "%%right" | NonAssoc -> Printf.sprintf "%%nonassoc" | UndefinedAssoc -> "" let compare_pairs compare1 compare2 (x1, x2) (y1, y2) = let c = compare1 x1 y1 in if c <> 0 then c else compare2 x2 y2 let compare_tokens (_token, prop) (_token', prop') = match prop.tk_precedence, prop'.tk_precedence with | UndefinedPrecedence, UndefinedPrecedence -> 0 | UndefinedPrecedence, PrecedenceLevel _ -> -1 | PrecedenceLevel _, UndefinedPrecedence -> 1 | PrecedenceLevel (m, v, _, _), PrecedenceLevel (m', v', _, _) -> compare_pairs InputFile.compare_input_files Generic.compare (m, v) (m', v') let print_tokens f g = (* Print the %token declarations. *) StringMap.iter (fun token prop -> if prop.tk_is_declared then fprintf f "%%token%s %s%a\n" (print_token_type prop) token print_attributes prop.tk_attributes ) g.tokens; (* Sort the tokens wrt. precedence, and group them into levels. *) let levels : (string * token_properties) list list = Misc.levels compare_tokens (List.sort compare_tokens ( StringMap.bindings g.tokens )) in (* Print the precedence declarations: %left, %right, %nonassoc. *) List.iter (fun level -> let (_token, prop) = try List.hd level with Failure _ -> assert false in (* Do nothing about the tokens that have no precedence. 
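The grouping step used above, [Misc.levels], can be sketched in isolation. The following standalone reimplementation (hypothetical, not Menhir's actual [Misc.levels]) gathers runs of equivalent elements of an already-sorted list into levels, preserving their order:

```ocaml
(* [levels cmp xs] splits the sorted list [xs] into maximal runs of
   elements that are equivalent according to [cmp]. *)
let levels (cmp : 'a -> 'a -> int) (xs : 'a list) : 'a list list =
  List.fold_right
    (fun x acc ->
       match acc with
       | (y :: _ as level) :: rest when cmp x y = 0 ->
           (* [x] is equivalent to the head of the current level: join it. *)
           (x :: level) :: rest
       | _ ->
           (* Start a new level. *)
           [ x ] :: acc)
    xs []

let () =
  (* Tokens sharing a precedence level end up grouped together. *)
  let cmp (p, _) (p', _) = compare p p' in
  assert (
    levels cmp [ (1, "PLUS"); (1, "MINUS"); (2, "TIMES") ]
    = [ [ (1, "PLUS"); (1, "MINUS") ]; [ (2, "TIMES") ] ])
```

Each resulting level then yields one `%left`/`%right`/`%nonassoc` line, as in the loop above.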
*) if prop.tk_precedence <> UndefinedPrecedence then begin fprintf f "%s" (print_assoc prop.tk_associativity); List.iter (fun (token, _prop) -> fprintf f " %s" token ) level; fprintf f "\n" end ) levels (* -------------------------------------------------------------------------- *) (* Printing %type declarations. *) let print_types f g = StringMap.iter (fun symbol ty -> fprintf f "%%type%s %s\n" (print_nonterminal_type ty) (Misc.normalize symbol) ) g.types (* -------------------------------------------------------------------------- *) (* Printing branches and rules. *) let print_producer sep f producer = fprintf f "%s%s%s%a" (sep()) (print_binding (producer_identifier producer)) (Misc.normalize (producer_symbol producer)) print_attributes (producer_attributes producer) let print_branch f g branch = (* Print the producers. *) let sep = Misc.once "" " " in List.iter (print_producer sep f) branch.producers; (* Print the %prec annotation, if there is one. *) Option.iter (fun x -> fprintf f " %%prec %s" x.value ) branch.branch_prec_annotation; (* Newline, indentation, semantic action. *) fprintf f "\n {"; print_semantic_action f g branch; fprintf f "}\n" (* Because the resolution of reduce/reduce conflicts is implicitly dictated by the order in which productions appear in the grammar, the printer should be careful to preserve this order. *) (* 2016/08/25: As noted above, when two productions originate in different files, we have a problem. We MUST print them in some order, even though they should be incomparable. In that case, we use the order in which the source files are specified on the command line. However, this behavior is undocumented, and should not be exploited. (In previous versions of Menhir, the function passed to [List.sort] was not transitive, so it did not make any sense!) 
*) let compare_branch_production_levels bpl bpl' = match bpl, bpl' with | ProductionLevel (m, l), ProductionLevel (m', l') -> compare_pairs InputFile.compare_input_files Generic.compare (m, l) (m', l') let compare_branches (b : branch) (b' : branch) = compare_branch_production_levels b.branch_production_level b'.branch_production_level let compare_rules (_nt, (r : rule)) (_nt', (r' : rule)) = match r.branches, r'.branches with | [], [] -> 0 | [], _ -> -1 | _, [] -> 1 | b :: _, b' :: _ -> (* To compare two rules, it suffices to compare their first productions. *) compare_branches b b' let print_rule f g (nt, r) = fprintf f "\n%s%a:\n" (Misc.normalize nt) print_attributes r.attributes; (* Menhir accepts a leading "|", but bison does not. Let's not print it. So, we print a bar-separated list. *) let sep = Misc.once (" ") ("| ") in List.iter (fun br -> fprintf f "%s" (sep()); print_branch f g br ) r.branches let print_rules f g = let rules = List.sort compare_rules (StringMap.bindings g.rules) in List.iter (print_rule f g) rules (* -------------------------------------------------------------------------- *) (* Printing %on_error_reduce declarations. *) let print_on_error_reduce_declarations f g = let cmp (_nt, oel) (_nt', oel') = compare_branch_production_levels oel oel' in let levels : (string * on_error_reduce_level) list list = Misc.levels cmp (List.sort cmp ( StringMap.bindings g.on_error_reduce )) in List.iter (fun level -> fprintf f "%%on_error_reduce"; List.iter (fun (nt, _level) -> fprintf f " %s" nt ) level; fprintf f "\n" ) levels let print_on_error_reduce_declarations f g = match mode with | PrintNormal | PrintUnitActions _ -> print_on_error_reduce_declarations f g | PrintForOCamlyacc -> (* %on_error_reduce declarations are not supported by ocamlyacc *) () (* -------------------------------------------------------------------------- *) (* Printing %attribute declarations. 
*) let print_grammar_attribute f ((name, payload) : attribute) = if attributes_printed then fprintf f "%%[@%s %s]\n" (Positions.value name) payload.stretch_raw_content let print_grammar_attributes f g = List.iter (print_grammar_attribute f) g.gr_attributes (* -------------------------------------------------------------------------- *) (* The main entry point. *) let print f g = print_parameters f g; if_ocaml_code_permitted (print_preludes f) g; print_start_symbols f g; print_tokens f g; print_types f g; print_on_error_reduce_declarations f g; print_grammar_attributes f g; fprintf f "%%%%\n"; print_rules f g; fprintf f "\n%%%%\n"; if_ocaml_code_permitted (print_postludes f) g end let print mode = let module P = Print(struct let mode = mode end) in P.print menhir-20200123/src/basicPrinter.mli (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is a pretty-printer for grammars. *) (* If the [mode] parameter requests ``unit actions'', then semantic actions are dropped: that is, they are replaced with trivial semantic actions that return unit. Accordingly, all [%type] declarations are changed to unit. The prologue and epilogue are dropped. All bindings for semantic values are suppressed. If, furthermore, the [mode] parameter requests ``unit tokens'', then the types carried by tokens are changed to unit.
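As a sketch of how a printer can dispatch on such a mode, here is a local stand-in for [Settings.print_mode]; the constructor names mirror those matched in basicPrinter.ml above, but the [bool] payload of [PrintUnitActions] is an assumption made for illustration only.

```ocaml
(* A hypothetical replica of [Settings.print_mode]; the real definition
   lives in Settings. *)
type print_mode =
  | PrintNormal                 (* keep semantic actions as they are *)
  | PrintForOCamlyacc           (* target the ocamlyacc dialect *)
  | PrintUnitActions of bool    (* drop actions; [true] also drops token types *)

(* Dispatching on the mode, in the style of the matches above. *)
let describe = function
  | PrintNormal -> "normal"
  | PrintForOCamlyacc -> "ocamlyacc"
  | PrintUnitActions false -> "unit actions"
  | PrintUnitActions true -> "unit actions, unit tokens"
```

Each printing function above performs exactly this kind of case analysis, so that one functor instantiation yields one fixed printing behavior.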
*) val print: Settings.print_mode -> out_channel -> BasicSyntax.grammar -> unit menhir-20200123/src/basicSyntax.ml (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax (* This is the abstract syntax for an unparameterized grammar, that is, a grammar that does not have any parameterized nonterminal symbols. Such a grammar is obtained as the result of an expansion phase, which is implemented in [ParameterizedGrammar]. *) (* In an unparameterized grammar, %attribute declarations can be desugared away. This is also done during the above-mentioned expansion phase. Thus, in an unparameterized grammar, attributes can be attached in the following places: - with the grammar: field [gr_attributes] of [grammar] - with a terminal symbol: field [tk_attributes] of [token_properties] - with a nonterminal symbol: field [attributes] of [rule] - with a producer: field [producer_attributes] of [producer] *) (* ------------------------------------------------------------------------ *) (* A producer is a pair of an identifier and a symbol. In concrete syntax, it could be [e = expr], for instance. It carries a number of attributes. *) type producer = { producer_identifier : identifier; producer_symbol : symbol; producer_attributes : attributes; } type producers = producer list (* ------------------------------------------------------------------------ *) (* A branch contains a series of producers and a semantic action. It is the same as in the surface syntax; see [Syntax].
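For instance, the producer written [e = expr] in concrete syntax can be modelled as follows; this is a minimal sketch with strings standing in for Menhir's [identifier] and [symbol] types, and with attributes omitted.

```ocaml
(* A stripped-down replica of the [producer] record and its accessors. *)
type producer = {
  producer_identifier : string;
  producer_symbol     : string;
}

let producer_identifier { producer_identifier; _ } = producer_identifier
let producer_symbol { producer_symbol; _ } = producer_symbol

(* The concrete syntax [e = expr] gives rise to this record: *)
let p = { producer_identifier = "e"; producer_symbol = "expr" }
```

The identifier is the OCaml variable under which the symbol's semantic value is visible in the semantic action.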
*) type branch = { branch_position : Positions.t; producers : producers; action : action; branch_prec_annotation : branch_prec_annotation; branch_production_level : branch_production_level } type branches = branch list (* ------------------------------------------------------------------------ *) (* A rule consists mainly of several branches. In contrast with the surface syntax, it has no parameters. *) (* The [%inline] flag is no longer relevant after [NonTerminalInlining]. *) type rule = { branches : branches; positions : Positions.t list; inline_flag : bool; attributes : attributes; } (* ------------------------------------------------------------------------ *) (* A grammar is essentially the same as in the surface syntax; see [Syntax]. The main difference is that [%attribute] declarations, represented by the field [p_symbol_attributes] in the surface syntax, have disappeared. *) type grammar = { preludes : Stretch.t list; postludes : Syntax.postlude list; parameters : Stretch.t list; start_symbols : StringSet.t; types : Stretch.ocamltype StringMap.t; tokens : Syntax.token_properties StringMap.t; on_error_reduce : on_error_reduce_level StringMap.t; gr_attributes : attributes; rules : rule StringMap.t; } (* -------------------------------------------------------------------------- *) (* Accessors for the type [producer]. *) let producer_identifier { producer_identifier } = producer_identifier let producer_symbol { producer_symbol } = producer_symbol let producer_attributes { producer_attributes } = producer_attributes (* -------------------------------------------------------------------------- *) (* A getter and a transformer for the field [branches] of the type [rule]. *) let get_branches rule = rule.branches let transform_branches f rule = { rule with branches = f rule.branches } (* -------------------------------------------------------------------------- *) (* [tokens grammar] is a list of all (real) tokens in the grammar [grammar]. 
The special tokens "#" and "error" are not included. Pseudo-tokens (used in %prec declarations, but never declared using %token) are filtered out. *) let tokens grammar = StringMap.fold (fun token properties tokens -> if properties.tk_is_declared then token :: tokens else tokens ) grammar.tokens [] (* [typed_tokens grammar] is analogous, but includes the OCaml type of each token. *) let typed_tokens grammar = StringMap.fold (fun token properties tokens -> if properties.tk_is_declared then (token, properties.tk_ocamltype) :: tokens else tokens ) grammar.tokens [] (* [nonterminals grammar] is a list of all nonterminal symbols in the grammar [grammar]. *) let nonterminals grammar : nonterminal list = StringMap.fold (fun nt _ rules -> nt :: rules) grammar.rules [] (* [ocamltype_of_symbol grammar symbol] produces the OCaml type of the symbol [symbol] in the grammar [grammar], if it is known. *) let ocamltype_of_symbol grammar symbol : Stretch.ocamltype option = try Some (StringMap.find symbol grammar.types) with Not_found -> None (* [ocamltype_of_start_symbol grammar symbol] produces the OCaml type of the start symbol [symbol] in the grammar [grammar]. *) let ocamltype_of_start_symbol grammar symbol : Stretch.ocamltype = try StringMap.find symbol grammar.types with Not_found -> (* Every start symbol should have a type. *) assert false (* [is_inline_symbol grammar symbol] tells whether [symbol] is a nonterminal symbol (as opposed to a terminal symbol) and is marked %inline. *) let is_inline_symbol grammar symbol : bool = match StringMap.find symbol grammar.rules with | rule -> (* This is a nonterminal symbol. Test its %inline flag. *) rule.inline_flag | exception Not_found -> (* This is a terminal symbol. *) false (* [is_inline_producer grammar producer] tells whether [producer] represents a nonterminal symbol (as opposed to a terminal) and is marked %inline.
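The folds above can be sketched against the standard library's maps, with a plain boolean standing in for the [tk_is_declared] field; the token names below are invented for illustration.

```ocaml
module StringMap = Map.Make (String)

(* Keep only the tokens that were actually declared with %token,
   in the style of [tokens] above. *)
let tokens (decls : bool StringMap.t) : string list =
  StringMap.fold
    (fun token is_declared acc ->
      if is_declared then token :: acc else acc)
    decls []

(* "error" is a pseudo-token here: present in the map, never declared. *)
let example =
  StringMap.(empty |> add "INT" true |> add "error" false |> add "PLUS" true)
```

Because [fold] visits keys in increasing order, the resulting list is deterministic for a given grammar.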
*) let is_inline_producer grammar producer = is_inline_symbol grammar (producer_symbol producer) (* -------------------------------------------------------------------------- *) (* [names producers] is the set of names of the producers [producers]. The name of a producer is the OCaml variable that is used to name its semantic value. *) (* This function checks on the fly that no two producers carry the same name. This check should never fail if we have performed appropriate renamings. It is a debugging aid. *) let names (producers : producers) : StringSet.t = List.fold_left (fun ids producer -> let id = producer_identifier producer in assert (not (StringSet.mem id ids)); StringSet.add id ids ) StringSet.empty producers menhir-20200123/src/chopInlined.mll (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Chopping [_inlined] off a name, if there is one, and returning the numeric suffix that follows, if there is one. *) rule chop = parse | (_* as x) "_inlined" (['0'-'9']+ as n) eof { x, int_of_string n } | (_* as x) "_inlined" eof { x, 0 } | (_* as x) eof { x, 0 } menhir-20200123/src/cmly_write.ml (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved.
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax open Grammar open Cmly_format let raw_content stretch = stretch.Stretch.stretch_raw_content let ocamltype (typ : Stretch.ocamltype) : ocamltype = match typ with | Stretch.Declared stretch -> raw_content stretch | Stretch.Inferred typ -> typ let ocamltype (typo : Stretch.ocamltype option) : ocamltype option = match typo with | None -> None | Some typ -> Some (ocamltype typ) let range (pos : Positions.t) : range = { r_start = Positions.start_of_position pos; r_end = Positions.end_of_position pos; } let ranges = List.map range let attribute (label, payload : Syntax.attribute) : attribute = { a_label = Positions.value label; a_payload = raw_content payload; a_position = range (Positions.position label); } let attributes : Syntax.attributes -> attributes = List.map attribute let terminal (t : Terminal.t) : terminal_def = { t_kind = ( if Terminal.equal t Terminal.error then `ERROR else if (match Terminal.eof with | None -> false | Some eof -> Terminal.equal t eof) then `EOF else if Terminal.pseudo t then `PSEUDO else `REGULAR ); t_name = Terminal.print t; t_type = ocamltype (Terminal.ocamltype t); t_attributes = attributes (Terminal.attributes t); } let nonterminal (nt : Nonterminal.t) : nonterminal_def = let is_start = Nonterminal.is_start nt in { n_kind = if is_start then `START else `REGULAR; n_name = Nonterminal.print false nt; n_mangled_name = Nonterminal.print true nt; n_type = if is_start then None else ocamltype (Nonterminal.ocamltype nt); n_positions = if is_start then [] else ranges (Nonterminal.positions nt); n_nullable = Analysis.nullable nt; n_first = List.map Terminal.t2i (TerminalSet.elements (Analysis.first nt)); n_attributes = if is_start then [] else attributes (Nonterminal.attributes nt); } let symbol (sym : Symbol.t) 
: symbol = match sym with | Symbol.N n -> N (Nonterminal.n2i n) | Symbol.T t -> T (Terminal.t2i t) let action (a : Action.t) : action = { a_expr = Printer.string_of_expr (Action.to_il_expr a); a_keywords = Keyword.KeywordSet.elements (Action.keywords a); } let rhs (prod : Production.index) : producer_def array = match Production.classify prod with | Some n -> [| (N (Nonterminal.n2i n), "", []) |] | None -> Array.mapi (fun i sym -> let id = (Production.identifiers prod).(i) in let attrs = attributes (Production.rhs_attributes prod).(i) in symbol sym, id, attrs ) (Production.rhs prod) let production (prod : Production.index) : production_def = { p_kind = if Production.is_start prod then `START else `REGULAR; p_lhs = Nonterminal.n2i (Production.nt prod); p_rhs = rhs prod; p_positions = ranges (Production.positions prod); p_action = if Production.is_start prod then None else Some (action (Production.action prod)); p_attributes = attributes (Production.lhs_attributes prod); } let item (i : Item.t) : production * int = let p, i = Item.export i in (Production.p2i p, i) let itemset (is : Item.Set.t) : (production * int) list = List.map item (Item.Set.elements is) let lr0_state (node : Lr0.node) : lr0_state_def = { lr0_incoming = Option.map symbol (Lr0.incoming_symbol node); lr0_items = itemset (Lr0.items node) } let transition (sym, node) : symbol * lr1 = (symbol sym, Lr1.number node) let lr1_state (node : Lr1.node) : lr1_state_def = { lr1_lr0 = Lr0.core (Lr1.state node); lr1_transitions = List.map transition (SymbolMap.bindings (Lr1.transitions node)); lr1_reductions = let add t ps rs = (Terminal.t2i t, List.map Production.p2i ps) :: rs in TerminalMap.fold_rev add (Lr1.reductions node) [] } let entry_point prod node nt _typ accu : (nonterminal * production * lr1) list = (Nonterminal.n2i nt, Production.p2i prod, Lr1.number node) :: accu let encode () : grammar = { g_basename = Settings.base; g_preludes = List.map raw_content Front.grammar.preludes; g_postludes = List.map 
raw_content Front.grammar.postludes; g_terminals = Terminal.init terminal; g_nonterminals = Nonterminal.init nonterminal; g_productions = Production.init production; g_lr0_states = Array.init Lr0.n lr0_state; g_lr1_states = Array.of_list (Lr1.map lr1_state); g_entry_points = Lr1.fold_entry entry_point []; g_attributes = attributes Analysis.attributes; g_parameters = List.map raw_content Front.grammar.parameters; } let write oc t = (* .cmly file format: CMLY ++ version string ++ grammar *) let magic = "CMLY" ^ Version.version in output_string oc magic; output_value oc (t : grammar) let write filename = (* Opening in binary mode is required. This is not a text file; we write to it using [output_value]. *) let oc = open_out_bin filename in write oc (encode()); close_out oc menhir-20200123/src/cmly_write.mli (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* [write filename] queries the modules [Front] and [Grammar] for information about the grammar and queries the modules [Lr0] and [Lr1] for information about the automaton. It writes this information to the .cmly file [filename]. *) val write: string -> unit menhir-20200123/src/codeBackend.ml (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved.
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The code generator. *) module Run (T : sig end) = struct open Grammar open IL open CodeBits open CodePieces open TokenType open Interface (* ------------------------------------------------------------------------ *) (* Here is a description of our code generation mechanism. Every internal function that we produce is parameterized by the parser environment [env], which contains (pointers to) the lexer, the lexing buffer, the last token read, etc. No global variables are exploited, so our parsers are reentrant. The functions that we export do not expect an environment as a parameter; they create a fresh one when invoked. Every state [s] is translated to a [run] function. To a first approximation, the only parameter of the [run] function, besides [env], is the stack. However, in some cases (consult the predicate [runpushes]), the top stack cell is not yet allocated when [run s] is called. The cell's contents are passed as extra parameters, and it is [run]'s responsibility to allocate that cell. (When [run] is the target of a shift transition, the position parameters [startp] and [endp] are redundant with the [env] parameter, because they are always equal to [env.startp] and [env.endp]. However, this does not appear to make a great difference in terms of code size, and makes our life easier, so we do not attempt to eliminate this redundancy.) The first thing in [run] is to discard a token, if the state was entered through a shift transition, and to peek at the lookahead token. When the current token is to be discarded, the [discard] function is invoked. It discards the current token, invokes the lexer to obtain a new token, and returns an updated environment. 
When we only wish to peek at the current token, without discarding it, we simply read [env.token]. (We have to be careful in cases where the current lookahead token might be [error], since, in those cases, [env.token] is meaningless; see below.) Once the lookahead token is obtained, [run] performs a case analysis of the lookahead token. Each branch performs one of the following. In shift branches, control is dispatched to another [run] function, with appropriate parameters, typically the current stack plus the information that should go into the new top stack cell (a state, a semantic value, locations). In reduce branches, a [reduce] function is invoked. In the default branch, error handling is initiated (see below). The [reduce] function associated with production [prod] pops as many stack cells as necessary, retrieving semantic values and the state [s] that initiated the reduction. It then evaluates the semantic action, which yields a new semantic value. (This is the only place where semantic actions are evaluated, so that semantic actions are never duplicated.) It then passes control on to the [goto] function associated with the nonterminal [nt], where [nt] is the left-hand side of the production [prod]. The [goto] function associated with nonterminal [nt] expects just one parameter besides the environment -- namely, the stack. However, in some cases (consult the predicate [gotopushes]), the top stack cell is not allocated yet, so its contents are passed as extra parameters. In that case, [goto] first allocates that cell. Then, it examines the state found in that cell and performs a goto transition, that is, a shift transition on the nonterminal symbol [nt]. This simply consists in passing control to the [run] function associated with the transition's target state. If this case analysis only has one branch, because all transitions for [nt] lead to the same target state, then no case analysis is required. 
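The popping performed by a [reduce] function can be sketched over a toy stack; each cell holds a state number and a semantic value (positions are omitted), and [pop n] returns the popped values together with the state that initiated the reduction. The concrete numbers are invented for illustration.

```ocaml
type cell = { state : int; value : int }

(* [pop n stack []] pops [n] cells, accumulating their semantic values
   in left-to-right (grammar) order, and reads the initiating state off
   the cell that remains on top. *)
let rec pop n stack values =
  match n, stack with
  | 0, { state; _ } :: _ -> state, values, stack
  | _, cell :: rest -> pop (n - 1) rest (cell.value :: values)
  | _, [] -> invalid_arg "pop: stack underflow"

(* Reducing a production of length 2 on top of state 1: *)
let initiator, semantic_values, remaining =
  pop 2
    [ { state = 3; value = 30 }; { state = 2; value = 20 };
      { state = 1; value = 10 } ]
    []
```

In the generated code there is no such generic [pop]: each [reduce] function pops a statically known number of cells, so the loop is unrolled.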
In principle, a stack cell contains a state, a semantic value, and start and end positions. However, the state can be omitted if it is never consulted by a [goto] function. The semantic value can be omitted if it is associated with a token that was declared not to carry a semantic value. (One could also omit semantic values for nonterminals whose type was declared to be [unit], but that does not seem very useful.) The start or end position can be omitted if they are associated with a symbol that does not require keeping track of positions. When all components of a stack cell are omitted, the entire cell disappears, so that no memory allocation is required. For each start symbol [nt], an entry point function, named after [nt], is generated. Its parameters are a lexer and a lexing buffer. The function allocates and initializes a parser environment and transfers control to the appropriate [run] function. Our functions are grouped into one huge [let rec] definition. The inliner, implemented as a separate module, will inline functions that are called at most once, remove dead code (although there should be none or next to none), and possibly perform other transformations. I note that, if a state can be entered only through (nondefault) reductions, then, in that state, the lookahead token must be a member of the set of tokens that allow these reductions, and by construction, there must exist an action on that token in that state. Thus, the default branch (which signals an error when the lookahead token is not a member of the expected set) is in fact dead. It would be nice (but difficult) to exploit types to prove that. However, one could at least replace the code of that branch with a simple [assert false]. TEMPORARY do it *) (* ------------------------------------------------------------------------ *) (* Here is a description of our error handling mechanism. With every state [s], we associate an [error] function. 
If [s] is willing to act when the lookahead token is [error], then this function tells how. This includes *both* shift *and* reduce actions. (For some reason, yacc/ocamlyacc/mule/bison can only shift on [error].) If [s] is unable to act when the lookahead token is [error], then this function pops a stack cell, extracts a state [s'] out of it, and transfers control, via a global [errorcase] dispatch function, to the [error] function associated with [s']. (Because some stack cells do not physically hold a state, this description is somewhat simpler than the truth, but that's the idea.) When an error is detected in state [s], then (see [initiate]) the [error] function associated with [s] is invoked. Immediately before invoking the [error] function, the flag [env.error] is set. By convention, this means that the current token is discarded and replaced with an [error] token. The [error] token transparently inherits the positions associated with the underlying concrete token. Whenever we attempt to consult the current token, we check whether [env.error] is set and, if that is the case, resume error handling by calling the [error] function associated with the current state. This allows a series of reductions to correctly take place when the lookahead token is [error]. In many states, though, it is possible to statically prove that [env.error] cannot be set. In that case, we produce a lookup of [env.token] without checking [env.error]. The flag [env.error] is cleared when a token is shifted. States with default reductions perform a reduction regardless of the current lookahead token, which can be either [error] or a regular token. A question that bothered me for a while was, when unwinding the stack, do we stop at a state that has a default reduction? Should it be considered able to handle the error token? I now believe that the answer is, this cannot happen. 
Indeed, if a state has a default reduction, then, whenever it is entered, reduction is performed and that state is exited, which means that it is never pushed onto the stack. So, it is fine to consider that a state with a default reduction is unable to handle errors. I note that a state that can handle [error] and has a default reduction must in fact have a reduction action on [error]. *) (* The type of environments. *) let tcenv = env let tenv = TypApp (tcenv, []) (* The [assertfalse] function. We have just one of these, in order to save code size. It should become unnecessary when we add GADTs. *) let assertfalse = prefix "fail" (* The [discard] function. *) let discard = prefix "discard" (* The [initenv] function. *) let initenv = prefix "init" (* The [run] function associated with a state [s]. *) let run s = prefix (Printf.sprintf "run%d" (Lr1.number s)) (* The [goto] function associated with a nonterminal [nt]. *) let goto nt = prefix (Printf.sprintf "goto_%s" (Nonterminal.print true nt)) (* The [reduce] function associated with a production [prod]. *) let reduce prod = prefix (Printf.sprintf "reduce%d" (Production.p2i prod)) (* The [errorcase] function. *) let errorcase = prefix "errorcase" (* The [error] function associated with a state [s]. *) let error s = prefix (Printf.sprintf "error%d" (Lr1.number s)) (* The constant associated with a state [s]. *) let statecon s = dataprefix (Printf.sprintf "State%d" (Lr1.number s)) let estatecon s = EData (statecon s, []) let pstatecon s = PData (statecon s, []) let pstatescon ss = POr (List.map pstatecon ss) (* The type of states. *) let tcstate = prefix "state" let tstate = TypApp (tcstate, []) (* The [print_token] function. This automatically generated function is used in [--trace] mode. *) let print_token = prefix "print_token" (* Fields in the environment record. 
*) let flexer = prefix "lexer" let flexbuf = prefix "lexbuf" let ftoken = prefix "token" let ferror = prefix "error" (* The type variable that represents the stack tail. *) let tvtail = tvprefix "tail" let ttail = TypVar tvtail (* The result type for every function. TEMPORARY *) let tvresult = tvprefix "return" let tresult = TypVar tvresult (* ------------------------------------------------------------------------ *) (* Helpers for code production. *) let var x : expr = EVar x let pvar x : pattern = PVar x let magic e : expr = EMagic e let nomagic e = e (* The following assertion checks that [env.error] is [false]. *) let assertnoerror : pattern * expr = PUnit, EApp (EVar "assert", [ EApp (EVar "not", [ ERecordAccess (EVar env, ferror) ]) ]) let trace (format : string) (args : expr list) : (pattern * expr) list = if Settings.trace then [ PUnit, EApp (EVar "Printf.fprintf", (EVar "stderr") :: (EStringConst (format ^"\n%!")) :: args) ] else [] let tracecomment (comment : string) (body : expr) : expr = if Settings.trace then blet (trace comment [], body) else EComment (comment, body) let auto2scheme t = scheme [ tvtail; tvresult ] t (* ------------------------------------------------------------------------ *) (* Accessing the positions of the current token. *) (* There are two ways we can go about this. We can read the positions from the lexbuf immediately after we request a new token, or we can wait until we need the positions and read them at that point. As of 2014/12/12, we switch to the latter approach. The speed difference in a micro-benchmark is not measurable, but this allows us to save two fields in the [env] record, which should be a good thing, as it implies less frequent minor collections. 
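Reading the positions directly off the lexing buffer amounts to the following, using the standard [Lexing] module; [startp] and [endp] here correspond to [getstartp] and [getendp] below.

```ocaml
(* Fetch the start and end positions of the last token on demand,
   instead of caching them in the environment record. *)
let startp (lexbuf : Lexing.lexbuf) = lexbuf.Lexing.lex_start_p
let endp (lexbuf : Lexing.lexbuf) = lexbuf.Lexing.lex_curr_p

(* A freshly created buffer starts at character offset 0. *)
let lexbuf = Lexing.from_string "some input"
```

Since the positions are only two record accesses away, fetching them lazily costs nothing measurable, while the environment record stays two fields smaller.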
*) let getstartp = ERecordAccess (ERecordAccess (EVar env, flexbuf), "Lexing.lex_start_p") let getendp = ERecordAccess (ERecordAccess (EVar env, flexbuf), "Lexing.lex_curr_p") (* ------------------------------------------------------------------------ *) (* Determine whether the [goto] function for nonterminal [nt] will push a new cell onto the stack. If it doesn't, then that job is delegated to the [run] functions called by [goto]. One could decide that [gotopushes] always returns true, and produce decent code. As a refinement, we decide to drive the [push] operation inside the [run] functions if all of them are able to eliminate this operation via shiftreduce optimization. This will be the case if all of these [run] functions implement a default reduction of a non-epsilon production. If that is not the case, then [gotopushes] returns true. In general, it is good to place the [push] operation inside [goto], because multiple [reduce] functions transfer control to [goto], and [goto] in turn transfers control to multiple [run] functions. Hence, this is where code sharing is maximal. All of the [run] functions that [goto] can transfer control to expect a stack cell of the same shape (indeed, the symbol [nt] is the same in every case, and the state is always represented), which makes this decision possible. *) let gotopushes : Nonterminal.t -> bool = Nonterminal.tabulate (fun nt -> not ( Lr1.targets (fun accu _ target -> accu && match Default.has_default_reduction target with | Some (prod, _) -> Production.length prod > 0 | None -> false ) true (Symbol.N nt) ) ) (* ------------------------------------------------------------------------ *) (* Determine whether the [run] function for state [s] will push a new cell onto the stack. Our convention is this. If this [run] function is entered via a shift transition, then it is in charge of pushing a new stack cell. 
If it is entered via a goto transition, then it is in charge of pushing a new cell if and only if the [goto] function that invoked it did not do so. Last, if this [run] function is invoked directly by an entry point, then it does not push a stack cell. *) let runpushes s = match Lr1.incoming_symbol s with | Some (Symbol.T _) -> true | Some (Symbol.N nt) -> not (gotopushes nt) | None -> false (* ------------------------------------------------------------------------ *) (* In some situations, we are able to fuse a shift (or goto) transition with a reduce transition, which means that we save the cost (in speed and in code size) of pushing and popping the top stack cell. This involves creating a modified version of the [reduce] function associated with a production [prod], where the contents of the top stack cell are passed as extra parameters. Because we wish to avoid code duplication, we perform this change only if all call sites for [reduce] agree on this modified calling convention. At the call site, the optimization is possible only if a stack cell allocation exists and is immediately followed by a call to [reduce]. This is the case inside the [run] function for state [s] when [run] pushes a stack cell and performs a default reduction. This optimization amounts to coalescing the push operation inside [run] with the pop operation that follows inside [reduce]. Unit production elimination, on the other hand, would coalesce the pop operation inside [reduce] with the push operation that follows inside [goto]. For this reason, the two are contradictory. As a result, we do not attempt to perform unit production elimination. In fact, we did implement it at one point and found that it was seldom applicable, because preference was given to the shiftreduce optimization. There are cases where shiftreduce optimization does not make any difference, for instance, if production [prod] is never reduced, or if the top stack cell is in fact nonexistent. 
*) let (shiftreduce : Production.index -> bool), shiftreducecount = Production.tabulateb (fun prod -> (* Check that this production pops at least one stack cell. *) Production.length prod > 0 && (* Check that all call sites push a stack cell and have a default reduction. *) Lr1.NodeSet.fold (fun s accu -> accu && (match Default.has_default_reduction s with None -> false | Some _ -> true) && (runpushes s) ) (Lr1.production_where prod) true ) let () = Error.logC 1 (fun f -> Printf.fprintf f "%d out of %d productions exploit shiftreduce optimization.\n" shiftreducecount Production.n) (* Check that, as predicted above, [gotopushes nt] returns [false] only when all of the [run] functions that follow it perform shiftreduce optimization. This can be proved as follows. If [gotopushes nt] returns [false], then every successor state [s] has a default reduction for some non-epsilon production [prod]. Furthermore, all states that can reduce [prod] must be successors of that same [goto] function: indeed, because the right-hand side of the production ends with symbol [nt], every state that can reduce [prod] must be entered through [nt]. So, at all such states, [runpushes] is true, which guarantees that [shiftreduce prod] is true as well. *) let () = assert ( Nonterminal.fold (fun nt accu -> accu && if gotopushes nt then true else Lr1.targets (fun accu _ target -> accu && match Default.has_default_reduction target with | Some (prod, _) -> shiftreduce prod | None -> false ) true (Symbol.N nt) ) true ) (* ------------------------------------------------------------------------ *) (* Type production. *) (* This is the type of states. Only states that are represented are declared. *) let statetypedef = { typename = tcstate; typeparams = []; typerhs = TDefSum ( Lr1.fold (fun defs s -> if Invariant.represented s then { dataname = statecon s; datavalparams = []; datatypeparams = None } :: defs else defs ) [] ); typeconstraint = None } (* The type of lexers. 
*) let tlexer = TypArrow (tlexbuf, ttoken) (* This is the type of parser environments. *) let field modifiable name t = { modifiable = modifiable; fieldname = name; fieldtype = type2scheme t } let envtypedef = { typename = tcenv; typeparams = []; typerhs = TDefRecord [ (* The lexer itself. *) field false flexer tlexer; (* The lexing buffer. *) field false flexbuf tlexbuf; (* The last token that was read from the lexer. This is the head of the token stream, unless [env.error] is set. *) field false ftoken ttoken; (* A flag which tells whether we currently have an [error] token at the head of the stream. When this flag is set, the head of the token stream is the [error] token, and the contents of the [token] field are irrelevant. The token following [error] is obtained by invoking the lexer again. *) field true ferror tbool; ]; typeconstraint = None } (* [curry] curries the top stack cell in a type [t] of the form [(stack type) arrow (result type)]. [t] remains unchanged if the stack type does not make at least one cell explicit. *) let curry = function | TypArrow (TypTuple (tstack :: tcell), tresult) -> TypArrow (tstack, marrow tcell tresult) | TypArrow _ as t -> t | _ -> assert false (* [curryif true] is [curry], [curryif false] is the identity. *) let curryif flag t = if flag then curry t else t (* Types for stack cells. [celltype tailtype holds_state symbol] returns the type of a stack cell. The parameter [tailtype] is the type of the tail of the stack. The flag [holds_state] tells whether the cell holds a state. The parameter [symbol] is used to determine whether the cell holds a semantic value and what its type is. A subtlety here and in [curry] above is that singleton stack cells give rise to singleton tuple types, which the type printer eliminates, but which do exist internally. As a result, [curry] always correctly removes the top stack cell, even if it is a singleton tuple cell. 
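For intuition, the transformation performed by [curry] can be modeled on a toy version of the IL type language. This is a self-contained sketch under assumed constructors, not Menhir's actual IL module:

```ocaml
type typ =
  | TypVar of string
  | TypTuple of typ list
  | TypArrow of typ * typ

(* [marrow ts t] builds the multi-argument arrow t1 -> ... -> tn -> t. *)
let marrow ts t = List.fold_right (fun t1 t2 -> TypArrow (t1, t2)) ts t

(* Mirror of [curry] above: expose the components of the top stack cell
   as extra function parameters. *)
let curry = function
  | TypArrow (TypTuple (tstack :: tcell), tresult) ->
      TypArrow (tstack, marrow tcell tresult)
  | TypArrow _ as t -> t
  | _ -> assert false

let () =
  let tail = TypVar "tail" and cell = TypVar "cell" and r = TypVar "r" in
  (* (tail * cell) -> r  becomes  tail -> cell -> r *)
  assert (curry (TypArrow (TypTuple [tail; cell], r))
          = TypArrow (tail, TypArrow (cell, r)))
```
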
*) let celltype tailtype holds_state symbol _ = TypTuple ( tailtype :: elementif (Invariant.endp symbol) tposition @ elementif holds_state tstate @ semvtype symbol @ elementif (Invariant.startp symbol) tposition ) (* Types for stacks. [stacktype s] is the type of the stack at state [s]. [reducestacktype prod] is the type of the stack when about to reduce production [prod]. [gotostacktype nt] is the type of the stack when the [goto] function associated with [nt] is called. In all cases, the tail (that is, the unknown part) of the stack is represented by [ttail], currently a type variable. These stack types are obtained by folding [celltype] over a description of the stack provided by module [Invariant]. *) let stacktype s = Invariant.fold celltype ttail (Invariant.stack s) let reducestacktype prod = Invariant.fold celltype ttail (Invariant.prodstack prod) let gotostacktype nt = Invariant.fold celltype ttail (Invariant.gotostack nt) (* The type of the [run] function. As announced earlier, if [s] is the target of shift transitions, the type of the stack is curried, that is, the top stack cell is not yet allocated, so its contents are passed as extra parameters. If [s] is the target of goto transitions, the top stack cell is allocated. If [s] is a start state, this issue makes no difference. *) let runtypescheme s = auto2scheme ( arrow tenv ( curryif (runpushes s) ( arrow (stacktype s) tresult ) ) ) (* The type of the [goto] function. The top stack cell is curried. *) let gototypescheme nt = auto2scheme (arrow tenv (curry (arrow (gotostacktype nt) tresult))) (* If [prod] is an epsilon production and if the [goto] function associated with it expects a state parameter, then the [reduce] function associated with [prod] also requires a state parameter. *) let reduce_expects_state_param prod = let nt = Production.nt prod in Production.length prod = 0 && Invariant.fold (fun _ holds_state _ _ -> holds_state) false (Invariant.gotostack nt) (* The type of the [reduce] function. 
If shiftreduce optimization is performed for this production, then the top stack cell is not explicitly allocated. *) let reducetypescheme prod = auto2scheme ( arrow tenv ( curryif (shiftreduce prod) ( arrow (reducestacktype prod) ( arrowif (reduce_expects_state_param prod) tstate tresult ) ) ) ) (* The type of the [errorcase] function. The shape of the stack is unknown, and is determined by examining the state parameter. *) let errorcasetypescheme = auto2scheme (marrow [ tenv; ttail; tstate ] tresult) (* The type of the [error] function. The shape of the stack is the one associated with state [s]. *) let errortypescheme s = auto2scheme ( marrow [ tenv; stacktype s ] tresult) (* ------------------------------------------------------------------------ *) (* Code production preliminaries. *) (* This flag will be set to [true] if we ever raise the [Error] exception. This happens when we unwind the entire stack without finding a state that can handle errors. *) let can_die = ref false (* A code pattern for an exception handling construct where both alternatives are in tail position. Concrete syntax in OCaml 4.02 is [match e with x -> e1 | exception Error -> e2]. Earlier versions of OCaml do not support this construct. We continue to emulate it using a combination of [try/with], [match/with], and an [option] value. It is used only in a very rare case anyway. *) let letunless e x e1 e2 = EMatch ( ETry ( EData ("Some", [ e ]), [ { branchpat = PData (excdef.excname, []); branchbody = EData ("None", []) } ] ), [ { branchpat = PData ("Some", [ PVar x ]); branchbody = e1 }; { branchpat = PData ("None", []); branchbody = e2 } ] ) (* ------------------------------------------------------------------------ *) (* Calling conventions. *) (* The layout of a stack cell is determined here. The first field in a stack cell is always a pointer to the rest of the stack; it is followed by the fields listed below, each of which may or may not appear. 
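As a concrete picture of this layout, here is a hypothetical two-cell stack built along these lines; the field choices and values are invented for illustration only, not generated code:

```ocaml
(* Each cell is a tuple whose first component is the rest of the stack;
   the remaining components (here: a state number and a semantic value)
   stand for the optional fields described above. *)
let bottom = ((), Lexing.dummy_pos)      (* sentinel cell *)
let cell1 = (bottom, 1, "semv1")         (* first real cell *)
let cell2 = (cell1, 2, "semv2")          (* top of the stack *)

let () =
  (* Popping the top cell is just a tuple match. *)
  let (rest, state, semv) = cell2 in
  assert (state = 2 && semv = "semv2");
  let (_, state', _) = rest in
  assert (state' = 1)
```
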
[runpushcell] and [gotopushcell] are the two places where stack cells are allocated. *) (* 2015/11/04. We make [endp] the first element in the list of optional fields, so we are able to access it at a fixed offset, provided we know that it exists. This is exploited when reducing an epsilon production. *) (* The contents of a stack cell, exposed as individual parameters. The choice of identifiers is suitable for use in the definition of [run]. *) let runcellparams var holds_state symbol = elementif (Invariant.endp symbol) (var endp) @ elementif holds_state (var state) @ symval symbol (var semv) @ elementif (Invariant.startp symbol) (var startp) (* The contents of a stack cell, exposed as individual parameters, again. The choice of identifiers is suitable for use in the definition of a [reduce] function. [prod] is the production's index. The integer [i] tells which symbol on the right-hand side we are focusing on, that is, which symbol this stack cell is associated with. *) let reducecellparams prod i holds_state symbol = let ids = Production.identifiers prod in (* The semantic value is bound to the variable [ids.(i)]. Its type is [t]. As of 2016/03/11, we generate a type annotation. Indeed, because of our use of [magic], the semantic value would otherwise have an unknown type; and, if it is a function, the OCaml compiler could warn (incorrectly) that this function does not use its argument. *) let semvpat t = PAnnot (PVar ids.(i), t) in elementif (Invariant.endp symbol) (PVar (Printf.sprintf "_endpos_%s_" ids.(i))) @ elementif holds_state (if i = 0 then PVar state else PWildcard) @ symvalt symbol semvpat @ elementif (Invariant.startp symbol) (PVar (Printf.sprintf "_startpos_%s_" ids.(i))) (* The contents of a stack cell, exposed as individual parameters, again. The choice of identifiers is suitable for use in the definition of [error]. 
*) let errorcellparams (i, pat) holds_state symbol _ = i + 1, ptuple ( pat :: elementif (Invariant.endp symbol) PWildcard @ elementif holds_state (if i = 0 then PVar state else PWildcard) @ symval symbol PWildcard @ elementif (Invariant.startp symbol) PWildcard ) (* Calls to [run]. *) let runparams magic var s = var env :: magic (var stack) :: listif (runpushes s) (Invariant.fold_top (runcellparams var) [] (Invariant.stack s)) let call_run s actuals = EApp (EVar (run s), actuals) (* The parameters to [reduce]. When shiftreduce optimization is in effect, the top stack cell is not allocated, so extra parameters are required. Note that [shiftreduce prod] and [reduce_expects_state_param prod] are mutually exclusive conditions, so the [state] parameter is never bound twice. *) let reduceparams prod = PVar env :: PVar stack :: listif (shiftreduce prod) ( Invariant.fold_top (reducecellparams prod (Production.length prod - 1)) [] (Invariant.prodstack prod) ) @ elementif (reduce_expects_state_param prod) (PVar state) (* Calls to [reduce]. One must specify the production [prod] as well as the current state [s]. *) let call_reduce prod s = let actuals = (EVar env) :: (EMagic (EVar stack)) :: listif (shiftreduce prod) (Invariant.fold_top (runcellparams var) [] (Invariant.stack s)) (* compare with [runpushcell s] *) @ elementif (reduce_expects_state_param prod) (estatecon s) in EApp (EVar (reduce prod), actuals) (* Calls to [goto]. *) let gotoparams var nt = var env :: var stack :: Invariant.fold_top (runcellparams var) [] (Invariant.gotostack nt) let call_goto nt = EApp (EVar (goto nt), gotoparams var nt) (* Calls to [errorcase]. *) let errorcaseparams magic var = [ var env; magic (var stack); var state ] let call_errorcase = EApp (EVar errorcase, errorcaseparams magic var) (* Calls to [error]. 
*) let errorparams magic var = [ var env; magic (var stack) ] let call_error magic s = EApp (EVar (error s), errorparams magic var) let call_error_via_errorcase magic s = (* TEMPORARY document *) if Invariant.represented s then EApp (EVar errorcase, [ var env; magic (var stack); estatecon s ]) else call_error magic s (* Calls to [assertfalse]. *) let call_assertfalse = EApp (EVar assertfalse, [ EVar "()" ]) (* ------------------------------------------------------------------------ *) (* Code production for the automaton functions. *) (* Count how many states actually can peek at an error token. This figure is, in general, less than or equal to the number of states at which [Invariant.errorpeeker] is true, because some of these states have a default reduction and will not consult the lookahead token. *) let errorpeekers = ref 0 (* Code for calling the reduction function for production [prod] upon finding a token within [toks]. This produces a branch, to be inserted in a [run] function for state [s]. *) let reducebranch toks prod s = { branchpat = tokspat toks; branchbody = call_reduce prod s } (* Code for shifting from state [s] to state [s'] via the token [tok]. This produces a branch, to be inserted in a [run] function for state [s]. The callee, [run s'], is responsible for taking the current token off the input stream. (There is actually a case where the token is *not* taken off the stream: when [s'] has a default reduction on [#].) It is also responsible for pushing a new stack cell. The rationale behind this decision is that there may be multiple shift transitions into [s'], so we actually share that code by placing it inside [run s'] rather than inside every transition. *) let shiftbranchbody s tok s' = (* Construct the actual parameters for [run s']. 
*) let actuals = (EVar env) :: (EMagic (EVar stack)) :: Invariant.fold_top (fun holds_state symbol -> assert (Symbol.equal (Symbol.T tok) symbol); elementif (Invariant.endp symbol) getendp @ elementif holds_state (estatecon s) @ tokval tok (EVar semv) @ elementif (Invariant.startp symbol) getstartp ) [] (Invariant.stack s') in (* Call [run s']. *) tracecomment (Printf.sprintf "Shifting (%s) to state %d" (Terminal.print tok) (Lr1.number s')) (call_run s' actuals) let shiftbranch s tok s' = assert (not (Terminal.pseudo tok)); { branchpat = PData (tokendata (Terminal.print tok), tokval tok (PVar semv)); branchbody = shiftbranchbody s tok s' } (* This generates code for pushing a new stack cell upon entering the [run] function for state [s]. *) let runpushcell s e = if runpushes s then let contents = var stack :: Invariant.fold_top (runcellparams var) [] (Invariant.stack s) in mlet [ pvar stack ] [ etuple contents ] e else e let runpushcellunless shiftreduce s e = if shiftreduce then EComment ("Not allocating top stack cell", e) else runpushcell s e (* This generates code for dealing with the lookahead token upon entering the [run] function for state [s]. If [s] is the target of a shift transition, then we must take the current token (which was consumed in the shift transition) off the input stream. Whether [s] was entered through a shift or a goto transition, we want to peek at the next token, unless we are performing a default reduction. The parameter [defred] tells which default reduction, if any, we are about to perform. *) (* 2014/12/06 New convention regarding initial states (i.e., states which have no incoming symbol). The function [initenv] does not invoke the lexer, so the [run] function for an initial state must do it. (Except in the very special case where the initial state has a default reduction on [#] -- this means the grammar recognizes only the empty word. We have ruled out this case.) 
*) let gettoken s defred e = match Lr1.incoming_symbol s, defred with | (Some (Symbol.T _) | None), Some (_, toks) when TerminalSet.mem Terminal.sharp toks -> assert (TerminalSet.cardinal toks = 1); (* There is a default reduction on token [#]. We cannot request the next token, since that might drive the lexer off the end of the input stream, so we cannot call [discard]. Do nothing. *) e | (Some (Symbol.T _) | None), Some _ -> (* There is some other default reduction. Discard the first input token. *) blet ([ PVar env, EApp (EVar discard, [ EVar env ]) (* Note that we do not read [env.token]. *) ], e) | (Some (Symbol.T _) | None), None -> (* There is no default reduction. Discard the first input token and peek at the next one. *) blet ([ PVar env, EApp (EVar discard, [ EVar env ]); PVar token, ERecordAccess (EVar env, ftoken) ], e) | Some (Symbol.N _), Some _ -> (* There is some default reduction. Do not peek at the input token. *) e | Some (Symbol.N _), None -> (* There is no default reduction. Peek at the first input token, without taking it off the input stream. This is normally done by reading [env.token], unless the token might be [error]: then, we check [env.error] first. *) if Invariant.errorpeeker s then begin incr errorpeekers; EIfThenElse ( ERecordAccess (EVar env, ferror), tracecomment "Resuming error handling" (call_error_via_errorcase magic s), blet ([ PVar token, ERecordAccess (EVar env, ftoken) ], e) ) end else blet ([ assertnoerror; PVar token, ERecordAccess (EVar env, ftoken) ], e) (* This produces the header of a [run] function. *) let runheader s body = let body = tracecomment (Printf.sprintf "State %d:" (Lr1.number s)) body in { valpublic = false; valpat = PVar (run s); valval = EAnnot (EFun (runparams nomagic pvar s, body), runtypescheme s) } (* This produces the comment attached with a default reduction. 
*) let defaultreductioncomment toks e = EPatComment ( "Reducing without looking ahead at ", tokspat toks, e ) (* This produces some bookkeeping code that is used when initiating error handling. We set the flag [env.error]. By convention, the field [env.token] becomes meaningless and one considers that the first token on the input stream is [error]. As a result, the next peek at the lookahead token will cause error handling to be resumed. The next call to [discard] will take the [error] token off the input stream and clear [env.error]. *) (* It seems convenient for [env.error] to be a mutable field, as this allows us to generate compact code. Re-allocating the whole record would produce less compact code. And speed is not an issue in this error-handling code. *) let errorbookkeeping e = tracecomment "Initiating error handling" (blet ( [ PUnit, ERecordWrite (EVar env, ferror, etrue) ], e )) (* This code is used to indicate that a new error has been detected in state [s]. If I am correct, [env.error] is never set here. Indeed, that would mean that we first found an error, and then signaled another error before being able to shift the first error token. My understanding is that this cannot happen: when the first error is signaled, we end up at a state that is willing to handle the error token, by a series of reductions followed by a shift. We initiate error handling by first performing the standard bookkeeping described above, then transferring control to the [error] function associated with [s]. *) let initiate s = blet ( [ assertnoerror ], errorbookkeeping (call_error_via_errorcase magic s) ) (* This produces the body of the [run] function for state [s]. *) let rundef s : valdef = match Default.has_default_reduction s with | Some (prod, toks) as defred -> (* Perform reduction without looking ahead. If shiftreduce optimization is being performed, then no stack cell is allocated. The contents of the top stack cell are passed to [reduce] as extra parameters. 
*) runheader s ( runpushcellunless (shiftreduce prod) s ( gettoken s defred ( defaultreductioncomment toks ( call_reduce prod s ) ) ) ) | None -> (* If this state is willing to act on the error token, ignore that -- this is taken care of elsewhere. *) let transitions = SymbolMap.remove (Symbol.T Terminal.error) (Lr1.transitions s) and reductions = TerminalMap.remove Terminal.error (Lr1.reductions s) in (* Construct the main case analysis that determines what action should be taken next. A default branch, where an error is detected, is added if the analysis is not exhaustive. In the default branch, we initiate error handling. *) let covered, branches = ProductionMap.fold (fun prod toks (covered, branches) -> (* There is a reduction for these tokens. *) TerminalSet.union toks covered, reducebranch toks prod s :: branches ) (Lr0.invert reductions) (TerminalSet.empty, []) in let covered, branches = SymbolMap.fold (fun symbol s' (covered, branches) -> match symbol with | Symbol.T tok -> (* There is a shift transition for this token. *) TerminalSet.add tok covered, shiftbranch s tok s' :: branches | Symbol.N _ -> covered, branches ) transitions (covered, branches) in let branches = if TerminalSet.subset TerminalSet.universe covered then branches else branches @ [ { branchpat = PWildcard; branchbody = initiate s } ] in (* Finally, construct the code for [run]. This code pushes things onto the stack, obtains the lookahead token, then performs the main case analysis on the lookahead token. *) runheader s ( runpushcell s ( gettoken s None ( EMatch ( EVar token, branches ) ) ) ) (* This is the body of the [reduce] function associated with production [prod]. *) let reducebody prod = (* Find out about the left-hand side of this production and about the identifiers that have been bound to the symbols in the right-hand side. These represent variables that we should bind to semantic values before invoking the semantic action. 
*) let nt, rhs = Production.def prod and ids = Production.identifiers prod and length = Production.length prod in (* Build a pattern that represents the shape of the stack. Out of the stack, we extract a state (except when the production is an epsilon production) and a number of semantic values. If shiftreduce optimization is being performed, then the top stack cell is not explicitly allocated, so we do not include it in the pattern that is built. *) let (_ : int), pat = Invariant.fold (fun (i, pat) holds_state symbol _ -> i + 1, if i = length - 1 && shiftreduce prod then pat else ptuple (pat :: reducecellparams prod i holds_state symbol) ) (0, PVar stack) (Invariant.prodstack prod) in (* If any identifiers refer to terminal symbols without a semantic value, then bind these identifiers to the unit value. This provides the illusion that every symbol, terminal or nonterminal, has a semantic value. This is more regular and allows applying operators such as ? to terminal symbols without a semantic value. *) let unitbindings = Misc.foldi length (fun i unitbindings -> match semvtype rhs.(i) with | [] -> (PVar ids.(i), EUnit) :: unitbindings | _ -> unitbindings ) [] in (* If necessary, determine start and end positions for the left-hand side of the production. If the right-hand side is nonempty, this is done by extracting position information out of the first and last symbols of the right-hand side. If it is empty, then (as of 2015/11/04) this is done by taking the end position stored in the top stack cell (whatever it is). The constraints imposed by the module [Invariant], the layout of cells, and our creation of a sentinel cell (see [entrydef] further on), ensure that this cell exists and has an [endp] field at offset 1. Yes, we live dangerously. You only live once. *) let extract x = (* Extract the end position (i.e., the field at offset 1) in the top stack cell and bind it to the variable [x]. 
*) PTuple [ PWildcard; PVar x ], EMagic (EVar stack) in let symbol = Symbol.N nt in let posbindings action = let bind_startp = Invariant.startp symbol in elementif (Action.has_beforeend action) ( extract beforeendp ) @ elementif bind_startp ( if length > 0 then PVar startp, EVar (Printf.sprintf "_startpos_%s_" ids.(0)) else extract startp ) @ elementif (Invariant.endp symbol) ( if length > 0 then PVar endp, EVar (Printf.sprintf "_endpos_%s_" ids.(length - 1)) else if bind_startp then PVar endp, EVar startp else extract endp ) in (* If this production is one of the start productions, then reducing it means accepting the input. In that case, we return a final semantic value and stop. Otherwise, we transfer control to the [goto] function, unless the semantic action raises [Error], in which case we transfer control to [errorcase]. *) if Production.is_start prod then tracecomment "Accepting" (blet ( [ pat, EVar stack ], EMagic (EVar ids.(0)) )) else let action = Production.action prod in let act = EAnnot (Action.to_il_expr action, type2scheme (semvtypent nt)) in tracecomment (Printf.sprintf "Reducing production %s" (Production.print prod)) (blet ( (pat, EVar stack) :: unitbindings @ posbindings action, (* If the semantic action is susceptible of raising [Error], use a [let/unless] construct, otherwise use [let]. *) if Action.has_syntaxerror action then letunless act semv (call_goto nt) (errorbookkeeping call_errorcase) else blet ([ PVar semv, act ], call_goto nt) )) (* This is the definition of the [reduce] function associated with production [prod]. *) let reducedef prod = { valpublic = false; valpat = PVar (reduce prod); valval = EAnnot ( EFun ( reduceparams prod, reducebody prod ), reducetypescheme prod ) } (* This generates code for pushing a new stack cell inside [goto]. 
*) let gotopushcell nt e = if gotopushes nt then let contents = var stack :: Invariant.fold_top (runcellparams var) [] (Invariant.gotostack nt) in mlet [ pvar stack ] [ etuple contents ] e else e (* This is the heart of the [goto] function associated with nonterminal [nt]. *) let gotobody nt = (* Examine the current state to determine where to go next. *) let branches = Lr1.targets (fun branches sources target -> { branchpat = pstatescon sources; branchbody = call_run target (runparams magic var target) } :: branches ) [] (Symbol.N nt) in match branches with | [] -> (* If there are no branches, then this [goto] function is never invoked. The inliner will drop it, so whatever we generate here is unimportant. *) call_assertfalse | [ branch ] -> (* If there is only one branch, no case analysis is required. This optimization is not strictly necessary if GADTs are used by the compiler to prove that the case analysis is exhaustive. It does improve readability, though, and is also useful if the compiler does not have GADTs. *) EPatComment ( "State should be ", branch.branchpat, branch.branchbody ) | _ -> (* In the general case, we keep the branches computed above and, unless [nt] is universal, add a default branch, which is theoretically useless but helps avoid warnings if the compiler does not have GADTs. *) let default = { branchpat = PWildcard; branchbody = call_assertfalse } in EMatch ( EVar state, branches @ (if Invariant.universal (Symbol.N nt) then [] else [ default ]) ) (* This is the [goto] function associated with nonterminal [nt]. *) let gotodef nt = { valpublic = false; valpat = PVar (goto nt); valval = EAnnot (EFun (gotoparams pvar nt, gotopushcell nt (gotobody nt)), gototypescheme nt) } (* ------------------------------------------------------------------------ *) (* Code production for the error handling functions. *) (* This is the body of the [error] function associated with state [s]. 
*) let handle s e = tracecomment (Printf.sprintf "Handling error in state %d" (Lr1.number s)) e let errorbody s = try let s' = SymbolMap.find (Symbol.T Terminal.error) (Lr1.transitions s) in (* There is a shift transition on error. *) handle s ( shiftbranchbody s Terminal.error s' ) with Not_found -> try let prods = TerminalMap.lookup Terminal.error (Lr1.reductions s) in let prod = Misc.single prods in (* There is a reduce transition on error. If shiftreduce optimization is enabled for this production, then we must pop an extra cell for [reduce]'s calling convention to be met. *) let extrapop e = if shiftreduce prod then let pat = ptuple (PVar stack :: Invariant.fold_top (runcellparams pvar) [] (Invariant.stack s)) in blet ([ pat, EVar stack ], e) else e in handle s ( extrapop ( call_reduce prod s ) ) with Not_found -> (* This state is unable to handle errors. Pop the stack to find a state that does handle errors, a state that can further pop the stack, or die. *) match Invariant.rewind s with | Invariant.Die -> can_die := true; ERaise errorval | Invariant.DownTo (w, st) -> let _, pat = Invariant.fold errorcellparams (0, PVar stack) w in blet ( [ pat, EVar stack ], match st with | Invariant.Represented -> call_errorcase | Invariant.UnRepresented s -> call_error magic s ) (* This is the [error] function associated with state [s]. *) let errordef s = { valpublic = false; valpat = PVar (error s); valval = EAnnot ( EFun ( errorparams nomagic pvar, errorbody s ), errortypescheme s ) } (* This is the [errorcase] function. It examines its state parameter and dispatches control to an appropriate [error] function. 
*) let errorcasedef = let branches = Lr1.fold (fun branches s -> if Invariant.represented s then { branchpat = pstatecon s; branchbody = EApp (EVar (error s), [ EVar env; EMagic (EVar stack) ]) } :: branches else branches ) [] in { valpublic = false; valpat = PVar errorcase; valval = EAnnot ( EFun ( errorcaseparams nomagic pvar, EMatch ( EVar state, branches ) ), errorcasetypescheme ) } (* ------------------------------------------------------------------------ *) (* Code production for the entry points. *) (* This is the entry point associated with a start state [s]. By convention, it is named after the nonterminal [nt] that corresponds to this state. This is a public definition. The code initializes a parser environment, an empty stack, and invokes [run]. 2015/11/11. If the state [s] can reduce an epsilon production whose left-hand symbol keeps track of its start or end position, or if [s] can reduce any production that mentions [$endpos($0)], then the initial stack should contain a sentinel cell with a valid [endp] field at offset 1. For simplicity, we always create a sentinel cell. *) let entrydef s = let nt = Item.startnt (Lr1.start2item s) in let lexer = "lexer" and lexbuf = "lexbuf" in let initial_stack = let initial_position = getendp in etuple [ EUnit; initial_position ] in { valpublic = true; valpat = PVar (Nonterminal.print true nt); valval = EAnnot ( EFun ( [ PVar lexer; PVar lexbuf ], blet ( [ PVar env, EApp (EVar initenv, [ EVar lexer; EVar lexbuf ]) ], EMagic (EApp (EVar (run s), [ EVar env; initial_stack ])) ) ), entrytypescheme Front.grammar (Nonterminal.print true nt) ) } (* ------------------------------------------------------------------------ *) (* Code production for auxiliary functions. *) (* This is [assertfalse], used when internal failure is detected. This should never happen if our tool is correct. 
*) let assertfalsedef = { valpublic = false; valpat = PVar assertfalse; valval = EAnnot ( EFun ([ PUnit ], blet ([ PUnit, EApp (EVar "Printf.fprintf", [ EVar "stderr"; EStringConst "Internal failure -- please contact the parser generator's developers.\n%!" ]); ], EApp (EVar "assert", [ efalse ]) ) ), scheme [ "a" ] (arrow tunit (tvar "a")) ) } (* This is [print_token], used to print tokens in [--trace] mode. *) let printtokendef = destructuretokendef print_token tstring false (fun tok -> EStringConst (Terminal.print tok)) (* This is [discard], used to take a token off the input stream and query the lexer for a new one. The code queries the lexer for a new token and stores it into [env.token], overwriting the previous token. It also stores the start and end positions of the new token. Last, [env.error] is cleared. We use the lexer's [lex_start_p] and [lex_curr_p] fields to extract the start and end positions of the token that we just read. In practice, it seems that [lex_start_p] can be inaccurate (that is the case when the lexer calls itself recursively, instead of simply recognizing an atomic pattern and returning immediately). However, we are 100% compatible with ocamlyacc here, and there is no better solution anyway. As of 2014/12/12, we re-allocate the environment record instead of updating it. Perhaps surprisingly, this makes the code TWICE FASTER overall. The write barrier is really costly!
*) let discardbody = let lexer = "lexer" and lexbuf = "lexbuf" in EFun ( [ PVar env ], blet ([ PVar lexer, ERecordAccess (EVar env, flexer); PVar lexbuf, ERecordAccess (EVar env, flexbuf); PVar token, EApp (EVar lexer, [ EVar lexbuf ]); ] @ trace "Lookahead token is now %s (%d-%d)" [ EApp (EVar print_token, [ EVar token ]); ERecordAccess (ERecordAccess (EVar lexbuf, "Lexing.lex_start_p"), "Lexing.pos_cnum"); ERecordAccess (ERecordAccess (EVar lexbuf, "Lexing.lex_curr_p"), "Lexing.pos_cnum") ], ERecord [ flexer, EVar lexer; flexbuf, EVar lexbuf; ftoken, EVar token; ferror, efalse ] ) ) let discarddef = { valpublic = false; valpat = PVar discard; valval = EAnnot ( discardbody, type2scheme (arrow tenv tenv) ) } (* This is [initenv], used to allocate a fresh parser environment. It fills in all fields in a straightforward way. The [token] field receives a dummy value. It will be overwritten by the first call to [run], which will invoke [discard]. This allows us to invoke the lexer in just one place. *) let initenvdef = let lexer = "lexer" and lexbuf = "lexbuf" in { valpublic = false; valpat = PVar initenv; valval = EAnnot ( EFun ( [ PVar lexer; PVar lexbuf ], blet ( (* We do not have a dummy token at hand, so we forge one. *) (* It will be overwritten by the first call to the lexer. *) [ PVar token, EMagic EUnit ], ERecord ([ (flexer, EVar lexer); (flexbuf, EVar lexbuf); (ftoken, EVar token); (ferror, efalse) ] ) ) ), type2scheme (marrow [ tlexer; tlexbuf ] tenv) ) } (* ------------------------------------------------------------------------ *) (* Here is complete code for the parser. 
*) open BasicSyntax let grammar = Front.grammar let program = [ SIFunctor (grammar.parameters, mbasics grammar @ SITypeDefs [ envtypedef; statetypedef ] :: SIStretch grammar.preludes :: SIValDefs (true, ProductionMap.fold (fun _ s defs -> entrydef s :: defs ) Lr1.entry ( Lr1.fold (fun defs s -> rundef s :: errordef s :: defs ) ( Nonterminal.foldx (fun nt defs -> gotodef nt :: defs ) (Production.fold (fun prod defs -> if Lr1.NodeSet.is_empty (Lr1.production_where prod) then defs else reducedef prod :: defs ) [ discarddef; initenvdef; printtokendef; assertfalsedef; errorcasedef ]))) ) :: SIStretch grammar.postludes :: [])] (* ------------------------------------------------------------------------ *) (* We are done! *) let () = Error.logC 1 (fun f -> Printf.fprintf f "%d out of %d states can peek at an error.\n" !errorpeekers Lr1.n) let () = if not !can_die then Error.logC 1 (fun f -> Printf.fprintf f "The generated parser cannot raise Error.\n") let () = Time.tick "Producing abstract syntax" end menhir-20200123/src/codeBackend.mli000066400000000000000000000020701361226111300166760ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The (code-based) code generator. 
*) module Run (T : sig end) : sig val program: IL.program end menhir-20200123/src/codeBits.ml000066400000000000000000000147261361226111300161120ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides a number of tiny functions that help produce [IL] code. *) open IL (* Tuples. *) let etuple = function | [] -> EUnit | [ e ] -> e | es -> ETuple es let ptuple = function | [] -> PUnit | [ p ] -> p | ps -> PTuple ps (* A list subject to a condition. *) let listif condition xs = if condition then xs else [] let elementif condition x = if condition then [ x ] else [] let listiflazy condition xs = if condition then xs() else [] (* The unit type. *) let tunit = TypApp ("unit", []) (* The Boolean type. *) let tbool = TypApp ("bool", []) (* The integer type. *) let tint = TypApp ("int", []) (* The string type. *) let tstring = TypApp ("string", []) (* The exception type. *) let texn = TypApp ("exn", []) (* The type of pairs. *) let tpair typ1 typ2 = TypTuple [typ1; typ2] (* The type of lexer positions. *) let tposition = TypApp ("Lexing.position", []) (* The type of the $loc and $sloc keywords. *) (* A location is a pair of positions. This might change in the future. *) let tlocation = tpair tposition tposition (* The type of lexer buffers. *) let tlexbuf = TypApp ("Lexing.lexbuf", []) (* The type of untyped semantic values. *) let tobj = TypApp ("Obj.t", []) (* Building a type variable. *) let tvar x : typ = TypVar x (* Building a type scheme. 
*) let scheme qs t = { quantifiers = qs; body = t } (* Building a type scheme with no quantifiers out of a type. *) let type2scheme t = scheme [] t let pat2var = function | PVar x -> x | _ -> assert false (* [simplify] removes bindings of the form [let v = v in ...] and [let _ = v in ...]. *) let rec simplify = function | [] -> [] | (PVar v1, EVar v2) :: bindings when v1 = v2 -> (* Avoid a useless let binding. *) simplify bindings | (PWildcard, EVar _) :: bindings -> (* Avoid a useless let binding. *) simplify bindings | binding :: bindings -> binding :: simplify bindings (* Building a [let] construct, with on-the-fly simplification. *) let blet (bindings, body) = let bindings = simplify bindings in match bindings, body with | [], _ -> body | [ PVar x1, e ], EVar x2 when x1 = x2 -> (* Reduce [let x = e in x] to just [e]. *) e | _, _ -> ELet (bindings, body) let mlet formals actuals body = blet (List.combine formals actuals, body) (* Simulating a [let/and] construct using tuples. *) let eletand (bindings, body) = let bindings = simplify bindings in match bindings, body with | [], _ -> (* special case: zero bindings *) body | [ PVar x1, e ], EVar x2 when x1 = x2 -> (* Reduce [let x = e in x] to just [e]. *) e | [ _ ], _ -> (* special case: one binding *) ELet (bindings, body) | _ :: _ :: _, _ -> (* general case: at least two bindings *) let pats, exprs = List.split bindings in ELet ([ PTuple pats, ETuple exprs ], body) (* [eraisenotfound] is an expression that raises [Not_found]. *) let eraisenotfound = ERaise (EData ("Not_found", [])) (* [bottom] is an expression that has every type. Its semantics is irrelevant. *) let bottom = eraisenotfound (* Boolean constants. *) let efalse : expr = EData ("false", []) let etrue : expr = EData ("true", []) let eboolconst b = if b then etrue else efalse (* Option constructors. *) let enone = EData ("None", []) let esome e = EData ("Some", [e]) (* List constructors. 
*) let rec elist xs = match xs with | [] -> EData ("[]", []) | x :: xs -> EData ("::", [ x; elist xs ]) (* Integer constants as patterns. *) let pint k : pattern = PData (string_of_int k, []) (* These help build function types. *) let arrow typ body : typ = TypArrow (typ, body) let arrowif flag typ body : typ = if flag then arrow typ body else body let marrow typs body : typ = List.fold_right arrow typs body (* ------------------------------------------------------------------------ *) (* Here is a bunch of naming conventions. Our names are chosen to minimize the likelihood that a name in a semantic action is captured. In other words, all global definitions as well as the parameters to [reduce] are given far-fetched names, unless [--no-prefix] was specified. Note that the prefix must begin with '_'. This allows avoiding warnings about unused variables with ocaml 3.09 and later. *) let prefix name = if Settings.noprefix then name else "_menhir_" ^ name let dataprefix name = if Settings.noprefix then name else "Menhir" ^ name let tvprefix name = if Settings.noprefix then name else "ttv_" ^ name (* ------------------------------------------------------------------------ *) (* Converting an interface to a structure. Only exception and type definitions go through. *) let interface_item_to_structure_item = function | IIExcDecls defs -> [ SIExcDefs defs ] | IITypeDecls defs -> [ SITypeDefs defs ] | IIFunctor (_, _) | IIValDecls _ | IIInclude _ | IIModule (_, _) | IIComment _ -> [] let interface_to_structure i = List.flatten (List.map interface_item_to_structure_item i) (* Constructing a named module type together with a list of "with type" constraints. 
*) let with_types wk name tys = List.fold_left (fun mt (params, name, ty) -> MTWithType (mt, params, name, wk, ty) ) (MTNamedModuleType name) tys let mapp me1 me2 = MApp (me1, me2) let mapp me1 mes2 = List.fold_left mapp me1 mes2 menhir-20200123/src/codeBits.mli000066400000000000000000000066071361226111300162620ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides a number of tiny functions that help produce [IL] code. *) open IL (* Tuples. *) val etuple: expr list -> expr val ptuple: pattern list -> pattern (* A list subject to a condition. (Be careful, though: the list is of course constructed even if the condition is false.) *) val listif: bool -> 'a list -> 'a list val elementif: bool -> 'a -> 'a list (* A lazy version of [listif], where the list is constructed only if the condition is true. *) val listiflazy: bool -> (unit -> 'a list) -> 'a list (* Standard types. *) val tunit: typ val tbool: typ val tint: typ val tstring: typ val texn: typ val tposition: typ val tlocation: typ val tlexbuf: typ val tobj : typ (* Building a type variable. *) val tvar: string -> typ (* Building a type scheme. *) val scheme: string list -> typ -> typescheme val type2scheme: typ -> typescheme (* Projecting out of a [PVar] pattern. *) val pat2var: pattern -> string (* Building a [let] construct, with on-the-fly simplification. These two functions construct a nested sequence of [let] definitions. 
*) val blet: (pattern * expr) list * expr -> expr val mlet: pattern list -> expr list -> expr -> expr (* Simulating a [let/and] construct. *) val eletand: (pattern * expr) list * expr -> expr (* [eraisenotfound] is an expression that raises [Not_found]. *) val eraisenotfound: expr (* [bottom] is an expression that has every type. Its semantics is irrelevant. *) val bottom: expr (* Boolean constants. *) val etrue: expr val efalse: expr val eboolconst: bool -> expr (* Option constructors. *) val enone: expr val esome: expr -> expr (* List constructors. *) val elist: expr list -> expr (* Integer constants as patterns. *) val pint: int -> pattern (* These help build function types. *) val arrow: typ -> typ -> typ val arrowif: bool -> typ -> typ -> typ val marrow: typ list -> typ -> typ (* These functions are used to generate names in menhir's namespace. *) val prefix: string -> string val dataprefix: string -> string val tvprefix: string -> string (* Converting an interface to a structure. Only exception and type definitions go through. *) val interface_to_structure: interface -> structure (* Constructing a named module type together with a list of "with type" constraints. *) val with_types: IL.with_kind -> string -> (string list * string * IL.typ) list -> IL.module_type (* Functor applications. *) val mapp: IL.modexpr -> IL.modexpr list -> IL.modexpr menhir-20200123/src/codeInliner.ml000066400000000000000000000230471361226111300166050ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open IL open CodeBits (* In the following, we only inline global functions. In order to avoid unintended capture, as we traverse terms, we keep track of local identifiers that hide global ones. The following little class helps do that. (The pathological case where a local binding hides a global one probably does not arise very often. Fortunately, checking against it in this way is quite cheap, and lets me sleep safely.) *) class locals table = object method pvar (locals : StringSet.t) (id : string) = if Hashtbl.mem table id then StringSet.add id locals else locals end (* Here is the inliner. *) let inline_valdefs (defs : valdef list) : valdef list = (* Create a table of all global definitions. *) let before, table = Traverse.tabulate_defs defs in (* Prepare to count how many times each function is used, including inside its own definition. The public functions serve as starting points for this discovery phase. *) let queue : valdef Queue.t = Queue.create() and usage : int StringMap.t ref = ref StringMap.empty in (* [visit] is called at every identifier occurrence. *) let visit locals id = if StringSet.mem id locals then (* This is a local identifier. Do nothing. *) () else try let _, def = Hashtbl.find table id in (* This is a globally defined identifier. Increment its usage count. If it was never visited, enqueue its definition for exploration. *) let n = try StringMap.find id !usage with Not_found -> Queue.add def queue; 0 in usage := StringMap.add id (n + 1) !usage with Not_found -> (* This identifier is not global. It is either local or a reference to some external library, e.g. ocaml's standard library. *) () in (* Look for occurrences of identifiers inside expressions. *) let o = object inherit [ StringSet.t, unit ] Traverse.fold inherit locals table method! 
evar locals () id = visit locals id end in (* Initialize the queue with all public definitions, and work from there. We assume that the left-hand side of every definition is a variable. *) List.iter (fun { valpublic = public; valpat = p } -> if public then visit StringSet.empty (pat2var p) ) defs; Misc.qfold (o#valdef StringSet.empty) () queue; let usage = !usage in (* Now, inline every function that is called at most once. At the same time, every function that is never called is dropped. The public functions again serve as starting points for the traversal. *) let queue : valdef Queue.t = Queue.create() and emitted = ref StringSet.empty in let enqueue def = let id = pat2var def.valpat in if not (StringSet.mem id !emitted) then begin emitted := StringSet.add id !emitted; Queue.add def queue end in (* A simple application is an application of a variable to a number of variables, constants, or record accesses out of variables. *) let rec is_simple_arg = function | EVar _ | EData (_, []) | ERecordAccess (EVar _, _) -> true | EMagic e -> is_simple_arg e | _ -> false in let is_simple_app = function | EApp (EVar _, actuals) -> List.for_all is_simple_arg actuals | _ -> false in (* Taking a fresh instance of a type scheme. Ugly. *) let instance = let count = ref 0 in let fresh tv = incr count; tv, Printf.sprintf "freshtv%d" !count in fun scheme -> let mapping = List.map fresh scheme.quantifiers in let rec sub typ = match typ with | TypTextual _ -> typ | TypVar v -> begin try TypVar (List.assoc v mapping) with Not_found -> typ end | TypApp (f, typs) -> TypApp (f, List.map sub typs) | TypTuple typs -> TypTuple (List.map sub typs) | TypArrow (typ1, typ2) -> TypArrow (sub typ1, sub typ2) in sub scheme.body in (* Destructuring a type annotation. 
*) let rec annotate formals body typ = match formals, typ with | [], _ -> [], EAnnot (body, type2scheme typ) | formal :: formals, TypArrow (targ, tres) -> let formals, body = annotate formals body tres in PAnnot (formal, targ) :: formals, body | _ :: _, _ -> (* Type annotation has insufficient arity. *) assert false in (* The heart of the inliner: rewriting a function call to a [let] expression. If there was a type annotation at the function definition site, it is dropped, provided the semantic actions have been type-checked. Otherwise, it is kept, because, due to the presence of [EMagic] expressions in the code, dropping a type annotation could cause an ill-typed program to become apparently well-typed. Keeping a type annotation requires taking a fresh instance of the type scheme, because OCaml doesn't have support for locally and existentially bound type variables. Yuck. *) let inline formals actuals body oscheme = assert (List.length actuals = List.length formals); match oscheme with | Some scheme when not Front.ocaml_types_have_been_checked -> let formals, body = annotate formals body (instance scheme) in mlet formals actuals body | _ -> mlet formals actuals body in (* Look for occurrences of identifiers inside expressions, branches, etc. and replace them with their definitions if they have only one use site or if their definitions are sufficiently simple. *) let o = object (self) inherit [ StringSet.t ] Traverse.map as super inherit locals table method! eapp locals e actuals = match e with | EVar id when (Hashtbl.mem table id) && (* a global identifier *) (not (StringSet.mem id locals)) (* not hidden by a local identifier *) -> let _, def = Hashtbl.find table id in (* cannot fail, thanks to the above check *) let formals, body, oscheme = match def with | { valval = EFun (formals, body) } -> formals, body, None | { valval = EAnnot (EFun (formals, body), scheme) } -> formals, body, Some scheme | { valval = _ } -> (* The definition is not a function definition. 
This should not happen in the kind of code that we generate. *) assert false in assert (StringMap.mem id usage); if StringMap.find id usage = 1 || is_simple_app body then (* The definition can be inlined, with beta reduction. *) inline formals (self#exprs locals actuals) (EComment (id, self#expr locals body)) oscheme else begin (* The definition cannot be inlined. *) enqueue def; super#eapp locals e actuals end | _ -> (* The thing in function position is not a reference to a global. *) super#eapp locals e actuals end in (* Initialize the queue with all public definitions, and work from there. *) List.iter (function { valpublic = public } as def -> if public then enqueue def ) defs; let valdefs = Misc.qfold (fun defs def -> o#valdef StringSet.empty def :: defs ) [] queue in Error.logC 1 (fun f -> Printf.fprintf f "%d functions before inlining, %d functions after inlining.\n" before (List.length valdefs)); Time.tick "Inlining"; valdefs (* Dumb recursive traversal. *) let rec inline_structure_item item = match item with | SIValDefs (true, defs) -> (* A nest of recursive definitions. Act on it. *) SIValDefs (true, inline_valdefs defs) | SIFunctor (params, s) -> SIFunctor (params, inline_structure s) | SIModuleDef (name, e) -> SIModuleDef (name, inline_modexpr e) | SIInclude e -> SIInclude (inline_modexpr e) | SIExcDefs _ | SITypeDefs _ | SIValDefs (false, _) | SIStretch _ | SIComment _ -> item and inline_structure s = List.map inline_structure_item s and inline_modexpr = function | MVar x -> MVar x | MStruct s -> MStruct (inline_structure s) | MApp (e1, e2) -> MApp (inline_modexpr e1, inline_modexpr e2) (* The external entry point. 
*) let inline (p : program) : program = if Settings.code_inlining then inline_structure p else p menhir-20200123/src/codeInliner.mli000066400000000000000000000024161361226111300167530ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This transformer inlines every function that is called at most once. It also inlines some functions whose body consists of a single function call. At the same time, every function that is never called is dropped. Public functions are never inlined or dropped. *) val inline: IL.program -> IL.program menhir-20200123/src/codePieces.ml000066400000000000000000000162551361226111300164200ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module defines many internal naming conventions for use by the two code generators, [CodeBackend] and [TableBackend]. It also offers a few code generation facilities. *) open IL open CodeBits open Grammar (* ------------------------------------------------------------------------ *) (* Naming conventions. *) (* The type variable associated with a nonterminal [nt]. 
*) let ntvar nt = Infer.ntvar (Nonterminal.print true nt) (* The variable that holds the environment. This is a parameter to all functions. We do not make it a global variable because we wish to preserve re-entrancy. *) let env = prefix "env" (* A variable used to hold a semantic value. *) let semv = "_v" (* A variable used to hold a stack. *) let stack = prefix "stack" (* A variable used to hold a state. *) let state = prefix "s" (* A variable used to hold a token. *) let token = "_tok" (* Variables used to hold start and end positions. Do not change these names! They are chosen to coincide with the $startpos and $endpos keywords, which the lexer rewrites to _startpos and _endpos, so that binding these variables before executing a semantic action is meaningful. *) (* These names should agree with the printing function [Keyword.posvar]. *) let beforeendp = "_endpos__0_" let startp = "_startpos" let endp = "_endpos" (* ------------------------------------------------------------------------ *) (* Types for semantic values. *) (* [semvtypent nt] is the type of the semantic value associated with nonterminal [nt]. *) let semvtypent nt = match Nonterminal.ocamltype nt with | None -> (* [nt] has unknown type. If we have run [Infer], then this can't happen. However, running type inference is only an option, so we still have to deal with that case. *) TypVar (ntvar nt) | Some ocamltype -> (* [nt] has known type. *) TypTextual ocamltype (* [semvtypetok tok] is the type of the semantic value associated with token [tok]. There is no such type if the token does not have a semantic value. *) let semvtypetok tok = match Terminal.ocamltype tok with | None -> (* Token has unit type and is omitted in stack cell. *) [] | Some ocamltype -> (* Token has known type. *) [ TypTextual ocamltype ] (* [semvtype symbol] is the type of the semantic value associated with [symbol]. 
*) let semvtype = function | Symbol.T tok -> semvtypetok tok | Symbol.N nt -> [ semvtypent nt ] (* [symvalt] returns the empty list if the symbol at hand carries no semantic value and the singleton list [[f t]] if it carries a semantic value of type [t]. *) let symvalt symbol f = match semvtype symbol with | [] -> [] | [ t ] -> [ f t ] | _ -> assert false (* [symval symbol x] returns either the empty list or the singleton list [[x]], depending on whether [symbol] carries a semantic value. *) let symval symbol x = match semvtype symbol with | [] -> [] | [ _t ] -> [ x ] | _ -> assert false (* [tokval] is a version of [symval], specialized for terminal symbols. *) let tokval tok x = symval (Symbol.T tok) x (* ------------------------------------------------------------------------ *) (* Patterns for tokens. *) (* [tokpat tok] is a pattern that matches the token [tok], without binding its semantic value. *) let tokpat tok = PData (TokenType.tokendata (Terminal.print tok), tokval tok PWildcard) (* [tokpatv tok] is a pattern that matches the token [tok], and binds its semantic value, if it has one, to the variable [semv]. *) let tokpatv tok = PData (TokenType.tokendata (Terminal.print tok), tokval tok (PVar semv)) (* [tokspat toks] is a pattern that matches any token in the set [toks], without binding its semantic value. *) let tokspat toks = POr ( TerminalSet.fold (fun tok pats -> tokpat tok :: pats ) toks [] ) (* [destructuretokendef name codomain bindsemv branch] generates the definition of a function that destructures tokens. [name] is the name of the function that is generated. [codomain] is its return type. [bindsemv] tells whether the variable [semv] should be bound. [branch] is applied to each (non-pseudo) terminal and must produce code for each branch. 
*) let destructuretokendef name codomain bindsemv branch = { valpublic = false; valpat = PVar name; valval = EAnnot ( EFun ([ PVar token ], EMatch (EVar token, Terminal.fold (fun tok branches -> if Terminal.pseudo tok then branches else { branchpat = (if bindsemv then tokpatv else tokpat) tok; branchbody = branch tok } :: branches ) [] ) ), type2scheme (arrow TokenType.ttoken codomain) ) } (* ------------------------------------------------------------------------ *) (* A global variable holds the exception [Error]. *) (* We preallocate the [Error] exception and store it into a global variable. This allows saving code at the sites where the exception is raised. Don't change the conventional name [_eRR], it is shared with the lexer, which replaces occurrences of the [$syntaxerror] keyword with [(raise _eRR)]. *) let parse_error = "_eRR" let errorval = EVar parse_error let basics = "MenhirBasics" (* 2017/01/20 The name [basics] must be an unlikely name, as it might otherwise hide a user-defined module by the same name. *) let excvaldef = { valpublic = false; valpat = PVar parse_error; valval = EData (basics ^ "." ^ Interface.excname, []) (* 2016/06/23 We now use the qualified name [Basics.Error], instead of just [Error], so as to avoid OCaml's warning 41. *) } (* ------------------------------------------------------------------------ *) (* Define the internal sub-module [Basics], which contains the definitions of the exception [Error] and of the type [token]. Then, include this sub-module. This is used both in the code and table back-ends. 
*) let mbasics grammar = [ SIModuleDef (basics, MStruct ( SIExcDefs [ Interface.excdef ] :: interface_to_structure ( TokenType.tokentypedef grammar ) )); SIInclude (MVar basics); SIValDefs (false, [ excvaldef ]); ] menhir-20200123/src/codePieces.mli000066400000000000000000000111141361226111300165560ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module defines many internal naming conventions for use by the two code generators, [CodeBackend] and [TableBackend]. It also offers a few code generation facilities. *) open IL open Grammar (* ------------------------------------------------------------------------ *) (* Naming conventions. *) (* The type variable associated with a nonterminal [nt]. *) val ntvar : Nonterminal.t -> string (* The variable that holds the environment. This is a parameter to all functions. We do not make it a global variable because we wish to preserve re-entrancy. *) val env : string (* A variable used to hold a semantic value. *) val semv : string (* A variable used to hold a stack. *) val stack: string (* A variable used to hold a state. *) val state: string (* A variable used to hold a token. *) val token: string (* Variables used to hold start and end positions. *) val beforeendp: string val startp: string val endp: string (* ------------------------------------------------------------------------ *) (* Types for semantic values. *) (* [semvtypent nt] is the type of the semantic value associated with nonterminal [nt]. 
*) val semvtypent : Nonterminal.t -> typ (* [semvtypetok tok] is the type of the semantic value associated with token [tok]. There is no such type if the token does not have a semantic value. *) val semvtypetok : Terminal.t -> typ list (* [semvtype symbol] is the type of the semantic value associated with [symbol]. *) val semvtype : Symbol.t -> typ list (* [symvalt] returns the empty list if the symbol at hand carries no semantic value and the singleton list [[f t]] if it carries a semantic value of type [t]. *) val symvalt : Symbol.t -> (typ -> 'a) -> 'a list (* [symval symbol x] returns either the empty list or the singleton list [[x]], depending on whether [symbol] carries a semantic value. *) val symval : Symbol.t -> 'a -> 'a list (* [tokval] is a version of [symval], specialized for terminal symbols. *) val tokval : Terminal.t -> 'a -> 'a list (* ------------------------------------------------------------------------ *) (* Patterns for tokens. *) (* [tokpat tok] is a pattern that matches the token [tok], without binding its semantic value. *) val tokpat: Terminal.t -> pattern (* [tokpatv tok] is a pattern that matches the token [tok], and binds its semantic value, if it has one, to the variable [semv]. *) val tokpatv: Terminal.t -> pattern (* [tokspat toks] is a pattern that matches any token in the set [toks], without binding its semantic value. *) val tokspat: TerminalSet.t -> pattern (* [destructuretokendef name codomain bindsemv branch] generates the definition of a function that destructures tokens. [name] is the name of the function that is generated. [codomain] is its return type. [bindsemv] tells whether the variable [semv] should be bound. [branch] is applied to each (non-pseudo) terminal and must produce code for each branch. *) val destructuretokendef: string -> typ -> bool -> (Terminal.t -> expr) -> valdef (* ------------------------------------------------------------------------ *) (* A global variable holds the exception [Error].
*) (* A reference to this global variable. *) val errorval: expr (* ------------------------------------------------------------------------ *) (* The structure items [mbasics grammar] define and include the internal sub-module [Basics], which contains the definitions of the exception [Error] and of the type [token]. Then, they define the global variable mentioned above, which holds the exception [Error]. *) val basics: string val mbasics: BasicSyntax.grammar -> structure menhir-20200123/src/compressedBitSet.ml000066400000000000000000000133641361226111300176320ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A compressed (or should we say sparse?) bit set is a list of pairs of integers. The first component of every pair is an index, while the second component is a bit field. The list is sorted by order of increasing indices. *) type t = | N | C of int * int * t type element = int let word_size = Sys.word_size - 1 let empty = N let is_empty = function | N -> true | C _ -> false let add i s = let ioffset = i mod word_size in let iaddr = i - ioffset and imask = 1 lsl ioffset in let rec add = function | N -> (* Insert at end. *) C (iaddr, imask, N) | C (addr, ss, qs) as s -> if iaddr < addr then (* Insert in front. *) C (iaddr, imask, s) else if iaddr = addr then (* Found appropriate cell, update bit field. *) let ss' = ss lor imask in if ss' = ss then s else C (addr, ss', qs) else (* Not there yet, continue. 
*) let qs' = add qs in if qs == qs' then s else C (addr, ss, qs') in add s let singleton i = add i N let remove i s = let ioffset = i mod word_size in let iaddr = i - ioffset and imask = 1 lsl ioffset in let rec remove = function | N -> N | C (addr, ss, qs) as s -> if iaddr < addr then s else if iaddr = addr then (* Found appropriate cell, update bit field. *) let ss' = ss land (lnot imask) in if ss' = 0 then qs else if ss' = ss then s else C (addr, ss', qs) else (* Not there yet, continue. *) let qs' = remove qs in if qs == qs' then s else C (addr, ss, qs') in remove s let rec fold f s accu = match s with | N -> accu | C (base, ss, qs) -> loop f qs base ss accu and loop f qs i ss accu = if ss = 0 then fold f qs accu else (* One could in principle check whether [ss land 0x3] is zero and if so move to [i + 2] and [ss lsr 2], and similarly for various sizes. In practice, this does not seem to make a measurable difference. *) loop f qs (i + 1) (ss lsr 1) (if ss land 1 = 1 then f i accu else accu) let iter f s = fold (fun x () -> f x) s () let is_singleton s = match s with | C (_, ss, N) -> (* Test whether only one bit is set in [ss]. We do this by turning off the rightmost bit, then comparing to zero. 
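The address/mask arithmetic used by [add] and [remove], and the rightmost-bit trick behind [is_singleton], can be exercised in isolation. The helper names `decompose` and `at_most_one_bit` are illustrative, not part of this module:

```ocaml
(* The bit-level arithmetic of this module, in isolation. [decompose]
   splits an element [i] into the address of the word that contains it
   and a one-bit mask within that word, exactly as [add] and [remove]
   do; [at_most_one_bit] is the rightmost-bit trick used by
   [is_singleton]. *)
let word_size = Sys.word_size - 1

let decompose i =
  let ioffset = i mod word_size in
  (i - ioffset, 1 lsl ioffset)

let at_most_one_bit ss =
  ss land (ss - 1) = 0

let () =
  (* Element [word_size + 1] lives in the second word, at bit 1. *)
  let addr, mask = decompose (word_size + 1) in
  assert (addr = word_size && mask = 2);
  assert (at_most_one_bit 8);         (* 0b1000: one bit set *)
  assert (not (at_most_one_bit 10))   (* 0b1010: two bits set *)
```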
*) ss land (ss - 1) = 0 | C (_, _, C _) | N -> false let cardinal s = fold (fun _ m -> m + 1) s 0 let elements s = fold (fun tl hd -> tl :: hd) s [] let rec subset s1 s2 = match s1, s2 with | N, _ -> true | _, N -> false | C (addr1, ss1, qs1), C (addr2, ss2, qs2) -> if addr1 < addr2 then false else if addr1 = addr2 then if (ss1 land ss2) <> ss1 then false else subset qs1 qs2 else subset s1 qs2 let mem i s = subset (singleton i) s let rec union s1 s2 = match s1, s2 with | N, s | s, N -> s | C (addr1, ss1, qs1), C (addr2, ss2, qs2) -> if addr1 < addr2 then C (addr1, ss1, union qs1 s2) else if addr1 > addr2 then let s = union s1 qs2 in if s == qs2 then s2 else C (addr2, ss2, s) else let ss = ss1 lor ss2 in let s = union qs1 qs2 in if ss == ss2 && s == qs2 then s2 else C (addr1, ss, s) let rec inter s1 s2 = match s1, s2 with | N, _ | _, N -> N | C (addr1, ss1, qs1), C (addr2, ss2, qs2) -> if addr1 < addr2 then inter qs1 s2 else if addr1 > addr2 then inter s1 qs2 else let ss = ss1 land ss2 in let s = inter qs1 qs2 in if ss = 0 then s else if (ss = ss1) && (s == qs1) then s1 else C (addr1, ss, s) exception Found of int let choose s = try iter (fun x -> raise (Found x) ) s; raise Not_found with Found x -> x let rec compare s1 s2 = match s1, s2 with N, N -> 0 | _, N -> 1 | N, _ -> -1 | C (addr1, ss1, qs1), C (addr2, ss2, qs2) -> if addr1 < addr2 then -1 else if addr1 > addr2 then 1 else if ss1 < ss2 then -1 else if ss1 > ss2 then 1 else compare qs1 qs2 let equal s1 s2 = compare s1 s2 = 0 let rec disjoint s1 s2 = match s1, s2 with | N, _ | _, N -> true | C (addr1, ss1, qs1), C (addr2, ss2, qs2) -> if addr1 = addr2 then if (ss1 land ss2) = 0 then disjoint qs1 qs2 else false else if addr1 < addr2 then disjoint qs1 s2 else disjoint s1 qs2 menhir-20200123/src/compressedBitSet.mli000066400000000000000000000017671361226111300200070ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François 
Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include GSet.S with type element = int menhir-20200123/src/conflict.ml000066400000000000000000000444421361226111300161550ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar let () = if Settings.graph then DependencyGraph.print_dependency_graph() (* artificial dependency *) (* -------------------------------------------------------------------------- *) (* Explaining shift actions. *) (* The existence of a shift action stems from the existence of a shift item in the LR(0) core that underlies the LR(1) state of interest. That is, lookahead sets are not relevant. The existence of a shift item in the LR(0) core is explained by finding a path from a start item to the shift item in the LR(0) nondeterministic automaton, such that the symbols read along this path form the (previously fixed) symbol string that leads to the conflict state in the LR(1) automaton. There may be several such paths: a shortest one is chosen. There may also be several shift items in the conflict state: an arbitrary one is chosen. I believe it would not be interesting to display traces for several shift items: they would be identical except in their last line (where the desired shift item actually appears). 
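The path search described above explores two kinds of transitions out of an LR(0) item. A toy sketch, with illustrative `symbol` and `item` types rather than Menhir's actual [Item.t]:

```ocaml
(* A toy model of the two kinds of transitions explored by the search:
   a shift transition advances the dot when the next symbol matches the
   input, while an epsilon transition descends into the productions of
   the nonterminal found after the dot. *)
type symbol = T of string | N of string
type item = { rhs : symbol array; pos : int }

(* Advance the dot over [input], if it is the symbol after the dot. *)
let shift_successor (it : item) (input : symbol) : item option =
  if it.pos < Array.length it.rhs && it.rhs.(it.pos) = input then
    Some { it with pos = it.pos + 1 }
  else
    None

(* For a dot placed before a nonterminal, start a fresh item at the
   beginning of each of its productions. *)
let epsilon_successors (productions : string -> symbol array list) (it : item) : item list =
  if it.pos < Array.length it.rhs then
    match it.rhs.(it.pos) with
    | N nt -> List.map (fun rhs -> { rhs; pos = 0 }) (productions nt)
    | T _ -> []
  else
    []

let () =
  let it = { rhs = [| T "a"; N "e" |]; pos = 0 } in
  assert (shift_successor it (T "a") = Some { rhs = it.rhs; pos = 1 });
  assert (shift_successor it (T "b") = None);
  let prods = function "e" -> [ [| T "x" |] ] | _ -> [] in
  assert (epsilon_successors prods { it with pos = 1 } = [ { rhs = [| T "x" |]; pos = 0 } ])
```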
*) (* Symbolic execution of the nondeterministic LR(0) automaton. *) (* Configurations are pairs of an LR(0) item and an offset into the input string, which indicates how much has been read so far. *) type configuration0 = Item.t * int (* This function builds a derivation out of a (nonempty, reversed) sequence of configurations. The derivation is constructed from bottom to top, that is, beginning at the last configuration and moving back towards to the start configuration. *) let rec follow derivation offset' = function | [] -> assert (offset' = 0); derivation | (item, offset) :: configs -> let _, _, rhs, pos, _ = Item.def item in let derivation = if offset = offset' then (* This is an epsilon transition. Put a new root node on top of the existing derivation. *) Derivation.build pos rhs derivation None else (* This was a shift transition. Tack symbol in front of the forest. *) Derivation.prepend rhs.(pos) derivation in follow derivation offset configs (* Symbolic execution begins with a start item (corresponding to one of the automaton's entry nodes), a fixed string of input symbols, to be fully consumed, and a goal item. The objective is to find a path through the automaton that leads from the start configuration [(start, 0)] to the goal configuration [(stop, n)], where [n] is the length of the input string. The automaton is explored via breadth-first search. A hash table is used to record which configurations have been visited and to build a spanning tree of shortest paths. 
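The breadth-first exploration with an ancestor table, as described above, can be sketched generically. The `bfs` and `materialize` functions below are illustrative stand-ins for this file's search loop and for `Misc.materialize`, not its actual code:

```ocaml
(* A generic sketch of the search strategy described above: explore
   configurations breadth-first, record each configuration's ancestor
   in a hash table, and rebuild a shortest path once a goal is reached. *)
let bfs (successors : 'a -> 'a list) (start : 'a) (goal : 'a -> bool) : 'a list option =
  let table : ('a, 'a option) Hashtbl.t = Hashtbl.create 1023 in
  let queue : 'a Queue.t = Queue.create () in
  let enqueue ancestor config =
    if not (Hashtbl.mem table config) then begin
      Hashtbl.add table config ancestor;
      Queue.add config queue
    end
  in
  (* Walk the spanning tree backwards, from [config] to the start. *)
  let rec materialize config path =
    match Hashtbl.find table config with
    | None -> config :: path
    | Some ancestor -> materialize ancestor (config :: path)
  in
  enqueue None start;
  let rec loop () =
    if Queue.is_empty queue then None
    else
      let config = Queue.take queue in
      if goal config then Some (materialize config [])
      else begin
        List.iter (enqueue (Some config)) (successors config);
        loop ()
      end
  in
  loop ()

let () =
  (* A shortest path from 1 to 6, stepping by one or doubling. *)
  assert (bfs (fun n -> [ n + 1; n * 2 ]) 1 (fun n -> n = 6) = Some [ 1; 2; 3; 6 ])
```

In the module itself, the queue is drained with `Misc.qiter`, the goal raises `Done`, and the path is rebuilt with `Misc.materialize`; termination relies on the goal configuration being reachable.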
*) exception Done let explain_shift_item (start : Item.t) (input : Symbol.t array) (stop : Item.t) : Derivation.t = let n = Array.length input in let table : (configuration0, configuration0 option) Hashtbl.t = Hashtbl.create 1023 in let queue : configuration0 Queue.t = Queue.create() in let enqueue ancestor config = try let _ = Hashtbl.find table config in () with Not_found -> Hashtbl.add table config ancestor; Queue.add config queue in enqueue None (start, 0); try Misc.qiter (function (item, offset) as config -> (* If the item we're looking at is the goal item and if we have read all of the input symbols, stop. *) if (Item.equal item stop) && (offset = n) then raise Done; (* Otherwise, explore the transitions out of this item. *) let prod, _, rhs, pos, length = Item.def item in (* Shift transition, followed only if the symbol matches the symbol found in the input string. *) if (pos < length) && (offset < n) && (Symbol.equal rhs.(pos) input.(offset)) then begin let config' = (Item.import (prod, pos+1), offset+1) in enqueue (Some config) config' end; (* Epsilon transitions. *) if pos < length then match rhs.(pos) with | Symbol.N nt -> Production.iternt nt (fun prod -> let config' = (Item.import (prod, 0), offset) in enqueue (Some config) config' ) | Symbol.T _ -> () ) queue; assert false with Done -> (* We have found a (shortest) path from the start configuration to the goal configuration. Turn it into an explicit derivation. *) let configs = Misc.materialize table (stop, n) in let _, _, rhs, pos, _ = Item.def stop in let derivation = Derivation.tail pos rhs in let derivation = follow derivation n configs in derivation (* -------------------------------------------------------------------------- *) (* Explaining reduce actions. *) (* The existence of a reduce action stems from the existence of a reduce item, whose lookahead set contains the token of interest, in the state of interest. 
Here, lookahead sets are relevant only insofar as they contain or do not contain the token of interest -- in other words, lookahead sets can be abstracted by Boolean values. The existence of the reduce item is explained by finding a path from a start item to the reduce item in the LR(1) nondeterministic automaton, such that the symbols read along this path form the (previously fixed) symbol string that leads to the conflict state in the LR(1) automaton. There may be several such paths: a shortest one is chosen. *) (* Symbolic execution of the nondeterministic LR(1) automaton. *) (* Configurations are pairs of an LR(1) item and an offset into the input string, which indicates how much has been read so far. An LR(1) item is itself represented as the combination of an LR(0) item and a Boolean flag, telling whether the token of interest appears or does not appear in the lookahead set. *) type configuration1 = Item.t * bool * int (* This function builds a derivation out of a sequence of configurations. The end of the sequence is dealt with specially -- we want to explain how the lookahead symbol appears and is inherited. Once that is done, the rest (that is, the beginning) of the derivation is dealt with as above. *) let config1toconfig0 (item, _, offset) = (item, offset) let rec follow1 tok derivation offset' = function | [] -> assert (Terminal.equal tok Terminal.sharp); (* One could emit a comment saying that the lookahead token is initially [#]. That comment would have to be displayed above the derivation, though, and there is no support for that at the moment, so let's skip it. *) derivation | (item, _, offset) :: configs -> let prod, _, rhs, pos, length = Item.def item in if offset = offset' then (* This is an epsilon transition. Attack a new line and add a comment that explains why the lookahead symbol is produced or inherited. 
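The produced/inherited distinction relies on the nullability and FIRST set of the remainder of a right-hand side. A self-contained sketch over plain strings, with the per-symbol `nullable` and `first` oracles supplied by the caller rather than by the `Analysis` module; none of these names belong to Menhir:

```ocaml
(* A sketch of the computation behind [Analysis.nullable_first_prod]:
   the nullability and FIRST set of the suffix of a right-hand side
   that starts at position [pos]. A lookahead token is "produced" if it
   lies in this FIRST set; it can be "inherited" only if the suffix is
   nullable. *)
module StringSet = Set.Make (String)

let nullable_first_suffix nullable first rhs pos =
  let n = Array.length rhs in
  let rec loop i =
    if i = n then (true, StringSet.empty)
    else if nullable rhs.(i) then
      let null, f = loop (i + 1) in
      (null, StringSet.union (first rhs.(i)) f)
    else
      (false, first rhs.(i))
  in
  loop pos

let () =
  (* "A" is nullable with FIRST {a}; "B" is not nullable with FIRST {b}.
     The suffix "A B" is not nullable and has FIRST {a, b}: the token
     [b] shows through the nullable symbol "A". *)
  let nullable s = (s = "A")
  and first s = StringSet.singleton (String.lowercase_ascii s) in
  let null, f = nullable_first_suffix nullable first [| "A"; "B" |] 0 in
  assert (not null);
  assert (StringSet.elements f = [ "a"; "b" ])
```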
*) let nullable, first = Analysis.nullable_first_prod prod (pos + 1) in if TerminalSet.mem tok first then (* The lookahead symbol is produced (and perhaps also inherited, but let's ignore that). *) let e = Analysis.explain_first_rhs tok rhs (pos + 1) in let comment = "lookahead token appears" ^ (if e = "" then "" else " because " ^ e) in let derivation = Derivation.build pos rhs derivation (Some comment) in (* Print the rest of the derivation without paying attention to the lookahead symbols. *) follow derivation offset (List.map config1toconfig0 configs) else begin (* The lookahead symbol is not produced, so it is definitely inherited. *) assert nullable; let comment = "lookahead token is inherited" ^ (if pos + 1 < length then Printf.sprintf " because %scan vanish" (Symbol.printao (pos + 1) rhs) else "") in let derivation = Derivation.build pos rhs derivation (Some comment) in follow1 tok derivation offset configs end else (* This is a shift transition. Tack symbol in front of forest. *) let derivation = Derivation.prepend rhs.(pos) derivation in follow1 tok derivation offset configs (* Symbolic execution is performed in the same manner as above. *) let explain_reduce_item (tok : Terminal.t) (start : Item.t) (input : Symbol.t array) (stop : Item.t) : Derivation.t = let n = Array.length input in let table : (configuration1, configuration1 option) Hashtbl.t = Hashtbl.create 1023 in let queue : configuration1 Queue.t = Queue.create() in let enqueue ancestor config = try let _ = Hashtbl.find table config in () with Not_found -> Hashtbl.add table config ancestor; Queue.add config queue in (* If the lookahead token is #, then it initially appear in the lookahead set, otherwise it doesn't. *) enqueue None (start, Terminal.equal tok Terminal.sharp, 0); try Misc.qiter (function (item, lookahead, offset) as config -> (* If the item we're looking at is the goal item and if we have read all of the input symbols, stop. 
*) if (Item.equal item stop) && lookahead && (offset = n) then raise Done; (* Otherwise, explore the transitions out of this item. *) let prod, _nt, rhs, pos, length = Item.def item in (* Shift transition, followed only if the symbol matches the symbol found in the input string. *) if (pos < length) && (offset < n) && (Symbol.equal rhs.(pos) input.(offset)) then begin let config' = (Item.import (prod, pos+1), lookahead, offset+1) in enqueue (Some config) config' end; (* Epsilon transitions. *) if pos < length then match rhs.(pos) with | Symbol.N nt -> let nullable, first = Analysis.nullable_first_prod prod (pos + 1) in let first : bool = TerminalSet.mem tok first in let lookahead' = if nullable then first || lookahead else first in Production.iternt nt (fun prod -> let config' = (Item.import (prod, 0), lookahead', offset) in enqueue (Some config) config' ) | Symbol.T _ -> () ) queue; assert false with Done -> (* We have found a (shortest) path from the start configuration to the goal configuration. Turn it into an explicit derivation. *) let configs = Misc.materialize table (stop, true, n) in let derivation = Derivation.empty in let derivation = follow1 tok derivation n configs in derivation (* -------------------------------------------------------------------------- *) (* Putting it all together. *) let () = if Settings.explain then begin (* 2018/09/05: when [--explain] is enabled, always create a fresh .conflicts file (wiping out any pre-existing file), even if there are in fact no conflicts. This should avoid confusion with outdated .conflicts files. *) let out = open_out (Settings.base ^ ".conflicts") in Lr1.conflicts (fun toks node -> try (* Construct a partial LR(1) automaton, looking for a conflict in a state that corresponds to this node. Because Pager's algorithm can merge two states as soon as one of them has a conflict, we can't be too specific about the conflict that we expect to find in the canonical automaton. 
So, we must supply a set of conflict tokens and accept any kind of conflict that involves one of them. *) (* TEMPORARY with the new compatibility criterion, we can be sure that every conflict token is indeed involved in a conflict. Exploit that? Avoid focusing on a single token? *) let module P = Lr1partial.Run (struct let tokens = toks let goal = node end) in let closure = Lr0.closure P.goal in (* Determine what kind of conflict was found. *) let shift, reduce = Item.Map.fold (fun item toks (shift, reduce) -> match Item.classify item with | Item.Shift (Symbol.T tok, _) when Terminal.equal tok P.token -> shift + 1, reduce | Item.Reduce _ when TerminalSet.mem P.token toks -> shift, reduce + 1 | _ -> shift, reduce ) closure (0, 0) in let kind = if (shift > 0) && (reduce > 1) then "shift/reduce/reduce" else if (shift > 0) then "shift/reduce" else "reduce/reduce" in (* Explain how the conflict state is reached. *) Printf.fprintf out "\n\ ** Conflict (%s) in state %d.\n\ ** Token%s involved: %s\n%s\ ** This state is reached from %s after reading:\n\n%s\n" kind (Lr1.number node) (if TerminalSet.cardinal toks > 1 then "s" else "") (TerminalSet.print toks) (if TerminalSet.cardinal toks > 1 then Printf.sprintf "** The following explanations concentrate on token %s.\n" (Terminal.print P.token) else "") (Nonterminal.print false (Item.startnt P.source)) (Symbol.printa P.path); (* Examine the items in that state, focusing on one particular token. Out of the shift items, we explain just one -- this seems enough. We explain each of the reduce items. *) (* First, build a mapping of items to derivations. 
*) let (_ : bool), derivations = Item.Map.fold (fun item toks (still_looking_for_shift_item, derivations) -> match Item.classify item with | Item.Shift (Symbol.T tok, _) when still_looking_for_shift_item && (Terminal.equal tok P.token) -> false, let derivation = explain_shift_item P.source P.path item in Item.Map.add item derivation derivations | Item.Reduce _ when TerminalSet.mem P.token toks -> still_looking_for_shift_item, let derivation = explain_reduce_item P.token P.source P.path item in Item.Map.add item derivation derivations | _ -> still_looking_for_shift_item, derivations ) closure (true, Item.Map.empty) in (* Factor out the common context among all derivations, so as to avoid repeating it. This helps prevent derivation trees from drifting too far away towards the right. It also helps produce sub-derivations that are quite compact. *) let context, derivations = Derivation.factor derivations in (* Display the common context. *) Printf.fprintf out "\n** The derivations that appear below have the following common factor:\ \n** (The question mark symbol (?) represents the spot where the derivations begin to differ.)\n\n"; Derivation.printc out context; (* Then, display the sub-derivations. *) Item.Map.iter (fun item derivation -> Printf.fprintf out "\n** In state %d, looking ahead at %s, " (Lr1.number node) (Terminal.print P.token); begin match Item.classify item with | Item.Shift _ -> Printf.fprintf out "shifting is permitted\n** because of the following sub-derivation:\n\n" | Item.Reduce prod -> Printf.fprintf out "reducing production\n** %s\n** is permitted because of the following sub-derivation:\n\n" (Production.print prod) end; Derivation.print out derivation ) derivations; flush out with Lr1partial.Oops -> (* Ha ha! We were unable to explain this conflict. This could happen because the automaton was butchered by conflict resolution directives, or because [--lalr] was enabled and we have unexplainable LALR conflicts. 
Anyway, send the error message to the .conflicts file and continue. *) Printf.fprintf out "\n\ ** Conflict (unexplainable) in state %d.\n\ ** Token%s involved: %s\n\ ** %s.\n%!" (Lr1.number node) (if TerminalSet.cardinal toks > 1 then "s" else "") (TerminalSet.print toks) (match Settings.construction_mode with | Settings.ModeLALR -> "This may be an artificial conflict caused by your use of --lalr" | Settings.ModeCanonical | Settings.ModeInclusionOnly | Settings.ModePager -> "Please send your grammar to Menhir's developers" ) ); Time.tick "Explaining conflicts" end (* ------------------------------------------------------------------------ *) (* Resolve the conflicts that remain in the automaton. *) let () = Lr1.default_conflict_resolution(); Time.tick "Resolving remaining conflicts" (* ------------------------------------------------------------------------ *) (* Now is as good a time as any to add extra reductions, if requested by the user. This must be done after conflicts have been resolved. *) let () = Lr1.extra_reductions(); Time.tick "Adding extra reductions" (* ------------------------------------------------------------------------ *) (* If any warnings about the grammar have been emitted up to this point, and if [--strict] is enabled, now is the time to stop, before going into the back-end. *) let () = Error.exit_if Error.grammatical_error menhir-20200123/src/conflict.mli000066400000000000000000000021321361226111300163140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module explains conflicts. 
Explanations are written to the .conflicts file. No functionality is offered by this module. *) menhir-20200123/src/coqBackend.ml (******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Printf open Grammar module Run (T: sig end) = struct let from_menhirlib f = match Settings.coq_lib_path with | None -> () | Some path -> fprintf f "From %s " path let menhirlib_path = match Settings.coq_lib_path with | None -> "" | Some path -> path ^ "." let print_term t = assert (not (Terminal.pseudo t)); sprintf "%s't" (Terminal.print t) let print_nterm nt = sprintf "%s'nt" (Nonterminal.print true nt) let print_symbol = function | Symbol.N nt -> sprintf "NT %s" (print_nterm nt) | Symbol.T t -> sprintf "T %s" (print_term t) let print_type ty = if Settings.coq_no_actions then "unit" else match ty with | None -> "unit" | Some t -> match t with | Stretch.Declared s -> s.Stretch.stretch_content | Stretch.Inferred _ -> assert false (* We cannot infer coq types *) let is_final_state node = match Default.has_default_reduction node with | Some (prod, _) -> Production.is_start prod | None -> false let lr1_iter_nonfinal f = Lr1.iter (fun node -> if not (is_final_state node) then f node) let lr1_iterx_nonfinal f = Lr1.iterx (fun node -> if not (is_final_state node) then f node) let lr1_foldx_nonfinal f = Lr1.foldx (fun accu node -> if not (is_final_state node) then f accu node else accu) let print_nis nis = sprintf "Nis'%d" (Lr1.number nis) let print_init init = sprintf "Init'%d" (Lr1.number init) let
print_st st = match Lr1.incoming_symbol st with | Some _ -> sprintf "Ninit %s" (print_nis st) | None -> sprintf "Init %s" (print_init st) let (prod_ids, _) = Production.foldx (fun p (prod_ids, counters) -> let lhs = Production.nt p in let id = try SymbolMap.find (Symbol.N lhs) counters with Not_found -> 0 in (ProductionMap.add p id prod_ids, SymbolMap.add (Symbol.N lhs) (id+1) counters)) (ProductionMap.empty, SymbolMap.empty) let print_prod p = sprintf "Prod'%s'%d" (Nonterminal.print true (Production.nt p)) (ProductionMap.find p prod_ids) let () = if not Settings.coq_no_actions then begin Nonterminal.iterx (fun nonterminal -> match Nonterminal.ocamltype nonterminal with | None -> Error.error [] "I don't know the type of the nonterminal symbol %s." (Nonterminal.print false nonterminal) | Some _ -> ()); Production.iterx (fun prod -> if not (Keyword.KeywordSet.is_empty (Action.keywords (Production.action prod))) then Error.error [] "the Coq back-end supports none of the $ keywords." ) end; Production.iterx (fun prod -> Array.iter (fun symb -> match symb with | Symbol.T t -> if t = Terminal.error then Error.error [] "the Coq back-end does not support the error token." | _ -> ()) (Production.rhs prod)); if Front.grammar.BasicSyntax.parameters <> [] then Error.error [] "the Coq back-end does not support %%parameter." 
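The per-nonterminal numbering computed by `prod_ids` and used by `print_prod` threads a map of counters through a fold. An illustrative sketch over plain strings; `number_by_key` is a hypothetical helper, not part of this file:

```ocaml
(* The numbering scheme of [prod_ids] and [print_prod], in isolation:
   each item receives an index counting the previous items that share
   its key, obtained by threading a map of counters through a fold. *)
module StringMap = Map.Make (String)

let number_by_key (items : (string * 'a) list) : ((string * 'a) * int) list =
  let numbered, _counters =
    List.fold_left
      (fun (acc, counters) (key, v) ->
        let id = try StringMap.find key counters with Not_found -> 0 in
        (((key, v), id) :: acc, StringMap.add key (id + 1) counters))
      ([], StringMap.empty) items
  in
  List.rev numbered

let () =
  (* Two productions of "expr" are numbered 0 and 1; "term" restarts at
     0, much as Prod'expr'0, Prod'expr'1 and Prod'term'0 are printed. *)
  let result = number_by_key [ ("expr", "e1"); ("expr", "e2"); ("term", "t1") ] in
  assert (List.map snd result = [ 0; 1; 0 ])
```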
let write_tokens f = fprintf f "Inductive token : Type :="; Terminal.iter_real (fun term -> fprintf f "\n| %s : %s%%type -> token" (Terminal.print term) (print_type (Terminal.ocamltype term)) ); fprintf f ".\n\n" let write_inductive_alphabet f name constrs = fprintf f "Inductive %s' : Set :=" name; List.iter (fprintf f "\n| %s") constrs; fprintf f ".\n"; fprintf f "Definition %s := %s'.\n\n" name name; if List.length constrs > 0 then begin let iteri f = ignore (List.fold_left (fun k x -> f k x; succ k) 1 constrs) in fprintf f "Program Instance %sNum : %sAlphabet.Numbered %s :=\n" name menhirlib_path name; fprintf f " { inj := fun x => match x return _ with"; iteri (fun k constr -> fprintf f "\n | %s => %d%%positive" constr k); fprintf f "\n end;\n"; fprintf f " surj := (fun n => match n return _ with"; iteri (fprintf f "\n | %d%%positive => %s"); fprintf f "\n | _ => %s\n end)%%Z;\n" (List.hd constrs); fprintf f " inj_bound := %d%%positive }.\n" (List.length constrs); end else begin fprintf f "Program Instance %sAlph : %sAlphabet.Alphabet %s :=\n" name menhirlib_path name; fprintf f " { AlphabetComparable := {| compare := fun x y =>\n"; fprintf f " match x, y return comparison with end |};\n"; fprintf f " AlphabetEnumerable := {| all_list := []%%list |} }."; end let write_terminals f = write_inductive_alphabet f "terminal" ( Terminal.fold (fun t l -> if Terminal.pseudo t then l else print_term t::l) []); fprintf f "Instance TerminalAlph : %sAlphabet.Alphabet terminal := _.\n\n" menhirlib_path let write_nonterminals f = write_inductive_alphabet f "nonterminal" ( Nonterminal.foldx (fun nt l -> (print_nterm nt)::l) []); fprintf f "Instance NonTerminalAlph : %sAlphabet.Alphabet nonterminal := _.\n\n" menhirlib_path let write_symbol_semantic_type f = fprintf f "Definition terminal_semantic_type (t:terminal) : Type:=\n"; fprintf f " match t with\n"; Terminal.iter_real (fun terminal -> fprintf f " | %s => %s%%type\n" (print_term terminal) (print_type (Terminal.ocamltype 
terminal)) ); fprintf f " end.\n\n"; fprintf f "Definition nonterminal_semantic_type (nt:nonterminal) : Type:=\n"; fprintf f " match nt with\n"; Nonterminal.iterx (fun nonterminal -> fprintf f " | %s => %s%%type\n" (print_nterm nonterminal) (print_type (Nonterminal.ocamltype nonterminal))); fprintf f " end.\n\n"; fprintf f "Definition symbol_semantic_type (s:symbol) : Type:=\n"; fprintf f " match s with\n"; fprintf f " | T t => terminal_semantic_type t\n"; fprintf f " | NT nt => nonterminal_semantic_type nt\n"; fprintf f " end.\n\n" let write_token_term f = fprintf f "Definition token_term (tok : token) : terminal :=\n"; fprintf f " match tok with\n"; Terminal.iter_real (fun terminal -> fprintf f " | %s _ => %s\n" (Terminal.print terminal) (print_term terminal)); fprintf f " end.\n\n" let write_token_sem f = fprintf f "Definition token_sem (tok : token) : symbol_semantic_type (T (token_term tok)) :=\n"; fprintf f " match tok with\n"; Terminal.iter_real (fun terminal -> fprintf f " | %s x => x\n" (Terminal.print terminal)); fprintf f " end.\n\n" let write_productions f = write_inductive_alphabet f "production" ( Production.foldx (fun prod l -> (print_prod prod)::l) []); fprintf f "Instance ProductionAlph : %sAlphabet.Alphabet production := _.\n\n" menhirlib_path let write_productions_contents f = fprintf f "Definition prod_contents (p:production) :\n"; fprintf f " { p:nonterminal * list symbol &\n"; fprintf f " %sGrammar.arrows_right\n" menhirlib_path; fprintf f " (symbol_semantic_type (NT (fst p)))\n"; fprintf f " (List.map symbol_semantic_type (snd p)) }\n"; fprintf f " :=\n"; fprintf f " let box := existT (fun p =>\n"; fprintf f " %sGrammar.arrows_right\n" menhirlib_path; fprintf f " (symbol_semantic_type (NT (fst p)))\n"; fprintf f " (List.map symbol_semantic_type (snd p)) )\n"; fprintf f " in\n"; fprintf f " match p with\n"; Production.iterx (fun prod -> fprintf f " | %s => box\n" (print_prod prod); fprintf f " (%s, [%s]%%list)\n" (print_nterm (Production.nt 
prod)) (String.concat "; " (List.map print_symbol (List.rev (Array.to_list (Production.rhs prod))))); if Production.length prod = 0 then fprintf f " (\n" else fprintf f " (fun %s =>\n" (String.concat " " (List.rev (Array.to_list (Production.identifiers prod)))); if Settings.coq_no_actions then fprintf f "tt" else Printer.print_expr f (Action.to_il_expr (Production.action prod)); fprintf f "\n)\n"); fprintf f " end.\n\n"; fprintf f "Definition prod_lhs (p:production) :=\n"; fprintf f " fst (projT1 (prod_contents p)).\n"; fprintf f "Definition prod_rhs_rev (p:production) :=\n"; fprintf f " snd (projT1 (prod_contents p)).\n"; fprintf f "Definition prod_action (p:production) :=\n"; fprintf f " projT2 (prod_contents p).\n\n" let write_nullable_first f = fprintf f "Definition nullable_nterm (nt:nonterminal) : bool :=\n"; fprintf f " match nt with\n"; Nonterminal.iterx (fun nt -> fprintf f " | %s => %b\n" (print_nterm nt) (Analysis.nullable nt)); fprintf f " end.\n\n"; fprintf f "Definition first_nterm (nt:nonterminal) : list terminal :=\n"; fprintf f " match nt with\n"; Nonterminal.iterx (fun nt -> let firstSet = Analysis.first nt in fprintf f " | %s => [" (print_nterm nt); let first = ref true in TerminalSet.iter (fun t -> if !first then first := false else fprintf f "; "; fprintf f "%s" (print_term t) ) firstSet; fprintf f "]%%list\n"); fprintf f " end.\n\n" let write_grammar f = fprintf f "Module Import Gram <: %sGrammar.T.\n\n" menhirlib_path; fprintf f "Local Obligation Tactic := let x := fresh in intro x; case x; reflexivity.\n\n"; write_terminals f; write_nonterminals f; fprintf f "Include %sGrammar.Symbol.\n\n" menhirlib_path; write_symbol_semantic_type f; fprintf f "Definition token := token.\n\n"; write_token_term f; write_token_sem f; write_productions f; write_productions_contents f; fprintf f "Include %sGrammar.Defs.\n\n" menhirlib_path; fprintf f "End Gram.\n\n" let write_nis f = write_inductive_alphabet f "noninitstate" ( lr1_foldx_nonfinal (fun l node -> 
(print_nis node)::l) []); fprintf f "Instance NonInitStateAlph : %sAlphabet.Alphabet noninitstate := _.\n\n" menhirlib_path let write_init f = write_inductive_alphabet f "initstate" ( ProductionMap.fold (fun _prod node l -> (print_init node)::l) Lr1.entry []); fprintf f "Instance InitStateAlph : %sAlphabet.Alphabet initstate := _.\n\n" menhirlib_path let write_start_nt f = fprintf f "Definition start_nt (init:initstate) : nonterminal :=\n"; fprintf f " match init with\n"; Lr1.fold_entry (fun _prod node startnt _t () -> fprintf f " | %s => %s\n" (print_init node) (print_nterm startnt) ) (); fprintf f " end.\n\n" let write_actions f = fprintf f "Definition action_table (state:state) : action :=\n"; fprintf f " match state with\n"; lr1_iter_nonfinal (fun node -> fprintf f " | %s => " (print_st node); match Default.has_default_reduction node with | Some (prod, _) -> fprintf f "Default_reduce_act %s\n" (print_prod prod) | None -> fprintf f "Lookahead_act (fun terminal:terminal =>\n"; fprintf f " match terminal return lookahead_action terminal with\n"; let has_fail = ref false in Terminal.iter_real (fun t -> try let target = SymbolMap.find (Symbol.T t) (Lr1.transitions node) in fprintf f " | %s => Shift_act %s (eq_refl _)\n" (print_term t) (print_nis target) with Not_found -> try let prod = Misc.single (TerminalMap.find t (Lr1.reductions node)) in fprintf f " | %s => Reduce_act %s\n" (print_term t) (print_prod prod) with Not_found -> has_fail := true); if !has_fail then fprintf f " | _ => Fail_act\n"; fprintf f " end)\n" ); fprintf f " end.\n\n" let write_gotos f = fprintf f "Definition goto_table (state:state) (nt:nonterminal) :=\n"; fprintf f " match state, nt return option { s:noninitstate | NT nt = last_symb_of_non_init_state s } with\n"; let has_none = ref false in lr1_iter_nonfinal (fun node -> Nonterminal.iterx (fun nt -> try let target = SymbolMap.find (Symbol.N nt) (Lr1.transitions node) in fprintf f " | %s, %s => " (print_st node) (print_nterm nt); if 
is_final_state target then fprintf f "None" else fprintf f "Some (exist _ %s (eq_refl _))\n" (print_nis target) with Not_found -> has_none := true)); if !has_none then fprintf f " | _, _ => None\n"; fprintf f " end.\n\n" let write_last_symb f = fprintf f "Definition last_symb_of_non_init_state (noninitstate:noninitstate) : symbol :=\n"; fprintf f " match noninitstate with\n"; lr1_iterx_nonfinal (fun node -> match Lr1.incoming_symbol node with | Some s -> fprintf f " | %s => %s\n" (print_nis node) (print_symbol s) | None -> assert false); fprintf f " end.\n\n" let write_past_symb f = fprintf f "Definition past_symb_of_non_init_state (noninitstate:noninitstate) : list symbol :=\n"; fprintf f " match noninitstate with\n"; lr1_iterx_nonfinal (fun node -> let s = String.concat "; " (List.tl (Invariant.fold (fun l _ symb _ -> print_symbol symb::l) [] (Invariant.stack node))) in fprintf f " | %s => [%s]%%list\n" (print_nis node) s); fprintf f " end.\n"; fprintf f "Extract Constant past_symb_of_non_init_state => \"fun _ -> assert false\".\n\n" module NodeSetMap = Map.Make(Lr1.NodeSet) let write_past_states f = let get_stateset_id = let memo = ref NodeSetMap.empty in let next_id = ref 1 in fun stateset -> try NodeSetMap.find stateset !memo with | Not_found -> let id = sprintf "state_set_%d" !next_id in memo := NodeSetMap.add stateset id !memo; incr next_id; fprintf f "Definition %s (s:state) : bool :=\n" id; fprintf f " match s with\n"; fprintf f " "; Lr1.NodeSet.iter (fun st -> fprintf f "| %s " (print_st st)) stateset; fprintf f "=> true\n"; fprintf f " | _ => false\n"; fprintf f " end.\n"; fprintf f "Extract Inlined Constant %s => \"assert false\".\n\n" id; id in let b = Buffer.create 256 in bprintf b "Definition past_state_of_non_init_state (s:noninitstate) : list (state -> bool) :=\n"; bprintf b " match s with\n"; lr1_iterx_nonfinal (fun node -> let s = String.concat "; " (Invariant.fold (fun accu _ _ states -> get_stateset_id states::accu) [] (Invariant.stack node)) 
in bprintf b " | %s => [ %s ]%%list\n" (print_nis node) s); bprintf b " end.\n"; Buffer.output_buffer f b; fprintf f "Extract Constant past_state_of_non_init_state => \"fun _ -> assert false\".\n\n" module TerminalSetMap = Map.Make(TerminalSet) let write_items f = if not Settings.coq_no_complete then begin let get_lookaheadset_id = let memo = ref TerminalSetMap.empty in let next_id = ref 1 in fun lookaheadset -> let lookaheadset = if TerminalSet.mem Terminal.sharp lookaheadset then TerminalSet.universe else lookaheadset in try TerminalSetMap.find lookaheadset !memo with Not_found -> let id = sprintf "lookahead_set_%d" !next_id in memo := TerminalSetMap.add lookaheadset id !memo; incr next_id; fprintf f "Definition %s : list terminal :=\n [" id; let first = ref true in TerminalSet.iter (fun lookahead -> if !first then first := false else fprintf f "; "; fprintf f "%s" (print_term lookahead) ) lookaheadset; fprintf f "]%%list.\nExtract Inlined Constant %s => \"assert false\".\n\n" id; id in let b = Buffer.create 256 in lr1_iter_nonfinal (fun node -> bprintf b "Definition items_of_state_%d : list item :=\n" (Lr1.number node); bprintf b " [ "; let first = ref true in Item.Map.iter (fun item lookaheads -> let prod, pos = Item.export item in if not (Production.is_start prod) then begin if !first then first := false else bprintf b ";\n "; bprintf b "{| prod_item := %s; dot_pos_item := %d; lookaheads_item := %s |}" (print_prod prod) pos (get_lookaheadset_id lookaheads); end ) (Lr0.closure (Lr0.export (Lr1.state node))); bprintf b " ]%%list.\n"; bprintf b "Extract Inlined Constant items_of_state_%d => \"assert false\".\n\n" (Lr1.number node) ); Buffer.output_buffer f b; fprintf f "Definition items_of_state (s:state) : list item :=\n"; fprintf f " match s with\n"; lr1_iter_nonfinal (fun node -> fprintf f " | %s => items_of_state_%d\n" (print_st node) (Lr1.number node)); fprintf f " end.\n"; end else fprintf f "Definition items_of_state (s:state): list item := []%%list.\n"; 
  fprintf f "Extract Constant items_of_state => \"fun _ -> assert false\".\n\n"

let write_automaton f =
  fprintf f "Module Aut <: %sAutomaton.T.\n\n" menhirlib_path;
  fprintf f "Local Obligation Tactic := let x := fresh in intro x; case x; reflexivity.\n\n";
  fprintf f "Module Gram := Gram.\n";
  fprintf f "Module GramDefs := Gram.\n\n";
  write_nullable_first f;
  write_nis f;
  write_last_symb f;
  write_init f;
  fprintf f "Include %sAutomaton.Types.\n\n" menhirlib_path;
  write_start_nt f;
  write_actions f;
  write_gotos f;
  write_past_symb f;
  write_past_states f;
  write_items f;
  fprintf f "End Aut.\n\n"

let write_theorems f =
  fprintf f "Module MenhirLibParser := %sMain.Make Aut.\n" menhirlib_path;
  fprintf f "Theorem safe:\n";
  fprintf f " MenhirLibParser.safe_validator tt = true.\n";
  fprintf f "Proof eq_refl true<:MenhirLibParser.safe_validator tt = true.\n\n";
  if not Settings.coq_no_complete then begin
    fprintf f "Theorem complete:\n";
    fprintf f " MenhirLibParser.complete_validator tt = true.\n";
    fprintf f "Proof eq_refl true<:MenhirLibParser.complete_validator tt = true.\n\n";
  end;
  Lr1.fold_entry (fun _prod node startnt _t () ->
    let funName = Nonterminal.print true startnt in
    fprintf f "Definition %s : nat -> MenhirLibParser.Inter.buffer -> MenhirLibParser.Inter.parse_result %s := MenhirLibParser.parse safe Aut.%s.\n\n"
      funName (print_type (Nonterminal.ocamltype startnt)) (print_init node);
    fprintf f "Theorem %s_correct (log_fuel : nat) (buffer : MenhirLibParser.Inter.buffer):\n" funName;
    fprintf f " match %s log_fuel buffer with\n" funName;
    fprintf f " | MenhirLibParser.Inter.Parsed_pr sem buffer_new =>\n";
    fprintf f " exists word (tree : Gram.parse_tree (%s) word),\n" (print_symbol (Symbol.N startnt));
    fprintf f " buffer = MenhirLibParser.Inter.app_buf word buffer_new /\\\n";
    fprintf f " Gram.pt_sem tree = sem\n";
    fprintf f " | _ => True\n";
    fprintf f " end.\n";
    fprintf f "Proof. apply MenhirLibParser.parse_correct with (init:=Aut.%s). Qed.\n\n" (print_init node);
    if not Settings.coq_no_complete then begin
      fprintf f "Theorem %s_complete (log_fuel : nat) (word : list token) (buffer_end : MenhirLibParser.Inter.buffer) :\n" funName;
      fprintf f " forall tree : Gram.parse_tree (%s) word,\n" (print_symbol (Symbol.N startnt));
      fprintf f " match %s log_fuel (MenhirLibParser.Inter.app_buf word buffer_end) with\n" funName;
      fprintf f " | MenhirLibParser.Inter.Fail_pr => False\n";
      fprintf f " | MenhirLibParser.Inter.Parsed_pr output_res buffer_end_res =>\n";
      fprintf f " output_res = Gram.pt_sem tree /\\\n";
      fprintf f " buffer_end_res = buffer_end /\\ (Gram.pt_size tree <= PeanoNat.Nat.pow 2 log_fuel)%%nat\n";
      fprintf f " | MenhirLibParser.Inter.Timeout_pr => (PeanoNat.Nat.pow 2 log_fuel < Gram.pt_size tree)%%nat\n";
      fprintf f " end.\n";
      fprintf f "Proof. apply MenhirLibParser.parse_complete with (init:=Aut.%s); exact complete. Qed.\n" (print_init node);
    end
  ) ()

let write_all f =
  if not Settings.coq_no_actions then
    List.iter (fun s -> fprintf f "%s\n\n" s.Stretch.stretch_content)
      Front.grammar.BasicSyntax.preludes;
  fprintf f "From Coq.Lists Require List.\n";
  fprintf f "From Coq.PArith Require Import BinPos.\n";
  from_menhirlib f;
  fprintf f "Require Main.\n";
  if not Settings.coq_no_version_check then begin
    from_menhirlib f;
    fprintf f "Require Version.\n"
  end;
  fprintf f "Import List.ListNotations.\n\n";
  if not Settings.coq_no_version_check then
    fprintf f "Definition version_check : unit := %sVersion.require_%s.\n\n"
      menhirlib_path Version.version;
  fprintf f "Unset Elimination Schemes.\n\n";
  write_tokens f;
  write_grammar f;
  write_automaton f;
  write_theorems f;
  if not Settings.coq_no_actions then
    List.iter (fun stretch -> fprintf f "\n\n%s" stretch.Stretch.stretch_raw_content)
      Front.grammar.BasicSyntax.postludes

end
menhir-20200123/src/coqBackend.mli000066400000000000000000000020701361226111300165460ustar00rootroot00000000000000
(******************************************************************************)
(*                                                                            *)
(* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The coq code generator. *) module Run (T: sig end) : sig val write_all: out_channel -> unit end menhir-20200123/src/cst.ml000066400000000000000000000067541361226111300151510ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* Concrete syntax trees. *) (* A concrete syntax tree is one of a leaf -- which corresponds to a terminal symbol; a node -- which corresponds to a non-terminal symbol, and whose immediate descendants form an expansion of that symbol; or an error leaf -- which corresponds to a point where the [error] pseudo-token was shifted. *) type cst = | CstTerminal of Terminal.t | CstNonTerminal of Production.index * cst array | CstError (* This is a (mostly) unambiguous printer for concrete syntax trees, in an sexp-like notation. *) let rec pcst b = function | CstTerminal tok -> (* A leaf is denoted by a terminal symbol. *) Printf.bprintf b "%s" (Terminal.print tok) | CstNonTerminal (prod, csts) -> (* A node is denoted by a bracketed, whitespace-separated list, whose head is a non-terminal symbol (followed with a colon) and whose tail consists of the node's descendants. 
*) (* There is in fact some ambiguity in this notation, since we only print the non-terminal symbol that forms the left-hand side of production [prod], instead of the production itself. This abuse makes things much more readable, and should be acceptable for the moment. The cases where ambiguity actually arises should be rare. *) Printf.bprintf b "[%s:%a]" (Nonterminal.print false (Production.nt prod)) pcsts csts | CstError -> (* An error leaf is denoted by [error]. *) Printf.bprintf b "error" and pcsts b (csts : cst array) = Array.iter (fun cst -> Printf.bprintf b " %a" pcst cst ) csts (* This is the public interface. *) let wrap print f x = let b = Buffer.create 32768 in print b x; Buffer.output_buffer f b let print = wrap pcst (* This is a pretty-printer for concrete syntax trees. The notation is the same as that used by the above printer; the only difference is that the [Pprint] library is used to manage indentation. *) open Pprint let rec build : cst -> document = function | CstTerminal tok -> text (Terminal.print tok) | CstNonTerminal (prod, csts) -> brackets ( group ( text (Nonterminal.print false (Production.nt prod)) ^^ colon ^^ group ( nest 2 ( Array.fold_left (fun doc cst -> doc ^^ break1 ^^ build cst ) empty csts ) ) ^^ break0 ) ) | CstError -> text "error" let show f cst = Channel.pretty 0.8 80 f (build cst) menhir-20200123/src/cst.mli000066400000000000000000000034411361226111300153100ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* Concrete syntax trees. 
*)

(* A concrete syntax tree is one of a leaf -- which corresponds to a
   terminal symbol; a node -- which corresponds to a non-terminal symbol,
   and whose immediate descendants form an expansion of that symbol; or an
   error leaf -- which corresponds to a point where the [error] pseudo-token
   was shifted. *)

type cst =
  | CstTerminal of Terminal.t
  | CstNonTerminal of Production.index * cst array
  | CstError

(* This is a (mostly) unambiguous printer for concrete syntax trees, in an
   sexp-like notation. *)

val print: out_channel -> cst -> unit

(* This is a pretty-printer for concrete syntax trees. The notation is the
   same as that used by the above printer; the only difference is that the
   [Pprint] library is used to manage indentation. *)

val show: out_channel -> cst -> unit
menhir-20200123/src/default.ml000066400000000000000000000127401361226111300157740ustar00rootroot00000000000000
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

open Grammar

module C = Conflict (* artificial dependency; ensures that [Conflict] runs first *)

(* Here is how we check whether state [s] should have a default reduction.

   We check whether [s] has no outgoing shift transitions and only has one
   possible reduction action. In that case, we produce a default reduction
   action, that is, we perform reduction without consulting the lookahead
   token. This saves code, but can alter the parser's behavior in the
   presence of errors.

   The check for default actions subsumes the check for the case where [s]
   admits a reduce action with lookahead symbol "#". In that case, it must be
   the only possible action -- see [Lr1.default_conflict_resolution]. That
   is, we have reached a point where we have recognized a well-formed input
   and are now expecting an end-of-stream. In that case, performing reduction
   without looking at the next token is the right thing to do, since there
   should in fact be none. The state that we reduce to will also have the
   same property, and so on, so we will in fact end up rewinding the entire
   stack and accepting the input when the stack becomes empty.

   (New as of 2012/01/23.) A state where a shift/reduce conflict was solved
   in favor of neither (due to a use of the %nonassoc directive) must not
   perform a default reduction. Indeed, this would effectively mean that the
   failure that was requested by the user is forgotten and replaced with a
   reduction. This surprising behavior is present in ocamlyacc and was
   present in earlier versions of Menhir. See e.g.
   http://caml.inria.fr/mantis/view.php?id=5462

   There is a chance that we might run into trouble if the ideas described in
   the above two paragraphs collide, that is, if we forbid a default
   reduction (due to a shift/reduce conflict solved by %nonassoc) in a node
   where we would like to have default reduction on "#". This situation seems
   unlikely to arise, so I will not do anything about it for the moment.
   (Furthermore, someone who uses precedence declarations is looking for
   trouble anyway.)

   Between 2012/05/25 and 2015/09/25, if [--canonical] has been specified,
   then we disallow default reductions on a normal token, because we do not
   want to introduce any spurious actions into the automaton. We do still
   allow default reductions on "#", since they are needed for the automaton
   to terminate properly. From 2015/09/25 on, we again always allow default
   reductions, as they seem to be beneficial when explaining syntax errors.
*)

let has_default_reduction, count =
  Misc.tabulateo Lr1.number Lr1.fold Lr1.n (fun s ->
    if Lr1.forbid_default_reduction s then
      None
    else
      let reduction = ProductionMap.is_singleton (Lr0.invert (Lr1.reductions s)) in
      match reduction with
      | Some _ ->
          if SymbolMap.purelynonterminal (Lr1.transitions s) then
            reduction
          else
            None
      | None ->
          reduction
  )

let () =
  Error.logC 1 (fun f ->
    Printf.fprintf f "%d out of %d states have a default reduction.\n"
      count Lr1.n)

let () =
  Time.tick "Computing default reductions"

(* ------------------------------------------------------------------------ *)

(* Here are a number of auxiliary functions that provide information about
   the LR(1) automaton. *)

(* [reductions_on s z] is the list of reductions permitted in state [s] when
   the lookahead symbol is [z]. This is a list of zero or one elements. This
   does not take default reductions into account. [z] must be real. *)

let reductions_on s z : Production.index list =
  assert (Terminal.real z);
  try
    TerminalMap.find z (Lr1.reductions s)
  with Not_found ->
    []

(* [has_reduction s z] tells whether state [s] is willing to reduce some
   production (and if so, which one) when the lookahead symbol is [z]. It
   takes a possible default reduction into account. [z] must be real. *)

let has_reduction s z : Production.index option =
  assert (Terminal.real z);
  match has_default_reduction s with
  | Some (prod, _) ->
      Some prod
  | None ->
      match reductions_on s z with
      | prod :: prods ->
          assert (prods = []);
          Some prod
      | [] ->
          None

(* [causes_an_error s z] tells whether state [s] will initiate an error on
   the lookahead symbol [z]. [z] must be real.
*)

let causes_an_error s z : bool =
  assert (Terminal.real z);
  match has_default_reduction s with
  | Some _ ->
      false
  | None ->
      reductions_on s z = [] &&
      not (SymbolMap.mem (Symbol.T z) (Lr1.transitions s))
menhir-20200123/src/default.mli000066400000000000000000000032011361226111300161360ustar00rootroot00000000000000
(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

open Grammar

(* [has_default_reduction s] tells whether state [s] has a default reduction,
   and, if so, upon which set of tokens. *)

val has_default_reduction :
  Lr1.node -> (Production.index * TerminalSet.t) option

(* [has_reduction s z] tells whether state [s] is willing to reduce some
   production (and if so, which one) when the lookahead symbol is [z]. It
   takes a possible default reduction into account. [z] must be real. *)

val has_reduction: Lr1.node -> Terminal.t -> Production.index option

(* [causes_an_error s z] tells whether state [s] will initiate an error on
   the lookahead symbol [z]. [z] must be real. *)

val causes_an_error: Lr1.node -> Terminal.t -> bool
menhir-20200123/src/derivation.ml000066400000000000000000000233131361226111300165120ustar00rootroot00000000000000
(******************************************************************************)
(*                                                                            *)
*) (* *) (******************************************************************************) open Grammar (* -------------------------------------------------------------------------- *) (* This is a data structure for linear derivation trees. These are derivation trees that are list-like (that is, they do not branch), because a single path is of interest. A tree is either empty or formed of a non-terminal symbol at the root and a forest below the root. A forest is an ordered list of elements. However, its elements are not trees, as one would perhaps expect. Because we are interested in *linear* derivation trees, only one element of the forest receives focus and is a tree. All other elements remain un-expanded, so they are just symbols. In other words, a linear derivation tree is roughly just a list of levels, where each forest corresponds to one level. *) type 'focus level = { prefix: Symbol.t list; focus: 'focus; suffix: Symbol.t list; comment: string } type tree = | TEmpty | TRooted of Symbol.t * forest and forest = tree level (* We make use of contexts with a forest-shaped hole. We have tree contexts and forest contexts. Tree contexts do not have a case for holes, since we work with forest-shaped holes only. Forest contexts have one. *) type ctree = | CRooted of Symbol.t * cforest and cforest = | CHole | CCons of ctree level (* Make a few types visible to clients. *) type t = forest type context = cforest (* -------------------------------------------------------------------------- *) (* Construction. 
*)

let rec array_to_list a i j =
  if i = j then
    []
  else
    a.(i) :: array_to_list a (i + 1) j

let empty =
  { prefix = []; focus = TEmpty; suffix = []; comment = "" }

let tail pos rhs =
  let length = Array.length rhs in
  assert (pos < length);
  { prefix = [];
    focus = TEmpty;
    suffix = array_to_list rhs pos length;
    comment = "" }

let build pos rhs forest comment =
  let length = Array.length rhs in
  assert (pos < length);
  match rhs.(pos) with
  | Symbol.T _ ->
      assert false
  | Symbol.N _ as symbol ->
      { prefix = [];
        focus = TRooted (symbol, forest);
        suffix = array_to_list rhs (pos + 1) length;
        comment = (match comment with None -> "" | Some comment -> comment) }

let prepend symbol forest =
  { forest with prefix = symbol :: forest.prefix }

(* -------------------------------------------------------------------------- *)

(* Display. *)

let buffer =
  Buffer.create 32768

let rec print_blank k =
  if k > 0 then begin
    Buffer.add_char buffer ' ';
    print_blank (k - 1)
  end

let print_symbol symbol =
  let word = Symbol.print symbol in
  Buffer.add_string buffer word;
  Buffer.add_char buffer ' ';
  String.length word + 1

let print_symbols symbols =
  List.fold_left (fun offset symbol ->
    offset + print_symbol symbol
  ) 0 symbols

let print_level print_focus_root print_focus_remainder offset forest =
  print_blank offset;
  let offset = offset + print_symbols forest.prefix in
  print_focus_root forest.focus;
  let (_ : int) = print_symbols forest.suffix in
  if String.length forest.comment > 0 then begin
    Buffer.add_string buffer "// ";
    Buffer.add_string buffer forest.comment
  end;
  Buffer.add_char buffer '\n';
  print_focus_remainder offset forest.focus

let print_tree_root = function
  | TEmpty ->
      Buffer.add_string buffer ". "
  | TRooted (symbol, _) ->
      let (_ : int) = print_symbol symbol in
      ()

let rec print_forest offset forest =
  print_level print_tree_root print_tree_remainder offset forest

and print_tree_remainder offset = function
  | TEmpty ->
      ()
  | TRooted (_, forest) ->
      print_forest offset forest

let print_ctree_root = function
  | CRooted (symbol, _) ->
      let (_ : int) = print_symbol symbol in
      ()

let rec print_cforest offset cforest =
  match cforest with
  | CHole ->
      print_blank offset;
      Buffer.add_string buffer "(?)\n"
  | CCons forest ->
      print_level print_ctree_root print_ctree_remainder offset forest

and print_ctree_remainder offset = function
  | CRooted (_, cforest) ->
      print_cforest offset cforest

let wrap print channel x =
  Buffer.clear buffer;
  print 0 x;
  Buffer.output_buffer channel buffer

let print =
  wrap print_forest

let printc =
  wrap print_cforest

(* -------------------------------------------------------------------------- *)

(* [punch] turns a (tree or forest) into a pair of a (tree or forest) context
   and a residual forest, where the context is chosen maximal. In other
   words, the residual forest consists of a single level -- its focus is
   [TEmpty]. *)

let rec punch_tree tree : (ctree * forest) option =
  match tree with
  | TEmpty ->
      None
  | TRooted (symbol, forest) ->
      let forest1, forest2 = punch_forest forest in
      Some (CRooted (symbol, forest1), forest2)

and punch_forest forest : cforest * forest =
  match punch_tree forest.focus with
  | None ->
      CHole, forest
  | Some (ctree1, forest2) ->
      CCons { prefix = forest.prefix;
              focus = ctree1;
              suffix = forest.suffix;
              comment = forest.comment },
      forest2

(* [fill] fills a (tree or forest) context with a forest so as to produce a
   new (tree or forest).
*) let rec fill_tree ctree1 forest2 : tree = match ctree1 with | CRooted (symbol1, cforest1) -> TRooted (symbol1, fill_forest cforest1 forest2) and fill_forest cforest1 forest2 : forest = match cforest1 with | CHole -> forest2 | CCons level1 -> { prefix = level1.prefix; focus = fill_tree level1.focus forest2; suffix = level1.suffix; comment = level1.comment } (* [common] factors the maximal common (tree or forest) context out of a pair of a (tree or forest) context and a (tree or forest). It returns the (tree or forest) context as well as the residuals of the two parameters. *) let rec common_tree ctree1 tree2 : (ctree * cforest * forest) option = match ctree1, tree2 with | CRooted _, TEmpty -> None | CRooted (symbol1, cforest1), TRooted (symbol2, forest2) -> if Symbol.equal symbol1 symbol2 then let cforest, cforest1, forest2 = common_forest cforest1 forest2 in Some (CRooted (symbol1, cforest), cforest1, forest2) else None and common_forest cforest1 forest2 : cforest * cforest * forest = match cforest1 with | CHole -> CHole, cforest1, forest2 | CCons forest1 -> if Symbol.lequal forest1.prefix forest2.prefix && Symbol.lequal forest1.suffix forest2.suffix && forest1.comment = forest2.comment then begin match common_tree forest1.focus forest2.focus with | None -> CHole, cforest1, forest2 | Some (ctree, csubforest1, subforest2) -> let cforest = { prefix = forest1.prefix; focus = ctree; suffix = forest1.suffix; comment = forest1.comment } in CCons cforest, csubforest1, subforest2 end else CHole, cforest1, forest2 (* [factor] factors the maximal common forest context out of a nonempty family of forests. We assume that the family is represented as a map indexed by items, because this is convenient for the application that we have in mind, but this assumption is really irrelevant. *) let factor forests = match Item.Map.fold (fun item forest accu -> match accu with | None -> (* First time through the loop, so [forest] is the first forest that we examine. 
Punch it, so as to produce a maximal forest context and a residual forest. *) let context, residual = punch_forest forest in Some (context, Item.Map.singleton item residual) | Some (context, residuals) -> (* Another iteration through the loop. [context] and [residuals] are the maximal common context and the residuals of the forests examined so far. *) (* Combine the common context obtained so far with the forest at hand. This yields a new, smaller common context, as well as residuals for the previous common context and for the forest at hand. *) let context, contextr, forestr = common_forest context forest in (* The residual forests are now: (i) the residual forest [forestr]; and (ii) the previous residual forests [residuals], each of which must be placed with the residual context [contextr]. *) let residuals = Item.Map.add item forestr (Item.Map.map (fill_forest contextr) residuals) in Some (context, residuals) ) forests None with | None -> assert false (* parameter [forests] was an empty map *) | Some (context, residuals) -> context, residuals menhir-20200123/src/derivation.mli000066400000000000000000000054321361226111300166650ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* -------------------------------------------------------------------------- *) (* This is the type of derivations. Derivations are forests: see inside. *) type t (* This is the type of derivations contexts, or derivations with a derivation-shaped hole. 
*) type context (* -------------------------------------------------------------------------- *) (* Construction. *) (* [empty] is the forest that consists of a single empty tree. *) val empty: t (* [tail pos rhs] is the forest: (i) whose first element is the empty tree, and (ii) whose remaining elements are the symbols found at positions greater than or equal to [pos] in the array [rhs]. *) val tail: int -> Symbol.t array -> t (* [build pos rhs forest comment] is the forest: (i) whose first element is the tree that has the non-terminal symbol [rhs.(pos)] at its root and the forest [forest] below its root, and (ii) whose remaining elements are the symbols found at positions greater than [pos] in the array [rhs]. *) val build: int -> Symbol.t array -> t -> string option -> t (* [prepend symbol forest] is the forest: (i) whose first element is the symbol [symbol], and (ii) whose remaining elements form the forest [forest]. *) val prepend: Symbol.t -> t -> t (* -------------------------------------------------------------------------- *) (* Factoring. *) (* [factor] factors the maximal common derivation context out of a nonempty family of derivations. It produces a pair of the context and of the residual derivations. *) val factor: t Item.Map.t -> context * t Item.Map.t (* -------------------------------------------------------------------------- *) (* Display. *) (* [print] prints a derivation. *) val print: out_channel -> t -> unit (* [printc] prints a derivation context. *) val printc: out_channel -> context -> unit menhir-20200123/src/dot.ml000066400000000000000000000104051361226111300151320ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Printf (* ------------------------------------------------------------------------- *) (* Type definitions. *) type size = float * float (* in inches *) type orientation = | Portrait | Landscape type rankdir = | LeftToRight | TopToBottom type ratio = | Compress | Fill | Auto type style = (* Both nodes and edges. *) | Solid | Dashed | Dotted | Bold | Invisible (* Nodes only. *) | Filled | Diagonals | Rounded type shape = | Box | Oval | Circle | DoubleCircle (* there are many others, let's stop here *) (* ------------------------------------------------------------------------- *) (* Basic printers. *) let print_style = function | None -> "" | Some style -> let style = match style with | Solid -> "solid" | Dashed -> "dashed" | Dotted -> "dotted" | Bold -> "bold" | Invisible -> "invis" | Filled -> "filled" | Diagonals -> "diagonals" | Rounded -> "rounded" in sprintf ", style = %s" style let print_shape = function | None -> "" | Some shape -> let shape = match shape with | Box -> "box" | Oval -> "oval" | Circle -> "circle" | DoubleCircle -> "doublecircle" in sprintf ", shape = %s" shape (* ------------------------------------------------------------------------- *) (* The graph printer. 
*) module Print (G : sig type vertex val name: vertex -> string val successors: (?style:style -> label:string -> vertex -> unit) -> vertex -> unit val iter: (?shape:shape -> ?style:style -> label:string -> vertex -> unit) -> unit end) = struct let print ?(directed = true) ?size ?(orientation = Landscape) ?(rankdir = LeftToRight) ?(ratio = Compress) (f : out_channel) = fprintf f "%s G {\n" (if directed then "digraph" else "graph"); Option.iter (fun (hsize, vsize) -> fprintf f "size=\"%f, %f\";\n" hsize vsize ) size; begin match orientation with | Portrait -> fprintf f "orientation = portrait;\n" | Landscape -> fprintf f "orientation = landscape;\n" end; begin match rankdir with | LeftToRight -> fprintf f "rankdir = LR;\n" | TopToBottom -> fprintf f "rankdir = TB;\n" end; begin match ratio with | Compress -> fprintf f "ratio = compress;\n" | Fill -> fprintf f "ratio = fill;\n" | Auto -> fprintf f "ratio = auto;\n" end; G.iter (fun ?shape ?style ~label vertex -> fprintf f "%s [ label=\"%s\"%s%s ] ;\n" (G.name vertex) label (print_style style) (print_shape shape) ); G.iter (fun ?shape ?style ~label source -> ignore shape; (* avoid unused variable warnings *) ignore style; ignore label; G.successors (fun ?style ~label destination -> fprintf f "%s %s %s [ label=\"%s\"%s ] ;\n" (G.name source) (if directed then "->" else "--") (G.name destination) label (print_style style) ) source ); fprintf f "\n}\n" end menhir-20200123/src/dot.mli000066400000000000000000000043341361226111300153070ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* This module displays graphs in graphviz dot format. It is much more basic than the one bundled with the ocamlgraph library, but offers the advantage of being stand-alone. *) (* ------------------------------------------------------------------------- *) (* Type definitions. *) type size = float * float (* in inches *) type orientation = | Portrait | Landscape type rankdir = | LeftToRight | TopToBottom type ratio = | Compress | Fill | Auto type style = (* Both nodes and edges. *) | Solid | Dashed | Dotted | Bold | Invisible (* Nodes only. *) | Filled | Diagonals | Rounded type shape = | Box | Oval | Circle | DoubleCircle (* there are many others, let's stop here *) (* ------------------------------------------------------------------------- *) (* The graph printer. *) module Print (G : sig type vertex val name: vertex -> string val successors: (?style:style -> label:string -> vertex -> unit) -> vertex -> unit val iter: (?shape:shape -> ?style:style -> label:string -> vertex -> unit) -> unit end) : sig val print: ?directed: bool -> ?size: size -> ?orientation: orientation -> ?rankdir: rankdir -> ?ratio: ratio -> out_channel -> unit end menhir-20200123/src/dune000066400000000000000000000015501361226111300146710ustar00rootroot00000000000000;; Compilation flags for Menhir. ;; Warnings are enabled (and fatal) during development, ;; but are disabled in releases. (env (dev (flags :standard -safe-string -g -w @1..66-4-9-41-44-60 )) (release (flags :standard -safe-string -g )) ) ;; The following parsers are built by ocamlyacc. (ocamlyacc sentenceParser ) ;; The following lexers are built by ocamllex. (ocamllex lexer lineCount lexmli lexdep chopInlined sentenceLexer segment lexpointfree ) ;; The Menhir standard library "standard.mly" is embedded in the source code of ;; Menhir using the following rule. It generates a file "standard_mly.ml" with ;; contents "let contents = {||}". 
(rule (with-stdout-to standard_mly.ml (progn (echo "let contents = {|") (cat standard.mly) (echo "|}") ) ) ) menhir-20200123/src/error.ml000066400000000000000000000073501361226111300155020ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Printf (* ---------------------------------------------------------------------------- *) (* A mechanism to turn all display (logging, warnings, errors) on and off. *) let enabled = ref true let enable () = enabled := true let disable () = enabled := false (* ---------------------------------------------------------------------------- *) (* The new OCaml type inference protocol means that Menhir is called twice, first with [--infer-write-query], then with [--infer-read-reply]. This means that any information messages or warnings issued before OCaml type inference takes place are duplicated, unless we do something about it. To address this issue, when [--infer-read-reply] is set, we disable all output until the point where we read the inferred [.mli] file. Then, we enable it again and continue. *) (* An alternative idea would be to disable all output when [--infer-write-query] is set. However, we would then have no output at all if this command fails. *) let () = Settings.(match infer with | IMReadReply _ -> disable() | _ -> () ) (* ---------------------------------------------------------------------------- *) (* Logging and log levels. *) let log kind verbosity msg = if kind >= verbosity && !enabled then Printf.fprintf stderr "%t%!" 
msg let logG = log Settings.logG let logA = log Settings.logA let logC = log Settings.logC (* ---------------------------------------------------------------------------- *) (* Errors and warnings. *) let print_positions f positions = List.iter (fun position -> fprintf f "%s:\n" (Positions.string_of_pos position) ) positions let display continuation header positions format = let kprintf = if !enabled then Printf.kfprintf else Printf.ikfprintf in kprintf continuation stderr ("%a" ^^ header ^^ format ^^ "\n%!") print_positions positions let error positions format = display (fun _ -> exit 1) "Error: " positions format let warning positions format = display (fun _ -> ()) "Warning: " positions format let errorp v = error [ Positions.position v ] (* ---------------------------------------------------------------------------- *) (* Delayed error reports -- where multiple errors can be reported at once. *) type category = bool ref let new_category () = ref false let signal category positions format = display (fun _ -> category := true) "Error: " positions format let exit_if category = if !category then exit 1 (* ---------------------------------------------------------------------------- *) (* Certain warnings about the grammar can optionally be treated as errors. *) let grammatical_error = new_category() let grammar_warning pos = if Settings.strict then signal grammatical_error pos else warning pos menhir-20200123/src/error.mli000066400000000000000000000070571361226111300156570ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* This module helps report errors. *) (* ---------------------------------------------------------------------------- *) (* A mechanism to turn all display (logging, warnings, errors) on and off. *) val enable: unit -> unit val disable: unit -> unit (* ---------------------------------------------------------------------------- *) (* Logging and log levels. *) val logG: int -> (out_channel -> unit) -> unit val logA: int -> (out_channel -> unit) -> unit val logC: int -> (out_channel -> unit) -> unit (* ---------------------------------------------------------------------------- *) (* Errors and warnings. *) (* [error ps format ...] displays the list of positions [ps], followed with the error message [format ...], and exits. The strings "Error: " and "\n" are automatically added at the beginning and end of the error message. The message should begin with a lowercase letter and end with a dot. *) val error: Positions.positions -> ('a, out_channel, unit, 'b) format4 -> 'a (* [errorp] is like [error], but uses the position range carried by [v]. *) val errorp: _ Positions.located -> ('a, out_channel, unit, 'b) format4 -> 'a (* [warning] is like [error], except it does not exit. *) val warning: Positions.positions -> ('a, out_channel, unit, unit) format4 -> 'a (* ---------------------------------------------------------------------------- *) (* Delayed error reports -- where multiple errors can be reported at once. *) (* A category of errors. *) type category (* [new_category()] creates a new category of errors. *) val new_category: unit -> category (* [signal category] is like [error], except it does not exit immediately. It records the fact that an error of this category has occurred. This can be later detected by [exit_if category]. 
*) val signal: category -> Positions.positions -> ('a, out_channel, unit, unit) format4 -> 'a (* [exit_if category] exits with exit code 1 if [signal category] was previously called. Together, [signal] and [exit_if] allow reporting multiple errors before aborting. *) val exit_if: category -> unit (* ---------------------------------------------------------------------------- *) (* Certain warnings about the grammar can optionally be treated as errors. *) val grammatical_error: category (* [grammar_warning] emits a warning or error message, via either [warning] or [signal grammatical_error]. It does not stop the program; the client must at some point use [exit_if grammatical_error] and stop the program if any errors have been reported. *) val grammar_warning: Positions.positions -> ('a, out_channel, unit, unit) format4 -> 'a menhir-20200123/src/expandTokenAliases.ml000066400000000000000000000130371361226111300201320ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax (* We first build an alias map, which records the token aliases declared across all partial grammars. This is a map of aliases to pairs of a terminal symbol and the position where this symbol is declared. Then, we walk the partial grammars before they are joined, expanding the token aliases along the way. *) type aliasmap = (terminal * Positions.t) StringMap.t (* -------------------------------------------------------------------------- *) (* Extend an alias map with the token aliases present in a declaration. 
*) let collect_aliases_from_declaration (aliasmap : aliasmap) decl : aliasmap = match Positions.value decl with | DToken (_, id, Some qid, _) -> begin match StringMap.find qid aliasmap with | exception Not_found -> (* Good: this alias does not exist yet. Record it. *) StringMap.add qid (id, Positions.position decl) aliasmap | id0, pos -> (* Oops: [qid] has already been declared as an alias for some other token. *) Error.error [Positions.position decl; pos] "%s cannot be declared as an alias for the symbol %s.\n\ It has already been declared as an alias for %s." qid id id0 end | _ -> aliasmap (* Extend an alias map with the token aliases present in a partial grammar. *) let collect_aliases_from_grammar aliasmap g = List.fold_left collect_aliases_from_declaration aliasmap g.pg_declarations let collect_aliases_from_grammars gs : aliasmap = List.fold_left collect_aliases_from_grammar StringMap.empty gs (* -------------------------------------------------------------------------- *) (* Expand a possible alias, returning a name which definitely is not an alias (and may or may not be a valid terminal symbol). *) let dealias_terminal (aliasmap : aliasmap) pos (t : terminal) : terminal = (* [t] is either a terminal symbol or a token alias. If it starts with double quote, then it must be a token alias. *) if t.[0] = '"' then match StringMap.find t aliasmap with | id, _ -> id | exception Not_found -> Error.error [pos] "the token alias %s was never declared." t else t (* Perform alias expansion throughout a partial grammar. (Visitors could be useful here!) 
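For instance, in a hypothetical grammar fragment, given the declaration

    %token PLUS "+"

this pass rewrites a rule such as

    expr: expr "+" expr { ... }

into

    expr: expr PLUS expr { ... }

leaving the [%token] declaration itself untouched.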
*) let dealias_symbol aliasmap (sym : terminal Positions.located) = Positions.pmap (dealias_terminal aliasmap) sym let rec dealias_parameter aliasmap (param : parameter) = match param with | ParameterVar sym -> ParameterVar (dealias_symbol aliasmap sym) | ParameterApp (sym, params) -> ParameterApp ( dealias_symbol aliasmap sym, dealias_parameters aliasmap params ) | ParameterAnonymous branches -> ParameterAnonymous (Positions.map (dealias_branches aliasmap) branches) and dealias_parameters aliasmap params = List.map (dealias_parameter aliasmap) params and dealias_producer aliasmap (producer : producer) = let id, param, attrs = producer in id, (dealias_parameter aliasmap param), attrs and dealias_producers aliasmap producers = List.map (dealias_producer aliasmap) producers and dealias_branch aliasmap (branch : parameterized_branch) = { branch with pr_producers = dealias_producers aliasmap branch.pr_producers } and dealias_branches aliasmap branches = List.map (dealias_branch aliasmap) branches let dealias_rule aliasmap rule = { rule with pr_branches = dealias_branches aliasmap rule.pr_branches } let dealias_decl aliasmap (decl : declaration Positions.located) = Positions.pmap (fun pos (decl : declaration) -> match decl with | DCode _ | DParameter _ | DToken _ | DStart _ | DGrammarAttribute _ -> decl | DTokenProperties (t, assoc, prec) -> DTokenProperties (dealias_terminal aliasmap pos t, assoc, prec) | DType (ty, param) -> DType (ty, dealias_parameter aliasmap param) | DSymbolAttributes (params, attrs) -> DSymbolAttributes (dealias_parameters aliasmap params, attrs) | DOnErrorReduce (param, level) -> DOnErrorReduce (dealias_parameter aliasmap param, level) ) decl let dealias_grammar aliasmap g = { g with pg_declarations = List.map (dealias_decl aliasmap) g.pg_declarations; pg_rules = List.map (dealias_rule aliasmap) g.pg_rules } let dealias_grammars aliasmap gs = List.map (dealias_grammar aliasmap) gs (* 
-------------------------------------------------------------------------- *) (* The two phases above are combined as follows. *) let dealias_grammars gs = let aliasmap = collect_aliases_from_grammars gs in dealias_grammars aliasmap gs menhir-20200123/src/expandTokenAliases.mli000066400000000000000000000030271361226111300203010ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Token aliases are quoted strings that are used to provide syntactic sugar for terminal symbols, for example, to allow "+" to be used in grammar rules instead of PLUS, or to allow ")" instead of RPAREN. *) (* This transformation eliminates all references to token aliases in a list of partial grammars. (An alias declared in one partial grammar can be used in another partial grammar.) Declarations of token aliases are preserved, and could be used if desired (e.g. for printing). *) open Syntax val dealias_grammars: partial_grammar list -> partial_grammar list menhir-20200123/src/front.ml000066400000000000000000000207071361226111300155020ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The front-end. 
This module performs a series of toplevel side effects. *) (* ------------------------------------------------------------------------- *) (* Reading a grammar from a file. *) let load_grammar_from_contents filename contents = InputFile.new_input_file filename; InputFile.with_file_contents contents (fun () -> let open Lexing in let lexbuf = Lexing.from_string contents in lexbuf.lex_curr_p <- { lexbuf.lex_curr_p with pos_fname = filename }; (* the grammar: *) { (Driver.grammar Lexer.main lexbuf) with Syntax.pg_filename = filename } ) let check_filename filename = let validExt = if Settings.coq then ".vy" else ".mly" in if not (Filename.check_suffix filename validExt) then Error.error [] "argument file names should end in %s. \"%s\" is not accepted." validExt filename let load_grammar_from_file filename : Syntax.partial_grammar = check_filename filename; try let contents = IO.read_whole_file filename in load_grammar_from_contents filename contents with Sys_error msg -> Error.error [] "%s" msg (* ------------------------------------------------------------------------- *) (* Read all of the grammar files that are named on the command line, plus the standard library, unless suppressed by [--no-stdlib] or [--coq]. *) let grammars () : Syntax.partial_grammar list = List.map load_grammar_from_file Settings.filenames let grammars : Syntax.partial_grammar list = if Settings.no_stdlib || Settings.coq then grammars() else (* As 20190924, the standard library is no longer actually read from a file. Instead, its text is built into the Menhir executable: it is found in the string [Standard_mly.contents]. We parse it just as if it had been read from a file, and pretend that the file name is [Settings.stdlib_filename]. This file name can appear in generated parsers, because Menhir produces # directives that point back to source (.mly) files. *) (* Note that the [let] construct below is required in order to ensure that the standard library is read first. 
*) let standard_library = load_grammar_from_contents Settings.stdlib_filename Standard_mly.contents in standard_library :: grammars() let () = Time.tick "Lexing and parsing" (* ------------------------------------------------------------------------- *) (* Eliminate anonymous rules. *) let grammars : Syntax.partial_grammar list = List.map Anonymous.transform_partial_grammar grammars (* ------------------------------------------------------------------------- *) (* If several grammar files were specified, merge them. *) let grammar : Syntax.grammar = PartialGrammar.join_partial_grammars grammars (* ------------------------------------------------------------------------- *) (* Check that the grammar is well-sorted; infer the sort of every symbol. *) let sorts = SortInference.infer grammar (* ------------------------------------------------------------------------- *) (* Expand away all applications of parameterized nonterminal symbols, so as to obtain a grammar without parameterized nonterminal symbols. *) let grammar : BasicSyntax.grammar = let module S = SelectiveExpansion in (* First, perform a selective expansion: expand away all parameters of higher sort, keeping the parameters of sort [*]. This process always terminates. *) let grammar1 = S.expand S.ExpandHigherSort sorts grammar in (* This "first-order parameterized grammar" can then be submitted to the termination check. *) CheckSafeParameterizedGrammar.check grammar1; (* If it passes the check, then full expansion is safe. We drop [grammar1] and start over from [grammar]. This is required in order to get correct names. (Expanding [grammar1] would yield an equivalent grammar, with more complicated names, reflecting the two steps of expansion.) *) let grammar = S.expand S.ExpandAll sorts grammar in (* This yields an unparameterized grammar. 
*) Drop.drop grammar let () = Time.tick "Joining and expanding" (* ------------------------------------------------------------------------- *) (* If [--only-tokens] was specified on the command line, produce the definition of the [token] type and stop. *) let () = TokenType.produce_tokentypes grammar (* ------------------------------------------------------------------------- *) (* Perform reachability analysis. *) let grammar = Reachability.trim grammar let () = Time.tick "Trimming" (* ------------------------------------------------------------------------- *) (* If [--infer] was specified on the command line, perform type inference. The OCaml type of every nonterminal symbol is then known. *) (* If [--depend] or [--raw-depend] was specified on the command line, perform dependency analysis and stop. *) (* The purpose of [--depend] and [--raw-depend] is to support [--infer]. Indeed, [--infer] is implemented by producing a mock [.ml] file (which contains just the semantic actions) and invoking [ocamlc]. This requires certain [.cmi] files to exist. So, [--(raw-)depend] is a way for us to announce which [.cmi] files we need. It is implemented by producing the mock [.ml] file and running [ocamldep] on it. We also produce a mock [.mli] file, even though in principle it should be unnecessary -- see comment in [nonterminalType.mli]. *) (* If [--infer-write-query] was specified on the command line, write a mock [.ml] file and stop. It is then up to the user (or build system) to invoke [ocamlc -i] on this file, so as to do type inference. *) (* If [--infer-read-reply] was specified on the command line, read the inferred [.mli] file. The OCaml type of every nonterminal symbol is then known, just as with [--infer]. 
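Concretely, the two-phase protocol described above can be driven by hand roughly as follows; the file names are illustrative, and a build system such as dune normally orchestrates these steps automatically.

```
# sketch of the two-phase inference protocol:
menhir --infer-write-query mock.ml parser.mly   # phase 1: emit a mock .ml file
ocamlc -i mock.ml > mock.mli                    # let OCaml infer the types
menhir --infer-read-reply mock.mli parser.mly   # phase 2: read the inferred types
```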
*) let grammar, ocaml_types_have_been_checked = Settings.(match infer with | IMNone -> grammar, false | IMInfer -> let grammar = Infer.infer grammar in Time.tick "Inferring types for nonterminals"; grammar, true | IMDependRaw -> Infer.depend false grammar (* never returns *) | IMDependPostprocess -> Infer.depend true grammar (* never returns *) | IMWriteQuery filename -> Infer.write_query filename grammar (* never returns *) | IMReadReply filename -> let grammar = Infer.read_reply filename grammar in Time.tick "Reading inferred types for nonterminals"; grammar, true ) (* ------------------------------------------------------------------------- *) (* Expand away some of the position keywords. *) let grammar = KeywordExpansion.expand_grammar grammar (* ------------------------------------------------------------------------- *) (* If [--no-inline] was specified on the command line, skip the inlining of non terminal definitions marked with %inline. *) let grammar = if Settings.inline then begin let grammar = Inlining.inline grammar in (* 2018/05/23 Removed the warning that was issued when %inline was used but --infer was turned off. Most people should use ocamlbuild or dune anyway. *) Time.tick "Inlining"; grammar end else grammar (* ------------------------------------------------------------------------- *) (* If [--only-preprocess] or [--only-preprocess-drop] was specified on the command line, print the grammar and stop. Otherwise, continue. *) let () = match Settings.preprocess_mode with | Settings.PMOnlyPreprocess mode -> BasicPrinter.print mode stdout grammar; exit 0 | Settings.PMNormal -> () menhir-20200123/src/front.mli000066400000000000000000000036711361226111300156540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module drives the front-end. It opens and parses the input files, which yields a number of partial grammars. It joins these grammars, expands them to get rid of parameterized nonterminals, and performs reachability analysis. This yields a single unified grammar. It then performs type inference. This yields the grammar that the back-end works with (often through the interface provided by module [Grammar]). *) val grammar: BasicSyntax.grammar (* This flag tells whether the semantic actions have been type-checked. It is set if and only if either [--infer] or [--infer-read-reply] is in use. Note that the presence of a %type declaration for every nonterminal symbol is *not* sufficient for this flag to be set. Note also that, when [--infer-read-reply] is set, it could be the case that we have an out-of-date inferred [.mli] file, so the semantic actions could still be ill-typed. (The user is then at fault.) *) val ocaml_types_have_been_checked: bool menhir-20200123/src/gMap.ml000066400000000000000000000151321361226111300152320ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) module type S = sig (* Keys are assumed to have a natural total order. *) type key (* The type of maps whose data have type ['a]. *) type 'a t (* The empty map. 
*) val empty: 'a t (* [lookup k m] looks up the value associated to the key [k] in the map [m], and raises [Not_found] if no value is bound to [k]. *) val lookup: key -> 'a t -> 'a val find: key -> 'a t -> 'a (* [add k d m] returns a map whose bindings are all bindings in [m], plus a binding of the key [k] to the datum [d]. If a binding already exists for [k], it is overridden. *) val add: key -> 'a -> 'a t -> 'a t (* [strict_add k d m] returns a map whose bindings are all bindings in [m], plus a binding of the key [k] to the datum [d]. If a binding already exists for [k] then [Unchanged] is raised. *) exception Unchanged val strict_add: key -> 'a -> 'a t -> 'a t (* [fine_add decide k d m] returns a map whose bindings are all bindings in [m], plus a binding of the key [k] to the datum [d]. If a binding from [k] to [d0] already exists, then the resulting map contains a binding from [k] to [decide d0 d]. *) type 'a decision = 'a -> 'a -> 'a val fine_add: 'a decision -> key -> 'a -> 'a t -> 'a t (* [mem k m] tells whether the key [k] appears in the domain of the map [m]. *) val mem: key -> 'a t -> bool (* [singleton k d] returns a map whose only binding is from [k] to [d]. *) val singleton: key -> 'a -> 'a t (* [is_empty m] returns [true] if and only if the map [m] defines no bindings at all. *) val is_empty: 'a t -> bool (* [is_singleton s] returns [Some x] if [s] is a singleton containing [x] as its only element; otherwise, it returns [None]. *) val is_singleton: 'a t -> (key * 'a) option (* [cardinal m] returns [m]'s cardinal, that is, the number of keys it binds, or, in other words, the cardinal of its domain. *) val cardinal: 'a t -> int (* [choose m] returns an arbitrarily chosen binding in [m], if [m] is nonempty, and raises [Not_found] otherwise. *) val choose: 'a t -> key * 'a (* [lookup_and_remove k m] looks up the value [v] associated to the key [k] in the map [m], and raises [Not_found] if no value is bound to [k]. 
The call returns the value [v], together with the map [m] deprived from the binding from [k] to [v]. *) val lookup_and_remove: key -> 'a t -> 'a * 'a t val find_and_remove: key -> 'a t -> 'a * 'a t (* [remove k m] is the map [m] deprived from any binding for [k]. *) val remove: key -> 'a t -> 'a t (* [union m1 m2] returns the union of the maps [m1] and [m2]. Bindings in [m2] take precedence over those in [m1]. *) val union: 'a t -> 'a t -> 'a t (* [fine_union decide m1 m2] returns the union of the maps [m1] and [m2]. If a key [k] is bound to [x1] (resp. [x2]) within [m1] (resp. [m2]), then [decide] is called. It is passed [x1] and [x2], and must return the value that shall be bound to [k] in the final map. *) val fine_union: 'a decision -> 'a t -> 'a t -> 'a t (* [iter f m] invokes [f k x], in turn, for each binding from key [k] to element [x] in the map [m]. Keys are presented to [f] in increasing order. *) val iter: (key -> 'a -> unit) -> 'a t -> unit (* [fold f m seed] invokes [f k d accu], in turn, for each binding from key [k] to datum [d] in the map [m]. Keys are presented to [f] in increasing order. The initial value of [accu] is [seed]; then, at each new call, its value is the value returned by the previous invocation of [f]. The value returned by [fold] is the final value of [accu]. *) val fold: (key -> 'a -> 'b -> 'b) -> 'a t -> 'b -> 'b (* [fold_rev] performs exactly the same job as [fold], but presents keys to [f] in the opposite order. *) val fold_rev: (key -> 'a -> 'b -> 'b) -> 'a t -> 'b -> 'b (* [filter f m] returns a copy of the map [m] where only the bindings that satisfy [f] have been retained. *) val filter: (key -> 'a -> bool) -> 'a t -> 'a t (* It is valid to evaluate [iter2 f m1 m2] if and only if [m1] and [m2] have equal domains. Doing so invokes [f k x1 x2], in turn, for each key [k] bound to [x1] in [m1] and to [x2] in [m2]. Bindings are presented to [f] in increasing order. 
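As a small usage sketch of this signature: the standard library's [Map.Make (Int)] provides a compatible core ([find], [add], [cardinal], ...), and [fine_add] can be emulated with [update]. This instantiation is an illustration, not one of the implementations actually used here.

```ocaml
module M = Map.Make (Int)

(* Emulation of [fine_add]: on a collision, [decide] receives the
   existing datum [d0] first and the new datum [d] second. *)
let fine_add decide k d m =
  M.update k (function None -> Some d | Some d0 -> Some (decide d0 d)) m

let m = M.add 1 "one" (M.add 2 "two" M.empty)
let () = assert (M.find 1 m = "one")

(* Keep the existing datum on collision: *)
let m' = fine_add (fun d0 _d -> d0) 1 "uno" m
let () = assert (M.find 1 m' = "one")
```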
*) val iter2: (key -> 'a -> 'b -> unit) -> 'a t -> 'b t -> unit (* [map f m] returns the map obtained by composing the map [m] with the function [f]; that is, the map $k\mapsto f(m(k))$. *) val map: ('a -> 'b) -> 'a t -> 'b t (* [endo_map] is similar to [map], but attempts to physically share its result with its input. This saves memory when [f] is the identity function. *) val endo_map: ('a -> 'a) -> 'a t -> 'a t (* If [dcompare] is an ordering over data, then [compare dcompare] is an ordering over maps. *) val compare: ('a -> 'a -> int) -> 'a t -> 'a t -> int (* A map's domain is a set. Thus, to be able to perform operations on domains, we need set operations, provided by the [Domain] sub-module. The two-way connection between maps and their domains is given by two additional functions, [domain] and [lift]. [domain m] returns [m]'s domain. [lift f s] returns the map $k\mapsto f(k)$, where $k$ ranges over a set of keys [s]. *) module Domain : GSet.S with type element = key val domain: 'a t -> Domain.t val lift: (key -> 'a) -> Domain.t -> 'a t (* [corestrict m d] performs a co-restriction of the map [m] to the domain [d]. That is, it returns the map $k\mapsto m(k)$, where $k$ ranges over all keys bound in [m] but \emph{not} present in [d]. *) val corestrict: 'a t -> Domain.t -> 'a t end menhir-20200123/src/gSet.ml000066400000000000000000000071441361226111300152540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is a stripped down version of [GSet] that describes both [Patricia] and [CompressedBitSet]. 
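As a usage sketch of this signature: the standard library's [Set.Make (Int)] happens to implement a compatible subset of it (its [fold] also visits elements in increasing order). This instantiation is purely illustrative.

```ocaml
module S = Set.Make (Int)

let s = S.add 3 (S.add 1 (S.add 3 S.empty))
let () = assert (S.cardinal s = 2)       (* duplicate insertions collapse *)
let () = assert (S.mem 1 s && not (S.mem 2 s))
let () = assert (S.fold (+) s 0 = 4)     (* elements visited: 1, then 3 *)
```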
The full version of [GSet] is in [AlphaLib]. *) module type S = sig (* Elements are assumed to have a natural total order. *) type element (* Sets. *) type t (* The empty set. *) val empty: t (* [is_empty s] tells whether [s] is the empty set. *) val is_empty: t -> bool (* [singleton x] returns a singleton set containing [x] as its only element. *) val singleton: element -> t (* [is_singleton s] tests whether [s] is a singleton set. *) val is_singleton: t -> bool (* [cardinal s] returns the cardinal of [s]. *) val cardinal: t -> int (* [choose s] returns an arbitrarily chosen element of [s], if [s] is nonempty, and raises [Not_found] otherwise. *) val choose: t -> element (* [mem x s] returns [true] if and only if [x] appears in the set [s]. *) val mem: element -> t -> bool (* [add x s] returns a set whose elements are all elements of [s], plus [x]. *) val add: element -> t -> t (* [remove x s] returns a set whose elements are all elements of [s], except [x]. *) val remove: element -> t -> t (* [union s1 s2] returns the union of the sets [s1] and [s2]. *) val union: t -> t -> t (* [inter s t] returns the set intersection of [s] and [t], that is, $s\cap t$. *) val inter: t -> t -> t (* [disjoint s1 s2] returns [true] if and only if the sets [s1] and [s2] are disjoint, i.e. iff their intersection is empty. *) val disjoint: t -> t -> bool (* [iter f s] invokes [f x], in turn, for each element [x] of the set [s]. Elements are presented to [f] in increasing order. *) val iter: (element -> unit) -> t -> unit (* [fold f s seed] invokes [f x accu], in turn, for each element [x] of the set [s]. Elements are presented to [f] in increasing order. The initial value of [accu] is [seed]; then, at each new call, its value is the value returned by the previous invocation of [f]. The value returned by [fold] is the final value of [accu]. 
In other words, if $s = \{ x_1, x_2, \ldots, x_n \}$, where $x_1 < x_2 < \ldots < x_n$, then [fold f s seed] computes $([f]\,x_n\,\ldots\,([f]\,x_2\,([f]\,x_1\,[seed]))\ldots)$. *) val fold: (element -> 'b -> 'b) -> t -> 'b -> 'b (* [elements s] is a list of all elements in the set [s]. *) val elements: t -> element list (* [compare] is an ordering over sets. *) val compare: t -> t -> int (* [equal] implements equality over sets. *) val equal: t -> t -> bool (* [subset] implements the subset predicate over sets. *) val subset: (t -> t -> bool) end

menhir-20200123/src/grammar.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module runs the grammar functor on the grammar produced by the front-end. *) include GrammarFunctor.Make(struct let grammar = Front.grammar let verbose = true end)

menhir-20200123/src/grammarFunctor.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax open Syntax open Positions module Make (G : sig (* An abstract syntax tree for the grammar.
*) val grammar: BasicSyntax.grammar (* This flag indicates whether it is OK to produce warnings, verbose information, etc., when this functor is invoked. If it is set to [false], then only serious errors can be signaled. *) val verbose: bool end) = struct open G (* ------------------------------------------------------------------------ *) (* Precedence levels for tokens or pseudo-tokens alike. *) module TokPrecedence = struct (* This set records, on a token by token basis, whether the token's precedence level is ever useful. This allows emitting warnings about useless precedence declarations. *) let ever_useful : StringSet.t ref = ref StringSet.empty let use id = ever_useful := StringSet.add id !ever_useful (* This function is invoked when someone wants to consult a token's precedence level. This does not yet mean that this level is useful, though. Indeed, if it is subsequently compared against [UndefinedPrecedence], it will not allow solving a conflict. So, in addition to the desired precedence level, we return a delayed computation which, when evaluated, records that this precedence level was useful. *) let levelip id properties = lazy (use id), properties.tk_precedence let leveli id = let properties = try StringMap.find id grammar.tokens with Not_found -> assert false (* well-formedness check has been performed earlier *) in levelip id properties (* This function prints warnings about useless precedence declarations for terminal symbols (%left, %right, %nonassoc). It should be invoked only after the automaton has been constructed. *) let diagnostics () = StringMap.iter (fun id properties -> if not (StringSet.mem id !ever_useful) then match properties.tk_precedence with | UndefinedPrecedence -> () | PrecedenceLevel (_, _, pos1, pos2) -> Error.grammar_warning [Positions.import (pos1, pos2)] "the precedence level assigned to %s is never useful." id ) grammar.tokens end (* ------------------------------------------------------------------------ *) (* Nonterminals.
*) module Nonterminal = struct type t = int let n2i i = i let compare = (-) (* Determine how many nonterminals we have and build mappings both ways between names and indices. A new nonterminal is created for every start symbol. *) let new_start_nonterminals = StringSet.fold (fun symbol ss -> (symbol ^ "'") :: ss) grammar.start_symbols [] let original_nonterminals = nonterminals grammar let start = List.length new_start_nonterminals let (n : int), (name : string array), (map : int StringMap.t) = Misc.index (new_start_nonterminals @ original_nonterminals) let () = if verbose then Error.logG 1 (fun f -> Printf.fprintf f "Grammar has %d nonterminal symbols, among which %d start symbols.\n" (n - start) start ) let is_start nt = nt < start let print normalize nt = if normalize then Misc.normalize name.(nt) else name.(nt) let lookup name = StringMap.find name map let positions nt = (StringMap.find (print false nt) grammar.rules).positions let init f = Array.init n f let iter f = Misc.iteri n f let fold f accu = Misc.foldi n f accu let map f = Misc.mapi n f let iterx f = for nt = start to n - 1 do f nt done let foldx f accu = Misc.foldij start n f accu let ocamltype nt = assert (not (is_start nt)); try Some (StringMap.find (print false nt) grammar.types) with Not_found -> None let ocamltype_of_start_symbol nt = match ocamltype nt with | Some typ -> typ | None -> (* Every start symbol has a type. *) assert false let tabulate f = Array.get (Array.init n f) let attributes : Syntax.attributes array = Array.make n [] let () = StringMap.iter (fun nonterminal { attributes = attrs } -> let nt = lookup nonterminal in attributes.(nt) <- attrs ) grammar.rules let attributes nt = attributes.(nt) end (* Sets and maps over nonterminals. *) module NonterminalMap = Patricia.Big module NonterminalSet = Patricia.Big.Domain (* ------------------------------------------------------------------------ *) (* Terminals. 
*) module Terminal = struct type t = int let t2i i = i let i2t i = i let compare = (-) let equal (tok1 : t) (tok2 : t) = tok1 = tok2 (* Determine how many terminals we have and build mappings both ways between names and indices. A new terminal "#" is created. A new terminal "error" is created. The fact that the integer code assigned to the "#" pseudo-terminal is the last one is exploited in the table-based back-end. (The right-most row of the action table is not created.) Pseudo-tokens (used in %prec declarations, but never declared using %token) are filtered out. *) (* In principle, the number of the [error] token is irrelevant. It is currently 0, but we do not rely on that. *) let (n : int), (name : string array), (map : int StringMap.t) = let tokens = tokens grammar in match tokens with | [] when verbose -> Error.error [] "no tokens have been declared." | _ -> Misc.index ("error" :: tokens @ [ "#" ]) let print tok = name.(tok) let lookup name = StringMap.find name map let sharp = lookup "#" let error = lookup "error" let pseudo tok = (tok = sharp) || (tok = error) let real t = error <> t && t <> sharp let non_error tok = tok <> error let token_properties = let not_so_dummy_properties = (* applicable to [error] and [#] *) { tk_filename = "__primitives__"; tk_precedence = UndefinedPrecedence; tk_associativity = UndefinedAssoc; tk_ocamltype = None; tk_is_declared = true; tk_position = Positions.dummy; tk_attributes = []; } in Array.init n (fun tok -> try StringMap.find name.(tok) grammar.tokens with Not_found -> assert (tok = sharp || tok = error); not_so_dummy_properties ) let () = if verbose then Error.logG 1 (fun f -> Printf.fprintf f "Grammar has %d terminal symbols.\n" (n - 2) ) let precedence_level tok = TokPrecedence.levelip (print tok) token_properties.(tok) let associativity tok = token_properties.(tok).tk_associativity let ocamltype tok = token_properties.(tok).tk_ocamltype let init f = Array.init n f let iter f = Misc.iteri n f let fold f accu = 
Misc.foldi n f accu let map f = Misc.mapi n f let () = assert (sharp = n - 1) let foldx f accu = Misc.foldi sharp f accu let mapx f = Misc.mapi sharp f let () = assert (error = 0) let iter_real f = for i = 1 to n-2 do f i done (* If a token named [EOF] exists, then it is assumed to represent ocamllex's [eof] pattern. *) let eof = try Some (lookup "EOF") with Not_found -> None let attributes tok = token_properties.(tok).tk_attributes (* The sub-module [Word] offers an implementation of words (that is, sequences) of terminal symbols. It is used by [LRijkstra]. We make it a functor, because it has internal state (a hash table) and a side effect (failure if there are more than 256 terminal symbols). *) module Word (X : sig end) = struct (* We could use lists, or perhaps the sequences offered by the module [Seq], which support constant time concatenation. However, we need a much more compact representation: [LRijkstra] stores tens of millions of such words. We use strings, because they are very compact (8 bits per symbol), and on top of that, we use a hash-consing facility. In practice, hash-consing allows us to save 1000x in space. *) (* A drawback of this approach is that it works only if the number of terminal symbols is at most 256. For the moment, this is good enough. [LRijkstra] already has difficulty at 100 terminal symbols or so. 
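As a rough illustration of the idea, here is a hypothetical standalone sketch of such a hash-consing facility. It is not Menhir's actual [Misc.new_encode_decode], which additionally returns a statistics function; the name and the two-table implementation below are assumptions made for the sake of the example. Equal strings receive equal integer codes, so each distinct word is stored only once:

```ocaml
(* A minimal hash-consing facility: [encode] interns a string and
   returns its integer code; [decode] maps a code back to its string.
   Two hash tables maintain the two directions of the mapping. *)
let new_encode_decode capacity =
  let codes : (string, int) Hashtbl.t = Hashtbl.create capacity in
  let strings : (int, string) Hashtbl.t = Hashtbl.create capacity in
  let next = ref 0 in
  let encode (w : string) : int =
    match Hashtbl.find_opt codes w with
    | Some code ->
        (* This word has been seen before: share its code. *)
        code
    | None ->
        let code = !next in
        incr next;
        Hashtbl.add codes w code;
        Hashtbl.add strings code w;
        code
  and decode (code : int) : string =
    Hashtbl.find strings code
  in
  encode, decode

let () =
  let encode, decode = new_encode_decode 1024 in
  let i1 = encode "abc" and i2 = encode "abc" in
  (* Sharing: equal words receive equal codes. *)
  assert (i1 = i2);
  assert (decode i1 = "abc")
```

Because a code is just an [int], millions of (shared) words can be stored in an array or hash table at essentially no extra cost per occurrence.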
*) let () = assert (n <= 256) let (encode : string -> int), (decode : int -> string), verbose = Misc.new_encode_decode 1024 type word = int let epsilon = encode "" let singleton t = encode (String.make 1 (Char.chr t)) let append i1 i2 = let w1 = decode i1 and w2 = decode i2 in if String.length w1 = 0 then i2 else if String.length w2 = 0 then i1 else encode (w1 ^ w2) let length i = String.length (decode i) let first i z = let w = decode i in if String.length w > 0 then Char.code w.[0] else z let rec elements i n w = if i = n then [] else Char.code w.[i] :: elements (i + 1) n w let elements i = let w = decode i in elements 0 (String.length w) w let print i = let w = decode i in Misc.separated_iter_to_string (fun c -> print (Char.code c)) " " (fun f -> String.iter f w) (* [Generic.compare] implements a lexicographic ordering on strings. *) let compare i1 i2 = Generic.compare (decode i1) (decode i2) end end (* Sets of terminals are used intensively in the LR(1) construction, so it is important that they be as efficient as possible. *) module TerminalSet = struct include CompressedBitSet let print toks = Misc.separated_iter_to_string Terminal.print " " (fun f -> iter f toks) let universe = remove Terminal.sharp ( remove Terminal.error ( Terminal.fold add empty ) ) (* The following definitions are used in the computation of FIRST sets below. They are not exported outside of this file. *) type property = t let bottom = empty let is_maximal _ = false end (* Maps over terminals. *) module TerminalMap = Patricia.Big (* ------------------------------------------------------------------------ *) (* Symbols. 
*) module Symbol = struct type t = | N of Nonterminal.t | T of Terminal.t let is_terminal sym = match sym with | N _ -> false | T _ -> true let compare sym1 sym2 = match sym1, sym2 with | N nt1, N nt2 -> Nonterminal.compare nt1 nt2 | T tok1, T tok2 -> Terminal.compare tok1 tok2 | N _, T _ -> 1 | T _, N _ -> -1 let equal sym1 sym2 = compare sym1 sym2 = 0 let rec lequal syms1 syms2 = match syms1, syms2 with | [], [] -> true | sym1 :: syms1, sym2 :: syms2 -> equal sym1 sym2 && lequal syms1 syms2 | _ :: _, [] | [], _ :: _ -> false let non_error sym = match sym with | T tok -> Terminal.non_error tok | N _ -> true let print = function | N nt -> Nonterminal.print false nt | T tok -> Terminal.print tok let nonterminal = function | T _ -> false | N _ -> true (* Printing an array of symbols. [offset] is the start offset -- we print everything to its right. [dot] is the dot offset -- we print a dot at this offset, if we find it. *) let printaod offset dot symbols = let buffer = Buffer.create 512 in let length = Array.length symbols in for i = offset to length do if i = dot then Buffer.add_string buffer ". "; if i < length then begin Buffer.add_string buffer (print symbols.(i)); Buffer.add_char buffer ' ' end done; Buffer.contents buffer let printao offset symbols = printaod offset (-1) symbols let printa symbols = printao 0 symbols let printl symbols = printa (Array.of_list symbols) let lookup name = try T (Terminal.lookup name) with Not_found -> try N (Nonterminal.lookup name) with Not_found -> assert false (* well-formedness check has been performed earlier *) end (* Sets of symbols. *) module SymbolSet = struct include Set.Make(Symbol) let print symbols = Symbol.printl (elements symbols) (* The following definitions are used in the computation of symbolic FOLLOW sets below. They are not exported outside of this file. *) type property = t let bottom = empty let is_maximal _ = false end (* Maps over symbols. 
*) module SymbolMap = struct include Map.Make(Symbol) let domain m = fold (fun symbol _ accu -> symbol :: accu ) m [] let init f xs = List.fold_left (fun accu x -> add x (f x) accu ) empty xs let purelynonterminal m = fold (fun symbol _ accu -> accu && Symbol.nonterminal symbol ) m true end (* ------------------------------------------------------------------------ *) (* Productions. *) module Production = struct type index = int let compare = (-) (* A new production S' -> S is created for every start symbol S. It is known as a start production. *) (* Count how many productions we have, including the start productions. This is [n]. *) let n : int = let n = StringMap.fold (fun _ { branches = branches } n -> n + List.length branches ) grammar.rules 0 in if verbose then Error.logG 1 (fun f -> Printf.fprintf f "Grammar has %d productions.\n" n); n + StringSet.cardinal grammar.start_symbols let p2i prod = prod let i2p prod = assert (prod >= 0 && prod < n); prod (* Create a number of uninitialized tables that map a production index to information about this production. *) (* [table] maps a production to the left-hand side and right-hand side of this production. [identifiers] maps a production to an array of the identifiers that are used to name the elements of the right-hand side. [actions] maps a production to an optional semantic action. (Only the start productions have none.) [positions] maps a production to an array of the positions (in the .mly file) of the elements of the right-hand side. [rhs_attributes] maps a production to an array of the attributes attached to the elements of the right-hand side. [prec_decl] maps a production to an optional [%prec] annotation. [production_level] maps a production to a production level (see [ParserAux]). 
*) let table : (Nonterminal.t * Symbol.t array) array = Array.make n (-1, [||]) let identifiers : identifier array array = Array.make n [||] let actions : action option array = Array.make n None let positions : Positions.t list array = Array.make n [] let rhs_attributes : Syntax.attributes array array = Array.make n [||] let prec_decl : symbol located option array = Array.make n None let production_level : branch_production_level array = (* The start productions receive a level that pretends that they originate in a fictitious "builtin" file. So, a reduce/reduce conflict that involves a start production will not be solved. *) let dummy = ProductionLevel (InputFile.builtin_input_file, 0) in Array.make n dummy (* [ntprods] maps a nonterminal symbol to the interval of its productions. *) let ntprods : (int * int) array = Array.make Nonterminal.n (-1, -1) (* This Boolean flag records whether the grammar uses the [error] token. *) let grammar_uses_error_token = ref false (* Create the start productions, populating the above arrays as appropriate. [start] is the number of start productions, therefore also the index of the first non-start production. [startprods] is a mapping of the start symbols to the corresponding start productions. *) let (start : int), (startprods : index NonterminalMap.t) = StringSet.fold (fun nonterminal (k, startprods) -> let nt = Nonterminal.lookup nonterminal and nt' = Nonterminal.lookup (nonterminal ^ "'") in table.(k) <- (nt', [| Symbol.N nt |]); identifiers.(k) <- [| "_1" |]; ntprods.(nt') <- (k, k+1); positions.(k) <- Nonterminal.positions nt; k+1, NonterminalMap.add nt k startprods ) grammar.start_symbols (0, NonterminalMap.empty) (* Create the non-start productions, populating the above arrays. 
*) let producer_symbol producer = Symbol.lookup (producer_symbol producer) let (_ : int) = StringMap.fold (fun nonterminal { branches } k -> let nt = Nonterminal.lookup nonterminal in let k' = List.fold_left (fun k branch -> let producers = Array.of_list branch.producers in let rhs = Array.map producer_symbol producers in table.(k) <- (nt, rhs); identifiers.(k) <- Array.map producer_identifier producers; actions.(k) <- Some branch.action; rhs_attributes.(k) <- Array.map producer_attributes producers; production_level.(k) <- branch.branch_production_level; prec_decl.(k) <- branch.branch_prec_annotation; positions.(k) <- [ branch.branch_position ]; if not (Misc.array_for_all Symbol.non_error rhs) then grammar_uses_error_token := true; k+1 ) k branches in ntprods.(nt) <- (k, k'); k' ) grammar.rules start (* Iteration over the productions associated with a specific nonterminal. *) let iternt nt f = let k, k' = ntprods.(nt) in for prod = k to k' - 1 do f prod done let foldnt (nt : Nonterminal.t) (accu : 'a) (f : index -> 'a -> 'a) : 'a = let k, k' = ntprods.(nt) in let rec loop accu prod = if prod < k' then loop (f prod accu) (prod + 1) else accu in loop accu k (* This funny variant is lazy. If at some point [f] does not demand its second argument, then iteration stops. *) let foldnt_lazy (nt : Nonterminal.t) (f : index -> (unit -> 'a) -> 'a) (seed : 'a) : 'a = let k, k' = ntprods.(nt) in let rec loop prod seed = if prod < k' then f prod (fun () -> loop (prod + 1) seed) else seed in loop k seed (* Accessors. 
*) let def prod = table.(prod) let nt prod = let nt, _ = table.(prod) in nt let rhs prod = let _, rhs = table.(prod) in rhs let length prod = Array.length (rhs prod) let identifiers prod = identifiers.(prod) let is_start prod = prod < start let classify prod = if is_start prod then match (rhs prod).(0) with | Symbol.N nt -> Some nt | Symbol.T _ -> assert false else None let action prod = match actions.(prod) with | Some action -> action | None -> (* Start productions have no action. *) assert (is_start prod); assert false let positions prod = positions.(prod) let lhs_attributes prod = Nonterminal.attributes (nt prod) let rhs_attributes prod = rhs_attributes.(prod) let startsymbol2startprod nt = try NonterminalMap.find nt startprods with Not_found -> assert false (* [nt] is not a start symbol *) (* Iteration. *) let init f = Array.init n f let iter f = Misc.iteri n f let fold f accu = Misc.foldi n f accu let map f = Misc.mapi n f let amap f = Array.init n f let iterx f = for prod = start to n - 1 do f prod done let foldx f accu = Misc.foldij start n f accu let mapx f = Misc.mapij start n f (* Printing a production. *) let print prod = assert (not (is_start prod)); let nt, rhs = table.(prod) in Printf.sprintf "%s -> %s" (Nonterminal.print false nt) (Symbol.printao 0 rhs) (* Tabulation. *) let tabulate f = Misc.tabulate n f let tabulateb f = Misc.tabulateb n f (* This array allows recording, for each %prec declaration, whether it is ever useful. This allows us to emit a warning about useless %prec declarations. *) (* 2015/10/06: We take into account the fact that a %prec declaration can be duplicated by inlining or by the expansion of parameterized non-terminal symbols. Our table is not indexed by productions, but by positions (of %prec declarations in the source). Thus, if a %prec declaration is duplicated, at least one of its copies should be found useful for the warning to be suppressed. 
*) let ever_useful : (Positions.t, unit) Hashtbl.t = (* assuming that generic hashing and equality on positions are OK *) Hashtbl.create 16 let consult_prec_decl prod = let osym = prec_decl.(prod) in lazy ( Option.iter (fun sym -> (* Mark this %prec declaration as useful. *) let pos = Positions.position sym in Hashtbl.add ever_useful pos () ) osym ), osym (* This function prints warnings about useless precedence declarations for productions (%prec). It should be invoked only after the automaton has been constructed. *) let diagnostics () = iterx (fun prod -> let osym = prec_decl.(prod) in Option.iter (fun sym -> (* Check whether this %prec declaration was useless. *) let pos = Positions.position sym in if not (Hashtbl.mem ever_useful pos) then begin Error.grammar_warning [pos] "this %%prec declaration is never useful."; Hashtbl.add ever_useful pos () (* hack: avoid two warnings at the same position *) end ) osym ) (* Determining the precedence level of a production. If no %prec declaration was explicitly supplied, it is the precedence level of the rightmost terminal symbol in the production's right-hand side.
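For instance, with a hypothetical toy symbol type standing in for [Symbol.t], the rightmost-terminal rule can be sketched as follows. [Array.fold_left] visits the right-hand side from left to right, and each terminal overwrites the accumulator, so the last terminal encountered, that is, the rightmost one, wins:

```ocaml
(* A toy symbol type: [T] for terminals, [N] for nonterminals. *)
type toy_symbol = T of string | N of string

(* Return the rightmost terminal of a right-hand side, if any. *)
let rightmost_terminal (rhs : toy_symbol array) : string option =
  Array.fold_left (fun accu symbol ->
    match symbol with
    | T tok -> Some tok  (* a terminal overwrites the accumulator *)
    | N _ -> accu        (* a nonterminal is ignored *)
  ) None rhs

let () =
  (* For expr -> expr PLUS expr, the default level is that of PLUS. *)
  assert (rightmost_terminal [| N "expr"; T "PLUS"; N "expr" |] = Some "PLUS");
  (* A production with no terminal has no default precedence level. *)
  assert (rightmost_terminal [| N "expr"; N "expr" |] = None)
```

This is why a production such as expr -> MINUS expr usually needs an explicit %prec annotation: its rightmost terminal, MINUS, may carry the wrong (binary) precedence.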
*) type production_level = | PNone | PRightmostToken of Terminal.t | PPrecDecl of symbol let rightmost_terminal prod = Array.fold_left (fun accu symbol -> match symbol with | Symbol.T tok -> PRightmostToken tok | Symbol.N _ -> accu ) PNone (rhs prod) let combine e1 e2 = lazy (Lazy.force e1; Lazy.force e2) let precedence prod = let fact1, prec_decl = consult_prec_decl prod in let oterminal = match prec_decl with | None -> rightmost_terminal prod | Some { value = terminal } -> PPrecDecl terminal in match oterminal with | PNone -> fact1, UndefinedPrecedence | PRightmostToken tok -> let fact2, level = Terminal.precedence_level tok in combine fact1 fact2, level | PPrecDecl id -> let fact2, level = TokPrecedence.leveli id in combine fact1 fact2, level end let grammar_uses_error_token = !Production.grammar_uses_error_token (* ------------------------------------------------------------------------ *) (* Maps over productions. *) module ProductionMap = struct include Patricia.Big (* Iteration over the start productions only. *) let start f = Misc.foldi Production.start (fun prod m -> add prod (f prod) m ) empty end (* ------------------------------------------------------------------------ *) (* Support for analyses of the grammar, expressed as fixed point computations. We exploit the generic fixed point algorithm in [Fix]. *) (* We perform memoization only at nonterminal symbols. We assume that the analysis of a symbol is the analysis of its definition (as opposed to, say, a computation that depends on the occurrences of this symbol in the grammar). *) module GenericAnalysis (P : Fix.PROPERTY) (S : sig open P (* An analysis is specified by the following functions. *) (* [terminal] maps a terminal symbol to a property. *) val terminal: Terminal.t -> property (* [disjunction] abstracts a binary alternative. 
That is, when we analyze an alternative between several productions, we compute a property for each of them independently, then we combine these properties using [disjunction]. *) val disjunction: property -> (unit -> property) -> property (* [P.bottom] should be a neutral element for [disjunction]. We use it in the analysis of an alternative with zero branches. *) (* [conjunction] abstracts a binary sequence. That is, when we analyze a sequence, we compute a property for each member independently, then we combine these properties using [conjunction]. In general, conjunction needs access to the first member of the sequence (a symbol), not just to its analysis (a property). *) val conjunction: Symbol.t -> property -> (unit -> property) -> property (* [epsilon] abstracts the empty sequence. It should be a neutral element for [conjunction]. *) val epsilon: property end) : sig open P (* The results of the analysis take the following form. *) (* To every nonterminal symbol, we associate a property. *) val nonterminal: Nonterminal.t -> property (* To every symbol, we associate a property. *) val symbol: Symbol.t -> property (* To every suffix of every production, we associate a property. The offset [i], which determines the beginning of the suffix, must be contained between [0] and [n], inclusive, where [n] is the length of the production. *) val production: Production.index -> int -> property end = struct open P (* The following analysis functions are parameterized over [get], which allows making a recursive call to the analysis at a nonterminal symbol. [get] maps a nonterminal symbol to a property. *) (* Analysis of a symbol. *) let symbol sym get : property = match sym with | Symbol.T tok -> S.terminal tok | Symbol.N nt -> (* Recursive call to the analysis, via [get]. *) get nt (* Analysis of (a suffix of) a production [prod], starting at index [i]. 
*) let production prod i get : property = let rhs = Production.rhs prod in let n = Array.length rhs in (* Conjunction over all symbols in the right-hand side. This can be viewed as a version of [Array.fold_right], which does not necessarily begin at index [0]. Note that, because [conjunction] is lazy, it is possible to stop early. *) let rec loop i = if i = n then S.epsilon else let sym = rhs.(i) in S.conjunction sym (symbol sym get) (fun () -> loop (i+1)) in loop i (* The analysis is the least fixed point of the following function, which analyzes a nonterminal symbol by looking up and analyzing its definition as a disjunction of conjunctions of symbols. *) let nonterminal nt get : property = (* Disjunction over all productions for this nonterminal symbol. *) Production.foldnt_lazy nt (fun prod rest -> S.disjunction (production prod 0 get) rest ) P.bottom (* The least fixed point is taken as follows. Note that it is computed on demand, as [lfp] is called by the user. *) module F = Fix.Make (Maps.ArrayAsImperativeMaps(Nonterminal)) (P) let nonterminal = F.lfp nonterminal (* The auxiliary functions can be published too. *) let symbol sym = symbol sym nonterminal let production prod i = production prod i nonterminal end (* ------------------------------------------------------------------------ *) (* Compute which nonterminals are nonempty, that is, recognize a nonempty language. Also, compute which nonterminals are nullable. The two computations are almost identical. The only difference is in the base case: a single terminal symbol is not nullable, but is nonempty. *) module NONEMPTY = GenericAnalysis (Boolean) (struct (* A terminal symbol is nonempty. *) let terminal _ = true (* An alternative is nonempty if at least one branch is nonempty. *) let disjunction p q = p || q() (* A sequence is nonempty if both members are nonempty. *) let conjunction _ p q = p && q() (* The sequence epsilon is nonempty. It generates the singleton language {epsilon}. 
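As an aside, the disjunction/conjunction scheme above can be replayed by hand on a toy grammar. The sketch below is a hypothetical standalone example: the grammar (S -> A B | "x", A -> epsilon | "a", B -> "b") is made up, and a naive iteration to a fixed point stands in for the [Fix] library. It computes nullability: a terminal is not nullable, an alternative is nullable if some branch is, a sequence is nullable if all members are:

```ocaml
(* A toy grammar: each nonterminal maps to its list of productions,
   each production being a list of symbols. *)
let productions = function
  | "S" -> [ [ `N "A"; `N "B" ]; [ `T "x" ] ]
  | "A" -> [ []; [ `T "a" ] ]
  | "B" -> [ [ `T "b" ] ]
  | _ -> assert false

let nonterminals = [ "S"; "A"; "B" ]

(* Compute the least fixed point by naive iteration: start with
   [false] everywhere and re-evaluate until nothing changes. *)
let nullable () =
  let table = Hashtbl.create 8 in
  List.iter (fun nt -> Hashtbl.replace table nt false) nonterminals;
  let changed = ref true in
  while !changed do
    changed := false;
    List.iter (fun nt ->
      (* Disjunction over productions; conjunction over symbols. *)
      let p =
        List.exists (fun rhs ->
          List.for_all (function
            | `T _ -> false                    (* a terminal is not nullable *)
            | `N nt' -> Hashtbl.find table nt' (* recursive query *)
          ) rhs
        ) (productions nt)
      in
      if p && not (Hashtbl.find table nt) then begin
        Hashtbl.replace table nt true;
        changed := true
      end
    ) nonterminals
  done;
  fun nt -> Hashtbl.find table nt

let () =
  let nullable = nullable () in
  assert (nullable "A");           (* A -> epsilon *)
  assert (not (nullable "S"));     (* B, hence A B, is not nullable *)
  assert (not (nullable "B"))
```

The [Fix] library used below performs the same computation, but on demand and with memoization, instead of eagerly iterating over all nonterminals.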
*) let epsilon = true end) module NULLABLE = GenericAnalysis (Boolean) (struct (* A terminal symbol is not nullable. *) let terminal _ = false (* An alternative is nullable if at least one branch is nullable. *) let disjunction p q = p || q() (* A sequence is nullable if both members are nullable. *) let conjunction _ p q = p && q() (* The sequence epsilon is nullable. *) let epsilon = true end) (* ------------------------------------------------------------------------ *) (* Compute FIRST sets. *) module FIRST = GenericAnalysis (TerminalSet) (struct (* A terminal symbol has a singleton FIRST set. *) let terminal = TerminalSet.singleton (* The FIRST set of an alternative is the union of the FIRST sets. *) let disjunction p q = TerminalSet.union p (q()) (* The FIRST set of a sequence is the union of: the FIRST set of the first member, and the FIRST set of the second member, if the first member is nullable. *) let conjunction symbol p q = if NULLABLE.symbol symbol then TerminalSet.union p (q()) else p (* The FIRST set of the empty sequence is empty. *) let epsilon = TerminalSet.empty end) (* ------------------------------------------------------------------------ *) (* For every nonterminal symbol [nt], compute a word of minimal length generated by [nt]. This analysis subsumes [NONEMPTY] and [NULLABLE]. Indeed, [nt] produces a nonempty language if and only if the minimal length is finite; [nt] is nullable if and only if the minimal length is zero. *) (* This analysis is in principle more costly than [NONEMPTY] and [NULLABLE], so it is performed only on demand. In practice, it seems to be very cheap: its cost is not measurable for any of the grammars in our benchmark suite. *) module MINIMAL = GenericAnalysis (struct include CompletedNatWitness type property = Terminal.t t end) (struct open CompletedNatWitness (* A terminal symbol has length 1. *) let terminal = singleton (* The length of an alternative is the minimum length of any branch.
*) let disjunction = min_lazy (* The length of a sequence is the sum of the lengths of the members. *) let conjunction _ = add_lazy (* The epsilon sequence has length 0. *) let epsilon = epsilon end) (* ------------------------------------------------------------------------ *) let () = if verbose then begin (* If a start symbol generates the empty language or generates the language {epsilon}, report an error. In principle, this could be just a warning. However, in [Engine], in the function [start], it is convenient to assume that neither of these situations can arise. This means that at least one token must be read. *) StringSet.iter (fun symbol -> let nt = Nonterminal.lookup symbol in if not (NONEMPTY.nonterminal nt) then Error.error (Nonterminal.positions nt) "%s generates the empty language." (Nonterminal.print false nt); if TerminalSet.is_empty (FIRST.nonterminal nt) then Error.error (Nonterminal.positions nt) "%s generates the language {epsilon}." (Nonterminal.print false nt) ) grammar.start_symbols; (* If a nonterminal symbol generates the empty language, issue a warning. *) for nt = Nonterminal.start to Nonterminal.n - 1 do if not (NONEMPTY.nonterminal nt) then Error.grammar_warning (Nonterminal.positions nt) "%s generates the empty language." (Nonterminal.print false nt); done end (* ------------------------------------------------------------------------ *) (* Dump the analysis results. 
*) let () = if verbose then Error.logG 2 (fun f -> for nt = Nonterminal.start to Nonterminal.n - 1 do Printf.fprintf f "nullable(%s) = %b\n" (Nonterminal.print false nt) (NULLABLE.nonterminal nt) done; for nt = Nonterminal.start to Nonterminal.n - 1 do Printf.fprintf f "first(%s) = %s\n" (Nonterminal.print false nt) (TerminalSet.print (FIRST.nonterminal nt)) done; for nt = Nonterminal.start to Nonterminal.n - 1 do Printf.fprintf f "minimal(%s) = %s\n" (Nonterminal.print false nt) (CompletedNatWitness.print Terminal.print (MINIMAL.nonterminal nt)) done ) let () = if verbose then Time.tick "Analysis of the grammar" (* ------------------------------------------------------------------------ *) (* Compute FOLLOW sets. Unnecessary for us, but requested by a user. Also, this is useful for the SLR(1) test. Thus, we perform this analysis only on demand. *) (* The computation of the symbolic FOLLOW sets follows exactly the same pattern as that of the traditional FOLLOW sets. We share code and parameterize this computation over a module [P]. The type [P.property] intuitively represents a set of symbols. *) module FOLLOW (P : sig include Fix.PROPERTY val union: property -> property -> property val terminal: Terminal.t -> property val first: Production.index -> int -> property end) = struct module S = FixSolver.Make (Maps.ArrayAsImperativeMaps(Nonterminal)) (P) (* Build a system of constraints. *) let record_ConVar, record_VarVar, solve = S.create() (* Iterate over all start symbols. *) let () = let sharp = P.terminal Terminal.sharp in for nt = 0 to Nonterminal.start - 1 do assert (Nonterminal.is_start nt); (* Add # to FOLLOW(nt). *) record_ConVar sharp nt done (* We need to do this explicitly because our start productions are of the form S' -> S, not S' -> S #, so # will not automatically appear in FOLLOW(S) when the start productions are examined. *) (* Iterate over all productions.
*) let () = Array.iteri (fun prod (nt1, rhs) -> (* Iterate over all nonterminal symbols [nt2] in the right-hand side. *) Array.iteri (fun i symbol -> match symbol with | Symbol.T _ -> () | Symbol.N nt2 -> let nullable = NULLABLE.production prod (i+1) and first = P.first prod (i+1) in (* The FIRST set of the remainder of the right-hand side contributes to the FOLLOW set of [nt2]. *) record_ConVar first nt2; (* If the remainder of the right-hand side is nullable, FOLLOW(nt1) contributes to FOLLOW(nt2). *) if nullable then record_VarVar nt1 nt2 ) rhs ) Production.table (* Second pass. Solve the equations (on demand). *) let follow : Nonterminal.t -> P.property = solve() end (* Use the above functor to obtain the standard (concrete) FOLLOW sets. *) let follow : Nonterminal.t -> TerminalSet.t = let module F = FOLLOW(struct include TerminalSet let terminal = singleton let first = FIRST.production end) in F.follow (* At log level 2, display the FOLLOW sets. *) let () = if verbose then Error.logG 2 (fun f -> for nt = Nonterminal.start to Nonterminal.n - 1 do Printf.fprintf f "follow(%s) = %s\n" (Nonterminal.print false nt) (TerminalSet.print (follow nt)) done ) (* Compute FOLLOW sets for the terminal symbols as well. Again, unnecessary for us, but requested by a user. This is done in a single pass over the grammar -- no new fixpoint computation is required. *) let tfollow : TerminalSet.t array Lazy.t = lazy ( let tfollow = Array.make Terminal.n TerminalSet.empty in (* Iterate over all productions. *) Array.iteri (fun prod (nt1, rhs) -> (* Iterate over all terminal symbols [t2] in the right-hand side. *) Array.iteri (fun i symbol -> match symbol with | Symbol.N _ -> () | Symbol.T t2 -> let nullable = NULLABLE.production prod (i+1) and first = FIRST.production prod (i+1) in (* The FIRST set of the remainder of the right-hand side contributes to the FOLLOW set of [t2]. 
*) tfollow.(t2) <- TerminalSet.union first tfollow.(t2); (* If the remainder of the right-hand side is nullable, FOLLOW(nt1) contributes to FOLLOW(t2). *) if nullable then tfollow.(t2) <- TerminalSet.union (follow nt1) tfollow.(t2) ) rhs ) Production.table; tfollow ) (* Define another accessor. *) let tfollow t = (Lazy.force tfollow).(t) (* At log level 3, display the FOLLOW sets for terminal symbols. *) let () = if verbose then Error.logG 3 (fun f -> for t = 0 to Terminal.n - 1 do Printf.fprintf f "follow(%s) = %s\n" (Terminal.print t) (TerminalSet.print (tfollow t)) done ) (* ------------------------------------------------------------------------ *) (* Compute symbolic FIRST and FOLLOW sets. *) (* The symbolic FIRST set of the word determined by [prod/i] is defined (and computed) as follows. *) let sfirst prod i = let rhs = Production.rhs prod in let n = Array.length rhs in let rec loop i = if i = n then (* If the word [prod/i] is empty, the set is empty. *) SymbolSet.empty else let sym = rhs.(i) in (* If the word [prod/i] begins with a symbol [sym], then [sym] itself is part of the symbolic FIRST set, unconditionally. *) SymbolSet.union (SymbolSet.singleton sym) (* Furthermore, if [sym] is nullable, then the symbolic FIRST set of the sub-word [prod/i+1] contributes, too. *) (if NULLABLE.symbol sym then loop (i + 1) else SymbolSet.empty) in loop i (* The symbolic FOLLOW sets are computed just like the FOLLOW sets, except we use a symbolic FIRST set instead of a standard FIRST set. *) let sfollow : Nonterminal.t -> SymbolSet.t = let module F = FOLLOW(struct include SymbolSet let terminal t = SymbolSet.singleton (Symbol.T t) let first = sfirst end) in F.follow (* At log level 3, display the symbolic FOLLOW sets. 
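The symbolic FIRST computation described above differs from the concrete one only in what it collects: every symbol (terminal or nonterminal) that can appear in first position, rather than only terminals. Here is a minimal sketch, using a list instead of an array and an invented nullable nonterminal "opt"; it mirrors the structure of [sfirst].

```ocaml
(* Symbols of a toy grammar, where the nonterminal "opt" is nullable. *)
type symbol = T of string | N of string
module SymbolSet = Set.Make (struct type t = symbol let compare = compare end)

let nullable = function N "opt" -> true | _ -> false

(* The symbolic FIRST set of a word: each symbol of the word, up to and
   including the first non-nullable one. *)
let rec sfirst = function
  | [] -> SymbolSet.empty
  | sym :: rest ->
      SymbolSet.union (SymbolSet.singleton sym)
        (if nullable sym then sfirst rest else SymbolSet.empty)
```

For the word [opt x y], the symbolic FIRST set is {opt, x}: the nullable prefix "opt" is included, the first non-nullable symbol "x" stops the traversal, and "y" is excluded.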
*) let () = if verbose then Error.logG 3 (fun f -> for nt = Nonterminal.start to Nonterminal.n - 1 do Printf.fprintf f "sfollow(%s) = %s\n" (Nonterminal.print false nt) (SymbolSet.print (sfollow nt)) done ) (* ------------------------------------------------------------------------ *) (* Provide explanations about FIRST sets. *) (* The idea is to explain why a certain token appears in the FIRST set for a certain sequence of symbols. Such an explanation involves basic assertions of the form (i) symbol N is nullable and (ii) the token appears in the FIRST set for symbol N. We choose to take these basic facts for granted, instead of recursively explaining them, so as to keep explanations short. *) (* We first produce an explanation in abstract syntax, then convert it to a human-readable string. *) type explanation = | EObvious (* sequence begins with desired token *) | EFirst of Terminal.t * Nonterminal.t (* sequence begins with a nonterminal that produces desired token *) | ENullable of Symbol.t list * explanation (* sequence begins with a list of nullable symbols and ... 
*) let explain (tok : Terminal.t) (rhs : Symbol.t array) (i : int) = let length = Array.length rhs in let rec loop i = assert (i < length); let symbol = rhs.(i) in match symbol with | Symbol.T tok' -> assert (Terminal.equal tok tok'); EObvious | Symbol.N nt -> if TerminalSet.mem tok (FIRST.nonterminal nt) then EFirst (tok, nt) else begin assert (NULLABLE.nonterminal nt); match loop (i + 1) with | ENullable (symbols, e) -> ENullable (symbol :: symbols, e) | e -> ENullable ([ symbol ], e) end in loop i let rec convert = function | EObvious -> "" | EFirst (tok, nt) -> Printf.sprintf "%s can begin with %s" (Nonterminal.print false nt) (Terminal.print tok) | ENullable (symbols, e) -> let e = convert e in Printf.sprintf "%scan vanish%s%s" (Symbol.printl symbols) (if e = "" then "" else " and ") e (* ------------------------------------------------------------------------ *) (* Package the analysis results. *) module Analysis = struct let nullable = NULLABLE.nonterminal let nullable_symbol = NULLABLE.symbol let first = FIRST.nonterminal let first_symbol = FIRST.symbol (* An initial definition of [nullable_first_prod]. *) let nullable_first_prod prod i = NULLABLE.production prod i, FIRST.production prod i (* A memoised version, so as to avoid recomputing along a production's right-hand side. *) let nullable_first_prod = Misc.tabulate Production.n (fun prod -> Misc.tabulate (Production.length prod + 1) (fun i -> nullable_first_prod prod i ) ) let first_prod_lookahead prod i z = let nullable, first = nullable_first_prod prod i in if nullable then TerminalSet.add z first else first let explain_first_rhs (tok : Terminal.t) (rhs : Symbol.t array) (i : int) = convert (explain tok rhs i) let follow = follow let attributes = grammar.gr_attributes end (* ------------------------------------------------------------------------ *) (* Conflict resolution via precedences. 
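The conversion of an abstract explanation into a human-readable sentence can be sketched in miniature. This stand-alone version renders symbols as plain strings (the real [convert] goes through [Symbol.printl] and the pretty-printers); the constructor meanings follow the [explanation] type above.

```ocaml
(* A miniature version of the [explanation] type and its [convert] function,
   with symbols rendered as plain strings. *)
type explanation =
  | EObvious                               (* sequence begins with desired token *)
  | EFirst of string * string              (* token, nonterminal producing it *)
  | ENullable of string list * explanation (* nullable prefix, then the rest *)

let rec convert = function
  | EObvious ->
      ""
  | EFirst (tok, nt) ->
      Printf.sprintf "%s can begin with %s" nt tok
  | ENullable (symbols, e) ->
      let e = convert e in
      Printf.sprintf "%s can vanish%s%s"
        (String.concat " " symbols)
        (if e = "" then "" else " and ")
        e
```

For instance, the explanation "opt is nullable, and expr can produce PLUS first" renders as "opt can vanish and expr can begin with PLUS".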
*) module Precedence = struct type choice = | ChooseShift | ChooseReduce | ChooseNeither | DontKnow type order = Lt | Gt | Eq | Ic let precedence_order p1 p2 = match p1, p2 with | UndefinedPrecedence, _ | _, UndefinedPrecedence -> Ic | PrecedenceLevel (m1, l1, _, _), PrecedenceLevel (m2, l2, _, _) -> if not (InputFile.same_input_file m1 m2) then Ic else if l1 > l2 then Gt else if l1 < l2 then Lt else Eq let production_order p1 p2 = match p1, p2 with | ProductionLevel (m1, l1), ProductionLevel (m2, l2) -> if not (InputFile.same_input_file m1 m2) then Ic else if l1 > l2 then Gt else if l1 < l2 then Lt else Eq let shift_reduce tok prod = let fact1, tokp = Terminal.precedence_level tok and fact2, prodp = Production.precedence prod in match precedence_order tokp prodp with (* Our information is inconclusive. Drop [fact1] and [fact2], that is, do not record that this information was useful. *) | Ic -> DontKnow (* Our information is useful. Record that fact by evaluating [fact1] and [fact2]. *) | (Eq | Lt | Gt) as c -> Lazy.force fact1; Lazy.force fact2; match c with | Ic -> assert false (* already dispatched *) | Eq -> begin match Terminal.associativity tok with | LeftAssoc -> ChooseReduce | RightAssoc -> ChooseShift | NonAssoc -> ChooseNeither | _ -> assert false (* If [tok]'s precedence level is defined, then its associativity must be defined as well. *) end | Lt -> ChooseReduce | Gt -> ChooseShift let reduce_reduce prod1 prod2 = let pl1 = Production.production_level.(prod1) and pl2 = Production.production_level.(prod2) in match production_order pl1 pl2 with | Lt -> Some prod1 | Gt -> Some prod2 | Eq -> (* The order is strict except in the presence of parameterized non-terminals and/or inlining. Two productions can have the same precedence level if they originate, via macro-expansion or via inlining, from a single production in the source grammar. 
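The decision logic of [shift_reduce] can be summarized without the lazy bookkeeping of usefulness facts. In this sketch, precedence levels are plain optional integers and the types are invented for the example; the comparisons follow the rules above (higher level wins; on a tie, the token's associativity decides).

```ocaml
(* Each token carries an optional precedence level and an associativity;
   each production carries an optional precedence level. *)
type assoc = LeftAssoc | RightAssoc | NonAssoc
type choice = ChooseShift | ChooseReduce | ChooseNeither | DontKnow

let shift_reduce (tok_level, tok_assoc) prod_level =
  match tok_level, prod_level with
  | None, _ | _, None ->
      DontKnow                 (* undefined precedence: inconclusive *)
  | Some t, Some p ->
      if t > p then ChooseShift
      else if t < p then ChooseReduce
      else match tok_assoc with
        | LeftAssoc -> ChooseReduce   (* e.g. E + E . + ... reduces *)
        | RightAssoc -> ChooseShift
        | NonAssoc -> ChooseNeither
```

With a left-associative token at the same level as the production, the conflict is resolved in favor of reduction, which is what makes "1 - 2 - 3" parse as "(1 - 2) - 3".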
*) None | Ic -> None end (* This function prints warnings about useless precedence declarations for terminal symbols (%left, %right, %nonassoc) and productions (%prec). It should be invoked only after the automaton has been constructed. *) let diagnostics () = if not Settings.ignore_all_unused_precedence_levels then begin TokPrecedence.diagnostics(); Production.diagnostics() end (* ------------------------------------------------------------------------ *) (* %on_error_reduce declarations. *) module OnErrorReduce = struct (* We keep a [StringMap] internally, and convert back and forth between the types [Nonterminal.t] and [string] when querying this map. This is not very elegant, and could be changed if desired. *) let declarations : Syntax.on_error_reduce_level StringMap.t = grammar.on_error_reduce let print (nt : Nonterminal.t) : string = Nonterminal.print false nt let lookup (nt : string) : Nonterminal.t = try Nonterminal.lookup nt with Not_found -> (* If this fails, then we have an [%on_error_reduce] declaration for an invalid symbol. *) assert false let reduce prod = let nt = Production.nt prod in StringMap.mem (print nt) declarations let iter f = StringMap.iter (fun nt _prec -> f (lookup nt) ) declarations open Precedence let preferable prod1 prod2 = (* The two productions that we are comparing must be distinct. *) assert (prod1 <> prod2); let nt1 = Production.nt prod1 and nt2 = Production.nt prod2 in (* If they have the same left-hand side (which seems rather unlikely?), declare them incomparable. *) nt1 <> nt2 && (* Otherwise, look up the priority levels associated with their left-hand symbols. *) let prec1, prec2 = try StringMap.find (print nt1) declarations, StringMap.find (print nt2) declarations with Not_found -> (* [preferable] should be used to compare two symbols for which there exist [%on_error_reduce] declarations.
*) assert false in match production_order prec1 prec2 with | Gt -> (* [prec1] is a higher integer than [prec2], therefore comes later in the file. By analogy with [%left] and friends, we give higher priority to later declarations. *) true | Lt -> false | Eq | Ic -> (* We could issue a warning or an information message in these cases. *) false end (* ------------------------------------------------------------------------ *) end (* module Make *) menhir-20200123/src/grammarFunctor.mli000066400000000000000000000504301361226111300175060ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The functor [Make] transforms an abstract syntax tree for the grammar into a rich internal representation of the grammar. *) (* The reason why this is now a functor, and the reason why its verbosity can be controlled, is that we may wish to invoke it several times, e.g. on the grammar before %inlining, and on the grammar after %inlining. 2015/11/10 *) module Make (G : sig (* An abstract syntax tree for the grammar. *) val grammar: BasicSyntax.grammar (* This flag indicates whether it is OK to produce warnings, verbose information, etc., when this functor is invoked. If it is set to [false], then only serious errors can be signaled. *) val verbose: bool end) : sig (* ------------------------------------------------------------------------ *) (* Nonterminals. *) module Nonterminal : sig (* The type of nonterminals. *) type t (* Comparison. *) val compare: t -> t -> int (* The number of nonterminals. 
This includes the extra nonterminals that are internally generated for the grammar's entry points. *) val n: int (* [lookup] maps an identifier to a nonterminal, or raises [Not_found]. *) val lookup : string -> t (* Nonterminals can be converted to integers. This feature is exploited in the table-based back-end. *) val n2i: t -> int (* This produces a string representation of a nonterminal. It should in principle never be applied to one of the internally generated nonterminals, as we do not wish users to become aware of the existence of these extra nonterminals. However, we do sometimes violate this rule when it is difficult to do otherwise. The Boolean parameter tells whether the string representation should be normalized, that is, whether parentheses and commas should be eliminated. This is necessary if the string is intended for use as a valid nonterminal name or as a valid OCaml identifier. *) val print: bool -> t -> string (* This is the OCaml type associated with a nonterminal symbol. It is known only if a %type declaration was provided. This function is not applicable to the internally generated nonterminals. *) val ocamltype: t -> Stretch.ocamltype option (* A start symbol always has a type. This allows us to define a simplified version of [ocamltype] for start symbols. *) val ocamltype_of_start_symbol: t -> Stretch.ocamltype (* Creation of a table indexed by nonterminals. *) val init: (t -> 'a) -> 'a array (* Iteration over nonterminals. The order in which elements are examined, and the order of [map]'s output list, correspond to the numeric indices produced by [n2i] above. *) val iter: (t -> unit) -> unit val fold: (t -> 'a -> 'a) -> 'a -> 'a val map: (t -> 'a) -> 'a list (* Iteration over all nonterminals, except the start nonterminals. *) val iterx: (t -> unit) -> unit val foldx: (t -> 'a -> 'a) -> 'a -> 'a (* Tabulation of a function over nonterminals. 
*) val tabulate: (t -> 'a) -> (t -> 'a) (* [positions nt] is a list of the positions associated with the definition of [nt]. There can be more than one position because definitions can be split over multiple files. *) val positions: t -> Positions.t list (* This tells whether a non-terminal symbol is one of the start symbols. *) val is_start: t -> bool (* [attributes nt] is the list of attributes attached with the nonterminal symbol [nt]. *) val attributes: t -> Syntax.attribute list end (* ------------------------------------------------------------------------ *) (* Sets of nonterminals. *) module NonterminalMap : GMap.S with type key = Nonterminal.t module NonterminalSet = NonterminalMap.Domain (* ------------------------------------------------------------------------ *) (* Terminals. *) module Terminal : sig (* The type of terminals. *) type t (* The number of terminals. This includes the two pseudo-tokens [#] and [error]. *) val n: int (* Comparison. *) val equal: t -> t -> bool val compare: t -> t -> int (* [lookup] maps an identifier to a terminal, or raises [Not_found]. *) val lookup : string -> t (* Terminals can be converted to integers. This feature is exploited in the table-based back-end and in [LRijkstra]. The reverse conversion, [i2t], is unsafe and should not be used. [LRijkstra] uses it :-) *) val t2i: t -> int val i2t: int -> t (* unsafe! *) (* This produces a string representation of a terminal. *) val print: t -> string (* This is the OCaml type associated with a terminal symbol. It is known only if the %token declaration was accompanied with a type. *) val ocamltype: t -> Stretch.ocamltype option (* These are the two pseudo-tokens [#] and [error]. The former is used to denote the end of the token stream. The latter is accessible to the user and is used for handling errors. *) val sharp: t val error: t (* This is the programmer-defined [EOF] token, if there is one. 
It is recognized based solely on its name, which is fragile, but this behavior is documented. This token is assumed to represent [ocamllex]'s [eof] pattern. It is used only by the reference interpreter, and in a rather non-essential way. *) val eof: t option (* A terminal symbol is pseudo if it is [#] or [error]. It is real otherwise. *) val pseudo: t -> bool val real: t -> bool (* [non_error] returns [true] if its argument is not the [error] token. *) val non_error: t -> bool (* Creation of a table indexed by terminals. *) val init: (t -> 'a) -> 'a array (* Iteration over terminals. The order in which elements are examined, and the order of [map]'s output list, correspond to the numeric indices produced by [t2i] above. *) val iter: (t -> unit) -> unit val fold: (t -> 'a -> 'a) -> 'a -> 'a val map: (t -> 'a) -> 'a list (* Iteration over all terminals except [#]. *) val foldx: (t -> 'a -> 'a) -> 'a -> 'a val mapx: (t -> 'a) -> 'a list (* [iter_real] offers iteration over all real terminals. *) val iter_real: (t -> unit) -> unit (* [attributes t] is the list of attributes attached with the terminal symbol [t]. *) val attributes: t -> Syntax.attribute list (* The sub-module [Word] offers an implementation of words (that is, sequences) of terminal symbols. It is used by [LRijkstra]. We make it a functor, because it has internal state (a hash table) and a side effect (failure if there are more than 256 terminal symbols). *) (* The type [word] should be treated, as much as possible, as an abstract type. In fact, for efficiency reasons, we represent a word as a unique integer code, and we allocate these integer codes sequentially, from 0 upwards. The conversion from [int] to [word] is of course unsafe and should be used wisely. *) module Word (X : sig end) : sig type word = int val epsilon: word val singleton: t -> word val append: word -> word -> word val length: word -> int (* [first w z] returns the first symbol of the word [w.z].
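The hash-consing idea behind the [Word] functor can be sketched independently of Menhir: each distinct word is interned once and represented by a unique integer code, so that word equality is integer equality. This toy version uses chars as terminals and omits the 256-symbol limit and the statistics; the table layout is invented for the example.

```ocaml
(* Hash-consing of words: each distinct word receives a unique integer code,
   allocated sequentially from 0 upwards. *)
let table : (char list, int) Hashtbl.t = Hashtbl.create 128  (* word -> code *)
let decode_tbl : (int, char list) Hashtbl.t = Hashtbl.create 128
let next = ref 0

let encode (w : char list) : int =
  try Hashtbl.find table w
  with Not_found ->
    let code = !next in
    incr next;
    Hashtbl.add table w code;
    Hashtbl.add decode_tbl code w;
    code

let decode (code : int) : char list = Hashtbl.find decode_tbl code

(* The operations of the [Word] signature, in terms of encode/decode. *)
let epsilon = encode []
let singleton c = encode [c]
let append w1 w2 = encode (decode w1 @ decode w2)
let length w = List.length (decode w)
```

Because equal words share a single code, `append (singleton 'a') epsilon` yields the very same integer as `singleton 'a'`, and comparing words (as [Word.compare] does) reduces to comparing integers.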
*) val first: word -> t -> t val elements: word -> t list val print: word -> string (* [verbose()] prints statistics about the use of the internal hash-consing table so far. *) val verbose: unit -> unit (* Lexicographic ordering. *) val compare: word -> word -> int end end (* ------------------------------------------------------------------------ *) (* Sets and maps over terminals. *) module TerminalSet : sig (* All of the operations documented in [GSet] are available. *) include GSet.S with type element = Terminal.t (* This offers a string representation of a set of terminals. The symbols are simply listed one after the other and separated with spaces. *) val print: t -> string (* This is the set of all terminal symbols except the pseudo-tokens [#] and [error]. *) val universe: t end (* All of the operations documented in [GMap] are available. *) module TerminalMap : GMap.S with type key = Terminal.t (* ------------------------------------------------------------------------ *) (* Symbols. *) module Symbol : sig (* A symbol is either a nonterminal or a terminal. *) type t = | N of Nonterminal.t | T of Terminal.t val is_terminal: t -> bool (* [lookup] maps an identifier to a symbol, or raises [Not_found]. *) val lookup : string -> t (* Comparison. *) val equal: t -> t -> bool val lequal: t list -> t list -> bool (* [non_error] returns [true] if its argument is not the [error] token. *) val non_error: t -> bool (* These produce a string representation of a symbol, of a list of symbols, or of an array of symbols. The symbols are simply listed one after the other and separated with spaces. [printao] prints an array of symbols, starting at a particular offset. [printaod] is analogous, but can also print a single dot at a particular position between two symbols. 
*) val print: t -> string val printl: t list -> string val printa: t array -> string val printao: int -> t array -> string val printaod: int -> int -> t array -> string end (* ------------------------------------------------------------------------ *) (* Sets and maps over symbols. *) (* All of the operations documented in [Set] are available. *) module SymbolSet : Set.S with type elt = Symbol.t module SymbolMap : sig (* All of the operations documented in [Map] are available. *) include Map.S with type key = Symbol.t (* [domain m] is the domain of the map [m], that is, the list of keys for which an entry exists in the map [m]. *) val domain: 'a t -> key list (* [init f xs] creates a map whose keys are the elements [x] found in the list [xs] and the datum associated with [x] is [f x]. *) val init: (key -> 'a) -> key list -> 'a t (* This returns [true] if and only if all of the symbols in the domain of the map at hand are nonterminals. *) val purelynonterminal: 'a t -> bool end (* ------------------------------------------------------------------------ *) (* Productions. *) module Production : sig (* This is the type of productions. This includes user-defined productions as well as the internally generated productions associated with the start symbols. *) type index (* Comparison. *) val compare: index -> index -> int (* Productions can be converted to integers and back. This is unsafe and should be avoided as much as possible. This feature is exploited, for efficiency, in the encoding of items. *) val p2i: index -> int val i2p: int -> index (* The number of productions. *) val n: int (* These map a production index to the production's definition, that is, a nonterminal (the left-hand side) and an array of symbols (the right-hand side). 
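The printers [printao] and [printaod] declared above are what Menhir uses to display LR items such as "E -> E . + E". Here is a stand-alone approximation over plain strings; the exact spacing and dot-placement conventions are assumptions for the sake of the example, not a transcription of the real implementation.

```ocaml
(* Print the suffix of an array of symbols starting at [offset]. *)
let printao offset (rhs : string array) =
  String.concat " "
    (Array.to_list (Array.sub rhs offset (Array.length rhs - offset)))

(* Same, but also print a dot at position [dot] between two symbols,
   as in an LR item. *)
let printaod offset dot (rhs : string array) =
  let n = Array.length rhs in
  let buf = Buffer.create 16 in
  for i = offset to n - 1 do
    if i = dot then Buffer.add_string buf ". ";
    Buffer.add_string buf rhs.(i);
    if i < n - 1 then Buffer.add_char buf ' '
  done;
  (* A dot at position [n] goes after the last symbol. *)
  if dot = n then Buffer.add_string buf (if n > offset then " ." else ".");
  Buffer.contents buf
```

For the right-hand side [|"E"; "+"; "E"|], a dot at position 2 prints as "E + . E", and a dot at position 3 (the end) prints as "E + E .".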
*) val def: index -> Nonterminal.t * Symbol.t array val nt: index -> Nonterminal.t val rhs: index -> Symbol.t array val length: index -> int (* This maps a production index to an array of the identifiers that should be used for naming the semantic values of the symbols in the right-hand side. *) val identifiers: index -> Syntax.identifier array (* This maps a production index to the production's semantic action. This function is not applicable to a start production. *) val action: index -> Syntax.action (* [positions prod] is a list of the positions associated with production [prod]. This is usually a singleton list, but there can be more than one position for start productions when the definition of the corresponding start symbol is split over multiple files. *) val positions: index -> Positions.t list (* [lhs_attributes prod] returns the attributes attached with the head symbol of the production [prod]. It is equivalent to [Nonterminal.attributes (nt prod)]. [rhs_attributes prod] returns an array of the attributes attached with each element in the right-hand side of the production [prod]. *) val lhs_attributes: index -> Syntax.attributes val rhs_attributes: index -> Syntax.attributes array (* Creation of a table indexed by productions. *) val init: (index -> 'a) -> 'a array (* Iteration over all productions. The order in which elements are examined, and the order of [map]'s output list, correspond to the numeric indices produced by [p2i] above. *) val iter: (index -> unit) -> unit val fold: (index -> 'a -> 'a) -> 'a -> 'a val map: (index -> 'a) -> 'a list val amap: (index -> 'a) -> 'a array (* Iteration over all productions, except the start productions. *) val iterx: (index -> unit) -> unit val foldx: (index -> 'a -> 'a) -> 'a -> 'a val mapx: (index -> 'a) -> 'a list (* This maps a (user) non-terminal start symbol to the corresponding start production. 
*) val startsymbol2startprod: Nonterminal.t -> index (* Iteration over the productions associated with a specific nonterminal. *) val iternt: Nonterminal.t -> (index -> unit) -> unit val foldnt: Nonterminal.t -> 'a -> (index -> 'a -> 'a) -> 'a (* This allows determining whether a production is a start production. If it is a start production, the start symbol that it is associated with is returned. If it is a regular production, nothing is returned. *) val classify: index -> Nonterminal.t option (* [is_start] is easier to use than [classify] when the start symbol is not needed. *) val is_start: index -> bool (* The integer [start] is published so as to allow the table back-end to produce code for [is_start]. It should not be used otherwise. *) val start: int (* This produces a string representation of a production. It should never be applied to a start production, as we do not wish users to become aware of the existence of these extra productions. *) val print: index -> string (* Tabulation of a Boolean function over productions. [tabulateb f] returns a tabulated version of [f] as well as the number of productions where [f] is true. *) val tabulate: (index -> 'a) -> (index -> 'a) val tabulateb: (index -> bool) -> (index -> bool) * int end (* ------------------------------------------------------------------------ *) (* Maps over productions. *) module ProductionMap : sig include GMap.S with type key = Production.index (* Iteration over the start productions only. *) val start: (Production.index -> 'a) -> 'a t end (* ------------------------------------------------------------------------ *) (* This flag tells whether the [error] token appears in at least one production. *) val grammar_uses_error_token: bool (* ------------------------------------------------------------------------ *) (* Analysis of the grammar. *) module Analysis : sig (* [nullable nt] is the NULLABLE flag of the non-terminal symbol [nt]. 
That is, it is true if and only if this symbol produces the empty word [epsilon]. *) val nullable: Nonterminal.t -> bool val nullable_symbol: Symbol.t -> bool (* [first nt] is the FIRST set of the non-terminal symbol [nt]. *) val first: Nonterminal.t -> TerminalSet.t val first_symbol: Symbol.t -> TerminalSet.t (* [nullable_first_prod prod i] considers the suffix of the production [prod] defined by offset [i]. It returns its NULLABLE flag as well as its FIRST set. The offset [i] must be contained between [0] and [n], inclusive, where [n] is the length of production [prod]. *) val nullable_first_prod: Production.index -> int -> bool * TerminalSet.t (* [first_prod_lookahead prod i t] computes [FIRST(alpha.t)], where [alpha] is the suffix of the production defined by offset [i], and [t] is a terminal symbol. The offset [i] must be contained between [0] and [n], inclusive, where [n] is the length of production [prod]. *) val first_prod_lookahead: Production.index -> int -> Terminal.t -> TerminalSet.t (* [explain_first_rhs tok rhs i] explains why the token [tok] appears in the FIRST set for the string of symbols found at offset [i] in the array [rhs]. *) val explain_first_rhs: Terminal.t -> Symbol.t array -> int -> string (* [follow nt] is the FOLLOW set of the non-terminal symbol [nt], that is, the set of terminal symbols that could follow an expansion of [nt] in a valid sentence. *) val follow: Nonterminal.t -> TerminalSet.t (* [attributes] are the attributes attached with the grammar. *) val attributes: Syntax.attributes end (* ------------------------------------------------------------------------ *) (* Conflict resolution via precedences. *) module Precedence : sig (* Shift/reduce conflicts require making a choice between shifting a token and reducing a production. How these choices are made is of no concern to the back-end, but here is a rough explanation. 
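The specification of [first_prod_lookahead] above, FIRST(alpha.t), is easy to state executably: compute the NULLABLE flag and FIRST set of the suffix alpha, then add the lookahead terminal exactly when alpha is nullable. The sketch below uses string terminals, a list in place of a production suffix, and an invented nullable nonterminal "opt".

```ocaml
module TS = Set.Make (String)
type symbol = T of string | N of string

(* Per-nonterminal facts for a toy grammar where "opt" is nullable. *)
let nullable_nt = function "opt" -> true | _ -> false
let first_nt = function
  | "opt" -> TS.singleton "x"
  | "expr" -> TS.singleton "INT"
  | _ -> TS.empty

let nullable_first_symbol = function
  | T t -> false, TS.singleton t
  | N nt -> nullable_nt nt, first_nt nt

(* NULLABLE flag and FIRST set of a word, computed together,
   in the style of [nullable_first_prod]. *)
let rec nullable_first_word = function
  | [] -> true, TS.empty
  | sym :: rest ->
      let n1, f1 = nullable_first_symbol sym in
      if not n1 then false, f1
      else
        let n2, f2 = nullable_first_word rest in
        n2, TS.union f1 f2

(* FIRST(alpha.z): if alpha is nullable, the lookahead z can appear first. *)
let first_word_lookahead w z =
  let nullable, first = nullable_first_word w in
  if nullable then TS.add z first else first
```

For alpha = [opt expr], the suffix is not nullable, so the result is FIRST(opt) ∪ FIRST(expr) = {x, INT}; for alpha = [opt], which is nullable, the lookahead joins the set: {x, EOF}.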
Shifting is preferred when the token has higher precedence than the production, or they have same precedence and the token is right-associative. Reducing is preferred when the token has lower precedence than the production, or they have same precedence and the token is left-associative. Neither is allowed when the token and the production have same precedence and the token is non-associative. No preference is explicitly specified when the token or the production has undefined precedence. In that case, the default choice is to prefer shifting, but a conflict will be reported. *) type choice = | ChooseShift | ChooseReduce | ChooseNeither | DontKnow val shift_reduce: Terminal.t -> Production.index -> choice (* Reduce/reduce conflicts require making a choice between reducing two distinct productions. This is done by exploiting a partial order on productions. For compatibility with ocamlyacc, this order should be total and should correspond to textual order when the two productions originate in the same source file. When they originate in different source files, the two productions should be incomparable. *) val reduce_reduce: Production.index -> Production.index -> Production.index option end (* ------------------------------------------------------------------------ *) (* [%on_error_reduce] declarations. *) module OnErrorReduce : sig (* [reduce prod] tells whether the left-hand side of [prod] (a nonterminal symbol) appears in an [%on_error_reduce] declaration. *) val reduce: Production.index -> bool (* [iter f] applies the function [f] in turn, in an arbitrary order, to every nonterminal symbol that appears in an [%on_error_reduce] declaration. *) val iter: (Nonterminal.t -> unit) -> unit (* When two productions could be reduced, in a single state, due to [%on_error_reduce] declarations, these productions can be compared, using [preferable], to test if one of them takes precedence over the other. This is a partial order; two productions may be incomparable. 
*) val preferable: Production.index -> Production.index -> bool end (* ------------------------------------------------------------------------ *) (* Diagnostics. *) (* This function prints warnings about useless precedence declarations for terminal symbols (%left, %right, %nonassoc) and productions (%prec). It should be invoked only after the automaton has been constructed. *) val diagnostics: unit -> unit (* ------------------------------------------------------------------------ *) end (* module Make *) menhir-20200123/src/infer.ml000066400000000000000000000303541361226111300154540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax open Stretch open BasicSyntax open IL open CodeBits open TokenType (* ------------------------------------------------------------------------- *) (* Naming conventions. *) (* The type variable associated with a nonterminal symbol. Its name begins with a prefix which ensures that it begins with a lowercase letter and cannot clash with OCaml keywords. *) let ntvar symbol = Printf.sprintf "tv_%s" (Misc.normalize symbol) (* The term variable associated with a nonterminal symbol. Its name begins with a prefix which ensures that it begins with a lowercase letter and cannot clash with OCaml keywords. *) let encode symbol = Printf.sprintf "xv_%s" (Misc.normalize symbol) let decode s = let n = String.length s in if not (n >= 3 && String.sub s 0 3 = "xv_") then Lexmli.fail(); String.sub s 3 (n - 3) (* The name of the temporary file.
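The encode/decode naming convention of [infer.ml] is a simple prefixing scheme: a nonterminal's semantic value is bound to a term variable "xv_<name>", and [decode] strips that prefix back off when reading the compiler's reply. The miniature version below omits [Misc.normalize] and stands in a plain exception for [Lexmli.fail].

```ocaml
(* Miniature versions of [encode] and [decode]: the nonterminal [symbol] is
   bound to a term variable "xv_<symbol>"; [decode] strips the prefix. *)
let encode symbol = Printf.sprintf "xv_%s" symbol

exception Fail  (* stands in for [Lexmli.fail] *)

let decode s =
  let n = String.length s in
  if not (n >= 3 && String.sub s 0 3 = "xv_") then raise Fail;
  String.sub s 3 (n - 3)
```

The round trip is the point: `decode (encode "expr")` yields "expr" again, while a string that does not carry the "xv_" prefix is rejected.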
*) let base = Settings.base let mlname = base ^ ".ml" let mliname = base ^ ".mli" (* ------------------------------------------------------------------------- *) (* Code production. *) (* [nttype nt] is the type of the nonterminal [nt], as currently known. *) let nttype grammar nt = try TypTextual (StringMap.find nt grammar.types) with Not_found -> TypVar (ntvar nt) (* [is_standard] determines whether a branch derives from a standard library definition. The method, based on a file name, is somewhat fragile. *) let is_standard branch = List.for_all (fun x -> x = Settings.stdlib_filename) (Action.filenames branch.action) (* [actiondef] turns a branch into a function definition. *) (* The names and types of the conventional internal variables that correspond to keywords ($startpos,etc.) are hardwired in this code. It would be nice if these conventions were more clearly isolated and perhaps moved to the [Action] or [Keyword] module. *) let actiondef grammar symbol branch = (* Construct a list of the semantic action's formal parameters that depend on the production's right-hand side. *) let formals = List.fold_left (fun formals producer -> let symbol = producer_symbol producer and id = producer_identifier producer in let startp, endp, starto, endo, loc = Printf.sprintf "_startpos_%s_" id, Printf.sprintf "_endpos_%s_" id, Printf.sprintf "_startofs_%s_" id, Printf.sprintf "_endofs_%s_" id, Printf.sprintf "_loc_%s_" id in let t = try let props = StringMap.find symbol grammar.tokens in (* Symbol is a terminal. *) match props.tk_ocamltype with | None -> tunit | Some ocamltype -> TypTextual ocamltype with Not_found -> (* Symbol is a nonterminal. *) nttype grammar symbol in PAnnot (PVar id, t) :: PAnnot (PVar startp, tposition) :: PAnnot (PVar endp, tposition) :: PAnnot (PVar starto, tint) :: PAnnot (PVar endo, tint) :: PAnnot (PVar loc, tlocation) :: formals ) [] branch.producers in (* Extend the list with parameters that do not depend on the right-hand side. 
*) let formals = PAnnot (PVar "_eRR", texn) :: PAnnot (PVar "_startpos", tposition) :: PAnnot (PVar "_endpos", tposition) :: PAnnot (PVar "_endpos__0_", tposition) :: PAnnot (PVar "_symbolstartpos", tposition) :: PAnnot (PVar "_startofs", tint) :: PAnnot (PVar "_endofs", tint) :: PAnnot (PVar "_endofs__0_", tint) :: PAnnot (PVar "_symbolstartofs", tint) :: PAnnot (PVar "_sloc", tlocation) :: PAnnot (PVar "_loc", tlocation) :: formals in (* Construct a function definition out of the above bindings and the semantic action. *) let body = EAnnot ( Action.to_il_expr branch.action, type2scheme (nttype grammar symbol) ) in match formals with | [] -> body | _ -> EFun (formals, body) (* [program] turns an entire grammar into a test program. *) let program grammar = (* Turn the grammar into a bunch of function definitions. Grammar productions that derive from the standard library are reflected first, so that type errors are not reported in them. *) let bindings1, bindings2 = StringMap.fold (fun symbol rule (bindings1, bindings2) -> List.fold_left (fun (bindings1, bindings2) branch -> if is_standard branch then (PWildcard, actiondef grammar symbol branch) :: bindings1, bindings2 else bindings1, (PWildcard, actiondef grammar symbol branch) :: bindings2 ) (bindings1, bindings2) rule.branches ) grammar.rules ([], []) in (* Create entry points whose types are the unknowns that we are looking for. *) let ps, ts = StringMap.fold (fun symbol _ (ps, ts) -> PVar (encode (Misc.normalize symbol)) :: ps, nttype grammar symbol :: ts ) grammar.rules ([], []) in let def = { valpublic = true; valpat = PTuple ps; valval = ELet (bindings1 @ bindings2, EAnnot (bottom, type2scheme (TypTuple ts))) } in (* Insert markers to delimit the part of the file that we are interested in. These markers are recognized by [Lexmli]. This helps skip the values, types, exceptions, etc. that might be defined by the prologue or postlogue. 
*) let begindef = { valpublic = true; valpat = PVar "menhir_begin_marker"; valval = EIntConst 0 } and enddef = { valpublic = true; valpat = PVar "menhir_end_marker"; valval = EIntConst 0 } in (* Issue the test program. We include the definition of the type of tokens, because, in principle, the semantic actions may refer to it or to its data constructors. *) [ SIFunctor (grammar.parameters, interface_to_structure (tokentypedef grammar) @ SIStretch grammar.preludes :: SIValDefs (false, [ begindef; def; enddef ]) :: SIStretch grammar.postludes :: [])] (* ------------------------------------------------------------------------- *) (* Writing the program associated with a grammar to a file. *) let write grammar filename () = let ml = open_out filename in let module P = Printer.Make (struct let f = ml let locate_stretches = Some filename end) in P.program (program grammar); close_out ml (* ------------------------------------------------------------------------- *) (* Running ocamldep on the program. *) type entry = string (* basename *) * string (* filename *) type line = entry (* target *) * entry list (* dependencies *) let depend postprocess grammar = (* Create an [.ml] file and an [.mli] file, then invoke ocamldep to compute dependencies for us. *) (* If an old [.ml] or [.mli] file exists, we are careful to preserve it. We temporarily move it out of the way and restore it when we are done. There is no reason why dependency analysis should destroy existing files. *) let ocamldep_command = Printf.sprintf "%s %s %s" Settings.ocamldep (Filename.quote mlname) (Filename.quote mliname) in let output : string = Option.project ( IO.moving_away mlname (fun () -> IO.moving_away mliname (fun () -> IO.with_file mlname (write grammar mlname) (fun () -> IO.with_file mliname (Interface.write grammar) (fun () -> IO.invoke ocamldep_command ))))) in (* Echo ocamldep's output. *) print_string output; (* If [--raw-depend] was specified on the command line, stop here. 
This option is used by omake and by ocamlbuild, which perform their own postprocessing of [ocamldep]'s output. For normal [make] users, who use [--depend], some postprocessing is required, which is performed below. *) if postprocess then begin (* Make sense out of ocamldep's output. *) let lexbuf = Lexing.from_string output in let lines : line list = try Lexdep.main lexbuf with Lexdep.Error msg -> (* Echo the error message, followed by ocamldep's output. *) Error.error [] "%s" (msg ^ output) in (* Look for the line that concerns the [.cmo] target, and echo a modified version of this line, where the [.cmo] target is replaced with [.ml] and [.mli] targets, and where the dependency on the [.cmi] file is dropped. In doing so, we assume that the user's [Makefile] supports bytecode compilation, so that it makes sense to request [bar.cmo] to be built, as opposed to [bar.cmx]. This is not optimal, but will do. [camldep] exhibits the same behavior. *) List.iter (fun ((_, target_filename), dependencies) -> if Filename.check_suffix target_filename ".cmo" then let dependencies = List.filter (fun (basename, _) -> basename <> base ) dependencies in if List.length dependencies > 0 then begin Printf.printf "%s.ml %s.mli:" base base; List.iter (fun (_basename, filename) -> Printf.printf " %s" filename ) dependencies; Printf.printf "\n%!" end ) lines end; (* Stop. *) exit 0 (* ------------------------------------------------------------------------- *) (* Augmenting a grammar with inferred type information. *) (* The parameter [output] is supposed to contain the output of [ocamlc -i]. *) let read_reply (output : string) grammar = (* See comment in module [Error].
*) Error.enable(); let env : (string * int * int) list = Lexmli.main (Lexing.from_string output) in let env : (string * ocamltype) list = List.map (fun (id, openingofs, closingofs) -> decode id, Inferred (String.sub output openingofs (closingofs - openingofs)) ) env in (* Augment the grammar with new %type declarations. *) let types = StringMap.fold (fun symbol _ types -> let ocamltype = try List.assoc (Misc.normalize symbol) env with Not_found -> (* No type information was inferred for this symbol. Perhaps the mock [.ml] file or the inferred [.mli] file are out of date. Fail gracefully. *) Error.error [] "found no inferred type for %s." symbol in if StringMap.mem symbol grammar.types then (* If there was a declared type, keep it. *) types else (* Otherwise, insert the inferred type. *) StringMap.add symbol ocamltype types ) grammar.rules grammar.types in { grammar with types = types } (* ------------------------------------------------------------------------- *) (* Inferring types for a grammar's nonterminals. *) let infer grammar = (* Invoke ocamlc to do type inference for us. *) let ocamlc_command = Printf.sprintf "%s -c -i %s" Settings.ocamlc (Filename.quote mlname) in let output = write grammar mlname (); match IO.invoke ocamlc_command with | Some result -> Sys.remove mlname; result | None -> (* 2015/10/05: intentionally do not remove the [.ml] file if [ocamlc] fails. (Or if an exception is thrown.) *) exit 1 in (* Make sense out of ocamlc's output. 
*) read_reply output grammar (* ------------------------------------------------------------------------- *) let write_query filename grammar = write grammar filename (); exit 0 (* ------------------------------------------------------------------------- *) let read_reply filename grammar = read_reply (IO.read_whole_file filename) grammar menhir-20200123/src/infer.mli000066400000000000000000000043121361226111300156200ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax (* [ntvar symbol] is the name of the type variable associated with a nonterminal symbol. *) val ntvar: string -> string (* [infer grammar] analyzes the grammar [grammar] and returns a new grammar, augmented with a [%type] declaration for every nonterminal symbol. The [ocamlc] compiler is used to infer types. *) val infer: grammar -> grammar (* [depend postprocess grammar] prints (on the standard output channel) the OCaml dependencies induced by the semantic actions. If [postprocess] is [true], then ocamldep's output is postprocessed, otherwise it is echoed unchanged. This function does not return; it terminates the program. *) val depend: bool -> grammar -> 'never_returns (* [write_query filename grammar] writes the grammar's semantic actions to a mock [.ml] file named [filename]. This file can then be submitted to [ocamlc] for type inference. See [--infer-write-query ] in the manual. 
*) val write_query: string -> grammar -> 'never_returns (* [read_reply filename grammar] reads the types inferred by OCaml for the mock [.ml] file described above, and returns a new grammar, augmented with a [%type] declaration for every nonterminal symbol. *) val read_reply: string -> grammar -> grammar menhir-20200123/src/inlining.ml000066400000000000000000000534471361226111300161700ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let position = Positions.position open Keyword type sw = Action.sw open BasicSyntax open ListMonad let drop = MenhirLib.General.drop let take = MenhirLib.General.take (* -------------------------------------------------------------------------- *) (* Throughout this file, branches (productions) are represented as lists of producers. We consider it acceptable to perform operations whose cost is linear in the length of a production, even when (with more complicated code) it would be possible to eliminate this cost. *) (* -------------------------------------------------------------------------- *) (* [search p i xs] searches the list [xs] for an element [x] that satisfies [p]. If successful, then it returns a pair of: - [i] plus the offset of [x] in the list, and - the element [x]. *) let rec search (p : 'a -> bool) (i : int) (xs : 'a list) : (int * 'a) option = match xs with | [] -> None | x :: xs -> if p x then Some (i, x) else search p (i+1) xs (* [search_at p i xs] searches the list [xs] for an element [x] that satisfies [p]. The search begins at index [i] in the list. 
If successful, then it returns a pair of: - the offset of [x] in the list, and - the element [x]. *) let search_at p i xs = search p i (drop i xs) (* -------------------------------------------------------------------------- *) (* [find grammar symbol] looks up the definition of [symbol], which must be a valid nonterminal symbol, in the grammar [grammar]. *) let find grammar symbol : rule = try StringMap.find symbol grammar.rules with Not_found -> (* This cannot happen. *) assert false (* -------------------------------------------------------------------------- *) (* [check_no_producer_attributes] checks that a producer, which represents a use site of an %inline symbol, does not carry any attributes. This ensures that we need not worry about propagating attributes through inlining. *) let check_no_producer_attributes producer = match producer_attributes producer with | [] -> () | (id, _payload) :: _attributes -> Error.error [position id] "the nonterminal symbol %s is declared %%inline.\n\ A use of it cannot carry an attribute." (producer_symbol producer) (* -------------------------------------------------------------------------- *) (* 2015/11/18. The interaction of %prec and %inline is not documented. It used to be the case that we would disallow marking a production both %inline and %prec. Now, we allow it, but we check that (1) it is inlined at the last position of the host production and (2) the host production does not already have a %prec annotation. *) let check_prec_inline caller producer nsuffix callee = callee.branch_prec_annotation |> Option.iter (fun callee_prec -> (* The callee has a %prec annotation. *) (* Check condition 1. *) if nsuffix > 0 then begin let symbol = producer_symbol producer in Error.error [ position callee_prec; caller.branch_position ] "this production carries a %%prec annotation,\n\ and the nonterminal symbol %s is marked %%inline.\n\ For this reason, %s can be used only in tail position." symbol symbol end; (* Check condition 2. 
*) caller.branch_prec_annotation |> Option.iter (fun caller_prec -> let symbol = producer_symbol producer in Error.error [ position callee_prec; position caller_prec ] "this production carries a %%prec annotation,\n\ and the nonterminal symbol %s is marked %%inline.\n\ For this reason, %s cannot be used in a production\n\ which itself carries a %%prec annotation." symbol symbol ) ) (* -------------------------------------------------------------------------- *) (* 2015/11/18. If the callee has a %prec annotation (which implies that the caller does not have one, and that the callee appears in tail position in the caller) then the annotation is inherited. This seems reasonable, but remains undocumented. *) let propagate_prec_annotation caller callee = match callee.branch_prec_annotation with | (Some _) as annotation -> assert (caller.branch_prec_annotation = None); annotation | None -> caller.branch_prec_annotation (* -------------------------------------------------------------------------- *) (* [new_candidate x] is a candidate fresh name, which is based on [x] in an unspecified way. A fairly arbitrary construction can be used here; we just need it to produce an infinite sequence of names, so that eventually we are certain to be able to escape any finite set of unavailable names. We also need this construction to produce reasonably concise names, as it can be iterated several times in practice; I have observed up to 9 iterations in real-world grammars. *) (* Here, the idea is to add a suffix of the form _inlined['0'-'9']+ to the name [x], if it does not already include such a suffix. If [x] already carries such a suffix, then we increment the integer number. *) let new_candidate x = let x, n = ChopInlined.chop (Lexing.from_string x) in Printf.sprintf "%s_inlined%d" x (n + 1) (* [fresh names x] returns a fresh name that is not in the set [names]. The new name is obtained by iterating [new_candidate] until we fall outside the set [names]. 
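The strategy can be illustrated by a self-contained miniature. Here, [chop] is a simplified stand-in for the real [ChopInlined.chop], which is implemented as a lexer; only the overall behavior is mirrored:

```ocaml
module StringSet = Set.Make (String)

(* A simplified stand-in for [ChopInlined.chop]: split a trailing
   [_inlined<n>] suffix off [x], if there is one. *)
let chop x =
  let marker = "_inlined" in
  let mlen = String.length marker and len = String.length x in
  let is_digit c = '0' <= c && c <= '9' in
  let rec start_of_digits i =
    if i > 0 && is_digit x.[i - 1] then start_of_digits (i - 1) else i in
  let d = start_of_digits len in
  if d < len && d >= mlen && String.sub x (d - mlen) mlen = marker then
    String.sub x 0 (d - mlen), int_of_string (String.sub x d (len - d))
  else
    x, 0

let new_candidate x =
  let x, n = chop x in
  Printf.sprintf "%s_inlined%d" x (n + 1)

let rec fresh names x =
  if StringSet.mem x names then fresh names (new_candidate x) else x

let () =
  assert (new_candidate "x" = "x_inlined1");
  assert (new_candidate "x_inlined1" = "x_inlined2");
  let used = StringSet.of_list [ "x"; "x_inlined1" ] in
  assert (fresh used "x" = "x_inlined2");
  assert (fresh used "y" = "y")
```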
*) let rec fresh names x = if StringSet.mem x names then fresh names (new_candidate x) else x (* -------------------------------------------------------------------------- *) (* [rename used producers] renames the producers [producers] of the inlined branch (the callee) if necessary to avoid a clash with the set [used] of the names used by the producers of the host branch (the caller). This set need not contain the name of the producer that is inlined away. *) (* This function produces a pair of: 1. a substitution [phi], which represents the renaming that we have performed, and which must be applied to the semantic action of the callee; 2. the renamed [producers]. *) let rename (used : StringSet.t) producers: Action.subst * producers = let phi, _used, producers = List.fold_left (fun (phi, used, producers) producer -> let x = producer_identifier producer in if StringSet.mem x used then let x' = fresh used x in (x, x') :: phi, StringSet.add x' used, { producer with producer_identifier = x' } :: producers else (phi, StringSet.add x used, producer :: producers) ) ([], used, []) producers in phi, List.rev producers (* -------------------------------------------------------------------------- *) (* [define_positions] defines how the start and end positions of the callee should be computed once it is inlined into the caller. This information is used to transform [$startpos] and [$endpos] in the callee and to transform [$startpos(x)] and [$endpos(x)] in the caller. *) (* 2015/11/04. We ensure that positions are computed in the same manner, regardless of whether inlining is performed. 
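The three-way case analysis performed below for [startp] can be mirrored by a standalone miniature, with simplified stand-ins for the real [sw] values:

```ocaml
type where = Start | End
type spos = Named of string * where | BeforeEnd

(* Mirrors [startp]: the start of the first callee element if the callee
   is nonempty, else the end of the last prefix element, else the end
   position stored in the cell below, i.e. $endpos($0). *)
let startp (name : string array) nprefix ncallee : spos =
  if ncallee > 0 then Named (name.(nprefix), Start)
  else if nprefix > 0 then Named (name.(nprefix - 1), End)
  else BeforeEnd

let () =
  (* a two-element callee [b c] inlined after the one-element prefix [a]: *)
  assert (startp [| "a"; "b"; "c"; "d" |] 1 2 = Named ("b", Start));
  (* an epsilon callee after a nonempty prefix: *)
  assert (startp [| "a" |] 1 0 = Named ("a", End));
  (* an epsilon callee with an empty prefix: *)
  assert (startp [||] 0 0 = BeforeEnd)
```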
*) (* The arguments of this function are as follows: [name] an array of the names of the producers of the new branch [nprefix] the length of the prefix of the caller, up to the inlining site [ncallee] the length of the callee The results are as follows: [startp] how to transform $startpos in the callee [endp] how to transform $endpos in the callee [beforeendp] how to transform $endpos($0) in the callee *) let define_positions (name : string array) nprefix ncallee : sw * sw * sw = let startp = if ncallee > 0 then (* If the inner production is non-epsilon, things are easy. The start position of the inner production is the start position of its first element. *) RightNamed name.(nprefix), WhereStart else if nprefix > 0 then (* If the inner production is epsilon, we are supposed to compute the end position of whatever comes in front of it. If the prefix is nonempty, then this is the end position of the last symbol in the prefix. *) RightNamed (name.(nprefix - 1)), WhereEnd else (* If the inner production is epsilon and the prefix is empty, then we need to look up the end position stored in the top stack cell. This is the reason why we need the keyword [$endpos($0)]. It is required in this case to preserve the semantics of $startpos and $endpos. *) Before, WhereEnd (* Note that, contrary to intuition perhaps, we do NOT have that if the prefix is empty, then the start position of the inner production is the start position of the outer production. This is true only if the inner production is non-epsilon. *) in let endp = if ncallee > 0 then (* If the inner production is non-epsilon, things are easy: its end position is the end position of its last element. *) RightNamed (name.(nprefix + ncallee - 1)), WhereEnd else (* If the inner production is epsilon, then its end position is equal to its start position. *) startp (* We must also transform [$endpos($0)] if it is used by the inner production.
It refers to the end position of the stack cell that comes before the inner production. So, if the prefix is non-empty, then it translates to the end position of the last element of the prefix. Otherwise, it translates to [$endpos($0)]. *) and beforeendp = if nprefix > 0 then RightNamed (name.(nprefix - 1)), WhereEnd else Before, WhereEnd in startp, endp, beforeendp (* -------------------------------------------------------------------------- *) (* [rename_sw_outer] transforms the keywords in the outer production (the caller) during inlining. It replaces [$startpos(x)] and [$endpos(x)], where [x] is the name of the callee, with [startpx] and [endpx], respectively. *) let rename_sw_outer (x, startpx, endpx) (sw : sw) : sw option = match sw with | Before, _ -> None | RightNamed x', where -> if x' = x then match where with | WhereStart -> Some startpx | WhereEnd -> Some endpx | WhereSymbolStart -> assert false (* has been expanded away *) else None | Left, _ -> (* [$startpos], [$endpos], and [$symbolstartpos] have been expanded away earlier; see [KeywordExpansion]. *) assert false (* -------------------------------------------------------------------------- *) (* [rename_sw_inner] transforms the keywords in the inner production (the callee) during inlining. It replaces [$endpos($0)] with [beforeendp]. *) let rename_sw_inner beforeendp (sw : sw) : sw option = match sw with | Before, where -> assert (where = WhereEnd); Some beforeendp | RightNamed _, _ -> None | Left, _ -> (* [$startpos] and [$endpos] have been expanded away earlier; see [KeywordExpansion]. *) assert false (* -------------------------------------------------------------------------- *) (* [inline_branch caller site callee] inlines the branch [callee] into the branch [caller] at the site [site]. By convention, a site is a pair of an integer index -- the index [i] of the producer that must be inlined away -- and a producer [producer] -- the producer itself. 
This is redundant, as [producer] can be recovered based on [caller] and [i], but convenient. *) type site = int * producer let inline_branch caller (i, producer : site) (callee : branch) : branch = (* The host branch (the caller) is divided into three sections: a prefix of length [nprefix], the producer that we wish to inline away, and a suffix of length [nsuffix]. *) (* Compute the length of the prefix and suffix. *) let nprefix = i in let nsuffix = List.length caller.producers - (i + 1) in (* Construct the prefix and suffix. *) let prefix = take nprefix caller.producers and suffix = drop (nprefix + 1) caller.producers in (* Apply the (undocumented) restrictions that concern the interaction between %prec and %inline. Then, (possibly) propagate a %prec annotation. *) check_prec_inline caller producer nsuffix callee; let branch_prec_annotation = propagate_prec_annotation caller callee in (* Compute the names of the producers in the host branch (the caller), minus the one that is being inlined away. Rename the producers of the inlined branch (the callee), if necessary, so as to avoid a clash with this set. The goal is to guarantee that, after inlining, all producers in the newly constructed branch have unique names. *) let used = StringSet.union (names prefix) (names suffix) in let phi, inlined_producers = rename used callee.producers in (* Construct (the producers of) the new branch. The prefix and suffix of the caller are preserved. In the middle, [producer] disappears and is replaced with [inlined_producers]. For debugging purposes, check that each producer in the new branch carries a unique name. *) let producers = prefix @ inlined_producers @ suffix in let (_ : StringSet.t) = names producers in (* Find out how the start and end positions of the callee should be computed once it is inlined into the caller. 
*) let startp, endp, beforeendp = let name = producers |> Array.of_list |> Array.map producer_identifier in let ncallee = List.length callee.producers in define_positions name nprefix ncallee in (* Apply appropriate renamings to the semantic actions of the caller and callee, then compose them using a [let] binding. If [x] is the name of the producer that we wish to inline away, then the variable [x] in the caller's semantic action should refer to the semantic value produced by the callee's semantic action. *) let x = producer_identifier producer in let caller_action, callee_action = Action.rename (rename_sw_outer (x, startp, endp)) [] caller.action, Action.rename (rename_sw_inner beforeendp) phi callee.action in let action = Action.compose x callee_action caller_action in (* We are done! Build a new branch. *) let { branch_position; branch_production_level; _ } = caller in { branch_position; producers; action; branch_prec_annotation; branch_production_level; } (* -------------------------------------------------------------------------- *) (* Inlining a list of branches [callees] into the branch [caller] at [site]. *) let inline_branches caller site (callees : branches) : branches = List.map (inline_branch caller site) callees (* -------------------------------------------------------------------------- *) (* For greater syntactic convenience, the main function is written as a functor, and re-packaged as a function at the very end. *) (* Roughly speaking, the transformation is implemented by two mutually recursive functions. [expand_branches] transforms a list of branches into a list of (expanded) branches; [expand_symbol] maps a nonterminal symbol (which may or may not be marked %inline) to its definition in the transformed grammar, an (expanded) rule. In order to avoid duplication of work, we memoize [expand_symbol]. Thus, the expansion of each symbol is computed at most once. (Expansions are demanded top-down, but are computed bottom-up.) 
Memoization is implemented without pain by using a ready-made fixed point combinator, [Memoize.defensive_fix]. Furthermore, this fixed point combinator dynamically detects cycles of %inline nonterminal symbols, allowing us to avoid divergence and display a nice error message. *) module Inline (G : sig val grammar: grammar end) = struct open G let is_inline_symbol = is_inline_symbol grammar let is_inline_producer = is_inline_producer grammar let find = find grammar (* In [--coq] mode, %inline is forbidden. There are two reasons for this. One technical reason is that inlining requires constructing composite semantic actions (using [Action.compose], etc.) and this construction is currently OCaml-specific. (This could be rather easily changed, though.) A more philosophical reason is that we don't want to have a large gap between the grammar written by the user in the .mly file and the grammar written by Menhir in the .v file. The latter grammar is the reference grammar, the one with respect to which the generated parser is proved correct. *) let () = if Settings.coq then StringMap.iter (fun _ rule -> if rule.inline_flag then Error.error rule.positions "%%inline is not supported by the Coq back-end." ) grammar.rules (* This is [expand_branches], parameterized by its companion function, [expand_symbol]. The parameter [i], an integer, is used to perform a left-to-right sweep: the precondition of [expand_branches] is that there are no inlining sites at indices less than [i] in [branches]. Thus, we can begin searching at index [i]. (Beginning to search at index 0 would work, too, but would cause redundant searches.) *) let rec expand_branches expand_symbol i branches : branches = (* For each branch [caller] in the list [branches], *) branches >>= fun (caller : branch) -> (* Search for an inlining site in the branch [caller]. We begin the search at position [i], as we know that every inlining site left of this position has been dealt with already.
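The memoization-with-cycle-detection idea can be sketched in isolation. This is a simplified stand-in for [Memoize.String.defensive_fix], not Menhir's implementation (in particular, the real combinator also records the list of symbols that form the cycle):

```ocaml
exception Cycle of string

let defensive_fix (f : (string -> 'a) -> string -> 'a) : string -> 'a =
  let table = Hashtbl.create 16 in
  let pending = Hashtbl.create 16 in
  let rec fix x =
    match Hashtbl.find_opt table x with
    | Some v -> v
    | None ->
        (* a re-entrant demand for [x] while [x] is being computed
           reveals a cycle *)
        if Hashtbl.mem pending x then raise (Cycle x);
        Hashtbl.add pending x ();
        let v = f fix x in
        Hashtbl.remove pending x;
        Hashtbl.add table x v;
        v
  in
  fix

let () =
  (* "a" refers to "b", and "b" refers to itself: a cycle is detected. *)
  let f self x = if x = "b" then self "b" else self "b" + 1 in
  try ignore ((defensive_fix f) "a"); assert false
  with Cycle "b" -> ()
```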
*) match search_at is_inline_producer i caller.producers with | None -> (* There is none; we are done. *) return caller | Some ((i, producer) as site) -> (* There is one. This is an occurrence of a nonterminal symbol [symbol] that is marked %inline. We look up its (expanded) definition (via a recursive call to [expand_symbol]), yielding a set of branches, which we inline into the branch [caller]. Then, we continue looking for inlining sites. *) check_no_producer_attributes producer; let symbol = producer_symbol producer in expand_symbol symbol |> get_branches |> inline_branches caller site |> expand_branches expand_symbol i (* This is [expand_symbol], parameterized by itself. *) let expand_symbol expand_symbol symbol : rule = (* Find the rule that defines this symbol. Then, transform this rule by applying [expand_branches] to its branches. The left-to-right sweep begins at index 0. *) find symbol |> transform_branches (expand_branches expand_symbol 0) (* Apply [defensive_fix] to obtain a closed function [expand_symbol]. *) let expand_symbol : Syntax.symbol -> rule = Memoize.String.defensive_fix expand_symbol (* Wrap [expand_symbol] in an exception handler, so that, when a cycle of %inline nonterminal symbols is detected, a good error message is displayed. *) let expand_symbol symbol = try expand_symbol symbol with Memoize.String.Cycle (symbols, symbol) -> let rule = find symbol in let b = Buffer.create 128 in Printf.bprintf b "there is a cycle of %%inline nonterminal symbols:\n"; begin match symbols with | [] -> assert false | head :: [] -> assert (head = symbol); Printf.bprintf b " %s refers to itself." symbol | head :: next :: symbols -> assert (head = symbol); Printf.bprintf b " %s refers to %s,\n" head next; List.iter (fun symbol -> Printf.bprintf b " which refers to %s,\n" symbol ) symbols; Printf.bprintf b " which refers back to %s." 
symbol end; Error.error rule.positions "%s" (Buffer.contents b) (* The rules of the transformed grammar are obtained by keeping only non-%inline symbols and expanding their rules. *) let rules = grammar.rules |> StringMap.filter (fun _ rule -> not rule.inline_flag) |> StringMap.mapi (fun symbol _rule -> expand_symbol symbol) (* Drop %type declarations that concern %inline symbols. *) let keep symbol _rule : bool = not (is_inline_symbol symbol) let types = StringMap.filter keep grammar.types (* Drop %on_error_reduce declarations that concern %inline symbols. At the same time, display a warning, as this seems strange: these declarations are useless. *) let keep_or_warn (symbol : string) _rule : bool = let keep = keep symbol _rule in if not keep then Error.grammar_warning [] "the declaration %%on_error_reduce %s\n\ has no effect: this symbol is marked %%inline and is expanded away." symbol; keep let on_error_reduce = StringMap.filter keep_or_warn grammar.on_error_reduce (* We are done. *) let grammar = { grammar with rules; types; on_error_reduce } end (* -------------------------------------------------------------------------- *) (* Re-package the above functor as a function. *) let inline grammar = let module I = Inline(struct let grammar = grammar end) in I.grammar menhir-20200123/src/inlining.mli000066400000000000000000000023051361226111300163240ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open BasicSyntax (** [inline g] traverses the grammar [g] and inlines away the nonterminal symbols whose definitions are marked [%inline]. The result is a grammar where no symbols are marked [%inline]. *) val inline: grammar -> grammar menhir-20200123/src/installation.ml000066400000000000000000000044771361226111300170610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* [normalize] normalizes a file name by recognizing . and .. and treating them in an appropriate manner. *) let rec normalize fn = let dir = Filename.dirname fn in let base = Filename.basename fn in if dir = fn then (* This could be the case e.g. if [fn] is "/". *) dir else if base = Filename.current_dir_name then (* We have "." as the basename, that is, at the end. Remove it and continue. *) normalize dir else if base = Filename.parent_dir_name then (* We have ".." as the basename, that is, at the end. Normalize the rest. Once done, chop off the basename, thus moving to the parent directory. *) Filename.dirname (normalize dir) else (* We have a normal basename. Normalize the rest. *) Filename.concat (normalize dir) base (* The directory where (we think) MenhirLib is installed. *) (* This directory used to be hard-coded in the [menhir] executable. We now adopt a different strategy. We fetch the name of the [menhir] executable, and hope that it is of the form [.../bin/menhir]. We change this to [.../lib/menhirLib], and hope that this is where MenhirLib is installed. 
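As a quick check of what [normalize] computes (restating the definition above so the example is self-contained; this assumes the Unix semantics of the standard [Filename] functions):

```ocaml
let rec normalize fn =
  let dir = Filename.dirname fn in
  let base = Filename.basename fn in
  if dir = fn then dir
  else if base = Filename.current_dir_name then normalize dir
  else if base = Filename.parent_dir_name then
    Filename.dirname (normalize dir)
  else Filename.concat (normalize dir) base

let () =
  assert (normalize "/usr/local/bin/./menhir" = "/usr/local/bin/menhir");
  assert (normalize "/usr/local/bin/../lib" = "/usr/local/lib")
```

So, for an executable installed at [/usr/local/bin/menhir], [libdir] would yield [/usr/local/lib/menhirLib].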
*) let libdir () = let root = Sys.executable_name |> normalize |> Filename.dirname (* remove [menhir] *) |> Filename.dirname (* remove [bin] *) in Filename.concat root (Filename.concat "lib" "menhirLib") menhir-20200123/src/installation.mli000066400000000000000000000020511361226111300172140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The directory where (we think) MenhirLib is installed. *) val libdir: unit -> string menhir-20200123/src/interface.ml000066400000000000000000000136271361226111300163150ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax open IL open CodeBits (* -------------------------------------------------------------------------- *) (* The [Error] exception. *) let excname = "Error" let excdef = { excname = excname; exceq = (if Settings.fixedexc then Some "Parsing.Parse_error" else None); } (* -------------------------------------------------------------------------- *) (* The type of the monolithic entry point for the start symbol [symbol]. 
*) let entrytypescheme grammar symbol = let typ = TypTextual (ocamltype_of_start_symbol grammar symbol) in type2scheme (marrow [ arrow tlexbuf TokenType.ttoken; tlexbuf ] typ) (* -------------------------------------------------------------------------- *) (* When the table back-end is active, the generated parser contains, as a sub-module, an application of [Engine.Make]. This sub-module is named as follows. *) let interpreter = "MenhirInterpreter" let checkpoint t = TypApp (interpreter ^ ".checkpoint", [ t ]) let lr1state = "lr1state" let tlr1state a : typ = TypApp (lr1state, [a]) (* -------------------------------------------------------------------------- *) (* The name of the sub-module that contains the incremental entry points. *) let incremental = "Incremental" (* The type of the incremental entry point for the start symbol [symbol]. *) let entrytypescheme_incremental grammar symbol = let t = TypTextual (ocamltype_of_start_symbol grammar symbol) in type2scheme (marrow [ tposition ] (checkpoint t)) (* -------------------------------------------------------------------------- *) (* The name of the sub-module that contains the inspection API. *) let inspection = "Inspection" (* -------------------------------------------------------------------------- *) (* The monolithic (traditional) API: the type [token], the exception [Error], and the parser's entry points. *) let monolithic_api grammar = TokenType.tokentypedef grammar @ IIComment "This exception is raised by the monolithic API functions." :: IIExcDecls [ excdef ] :: IIComment "The monolithic API." :: IIValDecls ( StringSet.fold (fun symbol decls -> (Misc.normalize symbol, entrytypescheme grammar symbol) :: decls ) grammar.start_symbols [] ) :: [] (* -------------------------------------------------------------------------- *) (* The inspection API. *) let inspection_api grammar () = let a = "a" in (* Define the types [terminal] and [nonterminal]. 
*) TokenType.tokengadtdef grammar @ NonterminalType.nonterminalgadtdef grammar @ (* Include the signature that lists the inspection functions, with appropriate type instantiations. *) IIComment "The inspection API." :: IIInclude ( with_types WKDestructive "MenhirLib.IncrementalEngine.INSPECTION" [ [ a ], "lr1state", tlr1state (TypVar a); [], "production", TypApp ("production", []); [ a ], TokenType.tctokengadt, TokenType.ttokengadt (TypVar a); [ a ], NonterminalType.tcnonterminalgadt, NonterminalType.tnonterminalgadt (TypVar a); [ a ], "env", TypApp ("env", [ TypVar a ]); ] ) :: [] (* -------------------------------------------------------------------------- *) (* The incremental API. *) let incremental_engine () : module_type = with_types WKNonDestructive "MenhirLib.IncrementalEngine.INCREMENTAL_ENGINE" [ [], "token", (* NOT [tctoken], which is qualified if [--external-tokens] is used *) TokenType.ttoken ] let incremental_entry_points grammar : interface = IIComment "The entry point(s) to the incremental API." :: IIModule (incremental, MTSigEnd [ IIValDecls ( StringSet.fold (fun symbol decls -> (symbol, entrytypescheme_incremental grammar symbol) :: decls ) grammar.start_symbols [] ) ]) :: [] let incremental_api grammar () : interface = IIModule ( interpreter, MTSigEnd ( IIComment "The incremental API." :: IIInclude (incremental_engine()) :: listiflazy Settings.inspection (inspection_api grammar) ) ) :: (* The entry points must come after the incremental API, because their type refers to the type [checkpoint]. *) incremental_entry_points grammar (* -------------------------------------------------------------------------- *) (* The complete interface of the generated parser. *) let interface grammar = [ IIFunctor (grammar.parameters, monolithic_api grammar @ listiflazy Settings.table (incremental_api grammar) ) ] (* -------------------------------------------------------------------------- *) (* Writing the interface to a file. 
*)
let write grammar () =
  (* We have a dependency on [TokenType], which takes care of the case
     where [token_type_mode] is [TokenTypeOnly]. *)
  assert (Settings.token_type_mode <> Settings.TokenTypeOnly);
  let mli = open_out (Settings.base ^ ".mli") in
  let module P = Printer.Make (struct
    let f = mli
    let locate_stretches = None
  end) in
  P.interface (interface grammar);
  close_out mli

menhir-20200123/src/interface.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module defines the interface of the generated parser. *)

(* This is the [Error] exception. *)

val excname: string
val excdef: IL.excdef

(* The type of the entry point for the start symbol [nt]. *)

val entrytypescheme: BasicSyntax.grammar -> string -> IL.typescheme

(* The name of the interpreter sub-module, when the table back-end is used. *)

val interpreter: string

(* The type ['a checkpoint], defined in the interpreter sub-module. *)

val checkpoint: IL.typ -> IL.typ

(* The name of the sub-module that contains the incremental entry points. *)

val incremental: string

(* The name of the sub-module that contains the inspection API. *)

val inspection: string

(* This writes the interface of the generated parser to the [.mli] file.
*)
val write: BasicSyntax.grammar -> unit -> unit

menhir-20200123/src/interpret.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

module I = Invariant (* artificial dependency *)
module D = Default   (* artificial dependency *)

(* --------------------------------------------------------------------------- *)

open Grammar
open SentenceParserAux

(* A delimiter. *)

type delimiter =
  string

(* An error message. *)

type message =
  string

(* A run is a series of sentences or comments, followed with a delimiter
   (at least one blank line; comments), followed with an error message. *)

type run =
  located_sentence or_comment list * delimiter * message

(* A targeted sentence is a located sentence together with the target into
   which it leads. A target tells us which state a sentence leads to, as
   well as which spurious reductions are performed at the end. *)

type target =
  ReferenceInterpreter.target

let target2state (s, _spurious) =
  s

type maybe_targeted_sentence =
  located_sentence * target option

type targeted_sentence =
  located_sentence * target

(* A targeted run is a series of targeted sentences or comments together
   with an error message. *)

type maybe_targeted_run =
  maybe_targeted_sentence or_comment list * delimiter * message

type targeted_run =
  targeted_sentence or_comment list * delimiter * message

(* A filtered targeted run is a series of targeted sentences together with
   an error message. (The comments have been filtered out.)
*) type filtered_targeted_run = targeted_sentence list * message (* --------------------------------------------------------------------------- *) (* Display and debugging. *) let print_sentence (nto, terminals) : string = let b = Buffer.create 128 in Option.iter (fun nt -> Printf.bprintf b "%s: " (Nonterminal.print false nt) ) nto; let separator = Misc.once "" " " in List.iter (fun t -> Printf.bprintf b "%s%s" (separator()) (Terminal.print t) ) terminals; Printf.bprintf b "\n"; Buffer.contents b (* --------------------------------------------------------------------------- *) (* [stream] turns a finite list of terminals into a stream of terminals. *) exception EndOfStream let stream (toks : Terminal.t list) : unit -> Terminal.t * Lexing.position * Lexing.position = let toks = ref toks in fun () -> let tok = match !toks with | tok :: more -> (* Take a token off the list, and return it. *) toks := more; tok | [] -> (* The finite list has been exhausted. Here, two plausible behaviors come to mind. The first behavior consists in raising an exception. In that case, we are creating a finite stream, and it is up to the parser to not read past its end. The second behavior consists in returning a designated token. In that case, we are creating an infinite, eventually constant, stream. The choice between these two behaviors is somewhat arbitrary; furthermore, in the second case, the choice of the designated token is arbitrary as well. Here, we adopt the second behavior if and only if the grammar has an EOF token, and we use EOF as the designated token. Again, this is arbitrary, and could be changed in the future. *) match Terminal.eof with | Some eof -> eof | None -> raise EndOfStream in (* For now, return dummy positions. *) tok, Lexing.dummy_pos, Lexing.dummy_pos (* --------------------------------------------------------------------------- *) (* [start sentence] returns the start symbol that we should use to interpret the sentence [sentence]. 
*)

(* If a start symbol was explicitly provided as part of the sentence, we
   use it. Otherwise, we use the grammar's unique start symbol, if there
   is one. *)

let start poss ((nto, _) : sentence) : Nonterminal.t =
  match nto with
  | Some nt ->
      nt
  | None ->
      match ProductionMap.is_singleton Lr1.entry with
      | None ->
          Error.error poss
            "because the grammar has multiple start symbols, each of the\n\
             sentences provided on the standard input channel must be of the\n\
             form: <start symbol>: <token>*"
      | Some (prod, _) ->
          match Production.classify prod with
          | Some nt ->
              nt
          | None ->
              assert false

(* --------------------------------------------------------------------------- *)

(* [interpret] interprets a sentence. *)

let interpret ((_, toks) as sentence) : unit =

  let nt = start [] sentence in

  (* Run the reference interpreter. This can produce a concrete syntax tree
     ([Some cst]), fail with a parser error ([None]), or fail with a lexer
     error ([EndOfStream]). *)

  (* In either case, we produce just one line of output, so it should be
     clear to the user which outcomes correspond to which sentences (should
     multiple sentences be supplied). *)

  begin try
    match
      MenhirLib.Convert.Simplified.traditional2revised
        (ReferenceInterpreter.interpret Settings.trace nt)
        (stream toks)
    with

    | Some cst ->

        (* Success. *)

        Printf.printf "ACCEPT";
        if Settings.interpret_show_cst then begin
          print_newline();
          Cst.show stdout cst
        end

    | None ->

        (* Parser failure. *)

        Printf.printf "REJECT"

  with EndOfStream ->

    (* Lexer failure. *)

    Printf.printf "OVERSHOOT"

  end;
  print_newline()

(* --------------------------------------------------------------------------- *)

(* [interpret_error_aux] interprets a sentence, expecting it to end in an
   error. Failure or success is reported via two continuations. *)

let interpret_error_aux log poss ((_, terminals) as sentence) fail succeed =
  let nt = start poss sentence in
  let open ReferenceInterpreter in
  match check_error_path log nt terminals with
  | OInputReadPastEnd ->
      fail "no syntax error occurs."
| OInputNotFullyConsumed -> fail "a syntax error occurs before the last token is reached." | OUnexpectedAccept -> fail "no syntax error occurs; in fact, this input is accepted." | OK target -> succeed nt terminals target (* --------------------------------------------------------------------------- *) (* This default error message is produced by [--list-errors] when it creates a [.messages] file, and is recognized by [--compare-errors] when it compares two such files. *) let default_message = "\n" (* [print_messages_auto] displays just the sentence and the auto-generated comments. [otarget] may be [None], in which case the auto-generated comment is just a warning that this sentence does not end in an error. *) let print_messages_auto (nt, sentence, otarget) : unit = (* Print the sentence, followed with auto-generated comments. *) print_string (print_sentence (Some nt, sentence)); match (otarget : target option) with | None -> Printf.printf "##\n\ ## WARNING: This sentence does NOT end with a syntax error, as it should.\n\ ##\n" | Some (s', spurious) -> Printf.printf "##\n\ ## Ends in an error in state: %d.\n\ ##\n\ %s##\n" (Lr1.number s') (* [Lr0.print] or [Lr0.print_closure] could be used here. The latter could sometimes be helpful, but is usually intolerably verbose. *) (Lr0.print "## " (Lr1.state s')) ; Printf.printf "## The known suffix of the stack is as follows:\n\ ##%s\n\ ##\n" (Invariant.print (Invariant.stack s')) ; if spurious <> [] then begin Printf.printf "## WARNING: This example involves spurious reductions.\n\ ## This implies that, although the LR(1) items shown above provide an\n\ ## accurate view of the past (what has been recognized so far), they\n\ ## may provide an INCOMPLETE view of the future (what was expected next).\n" ; List.iter (fun (s, prod) -> Printf.printf "## In state %d, spurious reduction of production %s\n" (Lr1.number s) (Production.print prod) ) spurious; Printf.printf "##\n" end (* [print_messages_item] displays one data item. 
The item is of the form [nt, sentence, target], which means that beginning at the start symbol [nt], the sentence [sentence] ends in an error in the target state given by [target]. [target] also contains information about which spurious reductions are performed at the end. The display obeys the [.messages] file format. *) let print_messages_item (nt, sentence, target) : unit = (* Print the sentence, followed with auto-generated comments. *) print_messages_auto (nt, sentence, Some target); (* Then, print a proposed error message, between two blank lines. *) Printf.printf "\n%s\n" default_message (* --------------------------------------------------------------------------- *) (* [write_run run] writes a run into a new [.messages] file. Manually-written comments are preserved. New auto-generated comments are produced. *) let write_run : maybe_targeted_run or_comment -> unit = function | Thing (sentences_or_comments, delimiter, message) -> (* First, print every sentence and human comment. *) List.iter (fun sentence_or_comment -> match sentence_or_comment with | Thing ((poss, ((_, toks) as sentence)), target) -> let nt = start poss sentence in (* Every sentence is followed with newly generated auto-comments. *) print_messages_auto (nt, toks, target) | Comment c -> print_string c ) sentences_or_comments; (* Then, print the delimiter, which must begin with a blank line and may include comments. *) print_string delimiter; (* Then, print the error message. *) print_string message (* No need for another blank line. It will be printed as part of a separate [Comment]. *) | Comment comments -> (* Must begin with a blank line. *) print_string comments (* --------------------------------------------------------------------------- *) (* [interpret_error] interprets a sentence, expecting it to end in an error. Failure or success is reported on the standard output channel. This is used by [--interpret-error]. 
*) let fail msg = Error.error [] "%s" msg let succeed nt terminals target = print_messages_item (nt, terminals, target); exit 0 let interpret_error sentence = interpret_error_aux Settings.trace [] sentence fail succeed (* --------------------------------------------------------------------------- *) (* [target_sentence] interprets a (located) sentence, expecting it to end in an error, computes the state in which the error is obtained, and constructs a targeted sentence. *) let target_sentence (signal : Positions.positions -> ('a, out_channel, unit, unit) format4 -> 'a) : located_sentence -> maybe_targeted_sentence = fun (poss, sentence) -> (poss, sentence), interpret_error_aux false poss sentence (* failure: *) (fun msg -> signal poss "this sentence does not end with a syntax error, as it should.\n%s" msg ; None ) (* success: *) (fun _nt _terminals target -> Some target) let target_run_1 signal : run -> maybe_targeted_run = fun (sentences, delimiter, message) -> List.map (or_comment_map (target_sentence signal)) sentences, delimiter, message let target_run_2 : maybe_targeted_run -> targeted_run = fun (sentences, delimiter, message) -> let aux (x, y) = (x, Misc.unSome y) in List.map (or_comment_map aux) sentences, delimiter, message let target_runs : run list -> targeted_run list = fun runs -> let c = Error.new_category() in let signal = Error.signal c in (* Interpret all sentences, possibly displaying multiple errors. *) let runs = List.map (target_run_1 signal) runs in (* Abort if an error occurred. *) Error.exit_if c; (* Remove the options introduced by the first phase above. *) let runs = List.map target_run_2 runs in runs (* --------------------------------------------------------------------------- *) (* [filter_things] filters out the comments in a list of things or comments. *) let filter_things : 'a or_comment list -> 'a list = fun things -> List.flatten (List.map unThing things) (* [filter_run] filters out the comments within a run. 
*) let filter_run : targeted_run -> filtered_targeted_run = fun (sentences, _, message) -> filter_things sentences, message (* --------------------------------------------------------------------------- *) (* [setup()] returns a function [read] which reads one sentence from the standard input channel. *) let setup () : unit -> sentence option = let open Lexing in let lexbuf = from_channel stdin in lexbuf.lex_curr_p <- { lexbuf.lex_curr_p with pos_fname = "(stdin)" }; let read () = try SentenceParser.optional_sentence SentenceLexer.lex lexbuf with Parsing.Parse_error -> Error.error (Positions.lexbuf lexbuf) "ill-formed input sentence." in read (* --------------------------------------------------------------------------- *) (* Display an informational message about the contents of a [.messages] file. *) let stats (runs : run or_comment list) = (* [s] counts the sample input sentences. [m] counts the error messages. *) let s = ref 0 and m = ref 0 in List.iter (function | Thing (sentences, _, _) -> incr m; List.iter (function | Thing _ -> incr s | Comment _ -> () ) sentences | Comment _ -> () ) runs; Printf.eprintf "Read %d sample input sentences and %d error messages.\n%!" !s !m; runs (* --------------------------------------------------------------------------- *) (* Reading a [.messages] file. *) (* Our life is slightly complicated by the fact that the whitespace between two runs can contain comments, which we wish to preserve when performing [--update-errors]. *) let read_messages filename : run or_comment list = let open Segment in (* Read and segment the file. *) let segments : (tag * string * Lexing.lexbuf) list = segment filename in (* Process the segments, two by two. We expect one segment to contain a non-empty series of sentences, and the next segment to contain free-form text. 
*) let rec loop accu segments = match segments with | [] -> List.rev accu | (Whitespace, comments, _) :: segments -> loop (Comment comments :: accu) segments | (Segment, _, lexbuf) :: segments -> (* Read a series of located sentences. *) match SentenceParser.entry SentenceLexer.lex lexbuf with | exception Parsing.Parse_error -> Error.error [Positions.cpos lexbuf] "ill-formed sentence." | sentences -> (* In principle, we should now find a segment of whitespace followed with a segment of text. By construction, the two kinds of segments alternate. *) match segments with | (Whitespace, comments, _) :: (Segment, message, _) :: segments -> let run : run = sentences, comments, message in loop (Thing run :: accu) segments | [] | [ _ ] -> Error.error (Positions.one (Lexing.lexeme_end_p lexbuf)) "missing a final message. I may be desynchronized." | (Segment, _, _) :: _ | (Whitespace, _, _) :: (Whitespace, _, _) :: _ -> (* Should not happen, thanks to the alternation between the two kinds of segments. *) assert false in stats (loop [] segments) (* --------------------------------------------------------------------------- *) (* [message_table] converts a list of targeted runs to a table (a mapping) of states to located sentences and messages. Optionally, it can detect that two sentences lead to the same state, and report an error. *) let message_table (detect_redundancy : bool) (runs : filtered_targeted_run list) : (located_sentence * message) Lr1.NodeMap.t = let c = Error.new_category() in let table = List.fold_left (fun table (sentences_and_states, message) -> List.fold_left (fun table (sentence2, target) -> let s = target2state target in match Lr1.NodeMap.find s table with | sentence1, _ -> if detect_redundancy then Error.signal c (fst sentence1 @ fst sentence2) "these sentences both cause an error in state %d." 
(Lr1.number s); table | exception Not_found -> Lr1.NodeMap.add s (sentence2, message) table ) table sentences_and_states ) Lr1.NodeMap.empty runs in Error.exit_if c; table (* --------------------------------------------------------------------------- *) (* [compile_runs] converts a list of targeted runs to OCaml code that encodes a mapping of state numbers to error messages. The code is sent to the standard output channel. *) let compile_runs filename (runs : filtered_targeted_run list) : unit = (* We wish to produce a function that maps a state number to a message. By convention, we call this function [message]. *) let name = "message" in let open IL in let open CodeBits in let default = { branchpat = PWildcard; branchbody = eraisenotfound (* The default branch raises an exception, which can be caught by the user, who can then produce a generic error message. *) } in let branches = List.fold_left (fun branches (sentences_and_states, message) -> (* Create an or-pattern for these states. *) let states = List.map (fun (_, target) -> let s = target2state target in pint (Lr1.number s) ) sentences_and_states in (* Map all these states to this message. *) { branchpat = POr states; branchbody = EStringConst message } :: branches ) [ default ] runs in let messagedef = { valpublic = true; valpat = PVar name; valval = EFun ([ PVar "s" ], EMatch (EVar "s", branches)) } in let program = [ SIComment (Printf.sprintf "This file was auto-generated based on \"%s\"." filename); SIComment (Printf.sprintf "Please note that the function [%s] can raise [Not_found]." name); SIValDefs (false, [ messagedef ]); ] in (* Write this program to the standard output channel. *) let module P = Printer.Make (struct let f = stdout let locate_stretches = None end) in P.program program (* --------------------------------------------------------------------------- *) (* The rest of this file is the function [run], internally written as a functor [Run] for syntactic convenience. 
*) module Run (X : sig end) = struct (* --------------------------------------------------------------------------- *) (* If [--interpret] is set, interpret the sentences found on the standard input channel, then stop, without generating a parser. *) (* We read a series of sentences from the standard input channel. To allow interactive use, we interpret each sentence as soon as it is read. *) let () = if Settings.interpret then let read = setup() in Printf.printf "Ready!\n%!"; while true do match read() with | None -> exit 0 | Some sentence -> interpret sentence done (* --------------------------------------------------------------------------- *) (* If [--interpret-error] is set, interpret one sentence found on the standard input channel, then stop, without generating a parser. *) (* We read just one sentence, confirm that this sentence ends in an error, and (if that is the case) display the number of the state that is reached. *) let () = if Settings.interpret_error then let read = setup() in match read() with | None -> exit 1 (* abnormal: no input *) | Some sentence -> interpret_error sentence (* never returns *) (* --------------------------------------------------------------------------- *) (* If [--compile-errors ] is set, compile the error message descriptions found in file [filename] down to OCaml code, then stop. *) let () = Settings.compile_errors |> Option.iter (fun filename -> (* Read the file. *) let runs : run or_comment list = read_messages filename in (* Drop the comments in between two runs. *) let runs : run list = filter_things runs in (* Convert every sentence to a state number. We signal an error if a sentence does not end in an error, as expected. *) let runs : targeted_run list = target_runs runs in (* Remove comments within the runs. *) let runs : filtered_targeted_run list = List.map filter_run runs in (* Build a mapping of states to located sentences. This allows us to detect if two sentences lead to the same state. 
*) let _ = message_table true runs in (* In principle, we would like to check whether this set of sentences is complete (i.e., covers all states where an error can arise), but this may be costly -- it requires running [LRijkstra]. Instead, we offer a separate facility for comparing two [.messages] files, one of which can be produced via [--list-errors]. This can be used to ensure completeness. *) (* Now, compile this information down to OCaml code. We wish to produce a function that maps a state number to a message. By convention, we call this function [message]. *) compile_runs filename runs; exit 0 ) (* --------------------------------------------------------------------------- *) (* If two [--compare-errors ] directives are provided, compare the two message descriptions files, and stop. We wish to make sure that every state that appears on the left-hand side appears on the right-hand side as well. *) let () = Settings.compare_errors |> Option.iter (fun (filename1, filename2) -> (* Read and convert both files, as above. *) let runs1 = read_messages filename1 and runs2 = read_messages filename2 in let runs1 = filter_things runs1 and runs2 = filter_things runs2 in let runs1 = target_runs runs1 and runs2 = target_runs runs2 in (* here, it would be OK to ignore errors *) let runs1 = List.map filter_run runs1 and runs2 = List.map filter_run runs2 in let table1 = message_table false runs1 and table2 = message_table false runs2 in (* Check that the domain of [table1] is a subset of the domain of [table2]. *) let c = Error.new_category() in table1 |> Lr1.NodeMap.iter (fun s ((poss1, _), _) -> if not (Lr1.NodeMap.mem s table2) then Error.signal c poss1 "this sentence leads to an error in state %d.\n\ No sentence that leads to this state exists in \"%s\"." (Lr1.number s) filename2 ); (* Check that [table1] is a subset of [table2], that is, for every state [s] in the domain of [table1], [s] is mapped by [table1] and [table2] to the same error message. 
As an exception, if the message found in [table1] is the default message, then no comparison takes place. This allows using [--list-errors] and [--compare-errors] in conjunction to ensure that a [.messages] file is complete, without seeing warnings about different messages. *) table1 |> Lr1.NodeMap.iter (fun s ((poss1, _), message1) -> if message1 <> default_message then try let (poss2, _), message2 = Lr1.NodeMap.find s table2 in if message1 <> message2 then Error.warning (poss1 @ poss2) "these sentences lead to an error in state %d.\n\ The corresponding messages in \"%s\" and \"%s\" differ." (Lr1.number s) filename1 filename2 with Not_found -> () ); Error.exit_if c; exit 0 ) (* --------------------------------------------------------------------------- *) (* If [--update-errors ] is set, update the error message descriptions found in file [filename]. The idea is to re-generate the auto-comments, which are marked with ##, while leaving the rest untouched. *) let () = Settings.update_errors |> Option.iter (fun filename -> (* Read the file. *) let runs : run or_comment list = read_messages filename in (* Convert every sentence to a state number. Warn, but do not fail, if a sentence does not end in an error, as it should. *) let runs : maybe_targeted_run or_comment list = List.map (or_comment_map (target_run_1 Error.warning)) runs in (* We might wish to detect if two sentences lead to the same state. We might also wish to detect if this set of sentences is incomplete, and complete it automatically. However, the first task is carried out by [--compile-errors] already, and the second task is carried out by [--list-errors] and [--compare-errors] together. For now, let's try and keep things as simple as possible. The task of [--update-errors] should be to update the auto-generated comments, without failing, and without adding or removing sentences. *) (* Now, write a new [.messages] to the standard output channel, with new auto-generated comments. 
*)
      List.iter write_run runs;
      exit 0
    )

(* --------------------------------------------------------------------------- *)

(* If [--echo-errors <filename>] is set, echo the error sentences found in
   file [filename]. Do not echo the error messages or the comments. *)

(* In principle, we should be able to run this command without even giving
   an .mly file name on the command line, and without building the automaton.
   This is not possible at the moment, because our code is organized in too
   rigid a manner. *)

let () =
  Settings.echo_errors |> Option.iter (fun filename ->

    (* Read the file. *)
    let runs : run or_comment list = read_messages filename in

    (* Echo. *)
    List.iter (or_comment_iter (fun run ->
      let (sentences : located_sentence or_comment list), _, _ = run in
      List.iter (or_comment_iter (fun (_, sentence) ->
        print_string (print_sentence sentence)
      )) sentences
    )) runs;
    exit 0
  )

(* --------------------------------------------------------------------------- *)

(* End of the functor [Run]. *)

end

let run () =
  let module R = Run(struct end) in
  ()

menhir-20200123/src/interpret.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* [run()] is in charge of handling several command line options, namely
   [--interpret], [--interpret-error], [--compile-errors], [--compare-errors].
   If any of these options is present, the execution of Menhir stops here.
*)
val run: unit -> unit

(* This default error message is produced by [--list-errors] when it creates
   a [.messages] file, and is recognized by [--compare-errors] when it
   compares two such files. *)

val default_message: string

(* [print_messages_item] displays one data item. The item is of the form
   [nt, sentence, target], which means that beginning at the start symbol
   [nt], the sentence [sentence] ends in an error in the target state given
   by [target]. [target] also contains information about which spurious
   reductions are performed at the end. The display obeys the [.messages]
   file format. *)

open Grammar

val print_messages_item:
  Nonterminal.t * Terminal.t list * ReferenceInterpreter.target -> unit

menhir-20200123/src/invariant.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module discovers information about the shape and content of the stack
   in each of the automaton's states. *)

open Grammar

module C = Conflict (* artificial dependency; ensures that [Conflict] runs first *)

(* ------------------------------------------------------------------------ *)

(* Compute a lower bound on the height of the stack at every state. At the
   same time, compute which symbols are held in this stack prefix. *)

(* In order to compute (a lower bound on) the height of the stack at a state
   [s], we examine the LR(0) items that compose [s]. For each item, if the
   bullet is at position [pos], then we can be assured that the height of
   the stack is at least [pos].
Thus, we compute the maximum of [pos] over all
   items (of which there is at least one). *)

(* The set of items that we use is not closed, but this does not matter; the
   items that would be added by the closure would not add any information
   regarding the height of the stack, since the bullet is at position 0 in
   these items. *)

(* Instead of computing just the stack height, we compute, in the same
   manner, which symbols are on the stack at a state [s]. This is an array
   of symbols whose length is the height of the stack at [s]. By convention,
   the top of the stack is the end of the array. *)

(* We first compute and tabulate this information at the level of the LR(0)
   automaton. *)

let stack_symbols : Lr0.node -> Symbol.t array =
  let dummy =
    Array.make 0 (Symbol.T Terminal.sharp)
  in
  Misc.tabulate Lr0.n (fun node ->
    Item.Set.fold (fun item accu ->
      let _prod, _nt, rhs, pos, _length = Item.def item in
      if pos > Array.length accu then Array.sub rhs 0 pos else accu
    ) (Lr0.items node) dummy
  )

(* Then, it is easy to extend it to the LR(1) automaton. *)

let stack_symbols (node : Lr1.node) : Symbol.t array =
  stack_symbols (Lr0.core (Lr1.state node))

let stack_height (node : Lr1.node) : int =
  Array.length (stack_symbols node)

(* ------------------------------------------------------------------------ *)

(* Above, we have computed a prefix of the stack at every state. We have
   computed the length of this prefix and the symbols that are held in this
   prefix of the stack. Now, compute which states may be held in this
   prefix. *)

(* In order to compute this information, we perform an analysis of the
   automaton, via a least fixed point computation. *)

(* It is worth noting that it would be possible to use an analysis based on
   a least fixed point computation to discover at the same time the length
   of the stack prefix, the symbols that it contains, and the states that
   it may contain.
This alternate approach, which was used until 2012/08/25, would lead us to discovering a richer invariant, that is, potentially longer prefixes. This extra information, however, was useless; computing it was a waste of time. Hence, as of 2012/08/25, the height of the stack prefix and the symbols that it contains are predicted (see above), and the least fixed point computation is used only to populate these prefixes of predictable length with state information. *) (* By the way, this least fixed point analysis remains the most costly computation throughout this module. *) (* Vectors of sets of states. *) module StateVector = struct type property = Lr1.NodeSet.t list let empty = [] let rec equal v1 v2 = match v1, v2 with | [], [] -> true | states1 :: v1, states2 :: v2 -> Lr1.NodeSet.equal states1 states2 && equal v1 v2 | _, _ -> (* Because all heights are known ahead of time, we are able to (and careful to) compare only vectors of equal length. *) assert false let rec join v1 v2 = match v1, v2 with | [], [] -> [] | states1 :: v1, states2 :: v2 -> Lr1.NodeSet.union states1 states2 :: join v1 v2 | _, _ -> (* Because all heights are known ahead of time, we are able to (and careful to) compare only vectors of equal length. *) assert false let push v x = x :: v let truncate = MenhirLib.General.take end (* In order to perform the fixed point computation, we must extend our type of vectors with a bottom element. This element will not appear in the least fixed point, provided every state of the automaton is reachable.
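The construction used here is generic: any join-semilattice can be lifted with a synthetic bottom element that serves as the initial value of the iteration and is absorbed by [join]. A minimal self-contained sketch (the module and value names are illustrative, not Menhir's):

```ocaml
(* Lift an arbitrary join-semilattice [S] with a synthetic bottom
   element, in the same way that [StateLattice] lifts [StateVector]:
   [Bottom] is the starting point of the iteration and is absorbed
   by [join]. *)
module Lift (S : sig type t val join : t -> t -> t end) = struct
  type t = Bottom | NonBottom of S.t
  let bottom = Bottom
  let join v1 v2 =
    match v1, v2 with
    | Bottom, v | v, Bottom -> v
    | NonBottom x1, NonBottom x2 -> NonBottom (S.join x1 x2)
end

(* Instantiating with integers under [max] gives a tiny example lattice. *)
module IntMax = Lift (struct type t = int let join = max end)

let () =
  let open IntMax in
  assert (join bottom (NonBottom 3) = NonBottom 3);
  assert (join (NonBottom 2) (NonBottom 5) = NonBottom 5)
```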
*) module StateLattice = struct type property = | Bottom | NonBottom of StateVector.property let bottom = Bottom let empty = NonBottom StateVector.empty let equal v1 v2 = match v1, v2 with | Bottom, Bottom -> true | NonBottom v1, NonBottom v2 -> StateVector.equal v1 v2 | _, _ -> false let join v1 v2 = match v1, v2 with | Bottom, v | v, Bottom -> v | NonBottom v1, NonBottom v2 -> NonBottom (StateVector.join v1 v2) let push v x = match v with | Bottom -> Bottom | NonBottom v -> NonBottom (StateVector.push v x) let truncate h v = match v with | Bottom -> Bottom | NonBottom v -> NonBottom (StateVector.truncate h v) let is_maximal _ = false end open StateLattice (* Define the fixed point. *) let stack_states : Lr1.node -> property = let module F = Fix.Make (Maps.PersistentMapsToImperativeMaps(Lr1.NodeMap)) (StateLattice) in F.lfp (fun node (get : Lr1.node -> property) -> (* We use the fact that a state has incoming transitions if and only if it is not a start state. *) match Lr1.incoming_symbol node with | None -> assert (Lr1.predecessors node = []); assert (stack_height node = 0); (* If [node] is a start state, then the stack at [node] may be (in fact, must be) the empty stack. *) empty | Some _symbol -> (* If [node] is not a start state, then include the contribution of every incoming transition. We compute a join over all predecessors. The contribution of one predecessor is the abstract value found at this predecessor, extended with a new cell for this transition, and truncated to the stack height at [node], so as to avoid obtaining a vector that is longer than expected/necessary. *) let height = stack_height node in List.fold_left (fun v predecessor -> join v (truncate height (push (get predecessor) (Lr1.NodeSet.singleton predecessor)) ) ) bottom (Lr1.predecessors node) ) (* If every state is reachable, then the least fixed point must be non-bottom everywhere, so we may view it as a function that produces a vector of sets of states. 
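The function passed to [F.lfp] maps each node, together with an oracle for querying other nodes, to a new property; the Fix library computes the least solution of these equations on demand. The same least fixed point could in principle be obtained by naive Kleene iteration, sketched below in self-contained form (toy code over integer-indexed unknowns; the names are illustrative, and Fix is demand-driven and far more efficient):

```ocaml
(* A naive Kleene-iteration solver for the least fixed point of a
   system of monotone equations over int-indexed unknowns.  [eqs i get]
   computes the right-hand side of unknown [i], reading other unknowns
   through [get] -- the same shape as the function passed to [F.lfp]. *)
let lfp (n : int) (bottom : 'a) (equal : 'a -> 'a -> bool)
        (eqs : int -> (int -> 'a) -> 'a) : int -> 'a =
  let v = Array.make n bottom in
  let changed = ref true in
  while !changed do
    changed := false;
    for i = 0 to n - 1 do
      let x = eqs i (fun j -> v.(j)) in
      if not (equal x v.(i)) then begin v.(i) <- x; changed := true end
    done
  done;
  fun i -> v.(i)

let () =
  (* Toy instance: Boolean reachability from node 0 along edges
     0 -> 1 and 1 -> 2, phrased via predecessor lists, much like the
     join over [Lr1.predecessors] above. *)
  let predecessors = [| []; [0]; [1] |] in
  let reach =
    lfp 3 false ( = )
      (fun i get -> i = 0 || List.exists get predecessors.(i))
  in
  assert (reach 2)
```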
*) let stack_states (node : Lr1.node) : StateVector.property = match stack_states node with | Bottom -> (* apparently this node is unreachable *) assert false | NonBottom v -> v (* ------------------------------------------------------------------------ *) (* From the above information, deduce, for each production, the states that may appear in the stack when this production is reduced. *) (* We are careful to produce a vector of states whose length is exactly that of the production [prod]. *) let production_states : Production.index -> StateLattice.property = Production.tabulate (fun prod -> let nodes = Lr1.production_where prod in let height = Production.length prod in Lr1.NodeSet.fold (fun node accu -> join accu (truncate height (NonBottom (stack_states node)) ) ) nodes bottom ) (* ------------------------------------------------------------------------ *) (* We now determine which states must be represented, that is, explicitly pushed onto the stack. For simplicity, a state is either always represented or never represented. More fine-grained strategies, where a single state is sometimes pushed onto the stack and sometimes not pushed, depending on which outgoing transition is being taken, are conceivable, but quite tricky, and probably not worth the trouble. (1) If two states are liable to appear within a single stack cell, then one is represented if and only if the other is represented. This ensures that the structure of stacks is known everywhere and that we can propose types for stacks. (2) If a state [s] has an outgoing transition along nonterminal symbol [nt], and if the [goto] table for symbol [nt] has more than one target, then state [s] is represented. (3) If a stack cell contains more than one state and if at least one of these states is able to handle the [error] token, then these states are represented. 
(4) If the semantic action associated with a production mentions the [$syntaxerror] keyword, then the state that is being reduced to (that is, the state that initiated the recognition of this production) is represented. (Indeed, it will be passed as an argument to [errorcase].) *) (* Data. *) let rep : bool UnionFind.point array = Array.init Lr1.n (fun _ -> UnionFind.fresh false) (* Getter. *) let represented state = rep.(Lr1.number state) (* Setters. *) let represent state = UnionFind.set (represented state) true let represents states = represent (Lr1.NodeSet.choose states) (* Enforce condition (1) above. *) let share (v : StateVector.property) = List.iter (fun states -> let dummy = UnionFind.fresh false in Lr1.NodeSet.iter (fun state -> UnionFind.union dummy (represented state) ) states ) v let () = Lr1.iter (fun node -> share (stack_states node) ); Production.iter (fun prod -> match production_states prod with | Bottom -> () | NonBottom v -> share v ) (* Enforce condition (2) above. *) let () = Nonterminal.iter (fun nt -> let count = Lr1.targets (fun count _ _ -> count + 1 ) 0 (Symbol.N nt) in if count > 1 then Lr1.targets (fun () sources _ -> List.iter represent sources ) () (Symbol.N nt) ) (* Enforce condition (3) above. *) let handler state = try let _ = SymbolMap.find (Symbol.T Terminal.error) (Lr1.transitions state) in true with Not_found -> try let _ = TerminalMap.lookup Terminal.error (Lr1.reductions state) in true with Not_found -> false let handlers states = Lr1.NodeSet.exists handler states let () = Lr1.iter (fun node -> let v = stack_states node in List.iter (fun states -> if Lr1.NodeSet.cardinal states >= 2 && handlers states then represents states ) v ) (* Enforce condition (4) above. 
*) let () = Production.iterx (fun prod -> if Action.has_syntaxerror (Production.action prod) then match production_states prod with | Bottom -> () | NonBottom v -> let sites = Lr1.production_where prod in let length = Production.length prod in if length = 0 then Lr1.NodeSet.iter represent sites else let states = List.nth v (length - 1) in represents states ) (* Define accessors. *) let represented state = UnionFind.get (represented state) let representeds states = if Lr1.NodeSet.is_empty states then assert false else represented (Lr1.NodeSet.choose states) (* Statistics. *) let () = Error.logC 1 (fun f -> let count = Lr1.fold (fun count node -> if represented node then count + 1 else count ) 0 in Printf.fprintf f "%d out of %d states are represented.\n" count Lr1.n ) (* ------------------------------------------------------------------------ *) (* Accessors for information about the stack. *) (* We describe a stack prefix as a list of cells, where each cell is a pair of a symbol and a set of states. The top of the stack is the head of the list. *) type cell = Symbol.t * Lr1.NodeSet.t type word = cell list (* This auxiliary function converts a stack-as-an-array (top of stack at the right end) to a stack-as-a-list (top of stack at list head). *) let convert a = let n = Array.length a in let rec loop i accu = if i = n then accu else loop (i + 1) (a.(i) :: accu) in loop 0 [] (* [stack s] describes the stack when the automaton is in state [s]. *) let stack node : word = List.combine (convert (stack_symbols node)) (stack_states node) (* [prodstack prod] describes the stack when production [prod] is about to be reduced. *) let prodstack prod : word = match production_states prod with | Bottom -> (* This production is never reduced. *) assert false | NonBottom v -> List.combine (convert (Production.rhs prod)) v (* [gotostack nt] is the structure of the stack when a shift transition over nonterminal [nt] is about to be taken. It consists of just one cell. 
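As an aside, the [convert] function above merely reverses an array into a list, moving the top of the stack from the right end of the array to the head of the list. A standalone polymorphic version (the real one is used at type [Symbol.t array]):

```ocaml
(* Convert a stack-as-an-array (top of stack at the right end) into a
   stack-as-a-list (top of stack at the head), as [convert] does above. *)
let convert (a : 'a array) : 'a list =
  let n = Array.length a in
  let rec loop i accu =
    if i = n then accu else loop (i + 1) (a.(i) :: accu)
  in
  loop 0 []

let () =
  (* The symbol pushed last (rightmost) becomes the list head. *)
  assert (convert [| "bottom"; "middle"; "top" |]
          = [ "top"; "middle"; "bottom" ])
```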
*) let gotostack : Nonterminal.t -> word = Nonterminal.tabulate (fun nt -> let sources = Lr1.targets (fun accu sources _ -> List.fold_right Lr1.NodeSet.add sources accu ) Lr1.NodeSet.empty (Symbol.N nt) in [ Symbol.N nt, sources ] ) let fold f accu w = List.fold_right (fun (symbol, states) accu -> f accu (representeds states) symbol states ) w accu let fold_top f accu w = match w with | [] -> accu | (symbol, states) :: _ -> f (representeds states) symbol let print (w : word) = let b = Buffer.create 64 in fold (fun () _represented symbol _states -> Buffer.add_char b ' '; Buffer.add_string b (Symbol.print symbol) ) () w; Buffer.contents b (* ------------------------------------------------------------------------ *) (* Explain how the stack should be deconstructed when an error is found. We sometimes have a choice as to how many stack cells should be popped. Indeed, several cells in the known suffix of the stack may physically hold a state. If none of these states handles errors, then we could jump to any of them. (Indeed, if we jump to one that's nearer, it will in turn pop further stack cells and jump to one that's farther.) In the interests of code size, we should pop as few stack cells as possible. So, we jump to the topmost represented state in the known suffix. *) type state = | Represented | UnRepresented of Lr1.node type instruction = | Die | DownTo of word * state let rewind node : instruction = let w = stack node in let rec rewind w = match w with | [] -> (* I believe that every stack description either is definite (that is, ends with [TailEmpty]) or contains at least one represented state. Thus, if we find an empty [w], this means that the stack is definitely empty. *) Die | ((_, states) as cell) :: w -> if representeds states then (* Here is a represented state. We will pop this cell and no more. *) DownTo ([ cell ], Represented) else if handlers states then begin (* Here is an unrepresented state that can handle errors.
The cell must hold a singleton set of states, so we know which state to jump to, even though it isn't represented. *) assert (Lr1.NodeSet.cardinal states = 1); let state = Lr1.NodeSet.choose states in DownTo ([ cell ], UnRepresented state) end else (* Here is an unrepresented state that does not handle errors. Pop this cell and look further. *) match rewind w with | Die -> Die | DownTo (w, st) -> DownTo (cell :: w, st) in rewind w (* ------------------------------------------------------------------------ *) (* Machinery for the computation of which symbols must keep track of their start or end positions. *) open Keyword type variable = Symbol.t * where (* WhereStart or WhereEnd *) module M : Fix.IMPERATIVE_MAPS with type key = variable = struct type key = variable type 'data t = { mutable startp: 'data SymbolMap.t; mutable endp: 'data SymbolMap.t; } open SymbolMap let create() = { startp = empty; endp = empty } let clear m = m.startp <- empty; m.endp <- empty let add (sym, where) data m = match where with | WhereStart -> m.startp <- add sym data m.startp | WhereEnd -> m.endp <- add sym data m.endp | WhereSymbolStart -> assert false let find (sym, where) m = match where with | WhereStart -> find sym m.startp | WhereEnd -> find sym m.endp | WhereSymbolStart -> assert false let iter f m = iter (fun sym -> f (sym, WhereStart)) m.startp; iter (fun sym -> f (sym, WhereEnd)) m.endp end (* ------------------------------------------------------------------------ *) (* We now determine which positions must be kept track of. For simplicity, we do this on a per-symbol basis. That is, for each symbol, either we never keep track of position information, or we always do. In fact, we do distinguish start and end positions. This leads to computing two sets of symbols -- those that keep track of their start position and those that keep track of their end position. 
A symbol on the right-hand side of a production must keep track of its (start or end) position if that position is explicitly requested by a semantic action. Furthermore, if the left-hand symbol of a production must keep track of its start (resp. end) position, then the first (resp. last) symbol of its right-hand side (if there is one) must do so as well. That is, unless the right-hand side is empty. *) (* 2015/11/11. When a production [prod] is reduced, the top stack cell may be consulted for its end position. This implies that this cell must exist and must store an end position! Now, when does this happen? 1- This happens if [prod] is an epsilon production and the left-hand symbol of the production, [nt prod], keeps track of its start or end position. 2- This happens if the semantic action explicitly mentions the keyword [$endpos($0)]. Now, if this happens, what should we do? a- If this happens in a state [s] whose incoming symbol is [sym], then [sym] must keep track of its end position. b- If this happens in an initial state, where the stack may be empty, then the sentinel cell at the bottom of the stack must contain an end position. Point (b) doesn't concern us here, but point (a) does. We must implement the constraint (1) \/ (2) -> (a). Point (b) is taken care of in the code back-end, where, for simplicity, we always create a sentinel cell. *) (* I will say that this is a lot more sophisticated than I would like. The code back-end has been known for its efficiency and I am trying to maintain this property -- in particular, I would like to keep track of no positions at all, if the user doesn't use any position keyword. But I am suffering. *) module S = FixSolver.Make(M)(Boolean) let record_ConVar, record_VarVar, solve = S.create() let () = (* We gather the constraints explained above in two loops. The first loop looks at every (non-start) production [prod]. The second loop looks at every (non-initial) state [s]. 
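The constraints gathered by [record_ConVar] (a constant forces a variable) and [record_VarVar] (one variable forces another) form a system of Boolean implications whose least solution is computed by [solve]. The essence is graph reachability: a variable is true if and only if it is reachable from a variable that was forced to true. A toy standalone sketch of that idea (illustrative names; Menhir delegates this to its FixSolver module):

```ocaml
(* A Boolean implication solver: [forced] lists variables set to true,
   [implies] lists edges (x, y) meaning "if x then y".  The least
   solution marks exactly the variables reachable from [forced]. *)
let solve (n : int) (forced : int list)
          (implies : (int * int) list) : int -> bool =
  let value = Array.make n false in
  let succ = Array.make n [] in
  List.iter (fun (x, y) -> succ.(x) <- y :: succ.(x)) implies;
  let rec set x =
    if not value.(x) then begin
      value.(x) <- true;            (* propagate along implications *)
      List.iter set succ.(x)
    end
  in
  List.iter set forced;
  fun x -> value.(x)

let () =
  (* Variable 0 is forced; 0 -> 1 -> 2, while 3 stays false. *)
  let track = solve 4 [ 0 ] [ (0, 1); (1, 2) ] in
  assert (track 2 && not (track 3))
```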
*) Production.iterx (fun prod -> let nt, rhs = Production.def prod and ids = Production.identifiers prod and action = Production.action prod in let length = Array.length rhs in if length > 0 then begin (* If [nt] keeps track of its start position, then the first symbol in the right-hand side must do so as well. *) record_VarVar (Symbol.N nt, WhereStart) (rhs.(0), WhereStart); (* If [nt] keeps track of its end position, then the last symbol in the right-hand side must do so as well. *) record_VarVar (Symbol.N nt, WhereEnd) (rhs.(length - 1), WhereEnd) end; KeywordSet.iter (function | SyntaxError -> () | Position (Before, _, _) -> (* Doing nothing here because [$endpos($0)] is dealt with in the second loop. *) () | Position (Left, _, _) -> (* [$startpos] and [$endpos] have been expanded away. *) assert false | Position (_, _, FlavorLocation) -> (* [$loc] and [$sloc] have been expanded away. *) assert false | Position (RightNamed _, WhereSymbolStart, _) -> (* [$symbolstartpos(x)] does not exist. *) assert false | Position (RightNamed id, where, _) -> (* If the semantic action mentions [$startpos($i)], then the [i]-th symbol in the right-hand side must keep track of its start position. Similarly for end positions. *) Array.iteri (fun i id' -> if id = id' then record_ConVar true (rhs.(i), where) ) ids ) (Action.keywords action) ); (* end of loop on productions *) Lr1.iterx (fun s -> (* Let [sym] be the incoming symbol of state [s]. *) let sym = Misc.unSome (Lr1.incoming_symbol s) in (* Condition (1) in the long comment above (2015/11/11). If an epsilon production [prod] can be reduced in state [s], if its left-hand side [nt] keeps track of its start or end position, then [sym] must keep track of its end position. 
*) TerminalMap.iter (fun _ prods -> let prod = Misc.single prods in let nt, rhs = Production.def prod in let length = Array.length rhs in if length = 0 then begin record_VarVar (Symbol.N nt, WhereStart) (sym, WhereEnd); record_VarVar (Symbol.N nt, WhereEnd) (sym, WhereEnd) end ) (Lr1.reductions s); (* Condition (2) in the long comment above (2015/11/11). If a production can be reduced in state [s] and mentions [$endpos($0)], then [sym] must keep track of its end position. *) if Lr1.has_beforeend s then record_ConVar true (sym, WhereEnd) ) let track : variable -> bool = solve() let startp symbol = track (symbol, WhereStart) let endp symbol = track (symbol, WhereEnd) let for_every_symbol (f : Symbol.t -> unit) : unit = Terminal.iter (fun t -> f (Symbol.T t)); Nonterminal.iter (fun nt -> f (Symbol.N nt)) let sum_over_every_symbol (f : Symbol.t -> bool) : int = let c = ref 0 in for_every_symbol (fun sym -> if f sym then c := !c + 1); !c let () = Error.logC 1 (fun f -> Printf.fprintf f "%d out of %d symbols keep track of their start position.\n\ %d out of %d symbols keep track of their end position.\n" (sum_over_every_symbol startp) (Terminal.n + Nonterminal.n) (sum_over_every_symbol endp) (Terminal.n + Nonterminal.n)) (* ------------------------------------------------------------------------- *) (* Miscellaneous. *) let universal symbol = Lr1.fold (fun universal s -> universal && (if represented s then SymbolMap.mem symbol (Lr1.transitions s) else true) ) true (* ------------------------------------------------------------------------ *) (* Discover which states can peek at an error. These are the states where an error token may be on the stream. These are the states that are targets of a reduce action on [error]. *) (* 2012/08/25 I am optimizing this code, whose original version I found had quadratic complexity. The problem is as follows. We can easily iterate over all states to find which states [s] have a reduce action on error. 
What we must find out, then, is into which state [t] this reduce action takes us. This is not easy to predict, as it depends on the contents of the stack. The original code used an overapproximation, as follows: if the reduction concerns a production whose head symbol is [nt], then all of the states that have an incoming transition labeled [nt] are potential targets. The new version of the code below relies on the same approximation, but uses two successive loops instead of two nested loops. *) let errorpeekers = (* First compute a set of symbols [nt]... *) let nts : SymbolSet.t = Lr1.fold (fun nts node -> try let prods = TerminalMap.lookup Terminal.error (Lr1.reductions node) in let prod = Misc.single prods in let nt = Production.nt prod in SymbolSet.add (Symbol.N nt) nts with Not_found -> nts ) SymbolSet.empty in (* ... then compute the set of all target states of all transitions labeled by some symbol in the set [nt]. *) SymbolSet.fold (fun nt errorpeekers -> Lr1.targets (fun errorpeekers _ target -> Lr1.NodeSet.add target errorpeekers ) errorpeekers nt ) nts Lr1.NodeSet.empty let errorpeeker node = Lr1.NodeSet.mem node errorpeekers (* ------------------------------------------------------------------------ *) let () = Time.tick "Constructing the invariant" menhir-20200123/src/invariant.mli000066400000000000000000000115251361226111300165140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module discovers and publishes information about the automaton. 
It determines the shape of the stack when a state is about to be entered, when a production is about to be reduced, and when a goto transition is about to be taken. It also determines which states should be represented (that is, need to physically exist on the stack at runtime) and which symbols need to keep track of (start or end) positions. It also determines which automaton states could have to deal with an [error] token. The information computed in this module is used in the code back-end, in the Coq back-end, and in the automated production of .messages files. It is not used in the table back-end. *) open Grammar (* ------------------------------------------------------------------------- *) (* A representation of stack shapes. *) (* A word is a representation of a stack or stack suffix. *) type word (* [fold] folds over a word. At each cell, [f] is applied to the accumulator, to a Boolean flag that tells whether the cell holds a state, to the set of possible states of the cell, and to the symbol associated with the cell. The stack is visited from bottom to top. *) val fold: ('a -> bool -> Symbol.t -> Lr1.NodeSet.t -> 'a) -> 'a -> word -> 'a (* [fold_top f accu s] is analogous to [fold], but only folds over the top stack cell, if there is one, so that [f] is either not invoked at all or invoked just once. *) val fold_top: (bool -> Symbol.t -> 'a) -> 'a -> word -> 'a (* [print w] produces a string representation of the word [w]. Only the symbols are shown. One space is printed in front of each symbol. *) val print: word -> string (* ------------------------------------------------------------------------- *) (* Information about the stack. *) (* [stack s] is the structure of the stack at state [s]. *) val stack: Lr1.node -> word (* [prodstack prod] is the structure of the stack when production [prod] is about to be reduced. This function should not be called if production [prod] is never reduced. 
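To illustrate how [fold] and [fold_top] traverse a word, here is a simplified standalone model in which a cell is just a symbol name paired with a flag saying whether its state is represented (the real functions additionally pass the set of possible states of each cell; the type and field names below are illustrative):

```ocaml
(* A simplified stack description: each cell carries a symbol name and
   a flag telling whether the cell's state is represented. *)
type cell = { symbol : string; represented : bool }
type word = cell list  (* head of the list = top of the stack *)

(* Visit cells from bottom to top, like [Invariant.fold]. *)
let fold f accu (w : word) =
  List.fold_right (fun c accu -> f accu c.represented c.symbol) w accu

(* Visit only the top cell, if there is one, like [Invariant.fold_top]. *)
let fold_top f accu (w : word) =
  match w with [] -> accu | c :: _ -> f c.represented c.symbol

let () =
  let w = [ { symbol = "expr"; represented = true };
            { symbol = "LPAREN"; represented = false } ] in
  (* Count the represented cells. *)
  assert (fold (fun n r _ -> if r then n + 1 else n) 0 w = 1);
  (* [fold_top] sees only the topmost symbol. *)
  assert (fold_top (fun _ sym -> sym) "none" w = "expr")
```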
*) val prodstack: Production.index -> word (* [gotostack nt] is the structure of the stack when a shift transition over nonterminal [nt] is about to be taken. It consists of just one cell. *) val gotostack: Nonterminal.t -> word (* [rewind s] explains how to rewind the stack when dealing with an error in state [s]. It produces an instruction to either die (because no state on the stack can handle errors) or pop a suffix of the stack. In the latter case, one reaches a state that is either represented (its identity is physically stored in the bottommost cell that is popped) or unrepresented (its identity is statically known). *) type instruction = | Die | DownTo of word * state and state = | Represented | UnRepresented of Lr1.node val rewind: Lr1.node -> instruction (* ------------------------------------------------------------------------- *) (* Information about which states and positions need to physically exist on the stack. *) (* [represented s] tells whether state [s] must have an explicit representation, that is, whether it is pushed onto the stack. *) val represented: Lr1.node -> bool (* [startp symbol] and [endp symbol] tell whether start or end positions must be recorded for symbol [symbol]. *) val startp: Symbol.t -> bool val endp: Symbol.t -> bool (* ------------------------------------------------------------------------- *) (* Information about error handling. *) (* [errorpeeker s] tells whether state [s] can potentially peek at an error. This is the case if, in state [s], an error token may be on the stream. *) val errorpeeker: Lr1.node -> bool (* ------------------------------------------------------------------------- *) (* Miscellaneous. *) (* [universal symbol] tells whether every represented state has an outgoing transition along [symbol]. 
*) val universal: Symbol.t -> bool menhir-20200123/src/item.ml000066400000000000000000000300611361226111300153020ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* ------------------------------------------------------------------------ *) (* Items. *) (* An LR(0) item encodes a pair of integers, namely the index of the production and the index of the bullet in the production's right-hand side. *) (* Both integers are packed into a single integer, using 7 bits for the bullet position and the rest (usually 24 bits) for the production index. These widths could be adjusted. *) (* The function [export] is duplicated in [TableInterpreter]. Do not modify it; or modify it here and there in a consistent manner. *) type t = int let import (prod, pos) = assert (pos < 128); (Production.p2i prod) lsl 7 + pos let export t = (Production.i2p (t lsr 7), t mod 128) let marshal (item : t) : int = item (* Comparison. *) let equal (item1 : t) (item2: t) = item1 = item2 (* Position. *) let positions (item : t) = let prod, _ = export item in Production.positions prod (* [def item] looks up the production associated with this item in the grammar and returns [prod, nt, rhs, pos, length], where [prod] is the production's index, [nt] and [rhs] represent the production, [pos] is the position of the bullet in the item, and [length] is the length of the production's right-hand side. 
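The bit-packing of items described above can be checked in isolation: with 7 bits reserved for the bullet position, [import] and [export] are mutually inverse as long as the position stays below 128. A standalone sketch using plain integers in place of [Production.index]:

```ocaml
(* Pack a (production, bullet position) pair into a single integer:
   the 7 low bits hold the position, the remaining bits hold the
   production index, as in [Item.import] and [Item.export]. *)
let import (prod, pos) =
  assert (0 <= pos && pos < 128);
  (prod lsl 7) + pos

let export t = (t lsr 7, t land 127)  (* [t mod 128] would do as well *)

let () =
  (* Round-trip: [export] is the inverse of [import]. *)
  assert (export (import (42, 3)) = (42, 3));
  (* Distinct pairs map to distinct integers. *)
  assert (import (1, 0) <> import (0, 127))
```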
*) let def t = let prod, pos = export t in let nt, rhs = Production.def prod in let length = Array.length rhs in assert ((pos >= 0) && (pos <= length)); prod, nt, rhs, pos, length let startnt t = let _, _, rhs, pos, length = def t in assert (pos = 0 && length = 1); match rhs.(0) with | Symbol.N nt -> nt | Symbol.T _ -> assert false (* Printing. *) let print item = let _, nt, rhs, pos, _ = def item in Printf.sprintf "%s -> %s" (Nonterminal.print false nt) (Symbol.printaod 0 pos rhs) (* Classifying items. *) type kind = | Shift of Symbol.t * t | Reduce of Production.index let classify item = let prod, _, rhs, pos, length = def item in if pos = length then Reduce prod else Shift (rhs.(pos), import (prod, pos + 1)) (* Sets of items and maps over items. Hashing these data structures is specifically allowed, so balanced trees (for instance) would not be applicable here. *) module Map = Patricia.Big module Set = Map.Domain (* This functor performs precomputation that helps efficiently compute the closure of an LR(0) or LR(1) state. The precomputation requires time linear in the size of the grammar. The nature of the lookahead sets remains abstract. *) (* The precomputation consists in building the LR(0) nondeterministic automaton. This is a graph whose nodes are items and whose edges are epsilon transitions. (We do not care about shift transitions here.) Lookahead information can be attached to nodes and is propagated through the graph during closure computations. *) module Closure (L : Lookahead.S) = struct type state = L.t Map.t type node = { (* Nodes are sequentially numbered so as to allow applying Tarjan's algorithm (below). *) num: int; (* Each node is associated with an item. *) item: t; (* All of the epsilon transitions that leave a node have the same behavior with respect to lookahead information. *) (* The lookahead set transmitted along an epsilon transition is either a constant, or the union of a constant and the lookahead set at the source node. 
The former case corresponds to a source item whose trailer is not nullable, the latter to a source item whose trailer is nullable. *) epsilon_constant: L.t; epsilon_transmits: bool; (* Each node carries pointers to its successors through epsilon transitions. This field is never modified once initialization is over. *) mutable epsilon_transitions: node list; (* The following fields are transient, that is, only used temporarily during graph traversals. Marks are used to recognize which nodes have been traversed already. Lists of predecessors are used to record which edges have been traversed. Lookahead information is attached with each node. *) mutable mark: Mark.t; mutable predecessors: node list; mutable lookahead: L.t; } (* Allocate one graph node per item and build a mapping of items to nodes. *) let count = ref 0 let mapping : node array array = Array.make Production.n [||] let item2node item = let prod, pos = export item in mapping.(Production.p2i prod).(pos) let () = Production.iter (fun prod -> let _nt, rhs = Production.def prod in let length = Array.length rhs in mapping.(Production.p2i prod) <- Array.init (length+1) (fun pos -> let item = import (prod, pos) in let num = !count in count := num + 1; (* The lookahead set transmitted through an epsilon transition is the FIRST set of the remainder of the source item, plus, if that is nullable, the lookahead set of the source item. *) let constant, transmits = if pos < length then let nullable, first = Analysis.nullable_first_prod prod (pos + 1) in L.constant first, nullable else (* No epsilon transitions leave this item. *) L.empty, false in { num = num; item = item; epsilon_constant = constant; epsilon_transmits = transmits; epsilon_transitions = []; (* temporary placeholder *) mark = Mark.none; predecessors = []; lookahead = L.empty; } ) ) (* At each node, compute transitions. 
*) let () = Production.iter (fun prod -> let _nt, rhs = Production.def prod in let length = Array.length rhs in Array.iteri (fun pos node -> node.epsilon_transitions <- if pos < length then match rhs.(pos) with | Symbol.N nt -> Production.foldnt nt [] (fun prod nodes -> (item2node (import (prod, 0))) :: nodes ) | Symbol.T _ -> [] else [] ) mapping.(Production.p2i prod) ) (* Detect and reject cycles of transitions that transmit a lookahead set. We need to ensure that there are no such cycles in order to be able to traverse these transitions in topological order. Each such cycle corresponds to a set of productions of the form A1 -> A2, A2 -> A3, ..., An -> A1 (modulo nullable trailers). Such cycles are unlikely to occur in realistic grammars, so our current approach is to reject the grammar if such a cycle exists. Actually, according to DeRemer and Pennello (1982), such a cycle is exactly an includes cycle, and implies that the grammar is not LR(k) for any k, unless A1, ..., An are in fact uninhabited. In other words, this is a pathological case. *) (* Yes, indeed, this is called a cycle in Aho & Ullman's book, and a loop in Grune & Jacobs' book. It is not difficult to see that (provided all symbols are inhabited) the grammar is infinitely ambiguous if and only if there is a loop. 
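Menhir detects these cycles by running Tarjan's strongly-connected-components algorithm on the subgraph of transmitting transitions; any component of two or more nodes, or a single node with a self-loop, is a cycle. The same check can be sketched with a plain depth-first search for a back edge (toy code, not Menhir's implementation; node names and graph encoding are illustrative):

```ocaml
(* Detect whether a directed graph, given by successor lists, contains
   a cycle.  Reaching a gray node again along the current DFS path
   signals a back edge, hence a cycle. *)
let has_cycle (succ : int list array) : bool =
  let n = Array.length succ in
  let color = Array.make n `White in
  let exception Cycle in
  let rec dfs u =
    color.(u) <- `Gray;
    List.iter
      (fun v ->
        match color.(v) with
        | `Gray -> raise Cycle   (* back edge: cycle found *)
        | `White -> dfs v
        | `Black -> ())
      succ.(u);
    color.(u) <- `Black
  in
  try
    for u = 0 to n - 1 do
      if color.(u) = `White then dfs u
    done;
    false
  with Cycle -> true

let () =
  assert (not (has_cycle [| [1]; [2]; [] |]));  (* A1 -> A2 -> A3      *)
  assert (has_cycle [| [1]; [2]; [0] |])        (* A1 -> A2 -> A3 -> A1 *)
```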
*) module P = struct type foo = node type node = foo let n = !count let index node = node.num let iter f = Array.iter (fun nodes -> Array.iter f nodes ) mapping let successors f node = if node.epsilon_transmits then List.iter f node.epsilon_transitions end module T = Tarjan.Run (P) let cycle scc = let items = List.map (fun node -> node.item) scc in let positions = List.flatten (List.map positions items) in let names = String.concat "\n" (List.map print items) in Error.error positions "the grammar is ambiguous.\n\ The following items participate in an epsilon-cycle:\n\ %s" names let () = P.iter (fun node -> let scc = T.scc node in match scc with | [] -> () | [ node ] -> (* This is a strongly connected component of one node. Check whether it carries a self-loop. Forbidding self-loops is not strictly required by the code that follows, but is consistent with the fact that we forbid cycles of length greater than 1. *) P.successors (fun successor -> if successor.num = node.num then cycle scc ) node | _ -> (* This is a strongly connected component of at least two elements. *) cycle scc ) (* Closure computation. *) let closure (items : state) : state = (* Explore the graph forwards, starting from these items. Marks are used to tell which nodes have been visited. Build a list of all visited nodes; this is in fact the list of all items in the closure. At initial nodes and when reaching a node through a transition, record a lookahead set. When we reach a node through a transition that transmits the lookahead set found at its source, record its source, so as to allow re-traversing this transition backwards (below). *) let this = Mark.fresh() in let nodes = ref [] in let rec visit father transmits toks node = if Mark.same node.mark this then begin (* Node has been visited already. *) node.lookahead <- L.union toks node.lookahead; if transmits then node.predecessors <- father :: node.predecessors end else begin (* Node is new. 
*) node.predecessors <- if transmits then [ father ] else []; node.lookahead <- toks; follow node end and follow node = node.mark <- this; nodes := node :: !nodes; List.iter (visit node node.epsilon_transmits node.epsilon_constant) node.epsilon_transitions in Map.iter (fun item toks -> let node = item2node item in visit node (* dummy! *) false toks node ) items; let nodes = !nodes in (* Explore the graph of transmitting transitions backwards. By hypothesis, it is acyclic, so this is a topological walk. Lookahead sets are inherited through transitions. *) let this = Mark.fresh() in let rec walk node = if not (Mark.same node.mark this) then begin (* Node is new. *) node.mark <- this; (* Explore all predecessors and merge their lookahead sets into the current node's own lookahead set. *) List.iter (fun predecessor -> walk predecessor; node.lookahead <- L.union predecessor.lookahead node.lookahead ) node.predecessors end in List.iter walk nodes; (* Done. Produce a mapping of items to lookahead sets. Clear all transient fields so as to reduce pressure on the GC -- this does not make much difference. *) List.fold_left (fun closure node -> node.predecessors <- []; let closure = Map.add node.item node.lookahead closure in node.lookahead <- L.empty; closure ) Map.empty nodes (* End of closure computation *) end menhir-20200123/src/item.mli000066400000000000000000000057571361226111300154710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open Grammar (* An LR(0) item encodes a pair of integers, namely the index of the production and the index of the bullet in the production's right-hand side. *) type t val import: Production.index * int -> t val export: t -> Production.index * int (* An item can be encoded as an integer. This is used in the table back-end only. The decoding function (really a copy of [export]) is in [TableInterpreter]. *) val marshal: t -> int (* Comparison. *) val equal: t -> t -> bool (* [def item] looks up the production associated with this item in the grammar and returns [prod, nt, rhs, pos, length], where [prod] is the production's index, [nt] and [rhs] represent the production, [pos] is the position of the bullet in the item, and [length] is the length of the production's right-hand side. *) val def: t -> Production.index * Nonterminal.t * Symbol.t array * int * int (* If [item] is a start item, [startnt item] returns the start nonterminal that corresponds to [item]. *) val startnt: t -> Nonterminal.t (* Printing. *) val print: t -> string (* Classifying items as shift or reduce items. A shift item is one where the bullet can still advance. A reduce item is one where the bullet has reached the end of the right-hand side. *) type kind = | Shift of Symbol.t * t | Reduce of Production.index val classify: t -> kind (* Sets of items and maps over items. Hashing these data structures is specifically allowed. *) module Set : GSet.S with type element = t module Map : GMap.S with type key = t and type Domain.t = Set.t (* This functor performs precomputation that helps efficiently compute the closure of an LR(0) or LR(1) state. The precomputation requires time linear in the size of the grammar. The nature of the lookahead sets remains abstract. *) module Closure (L : Lookahead.S) : sig (* A state maps items to lookahead information. 
*) type state = L.t Map.t (* This takes the closure of a state through all epsilon transitions. *) val closure: state -> state end menhir-20200123/src/keywordExpansion.ml000066400000000000000000000241541361226111300177230ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax open Keyword open IL open CodeBits (* [posvar_ keyword] constructs the conventional name of the variable that stands for the position keyword [keyword]. *) let posvar_ = function | Position (subject, where, flavor) -> posvar subject where flavor | _ -> assert false (* [posvar_] should be applied to a position keyword *) (* [symbolstartpos producers i n] constructs an expression which, beginning at index [i], looks for the first non-empty producer and returns its start position. If none is found, this expression returns the end position of the right-hand side. This computation is modeled after the function [Parsing.symbol_start_pos] in OCaml's standard library. *) (* This cascade of [if] constructs could be quite big, and this could be a problem in terms of code size. Fortunately, we can optimize this code by computing, ahead of time, the outcome of certain comparisons. We assume that the lexer never produces a token whose start and end positions are the same. There follows that a non-nullable symbol cannot have the same start and end positions. Conversely, a symbol that generates (a subset of) the language {epsilon} must have the same start and end positions. 
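For instance, consider a production whose right-hand side is [a b], where [a] is nullable but does not generate a subset of {epsilon}, and [b] is not nullable. The expansion of [$symbolstartpos] is then, roughly, the following cascade; the generated variable names shown here are an assumption, for illustration only:

```ocaml
if _startpos_a_ != _endpos_a_ then _startpos_a_ else _startpos_b_
```

The test on [b] is optimized away: since [b] is not nullable, its start and end positions must differ, so its start position is returned directly.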
*) (* Although this code is modeled after [Parsing.symbol_start_pos], we compare positions using physical equality, whereas they use structural equality. If for some reason a symbol has start and end positions that are structurally equal but physically different, then a difference will be observable. However, this is very unlikely. It would mean that a token has the same start and end positions (and furthermore, this position has been re-allocated). *) (* The reason why we expand [$symbolstartpos] away prior to inlining is that we want its meaning to be preserved by inlining. If we tried to preserve this keyword through the inlining phase, then (I suppose) we would have to introduce a family of keywords [$symbolstartpos(i, j)], computing over the interval from [i] to [j], and the preservation would not be exact -- because a nonempty symbol, once inlined, can be seen to be a sequence of empty and nonempty symbols. *) let rec symbolstartpos ((nullable, epsilon) as analysis) producers i n : IL.expr * KeywordSet.t = if i = n then (* Return [$endpos]. *) let keyword = Position (Left, WhereEnd, FlavorPosition) in EVar (posvar_ keyword), KeywordSet.singleton keyword else (* [symbol] is the symbol that appears in the right-hand side at position i. [x] is the identifier that is bound to it. We generate code that compares [$startpos($i)] and [$endpos($i)]. If they differ, we return [$startpos($i)]. Otherwise, we continue. Furthermore, as noted above, if [symbol] is not nullable, then we know that the start and end positions must differ, so we optimize this case. *) let producer = List.nth producers i in let symbol = producer_symbol producer and x = producer_identifier producer in let startp = Position (RightNamed x, WhereStart, FlavorPosition) and endp = Position (RightNamed x, WhereEnd, FlavorPosition) in if not (nullable symbol) then (* The start and end positions must differ. 
*) EVar (posvar_ startp), KeywordSet.singleton startp else let continue, keywords = symbolstartpos analysis producers (i + 1) n in if epsilon symbol then (* The start and end positions must be the same. *) continue, keywords else (* In the general case, a runtime test is required. *) EIfThenElse ( EApp (EVar "(!=)", [ EVar (posvar_ startp); EVar (posvar_ endp) ]), EVar (posvar_ startp), continue ), KeywordSet.add startp (KeywordSet.add endp keywords) (* [define keyword1 f keyword2] macro-expands [keyword1] as [f(keyword2)], where [f] is a function of expressions to expressions. *) let define keyword1 f keyword2 = Action.define keyword1 (KeywordSet.singleton keyword2) (mlet [ PVar (posvar_ keyword1) ] [ f (EVar (posvar_ keyword2)) ]) (* A [loc] keyword is expanded away. *) (* Since a location is represented as a pair of positions, $loc is sugar for the pair ($startpos, $endpos). (Similarly for $loc(x).) Furthermore, $sloc is sugar for the pair ($symbolstartpos, $endpos). *) let define_as_tuple keyword keywords = Action.define keyword (List.fold_right KeywordSet.add keywords KeywordSet.empty) (mlet [ PVar (posvar_ keyword) ] [ ETuple (List.map (fun keyword -> EVar (posvar_ keyword)) keywords) ]) let expand_loc keyword action = match keyword with | Position (Left, WhereSymbolStart, FlavorLocation) -> (* $sloc *) define_as_tuple keyword [ Position (Left, WhereSymbolStart, FlavorPosition); Position (Left, WhereEnd, FlavorPosition) ] action | Position (subject, WhereStart, FlavorLocation) -> (* $loc, $loc(x) *) define_as_tuple keyword [ Position (subject, WhereStart, FlavorPosition); Position (subject, WhereEnd, FlavorPosition) ] action | _ -> action (* An [ofs] keyword is expanded away. It is defined in terms of the corresponding [pos] keyword. 
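For instance, [$startofs(x)] is macro-expanded in terms of [$startpos(x)] by reading the [pos_cnum] field of the position record; the generated binding is roughly as follows, where the exact variable names are an assumption:

```ocaml
let _startofs_x_ = _startpos_x_.Lexing.pos_cnum in ...
```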
*) let expand_ofs keyword action = match keyword with | Position (subject, where, FlavorOffset) -> define keyword (fun e -> ERecordAccess (e, "Lexing.pos_cnum")) (Position (subject, where, FlavorPosition)) action | _ -> action (* [$symbolstartpos] is expanded into a cascade of [if] constructs, modeled after [Parsing.symbol_start_pos]. *) let expand_symbolstartpos analysis producers n keyword action = match keyword with | Position (Left, WhereSymbolStart, FlavorPosition) -> let expansion, keywords = symbolstartpos analysis producers 0 n in Action.define keyword keywords (mlet [ PVar (posvar_ keyword) ] [ expansion ]) action | Position (RightNamed _, WhereSymbolStart, FlavorPosition) -> (* [$symbolstartpos(x)] does not exist. *) assert false | _ -> action (* [$startpos] and [$endpos] are expanded away. *) let expand_startend producers n keyword action = match keyword with | Position (Left, WhereStart, flavor) -> (* [$startpos] is defined as [$startpos($1)] if this production has nonzero length and [$endpos($0)] otherwise. *) define keyword (fun e -> e) ( if n > 0 then let x = producer_identifier (List.hd producers) in Position (RightNamed x, WhereStart, flavor) else Position (Before, WhereEnd, flavor) ) action | Position (Left, WhereEnd, flavor) -> (* [$endpos] is defined as [$endpos($n)] if this production has nonzero length and [$endpos($0)] otherwise. *) define keyword (fun e -> e) ( if n > 0 then let x = producer_identifier (List.hd (List.rev producers)) in Position (RightNamed x, WhereEnd, flavor) else Position (Before, WhereEnd, flavor) ) action | _ -> action (* [expand_round] performs one round of expansion on [action], using [f] as a rewriting rule. *) let expand_round f action = KeywordSet.fold f (Action.keywords action) action (* [expand_action] performs macro-expansion in [action]. We do this in several rounds: first, expand the [loc] keywords away; then, expand the [ofs] keywords away; then, expand [symbolstart] away; then, expand the rest. 
We do this in this order because each round can cause new keywords to appear, which must be eliminated by the following rounds. *) let expand_action analysis producers action = let n = List.length producers in (* Expand [loc] keywords away first. *) let action = expand_round expand_loc action in (* The [ofs] keyword family is defined in terms of the [pos] family by accessing the [pos_cnum] field. Expand these keywords away next. *) let action = expand_round expand_ofs action in (* Expand [$symbolstartpos] away. *) let action = expand_round (expand_symbolstartpos analysis producers n) action in (* Then, expand away the non-[ofs] keywords. *) let action = expand_round (expand_startend producers n) action in action (* Silently analyze the grammar so as to find out which symbols are nullable and which symbols generate a subset of {epsilon}. This is used to optimize the expansion of $symbolstartpos. *) let analysis grammar = let module G = GrammarFunctor.Make(struct let grammar = grammar let verbose = false end) in let lookup (nt : Syntax.symbol) : G.Symbol.t = try G.Symbol.lookup nt with Not_found -> assert false in let nullable nt : bool = G.Analysis.nullable_symbol (lookup nt) and epsilon nt : bool = G.TerminalSet.is_empty (G.Analysis.first_symbol (lookup nt)) in nullable, epsilon (* Put everything together.
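As a sketch of the overall effect: for a production with a single producer named [a], an action that uses [$loc] is rewritten by the successive rounds above into roughly the following nest of bindings; the exact generated names are an assumption:

```ocaml
let _startpos = _startpos_a_ in
let _endpos = _endpos_a_ in
let _loc = (_startpos, _endpos) in
... the original action ...
```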
*) let expand_branch analysis branch = { branch with action = expand_action analysis branch.producers branch.action } let expand_rule analysis rule = { rule with branches = List.map (expand_branch analysis) rule.branches } let expand_grammar grammar = let analysis = analysis grammar in { grammar with rules = StringMap.map (expand_rule analysis) grammar.rules } menhir-20200123/src/keywordExpansion.mli000066400000000000000000000023261361226111300200710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax (* [expand_grammar] expands away the keywords [$startpos] and [$endpos], as well the entire [ofs] family of keywords. Doing this early simplifies some aspects later on, in particular %inlining. *) val expand_grammar: grammar -> grammar menhir-20200123/src/lexdep.mll000066400000000000000000000040711361226111300160030ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This code analyzes the output of [ocamldep] and returns the list of [.cmi] files that the [.cmo] file depends on. 
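For instance, on ocamldep output of the following, hypothetical shape, [main] returns one entry per line, pairing the target on the left of the colon with the list of files on its right:

```text
parser.cmo : parser.cmi lexer.cmi
lexer.cmo : lexer.cmi
```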
*) { open Lexing exception Error of string let fail lexbuf = raise (Error (Printf.sprintf "failed to make sense of ocamldep's output (character %d).\n" lexbuf.lex_curr_p.pos_cnum) ) } let newline = ('\n' | '\r' | "\r\n") let whitespace = ( ' ' | '\t' | ('\\' newline) ) let entrychar = [^ '\n' '\r' '\t' ' ' '\\' ':' ] let entry = ((entrychar+ as basename) ".cm" ('i' | 'o' | 'x') as filename) (* [main] recognizes a sequence of lines, where a line consists of an entry, followed by a colon, followed by a list of entries. *) rule main = parse | eof { [] } | entry whitespace* ":" { let bfs = collect [] lexbuf in ((basename, filename), bfs) :: main lexbuf } | _ { fail lexbuf } (* [collect] recognizes a list of entries, separated with spaces and ending in a newline. *) and collect bfs = parse | whitespace+ entry { collect ((basename, filename) :: bfs) lexbuf } | whitespace* newline { bfs } | _ | eof { fail lexbuf } menhir-20200123/src/lexer.mll000066400000000000000000000540641361226111300156500ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) { open Lexing open Parser open Positions open Keyword (* ------------------------------------------------------------------------ *) (* Short-hands. *) let error1 pos = Error.error (Positions.one pos) let error2 lexbuf = Error.error (Positions.lexbuf lexbuf) (* ------------------------------------------------------------------------ *) (* This wrapper saves the current lexeme start, invokes its argument, and restores it. This allows transmitting better positions to the parser. 
*) let savestart lexbuf f = let startp = lexbuf.lex_start_p in let token = f lexbuf in lexbuf.lex_start_p <- startp; token (* ------------------------------------------------------------------------ *) (* Overwrites an old character with a new one at a specified offset in a [bytes] buffer. *) let overwrite content offset c1 c2 = assert (Bytes.get content offset = c1); Bytes.set content offset c2 (* ------------------------------------------------------------------------ *) (* Keyword recognition and construction. *) (* A monster is a spot where we have identified a keyword in concrete syntax. We describe a monster as an object with the following methods: *) type monster = { (* The position of the monster. *) pos: Positions.t; (* This method is passed an array of (optional) names for the producers, that is, the elements of the production's right-hand side. It is also passed a flag which tells whether [$i] syntax is allowed or disallowed. It may perform some checks and is allowed to fail. *) check: check; (* This method transforms the keyword (in place) into a conventional OCaml identifier. This is done by replacing '$', '(', and ')' with '_'. Bloody. The arguments are [ofs1] and [content]. [ofs1] is the offset where [content] begins in the source file. *) transform: int -> bytes -> unit; (* This is the keyword, in abstract syntax. *) keyword: keyword option; } and check = Settings.dollars -> string option array -> unit (* No check. *) let none : check = fun _ _ -> () (* ------------------------------------------------------------------------ *) (* The [$syntaxerror] monster. *) let syntaxerror pos : monster = let check = none and transform ofs1 content = (* [$syntaxerror] is replaced with [(raise _eRR)]. Same length. 
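Indeed, [$syntaxerror] and [(raise _eRR)] are both exactly 12 characters long, so this in-place overwrite preserves every source offset; schematically:

```text
{ ... $syntaxerror ... }
{ ... (raise _eRR) ... }
```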
*) let pos = start_of_position pos in let ofs = pos.pos_cnum - ofs1 in let source = "(raise _eRR)" in Bytes.blit_string source 0 content ofs (String.length source) and keyword = Some SyntaxError in { pos; check; transform; keyword } (* ------------------------------------------------------------------------ *) (* We check that every [$i] is within range. Also, we forbid using [$i] when a producer has been given a name; this is bad style and may be a mistake. (Plus, this simplifies our life, as we rewrite [$i] to [_i], and we would have to rewrite it to a different identifier otherwise.) *) let check_dollar pos i : check = fun dollars producers -> (* If [i] is out of range, say so. *) if not (0 <= i - 1 && i - 1 < Array.length producers) then Error.error [pos] "$%d refers to a nonexistent symbol." i; (* If [$i] could be referred to via a name, say so. *) producers.(i - 1) |> Option.iter (fun x -> Error.error [pos] "please do not say: $%d. Instead, say: %s." i x ); (* If [$i] syntax is disallowed, say so. *) match dollars with | Settings.DollarsDisallowed -> Error.error [pos] "please do not use $%d. Instead, name this value." i | Settings.DollarsAllowed -> () (* We check that every reference to a producer [x] in a position keyword, such as [$startpos(x)], exists. *) let check_producer pos x : check = fun _ producers -> if not (List.mem (Some x) (Array.to_list producers)) then Error.error [pos] "%s refers to a nonexistent symbol." x (* ------------------------------------------------------------------------ *) (* The [$i] monster. *) let dollar pos i : monster = let check : check = check_dollar pos i and transform ofs1 content = (* [$i] is replaced with [_i]. Thus, it is no longer a keyword. *) let pos = start_of_position pos in let ofs = pos.pos_cnum - ofs1 in overwrite content ofs '$' '_' and keyword = None in { pos; check; transform; keyword } (* ------------------------------------------------------------------------ *) (* The position-keyword monster.
The most horrible of all. *) let position pos (where : string) (flavor : string) (i : string option) (x : string option) = let check_no_parameter () = if i <> None || x <> None then Error.error [pos] "$%s%s does not take a parameter." where flavor in let ofslpar = (* offset of the opening parenthesis, if there is one *) 1 + (* for the initial "$" *) String.length where + 3 (* for "pos" or "ofs" or "loc" *) in let where = match where with | "symbolstart" | "s" -> check_no_parameter(); WhereSymbolStart | "start" -> WhereStart | "end" -> WhereEnd | "" -> WhereStart | _ -> assert false in let flavor = match flavor with | "pos" -> FlavorPosition | "ofs" -> FlavorOffset | "loc" -> FlavorLocation | _ -> assert false in let subject, check = match i, x with | Some i, None -> let ii = int_of_string i in (* cannot fail *) if ii = 0 && where = WhereEnd then (* [$endpos($0)] *) Before, none else (* [$startpos($i)] is rewritten to [$startpos(_i)]. *) RightNamed ("_" ^ i), check_dollar pos ii | None, Some x -> (* [$startpos(x)] *) RightNamed x, check_producer pos x | None, None -> (* [$startpos] *) Left, none | Some _, Some _ -> assert false in let transform ofs1 content = let pos = start_of_position pos in let ofs = pos.pos_cnum - ofs1 in overwrite content ofs '$' '_'; let ofslpar = ofs + ofslpar in match i, x with | None, Some x -> overwrite content ofslpar '(' '_'; overwrite content (ofslpar + 1 + String.length x) ')' '_' | Some i, None -> overwrite content ofslpar '(' '_'; overwrite content (ofslpar + 1) '$' '_'; overwrite content (ofslpar + 2 + String.length i) ')' '_' | _, _ -> () in let keyword = Some (Position (subject, where, flavor)) in { pos; check; transform; keyword } (* ------------------------------------------------------------------------ *) (* In an OCaml header, there should be no monsters. This is just a sanity check. 
*) let no_monsters monsters = match monsters with | [] -> () | monster :: _ -> Error.error [monster.pos] "a Menhir keyword cannot be used in an OCaml header." (* ------------------------------------------------------------------------ *) (* Creates a stretch. *) let mk_stretch pos1 pos2 parenthesize monsters = (* Read the specified chunk of the file. *) let raw_content : string = InputFile.chunk (pos1, pos2) in (* Transform the monsters, if there are any. (This explicit test allows saving one string copy and keeping just one live copy.) *) let content : string = match monsters with | [] -> raw_content | _ :: _ -> let content : bytes = Bytes.of_string raw_content in List.iter (fun monster -> monster.transform pos1.pos_cnum content) monsters; Bytes.unsafe_to_string content in (* Add whitespace so that the column numbers match those of the source file. If requested, add parentheses so that the semantic action can be inserted into other code without ambiguity. *) let content = if parenthesize then (String.make (pos1.pos_cnum - pos1.pos_bol - 1) ' ') ^ "(" ^ content ^ ")" else (String.make (pos1.pos_cnum - pos1.pos_bol) ' ') ^ content in Stretch.({ stretch_filename = InputFile.get_input_file_name(); stretch_linenum = pos1.pos_lnum; stretch_linecount = pos2.pos_lnum - pos1.pos_lnum; stretch_content = content; stretch_raw_content = raw_content; stretch_keywords = Misc.map_opt (fun monster -> monster.keyword) monsters }) (* ------------------------------------------------------------------------ *) (* OCaml's reserved words. 
*) let reserved = let table = Hashtbl.create 149 in List.iter (fun word -> Hashtbl.add table word ()) [ "and"; "as"; "assert"; "begin"; "class"; "constraint"; "do"; "done"; "downto"; "else"; "end"; "exception"; "external"; "false"; "for"; "fun"; "function"; "functor"; "if"; "in"; "include"; "inherit"; "initializer"; "lazy"; "let"; "match"; "method"; "module"; "mutable"; "new"; "object"; "of"; "open"; "or"; "parser"; "private"; "rec"; "sig"; "struct"; "then"; "to"; "true"; "try"; "type"; "val"; "virtual"; "when"; "while"; "with"; "mod"; "land"; "lor"; "lxor"; "lsl"; "lsr"; "asr"; ]; table } (* ------------------------------------------------------------------------ *) (* Patterns. *) let newline = ('\010' | '\013' | "\013\010") let whitespace = [ ' ' '\t' ] let lowercase = ['a'-'z' '\223'-'\246' '\248'-'\255' '_'] let uppercase = ['A'-'Z' '\192'-'\214' '\216'-'\222'] let identchar = ['A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255' '0'-'9'] (* '\'' forbidden *) let attributechar = identchar | '.' let subject = '$' (['0'-'9']+ as i) | ((lowercase identchar*) as x) let poskeyword = '$' ( (("symbolstart" | "start" | "end") as where) (("pos" | "ofs") as flavor) | (("s" | "") as where) ("loc" as flavor) ) ( '(' subject ')' )? let previouserror = "$previouserror" let syntaxerror = "$syntaxerror" (* ------------------------------------------------------------------------ *) (* The lexer. *) rule main = parse | "%token" { TOKEN } | "%type" { TYPE } | "%left" { LEFT } | "%right" { RIGHT } | "%nonassoc" { NONASSOC } | "%start" { START } | "%prec" { PREC } | "%public" { PUBLIC } | "%parameter" { PARAMETER } | "%inline" { INLINE } | "%attribute" { PERCENTATTRIBUTE } | "%on_error_reduce" { ON_ERROR_REDUCE } | "%%" { (* The token [PERCENTPERCENT] carries a stretch that contains everything that follows %% in the input file. This string must be created lazily. The parser decides (based on the context) whether this stretch is needed. 
If it is indeed needed, then constructing this stretch drives the lexer to the end of the file. *) PERCENTPERCENT (lazy ( let openingpos = lexeme_end_p lexbuf in let closingpos = finish lexbuf in mk_stretch openingpos closingpos false [] )) } | ";" { SEMI } | ":" { COLON } | "," { COMMA } | "=" { EQUAL } | "(" { LPAREN } | ")" { RPAREN } | "|" { BAR } | "?" { QUESTION } | "*" { STAR } | "+" { PLUS } | "~" { TILDE } | "_" { UNDERSCORE } | ":=" { COLONEQUAL } | "==" { EQUALEQUAL } | "let" { LET } | (lowercase identchar *) as id { if Hashtbl.mem reserved id then error2 lexbuf "this is an OCaml reserved word." else LID (with_pos (cpos lexbuf) id) } | (uppercase identchar *) as id { UID (with_pos (cpos lexbuf) id) } (* Quoted strings, which are used as aliases for tokens. For simplicity, we just disallow double quotes and backslash outright. Given the use of terminal strings in grammars, this is fine. *) | ( "\"" ( [' ' - '~'] # ['"' '\\'] + ) "\"" ) as id { QID (with_pos (cpos lexbuf) id) } | "//" [^ '\010' '\013']* newline (* skip C++ style comment *) | newline { new_line lexbuf; main lexbuf } | whitespace+ { main lexbuf } | "/*" { comment (lexeme_start_p lexbuf) lexbuf; main lexbuf } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; main lexbuf } | "<" { savestart lexbuf (ocamltype (lexeme_end_p lexbuf)) } | "%{" { savestart lexbuf (fun lexbuf -> let openingpos = lexeme_start_p lexbuf in let stretchpos = lexeme_end_p lexbuf in let closingpos, monsters = action true openingpos [] lexbuf in no_monsters monsters; HEADER (mk_stretch stretchpos closingpos false []) ) } | "{" { savestart lexbuf (fun lexbuf -> let openingpos = lexeme_start_p lexbuf in let stretchpos = lexeme_end_p lexbuf in let closingpos, monsters = action false openingpos [] lexbuf in ACTION ( fun dollars producers -> List.iter (fun monster -> monster.check dollars producers) monsters; let stretch = mk_stretch stretchpos closingpos true monsters in Action.from_stretch stretch ) ) } | ('%'? 
as percent) "[@" (attributechar+ as id) whitespace* { let openingpos = lexeme_start_p lexbuf in let stretchpos = lexeme_end_p lexbuf in let closingpos = attribute openingpos lexbuf in let pos = Positions.import (openingpos, lexeme_end_p lexbuf) in let attr = mk_stretch stretchpos closingpos false [] in if percent = "" then (* No [%] sign: this is a normal attribute. *) ATTRIBUTE (Positions.with_pos pos id, attr) else (* A [%] sign is present: this is a grammar-wide attribute. *) GRAMMARATTRIBUTE (Positions.with_pos pos id, attr) } | eof { EOF } | _ { error2 lexbuf "unexpected character(s)." } (* ------------------------------------------------------------------------ *) (* Skip C style comments. *) and comment openingpos = parse | newline { new_line lexbuf; comment openingpos lexbuf } | "*/" { () } | eof { error1 openingpos "unterminated comment." } | _ { comment openingpos lexbuf } (* ------------------------------------------------------------------------ *) (* Collect an O'Caml type delimited by angle brackets. Angle brackets can appear as part of O'Caml function types and variant types, so we must recognize them and *not* treat them as a closing bracket. *) and ocamltype openingpos = parse | "->" | "[>" { ocamltype openingpos lexbuf } | '>' { OCAMLTYPE (Stretch.Declared (mk_stretch openingpos (lexeme_start_p lexbuf) true [])) } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; ocamltype openingpos lexbuf } | newline { new_line lexbuf; ocamltype openingpos lexbuf } | eof { error1 openingpos "unterminated OCaml type." } | _ { ocamltype openingpos lexbuf } (* ------------------------------------------------------------------------ *) (* Collect O'Caml code delimited by curly brackets. The monsters that are encountered along the way are accumulated in the list [monsters]. Nested curly brackets must be properly counted. Nested parentheses are also kept track of, so as to better report errors when they are not balanced. 
*) and action percent openingpos monsters = parse | '{' { let _, monsters = action false (lexeme_start_p lexbuf) monsters lexbuf in action percent openingpos monsters lexbuf } | ("}" | "%}") as delimiter { match percent, delimiter with | true, "%}" | false, "}" -> (* This is the delimiter we were instructed to look for. *) lexeme_start_p lexbuf, monsters | _, _ -> (* This is not it. *) error1 openingpos "unbalanced opening brace." } | '(' { let _, monsters = parentheses (lexeme_start_p lexbuf) monsters lexbuf in action percent openingpos monsters lexbuf } | '$' (['0'-'9']+ as i) { let monster = dollar (cpos lexbuf) (int_of_string i) in action percent openingpos (monster :: monsters) lexbuf } | poskeyword { let monster = position (cpos lexbuf) where flavor i x in action percent openingpos (monster :: monsters) lexbuf } | previouserror { error2 lexbuf "$previouserror is no longer supported." } | syntaxerror { let monster = syntaxerror (cpos lexbuf) in action percent openingpos (monster :: monsters) lexbuf } | '"' { string (lexeme_start_p lexbuf) lexbuf; action percent openingpos monsters lexbuf } | "'" { char lexbuf; action percent openingpos monsters lexbuf } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; action percent openingpos monsters lexbuf } | newline { new_line lexbuf; action percent openingpos monsters lexbuf } | ')' | eof { error1 openingpos "unbalanced opening brace." } | _ { action percent openingpos monsters lexbuf } (* ------------------------------------------------------------------------ *) (* Inside a semantic action, we keep track of nested parentheses, so as to better report errors when they are not balanced. 
*) and parentheses openingpos monsters = parse | '(' { let _, monsters = parentheses (lexeme_start_p lexbuf) monsters lexbuf in parentheses openingpos monsters lexbuf } | ')' { lexeme_start_p lexbuf, monsters } | '{' { let _, monsters = action false (lexeme_start_p lexbuf) monsters lexbuf in parentheses openingpos monsters lexbuf } | '$' (['0'-'9']+ as i) { let monster = dollar (cpos lexbuf) (int_of_string i) in parentheses openingpos (monster :: monsters) lexbuf } | poskeyword { let monster = position (cpos lexbuf) where flavor i x in parentheses openingpos (monster :: monsters) lexbuf } | previouserror { error2 lexbuf "$previouserror is no longer supported." } | syntaxerror { let monster = syntaxerror (cpos lexbuf) in parentheses openingpos (monster :: monsters) lexbuf } | '"' { string (lexeme_start_p lexbuf) lexbuf; parentheses openingpos monsters lexbuf } | "'" { char lexbuf; parentheses openingpos monsters lexbuf } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; parentheses openingpos monsters lexbuf } | newline { new_line lexbuf; parentheses openingpos monsters lexbuf } | '}' | eof { error1 openingpos "unbalanced opening parenthesis." } | _ { parentheses openingpos monsters lexbuf } (* ------------------------------------------------------------------------ *) (* Collect an attribute payload, which is terminated by a closing square bracket. Nested square brackets must be properly counted. Nested curly brackets and nested parentheses are also kept track of, so as to better report errors when they are not balanced. 
*) and attribute openingpos = parse | '[' { let _ = attribute (lexeme_start_p lexbuf) lexbuf in attribute openingpos lexbuf } | ']' { lexeme_start_p lexbuf } | '{' { let _, _ = action false (lexeme_start_p lexbuf) [] lexbuf in attribute openingpos lexbuf } | '(' { let _, _ = parentheses (lexeme_start_p lexbuf) [] lexbuf in attribute openingpos lexbuf } | '"' { string (lexeme_start_p lexbuf) lexbuf; attribute openingpos lexbuf } | "'" { char lexbuf; attribute openingpos lexbuf } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; attribute openingpos lexbuf } | newline { new_line lexbuf; attribute openingpos lexbuf } | '}' | ')' | eof { error1 openingpos "unbalanced opening bracket." } | _ { attribute openingpos lexbuf } (* ------------------------------------------------------------------------ *) (* Skip O'Caml comments. Comments can be nested and can contain strings or characters, which must be correctly analyzed. (A string could contain begin-of-comment or end-of-comment sequences, which must be ignored; a character could contain a begin-of-string sequence.) *) and ocamlcomment openingpos = parse | "*)" { () } | "(*" { ocamlcomment (lexeme_start_p lexbuf) lexbuf; ocamlcomment openingpos lexbuf } | '"' { string (lexeme_start_p lexbuf) lexbuf; ocamlcomment openingpos lexbuf } | "'" { char lexbuf; ocamlcomment openingpos lexbuf } | newline { new_line lexbuf; ocamlcomment openingpos lexbuf } | eof { error1 openingpos "unterminated OCaml comment." } | _ { ocamlcomment openingpos lexbuf } (* ------------------------------------------------------------------------ *) (* Skip O'Caml strings. *) and string openingpos = parse | '"' { () } | '\\' newline | newline { new_line lexbuf; string openingpos lexbuf } | '\\' _ (* Upon finding a backslash, skip the character that follows, unless it is a newline. Pretty crude, but should work. *) { string openingpos lexbuf } | eof { error1 openingpos "unterminated OCaml string." 
" }
| _
    { string openingpos lexbuf }

(* ------------------------------------------------------------------------ *)

(* Skip OCaml characters. A lone quote character is legal inside a comment,
   so if we don't recognize the matching closing quote, we simply abandon. *)

and char = parse
| '\\'? newline "'"
    { new_line lexbuf }
| [^ '\\' '\''] "'"
| '\\' _ "'"
| '\\' ['0'-'9'] ['0'-'9'] ['0'-'9'] "'"
| '\\' 'x' ['0'-'9' 'a'-'f' 'A'-'F'] ['0'-'9' 'a'-'f' 'A'-'F'] "'"
| ""
    { () }

(* ------------------------------------------------------------------------ *)

(* Read until the end of the file. This is used after finding a %% that
   marks the end of the grammar specification. We update the current
   position as we go. This allows us to build a stretch for the
   postlude. *)

and finish = parse
| newline
    { new_line lexbuf; finish lexbuf }
| eof
    { lexeme_start_p lexbuf }
| _
    { finish lexbuf }

menhir-20200123/src/lexmli.mll

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This code analyzes the output of [ocamlc -i] and returns a list of
   identifiers together with their types. Types are represented by offsets
   in the source string. *)

{

  let fail () =
    Error.error [] "failed to make sense of ocamlc's output."

}

let whitespace = [ ' ' '\t' '\n' '\r' ]

let lowercase = ['a'-'z' '\223'-'\246' '\248'-'\255' '_']

let identchar = ['A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255' '0'-'9'] (* '\'' forbidden *)

(* Read a list of bindings.
   We start immediately after a [val] keyword, so we expect either an end
   marker, or an identifier, followed by a colon, followed by a type,
   followed by another list of bindings. In the latter case, we recognize
   the identifier and the colon, record where the type begins, and pass
   control to [type_then_bindings]. *)

rule bindings env = parse
| "menhir_end_marker : int"
    { env }
| whitespace* ((lowercase identchar*) as id) whitespace* ':' whitespace*
    { type_then_bindings env id (Lexing.lexeme_end lexbuf) lexbuf }
| _
| eof
    { fail() }

(* Read a type followed by a list of bindings. *)

and type_then_bindings env id openingofs = parse
| whitespace+ "val" whitespace
    { let closingofs = Lexing.lexeme_start lexbuf in
      bindings ((id, openingofs, closingofs) :: env) lexbuf }
| _
    { type_then_bindings env id openingofs lexbuf }
| eof
    { fail() }

(* Skip up to the first [val] keyword that follows the begin marker, and
   start from there. *)

and main = parse
| _* "val menhir_begin_marker : int" whitespace+ "val" whitespace+
    { bindings [] lexbuf }
| _
| eof
    { fail() }

menhir-20200123/src/lexpointfree.mll

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

{
  exception InvalidPointFreeAction
}

(* See [ParserAux.validate_pointfree_action]. *)

let lowercase = ['a'-'z' '\223'-'\246' '\248'-'\255' '_']

let uppercase = ['A'-'Z' '\192'-'\214' '\216'-'\222']

let identchar = ['A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255' '0'-'9'] (* '\'' forbidden *)

let symbolchar = ['!' '$' '%' '&' '*' '+' '-' '.'
                  '/' ':' '<' '=' '>' '?' '@' '^' '|' '~']
let op = symbolchar+ (* An approximation of OCaml's rules. *)

let whitespace = [ ' ' '\t' '\n' ]

rule validate_pointfree_action = parse
| whitespace* (lowercase | uppercase | '`') (identchar | '.')* whitespace* eof
| whitespace* '(' op ')' whitespace* eof
    (* We have got a nonempty point-free action: . *)
    { true }
| whitespace* eof
    (* We have got an empty point-free action: <>. *)
    { false }
| _
    { raise InvalidPointFreeAction }

(* See [ParserAux.valid_ocaml_identifier]. *)

and valid_ocaml_identifier = parse
| lowercase identchar* eof
    { true }
| _
| eof
    { false }

menhir-20200123/src/lineCount.mll

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This simple function counts the number of newline characters
   in a string. *)

let newline = ('\010' | '\013' | "\013\010")
let ordinary = [^ '\010' '\013']+

rule count n = parse
| eof
    { n }
| newline
    { count (n + 1) lexbuf }
| ordinary
    { count n lexbuf }

menhir-20200123/src/listMonad.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.
                                                             *)
(*                                                                            *)
(******************************************************************************)

type 'a m =
    'a list

let return x =
  [ x ]

let bind l f =
  List.flatten (List.map f l)

let ( >>= ) l f =
  bind l f

(*

   1. (return x) >>= f == f x

   bind [ x ] f
   = List.flatten (List.map f [ x ])
   = f x

   2. m >>= return == m

   bind l return
   = List.flatten (List.map (fun x -> [ x ]) (x1::x2::..::xn))
   = List.flatten ([x1]::...::[xn])
   = x1::...::xn
   = l

   3. (m >>= f) >>= g == m >>= (\x -> f x >>= g)

   bind (bind l f) g
   = List.flatten (List.map g (List.flatten (List.map f (x1::...::xn))))
   = List.flatten (List.map g (f x1 :: f x2 :: ... :: f xn))
   = List.flatten (List.map g ([fx1_1; fx1_2 ... ] :: [fx2_1; ... ] :: ...))
   = List.flatten ([ g fx1_1; g fx_1_2 ... ] :: [ g fx_2_1; ... ] ...)
   = List.flatten (List.map (fun x -> List.flatten (List.map g (f x))) l)
   = bind l (fun x -> bind (f x) g)

*)

menhir-20200123/src/listMonad.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(** Monad type which represents a list of results. *)
type 'a m = 'a list

(** [bind x f] applies [f] to a list of results, returning
    a list of results. *)
val bind: 'a m -> ('a -> 'b m) -> 'b m

val ( >>= ) : 'a m -> ('a -> 'b m) -> 'b m

(** [return x] is the left and right unit of [bind].
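    As a quick illustration of this interface, here is a standalone sketch
    (it duplicates [return] and [bind] locally, since this module is not
    linked in here) that uses the list monad to enumerate all pairs drawn
    from two lists:

```ocaml
(* The list monad, as in this module. *)
let return x = [ x ]
let bind l f = List.flatten (List.map f l)
let ( >>= ) = bind

(* Enumerate every pair (x, y) with x drawn from the first list and y
   from the second, in lexicographic order of the two choices. *)
let pairs =
  [ 1; 2 ] >>= fun x ->
  [ 10; 20 ] >>= fun y ->
  return (x, y)
```

    Here [pairs] evaluates to the four pairs (1, 10), (1, 20), (2, 10),
    (2, 20), in that order.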
   *)
val return: 'a -> 'a m

menhir-20200123/src/lookahead.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* These are the operations required of lookahead sets during a closure
   computation. This signature is exploited by the functor [Item.Closure]. *)

module type S = sig

  (* The type of lookahead sets. *)
  type t

  (* The empty lookahead set. Redundant with the following, but
     convenient. *)
  val empty: t

  (* A concrete, constant set of terminal symbols. *)
  val constant: Grammar.TerminalSet.t -> t

  (* [union s1 s2] returns the union of [s1] and [s2]. *)
  val union: t -> t -> t

end

menhir-20200123/src/lr0.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

open Grammar
module InfiniteArray =
  MenhirLib.InfiniteArray

(* ------------------------------------------------------------------------ *)

(* Symbolic lookahead information. *)

(* A symbolic lookahead set consists of an actual concrete set of terminal
   symbols and of a number of set variables. Set variables are encoded as
   integers.
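   A standalone sketch of this representation, with sorted integer lists
   standing in for [TerminalSet] and [CompressedBitSet] (neither is
   available here); [constant], [variable], and [union] mirror the
   operations defined below:

```ocaml
(* Stand-in: a symbolic lookahead set is a pair of a concrete token set
   and a set of variables, both represented as sorted integer lists. *)
type t = int list * int list

let constant toks : t =
  (List.sort_uniq compare toks, [])

let variable (var : int) : t =
  ([], [var])

let union ((toks1, vars1) : t) ((toks2, vars2) : t) : t =
  (List.sort_uniq compare (toks1 @ toks2),
   List.sort_uniq compare (vars1 @ vars2))
```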
*) module SymbolicLookahead = struct type t = TerminalSet.t * CompressedBitSet.t let constant toks = (toks, CompressedBitSet.empty) let empty = constant TerminalSet.empty let union (toks1, vars1) ((toks2, vars2) as s2) = let toks = TerminalSet.union toks1 toks2 and vars = CompressedBitSet.union vars1 vars2 in if toks2 == toks && vars2 == vars then s2 else (toks, vars) let variable (var : int) : t = (TerminalSet.empty, CompressedBitSet.singleton var) let project (toks, vars) = assert (CompressedBitSet.is_empty vars); toks end (* We will perform closure operations over symbolic lookahead sets. This allows us to later represent LR(1) states as pairs of an LR(0) node number and an array of concrete lookahead sets. *) module SymbolicClosure = Item.Closure(SymbolicLookahead) (* Closure operations over concrete lookahead sets are also used (when explaining conflicts). One could take another instance of the functor. The approach below is somewhat less elegant and makes each call to [closure] somewhat slower, but saves the cost of instantiating the functor again -- which is linear in the size of the grammar. *) type concretelr1state = TerminalSet.t Item.Map.t let closure (state : concretelr1state) : concretelr1state = Item.Map.map SymbolicLookahead.project (SymbolicClosure.closure (Item.Map.map SymbolicLookahead.constant state)) (* ------------------------------------------------------------------------ *) (* Finding which non-epsilon transitions leave a set of items. This code is parametric in the nature of lookahead sets. 
*)

let transitions (state : 'a Item.Map.t) : 'a Item.Map.t SymbolMap.t =
  Item.Map.fold (fun item toks transitions ->
    match Item.classify item with
    | Item.Shift (symbol, item') ->
        let items : 'a Item.Map.t =
          try
            SymbolMap.find symbol transitions
          with Not_found ->
            Item.Map.empty
        in
        SymbolMap.add symbol (Item.Map.add item' toks items) transitions
    | Item.Reduce _ ->
        transitions
  ) state SymbolMap.empty

(* ------------------------------------------------------------------------ *)

(* Determining the reduction opportunities at a (closed) state. They are
   represented as a list of pairs of a lookahead set and a production index.
   This code is again parametric in the nature of lookahead sets. *)

let reductions (state : 'a Item.Map.t) : ('a * Production.index) list =
  Item.Map.fold (fun item toks accu ->
    match Item.classify item with
    | Item.Reduce prod ->
        (toks, prod) :: accu
    | Item.Shift _ ->
        accu
  ) state []

(* ------------------------------------------------------------------------ *)

(* Construction of the LR(0) automaton. *)

(* Nodes are numbered sequentially. *)

type node =
    int

(* A symbolic transition is a pair of the target state number and an array
   of symbolic lookahead sets. The variables in these sets are numbered in
   [0,g) where g is the number of items in the source LR(0) state. Items
   are numbered in the order of presentation by [Item.Set.fold]. *)

type symbolic_transition_target =
    node * SymbolicLookahead.t array

(* The automaton is represented by (growing) arrays of states (sets of
   items), symbolic transition information, and symbolic reduction
   information, indexed by node numbers. Conversely, a hash table maps
   states (sets of items) to node numbers.
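   The find-or-allocate discipline used here can be sketched in isolation.
   In this hypothetical miniature, states are just strings; a hash table
   maps each state to its node number, and a fresh number is allocated on
   first encounter, exactly as [explore] does below:

```ocaml
(* A counter of allocated node numbers, and a table mapping each state
   already seen to its number. *)
let n = ref 0
let map : (string, int) Hashtbl.t = Hashtbl.create 17

(* Return the number of [state], allocating a fresh one if needed. *)
let number (state : string) : int =
  try Hashtbl.find map state
  with Not_found ->
    let k = !n in
    n := k + 1;
    Hashtbl.add map state k;
    k
```

   Numbering is therefore determined by the order in which states are first
   encountered, which for the automaton below is the order of the
   depth-first exploration.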
*) let n = ref 0 let states : Item.Set.t InfiniteArray.t = InfiniteArray.make Item.Set.empty let _transitions : symbolic_transition_target SymbolMap.t InfiniteArray.t = InfiniteArray.make SymbolMap.empty let _reductions : (SymbolicLookahead.t * Production.index) list InfiniteArray.t = InfiniteArray.make [] let map : (Item.Set.t, node) Hashtbl.t = Hashtbl.create 50021 let incoming : Symbol.t option InfiniteArray.t = InfiniteArray.make None (* The automaton is built depth-first. *) let rec explore (symbol : Symbol.t option) (state : Item.Set.t) : node = (* Find out whether this state was already explored. *) try Hashtbl.find map state with Not_found -> (* If not, create a new node. *) let k = !n in n := k + 1; InfiniteArray.set states k state; Hashtbl.add map state k; (* Record its incoming symbol. *) InfiniteArray.set incoming k symbol; (* Build a symbolic version of the current state, where each item is associated with a distinct lookahead set variable, numbered consecutively. *) let (_ : int), (symbolic_state : SymbolicClosure.state) = Item.Set.fold (fun item (i, symbolic_state) -> i+1, Item.Map.add item (SymbolicLookahead.variable i) symbolic_state ) state (0, Item.Map.empty) in (* Compute the symbolic closure. *) let closure = SymbolicClosure.closure symbolic_state in (* Compute symbolic information about reductions. *) InfiniteArray.set _reductions k (reductions closure); (* Compute symbolic information about the transitions, and, by dropping the symbolic lookahead information, explore the transitions to further LR(0) states. 
*) InfiniteArray.set _transitions k (SymbolMap.mapi (fun symbol symbolic_state -> let (k : node) = explore (Some symbol) (Item.Map.domain symbolic_state) in let lookahead : SymbolicLookahead.t array = Array.make (Item.Map.cardinal symbolic_state) SymbolicLookahead.empty in let (_ : int) = Item.Map.fold (fun _ s i -> lookahead.(i) <- s; i+1 ) symbolic_state 0 in ((k, lookahead) : symbolic_transition_target) ) (transitions closure)); k (* Creating a start state out of a start production. It contains a single item, consisting of the start production, at position 0. *) let start prod : Item.Set.t = Item.Set.singleton (Item.import (prod, 0)) (* This starts the construction of the automaton and records the entry nodes in an array. *) let entry : node ProductionMap.t = ProductionMap.start (fun prod -> explore None (start prod) ) let () = Hashtbl.clear map let n = !n let () = Error.logA 1 (fun f -> Printf.fprintf f "Built an LR(0) automaton with %d states.\n" n); Time.tick "Construction of the LR(0) automaton" (* ------------------------------------------------------------------------ *) (* Accessors. *) let items node : Item.Set.t = InfiniteArray.get states node let incoming_symbol node : Symbol.t option = InfiniteArray.get incoming node let outgoing_edges node : node SymbolMap.t = SymbolMap.map (fun (target, _) -> target) (InfiniteArray.get _transitions node) let outgoing_symbols node : Symbol.t list = SymbolMap.domain (InfiniteArray.get _transitions node) (* Efficient access to the predecessors of an LR(0) state requires building a reversed graph. This is done on the first invocation of the function [predecessors]. Our measurements show that it typically takes less than 0.01s anyway. 
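   The graph reversal alluded to here can be sketched independently: given
   an array of successor lists, one pass over the edges builds the array of
   predecessor lists (a hypothetical standalone function; the real code
   below iterates over [SymbolMap]s of transitions instead):

```ocaml
(* From an array mapping each node to its successors, build the array
   mapping each node to its predecessors. *)
let reverse_graph (succ : int list array) : int list array =
  let n = Array.length succ in
  let pred = Array.make n [] in
  Array.iteri (fun source targets ->
    List.iter (fun target ->
      pred.(target) <- source :: pred.(target)
    ) targets
  ) succ;
  pred
```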
*) let predecessors : node list array Lazy.t = lazy ( let predecessors = Array.make n [] in for source = 0 to n-1 do SymbolMap.iter (fun _symbol (target, _) -> predecessors.(target) <- source :: predecessors.(target) ) (InfiniteArray.get _transitions source) done; predecessors ) let incoming_edges (c : node) : node list = (Lazy.force predecessors).(c) (* ------------------------------------------------------------------------ *) (* Help for building the LR(1) automaton. *) (* An LR(1) state is represented as a pair of an LR(0) state number and an array of concrete lookahead sets (whose length depends on the LR(0) state). *) type lr1state = node * TerminalSet.t array (* An encoded LR(1) state can be turned into a concrete representation, that is, a mapping of items to concrete lookahead sets. *) let export (k, toksr) = let (_ : int), items = Item.Set.fold (fun item (i, items) -> i+1, Item.Map.add item toksr.(i) items ) (InfiniteArray.get states k) (0, Item.Map.empty) in items (* Displaying a concrete state. *) let print_concrete leading (state : concretelr1state) = let buffer = Buffer.create 1024 in Item.Map.iter (fun item toks -> Printf.bprintf buffer "%s%s[ %s ]\n" leading (Item.print item) (TerminalSet.print toks) ) state; Buffer.contents buffer (* Displaying a state. By default, only the kernel is displayed, not the closure. *) let print leading state = print_concrete leading (export state) let print_closure leading state = print_concrete leading (closure (export state)) (* The core of an LR(1) state is the underlying LR(0) state. *) let core (k, _) = k (* A sanity check. *) let well_formed (k, toksr) = Array.length toksr = Item.Set.cardinal (InfiniteArray.get states k) (* An LR(1) start state is the combination of an LR(0) start state (which consists of a single item) with a singleton lookahead set that consists of the end-of-file pseudo-token. 
*) let start k = let state = (k, [| TerminalSet.singleton Terminal.sharp |]) in assert (well_formed state); state (* Interpreting a symbolic lookahead set with respect to a source state. The variables in the symbolic lookahead set (which are integers) are interpreted as indices into the state's array of concrete lookahead sets. The result is a concrete lookahead set. *) let interpret ((_, toksr) as state : lr1state) ((toks, vars) : SymbolicLookahead.t) : TerminalSet.t = assert (well_formed state); CompressedBitSet.fold (fun var toks -> assert (var >= 0 && var < Array.length toksr); TerminalSet.union toksr.(var) toks ) vars toks (* Out of an LR(1) state, one produces information about reductions and transitions. This is done in an efficient way by interpreting the precomputed symbolic information with respect to that state. *) let reductions ((k, _) as state : lr1state) : (TerminalSet.t * Production.index) list = List.map (fun (s, prod) -> interpret state s, prod ) (InfiniteArray.get _reductions k) let transitions ((k, _) as state : lr1state) : lr1state SymbolMap.t = SymbolMap.map (fun ((k, sr) : symbolic_transition_target) -> ((k, Array.map (interpret state) sr) : lr1state) ) (InfiniteArray.get _transitions k) let transition symbol ((k, _) as state : lr1state) : lr1state = let ((k, sr) : symbolic_transition_target) = try SymbolMap.find symbol (InfiniteArray.get _transitions k) with Not_found -> assert false (* no transition along this symbol *) in (k, Array.map (interpret state) sr) (* [transition_tokens transitions] returns the set of tokens (terminal symbols) that are labels of outgoing transitions in the table [transitions]. *) let transition_tokens transitions = SymbolMap.fold (fun symbol _target toks -> match symbol with | Symbol.T tok -> TerminalSet.add tok toks | Symbol.N _ -> toks ) transitions TerminalSet.empty (* Equality of states. 
*) let equal ((k1, toksr1) as state1) ((k2, toksr2) as state2) = assert (k1 = k2 && well_formed state1 && well_formed state2); let rec loop i = if i = 0 then true else let i = i - 1 in (TerminalSet.equal toksr1.(i) toksr2.(i)) && (loop i) in loop (Array.length toksr1) (* A total order on states. *) let compare ((k1, toksr1) as state1) ((k2, toksr2) as state2) = assert (k1 = k2 && well_formed state1 && well_formed state2); let rec loop i = if i = 0 then 0 else let i = i - 1 in let c = TerminalSet.compare toksr1.(i) toksr2.(i) in if c <> 0 then c else loop i in loop (Array.length toksr1) (* Subsumption between states. *) let subsume ((k1, toksr1) as state1) ((k2, toksr2) as state2) = assert (k1 = k2 && well_formed state1 && well_formed state2); let rec loop i = if i = 0 then true else let i = i - 1 in (TerminalSet.subset toksr1.(i) toksr2.(i)) && (loop i) in loop (Array.length toksr1) (* This function determines whether two (core-equivalent) states are compatible, according to a criterion that is close to Pager's weak compatibility criterion. Pager's criterion guarantees that if a merged state has a potential conflict at [(i, j)] -- that is, some token [t] appears within the lookahead sets of both item [i] and item [j] -- then there exists a state in the canonical automaton that also has a potential conflict at [(i, j)] -- that is, some token [u] appears within the lookahead sets of both item [i] and item [j]. Note that [t] and [u] can be distinct. Pager has shown that his weak compatibility criterion is stable, that is, preserved by transitions and closure. This means that, if two states can be merged, then so can their successors. This is important, because merging two states means committing to merging their successors, even though we have not even built these successors yet. The criterion used here is a slightly more restrictive version of Pager's criterion, which guarantees equality of the tokens [t] and [u]. 
This is done essentially by applying Pager's original criterion on a token-wise basis. Pager's original criterion states that two states can be merged if the new state has no conflict or one of the original states has a conflict. Our more restrictive criterion states that two states can be merged if, for every token [t], the new state has no conflict at [t] or one of the original states has a conflict at [t]. This modified criterion is also stable. My experiments show that it is almost as effective in practice: out of more than a hundred real-world sample grammars, only one automaton was affected, and only one extra state appeared as a result of using the modified criterion. Its advantage is to potentially make conflict explanations easier: if there appears to be a conflict at [t], then some conflict at [t] can be explained. This was not true when using Pager's original criterion. *) let compatible (k1, toksr1) (k2, toksr2) = assert (k1 = k2); let n = Array.length toksr1 in (* Two states are compatible if and only if they are compatible at every pair (i, j), where i and j are distinct. *) let rec loopi i = if i = n then true else let toksr1i = toksr1.(i) and toksr2i = toksr2.(i) in let rec loopj j = if j = i then true else let toksr1j = toksr1.(j) and toksr2j = toksr2.(j) in (* The two states are compatible at (i, j) if every conflict token in the merged state already was a conflict token in one of the two original states. 
This could be written as follows: TerminalSet.subset (TerminalSet.inter (TerminalSet.union toksr1i toksr2i) (TerminalSet.union toksr1j toksr2j)) (TerminalSet.union (TerminalSet.inter toksr1i toksr1j) (TerminalSet.inter toksr2i toksr2j)) but is easily seen (on paper) to be equivalent to: *) TerminalSet.subset (TerminalSet.inter toksr2i toksr1j) (TerminalSet.union toksr1i toksr2j) && TerminalSet.subset (TerminalSet.inter toksr1i toksr2j) (TerminalSet.union toksr2i toksr1j) && loopj (j+1) in loopj 0 && loopi (i+1) in loopi 0 (* This function determines whether two (core-equivalent) states can be merged without creating an end-of-stream conflict, now or in the future. The rule is, if an item appears in one state with the singleton "#" as its lookahead set, then its lookahead set in the other state must contain "#". So, either the second lookahead set is also the singleton "#", and no end-of-stream conflict exists, or it is larger, and the second state already contains an end-of-stream conflict. Put another way, we do not want to merge two lookahead sets when one contains "#" alone and the other does not contain "#". I invented this rule to complement Pager's criterion. I believe, but I am not 100% sure, that it does indeed prevent end-of-stream conflicts and that it is stable. Thanks to Sébastien Hinderer for reporting the bug caused by the absence of this extra criterion. *) let eos_compatible (k1, toksr1) (k2, toksr2) = assert (k1 = k2); let n = Array.length toksr1 in let rec loop i = if i = n then true else let toks1 = toksr1.(i) and toks2 = toksr2.(i) in begin if TerminalSet.mem Terminal.sharp toks1 && TerminalSet.is_singleton toks1 then (* "#" is alone in one set: it must be a member of the other set. *) TerminalSet.mem Terminal.sharp toks2 else if TerminalSet.mem Terminal.sharp toks2 && TerminalSet.is_singleton toks2 then (* Symmetric condition. 
*) TerminalSet.mem Terminal.sharp toks1 else true end && loop (i+1) in loop 0 (* This function determines whether two (core-equivalent) states can be merged without creating spurious reductions on the [error] token. The rule is, we merge two states only if they agree on which reductions are permitted on the [error] token. Without this restriction, we might end up in a situation where we decide to introduce an [error] token into the input stream and perform a reduction, whereas a canonical LR(1) automaton, confronted with the same input string, would fail normally -- that is, it would introduce an [error] token into the input stream, but it would not be able to perform a reduction right away: the current state would be discarded. In the interest of more accurate (or sane, or predictable) error handling, I decided to introduce this restriction as of 20110124. This will cause an increase in the size of automata for grammars that use the [error] token. It might actually make the [error] token somewhat easier to use. Note that two sets can be in the subsumption relation and still be error-incompatible. Error-compatibility requires equality of the lookahead sets, restricted to [error]. Thanks to Didier Rémy for reporting a bug caused by the absence of this extra criterion. *) let error_compatible (k1, toksr1) (k2, toksr2) = assert (k1 = k2); let n = Array.length toksr1 in let rec loop i = if i = n then true else let toks1 = toksr1.(i) and toks2 = toksr2.(i) in begin if TerminalSet.mem Terminal.error toks1 then (* [error] is a member of one set: it must be a member of the other set. *) TerminalSet.mem Terminal.error toks2 else if TerminalSet.mem Terminal.error toks2 then (* Symmetric condition. *) TerminalSet.mem Terminal.error toks1 else true end && loop (i+1) in loop 0 (* Union of two states. The two states must have the same core. The new state is obtained by pointwise union of the lookahead sets. 
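   A standalone sketch of this pointwise union, with sorted integer lists
   standing in for [TerminalSet] (a hypothetical stand-in; the real [union]
   below works over lookahead-set arrays of exactly this shape):

```ocaml
(* Stand-in: a lookahead set is a sorted integer list; a state is a pair
   of a core number and an array of such sets. *)
let union_sets s1 s2 =
  List.sort_uniq compare (s1 @ s2)

(* Pointwise union of two states with the same core. *)
let union_states (k1, toksr1) (k2, toksr2) =
  assert (k1 = k2);
  k1,
  Array.init (Array.length toksr1) (fun i ->
    union_sets toksr1.(i) toksr2.(i)
  )
```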
*) let union (k1, toksr1) (k2, toksr2) = assert (k1 = k2); k1, Array.init (Array.length toksr1) (fun i -> TerminalSet.union toksr1.(i) toksr2.(i) ) (* Restriction of a state to a set of tokens of interest. Every lookahead set is intersected with that set. *) let restrict toks (k, toksr) = k, Array.map (fun toksri -> TerminalSet.inter toksri toks ) toksr (* A (state-local, possibly nondeterministic) reduction table maps terminal symbols to lists of productions. *) type reductions = Production.index list TerminalMap.t (* [add_reduction prod tok reductions] adds a reduction of [prod] on [tok] to the table [reductions]. *) let add_reduction prod tok reductions = let prods = try TerminalMap.lookup tok reductions with Not_found -> [] in TerminalMap.add tok (prod :: prods) reductions (* [add_reductions prod toks reductions] adds a reduction of [prod] on every token in the set [toks] to the table [reductions]. *) let add_reductions prod toks reductions = TerminalSet.fold (add_reduction prod) toks reductions let reductions_table state = List.fold_left (fun reductions (toks, prod) -> add_reductions prod toks reductions ) TerminalMap.empty (reductions state) (* [reduction_tokens reductions] returns the domain of the reductions table [table], in the form of a set of tokens. *) let reduction_tokens reductions = TerminalMap.fold (fun tok _prods toks -> TerminalSet.add tok toks ) reductions TerminalSet.empty (* This inverts a mapping of tokens to productions into a mapping of productions to sets of tokens. *) (* This is needed, in [CodeBackend], to avoid producing two (or more) separate branches that call the same [reduce] function. Instead, we generate just one branch, guarded by a [POr] pattern. 
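   The inversion performed below can be sketched with association lists
   playing the roles of [TerminalMap] and [ProductionMap] (a hypothetical
   miniature; tokens and productions are plain integers here). Each token
   selects exactly one production, and the result maps each production to
   the sorted set of tokens that select it:

```ocaml
(* Invert a map of tokens to productions into a map of productions to
   sets of tokens. *)
let invert (reductions : (int * int) list) : (int * int list) list =
  List.fold_left (fun inverse (tok, prod) ->
    let toks = try List.assoc prod inverse with Not_found -> [] in
    (prod, List.sort_uniq compare (tok :: toks))
    :: List.remove_assoc prod inverse
  ) [] reductions
```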
*)

let invert reductions : TerminalSet.t ProductionMap.t =
  TerminalMap.fold (fun tok prods inverse ->
    let prod = Misc.single prods in
    let toks =
      try
        ProductionMap.lookup prod inverse
      with Not_found ->
        TerminalSet.empty
    in
    ProductionMap.add prod (TerminalSet.add tok toks) inverse
  ) reductions ProductionMap.empty

(* [has_eos_conflict transitions reductions] tells whether a state has an
   end-of-stream conflict, that is, a reduction action on [#] and at least
   one other (shift or reduce) action. *)

let has_eos_conflict transitions reductions =
  match TerminalMap.lookup_and_remove Terminal.sharp reductions with
  | exception Not_found ->
      (* There is no reduction action on [#], thus no conflict. *)
      false
  | prods, reductions ->
      (* There is at least one reduction action on [#]. *)
      (* If there are two reduction actions on [#], then we have a
         conflict. *)
      List.length prods > 1 ||
      (* If there is only one reduction on [#], then we have a conflict if
         and only if there exists another shift or reduce action. *)
      not (TerminalMap.is_empty reductions) ||
      SymbolMap.exists (fun symbol _ -> Symbol.is_terminal symbol) transitions

let has_eos_conflict_lr1state (state : lr1state) =
  has_eos_conflict (transitions state) (reductions_table state)

menhir-20200123/src/lr0.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

open Grammar

(* This module builds the LR(0) automaton associated with the grammar, then
   provides access to it. It also provides facilities for efficiently
   performing LR(1) constructions.
*) (* ------------------------------------------------------------------------ *) (* The LR(0) automaton. *) (* The nodes of the LR(0) automaton are numbered. *) type node = int (* This is the number of nodes in the LR(0) automaton. *) val n: int (* These are the automaton's entry states, indexed by the start productions. *) val entry: node ProductionMap.t (* A node can be converted to the underlying LR(0) set of items. This set is not closed. *) val items: node -> Item.Set.t (* The incoming symbol of an LR(0) node is the symbol carried by all of the edges that enter this node. A node has zero incoming edges (and, thus, no incoming symbol) if and only if it is a start node. *) val incoming_symbol: node -> Symbol.t option val incoming_edges: node -> node list (* The outgoing edges of a node. *) val outgoing_edges: node -> node SymbolMap.t val outgoing_symbols: node -> Symbol.t list (* ------------------------------------------------------------------------ *) (* Help for building the LR(1) automaton. *) (* An LR(1) state is internally represented as a pair of an LR(0) state number and an array of concrete lookahead sets (whose length depends on the LR(0) state). *) type lr1state (* An encoded LR(1) state can be turned into a concrete representation, that is, a mapping of items to concrete lookahead sets. *) type concretelr1state = TerminalSet.t Item.Map.t val export: lr1state -> concretelr1state (* One can take the closure of a concrete LR(1) state. *) val closure: concretelr1state -> concretelr1state (* The core of an LR(1) state is the underlying LR(0) state. *) val core: lr1state -> node (* One can create an LR(1) start state out of an LR(0) start node. *) val start: node -> lr1state (* Information about the transitions at a state. 
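For illustration, here is a sketch (not part of this interface) of a total
   lookup function built on top of [transitions]; it returns [None] when the
   state [s] has no outgoing transition along [sym]:

     let follow (sym : Symbol.t) (s : lr1state) : lr1state option =
       try Some (SymbolMap.find sym (transitions s))
       with Not_found -> None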
*) val transitions: lr1state -> lr1state SymbolMap.t val transition: Symbol.t -> lr1state -> lr1state (* [transition_tokens transitions] returns the set of tokens (terminal symbols) that are labels of outgoing transitions in the table [transitions]. *) val transition_tokens: 'target SymbolMap.t -> TerminalSet.t (* Information about the reductions at a state. *) (* See also [reductions_table] further on. *) val reductions: lr1state -> (TerminalSet.t * Production.index) list (* Equality of states. The two states must have the same core. Then, they are equal if and only if their lookahead sets are pointwise equal. *) val equal: lr1state -> lr1state -> bool (* A total order on states. The two states must have the same core. This is an arbitrary total order; it has nothing to do with set inclusion, which is a partial order; see [subsume] below. *) val compare: lr1state -> lr1state -> int (* Subsumption between states. The two states must have the same core. Then, one subsumes the other if and only if their lookahead sets are (pointwise) in the subset relation. *) val subsume: lr1state -> lr1state -> bool (* A slightly modified version of Pager's weak compatibility criterion. The two states must have the same core. *) val compatible: lr1state -> lr1state -> bool (* This function determines whether two (core-equivalent) states can be merged without creating an end-of-stream conflict. *) val eos_compatible: lr1state -> lr1state -> bool (* This function determines whether two (core-equivalent) states can be merged without creating spurious reductions on the [error] token. *) val error_compatible: lr1state -> lr1state -> bool (* Union of two states. The two states must have the same core. The new state is obtained by pointwise union of the lookahead sets. *) val union: lr1state -> lr1state -> lr1state (* Restriction of a state to a set of tokens of interest. Every lookahead set is intersected with that set. 
*) val restrict: TerminalSet.t -> lr1state -> lr1state (* The following functions display: 1- a concrete state; 2- a state (only the kernel, not the closure); 3- the closure of a state. The first parameter is a fixed string that is added at the beginning of every line. *) val print_concrete: string -> concretelr1state -> string val print: string -> lr1state -> string val print_closure: string -> lr1state -> string (* A (state-local, possibly nondeterministic) reduction table maps terminal symbols to lists of productions. *) type reductions = Production.index list TerminalMap.t (* [add_reduction prod tok reductions] adds a reduction of [prod] on [tok] to the table [reductions]. *) val add_reduction: Production.index -> Terminal.t -> reductions -> reductions (* [add_reductions prod toks reductions] adds a reduction of [prod] on every token in the set [toks] to the table [reductions]. *) val add_reductions: Production.index -> TerminalSet.t -> reductions -> reductions (* A table of the reductions at a state. *) val reductions_table: lr1state -> reductions (* [invert] inverts a reduction table (that is, a mapping of tokens to lists of productions), producing a mapping of productions to sets of tokens. *) val invert : reductions -> TerminalSet.t ProductionMap.t (* [reduction_tokens reductions] returns the domain of the reductions table [table], in the form of a set of tokens. *) val reduction_tokens: reductions -> TerminalSet.t (* [has_eos_conflict transitions reductions] tells whether a state has an end-of-stream conflict, that is, a reduction action on [#] and at least one other (shift or reduce) action. 
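As an illustration, the following hypothetical grammar fragment gives rise
   to such a conflict:

     %token INT PLUS
     %start<unit> main
     main: e
     e: e PLUS e | INT

   In the state reached after recognizing an [e], the automaton faces a choice
   between reducing the start production on [#], that is, accepting, and
   shifting [PLUS]: an end-of-stream conflict.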
*) val has_eos_conflict: 'target SymbolMap.t -> reductions -> bool val has_eos_conflict_lr1state: lr1state -> bool menhir-20200123/src/lr1.ml000066400000000000000000001030011361226111300150350ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* This module first constructs an LR(1) automaton by using [Lr1construction]. Then, this automaton is further transformed (in place), in three steps: - Silent conflict resolution (without warnings), following the user's precedence declarations. This is done immediately. This can remove transitions and reductions. - Default conflict resolution (with warnings), following a fixed default policy. This is done via an explicit call to [default_conflict_resolution()]. This can remove reductions. - Addition of extra reductions, following the user's [%on_error_reduce] declarations. This is done via an explicit call to [extra_reductions()]. Conflicts are explained after step 1, and before steps 2 and 3. This is the main reason why these steps are separate. *) (* -------------------------------------------------------------------------- *) (* Run the SLR(1) check first. *) let () = Slr.check() (* -------------------------------------------------------------------------- *) (* Run the construction algorithm. 
*) module Raw = Lr1construction.Run() let () = Error.logA 1 (fun f -> Printf.fprintf f "Built an LR(1) automaton with %d states.\n" Raw.n ) (* -------------------------------------------------------------------------- *) (* In the following, we perform a depth-first traversal of the raw automaton that was built above. As we go, we perform silent conflict resolution (which can remove some transitions and therefore make some raw nodes unreachable) and we assign consecutive numbers to the reachable nodes. *) (* We define our own type [node] to be the number of a reachable node in this new numbering system. *) type node = int (* -------------------------------------------------------------------------- *) (* All of the following mutable state is modified or initialized during the depth-first traversal below. *) (* All of the arrays below are indexed by raw node numbers. *) module M = struct let marked : bool array = Array.make Raw.n false let mark (node : Raw.node) = marked.(Raw.number node) <- true let is_marked (node : Raw.node) = marked.(Raw.number node) end (* This array is initialized during the traversal. We assign new consecutive numbers to the reachable nodes. *) let unreachable = -1 let _number : node array = Array.make Raw.n unreachable let transport (raw_node : Raw.node) : node = _number.(Raw.number raw_node) (* This array of transitions is initialized here with the data supplied by [Raw.transitions]. Then, some transitions are *removed* (because of conflict resolution) during the traversal. *) let transitions : Raw.node SymbolMap.t array = Array.init Raw.n (fun i -> Raw.transitions (Raw.node i) ) (* Transitions are also stored in reverse, so as to allow reverse traversals of the automaton. This array is populated during the traversal. *) let predecessors : node list array = Array.make Raw.n [] (* This array is initialized during the traversal. 
*) let reductions : Lr0.reductions array = Array.make Raw.n TerminalMap.empty (* dummy *) (* Tokens for which there are several possible actions in the raw LR(1) automaton are known as conflict tokens. This array is populated during the traversal. *) let conflict_tokens : TerminalSet.t array = Array.make Raw.n TerminalSet.empty (* (New as of 2012/01/23.) This flag records whether a shift/reduce conflict in this node was solved in favor of neither (%nonassoc). This is later used to forbid a default reduction at this node. *) let forbid_default_reduction : bool array = Array.make Raw.n false (* A count of all reachable nodes. *) let n = ref 0 (* A list of nodes with conflicts. *) let conflict_nodes : node list ref = ref [] (* Counts of nodes with shift/reduce and reduce/reduce conflicts. *) let shift_reduce = ref 0 let reduce_reduce = ref 0 (* Count of the shift/reduce conflicts that could be silently resolved. *) let silently_solved = ref 0 (* -------------------------------------------------------------------------- *) (* A view of the raw LR(1) automaton as a graph. *) (* This view relies on the [transitions] array, as opposed to the function [Raw.transitions]. This means that, once an edge has been removed, it can no longer be followed. *) module ForwardEdges = struct type node = Raw.node type label = Symbol.t let foreach_outgoing_edge node f = let i = Raw.number node in SymbolMap.iter f transitions.(i) let foreach_root f = ProductionMap.iter (fun _prod node -> f node) Raw.entry end (* -------------------------------------------------------------------------- *) (* This function is invoked during the traversal when a node is discovered. *) (* It is in charge of detecting and resolving conflicts at this node. *) let discover (raw_node : Raw.node) = let i = Raw.number raw_node and state = Raw.state raw_node in (* Number this node. *) let node : node = Misc.postincrement n in _number.(i) <- node; (* Detect conflicts. 
We iterate over the table [Lr0.reductions_table state], which gives us all potential reductions. The code is written in such a way that we are aware of multi-way shift/reduce/reduce conflicts. We solve conflicts, when unambiguously allowed by priorities, by removing certain transitions and reductions. *) let has_shift_reduce = ref false and has_reduce_reduce = ref false in let foreach_reduction f = TerminalMap.fold f (Lr0.reductions_table state) TerminalMap.empty in reductions.(i) <- foreach_reduction begin fun tok prods reductions -> assert (prods <> []); if SymbolMap.mem (Symbol.T tok) transitions.(i) then begin (* There is a transition in addition to the reduction(s). We have (at least) a shift/reduce conflict. *) assert (not (Terminal.equal tok Terminal.sharp)); if List.length prods = 1 then begin let prod = List.hd prods in (* This is a single shift/reduce conflict. If priorities tell us how to solve it, we follow that and modify the automaton. *) match Precedence.shift_reduce tok prod with | Precedence.ChooseShift -> (* Suppress the reduce action. *) incr silently_solved; reductions | Precedence.ChooseReduce -> (* Record the reduce action and suppress the shift transition. The automaton is modified in place. This can have the subtle effect of making some nodes unreachable. Any conflicts in these nodes are then ignored (as they should be). *) incr silently_solved; transitions.(i) <- SymbolMap.remove (Symbol.T tok) transitions.(i); TerminalMap.add tok prods reductions | Precedence.ChooseNeither -> (* Suppress both the reduce action and the shift transition. *) incr silently_solved; transitions.(i) <- SymbolMap.remove (Symbol.T tok) transitions.(i); forbid_default_reduction.(i) <- true; reductions | Precedence.DontKnow -> (* Priorities don't allow concluding. Record the existence of a shift/reduce conflict. 
*) conflict_tokens.(i) <- Grammar.TerminalSet.add tok conflict_tokens.(i); has_shift_reduce := true; TerminalMap.add tok prods reductions end else begin (* At least two reductions are enabled, so this is a shift/reduce/reduce conflict. If the priorities are such that each individual shift/reduce conflict is solved in favor of shifting or in favor of neither, then solve the entire composite conflict in the same way. Otherwise, report the conflict. *) let choices = List.map (Precedence.shift_reduce tok) prods in if List.for_all (fun choice -> match choice with | Precedence.ChooseShift -> true | _ -> false ) choices then begin (* Suppress the reduce action. *) silently_solved := !silently_solved + List.length prods; reductions end else if List.for_all ((=) Precedence.ChooseNeither) choices then begin (* Suppress the reduce action and the shift transition. *) silently_solved := !silently_solved + List.length prods; transitions.(i) <- SymbolMap.remove (Symbol.T tok) transitions.(i); reductions end else begin (* Record a shift/reduce/reduce conflict. Keep all reductions. *) conflict_tokens.(i) <- Grammar.TerminalSet.add tok conflict_tokens.(i); has_shift_reduce := true; has_reduce_reduce := true; TerminalMap.add tok prods reductions end end end else begin (* There is no transition in addition to the reduction(s). *) if List.length prods >= 2 then begin (* If there are multiple reductions, then we have a pure reduce/reduce conflict. Do nothing about it at this point. *) conflict_tokens.(i) <- Grammar.TerminalSet.add tok conflict_tokens.(i); has_reduce_reduce := true end; TerminalMap.add tok prods reductions end end; (* Record statistics about conflicts. 
*) if not (TerminalSet.is_empty conflict_tokens.(i)) then begin conflict_nodes := node :: !conflict_nodes; if !has_shift_reduce then incr shift_reduce; if !has_reduce_reduce then incr reduce_reduce end (* -------------------------------------------------------------------------- *) (* This function is invoked during the traversal when an edge is traversed. *) (* It records an edge in the predecessor array. *) let traverse (source : Raw.node) _symbol (target : Raw.node) = (* The source node has been discovered and numbered already, so it can be transported. (This is not necessarily true of the target node.) *) let j = Raw.number target in predecessors.(j) <- transport source :: predecessors.(j) (* -------------------------------------------------------------------------- *) (* Perform the depth-first traversal of the raw automaton. *) let () = let module D = struct let traverse = traverse let discover = discover end in let module R = DFS.Run(ForwardEdges)(M)(D) in () let () = if !silently_solved = 1 then Error.logA 1 (fun f -> Printf.fprintf f "One shift/reduce conflict was silently solved.\n" ) else if !silently_solved > 1 then Error.logA 1 (fun f -> Printf.fprintf f "%d shift/reduce conflicts were silently solved.\n" !silently_solved ); if !n < Raw.n then Error.logA 1 (fun f -> Printf.fprintf f "Only %d states remain after resolving shift/reduce conflicts.\n" !n ) let () = Grammar.diagnostics() (* -------------------------------------------------------------------------- *) (* Most of our mutable state becomes frozen at this point. Some transitions and reductions can still be removed further on, when default conflict resolution is performed. Also, some reductions can still be added further on, when [%on_error_reduce] declarations are obeyed. *) let n = !n let conflict_nodes = !conflict_nodes (* We need a mapping of nodes to raw node numbers -- the inverse of the array [number]. 
*) (* While building this mapping, we must remember that [number.(i)] is [unreachable] if the raw node number [i] has never been reached by the traversal. *) let raw : node -> int = let raw = Array.make n (-1) (* dummy *) in Array.iteri (fun i (* raw index *) (node : node) -> assert (0 <= i && i < Raw.n); if node <> unreachable then begin assert (0 <= node && node < n); raw.(node) <- i end ) _number; fun node -> assert (0 <= node && node < n); raw.(node) (* The array [transitions] is re-constructed so as to map nodes to nodes (instead of raw nodes to raw nodes). This array is now frozen; it is no longer modified. *) let transitions : node SymbolMap.t array = Array.init n (fun node -> SymbolMap.map transport transitions.(raw node) ) (* The array [predecessors] is now frozen. *) (* The array [reductions] is *not* yet frozen. *) (* The array [conflict_tokens] is now frozen. *) (* -------------------------------------------------------------------------- *) (* Accessors. *) let number node = node let entry = ProductionMap.map transport Raw.entry let state node = Raw.state (Raw.node (raw node)) let transitions node = assert (0 <= node && node < n); transitions.(node) let set_reductions node table = reductions.(raw node) <- table let reductions node = reductions.(raw node) let predecessors node = predecessors.(raw node) module BackwardEdges = struct type nonrec node = node type label = unit let foreach_outgoing_edge node f = List.iter (fun node -> f () node) (predecessors node) end let conflict_tokens node = conflict_tokens.(raw node) let conflicts f = List.iter (fun node -> f (conflict_tokens node) node ) conflict_nodes let forbid_default_reduction node = forbid_default_reduction.(raw node) (* -------------------------------------------------------------------------- *) (* The incoming symbol of a node can be computed by going through its LR(0) core. For this reason, we do not need to explicitly record it here. 
*) let incoming_symbol node = Lr0.incoming_symbol (Lr0.core (state node)) (* -------------------------------------------------------------------------- *) (* Iteration over all nodes. *) let fold f accu = let accu = ref accu in for node = 0 to n - 1 do accu := f !accu node done; !accu let iter f = for node = 0 to n - 1 do f node done let map f = List.rev ( fold (fun accu node -> f node :: accu ) [] ) let foldx f = fold (fun accu node -> match incoming_symbol node with | None -> accu | Some _ -> f accu node ) let iterx f = iter (fun node -> match incoming_symbol node with | None -> () | Some _ -> f node ) (* -------------------------------------------------------------------------- *) (* We build a map of each symbol to the (reachable) nodes that have this incoming symbol. *) let lookup symbol index = try SymbolMap.find symbol index with Not_found -> [] let index : node list SymbolMap.t = fold (fun index node -> match incoming_symbol node with | None -> index | Some symbol -> SymbolMap.add symbol (node :: lookup symbol index) index ) SymbolMap.empty (* This allows iterating over all nodes that are targets of edges carrying a certain symbol. The sources of the corresponding edges are also provided. *) let targets f accu symbol = (* There are no incoming transitions on the start symbols. *) let targets = lookup symbol index in List.fold_left (fun accu target -> f accu (predecessors target) target ) accu targets (* -------------------------------------------------------------------------- *) (* Our output channel. *) let out = lazy (open_out (Settings.base ^ ".automaton")) (* -------------------------------------------------------------------------- *) (* If requested, dump a verbose description of the automaton. 
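For illustration, an entry in this description has roughly the following
   shape, where the state number, symbols, and production are hypothetical;
   the exact format is produced by [describe] below:

     State 42:
     [the state's items, as printed by Lr0.print]
     -- On PLUS shift to state 17
     -- On SEMI reduce production e -> e PLUS e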
*) let describe out node = Printf.fprintf out "State %d%s:\n%s" (number node) (if Settings.follow then Printf.sprintf " (r%d)" (raw node) else "") (Lr0.print "" (state node)); SymbolMap.iter (fun symbol node -> Printf.fprintf out "-- On %s shift to state %d\n" (Symbol.print symbol) (number node) ) (transitions node); (* TEMPORARY In the following, one might wish to group all symbols that lead to reducing a common production. *) TerminalMap.iter (fun tok prods -> List.iter (fun prod -> Printf.fprintf out "-- On %s " (Terminal.print tok); match Production.classify prod with | Some nt -> Printf.fprintf out "accept %s\n" (Nonterminal.print false nt) | None -> Printf.fprintf out "reduce production %s\n" (Production.print prod) ) prods ) (reductions node); if not (TerminalSet.is_empty (conflict_tokens node)) then Printf.fprintf out "** Conflict on %s\n" (TerminalSet.print (conflict_tokens node)); Printf.fprintf out "\n%!" let () = Time.tick "Construction of the LR(1) automaton"; if Settings.dump then begin iter (describe (Lazy.force out)); Time.tick "Dumping the LR(1) automaton" end (* -------------------------------------------------------------------------- *) (* Converting a start node into the single item that it contains. *) let start2item node = let state : Lr0.lr1state = state node in let core : Lr0.node = Lr0.core state in let items : Item.Set.t = Lr0.items core in assert (Item.Set.cardinal items = 1); Item.Set.choose items (* -------------------------------------------------------------------------- *) (* [has_beforeend s] tests whether the state [s] can reduce a production whose semantic action uses [$endpos($0)]. Note that [$startpos] and [$endpos] have been expanded away already, so we need not worry about the fact that (in an epsilon production) they expand to [$endpos($0)]. 
*) let has_beforeend node = TerminalMap.fold (fun _ prods accu -> accu || let prod = Misc.single prods in not (Production.is_start prod) && let action = Production.action prod in Action.has_beforeend action ) (reductions node) false (* -------------------------------------------------------------------------- *) (* Computing which terminal symbols a state is willing to act upon. One must keep in mind that, due to the merging of states, a state might be willing to perform a reduction on a certain token, yet the reduction can take us to another state where this token causes an error. In other words, the set of terminal symbols that is computed here is really an over-approximation of the set of symbols that will not cause an error. And there seems to be no way of performing an exact computation, as we would need to know not only the current state, but the contents of the stack as well. *) let acceptable_tokens (s : node) = (* If this state is willing to act on the error token, ignore it -- we do not wish to report that an error would be accepted in this state :-) *) let transitions = SymbolMap.remove (Symbol.T Terminal.error) (transitions s) and reductions = TerminalMap.remove Terminal.error (reductions s) in (* Accumulate the tokens carried by outgoing transitions. *) let covered = SymbolMap.fold (fun symbol _ covered -> match symbol with | Symbol.T tok -> TerminalSet.add tok covered | Symbol.N _ -> covered ) transitions TerminalSet.empty in (* Accumulate the tokens that permit reduction. *) let covered = ProductionMap.fold (fun _ toks covered -> TerminalSet.union toks covered ) (Lr0.invert reductions) covered in (* That's it. *) covered (* -------------------------------------------------------------------------- *) (* Report statistics. *) (* Produce the reports. *) let () = if !shift_reduce = 1 then Error.grammar_warning [] "one state has shift/reduce conflicts." else if !shift_reduce > 1 then Error.grammar_warning [] "%d states have shift/reduce conflicts." 
!shift_reduce; if !reduce_reduce = 1 then Error.grammar_warning [] "one state has reduce/reduce conflicts." else if !reduce_reduce > 1 then Error.grammar_warning [] "%d states have reduce/reduce conflicts." !reduce_reduce (* -------------------------------------------------------------------------- *) (* Instantiate [Set] and [Map] on the type [node]. *) module Node = struct type t = node let compare = (-) end module NodeSet = Set.Make(Node) module NodeMap = Map.Make(Node) (* -------------------------------------------------------------------------- *) (* For each production, compute where (that is, in which states) this production can be reduced. This computation is done AFTER default conflict resolution (see below). It is an error to call the accessor function [production_where] before default conflict resolution has taken place. *) let production_where : NodeSet.t ProductionMap.t option ref = ref None let initialize_production_where () = production_where := Some ( fold (fun accu node -> TerminalMap.fold (fun _ prods accu -> let prod = Misc.single prods in let nodes = try ProductionMap.lookup prod accu with Not_found -> NodeSet.empty in ProductionMap.add prod (NodeSet.add node nodes) accu ) (reductions node) accu ) ProductionMap.empty ) let production_where (prod : Production.index) : NodeSet.t = match !production_where with | None -> (* It is an error to call this function before conflict resolution. *) assert false | Some production_where -> try (* Production [prod] may be reduced at [nodes]. *) let nodes = ProductionMap.lookup prod production_where in assert (not (NodeSet.is_empty nodes)); nodes with Not_found -> (* The production [prod] is never reduced. *) NodeSet.empty (* -------------------------------------------------------------------------- *) (* Warn about productions that are never reduced. *) (* These are productions that can never, ever be reduced, because there is no state that is willing to reduce them. 
There could be other productions that are never reduced because the only states that are willing to reduce them are unreachable. We do not report those. In fact, through the use of the inspection API, it might be possible to bring the automaton into a state where one of those productions can be reduced. *) let warn_about_productions_never_reduced () = let count = ref 0 in Production.iter (fun prod -> if NodeSet.is_empty (production_where prod) then match Production.classify prod with | Some nt -> incr count; Error.grammar_warning (Nonterminal.positions nt) "symbol %s is never accepted." (Nonterminal.print false nt) | None -> incr count; Error.grammar_warning (Production.positions prod) "production %sis never reduced." (Production.print prod) ); if !count > 0 then let plural_mark, be = if !count > 1 then ("s", "are") else ("", "is") in Error.grammar_warning [] "in total, %d production%s %s never reduced." !count plural_mark be (* -------------------------------------------------------------------------- *) (* When requested by the code generator, apply default conflict resolution to ensure that the automaton is deterministic. *) (* [best prod prods] chooses which production should be reduced among the list [prod :: prods]. It fails if no best choice exists. *) let rec best choice = function | [] -> choice | prod :: prods -> match Precedence.reduce_reduce choice prod with | Some choice -> best choice prods | None -> (* The cause for not knowing which production is best could be: 1- the productions originate in different source files; 2- they are derived, via inlining, from the same production. *) Error.signal Error.grammatical_error (Production.positions choice @ Production.positions prod) "do not know how to resolve a reduce/reduce conflict\n\ between the following two productions:\n%s\n%s" (Production.print choice) (Production.print prod); choice (* dummy *) (* Go ahead. 
*) let default_conflict_resolution () = let shift_reduce = ref 0 and reduce_reduce = ref 0 in conflict_nodes |> List.iter (fun node -> set_reductions node ( TerminalMap.fold (fun tok prods reductions -> try let (_ : node) = SymbolMap.find (Symbol.T tok) (transitions node) in (* There is a transition at this symbol, so this is a (possibly multiway) shift/reduce conflict. Resolve in favor of shifting by suppressing all reductions. *) shift_reduce := List.length prods + !shift_reduce; reductions with Not_found -> (* There is no transition at this symbol. Check whether we have multiple reductions. *) match prods with | [] -> assert false | [ _ ] -> TerminalMap.add tok prods reductions | prod :: ((_ :: _) as prods) -> (* We have a reduce/reduce conflict. Resolve, if possible, in favor of a single reduction. This reduction must be preferable to each of the others. *) reduce_reduce := List.length prods + !reduce_reduce; TerminalMap.add tok [ best prod prods ] reductions ) (reductions node) TerminalMap.empty ) ); if !shift_reduce = 1 then Error.warning [] "one shift/reduce conflict was arbitrarily resolved." else if !shift_reduce > 1 then Error.warning [] "%d shift/reduce conflicts were arbitrarily resolved." !shift_reduce; if !reduce_reduce = 1 then Error.warning [] "one reduce/reduce conflict was arbitrarily resolved." else if !reduce_reduce > 1 then Error.warning [] "%d reduce/reduce conflicts were arbitrarily resolved." !reduce_reduce; (* Now, detect and remove end-of-stream conflicts. If a state has both a reduce action at [#] and some other (shift or reduce) action, this is an end-of-stream conflict. This conflict is resolved by suppressing the reduce action at [#]. *) let ambiguities = ref 0 in iter begin fun node -> let transitions = transitions node and reductions = reductions node in if Lr0.has_eos_conflict transitions reductions then begin (* Suppress the reduce action at [#].
*) let prods, reductions = TerminalMap.lookup_and_remove Terminal.sharp reductions in set_reductions node reductions; (* We can assume that there is only one reduction on [#]. *) let prod = Misc.single prods in (* Count this end-of-stream conflict. *) incr ambiguities; (* Signal this end-of-stream conflict in the .automaton file. *) if Settings.dump then begin (* Compute the tokens involved in the transitions and remaining reductions. *) let toks = TerminalSet.union (Lr0.transition_tokens transitions) (Lr0.reduction_tokens reductions) in (* Emit a message. *) Printf.fprintf (Lazy.force out) "State %d has an end-of-stream conflict. There is a tension between\n\ (1) %s\n\ without even requesting a lookahead token, and\n\ (2) checking whether the lookahead token is %s%s,\n\ which would require some other action.\n\n" (number node) (match Production.classify prod with | Some nt -> Printf.sprintf "accepting %s" (Nonterminal.print false nt) | None -> Printf.sprintf "reducing production %s" (Production.print prod)) (if TerminalSet.cardinal toks > 1 then "one of " else "") (TerminalSet.print toks) end end end; if !ambiguities = 1 then Error.grammar_warning [] "one state has an end-of-stream conflict." else if !ambiguities > 1 then Error.grammar_warning [] "%d states have an end-of-stream conflict." !ambiguities; (* We can now compute where productions are reduced. *) initialize_production_where(); warn_about_productions_never_reduced() (* -------------------------------------------------------------------------- *) (* Extra reductions. *) (* 2015/10/19 Original implementation. *) (* 2016/07/13 Use priority levels to choose which productions to reduce when several productions are eligible. 
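For example (hypothetical declarations, shown for illustration only), a
   grammar might contain

     %on_error_reduce expr
     %on_error_reduce stmt

   and a single state might be able to reduce both a production of [expr] and
   a production of [stmt]; the priority ordering among these declarations, as
   implemented by [OnErrorReduce.preferable] below, then determines which
   production (if any) is preferred.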
*) (* If a state can reduce some productions whose left-hand symbol has been marked [%on_error_reduce], and if one such production [prod] is preferable to every other (according to the priority rules of [%on_error_reduce] declarations), then every error action in this state is replaced with a reduction of [prod]. This is done even though this state may have outgoing shift transitions: thus, we are forcing one interpretation of the past, among several possible interpretations. *) (* The code below looks like the decision on a default reduction in [Default], except we do not impose the absence of outgoing terminal transitions. Also, we actually modify the automaton, so the back-ends, the reference interpreter, etc., need not be aware of this feature, whereas they are aware of default reductions. *) (* This code can run before we decide on the default reductions; this does not affect which default reductions will be permitted. *) (* This code does not affect which productions can be reduced where. Thus, it is OK for it to run after [initialize_production_where()]. *) (* A count of how many states receive extra reductions through this mechanism. *) let extra = ref 0 (* A count of how many states have more than one eligible production, but one is preferable to every other (so priority plays a role). *) let prioritized = ref 0 (* The set of nonterminal symbols in the left-hand side of an extra reduction. *) let extra_nts = ref NonterminalSet.empty let extra_reductions_in_node node = (* Compute the productions which this node can reduce. *) let productions : _ ProductionMap.t = Lr0.invert (reductions node) in let prods : Production.index list = ProductionMap.fold (fun prod _ prods -> prod :: prods) productions [] in (* Keep only those whose left-hand symbol is marked [%on_error_reduce]. *) let prods = List.filter OnErrorReduce.reduce prods in (* Check if one of them is preferable to every other one. 
*) match Misc.best OnErrorReduce.preferable prods with | None -> (* Either no production is marked [%on_error_reduce], or several of them are marked and none is preferable. *) () | Some prod -> let acceptable = acceptable_tokens node in (* An extra reduction is possible. Replace every error action with a reduction of [prod]. If we replace at least one error action with a reduction, update [extra] and [extra_nts]. *) let triggered = lazy ( incr extra; if List.length prods > 1 then incr prioritized; extra_nts := NonterminalSet.add (Production.nt prod) !extra_nts ) in Terminal.iter_real (fun tok -> if not (TerminalSet.mem tok acceptable) then begin set_reductions node (TerminalMap.add tok [ prod ] (reductions node)); Lazy.force triggered end ) let extra_reductions () = (* Examine every node. *) iter (fun node -> (* Just like a default reduction, an extra reduction should be forbidden (it seems) if [forbid_default_reduction] is set. *) if not (forbid_default_reduction node) then extra_reductions_in_node node ); (* Info message. *) if !extra > 0 then Error.logA 1 (fun f -> Printf.fprintf f "Extra reductions on error were added in %d states.\n" !extra; Printf.fprintf f "Priority played a role in %d of these states.\n" !prioritized ); (* Warn about useless %on_error_reduce declarations. *) OnErrorReduce.iter (fun nt -> if not (NonterminalSet.mem nt !extra_nts) then Error.grammar_warning [] "the declaration %%on_error_reduce %s is never useful." (Nonterminal.print false nt) ) (* -------------------------------------------------------------------------- *) (* Define [fold_entry], which in some cases facilitates the use of [entry]. 
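The [triggered] value in [extra_reductions_in_node] above exploits a standard OCaml idiom: wrapping side effects in [lazy] so that forcing the suspension many times performs them exactly once. A minimal self-contained illustration:

```ocaml
(* A side effect wrapped in [lazy] runs at most once, on the first
   [Lazy.force]; subsequent forces return the cached unit result. *)
let count = ref 0
let triggered = lazy (incr count)

let () =
  Lazy.force triggered;
  Lazy.force triggered;
  Lazy.force triggered
  (* [count] is now 1, not 3. *)
```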
*) let fold_entry f accu = ProductionMap.fold (fun prod state accu -> let nt : Nonterminal.t = match Production.classify prod with | Some nt -> nt | None -> assert false (* this is a start production *) in let t : Stretch.ocamltype = Nonterminal.ocamltype_of_start_symbol nt in f prod state nt t accu ) entry accu let entry_of_nt nt = (* Find the entry state that corresponds to [nt]. *) try ProductionMap.find (Production.startsymbol2startprod nt) entry with Not_found -> assert false exception Found of Nonterminal.t let nt_of_entry s = (* [s] should be an initial state. *) assert (incoming_symbol s = None); try ProductionMap.iter (fun prod entry -> if Node.compare s entry = 0 then match Production.classify prod with | None -> assert false | Some nt -> raise (Found nt) ) entry; (* This should not happen if [s] is indeed an initial state. *) assert false with Found nt -> nt menhir-20200123/src/lr1.mli000066400000000000000000000160061361226111300152160ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar (* This module constructs an LR(1) automaton by following Pager's method, that is, by merging states on the fly when they are found to be (weakly) compatible. *) (* Shift/reduce conflicts are silently resolved when (and only when) that is allowed in a clean way by user-specified priorities. This includes shift/reduce/reduce conflicts when (and only when) there is agreement that the shift action should be preferred. 
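[nt_of_entry] above escapes [ProductionMap.iter] by raising [Found] as soon as the witness is located, a classic OCaml way of short-circuiting an iteration. The same idiom over plain lists:

```ocaml
(* Escaping an [iter] loop with an exception, as in [nt_of_entry]. *)
exception Found of int

let find_first (p : int -> bool) (xs : int list) : int option =
  try
    List.iter (fun x -> if p x then raise (Found x)) xs;
    None
  with Found x -> Some x
```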
Conflicts that cannot be silently resolved in this phase will be reported, explained, and arbitrarily resolved immediately before code generation. *) (* ------------------------------------------------------------------------- *) (* Accessors. *) (* This is the type of the automaton's nodes. *) type node module Node : Set.OrderedType with type t = node module NodeSet : Set.S with type elt = node module NodeMap : Map.S with type key = node (* These are the automaton's entry states, indexed by the start productions. *) val entry: node ProductionMap.t (* [fold_entry] folds over [entry]. For convenience, it gives access not only to the start production and start state, but also to the nonterminal symbol and to the OCaml type associated with this production. *) val fold_entry: (Production.index -> node -> Nonterminal.t -> Stretch.ocamltype -> 'a -> 'a) -> 'a -> 'a (* [entry_of_nt] maps a (user) non-terminal start symbol to the corresponding start state. [nt_of_entry] does the reverse. *) val entry_of_nt: Nonterminal.t -> node val nt_of_entry: node -> Nonterminal.t (* Nodes are numbered sequentially from [0] to [n-1]. *) val n: int val number: node -> int (* This provides access to the LR(1) state that a node stands for. *) val state: node -> Lr0.lr1state (* This converts a start node into the single item that it contains. *) val start2item: node -> Item.t (* This maps a node to its incoming symbol, that is, the symbol carried by all of the edges that enter this node. A node has zero incoming edges (and, thus, no incoming symbol) if and only if it is a start node. *) val incoming_symbol: node -> Symbol.t option (* This maps a node to its predecessors. *) val predecessors: node -> node list (* A view of the backward (reverse) edges as a graph. *) module BackwardEdges : sig type nonrec node = node type label = unit val foreach_outgoing_edge: node -> (label -> node -> unit) -> unit end (* This provides access to a node's transitions and reductions. 
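The [Node], [NodeSet] and [NodeMap] modules in this interface are obtained, in the usual way, by equipping nodes with a total order and applying the standard functors. A sketch under the assumption (suggested by the numbering discussed in this interface, but not shown here) that nodes are compared by their sequential number:

```ocaml
(* Assumed: nodes carry a unique number and are ordered by it. *)
module Node = struct
  type t = { number : int }
  let compare (a : t) (b : t) = compare a.number b.number
end

module NodeSet = Set.Make (Node)
module NodeMap = Map.Make (Node)
```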
*) val transitions: node -> node SymbolMap.t val reductions: node -> Production.index list TerminalMap.t (* or: node -> Lr0.reductions *) (* (New as of 2012/01/23.) This tells whether a shift/reduce conflict in this node was solved in favor of neither (%nonassoc). This implies that one must forbid a default reduction at this node. *) val forbid_default_reduction: node -> bool (* [has_beforeend s] tests whether the state [s] can reduce a production whose semantic action uses [$endpos($0)]. Note that [$startpos] and [$endpos] have been expanded away already, so we need not worry about the fact that (in an epsilon production) they expand to [$endpos($0)]. *) val has_beforeend: node -> bool (* Computing which terminal symbols a state is willing to act upon. This function is currently unused, but could be used as part of an error reporting system. *) val acceptable_tokens: node -> TerminalSet.t (* Iteration over all nodes. The order in which elements are examined, and the order of [map]'s output list, correspond to the numeric indices produced by [number] above. *) val fold: ('a -> node -> 'a) -> 'a -> 'a val iter: (node -> unit) -> unit val map: (node -> 'a) -> 'a list (* Iteration over non-start nodes *) val foldx: ('a -> node -> 'a) -> 'a -> 'a val iterx: (node -> unit) -> unit (* Iteration over all edges that carry a certain symbol. Edges are grouped in families, where all edges in a single family have the same target node. [targets f accu symbol] invokes [f accu sources target] once for every family, where [sources] are the sources of the edges in the family and [target] is their common target. *) val targets: ('a -> node list -> node -> 'a) -> 'a -> Symbol.t -> 'a (* Iteration over all nodes with conflicts. [conflicts f] invokes [f toks node] once for every node [node] with a conflict, where [toks] are the tokens involved in the conflicts at that node. 
*) val conflicts: (TerminalSet.t -> node -> unit) -> unit (* ------------------------------------------------------------------------- *) (* Modifications of the automaton. *) (* This function performs default conflict resolution. First, it resolves standard (shift/reduce and reduce/reduce) conflicts (thus ensuring that the automaton is deterministic) by removing some reduction actions. Second, it resolves end-of-stream conflicts by ensuring that states that have a reduce action at the pseudo-token "#" have no other action. It is called after conflicts have been explained and before code generation takes place. The automaton is modified in place. *) val default_conflict_resolution: unit -> unit (* This function adds extra reduction actions in the face of an error, if requested by the user via [%on_error_reduce]. *) (* It must be called after conflict resolution has taken place. The automaton is modified in place. *) (* If a state can reduce only one production, whose left-hand symbol has been declared [%on_error_reduce], then every error action in this state is replaced with a reduction action. This is done even though this state may have outgoing shift transitions: thus, we are forcing one interpretation of the past, among several possible interpretations. *) val extra_reductions: unit -> unit (* ------------------------------------------------------------------------- *) (* Information about which productions are reduced and where. *) (* [production_where prod] is the set of all states [s] where production [prod] might be reduced. It is an error to call this function before default conflict resolution has taken place. 
*) val production_where: Production.index -> NodeSet.t menhir-20200123/src/lr1construction.ml000066400000000000000000000347471361226111300175340ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module constructs an LR(1) automaton for the grammar described by the module [Grammar]. *) (* Depending on Menhir's settings, one of four methods is used: - Canonical mode: no merging of states at all. - Inclusion only: a new state can be merged into a larger existing state. The reverse is forbidden, though: a smaller existing state will not be grown if a new larger state must be created. This mode is undocumented and may be removed in the future. - Normal mode: a version of Pager's weak compatibility criterion is used to determine which states can be merged. Merging is performed on the fly, during construction. A proposed new state can be merged with an existing state, which is then grown. - LALR mode: two states are merged as soon as they have the same LR(0) core. *) open Grammar type lr1state = Lr0.lr1state module Run () = struct (* -------------------------------------------------------------------------- *) (* Nodes. *) type node = { (* An internal node number assigned during construction. This number appears in Menhir's output when [--follow-construction] is set. This number is also exposed to the client so as to allow building efficient maps over nodes. It otherwise has no use. *) number: int; (* Each node is associated with a state. This state can change during construction as nodes are merged. 
*) mutable state: lr1state; (* Each node carries information about its outgoing transitions towards other nodes. *) mutable transitions: node SymbolMap.t; } (* -------------------------------------------------------------------------- *) (* Output debugging information if [--follow-construction] is enabled. *) let follow_transition (again : bool) (source : node) (symbol : Symbol.t) (state : lr1state) = if Settings.follow then Printf.fprintf stderr "%s transition out of state r%d along symbol %s.\n\ Proposed target state:\n%s" (if again then "Re-examining" else "Examining") source.number (Symbol.print symbol) (Lr0.print_closure "" state) let follow_state (msg : string) (node : node) (print : bool) = if Settings.follow then Printf.fprintf stderr "%s: r%d.\n%s\n" msg node.number (if print then Lr0.print_closure "" node.state else "") (* -------------------------------------------------------------------------- *) (* The following two mutually recursive functions are invoked when the state associated with an existing node grows. The node's descendants are examined and grown until a fixpoint is reached. This work is performed in an eager manner: we do not attempt to build any new transitions until all existing nodes have been suitably grown. Indeed, building new transitions requires making merging decisions, and such decisions cannot be made on a sound basis unless all existing nodes have been suitably grown. Otherwise, one could run into a dead end where two successive, incompatible merging decisions are made, because the consequences of the first decision (growing descendant nodes) were not made explicit before the second decision was taken. This was a bug in versions of Menhir ante 20070520. Although I wrote this code independently, I later found out that it seems quite similar to the code in Karl Schimpf's Ph.D. thesis (1981), page 35. It is necessary that all existing transitions be explicit before the [grow] functions are called. 
In other words, if it has been decided that there will be a transition from [node1] to [node2], then [node1.transitions] must be updated before [grow] is invoked. *) (* [grow node state] grows the existing node [node], if necessary, so that its associated state subsumes [state]. If this represents an actual (strict) growth, then [node]'s descendants are grown as well. *) let rec grow node state = if Lr0.subsume state node.state then follow_state "Target state is unaffected" node false else begin (* In versions of Menhir prior to June 2008, I wrote this: If I know what I am doing, then the new state that is being merged into the existing state should be compatible, in Pager's sense, with the existing node. In other words, compatibility should be preserved through transitions. and the code contained this assertion: assert (Lr0.compatible state node.state); assert (Lr0.eos_compatible state node.state); However, this was wrong. See, for instance, the sample grammars cocci.mly and boris-mini.mly. The problem is particularly clearly apparent in boris-mini.mly, where it only involves inclusion of states -- the definition of Pager's weak compatibility does not enter the picture. Here is, roughly, what is going on. Assume we have built some state A, which, along some symbol S, has a transition to itself. This means, in fact, that computing the successor of A along S yields a *subset* of A, that is, succ(A, S) <= A. Then, we wish to build a new state A', which turns out to be a superset of A, so we decide to grow A. (The fact that A is a subset of A' implies that A and A' are Pager-compatible.) As per the code below, we immediately update the state A in place, to become A'. Then, we inspect the transition along symbol S. We find that the state succ(A', S) must be merged into A'. In this situation, the assertions above require succ(A', S) to be compatible with A'. However, this is not necessarily the case. By monotonicity of succ, we do have succ(A, S) <= succ(A', S). 
But nothing says that succ(A', S) and A' are related with respect to inclusion, or even Pager-compatible. The grammar in boris-mini.mly shows that they are not. *) (* Grow [node]. *) node.state <- Lr0.union state node.state; follow_state "Growing existing state" node true; (* Grow [node]'s successors. *) grow_successors node end (* [grow_successors node] grows [node]'s successors. *) (* Note that, if there is a cycle in the graph, [grow_successors] can be invoked several times at a single node [node], with [node.state] taking on a new value every time. In such a case, this code should be correct, although probably not very efficient. *) and grow_successors node = SymbolMap.iter (fun symbol (successor_node : node) -> let successor_state = Lr0.transition symbol node.state in follow_transition true node symbol successor_state; grow successor_node successor_state ) node.transitions (* -------------------------------------------------------------------------- *) (* Data structures maintained during the construction of the automaton. *) (* A queue of pending nodes, whose outgoing transitions have not yet been built. *) let queue : node Queue.t = Queue.create() (* A mapping of LR(0) node numbers to lists of nodes. This allows us to efficiently find all existing nodes that are core-compatible with a newly found state. *) let map : node list array = Array.make Lr0.n [] (* A counter that allows assigning raw numbers to nodes. *) let num = ref 0 (* A (reversed) list of all nodes that we have allocated. At the end of the process, this list is turned into an array, and allows us to expose an efficient mapping of node numbers back to nodes. *) let nodes = ref [] (* -------------------------------------------------------------------------- *) (* [create state] creates a new node that stands for the state [state]. It is expected that [state] does not subsume, and is not subsumed by, any existing state. *) let create (state : lr1state) : node = (* Allocate a new node. 
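The [grow]/[grow_successors] pair above computes a fixpoint: strictly growing a node's state forces its successors to be grown in turn, and monotonicity of union (states only ever get larger) guarantees termination, even on cyclic graphs. A toy model of the same recursion, with integer sets standing in for LR(1) states (the names and representation are illustrative, not Menhir's):

```ocaml
module IS = Set.Make (Int)

(* A toy node: a mutable "state" and outgoing edges. *)
type tnode = { mutable state : IS.t; mutable succs : tnode list }

(* Grow [node] so that its state subsumes [s]; on a strict growth,
   propagate to the successors, as [grow_successors] does. The subset
   test is what makes the recursion terminate on cycles. *)
let rec grow node s =
  if not (IS.subset s node.state) then begin
    node.state <- IS.union s node.state;
    List.iter (fun succ -> grow succ node.state) node.succs
  end
```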
*) let node = { state = state; transitions = SymbolMap.empty; number = Misc.postincrement num; } in nodes := node :: !nodes; (* Update the mapping of LR(0) cores to lists of nodes. *) let k = Lr0.core state in assert (k < Lr0.n); map.(k) <- node :: map.(k); (* Enqueue this node for further examination. *) Queue.add node queue; (* Debugging output. *) follow_state "Creating a new state" node false; (* Return the freshly created node. *) node (* -------------------------------------------------------------------------- *) (* Materializing a transition turns its target state into a (fresh or existing) node. There are three scenarios: the proposed new state can be subsumed by an existing state, compatible with an existing state, or neither. *) exception Subsumed of node exception Compatible of node let materialize (source : node) (symbol : Symbol.t) (target : lr1state) : unit = try (* Debugging output. *) follow_transition false source symbol target; (* Find all existing core-compatible states. *) let k = Lr0.core target in assert (k < Lr0.n); let similar = map.(k) in (* Check whether we need to create a new node or can reuse an existing state. *) (* 20120525: the manner in which this check is performed depends on [Settings.construction_mode]. There are now three modes. *) (* 20150204: there are now four modes. *) begin match Settings.construction_mode with | Settings.ModeCanonical -> (* In a canonical automaton, two states can be merged only if they are identical. *) List.iter (fun node -> if Lr0.subsume target node.state && Lr0.subsume node.state target then raise (Subsumed node) ) similar | Settings.ModeInclusionOnly | Settings.ModePager -> (* A more aggressive approach is to take subsumption into account: if the new candidate state is a subset of an existing state, then no new node needs to be created. Furthermore, the existing state does not need to be enlarged. *) (* 20110124: require error compatibility in addition to subsumption. 
*) List.iter (fun node -> if Lr0.subsume target node.state && Lr0.error_compatible target node.state then raise (Subsumed node) ) similar | Settings.ModeLALR -> () end; begin match Settings.construction_mode with | Settings.ModeCanonical | Settings.ModeInclusionOnly -> () | Settings.ModePager -> (* One can be even more aggressive and check whether the existing state is compatible, in Pager's sense, with the new state. If so, there is no need to create a new state: just merge the new state into the existing one. The result is a state that may be larger than each of the two states that have been merged. *) (* 20110124: require error compatibility in addition to the existing compatibility criteria. *) List.iter (fun node -> if Lr0.compatible target node.state && Lr0.eos_compatible target node.state && Lr0.error_compatible target node.state then raise (Compatible node) ) similar | Settings.ModeLALR -> (* In LALR mode, as soon as there is one similar state -- i.e. one state that shares the same LR(0) core -- we merge the new state into the existing one. *) List.iter (fun node -> raise (Compatible node) ) similar end; (* The above checks have failed. Create a new node. Two states that are in the subsumption relation are also compatible. This implies that the newly created node does not subsume any existing states. *) source.transitions <- SymbolMap.add symbol (create target) source.transitions with | Subsumed node -> (* Join an existing target node. *) follow_state "Joining existing state" node false; source.transitions <- SymbolMap.add symbol node source.transitions | Compatible node -> (* Join and grow an existing target node. It seems important that the new transition is created before [grow_successors] is invoked, so that all transition decisions made so far are explicit. 
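[materialize] above drives a three-way decision with exceptions: scanning the core-compatible candidates either raises [Subsumed], raises [Compatible], or falls through to the creation of a fresh node. The shape of that control flow, reduced to a toy decision over integer "states" (the numeric criteria below are made up purely for illustration):

```ocaml
exception Subsumed of int
exception Compatible of int

type outcome = Reuse of int | Merge of int | Create

(* Toy criteria: a candidate "subsumes" the target if it is >= the
   target; it is "compatible" if it lies within distance 1. *)
let classify (target : int) (similar : int list) : outcome =
  try
    List.iter (fun s -> if target <= s then raise (Subsumed s)) similar;
    List.iter (fun s -> if abs (target - s) <= 1 then raise (Compatible s)) similar;
    Create
  with
  | Subsumed s -> Reuse s
  | Compatible s -> Merge s
```

As in [materialize], the subsumption scan runs to completion before the compatibility scan begins, so reuse is always preferred over merging.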
*) node.state <- Lr0.union target node.state; follow_state "Joining and growing existing state" node true; source.transitions <- SymbolMap.add symbol node source.transitions; grow_successors node (* -------------------------------------------------------------------------- *) (* The actual construction process. *) (* Populate the queue with the start nodes and store them in an array. *) let entry : node ProductionMap.t = ProductionMap.map (fun (k : Lr0.node) -> create (Lr0.start k) ) Lr0.entry (* Pick a node in the queue, that is, a node whose transitions have not yet been built. Build these transitions, and continue. *) (* Note that building a transition can cause existing nodes to grow, so [node.state] is not necessarily invariant throughout the inner loop. *) let () = Misc.qiter (fun node -> List.iter (fun symbol -> materialize node symbol (Lr0.transition symbol node.state) ) (Lr0.outgoing_symbols (Lr0.core node.state)) ) queue (* Record how many nodes were constructed. *) let n = !num (* Allocate an array of all nodes. *) let nodes = Array.of_list (List.rev !nodes) let () = assert (Array.length nodes = n) (* -------------------------------------------------------------------------- *) (* Accessors. *) let number node = node.number let node i = assert (0 <= i && i < n); nodes.(i) let state node = node.state let transitions node = node.transitions (* -------------------------------------------------------------------------- *) end menhir-20200123/src/lr1construction.mli000066400000000000000000000025421361226111300176710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* This module constructs an LR(1) automaton for the grammar described by the module [Grammar]. *) (* In this construction, precedence declarations are not taken into account. Thus, conflicts are not resolved; no transitions or reductions are removed in order to resolve conflicts. As a result, every node is reachable from some entry node. *) open LR1Sigs module Run () : LR1_AUTOMATON menhir-20200123/src/lr1partial.ml000066400000000000000000000200741361226111300164220ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar exception Oops module Run (X : sig (* A restricted set of tokens of interest. *) val tokens: TerminalSet.t (* A state of the (merged) LR(1) automaton that we're trying to simulate. *) val goal: Lr1.node end) = struct (* First, let's restrict our interest to the nodes of the merged LR(1) automaton that can reach the goal node. Some experiments show that this can involve one tenth to one half of all nodes. This optimization seems minor, but is easy to implement. *) let relevant : Lr1.node -> bool = let module G = struct include Lr1.BackwardEdges let foreach_root f = f X.goal end in let module M = DFS.MarkArray(Lr1) in let module D = struct let discover _node = () let traverse _source _label _target = () end in let module R = DFS.Run(G)(M)(D) in M.is_marked (* Second, all of the states that we shall consider are restricted to the set of tokens of interest. 
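The [relevant] predicate above is computed by a depth-first traversal of the backward edges, starting from the goal: a state is kept only if the goal is reachable from it. A generic, array-based sketch of that computation (the function name and representation are illustrative, not the [DFS] functors used above):

```ocaml
(* [reachable n preds goal] marks every node from which [goal] can be
   reached, by exploring predecessor edges depth-first. *)
let reachable (n : int) (preds : int -> int list) (goal : int) : int -> bool =
  let marked = Array.make n false in
  let rec visit s =
    if not marked.(s) then begin
      marked.(s) <- true;
      List.iter visit (preds s)
    end
  in
  visit goal;
  fun s -> marked.(s)
```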
This is an important idea: by abstracting away some information, we make the construction much faster. *) let restrict = Lr0.restrict X.tokens (* Constructing the automaton. The automaton is represented as a graph. States are never merged -- this is a canonical LR(1) construction! As we go, we record the correspondence between nodes in this automaton and nodes in the merged LR(1) automaton. This allows us to tell when we have reached the desired place. This also allows us not to follow transitions that have already been eliminated, in the merged automaton, via resolution of shift/reduce conflicts. Whenever we follow a transition in the canonical LR(1) automaton, we check that the corresponding transition is legal in the merged LR(1) automaton. The automaton is explored breadth-first and shortest paths from every node to one of the start nodes are recorded. *) type node = { state: Lr0.lr1state; ancestor: (Symbol.t * node) option; shadow: Lr1.node; } (* A queue of pending nodes, whose successors should be explored. *) let queue : node Queue.t = Queue.create() (* Mapping of LR(0) state numbers to lists of nodes. *) let map : node list array = Array.make Lr0.n [] (* Exploring a state. This creates a new node, if necessary, and enqueues it for further exploration. *) exception Goal of node * Terminal.t let explore ancestor shadow (state : Lr0.lr1state) : unit = (* Find all existing nodes that share the same LR(0) core. *) let k = Lr0.core state in assert (k < Lr0.n); let similar = map.(k) in (* Check whether one of these nodes coincides with the candidate new node. If so, stop. This check requires comparing not only the states of the partial, canonical automaton, but also their shadows in the full, merged automaton. This is because a single state of the canonical automaton may be reached along several different paths, leading to distinct shadows in the merged automaton, and we must explore all of these paths in order to ensure that we eventually find a goal node. 
*) if not (List.exists (fun node -> Lr0.equal state node.state && shadow == node.shadow ) similar) then begin (* Otherwise, create a new node. *) let node = { state = state; ancestor = ancestor; shadow = shadow; } in map.(k) <- node :: similar; Queue.add node queue; (* Check whether this is a goal node. A node [N] is a goal node if (i) [N] has a conflict involving one of the tokens of interest and (ii) [N] corresponds to the goal node, that is, the path that leads to [N] in the canonical LR(1) automaton leads to the goal node in the merged LR(1) automaton. Note that these conditions do not uniquely define [N]. *) if shadow == X.goal then let can_reduce = ref TerminalSet.empty in let reductions1 : Production.index list TerminalMap.t = Lr1.reductions shadow in List.iter (fun (toks, prod) -> TerminalSet.iter (fun tok -> (* We are looking at a [(tok, prod)] pair -- a reduction in the canonical automaton state. *) (* Check that this reduction, which exists in the canonical automaton state, also exists in the merged automaton -- that is, it wasn't suppressed by conflict resolution. *) if List.mem prod (TerminalMap.lookup tok reductions1) then try let (_ : Lr1.node) = SymbolMap.find (Symbol.T tok) (Lr1.transitions shadow) in (* Shift/reduce conflict. *) raise (Goal (node, tok)) with Not_found -> let toks = !can_reduce in (* We rely on the property that [TerminalSet.add tok toks] preserves physical equality when [tok] is a member of [toks]. *) let toks' = TerminalSet.add tok toks in if toks == toks' then (* Reduce/reduce conflict. *) raise (Goal (node, tok)) else (* No conflict so far. *) can_reduce := toks' ) toks ) (Lr0.reductions state) end (* Populate the queue with the start nodes. Until we find a goal node, take a node out the queue, construct the nodes that correspond to its successors, and enqueue them. 
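The reduce/reduce detection above leans on a guarantee of the set implementation: [add tok toks] returns a result physically equal to [toks] when [tok] is already present. (OCaml's standard [Set] documents exactly this guarantee since 4.03.) Membership can thus be detected as a by-product of insertion, without a separate [mem] test:

```ocaml
module SS = Set.Make (String)

(* Insert [x] and report, via physical equality, whether it was
   already a member. *)
let add_detecting_duplicate x s =
  let s' = SS.add x s in
  (s', s == s')
```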
*) let goal, token = try ProductionMap.iter (fun (prod : Production.index) (k : Lr0.node) -> let shadow = try ProductionMap.find prod Lr1.entry with Not_found -> assert false in if relevant shadow then explore None shadow (restrict (Lr0.start k)) ) Lr0.entry; Misc.qiter (fun node -> SymbolMap.iter (fun symbol state -> try let shadow = SymbolMap.find symbol (Lr1.transitions node.shadow) in if relevant shadow then explore (Some (symbol, node)) shadow (restrict state) with Not_found -> (* No shadow. This can happen if a shift/reduce conflict was resolved in favor of reduction. Ignore that transition. *) () ) (Lr0.transitions node.state) ) queue; (* We didn't find a goal node. This shouldn't happen! If the goal node in the merged LR(1) automaton has a conflict, then there should exist a node with a conflict in the canonical automaton as well. Otherwise, Pager's construction is incorrect. *) raise Oops with Goal (node, tok) -> node, tok (* Query the goal node that was found about the shortest path from it to one of the entry nodes. 
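The [follow] function above rebuilds the path by walking [ancestor] links backward, accumulating the symbols; because each node records the edge through which it was first discovered during the breadth-first exploration, the result is a shortest path. The same pattern in isolation (types simplified to integer symbols):

```ocaml
(* Each node remembers the symbol and predecessor through which it was
   first reached; walking these links yields the discovery path. *)
type pnode = { ancestor : (int * pnode) option }

let path_to (n : pnode) : int list =
  let rec follow acc n =
    match n.ancestor with
    | None -> acc
    | Some (sym, pred) -> follow (sym :: acc) pred
  in
  follow [] n
```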
This is known to happen in a few pathological cases (e.g., when a shift/reduce conflict is solved in favor of reduction, the only path towards the goal state may disappear). So we report this situation gracefully in the .conflicts file instead of failing abruptly. *) exception Oops module Run (X : sig (* A restricted set of tokens of interest. *) val tokens: TerminalSet.t (* A state of the (merged) LR(1) automaton that we're trying to simulate. *) val goal: Lr1.node end) : sig (* What we are after is a path, in the canonical LR(1) automaton, that leads from some entry node to a node [N] such that (i) [N] has a conflict involving one of the tokens of interest and (ii) [N] corresponds to the goal node, that is, the path that leads to [N] in the canonical LR(1) automaton leads to the goal node in the merged LR(1) automaton. *) val source: Item.t val path: Symbol.t array val goal: Lr0.concretelr1state (* An (arbitrarily chosen) conflict token in the goal state. *) val token: Terminal.t end menhir-20200123/src/main.ml000066400000000000000000000020661361226111300152740ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The main program. *) (* Everything is in [Back]. 
*) module B = Back (* artificial dependency *) menhir-20200123/src/mark.ml000066400000000000000000000024121361226111300152750ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (** This module implements a very simple notion of ``mark''. A mark is really a reference cell (without content). Creating a new mark requires allocating a new cell, and comparing marks requires comparing pointers. *) type t = unit ref let fresh = ref let same = (==) let none = fresh() menhir-20200123/src/mark.mli000066400000000000000000000027031361226111300154510ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (** This module implements a very simple notion of ``mark''. *) (** The type of marks. *) type t (** [fresh()] generates a fresh mark, that is, a mark that is guaranteed to be distinct from all existing marks. *) val fresh: unit -> t (** [same mark1 mark2] tells whether [mark1] and [mark2] are the same mark, that is, were created by the same call to [fresh]. *) val same: t -> t -> bool (** [none] is a distinguished mark, created via an initial call to [fresh()]. 
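The mark abstraction described in `mark.ml`/`mark.mli` above is small enough to exercise directly. The following is a minimal, self-contained sketch that mirrors the definitions in `src/mark.ml` (a mark is a contentless reference cell, so comparing marks is comparing pointers):

```ocaml
(* A mark is a contentless reference cell; mark equality is physical
   (pointer) equality, exactly as in src/mark.ml. *)
type mark = unit ref

let fresh () : mark = ref ()
let same (m1 : mark) (m2 : mark) : bool = m1 == m2

let () =
  let m1 = fresh () and m2 = fresh () in
  assert (same m1 m1);        (* a mark is the same as itself *)
  assert (not (same m1 m2))   (* distinct calls to [fresh] yield distinct marks *)
```

Note that `(=)` would not work here: two `unit ref` cells are structurally equal, so only physical equality `(==)` distinguishes marks.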
*) val none: t menhir-20200123/src/misc.ml000066400000000000000000000263221361226111300153040ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let unSome = function None -> assert false | Some x -> x let o2s o f = match o with | None -> "" | Some x -> f x let single = function | [ x ] -> x | _ -> assert false let rec mapd f = function | [] -> [] | x :: xs -> let y1, y2 = f x in y1 :: y2 :: mapd f xs let tabulate n f = let a = Array.init n f in Array.get a let tabulateb n f = let a = Array.init n f in Array.get a, Array.fold_left (fun count element -> if element then count + 1 else count ) 0 a (* [tabulatef number fold n dummy f] returns a function that is extensionally equal to [f], but relies on an internal array. Arguments to [f] are of type ['a] and are mapped by [number] into the range [0..n). [fold] allows folding over the domain of [f]. [dummy] is used to initialize the internal array. Its value has no impact if [fold] is surjective. 
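The tabulation helpers at the top of `misc.ml` memoize a function over a finite integer domain by precomputing an array. A minimal sketch of `tabulate`, mirroring the definition in `src/misc.ml`, with a call counter to show that the function is evaluated once per point, at tabulation time:

```ocaml
(* [tabulate n f] returns a function extensionally equal to [f] on the
   domain [0..n), backed by a precomputed array (as in src/misc.ml). *)
let tabulate n f =
  let a = Array.init n f in
  Array.get a

let () =
  let calls = ref 0 in
  let square = tabulate 10 (fun i -> incr calls; i * i) in
  assert (square 3 = 9);
  assert (square 3 = 9);   (* second lookup does not re-evaluate [f] *)
  assert (!calls = 10)     (* [f] was called exactly once per domain point *)
```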
*) let tabulatef number fold n dummy f = let a = Array.make n dummy in let () = fold (fun () element -> a.(number element) <- f element ) () in let get element = a.(number element) in get let tabulateo number fold n f = let c = ref 0 in let get = tabulatef number fold n None (fun element -> let image = f element in begin match image with | Some _ -> incr c | None -> () end; image ) in get, !c type 'a iter = ('a -> unit) -> unit let separated_iter_to_string printer separator iter = let b = Buffer.create 32 in let first = ref true in iter (fun x -> if !first then begin Buffer.add_string b (printer x); first := false end else begin Buffer.add_string b separator; Buffer.add_string b (printer x) end ); Buffer.contents b let separated_list_to_string printer separator xs = separated_iter_to_string printer separator (fun f -> List.iter f xs) let inverse (a : 'a array) : 'a -> int = let table = Hashtbl.create (Array.length a) in Array.iteri (fun i data -> assert (not (Hashtbl.mem table data)); Hashtbl.add table data i ) a; fun data -> try Hashtbl.find table data with Not_found -> assert false let support_assoc l x = try List.assoc x l with Not_found -> x let index (strings : string list) : int * string array * int StringMap.t = let name = Array.of_list strings and n, map = List.fold_left (fun (n, map) s -> n+1, StringMap.add s n map ) (0, StringMap.empty) strings in n, name, map (* Turning an implicit list, stored using pointers through a hash table, into an explicit list. The head of the implicit list is not included in the explicit list. *) let materialize (table : ('a, 'a option) Hashtbl.t) (x : 'a) : 'a list = let rec loop x = match Hashtbl.find table x with | None -> [] | Some x -> x :: loop x in loop x (* [iteri] implements a [for] loop over integers, from 0 to [n-1]. *) let iteri n f = for i = 0 to n - 1 do f i done (* [foldi] implements a [for] loop over integers, from 0 to [n-1], with an accumulator. 
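The `foldi`/`foldij` loops documented above are accumulator-threading `for` loops over integer ranges, and `mapij`/`mapi` are derived from them. A runnable sketch mirroring the definitions in `src/misc.ml`:

```ocaml
(* [foldij start n f accu] folds [f] over the integers [start..n),
   threading an accumulator, as in src/misc.ml. *)
let foldij start n f accu =
  let rec loop i accu =
    if i = n then accu else loop (i + 1) (f i accu)
  in
  loop start accu

(* [mapij start n f] produces [ [f start; ...; f (n-1)] ]. *)
let mapij start n f =
  List.rev (foldij start n (fun i accu -> f i :: accu) [])

let () =
  assert (foldij 0 5 (fun i sum -> sum + i) 0 = 10);  (* 0+1+2+3+4 *)
  assert (mapij 2 5 (fun i -> i * i) = [4; 9; 16])
```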
[foldij] implements a [for] loop over integers, from [start] to [n-1], with an accumulator. *) let foldij start n f accu = let rec loop i accu = if i = n then accu else loop (i+1) (f i accu) in loop start accu let foldi n f accu = foldij 0 n f accu (* [mapij start n f] produces the list [ f start; ... f (n-1) ]. *) let mapij start n f = List.rev ( foldij start n (fun i accu -> f i :: accu ) [] ) (* [mapi n f] produces the list [ f 0; ... f (n-1) ]. *) let mapi n f = mapij 0 n f (* [qfold f accu q] repeatedly takes an element [x] off the queue [q] and applies [f] to the accumulator and to [x], until [q] becomes empty. Of course, [f] can add elements to [q] as a side-effect. We allocate an option to ensure that [qfold] is tail-recursive. *) let rec qfold f accu q = match try Some (Queue.take q) with Queue.Empty -> None with | Some x -> qfold f (f accu x) q | None -> accu (* [qiter f q] repeatedly takes an element [x] off the queue [q] and applies [f] to [x], until [q] becomes empty. Of course, [f] can add elements to [q] as a side-effect. *) let qiter f q = try while true do f (Queue.take q) done with Queue.Empty -> () let rec smap f = function | [] -> [] | (x :: xs) as l -> let x' = f x and xs' = smap f xs in if x == x' && xs == xs' then l else x' :: xs' let rec smapa f accu = function | [] -> accu, [] | (x :: xs) as l -> let accu, x' = f accu x in let accu, xs' = smapa f accu xs in accu, if x == x' && xs == xs' then l else x' :: xs' let normalize s = let s = Bytes.of_string s in let n = Bytes.length s in for i = 0 to n - 1 do match Bytes.get s i with | '(' | ')' | ',' -> Bytes.set s i '_' | _ -> () done; Bytes.unsafe_to_string s (* [postincrement r] increments [r] and returns its original value. *) let postincrement r = let x = !r in r := x + 1; x (* [map_opt f l] returns the list of [y]s such that [f x = Some y] where [x] is in [l], preserving the order of elements of [l]. 
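The `map_opt` combinator documented just above is a combined map-and-filter: it keeps the `y` such that `f x = Some y`, preserving order. A self-contained sketch mirroring the definition in `src/misc.ml`:

```ocaml
(* [map_opt f l] returns the list of [y]s such that [f x = Some y] for
   [x] in [l], in order (as in src/misc.ml). The accumulator is built
   in reverse and flipped once at the end. *)
let map_opt f l =
  List.(rev (fold_left (fun ys x ->
    match f x with
    | None -> ys
    | Some y -> y :: ys
  ) [] l))

let () =
  (* Keep only even numbers, halved. *)
  let halve n = if n mod 2 = 0 then Some (n / 2) else None in
  assert (map_opt halve [1; 2; 3; 4; 6] = [1; 2; 3])
```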
*) let map_opt f l = List.(rev (fold_left (fun ys x -> match f x with | None -> ys | Some y -> y :: ys ) [] l)) let new_encode_decode capacity = (* Set up a a hash table, mapping strings to unique integers. *) let module H = Hashtbl.Make(struct type t = string let equal = (=) let hash = Hashtbl.hash end) in let table = H.create capacity in (* Set up a resizable array, mapping integers to strings. *) let text = MenhirLib.InfiniteArray.make "" in (* This counts the calls to [encode]. *) let c = ref 0 in (* A string is mapped to a unique integer, as follows. *) let encode (s : string) : int = c := !c + 1; try H.find table s with Not_found -> (* The number of elements in the hash table is the next available unique integer code. *) let i = H.length table in H.add table s i; MenhirLib.InfiniteArray.set text i s; i (* An integer code can be mapped back to a string, as follows. *) and decode (i : int) : string = MenhirLib.InfiniteArray.get text i and verbose () = Printf.fprintf stderr "%d calls to intern; %d unique strings.\n%!" !c (H.length table) in encode, decode, verbose let new_claim () = let names = ref StringSet.empty in let claim name = if StringSet.mem name !names then Error.error [] "internal name clash over %s" name; names := StringSet.add name !names in claim let rec best (preferable : 'a -> 'a -> bool) (xs : 'a list) : 'a option = match xs with | [] -> (* Special case: no elements at all, so no best element. This case does not participate in the recursion. *) None | [x] -> Some x | x :: xs -> (* If [x] is preferable to every element of [xs], then it is the best element of [x :: xs]. *) if List.for_all (preferable x) xs then Some x else (* [xs] is nonempty, so the recursive call is permitted. *) match best preferable xs with | Some y -> if preferable y x then (* If [y] is the best element of [xs] and [y] is preferable to [x], then [y] is the best element of [x :: xs]. *) Some y else (* There is no best element. *) None | None -> (* There is no best element. 
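The `best` function above returns the element of a list that is preferable to every other element, if one exists; otherwise it reports that there is no best element. The following sketch mirrors the recursion in `src/misc.ml`, using divisibility as an example partial order (an illustrative choice, not taken from the source):

```ocaml
(* [best preferable xs] returns the best element of [xs] with respect to
   the partial order [preferable], if there is one (as in src/misc.ml).
   Complexity is quadratic. *)
let rec best preferable = function
  | [] -> None
  | [x] -> Some x
  | x :: xs ->
      if List.for_all (preferable x) xs then Some x
      else
        match best preferable xs with
        | Some y when preferable y x -> Some y
        | _ -> None

(* Example partial order: [divides a b] holds when [a] divides [b]. *)
let divides a b = b mod a = 0

let () =
  assert (best divides [2; 4; 8] = Some 2);  (* 2 divides both 4 and 8 *)
  assert (best divides [4; 6] = None)        (* 4 and 6 are incomparable *)
```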
*) None let rec levels1 cmp x1 xs = match xs with | [] -> [x1], [] | x2 :: xs -> let ys1, yss = levels1 cmp x2 xs in if cmp x1 x2 = 0 then x1 :: ys1, yss else [x1], ys1 :: yss let levels cmp xs = match xs with | [] -> [] | x1 :: xs -> let ys1, yss = levels1 cmp x1 xs in ys1 :: yss (* Suppose [ys] is a list of elements that are pairwise incomparable with respect to the partial order [<=], and [x] is a new element. Then, [insert (<=) x ys] is the list obtained by inserting [x] and removing any non-maximal elements; so it is again a list of pairwise incomparable elements. *) let insert (<=) x ys = (* If [x] is subsumed by some element [y] of [ys], then there is nothing to do. In particular, no element [y] of [ys] can be subsumed by [x], since the elements of [ys] are pairwise incomparable. *) if List.exists (fun y -> x <= y) ys then ys (* Or [x] must be inserted, and any element [y] of [ys] that is subsumed by [x] must be removed. *) else x :: List.filter (fun y -> not (y <= x)) ys (* Suppose [xs] is an arbitrary list of elements. Then [trim (<=) xs] is the sublist of the elements of [xs] that are maximal with respect to the partial order [<=]. In other words, it is a sublist where every element that is less than some other element has been removed. *) (* One might wish to define [trim] using [List.filter] to keep just the maximal elements, but it is not so easy to say "keep an element only if it is not subsumed by some *other* element of the list". Instead, we iterate [insert]. 
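The `insert`/`trim` pair described above maintains an antichain: `trim (<=) xs` keeps only the elements of `xs` that are maximal with respect to the partial order `(<=)`, by folding `insert` over the list. A runnable sketch mirroring the definitions in `src/misc.ml`, again with divisibility as an illustrative order:

```ocaml
(* [insert (<=) x ys] inserts [x] into a list of pairwise incomparable
   elements, discarding non-maximal elements (as in src/misc.ml). *)
let insert (<=) x ys =
  if List.exists (fun y -> x <= y) ys then ys
  else x :: List.filter (fun y -> not (y <= x)) ys

(* [trim (<=) xs] keeps the maximal elements of [xs]. *)
let trim (<=) xs =
  List.fold_right (insert (<=)) xs []

let divides a b = b mod a = 0

let () =
  (* Every element divides 12, so only 12 is maximal. *)
  assert (trim divides [2; 3; 4; 6; 12] = [12]);
  (* 4 and 6 are incomparable under divisibility: both survive. *)
  assert (trim divides [4; 6] = [4; 6])
```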
*) let trim (<=) xs = List.fold_right (insert (<=)) xs [] let rec dup1 cmp x ys = match ys with | [] -> None | y :: ys -> if cmp x y = 0 then Some x else dup1 cmp y ys let dup cmp xs = match xs with | [] -> None | x :: xs -> dup1 cmp x xs let once x y = let s = ref x in fun () -> let result = !s in s := y; result module ListExtras = struct let rec equal (=) xs ys = match xs, ys with | [], [] -> true | x :: xs, y :: ys -> x = y && equal (=) xs ys | _ :: _, [] | [], _ :: _ -> false let hash hash xs = Hashtbl.hash (List.map hash xs) end let nth = function | 1 -> "first" | 2 -> "second" | 3 -> "third" | i -> Printf.sprintf "%dth" i let count = function | 1 -> "one" | 2 -> "two" | 3 -> "three" | i -> Printf.sprintf "%d" i (* To keep compatibility with OCaml 4.02, we copy [Array.for_all], which appeared in 4.03. *) let array_for_all p a = let n = Array.length a in let rec loop i = if i = n then true else if p (Array.unsafe_get a i) then loop (succ i) else false in loop 0 menhir-20200123/src/misc.mli000066400000000000000000000201561361226111300154540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* Projecting out of an option. May fail abruptly! *) val unSome: 'a option -> 'a (* Converting an option to a string, with [None] converted to the empty string. *) val o2s: 'a option -> ('a -> string) -> string (* Projection out of a singleton list. *) val single: 'a list -> 'a (* A variant of [List.map] where [f] returns a pair of elements, to be flattened into the new list. 
*) val mapd: ('a -> 'b * 'b) -> 'a list -> 'b list (* Tabulating a function using an internal array. [tabulate n f] returns a function that is extensionally equal to [f], but relies on an internal array. Arguments to [f] are of type [int] and are supposed to lie in the range [0..n). *) val tabulate: int -> (int -> 'a) -> (int -> 'a) (* Tabulating a function using an internal array. [tabulateb n f] returns a function that is extensionally equal to [f], but relies on an internal array. Arguments to [f] are of type [int] and are supposed to lie in the range [0..n). The result type of [f] is assumed to be of type [bool]. [tabulateb] also returns the number of points where [f] is [true]. *) val tabulateb: int -> (int -> bool) -> (int -> bool) * int (* [tabulateo number fold n f] returns a function that is extensionally equal to [f], but relies on an internal array. Arguments to [f] are of type ['a] and are mapped by [number] into the range [0..n). [fold] allows folding over the domain of [f]. The result type of [f] is an option type, and [tabulateo] also returns the number of points where [f] is [Some _]. *) val tabulateo: ('a -> int) -> ((unit -> 'a -> unit) -> unit -> unit) -> int -> ('a -> 'b option) -> ('a -> 'b option) * int (* [separated_list_to_string printer sep l] converts [l] into a string representation built by using [printer] on each element and [sep] as a separator. *) type 'a iter = ('a -> unit) -> unit val separated_iter_to_string: ('a -> string) -> string -> 'a iter -> string val separated_list_to_string: ('a -> string) -> string -> 'a list -> string (* If [a] is an array, therefore a mapping of integers to elements, then [inverse a] computes its inverse, a mapping of elements to integers. The type ['a] of elements must support the use of OCaml's generic equality and hashing functions. *) val inverse: 'a array -> ('a -> int) (* [support_assoc l x] returns the second component of the first couple in [l] whose first component is [x]. 
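The `inverse` function documented above turns an array, viewed as a mapping of integers to elements, into the reverse mapping of elements to integers, via a hash table. A minimal sketch mirroring `src/misc.ml` (the error handling on a missing key is simplified here to `Hashtbl.find` raising `Not_found`):

```ocaml
(* [inverse a] computes the inverse of the mapping represented by the
   array [a]; elements must support generic equality and hashing. *)
let inverse (a : 'a array) : 'a -> int =
  let table = Hashtbl.create (Array.length a) in
  Array.iteri (fun i data ->
    assert (not (Hashtbl.mem table data));  (* elements must be distinct *)
    Hashtbl.add table data i
  ) a;
  fun data -> Hashtbl.find table data

let () =
  let idx = inverse [| "zero"; "one"; "two" |] in
  assert (idx "one" = 1);
  assert (idx "two" = 2)
```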
If it does not exist, it returns [x]. *) val support_assoc : ('a * 'a) list -> 'a -> 'a (* [index] indexes a list of (distinct) strings, that is, assigns an integer index to each string and builds mappings both ways between strings and indices. *) val index: string list -> int * string array * int StringMap.t (* Turning an implicit list, stored using pointers through a hash table, into an explicit list. The head of the implicit list is not included in the explicit list. *) val materialize: ('a, 'a option) Hashtbl.t -> 'a -> 'a list (* [iteri] implements a [for] loop over integers, from 0 to [n-1]. *) val iteri: int -> (int -> unit) -> unit (* [foldi] implements a [for] loop over integers, from 0 to [n-1], with an accumulator. [foldij] implements a [for] loop over integers, from [start] to [n-1], with an accumulator. *) val foldi: int -> (int -> 'a -> 'a) -> 'a -> 'a val foldij: int -> int -> (int -> 'a -> 'a) -> 'a -> 'a (* [mapij start n f] produces the list [ f start; ... f (n-1) ]. *) val mapij: int -> int -> (int -> 'a) -> 'a list (* [mapi n f] produces the list [ f 0; ... f (n-1) ]. *) val mapi: int -> (int -> 'a) -> 'a list (* [qfold f accu q] repeatedly takes an element [x] off the queue [q] and applies [f] to the accumulator and to [x], until [q] becomes empty. Of course, [f] can add elements to [q] as a side-effect. *) val qfold: ('a -> 'b -> 'a) -> 'a -> 'b Queue.t -> 'a (* [qiter f q] repeatedly takes an element [x] off the queue [q] and applies [f] to [x], until [q] becomes empty. Of course, [f] can add elements to [q] as a side-effect. *) val qiter: ('b -> unit) -> 'b Queue.t -> unit (* [smap] has the same semantics as [List.map], but attempts to physically return the input list when [f] is the identity. *) val smap: ('a -> 'a) -> 'a list -> 'a list (* [smapa] is a variant of [smap] that maintains an accumulator. 
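The `qfold` combinator documented above drains a queue while threading an accumulator, and crucially allows the folded function to enqueue further elements — the standard worklist pattern used throughout Menhir's automaton construction. A self-contained sketch mirroring the definition in `src/misc.ml`:

```ocaml
(* [qfold f accu q] repeatedly takes an element off [q] and folds it
   into the accumulator until [q] is empty; [f] may push new elements
   onto [q] as a side effect (as in src/misc.ml). *)
let rec qfold f accu q =
  match (try Some (Queue.take q) with Queue.Empty -> None) with
  | Some x -> qfold f (f accu x) q
  | None -> accu

let () =
  let q = Queue.create () in
  Queue.add 4 q;
  (* Worklist: each element [n > 0] enqueues [n - 1]; we sum everything
     ever dequeued, i.e. 4 + 3 + 2 + 1 + 0. *)
  let total =
    qfold (fun sum n ->
      if n > 0 then Queue.add (n - 1) q;
      sum + n
    ) 0 q
  in
  assert (total = 10)
```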
*) val smapa: ('b -> 'a -> 'b * 'a) -> 'b -> 'a list -> 'b * 'a list (* [normalize s] returns a copy of [s] where parentheses and commas are replaced with underscores. *) val normalize: string -> string (* [postincrement r] increments [r] and returns its original value. *) val postincrement: int ref -> int (* [map_opt f l] returns the list of [y]s such that [f x = Some y] where [x] is in [l], preserving the order of elements of [l]. *) val map_opt : ('a -> 'b option) -> 'a list -> 'b list (* [new_encode_decode capacity] creates a new service for assigning unique integer codes to strings. [capacity] is the initial capacity of the internal hash table. [new_encode_decode] returns a triple [encode, decode, verbose], where [encode] and [decode] translate between strings and unique integer codes and [verbose] prints statistics about the use of the service so far. *) val new_encode_decode: int -> (string -> int) * (int -> string) * (unit -> unit) (* [new_claim()] creates a new service for claiming names. It returns a function [claim] of type [int -> unit] such that the call [claim x] succeeds if and only if [claim x] has never been called before. *) val new_claim: unit -> (string -> unit) (* If [preferable] is a partial order on elements, then [best preferable xs] returns the best (least) element of [xs], if there is one. Its complexity is quadratic. *) val best: ('a -> 'a -> bool) -> 'a list -> 'a option (* Assuming that the list [xs] is sorted with respect to the ordering [cmp], [levels cmp xs] is the list of levels of [xs], where a level is a maximal run of adjacent equal elements. Every level is a nonempty list. *) val levels: ('a -> 'a -> int) -> 'a list -> 'a list list (* Suppose [xs] is an arbitrary list of elements. Then [trim (<=) xs] is the sublist of the elements of [xs] that are maximal with respect to the partial order [<=]. In other words, it is a sublist where every element that is less than some other element has been removed. 
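The `levels` function documented above splits a sorted list into its maximal runs of adjacent equal elements; every level is a nonempty list. A runnable sketch mirroring the two mutually threaded definitions in `src/misc.ml`:

```ocaml
(* [levels1 cmp x1 xs] returns the first level (containing [x1]) and the
   remaining levels of [x1 :: xs], assuming the list is sorted by [cmp]. *)
let rec levels1 cmp x1 = function
  | [] -> [x1], []
  | x2 :: xs ->
      let ys1, yss = levels1 cmp x2 xs in
      if cmp x1 x2 = 0 then x1 :: ys1, yss else [x1], ys1 :: yss

(* [levels cmp xs] is the list of levels of the sorted list [xs]. *)
let levels cmp = function
  | [] -> []
  | x1 :: xs -> let ys1, yss = levels1 cmp x1 xs in ys1 :: yss

let () =
  assert (levels compare [1; 1; 2; 3; 3; 3] = [[1; 1]; [2]; [3; 3; 3]]);
  assert (levels compare [] = ([] : int list list))
```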
*) val trim: ('a -> 'a -> bool) -> 'a list -> 'a list (* Assuming that the list [xs] is sorted with respect to the ordering [cmp], [dup cmp xs] returns a duplicate element of the list [xs], if one exists. *) val dup: ('a -> 'a -> int) -> 'a list -> 'a option (* [once x y] produces a function [f] which produces [x] the first time it is called and produces [y] forever thereafter. *) val once: 'a -> 'a -> (unit -> 'a) (* Equality and hashing for lists, parameterized over equality and hashing for elements. *) module ListExtras : sig val equal: ('a -> 'a -> bool) -> 'a list -> 'a list -> bool val hash: ('a -> int) -> 'a list -> int end (* A nice way of printing [n] in English, for concrete values of [n]. *) val count: int -> string (* A nice way of printing "nth" in English, for concrete values of [n]. *) val nth: int -> string (* [Array.for_all] *) val array_for_all : ('a -> bool) -> 'a array -> bool menhir-20200123/src/newRuleSyntax.ml000066400000000000000000000430001361226111300171710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax (* Because the main function, [NewRuleSyntax.rule], is called by the stage 2 parser (fancy-parser) and nowhere else, this file is type-checked and compiled only at stage 2, not at stage 1. Be sure to run [make bootstrap]. *) (* -------------------------------------------------------------------------- *) (* -------------------------------------------------------------------------- *) (* Function composition. 
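`newRuleSyntax.ml` opens with a left-to-right function composition operator, `(>>)`, so that `(f >> g) x` is `g (f x)` — the opposite argument order of the usual mathematical composition. A tiny sketch showing its behavior:

```ocaml
(* Left-to-right function composition, as defined in newRuleSyntax.ml:
   data flows through [f] first, then [g]. *)
let (>>) f g x = g (f x)

let () =
  let succ_then_double = succ >> (fun n -> 2 * n) in
  assert (succ_then_double 3 = 8)   (* (3 + 1) * 2 *)
```

This ordering reads naturally in pipelines such as `resolve_puns >> check_linearity` further down in the file.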
*) let (>>) f g x = g (f x) (* -------------------------------------------------------------------------- *) (* Constructors for the new rule syntax. *) (* [return pos x] is the semantic action [{x}]. *) let return pos x : seq_expression = let action = Action.from_il_expr (IL.EVar x) in let raw_action _ _ = action in Positions.with_pos pos (EAction (XATraditional raw_action, None)) (* -------------------------------------------------------------------------- *) (* -------------------------------------------------------------------------- *) (* Converting the new syntax to the old syntax. *) (* The new syntax is organized in several levels: choice expressions, sequence expressions, symbol expressions, action expressions. The code below reflects this organization. *) (* -------------------------------------------------------------------------- *) (* When a [~] pattern appears at the top level in [~ = e1; e2] and furthermore the expression [e1] is a symbol [x1], then this is considered a pun -- that is, [~] is sugar for [x1]. We resolve these puns in a first pass, before we check that patterns are linear, so a linearity violation that involves [~] will be correctly caught. *) (* There can still remain [~] patterns after puns are resolved, but they stand for fresh variables and cannot cause a linearity violation. *) let rec resolve_puns (e : seq_expression) : seq_expression = Positions.map (fun e -> match e with | ECons (SemPatTilde pos, (ESymbol (x1, [], _) as e1), e2) when ParserAux.valid_ocaml_identifier x1 -> (* This is a pun. Resolve it. *) let x1 = Positions.with_pos pos (Positions.value x1) in (* optional *) ECons (SemPatVar x1, e1, resolve_puns e2) | ECons (p1, e1, e2) -> ECons (p1, e1, resolve_puns e2) | ESingleton _ | EAction _ -> e ) e (* -------------------------------------------------------------------------- *) (* Checking that a new-syntax pattern is linear, i.e., that no variable is bound twice. 
*) (* We first build a mapping of variables to the places where they are bound, then check that every list in the image of this mapping is a singleton list. *) let check_linearity (ps : pattern list) = let rec build (m : Positions.positions StringMap.t) (p : pattern) = match p with | SemPatVar x -> let x, pos = Positions.decompose x in StringMap.multiple_add x pos m | SemPatWildcard | SemPatTilde _ -> m | SemPatTuple ps -> List.fold_left build m ps in let m = List.fold_left build StringMap.empty ps in StringMap.iter (fun x positions -> if List.length positions > 1 then Error.error positions "The variable %s is bound several times." x ) m let rec patterns (e : seq_expression) : pattern list = let e = Positions.value e in match e with | ECons (p, _, e) -> p :: patterns e | ESingleton _ | EAction _ -> [] let check_linearity : seq_expression -> unit = patterns >> check_linearity (* -------------------------------------------------------------------------- *) (* Determining whether a pattern contains a [~] subpattern. *) let rec tilde_used positions (p : pattern) = match p with | SemPatVar _ | SemPatWildcard -> positions | SemPatTilde pos -> pos :: positions | SemPatTuple ps -> List.fold_left tilde_used positions ps (* Emitting a warning when a [~] subpattern has been used but the sequence expression ends in something other than a point-free semantic action. *) let tilde_used_warning positions = let n = List.length positions in if n > 0 then let variables_have, tpatterns, wpatterns = if n = 1 then "variable has", "a ~ pattern", "a wildcard pattern _" else "variables have", "~ patterns", "wildcard patterns _" in Error.warning positions "%s nameless %s been introduced by %s,\n\ yet this sequence does not end in a point-free semantic action <...>.\n\ Perhaps %s should be used instead." (Misc.count n) variables_have tpatterns wpatterns (* -------------------------------------------------------------------------- *) (* Converting a new-syntax pattern to an IL pattern. 
*) (* Here, [x1] is the variable that holds the semantic value; it is typically named [_1], [_2], etc. When we encounter a [~] pattern, we convert it to a fresh name, using [x1] as a basis in the generation of this fresh name. *) let pattern (x1 : identifier) (p : pattern) : IL.pattern = let c = ref 1 in let fresh () = Printf.sprintf "%s_%d" x1 (Misc.postincrement c) in let rec pattern p = match p with | SemPatVar x -> IL.PVar (Positions.value x) | SemPatWildcard -> IL.PWildcard | SemPatTilde _ -> IL.PVar (fresh()) | SemPatTuple [] -> IL.PUnit | SemPatTuple [p] -> pattern p | SemPatTuple ps -> IL.PTuple (List.map pattern ps) in pattern p (* [bv accu p] accumulates the bound variables of a pattern [p] produced by the above function. The ordering is significant; variables must be accumulated left to right (so we get a reversed list). *) let rec bv accu p = match p with | IL.PVar x -> x :: accu | IL.PWildcard -> accu | IL.PUnit -> accu | IL.PTuple ps -> List.fold_left bv accu ps | _ -> (* Impossible; not produced above. *) assert false (* -------------------------------------------------------------------------- *) (* Extracting the attributes of a symbol expression. *) let attributes (e : symbol_expression) : attributes = match e with | ESymbol (_, _, attrs) -> attrs (* -------------------------------------------------------------------------- *) (* As we descend into a sequence expression and prepare to build a production, we maintain a context of the following form. *) type context = { (* The position of the complete sequence expression. *) pos: Positions.t; (* The prefix of the production's right-hand side that has been built so far. This is reversed list of producers. Every producer carries an identifier, which is either user-supplied or auto-generated. *) producers: producer list; (* The user-supplied names under which the semantic values are known. 
Not every semantic value has such a name, as the user can supply no pattern, a wildcard pattern, or a tuple pattern; in either of these cases, there is no name for the semantic value. This is a reversed list. Its length is equal to the length of the list [producers] above. *) uxs: identifier option list; (* A set of (independent) bindings that must be wrapped around the semantic action. These are typically bindings of the form [let p = x in ...]. *) bindings: action -> action; (* A tuple of variables that the user has bound, either explicitly or via the [~] notation. This is a reversed list. Its length is unrelated to the length of the above lists, because one semantic value can be matched against a pattern that binds zero, one, or more variables. Once complete, this tuple becomes the argument to a point-free semantic action. *) tuple: identifier list; (* A list of positions indicating where [~] patterns appear. This flag is maintained as we descend into a [seq_expression] whose puns have been resolved already. Thus, when this list is nonempty, we expect that this [seq_expression] ends with a point-free semantic action; otherwise, there is no point in using [~], and the user could have used [_] instead. We issue a warning if the [seq_expression] does not end with a point-free semantic action. *) tilde_used: Positions.positions; } (* The empty context. *) let empty pos : context = { pos; producers = []; uxs = []; bindings = (fun a -> a); tuple = []; tilde_used = []; } (* Recording a user-supplied identifier. *) let user (x : identifier located) : identifier option = Some (Positions.value x) let auto : identifier option = None (* Accessing the producers. *) let producers context : producer list = List.rev context.producers (* Accessing the user-supplied identifiers. *) let uxs context : identifier option array = Array.of_list (List.rev context.uxs) (* Accessing the tuple. 
*) let tuple context : identifier list = List.rev context.tuple (* -------------------------------------------------------------------------- *) (* OCaml variables for semantic values. *) (* We do not use a fresh name generator. Instead, we use our [context] to generate names of the form [_1], [_2], etc., corresponding to the current index in the production that we are building. *) let semvar context : identifier = let i = List.length context.producers + 1 in Printf.sprintf "_%d" i (* -------------------------------------------------------------------------- *) (* Converting a symbol expression to a parameter. *) let rec parameter (e : symbol_expression) : parameter = match e with | ESymbol (sym, args, _attrs) -> (* Converting a symbol expression is easy. Note, however, that the arguments [args] are arbitrary expressions. *) Parameters.app sym (List.map nested_parameter args) (* Converting an arbitrary expression to a parameter. *) and nested_parameter (e : expression) : parameter = match Positions.value e with | EChoice [ Branch ({ Positions.value = ESingleton e }, _) ] -> (* A choice with only one branch, whose body is a trivial sequence consisting of just a symbol, is simplified on the fly. This is important, as it allows us to avoid falling into the default case below, where an anonymous rule is generated. E.g., when we have an application [f(x)] of a parameterized symbol [f] to a symbol [x], we don't want an anonymous rule to be generated for [x]. That would be wasteful and (more seriously) could cause the grammar-expansion-termination check to fail. *) parameter e | EChoice _ -> (* A choice expression is converted to an anonymous rule. *) let pos = Positions.position e in ParameterAnonymous (Positions.with_pos pos (productions e)) (* Converting the head of a sequence, a pair [x = e1] of a variable [x] and a symbol expression [e1], to a producer. 
*) and producer x (e1 : symbol_expression) : producer = x, parameter e1, attributes e1 (* Converting the head of a sequence, a pair [p = e1] of a pattern [p] and a symbol expression [e1], to a context extension. *) and extend (p : pattern) (e1 : symbol_expression) (context : context) : context = match p with | SemPatVar x1 -> (* The variable [x1] is bound to the semantic value of [e1]. *) (* Puns have been resolved already, so they are handled by this code. *) { pos = context.pos; producers = producer x1 e1 :: context.producers; uxs = user x1 :: context.uxs; bindings = context.bindings; tuple = Positions.value x1 :: context.tuple; tilde_used = context.tilde_used } | _ -> (* An arbitrary pattern [p] is used. We bind a variable [x1] to the semantic value of [e1], and wrap the semantic action in a binding [let p = x1 in ...]. Any [~] subpatterns within [p] are translated to fresh identifiers. *) let x1 = semvar context in let tilde_used = tilde_used context.tilde_used p in let p : IL.pattern = pattern x1 p in let binding = Action.bind p x1 in let x1 = Positions.unknown_pos x1 in { pos = context.pos; producers = producer x1 e1 :: context.producers; uxs = auto :: context.uxs; bindings = binding >> context.bindings; tuple = bv context.tuple p; tilde_used } (* Converting a sequence expression to a production. *) and production_aux (context : context) (e : seq_expression) (level : branch_production_level) : parameterized_branch = let e, pos = Positions.decompose e in match e with | ECons (p, e1, e2) -> (* A sequence expression [p = e1; e2]. Based on [p] and [e1], extend the context, then continue with [e2]. *) production_aux (extend p e1 context) e2 level | EAction (XATraditional raw_action, prec) -> (* An action expression. This is the end of the sequence. *) tilde_used_warning context.tilde_used; (* Check that the semantic action seems well-formed. *) let action = raw_action Settings.DollarsDisallowed (uxs context) in (* Build and return a complete production. 
*) { pr_branch_position = context.pos; pr_producers = producers context; pr_action = context.bindings action; pr_branch_prec_annotation = prec; pr_branch_production_level = level; } | EAction (XAPointFree oid, prec) -> (* A point-free semantic action, containing an OCaml identifier [id] between angle brackets. This is syntactic sugar for a traditional semantic action containing an application of [id] to a tuple of the semantic values that have been assigned a name by the user. *) (* As a special case, if [oid] is [None], then we must not build an application node -- we simply build a tuple. *) (* [id] is actually a stretch, not just a string, and this matters when there is an OCaml error (e.g., [id] is undeclared, or ill-typed). The stretch contains source code location information which allows the error to be reported in the source file. *) (* Build the tuple as an IL expression. *) let evar x = IL.EVar x in let evars xs = List.map evar xs in let tuple = CodeBits.etuple (evars (tuple context)) in (* Build an application of [id] to this tuple. *) (* We abuse the abstract syntax of IL and build an application node, regardless of whether [id] a (possibly qualified) value, a (possibly qualified) data constructor, a polymorphic variant constructor, etc. *) let e = match oid with | Some id -> IL.EApp (IL.ETextual id, [tuple]) | None -> tuple in (* Build a traditional semantic action. *) let action = Action.from_il_expr e in let raw_action _ _ = action in let e = EAction (XATraditional raw_action, prec) in let e = Positions.with_pos pos e in (* Reset [tilde_used], to avoid triggering the warning via our recursive call. *) let context = { context with tilde_used = [] } in (* Done. *) production_aux context e level | ESingleton e -> tilde_used_warning context.tilde_used; (* When a symbol expression [e] appears as the last element of a sequence, this is understood as syntactic sugar for [x = e; {x}], where [x] is a fresh variable. 
*) (* Another option would be to view it as sugar for [~ = e; <>]. That would also make sense, but would depart from the lambda-calculus convention that in a sequence [e1; e2; e3] only the value of the last expression is returned. *) (* No %prec annotation can be supplied when this sugar is used. *) let x = semvar context in let e = ECons (SemPatVar (Positions.with_pos pos x), e, return pos x) in let e = Positions.with_pos pos e in production_aux context e level and production (Branch (e, level) : branch) = let e = resolve_puns e in check_linearity e; let pos = Positions.position e in production_aux (empty pos) e level and productions (e : expression) : parameterized_branch list = match Positions.value e with | EChoice branches -> List.map production branches (* -------------------------------------------------------------------------- *) (* Converting a new rule to an old rule. *) let rule (rule : rule) : parameterized_rule = { pr_public_flag = rule.rule_public; pr_inline_flag = rule.rule_inline; pr_nt = Positions.value rule.rule_lhs; pr_positions = [ Positions.position rule.rule_lhs ]; pr_attributes = rule.rule_attributes; pr_parameters = List.map Positions.value rule.rule_formals; pr_branches = productions rule.rule_rhs }

menhir-20200123/src/newRuleSyntax.mli

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax (* The new rule syntax is desugared to the old rule syntax.
The translation exploits anonymous rules, so it must be performed before anonymous rules are eliminated. *) val rule: rule -> parameterized_rule

menhir-20200123/src/nonterminalType.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax open IL (* This is the conventional name of the nonterminal GADT, which describes the nonterminal symbols. *) let tcnonterminalgadt = "nonterminal" let tnonterminalgadt a = TypApp (tcnonterminalgadt, [ a ]) (* This is the conventional name of the data constructors of the nonterminal GADT. *) let tnonterminalgadtdata nt = "N_" ^ Misc.normalize nt (* This is the definition of the nonterminal GADT. Here, the data constructors have no value argument, but have a type index. *) exception MissingOCamlType of string let nonterminalgadtdef grammar = assert Settings.inspection; let comment, datadefs = try (* The ordering of this list matters. We want the data constructors to respect the internal ordering (as determined by [nonterminals] in [BasicSyntax]) of the nonterminal symbols. This may be exploited in the table back-end to allow an unsafe conversion of a data constructor to an integer code. See [n2i] in [InspectionTableInterpreter].
*) "The indexed type of nonterminal symbols.", List.map (fun nt -> let index = match ocamltype_of_symbol grammar nt with | Some t -> TypTextual t | None -> raise (MissingOCamlType nt) in { dataname = tnonterminalgadtdata nt; datavalparams = []; datatypeparams = Some [ index ] } ) (nonterminals grammar) with MissingOCamlType nt -> (* If the type of some nonterminal symbol is unknown, give up and define ['a nonterminal] as an abstract type. This is useful when we are in [--(raw)-depend] mode and we do not wish to fail. Instead, we produce a mock [.mli] file that is an approximation of the real [.mli] file. When we are not in [--(raw)-depend] mode, though, this is a problem. We display an error message and stop. *) Settings.(match infer with | IMDependRaw | IMDependPostprocess -> "The indexed type of nonterminal symbols (mock!).", [] | IMNone -> Error.error [] "\ the type of the nonterminal symbol %s is unknown.\n\ When --inspection is set, the type of every nonterminal symbol must be known.\n\ Please enable type inference (see --infer and --infer-read-reply)\n\ or specify the type of every symbol via %%type declarations." nt | IMInfer | IMReadReply _ -> (* This should not happen: when [--infer] or [--infer-read-reply] is set, the types of all nonterminal symbols should be known. *) assert false | IMWriteQuery _ -> (* This should not happen: when [--infer-write-query] is set, we write a mock [.ml] file, but no [.mli] file, so this function should never be called. *) assert false) in [ IIComment comment; IITypeDecls [{ typename = tcnonterminalgadt; typeparams = [ "_" ]; typerhs = TDefSum datadefs; typeconstraint = None }] ]

menhir-20200123/src/nonterminalType.mli

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria.
All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module deals with the definition of the type that describes the nonterminal symbols. *) (* This is the conventional name of the [nonterminal] GADT. This is an indexed type (i.e., it has one type parameter). Its data constructors carry zero value arguments. *) val tcnonterminalgadt: string val tnonterminalgadt: IL.typ -> IL.typ (* [tnonterminalgadtdata nt] is the conventional name of the data constructor associated with the non-terminal symbol [nt]. *) val tnonterminalgadtdata: string -> string (* This is the definition of the [nonterminal] GADT, for use by the code generators. This definition can be constructed only if the type of every nonterminal symbol is known, either because the user has provided this information, or because [--infer] has been set and inference has been performed already. This definition is produced only in [--inspection] mode. *) val nonterminalgadtdef: BasicSyntax.grammar -> IL.interface (* When in [--(raw-)depend] mode, we are asked to produce a mock [.mli] file before [--infer] has run, which means that we are usually not able to construct the definition of the [nonterminal] GADT. This implies that the mock [.mli] file is a subset of the final [.mli] file. I believe that, when working with [ocamlbuild], this is not a problem. In fact, the mock [.mli] file could just as well be empty or absent, and things would still work: in principle, it is enough for us to publish which files we need in order to be able to type-check the real [.ml] file used by [--infer]. However, when working with [make], which is unable to mix the production of targets and the computation of dependencies, we additionally need to predict which files one will need in order to compile the real [.mli] and [.ml] files. 
Here, the fact that the mock [.mli] file is incomplete could in theory be a problem, leading to incomplete dependencies. The problem does not lie in the line [parser.ml parser.mli: ...] that we add; it lies in the lines produced by [ocamldep] itself, where the line [parser.cmi: ...] is missing some dependencies. *)

menhir-20200123/src/option.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) let map f = function | None -> None | Some x -> Some (f x) let iter f o = match o with | None -> () | Some x -> f x let fold f o accu = match o with | None -> accu | Some x -> f x accu let project = function | Some x -> x | None -> (* Presumably, an error message has already been printed. *) exit 1 let equal equal o1 o2 = match o1, o2 with | None, None -> true | Some x1, Some x2 -> equal x1 x2 | None, Some _ | Some _, None -> false let hash hash = function | Some x -> hash x | None -> Hashtbl.hash None

menhir-20200123/src/option.mli

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE.
*) (* *) (******************************************************************************) val map: ('a -> 'b) -> 'a option -> 'b option val iter: ('a -> unit) -> 'a option -> unit val fold: ('a -> 'b -> 'b) -> 'a option -> 'b -> 'b val project: 'a option -> 'a (* careful: calls [exit 1] in case of failure *) val equal: ('a -> 'b -> bool) -> 'a option -> 'b option -> bool val hash: ('a -> int) -> 'a option -> int

menhir-20200123/src/parameters.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* TEMPORARY clean up and write an .mli file *) open Syntax open Positions let app p ps = match ps with | [] -> ParameterVar p | _ -> ParameterApp (p, ps) let unapp = function | ParameterVar x -> (x, []) | ParameterApp (p, ps) -> (p, ps) | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on. *) assert false let unvar = function | ParameterVar x -> x | ParameterApp _ | ParameterAnonymous _ -> assert false let rec map f = function | ParameterVar x -> ParameterVar (f x) | ParameterApp (p, ps) -> ParameterApp (f p, List.map (map f) ps) | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on. *) assert false let rec fold f init = function | ParameterVar x -> f init x | ParameterApp (p, ps) -> f (List.fold_left (fold f) init ps) p | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on.
*) assert false let identifiers m p = fold (fun accu x -> StringMap.add x.value x.position accu) m p let rec occurs (x : symbol) (p : parameter) = match p with | ParameterVar y -> x = y.value | ParameterApp (y, ps) -> x = y.value || List.exists (occurs x) ps | ParameterAnonymous _ -> assert false let occurs_shallow (x : symbol) (p : parameter) = match p with | ParameterVar y -> x = y.value | ParameterApp (y, _) -> assert (x <> y.value); false | ParameterAnonymous _ -> assert false let occurs_deep (x : symbol) (p : parameter) = match p with | ParameterVar _ -> false | ParameterApp (_, ps) -> List.exists (occurs x) ps | ParameterAnonymous _ -> assert false type t = parameter let rec equal x y = match x, y with | ParameterVar x, ParameterVar y -> x.value = y.value | ParameterApp (p1, p2), ParameterApp (p1', p2') -> p1.value = p1'.value && List.for_all2 equal p2 p2' | _ -> (* Anonymous rules are eliminated early on. *) false let hash = function | ParameterVar x | ParameterApp (x, _) -> Hashtbl.hash (Positions.value x) | ParameterAnonymous _ -> (* Anonymous rules are eliminated early on. *) assert false let position = function | ParameterVar x | ParameterApp (x, _) -> Positions.position x | ParameterAnonymous bs -> Positions.position bs let with_pos p = Positions.with_pos (position p) p let rec print with_spaces = function | ParameterVar x | ParameterApp (x, []) -> x.value | ParameterApp (x, ps) -> let separator = if with_spaces then ", " else "," in Printf.sprintf "%s(%s)" x.value (Misc.separated_list_to_string (print with_spaces) separator ps) | ParameterAnonymous _ -> assert false

menhir-20200123/src/parserAux.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved.
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Positions open Stretch open Syntax type early_producer = Positions.t * identifier located option * parameter * attributes type early_producers = early_producer list type early_production = early_producers * string located option * (* optional precedence *) branch_production_level * Positions.t type early_productions = early_production list let new_precedence_level = let c = ref 0 in fun (pos1, pos2) -> incr c; PrecedenceLevel (InputFile.get_input_file (), !c, pos1, pos2) let new_production_level = let c = ref 0 in fun () -> incr c; ProductionLevel (InputFile.get_input_file (), !c) let new_on_error_reduce_level = new_production_level (* the counter is shared with [new_production_level], but this is irrelevant *) module IdSet = Set.Make (struct type t = identifier located let compare id1 id2 = compare (value id1) (value id2) end) let defined_identifiers (_, ido, _, _) accu = Option.fold IdSet.add ido accu let defined_identifiers (producers : early_producers) = List.fold_right defined_identifiers producers IdSet.empty let check_production_group (right_hand_sides : early_productions) = match right_hand_sides with | [] -> (* A production group cannot be empty. *) assert false | (producers, _, _, _) :: right_hand_sides -> let ids = defined_identifiers producers in List.iter (fun (producers, _, _, _) -> let ids' = defined_identifiers producers in try let id = IdSet.choose (IdSet.union (IdSet.diff ids ids') (IdSet.diff ids' ids)) in Error.error [Positions.position id] "two productions that share a semantic action must define exactly\n\ the same identifiers. Here, \"%s\" is defined\n\ in one production, but not in all of them." 
(Positions.value id) with Not_found -> () ) right_hand_sides (* [normalize_producer i p] assigns a name of the form [_i] to the unnamed producer [p]. *) let normalize_producer i (pos, opt_identifier, parameter, attrs) = let id = match opt_identifier with | Some id -> id | None -> Positions.with_pos pos ("_" ^ string_of_int (i + 1)) in (id, parameter, attrs) let normalize_producers (producers : early_producers) : producer list = List.mapi normalize_producer producers let override pos o1 o2 = match o1, o2 with | Some _, Some _ -> Error.error [ pos ] "this production carries two %%prec declarations." | None, Some _ -> o2 | _, None -> o1 (* Only unnamed producers can be referred to using positional identifiers. Besides, such positions must be taken in the interval [1 .. List.length producers]. The output array [p] is such that [p.(idx) = Some x] if [idx] must be referred to using [x], not [$(idx + 1)]. *) let producer_names (producers : early_producers) = producers |> List.map (fun (_, oid, _, _) -> Option.map Positions.value oid) |> Array.of_list (* Check that a stretch contains an OCaml lowercase or uppercase identifier, and convert this stretch to a string. The stretch may be empty, too. *) let validate_pointfree_action (ty : ocamltype) : Stretch.t option = match ty with | Inferred _ -> assert false | Declared stretch -> let s = stretch.stretch_raw_content in if Lexpointfree.validate_pointfree_action (Lexing.from_string s) then Some stretch else None (* Test whether a string is a valid OCaml lowercase identifier. *) (* [x] should be a LID, UID, or QID, produced by Menhir's main lexer. Testing its first character should suffice, but we are more cautious and validate it thoroughly. 
*) let valid_ocaml_identifier (x : identifier located) : bool = Lexpointfree.valid_ocaml_identifier (Lexing.from_string (Positions.value x))

menhir-20200123/src/parserAux.mli

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module provides utilities that are shared by the two versions of the parser. *) open Stretch open Syntax (* A few types used in the parser. *) type early_producer = Positions.t * identifier located option * parameter * attributes type early_producers = early_producer list type early_production = early_producers * string located option * (* optional precedence *) branch_production_level * Positions.t type early_productions = early_production list (* [new_precedence_level (pos1, pos2)] creates a new precedence level, which is stronger than any levels previously created by this function. It should be called every time a [%left], [%right], or [%nonassoc] declaration is found. The positions are the positions of this declaration in the source code. The precedence levels created by this function are attached to tokens and (via %prec) to productions. They are used in solving shift/reduce and shift/reduce/reduce conflicts. *) val new_precedence_level: Lexing.position * Lexing.position -> precedence_level (* [new_production_level()] creates a new production level, which is stronger than any levels previously created by this function. It should be called every time a new production is found.
The production levels created by this function are attached to productions. They are used in solving reduce/reduce conflicts: following ocamlyacc and bison, the production that appears first in the grammar receives preference. It may seem very strange that %prec annotations do not influence this process, but that's how it is, at least for the moment. *) val new_production_level: unit -> branch_production_level (* [new_on_error_reduce_level()] creates a new level, which is attached to an [%on_error_reduce] declaration. *) val new_on_error_reduce_level: unit -> on_error_reduce_level (* [check_production_group] accepts a production group and checks that all productions in the group define the same set of identifiers. *) val check_production_group: early_productions -> unit (* [normalize_producers] accepts a list of producers where identifiers are optional and returns a list of producers where identifiers are mandatory. A missing identifier in the [i]-th position receives the conventional name [_i]. *) val normalize_producers: early_producers -> producer list (* [override pos oprec1 oprec2] decides which of the two optional %prec declarations [oprec1] and [oprec2] applies to a production. It signals an error if the two are present. *) val override: Positions.t -> 'a option -> 'a option -> 'a option (* [producer_names producers] returns an array [names] such that [names.(idx) = None] if the (idx + 1)-th producer is unnamed and [names.(idx) = Some id] if it is called [id]. *) val producer_names: early_producers -> identifier option array (* Check that a stretch represents valid content for a point-free semantic action, i.e., either just whitespace, or an OCaml lowercase or uppercase identifier. May raise [Lexpointfree.InvalidPointFreeAction]. *) val validate_pointfree_action: ocamltype -> Stretch.t option (* Test whether a string is a valid OCaml lowercase identifier. 
*) val valid_ocaml_identifier: identifier located -> bool

menhir-20200123/src/partialGrammar.ml

(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax open Positions (* ------------------------------------------------------------------------- *) (* This adds one declaration [decl], as found in file [filename], to the grammar [grammar]. *) let join_declaration filename (grammar : grammar) decl = match decl.value with (* Preludes are stored in an arbitrary order. The order of preludes within a single source file is preserved. Same treatment for functor parameters. *) | DCode code -> { grammar with p_preludes = grammar.p_preludes @ [ code ] } | DParameter (Stretch.Declared stretch) -> { grammar with p_parameters = grammar.p_parameters @ [ stretch ] } | DParameter (Stretch.Inferred _) -> assert false (* Token declarations are recorded. Things are made somewhat difficult by the fact that %token and %left-%right-%nonassoc declarations are independent. *) (* Declarations of token aliases are lost at this point. *) | DToken (ocamltype, terminal, _alias, attributes) -> let token_property = try (* Retrieve any previous definition for this token. *) let token_property = StringMap.find terminal grammar.p_tokens in (* If the previous definition was actually a %token declaration (as opposed to a %left, %right, or %nonassoc specification), signal an error. *) if token_property.tk_is_declared then Error.errorp decl "the token %s has multiple definitions."
terminal; (* Otherwise, update the previous definition. *) { token_property with tk_is_declared = true; tk_ocamltype = ocamltype; tk_filename = filename; tk_position = decl.position; tk_attributes = attributes; } with Not_found -> (* If no previous definition exists, create one. *) { tk_filename = filename; tk_ocamltype = ocamltype; tk_associativity = UndefinedAssoc; tk_precedence = UndefinedPrecedence; tk_position = decl.position; tk_attributes = attributes; tk_is_declared = true } in { grammar with p_tokens = StringMap.add terminal token_property grammar.p_tokens } (* Start symbols. *) | DStart nonterminal -> { grammar with p_start_symbols = StringMap.add nonterminal decl.position grammar.p_start_symbols } (* Type declarations for nonterminals. *) | DType (ocamltype, nonterminal) -> { grammar with p_types = (nonterminal, with_pos (position decl) ocamltype)::grammar.p_types } (* Reductions on error for nonterminals. *) | DOnErrorReduce (nonterminal, prec) -> { grammar with p_on_error_reduce = (nonterminal, prec) :: grammar.p_on_error_reduce } (* Token associativity and precedence. *) | DTokenProperties (terminal, assoc, prec) -> (* Retrieve the property record for this token, creating one if none existed (but without deeming the token to have been declared). *) let token_properties, grammar = try StringMap.find terminal grammar.p_tokens, grammar with Not_found -> let p = { tk_filename = filename; tk_ocamltype = None; tk_associativity = UndefinedAssoc; tk_precedence = prec; tk_is_declared = false; tk_attributes = []; (* Will be updated later. *) tk_position = decl.position; } in p, { grammar with p_tokens = StringMap.add terminal p grammar.p_tokens } in (* Reject duplicate precedence declarations. *) if token_properties.tk_associativity <> UndefinedAssoc then Error.error [ decl.position; token_properties.tk_position ] "there are multiple precedence declarations for token %s." terminal; (* Record the new declaration. 
*) token_properties.tk_precedence <- prec; token_properties.tk_associativity <- assoc; grammar | DGrammarAttribute attr -> { grammar with p_grammar_attributes = attr :: grammar.p_grammar_attributes } | DSymbolAttributes (actuals, attrs) -> { grammar with p_symbol_attributes = (actuals, attrs) :: grammar.p_symbol_attributes } (* ------------------------------------------------------------------------- *) (* This stores an optional postlude into a grammar. Postludes are stored in an arbitrary order. *) let join_postlude postlude grammar = match postlude with | None -> grammar | Some postlude -> { grammar with p_postludes = postlude :: grammar.p_postludes } (* ------------------------------------------------------------------------- *) (* We rewrite definitions when nonterminals are renamed. The renaming [phi] is an association list of names to names. *) type renaming = (nonterminal * nonterminal) list let identity_renaming = [] let rewrite_nonterminal (phi : renaming) nonterminal = Misc.support_assoc phi nonterminal let rewrite_parameter phi parameter = Parameters.map (Positions.map (Misc.support_assoc phi)) parameter let rewrite_producer phi ((ido, parameter, attrs) : producer) = ido, rewrite_parameter phi parameter, attrs let rewrite_branch phi ({ pr_producers = producers } as branch) = { branch with pr_producers = List.map (rewrite_producer phi) producers } let rewrite_branches phi branches = match phi with | [] -> branches | _ -> List.map (rewrite_branch phi) branches let fresh_counter = ref 0 let names = ref StringSet.empty let use_name name = names := StringSet.add name !names let used_name name = StringSet.mem name !names let rec fresh ?(hint = "v") () = let name = incr fresh_counter; hint ^ string_of_int !fresh_counter in if used_name name then fresh ~hint () else ( use_name name; name ) (* Alpha conversion of [prule]. We rename bound parameters using fresh names. 
*) let alphaconvert_rule parameters prule = let phi = List.combine parameters (List.map (fun x -> fresh ~hint:x ()) parameters) in { prule with pr_parameters = List.map (Misc.support_assoc phi) prule.pr_parameters; pr_branches = rewrite_branches phi prule.pr_branches } (* Rewrite a rule taking bound names into account. We rename parameters to avoid capture. *) let rewrite_rule phi prule = let ids = List.fold_left (fun acu (f, d) -> StringSet.add f (StringSet.add d acu)) StringSet.empty phi in let captured_parameters = List.filter (fun p -> StringSet.mem p ids) prule.pr_parameters in let prule = alphaconvert_rule captured_parameters prule in { prule with pr_nt = rewrite_nonterminal phi prule.pr_nt; pr_branches = rewrite_branches phi prule.pr_branches } let rewrite_rules phi rules = List.map (rewrite_rule phi) rules let rewrite_grammar phi grammar = (* We assume that [phi] affects only private symbols, so it does not affect the start symbols. *) if phi = identity_renaming then grammar else { grammar with pg_rules = rewrite_rules phi grammar.pg_rules } (* ------------------------------------------------------------------------- *) (* To rename (internalize) a nonterminal, we prefix it with its filename. This guarantees that names are unique. *) let is_valid_nonterminal_character = function | 'A' .. 'Z' | 'a' .. 'z' | '_' | '\192' .. '\214' | '\216' .. '\246' | '\248' .. '\255' | '0' .. 
'9' -> true | _ -> false let restrict filename = let m = Bytes.of_string (Filename.chop_suffix filename (if Settings.coq then ".vy" else ".mly")) in for i = 0 to Bytes.length m - 1 do if not (is_valid_nonterminal_character (Bytes.get m i)) then Bytes.set m i '_' done; Bytes.unsafe_to_string m let rename nonterminal filename = let name = restrict filename ^ "_" ^ nonterminal in if used_name name then fresh ~hint:name () else (use_name name; name) (* ------------------------------------------------------------------------- *) type symbol_kind = (* The nonterminal is declared public at a particular position. *) | PublicNonTerminal of Positions.t (* The nonterminal is declared (nonpublic) at a particular position. *) | PrivateNonTerminal of Positions.t (* The symbol is a token. *) | Token of token_properties (* We do not know yet what the symbol means. This is defined in the sequel or it is free in the partial grammar. *) | DontKnow of Positions.t type symbol_table = (symbol, symbol_kind) Hashtbl.t let find_symbol (symbols : symbol_table) symbol = Hashtbl.find symbols symbol let add_in_symbol_table (symbols : symbol_table) symbol kind = use_name symbol; Hashtbl.add symbols symbol kind; symbols let replace_in_symbol_table (symbols : symbol_table) symbol kind = Hashtbl.replace symbols symbol kind; symbols let empty_symbol_table () : symbol_table = Hashtbl.create 13 let store_symbol (symbols : symbol_table) symbol kind = match find_symbol symbols symbol, kind with (* The symbol is not known so far. Add it. *) | exception Not_found -> add_in_symbol_table symbols symbol kind (* There are two definitions of this symbol in one grammatical unit (that is, one .mly file), and at least one of them is private. This is forbidden. 
*) | PrivateNonTerminal p, PrivateNonTerminal p' | PublicNonTerminal p, PrivateNonTerminal p' | PrivateNonTerminal p, PublicNonTerminal p' -> Error.error [ p; p'] "the nonterminal symbol %s is multiply defined.\n\ Only %%public symbols can have split definitions." symbol (* The symbol is known to be a token but declared as a nonterminal.*) | Token tkp, (PrivateNonTerminal p | PublicNonTerminal p) | (PrivateNonTerminal p | PublicNonTerminal p), Token tkp -> Error.error [ p; tkp.tk_position ] "the identifier %s is a reference to a token." symbol (* In the following cases, we do not gain any piece of information. As of 2017/03/29, splitting the definition of a %public nonterminal symbol is permitted. (It used to be permitted over multiple units, but forbidden within a single unit.) *) | _, DontKnow _ | Token _, Token _ | PublicNonTerminal _, PublicNonTerminal _ -> symbols (* We learn that the symbol is a nonterminal or a token. *) | DontKnow _, _ -> replace_in_symbol_table symbols symbol kind let store_used_symbol position tokens symbols symbol = let kind = try Token (StringMap.find symbol tokens) with Not_found -> DontKnow position in store_symbol symbols symbol kind let non_terminal_is_not_reserved symbol positions = if symbol = "error" then Error.error positions "%s is reserved and thus cannot be used \ as a non-terminal symbol." symbol let non_terminal_is_not_a_token tokens symbol positions = try let tkp = StringMap.find symbol tokens in Error.error (positions @ [ tkp.tk_position ]) "the identifier %s is a reference to a token." 
symbol with Not_found -> () let store_public_nonterminal tokens symbols symbol positions = non_terminal_is_not_reserved symbol positions; non_terminal_is_not_a_token tokens symbol positions; store_symbol symbols symbol (PublicNonTerminal (List.hd positions)) let store_private_nonterminal tokens symbols symbol positions = non_terminal_is_not_reserved symbol positions; non_terminal_is_not_a_token tokens symbol positions; store_symbol symbols symbol (PrivateNonTerminal (List.hd positions)) (* for debugging, presumably: let string_of_kind = function | PublicNonTerminal p -> Printf.sprintf "public (%s)" (Positions.string_of_pos p) | PrivateNonTerminal p -> Printf.sprintf "private (%s)" (Positions.string_of_pos p) | Token tk -> Printf.sprintf "token (%s)" tk.tk_filename | DontKnow p -> Printf.sprintf "only used at (%s)" (Positions.string_of_pos p) let string_of_symbol_table t = let b = Buffer.create 13 in let m = 1 + Hashtbl.fold (fun k v acu -> max (String.length k) acu) t 0 in let fill_blank s = let s' = String.make m ' ' in String.blit s 0 s' 0 (String.length s); s' in Hashtbl.iter (fun k v -> Buffer.add_string b (Printf.sprintf "%s: %s\n" (fill_blank k) (string_of_kind v))) t; Buffer.contents b *) let is_private_symbol t x = try match Hashtbl.find t x with | PrivateNonTerminal _ -> true | _ -> false with Not_found -> false let fold_on_private_symbols f init t = Hashtbl.fold (fun k -> function PrivateNonTerminal _ -> (fun acu -> f acu k) | _ -> (fun acu -> acu)) t init let fold_on_public_symbols f init t = Hashtbl.fold (fun k -> function PublicNonTerminal _ -> (fun acu -> f acu k) | _ -> (fun acu -> acu)) t init let iter_on_only_used_symbols f t = Hashtbl.iter (fun k -> function DontKnow pos -> f k pos | _ -> ()) t let symbols_of grammar (pgrammar : Syntax.partial_grammar) = let tokens = grammar.p_tokens in let symbols_of_rule symbols prule = let rec store_except_rule_parameters symbols parameter = let symbol, parameters = Parameters.unapp parameter in (* Process the 
reference to [symbol]. *) let symbols = if List.mem symbol.value prule.pr_parameters then (* Rule parameters are bound locally, so they are not taken into account. *) symbols else store_used_symbol symbol.position tokens symbols symbol.value in (* Process the parameters. *) List.fold_left store_except_rule_parameters symbols parameters in (* Analyse each branch. *) let symbols = List.fold_left (fun symbols branch -> List.fold_left (fun symbols (_, p, _) -> store_except_rule_parameters symbols p ) symbols branch.pr_producers ) symbols prule.pr_branches in (* Store the symbol declaration. *) (* A nonterminal symbol is considered public if it is declared using %public or %start. *) if prule.pr_public_flag || StringMap.mem prule.pr_nt grammar.p_start_symbols then store_public_nonterminal tokens symbols prule.pr_nt prule.pr_positions else store_private_nonterminal tokens symbols prule.pr_nt prule.pr_positions in List.fold_left symbols_of_rule (empty_symbol_table ()) pgrammar.pg_rules let merge_rules symbols pgs = (* Retrieve all the public symbols. *) let public_symbols = List.fold_left (fold_on_public_symbols (fun s k -> StringSet.add k s)) (StringSet.singleton "error") symbols in (* We check that the references in each grammar can be bound to a public symbol. *) let _ = List.iter (iter_on_only_used_symbols (fun k pos -> if not (StringSet.mem k public_symbols) then Error.error [ pos ] "%s is undefined." k)) symbols in (* Detect private symbol clashes and rename them if necessary.
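The clash-detection pass described in this comment can be sketched in isolation. The following is a minimal stand-alone model, assuming plain string lists in place of the real per-unit symbol tables; the name [detect_clashes] and the sample grammars are illustrative only:

```ocaml
(* A symbol clashes when it is already defined by an earlier unit:
   we thread a pair (defined, clashes) through all units, exactly as
   in the fold above. *)
module StringSet = Set.Make (String)

let detect_clashes (units : string list list) =
  List.fold_left
    (fun acc symbols ->
      List.fold_left
        (fun (defined, clashes) s ->
          if StringSet.mem s defined then
            (defined, StringSet.add s clashes)
          else
            (StringSet.add s defined, clashes))
        acc symbols)
    (StringSet.empty, StringSet.empty) units

let () =
  (* "expr" is private in both units, so it clashes and must be renamed. *)
  let _defined, clashes =
    detect_clashes [ [ "expr"; "term" ]; [ "expr"; "atom" ] ]
  in
  assert (StringSet.elements clashes = [ "expr" ])
```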
*) let detect_private_symbol_clashes = fold_on_private_symbols (fun (defined, clashes) symbol -> if StringSet.mem symbol defined || StringSet.mem symbol public_symbols then (defined, StringSet.add symbol clashes) else (StringSet.add symbol defined, clashes)) in let _private_symbols, clashes = List.fold_left detect_private_symbol_clashes (StringSet.empty, StringSet.empty) symbols in let rpgs = List.map (fun (symbol_table, pg) -> let renaming = StringSet.fold (fun x phi -> if is_private_symbol symbol_table x then begin let x' = rename x pg.pg_filename in Printf.fprintf stderr "Note: the nonterminal symbol %s (from %s) is renamed %s.\n" x pg.pg_filename x'; (x, x') :: phi end else phi) clashes [] in rewrite_grammar renaming pg) pgs in (* Merge public nonterminal definitions and copy private nonterminal definitions. Since the clashes between private symbols have already been resolved, these copies are safe. *) List.fold_left (fun rules rpg -> List.fold_left (fun rules r -> let r = try let r' = StringMap.find r.pr_nt rules in let positions = r.pr_positions @ r'.pr_positions in let ra, ra' = List.length r.pr_parameters, List.length r'.pr_parameters in (* The arity of the parameterized symbols must be constant. *) if ra <> ra' then Error.error positions "the symbol %s is defined with arities %d and %d." r.pr_nt ra ra' else if r.pr_inline_flag <> r'.pr_inline_flag then Error.error positions "not all definitions of %s are marked %%inline." r.pr_nt else (* We combine the different branches. The parameters could have different names; we rename them with the fresh names assigned earlier (see the next comment). *) let phi = List.combine r.pr_parameters r'.pr_parameters in let rbr = rewrite_branches phi r.pr_branches in { r' with pr_positions = positions; pr_branches = rbr @ r'.pr_branches; pr_attributes = r.pr_attributes @ r'.pr_attributes; } with Not_found -> (* We alphaconvert the rule in order to avoid the capture of private symbols coming from another unit.
*) alphaconvert_rule r.pr_parameters r in StringMap.add r.pr_nt r rules) rules rpg.pg_rules) StringMap.empty rpgs let empty_grammar = { p_preludes = []; p_postludes = []; p_parameters = []; p_start_symbols = StringMap.empty; p_types = []; p_tokens = StringMap.empty; p_rules = StringMap.empty; p_on_error_reduce = []; p_grammar_attributes = []; p_symbol_attributes = []; } let join grammar pgrammar = let filename = pgrammar.pg_filename in List.fold_left (join_declaration filename) grammar pgrammar.pg_declarations |> join_postlude pgrammar.pg_postlude (* If a rule is marked %inline, then it must not carry an attribute. *) let check_inline_attribute prule = match prule.pr_inline_flag, prule.pr_attributes with | true, (id, _payload) :: _attributes -> Error.error [Positions.position id] "the nonterminal symbol %s is declared %%inline.\n\ It cannot carry an attribute." prule.pr_nt | true, [] | false, _ -> () let check_parameterized_grammar_is_well_defined grammar = (* Every start symbol is defined and has a %type declaration. *) StringMap.iter (fun nonterminal p -> if not (StringMap.mem nonterminal grammar.p_rules) then Error.error [p] "the start symbol %s is undefined." nonterminal; if not (List.exists (function | ParameterVar { value = id }, _ -> id = nonterminal | _ -> false) grammar.p_types) then Error.error [p] "the type of the start symbol %s is unspecified." nonterminal; ) grammar.p_start_symbols; (* Every %type definition refers to well-defined (terminal or nonterminal) symbols and has, at its head, a nonterminal symbol. *) (* Same check for %on_error_reduce definitions. *) let reserved = [ "error" ] in let rec check (kind : string) (must_be_nonterminal : bool) (p : Syntax.parameter) = (* Destructure head and arguments. *) let head, ps = Parameters.unapp p in let head = value head in (* Check if [head] is a nonterminal or terminal symbol. 
*) let is_nonterminal = StringMap.mem head grammar.p_rules and is_terminal = StringMap.mem head grammar.p_tokens || List.mem head reserved in (* If [head] is not satisfactory, error. *) if not (is_terminal || is_nonterminal) then Error.error [Parameters.position p] "%s is undefined." head; if (must_be_nonterminal && not is_nonterminal) then Error.error [Parameters.position p] "%s is a terminal symbol,\n\ but %s declarations are applicable only to nonterminal symbols." (Parameters.print true p) kind; (* Then, check the arguments. *) List.iter (check kind false) ps in let check_fst kind must_be_nonterminal (p, _) = check kind must_be_nonterminal p in List.iter (check_fst "%type" true) grammar.p_types; List.iter (check_fst "%on_error_reduce" true) grammar.p_on_error_reduce; List.iter (fun (params, _) -> List.iter (check "%attribute" false) params ) grammar.p_symbol_attributes; (* Every reference to a symbol is well defined. *) let used_tokens = ref StringSet.empty in let mark_token_as_used token = used_tokens := StringSet.add token !used_tokens in let check_identifier_reference grammar prule s p = (* Mark the symbol as a used token if this is a token. *) if StringMap.mem s grammar.p_tokens then mark_token_as_used s; if not (StringMap.mem s grammar.p_rules || StringMap.mem s grammar.p_tokens || List.mem s prule.pr_parameters || List.mem s reserved) then Error.error [ p ] "%s is undefined." s in StringMap.iter (fun k prule -> (* The formal parameters of each rule must have distinct names. *) prule.pr_parameters |> List.sort compare |> Misc.dup compare |> Option.iter (fun x -> Error.error prule.pr_positions "several parameters of this rule are named \"%s\"." x ); (* Check each branch. *) List.iter (fun { pr_producers = producers; pr_branch_prec_annotation; } -> ignore (List.fold_left (* Check the producers. 
*) (fun already_seen (id, p, _) -> let symbol, parameters = Parameters.unapp p in let s = symbol.value and p = symbol.position in let already_seen = (* Check the producer id is unique. *) if StringSet.mem id.value already_seen then Error.error [ id.position ] "there are multiple producers named %s in this sequence." id.value; StringSet.add id.value already_seen in (* Check that the producer is defined somewhere. *) check_identifier_reference grammar prule s p; StringMap.iter (check_identifier_reference grammar prule) (List.fold_left Parameters.identifiers StringMap.empty parameters); (* If this producer seems to be a reference to a token, make sure it is a real token, as opposed to a pseudo-token introduced in a priority declaration. *) (try if not ((StringMap.find s grammar.p_tokens).tk_is_declared || List.mem s reserved) then Error.errorp symbol "%s has not been declared as a token." s with Not_found -> ()); already_seen ) StringSet.empty producers); match pr_branch_prec_annotation with | None -> () | Some terminal -> check_identifier_reference grammar prule terminal.value terminal.position; (* Furthermore, the symbol following %prec must be a valid token identifier. *) if not (StringMap.mem terminal.value grammar.p_tokens) then Error.errorp terminal "%s is undefined." terminal.value) prule.pr_branches; (* It is forbidden to use %inline on a %start symbol. *) if (prule.pr_inline_flag && StringMap.mem k grammar.p_start_symbols) then Error.error prule.pr_positions "%s cannot be both a start symbol and inlined." k; (* If a rule is marked %inline, then it must not carry an attribute. *) check_inline_attribute prule ) grammar.p_rules; (* Check that every token is used. 
*) if not Settings.ignore_all_unused_tokens then begin match Settings.token_type_mode with | Settings.TokenTypeOnly -> () | Settings.TokenTypeAndCode | Settings.CodeOnly _ -> StringMap.iter (fun token { tk_position = p } -> if not (StringSet.mem token !used_tokens || StringSet.mem token Settings.ignored_unused_tokens) then Error.warning [p] "the token %s is unused." token ) grammar.p_tokens end let join_partial_grammars pgs = (* Prior to joining the partial grammars, remove all uses of token aliases. *) let pgs = ExpandTokenAliases.dealias_grammars pgs in (* Join the partial grammars. *) let grammar = List.fold_left join empty_grammar pgs in let symbols = List.map (symbols_of grammar) pgs in let tpgs = List.combine symbols pgs in let rules = merge_rules symbols tpgs in let grammar = { grammar with p_rules = rules } in (* Check well-formedness. *) check_parameterized_grammar_is_well_defined grammar; grammar menhir-20200123/src/partialGrammar.mli000066400000000000000000000020331361226111300174560ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Syntax val join_partial_grammars : partial_grammar list -> grammar menhir-20200123/src/patricia.ml000066400000000000000000000775711361226111300161610ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is an implementation of Patricia trees, following Chris Okasaki's paper at the 1998 ML Workshop in Baltimore. Both big-endian and little-endian trees are provided. Both sets and maps are implemented on top of Patricia trees. *) (*i --------------------------------------------------------------------------------------------------------------- i*) (*s \mysection{Little-endian vs big-endian trees} *) (* A tree is little-endian if it expects the key's least significant bits to be tested first during a search. It is big-endian if it expects the key's most significant bits to be tested first. Most of the code is independent of this design choice, so it is written as a functor, parameterized by a small structure which defines endianness. Here is the interface which must be adhered to by such a structure. *) module Endianness = struct module type S = sig (* A mask is an integer with a single one bit (i.e. a power of 2). *) type mask = int (* [branching_bit] accepts two distinct integers and returns a mask which identifies the first bit where they differ. The meaning of ``first'' varies according to the endianness being implemented. *) val branching_bit: int -> int -> mask (* [mask i m] returns an integer [i'], where all bits which [m] says are relevant are identical to those in [i], and all others are set to some unspecified, but fixed value. Which bits are ``relevant'' according to a given mask varies according to the endianness being implemented. *) val mask: int -> mask -> int (* [shorter m1 m2] returns [true] if and only if [m1] describes a shorter prefix than [m2], i.e. if it makes fewer bits relevant. Which bits are ``relevant'' according to a given mask varies according to the endianness being implemented. 
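To make the two conventions concrete, here is a stand-alone comparison of the two [mask] functions (their bodies are copied from the [Little] and [Big] modules defined in this file; the sample values are ours):

```ocaml
(* Little-endian: keep the bits strictly below the mask bit [m]. *)
let little_mask i m = i land (m - 1)

(* Big-endian: keep the bits strictly above the mask bit [m]. *)
let big_mask i m = i land (lnot (2 * m - 1))

let () =
  (* With i = 0b1101 and m = 0b0100, the two conventions keep
     opposite halves of the key. *)
  assert (little_mask 0b1101 0b0100 = 0b0001);
  assert (big_mask 0b1101 0b0100 = 0b1000)
```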
*) val shorter: mask -> mask -> bool end (* Now, let us define [Little] and [Big], two possible [Endiannness] choices. *) module Little = struct type mask = int let lowest_bit x = x land (-x) (* Performing a logical ``xor'' of [i0] and [i1] yields a bit field where all differences between [i0] and [i1] show up as one bits. (There must be at least one, since [i0] and [i1] are distinct.) The ``first'' one is the lowest bit in this bit field, since we are checking least significant bits first. *) let branching_bit i0 i1 = lowest_bit (i0 lxor i1) (* The ``relevant'' bits in an integer [i] are those which are found (strictly) to the right of the single one bit in the mask [m]. We keep these bits, and set all others to 0. *) let mask i m = i land (m-1) (* The smaller [m] is, the fewer bits are relevant. *) let shorter = (<) end module Big = struct type mask = int let lowest_bit x = x land (-x) let rec highest_bit x = let m = lowest_bit x in if x = m then m else highest_bit (x - m) (* Performing a logical ``xor'' of [i0] and [i1] yields a bit field where all differences between [i0] and [i1] show up as one bits. (There must be at least one, since [i0] and [i1] are distinct.) The ``first'' one is the highest bit in this bit field, since we are checking most significant bits first. In Okasaki's paper, this loop is sped up by computing a conservative initial guess. Indeed, the bit at which the two prefixes disagree must be somewhere within the shorter prefix, so we can begin searching at the least-significant valid bit in the shorter prefix. Unfortunately, to allow computing the initial guess, the main code has to pass in additional parameters, e.g. a mask which describes the length of each prefix. This ``pollutes'' the endianness-independent code. For this reason, this optimization isn't implemented here. 
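As a sanity check, the loop described above can be exercised on its own. This sketch copies [lowest_bit], [highest_bit], and the big-endian [branching_bit] verbatim; the example inputs are of our choosing:

```ocaml
let lowest_bit x = x land (-x)

(* Strip low bits one at a time until only the highest one remains. *)
let rec highest_bit x =
  let m = lowest_bit x in
  if x = m then m else highest_bit (x - m)

let branching_bit i0 i1 = highest_bit (i0 lxor i1)

let () =
  (* 10 = 0b1010 and 12 = 0b1100 first differ (from the top) at bit 2. *)
  assert (branching_bit 10 12 = 4);
  (* 1 = 0b01 and 2 = 0b10 first differ at bit 1. *)
  assert (branching_bit 1 2 = 2)
```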
*) let branching_bit i0 i1 = highest_bit (i0 lxor i1) (* The ``relevant'' bits in an integer [i] are those which are found (strictly) to the left of the single one bit in the mask [m]. We keep these bits, and set all others to 0. Okasaki uses a different convention, which allows big-endian Patricia trees to masquerade as binary search trees. This feature does not seem to be useful here. *) let mask i m = i land (lnot (2*m-1)) (* The smaller [m] is, the more bits are relevant. *) let shorter = (>) end end (*i --------------------------------------------------------------------------------------------------------------- i*) (*s \mysection{Patricia-tree-based maps} *) module Make (X : Endianness.S) = struct (* Patricia trees are maps whose keys are integers. *) type key = int (* A tree is either empty, or a leaf node, containing both the integer key and a piece of data, or a binary node. Each binary node carries two integers. The first one is the longest common prefix of all keys in this sub-tree. The second integer is the branching bit. It is an integer with a single one bit (i.e. a power of 2), which describes the bit being tested at this node. *) type 'a t = | Empty | Leaf of int * 'a | Branch of int * X.mask * 'a t * 'a t (* The empty map. *) let empty = Empty (* [choose m] returns an arbitrarily chosen binding in [m], if [m] is nonempty, and raises [Not_found] otherwise. *) let rec choose = function | Empty -> raise Not_found | Leaf (key, data) -> key, data | Branch (_, _, tree0, _) -> choose tree0 (* [lookup k m] looks up the value associated to the key [k] in the map [m], and raises [Not_found] if no value is bound to [k]. This implementation takes branches \emph{without} checking whether the key matches the prefix found at the current node. This means that a query for a non-existent key shall be detected only when finally reaching a leaf, rather than higher up in the tree. This strategy is better when (most) queries are expected to be successful. 
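This ``optimistic'' search strategy can be observed on a tiny hand-built tree. The type and [lookup] below mirror this file's definitions; the sample tree and keys are ours:

```ocaml
type 'a t =
  | Empty
  | Leaf of int * 'a
  | Branch of int * int * 'a t * 'a t   (* prefix, mask, subtrees *)

(* Branch without checking the prefix; a miss is detected only at a leaf. *)
let rec lookup key = function
  | Empty -> raise Not_found
  | Leaf (key', data) -> if key = key' then data else raise Not_found
  | Branch (_, mask, tree0, tree1) ->
      lookup key (if key land mask = 0 then tree0 else tree1)

let () =
  (* Keys 4 = 0b100 and 5 = 0b101 share the prefix 0b10_; the branching
     bit is bit 0, i.e. the mask is 1. *)
  let t = Branch (4, 1, Leaf (4, "four"), Leaf (5, "five")) in
  assert (lookup 5 t = "five");
  (* A query for the absent key 7 follows bit 0 down to Leaf (5, _)
     and only fails there. *)
  assert (try ignore (lookup 7 t); false with Not_found -> true)
```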
*) let rec lookup key = function | Empty -> raise Not_found | Leaf (key', data) -> if key = key' then data else raise Not_found | Branch (_, mask, tree0, tree1) -> lookup key (if (key land mask) = 0 then tree0 else tree1) let find = lookup (* [mem k m] tells whether the key [k] appears in the domain of the map [m]. *) let mem k m = try let _ = lookup k m in true with Not_found -> false (* The auxiliary function [join] merges two trees in the simple case where their prefixes disagree. Assume $t_0$ and $t_1$ are non-empty trees, with longest common prefixes $p_0$ and $p_1$, respectively. Further, suppose that $p_0$ and $p_1$ disagree, that is, neither prefix is contained in the other. Then, no matter how large $t_0$ and $t_1$ are, we can merge them simply by creating a new [Branch] node that has $t_0$ and $t_1$ as children! *) let join p0 t0 p1 t1 = let m = X.branching_bit p0 p1 in let p = X.mask p0 (* for instance *) m in if (p0 land m) = 0 then Branch(p, m, t0, t1) else Branch(p, m, t1, t0) (* The auxiliary function [match_prefix] tells whether a given key has a given prefix. More specifically, [match_prefix k p m] returns [true] if and only if the key [k] has prefix [p] up to bit [m]. Throughout our implementation of Patricia trees, prefixes are assumed to be in normal form, i.e. their irrelevant bits are set to some predictable value. Formally, we assume [X.mask p m] equals [p] whenever [p] is a prefix with [m] relevant bits. This allows implementing [match_prefix] using only one call to [X.mask]. On the other hand, this requires normalizing prefixes, as done e.g. in [join] above, where [X.mask p0 m] has to be used instead of [p0]. *) let match_prefix k p m = X.mask k m = p (* [fine_add decide k d m] returns a map whose bindings are all bindings in [m], plus a binding of the key [k] to the datum [d]. If a binding from [k] to [d0] already exists, then the resulting map contains a binding from [k] to [decide d0 d]. 
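The interplay of [join], [match_prefix], and prefix normalization can be checked in a stand-alone little-endian instance (the definitions are copied or specialized from this file; the example values are ours):

```ocaml
type 'a t =
  | Empty
  | Leaf of int * 'a
  | Branch of int * int * 'a t * 'a t

let lowest_bit x = x land (-x)
let branching_bit i0 i1 = lowest_bit (i0 lxor i1)
let mask i m = i land (m - 1)
let match_prefix k p m = mask k m = p

(* Merge two trees whose prefixes disagree, under a fresh Branch node. *)
let join p0 t0 p1 t1 =
  let m = branching_bit p0 p1 in
  let p = mask p0 m in
  if p0 land m = 0 then Branch (p, m, t0, t1) else Branch (p, m, t1, t0)

let () =
  (* 1 = 0b01 and 2 = 0b10 first differ at bit 0, so m = 1 and the
     normalized common prefix is 0; key 2 has bit 0 clear, so it goes left. *)
  (match join 1 (Leaf (1, "a")) 2 (Leaf (2, "b")) with
   | Branch (0, 1, Leaf (2, "b"), Leaf (1, "a")) -> ()
   | _ -> assert false);
  assert (match_prefix 3 0 1 = false) (* bit 0 of 3 is set, prefix is not 0 *)
```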
*) type 'a decision = 'a -> 'a -> 'a exception Unchanged let basic_add decide k d m = let rec add t = match t with | Empty -> Leaf (k, d) | Leaf (k0, d0) -> if k = k0 then let d' = decide d0 d in if d' == d0 then raise Unchanged else Leaf (k, d') else join k (Leaf (k, d)) k0 t | Branch (p, m, t0, t1) -> if match_prefix k p m then if (k land m) = 0 then Branch (p, m, add t0, t1) else Branch (p, m, t0, add t1) else join k (Leaf (k, d)) p t in add m let strict_add k d m = basic_add (fun _ _ -> raise Unchanged) k d m let fine_add decide k d m = try basic_add decide k d m with Unchanged -> m (* [add k d m] returns a map whose bindings are all bindings in [m], plus a binding of the key [k] to the datum [d]. If a binding already exists for [k], it is overridden. *) let add k d m = fine_add (fun _old_binding new_binding -> new_binding) k d m (* [singleton k d] returns a map whose only binding is from [k] to [d]. *) let singleton k d = Leaf (k, d) (* [is_singleton m] returns [Some (k, d)] if [m] is a singleton map that maps [k] to [d]. Otherwise, it returns [None]. *) let is_singleton = function | Leaf (k, d) -> Some (k, d) | Empty | Branch _ -> None (* [is_empty m] returns [true] if and only if the map [m] defines no bindings at all. *) let is_empty = function | Empty -> true | Leaf _ | Branch _ -> false (* [cardinal m] returns [m]'s cardinal, that is, the number of keys it binds, or, in other words, its domain's cardinal. *) let rec cardinal = function | Empty -> 0 | Leaf _ -> 1 | Branch (_, _, t0, t1) -> cardinal t0 + cardinal t1 (* [remove k m] returns the map [m] deprived from any binding involving [k]. 
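The [Unchanged] exception used above is a general trick for preserving physical sharing: the strict operation raises instead of rebuilding an identical structure, and a wrapper catches the exception and returns the original value. Here it is in miniature, with a hypothetical list-based stand-in for the tree:

```ocaml
exception Unchanged

(* Raise rather than rebuild when the element is already present. *)
let strict_insert x xs =
  if List.mem x xs then raise Unchanged else x :: xs

let insert x xs =
  try strict_insert x xs with Unchanged -> xs

let () =
  let xs = [ 1; 2; 3 ] in
  assert (insert 2 xs == xs);          (* physically the same list *)
  assert (insert 4 xs = [ 4; 1; 2; 3 ])
```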
*) let remove key m = let rec remove = function | Empty -> raise Not_found | Leaf (key', _) -> if key = key' then Empty else raise Not_found | Branch (prefix, mask, tree0, tree1) -> if (key land mask) = 0 then match remove tree0 with | Empty -> tree1 | tree0 -> Branch (prefix, mask, tree0, tree1) else match remove tree1 with | Empty -> tree0 | tree1 -> Branch (prefix, mask, tree0, tree1) in try remove m with Not_found -> m (* [lookup_and_remove k m] looks up the value [v] associated to the key [k] in the map [m], and raises [Not_found] if no value is bound to [k]. The call returns the value [v], together with the map [m] deprived from the binding from [k] to [v]. *) let rec lookup_and_remove key = function | Empty -> raise Not_found | Leaf (key', data) -> if key = key' then data, Empty else raise Not_found | Branch (prefix, mask, tree0, tree1) -> if (key land mask) = 0 then match lookup_and_remove key tree0 with | data, Empty -> data, tree1 | data, tree0 -> data, Branch (prefix, mask, tree0, tree1) else match lookup_and_remove key tree1 with | data, Empty -> data, tree0 | data, tree1 -> data, Branch (prefix, mask, tree0, tree1) let find_and_remove = lookup_and_remove (* [fine_union decide m1 m2] returns the union of the maps [m1] and [m2]. If a key [k] is bound to [x1] (resp. [x2]) within [m1] (resp. [m2]), then [decide] is called. It is passed [x1] and [x2], and must return the value which shall be bound to [k] in the final map. The operation returns [m2] itself (as opposed to a copy of it) when its result is equal to [m2]. *) let reverse decision elem1 elem2 = decision elem2 elem1 let fine_union decide m1 m2 = let rec union s t = match s, t with | Empty, _ -> t | (Leaf _ | Branch _), Empty -> s | Leaf(key, value), _ -> fine_add (reverse decide) key value t | Branch _, Leaf(key, value) -> fine_add decide key value s | Branch(p, m, s0, s1), Branch(q, n, t0, t1) -> if (p = q) && (m = n) then (* The trees have the same prefix. Merge their sub-trees. 
*) let u0 = union s0 t0 and u1 = union s1 t1 in if t0 == u0 && t1 == u1 then t else Branch(p, m, u0, u1) else if (X.shorter m n) && (match_prefix q p m) then (* [q] contains [p]. Merge [t] with a sub-tree of [s]. *) if (q land m) = 0 then Branch(p, m, union s0 t, s1) else Branch(p, m, s0, union s1 t) else if (X.shorter n m) && (match_prefix p q n) then (* [p] contains [q]. Merge [s] with a sub-tree of [t]. *) if (p land n) = 0 then let u0 = union s t0 in if t0 == u0 then t else Branch(q, n, u0, t1) else let u1 = union s t1 in if t1 == u1 then t else Branch(q, n, t0, u1) else (* The prefixes disagree. *) join p s q t in union m1 m2 (* [union m1 m2] returns the union of the maps [m1] and [m2]. Bindings in [m2] take precedence over those in [m1]. *) let union m1 m2 = fine_union (fun _d d' -> d') m1 m2 (* [iter f m] invokes [f k x], in turn, for each binding from key [k] to element [x] in the map [m]. Keys are presented to [f] according to some unspecified, but fixed, order. *) let rec iter f = function | Empty -> () | Leaf (key, data) -> f key data | Branch (_, _, tree0, tree1) -> iter f tree0; iter f tree1 (* [fold f m seed] invokes [f k d accu], in turn, for each binding from key [k] to datum [d] in the map [m]. Keys are presented to [f] in increasing order according to the map's ordering. The initial value of [accu] is [seed]; then, at each new call, its value is the value returned by the previous invocation of [f]. The value returned by [fold] is the final value of [accu]. *) let rec fold f m accu = match m with | Empty -> accu | Leaf (key, data) -> f key data accu | Branch (_, _, tree0, tree1) -> fold f tree1 (fold f tree0 accu) (* [fold_rev] performs exactly the same job as [fold], but presents keys to [f] in the opposite order. 
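The two folds can be contrasted on a tiny hand-built tree (the type and both functions mirror this file's definitions; the sample tree is ours):

```ocaml
type 'a t =
  | Empty
  | Leaf of int * 'a
  | Branch of int * int * 'a t * 'a t

(* Left subtree first: keys in increasing order. *)
let rec fold f m accu =
  match m with
  | Empty -> accu
  | Leaf (key, data) -> f key data accu
  | Branch (_, _, tree0, tree1) -> fold f tree1 (fold f tree0 accu)

(* Right subtree first: keys in decreasing order. *)
let rec fold_rev f m accu =
  match m with
  | Empty -> accu
  | Leaf (key, data) -> f key data accu
  | Branch (_, _, tree0, tree1) -> fold_rev f tree0 (fold_rev f tree1 accu)

let () =
  let t = Branch (2, 1, Leaf (2, "a"), Leaf (3, "b")) in
  (* Consing reverses the presentation order. *)
  assert (fold (fun k _ acc -> k :: acc) t [] = [ 3; 2 ]);
  assert (fold_rev (fun k _ acc -> k :: acc) t [] = [ 2; 3 ])
```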
*) let rec fold_rev f m accu = match m with | Empty -> accu | Leaf (key, data) -> f key data accu | Branch (_, _, tree0, tree1) -> fold_rev f tree0 (fold_rev f tree1 accu) (* It is valid to evaluate [iter2 f m1 m2] if and only if [m1] and [m2] have the same domain. Doing so invokes [f k x1 x2], in turn, for each key [k] bound to [x1] in [m1] and to [x2] in [m2]. Bindings are presented to [f] according to some unspecified, but fixed, order. *) let rec iter2 f t1 t2 = match t1, t2 with | Empty, Empty -> () | Leaf (key1, data1), Leaf (key2, data2) -> assert (key1 = key2); f key1 (* for instance *) data1 data2 | Branch (p1, m1, left1, right1), Branch (p2, m2, left2, right2) -> assert (p1 = p2); assert (m1 = m2); iter2 f left1 left2; iter2 f right1 right2 | _, _ -> assert false (* [map f m] returns the map obtained by composing the map [m] with the function [f]; that is, the map $k\mapsto f(m(k))$. *) let rec map f = function | Empty -> Empty | Leaf (key, data) -> Leaf(key, f data) | Branch (p, m, tree0, tree1) -> Branch (p, m, map f tree0, map f tree1) (* [endo_map] is similar to [map], but attempts to physically share its result with its input. This saves memory when [f] is the identity function. *) let rec endo_map f tree = match tree with | Empty -> tree | Leaf (key, data) -> let data' = f data in if data == data' then tree else Leaf(key, data') | Branch (p, m, tree0, tree1) -> let tree0' = endo_map f tree0 in let tree1' = endo_map f tree1 in if (tree0' == tree0) && (tree1' == tree1) then tree else Branch (p, m, tree0', tree1') (* [filter f m] returns a copy of the map [m] where only the bindings that satisfy [f] have been retained. *) let filter f m = fold (fun key data accu -> if f key data then add key data accu else accu ) m empty (* [iterator m] returns a stateful iterator over the map [m]. 
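The stateful iterator keeps an explicit worklist of pending subtrees, so traversal is not tied to the call stack. Here is a self-contained version (the code mirrors this file's implementation; the sample tree is ours):

```ocaml
type 'a t =
  | Empty
  | Leaf of int * 'a
  | Branch of int * int * 'a t * 'a t

let iterator m =
  let remainder = ref [ m ] in
  let rec next () =
    match !remainder with
    | [] -> None
    | Empty :: parent -> remainder := parent; next ()
    | Leaf (key, data) :: parent -> remainder := parent; Some (key, data)
    | Branch (_, _, s0, s1) :: parent ->
        (* Push both subtrees, left one on top. *)
        remainder := s0 :: s1 :: parent; next ()
  in
  next

let () =
  let next = iterator (Branch (2, 1, Leaf (2, "a"), Leaf (3, "b"))) in
  assert (next () = Some (2, "a"));
  assert (next () = Some (3, "b"));
  assert (next () = None)
```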
*) (* TEMPORARY performance could be improved, see JCF's paper *) let iterator m = let remainder = ref [ m ] in let rec next () = match !remainder with | [] -> None | Empty :: parent -> remainder := parent; next() | (Leaf (key, data)) :: parent -> remainder := parent; Some (key, data) | (Branch(_, _, s0, s1)) :: parent -> remainder := s0 :: s1 :: parent; next () in next (* If [dcompare] is an ordering over data, then [compare dcompare] is an ordering over maps. *) exception Got of int let compare dcompare m1 m2 = let iterator2 = iterator m2 in try iter (fun key1 data1 -> match iterator2() with | None -> raise (Got 1) | Some (key2, data2) -> let c = Generic.compare key1 key2 in if c <> 0 then raise (Got c) else let c = dcompare data1 data2 in if c <> 0 then raise (Got c) ) m1; match iterator2() with | None -> 0 | Some _ -> -1 with Got c -> c (*i --------------------------------------------------------------------------------------------------------------- i*) (*s \mysection{Patricia-tree-based sets} *) (* To enhance code sharing, it would be possible to implement maps as sets of pairs, or (vice-versa) to implement sets as maps to the unit element. However, both possibilities introduce some space and time inefficiency. To avoid it, we define each structure separately. *) module Domain = struct type element = int type t = | Empty | Leaf of int | Branch of int * X.mask * t * t (* The empty set. *) let empty = Empty (* [is_empty s] returns [true] if and only if the set [s] is empty. *) let is_empty = function | Empty -> true | Leaf _ | Branch _ -> false (* [singleton x] returns a set whose only element is [x]. *) let singleton x = Leaf x (* [is_singleton s] tests whether [s] is a singleton set. *) let is_singleton = function | Leaf _ -> true | Empty | Branch _ -> false (* [choose s] returns an arbitrarily chosen element of [s], if [s] is nonempty, and raises [Not_found] otherwise. 
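The [Got]-exception comparison above is itself a reusable pattern: iterate one collection while pulling from a stateful iterator over the other, and escape with the first nonzero comparison. A miniature with list stand-ins (names and values illustrative):

```ocaml
exception Got of int

let list_iterator xs =
  let r = ref xs in
  fun () -> match !r with [] -> None | x :: rest -> r := rest; Some x

let compare_lists xs ys =
  let next = list_iterator ys in
  try
    List.iter
      (fun x ->
        match next () with
        | None -> raise (Got 1)            (* [ys] ran out first *)
        | Some y ->
            let c = compare x y in
            if c <> 0 then raise (Got c))
      xs;
    (match next () with None -> 0 | Some _ -> -1)
  with Got c -> c

let () =
  assert (compare_lists [ 1; 2 ] [ 1; 3 ] < 0);
  assert (compare_lists [ 1; 2 ] [ 1; 2 ] = 0);
  assert (compare_lists [ 1; 2; 3 ] [ 1; 2 ] > 0)
```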
*) let rec choose = function | Empty -> raise Not_found | Leaf x -> x | Branch (_, _, tree0, _) -> choose tree0 (* [cardinal s] returns [s]'s cardinal. *) let rec cardinal = function | Empty -> 0 | Leaf _ -> 1 | Branch (_, _, t0, t1) -> cardinal t0 + cardinal t1 (* [mem x s] returns [true] if and only if [x] appears in the set [s]. *) let rec mem x = function | Empty -> false | Leaf x' -> x = x' | Branch (_, mask, tree0, tree1) -> mem x (if (x land mask) = 0 then tree0 else tree1) (* The auxiliary function [join] merges two trees in the simple case where their prefixes disagree. *) let join p0 t0 p1 t1 = let m = X.branching_bit p0 p1 in let p = X.mask p0 (* for instance *) m in if (p0 land m) = 0 then Branch(p, m, t0, t1) else Branch(p, m, t1, t0) (* [add x s] returns a set whose elements are all elements of [s], plus [x]. *) exception Unchanged let rec strict_add x t = match t with | Empty -> Leaf x | Leaf x0 -> if x = x0 then raise Unchanged else join x (Leaf x) x0 t | Branch (p, m, t0, t1) -> if match_prefix x p m then if (x land m) = 0 then Branch (p, m, strict_add x t0, t1) else Branch (p, m, t0, strict_add x t1) else join x (Leaf x) p t let add x s = try strict_add x s with Unchanged -> s (* [remove x s] returns a set whose elements are all elements of [s], except [x]. *) let remove x s = let rec strict_remove = function | Empty -> raise Not_found | Leaf x' -> if x = x' then Empty else raise Not_found | Branch (prefix, mask, tree0, tree1) -> if (x land mask) = 0 then match strict_remove tree0 with | Empty -> tree1 | tree0 -> Branch (prefix, mask, tree0, tree1) else match strict_remove tree1 with | Empty -> tree0 | tree1 -> Branch (prefix, mask, tree0, tree1) in try strict_remove s with Not_found -> s (* [union s1 s2] returns the union of the sets [s1] and [s2]. 
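The set insertion above can be run end to end by specializing it to the little-endian operations (all definitions are copied or specialized from this file; the sample values are ours):

```ocaml
type t =
  | Empty
  | Leaf of int
  | Branch of int * int * t * t

exception Unchanged

let lowest_bit x = x land (-x)
let branching_bit i0 i1 = lowest_bit (i0 lxor i1)
let mask i m = i land (m - 1)
let match_prefix k p m = mask k m = p

let join p0 t0 p1 t1 =
  let m = branching_bit p0 p1 in
  let p = mask p0 m in
  if p0 land m = 0 then Branch (p, m, t0, t1) else Branch (p, m, t1, t0)

let rec strict_add x t =
  match t with
  | Empty -> Leaf x
  | Leaf x0 -> if x = x0 then raise Unchanged else join x (Leaf x) x0 t
  | Branch (p, m, t0, t1) ->
      if match_prefix x p m then
        if x land m = 0 then Branch (p, m, strict_add x t0, t1)
        else Branch (p, m, t0, strict_add x t1)
      else join x (Leaf x) p t

let add x s = try strict_add x s with Unchanged -> s

let rec mem x = function
  | Empty -> false
  | Leaf x' -> x = x'
  | Branch (_, m, t0, t1) -> mem x (if x land m = 0 then t0 else t1)

let () =
  let s = add 5 (add 3 (add 5 Empty)) in
  assert (mem 3 s && mem 5 s && not (mem 4 s));
  assert (add 3 s == s)   (* [Unchanged] preserves physical sharing *)
```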
*) let rec union s t = match s, t with | Empty, _ -> t | _, Empty -> s | Leaf x, _ -> add x t | _, Leaf x -> add x s | Branch(p, m, s0, s1), Branch(q, n, t0, t1) -> if (p = q) && (m = n) then (* The trees have the same prefix. Merge their sub-trees. *) let u0 = union s0 t0 and u1 = union s1 t1 in if t0 == u0 && t1 == u1 then t else Branch(p, m, u0, u1) else if (X.shorter m n) && (match_prefix q p m) then (* [q] contains [p]. Merge [t] with a sub-tree of [s]. *) if (q land m) = 0 then Branch(p, m, union s0 t, s1) else Branch(p, m, s0, union s1 t) else if (X.shorter n m) && (match_prefix p q n) then (* [p] contains [q]. Merge [s] with a sub-tree of [t]. *) if (p land n) = 0 then let u0 = union s t0 in if t0 == u0 then t else Branch(q, n, u0, t1) else let u1 = union s t1 in if t1 == u1 then t else Branch(q, n, t0, u1) else (* The prefixes disagree. *) join p s q t (* [build] is a ``smart constructor''. It builds a [Branch] node with the specified arguments, but ensures that the newly created node does not have an [Empty] child. *) let build p m t0 t1 = match t0, t1 with | Empty, Empty -> Empty | Empty, _ -> t1 | _, Empty -> t0 | _, _ -> Branch(p, m, t0, t1) (* [inter s t] returns the set intersection of [s] and [t], that is, $s\cap t$. *) let rec inter s t = match s, t with | Empty, _ | _, Empty -> Empty | (Leaf x as s), t | t, (Leaf x as s) -> if mem x t then s else Empty | Branch(p, m, s0, s1), Branch(q, n, t0, t1) -> if (p = q) && (m = n) then (* The trees have the same prefix. Compute the intersections of their sub-trees. *) build p m (inter s0 t0) (inter s1 t1) else if (X.shorter m n) && (match_prefix q p m) then (* [q] contains [p]. Intersect [t] with a sub-tree of [s]. *) inter (if (q land m) = 0 then s0 else s1) t else if (X.shorter n m) && (match_prefix p q n) then (* [p] contains [q]. Intersect [s] with a sub-tree of [t]. *) inter s (if (p land n) = 0 then t0 else t1) else (* The prefixes disagree. 
*) Empty (* [disjoint s1 s2] returns [true] if and only if the sets [s1] and [s2] are disjoint, i.e. iff their intersection is empty. It is a specialized version of [inter], which uses less space. *) exception NotDisjoint let disjoint s t = let rec inter s t = match s, t with | Empty, _ | _, Empty -> () | Leaf x, _ -> if mem x t then raise NotDisjoint | _, Leaf x -> if mem x s then raise NotDisjoint | Branch(p, m, s0, s1), Branch(q, n, t0, t1) -> if (p = q) && (m = n) then begin inter s0 t0; inter s1 t1 end else if (X.shorter m n) && (match_prefix q p m) then inter (if (q land m) = 0 then s0 else s1) t else if (X.shorter n m) && (match_prefix p q n) then inter s (if (p land n) = 0 then t0 else t1) else () in try inter s t; true with NotDisjoint -> false (* [iter f s] invokes [f x], in turn, for each element [x] of the set [s]. Elements are presented to [f] according to some unspecified, but fixed, order. *) let rec iter f = function | Empty -> () | Leaf x -> f x | Branch (_, _, tree0, tree1) -> iter f tree0; iter f tree1 (* [fold f s seed] invokes [f x accu], in turn, for each element [x] of the set [s]. Elements are presented to [f] according to some unspecified, but fixed, order. The initial value of [accu] is [seed]; then, at each new call, its value is the value returned by the previous invocation of [f]. The value returned by [fold] is the final value of [accu]. In other words, if $s = \{ x_1, x_2, \ldots, x_n \}$, where $x_1 < x_2 < \ldots < x_n$, then [fold f s seed] computes $([f]\,x_n\,\ldots\,([f]\,x_2\,([f]\,x_1\,[seed]))\ldots)$. *) let rec fold f s accu = match s with | Empty -> accu | Leaf x -> f x accu | Branch (_, _, s0, s1) -> fold f s1 (fold f s0 accu) (* [elements s] is a list of all elements in the set [s]. *) let elements s = fold (fun tl hd -> tl :: hd) s [] (* [iterator s] returns a stateful iterator over the set [s]. 
That is, if $s = \{ x_1, x_2, \ldots, x_n \}$, where $x_1 < x_2 < \ldots < x_n$, then [iterator s] is a function which, when invoked for the $k^{\text{th}}$ time, returns [Some]$x_k$, if $k\leq n$, and [None] otherwise. Such a function can be useful when one wishes to iterate over a set's elements, without being restricted by the call stack's discipline. For more comments about this algorithm, please see module [Baltree], which defines a similar one. *) let iterator s = let remainder = ref [ s ] in let rec next () = match !remainder with | [] -> None | Empty :: parent -> remainder := parent; next() | (Leaf x) :: parent -> remainder := parent; Some x | (Branch(_, _, s0, s1)) :: parent -> remainder := s0 :: s1 :: parent; next () in next (* [compare] is an ordering over sets. *) exception Got of int let compare s1 s2 = let iterator2 = iterator s2 in try iter (fun x1 -> match iterator2() with | None -> raise (Got 1) | Some x2 -> let c = Generic.compare x1 x2 in if c <> 0 then raise (Got c) ) s1; match iterator2() with | None -> 0 | Some _ -> -1 with Got c -> c (* [equal] implements equality over sets. *) let equal s1 s2 = compare s1 s2 = 0 (* [subset] implements the subset predicate over sets. In other words, [subset s t] returns [true] if and only if $s\subseteq t$. It is a specialized version of [diff]. *) exception NotSubset let subset s t = let rec diff s t = match s, t with | Empty, _ -> () | _, Empty | Branch _, Leaf _ -> raise NotSubset | Leaf x, _ -> if not (mem x t) then raise NotSubset | Branch(p, m, s0, s1), Branch(q, n, t0, t1) -> if (p = q) && (m = n) then begin diff s0 t0; diff s1 t1 end else if (X.shorter n m) && (match_prefix p q n) then diff s (if (p land n) = 0 then t0 else t1) else (* Either [q] contains [p], which means at least one of [s]'s sub-trees is not contained within [t], or the prefixes disagree. In either case, the subset relationship cannot possibly hold. 
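The [iterator] just defined replaces the call stack with an explicit list of pending subtrees, so iteration can be suspended between calls and two sets can be traversed in lockstep (as [compare] does below). The same technique on a toy binary tree, as a self-contained sketch rather than Menhir code:

```ocaml
type tree = Leaf of int | Node of tree * tree

(* [iterator t] returns a function that yields the leaves of [t],
   left to right, one per call; [None] signals exhaustion. The list
   [remainder] holds the subtrees not yet visited. *)
let iterator t =
  let remainder = ref [ t ] in
  let rec next () =
    match !remainder with
    | [] ->
        None
    | Leaf x :: parent ->
        remainder := parent; Some x
    | Node (l, r) :: parent ->
        remainder := l :: r :: parent; next ()
  in
  next

(* Drain an iterator into a list, for testing. *)
let drain next =
  let rec loop acc =
    match next () with
    | None -> List.rev acc
    | Some x -> loop (x :: acc)
  in
  loop []
```

Because the pending work lives in a heap-allocated list, the client decides when (and whether) to pull the next element, which is exactly what a recursive [iter] cannot offer.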
*) raise NotSubset in try diff s t; true with NotSubset -> false end (*i --------------------------------------------------------------------------------------------------------------- i*) (*s \mysection{Relating sets and maps} *) (* Back to the world of maps. Let us now describe the relationship which exists between maps and their domains. *) (* [domain m] returns [m]'s domain. *) let rec domain = function | Empty -> Domain.Empty | Leaf (k, _) -> Domain.Leaf k | Branch (p, m, t0, t1) -> Domain.Branch (p, m, domain t0, domain t1) (* [lift f s] returns the map $k\mapsto f(k)$, where $k$ ranges over a set of keys [s]. *) let rec lift f = function | Domain.Empty -> Empty | Domain.Leaf k -> Leaf (k, f k) | Domain.Branch (p, m, t0, t1) -> Branch(p, m, lift f t0, lift f t1) (* [build] is a ``smart constructor''. It builds a [Branch] node with the specified arguments, but ensures that the newly created node does not have an [Empty] child. *) let build p m t0 t1 = match t0, t1 with | Empty, Empty -> Empty | Empty, _ -> t1 | _, Empty -> t0 | _, _ -> Branch(p, m, t0, t1) (* [corestrict m d] performs a co-restriction of the map [m] to the domain [d]. That is, it returns the map $k\mapsto m(k)$, where $k$ ranges over all keys bound in [m] but \emph{not} present in [d]. Its code resembles [diff]'s. 
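[domain] and [lift] above convert between maps and their key sets by mirroring the tree structure node for node. With association lists standing in for Patricia trees, the same three operations look as follows (a toy analogue, not the tree versions):

```ocaml
(* [lift f keys] builds the map k |-> f k over the given keys;
   [domain] recovers the key set; [corestrict m d] keeps only the
   bindings whose key is NOT in [d]. These mirror the Patricia-tree
   versions above, at the cost of linear lookups. *)
let lift f keys = List.map (fun k -> (k, f k)) keys

let domain m = List.map fst m

let corestrict m d =
  List.filter (fun (k, _) -> not (List.mem k d)) m
```

The tree versions achieve the same results in logarithmic time per node by exploiting the fact that a map and its domain share the same prefix structure.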
*) let rec corestrict s t = match s, t with | Empty, _ | _, Domain.Empty -> s | Leaf (k, _), _ -> if Domain.mem k t then Empty else s | _, Domain.Leaf k -> remove k s | Branch(p, m, s0, s1), Domain.Branch(q, n, t0, t1) -> if (p = q) && (m = n) then build p m (corestrict s0 t0) (corestrict s1 t1) else if (X.shorter m n) && (match_prefix q p m) then if (q land m) = 0 then build p m (corestrict s0 t) s1 else build p m s0 (corestrict s1 t) else if (X.shorter n m) && (match_prefix p q n) then corestrict s (if (p land n) = 0 then t0 else t1) else s end (*i --------------------------------------------------------------------------------------------------------------- i*) (*s \mysection{Instantiating the functor} *) module Little = Make(Endianness.Little) module Big = Make(Endianness.Big) menhir-20200123/src/patricia.mli000066400000000000000000000024301361226111300163100ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is an implementation of Patricia trees, following Chris Okasaki's paper at the 1998 ML Workshop in Baltimore. Both big-endian and little-endian trees are provided. Both sets and maps are implemented on top of Patricia trees. 
*) module Little : GMap.S with type key = int module Big : GMap.S with type key = int menhir-20200123/src/positions.ml000066400000000000000000000074341361226111300164030ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Lexing type t = (* Start and end positions. *) position * position type 'a located = { value : 'a; position : t; } let value { value = v } = v let position { position = p } = p let decompose { value; position } = (value, position) let with_pos p v = { value = v; position = p; } let with_loc = (* The location is converted from the type [position * position] to the type [t]. 
*) with_pos let map f v = { value = f v.value; position = v.position; } let pmap f v = { value = f v.position v.value; position = v.position } let iter f { value = v } = f v let mapd f v = let w1, w2 = f v.value in let pos = v.position in { value = w1; position = pos }, { value = w2; position = pos } let dummy = (dummy_pos, dummy_pos) let unknown_pos v = { value = v; position = dummy } let start_of_position (p, _) = p let end_of_position (_, p) = p let filename_of_position p = (start_of_position p).pos_fname let line p = p.pos_lnum let column p = p.pos_cnum - p.pos_bol let characters p1 p2 = (column p1, p2.pos_cnum - p1.pos_bol) (* intentionally [p1.pos_bol] *) let join x1 x2 = ( start_of_position (if x1 = dummy then x2 else x1), end_of_position (if x2 = dummy then x1 else x2) ) let import x = x let join_located l1 l2 f = { value = f l1.value l2.value; position = join l1.position l2.position; } let string_of_lex_pos p = let c = p.pos_cnum - p.pos_bol in (string_of_int p.pos_lnum)^":"^(string_of_int c) let string_of_pos p = let filename = filename_of_position p in (* [filename] is hopefully not "". *) let l = line (start_of_position p) in let c1, c2 = characters (start_of_position p) (end_of_position p) in Printf.sprintf "File \"%s\", line %d, characters %d-%d" filename l c1 c2 let pos_or_undef = function | None -> dummy | Some x -> x let cpos lexbuf = (lexeme_start_p lexbuf, lexeme_end_p lexbuf) let with_cpos lexbuf v = with_pos (cpos lexbuf) v let string_of_cpos lexbuf = string_of_pos (cpos lexbuf) let joinf f t1 t2 = join (f t1) (f t2) let ljoinf f = List.fold_left (fun p t -> join p (f t)) dummy let join_located_list ls f = { value = f (List.map (fun l -> l.value) ls); position = ljoinf (fun x -> x.position) ls } (* The functions that print error messages and warnings require a list of positions. The following auxiliary functions help build such lists. 
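The arithmetic in [column] and [characters] above converts absolute byte offsets ([pos_cnum]) into column numbers relative to the beginning of the line ([pos_bol]). A quick standalone check of that arithmetic, with positions built by hand (the helper [at] is mine):

```ocaml
open Lexing

(* A position at byte offset [cnum], on a line starting at byte
   offset [bol]. Other fields are filled with placeholder values. *)
let at ~bol ~cnum =
  { pos_fname = "test.mly"; pos_lnum = 1; pos_bol = bol; pos_cnum = cnum }

(* Same computation as [column] in positions.ml. *)
let column p = p.pos_cnum - p.pos_bol

(* Same computation as [characters]: both offsets are measured from
   the FIRST position's line start, intentionally, so that a range
   spanning a newline still yields a sensible interval. *)
let characters p1 p2 = (column p1, p2.pos_cnum - p1.pos_bol)
```

This is why [characters] deliberately uses [p1.pos_bol] for both components, as the comment in the source notes.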
*) type positions = t list let one (pos : position) : positions = [ import (pos, pos) ] let lexbuf (lexbuf : lexbuf) : positions = [ import (lexbuf.lex_start_p, lexbuf.lex_curr_p) ] let print (pos : position) = Printf.printf "{ pos_fname = \"%s\"; pos_lnum = %d; pos_bol = %d; pos_cnum = %d }\n" pos.pos_fname pos.pos_lnum pos.pos_bol pos.pos_cnum menhir-20200123/src/positions.mli000066400000000000000000000107001361226111300165420ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* TEMPORARY clean up this over-complicated API? *) (** Extension of standard library's positions. *) (** {2 Extended lexing positions} *) (** Abstract type for pairs of positions in the lexing stream. *) type t (** Decoration of a value with a position. *) type 'a located = { value : 'a; position : t; } (** [value dv] returns the raw value that underlies the decorated value [dv]. *) val value: 'a located -> 'a (** [position dv] returns the position that decorates the decorated value [dv]. *) val position: 'a located -> t (** [decompose dv] returns a pair of the value and position. *) val decompose: 'a located -> 'a * t (** [with_pos p v] decorates [v] with a position [p]. *) val with_pos : t -> 'a -> 'a located val with_cpos: Lexing.lexbuf -> 'a -> 'a located val with_loc : (Lexing.position * Lexing.position) -> 'a -> 'a located val unknown_pos : 'a -> 'a located (** [map f v] extends the decoration from [v] to [f v]. 
*) val map: ('a -> 'b) -> 'a located -> 'b located val pmap: (t -> 'a -> 'b) -> 'a located -> 'b located (** [iter f dv] applies [f] to the value inside [dv]. *) val iter: ('a -> unit) -> 'a located -> unit (** [mapd f v] extends the decoration from [v] to both members of the pair [f v]. *) val mapd: ('a -> 'b1 * 'b2) -> 'a located -> 'b1 located * 'b2 located (** This value is used when an object does not come from a particular input location. *) val dummy: t (** {2 Accessors} *) (** [column p] returns the number of characters from the beginning of the line of the Lexing.position [p]. *) val column : Lexing.position -> int (** [column p] returns the line number of to the Lexing.position [p]. *) val line : Lexing.position -> int (** [characters p1 p2] returns the character interval between [p1] and [p2] assuming they are located in the same line. *) val characters : Lexing.position -> Lexing.position -> int * int val start_of_position: t -> Lexing.position val end_of_position: t -> Lexing.position val filename_of_position: t -> string (** {2 Position handling} *) (** [join p1 p2] returns a position that starts where [p1] starts and stops where [p2] stops. *) val join : t -> t -> t val import : Lexing.position * Lexing.position -> t val ljoinf : ('a -> t) -> 'a list -> t val joinf : ('a -> t) -> 'a -> 'a -> t val join_located : 'a located -> 'b located -> ('a -> 'b -> 'c) -> 'c located val join_located_list : ('a located) list -> ('a list -> 'b list) -> ('b list) located (** [string_of_lex_pos p] returns a string representation for the lexing position [p]. *) val string_of_lex_pos : Lexing.position -> string (** [string_of_pos p] returns the standard (Emacs-like) representation of the position [p]. *) val string_of_pos : t -> string (** [pos_or_undef po] is the identity function except if po = None, in that case, it returns [undefined_position]. 
*) val pos_or_undef : t option -> t (** {2 Interaction with the lexer runtime} *) (** [cpos lexbuf] returns the current position of the lexer. *) val cpos : Lexing.lexbuf -> t (** [string_of_cpos p] returns a string representation of the lexer's current position. *) val string_of_cpos : Lexing.lexbuf -> string (* The functions that print error messages and warnings require a list of positions. The following auxiliary functions help build such lists. *) type positions = t list val one: Lexing.position -> positions val lexbuf: Lexing.lexbuf -> positions (* Low-level printing function, for debugging. *) val print: Lexing.position -> unit menhir-20200123/src/pprint.ml000066400000000000000000000706341361226111300156720ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This is an adaptation of Daan Leijen's [PPrint] library, which itself is based on the ideas developed by Philip Wadler in ``A Prettier Printer''. For more information, see: http://www.cs.uu.nl/~daan/pprint.html http://homepages.inf.ed.ac.uk/wadler/papers/prettier/prettier.pdf *) (* ------------------------------------------------------------------------- *) (* A uniform interface for output channels. *) module type OUTPUT = sig type channel val char: channel -> char -> unit val substring: channel -> string -> int (* offset *) -> int (* length *) -> unit end (* ------------------------------------------------------------------------- *) (* Two implementations of the above interface, respectively based on output channels and memory buffers. 
This compensates for the fact that ocaml's standard library does not allow creating an output channel out of a memory buffer (a regrettable omission). *) module ChannelOutput : OUTPUT with type channel = out_channel = struct type channel = out_channel let char = output_char let substring = output_substring end module BufferOutput : OUTPUT with type channel = Buffer.t = struct type channel = Buffer.t let char = Buffer.add_char let substring = Buffer.add_substring end (* ------------------------------------------------------------------------- *) (* Here is the algebraic data type of documents. It is analogous to Daan Leijen's version, but the binary constructor [Union] is replaced with the unary constructor [Group], and the constant [Line] is replaced with more general constructions, namely [IfFlat], which provides alternative forms depending on the current flattening mode, and [HardLine], which represents a newline character, and is invalid in flattening mode. *) type document = (* [Empty] is the empty document. *) | Empty (* [Char c] is a document that consists of the single character [c]. We enforce the invariant that [c] is not a newline character. *) | Char of char (* [String (s, ofs, len)] is a document that consists of the portion of the string [s] delimited by the offset [ofs] and the length [len]. We assume, but do not check, that this portion does not contain a newline character. *) | String of string * int * int (* [Blank n] is a document that consists of [n] blank characters. *) | Blank of int (* When in flattening mode, [IfFlat (d1, d2)] turns into the document [d1]. When not in flattening mode, it turns into the document [d2]. *) | IfFlat of document * document (* When in flattening mode, [HardLine] is illegal. When not in flattening mode, it represents a newline character, followed with an appropriate number of indentation. A safe way of using [HardLine] is to only use it directly within the right branch of an [IfFlat] construct. 
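Abstracting the output sink behind [OUTPUT] is what lets a single renderer functor drive either an [out_channel] or a [Buffer.t], as the two instances above show. Here is the pattern in miniature: a client functor written once against the signature, then instantiated at buffers (a sketch of the idea, not the [Renderer] functor below):

```ocaml
module type OUTPUT = sig
  type channel
  val char : channel -> char -> unit
  val substring : channel -> string -> int -> int -> unit
end

module BufferOutput : OUTPUT with type channel = Buffer.t = struct
  type channel = Buffer.t
  let char = Buffer.add_char
  let substring = Buffer.add_substring
end

(* A client parameterized over the output implementation: it only
   ever speaks the OUTPUT vocabulary. *)
module Writer (O : OUTPUT) = struct
  let string ch s = O.substring ch s 0 (String.length s)
end

module B = Writer (BufferOutput)
```

Instantiating [Writer] at a hypothetical [ChannelOutput] would reuse the identical client code against an [out_channel].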
*) | HardLine (* [Cat doc1 doc2] is the concatenation of the documents [doc1] and [doc2]. *) | Cat of document * document (* [Nest (j, doc)] is the document [doc], in which the indentation level has been increased by [j], that is, in which [j] blanks have been inserted after every newline character. *) | Nest of int * document (* [Group doc] represents an alternative: it is either a flattened form of [doc], in which occurrences of [Group] disappear and occurrences of [IfFlat] resolve to their left branch, or [doc] itself. *) | Group of document (* [Column f] is the document obtained by applying [f] to the current column number. *) | Column of (int -> document) (* [Nesting f] is the document obtained by applying [f] to the current indentation level, that is, the number of blanks that were printed at the beginning of the current line. *) | Nesting of (int -> document) (* ------------------------------------------------------------------------- *) (* A signature for document renderers. *) module type RENDERER = sig (* Output channels. *) type channel (* [pretty rfrac width channel document] pretty-prints the document [document] to the output channel [channel]. The parameter [width] is the maximum number of characters per line. The parameter [rfrac] is the ribbon width, a fraction relative to [width]. The ribbon width is the maximum number of non-indentation characters per line. *) val pretty: float -> int -> channel -> document -> unit (* [compact channel document] prints the document [document] to the output channel [channel]. No indentation is used. All newline instructions are respected, that is, no groups are flattened. *) val compact: channel -> document -> unit end (* ------------------------------------------------------------------------- *) (* The pretty rendering algorithm: preliminary declarations. *) (* The renderer is supposed to behave exactly like Daan Leijen's, although its implementation is quite radically different. 
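The key semantic idea in this data type is that [Group doc] denotes a choice between [doc] and its flattened form, in which every [IfFlat (d1, d2)] resolves to [d1] and [HardLine] is forbidden. A tiny reference model of flattening, independent of the renderer below (the names are mine):

```ocaml
type doc =
  | Empty
  | Text of string
  | IfFlat of doc * doc
  | HardLine
  | Cat of doc * doc
  | Group of doc

(* Flattening: [IfFlat] resolves to its first branch, [Group] is
   erased; [HardLine] must not occur under a flattened group. *)
let rec flatten = function
  | Empty | Text _ as d -> d
  | IfFlat (d, _) -> flatten d
  | HardLine -> assert false
  | Cat (d1, d2) -> Cat (flatten d1, flatten d2)
  | Group d -> flatten d

(* Print a document that contains no remaining choices. *)
let rec print_flat = function
  | Empty -> ""
  | Text s -> s
  | Cat (d1, d2) -> print_flat d1 ^ print_flat d2
  | IfFlat _ | HardLine | Group _ -> assert false
```

With [break1] defined as [IfFlat (Text " ", HardLine)], flattening a group turns every soft break into a single space, which is exactly what the real renderer's flattening mode does.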
Instead of relying on Haskell's lazy evaluation mechanism, we implement an abstract machine with mutable current state, forking, backtracking (via an explicit stack of choice points), and cut (disposal of earlier choice points). *) (* The renderer's input consists of an ordered sequence of documents. Each document carries an extra indentation level, akin to an implicit [Nest] constructor, and a ``flattening'' flag, which, if set, means that this document should be printed in flattening mode. *) (* An alternative coding style would be to avoid decorating each input document with an indentation level and a flattening mode, and allow the input sequence to contain instructions that set the current nesting level or reset the flattening mode. That would perhaps be slightly more readable, and slightly less efficient. *) type input = | INil | ICons of int * bool * document * input (* When possible (that is, when the stack is empty), the renderer writes directly to the output channel. Otherwise, output is buffered until either a failure point is reached (then, the buffered output is discarded) or a cut is reached (then, all buffered output is committed to the output channel). At all times, the length of the buffered output is at most one line. *) (* The buffered output consists of a list of characters and strings. It is stored in reverse order (the head of the list should be printed last). *) type output = | OEmpty | OChar of char * output | OString of string * int * int * output | OBlank of int * output (* The renderer maintains the following state record. For efficiency, the record is mutable; it is copied when the renderer forks, that is, at choice points. *) type 'channel state = { (* The line width and ribbon width. *) width: int; ribbon: int; (* The output channel. *) channel: 'channel; (* The current indentation level. This is the number of blanks that were printed at the beginning of the current line. *) mutable indentation: int; (* The current column. 
*) mutable column: int; (* The renderer's input. For efficiency, the input is assumed to never be empty, and the leading [ICons] constructor is inlined within the state record. In other words, the fields [nest1], [flatten1], and [input1] concern the first input document, and the field [input] contains the rest of the input sequence. *) mutable indent1: int; mutable flatten1: bool; mutable input1: document; mutable input: input; (* The renderer's buffer output. *) mutable output: output; } (* The renderer maintains a stack of resumptions, that is, states in which execution should be resumed if the current thread of execution fails by lack of space on the current line. *) (* It is not difficult to prove that the stack is empty if and only if flattening mode is off. Furthermore, when flattening mode is on, all groups are ignored, so no new choice points are pushed onto the stack. As a result, the stack has height one at most at all times, so that the stack height is zero when flattening mode is off and one when flattening mode is on. *) type 'channel stack = 'channel state list (* ------------------------------------------------------------------------- *) (* The pretty rendering algorithm: code. *) (* The renderer is parameterized over an implementation of output channels. *) module Renderer (Output : OUTPUT) = struct type channel = Output.channel (* Printing blank space (indentation characters). *) let blank_length = 80 let blank_buffer = String.make blank_length ' ' let rec blanks channel n = if n <= 0 then () else if n <= blank_length then Output.substring channel blank_buffer 0 n else begin Output.substring channel blank_buffer 0 blank_length; blanks channel (n - blank_length) end (* Committing buffered output to the output channel. The list is printed in reverse order. The code is not tail recursive, but there is no risk of stack overflow, since the length of the buffered output cannot exceed one line. 
*) let rec commit channel = function | OEmpty -> () | OChar (c, output) -> commit channel output; Output.char channel c | OString (s, ofs, len, output) -> commit channel output; Output.substring channel s ofs len | OBlank (n, output) -> commit channel output; blanks channel n (* The renderer's abstract machine. *) (* The procedures [run], [shift], [emit_char], [emit_string], and [emit_blanks] are mutually recursive, and are tail recursive. They maintain a stack and a current state. The states in the stack, and the current state, are pairwise distinct, so that the current state can be mutated without affecting the contents of the stack. *) (* An invariant is: the buffered output is nonempty only when the stack is nonempty. The contrapositive is: if the stack is empty, then the buffered output is empty. Indeed, the fact that the stack is empty means that no choices were made, so we are not in a speculative mode of execution: as a result, all output can be sent directly to the output channel. On the contrary, when the stack is nonempty, there is a possibility that we might backtrack in the future, so all output should be held in a buffer. *) (* [run] is allowed to call itself recursively only when no material is printed. In that case, the check for failure is skipped -- indeed, this test is performed only within [shift]. *) let rec run (stack : channel stack) (state : channel state) : unit = (* Examine the first piece of input, as well as (in some cases) the current flattening mode. *) match state.input1, state.flatten1 with (* The first piece of input is an empty document. Discard it and continue. *) | Empty, _ -> shift stack state (* The first piece of input is a character. Emit it and continue. *) | Char c, _ -> emit_char stack state c (* The first piece of input is a string. Emit it and continue. *) | String (s, ofs, len), _ -> emit_string stack state s ofs len | Blank n, _ -> emit_blanks stack state n (* The first piece of input is a hard newline instruction. 
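[commit] prints a backwards list by recursing all the way down before emitting anything, so output appears in its original order without an explicit [List.rev]. The same trick on a plain character list (a standalone sketch):

```ocaml
(* The buffer is kept in reverse order (most recent element first);
   printing recurses to the end of the list before emitting, which
   restores the original order. Not tail recursive, but safe here
   because the buffered output never exceeds one line. *)
let commit buf output =
  let rec go = function
    | [] -> ()
    | c :: rest -> go rest; Buffer.add_char buf c
  in
  go output
```

Prepending to a list is O(1), so building the buffer backwards and paying one recursive walk at commit time is cheaper than appending at each emission.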
Such an instruction is valid only when flattening mode is off. *) (* We emit a newline character, followed by the prescribed amount of indentation. We update the current state to record how many indentation characters were printed and to to reflect the new column number. Then, we discard the current piece of input and continue. *) | HardLine, flattening -> assert (not flattening); (* flattening mode must be off. *) assert (stack = []); (* since flattening mode is off, the stack must be empty. *) Output.char state.channel '\n'; let i = state.indent1 in blanks state.channel i; state.column <- i; state.indentation <- i; shift stack state (* The first piece of input is an [IfFlat] conditional instruction. *) | IfFlat (doc, _), true | IfFlat (_, doc), false -> state.input1 <- doc; run stack state (* The first piece of input is a concatenation operator. We take it apart and queue both documents in the input sequence. *) | Cat (doc1, doc2), _ -> state.input1 <- doc1; state.input <- ICons (state.indent1, state.flatten1, doc2, state.input); run stack state (* The first piece of input is a [Nest] operator. We increase the amount of indentation to be applied to the first input document. *) | Nest (j, doc), _ -> state.indent1 <- state.indent1 + j; state.input1 <- doc; run stack state (* The first piece of input is a [Group] operator, and flattening mode is currently off. This introduces a choice point: either we flatten this whole group, or we don't. We try the former possibility first: this is done by enabling flattening mode. Should this avenue fail, we push the current state, in which flattening mode is disabled, onto the stack. *) (* Note that the current state is copied before continuing, so that the state that is pushed on the stack is not affected by future modifications. This is a fork. *) | Group doc, false -> state.input1 <- doc; run (state :: stack) { state with flatten1 = true } (* The first piece of input is a [Group] operator, and flattening mode is currently on. 
The operator is ignored. *) | Group doc, true -> state.input1 <- doc; run stack state (* The first piece of input is a [Column] operator. The current column is fed into it, so as to produce a document, with which we continue. *) | Column f, _ -> state.input1 <- f state.column; run stack state (* The first piece of input is a [Column] operator. The current indentation level is fed into it, so as to produce a document, with which we continue. *) | Nesting f, _ -> state.input1 <- f state.indentation; run stack state (* [shift] discards the first document in the input sequence, so that the second input document, if there is one, becomes first. The renderer stops if there is none. *) and shift stack state = assert (state.output = OEmpty || stack <> []); assert (state.flatten1 = (stack <> [])); (* If the stack is nonempty and we have exceeded either the width or the ribbon width parameters, then fail. Backtracking is implemented by discarding the current state, popping a state off the stack, and making it the current state. *) match stack with | resumption :: stack when state.column > state.width || state.column - state.indentation > state.ribbon -> run stack resumption | _ -> match state.input with | INil -> (* End of input. Commit any buffered output and stop. *) commit state.channel state.output | ICons (indent, flatten, head, tail) -> (* There is an input document. Move it one slot ahead and check if we are leaving flattening mode. *) state.indent1 <- indent; state.input1 <- head; state.input <- tail; if state.flatten1 && not flatten then begin (* Leaving flattening mode means success: we have flattened a certain group, and fitted it all on a line, without reaching a failure point. We would now like to commit our decision to flatten this group. This is a Prolog cut. We discard the stack of choice points, replacing it with an empty stack, and commit all buffered output. 
*) state.flatten1 <- flatten; (* false *) commit state.channel state.output; state.output <- OEmpty; run [] state end else run stack state (* [emit_char] prints a character (either to the output channel or to the output buffer), increments the current column, discards the first piece of input, and continues. *) and emit_char stack state c = begin match stack with | [] -> Output.char state.channel c | _ -> state.output <- OChar (c, state.output) end; state.column <- state.column + 1; shift stack state (* [emit_string] prints a string (either to the output channel or to the output buffer), updates the current column, discards the first piece of input, and continues. *) and emit_string stack state s ofs len = begin match stack with | [] -> Output.substring state.channel s ofs len | _ -> state.output <- OString (s, ofs, len, state.output) end; state.column <- state.column + len; shift stack state (* [emit_blanks] prints a blank string (either to the output channel or to the output buffer), updates the current column, discards the first piece of input, and continues. *) and emit_blanks stack state n = begin match stack with | [] -> blanks state.channel n | _ -> state.output <- OBlank (n, state.output) end; state.column <- state.column + n; shift stack state (* This is the renderer's main entry point. *) let pretty rfrac width channel document = run [] { width = width; ribbon = max 0 (min width (truncate (float_of_int width *. rfrac))); channel = channel; indentation = 0; column = 0; indent1 = 0; flatten1 = false; input1 = document; input = INil; output = OEmpty; } (* ------------------------------------------------------------------------- *) (* The compact rendering algorithm. 
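The machine above decides whether to flatten a group by speculative execution: it tries the flat form, buffers the output, and backtracks if the line overflows. Wadler's original formulation reaches the same decision with an explicit "does the flat form fit?" test. As a rough model of that equivalence, assuming a stripped-down document type with no indentation, ribbon width, or backtracking (all names here are mine):

```ocaml
type doc = Text of string | Line | Cat of doc * doc | Group of doc

(* Width of the flattened form, where a line flattens to one space. *)
let rec flat_width = function
  | Text s -> String.length s
  | Line -> 1
  | Cat (a, b) -> flat_width a + flat_width b
  | Group d -> flat_width d

(* Render with page width [w]: a group is flattened iff its flat
   form fits in the space remaining on the current line. *)
let pretty w d =
  let buf = Buffer.create 64 in
  let rec go col flat d =
    match d with
    | Text s -> Buffer.add_string buf s; col + String.length s
    | Line when flat -> Buffer.add_char buf ' '; col + 1
    | Line -> Buffer.add_char buf '\n'; 0
    | Cat (a, b) -> let col = go col flat a in go col flat b
    | Group g -> go col (flat || col + flat_width g <= w) g
  in
  ignore (go 0 false d);
  Buffer.contents buf
```

The buffered-output-plus-backtracking design above computes the same answer without ever materializing [flat_width], which matters when documents contain [Column] and [Nesting] nodes whose width depends on the rendering context.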
*) let compact channel document = let column = ref 0 in let rec scan = function | Empty -> () | Char c -> Output.char channel c; column := !column + 1 | String (s, ofs, len) -> Output.substring channel s ofs len; column := !column + len | Blank n -> blanks channel n; column := !column + n | HardLine -> Output.char channel '\n'; column := 0 | Cat (doc1, doc2) -> scan doc1; scan doc2 | IfFlat (doc, _) | Nest (_, doc) | Group doc -> scan doc | Column f -> scan (f !column) | Nesting f -> scan (f 0) in scan document end (* ------------------------------------------------------------------------- *) (* Instantiating the renderers for the two kinds of output channels. *) module Channel = Renderer(ChannelOutput) module Buffer = Renderer(BufferOutput) (* ------------------------------------------------------------------------- *) (* Constructors. *) let empty = Empty let (^^) x y = match x, y with | Empty, x | x, Empty -> x | _, _ -> Cat (x, y) let ifflat doc1 doc2 = IfFlat (doc1, doc2) let hardline = HardLine let char c = assert (c <> '\n'); Char c let substring s ofs len = if len = 0 then Empty else String (s, ofs, len) let text s = substring s 0 (String.length s) let blank n = if n = 0 then Empty else Blank n let nest i x = assert (i >= 0); Nest (i, x) let column f = Column f let nesting f = Nesting f let group x = Group x (* ------------------------------------------------------------------------- *) (* Low-level combinators for alignment and indentation. *) let align d = column (fun k -> nesting (fun i -> nest (k - i) d ) ) let hang i d = align (nest i d) let indent i d = hang i (blank i ^^ d) (* ------------------------------------------------------------------------- *) (* High-level combinators. 
*) let lparen = char '(' let rparen = char ')' let langle = char '<' let rangle = char '>' let lbrace = char '{' let rbrace = char '}' let lbracket = char '[' let rbracket = char ']' let squote = char '\'' let dquote = char '"' let bquote = char '`' let semi = char ';' let colon = char ':' let comma = char ',' let space = char ' ' let dot = char '.' let sharp = char '#' let backslash = char '\\' let equals = char '=' let qmark = char '?' let tilde = char '~' let at = char '@' let percent = char '%' let dollar = char '$' let caret = char '^' let ampersand = char '&' let star = char '*' let plus = char '+' let minus = char '-' let underscore = char '_' let bang = char '!' let bar = char '|' let break i = ifflat (text (String.make i ' ')) hardline let break0 = ifflat empty hardline let break1 = ifflat space hardline let string s = let n = String.length s in let rec chop i = try let j = String.index_from s i '\n' in substring s i (j - i) ^^ break1 ^^ chop (j + 1) with Not_found -> substring s i (n - i) in chop 0 let group_break1 = group break1 let words s = let n = String.length s in let rec blank accu i = (* we have skipped over at least one blank character *) if i = n then accu ^^ group_break1 else match s.[i] with | ' ' | '\t' | '\n' | '\r' -> blank accu (i + 1) | _ -> word break1 accu i (i + 1) and word prefix accu i j = (* we have skipped over at least one non-blank character *) if j = n then accu ^^ group (prefix ^^ substring s i (j - i)) else match s.[j] with | ' ' | '\t' | '\n' | '\r' -> blank (accu ^^ group (prefix ^^ substring s i (j - i))) (j + 1) | _ -> word prefix accu i (j + 1) in if n = 0 then empty else match s.[0] with | ' ' | '\t' | '\n' | '\r' -> blank empty 1 | _ -> word empty empty 0 1 let enclose l r x = l ^^ x ^^ r let squotes = enclose squote squote let dquotes = enclose dquote dquote let bquotes = enclose bquote bquote let braces = enclose lbrace rbrace let parens = enclose lparen rparen let angles = enclose langle rangle let brackets = enclose 
lbracket rbracket let fold f docs = List.fold_right f docs empty let rec fold1 f docs = match docs with | [] -> empty | [ doc ] -> doc | doc :: docs -> f doc (fold1 f docs) let rec fold1map f g docs = match docs with | [] -> empty | [ doc ] -> g doc | doc :: docs -> let doc = g doc in (* force left-to-right evaluation *) f doc (fold1map f g docs) let sepmap sep g docs = fold1map (fun x y -> x ^^ sep ^^ y) g docs let optional f = function | None -> empty | Some x -> f x let group1 d = group (nest 1 d) let group2 d = group (nest 2 d) module Operators = struct let ( !^ ) = text let ( ^^ ) = ( ^^ ) let ( ^/^ ) x y = x ^^ break1 ^^ y let ( ^//^ ) x y = group (x ^^ nest 2 (break1 ^^ y)) let ( ^@^ ) x y = group (x ^^ break1 ^^ y) let ( ^@@^ ) x y = group2 (x ^^ break1 ^^ y) end open Operators let prefix op x = !^op ^//^ x let infix op x y = (x ^^ space ^^ !^op) ^//^ y let infix_dot op x y = group2 ((x ^^ !^op) ^^ break0 ^^ y) let infix_com op x y = x ^^ !^op ^^ group_break1 ^^ y let surround n sep open_doc contents close_doc = group (open_doc ^^ nest n (sep ^^ contents) ^^ sep ^^ close_doc) let surround1 open_txt contents close_txt = surround 1 break0 !^open_txt contents !^close_txt let surround2 open_txt contents close_txt = surround 2 break1 !^open_txt contents !^close_txt let soft_surround n sep open_doc contents close_doc = group (open_doc ^^ nest n (group sep ^^ contents) ^^ group (sep ^^ close_doc)) let seq indent break empty_seq open_seq sep_seq close_seq = function | [] -> empty_seq | xs -> surround indent break open_seq (fold1 (fun x xs -> x ^^ sep_seq ^^ xs) xs) close_seq let seq1 open_txt sep_txt close_txt = seq 1 break0 !^(open_txt ^ close_txt) !^open_txt (!^sep_txt ^^ break1) !^close_txt let seq2 open_txt sep_txt close_txt = seq 2 break1 !^(open_txt ^ close_txt) !^open_txt (!^sep_txt ^^ break1) !^close_txt let sprintf fmt = Printf.ksprintf string fmt (* A signature for value representations. 
This is compatible with the associated Camlp4 generator: SwitchValueRepresentation *) module type VALUE_REPRESENTATION = sig (* The type of value representation *) type t (* [variant type_name data_constructor_name tag arguments] Given information about the variant and its arguments, this function produces a new value representation. *) val variant : string -> string -> int -> t list -> t (* [record type_name fields] Given a type name and a list of record fields, this function produces the value representation of a record. *) val record : string -> (string * t) list -> t (* [tuple arguments] Given a list of value representation this function produces a new value representation. *) val tuple : t list -> t (* ------------------------------------------------------------------------- *) (* Value representation for primitive types. *) val string : string -> t val int : int -> t val int32 : int32 -> t val int64 : int64 -> t val nativeint : nativeint -> t val float : float -> t val char : char -> t val bool : bool -> t val option : ('a -> t) -> 'a option -> t val list : ('a -> t) -> 'a list -> t val array : ('a -> t) -> 'a array -> t val ref : ('a -> t) -> 'a ref -> t (* Value representation for any other value. *) val unknown : string -> 'a -> t end module type DOCUMENT_VALUE_REPRESENTATION = VALUE_REPRESENTATION with type t = document (* please remove as soon as this will be available in ocaml *) module MissingFloatRepr = struct let valid_float_lexeme s = let l = String.length s in let rec loop i = if i >= l then s ^ "." else match s.[i] with | '0' .. 
'9' | '-' -> loop (i+1) | _ -> s in loop 0 let float_repres f = match classify_float f with FP_nan -> "nan" | FP_infinite -> if f < 0.0 then "neg_infinity" else "infinity" | _ -> let s1 = Printf.sprintf "%.12g" f in if f = float_of_string s1 then valid_float_lexeme s1 else let s2 = Printf.sprintf "%.15g" f in if f = float_of_string s2 then valid_float_lexeme s2 else Printf.sprintf "%.18g" f end module ML = struct type t = document let tuple = seq1 "(" "," ")" let variant _ cons _ args = if args = [] then !^cons else !^cons ^^ tuple args let record _ fields = seq2 "{" ";" "}" (List.map (fun (k, v) -> infix ":" !^k v) fields) let option f = function | Some x -> !^"Some" ^^ tuple [f x] | None -> !^"None" let list f xs = seq2 "[" ";" "]" (List.map f xs) let array f xs = seq2 "[|" ";" "|]" (Array.to_list (Array.map f xs)) let ref f x = record "ref" ["contents", f !x] let float f = string (MissingFloatRepr.float_repres f) let int = sprintf "%d" let int32 = sprintf "%ld" let int64 = sprintf "%Ld" let nativeint = sprintf "%nd" let char = sprintf "%C" let bool = sprintf "%B" let string = sprintf "%S" let unknown tyname _ = sprintf "" tyname end (* Deprecated *) let line = ifflat space hardline let linebreak = ifflat empty hardline let softline = group line let softbreak = group linebreak menhir-20200123/src/pprint.mli000066400000000000000000000220321361226111300160300ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* ------------------------------------------------------------------------- *) (* Basic combinators for building documents. *) type document val empty: document val hardline: document val char: char -> document val substring: string -> int -> int -> document val text: string -> document val blank: int -> document val (^^): document -> document -> document val nest: int -> document -> document val column: (int -> document) -> document val nesting: (int -> document) -> document val group: document -> document val ifflat: document -> document -> document (* ------------------------------------------------------------------------- *) (* Low-level combinators for alignment and indentation. *) val align: document -> document val hang: int -> document -> document val indent: int -> document -> document (* ------------------------------------------------------------------------- *) (* High-level combinators for building documents. *) (* [break n] Puts [n] spaces in flat mode and a new line otherwise. 
Equivalent to: [ifflat (String.make n ' ') hardline] *) val break: int -> document (* [break0] equivalent to [break 0] *) val break0: document (* [break1] equivalent to [break 1] *) val break1: document val string: string -> document val words: string -> document val lparen: document val rparen: document val langle: document val rangle: document val lbrace: document val rbrace: document val lbracket: document val rbracket: document val squote: document val dquote: document val bquote: document val semi: document val colon: document val comma: document val space: document val dot: document val sharp: document val backslash: document val equals: document val qmark: document val tilde: document val at: document val percent: document val dollar: document val caret: document val ampersand: document val star: document val plus: document val minus: document val underscore: document val bang: document val bar: document val squotes: document -> document val dquotes: document -> document val bquotes: document -> document val braces: document -> document val parens: document -> document val angles: document -> document val brackets: document -> document val fold: (document -> document -> document) -> document list -> document val fold1: (document -> document -> document) -> document list -> document val fold1map: (document -> document -> document) -> ('a -> document) -> 'a list -> document val sepmap: document -> ('a -> document) -> 'a list -> document val optional: ('a -> document) -> 'a option -> document (* [prefix left right] Flat layout: [left] [right] Otherwise: [left] [right] *) val prefix: string -> document -> document (* [infix middle left right] Flat layout: [left] [middle] [right] Otherwise: [left] [middle] [right] *) val infix: string -> document -> document -> document (* [infix_com middle left right] Flat layout: [left][middle] [right] Otherwise: [left][middle] [right] *) val infix_com: string -> document -> document -> document (* [infix_dot middle left right] 
Flat layout: [left][middle][right] Otherwise: [left][middle] [right] *) val infix_dot: string -> document -> document -> document (* [surround nesting break open_doc contents close_doc] *) val surround: int -> document -> document -> document -> document -> document (* [surround1 open_txt contents close_txt] Flat: [open_txt][contents][close_txt] Otherwise: [open_txt] [contents] [close_txt] *) val surround1: string -> document -> string -> document (* [surround2 open_txt contents close_txt] Flat: [open_txt] [contents] [close_txt] Otherwise: [open_txt] [contents] [close_txt] *) val surround2: string -> document -> string -> document (* [soft_surround nesting break open_doc contents close_doc] *) val soft_surround: int -> document -> document -> document -> document -> document (* [seq indent break empty_seq open_seq sep_seq close_seq contents] *) val seq: int -> document -> document -> document -> document -> document -> document list -> document (* [seq1 open_seq sep_seq close_seq contents] Flat layout: [open_seq][contents][sep_seq]...[sep_seq][contents][close_seq] Otherwise: [open_seq] [contents][sep_seq]...[sep_seq][contents] [close_seq] *) val seq1: string -> string -> string -> document list -> document (* [seq2 open_seq sep_seq close_seq contents] Flat layout: [open_seq] [contents][sep_seq]...[sep_seq][contents] [close_seq] Otherwise: [open_seq] [contents][sep_seq]...[sep_seq][contents] [close_seq] *) val seq2: string -> string -> string -> document list -> document (* [group1 d] equivalent to [group (nest 1 d)] *) val group1: document -> document (* [group2 d] equivalent to [group (nest 2 d)] *) val group2: document -> document module Operators : sig val ( ^^ ) : document -> document -> document val ( !^ ) : string -> document val ( ^/^ ) : document -> document -> document val ( ^//^ ) : document -> document -> document val ( ^@^ ) : document -> document -> document val ( ^@@^ ) : document -> document -> document end (* 
------------------------------------------------------------------------- *) (* A signature for document renderers. *) module type RENDERER = sig (* Output channels. *) type channel (* [pretty rfrac width channel document] pretty-prints the document [document] to the output channel [channel]. The parameter [width] is the maximum number of characters per line. The parameter [rfrac] is the ribbon width, a fraction relative to [width]. The ribbon width is the maximum number of non-indentation characters per line. *) val pretty: float -> int -> channel -> document -> unit (* [compact channel document] prints the document [document] to the output channel [channel]. No indentation is used. All newline instructions are respected, that is, no groups are flattened. *) val compact: channel -> document -> unit end (* ------------------------------------------------------------------------- *) (* Renderers to output channels and to memory buffers. *) module Channel : RENDERER with type channel = out_channel module Buffer : RENDERER with type channel = Buffer.t (* ------------------------------------------------------------------------- *) (* A signature for value representations. This is compatible with the associated Camlp4 generator: SwitchValueRepresentation *) module type VALUE_REPRESENTATION = sig (* The type of value representation *) type t (* [variant type_name data_constructor_name tag arguments] Given information about the variant and its arguments, this function produces a new value representation. *) val variant : string -> string -> int -> t list -> t (* [record type_name fields] Given a type name and a list of record fields, this function produces the value representation of a record. *) val record : string -> (string * t) list -> t (* [tuple arguments] Given a list of value representation this function produces a new value representation. 
*) val tuple : t list -> t (* ------------------------------------------------------------------------- *) (* Value representation for primitive types. *) val string : string -> t val int : int -> t val int32 : int32 -> t val int64 : int64 -> t val nativeint : nativeint -> t val float : float -> t val char : char -> t val bool : bool -> t val option : ('a -> t) -> 'a option -> t val list : ('a -> t) -> 'a list -> t val array : ('a -> t) -> 'a array -> t val ref : ('a -> t) -> 'a ref -> t (* Value representation for any other value. *) val unknown : string -> 'a -> t end (* A signature for source printers. *) module type DOCUMENT_VALUE_REPRESENTATION = VALUE_REPRESENTATION with type t = document module ML : DOCUMENT_VALUE_REPRESENTATION (* Deprecated *) val line: document val linebreak: document val softline: document val softbreak: document menhir-20200123/src/printer.ml000066400000000000000000000516361361226111300160420ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A pretty-printer for [IL]. *) open IL module PreliminaryMake (X : sig (* We assume that the following types and functions are given. This allows us to work both with buffers of type [Buffer.t] and with output channels of type [out_channel]. *) type channel val fprintf: channel -> ('a, channel, unit) format -> 'a val output_substring: channel -> string -> int -> int -> unit (* This is the channel that is being written to. *) val f: channel (* [locate_stretches] controls the way we print OCaml stretches (types and semantic actions). 
If it is [Some dstfilename], where [dstfilename] is the name of the file that is being written, then we surround stretches with OCaml line number directives of the form # . If it is [None], then we don't. *) (* Providing line number directives allows the OCaml typechecker to report type errors in the .mly file, instead of in the generated .ml / .mli files. Line number directives also affect the dynamic semantics of any [assert] statements contained in semantic actions: when they are provided, the [Assert_failure] exception carries a location in the .mly file. As a general rule of thumb, line number directives should always be provided, except perhaps where we think that they decrease readability (e.g., in a generated .mli file). *) val locate_stretches: string option end) = struct open X let output_char f c = fprintf f "%c" c let output_string f s = fprintf f "%s" s let flush f = fprintf f "%!" (* ------------------------------------------------------------------------- *) (* Dealing with newlines and indentation. *) let maxindent = 120 let whitespace = String.make maxindent ' ' let indentation = ref 0 let line = ref 1 (* [rawnl] is, in principle, the only place where writing a newline character to the output channel is permitted. This ensures that the line counter remains correct. But see also [stretch] and [typ0]. *) let rawnl f = incr line; output_char f '\n' let nl f = rawnl f; output_substring f whitespace 0 !indentation let indent ofs producer f x = let old_indentation = !indentation in let new_indentation = old_indentation + ofs in if new_indentation <= maxindent then indentation := new_indentation; nl f; producer f x; indentation := old_indentation (* This produces a line number directive. *) let sharp f line file = fprintf f "%t# %d \"%s\"%t" rawnl line file rawnl (* ------------------------------------------------------------------------- *) (* Printers of atomic elements. 
*) let nothing _ = () let space f = output_char f ' ' let comma f = output_string f ", " let semi f = output_char f ';' let seminl f = semi f; nl f let times f = output_string f " * " let letrec f = output_string f "let rec " let letnonrec f = output_string f "let " let keytyp f = output_string f "type " let exc f = output_string f "exception " let et f = output_string f "and " let var f x = output_string f x let bar f = output_string f " | " (* ------------------------------------------------------------------------- *) (* List printers. *) (* A list with a separator in front of every element. *) let rec list elem sep f = function | [] -> () | e :: es -> fprintf f "%t%a%a" sep elem e (list elem sep) es (* A list with a separator between elements. *) let seplist elem sep f = function | [] -> () | e :: es -> fprintf f "%a%a" elem e (list elem sep) es (* OCaml type parameters. *) let typeparams p0 p1 f = function | [] -> () | [ param ] -> fprintf f "%a " p0 param | _ :: _ as params -> fprintf f "(%a) " (seplist p1 comma) params (* ------------------------------------------------------------------------- *) (* Expression printer. *) (* We use symbolic constants that stand for subsets of the expression constructors. We do not use numeric levels to stand for subsets, because our subsets do not form a linear inclusion chain. *) type subset = | All | AllButSeq | AllButFunTryMatch | AllButFunTryMatchSeq | AllButLetFunTryMatch | AllButLetFunTryMatchSeq | AllButIfThenSeq | OnlyAppOrAtom | OnlyAtom (* This computes the intersection of a subset with the constraint "should not be a sequence". *) let andNotSeq = function | All | AllButSeq -> AllButSeq | AllButFunTryMatch | AllButFunTryMatchSeq -> AllButFunTryMatchSeq | AllButLetFunTryMatch | AllButLetFunTryMatchSeq -> AllButLetFunTryMatchSeq | AllButIfThenSeq -> AllButIfThenSeq | OnlyAppOrAtom -> OnlyAppOrAtom | OnlyAtom -> OnlyAtom (* This defines the semantics of subsets by relating expressions with subsets. 
*) let rec member e k = match e with | EComment _ | EPatComment _ -> true | EFun _ | ETry _ | EMatch _ -> begin match k with | AllButFunTryMatch | AllButFunTryMatchSeq | AllButLetFunTryMatch | AllButLetFunTryMatchSeq | OnlyAppOrAtom | OnlyAtom -> false | _ -> true end | ELet ([], e) -> member e k | ELet ((PUnit, _) :: _, _) -> begin match k with | AllButSeq | AllButFunTryMatchSeq | AllButLetFunTryMatchSeq | AllButIfThenSeq | OnlyAppOrAtom | OnlyAtom -> false | _ -> true end | ELet (_ :: _, _) -> begin match k with | AllButLetFunTryMatch | AllButLetFunTryMatchSeq | OnlyAppOrAtom | OnlyAtom -> false | _ -> true end | EIfThen _ -> begin match k with | AllButIfThenSeq | OnlyAppOrAtom | OnlyAtom -> false | _ -> true end | EApp (_, _ :: _) | EData (_, _ :: _) | EMagic _ | ERepr _ | ERaise _ -> begin match k with | OnlyAtom -> false | _ -> true end | ERecordWrite _ | EIfThenElse _ -> begin match k with | OnlyAppOrAtom | OnlyAtom -> false | _ -> true end | EVar _ | ETextual _ | EApp (_, []) | EData (_, []) | ETuple _ | EAnnot _ | ERecord _ | ERecordAccess (_, _) | EIntConst _ | EStringConst _ | EUnit | EArray _ | EArrayAccess (_, _) -> true let rec exprlet k pes f e2 = match pes with | [] -> exprk k f e2 | (PUnit, e1) :: pes -> fprintf f "%a%t%a" (exprk AllButLetFunTryMatch) e1 seminl (exprlet k pes) e2 | (PVar id1, EAnnot (e1, ts1)) :: pes -> (* TEMPORARY current ocaml does not support type schemes here; drop quantifiers, if any *) fprintf f "let %s : %a = %a in%t%a" id1 typ ts1.body (* scheme ts1 *) expr e1 nl (exprlet k pes) e2 | (PVar id1, EFun (ps1, e1)) :: pes -> fprintf f "let %s%a = %a in%t%t%a" id1 (list pat0 space) ps1 (indent 2 expr) e1 nl nl (exprlet k pes) e2 | (p1, (ELet _ as e1)) :: pes -> fprintf f "let %a =%a%tin%t%a" pat p1 (indent 2 expr) e1 nl nl (exprlet k pes) e2 | (p1, e1) :: pes -> fprintf f "let %a = %a in%t%a" pat p1 expr e1 nl (exprlet k pes) e2 and atom f e = exprk OnlyAtom f e and app f e = exprk OnlyAppOrAtom f e and expr f e = exprk All f e 
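The parenthesization discipline used by `member` and `exprk` above — print an expression bare when it belongs to the `subset` its context allows, and wrap it in parentheses otherwise — can be illustrated in isolation. The following is a hypothetical, self-contained sketch with a toy expression type; it is not part of the Menhir sources, and the names (`expr`, `subset`, `print`) are chosen only to mirror the code above.

```ocaml
(* Toy model of the subset-based parenthesization discipline:
   an expression is printed bare if [member e k] holds for the
   subset [k] allowed by its context, parenthesized otherwise. *)
type expr = Int of int | Add of expr * expr | Neg of expr

type subset = All | OnlyAtom

let member e k =
  match e, k with
  | Int _, _ -> true                    (* atoms fit in any context *)
  | (Add _ | Neg _), All -> true
  | (Add _ | Neg _), OnlyAtom -> false  (* must be parenthesized *)

let rec print k e =
  if member e k then
    match e with
    | Int n -> string_of_int n
    | Add (e1, e2) -> print All e1 ^ " + " ^ print All e2
    | Neg e -> "- " ^ print OnlyAtom e  (* argument must be atomic *)
  else
    "(" ^ print All e ^ ")"
```

With this sketch, `print All (Neg (Add (Int 1, Int 2)))` yields `"- (1 + 2)"`: the `Add` node is not a member of `OnlyAtom`, so it gets parenthesized, exactly as `atom`/`app`/`expr` above select `OnlyAtom`/`OnlyAppOrAtom`/`All` for their subexpressions.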
and exprk k f e = if member e k then match e with | EComment (c, e) -> if Settings.comment then fprintf f "(* %s *)%t%a" c nl (exprk k) e else exprk k f e | EPatComment (s, p, e) -> if Settings.comment then fprintf f "(* %s%a *)%t%a" s pat p nl (exprk k) e else exprk k f e | ELet (pes, e2) -> exprlet k pes f e2 | ERecordWrite (e1, field, e2) -> fprintf f "%a.%s <- %a" atom e1 field (exprk (andNotSeq k)) e2 | EMatch (_, []) -> assert false | EMatch (e, brs) -> fprintf f "match %a with%a" expr e (branches k) brs | ETry (_, []) -> assert false | ETry (e, brs) -> fprintf f "try%a%twith%a" (indent 2 expr) e nl (branches k) brs | EIfThen (e1, e2) -> fprintf f "if %a then%a" expr e1 (indent 2 (exprk (andNotSeq k))) e2 | EIfThenElse (e0, e1, e2) -> fprintf f "if %a then%a%telse%a" expr e0 (indent 2 (exprk AllButIfThenSeq)) e1 nl (indent 2 (exprk (andNotSeq k))) e2 | EFun (ps, e) -> fprintf f "fun%a ->%a" (list pat0 space) ps (indent 2 (exprk k)) e | EApp (EVar op, [ e1; e2 ]) when op.[0] = '(' && op.[String.length op - 1] = ')' -> let op = String.sub op 1 (String.length op - 2) in fprintf f "%a %s %a" app e1 op app e2 | EApp (e, args) -> fprintf f "%a%a" app e (list atom space) args | ERaise e -> fprintf f "raise %a" atom e | EMagic e -> fprintf f "Obj.magic %a" atom e | ERepr e -> fprintf f "Obj.repr %a" atom e | EData (d, []) -> var f d | EData (d, [ arg ]) -> fprintf f "%s %a" d atom arg | EData ("::", [ arg1; arg2 ]) -> (* Special case for infix cons. 
*) fprintf f "%a :: %a" atom arg1 atom arg2 | EData (d, (_ :: _ :: _ as args)) -> fprintf f "%s (%a)" d (seplist app comma) args | EVar v -> var f v | ETextual action -> stretch false f action | EUnit -> fprintf f "()" | EIntConst k -> if k >= 0 then fprintf f "%d" k else fprintf f "(%d)" k | EStringConst s -> fprintf f "\"%s\"" (Compatibility.String.escaped s) | ETuple [] -> assert false | ETuple [ e ] -> atom f e | ETuple (_ :: _ :: _ as es) -> fprintf f "(%a)" (seplist app comma) es | EAnnot (e, s) -> (* TEMPORARY current ocaml does not support type schemes here; drop quantifiers, if any *) fprintf f "(%a : %a)" app e typ s.body (* should be scheme s *) | ERecordAccess (e, field) -> fprintf f "%a.%s" atom e field | ERecord fs -> fprintf f "{%a%t}" (indent 2 (seplist field nl)) fs nl | EArray fs -> fprintf f "[|%a%t|]" (indent 2 (seplist array_field nl)) fs nl | EArrayAccess (e, i) -> fprintf f "%a.(%a)" atom e expr i else fprintf f "(%a)" expr e and stretch raw f stretch = let content = stretch.Stretch.stretch_content and raw_content = stretch.Stretch.stretch_raw_content in match X.locate_stretches with | Some basename -> sharp f stretch.Stretch.stretch_linenum stretch.Stretch.stretch_filename; output_string f content; line := !line + stretch.Stretch.stretch_linecount; sharp f (!line + 2) basename; output_substring f whitespace 0 !indentation | None -> output_string f (if raw then raw_content else content) and branches k f = function | [] -> () | [ br ] -> fprintf f "%t| %a" nl (branch k) br | br :: brs -> fprintf f "%t| %a%a" nl (branch AllButFunTryMatch) br (branches k) brs and branch k f br = fprintf f "%a ->%a" pat br.branchpat (indent 4 (exprk k)) br.branchbody and field f (label, e) = fprintf f "%s = %a%t" label app e semi and fpat f (label, p) = fprintf f "%s = %a%t" label pat p semi and array_field f e = fprintf f "%a%t" app e semi and pat0 f = function | PUnit -> fprintf f "()" | PWildcard -> fprintf f "_" | PVar x -> var f x | PData (d, []) -> var f d 
| PTuple [] -> assert false | PTuple [ p ] -> pat0 f p | PTuple (_ :: _ :: _ as ps) -> fprintf f "(%a)" (seplist pat1 comma) ps | PAnnot (p, t) -> fprintf f "(%a : %a)" pat p typ t | PRecord fps -> (* 2018/10/19. In a record pattern, we used to omit bindings of the form [field = _]. However, this triggers OCaml's warning 9. We now print all bindings. *) fprintf f "{%a%t}" (indent 2 (seplist fpat nl)) fps nl | p -> fprintf f "(%a)" pat p and pat1 f = function | PData (d, [ arg ]) -> fprintf f "%s %a" d pat0 arg | PData (d, (_ :: _ :: _ as args)) -> fprintf f "%s (%a)" d (seplist pat1 comma) args | PTuple [ p ] -> pat1 f p | p -> pat0 f p and pat2 f = function | POr [] -> assert false | POr (_ :: _ as ps) -> seplist pat2 bar f ps | PTuple [ p ] -> pat2 f p | p -> pat1 f p and pat f p = pat2 f p and typevar f = function | "_" -> fprintf f "_" | v -> fprintf f "'%s" v and typ0 f = function | TypTextual (Stretch.Declared ocamltype) -> (* Parentheses are necessary to avoid confusion between 1 - ary data constructor with n arguments and n - ary data constructor. *) fprintf f "(%a)" (stretch true) ocamltype | TypTextual (Stretch.Inferred t) -> line := !line + LineCount.count 0 (Lexing.from_string t); fprintf f "(%s)" t | TypVar v -> typevar f v | TypApp (t, params) -> fprintf f "%a%s" (typeparams typ0 typ) params t | t -> fprintf f "(%a)" typ t and typ1 f = function | TypTuple [] -> assert false | TypTuple (_ :: _ as ts) -> seplist typ0 times f ts | t -> typ0 f t and typ2 f = function | TypArrow (t1, t2) -> fprintf f "%a -> %a" typ1 t1 typ2 t2 | t -> typ1 f t and typ f = typ2 f and scheme f scheme = match scheme.quantifiers with | [] -> typ f scheme.body | qs -> fprintf f "%a. %a" (list typevar space) qs typ scheme.body (* ------------------------------------------------------------------------- *) (* Toplevel definition printer. *) (* The tuple of the arguments of a data constructor. 
*) let datavalparams f params = (* [typ1] because [type t = A of int -> int ] is not allowed by OCaml *) (* [type t = A of (int -> int)] is allowed *) seplist typ1 times f params (* A data constructor definition. *) let datadef typename f def = fprintf f " | %s" def.dataname; match def.datavalparams, def.datatypeparams with | [], None -> (* | A *) () | _ :: _, None -> (* | A of t * u *) fprintf f " of %a" datavalparams def.datavalparams | [], Some indices -> (* | A : (v, w) ty *) fprintf f " : %a%s" (typeparams typ0 typ) indices typename | _ :: _, Some indices -> (* | A : t * u -> (v, w) ty *) fprintf f " : %a -> %a%s" datavalparams def.datavalparams (typeparams typ0 typ) indices typename let fielddef f def = fprintf f " %s%s: %a" (if def.modifiable then "mutable " else "") def.fieldname scheme def.fieldtype let typerhs typename f = function | TDefRecord [] -> assert false | TDefRecord (_ :: _ as fields) -> fprintf f " = {%t%a%t}" nl (seplist fielddef seminl) fields nl | TDefSum [] -> () | TDefSum defs -> fprintf f " = %a" (list (datadef typename) nl) defs | TAbbrev t -> fprintf f " = %a" typ t let typeconstraint f = function | None -> () | Some (t1, t2) -> fprintf f "%tconstraint %a = %a" nl typ t1 typ t2 let typedef f def = fprintf f "%a%s%a%a" (typeparams typevar typevar) def.typeparams def.typename (typerhs def.typename) def.typerhs typeconstraint def.typeconstraint let rec pdefs pdef sep1 sep2 f = function | [] -> () | [ def ] -> fprintf f "%t%a" sep1 pdef def | def :: defs -> fprintf f "%t%a%t%t%a" sep1 pdef def (* Separate two successive items with two newlines. 
*) nl nl (pdefs pdef sep2 sep2) defs let valdef f = function | { valpat = PVar id; valval = EAnnot (e, ts) } -> (* TEMPORARY current ocaml does not support type schemes here; drop quantifiers, if any *) fprintf f "%s : %a =%a" id typ ts.body (* scheme ts *) (indent 2 expr) e | { valpat = p; valval = e } -> fprintf f "%a =%a" pat p (indent 2 expr) e let valdefs recursive = pdefs valdef (if recursive then letrec else letnonrec) et let typedefs = pdefs typedef keytyp et let excdef in_intf f def = match in_intf, def.exceq with | _, None | true, Some _ -> fprintf f "%s" def.excname | false, Some s -> fprintf f "%s = %s" def.excname s let excdefs in_intf = pdefs (excdef in_intf) exc exc let block format body f b = fprintf f format (fun f b -> indent 2 body f b; nl f ) b (* Convention: each structure (or interface) item prints a newline before and after itself. *) let rec structure_item f item = match item with | SIFunctor ([], s) -> structure f s | SIStretch stretches -> List.iter (stretch false f) stretches | _ -> nl f; begin match item with | SIFunctor (params, s) -> fprintf f "module Make%a%t= %a" (list (stretch false) nl) params nl structend s | SIExcDefs defs -> excdefs false f defs | SITypeDefs defs -> typedefs f defs | SIValDefs (recursive, defs) -> valdefs recursive f defs | SIStretch _ -> assert false (* already handled above *) | SIModuleDef (name, rhs) -> fprintf f "module %s = %a" name modexpr rhs | SIInclude e -> fprintf f "include %a" modexpr e | SIComment comment -> fprintf f "(* %s *)" comment end; nl f and structend f s = block "struct%aend" structure f s and structure f s = list structure_item nothing f s and modexpr f = function | MVar x -> fprintf f "%s" x | MStruct s -> structend f s | MApp (e1, e2) -> fprintf f "%a (%a)" modexpr e1 modexpr e2 let valdecl f (x, ts) = fprintf f "val %s: %a" x typ ts.body let with_kind f = function | WKNonDestructive -> output_string f "=" | WKDestructive -> output_string f ":=" let rec module_type f = function | 
MTNamedModuleType s -> output_string f s | MTWithType (mt, params, name, wk, t) -> fprintf f "%a%a" module_type mt (indent 2 with_type) (params, name, wk, t) | MTSigEnd i -> sigend f i and with_type f (params, name, wk, t) = fprintf f "with type %a %a %a" typ (TypApp (name, List.map (fun v -> TypVar v) params)) with_kind wk typ t and interface_item f item = match item with | IIFunctor ([], i) -> interface f i | _ -> nl f; begin match item with | IIFunctor (params, i) -> fprintf f "module Make%a%t: %a" (list (stretch false) nl) params nl sigend i | IIExcDecls defs -> excdefs true f defs | IITypeDecls defs -> typedefs f defs | IIValDecls decls -> pdefs valdecl nothing nothing f decls | IIInclude mt -> fprintf f "include %a" module_type mt | IIModule (name, mt) -> fprintf f "module %s : %a" name module_type mt | IIComment comment -> fprintf f "(* %s *)" comment end; nl f and sigend f i = block "sig%aend" interface f i and interface f i = list interface_item nothing f i let program s = structure X.f s; flush X.f let interface i = interface X.f i; flush X.f let expr e = expr X.f e; flush X.f end (* ------------------------------------------------------------------------- *) (* Instantiation with output channels. *) module Make (X : sig val f: out_channel val locate_stretches: string option end) = struct include PreliminaryMake(struct type channel = out_channel include X let fprintf = Printf.fprintf let output_substring = output_substring end) end (* ------------------------------------------------------------------------- *) (* Instantiation with buffers. *) module MakeBuffered (X : sig val f: Buffer.t val locate_stretches: string option end) = struct include PreliminaryMake(struct type channel = Buffer.t include X let fprintf = Printf.bprintf let output_substring = Buffer.add_substring end) end (* ------------------------------------------------------------------------- *) (* Common instantiations. 
*) let print_expr f e = let module P = Make (struct let f = f let locate_stretches = None end) in P.expr e let string_of_expr e = let b = Buffer.create 512 in let module P = MakeBuffered (struct let f = b let locate_stretches = None end) in P.expr e; Buffer.contents b menhir-20200123/src/printer.mli000066400000000000000000000045761361226111300162140ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A pretty-printer for [IL]. *) module Make (X : sig (* This is the channel that is being written to. *) val f: out_channel (* [locate_stretches] controls the way we print OCaml stretches (types and semantic actions). If it is [Some dstfilename], where [dstfilename] is the name of the file that is being written, then we surround stretches with OCaml line number directives of the form # . If it is [None], then we don't. *) (* Providing line number directives allows the OCaml typechecker to report type errors in the .mly file, instead of in the generated .ml / .mli files. Line number directives also affect the dynamic semantics of any [assert] statements contained in semantic actions: when they are provided, the [Assert_failure] exception carries a location in the .mly file. As a general rule of thumb, line number directives should always be provided, except perhaps where we think that they decrease readability (e.g., in a generated .mli file). *) val locate_stretches: string option end) : sig val program: IL.program -> unit val expr: IL.expr -> unit val interface: IL.interface -> unit end (* Common instantiations. 
In the following two functions, [locate_stretches] is [None], so no line number directives are printed. *) val print_expr: out_channel -> IL.expr -> unit val string_of_expr: IL.expr -> string menhir-20200123/src/rawPrinter.ml000066400000000000000000000133031361226111300165010ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A debugging pretty-printer for [IL]. Newlines are used liberally, so as to facilitate diffs. *) open IL module Make (X : sig (* This is the channel that is being written to. *) val f: out_channel end) = struct (* ------------------------------------------------------------------------- *) (* XML-style trees. *) type tree = | Node of string * tree list let node label ts = Node (label, ts) (* ------------------------------------------------------------------------- *) (* Dealing with newlines and indentation. *) let maxindent = 120 let whitespace = Bytes.make maxindent ' ' let indentation = ref 0 let line = ref 1 (* [rawnl] is, in principle, the only place where writing a newline character to the output channel is permitted. This ensures that the line counter remains correct. But see also [stretch] and [typ0]. 
*)

  let rawnl f =
    incr line;
    output_char f '\n'

  let nl f =
    rawnl f;
    output f whitespace 0 !indentation

  let indent ofs producer f x =
    let old_indentation = !indentation in
    let new_indentation = old_indentation + ofs in
    if new_indentation <= maxindent then
      indentation := new_indentation;
    nl f;
    producer f x;
    indentation := old_indentation

  (* ------------------------------------------------------------------------- *)
  (* Tree printers. *)

  let rec print_tree f = function
    | Node (label, []) ->
        output_char f '<';
        output_string f label;
        output_char f '/';
        output_char f '>';
        nl f
    | Node (label, ts) ->
        output_char f '<';
        output_string f label;
        output_char f '>';
        indent 2 print_trees f ts;
        output_char f '<';
        output_char f '/';
        output_string f label;
        output_char f '>';
        nl f

  and print_trees f = function
    | [] ->
        ()
    | t :: ts ->
        print_tree f t;
        print_trees f ts

  (* ------------------------------------------------------------------------- *)
  (* Expression-to-tree converter. *)

  let rec expr e =
    match e with
    | EComment (c, e) ->
        node "comment" [ string c; expr e ]
    | EPatComment (s, p, e) ->
        node "patcomment" [ string s; pat p; expr e ]
    | ELet (pes, e2) ->
        node "let" (patexprs pes @ [ expr e2 ])
    | ERecordWrite (e1, field, e2) ->
        node "recordwrite" [ expr e1; string field; expr e2 ]
    | EMatch (e, brs) ->
        node "match" (expr e :: branches brs)
    | ETry (e, brs) ->
        node "try" (expr e :: branches brs)
    | EIfThen (e1, e2) ->
        node "ifthen" [ expr e1; expr e2 ]
    | EIfThenElse (e0, e1, e2) ->
        node "ifthenelse" [ expr e0; expr e1; expr e2 ]
    | EFun (ps, e) ->
        node "fun" (pats ps @ [ expr e ])
    | EApp (e, args) ->
        node "app" (expr e :: exprs args)
    | ERaise e ->
        node "raise" [ expr e ]
    | EMagic e ->
        node "magic" [ expr e ]
    | ERepr e ->
        node "repr" [ expr e ]
    | EData (d, args) ->
        node "data" (string d :: exprs args)
    | EVar v ->
        node "var" [ string v ]
    | ETextual action ->
        node "text" [ stretch action ]
    | EUnit ->
        node "unit" []
    | EIntConst k ->
        node "int" [ int k ]
    | EStringConst s ->
        node
"string" [ string s ] | ETuple es -> node "tuple" ( exprs es ) | EAnnot (e, s) -> node "annot" [ expr e; scheme s ] | ERecordAccess (e, field) -> node "recordaccess" [ expr e; string field ] | ERecord fs -> node "record" (fields fs) | EArray fs -> node "array" (exprs fs) | EArrayAccess (e1, e2) -> node "arrayaccess" [ expr e1; expr e2 ] and exprs es = List.map expr es and stretch stretch = string stretch.Stretch.stretch_content and branches brs = List.map branch brs and branch br = node "branch" [ pat br.branchpat; expr br.branchbody ] and fields fs = List.map field fs and field (label, e) = node "field" [ string label; expr e ] and pats ps = List.map pat ps and pat = function | PUnit -> node "punit" [] | PWildcard -> node "pwildcard" [] | PVar x -> node "pvar" [ string x ] | PTuple ps -> node "ptuple" (pats ps) | PAnnot (p, t) -> node "pannot" [ pat p; typ t ] | PData (d, args) -> node "pdata" (string d :: pats args) | PRecord fps -> node "precord" (fpats fps) | POr ps -> node "por" (pats ps) and fpats fps = List.map fpat fps and fpat (_, p) = pat p and patexprs pes = List.map patexpr pes and patexpr (p, e) = node "patexpr" [ pat p; expr e ] and string s = node s [] and int k = node (string_of_int k) [] and scheme _s = string "omitted" (* TEMPORARY to be completed, someday *) and typ _t = string "omitted" (* TEMPORARY to be completed, someday *) (* ------------------------------------------------------------------------- *) (* Convert to a tree, then print the tree. *) let expr e = print_tree X.f (expr e) end menhir-20200123/src/rawPrinter.mli000066400000000000000000000023101361226111300166460ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A debugging pretty-printer for [IL]. Newlines are used liberally, so as to facilitate diffs. *) module Make (X : sig (* This is the channel that is being written to. *) val f: out_channel end) : sig val expr: IL.expr -> unit end menhir-20200123/src/reachability.ml000066400000000000000000000045141361226111300170100ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open BasicSyntax let rec visit grammar visited symbol = try let rule = StringMap.find symbol grammar.rules in if not (StringSet.mem symbol visited) then let visited = StringSet.add symbol visited in List.fold_left (visitb grammar) visited rule.branches else visited with Not_found -> (* This is a terminal symbol. *) assert (symbol = "error" || StringMap.mem symbol grammar.tokens); visited and visitb grammar visited { producers = symbols } = List.fold_left (visits grammar) visited symbols and visits grammar visited producer = visit grammar visited (producer_symbol producer) let trim grammar = if StringSet.cardinal grammar.start_symbols = 0 then Error.error [] "no start symbol has been declared." 
else let reachable = StringSet.fold (fun symbol visited -> visit grammar visited symbol ) grammar.start_symbols StringSet.empty in StringMap.iter (fun symbol rule -> if not (StringSet.mem symbol reachable) then Error.grammar_warning rule.positions "symbol %s is unreachable from any of the start symbol(s)." symbol ) grammar.rules; { grammar with rules = StringMap.restrict reachable grammar.rules; types = StringMap.restrict reachable grammar.types; on_error_reduce = StringMap.restrict reachable grammar.on_error_reduce; } menhir-20200123/src/reachability.mli000066400000000000000000000022431361226111300171560ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This extremely simple analysis restricts a grammar to the set of nonterminals that are reachable, via productions, from the start nonterminals. *) val trim: BasicSyntax.grammar -> BasicSyntax.grammar menhir-20200123/src/referenceInterpreter.ml000066400000000000000000000261601361226111300205330ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open Grammar open Cst (* ------------------------------------------------------------------------ *) (* Set up all of the information required by the LR engine. Everything is read directly from [Grammar] and [Lr1]. *) module T = struct type state = Lr1.node let number = Lr1.number type token = Terminal.t type terminal = Terminal.t type nonterminal = Nonterminal.t type semantic_value = cst let token2terminal (token : token) : terminal = token let token2value (token : token) : semantic_value = CstTerminal token let error_terminal = Terminal.error let error_value = CstError let foreach_terminal = Terminal.foldx type production = Production.index let production_index = Production.p2i let find_production = Production.i2p let default_reduction (s : state) defred nodefred env = match Default.has_default_reduction s with | Some (prod, _) -> defred env prod | None -> nodefred env let action (s : state) (tok : terminal) value shift reduce fail env = (* Check whether [s] has an outgoing shift transition along [tok]. *) try let s' : state = SymbolMap.find (Symbol.T tok) (Lr1.transitions s) in (* There is such a transition. Return either [ShiftDiscard] or [ShiftNoDiscard], depending on the existence of a default reduction on [#] at [s']. *) match Default.has_default_reduction s' with | Some (_, toks) when TerminalSet.mem Terminal.sharp toks -> shift env false tok value s' | _ -> shift env true tok value s' (* There is no such transition. Look for a reduction. *) with Not_found -> try let prod = Misc.single (TerminalMap.find tok (Lr1.reductions s)) in reduce env prod (* There is no reduction either. Fail. 
*)
    with Not_found ->
      fail env

  let goto_nt (s : state) (nt : nonterminal) : state =
    try
      SymbolMap.find (Symbol.N nt) (Lr1.transitions s)
    with Not_found ->
      assert false

  let goto_prod (s : state) (prod : production) : state =
    goto_nt s (Production.nt prod)

  let maybe_goto_nt (s : state) (nt : nonterminal) : state option =
    try
      Some (SymbolMap.find (Symbol.N nt) (Lr1.transitions s))
    with Not_found ->
      None

  open MenhirLib.EngineTypes

  exception Error

  (* By convention, a semantic action returns a new stack. It does not
     affect [env]. *)

  let is_start =
    Production.is_start

  type semantic_action =
    (state, semantic_value, token) env -> (state, semantic_value) stack

  let semantic_action (prod : production) : semantic_action =
    fun env ->
      assert (not (Production.is_start prod));

      (* Reduce. Pop a suffix of the stack, and use it to construct a new
         concrete syntax tree node. *)

      let n = Production.length prod in
      let values : semantic_value array =
        Array.make n CstError (* dummy *)
      and startp = ref Lexing.dummy_pos
      and endp = ref Lexing.dummy_pos
      and current = ref env.current
      and stack = ref env.stack in

      (* We now enter a loop to pop [k] stack cells and (after that) push
         a new cell onto the stack. *)

      (* This loop does not update [env.current]. Instead, the state in the
         newly pushed stack cell will be used (by our caller) as a basis for
         a goto transition, and [env.current] will be updated (if necessary)
         then. *)

      for k = n downto 1 do

        (* Fetch a semantic value. *)

        values.(k - 1) <- !stack.semv;

        (* Pop one cell. The stack must be non-empty. As we pop a cell,
           change the automaton's current state to the one stored within
           the cell. (It is sufficient to do this only when [k] is 1, since
           the last write overwrites any and all previous writes.) If this
           is the first (last) cell that we pop, update [endp] ([startp]).
*) let next = !stack.next in assert (!stack != next); if k = n then begin endp := !stack.endp end; if k = 1 then begin current := !stack.state; startp := !stack.startp end; stack := next done; (* Done popping. *) (* Construct and push a new stack cell. The associated semantic value is a new concrete syntax tree. *) { state = !current; semv = CstNonTerminal (prod, values); startp = !startp; endp = !endp; next = !stack } let may_reduce node prod = Lr1.NodeSet.mem node (Lr1.production_where prod) (* The logging functions that follow are called only if [log] is [true]. *) module Log = struct open Printf let state s = fprintf stderr "State %d:" (Lr1.number s); prerr_newline() let shift tok s' = fprintf stderr "Shifting (%s) to state %d" (Terminal.print tok) (Lr1.number s'); prerr_newline() let reduce_or_accept prod = match Production.classify prod with | Some _ -> fprintf stderr "Accepting"; prerr_newline() | None -> fprintf stderr "Reducing production %s" (Production.print prod); prerr_newline() let lookahead_token tok startp endp = fprintf stderr "Lookahead token is now %s (%d-%d)" (Terminal.print tok) startp.Lexing.pos_cnum endp.Lexing.pos_cnum; prerr_newline() let initiating_error_handling () = fprintf stderr "Initiating error handling"; prerr_newline() let resuming_error_handling () = fprintf stderr "Resuming error handling"; prerr_newline() let handling_error s = fprintf stderr "Handling error in state %d" (Lr1.number s); prerr_newline() end end (* ------------------------------------------------------------------------ *) (* Define a palatable user entry point. *) let interpret log nt lexer lexbuf = (* Instantiate the LR engine. *) let module E = MenhirLib.Engine.Make (struct include T let log = log end) in (* Run it. 
*) try Some (E.entry (Lr1.entry_of_nt nt) lexer lexbuf) with T.Error -> None (* ------------------------------------------------------------------------ *) (* Another entry point, used internally by [LRijkstra] to check that the sentences that [LRijkstra] produces do lead to an error in the expected state. *) type spurious_reduction = Lr1.node * Production.index type target = Lr1.node * spurious_reduction list type check_error_path_outcome = (* Bad: the input was read past its end. *) | OInputReadPastEnd (* Bad: a syntax error occurred before all of the input was read. *) | OInputNotFullyConsumed (* Bad: the parser unexpectedly accepted (part of) this input. *) | OUnexpectedAccept (* Good: a syntax error occurred after reading the last input token. We report in which state the error took place, as well as a list of spurious reductions. A non-default reduction that takes place after looking at the last input token (i.e., the erroneous token) is spurious. Furthermore, any reduction that takes place after a spurious reduction is itself spurious. We note that a spurious reduction can take place only in a non-canonical LR automaton. *) | OK of target let check_error_path log nt input = (* Instantiate the LR engine. *) let module E = MenhirLib.Engine.Make (struct include T let log = log end) in (* Determine the initial state. *) let entry = Lr1.entry_of_nt nt in (* This function helps extract the current parser state out of [env]. It may become unnecessary if the [Engine] API offers it. *) let current env = (* Peek at the stack. If empty, then we must be in the initial state. *) match E.top env with | None -> entry | Some (E.Element (s, _, _, _)) -> s in (* Set up a function that delivers tokens one by one. *) let input = ref input in let next () = match !input with | [] -> None | t :: ts -> input := ts; Some t in let looking_at_last_token () : bool = !input = [] in (* Run. 
We wish to stop at the first error (without handling the error in any way) and report in which state the error occurred. A clean way of doing this is to use the incremental API, as follows. The main loop resembles the [loop] function in [Engine]. *) (* Another reason why we write our own loop is that we wish to detect spurious reductions. We accumulate these reductions in [spurious], a (reversed) list of productions. *) let rec loop (checkpoint : cst E.checkpoint) (spurious : spurious_reduction list) = match checkpoint with | E.InputNeeded _ -> begin match next() with | None -> OInputReadPastEnd | Some t -> loop (E.offer checkpoint (t, Lexing.dummy_pos, Lexing.dummy_pos)) spurious end | E.Shifting _ -> loop (E.resume checkpoint) spurious | E.AboutToReduce (env, prod) -> (* If we have requested the last input token and if this is not a default reduction, then this is a spurious reduction. Furthermore, if a spurious reduction has taken place already, then this is also a spurious reduction. *) let spurious = if looking_at_last_token() && not (E.env_has_default_reduction env) || spurious <> [] then (current env, prod) :: spurious else spurious in loop (E.resume checkpoint) spurious | E.HandlingError env -> (* Check that all of the input has been read. Otherwise, the error has occurred sooner than expected. *) if !input = [] then (* Return the current state and the list of spurious reductions. *) OK (current env, List.rev spurious) else OInputNotFullyConsumed | E.Accepted _ -> (* The parser has succeeded. This is unexpected. *) OUnexpectedAccept | E.Rejected -> (* The parser rejects this input. This should not happen; we should observe [HandlingError _] first. 
*) assert false in loop (E.start entry Lexing.dummy_pos) [] menhir-20200123/src/referenceInterpreter.mli000066400000000000000000000056361361226111300207110ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Grammar open Cst (* This reference interpreter animates the LR automaton. It uses the grammar and automaton descriptions, as provided by [Grammar] and [Lr1], as well as the generic LR engine in [MenhirLib.Engine]. *) (* The first parameter to the interpreter is a Boolean flag that tells whether a trace should be produced on the standard error channel. *) (* The interpreter requires a start symbol, a lexer, and a lexing buffer. It either succeeds and produces a concrete syntax tree, or fails. *) val interpret: bool -> Nonterminal.t -> (Lexing.lexbuf -> Terminal.t) -> Lexing.lexbuf -> cst option (* This variant of the reference interpreter is used internally by us. We use it to debug [LRijkstra]. It checks that a sentence leads to a syntax error in the expected state. It is also used by several of the command line options [--interpret-error], [--compile-errors], etc. *) type spurious_reduction = Lr1.node * Production.index type target = Lr1.node * spurious_reduction list type check_error_path_outcome = (* Bad: the input was read past its end. *) | OInputReadPastEnd (* Bad: a syntax error occurred before all of the input was read. *) | OInputNotFullyConsumed (* Bad: the parser unexpectedly accepted (part of) this input. *) | OUnexpectedAccept (* Good: a syntax error occurred after reading the last input token. 
We report in which state the error took place, as well as a list of spurious reductions. A non-default reduction that takes place after looking at the last input token (i.e., the erroneous token) is spurious. Furthermore, any reduction that takes place after a spurious reduction is itself spurious. We note that a spurious reduction can take place only in a non-canonical LR automaton. *) | OK of target val check_error_path: bool -> (* --trace *) Nonterminal.t -> (* initial non-terminal symbol *) Terminal.t list -> (* input *) check_error_path_outcome menhir-20200123/src/resizableArray.ml000066400000000000000000000075771361226111300173430ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module implements resizable arrays, that is, arrays that can grow upon explicit request. *) type 'a t = { (* The default element is used to fill empty slots when growing or shrinking the physical array. *) default: 'a; (* The init function is used to initialize newly allocated slots when growing the logical array. *) init: int -> 'a; (* The logical size of this array. *) mutable size: int; (* The physical array, whose length is at least [size]. *) mutable table: 'a array } let make capacity default init = (* [capacity] must be nonzero, so that doubling it actually enlarges the array. 
*)
  assert (capacity >= 0);
  let capacity = if capacity = 0 then 1 else capacity in
  let table = Array.make capacity default in
  { default; init; size = 0; table }

let make_ capacity default =
  make capacity default (fun _ -> default)

let length a =
  a.size

let get a i =
  assert (0 <= i && i < a.size);
  Array.unsafe_get a.table (i)

let set a i x =
  assert (0 <= i && i < a.size);
  Array.unsafe_set a.table (i) x

let shrink a s =
  (* This is [resize a s], assuming [0 <= s < a.size]. *)
  Array.fill a.table s (a.size - s) a.default;
  a.size <- s

let grow a s =
  (* This is [resize a s], assuming [0 <= s && a.size < s]. *)
  let n = Array.length a.table in
  if s > n then begin
    (* The physical size of the array must increase. The new size is at
       least double of the previous size, and larger if requested. *)
    let table = Array.make (max (2 * n) s) a.default in
    Array.blit a.table 0 table 0 n;
    a.table <- table
  end;
  (* From [a.size] to [s], we have new logical slots. Initialize them. *)
  let init = a.init
  and table = a.table in
  for i = a.size to s - 1 do
    Array.unsafe_set table i (init i)
  done;
  (* Update the array's logical size. *)
  a.size <- s

let resize a s =
  assert (0 <= s);
  if s < a.size then
    shrink a s
  else if s > a.size then
    grow a s

let push a x =
  let s = a.size in (* equivalent to: [length a] *)
  begin
    (* equivalent to: [resize a (s + 1)] *)
    let s = s + 1 in
    let n = Array.length a.table in
    if s > n then begin
      (* assert (s = n + 1); *)
      (* assert (max (2 * n) s = 2 * n); *)
      let table = Array.make (2 * n) a.default in
      Array.blit a.table 0 table 0 n;
      a.table <- table
    end;
    (* No need to call [init], since there is just one new logical slot
       and we are about to write it anyway.
*) a.size <- s end; Array.unsafe_set a.table (s) x (* equivalent to: [set a s x] *) let pop a = let s = a.size in (* equivalent to: [length a] *) assert (s > 0); let s = s - 1 in a.size <- s; let table = a.table in let x = Array.unsafe_get table (s) in (* equivalent to: [get a s] *) Array.unsafe_set table (s) a.default; (* equivalent to: [resize a s] *) x let default a = a.default menhir-20200123/src/resizableArray.mli000066400000000000000000000062461361226111300175040ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module implements resizable arrays, that is, arrays that can grow upon explicit request. *) type 'a t (* [make capacity default init] creates a resizable array of logical length 0, whose physical length is initially [capacity], and whose default element is [default]. The default element is used to fill empty slots in the physical array; it is otherwise irrelevant. The [init] function is used to initialize new logical slots when the logical size of the array grows, so, unlike [default], it is semantically meaningful. *) val make: int -> 'a -> (int -> 'a) -> 'a t (* [make_] is a simplified variant of [make] where the [init] function always returns [default], i.e., where new logical slots are initialized with [default] when the array is grown. *) val make_: int -> 'a -> 'a t (* [length a] returns the current logical length of the array [a]. *) val length: 'a t -> int (* [resize a n] changes the logical length of the array [a] to [n]. If the length decreases, any excess elements are lost. 
The capacity of the underlying physical array remains the same. If the length increases, the new positions are filled with the array's default element, as initially supplied to [make]. The capacity of the underlying physical array grows by at least a factor of two. *) val resize: 'a t -> int -> unit (* [get a i] returns the element contained at offset [i] in the array [a]. Slots are numbered 0 and up. [i] must be strictly less than the array's current logical length. *) val get: 'a t -> int -> 'a (* [set a i x] sets the element contained at offset [i] in the array [a] to [x]. Slots are numbered 0 and up. [i] must be strictly less than the array's current logical length. *) val set: 'a t -> int -> 'a -> unit (* [push a x] appends the element [x] at the end of the array [a], whose length increases by one. *) val push: 'a t -> 'a -> unit (* [pop a] removes the element [x] found at the end of the array [a], whose length decreases by one. The array must have nonzero length. *) val pop: 'a t -> 'a (* [default a] returns the default value that was used when the array [a] was created. This should be seldom useful, but can be convenient. *) val default: 'a t -> 'a menhir-20200123/src/segment.mll000066400000000000000000000113241361226111300161630ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This lexer is used to cut an input into segments, delimited by a blank line. (More precisely, by a run of at least one blank line and zero or more comment lines.) 
   It produces a list of segments, where each segment is represented as a
   pair of positions. It is stand-alone and cannot fail. *)

(* The whitespace in between two segments can contain comments, and the
   user may wish to preserve them. For this reason, we view a run of
   whitespace as a segment, too, and we accompany each segment with a tag
   which is either [Segment] or [Whitespace]. The two kinds of segments
   must alternate in the list that we produce. *)

{

  type tag =
    | Segment
    | Whitespace

  open Lexing

}

let newline    = ('\010' | '\013' | "\013\010")

let whitespace = [ ' ' '\t' ';' ]

let comment    = '#' [^'\010''\013']* newline

(* In the idle state, we skip whitespace, newlines and comments (while
   updating the line counter). If we reach the end of file, we return the
   list of all segments found so far. If we reach a non-blank non-comment
   character, we record its position and switch to the busy state. *)

rule idle opening segments = parse
| whitespace
    { idle opening segments lexbuf }
| newline
    { new_line lexbuf; idle opening segments lexbuf }
| comment
    { new_line lexbuf; idle opening segments lexbuf }
| eof
    { let closing = lexbuf.lex_start_p in
      let segment = Whitespace, opening, closing in
      let segments = segment :: segments in
      List.rev segments }
| _
    { let closing = lexbuf.lex_start_p in
      let segment = Whitespace, opening, closing in
      let segments = segment :: segments in
      let opening = closing in
      busy segments opening false lexbuf }

(* In the busy state, we skip everything, maintaining one bit
   [just_saw_a_newline], until [just_saw_a_newline] is true and we find a
   second newline. This marks the end of a segment, and we revert back to
   the idle state. If we reach the end of file, we consider that this is
   also the end of a segment. *)

and busy segments opening just_saw_a_newline = parse
| whitespace
    { busy segments opening just_saw_a_newline lexbuf }
| newline
    { new_line lexbuf;
      (* The newline that we just saw is already included in the segment.
         This one is not included.
*) let closing = lexbuf.lex_start_p in if just_saw_a_newline then let segment = Segment, opening, closing in let segments = segment :: segments in let opening = closing in idle opening segments lexbuf else busy segments opening true lexbuf } | eof { let closing = lexbuf.lex_start_p in let segment = Segment, opening, closing in let segments = segment :: segments in List.rev segments } | _ { busy segments opening false lexbuf } { (* This wrapper function reads a file, cuts it into segments, and creates a fresh lexbuf for each segment, taking care to adjust its start position. *) let segment filename : (tag * string * lexbuf) list = let content = IO.read_whole_file filename in let lexbuf = from_string content in lexbuf.lex_curr_p <- { lexbuf.lex_curr_p with pos_fname = filename }; let segments : (tag * position * position) list = idle lexbuf.lex_curr_p [] lexbuf in List.map (fun (tag, startp, endp) -> let start = startp.pos_cnum in let length = endp.pos_cnum - start in let content = String.sub content start length in let lexbuf = from_string content in lexbuf.lex_start_p <- startp; lexbuf.lex_curr_p <- startp; lexbuf.lex_abs_pos <- startp.pos_cnum; (* That was tricky to find out. See [Lexing.engine]. [pos_cnum] is updated based on [buf.lex_abs_pos + buf.lex_curr_pos]. *) tag, content, lexbuf ) segments } menhir-20200123/src/sentenceLexer.mll000066400000000000000000000060411361226111300173250ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* This lexer is used to read the sentences provided on the standard input channel when [--interpret] is enabled. *) { open Lexing open SentenceParser open Grammar (* A short-hand. *) let error2 lexbuf = Error.error (Positions.lexbuf lexbuf) } let newline = ('\010' | '\013' | "\013\010") let whitespace = [ ' ' '\t' ';' ] let lowercase = ['a'-'z' '\223'-'\246' '\248'-'\255' '_'] let uppercase = ['A'-'Z' '\192'-'\214' '\216'-'\222'] let identchar = ['A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255' '0'-'9'] (* '\'' forbidden *) let autocomment = "##" [^'\010''\013']* newline let comment = "#" [^'\010''\013']* newline let skip = newline whitespace* newline rule lex = parse (* An identifier that begins with an lowercase letter is considered a non-terminal symbol. It should be a start symbol. *) | (lowercase identchar *) as lid { try let nt = Nonterminal.lookup lid in if StringSet.mem lid Front.grammar.BasicSyntax.start_symbols then NONTERMINAL (nt, lexbuf.lex_start_p, lexbuf.lex_curr_p) else error2 lexbuf "\"%s\" is not a start symbol." lid with Not_found -> error2 lexbuf "\"%s\" is not a known non-terminal symbol." lid } (* An identifier that begins with an uppercase letter is considered a terminal symbol. *) | (uppercase identchar *) as uid { try TERMINAL (Terminal.lookup uid, lexbuf.lex_start_p, lexbuf.lex_curr_p) with Not_found -> error2 lexbuf "\"%s\" is not a known terminal symbol." uid } (* Whitespace is ignored. *) | whitespace { lex lexbuf } (* The end of a line is translated to [EOL]. *) | newline { new_line lexbuf; EOL } (* An auto-generated comment is ignored. *) | autocomment { new_line lexbuf; lex lexbuf } (* A manually-written comment is preserved. *) | comment as c { new_line lexbuf; COMMENT c } (* The end of file is translated to [EOF]. *) | eof { EOF } (* A colon. *) | ':' { COLON } | _ { error2 lexbuf "unexpected character." 
} menhir-20200123/src/sentenceParser.mly000066400000000000000000000075601361226111300175260ustar00rootroot00000000000000/******************************************************************************/ /* */ /* Menhir */ /* */ /* François Pottier, Inria Paris */ /* Yann Régis-Gianas, PPS, Université Paris Diderot */ /* */ /* Copyright Inria. All rights reserved. This file is distributed under the */ /* terms of the GNU General Public License version 2, as described in the */ /* file LICENSE. */ /* */ /******************************************************************************/ /* This is two parsers in one. */ /* This parser is used to read the sentences provided on the standard input channel when [--interpret] is set. The entry point is [optional_sentence]. */ /* It is used also to read a [.messages] file. The entry point is [entry]. */ /* This parser must be compatible with both ocamlyacc and menhir, so we use $ notation, do not use Menhir's standard library, and collect positions manually. */ /* ------------------------------------------------------------------------ */ /* Tokens. */ %token COLON EOF EOL %token TERMINAL %token NONTERMINAL %token COMMENT /* only manually-written comments, beginning with a single # */ /* ------------------------------------------------------------------------ */ /* Types. */ %{ open SentenceParserAux (* Removing the position information in a terminal or non-terminal symbol. *) let strip_symbol (x, _, _) = x (* Removing the position information in a sentence. *) let strip_sentence (nto, terminals) = Option.map strip_symbol nto, List.map strip_symbol terminals (* Computing the start and end positions of a sentence. 
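The idea implemented just below can be illustrated in isolation: the span of a sentence runs from the first symbol's opening position to the last symbol's closing position. Here is a standalone sketch of that pattern, using plain integer offsets instead of [Lexing.position] records (the triple layout and the [(0, 0)] dummy value are illustrative assumptions, not part of Menhir):

```ocaml
(* Standalone sketch: symbols carry (value, start, end) integer offsets.
   The span of a sentence runs from the first symbol's start offset to
   the last symbol's end offset; an empty sentence gets dummy offsets. *)
let span (symbols : (string * int * int) list) : int * int =
  match symbols, List.rev symbols with
  | (_, opening, _) :: _, (_, _, closing) :: _ -> (opening, closing)
  | _, _ -> (0, 0)  (* empty sentence: dummy positions *)

let () =
  assert (span [("A", 0, 1); ("B", 2, 3); ("C", 4, 7)] = (0, 7));
  assert (span [] = (0, 0))
```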
*) let locate_sentence (nto, terminals) = let opening = match nto, terminals with | Some (_, opening, _), _ | None, (_, opening, _) :: _ -> opening | None, [] -> Lexing.dummy_pos (* cannot happen *) and closing = match nto, List.rev terminals with | _, (_, _, closing) :: _ | Some (_, _, closing), _ -> closing | None, [] -> Lexing.dummy_pos (* cannot happen *) in [Positions.import (opening, closing)], strip_sentence (nto, terminals) %} %type located_sentence %type optional_sentence %start optional_sentence %type entry %start entry %% /* ------------------------------------------------------------------------ */ /* An entry is a list of located sentences or comments. */ entry: located_sentences_or_comments EOF { $1 } /* A list of located sentences or comments. */ located_sentences_or_comments: { [] } | located_sentence located_sentences_or_comments { Thing $1 :: $2 } | COMMENT located_sentences_or_comments { Comment $1 :: $2 } /* A located sentence. */ located_sentence: sentence { locate_sentence $1 } /* An optional sentence. */ optional_sentence: | EOF { None } | sentence { Some (strip_sentence $1) } /* A sentence is a pair of an optional non-terminal start symbol and a list of terminal symbols. It is terminated by a newline. */ sentence: | NONTERMINAL COLON terminals EOL { Some $1, $3 } | terminals EOL { None, $1 } /* A list of terminal symbols. */ terminals: | { [] } | TERMINAL terminals { $1 :: $2 } menhir-20200123/src/sentenceParserAux.ml000066400000000000000000000027061361226111300200100ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) open Grammar type terminals = Terminal.t list type sentence = Nonterminal.t option * terminals type located_sentence = Positions.positions * sentence type comment = string type 'a or_comment = | Thing of 'a | Comment of comment let or_comment_iter f = function | Thing s -> f s | Comment _ -> () let or_comment_map f = function | Thing s -> Thing (f s) | Comment c -> Comment c let unThing = function | Thing x -> [ x ] | Comment _ -> [] menhir-20200123/src/settings.ml000066400000000000000000000423661361226111300162170ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open Printf (* ------------------------------------------------------------------------- *) (* Prepare for parsing the command line. *) type token_type_mode = | TokenTypeAndCode (* produce the definition of the [token] type and code for the parser *) | TokenTypeOnly (* produce the type definition only *) | CodeOnly of string (* produce the code only; import token type from specified module *) let token_type_mode = ref TokenTypeAndCode let tokentypeonly () = token_type_mode := TokenTypeOnly let is_uppercase_ascii c = c >= 'A' && c <= 'Z' let is_capitalized_ascii s = String.length s > 0 && is_uppercase_ascii s.[0] let codeonly m = if not (is_capitalized_ascii m) then begin (* Not using module [Error] to avoid a circular dependency. 
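As an aside, the module-name check used above by [codeonly] (for --external-tokens) can be exercised on its own. A self-contained restatement of the two helpers, with a few sanity assertions:

```ocaml
(* An OCaml module name must begin with an uppercase ASCII letter.
   These two helpers restate the check performed above. *)
let is_uppercase_ascii c = c >= 'A' && c <= 'Z'
let is_capitalized_ascii s = String.length s > 0 && is_uppercase_ascii s.[0]

let () =
  assert (is_capitalized_ascii "MyTokens");
  assert (not (is_capitalized_ascii "myTokens"));
  assert (not (is_capitalized_ascii ""))
```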
*) fprintf stderr "Error: %s is not a valid OCaml module name.\n" m; exit 1 end; token_type_mode := CodeOnly m let version = ref false type construction_mode = | ModeCanonical (* --canonical: canonical Knuth LR(1) automaton *) | ModeInclusionOnly (* --no-pager : states are merged when there is an inclusion relationship *) | ModePager (* normal mode: states are merged as per Pager's criterion *) | ModeLALR (* --lalr : states are merged as in an LALR generator, i.e. as soon as they have the same LR(0) core *) (* Note that --canonical overrides --no-pager. If both are specified, the result is a canonical automaton. *) let construction_mode = ref ModePager let explain = ref false let base = ref "" let dump = ref false let follow = ref false let graph = ref false let trace = ref false let noprefix = ref false type print_mode = | PrintNormal | PrintForOCamlyacc | PrintUnitActions of bool (* if true, declare unit tokens *) type preprocess_mode = | PMNormal (* preprocess and continue *) | PMOnlyPreprocess of print_mode (* preprocess, print grammar, stop *) let preprocess_mode = ref PMNormal let recovery = ref false let v () = dump := true; explain := true let inline = ref true type infer_mode = (* Perform no type inference. This is the default mode. *) | IMNone (* Perform type inference by invoking ocamlc directly. *) | IMInfer (* --infer *) | IMDependRaw (* --raw-depend *) | IMDependPostprocess (* --depend *) (* Perform type inference by writing a mock .ml file and reading the corresponding inferred .mli file. 
*) | IMWriteQuery of string (* --infer-write-query *) | IMReadReply of string (* --infer-read-reply *) let show_infer_mode = function | IMNone -> "" | IMInfer -> "--infer" | IMDependRaw -> "--raw-depend" | IMDependPostprocess -> "--depend" | IMWriteQuery _ -> "--infer-write-query" | IMReadReply _ -> "--infer-read-reply" let infer = ref IMNone let set_infer_mode mode2 = let mode1 = !infer in match mode1, mode2 with | IMNone, _ -> infer := mode2 (* It is valid to specify [--infer] in conjunction with [--depend] or [--raw-depend]. The latter command then takes precedence. This is for compatibility with Menhir prior to 2018/05/23. *) | IMInfer, (IMInfer | IMDependRaw | IMDependPostprocess) -> infer := mode2 | (IMDependRaw | IMDependPostprocess), IMInfer -> () | _, _ -> fprintf stderr "Error: you cannot use both %s and %s.\n" (show_infer_mode mode1) (show_infer_mode mode2); exit 1 let enable_infer () = set_infer_mode IMInfer let enable_depend () = set_infer_mode IMDependPostprocess let enable_raw_depend () = set_infer_mode IMDependRaw let enable_write_query filename = set_infer_mode (IMWriteQuery filename) let enable_read_reply filename = set_infer_mode (IMReadReply filename) let code_inlining = ref true let comment = ref false let ocamlc = ref "ocamlc" let ocamldep = ref "ocamldep" let logG, logA, logC = ref 0, ref 0, ref 0 let timings = ref false let filenames = ref StringSet.empty let no_stdlib = ref false let insert name = filenames := StringSet.add name !filenames let interpret = ref false let interpret_show_cst = ref false let interpret_error = ref false let table = ref false let inspection = ref false let coq = ref false let coq_no_version_check = ref false let coq_no_complete = ref false let coq_no_actions = ref false let strict = ref false let fixedexc = ref false type suggestion = | SuggestNothing | SuggestCompFlags | SuggestLinkFlags of string (* "cmo" or "cmx" *) | SuggestWhereIsMenhirLibSource | SuggestUseOcamlfind let suggestion = ref SuggestNothing let 
ignored_unused_tokens = ref StringSet.empty let ignore_unused_token t = ignored_unused_tokens := StringSet.add t !ignored_unused_tokens let ignore_all_unused_tokens = ref false let ignore_all_unused_precedence_levels = ref false let list_errors = ref false let compile_errors = ref None let set_compile_errors filename = compile_errors := Some filename let compare_errors = ref [] let add_compare_errors filename = compare_errors := filename :: !compare_errors let update_errors = ref None let set_update_errors filename = update_errors := Some filename let echo_errors = ref None let set_echo_errors filename = echo_errors := Some filename let cmly = ref false let coq_lib_path = ref (Some "MenhirLib") type dollars = | DollarsDisallowed | DollarsAllowed let dollars = ref DollarsAllowed (* When new command line options are added, please update both the manual in [doc/manual.tex] and the man page in [doc/menhir.1]. *) let options = Arg.align [ "--base", Arg.Set_string base, " Specifies a base name for the output file(s)"; "--canonical", Arg.Unit (fun () -> construction_mode := ModeCanonical), " Construct a canonical Knuth LR(1) automaton"; "--cmly", Arg.Set cmly, " Write a .cmly file"; "--comment", Arg.Set comment, " Include comments in the generated code"; "--compare-errors", Arg.String add_compare_errors, " (used twice) Compare two .messages files"; "--compile-errors", Arg.String set_compile_errors, " Compile a .messages file to OCaml code"; "--coq", Arg.Set coq, " Generate a formally verified parser, in Coq"; "--coq-lib-path", Arg.String (fun path -> coq_lib_path := Some path), " How to qualify references to MenhirLib"; "--coq-lib-no-path", Arg.Unit (fun () -> coq_lib_path := None), " Do *not* qualify references to MenhirLib"; "--coq-no-version-check", Arg.Set coq_no_version_check, " The generated parser will not check that the versions of Menhir and MenhirLib match."; "--coq-no-actions", Arg.Set coq_no_actions, " Ignore semantic actions in the Coq output"; 
"--coq-no-complete", Arg.Set coq_no_complete, " Do not generate a proof of completeness"; "--depend", Arg.Unit enable_depend, " Invoke ocamldep and display dependencies"; "--dump", Arg.Set dump, " Write an .automaton file"; "--echo-errors", Arg.String set_echo_errors, " Echo the sentences in a .messages file"; "--error-recovery", Arg.Set recovery, " (no longer supported)"; "--explain", Arg.Set explain, " Explain conflicts in .conflicts"; "--external-tokens", Arg.String codeonly, " Import token type definition from "; "--fixed-exception", Arg.Set fixedexc, " Declares Error = Parsing.Parse_error"; "--follow-construction", Arg.Set follow, " (undocumented)"; "--graph", Arg.Set graph, " Write a dependency graph to a .dot file"; "--infer", Arg.Unit enable_infer, " Invoke ocamlc to do type inference"; "--infer-protocol-supported", Arg.Unit (fun () -> exit 0), " Stop with exit code 0"; "--infer-write-query", Arg.String enable_write_query, " Write mock .ml file"; "--infer-read-reply", Arg.String enable_read_reply, " Read inferred .mli file"; "--inspection", Arg.Set inspection, " Generate the inspection API"; "--interpret", Arg.Set interpret, " Interpret the sentences provided on stdin"; "--interpret-show-cst", Arg.Set interpret_show_cst, " Show a concrete syntax tree upon acceptance"; "--interpret-error", Arg.Set interpret_error, " Interpret an error sentence"; "--lalr", Arg.Unit (fun () -> construction_mode := ModeLALR), " Construct an LALR(1) automaton"; "--list-errors", Arg.Set list_errors, " Produce a list of erroneous inputs"; "--log-automaton", Arg.Set_int logA, " Log information about the automaton"; "--log-code", Arg.Set_int logC, " Log information about the generated code"; "--log-grammar", Arg.Set_int logG, " Log information about the grammar"; "--no-code-inlining", Arg.Clear code_inlining, " (undocumented)"; "--no-dollars", Arg.Unit (fun () -> dollars := DollarsDisallowed), " Disallow $i in semantic actions"; "--no-inline", Arg.Clear inline, " Ignore the %inline 
keyword"; "--no-pager", Arg.Unit (fun () -> if !construction_mode = ModePager then construction_mode := ModeInclusionOnly), " (undocumented)"; "--no-prefix", Arg.Set noprefix, " (undocumented)"; "--no-stdlib", Arg.Set no_stdlib, " Do not load the standard library"; "--ocamlc", Arg.Set_string ocamlc, " Specifies how ocamlc should be invoked"; "--ocamldep", Arg.Set_string ocamldep, " Specifies how ocamldep should be invoked"; "--only-preprocess", Arg.Unit (fun () -> preprocess_mode := PMOnlyPreprocess PrintNormal), " Print grammar and exit"; "--only-preprocess-for-ocamlyacc", Arg.Unit (fun () -> preprocess_mode := PMOnlyPreprocess PrintForOCamlyacc), " Print grammar in ocamlyacc format and exit"; "--only-preprocess-u", Arg.Unit (fun () -> preprocess_mode := PMOnlyPreprocess (PrintUnitActions false)), " Print grammar with unit actions and exit"; "--only-preprocess-uu", Arg.Unit (fun () -> preprocess_mode := PMOnlyPreprocess (PrintUnitActions true)), " Print grammar with unit actions & tokens"; "--only-tokens", Arg.Unit tokentypeonly, " Generate token type definition only, no code"; "--raw-depend", Arg.Unit enable_raw_depend, " Invoke ocamldep and echo its raw output"; "--stdlib", Arg.String ignore, " Ignored (deprecated)"; "--strict", Arg.Set strict, " Warnings about the grammar are errors"; "--suggest-comp-flags", Arg.Unit (fun () -> suggestion := SuggestCompFlags), " Suggest compilation flags for ocaml{c,opt}"; "--suggest-link-flags-byte", Arg.Unit (fun () -> suggestion := SuggestLinkFlags "cmo"), " Suggest link flags for ocamlc"; "--suggest-link-flags-opt", Arg.Unit (fun () -> suggestion := SuggestLinkFlags "cmx"), " Suggest link flags for ocamlopt"; "--suggest-menhirLib", Arg.Unit (fun () -> suggestion := SuggestWhereIsMenhirLibSource), " Suggest where is MenhirLib"; "--suggest-ocamlfind", Arg.Unit (fun () -> suggestion := SuggestUseOcamlfind), " (deprecated)"; "--table", Arg.Set table, " Use the table-based back-end"; "--timings", Arg.Set timings, " Display 
internal timings"; "--trace", Arg.Set trace, " Generate tracing instructions"; "--unused-precedence-levels", Arg.Set ignore_all_unused_precedence_levels, " Do not warn about unused precedence levels"; "--unused-token", Arg.String ignore_unused_token, " Do not warn that is unused"; "--unused-tokens", Arg.Set ignore_all_unused_tokens, " Do not warn about any unused token"; "--update-errors", Arg.String set_update_errors, " Update auto-comments in a .messages file"; "--version", Arg.Set version, " Show version number and exit"; "-b", Arg.Set_string base, " Synonymous with --base "; "-lg", Arg.Set_int logG, " Synonymous with --log-grammar"; "-la", Arg.Set_int logA, " Synonymous with --log-automaton"; "-lc", Arg.Set_int logC, " Synonymous with --log-code"; "-t", Arg.Set table, " Synonymous with --table"; "-v", Arg.Unit v, " Synonymous with --dump --explain"; ] let usage = sprintf "Usage: %s " Sys.argv.(0) (* ------------------------------------------------------------------------- *) (* Parse the command line. *) let () = Arg.parse options insert usage (* ------------------------------------------------------------------------- *) (* If required, print a version number and stop. *) let () = if !version then begin printf "menhir, version %s\n" Version.version; exit 0 end (* ------------------------------------------------------------------------- *) (* Menhir is able to suggest compile and link flags to be passed to the OCaml compilers. If required, do so and stop. *) (* If [--table] is not passed, no flags are necessary. If [--table] is passed, then [MenhirLib] needs to be visible (at compile time) and linked in (at link time). *) (* The compilation flags are in fact meant to be used both at compile- and link-time. *) let () = match !suggestion with | SuggestNothing -> () | SuggestCompFlags -> if !table then printf "-I %s\n%!" (Installation.libdir()); exit 0 | SuggestLinkFlags extension -> if !table then printf "menhirLib.%s\n%!" 
extension; exit 0 | SuggestWhereIsMenhirLibSource -> printf "%s\n%!" (Installation.libdir()); exit 0 | SuggestUseOcamlfind -> printf "false\n"; exit 0 (* ------------------------------------------------------------------------- *) (* Export the settings. *) let stdlib_filename = "" let filenames = StringSet.elements !filenames let base = if !base = "" then match filenames with | [] -> fprintf stderr "%s\n" usage; exit 1 | [ filename ] -> Filename.chop_suffix filename (if !coq then ".vy" else ".mly") | _ -> fprintf stderr "Error: you must specify --base when providing multiple input files.\n"; exit 1 else !base let token_type_mode = !token_type_mode let construction_mode = !construction_mode let explain = !explain let dump = !dump let follow = !follow let graph = !graph let trace = !trace let () = if !recovery then begin fprintf stderr "Error: --error-recovery mode is no longer supported.\n"; exit 1 end let noprefix = !noprefix let code_inlining = !code_inlining let inline = !inline let comment = !comment let preprocess_mode = !preprocess_mode let ocamlc = !ocamlc let ocamldep = !ocamldep let logG, logA, logC = !logG, !logA, !logC let timings = !timings let interpret = !interpret let interpret_show_cst = !interpret_show_cst let interpret_error = !interpret_error let table = !table let inspection = !inspection let () = if inspection && not table then begin fprintf stderr "Error: --inspection requires --table.\n"; exit 1 end let no_stdlib = !no_stdlib let coq = !coq let coq_no_version_check = !coq_no_version_check let coq_no_complete = !coq_no_complete let coq_no_actions = !coq_no_actions let strict = !strict let fixedexc = !fixedexc let ignored_unused_tokens = !ignored_unused_tokens let ignore_all_unused_tokens = !ignore_all_unused_tokens let ignore_all_unused_precedence_levels = !ignore_all_unused_precedence_levels let list_errors = !list_errors let compile_errors = !compile_errors let compare_errors = match !compare_errors with | [] -> None | [ filename2; filename1 
] -> (* LIFO *) Some (filename1, filename2) | _ -> eprintf "To compare two .messages files, please use:\n\ --compare-errors --compare-errors .\n"; exit 1 let update_errors = !update_errors let echo_errors = !echo_errors let cmly = !cmly let coq_lib_path = !coq_lib_path let dollars = !dollars let infer = !infer (* If some flags imply that we will NOT produce an OCaml parser, then there is no need to perform type inference, so [--infer] is ignored. This saves time and dependency nightmares. *) let skipping_parser_generation = coq || compile_errors <> None || interpret_error || list_errors || compare_errors <> None || update_errors <> None || echo_errors <> None || false (* maybe also: [preprocess_mode <> PMNormal] *) let infer = match infer with | IMInfer when skipping_parser_generation -> IMNone | _ -> infer menhir-20200123/src/settings.mli000066400000000000000000000202051361226111300163540ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module parses the command line. *) (* The list of file names that appear on the command line. *) val filenames: string list (* How to deal with the type of tokens. *) type token_type_mode = | TokenTypeAndCode (* produce the definition of the [token] type and code for the parser *) | TokenTypeOnly (* produce the type definition only *) | CodeOnly of string (* produce the code only, by relying on an external token type *) val token_type_mode: token_type_mode (* How to construct the automaton. 
*) type construction_mode = | ModeCanonical (* --canonical: canonical Knuth LR(1) automaton *) | ModeInclusionOnly (* --no-pager : states are merged when there is an inclusion relationship, default reductions are used *) | ModePager (* normal mode: states are merged as per Pager's criterion, default reductions are used *) | ModeLALR (* --lalr : states are merged as in an LALR generator, i.e. as soon as they have the same LR(0) core *) val construction_mode: construction_mode (* Whether conflicts should be explained. *) val explain: bool (* Whether the automaton should be dumped. *) val dump: bool (* Whether the automaton's construction should be explained (very verbose). *) val follow: bool (* Whether the grammar's dependence graph should be dumped. *) val graph: bool (* Whether tracing instructions should be generated. *) val trace: bool (* Whether one should stop and print the grammar after joining and expanding the grammar. *) type print_mode = | PrintNormal | PrintForOCamlyacc | PrintUnitActions of bool (* if true, declare unit tokens *) type preprocess_mode = | PMNormal (* preprocess and continue *) | PMOnlyPreprocess of print_mode (* preprocess, print grammar, stop *) val preprocess_mode: preprocess_mode (* Whether and how OCaml type inference (for semantic actions and nonterminal symbols) should be performed. See the manual for details. *) type infer_mode = (* Perform no type inference. This is the default mode. *) | IMNone (* Perform type inference by invoking ocamlc directly. *) | IMInfer (* --infer *) | IMDependRaw (* --raw-depend *) | IMDependPostprocess (* --depend *) (* Perform type inference by writing a mock .ml file and reading the corresponding inferred .mli file. *) | IMWriteQuery of string (* --infer-write-query *) | IMReadReply of string (* --infer-read-reply *) val infer: infer_mode (* Whether one should inline the non terminal definitions marked with the %inline keyword. *) val inline: bool (* Whether comments should be printed or discarded. 
*) val comment: bool (* This undocumented flag suppresses prefixing of identifiers with an unlikely prefix in the generated code. This increases the code's readability, but can cause identifiers in semantic actions to be captured. *) val noprefix: bool (* This undocumented flag causes the code to be transformed by [Inline]. It is on by default. *) val code_inlining: bool (* How [ocamlc] and [ocamldep] should be invoked. *) val ocamlc: string val ocamldep: string (* How verbose we should be. *) val logG: int (* diagnostics on the grammar *) val logA: int (* diagnostics on the automaton *) val logC: int (* diagnostics on the generated code *) (* Whether tasks should be timed. *) val timings: bool (* The base name that should be used for the files that we create. This name can contain a path. *) val base: string (* The filename of the standard library. *) val stdlib_filename : string (* Whether Menhir should behave as an interpreter. *) val interpret : bool (* Whether the interpreter should build and display concrete syntax trees. *) val interpret_show_cst : bool (* Whether Menhir should behave as an interpreter, in a special mode where it checks one input sentence, expecting it to trigger an error at the last token, and displays which state was reached. *) val interpret_error : bool (* Whether to use the table-based back-end ([true]) or the code-based back-end ([false]). *) val table : bool (* Whether to generate the inspection API (which requires GADTs, and requires producing more tables). *) val inspection : bool (* Whether the standard menhir library should be used. *) val no_stdlib : bool (* Whether to generate a coq description of the grammar and automaton. *) val coq : bool (* Whether to generate a version check for MenhirLib in the generated parser. *) val coq_no_version_check : bool (* Whether the coq description must contain completeness proofs. *) val coq_no_complete : bool (* Whether the coq backend should ignore types and semantic actions. 
*) val coq_no_actions : bool (* Whether unresolved LR(1) conflicts, useless precedence declarations, productions that are never reduced, etc. should be treated as errors. *) val strict: bool (* This flag causes the exception [Error] should be declared equal to [Parsing.Parse_error]. This is useful when full compatibility with ocamlyacc is desired. In particular, this is used when building Menhir itself, since Menhir is compiled first using ocamlyacc, then using Menhir. *) val fixedexc: bool (* This is a set of tokens which may be unused and about which we should not emit a warning. *) val ignored_unused_tokens: StringSet.t (* This flag supersedes the set [ignored_unused_tokens]. If it is set, then we should not emit a warning about any unused tokens. *) val ignore_all_unused_tokens: bool (* This flag suppresses all warnings about unused precedence levels. *) val ignore_all_unused_precedence_levels: bool (* This flag causes Menhir to produce a list of erroneous input sentences. Enough sentences are computed to produce exactly one error in every state where an error can occur. *) val list_errors: bool (* This flag causes Menhir to read the error message descriptions stored in [filename] and compile them to OCaml code. *) val compile_errors: string option (* If present, this is a pair of .messages files whose contents should be compared. *) val compare_errors: (string * string) option (* This flag causes Menhir to read the error message descriptions stored in [filename] and re-generate the auto-generated comments, which begin with [##]. This allows bringing these comments up to date when the grammar evolves. *) val update_errors: string option (* This flag causes Menhir to read the error message descriptions stored in [filename] and echo the error sentences (and nothing else; no messages, no comments). *) val echo_errors: string option (* This flag causes Menhir to produce a [.cmly] file, which contains a binary-format description of the grammar and automaton. 
*) val cmly: bool (* This name is used in --coq mode. It appears in the generated Coq file, and indicates under what name (or path) the Coq library MenhirLib is known. Its default value is [Some "MenhirLib"]. *) val coq_lib_path: string option (* This flag tells whether [$i] notation in semantic actions is allowed. *) type dollars = | DollarsDisallowed | DollarsAllowed val dollars: dollars menhir-20200123/src/slr.ml000066400000000000000000000125231361226111300151470ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module extends the LR(0) automaton with lookahead information in order to construct an SLR(1) automaton. The lookahead information is obtained by considering the FOLLOW sets. *) (* This construction is not used by Menhir, but can be used to check whether the grammar is in the class SLR(1). This check is performed when the log level [lg] is at least 1. *) open Grammar (* This flag, which is reserved for internal use, causes more information about SLR(1) conflict states to be printed. *) let tell_me_everything = false (* The following function turns an LR(0) state into an SLR(1) state. *) let make_slr_state (s : Lr0.node) : Lr0.concretelr1state = (* Obtain the set of LR(0) items associated with the state [s]. *) let items = Lr0.items s in (* Unfortunately, this set is not closed. We do not have a function that computes the closure of a set of LR(0) items -- we could build one using [Item.Closure], but that would be overkill. 
So, we first convert this set to a set of LR(1) items, then compute the closure at this level, and finally we turn this LR(1) state into an SLR(1) state by letting the lookahead sets be the FOLLOW sets. This is somewhat ugly and naïve, but seems to work. *) (* Convert this set to a set of LR(1) items. Here, we can use any set of tokens as the lookahead set. We use the empty set. *) let s = Item.Map.lift (fun _item -> TerminalSet.empty) items in (* Compute the LR(1) closure. *) let s = Lr0.closure s in (* We now have an LR(1) state that has the correct set of LR(0) items but phony lookahead information. We convert it into an SLR(1) state by deciding that, for each item, the lookahead set is the FOLLOW set of the symbol that appears on the left-hand side of the item. *) Item.Map.fold (fun item toks accu -> let _, nt, _, _, _ = Item.def item in let follow_nt = Analysis.follow nt in assert (TerminalSet.subset toks follow_nt); (* sanity check *) Item.Map.add item follow_nt accu ) s Item.Map.empty (* The following function turns a closed LR(1) state into a map of terminal symbols to reduction actions. Copied from a related function in [Lr0]. *) let reductions (s : Lr0.concretelr1state) : Production.index list TerminalMap.t = Item.Map.fold (fun item toks reductions -> match Item.classify item with | Item.Reduce prod -> Lr0.add_reductions prod toks reductions | Item.Shift _ -> reductions ) s TerminalMap.empty (* The following function turns a closed LR(1) state into a set of shift actions. *) let transitions (s : Lr0.concretelr1state) : TerminalSet.t = Item.Map.fold (fun item _ transitions -> match Item.classify item with | Item.Shift (Symbol.T tok, _) -> TerminalSet.add tok transitions | Item.Shift (Symbol.N _, _) | Item.Reduce _ -> transitions ) s TerminalSet.empty (* This function computes the domain of a terminal map, producing a terminal set. 
*) let domain (m : 'a TerminalMap.t) : TerminalSet.t = TerminalMap.fold (fun tok _ accu -> TerminalSet.add tok accu ) m TerminalSet.empty (* The following function checks whether a closed LR(1) state is free of conflicts. *) let state_is_ok (s : Lr0.concretelr1state) : bool = let reductions = reductions s and transitions = transitions s in (* Check for shift/reduce conflicts. *) TerminalSet.disjoint transitions (domain reductions) && (* Check for reduce/reduce conflicts. *) TerminalMap.fold (fun _ prods ok -> ok && match prods with | [] | [ _ ] -> true | _ :: _ :: _ -> false ) reductions true (* The following function counts the number of states in the SLR(1) automaton that have a conflict. *) let count_slr_violations () : int = let count = ref 0 in for s = 0 to Lr0.n - 1 do let s = make_slr_state s in if not (state_is_ok s) then begin incr count; if tell_me_everything then Printf.fprintf stderr "The following SLR(1) state has a conflict:\n%s" (Lr0.print_concrete "" s) end done; !count (* At log level 1, indicate whether the grammar is SLR(1). *) let check () = Error.logG 1 (fun f -> let count = count_slr_violations() in if count = 0 then Printf.fprintf f "The grammar is SLR(1).\n" else Printf.fprintf f "The grammar is not SLR(1) -- %d states have a conflict.\n" count ) menhir-20200123/src/slr.mli000066400000000000000000000025431361226111300153210ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This module extends the LR(0) automaton with lookahead information in order to construct an SLR(1) automaton. 
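A generic analogue of [domain], written against the standard library's [Map] and [Set] functors rather than [TerminalMap] and [TerminalSet], shows the same fold pattern (the string-keyed instantiation is only for illustration):

```ocaml
(* The domain of a map is the set of its keys, computed with a fold
   that ignores the data component of each binding. *)
module SMap = Map.Make (String)
module SSet = Set.Make (String)

let domain (m : 'a SMap.t) : SSet.t =
  SMap.fold (fun key _data acc -> SSet.add key acc) m SSet.empty

let () =
  let m = SMap.(empty |> add "if" 1 |> add "then" 2) in
  assert (SSet.elements (domain m) = ["if"; "then"])
```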
The lookahead information is obtained by considering the FOLLOW sets. *) (* This construction is not used by Menhir, but can be used to check whether the grammar is in the class SLR(1). This check is performed when the log level [lg] is at least 1. *) val check: unit -> unit menhir-20200123/src/stage1/000077500000000000000000000000001361226111300151765ustar00rootroot00000000000000menhir-20200123/src/stage1/Driver.ml000066400000000000000000000030031361226111300167570ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The module [Driver] serves to offer a unified API to the parser, which could be produced by either ocamlyacc or Menhir. *) (* This is the ocamlyacc-specific driver. There is nothing special to do. We handle syntax errors in a minimalistic manner. This error handling code will be exercised only if there is a syntax error in [fancy-parser.mly], during stage 2 of the bootstrap process. *) let grammar lexer lexbuf = try Parser.grammar lexer lexbuf with Parsing.Parse_error -> Error.error (Positions.lexbuf lexbuf) "syntax error." menhir-20200123/src/stage1/dune000066400000000000000000000007761361226111300160660ustar00rootroot00000000000000;; Build the stage1 version of Menhir. During this stage, Menhir's parser ;; is generated by ocamlyacc. (ocamlyacc parser) ;; As dune cannot use the same OCaml module in two different libraries or ;; executables, we must copy the source files to the present directory. (copy_files# ../*.{ml,mli}) ;; The stage1 version of Menhir. 
This executable is later used to build the ;; stage2 version of Menhir. (executable (name main) (libraries unix menhirLib menhirSdk) (flags :standard -open MenhirSdk) ) menhir-20200123/src/stage1/parser.mly000066400000000000000000000270121361226111300172170ustar00rootroot00000000000000/******************************************************************************/ /* */ /* Menhir */ /* */ /* François Pottier, Inria Paris */ /* Yann Régis-Gianas, PPS, Université Paris Diderot */ /* */ /* Copyright Inria. All rights reserved. This file is distributed under the */ /* terms of the GNU General Public License version 2, as described in the */ /* file LICENSE. */ /* */ /******************************************************************************/ /* This is the crude version of the parser. It is meant to be processed by ocamlyacc. Its existence is necessary for bootstrapping. It is kept in sync with [fancy-parser], with a few differences: 0. [yacc-parser] produces dummy position information; 1. [fancy-parser] exploits many features of Menhir; 2. [fancy-parser] performs slightly more refined error handling; 3. [fancy-parser] supports anonymous rules. 4. [fancy-parser] supports the new rule syntax. */ %{ open Syntax open Positions %} %token TOKEN TYPE LEFT RIGHT NONASSOC START PREC PUBLIC COLON BAR EOF EQUAL %token INLINE LPAREN RPAREN COMMA QUESTION STAR PLUS PARAMETER ON_ERROR_REDUCE %token PERCENTATTRIBUTE SEMI %token LID UID QID %token HEADER %token OCAMLTYPE %token PERCENTPERCENT %token ACTION %token ATTRIBUTE GRAMMARATTRIBUTE /* For the new rule syntax: */ %token LET TILDE UNDERSCORE COLONEQUAL EQUALEQUAL %start grammar %type producer %type production %type grammar /* These declarations solve a shift-reduce conflict in favor of shifting: when the declaration of a non-terminal symbol begins with a leading bar, it is understood as an (insignificant) leading optional bar, *not* as an empty right-hand side followed by a bar. 
This ambiguity arises due to the existence of a new notation for letting several productions share a single semantic action. */ %nonassoc no_optional_bar %nonassoc BAR %% /* ------------------------------------------------------------------------- */ /* A grammar consists of declarations and rules, followed by an optional postlude, which we do not parse. */ grammar: declarations PERCENTPERCENT rules postlude { { pg_filename = ""; (* filled in by the caller *) pg_declarations = List.rev $1; pg_rules = $3; pg_postlude = $4 } } postlude: EOF { None } | PERCENTPERCENT /* followed by actual postlude */ { Some (Lazy.force $1) } /* ------------------------------------------------------------------------- */ /* A declaration is an %{ OCaml header %}, or a %token, %start, %type, %left, %right, or %nonassoc declaration. */ declarations: /* epsilon */ { [] } | declarations declaration { $2 @ $1 } | declarations SEMI { $1 } declaration: | HEADER /* lexically delimited by %{ ... %} */ { [ unknown_pos (DCode $1) ] } | TOKEN optional_ocamltype terminals { let ty, ts = $2, $3 in List.map (Positions.map (fun (terminal, alias, attrs) -> DToken (ty, terminal, alias, attrs) )) ts } | START nonterminals { List.map (Positions.map (fun nonterminal -> DStart nonterminal)) $2 } | TYPE OCAMLTYPE actuals { List.map (Positions.map (fun nt -> DType ($2, nt))) (List.map Parameters.with_pos $3) } | START OCAMLTYPE nonterminals /* %start foo is syntactic sugar for %start foo %type foo */ { Misc.mapd (fun ntloc -> Positions.mapd (fun nt -> DStart nt, DType ($2, ParameterVar ntloc)) ntloc) $3 } | priority_keyword symbols { let prec = ParserAux.new_precedence_level (rhs_start_pos 1, rhs_end_pos 1) in List.map (Positions.map (fun symbol -> DTokenProperties (symbol, $1, prec))) $2 } | PARAMETER OCAMLTYPE { [ unknown_pos (DParameter $2) ] } | GRAMMARATTRIBUTE { [ unknown_pos (DGrammarAttribute $1) ] } | PERCENTATTRIBUTE actuals attributes { [ unknown_pos (DSymbolAttributes ($2, $3)) ] } | 
ON_ERROR_REDUCE actuals { let prec = ParserAux.new_on_error_reduce_level() in List.map (Positions.map (fun nt -> DOnErrorReduce (nt, prec))) (List.map Parameters.with_pos $2) } optional_ocamltype: /* epsilon */ { None } | OCAMLTYPE /* lexically delimited by angle brackets */ { Some $1 } priority_keyword: LEFT { LeftAssoc } | RIGHT { RightAssoc } | NONASSOC { NonAssoc } /* ------------------------------------------------------------------------- */ /* A symbol is a terminal or nonterminal symbol. */ /* One would like to require nonterminal symbols to begin with a lowercase letter, so as to lexically distinguish them from terminal symbols, which must begin with an uppercase letter. However, for compatibility with ocamlyacc, this is impossible. It can be required only for nonterminal symbols that are also start symbols. */ /* We also accept token aliases in place of ordinary terminal symbols. Token aliases are quoted strings. */ symbols: /* epsilon */ { [] } | symbols optional_comma symbol { $3 :: $1 } symbol: LID { $1 } | UID { $1 } | QID { $1 } optional_comma: /* epsilon */ { () } | COMMA { () } attributes: /* epsilon */ { [] } | ATTRIBUTE attributes { $1 :: $2 } /* ------------------------------------------------------------------------- */ /* Terminals must begin with an uppercase letter. Nonterminals that are declared to be start symbols must begin with a lowercase letter. */ terminals: /* epsilon */ { [] } | terminals optional_comma UID optional_alias attributes { let ts, uid, alias, attrs = $1, $3, $4, $5 in let alias = Option.map Positions.value alias in Positions.map (fun uid -> uid, alias, attrs) uid :: ts } nonterminals: /* epsilon */ { [] } | nonterminals LID { $2 :: $1 } optional_alias: /* epsilon */ { None } | QID { Some $1 } /* ------------------------------------------------------------------------- */ /* A rule defines a symbol. It is optionally declared %public, and optionally carries a number of formal parameters. 
The right-hand side of the definition consists of a list of production groups. */ rules: /* epsilon */ { [] } | rules rule { $2 :: $1 } | rules SEMI { $1 } rule: flags symbol attributes optional_formal_parameters COLON optional_bar production_group production_groups { let public, inline = $1 in { pr_public_flag = public; pr_inline_flag = inline; pr_nt = Positions.value $2; pr_positions = [ Positions.position $2 ]; pr_attributes = $3; pr_parameters = $4; pr_branches = List.flatten ($7 :: List.rev $8) } } flags: /* epsilon */ { false, false } | PUBLIC { true, false } | INLINE { false, true } | PUBLIC INLINE { true, true } | INLINE PUBLIC { true, true } /* ------------------------------------------------------------------------- */ /* Parameters are surrounded with parentheses and delimited by commas. The syntax of actual parameters allows applications, whereas the syntax of formal parameters does not. It also allows use of the "?", "+", and "*" shortcuts. */ optional_formal_parameters: /* epsilon */ { [] } | LPAREN formal_parameters RPAREN { $2 } formal_parameters: symbol { [ Positions.value $1 ] } | symbol COMMA formal_parameters { Positions.value $1 :: $3 } optional_actuals: /* epsilon */ { [] } | LPAREN actuals_comma RPAREN { $2 } actuals_comma: actual { [ $1 ] } | actual COMMA actuals_comma { $1 :: $3 } actual: symbol optional_actuals { Parameters.app $1 $2 } | actual modifier { ParameterApp ($2, [ $1 ]) } actuals: /* epsilon */ { [] } | actuals optional_comma actual { $3::$1 } optional_bar: /* epsilon */ %prec no_optional_bar { () } | BAR { () } /* ------------------------------------------------------------------------- */ /* The "?", "+", and "*" modifiers are short-hands for applications of certain parameterized nonterminals, defined in the standard library.
*/ modifier: QUESTION { unknown_pos "option" } | PLUS { unknown_pos "nonempty_list" } | STAR { unknown_pos "list" } /* ------------------------------------------------------------------------- */ /* A production group consists of a list of productions, followed by a semantic action and an optional precedence specification. */ production_groups: /* epsilon */ { [] } | production_groups BAR production_group { $3 :: $1 } production_group: productions ACTION /* action is lexically delimited by braces */ optional_precedence { let productions, action, oprec2 = $1, $2, $3 in (* If multiple productions share a single semantic action, check that all of them bind the same names. *) ParserAux.check_production_group productions; (* Then, *) List.map (fun (producers, oprec1, level, pos) -> (* Replace [$i] with [_i]. *) let pr_producers = ParserAux.normalize_producers producers in (* Distribute the semantic action. Also, check that every [$i] is within bounds. *) let names = ParserAux.producer_names producers in let pr_action = action Settings.dollars names in { pr_producers; pr_action; pr_branch_prec_annotation = ParserAux.override pos oprec1 oprec2; pr_branch_production_level = level; pr_branch_position = pos }) productions } optional_precedence: /* epsilon */ { None } | PREC symbol { Some $2 } /* ------------------------------------------------------------------------- */ /* A production is a list of producers, optionally followed by a precedence declaration. Lists of productions are nonempty and separated with bars. 
*/ productions: production { [ $1 ] } | production bar_productions { $1 :: $2 } bar_productions: BAR production { [ $2 ] } | BAR production bar_productions { $2 :: $3 } production: producers optional_precedence { List.rev $1, $2, ParserAux.new_production_level(), Positions.import (symbol_start_pos(), symbol_end_pos()) } producers: /* epsilon */ { [] } | producers producer { $2 :: $1 } /* ------------------------------------------------------------------------- */ /* A producer is an actual parameter, possibly preceded by a binding, and possibly followed with attributes. */ producer: | actual attributes optional_semis { Positions.import (symbol_start_pos(), symbol_end_pos()), None, $1, $2 } | LID EQUAL actual attributes optional_semis { Positions.import (symbol_start_pos(), symbol_end_pos()), Some $1, $3, $4 } /* ------------------------------------------------------------------------- */ /* Semicolons used to be considered whitespace by our lexer, but are no longer. We must allow optional semicolons in a few conventional places. */ optional_semis: /* empty */ { () } | optional_semis SEMI { () } %% menhir-20200123/src/stage2/000077500000000000000000000000001361226111300151775ustar00rootroot00000000000000menhir-20200123/src/stage2/Driver.ml000066400000000000000000000056331361226111300167730ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* The module [Driver] serves to offer a unified API to the parser, which could be produced by either ocamlyacc or Menhir. *) (* This is the Menhir-specific driver. 
We wish to handle syntax errors in a more ambitious manner, so as to help our end users understand their mistakes. *) open Parser.MenhirInterpreter (* incremental API to our parser *) (* [fail buffer lexbuf s] is invoked if a syntax error is encountered in state [s]. *) let fail buffer lexbuf (s : int) = (* Display a nice error message. In principle, the table found in [ParserMessages] should be complete, so we should obtain a nice message. If [Not_found] is raised, we produce a generic message, which is better than nothing. Note that the OCaml code in [ParserMessages] is auto-generated based on the table in [ParserMessages.messages]. *) let message = try ParserMessages.message s with Not_found -> Printf.sprintf "Unknown syntax error (in state %d).\n" s in (* Show the two tokens between which the error took place. *) let where = MenhirLib.ErrorReports.show InputFile.chunk buffer in (* Hack: remove the final newline, because [Error.error] adds one. *) let message = String.sub message 0 (String.length message - 1) in (* Display our message and die. *) Error.error (Positions.lexbuf lexbuf) "syntax error %s.\n%s" where message (* Same as above, except we expect a checkpoint instead of a state [s]. *) let fail buffer lexbuf checkpoint = match checkpoint with | HandlingError env -> let s = current_state_number env in fail buffer lexbuf s | _ -> assert false (* this cannot happen *) (* The entry point. *) let grammar lexer lexbuf = (* Keep track of the last two tokens in a buffer. *) let buffer, lexer = MenhirLib.ErrorReports.wrap lexer in loop_handle (fun v -> v) (fail buffer lexbuf) (lexer_lexbuf_to_supplier lexer lexbuf) (Parser.Incremental.grammar lexbuf.Lexing.lex_curr_p) menhir-20200123/src/stage2/Makefile000066400000000000000000000014451361226111300166430ustar00rootroot00000000000000# The directory used by dune to mirror this directory. 
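The lookup-with-fallback logic of [fail] above can be sketched in isolation. In this sketch, [message_table] is a hypothetical stand-in for the generated [ParserMessages.message] function; it is not part of Menhir.

```ocaml
(* A standalone sketch of the error-reporting logic in [fail] above:
   look up a hand-written message for an automaton state, fall back to a
   generic message if none exists, and strip the final newline that
   every entry in a .messages file carries, since the caller adds one.
   [message_table] is a toy stand-in for [ParserMessages.message]. *)
let message_table (s : int) : string =
  match s with
  | 0 -> "Expected a declaration or %%.\n"
  | _ -> raise Not_found

let message_for (s : int) : string =
  let message =
    try message_table s
    with Not_found -> Printf.sprintf "Unknown syntax error (in state %d).\n" s
  in
  (* Remove the final newline, because the caller adds one. *)
  String.sub message 0 (String.length message - 1)

let () =
  assert (message_for 0 = "Expected a declaration or %%.");
  assert (message_for 42 = "Unknown syntax error (in state 42).")
```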
BUILD_DIR := ../../_build/default/src/stage2 # [make update] is used under the programmer's manual control, after the # grammar in [parser.mly] has been modified. # It updates the file [parserMessages.messages] with new error states (if # necessary) and with new auto-generated comments for all error states. .PHONY: update update: @ dune build --force parserMessages.messages.updated @ cp $(BUILD_DIR)/parserMessages.messages.updated parserMessages.messages # [make strip] strips away the auto-generated comments found in the file # parserMessages.messages. It is typically used after [make update], which # creates many such comments. .PHONY: strip strip: @ sed -e "/^##/d" -i.bak parserMessages.messages @ rm parserMessages.messages.bak menhir-20200123/src/stage2/dune000066400000000000000000000107261361226111300160630ustar00rootroot00000000000000;; Build the stage2 version of Menhir. During this stage, Menhir's parser ;; is generated by the stage1 Menhir executable. ;; ----------------------------------------------------------------------------- ;; The flags that are passed to every invocation of Menhir in the rules below ;; are set in the file "menhir_flags". Any flags that affect the construction ;; of the automaton, such as --canonical, *must* be listed there. ;; We need these flags in "s-expression" format in order to use them in the ;; "menhir" stanza below. The following rule generates a file in this format ;; by wrapping the list of arguments in parentheses. (rule (with-stdout-to menhir_flags.sexp (progn (echo "(") (cat %{dep:menhir_flags}) (echo ")") ) ) ) ;; ----------------------------------------------------------------------------- ;; Bind the name "menhir" to "../stage1/main.exe" within the present scope. ;; This is so that the "menhir" stanza below will use *that* executable ;; instead of whatever "menhir" executable (if any) is available on the ;; developer's machine. 
(env (_ (binaries ../stage1/main.exe (../stage1/main.exe as menhir))) ) ;; Menhir's parser is generated by Menhir. ;; We include the flags found in the file "menhir_flags" plus extra flags ;; specified here. (menhir (flags (:include menhir_flags.sexp) --strict -lg 1 -la 1 -lc 1 -v ) (modules parser) ) ;; ----------------------------------------------------------------------------- ;; As dune cannot use the same OCaml module in two different libraries or ;; executables, we must copy the source files to the present directory. (copy_files# ../*.{ml,mli}) ;; ----------------------------------------------------------------------------- ;; The stage2 version of Menhir. The stanza is identical to that used for the ;; stage1 version, but the [Driver] and [Parser] modules are different. ;; The link_deps field requires running the completeness check. (executable (name main) (libraries unix menhirLib menhirSdk) (flags :standard -open MenhirSdk) (link_deps parserMessages.check) ) ;; ----------------------------------------------------------------------------- ;; Install the Menhir executable under the "menhir" name. This would usually ;; be achieved by adding a "public_name" field in the "executable" stanza ;; above. However, we cannot do that here, because the public name "menhir" ;; would clash with the binding of this name to the stage1 version of Menhir ;; at the top of this file. Thus, we explicitly request its installation as ;; follows. (install (section bin) (package menhir) (files (./main.exe as menhir)) ) ;; ----------------------------------------------------------------------------- ;; This section deals with the .messages file. ;; The module [ParserMessages] is generated by Menhir based on the source file ;; "parserMessages.messages". The completeness check is performed first. 
(rule (deps parserMessages.check) (action (with-stdout-to parserMessages.ml (run menhir %{read-lines:menhir_flags} %{dep:parser.mly} --compile-errors %{dep:parserMessages.messages} ) )) ) ;; In order to perform the completeness check, we must first generate a file ;; "parserMessages.auto.messages" that contains a list of all error states. (rule (with-stdout-to parserMessages.auto.messages (run menhir %{read-lines:menhir_flags} %{dep:parser.mly} --list-errors ) ) ) ;; The completeness check verifies that all error messages are listed in the ;; ".messages" file. It compares the ".messages" file with that generated by ;; Menhir using the above rule. (rule (with-stdout-to parserMessages.check (run menhir %{read-lines:menhir_flags} %{dep:parser.mly} --compare-errors %{dep:parserMessages.auto.messages} --compare-errors %{dep:parserMessages.messages} )) ) ;; ----------------------------------------------------------------------------- ;; The following rule is used under the programmer's manual control, ;; after the grammar in [src/stage2/parser.mly] has been modified. ;; This rule updates the file [parserMessages.messages] with new error ;; states (if necessary) and with new auto-generated comments for all ;; error states. ;; It is invoked by running [make update] in the directory src/. (rule (with-stdout-to parserMessages.messages.updated (run menhir %{read-lines:menhir_flags} --update-errors %{dep:parserMessages.messages} %{dep:parser.mly} ) ) ) menhir-20200123/src/stage2/menhir_flags000066400000000000000000000000461361226111300175600ustar00rootroot00000000000000--canonical --table --fixed-exception menhir-20200123/src/stage2/parser.mly000066400000000000000000000606341361226111300172270ustar00rootroot00000000000000/******************************************************************************/ /* */ /* Menhir */ /* */ /* François Pottier, Inria Paris */ /* Yann Régis-Gianas, PPS, Université Paris Diderot */ /* */ /* Copyright Inria. All rights reserved. 
This file is distributed under the */ /* terms of the GNU General Public License version 2, as described in the */ /* file LICENSE. */ /* */ /******************************************************************************/ /* This is the fancy version of the parser, to be processed by menhir. It is kept in sync with [Parser], but exercises menhir's features. */ /* As of 2014/12/02, the $previouserror keyword and the --error-recovery mode no longer exists. Thus, we replace all calls to [Error.signal] with calls to [Error.error], and report just one error. */ /* ------------------------------------------------------------------------- */ /* Imports. */ %{ open Stretch open Syntax open Positions (* An injection of symbol expressions into choice expressions. *) let inject (e : symbol_expression located) : expression = Positions.pmap (fun pos e -> let branch = Branch ( Positions.with_pos pos (ESingleton e), ParserAux.new_production_level() ) in EChoice [ branch ] ) e (* When a stretch has been created by [Lexer.mk_stretch] with [parenthesize] set to [true], it includes parentheses. In some (rare) cases, this is undesirable. The following function removes the parentheses a posteriori. They are replaced with whitespace, so as to not alter column numbers. *) let rec find s n i = assert (i < n); if s.[i] = '(' then i else begin assert (s.[i] = ' '); find s n (i+1) end let unparenthesize (s : string) : string = let n = String.length s in (* The string [s] must end with a closing parenthesis. *) assert (n >= 2 && s.[n-1] = ')'); (* The string [s] must begin with a certain amount of spaces followed with an opening parenthesis. Find its offset [i]. *) let i = find s n 0 in (* Create a copy without the parentheses. 
*) let b = Bytes.of_string s in Bytes.set b i ' '; Bytes.set b (n-1) ' '; Bytes.to_string b let unparenthesize (s : Stretch.t) : Stretch.t = { s with stretch_content = unparenthesize s.stretch_content } let unparenthesize (o : Stretch.t option) : Stretch.t option = Option.map unparenthesize o %} /* ------------------------------------------------------------------------- */ /* Tokens. */ %token TOKEN TYPE LEFT RIGHT NONASSOC START PREC PUBLIC COLON BAR EOF EQUAL %token INLINE LPAREN RPAREN COMMA QUESTION STAR PLUS PARAMETER ON_ERROR_REDUCE %token PERCENTATTRIBUTE SEMI %token LID UID QID %token HEADER %token OCAMLTYPE %token PERCENTPERCENT %token ACTION %token ATTRIBUTE GRAMMARATTRIBUTE /* For the new rule syntax: */ %token LET TILDE UNDERSCORE COLONEQUAL EQUALEQUAL /* ------------------------------------------------------------------------- */ /* Type annotations and start symbol. */ %type producer %type production %start grammar /* ------------------------------------------------------------------------- */ /* Priorities. */ /* These declarations solve a shift-reduce conflict in favor of shifting: when the right-hand side of an old-style rule begins with a leading bar, this bar is understood as an (insignificant) leading optional bar, *not* as an empty right-hand side followed by a bar. This ambiguity arises due to the possibility for several productions to share a single semantic action. The new rule syntax does not have this possibility, and has no ambiguity. */ %nonassoc no_optional_bar %nonassoc BAR /* ------------------------------------------------------------------------- */ /* On-error-reduce declarations. */ /* These declarations reduce the number of states where an error can occur, thus reduce the number of syntax error messages that we have to write in parserMessages.messages. 
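The behavior of [unparenthesize] above can be checked on toy inputs. The following is a self-contained restatement, not Menhir's code itself: it inlines [find] and operates on a plain string rather than a [Stretch.t].

```ocaml
(* A standalone sketch of [unparenthesize] above: the parentheses added
   by [Lexer.mk_stretch] are blanked out rather than removed, so the
   string length, and hence all column numbers, are preserved. *)
let unparenthesize (s : string) : string =
  let n = String.length s in
  (* The string must end with a closing parenthesis. *)
  assert (n >= 2 && s.[n - 1] = ')');
  (* Find the opening parenthesis, skipping leading spaces. *)
  let rec find i =
    assert (i < n);
    if s.[i] = '(' then i else begin assert (s.[i] = ' '); find (i + 1) end
  in
  let i = find 0 in
  let b = Bytes.of_string s in
  Bytes.set b i ' ';
  Bytes.set b (n - 1) ' ';
  Bytes.to_string b

let () =
  assert (unparenthesize "  (x + 1)" = "   x + 1 ");
  (* Column numbers are unaffected: the length is unchanged. *)
  assert (String.length (unparenthesize "(abc)") = String.length "(abc)")
```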
*/ %on_error_reduce old_rule %on_error_reduce list(ATTRIBUTE) %on_error_reduce action_expression %on_error_reduce separated_nonempty_list(COMMA,symbol) %on_error_reduce separated_nonempty_list(COMMA,pattern) %on_error_reduce loption(delimited(LPAREN,separated_nonempty_list(COMMA,lax_actual),RPAREN)) %on_error_reduce loption(delimited(LPAREN,separated_nonempty_list(COMMA,expression),RPAREN)) %% /* ------------------------------------------------------------------------- */ /* A grammar consists of declarations and rules, followed by an optional postlude, which we do not parse. */ grammar: ds = flatten(declaration*) PERCENTPERCENT rs = rule* t = postlude { { pg_filename = ""; (* filled in by the caller *) pg_declarations = ds; pg_rules = rs; pg_postlude = t } } /* ------------------------------------------------------------------------- */ /* A declaration is an %{ OCaml header %}, or a %token, %start, %type, %left, %right, or %nonassoc declaration. */ declaration: | h = HEADER /* lexically delimited by %{ ... %} */ { [ with_loc $loc (DCode h) ] } | TOKEN ty = OCAMLTYPE? ts = clist(terminal_alias_attrs) { List.map (Positions.map (fun (terminal, alias, attrs) -> DToken (ty, terminal, alias, attrs) )) ts } | START t = OCAMLTYPE? 
nts = clist(nonterminal) /* %start foo is syntactic sugar for %start foo %type foo */ { match t with | None -> List.map (Positions.map (fun nonterminal -> DStart nonterminal)) nts | Some t -> Misc.mapd (fun ntloc -> Positions.mapd (fun nt -> DStart nt, DType (t, ParameterVar ntloc)) ntloc) nts } | TYPE t = OCAMLTYPE ss = clist(strict_actual) { List.map (Positions.map (fun nt -> DType (t, nt))) (List.map Parameters.with_pos ss) } | k = priority_keyword ss = clist(symbol) { let prec = ParserAux.new_precedence_level $loc(k) in List.map (Positions.map (fun symbol -> DTokenProperties (symbol, k, prec))) ss } | PARAMETER t = OCAMLTYPE { [ with_loc $loc (DParameter t) ] } | attr = GRAMMARATTRIBUTE { [ with_loc $loc (DGrammarAttribute attr) ] } | PERCENTATTRIBUTE actuals = clist(strict_actual) attrs = ATTRIBUTE+ { [ with_loc $loc (DSymbolAttributes (actuals, attrs)) ] } | ON_ERROR_REDUCE ss = clist(strict_actual) { let prec = ParserAux.new_on_error_reduce_level() in List.map (Positions.map (fun nt -> DOnErrorReduce (nt, prec))) (List.map Parameters.with_pos ss) } | SEMI { [] } /* This production recognizes tokens that are valid in the rules section, but not in the declarations section. This is a hint that a %% was forgotten. */ | rule_specific_token { Error.error [Positions.import $loc] "syntax error inside a declaration.\n\ Did you perhaps forget the %%%% that separates declarations and rules?" } priority_keyword: LEFT { LeftAssoc } | RIGHT { RightAssoc } | NONASSOC { NonAssoc } %inline rule_specific_token: | PUBLIC | INLINE | COLON | LET | EOF { () } /* ------------------------------------------------------------------------- */ /* Our lists of symbols are separated with optional commas. Order is irrelevant. */ %inline clist(X): xs = separated_nonempty_list(COMMA?, X) { xs } /* ------------------------------------------------------------------------- */ /* A symbol is a terminal or nonterminal symbol. 
*/ /* One would like to require nonterminal symbols to begin with a lowercase letter, so as to lexically distinguish them from terminal symbols, which must begin with an uppercase letter. However, for compatibility with ocamlyacc, this is impossible. It can be required only for nonterminal symbols that are also start symbols. */ /* We also accept token aliases in place of ordinary terminal symbols. Token aliases are quoted strings. */ symbol: id = LID | id = UID | id = QID { id } /* ------------------------------------------------------------------------- */ /* Terminals must begin with an uppercase letter. Nonterminals that are declared to be start symbols must begin with a lowercase letter. */ /* In declarations, terminals must be UIDs, but we may also declare token aliases, which are QIDs. */ %inline terminal_alias_attrs: id = UID alias = QID? attrs = ATTRIBUTE* { let alias = Option.map Positions.value alias in Positions.map (fun uid -> uid, alias, attrs) id } %inline nonterminal: id = LID { id } /* ------------------------------------------------------------------------- */ /* A rule is expressed either in the traditional (yacc-style) syntax or in the new syntax. */ %inline rule: old_rule { $1 } | new_rule /* The new syntax is converted on the fly to the old syntax. */ { NewRuleSyntax.rule $1 } /* ------------------------------------------------------------------------- */ /* A rule defines a symbol. It is optionally declared %public, and optionally carries a number of formal parameters. The right-hand side of the definition consists of a list of productions. 
*/ old_rule: flags = flags /* flags */ symbol = symbol /* the symbol that is being defined */ attributes = ATTRIBUTE* params = plist(symbol) /* formal parameters */ COLON optional_bar branches = branches SEMI* { let public, inline = flags in let rule = { pr_public_flag = public; pr_inline_flag = inline; pr_nt = Positions.value symbol; pr_positions = [ Positions.position symbol ]; pr_attributes = attributes; pr_parameters = List.map Positions.value params; pr_branches = branches } in rule } %inline branches: prods = separated_nonempty_list(BAR, production_group) { List.flatten prods } flags: /* epsilon */ { false, false } | PUBLIC { true, false } | INLINE { false, true } | PUBLIC INLINE | INLINE PUBLIC { true, true } optional_bar: /* epsilon */ %prec no_optional_bar | BAR { () } /* ------------------------------------------------------------------------- */ /* A production group consists of a list of productions, followed by a semantic action and an optional precedence specification. */ production_group: productions = separated_nonempty_list(BAR, production) action = ACTION oprec2 = ioption(precedence) { (* If multiple productions share a single semantic action, check that all of them bind the same names. *) ParserAux.check_production_group productions; (* Then, *) List.map (fun (producers, oprec1, level, pos) -> (* Replace [$i] with [_i]. *) let pr_producers = ParserAux.normalize_producers producers in (* Distribute the semantic action. Also, check that every [$i] is within bounds. *) let names = ParserAux.producer_names producers in let pr_action = action Settings.dollars names in { pr_producers; pr_action; pr_branch_prec_annotation = ParserAux.override pos oprec1 oprec2; pr_branch_production_level = level; pr_branch_position = pos }) productions } precedence: PREC symbol = symbol { symbol } /* ------------------------------------------------------------------------- */ /* A production is a list of producers, optionally followed by a precedence declaration. 
*/ production: producers = producer* oprec = ioption(precedence) { producers, oprec, ParserAux.new_production_level(), Positions.import $loc } /* ------------------------------------------------------------------------- */ /* A producer is an actual parameter, possibly preceded by a binding, and possibly followed with attributes. Because both [ioption] and [terminated] are defined as inlined by the standard library, this definition expands to two productions, one of which begins with id = LID, the other of which begins with p = actual. The token LID is in FIRST(actual), but the LR(1) formalism can deal with that. If [option] was used instead of [ioption], an LR(1) conflict would arise -- looking ahead at LID would not allow determining whether to reduce an empty [option] or to shift. */ producer: | id = ioption(terminated(LID, EQUAL)) p = actual attrs = ATTRIBUTE* SEMI* { position (with_loc $loc ()), id, p, attrs } /* ------------------------------------------------------------------------- */ /* The ideal syntax of actual parameters includes: 1. a symbol, optionally applied to a list of actual parameters; 2. an actual parameter followed with a modifier; 3. an anonymous rule. (Not delimited by parentheses! Otherwise one would often end up writing two pairs of parentheses.) */ /* In order to avoid a few ambiguities, we restrict this ideal syntax as follows: a. Within a %type declaration, we use [strict_actual], which allows 1- and 2- (this is undocumented; the documentation says we require a symbol) but not 3-, which would not make semantic sense anyway. b. Within a producer, we use [actual], which allows 1- and 2- but not 3-. Case 3- is allowed by switching to [lax_actual] within the actual arguments of an application, which are clearly delimited by parentheses and commas. c. In front of a modifier, we can never allow [lax_actual], as this would create an ambiguity: basically, [A | B?] could be interpreted either as [(A | B)?] or as [A | (B?)]. 
*/ %inline generic_actual(A, B): (* 1- *) symbol = symbol actuals = plist(A) { Parameters.app symbol actuals } (* 2- *) | p = B m = located(modifier) { ParameterApp (m, [ p ]) } strict_actual: p = generic_actual(strict_actual, strict_actual) { p } actual: p = generic_actual(lax_actual, actual) { p } lax_actual: p = generic_actual(lax_actual, /* cannot be lax_ */ actual) { p } (* 3- *) | /* leading bar disallowed */ branches = located(branches) { ParameterAnonymous branches } (* 2016/05/18: we used to eliminate anonymous rules on the fly during parsing. However, when an anonymous rule appears in a parameterized definition, the fresh nonterminal symbol that is created should be parameterized. This was not done, and is not easy to do on the fly, as it requires inherited attributes (or a way of simulating them). We now use explicit abstract syntax for anonymous rules. *) /* ------------------------------------------------------------------------- */ /* The "?", "+", and "*" modifiers are short-hands for applications of certain parameterized nonterminals, defined in the standard library. */ modifier: QUESTION { "option" } | PLUS { "nonempty_list" } | STAR { "list" } /* ------------------------------------------------------------------------- */ /* A postlude is announced by %%, but is optional. */ postlude: EOF { None } | p = PERCENTPERCENT /* followed by actual postlude */ { Some (Lazy.force p) } /* -------------------------------------------------------------------------- */ /* -------------------------------------------------------------------------- */ /* The new rule syntax. */ /* Whereas the old rule syntax allows a nonterminal symbol to begin with an uppercase letter, the new rule syntax disallows it. The left-hand side of a new rule must be a lowercase identifier [LID]. */ /* A new rule *cannot* be terminated by a semicolon. (This is in contrast with a traditional rule, which can be followed with any number of semicolons.)
We are forced to forbid the use of semicolons as a rule terminator because they are used already as a sequencing construct. Permitting both uses would give rise to a shift/reduce conflict that we would not be able to solve. */ new_rule: | rule_public = boption(PUBLIC) LET rule_lhs = LID rule_attributes = ATTRIBUTE* rule_formals = plist(symbol) rule_inline = equality_symbol rule_rhs = expression {{ rule_public; rule_inline; rule_lhs; rule_attributes; rule_formals; rule_rhs; }} /* A new rule is written [let foo := ...] or [let foo == ...]. In the former case, we get an ordinary nonterminal symbol; in the latter case, we get an %inline nonterminal symbol. */ equality_symbol: COLONEQUAL { false } | EQUALEQUAL { true } /* The right-hand side of a new rule is an expression. */ /* An expression is a choice expression. */ expression: e = located(choice_expression) { e } /* A choice expression is a bar-separated list of alternatives, with an optional leading bar, which is ignored. Each alternative is a sequence expression. */ /* We cannot allow a choice expression to be empty, even though that would make semantic sense (the empty sum is void). Indeed, that would create a shift/reduce conflict: after reading [def x = y], it would be unclear whether this is a definition of [x] as an alias for [y], or a definition of [x] as an alias for the empty sum, followed with an old-style rule that happens to begin with [y]. */ %inline choice_expression: branches = preceded_or_separated_nonempty_llist(BAR, branch) { EChoice branches } %inline branch: e = seq_expression { Branch (e, ParserAux.new_production_level()) } /* A sequence expression takes one of the following forms: e1; e2 a sequence that binds no variables (sugar for _ = e1; e2) p = e1; e2 a sequence that binds the variables in the pattern p or is a symbol expression or an action expression.
*/ /* Allowing a symbol expression [e] where a sequence expression is expected can be understood as syntactic sugar for [x = e; { x }]. */ /* In a sequence [e1; e2] or [p = e1; e2], the left-hand expression [e1] is *not* allowed to be an action expression. That would be a Bison-style midrule action. Instead, one must explicitly write [midrule({ ... })]. */ /* In a sequence, the semicolon cannot be omitted. This is in contrast with old-style rules, where semicolons are optional. Here, semicolons are required for disambiguation: indeed, in the absence of mandatory semicolons, when a sequence begins with x(y,z), it would be unclear whether 1- x is a parameterized symbol and (y,z) are its actual arguments, or 2- x is unparameterized and (y, z) is a tuple pattern which forms the beginning of the next element of the sequence. */ /* We *could* allow the semicolon to be omitted when it precedes an action expression (as opposed to a sequence expression). This would be implemented in the definition of the nonterminal symbol [continuation]. We choose not to do this, as we wish to make it clear in this case that this is a sequence whose last element is the action expression. */ %inline seq_expression: e = located(raw_seq_expression) { e } raw_seq_expression: | e1 = symbol_expression e2 = continuation { ECons (SemPatWildcard, e1, e2) } | p1 = pattern EQUAL e1 = symbol_expression e2 = continuation { ECons (p1, e1, e2) } | e = symbol_expression { ESingleton e } | e = action_expression { e } %inline continuation: SEMI e2 = seq_expression /* | e2 = action_expression */ { e2 } /* A symbol expression takes one of the following forms: foo(...) a terminal or nonterminal symbol (with parameters) e* same as above e+ same as above e? same as above */ /* Note the absence of parenthesized expressions [(e)] in the syntax of symbol expressions. There are two reasons why they are omitted. At the syntactic level, introducing them would create a conflict.
At a semantic level, they are both unnecessary and ambiguous, as one can instead write [endrule(e)] or [midrule(e)] and thereby indicate whether the anonymous nonterminal symbol that is generated should or should not be marked %inline. */ symbol_expression: | symbol = symbol es = plist(expression) attrs = ATTRIBUTE* { ESymbol (symbol, es, attrs) } | e = located(symbol_expression) m = located(modifier) attrs = ATTRIBUTE* (* We are forced by syntactic considerations to require a symbol expression in a position where an expression is expected. As a result, an injection must be applied. *) { ESymbol (m, [ inject e ], attrs) } /* An action expression is a semantic action, optionally preceded or followed with a precedence annotation. */ action_expression: | action = action { EAction (action, None) } | prec = precedence action = action { EAction (action, Some prec) } | action = action prec = precedence { EAction (action, Some prec) } /* A semantic action is either a traditional semantic action (an OCaml expression between curly braces) or a point-free semantic action (an optional OCaml identifier between angle brackets). */ /* The token OCAMLTYPE, which until now was supposed to denote an OCaml type between angle brackets, is re-used for this purpose. This is not very pretty. */ /* The stretch produced by the lexer is validated -- i.e., we check that it contains just an OCaml identifier, or is empty. The parentheses added by the lexer to the [stretch_content] field are removed (ugh!) because they are problematic when this identifier is a data constructor. */ action: action = ACTION { XATraditional action } | action = OCAMLTYPE { match ParserAux.validate_pointfree_action action with | os -> XAPointFree (unparenthesize os) | exception Lexpointfree.InvalidPointFreeAction -> Error.error [Positions.import $loc] "A point-free semantic action must consist \ of a single OCaml identifier." (* or whitespace *) } /* Patterns. 
*/ pattern: | x = LID { SemPatVar x } | UNDERSCORE { SemPatWildcard } | TILDE { SemPatTilde (Positions.import $loc) } | LPAREN ps = separated_list(COMMA, pattern) RPAREN { SemPatTuple ps } (* -------------------------------------------------------------------------- *) (* -------------------------------------------------------------------------- *) (* Generic definitions. *) (* ------------------------------------------------------------------------- *) (* Formal and actual parameter lists can be absent. When present, they must be nonempty, and are delimited with parentheses and separated with commas. *) %inline plist(X): params = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { params } (* -------------------------------------------------------------------------- *) (* [reversed_preceded_or_separated_nonempty_llist(delimiter, X)] recognizes a nonempty list of [X]s, separated with [delimiter]s, and optionally preceded with a leading [delimiter]. It produces an OCaml list in reverse order. Its definition is left-recursive. *) reversed_preceded_or_separated_nonempty_llist(delimiter, X): | ioption(delimiter) x = X { [x] } | xs = reversed_preceded_or_separated_nonempty_llist(delimiter, X) delimiter x = X { x :: xs } (* [preceded_or_separated_nonempty_llist(delimiter, X)] recognizes a nonempty list of [X]s, separated with [delimiter]s, and optionally preceded with a leading [delimiter]. It produces an OCaml list in direct order. *) %inline preceded_or_separated_nonempty_llist(delimiter, X): xs = rev(reversed_preceded_or_separated_nonempty_llist(delimiter, X)) { xs } (* [preceded_or_separated_llist(delimiter, X)] recognizes a possibly empty list of [X]s, separated with [delimiter]s, and optionally preceded with a leading [delimiter]. It produces an OCaml list in direct order. 
*) preceded_or_separated_llist(delimiter, X): | (* empty *) { [] } | xs = preceded_or_separated_nonempty_llist(delimiter, X) { xs } (* -------------------------------------------------------------------------- *) (* [located(X)] recognizes the same language as [X] and converts the resulting value from type ['a] to type ['a located]. *) located(X): x = X { with_loc $loc x } %% menhir-20200123/src/stage2/parserMessages.messages000066400000000000000000000523211361226111300217170ustar00rootroot00000000000000# ---------------------------------------------------------------------------- grammar: UID grammar: HEADER UID Either a declaration or '%%' is expected at this point. # ---------------------------------------------------------------------------- grammar: TYPE UID grammar: TYPE OCAMLTYPE TYPE grammar: TYPE OCAMLTYPE UID PREC grammar: TYPE OCAMLTYPE UID LPAREN TYPE grammar: TYPE OCAMLTYPE UID COMMA TYPE grammar: TYPE OCAMLTYPE UID LPAREN UID UID grammar: TYPE OCAMLTYPE UID LPAREN UID COMMA TYPE grammar: TYPE OCAMLTYPE UID PLUS RPAREN grammar: ON_ERROR_REDUCE TYPE # %type and %on_error_reduce are both followed with clist(strict_actual), # so they are not distinguished in the automaton. Ill-formed declaration. Examples of well-formed declarations: %type expression %type date time %type option(date) %on_error_reduce expression %on_error_reduce date time %on_error_reduce option(date) # ---------------------------------------------------------------------------- grammar: TOKEN TYPE grammar: TOKEN OCAMLTYPE TYPE grammar: TOKEN UID STAR grammar: TOKEN UID QID STAR grammar: TOKEN UID COMMA TYPE Ill-formed '%token' declaration. Examples of well-formed declarations: %token FOO %token BAR "|" %token DOT "." SEMICOLON ";" %token LID UID %token FOO [@cost 0] # ---------------------------------------------------------------------------- grammar: START UID grammar: START OCAMLTYPE LEFT grammar: START LID UID grammar: START LID COMMA UID Ill-formed '%start' declaration. 
A start symbol must begin with a lowercase letter. Examples of well-formed declarations: %start program %start expression phrase %start date time # ---------------------------------------------------------------------------- grammar: RIGHT TYPE grammar: RIGHT UID STAR grammar: RIGHT UID COMMA TYPE Ill-formed precedence declaration. Examples of well-formed declarations: %left PLUS %left PLUS MINUS %nonassoc unary_minus %right CONCAT # ---------------------------------------------------------------------------- grammar: PARAMETER UID Ill-formed '%parameter' declaration. Examples of well-formed declarations: %parameter # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT TYPE # Do not mention that %% or EOF would be accepted at this point. A rule is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON ACTION SEMI UNDERSCORE # We have seen a semicolon, so we know that the previous rule is complete. # Do not mention that %% or EOF would be accepted at this point. Another rule is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON ACTION TYPE # Do not mention that %% or EOF would be accepted at this point. Either another production '|' ... or another rule is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT INLINE TYPE # This is definitely old-style syntax. Ill-formed rule. Either '%public' or a non-terminal symbol is expected at this point. Examples of well-formed rules: %public option(X): { None } | x = X { Some x } %inline clist(X): xs = separated_nonempty_list(COMMA?, X) { xs } %public %inline pair(X, Y): x = X; y = Y { (x, y) } grammar: PERCENTPERCENT PUBLIC INLINE TYPE # This is definitely old-style syntax. Ill-formed rule. A non-terminal symbol is expected at this point. 
Examples of well-formed rules: %public option(X): { None } | x = X { Some x } %inline clist(X): xs = separated_nonempty_list(COMMA?, X) { xs } %public %inline pair(X, Y): x = X; y = Y { (x, y) } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID UID Ill-formed rule. Either a parenthesized, comma-delimited list of formal parameters or an attribute or a colon ':' is expected at this point. Examples of well-formed rules: main: e = expr; EOL { e } expr: i = INT { i } | e1 = expr; PLUS; e2 = expr { e1 + e2 } option(X): { None } | x = X { Some x } main [@cost 0]: e = expr; EOL { e } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID LPAREN TYPE grammar: PERCENTPERCENT UID LPAREN UID COMMA TYPE Ill-formed rule. A comma-delimited list of formal parameters is expected at this point. Examples of well-formed rules: option(X): { None } | x = X { Some x } pair(X, Y): x = X; y = Y { (x, y) } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID LPAREN UID UID # Ignore the fact that the comma-delimited list of symbols could continue. Ill-formed rule. A closing parenthesis ')' is expected at this point. Examples of well-formed rules: option(X): { None } | x = X { Some x } pair(X, Y): x = X; y = Y { (x, y) } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON TYPE grammar: PERCENTPERCENT UID COLON BAR TYPE grammar: PERCENTPERCENT UID COLON ACTION BAR TYPE grammar: PERCENTPERCENT UID COLON UID BAR TYPE Ill-formed rule. A list of productions is expected at this point. 
Examples of well-formed rules: main: e = expr; EOL { e } expr: i = INT { i } | e1 = expr; PLUS; e2 = expr { e1 + e2 } symbol: s = LID | s = UID { s } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON UID TYPE grammar: PERCENTPERCENT UID COLON UID SEMI TYPE grammar: PERCENTPERCENT UID COLON LID TYPE grammar: PERCENTPERCENT UID COLON LID EQUAL TYPE grammar: PERCENTPERCENT UID COLON LID EQUAL UID PLUS TYPE Ill-formed production. A production is a sequence of producers, followed with a semantic action. Examples of well-formed producers: expr option(COMMA) separated_list(COMMA, expr) e = expr ds = declaration* es = list(terminated(expr, SEMI)) es = list(e = expr SEMI { e }) xs = list(x = var { Some x } | WILDCARD { None }) expr [@cost 0] # The following sentences are tricky. In front of us could be many things # (comma, closing parenthesis, identifier, modifier, %prec keyword, etc.). # We don't know which symbol we expect to reduce towards (e.g., it could be # [actual] or [lax_actual]). # Let's just back up to a safe level of abstraction and say that this is an # ill-formed production. # As RPAREN is in the lookahead set, we may suggest that a parenthesis could # be closed. grammar: PERCENTPERCENT UID COLON UID LPAREN UID TYPE grammar: PERCENTPERCENT UID COLON UID LPAREN UID STAR TYPE grammar: PERCENTPERCENT UID COLON UID LPAREN LID TYPE Ill-formed production. Maybe you meant to close a parenthesis at this point? A production is a sequence of producers, followed with a semantic action. Examples of well-formed producers: expr option(COMMA) separated_list(COMMA, expr) e = expr ds = declaration* es = list(terminated(expr, SEMI)) es = list(e = expr SEMI { e }) xs = list(x = var { Some x } | WILDCARD { None }) # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON UID LPAREN ACTION BAR TYPE # Here, we have seen a BAR, so we expect a production (group). 
A production is expected at this point. A production is a sequence of producers, followed with a semantic action. Examples of well-formed producers: expr option(COMMA) separated_list(COMMA, expr) e = expr ds = declaration* es = list(terminated(expr, SEMI)) es = list(e = expr SEMI { e }) xs = list(x = var { Some x } | WILDCARD { None }) # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON UID LPAREN ACTION UID # In the non-canonical automaton, this is a tricky case where, looking at the # description of the state, it seems that only COMMA and RPAREN can follow # here. But in fact, other tokens are possible, such as BAR, simply because # they will NOT take us into this state. In the canonical automaton, the list # of possibilities is explicit in the lookahead sets. grammar: PERCENTPERCENT UID COLON UID LPAREN ACTION PREC UID UID # In the first case above, we may expect a %prec annotation, whereas in the # second case above, we have just seen it. In the error message, we merge # these two situations and do not mention the possibility of a %prec # annotation. Either another production '|' ... or a comma ',' or a closing parenthesis ')' is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON PREC TYPE grammar: PERCENTPERCENT UID COLON UID LPAREN ACTION PREC TYPE grammar: PERCENTPERCENT UID COLON ACTION PREC TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL PREC EOF grammar: PERCENTPERCENT LET LID COLONEQUAL ACTION PREC EOF # Conflate old rule syntax and new rule syntax. Ill-formed %prec annotation. A symbol is expected at this point. Examples of well-formed annotations: expr: MINUS e = expr %prec UMINUS { -e } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON UID LPAREN TYPE grammar: PERCENTPERCENT UID COLON UID LPAREN UID COMMA TYPE Ill-formed rule. 
A comma-delimited list of actual parameters is expected at this point. Examples of well-formed rules: call: f = callee LPAREN args = separated_list(COMMA, expr) RPAREN { f, args } list(X): { [] } | x = X; xs = list(X) { x :: xs } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON PREC LID UID Ill-formed rule. Either a semantic action '{' ... '}' or another production '|' ... is expected at this point. Examples of well-formed rules: expr: MINUS e = expr %prec UMINUS { -e } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID LPAREN UID RPAREN BAR Ill-formed rule. A colon ':' is expected at this point. Examples of well-formed rules: option(X): { None } | x = X { Some x } # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT UID COLON ACTION PREC UID TYPE Either another rule or another production '|' ... is expected at this point. Examples of well-formed rules: option(X): { None } | x = X { Some x } # ---------------------------------------------------------------------------- grammar: TYPE OCAMLTYPE UID LPAREN UID LPAREN TYPE grammar: PERCENTPERCENT UID COLON UID LPAREN UID LPAREN TYPE Ill-formed list of actual parameters. A comma-delimited list of actual parameters is expected at this point. Examples of well-formed actual parameters: expr expr+ option(expr) separated_list(COMMA, expr) # Omitting the fact that an anonymous rule is a valid actual parameter... # Also omitting the subtle distinctions between lax_actual, actual, etc. # ---------------------------------------------------------------------------- grammar: TYPE OCAMLTYPE UID LPAREN UID PLUS UID Ill-formed list of actual parameters. Either a modifier '*' or '+' or '?' or a closing parenthesis ')' or a comma ',' is expected at this point. 
Examples of well-formed actual parameters: expr expr+ option(expr) separated_list(COMMA, expr) # ------------------------------------------------------------------------------ grammar: PERCENTATTRIBUTE TYPE grammar: PERCENTATTRIBUTE UID COMMA TYPE grammar: PERCENTATTRIBUTE UID TYPE grammar: PERCENTATTRIBUTE UID PLUS TYPE grammar: PERCENTATTRIBUTE UID LPAREN TYPE grammar: PERCENTATTRIBUTE UID ATTRIBUTE UID Ill-formed '%attribute' declaration. An '%attribute' declaration should contain a nonempty list of symbols, followed with a nonempty list of attributes. Examples of well-formed declarations: %attribute FOO [@printer "foo"] %attribute bar BAZ [@printer "bar/BAZ"] [@cost 2.0] # ---------------------------------------------------------------------------- # ---------------------------------------------------------------------------- # The following error sentences concern both the old and new rule syntax. grammar: PERCENTPERCENT PUBLIC TYPE Ill-formed rule. 'let' or '%inline' or a non-terminal symbol is expected at this point. Examples of well-formed rules: %public option(X): { None } | x = X { Some x } %public let option(X) := { None } | x = X; { Some x } # ---------------------------------------------------------------------------- # ---------------------------------------------------------------------------- # The following error sentences have to do with the new rule syntax. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET EOF A lowercase identifier is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID EOF grammar: PERCENTPERCENT LET LID LPAREN UID RPAREN EOF # Ignore attributes. # In the first case, we have not seen a list of formal parameters yet, # so such a list could still appear; yet I choose not to mention it. # People are likely to write '=' whereas we expect ':=' or '=='. 
# We should remind them what these two symbols mean. An equality symbol ':=' or '==' is expected at this point. Examples of well-formed rules: let option(X) := { None } | x = X; { Some x } (* ordinary *) let ioption(X) == { None } | x = X; { Some x } (* inline *) # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID LPAREN EOF # By accident, the lookahead token (EQUALEQUAL or COLONEQUAL) reveals # that we are in the new rule syntax. A comma-delimited list of formal parameters is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID LPAREN UID EOF # By accident, the lookahead token (EQUALEQUAL or COLONEQUAL) reveals # that we are in the new rule syntax. At this point, one of the following is expected: a comma ',' followed with an expression, or a closing parenthesis ')'. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL EOF # A choice expression is expected. An expression is expected at this point. Examples of expressions: term t = term; { t } LPAREN; ~ = term; RPAREN; <> factor | term; MUL; factor # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL TILDE EOF grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN TILDE EOF An equals sign '=' is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL TILDE EQUAL EOF grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN TILDE EQUAL EOF # A symbol expression is expected. # A symbol expression always begins with a symbol, # so we can say that a symbol is expected. A symbol is expected at this point. 
# ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL TILDE EQUAL LID EOF grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UNDERSCORE EQUAL UID ATTRIBUTE EOF # Ignore the fact that an attribute or a modifier is permitted. A semicolon ';' is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL PREC UID EOF grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN PREC UID EOF A semantic action is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL LPAREN EOF grammar: PERCENTPERCENT LET LID COLONEQUAL LPAREN LPAREN EOF This opening parenthesis seems to be the beginning of a tuple pattern. Thus, a comma-separated list of patterns is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL LPAREN LPAREN UNDERSCORE EOF grammar: PERCENTPERCENT LET LID COLONEQUAL LPAREN UNDERSCORE EOF The previous opening parenthesis seemed to be the beginning of a tuple pattern. Thus, either a comma ',' followed with a pattern or a closing parenthesis ')' is expected at this point. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL LPAREN UNDERSCORE COMMA EOF A pattern is expected at this point. Examples of patterns: x ~ _ (x, y, _) # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL LID TYPE # This is tricky. We have read a lowercase identifier, # but do not know yet whether it represents a pattern # (in which case an EQUAL sign is expected) # or a symbol # (in which case many continuations are possible) # (in fact, the rule could be complete, as we are # at the top level). 
# Ignore the fact that this symbol could be followed with a list of # actual parameters, or a modifier, or an attribute. At this point, one of the following is expected: an equals sign '=' followed with a symbol, or a semicolon ';' followed with an expression, or a bar '|' followed with an expression, or another rule. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN LID EOF # This is analogous to the previous case, # except we are not at the top level. At this point, one of the following is expected: an equals sign '=' followed with a symbol, or a semicolon ';' followed with an expression, or a bar '|' followed with an expression, or a closing parenthesis ')'. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL ACTION TYPE At this point, one of the following is expected: a bar '|' followed with an expression, or another rule. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID TYPE # Ignore modifiers and attributes. # We expect either SEMI; seq_expression or BAR; expression or another rule. At this point, one of the following is expected: a semicolon ';' followed with an expression, or a bar '|' followed with an expression, or another rule. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UID LPAREN TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UNDERSCORE EQUAL UID LPAREN TYPE A comma-separated list of expressions is expected at this point. 
# ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL BAR TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID BAR TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN BAR TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UID BAR TYPE # A sequence expression is expected. # We are inside a choice expression. # We can show examples that involve '|', # as our sequence expression can be part of a choice expression # and therefore followed with a BAR. An expression is expected at this point. Examples of expressions: term t = term; { t } LPAREN; ~ = term; RPAREN; <> factor | term; MUL; factor # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UID SEMI TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UNDERSCORE EQUAL UID SEMI TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UNDERSCORE EQUAL UID SEMI TYPE grammar: PERCENTPERCENT LET LID COLONEQUAL UID SEMI EOF # A sequence expression is expected. # We are inside a sequence expression. # In fact, we have just read a semicolon. # Maybe it is worth re-iterating that (in the new syntax) # a rule cannot be terminated with a semicolon. After a semicolon, an expression is expected. (A rule cannot be terminated with a semicolon.) Examples of expressions: term t = term; { t } LPAREN; ~ = term; RPAREN; <> # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN ACTION EOF At this point, one of the following is expected: a comma ',' followed with an expression, or a bar '|' followed with an expression, or a closing parenthesis ')'. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UID EOF # Ignore modifiers and attributes. 
# We expect either SEMI; seq_expression or BAR; expression or COMMA; expression or RPAREN. At this point, one of the following is expected: a semicolon ';' followed with an expression, or a bar '|' followed with an expression, or a comma ',' followed with an expression, or a closing parenthesis ')'. # ---------------------------------------------------------------------------- grammar: PERCENTPERCENT LET LID COLONEQUAL UID LPAREN UID COMMA TYPE # A choice expression is expected (allowed). An expression is expected at this point. Examples of expressions: term t = term; { t } LPAREN; ~ = term; RPAREN; <> factor | ~ = term; ~ = op; ~ = factor; # ---------------------------------------------------------------------------- # Local Variables: # mode: shell-script # End: menhir-20200123/src/stage3/000077500000000000000000000000001361226111300152005ustar00rootroot00000000000000menhir-20200123/src/stage3/anonymize/000077500000000000000000000000001361226111300172115ustar00rootroot00000000000000menhir-20200123/src/stage3/anonymize/anonymize.ml000066400000000000000000000026531361226111300215620ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* This script removes file names in # (line) directives. *) (* This is used to textually compare the parsers produced by the stage 2 and stage 3 executables. 
*) let line_directive = Str.regexp {|^# \([0-9]+\) ".*"$|} let process fn = let ic = open_in fn in try while true do let s = input_line ic in print_endline (Str.replace_first line_directive {|# \1 ""|} s) done with End_of_file -> close_in ic let () = Arg.parse [] process "" menhir-20200123/src/stage3/anonymize/dune (executable (name anonymize) (libraries str) ) menhir-20200123/src/stage3/dune ;; Build the stage3 version of Menhir, based on a parser generated by the ;; stage2 version of Menhir. ;; One might think that one could save time by building only the stage3 ;; parser.ml and parser.mli, as opposed to a full stage3 executable. However, ;; because Menhir is used in --infer mode, building these files in a correct ;; manner actually requires type-checking a large part of Menhir's source ;; code. It's simpler to just build everything than to be smart. ;; ----------------------------------------------------------------------------- ;; As dune cannot use the same OCaml module in two different libraries or ;; executables, we must copy the source files to the present directory. (copy_files# ../*.{ml,mli}) ;; The following files are copied from stage 2. (copy_files ../stage2/menhir_flags.sexp) (copy_files ../stage2/parser.mly) (copy_files ../stage2/Driver.ml) (copy_files ../stage2/parserMessages.ml) ;; ----------------------------------------------------------------------------- ;; Bind the name "menhir" to "../stage2/main.exe" within the present scope. (env (_ (binaries ../stage2/main.exe (../stage2/main.exe as menhir))) ) ;; Menhir's parser is generated by Menhir. ;; We include the flags found in the file "menhir_flags" plus extra flags ;; specified here.
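The whole anonymization step boils down to the single `Str` substitution used in `anonymize.ml` above. A minimal standalone sketch of its behavior follows; the helper `anonymize_line` and the sample file name are hypothetical, and the `str` library must be linked:

```ocaml
(* Sketch of the rewriting performed by anonymize.ml above: the file name
   inside a '# <num> "<file>"' directive is erased, so that parsers
   generated in different directories can be compared textually. *)
let line_directive = Str.regexp {|^# \([0-9]+\) ".*"$|}

let anonymize_line (s : string) : string =
  Str.replace_first line_directive {|# \1 ""|} s

let () =
  (* A line directive loses its file name... *)
  assert (anonymize_line {|# 42 "stage2/parser.ml"|} = {|# 42 ""|});
  (* ...while ordinary source lines pass through unchanged. *)
  assert (anonymize_line "let x = 1" = "let x = 1")
```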
(menhir (flags (:include menhir_flags.sexp) ) (modules parser) ) ;; ----------------------------------------------------------------------------- ;; The stage3 version of Menhir. ;; The link_deps field below is an artificial way of requiring the @bootstrap ;; target to be automatically built. (executable (name main) (libraries unix menhirLib menhirSdk) (flags :standard -open MenhirSdk) (link_deps (alias bootstrap)) ) ;; ---------------------------------------------------------------------------- ;; The bootstrap check verifies that stage1-Menhir and stage2-Menhir produce ;; the same result when applied to Menhir's stage2 grammar. If this check ;; fails, then the ocamlyacc parser in stage1/parser.mly and the Menhir parser ;; in stage2/parser.mly have different semantics, a mistake that must be ;; fixed. ;; This check compares the [Parser] modules generated by the stage1 and stage2 ;; executables, and fails if they are not identical. ;; These parsers contain line directives that are necessarily different ;; because they were generated in different directories. A helper script is ;; used to remove the "filename" part of the line directives. ;; This check is run as part of [dune test]. 
(rule (with-stdout-to parser.stage2.ml (run anonymize/anonymize.exe %{dep:../stage2/parser.ml}) ) ) (rule (with-stdout-to parser.stage3.ml (run anonymize/anonymize.exe %{dep:parser.ml}) ) ) (rule (alias bootstrap) (action (progn (echo "Bootstrap check: comparing the stage 2 and stage 3 parsers...\n") (progn (diff parser.stage2.ml parser.stage3.ml) (diff ../stage2/parser.mli parser.mli) ) (echo "Bootstrap check: done.\n") )) ) (rule (alias test) (deps (alias bootstrap)) (action (progn)) ) menhir-20200123/src/standard.mly000066400000000000000000000170501361226111300163400ustar00rootroot00000000000000/******************************************************************************/ /* */ /* Menhir */ /* */ /* François Pottier, Inria Paris */ /* Yann Régis-Gianas, PPS, Université Paris Diderot */ /* */ /* Copyright Inria. All rights reserved. This file is distributed under the */ /* terms of the GNU Library General Public License version 2, with a */ /* special exception on linking, as described in the file LICENSE. */ /* */ /******************************************************************************/ (* This is menhir's standard library. It offers a number of parameterized nonterminal definitions, such as options and lists, that should be useful in a number of circumstances. *) %% (* ------------------------------------------------------------------------- *) (* The identity. *) (* [endrule(X)] is the same as [X]. *) (* This allows placing an anonymous subrule in the middle of a rule, as in: cat endrule(dog { action1 }) cow { action2 } Because [endrule] is marked %inline, everything is expanded away. So, this is equivalent to: cat dog cow { action1; action2 } Note that [action1] moves to the end of the rule. 
The anonymous subrule can even have several branches, as in: cat endrule(dog { action1a } | fox { action1b }) cow { action2 } This is expanded to: cat dog cow { action1a; action2 } | cat fox cow { action1b; action2 } *) %public %inline endrule(X): x = X { x } (* [anonymous(X)] is a deprecated synonym for [endrule(X)]. It was never documented. *) %public %inline anonymous(X): x = X { x } (* [midrule(X)] is the same as [X]. *) (* This allows placing an anonymous subrule in the middle of a rule, as in: cat midrule(dog { action1 }) cow { action2 } Because [midrule] is not marked %inline, this is equivalent to: cat xxx cow { action2 } where the fresh nonterminal symbol [xxx] is separately defined by: xxx: dog { action1 } In particular, if there is no [dog], what we get is a semantic action embedded in the middle of a rule. For instance, cat midrule({ action1 }) cow { action2 } is equivalent to: cat xxx cow { action2 } where [xxx] is separately defined by the rule: xxx: { action1 } *) %public midrule(X): x = X { x } (* [embedded(X)] is a deprecated synonym for [midrule(X)]. It was never documented. *) %public embedded(X): x = X { x } (* ------------------------------------------------------------------------- *) (* Options. *) (* [option(X)] recognizes either nothing or [X]. It produces a value of type ['a option] if [X] produces a value of type ['a]. *) %public option(X): /* nothing */ { None } | x = X { Some x } (* [ioption(X)] is identical to [option(X)], except its definition is inlined. This has the effect of duplicating the production that refers to it, possibly eliminating an LR(1) conflict. *) %public %inline ioption(X): /* nothing */ { None } | x = X { Some x } (* [boption(X)] recognizes either nothing or [X]. It produces a value of type [bool]. *) %public boption(X): /* nothing */ { false } | X { true } (* [loption(X)] recognizes either nothing or [X]. It produces a value of type ['a list] if [X] produces a value of type ['a list]. 
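As a usage illustration of the option combinators defined above, here is a hypothetical client-grammar fragment; it is not part of the standard library, and the symbols LID, COLON, MUTABLE, and typ are assumptions:

```
(* Hypothetical grammar fragment. [ty] has type [typ option];
   [mut] has type [bool]. *)
declaration:
  name = LID; ty = ioption(preceded(COLON, typ)); mut = boption(MUTABLE)
    { (name, ty, mut) }
```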
*) %public loption(X): /* nothing */ { [] } | x = X { x } (* ------------------------------------------------------------------------- *) (* Sequences. *) (* [epsilon] recognizes the empty word. It can be used instead of the traditional /* empty */ comment. *) (* NOT YET ADDED because we first need to remove the limitation that every symbol must be reachable from the start symbol! %public %inline epsilon: /* empty */ { () } *) (* [pair(X, Y)] recognizes the sequence [X Y]. It produces a value of type ['a * 'b] if [X] and [Y] produce values of type ['a] and ['b], respectively. *) %public %inline pair(X, Y): x = X; y = Y { (x, y) } (* [separated_pair(X, sep, Y)] recognizes the sequence [X sep Y]. It produces a value of type ['a * 'b] if [X] and [Y] produce values of type ['a] and ['b], respectively. *) %public %inline separated_pair(X, sep, Y): x = X; sep; y = Y { (x, y) } (* [preceded(opening, X)] recognizes the sequence [opening X]. It passes on the value produced by [X], so that it produces a value of type ['a] if [X] produces a value of type ['a]. *) %public %inline preceded(opening, X): opening; x = X { x } (* [terminated(X, closing)] recognizes the sequence [X closing]. It passes on the value produced by [X], so that it produces a value of type ['a] if [X] produces a value of type ['a]. *) %public %inline terminated(X, closing): x = X; closing { x } (* [delimited(opening, X, closing)] recognizes the sequence [opening X closing]. It passes on the value produced by [X], so that it produces a value of type ['a] if [X] produces a value of type ['a]. *) %public %inline delimited(opening, X, closing): opening; x = X; closing { x } (* ------------------------------------------------------------------------- *) (* Lists. *) (* [list(X)] recognizes a possibly empty list of [X]'s. It produces a value of type ['a list] if [X] produces a value of type ['a]. The front element of the list is the first element that was parsed. 
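A hypothetical fragment combining the pairing and delimiting combinators above (the symbols LBRACE, COLON, RBRACE, key, and value are assumptions, not part of the standard library):

```
(* Hypothetical fragment: a braced "key : value" binding.
   [b] has type [key * value]. *)
binding:
  b = delimited(LBRACE, separated_pair(key, COLON, value), RBRACE)
    { b }
```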
*) %public list(X): /* nothing */ { [] } | x = X; xs = list(X) { x :: xs } (* [nonempty_list(X)] recognizes a nonempty list of [X]'s. It produces a value of type ['a list] if [X] produces a value of type ['a]. The front element of the list is the first element that was parsed. *) %public nonempty_list(X): x = X { [ x ] } | x = X; xs = nonempty_list(X) { x :: xs } (* [separated_list(separator, X)] recognizes a possibly empty list of [X]'s, separated with [separator]'s. It produces a value of type ['a list] if [X] produces a value of type ['a]. The front element of the list is the first element that was parsed. *) %public %inline separated_list(separator, X): xs = loption(separated_nonempty_list(separator, X)) { xs } (* [separated_nonempty_list(separator, X)] recognizes a nonempty list of [X]'s, separated with [separator]'s. It produces a value of type ['a list] if [X] produces a value of type ['a]. The front element of the list is the first element that was parsed. *) %public separated_nonempty_list(separator, X): x = X { [ x ] } | x = X; separator; xs = separated_nonempty_list(separator, X) { x :: xs } (* ------------------------------------------------------------------------- *) (* List manipulation and transformation. *) (* [rev(XS)] recognizes the same language as [XS], but reverses the resulting OCaml list. (20181005) *) %public %inline rev(XS): xs = XS { List.rev xs } (* [flatten(XSS)] recognizes the same language as [XSS], and flattens the resulting OCaml list of lists. (20181005) *) %public %inline flatten(XSS): xss = XSS { List.flatten xss } (* [append(XS, YS)] recognizes [XS YS], and appends (concatenates) the resulting OCaml lists. 
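The list combinators above compose naturally. A hypothetical client-grammar fragment (the symbols LPAREN, COMMA, RPAREN, and expr are assumptions) might read:

```
(* Hypothetical fragment: a parenthesized, comma-separated argument list.
   [args] has type [expr list]; "()" yields the empty list. *)
arguments:
  LPAREN; args = separated_list(COMMA, expr); RPAREN
    { args }
```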
(20181005) *) %public %inline append(XS, YS): xs = XS ys = YS { xs @ ys } %% menhir-20200123/src/stretch.ml000066400000000000000000000037351361226111300160300ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) (* A stretch is a fragment of a source file. It holds the file name, the line number, and the line count (that is, the length) of the fragment. These are used to generate line number directives when the fragment is copied to an output file. It also holds the textual content of the fragment, as a string. The [raw_content] field holds the text that was found in the source file, while the [content] field holds the same text after transformation by the lexer (which may substitute keywords, insert padding, insert parentheses, etc.). See [Lexer.mk_stretch] and its various call sites in [Lexer]. *) type t = { stretch_filename : string; stretch_linenum : int; stretch_linecount : int; stretch_raw_content : string; stretch_content : string; stretch_keywords : Keyword.keyword list } (* An OCaml type is either a stretch (if it was found in some source file) or a string (if it was inferred via [Infer]). *) type ocamltype = | Declared of t | Inferred of string menhir-20200123/src/stringMap.ml000066400000000000000000000027171361226111300163170ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. 
This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include Map.Make (String) let cardinal s = fold (fun _ _ x -> x + 1) s 0 let filter pred map = fold (fun key value map -> if pred key value then add key value map else map) map empty let restrict domain map = filter (fun k _ -> StringSet.mem k domain) map let domain map = fold (fun key _ acu -> StringSet.add key acu) map StringSet.empty let multiple_add k v m = let vs = try find k m with Not_found -> [] in add k (v :: vs) m menhir-20200123/src/stringMap.mli000066400000000000000000000032761361226111300164710ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include Map.S with type key = string (* [cardinal m] is the cardinal of the map [m]. *) val cardinal : 'a t -> int (* [restrict s m] restricts the domain of the map [m] to (its intersection with) the set [s]. *) val restrict: StringSet.t -> 'a t -> 'a t (* [filter pred m] restricts the domain of the map [m] to (key, value) couples that verify [pred]. *) val filter: (string -> 'a -> bool) -> 'a t -> 'a t (* [domain m] returns the domain of the map [m]. *) val domain: 'a t -> StringSet.t (* [multiple_add k v m] adds the key-value pair [k, v] to the map [m], which maps keys to *lists* of values. The list currently associated with [k] is extended with the value [v]. 
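The behavior of [multiple_add] can be checked with a small standalone sketch, using a plain Map.Make (String) in place of this module; the test values are arbitrary:

```ocaml
(* Standalone sketch of multiple_add: the map associates each key with a
   *list* of values, and each insertion conses onto the current list. *)
module M = Map.Make (String)

let multiple_add k v m =
  let vs = try M.find k m with Not_found -> [] in
  M.add k (v :: vs) m

let () =
  let m = M.empty |> multiple_add "x" 1 |> multiple_add "x" 2 in
  (* The most recently added value appears at the front of the list. *)
  assert (M.find "x" m = [2; 1])
```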
*) val multiple_add: key -> 'a -> 'a list t -> 'a list t menhir-20200123/src/stringSet.ml000066400000000000000000000020341361226111300163250ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include Set.Make (String) let of_list xs = List.fold_right add xs empty menhir-20200123/src/stringSet.mli000066400000000000000000000020221361226111300164730ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) include Set.S with type elt = string val of_list: elt list -> t menhir-20200123/src/syntax.ml000066400000000000000000000300711361226111300156730ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. 
*) (* *) (******************************************************************************) (* The type [partial_grammar] describes the abstract syntax that is produced by the parsers (yacc-parser and fancy-parser). The type [grammar] describes the abstract syntax that is obtained after one or more partial grammars are joined (see [PartialGrammar]). It differs in that declarations are organized in a more useful way and a number of well-formedness checks have been performed. *) type 'a located = 'a Positions.located (* ------------------------------------------------------------------------ *) (* Terminals and nonterminal symbols are strings. *) type terminal = string type nonterminal = string type symbol = string (* In a somewhat fragile convention, in a partial grammar, a reference to a terminal symbol either is a normal identifier [LID], in which case it is the name of the terminal symbol, or is a quoted identifier [QID], in which case it is a token alias. Token aliases are eliminated by replacing them with the corresponding terminal symbols very early on during the joining of the partial grammars -- see the function [dealias_pg] in [PartialGrammar]. In a complete grammar, there are no token aliases any longer. *) type alias = string option (* Identifiers (which are used to refer to a symbol's semantic value) are strings. *) type identifier = string (* A file name is a string. *) type filename = string (* ------------------------------------------------------------------------ *) (* A postlude is a source file fragment. *) type postlude = Stretch.t (* ------------------------------------------------------------------------ *) (* OCaml semantic actions are represented as stretches. *) type action = Action.t (* ------------------------------------------------------------------------ *) (* An attribute consists of an attribute name and an attribute payload. The payload is an uninterpreted stretch of source text. 
*) type attribute = string located * Stretch.t type attributes = attribute list (* Attributes allow the user to annotate the grammar with information that is ignored by Menhir, but can be exploited by other tools, via the SDK. *) (* Attributes can be attached in the following places: - with the grammar: %[@bar ...] - with a terminal symbol: %token FOO [@bar ...] - with a rule: foo(X) [@bar ...]: ... - with a producer: e = foo(quux) [@bar ...] - with an arbitrary symbol: %attribute FOO foo(quux) [@bar ...] After expanding away parameterized nonterminal symbols, things become a bit simpler, as %attribute declarations are desugared away. *) (* ------------------------------------------------------------------------ *) (* Information about tokens. (Only after joining.) *) type token_associativity = LeftAssoc | RightAssoc | NonAssoc | UndefinedAssoc type precedence_level = UndefinedPrecedence (* Items are incomparable when they originate in different files. A value of type [input_file] is used to record an item's origin. The positions allow locating certain warnings. *) | PrecedenceLevel of InputFile.input_file * int * Lexing.position * Lexing.position type token_properties = { tk_filename : filename; tk_ocamltype : Stretch.ocamltype option; tk_position : Positions.t; tk_attributes : attributes; mutable tk_associativity : token_associativity; mutable tk_precedence : precedence_level; mutable tk_is_declared : bool; } (* ------------------------------------------------------------------------ *) (* A [%prec] annotation is optional. A production can carry at most one. If there is one, it is a symbol name. See [ParserAux]. *) type branch_prec_annotation = symbol located option (* ------------------------------------------------------------------------ *) (* A "production level" is used to solve reduce/reduce conflicts. It reflects which production appears first in the grammar. See [ParserAux]. 
*) type branch_production_level = | ProductionLevel of InputFile.input_file * int (* ------------------------------------------------------------------------ *) (* A level is attached to every [%on_error_reduce] declaration. It is used to decide what to do when several such declarations are applicable in a single state. *) type on_error_reduce_level = branch_production_level (* we re-use the above type, to save code *) (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* The old rule syntax. Although old, still used internally. The new syntax is translated down to it. *) (* A parameter is either just a symbol or an application of a symbol to a nonempty tuple of parameters. Before anonymous rules have been eliminated, it can also be an anonymous rule, represented as a list of branches. *) type parameter = | ParameterVar of symbol located | ParameterApp of symbol located * parameters | ParameterAnonymous of parameterized_branch list located and parameters = parameter list (* ------------------------------------------------------------------------ *) (* A producer is a pair of identifier and a parameter. In concrete syntax, it could be [e = expr], for instance. It carries a number of attributes. *) and producer = identifier located * parameter * attributes (* ------------------------------------------------------------------------ *) (* A branch contains a series of producers and a semantic action. *) and parameterized_branch = { pr_branch_position : Positions.t; pr_producers : producer list; pr_action : action; pr_branch_prec_annotation : branch_prec_annotation; pr_branch_production_level : branch_production_level } (* ------------------------------------------------------------------------ *) (* A rule has a header and several branches. 
*) type parameterized_rule = { pr_public_flag : bool; pr_inline_flag : bool; pr_nt : nonterminal; pr_positions : Positions.t list; pr_attributes : attributes; pr_parameters : symbol list; pr_branches : parameterized_branch list; } (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* The new rule syntax. *) (* In the user's eyes, this replaces the old rule syntax, which corresponds to the types [parameter], [producer], [parameterized_branch], and [parameterized_rule] above. *) (* Internally, the new rule syntax is translated down to the old rule syntax; see [NewRuleSyntax]. This is done on the fly during parsing. *) type pattern = | SemPatVar of identifier located | SemPatWildcard | SemPatTilde of Positions.t | SemPatTuple of pattern list (* Patterns: as in the manual. *) type raw_action = Settings.dollars -> identifier option array -> action (* Ugly type produced by the lexer for an ACTION token. *) type expression = choice_expression located (* A toplevel expression is a choice expression. *) and choice_expression = | EChoice of branch list (* A choice expression is a list of branches. *) and branch = | Branch of seq_expression * branch_production_level (* A branch is a sequence expression, plus an ugly [branch_production_level]. *) and seq_expression = raw_seq_expression located and raw_seq_expression = | ECons of pattern * symbol_expression * seq_expression | ESingleton of symbol_expression | EAction of extended_action * branch_prec_annotation (* A sequence is either a cons [p = e1; e2] or a lone symbol expression [e] or a semantic action. *) and symbol_expression = | ESymbol of symbol located * expression list * attributes (* A symbol expression is a symbol, possibly accompanied with actual parameters and attributes. *) and extended_action = | XATraditional of raw_action | XAPointFree of Stretch.t option (* A semantic action is either traditional { ... 
} or point-free. There are two forms of point-free actions, <> and . In the latter case, [id] is an OCaml identifier. *) type rule = { rule_public: bool; rule_inline: bool; rule_lhs: symbol located; rule_attributes: attributes; rule_formals: symbol located list; rule_rhs: expression; } (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* A declaration. (Only before joining.) *) type declaration = (* Raw OCaml code. *) | DCode of Stretch.t (* Raw OCaml functor parameter. *) | DParameter of Stretch.ocamltype (* really a stretch *) (* Terminal symbol (token) declaration. *) | DToken of Stretch.ocamltype option * terminal * alias * attributes (* Start symbol declaration. *) | DStart of nonterminal (* Priority and associativity declaration. *) | DTokenProperties of terminal * token_associativity * precedence_level (* Type declaration. *) | DType of Stretch.ocamltype * parameter (* Grammar-level attribute declaration. *) | DGrammarAttribute of attribute (* Attributes shared among multiple symbols, i.e., [%attribute]. *) | DSymbolAttributes of parameter list * attributes (* On-error-reduce declaration. *) | DOnErrorReduce of parameter * on_error_reduce_level (* ------------------------------------------------------------------------ *) (* A partial grammar. (Only before joining.) *) type partial_grammar = { pg_filename : filename; pg_postlude : postlude option; pg_declarations : declaration located list; pg_rules : parameterized_rule list; } (* ------------------------------------------------------------------------ *) (* ------------------------------------------------------------------------ *) (* A grammar. (Only after joining.) *) (* The differences with partial grammars (above) are as follows: 1. the file name is gone (there could be several file names, anyway). 2. there can be several postludes. 3. 
declarations are organized by kind: preludes, postludes, functor %parameters, %start symbols, %types, %tokens, %on_error_reduce, grammar attributes, %attributes. 4. rules are stored in a map, indexed by symbol names, instead of a list. 5. token aliases have been replaced with ordinary named terminal symbols. *) type grammar = { p_preludes : Stretch.t list; p_postludes : postlude list; p_parameters : Stretch.t list; p_start_symbols : Positions.t StringMap.t; p_types : (parameter * Stretch.ocamltype located) list; p_tokens : token_properties StringMap.t; p_on_error_reduce : (parameter * on_error_reduce_level) list; p_grammar_attributes : attributes; p_symbol_attributes : (parameter list * attributes) list; p_rules : parameterized_rule StringMap.t; } menhir-20200123/src/tableBackend.ml000066400000000000000000000734711361226111300167170ustar00rootroot00000000000000(******************************************************************************) (* *) (* Menhir *) (* *) (* François Pottier, Inria Paris *) (* Yann Régis-Gianas, PPS, Université Paris Diderot *) (* *) (* Copyright Inria. All rights reserved. This file is distributed under the *) (* terms of the GNU General Public License version 2, as described in the *) (* file LICENSE. *) (* *) (******************************************************************************) open CodeBits open Grammar open IL open Interface open Printf open TokenType open NonterminalType open CodePieces module Run (T : sig end) = struct (* ------------------------------------------------------------------------ *) (* Conventional names for modules, exceptions, record fields, functions. 
*) let menhirlib = "MenhirLib" let make_engine_table = menhirlib ^ ".TableInterpreter.MakeEngineTable" let make_engine = menhirlib ^ ".Engine.Make" let make_symbol = menhirlib ^ ".InspectionTableInterpreter.Symbols" let make_inspection = menhirlib ^ ".InspectionTableInterpreter.Make" let engineTypes = menhirlib ^ ".EngineTypes" let field x = engineTypes ^ "." ^ x let fstate = field "state" let fsemv = field "semv" let fstartp = field "startp" let fendp = field "endp" let fnext = field "next" let fstack = field "stack" let fcurrent = field "current" let entry = interpreter ^ ".entry" let start = interpreter ^ ".start" let staticVersion = menhirlib ^ ".StaticVersion" (* The following are names of internal sub-modules. *) let tables = "Tables" let symbols = "Symbols" let et = "ET" let ti = "TI" (* ------------------------------------------------------------------------ *) (* Statistics. *) (* Integer division, rounded up. *) let div a b = if a mod b = 0 then a / b else a / b + 1 (* [size] provides a rough measure of the size of its argument, in words. The [unboxed] parameter is true if we have already counted 1 for the pointer to the object. *) let rec size unboxed = function | EIntConst _ | ETuple [] | EData (_, []) -> if unboxed then 0 else 1 | EStringConst s -> 1 + div (String.length s * 8) Sys.word_size | ETuple es | EData (_, es) | EArray es -> 1 + List.length es + List.fold_left (fun s e -> s + size true e) 0 es | _ -> assert false (* not implemented *) let size = size false (* Optionally, print a measure of each of the tables that we are defining. *) let define (name, expr) = { valpublic = true; valpat = PVar name; valval = expr } let define_and_measure (x, e) = Error.logC 1 (fun f -> fprintf f "The %s table occupies roughly %d bytes.\n" x (size e * (Sys.word_size / 8)) ); define (x, e) (* ------------------------------------------------------------------------ *) (* Code generation for semantic actions. 
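The rounded-up integer division used by the statistics code above can be sanity-checked in isolation (the test values are arbitrary):

```ocaml
(* Integer division, rounded up: for positive a and b, [div a b] is the
   least n such that n * b >= a. *)
let div a b = if a mod b = 0 then a / b else a / b + 1

let () =
  assert (div 6 3 = 2);  (* exact division *)
  assert (div 7 3 = 3)   (* rounded up *)
```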
*) (* The functions [reducecellparams] and [reducebody] are adapted from [CodeBackend]. *) (* Things are slightly more regular here than in the code-based back-end, since there is no optimization: every stack cell has the same structure and holds a state, a semantic value, and a pair of positions. Because every semantic value is represented, we do not have a separate [unitbindings]. *) (* [reducecellparams] constructs a pattern that describes the contents of a stack cell. If this is the bottom cell, the variable [state] is bound to the state found in the cell. If [ids.(i)] is used in the semantic action, then it is bound to the semantic value. The position variables are always bound. *) let reducecellparams prod i _symbol (next : pattern) : pattern = let ids = Production.identifiers prod in PRecord [ fstate, (if i = 0 then PVar state else PWildcard); fsemv, PVar ids.(i); fstartp, PVar (Printf.sprintf "_startpos_%s_" ids.(i)); fendp, PVar (Printf.sprintf "_endpos_%s_" ids.(i)); fnext, next; ] (* The semantic values bound in [reducecellparams] have type [Obj.t]. They should now be cast to their real type. If we had [PMagic] in the syntax of patterns, we could do that in one swoop; since we don't, we have to issue a series of casts a posteriori. *) let reducecellcasts prod i symbol casts = let ids = Production.identifiers prod in let id = ids.(i) in let t : typ = match semvtype symbol with | [] -> tunit | [ t ] -> t | _ -> assert false in (* Cast: [let id = ((Obj.magic id) : t) in ...]. *) ( PVar id, EAnnot (EMagic (EVar id), type2scheme t) ) :: casts (* 2015/11/04. The start and end positions of an epsilon production are obtained by taking the end position stored in the top stack cell (whatever it is). *) let endpos_of_top_stack_cell = ERecordAccess(EVar stack, fendp) (* This is the body of the [reduce] function associated with production [prod]. It assumes that the variables [env] and [stack] have been bound. 
*) let reducebody prod = let nt, rhs = Production.def prod and ids = Production.identifiers prod and length = Production.length prod in (* Build a pattern that represents the shape of the stack. Out of the stack, we extract a state (except when the production is an epsilon production) and a number of semantic values. *) (* At the same time, build a series of casts. *) (* We want a [fold] that begins with the deepest cells in the stack. Folding from left to right on [rhs] is appropriate. *) let (_ : int), pat, casts = Array.fold_left (fun (i, pat, casts) symbol -> i + 1, reducecellparams prod i symbol pat, reducecellcasts prod i symbol casts ) (0, PVar stack, []) rhs in (* Determine beforeend/start/end positions for the left-hand side of the production, and bind them to the conventional variables [beforeendp], [startp], and [endp]. These variables may be unused by the semantic action, in which case these bindings are dead code and can be ignored by the OCaml compiler. *) let posbindings = ( PVar beforeendp, endpos_of_top_stack_cell ) :: ( PVar startp, if length > 0 then EVar (Printf.sprintf "_startpos_%s_" ids.(0)) else endpos_of_top_stack_cell ) :: ( PVar endp, if length > 0 then EVar (Printf.sprintf "_endpos_%s_" ids.(length - 1)) else EVar startp ) :: [] in (* This cannot be one of the start productions. *) assert (not (Production.is_start prod)); (* This is a regular production. Perform a reduction. *) let action = Production.action prod in let act = EAnnot (Action.to_il_expr action, type2scheme (semvtypent nt)) in EComment ( Production.print prod, blet ( (pat, EVar stack) :: (* destructure the stack *) casts @ (* perform type casts *) posbindings @ (* bind [startp] and [endp] *) [ PVar semv, act ], (* run the user's code and bind [semv] *) (* Return a new stack, onto which we have pushed a new stack cell. 
*)

      ERecord [
        (* the new stack cell *)
        fstate, EVar state;        (* the current state after popping; it will be updated by [goto] *)
        fsemv, ERepr (EVar semv);  (* the newly computed semantic value *)
        fstartp, EVar startp;      (* the newly computed start and end positions *)
        fendp, EVar endp;
        fnext, EVar stack;         (* this is the stack after popping *)
      ]
    )
  )

let semantic_action prod =
  EFun (
    [ PVar env ],

    (* Access the stack and current state via the environment. *)

    (* In fact, the current state needs be bound here only if this is an
       epsilon production. Otherwise, the variable [state] will be bound by
       the pattern produced by [reducecellparams] above. *)

    ELet (
      [ PVar stack, ERecordAccess (EVar env, fstack) ] @
      (if Production.length prod = 0 then
         [ PVar state, ERecordAccess (EVar env, fcurrent) ]
       else
         []),
      reducebody prod
    )
  )

(* Export the number of start productions. *)

let start_def =
  define (
    "start",
    EIntConst Production.start
  )

(* ------------------------------------------------------------------------ *)

(* Table encodings. *)

(* Encodings of entries in the default reduction table. *)

let encode_DefRed prod =            (* 1 + prod *)
  1 + Production.p2i prod

let encode_NoDefRed =               (* 0 *)
  0

(* Encodings of entries in the action table. *)

let encode_Reduce prod =            (* prod | 01 *)
  (Production.p2i prod lsl 2) lor 1

let encode_ShiftDiscard s =         (* s | 10 *)
  ((Lr1.number s) lsl 2) lor 0b10

let encode_ShiftNoDiscard s =       (* s | 11 *)
  ((Lr1.number s) lsl 2) lor 0b11

let encode_Fail =                   (* 00 *)
  0

(* Encodings of entries in the goto table. *)

let encode_Goto node =              (* 1 + node *)
  1 + Lr1.number node

let encode_NoGoto =                 (* 0 *)
  0

(* Encodings of the hole in the action and goto tables. *)

let hole =
  assert (encode_Fail = 0);
  assert (encode_NoGoto = 0);
  0

(* Encodings of entries in the error bitmap. *)

let encode_Error =                  (* 0 *)
  0

let encode_NoError =                (* 1 *)
  1

(* Encodings of terminal and nonterminal symbols in the production table.
*)

let encode_no_symbol =
  0                                          (* 0 | 0 *)

let encode_terminal tok =
  (Terminal.t2i tok + 1) lsl 1               (* t + 1 | 0 *)

let encode_nonterminal nt =
  ((Nonterminal.n2i nt) lsl 1) lor 1         (* nt | 1 *)

let encode_symbol = function
  | Symbol.T tok ->
      encode_terminal tok
  | Symbol.N nt ->
      encode_nonterminal nt

let encode_symbol_option = function
  | None ->
      encode_no_symbol
  | Some symbol ->
      encode_symbol symbol

(* Encoding a Boolean as an integer value. *)

let encode_bool b =
  if b then 1 else 0

(* ------------------------------------------------------------------------ *)

(* Table compression. *)

(* Our sparse, two-dimensional tables are turned into one-dimensional tables
   via [RowDisplacement]. *)

(* The error bitmap, which is two-dimensional but not sparse, is made
   one-dimensional by simple flattening. *)

(* Every one-dimensional table is then packed via [PackedIntArray]. *)

(* Optionally, we print some information about the compression ratio. *)

(* [population] counts the number of significant entries in a
   two-dimensional matrix. *)

let population (matrix : int array array) =
  Array.fold_left (fun population row ->
    Array.fold_left (fun population entry ->
      if entry = hole then population else population + 1
    ) population row
  ) 0 matrix

(* [marshal1] marshals a one-dimensional array. *)

let marshal1 (table : int array) =
  let (bits : int), (text : string) = MenhirLib.PackedIntArray.pack table in
  ETuple [ EIntConst bits; EStringConst text ]

(* [marshal11] marshals a one-dimensional array whose bit width is
   statically known to be [1]. *)

let marshal11 (table : int array) =
  let (bits : int), (text : string) = MenhirLib.PackedIntArray.pack table in
  assert (bits = 1);
  EStringConst text

(* List-based versions of the above functions. *)

let marshal1_list (table : int list) =
  marshal1 (Array.of_list table)

let marshal11_list (table : int list) =
  marshal11 (Array.of_list table)

(* [linearize_and_marshal1] marshals an array of integer arrays (of
   possibly different lengths).
*)

let linearize_and_marshal1 (table : int array array) =
  let data, entry = MenhirLib.LinearizedArray.make table in
  ETuple [ marshal1 data; marshal1 entry ]

(* [flatten_and_marshal11_list] marshals a two-dimensional bitmap, whose
   width (for now) is assumed to be [Terminal.n - 1]. *)

let flatten_and_marshal11_list (table : int list list) =
  ETuple [
    (* Store the table width. *)
    EIntConst (Terminal.n - 1);
    (* View the table as a one-dimensional array, and marshal it. *)
    marshal11_list (List.flatten table)
  ]

(* [marshal2] marshals a two-dimensional table, with row displacement. *)

let marshal2 name m n (matrix : int list list) =
  let matrix : int array array =
    Array.of_list (List.map Array.of_list matrix)
  in
  let (displacement : int array), (data : int array) =
    MenhirLib.RowDisplacement.compress
      (=)
      (fun x -> x = hole)
      hole
      m n matrix
  in
  Error.logC 1 (fun f ->
    fprintf f
      "The %s table is %d entries; %d non-zero; %d compressed.\n"
      name
      (m * n)
      (population matrix)
      (Array.length displacement + Array.length data)
  );
  ETuple [
    marshal1 displacement;
    marshal1 data;
  ]

(* ------------------------------------------------------------------------ *)

(* Table generation. *)

(* The action table. *)

let action node t =
  match Default.has_default_reduction node with
  | Some _ ->
      (* [node] has a default reduction; in that case, the action table is
         never looked up. *)
      hole
  | None ->
      try
        let target = SymbolMap.find (Symbol.T t) (Lr1.transitions node) in
        (* [node] has a transition to [target]. If [target] has a default
           reduction on [#], use [ShiftNoDiscard], otherwise
           [ShiftDiscard]. *)
        match Default.has_default_reduction target with
        | Some (_, toks) when TerminalSet.mem Terminal.sharp toks ->
            assert (TerminalSet.cardinal toks = 1);
            encode_ShiftNoDiscard target
        | _ ->
            encode_ShiftDiscard target
      with Not_found ->
        try
          (* [node] has a reduction. *)
          let prod = Misc.single (TerminalMap.find t (Lr1.reductions node)) in
          encode_Reduce prod
        with Not_found ->
          (* [node] has no action.
*)
          encode_Fail

(* In the error bitmap and in the action table, the row that corresponds to
   the [#] pseudo-terminal is never accessed. Thus, we do not create this
   row. This does not create a gap in the table, because this is the
   right-most row. For sanity, we check this fact here. *)

let () =
  assert (Terminal.t2i Terminal.sharp = Terminal.n - 1)

(* The goto table. *)

let goto node nt =
  try
    let target = SymbolMap.find (Symbol.N nt) (Lr1.transitions node) in
    encode_Goto target
  with Not_found ->
    encode_NoGoto

(* The error bitmap reflects which entries in the action table are [Fail].
   Like the action table, it is not accessed when [node] has a default
   reduction. *)

let error node t =
  if action node t = encode_Fail then
    encode_Error
  else
    encode_NoError

(* The default reductions table. *)

let default_reduction node =
  match Default.has_default_reduction node with
  | Some (prod, _) ->
      encode_DefRed prod
  | None ->
      encode_NoDefRed

(* Generate the table definitions. *)

let action =
  define_and_measure (
    "action",
    marshal2 "action" Lr1.n (Terminal.n - 1) (
      Lr1.map (fun node ->
        Terminal.mapx (fun t ->
          action node t
        )
      )
    )
  )

let goto =
  define_and_measure (
    "goto",
    marshal2 "goto" Lr1.n Nonterminal.n (
      Lr1.map (fun node ->
        Nonterminal.map (fun nt ->
          goto node nt
        )
      )
    )
  )

let error =
  define_and_measure (
    "error",
    flatten_and_marshal11_list (
      Lr1.map (fun node ->
        Terminal.mapx (fun t ->
          error node t
        )
      )
    )
  )

let default_reduction =
  define_and_measure (
    "default_reduction",
    marshal1_list (
      Lr1.map (fun node ->
        default_reduction node
      )
    )
  )

let lhs =
  define_and_measure (
    "lhs",
    marshal1 (
      Production.amap (fun prod ->
        Nonterminal.n2i (Production.nt prod)
      )
    )
  )

let semantic_action =
  define (
    "semantic_action",
    (* Non-start productions only. *)
    EArray (Production.mapx semantic_action)
  )

(* ------------------------------------------------------------------------ *)

(* When [--trace] is enabled, we need tables that map terminals and
   productions to strings.
*)

let stringwrap f x =
  EStringConst (f x)

let reduce_or_accept prod =
  match Production.classify prod with
  | Some _ ->
      "Accepting"
  | None ->
      "Reducing production " ^ (Production.print prod)

let trace =
  define_and_measure (
    "trace",
    if Settings.trace then
      EData ("Some", [
        ETuple [
          EArray (Terminal.map (stringwrap Terminal.print));
          EArray (Production.map (stringwrap reduce_or_accept));
        ]
      ])
    else
      EData ("None", [])
  )

(* ------------------------------------------------------------------------ *)

(* Generate the two functions that map a token to its integer code and to
   its semantic value, respectively. *)

let token2terminal =
  destructuretokendef
    "token2terminal"
    tint
    false
    (fun tok -> EIntConst (Terminal.t2i tok))

let token2value =
  destructuretokendef
    "token2value"
    tobj
    true
    (fun tok ->
      ERepr (
        match Terminal.ocamltype tok with
        | None ->
            EUnit
        | Some _ ->
            EVar semv
      )
    )

(* ------------------------------------------------------------------------ *)

(* The client APIs invoke the interpreter with an appropriate start state.
   The monolithic API calls [entry] (see [Engine]), while the incremental
   API calls [start]. *)

(* An entry point to the monolithic API. *)

let monolithic_entry_point state nt t =
  define (
    Nonterminal.print true nt,
    let lexer = "lexer"
    and lexbuf = "lexbuf" in
    EFun (
      [ PVar lexer; PVar lexbuf ],
      EAnnot (
        EMagic (
          EApp (
            EVar entry, [
              EIntConst (Lr1.number state);
              EVar lexer;
              EVar lexbuf
            ]
          )
        ),
        type2scheme (TypTextual t)
      )
    )
  )

(* The whole monolithic API. *)

let monolithic_api : IL.valdef list =
  Lr1.fold_entry (fun _prod state nt t api ->
    monolithic_entry_point state nt t ::
    api
  ) []

(* An entry point to the incremental API. *)

let incremental_entry_point state nt t =
  let initial = "initial_position" in
  define (
    Nonterminal.print true nt,
    (* In principle the eta-expansion [fun initial_position -> start s
       initial_position] should not be necessary, since [start] is a pure
       function.
However, when [--trace] is enabled, [start] will log messages
       to the standard error channel. *)
    EFun (
      [ PVar initial ],
      EAnnot (
        EMagic (
          EApp (
            EVar start, [
              EIntConst (Lr1.number state);
              EVar initial;
            ]
          )
        ),
        type2scheme (checkpoint (TypTextual t))
      )
    )
  )

(* The whole incremental API. *)

let incremental_api : IL.valdef list =
  Lr1.fold_entry (fun _prod state nt t api ->
    incremental_entry_point state nt t ::
    api
  ) []

(* ------------------------------------------------------------------------ *)

(* Constructing representations of symbols. *)

(* [eterminal t] is a value of type ['a terminal] (for some ['a]) that
   encodes the terminal symbol [t]. It is just a data constructor of the
   terminal GADT. *)

let eterminal (t : Terminal.t) : expr =
  EData (tokengadtdata (Terminal.print t), [])

(* [enonterminal nt] is a value of type ['a nonterminal] (for some ['a])
   that encodes the nonterminal symbol [nt]. It is just a data constructor
   of the nonterminal GADT. *)

let enonterminal (nt : Nonterminal.t) : expr =
  EData (tnonterminalgadtdata (Nonterminal.print false nt), [])

(* [esymbol symbol] is a value of type ['a symbol] (for some ['a]) that
   encodes the symbol [symbol]. It is built by applying the injection [T]
   or [N] to the terminal or nonterminal encoding. *)

let dataT = "T"
let dataN = "N"

let esymbol (symbol : Symbol.t) : expr =
  match symbol with
  | Symbol.T t ->
      EData (dataT, [ eterminal t ])
  | Symbol.N nt ->
      EData (dataN, [ enonterminal nt ])

(* [xsymbol symbol] is a value of type [xsymbol] that encodes the symbol
   [symbol]. It is built by applying the injection [X] (an existential
   quantifier) to [esymbol symbol]. *)

let dataX = "X"

let xsymbol (symbol : Symbol.t) : expr =
  EData (dataX, [ esymbol symbol ])

(* ------------------------------------------------------------------------ *)

(* Produce a function that maps a terminal symbol (represented as an
   integer code) to its representation as an [xsymbol].
Include [error] but
   not [#], i.e., include all of the symbols which can appear in a
   production. *)

(* Note that, instead of generating a function, we could (a) use an array
   or (b) use an unsafe conversion of an integer to a data constructor,
   then wrap it using [X] and [T/N]. Approach (b) is unsafe and causes
   memory allocation (due to the wrapping) at each call. *)

let terminal () =
  assert Settings.inspection;
  let t = "t" in
  define (
    "terminal",
    EFun ([ PVar t ],
      EMatch (EVar t,
        Terminal.mapx (fun tok -> {
          branchpat = pint (Terminal.t2i tok);
          branchbody = xsymbol (Symbol.T tok)
        }) @ [{
          branchpat = PWildcard;
          branchbody =
            EComment ("This terminal symbol does not exist.",
              EApp (EVar "assert", [ efalse ])
            )
        }]
      )
    )
  )

(* ------------------------------------------------------------------------ *)

(* Produce a function that maps a (non-start) nonterminal symbol
   (represented as an integer code) to its representation as an
   [xsymbol]. *)

let nonterminal () =
  assert Settings.inspection;
  let nt = "nt" in
  define (
    "nonterminal",
    EFun ([ PVar nt ],
      EMatch (EVar nt,
        Nonterminal.foldx (fun nt branches -> {
          branchpat = pint (Nonterminal.n2i nt);
          branchbody = xsymbol (Symbol.N nt)
        } :: branches) [{
          branchpat = PWildcard;
          branchbody =
            EComment ("This nonterminal symbol does not exist.",
              EApp (EVar "assert", [ efalse ])
            )
        }]
      )
    )
  )

(* ------------------------------------------------------------------------ *)

(* Produce a mapping of every LR(0) state to its incoming symbol (encoded
   as an integer value). (Note that the initial states do not have one.) *)

let lr0_incoming () =
  assert Settings.inspection;
  define_and_measure (
    "lr0_incoming",
    marshal1 (Array.init Lr0.n (fun node ->
      encode_symbol_option (Lr0.incoming_symbol node)
    ))
  )

(* ------------------------------------------------------------------------ *)

(* A table that maps a production (i.e., an integer index) to the
   production's right-hand side.
In principle, we use this table for
   ordinary productions only, as opposed to the start productions, whose
   existence is not exposed to the user. However, it is simpler (and not
   really costly) to include all productions in this table. *)

let rhs () =
  assert Settings.inspection;
  let productions : int array array =
    Production.amap (fun prod ->
      Array.map encode_symbol (Production.rhs prod)
    )
  in
  define_and_measure (
    "rhs",
    linearize_and_marshal1 productions
  )

(* ------------------------------------------------------------------------ *)

(* A table that maps an LR(1) state to its LR(0) core. *)

let lr0_core () =
  assert Settings.inspection;
  define_and_measure (
    "lr0_core",
    marshal1_list (Lr1.map (fun (node : Lr1.node) ->
      Lr0.core (Lr1.state node)
    ))
  )

(* A table that maps an LR(0) state to a set of LR(0) items. *)

let lr0_items () =
  assert Settings.inspection;
  let items : int array array =
    Array.init Lr0.n (fun node ->
      Array.map Item.marshal (Array.of_list (Item.Set.elements (Lr0.items node)))
    )
  in
  define_and_measure (
    "lr0_items",
    linearize_and_marshal1 items
  )

(* ------------------------------------------------------------------------ *)

(* A table that tells which nonterminal symbols are nullable. (For
   simplicity, this table includes the start symbols.) *)

let nullable () =
  assert Settings.inspection;
  define_and_measure (
    "nullable",
    marshal11_list (
      Nonterminal.map (fun nt ->
        encode_bool (Analysis.nullable nt)
      )
    )
  )

(* ------------------------------------------------------------------------ *)

(* A two-dimensional bitmap, indexed first by nonterminal symbols, then by
   terminal symbols, encodes the FIRST sets.
*)

let first () =
  assert Settings.inspection;
  define_and_measure (
    "first",
    flatten_and_marshal11_list (
      Nonterminal.map (fun nt ->
        Terminal.mapx (fun t ->
          encode_bool (TerminalSet.mem t (Analysis.first nt))
        )
      )
    )
  )

(* ------------------------------------------------------------------------ *)

(* A reference to [MenhirLib.StaticVersion.require_XXXXXXXX], where
   [XXXXXXXX] is our 8-digit version number. This ensures that the
   generated code can be linked only with an appropriate version of
   MenhirLib. This is important because we use unsafe casts, and a version
   mismatch could cause a crash. *)

let versiondef = {
  valpublic = true;
  valpat = PUnit;
  valval = EVar (staticVersion ^ ".require_" ^ Version.version);
}

(* ------------------------------------------------------------------------ *)

(* Let's put everything together. *)

open BasicSyntax

let grammar =
  Front.grammar

let program =

  [ SIFunctor (grammar.parameters,

    (* Make a reference to [MenhirLib.StaticVersion.require_XXXXXXXX],
       where [XXXXXXXX] is our 8-digit version number. This ensures that
       the generated code can be linked only with an appropriate version of
       MenhirLib. This is important because we use unsafe casts, and a
       version mismatch could cause a crash. *)

    SIComment "This generated code requires the following version of MenhirLib:" ::
    SIValDefs (false, [ versiondef ]) ::

    (* Define the internal sub-module [basics], which contains the
       definitions of the exception [Error] and of the type [token]. Then,
       include this sub-module. This sub-module is used again below, as
       part of the application of the functor [TableInterpreter.Make]. *)

    mbasics grammar @

    (* In order to avoid hiding user-defined identifiers, only the
       exception [Error] and the type [token] should be defined (at top
       level, with non-mangled names) above this line. We also define the
       value [_eRR] above this line so that we do not have a problem if a
       user prelude hides the name [Error]. *)

    SIStretch grammar.preludes ::

    (* Define the tables.
*)

    SIModuleDef (tables,
      MStruct [
        (* The internal sub-module [basics] contains the definitions of the
           exception [Error] and of the type [token]. *)
        SIInclude (MVar basics);

        (* This is a non-recursive definition, so none of the names defined
           here are visible in the semantic actions. *)
        SIValDefs (false, [
          token2terminal;
          define ("error_terminal", EIntConst (Terminal.t2i Terminal.error));
          token2value;
          default_reduction;
          error;
          start_def;
          action;
          lhs;
          goto;
          semantic_action;
          trace;
        ])
      ]
    ) ::

    SIModuleDef (interpreter, MStruct (

      (* Apply the functor [TableInterpreter.MakeEngineTable] to the
         tables. *)
      SIModuleDef (et, MApp (MVar make_engine_table, MVar tables)) ::

      (* Apply the functor [Engine.Make] to obtain an engine. *)
      SIModuleDef (ti, MApp (MVar make_engine, MVar et)) ::
      SIInclude (MVar ti) ::

      listiflazy Settings.inspection (fun () ->

        (* Define the internal sub-module [symbols], which contains type
           definitions. Then, include this sub-module. This sub-module is
           used again below, as part of the application of the functor
           [TableInterpreter.MakeInspection]. *)

        SIModuleDef (symbols, MStruct (
          interface_to_structure (
            tokengadtdef grammar @
            nonterminalgadtdef grammar
          )
        )) ::

        SIInclude (MVar symbols) ::

        (* Apply the functor [InspectionTableInterpreter.Make], which
           expects four arguments. *)
        SIInclude (mapp (MVar make_inspection) [

          (* Argument 1, of type [TableFormat.TABLES]. *)
          MVar tables;

          (* Argument 2, of type [InspectionTableFormat.TABLES]. *)
          MStruct (
            (* [lr1state] *)
            SIInclude (MVar ti) ::
            (* [terminal], [nonterminal]. *)
            SIInclude (MVar symbols) ::
            (* This functor application builds the types [symbol] and
               [xsymbol] in terms of the types [terminal] and
               [nonterminal]. This saves us the trouble of generating these
               definitions. *)
            SIInclude (MApp (MVar make_symbol, MVar symbols)) ::
            SIValDefs (false,
              terminal() ::
              nonterminal() ::
              lr0_incoming() ::
              rhs() ::
              lr0_core() ::
              lr0_items() ::
              nullable() ::
              first() ::
              []
            ) ::
            []
          );

          (* Argument 3, of type [EngineTypes.TABLE].
*)
          MVar et;

          (* Argument 4, of type [EngineTypes.ENGINE with ...]. *)
          MVar ti;

        ]) ::
        []
      )
    )) ::

    SIValDefs (false, monolithic_api) ::

    SIModuleDef (incremental, MStruct [
      SIValDefs (false, incremental_api)
    ]) ::

    SIStretch grammar.postludes ::

  [])]

let () =
  Time.tick "Producing abstract syntax"

end

--- menhir-20200123/src/tableBackend.mli ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* The (table-based) code generator. *)

module Run (T : sig end) : sig
  val program: IL.program
end

--- menhir-20200123/src/tarjan.ml ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module provides an implementation of Tarjan's algorithm for finding
   the strongly connected components of a graph. The algorithm runs when the
   functor is applied. Its complexity is $O(V+E)$, where $V$ is the number of
   vertices in the graph $G$, and $E$ is the number of edges. *)

module Run (G : sig

  type node

  (* We assume each node has a unique index. Indices must range from $0$ to
     $n-1$, where $n$ is the number of nodes in the graph.
*)

  val n: int
  val index: node -> int

  (* Iterating over a node's immediate successors. *)

  val successors: (node -> unit) -> node -> unit

  (* Iterating over all nodes. *)

  val iter: (node -> unit) -> unit

end) = struct

  (* Define the internal data structure associated with each node. *)

  type data = {

    (* Each node carries a flag which tells whether it appears within the
       SCC stack (which is defined below). *)

    mutable stacked: bool;

    (* Each node carries a number. Numbers represent the order in which
       nodes were discovered. *)

    mutable number: int;

    (* Each node [x] records the lowest number associated to a node already
       detected within [x]'s SCC. *)

    mutable low: int;

    (* Each node carries a pointer to a representative element of its SCC.
       This field is used by the algorithm to store its results. *)

    mutable representative: G.node;

    (* Each representative node carries a list of the nodes in its SCC.
       This field is used by the algorithm to store its results. *)

    mutable scc: G.node list

  }

  (* Define a mapping from external nodes to internal ones. Here, we simply
     use each node's index as an entry into a global array. *)

  let table =

    (* Create the array. We initially fill it with [None], of type
       [data option], because we have no meaningful initial value of type
       [data] at hand. *)

    let table = Array.make G.n None in

    (* Initialize the array. *)

    G.iter (fun x ->
      table.(G.index x) <- Some {
        stacked = false;
        number = 0;
        low = 0;
        representative = x;
        scc = []
      }
    );

    (* Define a function which gives easy access to the array. It maps each
       node to its associated piece of internal data. *)

    function x ->
      match table.(G.index x) with
      | Some dx ->
          dx
      | None ->
          assert false
          (* Indices do not cover the range $0\ldots n$, as expected. *)

  (* Create an empty stack, used to record all nodes which belong to the
     current SCC. *)

  let scc_stack =
    Stack.create()

  (* Initialize a function which allocates numbers for (internal) nodes. A
     new number is assigned to each node the first time it is visited.
Numbers returned by this function start at 1 and increase. Initially,
     all nodes have number 0, so they are considered unvisited. *)

  let mark =
    let counter = ref 0 in
    fun dx ->
      incr counter;
      dx.number <- !counter;
      dx.low <- !counter

  (* This reference will hold a list of all representative nodes. *)

  let representatives =
    ref []

  (* Look at all nodes of the graph, one after the other. Any unvisited
     nodes become roots of the search forest. *)

  let () =
    G.iter (fun root ->
      let droot = table root in

      if droot.number = 0 then begin

        (* This node hasn't been visited yet. Start a depth-first walk from
           it. *)

        mark droot;
        droot.stacked <- true;
        Stack.push droot scc_stack;

        let rec walk x =
          let dx = table x in

          G.successors (fun y ->
            let dy = table y in

            if dy.number = 0 then begin

              (* $y$ hasn't been visited yet, so $(x,y)$ is a regular edge,
                 part of the search forest. *)

              mark dy;
              dy.stacked <- true;
              Stack.push dy scc_stack;

              (* Continue walking, depth-first. *)

              walk y;
              if dy.low < dx.low then
                dx.low <- dy.low

            end
            else if (dy.low < dx.low) && dy.stacked then begin

              (* The first condition above indicates that $y$ has been
                 visited before $x$, so $(x, y)$ is a backwards or
                 transverse edge. The second condition indicates that $y$
                 is inside the same SCC as $x$; indeed, if it belongs to
                 another SCC, then the latter has already been identified
                 and moved out of [scc_stack]. *)

              if dy.number < dx.low then
                dx.low <- dy.number

            end

          ) x;

          (* We are done visiting $x$'s neighbors. *)

          if dx.low = dx.number then begin

            (* $x$ is the entry point of a SCC. The whole SCC is now
               available; move it out of the stack. We pop elements out of
               the SCC stack until $x$ itself is found. *)

            let rec loop () =
              let element = Stack.pop scc_stack in
              element.stacked <- false;
              dx.scc <- element.representative :: dx.scc;
              element.representative <- x;
              if element != dx then
                loop()
            in
            loop();
            representatives := x :: !representatives

          end

        in
        walk root

      end

    )

  (* There only remains to make our results accessible to the outside.
*)

  let representative x =
    (table x).representative

  let scc x =
    (table x).scc

  let iter action =
    List.iter (fun x ->
      let data = table x in
      assert (data.representative == x); (* a sanity check *)
      assert (data.scc <> []);           (* a sanity check *)
      action x data.scc
    ) !representatives

end

--- menhir-20200123/src/tarjan.mli ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module provides an implementation of Tarjan's algorithm for finding
   the strongly connected components of a graph. The algorithm runs when the
   functor is applied. Its complexity is $O(V+E)$, where $V$ is the number of
   vertices in the graph $G$, and $E$ is the number of edges. *)

module Run (G : sig

  type node

  (* We assume each node has a unique index. Indices must range from $0$ to
     $n-1$, where $n$ is the number of nodes in the graph. *)

  val n: int
  val index: node -> int

  (* Iterating over a node's immediate successors. *)

  val successors: (node -> unit) -> node -> unit

  (* Iterating over all nodes. *)

  val iter: (node -> unit) -> unit

end) : sig

  open G

  (* This function maps each node to a representative element of its
     strongly connected component. *)

  val representative: node -> node

  (* This function maps each representative element to a list of all
     members of its strongly connected component. Non-representative
     elements are mapped to an empty list. *)

  val scc: node -> node list

  (* [iter action] allows iterating over all strongly connected components.
For each component, the [action] function is applied to the
     representative element and to a (non-empty) list of all elements. *)

  val iter: (node -> node list -> unit) -> unit

end

--- menhir-20200123/src/time.ml ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

let channel =
  stderr

open Unix
open Printf

let clock =
  ref (times())

let tick msg =
  if Settings.timings then
    let times1 = !clock in
    let times2 = times() in
    fprintf channel "%s: %.02fs\n%!"
      msg
      (times2.tms_utime -. times1.tms_utime);
    clock := times()

type chrono =
  float ref

let fresh () =
  ref 0.

let chrono (chrono : float ref) (task : unit -> 'a) : 'a =
  if Settings.timings then begin
    let times1 = times() in
    let result = task() in
    let times2 = times() in
    chrono := !chrono +. times2.tms_utime -. times1.tms_utime;
    result
  end
  else
    task()

let display (chrono : float ref) msg =
  if Settings.timings then
    fprintf channel "%s: %.02fs\n"
      msg
      !chrono

--- menhir-20200123/src/time.mli ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.
*)
(*                                                                            *)
(******************************************************************************)

(* Call [tick msg] to stop timing a task and start timing the next task. A
   message is displayed. The message includes [msg] as well as timing
   information. The very first task is deemed to begin when this module is
   initialized. *)

val tick: string -> unit

(* Another timing method, with separate chronometers; useful for more
   precise profiling. *)

type chrono

val fresh: unit -> chrono

val chrono: chrono -> (unit -> 'a) -> 'a

val display: chrono -> string -> unit

--- menhir-20200123/src/tokenType.ml ---

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module deals with a few details regarding the definition of the
   [token] type. In particular, if [--only-tokens] was specified, it emits
   the type definition and exits. *)

open BasicSyntax
open IL
open CodeBits

(* This is the conventional name of the [token] type, with no prefix. A
   prefix is possibly appended to it below, where [tctoken] is redefined
   before being exported. *)

let tctoken =
  "token"

let ttoken =
  TypApp (tctoken, [])

(* This is the conventional name of the token GADT, which describes the
   tokens. Same setup as above. *)

let tctokengadt =
  "terminal"

let ttokengadt a =
  TypApp (tctokengadt, [ a ])

(* This is the conventional name of the data constructors of the token
   GADT. *)

let ttokengadtdata token =
  "T_" ^ token

(* This is the definition of the type of tokens.
It is defined as an algebraic
   data type, unless [--external-tokens M] is set, in which case it is
   defined as an abbreviation for the type [M.token]. *)

let tokentypedef grammar =
  let typerhs =
    match Settings.token_type_mode with
    | Settings.TokenTypeOnly
    | Settings.TokenTypeAndCode ->

        (* Algebraic data type. *)

        TDefSum (
          List.map (fun (tok, typo) -> {
            dataname = tok;
            datavalparams = (match typo with None -> [] | Some t -> [ TypTextual t ]);
            datatypeparams = None
          }) (typed_tokens grammar)
        )

    | Settings.CodeOnly m ->

        (* Type abbreviation. *)

        TAbbrev (TypApp (m ^ "." ^ tctoken, []))

  in
  [
    IIComment "The type of tokens.";
    IITypeDecls [{
      typename = tctoken;
      typeparams = [];
      typerhs;
      typeconstraint = None
    }]
  ]

(* This is the definition of the token GADT. Here, the data constructors
   have no value argument, but have a type index. *)

(* The token GADT is produced only when [Settings.inspection] is true.
   Thus, when [Settings.inspection] is false, we remain compatible with old
   versions of OCaml, without GADTs. *)

(* Although the [token] type does not include the [error] token (because
   this token is never produced by the lexer), the token GADT must include
   the [error] token (because this GADT must describe all of the tokens
   that are allowed to appear in a production). *)

(* It is defined as a generalized algebraic data type, unless
   [--external-tokens M] is set, in which case it is defined as an
   abbreviation for the type ['a M.tokengadt]. *)

let tokengadtdef grammar =
  assert Settings.inspection;
  let param, typerhs =
    match Settings.token_type_mode with
    | Settings.TokenTypeOnly
    | Settings.TokenTypeAndCode ->

        (* Generalized algebraic data type. *)

        let param = "_" in
        param,
        TDefSum (
          (* The ordering of this list matters. We want the data
             constructors to respect the internal ordering (as determined
             by [typed_tokens] in [BasicSyntax]) of the terminal symbols.
             This may be exploited in the table back-end to allow an unsafe
             conversion of a data constructor to an integer code.
             See [t2i] in [InspectionTableInterpreter]. *)
          {
            dataname = ttokengadtdata "error";
            datavalparams = [];
            datatypeparams = Some [ tunit ]
              (* the [error] token has a semantic value of type [unit] *)
          } ::
          List.map (fun (token, typo) -> {
            dataname = ttokengadtdata token;
            datavalparams = [];
            datatypeparams = Some [ match typo with None -> tunit | Some t -> TypTextual t ]
          }) (typed_tokens grammar)
        )

    | Settings.CodeOnly m ->

        (* Type abbreviation. *)

        let param = "a" in
        param,
        TAbbrev (TypApp (m ^ "." ^ tctokengadt, [ TypVar param ]))

  in
  [
    IIComment "The indexed type of terminal symbols.";
    IITypeDecls [{
      typename = tctokengadt;
      typeparams = [ param ];
      typerhs;
      typeconstraint = None
    }]
  ]

(* If we were asked to only produce a type definition, then do so and stop. *)

let produce_tokentypes grammar =
  match Settings.token_type_mode with
  | Settings.TokenTypeOnly ->

      (* Create both an .mli file and an .ml file. This is made necessary
         by the fact that the two can be different when there are functor
         parameters. *)

      let i =
        tokentypedef grammar @
        listiflazy Settings.inspection (fun () ->
          tokengadtdef grammar
        )
      in

      let module P = Printer.Make (struct
        let f = open_out (Settings.base ^ ".mli")
        let locate_stretches = None
      end) in
      P.interface [ IIFunctor (grammar.parameters, i) ];

      let module P = Printer.Make (struct
        let f = open_out (Settings.base ^ ".ml")
        let locate_stretches = None
      end) in
      P.program [ SIFunctor (grammar.parameters, interface_to_structure i) ];

      exit 0

  | Settings.CodeOnly _
  | Settings.TokenTypeAndCode ->
      ()

(* The token type and the token GADTs can be referred to via a short
   (unqualified) name, regardless of how they have been defined (either
   directly or as an abbreviation). However, their data constructors must
   be qualified if [--external-tokens] is set. *)

let tokenprefix id =
  match Settings.token_type_mode with
  | Settings.CodeOnly m ->
      m ^ "."
      ^ id
  | Settings.TokenTypeAndCode ->
      id
  | Settings.TokenTypeOnly ->
      id (* irrelevant, really *)

let tokendata =
  tokenprefix

let tokengadtdata token =
  tokenprefix (ttokengadtdata token)

menhir-20200123/src/tokenType.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* This module deals with the definitions of the type(s) that describe the
   tokens and the terminal symbols. *)

(* By default, following [ocamlyacc], we produce just one type, [token],
   which describes the tokens. A token contains a tag (a terminal symbol)
   and possibly a semantic value. *)

(* In addition to that, in [--inspection] mode only, we produce a GADT which
   describes the terminal symbols. A terminal symbol is just a tag; it does
   not carry a semantic value. *)

(* In this module, we also deal with [--only-tokens] and [--external-tokens].
   If [--only-tokens] is specified on the command line, [produce_tokentypes]
   emits the type definition(s) and exits. If [--external-tokens M] is set,
   then the token type and the token GADT are defined as abbreviations for
   [M.token] and ['a M.terminal]. *)

(* The conventional name of the [token] type, for use by the code
   generators. *)

val ttoken: IL.typ

(* [tokendata] maps the name of a token to a data constructor of the [token]
   type. (If [--external-tokens] is set, then it prefixes its argument with
   an appropriate OCaml module name. Otherwise, it is the identity.) *)

val tokendata: string -> string

(* The conventional name of the [terminal] type, a.k.a. the token GADT.
   This is an indexed type (i.e., it has one type parameter). Its data
   constructors carry zero value arguments. *)

val tctokengadt: string

val ttokengadt: IL.typ -> IL.typ

(* [tokengadtdata] maps the name of a token to a data constructor of the
   token GADT. *)

val tokengadtdata: string -> string

(* The definitions of the token type and of the token GADT, for use by the
   code generators. Each of these lists defines zero or one type. *)

val tokentypedef: BasicSyntax.grammar -> IL.interface
val tokengadtdef: BasicSyntax.grammar -> IL.interface

(* If [--only-tokens] is set, then [produce_tokentypes] writes the type
   definitions to the [.ml] and [.mli] files and stops Menhir. Otherwise,
   it does nothing. *)

val produce_tokentypes: BasicSyntax.grammar -> unit

menhir-20200123/src/traverse.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(* Code for traversing or transforming [IL] terms. *)

open IL
open CodeBits

(* This turns a list of value definitions into a hash table. It also counts
   and numbers the definitions. We assume that the left-hand side of every
   definition is a variable. *)

let tabulate_defs (defs : valdef list) : int * (string, int * valdef) Hashtbl.t =
  let count = ref 0 in
  let table = Hashtbl.create 1023 in
  List.iter (fun def ->
    let k = !count in
    count := k + 1;
    Hashtbl.add table (pat2var def.valpat) (k, def)
  ) defs;
  !count, table

(* This mixin class, used by [map] and [fold] below, helps maintain
   environments, which can be used to keep track of local variable bindings.
   *)

class virtual ['env] env = object(self)

  (* The virtual method [pvar] records a local variable binding in the
     environment. *)

  method virtual pvar: 'env -> string -> 'env

  method pat env = function
    | PWildcard
    | PUnit ->
        env
    | PVar id ->
        self#pvar env id
    | PTuple ps
    | POr ps
    | PData (_, ps) ->
        self#pats env ps
    | PAnnot (p, _) ->
        self#pat env p
    | PRecord fps ->
        self#fpats env fps

  method pats env ps =
    List.fold_left self#pat env ps

  method fpats env fps =
    List.fold_left self#fpat env fps

  method fpat env (_, p) =
    self#pat env p

end

(* A class that helps transform expressions. The environment [env] can be
   used to keep track of local variable bindings. *)

exception NoChange

class virtual ['env] map = object (self)

  inherit ['env] env

  method expr (env : 'env) e =
    try
      match e with
      | EVar x -> self#evar env x
      | EFun (ps, e) -> self#efun env ps e
      | EApp (e, es) -> self#eapp env e es
      | ELet (bs, e) -> self#elet env bs e
      | EMatch (e, bs) -> self#ematch env e bs
      | EIfThen (e, e1) -> self#eifthen env e e1
      | EIfThenElse (e, e1, e2) -> self#eifthenelse env e e1 e2
      | ERaise e -> self#eraise env e
      | ETry (e, bs) -> self#etry env e bs
      | EUnit -> self#eunit env
      | EIntConst k -> self#eintconst env k
      | EStringConst s -> self#estringconst env s
      | EData (d, es) -> self#edata env d es
      | ETuple es -> self#etuple env es
      | EAnnot (e, t) -> self#eannot env e t
      | EMagic e -> self#emagic env e
      | ERepr _ -> self#erepr env e
      | ERecord fs -> self#erecord env fs
      | ERecordAccess (e, f) -> self#erecordaccess env e f
      | ERecordWrite (e, f, e1) -> self#erecordwrite env e f e1
      | ETextual action -> self#etextual env action
      | EComment (s, e) -> self#ecomment env s e
      | EPatComment (s, p, e) -> self#epatcomment env s p e
      | EArray es -> self#earray env es
      | EArrayAccess (e, i) -> self#earrayaccess env e i
    with NoChange ->
      e

  method evar _env _x =
    raise NoChange

  method efun env ps e =
    let e' = self#expr (self#pats env ps) e in
    if e == e' then raise NoChange else EFun (ps, e')

  method eapp env e es =
    let e' =
      self#expr env e
    and es' = self#exprs env es in
    if e == e' && es == es' then raise NoChange else EApp (e', es')

  method elet env bs e =
    let env, bs' = self#bindings env bs in
    let e' = self#expr env e in
    if bs == bs' && e == e' then raise NoChange else ELet (bs', e')

  method ematch env e bs =
    let e' = self#expr env e
    and bs' = self#branches env bs in
    if e == e' && bs == bs' then raise NoChange else EMatch (e', bs')

  method eifthen env e e1 =
    let e' = self#expr env e
    and e1' = self#expr env e1 in
    if e == e' && e1 == e1' then raise NoChange else EIfThen (e', e1')

  method eifthenelse env e e1 e2 =
    let e' = self#expr env e
    and e1' = self#expr env e1
    and e2' = self#expr env e2 in
    if e == e' && e1 == e1' && e2 == e2' then raise NoChange
    else EIfThenElse (e', e1', e2')

  method eraise env e =
    let e' = self#expr env e in
    if e == e' then raise NoChange else ERaise e'

  method etry env e bs =
    let e' = self#expr env e
    and bs' = self#branches env bs in
    if e == e' && bs == bs' then raise NoChange else ETry (e', bs')

  method eunit _env =
    raise NoChange

  method eintconst _env _k =
    raise NoChange

  method estringconst _env _s =
    raise NoChange

  method edata env d es =
    let es' = self#exprs env es in
    if es == es' then raise NoChange else EData (d, es')

  method etuple env es =
    let es' = self#exprs env es in
    if es == es' then raise NoChange else ETuple es'

  method eannot env e t =
    let e' = self#expr env e in
    if e == e' then raise NoChange else EAnnot (e', t)

  method emagic env e =
    let e' = self#expr env e in
    if e == e' then raise NoChange else EMagic e'

  method erepr env e =
    let e' = self#expr env e in
    if e == e' then raise NoChange else ERepr e'

  method erecord env fs =
    let fs' = self#fields env fs in
    if fs == fs' then raise NoChange else ERecord fs'

  method erecordaccess env e f =
    let e' = self#expr env e in
    if e == e' then raise NoChange else ERecordAccess (e', f)

  method erecordwrite env e f e1 =
    let e' = self#expr env e
    and e1' = self#expr env e1 in
    if e == e' && e1 == e1' then raise NoChange else
    ERecordWrite (e', f, e1')

  method earray env es =
    let es' = self#exprs env es in
    if es == es' then raise NoChange else EArray es'

  method earrayaccess env e i =
    let e' = self#expr env e in
    if e == e' then raise NoChange else EArrayAccess (e', i)

  method etextual _env _action =
    raise NoChange

  method ecomment env s e =
    let e' = self#expr env e in
    if e == e' then raise NoChange else EComment (s, e')

  method epatcomment env s p e =
    let e' = self#expr env e in
    if e == e' then raise NoChange else EPatComment (s, p, e')

  method exprs env es =
    Misc.smap (self#expr env) es

  method fields env fs =
    Misc.smap (self#field env) fs

  method field env ((f, e) as field) =
    let e' = self#expr env e in
    if e == e' then field else (f, e')

  method branches env bs =
    Misc.smap (self#branch env) bs

  method branch env b =
    let e = b.branchbody in
    let e' = self#expr (self#pat env b.branchpat) e in
    if e == e' then b else { b with branchbody = e' }

  (* The method [binding] produces a pair of an updated environment and a
     transformed binding. *)

  method binding env ((p, e) as b) =
    let e' = self#expr env e in
    self#pat env p,
    if e == e' then b else (p, e')

  (* For nested non-recursive bindings, the environment produced by each
     binding is used to traverse the following bindings. The method
     [bindings] produces a pair of an updated environment and a transformed
     list of bindings. *)

  method bindings env bs =
    Misc.smapa self#binding env bs

  method valdef env def =
    let e = def.valval in
    let e' = self#expr env e in
    if e == e' then def else { def with valval = e' }

  method valdefs env defs =
    Misc.smap (self#valdef env) defs

end

(* A class that helps iterate, or fold, over expressions.
   *)

class virtual ['env, 'a] fold = object (self)

  inherit ['env] env

  method expr (env : 'env) (accu : 'a) e =
    match e with
    | EVar x -> self#evar env accu x
    | EFun (ps, e) -> self#efun env accu ps e
    | EApp (e, es) -> self#eapp env accu e es
    | ELet (bs, e) -> self#elet env accu bs e
    | EMatch (e, bs) -> self#ematch env accu e bs
    | EIfThen (e, e1) -> self#eifthen env accu e e1
    | EIfThenElse (e, e1, e2) -> self#eifthenelse env accu e e1 e2
    | ERaise e -> self#eraise env accu e
    | ETry (e, bs) -> self#etry env accu e bs
    | EUnit -> self#eunit env accu
    | EIntConst k -> self#eintconst env accu k
    | EStringConst s -> self#estringconst env accu s
    | EData (d, es) -> self#edata env accu d es
    | ETuple es -> self#etuple env accu es
    | EAnnot (e, t) -> self#eannot env accu e t
    | EMagic e -> self#emagic env accu e
    | ERepr _ -> self#erepr env accu e
    | ERecord fs -> self#erecord env accu fs
    | ERecordAccess (e, f) -> self#erecordaccess env accu e f
    | ERecordWrite (e, f, e1) -> self#erecordwrite env accu e f e1
    | ETextual action -> self#etextual env accu action
    | EComment (s, e) -> self#ecomment env accu s e
    | EPatComment (s, p, e) -> self#epatcomment env accu s p e
    | EArray es -> self#earray env accu es
    | EArrayAccess (e, i) -> self#earrayaccess env accu e i

  method evar (_env : 'env) (accu : 'a) _x =
    accu

  method efun (env : 'env) (accu : 'a) ps e =
    let accu = self#expr (self#pats env ps) accu e in
    accu

  method eapp (env : 'env) (accu : 'a) e es =
    let accu = self#expr env accu e in
    let accu = self#exprs env accu es in
    accu

  method elet (env : 'env) (accu : 'a) bs e =
    let env, accu = self#bindings env accu bs in
    let accu = self#expr env accu e in
    accu

  method ematch (env : 'env) (accu : 'a) e bs =
    let accu = self#expr env accu e in
    let accu = self#branches env accu bs in
    accu

  method eifthen (env : 'env) (accu : 'a) e e1 =
    let accu = self#expr env accu e in
    let accu = self#expr env accu e1 in
    accu

  method eifthenelse (env : 'env) (accu : 'a) e e1 e2 =
    let accu = self#expr env accu e in
    let accu =
      self#expr env accu e1 in
    let accu = self#expr env accu e2 in
    accu

  method eraise (env : 'env) (accu : 'a) e =
    let accu = self#expr env accu e in
    accu

  method etry (env : 'env) (accu : 'a) e bs =
    let accu = self#expr env accu e in
    let accu = self#branches env accu bs in
    accu

  method eunit (_env : 'env) (accu : 'a) =
    accu

  method eintconst (_env : 'env) (accu : 'a) _k =
    accu

  method estringconst (_env : 'env) (accu : 'a) _s =
    accu

  method edata (env : 'env) (accu : 'a) _d es =
    let accu = self#exprs env accu es in
    accu

  method etuple (env : 'env) (accu : 'a) es =
    let accu = self#exprs env accu es in
    accu

  method eannot (env : 'env) (accu : 'a) e _t =
    let accu = self#expr env accu e in
    accu

  method emagic (env : 'env) (accu : 'a) e =
    let accu = self#expr env accu e in
    accu

  method erepr (env : 'env) (accu : 'a) e =
    let accu = self#expr env accu e in
    accu

  method erecord (env : 'env) (accu : 'a) fs =
    let accu = self#fields env accu fs in
    accu

  method erecordaccess (env : 'env) (accu : 'a) e _f =
    let accu = self#expr env accu e in
    accu

  method erecordwrite (env : 'env) (accu : 'a) e _f e1 =
    let accu = self#expr env accu e in
    let accu = self#expr env accu e1 in
    accu

  method earray (env : 'env) (accu : 'a) es =
    let accu = self#exprs env accu es in
    accu

  method earrayaccess (env : 'env) (accu : 'a) e _i =
    let accu = self#expr env accu e in
    accu

  method etextual (_env : 'env) (accu : 'a) _action =
    accu

  method ecomment (env : 'env) (accu : 'a) _s e =
    let accu = self#expr env accu e in
    accu

  method epatcomment (env : 'env) (accu : 'a) _s _p e =
    let accu = self#expr env accu e in
    accu

  method exprs (env : 'env) (accu : 'a) es =
    List.fold_left (self#expr env) accu es

  method fields (env : 'env) (accu : 'a) fs =
    List.fold_left (self#field env) accu fs

  method field (env : 'env) (accu : 'a) (_f, e) =
    let accu = self#expr env accu e in
    accu

  method branches (env : 'env) (accu : 'a) bs =
    List.fold_left (self#branch env) accu bs

  method branch (env : 'env) (accu : 'a) b =
    let accu = self#expr (self#pat
      env b.branchpat) accu b.branchbody in
    accu

  method binding ((env, accu) : 'env * 'a) (p, e) =
    let accu = self#expr env accu e in
    self#pat env p,
    accu

  method bindings (env : 'env) (accu : 'a) bs =
    List.fold_left self#binding (env, accu) bs

  method valdef (env : 'env) (accu : 'a) def =
    let accu = self#expr env accu def.valval in
    accu

  method valdefs (env : 'env) (accu : 'a) defs =
    List.fold_left (self#valdef env) accu defs

end

menhir-20200123/src/unionFind.ml

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(** This module implements a simple and efficient union/find algorithm.
    See Robert E. Tarjan, ``Efficiency of a Good But Not Linear Set Union
    Algorithm'', JACM 22(2), 1975. *)

(** The abstraction defined by this module is a set of points, partitioned
    into equivalence classes. With each equivalence class, a piece of
    information, of abstract type ['a], is associated; we call it a
    descriptor.

    A point is implemented as a cell, whose (mutable) contents consist of a
    single link to either information about the equivalence class, or another
    point. Thus, points form a graph, which must be acyclic, and whose
    connected components are the equivalence classes. In every equivalence
    class, exactly one point has no outgoing edge, and carries information
    about the class instead. It is the class's representative element.

    Information about a class consists of an integer weight (the number of
    elements in the class) and of the class's descriptor.
 *)

type 'a point = {
    mutable link: 'a link
  }

and 'a link =
  | Info of 'a info
  | Link of 'a point

and 'a info = {
    mutable weight: int;
    mutable descriptor: 'a
  }

(** [fresh desc] creates a fresh point and returns it. It forms an
    equivalence class of its own, whose descriptor is [desc]. *)
let fresh desc = {
  link = Info { weight = 1; descriptor = desc }
}

(** [repr point] returns the representative element of [point]'s
    equivalence class. It is found by starting at [point] and following the
    links. For efficiency, the function performs path compression at the
    same time. *)
let rec repr point =
  match point.link with
  | Link point' ->
      let point'' = repr point' in
      if point'' != point' then
        (* [point''] is [point']'s representative element. Because we just
           invoked [repr point'], [point'.link] must be [Link point''].
           We write this value into [point.link], thus performing path
           compression. Note that this function never performs memory
           allocation. *)
        point.link <- point'.link;
      point''
  | Info _ ->
      point

(** [get point] returns the descriptor associated with [point]'s
    equivalence class. *)
let rec get point =
  (* By not calling [repr] immediately, we optimize the common cases where
     the path starting at [point] has length 0 or 1, at the expense of the
     general case. *)
  match point.link with
  | Info info
  | Link { link = Info info } ->
      info.descriptor
  | Link { link = Link _ } ->
      get (repr point)

let rec set point v =
  match point.link with
  | Info info
  | Link { link = Info info } ->
      info.descriptor <- v
  | Link { link = Link _ } ->
      set (repr point) v

(** [union point1 point2] merges the equivalence classes associated with
    [point1] and [point2] into a single class whose descriptor is that
    originally associated with [point2]. It does nothing if [point1] and
    [point2] already are in the same class. The weights are used to
    determine whether [point1] should be made to point to [point2], or
    vice-versa.
    By making the representative of the smaller class point to that of the
    larger class, we guarantee that paths remain of logarithmic length (not
    accounting for path compression, which makes them yet smaller). *)
let union point1 point2 =
  let point1 = repr point1
  and point2 = repr point2 in
  if point1 != point2 then
    match point1.link, point2.link with
    | Info info1, Info info2 ->
        let weight1 = info1.weight
        and weight2 = info2.weight in
        if weight1 >= weight2 then begin
          point2.link <- Link point1;
          info1.weight <- weight1 + weight2;
          info1.descriptor <- info2.descriptor
        end
        else begin
          point1.link <- Link point2;
          info2.weight <- weight1 + weight2
        end
    | _, _ ->
        assert false (* [repr] guarantees that [link] matches [Info _]. *)

(** [equivalent point1 point2] tells whether [point1] and [point2] belong
    to the same equivalence class. *)
let equivalent point1 point2 =
  repr point1 == repr point2

menhir-20200123/src/unionFind.mli

(******************************************************************************)
(*                                                                            *)
(*                                   Menhir                                   *)
(*                                                                            *)
(*                       François Pottier, Inria Paris                        *)
(*              Yann Régis-Gianas, PPS, Université Paris Diderot              *)
(*                                                                            *)
(*  Copyright Inria. All rights reserved. This file is distributed under the  *)
(*  terms of the GNU General Public License version 2, as described in the    *)
(*  file LICENSE.                                                             *)
(*                                                                            *)
(******************************************************************************)

(** This module implements a simple and efficient union/find algorithm.
    See Robert E. Tarjan, ``Efficiency of a Good But Not Linear Set Union
    Algorithm'', JACM 22(2), 1975. *)

(** The abstraction defined by this module is a set of points, partitioned
    into equivalence classes. With each equivalence class, a piece of
    information, of abstract type ['a], is associated; we call it a
    descriptor. *)

type 'a point

(** [fresh desc] creates a fresh point and returns it.
    It forms an equivalence class of its own, whose descriptor is [desc]. *)
val fresh: 'a -> 'a point

(** [get point] returns the descriptor associated with [point]'s
    equivalence class. *)
val get: 'a point -> 'a

(** [union point1 point2] merges the equivalence classes associated with
    [point1] and [point2] into a single class whose descriptor is that
    originally associated with [point2]. It does nothing if [point1] and
    [point2] already are in the same class. *)
val union: 'a point -> 'a point -> unit

(** [equivalent point1 point2] tells whether [point1] and [point2] belong
    to the same equivalence class. *)
val equivalent: 'a point -> 'a point -> bool

(** [set p d] updates the descriptor of [p] to [d]. *)
val set: 'a point -> 'a -> unit
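The `UnionFind` interface above can be exercised as follows. This is a minimal, self-contained sketch: the inlined mini-implementation is a simplified stand-in for illustration only (naive linking, no weights and no path compression), not Menhir's actual `UnionFind` module, but it satisfies the same signature and exhibits the same observable behavior, in particular the fact that `union` keeps the *second* argument's descriptor.

```ocaml
(* A simplified stand-in for the union/find interface above: naive linking,
   no weights, no path compression. Hypothetical, for illustration only. *)
type 'a point = { mutable link : 'a link }
and 'a link = Info of 'a | Link of 'a point

let fresh desc = { link = Info desc }

(* Follow links to the representative element of the class. *)
let rec repr p =
  match p.link with Info _ -> p | Link p' -> repr p'

let get p =
  match (repr p).link with Info d -> d | Link _ -> assert false

(* Merge two classes; the merged class keeps [p2]'s descriptor. *)
let union p1 p2 =
  let p1 = repr p1 and p2 = repr p2 in
  if p1 != p2 then p1.link <- Link p2

let equivalent p1 p2 = repr p1 == repr p2

let x = fresh "x" and y = fresh "y"

let () =
  assert (not (equivalent x y));
  union x y;                 (* the merged class keeps [y]'s descriptor *)
  assert (equivalent x y);
  assert (get x = "y")
```

The same calling pattern applies to the real module: clients never observe links or weights, only descriptors and class membership.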
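The `map` class in `traverse.ml` above relies on a sharing-preserving rewriting pattern: every method raises `NoChange` when nothing under it was rewritten, so unmodified subtrees are returned physically unchanged instead of being reallocated. The following self-contained sketch demonstrates the pattern on a tiny, hypothetical expression type (much smaller than `IL`); the function `subst` and its types are illustrative assumptions, not part of Menhir.

```ocaml
(* A tiny, hypothetical expression type, for illustration only. *)
type expr = Var of string | App of expr * expr

exception NoChange

(* [subst x e'] replaces occurrences of [Var x] with [e']. As in the [map]
   class above, a case raises [NoChange] when nothing was rewritten, and the
   physical-equality checks (==) propagate sharing upward. *)
let rec subst x e' e =
  try
    match e with
    | Var y ->
        if y = x then e' else raise NoChange
    | App (f, a) ->
        let f' = subst x e' f and a' = subst x e' a in
        if f' == f && a' == a then raise NoChange else App (f', a')
  with NoChange ->
    e

let shared = App (Var "y", Var "y")
let t = App (Var "x", shared)
let t' = subst "x" (Var "z") t

(* The subtree that was not rewritten is returned physically unchanged. *)
let () =
  match t' with
  | App (Var "z", s) -> assert (s == shared)
  | _ -> assert false
```

The payoff is the same as in `traverse.ml`: a transformation that touches only a small part of a large term reallocates only the spine above the change, and a no-op transformation returns the original term itself.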
",#(7),01444'9=82<.342C  2!!22222222222222222222222222222222222222222222222222D" }!1AQa"q2#BR$3br %&'()*456789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz w!1AQaq"2B #3Rbr $4%&'()*56789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz ?uQ^R*F &]je卛<qNRЧmL5i !1򯛫sҌɕ߰}`IrzdJdAFd@Wp?JTd@xz>qVMTNjb`fU6:mơwTyeV#7<U{ٖY#( N1 NƣڦjB^JB1ydTx;ߕ.?x]Ӓ=~( Cm)VR珟UT] Rݎ(`pZ<t'ۥh-9 q/|vM?"^-$զSmy4 :GãBTd{˒A%5VkrIݹ+(*ĊbkW0#vEiK_59QEzQEtP6--КSTm@)a5M#AR2=5TwszsA!á'Vs^n'JpD}0ySӥJePW*={{ҳ+,pyāÜzV KIu{H¹׿FҪϥYFqTk!Ws8CUV%7O!K鑏,3CcEຊ l=[JbWv?(dn*69N֬ ?Zwnȥf;G&Td{V0NI+9 ;V v69rlco}K#=ؚƜK1泮/KU=k*qWdT[nnc ΞFBmi4%TֻT#J.]L[rv@8*4y1F_?N)5@h6}=2~dggngdi HQTvݬf4=U#S[;<нA=ḛrc[,;3xy",ij''fZ"D+櫵ܷ``7#-R$aݯ&/)04Kӎ"dBo8O_PK #@&e ͻ)<CTRטՓ6-hcrǽdx+>9Yg^GF $nU?FaQExQEBmQQL`o';W8*F#HJu%wl'v/P [n\9畇PsU:F O `T \y+S,79\dQ G6׍?1pbwdllN1ێ*{&8zLz&lbI*Cc<*&~RZp`W${ϓJ!7&mA%_CXZɊu@*io001ֽY.[q5i Hvh]ԪAk`NyT|Ҝm W;R3qnmAa8Sk-fh[ )vśh"3&TTQ!1K9XYe$VR{c ZBlx߭\9/#0ǹc~4*YۻIyVSjy-#i+"7(wZdfVQ:8l'©^RJX>ELǧNڸ\ ޮO?%QEz'QEؠ}6sn߭6|@R*0xTz,ʐçA#)G:ZIdI2:[)q2 0F;dX. nKbs!I9$G$=,Jˮї\Z$`R֯6BKev\(:9bhnc`ay^qަ/)2R1j+mF?.ȧ[NA3ItnՂfyUx Ke"Ek&2%zUd[_l9?ݟ9N!'UF c*g21U ^e 0EWyEPS-؍~}ED#]?:()nε`9$Ң1=G(:G$Xm'>ƤoŶl@rM MQv;!m ^=j`Eɲo.=6jW;sGQEadUX$`[r zsJJ!hcb-Ft@#O7Ro'kQHD< xI rIׯEH #2A(%~O=QQSsXl5-H l(MS icSL>6/JzQE0#«|s#+:qE#:dTQE{'http://ns.adobe.com/xap/1.0/ C       C R tZ&*.֥KSWOda\u*(p*ȍ@+KQEЪ*NBQJVO(@EF"F mN&R{[ˁ @=Oڞ7mhlTY<>- [G"fh^.`q 09*45hUQJ2,*;W@T Tĵ( ibNã. 
4d.f[#Lh/5ֶH;x @7[: um]"^,43A^ރG+0அ*ZO"+q&FH--[Z`w"r68Cέ]d:Cʳ0EYf*VR,ZYQZNaѦ .5˄V*fFSvppi_d/Kl](n؜^z6 # c5&8ϛz8xm0,* 紂7;cpƙyC:p42S`@ @mڧwJrDXe.:|1Ͽ.4:2gAkw, ͓Z@e![\he⯓LZx*7uyqY \nu3$ÑUK:&SNWK͘GhʷgBΝQ`tkdR[ {IE=  @ @= gz}<ѫӥ7~7Z_lMɣ!UCIdȢi )9Yރ/ͱaZ^m8hzXoo:ѿɡ9_t)Slf7Z#[|{Q.gobTXU[qNNק_$/p*@ @ @\} e&f;G%F2RkG\8;5L\is?Nxz:2OG+S9+\umN} F7.='ƑCM$5*!+|ݗU8}/KNΗwGoMy)/b :E|E㷙b2Jkw~߅ @ {?K2)oWh3O:cV{A'ϕh,2SEvZTZ*/wW3gk1[6Z-wS=;x>s*&7"NWާ25H(cR5>OW64f71xkݥA1^L<."w=Sƽ$$D=I3.WFK @ @ׂ _[ƴC뤥y^L{`n$7?_.ۗix~ea5.'`ƕ*ytr7ν,iSqx~8 @ @xt;L=kG[q^2= t7ɻop-6ӹʱlTO헸|Av=ʖ=4Ϸϔ+qWנis( kX>㗡/oÁ @=߃hKWftϾ/?pOLt0|[$ 巧j?LC+dlE,-w@`JyOy<hZǙׯ|殌ySO:2-dw|n2ZLy^/7nW7xzsdoL-vOj]i]iy=eƈ i>WH @ @eeOg-P|?n]kGLKjwraMGw{9/ǮnNէq2Y{ѭ*So?wހ '<<ש͏)OSS->OS|z\nY:Nnw4TʒhҚ K Ȱm /F^N\i.eHg~hC @ @w&՞?hw秋N ~uSK vsJ,wzomh\ruNN4ޙmd0{wsn7!/AVkL/u9֨jЄ\ix=cjJISjD5О Mi}1)9/SYe(+su[=hJ; d) ] l$:ڊ1IҲ]hvFdf/s @=7`";'do.IUY/8Ѣך螹J he!kR%tkvg%j U䣦 #yjis]E i[o%mt!;.\jZFH xte &ٚTG. $К( *y3 @ @w4;\ I9z;/R3IA&& EרadTq"'}Z]k1\Q=A)RY:%IƢEB%*^ә5i5VTfWIiR5&SEXUkTՁCJ/sp1 @ @epm;ftw2ttCQJ"pnդiwYTҭj@эK sfq <4ݒvk1!ԴsiKZ*<ڍ9i'aTեxeTEDD3h%I@s*CH]l\ @0!"#14$023A %5@BCDDĠ7%gɭum19xgLgu dNyZ.>FG4|1gf>vh.c;ty# ~;t|] 3GvwhݣgvgD,4l4\LZ!G~##F:1DI3rmh>VN1XW4p'J'K'K¹E0sӰioN8(G A"Mcz2l.K2ǕW<ʸZ1?2yͫ0ͯ쩈0+<+K AD:7-\t1C-K 6U Rp9䫕_|rE>+cZ<.;sn9Ұ9>8 g|L\>],q7߰ w,nN!.͈0ro8;79SŝlXqZ猐+) tlA89uʄc3Y؅O,m#;HF?TJ G`0s28r2^w",hi#B ;YX 6‡ϖ L'[.dgʋ#MdlZJ۟*+#L٤OWʕ+'Hʫʵs*M@Z8ׯ1:JVt&q J8\d`b%{JqNιRϗ[7P^@,H`^[&OaA M\ V%8YKku: ʶ"@K_GQk4">9pכ Sw1dEbu`m!jmR=*s]v_S/Uqefu-r,d6iRVX2ʬ3?87 &R/UR^#?GW Fm?3Ć$r^.{/FI[;'i9ŗ,,Črq&[Ǭfl;dyq -:(.ʪp5"K*0r"ESiBMN)3Qm9\gNla&<|PC5ʭc.z|j̴UeARdVS+i, .\Ii ࣱ*PB r5PLx |$@P{N9aҧK\.8v"ˍ&W,  N^Adccbo}"sadX{lu, '68Gh5PHdLl bp>„3Dp]Qku$)0UTJ.͓'q3o \\i9\ghbȌװHb5ZI~LrgV+n#$Xڠ$rfI-/!t7q" n,h3wVi:,%+jrcAl"s񜽸|֖j3Ԁ(1L7±' QǷ֚edmfU`vU(W(>o.>*)m,RѼ=wqM)2r7FSS^Dc72X[x`8rePZߔ7"yf&"rMI_LݯN+k\ags1SւO.F P$,Ͷ±&NyޞnRJqlN=%!9L[(0,Oh$fʰ2B zUGn_V2rlZ~b}d&8sqsS [S}/!ti,1LDo>ј"1L"5o<}gI{wd ,=qɋ 'kiߊd 
ɺ8Éc9>?0c,X_Lgw=3lTssJF8'}T<C2U>.S>;9ґ0`ϡLԦ:VH&Cp~N&a1!dFXةԄfqoc1Hm)PG9%/*c%8r6垛 EK%ip"vf\O9X.yY8q.b|bP'NTa´7>Js#) ݤqWܷ&>w߯/,U-jN$p͓1OЉnhIo-C9iLM$ȭ-7UTu><"<3Kj-mv1sIUs$}QKy}ՎwtonXO(X,E sRu1w4ͲS_WӌRsG7LsqrCg(ۏ?WȈ蕻ىwzqaf3,(TV‚ܱ:9wH Sv4a-.@0"Q"],}'P9V3JXU6 zgX;rf> i,L ),3WlHgt;<=:əb,G#FVN^qUX286Fm4 bo!@:ݐ8aٖ(G&8 4dl_q鑝6b!=YSMɥl51!|]EM jc?f\\ tq Ȱ3}Z״PMaAZIм9[͹A01,Fy$hHzcV3uU#ith-@RSYDQK2\0 N]Sؾc10? ͹HO[ʉĦ@--sN)ٞ3 lO*!} Ϭ@ŗ'"%qE2P,2l&y|DI{?ĸPIyԹB´ nbM^8KhdG*Ԕ}y?A;X:f'iD?Umz4ׇvP"7i̝|XmJղ=supc- `v!ɎpDKR >%y+en.}cAlLk{ d6[xٍ8~BtBG q;ns?lJny*"&5%N^r=&Xy933goeN&No8I8n"9q(a,Ql%T@)M5%വMzEd9)/rģ] U6bIyӍC-q1!^Fp箚33>1ܣ9vdU%}.r7 \Re| 9KGT .By5ܬLRZ'?(,L@}w EcbO,'Y]Wa*a/-pXUeJa*(aM9L UKmOGpK8z^%#5q 4nK|ؑW'{L#8}Gry&rtb|ǯ\DGl@J7ڕI?\"Y6@TyS"ݦes^jӖëʐ/JoBRH}+nCnFvfauR}*qIXV&,"6G7kŬ|$TMIg`}h6 SUY ̵Yy3#gP@ij޸ scJh>ʟ8k 1=~kÍM"i$`l )r8Ai{Cl3xA"XHT *ثX4[.ev/ODI#[M\LZҹYӒj})葍$^Cְf,^DکT24-b6l$WbSVϥh[ YOW^2lɭԉbgQVFNKjFK>oJ#>mDruqSk4~qN3#>ySy~kC4Âu|b?yJxPﴌ+nKzMJcPzi4ًj2)jxVDk(v]q:TõuLMpcaB-MT/RfH-1~];,穐Y"Uz{.B>Lwq2G#?}mE_yIS <|; ^v#/ax;v#Gُb>|;1vcُf>v#/^N>Of>|;vcGُf>|;x ׃xOGzu׃ze闃/^C  'E#QeYb$QE"HR)M%QHi :(jEhZ65!4ZER5"ѩ M"YDyE{*׳B:(Eʊ(Ed~՛lrfkfH5Ls5^tڋ,{^]A(hud3E{h%,,O$6(v1,!{{Y/tܯڬme|21;Y匳HU q"BʲK. F٬R- Z5"5!4lu K ߓZe\E4kFef0qHyRWΆb-iӱc*픐,$OΝ[5ɚjcgqb5eRf[5w lmjfw3`:cƒgޑݙݑޑ+0[#Hԑ q5ФNNwQӸk!YӔcEʢvi$4! 
e )4N4JLQ!L1HL ЄLŜ9:/`6M l4"HhQ%Ye%VVN")%Q[!fPbH-MaW|Q(448*͑QH&#!e~ C|ea=-Q[P hY^QܒܡdIJɋnILHrs-E3,K&ri˲Bc7^p:0?+،>LM!4i+$Ɔb[>%Yȏ?# PɑVOr.QGsF"$Y7!1;5kLX{,3W!4a`|3$O+ς11HDecCBEQ&>Hciwdij%}[c0~TQ$pI'i=q\^хkM1:#YهW6#M=c<3OP&($At&,*$.j,y_5&2g$v(^̻rf)}]l;x2C*Id>De/fvG gc]HB"gP՘dGGCTz՞G=[#ջ)d5:5٬r$GY%:_IgF"#[p!(8#@DŽv4Qf/EYXɜ|5ȸ1z][xU[=FzC}H>D0tRJIXQ؋OŅR:K#,2CG&G lK:\ QȑCC# =@1lu"yōn2Yc@j5BtX1BVWNDlѪ؝FiY#eGKcQc,e(8UН{bt#$2Q<&8MXti*C(>9<XĆ3s Jd7X.l28/+l4%B5c_JC:(EEiB䋡$nnnjˡnnjen2kcMlrU cyXyGHi(i4M"(M%eEF4#JEM")mv%:_&^tYf˓E PGm#A iFhLѠh4#B4#IQGocf@-2-L&^VnS(Y~$K*%dMFŢRGW}:_>}!7( Hi fBHi )QE B,blee-[-C}6!1AQ "23@R0B#ar$4q?TMA?OE?Z^I{$G|/_SSOY?sOOsy?ssg)ħC K]D1/sǗ>^瘗%y{<1/sK<Ľ1/sK<<1/sO<Ľ1/sKy{b^瘗<Ľ1/q<̽3/sy{f^瘗%y{b^瘗ey{b^gy{f_2.^癗ey{f~癗<̽3/sKf^/y{f~癟'y6iO*M;XZF,|iXd b8#(ōiebב y)dlE1E͌ͬqfxlfs:~bm,EITǗ% Kb)"CHrHYR rD3%PDhCCq"GDQO#[v,nƑt#rJPj8E$!*-1$QTqr]}=?ġQ#dRTFȊ"HzHEC֊z^ ·~7ƿodpq_sŠeKp[ P&K6zHm+҆X! %QjD2Όr!G4> mr|DOF+9P֜ȡ&55 6P΅`N.?CnK<О ЇOz_7%T":?Dl~#Hj[ӝ(͌ED'QGD_)r//c]̏2c$YG؄T%hx2e!SilN^6]1ΐ$iJf.A WgQ;d'αV".HHM 4mlǃqN'QQ.}%ĖZe\k_}F_)ŎI $u~ ܗPB<LJ*>taF( CTLjVL3s؆J)DFkB =):FWˆA#+t :_R5x˰$)J*H}c!%BdY?T7s%6c&9p2]|Ր0C3ǂp"Icd %Gٓ$9fX{ t3km%DFR"ߩ% 0丙o0 ς2Dn%i,L҄^VMr2'KWNCu_^R:hECS'c,c2[!Ӌ2`pq(eRqcR%JޗOC4K(dm;,QY̙Q{M=HH!D͔m#/GA:_Hsiȇ1:ic3.hѭ̖. bhvdQ=C#yclɁ4NN+;ke6=ڬ3n- );&K#"3;w=&1/ߏuqhJI2Egկ\:^ʉFdF4$00FcIf(3w4Av6G|W;'6YJX1,~2abH;;,}SFYb*cu3.Xͬ&CLmd]2̩O?1¾ QFYoz'DZf7DUpQ@cF+/[77!͖.N,1N#1OEg3KlXīIbdKN6FT˲ c$ŵL/rحobVlkD1] _" _Ef] Η2[Lf Y4zz>>>gE(#z7$:cBBrFJhRbLFib7[1OrܔQu?ԋEIbPʱ.R"֊(S((,/IGqG xY3y)ynfn72ٹvc:S6kf<36 i( F*ѵDĆ-"9v+DLVHY4t_|y-V-_!ܶ[-ce[,Ye^Kh)"BHPE"HR) (ڎ/'L !1"2AQ3aq#4 5@BRbr$sCSc0d?gJ6~j7BD?Ex)·>I^_hM ƽC%zWT C?7E '?;EDi^ < y?z?z?+hLl_FWS[E+RREF~Sz%?QS^%^%m~Kߒ177䥺 ?7 (!?Cobn;4_Aï@ïOn-? 
8[_,_@/'~_B!KEG!bg ?A&?jm_MCCeWğ/5cwk}F?_jk-_bXY|KD3Xm} '4a*.؏@apR @T%")Ӥ r^N!GHl8DZan93TaҘr~ fܽ)q~݇ELd8A6M1=zh':cdצؚ.hÖKNgцWMMfdǨ,T{0~ūvcΗE 'ӥJy+/Ix):gTI-$׽!Ť No;iKX.g{`7N4nʁcnBbGUQ͕c6Ab.50FlEAHTe [-bM+'yخ+Yg[KoLϸ3 ?Qa-ÚѶ eS,kR* }/8eGqU)2<чbݡGE2nkQnvX6@5uCأD>qŒAoy-ce^AˋdЧXBi56[Y1\x(:GT$)׺go6G|?O{Ħr7݄NN1+^YyKi,j(bs)ŘTFn -ź408⎑wÃiEgrJ0.o.3V-2LyuVq,+D[Nmi+~SdW[:qMpW;N^d;Mzm_bھŅ]im/KVҫ{W;kJz[6Y_,}r':w44]r;d ,sWT7n( 'Gw䱚EF5E/D.:3T Whw7^+`vUJbQ3C72hNW]4O =_BwzRx/@>!c~ HAHlU U:;$e0 V8UmJRAi# dt[8&Pl )x/m)nc{zӉcpjՖħBPyE2 /i3ڃ0N]M3%Sk\#86jz{ UEC u6x EUN޵O+Xշ,]v%(N.ϰ&m1!OIw_'K (#'.I=p# 8$)~{Ʊ@؈ ^^u .RM< '9hG6@4Ź&W/.hl(Lud9|]8VpV=uRNNy¨}7G,.dPӇ%< aLGKd4ȷ5kUg?iB[ñOb T{|5T7O"~}˫:܏rik!t側ڤj1cW䰖>anw/b,2V&P)^&2WS#Wt OFwԬc(3Yyd` ! ^lnqϚiCG2U{vIX[*ܬhA8G z5=JgZ(#5qɱUt7+}*d'?H鞂|V{T>IX#XD@@à}9iO blܻGAVj]Z+ }% %ã%VM-(H*f޿UnϲÜߣdrWz ;%vhs@F^ӈtBÊѫi]QeovVRBsF Fbp#]9)RV'bO5 qՄs!4r #Ώ<ۭ" oO7ܱfVp %w ]7Ԭtyk le!z0.̄Zyp0P 1XsYՅR (r£ã|5a@ҪٕqPS>rpXz14'7>'vHS n}{p6~hY|s⹐kX k Hk7dkhe,G6ܮHNdek7-3'TiOLܱlJ`'> [bו֔-K 6pD֟2PuFre7*B5 C !:*BԼ׵ r1z +p-e aAZpı`pMѤÌ8%QŘf,/媁@KY"}5%0euAZy!SϢ,2)V^V(ف8rON' ,Pr V6YYN$ۀ@Ȉ 8NKޅsu8OR{meCm+IUrQ9 2#5G -̢)jU^ZWZ|W%QXlH qCF3:o?k6h%Rq[ C@o$ٜ& rMq'ՑޞuvqgJc`֥,d¸y]<xOdLN:Cܵ,Ňڞ@} ҃8iGWz}Z:1GFZxZv.Q1iLm˙Ţ ӽJ i$XCZ} \~@pGޛTƑiTa. 

Menhir Reference Manual
(version 20200121)

François Pottier and Yann Régis-Gianas
INRIA
{Francois.Pottier, Yann.Regis-Gianas}@inria.fr

Contents

1  Foreword

Menhir is a parser generator. It turns high-level grammar specifications, decorated with semantic actions expressed in the OCaml programming language [18], into parsers, again expressed in OCaml. It is based on Knuth’s LR(1) parser construction technique [15]. It is strongly inspired by its precursors: yacc [11], ML-Yacc [22], and ocamlyacc [18], but offers a large number of minor and major improvements that make it a more modern tool.

This brief reference manual explains how to use Menhir. It does not attempt to explain context-free grammars, parsing, or the LR technique. Readers who have never used a parser generator are encouraged to read about these ideas first [1,2,8]. They are also invited to have a look at the demos directory in Menhir’s distribution.

Potential users of Menhir should be warned that Menhir’s feature set is not completely stable. There is a tension between preserving a measure of compatibility with ocamlyacc, on the one hand, and introducing new ideas, on the other hand. Some aspects of the tool, such as the error handling mechanism, are still potentially subject to incompatible changes: for instance, in the future, the current error handling mechanism (which is based on the error token, see §10) could be removed and replaced with an entirely different mechanism.

There is room for improvement in the tool and in this reference manual. Bug reports and suggestions are welcome!

2  Usage

Menhir is invoked as follows:

menhir option … option filename … filename

Each of the file names must end with .mly (unless --coq is used, in which case it must end with .vy) and denotes a partial grammar specification. These partial grammar specifications are joined (§5.1) to form a single, self-contained grammar specification, which is then processed. The following optional command line switches allow controlling many aspects of the process.

--base basename.  This switch controls the base name of the .ml and .mli files that are produced. That is, the tool will produce files named basename.ml and basename.mli. Note that basename can contain occurrences of the / character, so it really specifies a path and a base name. When only one filename is provided on the command line, the default basename is obtained by depriving filename of its final .mly suffix. When multiple file names are provided on the command line, no default base name exists, so that the --base switch must be used.

--cmly.  This switch causes Menhir to produce a .cmly file in addition to its normal operation. This file contains a (binary-form) representation of the grammar and automaton (see §13.1).

--comment.  This switch causes a few comments to be inserted into the OCaml code that is written to the .ml file.

--compare-errors filename1 --compare-errors filename2.  Two such switches must always be used in conjunction so as to specify the names of two .messages files, filename1 and filename2. Each file is read and internally translated to a mapping of states to messages. Menhir then checks that the left-hand mapping is a subset of the right-hand mapping. This feature is typically used in conjunction with --list-errors to check that filename2 is complete (that is, covers all states where an error can occur). For more information, see §11.

--compile-errors filename.  This switch causes Menhir to read the file filename, which must obey the .messages file format, and to compile it to an OCaml function that maps a state number to a message. The OCaml code is sent to the standard output channel. At the same time, Menhir checks that the collection of input sentences in the file filename is correct and irredundant. For more information, see §11.

--coq.  This switch causes Menhir to produce Coq code. See §12.

--coq-lib-path path.  This switch allows specifying under what name (or path) the Coq support library MenhirLib is known to Coq. When Menhir runs in --coq mode, the generated parser contains references to several modules in this library. This path is used to qualify these references. Its default value is MenhirLib.

--coq-lib-no-path.  This switch indicates that references to the Coq library MenhirLib should not be qualified. This was the default behavior of Menhir prior to 2018/05/30. This switch is provided for compatibility, but normally should not be used.

--coq-no-actions.  (Used in conjunction with --coq.) This switch causes the semantic actions present in the .vy file to be ignored and replaced with tt, the unique inhabitant of Coq’s unit type. This feature can be used to test the Coq back-end with a standard grammar, that is, a grammar that contains OCaml semantic actions. Just rename the file from .mly to .vy and set this switch.

--coq-no-complete.  (Used in conjunction with --coq.) This switch disables the generation of the proof of completeness of the parser (§12). This can be necessary because the proof of completeness is possible only if the grammar has no conflict (not even a benign one, in the sense of §6.1). This can be desirable also because, for a complex grammar, completeness may require a heavy certificate and its validation by Coq may take time.

--depend.  See §14.

--dump.  This switch causes a description of the automaton to be written to the file basename.automaton.

--echo-errors filename.  This switch causes Menhir to read the .messages file filename and to produce on the standard output channel just the input sentences. (That is, all messages, blank lines, and comments are filtered out.) For more information, see §11.

--explain.  This switch causes conflict explanations to be written to the file basename.conflicts. See also §6.

--external-tokens T.  This switch causes the definition of the token type to be omitted in basename.ml and basename.mli. Instead, the generated parser relies on the type T.token, where T is an OCaml module name. It is up to the user to define module T and to make sure that it exports a suitable token type. Module T can be hand-written. It can also be automatically generated out of a grammar specification using the --only-tokens switch.

--fixed-exception.  This switch causes the exception Error to be internally defined as a synonym for Parsing.Parse_error. This means that an exception handler that catches Parsing.Parse_error will also catch the generated parser’s Error. This helps increase Menhir’s compatibility with ocamlyacc. There is otherwise no reason to use this switch.

--graph.  This switch causes a description of the grammar’s dependency graph to be written to the file basename.dot. The graph’s vertices are the grammar’s nonterminal symbols. There is a directed edge from vertex A to vertex B if the definition of A refers to B. The file is in a format that is suitable for processing by the graphviz toolkit.

--infer, --infer-write-query, --infer-read-reply.  See §14.

--inspection.  This switch requires --table. It causes Menhir to generate not only the monolithic and incremental APIs (§9.1, §9.2), but also the inspection API (§9.3). Activating this switch causes a few more tables to be produced, resulting in somewhat larger code size.

--interpret.  This switch causes Menhir to act as an interpreter, rather than as a compiler. No OCaml code is generated. Instead, Menhir reads sentences off the standard input channel, parses them, and displays outcomes. This switch can be usefully combined with --trace. For more information, see §8.

--interpret-error.  This switch is analogous to --interpret, except Menhir expects every sentence to cause an error on its last token, and displays information about the state in which the error is detected, in the .messages file format. For more information, see §11.

--interpret-show-cst.  This switch, used in conjunction with --interpret, causes Menhir to display a concrete syntax tree when a sentence is successfully parsed. For more information, see §8.

--list-errors.  This switch causes Menhir to produce (on the standard output channel) a complete list of input sentences that cause an error, in the .messages file format. For more information, see §11.

--log-automaton level.  When level is nonzero, this switch causes some information about the automaton to be logged to the standard error channel.

--log-code level.  When level is nonzero, this switch causes some information about the generated OCaml code to be logged to the standard error channel.

--log-grammar level.  When level is nonzero, this switch causes some information about the grammar to be logged to the standard error channel. When level is 2, the nullable, FIRST, and FOLLOW tables are displayed.

--no-dollars.  This switch disallows the use of positional keywords of the form $i.

--no-inline.  This switch causes all %inline keywords in the grammar specification to be ignored. This is especially useful in order to understand whether these keywords help solve any conflicts.

--no-stdlib.  This switch instructs Menhir to not use its standard library (§5.4).

--ocamlc command.  See §14.

--ocamldep command.  See §14.

--only-preprocess.  This switch causes the grammar specifications to be transformed up to the point where the automaton’s construction can begin. The grammar specifications whose names are provided on the command line are joined (§5.1); all parameterized nonterminal symbols are expanded away (§5.2); type inference is performed, if --infer is enabled; all nonterminal symbols marked %inline are expanded away (§5.3). This yields a single, monolithic grammar specification, which is printed on the standard output channel.

--only-tokens.  This switch causes the %token declarations in the grammar specification to be translated into a definition of the token type, which is written to the files basename.ml and basename.mli. No code is generated. This is useful when a single set of tokens is to be shared between several parsers. The directory demos/calc-two contains a demo that illustrates the use of this switch.

--raw-depend.  See §14.

--stdlib directory.  This switch exists only for backwards compatibility and is ignored. It may be removed in the future.

--strict.  This switch causes several warnings about the grammar and about the automaton to be considered errors. This includes warnings about useless precedence declarations, non-terminal symbols that produce the empty language, unreachable non-terminal symbols, productions that are never reduced, conflicts that are not resolved by precedence declarations, and end-of-stream conflicts.

--suggest-*.  See §14.

--table.  This switch causes Menhir to use its table-based back-end, as opposed to its (default) code-based back-end. When --table is used, Menhir produces significantly more compact and somewhat slower parsers. See §16 for a speed comparison.

The table-based back-end produces rather compact tables, which are analogous to those produced by yacc, bison, or ocamlyacc. These tables are not quite stand-alone: they are exploited by an interpreter, which is shipped as part of the support library MenhirLib. For this reason, when --table is used, MenhirLib must be made visible to the OCaml compilers, and must be linked into your executable program. The --suggest-* switches, described above, help do this.

The code-based back-end compiles the LR automaton directly into a nest of mutually recursive OCaml functions. In that case, MenhirLib is not required.

The incremental API (§9.2) and the inspection API (§9.3) are made available only by the table-based back-end.
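With dune, for instance, one way to request the table-based back-end and link MenhirLib is the following (a sketch; the module and executable names are illustrative, and the project's dune-project file is assumed to enable dune's Menhir extension, e.g. with (using menhir 2.1)):

```
(menhir
 (modules parser)
 (flags --table))

(executable
 (name main)
 (libraries menhirLib))
```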

--timings.  This switch causes internal timing information to be sent to the standard error channel.

--trace.  This switch causes tracing code to be inserted into the generated parser, so that, when the parser is run, its actions are logged to the standard error channel. This is analogous to ocamlrun’s p=1 parameter, except this switch must be enabled at compile time: one cannot selectively enable or disable tracing at runtime.

--unused-precedence-levels.  This switch suppresses all warnings about useless %left, %right, %nonassoc and %prec declarations.

--unused-token symbol.  This switch suppresses the warning that is normally emitted when Menhir finds that the terminal symbol symbol is unused.

--unused-tokens.  This switch suppresses all of the warnings that are normally emitted when Menhir finds that some terminal symbols are unused.

--update-errors filename.  This switch causes Menhir to read the .messages file filename and to produce on the standard output channel a new .messages file that is identical, except the auto-generated comments have been re-generated. For more information, see §11.

--version.  This switch causes Menhir to print its own version number and exit.

3  Lexical conventions

A semicolon character (;) may appear after a declaration (§4.1).

An old-style rule (§4.2) may be terminated with a semicolon. Also, within an old-style rule, each producer (§4.2.3) may be terminated with a semicolon.

A new-style rule (§4.3) must not be terminated with a semicolon. Within such a rule, the elements of a sequence must be separated with semicolons.

Semicolons are not allowed to appear anywhere except in the places mentioned above. This is in contrast with ocamlyacc, which views semicolons as insignificant, just like whitespace.

Identifiers (id) coincide with OCaml identifiers, except they are not allowed to contain the quote (') character. Following OCaml, identifiers that begin with a lowercase letter (lid) or with an uppercase letter (uid) are distinguished.

A quoted identifier qid is a string enclosed in double quotes. Such a string cannot contain a double quote or a backslash. Quoted identifiers are used as token aliases (§4.1.3).

Comments are C-style (surrounded with /* and */, cannot be nested), C++-style (announced by // and extending until the end of the line), or OCaml-style (surrounded with (* and *), can be nested). Of course, inside OCaml code, only OCaml-style comments are allowed.

OCaml type expressions are surrounded with < and >. Within such expressions, all references to type constructors (other than the built-in list, option, etc.) must be fully qualified.

4  Syntax of grammar specifications


specification ::= declaration … declaration %% rule … rule [ %% OCaml code ]
declaration ::= %{ OCaml code %}
  ∣ %parameter < uid : OCaml module type >
  ∣ %token [ < OCaml type > ] uid [ qid ] … uid [ qid ]
  ∣ %nonassoc uid … uid
  ∣ %left uid … uid
  ∣ %right uid … uid
  ∣ %type < OCaml type > lid … lid
  ∣ %start [ < OCaml type > ] lid … lid
  ∣ %attribute actual … actual attribute … attribute
  ∣ % attribute
  ∣ %on_error_reduce lid … lid
attribute ::= [@ name payload ]
(old syntax) rule ::= [ %public ] [ %inline ] lid [ ( id, …, id ) ] : [ | ] group | … | group
group ::= production | … | production { OCaml code } [ %prec id ]
production ::= producer … producer [ %prec id ]
producer ::= [ lid = ] actual
actual ::= id [ ( actual, …, actual ) ]
  ∣ actual ( ? ∣ + ∣ * )
  ∣ group | … | group
(new syntax) rule ::= [ %public ] let lid [ ( id, …, id ) ] ( := ∣ == ) expression
expression ::= [ | ] expression | … | expression
  ∣ [ pattern = ] expression ; expression
  ∣ id [ ( expression, …, expression ) ]
  ∣ expression ( ? ∣ + ∣ * )
  ∣ { OCaml code } [ %prec id ]
  ∣ < OCaml id > [ %prec id ]
pattern ::= lid   ∣   _   ∣   ~   ∣   ( pattern, …, pattern )
Figure 1: Syntax of grammar specifications

The syntax of grammar specifications appears in Figure 1. The places where attributes can be attached are not shown; they are documented separately (§13.2). A grammar specification begins with a sequence of declarations (§4.1), ended by a mandatory %% keyword. Following this keyword, a sequence of rules is expected. Each rule defines a nonterminal symbol lid, whose name must begin with a lowercase letter. A rule is expressed either in the “old syntax” (§4.2) or in the “new syntax” (§4.3), which is slightly more elegant and powerful.
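As a concrete illustration, here is a minimal, self-contained specification in the old syntax (the token names and semantic actions are illustrative, not taken from a real grammar):

```
%token <int> INT
%token PLUS EOF
%left PLUS
%start <int> main
%%
main:
| e = expr EOF { e }
expr:
| i = INT { i }
| e1 = expr PLUS e2 = expr { e1 + e2 }
```

The %start declaration makes main an entry point, while the %left declaration resolves the shift/reduce conflict that the ambiguous definition of expr would otherwise cause (§6).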

4.1  Declarations

4.1.1  Headers

A header is a piece of OCaml code, surrounded with %{ and %}. It is copied verbatim at the beginning of the .ml file. It typically contains OCaml open directives and function definitions for use by the semantic actions. If a single grammar specification file contains multiple headers, their order is preserved. However, when two headers originate in distinct grammar specification files, the order in which they are copied to the .ml file is unspecified.

4.1.2  Parameters

A declaration of the form:

%parameter < uid : OCaml module type >

causes the entire parser to become parameterized over the OCaml module uid, that is, to become an OCaml functor. The directory demos/calc-param contains a demo that illustrates the use of this switch.
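For instance, a parser can be made generic in its semantic domain by abstracting that domain as a module parameter, in the spirit of demos/calc-param (a sketch; the module name and signature shown here are illustrative):

```
%parameter <Semantics : sig
  type number
  val inject : int -> number
  val add : number -> number -> number
end>
```

The generated .mli then exposes a functor, roughly module Make (Semantics : …), which must be applied before the parsing functions can be used.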

If a single specification file contains multiple %parameter declarations, their order is preserved, so that the module name uid introduced by one declaration is effectively in scope in the declarations that follow. When two %parameter declarations originate in distinct grammar specification files, the order in which they are processed is unspecified. Last, %parameter declarations take effect before %{%}, %token, %type, or %start declarations are considered, so that the module name uid introduced by a %parameter declaration is effectively in scope in all %{%}, %token, %type, or %start declarations, regardless of whether they precede or follow the %parameter declaration. This means, in particular, that the side effects of an OCaml header are observed only when the functor is applied, not when it is defined.

4.1.3  Tokens

A declaration of the form:

%token [ < OCaml type > ] uid1 [ qid1 ]  …  uidn [ qidn ]

defines the identifiers uid1, …, uidn as tokens, that is, as terminal symbols in the grammar specification and as data constructors in the token type.

If an OCaml type t is present, then these tokens are considered to carry a semantic value of type t, otherwise they are considered to carry no semantic value.

If a quoted identifier qidi is present, then it is considered an alias for the terminal symbol uidi. (This feature, known as “token aliases”, is borrowed from Bison.) Throughout the grammar, the quoted identifier qidi is then synonymous with the identifier uidi. For example, if one declares:

%token PLUS "+"

then the quoted identifier "+" stands for the terminal symbol PLUS throughout the grammar. An example of the use of token aliases appears in the directory demos/calc-alias. Token aliases can be used to improve the readability of a grammar. One must keep in mind, however, that they are just syntactic sugar: they are not interpreted in any way by Menhir or conveyed to tools like ocamllex. They could be considered confusing by a reader who mistakenly believes that they are interpreted as string literals.
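Once declared, an alias can be used wherever the token itself could appear, including in priority declarations (a sketch; the rule and semantic actions are illustrative):

```
%token <int> INT
%token PLUS "+"
%left "+"
%%
expr:
| i = INT { i }
| e1 = expr "+" e2 = expr { e1 + e2 }
```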

4.1.4  Priority and associativity

A declaration of one of the following forms:

%nonassoc uid1 … uidn
%left uid1 … uidn
%right uid1 … uidn

assigns both a priority level and an associativity status to the symbols uid1, …, uidn. The priority level assigned to uid1, …, uidn is not defined explicitly: instead, it is defined to be higher than the priority level assigned by the previous %nonassoc, %left, or %right declaration, and lower than that assigned by the next %nonassoc, %left, or %right declaration. The symbols uid1, …, uidn can be tokens (defined elsewhere by a %token declaration) or dummies (not defined anywhere). Both can be referred to as part of %prec annotations. Associativity status and priority levels allow shift/reduce conflicts to be silently resolved (§6).
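For instance, the usual precedence of arithmetic operators can be declared as follows; TIMES receives a higher priority level than PLUS and MINUS simply because its declaration comes later:

```
%left PLUS MINUS
%left TIMES
```

With these declarations, 1 + 2 * 3 is parsed as 1 + (2 * 3), and 1 - 2 - 3 as (1 - 2) - 3, since %left makes these operators left-associative.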

4.1.5  Types

A declaration of the form:

%type < OCaml type > lid1 … lidn

assigns an OCaml type to each of the nonterminal symbols lid1, …, lidn. For start symbols, providing an OCaml type is mandatory, but is usually done as part of the %start declaration. For other symbols, it is optional. Providing type information can improve the quality of OCaml’s type error messages.

A %type declaration may concern not only a nonterminal symbol, such as, say, expression, but also a fully applied parameterized nonterminal symbol, such as list(expression) or separated_list(COMMA, option(expression)).

The types provided as part of %type declarations are copied verbatim to the .ml and .mli files. In contrast, headers (§4.1.1) are copied to the .ml file only. For this reason, the types provided as part of %type declarations must make sense both in the presence and in the absence of these headers. They should typically be fully qualified types.
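For instance, assuming the semantic actions build values of a user-defined module Syntax (an illustrative name), one might write:

```
%type <Syntax.expr> expr
%type <Syntax.expr list> separated_list(COMMA, expr)
```

The fully qualified name Syntax.expr is preferable to a short name made visible by an open directive in a header, since %type declarations are copied to the .mli file, where the header is absent.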

4.1.6  Start symbols

A declaration of the form:

%start [ < OCaml type > ] lid1 … lidn

declares the nonterminal symbols lid1, …, lidn to be start symbols. Each such symbol must be assigned an OCaml type either as part of the %start declaration or via separate %type declarations. Each of lid1, …, lidn becomes the name of a function whose signature is published in the .mli file and that can be used to invoke the parser.
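For instance, the declaration below (where Syntax.expr is an assumed user-defined type) makes main a start symbol:

```
%start <Syntax.expr> main
```

With the monolithic API (§9.1), the generated .mli then contains, roughly, val main : (Lexing.lexbuf -> token) -> Lexing.lexbuf -> Syntax.expr.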

4.1.7  Attribute declarations

Attribute declarations of the form %attribute actual … actual attribute … attribute and % attribute are explained in §13.2.

4.1.8  Extra reductions on error

A declaration of the form:

%on_error_reduce lid1 … lidn

marks the nonterminal symbols lid1, …, lidn as potentially eligible for reduction when an invalid token is found. This may cause one or more extra reduction steps to be performed before the error is detected.

More precisely, this declaration affects the automaton as follows. Let us say that a production lid → … is “reducible on error” if its left-hand symbol lid appears in a %on_error_reduce declaration. After the automaton has been constructed and after any conflicts have been resolved, in every state s, the following algorithm is applied:

  1. Construct the set of all productions that are ready to be reduced in state s and are reducible on error;
  2. Test if one of them, say p, has higher “on-error-reduce-priority” than every other production in this set;
  3. If so, in state s, replace every error action with a reduction of the production p. (In other words, for every terminal symbol t, if the action table says: “in state s, when the next input symbol is t, fail”, then this entry is replaced with: “in state s, when the next input symbol is t, reduce production p”.)

If step 3 above is executed in state s, then an error can never be detected in state s, since all error actions in state s are replaced with reduce actions. Error detection is deferred: at least one reduction takes place before the error is detected. It is a “spurious” reduction: in a canonical LR(1) automaton, it would not take place.

An %on_error_reduce declaration does not affect the language that is accepted by the automaton. It does not affect the location where an error is detected. It is used to control in which state an error is detected. If used wisely, it can make errors easier to report, because they are detected in a state for which it is easier to write an accurate diagnostic message (§11.3).

Like a %type declaration, an %on_error_reduce declaration may concern not only a nonterminal symbol, such as, say, expression, but also a fully applied parameterized nonterminal symbol, such as list(expression) or separated_list(COMMA, option(expression)).

The “on-error-reduce-priority” of a production is that of its left-hand symbol. The “on-error-reduce-priority” of a nonterminal symbol is determined implicitly by the order of %on_error_reduce declarations. In the declaration %on_error_reduce lid1 … lidn, the symbols lid1, …, lidn have the same “on-error-reduce-priority”. They have higher “on-error-reduce-priority” than the symbols listed in previous %on_error_reduce declarations, and lower “on-error-reduce-priority” than those listed in later %on_error_reduce declarations.
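For instance, with the declaration below (where expr is an illustrative symbol name), if an invalid token arrives while an expression is being read, the automaton first reduces as far as expr before the error is signaled, so the diagnostic message can be written for a state in which a complete expression has just been recognized:

```
%on_error_reduce expr
```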

4.2  Rules—old syntax

In its simplest form, a rule begins with the nonterminal symbol lid, followed by a colon character (:), and continues with a sequence of production groups (§4.2.1). Each production group is preceded with a vertical bar character (|); the very first bar is optional. The meaning of the bar is choice: the nonterminal symbol lid develops to either of the production groups. We defer explanations of the keyword %public (§5.1), of the keyword %inline (§5.3), and of the optional formal parameters ( id, …, id ) (§5.2).
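For instance, the following rule (whose token names and semantic actions are illustrative) defines a nonterminal symbol via two production groups:

```
declaration:
| LET x = IDENT EQUAL e = expr
    { Def (x, e) }
| e = expr
    { Eval e }
```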

4.2.1  Production groups

In its simplest form, a production group consists of a single production (§4.2.2), followed by an OCaml semantic action (§4.2.1) and an optional %prec annotation (§4.2.1). A production specifies a sequence of terminal and nonterminal symbols that should be recognized, and optionally binds identifiers to their semantic values.

Semantic actions

A semantic action is a piece of OCaml code that is executed in order to assign a semantic value to the nonterminal symbol with which this production group is associated. A semantic action can refer to the (already computed) semantic values of the terminal or nonterminal symbols that appear in the production via the semantic value identifiers bound by the production.

For compatibility with ocamlyacc, semantic actions can also refer to unnamed semantic values via positional keywords of the form $1, $2, etc. This style is discouraged. (It is in fact forbidden if --no-dollars is turned on.) Furthermore, as a positional keyword of the form $i is internally rewritten as _i, the user should not use identifiers of the form _i.

%prec annotations

An annotation of the form %prec id indicates that the precedence level of the production group is the level assigned to the symbol id via a previous %nonassoc, %left, or %right declaration (§4.1.4). In the absence of a %prec annotation, the precedence level assigned to each production is the level assigned to the rightmost terminal symbol that appears in it. It is undefined if the rightmost terminal symbol has an undefined precedence level or if the production mentions no terminal symbols at all. The precedence level assigned to a production is used when resolving shift/reduce conflicts (§6).
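As an illustration, here is a sketch of the classic treatment of unary minus, using hypothetical tokens MINUS and UMINUS. Without the %prec annotation, the production’s precedence level would be that of MINUS, its rightmost terminal symbol; the annotation gives it the higher level of UMINUS instead:

%left MINUS
%nonassoc UMINUS
 
%%
 
expression:
   |  …
   |  MINUS e = expression %prec UMINUS { - e }

Note that UMINUS need never be produced by the lexer; it is declared only so that its precedence level can be referred to.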

Multiple productions in a group

If multiple productions are present in a single group, then the semantic action and precedence annotation are shared between them. This short-hand effectively allows several productions to share a semantic action and precedence annotation without requiring textual duplication. It is legal only when every production binds exactly the same set of semantic value identifiers and when no positional semantic value keywords ($1, etc.) are used.
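For instance, assuming that LIDENT and UIDENT are tokens that carry a semantic value of type string, the two productions below can legally share one semantic action, because each of them binds exactly the identifier id:

identifier:
   |  id = LIDENT
   |  id = UIDENT { id }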

4.2.2  Productions

A production is a sequence of producers (§4.2.3), optionally followed by a %prec annotation (§4.2.1). If a precedence annotation is present, it applies to this production alone, not to other productions in the production group. It is illegal for a production and its production group to both carry %prec annotations.

4.2.3  Producers

A producer is an actual (§4.2.4), optionally preceded with a binding of a semantic value identifier, of the form lid =. The actual specifies which construction should be recognized and how a semantic value should be computed for that construction. The identifier lid, if present, becomes bound to that semantic value in the semantic action that follows. Otherwise, the semantic value can be referred to via a positional keyword ($1, etc.).

4.2.4  Actuals

In its simplest form, an actual is just a terminal or nonterminal symbol id. If it is a parameterized non-terminal symbol (see §5.2), then it should be applied: id(  actual, …, actual ) .

An actual may be followed with a modifier (?, +, or *). This is explained further on (see §5.2 and Figure 2).

An actual may also be an “anonymous rule”. In that case, one writes just the rule’s right-hand side, which takes the form group | … | group. (This form is allowed only as an argument in an application.) This form is expanded on the fly to a definition of a fresh non-terminal symbol, which is declared %inline. For instance, providing an anonymous rule as an argument to list:

list (  e = expression; SEMICOLON { e }  )

is equivalent to writing this:

list (  expression_SEMICOLON  )

where the non-terminal symbol expression_SEMICOLON is chosen fresh and is defined as follows:

%inline expression_SEMICOLON:
   |  e = expression; SEMICOLON { e }

4.3  Rules—new syntax

Please be warned that the new syntax is considered experimental and is subject to change in the future.

In its simplest form, a rule takes the form let lid := expression. Its left-hand side lid is a nonterminal symbol; its right-hand side is an expression. Such a rule defines an ordinary nonterminal symbol, while the alternate form let lid == expression defines an %inline nonterminal symbol (§5.3), that is, a macro. A rule can be preceded with the keyword %public (§5.1) and can be parameterized with a tuple of formal parameters ( id, …, id ) (§5.2). The various forms of expressions, listed in Figure 1, are:

  • A choice between several expressions, [|] expression1 | … | expressionn. The leading bar is optional.
  • A sequence of two expressions, pattern = expression1 ; expression2. The semantic value produced by expression1 is decomposed according to the pattern pattern. The OCaml variables introduced by pattern may appear in a semantic action that ends the sequence expression2.
  • A sequence ~ = id1 ; expression2, which is sugar for id1 = id1 ; expression2. This is a pun.
  • A sequence expression1 ; expression2, which is sugar for _ = expression1 ; expression2.
  • A symbol id, possibly applied to a tuple of expressions (  expression1, …, expressionn  ). It is worth noting that such an expression can form the end of a sequence: id at the end of a sequence stands for x = id ; { x } for some fresh variable x. Thus, a sequence need not end with a semantic action.
  • An expression followed with ?, +, or *. This is sugar for the previous form: see §5.2 and Figure 2.
  • A semantic action { OCaml code } , possibly followed with a precedence annotation %prec id. This OCaml code can refer to the variables that have been bound earlier in the sequence that this semantic action ends. These include all variables named by the user as well as all variables introduced by a ~ pattern as part of a pun. The notation $i, where i is an integer, is forbidden.
  • A point-free semantic action < OCaml id >, possibly followed with a precedence annotation %prec id. The OCaml identifier id must denote a function or a data constructor. It is applied to a tuple of the variables that have been bound earlier in the sequence that this semantic action ends. Thus, < id > is sugar for { id (x1, …, xn) }, where x1, …, xn are the variables bound earlier. These include all variables named by the user as well as all variables introduced by a ~ pattern.
  • An identity semantic action <>. This is sugar for < identity >, where identity is OCaml’s identity function. Therefore, it is sugar for { (x1, …, xn) }, where x1, …, xn are the variables bound earlier.

The syntax of expressions, as presented in Figure 1, seems more permissive than it really is. In reality, a choice cannot be nested inside a sequence; a sequence cannot be nested in the left-hand side of a sequence; a semantic action cannot appear in the left-hand side of a sequence. (Thus, there is a stratification in three levels: choice expressions, sequence expressions, and atomic expressions, which corresponds roughly to the stratification of rules, productions, and producers in the old syntax.) Furthermore, an expression between parentheses (  expression  ) is not a valid expression. To surround an expression with parentheses, one must write either midrule  (  expression  ) or endrule  (  expression  ) ; see §5.4 and Figure 3.

When a complex expression (e.g., a choice or a sequence) is placed in parentheses, as in id (  expression  ), this is equivalent to using id (  s ) , where the fresh symbol s is declared as a synonym for this expression, via the declaration let s == expression. This idiom is also known as an anonymous rule (§4.2.4).

Examples

As an example of a rule in the new syntax, the parameterized nonterminal symbol option, which is part of Menhir’s standard library (§5.4), can be defined as follows:

let option(x) :=
  | { None }
  | x = x ; { Some x }

Using a pun, it can also be written as follows:

let option(x) :=
  | { None }
  | ~ = x ; { Some x }

Using a pun and a point-free semantic action, it can also be expressed as follows:

let option(x) :=
  | { None }
  | ~ = x ; < Some >

As another example, the parameterized symbol delimited, also part of Menhir’s standard library (§5.4), can be defined in the new syntax as follows:

let delimited(opening, x, closing) ==
  opening ; ~ = x ; closing ; <>

The use of == indicates that this is a macro, i.e., an %inline nonterminal symbol (see §5.3). The identity semantic action <> is here synonymous with { x }.

Other illustrations of the new syntax can be found in the directories demos/calc-new-syntax-dune and demos/calc-ast-dune.

5  Advanced features

5.1  Splitting specifications over multiple files

Modules

Grammar specifications can be split over multiple files. When Menhir is invoked with multiple argument file names, it considers each of these files as a partial grammar specification, and joins these partial specifications in order to obtain a single, complete specification.

This feature is intended to promote a form of modularity. It is hoped that, by splitting large grammar specifications into several “modules”, they can be made more manageable. It is also hoped that this mechanism, in conjunction with parameterization (§5.2), will promote sharing and reuse. It should be noted, however, that this is only a weak form of modularity. Indeed, partial specifications cannot be independently processed (say, checked for conflicts). It is necessary to first join them, so as to form a complete grammar specification, before any kind of grammar analysis can be done.

This mechanism is, in fact, how Menhir’s standard library (§5.4) is made available: even though its name does not appear on the command line, it is automatically joined with the user’s explicitly-provided grammar specifications, making the standard library’s definitions globally visible.

A partial grammar specification, or module, contains declarations and rules, just like a complete one: there is no visible difference. Of course, it can consist of only declarations, or only rules, if the user so chooses. (Don’t forget the mandatory %% keyword that separates declarations and rules. It must be present, even if one of the two sections is empty.)

Private and public nonterminal symbols

It should be noted that joining is not a purely textual process. If two modules happen to define a nonterminal symbol by the same name, then it is considered, by default, that this is an accidental name clash. In that case, each of the two nonterminal symbols is silently renamed so as to avoid the clash. In other words, by default, a nonterminal symbol defined in module A is considered private, and cannot be defined again, or referred to, in module B.

Naturally, it is sometimes desirable to define a nonterminal symbol N in module A and to refer to it in module B. This is permitted if N is public, that is, if either its definition carries the keyword %public or N is declared to be a start symbol. A public nonterminal symbol is never renamed, so it can be referred to by modules other than its defining module.

In fact, it is permitted to split the definition of a public nonterminal symbol, over multiple modules and/or within a single module. That is, a public nonterminal symbol N can have multiple definitions, within one module and/or in distinct modules. All of these definitions are joined using the choice (|) operator. For instance, in the grammar of a programming language, the definition of the nonterminal symbol expression could be split into multiple modules, where one module groups the expression forms that have to do with arithmetic, one module groups those that concern function definitions and function calls, one module groups those that concern object definitions and method calls, and so on.

Tokens aside

Another use of modularity consists in placing all %token declarations in one module, and the actual grammar specification in another module. The module that contains the token definitions can then be shared, making it easier to define multiple parsers that accept the same type of tokens. (On this topic, see demos/calc-two.)
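A minimal sketch of this arrangement, with two hypothetical files tokens.mly and parser.mly that are both supplied to Menhir on the command line:

tokens.mly:
 
%token < int > INT
%token PLUS EOF
 
%%
 
parser.mly:
 
%start < int > main
 
%%
 
main:
   |  i = INT EOF { i }

Note that tokens.mly contains no rules, yet the %% separator is still required.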

5.2  Parameterizing rules

A rule (that is, the definition of a nonterminal symbol) can be parameterized over an arbitrary number of symbols, which are referred to as formal parameters.

Example

For instance, here is the definition of the parameterized nonterminal symbol option, taken from the standard library (§5.4):

%public option(X):
   |  { None }
   |  x = X { Some x }

This definition states that option(X) expands to either the empty string, producing the semantic value None, or to the string X, producing the semantic value Some x, where x is the semantic value of X. In this definition, the symbol X is abstract: it stands for an arbitrary terminal or nonterminal symbol. The definition is made public, so option can be referred to within client modules.

A client who wishes to use option simply refers to it, together with an actual parameter – a symbol that is intended to replace X. For instance, here is how one might define a sequence of declarations, preceded with optional commas:

declarations:
   |  { [] }
   |  ds = declarations; option(COMMA); d = declaration { d :: ds }

This definition states that declarations expands either to the empty string or to declarations followed by an optional comma followed by declaration. (Here, COMMA is presumably a terminal symbol.) When this rule is encountered, the definition of option is instantiated: that is, a copy of the definition, where COMMA replaces X, is produced. Things behave exactly as if one had written:

optional_comma:
   |  { None }
   |  x = COMMA { Some x }
declarations:
   |  { [] }
   |  ds = declarations; optional_comma; d = declaration { d :: ds }

Note that, even though COMMA presumably has been declared as a token with no semantic value, writing x = COMMA is legal, and binds x to the unit value. This design choice ensures that the definition of option makes sense regardless of the nature of X: that is, X can be instantiated with a terminal symbol, with or without a semantic value, or with a nonterminal symbol.

Parameterization in general

In general, the definition of a nonterminal symbol N can be parameterized with an arbitrary number of formal parameters. When N is referred to within a production, it must be applied to the same number of actuals. In general, an actual is:

  • either a single symbol, which can be a terminal symbol, a nonterminal symbol, or a formal parameter;
  • or an application of such a symbol to a number of actuals.

For instance, here is a rule whose single production consists of a single producer, which contains several, nested actuals. (This example is discussed again in §5.4.)

plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

actual?  is syntactic sugar for option(actual)
actual+  is syntactic sugar for nonempty_list(actual)
actual*  is syntactic sugar for list(actual)
Figure 2: Syntactic sugar for simulating regular expressions, also known as EBNF

Applications of the parameterized nonterminal symbols option, nonempty_list, and list, which are defined in the standard library (§5.4), can be written using a familiar, regular-expression like syntax (Figure 2).
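For instance, assuming a hypothetical nonterminal symbol declaration, the rule:

declarations:
   |  ds = declaration* { ds }

is sugar for:

declarations:
   |  ds = list(declaration) { ds }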

Higher-order parameters

A formal parameter can itself expect parameters. For instance, here is a rule that defines the syntax of procedures in an imaginary programming language:

procedure(list):
   |  PROCEDURE ID list(formal) SEMICOLON block SEMICOLON {}

This rule states that the token ID, which represents the name of the procedure, should be followed with a list of formal parameters. (The definitions of the nonterminal symbols formal and block are not shown.) However, because list is a formal parameter, as opposed to a concrete nonterminal symbol defined elsewhere, this definition does not specify how the list is laid out: which token, if any, is used to separate, or terminate, list elements? is the list allowed to be empty? and so on. A more concrete notion of procedure is obtained by instantiating the formal parameter list: for instance, procedure(plist), where plist is the parameterized nonterminal symbol defined earlier, is a valid application.
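Continuing this hypothetical example, a concrete procedure declaration is then obtained by applying procedure to plist, or to any other symbol that expects exactly one parameter:

declaration:
   |  procedure(plist) {}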

Consistency

Definitions and uses of parameterized nonterminal symbols are checked for consistency before they are expanded away. In short, it is checked that, wherever a nonterminal symbol is used, it is supplied with actual arguments in appropriate number and of appropriate nature. This guarantees that expansion of parameterized definitions terminates and produces a well-formed grammar as its outcome.

5.3  Inlining

It is well-known that the following grammar of arithmetic expressions does not work as expected: that is, in spite of the priority declarations, it has shift/reduce conflicts.

%token < int > INT
%token PLUS TIMES
%left PLUS
%left TIMES
 
%%
 
expression:
   |  i = INT { i }
   |  e = expression; o = op; f = expression { o e f }
op:
   |  PLUS { ( + ) }
   |  TIMES { ( * ) }

The trouble is, the precedence level of the production expression → expression op expression is undefined, and there is no sensible way of defining it via a %prec declaration, since the desired level really depends upon the symbol that was recognized by op: was it PLUS or TIMES?

The standard workaround is to abandon the definition of op as a separate nonterminal symbol, and to inline its definition into the definition of expression, like this:

expression:
   |  i = INT { i }
   |  e = expression; PLUS; f = expression { e + f }
   |  e = expression; TIMES; f = expression { e * f }

This avoids the shift/reduce conflict, but gives up some of the original specification’s structure, which, in realistic situations, can be damageable. Fortunately, Menhir offers a way of avoiding the conflict without manually transforming the grammar, by declaring that the nonterminal symbol op should be inlined:

expression:
   |  i = INT { i }
   |  e = expression; o = op; f = expression { o e f }
%inline op:
   |  PLUS { ( + ) }
   |  TIMES { ( * ) }

The %inline keyword causes all references to op to be replaced with its definition. In this example, the definition of op involves two productions, one that expands to PLUS and one that expands to TIMES, so every production that refers to op is effectively turned into two productions, one that refers to PLUS and one that refers to TIMES. After inlining, op disappears and expression has three productions: that is, the result of inlining is exactly the manual workaround shown above.

In some situations, inlining can also help recover a slight efficiency margin. For instance, the definition:

%inline plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

effectively makes plist(X) an alias for the right-hand side loption(…). Without the %inline keyword, the language recognized by the grammar would be the same, but the LR automaton would probably have one more state and would perform one more reduction at run time.

The %inline keyword does not affect the computation of positions (§7). The same positions are computed, regardless of where %inline keywords are placed.

If the semantic actions have side effects, the %inline keyword can affect the order in which these side effects take place. In the example of op and expression above, if for some reason the semantic action associated with op has a side effect (such as updating a global variable, or printing a message), then, by inlining op, we delay this side effect, which takes place after the second operand has been recognized, whereas in the absence of inlining it takes place as soon as the operator has been recognized.

5.4  The standard library


Name                             Recognizes                          Produces                      Comment
 
endrule(X)                       X                                   α, if X : α                   (inlined)
midrule(X)                       X                                   α, if X : α
 
option(X)                        є | X                               α option, if X : α            (also X?)
ioption(X)                       є | X                               α option, if X : α            (inlined)
boption(X)                       є | X                               bool
loption(X)                       є | X                               α list, if X : α list
 
pair(X, Y)                       X Y                                 α×β, if X : α and Y : β
separated_pair(X, sep, Y)        X sep Y                             α×β, if X : α and Y : β
preceded(opening, X)             opening X                           α, if X : α
terminated(X, closing)           X closing                           α, if X : α
delimited(opening, X, closing)   opening X closing                   α, if X : α
 
list(X)                          a possibly empty sequence of X’s    α list, if X : α              (also X*)
nonempty_list(X)                 a nonempty sequence of X’s          α list, if X : α              (also X+)
separated_list(sep, X)           a possibly empty sequence of X’s    α list, if X : α
                                 separated with sep’s
separated_nonempty_list(sep, X)  a nonempty sequence of X’s          α list, if X : α
                                 separated with sep’s
 
rev(X)                           X                                   α list, if X : α list         (inlined)
flatten(X)                       X                                   α list, if X : α list list    (inlined)
append(X, Y)                     X Y                                 α list, if X, Y : α list      (inlined)
Figure 3: Summary of the standard library; see standard.mly for details

Once equipped with a rudimentary module system (§5.1), parameterization (§5.2), and inlining (§5.3), it is straightforward to propose a collection of commonly used definitions, such as options, sequences, lists, and so on. This standard library is joined, by default, with every grammar specification. A summary of the nonterminal symbols offered by the standard library appears in Figure 3. See also the short-hands documented in Figure 2.

By relying on the standard library, a client module can concisely define more elaborate notions. For instance, the following rule:

%inline plist(X):
   |  xs = loption(delimited(LPAREN, separated_nonempty_list(COMMA, X), RPAREN)) { xs }

causes plist(X) to recognize a list of X’s, where the empty list is represented by the empty string, and a non-empty list is delimited with parentheses and comma-separated.

The standard library is stored in a file named standard.mly, which is embedded inside Menhir when it is built. The command line switch --no-stdlib instructs Menhir to not load the standard library.

The meaning of the symbols defined in the standard library (Figure 3) should be clear in most cases. Yet, the symbols endrule(X) and midrule(X) deserve an explanation. Both take an argument X, which typically will be instantiated with an anonymous rule (§4.2.4). Both are defined as a synonym for X. In both cases, this allows placing an anonymous subrule in the middle of a rule.

For instance, the following is a well-formed production:

  cat    endrule(dog    { OCaml code1 })    cow    { OCaml code2 }

This production consists of three producers, namely cat and endrule(dog { OCaml code1 }) and cow, and a semantic action { OCaml code2 }. Because endrule(X) is declared as an %inline synonym for X, the expansion of anonymous rules (§4.2.4), followed with the expansion of %inline symbols (§5.3), transforms the above production into the following:

  cat    dog    cow    { OCaml code1; OCaml code2 }

Note that OCaml code1 moves to the end of the rule, which means that this code is executed only after cat, dog and cow have been recognized. In this example, the use of endrule is rather pointless, as the expanded code is more concise and clearer than the original code. Still, endrule can be useful when its actual argument is an anonymous rule with multiple branches.

midrule is used in exactly the same way as endrule, but its expansion is different. For instance, the following is a well-formed production:

  cat    midrule({ OCaml code1 })    cow    { OCaml code2 }

(There is no dog in this example; this is intentional.) Because midrule(X) is a synonym for X, but is not declared %inline, the expansion of anonymous rules (§4.2.4), followed with the expansion of %inline symbols (§5.3), transforms the above production into the following:

  cat    xxx    cow    { OCaml code2 }

where the fresh nonterminal symbol xxx is separately defined by the rule xxx: { OCaml code1 } . Thus, xxx recognizes the empty string, and as soon as it is recognized, OCaml code1 is executed. This is known as a “mid-rule action”.
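For instance, a mid-rule action can be used to perform a side effect as soon as a prefix of a production has been recognized. In this sketch (where cat and cow again stand for arbitrary symbols), the message is printed after cat has been recognized, but before cow is:

   cat    midrule({ print_endline "cat recognized" })    cow    { OCaml code }

Without midrule, the side effect could only take place at the end of the production, after cow has been recognized.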

6  Conflicts

When a shift/reduce or reduce/reduce conflict is detected, it is classified as either benign, if it can be resolved by consulting user-supplied precedence declarations, or severe, if it cannot. Benign conflicts are not reported. Severe conflicts are reported and, if the --explain switch is on, explained.

6.1  When is a conflict benign?

A shift/reduce conflict involves a single token (the one that one might wish to shift) and one or more productions (those that one might wish to reduce). When such a conflict is detected, the precedence level (§4.1.4, §4.2.1) of these entities are looked up and compared as follows:

  1. if only one production is involved, and if it has higher priority than the token, then the conflict is resolved in favor of reduction.
  2. if only one production is involved, and if it has the same priority as the token, then the associativity status of the token is looked up:
    1. if the token was declared nonassociative, then the conflict is resolved in favor of neither action, that is, a syntax error will be signaled if this token shows up when this production is about to be reduced;
    2. if the token was declared left-associative, then the conflict is resolved in favor of reduction;
    3. if the token was declared right-associative, then the conflict is resolved in favor of shifting.
  3. if multiple productions are involved, and if, considered one by one, they all cause the conflict to be resolved in the same way (that is, either in favor of shifting, or in favor of neither), then the conflict is resolved in that way.

In either of these cases, the conflict is considered benign. Otherwise, it is considered severe. Note that a reduce/reduce conflict is always considered severe, unless it happens to be subsumed by a benign multi-way shift/reduce conflict (item 3 above).

6.2  How are severe conflicts explained?

When the --dump switch is on, a description of the automaton is written to the .automaton file. Severe conflicts are shown as part of this description. Fortunately, there is also a way of understanding conflicts in terms of the grammar, rather than in terms of the automaton. When the --explain switch is on, a textual explanation is written to the .conflicts file.

Not all conflicts are explained in this file: instead, only one conflict per automaton state is explained. This is done partly in the interest of brevity, but also because Pager’s algorithm can create artificial conflicts in a state that already contains a true LR(1) conflict; thus, one cannot hope in general to explain all of the conflicts that appear in the automaton. As a result of this policy, once all conflicts explained in the .conflicts file have been fixed, one might need to run Menhir again to produce yet more conflict explanations.


%token IF THEN ELSE
%start < expression > expression
 
%%
 
expression:
   |  …
   |  IF b = expression THEN e = expression {}
   |  IF b = expression THEN e = expression ELSE f = expression {}
   |  …
Figure 4: Basic example of a shift/reduce conflict

How the conflict state is reached

Figure 4 shows a grammar specification with a typical shift/reduce conflict. When this specification is analyzed, the conflict is detected, and an explanation is written to the .conflicts file. The explanation first indicates in which state the conflict lies by showing how that state is reached. Here, it is reached after recognizing the following string of terminal and nonterminal symbols—the conflict string:

IF expression THEN IF expression THEN expression

Allowing the conflict string to contain both nonterminal and terminal symbols usually makes it shorter and more readable. If desired, a conflict string composed purely of terminal symbols could be obtained by replacing each occurrence of a nonterminal symbol N with an arbitrary N-sentence.

The conflict string can be thought of as a path that leads from one of the automaton’s start states to the conflict state. When multiple such paths exist, a shortest one is displayed. Nevertheless, it may sometimes be quite long. In that case, artificially (and temporarily) declaring some existing nonterminal symbols to be start symbols has the effect of adding new start states to the automaton and can help produce shorter conflict strings. Here, expression was declared to be a start symbol, which is why the conflict string is quite short.

In addition to the conflict string, the .conflicts file also states that the conflict token is ELSE. That is, when the automaton has recognized the conflict string and when the lookahead token (the next token on the input stream) is ELSE, a conflict arises. A conflict corresponds to a choice: the automaton is faced with several possible actions, and does not know which one should be taken. This indicates that the grammar is not LR(1). The grammar may or may not be inherently ambiguous.

In our example, the conflict string and the conflict token are enough to understand why there is a conflict: when two IF constructs are nested, it is ambiguous which of the two constructs the ELSE branch should be associated with. Nevertheless, the .conflicts file provides further information: it explicitly shows that there exists a conflict, by proving that two distinct actions are possible. Here, one of these actions consists in shifting, while the other consists in reducing: this is a shift/reduce conflict.

A proof takes the form of a partial derivation tree whose fringe begins with the conflict string, followed by the conflict token. A derivation tree is a tree whose nodes are labeled with symbols. The root node carries a start symbol. A node that carries a terminal symbol is considered a leaf, and has no children. A node that carries a nonterminal symbol N either is considered a leaf, and has no children; or is not considered a leaf, and has n children, where n ≥ 0, labeled x1, …, xn, where N → x1 … xn is a production. The fringe of a partial derivation tree is the string of terminal and nonterminal symbols carried by the tree’s leaves. A string of terminal and nonterminal symbols that is the fringe of some partial derivation tree is a sentential form.

Why shifting is legal


Figure 5: A partial derivation tree that justifies shifting


expression
IF expression THEN expression
IF expression THEN expression . ELSE expression
Figure 6: A textual version of the tree in Figure 5

In our example, the proof that shifting is possible is the derivation tree shown in Figures 5 and 6. At the root of the tree is the grammar’s start symbol, expression. This symbol develops into the string IF expression THEN expression, which forms the tree’s second level. The second occurrence of expression in that string develops into IF expression THEN expression ELSE expression, which forms the tree’s last level. The tree’s fringe, a sentential form, is the string IF expression THEN IF expression THEN expression ELSE expression. As announced earlier, it begins with the conflict string IF expression THEN IF expression THEN expression, followed with the conflict token ELSE.

In Figure 6, the end of the conflict string is materialized with a dot. Note that this dot does not occupy the rightmost position in the tree’s last level. In other words, the conflict token (ELSE) itself occurs on the tree’s last level. In practical terms, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to shift that token.

Why reducing is legal


Figure 7: A partial derivation tree that justifies reducing


expression
IF expression THEN expression ELSE expression       // lookahead token appears
IF expression THEN expression .
Figure 8: A textual version of the tree in Figure 7

In our example, the proof that reducing is possible is the derivation tree shown in Figures 7 and 8. Again, the sentential form found at the fringe of the tree begins with the conflict string, followed with the conflict token.

Again, in Figure 8, the end of the conflict string is materialized with a dot. Note that, this time, the dot occupies the rightmost position in the tree’s last level. In other words, the conflict token (ELSE) appeared on an earlier level (here, on the second level). This fact is emphasized by the comment // lookahead token appears found at the second level. In practical terms, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to reduce the production that corresponds to the tree’s last level—here, the production is expressionIF expression THEN expression.

An example of a more complex derivation tree

Figures 9 and 10 show a partial derivation tree that justifies reduction in a more complex situation. (This derivation tree is relative to a grammar that is not shown.) Here, the conflict string is DATA UIDENT EQUALS UIDENT; the conflict token is LIDENT. It is quite clear that the fringe of the tree begins with the conflict string. However, in this case, the fringe does not explicitly exhibit the conflict token. Let us examine the tree more closely and answer the question: following UIDENT, what’s the next terminal symbol on the fringe?


Figure 9: A partial derivation tree that justifies reducing


decls
decl opt_semi decls       // lookahead token appears because opt_semi can vanish and decls can begin with LIDENT
DATA UIDENT EQUALS tycon_expr       // lookahead token is inherited
tycon_item       // lookahead token is inherited
UIDENT opt_type_exprs       // lookahead token is inherited
.
Figure 10: A textual version of the tree in Figure 9

First, note that opt_type_exprs is not a leaf node, even though it has no children. The grammar contains the production opt_type_exprs → ε: the nonterminal symbol opt_type_exprs develops to the empty string. (This is made clear in Figure 10, where a single dot appears immediately below opt_type_exprs.) Thus, opt_type_exprs is not part of the fringe.

Next, note that opt_type_exprs is the rightmost symbol within its level. Thus, in order to find the next symbol on the fringe, we have to look up one level. This is the meaning of the comment // lookahead token is inherited. Similarly, tycon_item and tycon_expr appear rightmost within their level, so we again have to look further up.

This brings us back to the tree’s second level. There, decl is not the rightmost symbol: next to it, we find opt_semi and decls. Does this mean that opt_semi is the next symbol on the fringe? Yes and no. opt_semi is a nonterminal symbol, but we are really interested in finding out what the next terminal symbol on the fringe could be. The partial derivation tree shown in Figures 9 and 10 does not explicitly answer this question. In order to answer it, we need to know more about opt_semi and decls.

Here, opt_semi stands (as one might have guessed) for an optional semicolon, so the grammar contains a production opt_semi → ε. This is indicated by the comment // opt_semi can vanish. (Nonterminal symbols that generate ε are also said to be nullable.) Thus, one could choose to turn this partial derivation tree into a larger one by developing opt_semi into ε, making it a non-leaf node. That would yield a new partial derivation tree where the next symbol on the fringe, following UIDENT, is decls.

Now, what about decls? Again, it is a nonterminal symbol, and we are really interested in finding out what the next terminal symbol on the fringe could be. Again, we need to imagine how this partial derivation tree could be turned into a larger one by developing decls. Here, the grammar happens to contain a production of the form decls → LIDENT … This is indicated by the comment // decls can begin with LIDENT. Thus, by developing decls, it is possible to construct a partial derivation tree where the next symbol on the fringe, following UIDENT, is LIDENT. This is precisely the conflict token.

To sum up, there exists a partial derivation tree whose fringe begins with the conflict string, followed with the conflict token. Furthermore, in that derivation tree, the dot occupies the rightmost position in the last level. As in our previous example, this means that, after the automaton has recognized the conflict string and peeked at the conflict token, it makes sense for it to reduce the production that corresponds to the tree’s last level—here, the production is opt_type_exprs → ε.

Greatest common factor among derivation trees

Understanding conflicts requires comparing two (or more) derivation trees. It is frequent for these trees to exhibit a common factor, that is, to exhibit identical structure near the top of the tree, and to differ only below a specific node. Manual identification of that node can be tedious, so Menhir performs this work automatically. When explaining an n-way conflict, it first displays the greatest common factor of the n derivation trees. A question mark symbol (?) is used to identify the node where the trees begin to differ. Then, Menhir displays each of the n derivation trees, without their common factor – that is, it displays n sub-trees that actually begin to differ at the root. This should make visual comparisons significantly easier.

6.3  How are severe conflicts resolved in the end?

It is unspecified how severe conflicts are resolved. Menhir attempts to mimic ocamlyacc’s specification, that is, to resolve shift/reduce conflicts in favor of shifting, and to resolve reduce/reduce conflicts in favor of the production that textually appears earliest in the grammar specification. However, this specification is inconsistent in case of three-way conflicts, that is, conflicts that simultaneously involve a shift action and several reduction actions. Furthermore, textual precedence can be undefined when the grammar specification is split over multiple modules. In short, Menhir’s philosophy is that

severe conflicts should not be tolerated,

so you should not care how they are resolved.

6.4  End-of-stream conflicts

Menhir’s treatment of the end of the token stream is (believed to be) fully compatible with ocamlyacc’s. Yet, Menhir attempts to be more user-friendly by warning about a class of so-called “end-of-stream conflicts”.

How the end of stream is handled

In many textbooks on parsing, it is assumed that the lexical analyzer, which produces the token stream, produces a special token, written #, to signal that the end of the token stream has been reached. A parser generator can take advantage of this by transforming the grammar: for each start symbol S in the original grammar, a new start symbol S′ is defined, together with the production S′ → S #. The symbol S is no longer a start symbol in the new grammar. This means that the parser will accept a sentence derived from S only if it is immediately followed by the end of the token stream.

This approach has the advantage of simplicity. However, ocamlyacc and Menhir do not follow it, for several reasons. Perhaps the most convincing one is that it is not flexible enough: sometimes, it is desirable to recognize a sentence derived from S, without requiring that it be followed by the end of the token stream: this is the case, for instance, when reading commands, one by one, on the standard input channel. In that case, there is no end of stream: the token stream is conceptually infinite. Furthermore, after a command has been recognized, we do not wish to examine the next token, because doing so might cause the program to block, waiting for more input.

In short, ocamlyacc and Menhir’s approach is to recognize a sentence derived from S and to not look, if possible, at what follows. However, this is possible only if the definition of S is such that the end of an S-sentence is identifiable without knowledge of the lookahead token. When the definition of S does not satisfy this criterion, an end-of-stream conflict arises: after a potential S-sentence has been read, there can be a tension between consulting the next token, in order to determine whether the sentence is continued, and not consulting the next token, because the sentence might be over and whatever follows should not be read. Menhir warns about end-of-stream conflicts, whereas ocamlyacc does not.

A definition of end-of-stream conflicts

Technically, Menhir proceeds as follows. A # symbol is introduced. It is, however, only a pseudo-token: it is never produced by the lexical analyzer. For each start symbol S in the original grammar, a new start symbol S′ is defined, together with the production S′ → S. The corresponding start state of the LR(1) automaton is composed of the LR(1) item S′ → . S [ # ]. That is, the pseudo-token # initially appears in the lookahead set, indicating that we expect to be done after recognizing an S-sentence. During the construction of the LR(1) automaton, this lookahead set is inherited by other items, with the effect that, in the end, the automaton has:

  • shift actions only on physical tokens; and
  • reduce actions either on physical tokens or on the pseudo-token #.

A state of the automaton has a reduce action on # if, in that state, an S-sentence has been read, so that the job is potentially finished. A state has a shift or reduce action on a physical token if, in that state, more tokens potentially need to be read before an S-sentence is recognized. If a state has a reduce action on #, then that action should be taken without requesting the next token from the lexical analyzer. On the other hand, if a state has a shift or reduce action on a physical token, then the lookahead token must be consulted in order to determine if that action should be taken.


%token < int > INT
%token PLUS TIMES
%left PLUS
%left TIMES
%start < int > expr
%%
expr:
   |  i = INT { i }
   |  e1 = expr PLUS e2 = expr { e1 + e2 }
   |  e1 = expr TIMES e2 = expr { e1 * e2 }
Figure 11: Basic example of an end-of-stream conflict


State 6:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr PLUS expr . [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On # PLUS reduce production expr -> expr PLUS expr

State 4:
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
expr -> expr TIMES expr . [ # TIMES PLUS ]
-- On # TIMES PLUS reduce production expr -> expr TIMES expr

State 2:
expr' -> expr . [ # ]
expr -> expr . PLUS expr [ # TIMES PLUS ]
expr -> expr . TIMES expr [ # TIMES PLUS ]
-- On TIMES shift to state 3
-- On PLUS shift to state 5
-- On # accept expr
Figure 12: Part of an LR automaton for the grammar in Figure 11


%token END
%start < int > main     // instead of expr
%%
main:
   |  e = expr END { e }
expr:
   |  …
Figure 13: Fixing the grammar specification in Figure 11

An end-of-stream conflict arises when a state has distinct actions on # and on at least one physical token. In short, this means that the end of an S-sentence cannot be unambiguously identified without examining one extra token. Menhir’s default behavior, in that case, is to suppress the action on #, so that more input is always requested.

Example

Figure 11 shows a grammar that has end-of-stream conflicts. When this grammar is processed, Menhir warns about these conflicts, and further warns that expr is never accepted. Let us explain.

Part of the corresponding automaton, as described in the .automaton file, is shown in Figure 12. Explanations at the end of the .automaton file (not shown) point out that states 6 and 2 have an end-of-stream conflict. Indeed, both states have distinct actions on # and on the physical token TIMES. It is interesting to note that, even though state 4 has actions on # and on physical tokens, it does not have an end-of-stream conflict. This is because the action taken in state 4 is always to reduce the production expr → expr TIMES expr, regardless of the lookahead token.

By default, Menhir produces a parser where end-of-stream conflicts are resolved in favor of looking ahead: that is, the problematic reduce actions on # are suppressed. This means, in particular, that the accept action in state 2, which corresponds to reducing the production expr′ → expr, is suppressed. This explains why the symbol expr is never accepted: because expressions do not have an unambiguous end marker, the parser will always request one more token and will never stop.

In order to avoid this end-of-stream conflict, the standard solution is to introduce a new token, say END, and to use it as an end marker for expressions. The END token could be generated by the lexical analyzer when it encounters the actual end of stream, or it could correspond to a piece of concrete syntax, say, a line feed character, a semicolon, or an end keyword. The solution is shown in Figure 13.

7  Positions

When an ocamllex-generated lexical analyzer produces a token, it updates two fields, named lex_start_p and lex_curr_p, in its environment record, whose type is Lexing.lexbuf. Each of these fields holds a value of type Lexing.position. Together, they represent the token’s start and end positions within the text that is being scanned. These fields are read by Menhir after calling the lexical analyzer, so it is the lexical analyzer’s responsibility to correctly set these fields.

A position consists mainly of an offset (the position’s pos_cnum field), but also holds information about the current file name, the current line number, and the current offset within the current line. (Not all ocamllex-generated analyzers keep this extra information up to date. This must be explicitly programmed by the author of the lexical analyzer.)
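For instance, line information is typically kept up to date by calling Lexing.new_line whenever the lexer consumes a newline character. The following ocamllex fragment is only a sketch; the token names (INT, PLUS, EOF) are hypothetical:

```ocaml
(* Sketch of an ocamllex rule that keeps pos_lnum and pos_bol accurate.
   Lexing.new_line must be called on every newline; otherwise positions
   carry a correct offset but a stale line number. *)
rule token = parse
| '\n'            { Lexing.new_line lexbuf; token lexbuf }
| [' ' '\t']+     { token lexbuf }
| '+'             { PLUS }
| ['0'-'9']+ as s { INT (int_of_string s) }
| eof             { EOF }
```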


$startpos  start position of the first symbol in the production’s right-hand side, if there is one;
    end position of the most recently parsed symbol, otherwise
$endpos  end position of the last symbol in the production’s right-hand side, if there is one;
    end position of the most recently parsed symbol, otherwise
$startpos( $i | id )  start position of the symbol named $i or id
$endpos( $i | id )  end position of the symbol named $i or id
$symbolstartpos   start position of the leftmost symbol id such that $startpos(id) !=  $endpos(id);
    if there is no such symbol, $endpos
$startofs, $endofs, $startofs( $i | id ), $endofs( $i | id ), $symbolstartofs
    same as the corresponding position keywords above, but produce an integer offset instead of a position
$loc  stands for the pair ($startpos, $endpos)
$loc( id )  stands for the pair ($startpos( id ), $endpos( id ))
$sloc  stands for the pair ($symbolstartpos, $endpos)
Figure 14: Position-related keywords


symbol_start_pos()      $symbolstartpos
symbol_end_pos()        $endpos
rhs_start_pos i         $startpos($i)      (1 ≤ i ≤ n)
rhs_end_pos i           $endpos($i)        (1 ≤ i ≤ n)
symbol_start()          $symbolstartofs
symbol_end()            $endofs
rhs_start i             $startofs($i)      (1 ≤ i ≤ n)
rhs_end i               $endofs($i)        (1 ≤ i ≤ n)
Figure 15: Translating position-related incantations from ocamlyacc to Menhir

This mechanism allows associating pairs of positions with terminal symbols. If desired, Menhir automatically extends it to nonterminal symbols as well. That is, it offers a mechanism for associating pairs of positions with terminal or nonterminal symbols. This is done by making a set of keywords available to semantic actions (Figure 14). These keywords are not available outside of a semantic action: in particular, they cannot be used within an OCaml header.
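As an illustration, here is a sketch of a rule that records source locations in an abstract syntax tree. The AST constructors Binop and Int, and the idea of storing a position pair in each node, are hypothetical; the keywords $loc and $loc(i) are as defined in Figure 14:

```ocaml
(* Hypothetical AST constructors; $loc stands for ($startpos, $endpos),
   and $loc(i) for ($startpos(i), $endpos(i)). *)
expr:
| e1 = expr PLUS e2 = expr
    { Binop ($loc, e1, e2) }
| i = INT
    { Int ($loc(i), i) }
```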

OCaml’s standard library module Parsing is deprecated. The functions that it offers can be called, but will return dummy positions.

We remark that, if the current production has an empty right-hand side, then $startpos and $endpos are equal, and (by convention) are the end position of the most recently parsed symbol (that is, the symbol that happens to be on top of the automaton’s stack when this production is reduced). If the current production has a nonempty right-hand side, then $startpos is the same as $startpos($1) and $endpos is the same as $endpos($n), where n is the length of the right-hand side.

More generally, if the current production has matched a sentence of length zero, then $startpos and $endpos will be equal, and conversely.

The position $startpos is sometimes “further towards the left” than one would like. For example, in the following production:

  declaration: modifier? variable { $startpos }

the keyword $startpos represents the start position of the optional modifier modifier?. If this modifier turns out to be absent, then its start position is (by definition) the end position of the most recently parsed symbol. This may not be what is desired: perhaps the user would prefer in this case to use the start position of the symbol variable. This is achieved by using $symbolstartpos instead of $startpos. By definition, $symbolstartpos is the start position of the leftmost symbol whose start and end positions differ. In this example, the computation of $symbolstartpos skips the absent modifier, whose start and end positions coincide, and returns the start position of the symbol variable (assuming this symbol has distinct start and end positions).

There is no keyword $symbolendpos. Indeed, the problem with $startpos is due to the asymmetry in the definition of $startpos and $endpos in the case of an empty right-hand side, and does not affect $endpos.

The positions computed by Menhir are exactly the same as those computed by ocamlyacc. More precisely, Figure 15 sums up how to translate a call to the Parsing module, as used in an ocamlyacc grammar, to a Menhir keyword.

We note that Menhir’s $startpos does not appear in the right-hand column in Figure 15. In other words, Menhir’s $startpos does not correspond exactly to any of the ocamlyacc function calls. An exact ocamlyacc equivalent of $startpos is rhs_start_pos 1 if the current production has a nonempty right-hand side and symbol_start_pos() if it has an empty right-hand side.

Finally, we remark that Menhir’s %inline keyword (§5.3) does not affect the computation of positions. The same positions are computed, regardless of where %inline keywords are placed.

8  Using Menhir as an interpreter

When --interpret is set, Menhir no longer behaves as a compiler. Instead, it acts as an interpreter. That is, it repeatedly:

  • reads a sentence off the standard input channel;
  • parses this sentence, according to the grammar;
  • displays an outcome.

This process stops when the end of the input channel is reached.

8.1  Sentences

The syntax of sentences is as follows:

sentence ::= [ lid : ] uid … uid \n

Less formally, a sentence is a sequence of zero or more terminal symbols (uid’s), separated with whitespace, terminated with a newline character, and optionally preceded with a non-terminal start symbol (lid). This non-terminal symbol can be omitted if, and only if, the grammar only has one start symbol.

For instance, here are four valid sentences for the grammar of arithmetic expressions found in the directory demos/calc:

main: INT PLUS INT EOL
INT PLUS INT
INT PLUS PLUS INT EOL
INT PLUS PLUS

In the first sentence, the start symbol main was explicitly specified. In the other sentences, it was omitted, which is permitted, because this grammar has no start symbol other than main. The first sentence is a stream of four terminal symbols, namely INT, PLUS, INT, and EOL. These terminal symbols must be provided under their symbolic names. Writing, say, “12+32\n” instead of INT PLUS INT EOL is not permitted. Menhir would not be able to make sense of such a concrete notation, since it does not have a lexer for it.

8.2  Outcomes

As soon as Menhir is able to read a complete sentence off the standard input channel (that is, as soon as it finds the newline character that ends the sentence), it parses the sentence according to whichever grammar was specified on the command line, and displays an outcome.

An outcome is one of the following:

  • ACCEPT: a prefix of the sentence was successfully parsed; a parser generated by Menhir would successfully stop and produce a semantic value;
  • OVERSHOOT: the end of the sentence was reached before it could be accepted; a parser generated by Menhir would request a non-existent “next token” from the lexer, causing it to fail or block;
  • REJECT: the sentence was not accepted; a parser generated by Menhir would raise the exception Error.

When --interpret-show-cst is set, each ACCEPT outcome is followed with a concrete syntax tree. A concrete syntax tree is either a leaf or a node. A leaf is either a terminal symbol or error. A node is annotated with a non-terminal symbol, and carries a sequence of immediate descendants that correspond to a valid expansion of this non-terminal symbol. Menhir’s notation for concrete syntax trees is as follows:

cst ::= uid
  error
  [ lid : cst … cst ]

For instance, if one wished to parse the example sentences of §8.1 using the grammar of arithmetic expressions in demos/calc, one could invoke Menhir as follows:

$ menhir --interpret --interpret-show-cst demos/calc/parser.mly
main: INT PLUS INT EOL
ACCEPT
[main: [expr: [expr: INT] PLUS [expr: INT]] EOL]
INT PLUS INT
OVERSHOOT
INT PLUS PLUS INT EOL
REJECT
INT PLUS PLUS
REJECT

(Here, Menhir’s input—the sentences provided by the user on the standard input channel— is shown intermixed with Menhir’s output—the outcomes printed by Menhir on the standard output channel.) The first sentence is valid, and accepted; a concrete syntax tree is displayed. The second sentence is incomplete, because the grammar specifies that a valid expansion of main ends with the terminal symbol EOL; hence, the outcome is OVERSHOOT. The third sentence is invalid, because of the repeated occurrence of the terminal symbol PLUS; the outcome is REJECT. The fourth sentence, a prefix of the third one, is rejected for the same reason.

8.3  Remarks

Using Menhir as an interpreter offers an easy way of debugging your grammar. For instance, if one wished to check that addition is considered left-associative, as requested by the %left directive found in the file demos/calc/parser.mly, one could submit the following sentence:

$ ./menhir --interpret --interpret-show-cst ../demos/calc/parser.mly
INT PLUS INT PLUS INT EOL
ACCEPT
[main:
  [expr: [expr: [expr: INT] PLUS [expr: INT]] PLUS [expr: INT]]
  EOL
]

The concrete syntax tree displayed by Menhir is skewed towards the left, as desired.

The switches --interpret and --trace can be used in conjunction. When --trace is set, the interpreter logs its actions to the standard error channel.

9  Generated API

When Menhir processes a grammar specification, say parser.mly, it produces one OCaml module, Parser, whose code resides in the file parser.ml and whose signature resides in the file parser.mli. We now review this signature. For simplicity, we assume that the grammar specification has just one start symbol main, whose OCaml type is thing.

9.1  Monolithic API

The monolithic API defines the type token, the exception Error, and the parsing function main, named after the start symbol of the grammar.

The type token is an algebraic data type. A value of type token represents a terminal symbol and its semantic value. For instance, if the grammar contains the declarations %token A and %token<int> B, then the generated file parser.mli contains the following definition:

  type token =
  | A
  | B of int

If --only-tokens is specified on the command line, the type token is generated, and the rest is omitted. On the contrary, if --external-tokens is used, the type token is omitted, but the rest (described below) is generated.

The exception Error carries no argument. It is raised by the parsing function main (described below) when a syntax error is detected.

  exception Error

Next comes one parsing function for each start symbol of the grammar. Here, we have assumed that there is one start symbol, named main, so the generated file parser.mli contains the following declaration:

  val main: (Lexing.lexbuf -> token) -> Lexing.lexbuf -> thing

This function expects two arguments, namely: a lexer, which typically is produced by ocamllex and has type Lexing.lexbuf -> token; and a lexing buffer, which has type Lexing.lexbuf. This API is compatible with ocamlyacc. (For information on using Menhir without ocamllex, please consult §16.) This API is “monolithic” in the sense that there is just one function, which does everything: it pulls tokens from the lexer, parses, and eventually returns a semantic value (or fails by throwing the exception Error).
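As a sketch, assuming a companion lexer module Lexer whose entry point is token (this module is produced by ocamllex, and is an assumption of this example), the monolithic API can be used as follows:

```ocaml
(* A sketch: parse an input channel, turning the Error exception into a
   readable message. The position of the offending token can be read off
   the lexing buffer. *)
let parse_channel (ic : in_channel) : thing =
  let lexbuf = Lexing.from_channel ic in
  try Parser.main Lexer.token lexbuf
  with Parser.Error ->
    let pos = lexbuf.Lexing.lex_start_p in
    failwith (Printf.sprintf "Syntax error at line %d" pos.Lexing.pos_lnum)
```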

9.2  Incremental API

If --table is set, Menhir offers an incremental API in addition to the monolithic API. In this API, control is inverted. The parser does not have access to the lexer. Instead, when the parser needs the next token, it stops and returns its current state to the user. The user is then responsible for obtaining this token (typically by invoking the lexer) and resuming the parser from that state. The directory demos/calc-incremental contains a demo that illustrates the use of the incremental API.

This API is “incremental” in the sense that the user has access to a sequence of the intermediate states of the parser. Assuming that semantic values are immutable, a parser state is a persistent data structure: it can be stored and used multiple times, if desired. This enables applications such as “live parsing”, where a buffer is continuously parsed while it is being edited. The parser can be re-started in the middle of the buffer whenever the user edits a character. Because two successive parser states share most of their data in memory, a list of n successive parser states occupies only O(n) space in memory.

9.2.1  Starting the parser

In this API, the parser is started by invoking Incremental.main. (Recall that we assume that main is the name of the start symbol.) The generated file parser.mli contains the following declaration:

  module Incremental : sig
    val main: position -> thing MenhirInterpreter.checkpoint
  end

The argument is the initial position. If the lexer is based on an OCaml lexing buffer, this argument should be lexbuf.lex_curr_p. In §9.2 and §9.3, the type position is a synonym for Lexing.position.

We emphasize that the function Incremental.main does not parse anything. It constructs a checkpoint which serves as a starting point. The functions offer and resume, described below, are used to drive the parser.

9.2.2  Driving the parser

The sub-module MenhirInterpreter is also part of the incremental API. Its declaration, which appears in the generated file parser.mli, is as follows:

  module MenhirInterpreter : MenhirLib.IncrementalEngine.INCREMENTAL_ENGINE
    with type token = token

The signature INCREMENTAL_ENGINE, defined in the module MenhirLib.IncrementalEngine, contains many types and functions, which are described in the rest of this section (§9.2.2) and in the following sections (§9.2.3, §9.2.4).

Please keep in mind that, from the outside, these types and functions should be referred to with an appropriate prefix. For instance, the type checkpoint should be referred to as MenhirInterpreter.checkpoint, or Parser.MenhirInterpreter.checkpoint, depending on which modules the user chooses to open.

  type 'a env

The abstract type 'a env represents the current state of the parser. (That is, it contains the current state and stack of the LR automaton.) Assuming that semantic values are immutable, it is a persistent data structure: it can be stored and used multiple times, if desired. The parameter 'a is the type of the semantic value that will eventually be produced if the parser succeeds.

  type production

The abstract type production represents a production of the grammar. The “start productions” (which do not exist in an .mly file, but are constructed by Menhir internally) are not part of this type.

  type 'a checkpoint = private
    | InputNeeded of 'a env
    | Shifting of 'a env * 'a env * bool
    | AboutToReduce of 'a env * production
    | HandlingError of 'a env
    | Accepted of 'a
    | Rejected

The type 'a checkpoint represents an intermediate or final state of the parser. An intermediate checkpoint is a suspension: it records the parser’s current state, and allows parsing to be resumed. The parameter 'a is the type of the semantic value that will eventually be produced if the parser succeeds.

Accepted and Rejected are final checkpoints. Accepted carries a semantic value.

InputNeeded is an intermediate checkpoint. It means that the parser wishes to read one token before continuing.

Shifting is an intermediate checkpoint. It means that the parser is taking a shift transition. It exposes the state of the parser before and after the transition. The Boolean parameter tells whether the parser intends to request a new token after this transition. (It always does, except when it is about to accept.)

AboutToReduce is an intermediate checkpoint: it means that the parser is about to perform a reduction step. HandlingError is also an intermediate checkpoint: it means that the parser has detected an error and is about to handle it. (Error handling is typically performed in several steps, so the next checkpoint is likely to be HandlingError again.) In these two cases, the parser does not need more input. The parser suspends itself at this point only in order to give the user an opportunity to observe the parser’s transitions and possibly handle errors in a different manner, if desired.

  val offer:
    'a checkpoint ->
    token * position * position ->
    'a checkpoint

The function offer allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form InputNeeded env. This function expects the previous checkpoint checkpoint as well as a new token (together with the start and end positions of this token). It produces a new checkpoint, which again can be an intermediate checkpoint or a final checkpoint. It does not raise any exception. (The exception Error is used only in the monolithic API.)

  val resume:
    'a checkpoint ->
    'a checkpoint

The function resume allows the user to resume the parser after the parser has suspended itself with a checkpoint of the form AboutToReduce (env, prod) or HandlingError env. This function expects just the previous checkpoint checkpoint. It produces a new checkpoint. It does not raise any exception.

The incremental API subsumes the monolithic API. Indeed, main can be (and is in fact) implemented by first using Incremental.main, then calling offer and resume in a loop, until a final checkpoint is obtained.
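As a sketch, assuming again a lexer module Lexer produced by ocamllex, such a loop can be written by hand as follows:

```ocaml
open Parser.MenhirInterpreter
  (* brings checkpoint, offer, resume into scope *)

(* Drive the parser until a final checkpoint is reached. *)
let rec drive lexbuf (checkpoint : thing checkpoint) : thing =
  match checkpoint with
  | InputNeeded _ ->
      (* The parser needs a token: ask the lexer for one and offer it,
         together with its start and end positions. *)
      let token = Lexer.token lexbuf in
      let startp = lexbuf.Lexing.lex_start_p
      and endp = lexbuf.Lexing.lex_curr_p in
      drive lexbuf (offer checkpoint (token, startp, endp))
  | Shifting _ | AboutToReduce _ | HandlingError _ ->
      (* The parser can make progress without more input. *)
      drive lexbuf (resume checkpoint)
  | Accepted v -> v
  | Rejected -> raise Parser.Error

let parse lexbuf : thing =
  drive lexbuf (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```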

  type supplier =
    unit -> token * position * position

A token supplier is a function of no arguments which delivers a new token (together with its start and end positions) every time it is called. The function loop and its variants, described below, expect a supplier as an argument.
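A supplier need not be backed by a lexing buffer. For instance, here is a sketch of a supplier built out of a fixed list of tokens, which can be handy in tests; the EOF token used to pad the end of the list is hypothetical:

```ocaml
(* A sketch: a supplier that replays a fixed list of tokens, with dummy
   positions, then keeps returning a hypothetical EOF token. *)
let supplier_of_list (tokens : token list) : MenhirInterpreter.supplier =
  let remaining = ref tokens in
  fun () ->
    let token =
      match !remaining with
      | [] -> EOF
      | t :: ts -> remaining := ts; t
    in
    (token, Lexing.dummy_pos, Lexing.dummy_pos)
```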

  val lexer_lexbuf_to_supplier:
    (Lexing.lexbuf -> token) -> Lexing.lexbuf -> supplier

The function lexer_lexbuf_to_supplier, applied to a lexer and to a lexing buffer, produces a fresh supplier.

The functions offer and resume, documented above, are sufficient to write a parser loop. One can imagine many variations of such a loop, which is why we expose offer and resume in the first place. Nevertheless, some variations are so common that it is worth providing them, ready for use. The following functions are implemented on top of offer and resume.

  val loop: supplier -> 'a checkpoint -> 'a

loop supplier checkpoint begins parsing from checkpoint, reading tokens from supplier. It continues parsing until it reaches a checkpoint of the form Accepted v or Rejected. In the former case, it returns v. In the latter case, it raises the exception Error. (By the way, this is how we implement the monolithic API on top of the incremental API.)
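For instance, a monolithic-style entry point can be reconstructed in a few lines on top of loop; this is only a sketch:

```ocaml
(* A sketch: build a supplier from a lexer and a lexing buffer, then let
   loop drive the parser to completion. Raises Error on a syntax error. *)
let main lexer lexbuf : thing =
  let supplier = MenhirInterpreter.lexer_lexbuf_to_supplier lexer lexbuf in
  MenhirInterpreter.loop supplier
    (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```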

  val loop_handle:
    ('a -> 'answer) ->
    ('a checkpoint -> 'answer) ->
    supplier -> 'a checkpoint -> 'answer

loop_handle succeed fail supplier checkpoint begins parsing from checkpoint, reading tokens from supplier. It continues until it reaches a checkpoint of the form Accepted v or HandlingError _ (or Rejected, but that should not happen, as HandlingError _ will be observed first). In the former case, it calls succeed v. In the latter case, it calls fail with this checkpoint. It cannot raise Error.

This means that Menhir’s traditional error-handling procedure (which pops the stack until a state that can act on the error token is found) does not get a chance to run. Instead, the user can implement her own error handling code, in the fail continuation.
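As a sketch, the fail continuation might simply report the position where the error was detected, read off the lexing buffer; the lexer module Lexer is again an assumption:

```ocaml
(* A sketch: parse with loop_handle, returning a result instead of
   raising an exception. The fail continuation receives the
   HandlingError checkpoint; here we ignore it and report the position
   of the offending token. *)
let parse lexbuf : (thing, string) result =
  let supplier =
    MenhirInterpreter.lexer_lexbuf_to_supplier Lexer.token lexbuf in
  MenhirInterpreter.loop_handle
    (fun v -> Ok v)
    (fun _checkpoint ->
       let pos = lexbuf.Lexing.lex_start_p in
       Error (Printf.sprintf "Syntax error at line %d" pos.Lexing.pos_lnum))
    supplier
    (Parser.Incremental.main lexbuf.Lexing.lex_curr_p)
```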

  val loop_handle_undo:
    ('a -> 'answer) ->
    ('a checkpoint -> 'a checkpoint -> 'answer) ->
    supplier -> 'a checkpoint -> 'answer

loop_handle_undo is analogous to loop_handle, but passes a pair of checkpoints (instead of a single checkpoint) to the failure continuation. The first (and oldest) checkpoint that is passed to the failure continuation is the last InputNeeded checkpoint that was encountered before the error was detected. The second (and newest) checkpoint is where the error was detected. (This is the same checkpoint that loop_handle would pass to its failure continuation.) Going back to the first checkpoint can be thought of as undoing any reductions that were performed after seeing the problematic token. (These reductions must be default reductions or spurious reductions.) This can be useful to someone who wishes to implement an error explanation or error recovery mechanism.

loop_handle_undo must be applied to an InputNeeded checkpoint. The initial checkpoint produced by Incremental.main is of this form.

  val shifts: 'a checkpoint -> 'a env option

shifts checkpoint assumes that checkpoint has been obtained by submitting a token to the parser. It runs the parser from checkpoint, through an arbitrary number of reductions, until the parser either accepts this token (i.e., shifts) or rejects it (i.e., signals an error). If the parser decides to shift, then Some env is returned, where env is the parser’s state just before shifting. Otherwise, None is returned. This can be used to test whether the parser is willing to accept a certain token. This function should be used with caution, though, as it causes semantic actions to be executed. It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless.

  val acceptable: 'a checkpoint -> token -> position -> bool

acceptable checkpoint token pos requires checkpoint to be an InputNeeded checkpoint. It returns true iff the parser is willing to shift this token. This can be used to test, after an error has been detected, which tokens would have been accepted at this point. To do this, one would typically use loop_handle_undo to get access to the last InputNeeded checkpoint that was encountered before the error was detected, and apply acceptable to that checkpoint.

acceptable is implemented using shifts, so, like shifts, it causes certain semantic actions to be executed. It is desirable that all semantic actions be side-effect-free, or that their side-effects be harmless.

9.2.3  Inspecting the parser’s state

Although the type env is opaque, a parser state can be inspected via a few accessor functions, which are described in this section. The following types and functions are contained in the MenhirInterpreter sub-module.

  type 'a lr1state

The abstract type 'a lr1state describes a (non-initial) state of the LR(1) automaton. If s is such a state, then s should have at least one incoming transition, and all of its incoming transitions carry the same (terminal or non-terminal) symbol, say A. We say that A is the incoming symbol of the state s. The index 'a is the type of the semantic values associated with A. The role played by 'a is clarified in the definition of the type element, which appears further on.

  val number: _ lr1state -> int

The states of the LR(1) automaton are numbered (from 0 and up). The function number maps a state to its number.

  val production_index: production -> int
  val find_production: int -> production

Productions are numbered. (The set of indices of all productions forms an interval, which does not necessarily begin at 0.) The function production_index converts a production to an integer number, whereas the function find_production carries out the reverse conversion. It is an error to apply find_production to an invalid index.

  type element =
    | Element: 'a lr1state * 'a * position * position -> element

The type element describes one entry in the stack of the LR(1) automaton. In a stack element of the form Element (s, v, startp, endp), s is a (non-initial) state and v is a semantic value. The value v is associated with the incoming symbol A of the state s. In other words, the value v was pushed onto the stack just before the state s was entered. Thus, for some type 'a, the state s has type 'a lr1state and the value v has type 'a. The positions startp and endp delimit the fragment of the input text that was reduced to the symbol A.

In order to do anything useful with the value v, one must gain information about the type 'a, by inspection of the state s. So far, the type 'a lr1state is abstract, so there is no way of inspecting s. The inspection API (§9.3) offers further tools for this purpose.

  val top: 'a env -> element option

top env returns the parser’s top stack element. The state contained in this stack element is the current state of the automaton. If the stack is empty, None is returned. In that case, the current state of the automaton must be an initial state.

  val pop_many: int -> 'a env -> 'a env option

pop_many i env pops i elements off the automaton’s stack. This is done via i successive invocations of pop. Thus, pop_many 1 is pop. The index i must be nonnegative. The time complexity is O(i).

  val get: int -> 'a env -> element option

get i env returns the parser’s i-th stack element. The index i is 0-based: thus, get 0 is top. If i is greater than or equal to the number of elements in the stack, None is returned. get is implemented using pop_many and top: its time complexity is O(i).
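The relationship between top, pop_many and get can be illustrated by modeling the automaton's stack as a list whose head is the top element. The definitions below are a self-contained sketch of this relationship, not the library's actual code.

```ocaml
(* [top] returns the top stack element, or None if the stack is empty. *)
let top stack =
  match stack with [] -> None | e :: _ -> Some e

(* [pop_many i] performs [i] successive pops, failing if the stack becomes
   empty too early; its time complexity is O(i). *)
let rec pop_many i stack =
  assert (i >= 0);
  if i = 0 then Some stack
  else match stack with [] -> None | _ :: rest -> pop_many (i - 1) rest

(* [get i] is [pop_many i] followed by [top]; in particular, [get 0]
   behaves like [top]. *)
let get i stack =
  match pop_many i stack with None -> None | Some stack -> top stack
```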

  val current_state_number: 'a env -> int

current_state_number env is the integer number of the automaton’s current state. Although this number might conceivably be obtained via the functions top and number, using current_state_number is preferable, because this method works even when the automaton’s stack is empty (in which case the current state is an initial state, and top returns None). This number can be passed as an argument to a message function generated by menhir --compile-errors.

  val equal: 'a env -> 'a env -> bool

equal env1 env2 tells whether the parser configurations env1 and env2 are equal in the sense that the automaton’s current state is the same in env1 and env2 and the stack is physically the same in env1 and env2. If equal env1 env2 is true, then the sequence of the stack elements, as observed via pop and top, must be the same in env1 and env2. Also, if equal env1 env2 holds, then the checkpoints input_needed env1 and input_needed env2 must be equivalent. (The function input_needed is documented in §9.2.4.) The function equal has time complexity O(1).

  val positions: 'a env -> position * position

The function positions returns the start and end positions of the current lookahead token. If invoked in an initial state, this function returns a pair whose components are both the initial position that was passed as an argument to main.

  val env_has_default_reduction: 'a env -> bool
  val state_has_default_reduction: _ lr1state -> bool

When applied to an environment env taken from a checkpoint of the form AboutToReduce (env, prod), the function env_has_default_reduction tells whether the reduction that is about to take place is a default reduction.

state_has_default_reduction s tells whether the state s has a default reduction. This includes the case where s is an accepting state.

9.2.4  Updating the parser’s state

The functions presented in the previous section (§9.2.3) allow inspecting parser states of type 'a checkpoint and 'a env. However, so far, there are no functions for manufacturing new parser states, except offer and resume, which create new checkpoints by feeding tokens, one by one, to the parser.

In this section, a small number of functions are provided for manufacturing new parser states of type 'a env and 'a checkpoint. These functions allow going far back into the past and jumping ahead into the future, so to speak. In other words, they allow driving the parser in other ways than by feeding tokens into it. The functions pop, force_reduction and feed (part of the inspection API; see §9.3) construct values of type 'a env. The function input_needed constructs values of type 'a checkpoint and thereby allows resuming parsing in normal mode (via offer). Together, these functions can be used to implement error handling and error recovery strategies.

  val pop: 'a env -> 'a env option

pop env returns a new environment, where the parser’s top stack cell has been popped off. (If the stack is empty, None is returned.) This amounts to pretending that the (terminal or nonterminal) symbol that corresponds to this stack cell has not been read.

  val force_reduction: production -> 'a env -> 'a env

force_reduction prod env can be called only if in the state env the parser is capable of reducing the production prod. If this condition is satisfied, then this production is reduced, which means that its semantic action is executed (this can have side effects!) and the automaton makes a goto (nonterminal) transition. If this condition is not satisfied, an Invalid_argument exception is raised.

  val input_needed: 'a env -> 'a checkpoint

input_needed env returns InputNeeded env. Thus, out of a parser state that might have been obtained via a series of calls to the functions pop, force_reduction, feed, and so on, it produces a checkpoint, which can be used to resume normal parsing, by supplying this checkpoint as an argument to offer.

This function should be used with some care. It could “mess up the lookahead” in the sense that it allows parsing to resume in an arbitrary state s with an arbitrary lookahead symbol t, even though Menhir’s reachability analysis (which is carried out via the --list-errors switch) might well think that it is impossible to reach this particular configuration. If one is using Menhir’s new error reporting facility (§11), this could cause the parser to reach an error state for which no error message has been prepared.

9.3  Inspection API

If --inspection is set, Menhir offers an inspection API in addition to the monolithic and incremental APIs. (The reason why this is not done by default is that this requires more tables to be generated, thus making the generated parser larger.) Like the incremental API, the inspection API is found in the sub-module MenhirInterpreter. It offers the following types and functions.

The type 'a terminal is a generalized algebraic data type (GADT). A value of type 'a terminal represents a terminal symbol (without a semantic value). The index 'a is the type of the semantic values associated with this symbol. For instance, if the grammar contains the declarations %token A and %token<int> B, then the generated module MenhirInterpreter contains the following definition:

  type _ terminal =
  | T_A : unit terminal
  | T_B : int terminal

The data constructors are named after the terminal symbols, prefixed with “T_”.

The type 'a nonterminal is also a GADT. A value of type 'a nonterminal represents a nonterminal symbol (without a semantic value). The index 'a is the type of the semantic values associated with this symbol. For instance, if main is the only nonterminal symbol, then the generated module MenhirInterpreter contains the following definition:

  type _ nonterminal =
  | N_main : thing nonterminal

The data constructors are named after the nonterminal symbols, prefixed with “N_”.

The type 'a symbol is the disjoint union of the types 'a terminal and 'a nonterminal. In other words, a value of type 'a symbol represents a terminal or nonterminal symbol (without a semantic value). This type is (always) defined as follows:

  type 'a symbol =
    | T : 'a terminal -> 'a symbol
    | N : 'a nonterminal -> 'a symbol

The type xsymbol is an existentially quantified version of the type 'a symbol. It is useful in situations where the index 'a is not statically known. It is (always) defined as follows:

  type xsymbol =
    | X : 'a symbol -> xsymbol

The type item describes an LR(0) item, that is, a pair of a production prod and an index i into the right-hand side of this production. If the length of the right-hand side is n, then i lies between 0 and n, inclusive.

  type item =
      production * int

The following functions implement total orderings on the types _ terminal, _ nonterminal, xsymbol, production, and item.

  val compare_terminals: _ terminal -> _ terminal -> int
  val compare_nonterminals: _ nonterminal -> _ nonterminal -> int
  val compare_symbols: xsymbol -> xsymbol -> int
  val compare_productions: production -> production -> int
  val compare_items: item -> item -> int

The function incoming_symbol maps a (non-initial) LR(1) state s to its incoming symbol, that is, the symbol that the parser must recognize before it enters the state s.

  val incoming_symbol: 'a lr1state -> 'a symbol

This function can be used to gain access to the semantic value v in a stack element Element (s, v, _, _). Indeed, by case analysis on the symbol incoming_symbol s, one gains information about the type 'a, hence one obtains the ability to do something useful with the value v.
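The way incoming_symbol lets one recover the type of the value v can be demonstrated with stand-in GADT definitions, patterned after the declarations %token A and %token<int> B shown earlier. In a real parser, these types are generated by Menhir, the type 'a lr1state is abstract, and incoming_symbol is provided by the inspection API; the definitions below are hypothetical.

```ocaml
(* Hypothetical stand-ins for the generated GADTs. *)
type _ terminal = T_A : unit terminal | T_B : int terminal
type 'a symbol = T : 'a terminal -> 'a symbol

(* Stand-in: here, a state is represented directly by its incoming symbol. *)
type 'a lr1state = 'a symbol
let incoming_symbol (s : 'a lr1state) : 'a symbol = s

type element =
  | Element : 'a lr1state * 'a * Lexing.position * Lexing.position -> element

(* Case analysis on [incoming_symbol s] refines the existential type 'a,
   so the semantic value [v] becomes usable at its true type. *)
let describe (Element (s, v, _, _)) : string =
  match incoming_symbol s with
  | T T_A -> "A"                        (* here, v has type unit *)
  | T T_B -> "B " ^ string_of_int v     (* here, v has type int  *)
```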

The function items maps a (non-initial) LR(1) state s to its LR(0) core, that is, to the underlying set of LR(0) items. This set is represented as a list, whose elements appear in an arbitrary order. This set is not closed under ε-transitions.

  val items: _ lr1state -> item list

The functions lhs and rhs map a production prod to its left-hand side and right-hand side, respectively. The left-hand side is always a nonterminal symbol, hence always of the form N _. The right-hand side is a (possibly empty) sequence of (terminal or nonterminal) symbols.

  val lhs: production -> xsymbol
  val rhs: production -> xsymbol list

The function nullable, applied to a non-terminal symbol, tells whether this symbol is nullable. A nonterminal symbol is nullable if and only if it produces the empty word ε.

  val nullable: _ nonterminal -> bool

The function call first nt t tells whether the FIRST set of the nonterminal symbol nt contains the terminal symbol t. That is, it returns true if and only if nt produces a word that begins with t. The function xfirst is identical to first, except it expects a first argument of type xsymbol instead of _ nonterminal.

  val first: _ nonterminal -> _ terminal -> bool
  val xfirst: xsymbol -> _ terminal -> bool

The function foreach_terminal enumerates the terminal symbols, including the special symbol error. The function foreach_terminal_but_error enumerates the terminal symbols, excluding error.

  val foreach_terminal:           (xsymbol -> 'a -> 'a) -> 'a -> 'a
  val foreach_terminal_but_error: (xsymbol -> 'a -> 'a) -> 'a -> 'a

feed symbol startp semv endp env causes the parser to consume the (terminal or nonterminal) symbol symbol, accompanied with the semantic value semv and with the start and end positions startp and endp. Thus, the automaton makes a transition, and reaches a new state. The stack grows by one cell. This operation is permitted only if the current state (as determined by env) has an outgoing transition labeled with symbol. Otherwise, an Invalid_argument exception is raised.

  val feed: 'a symbol -> position -> 'a -> position -> 'b env -> 'b env

10  Error handling: the traditional way

Menhir’s traditional error handling mechanism is considered deprecated: although it is still supported for the time being, it might be removed in the future. We recommend setting up an error handling mechanism using the new tools offered by Menhir (§11).

Error handling

Menhir’s traditional error handling mechanism is inspired by that of yacc and ocamlyacc, but is not identical. A special error token is made available for use within productions. The LR automaton is constructed exactly as if error were a regular terminal symbol. However, error is never produced by the lexical analyzer. Instead, when an error is detected, the current lookahead token is discarded and replaced with the error token, which becomes the current lookahead token. At this point, the parser enters error handling mode.

In error handling mode, automaton states are popped off the automaton’s stack until a state that can act on error is found. This includes both shift and reduce actions. (yacc and ocamlyacc do not trigger reduce actions on error. It is somewhat unclear why this is so.)

When a state that can reduce on error is found, reduction is performed. Since the lookahead token is still error, the automaton remains in error handling mode.

When a state that can shift on error is found, the error token is shifted. At this point, the parser returns to normal mode.

When no state that can act on error is found on the automaton’s stack, the parser stops and raises the exception Error. This exception carries no information. The position of the error can be obtained by reading the lexical analyzer’s environment record.

Error recovery

ocamlyacc offers an error recovery mode, which is entered immediately after an error token was successfully shifted. In this mode, tokens are repeatedly taken off the input stream and discarded until an acceptable token is found. This feature is no longer offered by Menhir.

Error-related keywords

The following keyword is made available to semantic actions.

When the $syntaxerror keyword is evaluated, evaluation of the semantic action is aborted, so that the current reduction is abandoned; the current lookahead token is discarded and replaced with the error token; and error handling mode is entered. Note that there is no mechanism for inserting an error token in front of the current lookahead token, even though this might also be desirable. It is unclear whether this keyword is useful; it might be suppressed in the future.

11  Error handling: the new way

Menhir’s incremental API (§9.2) allows taking control when an error is detected. Indeed, as soon as an invalid token is detected, the parser produces a checkpoint of the form HandlingError _. At this point, if one decides to let the parser proceed, by just calling resume, then Menhir enters its traditional error handling mode (§10). Instead, however, one can decide to take control and perform error handling or error recovery in any way one pleases. One can, for instance, build and display a diagnostic message, based on the automaton’s current stack and/or state. Or, one could modify the input stream, by inserting or deleting tokens, so as to suppress the error, and resume normal parsing. In principle, the possibilities are endless.

An apparently simple-minded approach to error reporting, proposed by Jeffery [10] and further explored by Pottier [20], consists in selecting a diagnostic message (or a template for a diagnostic message) based purely on the current state of the automaton.

In this approach, one determines, ahead of time, which are the “error states” (that is, the states in which an error can be detected), and one prepares, for each error state, a diagnostic message. Because state numbers are fragile (they change when the grammar evolves), an error state is identified not by its number, but by an input sentence that leads to it: more precisely, by an input sentence which causes an error to be detected in this state. Thus, one maintains a set of pairs of an erroneous input sentence and a diagnostic message.

Menhir defines a file format, the .messages file format, for representing this information (§11.1), and offers a set of tools for creating, maintaining, and exploiting .messages files (§11.2). Once one understands these tools, there remains to write a collection of diagnostic messages, a more subtle task than one might think (§11.3), and to glue everything together (§11.4).

In this approach to error handling, as in any other approach, one must understand exactly when (that is, in which states) errors are detected. This in turn requires understanding how the automaton is constructed. Menhir’s construction technique is not Knuth’s canonical LR(1) technique [15], which is usually too expensive to be practical. Instead, Menhir merges states [19] and introduces so-called default reductions. These techniques defer error detection by allowing extra reductions to take place before an error is detected. The impact of these alterations must be taken into account when writing diagnostic messages (§11.3).

In this approach to error handling, the special error token is not used. It should not appear in the grammar. Similarly, the $syntaxerror keyword should not be used.

11.1  The .messages file format

A .messages file is a text file. Comment lines, which begin with a # character, are ignored everywhere. As is evident in the following description, blank lines are significant: they are used as separators between entries and within an entry.

A .messages file is composed of a list of entries. Two entries are separated by one or more blank lines. Each entry consists of one or more input sentences, followed with one or more blank lines, followed with a message. The syntax of an input sentence is described in §8.1. A message is arbitrary text, but cannot contain a blank line. We stress that there cannot be a blank line between two sentences (if there is one, Menhir becomes confused and may complain about some word not being “a known non-terminal symbol”).


grammar: TYPE UID
grammar: TYPE OCAMLTYPE UID PREC

# A (handwritten) comment.

Ill-formed declaration.
Examples of well-formed declarations:
  %type <Syntax.expression> expression
  %type <int> date time
Figure 16: An entry in a .messages file


grammar: TYPE UID
##
## Ends in an error in state: 1.
##
## declaration -> TYPE . OCAMLTYPE separated_nonempty_list(option(COMMA),
##   strict_actual) [ TYPE TOKEN START RIGHT PUBLIC PERCENTPERCENT PARAMETER
##   ON_ERROR_REDUCE NONASSOC LEFT INLINE HEADER EOF COLON ]
##
## The known suffix of the stack is as follows:
## TYPE
##
grammar: TYPE OCAMLTYPE UID PREC
##
## Ends in an error in state: 5.
##
## strict_actual -> symbol . loption(delimited(LPAREN,separated_nonempty_list
##   (COMMA,strict_actual),RPAREN)) [ UID TYPE TOKEN START STAR RIGHT QUESTION
##   PUBLIC PLUS PERCENTPERCENT PARAMETER ON_ERROR_REDUCE NONASSOC LID LEFT
##   INLINE HEADER EOF COMMA COLON ]
##
## The known suffix of the stack is as follows:
## symbol
##

# A (handwritten) comment.

Ill-formed declaration.
Examples of well-formed declarations:
  %type <Syntax.expression> expression
  %type <int> date time
Figure 17: An entry in a .messages file, decorated with auto-generated comments

As an example, Figure 16 shows a valid entry, taken from Menhir’s own .messages file. This entry contains two input sentences, which lead to errors in two distinct states. A single message is associated with these two error states.

Several commands, described next (§11.2), produce .messages files where each input sentence is followed with an auto-generated comment, marked with ##. This special comment indicates in which state the error is detected, and is supposed to help the reader understand what it means to be in this state: What has been read so far? What is expected next?

As an example, the previous entry, decorated with auto-generated comments, is shown in Figure 17. (We have manually wrapped the lines that did not fit in this document.)

An auto-generated comment begins with the number of the error state that is reached via this input sentence.

Then, the auto-generated comment shows the LR(1) items that compose this state, in the same format as in an .automaton file. These items offer a description of the past (that is, what has been read so far) and the future (that is, which terminal symbols are allowed next).

Finally, the auto-generated comment shows what is known about the stack when the automaton is in this state. (This can be deduced from the LR(1) items, but is more readable if shown separately.)

In a canonical LR(1) automaton, the LR(1) items offer an exact description of the past and future. However, in a noncanonical automaton, which is by default what Menhir produces, the situation is more subtle. The lookahead sets can be over-approximated, so the automaton can perform one or more “spurious reductions” before an error is detected. As a result, the LR(1) items in the error state offer a description of the future that may be both incorrect (that is, a terminal symbol that appears in a lookahead set is not necessarily a valid continuation) and incomplete (that is, a terminal symbol that does not appear in any lookahead set may nevertheless be a valid continuation). More details appear further on (§11.3).

In order to attract the user’s attention to this issue, if an input sentence causes one or more spurious reductions, then the auto-generated comment contains a warning about this fact. This mechanism is not completely foolproof, though, as it may be the case that one particular sentence does not cause any spurious reductions (hence, no warning appears), yet leads to an error state that can be reached via other sentences that do involve spurious reductions.

11.2  Maintaining .messages files

Ideally, the set of input sentences in a .messages file should be correct (that is, every sentence causes an error on its last token), irredundant (that is, no two sentences lead to the same error state), and complete (that is, every error state is reached by some sentence).

Correctness and irredundancy are checked by the command --compile-errors filename, where filename is the name of a .messages file. This command fails if a sentence does not cause an error at all, or causes an error too early. It also fails if two sentences lead to the same error state. If the file is correct and irredundant, then (as its name suggests) this command compiles the .messages file down to an OCaml function, whose code is printed on the standard output channel. This function, named message, has type int -> string, and maps a state number to a message. It raises the exception Not_found if its argument is not the number of a state for which a message has been defined.
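Since the generated message function has type int -> string and raises Not_found for states without a message, client code typically wraps it so as to supply a generic fallback text. The message function below is a hand-written stand-in, for illustration only; in reality it would be produced by --compile-errors (the state numbers 1 and 5 echo those of Figure 17).

```ocaml
(* A hand-written stand-in for the function that --compile-errors would
   generate from a .messages file. *)
let message (s : int) : string =
  match s with
  | 1 | 5 -> "Ill-formed declaration.\n"
  | _ -> raise Not_found

(* In client code, one typically guards against Not_found, so that error
   states without a handwritten message still get a generic text. *)
let show (s : int) : string =
  try message s with Not_found -> "Syntax error.\n"
```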

Completeness is checked via the commands --list-errors and --compare-errors. The former produces, from scratch, a complete set of input sentences, that is, a set of input sentences that reaches all error states. The latter compares two sets of sentences (more precisely, the two underlying sets of error states) for inclusion.

The command --list-errors first computes all possible ways of causing an error. From this information, it deduces a list of all error states, that is, all states where an error can be detected. For each of these states, it computes a (minimal) input sentence that causes an error in this state. Finally, it prints these sentences, in the .messages file format, on the standard output channel. Each sentence is followed with an auto-generated comment and with a dummy diagnostic message. The user should be warned that this algorithm may require large amounts of time (typically in the tens of seconds, possibly more) and memory (typically in the gigabytes, possibly more). It requires a 64-bit machine. (On a 32-bit machine, it works, but quickly hits a built-in size limit.) At the verbosity level --log-automaton 2, it displays some progress information and internal statistics on the standard error channel.

The command --compare-errors filename1 --compare-errors filename2 compares the .messages files filename1 and filename2. Each file is read and internally translated to a mapping of states to messages. Menhir then checks that the left-hand mapping is a subset of the right-hand mapping. That is, if a state s is reached by some sentence in filename1, then it should also be reached by some sentence in filename2. Furthermore, if the message associated with s in filename1 is not a dummy message, then the same message should be associated with s in filename2.

To check that the sentences in filename2 cover all error states, it suffices to (1) use --list-errors to produce a complete set of sentences, which one stores in filename1, then (2) use --compare-errors to compare filename1 and filename2.

In the case of a grammar that evolves fairly often, it can take significant human time and effort to update the .messages file and ensure correctness, irredundancy, and completeness. A way of reducing this effort is to abandon completeness. This implies that the auto-generated message function can raise Not_found and that a generic “syntax error” message must be produced in that case. We prefer to discourage this approach, as it implies that the end user is exposed to a mixture of specific and generic syntax error messages, and there is no guarantee that the specific (hand-written) messages will appear in all situations where they are expected to appear. Instead, we recommend waiting for the grammar to become stable and enforcing completeness.

The command --update-errors filename is used to update the auto-generated comments in the .messages file filename. It is typically used after a change in the grammar (or in the command line options that affect the construction of the automaton). A new .messages file is produced on the standard output channel. It is identical to filename, except the auto-generated comments, identified by ##, have been removed and re-generated.

The command --echo-errors filename is used to filter out all comments, blank lines, and messages from the .messages file filename. The input sentences, and nothing else, are echoed on the standard output channel. As an example application, one could then translate the sentences to concrete syntax and create a collection of source files that trigger every possible syntax error.

The command --interpret-error is analogous to --interpret. It causes Menhir to act as an interpreter. Menhir reads sentences off the standard input channel, parses them, and displays the outcome. This switch can be usefully combined with --trace. The main difference between --interpret and --interpret-error is that, when the latter command is used, Menhir expects the input sentence to cause an error on its last token, and displays information about the state in which the error is detected, in the form of a .messages file entry. This can be used to quickly find out exactly what error is caused by one particular input sentence.

11.3  Writing accurate diagnostic messages

One might think that writing a diagnostic message for each error state is a straightforward (if lengthy) task. In reality, it is not so simple.

A state, not a sentence

The first thing to keep in mind is that a diagnostic message is associated with a state s, as opposed to a sentence. An entry in a .messages file contains a sentence w that leads to an error in state s. This sentence is just one way of causing an error in state s; there may exist many other sentences that also cause an error in this state. The diagnostic message should not be specific to the sentence w: it should make sense regardless of how the state s is reached.

As a rule of thumb, when writing a diagnostic message, one should (as much as possible) ignore the example sentence w altogether, and concentrate on the description of the state s, which appears as part of the auto-generated comment.

The LR(1) items that compose the state s offer a description of the past (that is, what has been read so far) and the future (that is, which terminal symbols are allowed next). A diagnostic message should be designed based on this description.


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%%
typ0: ID | LPAREN typ1 RPAREN {}
typ1: typ0 | typ0 ARROW typ1  {}
declaration: ID COLON typ1    {}
program:
| LPAREN declaration RPAREN
| declaration SEMICOLON       {}
Figure 18: A grammar where one error state is difficult to explain


program: ID COLON ID LPAREN
##
## Ends in an error in state: 8.
##
## typ1 -> typ0 . [ SEMICOLON RPAREN ]
## typ1 -> typ0 . ARROW typ1 [ SEMICOLON RPAREN ]
##
## The known suffix of the stack is as follows:
## typ0
##
Figure 19: A problematic error state in the grammar of Figure 18, due to over-approximation

The problem of over-approximated lookahead sets

As pointed out earlier (§11.1), in a noncanonical automaton, the lookahead sets in the LR(1) items can be both over- and under-approximated. One must be aware of this phenomenon, otherwise one runs the risk of writing a diagnostic message that proposes too many or too few continuations.

As an example, let us consider the grammar in Figure 18. According to this grammar, a “program” is either a declaration between parentheses or a declaration followed with a semicolon. A “declaration” is an identifier, followed with a colon, followed with a type. A “type” is an identifier, a type between parentheses, or a function type in the style of OCaml.

The (noncanonical) automaton produced by Menhir for this grammar has 17 states. Using --list-errors, we find that an error can be detected in 10 of these 17 states. By manual inspection of the auto-generated comments, we find that for 9 out of these 10 states, writing an accurate diagnostic message is easy. However, one problematic state remains, namely state 8, shown in Figure 19.
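As a reminder of the workflow described earlier in this chapter, the set of error sentences for this grammar can be produced, then (once diagnostic messages have been written) compiled to OCaml, roughly as follows. This is a sketch; the file names are hypothetical.

```
menhir --list-errors parser.mly > parser.messages
# ... edit parser.messages so as to replace each placeholder
#     with a hand-written diagnostic message ...
menhir --compile-errors parser.messages parser.mly > parser_messages.ml
```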

In this state, a (level-0) type has just been read. One valid continuation, which corresponds to the second LR(1) item in Figure 19, is to continue this type: the terminal symbol ARROW, followed with a (level-1) type, is a valid continuation. Now, the question is, what other valid continuations are there? By examining the first LR(1) item in Figure 19, it may look as if both SEMICOLON and RPAREN are valid continuations. However, this cannot be the case. A moment’s thought reveals that either we have seen an opening parenthesis LPAREN at the very beginning of the program, in which case we definitely expect a closing parenthesis RPAREN; or we have not seen one, in which case we definitely expect a semicolon SEMICOLON. It is never the case that both SEMICOLON and RPAREN are valid continuations!

In fact, the lookahead set in the first LR(1) item in Figure 19 is over-approximated. State 8 in the noncanonical automaton results from merging two states in the canonical automaton.

In such a situation, one cannot write an accurate diagnostic message. Knowing that the automaton is in state 8 does not give us a precise view of the valid continuations. Some valuable information (that is, whether we have seen an opening parenthesis LPAREN at the very beginning of the program) is buried in the automaton’s stack.


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%%
typ0: ID | LPAREN typ1(RPAREN) RPAREN          {}
typ1(phantom): typ0 | typ0 ARROW typ1(phantom) {}
declaration(phantom): ID COLON typ1(phantom)   {}
program:
| LPAREN declaration(RPAREN) RPAREN
| declaration(SEMICOLON)  SEMICOLON            {}
Figure 20: Splitting the problematic state of Figure 19 via selective duplication


%token ID ARROW LPAREN RPAREN COLON SEMICOLON
%start<unit> program
%on_error_reduce typ1
%%
typ0: ID | LPAREN typ1 RPAREN {}
typ1: typ0 | typ0 ARROW typ1  {}
declaration: ID COLON typ1    {}
program:
| LPAREN declaration RPAREN
| declaration SEMICOLON       {}
Figure 21: Avoiding the problematic state of Figure 19 via reductions on error


program: ID COLON ID LPAREN
##
## Ends in an error in state: 15.
##
## program -> declaration . SEMICOLON [ # ]
##
## The known suffix of the stack is as follows:
## declaration
##
## WARNING: This example involves spurious reductions.
## This implies that, although the LR(1) items shown above provide an
## accurate view of the past (what has been recognized so far), they
## may provide an INCOMPLETE view of the future (what was expected next).
## In state 8, spurious reduction of production typ1 -> typ0
## In state 11, spurious reduction of production declaration -> ID COLON typ1
##
Figure 22: A problematic error state in the grammar of Figure 21, due to under-approximation

How can one work around this problem? Let us suggest three options.

Blind duplication of states

One option would be to build a canonical automaton by using the --canonical switch. In this example, one would obtain a 27-state automaton, where the problem has disappeared. However, this option is rarely viable, as it duplicates many states without good reason.

Selective duplication of states

A second option is to manually cause just enough duplication to remove the problematic over-approximation. In our example, we wish to distinguish two kinds of types and declarations, namely those that must be followed with a closing parenthesis, and those that must be followed with a semicolon. We create such a distinction by parameterizing typ1 and declaration with a phantom parameter. The modified grammar is shown in Figure 20. The phantom parameter does not affect the language that is accepted: for instance, the nonterminal symbols declaration(SEMICOLON) and declaration(RPAREN) generate the same language as declaration in the grammar of Figure 18. Yet, by giving distinct names to these two symbols, we force the construction of an automaton where more states are distinguished. In this example, Menhir produces a 23-state automaton. Using --list-errors, we find that an error can be detected in 11 of these 23 states, and by manual inspection of the auto-generated comments, we find that for each of these 11 states, writing an accurate diagnostic message is easy. In summary, we have selectively duplicated just enough states so as to split the problematic error state into two non-problematic error states.

Reductions on error

A third and last option is to introduce an %on_error_reduce declaration (§4.1.8) so as to prevent the detection of an error in the problematic state 8. We see in Figure 19 that, in state 8, the production typ1 → typ0 is ready to be reduced. If we could force this reduction to take place, then the automaton would move to some other state where it would be clear which of SEMICOLON and RPAREN is expected. We achieve this by marking typ1 as “reducible on error”. The modified grammar is shown in Figure 21. For this grammar, Menhir produces a 17-state automaton. (This is the exact same automaton as for the grammar of Figure 18, except 2 of the 17 states have received extra reduction actions.) Using --list-errors, we find that an error can be detected in 9 of these 17 states. The problematic state, namely state 8, is no longer an error state! The problem has vanished.

The problem of under-approximated lookahead sets

The third option seems by far the simplest of all, and is recommended in many situations. However, it comes with a caveat. There may now exist states whose lookahead sets are under-approximated, in a certain sense. Because of this, there is a danger of writing an incomplete diagnostic message, one that does not list all valid continuations.

To see this, let us look again at the sentence ID COLON ID LPAREN. In the grammar and automaton of Figure 18, this sentence takes us to the problematic state 8, shown in Figure 19. In the grammar and automaton of Figure 21, because more reduction actions are carried out before the error is detected, this sentence takes us to state 15, shown in Figure 22.

When writing a diagnostic message for state 15, one might be tempted to write: “Up to this point, a declaration has been recognized. At this point, a semicolon is expected”. Indeed, by examining the sole LR(1) item in state 15, it looks as if SEMICOLON is the only permitted continuation. However, this is not the case. Another valid continuation is ARROW: indeed, the sentence ID COLON ID ARROW ID SEMICOLON forms a valid program. In fact, if the first token following ID COLON ID is ARROW, then in state 8 this token is shifted, so the two reductions that take us from state 8 through state 11 to state 15 never take place. This is why, even though ARROW does not appear in state 15 as a valid continuation, it nevertheless is a valid continuation of ID COLON ID. The warning produced by Menhir, shown in Figure 22, is supposed to attract attention to this issue.

Another way to explain this issue is to point out that, by declaring %on_error_reduce typ1, we make a choice. When the parser reads a type and finds an invalid token, it decides that this type is finished, even though, in reality, this type could be continued with ARROW …. This in turn causes the parser to perform another reduction and consider the current declaration finished, even though, in reality, this declaration could be continued with ARROW ….

In summary, when writing a diagnostic message for state 15, one should take into account the fact that this state can be reached via spurious reductions and (therefore) SEMICOLON may not be the only permitted continuation. One way of doing this, without explicitly listing all permitted continuations, is to write: “Up to this point, a declaration has been recognized. If this declaration is complete, then at this point, a semicolon is expected”.
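Concretely, the corresponding entry in the .messages file might then read as follows (the auto-generated comments of Figure 22 are elided here):

```
program: ID COLON ID LPAREN

Up to this point, a declaration has been recognized.
If this declaration is complete,
then at this point, a semicolon is expected.
```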

11.4  A working example

The CompCert verified compiler offers a real-world example of this approach to error handling. The “pre-parser” is where syntax errors are detected: see cparser/pre_parser.mly. A database of erroneous input sentences and (templates for) diagnostic messages is stored in cparser/handcrafted.messages. It is compiled, using --compile-errors, to an OCaml file named cparser/pre_parser_messages.ml. The function Pre_parser_messages.message, which maps a state number to (a template for) a diagnostic message, is called from cparser/ErrorReports.ml, where we construct and display a full-fledged diagnostic message.

In CompCert, we allow a template for a diagnostic message to contain the special form $i, where i is an integer constant, understood as an index into the parser’s stack. The code in cparser/ErrorReports.ml automatically replaces this special form with the fragment of the source text that corresponds to this stack entry. This mechanism is not built into Menhir; it is implemented in CompCert using Menhir’s incremental API.

12  Coq back-end

Menhir is able to generate a parser whose correctness can be formally verified using the Coq proof assistant [13]. This feature is used to construct the parser of the CompCert verified compiler [17].

Setting the --coq switch on the command line enables the Coq back-end. When this switch is set, Menhir expects an input file whose name ends in .vy and generates a Coq file whose name ends in .v.

Like a .mly file, a .vy file is a grammar specification, with embedded semantic actions. The only difference is that the semantic actions in a .vy file are expressed in Coq instead of OCaml. A .vy file otherwise uses the same syntax as a .mly file. CompCert’s cparser/Parser.vy serves as an example.

Several restrictions are imposed when Menhir is used in --coq mode:

  • The error handling mechanism (§10) is absent. The $syntaxerror keyword and the error token are not supported.
  • Location information is not propagated. The $start* and $end* keywords (Figure 14) are not supported.
  • %parameter (§4.1.2) is not supported.
  • %inline (§5.3) is not supported.
  • The standard library (§5.4) is not supported, of course, because its semantic actions are expressed in OCaml. If desired, the user can define an analogous library, whose semantic actions are expressed in Coq.
  • Because Coq’s type inference algorithm is rather unpredictable, the Coq type of every nonterminal symbol must be provided via a %type or %start declaration (§4.1.5, §4.1.6).
  • Unless the proof of completeness has been deactivated using --coq-no-complete, the grammar must not have a conflict (not even a benign one, in the sense of §6.1). That is, the grammar must be LR(1). Conflict resolution via priority and associativity declarations (§4.1.4) is not supported. The reason is that there is no simple formal specification of how conflict resolution should work.
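To illustrate, here is a sketch of what a minimal .vy file might look like. The semantic actions are Coq expressions and, as required, every nonterminal symbol is assigned a Coq type via %start or %type.

```
%token<nat> INT
%token PLUS EOF
%start<nat> main
%type<nat> expr
%%
main: e = expr EOF          { e }
expr:
| i = INT                   { i }
| e = expr PLUS i = INT     { e + i }
```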

The generated file contains several modules:

  • The module Gram defines the terminal and non-terminal symbols, the grammar, and the semantic actions.
  • The module Aut contains the automaton generated by Menhir, together with a certificate that is checked by Coq while establishing the soundness and completeness of the parser.

The type terminal of the terminal symbols is an inductive type, with one constructor for each terminal symbol. A terminal symbol named Foo in the .vy file is named Foo't in Coq. A terminal symbol per se does not carry a semantic value.

We also define the type token of tokens, that is, dependent pairs of a terminal symbol and a semantic value of an appropriate type for this symbol. We model the lexer as an object of type Streams.Stream token, that is, an infinite stream of tokens.

The type nonterminal of the non-terminal symbols is an inductive type, with one constructor for each non-terminal symbol. A non-terminal symbol named Bar in the .vy file is named Bar'nt in Coq.

The proof of termination of an LR(1) parser in the case of invalid input seems far from obvious. We did not find such a proof in the literature. In an application such as CompCert [17], this question is not considered crucial. For this reason, we did not formally establish the termination of the parser. Instead, in order to satisfy Coq’s termination requirements, we use the “fuel” technique: the parser takes an additional parameter log_fuel of type nat such that 2^log_fuel is the maximum number of steps the parser is allowed to perform. In practice, one can use a value of, say, 40 or 50 to make sure the parser will never run out of fuel in a reasonable time.

Parsing can have three different outcomes, represented by the type parse_result. (This definition is implicitly parameterized over the initial state init. We omit the details here.)

  Inductive parse_result :=
  | Fail_pr:    parse_result
  | Timeout_pr: parse_result
  | Parsed_pr:
      symbol_semantic_type (NT (start_nt init)) ->
      Stream token ->
      parse_result.

The outcome Fail_pr means that parsing has failed because of a syntax error. (If the completeness of the parser with respect to the grammar has been proved, this implies that the input is invalid.) The outcome Timeout_pr means that the fuel has been exhausted. Of course, this cannot happen in practice if the parser was given a sufficiently large amount of fuel, as suggested above. The outcome Parsed_pr means that the parser has succeeded in parsing a prefix of the input stream. It carries the semantic value that has been constructed for this prefix, as well as the remainder of the input stream.

For each entry point entry of the grammar, Menhir generates a parsing function entry, whose type is nat -> Stream token -> parse_result.
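As an illustration, client code might invoke such a parsing function and inspect the outcome as follows. This is a hypothetical sketch: it assumes an entry point named main whose semantic value has type nat.

```
Definition parse_main (toks : Stream token) : option nat :=
  match main 50 toks with     (* 50 is the log_fuel parameter *)
  | Parsed_pr v _ => Some v   (* success: keep the semantic value *)
  | Fail_pr       => None     (* syntax error *)
  | Timeout_pr    => None     (* out of fuel; unlikely with log_fuel = 50 *)
  end.
```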

Two theorems are provided, named entry_point_correct and entry_point_complete. The correctness theorem states that, if a word (a prefix of the input stream) is accepted, then this word is valid (with respect to the grammar) and the semantic value that is constructed by the parser is valid as well (with respect to the grammar). The completeness theorem states that if a word (a prefix of the input stream) is valid (with respect to the grammar), then (given sufficient fuel) it is accepted by the parser.

These results imply that the grammar is unambiguous: for every input, there is at most one valid interpretation. This is proved by another generated theorem, named Parser.unambiguous.

The parsers produced by Menhir’s Coq back-end must be linked with a Coq library. This library can be installed via the command opam install coq-menhirlib. The Coq sources of this library can be found in the coq-menhirlib directory of the Menhir repository.

The CompCert verified compiler [17,16] can be used as an example if one wishes to use Menhir to generate a formally verified parser as part of some other project. See in particular the directory cparser.

13  Building grammarware on top of Menhir

It is possible to build a variety of grammar-processing tools, also known as “grammarware” [14], on top of Menhir’s front-end. Indeed, Menhir offers a facility for dumping a .cmly file, which contains a (binary-form) representation of the grammar and automaton, as well as a library, MenhirSdk, for (programmatically) reading and exploiting a .cmly file. These facilities are described in §13.1. Furthermore, Menhir allows decorating a grammar with “attributes”, which are ignored by Menhir’s back-ends, yet are written to the .cmly file, thus can be exploited by other tools, via MenhirSdk. Attributes are described in §13.2.

13.1  Menhir’s SDK

The command line option --cmly causes Menhir to produce a .cmly file in addition to its normal operation. This file contains a (binary-form) representation of the grammar and automaton. This is the grammar that is obtained after the following steps have been carried out:

  • joining multiple .mly files, if necessary;
  • eliminating anonymous rules;
  • expanding away parameterized nonterminal symbols;
  • removing unreachable nonterminal symbols;
  • performing OCaml type inference, if the --infer switch is used;
  • inlining away nonterminal symbols that are decorated with %inline.

The library MenhirSdk offers an API for reading a .cmly file. The functor MenhirSdk.Cmly_read.Read reads such a file and produces a module whose signature is MenhirSdk.Cmly_api.GRAMMAR. This API is not explained in this document; for details, the reader is expected to follow the above links.
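For a taste of what this looks like, the following OCaml sketch reads a .cmly file and prints the name of every nonterminal symbol. The file name parser.cmly is an assumption; the functor application and the Nonterminal submodule are part of the GRAMMAR API mentioned above.

```ocaml
(* Read parser.cmly, obtaining a module G whose signature is
   MenhirSdk.Cmly_api.GRAMMAR. *)
module G = MenhirSdk.Cmly_read.Read (struct
  let filename = "parser.cmly"
end)

(* Print the name of every nonterminal symbol in the grammar. *)
let () =
  G.Nonterminal.iter (fun nt ->
    print_endline (G.Nonterminal.name nt)
  )
```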

13.2  Attributes

Attributes are decorations that can be placed in .mly files. They are ignored by Menhir’s back-ends, but are written to .cmly files, thus can be exploited by other tools, via MenhirSdk.

An attribute consists of a name and a payload. An attribute name is an OCaml identifier, such as cost, or a list of OCaml identifiers, separated with dots, such as my.name. An attribute payload is an OCaml expression of arbitrary type, such as 1 or "&&" or print_int. Following the syntax of OCaml’s attributes, an attribute’s name and payload are separated with one or more spaces, and are delimited by [@ and ]. Thus, [@cost 1] and [@printer print_int] are examples of attributes.

An attribute can be attached at one of four levels:

  1. An attribute can be attached with the grammar. Such an attribute must be preceded with a % sign and must appear in the declarations section (§4.1). For example, the following is a valid declaration:
      %[@trace true]
    
  2. An attribute can be attached with a terminal symbol. Such an attribute must follow the declaration of this symbol. For example, the following is a valid declaration of the terminal symbol INT:
      %token<int> INT [@cost 0] [@printer print_int]
    
  3. An attribute can be attached with a nonterminal symbol. Such an attribute must appear inside the rule that defines this symbol, immediately after the name of this symbol. For instance, the following is a valid definition of the nonterminal symbol expr:
      expr [@default EConst 0]:
        i = INT                  { EConst i }
      | e1 = expr PLUS e2 = expr { EAdd (e1, e2) }
    
    An attribute can be attached with a parameterized nonterminal symbol:
      option [@default None] (X):
              { None }
      | x = X { Some x }
    
    An attribute cannot be attached with a nonterminal symbol that is decorated with the %inline keyword.
  4. An attribute can be attached with a producer (§4.2.3), that is, with an occurrence of a terminal or nonterminal symbol in the right-hand side of a production. Such an attribute must appear immediately after the producer. For instance, in the following rule, an attribute is attached with the producer expr*:
      exprs:
        LPAREN es = expr* [@list true] RPAREN { es }
    

As a convenience, it is possible to attach many attributes with many (terminal and nonterminal) symbols in one go, via an %attribute declaration, which must be placed in the declarations section (§4.1). For instance, the following declaration attaches both of the attributes [@cost 0] and [@precious false] with each of the symbols INT and id:

  %attribute INT id [@cost 0] [@precious false]

An %attribute declaration can be considered syntactic sugar: it is desugared away in terms of the four forms of attributes presented earlier. (The command line switch --only-preprocess can be used to see how it is desugared.)
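For instance, the %attribute declaration above is (roughly) desugared as if the two attributes had been attached directly with the declaration of the terminal symbol INT and with the rule that defines the nonterminal symbol id; the definition of id is elided in this sketch:

```
%token<int> INT [@cost 0] [@precious false]

id [@cost 0] [@precious false]:
  ...
```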

If an attribute is attached with a parameterized nonterminal symbol, then, when this symbol is expanded away, the attribute is transmitted to every instance. For instance, in an earlier example, the attribute [@default None] was attached with the parameterized symbol option. Then, every instance of option, such as option(expr), option(COMMA), and so on, inherits this attribute. To attach an attribute with one specific instance only, one can use an %attribute declaration. For instance, the declaration %attribute option(expr) [@cost 10] attaches an attribute with the nonterminal symbol option(expr), but not with the symbol option(COMMA).

14  Interaction with build systems

This section explains some details of the compilation workflow, including OCaml type inference and its repercussions on dependency analysis (§14.1) and compilation flags (§14.2). This material should be of interest only to authors of build systems who wish to build support for Menhir into their system. Ordinary users should skip this section and use a build system that knows about Menhir, such as dune (preferred) or ocamlbuild.

14.1  OCaml type inference and dependency analysis

In an ideal world, the semantic actions in a .mly file should be well-typed according to the OCaml type discipline, and their types should be known to Menhir, which may need this knowledge. (When --inspection is set, Menhir needs to know the OCaml type of every nonterminal symbol.) To address this problem, three approaches exist:

  • Ignore the problem and let Menhir run without OCaml type information (§14.1.1).
  • Let Menhir obtain OCaml type information by invoking the OCaml compiler (§14.1.2).
  • Let Menhir request and receive OCaml type information without invoking the OCaml compiler (§14.1.3).

14.1.1  Running without OCaml type information

The simplest thing to do is to run Menhir without any of the flags described in the following (§14.1.2, §14.1.3). Then, the semantic actions are not type-checked, and their OCaml type is not inferred. (This is analogous to using ocamlyacc.) The drawbacks of this approach are as follows:

  • A type error in a semantic action is detected only when the .ml file produced by Menhir is type-checked. The location of the type error, as reported by the OCaml compiler, can be suboptimal.
  • Unless a %type declaration for every nonterminal symbol is given, the inspection API cannot be generated, that is, --inspection must be turned off.

14.1.2  Obtaining OCaml type information by calling the OCaml compiler

The second approach is to let Menhir invoke the OCaml compiler so as to type-check the semantic actions and infer their types. This is done by invoking Menhir with the --infer switch, as follows.

--infer.  This switch causes the semantic actions to be checked for type consistency before the parser is generated. To do so, Menhir generates a mock .ml file, which contains just the semantic actions, and invokes the OCaml compiler, via a command of the form ocamlc -i, so as to type-check this file and infer the types of the semantic actions. Menhir then reads this information and produces real .ml and .mli files.

--ocamlc command.  This switch controls how ocamlc is invoked. It allows setting both the name of the executable and the command line options that are passed to it.

One difficulty with this approach is that the OCaml compiler usually needs to consult a few .cm[iox] files. Indeed, if the .mly file contains a reference to an external OCaml module, say A, then the OCaml compiler typically needs to read one or more files named A.cm[iox].

This implies that these files must have been created first. But how is one supposed to know, exactly, which files should be created first? One must scan the .mly file so as to find out which external modules it depends upon. In other words, a dependency analysis is required. This analysis can be carried out by invoking Menhir with the --depend switch, as follows.

--depend.  This switch causes Menhir to generate dependency information for use in conjunction with make. When invoked in this mode, Menhir does not generate a parser. Instead, it examines the grammar specification and prints a list of prerequisites for the targets basename.cm[iox], basename.ml, and basename.mli. This list is intended to be textually included within a Makefile. To produce this list, Menhir generates a mock .ml file, which contains just the semantic actions, invokes ocamldep, and postprocesses its output.
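In a Makefile, this switch is typically used as follows. This is a sketch; the grammar file name is an assumption.

```make
# Regenerate dependency information whenever the grammar changes,
# and textually include it into this Makefile.
.depend: parser.mly
	menhir --depend parser.mly > $@

include .depend
```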

--raw-depend.  This switch is analogous to --depend. However, in this case, ocamldep’s output is not postprocessed by Menhir: it is echoed without change. This switch is not suitable for direct use with make; it is intended for use with omake or ocamlbuild, which perform their own postprocessing.

--ocamldep command.  This switch controls how ocamldep is invoked. It allows setting both the name of the executable and the command line options that are passed to it.

14.1.3  Obtaining OCaml type information without calling the OCaml compiler

The third approach is to let Menhir request and receive OCaml type information without allowing Menhir to invoke the OCaml compiler. There is nothing magic about this: to achieve this, Menhir must be invoked twice, and the OCaml compiler must be invoked (by the user, or by the build system) in between. This is done as follows.

--infer-write-query mockfilename.  When invoked in this mode, Menhir does not generate a parser. Instead, it generates a mock .ml file, named mockfilename, which contains just the semantic actions. Then, it stops.

It is then up to the user (or to the build system) to invoke ocamlc -i so as to type-check the mock .ml file and infer its signature. The output of this command should be redirected to some file sigfilename. Then, Menhir can be invoked again, as follows.

--infer-read-reply sigfilename.  When invoked in this mode, Menhir assumes that the file sigfilename contains the result of running ocamlc -i on the file mockfilename. It reads and parses this file, so as to obtain the OCaml type of every semantic action, then proceeds normally to generate a parser.
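The complete protocol can thus be summarized by the following sequence of commands. This is a sketch: the file names are arbitrary, and extra -I options may be needed so that ocamlc can find the .cmi files that the semantic actions depend upon.

```
menhir --infer-write-query mock.ml parser.mly
ocamlc -i mock.ml > sig.mli
menhir --infer-read-reply sig.mli parser.mly
```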

This protocol was introduced on 2018/05/23; earlier versions of Menhir do not support it. Its existence can be tested as follows:

--infer-protocol-supported.  When invoked with this switch, Menhir immediately terminates with exit code 0. An earlier version of Menhir, which does not support this protocol, would display a help message and terminate with a nonzero exit code.

14.2  Compilation flags

The following switches allow querying Menhir so as to find out which compilation flags should be passed to the OCaml compiler and linker.

--suggest-comp-flags.  This switch causes Menhir to print a set of suggested compilation flags, and exit. These flags are intended to be passed to the OCaml compilers (ocamlc or ocamlopt) when compiling and linking the parser generated by Menhir. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, a -I flag is suggested, so as to ensure that MenhirLib is visible to the OCaml compiler.

--suggest-link-flags-byte.  This switch causes Menhir to print a set of suggested link flags, and exit. These flags are intended to be passed to ocamlc when producing a bytecode executable. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, the object file menhirLib.cmo is suggested, so as to ensure that MenhirLib is linked in.

--suggest-link-flags-opt.  This switch causes Menhir to print a set of suggested link flags, and exit. These flags are intended to be passed to ocamlopt when producing a native code executable. What flags are suggested? In the absence of the --table switch, no flags are suggested. When --table is set, the object file menhirLib.cmx is suggested, so as to ensure that MenhirLib is linked in.
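For example, a build script might query Menhir as follows in order to compile and link a parser built in --table mode. This is a sketch; note that each query must include the same --table setting that is used when generating the parser, otherwise no flags are suggested.

```
ocamlc -c $(menhir --table --suggest-comp-flags) parser.ml
ocamlc -o main $(menhir --table --suggest-link-flags-byte) parser.cmo main.cmo
```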

--suggest-menhirLib.  This switch causes Menhir to print (the absolute path of) the directory where MenhirLib was installed.

--suggest-ocamlfind.  This switch is deprecated and may be removed in the future. It always prints false.

15  Comparison with ocamlyacc

Roughly speaking, Menhir is 90% compatible with ocamlyacc. Legacy ocamlyacc grammar specifications are accepted and compiled by Menhir. The resulting parsers run and produce correct parse trees. However, parsers that explicitly invoke functions in the module Parsing behave slightly incorrectly. For instance, the functions that provide access to positions return a dummy position when invoked by a Menhir parser. Porting a grammar specification from ocamlyacc to Menhir requires replacing all calls to Parsing with new Menhir-specific keywords (§7).

Here is an incomplete list of the differences between ocamlyacc and Menhir. The list is roughly sorted by decreasing order of importance.

  • Menhir allows the definition of a nonterminal symbol to be parameterized (§5.2). A formal parameter can be instantiated with a terminal symbol, a nonterminal symbol, or an anonymous rule (§4.2.4). A library of standard parameterized definitions (§5.4), including options, sequences, and lists, is bundled with Menhir. EBNF syntax is supported: the modifiers ?, +, and * are sugar for options, nonempty lists, and arbitrary lists (Figure 2).
  • ocamlyacc only accepts LALR(1) grammars. Menhir accepts LR(1) grammars, thus avoiding certain artificial conflicts.
  • Menhir’s %inline keyword (§5.3) helps avoid or resolve some LR(1) conflicts without artificial modification of the grammar.
  • Menhir explains conflicts (§6) in terms of the grammar, not just in terms of the automaton. Menhir’s explanations are believed to be understandable by mere humans.
  • Menhir offers an incremental API (in --table mode only) (§9.2). This means that the state of the parser can be saved at any point (at no cost) and that parsing can later be resumed from a saved state.
  • Menhir offers a set of tools for building a (complete, irredundant) set of invalid input sentences, mapping each such sentence to a (hand-written) error message, and maintaining this set as the grammar evolves (§11).
  • In --coq mode, Menhir produces a parser whose correctness and completeness with respect to the grammar can be checked by Coq (§12).
  • Menhir offers an interpreter (§8) that helps debug grammars interactively.
  • Menhir allows grammar specifications to be split over multiple files (§5.1). It also allows several grammars to share a single set of tokens.
  • Menhir produces reentrant parsers.
  • Menhir is able to produce parsers that are parameterized by OCaml modules.
  • ocamlyacc requires semantic values to be referred to via keywords: $1, $2, and so on. Menhir allows semantic values to be explicitly named.
  • Menhir warns about end-of-stream conflicts (§6.4), whereas ocamlyacc does not. Menhir warns about productions that are never reduced, whereas, at least in some cases, ocamlyacc does not.
  • Menhir offers an option to typecheck semantic actions before a parser is generated: see --infer.
  • ocamlyacc produces tables that are interpreted by a piece of C code, requiring semantic actions to be encapsulated as OCaml closures and invoked by C code. Menhir offers a choice between producing tables and producing code. In either case, no C code is involved.
  • Menhir makes OCaml’s standard library module Parsing entirely obsolete. Access to locations is now via keywords (§7). Uses of raise Parse_error within semantic actions are deprecated. The function parse_error is deprecated. They are replaced with keywords (§10).
  • Menhir’s error handling mechanism (§10) is inspired by ocamlyacc’s, but is not guaranteed to be fully compatible. Error recovery, also known as re-synchronization, is not supported by Menhir.
  • The way in which severe conflicts (§6) are resolved is not guaranteed to be fully compatible with ocamlyacc.
  • Menhir warns about unused %token, %nonassoc, %left, and %right declarations. It also warns about %prec annotations that do not help resolve a conflict.
  • Menhir accepts OCaml-style comments.
  • Menhir allows %start and %type declarations to be condensed.
  • Menhir allows two (or more) productions to share a single semantic action.
  • Menhir produces better error messages when a semantic action contains ill-balanced parentheses.
  • ocamlyacc ignores semicolons and commas everywhere. Menhir regards semicolons and commas as significant, and allows them, or requires them, in certain well-defined places.
  • ocamlyacc allows %type declarations to refer to terminal or non-terminal symbols, whereas Menhir requires them to refer to non-terminal symbols. Types can be assigned to terminal symbols with a %token declaration.
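As an illustration of named semantic values, a Menhir rule can bind each component by name; in a hypothetical arithmetic grammar where the token INT carries an int, one might write:

```
expr:
| i = INT                  { i }
| e1 = expr PLUS e2 = expr { e1 + e2 }
```

Here i, e1, and e2 play the role of ocamlyacc’s $1, $2, and $3.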

16  Questions and Answers


Is Menhir faster than ocamlyacc? What is the speed difference between menhir and menhir --table? A (not quite scientific) benchmark suggests that the parsers produced by ocamlyacc and menhir --table have comparable speed, whereas those produced by menhir are between 2 and 5 times faster. This benchmark excludes the time spent in the lexer and in the semantic actions.


How do I write Makefile rules for Menhir? This can be a bit tricky. If you must do this, see §14. It is recommended instead to use a build system with built-in support for Menhir, such as dune (preferred) or ocamlbuild.


How do I use Menhir with ocamlbuild? Pass -use-menhir to ocamlbuild. To pass options to Menhir, pass -menhir "menhir <options>" to ocamlbuild. To use Menhir’s table-based back-end, pass -menhir "menhir --table" to ocamlbuild, and either pass -package menhirLib to ocamlbuild or add the tag package(menhirLib) in the _tags file. To combine multiple .mly files, say a.mly and b.mly, into a single parser, say parser.{ml,mli}, create a file named parser.mlypack that contains the module names A B. See the demos directory for examples. To deal with .messages files (§11), use the rules provided in the file demos/ocamlbuild/myocamlbuild.ml.


How do I use Menhir with dune? Please use dune version 1.4.0 or newer, as it has appropriate built-in rules for Menhir parsers. In the simplest scenario, where the parser resides in a single source file parser.mly, the dune file should contain a “stanza” along the following lines:

(menhir
  (modules parser)
  (flags --explain --dump)
  (infer true)
)

The --infer switch has special status and should not be used directly; instead, write (infer true) or (infer false), as done above. (The default is true.) Ordinary command line switches, like --explain and --dump, are passed as part of the flags line, as done above. The directory demos/calc-dune offers an example. For more details, see dune’s documentation. To deal with .messages files (§11), use and adapt the rules found in the file src/stage2/dune.
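For reference, here is a sketch of a complete dune file for a small project whose parser uses the table-based back-end; the executable name main and the use of ocamllex for the lexer are illustrative:

```
(menhir
  (modules parser)
  (flags --table)
  (infer true))

(ocamllex lexer)

(executable
  (name main)
  (libraries menhirLib))
```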


Menhir reports more shift/reduce conflicts than ocamlyacc! How come? ocamlyacc sometimes merges two states of the automaton that Menhir considers distinct. This happens when the grammar is not LALR(1). If these two states happen to contain a shift/reduce conflict, then Menhir reports two conflicts, while ocamlyacc only reports one. Of course, the two conflicts are very similar, so fixing one will usually fix the other as well.


I do not use ocamllex. Is there an API that does not involve lexing buffers? Like ocamlyacc, Menhir produces parsers whose monolithic API (§9.1) is intended for use with ocamllex. However, it is possible to convert them, after the fact, to a simpler, revised API. In the revised API, there are no lexing buffers, and a lexer is just a function from unit to tokens. Converters are provided by the library module MenhirLib.Convert. This can be useful, for instance, for users of ulex, the Unicode lexer generator. Also, please note that Menhir’s incremental API (§9.2) does not mention the type Lexing.lexbuf. In this API, the parser expects to be supplied with triples of a token and start/end positions of type Lexing.position.
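To illustrate the idea of a lexer as a function from unit to tokens, here is a self-contained sketch in plain OCaml. The token type is hypothetical, standing in for the one generated by Menhir; a real application would obtain such a function from MenhirLib.Convert or write one by hand:

```ocaml
(* A hypothetical token type, standing in for the one generated by Menhir. *)
type token = INT of int | PLUS | EOF

(* Build a lexer in the style of the revised API: a function from unit to
   tokens, here reading from a list and returning EOF once exhausted. *)
let supplier_of_list (tokens : token list) : unit -> token =
  let remaining = ref tokens in
  fun () ->
    match !remaining with
    | [] -> EOF
    | tok :: rest -> remaining := rest; tok

let () =
  let next = supplier_of_list [ INT 1; PLUS; INT 2 ] in
  assert (next () = INT 1);
  assert (next () = PLUS);
  assert (next () = INT 2);
  assert (next () = EOF)
```

A converter such as this makes it possible to drive a parser from any token source, e.g. a ulex-based lexer.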


I need both %inline and non-%inline versions of a non-terminal symbol. Is this possible? Define an %inline version first, then use it to define a non-%inline version, like this:

%inline ioption(X):
  (* nothing *)  { None }
| x = X          { Some x }

option(X):
  o = ioption(X) { o }

This can work even in the presence of recursion, as illustrated by the following definition of (reversed, left-recursive, possibly empty) lists:

%inline irevlist(X):
  (* nothing *)         { [] }
| xs = revlist(X) x = X { x :: xs }

revlist(X):
  xs = irevlist(X)      { xs }

The definition of irevlist is expanded into the definition of revlist, so in the end, revlist receives its normal, recursive definition. One can then view irevlist as a variant of revlist that is inlined one level deep.
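The semantic actions above can be modeled in plain OCaml: each step performs x :: xs on the accumulated list, so elements come out in reverse order. This is only a model of the actions, not generated code:

```ocaml
(* Model of the semantic actions of revlist(X): start from [] and cons
   each new element onto the front, yielding a reversed list. *)
let revlist (elements : 'a list) : 'a list =
  List.fold_left (fun xs x -> x :: xs) [] elements

let () = assert (revlist [1; 2; 3] = [3; 2; 1])
```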


Can I ship a generated parser while avoiding a dependency on MenhirLib? Yes. One option is to use the code-based back-end (that is, to not use --table). In this case, the generated parser is self-contained. Another option is to use the table-based back-end (that is, use --table) and include a copy of the files menhirLib.{ml,mli} together with the generated parser. The command menhir --suggest-menhirLib will tell you where to find these source files.


Why is $startpos off towards the left? It seems to include some leading whitespace. Indeed, as of 2015/11/04, the computation of positions has changed so as to match ocamlyacc’s behavior. As a result, $startpos can now appear to be too far off to the left. This is explained in §7. In short, the solution is to use $symbolstartpos instead.
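For instance, in a rule along the following lines (where mk_add is a hypothetical smart constructor that records a source location), using $symbolstartpos rather than $startpos excludes any leading whitespace from the recorded location:

```
expr:
  e1 = expr PLUS e2 = expr
    { mk_add ~loc:($symbolstartpos, $endpos) e1 e2 }
```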


Can I pretty-print a grammar in ASCII, HTML, or LaTeX format? Yes. Have a look at obelisk [4].


Does Menhir support mid-rule actions? Yes. See midrule and its explanation in §5.4.
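For instance, midrule can execute an action after some symbols have been recognized but before the rest of the production is parsed; in the hypothetical fragment below, enter_scope and exit_scope manage a symbol table:

```
block:
  LBRACE midrule({ enter_scope () }) stmts = list(stmt) RBRACE
    { exit_scope (); stmts }
```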

17  Technical background

After experimenting with Knuth’s canonical LR(1) technique [15], we found that it really is not practical, even on today’s computers. For this reason, Menhir implements a slightly modified version of Pager’s algorithm [19], which merges states on the fly if it can be proved that no reduce/reduce conflicts will arise as a consequence of this decision. This is how Menhir avoids the so-called mysterious conflicts created by LALR(1) parser generators [7, section 5.7].

Menhir’s algorithm for explaining conflicts is inspired by DeRemer and Pennello’s [6] and adapted for use with Pager’s construction technique.

By default, Menhir produces code, as opposed to tables. This approach has been explored before [3,9]. Menhir performs some static analysis of the automaton in order to produce more compact code.

When asked to produce tables, Menhir performs compression via first-fit row displacement, as described by Tarjan and Yao [23]. Double displacement is not used. The action table is made sparse by factoring out an error matrix, as suggested by Dencker, Dürre, and Heuft [5].

The type-theoretic tricks that triggered our interest in LR parsers [21] are not implemented in Menhir. In the beginning, we did not implement them because the OCaml compiler did not at the time offer generalized algebraic data types (GADTs). Today, OCaml has GADTs, but, as the saying goes, “if it ain’t broken, don’t fix it”.

The main ideas behind the Coq back-end are described in a paper by Jourdan, Pottier and Leroy [13]. The C11 parser in the CompCert compiler [17] is constructed by Menhir and verified by Coq, following this technique. How to construct a correct C11 parser using Menhir is described by Jourdan and Pottier [12].

The approach to error reports presented in §11 was proposed by Jeffery [10] and further explored by Pottier [20].

18  Acknowledgements

Menhir’s interpreter (--interpret) and table-based back-end (--table) were implemented by Guillaume Bau, Raja Boujbel, and François Pottier. The project was generously funded by Jane Street Capital, LLC through the “OCaml Summer Project” initiative.

Frédéric Bour provided motivation and an initial implementation for the incremental API, for the inspection API, for attributes, and for MenhirSdk. Merlin, an emacs mode for OCaml, contains an impressive incremental, syntax-error-tolerant OCaml parser, which is based on Menhir and has been a driving force for Menhir’s APIs.

Jacques-Henri Jourdan designed and implemented the Coq back-end and did the Coq proofs for it.

Gabriel Scherer provided motivation for investigating Jeffery’s technique.

References

[1]
Alfred V. Aho, Ravi Sethi, and Jeffrey D. Ullman. Compilers: Principles, Techniques, and Tools. Addison-Wesley, 1986.
[2]
Andrew Appel. Modern Compiler Implementation in ML. Cambridge University Press, 1998.
[3]
Achyutram Bhamidipaty and Todd A. Proebsting. Very fast YACC-compatible parsers (for very little effort). Software: Practice and Experience, 28(2):181–190, 1998.
[4]
Lélio Brun. Obelisk. https://github.com/Lelio-Brun/Obelisk, 2017.
[5]
Peter Dencker, Karl Dürre, and Johannes Heuft. Optimization of parser tables for portable compilers. ACM Transactions on Programming Languages and Systems, 6(4):546–572, 1984.
[6]
Frank DeRemer and Thomas Pennello. Efficient computation of LALR(1) look-ahead sets. ACM Transactions on Programming Languages and Systems, 4(4):615–649, 1982.
[7]
Charles Donnelly and Richard Stallman. Bison, 2015.
[8]
John E. Hopcroft, Rajeev Motwani, and Jeffrey D. Ullman. Introduction to Automata Theory, Languages, and Computation. Addison-Wesley, 2000.
[9]
R. Nigel Horspool and Michael Whitney. Even faster LR parsing. Software: Practice and Experience, 20(6):515–535, 1990.
[10]
Clinton L. Jeffery. Generating LR syntax error messages from examples. ACM Transactions on Programming Languages and Systems, 25(5):631–640, 2003.
[11]
Steven C. Johnson. Yacc: Yet another compiler compiler. In UNIX Programmer’s Manual, volume 2, pages 353–387. Holt, Rinehart, and Winston, 1979.
[12]
Jacques-Henri Jourdan and François Pottier. A simple, possibly correct LR parser for C11. ACM Transactions on Programming Languages and Systems, 39(4):14:1–14:36, August 2017.
[13]
Jacques-Henri Jourdan, François Pottier, and Xavier Leroy. Validating LR(1) parsers. In European Symposium on Programming (ESOP), volume 7211 of Lecture Notes in Computer Science, pages 397–416. Springer, 2012.
[14]
Paul Klint, Ralf Lämmel, and Chris Verhoef. Toward an engineering discipline for grammarware. ACM Transactions on Software Engineering and Methodology, 14(3):331–380, 2005.
[15]
Donald E. Knuth. On the translation of languages from left to right. Information and Control, 8(6):607–639, 1965.
[16]
Xavier Leroy. The CompCert C verified compiler. https://github.com/AbsInt/CompCert, 2014.
[17]
Xavier Leroy. The CompCert C compiler. http://compcert.inria.fr/, 2015.
[18]
Xavier Leroy, Damien Doligez, Alain Frisch, Jacques Garrigue, Didier Rémy, and Jérôme Vouillon. The OCaml system: documentation and user’s manual, 2016.
[19]
David Pager. A practical general method for constructing LR(k) parsers. Acta Informatica, 7:249–268, 1977.
[20]
François Pottier. Reachability and error diagnosis in LR(1) parsers. In Compiler Construction (CC), pages 88–98, 2016.
[21]
François Pottier and Yann Régis-Gianas. Towards efficient, typed LR parsers. Electronic Notes in Theoretical Computer Science, 148(2):155–180, 2006.
[22]
David R. Tarditi and Andrew W. Appel. ML-Yacc User’s Manual, 2000.
[23]
Robert Endre Tarjan and Andrew Chi-Chih Yao. Storing a sparse table. Communications of the ACM, 22(11):606–611, 1979.

1
The computation of $symbolstartpos is optimized by Menhir under two assumptions about the lexer. First, Menhir assumes that the lexer never produces a token whose start and end positions are equal. Second, Menhir assumes that two positions produced by the lexer are equal if and only if they are physically equal. If the lexer violates either of these assumptions, the computation of $symbolstartpos could produce a result that differs from Parsing.symbol_start_pos().
2
This assumes that you have installed opam, the OCaml package manager, and that you have run the command opam repo add coq-released https://coq.inria.fr/opam/released.

This document was translated from LaTeX by HEVEA.
y\240(ohѸ- >10]ݩ{̓,v)a+S#< t?+(3A͝;O5,lɭ(NfszYRԏ&Kizq69uGq۱C%WAA'4'*: nݻ[);?CcuD4ED;9grova5 HԻAHvq<wr|yXScӐRn f,bҵ't˾7ԓ=٭(^1 b5xf[rZCP!`.R1k#E,6Rp 1;}e O]gIjrgPQ3r3Ʌ/[o$sϩGiEM(U'#.PP)8[OTρAvLoƤz¬ZKPUN1o5b F=2Q._J)M6^Ntt@_'bw,bC)cP/nY*s:9IqICD8Q"&eC0v# \9 > stream xڥێ=_El"%J:}JsSh,PIh^ǖ\If M`ކùww$ٟ?.erpһp8ȻMR<-#M"\>)VΞyAJ]")MOW,[u0(V8֟]9pq.<@}"7 `?LnݩrHxL=$;\Ogk=UvSD`~ΞOv["F&EЕ h>`#mL e@<-3g߻Q|&s͡'f-z6W+{/ɏllvtr̉?(]d%ƀ)3[<Ç :~88>*P ߣF TORhe?4l m|x9xmZtaIql'ڞI劧}:V#}ן}Z֗_w- #*QfML2{Y(U~tZ0Z>o E]h#J>HdI6]^|4I4i5=~T#ȷK72,Sua#PNB zVaW=,O!)ݺmaPSxp `}SMgprGߞ!π{{4mdČATBk\H}xk˜}6I* 3Kr!ZP-`K’&FcHB38WLD-]f[u{Nj I( ϥi W$8MJ`Ɵ) /n֡;-(m]/Atx̃3=Pxx=u  3AMJ$3'Mjv Y"|*6c0]R<$~mH @Lv%B~*XYk" Eb3T!15&)f\?N\a ~D%iŞ;d_;qqTGܿSbש_3P?P8 ?Ȓ?25`L cDbcuXtZO$<>[=wl_xs (` @-n?tUR=m⎳(ߢTT gIpSw(;LfCn0/h;XD6Y+ k_xe Tdۅ8997 !HY%0GIRŊB@/ ΧEs ݱˡZr8^3Wpi>}XcSG @SUOL.!eaD%vL`(+߇_L>u{m8{ȺTvV~݆^zͰ8YQ!|i͉6,M /_><*砉'[ E1coZ 3ofy79 vmݐTKͲJ!|%~STb'zb? +ͫC5mHnB*^24er[(Q%o'&Ǽ Dbp"4Uͭ<ވ6ڳ ︎\1-lX1m(}IJSER&jk?|S endstream endobj 611 0 obj << /Length 1066 /Filter /FlateDecode >> stream xWo6_A(*ooևbI"{ Ȳ+,<[ {(Œi[NҮ~w4C#BD 6P4"Ͱai°feEא ] C&If66lQ8:X^ PcN 攣r,1Ӄ+8jW3㷿_yumy43XJڂa:`7tT:u[.VŁLtHYqXѮ}-FmD 4)ʬn!~B+OĺI}#D_?cHdېV 1aMBys e>p%0tp S! 
ޖgjkrC/kd~jp`k1tmz32, pa-k4H")4Fd>!h!9J91id-^4uh-qmGJܢJE6[ )W v%k\952ȧ..&a> *է$_Iϒmgܴ'"Wꯊ$Ձ*k_]:;|U_]T04(}K7Qu%Z\Ǒ۠mM>~)_C[j@eNc o+-k xg҉}Ku.)|%FviŜ˝(8SKI->w ɒzO*t|]2`{S Ur?0ɽ~`a@yBNvgˢC3:]UXi,5}X'ž&,Fˬ>$/sL~b;P-b=gi㋘9C:jO=*EChL Ez@;4LF0 ~ endstream endobj 618 0 obj << /Length 3121 /Filter /FlateDecode >> stream xێ}A ԵE8E>$A@ܵYr$٢߹Q|vyYCr836X=_n~|Wa WU(HV)0Ym֟vmS_tz c@G[iF},}_㽲T%Izic(&\A-q2K*ٝK,G(Q3w23fzQD'Da:S%͕>E( IJYyA[/RtxI2{0ʺB Ͻ姿2w p+6̻[4SLK~C;۝06)0[ ;a#V8M*z0XW{zaZb Qd A1޹=z{:,zT/-M/y8w]an (3&Tr!vL19N[`+ [J[QozbAc_J!j^y2m'j*swhX4L!tD"~$ y̘r]LD-Z ({^7N4\#Xئt}쑗>LYl ,% ‰\aϽKG֫MKa$+p?Y}>,N҇p+X%JXPޟ;`0OY  ▋Xy(r Nh.haȐ%! 7|- <'9!_P 40+2xNG4$zZN8 '(=b\,gׄm; d" ˪bYř"=ճ(Aٟ7`zhH)nV\L+uGVB*wt%*LF/V}rN$z370 QE&Dih#Uc+>we/1v9D 9WB1-v-'~_ViTLem9B[s57c&!6-uzz-cyz*]!y̠EzvOuo=MIeLUY{vXq-03IsDzhWڡTTRdUb7^߰ZMݡ!tRa>H̫AqXHHU/yaoȻpF!C7R]@+AP4=V{8Z*Li35>ynyy^ 礧{>M@_g,\٭|b\3pcYWʪ쟖T#lڌ16b?|~.&TbYZM?¬@z:6ҙ[ NAEܛٷce2A*ʷ`(ڤjL`E}>.zTOė Wu ;w 0=vCs';SlY&+cw<)o450-𮞜~.i聛mU-NǶGNɔL&aU[K1;Lʋ\K^ܷOg9XŠxZ `;ǿ ʢdKɩIN%׾CοL|K0Py5|Xń A:YݑduMQZn#?]DSϭC* eMLd({ו-u:W7xw 7QRn/e)6z!> stream xZ_ O-&:I-{_^-n>G31ֱsӹ߽(ٲGyXMIIHo6|IM665?cv ?O ?Dt!ĴQBؾիw`G]`5) aQ)|:\L˴V(2HCeLNPșYTILB':&'c:/~6 4^l. xYܒTN,~7jjt,'"K=F}PB+ M^2"ށBx_L24ph*sV͟o~@'H5STǛ~=bY"ͳ:nT<7oAG\YH΄E4pӹHYld۾lބWuzd'jVe;11\L&~*KYU$LeNIδ'2VL'f' b6 rQ?V4#vk;f0(n t]!4bcRyV[n˺)C#tqH*9#h\*܆Ԩ[?As!V>h1;n;s'2M,s=pܖOx0iֆR@XG4 v_{j5C_/c?7&fUB,ѓ^ZlDJ2xtoNО`is[yx(Gz?գ釨 dàL Iʦ1;jmH@1$"6|n bI3"kFz[ y-_?X(roqD?9£#Gyc9vF>Q3{vmO n|\7YjLZjda"w-Z ˅q{]InWdIM$R1.:7E.ƆE0u'[x^knL6~$(6Yؽbr1L~M"}vׄ1}uãƩbÓqK3k}BL=bXtˁs[W@ڌ!e v QC?r.oXy4|OI.B3>Nݤ4 [rޢ"|} f..PwD>@N)H r8 mopNm`5-@7A$ѴTzhz5h [c (B^@@~!%l +y&Dϻ7C =QO6P9i^Oˑ7? aVޯ6_@rG %@N"JIEhE ]wX|O!CN0/Cm0PL~=*H&gW `bF0BL bd١. 
-l =|wvi|JZ-^ZdKML<;/s{9O֔h4npL8Mn 0Kѩay>~52[cwrx:Sz['JEΨhT[ Qg/˞7iXN@s zۊ2# 'i}`E3gFXOI EXi.|$aG0' #q_#Mx{WYUYHHdB(_T9S}%($p @x%WA >{3բh+ {> 0!ͮ_4Ϯ"6O(aAz@JH,d`Vl+3dکK3L\Wn _bB8x2_WW93}g\6js YJ%$CfDp.OBVð-UBaNet]_\xʁѐlWO\o{s`1:ú\SF+"q (g(֪<)\J\Jf]nK6]v ["vpiťmk?Y|YZ䠮B+'D{=KRyflWkT|}y.4mScmMYk\HV q*=%O"gp%]vzq~MۘaXck ]:]s1Eb t^_@/_H/zҙ-Y, |:W9K2IW[)Ph8U}Ikԓ7-l 24~=u?`Mt=j;QC<2NW|FA,Jm#Q;xޑ||NuE0M4(o*8"´F5b:0)n d- @C4A8G2 ]k+,28n/C](;e a?`jBScnҁ]K_ ( eYv+*GϷA.> f r߫s=P@,Y庑5hLk9U^dJ.9R endstream endobj 638 0 obj << /Length 2677 /Filter /FlateDecode >> stream xڭM۶_[+o&ϩMs%_(R!)3APZu}> =yǕ+ۿ߽|_dQ$WwU&q+Wשj##ojvUG綡j]Zd{"2V?Q)TC˂lL"XWm6S8ŕF,YjE;^7-m:?]~%+?Z+ޞ+]æ(Z |N5bi];ނ6Y 8" 3`*TY?E#S*Y,cJ(4HӔv{~2R2aǡ9,/ \)핌[ ;퇼)4q8-vyӨ,B$_9 !zĪTmW>ƻPJK5?)x@;U Vhܙ=ߘmzI}=7CE9HcmE|]ݰllmE(Y߹b(3IHK]RPfm]spHQbVtR$h ܛ"I'aHͱ3$B& }72%a ~>8Z_TNg<ڏZFpյM^NMUht?~&4IqmP&G*Tyxl&$\5ix(Ì>8Qc|cǸ+  U ˆ13x}5AѐlC c36 ;cnBv.0`уJ rٶG^ 9#Eکx!|vUP4VgIa>jfP dhhg6`%V\> 6S/վ+xuF1Yd.qjF>r/o0}Vssvfe+ct#\GQ$#{@,EVwI}s?" uʫ0;@A,`uUT> B&ص 4kVr~ V6)t}@+BAW4XzV`@a@ktAo@'.֫ԗU`R -{!swԓM1S+kĄ2H򽪟&EI+ˉE6^9epsEH FR />|?AkZSƓc?f G=Tpi |-~F|,Yrkc\ /ȳ`A3@tTE4Dx4luݞ )u(3Zo:u}AAƦ=֥C"3f3Z>1,9jdsm 1㰤l`lMD-q8k2CgҨ,ϓ罣l4-tcۢx9Cqx3HD鴜}4Z'ocBCR_\1&8+45cȀ=nJq$v3q;) &Ϙlc\;-}$, C'+4.vnbY |ڏQ6^|y_PY o܅n`OdxI ߐ_s/&DjkLx(ՅBqg[$,ؙFc? m(Q&_3141CV3 !(56XY$r3 Q}ᗸ߷?ݻEaj b\Swb;\Nc2-U*`d+,h: 3F_wqC Kuϣ5èU75QӺv?S>pB'GP jUWPߢN% y ש;us+S'=嬥Y,=L8~>}s1L|a˿x)H/l׵93FTFϽmEվ@D6}\-CHAgZgB0`W̲ѭ\@SRKTv)so8L|ӥlUE%sl*D; AJ<7إgI~BPf*A͠HUௗ Bso"MS]90FPSO"E꧎AUJy)35.%`ȻZlNi•3*\0,U_#ś9KQN֜IHF2UomRTLmDb%=3~ cvkO],$ƒŌ3qtl#O*7`らH &͗Pi0=/_oR),Ӭ¼-XpKT: @=iF]혰CRc&GrȀe4TW&)&6BTénnV8q2z*|_2º|@#q3|qVoP`#3m Vz|z endstream endobj 643 0 obj << /Length 2183 /Filter /FlateDecode >> stream xڭXKs6WaS;S@n풝Q6[H9Xtp$y/4&_. 
~?]x<WE,bO0/Ň?VЋ[uUjL l殺_gƭj[LNFոmGb, R+_&˟N|p%îu'zN gv^/_^^=}Njs>8QoD<ܘ8 hGUVk!6J å!JwqsbĊΔ۰.ʑ.oҝۘQ'K:u>FNjJ q\^G p hVkuH<{Hʪrv5 i|vM4w^ ,ؙj}\kL3{ƨ/Y9\P824U:׮h9F>JS+E#t0ٕrFt$6NLc\+AmQwZFne=g٩@o 1R$FDhMABg9Q^NjSՆ3O@`އ/vҹӃ%RjxZD6MUJCӖ{S^RO# wⳉXD8}&¸6US fRA ӏDIa, 0}իB],rZ픴/.gپ{GG5NO=#5͗G.l?{S\3ӰD20 6վp㝋h*_ͬ{۴ +5Z}Rʪ'ܪv4f΀pd0""; /+xͩ wa)-*i&5};Ģj"_Uwm4_y@>t\ݬ_oo;6 ThL*JZ+'0w5V@DPTp$hlޒZAJˡ44/զYKcM51qj^=%)0d^q WHq@2 j[^`V¢Foڏ"؇h C;@|WIY(hU=-l$ae Oli.`9 ,g|q%Ķ3Bc"N?aF֘j]jlX$AamgLRFX٫$"񉶵Ydd+4["΢`qO>PH/6ڍfrT*ͱ}ȉRh%| &*ͣ>_ApN HƕƔk([w9Ek;ݾ,]C_ "`QДViUˮĿ|c؅u!頿IVP?"@j[(-&gnADģm<QJCC/gzTAr[m J(0;$\;]yWYanT+,&6afJ 2c[-}WRH!޿yv]s*YLQP>t^xbZ(F:"+* m[7v]):ۅ2 qgN:>MCWF֎$!lI9IJ7k.m9TWt~S< @gN 5F}QO, qO2e+]Fפ%WbxǃЈqVH+3jb2|6U]oJ?mc6`@5f^؄\!z+# }y!KxVL?'Xny~0}}ݪ @HG0p~ppɋ9f N B +oM-m9ʈ(ީFg}+ l*{ 6:H1m,&ӌٙb-yGN7˃A mL*3ڃ0+e- UjY7 _ⱘA4K&bjq3'3D3|?cUE3gHB>щg<ɿڨ?ub}yE`HY$N$Q*C{z'T( endstream endobj 655 0 obj << /Length 3305 /Filter /FlateDecode >> stream xڽZ[~?6fDRS@n "]M2wYrEq}g8C,ݠ@;ù|3Tz^Wߩd%\=>pH%zܯ~x񇯾xP&8ܐ˶k>VkK3&RvURAuSSGw0T(1*ks7]NfD*K@i4'Ԧ;Ѿ9ͱ`ϣRE" 3ڮh&ָratܙΫ-rܝm밥JpҹTǁ[lLH-\U8߈9{6*X;ޏ6= msSLlT~>ߐ`T$rGKFFυ9ueS/3Doۦ]<2"oXڢn蛷GSw:;jC.;݆qBn`c_S \r*.TԜf. @ tVZ"?VTúT,6THp( 8/R"֊1 4B{B|=a  7 (z^]B J-h>PS{<*a:Ee3wɋbQ}PhDHC^Y.kǜG42H`hb%"M7,Y5T=W„MI~̖YTb=݉_UcS7,2IHԔ`6 p3 }VFu8OMhQ;K$i%U{q=[Uey#~'Ԇ6ŭ2!SȀ`8FݙtK ]YP̫>) ;CEwԫv8T'TR04BCH d9:P'8Kj>2Y&~f8JJG#.==L&[ݧ%ִ* Lͱ_y +?]DH6Ж3e4K(!!|. Fs~\pӸgj ȎQpCՠ@hh00h1Qi@ME{ޘKaW`76Zﳩ ,V֝i!+;bڱ0E^ tQ/U,+dzw#d69bF5/0ռ>\0O83а3KO1d\l L=fۻHx}Ƃ⠑hyө[>[ (?xz9_0©};]A*ssd$q}Oebd3OuoYb6jt{sgѷ5[97 XXr蜟g;ĥewϔRcNb#AڼoLɀpo{A=2ݫ f53f`<: pzC֓ ^IhV_jDĺxa"glhK|*Z]]q?98s`B ^Z&cptXdqPS^􎀾ߗ $O ~ ᙉo@hReC*|q~S%q3%'a\#l/О &@ S1Hr8bD I T'0ApzbUcilxF( / GLٟc2j2UF5婱I&])'@-O ȼ޿`Z\C AT}@i4Yp(vj㆛#g\^EA{Lom~;-*x$R dP쿘\4-EzzԺiͬ%w<:铚\ Ϊ91i MyzN3@OLT]EZ̴,[KR=_>@@9*멷3/=b;rig]n &ux-h]@Zb"YuՅ0A~mL;=ekp$$9DY!NI Uf K? 
#0۲ZRIKAAwNB8D*+G-JϞE/:!B:DčۈȾJY7"^_L,7ҋ%yC:wx6ێ95fԎ ->47Sp=cY<-㟎?cц 2œ+bhak{^IYח 0}f&Re-2 TH%'v\g endstream endobj 549 0 obj << /Type /ObjStm /N 100 /First 889 /Length 2399 /Filter /FlateDecode >> stream xZMoGW19{!p(1 ]Zb%R(j88Dr1s5=կ^}\p)\$\ݏ.eGu?*bGr9(iqUMBG0"T ^ڟ#mJr8ȕ NpR(evqGpRVj1d` e{?e'ǥb%`uBj#''Lw8LI5DF*9I0bTpi ;xC )E"Ԉ7#GvЦj.S-.*B! <;V `db!X(N ^2Fي)V[Oe=^XK2'L6hՕp3%PJW \"#źj]=rH2(G1 =81aҴI5&גG C w+XT(8S`%܁:q e Wӈ S!K^) Qٔ!6 dl5fW[FS~!08;|qq;8ur9^]er8jol\sn7T ?_xgu{Dy&c вQb~u1YK5&_?Aݟd9̖NaԼW[|q>Cb/,Jl6P-.vxu+7j.?,Og5?'}[ S !-W>VO!5k~rrfAt OGu]~xz"lD9F/[~u:mS7hQtuLQ!'f31oD`Wwa5 1Ig1o rd# H̞mKyi\Gd٥@vsDL LO,PEWz>*e! ZW[\>@2Q|d X$&/ٖ%rFZu!Ɣ32[- ޷gcNaPelŇy׉#wGʅz+ G%m~ci_W[:n7 >h=/ oEsF`$gXK2*-D0$G}FTWV*_Fp-WKY\bQBP9Sb_X"wϓz>/!:hWP\ˠ ,oN1+iޗ X.oXWk_w*^5IpoiR_+$D۔#k>ϴvu0\?=IB[bc]kFAP㵉C2N)Vs.ID@.jb00!.ĈNOYQr' G0q s7I³2m@yaFl9c@tgd C ZvT_v z_>CsIñm|Xii]tӐ)HZ g+v6vq@L>fd'вmoh}Z>K ؍D0t> X3ΔNk!ah$Y3lmLO 튒H"yuEwxx:yI/| Q6kw bmRTH= V=yԶ_74d"9cGv1S 0&3"bDmD[64Z<5?߾X.o4?-ci|/ɢ.|к*GxaGeis BllY\>4WC[@W&WL [}]~:ƭ$l7HOa϶..* ʈ8FSQ˩܎ws/zBLeS݁ 9BbNOyǂ-mmGi%Qc;}~p)>:KmBwjh؛m{CsN}?_O-?LgW Db"e†vYJٛ2V^Aۊ1x.5F&xu@Q)V<~[\5olN.wyrߓVqGm endstream endobj 660 0 obj << /Length 2618 /Filter /FlateDecode >> stream xڵˎ_A fM0A63HOW˽=\,HVvyڄ??}M2,RQ&b&}gQU8/( ,tKЋnkseQ@Sg.ÞVc߮=aӭ=ZmZgsk;:NT)ρ!賶gU-^;ste_nvQʒ.{0۝ eahVaJwB8SYFI*0Q;dz:~Zڼ|("{UkB`2=uCЪ_%wJ<֪^%#:?QB =5 rQfsI~ mie.0Џ0UnhjZ>LËo$x6 L_\5*2TJl!Nvi?VU׵K"+":< CohivOBQ;Ly7w ߫LS>@Z& *n< it 0 hn:p4t6z{28\X&l<~t"':8~o\*kZEo1wPnԫKZ}mUsCy1j}luhꑄoUu hozãnwt^ȫx̐O+#v05irk6V_>5` mpM"'fRt؃rp,q2bi9]N`۳'4 I KVJ4}|};:Q d XݸJ۬"FL x4~"{l?;g(_N;zv i%޴ RyG?qVYͣaE읝Y4[cMo`M~vM(a>8Dq]XG #3h4rfעRQGcp @2"V&l`DӠG.Б7F,;XNyOk} nGϘ3S/zl7w0$㕀t{|g}G; ,*<1+D ܛO }ëcP4;. u-~ރMEI[9\ѣRe|U2x=M!JS Z\T)֧.t+h`A@K"[ZXNc:QŊܝ!eBr^Ra`SGռq#KcRE 8hqP& <(O0OޏRf"S;R9YIC|!+3GĀIH4];g4޹bJ~Q F" N<ȅSH ! 
5\`QeDMoP"OrӾ©^}UbUϜU™VI"N&Dqe-.zA=J[Ϻ?+&au bZX ,ZtPK勢+,߬TCZ{З% +&Pł+3 ł[Y, &d߈^,DlX#>AMЖG^J%'!ɓ5c]8J``J^na"_ޯ ]9nvtJ5MOd]^2'֬N A;9g7m,2U#m3T@m2`4xO}cR3ofTSP;:=Ɍ09H}Iɺ\.0BF]~[,}..(T% T*uutjR? 'Bs.ȱ:J7C3tvEֺ~Bm6!4<0[U3%_r2 \|[ a,lh3 eDlO q9 ϸ2gxi wmlb7e,+55oj&N1|YSD[M: A-Mɞ `&LڥһLD)b8踶6aQ<̟띮آ;<_}~L>q2DxC`D vM\S2ʆ7h? d_7 ӚTW?> ,E뾜λ^>.EQ! !)0(3m endstream endobj 664 0 obj << /Length 2654 /Filter /FlateDecode >> stream xYY6~_ȋ6#:`ۋ[4dn+#u:k?h On4q/ﺼ[WIedsʫfM2~4*AWIڂRY-c]t ' sMW/u:#O"&>tWJsvuǯ2mH]eBhxnW1y 5u`L߀& Ɯ-CW#/LƕbAJ$k/y-"yfg s]G !hCSM΋|DcOܜMW1ʌ_~x?{$P- - $~&_vӀf= k늀 _g(+dRyҠ+vOr>o{{$~ԬaD]ɱccT>""Fy 7[: "PEh goE<_zzcR 'd4o|Jm%kPlȳK&kr=8NڻD"/d7Yi}B#D%g>~8<"qUcA $p+l մ 7/w1icKc!+G36R{]{5QSxrڀ1.8MO+Hbi"~+}'y;zS@KPoEpE(nUJ4y@@ƯxrH7{ ,MA@nޠhCWX{JY|l۪ sn\}D 8NuʟQ0*c b=їR3`B$ V ju@˪Mmc5cLWeKǃ#xe<QNRT@Oh<>y/9,Q뮻1|QRН9 !v:qtrBmUn{FOhZw0#%JT<:V˘{q21S2nCљJw+7(>z!kA(zI&$YL4~սRؿ|Tx$ D^ 8۵ک"9VwXjpZ@1 '41;ZyH ]jL{zit}^|䮅"ܗ{.Us*IEYHD [Me6քlJeU8(EP?9cۆΓĆvhq7݁:CeG'䎉)-.Ni6$⟳I+M9̖#_g Wo)}/Z$84T/#h% > stream xڭێ}bޢlFy(M0E}h,keee7=7wm_l<7‡ÏBo0كU6* q g40>s11ұ:?Y{G޼%*,2:㵫*G&QA),ضv=$^`Ww=WMpF{T#%L²}4I#5Ja.`$.^e1,ʢ"w"&!l+p y>;D2A<%Jt4tDmzRfb#5vhV+LHκHxڱnrVniߏ #aaK~ZzXVMβڝtS{\HI/< *dophX1\mxAiy+šBC;!`{{ T}D-V0?:p,2ȪXS# [)t9S_wӰrLlXDD0 45 خ \ r5X>2 #~mʖ̅e_~ۿd L.BV\u-^H^jDddSFIDeYڏ*&ENim=eͫ9X#\:[{OS[tZR֭W?pDM4&<;!9@WZyDgpOO)-6خ+YJs%¸_I]8)5 =uI,Ez[ϥ>X$w>^a:\qdR8rwFVp-|ZW#MUOs)BwA"PۤO w"z|Mr F"@-^OAQ7%Z[XUe,o=HfeSDkeR-b1- {+$oGYC[6 ӉY"ZdVOh|벢$Gɉ{Qseí'8`;)X٪3* eFy.B6d1CTEߠ*?,J^Pri'BI?/f\Bd6 -~l\҃e-BZf?F%-bJчˣ1!iyDހy_e{t?n*d0:Mm% g3a L88 N vdR z֛ ix3s%ԬR4rj4O#r OݗFNqCț_h Aqg?x b_d *d'.Kcm#w.?JƠ Np vC͖bnRB,E"^אe*fo~gtfp~e{"qM',妢RHfZ0ui rᚬ|F7@r۹qTJLj_$F4v rX\z,@%Xu&T \-Ϯ  LhC!s, 2-;w " a^u.WAQ^Kêe1U̽Zg$6 ybbǼR)!aM&%тM yV^^l xI)V6iPDh}L8Dj~_*AGM>bQH"xF!Tī Qy5SBtC\X2.Q(~q:`\X̕`{`auԾPA"sΟ ebVdWz PFEvBB.@q:6X2D1qZ0C5?łd{Gq6s]&/"U @;XYp|8T4'a;^BVs @/1׉,!V O]ta<ʔCh㋷ǎk%rsV+efW۞O;r闤K3k{)_TܸKwߧdKT}||nS=@bNBs;w?]tl֨^wIFW]#kS09}TRR( UMǠ6[9d-l/]JG HzGr$W Ur&2k ?4k.7h|׸ fe1p =CL+\x,N׹],|};lvJEw{ q_ʫVd;գ 
#|e|Ou I2-SZq}  {PU#oоZ6 %āxSΓz !ಮB39(zK.:zSAع6A pJm}%~XO>~Hλ@(C{BKE!ĭ'- ,gG@Y#ϹÓ^JGY> 1>唂u 5 HT3X> stream xZKϯmA˧$`w1AelmelɑȏOEݓr)*(-Ï? `he"ePx.|)󩩷Ê-,%:I}i7MY+>w!ݾxjԲhG7@XҾ!8se GƮt:\*l3Ŀ/Rᢥʭ!XI%DIojֿ OeT.VTJrR,]XT>֮j_P5~ ks3Ln^8o ƬED~SNBp"&9"pTtRY:y5{t M\#zYEϊ7M>8*;/]ldrf}2uY ]EcR0C1Jq%; H974X}u 'էI2TT4&nrl'Kz$8X)_-s27x3|$3҄%
q] >`݉#vfl?Y31-״^<ʩPDdum["[Unr~7"Ht2EQXf|u9G g((M\v!c哽G p &#"<ڑꪲ-4e+LűFw CX81 Edm$b}v)[>2xb{3%^<,\LduO26f2rz8}+Qne#7Eo69;p0rև X`R-m9h54^fǩ&iv{awǂ|By}E?yX6C~_TY:>yXG_K .6 KNNdy,Dr&챕8]onjhm:.cm_nm ^Ѯ k|3I)݌K;cyTɴzƆY9Ccn7,[ή=6LIW#dQ.ʴd@6K t0\AD|֗ $,_KV+Ca>MVNT^'E-^z,wd}GapS 3u>EgPw&u,@exSZ,d@Aq;kHߔ}/鷌EUiD$&7Q$ThQWU|3P5'ڬ@͇mސ}MUJ iiM~8!NwKEkB )Pxs5dE4]f <7K3"hzלA61Ṩ\ p^.`'|HF>T9r6@z Go RT(+q8,|xrWFWqJ Z wHVߥ?WE5ӳ+fCø+[Uo0~GV>wNUGW/>K:@W eQ| "I~)[n͹ qNn\q"~GW=INFP)&\Dx jX` ~7e=߅܂v~kξ3l鹋) 9 vʛl"#$ ׋ss0~=}n2Mk/rnW5aaK_ #uIڊq3zqzK=H3g-~ N4yLVn7 {>R+Nr,&%c^VwB`+Pg+9@;IsS[?=}_ endstream endobj 682 0 obj << /Length 3384 /Filter /FlateDecode >> stream xڥَH}y*+  إWZ2Se34_O\]ݳ˓ȸ#ӝw;ΤwS]*?b+=w}.C㝭>9s無v@CRp+2[!ge _nn[G[Ah+>~6}Y=qh$)8ɺ$^ȧWpߚ"Aζ 2qiα G>"nl2{~ƬdGgR1@rwZtwr+#D`,].5w 7O*HmC7eP#p= M3)lyC?AMKҼDyQ:M kag9t O5GOHcżרNz/m4ڌ'eg7w` ɕf[@5KQYf^QZ\vd"v$0p@*7.tdsOjjGk(Cɞ;+%G3)'"Ğ y溩LB_D*8$3a u+Lл7,lD:U81@mDNL36Ü2)7jyN"B. p cpV5C[d&cpVǻ Mq5H/4O2BzzT<* F=9̢fJ#b&@YV fξov-eu"mt~C4_=D'p/fr8 rCG 7-R^Mض%3`t ߙ^hV6g1׳^r3cXTN-4}722p`] VD8Y*G{M_cZ*zWgZ`E+@pY6r^`-,آG<Po8 jXPN+cˎA<6> fRR_S( ۼDJ^T7,,^o*N_3BQQ3*҉GB0! K5M TɎ8FvDS݁66,q)VOw-'%@>++N0R{YxKc5 K]=&}=a7I$' űȅj@z A֞؂:R u' s؂2{@ }R}V2 IDPN~YY놦<*N\h`5v4HlJ62S>cGqJѷs=\ 'e엲?H˳=zkŽѥ\"W qj'/ujL.9 Ek:[u/e'g 'gHOMcکӐ_dl#HfKp8uQv,e011Zbj@Hz_b %3-g|? Vչ%&Scy){A e'/#}⮸y YS4tAl.9&dp) At,g%zS Ve537UZYn>DO_"lÅ0Rxm`b|\%r%pTRBnqә\{iL:ܽTpvS3:A~|72X oDtÍ͞P4<ٓ]S/$ӓ7yy|5?=)wnLq0+nEDʉ[,؎]¶ yM4,{3cG)x.{ #ҕ1ŦܒT%/`(KXPb/6 ꍗt}a[06)ȏfJb'+qa|? 
,KR<@(J :'I#`{BSm endstream endobj 686 0 obj << /Length 2505 /Filter /FlateDecode >> stream xڽYݏ۸_bIQA A/{(ဦXpmYڊr-wCJ:kDp>3%UU_}^+2)nUKzK=tOﻮ4|ضvV~e9gSL ݧloi(VZ0^~n1`E4P".F8 X\yMUTrTZ&S%@jOnlceƲt/UPڡ7[+x+ 2Lp]4F(t{OSc@/o JQ D99+ =Ch57]߼kze@ÊLtx+{$0~ٖs&4kkM#PnzRݻM>x;{^I FdAiG!d4DҌ{K_r)H\Th%YFFJwv(^BrdOЍp֚&~|M;褺9V4ݝ+<+o+0-HpT@+Mf돉JЊq !MuPhrsFsx0}N5&H8Sds#@\flڮvC1Xwj S|ư\1/mJ#" ńr6sd -Kx(k'vX?h"vC~ 3[ek[K0Y~T!3L|c 5fp 9 s8Yoga6:͜I "ENW@[A(0Mkihbk/%K̝Hҗ+6fy Xeaz~0iX TβV30}BY .P!cy Ѯ\&;Wb> abH.OL0wnr"@?SģШU~2H-14̯t X -E<|9êl鶈 lD8QXH*7Ӟ*(# 4y\*I5(9k <7a6`8L2)!tK5vacGS׎%SG= <]Q3fZ*X1U'*B844W;]xR?}ȃ{.\sS|-bhlIpziSsq6A=ThU2w+[[i _Qӵڕ;A4xuOmDz(/wѨ٘qˌcN X>*gSm=D9 Krp_ũ޻@"[y%x_[pM[.ΧHU`Q1oЁq+5`2 ,];\D) Q;\~'5fX5bq,XA<ڙ`[5_ J7dE~J T)\_.] endstream endobj 691 0 obj << /Length 3273 /Filter /FlateDecode >> stream xڵZ[ۺ~ϯ۱]ny(i^CS] %UM3Rdzw/58p. m67ѕ!.6a2v_v j|aa\Mt&92˅гEr>~.#L& @ a!4$̆RXO. VFJ`vUv{)pwɁ@xחKV0Kw z5PюZ82=]l b>nW"xaBju8}^=BOYA:Ȳ u-"e`Gg"@{KId0 oɼvu>&?]k>2XI-. )R;p_ԅfČM[tVbQ|^iR`5KD>P\";e O15˂<VS4bz m#t}l]1Dq [5d|VuI ܝ#AV#WPLt.O :0bjltgn[u84V&%'X`̂Rbi#T5W8d)d_1oK xy>k(чKNK5|"LBm\^|U;ƻhXCsVH?䌢pt`pvBX0x^߈f*tw/_ G)Olj;9yiG5-zTŐն'FфFH9=t Ʈ3x tTtб QrI,<9^KDQ\!U f|8Q7f?PVs?mYn+k0.45,ɮbDH{;5 ض]֫z?Bma]~ ӻ"6g!O~Yܴ҆)A= {[h ß̬UthR;8iaI"b VMv1šۥ sST[i |&[꣛0JJF k[ eT*^ %&2+e|vwԹ6_l \ЎSpsC\5]X M/BN'XhX$ooTgڦm MfqAEIK;61r7/&+B6٫+T-UVf1duER.lf!$K [gY2eƥi4kv*lfX|-/ U pR\< ^q)3+_K۱_>6`qCW><Cj`Ko&˄\sX%`D .oTacoLw:-@+o%ۧK?f&y_MVU?ȵ S98{gA8'X$ 3;d E}k@nl٧yݪlE![D܌mR˄Ԑ$0(]1yge"Jh[{oyh12MB (Q<jiHL|Eɳ¼FWtÇyA`O4-Z."w3tte5K磡Adn%; ī`DxW!A`&˨mכ`'9Ӵ_)U_դ`};ؼGkW=[y"h./W в+3AbY71MF=T)o"7֪dܚ ЁnρځG"vgqH}DbXvM=_7cգ[6Ks0L ".woJ endstream endobj 709 0 obj << /Length 3772 /Filter /FlateDecode >> stream xڕn6}.mV$N&IvG?uo]|%AN3TdG2=Ök}V \zyxM+\ k- i ]&ynIϣ !>-Akx|щ^<#yF}GVDWGT΁̀#a إ ރt9|}h]Y>Ȥ/Z*nGMGSX `* 2=L]'xn$>-߿|Xl{]1]Omׁ60:lh3X4[MI¿}el]vN6`<DT CVG0mcBrE ]Ct,v]%S(^U !(gAOtklqjw'P2= N,N~%cqp$`~_3pK1$Qic`@)N: Pe/#8[;nBI#r;GrH|,' Y@dcLt'vw :Aе|>f 
@hazEar6[?N;i69*Yqq=tl9U})D{a-&6{dt,$̴dɔ`/|0tdr":{vAKQ3;%\I\Cjm4@P_8# pw]۽AUgZcbǺUK8R.J0Y076q4诺ݎ0Hw'x؉_:p8:d.sG?K_FN*;^F=YKjۃ(U"Kg]vq-fVp\;.,6,pCd+-,馡 p5{ޟIrT|N\!D~ CE$I;Mpŋ/{ 1Hߊ_J-p·^%7|YJtK[IyAsl*;~ne:oe=WIe-ϻ_⬭ -^ ImM7>$J/,n̥AAiذ0hlc1ΕZAMt $lSe^D zPziLrmkńnV-hp9\ -Qxy\jUTB=t]3-4 a> )^d{X{p#ߐ]4OELg!voB]k+?!G_L&L"R55?"XKHJW)  p0|QKQ(΃8]b #ezSONok'SnxQ1 _B9l&धX"s&y'!,{/}qP+Sko8 K +d}H%#w<͜3zs "C#L³,\F迬-<@sb #zKb+5xӈQ;˗mkUՍ>+ [s$YYGkȗew b,pDR$3$`j60f/eݾ_Tϕ |ѡG')-WzTH(<܁5DYrQml" xYI^}æ$3G% Ø9eNQax{SՇVSIB<9Y'_Nߐ`ɹ=5P`_k a΋Rˍw=PLK?%"L$Y@Ke%8/'c$0Ԏ_"x5va)uxM͂TB2f*3@%V>ղX ('4dܓ%iWϳ*t $Gyq,DhO|ȳw2KSsr!zD?5K;I1+oR敫̋PcmIf?@b&~OKq܄ℽ4 .z}*/3#%|ge!ƻ*#޿-?Cd^Tڍ/\a_[nq;E` fPQQ ξ `cYv*Ta0}xx|`z endstream endobj 719 0 obj << /Length 946 /Filter /FlateDecode >> stream xV[o6~ P%%Qb(:JMRY~,+1p#҉e݀{\}VaouG!9!JRE8UueҼQb擫7fDtD pN~t'FxGvlB]|?$UmrƲD8uYfCIQVuSf؛f"iQY^WXSRnJ|mYF?/lDЯ]Œ'=m ږǾNncmf+&c҄I# ^<2 <+FI- ,wXIF(nSmpPf=MDtM&F6|\[-h.Z=8ir An%65;!ҔW`tc;hTJ|: 0Okqu͑NlD2!\\vA^JaMTCFunMp 냠4m e颪pv"oP@Jt Zux }iP>59lZEdIB~Sw/,ՍbFHvT2>Ǎܩs QFȻafXhw-N=7{ջaB~G?HUg{g %D(|NՄAO~%o>#ϝvWY=/H"/nY  'J2,% d{mON/u)sϩkDJbYXTB׍4~{x{C?:x?Qu)r'q\G@ɮ|NEB?&+,}Pó+12Xh&m#>*|Xu uYգuVe^ 蟌#0HE)tLGS5aL4  106 endstream endobj 728 0 obj << /Length 3662 /Filter /FlateDecode >> stream xڍێ|< xQ, 'My) ڦmȒ+J;7JWNIp;NxÇX}$_8*R^mYUzrX=='i6 ֿ˵X>:^U_h8oMjS_L]dænOl-˓Ǯ06S!cop?2Tm[.9sO%ӑS?s'xsB2 ܷMopu4VJGYZ2yha~: :@c=С_z=W35zYj퐂R q*WޣMo]󿕾9Ֆa9!K} eתzuv)y%aV,Hhj3m~E aMy8r>X cσ,Ca\îɝ(*at׮= {$1#mY%gY| 5K30NbYTIhH[;6pt؇ lWn:'%KQajoK&`80~?- 6(E.4۷L$4|$uT8X d)tMI0h.Bܱ.F0jފg,)ɍc6.ʢ\RM*pp:G㘃X뵯ZdN ո!{~IҰ-:XRw&$ʆ '>b13BfX+*种b| zng+Q`__SWlcB:=dVt@,j;lLPJVA}tN!s$5(jYTೊHQEqţZȿ8a8ⷐ=TGy^GNŴlE$[_QيEcC5&[ef1*/BϹA{;]26\l+N + bD"(uN O0ƅI9ZW/0Zu((42)V&EBc!=IfG cRO$ɨB؉*< L.CDQcRf܉Y<;ݹN}c݋%aŒD3¿XuނU}ښ>hU6t;oeAX& uyNe)?i}K'EGdY꽚d۲uyЎ9҂1ʷ5rJgU\ԣg% p";2 Fu3 G``JDS5UI)Lv2#` N;-f-aoAZN-+D1VLpGYWx3J;4e3)͈yP-/ #).7@.z'% |y\:7P|(p^~Kt<.r? 
@7C/phL5%^SFSJ'lrx]uJ[˅S$v]4b5Z ZCv9؃"N *1WI| |  7.3#dmOd*rOR{u7ٻƝܥR=V'Ц 5Xy%03Fbzn1VQܿN%h_, tIrCfs_9}1DHw[WFFvQ,J`qu΍CƍQ5*EaF2zb4|嗋m㻫n EqDRnImCݛGijx2S&!϶ou*Xj|e?AA-2T?CLSō04 @Qx/h)VOQ:J,0o\s?RL"q)8|"O(?=p,v.svTHJyq@(2@ .cm)u$.[ykU}Nek.ObBZHi8@1Z AɆrI\0114.,Bc@?|AKkdf[b.gQVf ʲ|\oAe+yY b|q& R:]f eӻZ#b:?G/TI> ˍ{95=XE#XeW=& |fW3/ u}j; "!*H":߾0 nW|P< 6wݷGtrGG0p{z}NOu*'`gE9]'/u'q5[ˇRtG endstream endobj 734 0 obj << /Length 3956 /Filter /FlateDecode >> stream xڝɎ>_@.n戤$ h` e,9OSL<?}G<@eA!Fͥr_>lmM[DZa73-s-ۢqۗ2SY{Bsͣy3x=ƛۣ6M |hU #oNe?-~Wʮ]O"UQi\ O3QoOM_T&.6RTX^nW5ǝq7oj^udFes<ewkЮ9 9utmuQ)ZLkE/JO 7ys>#J+#dЋkxVh4K5LU\-Peqv-VplᑎkpXtkGV4.rfY67Y) 5W֩OSèPр qd[SZkQʖA fStOQoz)Q$GȦ8eQw+d>;(NknX_-\7.Rݬ[pR _^!=!szJAqpup+I1]I~˾Z檮a^OWH _GXPvY5M-ۭ`>}U S;"nzFqp=_)'w$ ;4sb6 oiâD**Qڼ/#JC!QeΉVM ,mP{=iL/y^~O;6gU2ZAH\&| -͚xRYᖂdBj OC[:|Nw.Qq0cU%sbu*Vdܫ pl g|o !wLhF-F ZD)P2JT;boWb2wE_Q(a*X U.ONX+"e&w] q8[Nn0EKISt<Wr7[Vղ{sW뒄=8]ΣZWr 3d4 pUJ[Y5 C#xSݵ`޴m5eH :G>H_!$)}!QW`^YlD;!"T/0Y-@nKV̳-06kIvpT:07̝z9Fu=8 k=7W/qDXt*XxA3&yʯoCǞ]`4BG:*tY0@Vt ayT o= Y99[DZ(OgDB 4.lOnuA H b!N)m9[[ :i;xR1к!.g4xɦ-DHē:B瓫 Q56@W1v>$)y<̺Ŋ0k<]6(:t/vc-/\NDo8;WoÚ,æ5{Th'>\N|ϗMlG3*#@1:NfO"G_c M~ֆ/+Ă]ݷS'aeڻr+$ނx:&ංnm_(ŏ.*/UV><є*Ѣ_Y 6K0u/4ِ79c혛迠 ކg/1`*Zp ?IFKFiE!`s \i}q$U; jyKTI _dp e+B#ޑL8Sћ Tb=ƌFDگfn&5@xZɈO[S0v_V!Ņ{$0\olsMǻpTl26er͠>'@5x ԴdrjF\:e@]$8G/5&&5CtSLYւ(5D7{h}{5%?Ƞ?tγm2eǚ޹(qL:S!D4bîf {^L0B&~)jO**YZ#X6>\̮$ #H1Ҹ bė5B6NNN%)^hɣQh_~T5$] xjTw`端fLWkqZт_C}Suw~!WU _j޿1_gi>{AH B\ƙ{7D3[n&wB5Q^K iZa/zb  F&dD endstream endobj 745 0 obj << /Length 2439 /Filter /FlateDecode >> stream xڥks~;Lr7N:Jj'8v+?@D5m{ Jx{{{{{>H/ B?3pwK7^bf{8sz~m[}7_y}+~s ÷#Ƿ /xopʸbaߴIy_f,-򡮶5ΓbvӷsA_Oϩ@eNw 1R6uv"m'ufUvҎtƜg8znm֓G?d`ƕHxqwۅ&zay(EZ\kuŁyD? 
7ʸ-F,l/h D EăFEeI͓£*u]2d+جqm.Ak'-#J^!OT\x X8zGxG:볦3X\7 ebd{vˤU0O7D R|W}%zT)W;8^vOeu4l/ WP3eh_ȸ:4n|VГΐn k Od|;;ZI+r۟.%Lb$iWU~ Nc.߽':ZZi\&pܨ2OHtGAF/0Fx_eU}Jv6LcfNӎ8zU7rp ܷ ؤ3 pi akv+F1ᡬ4,MrYطUmW^ڐѽO:nL`P"^Z[qW1'ftRr_OS-_ǘ 2JGmxelٲ*BT>diȕ l*k> Ն1:kr˓ul˪C6MeP5;zrX,U )|+M1΄5Je7)r{Řܶ GMeL2l"m{&`1pؐp2)#1d/J @r($=-ʫRw]FrX5aCeE%3q$wrx<Ms^JUv"*`HTO '=ȈA0 I$}vc)s5cnI.0 8N @oKBp>%#c.aހ" лD&dF D$F ըi3ƁrLը*Ϛv՟zٺʡ愶'(8`6j1#,-2dpH`-ӬmkSdh)Oz G!i"aV6]h֖?I -z5։Zwl@TTqN)]~ZZ}WC$M5g8% &ec"ە!ٯsvϤ`D,F@ tjꑾP&ng;ȼXv ෽m e bԎA]mϿas]Db`ƪ9r@3IO)pٝ`|֝;!"yd qQlxNo1!< &@O^ t1  HN_ow@3#태cCߦ+I҅|w+.IJjJ$)S4]2I=nSC/ow-/#HZLBNʍ{ IlhD]7>Oo8y4Tǒ34vY3k T^79j312`fOtg}Dws*hAʸ.@TQt@ބ$NO~F+|*&t3fMmpQĩnL7"wh2NB O{y jNa?K!Zn}ȹUWτM\@w*\tT2|L\yA:S>2}LS?p:I)?N:{kW> stream xYob!.(Z\Krͥ..8Z8 Xkk)~"93~ys񫯼d%\'q[*eV_t6@D7ջO^W޽}߽7cws?N5ݯ*6; cSr%jxt/~~*$%a'N?9= 7ǃ+DY$4<>1ٙ}q~U-iuGQQG1"3rinؤ(Ĭ!I' `(2G@ R0`=.^\BJ D8*:x2Y=Uʏ}G>şd0!IoDy1Uz`I:(F3?NEuy'`BzJp;j́ڨI5{V{F$0[]8S'P W76- cܒ]^wP3ù3`C3۾3T[Wu總Ԇ]Os"8`1z q7=~hl [P5 [o=z'tUV.kTec IyZ_B.)8kCb:s"spE(FHhb!ҏoC iy;71N^r|{AovоG AQBQOpJ 4qhV(c> :/9j )FYqyQy2$> stream xZM7W9 `x]`0bvc0l!kVьlI2 QMV^b*r"5Q-Ij֓}IdykqiIs~ aD$bYVq%A?k1bo8LYD0(:N)q[KW0. 7ȱv$6F ] s8PWq Kb\qXjZ$ZU ޓTҊ̬CTO7 G?=q%iJ /8T LNFV4URSU`j߈pxKrgr\iCT nu8n[6ks#tB96_WRXqLp1gaaZh9`&OPR$]dRp}S8iO4'YC_N0mh>hWu+qOWS)C΍y+:-T'+ `qUcc<, f GUgO߽^77lz8͟oճ,s3Nf_Wg l0fz,^!Cz ͟˧4E&YK?oud8gxbG*J󧋷>}17Mn<sfyZ@|oח?.ߦa H>\d5f) ;0>f aU qWȓZ}dC-Y5~HZ1e2``CKhOƹt̑CZFr]> z´<;I#}d ީ7į;djc)eB)W:w[v7p75*硃jk``Y5 }m/n iQ娎ɑaΈ8[|rxnx)'5wTtڵJ?4ܧ 򎠔ųG)^ % ^m qP*,3l{ WuC[OuT$V qFrM&ǦsNsNXش4<2:uRk0]c5zXM D޶P~XZVo2}9; p(cy,cy,ml}l2ZG9)aRM5(sG!<1X'߽B\@u$2Ёj] b\p(S%q+PN3PfPDرD`NyS',48n`Ҹؿu")G\jH$٘!źۘQ%+wՋ& u 87%WCLw?#?|u\##yw:)yZ@QŋIM_޲zՌx1j(r|oPD@;cffG yO .ya! 
seSMl_4|xHX) .Pu89 3"S`xW֐ 2y_85Q  endstream endobj 775 0 obj << /Length 3105 /Filter /FlateDecode >> stream xڵn=_bQlVP`vN1m֋y,%WIFYr~տPQ(~(-ʊ[Km2wRϛᡟn;9Gj;~Ɂr#,[خ.o8|;u5K[|&v[ߵ:sxe֧ח~xSwnunvh ƭZ4o<p_Bn6f3 4|Sl[mv=A-0OޖnŷRsTho̯#.q1*BlsזD/`r;{?}3||ۻ3Cwhû$;Fǡ+:?{?LPE[D1)A՘bX<ų *^<%ϐxJ&&8XLd0hE <#y}OOYeMMCeAGXnN :Rkb&YuŘ:>x# h2.AÆlp-x˻הy9'XfdP>?? @y8ێڔ(SQ$ }(,7:]zq=t5hE])ES>G%-:v7|%콮$=#qsC:{擉, N <r[bGE{+>@G(k]L bX7B,[ n}DC 5{ܖmDط4T ~WC%p=24pߐveEt&>Df2>N9$y$}$H0mU;} E\(Y^jl& CW+IT0Oo  DŽc"c²C;|iXT~4EyA,6*!X(Ď|Ţ-ͦ:ۓ#D*v+s¾4׺?%/#pHyWLGW/S yg{xбѼ%{"!i1-F{űZe 8徴ޅ,+NB*boFKu*A72T-X;[켽CI=:{U"wTJA=Bt8Pr "V=}ݡ%eP_\1x4Va~OL6JL*& mU4߂8;z8x K g8R?9hؽ  cO*Kg_58)gpĀ|_25VcQ a}~JAPV0f) Xn} >`=F8i.'.6TnmieG7Cb2)} zUT{Tgvmamjse9,2C'B 'Ucc endstream endobj 798 0 obj << /Length 3412 /Filter /FlateDecode >> stream xڽZK4+A5%l͞ f =0ULɏqU7 'KG/Snv7w/M"(›Ǜ$B)澺eSAi6 FS6 6LH͎kSE߆Puu@nu늪Ÿ'3j'lu[[^ρ?MҊܷө5Sb%۫:UyىЩ5T }ӈϧn5sӫTqEbGo+PAAE-O߁FT)&=|yoTG/~yZ!@" 7UGz:ފ*of Py2R^7A1Sm3S{hZh vVƀQ(Qwڹ5?޴G.DTKM٪G8g*$y,hq 6BjUHA&nY]d֊: /%X?VQ!Qu ەPC\oN~µi!0H緊?MiM(\?M)dL`5Y`sȞ:(zt=@p|++بj#7磵Bj֭mk ~5FEdbιz"آ+X[L#KSk3E cߝW%pzO9u3W"qqK15~2{)%. fGAiz"^Zpe;6«`χ>]UVgӀsO`iMs8;s2E".2K̘aa2+ԔBGظz1y#rA[."&8j^+~kaqՓ}>Wxwyjykˇ8YYit8|J#07xCbْ ,A|ʻ1$䈺~,NVld*E f8 @okL;xD)EO.us&L \Co @̿ZH)J0Hej1l3! ŇpY@ȹEb<]j%]]qsȞp2sZ*N_F lЧ{Z3pg'~Lg37\T#f"FfuF {Cs6O1!0Oj=0`v!aoJBwҰ@ԞjOyvn;=U5(Xe(n?P/י'Ioꋠx3“E dx(V"^1Xy=; /p&En,-_;B$h"1fE&WF)&Jڡ~hU{Kvx _vSQRK5t1&j#J˝ G yr&h]ʦҹJ#WiFެ`1MJ>+T`YBDa#Qj>1s:L 5mo 塁]U@0<q tw):SSco˭ ;>9Υw28ػr~W0EPoE lڻ G"+ ~u<4bEnTWF_k~a!@9,\py/(rj(|{sAI.$,gLzx؛*;j~µ1Ⱦ[FxAR %ۢ4I} r(M/ͽA@w;;n2SǺ%&np/{ ܭ[L5pcdW 9 &Y:e-ՙ)̈́3e"D:ev=f3tJ 3OEun5Or ]͆\EW_6kΫyyvn8;bBQ8 j ǠBrJA&Tuq` hH1,H\C8:>cۦW%yq1|U. 
Euacj/dw&./>"9CTԋ-wif-ӜЮQwZÓ8`pɪCPb>u4l<934.ڍ _$iO2JLl;zkw!̇M+bǂ2tjޛ<[阰d~l)x駲(s%¬i.uLAɧ}st2oť;* a -9mI!1) pr6VM"˃QddU-V6TF_  _Tm;q#wܼH= L%aoEZȍ drg7 endstream endobj 812 0 obj << /Length 3673 /Filter /FlateDecode >> stream xڕZݏ۸_^F@/h)@rmk6Hr6.w/5$GÙ̏CUU ^~U"(~&HUjDrqϻõ0%B,=_؊TZseɯ) ;۴xq><8Va{QbaaitXLXoѫ<KvKmҡ ӀS ,!*T<=$Kޏ}vժ(Ɣc-Ά*.L~|̤! qJ/F<'*M[|ijBw?Ny/Fe\K}d(4>u4 ڞ#xuٖ2EWF؟ZysYRX䁝c01¸I jvgnjBK ~q}d8s`WhQ= S#5][{jmh=^<#Nc`GĖ" f,`0(V#NE 6S`eqpp&wDkP>Zl XGt)mD#X$ (moQ"0Mky-BePZQ@?6A{]zBqO-tB[!?me?ֲ`QR$9>. N]-m'G3\c=8(Ot9 |E@8ack~u xBlcYiz@1Guѯx$.ƍdxܛW Z@W`b{sלAdH3_Ĭ@ Vir@ɦf?ͣnT7qpi @OO Fފ][vIp0Wf8PP>$ngmevxIior|ԒXV=Oؙeke,e#/P٠+4us7ޮ&3rH̥[)N^^T: qIoڒ 7'A)XK4%`сpR)5Ϯ} ӪCBD!^:Q0S$1_)̆Le2( cB!G z4_ Dt mI:-jNa'v /C =S ?F`8voc#1>  yT * ʇt1Qf$rLGЅt_R)_,Kޛm{Y243Ɏ_^ɖ P1 ) n͹{=1-lW2f)3HɪElLČD謁nW0ϵh<I4N݇\0_PMEU=TR78%V+ )Ec_E 1O 5Nt~t5׳dhahm[NV^$߸i0 r*]FpR}!5''nh83k,дLYWΧ%U=+\$Sg5HR..Ԏ#Pn#Q8@ =vD8DTɠImL A-4 N}Ki|6ą3C߱xܑ3}670}qQ4/b3G\ W||6É~bVf*P7r( u7-VЮ|2MEW~v^F.nw>.6U\9.Bi80$$3HUaiF:`K:5Mr3,&̇nNsu6b x:T'RJujH%(-Q&~pxa| FUG,#q<jc296WIڋqd^Zto՟2,W[>AEr*ͅL4]嘗޸z*_Nqjg}-Akz)o6}S\|>Ob1ei.`[|(?-־EӺ~ _Yn<<拶ԳP|/W9h^}=X@gTt Ғ)1.ܞ\[Ұ* UBF9ygTd\q%9 _3n5!RMơ 9y)dV|(]TG 'XI䴺$wBo/g_!:*+FGS/֠#0!*iJ8uԽy:[gԾ}_C_*K҆5⻾2IUiHg QBu o)* endstream endobj 823 0 obj << /Length 2733 /Filter /FlateDecode >> stream xYmܶ_T2פ-|uc#m+6U6 gAN\ΐPBXVUaL|DV)ڪ+|Ixv"bbPb ,8>سPOzUfe.ȊDXx@֍u4nFN]ۼ*gĄ{\B$1. 
;wطQ"Y@j,fYn3VmhkλѽR`F '46WUЯuUi^HLn2 ,?=fWmnYdADsf 3=.mQG3ЫcmڒֹdV@VU+T:ݴmhhv53 ):tbҚcunEW/D$ @}ʀEW cxGx!U^vjEHI};21l7,Z7 3SPQrEhi H?J}5/c2㕡 &sߧG e*o`Dw+9~4DjPʇz TBƌN?KwE{k~6ZOx&{Q"\c],7ޗ̻h1`lA|. $W^g>չS[qo 4v߄/z6 "J BCLJQԻ!V̝R0 Iv2g8QNm)gF!0#Am#{`.(#DF) Dz@Ä~d ""ќ^q>#_PfNAM8`;$ū48Y$(*~ \/0+J VehU6G }FIH{jiNhmyJDvFRPF^jNONX畤22 Wt,?S?蚫5DxMM2iK}%C7gf,>&U0 +zf|FI4x6^a@vHٵ7}̖p4-D1WL#4SH[x'^rL"X< P%)bAV{ 0:줛ACP3zP68BY I#&5[ :UW'=*ȹـp7ZRz]y*x{]b ai ݓ>,,}/ėܱ9 K%Xb+wpQBbK$Bꑌg3NW> D.z`AB^:~ҝHkv:ˁG1'l3JC]6mJ#i؛,,GO%lV~ \IPHvLIտlz#xPwE<ӛP0{iohR47J9᩿2otDO#=Cǿ㖧ڽ)v,J]8&unsIFI9?Ν@{wJS~_pn0:Oit,NfVחsr򥠻ۜP02b"}P=~Π:eIXJo~Vq{~ND:ҕ1a* endstream endobj 838 0 obj << /Length 3081 /Filter /FlateDecode >> stream xڵ]ܶݿb_DQ_ ą+`A+qoJ[I˦ gbIԐ {V+oWW߿Df C!`RxX}Z+rէm uJ)Y߷nsvzX]}NS\gbFUqAR 3YO{5Ak4;3e;`}%U2jE2ݙ#bϢO֬tK֤RnSf _] !=hJO47%KB5D@G$aD } X 1@j sCs6zaVA 7)HkBii5M f A@!Svn Nׇ% 5c  '3g[R3rݕ'k鶗6c2@/UƎ&NqX_Cc/ A=H3ε3j! :AI;+cv"vpc,~kfDH0aHg1VBp˘5E؍),_SN~ݡE*S e+z'NCg Ve%P"G0 beSaȥR7-{HYn]&&A,2E**bYO Cأ[:I4xͫz AָW[S X9<)!6z]y]?cIINehp:E2yiŔnVW- 41ka(}j]Wjr M3eYpVr]{"ʪ/>8\?dmݴmJϜ6|41c'XT ?̛]nvL/92\+ Lp40}%"}䷱)y9ljκ*\3Y<ԗP J?zYFKJʞAE7/wѳR-+d I:1sZ*[6m\ c~.ZK免e\ߤ ÏXn^Q;6< ؄ż by(QF;̛t1o6k@H&ECltNBb[b2EEӾԀR@ݐUͭ41d4V)s"wshM^|˙C ];1f7k-uwY~Ǒjiac5Йu?'6"o> stream xZKW- !eT|l( 3W"efӍnA3T"5@h4~N+?~ߔK(e=>ݙ.Oq&IE"6Rn~܍U6Vh:_Ζ}Wfjl:w/_Wcx}<ثytnY>Lm\gLo[wtS}Wz ?v# A}*7+>'rӎ~~أyjLꄱi_Q@3;=Mja #i5$AxpDkRӡ8r.{e $9n)J$E0s}x+;;TOb[UBD&RY.E%~Iʚɔ 3At]]t.8`:hMc:P:VԒ=u)z.K]Z)r]|$ PE8sиS` X$] (f(ݑlSeRBK^[a| HLq$Tͮ_6liSN!`NaecXvP!u A;{wX}GwwI"J^*!~~yԛ} Tr_,R/"PHM |/x )eZ馵vvfx9{7u#UHWOM_bȖ !@NwnD.H*_V,Vq;XOajM4ھWD FK-p%f{M+L%:]nXnA:ȉOh866 awc rg5`QPAI\]nS5}ll8 51RlPcraGháD !3F;FHrib^[1Za ifC;$fv4ޞm`y^]뎰`x H*+rz Ǹ_vizvgE|181#!5)hػ"r4&L@0ILYm0=P揥ey*I%e1X԰(ui!#IZOE>G *qK̐ ZO=ziislS1Ac*82+]:p)btU}gy@ۍDLbVE8143dhb89  gOn >K8M;]5P.s<(F9NR!E+ 5ʶBXXE6Ykku dE Vd➇I/ZyR=HDΔ 33#ZZTf0^ U&淴 `O ~pPBɬ-6@T&bHF ɛ891ޕNE,qQe0!E)ά(>EU o&et" #fd2.'r3B6QW"KEn\Q{.]ǩ2-aK(6$JkFk(#2i>hg'1Lg^.Ӑ;5P\VlG`B)7 /`$|Ū 8x22R[H/ucni oGgГI6e Gf&$zvեmef 
ppJʩ.[PSM9O93r$xZo߿SM?8Z/|6 p~_repG)^kY.vnq8CS6#;9 0vU9z̗ၺ~@y,cj)7# UТg!kA ^V/s31S ŷ3RX@4.Obhj.9be$]8b:;~ Xw"݉GL|_RhX| R%A/6Ij*{+تz70Ϭ龂ɽ\ d #Ic}4/nl>e|4/&'4h:W=\wpN!'_]e4+TKBڨ%"~4*Vѐ_Uq(}$WS%/,6ÜjV%U %D'mL5',.k W-g4:<|qs[Oˌ]!eJX?j0RxaI4_+-=Ul,[0RB>J0E eg?ӆnJ50C9#}fiԭM]nj雷i!bFږY?I3aM|ldz wgؘ`O h߱k$)_, V GwRPw|24t70/9F$#r^)Z<* cw3uճ|:@@N:P+ollKOA6ά·w*IE(peV*%eK}꿿 endstream endobj 857 0 obj << /Length 3235 /Filter /FlateDecode >> stream xZKﯘ+d .M*( CcקI-CKr4I }%F%̣LJD?2~px&B=n#)>U1mU(e\cOse3Qc$tҿ}] 4c$70$AҋH 0׊nvcuC(0 @'î$sl;{t/`9vffA#~{dqBBH`>l n)-RΉU%_XVZtlsp9N5@3Am@cÉ)^G5<85Fhm¼S/jj:2L`."6_N&ZN\, ͌sh~T9ew2AN@8`#ŗ7՞Ů͞;ΔxtEZ1@Szg]eWD~"ݴ$gliy} tCB3EXUO؊$`D޾]{-t2<-NDRxӖL ܷM? #rAܭqĠȿ}2mv@&Zjs _F+Qp2uB;XiR-',R,ERXXGp Kn 70{ N6w#N@'[ADY+~DFנ-A'@܊ /ihȃPv B-%P@(YǠ_0F)|jH""ѩOM6%&G3evmҐU(ܳ-0mLY$'-_D\bWAsLwHnFϏɦc벶 ,&Z$u]tqw:/DBWʧ+ޞX7V6D]*NLg蚆.U*zRw:]n 5ڞʍvG-7Oڞ?ڴN ]sUƌ WQMKMI~g@Ma eR! ز Wi+ 78@CD}X%"WW*YP*wC`iMiJ}9{R M)t+d>]xgꆐ7 oE|m I\][Skr= ސR.8#u窠q\UC\.KQ"UPukyH+@[H--;`MA]%mwKDtKul<)*TCYh Tvb^UP)Cya-Fwvx >XERb*O/Hy%x{;=$BS&n)R9}g'q o"1©ϰzXHΚ<"78+Qzș귑L<0)Jw<=U" {4R,l(9K =G/6%A]/%;rusKemW{.^8d8:INcf"i1c)$MDG۱)X1?-B ]W]:ZPtxn?tVCa(xkn'ݹOA =dNvcs<8/b~_߀Ő^ y7N8@xÁ"ͅ1\a3>ь$6֙ X[bcЙô'b͒5)Su… vDtd0f[ yE+ʁ1EHI~'orU#spn^KsaP8nBfAr1~MPj UDw#w҈kr0>W,6xƇ39Ġ$C-S,n߫aء,«+P~y9;Ugd?^t}Q6U٬q||ثCk40E4W.OS>ͩ.W3oipL7Դ 4htZB_T2m] m#mWwP8{S*_Q$DUòYpe.qzic_4EUBQՈR½[꺼Q}*|z wb"ڽ%p sV :q5~+նp"'q#`8c$u$&ja)^f6t_2Qu-_iZ삫zU1l3eڳY5.y]'c,S^kz.a 2GaOfTv=OB:57JB]q>N6uq>ߨy؂>ݷwc 㨾o_ri7Z}_<ܓӽ IFϽ1#Lo7a?iχ{Jb`֠relA:F]I%e+}տѹ endstream endobj 874 0 obj << /Length 3254 /Filter /FlateDecode >> stream xi #` ENѠi1E?tB豰HN_H]Cg鶟LR<}V|/S*ei$n&,H*T﹈oqؗx%8Ky*F0#w5>!X"l'6NPJf#CAmS;ufv4 mu׺s}9+k Fp&}C'EW@8AJpYMG~?օpw@3xC^|%V-R \*9` 8q _ё|ndv 'uc l_PǑ:i>UM-Yޗuu |:xUU0Y]ٱ;UIq:yt'͆M  nxgNq<rsݽoJ@&Dj8N-=͵iȽuJo\̧Wd͊0<ȗn0\wY[ܶY{GM嵘RT #8ϣbɨBTm`odB \.oM &,b'xc'$v JvQjYY'Zm)35F_37ZL8v#DLGj/SmNH7 OVY}W**D+S,k >.*Jp_ew3:]˺;&'7LX/u8w [zj=އ sRM24w:Cx Us8փ"-&LET@lȖ7ݹmS?r;Ǧi(1sXp$gea \xAZqIckOt87'z qlqB X0a&ptK !*TLH%T: |'-O 
/ң}KfӮr7r-c8@25--BWI2'1.lw羉R;*r#ڥ <'a:C9dx !\$\"=a[0f,tS\H7Q6T1zag:El;dzs y$ljpX'bA[{Wq{'/GE;Q$ MQ.D",BjsqU{1Z y_oî8Ebݢ"7bzT5`DUhG`]ZIi{EsӍq\ϏD!s2/aB{gk·ZK_+jXJ1UEN':5ٽ= BVAѠS4gHHNm,$UQGr4~~b\rA2xӒbY!=i4?H4S-~,W9e&,j7yX wuS׉=*I-u`q ㏲@LDbM5|NZ;y1w'ijdބuu/^*rfr{A='܅Kddcu D2ߓ%,*gdp7=-C]KdF ˱ #NlYi%ކ9k`Jvl AKHeR Lzr%%cSAgȞkD>GDc`MAlWQH~g_6/%F|҇W'l+ J`5[$'1' 9`r/zucEhдT|< C7e%H&o+^3f.xƯ#æ^ i.-Gb4jOz/exa?m{82kE`ǧ9 $d5n+TPhFFG!TEaK$ jg_HaTb |iQe"JbWr> pRW;[ɬ{E,E԰1:0dy;;Rӏ1U+"@p't~F7Jޗa,x&àP_lh!L?+̅ZW}Z BukrH-vY?"^E|4pg%צV1KRṜsY Fgt T/NIXa{߲m~ڃmQf`ԭi崠L|ꭚa-"15>9W5ѡ%!Eɹ]4lv!`-Dh?YY_x Smj-ܫ@vF@Fd3Kg1b0!swmGfoҺ`?lȀqVY J* :dL4$Yj>xD DŽC^lpY)2!yr}eE0dɛ୵9O@8}/J|nI(C23Yu衧3"C r.hnA7f+-4TB*jl6?[OnU7ı8p-@k^=p coON$!L=hS/-Ӕ]3@cXbcu]c#%bE$z0hz2/4>Vdʹ#w/EUX?E?ߵDǾ&vk}0VqpC:ؖt" `ߪ]s%Q ;,k";gc.gUP?5?l 8 8)~OXp>WB67EwQ9P>'5-ęh(\r/28~Tti`~2 ,gF蒙V ]U x>}(Ϲ˷nY;xbڀETw)¥8ǵ<5X2̩- t)k*wk|VbtYVDpO2zeq5>.0 pؙ5 Ϭ$h]K8.KT <>oq\کo,1G뾢j;H&TB2D'K|K\h=lUa#/H(}dX>tS#aZ[%YŞzH~$m&*?{ٷXړ̦ddԸkL'_ImB!D5)@U{b+nYIRhF\yJ!̐0OQ,*\@4$(Z#JRi 9yθ8_9qRG_duqUA\iB^U?k%w0g-Ҳ/Lg,Cc~"?p4,+OKS3}dbh dp/=0r!/ˣYW0WS}wkZ.@ YLlWÍ7;N<dҘÓ'AݦTo !-~ V~Sk@/+m\CPiAlGNŸ1jm + (m=mQ\/26EsKv`ǧ3~fcO&?{|XڂB{>EI^V܇][Aa$k/͢\bau_CEnF!#9vS/v*]ʞ'T, 'nvۿKRG[@,7֩9`T̽_^X9/5G,4>5,"}B /pP 0MGw 7zD Ζ8[O0-X^{5F=$6Vs?m`7Hm4?""@k=sL>{Y"xVґr5WɅܡ`ӵo$&B}\+g(seAARaf$x1:GW9Lp_E2px *.9Iˋ%O1)1KQ/y7澦x' |nP<#ѴGi[vgR.7iX-7l@= -A0K_X6 x(pѹԈTęO)yd{Xd1ԇ% mM[ue=J&[%)k v!AD,I}r=‰F&3L.3H͙O8>Qtd6V-TbfS6FVu6z\irJE3>u4ΰaVfܭ!a S8] C^DᐱZٝ>yNZ󾻳>A֏z(NA1Ljiyuc7si2 *c J8 <wZ70Nަ51 glK3{ƒ१0/GցluAůH섊{b~_ye~b ( RXs6CjNȪA%~&5ɼ)J #Gw@ ۅ)n_%A A]Kc.1]vʧ9vzg].\ԫ/; Vs<9Ոsث5=+t1YQW*ݖ76ut؟Wkc} f:h-nиDj:=eV+ IP+Gf?) vQ Ǫ>~\ղi|"0hi& wV'oY`43&R^( k  SY2լ™(b"pa]{ -]J;O?Tvu~ )⊳^nfe5hob 89`8MD߬3bӑı8dS 3mpHr w!TUD%%"ٟ6O} `Ae-t3 endstream endobj 990 0 obj << /Length1 725 /Length2 22309 /Length3 0 /Length 22859 /Filter /FlateDecode >> stream xlspo-v~Ɏm۶mٱm۶mNvl;9zj3{9=WUwZDbv.*t \e1F3 ))@l`d0100<,-\Ɣj6&N5{{7Kc ++?Iʦ S)@X^ASRN@!. 
73u2(Xd,MM)fN v&LovnN.3sȊ˩DUv&\65vqWu4Z6k`bi025nvf&?hH 015'F@!lob71uK{w' 5 .!hg| ,,=LM,]?F?hϪx)ы*H S 373( -A: Ʋ.Nmÿt_-`_t]L\-?djaj jolZ+Z4_ N=?kJ6{l9lat󠂓?`Ļw+Blg w25MVܷvC!\L X6%cs#Pj%+VʱY1ڠsOғ8.9J_8y# =[Rl}Š0k+amMn 8c:ON B_*X3Xe+(U Wp̓qbbۿ޺[e0mZkA^`g)x{@-~%Ϧ{6zK t:O*(i= 2Ch5-I/42l^Xа^~<.&M**̚^Ʂ[PoӍhl־31c\RDd"RS%Nޅҟ #Y?4fVADl- =(H&wYgWJ%VO1;Ƞ;.g\)Ò2DI˔ i1훱j8p켦c }MLEWE39k7Ze+IfInms&fN6ppSt}z)Ʉ*( i'IV_@81}Gf$֦ŗ,q;i\B3(mykߦ|{ٛ936}xQe>:dUFk@ؖ4~0 {Ԑ"jTGuy ˮ渨!F%/v:=Mxi6mKTŸ/ Op-;RyFߵO2289I@e q}e`>@WdE6uYhZ $%>omi4]FcYX,o_@cfÙSAH-CzLy`uřmb~bmxZ-# S%ʚJփnں5]=WP2SYÙD"8ZdTmr' \oWN||{gxq) Ov[ ,>Z#eu)kGy[I $+ bؐ>.q{=,r_hOtoH3Nڏ@ fH4T_s 6W ѐ]Qބ0N*y6?bL`+B_jR9HR(PP{{F/sZː篳WrnuP_w93i0͏駅4 "$b`oܩO@O0Ű()E^A\Xcuqg.g f,'3#Eڗ~bL n͕m9O)̜&(EvW/7{ݨŌCc`' _}?G7eZ&P! ~P ^54+ߴ3C_E,/!poKFNRXR {G@ad>&+1dm )h[\JH_ Ylqh%Dfj}V"Hiɇ!l^BhI~=΅I6wBQ :w-Ix'߾nVdOs'P'ji;JңF0> Ńh08$ C4BңI0S[uT pݭ"HMjWY&c &I?;bZ-r0ag 9Q.V5>gkZx@ꇵ6a WavO]"aRP~c|4xŒ⒨ٰAefLQ/? Va/ Ntld&^ kEoCX]ym*Z]FmDg}wo65(O~?$smooJCYGdp SWǔ4,ʱn'D#>(2d-"" r—-1fr927 Z9`@G_${uS 3깇rh*` :L(g=ZD&pbq])!L>TKFdHW⤏ڐ6|㴭^$X  RR[ԸۭrmvtiD\Cezk~O5lywcWn?hn$EK:k ӄEW kw<|Ǔ )Ðg[!ކ'8O#oV(Ǥ+ܧޗ# (({JQɕ fm|Xo|փl||@CS=Ҟ%)"PN|~TnzVWT+ @D/ v~V6v1b/ 'n @CەCV ʰؑb5LRU>aI(#chmZ-˵$ar0!KAʌ {HEň idU}Tiõ"!Gg< f\,pEɎAJq 䰼4ai0xY4v`Jx5D ~:G# &ZT;a-KMIcö$S7g0^ՅTޖx/qCV9Wv2%EZ/1P "& $%R}sl[jǁ. Пj !qg-ffg7bą6袂y+%]h J?N0q!(FLŮ>7DQ&`fZXP+4?qcƧ+`9.%Zn,:L{w)jy i-kd})#M$[S䡸F1S$*Nʭb$(Ł@'a8$1ĘT,WFwM:R\.ذh**0w`Px!1r@u 0c|/`ӄHI:kw1*1pbc7gT"8Dn@~!+%G'ġIEY{~ VÉ.I*6Ҙ9Zzɰ:s\쿷}7`辌ۛ#(Z?KQvRU]…/T)jC7s˨W*#aPb`7.$nz3GkBG38lfX)b%ƥ3ٲ*UI{8zZUܵR~ZU,pa@ |<9(}ZJ.i6>S_H}DZV9;cj-) >HZͰXr QMo> v]11BKȔ]RJG@qAhk٤L2uG*EA. Wcx C@Fd_KF}Nl8~j#/\78ӝ "#ZlV{ 0X҇ G8vi/GnM=ěL?IcuʸǺk4}DPAI|> Cb7!rVis=_[1_3a%j8=?t*y`slrT4n&Ww֤k,m}IA56 5P=zH|3>תhi=/szAaDT$M+8uEl3'{qNu9 J {&xepRk S1pAY+g\!Xd) asmLU{ X"e4" Z(oP\)ж#U1oǹZ (\+Zs~Zw] PʇjII`q0yyrؗ띄>lNJToÔ4ÀB憌LS)偤} xLO ?Z*gS]r_aOJ:•")!7۫,aU9Lး&nAXCeu  . 
?/Dj9mH1a1YlbZ}@i>*;*ũZB_y>sy2K ~NgiJVy^+ݲl4RaftC ?\|J8^jN_'}[{>7L*mz+a4!r<0Ѓ]+JtFlTZ/bjL_oA3Tbweгbe?][ HX#Yل<+ު@ EFN\KM7ТfP+Tj\2=LP}m^}^"9sU&~kK+=O覝>|H2TB{)c_ Gf€sJ!TȀOHz)\ɝylM 8ƉJPRnܧS2vǙPA򖗷ti^E^nC<5Juۆ@+6#D=)--"_w8mGW$ `j]e3j.]!f竞^5b}pW*.>~,C4@:Ղ;+#6ߥ:8kj\U/Ը#F*X -UU1[u,5K.*Ua 2. na:N[2sMCY%|pח;ߞ+VZ"f|5_Y%hq {}FlIEz!a&r cSƴg짉eݎu&qYۭ bt !Z#Kۖrߗ/nz߳ T=k]mԩش,AJoEAޔl{ves)6A܇J1^<4BB$6*'o3M1t?8)گJmkgFs !IɈ1M0ͯڻ위/Jb@A"y27Õ %-iD'Ř?ЋgLTMc%vLu|~A2}%C_ s$~ٷox!%^lq􂪩m[բY6o ?{9duL[vD$a[Y -3ߵ\>>d;C/sGFA&D,e9w+IܘTSUƘZ,yn5 Ѥ10^H>斉WzR*G/Gځ%4V\T| 7;%[c '#i+VY1Q H 2C `(}ouK?꧚QF4~;2`O?^^Las,N*`27 wE@-}J5݅: 4.넭.W3Y!ƉNf^:Ni{8(֨ xa?8`&wGǮ:CE@q~~,oJ2D>F94P#]"Sx十n.*(o3-,7NV;Vzl8sۑ "\LXxY3 =IBm+hN;7q8~m:h`D avnY/աUBP0e\;n KrC@tA~n{%Fp/6& >4oq27/ι5VpHi))tU 1 %kf-uohmL}QYUZ],cYNT69v$V ܂@n׍rgPݮŮw*;`tKu1XȺD{\`(]N0 0EoocL]pd]QDAw=Z1YS <YsK%{{Q}JXAX1ru9q?`T\qb2n њ@$̢-J"}{Qab^*50D[nuAj7BycF4ƁD/*.{tN@`3spFn(5+麒}}$}$1@Fe ;w#KK߃j:S~Qt{֚ vB|T(1ұ^ yF?O[~LbJ:3 TGE]L shh3** GWTCl;mhrX©XvaS#N: ӁE iOX An_žy!!4eP). &ZwtA NuR?rPZ'9ÕSSkd%%`Td}}g{2-_ޮOVx`8S0&Ŧ-m􄊨wX]g{> (}Z+I};O虡~-a~m G fQeS˓e/;)R5ȨM%c&nj+yȳɈ;ΛJJfӢ QlG XEwV$qGsp:)8 5BqKF2&y?/+/#eN $:;u*a< mYѹ+P ,#TıLˊr4^d%abXjKѕlԿv(;6Y8u6|;e12+ D: { jd+6i/}ճ޹е+RAb4F:gN@O4]΅p-T Dq cc~J'/oHRLvKV`*s]6D(ڕzu_=D(D_zDZT/\ڐJ֨IN4ӊJ=,i,?923Szhv8_{U+\Eo?i1eѬ$pVy54ڿ%4DQhn,r)[NmC c:eDB WgUBmř9uHo;BƮ6T9lAN!#( '6u^j"4J<h])P*?uքvc^i*ʏu 0HUð&WRV[,D1;o)k$Ȓ [3lc"xbƷjQ P ~h}5uǚvC.|gr\%F1gmyre{| [ٮQcN`D=E*+<}[MU]"~YWI=ǎeMiワIl:s0!L11O$9yzGTSQy^ORbzrLē/,H8.# #HP1Ojl< @B2]1׃1W;OlC텦 E<7ڏ6[uxDz jqs2R@lƪ`/ה7dVGm ߴMد)ZtSXfdoG/) JכaPK#*^r"%er/JFbk#5}R 7yC\x^N¢ԃ&`릍4wH;]`-@z~~ZOWO3_4na@qa96aCPԪ=6S''V{[[NOrNv+|NQA'r`GC 9BG8̃gQ FTc=8>ی6`?8Of$F3~oIu|)s键"Dzr&jIp O@)WtH;5-r[^s+0 ς2V$f݃QYAA[YAYx}0vLmdS#\2JQŗ g/Ȇ'J h+tտr1r%{wI?6)/`v5 OK)~WFyؼ(:|R+.M; ,/dۀȌ+xrbtr1WNɈPV>SSc dܗ|3 Y݉p> ^+myq* m8׺ ؇al"#y|]j*1g̨FgHo,e;MX^He~C*ȔH T>vؖ"WOA0͸|%;qܽ*6j[" ci&5#2Nb|K[&Lhݚd q՜AշP=0=fO7 {A-]kGy9:`ӸEھz8%4y =*O Gz 7X{e\Km* A"y:rhzkATo*f͑XgmO> hy@ǤvU)虋0 ]fkIkzz IgE25]߷97/gއ\{m q/ł5%1**2(qeȚ 
o+Jljy{|nNJgUٙjB+&e.+h־27P=Dͺ +N2 ʩ+qf˖91ozMCQ{\W#*-]gUqn|7WQ;!xZv%(HU[{ܴߪl-h6nbUe!j`8]⎼^%+x[;J )'Tɑ.f<+صyDt }#76·pwTd-&CZjPlYd&-tMyB60:3̌"6^Tٜ ΥE|Au DƾBoS{eERώuhFJ@w;Mr-N PSQj.!gq_KqIT" 8@-הN{r,fre`$hh}[*GmyP[%hM[ Ͼ )d炟$ dt?!8eqn0q4 hAU Z =QdXALVm`(ĕ3-`IʮƎe'faOSI=SqA8 }5lTmSm% n>?NΧm)Nȝ.RZ'Jm7(6*pՆ0r2zLVwWʨ'0Hᇪ͢(1 雒=3z"bv`4˿FlсJA;Pvq(@%:T'׾Uzk).5[QEZ2߫wҹ,Eٶ.ǼoL qUt=L*\ !YJ̗Ϭa<Ԝ4 9&RA^?xYUw}@.~9<"uhD*q>nDAo8 &8*U(q`)dƞR$J/}#ݐ<.v&Y˨Y>(NgFsDKwSB$Xަ{0?^Yf_|"j>H-mMPm~wSY t(jd@M ԽPk 850\pWmEWo{78H\FJ01lW@V8kK5`:N -7 3C˅YBF T *m tYɼ`ũeƘ\4` k\¯w'85_VݕnXS3C [#pNF&xf2CB~ ٪uvu(KQ4-C{Ots¥-g4Wh">6|JQ/UCPkW?`D+O|~;<8]6,A|,4LU4PMz`7Ya*-]KH^Sf9ȣnx%pÇVZƾw F<W0&v='83r8irlopU`45kO-u187byd`*iK#. A.r] .^ .Z \֐Q!XgUvJ}I^r_rjZ qN =@)o&L #,,A޹ {=S[ JUu|[#W3bh'z Q=4ЩYRő.qCN_vʛ[.w*0 UZ\j*>tWaU=f+xb:Z[(lQ-# pbbJcy1- YQ Y.Fq3 8"p*^J@EzzX̃ Le ޷+I>N)dC`ѻJ R!?8 'Ihk[`=(^lIm%DG3A.ŒkGbmQ8~).B=kpxظ{J,gg#аuV[e>rxx)#=b~ADo }ȔaN e0)9Mr}9&[ ɦB v&űZlQgyq&^"a]qpQm9kH8D ;q5K-#_&, z2q(K-R+y0z\%{3ZdIN7[JЫ]~hpQ8XDZ %e _>X3?Z:Lc4Jf)ˡfQ,P P# A~Xҡ;IRqHV|"ռ6z5It=NDvZVF ~׻t<8|0 Jp:OCj*yٹ~$Zf,I3ƻi#=C4! 'TEϷ Jt/ug7vzA%#ȩA;uT\s=q;Qy`FQý4^ f"K._J@(*l3P"}FO@'jo}Ma;r1klFDFV0t!i>,5S㠌^I*-wlއ`o?+xNiFŤYA% ]f h3Pr ŝ&fOs |.&kx=?}t }pʹAhg̩v7ChsS>o(e|r5|7* D iv|90VAuL=:Lqt'Ʈ|yo x_cDD"=>_8t E}_%׊%؞P. p4UG#8GO7xrA6)hgX0\Dޒ؍wE ו6/BGr$9w r:S:qdIX.@Peq%+~Vc`x"U6$7.V9.Rre?v3( r^|isɪ<,úJɃ[4*AO:\+*6@ @b(cO%٨&i˜~ g.\*}̞ߖDz֚"z)r)?HDf I. 䀛NoSG8|ݠycix~ Kcn WnY~+ Sڒ~+hQ1Eaۦ{;S@'j\;e$1?#cF;^V_}>_u<7ȐSF㍁Ish~nb ^Цbq\.]yVμ5wPK'Mvz7HL씼o)>(@1~8,XϘ;iP4X1=$J|-S8%NiYA?ΎxD=YZiL}rxUY/'!T=2n< EeE6Ձal7m.xv"oLR[qU͝2uI56)f7xZ3gV R]&W{:c%kY әF7ћEahR@Άr{ʔM]g!0=/٫MZOw{}mldH,ο;gH y6n'ki6R,Y^p|Ȭ>oi`JJ_vk^_FL+ȧn{GB {BԺCfStxn7,5Xo&<%q$Lw4+Nу^fiO[ܞXFG_&3W@:}"p}D qJ09{lo.P-1e+ jqSD7,WP8<;=($>aO.\"rZ 7[0Vڃ XLjQSlh[+P18Z}yQ3>M #Rk`Hh !3!B2{{!a⤴_s܄+V͢2nx[ؐҀAp}3_<t=/FE @VFdS$_IJ8 ζ T\Ӡž8}Z=Q MAҍw BD#xHRx80V~]c3y |b/ӱbN]*2jZ.J@'ZO9s#U ֢ɴDavg(M8rpW4:hG _u*Ib 4gCH]O0Be*+! 
>Dע6&K}@`|(,ڳ6||@P~edH-N#^]^{Rot+d5P@"j]Q>²?P?)q$;-OQ~MGEf(VUn=K}K8i10˔ۼ-9)gbGsz5?}FKw\}qӞa]kG.oLjGcH7En1GΘQy}m>T6 *ᢖ@1J؛.e x*V$_b0|U]͎zNOn,hD5G}+$OR=!{^Bí=Uky2/]p~ta1l |i5r3A:1Rح\Z#J|95@ ѓ/nmB&K44-9fؒF i퐠ִ&1=GqU;$W٫MGoҷarnš`j>gu 67(Пs`.{N@%'.3ݽy4n"ES*5.}ϡSst"TE X*qҊh S\ѣTD8]6̖~ZƦ ~Uh>ecGk<=U݌9@'V+?EvFM/-,E`8 Xw6Y׊[o:H@p'^^OLD_b~ʫB\x"l+Wp"O#Orɋ紟1A"St,~"]2ltmRdJ.O=mr Mszmvcu'ueۤ5M* %l 썪25U 9Fӭ=I<+fC `Ktea7DjA *_7]|Ñ+L_⑖F9ݿFS`U?(Z^:[5frU7? J,J9Q&3r6Ƈ;6ȫ ? (}q7=҆Cjsa]uoxUr#px ?W?H9e %,D@0=pyxu; SٴPԹF3/a/$ 1 f}.6D"_ĽǕ 5!5?t}IמqDyt.B̌cز cܵ!o5g5Uu~rf>nQYq(wc_᭗%d ℑ2[fs f%P1gcxS!pmM@y KqI<};Ch[h#x̭cj 5(Pok9B>xKvиz͎;r C_BotBU4ur Оo8I ΓDo'܆DA! H@TLtQ9WIr.|lʲtL[n;KvlOQk#M@&JC$.}Lx՝y#s/z SF$2@Б}$aOi@[#Gɪr 1駟s|7X/5|9AuY*Y"H1ߛll@z=s+kK1lNj ju3|&s?j,J҉;Ɯ b*WT?HtΣ}G^ Ly?t8mdKeJ 8=;T݂]U(QMذS|5_5V= |s>(r!"%+j6\qemD# o=¾(л!wLFg<5DO` ܨ"iu{Ԅ .D&t^ܿW؇d۶A=rՖe^~zGB$h>QJY_Ӥ2dPR|xE}4[MYgӋVsP6O\©tDy|>~IR9ª-J vy*6^E{^>"|8x# ib#Mm'Q}+Pу^TJ8g}, ٴUٳT"QǮ1L`8 =9@uYn":@pEmx\KyK8} hƮc;D䘛kÜDM6᯵y:DyCc y@1#Խs-S}}:5ҧil} M6qٸl~{R]F.c}uݍ'Neʣ%DJu? Dk.oҶUeNyh|Q)Z]Ⱦ|u}JG.E˶9Nes2Ao⏈u2Z_0tw:fևDQ~^ ΰ;n-Aゼ;GXU ƻyPik|;jKhq(Y^I^KJ7Ē.mѡL '4! 3B.v/f̅Y{O,d,-bB e>pOnŗ*M0W̐@OP# .uc+,%z.uỵ:2 {$f5*"J'4Fım GJK~gڂ]ϿqC+A\{w ׃gd(iGpU^Ŕy\qT.AmYnrf08i{85*?կaw1f+ԗ7u6Kjhe\pJzu91ۗ dE9dt: Z`9?4d-2 infY eg;D1J]^.]m^GEgd$0qc"P!8RxL28fWgnnLNG QnW%EvcheG]>:R{C*aMKtFU3Íi-X7;Gcw0)"dg,P*5KxyS(WM%  !5J}BQ 'iw[+3Aћ kTNXJdU~EPp\#()M[|>BOXXAjg\q%g pL#Q1V"IxbԈk*)n-_V>ct΁e9-[> stream xlcn%ZeOٶm۶m6vٵe۶m>_X2Gf9fFs-RB1{;OSFZF:. #'+TNŔ njP6u02IN. 
c5CK[K'Mٕɕ"eSS) ,))'Sڙ:\l,2ƦvΦ3{'89?̜m*br*Qaza @Fr;gM]\ ,,:hFF _Iڙ:7!(ѐ`bjO)B kobdN M5& I3G>@b& .jX?{#kjbjr4gWMl<׊Eըg373(+Zg`߾@_#翖eb`21Y ]cW''S;O/=Lk!V-a~EԳˌ+S7H&W*x3F;D1د%ĕ~5[oz`jXr`7w.kA T,ٹ5ReH1V)uɑaMg-B _@waoH- +õ ꯬5;%0Dȶ*7ﯘ~el .<*œYQze<ܝq{VW!w p&ݖ%b9]^rq[UG«*lɜ7-/¶bFJK$q!G]qVy81疩QEV֗_9-Bp׸ϣ^\ Н21q w,geRwyBĚ*'dXXkLݪ6ܫ~;j1UY{-B iWv G@?~Y7e+;L?It&CУ];;.U"tuD](]i;D827EBƍ`Fߑbu4%VkWljlCڬQ"TV`#|$pڃF !0L_1mٹ=9a;Irf~tB9eP18#;2640 fT@xO\Z+[gs%hܫCަA7X5ɱ;eR<&LjbֹnkMZ(am5M>X>@:Ql In'qmdVO݊mg$ڵ$OY8{D&c x;v.|GyZ+2'LxN;S&ͼzFuzpC\%\3!/TRi)P+ g@]qaڌ˪ N T-?ڒ07;)N>!?IQ$) r7wv8DtqC)Mh(9,Ũ IC;E34I,Lpfڳ4cG7pTf9Bо'OҸ[_aB&/re%f :.#=";)¬D/:cGمcJ`_/ݼRuEݐX*s #sX@ N7.<kCn''9ot2Ԟ#)GL\^KlwP쐶)|9$AϿq VhuY~Ae8vJV^QfA&ʇ . A]A5`I>[ *)BH;޷0.E`~Bp@QAYRᬙ$]SK|ކe-v(bIvqړ ;m02\;lSPcF7Wr?(@&K,yeII ʨ6BZ,qÙ|6H=灿N VL,b KvB:+(3ďɆʽ^10/wT4x{/ s>AgZhW#>yu#Y%P,x#Hm̗Up8{PAߜ7e&:k778 ys+^( v`P(mOx:Mҵ?;$M -ЉBN2yau}Al\i[D{ B>mU덌r79Q:i hH: l)FKeB^&)@2h.[pϴ«.mF?"p_N)i&Ē;][Yd].xy'Z z쏗í`H J-AsʢsWB]anjXd[v[p/?ᝦyq2Db04ŮE:U9hkM/ i5hs{Ĭx?B_ VAv LjmG:䉆#q@S<vՇCF <ġ1͡j2aN DgUg^߱']7\^`WppU!٠ 'c^t٩֪( Aj<"!IGSRp /C_캥r~E/% =9GSZarkg*ʀYi)i<] TD9_Wkư+/}@GE~SET: dž;퍉Hvn R԰.\ ro\f1Ү(ߡSz@l3\֋PׯPIc /[~<xm|)V=HloyYZU^q8UA{*1:_LRZn%;<"|z2j'gxE:A;G2hmcixI5ZUǜtc7{7F=z5j3OV˂f+橾X9U{$3_8'7-Ho&"zo<88Ca uӆ~;xhvFY0/TAkゅ氨 TjUCꗩ %ּf+{|;yTї-*o6LO~ˠ ʖb@/_ziЃW%4kw4s% ˻"R#moDT^1Q eFlV\U.i82dvrUyH끐YYxlLܸ(*\Z'!vujnn>e }:SWEݑ{cuŀ|jFXxZb| bX'~Gx‚爵yhX6g[&6߾HІ)A&u]oʝ`.1ph ErZ(~=3k5ia Y̻c(L)Y?b s̯5x6p BZ?AJA7>` pUd\'2ofrAԤ*=_CK jG0z)Vx99S6 ZׅQ/4o«'kNj- >?V}Hwb4qVAWX6ʈhk*[Ps׊_{ W5Q1DaeK_WQꈿ(p>6ЭGD!"%7ug DZ%ڽxώqhkVrt ӊ;a(\5 8X"}(|+1M(fc4֓t{MR*owe*l1Z`eYG_*Jv~E?k Ma]?GE]2i*lMCd3 ?}Y`:CG֡Ҩ;Z\%EhDu>zҔwxqPC[d6x*cҲ{ M;@IqÃO)aHt$at)E)RŠ!1rQ`S%+nW9E7|-v3(aZJm ~K\wlAxFPzKȆQsrp"&/Sy!J{C53Tqqftu'^9}(/,@0ERp.$grmԷNA̺1JV;$ ;#iIM FQU >‚-e&%_6JV EH_'GFsuy\XJNK0`b 3gH0h+^k CT(y2BI7VmSκx;2c'oCeG~K"Ϛ?t˃r[82E6HL_Υ={zߣ9V*oQ4F *zfw3I }aZ%QReIYEjɗoЛK RUk=*Y  }Ű9AvǠTlO dT`RbF;5ZNQ@K"}\j/T % )f=ێ5P&pP2\@r$-\\`#*hk;91@ysjza@p6ܨ4*av3Z@MBJMa+m%|}[瓷jYUƽ}pBawPc7}pe.u*'Q_ 
0Wt&HЗp+kS³эOjG[X_ n&@ǷRfA#}V%tVNၕw$)㌪9tī_rv\h]@諆,1 &{OeJEWYsh6F*Y'Vܵ\.;}߇a+Z==oŹ{B oYtU]v4+L/-{:3SQ\H9ST+`aYb`"Z<=UsEi/)@_f%3[h/lH| 涺L5A|yQ>18O=sDSҌN6b-Ask6ԓuyiA;4Ɲ=maeʃ5F*7>S& TÞ!*=CaWf)=MhX^e:emGW:jEj_Őr5Vzu+|_?XD.yi(s{^2G֦ #`Zs;߲cР# b+ R&&Շ 6QN9K6 WF5/!%xCXF$!0cq>qٗ=^>'!2 L{?兜'xtNj4bTUktKgoF T@i+ڐ閶Xj/ꚜOQ~x,Ãh&)J!B_P[KrM=8XWA]N7c^ȋNZadB_M2aFK'_ \MM,ƮzDZ`k,7pvԙ]9@GѱĞA9#Rqv`z8RUh@cF$WNߵ!523-cO fsKKU(ej@:C]+ip'v6GS{؃rO|~qHQ]?[̂E5 9ct{Nȴ1=Aqx" ͨPnOZҖSό򱡷tl݉!O~|_0E~ =GZE1-w`Jҧ\ ejn[h XӋ,0ftw{9K&֘֬`We^} +~iKoIvJiqW(9 '`?  _h:XAQOv=Ð@gV)՘AmU[ Y2P)_r 5 Sv1M;Mj>J[;aNsp9I㕨hwٷg^ΑJ#pEYvSl5j?TeÎ6}@qg'xLƔ/cx>8'`zط9|@oOzXeXh1$ٯ6\9A9\$0OU$I>ݤU,74=kci!ξX ΠXqEfDLGTᇁMg`da]hd $ y|Q%[Pt$U-Κu8b|%a2^ChT @P~^餧.ZL26n7_0E>-ԩI`Q {o;T%d*SfQw퓫ފn ލK=u?dNoF$#l1쓴ȩno0Ec)_;D-XϬϪ{`v"c?K xZܨh=D8*eNg`D ڢ oAX#?I?nەM,9z%&POSFM^_љSL ] Ôڛ N9OA(*:;6,g䡾>1?J5;ԂeʚyeSt'嘤/R[K0\GMCx7dg٫"9Kr/ 1R: a^_ (qnli8dɋU"n:܇6Ij󣚍?z{*8{K iA[$cG(]Z,b|1XS 2]Փ.tݟRB/힨({طHꂏd,i8C`֚VCd|UQs l:-Q4$ny& 4zgC܀T&8x#!\,?3oyչӯsgPCEf ܍Cc~! E(IS2e[:  T󤄚Uբ$b}Rw&1XJ;|n:ٌG[ Pgݾ'_'IPsLJ8'g h:uPސ q9?qA\V*@酩_i+('"X Sj+?UN3Ŀ Ps =䡂AosեWSs32Uz͑{CnmcĠ\ɏ;0/Di,_v'Hw5k| =ch]){Iӎ @qJ*V^Ӽiq^L 7j?.Na!pښ4C^GcJ F~@S jDzDV\q~tLoi3L:;8 bRpXLCM( WtP>Vc Km&p%}ǯ+8 * `hyܝ.Ns{$0+i "Cs'3yA#3X,a#X HzkbʉT(+AOW($)؂ѷYO R`GE&gM, ]_;=pyd\r#<&L|7t!-)dv+ᔪiaC'xaЙ6h~X_nc*T=y(ݳ00P؁Bmq˱X"}Y>gzKyȎʐ "V'_pGy.³)r3VdKodYaXga$E[WP7 AbaD p!bW.7V;U=.`BÛcv{ٱaATK3+;RW ȕB{  &/$#P9?juX`\) 򠻹QK>3hd$͂j`ʩbrxk\,7 R,{:ZL 9)0_^k4еl4|RѳÓm;vtm?JRoXwMX#wZj~6P] ۍ:4ã|~f' 7R`5 }/CJs l: 6mXj#FOvSwfӳnuH]JPn^r  S.geV$*#',r,w~wc}kqGQ- &9sJ胭E4cb`.L5hRƐ84sOq`Ac):rHh2n- W)Ծd$"dFa+cݹհs1+Mz8W.eiNJ$M*ZQ-\ᤛgZN K: v}>0[5_XŶ. 
{^yWZɹ}/H=]-NtR}AU؆c׃QX*qA~YcH#QR{F V8vKJ#h:(|ڤ튽4^w9qg:w C4êL~p3l.◟z-ЏLO F!7ޙ([T)x}I %0's:cCBf ?[6eʔ V$t< 8a^?;:u`8׍}ҡq[$Ʌ~m; Ӝ 2}7zwv[o[8 F#r hQn _GY,0 M֓͗I s4oIkv"A؜k4$ٔځUT˺P"Pvpn(}N])G FS*Q0!8n̼[N ȋZh∌Sp*YP{)7 ؏7],$"7j,(9_~jE4Ȯ$NwoIJhd+scP)FGG1 ^vNڵzamaWx&~ .>#gl O礓CwQݽu:!$]Aƙq9=K98͙3w dD${8'FTOJw CΈS Rt& H/~LuE\t 2i6`#vNr@7qIww /.M#eP%^UeyOCg$ğK#*zokiEaI{2m=J~V D6,{\}\2{7(ݍM/e4 1?:|ަΥEhf@EeoeO?]?sM~_l芨Z6IQzVo~#<[8CB쎥Fss,5*;iR'|4$!9[g ?KkxS>I|ϹXu:Ld5>lIG&X\[;b@W*!kHYM0}'✺Dt%ݶݶm۶m۶m۶m۶9gf^yJ$”87EAW_lqt:i7No.Lʙ8wlIYc@}MX+RLlB_*5e :4Mf@qJ.7CFE)Jf&!"=Jptu+pORp(ߞr:\O軔^5wJT}쿎_4qVzwBf\c+ʥC]J@8rLM5V 2EWI3TY+|MPOC8xu`$ /|O OSl6ʨX2V?ИfŎ>vvy5CżV$#l5uR"3+nAU!}w7= AU`K>э`rwAhרqyy.Щ궱2&4!Y49Mb\%H?ww%0C[XG$mTз)>Ve`_I퍸 =Ӥڴ3#f?ZH NhXНoJ69.y픠r#~bF̹'KcftZe̲2i]eVhG_чalMA ?("y7mԬmg/cgb9r篖.td)x/G}dm~Q@jv4k-; C~JL՚90KUM/YuqÅ~c"[BF.7e'~lF8I~u0nDl`>9`ɠ#)g45q|yx !=q{DN)6y{,w ?J,st^HVƚSIJE僳pH C8Ĕp5 㒺k-Ik:oV5l*),.pa@ki})Q!OOx?=L𰩟])^!=ѭ\ f3쯻%Wd׈ٶj/`]l_Pr,= don=mWG 1y֎EtmjAy3NWݳz\"EEQs@pBZ#d ⪎-H6r̳ε8l\1v2\5OVu.e? Ƣ~ /[`@B{A1G'3WwMIҭd鵹ʌ{ `mY`7*MS5ԓӀ|2*n֋rׇ,k]c4nR ]O=d M#AEje8rqlhuWHG &pU5457U9k# w#E{ayAW&ih %8dʷm< XRG;no f!BTw4 t&% ՟7:KC+ϳaϳxTPLx*;7k{\zJQqME_N|ZJ-5DMbVO;NhM)#aq&ʳGؙ(z"c]Gݓwv] %yQ3<[a:='xMltخKPN2ae3TLFGsYe<$w~*!K^Q2Sr=r؞Eߒ3-pj!oOp #*azVi SCƉc]S*$X T%"੽:ηuygR~.0΀k\1Ewcj+oI \}H'l3J V3M1!Gÿ/k[S{t1d*YG!7[Am/l\2uE. 
[a*L ۾w-ǀR!dyp6S97`4-{~:ڐG U;.;cT= 47ܠގag%gjr"[`TII cP8г;Z,Y ь+;$ͽwͻG<.BU3zZWZ z>@rQFاED[} 8d U+wiq(p'A;ڰuVzp8cFHZGԥ|irw0sqv0'RqN%D۬3&DހMC)G.z6=`7c mZX;}DfʥݞLo&=l5+_lxbHKS]_C趥¨: wP 2yN/6c$=0 "O[ ªH~zQ|4w^xG,Xe{f*gDD}ؾZs6rXCޅً^Am%s\iVNR q5z "uȇiT注o®3KJ5a~V @ț\5j4zؐ @' \pJVZyBI5 i[oh%ѿdu}eF+yXHnhSP@?=4EHFF.¯25%UL GhFVh;CCR~ ׅG=Qꭨ& E(%_&j]@3W{!enMO@Nˀ)zOMxcO)NxTC 'K +^Z{n^CV!|d9]֪b8S,I*~"pxʤ'tw=!G)(edKfN0z~MݟhT쿔BMX"h!kG$;eC$d #uExfT{oQC@BI#HOZyźp Ç]%ےD~jɓyzmb(uJdĝG/xtz2 5]BʕOBȍj[s3<:B.50W!*y#z;04fBB_Nv-q%^ؓT?/WҋƳlMJ&\0Ѩ9 ݇HDDH-I/<.J;F4tey|x$}?y\ pNn jk NG%x`3~5YH!尧5vY__q 3b[Z]h.&8vL*n/x*`0ԵDk9H#E:Ip&,W w g 4 _U]S:ۀc+v>V44^ݩם+جBCk) w4>-ɏ{2:֟7X6\_(ܦ@h :3$avQgb/ kY+"'|>(kb?Z +ž:zM!>!^YfB F; #ғSgy g8*a|?2~Ive;riTA QSf06VWh:E3HO{M fnVBG$GasOP}lbp A*?+aX0-X!7`$D5"@PǕۻ' )5F^zĘhuh0^T; O"D*_SM(N4-pM,sqcԅ+0-Sȡ38P ⏺DKaC VDe;YrWbW/j2hT.?.a Γj:9imnz1^D{[ Mr\lBX>1V*eha 9vHaIi{QmWH7\h:qbd.̼GйQjaܿ=CꨇN|T&'62X"SN[W LA{"u5`[]]XY1RZG FiXҊ3'vt1[8hYiPSx}XMy]'QOyKR.h$fU+D_8pJD\Ur7v5[[l&+lUm>Lnߢ@-0d*E2z0|4!QnY~ 8CIO:|YԔ%g4i&Zu!y-gM;4KyA/.qjn%~N?.pesB`'L)i}X)^|cà{ps F: G[]֚1z竽ke'JȊ<Ȏ<$ьnP3[*UG㾄X뫲:<ׄ a}7>g6]u_Wn'Y03l._Eh 1AT4"S7daO> ),CWk$WML#+VIͫ\c8]kI]2KzDwÏa;kC/MTǾqcuc}i>a .T6B3d*^JU,Qh 9;g_Pu8Q?ȱ /{1QgT)APUH eB՚% XK z< 9Iǫ]b$8fԬ? M 0q=? !_qrNvEği7oS wfƉ JMiwB=گ=IߊhdQ}^@ny@ߡnaY؅`{t q f9;>C.|PFw>Z%`ea˻*5酪<{V .kFl\Ff ݈wʽ5eKSўbnd[q 9{~dթc9(v_V4PUt~V i\HW1QKLÛ2ǨSY SI&uUo Vqs8]h.mZ) eS:EYJcAh|-QP ݋ŚẉM_@>8=,D+ \zgh%Bp O~~XDK_f6znd,s {dw\:r" ߉<]EK_pB߶z`"[qUB݌T-?D(Ple| p>I93ajxNP[=hnp97X DYq+L!P<06|k d+N)0HS`8@ IQ;WD{WIBIt~}#mzYVU!MY&*L_XqY.tH,/9}`z395 G͢[\ZGa-+s'\w akM;l|A9&OH ReC_3l8-$rWۑ3ӟMkOBpRr!a|n]y^ 9ޙkq޴̸Y2RD$Ю{ Yݏ%ֈnhT6!d*czᆏ"Õt$CO$:=I)U5)`Z/IB߳v:-z FraK6}}cZŁQPխc3Chfi.V6e(cjXUr.+Wނ i;ڙ&vQ`ϲ<O ~89Ӽ@R_2v!t aQ~{ _cY$hfE0Λ P$bT*tJD?yB-;ܚrpFٓH$:;6{Lip`( 4w{9JvͷqXcC)|+W K& RdO㸮qq@iCYo9lَg)iZ OCKA;Eټȶ !و0¿MKFp ;ӜqO&ujpd9{ T.BI[ dj;јR|#QFg&bݛ~ftӜ&(5GS>g͍ޞ_,]䆈xf_N5:Z;< g YˠLxZ8,Lּ1W)67H=|'t,݁oi^Op>_&1X.ڑ֦7!.\ȯ~h5壉Cϩ.28e~ʺ!  
nԳ?LҐ+q-}~]ěmw=R˪ӕf=%‘5%pмQ·5${gK*RSL$a$͡ݙk&:}Ph$x?1-6M>xkBfid6*j|#»b$`2Ce,y_e_E"VQX#s<*PIa)+"ŮDFh;9*o//mȻw;0Q i ,XfnN^N {=+41G=:%2ó+ 4 7oL}8| MV2%8+^؋FpdMG,o_sb18j-h.UKP!;]fn8\2#=Qli]?k)vi]C9/:@F0/FО.z5[mb>z8qpmTROy/ W|\ [uPy)mz{M0mw0 qR#Q<ԼEL Qv/( u:4N \iBؐ&*w9#ղs0J ԤkxR3o)m-Uny^Bu>z!7\}|y%^3vyV* FO%x6lQkkAgdKiP$IĐJKg UEdg8 ٨7J9vGrtFŘ$L4&cϹ>*aTҬ>UJlJxPvMgdR( 7%fh[yWQB_8*}F܉N~wOqPh(Ob'2rA5ػuwWPڹpni<(L̋N'9 zWbiOD/BדHQX<`:,K< !c(Qz`!- h~ x4eWn NMߞN hѫ~-}mLgi!{G]bj3=q׋tad8Qj5K_ϴ0_0a6IUpgGH归6B xF 0OXb"4?*BC)Wg8tgz桄}i4u1_H@9dVj1'8 xJ??-QzZG5p2vv]iYPD= _/^ @sr0i3ddcI6Kνk+7jFm"(fXZ>]*`S#""ɉ,ItwIF_ay*1cOW1l~L䄿aS_&HSjQqG&bC}YREq}P 6A9~WKiX:b u|:A|k2smAQIU@6&[GKwm7g}R>ۀ0DnttЍ"a,|ǥ ɲ#h 3xpM!9DSh]̅1mveC0yxxzR8 [mnX)k͊ͺ+ P̍M1_ gtְ0{`:7"}ȍ0T# U# @NpFv0Gނ"}nh%ۋEdZLDB,Gֻp /o+mZUwpX*]LF8Zdk9nu&Qz`3p1pQ=wHhp..Q;{ >Qs*;&nHf&mMH.c3Ӟd9j8C vY ެ,;j$Cvcۧib wҭ0=YԵ` ۙxx+n9dCV̗aE_C29c P{]BTۜ&G 4j8mp7Gz̅GoaMpȚ.$dYoLUC9(jI);AB ?ט?> PG +& zCZCp c2h6]*sO΅;n 0`2z 5躺%F$^@EZʼnX"@{$n n >Wêe1;TKvS2(fBpo8c"o)~ @+B50;INntR[-ht9<th2M ߡb27"z^~Q^oo68ٍ"Ex#2MԩD!1͍*J EZ^\'g`1W#]]vŬ&C1QHc@ɗf6icy1af'0kaK_Ϥ< s> +F-(C֔.5-}dn-t)j$l$BƱ P7[e.p&\J6-s|mᚡ-rtKƙo ;lvzYsloԼ΄!qf>8LY.:as VR׺o`Vǒ" .sZәm3JLmGN4uE_ TN?&T n`ϭɦO}fY~c톒IEsJZ R@35aE@+k,LiBh1'_Wn5QuϺNYb GXނ gp`١ҕ6G[{SAƈdэ_@g eySdzfep7ɴ0mPP5*=(k#6\'Ah{,E$mT"CNa]DܨGGAzYܱuӍYQYBT_íI̒P SGem9r5:g,CGEL5劋b8iaњ&-0u{p;ka(K!h]s_)rw9F e]a:K"4ʯN +}1tD]us;ϰ k#i{DfV%!R@(8b* ::]Vޞ=fr2K̚zG= yi"ކ_zTd=1a)N({FXIm3i)ѶkG_a܀Ӿrm ]N[^@-(FacEmQ!b*NڦfyAq7caE\7/5,-r\DzMISVx՜5;5C=?hJKA lG ngNFgnOMWb%g˩Xeݩ{&ҾԺT", h/²aP/ +4HF]v7MAP 0A @gEzS(-*AQJӀ~@S`'_eF{ &2˼i1Sgk_d22+iH}FAu/מЋhW_Jp0L(~`Ӌszpt-zbQg3ձƂɆmJt~V*|J(qU |%۷H au9{"[d/.!5fUIJmHV/!i}KQpoS;gjvu/O*cܷ+F_Ux~\*ɸ|MEtDo1i\eg?JN"%y,)\USI3`f\5<1Jwzb,k4m?ed,:\"h)4'$?6cu5x"B\~woFƜg_[DBpuKAGa?{1!c7k*Bi[;t!~5L7^%=?&;?%=!^*tRyR dž3\*u 0㈝F`?dBdJ}q~6=XG lSll$qJp: PzD=׆҄T _M{~Tkfejfl'쮲`x,3F3т7C]2<~q.5+C6(P ‚༹AӱnP?}EY6!ݴ5.I$韇ٲeЕ͆Tч[LfoJ>9[TQ@N 7!| !syo=.XwHO27'~V4'}`ҵ-]*+@bz+WP-+2"we OMZIG#j"N202PY|A@YG._RB?Hur1|.;򐺭vj/M]ϵ\-\]+R#xUx !7O eTK 
zğ4GR?ŶKE:2vsH9,ʮ|8y}7i s"Ug^sQ} *4Z3MG\IF(dr/jv"n[\R6X08 p}m',t)$Wg1@Q'O嘜/;@rCZU1d [k6 k:J1u;>Oƀh#1YeT'-n#9V5z$_ +-+MWWIalU~!˜?;F~+50lS mvзa[a*ke`g CZ-TqRX(1H6'D3$Ω[*p3m8˚^;8-f^%ۍqH!r&q2ʡkt@#,?9ɓq y414@)"~7%o LsESJ#[2hDVZIO@m~ V9uc@>?Y6Ψ*nd#,6\9#˲6Ҙ`~ yr!wΜ:k#0xwFk6x&ⵎ}v| 0殟CmAZ+V{e?: -|MZxϣb*7RV*%9ii[p~NJB1Y 7l`d١ڛ/<4FA~.Dx8;&yK׌toh=UWO3h o0hJ2@'Eo_uln?y5j ]fyTЈΤԝ屘قMatʾ WqO-ad\-V :).wk*3D4,%M ^vFݐL2a b,{c%5li0!CM@EW pCPHJ~cq#Tg~12;_)k*XYb[='giNr8{f;R/λu_tT̮CP ғB櫥Yi%cVYA<0햜֩_YETBvXBSqiXS@>׉'Fw1 UuX=]T[Jѽ/kQq&:ڏ!rߨX,Ks?QYB͒^Xd ?a>avȧGڏRCd+,!}.>VnkUifI6"8~vHNq M㽰SF{MKRB+3t+^l˔P$6=ԬUN6#j9:9wAS"h^d]xM?o6t5 0xg~ mI?L<#3pޜPH@xè2#?¬m3>%Ob|)Z+@^Ca$6 f 7 CJ)If uO6/P`(~aj0,+7c|fWHZrI൬ w17[E_Fݎ$o=# kcms't-ތՖS42ڰm+ML`FҬpɯzemO (6'rf[zb6 (1ςLr%,?xqˁÉdXSZ7Qs`piLRcе˫vcq<5X'\$}}s;RfoNΒ(7F`tuUH$a]YˆpMt#"&L~{u<[b|e}Eo QjV|Ӽ ʬw~b(w"c[0MH^~#IPQ]UT%[Go%CNhqlN9GHC<3f­LKMӆ@(n^M#i%w8I_ }mt˹2~ qD~ @˥f&k^6v-̕`, X޳(:&ߛE$b1 bl㜜w+\ߘr6V95L퉗x{CqeAFy {Wv&g0|`~aq҆WYkFSU=ntSP@*4 NbJ`0PQOJ~{hnUgy]n 's zt@GdJ dL⩂y1_;J|XG^Y/3*. Q#EБʛ#?}+˕X?@]c.>4u<Mw(m@KjoY/Vd% qk;1낞''%iv |oO[fԘ,N N\mԋ[VC|"Tӗ#2O 98(ϾkJoC PHB;BiM[02H{Мmsޟ!x5 'ɽ'U,"Z>Y=T!pOk|+\KBxLOŵ ę--nyؠ'_Ba~#xfB 7i U^Чύq6G?iq&d #lJRG "Lt\=V,Jc91&B]՚a!q~$G}2/SC7`4ZSFh_oP,%ivJ9YS=_6:6)qjךGD.t}W7a+_.}O, LGJzqNvNccm 7o!!e{CXuD*֣_a"R%[D"acbbg<8zJ36p[lSX~żj^PV چe'_slxU?kf>5VQߟ1U"9k:X Mnl% :ϑG闌_ [)Fsq3V٧%W8k #ϡdoj=E &P@qg.pRX<&'Z)HmiBIcYo0jޘOe~BCVLp9:0& _F~G9p5TNqn".1ʍhA^PHb'ʚvax t8”\[@mdk?A: >aHGmPo_{TܙSKٞr p@8 7g]*b\r5 i CQ񅻘%5D2_$0<{7l CgkG CMR8=&,VRax 4j@6ԝЪHb7z/ 3$05N%W/ ;v<ޕ**Hk0ebCJQ -$~61IrBmw%Լ1t ҆0Jdj/ 9ޕ pð#W2\7ڶ)7s^)H)&-U,zs(+eV/xa[7LʆO7i}S[K#z"tiSU 沍mIo++K|Q{jn^~hm%߾xIU3l(`FN#&F62 lL5D6^dL#j$ f97bo2_zQw %S \*ZP>P/2J\Jc4<Qks>{c_JIjH;5Y5$+Ã5oqz総ܤT3ֵ=]p"mN9."]K*0 z 3u]fo9- 0o֢Y$D\\e# ѸV "j!*-;?eJLnβ@@Y\ c@ĩP{:(P|KR 9Qlt=EV6{db:8CJ)BGȧP}U'o0}[N0aw\"nL@ zWVVSݓ=EN|cfX>)w}Vs.~AVM^cER&O%B!dNZSppeyzKͶ4K,HW6gEos-7ɇrvWl~^7\Ejn}m4G@iqb}ok]x_BJcb~#IɘΆ{\1@hn ÌWxAeC8@[r3St0mj\ ?!U<Ra/rxf?{VVm{D侓 t#7M]ؐخmۓQ31.C9'\6>"! 
ɜ?C=.SGMT۩dr> i}ߓK剭B3L4EBa[c,l W `&"NjZdaxhɦ"Fă /wU hsw:[v|&2<߱jZ`xI!$sKv O./q/`X^` {YVᩄ?&r0N젬 QVJTD(AU-zuCChӴf,z$Y, +؋YV߲BМUak:v:YϬv7"|+R%pzr:6I"UWlRWhg@5# :sho;k6 LH;ӆ*fns:sD1e>_Vuޞnfɝ}bhff[!=OՠY3[mY(xuDJ" C(hs* p8դ]Gfl^S8刢{|0iFf ̾.BeNqJ} ȓEJsD)GOɥoIlwfuEsZIt҉{ku܀\B=! t}SMAh +>PSzP"؏Ŕ,k꩕ ]6T{gakzc76| '-H=NjZギ}~_Xxu%yޱi 26a4jg5ÿ3 8~]Vj,e&Ԯe&W,; בhrݰ.Oqj-⌠/bVyZQ̘.,K֧sv:dTJisLXqP@%lz?(zIqCP\+بfNْX4xٽ͵%AWanuhW>E70EU։$&GORSt<6H"\B[Ğ7A.\V@}KzaT %CmOR2Ve S҃՛:G =F˧]h= q$0MbCHQ%Rְ 757ٲN |\)Ȍ!P'ZMWW5`$bXUH~ToI.HV\\䰩 XHॲ}I;Pi fL3QTȀV "Ţ"95tn u8|ei^Kpl"[@ p)m同_qVvU.& VGzr6klr<:wvs4Sݢ<7'Ƣϋ^N9 X!F-`3|[3@wVZQR />`QX3\C]NDž7ki|YN4''Ee7J>b~E-B5@xgJnǥd>^ Nr9H>/o`Ɨg[%N#pV?#O/mtn;(u״*/ö.?[us gс~?ĀǎZy\7準`2eU,M⹐kl `*di:n'L?BD8S|WX VMWq XHSH۲ N ?[Z6#cpy PM{pld ՝/Ӳ^BWc۫e]Yw0l>7UւT.=dc6*R# bO夥uAeRGufRQ o ~v1-E;0:keZjIH^(y\_5dmzb3ПwU;ڡ`C6 03\nRa]|Y0 wѪ o f;xTXvi&\yւ|߶@ EMH_Eyx 1ڮ.P^VJ4hbW"e^OػF2ueeP%JgLeGs>쌯,*>(kom+ I ø挙߇5cvC:8>2\7R>06,>P=5_1Cgk N2}[/ƀNs,؃^ӗltkxBˮI티%fv C$[cމ%S34wл5^YsDz endstream endobj 994 0 obj << /Length1 725 /Length2 22975 /Length3 0 /Length 23507 /Filter /FlateDecode >> stream xlcpnݶ-ZJĶm۶m;+m+m۶>W_?mh>f1IDl=Mhh9J zZ&!Gg ;[agN1@Cp03wQ;P60pY۹Z]]]]\h]xiYdbp67ZX5$db*1[Gk@Ʉ`j05''u5qt @FDY@TNV "D,05Hى󿪣8em?XutE 00rYK7 [S;.MC~4v51 ػ88dMm6p6G [OIX]LV3OodL-\lM骱R "Fvf%4p4rD/chТ׈Pv^4&Vz35rqt4u_dbnblgdR+R8[N5;B2}l1hnwx};`ȳ{#qDhkw<1EZLTa[V{ ec$ v~眾(O9ɜs^#Pʃ,_h+FC gOU4k\pmgKϼ<0T3oѠ 7Sk_-R$j&DJMDEi1KUWk!Ora:]=.k hZ}~aYjfS1D6Qs*:fVJGEr80z^X(73?5h\LǤ`]X~$V'q־Iln~[Wv;)keGQ;0P N nBb [ ֦D=v;=փ>2.RCf[AGaDQ`%9S<[otOyX:*rgGH?cvMB BAc#',Փ(-!&_bt쏠AxJ)7ͩE[m} 93ьRQb:>ǝYЯEan&̆vcS9)u  ix&<`AQ V+p`G 7,>~q1 gӻ`3ٵs #y#Ze `dGJ%M׫ ߅f>˃VDF0t4s5kDgՃ48݄/qMlUͅL>+^? |G=<\a_ðNZӄ]o{)XC\/{ U1^ ][Z݋f]}m/y RbniHI[}]&LNQSI=Ԉ<")K$vsHr{W Y9tV@6)`ܬJybU\q@~a7CB bB>#5oš,? 
|ߛVA,/}wn:cMvs^<6BU@15Ȓ-t|";TeԷEݍ4e-7`%3rlz,|(^@PIJVDOG\D:|TTAm6ɏ[Fd-ˋIqzy%V1~t \m݌H~'8g.1D.W!7%6&m_ c4"x̊#\ϸI bQ8~iι*Icu=/znGƩƐ%dfpMaf~N d$lݺ Ëy% {j[ky,:'A̓RܗkC5.$,hۄc+''3M'vΏj K}0߫ f0G3 Xщ x~zt6YUi6.*:Ibۈ.csՃ`M}I,ǁ|UvA3A%Vp9۝˅w拮hLMV"oa1>w>gm#̼D/2KxP\&hIwBB-hDcS}j8!cw0 A:NuL)B0F/06A4f[3M Ma]$:O>VfoSf:JN 5Tv挬72*(CB.a׽z(p$?4.(ZX#΄ #+),zES#Jkp)ե;aVVhPw(w@}rqo"";Lo4(cYV,4}_B 6iAn"qZ9 Gatu=ݨ%OȋNjE8m )8)A @3OAv"S8qc-{%"tZ[^ar7DF&㳍v0W.]_D"R3 Hrjw{M & \.v:h=)vJ2jVrhLe8]}~'-ʂ2Gi 7MFr'"T}' )n8-<:窎T`B,_DUX{Ng-!+Ӗ/v :'SF`g6/654l_Q41F_&/Z(`#T} Y_alcvwXPZ eu!Pgܾ,~A5u!2}IK|xNZ M_HP](a֣mz[*!t,μ 646B}O =ni+j*~ُoڀvgDHod./GRgƊq1,Ӥ-3GE3eTl7Ss{'k2{ʂ4lc 9-*:ch#Tғa>SUI߉[uZ˔eӑ_.l3C`Ҭj<CAG|.£b I~EJb,GATLDN#`{߷~:(ө1XN8 7vatX3fqޔ,I3wxm.|S#Ps-jw?3cLGPNMMFK[A =X?&q5}2{@ PG8L8 YZa6n/qn0s V^a$LmsKn(،Ӄr(\Cn%ywj"J:DiDՄ׷"ĭ>D.ZZd b={5B)  ypP5 u^DGmGNVgIEA*!7 ZD/W'|AEǯƾ'nGg+]2.Qa$QOG|mqNJw,g\5F&F tP U D%/h`V{B3wrJ{֑8x_5gVQ?`r~v[%ss% e-CѨ\eWz7䌲mp;U>F+e۶SN)'y| 5UBŸZa&ZcWp.mЫ++r,qh^"oHr<+bcϝR39zƢj4Qzyb9zI!C%pmNj;4lqO-fTōe=C.ї"M_NC1SdpU Ev7Ն:*)4xA%?023RpH< xYG-ujLqgVM2UJ58Y^j# wNM>Tstfy:c>Zr2'߻?` |\GP }g1i5sm3О_/A>͠)kjMXxGDqV'B3i9A#6vq7؏,q5?eaI JWJ2b)UE"?]k' LX]~A,%PKY 1fX|&t >B˘V2W o0ÐѿK?DHO|^a[G*">g ӗ)gO=,ӖK_ ,h.t-u J^\f7zt7C-Ϊ B_)J{#!04+CZ~ΈBsOHnOP BVXu%8{.PG1HGo6!+kŜԣ@o+M싖YgؔB:ЮkIG\SOԋpPH_]bb?W`s>D)`MC,sX`AbGqh|(c)ҧ+w?xm&a}H9,4u[~W"cx:M  ]]z /Ce0Dv,'DŽ6FT}/ r8kMʱO)x*3c=ML T ZOj,Y~t[ ؝=ZTݡojp o-V;ˢs|^lzZc>sie|kyƠؤ9yEF :47!DFHuLR!&^/٠ig9s_TⶳQp,px-|c[Vl:,,e;sAg.:w (vSt ƚbIp"tEґ2oW>)ww+F®- l@W p=:I5}B<8bp1h`>KbU2j1J*sf=jn9N؄X𒙏Wy K H"WVeT%F[ʽ/6NlpQ\:=u{yA׆+kk}m>g~ )Yc:S˝s$TNSSYڼyjEi*nȍCӵL9{-n!iL8-5SX_OlU`0=iLNZ22w0/}6ɝGI1Hy_ȵm "[%<-*N}P0>89QUPn0qE] Rhv F(1\C&T;_",$Kcu9+4&Q+b.o 4.2_+Dg^U/B]p*ΠB'E 2p CiI2>3q]e;u7ʟ|?X =s`P=d;H//lI02u`b2L{A6J6W~nȽGFnקNØE?*:_׉3lP jdop_~aLq1,'FlÈV`1fDW[u;0oD&dfciR!U~ޫ^[:#;8>'4C)z~/#P͉ `? 
1m-?|kDmQF݀SMMJ8YYtp.9@ͲrG[X_OGX[Dȉ1*#ŗ΁5{ Rz􌤅BltI(߬ctRN\TZet'`-X|^PoJƳk1E7g\6*!{&"nq"W#zo s ;ASЅ>{P)$5FsRPVPquSQ8i$](n O^z{`g^MDco{@,obԟҶ>d-`B06x8Y[r >CzEf@.Rs6늹 =|dCfIѸ1{ԓBUsLC0gۯd\xO߃Ea}T3pNa{м߉^OrM-*' Y#WPs7K8dw3Jy&#lbÙۜ$iKlʲmߜcb?MFKJTK W"3+,y9: B<#f?%\|O%wk (Q/^TM喰GB tübrc;L;LC"aZ=oM#]?& o jD1=ξ6Wnz s' ]I./Kn5v D^Z(]N|7'7nl nv7A::CfS1Yغs_@q.rF"hUU[?A$` xXY'o"(2kVўXcnx]#*L΅-f~Wwmh; hIa'G2ؽk_ZpPi*xVICgj ST6Bݣ ɟEٵv30oit 6f$0d(+G`++O@ й٫ybҳ4kK;)/݊wifSjP/albJKy)nS\#Q{41 eઠUv}o Myk_ݥxt *&4WtG!>FtT hS Dcozap`u1(6Kq4ӺIw(0_pm$S쾪L8 ?"Eu%J$?umT`jH,o,dl9ylgfk͎d 5ײ*rNWL:[{J{ "ɬN[E W;>còeg 4Bc<~p7oj%̶L k)%yª.O tz_>^ZkONcnIq~Bb  Z]!@EދUI݂鰅%[ȧD< [mP].b@W ST`6Γ;->?f͂g䖇Yb갤G&%/;OlM;;]a5E8rx2VJ\Eϙa?,S-2Q[E,v O/Y9XAe,sـ# Z^˄Ș|| MM,}f(1KcRh(jO/vGybtiŢOeK?_;omb=Q|=:s eH{y.N2ԕ3igx aPGL@ީ)ZvDž (bWL)c '~Fs̹` gNrSoB*{yFSYX q+> N7ϼҥQj>'eRA ëКy'ۚZ|{"z!#M~Ji#$ےy0Ib7:wz5:>ڟS1 5 jASA  fjw"Z3X?xfN7СE^?/Wҳ'z 2kY ߛz"QWmr#R]~=wق.N$t֖5IhMiѺҍ#7"{g?rWtdϧߐbbFg15<^}պ(愈F+ CijkjWEZ@#lM$Dž0{ҨuߩcU-C^=ՐV0CNb! _Z=٧(N:N}>0uoU[9)5إhLt㐬s#$WReC@eg؂&WyMr+>ޒ聤|=h*ɾ /QF'ʶajd;gd =&Is 3͚knB:g+if8n) *fXՊ%Md-F8j;0l '~w# On[7*EF75|;f|U6ֶ}q@4:CTQ3ZVLgXw1u+`ͤz.kuܽ5Vw _~Q] qt^meyG D0@n@'x-}/429:}gemd^bǵ b g1:[fa\PaR'*>4_0fT~?SSiK4nvkLH-K .}=ّVFTh%-|Xuw԰ exJv0:"VLhJ Twʎ¦1Pwq 8 s?Mno8PA=@ǂX?4gebrw>hP/z-"MG5utq]CcA4jx ˽J^PKG'[Co8`j1ԏ5UcR.Hɸ8FC)$!BL91GWv?^KX:zޫ"uxn;n1ze)Ntƨ_ Թ=^Ya)7k{2|\BŚse2nƣ!?.^2fՊ ٻ%`1$M |/#]fΚyiŮ++GAS{T$emycdJU,'V/nlѮ= q*}?Z9>OK UXhQ;TI3&EBWsJҩHCOCw)1y f/A_{+t9.!S_s53ZJP@WqLx Ts'݆>w$*Df1EltՏ ~-a%C}WS1[N8yU.^ugѓ*(M9F*X!V 34;0!T*Xp&Q,J/5K]2򐾪 Ѝ.D /g#AN1PZ&)ECc9sJu5J[xl•"527O 6AItUQ_cv5g*&8 ž>XBկ% -Ruc,Yhqp3Db(3#-n+]79Sh3r';9w:daFKpmtyAus8=+4ylSM[H|R* ~BRIT*d]>q mzĴYˍkK:<*$+Ixk(4-'K;ܑ0;`䀛~Z gSfC̜7,˱6eJ$U:9lZ7nh 2`9C"}ER$_9̱~ bxz.QG `(e~'x }Cp%'wɜ*\+ 9|OoW<6xĦmR`4wfV.= # h%'rpgxk!<` UF۶b>절r?=QA) RA:/Jeu&L6׶ dw z!Q4kV}sY:X<8??L[(RlNr`Pd rۉg'oBjYcevm`w|0~o͗fq~QK&>UדK*lT8}f?&I-J?1dٺ3ǰ8LCrdqX.7Bv2" < 9Umɚ}H0AE3d2Eu>m16tJa~9P+N=-ZO'Xv&.I\@2WCDmr}[`# ;x;6 1.)XWZ' [ 7QyJ}M FqsMCKm00{rTRf~hƇhc'E 񀬛\`aƑwͭ.\҉֣ 
'a#dy,{ai^>c[>GӃtf2Kôӆ˖V J7w G꯹VD{1u-hMѤW! },RLYM;K4u]gs@XM7Ŗݫz-Dg x#lDJD9P!<$wϬ0Ϫ[ ̖ ϙ@ rP3Qmq!8A[E!ObajHa!NU0 !<1gɏ owf+Fl(^ky]=<7g;^j2!]:Ȕ\цþq 5=_Mv abOt+<~꽨}'QԬbB`l&Dž:{LSvFMѮ{~/4C.N1’tLhՉ̸ Nv@/0jD5}2XAS+dT\~myl^i62o#~=IF=9UKh>/Ӧ@65K?)]R=]> 1ͼ W/ğdJ>kY>}^2cAv(ԟi rmPVSvRbp3_"dRl#IHBDWK,Pe 7Do ,6@A{RI!b@?I+%Z% mvҼG>p0a}"l +f4~{ n}Uoۛ2H #Kvdm5+XYv Eʓ$m/K7qSMA6$ZzhX' wj"h/SCwԿ(ū o%r3V1@-_z!`<!> Y+|ݘq~t^ghϼZU:`P#ן7ktr#t>wge1f4=NgyZ?B%"Cq%ekÆ/m) UI4 x' 2 vFX q^A u+A΂~ehn QV{~5ͅ }Y@k=UAYX?]yS3v:GK2l4 0/sfž@"a7^h oeI?˩"J?wcQTNkW@2 O?][$ųնٶ]Sm8Y m۶6i1ٶz~oϿp΋\׹?[P6*"I@ ty΍F0QӆkڢT:" kIUQ75 A4G1v ̉K*17`-)xaN:(K$қ7y0zDl0/v5b^L'zR-lC7$8;]oӐ D5=*MJ2 a?^qFiJ44ru_z:%qxZzsLV1ְ>>tAXmEuh 4$u=)Ԫ}7Q ښC{j4ˊ|b.J=κّܰOt|)i{p\(oܯ+S.hHW֟TzHHXz ʞӼνN D@xmh΀Yp"pI d^(ڲ~PA`7Š7?99`;>O֭=y~|J" _o6vִ%d ?`}ogO+v̏yÙ'_NRp]H 9Ql5ԵZ SZ_jLvBH `fP(cX=)ŎnmkO|gCnXAP"2To>O9߸ͺ6B1D:u)c'`)k3 !v ZyJ5pjn>'YXqܚ3+nqp'Kh,i8N,axLI>:EвO٣{uaV*RJ2/ /enN)d`u)Sq,Dg 7`# 7)RLWYQ&hoO WUZHRMs~nˇˌJ\;Y9aE9U/(+IRp ܧn5 rU{V?]$~ '`05tiރkK< (hdД8dNk#?31CEM֮;kec Is^INscΎ+s^/ \&Qn >muS, ΂c0 ~ѷcv+$bCJI˟XZ ^E3BUa2x-зyyduwt^n>sVʼ+aVu8|f> Z,ŗe"4 nTb-"\tg$ p+( Bpg7'$z/)ӨvSgzr`,؞fu|wX ┞,j-u@[!|U6Q͢#.V' ITbs!zRwVy/#>* ʝCkEpE ,mok"rgwmݚҠI'(o@Yd*}UFͺR![ j _)2C3Lwe!BQfOV[ǯVk8 eOa4ԩ%`;H!8+~a8nQE"ǩfedK?o܃xLSC19 8>IvJ0I+*vJuvŽфg&+Uz8rSJ"8QNaZ#Wg 7+h%ҤdcB]dE-'2N˜^%X"olE(c||Vd# _%m[5%|#zC.M  67mئ;:TQ|A^+̽G\hL7 d༦3Z KSi ~N/_ΡHI/mfl8þʮeBA{TP={rUW(WkG"#%%+.00vm#./Veө Dd2 ʻa,-5HclGgZ4HcML!*DTOiGJ;RŁ-30r5,瀢}M>͐Bњ2%Ld͎bDH pڄ=iwpMOtظ^OUOd$QLrl*Wk5h(6ԧmQYʇ"VW+mI8p͗^A5!bMzl"US ]k1C?gVS+PAZnpԩd-jD#>0Hp؏ݜ cu/jg^ǢRkDC]~#))C96cO(EXiRv)ɣ=!"/+?@6BFV8EׂIZJ;;Oh5) /ٛ-ˡ1. Wڄo\RQiZ虧,yAU𭢚ęA(һ%`7fҢ,Ìko! 
[Binary content not shown: compressed PDF font/object streams from the Menhir reference manual (produced by pdfTeX 1.40.18, LaTeX with hyperref, dated 2020-01-21), followed by embedded PNG image data for `menhir-20200123/www/doc/manual001.png` and `menhir-20200123/www/doc/manual002.png`.]
'կx}:/^ ^z-y7h1qWgR1sPQYn9lvt:CCCar@3k,R RSZżXZ:yذaf>tϼyxcz>$sOѰ8ɳrx<Ŭը fڐ$Çt:1t:@I1Ю10^M38sJfa*"~mΝK__6׳`udYl6ES]UOSSr-0 SA_KZ.̜9;UJ%v`F`pq̉k`f>``V, HCC!XhCCCTTT0>>lbͬ\̞=Fĸ~|ҋՇ{sys*J˷a1sʪW* h4>|ݞa* ۍV(ue& qǘ7oǨksgVVm 7,Ea㥵yGPPPzJcz0ŅN8TV͊5 fT*Cc454w('`"!?FD2ha FÈkфJƠa6EIp'+@3J>'ZhEIRlܰ"`6/2`K|9<`l)Ot 33{6}H:wKeZuM$M$C>C_Gfw,'wrnۥҾAK2S} D#_1c7y KHNy T'SIexgdhpIVV#DDA@B2@ѐHQkd24*AE̟KcSwu|[ z֬Y%7bgr҃ݞGƆ~S &''Y8zn$Ifƌ `%KnDR(..a֬~ߑ3,Kj wuT*/e֬YijlXjwi"#$S)LFxee:xހd03>6NCgS^YA<V5u(,`dd"*2=LqD5#IYeeeF##::EӴgnFql6@$fc2c3ZBoa/l2X6حy''(- b6QkԄ!^2w}{fq&lv 3g6iɤӨ8a?<)” 'iINe'^)ۆs_?+(ϲ) 8Kg{g:Y^cgC9Ι:qW&˻cNN̻,gs~gY2 ((Dxα(')P=03t4EV@ٳ0iDD0#]r33E2Agy̩1NJ)D2^G%x >/ KwR*JQT|_I$ʔt&F!IN+ FѐfIS)]ZMoo/H}kc㦛n~z|ղfrRn hoo'P[[/. MȓFUUFI4omg<,[466r,{vzzzw ]w]]],XcǎQ+eEENƔԜBUu96紵aِeFCoo/)bpNqq1HJ&! *?g.nIlh4\##P^RNNQ:b #{bTUe-.".+#0MMMB! EEEC<ؗ-^εJ{s}5…s5f*afN}sgI_ fs.̜`e*uN؛L%immJ9WD8P8=?~(d*u.5%d@Zp:Xy*<,d:^gÆ ȲV`0Z~n7uRPPmF>(3gd``y3::bprpB2 ]u*/hG%//9sPZZÇx<8'Zm馺B #iΝ;)++={UVTLS3bo!dxsfV5簻vZ-[ѣGa˖-+M7 nl$8S`0H}}=2v7ߌ`0H$r󠢬J0D(e݂P$NCR(r18#=ڇ$IZP$̴:_}&$dDK[+(4(tp8X|9>(|'֭cŊ[fvw;}7ۋFȑ# X,4uuuTTTPSSCWWNĢEغu+V[^Z^p*yN` p\d2:::(r?Џ1r홠!T*d=PU4h̙3ikk#H`JJ׿Ξ] JR+'72u?ggg^cf>xf&}gC7J.?3sW]ҙQs1 7~9/pDI)bg0DJ^}U"HΎ^>E}2d3ܷc#Al3/KHCղ}v/\V孷vGÓ荆\l&Akk+Nё1iii  xYz56m?Ʉ :":PpNpwqz/$c69H4~f3vl6K:ds_vك(8KvZV^Zh4GyuQYYɁAUz,͟?0ccc,\ǃjp!Bh4jp066F"@VziiiѣjD"GQLLL`ۉD&05(%'OߤIqQ1dΞh42ᙤݝ{λXd ZÇAd)<;}kE.,538A?y璘k:.k@FLrS9 xlp޼ӫ.6gYhγu s%=9W+_ dN"@բU$ t:;wo|l۶7b0PՈH"5 LgҊVFHӊ?p)Z~F+oulb0u6qG|̜9QفxFoW_AĄb &#xWD2?iu`erA_53%ƽ  WW.5^Zݿ>['̉4~a`Ffڇ(*w @2`0OKZ-Y)^t:|AnVZ?S V)aN$Id,"H4lĬ7&iii_dwn2 zD"l6399<~+x?N__-b׮]x<'3gdÆ 7+W̥mvM7u]G[lm{h4snuTUUoknL&BeſtO|0:tgd2f]OS|>y7Nj/$O>$<o`+?y`JWTOOǏnZI L&o&P|7"ȴiezt֯_O]mzh %eQ03|fϞ͆ ?/OĉFS*Q>%ތNKlx<[Yn|?CCCtMݻKOr>ρhhsa$۷oh4SXXH?VI("222  P^^N<.7:nl6'O`޼y<#|;ȑ#bѢ9ۊfzmFII < r3.fy*++vOK%f9x @kz \p+Wr|>twwSSS"oivٸq#~;ǏlJÁnI4ER1::l&33d2I$I%hQi PUUvXzXMt%O>ɷ>47x#n ),,AX@eD IDAT8!S:W^y[݂Z;B$,a-靮RP"K8Aw12LH$ӄK$*Wrн삫@P{f>C: 055jER9s0Ϗ>3onuS!?L f4W%I-:h,Z TkB!AP8&`Xx?J__P*8 
JRJђ$&mOĉD"seuzLNz0TUWp~?---LNNp8(//"piz222hkkd2R6F 0`u뮻NjQY-;`E&kuL%B@PP_(/vm%כҪB)wqq1"2SSSȲ̉'Xz떲,_mQ\l566""( |B!rrreD;6oeڵ ގ32ǃ`h,B"@]4K7}R̕L)' f3(RUQW{*0\>3WϙqHV(,f0v`{ʷ89BgJ=wF@E1I&Q)t4J0Q(dJM" hDQ[+y=*QRL;_rˌƢqAH%6JAe 6sN‘?eF Οٳ NMM ~i=… ##8$3sLz . #Bww7dg[ W).PNNa&=V\I8r Hh֭[)gΞbΝ;1L8ɲE9^g|b16Eވ#+# O0L Z gϞb[)J{{;aKK9|ٶp@ @QA!DH DʘۅZ;9`'~1Ei=q|j$QB$'ĢD" JAI"ZGU~߫{;o,|+qK ZvM&o(><ƙy?_D#o,RFKWWON./DJBT#a֭[dz> ~37OOWyBPBPѢK/TmO3c `5x}7p$!"R^^?%Lrn6*uRY]ɊU+J@S3hYb KIIG#q9̙G(v@`rr}PQQE{{zM[^#^RP1<4BaA1P *~JKʱX2Ǔdee32:JMMD, 5|/_53H$8h5s FÇYb;wb6x<`ÔAQQp …^\\.$p s 7j*^}UR][+/xc@ p(D8ւB@զт |wzc]̇WO>_\ U^3 ͵x BPʽW!';'( ^SRs+2VvezJJJx<9s&cMTH,_ 3LU@ rlP8T)N:E$l6smwя~̙39rDiӦq1*++ٿ?55lܸ6n܈j+W200fddՊ淋`0HEEӧOO{b1֯_Omm5CCzXlgΜᦛnbrriӦqQ.\dbܹi@8|sQkDV/[jdddk144/~ ޽z &&&8r###|>f3@ FCEEX7:^JG ( R"Q*f! $jX^ݼ\fl|g;P(a5x 塇~#''Z:;E2j5###T*zzz0[x&ɡ{RQQ=7*ō$P[[˱#GQT8N-[F<errP(&N2urj^|yl9vPH,]InfF&a0pRi蠪p8Lnn1Ȳ㡿iӦX|{ah"(Xp4"wgL0n#399ZNՊ$Ix޴O @ww7 PX""DJKKQEn̙qO0m49Hee%e,t:b8AA(– |ե.C$B)((ԨSj% %% )RVd߯pepV[<fx$Om4i; /_\)d9.d[&6m-/Wv`oWCb&?1x}WWU?p:Yp!D T*5>B_;x^@ B2HVca'ѫUO{)Μƞcfd‡ ܸfG56l=[n@8( t\ Ӛ9|233I$ݻロ]^gݍ8z(eeeB!hll$ٱcׯgddLF)mt:)++h4ШT- "YY> NːcVdZ顼D"Jze||߄FcpQCr<$ z=DIj ^lrHJ׏KXS˦6SQcGy?XgHv ) Ϟ"j'a'T)Q zdRhHW޴|[[o R_!!(Vo0x)Ok>»|f~eH/ L wt_(Vp 7+=H8  a * o|7‚= 9CV^+E1)FŌzX0kz!|?[vs_"a) ;vbb0離 x&''G| n7Ph4S I8233'x}.??Oa֬Y妺ݻw35Zt.ʰ F0Èb̙C?"ajk`0HWWΣ!dn8x <ox[o{/| Mg"=|ߥ$HD(FKqTJES=5 7}̇ fCJ!] s#. \30|h,Vk$ģhTj4YVj2lLNy/~M1|Gccc8F|ӟ& uVJVlfpp}l۶Ңb^}Un6vEs4n78qjB0Ph,FUe%x={0w\5kq jWsl+SS)ХiC䐝M8buxmVfe!2nzzz>miO9÷vɓ'YhmmmTTTZd㔖""w|;370fc1r! 
eǜ9sH$LLL理'c۱lQQQb9L]]v)r9<.AQPPJBB"Ht:No%nKeeeaZSPDII 6mb>ٽg7[BLctfz/t~ZTLEQtw oz~d픖V5xa.jjy}h:t:x$ F2H8Zu \ |`ZGb(>CL]CUUUUB DZ<B$YBJ& G PTdhZb(5UH8Mu `ղ%_!s8q0s[b.# 0M>}Vl'ԧXd {% QTTFaQ!@R"/2CJȨD #QfϞC8ewRMcc}lzm Gyy9{lǍ70/֭[1M3g XLf܇`d9,DPCII ===jIlOJBRٳgb455!)ġ!F#XLf3ťDʐnjj|~?pGyJ0$??m۶q9"(H$x)++Koذn Dcc3deΌDYY==}X,&^/yp 32>KFԐχs`;.NYtBN86^A(]緙ޟˢNP[ W@>LU\_AMM)(SCAPӕv DR蘓i),/I)o *)%S  PKaAK aRh gZOq9xd;y_H}clk`&''d2wwwn~oY`}}}̝;6VXv%"?/BAkkŶ8$!K2.s툢ȑGANE$˚E8F/2w8 YUuxttt؈$eXٳQ_D"رc' 8lݶt:.\HE3܋b IDAT|pinǚ" `̵"(S ۗVZ$%JKJ255VCrJJJf 7p9V+ǎch4R`q̙ ɚ5kC f9H Mz^ˈcBAVV&S|SueHȈq=S`% "ՠ Zߛfr_~5_mWCZeD R믿ڵѨ5Lz'/Q GWG랻(mHINR_=J<57p}Z_'d?a&qMӅLŋq{&)-/tYsr((,o|h4O$w&QyݟaRUUB/JGG*YZP6~989vg̤Z#Ũ8˗/GDq͛u #33˜ysg5tvvl ^e\#O"I^}uN(uu388H^^nܴ deea|TWWj*fsTcv:$$2N261{C$DbQb8Fzzzddd`ʹ |vIH RI}}=(RPP^'##r ihhH 8p^/ "P[O" d""0\<J^ڴb. 6f<!|c,Y0^.Zb@Ʉ Fo \^^o$WrN|s=]Rv}WDQ)tR^xbxiޕ+Mlܸ?c :P*s) RSMJ,Ly<|gu  :5ɴX׋kb3ȵ1[,5Z9z!uf~_uV~aΞ=D$vŋ9vgj>|;ewwbX~=?OGOK+WDټ5rrmTUհxbΝ;GSSD3f0w\ S[@$a˖-tttfc۶mO4e=v z=٬ %ٵ{Zx jǞxY- 81Le|l۾I JMGG'ť̜9 Neh ,رc (//c]w/VJF;T*9юV'n‚P(sۃdY"3JiI w~'IHYӀ 055yŗ3,]ǏcbhhI0L\p*^yJKKiooSN؈륰ۍaZjDQĠ3d2ը5!RQԜ8~*|)/^B(Fd$$˔ǔGA~!~_ d֬٩Pg"I|{X&B0 ,$OFFp1,41s ΞI^߽s=X̺$yڱ 251Iei7ʚ+yhhcdќ'$ԢAA@@T—3.?\l3]oq>r̵kc)JTZ2LD*^/cv-դL&&ӓR( 7 (<cOY:V/];Xs] - ~X|_ٲE9y<}}-fJo|rJD!]h?477h"^/JJvqtt`֌9tE CeE۶nee:V+S^/ 99]c a`|bI).*A?4@Պ?=;~JKK(? L&jD/N]*N2dxdžyFDQ͔o ބ^o Mi(q::x 4TW1q-(@^~lٱHR&wڭ| !#0:6F8G茶_l6g$ 4 6mYٶm lDQC8&(,,dǎf-{w{;3ӧOo~t:˗/ǚEuY5 Nn6T*$rX0&qI pBySaѢE>}3gNNUey 4E\.DQӧOS^^͛Yp!f8pYfȳյLn]u ZmA9??E;奴P(,\I&QR\B۹6X,!HdFT8Nl6 QRվ*+NR^^& )DcQFGGꤠ \.7*H<GUD/JeYo;]3:xRfpȁɠ' SC2eVC-3jf9w DU}w]TH4FP(R k` \:I=1 HBPQhBvX_Ty_B@ kutZ"`!*5Ԗ!"p=v].,FV$e\S6 jt8CSKV6ocLfFGLj̚91ęp T :u `|w߇d`y(E%ZӭMqҥp=E錎p8LlݲŋJ^^ݝ8#9{^G ` T-vh0 -|"f%Kp9x<̚5+ʫ7'N$ze޽ddd .SNaL%D"QFmm(**h4a6Y' F@A"Ddl9tJFe, B0`Adf#bqT*5Ǐd2K/ѣǰX̌9D456t`0젤6գ8rQ*f.0fLc&IT$1tdfdڎ:. r뭷LZL*50!e+/?כ93?G_n|?2)9\^_ɇ'm PTDcŊ̝5 %%bI ZThB%XZ$81G4"EC06ss ;ƨhlLlކ!FGTTcb}ܜl= k֮Ek0P+E~?. 
Žs=Dzeؾ};_e}[M͍t:yghiil6#"K' ZZZկ~ק@>9r555;wD" DRw^~=LLL0k֌ٳgY),,K*L&l6Ejl6,& g2o{=.`fs<>ǬUo@132 Zr8<:IGs1;K&(()׿~H&;wB"C8## Vضmy~q "&Nn~.6eu(F ǃ$I,k^f%\5-!d49Z-TIm 6~sE;:_>g﹇h< .x̌kxuB0 f\`\K36y۔diq'_t&R@R҈ѽ?C2@ J"" V$$Bg~nVy/| yXsl\}Fc(* &~Z0B&ő<1Y*yz-Aaݺ55JB,V7x=e%,ij@Vix?'ɠx`Kut04/9lZ Z#a-bvHIi);qx ^x+Z=|%%G^ lKH6:#eP*|Sp~7K* QTQj4:=Z U)8َ +)V/m™c#4Ap1Ѵj!h78<7tǏd7wfʵsM^*B$TQ6mDSee F$6_϶-[#{n0k׬].ټGy+V0=9MIi)n7Gelltz=GzzXY&Յj"G^å\BNN\^unLpwѱ&DAu2>$IbrjJϑ#|>FGhlld۶mHWW@yFߗm&aXXX`0033CYY&1U[aȘf|>/N< RVRF0d >"NP( ,MyyFH$ 33Z S_W\q% PTTbM75kބN'knzDA@5IPCm@`v\+^L$f>$)DI \6^r "̜~ ﱾɿ`{*;ZH,˙ p&g%KHWZN58ɲDV2#}y&͎p=|?"\‡( L%QTڵ.Çl2!Jt:C&A7F ;p⎛g!*ss'8ΒRXQJd`:LZTk{ld3W?gŚLLMPtb1Q|;aɒ% .2&c1K$ 6nHWW,p8PVY|/вj%BP*8}J&"X21- Fl6GAtol63>>NEEx< qLőeay7L+p7,RZZʞ={dvvZh6QUUA{{;˖,vӲ|9SSS,khbbbχDnICA &GςCEEE6')vrh4|>J.*5;_ݹAe{nQ(~DQnht:2 FIJ& #2u VHl6뙝ER-t:~?DX(Daa>s, ,c%C{{GVXl/`aaZGuu?M;OYܱ|o7z)2BЪIj 4]z_x蛌H|k#Fd6dw[YoǾ[_f.t7]@H)awEݙ޳8>cn~JM*nR,& Jh<($j Qo,CEJ⛙¨V296 VQ-[Z^B!J%z)<tww/~s:O,_LOQjjjy] (/ƉaEJ4j-8J>qSOcXb2dRikhjhKryvV;e||,y`?ttt?kJΝ;Ģ(TJ, ccc16dx4F<ggSk+#AswJgY?8>՝ns:%p Jg#o~J(VK&#D:M<@fJu⍝;{fk]w3$9F˚Ngd2t299Iss37C[[筷[nann#77 o}.)**Z8=tagFd<2ۆ>24̦M8йQ_}O-v+]veϓJhll$/|qOt}>7p>΋/Ȓ%K)((CQXXڵkinnm(,, D4<֒Nx'H$\lɔHS b1 RYcbA 2 \.Z\.nYevvi468x,,  P(z XL***xe+жn7drL&$aZ),,ZC%LNNr˭Ҵt l6xtwwO!~jPvl6< sssl{j_D<4j5}'N8ӃZ"Hr:gҥu:vLF#7l'`Ŋuj wCWW-+W_5<|+_!T2==sFFFI;ɱxשgIxo/:FCII ###^bVXZVKVZ-ӳȲLUUEEE "2Z#G,2JpD"cp|E9>>{zD*b 㙛cͪuLLORk8Ōtr,R ZL6.!i۳ֱcN֯͆ LpT:b!??7|]ݘM fff())crr9- EI'S|_&Lr QrgN˵^RQf 8NA""q7g:K=>_폔=.N#cpÍ|+_K/˗mf*֮] ,Y`0g>^xyZWuVfff$Jp8Prfz=]]]Mfzzzȑ#߿X,ƿ̡CXf O=$w}7?8EEEq&''1<(y۟NSSd[o Ϝ zټ:6 8b'c1L$^ȉg}x,---d۸x l96FGFЪTI4!0;a% ca2YDiI >-+?чp$J%,]RImM+X/uu _`ʪ*4Z-Ȳ!|~/_αhni_gժUҋ0̈́BaFGGx<Ng#Ѐngdhp8Lss3MMMd2YD*PۈmqףjLø\cXHRtvvR㌹PBgYlVZ JQA<gbr" B,6j7D%S70茤iA1A xc./omH$RŚdp((ҳo^F%HXrriݰ1j$QTXP%(i v||BdeHe B<p8%ogaNv okܑ߅xs3n[$Ig^;4n̻Z\w޴q6QX|>N̙brnOBOQԨT*?* JM"_~^C!*P)Uq*FE8J;Ʊu Dfgi20y,&#chz)miȼ?Hgﺋv 
QLNLPQY6{[8~X>?]]8w`ϵ}0553͵ѱw$h4jjjXn}}}\~Q^^O?ڵk$Y***aQ(gq׮]KϑVZţ>ZL0 ]|O322,tttGK__Y dlMqI12$, /544JY 뢢"d28llR PYZ祮JV2%f#vG A6єgcnnX,ޣ8QTx>JKK ~$32 6WvBss3dNG*bffիW36[o*.u7hhh̶b2 w9xkjUXtj~osOF#S^^Z&/'Ah4YS;L&(n$bHyq%`y{D' ^ Ъ$Si2?^oD# Jfgc~dImȳ`2ZZr.9~x,#~RW9hҥ1sPk(H&|Y TJh4J *$<|9 y3:T,Npη0(8@CUp+.]N2%! 2xÇ# Mf8&=u⸀8o.N/t&OpO6U\s]sye2i %tv )#eHhnVZC_{x"V5dz`G !-K*A\qY{%.8Mz|Sc(XNO*mߟNb)$:&y055I<ƛodkdYqW(**bTVDZ g۝ MMMT*O5J> DQ$//X,LOgۚSk֭%-Acc#ccr]wqA/_MNgzXO`jj 60;!#K QX@^xjGq8XRW],}}Xl6|a?: 8k֬ЁYΡÔhH$ 07p*yc IDATGH$ x^ʋ颦ΈgރJzXmLMaiN' ?ώWw4x֮]~A\.XlChzzZn'J155EUETJCF6q^X,ڵ B+,[L￟Cm-[9/%.z)C=ʪjvSjVaR0%( ٜUʏ=kݽ,t:#l{bgb=jfDcQt:YeB*OgP~nY?~>GݠB2;YǖFt0R'TDQ$N!"W?GђH&F- .(uu5'?f 8I}}-:IJ399lĖch68y5JFp"*HzFce* E¨VC$n(ruyٱc555S۟oݍg~}xB['5uk|ǏCo>F~~jNi WRSZIǨaTkI¨2dnrȉ|;޹y칹t;7M|T&Z$#e`\δ$!q03qw "4 SQv^sg@$7,T*J//($ $IBOIR&AI s囹IcJƹqD׃Y LZAE)(),)e!~ 6z&d?!֭cbb| Jo>n&2 sY DQܣ266F}M'J}֪u^\, " ?@ee%}}}'N$AENuB!jjFeq::ҲD&Qz]cd"&I;77Ve.lp///q7WV?,DB7ovg3}̷m&&&8v^{-O=sرwߟPSS˅, ݻfL&HHG~kWARK$| b!EhƍjE-*P*4DWZC,+W}I1 ՅRtRM2 χ |L&46̌>3#Igoَ~ؙ̜.:sOa?Jg82RIN)|x )OY^= >X9]b N3[r&$~HvP{V>}mT*\Bfg0ԭ7? hL,+7o IƛO?CE~.T͈$EI:+"O[C\h,9?@emm$eY%j?V @NK8Gy3?{A2|Gq뭷odI ˖/adp)榧22Ir-f¾F]ʲ%hU***)/+!b11$14*Xx<ʕW^l{^6LձqFblZ΃Q8.:}9?fm,iPdYF!*.I13=2&&ٺu+E l60hXFUFܼ]*6'0ƢRP`2-g6.`09Kʵ(lXO{Jlqtt`/z;w_RPP@(Coғ$ffeֳשr7裏r=wL&C^A>** .N8 :t|AvF֯_,ǏQ\\Ncll )-1??Fa줴 555tcҥs뭷v&}<|9V29=D,h42gj1L9s%==@Bm6lv;&<$ lY233u@h4Q(d2fgg/GiL48, X PHGG{09t]=LNNL,allDf3ynHMM CC#X,/>WUY2'RI Db?5L/]IĨU:-c!D=a:=`$.S'J+$I*^.CR:%Nd]=Df42gw6cq%˼ l_j{B`6 ٳg"^LAgB112L x睈! 
RQV"//9;iiivTVVvcaQC=c3,]/WԤTq$ ?7*̦t-۟c<;i9FÒe ظӧR,ť%f,Y"++۷c0xxm@?>|QN'X FT*ER`0")//`0w0QifI}}R"lssDb~uJD4 0==C^^>*F`qM?=~0<8t={`4ʰPKnn.4^C~nui۶mS}^d2q#l߾..j` z=D#zPŔRZTh  ckQ)IqgOPWcvTL82kiiߋ?#Mxkr9:G#|{{ P 7כ&`I0ŋ=z-Ɖc>MC]-4Ң$BAcKQ^,X :|pgpTBMu뚸}m\H4nfP_ȢEɡbG!H݇`?_yD"w\ ];vw^nFFFx 3u[ett뮻.%311Yr%~|bv<J\FGG9pZ<Μ9s.>R)mmmڵkyvt:9z(7l,\[yffffׇLFee%555v-[Px<8,Xb!j9b͆@ T#|NQi =d2ƈD"aJ%V9ƍq8DQ~?X,&+3_zHBaa!xk^9 -Rhnnr*6mc=F~~>/YGۅighx: vfnfΜ>ZLMM(CPsQTZB8";/yfΝ;fcAE%SSSXE=zE ~1}}}hZ^ҳرc|ʹRۜ шKa JÁc΅lJUUb2xӍtvv224OSnvvA:N$//^fttx4ZfXVjD"X*E!{J )*,d2191Œɾ~ tZ2u:-&'uG(41=؇Rh16=h@ 1 Pgs7 D3Z9x`aC7!ISQ×1ӫͬ_nooI\JJߜmKBL~32~@0X$FTqa6 [2$"!",@"HbbI}-iF2L^R) NFfs6yfeb4&#M"B2)*C_?{}A0˗|L& z$ 穯gdd'|_^?l|+_!33vF#NCqq1de˖<zLNNyfrrr(//)z=6lQzzz8}4YYY//p8e||;w"hii!g!''vm… Ytjd2ɺupݔ0>>Ί+H$477G~~>^J8VZ[Q(RrPNىbr188Hww7iii<uF.\@sS3t s$")麟SUU?y֭[ѣGX,SS^0MMMTWW_ֵlذ L&f~v;#Ctv^Hq by8x CQWYINL(X2!&(XH$J/17;õMLdB$BZ` ڦ&|~M~eվ\mpdmZ?sݕmGfC?vlo%2|OJ(3X>N㒥Dz|'cc P3--WϰrY~Ҹ{nZ1y%5"A*9D cz:YL& HdJF3 +Wp$d Bp1JK(D"jvJ[f?fs*WSH2cjʼn f}0#Ond`F!o>jjjZQԩP6a{,D yyFx \D%z0lrlZt<tzmLMM# ِd?~kV&Rػw dX㐢bsdex)))Zq =dff"ˉD# %N5 ^޳X,jɩIlYffgh?ƣ=||/p"͒N"d2"&yɧP>3XȱǸƛj17X$~3-gda]Z<br.?OMm-yD"hMEY?gÆHO#Nvq2oɢ Xb)F1̏P_]mzEuDhumvn&gfuX{Cg/\D#QiPr*64O$Pʕ߸܂ {"1Wl+wR̕`m+;yL8F,<'?IB!X XNARc夓QᚙbIU9Sv9=? 
e0B I71M$j U Q>bϋstuwj9p 99LLLPXh299Icc#MMMhs(UJ|>~R)++#rP(c29~8yyyHRN>D$5D!zjKee%[n~ +V4JII1>vdߞ}=n.\ȒKJ,_ʪJ$ H$223$46 L% )..Rtfbbt:Zf]/kjj#(믿`8W\ BaJpdR/YvI{G;>1ΞiI_Z>PX,Ʈ];1|CMm ad2ŬY@ @}]=PUh4Z =43"#*q<ď8b֪=Iq[%YL@yB(R.L&3r̬4YIA5&'hMZwuqGh# ɑ#op`T*d2֭gll*Μ9?w?X"zz{IHG,B!g5T*R)HJŋ&LΝDe߾tuw284@Ey9i>}I())&7'FMD|}5BP`8su8}F>Lf#G.s7621= h'OCPRTvRSxvPWJm ӁL(dfrʪJ=T39=Mnn`h8A0!q|>Nᦛn" J1L(Jj}(*&Xʪ!QɥQƆMD75@ `tb 5,~x>֯_AMm CCuם<_E23;œfbxh|iW`޽TVTׇT"%LxB4)һgw:9}4DBaxN=wMgo7`xhRRZµrr9nVvEMm He2ii\ȼEְ0!++nB,fV78T*JJJpϻШ58)KAvv6p΋,Z[[Yj5<,?/(@*a6}gp;8q`y:-PJ -Jl;*t*H$b34jm=D")xp,;n')/,Ik`sU0s3JI^-~ ǞxO!ҲR>q= T}p:0M$U3ѼkV":\|XMzNfsąɳZd4 zJD$%rf=~iD*yrn.j=s RN>C0\3#DaV\ɱchnn"JͲi&?@yy9}455!"P q/_Σ<—eFFϣhb|l @@WW$aǣl;;FiI9j kN6m_ʱ'eMLO"8yݲiZb2zсhDHbX!7'44<l:::PR$v.x<磶ی]ȕ2| X+9ڂZexdka1gBRPVf߾46.^wqxP Rsv#`ƍ<Gq\<~֍\pka!hӉ:H(J^nh)JKhZ"a:zp B4ӧP9fY`IC/#Fga} zΓ%3=QtDj`Z~cIHH8x~22s&HbSlT4 *xwW1H}\oGoK7u}#$yѾ= J/{"'׮TҬ?>yϫ@H4cdlE@0d4D 8h H bN3f6Pj#;-yH"/ },[܄ P ddӫP,BSiBKnrn7JAF~6s3\D&m@!MFz&>ζik'bٲF.^좺 \tZ)a>}-_¯~9z0^#G! x]tu^㚧"9Yq233Ag>ƙ3gx賟fFD"̧ʤi533S^h4mJOgEV+8Mo>;Miqp~V-]NG{+jmJC}/zRS1qopcǎr99"Aq6*\4j%g[*0j$D$T+p^V7]`?&RΉ0 gvnz~yF122Ngw QPPHIQ)H\+  d3<8BѢV( C3 $dZ2o ŋ=( 9R!NG1x/H%xrBEFn!Ye9BP2)"?;BfTV.`wkNbR Ǽ˅D A@#S5+r:+D,:;<mߝ y#߿9>J|c;@{ߺxm|s B{ٲe X"F(B.Xdc1b1bx"N|P(B,rp٦ :3yliBUI2LMR\X`99LOD G c1[C&361M\$aݴR,x}~"0d JAOo/GA&33=CF%sȑ#1<<###ѱ1֬Y3<Cs Xb`8##,X@({?9?x壙}{R]SCEy9cc(JN8#L"Ku R8x`J:av\"rZF(**bpp&>J,9,6;2X8P >gG$EPHb4z5JOVVnO`ЙH42! 
D\@eYR9Kv:d:u> ԄB>gG")hZ---|#o&=+X,j5XKyE/ލbA*g◾ĩӧ9{,?0_7!N:vڅ9͌H$`0PRRBVF&Bh5.Y4>FK…?Ǟ_> K>ۍJD099Q*T԰}6 ==q\6 Kf$T/X6Ka~55Z< kj1lp=٬^*T\ZE@7ݰv6p=#e2u8("HER;Rvii;1tc3<5'+jYZ]A&>1Fvd4\.GՠѨʥBܞy>7yii.<+=Hz&=xc$j H>)j,`amOqT祗1Yhn=CcZy ZͼIZZ:=}DI5EVT?1HLR$zZjHtDB-K`Ft7}lU0҄Wj{{u^Voϙޗ@ IDATȤ2X2Q+ň%r*@.JdrtFN$+Â&a-cnΆR%u07?L)GסRCkca]S3*db).XJ㧦|H$\.8~2oׇh0f01rs,[,9998sjN:=+7rC\*&~RAUM*T5J_0B kI9Oem$N69$cq:;{X~=Ru1::M7D4MIR]]*?}<###TWWdFy饗jp}S[[K}}=wy'۶mcv+'^azzp4DWW˖-_"(jrsd˖-Xd2ߎjeժU~nvΟ?Omm-yyyX C8NdJDfសnۆ^gΝz~_sQ&&c```0$.GMs3cW$k׬d2 HOϠR^"6ld0VIx<B! P H$b֭466gO) E,1<KbR@ K@@c lsvz=i4rss (JYy Do'Nfz= BؘbT#6cv{i?ׁF.O,!qRo|l,E600w_7/_?E?!;Z$r  kطhT d)jI"PJ03A!U\ʫF7:v_mWy @lE"Q, ~?>ۍR!VDp<C$B"ev=E<.2Z-jXD8@P`2(rε9@$QPZI8+ؙ,^;wPBy V23Q!9֯_ϋvK[[&;w333CZZf:qHdB.b*rt;vbcT*z{{g멯'Ht:y衇8x۰o~[X|!,~f߁<ď(++رchZvq%ѨT$b1|/K $q֯UÇ7иl%[ )//g&iZ^΋eXwؽ% 9pζY{ 1ζ25>E)L )$u&{de*./|H"8yEV"2 @l&//VT*ZNFGH`׮]066//B[|9ǎcXV&&'(*r011AUUdQI&XPZZ3N$FPX@ÇXr%Nd2l\xI0 8T @Ey/޽{yhnn1N80EvJ3[n]Y6޲z)y?2.#Ur~M]QI?ŌFk8*x湟Ȉy yd22ANX( ;:]axCݻ\~oߌ t ~^';9GˮM,I@&($ Qk 9" J% UKKC1r[oی͐Nı[m,&T**d2Y؎LN8!KdHZ-<4 j ro *%l >JVV̿?mb{%Kr9L&/؅ZBPUi LMMQVꦢN\.(--%H裏|jw/7Xf /͜>{{マp4E/ra/O<(tuu>˙3gx'GǙfϞ=R(j1j5˗/'R__餵rn77|3~;=wv0 lr֬c\.0'N`T*ٹs'(T!gL='OC̥(YzH8F.rAxAyq{ɷo%ǿLaɹzf-WapMކ ><9HBivj L<Ul}p?24JS4PJ:ARI%Ryc)usޏgJz󓝙}/]I&D"7ԁL"+<ڵ;I6߈b[narlݲ^z BKK eeeRRRrjÇ3ȑ#|{?tyUV>J<l6hll̙3}vN8Y%阙H$–-[t{G&AVS[[>[E"`nnl6 199h BR)sss̅P*aFHČwm' p88z(( .7X J\.GדdhlhFF#aYh8HՂo|R=[ph.~3||G/li.<7$`  1 Ģ|`45 "޲iKl޴ZE>n@>G6"J ϯׇwG{WfKWY,o={}қؘ_ddSW% 塅ry6Z !* B͛Q$JnXL} %EXt:r 'a GbȔJ8LNNbdъ/4 Y`dӖ-(NBw?m RuZ9ɍnG= ;$ zN^Db$S)^/*P8tB}]->x! 
ŋ|O"`ڵTWWT*y駸񦛰ڬvm?~}gE\d7ҎXz5~Yjkٳgwy'>N ?)CC,\\.ǎ# B444P()qYZBo,q7(//gpw''(*.&Hйd)gΞF&c[7Dp҃^&03>A@&S &TT %NN>AEU&!{CW损,WYi}f+#o*y;%^9.Bzl.C:"Lhɤ(d :Hjg _ؠ#c3z< ã-ngd!8+q4_yѩ09 8roqJ%A5 N Bc%>tyF[[>MyE2Aqêp݌siV78O{v###LOOL\ DQ;룲g}T*$%%%LOO~zN1Nۍ\&c L&\.g``7oذ~im-m TTTCoo/o\O D@J<VU@Y ?KgZWC*|)6Y!!FGGG/N M>M%Q`fOcK39˱G'D)eEYnyV2*pj+ٙՁ׹f:nHU&ȾF}ٯp|R_1L"N'@:8PHTU+{kvDDoXBN8)?wƢ $4eW9rZT{6 ʹg XYǞyȩ 5t0&Ϗ&m $o*?U3s%a&`dtOxˉ2oU9fF|^$ID"W*wѣ4TU K &g9RLP]V7M./σP1;"!Sjȋ0nLv=u WV}p}(v<hu:-[ξVk(v:dH_}-r`+n:?fO$Z8]NfggYj5?oh`tt'|GI1tvvfI&tvvd05 %JѾd FL.ˮ]hkkɓi8 .ݻٰa{` 1t?ڵkI&RZZ ZZZ|466rY8Njj@,QϒJȤH4hZK:_dҥ188B`|P v%Z[[444NYl`ӉfCE,jJ>GPT* 8d?b2 QLz&(`3R$xt|_gڵHD6ett[nxN>QYZuLLLO|kFWW+Wmwmnvo裏RUUE[[/؁\(0 jӧO399ʕ+~ĉ "$R)zꩂڵtvLi[ 'Odձn:*++inn. (T*zaߏ$LȕJ. x@(?fͺuﻀjP(H__r9QṴcj˅b&PSSde+W̱}1GC@T5y="zϣԨ2ҙ,\)x2A$ N㟞!HOXl:B&nۃRb .oNLƾ}LNN200PRHRUTUW00ЇNFfă'LL厍E"T:PfXBeIxV MMd3V._ UvPd05A.0 ģ_P'^}z~ca_ K("Ȥ35FRyݕގ΃wq׺oقA4b0vq+VpcY{ \B&G3:pR˥ ڗ-GH<* iln|3T׃M(s0u-cߠzA3$VM$!p)$  eE@Um O=ntއ'JY>46l``2<00bB?˗/'Lb2db IDATbâE?=M4ev3;oh\4DGm`3ƒN{{/#2&D,YRDӬjk?^) 77q,Օ Dmq12-CPذ~=B&K U,EN$)K0AWXKPLUNs/PjTZf*5aT~N\muns bgQz .%L&W_2A/?Օ^Y୻$ oEW/~%_Fe G WP!!CԑqWRdXjfJ]x MiFGDV0Sy490k;1GŒ`61Bq[( |AGYe=8{J+{PVUAc[2 Ӂ?O29>Νwl% r.bfGұd)]a9}?̟3paLA>g \@!(>fxuWrIsXh8F(fb,vRᮠM6`a+wN97Pʸ- b2pP[[ & 7N{|r9ƭpih yﲰe1xq&&u205 p;%XlV2,"24Vx2l@QuȔBFH<#/dYffɉY"0Zﴏ]]Ig3)2H"?`l $bmD0^SÚ5ktt.[䔗R7gNPSQI}y9 to8R4ΪE$!6 KWra DVݸ,F ?AU39;Ȋ?}9/r'N 2Z#*3;58V sss8\\K(+$KB$CY W;R-k/WڿLx0(gL]䫩~oҼREiG~{oy"bArP!)jJ)aK/Pd2NjQ3Sd J9b.b!/A!(U*57nG=8֪gɊ9}wep?f4#4*-mŌLLY1%q %6m,l W*hklcbrӧNX_OMy F`22xyLF&!kSSS /@YYԧ>͎;p\>}iind2Mg|O>G^a5 B@.. gI pfDQT*\)Ν랷¹s(*2E$+_ s?! 
>r͂Vi/;wJ }H$Xl)'NgŊJ8׳h"Z[[imm{3<<̅ gy"f3X!(^)6[fddT* #$z@`A&SY`UUUl۶Aز|6ҥKd2,hir4(RZZJ<cAkdT2w&Jr.DQ(`JT4O~O?czKUyb6aglx>Ո `މYFIUPD;HKFD*i*++X,K۳z{׆yL6s.~w̼22ϿhR l(R?z IG`t‘Y M&&%c %$r'Ϝ::IrE8a{B(_CdPI031IS]h?S}LIY)3ss$i0ܾj $ EpnɈdSiFrW^j!)++#Hp]wKs]$$bRe;wrqZZZXԲq8?<CV˂ |\X,F}}= 'q $I֮]{-:NcٰZ\.z{{Vhӌc6A9rt:$aQN>ͭoM&N<… zfd g>l6/lY'$aZCVSS]s/* *EjZ=\ F8R T3g"6;I6 }\ⲭÙsݬ^@0ŋ9x0oނ$J<NMe̍ I"<Ǡ J"T6K<'QW]]]$Lc6NzzYr9(`3ȉh&ɳnf9Bo$Hٴi!a^,OxYXBvz3|5[' ^fF`JʯW:5>oxdWz`g`F0#\tRɞWv p܉JS]]Ʌ^J]."0hBPjHfE:=+m`63v$([oADFbel`ZGsM~AK>YVIL;rVVch FQѨQ5(*@321>Eey(p\?C^fML3piT&gömɤ2XV&G&1[*U4Or* 6;=}gIBaIQh/ҵl&DZczhZ.]D[N>αccժUJ(bѢt:j5Ca֬Y@[[[T1033CMu5sssd2(C×ŋq!JJJq.8qw7ndxx+V1坺ގB b6q466z)///ax&QT!H?ϟ3g8 ,(?i4$Irb")jBuUy)OV51tR\>d,aY, chjr,j%& nr1tF=S>/ё4*)/K#xgH zyk;c.>A|$Hb6YS*I"psI)y ◾HSSpMa7rf` L_kzg~QDoSJ:5WֱޮD#KzSG"I"rӧ@& kuȤ5e=CS/C'FXؽg{4`BcTwq/iGF@e׋/V˒DZ LF9n&|,mB)+UbQtZ5o!|QQ*^`-\.Myy}gvOee%\֯_ϗeΝ; 7@E>OK/em`[!3;;,O?$ =Ǟ]ǓQ8q FB!`p\.ÂօZ0c2QU,URAgbP8r(u5 _aU:rd2I*3gAÜA&q!#cXPt:$I"իIR,_BƍܸƂR2I("b28z|b7x˕i"(}} ŘBIb&D:4v L@ ,+CJ񎍑D $#1ZqjBPLN[[w嶺ogd2z+j~|CLLLn2jkkikkcinnch pBr---wA.CTH$X,HD0dbbQ"@L&î]~^}Uظq#$a6ٿwFZMmm-SSpP(Xp!D5kFYvmdr"h`j҇`"K;277(ssAv;! C{j-+:W088` %jj1:֬Yӧ9v_yر`0`vvp8lF$IbϞWI'|lZLE6&&(//С,[$; R"A&Ja7Xֱ&LxGl߾FCCCC8}}}lݺdǏEػw/?ַEqq1=6mٳ f3xǏ#3y^Z[.;b1jjjD"^ŋRNM7+WdjjE /`3& <c^ LOO388Ȇ d1sIVZ('Fp˪ 4Z= p˭ßF3Q)Dq@^o$ 2==QoD@'rҎg˖-75*+c}1wGCS#&*&Fǰ-|(g?Q3|Et03>ƌgH8HqOP3i[{"S(IĒ]\aGQz ɈNԑaxgG IDATLy(P),V=x s1Z%`Fn'J!Jt0rj5rBXW)W\zw93biW>VWry(CAK8$ ) (drz85e:~FTyQ*48LQj YJJݜ#'I76♜g:9{ߙW+iUݽc`0Bq08!&[eYk%V[3;+"/dٙ9gΙ}s?s=LLe| s߭#Ө<Q$ b3yu8} H% 9 붳bJ&ǩ$ظiG%Jꞽ[Q9wW_g!úu:˖-cݺur9d˖-tvv>ɗevA0NZ-l6:;immeTWW )))^ k׮o>F#X $7]bժU;vUWC=ĥKF$.Lqq1NtjރI$8>EۿyedC÷A&"b8Mt2 ܼK,YL6V^(grrh|(#1ډQAT ygqhT$DB~t&3GζqwK*L"slܴիV!R&P )HQcX?Bk;v*oI_3oh? 
f?3 H%1Dё!:HƣܰVU!S)a`xk֓Hƛn#Ǐ`6Yn 7gl,NtK!mpZ,LOO)q+8~gdD9Βщi>(.r153w .g1fF:ETVuLNcCCC_x<5W_Ao`֭?d"NiIbp`}ݻo<n SNԸH$B29oP($IT*/ ˚[ h5:\bGP*T=n.W1 @0HEcTVVQ򔗗ϻoוAN'~ΟG)..HL]M=pɩiffHXVdi2=n]=ݬ߸]W_Jaͪ,rJHgW5 d2Y %SSٳǎz144HYi)'OIUe5ݗzPDQ&ΎVH)if`USfG)I̅1.$WZ=l[yC>?雜p(D:!}J“$ RL* (T.i'Fg@Em!)U sP_W@&D&29 !)703my -::I$F"=brrC8y4U5%,jZL^'d1!Iyb*"jQ2_`flz=R6Cˉ;KKK DH"O7az^Lee%v_oٳy8NjkkGĉx< ŕThDEn7T*9B&! J1^/2H$-rxPW׬ZlF.K%Xl7yf|gժը4lۺwq j %tdi6-elrl6(+_ +WG?Hl6ns,]֭[Ox.f##I|$j*3Yj*PMV!/HC8aYy#F-$YD99\6.q EQ(]|K eU &lVr @KuϻJd2N'ؿB&Jryd",M,B9?ۋ]Q#Wq{|>t:ݕEl4(//21=̹Qb\JjeڵNA!fezlA$Z4 ݤR s yf}>BQdHShJbQdD4nkI.ȯz@ D "eff6T*%OB.s7c6Yv-333tuuQVVB K$kX~=wu#Cs q7s|~f#Ml6rqajj8| |pΝ|&0>:J`dhH(D&"q.^dbldg033Cuu5d/`0iZE<_|0'>/_|{\.G("L) DStBᠳRB@Q݅fŨ5059E*0[ē G#_?{o~st>I22GŨVQ)`43EO s9,f+fB6ϲU+Ր4Xw9bfg9sXz*| BAZv:p:/!Gj2*5r Dlv1fخݿ [#;y(Hy@@kY&#A`c_)иd!6Ji0I¿펛g貙*A*/>/JB0޾ i.X$PjL 2I]*|~R@<ݲCC0t$1 x2B0LHO?+VQi4Qִ'r:-uQqI73>>Bݍwz㇎PZalݵ;:xwݍ^cfr,. G2L3rH(EeyJޠZQ*i&461^~e.]$Isϱ}vFF8qFÉyσab|R(r8w%e?7FlLLL069b41  TK288\.'[DX:Ny<dM;Cr$^ ۶mȑ#w?ÖMٻw/h@P V($ ,ZL6A%H W2I"p|*Y\$L g0Ztwt[bUcZZ*ʸ~rҾl&3XARP2FA=LMMc3)&2*I%wm\f+K`TED)JDHf $Sd5,_}(KkQ\ONiꡐQ  Rwgo~^:/CwGzo 2MKz;[ r33oѢdf jj.f+e[n.P dI:wQYbaPXjZC,ii'>L%ɲEt<<O.p0@f1陚 e2$br,B>KiYd0DVP dM)\NjLA:S)"Tϑfr 9 GhU83b1;]&^xEf_s Qwl'x?P( QR@\C*t&(NDc0a*LL@1sT`v8ӅT慵DA( S y;jc2K1ނDXc "J!"PߏT*f23^龄Vgxtں:f>6nZ؉aß3k_A&I,_n>Hhllq֬Y3[ԴՆVAnPTJXx2@Og'U,fFGS]WG:!aw8%QyPNǙ3ghoo'_'Hh`ttIUUNLl޾\nDM]=U\Mhtz6o݆Re*Jl66Q~?`ŋػuV+fΝx rQpRzB!^{Z(P*lݺ5G?@WH\w58HCosgٸz;wDR]ZC$?rcǎdY"0tx2(]QmUTVEʘFӲtY3zL*HoSgxcvrl2E.& Sgx45Tٰarni^addis3Ht29p/@W7lY H& X%Kw5p0 bv9љO5LG]wŇ}l].}SBB]B_x ?ЕX`f_-7$rLҚ-]A*\ pGYdG.-;098DyiD'O6ҘE ;*dJ().` &39-˖᝞A0===_Th6159L`rzKlujA2l@.=aJ%chj/ PW^G@?j| >|oT*MQQ< x` c˖-,[RN˅Zp /Byy9{evv>DQҥKXVΞ=K*-ZqYAz{d|~9|06obϫm6~_cڈ%ż F֭]key EnVnފBD{ﻏ|!G>Q>~z ߿JڈF$ \NK/ܹs464ɇ}LI淾O466}U+V! 
"|Din5('#$%U|JۍRB.W΃Q;o7 Y=!Ù-8og,鷾\"fIb2)$j p!:;)rXEYRrT[K.De D|>jXi*)^}y\z=1H 1֬[0 |E~ìY=$6l$LqͷunlXˑGejf-1 Dar9:(΢R*P*tvvO~|.Kikp 7!EZ=#lٴ 4*"i'+JT$K<>}h,SJ{{;55?r&um1"YXMC&D 1%J{q9v39>ZbE+*j H$VP0ʢe-N{g>( ^OX\WE7g>E5@c8tVI4&aq:Yv=7|+ P(/KۛƓL `f,03o;?o2oCͫS~VK`믽lȁZ.s9IQ"w1PTJ+lATUTPPG:fQS&gϱjjp=|%a0$YN:|x ]w1=C˻_r7hG)));nQlB2fUW]E:jR)`0zhlhBjeٲetbXBUJۘn5Ҫ5=zglݲgrw0>>2z+Tŋ+d2y͛7sQ֬YZ@ذ~=+WR5YںZ9jE`ٲe 'p\JB$D Jl$JQ)a/"Lb40̔{h&nۆA x$|> ɱTZD<ȁ\.i0D1,V LلQF.Xr:Xm;[٬|e1 rͶ"} <ͫSZSKFq4ėHtB(ΛBe›f,?2YYL6nIdUYx@A ߰Bm"&g  ͠7p{\`6"j\س*(gzx kWafjߴH(B>cL0 IDAT.~? 044Br Y9w4asg{p lU-+g> }$ VXݻI SQQ/K里뮻͛7SUU5oB#(t2"sQA_?. I\ mm=SR±x< uVZ[[QT$I|>Nh4ri,YBoTP4%DdhdApP(d2رVGC#,^C}***(..f8pP(^Q{Y֮[M8Gk&s9~~r bVh4h (J$zKLF3c )T27^s%gfRSbfJ%K5B.OeI*Bl&GI$$ dbsX,&ff؎(09y 889v&;xЈ2iffT6𭧿G&/Ph4RinB/M1ﻅx3̼Z~oL!Go1s`^:p{h (TO0[LzgQ Lj`nw{8unzK( F#tJ%T^\&'JL&DQNUu5P|t#J(C0[t351»O38y$uC!vL&[lAVJ8KDYi9sssqJCAFFsQ.Fnx<ݻ믿^x+Vpuq)p( d2TTTR(kl)@Ig^b!NŘ -[F$S,Zn("E*%EEETIb``2b(6o}[lin3;;DP^Vnj Ov[2A<' V+/جj5ΞGTj5fL\'fc2H&?Dĩm2NıZT*l29DAt8VNO^F_\'{YE.YZ]Y"ff}m.~/RP0m /O狍 F$d JWpN&/L BiOo2 {sNB^BeD~Frh v*{awYњ[n--˯!@QȡQqc14*5FFj% aZx M $͠T Eēq"$PJGͨu:9\n'{c8bFaZz{zW~A&޽Xșֳhu: 3gpQ9;q lڸ 7O8]졮BI__?od.:ڵ8} ]8˭7B*B1>>et: tba׹v׵tvTihlbG'C>}ޞ^:~knb5Z%`0ddhhZZ/TWUQrqאN CڱUȤqi7-|dt*l&QhdsYrT* h4BB>O:&S^^I"UF)WQ@iZ2/Kش}oT4ɠ HF<&\0۶` dӦ PSUg& bXfp݌M31 }M!jP'O;,%0G.(8z /U reUv؋\\3,_ҿw։~%+RyB7͜|@! \6Ht.E@#&搲Vr$c㴶av: xQS3t>́~2<&ʕ` P0<KKK }}}0c2H&}W^yJ;o{3yv;2A$J8y$<^z{{Y$dEUE%t'(Yٲl6J"bN?я0q5%U;/}Kh:N-w=Cg[;jǎIPeSueեl߾*.bxdMeE VI&|}}T*`_?+W$c2N6AVхjGo0qE6l؄l6 fLAFL* h4FiY$jO皫o`˚s93~VU)6Niy_$v%сnannՌ?jPT532>nexvJy;ܼ}*DvnH: X|s r;7OqaƋg[GA̤P+WҼ$k{gǓ7w}nBGOSOI&Ͽy8}444di_Į;箻ٱm3Cby {errΟ?jg߾}8N.\L&#_)EUuVN^B!^K8D"\.EMM \FDDi/شrX #SUQ ٌJ 6.ٳ9\LP\TL]Jow̡r;A8gXd*C{% z=2΋;>R LL5Z%cNg{J򿽗p!m 53owSg7}P\.\_Q(`/ ir wYtZ`t&G82ǝ2Nϡ{"G! 
EPF?7T ghlŋ1 2," R+ڬWNǒ^ d KЙL  Q@Yy)}}ƛ H5DQF#^P(ĥ~::;eLNM,V O8;l4s!,~\l*C(|;c#c &($iX[ڬ$-e%444e{*9l~Ujjju߰s1řoc`tζv]}oI wŏ<3 ϟGYd kVadxFdH6l2SVR\ԙ3lٺ AY- IXF>?VO,ET20OqQ zH8$6|d2M8@ѱk n^Fk5Gsql-& tAf(*A7n1=9F%OH3zZj>?6n$#:T\ꣿO$!Hq6obTk8{JQdre( bx.PII STUU#x<(eJznFf{Yڴ^݋⠻-+A}X Xr0Y br귾*5I&2 X",;g_8 \.'˒LQ Z䀾~v$1#rٹs'5u !IFI贄a"N<-[8tK,axxf(..F()/Ɩeɒ&'q98򯰚Dr9rqlRÇB9s׋gƍqg}&կdhooꫮѣL&T*?0d|+TUTFi^y !"gΜR555t:.^$ڶ_$ Xz5Op8D"444ӟ;~uo-[O|_|{?9<z+pgΰqF/ `$ .r)+8=Ln}s!V@]^=snf9&ffE)z)upPKu\[^V;pŋ:XlXhikcllE38xF@>l6`9T*{x{nx?oK/Ȋ+6*d:I4!Jq]`w!2xkk 'ڵ%]`t禕scŸ!v˃υr_z ~'QoCO%#~e hu&dDAkΎ"lA4_l_)ZڈF|?=}&t<\( M}vҲr( VN'\Ab67uUUů}%Hss3 "ND,]{ꢧ=O4%L&}U&d%=~BE4arbAG(H9,l۾B__UUyx'w}nfgfQ5(*ijlG;n﹛}^.R:/~|+_?jt:{hnjt222D:3ɤd>òeK9qbZ/Ē.];f(+' #I!Hee2/v|1`BנIO3؍FE}'WLtfxNZވ$8J9r?;|2RFREg^XOđ1 GP*Epl߹B*BVz90xZ`1 %W,`33 |:58+'V>zYL@t)ug-?Oچg7[~ F ,?Ϲ鬁XALO-ɻg&I Ȋ &YYI8+!k-Nbt:bd*hcsPO9̒Nz!q:649(|B>,H&0trx^#a-YA0}cc,[T*(,[~}t-_Θ:|n|&` 8Ah2jchtR/J?䣮X,>™FV Gc8pB8/k֒H&:B~|ήŌ-jJ4ZLc}x镗uLLOijnF$-- zlv+<+:^m۶fZmA,f3}{Yvwʭh,ʲeKijib7uﺎ Sxu{Srb ˢP)hni&bpj 2:]] y06T&A&d2爨 Fb!|U,`1;;C(DK2j|ޞr9q][Qҡ vL4d1*k86ԋf`vvFon%pII$(UD2nCUQ^QVa(oӆl6k0=vrv؋Q=450?HSmd QHdRNM10Z-Hőcǹ{QX-$D(|>ς 8v O=4߿{:klnj Kssp*j5ZɄfERrՕCA~bӁ,<Ô PUUd4sF:/~ z̈́BAnFn&{w\LCc=㣼e .vlCFbtvuǘ!PYQ1 IDATdvv߇,dR$\D2N4rH&N3</م?IR B qP7|T*^o ˣQNulԨTn>*|Xw ;Y,Oyj6⬡=5Tx6QnN" 8fd@י5zҙ aY:"0<&("e2gyb|jb0')ͦo{/HH*}tv,cll ͉(F2,jt.]'Xt1jȦصy bh> *5 HHMs{8ׯPw|9@`:;F?eeeT*ZZ9tn|1ŋ 8q!f('SYYA]B0jP(tF'ǩFEA`ppVR$Fh4^'Ojb1dEZ\ ]"$9磽^w-T4BJA"OMgp{cQJ ,hfFPk4h FdJ?ˆ w՝KK&XsP]\,V(Ys^fc~iRӱ,ڐ ±8{͡RP(U),&3H ABVxj\+PiZm0o9+2g13Ndw9g nDpgﳳ&GYIҩS% (JZNǪQ ,F1k r$.eegs0|ӟUsTWzH&b$dI"R M& ,ss۷G#8|@:G1 rl`ou J##,Xg?%wmX?BGG'jDQ֭z<3yC0l1.q ]vcǨF,Yl6 Et:>^ʉbj1ňS,ؿ˗-e˖-,\OCCpd>!Il6n&24_vZUVt&$JA?=vLkk+>^ F(""@TG<'xxe+}z@&J;uzJJh:8옔J\}}RvA>G粒h4 LLD BMu-Pz6;\(DKkzt2Qit|sR֠6P\o~kYoEJ^޵т]JQٙGE,1umMĂr 3dOʽmv.B&*+jyjTȑcr&bSST9,j5b݊TMˢELRQYA"S!ɠh0DINtd& (5jf(Er@YM%F J#c,[VBa͝u=f΁w+>g@?I>{FKzm}JYmʕ+񔔲aR0U%,hl`d`JKXd+891A[bl$J<}f}>'QEbXS^^^jrPҥKon PUk "-(D N$ 
PYEcu5ӣ"JY㡽D*h4b٘h4r(++~\r%y&&&8v:$4X FC>grz|q^|E^/477311AP`ݺuB!&{졵g}7Q#G?ʪUXl555tuu1==8Zz+===u(U$ zzzѡaBg̤V+!>PGE,_/}->85d%$KkCe0@>G>$ dARq>4 S^V$( tZ bt2R 0􌎏!ӳ(*fff1j Pv3=9M(& )HZ>,]R!L589|oľ=)-x,F0 R[[˫_nLHӔ%(U*.7eeeىFa͚5!( t:x^:DUu-P;N1N)F>_ ňFi~_c:I[K /7Ԣ(Ou`21W]E<:QK$Y|' \tz ," ^~*+'=d9Np"Sxed2Yr"2`R@8D*x"m݂ըFuz&|jmphL cժIF;J]}S[_O x,N]S3Z]tXN0c.NRS]M>E@ H(hw,)h4b cc%`0,ZIu\qZT:MG־Ȝ"Ly-U_^cJ߸V<\ JGV( "=qͷR^^⅋iiifXVXtSy"l<|iPb! q!$I"J155E<'ߏ^\6ꔊnO`2طok֬<555$ VX ,pCxbY~=hJ24WocǏxrj\.ZĖ-[E]Kؿo\q BH^N/zVbppܞ:'~A <$Q-#ˢiQ:f# HC!$#t5cը]^'`rXdQSR&,_щ1J=^A,v'HD!92D6fEU32:hEc211Ge4P̃BS44 bbtW^ڊRLJ_hni!`j4gL52C 3 3?yDxI+P(Q*ɤQX|:ˢ( (d|.y^d*S@EF%3i" 48 "I$:j y\FI ?UcRAxfJx$Jc]=e(Aۿimm`Ѣd2Yk0dY***[mV3dAbC,V*-ӓShL&}}}~zF(x'Yֱ1"XP+42>:Idqڷg/Z]JXȑ#ANv8yꩧH$aDQ)#M`Q*,Zԁ^o.BAdff}Zu_dfYgZ܁N(r cff890VW^aկ 9p;v`т(Tr"gOuu5JOpUWa2Xr533Ӹ\.<% lH$ n'L255bavnV( mgǎ، ;lf $@Nîlazxl4FXDnwrˡǐjRPP]YbE#(ièa:U6P*A`rtH$h`f:wE~#aҨPHj w(XVddDI~j<Z~?tytIe"I?l69}No3cjbH6E%tZjFd*Z&LFP(YXFE&37z>l4H%Yli$IB)()"&b^$ a4Q)5" N(NRA?G!]@S̱ay?AQ- Dv'kWoB }c[39R^^GyYIsssb1BP4 bXD"F,bP([bfy$鍊 t4W*LP]^N<# @&f5rߧctdl:͛f,_nYb*{9֭[ p8xgttwwP(cff]v|gyGCزe K.gnnǃ[B4ejz.߃bbr;@e6n܈s*'H܌gbbZM{{pM6FGVSOhxy˸\.FFFFPM" ###gb^H$B}]=~ \H>k7]r kfvxQ ЀRIclV$I(#LҲ`yQd᭭Gꙉh_99NO1(HX$-$1n7C'DA҉$ ں:F,h_CyrD]/D&l㛝!Qe(t&$˯fe]YO, w|B8s9t&VE!"T|!(y2 |~E6;Z|>d ! 
yN+:T, !f:  d P YrY"br zLfƉFX:TZFOgHg}%4= Z@G[  BnÇz tu-edx ɱc$In7 E, ~FN[Ͻ@uy: Z/s~JNC`wXd2htjT:=3ShMFNDVh8p(s~2J]]c0Hf('TV}V4*5JC\I.dDQB2-5Tg=z'MdSQ Ç1Bz#J\:"y[1s`朽}S*My2 ZRDTq)DI$ɐL2F F-R>M*`,S)2QQجvN'rE@KN1d$: O ,F R GAiȾf|^J6\|>O{{;}}}ȲL:fa"8mvzJ{ JPkKHM3HR"tJ m-dIFn_LSm x=%Hb ?q~p lN'#XMfn&Vȑ#\q455q-0338|>qOss3B0uuu$ $I:茶rX}jJK=l޼YСC知chpϮ];bjj FGG Y|9'ٰaSSSwyl۶իWo>-f!I3TWWSSS ޳Z,6N rXf-H>A\.78NΈ544`3[hinfyņV:=wΧ QUQN0'IʥPh$)DAt()-CTqVU|[EtV O?ާ榿D$VQȤY{Qdr(*nP`0uW١z&nxtwmFZɱQgsXzN7络agN)/U1 oN 3r9g 3HrQ,R)td*F""Ez- 86~<ЏOʼnJK\8MFjk[lجٴ2yu6fxJ&E Xł(1-(ds™cLR\..tt*FJ60l6*i(' LN'91ʮmۙellJY`xxG'IĒDB!FGGf8\%?~;3|3~F]Zdkj &jՌY#87UU*HDc|~:-?h$JF[[<xkjV\R䳟,wl R4]z9O<$wl&?.X,N4ȑˌ3>>֮Ye^BS}+'{{ ˖tf _J!PYQ7NJsI-myI\nGe[ZxGxGl6Os`N_xlLu}Mry;9Q(+) FF'0YœDE8ʱzpWVqQ*kx嗙eK/b3yuv,f3[_f53;#O'y_x@G_PfRzyY~äb!Z[E2xKDbZџ7_|w u;_&Icd3t:.wcV}i4z}ܯ0iu*@)Q__N ̴WYt&˯nTIEK 9d|Ѷp(rŗ04<Ğ}{7V[YIFKII fq}>x'Kk8qmmm޽K.ހ_:  {@ @@y{عs'pdr`mݺƦf{o{d2$ja***e ҂ ^O$!̫76jj  ^/c{K9y$^/RD шT,bŢX-V>;#VtBA޳n-vs`:Aƪՠ,+qԊ`&`(Fie5`CNja|鏩oo"#p9~~޽bN?HcGXp,R&ᴖ2Op2i>obUYbm'lM.enQVBe͇~dJ "BMS#FO>?2VB )vL)f\-1o n[R~X9>} ş"Io=vYo\i;E{J A Lhd3twwo>nf6oEIuu5-MhN%XƷm{Q,`7Zf/%N#g駟l6S[]֠' D$*qۭUUdv| 2ܽJ/dĢL"eqxRCPB&MVuvYlǻar|9C#-\ȶйxrn|׻y_`9w/퍍% 3HdyRETͻv_W>q:Z `NCX&MPɈ=`֓f("a50p.aAp1Gk1BКD"5Џ@ \.[L @&E b}wþHj J{0/ gȧf|kfΞg7_KA*-8wnF208%(Z@[ooVg>y;1[opJ;C("#9.(Pp$^P̑fTz$AZz_<he\UI"a40RI(z1dQhTI?a4nV˫/LY:ǨRbZHS*9hu*_*x?dMĨF!Kʋ8|c)5g3qj=ZJΠ *u}\QNUB1WdW ܱ ̌A%3h 0;êUyuwu]'(**5 A HL<S\gv,XO'HlPf\xp* ,p%Xj9vymn͛7}ݻw>OF>h$̦K`Ұvj{2::bqa^vzj,\K+q4j CC#lt%lɉ'p]TT30Џټy˽l 053Ie,:s{/s^vvve{ve)ҋRр1h1"" Ͷs~@Ш) w33s~y*yq 5֓ʤ;c\ZJj%Hp:ͨb&8M.dF0aa:Q GhHs4bBEqK*8k/'86BIF)!rOf φ2H4jZ9swFFd _gy"=iq^k8q8ND$c3˯r dx;Ԟni %S,k;?K4(+u ^_T)>wPR4,f|vȿzĨ@i~}fRȒ YDI@UJH+T*"Ie4|>,DL77\I4WDY_DAAf~WwjkJ Ǯ~-_Mg̨y}o+)ɐ)b@$.`4Z@u|aNk9h:{9޵R-Vc5ɧVT351AƇ'h pó,hj!8;Kgg'?~ tEpj24UUuP%9lDy/hhxm~dd&K,Gh8zw\^A,imsΡkKt,!lf fd:P_F0_GM7ѡA!;Hc}>R@TX" t, +Iz\^>d?e|u=;fz{m u6~a6mĥl`0{2۶ne,ZA.ۺ}hnhfjvP0BTBKAJқYxlqN[4eLLarxZv۹K-hCX֌XHs[)b\\2. 
x}tt-W$_Z:BɀB2IO#cb6پv=ݾV̢3g56l@e\v3T٢Wl6;ϒDQ*uwul[RCQi6Ot\7˩f|`XBrرY:r"W|#_wyedqzih#ԉt\Zby;ҝ_xlc\*7dN2,9rw]<](Jwy455191a%fhYkgza<*~&u]1I&f=/s>ϧ>' sZVBCGpjxqjx<4W_4gς`pM9=rGNqsхId5?@P"LniPD2?}!۶D:o5뙝 G1+ TTr~#VJj9̙E$"7>Į . pV>``{'\s٥||1-كr(&c˙<>?Dǂ;Іb#[|qѥ`w<h2FZmeDIz̼!xݧ~U@3oLT͔J%̚AV7p_֫ȢLZDȼ@]*4 A0eeɩ)_/~ <^Qdso7~S1!U ,ph64kD4ebZ /dan5@QTU&M0X,F`|VF^a*ΜoB,I$UC,}dJx&YXyWa2@Mx7,B<䁗X: ZKQ1EI\&F'wP'݅h6S4 md*Dϼ|ũzZղ,PE I :l"b+/؁E2;ੇf?=v#Ir7CրvFy;N{_jҗp t}lgaZ,V첌Ӭb1FÍ^)$P(iȜcj^7>R&o3HM6ߏ׈D! tuf@E֮^^131I.a Kmt25>ԩS_iZ[[D|>4BTf~~/sRAE2E ESX,(+V@ebeRnVo+##ChYSq4㴻T ʹ=w@k cr9.~143d/~n5oē/ȡ1V%*"$LHT@Fv6_6E0X8s"sxnj^C\Naѧ?^7dQMu* `ݪ̎OOX}J%@\.#&4Q@ӫh071UkX -cIJ&OG%I)G/)еa#' h?gGQU<3Gx6AêI/A$fE$nfv~kǎy&Feqi&ɬ2=?P ޶<tO**dё!zU2-sRUޔM~$)3waf@%* FxcOOHx7fQEU(RcjD&[`4d'Q꧹-#jfdgg M,I%شycg$Ŋ,XfȒD*#M$iL0rY^/lA'۵z ôwEennլ&$ޖVREQ4O{{;CCC ˢj(hrYDQ֯b(H^#`u+ "}} Kx[se)T^zi6oXQV g_9Ŏm[Q-*|Rۮ{+##>MW{'^xW≧Ȁ@XB6q+<:A,0ccqpU#qUW٧>[?VsdqF29tH m&Z!֪ Rf5#KlZdx k!b[.R&`nf _ügH=rj mVΎN{@'OrΞCUTz-,;x>L&&r oYx)M>dt.KTl067ةwGGs{bU&dm^F-Wd;>HW勌zmW"]`>znI.*Ls[׾m0\lji,Z 05o]G@d0VC"Jmo+ot{E->&&FxM6,Z7?D(A%SO?Q R=ŭj(^*07KsK3\--: fFv7!m:DklT.NXZXZ!O9G\H"ɪ/'y[-?3ʶMKdIfFTAب QLDc G]CKx&b5+zl\1ة!$Q/ſ+z\,pVX]/.i󳺿r@+c45Պ,iD 82F*e#Aoۆ?<7s:tv{x0$[9z(_WjUTj\>fzUfRTdD<})z:xooÆK=dqRkVq~Iė49Yq]BF(R,1LKpTpPY^^\PTuUUfs(*ITc9HuEhkO|Cl d|4}>^a6 'JE\B #?3/wC?߻G< GST{]X::,A*\`2ת,RiRpOѤBSXd6S4(gpGGl`_$m_=p ;N}!<((c$ *BGBDm&Y(`eꂉr w{ޞnNLOq ~tQ _^]?d1137{4SC\}D"DA+$RQ0vZ=E]ġ}رcvZqC$aժU,,, j@\. /%\B*`hjκk{hw>ӳ~N;ζ[#,۷n' *2lL_|!>7geUH"9/-*lN d>φ-[O>,C>=XX/;BGw;ANcI]'b2PzST), U0 Tфj] 1ӍX3R+ϋOqh+xxM=ʇ?,P*i6ivl `[m~,TbV0tuDE`tR6[=wج8^2(3>>|>X m,<ƚ@ R1K6rehzwT&M{{;SSSNRfI'b=,,-;@ m7ޏ^& L4艓罷o}JHVjNp:]d}"3ޠz2Kj]Y81Ι03Bg?W?s2Ưf,Kn1 &,6>sL&bas%ӹ^{y;r"),+yEC<&lHj,I8V^xYBETCyi27ndv~Yq9,FTA{l&,+( 1.BdtP,/,0N4h\P*UDb@w_'北޵Vvzd& R<$>UҹmwB-"[*J%Vn).FF#?gLS+UoiK,G)j$I<.7r DJ%GӎaeI6t0r8NQOe0 JzQ34yJw￟\]\+"7VmB=!K >JKgW'N33>w0?3C((r10*&DNR{f&QX,`V9̚L2]6!AEdefžC {k'_ؙ(ښ;=}kD*AgT*F9& Q(i X&LnT SG("u6 d YUds%JLT$GTF4=aH"uQDYM|aY Co4bِM1d?U! 
fyt=z띿wDfy;c^_y_ K2zJ(Ih @XDUT* zpޕez^GGRhK/c}ڵ IjpLHT^aTʈh1Zn *ap8!4Mݻ%J$.wЬ[ TU% z*f-8A.'T݁Q$4#& K-<"*f 257KS1|m %R"߹>|'0+wIL_d*Xby{~=2.?Y]LQ$IHg2 R.i(vLhv|0xpuױx,\a¨Hbhjr323Eۆ >}>pVL&h|SB,o>/[W$;d"[/@+^ssxNPH,rN==MNb,m<%B? |I%zzYNhcLNMu6~|ٸ~#. h2a:,s;nb=lٲ]ikkCU$tO(jRF|x=^VaZY .7 ֭]8p~jW&L\ZRE8k׭!^]Wu316lbL$hr6qj|rϙ!HgJ @+F<"H5W^K/ɏ~\~e_{SWq&::;:*1:9/̩tuu~DY#Wa16+lU6XԫU6_(Jtۙ]XS+B|\ĥOSpPk:3ج6,fE#P)) 5 BM尲vj{TeFƈ HV'n 26>Į/IZ=M4Db& k259U+vHcPDZ% |VKaڻfEfG'wѽOF]oX5$IX)!k++(0~5?%sUR/$_M΁2+3x<RP$ZhTjzC]QdbXm6j2 )I I 7phw٧ ABV\.c;;p.pMzlTuUW3ݓlڕ(HID 06~&lkMF(X!$JJ]mavfvL*tE9gPg;(Q W.Z_`~~p{B>VfKx|_dӦM<p 9rp; fdYL&X,U$I=z.l6K(%i45uBffDQ$Qa):Ǐk7`U˺{K$TU#|o'L&bèsajnfGQs͍.L*۟"YXn9lEC4QV%ZcIԬ]ΥkVX;2F@Mհ"f`i~ÉPo`:Y!+б,۩.Y4N%hgnvb;vP[ &)44H*oCt2ˠ-HN'ūVK)Mip+&$Y1%|.j}D jls0IavzC4ALYsQ5 lF؟q7 ,;_q /s ?a`Fx J;?*af^}`Hf3"jYnlCO|Gyranr,6b~JDU?SXf DY .\vzS+(jR$3.Vavhϼ>}89;MT MxX,6; r%ٳs'{d>8m(6 A$nl$yQ, d $q1NgFG9ja(29>Fkk+Р.H \,&x7eG(kU6\|1w<9v t|7"F|2bevbގ.fg .-"9\N'E 44_}sd**.yV\/|P1]X`4Djs3]V{1fq[K+cE뙝bll ˅ 377G5nذ;v0<<ذ~=p[lriiifa fffx]ij6 <^V+lzg۶m,--!gΜa,nYY\\lF闊0s/좯o`/Ȏw߈G6fdVZ$>#=5ˬ0{vp$~H{O޴h2Ƀ>{ U^ͮi mvDY [Ud@HD%uBNk;(Z1K/@&Cz#!$0(.L& 2=?KKZ,-r,iz9r a,,IZiyL0Y8}-k2+bI#6 hi W1ް:t-xj jP Scuu07KK[T:HI5e ;;;qS5j\LͱxG?Oˑ6Xcǎ˕)Xm6F Uױ-~!f'/~3W|s3ۚ(0{2f 3_Ow0""BSC d*FGG; utMra$ )ERb/~(f+lX,&[czS"[`¨tw{f NzsQؽ>=J#B2rnf@K b^ȊzfBkW7|J 6U7- eUŤX)kVXno?© $Z{UWrնwt0j5&Y?U'FT,d$W_/ f%(10svW]~%xe+WerݵL.Eh}R7K%Hu{:yMsdb0'Ί+hiiaӦM fffذa^H$‘#GäR)nvy.22 CLMO4 ˉZriKZZ]N |>&Adrr'N000餿qq:,[aXhkkCQ\`0AwvxXv<.{쥳b&sɥk99:EZęQ֬YOb1U2[yZȊ̥7=Y4Q2$᎛qaPP. Ie29B]۰d}OLp \vUEwbcdZI$y7vfIN`XQ;~r$GTJ:bE5TdQڞσzʗ1SF9/=+WׯM{f[@_T*V fhT*V+d憭7p!0 p0CTfUdL6Q)*ݻztwu:(""tFU*%l6;Rln*EGwW_=gxZΡ 9\6+}= :;j&:=b%E2 40dPUSYT*E6d:%IXer!J{@5czٹՏ,U, QlT B@g84fDMؽ}> jp[|=4Wlǟr*;LMs. 
g'(/?(5@{HtvtɿͷE/s9JDKƛ2@grMMQEEWtQFiܨ#ܿ17r:;ƺqIL?2f3Md2XQ,X, QUq&''E$طo.;w/}Xf ֆVeFFF+Jd" $BGjiX,D$aff+W{ b6YXX@UU0,sIZmLN[8;: ȴvJfzAADIRg~R X">?˙K,yȔKFyGhF:{I ٰ)|mm|M]) F֮!S!N>'Oh4p;]O"Mt(gR33gNqJҋ5 ,#6V$CDї}4p{ D<r 0d4*_Rbtvk8;%9BLG(U7vb0wNZ7Ҽ6; $V0)vLD"<.d63bQ̱S0 `穔U;^+7,5̨FdFO7_z3^.@M13/e*6K2RMSQ 6I4[fffT*e_X*R?e,Qo4PdAGg֮n  4` {dB1[d $HXG|v' K1Ӂ +X]M>uPG/:) IDATY\>?3s8|4A$W) KD5"˯ș [TgkDkt9e$.ߺTU_^࿰qCubE9:1{g8\>bfge߾093M&!0ϑ#fQ$ǎ1p>nttrL2d2Q׉F\.% R. l۶ MX|/ڈ$K8v;mmmEjbm[qڜLLNzSn2f1j,3;7jhP9r(6o~ǍjިS(f$Bժ6!5{imif*oy{<ѠwP3Y.""I}n~/L.2$wiJ Ao f3Wz~k33MW_#ǧ@Xp654jɄ,T*7pKKK|cc֭x=c&YBU+X-v:t0q˭}&~^PGM4cͫ"]D04Q. s$ADn),LLѨo\b͊E456 `SNquWo]믧*6$ëVsj|gIʸ}AzENkLχ+Jfp8uRpbP$L)G"pN<NJ~V &U%9?KF-5B?Z:$iDEA(k6Sx-B [ᮻ`?v©SwE{{ﻟ{NĺNjawM(ZZ6MIvq  I2 "I0IzD/1ˮ)T]exfuqz󨚆D6IF*dpp/yg2424CCCcah4ʆ 8Nٲe BBd.immEDRCCڽ 0p: L& ڵZZZMh`0dYHRΝ;k4|Μ=VUYzX A,2Tf/csx:O?W\Y 5RLXU ~|_RQLfe1#jg~ad+, suLjAzrd|ç`p[K&b, $(!'xf^qt;F;cd268Qjn.ZU鴴073aD"ARD4ɔJ,̹1lV<%ܿo?8hۄ8pZ( 6dUgQE-ю0VG f:fUАe5\CƸݴ!Q֐n4ng_dt8sERF3Sx3]f|^_V?_`f_Fh_@Y?22Š+@3 VӑsN|'_rPB/# J<0wy7Vۃ͢Iej 4Dg%o^Et~]Xr\٢pi:N!p1?l NP:bb|ũP::;Tx1lTzCPlfYQR,L8$+z{x]x,"oyg'Hts4plRI.sS[[kUK&p{S|oh8V?[la8tJL"SKNr;3ZBv=G>xW^r R?@fٙ:v B!JI ;w>knN\Av:YH%YR>uwz%W qn6fi"~?B P(dL&LTH'yQ3_uKf>?jSWx!ff=:/P1+?~R%0פf?-~ƅ`W(ϝܹ|>ټe3Vi+|?3\vP.+(K(Ҭ~:GAE,DO}_n˖QQ D"GqTk5V\iZz,:Ƿȉ Г:|$Cpy=[V UA0ɸ>DY&hiVKgIfQ@4j8$ m@EQ(zp=,r _?PM%*h:ZQ.!*BUE/iTUJ&ċYtYb2Ě-Y̦xjQ5z[O7e+yGQ{:6ʫo<aٲeO?{?j<E1"LwkYݠ(r[x>Ʒv<8q'G뙜#JZaݺ5GlC|>s!W$&'X,;FV'Ni zX,?@ N3??npQiЊҴsk:PC4UUqݨ(S,˔ed28 U|>fD"bi#3??O&$kXbf 'OT)1li. 
Qc4ֶVq]w׾^~qzܴ[Ȗ3xn/bCit\DIK[nv~H8dK("-o$}ɳD)sHBLBmjHЉwf߁#<ģX]nb ]$DL2uc9Hf2w25;K_ 'N&WT*]70j5Ć@ͨkŌT)b(n&YF\s ln/fTpO=;oYZ>~ގvgi PXlrktvwn^IGŝ|7c+Ζ0E W,:`Sg[ilxEVVe`Fڿ~c#᷾kEV0{%_513?I~\],ٴi6vg>!d$23J@F0OoQզ$i\{i:&Q$RټR!cX1 ]Thԉ * A$*%dY&.b55EhPC)Ubk UE$ @`A0awHS\nB71fg(2_-}AΌNOsx΍) eFXĮhUUf zF:B W$DRR29}vgw"ws\y x 6p NLOկ~+%.DRv܂YK9fFq099I?@XBY~&ػ^܋#&Wqrk7]_ߑȤ6ok ׯ'J122D4P366F$'NQ*p< Pj,cdY;;V,[Qb6l@Q 26~R8V]dRXf㹹94@ݻ()bR,$ ^/BIp\Hi , vx<ĪذYTUl6+S3;}-Lΐ+8}8Nv>_Ѩ7O$P^!qnr4߼]xu;Ot2zx?l M2&bnbyY *U/| L8::I2 zYhooGW5rCS,pv,Ʋn6o؀12z "m<@O7dΝǮXHXT"x<>rOu03f|A'|ey^ު-q)Ȋl&K2oM|-I{IF:]nb>EUXը"RTu p |-!^oPFVUB,"+W@& ,6 Vja'aZ(OMqY6+7lͷuW\'?I3j=o;{vl% 64z>C1 $ф (5d`PnV`">v?Wѣ[8{]>9E|7XIzW+T+%VXƱh E8q,׮gbqu_N!|l5LęǐNvzjDisq===a6399I ˤRYv'uđ'yu#7跿!zInXb&N%OwEio.a7}\yŌbZ<@3t ΀Nb%V)o uh.GTfrv|^Y5N?<Μ9í7Yh48u$--EgS0gNS9z(jUUVr9t]k%H.:tC'Lr t]=܎nCUq|>Rp8ܼ#9~kֳ[i& K&azfv&M\,^krFňD"X6TM硇=1+Pv/[Dh,-ƸiMؾkWҍTfajwccza77޸qZCdӣgttPMѵl&OPdi)(VLFѠX)y+|uvscdc&WfLOY.|aMӰ(/288 sIu̲Ŝg0X,f CCL?uQ8 D4MC1˔W Vd&v=TLH$B)W+8^ZMGa]6LCհj dBMԴfCTVQttw+x_Ϟ##xN6]Wľ( xz6n*~imAVQ2ZR.K$&sS(h&xNjZiGM=~OQDmƺk?~Ü8z٢S:X̧|LDr t3C}9zz>մ,ݝ:H7^>-̙9\6zf'ϼ|V]a6[),_>YZ~pmq1Eaժ՜:uIFD7rYL&333,_YEٳtuu!I(Oַl"&SVim t:Be:,ŢFF=z>j -¡?:ZtbZ B|!MMhenvf D_J$i\s }~LCh [8<~2\QrtvS,œtuusIn6~pqIRtwwOjժ |> lذ1 W^y%zJX:;ۛI& \.V+|y֯Y@:FUU\.LzN `bb^`jj G>GQdYfnn4 LZط7TEV24#,-կ~'xsDclzy;### !^AOzEh*ե8.tn fPS)f3^'h x)3vt]Ũ5(iU ,J6J:sCˉK>q玜!AM6SxA$I\)RWU\v QT*aq;=XDs B|S qL9bFRJ]S-,.12<уX>0@9_@l8\0i`\{)W/g>#ArV3R d?f)誫9<9g{*&J%D/Qrﯪ}}hɲ,۲eK,6`u>=7$K1y26FeY2fY{ꪺx!_8qSg}}yw׻ j2嚂dV}ń&738Rg`"_|w}7ӓ?~D"0Rx!餡!W$Mꁖ&YM?r=@V*uz,V (Lv*FAT 5dFPhqvŃ?}oFUIr|(qtpuk)4 :5C5*El~+玲hx6Ξ9Cw7T zdEJ˞< R,$ZMàI|O=ɺ~{jVRmۯБښ_O24t *^/\mo9E`2ρp8\(L{{'\ajjQ 86,6px`ϦKgzzޮnT*ͅNbI`6Fx}nҩ,| jGeYf!IRE:::Pee,T*E!@T"55VINf9}]fͪuDqr2W_+n&ǽ/>|[;VB葠^j<6D(g;uFf&ƱPP*e $Q+e,bBMUi`.HVC ldbq\~?}.}*:hjC$lNRl\"Ir>r.Y/b։L"poؾu pPHf21>2-lްjmAih\&'ا?LuØE|l2|^?$Q$r%z6lbx*̛~8F^يf zJ2Nɱvvpם]~'3֍_g ;Q=ȯ0_ ̼k73U$3` oj'lGs'6 n6;0($IhjةX,bX^՗yګ~"Z&\Q`M4DzV@U@$@DD 
$2)\N'W1%lo59zQfɦ3xN0JzdETEIBZ064|4-7&r7}ʶ͛p$e|>fAQ.V2JzL&30`X,Rt >$ijjbzz`S3Kz֬a&,-suxMna{6#n#Lb4AS(1Y6ҙfFKYFQx3 _?8߽><J]b~q:۸ٵkssstttN1L+NLI(We\?A\&s%1L:,iZZbhh]vRXuabaqqՊbA4J(HH"0G{{'J E033[ٹWP(XZZbfihh13;hbs2J*# "|Q6mݻnN޽Å~/lJ6IQFY|UcQ|&#&@.qryMSF<hB4CDV sKZBDqzl/on~C|S/99zlbxW*L3Rޡ0Qn'0Jt4Y믽Fgwشa=ܻ.G\iɥ%EE@T#S*byYbVtTrDA>`4RC ].oY{IUdv&?'ʰvv8:j U @1eyz3T*b68_,^P?ĜN α*W~8%zWpZ=lRt{{M@P_Bx7~.xb98~ǿ(`~GIj5JBssַ#sַ! "( "" B͆&u$h3_YZ~,0߅'L A$=VJMF$& b|Ts9j8>7YͩCpgZÓvHraǍI0dIBg0lt+ԃ[o˟" c0ɒK0 Y6;\#OvbBHcclK:rшfS,EfD4 ^ m$)<~0]>rZax2B-]dWCMiO#J,o!<Hl&[QZ}<5xG0,}9v I,j{97/hl:NxܠB,@e4V&''W6rl.c܁?",S`ܹO|HKH4,ݘL&}{ؾH<M {n44q]?ˉYz RQVv:**=vˍa~~gu:E=Iu& FB! "zAb%ijjb9nf(}L3\u+O>=8FSNxحVP*^V#yɗUb,f2"6 vUA(?xO:z kQFرYl%(i Kb|M&[Y`4D{g'd CFܭm391[YcOu`9j5>/|>l+LJȹs7_k2dTUu^-hXxIV{M 5X3/  *~Z< UuT_0|?\8i?v0ۯ$ 6;/ݻ{4vZv{ EUЖG?N7/]qH a0Pz*(|/?SY݁Cca<_ϲ#fndb::X'h+W0۝8>:{Xy-裏O)eSo7MD'hgIģgҘM^MS h߀,Wh( bQF#ϏfNd~~K.m`bb5T+2;]nflIGqc 2V S3lز\pN3xa"BU_ \Z& gx?\$ȯOtNh_/߻A3/^ WOCWl6yLbLx;vpۙk哟^%Ju{WڠL I:TM@@_jӣ j fzw]>qɺEY+B|fddFԊ… zbDX +i0:rOPf;طL2zn݌=CϮL&DQdllzw zj. b1' 4Y_!BCt$qp=O>`˅!dؼq3 (U x]^< >>{. fzYXX ɰy&0)50\RP/ up8yJ_Rdrn_I^XQUfjccctuuаUFijjB&+TdH0$r0 ,D8qEQ|>imm̙3lv:H:K/kfu=;r?,LqU?IVȡxlfΞ7tGiȅvDԅ1VgppLO? :>ݗz"-斢4(k`͠Fh:ۃl؈NgFh"t Myj4 `BP?׬;n5Y)s鰓e;?,m-m%4UfS+oG//!^ 42RW5-V{Lwf0o2պWNY>|,..cN4@7%XS\/Y%XAT"@EQEQV=d.D_WFّ\w&ڸO=]'0I,LE,lR\"|gjjTRwr [Phv,F{.BVa^/a6H&6$If%YZj2tB1OGG'^j5c!zZMl2c2@ZS "LJNh zAR-1 DpXL6ȕD3jF#9T$HmZ*zT5/}U}i|T ǃ$I~JɄh7P*ioo)p؜MtHPCc HVcia۷Heb14Mcͪ~LF#cccm+ .P,P5 $!β~:{>FZl6az} v sGp:X,4V+fvzI&X,./J<^u"?8mm,,mhFgg'ǎۍ^gddBG{<ԯp\XV3ӴO}￟S4nL33stz N@"t(U$H6)vF:dPcx"AwW|.fƧhi Qj*nB:^Tryy3;::]/$IrX0LVFX"bfz~b<@\Bo MȚ@dGCs[Yya:?ɱF.gYF'ŒE^`)k66b=$S8nJDSl*T)_EA@uR<hW|xqGU;4/!<_X_fW;a_̼N3m@Zd4QSj˞1`X0 ῃ ĉщ:ɺ5oC!痗炰U~}t]_jf4>%CCgW[oCjij*S'iu;05,:9͒ހj l J,[n'? 
]768>l_87/=#L܌jbnn6F#ZNl6[=LTmOOn:2 @h4?`vv MU0~K8 zDJBl)F\#HVkXf0I"tш\`4ۃĆ+1D'NpWNzsmhmm%_p8( ?moWUlB(brbz=6'O~8Tl.^grr1\YdYm~~L&333twwiu4==YX\PTظq#6MSVei 6Q*DsC#S IDAT>>}c4}^SlJVYC>*"N YO"ˤqj|@UI'4:4֬%H0;"Y,RjQHh ]M&QeL#6c} ,͡:Ep\T2P˅fen~ɂ#ҘN7R l ȂBSGF}4$ kV+Z:[:XLd VvSatDcCT"di6WqAJQ*gĽRhu!~!h4I:$4W>]koYL`u0fw/Չ:ҙ4TN:FXz ߿>H,Wt8_;Bw +Bh1|k _7r$pƣ4є#456rQ|M۷nxga )8DIv;Kجv8mBOO,.,%ٲRdEMфn̙3zEzwa%aK.P(r>395$Iݻ}s/Ox>,x.i XXX/r-LNNN*f6<5</#47`6[hnj…1ox'{@NV+~L.G\*8N{yX&{zxd>/XsCxN> 7g(nÆ!/ (H$ J&XЩrfER bd)%ZWފla! (MM/.t* lFU\Bw"kPET f :qںzIdrE*UD QSq;TK%Z~ WpH"(Ukc426DOg+6l@' Xv c@( W81^~T_f::+Ɨh>Z뚙?lfE?S*0+e$IBIr-<^ٷo53"EQWw3j/Z/.Z^o@U.ͥ7ajkU̕2 |?EIĈL&WHxB-j=?ͳϝBVT;~+>Qd5<>+b4H  B sl6jH$X,XNXcZ)WeMrp9{Ht233D˥*q8VJ,MeɤXN5\MAQVLs_qm'Xg9yYzVcjJ2\R.O1jdzGjn"Hd2xr9rgff"vMR&W-xVӷ 60;p8z+?ؾ};G#|][~D/~_.?(b6UFW,^ 6 ^O$AeAl6xd2Y\\D IXV&&&#bXꁗ>P.% MM-uWl^O>ZсspU!*\. 5NUDj/rx ۉ J4j!D"jFBp'՚3r D\6+D$Sj("@OƦl~\q5xV͆@6iwPV5Nz Օim/R{O/1y]3љy^a4w$;jьF=]|7)A=y/ 00HUEQW2"chr " EDx5D\C@Dz:;J[ՋI#:9m{vwgϰoLK$ӵaɪc'bt>g?\r "&PЪ2|GdSU,ds RzpL tR*EP("^\.Vj%Lp(T+eEl\`G(5*B AXJ&b5JX~D OQVp lݽN䙡L&x;ژ˦9;9E4'?{H4J2D:^'j$PhmiFTCAݽ=8y!Bɹs 4NzNHz=:Q`aa,bIm3`ѴWU*7tN˶ne|;wdrzy~~M7/ZF*nJhkk#ϳo>rHQ}%P(`4tyFGGٲy x NRtR.WBBPOvZl60---Rb͆N) ժsgio`&<"JKXZ\vNP,c4HF}F֬'TNbʵe`tdC,GsXlV24(,kl6+LP0 ]Y֯]C,]-4h22K3rvo\~ n7Bq  sP+3>zFIliшa1K\bQRb!J0HH&#\4H,IҰE|N çƓnrwΛA4V;~vfUvڍ7Qbh4.w=^Z^CRz狤/ܼ T=/L~& qsיQ5u_./﵋|~3_3jDΊrШLhhW5O<Ypmmm 2:>RӃtr9 nH$BW{\l2KSk+c#ZbdO{}OLQй\W0S \c;011ڍ +illDD P- b1cC!farngnn`0}}}hƺu믿BkM `dY/L&O~B1S!$Ϡz wq~'xb%KiϞ=,..r֬YNNyf>̍oSgN n"$144 F @$l6#2Z ʱc'LֵBBWWDYZil 0223q# <,b3g{(r_ac6Yp;$"1rfar4&2q\s;v6&mm 5ӵf=οb /, Q)LTED&)M|꣼fFWk/V?~/۹];30#H`F|gA R+BoBM#UA@/W$Wp4@Aej;ȕW^BnG_,dW`CcUS#KSy޳95Yfb >xmvuuH4&I"WJh)>v^ǏrWlqt:bH(ZbժU a41 (&L&R,˘un()*u Qe,f33uTRή^Nff#I]ݜc&sg17 /3 ===d):;85tqa=== ,x^ gپZ~U nf$Ibb| hhpQںϋ:19IR t"CCCx&]wx'8u?(h~lp8__? 
#c#=zu7TQD"T*4Mcqq7p3<V6e$x8}j6Q\J6f\(Y1?7Ww/:dq(Q4]CV(:͝]L.1RTIb>O+rM6& r*yQVdt}^2sx]n,&dLU܊5okfR$)2zUx`FPp_G`?f~P@@MN[yj]TUNY wԺ0XPg+E2_t640x]u{}mۯ执AZWqjr .M7s~MMl^/}sLȻc0==$q6;cG]lݺGl+]D"hdj^s@H$͛199ISKP*ԹO6]7.%f3te׮ZZ0Hfc ֯?؛B<z[| w~703?省UU"ͱ~jN'#44395E6шT(H%$lfn`l<Ǟ;G?;q9\|ⓟ$<;C2dG?'Or]wbbp>*里իWsî먇YZZAl۶ <.O=r`hdYZBA (2^gbq8100#B!j$I;WnG{];vrغ'Ne˰[`va`g」,WU[7{Ai[]~ ? *좂`  s}vߜc #G̙snWէy?w5DF*as137ܜ[033Kccꗿ4,q[̄g y~tM9Y)S)c8(n:X8FUF"xR$:X$d%ѣl۶UQ;a0p:ܔ ;bF_؃&A4EEW! ͓̌!s=`mj W2_V|ccUxOr)E+Sf#_|R,C8_}yֵͤ4:dtw" "DJ1SJF )8'/w+b_ )KMQ*(D6Eןk5ZJ ]#Jo(FwG͍j̃vM5/rѹp/gljt"b`x7*!^LO:V7 HC8,fƏ255A-L:= ӳ=BZdllV~?LOO*KKKtvvV8 +Z3h݄,9rfl6[EYzzl6cYbl6FGG!PT;x[]-qՇ>FR1r1ln[nZȑ#:@*>67"** :],Oç?!~״{d*);  IDATdUWb|6OP$TBhr233GOOss3uZU> ]U$cIdE[Z*<FDS DALf+d Ɉp6GCWǦ8kڷ/p;=u fT$B.M}u"#ajm&ZZJ Yv 2%$R | NͰfY|[7XPh?}#t>Qn֍B:G/kh+M o̼ę_s_nVf0#-_S/qc$lRn z S**HB+SQ7atY$֬YC>|ш/|ٰyۻt>G2|:Ç96D?G?A177j%JX$\F/he+L<%:;t:˖-[ ]{~ErTh$B&YDb~\bRA#q!\U.Ʀ'QD2D,w}h&nD}[Oكl߿5+_l0q?^Dž}A䙧*RfDA%ɐNO thoo'NCS#BȖgpM.']H>qEGe W\qX~=(>37}~SZ:1xѧ( B*f6*LefNd^əQNQOUޟ+̼ WƫMR^L's xCBoh<І9F$D ڈDv߿w+etRWAbW)XT[P`0 t4`JPSL{?gSK"WA9S -,glǨ%Z*s7|3naSOPPpt06ss~&'`兿!,633s4zu?y7\|ň|+_2ZIƛowȽ5k_GE'cǶ: }^oK/{abD~7nw^uhZ4Lxi FdrjbD.'`ZQK*##\+($I8۷mpA43$ HIl2J%1ud)vs2==")+eݳjD0-#łb!h2F.wL't,-Cף(*L݁@0d>WW5ͭMzʇ^* &<,j/W/L׍ɨd2P*0MTq=dY|jR֭cj lV8T9>6frj*ߏj%JSr32rU}LM IՄ3YL DUI~q77R6):޳ox#NDž^OYL6MkK3mmTH%2\tNPE֮; [::YZ bXݿ|ǹ'r]?d457_X`3OyfTN.T4O<8l/~n[Ǯ~_x86r޾^.BDI,ضc+Gз1Z2 ,S婢|m 2h9߿|*IA㺏R/R I+پ};)[Yx<^]l޴^X,O~Sy\B_:,ǎ *p/]`4H:r#i4<,\s'<]v)L;݊x\*d{}b\y5/"^e6;Zd2~iDQ$T#Z鼉b(2Py@XFkPT4Ҋb4 h4"i$ J%Z+yMFV?8.8NTYzQUFI&hZʊF333MwW/v6V[nW]lf2!  Dh$YB)q{j(+ >"(`0eNHFS*rl`p|.G~X,Q)i52zY%0`Y! 
1Vr |asݻn% RI}>"V2v Pq:AjDCHHum K!,6+JbXZ )J]$f]/~ A>Fyrgs÷7-rtsac5./q.D",Zv9Gٷomx]$qF960ȺukpcsYg]n]ve$ Dpq˅DNKڗ T ^O\><>ZZZjB0"IR*%cTJ6\yd205?^i=,EHx=FRc0?=EsKGH:{GS;qUD՛" MWl yO 99gE9kٜQ`'3o7ɴɔW꫔W2 |)7JMU:rߚYez% YO$X9-in0K\{Ŝt\6 %~y@;nO.|dJEb$Ǐܾb.GPa`$Stut2;=`079'pK<c*SXV|>lL&C,^<͉bŨawG]q`Hf Gj+r,]= tAq;]fG[OP Nhm<p#6`2Q2s uX={j53SӬ_^"^qȤK%4-C>[@'k##"h4Z p8ЈZ246, ~QVP BnѰj"j6яm[E<YUrY./2bv M3adhKο8ͽ~CGG>??}$Nj*ƆfdUUAҙ z:BsKeq1T{k9pHH`!'hQ uu(2fИs9zN?DHu|c<󲜈(~C ʫLDe2c>HGoocSXl6|>]y)\rlۺm  s_<]]|-85Oٴe _ב:nޭ%H (JN.?<:dݽĊ+rQ _*bsXXXduRYj_-z#x نDPU!IH8b0;;K$!HA-38:R&3c湁X@<|$D0s_g]:߽zVGimmQ4 ;w'w142Dh=W͹矇lc5s;{78^v' p7ȁC:;vΟƍя?m\ƛn`br~'|D2Ϯ$1xtgnG)X\Xǃg׮]X,"b1[u:,6+.]$555P(H&O/|>Nd9ɗ*Tuz=^\.QoW555twv)46ۻQ,TrYAA47188D233^o UU9rVI$!0778zw ##jk)jcqj;(D~t@RVONUXDHdcxtֶ>?I|Fannm]#47ޅbi1NH(椶಻#ƱLNNr8e3#aLD$@wcY/z`hb3Π 7naar8͡4ijr?4]KJh$(( G!Ja6q9TWɤA'V*ƿF׷~8YP^G&g̼_z:83͸̼\\- VhW㥧ՌA-<ûwT妭Cz@966O?K^Po`qW¾AzէchxC?pZk+1[Ʊ; R nwˡrM8jVߞRHQQBN" $ ::RNGQ/Ȥ9뒋=Cke2?$r9TIn'{glj033Mpy)<5^D T2:k֬#RUU0ni۹wry?D{G8_>|>?Ǽeff* Q9_6mĿ?曹Xnuuutwws-pֶ8z{*dG玎N0 9rbYI4'h4"Lh4N%/F Q0rq֟@0c9XlfZ=BYD"J/J``yy\kעWt JVWU&"I z⊅dd9aEVltzCã54*`=˻R.U}d-h (xtFL)-1z8=%NE:::8r0.bj| ˅$`0W]1 5 b)t5^_5%R Uu5YL8| O=~3w=>sb~,z#dƵ WSg *؎PT8v-\zhe#PO|z֜FVȔK/w\}%: B^V[ 8NQ g6T^9M̜@IZk_[ CSؼRwd(r ŗ,+wXB#|?odxqX\-)H"11'WThm_*?g/Ss67(/߾ `gȆ̏c̏O1d22 :X2j#t./fZmczr Ͽv^dbq!^o@)iln _Ȣш&A|^R Q'%XNAoHwy=C<.U^fC?Oχ֨gzn*o5JL[K Cs<|M,F;|rYabbɩi&Ǐ#">,v{_Χ?i~|YZrd`36lc3Oqlt v/{?/޾>zzloGpجy珹L%vލfcn$I"IpnG `0vE* BK]q"jNyz#S8d2IUU\IG rh|!hd~~B$ R.Idiw_XFN7Lt*|BVhX,bXȦӔUb "SSv)KpD!, {>Ckk ?ŽDQN7%EEQ#Dg~;oڈR*(Ed(e\.(iIe2idT*M%JbF#|۶ bXZmLf2, d2:__N>Jc{V3K? f|  Uȣ",st8ݿHs KU>OQ^ެ̎xJNp~K%\'1s83o zʉ`O\I+xu܄ . 畭hze7_̜ZFxC U|VX$QZAt(pUMz:>šh줽X.Ͻ>b`p%;AU%}.wg% s{!?2L,CS*I~r4T\6jgiy,U5$n'+to=$aÄ)>|~gzeUUU̲i_>|?r}-0<: d\rmfJC(P__M7/K>,< L֎v8$Zq&x zWqs?Fgg'##twu2;3[yb.vkגf李w_}5###jjwhjnb~nEVYMKK \^O:ANC(bSEt·3=3Ngx~!D"X,+N*ҙ4D&jVJKn/ZIP.X,X,VS fP?[ e$!" 
6L.K:yluOx/#df2J8FȤ3YŠ_OMmwG]DrDK[GS*eThmj"PSSCM4,.ٸclٶI:::8>6mȮ'={Qj}5ydY& "SӘf0YbUj"L*^Ekldvb6~EpX P E`uhn}Im6|gw3Y擟4Hjb~jفL:"%jcaiݶ2HLf, wȲ-[\iu9D!zX$JcS9dRnC,,-#vsUWr}_G(~#ZM[GCkglR$RHgʫ"Xd/E1-x^ē jI$(1 DQ 1L|cT*0= q{V{Q}ii6\.p>ڵk #Ginl Z"PSS nF#,s!2 M-nc2~8;v&i&4 vY֬=GǡChnn&Tq6LMt(x\30 LbX(J,6" I&j^EERnX,Jf(l6, *ơX,&'hnn&JW:{f*ٗ$INTpp|bLzHRD.d2asd2D#lT9'h4[#m#uNK{}1nWMaSUb3j8lZF5: ñCM7,?ꪮ޻e` #HԯFM@QQ D(0" 0[U׾Q"A|Sy~~[O*RUvڝ J :Y^vQ$SqJ,iQi5dBHs[s%-<7|zx?(B7s3ѿm;3ZB!E:jLGw? }*f* O<B\D'IL/3/_f`F\+je~IzQo׾d|Oy@g0ڃ=O(KTJ%*r F^#|G*dphnp޷ǟy$}! \:D^ =C 2:B(+8ClyNJ &nMqȳ=Ɠދ"]PL1|WIR"j%eAZccyq,.. jU7nY-L|>h"c1   YId4sr| {S3Ǧg^cܹaTԧ ͱnzPU2I]!L/ނ L78<=}<'00p^͛r!|>{.+++Ԋ1{wcO<Ν;cU;i&-z<>DKw]c?l?U|fN>M9W@Rq{ǡX3cGivL&xBVE3V+@,byyIB嚦&`UOD"l6d||4RȲL=ByV^֑LIӘfBPC*X,c28~8T*Jr ﲇ<$)x<4T]u:&Ƨʾ'b۶mi OfzzK.BJR#f&&D!ttp;\?wnex[phԸ: .SX4*&#P'c6Y[\rD<jN/zilj144b1Q(H&Xm0+LDQPp8؀/̒wqE 8&?LT=vڎnmԀޤb``=L w<{(]㍟YblzhB![*RZ^o,`b̼Vu]T^u<_V/K-ͣ4(J [s"R*FΠXU_G1_>p $ZHBq RjM`kQVBAꜼ嚷rQ.ܹX29拷܄Z0[-Z-Ѱe J% T9zVu!"J VCYy$E۾m$TZ O$㩮>+rH4V"ѨRAQ!Nr70@swE|Z{ {yQ"]*qu05=K!_- dz{гm{)?4_KWo77G:c?Tb0Yn=5V;l9g-ghoduFLdiۉFeǻ.N:, =yx<^ P)}8qyT ٕ'|ՊZj1 աj8(U"HXyUeZx<@,#vp8Pb)O*@RP,RFn"IzX2d`aaՃx|jGU>@_/{{[-:g7&LP_G01>dƳb('Rz{9.>O22к $ kM$&J")D"P(n% ժIJE"LAG4n8555EL&+L3r FFE%Zېef^ffvN]$A:{GDnu'g R$,F̉\e#g?t xu`||x7KKK(ݗ^V TB՟obw~mC!c5 Ό+_%{3rGYD%LFZS9@7]&jZrO!/~fOC#)$g>>JGO1jevwK bxE2{͒ZY!bQ*9hmES ,NOaX' BV+`:g=ӓSܿA:;;X^^p8Lks3VuT vT͆PRkp7527%dFVkxl0FN}y5䳟,o@T@g4G:M6ȑ#yVqO<$NWWwNDQjr0CCkp8l[ťVZE!U`[$<\֫s*=b>'OP(8mLMMc2fs,Vd2Ū3tvvR+R*QTD10>tZ --$SU(( aQ(DQ$aX(WՌAJ3gΠRQ[Xbctt _H*4V%×UpD"8].jl5z |RI^K4f R)YI!_VSO=ŊgA1 }GsInl6G,t-vﺘ3T*&q,̑K' @R$P߀ xuVǙqtP,it7Ld(K$ {uAitwl6]jpfbD"'8]N~I?#}=44YZuu4ctwb6iltS*TxA0-T*j5- M--|~(Qkj1 W*'~yR__O4EP099t DäR)L&jZ{-2U_%QQ~zz4j DJ,4QJ2dZe@k$L&) ȳ`ժ.2xd<PԹFB?H/\. u9]^g{~QK2@!*k\v֦&:x ٌ^QAD]Da z2uuCa\u.q71 UdVjfjrRE\3? 
ϲY>(S,TJDQB!*476hmj&avQP)H ]}] nZM_} zFǙ3xM(j[f^*53o~] _xn_bqnVK.ၟ'y:=nU]=4AP=T]>=nj BF@(Ck2au0wu?wv;'/eM>y&cz( s9,T*J]m +$ : *beƖQXjky')"[_:;@E;|M[7E/oep["_SAرc6XpF^^ԺA'dԙ3 z:zO~/ e0O|A!`A~mY^k f^,_yfF~|_+t 7 u{;JY,o4/37|0S׵jOE.]e=/'c~TJl VDo|_V+\wMpI%RsSdAbI$`ij, 5\*Id D9B4=A)SzI,-1u8B"IьRAђ/WX0RQ(IerX5|Kl=W_yѩԄ~bT$Js]&W`:`(FCҪJf#cp9+D.*xw=0b [K+Ofl޾z s]yf{ ^tW\y֭D7 tߋA3y|?>$Jr9 <󴰼@"g~aUb6PdFBm۷ ıcq8feC}]OǎӽIlhniT."#cHSM$! ɦ1 >wRpR(d1Y̤2)VuBoc4 KelvV<+@6n#$a6YYXUG<`0L&2+%J"Vd2B)gׁ S)W0u"D hh:MͤijjlBa,6QI2U·(VietzxBb G E*24Q,9LwG3xE\`|z"^c &:}467QWB*J<2(Q.%/T_U@9'93{?Znۼw{^7/wky12x^'߀<~!̍[D]E_&$Jeߦ-kyio"43ql=> rG"'QSN& 쬥eB+>5v{d ,#ђ  }<2v]xWVPH*0*h,& ͭ|rtE(;x1jn4va>>?5V+UWaJ*_@O'iS7̷ UUjFg_ 7m?pRy[6q;QUg2~VF&ذ|,2>sOojtj3߾v?|~O'޽{y_w8C8vB F~_?dhhA(w]C C|?r.E\`b|AG>e~~'Nwfg9xW_y Z"p0~yoN2dtM8h={AhmmED;: $I&gK ,xC6R(hnme#L ( ($+˘Lfqڝ/h,//9pպh4t:1L/t;U*(Ve8ZBE*"NI3tr#jjq7417@(nL&1h 9M2։AgVFv/P|{󱸸HOOmtuuQ*839R#Qȓ( H"OSTk88rbՆ$P *M%#_5T˅_O9k`? n7O>j;uq ׃85:Zj4TDR~O &,f6spQ/bu1zf b,OH BNExn?T yZ={)VT*Eqѥx^j Lt:R T XJ`З#ԢS&8Ak( O<x^5[AA9E>ZfBS|s@|fRAhqXd2D{*MM93`&O"(-5\CZ%Ps穣LOak)Gq~tq)txFiوE¨ZL BITrCEm: QULLt! "`2Gc:B\E86?HG[''ǩw&:<BEIR221]|I &=}}}\237Ӥiz)ϱuVnoqPukjih|t1Y\\$sell D8Mn7UjP(qLNN288ȡC133}?2J%n`ٻL1nc6FFM&$QtT*zL& Q$I|>v;RB,{[+266F[[("2X R\.p\̝mep8 B$鰘,,,-`2Y=Xjl9dY\.#H"J@(ZhDBbf~B6dl6STHβef"T9:dD$AfC4皫BгEeezYr46 D=j$IBӲ0;GD"NKK xE4!OL LaoǏ፻yGwL>K4Ũc6Q _4/,3P [ ot7xQ ҡ4w?23THfnbJO"-WH zV9~ċ`w YUjFG,/R.Q*5HHg' ӯWmwɯbvoyJ6`ϙ?R*%JVbO)RFgnTj0~Qg;oàӲVcy/' 2xPU_hDV_⬵YZdБEbaaF&fgi'V,qb ;OP,ȌNs-pmԨ)WfwpFyчػw/:!{1 ~;\tN\z,W_}57nW_E0䦛nbƍ,..}[oقc֭߿&ؼy3ׯGӑfY^^.dQ~iؾ};Rpe񶷽M6q ٷoׯK>P(T٪Wˁp\=z /l6}?ӧOP*>ӳKXsiX,DQQ*48X,g4 :O$!ϓYųkY(455w333SRIWfgh4Թ(ʬgq\XVFF0$ _C'͑H$^h$I$ 8. 
ݎf#NS*p\lܸbH,駟fiinB`.(U vn#qw}`P>Ϻm[ Wؾ"g@rqr<{Zǎ=?KXnѲ,c1CVuw$2y︈ˑ,Y.amgo/3;rB(ÉSc Ux6ĠU*dbQ勞CQ_Z#X,&$FE0E%JbXubQ5 ?p5n E7:8E8E4,X1^R\âpwne1 Z\Ԓ$w2v }zx}33sNQ[S<ӌ.D|eQc -Ij RR"ρ$GwRX+%f VDB!T( zD1.&+%ڦT56cQ'//64 GNPrqobaþg}3ղL{k-&֭Y˃aii#BAnXQJ"FG8>rm|+_Joq:gD[[s ,,,pq֮_i ܄Bj6\Bz{h­;xgijjb ~zzӧNr8r0mm9r6{X.E,X,Ba=?lm6}l6F _Dzg /ᣇhokYahhʊd0Q*VNNzcGDP V( R2rR&Jr9)Kh4Z tV fٌN\P(.:& _jpJE U*j,5,--`7HҜ8ydN{1n[>CFDZl 9JGG;gLb09q%B<ŻbA)  LMMۿ34)tu \ՌJB˅$*wpzt B heg;t*¥;sŶʹ ҋ!7\He'܆Yo\* )LF$H"Zp295 -- FdQ($ 92Xx<¥QcL&A$ (T^}_)x8R'j~?!ˑ[ xEN/(93~v?MMh%| IN|Rd~fJ!K$bffZ KS2*@8H&pd Ob9P`yyA _oGJgɢDʥlQO0AQͥyUWbi)^J?22mbJEL<%_ZCj\2H[bK}K˜N8yjt6FHJ5Qto`tth"n WcEӃ``gĉ|#;0455t2JDI&HUW]Z]8, 'OoD.Ǹ;r~\. 0|jC2dqq'OuV/W\磵o.h4$ Afj9tlBv"153Ś5k(DQ5k x񇩭E$VZE>glbJd`0DXXXRP,I& zqBF#fٌ|HjJ%dYrdw\.wߍFaye:gRM"  P(ZJ% FTfdd@ @?@Ux^pT*l߾ILNNq044D4@0r0͝rYh3_&ko#-I߹\5BT*&X9oyilB*BǠs%tZ5ǎA(DTzǓY* G\{.O>b G} ]E6襫܇M)r|H,3:(*;yK6 O'鴤>,;6Ţ$t I@FEVq뷾I<'J`D*A݁N{ge*y3rfP{\W^1y x b>I`n#Rt+FcZ y@H IDAT:#+K$ lB@If!Q+ Eb06jX)"K"=x{ҿb*F(,VTv+eccL/Ϸ+biV">(]]]Z[[ lܸ_ E;wزm?s E|;wv]p!j6'}͛7sq &#\rr[GM$NA}Q;@Vq76alݱ?EEe:p4J۪wlD1Md2* |QFDFA,k]:R;.561#+$rU->P)I)Zݔ31B,(Q譬hkԸ[g3|f{k'9B4CGٽkַHd2uFrbH$jJZ|շgW^6:gE|ct93!错~`O;#9"0̞{e14Nhe *tͨe~ ul2NE\ wC#|h,MAشsS3\tno@D>?/>/pdmn"ձFDFlZV%P\E1_֜ P*  o #ilry}mvfI>qJ .J4'ػY֮~zVSTXYYsauTdh$Bꫮ"o[! y(^n~|l 'ؿ?>ObppdYo~ɓ-pf 8]NuVZZHgDl;NG? Uz^/KKKiL& |rWrs=܌wKKK hZ, \.&l6K_?֮%ɠX,Ԟ-i42zd*^%aX( j0r$1;;JB ZL6B"W5h @frfߏFף(476S,F33,..ۋJ"0==MWW\3Ν;{}rwŽދJ8(JD'LQ#د^o *ĉFT4\+|Fdt򶷾r?Νc`p% z9캄r,/-i4LRiZ-Fݡ\鶳*rGb4rH3ڵ&G*!jy4-NN?@!S3{b~_~qCu#2;w`}}mwP)+ҬW1tdYQ-.V`08f.^HT*e⭷b2k zZ=rqs̴|zclגڗ?y.`#WF}y@Xg(wg}G^ 3?)8W􃃙~=eNѽ?H u6ji MWWj*b@Gш đc mEљhE>}{{ؽC$j4'lYJ(rZ?|9$ˤ׽ Diכ PU 7mvEVسLALV K+(p$ k_v >OEhpu.ٽ$!ȲLoo/.;ccc.&_җ;l2S)f3rHfpp,aۉ7y}199Iٵvc,//Ç;x'ٷwG=?0Z^W~.^ `m۶jdΝ4uet:m|>X[[dlΝ?O&ԩS\y\pR8{,Pcǎ1<2<"QVGb||T*NcnvuJ"Fݎ,حMoݦlt89vccc4 N>b! 
Dh6rM^,& |T*Mpoo/V 6FpT`0 ;@VUUz=FQfZn^ivӧf|t8v\>+ny%SVlV }}%&~HAsSLoVh¨7 *ZQ@`jd%4x.jn`<l6G'B|yaN8c8nVqj[ùoHX B0ssWZ[glxNdb*-$I4 z=z$# _Mjue G$[.ҵ:O,''X2"O=um[&PU;v +2>(]s3S-r<'ObvnYҙ O>$DQ8NCUZI;n zaPT:6VӉґV*<Mo"NsŕWj6!p\.s*t-ccL^@*\k7B:h46D+wR.٬Teժ4 \.'vRDoьC+Mt(BPAl6q:$דe|>\N$;v$cЈ"frh&ˑdغuSO300fGb2XZ144Pr:D{-F3lXZ^fx4:<*':ɥ{_p[aHg3޵|hRfѨc4i븜Pl`9a0h;\.ڲ?\G( 1;7@?lAm,#$ >=ӧyHmpJ|d&ͯ 9:-8=0V UQ;ld2Y'b Fp\@ujw/^n4?l!rYKRW)}`ѓHƗZ-vT8Lf$=,.Dh4(B^%p:C% I"`hZz9}ޠP(hh1TU : 33?20±]) N_9%^_FNmΦRV:nzgDӋ;䉣= ~Z={0sqNjV۽5-f\JCڜ_ZᣟwU>~53~2XOO 0 dLLh_?:&I1Y,Ĺ{+y䡇eyqrL&EעjH2&VqCs@Zmv`b:w`Et_8y˯ST+5_} aC026yCCCLN^  vq]h4 rnrO8X(jGE: FVfs98jfXZ\b%odzz\.hafq\_$ <&[-_dbl4f8pJr@XRWhBP:L#GpeWvY:^yl6`I/\dp+sՊ, aw) |nLol@Pa^XX  uDeΜ=%.v;>EѠ"Iسg\ٌ(EU zjx݌ qq"Bh$J^lzi̖nB:R Ir7Ogii.qͥ%2uvEZE#X,:[Rwxꩣz4P*v>{s3DV E6F ݎVԢf, EhKNT%&l!˲mVfغm+_÷ *}}omkIƖjhDȗ Sk6p\*eN'Agiy]v0;3DYKDxA)VWWiZaf ~AA#R HR. x_\(z~ YrO 7n07hh4nwf[F"$klݺSN100@d=$ntP@_r/SK&-Q& L& $ :wF-4975M8%_*Sk7Zl9x)Gf}NF[PzTZ UΈ^odeu u=$`6p.2l Yi6= ̄{"xu,$]z";*PdD"b"*2&RJ|5Ėm?~@G: ٱuNW^^|sj5_sAR3 z=T yߏeYf?/N_\Otk?Fnu|z;i+^͎m۹⪫1Y,l߹q=܃``uub& MբZ]wѣGٿM[h4<@:~Eѐ^O|۲03==j#sqZb1oxDQ$I Yv슰YlP(P:tEN'fr@ $Iͱw^JME]w 2px@зr>t=4\ ? kkkFY- qڝTjMuZʤ6۵Z#G;QX-bXt:"cd2 X,Ξ=QM_"D.t280D:BQ. Ø &.LM1::JP`h4t=BH8y.j vE,v(ID"MwO_NXѠ[9n*v! 
hj5j }n/4A^"FDcУ1 8}nZv|*"6O8R g$L@*~*J03/|WC ?3X}57 Y77gjZ"* NG^=yH6^VYqhFL^ z*֠GjYmԪu+lQe)O`!84ƃǞ/?Wne3,^$u0u$#=,]Qn6 4 ַ#ImZJ\&CgrjɄiCTۯ# zGb-AoKKr93?#h"LZjpW2G(c IDAT95.O{Ǘi|3ZRDOovT:EOo/jV8y)KJdUB&fZw:%\B&!2h 'Nxq\4MΜ97ߌ^' QְX, gk ==GYŋ8]."Ȧ`2@gqqMh5[Fʥ (buTbfzKW3J&d26 UUh[PTD"<O$mhD zUU) ݴP(DZEVdt:+++H~c'  cl٬t:͑ZT" c67y= cYYY!DGt)-֨c4ך8\N|QF0dyٳo/LP$":ވVAqwOg-& QٳPVطR *zA\N(`0TFXXZDbwQR}4K%:ZF"N[r{( x|^r7q' ǃlԱclCn7KlFf#Ԥ'GD=}r*^NQc1iu4m\^/3ӳz{fye;N7EmK^l6s:cL?_*Ex`?<3fE:Ϟ" i5[LM07;C"qQH%Ѵ۬/]m9\&=lbמLgTh5:bӓ &jfb橙9 o"uE<^/D#QV/>fO,W1Zqٝ< I5Z$YFQUnj$EG( 7;3tn7ݎ`4G__{?C[brrvl߾oᑇ/l6pyFF B8]N"0??k^gdxhy'kxS}~gf $)Ql6IbaqN8F$sss\zu xoC׳cvT:۶nghxVO<Z266~Dbb,|;@|5$)9 Zl߱'Ȋ:mz"=Dr9Z-=Z&GӑJ% ~لt}E㘉,bL VJfK_2w_S13;KOǞdpp>nՆcjjQ&}}dr8Ph^fzҘ,0IY]]qnQ VH$G8w^o1[U!r:촚M:k ňah0i5%+<Ԗ(ת(B8&eQzVM`!H$tj;vQ#ַ:{c~".n `ӡhYHسgK+5Z֓ p1hD VuPTlDXrqi?8`4#p5̜?KݨX'($ְ,ǖPTX,NCS5;F% نg[tLrk_Ѣ5)* Fg*|7Ã9]f~B^hT|G <<< >ԗ^*5]4ֺ/TeSUlfuc#A g# # <TDQKR`4 R(g?G8V*`ՀAm2iyFzL?@/vי27{;wNX-J2ў( ,юhu7(oߡӠdΜz1zCB5bᵿB/]g¡7#dihDÉd-hjܖTk$RY|c(_' V+X,|k7ů_oO|⓼-N{ɄDՓ8qr3)C'NkN:6Sla.e$388L /gqik_2gpͰK2$ϜFpY‘0ӓ"!‘0~S'"[bxxbѨgmmdj@. b4D"Vk؝ Gz CLNM3:6=P*trv^eJ*tζ-z$v)Uطعcv"HY`6h6FɈt[ۇF%W(#K&J vr0,_}3<0wyY^Y& +df3h"nTjJL.fdtQO6&\cbb Z\.K"c0[l4ZuB .݉s#YV4Z-@JNtՈ[M ٙidIEpŰmTU<-C*EDd&|6G! C(P??Ad2ґTlN7-MPbe%I>(RiJm&8ad3ivn4X,1r53ャ,wۥeNSEX+QӔi"af9lA+u5&3:t*Pe:>NRCd z+Ȗ@/hryl8ժtؙuoz#FNG'hTTU莕ڨȨq 3y'c E<64O!ow"Ϗ9'yHVhnpt;\ϵ=>9`lMH >y6@7Ј::=:QST+E;~ |_ᖗ]4Vj氘-Ke ᠟d&CVQhGN=MAR-͊5YY?7^ʉ5v/3mJM(w,/ب*Ri* JrBYݑBFV'ZAKP5x 3BY#??Gk2rɾp9x z;_όmh00::ʉ'X,ynN>u-BaR#y9~W\q%rb18k &&&9=^&H8v</FҽhZR4zU W>2JN? 
w&Jn|h4J(n`wرmX,5Px zEfeeAQ^nc:X[[pEՑL&7*rrl Qĉ  凯__ŗyY>* B͊5:w7:7~P8BGx4mbhD oRd xXLnh2`̨VjՈ$k6bK rgf8wPi5dD""4V:+X0봤 lvs7P% s;df8-zD.3P0j(z#Kw&6MU߈$ɘL ,U=(>}{/|!3Xd(X_FS79H+~`#󭂂VkFo>wfj<Ռ͓JX@oCMzyGBm y<]o}lOPd~"  @T AkС( >ވe}mP LR% X$#}x^<04[^?s/_qA?q7{عc'RH8(j`vvb+n~xAƷlѣGfZ LMMQ.W%b14sαc.^СC `5[q:\h4ZDf `kzQOll2r8:+q>_IX`bb+Lp0<Vq֪hݎ)YŊB٢ j:&ق$uAj60;;T)x8<##J%tŠ(bZ1M|>+h6AnS2A贺@Zzpl6 8q׍bzz|Rdv;u}~VrHP*В:_$*"PWAQX-+Dz" zQC Pл]6#y7;0}?ý\sYglh4)荺bFE\`E-FEV쀊jCF [艓6{GHw]`Eu$e ?3?"7)/IyhH#3} 6ԍeAyğQiҠ n4 "͆( جJN*!g؅b E[+3vɌ%Fjj*)uGh xǷ`d "@})!q,k78vwݬĖ[m,ZUE([-int;] c9GBsfq^};;3)rDq˫~{Dra쾄hZ&sX]Kpq'?Α#Gxu7033CXf~} 244iSl:(ȲL"HJBz."V ÁQh?^K.W "n\},..ꙝkA$<ivE%W 0bN8{6Gaw9jx|^\.W(crrqe f ۍޠ%%MQ(Л3H%X,];r.DH&b Ll{MnEA:feq+.? ,ŖVضF$I :;dumL˄ax&&&mT~$;n#Ok.b[l!A') oO1ׇZo2ILO6r&CFTdF|fjrPB?FGe;:}w{ʱ巿 vͬ¥alC(pr9Z6RFlFYZYUZP ##cb1"Z_yQgwهɋлmǟ9(Jܩ*]0 m\/z~Q!fݬ?GWI`fM|mCK n-x}?Mf۝*shutBUeQ\N:Q@ʄvy4oՈ:RN6FAbR(So6Y[MmC#F&[fuiɄ .o|kYY^AVPt5\L^hS*0h TM\p8XOh ѡaָ5+o`dau?jwxꩧhnp*։ dE٨#2۶n4C0R[ng[&yx衇i5|cxh/| }~{ql'P)WOn݆NӸ.!P`Vtz `Xٹc'QC6+}łܑt jDk \nO'y^ƅ Јo6`7 p9htZ]]KbH$<j^$I+e=J7ٳglݿzK(}}} p 4 ^;3?rUWuT$r^ѣvm$ e,--w^Ǚ*u^N< V t;Fd2GxǸXY]!#zq{<~D6raX6-ػvb5gf5W\MRΤi.> Y^^3@zzz(4MhD:6@)ZmD֪Ȳ́5\s57e`b@Y tJHXD5L&~?$Jw3jN Ft:$IH|hZܮrQ*p8Ȳ&Kx{ȊRl "۷nGj(ݺ aneFdٍn,nZ8NfffGMG6FOGf6~4r1 J0jD{{8PZf\*2:4LG TXIi-V i!VZ.7$rSL%gRib2h x"&J–-\QIDEv;F$+S5oPnk]WOr7b3 z=+D 6Z^yCGcΝ-I6An tuwifrzo|\{hz*SYFF.ǹ曹ٽ{csr=4?!v.H=,+J2`fS%KRݮ0v" Ioo/Fαc67X\\`00\EQfff366vd4NE :{ٳ NrzF#`Zvb-rnrFgyup}QN:ŕW^鳧ju ۷% 'O][,!c;jIpi#K۳bٰ-h zTzU"mLMMx،m`Z$i)"74Z-*vRNggZѰRVkqz(Wʨ# RT*8N$FVjID24z<^Q><}-[hmx\{5h$\.ZlBӢjVʘ FR$$ѹ%\~ZIb9N]ԇM:q|l:r 58m"rARa}co :рM.COq8쬬.yYXCKECk<[nϿ$OrM77] M;i򃕀/3wffz5 fnSQiIh5Zr4O> VRQn  ϮqMLjRˤiV]N4UDQRi135IWWBhg!wXMV^ IgwSs3LOO3| PSiڼDAV-C^A  l6zCGxFP@d kx}>RNl0riv |I\u%p+"#!RHEJ6%qa~ 3w8\N|IΞ=`  2::J|+7ވclle>Op7wx/}4kbKJ0JFl6J%۱X 9^~%:ɓ'z[B@:8Py|>?HX, ڽH$a%./nopՁXZ^ ASNm6xx<V+fUmR*8?rJhѐy鴄B!v;L&({chhSN{c?޽{b\qv f|&m"j*Z ,mm!JVjJ2*xt("z|A𺼨BM]  
E;Jubv{֨1;7K{{;baaq("IR+:\*QհlT*$*Mr;z{XX^F~/gϞ套ٔQM\O?-DӐ *յ,&F&*vEUh Vbjpc45R^fST %&'p9MMpWN&1E4dSW$Q'b0z$i@˙K:t1"vAoXk4<kUD-s>3 o93+|7`.[m] =2KZ?Qy [Z#*8٠'vR-jj$_^cT,;mILlEnoR T4yZd- Ih%O?,a{+J*c5Hh4zD:W\Pnn[=RL(4xŋ/^czj$j]һͺb1N6H$ںCkZb  EEѰ(1=5t*N ق ^eTU%Hy'dyq zQ$*LWW$x ~A0drr>~?NQdEi جַQJzD",˘L&rJSOCJry,+1T?'N0<E*FӢEx^**pvdb~al6?:Os!1 EZ*K<'dv~Tkdg ǍNkY %$ɀ4mdYhfQU}-[xS%k}Ns瞧jJ\=*l(M$hZ Z֪M@)B`+XtbZX,^/zH\T.FE4Rp{feu_yN'_ciٿwX?b]݄f"X̱w0tq\.e,S*]=s?)Շx78snsc\t]llSIRM*" !eR^d2]ET[l@{{; `يoa0lll*uQH$[erB!|A395bot@ DQb2.37D}+rP(q8Ξ=KOO\Z(J%vKˋF-웧y+?1?T*#Iʕ2zdӼK|? 5<#9~)~7pp#zn ~ 1$)DQ.ՑJ%Y__'033bbii[[<>FFFh* `hH%Stuvh4z-Vx1,,/q-[%`2p9lN2Z $"juz,[[[z#1raYlVFbHRh4b4EÎd+[_g$#GCe#htzz}|}[n)^VGZd:zvP5,;b ^OC|Lwj0q"3S,-⫯S! ,>ΞfW;:"[tD0K:L("/=TkUl.'s P֑HeId8=^4Kud#m2\a^qEmb6ZuUN,ۗGQMf4[ҷFN!JCf`f~`M[V֌F/$uci)6b>E5&t_[ka~|02,|@GggΜfd}u:6Ս )"희6kUt z9z]v67K2NVL*twvdЉ:^y%zL%*Tuz?#a5![|(U*b1Ls#g<<3\s5JN*B#{<w3g8|0'^?wsnfN8l&Pe"SWr߿i9**7^= :b\LYT9z/ukn, ,sQgV G|)$U5NuLF#-dB/YYYZ2;;˾}X^^fiql>UW]E"X^Yb\R,˹ø\.l6;ϝgp ~a.]@.+~˿_e~avdKFrqY^/rjcvv`0HP#W*Px<Ύ;Pj*mmm!$. 
,[D*N*`H$b6jY Nf)T.O\RD:A[0FKK X.Μ:͚5kJEk0dMyQ Qi.LgʄJd2)UȦxBADLR)QSSC&!H`ۙbE "DL`J磽 0띥`8lf82 OS*Na8lذ@ ^% ʜoNǹsزy lH4J$|3۷vPUVأUJ\>2w֬k~f]]z?5Yy2{3ъB H eO੭%OP*_I7MW|eX\}F@cS3{ zilٺIq'B%5Ns }}+yg0LQERi^>w޽K\-A*!CcqB9zTjC|_f'?I.[n!{JX;F0d߾}ZiYmLNLpUW17;FeZ55$ Z)DalB!GR1887s)illOϣQ1 455Qɤ,U,'OS#J% K4ٳB!.YD֖*zNK$xbP,XbD]G&!J'BA I:%l6PEkEV띣)}PWO,\.c0j2,D@ZY#iT@P ёN-Z^…1jj\hUj2 g2u^1xpdu@]}=dɈ#chmjCo3:2NYN.E HQU:wXD.#(hu) vY^^]B`SEVb'l!q%ep9\5ftt+W .98y-h4bXb0`0H]]*qZ-BAlP(yfN:RkR(C"G0 H4U}LOg`RT*@D Hh4hmm%LX, ] t8 d5 z=b>ZA&jtLj-#c#tvt91rZ&J蘙bIeR,VDZrB.=U*2t NG2dഹ[%1K$DB"&B% 5K @:-68Pĕm$Պ8JFsx |;sT*Ft4J ,Ne2bNAo$LhXZZl2%,bT]G˨T*J ;Ȥ2 _H>t}}},--Q(P*՜;wE}r9LFo{[k |>l! dxcX %W1?? k}}}d2Ο?/ZNMco %<ɶm۸뮻j5=bro[y2^kٻ׈{LFY$Rޟ̼?-|*243bF˿gD12G/ :}-/Wd29Sy睼 y=DOo7V3g xXVgϝFC:cU?ޠg͘flvv%l|)'Ol6sQ>[&ֲbznB#G0,|ޯpw[?δKgwZQj xB@X`4,7_b7O!gu}{>w xZCOwSS_wwR W^fu<ǹgyGeffZ^_5#r144\u8]NΝ=G*?.}>_ӿNw]u EB ? 6q9;sw';I=>q K^fwgvvNs|f6;9c4t>j`JPB\ٳtuu/~G7Mo]JGGq8 {:tN >T*ɓ'q؝ smQ1 ^aʕFb1B4񈞌N9ݻQHZ-E 466255%d%@$Nrf'1,,,p]wL&EVlj\Y@{<bXx<΍7Dbo؉c| LMMq[AQ3??O.[`Æ >zx{x722¶1:1Jss3zEL"d"ˉA(86p8)LVL&A `0HH b84d2Q(aH4h :D41دʗ*K>w=477233$GN7) 4*- x|k֠T)k] vvrc)ՅKffg졚-ru?_x{K.cF+?M $I>N8mx''?͡#G?ʁ}|w]Fٳgb͛OR-P_߀Z3 Gkk+Tx=^$IG[K Ca2o23=C>Nuuu\vjX,FKK --~m9{,Et44rU֭]#js:Nvv/4::hR܌h$ RTs9q\g8N6t nEpaE$KKvl/200{ī'Xf FĻ0u:*z_=jO{{;hXj447b4- >p8,RjEp8@oT,a4DZM$X̣jHh4"&FaN( LZP(`XVCZfii JBD^$,3FrΉZ&嬣Rl4nw"LL!OSs3#F*LV:Nl5]nD/I&tZV3jA6\>2v*s֡IkAA:"Xb1atxV^* LF~ju yJ(d6HXșMLlV+jzo=lH d2i:\=wr IDATr!J|K.tZ24`\Yr,|>6ꘝ't:|~Z+V0tVI/zdhpT*g?Y::PKB-V}O[+̯~̼cϛOgf015IkG;\d6gɓ|dݺuH:-6lgŊ"v\FTfqRcX(%MM MXϱc`Xld2)v'~4;vٳ{y֯_͛%EҐH&|;7 >b0_\@VR(x<^N?Ƈ?0{w$+V|˗ l(u100# ufffy'شiG+yStS.g?=LOhVEP244ƍp\*1hujFFehZ;F?]l4`61qQ<׋h4P(Byhnn J\u.,| MMC'O7陘6 A$ÁZ_[N>ùsgq:_|`Gh1Ƚ8\h4–-[(sD"a,o^9~-͍|byOF>#N /^YZ[ <FtuQJTBVbF}c3zF kЋԑZAAxp8,|CP%1t2-mV + iin#JբעjUNh*<݂dd2ωt:ΙNqM7N'kPP^DB!^?6Y-`2Zv5Zrbt:t&%-`Y t$L.M:EZQo U\Ũ7ZRJ0[gIh4*K1;xZR"W,TD"جJKZZZH(=$P$=zIobQƒNGU lـR\B&BT BJ:53~?͍a h%tZ R:6Mzy\FX@PP(J|!ᤪ`4HzrJR;#x^ 
8~8oza\&իٷo?WFM\AVsy֭ZMgfzZH  ERctwwv&D$Sd3;:Fj9ytE֮YSO=͛9wmP(R( B!zWHl{K;֭[CPpU\.hС%z{z6| Çҥ377 $ VX--͜9sz~?۶o'YDIHNi? 7lՊף*0fEEk4G#IlKE06bAN0gsHqyIeLF$MޥR-#Ɉ^H$@Ӳ`TPVēK1J%|.ӒɤK:R$^T1Mb1745I> 8v :RN'b#a1Y.1L/S)jţ,VJr  JD9f"1W:$D,EbIb" u{ Jbaam7o#RkUM!\.Dr/⵮Np.Kz_vMxеA">G099id2a fjj JERav?fCTR(P(h(Tʟy_`Cs̾,!\ʷus_3K_~̌B@}L`\g "sXa=yjx9<ZùKz{{PrB6ai)IWWWa:G lڴ9?dT*iy\x[m obvH$ø<^Ig 헿DW $ -T(&ZzOwɓ8pՊnI9T*aZq uy 455RC}]djjlgllYxr\tD"u6K/HCEբBoo\"jZ6 o\ .u덼klڴI < rJQGT"`Xp{T*$bq$I"YSntj5*JVz=_%.7444Lx"W&O<=éSX~r5T*8"W\!ϳw^ :Z[Vl6hZ~r ՌyQud^gҥKf||Nl\FT23#9t: z/([.xs .Ѫ5,..rdh0$jBPnARRX,Kq9<<{Y$ZHuuurQB ˇˣsR I++䐴ɴXwVBEƑ$ JA2E0LPcvvV.tLX,Yn1L"LbHgzTHSf"NBn'H`BP+m|@"b0;?+ Y=Nbݺu~t:,}ikoah*V86lGi&СCݻB!f2:+Z S'ꪱ*^j!~!z:Bcs/^rϽ(kEFh473wݓx[`>?_\s*ޡ?oQL6C6'(X /ʍo e,f bX,ѣGdxxZV-"|{ߣڵkeoY$D^xvZ[9<}}+8~8/`0p|>%, s7[n۷og߾}yTUxgeCS=;vnb VIxP5?8wq?~iyyCn (J\v|>'NV'Np7xQ__FP*F^/HFa*f!I%N'L-\xA CvcZ B޽Av͹sx{˗ BV 6M6gΜ!Nikk#I0`RtEQzzzigPɪqb\<\ tM=z72??//~Ο?O0[nWCJExN* nn$IB. IfLNc٨VdyfggsQV LOO@WWtv9`S*CCCr)ZffFT*R_EˆhR,x!夗RĨ7_bQ#|;vF!MPt2%%l8K8W+K B.2:bK1t*%+r,WRISS`VYf\69L&GPa4-TO#r,ZK17Fa1"BBQ@P(RT&jDG:f-H:.^/6'|RD'&yNN8!'ִZ-wy'^W. 
FX&͊t:͙뙛ڵkbZX,ƩSÄB!bӦM+e9t,XyjDxB*U73_N#׷jB_ݖ %|}q뭷֒Bs}#2~̴/ Xvf3ן SSDavŧS|[OL<1FcD"a},Y0ԹcU8( zz1=3am7̾gs.]ħ$/"vv5ĺŗ*J'>ASS3{ŋ_#/qJ,G_s/29ry!Xr®]x'?!JJYmXL& ;v|泼 _͏wy'`1 D"bsT*Z`,ym&Q\Boo/0~l6}}}9} M_C DW>^=}[63Ow]S[Æ*ԫ'O SׯF^aШx\nqr9&"0HTEL&ΟRI4h,WO o'/@[[;ɌRڵa:::)+;߸\n  .}뙜WT۾ʑJDˍ#]IRŠ7P[;c58v߾" nzɄ'ˉq\.'oAgfQTؘ1wZ"SS~d.0G6fj,..b2vǩs8y},,qQ)W(:Lsҩ4b JBQ*幃\.'G5m.BNĭLjZ ɌQodjzJh1V;ReSJEf?oTFձ$JjWZ|:l.KT aZEZ+Nc2)XLVHA./^~cÆm ]B˛X&ӧOt:zJ%=Pl6lD2I4r؈b[O/[I(@ R}wyFM$?qo}Kez<^^G?Q|?e >_^>Q7m􋙕LUo9Mr~z2cǎLժr*a,rݦyYѷ~g}9H&wSRj%qAr+WKFGT el|Ύn,VCMzFGy䓏pK29񼗖DiY{7Z lذANUE"nf'1%p8,HJ%8q311]ա !~r9A7z[%oHOf t:fffl4122V+$[l7NiokՓXrS':ˣnrQtI)IdE(t393ETR(IH&LNL\Jsc#v\.4==5HN$I&d2z(jirHPA2 uzs~vFM&,w<9NhiiÉz#hFl&rHe &%,& D\< P,rC^J-HL4b,Z+3<<{?.޺ShN$r2)JQZfq1"`vz=/| y_sjjfggiiin0`ttcǎb :r,3 5^fz?d'̯MW}/:Ξ=O~,<r7BxSf&ϣjISX63"b$Nb۶.vj^LljWM\晧'o 9Lss ǎCt8֮]b߾gVN."LpE~YW.oGeݺu"rvv#|Sڹsl޼'Orf~Aν6kCرgYl6Ʌho N7Y~=gfs;̇!.^͛yresss455f**LNMpB`~nE&&&ㆭ[yYv-_hzh,jAV "W_;!Yy饗xcffL&*kנRkD]$G">nʡCQT9uR"*eX2=t-Ͳzi::;i*NB!6/ IDAT"}}+QU8uB<7--4*dRlT.Z F ٣h,B<G\I *RUj BBI{[;\- z'1jB:bEVR(QTht9"Ub߬vůjdiipHc0ͤiZXVҙ4 TjFrL vN,*hU*/O4FOqy5ϏH$*m5jdL&ZZP,eb1%p؉Frf37n7`4&r)h4Zҙ ?-{Ttuʉd9rRfǎ|Bk(ݬT*M&G^{P(d/f N7 3cf4?ǎcnnN 4Z2 ^zs0P()H#=$RI&#qnF44z<XXLLOOP/JfhmmejjBlfÆ i p;ohrjb|_F$F?S(rO _F&nBSO=Ņ all;+ .P.bD;H"$0R%?w󵄋R$blذ\:CGGO>$vBRXG ! LO-096hq\BZ.Fl޼qb\, ay>rAz:z*׭Zsb2DZZL&eIQ:::NO(|NdF)455qE|>*gC, 6nUT*i3H$455fթ)Vt`jnZRzoQT(Pu{V]Z-W^7eH$hll$ ڨT*X,r\^fAR% #R0*N^J"_,> BA4ʞJ%KCJh4`l6 ˳:PPD&RT1N#ˠT*rx\&:#j$IX~vRȠ(a5[q8T=5+ˤ<.%!I%i - UhjTY /T*sԑ+煔U笫Z&|N|L&C>fP(Meof1"I tY^R*JRG{ny |{|lZMCCd r ]wuuKƗyǐtry2[Loye#kE2T*o)xT+W+5TpWW|Uq(0(vF+_1խ\. `RK3i4 sll~<3ٳgx8x V"Ht:e}r䒹}O.cnnSWWG6F[ApS(޽\&su1}lH|нmkyN8A:D{kó;?|rYٳ˗/R(9|ķ9|nJMTFغu+_=܃ba Nnݹ.2 |v \8W _EOprn/O~U \`BD0֬\P_evvBWoPNK:ru鯽ܹ֭SNrx'ihhW_SeΜ9J+8lvZ4::JOO.] 
061F\P\!ZyZ0̙3$ʼRʕ+|]ݻ󴴴m+^/LWWOӳJjU)W ,,,W\x^JUQ 0L!`0LJ%T*rQ(+%K1WYp8p044DSC$Mi?zIxyZ-zI$Ii:HqyD"b1fggZ %vdEDQѪjxnD*fwJ &jMؿD.W (JhDL.FjS.WIg(jUKKgzzl6+"ԵZ)$e@l!m]nJR-]d0 s`"Rr9t=S3d2bcժUMf1~V[fzfF U{J*" !100@$Fe``v,akoogjj CDQ(8{V\,wp,yV^-'LnT*%oJnB8СCuv1tUG{?H"#q7/EW{ӓ<\Çĉ"͠VuV#<|K_.Μ9*$Q.WY$:(ihhĉlt=6R /@oo7!<F0lݺm~j/TFww74j''뮻Fww7`P>(5 ho"$T,kV(遳\.::B+X\\ĠW>! Z ---Tˢkgd2ܹ^ϙ3rl6sU(JtutHF*&XX.ct:YZZ\A*" hp8T*FGGi9W\l6f._ᐇ+ /?0sx^񸈞x"VB($ dR,wwPYٷfI{rD"}XMfNQ"V֗S@LRÁfi&h4*hT:n4ze_u(rhB̗J|-.bժJEPiIur,/3kxuZXn?6H"lZJb1| "-jJX_Pl^CP =kKp&fgń\ޞ^"NS,b%_̓^x<.Gf JzH8 b@`(K/C}+Wp\ 8VX MONaˇ? u1p ^FjudYV\vR*E:ZLamW՟1.̼[{d[X7,~@QޛO(3,d~YdkU?(B^fi \NTZ->,WfqqM6a6Y /&7"m4я~(JR;:Cw3;?$ l`4n};`*JΝxWKlڲP*8=pp4®n?7RБè4j&&x:|#rZ\/>(1,Z[[RCz6;JlٲE}|g||旾?:sssLNN\'gժ>|*eNJ3\xD"w$Slٲ̦Mr}Ŋq%(L?~ALORhAɬVⷤ) $BFQ[S+XI6;5T*24hR|8L>#<B() ) 9* ss*LP#S)A@ (d y:\FC B7'PD ؕJ%z^./$;l,f LHDiZx"hbӂ(\L&# 9){*ˑL'ˡ)R pH$B&JrZZZ8q9\\(kVE*:nRQ Z|S5$)BGҲ@8@(¥$R):-?yIfzN(.0ﰘz.3擖bfFo|VR+Ms7s!.^(۝xw ^}/~tv9w^z;w!H$hmm322`t k\Ɨ%r 7pa\.@ٌVa;&PNf߾}vm~}+`vzICF*fEG'p:[`xxñc8w4h믽 Wzrb*;پ};_Co~[o zjګwQ}{M]^u5pIx!ZOe>w0\mKH&kl6&Xz5Fx+s=<;F\p͛7K/qmԄhddh@d(vr570&6}vSUUEc]-P^,ZZx\.VQ8}4\NJk_|9`ڵ  SSSqnn^O4evQcrE:\, BM4?HHLKZS'ioo(f :fgSp ฅn%~ ( rQ|R*b^@R1p9G$qj/V+lFe]F N0MM1 اѕ) pL,t2 ΊAFǬߏl4L%lLo%.m#`6xg)DRlB {r9 $NNEYT]`t|T& 477R$v34Z hK}d9[X 7;_Ids9Vwftdc2\ʕjjj;"G;kgxWbU|_nΜ9׾MOlZΞ=GKK3~ ttv'xX,ttF'>Ck8zDTo!\'z?-L`˖->O4G~ĕW^P՜>tɉ Wo:VvrUuuD1:;:!'LyP)ըj "yx0[,.ZLeec x>\/JRYY3>ŲȤS D֮[<7Hoo/.Ǐgf\Ѓ$ gN---:NQKںl!ԈM<`}twχp?BűǨשԓL%Y|H&UuR`vf:t|RͪU]αt2Yvdl_NҲX4d4S1-(Ur44x@Vl4 hzvŒ%3i-wez#%J&'hhulL B&n+'fY,f$D1I,Pȕr (!+P4x2d&4tMd9 y9"AZV (4-Z\`0QF* ee &&}RQjrb12ԡI…@"Y P,L&J&IX(tLP*rtvt q$SI|*@QogOx,IIsg `XEmN8Ej4uuTW0F"y/n%KQ I4eϞy]LO͠?N^SV(შT]tc}jYb;=y[kw_3oލ]{fz$׋=(?_~ǟ3oW.v-B@Y>Opa4T:%v ަD"8.2 7\]-lh,gЙ|S̙"0je͚5J%jjjJE8`bbd2ŲeK$LŘ k.z{{IRTWWCqw3PYY݁fbh'_(0<67mM-'Is155ESc)RrWb˖-?qQDCCz( A@PԌҍ-W*fggj5x0,&6+sssvKbυkg!gtl Mdb1 ϟthxeod>" O&#l6|3Stb=,^l.F!cZ@BA$:1*%JT*Zf? 
D QJPTBA,63P" T*/|rQTZ('v B$H{ȂNƠ3!^Hqd2qz4Z5^}XH _pJ%I)!v-& BAp9]2ɔ:PR)0r̬z&_c5[MJ9-Qnu0S7`ZnlQ( [t фbff֏lTJ lP8@__P"s(tӻ8|+;Vr9r9NjbXtfFGg>JZH}c@?3 ~nw/h:3o_Zw֙NJFN|F=L֬YC2%euZSmŌVcvn]vazjBH,ɟ=ɷ->2矣*i1\cbrqM>s7m$ΒH$Dlܸh4ٳꪫY~ZbQnȑ F"J IDAT{qO8~j|-[꫿b`ht&hjjfppx >Wɳ>˦+w.qyQx^:W;N9F!75*"ztWR]%Vwnr<ׯ_{x2#C9rD%LW\yzHafhpŋ1 ?zJq8\H.'"ۖ?0H]]= "0>βe$P(V*Tj5 JE>jDzbqUX_SGX@ \.GMU-CCXŬVt%LR*P; nQk&IHq]Dh4! +**Ш;d2" 9<%@RSrR&ˢ'fZ'H6dAWP WMR>/jь'c5[Ƣh΢_.iR\ r=Hghl&X(~MRD&)ԤRin&3333LF**VSSRtFaw]E0\pJbރL.0ub)D~VŞ={ٰrihgjʇѨ$R <^W#1:tL&?1V^MMM O\x;]󯏚3MRF̛43۵4o*%9)`}RJ?OyGc^r C/831LFXCeŊd+4Ӳ)jkky饗x׻ŋ/Dmm-`^/-Yott:Ξ=K4:ϖ-hr, ܹs\8*b|b*8^ܶ?#,F=uE4.'~n6?Ikk+d #Xf V|>TY[MX@0P*tdRiT5Th 7N BB.4xI}ٷW*ԔJ%NCZHDyq(Z-vbfTWV מ^glB 9Z[Z Itb hUvڭ4773::jP `Qn&$SSS4{ɱq̉Kc^$&p A犒:JNl6O&@gRD́BV!W( 5A0 ᬨ$vWEˁrR\.~P@Ґe9:= \YPBPh4"FR1_+"K?^'HDpVƣRV(P+ՀX\ [<%ə'vLpZkur]L֊Bm!pThlldjj9VX!=F_---pl*Z%:*++Q5FZuâ%U AljYl9l޺gEbT8+39;b=44zyh6Oh8r8U◿d6V*˱#bݚ$ [Lgg.M}}c^jw04:ʻo^86l6}}}+鍌OdIrSsr"Yyg)AN9 y@8flrP`?K0`cزe-Wmb6ai4T1,j\(m͋֌V&P)dLMS]Y8v;Ncnnr8}8db>^FFobdQW]gQ+T!ft&N~\S " @VJ~øJ;JR'#P)UN<ɩSo}=W]u*JqS4:DYF7JBK;^%mq>`]lݺE KmT*Eg*C|a˖-޽}p9ٻwZU7ɺ6s_ʗٶmΜaRvG[[S37ŝcqm5W^͎_?v37HW_?ٰaΝ;YO=[-Ms9z)֭_$reb N'v" aXP(ilݺtuu{ ݎgbbʏ1+bp-k8͢EH$A\.'lڴFO#BI^ t'|ENQaFi_A84OPȑ#Ŷ={V<Ǐu129z,F *L"$zId:Assp ٜhꧦH*]%h4ϲvmmK;$J+*XtSSS8L&#@J9u4KZI8,NvEd~FM"<6OdR'&hin̙3vm| <&\T.أ+e!HVRBPB-w!D mU QcR,9J%LF&X,b4)Ϊ2HG*F:d0OđeA󗟡X,DTSh4Yz5 RC!Sȸe͜>ú5ۤPc20M,B!!əgX*q|R[neMdsYKNm&vm߿C\̼!`M[.2oJRm&d0Q,Ѩ4|P*ؼy3mmm"d01bH&\Jr{y4ܙXRC=ȡC\i@.WP_!wuk7<@]]=}}xښ:`|[O>&'ڗ׿54 ƨTcصu&K :OI\n7\s :Nutt0??jܹsdY֭[4/_2' BlܸGRWWGKK .\n׸뮻ẫgFp1AP(p)< :u+VpItz8X,Foox d_@ @.W^{P,vr餢СCs= P]] x8СCe3gΠT*xymw+;B.Ap8 C"&YK~l6;X+9s ]]AgdJMMM[o]]tttp9jjjl6=FP);PT3@0DaZ(Q|yTj5J 8+԰︝Kmb |k_[n&āYܺD"\)ڙYt1h]C2ʸhh5 +VZ&_(EX(Ay dB"@"bHON[`zJh$ bEQ@V;F# 2U\e P2X6jZW-I,2㢠zaSt"hE l>+z{{qRGczzpш^|y87t&;vifQP^id2q7oX?4rc2*ta6D4CTRWw&Y<3jjjjH&t:[$ZSJ|baj_/fZ".ߖ$PB  aE+ nO9rK.ɟ<9RLA"K̛pzO{Ie rom 4c``<9&&&03~ڵkyW1,땘ji7x#?,˗/'pCUUmmm 
\`htnD2~:O#ǏAg4t:/|͉bprI:WOC=D!UdeT*>w睘t&c,Z\ΚUkxe+8֯_O4%HN2~/$L&#Ɖ'"rJ?Ʌ 8VNsαif~Ip` _ȱ:L&zSNb FcYS@RKeeeO"d $˖-czzLƉ'hllN$arrR`o|3.a i/"nhjjStvvrE[SF[V#cT*6;ojw 0<$jU :3;=G}}8BRvddd+WDh4~z|>i_Xϋss466 q:hZrT:g ϳh">яF?K9rnWsss\յ;qj5.\^ Zl1YHJ%ɤPAvdFf,juX,r(2LˠuLL,f+s9v;X ^b&r^V'ˠU뙏D͌_)EBaI+PjF4(7F!jobm"!u ~*ɶJ0L"COƉb( Q PH%bRRlX|yꫯbXذapXdt %׏. uXl9/] _Hi Gɓ'LLLN:fff+WJ#YINCC#>$@u7mz9{b|%b?Rf?vf~Z=%ޚf]M J 29c1Ym/lR+"\B^&w~6Q ENg5;t:@0;;NŨ7d;yh[Y?\ty< u  ￟1r9ruדL8} Pbvn|>υ  yo}vF]wOM]m[nǏ_ǧORSWɧ Oضu |yNCC =3VY͡Çض yf3ΝP(줵F^|EN':Ξ=nAjK,a߾}|/gj6pAtQ TUVc!ǃACVQ(#"łVAǹ*l6+R4 333T8+HgVK4-/byB(KV# !Eͅ\b LFbt&-uG :t|>OEE"9>>.n &'1 $I:=}4x%bJZՅL&#ŏPBH1R F<(v#lAZ:|SbB>F&<#k9vu]Ņ zbmmm1)筱C&i&'}FjDhD"^o=c6  z$9/dA u8\twj*zzzذ~#TUVy}.^|b &}9eՍNx}}Y $ tiu;*N:E$CCIɜbpY鷾L&3L[h4r%BaߴRAEEbiZZZlDQϾ}IRt:~a6nŋX,Ri-Z$r;|lذ=#>˝wɮ]ؾ};P/~~9Ko}Lo.߸7da^e|^9\O_~usϡщZիyy0k׮vp244D(bǎpUWa4Yd CCC~jkk%w,D 0z+;vv32:Bee%===l߾ÇK;P(ի x铒fx@ @6n# H6ꩩMZ4/&Jqv; dYi-eQXLfΜ=C4b چc'rUWe rĂ!K~nJ%]4S3SdY4jhoD@2)+++Ig ccc477300jeff\dn $MgG'g1`ov, lXH,BЏh]\)184-ʤo~+,]J4: CDQrYQo&S%dr:d\.L.ȴrXpL&>bajz *HR~ 477 Pk4JLf3رr*X,e2ZrlWH$ JJ% K&RR"3??jcfvJ6petZ,V tq:ЀRn{+4z ~\.T*ZV~?* fشi3 ۶^A@?FzΜ9mۈTUU i[يPդo8qBA> %J & %D :dtd~ .^m Y홭N>ş~O룱>A`ʕTUUt:Yd S\I4bEnJUe%.IxbXrc|o}[ٟ}^謿D"Acc#Uյv>F'Nd``;_GRljcl|#]]]|W_w?gTWUrݵpQw30O&&P__G2_Ɨ|G>~%*ݕ jkOSBC*d|>T:uuEN$fD"ۙ/glxX 4MLLk /֮]8F :::HRR)T*fÁ MH0 Y b' Bh4pHeUWWH$`囹ׇR`jj%vO2FÃC\.$HP2@4IQSaFxPU$)l;S> fD$@X|4D&1=3CMMm p @ږ3_T*G>Q-ۗ3:<̢E$ T*KDɹ0*R*G֒H&0"P(b4cwDP$I7C2[>M! 
\H4n'JQ+dsbvQ0`H tYKFA'TQ[[X,PŽ*J(wБGYZt:JREFRiQ0;2"S3Sf}y;'N 9:͜?BN<' HfhG(ZZ2ll.?0dFTۘYn=\joŊhZfkB&Ge%qnn1 ozG[ZE8l4V!c??իFTʤJL.Z*T̔dNٽwu6(H&w}}j]NKS ?dtD"C&1<<,&7gDxkk+6|C}}{[i~LKI8.^~St_s7ɖmۘdeJV+dJtWdlY?1qY֬]˙3g… "'.H`(Ķm㦛n+*}^;W\>wAuu5EOOݻ{G\velذ^z.T*q  lݺh4*ܸ\.>ؘ5=9v&êU8q℔IU]]- b˺.;?JsYܲ~i`%NKϕJz/t:t xb 1fnne˖1;;Kmm-?0@SE-My1aѢEB!nmdRWWG9Q'娫25ՌbsڱlpLדd$oTbrj-%R$_,D*ݕEzzz'Ҿp,o>LJ>A**]ٿ{Ax/5444cii6I) AJ\i"F.̛ZX4VEJZhuZh0(UJiViPb̀N/ tlN8l.\2=ZiF#z"HEA9QxAlX$o: J.S&2??/q}VT)JV gϝ VVSeÇYz2ͨT*9d/9|0]}!w˯n`߾tZ7un ?11g?YR$Z;]&Z&c1;LV녣6mjfdٕ|BO<D"^w9B$7x#G>Νc9x< $Iկqa}{~-Ju,H͆ w=P*{.T*%v|AR|>֭[EX륢&Yb\}=znz/ [vڿ?K.%qF>, #Z={d0LF#a+\ zl6+.Zjf`4C z:;WVc1/[J% yf.^$Her'Nm4p>l6+?}6mW]7{X,GVa6p:h4jb(zHd^` `4imiŠ3H㚊 7rA,gu\\w$) :jFGuu d9Z=tF*݋``ppH$N/jF͊HV V_N&AV{ qA3RWW'$.~?/|!P(pzܬ"˘ѣ>Ο?OKS ymBgdtt̡z9y$4640:>N8T*DJΝ9ϑ#Gx9x O>$344E|cNH$>G?Q}YT*dHgg'D|h4@k*jJ8]444(Jx}Topp LOO>,{/Ot b;vL&d2s7Jkk+T Vxhnl"J1<Zg\)P*htjt.n&TӾljlxT<;;;E`rzm;_f^R(Z5CA6;3FAPEbX,^Ƨfج6}Y.ᰨojJ8ݹs'bs鳧ill$HVo`4X__ZJhjj6JEp $pW(hnHSS7ﻅT*ťWG̹t:"5R-*SNOp=E/s1YH󕗐 ^37ӟ??tg߮~͏z⦫C&0X__;Ltx@趼R=oEȖ̺٘wy1z{{Bn>Ç,,,`2v:FA`Bb1vr.0ZZ[Piֽ^ijjbp c4_ &'';m$}N'SܵakT|.2bZ_R ;3rH$IH$&4-LH$JMsS3uf p];wBWg7[HgGGrh4c=F\feezfggX,|{rqi|>۶m?HK4Iq!uh4r=I%4Z9^`0ؽ{7j>F#fz0 Ol߾<zL\uf3uIS,-ll&JDְ]e~~I\fqqt:Y__G HQ*U466Ҳ"r\dr^.7qPTtvvr$R HH,jaaiyd2),pF8~Aچ׻NOw7J%:ikkRWG"NL&MwsgΝ{ZMw[O~[}7ejz^:[, liresDjI0E1MFKjE|0P(EIZ% # HTB)hbkA&(JZuC t:~PrV!cT)c0T_U(P\rEk+"nppRI$p9kH4j5FR bD. JΙg⥋HeR[XmP,_\bYY^ZD.FxbqxH2nxy-%C{KH%8>455111A4ZxdhT^=b"A"BPe 鿊q%3IT(d3~N8Ass3HDkf/} ur9TK&qm199ΠK.aXV*l۶h$M7{AVsw ge5Bk6Qtuu|;H$.V'z{駟f`vػw/\-o|AVWVx{ٳgCV̳p!8Bq*nrmVN8A<.wկ/'?_{3###㣣h4P(p8p\C9Y&iBVa wbr9~?L&aDH$q$tm! 
366R2*|;bJn@R,E3:: eyg4ŌQL8&#\N8Ƣ.|D׋hQrSߦ'g7,~cppJp8r9, ͮTRRKŗ=JLF(0"!+IP +DbnTas sW*Ztf=h&OxL9lR+(P,jTUq!_)L 3H 1Q-fj m1$+f ł(RjMfRX,uKEB t:*=1ֽ40:6HqlDb*%gΝ{cjf}{iH&\.8pKKq|?M7,}K%Zd'l!jBjp18p8Loo|CchW b?U1#}J߿H%RJjZ3iorFB1#3ɗpU_yH_aD":ťE61f&.5T*GP[ tuu[<@Xoogz~Vdžw %vbll󶷾rxG9xV>fggҿ?q=_ $RTyinlfK޴߆VNE8l6KKK Q__ώ;X,l&J¹gikke!Ν>~Q.mxη3D!!FbeRI0^\i(hʖ-[t$ <x^*f1@`0֛TX,|a0rԊpQ!iF R/cNաR 4Rg¶m۩X22rVM:pbs:6ihg#rM" J 0( w8+ry¡.{A1_cMgeܴ{J{r ?r?mmmpÔev$b1[1:daaz`M_:ZmlP(ĻD" :yQiQ(ȿT.Պ&IW)U ڬr}d2 z-bʤjb\ ZZZMz\B>/]b(Vk ) B'VE~?ͬ{,..nsdsY, rZZGB:v}=ss"\)=Ric s𶃨*v'B \lE&a4innX*`ىD"ܺVA? g` @JE&%Hr3>6Ass3StwuqG f>0kμU|~>L&Dm/s1P#7qK?1kSH6䋂S Sg#8W_Tf4? 0W^d21<)| m{C/">}CXk n7HR:::8vS(gnn\p:h0Uk hZѕ Ӗ-"8odd^&&&hii!BUSb7S @Z)tuuQ(p\J%sD… Hx^~7p1^<".tr$3 3qǸ|2β{nGi43=;Ç>!CCCT*<Νt򶷾@@SZ VPr F4j/W7p̫3՟.\ `1[L&JE]]Px%kf^ygvq| _L.KPT.̑#GXZZBZ*CC|;. wx X|r9 njE;,..g._đ#G||xzX,ŋ0Y-$҂l6S09L=ίt;4|5͍7|/}{{ ;'h4o|#AP*A.ko~ /pfLfJu:fff'8HgEbbX q(tz2p#|K%1fA(b464R*Di49~8$Im2G46Dimm%S__MMM4q,;x;UY^YfA\X/OO˿+~7~d8/ڵk={ьFcEQӋL&C>R.c~vߺoСLMLP,l0r ӓS"=pw =|r & ƱcǰX,tttJhj-#rLRfQTY,|_QtZ#fHhmnAP033Ў\~Vǹ ?ȧox۷~_YIq2 &SCw%ilj S)Fb~n: RCGg.ѻ\egv~N 0d24bNDž صs7tttH3T%<W E< ́TpEfgg4M-ׯ&B M&ڵUZZZX[[# bZmѱ:{XY]qc6??/f444Q*XX< cLo{D*(2)A\)*&ZF% *^yi7t#ˑ"uV$I :J`0HRAѡ,//w^WFCC}#Nn {QV eN/غ 92Rʥ<_FORʺׇFB#K$3r!XBT15=Nf󋋋\p8LRZ[B@Zl6#B mLNB!h$|N"`vVoZnJE500L&,gϞvijF"zgl۶*eo~3'ٳ߷_{S}}=psS*I{bgw?6\Ö+gq3I7"v(~uH#դK3_wTZ)$  22*dHKe(GTd) HPVT5:C eTR}]"ͦ)rԙحuT%k44ѩq;ިQ(h1re F=p3&opE8;ngvfw?v??IΞ>G.v򣬭 ilndrrBF*czzx/W}^]ʍkױZٹi:33D̑^ϡ[ hF!w~Wqs9zT VVطo/Pad--MLMM B0鴓HĀ @B!dž˭>H8pAFF3jD2T+WQju|[~o N;"28+Ie3gV$ n7Opǹ;ʕ6.^Bgw/RZKAzz)U嫸<(Zl%@ֱ{ @iU,Vc'g>ls\rY_7 ^8~FJ"\z \LPT`0)؜N R ccsطwd ٌ/Tgpr{LLͱ}.| ]=8LC[ /^@c6rutp"ƪKIRL% O|I:-GoPVPi5AT* \ThF֢י+*r"MmT"+)+YlNe-$P5ot:ʊ-W? 
#q$2)::d:ZT.A} D4jqwPJZpht,*hjjTT1:&GF)ɤF> NgH2.a0uTUuF#2D*I0YiDV!Se1MBA44z(SWB6 }GK,Q_F9JXp84Bt6O4`4"QBC!/x< 8l623S'N7R>r2 zVxӯ*'СC_%===)J G~WEt:(H$PHezsq/˜"fUkgRT"~3ޫU ժ䵀U_c!6S_EOUjU린-͢hEZ0?0FqwQk̓bJI.F/IH`(BN2jC06zOs%J9PT*]O*j5>8}:{7rfrԻiN}l6nq0P0lfltx,F,ի 1<Qǹλoz#RL:*Fz[ĭ#c`` gNsC~y511 /vCAv uuB SS;vp8֭[A*_g1??ݻY\\dppNG}6Z H2v ۷\I&VN/0@jY˩* ſkջP(֊~I-'E<`kVF'z}}5*J2jF#T(W*$S)l6+X (mF.C. 1,.-bXdqJ`8\Zc4RRL$EDdR)JI"WE& "r ]`r9TjV CjN.x<qUiYY]P-j\6O6`0077'Fa!(`ZY__GղDWWsss456e> Lk$au H0 t yz3 rƆp8,T ,rd2AcuMZ&  L4D.ԄR 9]V+2pz* .044k4.S'YYYu.'T;]H$޵K/+ƍtup B0pJY㵢u?Pȿl93?m+C׿1k1zW$?B Q\) >|fVPe ^g_:===y3SSd2a- l>KGx"_pˁhuZ666\z} Edr1R! mRiJXYY%ߋT&o& gbj|: q&'' 46z8{ NN2ǎ?_}#Dhomc۶m * x DŽy4bC,J%#7F<+׮7  @CCs:tMoll G4?v3l߾qg``@tww36>^O0瘟J}z;Æ_`hT"|xxAw)d3(ʟRQ}uLOb/f^KL0 j*DjBbBJd2`R H3ٵk`޴d2xP*H_0|"x;hnkEo4\MG̉S/illvHXh4N>+yZZ{ÊB.ѣ o&lC勵UJ%X۷sYo񿦵vxI]tttpY4 =]i6{%8H$Hkt::;vxE"|_LI&d2)v؅Sz-|{qr%jL&Gss# E0"CCC,//ӷuIVVV [[[:KNNTt&}b'$rosM7ēOw^#9֚ElXn369Foo/jh4l&^cd2FnB*l6P(X__f111!;x<15]R X[[\.8s/ X^^e1yb}|T*Z lAp JŊxVT P(k.d2)K@)tp LLv)8yinnի `zbHD"#NcB r$I4TV`0([jx HӘfR1hZ\\5Np8L&X,&ҬsA,|ŢFvDh& $ !LFB!RrVx}>Z-rB.hWcd3E*z@-QYAos8[{*68ī3twQiuTe\vmFR)%j=fSg2ĢX jtt S3sLLc6a2PHe8N&'bo[];wQEʟٟ+n;trFC\Ͻ@\`ͻF)S;OЌV~!6?-f~q-}M;3W$ɿA\-fjcR MB@J~SCA~?O=h}ߍJP,P,Q)UBT%LU7JvSV)T%^F_?;v:zǏ3gܾ|>:Z.Q?5oa[055y Lr# _Ȯ]d2Mnrwf;xZOLON3r}iLf B8qJB8_KKK Evc,..M1)uhmk%ϑʤp!nvt TB8େ-+H%rΜ=þoajzbL\D#$ Fٹc==r,]m > QR)@m3Sbn333P(ps]Cdz[nE$[-F#=]="ncp:\TΎ8\4J&gfH$4peqaxm۶ŐJx<^}C JYDwV?R{\򟢘( T.^j5Rq"5l|$+ t|#ʘoez@`KE8s W]&Ztދb<<P*XV_&b4 4ԻY^^U GuwˡHH, eiiL&˶mۙehh7Nȑ70>>A{{##7P*U!R }\.|x$2/^B*p7JdB!m ~HnH0LOOIww7{o]g]#ݶd{$vB8\=--=m\W{[텴@@$!M%Y}H3sxFO(%K!={_ze=}>1ͱgnN')N j5b*LFZ~(ÃEciq{R8|p<\.'r?cQDߌVVNLAss&XFx6m%͊CfhD,*)K,..I}A5RX"N.gΜn355%FFhnn"*D"R_W{h4B(BV9c4fsTjk+55K TʄB4p<RVy:p2$LJEvJ5J1xDk<:8$m4 b|/#Q*ɝ.]DKS t:@ud.7#G0P -BG6tQ,!fS*frT+hnjalb\f 6#BՒfQR$v<;- %sfUu*2 ͯ [j3rYgsY4j x_~n \vzrYhfRADVEnb,8].r9K++twwo~4-/^`ӦMDQ<< ,--IgqAX__r1}f N'ܷww.]d2v[Co0`Zp:<3<طhh妛n⥗^Qr;TǏs]o'ˡVX YP,D|rCktwwߏ 
XvDL40nڴ s[x"w}7?<{/]uss3bZ~q~-L]]F}sj5ȐKt]vm@lh>0 6LMMM a6c̈́atD"N&+++,--i&Ub(1G6jba"d}{Q,|i$fLƾ}X^^b;) 7"1_ c 4Z- >,w}7O>${̙3<=IjA䬌\CRIBӿ9lv;TV~x,Fgg'N]+466r1 9BE,V%Ho`yeE삲,,.v9%ujPPTJilTl6JajedrPHꂫAJ%BX,R$LJ 濱͉BKK |NVF3z5屘T*Ce%BhD.^ܜUױr LN׉L&'NK@ T+U ͬL$sYɔؖT_y% ]e׮]OT(du7ᙩ_3o2[_/dUZ#iԚWeHo|bdY)aG=ϿѣpenfcV) ^_ L&G&TOJe8. Kt348L8d4;!c-FOw/VUvJYXX@|/o};|#ﺋVV }$q\B.:KF29ZbH*H$'v&''P(p:0MH,iwpe Z}{"3<<,Ec044DP`0H X6&&&ؼy3Lv$6m4*%*vR@sS˗.t:agdF%<^h䮈֖t:ccc9sF:T ۍb!/کLӉ磫 :;oRZ-ƤKhram6ӓ#˘\.Kbappݎ^WK9 \!JkHn:m۶_Nc׮D1,j"fQ1L"Eabb&VꙘ__\pN8Lq\,/p8Z2bNk455a1:rZ$VVs
QV$S\.G)W|H4l430O6p\115n\Pd9uu~rI&rrL&F/ nx"NPzT*pl6b~lZ\j!t:M\<2]]]\vF5Fh2Ll6fG"`5[)L & ka$*N#ϣP*p>D"XK$Fd 9v]*4 hM\.gnn^Nr蠵:nIϋ/$P@TT*t RĥKbbj䕺~:RAk! }#ܿPwV_FjU 1W+5o0 U @&P,ɾHGСX[_l.mok_>>o"JJ"rrP(򗿌Rd2A)obA?kjR`jjrٳgY]]Ewϕ˗4vcCvPPU$ F#rŞ]|C$ZEE~~I&ϳmk?r)jZS466gn ٶmn=>4{va5J  G-Ҟ ϓfbRڤ\,rEN}mml6K{k;ɄZZ[ZZ$o("h<.Akkh-c˖-⚱Ǐc2(kF7nT*bii9zBp8L իl۶ .1=3ޝ{%^cώ=XlbZ"f#S__@V;h5624<'QxX&q(1IӨT*U(it X-zQ),n[dN&\> rdb r\T4"DP($liiIJHzimm*^S U>R\eYvثN#fEo2h63??O>') E,vY^YT*"?SS\!ZB!v;KKK$qkk?,r.ZR H];wb9w]ݬP\F&˓/䙝bI4>ЀVyGh<;WW1,-.rY>?DVrb1mmmeYLF3li{vn188Hsc#Vc׆ػwzJj!Ghnnbxx6~\HX5\NP+WVh6L&T FCC=gillDVQ_gaa&7ɤz}J%|>N@cM5ۇPXY^[z)D$^GZsDQ{ngjj'x{W2.,,`4 BھbHoW/fxwr]  RR*t2&D2J$ϢVy,.-0#<®ݻx[lD:ugbrJI6#HP(Y\\MDK/rT* ӍGT266IFR$:_gSo/WJ{Yá6 @ DF^~Xfn<-xMd=$^bd$j"x @[әTPE&)g>V(QVdt"kFh0J~91^Tj'>fAV#IS(Q((Jʂ0:s b"h2"Y__!@8\.Kʥf#uVJ5bUnw*CG2\.PcXm6.]D^:#Blv1 ψ+5;~`R*ڵ-N2^/줓))/<(-&`0կ~6Tj5~_2j4:;;ڬd2)Q"7r}L6>ȇwyrdD*E0B&H%VVCmr:?ƓO>?qZ[V * fmm B3-|G].$/0r;w瞣 ,6+u~?NFZhhؿ?}轤2)|>ϟghZEF͛)Mn/0  .]LP|Dsu*uuurm+Bkk+7QT>Tj7n`Ϟ=a4ٶm}{Ƣb-A4BOg q^vyI4TFr)KGhjz Mdpl\Zx{éS;H۳PImnP*ٺu+dVC<`˖-d2L&X6 mdrtvGo?ƽtJP{\t;vHmf)VQx}24mmm(rH-EdeyeG4 &өJE,aDƑ\N&!J\VK4EoГJPzh5М( *OLl`x6Pf||FMXRxZXk4$u'Rj1BR ɠhxn\lx^*%&TcXe24 D"&&imn% d2FFFhkkĉq8RjNC{SnRQ4lb1vžP(e LB0~?Y~0SCٯ0SVH bRP*;O~|DբՈVѣr-KaFK<!NOu#ӯRZd60[\~'^?/>"MM^ESQkD1'&x>o YY thl PCR3G Rp8B2t53id}[>sUtz] jwb<8bdR.c2P\xP ݳY IDATGwO۶ml?eA T[;8pn4j s| //b0 9s^ZNO__XvsXX.1 4brr&ǩVɓ'^ >OL{{;G>BGG7D?G03;CXdnnJ1z{{Q*2), pW&gVI rD__K( ]F\fnnH,pYZ W^o ^+WпY1q27/%>c1Z( D"|frzNFDఋHbHww7Zt\͵*џNm B2kk$qvKKh4:e'0ry2ˍF+ /ɄFnUkh5T*qQB a4Q5 e@NGҒLH3h:p:]Xd9 z=RQ2,6WޤrLSIb1ZZ(K."N\(֣d6*r:RRX (5 "SsUbdbjZLݸqC2'I:/imnETRT:EoW7Uxf;Dkk)рBӰh" o 3R Șgyy];vHU\YE.Sp(_8FWW7ZIfe(uN'3t:|^/nazzٶm+33|_yxٛdfޒa*K2_7\d&FUєxVS^2:x@e2" NQkZ: "` /KKK``0pmw x۷oghh򲈱w\{ȑ#L&/rmߏF_׀Vcrf@}Cgϝd2ZZ9}4<* >y-³>yq ' z{{`0H[[D"y(WVë 0o-\.'nSB$ ??яQS_ݻI|Rqf3>(D62 r;v000 izzzL4e۶mnABAKK `xq({VF#}}}y Dmj^?ݵ.{9jܭVq:vFFFhZN'---R<|֭$ wsnioogdd>n&z=kkkβyfR |[ߠY/l6&''X, cccvהX"&r9s 
sbKuvѿ#S*,//4aYX^`rr&ZQ~8jNd2x6~aFGG9uNVXZZ⮻&RǏyɓӃnGT+++tvA*ctt^/7yqΦWC~w浈_їUe&Jb4(r-v~/?oX,Y,)\nv, yE\F\:zfVC!&3@֭(J*GyG+9|>yivۘ1==lۺ3gϠT*xy׻;~˗–>^ҩ;o+P*;s?|֖f뙚z;*@UR*}OŸi͈A&rx(45+p\a[Rr>B!*Miݵ"}$j)8uTX,8:Axlъ.Μ9Csk3sSR*lbi~gnvx4`-gNxY^Z3vc+^`aqAB$^g?Z[[qV+&@@\chַ᫴6!TrJq:]kyu^#Lb[Q($SI ,uZ&cPTbQ#܈+(FܽP(P*n/u@&*JP3~:mY^ A*BR1=3Mg{'2H4B׺UB@X.jBhdm}PSG2@}Ao=rpV555Ekk+P5I0ߛL&\NzB@4CZ`0H\tR_om5?@R 3U4*5 n[Jib$h4b٤+;ILtʋV/DJqV3A(ɤ(rT r \I|>PJ Dh-4[aX__{M1A] JJPP T*1RZ&|F|Q%Ef>M8&ɰ{nb\ޞ^ 4!BB&ejjV!MJCGwOդsY  F \7ߢ7K/sct;vpctPf9D"o4 dBڊ$җP(I8jmyJB *F_ns'eopovjOy7I^/RIL,ˑTjb8_?hu5NQ)bqؾcXfXPr-L\EϪJT`2sinBE N ٿ?3uY . pˁT%baޡCK/s~Vҙ,X4F.G(G)J,/.gn~i6NeQ(59N7eB6jqff2D߼CLfT:-J%lV 9< s _k.7R)fFwW7GnxI6mނADZwNgؽg/CCJ6mj(Lo&ୠH,ja}Y*5KA{z+Ν>@* yof[?O}}=Of fHRx\,x L3u:(RYb , NcCLB.ag^ikO|`ppH왙BQ.Y\\BV3s8]n2X"A.G7 T(U*Ξ;lDihPr,@r يJTU*v]ZE=jD"x<e\*"K y :=*J*%BFP.cP2R]=5 e^^F!ˉʊPh027;&c2y T%2a D \ASs3F:=.͆VCTi47iim,TJ(DGW7F z6mbfn}ߠ+/L[[O?4@c/G7X] h1- lI*VN'#ׯWhZrݽ~ϴOBFʿ8-?M3E7jɤט oExec)SW׷S*,rY4)(䢌o| TJE2bdd[o9@ZŠYmق Hŕ*fs9*5l=dT.ZWNŽYXX]b)K݉J>|3a-$),f3h٩9?KRF.Έ77̼ *Qjxk)/xA_s*_W~Ref DݢB}Ȟ={+TrRN>wݍB@RLB\^ERϟGcB0L*۶K.7܄l&HRhmmPClO"Ŷm|ӟG?t`>sU>OqItz='Nl63;;+kŗC4ŕ5JZ\TP?RߏB(U*zܴR,iooƍ|,.,JX[[㮻B.suuL y嗹y> ɸq qJ"SSSt:qp ۷oP8$\c5[z n6_a֭\"ޖlcQ6037&;w?--ͤRIZ[[8w,bm7Ifss 6#Gp !B.YK͎;PTLF![)-vb(Wq{! عQ^/_7VE LLL`6ZQ,&6L&DZWWB.F/^H}}=vRI8_-AVWW`hdzffgH%a%t&EP h0 xndr$ FNĤVAFUuzdrbRE}4ɤ88d4j +|AI(&SlF9 #ZJ\"TZDN#P#IIAVm HDR pp5Hg{' FtZ&X<`d=B`uu%&t:۠D{<^4j 2XT*1LK%}B   󳶶d@o4T)HS~x1ҝ}O1\qj( J ^.Ш4?0H3~zI?x}V=-~=̼I,op,y"J%O<}2,=$.bQBk4;:LNGw"- ]Zѣuc4%pUh5H[k+y{QULNMf#K0dl|lBn_W֭[9zTL t{a=?0cxxLl .FOW ouX,Ra}}D2JlFGGٲe wcbbK,cddۍL& nvJeG;朗r9 .O(bZټy36_++|><,,,B:r3>>-ӧa-@(J3 Cf~a6iiiPK,..r]a(ʬG"477L:ǏtJd7BJW 4㤒If J[[x啗 l'Jɓ'9pAr,8qZ4wfuuUjވrZ5 c>VK(T*sT744EqtZ-X\z@CԄB.q?%d2f$S qyŘ3ހ) *T 8Ȑ9lFlj, jVK"q3 IDATL%1 J%F#NӍPJ%v#2km)|d2I? 
4J% NR$%VWWi 4 ._LWG5Q՛l1OıX,DQUtd3TULz@REZN&"bBX,"" u>hmM,, e&#h AԵwj%HrwR8NJTesf::8}4yF-ȼ^fffضm HWT477OLLMd2)E@ATM$L#O=%~ox=b0==͟ɟׇV$kt5[9̼VDYsol0afc@uzdH&|_E|Pֈ* L[{;՟t*VT+[g ) P/]$˒u4662rmVmNgaaz|>cqy ӉL&ӫ0M-)T*kkkѧ?__c4 >c47?]~DvΝ;֭|>  zۃ_gii=\.!*{_Wq̾}$Kcvb; BBmp[ty Bm7@  d$8!xWYeьfg~I ʯ#̜UU\:?Owa:::… % w|'FM(t:tz)`\{EPgwí %0477с ۅ֭[I% ^7ȳ>KϖVVVx46266J{{l}|ouk04t*‘uR$\d*B$ N+fΟ?ώ;0 DQ ntzR[XcIj:;:D6%gKӓr9op_ױl!*U'LNNYXX͂L&:T*5[nE H&SRSSC:aiiF7n/_t233#ek:3bDG{;`PbڻD#t:hom'Jb4Y aX)ERD(0\N"#K dڂn2FBl!LHɿ& Hvt:M<'#8njR)rfeJ$,,E"cX-,^O,fB@(XT*ET H+F2 dsawl9VVX*b6d`2FC}hzx,B@&Ȩ!!ſ죣Z(D6ŒOr9q;]O D|&> =Zd"΍30RiwR뭥P366F&7u( MKE>bp 33"$t[ߒڋ"J,}{H4/I+0S_P"ɥJbIDϜ9`G.KG&I b({Te]U0Vךj~w݃bqmt,,,VvYFF.S]]4 GցsSwHń}; lG>pLLLp7ryjjj^~?>?Ƹ~^  ={A&!p)XQ###l۶/}K~ |ͬ---lݺ(qzv ұcT԰NOOYX,vlߎRDU311$CgqĝZ>/e1"yG\viB Ra>߻w/7x}_` >.2J۳*rYz^"jJ#g}Oz-]"W]]\.giiId*G{܅qdsy$ !]]]( V1 qslNy`vvJ-43L&q:],--RH$d@lvݕ.!JT r0  B/v=[[l%ψ/)=ZRI$Aŧi2A*, I \.F\^,4UX͚sT('QZ-vMh4))3j^/ش* h@VIb[ P*RJ:nΦQXMVHm&&j5+ֈUtuVVn"K#Gyɉ',--|P(hnknjXL^@p lv+`łlS__Vf.l6zIRJ%xrf( e4j5/2э˔en7#W.s{ k J1=1g>22erQӤVgުu[x7Ys+0ٮ9&RĐ%@.crboE֐dfs@ LR\&X, LMJEdFC*%b8p8,R,.-pzJEN>(tVB{K7׮sN:ۿ[nZ-bjoO"W(?CPr1 044ĭ2;3Rf2?;2Hgil6Obee}xļkccgnL&022Yg߾}|k_v;vh0pUvnr7r N:D.e=88.M"pp VWWzq:,j;whjjرg .K/YIGer{n.m$SIzY r<fp;~<&EBBXmv^:P ճ(LLNNb2FÎ;xC}]r9 (+wJx<ΥK$]y_wۇ9mz)p1sIbd/JA}`EOIdr9zB6nTp\RWWG*d2dp8-fV vH4lH(łr`mm VK*dmm $H>ZJ@P"يWVS.PTb1:=rh,*JVE&h5RB6QkkkRF&R$+ɠQ(j2,x-,,.rČ@ @{k;DL:+AR#8.Ъ5DYb(ZFB #Ͱb_޼x<f3T ,B"Ɉ (T +X㏳}6'O`98mR%- .t:ڵk J?EJtLPL3m.v{ضmb֖6|> P(f ) ,/ rrAŃ 2,7_m 3o B *HE [vCSSts1r>HT~$jjʔ*efo(@Q/>DTjD7?S,(ⅸ%R)q'k ǏK_2/^LOO344Ĺm&LnO'?8q$9qsJJfogG'?سgܴ&#twu[F @04tݻw0 ×Y]]6ru KhZ<'N==>0UլBT*)*_ѰĎ!d dZP(D{+㒪*^B!i,b6H\. 
BR.Mggd4x#G+ 077GGG  ٹs'br+^7f9r[fyyBTNt:M2@lbqixѣ|LMMlJE$aZ[)K~T*uuu`wSoؙ_`0V(Eh4b(kX,jT.d >P"Ibؽl6KJ|te6+&ZlZI$EɷbR$.H%P13;-,f lF[-VQ'3`0CVI3* J2no tѫWաT*IRSYOQFL& U...rmOr8{,}}}f(xBuwtS(ꢡAJK$d2+`.~?x'ObJ d&LLPh*Μ8.hub M*EM&_,yu?h$ů_43) 1-l!~g?#`@TJbV ?_s~ =~(ߧRY,rLFX_1>9F\Td2zjsHWw7u\¿?_x0M>n g ,M{{SSx\{N ! /$ dr9t&C<g=,-.D7 q;vM6cc#Jwz/>韸;Y\XJVȡC(TKyGQ)|?s Lhxz-\8(mlf k+,--ڊlj1c4Ƣ)X^^ βe2 dF6LQi2ф\dfnd2ɉ'Fqeٲe *Gd4RNIQE0d6vbZx"P,j1,,,zI&vM|E e)(θ'NPSS`߲P($U2h466"I"XZT*1V4[ڻ6ĞdM"xr9]LMOQWWٳgNCF/ P[G&D@*o4Z'l6WbD"jPh4,zʔE!QIdreX^^h4199Euu x\ 1M$I(Jr|!_a Q*9@YԨbl%u:y2X/]Q fYb7Fd&MwH$0YLylɄB`mu ׋ bS} @")ڭe *hzeլ"U KE6>Oz .eH( bYYZZdjjf9 <UU^l68DXY]C^^Sܹf, :R䡇o+R%NOO*rJX׀ {M+koyshD3#ː+P(P*={-XViA(JK$*R &^+*$͌t<A&#l?GGGlDJϜ9g"!;pń lٲ'Nrmwns>LsS#mmm|S{Eww7bK0\pV=JSS:x 6ߛn)\Nǎ|փr!nzYX\nI*/_Q 18qHmm-GE&N__wd9z~<ȁ8{GgGn*>2F%NLTd2~]]]WVI/as:y0#^_3_ٜ I.T*q:;B=L2Fqփܽ/p0ͭ-!yomNzil6{Ek_*f^xٶm---Ŭ W_?SO= 7呇Fejj#ٺ" Z[EE0 Ab*nul:F,L&X,qC!*$Xl߶|@,9\n7KK><*.\Ν;e~~d2I:Fpk뤞Wj$I|Kbud"܊B"_(NCTT @MU f߲+WN&'NpM7P|ߔжp8LcS#cccht.^P( bZa`ee6)sJvnnTJ%&X+e;v`zzlLpnv|.OsCtw0=9B`zf^VW=*'L&l+1U`2Y1%bFa-F._񰲲Hzp<grrRW{yKx<8vnx".el76k6ٙV,ZRIo$ cXeZ TJ%:TWWO$q YIR8NI$444T\y9Z56l:Cw{ǎeRVאdimʄA(\LOM;:}K`mk2z*c= {s zHg3DaSC6ciϵkc۷d2R桇H}E%X̜:LJ~VerpԷI`_ ނf3?T ]( 4W*ͰT.}/֭[ مJh IDAT4440>><,VX\ΥaGyן+n&?N}}=_swrNB0??СCkkkvm$SI6f3AvE8HL6裏ѫJ%~a~7~ӧOӷ1&&&@N>{( ~'vBIww7/"vejjjxu F"O GxW000.B!F#ڵ Ǎ7H4DV_lj'项V8AլJ,zqY\\ #1\tF\"KE٬ oEFKEEZĿFXTZ-BZ-fdRԣi &&io$z*`Ziiiᥗ^ADmTiɹzPl0eTy|`(( FIFq(+#l7+fhZ(fYl FFz=E%IUDQ(5ɔt:iu$S z8Dsc tյUa߾}}wtz U.!ZZZa[6zSSS8bbcHO!Hbd'LfV|,ZeX0TO_%-E foЯ}(?$v*Kd0֏fШ53iA@ְgn]UUU ]PTDx3v%UgcdvB~;z4r ۿ_z r5z=@{[zH,F,*̥I 555J%j5VUS,%V LVC<gqqZZ[["Pco+tttpQ:;p,*< vkkP՜:uݻY ”ʔX 3??jCߧi  OӔJ%:._Lss3XFfggd28N<*QQd;77ǻ}/XVN<-B<BA$@,Q\^VԴlb:Z۸p[nŋtwHIةTJS8NB! 
ŜqSdq4)Q$ J% D*!cJj(hPē8H4N#HR.@UG =Juu5Ν/R8s / S]a N' Q_@<'|75dYMMM&TĔ+poaaSO4Ga˖-'hwZbemm[ܱ`Zygyig||A._ۅM&._M7-·mx&$N__h N$HK>2ު*$]]$ ; 8s FF, ]n)m5aqTR!ĈP,۷ogiiBuI>fFѦn||r:8q;wrIvك\.G`ݻvL{k;PhłR]|>.'HH@.=L4w( RĂ13OWgf*fFƯnozSE3 ŲoRb<^4u 9rD2VVA2k7y*Hm۶qYz=L&C&Q zꩧzrA{9022Bmm-lY>hkkc~~F4UUUb֭q(­V1_~e >{G2ޕJ%tM SWJss3O~_~ez=&Y!Çs%~az2t4 ;wQ>FÞAZ-gslP(B|gY ǩm' :8mm<FR ?$l%^|n@ 8} r+WFQm'ЂJ,UUUh4t:KLollxx<$ ir8,KKKgf9L&G`َͱA2)Fٿ?"Hr8hkk#JQ.ٻw/. dBѰw^L&MMMH{/Yf3 K>RdJ%O?4MMM NVK$6UQ],Jl,BA֒N9xȤ"1;3\&faZY^&&,(bXa26:bQoyҕ|> ͘ &xłUryݗf)D' QiX,8T*kKĈbh-(Z鴤Z__'2H$Bc}#jl6TJmJ1a0P*G,A=kyI_W,< *sQ(8yԕe}]ǎkm'_o/dikoF4{￟ ;Z)T*Z%j1ʘNI$i(BN,FFA@@j/r913n, 7O@o(r2A)(rƒ~+Uj  )$͌PL T2kLF*DדQo7~#apn2HLF]F$]wپcWVgKOfq${߃V᝷?y{CVsvؿ? MKK =,,TP`%F#-NZ{%i|ѳ|+T&y͕+Wxw裏rA/ $ m߆RsidNVVW J&q8BӧNaw:P*嬯&4ߣSdTtzN"RNggLMr9褯T*ΨGRLNӧOsmt6\Dđ%jkkY][PQ+yښn zv"FgW|jTy'bJef BXVbRh4,^P8 2RL=s Ė-[xill$JjYYYA&IĮ]Ǟݻfl߾ijvTM%47KQ*PrdX̓5,-!b#P K.D׋Z,xK yўm0a)SBS*(L6U ,a2YXP,tRJFTUU177':@*L$R&SӉ]M '|ZVʥc:յU)c&ˑH$+Lf6*P($2l6b&KzioN0$d2jp=X,V[r.5NA(K%m׊h*3jg}_z_4̫AOJE]zv(DEpعp"++r2@6A S|" Z2^-.Wr1Ud2055VAX-Jwu69(C{['7콁gymtod0wn~x{o~዗hhhܹMFG(W8}jm;vpcv6bYZ^blb*N-['؝jji l[oxH(~}$SIkʔ˴156ɻz_rLN!(>6$SI^p$LT`zfήN^FGö S vk4]N6'Mf ~ NLf vWűX,211Vq|k2:v-:s%-GAZZZhinfI)GA;n6_@" b5[YY]pL}Ck`HBA,F ΢dՂjCѐL&Q444Q.j#NڊJA7h1,8n K#ۏyV |>oq~}Qʲ-جVr9 MV>6"5غ F2`[(r4rdrzW/ssu;F#z2.7'JQn|N=^7:Iq +lyeC[~:Uz}z} bd\WLT~Lf_qyu?m@o~n`$|.Q3\x m6#~iXt huD6"7} n2;}|\`;X"Jm] RG[[Ξl2SHTcvM-{wE~]G=ZgϜGD %\ΑL$(Jn8q\\^t[9&&1[\rlB@uu5cbПbaaُba~??<:v4 %0D1۹v(T >ӧ)J{h;w:ٷ?`χRR(ft*-m4J533LMD8\>j-}xgN/+&|KhZT*%nÎb`t:Y m`Mx<^R ---twwNQ*𳮮bHsC3ϟQJ}GH$b8UN ٳgٹ:.\g*޾^^82nlT* Q(e2vqnJ6~l.GccCڈ`f=ih%@ib/L.n3i$6p$_{'eʘLFbEmrJ%d@D[/MpR~_^J%ǎQ~M729r\^n2of~يן+y@s3y+`sdҴr9z!>%|M o y<}|~?M  (:.Ot:Kx<~ 1;;ÇV6;|mD"29Lfѩᰣje L|ޏL& jU*;T()ʕD5z={xOފ`VG7D"an2 FM"'q\zk\x{G}g w, HdJTl%Ďxu]'=N$\E[,H$@t `{?ݵo՜C 3{<'lf8N:;|\.H$n 7 F*\HS?x DsS3v(5~b(CCo`mmFzK\&|;w'p8…!_F ̬V+t\!Z[x++AX 9u|{ߣx<αcP*D<;w$1>> ߿b+F6HD4ګ4J@Ku),h"!JX 85H)KiD 
;wdШ47516:J`ՏRL&C[Go]FU6϶$p:d2a5YY/1?7GUUFp4L.euuӉjg5Z.N)VL6fiyQ,#X66pNG(T IDATUA8Fq\DQzct:"VLXfſQ/5Z[ZIgxqqQf)`0r2MG2[l2p9\Db. dX[# IE 9cCGL0de|+7;ڽI<|K_b~~/| d"fijjڵk\2BGصkCA4 |EӖӖy. H$H&Sq߽sirZ G#\xn7j7x-)i}Vfu(O3y+ kM.C Ì읊f]ߋmwLʟqN|#ǾYH&d(x|'沼~=\g޽WDw)Xcqz+L^uFFFx|Qz 9|0 W~2fhhw1==́C Π'u&5Ѭ299I>[ HXWO]]aϮ~oŋ<H%vsq=rBF( h4r db;G{={ ޾\N.Wř3gbw.Μ9#ҵ5P_[8x:fgx7%N2>1N4#Jrq8?fV(f F|[ej4b4p8393IR)A88e8/7>̍ ;+~jDJ.'̪e8N_kc fff$/֦(fks[t]"cwwH[NO>/R.zB`kkKTk m0MRLѰ%%@( z^mi/5_Z\ZfJ0-vT*`׋b' jBVIR0QTޠgſR%!Q<ɩI)VE9a}#DLg2/`2p]6tRd2%ĺ,Z+LwWRI*;S ޹ѱa4|I24|vi#A$EVSY顾 x^QKPYI4APGx*%L>t:MKKs+zuxykxXLM B)4ӿ2L&%B@&5߻*ewwfY"IS*+_JY pW\!D\)pp-GŇ&6PI/" I$ilv &Ee/^D9r3o;tZԜe۷}\.'D}jfM{9Z9{,K4Vzf\'jjkK/vZLNNŋcXsw}v[Z zh4i|>{ =axx << LOOcw8hhh@3q::cf~~NGkk+lQVVVݿ@"errh4?>V+҉W_}Uj+>q/Ogk'KxlllZ86L IUU~?!ܹEiB\.@d2TzzLL\E⡇btt:N<<$*8ss3467?b``3gz%Fq;D k"'Or e:oj$CEWW7&hoogjj;vr}Tg6YYYfx,..RQQAGrm\Р++*)Q"O@[[֒$<=]=]x<..ffpIbMQ8DZl &PWL&b T:%s bJPfkD"Auu5EX,2;;KMM &h4&M MLLD& EA7LX, 8+erPw%rq/b0?@bJEؿ|; S__VCիWh$Iq9\l7D(Az{>qH$…7>2F+#z~__LNxj^~e~n6Mjѿ# cH$/U$"'+3]:pV323D~[<GӣRhljK_am+3k /}R"~gcccb33R,-,e1:0Mol07G,lllaڽd2)ʭNQh4jZ[[ill JN@ѰH{kt8 ZZxo7ꪟt:EuuJ:vQW[ZoG/Zgiynڿ&h-R)nfffhkk]-LNNroTX$FXb ALf o]bmmNd@?n3318z6X[[#s}a2C.sA,d2I]]w illd%jjjx}o%p6(prr^|E,[fשkg}cֶ6LΐH&;8\N]'_(70[,tK uH]vSAneiy9V7yȇ?8B0r9jSN}>jȤ3twhhg2L64 d6QD8t! #/`4BVڢac=F-RFf'/{O,U)TV`p$Lc}0 &RfH6'V& ΀NC2 ⽻yX斨1FΊ@;ʍ7ef@3JFۃN%˲D[KjWJRiѳ!kjj jjjDѲl$WWu]@$}SNs|>__{ByimD"…Adrp;,-/_$7|32RI}mo^ 3339ri7fI?LYI,8_?J{Gse/c}cOI&جFX2UzTefu?SaJρw=32f$/ї mo-0 d,.Q*c)nrM|F`x-{9q!Z9)q1-8bJq݄B!M\ۜ-}>J8z(DIUsEѨЈ%355糿y*29===\rE8%ڨ^M476d,,ҿw/( IB^```lj;^W^Ds)~7'hmmW~?y^=2wCc#5|>KޏoG:-۵KKK߿_~R<AFFDcxx#Goabb>&&&ho.v٬`%a,Jv[A U\LOǥb:ISvbyyd2Iuo"w?p8\Qa0de ģ1<O ''RD}}CDcB>29x2! 
""ѶP(\5uN?>;wĠ3~~~<3}:M JAAƌEQsw?~z5kְ`];3%EL4 o7oo&>>aՊU44" Qfײ)l1c6e˸BছnB*!gԩ[?δS9t ,ߥT*ټy;KY̼GKg} F"0qDN9_/մZϧwZo;'dddZ/evʾh2e >(q@(b\׭VͿcwL'Lg #Kv HUL&C.%ロ/U\n@Ҍ76*򆟿6Dۃǁ YյvNHRR)>~TVVBs_t.]6&++r*LFCCTUUƉG?~;F///*++q:1SNT* C.SUU8%$=_+PUU;IIIcίwLfA !I 6;v | Zo%#555dgg}(VH$իŋ~ʚkySOoNn,&^-k׮474be$T0 8C 9|{^|9.]ʡC $EyG)//gٲe,yNfHK;͆J{便KY ct҅:O۷/w?&зyfFFAi|mHDbۢfÌg_xNL3F}WP ^'0C `4 ]m W aJeO!BN;";NV\ϯdD0|}}uZ 444[?ٱcpyJ8qlٺgy~~C#˗/3h@$..,0H$֏:֮] U3% ___xnf3YYY45514iMZJop@`huxڧ7ځbjtM$Zu:N׮]#4,ł`";;Pt e?@\\wr9U$''S\\Lmm-q݈mۙ=s6oɓ'yu̚9֭cMɩ|!(=DwٓHɲexŬ^F(<( \D.sIHHB,s[XG^;'O2fΜM_BXX%%%PSSCll,$&$cYz!JJJX,Fn]P[`0`4ikkC,Bbb"&$jjjx<` @tTWƍ7swյ+;aau+Wft:j1 ـB&ѣ׏g2r(jcÆ Hl;ӫh[x碵=L3uL'tL'̸}eDvU׊n˛,$77-[`p_oGӍ͂H$pE"vGv>C^|ER)iiiPVQX*Y\ʽ@\LZB!zj=z47oO>Fue1111tP233ijjKx8W^kx/SRR?țE%n8r6 H3'ٽ{7/^$;;FvY8p b~D*ն2xPZZhjjbр?.ÇsIR{reٞιt %* J[[]ϛvȑ#)..`hC$SYY t5^8v^{u]H֥l ,`̙{'K+Wa0X,x0|PywHNMMؘݷ]vq <==1\ڂ@>T(\#&&#3i$6|}O9y$͍,]`RSS9t :uHRz%l2^ >Cy?g~NZ[[bf6յU#mq|- Nt3:߻5f7ߚ}0MLכlɓZ O'e'6ɠ7Px`[]59b&o\|N'2 ݥ<{l+P(dȔ &=W^Ki6r)4 gΜa0 9s1c_3bpsO>466 iE"P(RzCӦ55k?9ZI~|W\|+_q_cDpطg x{{cǑG֭'Odq{3gN+fvW+9%kk[- mU]Pp!Wx}J^O}}=Q]p8 : sy|IMƶ=vwٳxxx0i$*U嫡QPҙ ɧ3y:!cƌ!''N9Dz|}}J^~~>h|~2-}2)|WETlC&c6\~PZZJ|8~<#:s1t`LVV/oɡÇ8uK-㍷“-fs=r-8q/'̞5̬LtVKyy+<~4"GlvF&;ӪkEb/'mvz؝0f\?kf~fuL WZEUU.[Koې8N$bɯZ}ꐲw(.BS5f(2H䂟Ru'..ϯ6sq8w#3fLÇ"|[nue`T*w'R82ElHR|@&NQgBk3q;I݇AqM)4^>L&#ȹ{44QSYI}صkGۛ#G J Ϗ2N'>>~.S{R^^. >>?n Dŋ8p sV@>}8p 99  $??$WPii)4֑BTTNC;w2ydMǟx$JJJڏC?|X,&&&ɄjP|<\zI^}ud"/`20a.Ml6q=Gs&ɰZ82)6@1cb;nKwϘGŋz#mDDE. ^xmFeyk֬A3=>DZ-Ǐg̙L<`[`0P73ֻfafF`;a3aG~ fٲezr96e&a܅&U7!0ӑX,TL^GR؀/F 4 OD"!܃2Μ:ˤ)ۍ{DNCsw&""Ç1a)**"55RRرc<3xҧOZZZHKKc|Q6mڄlcٳsrYÑd (++cݺu"Xs}}3j &XJna>ؽԴ4++HMM̙3\a/I995jÃ,GڵkxyyA}mͤՊVe}sqD#b̘1,Xmc35<̛7^ƛV˫WqQb! j wSOqwٺOxӂ?ˍӗqytȕJŋBz鮗ilt EDDPXX@bD,+R?nGh4CeE5F^=W^_~DuJYY'Nt2jL:><چZ(//'6*LLee%$==J FK’B\ `00b+V7$55FP*TЪkBTa[$fv;rܕkLA faLGzI>q-f;aK9WanWrXv^! 
ĉbqU&d2 P Dצs9{yc0Qڦcǎ,]X$&&ȵkcYbO=eex{{iFvIKK ^^^,XMv46@__VZEn5j+V`SKx?~ng}B,\cjhԞڵ~'̙39v)++#,,#ٻw/(45|B)*)"6*YΫ)((@.ӭkwgW_}?;z?=P(۷/jGf:Y"DR ];XboXϓO>IϞ= `}3ؾe++V`…ϼ@~ĉxzz2&} .[J"99\ιlJ`.pq8MG.{DP atL'tL'̴OH3gp뭷0z=jO,V v]H|}qÌ㟤K8vBĬPLJmu>s;޽{]J*nnn.˗/%2Ɛw܁X,fά98q0|prrrhnh7^wޘL&38v(>>222\ۉSNZg!00I66D.w4x>jjHKKC.s5Q=p$t-ЀVlDTWl26|RIjj*&B4qŪWVlfҤIPRVBSS?0z+oe-ǎCRBxf#==ٳg铧IĤMeM޾9rT8uj ځtzy۳G^Lll,}]{wpȻFE^^VhEpYP( ֌FԆRzoʌi38 C7z?-/'((-۶;wbf?p088qތfq?MM.n DnIp9u %8v=Tt8hꄙfD]0#￝0 34/ߙL&wq $$$%K`;Ie-jłL&sպZ?N6M(d Q)HۅœNvG2jA"s*;vޣ ___ jE.WbXE$QQy PHme-s̥{l7yZULqq1SL'sN>NmC=Ǐ磏>bԩL23g`қAb^{|0 C9lr]ٴiGT^'($jv;V.Q؜6:|9]R%%% 2RD"<8oW&44'>ΈӉfÆ zW4jl7t: ~q90DFv-x߿?k֬FCP IDATxx8&#˗ׯ999RY^A@(h6}=z)))!}(B!^^^Bw&䥗^!R&I8p`Q(1vf}j&4(p ۝h4֪c/7dܹL8}Ç3k,-ZDlXT ;w2p@Cٹs'#G &T*Ѩ= #<f3/sHJҥ\h<56֭TW" [08N;.miial߾^ /{=LCCϟ'$$***Tj /keuv X7WQܫDEEmW,^FB;w./KWJ2gOŁk׸r 111322:t(G!%%W^y͛Gjj*NB$JMHHE[(]H`2DGGȾKRR/f3&L@.|sX, )//'* rV͜ڵk9uVH(J>>| аPjزu3S&Oa˶-L2+W@yy96=zVilltEjkynr/^.ȎC$"πEn誟nN鄙NCo17z] ItLcv4ٵknpEJ$7 3V-d . 
@-VRY{^۳/ee@p@0~b6[­M"11ko.m6JK:u*!&**LƗ_nᮻ"''o VY>a˘L&ptR^QF}8r#F@բ׷3# DII W 4hEEE ;5aҥ=/7nf\tQxHJJB$ht?vQ3i$_.izpmBBńX,vT\lDoٚD 5z 3K3q|-f)5ԛ]6}_f!]uitƦMC(5JҾ v$b[G)kOmDb8D$3Ϟ㟌_.8p"Dja| >*HQQ1TUEUU~3{N޽ԨxWڵ+}kdWVVh=dl޼QFɕ+W\&+WsNJKKX, ПsS bRSSÕ\wNuu5y.Էz+N&spވb4 AAA.?zHbȐ!d_Fn:V-KT$o.Szm6>#֬YC\\iiiC~~>~~~ݛ|Ճ>Hyy%)))L4ѣGeƎ_l`3rf|ANDKYt҅`BB2dY,Z5LEuݻu< ,_VS+J"JJJ(..fرn6SteM9s |{ !!aЈ?є0f=5#hյr#K3o @r~}U0fnds:.q:]FxkײpB-[Zv;]Smkqy$p ~F-ѣYt) ]K|>v)'˪իHFtt4ˉ'?W^߿qƣ+i3ԩSIOOgQTT%29,YK+_o7!A!;z^x 1L[O-z46nوB ..___>>>h4Xb]JFj#57˟^ΥK]#11m[|2Z[>}H.2] &6޽SɼA^^˖=ĉӻO"ҟ8ŸH#3oN#D脙NmP/}#td28"f͆`0 | fVd;_Bl6b'v3BJM@@IIrb)555ͮst!a}6T OT^rBC"xD,;7Jx"Sn\d/ЗF־Jb|"28~8}ѧO &#{!{w; ;; nJE}c=6=ZNA}W̟3DG󎓪<3l4Q@( hD(W[-F=A,1&F FEDi˲]Xvwv9qf"ˢ g?Y)OsaŌ;ɓ_ .N;d;ϞaiPv/yᇩrQUU5\͝wɓGw㣏>>#O>4O4iׯ祗^*$z>zvAcj,\QgO?ei#HRD#QZ*T2gn&M[oEvt?,^EQxwH&N(ƳY=@$~;ڴii;)Sߥo߾TUUD2$٭;D/<kҲ'x"x@ 3 _BkۆiC hC @")"/ϵYAr_L`뽄 w mj B(R-Wɨ|/v{Jvja1̤3i\.7ev#Br7H޽hF8F4b u,_5kVQQV5+))) lC^/aLz>hƍG4eæ tXS I2׮aմmߎ˖ӽ{wr6l+.'Rmmm۶~x^<;sO3wL&% [4"0Ӝ rBLV`7 ۘCqf0M` ]wŜ9s H$B 0P4mgX,ch:n{{nԬOL%|yPۤfVVYr%z*:iңG^u+2_| UW̓ #d֧xnl^|.l.ˬYڵ+zB34x ^z%zɋ/H2b9chס=?SN)$& p( %fQYhN:.:"Nb ]KsхiHM4 n!:jF%JpʦdɒYpi۶-]vӧȯk~vO8q"cƌgP[_W_ʹiӰ4/T*磾뮻ѣGn]%>͂3Ql޼Mlձnm?Q.bn7?߷ܘ"+7(+-A +)5E0sh0}9]CA?Ǿ'KhthzZBj.)!4aZPsI4 !4~w4CBSЭi)a\nH K$;Ya0̬HiiaZ9!.Rd;\.b5Tk_ ;}ҋą^$981𔁢SN19VG'pxQ޾LsbT.)]B5bƵbS.^ETD얄AEZdDLȈjxXSW%js11ˏŬkĂՋD\OYsg [DFdEJg p !gEEFϊ9_|&^|e@B8}.q3?)\,^׫WSMK\bъEwP*tO?@A,ZP"+fBlKlj+.4q7qsE\|&FZhB/\K.ćs9b3ſS :H Q״MWw1j(/]riiFRFF8xI"cŇ񂈫MG'  "PވGCn {Ĩ#];GHʼnu?FM"M]:ek!4Q]S)Py FcsjӲ]o﷟?o|~:v}ط7-5?5_SK͟}8׿/IS[k?\wfƴL r$\l{UC=D}}=o&{yFCqn:c;X̴|?UV#VG!)ϴL8~ R]SO.]$H: a=c؟q1c s妛n^={ղXd X 7|Z@(9ҧO<>/WqdED"P$Of}̥\ƍI%L6;e˖qyQ_S"HH'=z49#kׁ#G @0 L3TM;~锗ӷo_zo-[С{/-O5o-pg3sǼ+X>u;~6o`Ȑ!l7%qyѲ_~9wq|yg۸\.N?t, ^x!3g'6-۶pLٌM$]wE:f̙\$'Q]]GfPHdc I_.xQDކ:²lrEIg}|ee٬-y`LNcfv+q"33{Й)fR锝48 L`]CkL> /~mN?m{pNqڪ.e0b`($e l`4dws9TWWHՋM5ԩ۶mc$Tl LDZ,ɓ'%pyۗ>_}={PNaYxNN,vE ٰ]I4~^_s; R[cnhWMj-]:u[nuN;TTU%J$ C]V9#7wիѲ:q=:Cqe?ڭc9X,ˢ[L#N':u²,sOiN0 
pUWrTWo}P}YIJe˸袋={6J׮]>}?ʫ׏Znv֮]͛y9sIgTUU+Xf {,gFe6mD4ZB(Q?3y?~ȊB&tt4AA@7IBWe())AB´̼YS`f`(N[>7]N.Z#k3i?ݧk;@_̴B3hvܹf\3`0VNk`3J,P0D*̞ۀIjVs{\|%<. @6m,A<'g|4:n:d;tRx^~e^z%Cr Fiĉp>tcƌNࡇ]Yv-F)er-=j4RVVNme„ \wu|3P3fзo_|MXx1]v裏c)COdܹSO裏2|p:w¦M8uiD7ul#{AØ2e UUU̜9.]0uʻ+0tP}]>hnձ`bFAee%uuu4ƒ{ޘB+OK{3fRyI46*)Aac Bks4R,B"N`f`F]}'=yvyg~ޟ>,aF8IxWky'[ӟo﹟K]i3> JQ_[AUUFSqpqQ[ ͝wM7Dƛn#0褁d~ 6 ̈́ 3~>>Q^^׿KE /gҤI<#L<[tOz;wf̛7@ @6%v=49'kOrQZ^:э*-JBv` ;s{3-..bg9")"9Lk蛽J;-cE0s ݾSq\d2Ei9NgbΌg0# B?j6l3EmM-zyaA0 C(b֭455ft@vJ̚5N8?p8LMM gaBǚQGE4eT/T6+1|o3yw\p/2<֭cѢEut+wNT{կ~$I8|2S\>'?o~g}ƱK}}=ӟǫC.]?>6mO߱()m׭[COod&q-`YO<&=șgɳ>K2qc:%%%\tEtm]y;أ99sеkW&#Fry8q"<{.@Erۇoߞ3g2~xD4iٓ' d|у޽s 7ҳ{ƍ)'N?Yn`8|>Fƞ{- 4D"NYiDp(L&TL 465R)! IŦ1DI(BVS1uQ~ `E$a&BlX]nNz ~9[j)qE0S3E0fZs 3{4}СCp\!СK,! j*444 1McR+W |'u,^Lcɭz4EC)oSZQ'|}w /ĉ1MSIcc#sΥO /HĀ8yo!~1n8\.K,! 1l0" PO>a|>sVX2rH*UqDWϟadY o&3|\d2ƍGFMf6oq:W]u/fܹd2>s\iKMNuHH3i,alWVlsRUS,T-2فXyjYvys?z(Hy3I|>oLL`Pb<ȋ`怀{o=udt]RI -z30S3d-E8dX#>_ˋ$蚝Hv0K9s`a4(o[ưaH$_>}pD#Xqt+W"2( 444pYgdخ=_}(D?"jyIdxXnJϞ?~'о}{~z_|ٳӟD<!CЭ[7x hhhW ap:54|%~\VB0fp<f) GѴ\!XQ(6-T*;'4-RdVCEpe02Xhv7`f0s̴V= UbwV$j7`F34;Zwtt%+{=-d#i`&n,sLl^",8ɨd$<?KCQN$at.@I|kd){ J }6 2p!Cr5н{w;<,â_dY8Oʗ_~7vR֦͛7p84?Dͦl r`jހM8SgvBII O=׭a\qȲ-H$| X֭[Ӈoǟx3g2#X|)Ǽ[a^|E $),X0~v=uu3pUWsyP (ĨQ?~ykW(>ێGͯ v}'b}Cyh-K &vn՞_v(' ]s%DZMQDhI; 멨jnބi;?锊%kᰟxHC~Yd)^_&۴)92^CFiʇN՗N>}84IzTTT˯`ڴi\uU+tޝjF$R[[[xl6gIছnfh-0D;݆[9YVh4Cqd'M,Hi?x?1c=z*$͇8`Y#0vX>#"%.Bq9-G2BJ'*zo3>m0[JtnkYf`fq2.,p9]߱0nNYBΝlVIFc&àC KXbU}e:vRis,TnfSzJ"V\F  `jj*V[PEɨbJev+|®$I}T*'?)iJ'B;UY"L[dZr\azgvѸq|2|{ #pЭ[Ҷm[Y]1 &nV^~e"*~;d?2 9b_^03~}E0s(VlŶOاp]뿻[v(C LM,˅nWHR>?Y5K}}=y6oߑ-k1b%y$C9-KvmgA&#tv tyfBppǢ:G!(--SNlذOOczb*SuӛXMWYb5' Ϡ'H6 $ۿKPM^ּ mX}c˳l?K/vZ.B.]*G͂^i]zR)pn=\|M&dQ`sszb:\_CEf}}\Pv,=3Ӳ?ER;[e4M(RX}^?jNEe^? 
"[1Kx>hzbՊ\zŜ:,SGƤWE# db45C<cǎ8$'xfΜ֭XE48H$>fEVn YRh7 U3jjء0%t&,+x9#;} t;󃔿e3NLirg|rF̓>HPsB(b1.>䓼ĪUԩSoC#ax<$B͞@͓offߙCYj]dfؾ'bgcWd\[B34\O!*5[ JP0DNˑUI6Er{qz}|8m*BH|444AXr%}qT Cqݕp֘s޸ݎ Ԅ@2T$lܸ6em ͚5k9ꨣh7p8`Crڋx*-v1_$0MQ"sNX,F4)wມh"(ݻu Lxfo`^01.֮ /dݺu8NnnNg!P(,o2-_KXөqIJlXC?gfPgf3./b7a(!P.$I!t{t^XwͿ `4PU0q~dӁM7ѽ[W|2j*kBYjm+{|lTENi%wRH::vh_,Czܹs6|_~%t&m$E"@ x<n'x=^ҙ4M&g(!L`F^@B"업D$Hvle`$ r p|>>p8hjjBs1TUUX,75 c NNL6"+v4 ~LN^pii@3ͻ,^&}?"3s23 \.G4%3|2k&&C $I%p5Ae(++Cu؈eٓfպJ ફ⭷bE~4MEQ }THh5Q=?G[52 9/LGX|9BzɗyJyfsja~{9֠qp=g+{93;s$YUhX!4Qvwaܷf` `iz!̤T>m[DEEX@ @"f5 LͪU7zF;DfKEQX`!3p@֯@6mm8Ibرq(L6C(H:Τi5`YtڞxL}d{0b+L]쉝/23Ef]?oa=oeff۶mD8NtCˇl;EP'Ȳ3eȊB֭iCO>d2Z#GL&ݲO8fJbTKWp8( ` ErPPGyYٰq]:w!LQVVFMO')`me^EQrf[aPRʹ睸1(yuW"3SdfK˲d?4eo_~ aػ,TU%N4bv\`˱$#cZ&PχSQ,,AVb i4ȩ&;j}Y=3fdbN|"+Vvm۴tP(D:TP^ZNIx#`mJd2X]i96msDˢx^z݋}bY&Rҙ.vK8NdI)ԜJVbQȑeVleTUGɻ|#P(D d&]!?O93;Ttp>4dBFEޥyo`'Dr_[~av;kA,3b}~gbQۉi`FDZmjÁ$Lv$k)iK/k4ZWd۵K$ Hy I_] e™lVޝիע*NIA nVzi?xQ$SV^3bEHʽ-#!$ K8$[gԘ'Zwv]<q^+]P%󁹏 !L=}ôt`fL';Q O?)on(]6Ҝcv#(bIv_+:t8 r=O P_8\.`4ow5+0Mx<#<¸/d޼yߏ¡8HghmX5 i׮8`{ n2GYX,Y~=eee g-%vnk[rl[k~!]r8naC.ۇH,vc bcc#rضm?OrCt2],-MKs_ln( h`0ŋ۷/f"M}SYYC=D߾}Q$hFtXZ[jN֭ad8,Xd:Yp..;o Oևxw , KXRfFb7ލ|oiNl\A&tL|Yx<\./׏MufE)G 9֢q:vlV\vg͓8LMM `ٳgSQQAΝ ,K"@B"0||9=G)FI-5;-VcڵV,_E:7۟^"翷}pO|;C)-q83z߿9fOL !7_mNRmPoifl?$jkk *Nl6K͖Ǝ=mrgn:LcS#8tFӇN: ة#{o.]A0Dͩ,Z%KPQQ1c9r$M(-i nN]j*覎&RPGSvG@#IlMWFIRL>La~Sl}-~ov*k>,h%wv wnߏr0MG0=F]<ЃN}&,^ƇB۶m$u6r3h HTVndҥtڕb KQ }' ?LӤ{d2dYfkd2XK%Q8%%6b&%%%$IE!NL')/-'ki~r%HDYY={$Tg}]I]eY_Uk3B]L:q gGd)IM&E!v B rH&0M*.tS9-G*ek͔0-5lHMcLŶ-(b-hWW9$H$B}}=mȈD,Ѓ:'OFd'[nrS-%N2H$INr2wqojҮ]gqڵoߣӧne˗1x`Ԭ碢 %TUEB¡8|^ GSN:!IFBY5 4M:v@.CAΝ7o^N *JӺ qo h8 f<^RLBʡme kX筵Cur8ۗP(DϞ=|Tm_wlɢEd7r3ڶC7t8T.VU>r*Qs3 < UMq5WЧOzk*L&_~Xn@}}=LxZИi7!%{1c&*~zn/,ǻTK-òl ˲hlldڵDBou IZ1Tl9@Lake{vɪY4]pf;rꩧҦM>0`6o&L[nCaoH$:Eu๿=Ggb R z-cuZ[o˵Km&lqtޝeի;lTPO8D",XP0`L$$I~u/u}'jw{g_}e7ǂ+f,ɻLC)|7_l6W)"+8rott8rTVV u޽x G-k-x L%L*`ۃd2tCǴL*^z%MF}}=撋/7 +WСc0^jrZG8&`&L`Χ8۾9us~=e[ضC=QGecpQG1f*kἴw# 
0MNαn]37R(ʝ`ko4>i2UHӕJ@njmŝq=up]jsHh=T*ʝw7wmƽǸqD"ضQGq2U؎ 8ENՊAu/(UQk_'O#&Lrgsb&bzdY&O㏳t<ϦT5\CsFt]3XeG5o̚5j<<4Mò-(G@YI,`1MMMtm&S#v r __Eg33}1E _Q isifիȱ%b qm'd٢{GWW'0\.Ǘ%Ǝ/?HgҜtI /̦X(Xkh sD5}UV)X4( uK|p xgp],rqs뭷r饗RB7tbZ,hG|k_cѢEq,X.]ģ DQ?K}}=>h۱$bH"GX(IHȑGE]m-u=ٳ8Qovt~ 83>g&[t6;_7IuomRqg;9^?% nwՏ>o߇uo[u~df*A֭>we22M3|r9 BHϓO>Ɍ3p=MӨgժU,^LII@6h*֮]Rӧ%ɐhj(hQG̜9o_$2l^`ذa65Ə L8O<뮿ٳOb p.v$mmmtvwqOnbcK ~]ٸa&s)]{;~Go߿uIuu{'`,d=>NW['DPx4#<˩cǯ_L0h4ʹ瞋!94io ɰxb,MS;v,C I͟ xGu%`|T\JTb,ˡ%BF0W#̜9+VP*Xh#F@E::Qy|;X,F.ò0xBh4ܹ,BAg /^̲exC-0Nq<a㦍>K/˗3m4~ @HQgٲ["ێM]m]I&AUV\(;Wr!."*jRB/?{m/ +`Δ-?U%$y3[vPT6uC/yxb>h~߅|d2adsY (V4]QI&Ȓ,L8} q{ϟOkk L85sնite{Ԅ1xpkƌ!ڊ+/~FUUo}[cq\ץCT]UMkk+s6I ˁUW]dz|΁5{m;_0^+Q3\>"+$ɐƍAVDDZt*d) ttvNŠ\,B@ cZ& I]V\,^t&A$˘*]ihkv1,t:M L<xQ#G񭫿W_B>\!ŷQJ6MhL%3u L6'i&=X$Q~U}w*ݷWJTL%ԛȁO&ῃR'r뭷2x`Z^$QcN86^|Ű]]!16OTc{*Y&.5شmxΌmL0Mնq-%|'"~--]GGјBTq$I{|l77tƕu,]l7]9t %"H0x`}'|QCC%$CsoCidc*`T箄@sC.55HC(d 2I&|_w׿5a~'x4T'(J!]w ¬xWyG`>(xb~u˯PU ʝ%Y`ΜŴ%D$A$vm'<yT 0:u*> t4뮻< /2>|jQ>Pg'd7x۱I%S9 ӠOLtG*qP̳L{Ğ-#^ fN:A8ydzܵY ua 0tWi鲥}8>:A*(x衇( }A.ϋz60+KdI͛7J:tuuռ+zH(2FedIFɧGGV)D"dQus+Æ 3D@dܹ:]=deUU'zK.;o^M,g?{/ӦMcٲeljiaSKPAs|oJ7O^Nav5>>K͵{`k}Dm#]l_?McXo`ߕ̧n^ ~`C4(p{GdYƲ-FfJF4550zh/_g >d ˶0-ncÆ y|_G,0 $Q²8NpUUU=!{6Ubyđtwwf͚`I F !cb'ٳ泼466"2+W㎣d"I$A~z<'P,_ XE[[uuulj݄iȲf*UD%3S|Ⱦ߃a+7_*;`Oʔ"U8o #К)+JƼygҤI8mOUp5dv7iZx<™g|1RH-sTaHR}"KAaN]m]$[*ճgCLP`qN`xl_WVXm۬Z UU)Js=L9aJP-X,ƕW^ɴi4#Akk+H$±m.]J*Ʀ4MJ6d}|J5Ё4UGȕ.Do[*.% DQrn6EUUZvjkj0 { ! g,^'{y%("+X8h qV\+C0$Rhb2eC`;Aֿ+EuBd*I]]#GT*1o<l<~Xd 7t>}:A{Woq>,o3A@.ge%_7 :+=y60dߏ9\V{<`7likk#RWW׿4=o䳟,{챸K2DPWUTF6%őD%x)],! 
3aȲUU ǢDwT*"+8[`u%S"Fl޼3gr50uT?pc1c555s=DŽ X`^{-@uAqrzDEQRr"{t>d f|,enͻ|!,qr gL f>gL IDATov)lot-!;__~~]׉ǒ=-YBh%XTd=UF*"" HϤ..";8^~e>OuRz{{4>{L_@7}݊<(L"] ɄafuH#llȃ>x0bpUbݺu\xᅤiFmTEdjQ,Tu׾Ɗ4 /#QZݓ!}/}>`6G<*Sax,ˬ[.2>O?4|ښʄۇ-TϟA޶S3-7e@*ѷhoo&,ɘ Lr,?]?5dJh,|B… 9S<:.b<g"+ahߧ[^2jqTUCU5Zd`YN&u)Eqhc9KeJ:󲫟lwum?W;qܯ@euvL.# -e֬Y\|Ʌq,[&&M:3fpgR2JD"QA0@` i*Bd"VuULDU#FaC wq;_|1xffΜICC~!XOr!F@e+$Qrx,ah y,gֆc`ǔ@D1:u*tvv2{8[iN47~t3n^,C{k!4oey#}jfp =T˜reطjO7|7䠈 @e< YH :?xo1/2 _B"aڵTWeSݮ:gr*EAe, <@1cx8hZOJMM ̚5.FCC=7PEQwܹsY`TutvBzerӓ%l7W\q=*9s^O m[h-T-W3y?ϗe>]7|A$ u"m!6=M0߇^?3{{_{׾lkQ>||w`6?)0SfJ-SȉDz^ _ {, ,4mONs֬YCuU5|.ԜUĢp+k2p N:$̙]w@{G;e6V\ɟ 5 .8?T*E4Ee,YB.|8iPS]O *[gL l",( ?3cǎe„ $ x 9䐰ZN4m.2N8\T88tD"as=#2 B (MWʐ!C;vlh(Ƕ*W֣uHaFmwCPiflQ_WߓíQ .O<~!&L'xCϷ;wn1MT2E:ɮ u줾v]9Q,Z(t.on4TX:/ZIG@Igo]fal;Ȑ" 2(ѲHhwq3Xf# =P±=9xdsYr,uu P ǴL6wm笳΢-2A /@PW_e;) ڎΧ!g6i""& <$f NR΢r+oôLp{~Ə~#N=Tz)' ݊]w݅ؼ+ ^`]*aT$S_Wϻ.uuض̈́ XdI51}LAIUT(iRSS'L$"QKarS2 =uuuicqT%vbHģ=ѣ9'g2{,\dQJ\gg''x"©xBY&T:L$НY hۨJ}}=k׮P,P[S˅^H>_+`E:'ZzI߆Q#aT'!w X,[z.X,Ȳr睿c/zZ@ 'z:_w#gy,aΫsYbLt F_| q]6o9_8jN&=]=zDr-ZO?W_} R_W@LOOhiD"@ 6S6]#>f}h},:JFC+~} L6nja1eV^ ΡaΜ9 ar,d*4k,caѨ@"I# bhGA;n>guUUUa]Sh4J*bꔓX!>lƍ?Q thjjbif͚m(»gܸqaeQ<[ntaq\7ȶTUU1{toB}= >Yb͚ TW0m4~9@g߽$jϕehv/[cI$< x3/9+""-Ѽ=ۍ"I\}seG<gy&v| ,!I.79SO~ٳ뮻8vgBرcy?3~1R݅E6V E`ZPGܙh/<~M^'~< ȀB]nŽto)uv_웨@PGn93S98Ύ<ܶmL&H8&Mbض͒%KX}$c;6 koyR93 VTt͛iEmm5Lz/2m{#F9s" J4w}7޷h#(Q 9flذn2PmނK,@25^ᩫ=iƺ?'+.3O`fG'~,n]20]t3J*Q"̛7T:iXd MMMX͛uX]f8h뺔E#Q,3(s-ZY ʯ:_:O>d0]/+&I6nڲ-jkj>}: o~AH$-b1ND"8ƍf /qZ2PmSNOϠP,/vʁО*ksh jD;׷Bk[ /~)S*M$LKz p7+ȲeYITbҥ5 *ee y PrۺںOq~N8B`%JS_/9?T*ƍ:t(bq !eΨQ4nLBM%F) HD0MEQ(:m1d zOT}nbmi|.VA ,+<_I'ਣ0 "j$,5YCp]d2?HReCUUDL$Qdl6KQ/r뭷bYMMM.磏V0rHdYFlFDL$qSI%Rdsِ ЀkF.Cet]P 2Ue;VHK6u>],8 4۶u_P+F0SJ H(R /|I&De, ۶4 EQ(dY4Mc%D*Ie(#@4CE-[F*Y͙g8u@>_D"d|$*HD"Q$*D")<^4@CCsaܸq<3$ JzUU*e!lqMK/gA$@+ϡjV]V. 
$Q[[ۼ\|| a"Lx6 ="bΎn=XλヒȊ(BQ$ZI|z)S1ud֬^ڵkq-ƘcȢ}Q~0f^|N83Kcq'o^CMu(a;jmS.#JQcq3jo^ؘA}͘*1ⅥR)Nի{@vi|/sUWqꩧrA*zJx8vnGï_f*qps1tvt3bjkk<+V0bjjjf\zd9[t)QWW(7 k)$ms~i;v,y\cQ2PdIBd"{,ZAT)J_ijj"JcY,#Kr!ȰSiݼ <>V|B,#_[ND2 )J*^]]a P@.Cd4MiAX,2zhC{ﻗx[}+olذM6FuH$ҥKDATWȑ#ijj3`444pYgҶ AX%vB5|WYe?DoxT[k9 XV,rwI|-;)r}V^(|>Oww7MMMtttX l %t:MuuuE7n| FoI2K/1 IDAT>OF^zef4N86(Jht:CX$5zR㏟B$"OpIPQ)J`Hd!or_b#vP6N$_G]g[T25j|eض$q|` ,k=O<͛ۘjk_\s5L0jyl.rwsW0O?^| B"pD`Vm٣i+҃{fy^Y>?:C7}؛~7o!9W234lo;sJNtvu**`(%ǢR)<&Z[[Cih,q$IbSo~??3a2 /w&Lj~@b<=c?JYiZHt)> ; c%P]UxG4 ˘$RUU444('N,C44M#HL&D"dY J/K (70xgo\@4񜐫Rl]y#  % $1Him1,aWZży<#< .>.Geٌ֬97dp=kx΅b>m`l3 @lk`SG(Ɩ jĚ5kez[4M$IT**fYaٲe5 S/1zh&M<UUi^UUy7꫹.II y$I )B.0~xZ4̶(:=\o)omr-ƌw]6th(Hرj*"t:M5ώ´Ph{3_ gYc1ð 45 `dFL2߇{:,YB*D$"mҋCCᣏVpW3}t\%P[S8=``LloG~dg%9Hp*zʙ2E [)J$I%m n\~|K_bʕh(ZV=q]7,-dʢmlڴ EQXv-Rwyx< /7̿|3E͛)D<f.q.?pr9W^y3fCeGB ײ(`F E41}t{=V^N{f\aa؎kcq",Y477JB=]#X|+n*pt, юaAvPUqKz^,2\w 7nSN90fK&nE}]fYPd*Jf[9@ϣ03ٙea*qTWU=IJ,S_W:ʺu\{>B!hR)l&JaYDH$(:=g}~˯ cqEc -weժUtttvpgO!OPԋ8C,#C5r v'˶ExikkcڵDQjkj~AId2$ Ǝ=_|w}YUMFMMM`I@-%4!K`q]AUU$I $I¶m֯_i\o=0H>ҲBP@dRZD;`CO;Ld4o^+Ya~ 4vY \ض,\f1/~9zr-YÉFo9VҔK6l6K,__GABu\˶K:l~|8q>Z|$P.ڦ HSUD(b 03$~1chi' Jz(v(lU5bL<>)IA! 
7Ղ sȤ3RdiR):e*^{5wͿsdҙV!v yqz iy%*SbP3щ=I;2}_9z˷P]Qd%[6l7э?ãd|oK%ͲiS+Bx<.U4q-ʠzQ^{K/2m $La;6%B,yc3w(;@ ݟdB%iY ܢò~<E %xJUUUŴ̐[hb$IdYƎ˲ehll ϥ[ozMh47P%8iXiH@$RsBPA溜ui/RVXA6n CI>&cAuA#Fؚ}lIsٷ$`M#~-kz>6A-M@B@D|__oo%MUM9/իQO?mÿ+Dwgҕ]kqr9jbHhJȤg9<_YV[~MgW'(A,+NQ/rYgq1_F9l7> Jc(j|@D4HRX냏zX H,>ZƳMy6o~>dEف H8S\`L1%ȲȐ|$IRa\wmހ Xú Q(Œ ʈD2@˵3'qzۭd Y||l;?( &n}k`BOk{e wqó|'?+|~cxCvl$Ә)s_qvmɡ;<]?}=}ei gf\gV}YlWfw w+\ϥdJXĢ1^'NcR2Jtgs|Q]]]wqF:;;7m֭[DŽgӨ8~CPD}-|ϵ!_SSUa&bL*C6Ű .BRjTUeժU|WpމeYz4Md9(,Cz5cD "q:;}L4 D!Ƕ,\*k*<d;0,Y•W^_ѧmPYK%(|on| $"3b(F ;e+%0L2J1$$kyJ%R[i$A'}[wyA ;Ϊ^]OsL `ņ]DłQo77XMo1ɍS &`T,`200̜s>g34Dϙ<ֳ<3Λmi7F}V2oޛr7TdE/Xv-ws|0^v1y8C,CUJD].\ڋN>Eq1pPK.8M>'ɰfÌ;qa!2r1Xyi Rbiy1/歷`Rqh,qoIsS3=#F@e(˖-F6keZ9abd@ܯbV2e 'N$@la&hm}~0=3#}Bd/,Wʔ+eRɔ?W˰;;;9#Ns^{CyՕN5d{2D#=~cae"˨+\JkcYV02F~p}2tP8> to/|g 1} rzc]׉F0C﷽ÝȲL$!H龶LRBF&K?DSo|L:EQF8+0^tܗ\>mzܗ61?38`f3;Kgfc,i%D}@(>2Cu]~&OkcƌSOG?8ģq@ A}WT OPM| RWr9b'*D8W~Wwy\q,Z4IR^L2X,ʔ)S7nƍcфa~SNncc;<;8ҩtxO?4===qzzz mT9ŐEU ͵t*F¶mvuWd𽩾iZ t=B@TZό}p:_ܹsd2L>X$F\$H 4*NH˗YdI,|^ҝU?VŖnh-{r=n~$ fv<Bz_ؑjF}Y,BQPwt:M6%]3W{oQ\?,tev}5Q)UJ"1rB$5UنO\6K,CjeĦߧ\T=3W\Qhii/!C~UO~dYS`xD?_ŭ*a)S(-~@2DS4z =Br4XN$a鴵?΢EmE_ѲʦkƟ- Joۄ 63K_9 j]^'^I{kC/ ? 
ǎwlrEgf2CJY Ӑn@B"]+͞~tuuqI5F])Fd2+\tUGtbV,~Y{U- 10̟?ImEQP(.ݺ|r aXE e2,|r!=z4T qW49sK,Cq`@N&C~+>U{T* D&!ӝgmFB"5~vW_nj.]k90ﴴ*Be˗r#2Fc93bdYZ[[d2{ >X,?myoi3K{ -|vȆ-;}|ӍnUTSyHH2M?@*75 p8gCA<޺ 뺎+\߶@% 3S#˲(e9@8R)_EQb,`6 $~+"a&u:2]Ӭh_߾,]]=.\H*b̚3f@QjZ.j22oҨ1_HRJ%ax8VmIԵ>\WW:d28CxGXl$cYQ+ۖy* \BYql{Q$H(\iϪU*Z[[e,XADRAWngea[o[l0Il킦ǡ6fK:#[ڶlڳ׵LuOm*Gf?0Ֆthv/J<زۮsP1+P0!خM< .IXbde˖Q.Aw*iiaj|rA6<,}>~;2l0V`]v^c7⋷x}5uЈ*>Au]}ۀTwhnn`]]r&T.YǺuK\fL0/ݝaP*T?ɲ@ʱ#T $: R)F|iN\_lh='xt:MTbʕ}ٜ{">Hov>qAt>OKn`38q{>M֯|Mn[J"T eGT`]w%NT 0@eTU'aMömTU $ҥKa$oîܹsui'L #Fk}&rD8'|2Bq|VCill䤓N" cپx4jLX3}c?˲ҍiZ" $t5䛁Vz)9JRgyeKWqA|[b„ $ ²em>[%)}ߏ acz!W뷎=[oDBº].Ittt`&k׮%NcYEVZTPմQU$IP(iTF-ZĠA3MKHR3ϰb hV1*}ZP44E#_3rH}f.ƍN4ev/iIP:b1ΰa2d{7D4l.|>`8mmmɓYz5B2u444pyp 7s8IӁ ̙3\L$ T_g7{1e}CRfNq ]ZH^_|W7ҖF6{e>"lS*Xf xH$BSSp|Wu`"m p@OXዾ ʂ70j(<@L( :١_gsgf[P[S<۵URD:< _We(m6rOu@8$IABm-4it65L&Ø1c <uQw} 4|r]ߠ>U>˰aØ4iHW_}!g&hllk MH$va̚5 !~xP6_ӹT*ƭ6H~և~0x<&V:/\ن?7#)#v{c&eӃXRĊ+MӂOupS&!LNABqX˗c!I&Lm-ێaTL_7"x?^{Eww7dQFz*0WO!/vlR(W(œ9stV+93q]EKK 466`* z?OձdqۛI$nyĄ8q"מ1mx]_;3#OohImy~r$ZOH^/z?$$M6'U W}a+OxVѧdrϨOm* PC G(VJ~BiJ%Z[[Yf Tv$,cXgaUH$ dYa?'tR}iY&LR.q-ggD k%֮6M&w5tR8:`~\V9ד㩧C$H{rJ"\s-,Z.0p@~i-[Fs@aݺu[:e%x"Lh 0rv}w* x/¼W^&j456+=U+x$.EaȐ!ư t]GގU־?Llb2ך6і_=kzI[O66~lc E6۱kM\qk{c;>U1Žmyoκl0Nu+~7aj-F7)<~V"U<Dz!!Q,sDn"pPd9 00,h¯,]#**k 2lHt4r[w\|>S:XhC$d*h$J$}#:u*e@,SOoo~{ar8cd2<̞,`1{?4 -\?c/l㤓Nd.FT"2G?fN>d뽌;N;o}[\h!]C^}{oo#=KHw7:(3aX8vpfܲvVؖ薵du{33 dvZf6|N|7j0m5}EUq_Em:~]u%#ϳn:TW!B@8FQ"X,FK{{;MMM444tR9,s"4>@TD"Q,q@xӊ ]]464R1*hƇ~S444K/8F*Xs(?v i.b^x!uuu$ j>+(Mm\׹{wUV.'psgذaض_2^2ʼ[!;iO-[FG{g#yWrDv$ŀ~{2 ئo6BXEgg'/gy&bٖ/Pr4Up^8 !􃙍O`FB"͢BCP06dUX,{+X+ G&H۶FA!xJҥK)sd2az-|$ko ȲcBk׭e@>[&'XrD4]/YɒiA.I6%ر9S={6m` Պgl[C8j߶my@&MڋŋW7oGb]R$`u]T2ň/my; .f* RrLG}idyo!Ő"{Sd2980hhh LL&/~A.׃1x`^z%VT*i"ɂ+?OUCAUkeZ/}K#Lr1`~?3f ~;/2="?5<<!H%S>]5ד#LD Xcwi~0>`FDǓvtD6>\*6t(:Yr%MM8ÿSN9S.8q"*C aɲe2r$ƏmJL"r< ^yd:V,x( $D*j I:nd3tuuq3~xN*Z¡0e!=b\uU <UU5LwI% "!3 T2*RDX֞gb,_Ə3GO">_l{KG޶y~v<W^fѼtM\ve}\>l,\+Gf=B,&?яPSN=3gRTQGIKu]tvvrdzv:V^… enz6\.Ϯ~G}za>o1yƌÚ5먯Oo~3fp 7( 
?q}3`@#hRj<Yt)./8Wú*cM-f% >f8nkl/?AP7;aV۹K:u*'O&R,BK;"3;;X6c[3䓙5 != |JX$H=c|K8ds*ezUwK/?~HO,%|ò,dX4َn!ɰP* 0sL~ߡ*===VN.R&Ogܸq|ߦ j*ƎKd )\qꗿ´FM н>UR>5ܞڊW]^{ʕ+90 ?6io3RɊ9w>C#di\u uH74 !+fBP=/P|Ku($ d 2 ƍ#L2qd2fϞ$A4#cvbhFT * L42pW*F$R1IEx9N8FBX*l2Əߓ]v#B_!<- }U;Cas:uUY=X%L6Ê6o?(D$c<|6אLa-?wRhaBb7nlBXO9{MX aqQ$D, !,ɮSg,!,q˭70 _,Eԧ?X !R 1j0xGEe}-r²K4}N#ד0V=lnHkoҞ?fGKYaa٥\L(ʕxE8HT/"xcxE.H55׋m !lae!% Lھ Jp=Cau6_hk_"BqM}ܑBUl^]a w;?Oh;Ḇ8.zH ݫg3EW K0o@B-*fQ8)}o@BxbVXNEv9J[[cGufXi6mI'Al3/|- 2c n*XFcccjCbj  7O= zM>_rlª,)_ZEOm24uY0-}|}KWUT+%ܦZ>9U]MtlaΜ9444j2ԧ0Y/p 4U# U%p5נm;0zhFĉ={6R*Uwy7hۏ>;/gvj3>Ckk+& eܧ΢sR*P0??)uvv[^40LUQ3g_?rםw"477Q.NK\̘1۶inn I3gvH2\Ei\:p][njx ێNp>z ηXʊɇ|vz Î0?=&q@BuEOKT(Wz#!,Q1GL=H!5*F^NI-t\qbɓ d [vYNֲEwfp=STv3nj{/6ܶ5ǟvT1``8CieajW(q̱GcGK.HxГ'lQ,HL>|w{ [خ_{ej'/!-u"4!ˈh4,1iҾdϧ))pE[۲.Aq,1bDC1``@BD8ĵ]%ږT}\7wĞ{!T]}2~]C=b8◿Jr=Sf1'՞GKNEt$1d@zjڱa0S/K-ijc22T2Edh4z.sH&؎ŋƟR+P(^SjɊQKUESi0Q OK>^)J\wud:3tuex7(Ul6ɓ1-O  lA hnnf,Zp8ToFUKg͚:r7D<iӦ*9/>~tF@ח?Cu~8?->BH8u]BSN9K{{;t\s G~D2] Y&Rl#R^׾58L$HfL, ^q&L-RpN:aTs=fC~o};3Θ<_@s vu l'NDXlx$$Nm?v@$@T\.DX#<ٳvm7BW_}+VH$X:Y]I$Dꫯ୷ *\%%\bZ&/mNf?$zX9f{X,`Je?[oio_pm;%~m_ffc k DZ3cFd˕Q(feaDRFT%f^tv&*Wǥ:>Ñͬ]#Tawj56&#"=^ELg1aŅ'>d*L1턩m!-uѓ_qdqerG8nO6$S&W[ՒkZ|kQb =$8ݴk>4Sa5+p)~a!GN]t@F ߥUD!꒐U1S= !4{jnSۖt !\/s8䐃EccZQ!E" pX{9AHBUqqӎ'tH7&aX% [X!.ʹ"ӅKB%ETBO"ԋPX]vD$ \.ٹ6R) !\Q(X6gD"}zƤhlJ1cGz8!+Iҽ$!l^₯ۇm])db̗GtcR<zr0՘|!#r'= +'f}FX ̶6:4{93}{s IDAT(ܶnΌk,^]W@ް z.;o+ x$F<YU)KTYT5;6m-Z ]Ճ;モ5sR'ϓH$m\.Gc}:&Xp!x<ܘ?>W^y%3f̠Rͩ8yGxT P\.G>M`2;:ijl;Ӎva 4L&ü:6h4,5۱}UڎK>dH0'a՚v"ڲ,RO e\%69y#2/fb?z|_e…?h4_L[[,ABuhnn橧+C]]#>f)*yv}w.iC)XK,fܸqYC{0hPmm+P5`g>CrW馛|\n۾AkضM,#R.f,[E1o<>{11̢QS=mx%zIT*y n\z Z[[qp><3twwԄAɴ@IҞGHf1yd>ږs2j(aJ%ΝO?MXdwgvcܹBx6lUU?b5v]Swx N/`fP<B*$aYBFfha]:0d#g>':$I"Y2}a80K&\YQTmۦ=؃O>[ouڳ>~ .%K)U'cpt2EOO|O?4MA1dڪ]w-/x)L! 
< B)֪[z{^S-Mv‰'ֶ O#h 4}#T}Uk$ʂWAwt#* :n\px㍁ζNKL?]0cB!>9<:^z%Pe8--8c?~<&I5?,_nu8="x"d3y?A;$sr)^v --C1e4}هɓ's}IXfРE"DB) 8kײ>ꫯ2a^}U$IA\ےnL4eG<}:4M}8/^(6l$q衇R*Xf bEa&0nhH8P(dLӤ|hL0S dzM?Lm?㊥"CJ'u駟{!|'p$eU0m֬Y×FF,X+\z!N=Lr/%oی1avmkt:;; KQ|!sr뭷s k\ oP,>}:+V7 r8_*LDudi@vj̖]$.Z`;6G5EQXz5O>$TʷiĢ1lN(L[ F܅O^G\T*N ȑ#4{L'tN35>Oתݺihh`̙퓈Gz[T+ hxp/NUK@|$(Lt:;.HJ@C˗/[nᮻ"#TYIEرc9kXt)+:H$xRɲL$bQ/믿QG |7|uQa\~\|,׿upC=]>̈o^lcĐ͂ !7z#R$ ۶qtOQ4E6dN 64, %6`rYv&õ`7V$`uC #c&)*kWw0w\RuItMX, 4QC! +o~"f=5+FD R|q)'C1o< FsS6*0&mQ m+Y& 1ug2x≼+Bi2?V=XTUŲ,t]qE8J%pJAi5~N5i۶h4 twwS__TvLEFE0R6<_60 ~.c>gYj5+$I$ T`fr>/m8!zBl肮D"A|;m&m[&E9ùK) y(!?Oؼ+McclCZ(J?gErJ">¶m`0.U Xa _GCUUʕ2cFzh *p/ȲLCCÇCLnBN]Q-Lw>,uuu O() 2 BB%+Y;~ݞ?N՝KIW%TսsXeU8xǭ"i]MZC~L5eJ"X"6(K*O\p8L: _3gbBAJRc\6RM0t+xG\NvZzhijAAqvu9v⡇O硇SرCFZvA2¿PF /_k&mEc|>/ׯy؎-4UW]EX;`ΝmoKއjviTEEteUEplR6˲1Ȩ~A=|OTvJ%LSh2o?(4؁ 2Ճ0zw6xF4%1cV+_~9/"S,زu P .|7XEWWs l߾>]ݴΠbM5J('^f XkUEakOeUm4M\+WbTmښg*(3faV`7|pgQ,)\yT*sK6jW|6Kr~zrf}iCLSv0tCtEB"¢(|#]/>{wjʉ'p8̻}!K~~?_+/"jhSU I&q^x^{D"y$i&XSRA%ϳ~zBHRNa0jj͛eݮ]A(p  7tiv;!? "ݤH.PA5&z"cE{S`H|vlx |'d|sᥗ_駟yGƺ(󴵶 gYfϞMOOM7D2$_K?C7VFql^pPyeSx1rWIS5ͬYDKʫʿۿP(L:C\$K5k o-o9E їh24e|]c…㎓A׍]Eٮ$QK*o[nᡇUQU8wH.O|Ӵ2X]2" Qk[դHQm{hԚF @a֮]˧B .oHXbV}N:$͛ǽ˓O>I(W*";;Z2)}> 3?;FN:Eͤ|?(Y<1SFuY:p& r ߜzi89 C$,[|h$B,yipy`AVܑx0%eԔ~.|Eaz(DbJx$ƒӖꫯƹCTUES5I=os֭uiR)֬Y5\W_M<GUTfR$P]ϕMTQdWN#[j Rģ>J,_{e͚5㈨LV.\ȉ'(E(o6R{,ӟX~=#r^׿+}N9̙Î;ظq#vl _cb&Pƻ#=X yźKp8< uOꨓ1Ձfi1 ̰G7H[33"2E>ZE8/za.P43{l1MjJ&W_%JVSۮ^@x6chk1bd") P( 0B\{Y9“O>ɽSB6)D)!AzO>%K㏳i&|0to8s_3Y|_K/TFEQ0w3>v'F402`fR7kIei= |e0STUx2ɼGqa4Cwg7Gy$lD"C=/^L4b5[UU Tˑ&4Uxf@K\o0VUԖJ%t=ar931sL>P(yWYj׺Zg…lyMJ%O>U+Wɨm 1̾l~;_uwb1֮]˖-['?ɍ7(\Z Ղ Z7# r Z+E$!_OwSv3R$wTe)3x4bxݿhwcnL6Ǎ? 
u a6mΝ;Yv,\`ݻ:yEsyϒ{!JQThJ7/I%SzAmWN+mP_"g fI2N4S$sdl_]]qȜCO>W_gS(fY*nP*dd&P4rtMjW{cbGiTMEXq x/ev(`_,)k\1&1*4"{ͳ:yb,yxKv۶,o~|Cɧe!pajH z{{Yd H$IסZ5\7L&ᤓNG칶c=%#z"D‘(gQ+WR*Z{clD"5x?A|IfϞ͗e<̺A:DQҩ48 ~ߓf7A"Beq @y׸ IDAT'Y^CFHӜ#cAT/>I`@5+&!RVIԸ͝;w2{lY T9srpss3W3$ѱ,U.z{ikk4M^|ET!\ϥbUBض}Ǘ5;n?ƯEá~Z<ϯi$ifEhEbVՒ J 5g1YG&y5 kAٿ 1= '>`<||4E&U+6jh4N6H3sLV^y]A{{;}}}RZur<8 z{{9S%g h,?΀6ݠj \q$ /_NPq<4MWsEG㺬X39 (vw <FN}Tq0ף Wdqw(-8?<ò,)tMǧX,:%],K{'J2XM UIa1k0L 03yihi򉃙>nۃسu7:9<ERF\EUt׿5??0h"TUeܹr9vI<g˖-!src=#Wok w p q0章xL4f<ׯPhҦG2鉌ƪR|/e(elިrlzF/umvc`\wY !' XUdcK^Pe!kWw]w~ᰆe]K1*lٲ@8Y9y[OnP +c0(:s׿A](2r(I$ŴD1ZC\W|=˗$! iD#Q'pMդXn֖K. RA۶*DRh>>e$颭m0*ȅBTEBYW5,*`Ipa2xGZQc膁Y)T^KE"HM@OZsȠbU$i_Q~iUD0gv/HcpZDg_} ??%pa_9L"w0e0ܞ^z$1on8 s3Q(H&Yz{z?[ :])2 _ wa>x̩FծP*x{L&۽ԃ >hL L1H0S(>vl9eɤuv܊ ?;`Æ xǶmۈpUUYgy$f̚9@ף45Ey8hiiײh4*{1M_s"R Efinj\).xs}C3:UBzqpm񌣱\T:Y.Ű*‘'d'B0fiimŪ >˲Hr]k yBd|rL4ESewO7XX46 ZWUQX* tF<+Zx~VN8MShud>3#>p`ftcp@W:XRS $#0 :lt]ߍqDa!$ vv=f:'|E'Ɣf 76۞Z[$\5)˔1Z@JE!K`Z&}\xs߾B}al ł6yꩧˤTkOo>obB0[UKiZP "ή]hoKq2Kfx4N,UU%ԥY\Oa4U.?%-T*z=̬i -ce{Eу>>1z$Iy6+,buQ.sr %L$#XŜ9shmmdY:vuP,T,~rOPU曥$j$I݄O_OJ階m۸eTU*Պ2^.-mbʢi1ߙL3>BKKhIwll&aU-Zy`ON&iʦ K%S9bh$ :sQ6|ߗ^٬H] *H QE鴐?>﹆}ٿG4z~ KxRl^D"A!oBZeUmE#OD#9ʦI*FAa…-'kb$IY24=|^FX"ԌZtmeDQP(iY47";e|ST)MQJ,&uGg>K&4:6JT TP(㹔eeitegW'TEQ("BmMr]U*U@%O4ՠX*%E睯NeQ/䥶UծjD4qtx ܡXOՁi Gs4I";¯hi=vLa|N֖o={<Ԡ8X%IEDCUUlpRԿ'% QJG֫l4Jm٥җ퓵$AM:TLa#Tnॗ^"O)W$IBa>l'eA:`4553p*:V"Ηy>yTZJ<bAFmҥ'GAEς(QgWVՒi5p8Lo_(thmmeRT(%* n*J4a#L>Hh4JOo]hok5LXDsA)Ď;|T2%ꁌ\[xl jZήN vUUb Pqk-<jE{NL1F-v:g(1u'L##`8>h$J.#"NO}(h$JծʮMm"aKEbheVZ8?. Q4 wxC{G<. c`}|I }(QPZH> Gy?0?>m6˜W_1CX$N|KE8nPmTU~UѩVEֲe˄ Gr:%N8bD"A___ٺu+}}}*|_"G ^6M8zM7}-lڴo~zP e<'NU]2S2{ĔFcq2ʼORk\<89 A?ȩM(ϢF&K,߾n\r --ҠljD]N+:@bBS UV#+ /Pv-s ̈[: PU4b!`0*޾T†0e G9nq x TUr T;bȊ+ضmz 0)BGGlnEgp$aT*++W|{c_f|q׋hS4 'u0MdxG{{;T bժUl&t̛7L+b <Y?{"]86W,P'.>_\Ù.<zHC`.OzM]Wm40 lbVLXU qF=4 <N;^u8 Ndm+r9MͲV6x4{;ծNZftIՑMd6XG`#N;U5oNM@Dn-BϧM8k~_E=!H`u]p׮b\WD Qq}պ5oFn&BP/|;S:R.) 
BhARAU<{CuuX,8DµbS_5&Ҹz88y"T{:G>innfݺu{yΝ˖-[H&<}"-VB8$-pƇb PklEѬoI8_"< sghinqyw Œƍ9XR UHD[i|t:ͦM0 ^x˲1cgqwu[l}{PXfq;v~(xoq1M3\SO`ޑ8Å`e6˼yy7iիWL&9Su]}OJ{{0m3k,I _dޑNGGgͦbU̥^ʒ%K%B!q89sxꩧ8c5 ͮ]կ~(Lju2^];vE#wttnzw=_8;NFZgu͙+U N>beό3p]t]T* C\.SVYzbٳg`|ߧ7xl6K[[cҥ0c <n:v]7xhkk<kR.ijjbݺ ܹMGYʜ98e0 D"2-oMȣG4fΜY[/B*yO9h4ʲ'HxIo#dT*of8 yN9l[qSOeǶlټ Ňb"կ~n+u1c ,ˢh4Jss3lܸ#<]7hnNۛS8vT*Q.y9묳BXN8{m:u1;:D},uZ[2cF^{-} ǭ h֭[I'r9O>f`fi&`(NWwmmgYкD"6ȍcV^_̝we^㺼 }Ѥ) 5RC)OFQ"#ŬYm֊WaLC&‘GɪUя~9ţ>G /dǎD13g7^gǎ|_w[neς Xfhf׮^8uQ)F\Yz5JJ#s!QTH&l۶>QtGm.tC.j-Mq8T7|˲I$b e|tC6<\ƍ%1@KsQz!9ER){1kMMJlu-*z);0<ׯdRw:̝;_~YD I\C𶷽hT̚5^}Jپ}'iPؠjQpygI[[saӦMB!}Yry2Mu#:E?O~c|sopEŢE}\.ey*N8H1B%._WXz5g .w}|t'?]wGMG4lfڵ /^̻.Y,9dWWHx<>,ٲe sΕi@gL\;`f42fm:f(0z$U 8=֮]˼yxMX}/]wx:6|tJpSO<$kAH^h0'Bҫy^̭UTeMN[[?\s ׭_.cҥ$I.\ȦM8묳H&ttt0o<y=\02twws'Ӌ8̝;W7|37Mt I2P(N' D":J.( t .RDSSJ%4\_~Mɤ)ضahض9~q",Xm۶H$hmkfɒ%_ŋs %<˗c.s4R[ajT󞬍,Oz͢fϮ3>v܉m &=T*/_!mt w47'- r-pm?a88."nve_gu| |Yf蠥_(—%V\ɼyH&l߾ ?|_ce|1mmm8Ïc/˲B8W^\wu'W^y% p h1tc],aCB1Xf ~8v"+YJk[><HxÆ |[q,9t0duή.etnȗlް_KIo?]H%SdsYBhxGXDQzzzu3gOZz1 ((|vuwI(]WHZ[ZeSQBFC0MS|qX%ٗAwgIDAT{GWWL#C0C74t3Ģ(RTbJ%񸬧TM" {mQ B3 bU},駟!N,[̊Wpގil޼H$izl߾XLxæi./B*|>O<'R.ǣQLyͣ62 ۷`ڵlٲUXLIqyY?`…dY=r9 z`+TUNs \ eChb8DvbJ[sy嗹՝/Gq0I#W*8'ublۮ>l߱+DضmhgyV.n_MӈFl߱Y3Ȑwoo/X477U"fq]V4*8>@Տ;w0{l@Ї'A}4gkXXHf%j[n'x4r].F}d2\.KP(DWWmuڀ7KsS3>ʉEb+eby!{LSxiBLJ%8|T*5IX>rJBKs =d2A6'r$qzDzO̾:@hT Ԅ`=~ͧPkz4s != :'%L_& BT|t*] TvUJr:{<_usjYV  '  AA}AOO-M3(d403TS4L 0#s#`푆\.I< \_<6ipRMX*RD"yGr|M~@54Mt  Q5E.#JðT.`*L*bC?0A7PwO7ш78HY 0 :;;iooF3HJOyl[6MX,&l[u]$cU rOkKkt-fVL"z2`㦍yđS J׏>:\ikkC3K$X.J;(y( 6rtQ$A`pUR9K45?L&x үc8kKT"[fYZ[E\?u.+"T}iTMϓL*ZU,Ǫ hTPdehJݨi%RR#`PVe:5X{A$6 MM) Q&jI**XcF-iڗLv+r_L<23^0S6>1~p f|$&`f /A?;YpȸjЄ@y䮦jyr -^5}1ou<L$k<6X+"J-q=][2`V,Y?~-2: M> YR&J*x)jA7a[\ i bD"[%`u]Wze1NP !Y0 [ ,>H "V5<Rh8-<ZQpxw|u1U E<|\"Fp]M)$)hHDlqoA^O]R@8,M# -"K >Pu*hVZr]]N)C4eC$1.Q1$q@Ŭ\ǣr[(K5bNDT*"tdӑ\7rh4h8Pfd07} 7ש3j2 q|fyO9 c:7KCɻu#?Lm781./ 9y7}X_}drcmXc Y 
_|瞼7N`|OVoGxlvsLj8euߙ󡶆UuL020֭[u=Ҽ`1=|/#88U5n#5*Z)mAQkdG4ޞϯړ{ndGu`*X_m_ʣS͇xcL{Md&kLFp>ARn %'Mih3CQSQF?F$>#BzPu>L=0 o&N~Cm|ʟ߁h`C6tܷ׷w=S~.^ui L7̰OS`S'G۫fbP&y4<1m%i<_eoʤWby2IԽ^Q{{"``ol~Ӛ=J~c4֟n㤝!bvp/ } FFXs d &QE& )Qx 8u=簻3fk8y4ie"Lc4~?&k}ڏ٦h}X;(ޗ*c=V=oQ6"2?{Iqv2HؚQ~ƳڿvQc44skg_10+jّHn hDf1M5wnom|1G>]{:d}Me^&KWo8TLm f1d@ :W7ƁFG2QtkDg& 1`f/c[)6H.FsoCmoLyr ?!ڱgɍ'7MF˟!Z @N0ILϭQXO ڏ{?zq<GmXǾw}uQ~齏vq=ވkrxyq'o7>Js+RpOgƓ'xXjO91V}<.;"sֹ:"FX$0IZeKQdr&kV4ewR~GkQޫФc{Qh?ySy|&;ᣝ{fty idc7BK{s`fOIrϟG@Mݺ'dy~cZ}ۺ?e `x_h3uOp}͵:q(az u~?9Sʻ?L1Juhu8c1MA5ht** }KmMMT %*5T<2z7HL߿"3S7t|n"g+ Cx~'6&2Mk` ;8wh8fʓPځG.t7QqF9s1sdo)Z{ 1V`FQPLGp?PeNwO = 5={p\:M?ꐷڈ4FcLh1u),*:IENDB`menhir-20200123/www/index.html000066400000000000000000000135111361226111300160450ustar00rootroot00000000000000 Menhir

Menhir

What is it?

Menhir is an LR(1) parser generator for OCaml: it compiles LR(1) grammars down to OCaml code.

Menhir replaces ocamlyacc. Legacy grammars can be compiled by Menhir, with a few caveats, described in the reference manual (HTML; PDF).

How to get it?

Menhir is available through opam, OCaml's package manager.

Type opam install menhir.

Menhir's source code is hosted in this repository (releases; changes).

How to get help?

There is a mailing list for announcements of new releases and discussion of problems, bugs, feature requests, and so on. Only subscribers can post.

Menhir has been designed and implemented by François Pottier and Yann Régis-Gianas.

What are the key features of Menhir?

Menhir has many features that make it superior to the traditional yacc-style parser generators that many people are familiar with.

  • Menhir is not restricted to LALR(1) grammars. It accepts LR(1) grammars, thus avoiding certain artificial conflicts. When a grammar lies outside this class, Menhir explains conflicts in terms of the grammar, not just in terms of the automaton. Menhir's explanations are believed to be understandable by mere humans.
  • Menhir allows the definition of a nonterminal symbol to be parameterized. A formal parameter can be instantiated with a terminal symbol, a nonterminal symbol, or an anonymous rule. A library of standard parameterized definitions, including options, sequences, and lists, is bundled with Menhir. EBNF syntax is supported: the modifiers ?, +, and * are sugar for options, nonempty lists, and arbitrary lists. Parameterized definitions are expanded away in a straightforward way.
  • Menhir's %inline keyword allows indicating that a nonterminal symbol should be replaced with its definition at every use site. This offers a second macro-expansion mechanism. Together, these expansion mechanisms help write concise and elegant grammars, while avoiding LR(1) conflicts. In other words, they extend Menhir's expressive power far beyond LR(1), while retaining the attractive features of LR(1): determinism, performance, guaranteed unambiguity.
  • In --table mode only, Menhir supports incremental parsing. This means that the state of the parser can be saved at any point (at no cost) and that parsing can later be resumed from a saved state. Furthermore, Menhir offers an inspection API which allows the parser's current state and stack to be examined by the user. This opens the door to a variety of advanced uses, including error explanation, error recovery, context-dependent lexical analysis, and so on.
  • Menhir offers a set of tools for building a (complete, irredundant) set of invalid input sentences, mapping each such sentence to a hand-written error message, and maintaining this mapping as the grammar evolves. Thus, a generated parser can produce good syntax error messages.
  • Menhir has a Coq back-end, which produces parsers whose correctness and completeness with respect to the grammar can be verified by Coq.
  • Menhir offers an interpreter that helps debug grammars interactively.
  • Menhir allows grammar specifications to be split over multiple files. It also allows several grammars to share a single set of tokens.
  • Menhir produces reentrant parsers.
  • Menhir is able to produce parsers that are parameterized by OCaml modules.
  • Instead of referring to semantic values via the positional keywords $1, $2, and so on, Menhir allows semantic values to be explicitly named. In fact, Menhir now has fairly nice syntax for describing grammars.
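Several of the features above — parameterized definitions, the %inline keyword, and named semantic values — can be seen together in a small grammar fragment. The following is only a sketch: the token names (INT, IDENT, COMMA, LPAREN, RPAREN, EOF) and the AST constructors (EConst, ECall) are hypothetical, and the %token and %start declarations that a complete .mly file needs are omitted.

```
(* "separated_list" comes from Menhir's standard library; because
   "args" is marked %inline, it is expanded away at each use site. *)
%inline args(X):
  xs = separated_list(COMMA, X)              { xs }

main:
  e = expr EOF                               { e }

expr:
  | i = INT                                  { EConst i }
  | f = IDENT LPAREN a = args(expr) RPAREN   { ECall (f, a) }
```

Note how each semantic value is bound to a name (i, f, a) rather than referred to as $1 or $3, and how instantiating args with the nonterminal expr yields comma-separated argument lists without any hand-written list rules.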
menhir-20200123/www/style.css

body {
  width: 100%;
  margin: 0 0 0 0;
  background-color: rgb(105,136,164);
  font-size: 1.5vmin;
}

div.header {
  background-color: white;
  background-size: cover;
  background-image: url(affichage.jpg);
  background-repeat: no-repeat;
  background-position: center top;
}

h1 {
  margin: 0;
  margin-left: 0.4vmin;
  margin-right: 0.4vmin;
  font-family: 'Carter One', cursive;
  padding-top: 3vmin;
  padding-bottom: 1vmin;
  font-size: 12vmin;
  text-align: center;
  color: rgb(105,136,164);
  font-variant: small-caps;
  text-shadow: -0.01vmin 0 black, 0 0.01vmin black, 0.01vmin 0 black, 0 -0.01vmin black;
}

.widecard {
  background: #fff;
  border-radius: 2px;
  position: relative;
  padding: 0.2rem;
  flex: 1;
  margin: 0.5rem;
}

.card {
  background: #fff;
  border-radius: 2px;
  margin: 0.4rem;
  padding: 0.2rem;
  position: relative;
  flex: 1;
  padding-bottom: 2rem;
}

.card-1 {
  box-shadow: 0 1px 3px rgba(0,0,0,0.12), 0 1px 2px rgba(0,0,0,0.24);
  transition: all 0.3s cubic-bezier(.25,.8,.25,1);
}

.card-header {
  background-color: rgba(255, 255, 255, 0.7);
}

.card-1:hover {
  box-shadow: 0 14px 28px rgba(0,0,0,0.25), 0 10px 10px rgba(0,0,0,0.22);
}

p.card-contents {
  margin: auto;
  width: 85%;
  margin-top: 0.5em;
}

p.subtitle {
  text-align: center;
  font-size: 1.6vmin;
  margin-top: 1vmin;
  margin-bottom: 2vmin;
}

img.footer {
  height: 15vmin;
  display: block;
  margin-top: 5vmin;
  margin-left: auto;
  margin-right: auto;
}

h2.card-title {
  font-family: 'Carter One', cursive;
  font-size: 4vmin;
  text-align: center;
  color: rgb(105,136,164);
  font-variant: small-caps;
  margin-top: 0.1em;
  margin-bottom: 0.1em;
}

div.contents {
  font-family: sans-serif;
  margin: auto;
  width: 100%;
  background-color: rgb(105,136,164);
}

code {
  font-size: 1.4vmin;
}

@media only screen and (min-width: 768px) {
  div.card {
    max-height: 20vh;
    overflow: auto;
  }

  .flexbox {
    display: -webkit-flex;
    display: -ms-flexbox;
    display: flex;
    overflow: hidden;
    padding-top: 0.5em;
  }

  div.contents {
    font-family: sans-serif;
    margin: auto;
    width: 100%;
    background-color: rgb(105,136,164);
  }

  div.features {
    height: 33vmin;
    overflow: auto;
  }

  ul.features {
    columns: 3;
    -webkit-columns: 3;
    -moz-columns: 3;
    -webkit-column-gap: 2em;
    -moz-column-gap: 2em;
    column-gap: 2em;
    text-align: justify;
    text-justify: inter-word;
    padding-left: 3em;
    padding-right: 3em;
  }

  li {
    padding: 0;
    padding-top: 0.5em;
    font-size: 1.5vmin;
  }

  div.header {
    height: 50vmin;
  }

  div.widecard {
    height: 47vmin;
  }
}