pycparser-2.18/CHANGES:

+ Version 2.18 (04.07.2017)

  - PR #161 & #184: Update bundled PLY version to 3.10
  - PR #158: Add support for the __int128 type.
  - PR #169: Handle more tricky TYPEID in declarators.
  - PR #178: Add columns to the coord of each node

+ Version 2.17 (29.10.2016)

  - Again functionality identical to 2.15 and 2.16; the difference is that
    the tarball now contains Python files with properly set permissions.

+ Version 2.16 (18.10.2016)

  - Functionally identical to 2.15, but fixes a packaging problem that
    caused failed installation (_build_tables wasn't rerun in the
    pycparser/ dir).

+ Version 2.15 (18.10.2016)

  - PR #121: Update bundled PLY version to 3.8
  - Issue #117: Fix parsing of extra semi-colons inside structure
    declarations.
  - PR #109: Update c_generator to add {} around nested named initializers.
  - PR #101: Added support for parsing pragmas into the AST.
  - Additional fake headers and typedefs, manifest fixes (#97, #106, #111).
  - Testing with Python 3.5 instead of 3.3 now (3.4 and 3.5 are the 3.x
    versions tested).
  - PR #145: More complete support for offsetof()
  - Issue #116: Fix line numbers recorded for empty and compound statements.
  - Minor performance improvement to the invalid string literal regex.

+ Version 2.14 (09.06.2015)

  - Added a CParser parameter to specify the output directory for generated
    parsing tables (#84).
  - Removed lcc's cpp and its license from the distribution. Using lcc's
    cpp is no longer recommended, now that Clang has binary builds
    available for Windows.

+ Version 2.13 (12.05.2015)

  - Added support for offsetof() the way gcc implements it (special
    builtin that takes a type as an argument).
  - Added faked va_* macros (these are expected to come from stdarg.h)
  - Added a bunch more fake headers and typedefs to support parsing C
    projects like Git and SQLite without modifications to pycparser.
  - Added support for empty initializer lists (#79).

+ Version 2.12 (21.04.2015)

  - This is a fix release for 2.11; the memory optimization with __slots__
    on Coord and AST nodes didn't take weakrefs into account, which broke
    cffi and its many dependents (issue #76). Fixed by adding __weakref__
    to __slots__.

+ Version 2.11 (21.04.2015)

  - Add support for C99 6.5.3.7 p7 - qualifiers within array dimensions in
    function declarations. Started with issue #21 (reported with initial
    patch by Robin Martinjak).
  - Issue #27: bug in handling of unified wstring literals.
  - Issue #28: fix coord reporting for 'for' loops.
  - Added ``examples/using_gcc_E_libc.py`` to demonstrate how ``gcc -E``
    can be used instead of ``cpp`` for preprocessing.
  - Pull request #64: support keywords like const, volatile, restrict and
    static in dimensions in array declarations.
  - Reduce memory usage of AST nodes (issue #72).
  - Parsing order of nested pointer declarations fixed (issue #68).

+ Version 2.10 (03.08.2013)

  - A number of improvements in the handling of typedef-name ambiguities,
    contributed by Sye van der Veen in GitHub issue #1:

    * Allow shadowing of types by identifiers in inner scopes.
    * Allow struct field names to reside in a separate namespace and have
      the same names as types.
    * Allow duplicate typedefs in some cases to mimic real compiler
      behavior.

  - Fix c_generator error for ExprList in expression context.
  - Assume default int type for functions whose argument or return types
    were not specified.
  - Relax the lexer a bit w.r.t. some integer suffixes and $ in identifier
    names (which is supported by some other compilers).

+ Version 2.09.1 (29.12.2012)

  - No actual functionality changes.
  - The source distribution was re-packaged to contain the pre-generated
    Lex and Yacc tables of PLY.

+ Version 2.09 (27.12.2012)

  - The pycparser project has moved to Bitbucket. For this version, issue
    numbers still refer to the old Googlecode project, unless stated
    otherwise. Starting with the next version all issue numbers will refer
    to the new Bitbucket project.
  - pycparser now carries its PLY dependency along. The pycparser/ply
    directory contains the source of PLY for the currently supported
    version. This makes distribution and testing easier.
  - Issue #79: fix generation of new switch/case AST nodes.
  - Issue #83: fix parsing and C generation to distinguish between
    initializer lists in declarations and initializing variables with
    parenthesized comma-separated expressions.
  - Issue #84: fix C generation for some statements.
  - Issues #86 and #87: improve location reporting for parse errors.
  - Issue #89: fix C generation for K&R-style function definitions.

+ Version 2.08 (10.08.2012)

  - Issue 73: initial support for #pragma directives. Consume them without
    errors and ignore them (no tokens are returned). Line numbers are
    preserved.
  - Issue 68: more correct handling of source files without any actual
    content.
  - Issue 69: running all tests will now set an appropriate return code.
  - Better error reporting in cases where multiple type specifiers are
    provided. Also fixes Issue 60.
  - Issue 63: line endings cleanup for consistent LF ending.
  - Issues 64 & 65: added some more headers and typedefs to fake includes.
  - Refactored the cpp invocation in parse_file into a separate function,
    which can also be used as a utility.
  - Issue 74: some Windows include paths were handled incorrectly.

+ Version 2.07 (16.06.2012)

  - Issue 54: added an optional parser argument to parse_file
  - Issue 59: added some more fake headers for C99
  - Issue 62: correct coord for Ellipsis nodes
  - Issue 57: support for C99 hexadecimal float constants
  - Made running tests that call on 'cpp' a bit more robust.

+ Version 2.06 (04.02.2012)

  - Issue 48: gracefully handle parsing of empty files
  - Issues 49 & 50: handle more escaped chars in paths to #line -
    "..\..\test.h".
  - Support for the C99 _Complex type.
  - CGenerator moved from examples/ to pycparser/ as a first-class
    citizen, with some fixes added to it. examples/c-to-c.py still stays
    as a convenience wrapper.
  - Fixed a problem with parsing a file in which the first statement is
    just a semicolon.
  - Improved the AST created for switch statements, making it closer to
    the semantic meaning than to the grammar.

+ Version 2.05 (16.10.2011)

  - Added support for the C99 ``_Bool`` type and ``stdbool.h`` header file
  - Expanded ``examples/explore_ast.py`` with more details on working with
    the AST
  - Relaxed the rules on parsing unnamed struct members (helps parse
    ``windows.h``)
  - Bug fixes:

    * Fixed spacing issue for some type declarations
    * Issue 47: display empty statements (lone ';') correctly after
      parsing

+ Version 2.04 (21.05.2011)

  - License changed from LGPL to BSD
  - Bug fixes:

    * Issue 31: constraining the scope of typedef definitions
    * Issues 33, 35: fixes for the c-to-c.py example

  - Added C99 integer types to fake headers
  - Added unit tests for the c-to-c.py example

+ Version 2.03 (06.03.2011)

  - Bug fixes:

    * Issue 17: empty file-level declarations
    * Issue 18: empty statements and declarations in functions
    * Issue 19: anonymous structs & union fields
    * Issue 23: fix coordinates of Cast nodes

  - New example added (``examples/c-to-c.py``) for translating ASTs
    generated by ``pycparser`` back into C code.
  - ``pycparser`` is now on PyPI (Python Package Index)
  - Created `FAQ `_ on the ``pycparser`` project page
  - Removed support for Python 2.5. ``pycparser`` supports Python 2 from
    2.6 and on, and Python 3.

+ Version 2.02 (10.12.2010)

  * The name of a ``NamedInitializer`` node was turned into a sequence of
    nodes instead of an attribute, to make it discoverable by the AST node
    visitor.
  * Documentation updates

+ Version 2.01 (04.12.2010)

  * Removed dependency on YAML. Parsing of the AST node configuration file
    is done with a simple parser.
  * Fixed issue 12: installation problems

+ Version 2.00 (31.10.2010)

  * Support for C99 (read `this wiki page `_ for more information).

+ Version 1.08 (09.10.2010)

  * Bug fixes:

    + Correct handling of ``do{} ... while`` statements in some cases
    + Issues 6 & 7: Concatenation of string literals
    + Issue 9: Support for unnamed bitfields in structs

+ Version 1.07 (18.05.2010)

  * Python 3.1 compatibility: ``pycparser`` was modified to run on Python
    3.1 as well as 2.6

+ Version 1.06 (10.04.2010)

  * Bug fixes:

    + coord not propagated to FuncCall nodes
    + lexing of the ^= token (XOREQUALS)
    + parsing failed on some abstract declarator rules

  * Linux compatibility: fixed end-of-line and ``cpp`` path issues to
    allow all tests and examples to run on Linux

+ Version 1.05 (16.10.2009)

  * Fixed the ``parse_file`` auxiliary function to handle multiple
    arguments to ``cpp`` correctly

+ Version 1.04 (22.05.2009)

  * Added the ``fake_libc_include`` directory to allow parsing of C code
    that uses standard C library include files without dependency on a
    real C library.
  * Tested with Python 2.6 and PLY 3.2

+ Version 1.03 (31.01.2009)

  * Accept enumeration lists with a comma after the last item (C99
    feature).

+ Version 1.02 (16.01.2009)

  * Fixed a problem parsing struct/enum/union names that were named
    similarly to previously defined ``typedef`` types.

+ Version 1.01 (09.01.2009)

  * Fixed subprocess invocation in the helper function ``parse_file`` -
    now it's more portable

+ Version 1.0 (15.11.2008)

  * Initial release
  * Support for ANSI C89

pycparser-2.18/README.rst:

===============
pycparser v2.18
===============

:Author: `Eli Bendersky `_

.. contents::
    :backlinks: none

.. sectnum::

Introduction
============

What is pycparser?
------------------

**pycparser** is a parser for the C language, written in pure Python. It is a
module designed to be easily integrated into applications that need to parse
C source code.

What is it good for?
--------------------

Anything that needs C code to be parsed. The following are some uses for
**pycparser**, taken from real user reports:

* C code obfuscator
* Front-end for various specialized C compilers
* Static code checker
* Automatic unit-test discovery
* Adding specialized extensions to the C language

One of the most popular uses of **pycparser** is in the `cffi `_ library,
which uses it to parse the declarations of C functions and types in order to
auto-generate FFIs.

**pycparser** is unique in the sense that it's written in pure Python - a
very high level language that's easy to experiment with and tweak. To people
familiar with Lex and Yacc, **pycparser**'s code will be simple to
understand. It also has no external dependencies (except for a Python
interpreter), making it very simple to install and deploy.

Which version of C does pycparser support?
------------------------------------------

**pycparser** aims to support the full C99 language (according to the
standard ISO/IEC 9899). Some features from C11 are also supported, and
patches to support more are welcome.

**pycparser** supports very few GCC extensions, but it's fairly easy to set
things up so that it parses code with a lot of GCC-isms successfully. See
the `FAQ `_ for more details.

What grammar does pycparser follow?
-----------------------------------

**pycparser** very closely follows the C grammar provided in Annex A of the
C99 standard (ISO/IEC 9899).

How is pycparser licensed?
--------------------------

`BSD license `_.

Contact details
---------------

For reporting problems with **pycparser** or submitting feature requests,
please open an `issue `_, or submit a pull request.
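As a quick illustration of the C99 coverage described above, here is a minimal sketch (assuming **pycparser** is installed) that parses a declaration using the C99 ``_Bool`` type directly from a string:

```python
# Minimal sketch: parse a C99 declaration from an in-memory string.
from pycparser import c_parser

ast = c_parser.CParser().parse("_Bool flag;")
decl = ast.ext[0]            # the single top-level declaration
print(decl.name)             # 'flag'
print(decl.type.type.names)  # ['_Bool']
```

The parser returns a ``FileAST`` whose ``ext`` list holds the top-level declarations; each ``Decl`` node carries the declared name and a type subtree.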
Installing
==========

Prerequisites
-------------

* **pycparser** was tested on Python 2.7, 3.4 and 3.5, on both Linux and
  Windows. It should work on any later version (in both the 2.x and 3.x
  lines) as well.

* **pycparser** has no external dependencies. The only non-stdlib library
  it uses is PLY, which is bundled in ``pycparser/ply``. The current PLY
  version is 3.10, retrieved from ``_

Installation process
--------------------

Installing **pycparser** is very simple. Once you download and unzip the
package, just execute the standard ``python setup.py install``. The setup
script will then place the ``pycparser`` module into ``site-packages`` in
your Python installation's library directory.

Alternatively, since **pycparser** is listed in the `Python Package Index
`_ (PyPI), you can install it using your favorite Python
packaging/distribution tool, for example with::

    > pip install pycparser

Known problems
--------------

* Some users who've installed a new version of **pycparser** over an
  existing version ran into a problem using the newly installed library.
  This has to do with parse tables staying around as ``.pyc`` files from
  the older version. If you see unexplained errors from **pycparser**
  after an upgrade, remove it (by deleting the ``pycparser`` directory in
  your Python's ``site-packages``, or wherever you installed it) and
  install again.

Using
=====

Interaction with the C preprocessor
-----------------------------------

In order to be compilable, C code must be preprocessed by the C
preprocessor - ``cpp``. ``cpp`` handles preprocessing directives like
``#include`` and ``#define``, removes comments, and performs other minor
tasks that prepare the C code for compilation.

For all but the most trivial snippets of C code **pycparser**, like a C
compiler, must receive preprocessed C code in order to function correctly.
If you import the top-level ``parse_file`` function from the **pycparser**
package, it will interact with ``cpp`` for you, as long as it's in your
PATH, or you provide a path to it.

Note also that you can use ``gcc -E`` or ``clang -E`` instead of ``cpp``.
See the ``using_gcc_E_libc.py`` example for more details. Windows users can
download and install a binary build of Clang for Windows `from this website
`_.

What about the standard C library headers?
------------------------------------------

C code almost always ``#include``\s various header files from the standard
C library, like ``stdio.h``. While (with some effort) **pycparser** can be
made to parse the standard headers from any C compiler, it's much simpler
to use the provided "fake" standard includes in ``utils/fake_libc_include``.
These are standard C header files that contain only the bare necessities to
allow valid parsing of the files that use them. As a bonus, since they're
minimal, using them can significantly improve the performance of parsing
large C files.

The key point to understand here is that **pycparser** doesn't really care
about the semantics of types. It only needs to know whether some token
encountered in the source is a previously defined type. This is essential
in order to be able to parse C correctly.

See `this blog post `_ for more details.

Basic usage
-----------

Take a look at the ``examples`` directory of the distribution for a few
examples of using **pycparser**. These should be enough to get you started.

Advanced usage
--------------

The public interface of **pycparser** is well documented with comments in
``pycparser/c_parser.py``. For a detailed overview of the various AST nodes
created by the parser, see ``pycparser/_c_ast.cfg``.

There's also a `FAQ available here `_. In any case, you can always drop me
an `email `_ for help.
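The ``examples`` directory mentioned above covers AST traversal in depth; as a small taste, here is a sketch (the C snippet is made up for illustration) that uses the ``c_ast.NodeVisitor`` interface to collect the names of all function definitions:

```python
from pycparser import c_parser, c_ast

source = """
int add(int a, int b) { return a + b; }
void log_result(int r) { }
"""

class FuncDefVisitor(c_ast.NodeVisitor):
    """Collects the name of every function definition it visits."""
    def __init__(self):
        self.names = []

    def visit_FuncDef(self, node):
        # node.decl is the Decl of the defined function.
        self.names.append(node.decl.name)

ast = c_parser.CParser().parse(source)
visitor = FuncDefVisitor()
visitor.visit(ast)
print(visitor.names)  # ['add', 'log_result']
```

Defining a ``visit_XXX`` method makes the visitor fire on every ``XXX`` node in the tree; unhandled node types are traversed generically.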
Modifying
=========

There are a few points to keep in mind when modifying **pycparser**:

* The code for **pycparser**'s AST nodes is automatically generated from a
  configuration file - ``_c_ast.cfg``, by ``_ast_gen.py``. If you modify
  the AST configuration, make sure to re-generate the code.

* Make sure you understand the optimized mode of **pycparser** - for that
  you must read the docstring in the constructor of the ``CParser`` class.
  For development you should create the parser without optimizations, so
  that it will regenerate the Yacc and Lex tables when you change the
  grammar.

Package contents
================

Once you unzip the ``pycparser`` package, you'll see the following files
and directories:

README.rst:
  This README file.
LICENSE:
  The pycparser license
setup.py:
  Installation script
examples/:
  A directory with some examples of using **pycparser**
pycparser/:
  The **pycparser** module source code.
tests/:
  Unit tests.
utils/fake_libc_include:
  Minimal standard C library include files that should allow parsing any
  C code.
utils/internal/:
  Internal utilities for my own use. You probably don't need them.

Contributors
============

Some people have contributed to **pycparser** by opening issues on bugs
they've found and/or submitting patches. The list of contributors is in the
CONTRIBUTORS file in the source distribution. After **pycparser** moved to
Github I stopped updating this list, because Github does a much better job
at tracking contributions.

CI Status
=========

**pycparser** has automatic testing enabled through the convenient
`Travis CI project `_. Here is the latest build status:

.. image:: https://travis-ci.org/eliben/pycparser.png?branch=master
  :align: center
  :target: https://travis-ci.org/eliben/pycparser

AppVeyor also helps run tests on Windows:

.. image:: https://ci.appveyor.com/api/projects/status/wrup68o5y8nuk1i9?svg=true
  :align: center
  :target: https://ci.appveyor.com/project/eliben/pycparser/

pycparser-2.18/LICENSE:

pycparser -- A C parser in Python

Copyright (c) 2008-2017, Eli Bendersky
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice,
  this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of Eli Bendersky nor the names of its contributors may
  be used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
pycparser-2.18/utils/fake_libc_include/:

Each of the following fake headers contains only the two lines

    #include "_fake_defines.h"
    #include "_fake_typedefs.h"

getopt.h, reent.h, envz.h, tgmath.h, semaphore.h, _syslist.h, libgen.h,
dirent.h, pwd.h, limits.h, utime.h, stddef.h, dlfcn.h, netinet/in.h,
netinet/tcp.h, stdio.h, xcb/xcb.h, unctrl.h, sys/stat.h, sys/ioctl.h,
sys/select.h, sys/wait.h, sys/utsname.h, sys/types.h, sys/poll.h,
sys/time.h, sys/socket.h, sys/un.h, sys/resource.h, sys/mman.h,
sys/sysctl.h, sys/uio.h, regex.h, fastmath.h, netdb.h, zlib.h, alloca.h,
string.h, math.h, malloc.h, paths.h, utmp.h, _ansi.h, tar.h, newlib.h,
unistd.h, grp.h, float.h, regdef.h, ar.h, assert.h, complex.h, sched.h,
pthread.h, stdarg.h, time.h, linux/socket.h, linux/version.h, iconv.h,
argz.h, openssl/evp.h, openssl/x509v3.h, openssl/err.h, openssl/ssl.h,
openssl/hmac.h, wchar.h, inttypes.h, process.h, termios.h, endian.h,
locale.h, mir_toolkit/client_types.h, ctype.h

pycparser-2.18/utils/fake_libc_include/_fake_defines.h:

#ifndef _FAKE_DEFINES_H
#define _FAKE_DEFINES_H

#define NULL 0
#define BUFSIZ 1024
#define FOPEN_MAX 20
#define FILENAME_MAX 1024

#ifndef SEEK_SET
#define SEEK_SET 0 /* set file offset to offset */
#endif
#ifndef SEEK_CUR
#define SEEK_CUR 1 /* set file offset to current plus offset */
#endif
#ifndef SEEK_END
#define SEEK_END 2 /* set file offset to EOF plus offset */
#endif

#define __LITTLE_ENDIAN 1234
#define LITTLE_ENDIAN __LITTLE_ENDIAN
#define __BIG_ENDIAN 4321
#define BIG_ENDIAN __BIG_ENDIAN
#define __BYTE_ORDER __LITTLE_ENDIAN
#define BYTE_ORDER __BYTE_ORDER

#define EXIT_FAILURE 1
#define EXIT_SUCCESS 0

#define UCHAR_MAX 255
#define USHRT_MAX 65535
#define UINT_MAX 4294967295U
#define RAND_MAX 32767
#define INT_MAX 32767

/* C99 stdbool.h defines */
#define __bool_true_false_are_defined 1
#define false 0
#define true 1

/* va_arg macros and type */
typedef int va_list;
#define va_start(_ap, _type) __builtin_va_start((_ap))
#define va_arg(_ap, _type) __builtin_va_arg((_ap))
#define va_end(_list)

#endif

pycparser-2.18/utils/fake_libc_include/_fake_typedefs.h:

#ifndef _FAKE_TYPEDEFS_H
#define _FAKE_TYPEDEFS_H

typedef int size_t;
typedef int __builtin_va_list;
typedef int __gnuc_va_list;
typedef int __int8_t;
typedef int __uint8_t;
typedef int __int16_t;
typedef int __uint16_t;
typedef int __int_least16_t;
typedef int __uint_least16_t;
typedef int __int32_t;
typedef int __uint32_t;
typedef int __int64_t;
typedef int __uint64_t;
typedef int __int_least32_t;
typedef int __uint_least32_t;
typedef int __s8;
typedef int __u8;
typedef int __s16;
typedef int __u16;
typedef int __s32;
typedef int __u32;
typedef int __s64;
typedef int __u64;
typedef int _LOCK_T;
typedef int _LOCK_RECURSIVE_T;
typedef int _off_t;
typedef int __dev_t;
typedef int __uid_t;
typedef int __gid_t;
typedef int _off64_t;
typedef int _fpos_t;
typedef int _ssize_t;
typedef int wint_t;
typedef int _mbstate_t;
typedef int _flock_t;
typedef int _iconv_t;
typedef int __ULong;
typedef int __FILE;
typedef int ptrdiff_t;
typedef int wchar_t;
typedef int __off_t;
typedef int __pid_t;
typedef int __loff_t;
typedef int u_char;
typedef int u_short;
typedef int u_int;
typedef int u_long;
typedef int ushort;
typedef int uint;
typedef int clock_t;
typedef int time_t;
typedef int daddr_t;
typedef int caddr_t;
typedef int ino_t;
typedef int off_t;
typedef int dev_t;
typedef int uid_t;
typedef int gid_t;
typedef int pid_t;
typedef int key_t;
typedef int ssize_t;
typedef int mode_t;
typedef int nlink_t;
typedef int fd_mask;
typedef int _types_fd_set;
typedef int clockid_t;
typedef int timer_t;
typedef int useconds_t;
typedef int suseconds_t;
typedef int FILE;
typedef int fpos_t;
typedef int cookie_read_function_t;
typedef int cookie_write_function_t;
typedef int cookie_seek_function_t;
typedef int cookie_close_function_t;
typedef int cookie_io_functions_t;
typedef int div_t;
typedef int ldiv_t;
typedef int lldiv_t;
typedef int sigset_t; typedef int __sigset_t; typedef int _sig_func_ptr; typedef int sig_atomic_t; typedef int __tzrule_type; typedef int __tzinfo_type; typedef int mbstate_t; typedef int sem_t; typedef int pthread_t; typedef int pthread_attr_t; typedef int pthread_mutex_t; typedef int pthread_mutexattr_t; typedef int pthread_cond_t; typedef int pthread_condattr_t; typedef int pthread_key_t; typedef int pthread_once_t; typedef int pthread_rwlock_t; typedef int pthread_rwlockattr_t; typedef int pthread_spinlock_t; typedef int pthread_barrier_t; typedef int pthread_barrierattr_t; typedef int jmp_buf; typedef int rlim_t; typedef int sa_family_t; typedef int sigjmp_buf; typedef int stack_t; typedef int siginfo_t; typedef int z_stream; /* C99 exact-width integer types */ typedef int int8_t; typedef int uint8_t; typedef int int16_t; typedef int uint16_t; typedef int int32_t; typedef int uint32_t; typedef int int64_t; typedef int uint64_t; /* C99 minimum-width integer types */ typedef int int_least8_t; typedef int uint_least8_t; typedef int int_least16_t; typedef int uint_least16_t; typedef int int_least32_t; typedef int uint_least32_t; typedef int int_least64_t; typedef int uint_least64_t; /* C99 fastest minimum-width integer types */ typedef int int_fast8_t; typedef int uint_fast8_t; typedef int int_fast16_t; typedef int uint_fast16_t; typedef int int_fast32_t; typedef int uint_fast32_t; typedef int int_fast64_t; typedef int uint_fast64_t; /* C99 integer types capable of holding object pointers */ typedef int intptr_t; typedef int uintptr_t; /* C99 greatest-width integer types */ typedef int intmax_t; typedef int uintmax_t; /* C99 stdbool.h bool type. 
_Bool is built-in in C99 */ typedef _Bool bool; typedef int va_list; /* Xlib objects */ typedef struct Display Display; typedef unsigned long XID; typedef unsigned long VisualID; typedef XID Window; /* Mir typedefs */ typedef void* MirEGLNativeWindowType; typedef void* MirEGLNativeDisplayType; typedef struct MirConnection MirConnection; typedef struct MirSurface MirSurface; typedef struct MirSurfaceSpec MirSurfaceSpec; typedef struct MirScreencast MirScreencast; typedef struct MirPromptSession MirPromptSession; typedef struct MirBufferStream MirBufferStream; typedef struct MirPersistentId MirPersistentId; typedef struct MirBlob MirBlob; typedef struct MirDisplayConfig MirDisplayConfig; /* xcb typedefs */ typedef struct xcb_connection_t xcb_connection_t; typedef uint32_t xcb_window_t; typedef uint32_t xcb_visualid_t; #endif pycparser-2.18/utils/fake_libc_include/wctype.h0000664000175000017500000000006713045001366022423 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/stdlib.h0000664000175000017500000000006713045001366022371 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/arpa/0000775000175000017500000000000013127011712021654 5ustar elibeneliben00000000000000pycparser-2.18/utils/fake_libc_include/arpa/inet.h0000664000175000017500000000006713045001366022772 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/search.h0000664000175000017500000000006713045001366022355 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/fenv.h0000664000175000017500000000006713045001366022046 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/stdbool.h0000664000175000017500000000006713045001366022556 0ustar 
elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/langinfo.h0000664000175000017500000000006713045001366022705 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/errno.h0000664000175000017500000000006713045001366022235 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/fcntl.h0000664000175000017500000000006713045001366022216 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/X11/0000775000175000017500000000000013127011712021302 5ustar elibeneliben00000000000000pycparser-2.18/utils/fake_libc_include/X11/Xlib.h0000664000175000017500000000006713062505527022366 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/signal.h0000664000175000017500000000006713045001366022365 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/setjmp.h0000664000175000017500000000006713045001366022412 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/iso646.h0000664000175000017500000000006713045001366022142 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/asm-generic/0000775000175000017500000000000013127011712023123 5ustar elibeneliben00000000000000pycparser-2.18/utils/fake_libc_include/asm-generic/int-ll64.h0000664000175000017500000000006713045001366024653 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/stdint.h0000664000175000017500000000006713045001366022415 0ustar elibeneliben00000000000000#include "_fake_defines.h" 
#include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/features.h0000664000175000017500000000006713045001366022726 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/ieeefp.h0000664000175000017500000000006713045001366022345 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/syslog.h0000664000175000017500000000006713045001366022430 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/utils/fake_libc_include/libintl.h0000664000175000017500000000006713045001366022545 0ustar elibeneliben00000000000000#include "_fake_defines.h" #include "_fake_typedefs.h" pycparser-2.18/pycparser/0000775000175000017500000000000013127011712016177 5ustar elibeneliben00000000000000pycparser-2.18/pycparser/_build_tables.py0000664000175000017500000000153113045001366021344 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: _build_tables.py # # A dummy for generating the lexing/parsing tables and # compiling them into .pyc for faster execution in optimized mode. # Also generates AST code from the configuration file. # Should be called from the pycparser directory. 
# # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- # Generate c_ast.py from _ast_gen import ASTCodeGenerator ast_gen = ASTCodeGenerator('_c_ast.cfg') ast_gen.generate(open('c_ast.py', 'w')) import sys sys.path[0:0] = ['.', '..'] from pycparser import c_parser # Generates the tables # c_parser.CParser( lex_optimize=True, yacc_debug=False, yacc_optimize=True) # Load to compile into .pyc # import lextab import yacctab import c_ast pycparser-2.18/pycparser/c_ast.py0000664000175000017500000005610713127011712017653 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # ** ATTENTION ** # This code was automatically generated from the file: # _c_ast.cfg # # Do not modify it directly. Modify the configuration file and # run the generator again. # ** ** *** ** ** # # pycparser: c_ast.py # # AST Node classes. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- import sys class Node(object): __slots__ = () """ Abstract base class for AST nodes. """ def children(self): """ A sequence of all children that are Nodes """ pass def show(self, buf=sys.stdout, offset=0, attrnames=False, nodenames=False, showcoord=False, _my_node_name=None): """ Pretty print the Node and all its attributes and children (recursively) to a buffer. buf: Open IO buffer into which the Node is printed. offset: Initial offset (amount of leading spaces) attrnames: True if you want to see the attribute names in name=value pairs. False to only see the values. nodenames: True if you want to see the actual node names within their parents. showcoord: Do you want the coordinates of each Node to be displayed. 
""" lead = ' ' * offset if nodenames and _my_node_name is not None: buf.write(lead + self.__class__.__name__+ ' <' + _my_node_name + '>: ') else: buf.write(lead + self.__class__.__name__+ ': ') if self.attr_names: if attrnames: nvlist = [(n, getattr(self,n)) for n in self.attr_names] attrstr = ', '.join('%s=%s' % nv for nv in nvlist) else: vlist = [getattr(self, n) for n in self.attr_names] attrstr = ', '.join('%s' % v for v in vlist) buf.write(attrstr) if showcoord: buf.write(' (at %s)' % self.coord) buf.write('\n') for (child_name, child) in self.children(): child.show( buf, offset=offset + 2, attrnames=attrnames, nodenames=nodenames, showcoord=showcoord, _my_node_name=child_name) class NodeVisitor(object): """ A base NodeVisitor class for visiting c_ast nodes. Subclass it and define your own visit_XXX methods, where XXX is the class name you want to visit with these methods. For example: class ConstantVisitor(NodeVisitor): def __init__(self): self.values = [] def visit_Constant(self, node): self.values.append(node.value) Creates a list of values of all the constant nodes encountered below the given node. To use it: cv = ConstantVisitor() cv.visit(node) Notes: * generic_visit() will be called for AST nodes for which no visit_XXX method was defined. * The children of nodes for which a visit_XXX was defined will not be visited - if you need this, call generic_visit() on the node. You can use: NodeVisitor.generic_visit(self, node) * Modeled after Python's own AST visiting facilities (the ast module of Python 3.0) """ def visit(self, node): """ Visit a node. """ method = 'visit_' + node.__class__.__name__ visitor = getattr(self, method, self.generic_visit) return visitor(node) def generic_visit(self, node): """ Called if no explicit visitor function exists for a node. Implements preorder visiting of the node. 
""" for c_name, c in node.children(): self.visit(c) class ArrayDecl(Node): __slots__ = ('type', 'dim', 'dim_quals', 'coord', '__weakref__') def __init__(self, type, dim, dim_quals, coord=None): self.type = type self.dim = dim self.dim_quals = dim_quals self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) if self.dim is not None: nodelist.append(("dim", self.dim)) return tuple(nodelist) attr_names = ('dim_quals', ) class ArrayRef(Node): __slots__ = ('name', 'subscript', 'coord', '__weakref__') def __init__(self, name, subscript, coord=None): self.name = name self.subscript = subscript self.coord = coord def children(self): nodelist = [] if self.name is not None: nodelist.append(("name", self.name)) if self.subscript is not None: nodelist.append(("subscript", self.subscript)) return tuple(nodelist) attr_names = () class Assignment(Node): __slots__ = ('op', 'lvalue', 'rvalue', 'coord', '__weakref__') def __init__(self, op, lvalue, rvalue, coord=None): self.op = op self.lvalue = lvalue self.rvalue = rvalue self.coord = coord def children(self): nodelist = [] if self.lvalue is not None: nodelist.append(("lvalue", self.lvalue)) if self.rvalue is not None: nodelist.append(("rvalue", self.rvalue)) return tuple(nodelist) attr_names = ('op', ) class BinaryOp(Node): __slots__ = ('op', 'left', 'right', 'coord', '__weakref__') def __init__(self, op, left, right, coord=None): self.op = op self.left = left self.right = right self.coord = coord def children(self): nodelist = [] if self.left is not None: nodelist.append(("left", self.left)) if self.right is not None: nodelist.append(("right", self.right)) return tuple(nodelist) attr_names = ('op', ) class Break(Node): __slots__ = ('coord', '__weakref__') def __init__(self, coord=None): self.coord = coord def children(self): return () attr_names = () class Case(Node): __slots__ = ('expr', 'stmts', 'coord', '__weakref__') def __init__(self, expr, stmts, coord=None): 
self.expr = expr self.stmts = stmts self.coord = coord def children(self): nodelist = [] if self.expr is not None: nodelist.append(("expr", self.expr)) for i, child in enumerate(self.stmts or []): nodelist.append(("stmts[%d]" % i, child)) return tuple(nodelist) attr_names = () class Cast(Node): __slots__ = ('to_type', 'expr', 'coord', '__weakref__') def __init__(self, to_type, expr, coord=None): self.to_type = to_type self.expr = expr self.coord = coord def children(self): nodelist = [] if self.to_type is not None: nodelist.append(("to_type", self.to_type)) if self.expr is not None: nodelist.append(("expr", self.expr)) return tuple(nodelist) attr_names = () class Compound(Node): __slots__ = ('block_items', 'coord', '__weakref__') def __init__(self, block_items, coord=None): self.block_items = block_items self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.block_items or []): nodelist.append(("block_items[%d]" % i, child)) return tuple(nodelist) attr_names = () class CompoundLiteral(Node): __slots__ = ('type', 'init', 'coord', '__weakref__') def __init__(self, type, init, coord=None): self.type = type self.init = init self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) if self.init is not None: nodelist.append(("init", self.init)) return tuple(nodelist) attr_names = () class Constant(Node): __slots__ = ('type', 'value', 'coord', '__weakref__') def __init__(self, type, value, coord=None): self.type = type self.value = value self.coord = coord def children(self): nodelist = [] return tuple(nodelist) attr_names = ('type', 'value', ) class Continue(Node): __slots__ = ('coord', '__weakref__') def __init__(self, coord=None): self.coord = coord def children(self): return () attr_names = () class Decl(Node): __slots__ = ('name', 'quals', 'storage', 'funcspec', 'type', 'init', 'bitsize', 'coord', '__weakref__') def __init__(self, name, quals, storage, funcspec, type, init, 
bitsize, coord=None): self.name = name self.quals = quals self.storage = storage self.funcspec = funcspec self.type = type self.init = init self.bitsize = bitsize self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) if self.init is not None: nodelist.append(("init", self.init)) if self.bitsize is not None: nodelist.append(("bitsize", self.bitsize)) return tuple(nodelist) attr_names = ('name', 'quals', 'storage', 'funcspec', ) class DeclList(Node): __slots__ = ('decls', 'coord', '__weakref__') def __init__(self, decls, coord=None): self.decls = decls self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.decls or []): nodelist.append(("decls[%d]" % i, child)) return tuple(nodelist) attr_names = () class Default(Node): __slots__ = ('stmts', 'coord', '__weakref__') def __init__(self, stmts, coord=None): self.stmts = stmts self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.stmts or []): nodelist.append(("stmts[%d]" % i, child)) return tuple(nodelist) attr_names = () class DoWhile(Node): __slots__ = ('cond', 'stmt', 'coord', '__weakref__') def __init__(self, cond, stmt, coord=None): self.cond = cond self.stmt = stmt self.coord = coord def children(self): nodelist = [] if self.cond is not None: nodelist.append(("cond", self.cond)) if self.stmt is not None: nodelist.append(("stmt", self.stmt)) return tuple(nodelist) attr_names = () class EllipsisParam(Node): __slots__ = ('coord', '__weakref__') def __init__(self, coord=None): self.coord = coord def children(self): return () attr_names = () class EmptyStatement(Node): __slots__ = ('coord', '__weakref__') def __init__(self, coord=None): self.coord = coord def children(self): return () attr_names = () class Enum(Node): __slots__ = ('name', 'values', 'coord', '__weakref__') def __init__(self, name, values, coord=None): self.name = name self.values = values self.coord = coord def children(self): nodelist 
= [] if self.values is not None: nodelist.append(("values", self.values)) return tuple(nodelist) attr_names = ('name', ) class Enumerator(Node): __slots__ = ('name', 'value', 'coord', '__weakref__') def __init__(self, name, value, coord=None): self.name = name self.value = value self.coord = coord def children(self): nodelist = [] if self.value is not None: nodelist.append(("value", self.value)) return tuple(nodelist) attr_names = ('name', ) class EnumeratorList(Node): __slots__ = ('enumerators', 'coord', '__weakref__') def __init__(self, enumerators, coord=None): self.enumerators = enumerators self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.enumerators or []): nodelist.append(("enumerators[%d]" % i, child)) return tuple(nodelist) attr_names = () class ExprList(Node): __slots__ = ('exprs', 'coord', '__weakref__') def __init__(self, exprs, coord=None): self.exprs = exprs self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.exprs or []): nodelist.append(("exprs[%d]" % i, child)) return tuple(nodelist) attr_names = () class FileAST(Node): __slots__ = ('ext', 'coord', '__weakref__') def __init__(self, ext, coord=None): self.ext = ext self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.ext or []): nodelist.append(("ext[%d]" % i, child)) return tuple(nodelist) attr_names = () class For(Node): __slots__ = ('init', 'cond', 'next', 'stmt', 'coord', '__weakref__') def __init__(self, init, cond, next, stmt, coord=None): self.init = init self.cond = cond self.next = next self.stmt = stmt self.coord = coord def children(self): nodelist = [] if self.init is not None: nodelist.append(("init", self.init)) if self.cond is not None: nodelist.append(("cond", self.cond)) if self.next is not None: nodelist.append(("next", self.next)) if self.stmt is not None: nodelist.append(("stmt", self.stmt)) return tuple(nodelist) attr_names = () class FuncCall(Node): __slots__ = ('name', 'args', 
'coord', '__weakref__') def __init__(self, name, args, coord=None): self.name = name self.args = args self.coord = coord def children(self): nodelist = [] if self.name is not None: nodelist.append(("name", self.name)) if self.args is not None: nodelist.append(("args", self.args)) return tuple(nodelist) attr_names = () class FuncDecl(Node): __slots__ = ('args', 'type', 'coord', '__weakref__') def __init__(self, args, type, coord=None): self.args = args self.type = type self.coord = coord def children(self): nodelist = [] if self.args is not None: nodelist.append(("args", self.args)) if self.type is not None: nodelist.append(("type", self.type)) return tuple(nodelist) attr_names = () class FuncDef(Node): __slots__ = ('decl', 'param_decls', 'body', 'coord', '__weakref__') def __init__(self, decl, param_decls, body, coord=None): self.decl = decl self.param_decls = param_decls self.body = body self.coord = coord def children(self): nodelist = [] if self.decl is not None: nodelist.append(("decl", self.decl)) if self.body is not None: nodelist.append(("body", self.body)) for i, child in enumerate(self.param_decls or []): nodelist.append(("param_decls[%d]" % i, child)) return tuple(nodelist) attr_names = () class Goto(Node): __slots__ = ('name', 'coord', '__weakref__') def __init__(self, name, coord=None): self.name = name self.coord = coord def children(self): nodelist = [] return tuple(nodelist) attr_names = ('name', ) class ID(Node): __slots__ = ('name', 'coord', '__weakref__') def __init__(self, name, coord=None): self.name = name self.coord = coord def children(self): nodelist = [] return tuple(nodelist) attr_names = ('name', ) class IdentifierType(Node): __slots__ = ('names', 'coord', '__weakref__') def __init__(self, names, coord=None): self.names = names self.coord = coord def children(self): nodelist = [] return tuple(nodelist) attr_names = ('names', ) class If(Node): __slots__ = ('cond', 'iftrue', 'iffalse', 'coord', '__weakref__') def __init__(self, cond, 
iftrue, iffalse, coord=None): self.cond = cond self.iftrue = iftrue self.iffalse = iffalse self.coord = coord def children(self): nodelist = [] if self.cond is not None: nodelist.append(("cond", self.cond)) if self.iftrue is not None: nodelist.append(("iftrue", self.iftrue)) if self.iffalse is not None: nodelist.append(("iffalse", self.iffalse)) return tuple(nodelist) attr_names = () class InitList(Node): __slots__ = ('exprs', 'coord', '__weakref__') def __init__(self, exprs, coord=None): self.exprs = exprs self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.exprs or []): nodelist.append(("exprs[%d]" % i, child)) return tuple(nodelist) attr_names = () class Label(Node): __slots__ = ('name', 'stmt', 'coord', '__weakref__') def __init__(self, name, stmt, coord=None): self.name = name self.stmt = stmt self.coord = coord def children(self): nodelist = [] if self.stmt is not None: nodelist.append(("stmt", self.stmt)) return tuple(nodelist) attr_names = ('name', ) class NamedInitializer(Node): __slots__ = ('name', 'expr', 'coord', '__weakref__') def __init__(self, name, expr, coord=None): self.name = name self.expr = expr self.coord = coord def children(self): nodelist = [] if self.expr is not None: nodelist.append(("expr", self.expr)) for i, child in enumerate(self.name or []): nodelist.append(("name[%d]" % i, child)) return tuple(nodelist) attr_names = () class ParamList(Node): __slots__ = ('params', 'coord', '__weakref__') def __init__(self, params, coord=None): self.params = params self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.params or []): nodelist.append(("params[%d]" % i, child)) return tuple(nodelist) attr_names = () class PtrDecl(Node): __slots__ = ('quals', 'type', 'coord', '__weakref__') def __init__(self, quals, type, coord=None): self.quals = quals self.type = type self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) return 
tuple(nodelist) attr_names = ('quals', ) class Return(Node): __slots__ = ('expr', 'coord', '__weakref__') def __init__(self, expr, coord=None): self.expr = expr self.coord = coord def children(self): nodelist = [] if self.expr is not None: nodelist.append(("expr", self.expr)) return tuple(nodelist) attr_names = () class Struct(Node): __slots__ = ('name', 'decls', 'coord', '__weakref__') def __init__(self, name, decls, coord=None): self.name = name self.decls = decls self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.decls or []): nodelist.append(("decls[%d]" % i, child)) return tuple(nodelist) attr_names = ('name', ) class StructRef(Node): __slots__ = ('name', 'type', 'field', 'coord', '__weakref__') def __init__(self, name, type, field, coord=None): self.name = name self.type = type self.field = field self.coord = coord def children(self): nodelist = [] if self.name is not None: nodelist.append(("name", self.name)) if self.field is not None: nodelist.append(("field", self.field)) return tuple(nodelist) attr_names = ('type', ) class Switch(Node): __slots__ = ('cond', 'stmt', 'coord', '__weakref__') def __init__(self, cond, stmt, coord=None): self.cond = cond self.stmt = stmt self.coord = coord def children(self): nodelist = [] if self.cond is not None: nodelist.append(("cond", self.cond)) if self.stmt is not None: nodelist.append(("stmt", self.stmt)) return tuple(nodelist) attr_names = () class TernaryOp(Node): __slots__ = ('cond', 'iftrue', 'iffalse', 'coord', '__weakref__') def __init__(self, cond, iftrue, iffalse, coord=None): self.cond = cond self.iftrue = iftrue self.iffalse = iffalse self.coord = coord def children(self): nodelist = [] if self.cond is not None: nodelist.append(("cond", self.cond)) if self.iftrue is not None: nodelist.append(("iftrue", self.iftrue)) if self.iffalse is not None: nodelist.append(("iffalse", self.iffalse)) return tuple(nodelist) attr_names = () class TypeDecl(Node): __slots__ = ('declname', 
'quals', 'type', 'coord', '__weakref__') def __init__(self, declname, quals, type, coord=None): self.declname = declname self.quals = quals self.type = type self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) return tuple(nodelist) attr_names = ('declname', 'quals', ) class Typedef(Node): __slots__ = ('name', 'quals', 'storage', 'type', 'coord', '__weakref__') def __init__(self, name, quals, storage, type, coord=None): self.name = name self.quals = quals self.storage = storage self.type = type self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) return tuple(nodelist) attr_names = ('name', 'quals', 'storage', ) class Typename(Node): __slots__ = ('name', 'quals', 'type', 'coord', '__weakref__') def __init__(self, name, quals, type, coord=None): self.name = name self.quals = quals self.type = type self.coord = coord def children(self): nodelist = [] if self.type is not None: nodelist.append(("type", self.type)) return tuple(nodelist) attr_names = ('name', 'quals', ) class UnaryOp(Node): __slots__ = ('op', 'expr', 'coord', '__weakref__') def __init__(self, op, expr, coord=None): self.op = op self.expr = expr self.coord = coord def children(self): nodelist = [] if self.expr is not None: nodelist.append(("expr", self.expr)) return tuple(nodelist) attr_names = ('op', ) class Union(Node): __slots__ = ('name', 'decls', 'coord', '__weakref__') def __init__(self, name, decls, coord=None): self.name = name self.decls = decls self.coord = coord def children(self): nodelist = [] for i, child in enumerate(self.decls or []): nodelist.append(("decls[%d]" % i, child)) return tuple(nodelist) attr_names = ('name', ) class While(Node): __slots__ = ('cond', 'stmt', 'coord', '__weakref__') def __init__(self, cond, stmt, coord=None): self.cond = cond self.stmt = stmt self.coord = coord def children(self): nodelist = [] if self.cond is not None: 
nodelist.append(("cond", self.cond)) if self.stmt is not None: nodelist.append(("stmt", self.stmt)) return tuple(nodelist) attr_names = () class Pragma(Node): __slots__ = ('string', 'coord', '__weakref__') def __init__(self, string, coord=None): self.string = string self.coord = coord def children(self): nodelist = [] return tuple(nodelist) attr_names = ('string', ) pycparser-2.18/pycparser/lextab.py0000664000175000017500000001554313127010662020043 0ustar elibeneliben00000000000000# lextab.py. This file automatically created by PLY (version 3.10). Don't edit! _tabversion = '3.10' _lextokens = set(('VOID', 'LBRACKET', 'WCHAR_CONST', 'FLOAT_CONST', 'MINUS', 'RPAREN', 'LONG', 'PLUS', 'ELLIPSIS', 'GT', 'GOTO', 'ENUM', 'PERIOD', 'GE', 'INT_CONST_DEC', 'ARROW', '__INT128', 'HEX_FLOAT_CONST', 'DOUBLE', 'MINUSEQUAL', 'INT_CONST_OCT', 'TIMESEQUAL', 'OR', 'SHORT', 'RETURN', 'RSHIFTEQUAL', 'RESTRICT', 'STATIC', 'SIZEOF', 'UNSIGNED', 'UNION', 'COLON', 'WSTRING_LITERAL', 'DIVIDE', 'FOR', 'PLUSPLUS', 'EQUALS', 'ELSE', 'INLINE', 'EQ', 'AND', 'TYPEID', 'LBRACE', 'PPHASH', 'INT', 'SIGNED', 'CONTINUE', 'NOT', 'OREQUAL', 'MOD', 'RSHIFT', 'DEFAULT', 'CHAR', 'WHILE', 'DIVEQUAL', 'EXTERN', 'CASE', 'LAND', 'REGISTER', 'MODEQUAL', 'NE', 'SWITCH', 'INT_CONST_HEX', '_COMPLEX', 'PPPRAGMASTR', 'PLUSEQUAL', 'STRUCT', 'CONDOP', 'BREAK', 'VOLATILE', 'PPPRAGMA', 'ANDEQUAL', 'INT_CONST_BIN', 'DO', 'LNOT', 'CONST', 'LOR', 'CHAR_CONST', 'LSHIFT', 'RBRACE', '_BOOL', 'LE', 'SEMI', 'LT', 'COMMA', 'OFFSETOF', 'TYPEDEF', 'XOR', 'AUTO', 'TIMES', 'LPAREN', 'MINUSMINUS', 'ID', 'IF', 'STRING_LITERAL', 'FLOAT', 'XOREQUAL', 'LSHIFTEQUAL', 'RBRACKET')) _lexreflags = 64 _lexliterals = '' _lexstateinfo = {'ppline': 'exclusive', 'pppragma': 'exclusive', 'INITIAL': 'inclusive'} _lexstatere = {'ppline': 
[('(?P"([^"\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*")|(?P(0(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?)|([1-9][0-9]*(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?))|(?P\\n)|(?Pline)', [None, ('t_ppline_FILENAME', 'FILENAME'), None, None, None, None, None, None, ('t_ppline_LINE_NUMBER', 'LINE_NUMBER'), None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, ('t_ppline_NEWLINE', 'NEWLINE'), ('t_ppline_PPLINE', 'PPLINE')])], 'pppragma': [('(?P\\n)|(?Ppragma)|(?P.+)', [None, ('t_pppragma_NEWLINE', 'NEWLINE'), ('t_pppragma_PPPRAGMA', 'PPPRAGMA'), ('t_pppragma_STR', 'STR')])], 'INITIAL': [('(?P[ \\t]*\\#)|(?P\\n+)|(?P\\{)|(?P\\})|(?P((((([0-9]*\\.[0-9]+)|([0-9]+\\.))([eE][-+]?[0-9]+)?)|([0-9]+([eE][-+]?[0-9]+)))[FfLl]?))|(?P(0[xX]([0-9a-fA-F]+|((([0-9a-fA-F]+)?\\.[0-9a-fA-F]+)|([0-9a-fA-F]+\\.)))([pP][+-]?[0-9]+)[FfLl]?))|(?P0[xX][0-9a-fA-F]+(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?)', [None, ('t_PPHASH', 'PPHASH'), ('t_NEWLINE', 'NEWLINE'), ('t_LBRACE', 'LBRACE'), ('t_RBRACE', 'RBRACE'), ('t_FLOAT_CONST', 'FLOAT_CONST'), None, None, None, None, None, None, None, None, None, ('t_HEX_FLOAT_CONST', 'HEX_FLOAT_CONST'), None, None, None, None, None, None, None, ('t_INT_CONST_HEX', 'INT_CONST_HEX')]), 
('(?P0[bB][01]+(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?)|(?P0[0-7]*[89])|(?P0[0-7]*(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?)|(?P(0(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?)|([1-9][0-9]*(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?))|(?P\'([^\'\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))\')|(?PL\'([^\'\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))\')|(?P(\'([^\'\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*\\n)|(\'([^\'\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*$))|(?P(\'([^\'\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))[^\'\n]+\')|(\'\')|(\'([\\\\][^a-zA-Z._~^!=&\\^\\-\\\\?\'"x0-7])[^\'\\n]*\'))', [None, ('t_INT_CONST_BIN', 'INT_CONST_BIN'), None, None, None, None, None, None, None, ('t_BAD_CONST_OCT', 'BAD_CONST_OCT'), ('t_INT_CONST_OCT', 'INT_CONST_OCT'), None, None, None, None, None, None, None, ('t_INT_CONST_DEC', 'INT_CONST_DEC'), None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, None, ('t_CHAR_CONST', 'CHAR_CONST'), None, None, None, None, None, None, ('t_WCHAR_CONST', 'WCHAR_CONST'), None, None, None, None, None, None, ('t_UNMATCHED_QUOTE', 'UNMATCHED_QUOTE'), None, None, None, None, None, None, None, None, None, None, None, None, None, None, ('t_BAD_CHAR_CONST', 'BAD_CHAR_CONST')]), 
('(?PL"([^"\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*")|(?P"([^"\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*?([\\\\][^a-zA-Z._~^!=&\\^\\-\\\\?\'"x0-7])([^"\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*")|(?P[a-zA-Z_$][0-9a-zA-Z_$]*)|(?P"([^"\\\\\\n]|(\\\\(([a-zA-Z._~!=&\\^\\-\\\\?\'"])|(\\d+)|(x[0-9a-fA-F]+))))*")|(?P\\.\\.\\.)|(?P\\+\\+)|(?P\\|\\|)|(?P\\^=)|(?P\\|=)|(?P<<=)|(?P>>=)|(?P\\+=)|(?P\\*=)|(?P\\+)|(?P%=)|(?P/=)', [None, ('t_WSTRING_LITERAL', 'WSTRING_LITERAL'), None, None, None, None, None, None, ('t_BAD_STRING_LITERAL', 'BAD_STRING_LITERAL'), None, None, None, None, None, None, None, None, None, None, None, None, None, ('t_ID', 'ID'), (None, 'STRING_LITERAL'), None, None, None, None, None, None, (None, 'ELLIPSIS'), (None, 'PLUSPLUS'), (None, 'LOR'), (None, 'XOREQUAL'), (None, 'OREQUAL'), (None, 'LSHIFTEQUAL'), (None, 'RSHIFTEQUAL'), (None, 'PLUSEQUAL'), (None, 'TIMESEQUAL'), (None, 'PLUS'), (None, 'MODEQUAL'), (None, 'DIVEQUAL')]), ('(?P\\])|(?P\\?)|(?P\\^)|(?P<<)|(?P<=)|(?P\\()|(?P->)|(?P==)|(?P!=)|(?P--)|(?P\\|)|(?P\\*)|(?P\\[)|(?P>=)|(?P\\))|(?P&&)|(?P>>)|(?P-=)|(?P\\.)|(?P&=)|(?P=)|(?P<)|(?P,)|(?P/)|(?P&)|(?P%)|(?P;)|(?P-)|(?P>)|(?P:)|(?P~)|(?P!)', [None, (None, 'RBRACKET'), (None, 'CONDOP'), (None, 'XOR'), (None, 'LSHIFT'), (None, 'LE'), (None, 'LPAREN'), (None, 'ARROW'), (None, 'EQ'), (None, 'NE'), (None, 'MINUSMINUS'), (None, 'OR'), (None, 'TIMES'), (None, 'LBRACKET'), (None, 'GE'), (None, 'RPAREN'), (None, 'LAND'), (None, 'RSHIFT'), (None, 'MINUSEQUAL'), (None, 'PERIOD'), (None, 'ANDEQUAL'), (None, 'EQUALS'), (None, 'LT'), (None, 'COMMA'), (None, 'DIVIDE'), (None, 'AND'), (None, 'MOD'), (None, 'SEMI'), (None, 'MINUS'), (None, 'GT'), (None, 'COLON'), (None, 'NOT'), (None, 'LNOT')])]} _lexstateignore = {'ppline': ' \t', 'pppragma': ' \t', 'INITIAL': ' \t'} _lexstateerrorf = {'ppline': 't_ppline_error', 'pppragma': 't_pppragma_error', 'INITIAL': 't_error'} 
_lexstateeoff = {} pycparser-2.18/pycparser/__pycache__/0000775000175000017500000000000013127011713020410 5ustar elibeneliben00000000000000pycparser-2.18/pycparser/__pycache__/plyparser.cpython-36.pyc0000664000175000017500000001044713127011712025064 0ustar elibeneliben000000000000003 õ±ÂX¦ã@sLGdd„deƒZGdd„deƒZGdd„deƒZdd„Zdd „Zd d „Zd S) c@s&eZdZdZd Zd dd„Zd d „ZdS) ÚCoordzž Coordinates of a syntactic element. Consists of: - File name - Line number - (optional) column number, for the Lexer ÚfileÚlineÚcolumnÚ __weakref__NcCs||_||_||_dS)N)rrr)Úselfrrr©rú../pycparser/plyparser.pyÚ__init__szCoord.__init__cCs(d|j|jf}|jr$|d|j7}|S)Nz%s:%sz:%s)rrr)rÚstrrrrÚ__str__sz Coord.__str__)rrrr)N)Ú__name__Ú __module__Ú __qualname__Ú__doc__Ú __slots__r r rrrrr s rc@s eZdZdS)Ú ParseErrorN)r r rrrrrrsrc@s.eZdZdd„Zd dd„Zdd„Zdd „ZdS) Ú PLYParsercCs<|d}dd„}d||f|_d||_t|j|j|ƒdS)zŽ Given a rule name, creates an optional ply.yacc rule for it. The name of the optional rule is _opt Z_optcSs|d|d<dS)Néér)rÚprrrÚoptrule)sz+PLYParser._create_opt_rule..optrulez%s : empty | %szp_%sN)rr ÚsetattrÚ __class__)rZrulenameZoptnamerrrrÚ_create_opt_rule"s  zPLYParser._create_opt_ruleNcCst|jj||dS)N)rrr)rÚclexÚfilename)rÚlinenorrrrÚ_coord0szPLYParser._coordcCsF|jjjjdd|j|ƒƒ}|dkr&d}|j|ƒ|}|j|j|ƒ|ƒS)zÖ Returns the coordinates for the YaccProduction objet 'p' indexed with 'token_idx'. The coordinate includes the 'lineno' and 'column'. Both follow the lex semantic, starting from 1. Ú rréÿÿÿÿ)ÚlexerÚlexdataÚrfindÚlexposrr)rrZ token_idxÚlast_crrrrrÚ _token_coord6s zPLYParser._token_coordcCstd||fƒ‚dS)Nz%s: %s)r)rÚmsgÚcoordrrrÚ _parse_errorAszPLYParser._parse_error)N)r r rrrr%r(rrrrr!s  rcs‡fdd„}|S)aÞ Decorator to create parameterized rules. Parameterized rule methods must be named starting with 'p_' and contain 'xxx', and their docstrings may contain 'xxx' and 'yyy'. These will be replaced by the given parameter tuples. 
For example, ``p_xxx_rule()`` with docstring 'xxx_rule : yyy' when decorated with ``@parameterized(('id', 'ID'))`` produces ``p_id_rule()`` with the docstring 'id_rule : ID'. Using multiple tuples produces multiple rules. cs ˆ|_|S)N)Ú_params)Z rule_func)ÚparamsrrÚdecorateOszparameterized..decorater)r*r+r)r*rÚ parameterizedEs r,cCsHxBt|ƒD]6}|jdƒr t||ƒ}t|dƒr t||ƒt||ƒq W|S)z Class decorator to generate rules from parameterized rule templates. See `parameterized` for more information on parameterized rules. Úp_r))ÚdirÚ startswithÚgetattrÚhasattrÚdelattrÚ_create_param_rules)ÚclsZ attr_nameÚmethodrrrÚtemplateUs    r6csZxTˆjD]J\}}‡fdd„}ˆjjd|ƒjd|ƒ|_ˆjjd|ƒ|_t||j|ƒqWdS)a Create ply.yacc rules based on a parameterized rule function Generates new methods (one per each pair of parameters) based on the template rule function `func`, and attaches them to `cls`. The rule function's parameters must be accessible via its `_params` attribute. csˆ||ƒdS)Nr)rr)ÚfuncrrÚ param_rulelsz'_create_param_rules..param_ruleÚxxxÚyyyN)r)rÚreplacer r)r4r7r9r:r8r)r7rr3cs  r3N)ÚobjectrÚ Exceptionrrr,r6r3rrrrÚ s $pycparser-2.18/pycparser/__pycache__/_ast_gen.cpython-36.pyc0000664000175000017500000002065213127011712024621 0ustar elibeneliben000000000000003 ö”Xã!ã@shddlZddlmZGdd„deƒZGdd„deƒZdZdZed krdddl Z ed ƒZ e j e d d ƒƒdS) éN)ÚTemplatec@s(eZdZd dd„Zd dd„Zdd„ZdS) ÚASTCodeGeneratorú _c_ast.cfgcCs ||_dd„|j|ƒDƒ|_dS)zN Initialize the code generator from a configuration file. cSsg|]\}}t||ƒ‘qS©)ÚNodeCfg)Ú.0ÚnameÚcontentsrrú?/home/eliben/eli/pycparser/pycparser-2.18/pycparser/_ast_gen.pyú sz-ASTCodeGenerator.__init__..N)Ú cfg_filenameÚ parse_cfgfileÚnode_cfg)Úselfr rrr Ú__init__szASTCodeGenerator.__init__NcCsHttƒj|jd}|t7}x|jD]}||jƒd7}q"W|j|ƒdS)z< Generates the code into file, an open file buffer. 
)r z N)rÚ_PROLOGUE_COMMENTZ substituter Ú_PROLOGUE_CODErÚgenerate_sourceÚwrite)rÚfileÚsrcrrrr Úgenerates   zASTCodeGenerator.generatec csÊt|dƒ¶}x®|D]¦}|jƒ}| s|jdƒr0q|jdƒ}|jdƒ}|jdƒ}|dksf||ksf||krvtd||fƒ‚|d|…}||d|…}|rªd d „|jd ƒDƒng} || fVqWWdQRXdS) ze Parse the configuration file and yield pairs of (name, contents) for each node. Úrú#ú:ú[ú]ézInvalid line in %s: %s NcSsg|] }|jƒ‘qSr)Ústrip)rÚvrrr r 7sz2ASTCodeGenerator.parse_cfgfile..ú,)ÚopenrÚ startswithÚfindÚ RuntimeErrorÚsplit) rÚfilenameÚfÚlineZcolon_iZ lbracket_iZ rbracket_irÚvalZvallistrrr r &s      zASTCodeGenerator.parse_cfgfile)r)N)Ú__name__Ú __module__Ú __qualname__rrr rrrr rs  rc@s8eZdZdZdd„Zdd„Zdd„Zdd „Zd d „Zd S) rzº Node configuration. name: node name contents: a list of contents - attributes and child nodes See comment at the top of the configuration file for details. cCs‚||_g|_g|_g|_g|_x^|D]V}|jdƒ}|jj|ƒ|jdƒrV|jj|ƒq$|jdƒrn|jj|ƒq$|jj|ƒq$WdS)NÚ*z**)rÚ all_entriesÚattrÚchildÚ seq_childÚrstripÚappendÚendswith)rrr ÚentryZ clean_entryrrr rBs     zNodeCfg.__init__cCs,|jƒ}|d|jƒ7}|d|jƒ7}|S)NÚ )Ú _gen_initÚ _gen_childrenÚ_gen_attr_names)rrrrr rTszNodeCfg.generate_sourcecCsŽd|j}|jrDdj|jƒ}djdd„|jDƒƒ}|d7}d|}nd}d}|d |7}|d |7}x$|jd gD]}|d ||f7}qrW|S) Nzclass %s(Node): z, css|]}dj|ƒVqdS)z'{0}'N)Úformat)rÚerrr ú _sz$NodeCfg._gen_init..z, 'coord', '__weakref__'z(self, %s, coord=None)z'coord', '__weakref__'z(self, coord=None)z __slots__ = (%s) z def __init__%s: Zcoordz self.%s = %s )rr.Újoin)rrÚargsZslotsZarglistrrrr r7Zs     zNodeCfg._gen_initcCspd}|jrd|d7}x$|jD]}|ddt|d7}qWx |jD]}|dt|d7}q@W|d7}n|d7}|S) Nz def children(self): z nodelist = [] z& if self.%(child)s is not None:z0 nodelist.append(("%(child)s", self.%(child)s)) )r0zu for i, child in enumerate(self.%(child)s or []): nodelist.append(("%(child)s[%%d]" %% i, child)) z return tuple(nodelist) z return () )r.r0Údictr1)rrr0r1rrr r8ns   zNodeCfg._gen_childrencCs"ddjdd„|jDƒƒd}|S)Nz attr_names = (Úcss|]}d|VqdS)z%r, Nr)rZnmrrr r<‡sz*NodeCfg._gen_attr_names..ú))r=r/)rrrrr 
r9†szNodeCfg._gen_attr_namesN) r*r+r,Ú__doc__rrr7r8r9rrrr r;s ra¼#----------------------------------------------------------------- # ** ATTENTION ** # This code was automatically generated from the file: # $cfg_filename # # Do not modify it directly. Modify the configuration file and # run the generator again. # ** ** *** ** ** # # pycparser: c_ast.py # # AST Node classes. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- a× import sys class Node(object): __slots__ = () """ Abstract base class for AST nodes. """ def children(self): """ A sequence of all children that are Nodes """ pass def show(self, buf=sys.stdout, offset=0, attrnames=False, nodenames=False, showcoord=False, _my_node_name=None): """ Pretty print the Node and all its attributes and children (recursively) to a buffer. buf: Open IO buffer into which the Node is printed. offset: Initial offset (amount of leading spaces) attrnames: True if you want to see the attribute names in name=value pairs. False to only see the values. nodenames: True if you want to see the actual node names within their parents. showcoord: Do you want the coordinates of each Node to be displayed. 
""" lead = ' ' * offset if nodenames and _my_node_name is not None: buf.write(lead + self.__class__.__name__+ ' <' + _my_node_name + '>: ') else: buf.write(lead + self.__class__.__name__+ ': ') if self.attr_names: if attrnames: nvlist = [(n, getattr(self,n)) for n in self.attr_names] attrstr = ', '.join('%s=%s' % nv for nv in nvlist) else: vlist = [getattr(self, n) for n in self.attr_names] attrstr = ', '.join('%s' % v for v in vlist) buf.write(attrstr) if showcoord: buf.write(' (at %s)' % self.coord) buf.write('\n') for (child_name, child) in self.children(): child.show( buf, offset=offset + 2, attrnames=attrnames, nodenames=nodenames, showcoord=showcoord, _my_node_name=child_name) class NodeVisitor(object): """ A base NodeVisitor class for visiting c_ast nodes. Subclass it and define your own visit_XXX methods, where XXX is the class name you want to visit with these methods. For example: class ConstantVisitor(NodeVisitor): def __init__(self): self.values = [] def visit_Constant(self, node): self.values.append(node.value) Creates a list of values of all the constant nodes encountered below the given node. To use it: cv = ConstantVisitor() cv.visit(node) Notes: * generic_visit() will be called for AST nodes for which no visit_XXX method was defined. * The children of nodes for which a visit_XXX was defined will not be visited - if you need this, call generic_visit() on the node. You can use: NodeVisitor.generic_visit(self, node) * Modeled after Python's own AST visiting facilities (the ast module of Python 3.0) """ def visit(self, node): """ Visit a node. """ method = 'visit_' + node.__class__.__name__ visitor = getattr(self, method, self.generic_visit) return visitor(node) def generic_visit(self, node): """ Called if no explicit visitor function exists for a node. Implements preorder visiting of the node. 
""" for c_name, c in node.children(): self.visit(c) Z__main__z _c_ast.cfgzc_ast.pyÚw) ÚpprintZstringrÚobjectrrrrr*ÚsysÚast_genrr!rrrr Ú s *brpycparser-2.18/pycparser/__pycache__/c_parser.cpython-36.pyc0000664000175000017500000016542713127011712024652 0ustar elibeneliben000000000000003 õ±ÂX$ã@sŒddlZddlmZddlmZddlmZddlmZm Z m Z m Z m Z ddl mZe Gdd „d eƒƒZed krˆddlZddlZddlZdS) éNé)Úyacc)Úc_ast)ÚCLexer)Ú PLYParserÚCoordÚ ParseErrorÚ parameterizedÚtemplate)Úfix_switch_casesc @sVeZdZdedddddfdd„ZdZd d „Zd d „Zd d„Zdd„Zdd„Z dd„Z dd„Z dd„Z dd„Z dd„Zdd„Zdd „Zd!d"„Zd[d#d$„Zd\d%d&„Zd'd(„Zd)d*„ZdgZd>d?„Zd@dA„ZdBdC„ZdDdE„ZdFdG„ZdHdI„ZdJdK„ZdLdM„ZdNdO„ZdPdQ„Z dRdS„Z!dTdU„Z"dVdW„Z#dXdY„Z$dZd[„Z%d\d]„Z&d^d_„Z'd`da„Z(dbdc„Z)ddde„Z*dfdg„Z+dhdi„Z,djdk„Z-dldm„Z.dndo„Z/dpdq„Z0drds„Z1dtdu„Z2dvdw„Z3dxdy„Z4dzd{„Z5d|d}„Z6d~d„Z7d€d„Z8d‚dƒ„Z9d„d…„Z:d†d‡„Z;dˆd‰„ZdŽd„Z?dd‘„Z@d’d“„ZAd”d•„ZBd–d—„ZCd˜d™„ZDdšd›„ZEdœd„ZFdždŸ„ZGd d¡„ZHd¢d£„ZId¤d¥„ZJd¦d§„ZKeLdhdidjƒd­d®„ƒZMeLdkdldmƒd¯d°„ƒZNeLdndodpƒd±d²„ƒZOeLdqdrƒd³d´„ƒZPeLdsdtduƒdµd¶„ƒZQeLdvdwdxƒd·d¸„ƒZReLdydzd{ƒd¹dº„ƒZSeLd|d}d~ƒd»d¼„ƒZTd½d¾„ZUd¿dÀ„ZVdÁd„ZWdÃdÄ„ZXdÅdÆ„ZYdÇdÈ„ZZdÉdÊ„Z[dËdÌ„Z\dÍd΄Z]dÏdЄZ^dÑdÒ„Z_dÓdÔ„Z`dÕdÖ„Zad×dØ„ZbdÙdÚ„ZcdÛdÜ„ZddÝdÞ„Zedßdà„Zfdádâ„Zgdãdä„Zhdådæ„Zidçdè„Zjdédê„Zkdëdì„Zldídî„Zmdïdð„Zndñdò„Zodódô„Zpdõdö„Zqd÷dø„Zrdùdú„Zsdûdü„Ztdýdþ„Zudÿd„Zvdd„Zwdd„Zxdd„Zydd„Zzd d „Z{d d „Z|d d„Z}dd„Z~dd„Zdd„Z€dd„Zdd„Z‚dd„Zƒdd„Z„dd„Z…dd „Z†d!d"„Z‡d#d$„Zˆd%d&„Z‰d'd(„ZŠd)d*„Z‹d+d,„ZŒd-d.„Zd/d0„ZŽd1d2„Zd3d4„Zd5d6„Z‘d7d8„Z’d9d:„Z“d;d<„Z”d=d>„Z•d?d@„Z–dAdB„Z—dCdD„Z˜dEdF„Z™dGdH„ZšdIdJ„Z›dKdL„ZœdMdN„ZdOdP„ZždQdR„ZŸdSdT„Z dUdV„Z¡dWdX„Z¢dYS(ÚCParserTzpycparser.lextabzpycparser.yacctabFÚc Csš||j|j|j|jd|_|jj|||d|jj|_ddddddd d d d d dddg}x|D]} |j| ƒq\Wtj|d||||d|_ t ƒg|_ d|_ dS)a Create a new CParser. Some arguments for controlling the debug/optimization level of the parser are provided. The defaults are tuned for release/performance mode. 
The simple rules for using them are: *) When tweaking CParser/CLexer, set these to False *) When releasing a stable parser, set to True lex_optimize: Set to False when you're modifying the lexer. Otherwise, changes in the lexer won't be used, if some lextab.py file exists. When releasing with a stable lexer, set to True to save the re-generation of the lexer table on each run. lexer: Set this parameter to define the lexer to use if you're not using the default CLexer. lextab: Points to the lex table that's used for optimized mode. Only if you're modifying the lexer and want some tests to avoid re-generating the table, make this point to a local lex table file (that's been earlier generated with lex_optimize=True) yacc_optimize: Set to False when you're modifying the parser. Otherwise, changes in the parser won't be used, if some parsetab.py file exists. When releasing with a stable parser, set to True to save the re-generation of the parser table on each run. yacctab: Points to the yacc table that's used for optimized mode. Only if you're modifying the parser, make this point to a local yacc table file yacc_debug: Generate a parser.out file that explains how yacc built the parsing table from the grammar. taboutputdir: Set this parameter to control the location of generated lextab and yacctab files. 
)Z error_funcZon_lbrace_funcZon_rbrace_funcZtype_lookup_func)ÚoptimizeÚlextabÚ outputdirZabstract_declaratorZassignment_expressionZdeclaration_listZdeclaration_specifiers_no_typeZ designationZ expressionZidentifier_listZinit_declarator_listZid_init_declarator_listZinitializer_listZparameter_type_listZblock_item_listZtype_qualifier_listZstruct_declarator_listZtranslation_unit_or_empty)ÚmoduleÚstartÚdebugrZ tabmodulerN) Ú_lex_error_funcÚ_lex_on_lbrace_funcÚ_lex_on_rbrace_funcÚ_lex_type_lookup_funcÚclexZbuildÚtokensZ_create_opt_rulerÚcparserÚdictÚ _scope_stackÚ_last_yielded_token) ÚselfÚ lex_optimizeÚlexerrÚ yacc_optimizeÚyacctabÚ yacc_debugZ taboutputdirZrules_with_optZrule©r$ú../pycparser/c_parser.pyÚ__init__sF:    zCParser.__init__rcCs6||j_|jjƒtƒg|_d|_|jj||j|dS)a& Parses C code and returns an AST. text: A string containing the C source code filename: Name of the file being parsed (for meaningful error messages) debuglevel: Debug level to yacc N)Úinputr r)rÚfilenameZ reset_linenorrrrÚparse)rÚtextr(Z debuglevelr$r$r%r)„s   z CParser.parsecCs|jjtƒƒdS)N)rÚappendr)rr$r$r%Ú _push_scopeœszCParser._push_scopecCs t|jƒdkst‚|jjƒdS)Nr)ÚlenrÚAssertionErrorÚpop)rr$r$r%Ú _pop_scopeŸszCParser._pop_scopecCs4|jdj|dƒs"|jd||ƒd|jd|<dS)zC Add a new typedef name (ie a TYPEID) to the current scope rTz;Typedef %r previously declared as non-typedef in this scopeNéÿÿÿÿr1)rÚgetÚ _parse_error)rÚnameÚcoordr$r$r%Ú_add_typedef_name£s  zCParser._add_typedef_namecCs4|jdj|dƒr"|jd||ƒd|jd|<dS)ze Add a new object, function, or enum member name (ie an ID) to the current scope rFz;Non-typedef %r previously declared as typedef in this scopeNr1r1)rr2r3)rr4r5r$r$r%Ú_add_identifier¬s  zCParser._add_identifiercCs.x(t|jƒD]}|j|ƒ}|dk r |Sq WdS)z8 Is *name* a typedef-name in the current scope? 
NF)Úreversedrr2)rr4ZscopeZin_scoper$r$r%Ú_is_type_in_scope¶s  zCParser._is_type_in_scopecCs|j||j||ƒƒdS)N)r3Ú_coord)rÚmsgÚlineÚcolumnr$r$r%rÀszCParser._lex_error_funccCs |jƒdS)N)r,)rr$r$r%rÃszCParser._lex_on_lbrace_funccCs |jƒdS)N)r0)rr$r$r%rÆszCParser._lex_on_rbrace_funccCs|j|ƒ}|S)z§ Looks up types that were previously defined with typedef. Passed to the lexer for recognizing identifiers that are types. )r9)rr4Zis_typer$r$r%rÉs zCParser._lex_type_lookup_funccCs|jjS)z§ We need access to yacc's lookahead token in certain cases. This is the last token yacc requested from the lexer, so we ask the lexer. )rZ last_token)rr$r$r%Ú_get_yacc_lookahead_tokenÒsz!CParser._get_yacc_lookahead_tokencCsd|}|}x|jr|j}q Wt|tjƒr0||_|S|}xt|jtjƒsL|j}q6W|j|_||_|SdS)z  Tacks a type modifier on a declarator, and returns the modified declarator. Note: the declarator and modifier may be modified N)ÚtypeÚ isinstancerÚTypeDecl)rÚdeclÚmodifierZ modifier_headZ modifier_tailZ decl_tailr$r$r%Ú_type_modify_declûs    zCParser._type_modify_declcCsÆ|}xt|tjƒs|j}qW|j|_|j|_x>|D]6}t|tjƒs2t|ƒdkr^|j d|j ƒq2||_|Sq2W|s¢t|jtj ƒsŒ|j d|j ƒtjdg|j d|_n tjdd„|Dƒ|dj d|_|S) z- Fixes a declaration. Modifies decl. rz Invalid multiple types specifiedzMissing type in declarationÚint)r5cSsg|]}|jD]}|‘qqSr$)Únames)Ú.0Úidr4r$r$r%ú [sz/CParser._fix_decl_name_type..r) r@rrAr?Údeclnamer4ÚqualsÚIdentifierTyper-r3r5ÚFuncDecl)rrBÚtypenamer?Ztnr$r$r%Ú_fix_decl_name_type2s.       zCParser._fix_decl_name_typecCs<|ptggggd}|r(||j|ƒn||jd|ƒ|S)a‡ Declaration specifiers are represented by a dictionary with the entries: * qual: a list of type qualifiers * storage: a list of storage type qualifiers * type: a list of type specifiers * function: a list of function specifiers This method is given a declaration specifier, and a new specifier of a given kind. If `append` is True, the new specifier is added to the end of the specifiers list, otherwise it's added at the beginning. 
Returns the declaration specifier, with the new specifier incorporated. )ÚqualÚstorager?Úfunctionr)rr+Úinsert)rZdeclspecZnewspecZkindr+Úspecr$r$r%Ú_add_declaration_specifier_s z"CParser._add_declaration_specifierc CsRd|dk}g}|djdƒdk r&n4|dddkrèt|dƒdksvt|ddjƒd ksv|j|ddjdƒ rªd }x"|dD]}t|d ƒr„|j}Pq„W|jd |ƒtj|ddjddd|ddjd |dd<|dd=nrt |ddtj tj tj fƒsZ|dd}xt |tjƒs.|j }qW|jdkrZ|ddjd|_|dd=xò|D]ê} | ddk svt‚|r¤tjd|d|d| d| djd} n|r.|j| j| jƒn|j| j| jƒ|j| ƒq`W|S)zÿ Builds a list of declarations all sharing the given specifiers. If typedef_namespace is true, each declared name is added to the "typedef namespace", which also includes objects, functions, and enum constants. ÚtypedefrQrÚbitsizeNrBr?érú?r5zInvalid declaration)rJr?rKr5rP)r4rKrQr?r5rRÚinit)r4rKrQÚfuncspecr?rZrWr5r1r1r1r1r1r1r1)r2r-rFr9Úhasattrr5r3rrAr@ÚStructÚUnionrLr?rJr.ZTypedefÚDeclrOr6r4r7r+) rrTÚdeclsÚtypedef_namespaceZ is_typedefZ declarationsr5ÚtZ decls_0_tailrBÚ declarationZ fixed_declr$r$r%Ú_build_declarationswsn &         zCParser._build_declarationscCsBd|dkst‚|j|t|ddgddd}tj||||jdS) z' Builds a function definition. rVrQN)rBrZT)rTr`rar)rBÚ param_declsÚbodyr5)r.rdrrZFuncDefr5)rrTrBrerfrcr$r$r%Ú_build_function_definitionÐs  z"CParser._build_function_definitioncCs|dkrtjStjSdS)z` Given a token (either STRUCT or UNION), selects the appropriate AST class. 
ZstructN)rr]r^)rÚtokenr$r$r%Ú_select_struct_union_classàsz"CParser._select_struct_union_classÚleftÚLORÚLANDÚORÚXORÚANDÚEQÚNEÚGTÚGEÚLTÚLEÚRSHIFTÚLSHIFTÚPLUSÚMINUSÚTIMESÚDIVIDEÚMODcCs2|ddkrtjgƒ|d<ntj|dƒ|d<dS)zh translation_unit_or_empty : translation_unit | empty rNr)rZFileAST)rÚpr$r$r%Úp_translation_unit_or_emptys z#CParser.p_translation_unit_or_emptycCs|d|d<dS)z4 translation_unit : external_declaration rrNr$)rr}r$r$r%Úp_translation_unit_1 szCParser.p_translation_unit_1cCs.|ddk r|dj|dƒ|d|d<dS)zE translation_unit : translation_unit external_declaration rXNrr)Úextend)rr}r$r$r%Úp_translation_unit_2s zCParser.p_translation_unit_2cCs|dg|d<dS)z7 external_declaration : function_definition rrNr$)rr}r$r$r%Úp_external_declaration_1sz CParser.p_external_declaration_1cCs|d|d<dS)z/ external_declaration : declaration rrNr$)rr}r$r$r%Úp_external_declaration_2"sz CParser.p_external_declaration_2cCs|dg|d<dS)zi external_declaration : pp_directive | pppragma_directive rrNr$)rr}r$r$r%Úp_external_declaration_3'sz CParser.p_external_declaration_3cCs d|d<dS)z( external_declaration : SEMI Nrr$)rr}r$r$r%Úp_external_declaration_4-sz CParser.p_external_declaration_4cCs|jd|j|dƒƒdS)z pp_directive : PPHASH zDirectives not supported yetrN)r3Ú _token_coord)rr}r$r$r%Úp_pp_directive2szCParser.p_pp_directivecCsFt|ƒdkr*tj|d|j|dƒƒ|d<ntjd|j|dƒƒ|d<dS)zg pppragma_directive : PPPRAGMA | PPPRAGMA PPPRAGMASTR érXrr rN)r-rZPragmar†)rr}r$r$r%Úp_pppragma_directive8s zCParser.p_pppragma_directivecCsLtggtjdg|j|dƒdggd}|j||d|d|dd|d<d S) zU function_definition : id_declarator declaration_list_opt compound_statement rEr)r5)rPrQr?rRrXrˆ)rTrBrerfrN)rrrLr†rg)rr}rTr$r$r%Úp_function_definition_1DszCParser.p_function_definition_1cCs.|d}|j||d|d|dd|d<dS)zl function_definition : declaration_specifiers id_declarator declaration_list_opt compound_statement rrXrˆé)rTrBrerfrN)rg)rr}rTr$r$r%Úp_function_definition_2Us zCParser.p_function_definition_2cCs|d|d<dS)a7 statement : labeled_statement | expression_statement | 
compound_statement | selection_statement | iteration_statement | jump_statement | pppragma_directive rrNr$)rr}r$r$r%Ú p_statement`s zCParser.p_statementc Cs¶|d}|ddkr–|d}tjtjtjf}t|ƒdkrzt|d|ƒrztjd|d|d|d|ddd|djd g}qª|j|t ddd gd d }n|j||dd d }||d<dS) z˜ decl_body : declaration_specifiers init_declarator_list_opt | declaration_specifiers_no_type id_init_declarator_list_opt rrXNr?rrPrQrR)r4rKrQr[r?rZrWr5)rBrZT)rTr`ra) rr]r^ÚEnumr-r@r_r5rdr)rr}rTZtyZs_u_or_er`r$r$r%Ú p_decl_bodyts.   zCParser.p_decl_bodycCs|d|d<dS)z& declaration : decl_body SEMI rrNr$)rr}r$r$r%Ú p_declaration°szCParser.p_declarationcCs,t|ƒdkr|dn|d|d|d<dS)zj declaration_list : declaration | declaration_list declaration rXrrN)r-)rr}r$r$r%Úp_declaration_list¹szCParser.p_declaration_listcCs|j|d|ddƒ|d<dS)z] declaration_specifiers_no_type : type_qualifier declaration_specifiers_no_type_opt rXrrPrN)rU)rr}r$r$r%Ú"p_declaration_specifiers_no_type_1Äsz*CParser.p_declaration_specifiers_no_type_1cCs|j|d|ddƒ|d<dS)zf declaration_specifiers_no_type : storage_class_specifier declaration_specifiers_no_type_opt rXrrQrN)rU)rr}r$r$r%Ú"p_declaration_specifiers_no_type_2Ész*CParser.p_declaration_specifiers_no_type_2cCs|j|d|ddƒ|d<dS)za declaration_specifiers_no_type : function_specifier declaration_specifiers_no_type_opt rXrrRrN)rU)rr}r$r$r%Ú"p_declaration_specifiers_no_type_3Îsz*CParser.p_declaration_specifiers_no_type_3cCs"|j|d|dddd|d<dS)zI declaration_specifiers : declaration_specifiers type_qualifier rrXrPT)r+rN)rU)rr}r$r$r%Úp_declaration_specifiers_1Ôsz"CParser.p_declaration_specifiers_1cCs"|j|d|dddd|d<dS)zR declaration_specifiers : declaration_specifiers storage_class_specifier rrXrQT)r+rN)rU)rr}r$r$r%Úp_declaration_specifiers_2Ùsz"CParser.p_declaration_specifiers_2cCs"|j|d|dddd|d<dS)zM declaration_specifiers : declaration_specifiers function_specifier rrXrRT)r+rN)rU)rr}r$r$r%Úp_declaration_specifiers_3Þsz"CParser.p_declaration_specifiers_3cCs"|j|d|dddd|d<dS)zS declaration_specifiers : declaration_specifiers 
type_specifier_no_typeid rrXr?T)r+rN)rU)rr}r$r$r%Úp_declaration_specifiers_4ãsz"CParser.p_declaration_specifiers_4cCs|jd|ddƒ|d<dS)z2 declaration_specifiers : type_specifier Nrr?r)rU)rr}r$r$r%Úp_declaration_specifiers_5èsz"CParser.p_declaration_specifiers_5cCs"|j|d|dddd|d<dS)zQ declaration_specifiers : declaration_specifiers_no_type type_specifier rrXr?T)r+rN)rU)rr}r$r$r%Úp_declaration_specifiers_6ísz"CParser.p_declaration_specifiers_6cCs|d|d<dS)zß storage_class_specifier : AUTO | REGISTER | STATIC | EXTERN | TYPEDEF rrNr$)rr}r$r$r%Úp_storage_class_specifierósz!CParser.p_storage_class_specifiercCs|d|d<dS)z& function_specifier : INLINE rrNr$)rr}r$r$r%Úp_function_specifierüszCParser.p_function_specifiercCs$tj|dg|j|dƒd|d<dS)a+ type_specifier_no_typeid : VOID | _BOOL | CHAR | SHORT | INT | LONG | FLOAT | DOUBLE | _COMPLEX | SIGNED | UNSIGNED | __INT128 r)r5rN)rrLr†)rr}r$r$r%Úp_type_specifier_no_typeidsz"CParser.p_type_specifier_no_typeidcCs|d|d<dS)zÄ type_specifier : typedef_name | enum_specifier | struct_or_union_specifier | type_specifier_no_typeid rrNr$)rr}r$r$r%Úp_type_specifierszCParser.p_type_specifiercCs|d|d<dS)zo type_qualifier : CONST | RESTRICT | VOLATILE rrNr$)rr}r$r$r%Úp_type_qualifierszCParser.p_type_qualifiercCs0t|ƒdkr|d|dgn|dg|d<dS)z„ init_declarator_list : init_declarator | init_declarator_list COMMA init_declarator r‹rrˆrN)r-)rr}r$r$r%Úp_init_declarator_list szCParser.p_init_declarator_listcCs,t|dt|ƒdkr|dndd|d<dS)zb init_declarator : declarator | declarator EQUALS initializer rrXrˆN)rBrZr)rr-)rr}r$r$r%Úp_init_declarator)szCParser.p_init_declaratorcCs0t|ƒdkr|d|dgn|dg|d<dS)z id_init_declarator_list : id_init_declarator | id_init_declarator_list COMMA init_declarator r‹rrˆrN)r-)rr}r$r$r%Úp_id_init_declarator_list/sz!CParser.p_id_init_declarator_listcCs,t|dt|ƒdkr|dndd|d<dS)zn id_init_declarator : id_declarator | id_declarator EQUALS initializer rrXrˆN)rBrZr)rr-)rr}r$r$r%Úp_id_init_declarator5szCParser.p_id_init_declaratorcCs"|j|d|dddd|d<dS)zY 
specifier_qualifier_list : specifier_qualifier_list type_specifier_no_typeid rrXr?T)r+rN)rU)rr}r$r$r%Úp_specifier_qualifier_list_1=sz$CParser.p_specifier_qualifier_list_1cCs"|j|d|dddd|d<dS)zO specifier_qualifier_list : specifier_qualifier_list type_qualifier rrXrPT)r+rN)rU)rr}r$r$r%Úp_specifier_qualifier_list_2Bsz$CParser.p_specifier_qualifier_list_2cCs|jd|ddƒ|d<dS)z4 specifier_qualifier_list : type_specifier Nrr?r)rU)rr}r$r$r%Úp_specifier_qualifier_list_3Gsz$CParser.p_specifier_qualifier_list_3cCs2t|dgggd}|j||dddd|d<dS) zH specifier_qualifier_list : type_qualifier_list type_specifier r)rPrQr?rRrXr?T)r+rN)rrU)rr}rTr$r$r%Úp_specifier_qualifier_list_4Lsz$CParser.p_specifier_qualifier_list_4cCs0|j|dƒ}||dd|j|dƒd|d<dS)z{ struct_or_union_specifier : struct_or_union ID | struct_or_union TYPEID rrXN)r4r`r5r)rir†)rr}Úklassr$r$r%Úp_struct_or_union_specifier_1Us z%CParser.p_struct_or_union_specifier_1cCs0|j|dƒ}|d|d|j|dƒd|d<dS)zd struct_or_union_specifier : struct_or_union brace_open struct_declaration_list brace_close rNrˆrX)r4r`r5r)rir†)rr}r¨r$r$r%Úp_struct_or_union_specifier_2_s z%CParser.p_struct_or_union_specifier_2cCs4|j|dƒ}||d|d|j|dƒd|d<dS)zÙ struct_or_union_specifier : struct_or_union ID brace_open struct_declaration_list brace_close | struct_or_union TYPEID brace_open struct_declaration_list brace_close rrXr‹)r4r`r5rN)rir†)rr}r¨r$r$r%Úp_struct_or_union_specifier_3hs z%CParser.p_struct_or_union_specifier_3cCs|d|d<dS)zF struct_or_union : STRUCT | UNION rrNr$)rr}r$r$r%Úp_struct_or_unionrszCParser.p_struct_or_unioncCs:t|ƒdkr|dpg|d<n|d|dp.g|d<dS)z struct_declaration_list : struct_declaration | struct_declaration_list struct_declaration rXrrN)r-)rr}r$r$r%Úp_struct_declaration_listzs z!CParser.p_struct_declaration_listcCs¬|d}d|dkst‚|ddk r8|j||dd}nht|dƒdkrˆ|dd}t|tjƒrf|}n tj|ƒ}|j|t|d gd}n|j|tddd gd}||d<dS) zW struct_declaration : specifier_qualifier_list struct_declarator_list_opt SEMI rrVrQrXN)rTr`r?r)rB)rBrZ)r.rdr-r@rZNoderLr)rr}rTr`ZnodeZ 
[binary data, not human-readable: remainder of the compiled bytecode of pycparser/c_parser.py (c_parser.cpython-36.pyc). The embedded docstrings carry the CParser grammar productions — struct/enum declarations, declarators and pointers, parameter lists, initializers and designators, statements, and expressions — from p_struct_declaration_1 through p_error.]

pycparser-2.18/pycparser/__pycache__/c_lexer.cpython-36.pyc  [tar member header; compiled bytecode follows]

[binary data, not human-readable: start of the compiled bytecode of pycparser/c_lexer.py — the CLexer class: "A lexer for the C language. After building it, set the input text with input(), and call token() to get new tokens." Its constructor takes error_func, on_lbrace_func, on_rbrace_func, and type_lookup_func callbacks; the public filename attribute can be set to an initial filename, and the lexer updates it upon #line directives.]
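The p_* grammar methods whose names and docstrings survive in the bytecode above follow PLY's yacc convention: the docstring is the BNF rule, the parser passes a sequence `p` whose slot `p[0]` receives the result, and `len(p)` tells the handler which alternative matched. The following is a hypothetical, stdlib-only sketch of that convention (pycparser's real `p_enumerator_list` builds an `EnumeratorList` AST node, not a plain list, and is invoked by `ply.yacc`, not by hand):

```python
def p_enumerator_list(p):
    """ enumerator_list : enumerator
                        | enumerator_list COMMA enumerator
    """
    # PLY chose this function because the rule in the docstring matched;
    # len(p) distinguishes which alternative fired.
    if len(p) == 2:
        p[0] = [p[1]]           # single enumerator starts a new list
    else:
        p[0] = p[1] + [p[3]]    # append the enumerator after the COMMA

# Simulate the reductions ply.yacc would perform for "A, B":
p1 = [None, 'A']
p_enumerator_list(p1)               # enumerator_list : enumerator
p2 = [None, p1[0], ',', 'B']
p_enumerator_list(p2)               # enumerator_list : enumerator_list COMMA enumerator
print(p2[0])  # ['A', 'B']
```

The same pattern repeats across all of CParser's productions, which is why the grammar text is still legible inside the compiled module.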
[binary data, not human-readable: continuation of c_lexer.cpython-36.pyc — CLexer's constructor and helpers (build, reset_lineno, input, token, find_tok_column, _error), the C keyword and token tables, and the t_* token rules for integer/float/char/string constants, operators, punctuation, and the exclusive #line and #pragma lexer states.]

pycparser-2.18/pycparser/__pycache__/lextab.cpython-36.pyc  [tar member header; compiled bytecode follows]

[binary data, not human-readable: compiled bytecode of the PLY-generated lexer table module lextab.py — the master token regexes and per-state tables that ply.lex emits so the lexer specification need not be recompiled on every run.]
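CLexer's type_lookup_func callback, described in its constructor docstring above, addresses the classic C lexing problem: the tokenizer alone cannot tell whether a name is a plain identifier or a typedef'd type, so it consults the parser's symbol table and emits TYPEID instead of ID when the lookup answers True. A hypothetical, stdlib-only illustration of that hook (classify_names and its regex are simplifications, not pycparser's actual rules):

```python
import re

def classify_names(source, type_lookup_func):
    """Return (kind, name) pairs for each word-like token in `source`,
    asking the callback whether each name was declared as a type."""
    return [('TYPEID' if type_lookup_func(w) else 'ID', w)
            for w in re.findall(r'[A-Za-z_]\w*', source)]

# Names the parser would have registered via earlier typedef declarations.
typedefs = {'size_t', 'uint32_t'}
tokens = classify_names('size_t n', typedefs.__contains__)
print(tokens)  # [('TYPEID', 'size_t'), ('ID', 'n')]
```

Because scopes can shadow type names, the real callback is scope-aware — hence the on_lbrace_func/on_rbrace_func callbacks, which let the parser push and pop its symbol table as braces are lexed.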
pycparser-2.18/pycparser/__pycache__/c_ast.cpython-36.pyc  [tar member header; compiled bytecode follows]

[binary data, not human-readable, truncated: compiled bytecode of pycparser/c_ast.py — the Node base class (children() and the show() pretty-printer with its offset, attrnames, nodenames, and showcoord options), the NodeVisitor base class (visit() dispatches to a visit_XXX method named after the node's class, falling back to generic_visit(), which visits the node's children in preorder), and the generated AST node classes from ArrayDecl through StructRef, where this archive chunk ends.]
r8|jd|jfƒt|ƒS)Nr5r|)r5r1r|r2)rr3rrrrœs   zStructRef.children)r5r,r|rr/)N)r,)rrrrr0rrrrrrr{”s r{c@s&eZdZd Zd dd„Zdd „ZfZdS) ÚSwitchrVrWrr/NcCs||_||_||_dS)N)rVrWr)rrVrWrrrrr0¦szSwitch.__init__cCs@g}|jdk r|jd|jfƒ|jdk r8|jd|jfƒt|ƒS)NrVrW)rVr1rWr2)rr3rrrr«s   zSwitch.children)rVrWrr/)N)rrrrr0rrrrrrr}¤s r}c@s&eZdZd Zd dd„Zd d „ZfZdS) Ú TernaryOprVrqrrrr/NcCs||_||_||_||_dS)N)rVrqrrr)rrVrqrrrrrrr0µszTernaryOp.__init__cCsZg}|jdk r|jd|jfƒ|jdk r8|jd|jfƒ|jdk rR|jd|jfƒt|ƒS)NrVrqrr)rVr1rqrrr2)rr3rrrr»s   zTernaryOp.children)rVrqrrrr/)N)rrrrr0rrrrrrr~³s r~c@s&eZdZd Zd dd„Zd d „Zd ZdS)ÚTypeDeclÚdeclnamerNr,rr/NcCs||_||_||_||_dS)N)r€rNr,r)rr€rNr,rrrrr0ÆszTypeDecl.__init__cCs&g}|jdk r|jd|jfƒt|ƒS)Nr,)r,r1r2)rr3rrrrÌs zTypeDecl.children)r€rNr,rr/)N)r€rN)rrrrr0rrrrrrrÄs rc@s&eZdZd Zd dd „Zd d „ZdZdS)ÚTypedefr5rNrOr,rr/NcCs"||_||_||_||_||_dS)N)r5rNrOr,r)rr5rNrOr,rrrrr0Õs zTypedef.__init__cCs&g}|jdk r|jd|jfƒt|ƒS)Nr,)r,r1r2)rr3rrrrÜs zTypedef.children)r5rNrOr,rr/)N)r5rNrO)rrrrr0rrrrrrrÓs rc@s&eZdZd Zd dd„Zd d „Zd ZdS)ÚTypenamer5rNr,rr/NcCs||_||_||_||_dS)N)r5rNr,r)rr5rNr,rrrrr0åszTypename.__init__cCs&g}|jdk r|jd|jfƒt|ƒS)Nr,)r,r1r2)rr3rrrrës zTypename.children)r5rNr,rr/)N)r5rN)rrrrr0rrrrrrr‚ãs r‚c@s&eZdZd Zd dd„Zdd „Zd ZdS) ÚUnaryOpr8r@rr/NcCs||_||_||_dS)N)r8r@r)rr8r@rrrrr0ôszUnaryOp.__init__cCs&g}|jdk r|jd|jfƒt|ƒS)Nr@)r@r1r2)rr3rrrrùs zUnaryOp.children)r8r@rr/)N)r8)rrrrr0rrrrrrrƒòs rƒc@s&eZdZd Zd dd„Zdd „Zd ZdS) ÚUnionr5rSrr/NcCs||_||_||_dS)N)r5rSr)rr5rSrrrrr0szUnion.__init__cCs:g}x,t|jpgƒD]\}}|jd||fƒqWt|ƒS)Nz decls[%d])rBrSr1r2)rr3rCrrrrrszUnion.children)r5rSrr/)N)r5)rrrrr0rrrrrrr„s r„c@s&eZdZd Zd dd„Zdd „ZfZdS) ÚWhilerVrWrr/NcCs||_||_||_dS)N)rVrWr)rrVrWrrrrr0szWhile.__init__cCs@g}|jdk r|jd|jfƒ|jdk r8|jd|jfƒt|ƒS)NrVrW)rVr1rWr2)rr3rrrrs   zWhile.children)rVrWrr/)N)rrrrr0rrrrrrr…s r…c@s&eZdZd Zd dd„Zdd„Zd ZdS) ÚPragmaÚstringrr/NcCs||_||_dS)N)r‡r)rr‡rrrrr0 szPragma.__init__cCs g}t|ƒS)N)r2)rr3rrrr$szPragma.children)r‡rr/)N)r‡)rrrrr0rrrrrrr†s r†)3r 
Úobjectrr"r+r4r7r;r>r?rDrFrHrJrLrMrRrTrUrXrYrZr\r]r_rarcrergrhrlrmrnrprsrtrurvrxryrzr{r}r~rrr‚rƒr„r…r†rrrrÚsb<0         pycparser-2.18/pycparser/__pycache__/ast_transforms.cpython-36.pyc0000664000175000017500000000466113127011713026112 0ustar elibeneliben000000000000003 ö”Xö ã@s ddlmZdd„Zdd„ZdS)é)Úc_astcCsªt|tjƒst‚t|jtjƒs"|Stjg|jjƒ}d}xh|jjD]\}t|tjtj fƒrz|jj |ƒt ||jƒ|jd}q@|dkr|jj |ƒq@|j j |ƒq@W||_|S)aÜ The 'case' statements in a 'switch' come out of parsing with one child node, so subsequent statements are just tucked to the parent Compound. Additionally, consecutive (fall-through) case statements come out messy. This is a peculiarity of the C grammar. The following: switch (myvar) { case 10: k = 10; p = k + 1; return 10; case 20: case 30: return 20; default: break; } Creates this tree (pseudo-dump): Switch ID: myvar Compound: Case 10: k = 10 p = k + 1 return 10 Case 20: Case 30: return 20 Default: break The goal of this transform is to fix this mess, turning it into the following: Switch ID: myvar Compound: Case 10: k = 10 p = k + 1 return 10 Case 20: Case 30: return 20 Default: break A fixed AST node is returned. The argument may be modified. Nréÿÿÿÿ) Ú isinstancerÚSwitchÚAssertionErrorÚstmtÚCompoundÚcoordÚ block_itemsÚCaseÚDefaultÚappendÚ_extract_nested_caseÚstmts)Z switch_nodeZ new_compoundZ last_caseÚchild©rú../pycparser/ast_transforms.pyÚfix_switch_cases s3   rcCs:t|jdtjtjfƒr6|j|jjƒƒt|d|ƒdS)z€ Recursively extract consecutive Case statements that are made nested by the parser and add them to the stmts_list. 
érNr)rrrr r r Úpopr)Z case_nodeZ stmts_listrrrrbsrN)ÚrrrrrrrÚ s Upycparser-2.18/pycparser/__pycache__/yacctab.cpython-36.pyc0000664000175000017500000041712213127011713024453 0ustar elibeneliben000000000000003 ²\Yë‚ã›@søedZdZdZddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–dd™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ dd¢ d£ dJ d7 d¤ d¥ ddf d¦ d§ dd5 d¨ dd dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d dd. dd¯ ddd4 d° d± dX dd² d³ d´ dµ d¶ ddC d· dE d¸ d¹ dº d: d ddd; ddB dddV d» d¼ dddd½ ddd! d ddd> d¾ d= d? dd@ d¿ dÀ dÁ d dà dddW dÄ dÅ dÆ dT dU dÇ d< ddÈ dddd ddddd dÉ ddÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ dd d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fddddd d d d d ddddddddddÈdd d"d#d%d'd(d*d,d-d/d4d:d;dd?d¾dÁd@dÚdAdBd®dµdDdEdFdKdLdšdSdYd¼d»d²dÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèd\d]d^dédêd¿dÍdëdmdndpdqdrdsdìdídîdïdðdñdvdzd{dòdódôd…d†dˆd‰dõdöd÷dødùdúdûdüdýdþdÿdŠd‹dŒdddddddddd d d d d ddddddddddddddddddg’dH dš dL d› dœ d dž dŸ dF d  dD d¢ d£ dJ d¤ d¥ df d¦ d™ d§ d¨ d dK dG dª d« d¬ dI dM d­ d d° d± dX d² d³ d! dà d´ d" dµ d¶ d#d" dC d· dE d dËd$ dV d½ dá d#d{ d% d& d' d( d) dñ d* d%d+ d, d- dz d. d/ d0 d! d dËdËd1d( d2d$ dW dÄ dÆ dT dU dÇ d3 d4d5 d6 d7 du dÈ d dËdËd1d2dÎ dÏ dÑ dÒ d' d8 dç d4dî d9 d: d; d< d= dËdËd d d> d] d? dé d# d& d@ d4dè dA dB d4d- d, d^ dC dê d% d$ dD dEdw dF d+ dv d4dï d* d) dx dð dG g’fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dÝd! d™ dÝd½ d™ dÝdÝd` dÝd^ dÝd] dÝd\ dÝdÝd’ d_ dÝdÝdÝd> d¾ d= dÝd? dÝdÝd@ d¿ dÀ dÁ d dÝdà d\ dÆ dÝdÝdÝd! 
d™ d™ dU d“ dj d” di d• dV dÝd– dk d— dë dÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝdÝd™ d\ dÝdd dÉ dÝdÊ dÝdÝdÝdË dÝdÌ dÍ dÝdÝdÝd\ dÝdÝd dÝdÝd dÝdÝd! d™ d\ dÓ dÔ dÝd` dÝdÕ d™ d˜ dÝdÝd™ dÝd\ dÝdÝdÝdÝd_ dÝdÝdÝdÝd dÝdb dÖ da dÝd™ dÝdÝdÝd× dØ dÝdÙ dc gfdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dÞd! d™ dÞd½ d™ dÞdÞd` dÞd^ dÞd] dÞd\ dÞdÞd’ d_ dÞdÞdÞd> d¾ d= dÞd? dÞdÞd@ d¿ dÀ dÁ d dÞdà d\ dÆ dÞdÞdÞd! d™ d™ dU d“ dj d” di d• dV dÞd– dk d— dë dÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞdÞd™ d\ dÞdd dÉ dÞdÊ dÞdÞdÞdË dÞdÌ dÍ dÞdÞdÞd\ dÞdÞd dÞdÞd dÞdÞd! d™ d\ dÓ dÔ dÞd` dÞdÕ d™ d˜ dÞdÞd™ dÞd\ dÞdÞdÞdÞd_ dÞdÞdÞdÞd dÞdb dÖ da dÞd™ dÞdÞdÞd× dØ dÞdÙ dc gfdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dšdOdYd#dHdId²dÇdÒdÑdÛdÜdÝdÞdßdJdKdàdád½dLdMdNdâdOdãd[dPdädQdådæd˜dçdèdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdëdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dîdud7dïdðdñd%d=dxdGd™dyd$d)d/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdùdúdûdüdýdþdxdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«dydzd{d|ddŽd}dd~dddd¬d d d€dd­d d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”ddEdddŽdddd•d–d‘d—d˜gÚdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dKd! d™ d$ dKd½ d™ dKdKd{ d` d[ d% d& d' d( d) dñ dKd^ d* d® do dKd] dKd+ d\ d, dKdKd- d’ dz d. d-d/ d0 d_ dKdKdKd> d¾ d= dKd? dKdKd@ d¿ dÀ dÁ d dKdà d\ d$ dÆ dKdKdKd! d™ d™ dU d“ dj d” di d• dV dKd– dk d— dë d5 dKd¯ d6 d7 du dKdKdKd[ dí dKdt d° dKdKdKdKdKdKdKdKdKdKdKdKdKdKdKdKdKdKdKd™ d\ dKdd dÉ dKdÊ dKdKdKdË dKdÌ dÍ dKdKdKd\ dKdKd dKdKd dî d9 d: d; d< d= dKd-d-d-d± d-d-d-d² d-d-dq dp d-d-d-d-d-dW dKd! 
d™ d\ dÓ dÔ dKd` dKdÕ d™ d˜ d³ dA dB dKdKd´ d™ dKd\ dKdKdKdKd_ dKdKdKdKd dKdb dÖ da dv dKd™ dï dKdKdð dKd× dØ dKdÙ dc gÚfddddd d d d d ddddddddddÈdd d"d#d%d'd(d*d,d-d/dµd3d4d:dd?d¾dÁd@dÚdAdBd®dµdDdEdFdKd£d¥d§dªdLdŸdšd¨d¶dSdYd¼dÅd»dZd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèd@d\d]d^d?dCdédêd>dÀd¿dÂdÍdÌd“dmdndpdqdrdsdìd´d®dîd7dïdðdñd=dvdGd™d$dzd0d+d{d)dòd·d¸d¹dºdódôd|d}d»d¼d½d¾d…d†dˆd‰dõdöd¿dÀdùdúdûdüdÁdýdÂdÃdÄdþdÅdÿdŠdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d‹dŒdddÆdÇdddÈdÉdÊdËdÌddd¬d d d­d d ddddddÍdddÎddddÏdddddŽdÐdddÑdgÚdH dš dL d› dœ d dž dŸ dF d  dD d¢ d£ dJ d¤ d¥ df d¦ d™ d§ d¨ d dK dG dª d« d¬ dI dM d­ dKd™ d. d° dX d² d³ d! dà d´ d" dµ d¶ d/ d" dC d· dE d d\d d]dM d™ d d$ dÒ dã dV d½ dá dìd0 d™ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dm d/ d0 dg d! d d™ dO d dS d1 dN d d( dâ d\ d dÓ dW dÄ dÆ dT dU dÇ d3 dõdöd5 d¯ d6 d7 du dýdÈ d[ dí dt d dþdxd™ d° dS d d ddd2 dT d™ d™ d[ dL dÔ dÕ dÎ dÏ dÑ dÒ d' d8 dh d¬dî d9 d: d; d™ d< d ddÖd= dR dS d™ d× dZ dØ d± dÙ ds dr d² dÚ dÛ dq dp dÜ dY dÝ dX dì dW d d d> d] d d d? dé dde d‡dˆdŠd# d& d³ dA dB d´ d- d, d^ dC dê d% d$ dy ddw dn dF d+ d™ dÞdv dï d* d) d™ ddx dð d‘dG gÚfddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–dd™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ dd¢ d£ dJ d7 d¤ d¥ ddf d¦ d§ dd5 d¨ dd dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d dd. dd¯ ddd4 d° d± dX dd² d³ d´ dµ d¶ ddC d· dE d¸ d¹ dº d: d ddd; ddB dddV d» d¼ dddd½ ddd! d ddd> d¾ d= d? 
dd@ d¿ dÀ dÁ d dà dddW dÄ dÅ dÆ dT dU dÇ d< ddÈ dddd ddddd dÉ ddÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ dd d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dšdOdYd#dHdId²dÇdÒdÑdÛdÜdÝdÞdßdJdKdàdád½dLdMdNdâdOdãd[dPdädQdådæd˜dçdèdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdëdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dîdud7dïdðdñd%d=dxdGd™dyd$d)d/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdùdúdûdüdýdþdxdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«dydzd{d|ddŽd}dd~dddd¬d d d€dd­d d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”ddEdddŽdddd•d–d‘d—d˜gÚdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dMd! d™ d$ dMd½ d™ dMdMd{ d` d[ d% d& d' d( d) dñ dMd^ d* d® do dMd] dMd+ d\ d, dMdMd- d’ dz d. d±d/ d0 d_ dMdMdMd> d¾ d= dMd? dMdMd@ d¿ dÀ dÁ d dMdà d\ d$ dÆ dMdMdMd! d™ d™ dU d“ dj d” di d• dV dMd– dk d— dë d5 dMd¯ d6 d7 du dMdMdMd[ dí dMdt d° dMdMdMdMdMdMdMdMdMdMdMdMdMdMdMdMdMdMdMd™ d\ dMdd dÉ dMdÊ dMdMdMdË dMdÌ dÍ dMdMdMd\ dMdMd dMdMd dî d9 d: d; d< d= dMd±d±d±d± d±d±d±d² d±d±dq dp d±d±d±d±d±dW dMd! d™ d\ dÓ dÔ dMd` dMdÕ d™ d˜ d³ dA dB dMdMd´ d™ dMd\ dMdMdMdMd_ dMdMdMdMd dMdb dÖ da dv dMd™ dï dMdMdð dMd× dØ dMdÙ dc gÚfd_gd¼gfdšd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d¬d d d­dddg>d$ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dÆd/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d× dÆdØ d± dÙ dÆdr d² dÚ dÆdq dp dÜ dÆdÆdÆdÆdW d³ dA dB d´ dv dï dð g>fd.d;dGdJdOd`dadbdcdddedfdgdhdidUdjdpd~dddld‚dƒd„doddŽdd~dd‡d‘dˆdŠd’d“d”ddd•d–d‘d—d˜g-d9 d± d¸ d: dØd> d¾ d= d? dØd@ d¿ dÀ dÁ d dØdà dÆ dd dÉ dÊ dØdË dÌ dÍ dØdÓ dÔ d` dØdÕ dØd_ dØdØdb dÖ da dØdØd× dØ dØdÙ dc g-fdddd d d d dddddddddd d!d"d$d&d)d+d-d.d/d0d1d2d3d4d6d7d8d9d;d=d@dCdGdHdIdJdKdMdNdOdPdQdRdTdUdVdWdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldodpdtdudvdwdxdydzd|d}d~dd€dd‚dƒd„d‡dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜gld!d™ dš dœ d dž dF d6 d3 d¡ d!d¢ d7 d¤ d!d5 d¨ d!d d© d8 d™ d™ dM d9 d­ d® d d d!d. d¯ d!d!d4 d± d!d´ d!d¸ d¹ dº d: d d!d; d!dB d!d!d» d¼ d!d!d½ d!d!d! 
d d!d!d> d¾ d= d? d!d@ d¿ dÀ dÁ d dà d!d!dÅ dÆ d< d!dÈ d!d!d!d d!d!dd dÉ d!dÊ dË dÌ dÍ dÐ d!d d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc glfd;dšd²dÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdídîdïdðdñd÷dødùdúdûdüdýdþdddd d d dddddddddg0d± d$ d{ d% d& d' d( d) dñ d* d*d+ d, d- dz d. d/ d0 d$ dÆ dßd5 d6 d7 du dç dßdî d9 d: d; d< d= d@ dßdè dA dB dßdD dàdw dv dßdï dx dð dG g0fdšd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d¬d d d­dddg>d$ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dcd/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d× dcdØ d± dÙ dcdr d² dÚ dcdq dp dÜ dcdcdcdcdW d³ dA dB d´ dv dï dð g>fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dåd! d™ dåd½ d™ dådåd` dåd^ dåd] dåd\ dådåd’ d_ dådådåd> d¾ d= dåd? dådåd@ d¿ dÀ dÁ d dådà d\ dÆ dådådåd! d™ d™ dU d“ dj d” di d• dV dåd– dk d— dë dådådådådådådådådådådådådådådådådådådådådådådådåd™ d\ dådd dÉ dådÊ dådådådË dådÌ dÍ dådådåd\ dådåd dådåd dådåd! d™ d\ dÓ dÔ dåd` dådÕ d™ d˜ dådåd™ dåd\ dådådådåd_ dådådådåd dådb dÖ da dåd™ dådådåd× dØ dådÙ dc gfdšd²dÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdîdïdðdñdùdúdûdüdýdþd d dddg"d$ d{ d% d& d' d( d) dñ d* d.d+ d, d- dz d. d/ d0 d$ dÆ d5 d6 d7 du dî d9 d: d; d< d= dA dB dv dï dð g"fddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–d%d™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ d%d¢ d£ dJ d7 d¤ d¥ d%df d¦ d§ d%d5 d¨ d%d dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d d%d. 
d%d¯ d%d%d4 d° d± dX d%d² d³ d´ dµ d¶ d%dC d· dE d¸ d¹ dº d: d d%d%d; d%dB d%d%dV d» d¼ d%d%d%d½ d%d%d! d d%d%d> d¾ d= d? d%d@ d¿ dÀ dÁ d dà d%d%dW dÄ dÅ dÆ dT dU dÇ d< d%dÈ d%d%d%d d%d%d%dd dÉ d%dÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ d%d d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dçd! d™ dçd½ d™ dçdçd` dçd^ dçd] dçd\ dçdçd’ d_ dçdçdçd> d¾ d= dçd? dçdçd@ d¿ dÀ dÁ d dçdà d\ dÆ dçdçdçd! d™ d™ dU d“ dj d” di d• dV dçd– dk d— dë dçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçdçd™ d\ dçdd dÉ dçdÊ dçdçdçdË dçdÌ dÍ dçdçdçd\ dçdçd dçdçd dçdçd! d™ d\ dÓ dÔ dçd` dçdÕ d™ d˜ dçdçd™ dçd\ dçdçdçdçd_ dçdçdçdçd dçdb dÖ da dçd™ dçdçdçd× dØ dçdÙ dc gfddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–d(d™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ d(d¢ d£ dJ d7 d¤ d¥ d(df d¦ d§ d(d5 d¨ d(d dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d d(d. d(d¯ d(d(d4 d° d± dX d(d² d³ d´ dµ d¶ d(dC d· dE d¸ d¹ dº d: d d(d(d; d(dB d(d(dV d» d¼ d(d(d(d½ d(d(d! d d(d(d> d¾ d= d? d(d@ d¿ dÀ dÁ d dà d(d(dW dÄ dÅ dÆ dT dU dÇ d< d(dÈ d(d(d(d d(d(d(dd dÉ d(dÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ d(d d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fdšd²dÒdÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþd¬d d d­dddg*d$ d{ d]d% d& d' d( d) dñ d* d® d+ d, d- dz d. 
d/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d³ dA dB d´ dv dï dð g*fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dèd! d™ dèd½ d™ dèdèd` dèd^ dèd] dèd\ dèdèd’ d_ dèdèdèd> d¾ d= dèd? dèdèd@ d¿ dÀ dÁ d dèdà d\ dÆ dèdèdèd! d™ d™ dU d“ dj d” di d• dV dèd– dk d— dë dèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèdèd™ d\ dèdd dÉ dèdÊ dèdèdèdË dèdÌ dÍ dèdèdèd\ dèdèd dèdèd dèdèd! d™ d\ dÓ dÔ dèd` dèdÕ d™ d˜ dèdèd™ dèd\ dèdèdèdèd_ dèdèdèdèd dèdb dÖ da dèd™ dèdèdèd× dØ dèdÙ dc gfdšd²dÒdÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþd¬d d d­dddg*d$ d{ d:d% d& d' d( d) dñ d* d® d+ d, d- dz d. d/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d³ dA dB d´ dv dï dð g*fdšd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d¬d d d­dddg>d$ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dhd/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d× dhdØ d± dÙ ds dr d² dÚ dÛ dq dp dÜ dhdÝ dX dì dW d³ dA dB d´ dv dï dð g>fddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–dd™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ dd¢ d£ dJ d7 d¤ d¥ ddf d¦ d§ dd5 d¨ dd dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d dd. dd¯ ddd4 d° d± dX dd² d³ d´ dµ d¶ ddC d· dE d¸ d¹ dº d: d ddd; ddB dddV d» d¼ dddd½ ddd! d ddd> d¾ d= d? 
dd@ d¿ dÀ dÁ d dà dddW dÄ dÅ dÆ dT dU dÇ d< ddÈ dddd ddddd dÉ ddÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ dd d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fd.d;dGdJdOd`dadbdcdddedfdgdhdidUdjdpd~dddld‚dƒd„doddŽdd~dd‡d‘dˆdŠd’d“d”ddd•d–d‘d—d˜g-d9 d± d¸ d: d×d> d¾ d= d? d×d@ d¿ dÀ dÁ d d×dà dÆ dd dÉ dÊ d×dË dÌ dÍ d×dÓ dÔ d` d×dÕ d×d_ d×d×db dÖ da d×d×d× dØ d×dÙ dc g-fdšd²dÒdÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþd¬d d d­dddg*d$ d{ d9d% d& d' d( d) dñ d* d® d+ d, d- dz d. d/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d³ dA dB d´ dv dï dð g*fddddddd d d d d ddddddddddddddddÈdddd d"d#d$d%d&d'd(d)d*d+d,d-d.d/d d3d4d5d6d8d9d:d;dd?d¾d@dAdBdCdDdEdFdGdJdKd¦d¬dLdNdOdPdQdRdSdTdUdVdWdXdYd#dZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdZd[dtdudvdwdxdydzd{d1d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒdzd{ddŽddd‘d’d“d”d•d–d—d˜g™d d dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ d d¢ d£ dJ d7 d¤ d¥ d df d¦ d d§ d d5 d¨ d dK d© dG d8 dª d« d d¬ d dI dM d9 d­ d d d. d d¯ d d4 d° d± dX d d² d³ d d´ dµ d¶ d dC d· dE d¸ d: d d d d d; d dB d d dV d» d¼ d d d d½ d d d d! d d d d> d¾ d= d? d d@ d¿ dÀ dÁ d dà d d dW dÄ dÅ dÆ dT dU dÇ d d d< d dÈ d d d d d d d d dd dÉ d dÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ d d d d d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g™fddddddd d d d d ddddddddddddddddddd d"d#d$d%d&d'd(d)d*d+d,d-d.d/d d3d4d5d6d8d9d:d;dd?d@dAdBdCdDdEdFdGdJdKd¦dLdNdOdPdYd#dZd\d]d^d_d`dadbdcdddedfdgdhdidjdndpdsdZdtdvdzd1d|d}d~dd€dd‚dƒd„d…d†dˆd‰dŠd‹dŒdzddŽddd‘d’d“d”d•d–d—d˜g~d d dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ d d¢ d£ dJ d7 d¤ d¥ d df d¦ d§ d d5 d¨ d dK d© dG d8 dª d« d d¬ d dI dM d9 d­ d¬d d. d d¯ d d4 d° d± dX d² d³ d´ dµ d¶ d dC d· dE d¸ d: d dSd d; d dB d½ d[d d! d d d d> d¾ d= d? 
d d@ d¿ dÀ dÁ d dà dÄ dÆ dÇ dsd< dÈ d d{d d dd dÉ d dÊ dË dÌ dÍ dÎ dÏ dÑ dÒ d d d d„dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g~fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜gdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dJd! d™ dJd½ d™ dJdJd` dJd^ dJd] dJd\ dJdJd’ d_ dJdJdJd> d¾ d= dJd? dJdJd@ d¿ dÀ dÁ d dJdà d\ dÆ dJdJdJd! d™ d™ dU d“ dj d” di d• dV dJd– dk d— dë dJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJdJd™ d\ dJdd dÉ dJdÊ dJdJdJdË dJdÌ dÍ dJdJdJd\ dJdJd dJdJd dJdJd! d™ d\ dÓ dÔ dJd` dJdÕ d™ d˜ dJdJd™ dJd\ dJdJdJdJd_ dJdJdJdJd dJdb dÖ da dJd™ dJdJdJd× dØ dJdÙ dc gfddddddd d d d d ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd€dd‚dƒd„d…d†d‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g–dd™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ dd¢ d£ dJ d7 d¤ d¥ ddf d¦ d§ dd5 d¨ dd dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d dd. dd¯ ddd4 d° d± dX dd² d³ d´ dµ d¶ ddC d· dE d¸ d¹ dº d: d ddd; ddB dddV d» d¼ dddd½ ddd! d ddd> d¾ d= d? dd@ d¿ dÀ dÁ d dà dddW dÄ dÅ dÆ dT dU dÇ d< ddÈ dddd ddddd dÉ ddÊ dË dÌ dÍ dÎ dÏ dÐ dÑ dÒ dd d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc g–fdddd d d d dddddddddd d!d"d$d&d)d+d-d.d/d0d1d2d3d4d6d7d8d9d;d=d@dCdGdHdIdJdKdMdNdOdPdQdRdTdUdVdWdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldodpdtdudvdwdxdydzd|d}d~dd€dd‚dƒd„d‡dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜gldád™ dš dœ d dž dF d6 d3 d¡ dád¢ d7 d¤ dád5 d¨ dád d© d8 d™ d™ dM d9 d­ d® d d dád. d¯ dádád4 d± dád´ dád¸ d¹ dº d: d dád; dádB dádád» d¼ dádád½ dádád! d dádád> d¾ d= d? 
dád@ d¿ dÀ dÁ d dà dádádÅ dÆ d< dádÈ dádádád dádádd dÉ dádÊ dË dÌ dÍ dÐ dád d dÓ dÔ d` dÕ d_ db dÖ da d× dØ dÙ dc glfddddd d dddddddd d"d#d%d'd(d*d,d-d4d:dd?d®dºdµdKdšdâdSdXd»d²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèd@d\d]dÊd“dëdmdndpdqdrdÙdsdìdîd7dïdðdñdvdGd™d$dzd)dãdäd…d†dådˆd‰dõdöd¿dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dædªd«d‹dŒdÉddd¬d d d­dddÎdddg‚dH dš dL d› d dŸ d  d£ dJ d¥ df d¦ d§ d¨ d dK dG dª d« d¬ dI dM d. d° dX d² d³ d/ dK d" d d$ dJ dV dWd0 d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dm d/ d0 dg d! d dldÓ dodW dÄ dÆ dT dU dqdÇ d3 d5 d¯ d6 d7 du dÈ d[ dí dt d d° d~dl dÎ dÏ dWdÑ dÒ d' d8 dh dî d9 d: d; d< d= d× dZ dØ d± dÙ ds dr d² dÚ dÛ dq dp dÜ dY dÝ dX d‚dì dW d d de d# d& d³ dA dB d´ d% d$ dn dv dï dð g‚fdddddddd$d&d8dd.d9dGdJdNdpdtd„gd™ d6 d3 d¡ d7 dµ d5 d© d8 dd d9 d4 d¸ d: d; dÆ d< dÍ gfdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dOdYd#dHdIdÇdÑdÜdJdKdLdMdNdOd[dPdQdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dîdud%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdxdydzd{d|ddŽd}dd~dddd€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”dEddŽddd•d–d‘d—d˜g dš d¨ dM d9 d™ d± d! d´ d" d¸ d: dÜd! d™ dÜd½ d™ dÜdÜd` dîd' dÜd^ dÜd] dÜd\ dÜdÜd’ d_ dÜdÜdÜd> d¾ d= dÜd? dÜdÜd@ d¿ dÀ dÁ d dÜdà d\ dÆ dÜdÜdÜd! d™ d™ dU d“ dj d” di d• dV dÜd– dk d— dë d5 dÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜdÜd™ d\ dÜdd dÉ dÜdÊ dÜdÜdÜdË dÜdÌ dÍ dÜdÜdÜd\ dÜdÜd dÜdÜd dÜdÜd! d™ d\ dÓ dÔ dÜd` dÜdÕ d™ d˜ dÜdÜd™ dÜd\ dÜdÜdÜdÜd_ dÜdÜdÜdÜd dÜdb dÖ da dÜd™ dÜdÜdÜd× dØ dÜdÙ dc g fdšd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d¬d d d­dddg>d$ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. dad/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= dadadadadadadadadadadq dp dadadadadadW d³ dA dB d´ dv dï dð g>fd.d;dGdJdOd`dadbdcdddedfdgdhdidUdjdpd~dddld‚dƒd„doddŽdd~dd‡d‘dˆdŠd’d“d”ddd•d–d‘d—d˜g-d9 d± d¸ d: dÙd> d¾ d= d? 
dÙd@ d¿ dÀ dÁ d dÙdà dÆ dd dÉ dÊ dÙdË dÌ dÍ dÙdÓ dÔ d` dÙdÕ dÙd_ dÙdÙdb dÖ da dÙdÙd× dØ dÙdÙ dc g-fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dšdOdYd#dHdId²dÇdÑdÛdÜdÝdÞdßdJdKdàdádLdMdNdâdOdãd[dPdädQdådædçdèdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdëdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dîdudïdðdñd%d=dxdyd/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdùdúdûdüdýdþdxdydzd{d|ddŽd}dd~dddd d d€dd d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”ddEdddŽdddd•d–d‘d—d˜g¾dš d¨ dM d9 d™ d± d! d´ d" d¸ d: dNd! d™ d$ dNd½ d™ dNdNd{ d` d% d& d' d( d) dñ dNd^ d* dñdNd] dNd+ d\ d, dNdNd- d’ dz d. d/ d0 d_ dNdNdNd> d¾ d= dNd? dNdNd@ d¿ dÀ dÁ d dNdà d\ d$ dÆ dNdNdNd! d™ d™ dU d“ dj d” di d• dV dNd– dk d— dë d5 dNd6 d7 du dNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNdNd™ d\ dNdd dÉ dNdÊ dNdNdNdË dNdÌ dÍ dNdNdNd\ dNdNd dNdNd dî d9 d: d; d< d= dNdNd! d™ d\ dÓ dÔ dNd` dNdÕ d™ d˜ dA dB dNdNd™ dNd\ dNdNdNdNd_ dNdNdNdNd dNdb dÖ da dv dNd™ dï dNdNdð dNd× dØ dNdÙ dc g¾fd d"d4d®dºdCdµd·dœdKdšdâdçd»d²dÒdÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèd\d]dëdpdìdîd7dïdðdñdvdGd™d$dzd)dõdöd÷dødùdúdûdüdýdþd‹dŒddddd¬d d d­ddddddgId d d. d/ dK dJ d" dHdId d$ dJ dXd0 d{ d_d% d& d' d( d) dñ d* d® d+ d, d- dz d. d/ d0 d! d d$ dÆ d3 d5 d¯ d6 d7 du dÈ d[ dí dt d d° d' d8 dç ddî d9 d: d; d< d= d d d# d& d@ dè d³ dA dB d´ d% d$ dD dv dï dð gIfd.dJd`dadbdcdedhdjdpd~dddƒd„ddŽddd‘d’d“d”d•d–d—d˜gd9 d: d> d¾ d= d? d@ dÁ dà dÆ dd dÉ dÊ dÌ dÍ dÓ dÔ d` dÕ d_ db dÖ dd× dØ dÙ dc gfdšd²dÒdÑdÛdÜdÝdÞdßdàdádâdãdädådædçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþd¬d d d­dddg*d$ d{ d<d% d& d' d( d) dñ d* d® d+ d, d- dz d. d/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d³ dA dB d´ dv dï dð g*fdšd²dÒdÑdÛdÜdÝdÞdßdàdád½dâdãdädådæd˜dçdèdëdpdîd7dïdðdñdGd™d$d)dùdúdûdüdýdþdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«d¬d d d­dddg>d$ d{ d[ d% d& d' d( d) dñ d* d® do d+ d, d- dz d. 
ded/ d0 d$ dÆ d5 d¯ d6 d7 du d[ dí dt d° dî d9 d: d; d< d= d× dedØ d± dÙ ds dr d² dÚ dedq dp dÜ dedÝ dededW d³ dA dB d´ dv dï dð g>fdd d-d.d d;d¾d@dÚdGdJd«d¦d¬dšdOdYd#dHdId²dÇdÒdÑdÛdÜdÝdÞdßdJdKdàdád½dLdMdNdâdOdãd[dPdädQdådæd˜dçdèdRdSdTdËd`dadbd×dcdÉdddedfdgdhdidUdjdVdëdpdWdXdYdZd[díd\d]d^d_d`d,dAdBd<d;d:d9dîdud7dïdðdñd%d=dxdGd™dyd$d)d/d&d(d-d6d'd5d±dÆd™dadbdcdddedfdgdhdid1djd2d~dd€ddkdldmd‚dndƒd„dodpdqdrdsdtdud4dvdwdùdúdûdüdýdþdxdšd›dœddždŸd d¡d¢d£d¤d¥d¦d§d¨d©dªd«dydzd{d|ddŽd}dd~dddd¬d d d€dd­d d‚dƒd„d…d†d‡d‘dˆd‰dŠd‹dŒdd’d“d”ddEdddŽdddd•d–d‘d—d˜gÚdš d¨ dM d9 d™ d± d! d´ d" d¸ d: dQd! d™ d$ dQd½ d™ dQdQd{ d` d[ d% d& d' d( d) dñ dQd^ d* d® do dQd] dQd+ d\ d, dQdQd- d’ dz d. dfd/ d0 d_ dQdQdQd> d¾ d= dQd? dQdQd@ d¿ dÀ dÁ d dQdà d\ d$ dÆ dQdQdQd! d™ d™ dU d“ dj d” di d• dV dQd– dk d— dë d5 dQd¯ d6 d7 du dQdQdQd[ dí dQdt d° dQdQdQdQdQdQdQdQdQdQdQdQdQdQdQdQdQdQdQd™ d\ dQdd dÉ dQdÊ dQdQdQdË dQdÌ dÍ dQdQdQd\ dQdQd dQdQd dî d9 d: d; d< d= dQd× dfdØ d± dÙ ds dr d² dÚ dfdq dp dÜ dfdÝ dX dfdW dQd! d™ d\ dÓ dÔ dQd` dQdÕ d™ d˜ d³ dA dB dQdQd´ d™ dQd\ dQdQdQdQd_ dQdQdQdQd dQdb dÖ da dv dQd™ dï dQdQdð dQd× dØ dQdÙ dc gÚfddddddd d d d d ddddddddddáddddd"d!dddÈdddd d!d"d3d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;dd?d¾dÁd@dÚdAd¸dBd¹dCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYd¼dÐdZd³d[d\d]d^déd_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdud.d*dvdwdxdydzd|d}d~dd€dd‚dƒd„d…d†dåd‡dˆd‰dŠd‹dŒddŽddd‘d’d“d”d•d–d—d˜g¦dd™ dH dš dL d› dœ d dž dŸ dF d6 d3 d  dD d¡ dd¢ d£ dJ dY d7 d¤ d¥ dd:d>df d¦ d™ d§ dµd5 d¨ dd dç dK d© dG d8 dª d« d™ d¬ d™ dI dM d9 d­ d® d d dd. dµd¯ ddd4 d° d± dX dd² d³ d! dà d´ d" dµ dµd¶ dµddC d· dE d¸ d¹ dº d: d d¿dd; ddB dddV d» d¼ dddµd½ dá dµddµdd! d dd¿dd> d¾ d= d? 
[Binary archive content omitted: marshalled PLY-generated LALR parsing
tables (apparently pycparser's bundled yacctab.py, generated with PLY 3.10).
The raw table data is not reproducible as text; the recoverable structure is:

  - _lr_action_items and _lr_goto_items: column-major shift/reduce and goto
    tables, expanded at import time into row-major _lr_action[state][token]
    and _lr_goto[state][nonterminal] dicts, after which the *_items dicts
    are deleted.
  - The grammar's terminal set, including VOID, LBRACKET, WCHAR_CONST,
    FLOAT_CONST, MINUS, RPAREN, LONG, PLUS, ELLIPSIS, GOTO, ENUM, SIZEOF,
    STRUCT, TYPEDEF, ... STRING_LITERAL, FLOAT, XOREQUAL, LSHIFTEQUAL,
    RBRACKET, plus the end marker $end.
  - The grammar's nonterminal set, including expression_statement,
    struct_or_union_specifier, init_declarator_list, translation_unit,
    declaration, compound_statement, ... conditional_expression.
  - _lr_productions: one entry per grammar rule, starting with the
    augmented start rule "S' -> translation_unit_or_empty" and continuing
    with entries such as "abstract_declarator_opt -> empty", each mapping a
    production to its p_* handler method (e.g. p_abstract_declarator_opt in
    plyparser.py). The dump is truncated mid-entry at
    "struct_declarator_list_opt -> ".]
struct_declarator_listz type_qualifier_list_opt -> emptyrpÚp_type_qualifier_list_optz.type_qualifier_list_opt -> type_qualifier_listzdirect_id_declarator -> IDr€Úp_direct_id_declarator_1z3direct_id_declarator -> LPAREN id_declarator RPARENÚp_direct_id_declarator_2zpdirect_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKETÚp_direct_id_declarator_3zsdirect_id_declarator -> direct_id_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKETÚp_direct_id_declarator_4zodirect_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKETz\direct_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKETÚp_direct_id_declarator_5zNdirect_id_declarator -> direct_id_declarator LPAREN parameter_type_list RPARENÚp_direct_id_declarator_6zNdirect_id_declarator -> direct_id_declarator LPAREN identifier_list_opt RPARENz"direct_typeid_declarator -> TYPEIDr˜Úp_direct_typeid_declarator_1z;direct_typeid_declarator -> LPAREN typeid_declarator RPARENÚp_direct_typeid_declarator_2zxdirect_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKETÚp_direct_typeid_declarator_3z{direct_typeid_declarator -> direct_typeid_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKETÚp_direct_typeid_declarator_4zwdirect_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKETzddirect_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKETÚp_direct_typeid_declarator_5zVdirect_typeid_declarator -> direct_typeid_declarator LPAREN parameter_type_list RPARENÚp_direct_typeid_declarator_6zVdirect_typeid_declarator -> direct_typeid_declarator LPAREN identifier_list_opt RPARENz*direct_typeid_noparen_declarator -> TYPEIDr­Ú$p_direct_typeid_noparen_declarator_1zˆdirect_typeid_noparen_declarator 
-> direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKETÚ$p_direct_typeid_noparen_declarator_3z‹direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKETÚ$p_direct_typeid_noparen_declarator_4z‡direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKETztdirect_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKETÚ$p_direct_typeid_noparen_declarator_5zfdirect_typeid_noparen_declarator -> direct_typeid_noparen_declarator LPAREN parameter_type_list RPARENÚ$p_direct_typeid_noparen_declarator_6zfdirect_typeid_noparen_declarator -> direct_typeid_noparen_declarator LPAREN identifier_list_opt RPARENz%id_declarator -> direct_id_declaratorr®Úp_id_declarator_1z-id_declarator -> pointer direct_id_declaratorÚp_id_declarator_2z-typeid_declarator -> direct_typeid_declaratorrˆÚp_typeid_declarator_1z5typeid_declarator -> pointer direct_typeid_declaratorÚp_typeid_declarator_2z=typeid_noparen_declarator -> direct_typeid_noparen_declaratorroÚp_typeid_noparen_declarator_1zEtypeid_noparen_declarator -> pointer direct_typeid_noparen_declaratorÚp_typeid_noparen_declarator_2z-translation_unit_or_empty -> translation_unitr•Úp_translation_unit_or_emptyz c_parser.pyz"translation_unit_or_empty -> emptyz(translation_unit -> external_declarationrwÚp_translation_unit_1i z9translation_unit -> translation_unit external_declarationÚp_translation_unit_2iz+external_declaration -> function_definitionr™Úp_external_declaration_1iz#external_declaration -> declarationÚp_external_declaration_2i#z$external_declaration -> pp_directiveÚp_external_declaration_3i(z*external_declaration -> pppragma_directivei)zexternal_declaration -> SEMIÚp_external_declaration_4i.zpp_directive -> PPHASHr‚Úp_pp_directivei3zpppragma_directive -> 
PPPRAGMArœÚp_pppragma_directivei9z*pppragma_directive -> PPPRAGMA PPPRAGMASTRi:zLfunction_definition -> id_declarator declaration_list_opt compound_statementr¾Úp_function_definition_1iEzcfunction_definition -> declaration_specifiers id_declarator declaration_list_opt compound_statementÚp_function_definition_2iVzstatement -> labeled_statementrÚ p_statementiaz!statement -> expression_statementibzstatement -> compound_statementicz statement -> selection_statementidz statement -> iteration_statementiezstatement -> jump_statementifzstatement -> pppragma_directiveigz declaration_specifiers init_declarator_list_optrÁÚ p_decl_bodyiuzGdecl_body -> declaration_specifiers_no_type id_init_declarator_list_optivzdeclaration -> decl_body SEMIr£Ú p_declarationi±zdeclaration_list -> declarationrƒÚp_declaration_listiºz0declaration_list -> declaration_list declarationi»zSdeclaration_specifiers_no_type -> type_qualifier declaration_specifiers_no_type_optr¤Ú"p_declaration_specifiers_no_type_1iÅz\declaration_specifiers_no_type -> storage_class_specifier declaration_specifiers_no_type_optÚ"p_declaration_specifiers_no_type_2iÊzWdeclaration_specifiers_no_type -> function_specifier declaration_specifiers_no_type_optÚ"p_declaration_specifiers_no_type_3iÏz?declaration_specifiers -> declaration_specifiers type_qualifierr»Úp_declaration_specifiers_1iÕzHdeclaration_specifiers -> declaration_specifiers storage_class_specifierÚp_declaration_specifiers_2iÚzCdeclaration_specifiers -> declaration_specifiers function_specifierÚp_declaration_specifiers_3ißzIdeclaration_specifiers -> declaration_specifiers type_specifier_no_typeidÚp_declaration_specifiers_4iäz(declaration_specifiers -> type_specifierÚp_declaration_specifiers_5iézGdeclaration_specifiers -> declaration_specifiers_no_type type_specifierÚp_declaration_specifiers_6iîzstorage_class_specifier -> AUTOr“Úp_storage_class_specifieriôz#storage_class_specifier -> REGISTERiõz!storage_class_specifier -> STATICiöz!storage_class_specifier -> 
EXTERNi÷z"storage_class_specifier -> TYPEDEFiøzfunction_specifier -> INLINErÂÚp_function_specifieriýz type_specifier_no_typeid -> VOIDr«Úp_type_specifier_no_typeidiz!type_specifier_no_typeid -> _BOOLiz type_specifier_no_typeid -> CHARiz!type_specifier_no_typeid -> SHORTiztype_specifier_no_typeid -> INTiz type_specifier_no_typeid -> LONGiz!type_specifier_no_typeid -> FLOATiz"type_specifier_no_typeid -> DOUBLEi z$type_specifier_no_typeid -> _COMPLEXi z"type_specifier_no_typeid -> SIGNEDi z$type_specifier_no_typeid -> UNSIGNEDi z$type_specifier_no_typeid -> __INT128i ztype_specifier -> typedef_namer…Úp_type_specifieriz type_specifier -> enum_specifieriz+type_specifier -> struct_or_union_specifieriz*type_specifier -> type_specifier_no_typeidiztype_qualifier -> CONSTrÚp_type_qualifieriztype_qualifier -> RESTRICTiztype_qualifier -> VOLATILEiz'init_declarator_list -> init_declaratorrhÚp_init_declarator_listi!zBinit_declarator_list -> init_declarator_list COMMA init_declaratori"zinit_declarator -> declaratorrxÚp_init_declaratori*z0init_declarator -> declarator EQUALS initializeri+z-id_init_declarator_list -> id_init_declaratorr‰Úp_id_init_declarator_listi0zHid_init_declarator_list -> id_init_declarator_list COMMA init_declaratori1z#id_init_declarator -> id_declaratorr„Úp_id_init_declaratori6z6id_init_declarator -> id_declarator EQUALS initializeri7zMspecifier_qualifier_list -> specifier_qualifier_list type_specifier_no_typeidrÃÚp_specifier_qualifier_list_1i>zCspecifier_qualifier_list -> specifier_qualifier_list type_qualifierÚp_specifier_qualifier_list_2iCz*specifier_qualifier_list -> type_specifierÚp_specifier_qualifier_list_3iHz>specifier_qualifier_list -> type_qualifier_list type_specifierÚp_specifier_qualifier_list_4iMz/struct_or_union_specifier -> struct_or_union IDrgÚp_struct_or_union_specifier_1iVz3struct_or_union_specifier -> struct_or_union TYPEIDiWz[struct_or_union_specifier -> struct_or_union brace_open struct_declaration_list 
brace_closeÚp_struct_or_union_specifier_2i`z^struct_or_union_specifier -> struct_or_union ID brace_open struct_declaration_list brace_closeÚp_struct_or_union_specifier_3iizbstruct_or_union_specifier -> struct_or_union TYPEID brace_open struct_declaration_list brace_closeijzstruct_or_union -> STRUCTr³Úp_struct_or_unioniszstruct_or_union -> UNIONitz-struct_declaration_list -> struct_declarationrÚp_struct_declaration_listi{zEstruct_declaration_list -> struct_declaration_list struct_declarationi|zNstruct_declaration -> specifier_qualifier_list struct_declarator_list_opt SEMIr¬Úp_struct_declaration_1i„zstruct_declaration -> SEMIÚp_struct_declaration_2iªz+struct_declarator_list -> struct_declaratorrŸÚp_struct_declarator_listi¯zHstruct_declarator_list -> struct_declarator_list COMMA struct_declaratori°zstruct_declarator -> declaratorrÚp_struct_declarator_1i¸z9struct_declarator -> declarator COLON constant_expressionÚp_struct_declarator_2i½z.struct_declarator -> COLON constant_expressioni¾zenum_specifier -> ENUM IDrÀÚp_enum_specifier_1iÆzenum_specifier -> ENUM TYPEIDiÇz=enum_specifier -> ENUM brace_open enumerator_list brace_closeÚp_enum_specifier_2iÌz@enum_specifier -> ENUM ID brace_open enumerator_list brace_closeÚp_enum_specifier_3iÑzDenum_specifier -> ENUM TYPEID brace_open enumerator_list brace_closeiÒzenumerator_list -> enumeratorr¦Úp_enumerator_listi×z(enumerator_list -> enumerator_list COMMAiØz3enumerator_list -> enumerator_list COMMA enumeratoriÙzenumerator -> IDrnÚ p_enumeratoriäz+enumerator -> ID EQUALS constant_expressioniåzdeclarator -> id_declaratorrŠÚ p_declaratoriôzdeclarator -> typeid_declaratoriõz(pointer -> TIMES type_qualifier_list_optr‡Ú p_pointeridz0pointer -> TIMES type_qualifier_list_opt pointeriez%type_qualifier_list -> type_qualifierr¸Úp_type_qualifier_listi‚z9type_qualifier_list -> type_qualifier_list type_qualifieriƒz%parameter_type_list -> parameter_listr·Úp_parameter_type_listiˆz4parameter_type_list -> parameter_list COMMA 
ELLIPSISi‰z'parameter_list -> parameter_declarationrtÚp_parameter_listi‘z parameter_list COMMA parameter_declarationi’z=parameter_declaration -> declaration_specifiers id_declaratorr¡Úp_parameter_declaration_1i¥zIparameter_declaration -> declaration_specifiers typeid_noparen_declaratori¦zGparameter_declaration -> declaration_specifiers abstract_declarator_optÚp_parameter_declaration_2i±zidentifier_list -> identifierr¼Úp_identifier_listiÐz3identifier_list -> identifier_list COMMA identifieriÑz$initializer -> assignment_expressionrÚp_initializer_1iÚz:initializer -> brace_open initializer_list_opt brace_closeÚp_initializer_2ißz brace_open initializer_list COMMA brace_closeiàz/initializer_list -> designation_opt initializerr±Úp_initializer_listièzFinitializer_list -> initializer_list COMMA designation_opt initializeriéz%designation -> designator_list EQUALSrsÚ p_designationiôzdesignator_list -> designatorrzÚp_designator_listiüz-designator_list -> designator_list designatoriýz3designator -> LBRACKET constant_expression RBRACKETr¹Ú p_designatorizdesignator -> PERIOD identifieriz=type_name -> specifier_qualifier_list abstract_declarator_optršÚ p_type_nameizabstract_declarator -> pointerrvÚp_abstract_declarator_1iz9abstract_declarator -> pointer direct_abstract_declaratorÚp_abstract_declarator_2iz1abstract_declarator -> direct_abstract_declaratorÚp_abstract_declarator_3i z?direct_abstract_declarator -> LPAREN abstract_declarator RPARENryÚp_direct_abstract_declarator_1i*zddirect_abstract_declarator -> direct_abstract_declarator LBRACKET assignment_expression_opt RBRACKETÚp_direct_abstract_declarator_2i.zIdirect_abstract_declarator -> LBRACKET assignment_expression_opt RBRACKETÚp_direct_abstract_declarator_3i9zPdirect_abstract_declarator -> direct_abstract_declarator LBRACKET TIMES RBRACKETÚp_direct_abstract_declarator_4iBz5direct_abstract_declarator -> LBRACKET TIMES RBRACKETÚp_direct_abstract_declarator_5iMz^direct_abstract_declarator -> direct_abstract_declarator LPAREN 
parameter_type_list_opt RPARENÚp_direct_abstract_declarator_6iVzCdirect_abstract_declarator -> LPAREN parameter_type_list_opt RPARENÚp_direct_abstract_declarator_7i`zblock_item -> declarationr§Ú p_block_itemikzblock_item -> statementilzblock_item_list -> block_itemr›Úp_block_item_listisz-block_item_list -> block_item_list block_itemitz@compound_statement -> brace_open block_item_list_opt brace_closer†Úp_compound_statement_1izz'labeled_statement -> ID COLON statementruÚp_labeled_statement_1i€z=labeled_statement -> CASE constant_expression COLON statementÚp_labeled_statement_2i„z,labeled_statement -> DEFAULT COLON statementÚp_labeled_statement_3iˆz IF LPAREN expression RPAREN statementr¯Úp_selection_statement_1iŒzKselection_statement -> IF LPAREN expression RPAREN statement ELSE statementÚp_selection_statement_2iz@selection_statement -> SWITCH LPAREN expression RPAREN statementÚp_selection_statement_3i”z?iteration_statement -> WHILE LPAREN expression RPAREN statementrjÚp_iteration_statement_1i™zGiteration_statement -> DO statement WHILE LPAREN expression RPAREN SEMIÚp_iteration_statement_2iziiteration_statement -> FOR LPAREN expression_opt SEMI expression_opt SEMI expression_opt RPAREN statementÚp_iteration_statement_3i¡zaiteration_statement -> FOR LPAREN declaration expression_opt SEMI expression_opt RPAREN statementÚp_iteration_statement_4i¥zjump_statement -> GOTO ID SEMIr¥Úp_jump_statement_1iªzjump_statement -> BREAK SEMIÚp_jump_statement_2i®zjump_statement -> CONTINUE SEMIÚp_jump_statement_3i²z(jump_statement -> RETURN expression SEMIÚp_jump_statement_4i¶zjump_statement -> RETURN SEMIi·z+expression_statement -> expression_opt SEMIrfÚp_expression_statementi¼z#expression -> assignment_expressionr’Ú p_expressioniÃz4expression -> expression COMMA assignment_expressioniÄztypedef_name -> TYPEIDrÚp_typedef_nameiÐz/assignment_expression -> conditional_expressionrµÚp_assignment_expressioniÔzSassignment_expression -> unary_expression assignment_operator 
assignment_expressioniÕzassignment_operator -> EQUALSr‘Úp_assignment_operatoriâzassignment_operator -> XOREQUALiãz!assignment_operator -> TIMESEQUALiäzassignment_operator -> DIVEQUALiåzassignment_operator -> MODEQUALiæz assignment_operator -> PLUSEQUALiçz!assignment_operator -> MINUSEQUALièz"assignment_operator -> LSHIFTEQUALiéz"assignment_operator -> RSHIFTEQUALiêzassignment_operator -> ANDEQUALiëzassignment_operator -> OREQUALiìz-constant_expression -> conditional_expressionr¨Úp_constant_expressioniñz+conditional_expression -> binary_expressionrÄÚp_conditional_expressioniõzZconditional_expression -> binary_expression CONDOP expression COLON conditional_expressioniöz$binary_expression -> cast_expressionr¿Úp_binary_expressioniþz>binary_expression -> binary_expression TIMES binary_expressioniÿz?binary_expression -> binary_expression DIVIDE binary_expressioniz binary_expression MOD binary_expressioniz=binary_expression -> binary_expression PLUS binary_expressioniz>binary_expression -> binary_expression MINUS binary_expressioniz?binary_expression -> binary_expression RSHIFT binary_expressioniz?binary_expression -> binary_expression LSHIFT binary_expressioniz;binary_expression -> binary_expression LT binary_expressioniz;binary_expression -> binary_expression LE binary_expressioniz;binary_expression -> binary_expression GE binary_expressioniz;binary_expression -> binary_expression GT binary_expressioni z;binary_expression -> binary_expression EQ binary_expressioni z;binary_expression -> binary_expression NE binary_expressioni z binary_expression AND binary_expressioni z;binary_expression -> binary_expression OR binary_expressioni z binary_expression XOR binary_expressioniz=binary_expression -> binary_expression LAND binary_expressioniz binary_expression LOR binary_expressioniz#cast_expression -> unary_expressionržÚp_cast_expression_1iz:cast_expression -> LPAREN type_name RPAREN cast_expressionÚp_cast_expression_2iz&unary_expression -> 
postfix_expressionr}Úp_unary_expression_1i z-unary_expression -> PLUSPLUS unary_expressionÚp_unary_expression_2i$z/unary_expression -> MINUSMINUS unary_expressioni%z2unary_expression -> unary_operator cast_expressioni&z+unary_expression -> SIZEOF unary_expressionÚp_unary_expression_3i+z2unary_expression -> SIZEOF LPAREN type_name RPARENi,zunary_operator -> ANDr²Úp_unary_operatori4zunary_operator -> TIMESi5zunary_operator -> PLUSi6zunary_operator -> MINUSi7zunary_operator -> NOTi8zunary_operator -> LNOTi9z(postfix_expression -> primary_expressionr°Úp_postfix_expression_1i>zEpostfix_expression -> postfix_expression LBRACKET expression RBRACKETÚp_postfix_expression_2iBzOpostfix_expression -> postfix_expression LPAREN argument_expression_list RPARENÚp_postfix_expression_3iFz6postfix_expression -> postfix_expression LPAREN RPARENiGz2postfix_expression -> postfix_expression PERIOD IDÚp_postfix_expression_4iLz6postfix_expression -> postfix_expression PERIOD TYPEIDiMz1postfix_expression -> postfix_expression ARROW IDiNz5postfix_expression -> postfix_expression ARROW TYPEIDiOz1postfix_expression -> postfix_expression PLUSPLUSÚp_postfix_expression_5iUz3postfix_expression -> postfix_expression MINUSMINUSiVzUpostfix_expression -> LPAREN type_name RPAREN brace_open initializer_list brace_closeÚp_postfix_expression_6i[z[postfix_expression -> LPAREN type_name RPAREN brace_open initializer_list COMMA brace_closei\z primary_expression -> identifierr¢Úp_primary_expression_1iazprimary_expression -> constantÚp_primary_expression_2iez,primary_expression -> unified_string_literalÚp_primary_expression_3iiz-primary_expression -> unified_wstring_literalijz.primary_expression -> LPAREN expression RPARENÚp_primary_expression_4iozWprimary_expression -> OFFSETOF LPAREN type_name COMMA offsetof_member_designator RPARENÚp_primary_expression_5isz(offsetof_member_designator -> identifierr|Úp_offsetof_member_designatori{zJoffsetof_member_designator -> offsetof_member_designator PERIOD 
pycparser-2.18/pycparser/__pycache__/__init__.cpython-36.pyc0000664000175000017500000000465213127011712024603 0ustar  elibeneliben00000000000000
__all__ = ['c_lexer', 'c_parser', 'c_ast']
__version__ = '2.18'

from subprocess import Popen, PIPE
from .c_parser import CParser


def preprocess_file(filename, cpp_path='cpp', cpp_args=''):
    """ Preprocess a file using cpp.

        filename:
            Name of the file you want to preprocess.

        cpp_path:
        cpp_args:
            Refer to the documentation of parse_file for the meaning of these
            arguments.

        When successful, returns the preprocessed file's contents.
        Errors from cpp will be printed out.
    """
    path_list = [cpp_path]
    if isinstance(cpp_args, list):
        path_list += cpp_args
    elif cpp_args != '':
        path_list += [cpp_args]
    path_list += [filename]

    try:
        # Note the use of universal_newlines to treat all newlines
        # as \n for Python's purpose
        pipe = Popen(path_list,
                     stdout=PIPE,
                     universal_newlines=True)
        text = pipe.communicate()[0]
    except OSError as e:
        raise RuntimeError("Unable to invoke 'cpp'.  " +
            'Make sure its path was passed correctly\n' +
            ('Original error: %s' % e))

    return text


def parse_file(filename, use_cpp=False, cpp_path='cpp', cpp_args='',
               parser=None):
    """ Parse a C file using pycparser.

        filename:
            Name of the file you want to parse.

        use_cpp:
            Set to True if you want to execute the C pre-processor on the
            file prior to parsing it.

        cpp_path:
            If use_cpp is True, this is the path to 'cpp' on your system.
            If no path is provided, it attempts to just execute 'cpp', so it
            must be in your PATH.

        cpp_args:
            If use_cpp is True, set this to the command line arguments strings
            to cpp. Be careful with quotes - it's best to pass a raw string
            (r'') here. For example:
            r'-I../utils/fake_libc_include'
            If several arguments are required, pass a list of strings.

        parser:
            Optional parser object to be used instead of the default CParser

        When successful, an AST is returned. ParseError can be thrown if the
        file doesn't parse successfully.

        Errors from cpp will be printed out.
    """
    if use_cpp:
        text = preprocess_file(filename, cpp_path, cpp_args)
    else:
        with open(filename, 'rU') as f:
            text = f.read()

    if parser is None:
        parser = CParser()
    return parser.parse(text, filename)
pycparser-2.18/pycparser/ply/0000775000175000017500000000000013127011712017003 5ustar  elibeneliben00000000000000
pycparser-2.18/pycparser/ply/lex.py0000664000175000017500000012364613070450737020164 0ustar  elibeneliben00000000000000
# -----------------------------------------------------------------------------
# ply: lex.py
#
# Copyright (C) 2001-2017
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#   specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------

__version__ = '3.10'
__tabversion__ = '3.10'

import re
import sys
import types
import copy
import os
import inspect

# This tuple contains known string types
try:
    # Python 2.6
    StringTypes = (types.StringType, types.UnicodeType)
except AttributeError:
    # Python 3.0
    StringTypes = (str, bytes)

# This regular expression is used to match valid token names
_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')

# Exception thrown when invalid token encountered and no default error
# handler is defined.
class LexError(Exception):
    def __init__(self, message, s):
        self.args = (message,)
        self.text = s


# Token class. This class is used to represent the tokens produced.
class LexToken(object):
    def __str__(self):
        return 'LexToken(%s,%r,%d,%d)' % (self.type, self.value, self.lineno, self.lexpos)

    def __repr__(self):
        return str(self)


# This object is a stand-in for a logging object created by the
# logging module.
class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def critical(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    info = critical
    debug = critical


# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self


# -----------------------------------------------------------------------------
#                           === Lexing Engine ===
#
# The following Lexer class implements the lexer runtime. There are only
# a few public methods and attributes:
#
#    input()          -  Store a new string in the lexer
#    token()          -  Get the next token
#    clone()          -  Clone the lexer
#
#    lineno           -  Current line number
#    lexpos           -  Current position in the input string
# -----------------------------------------------------------------------------

class Lexer:
    def __init__(self):
        self.lexre = None               # Master regular expression.
                                        # This is a list of
                                        # tuples (re, findex) where re is a compiled
                                        # regular expression and findex is a list
                                        # mapping regex group numbers to rules
        self.lexretext = None           # Current regular expression strings
        self.lexstatere = {}            # Dictionary mapping lexer states to master regexs
        self.lexstateretext = {}        # Dictionary mapping lexer states to regex strings
        self.lexstaterenames = {}       # Dictionary mapping lexer states to symbol names
        self.lexstate = 'INITIAL'       # Current lexer state
        self.lexstatestack = []         # Stack of lexer states
        self.lexstateinfo = None        # State information
        self.lexstateignore = {}        # Dictionary of ignored characters for each state
        self.lexstateerrorf = {}        # Dictionary of error functions for each state
        self.lexstateeoff = {}          # Dictionary of eof functions for each state
        self.lexreflags = 0             # Optional re compile flags
        self.lexdata = None             # Actual input data (as a string)
        self.lexpos = 0                 # Current position in input text
        self.lexlen = 0                 # Length of the input text
        self.lexerrorf = None           # Error rule (if any)
        self.lexeoff = None             # EOF rule (if any)
        self.lextokens = None           # List of valid tokens
        self.lexignore = ''             # Ignored characters
        self.lexliterals = ''           # Literal characters that can be passed through
        self.lexmodule = None           # Module
        self.lineno = 1                 # Current line number
        self.lexoptimize = False        # Optimized mode

    def clone(self, object=None):
        c = copy.copy(self)

        # If the object parameter has been supplied, it means we are attaching the
        # lexer to a new object.  In this case, we have to rebind all methods in
        # the lexstatere and lexstateerrorf tables.
        if object:
            newtab = {}
            for key, ritem in self.lexstatere.items():
                newre = []
                for cre, findex in ritem:
                    newfindex = []
                    for f in findex:
                        if not f or not f[0]:
                            newfindex.append(f)
                            continue
                        newfindex.append((getattr(object, f[0].__name__), f[1]))
                    newre.append((cre, newfindex))
                newtab[key] = newre
            c.lexstatere = newtab
            c.lexstateerrorf = {}
            for key, ef in self.lexstateerrorf.items():
                c.lexstateerrorf[key] = getattr(object, ef.__name__)
            c.lexmodule = object
        return c

    # ------------------------------------------------------------
    # writetab() - Write lexer information to a table file
    # ------------------------------------------------------------
    def writetab(self, lextab, outputdir=''):
        if isinstance(lextab, types.ModuleType):
            raise IOError("Won't overwrite existing lextab module")
        basetabmodule = lextab.split('.')[-1]
        filename = os.path.join(outputdir, basetabmodule) + '.py'
        with open(filename, 'w') as tf:
            tf.write('# %s.py. This file automatically created by PLY (version %s). Don\'t edit!\n' % (basetabmodule, __version__))
            tf.write('_tabversion   = %s\n' % repr(__tabversion__))
            tf.write('_lextokens    = set(%s)\n' % repr(tuple(self.lextokens)))
            tf.write('_lexreflags   = %s\n' % repr(self.lexreflags))
            tf.write('_lexliterals  = %s\n' % repr(self.lexliterals))
            tf.write('_lexstateinfo = %s\n' % repr(self.lexstateinfo))

            # Rewrite the lexstatere table, replacing function objects with function names
            tabre = {}
            for statename, lre in self.lexstatere.items():
                titem = []
                for (pat, func), retext, renames in zip(lre, self.lexstateretext[statename], self.lexstaterenames[statename]):
                    titem.append((retext, _funcs_to_names(func, renames)))
                tabre[statename] = titem

            tf.write('_lexstatere   = %s\n' % repr(tabre))
            tf.write('_lexstateignore = %s\n' % repr(self.lexstateignore))

            taberr = {}
            for statename, ef in self.lexstateerrorf.items():
                taberr[statename] = ef.__name__ if ef else None
            tf.write('_lexstateerrorf = %s\n' % repr(taberr))

            tabeof = {}
            for statename, ef in self.lexstateeoff.items():
                tabeof[statename] = ef.__name__ if ef else None
            tf.write('_lexstateeoff = %s\n' % repr(tabeof))

    # ------------------------------------------------------------
    # readtab() - Read lexer information from a tab file
    # ------------------------------------------------------------
    def readtab(self, tabfile, fdict):
        if isinstance(tabfile, types.ModuleType):
            lextab = tabfile
        else:
            exec('import %s' % tabfile)
            lextab = sys.modules[tabfile]

        if getattr(lextab, '_tabversion', '0.0') != __tabversion__:
            raise ImportError('Inconsistent PLY version')

        self.lextokens      = lextab._lextokens
        self.lexreflags     = lextab._lexreflags
        self.lexliterals    = lextab._lexliterals
        self.lextokens_all  = self.lextokens | set(self.lexliterals)
        self.lexstateinfo   = lextab._lexstateinfo
        self.lexstateignore = lextab._lexstateignore
        self.lexstatere     = {}
        self.lexstateretext = {}
        for statename, lre in lextab._lexstatere.items():
            titem = []
            txtitem = []
            for pat, func_name in lre:
                titem.append((re.compile(pat, lextab._lexreflags), _names_to_funcs(func_name, fdict)))
            self.lexstatere[statename] = titem
            self.lexstateretext[statename] = txtitem

        self.lexstateerrorf = {}
        for statename, ef in lextab._lexstateerrorf.items():
            self.lexstateerrorf[statename] = fdict[ef]

        self.lexstateeoff = {}
        for statename, ef in lextab._lexstateeoff.items():
            self.lexstateeoff[statename] = fdict[ef]

        self.begin('INITIAL')

    # ------------------------------------------------------------
    # input() - Push a new string into the lexer
    # ------------------------------------------------------------
    def input(self, s):
        # Pull off the first character to see if s looks like a string
        c = s[:1]
        if not isinstance(c, StringTypes):
            raise ValueError('Expected a string')
        self.lexdata = s
        self.lexpos = 0
        self.lexlen = len(s)

    # ------------------------------------------------------------
    # begin() - Changes the lexing state
    # ------------------------------------------------------------
    def begin(self, state):
        if state not in self.lexstatere:
            raise ValueError('Undefined state')
        self.lexre = self.lexstatere[state]
        self.lexretext = self.lexstateretext[state]
        self.lexignore = self.lexstateignore.get(state, '')
        self.lexerrorf = self.lexstateerrorf.get(state, None)
        self.lexeoff = self.lexstateeoff.get(state, None)
        self.lexstate = state

    # ------------------------------------------------------------
    # push_state() - Changes the lexing state and saves old on stack
    # ------------------------------------------------------------
    def push_state(self, state):
        self.lexstatestack.append(self.lexstate)
        self.begin(state)

    # ------------------------------------------------------------
    # pop_state() - Restores the previous state
    # ------------------------------------------------------------
    def pop_state(self):
        self.begin(self.lexstatestack.pop())

    # ------------------------------------------------------------
    # current_state() - Returns the current lexing state
    # ------------------------------------------------------------
    def current_state(self):
        return self.lexstate

    # ------------------------------------------------------------
    # skip() - Skip ahead n characters
    # ------------------------------------------------------------
    def skip(self, n):
        self.lexpos += n

    # ------------------------------------------------------------
    # opttoken() - Return the next token from the Lexer
    #
    # Note: This function has been carefully implemented to be as fast
    # as possible.
    # Don't make changes unless you really know what
    # you are doing
    # ------------------------------------------------------------
    def token(self):
        # Make local copies of frequently referenced attributes
        lexpos    = self.lexpos
        lexlen    = self.lexlen
        lexignore = self.lexignore
        lexdata   = self.lexdata

        while lexpos < lexlen:
            # This code provides some short-circuit code for whitespace, tabs, and other ignored characters
            if lexdata[lexpos] in lexignore:
                lexpos += 1
                continue

            # Look for a regular expression match
            for lexre, lexindexfunc in self.lexre:
                m = lexre.match(lexdata, lexpos)
                if not m:
                    continue

                # Create a token for return
                tok = LexToken()
                tok.value = m.group()
                tok.lineno = self.lineno
                tok.lexpos = lexpos

                i = m.lastindex
                func, tok.type = lexindexfunc[i]

                if not func:
                    # If no token type was set, it's an ignored token
                    if tok.type:
                        self.lexpos = m.end()
                        return tok
                    else:
                        lexpos = m.end()
                        break

                lexpos = m.end()

                # If token is processed by a function, call it

                tok.lexer = self      # Set additional attributes useful in token rules
                self.lexmatch = m
                self.lexpos = lexpos

                newtok = func(tok)

                # Every function must return a token, if nothing, we just move to next token
                if not newtok:
                    lexpos    = self.lexpos         # This is here in case user has updated lexpos.
                    lexignore = self.lexignore      # This is here in case there was a state change
                    break

                # Verify type of the token.  If not in the token map, raise an error
                if not self.lexoptimize:
                    if newtok.type not in self.lextokens_all:
                        raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
                            func.__code__.co_filename, func.__code__.co_firstlineno,
                            func.__name__, newtok.type), lexdata[lexpos:])

                return newtok
            else:
                # No match, see if in literals
                if lexdata[lexpos] in self.lexliterals:
                    tok = LexToken()
                    tok.value = lexdata[lexpos]
                    tok.lineno = self.lineno
                    tok.type = tok.value
                    tok.lexpos = lexpos
                    self.lexpos = lexpos + 1
                    return tok

                # No match. Call t_error() if defined.
                if self.lexerrorf:
                    tok = LexToken()
                    tok.value = self.lexdata[lexpos:]
                    tok.lineno = self.lineno
                    tok.type = 'error'
                    tok.lexer = self
                    tok.lexpos = lexpos
                    self.lexpos = lexpos
                    newtok = self.lexerrorf(tok)
                    if lexpos == self.lexpos:
                        # Error method didn't change text position at all. This is an error.
                        raise LexError("Scanning error. Illegal character '%s'" % (lexdata[lexpos]), lexdata[lexpos:])
                    lexpos = self.lexpos
                    if not newtok:
                        continue
                    return newtok

                self.lexpos = lexpos
                raise LexError("Illegal character '%s' at index %d" % (lexdata[lexpos], lexpos), lexdata[lexpos:])

        if self.lexeoff:
            tok = LexToken()
            tok.type = 'eof'
            tok.value = ''
            tok.lineno = self.lineno
            tok.lexpos = lexpos
            tok.lexer = self
            self.lexpos = lexpos
            newtok = self.lexeoff(tok)
            return newtok

        self.lexpos = lexpos + 1
        if self.lexdata is None:
            raise RuntimeError('No input string given with input()')
        return None

    # Iterator interface
    def __iter__(self):
        return self

    def next(self):
        t = self.token()
        if t is None:
            raise StopIteration
        return t

    __next__ = next

# -----------------------------------------------------------------------------
#                           ==== Lex Builder ===
#
# The functions and classes below are used to collect lexing information
# and build a Lexer object from it.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# _get_regex(func)
#
# Returns the regular expression assigned to a function either as a doc string
# or as a .regex attribute attached by the @TOKEN decorator.
# -----------------------------------------------------------------------------
def _get_regex(func):
    return getattr(func, 'regex', func.__doc__)

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.
# This is used to get the environment
# associated with the yacc() call if none was provided.
# -----------------------------------------------------------------------------
def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict

# -----------------------------------------------------------------------------
# _funcs_to_names()
#
# Given a list of regular expression functions, this converts it to a list
# suitable for output to a table file
# -----------------------------------------------------------------------------
def _funcs_to_names(funclist, namelist):
    result = []
    for f, name in zip(funclist, namelist):
        if f and f[0]:
            result.append((name, f[1]))
        else:
            result.append(f)
    return result

# -----------------------------------------------------------------------------
# _names_to_funcs()
#
# Given a list of regular expression function names, this converts it back to
# functions.
# -----------------------------------------------------------------------------
def _names_to_funcs(namelist, fdict):
    result = []
    for n in namelist:
        if n and n[0]:
            result.append((fdict[n[0]], n[1]))
        else:
            result.append(n)
    return result

# -----------------------------------------------------------------------------
# _form_master_re()
#
# This function takes a list of all of the regex components and attempts to
# form the master regular expression.  Given limitations in the Python re
# module, it may be necessary to break the master regex into separate expressions.
# -----------------------------------------------------------------------------
def _form_master_re(relist, reflags, ldict, toknames):
    if not relist:
        return []
    regex = '|'.join(relist)
    try:
        lexre = re.compile(regex, reflags)

        # Build the index to function map for the matching engine
        lexindexfunc = [None] * (max(lexre.groupindex.values()) + 1)
        lexindexnames = lexindexfunc[:]

        for f, i in lexre.groupindex.items():
            handle = ldict.get(f, None)
            if type(handle) in (types.FunctionType, types.MethodType):
                lexindexfunc[i] = (handle, toknames[f])
                lexindexnames[i] = f
            elif handle is not None:
                lexindexnames[i] = f
                if f.find('ignore_') > 0:
                    lexindexfunc[i] = (None, None)
                else:
                    lexindexfunc[i] = (None, toknames[f])

        return [(lexre, lexindexfunc)], [regex], [lexindexnames]
    except Exception:
        m = int(len(relist)/2)
        if m == 0:
            m = 1
        llist, lre, lnames = _form_master_re(relist[:m], reflags, ldict, toknames)
        rlist, rre, rnames = _form_master_re(relist[m:], reflags, ldict, toknames)
        return (llist+rlist), (lre+rre), (lnames+rnames)

# -----------------------------------------------------------------------------
# def _statetoken(s,names)
#
# Given a declaration name s of the form "t_" and a dictionary whose keys are
# state names, this function returns a tuple (states,tokenname) where states
# is a tuple of state names and tokenname is the name of the token.
# For example,
# calling this with s = "t_foo_bar_SPAM" might return (('foo','bar'),'SPAM')
# -----------------------------------------------------------------------------
def _statetoken(s, names):
    nonstate = 1
    parts = s.split('_')
    for i, part in enumerate(parts[1:], 1):
        if part not in names and part != 'ANY':
            break

    if i > 1:
        states = tuple(parts[1:i])
    else:
        states = ('INITIAL',)

    if 'ANY' in states:
        states = tuple(names)

    tokenname = '_'.join(parts[i:])
    return (states, tokenname)


# -----------------------------------------------------------------------------
# LexerReflect()
#
# This class represents information needed to build a lexer as extracted from a
# user's input file.
# -----------------------------------------------------------------------------
class LexerReflect(object):
    def __init__(self, ldict, log=None, reflags=0):
        self.ldict      = ldict
        self.error_func = None
        self.tokens     = []
        self.reflags    = reflags
        self.stateinfo  = {'INITIAL': 'inclusive'}
        self.modules    = set()
        self.error      = False
        self.log        = PlyLogger(sys.stderr) if log is None else log

    # Get all of the basic information
    def get_all(self):
        self.get_tokens()
        self.get_literals()
        self.get_states()
        self.get_rules()

    # Validate all of the information
    def validate_all(self):
        self.validate_tokens()
        self.validate_literals()
        self.validate_rules()
        return self.error

    # Get the tokens map
    def get_tokens(self):
        tokens = self.ldict.get('tokens', None)
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = tokens

    # Validate the tokens
    def validate_tokens(self):
        terminals = {}
        for n in self.tokens:
            if not _is_identifier.match(n):
                self.log.error("Bad token name '%s'", n)
                self.error = True
            if n in terminals:
                self.log.warning("Token '%s' multiply defined", n)
            terminals[n] = 1

    # Get the literals specifier
    def get_literals(self):
        self.literals = self.ldict.get('literals', '')
        if not self.literals:
            self.literals = ''

    # Validate literals
    def validate_literals(self):
        try:
            for c in self.literals:
                if not isinstance(c, StringTypes) or len(c) > 1:
                    self.log.error('Invalid literal %s. Must be a single character', repr(c))
                    self.error = True
        except TypeError:
            self.log.error('Invalid literals specification. literals must be a sequence of characters')
            self.error = True

    def get_states(self):
        self.states = self.ldict.get('states', None)
        # Build statemap
        if self.states:
            if not isinstance(self.states, (tuple, list)):
                self.log.error('states must be defined as a tuple or list')
                self.error = True
            else:
                for s in self.states:
                    if not isinstance(s, tuple) or len(s) != 2:
                        self.log.error("Invalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')", repr(s))
                        self.error = True
                        continue
                    name, statetype = s
                    if not isinstance(name, StringTypes):
                        self.log.error('State name %s must be a string', repr(name))
                        self.error = True
                        continue
                    if not (statetype == 'inclusive' or statetype == 'exclusive'):
                        self.log.error("State type for state %s must be 'inclusive' or 'exclusive'", name)
                        self.error = True
                        continue
                    if name in self.stateinfo:
                        self.log.error("State '%s' already defined", name)
                        self.error = True
                        continue
                    self.stateinfo[name] = statetype

    # Get all of the symbols with a t_ prefix and sort them into various
    # categories (functions, strings, error functions, and ignore characters)
    def get_rules(self):
        tsymbols = [f for f in self.ldict if f[:2] == 't_']

        # Now build up a list of functions and a list of strings
        self.toknames = {}        # Mapping of symbols to token names
        self.funcsym  = {}        # Symbols defined as functions
        self.strsym   = {}        # Symbols defined as strings
        self.ignore   = {}        # Ignore strings by state
        self.errorf   = {}        # Error functions by state
        self.eoff     = {}        # EOF functions by state

        for s in self.stateinfo:
            self.funcsym[s] = []
            self.strsym[s] = []

        if len(tsymbols) == 0:
            self.log.error('No rules of the form t_rulename are defined')
            self.error = True
            return

        for f in tsymbols:
            t = self.ldict[f]
            states, tokname = _statetoken(f, self.stateinfo)
            self.toknames[f] = tokname

            if hasattr(t, '__call__'):
                if tokname == 'error':
                    for s in states:
                        self.errorf[s] = t
                elif tokname == 'eof':
                    for s in states:
                        self.eoff[s] = t
                elif tokname == 'ignore':
                    line = t.__code__.co_firstlineno
                    file = t.__code__.co_filename
                    self.log.error("%s:%d: Rule '%s' must be defined as a string", file, line, t.__name__)
                    self.error = True
                else:
                    for s in states:
                        self.funcsym[s].append((f, t))
            elif isinstance(t, StringTypes):
                if tokname == 'ignore':
                    for s in states:
                        self.ignore[s] = t
                    if '\\' in t:
                        self.log.warning("%s contains a literal backslash '\\'", f)
                elif tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", f)
                    self.error = True
                else:
                    for s in states:
                        self.strsym[s].append((f, t))
            else:
                self.log.error('%s not defined as a function or string', f)
                self.error = True

        # Sort the functions by line number
        for f in self.funcsym.values():
            f.sort(key=lambda x: x[1].__code__.co_firstlineno)

        # Sort the strings by regular expression length
        for s in self.strsym.values():
            s.sort(key=lambda x: len(x[1]), reverse=True)

    # Validate all of the t_rules collected
    def validate_rules(self):
        for state in self.stateinfo:
            # Validate all rules defined by functions
            for fname, f in self.funcsym[state]:
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                tokname = self.toknames[fname]
                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True
                    continue

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True
                    continue

                if not _get_regex(f):
                    self.log.error("%s:%d: No regular expression defined for rule '%s'", file, line, f.__name__)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (fname, _get_regex(f)), self.reflags)
                    if c.match(''):
                        self.log.error("%s:%d: Regular expression for rule '%s' matches empty string", file, line, f.__name__)
                        self.error = True
                except re.error as e:
                    self.log.error("%s:%d: Invalid regular expression for rule '%s'. %s", file, line, f.__name__, e)
                    if '#' in _get_regex(f):
                        self.log.error("%s:%d. Make sure '#' in rule '%s' is escaped with '\\#'", file, line, f.__name__)
                    self.error = True

            # Validate all rules defined by strings
            for name, r in self.strsym[state]:
                tokname = self.toknames[name]
                if tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", name)
                    self.error = True
                    continue

                if tokname not in self.tokens and tokname.find('ignore_') < 0:
                    self.log.error("Rule '%s' defined for an unspecified token %s", name, tokname)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (name, r), self.reflags)
                    if (c.match('')):
                        self.log.error("Regular expression for rule '%s' matches empty string", name)
                        self.error = True
                except re.error as e:
                    self.log.error("Invalid regular expression for rule '%s'. %s", name, e)
                    if '#' in r:
                        self.log.error("Make sure '#' in rule '%s' is escaped with '\\#'", name)
                    self.error = True

            if not self.funcsym[state] and not self.strsym[state]:
                self.log.error("No rules defined for state '%s'", state)
                self.error = True

            # Validate the error function
            efunc = self.errorf.get(state, None)
            if efunc:
                f = efunc
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True

        for module in self.modules:
            self.validate_module(module)

    # -----------------------------------------------------------------------------
    # validate_module()
    #
    # This checks to see if there are duplicated t_rulename() functions or strings
    # in the parser input file.  This is done using a simple regular expression
    # match on each line in the source code of the given module.
    # -----------------------------------------------------------------------------
    def validate_module(self, module):
        try:
            lines, linen = inspect.getsourcelines(module)
        except IOError:
            return

        fre = re.compile(r'\s*def\s+(t_[a-zA-Z_0-9]*)\(')
        sre = re.compile(r'\s*(t_[a-zA-Z_0-9]*)\s*=')

        counthash = {}
        linen += 1
        for line in lines:
            m = fre.match(line)
            if not m:
                m = sre.match(line)
            if m:
                name = m.group(1)
                prev = counthash.get(name)
                if not prev:
                    counthash[name] = linen
                else:
                    filename = inspect.getsourcefile(module)
                    self.log.error('%s:%d: Rule %s redefined. Previously defined on line %d',
                                   filename, linen, name, prev)
                    self.error = True
            linen += 1

# -----------------------------------------------------------------------------
# lex(module)
#
# Build all of the regular expression rules from definitions in the supplied module
# -----------------------------------------------------------------------------
def lex(module=None, object=None, debug=False, optimize=False, lextab='lextab',
        reflags=int(re.VERBOSE), nowarn=False, outputdir=None, debuglog=None, errorlog=None):

    if lextab is None:
        lextab = 'lextab'

    global lexer

    ldict = None
    stateinfo = {'INITIAL': 'inclusive'}
    lexobj = Lexer()
    lexobj.lexoptimize = optimize
    global token, input

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    if debug:
        if debuglog is None:
            debuglog = PlyLogger(sys.stderr)

    # Get the module dictionary used for the lexer
    if object:
        module = object

    # Get the module dictionary used for the parser
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        ldict = dict(_items)
        # If no __file__ attribute is available, try to obtain it from the __module__ instead
        if '__file__' not in ldict:
            ldict['__file__'] = sys.modules[ldict['__module__']].__file__
    else:
        ldict = get_caller_module_dict(2)

    # Determine if the module is package of a package or not.
    # If so, fix the tabmodule setting so that tables load correctly
    pkg = ldict.get('__package__')
    if pkg and isinstance(lextab, str):
        if '.' not in lextab:
            lextab = pkg + '.' + lextab

    # Collect parser information from the dictionary
    linfo = LexerReflect(ldict, log=errorlog, reflags=reflags)
    linfo.get_all()
    if not optimize:
        if linfo.validate_all():
            raise SyntaxError("Can't build lexer")

    if optimize and lextab:
        try:
            lexobj.readtab(lextab, ldict)
            token = lexobj.token
            input = lexobj.input
            lexer = lexobj
            return lexobj

        except ImportError:
            pass

    # Dump some basic debugging information
    if debug:
        debuglog.info('lex: tokens   = %r', linfo.tokens)
        debuglog.info('lex: literals = %r', linfo.literals)
        debuglog.info('lex: states   = %r', linfo.stateinfo)

    # Build a dictionary of valid token names
    lexobj.lextokens = set()
    for n in linfo.tokens:
        lexobj.lextokens.add(n)

    # Get literals specification
    if isinstance(linfo.literals, (list, tuple)):
        lexobj.lexliterals = type(linfo.literals[0])().join(linfo.literals)
    else:
        lexobj.lexliterals = linfo.literals

    lexobj.lextokens_all = lexobj.lextokens | set(lexobj.lexliterals)

    # Get the stateinfo dictionary
    stateinfo = linfo.stateinfo

    regexs = {}
    # Build the master regular expressions
    for state in stateinfo:
        regex_list = []

        # Add rules defined by functions first
        for fname, f in linfo.funcsym[state]:
            line = f.__code__.co_firstlineno
            file = f.__code__.co_filename
            regex_list.append('(?P<%s>%s)' % (fname, _get_regex(f)))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", fname, _get_regex(f), state)

        # Now add all of the simple rules
        for name, r in linfo.strsym[state]:
            regex_list.append('(?P<%s>%s)' % (name, r))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", name, r, state)

        regexs[state] = regex_list

    # Build the master regular expressions

    if debug:
        debuglog.info('lex: ==== MASTER REGEXS FOLLOW ====')

    for state in regexs:
        lexre, re_text, re_names = _form_master_re(regexs[state], reflags, ldict, linfo.toknames)
        lexobj.lexstatere[state] = lexre
        lexobj.lexstateretext[state] = re_text
        lexobj.lexstaterenames[state] = re_names
        if debug:
            for i, text in enumerate(re_text):
                debuglog.info("lex: state '%s' : regex[%d] = '%s'", state, i, text)

    # For inclusive states, we need to add the regular expressions from the INITIAL state
    for state, stype in stateinfo.items():
        if state != 'INITIAL' and stype == 'inclusive':
            lexobj.lexstatere[state].extend(lexobj.lexstatere['INITIAL'])
            lexobj.lexstateretext[state].extend(lexobj.lexstateretext['INITIAL'])
            lexobj.lexstaterenames[state].extend(lexobj.lexstaterenames['INITIAL'])

    lexobj.lexstateinfo = stateinfo
    lexobj.lexre = lexobj.lexstatere['INITIAL']
    lexobj.lexretext = lexobj.lexstateretext['INITIAL']
    lexobj.lexreflags = reflags

    # Set up ignore variables
    lexobj.lexstateignore = linfo.ignore
    lexobj.lexignore = lexobj.lexstateignore.get('INITIAL', '')

    # Set up error functions
    lexobj.lexstateerrorf = linfo.errorf
    lexobj.lexerrorf = linfo.errorf.get('INITIAL', None)
    if not lexobj.lexerrorf:
        errorlog.warning('No t_error rule is defined')

    # Set up eof functions
    lexobj.lexstateeoff = linfo.eoff
    lexobj.lexeoff = linfo.eoff.get('INITIAL', None)

    # Check state information for ignore and error rules
    for s, stype in stateinfo.items():
        if stype == 'exclusive':
            if s not in linfo.errorf:
                errorlog.warning("No error rule is defined for exclusive state '%s'", s)
            if s not in linfo.ignore and lexobj.lexignore:
                errorlog.warning("No ignore rule is defined for exclusive state '%s'", s)
        elif stype == 'inclusive':
            if s not in linfo.errorf:
                linfo.errorf[s] = linfo.errorf.get('INITIAL', None)
            if s not in linfo.ignore:
                linfo.ignore[s] = linfo.ignore.get('INITIAL', '')

    # Create global versions of the token() and input() functions
    token = lexobj.token
    input = lexobj.input
    lexer = lexobj

    # If in optimize mode, we write the lextab
    if lextab and optimize:
        if outputdir is None:
            # If no output directory is set, the location of the output files
            # is determined according to the following rules:
            #     - If lextab specifies a package, files go into that package directory
            #     - Otherwise, files go in the same directory as the specifying module
            if isinstance(lextab, types.ModuleType):
                srcfile = lextab.__file__
            else:
                if '.' not in lextab:
                    srcfile = ldict['__file__']
                else:
                    parts = lextab.split('.')
                    pkgname = '.'.join(parts[:-1])
                    exec('import %s' % pkgname)
                    srcfile = getattr(sys.modules[pkgname], '__file__', '')
            outputdir = os.path.dirname(srcfile)
        try:
            lexobj.writetab(lextab, outputdir)
        except IOError as e:
            errorlog.warning("Couldn't write lextab module %r. %s" % (lextab, e))

    return lexobj

# -----------------------------------------------------------------------------
# runmain()
#
# This runs the lexer as a main program
# -----------------------------------------------------------------------------
def runmain(lexer=None, data=None):
    if not data:
        try:
            filename = sys.argv[1]
            f = open(filename)
            data = f.read()
            f.close()
        except IndexError:
            sys.stdout.write('Reading from standard input (type EOF to end):\n')
            data = sys.stdin.read()

    if lexer:
        _input = lexer.input
    else:
        _input = input
    _input(data)
    if lexer:
        _token = lexer.token
    else:
        _token = token

    while True:
        tok = _token()
        if not tok:
            break
        sys.stdout.write('(%s,%r,%d,%d)\n' % (tok.type, tok.value, tok.lineno, tok.lexpos))

# -----------------------------------------------------------------------------
# @TOKEN(regex)
#
# This decorator function can be used to set the regex expression on a function
# when its docstring might need to be set in an alternative way
# -----------------------------------------------------------------------------
def TOKEN(r):
    def set_regex(f):
        if hasattr(r, '__call__'):
            f.regex = _get_regex(r)
        else:
            f.regex = r
        return f
    return set_regex

# Alternative spelling of the TOKEN decorator
Token = TOKEN
pycparser-2.18/pycparser/ply/ygen.py0000664000175000017500000000431313045001366020323 0ustar  elibeneliben00000000000000# ply: ygen.py
#
# This is a support program that auto-generates different versions of the YACC parsing
# function with different features removed for the purposes of performance.
#
# Users should edit the method LRParser.parsedebug() in yacc.py.
# The source code
# for that method is then used to create the other methods.  See the comments in
# yacc.py for further details.

import os.path
import shutil


def get_source_range(lines, tag):
    srclines = enumerate(lines)
    start_tag = '#--! %s-start' % tag
    end_tag = '#--! %s-end' % tag

    for start_index, line in srclines:
        if line.strip().startswith(start_tag):
            break

    for end_index, line in srclines:
        if line.strip().endswith(end_tag):
            break

    return (start_index + 1, end_index)


def filter_section(lines, tag):
    filtered_lines = []
    include = True
    tag_text = '#--! %s' % tag
    for line in lines:
        if line.strip().startswith(tag_text):
            include = not include
        elif include:
            filtered_lines.append(line)
    return filtered_lines


def main():
    dirname = os.path.dirname(__file__)
    shutil.copy2(os.path.join(dirname, 'yacc.py'),
                 os.path.join(dirname, 'yacc.py.bak'))
    with open(os.path.join(dirname, 'yacc.py'), 'r') as f:
        lines = f.readlines()

    parse_start, parse_end = get_source_range(lines, 'parsedebug')
    parseopt_start, parseopt_end = get_source_range(lines, 'parseopt')
    parseopt_notrack_start, parseopt_notrack_end = get_source_range(lines, 'parseopt-notrack')

    # Get the original source
    orig_lines = lines[parse_start:parse_end]

    # Filter the DEBUG sections out
    parseopt_lines = filter_section(orig_lines, 'DEBUG')

    # Filter the TRACKING sections out
    parseopt_notrack_lines = filter_section(parseopt_lines, 'TRACKING')

    # Replace the parser source sections with updated versions
    lines[parseopt_notrack_start:parseopt_notrack_end] = parseopt_notrack_lines
    lines[parseopt_start:parseopt_end] = parseopt_lines

    lines = [line.rstrip()+'\n' for line in lines]

    with open(os.path.join(dirname, 'yacc.py'), 'w') as f:
        f.writelines(lines)

    print('Updated yacc.py')


if __name__ == '__main__':
    main()
pycparser-2.18/pycparser/ply/__pycache__/0000775000175000017500000000000013127011712021213 5ustar  elibeneliben00000000000000pycparser-2.18/pycparser/ply/__pycache__/yacc.cpython-36.pyc0000664000175000017500000015114613127011712024570
0ustar  elibeneliben00000000000000
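The `token()` loop in lex.py above matches input against a master regular expression built by `_form_master_re()`, where every rule becomes a named group, and dispatches on the group that matched. The following is a simplified, self-contained sketch of that technique using only the standard-library `re` module; it is not PLY itself, the rule names are illustrative, and it uses `m.lastgroup` where PLY maps `m.lastindex` through an index-to-function table:

```python
import re

# Each rule is (name, pattern); the master regex joins them as named groups,
# mirroring the '(?P<%s>%s)' construction in lex()/_form_master_re().
rules = [('NUMBER', r'\d+'), ('ID', r'[A-Za-z_]\w+'), ('PLUS', r'\+')]
master = re.compile('|'.join('(?P<%s>%s)' % (name, pat) for name, pat in rules))

def tokenize(text):
    pos, out = 0, []
    while pos < len(text):
        if text[pos] in ' \t':          # short-circuit ignored characters, as token() does
            pos += 1
            continue
        m = master.match(text, pos)
        if not m:
            raise SyntaxError('Illegal character %r' % text[pos])
        out.append((m.lastgroup, m.group()))  # the matched group names the rule
        pos = m.end()
    return out

# tokenize('x1 + 42') -> [('ID', 'x1'), ('PLUS', '+'), ('NUMBER', '42')]
```

Because alternation picks the first group that matches at a position, rule order in the master regex matters; this is why `get_rules()` above keeps function rules in definition order and sorts string rules longest-first.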
zLRParser.restartcCsTi|_xH|jjƒD]:\}}t|jƒƒ}t|ƒdkr|ddkr|d|j|<qWdS)Nér)Údefaulted_statesrZÚitemsÚlistÚvaluesr)rÚstateÚactionsZrulesr r r r]9s  zLRParser.set_defaulted_statescCs i|_dS)N)rd)rr r r Údisable_defaulted_states@sz!LRParser.disable_defaulted_statesNFcCsZ|str.t|tƒrttjƒ}|j|||||ƒS|rD|j|||||ƒS|j|||||ƒSdS)N) Ú yaccdevelrAÚintrÚsysÚstderrÚ parsedebugÚparseoptÚparseopt_notrack)rÚinputr<rÚtrackingÚ tokenfuncr r r ÚparseCs  zLRParser.parsec CsÌd}g}|j}|j} |j} |j} tdƒ} d} |jdƒ|sLddlm}|j}|| _|| _ |dk rj|j |ƒ|dkrz|j }n|}||_ g}||_ g}||_ || _d}|jdƒtƒ}d|_|j|ƒd}xþ|jdƒ|jd|ƒ|| kr.|s|sþ|ƒ}n|jƒ}|stƒ}d|_|j}||j|ƒ}n| |}|jd|| ƒ|jd d d jd d „|Dƒdd…ƒt|ƒfjƒƒ|dk rŠ|dkrÄ|j|ƒ|}|jd|ƒ|j|ƒd}| rÊ| d8} qÊ|dkrN| | }|j}|j}tƒ}||_d|_|rB|jd|jddjdd „|| d…Dƒƒd| |d%||ƒn|jd|jg| |d&|ƒ|r’|| dd…}||d<|rÆ|d}|j|_|j|_|d'}t|d|jƒ|_t|d|jƒ|_|| _ yd|| d…=||_!|j"| ƒ|| d…=|jdt#| dƒƒ|j|ƒ| |d(|}|j|ƒWqÊt$k rŒ|j|ƒ|j%|dd)…ƒ|jƒ|d*}d|_d|_|}t&} d|_'YqÊXqÊn¼|r¨|j|_|j|_|g}|| _ yL||_!|j"| ƒ|jdt#| dƒƒ|j|ƒ| |d+|}|j|ƒWqÊt$k rJ|j|ƒ|jƒ|d,}d|_d|_|}t&} d|_'YqÊXqÊ|dkrŠ|d-}t|ddƒ}|jdt#|ƒƒ|jdƒ|S|dkr¼|j(dd d jdd „|Dƒdd…ƒt|ƒfjƒƒ| dksÚ|j'r¤t&} d|_'|}|jdkrød}|j)rB|rt*|dƒ r||_||_!t+|j)||ƒ}|j'r¢|}d}qÊn`|r’t*|dƒr\|j}nd}|r~t,j-j.d ||jfƒnt,j-j.d!|jƒnt,j-j.d"ƒdSnt&} t|ƒdkrÚ|jdkrÚd}d}d}|dd…=qÊ|jdkrêdS|jdkrŒ|d.}|jdkr6|r0t|d|jƒ|_t|d#|jƒ|_d}qÊtƒ}d|_t*|dƒr\|j|_|_t*|d#ƒrv|j|_|_||_|j|ƒ|}qÊ|jƒ}|rª|j|_|j|_|jƒ|d/}qÊt/d$ƒ‚qÊWdS)0NrzPLY: PARSE DEBUG STARTrc)Úlexz$endÚz State : %sz#Defaulted state %s: Reduce using %dz Stack : %sz%s . %sú cSsg|] }|j‘qSr )r )r?Úxxr r r r@±sz'LRParser.parsedebug..z Action : Shift and goto state %sz3Action : Reduce rule [%s] with %s and goto state %dú[ú,cSsg|]}t|jƒ‘qSr )r'r>)r?Z_vr r r r@Ôsú]rMrQz Result : %srFr>zDone : Returning %szPLY: PARSE DEBUG ENDz Error : %scSsg|] }|j‘qSr )r )r?ryr r r r@Bsr<rJz(yacc: Syntax error at line %d, token=%s zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF rPzyacc: internal parser error!!! 
éÿÿÿÿr}r}r}r}r}r}r}r}r}r})0rZr\rXrdr9rrwrvr<r3rrr0r_r`r;rar5r rÚpopÚgetÚjoinr7Úlstriprrr>rJrPrKrMrQr:rhÚcallabler%rUÚextendÚ error_countr^rr2Úhasattrr4rmrnr Ú RuntimeError) rrrr<rrsrtÚ lookaheadÚlookaheadstackrir\ÚprodrdÚpsliceÚ errorcountrvÚ get_tokenr_r`ÚerrtokenrbrhÚltypeÚtÚpÚpnameÚplenÚtargÚt1rBr$ÚtokrJr r r ro\s~        .        $               .           zLRParser.parsedebugc Csvd}g}|j}|j} |j} |j} tdƒ} d} |sBddlm}|j}|| _|| _|dk r`|j |ƒ|dkrp|j }n|}||_ g}||_ g}||_ || _ d}|jdƒtƒ}d|_|j|ƒd}x²|| kr|sò|sÚ|ƒ}n|jƒ}|sòtƒ}d|_|j}||j|ƒ}n| |}|dk rh|dkrN|j|ƒ|}|j|ƒd}| rÀ| d8} qÀ|dkrF| | }|j}|j}tƒ}||_d|_|rž|| dd…}||d<|ræ|d}|j|_|j|_|d}t|d|jƒ|_t|d|jƒ|_|| _yP|| d…=||_|j| ƒ|| d…=|j|ƒ| |d|}|j|ƒWqÀtk r˜|j|ƒ|j|dd…ƒ|jƒ|d}d|_d|_|}t } d|_!YqÀXqÀn¨|r´|j|_|j|_|g}|| _y8||_|j| ƒ|j|ƒ| |d|}|j|ƒWqÀtk rB|j|ƒ|jƒ|d}d|_d|_|}t } d|_!YqÀXqÀ|dkrh|d}t|d dƒ}|S|dkrf| dks„|j!rNt } d|_!|}|jdkr¢d}|j"rì|rÄt#|d ƒ rÄ||_||_t$|j"||ƒ}|j!rL|}d}qÀn`|rr<rJz(yacc: Syntax error at line %d, token=%s zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF rPzyacc: internal parser error!!! 
r}r}r}r}r}r}r}r}r}))rZr\rXrdr9rwrvr<r3rrr0r_r`r;rar5r r~rrrr>rJrPrKrMrQr:rhr‚rUrƒr„r^r2r…r4rmrnr r†) rrrr<rrsrtr‡rˆrir\r‰rdrŠr‹rvrŒr_r`rrbrhrŽrrr‘r’r“r”rBr$r•rJr r r rp·sX                                  zLRParser.parseoptc CsÞd}g}|j}|j} |j} |j} tdƒ} d} |sBddlm}|j}|| _|| _|dk r`|j |ƒ|dkrp|j }n|}||_ g}||_ g}||_ || _ d}|jdƒtƒ}d|_|j|ƒd}x|| kr|sò|sÚ|ƒ}n|jƒ}|sòtƒ}d|_|j}||j|ƒ}n| |}|dk r |dkrN|j|ƒ|}|j|ƒd}| rÀ| d8} qÀ|dkrê| | }|j}|j}tƒ}||_d|_|rX|| dd…}||d<|| _yP|| d…=||_|j| ƒ|| d…=|j|ƒ| |d|}|j|ƒWqÀtk rR|j|ƒ|j|dd…ƒ|jƒ|d}d|_d|_|}t} d|_YqÀXqÀn’|g}|| _y8||_|j| ƒ|j|ƒ| |d|}|j|ƒWqÀtk ræ|j|ƒ|jƒ|d}d|_d|_|}t} d|_YqÀXqÀ|dkr |d}t|ddƒ}|S|dkrÎ| dks(|jròt} d|_|}|jdkrFd}|jr|rht|dƒ rh||_||_t |j||ƒ}|jrð|}d}qÀn`|ràt|d ƒrª|j!}nd}|rÌt"j#j$d ||jfƒnt"j#j$d |jƒnt"j#j$d ƒdSnt} t|ƒdkr(|jdkr(d}d}d}|dd…=qÀ|jdkr8dS|jdkr´|d}|jdkr^d}qÀtƒ}d|_t|d ƒr„|j!|_!|_%t|d ƒrž|j&|_&|_'||_|j|ƒ|}qÀ|jƒ}|jƒ|d}qÀt(dƒ‚qÀWdS)Nrrc)rvz$endrFr>r<rJz(yacc: Syntax error at line %d, token=%s zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF rPzyacc: internal parser error!!! 
r}r}r}r}r}r}r}r}))rZr\rXrdr9rwrvr<r3rrr0r_r`r;rar5r r~rrrr>r:rhr‚rUrƒr„r^rKr2r…r4rJrmrnr rMrPrQr†)rrrr<rrsrtr‡rˆrir\r‰rdrŠr‹rvrŒr_r`rrbrhrŽrrr‘r’r“rBr$r•rJr r r rqés8                                  zLRParser.parseopt_notrack)NNFFN)NNFFN)NNFFN)NNFFN) rrrr r,r.r]rjrurorprqr r r r rVs  ] 4rVz^[a-zA-Z0-9_-]+$c@sReZdZdZddd„Zdd„Zd d „Zd d „Zd d„Zdd„Z dd„Z dd„Z dS)Ú ProductionrÚrightNrwc Cs¨||_t|ƒ|_||_||_d|_||_||_||_t |jƒ|_ g|_ x$|jD]}||j krN|j j |ƒqNWg|_ d|_ |jr˜d|jdj|jƒf|_n d|j|_dS)Nz%s -> %srxz %s -> )rÚtupler‰ÚnumberÚfuncr‚ÚfileÚlineÚprecrÚusymsraÚlr_itemsÚlr_nextr€r7) rr™rr‰Ú precedenceršr›rœr=r r r r s$    zProduction.__init__cCs|jS)N)r7)rr r r r6=szProduction.__str__cCsdt|ƒdS)Nz Production(ú))r7)rr r r r8@szProduction.__repr__cCs t|jƒS)N)rr‰)rr r r rICszProduction.__len__cCsdS)Nrcr )rr r r Ú __nonzero__FszProduction.__nonzero__cCs |j|S)N)r‰)rÚindexr r r rCIszProduction.__getitem__cCsˆ|t|jƒkrdSt||ƒ}yt|j|d|_Wnttfk rRg|_YnXy|j|d|_Wntk r‚d|_YnX|S)Nrc)rr‰ÚLRItemÚ ProdnamesÚlr_afterÚ IndexErrorÚKeyErrorÚ lr_before)rrBrr r r Úlr_itemMs   zProduction.lr_itemcCs|jr||j|_dS)N)ršr‚)rÚpdictr r r Úbind]szProduction.bind©r—r)r®Nrwr) rrrÚreducedr r6r8rIr£rCr«r­r r r r r–s r–c@s,eZdZdd„Zdd„Zdd„Zdd„Zd S) ÚMiniProductioncCs.||_||_||_d|_||_||_||_dS)N)rrršr‚r›rœr7)rr7rrršr›rœr r r r fszMiniProduction.__init__cCs|jS)N)r7)rr r r r6oszMiniProduction.__str__cCs d|jS)NzMiniProduction(%s))r7)rr r r r8rszMiniProduction.__repr__cCs|jr||j|_dS)N)ršr‚)rr¬r r r r­vszMiniProduction.bindN)rrrr r6r8r­r r r r r°es r°c@s$eZdZdd„Zdd„Zdd„ZdS)r¥cCsZ|j|_t|jƒ|_|j|_||_i|_|jj|dƒt|jƒ|_t|jƒ|_|j |_ dS)NÚ.) 
rrfr‰r™Úlr_indexÚ lookaheadsÚinsertr˜rrž)rrrBr r r r ”s   zLRItem.__init__cCs,|jrd|jdj|jƒf}n d|j}|S)Nz%s -> %srxz %s -> )r‰rr€)rr=r r r r6Ÿs zLRItem.__str__cCsdt|ƒdS)NzLRItem(r¢)r7)rr r r r8¦szLRItem.__repr__N)rrrr r6r8r r r r r¥“s r¥cCs:t|ƒd}x(|dkr4|||kr*||S|d8}qWdS)Nrcr)r)ZsymbolsÚ terminalsrFr r r Úrightmost_terminal®s     r¶c@s eZdZdS)Ú GrammarErrorN)rrrr r r r r·¾sr·c@s’eZdZdd„Zdd„Zdd„Zdd„Zd$d d „Zd%dd„Zdd„Z dd„Z dd„Z dd„Z dd„Z dd„Zdd„Zdd„Zd&d d!„Zd"d#„Zd S)'ÚGrammarcCsfdg|_i|_i|_i|_x|D]}g|j|<q Wg|jd<i|_i|_i|_i|_tƒ|_ d|_ dS)Nr) Ú Productionsr¦ÚProdmapÚ TerminalsÚ NonterminalsÚFirstÚFollowÚ PrecedenceÚsetÚUsedPrecedenceÚStart)rrµÚtermr r r r Âs  zGrammar.__init__cCs t|jƒS)N)rr¹)rr r r rIæszGrammar.__len__cCs |j|S)N)r¹)rr¤r r r rCészGrammar.__getitem__cCsL|jdgkstdƒ‚||jkr*td|ƒ‚|dkr:tdƒ‚||f|j|<dS)Nz2Must call set_precedence() before add_production()z,Precedence already specified for terminal %rÚleftr—Únonassocz:Associativity must be one of 'left','right', or 'nonassoc')rÄr—rÅ)r¹ÚAssertionErrorr¿r·)rrÃÚassocÚlevelr r r Úset_precedenceôs   zGrammar.set_precedenceNrwrcCsÀ||jkrtd|||fƒ‚|dkr6td|||fƒ‚tj|ƒsRtd|||fƒ‚x¨t|ƒD]œ\}}|ddkrÐyJt|ƒ}t|ƒdkrštd||||fƒ‚||jkr®g|j|<|||<w\Wntk rÎYnXtj|ƒ r\|d kr\td ||||fƒ‚q\Wd |krˆ|dd kr$td ||fƒ‚|dd krBtd ||fƒ‚|d} |jj | ƒ} | sptd||| fƒ‚n |j j | ƒ|dd…=nt ||jƒ} |jj | dƒ} d||f} | |j kræ|j | } td||| fd| j| jfƒ‚t|jƒ} ||jkrg|j|<xR|D]J}||jkr.|j|j| ƒn&||jkrDg|j|<|j|j| ƒq Wt| ||| |||ƒ}|jj|ƒ||j | <y|j|j|ƒWn"tk rº|g|j|<YnXdS)Nz7%s:%d: Illegal rule name %r. Already defined as a tokenrz5%s:%d: Illegal rule name %r. error is a reserved wordz%s:%d: Illegal rule name %rrz'"rczA%s:%d: Literal token %s in rule %r may only be a single characterz%precz!%s:%d: Illegal name %r in rule %rz+%s:%d: Syntax error. Nothing follows %%precézH%s:%d: Syntax error. %%prec can only appear at the end of a grammar rulez/%s:%d: Nothing known about the precedence of %rr—z%s -> %sz%s:%d: Duplicate rule %s. 
zPrevious definition at %s:%dr}éþÿÿÿr}rË)r—r)r»r·Ú_is_identifierÚmatchÚ enumerateÚevalrrUr¿rrÁÚaddr¶rºr›rœr¹r¼rar–r¦r©)rÚprodnameÚsymsršr›rœrBr=ÚcZprecnameZprodprecÚmapÚmZpnumberrrr r r Úadd_production sp                        zGrammar.add_productioncCsT|s|jdj}||jkr&td|ƒ‚tdd|gƒ|jd<|j|jdƒ||_dS)Nrczstart symbol %s undefinedrzS')r¹rr¼r·r–rarÂ)rÚstartr r r Ú set_startas   zGrammar.set_startcs>‡‡‡fdd„‰tƒ‰ˆˆjdjdƒ‡fdd„ˆjDƒS)NcsJ|ˆkr dSˆj|ƒx.ˆjj|gƒD]}x|jD] }ˆ|ƒq2Wq&WdS)N)rÐr¦rr‰)r=rr")Úmark_reachable_fromÚ reachablerr r rÙts   z5Grammar.find_unreachable..mark_reachable_fromrcsg|]}|ˆkr|‘qSr r )r?r=)rÚr r r@~sz,Grammar.find_unreachable..)rÀr¹r‰r¼)rr )rÙrÚrr Úfind_unreachableqszGrammar.find_unreachablec Csøi}x|jD] }d||<q Wd|d<x|jD] }d||<q,Wxpd}x`|jjƒD]R\}}xH|D]@}x |jD]}||shd}PqhWd}|r\||sšd||<d}Pq\WqNW|s>Pq>Wg} x@|jƒD]4\}} | s¼||jkræ||jkræ|dkræq¼| j|ƒq¼W| S)NTz$endFr)r»r¼r¦rer‰ra) rZ terminatesrrBÚ some_changeZplrr=Z p_terminatesÚinfiniterÃr r r Úinfinite_cyclesˆs:       zGrammar.infinite_cyclescCsXg}xN|jD]D}|sq x8|jD].}||jkr||jkr|dkr|j||fƒqWq W|S)Nr)r¹r‰r¦r»ra)rr$rr=r r r Úundefined_symbolsÈs  zGrammar.undefined_symbolscCs8g}x.|jjƒD] \}}|dkr| r|j|ƒqW|S)Nr)r»rera)rZ unused_tokr=rDr r r Úunused_terminalsÙs zGrammar.unused_terminalscCs<g}x2|jjƒD]$\}}|s|j|d}|j|ƒqW|S)Nr)r¼rer¦ra)rZ unused_prodr=rDrr r r Ú unused_rulesès zGrammar.unused_rulescCsDg}x:|jD]0}||jkp"||jks |j||j|dfƒq W|S)Nr)r¿r»rÁra)rZunusedZtermnamer r r Úunused_precedenceùs  zGrammar.unused_precedencecCs`g}xV|D]D}d}x2|j|D]$}|dkr0d}q||kr|j|ƒqW|rLq Pq W|jdƒ|S)NFzT)r½ra)rZbetar$ÚxZx_produces_emptyrr r r Ú_first s  zGrammar._firstcCsÀ|jr |jSx|jD]}|g|j|<qWdg|jd<x|jD]}g|j|<qT) r¾r½rår¼r¹rrÎr‰rärar) rr×ÚkÚdidaddrrFÚBZfstZhasemptyrr r r Úcompute_followQs<     zGrammar.compute_followcCsÔxÎ|jD]Ä}|}d}g}x¬|t|ƒkr,d}ntt||ƒ}y|j|j|d|_Wnttfk rng|_YnXy|j|d|_Wntk ržd|_YnX||_ |s¬P|j |ƒ|}|d7}qW||_ qWdS)Nrrc) r¹rr¥r¦r‰r§r¨r©rªr rarŸ)rrZlastlrirFrŸZlrir r r Ú build_lritemsŒs.       
zGrammar.build_lritems)Nrwr)N)N)rrrr rIrCrÉrÖrØrÛrÞrßràrárârärårérêr r r r r¸Ás $  T @#% ;r¸c@s eZdZdS)Ú VersionErrorN)rrrr r r r rë°srëc@s,eZdZdd„Zdd„Zdd„Zdd„Zd S) ÚLRTablecCsd|_d|_d|_d|_dS)N)rYr[rWÚ lr_method)rr r r r ´szLRTable.__init__cCs~t|tjƒr|}ntd|ƒtj|}|jtkr:tdƒ‚|j |_ |j |_ g|_ x|jD]}|j jt|ŽƒqXW|j|_|jS)Nz import %sz&yacc table file version is out of date)rAÚtypesÚ ModuleTypeÚexecrmÚmodulesZ _tabversionÚ__tabversion__rëZ _lr_actionrYZ_lr_gotor[rWZ_lr_productionsrar°Z _lr_methodríZ _lr_signature)rÚmodulerrr r r Ú read_tableºs     zLRTable.read_tablec CsÆy ddl}Wntk r(ddl}YnXtjj|ƒs:t‚t|dƒ}|j|ƒ}|tkr^t dƒ‚|j|ƒ|_ |j|ƒ}|j|ƒ|_ |j|ƒ|_ |j|ƒ}g|_ x|D]}|j jt|Žƒq¢W|jƒ|S)NrÚrbz&yacc table file version is out of date)ÚcPickleÚ ImportErrorÚpickleÚosÚpathÚexistsÚopenÚloadròrërírYr[rWrar°Úclose)rÚfilenamerøZin_fZ tabversionÚ signaturerXrr r r Ú read_pickleÎs(          zLRTable.read_picklecCsx|jD]}|j|ƒqWdS)N)rWr­)rr¬rr r r Úbind_callablesês zLRTable.bind_callablesN)rrrr rôrrr r r r rì³srìc CsTi}x|D] }d||<q Wg}i}x,|D]$}||dkr(t|||||||ƒq(W|S)Nr)Útraverse)ÚXÚRÚFPÚNrãr;ÚFr r r Údigraphs    r c Cs |j|ƒt|ƒ}|||<||ƒ||<||ƒ}xr|D]j} || dkrXt| ||||||ƒt|||| ƒ||<x.|j| gƒD]} | ||kr|||j| ƒq|Wq4W|||krt||d<||||d<|jƒ} x2| |krt||d<||||d<|jƒ} qÖWdS)Nrrcr}r}r}r})rarrÚminrÚMAXINTr~) rãrr;rrrrÚdÚrelÚyÚaÚelementr r r rs(        rc@s eZdZdS)Ú LALRErrorN)rrrr r r r r)src@s’eZdZd$dd„Zdd„Zdd„Zd d „Zd d „Zd d„Zdd„Z dd„Z dd„Z dd„Z dd„Z dd„Zdd„Zdd„Zd%d d!„Zd&d"d#„ZdS)'ÚLRGeneratedTablerNcCsž|dkrtd|ƒ‚||_||_|s*tƒ}||_i|_i|_|j|_i|_ i|_ d|_ d|_ d|_ g|_g|_g|_|jjƒ|jjƒ|jjƒ|jƒdS)NÚSLRrzUnsupported method %sr)rr)rÚgrammarrírÚlogrYr[r¹rWÚ lr_goto_cacheÚ lr0_cidhashÚ _add_countZ sr_conflictZ rr_conflictZ conflictsÚ sr_conflictsÚ rr_conflictsrêråréÚlr_parse_table)rrÚmethodrr r r r 4s,    zLRGeneratedTable.__init__cCsz|jd7_|dd…}d}xV|rtd}xH|D]@}x:|jD]0}t|ddƒ|jkrRq:|j|jƒ|j|_d}q:Wq.Wq W|S)NrcTFÚ lr0_addedr)rr§rKrar r)rÚIÚJrçrGrãr r r Ú lr0_closureYs    zLRGeneratedTable.lr0_closurec 
CsÔ|jjt|ƒ|fƒ}|r|S|jj|ƒ}|s:i}||j|<g}xP|D]H}|j}|rD|j|krD|jt|ƒƒ}|s~i}||t|ƒ<|j|ƒ|}qDW|jdƒ}|s¾|r¶|j|ƒ}||d<n||d<||jt|ƒ|f<|S)Nz$end)rrr!r rªrar ) rrrãÚgr=ZgsrrBÚs1r r r Úlr0_gotoss2        zLRGeneratedTable.lr0_gotoc Csà|j|jjdjgƒg}d}x"|D]}||jt|ƒ<|d7}q"Wd}x–|t|ƒkrÚ||}|d7}i}x$|D]}x|jD] }d||<qxWqlWxJ|D]B}|j||ƒ}| s’t|ƒ|jkr¸q’t|ƒ|jt|ƒ<|j |ƒq’WqFW|S)Nrrc) r rr¹r rr!rržr#ra) rÚCrFrZasymsÚiir=rãr!r r r Ú lr0_items–s(      zLRGeneratedTable.lr0_itemscCs‚tƒ}d}xrxV|jjdd…D]B}|jdkr:|j|jƒqx$|jD]}||krBPqBW|j|jƒqWt|ƒ|krrPt|ƒ}q W|S)Nrrc)rÀrr¹rrÐrr‰)rÚnullableZ num_nullablerrr r r Úcompute_nullable_nonterminalsÎs     z.LRGeneratedTable.compute_nullable_nonterminalscCsrg}xht|ƒD]\\}}xR|D]J}|j|jdkr||j|jdf}|d|jjkr||kr|j|ƒqWqW|S)Nrc)rÎr²rr‰rr¼ra)rr$ÚtransZstatenorhrrr r r Úfind_nonterminal_transitionsës z-LRGeneratedTable.find_nonterminal_transitionsc Cs˜i}|\}}g}|j|||ƒ}xJ|D]B} | j| jdkr&| j| jd} | |jjkr&| |kr&|j| ƒq&W|dkr”||jjdjdkr”|jdƒ|S)Nrcrz$end)r#r²rr‰rr»rar¹) rr$r)r'Zdr_setrhrÚtermsr!rrr r r Ú dr_relationÿs   zLRGeneratedTable.dr_relationc Csvg}|\}}|j|||ƒ}|jjt|ƒdƒ}xB|D]:} | j| jdkr4| j| jd} | |kr4|j|| fƒq4W|S)Nrcr})r#rrr!r²rr‰ra) rr$r)Úemptyr rhrr!rGrrr r r Úreads_relation s zLRGeneratedTable.reads_relationcCsÞi}i}i}x|D] }d||<qWx°|D]¦\}} g} g} xR||D]D} | j| krZqH| j} |}x¦| | jdkr | d} | j| }||f|kræ| d}xH|| jkrÖ| j||jjkr¼P| j||krÌP|d}qžW| j||fƒ|j|||ƒ}|jj t |ƒdƒ}qfWx€||D]t}|j| jkr,q|j| jkr>qd}xD||jkrx|j|| j|dkrlP|d}qDW| j||fƒqWqHWx2| D]*}||kr®g||<||j|| fƒq˜W| ||| f<q*W||fS)Nrcrr}) rr²rr‰rr»rar#rrr!)rr$r)r'ZlookdictZ includedictZdtransrrhrZlookbZincludesrr²rGZlir!r"rFr r r Úcompute_lookback_includesC sX         z*LRGeneratedTable.compute_lookback_includescs0‡‡‡fdd„}‡‡‡fdd„}t|||ƒ}|S)Ncsˆjˆ|ˆƒS)N)r,)rã)r$r'rr r Ú” sz4LRGeneratedTable.compute_read_sets..csˆjˆ|ˆƒS)N)r.)rã)r$r'rr r r0• s)r )rr$Úntransr'rrrr )r$r'rr Úcompute_read_sets“ s z"LRGeneratedTable.compute_read_setscs(‡fdd„}‡fdd„}t|||ƒ}|S)Ncsˆ|S)Nr )rã)Úreadsetsr r r0ª sz6LRGeneratedTable.compute_follow_sets..cs ˆj|gƒS)N)r)rã)Úinclsetsr r 
r0« s)r )rr1r3r4rrrr )r4r3r Úcompute_follow_sets© s   z$LRGeneratedTable.compute_follow_setsc Csxxr|jƒD]f\}}x\|D]T\}}||jkr4g|j|<|j|gƒ}x*|D]"}||j|krF|j|j|ƒqFWqWq WdS)N)rer³rra) rZ lookbacksZ followsetr)Zlbrhrrrr r r Úadd_lookaheads» s    zLRGeneratedTable.add_lookaheadscCsP|jƒ}|j|ƒ}|j|||ƒ}|j|||ƒ\}}|j|||ƒ}|j||ƒdS)N)r(r*r2r/r5r6)rr$r'r)r3ZlookdZincludedZ followsetsr r r Úadd_lalr_lookaheadsÍ s  z$LRGeneratedTable.add_lalr_lookaheadsc$ Cs8|jj}|jj}|j}|j}|j}i}|jd|jƒ|jƒ}|jdkrP|j |ƒd}xÜ|D]Ò} g} i} i} i} |jdƒ|jd|ƒ|jdƒx| D]}|jd|j |ƒq˜W|jdƒx| D]ú}|j |j dkr(|j dkrød| d <|| d <q¾|jdkr|j|}n|jj|j }xœ|D]ú}| j||d |j |ffƒ| j|ƒ}|dk rø|dkr>|j|dƒ\}}||j j\}}||ks¢||krú|d krú|j | |<|| |<| rä| rä|jd |ƒ|jj||dfƒ||j jd7_nB||kr|dkrd| |<n$|sö|jd|ƒ|jj||dfƒn¸|dkrê|| }||j }|j|jkr²|j | |<|| |<||}}||j jd7_||j jd8_n ||}}|jj|||fƒ|jd|| |j | |ƒn td|ƒ‚n(|j | |<|| |<||j jd7_q&WqÂ|j }|j|d}||jjkrÂ|j| |ƒ}|jjt|ƒdƒ}|dkrÂ| j||d|fƒ| j|ƒ}|dk r®|dkr¸||kr¬td|ƒ‚nô|dkr |j|dƒ\}}|| |j j\}}||ks||krR|d krR|| |j jd8_|| |<|| |<|sž|jd|ƒ|jj||dfƒnL||krp|dkrpd| |<n.| r¬| r¬|jd |ƒ|jj||dfƒn td|ƒ‚qÂ|| |<|| |<qÂWi}xF| D]>\}}}|| krÌ|| |krÌ|jd||ƒd|||f<qÌW|jdƒd}xX| D]P\}}}|| kr"|| |k r"||f|kr"|jd||ƒd}d|||f<q"W|r†|jdƒi} x6| D].}!x&|!jD]}"|"|jjkrœd| |"<qœWqWxL| D]D}#|j| |#ƒ}|jjt|ƒdƒ}|dkrÈ|| |#<|jd|#|ƒqÈW| ||<| ||<| ||<|d7}q\WdS)NzParsing method: %srrrwzstate %dz (%d) %srczS'z$endzreduce using rule %d (%s)r—rÄz3 ! shift/reduce conflict for %s resolved as reduceÚreducerÅz2 ! shift/reduce conflict for %s resolved as shiftZshiftz= ! reduce/reduce conflict for %s resolved using rule %d (%s)zUnknown conflict in state %dzshift and go to state %dz Shift/shift conflict in state %dz %-15s %sz ! 
%-15s [ %s ]z" %-30s shift and go to state %d)r—rr})r—rr}) rr¹r¿r[rYrrrír&r7r™rr²rr³r¾rarrrr¯rœrrr‰r»r#rr!rržr¼)$rr¹r¿r\rZrZactionpr$ÚstrZactlistZ st_actionZ st_actionpZst_gotorZlaheadsrr"ZsprecZslevelZrprecZrlevelZoldpZppZchosenpZrejectprFr!rGZ _actprintrÕZnot_usedZnkeysr%r=rBr r r rå s                                             zLRGeneratedTable.lr_parse_tablerwcCsÎt|tjƒrtdƒ‚|jdƒd}tjj||ƒd}ylt|dƒ}|j dtjj |ƒt |j |fƒd}|rti}xf|j jƒD]X\} } xN| jƒD]B\} } |j| ƒ} | s´ggf} | || <| dj| ƒ| dj| ƒqŽWq|W|j dƒxz|jƒD]n\}} |j d |ƒx | dD]} |j d | ƒq W|j d ƒx | dD]} |j d | ƒq8W|j d ƒqìW|j d ƒ|j dƒnJ|j dƒx4|j jƒD]&\}} |j d|d|d| fƒqŠW|j d ƒ|rÔi}xl|jjƒD]^\} } xR| jƒD]F\} } |j| ƒ} | sggf} | || <| dj| ƒ| dj| ƒqæWqÔW|j dƒx||jƒD]p\}} |j d |ƒx | dD]} |j d | ƒqjW|j d ƒx | dD]} |j d | ƒq–W|j d ƒqJW|j d ƒ|j dƒnJ|j dƒx4|jjƒD]&\}} |j d|d|d| fƒqêW|j d ƒ|j dƒxd|jD]Z}|jrl|j d|j|j|j|jtjj |jƒ|jfƒn|j dt|ƒ|j|jfƒq0W|j dƒ|jƒWn&tk rÈ}z‚WYdd}~XnXdS)Nz"Won't overwrite existing tabmoduler±rcz.pyÚwzu # %s # This file is automatically generated. Do not edit. 
_tabversion = %r _lr_method = %r _lr_signature = %r rz _lr_action_items = {z%r:([z%r,z],[z]),z} z _lr_action = {} for _k, _v in _lr_action_items.items(): for _x,_y in zip(_v[0],_v[1]): if not _x in _lr_action: _lr_action[_x] = {} _lr_action[_x][_k] = _y del _lr_action_items z _lr_action = { z (%r,%r):%r,z _lr_goto_items = {z¹ _lr_goto = {} for _k, _v in _lr_goto_items.items(): for _x, _y in zip(_v[0], _v[1]): if not _x in _lr_goto: _lr_goto[_x] = {} _lr_goto[_x][_k] = _y del _lr_goto_items z _lr_goto = { z_lr_productions = [ z (%r,%r,%d,%r,%r,%d), z (%r,%r,%d,None,None,None), z] r})rArîrïÚIOErrorÚsplitrùrúr€rür ÚbasenameròrírYrerrar[rWršr7rrr›rœrþ)rÚ tabmoduleÚ outputdirrZbasemodulenamerÿrZsmallerrer=ZndrrDrFrærÚer r r Ú write_table¦ sŽ       "      "   "  zLRGeneratedTable.write_tablecCsy ddl}Wntk r(ddl}YnXt|dƒÄ}|jt|tƒ|j|j|tƒ|j||tƒ|j|j|tƒ|j|j |tƒg}x^|j D]T}|j rÄ|j |j |j|j|j tjj|jƒ|jfƒqŽ|j t |ƒ|j|jdddfƒqŽW|j||tƒWdQRXdS)NrÚwb)rör÷rørüÚdumpròÚpickle_protocolrírYr[rWršrar7rrrùrúr=r›rœ)rrÿrrøZoutfZoutprr r r Ú pickle_table s    ,"zLRGeneratedTable.pickle_table)rN)rwrw)rw)rrrr r r#r&r(r*r,r.r/r2r5r6r7rrArEr r r r r3s" %#8+PB zrcCs0tj|ƒ}|jjƒ}|j|jkr,|j|jƒ|S)N)rmÚ _getframeÚ f_globalsÚcopyÚf_localsÚupdate)ZlevelsrZldictr r r Úget_caller_module_dictC s     rKc Csg}|jƒ}d}|}xê|D]â}|d7}|jƒ}|s4qyˆ|ddkrh|sVtd||fƒ‚|} |dd…} n@|d} | }|dd…} |d} | dkr¨| dkr¨td||fƒ‚|j||| | fƒWqtk rÒ‚Yqtk rútd |||jƒfƒ‚YqXqW|S) Nrcrú|z%s:%d: Misplaced '|'rÊú:z::=z!%s:%d: Syntax error. 
Expected ':'z%s:%d: Syntax error in rule %r)Ú splitlinesr<rUraÚ ExceptionÚstrip) Údocr›rœrZpstringsZlastpZdlineZpsrrÑrÒZassignr r r Ú parse_grammarO s6    rRc@s†eZdZd dd„Zdd„Zdd„Zdd „Zd d „Zd d „Zdd„Z dd„Z dd„Z dd„Z dd„Z dd„Zdd„Zdd„Zdd„ZdS)!Ú ParserReflectNcCsL||_d|_d|_d|_tƒ|_g|_d|_|dkrBtt j ƒ|_ n||_ dS)NF) r¬r×Ú error_funcÚtokensrÀrñrrrrmrnr)rr¬rr r r r y szParserReflect.__init__cCs,|jƒ|jƒ|jƒ|jƒ|jƒdS)N)Ú get_startÚget_error_funcÚ get_tokensÚget_precedenceÚget_pfunctions)rr r r Úget_allˆ s zParserReflect.get_allcCs6|jƒ|jƒ|jƒ|jƒ|jƒ|jƒ|jS)N)Úvalidate_startÚvalidate_error_funcÚvalidate_tokensÚvalidate_precedenceÚvalidate_pfunctionsÚvalidate_modulesr)rr r r Ú validate_all szParserReflect.validate_allc Csžg}yv|jr|j|jƒ|jr:|jdjdd„|jDƒƒƒ|jrR|jdj|jƒƒx$|jD]}|drZ|j|dƒqZWWnttfk r’YnXdj|ƒS)NrwcSsg|]}dj|ƒ‘qS)rw)r€)r?rr r r r@  sz+ParserReflect.signature..rxr)r×rarr€rUÚpfuncsÚ TypeErrorÚ ValueError)rÚpartsrr r r rš s  zParserReflect.signaturec Cs¸tjdƒ}x¨|jD]ž}ytj|ƒ\}}Wntk r>wYnXi}xjt|ƒD]^\}}|d7}|j|ƒ}|rN|jdƒ}|j |ƒ} | sŽ|||<qNtj |ƒ} |j j d| ||| ƒqNWqWdS)Nz\s*def\s+(p_[a-zA-Z_0-9]*)\(rcz;%s:%d: Function %s redefined. 
Previously defined on line %d) ÚreÚcompilerñÚinspectZgetsourcelinesr;rÎrÍÚgrouprÚ getsourcefilerr) rZfreróÚlinesZlinenZ counthashrœrÕrÚprevrÿr r r raµ s$       zParserReflect.validate_modulescCs|jjdƒ|_dS)Nr×)r¬rr×)rr r r rVÎ szParserReflect.get_startcCs&|jdk r"t|jtƒs"|jjdƒdS)Nz'start' must be a string)r×rAÚ string_typesrr)rr r r r\Ò s  zParserReflect.validate_startcCs|jjdƒ|_dS)NÚp_error)r¬rrT)rr r r rWØ szParserReflect.get_error_funccCs |jrœt|jtjƒrd}n*t|jtjƒr.d}n|jjdƒd|_dS|jjj}|jjj }t j |jƒ}|j j |ƒ|jjj|}|dkrœ|jjd||ƒd|_dS)Nrrcz2'p_error' defined, but is not a function or methodTz$%s:%d: p_error() requires 1 argument)rTrArîÚ FunctionTypeÚ MethodTyperrÚ__code__Úco_firstlinenoÚ co_filenameriÚ getmodulerñrÐÚ co_argcount)rZismethodZelineZefileróZargcountr r r r]Ü s      z!ParserReflect.validate_error_funccCsn|jjdƒ}|s&|jjdƒd|_dSt|ttfƒsJ|jjdƒd|_dS|sd|jjdƒd|_dS||_dS)NrUzNo token list is definedTztokens must be a list or tupleztokens is empty)r¬rrrrArfr˜rU)rrUr r r rXò s    zParserReflect.get_tokenscCsZd|jkr |jjdƒd|_dStƒ}x.|jD]$}||krH|jjd|ƒ|j|ƒq.WdS)Nrz.Illegal token name 'error'. Is a reserved wordTzToken %r multiply defined)rUrrrÀrrÐ)rrµrBr r r r^ s   zParserReflect.validate_tokenscCs|jjdƒ|_dS)Nr¡)r¬rr)rr r r rY szParserReflect.get_precedencecCsg}|jrt|jttfƒs2|jjdƒd|_dSxÐt|jƒD]Â\}}t|ttfƒsj|jjdƒd|_dSt|ƒdkrŽ|jjd|ƒd|_dS|d}t|tƒs¶|jjdƒd|_dSxH|dd…D]8}t|tƒsè|jjd ƒd|_dS|j |||dfƒqÄWq>W||_ dS) Nz"precedence must be a list or tupleTzBad precedence tablerÊz?Malformed precedence entry %s. 
Must be (assoc, term, ..., term)rz)precedence associativity must be a stringrcz precedence items must be strings) rrArfr˜rrrÎrrnraÚpreclist)rrwrÈrrÇrÃr r r r_ s6       z!ParserReflect.validate_precedencecCsŒg}xl|jjƒD]^\}}|jdƒ s|dkr.qt|tjtjfƒrt|d|jj ƒ}t j |ƒ}|j ||||j fƒqW|jdd„d||_dS)NÚp_rorscSs |dt|dƒ|d|dfS)NrrcrÊr)r7)Z p_functionr r r r0D s z.ParserReflect.get_pfunctions..)Úkey)r¬reÚ startswithrArîrprqrKrrrsriruraÚ__doc__Úsortrc)rZ p_functionsrÚitemrœrór r r rZ7 s zParserReflect.get_pfunctionscCs^g}t|jƒdkr(|jjdƒd|_dSx"|jD]\}}}}tj|ƒ}|j|}t|tj ƒrfd}nd}|j j |kr’|jjd|||j ƒd|_q2|j j |krº|jjd|||j ƒd|_q2|j sÖ|jjd|||j ƒq2y,t|||ƒ} x| D]} |j|| fƒqêWWn:tk r<} z|jjt| ƒƒd|_WYdd} ~ XnX|jj|ƒq2Wx|jjƒD]ô\} } | jd ƒrˆt| tjtj fƒrˆq\| jd ƒr˜q\| jd ƒr¼| d kr¼|jjd | ƒt| tjƒrØ| j j dksöt| tj ƒr\| jj j dkr\| j r\y8| j jd ƒ}|ddkr4|jjd| j j| j j| ƒWntk rLYnXq\W||_dS)Nrz+no rules of the form p_rulename are definedTrÊrcz%%s:%d: Rule %r has too many argumentsz#%s:%d: Rule %r requires an argumentzA%s:%d: No documentation string specified in function %r (ignored)rxZt_roz%r not defined as a functionrxrMz9%s:%d: Possible grammar rule %r defined without p_ prefix)rrcrrrirkr¬rArîrqrrrvrr{rrRrarUr7rñrÐrerzrpÚ__func__r<rtrsr¨r)rrrœrórrQr›ršZreqargsZparsed_gr!r@rBrDr r r r`L s\            z!ParserReflect.validate_pfunctions)N)rrrr r[rbrrarVr\rWr]rXr^rYr_rZr`r r r r rSx s  rSc <Osd |dkr t}| rd}| dkr&ttjƒ} ˆrf‡fdd„tˆƒDƒ} t| ƒ}d|krntj|dj|d<ntdƒ}| dkrât |t j ƒrŠ|j}nLd|krœ|d}n:|j dƒ}dj |dd6…ƒ}td |ƒttj|dd ƒ}tjj|ƒ} |jd ƒ}|oøt |tƒrd|kr|d|}|dk r$||d <t|| d }|jƒ|jrHtdƒ‚|jƒ}yŠtƒ}| rj|j| ƒ}n |j|ƒ}|s„||krØy"|j|jƒt||j ƒ}|j!a!|St"k rÖ}z| j#d|ƒWYdd}~XnXWnFt$k r }z| j#t|ƒƒWYdd}~Xnt%k r YnX| dkr”|rŽytt&tjj | |ƒdƒƒ} Wn<t'k rŠ}z| j#d||fƒt(ƒ} WYdd}~XnXnt(ƒ} | j)dt*ƒd}|j+ƒr¶tdƒ‚|j sÈ| j#dƒt,|j-ƒ}xZ|j.D]P\}}}y|j/|||ƒWn0t0k r&}z| j#d|ƒWYdd}~XnXqÚWxl|j1D]b\}}|\} }!}"}#y|j2|"|#|| |!ƒWn4t0k r”}z| jd|ƒd}WYdd}~XnXq6Wy&|dkr¶|j3|j4ƒn |j3|ƒWn6t0k rø}z| jt|ƒƒd}WYdd}~XnX|rtdƒ‚|j5ƒ}$x*|$D]"\}%}&| jd|&j6|&j7|%ƒd}qW|j8ƒ}'|'r| j)d ƒ| j)dƒ| 
j)d ƒx&|'D]}| j#d|ƒ| j)d|ƒqnW|rÜ| j)d ƒ| j)dƒ| j)d ƒx&t9|j:ƒD]\}(})| j)d|(|)ƒqÀW|j;ƒ}*x$|*D]}&| j#d|&j6|&j7|&j<ƒqêWt=|'ƒdkr"| j#dƒt=|'ƒdkr@| j#dt=|'ƒƒt=|*ƒdkrX| j#d ƒt=|*ƒdkrv| j#d!t=|*ƒƒ|rN| j)d ƒ| j)d"ƒ| j)d ƒt>|j?ƒ}+|+j@ƒx2|+D]*}| j)d#|d$j d%d„|j?|Dƒƒƒq²W| j)d ƒ| j)d&ƒ| j)d ƒt>|jAƒ},|,j@ƒx2|,D]*}-| j)d#|-d$j d'd„|jA|-DƒƒƒqW| j)d ƒ|r |jBƒ}.x|.D]}/| j#d(|/ƒqbW|jCƒ}0x|0D]}1| jd)|1ƒd}q†W|jDƒ}2x$|2D]\}}| jd*||ƒd}q®W|rÜtdƒ‚|rî| jEd+|ƒtF||| ƒ}|rlt=|jGƒ}3|3dkr | j#d,ƒn|3dkr6| j#d-|3ƒt=|jHƒ}4|4dkrV| j#d.ƒn|4dkrl| j#d/|4ƒ|r¤|jGs‚|jHr¤| j#d ƒ| j#d0ƒ| j#d ƒx&|jGD]\}5}6}7| j#d1|6|5|7ƒq¨WtIƒ}8x‚|jHD]x\}5}9}:|5tJ|9ƒtJ|:ƒf|8krüqÖ| j#d2|5|9ƒ| j#d3|:|5ƒ| j#d2|5|9ƒ| j#d3|:|5ƒ|8jK|5tJ|9ƒtJ|:ƒfƒqÖWg};xL|jHD]B\}5}9}:|:jL r^|:|;kr^| j#d4|:ƒ| j#d4|:ƒ|;jM|:ƒq^W|rôy|jN|| |ƒWn6t'k rò}z| j#d5||fƒWYdd}~XnX| rBy|jO| |ƒWn6t'k r@}z| j#d5| |fƒWYdd}~XnX|j|jƒt||j ƒ}|j!a!|S)7Nrcsg|]}|tˆ|ƒf‘qSr )rK)r?ræ)rór r r@¡ szyacc..Ú__file__rrÊr±rcz import %srwÚ __package__r×)rzUnable to build parserz.There was a problem loading the table file: %rr:zCouldn't open %r. 
%sz5Created by PLY version %s (http://www.dabeaz.com/ply)Fz no p_error() function is definedz%sTz;%s:%d: Symbol %r used, but not defined as a token or a rulezUnused terminals:zToken %r defined, but not usedz %sr¸z Rule %-5d %sz$%s:%d: Rule %r defined, but not usedzThere is 1 unused tokenzThere are %d unused tokenszThere is 1 unused rulezThere are %d unused rulesz'Terminals, with rules where they appearz %-20s : %srxcSsg|] }t|ƒ‘qSr )r7)r?r=r r r r@G sz*Nonterminals, with rules where they appearcSsg|] }t|ƒ‘qSr )r7)r?r=r r r r@O szSymbol %r is unreachablez)Infinite recursion detected for symbol %rz0Precedence rule %r defined for unknown symbol %rzGenerating %s tablesz1 shift/reduce conflictz%d shift/reduce conflictsz1 reduce/reduce conflictz%d reduce/reduce conflictsz Conflicts:z7shift/reduce conflict for %s in state %d resolved as %sz;reduce/reduce conflict in state %d resolved using rule (%s)zrejected rule (%s) in state %dzRule (%s) is never reducedzCouldn't create %r. %sr})PÚ tab_modulerrmrnÚdirÚdictrñrrKrArîrïr<r€rðrKrùrúÚdirnamerr7rSr[rrrrìrrôrr¬rVrTrurOrrër÷rür;rrÚ __version__rbr¸rUrwrÉr·rrÖrØr×rßr›rœràrÎr¹rárrrfr»r|r¼rÛrÞrârrrrrÀr!rÐr¯rarArE)r×Zcheck_recursionÚoptimizeZ write_tablesZ debugfiler?ZdebuglogZerrorlogZ picklefileZ_itemsr¬ZsrcfilerfZpkgnameZpkgZpinforZlrZread_signaturer3r@ÚerrorsrrÃrÇrÈÚfuncnameZgramr›rœrÑrÒrßrbr‰ràrBrrár+ZnontermsZnontermZ unreachableÚurÝÚinfZ unused_precZnum_srZnum_rrrhr•Z resolutionZalready_reportedÚruleZrejectedZ warned_neverr )rór ÚyaccŽ s„               "     $                    *     *                       $$  rŒ)>s’   4m H.rT   ) pycparser-2.18/pycparser/ply/__pycache__/lex.cpython-36.pyc0000664000175000017500000005167213127011712024444 0ustar elibeneliben000000000000003 ßQâX¦§ã@s<dZdZddlZddlZddlZddlZddlZddlZyejej fZ Wne k rde e fZ YnXejdƒZGdd„deƒZGdd„deƒZGdd „d eƒZGd d „d eƒZGd d „d ƒZdd„Zdd„Zdd„Zdd„Zdd„Zdd„ZGdd„deƒZdddddeejƒddddf dd„Z d$d 
d!„Z!d"d#„Z"e"Z#dS)%z3.10éNz^[a-zA-Z0-9_]+$c@seZdZdd„ZdS)ÚLexErrorcCs|f|_||_dS)N)ÚargsÚtext)ÚselfÚmessageÚs©rú../pycparser/ply/lex.pyÚ__init__:szLexError.__init__N)Ú__name__Ú __module__Ú __qualname__r rrrr r9src@seZdZdd„Zdd„ZdS)ÚLexTokencCsd|j|j|j|jfS)NzLexToken(%s,%r,%d,%d))ÚtypeÚvalueÚlinenoÚlexpos)rrrr Ú__str__AszLexToken.__str__cCst|ƒS)N)Ústr)rrrr Ú__repr__DszLexToken.__repr__N)r r r rrrrrr r@src@s4eZdZdd„Zdd„Zdd„Zdd„ZeZeZd S) Ú PlyLoggercCs ||_dS)N)Úf)rrrrr r LszPlyLogger.__init__cOs|jj||dƒdS)NÚ )rÚwrite)rÚmsgrÚkwargsrrr ÚcriticalOszPlyLogger.criticalcOs|jjd||dƒdS)Nz WARNING: r)rr)rrrrrrr ÚwarningRszPlyLogger.warningcOs|jjd||dƒdS)NzERROR: r)rr)rrrrrrr ÚerrorUszPlyLogger.errorN) r r r r rrrÚinfoÚdebugrrrr rKs rc@seZdZdd„Zdd„ZdS)Ú NullLoggercCs|S)Nr)rÚnamerrr Ú__getattribute__^szNullLogger.__getattribute__cOs|S)Nr)rrrrrr Ú__call__aszNullLogger.__call__N)r r r r#r$rrrr r!]sr!c@s|eZdZdd„Zddd„Zddd„Zd d „Zd d „Zd d„Zdd„Z dd„Z dd„Z dd„Z dd„Z dd„Zdd„ZeZdS)ÚLexercCsŽd|_d|_i|_i|_i|_d|_g|_d|_i|_i|_ i|_ d|_ d|_ d|_ d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NÚINITIALrÚéF)ÚlexreÚ lexretextÚ lexstatereÚlexstateretextÚlexstaterenamesÚlexstateÚ lexstatestackÚ lexstateinfoÚlexstateignoreÚlexstateerrorfÚ lexstateeoffÚ lexreflagsÚlexdatarÚlexlenÚ lexerrorfÚlexeoffÚ lextokensÚ lexignoreÚ lexliteralsÚ lexmodulerÚ lexoptimize)rrrr r ts.zLexer.__init__Nc Csâtj|ƒ}|rÞi}xŽ|jjƒD]€\}}g}x\|D]T\}}g} xF|D]>} | sV| d rb| j| ƒqB| jt|| djƒ| dfƒqBWq0W|j|| fƒ|||<qW||_i|_x(|jjƒD]\}} t|| jƒ|j|<qºW||_|S)Nrr()Úcopyr+ÚitemsÚappendÚgetattrr r2r<) rÚobjectÚcZnewtabÚkeyZritemZnewreZcreZfindexZ newfindexrÚefrrr Úclones(   & z Lexer.cloner'cCsÞt|tjƒrtdƒ‚|jdƒd}tjj||ƒd}t|dƒ”}|j d|t fƒ|j dt t ƒƒ|j dt t |jƒƒƒ|j d t |jƒƒ|j d t |jƒƒ|j d t |jƒƒi}xb|jjƒD]T\}}g} x>t||j||j|ƒD]"\\} } } } | j| t| | ƒfƒqðW| ||<qÊW|j d t |ƒƒ|j d t |jƒƒi}x,|jjƒD]\}}|rl|jnd||<qXW|j dt |ƒƒi}x,|jjƒD]\}}|r°|jnd||<qœW|j dt |ƒƒWdQRXdS)Nz&Won't overwrite existing lextab moduleÚ.r(z.pyÚwzJ# %s.py. This file automatically created by PLY (version %s). 
Don't edit! z_tabversion = %s z_lextokens = set(%s) z_lexreflags = %s z_lexliterals = %s z_lexstateinfo = %s z_lexstatere = %s z_lexstateignore = %s z_lexstateerrorf = %s z_lexstateeoff = %s éÿÿÿÿ)Ú isinstanceÚtypesÚ ModuleTypeÚIOErrorÚsplitÚosÚpathÚjoinÚopenrÚ __version__ÚreprÚ__tabversion__Útupler9r4r;r0r+r?Úzipr,r-r@Ú_funcs_to_namesr1r2r r3)rÚlextabÚ outputdirZ basetabmoduleÚfilenameZtfZtabreÚ statenameÚlreÚtitemÚpatÚfuncZretextÚrenamesZtaberrrEZtabeofrrr Úwritetab®s6 ( zLexer.writetabc CsRt|tjƒr|}ntd|ƒtj|}t|ddƒtkr@tdƒ‚|j |_ |j |_ |j |_|j t|jƒB|_|j|_|j|_i|_i|_xb|jjƒD]T\}}g}g}x.|D]&\}} |jtj||j ƒt| |ƒfƒq¨W||j|<||j|<q’Wi|_x$|jjƒD]\}} || |j|<qüWi|_x&|j jƒD]\}} || |j|<q(W|j!dƒdS)Nz import %sÚ _tabversionz0.0zInconsistent PLY versionr&)"rJrKrLÚexecÚsysÚmodulesrArUÚ ImportErrorZ _lextokensr9Z _lexreflagsr4Z _lexliteralsr;ÚsetÚ lextokens_allZ _lexstateinfor0Z_lexstateignorer1r+r,Z _lexstaterer?r@ÚreÚcompileÚ_names_to_funcsr2Z_lexstateerrorfr3Z _lexstateeoffÚbegin) rZtabfileÚfdictrYr\r]r^Ztxtitemr_Z func_namerErrr ÚreadtabÓs8   " z Lexer.readtabcCs8|dd…}t|tƒstdƒ‚||_d|_t|ƒ|_dS)Nr(zExpected a stringr)rJÚ StringTypesÚ ValueErrorr5rÚlenr6)rrrCrrr Úinputûs   z Lexer.inputcCsd||jkrtdƒ‚|j||_|j||_|jj|dƒ|_|jj|dƒ|_ |j j|dƒ|_ ||_ dS)NzUndefined stater') r+rqr)r,r*r1Úgetr:r2r7r3r8r.)rÚstaterrr rms   z Lexer.begincCs|jj|jƒ|j|ƒdS)N)r/r@r.rm)rrurrr Ú push_stateszLexer.push_statecCs|j|jjƒƒdS)N)rmr/Úpop)rrrr Ú pop_stateszLexer.pop_statecCs|jS)N)r.)rrrr Ú current_state!szLexer.current_statecCs|j|7_dS)N)r)rÚnrrr Úskip'sz Lexer.skipc Cs~|j}|j}|j}|j}xþ||kr|||kr<|d7}qxÖ|jD]ä\}}|j||ƒ}|s`qFtƒ}|jƒ|_|j |_ ||_|j } || \} |_ | s´|j rª|j ƒ|_|S|j ƒ}P|j ƒ}||_ ||_||_| |ƒ} | sè|j}|j}P|js(| j |jkr(td| jj| jj| j| j f||d…ƒ‚| SW|||jkrrtƒ}|||_|j |_ |j|_ ||_|d|_|S|jròtƒ}|j|d…|_|j |_ d|_ ||_ ||_||_|j|ƒ} ||jkràtd||||d…ƒ‚|j}| sîq| S||_td|||f||d…ƒ‚qW|jr\tƒ}d|_ d|_|j |_ ||_||_ ||_|j|ƒ} | S|d|_|jdkrztdƒ‚dS) Nr(z4%s:%d: Rule '%s' returned an unknown token type '%s'rz&Scanning error. 
Illegal character '%s'z"Illegal character '%s' at index %dÚeofr'z"No input string given with input())rr6r:r5r)ÚmatchrÚgrouprrÚ lastindexrÚendÚlexerZlexmatchr=rirÚ__code__Ú co_filenameÚco_firstlinenor r;r7r8Ú RuntimeError) rrr6r:r5r)Ú lexindexfuncÚmÚtokÚir`Znewtokrrr Útoken1sœ         "   z Lexer.tokencCs|S)Nr)rrrr Ú__iter__ŸszLexer.__iter__cCs|jƒ}|dkrt‚|S)N)rŠÚ StopIteration)rÚtrrr Únext¢sz Lexer.next)N)r')r r r r rFrbrorsrmrvrxryr{rŠr‹rŽÚ__next__rrrr r%ss  %(   nr%cCst|d|jƒS)NÚregex)rAÚ__doc__)r`rrr Ú _get_regex·sr’cCs0tj|ƒ}|jjƒ}|j|jkr,|j|jƒ|S)N)reÚ _getframeÚ f_globalsr>Úf_localsÚupdate)ÚlevelsrÚldictrrr Úget_caller_module_dictÁs     r™cCsJg}x@t||ƒD]2\}}|r8|dr8|j||dfƒq|j|ƒqW|S)Nrr()rWr@)ZfunclistÚnamelistÚresultrr"rrr rXÎs  rXcCsHg}x>|D]6}|r6|dr6|j||d|dfƒq |j|ƒq W|S)Nrr()r@)ršrnr›rzrrr rlÝs   rlc Csd|sgSdj|ƒ}yÎtj||ƒ}dgt|jjƒƒd}|dd…}xˆ|jjƒD]z\}} |j|dƒ} t| ƒt j t j fkr’| ||f|| <||| <qP| dk rP||| <|j dƒdkrºd|| <qPd||f|| <qPW||fg|g|gfSt k r^tt|ƒdƒ} | dkrd} t|d| …|||ƒ\} } }t|| d…|||ƒ\}}}| || |||fSXdS)Nú|r(Úignore_ré)NN)rQrjrkÚmaxÚ groupindexÚvaluesr?rtrrKÚ FunctionTypeÚ MethodTypeÚfindÚ ExceptionÚintrrÚ_form_master_re)ZrelistÚreflagsr˜Útoknamesrr)r†Z lexindexnamesrr‰Zhandler‡Zllistr]ZlnamesZrlistZrreZrnamesrrr r§ís2       r§cCsˆd}|jdƒ}x0t|dd…dƒD]\}}||kr"|dkr"Pq"W|dkrZt|d|…ƒ}nd}d|krnt|ƒ}dj||d…ƒ}||fS)Nr(Ú_ÚANYr&)r&)rNÚ enumeraterVrQ)rÚnamesZnonstateÚpartsr‰ÚpartÚstatesZ tokennamerrr Ú _statetokens r±c@sfeZdZddd„Zdd„Zdd„Zd d „Zd d „Zd d„Zdd„Z dd„Z dd„Z dd„Z dd„Z dS)Ú LexerReflectNrcCsL||_d|_g|_||_ddi|_tƒ|_d|_|dkrBtt j ƒn||_ dS)Nr&Ú inclusiveF) r˜Ú error_funcÚtokensr¨Ú stateinforhrfrrreÚstderrÚlog)rr˜r¸r¨rrr r /s zLexerReflect.__init__cCs$|jƒ|jƒ|jƒ|jƒdS)N)Ú get_tokensÚ get_literalsÚ get_statesÚ get_rules)rrrr Úget_all:szLexerReflect.get_allcCs|jƒ|jƒ|jƒ|jS)N)Úvalidate_tokensÚvalidate_literalsÚvalidate_rulesr)rrrr Ú validate_allAszLexerReflect.validate_allcCsp|jjddƒ}|s(|jjdƒd|_dSt|ttfƒsL|jjdƒd|_dS|sf|jjdƒd|_dS||_dS)NrµzNo token list is definedTztokens must be a list or 
pycparser-2.18/pycparser/ply/ctokens.py0000664000175000017500000000615113045001366021031 0ustar elibeneliben00000000000000# ----------------------------------------------------------------------
# ctokens.py
#
# Token specifications for symbols in ANSI C and C++. This file is
# meant to be used as a library in other tokenizers.
# ---------------------------------------------------------------------- # Reserved words tokens = [ # Literals (identifier, integer constant, float constant, string constant, char const) 'ID', 'TYPEID', 'INTEGER', 'FLOAT', 'STRING', 'CHARACTER', # Operators (+,-,*,/,%,|,&,~,^,<<,>>, ||, &&, !, <, <=, >, >=, ==, !=) 'PLUS', 'MINUS', 'TIMES', 'DIVIDE', 'MODULO', 'OR', 'AND', 'NOT', 'XOR', 'LSHIFT', 'RSHIFT', 'LOR', 'LAND', 'LNOT', 'LT', 'LE', 'GT', 'GE', 'EQ', 'NE', # Assignment (=, *=, /=, %=, +=, -=, <<=, >>=, &=, ^=, |=) 'EQUALS', 'TIMESEQUAL', 'DIVEQUAL', 'MODEQUAL', 'PLUSEQUAL', 'MINUSEQUAL', 'LSHIFTEQUAL','RSHIFTEQUAL', 'ANDEQUAL', 'XOREQUAL', 'OREQUAL', # Increment/decrement (++,--) 'INCREMENT', 'DECREMENT', # Structure dereference (->) 'ARROW', # Ternary operator (?) 'TERNARY', # Delimeters ( ) [ ] { } , . ; : 'LPAREN', 'RPAREN', 'LBRACKET', 'RBRACKET', 'LBRACE', 'RBRACE', 'COMMA', 'PERIOD', 'SEMI', 'COLON', # Ellipsis (...) 'ELLIPSIS', ] # Operators t_PLUS = r'\+' t_MINUS = r'-' t_TIMES = r'\*' t_DIVIDE = r'/' t_MODULO = r'%' t_OR = r'\|' t_AND = r'&' t_NOT = r'~' t_XOR = r'\^' t_LSHIFT = r'<<' t_RSHIFT = r'>>' t_LOR = r'\|\|' t_LAND = r'&&' t_LNOT = r'!' t_LT = r'<' t_GT = r'>' t_LE = r'<=' t_GE = r'>=' t_EQ = r'==' t_NE = r'!=' # Assignment operators t_EQUALS = r'=' t_TIMESEQUAL = r'\*=' t_DIVEQUAL = r'/=' t_MODEQUAL = r'%=' t_PLUSEQUAL = r'\+=' t_MINUSEQUAL = r'-=' t_LSHIFTEQUAL = r'<<=' t_RSHIFTEQUAL = r'>>=' t_ANDEQUAL = r'&=' t_OREQUAL = r'\|=' t_XOREQUAL = r'\^=' # Increment/decrement t_INCREMENT = r'\+\+' t_DECREMENT = r'--' # -> t_ARROW = r'->' # ? t_TERNARY = r'\?' # Delimeters t_LPAREN = r'\(' t_RPAREN = r'\)' t_LBRACKET = r'\[' t_RBRACKET = r'\]' t_LBRACE = r'\{' t_RBRACE = r'\}' t_COMMA = r',' t_PERIOD = r'\.' t_SEMI = r';' t_COLON = r':' t_ELLIPSIS = r'\.\.\.' # Identifiers t_ID = r'[A-Za-z_][A-Za-z0-9_]*' # Integer literal t_INTEGER = r'\d+([uU]|[lL]|[uU][lL]|[lL][uU])?' # Floating literal t_FLOAT = r'((\d+)(\.\d+)(e(\+|-)?(\d+))? 
| (\d+)e(\+|-)?(\d+))([lL]|[fF])?' # String literal t_STRING = r'\"([^\\\n]|(\\.))*?\"' # Character constant 'c' or L'c' t_CHARACTER = r'(L)?\'([^\\\n]|(\\.))*?\'' # Comment (C-Style) def t_COMMENT(t): r'/\*(.|\n)*?\*/' t.lexer.lineno += t.value.count('\n') return t # Comment (C++-Style) def t_CPPCOMMENT(t): r'//.*\n' t.lexer.lineno += 1 return t pycparser-2.18/pycparser/ply/yacc.py0000664000175000017500000041415213070450754020315 0ustar elibeneliben00000000000000# ----------------------------------------------------------------------------- # ply: yacc.py # # Copyright (C) 2001-2017 # David M. Beazley (Dabeaz LLC) # All rights reserved. # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are # met: # # * Redistributions of source code must retain the above copyright notice, # this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, # this list of conditions and the following disclaimer in the documentation # and/or other materials provided with the distribution. # * Neither the name of the David Beazley or Dabeaz LLC may be used to # endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR # A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------
#
# This implements an LR parser that is constructed from grammar rules defined
# as Python functions. The grammar is specified by supplying the BNF inside
# Python documentation strings. The inspiration for this technique was borrowed
# from John Aycock's Spark parsing system. PLY might be viewed as a cross between
# Spark and the GNU bison utility.
#
# The current implementation is only somewhat object-oriented. The
# LR parser itself is defined in terms of an object (which allows multiple
# parsers to co-exist). However, most of the variables used during table
# construction are defined in terms of global variables. Users shouldn't
# notice unless they are trying to define multiple parsers at the same
# time using threads (in which case they should have their head examined).
#
# This implementation supports both SLR and LALR(1) parsing. LALR(1)
# support was originally implemented by Elias Ioup (ezioup@alumni.uchicago.edu),
# using the algorithm found in Aho, Sethi, and Ullman "Compilers: Principles,
# Techniques, and Tools" (The Dragon Book). LALR(1) has since been replaced
# by the more efficient DeRemer and Pennello algorithm.
#
# :::::::: WARNING :::::::
#
# Construction of LR parsing tables is fairly complicated and expensive.
# To make this module run fast, a *LOT* of work has been put into
# optimization---often at the expense of readability and what one might
# consider to be good Python "coding style." Modify the code at your
# own risk!
# ----------------------------------------------------------------------------

import re
import types
import sys
import os.path
import inspect
import base64
import warnings

__version__ = '3.10'
__tabversion__ = '3.10'

#-----------------------------------------------------------------------------
#                  === User configurable parameters ===
#
# Change these to modify the default behavior of yacc (if you wish)
#-----------------------------------------------------------------------------

yaccdebug = True               # Debugging mode. If set, yacc generates a
                               # 'parser.out' file in the current directory

debug_file = 'parser.out'      # Default name of the debugging file
tab_module = 'parsetab'        # Default name of the table module
default_lr = 'LALR'            # Default LR table generation method
error_count = 3                # Number of symbols that must be shifted to leave recovery mode
yaccdevel = False              # Set to True if developing yacc. This turns off optimized
                               # implementations of certain functions.
resultlimit = 40               # Size limit of results when running in debug mode.
pickle_protocol = 0            # Protocol to use when writing pickle files

# String type-checking compatibility
if sys.version_info[0] < 3:
    string_types = basestring
else:
    string_types = str

MAXINT = sys.maxsize

# This object is a stand-in for a logging object created by the
# logging module. PLY will use this by default to create things
# such as the parser.out file. If a user wants more detailed
# information, they can create their own logging object and pass
# it into PLY.
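The logging hook described in the comment above can be sketched with a minimal stand-in. `StderrLogger` is a hypothetical name; the protocol it shows (printf-style `debug`/`info`/`warning`/`error` methods writing to a file-like object) is what PLY's own `PlyLogger` class, defined next, implements:

```python
import sys

# Sketch of the logger protocol PLY expects: any object exposing
# debug/info/warning/error methods that take a printf-style format
# string plus arguments, writing to an arbitrary file-like object.
class StderrLogger(object):
    def __init__(self, f):
        self.f = f

    def debug(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')
    info = debug

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

# Example: route PLY-style diagnostics to stderr.
log = StderrLogger(sys.stderr)
log.warning('unused token %s', 'FOO')
```

An object with these four methods can be supplied by the user (for example as the `errorlog` or `debuglog` arguments to `yacc.yacc()`) to get more detailed or redirected output.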
class PlyLogger(object): def __init__(self, f): self.f = f def debug(self, msg, *args, **kwargs): self.f.write((msg % args) + '\n') info = debug def warning(self, msg, *args, **kwargs): self.f.write('WARNING: ' + (msg % args) + '\n') def error(self, msg, *args, **kwargs): self.f.write('ERROR: ' + (msg % args) + '\n') critical = debug # Null logger is used when no output is generated. Does nothing. class NullLogger(object): def __getattribute__(self, name): return self def __call__(self, *args, **kwargs): return self # Exception raised for yacc-related errors class YaccError(Exception): pass # Format the result message that the parser produces when running in debug mode. def format_result(r): repr_str = repr(r) if '\n' in repr_str: repr_str = repr(repr_str) if len(repr_str) > resultlimit: repr_str = repr_str[:resultlimit] + ' ...' result = '<%s @ 0x%x> (%s)' % (type(r).__name__, id(r), repr_str) return result # Format stack entries when the parser is running in debug mode def format_stack_entry(r): repr_str = repr(r) if '\n' in repr_str: repr_str = repr(repr_str) if len(repr_str) < 16: return repr_str else: return '<%s @ 0x%x>' % (type(r).__name__, id(r)) # Panic mode error recovery support. This feature is being reworked--much of the # code here is to offer a deprecation/backwards compatible transition _errok = None _token = None _restart = None _warnmsg = '''PLY: Don't use global functions errok(), token(), and restart() in p_error(). Instead, invoke the methods on the associated parser instance: def p_error(p): ... # Use parser.errok(), parser.token(), parser.restart() ... 
parser = yacc.yacc() ''' def errok(): warnings.warn(_warnmsg) return _errok() def restart(): warnings.warn(_warnmsg) return _restart() def token(): warnings.warn(_warnmsg) return _token() # Utility function to call the p_error() function with some deprecation hacks def call_errorfunc(errorfunc, token, parser): global _errok, _token, _restart _errok = parser.errok _token = parser.token _restart = parser.restart r = errorfunc(token) try: del _errok, _token, _restart except NameError: pass return r #----------------------------------------------------------------------------- # === LR Parsing Engine === # # The following classes are used for the LR parser itself. These are not # used during table construction and are independent of the actual LR # table generation algorithm #----------------------------------------------------------------------------- # This class is used to hold non-terminal grammar symbols during parsing. # It normally has the following attributes set: # .type = Grammar symbol type # .value = Symbol value # .lineno = Starting line number # .endlineno = Ending line number (optional, set automatically) # .lexpos = Starting lex position # .endlexpos = Ending lex position (optional, set automatically) class YaccSymbol: def __str__(self): return self.type def __repr__(self): return str(self) # This class is a wrapper around the objects actually passed to each # grammar rule. Index lookup and assignment actually assign the # .value attribute of the underlying YaccSymbol object. # The lineno() method returns the line number of a given # item (or 0 if not defined). The linespan() method returns # a tuple of (startline,endline) representing the range of lines # for a symbol. The lexspan() method returns a tuple (lexpos,endlexpos) # representing the range of positional information for a symbol. 
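The indexing and position helpers described in the comment above can be illustrated with a small stand-in. This is a sketch: `Sym` and `Prod` are hypothetical minimal mirrors of `YaccSymbol`/`YaccProduction` below; real grammar actions receive the genuine objects from the parser:

```python
# Hypothetical minimal mirrors of YaccSymbol/YaccProduction, showing how
# a grammar action reads p[n] values and position information.
class Sym(object):
    def __init__(self, value, lineno=0, lexpos=0):
        self.value, self.lineno, self.lexpos = value, lineno, lexpos

class Prod(object):
    def __init__(self, slice):
        self.slice = slice
    def __getitem__(self, n):
        return self.slice[n].value      # index lookup reads .value
    def __setitem__(self, n, v):
        self.slice[n].value = v         # assignment writes .value
    def lineno(self, n):
        return getattr(self.slice[n], 'lineno', 0)
    def lexspan(self, n):
        start = getattr(self.slice[n], 'lexpos', 0)
        return start, getattr(self.slice[n], 'endlexpos', start)

# A grammar action for:  expression : expression PLUS term
def p_expression_plus(p):
    p[0] = p[1] + p[3]

# Simulate the slice the parser would build for "2 + 3".
p = Prod([Sym(None), Sym(2, lineno=1), Sym('+', lineno=1), Sym(3, lineno=1)])
p_expression_plus(p)
# p[0] is now 5; p.lineno(2) is the line of the PLUS token.
```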
class YaccProduction:
    def __init__(self, s, stack=None):
        self.slice = s
        self.stack = stack
        self.lexer = None
        self.parser = None

    def __getitem__(self, n):
        if isinstance(n, slice):
            return [s.value for s in self.slice[n]]
        elif n >= 0:
            return self.slice[n].value
        else:
            return self.stack[n].value

    def __setitem__(self, n, v):
        self.slice[n].value = v

    def __getslice__(self, i, j):
        return [s.value for s in self.slice[i:j]]

    def __len__(self):
        return len(self.slice)

    def lineno(self, n):
        return getattr(self.slice[n], 'lineno', 0)

    def set_lineno(self, n, lineno):
        self.slice[n].lineno = lineno

    def linespan(self, n):
        startline = getattr(self.slice[n], 'lineno', 0)
        endline = getattr(self.slice[n], 'endlineno', startline)
        return startline, endline

    def lexpos(self, n):
        return getattr(self.slice[n], 'lexpos', 0)

    def lexspan(self, n):
        startpos = getattr(self.slice[n], 'lexpos', 0)
        endpos = getattr(self.slice[n], 'endlexpos', startpos)
        return startpos, endpos

    def error(self):
        raise SyntaxError

# -----------------------------------------------------------------------------
#                               == LRParser ==
#
# The LR Parsing engine.
# -----------------------------------------------------------------------------

class LRParser:
    def __init__(self, lrtab, errorf):
        self.productions = lrtab.lr_productions
        self.action = lrtab.lr_action
        self.goto = lrtab.lr_goto
        self.errorfunc = errorf
        self.set_defaulted_states()
        self.errorok = True

    def errok(self):
        self.errorok = True

    def restart(self):
        del self.statestack[:]
        del self.symstack[:]
        sym = YaccSymbol()
        sym.type = '$end'
        self.symstack.append(sym)
        self.statestack.append(0)

    # Defaulted state support.
    # This method identifies parser states where there is only one possible reduction action.
    # For such states, the parser can choose to make a rule reduction without consuming
    # the next look-ahead token.
This delayed invocation of the tokenizer can be useful in # certain kinds of advanced parsing situations where the lexer and parser interact with # each other or change states (i.e., manipulation of scope, lexer states, etc.). # # See: http://www.gnu.org/software/bison/manual/html_node/Default-Reductions.html#Default-Reductions def set_defaulted_states(self): self.defaulted_states = {} for state, actions in self.action.items(): rules = list(actions.values()) if len(rules) == 1 and rules[0] < 0: self.defaulted_states[state] = rules[0] def disable_defaulted_states(self): self.defaulted_states = {} def parse(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None): if debug or yaccdevel: if isinstance(debug, int): debug = PlyLogger(sys.stderr) return self.parsedebug(input, lexer, debug, tracking, tokenfunc) elif tracking: return self.parseopt(input, lexer, debug, tracking, tokenfunc) else: return self.parseopt_notrack(input, lexer, debug, tracking, tokenfunc) # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # parsedebug(). # # This is the debugging enabled version of parse(). All changes made to the # parsing engine should be made here. Optimized versions of this function # are automatically created by the ply/ygen.py script. This script cuts out # sections enclosed in markers such as this: # # #--! DEBUG # statements # #--! DEBUG # # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! def parsedebug(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None): #--! parsedebug-start lookahead = None # Current lookahead symbol lookaheadstack = [] # Stack of lookahead symbols actions = self.action # Local reference to action table (to avoid lookup on self.) goto = self.goto # Local reference to goto table (to avoid lookup on self.) prod = self.productions # Local reference to production list (to avoid lookup on self.) 
defaulted_states = self.defaulted_states # Local reference to defaulted states pslice = YaccProduction(None) # Production object passed to grammar rules errorcount = 0 # Used during error recovery #--! DEBUG debug.info('PLY: PARSE DEBUG START') #--! DEBUG # If no lexer was given, we will try to use the lex module if not lexer: from . import lex lexer = lex.lexer # Set up the lexer and parser objects on pslice pslice.lexer = lexer pslice.parser = self # If input was supplied, pass to lexer if input is not None: lexer.input(input) if tokenfunc is None: # Tokenize function get_token = lexer.token else: get_token = tokenfunc # Set the parser() token method (sometimes used in error recovery) self.token = get_token # Set up the state and symbol stacks statestack = [] # Stack of parsing states self.statestack = statestack symstack = [] # Stack of grammar symbols self.symstack = symstack pslice.stack = symstack # Put in the production errtoken = None # Err token # The start state is assumed to be (0,$end) statestack.append(0) sym = YaccSymbol() sym.type = '$end' symstack.append(sym) state = 0 while True: # Get the next symbol on the input. If a lookahead symbol # is already set, we just use that. Otherwise, we'll pull # the next token off of the lookaheadstack or from the lexer #--! DEBUG debug.debug('') debug.debug('State : %s', state) #--! DEBUG if state not in defaulted_states: if not lookahead: if not lookaheadstack: lookahead = get_token() # Get the next token else: lookahead = lookaheadstack.pop() if not lookahead: lookahead = YaccSymbol() lookahead.type = '$end' # Check the action table ltype = lookahead.type t = actions[state].get(ltype) else: t = defaulted_states[state] #--! DEBUG debug.debug('Defaulted state %s: Reduce using %d', state, -t) #--! DEBUG #--! DEBUG debug.debug('Stack : %s', ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip()) #--! 
DEBUG if t is not None: if t > 0: # shift a symbol on the stack statestack.append(t) state = t #--! DEBUG debug.debug('Action : Shift and goto state %s', t) #--! DEBUG symstack.append(lookahead) lookahead = None # Decrease error count on successful shift if errorcount: errorcount -= 1 continue if t < 0: # reduce a symbol on the stack, emit a production p = prod[-t] pname = p.name plen = p.len # Get production function sym = YaccSymbol() sym.type = pname # Production name sym.value = None #--! DEBUG if plen: debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str, '['+','.join([format_stack_entry(_v.value) for _v in symstack[-plen:]])+']', goto[statestack[-1-plen]][pname]) else: debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str, [], goto[statestack[-1]][pname]) #--! DEBUG if plen: targ = symstack[-plen-1:] targ[0] = sym #--! TRACKING if tracking: t1 = targ[1] sym.lineno = t1.lineno sym.lexpos = t1.lexpos t1 = targ[-1] sym.endlineno = getattr(t1, 'endlineno', t1.lineno) sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos) #--! TRACKING # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # The code enclosed in this section is duplicated # below as a performance optimization. Make sure # changes get made in both locations. pslice.slice = targ try: # Call the grammar rule with our special slice object del symstack[-plen:] self.state = state p.callable(pslice) del statestack[-plen:] #--! DEBUG debug.info('Result : %s', format_result(pslice[0])) #--! DEBUG symstack.append(sym) state = goto[statestack[-1]][pname] statestack.append(state) except SyntaxError: # If an error was set. 
Enter error recovery state lookaheadstack.append(lookahead) # Save the current lookahead token symstack.extend(targ[1:-1]) # Put the production slice back on the stack statestack.pop() # Pop back one state (before the reduce) state = statestack[-1] sym.type = 'error' sym.value = 'error' lookahead = sym errorcount = error_count self.errorok = False continue # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! else: #--! TRACKING if tracking: sym.lineno = lexer.lineno sym.lexpos = lexer.lexpos #--! TRACKING targ = [sym] # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # The code enclosed in this section is duplicated # above as a performance optimization. Make sure # changes get made in both locations. pslice.slice = targ try: # Call the grammar rule with our special slice object self.state = state p.callable(pslice) #--! DEBUG debug.info('Result : %s', format_result(pslice[0])) #--! DEBUG symstack.append(sym) state = goto[statestack[-1]][pname] statestack.append(state) except SyntaxError: # If an error was set. Enter error recovery state lookaheadstack.append(lookahead) # Save the current lookahead token statestack.pop() # Pop back one state (before the reduce) state = statestack[-1] sym.type = 'error' sym.value = 'error' lookahead = sym errorcount = error_count self.errorok = False continue # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! if t == 0: n = symstack[-1] result = getattr(n, 'value', None) #--! DEBUG debug.info('Done : Returning %s', format_result(result)) debug.info('PLY: PARSE DEBUG END') #--! DEBUG return result if t is None: #--! DEBUG debug.error('Error : %s', ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip()) #--! DEBUG # We have some kind of parsing error here. To handle # this, we are going to push the current token onto # the tokenstack and replace it with an 'error' token. # If there are any synchronization rules, they may # catch it. 
# # In addition to pushing the error token, we call call # the user defined p_error() function if this is the # first syntax error. This function is only called if # errorcount == 0. if errorcount == 0 or self.errorok: errorcount = error_count self.errorok = False errtoken = lookahead if errtoken.type == '$end': errtoken = None # End of file! if self.errorfunc: if errtoken and not hasattr(errtoken, 'lexer'): errtoken.lexer = lexer self.state = state tok = call_errorfunc(self.errorfunc, errtoken, self) if self.errorok: # User must have done some kind of panic # mode recovery on their own. The # returned token is the next lookahead lookahead = tok errtoken = None continue else: if errtoken: if hasattr(errtoken, 'lineno'): lineno = lookahead.lineno else: lineno = 0 if lineno: sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type)) else: sys.stderr.write('yacc: Syntax error, token=%s' % errtoken.type) else: sys.stderr.write('yacc: Parse error in input. EOF\n') return else: errorcount = error_count # case 1: the statestack only has 1 entry on it. If we're in this state, the # entire parse has been rolled back and we're completely hosed. The token is # discarded and we just keep going. if len(statestack) <= 1 and lookahead.type != '$end': lookahead = None errtoken = None state = 0 # Nuke the pushback stack del lookaheadstack[:] continue # case 2: the statestack has a couple of entries on it, but we're # at the end of the file. nuke the top entry and generate an error token # Start nuking entries on the stack if lookahead.type == '$end': # Whoa. We're really hosed here. Bail out return if lookahead.type != 'error': sym = symstack[-1] if sym.type == 'error': # Hmmm. Error is on top of stack, we'll just nuke input # symbol and continue #--! TRACKING if tracking: sym.endlineno = getattr(lookahead, 'lineno', sym.lineno) sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos) #--! 
TRACKING lookahead = None continue # Create the error symbol for the first time and make it the new lookahead symbol t = YaccSymbol() t.type = 'error' if hasattr(lookahead, 'lineno'): t.lineno = t.endlineno = lookahead.lineno if hasattr(lookahead, 'lexpos'): t.lexpos = t.endlexpos = lookahead.lexpos t.value = lookahead lookaheadstack.append(lookahead) lookahead = t else: sym = symstack.pop() #--! TRACKING if tracking: lookahead.lineno = sym.lineno lookahead.lexpos = sym.lexpos #--! TRACKING statestack.pop() state = statestack[-1] continue # Call an error function here raise RuntimeError('yacc: internal parser error!!!\n') #--! parsedebug-end # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # parseopt(). # # Optimized version of parse() method. DO NOT EDIT THIS CODE DIRECTLY! # This code is automatically generated by the ply/ygen.py script. Make # changes to the parsedebug() method instead. # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! def parseopt(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None): #--! parseopt-start lookahead = None # Current lookahead symbol lookaheadstack = [] # Stack of lookahead symbols actions = self.action # Local reference to action table (to avoid lookup on self.) goto = self.goto # Local reference to goto table (to avoid lookup on self.) prod = self.productions # Local reference to production list (to avoid lookup on self.) defaulted_states = self.defaulted_states # Local reference to defaulted states pslice = YaccProduction(None) # Production object passed to grammar rules errorcount = 0 # Used during error recovery # If no lexer was given, we will try to use the lex module if not lexer: from . 
import lex lexer = lex.lexer # Set up the lexer and parser objects on pslice pslice.lexer = lexer pslice.parser = self # If input was supplied, pass to lexer if input is not None: lexer.input(input) if tokenfunc is None: # Tokenize function get_token = lexer.token else: get_token = tokenfunc # Set the parser() token method (sometimes used in error recovery) self.token = get_token # Set up the state and symbol stacks statestack = [] # Stack of parsing states self.statestack = statestack symstack = [] # Stack of grammar symbols self.symstack = symstack pslice.stack = symstack # Put in the production errtoken = None # Err token # The start state is assumed to be (0,$end) statestack.append(0) sym = YaccSymbol() sym.type = '$end' symstack.append(sym) state = 0 while True: # Get the next symbol on the input. If a lookahead symbol # is already set, we just use that. Otherwise, we'll pull # the next token off of the lookaheadstack or from the lexer if state not in defaulted_states: if not lookahead: if not lookaheadstack: lookahead = get_token() # Get the next token else: lookahead = lookaheadstack.pop() if not lookahead: lookahead = YaccSymbol() lookahead.type = '$end' # Check the action table ltype = lookahead.type t = actions[state].get(ltype) else: t = defaulted_states[state] if t is not None: if t > 0: # shift a symbol on the stack statestack.append(t) state = t symstack.append(lookahead) lookahead = None # Decrease error count on successful shift if errorcount: errorcount -= 1 continue if t < 0: # reduce a symbol on the stack, emit a production p = prod[-t] pname = p.name plen = p.len # Get production function sym = YaccSymbol() sym.type = pname # Production name sym.value = None if plen: targ = symstack[-plen-1:] targ[0] = sym #--! TRACKING if tracking: t1 = targ[1] sym.lineno = t1.lineno sym.lexpos = t1.lexpos t1 = targ[-1] sym.endlineno = getattr(t1, 'endlineno', t1.lineno) sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos) #--! 
TRACKING # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # The code enclosed in this section is duplicated # below as a performance optimization. Make sure # changes get made in both locations. pslice.slice = targ try: # Call the grammar rule with our special slice object del symstack[-plen:] self.state = state p.callable(pslice) del statestack[-plen:] symstack.append(sym) state = goto[statestack[-1]][pname] statestack.append(state) except SyntaxError: # If an error was set. Enter error recovery state lookaheadstack.append(lookahead) # Save the current lookahead token symstack.extend(targ[1:-1]) # Put the production slice back on the stack statestack.pop() # Pop back one state (before the reduce) state = statestack[-1] sym.type = 'error' sym.value = 'error' lookahead = sym errorcount = error_count self.errorok = False continue # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! else: #--! TRACKING if tracking: sym.lineno = lexer.lineno sym.lexpos = lexer.lexpos #--! TRACKING targ = [sym] # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # The code enclosed in this section is duplicated # above as a performance optimization. Make sure # changes get made in both locations. pslice.slice = targ try: # Call the grammar rule with our special slice object self.state = state p.callable(pslice) symstack.append(sym) state = goto[statestack[-1]][pname] statestack.append(state) except SyntaxError: # If an error was set. Enter error recovery state lookaheadstack.append(lookahead) # Save the current lookahead token statestack.pop() # Pop back one state (before the reduce) state = statestack[-1] sym.type = 'error' sym.value = 'error' lookahead = sym errorcount = error_count self.errorok = False continue # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! if t == 0: n = symstack[-1] result = getattr(n, 'value', None) return result if t is None: # We have some kind of parsing error here. 
To handle # this, we are going to push the current token onto # the tokenstack and replace it with an 'error' token. # If there are any synchronization rules, they may # catch it. # # In addition to pushing the error token, we call call # the user defined p_error() function if this is the # first syntax error. This function is only called if # errorcount == 0. if errorcount == 0 or self.errorok: errorcount = error_count self.errorok = False errtoken = lookahead if errtoken.type == '$end': errtoken = None # End of file! if self.errorfunc: if errtoken and not hasattr(errtoken, 'lexer'): errtoken.lexer = lexer self.state = state tok = call_errorfunc(self.errorfunc, errtoken, self) if self.errorok: # User must have done some kind of panic # mode recovery on their own. The # returned token is the next lookahead lookahead = tok errtoken = None continue else: if errtoken: if hasattr(errtoken, 'lineno'): lineno = lookahead.lineno else: lineno = 0 if lineno: sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type)) else: sys.stderr.write('yacc: Syntax error, token=%s' % errtoken.type) else: sys.stderr.write('yacc: Parse error in input. EOF\n') return else: errorcount = error_count # case 1: the statestack only has 1 entry on it. If we're in this state, the # entire parse has been rolled back and we're completely hosed. The token is # discarded and we just keep going. if len(statestack) <= 1 and lookahead.type != '$end': lookahead = None errtoken = None state = 0 # Nuke the pushback stack del lookaheadstack[:] continue # case 2: the statestack has a couple of entries on it, but we're # at the end of the file. nuke the top entry and generate an error token # Start nuking entries on the stack if lookahead.type == '$end': # Whoa. We're really hosed here. Bail out return if lookahead.type != 'error': sym = symstack[-1] if sym.type == 'error': # Hmmm. Error is on top of stack, we'll just nuke input # symbol and continue #--! 
TRACKING if tracking: sym.endlineno = getattr(lookahead, 'lineno', sym.lineno) sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos) #--! TRACKING lookahead = None continue # Create the error symbol for the first time and make it the new lookahead symbol t = YaccSymbol() t.type = 'error' if hasattr(lookahead, 'lineno'): t.lineno = t.endlineno = lookahead.lineno if hasattr(lookahead, 'lexpos'): t.lexpos = t.endlexpos = lookahead.lexpos t.value = lookahead lookaheadstack.append(lookahead) lookahead = t else: sym = symstack.pop() #--! TRACKING if tracking: lookahead.lineno = sym.lineno lookahead.lexpos = sym.lexpos #--! TRACKING statestack.pop() state = statestack[-1] continue # Call an error function here raise RuntimeError('yacc: internal parser error!!!\n') #--! parseopt-end # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # parseopt_notrack(). # # Optimized version of parseopt() with line number tracking removed. # DO NOT EDIT THIS CODE DIRECTLY. This code is automatically generated # by the ply/ygen.py script. Make changes to the parsedebug() method instead. # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! def parseopt_notrack(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None): #--! parseopt-notrack-start lookahead = None # Current lookahead symbol lookaheadstack = [] # Stack of lookahead symbols actions = self.action # Local reference to action table (to avoid lookup on self.) goto = self.goto # Local reference to goto table (to avoid lookup on self.) prod = self.productions # Local reference to production list (to avoid lookup on self.) defaulted_states = self.defaulted_states # Local reference to defaulted states pslice = YaccProduction(None) # Production object passed to grammar rules errorcount = 0 # Used during error recovery # If no lexer was given, we will try to use the lex module if not lexer: from . 
import lex lexer = lex.lexer # Set up the lexer and parser objects on pslice pslice.lexer = lexer pslice.parser = self # If input was supplied, pass to lexer if input is not None: lexer.input(input) if tokenfunc is None: # Tokenize function get_token = lexer.token else: get_token = tokenfunc # Set the parser() token method (sometimes used in error recovery) self.token = get_token # Set up the state and symbol stacks statestack = [] # Stack of parsing states self.statestack = statestack symstack = [] # Stack of grammar symbols self.symstack = symstack pslice.stack = symstack # Put in the production errtoken = None # Err token # The start state is assumed to be (0,$end) statestack.append(0) sym = YaccSymbol() sym.type = '$end' symstack.append(sym) state = 0 while True: # Get the next symbol on the input. If a lookahead symbol # is already set, we just use that. Otherwise, we'll pull # the next token off of the lookaheadstack or from the lexer if state not in defaulted_states: if not lookahead: if not lookaheadstack: lookahead = get_token() # Get the next token else: lookahead = lookaheadstack.pop() if not lookahead: lookahead = YaccSymbol() lookahead.type = '$end' # Check the action table ltype = lookahead.type t = actions[state].get(ltype) else: t = defaulted_states[state] if t is not None: if t > 0: # shift a symbol on the stack statestack.append(t) state = t symstack.append(lookahead) lookahead = None # Decrease error count on successful shift if errorcount: errorcount -= 1 continue if t < 0: # reduce a symbol on the stack, emit a production p = prod[-t] pname = p.name plen = p.len # Get production function sym = YaccSymbol() sym.type = pname # Production name sym.value = None if plen: targ = symstack[-plen-1:] targ[0] = sym # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! # The code enclosed in this section is duplicated # below as a performance optimization. Make sure # changes get made in both locations. 
                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:
                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.
                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    return result

            if t is None:
                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call the
                # user defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
if errorcount == 0 or self.errorok: errorcount = error_count self.errorok = False errtoken = lookahead if errtoken.type == '$end': errtoken = None # End of file! if self.errorfunc: if errtoken and not hasattr(errtoken, 'lexer'): errtoken.lexer = lexer self.state = state tok = call_errorfunc(self.errorfunc, errtoken, self) if self.errorok: # User must have done some kind of panic # mode recovery on their own. The # returned token is the next lookahead lookahead = tok errtoken = None continue else: if errtoken: if hasattr(errtoken, 'lineno'): lineno = lookahead.lineno else: lineno = 0 if lineno: sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type)) else: sys.stderr.write('yacc: Syntax error, token=%s' % errtoken.type) else: sys.stderr.write('yacc: Parse error in input. EOF\n') return else: errorcount = error_count # case 1: the statestack only has 1 entry on it. If we're in this state, the # entire parse has been rolled back and we're completely hosed. The token is # discarded and we just keep going. if len(statestack) <= 1 and lookahead.type != '$end': lookahead = None errtoken = None state = 0 # Nuke the pushback stack del lookaheadstack[:] continue # case 2: the statestack has a couple of entries on it, but we're # at the end of the file. nuke the top entry and generate an error token # Start nuking entries on the stack if lookahead.type == '$end': # Whoa. We're really hosed here. Bail out return if lookahead.type != 'error': sym = symstack[-1] if sym.type == 'error': # Hmmm. 
Error is on top of stack, we'll just nuke input # symbol and continue lookahead = None continue # Create the error symbol for the first time and make it the new lookahead symbol t = YaccSymbol() t.type = 'error' if hasattr(lookahead, 'lineno'): t.lineno = t.endlineno = lookahead.lineno if hasattr(lookahead, 'lexpos'): t.lexpos = t.endlexpos = lookahead.lexpos t.value = lookahead lookaheadstack.append(lookahead) lookahead = t else: sym = symstack.pop() statestack.pop() state = statestack[-1] continue # Call an error function here raise RuntimeError('yacc: internal parser error!!!\n') #--! parseopt-notrack-end # ----------------------------------------------------------------------------- # === Grammar Representation === # # The following functions, classes, and variables are used to represent and # manipulate the rules that make up a grammar. # ----------------------------------------------------------------------------- # regex matching identifiers _is_identifier = re.compile(r'^[a-zA-Z0-9_-]+$') # ----------------------------------------------------------------------------- # class Production: # # This class stores the raw information about a single production or grammar rule. # A grammar rule refers to a specification such as this: # # expr : expr PLUS term # # Here are the basic attributes defined on all productions # # name - Name of the production. For example 'expr' # prod - A list of symbols on the right side ['expr','PLUS','term'] # prec - Production precedence level # number - Production number. # func - Function that executes on reduce # file - File where production function is defined # lineno - Line number where production function is defined # # The following attributes are defined or optional. 
# # len - Length of the production (number of symbols on right hand side) # usyms - Set of unique symbols found in the production # ----------------------------------------------------------------------------- class Production(object): reduced = 0 def __init__(self, number, name, prod, precedence=('right', 0), func=None, file='', line=0): self.name = name self.prod = tuple(prod) self.number = number self.func = func self.callable = None self.file = file self.line = line self.prec = precedence # Internal settings used during table construction self.len = len(self.prod) # Length of the production # Create a list of unique production symbols used in the production self.usyms = [] for s in self.prod: if s not in self.usyms: self.usyms.append(s) # List of all LR items for the production self.lr_items = [] self.lr_next = None # Create a string representation if self.prod: self.str = '%s -> %s' % (self.name, ' '.join(self.prod)) else: self.str = '%s -> ' % self.name def __str__(self): return self.str def __repr__(self): return 'Production(' + str(self) + ')' def __len__(self): return len(self.prod) def __nonzero__(self): return 1 def __getitem__(self, index): return self.prod[index] # Return the nth lr_item from the production (or None if at the end) def lr_item(self, n): if n > len(self.prod): return None p = LRItem(self, n) # Precompute the list of productions immediately following. try: p.lr_after = Prodnames[p.prod[n+1]] except (IndexError, KeyError): p.lr_after = [] try: p.lr_before = p.prod[n-1] except IndexError: p.lr_before = None return p # Bind the production function name to a callable def bind(self, pdict): if self.func: self.callable = pdict[self.func] # This class serves as a minimal standin for Production objects when # reading table data from files. It only contains information # actually used by the LR parsing engine, plus some additional # debugging information. 
class MiniProduction(object):
    def __init__(self, str, name, len, func, file, line):
        self.name = name
        self.len = len
        self.func = func
        self.callable = None
        self.file = file
        self.line = line
        self.str = str

    def __str__(self):
        return self.str

    def __repr__(self):
        return 'MiniProduction(%s)' % self.str

    # Bind the production function name to a callable
    def bind(self, pdict):
        if self.func:
            self.callable = pdict[self.func]


# -----------------------------------------------------------------------------
# class LRItem
#
# This class represents a specific stage of parsing a production rule.  For
# example:
#
#       expr : expr . PLUS term
#
# In the above, the "." represents the current location of the parse.  Here are
# the basic attributes:
#
#       name       - Name of the production.  For example 'expr'
#       prod       - A list of symbols on the right side ['expr', '.', 'PLUS', 'term']
#       number     - Production number.
#
#       lr_next    - Next LR item.  For example, if we are 'expr -> expr . PLUS term'
#                    then lr_next refers to 'expr -> expr PLUS . term'
#       lr_index   - LR item index (location of the ".") in the prod list.
#       lookaheads - LALR lookahead symbols for this item
#       len        - Length of the production (number of symbols on right hand side)
#       lr_after   - List of all productions that immediately follow
#       lr_before  - Grammar symbol immediately before
# -----------------------------------------------------------------------------

class LRItem(object):
    def __init__(self, p, n):
        self.name = p.name
        self.prod = list(p.prod)
        self.number = p.number
        self.lr_index = n
        self.lookaheads = {}
        self.prod.insert(n, '.')
        self.prod = tuple(self.prod)
        self.len = len(self.prod)
        self.usyms = p.usyms

    def __str__(self):
        if self.prod:
            s = '%s -> %s' % (self.name, ' '.join(self.prod))
        else:
            s = '%s -> <empty>' % self.name
        return s

    def __repr__(self):
        return 'LRItem(' + str(self) + ')'

# -----------------------------------------------------------------------------
# rightmost_terminal()
#
# Return the rightmost terminal from a list of symbols.  Used in add_production()
# -----------------------------------------------------------------------------
def rightmost_terminal(symbols, terminals):
    i = len(symbols) - 1
    while i >= 0:
        if symbols[i] in terminals:
            return symbols[i]
        i -= 1
    return None

# -----------------------------------------------------------------------------
#                           === GRAMMAR CLASS ===
#
# The following class represents the contents of the specified grammar along
# with various computed properties such as first sets, follow sets, LR items, etc.
# This data is used for critical parts of the table generation process later.
# -----------------------------------------------------------------------------

class GrammarError(YaccError):
    pass

class Grammar(object):
    def __init__(self, terminals):
        self.Productions = [None]  # A list of all of the productions.  The first
                                   # entry is always reserved for the purpose of
                                   # building an augmented grammar

        self.Prodnames = {}        # A dictionary mapping the names of nonterminals to a list of all
                                   # productions of that nonterminal.

        self.Prodmap = {}          # A dictionary that is only used to detect duplicate
                                   # productions.

        self.Terminals = {}        # A dictionary mapping the names of terminal symbols to a
                                   # list of the rules where they are used.

        for term in terminals:
            self.Terminals[term] = []

        self.Terminals['error'] = []

        self.Nonterminals = {}     # A dictionary mapping names of nonterminals to a list
                                   # of rule numbers where they are used.

        self.First = {}            # A dictionary of precomputed FIRST(x) symbols

        self.Follow = {}           # A dictionary of precomputed FOLLOW(x) symbols

        self.Precedence = {}       # Precedence rules for each terminal.  Contains tuples of the
                                   # form ('right',level) or ('nonassoc', level) or ('left',level)

        self.UsedPrecedence = set() # Precedence rules that were actually used by the grammar.
                                    # This is only used to provide error checking and to generate
                                    # a warning about unused precedence rules.
self.Start = None # Starting symbol for the grammar def __len__(self): return len(self.Productions) def __getitem__(self, index): return self.Productions[index] # ----------------------------------------------------------------------------- # set_precedence() # # Sets the precedence for a given terminal. assoc is the associativity such as # 'left','right', or 'nonassoc'. level is a numeric level. # # ----------------------------------------------------------------------------- def set_precedence(self, term, assoc, level): assert self.Productions == [None], 'Must call set_precedence() before add_production()' if term in self.Precedence: raise GrammarError('Precedence already specified for terminal %r' % term) if assoc not in ['left', 'right', 'nonassoc']: raise GrammarError("Associativity must be one of 'left','right', or 'nonassoc'") self.Precedence[term] = (assoc, level) # ----------------------------------------------------------------------------- # add_production() # # Given an action function, this function assembles a production rule and # computes its precedence level. # # The production rule is supplied as a list of symbols. For example, # a rule such as 'expr : expr PLUS term' has a production name of 'expr' and # symbols ['expr','PLUS','term']. # # Precedence is determined by the precedence of the right-most non-terminal # or the precedence of a terminal specified by %prec. # # A variety of error checks are performed to make sure production symbols # are valid and that %prec is used correctly. # ----------------------------------------------------------------------------- def add_production(self, prodname, syms, func=None, file='', line=0): if prodname in self.Terminals: raise GrammarError('%s:%d: Illegal rule name %r. Already defined as a token' % (file, line, prodname)) if prodname == 'error': raise GrammarError('%s:%d: Illegal rule name %r. 
error is a reserved word' % (file, line, prodname)) if not _is_identifier.match(prodname): raise GrammarError('%s:%d: Illegal rule name %r' % (file, line, prodname)) # Look for literal tokens for n, s in enumerate(syms): if s[0] in "'\"": try: c = eval(s) if (len(c) > 1): raise GrammarError('%s:%d: Literal token %s in rule %r may only be a single character' % (file, line, s, prodname)) if c not in self.Terminals: self.Terminals[c] = [] syms[n] = c continue except SyntaxError: pass if not _is_identifier.match(s) and s != '%prec': raise GrammarError('%s:%d: Illegal name %r in rule %r' % (file, line, s, prodname)) # Determine the precedence level if '%prec' in syms: if syms[-1] == '%prec': raise GrammarError('%s:%d: Syntax error. Nothing follows %%prec' % (file, line)) if syms[-2] != '%prec': raise GrammarError('%s:%d: Syntax error. %%prec can only appear at the end of a grammar rule' % (file, line)) precname = syms[-1] prodprec = self.Precedence.get(precname) if not prodprec: raise GrammarError('%s:%d: Nothing known about the precedence of %r' % (file, line, precname)) else: self.UsedPrecedence.add(precname) del syms[-2:] # Drop %prec from the rule else: # If no %prec, precedence is determined by the rightmost terminal symbol precname = rightmost_terminal(syms, self.Terminals) prodprec = self.Precedence.get(precname, ('right', 0)) # See if the rule is already in the rulemap map = '%s -> %s' % (prodname, syms) if map in self.Prodmap: m = self.Prodmap[map] raise GrammarError('%s:%d: Duplicate rule %s. ' % (file, line, m) + 'Previous definition at %s:%d' % (m.file, m.line)) # From this point on, everything is valid. 
Create a new Production instance pnumber = len(self.Productions) if prodname not in self.Nonterminals: self.Nonterminals[prodname] = [] # Add the production number to Terminals and Nonterminals for t in syms: if t in self.Terminals: self.Terminals[t].append(pnumber) else: if t not in self.Nonterminals: self.Nonterminals[t] = [] self.Nonterminals[t].append(pnumber) # Create a production and add it to the list of productions p = Production(pnumber, prodname, syms, prodprec, func, file, line) self.Productions.append(p) self.Prodmap[map] = p # Add to the global productions list try: self.Prodnames[prodname].append(p) except KeyError: self.Prodnames[prodname] = [p] # ----------------------------------------------------------------------------- # set_start() # # Sets the starting symbol and creates the augmented grammar. Production # rule 0 is S' -> start where start is the start symbol. # ----------------------------------------------------------------------------- def set_start(self, start=None): if not start: start = self.Productions[1].name if start not in self.Nonterminals: raise GrammarError('start symbol %s undefined' % start) self.Productions[0] = Production(0, "S'", [start]) self.Nonterminals[start].append(0) self.Start = start # ----------------------------------------------------------------------------- # find_unreachable() # # Find all of the nonterminal symbols that can't be reached from the starting # symbol. Returns a list of nonterminals that can't be reached. 
# ----------------------------------------------------------------------------- def find_unreachable(self): # Mark all symbols that are reachable from a symbol s def mark_reachable_from(s): if s in reachable: return reachable.add(s) for p in self.Prodnames.get(s, []): for r in p.prod: mark_reachable_from(r) reachable = set() mark_reachable_from(self.Productions[0].prod[0]) return [s for s in self.Nonterminals if s not in reachable] # ----------------------------------------------------------------------------- # infinite_cycles() # # This function looks at the various parsing rules and tries to detect # infinite recursion cycles (grammar rules where there is no possible way # to derive a string of only terminals). # ----------------------------------------------------------------------------- def infinite_cycles(self): terminates = {} # Terminals: for t in self.Terminals: terminates[t] = True terminates['$end'] = True # Nonterminals: # Initialize to false: for n in self.Nonterminals: terminates[n] = False # Then propagate termination until no change: while True: some_change = False for (n, pl) in self.Prodnames.items(): # Nonterminal n terminates iff any of its productions terminates. for p in pl: # Production p terminates iff all of its rhs symbols terminate. for s in p.prod: if not terminates[s]: # The symbol s does not terminate, # so production p does not terminate. p_terminates = False break else: # didn't break from the loop, # so every symbol s terminates # so production p terminates. p_terminates = True if p_terminates: # symbol n terminates! if not terminates[n]: terminates[n] = True some_change = True # Don't need to consider any more productions for this n. break if not some_change: break infinite = [] for (s, term) in terminates.items(): if not term: if s not in self.Prodnames and s not in self.Terminals and s != 'error': # s is used-but-not-defined, and we've already warned of that, # so it would be overkill to say that it's also non-terminating. 
                    pass
                else:
                    infinite.append(s)
        return infinite

    # -----------------------------------------------------------------------------
    # undefined_symbols()
    #
    # Find all symbols that were used in the grammar, but not defined as tokens or
    # grammar rules.  Returns a list of tuples (sym, prod) where sym is the symbol
    # and prod is the production where the symbol was used.
    # -----------------------------------------------------------------------------
    def undefined_symbols(self):
        result = []
        for p in self.Productions:
            if not p:
                continue

            for s in p.prod:
                if s not in self.Prodnames and s not in self.Terminals and s != 'error':
                    result.append((s, p))
        return result

    # -----------------------------------------------------------------------------
    # unused_terminals()
    #
    # Find all terminals that were defined, but not used by the grammar.  Returns
    # a list of all symbols.
    # -----------------------------------------------------------------------------
    def unused_terminals(self):
        unused_tok = []
        for s, v in self.Terminals.items():
            if s != 'error' and not v:
                unused_tok.append(s)
        return unused_tok

    # ------------------------------------------------------------------------------
    # unused_rules()
    #
    # Find all grammar rules that were defined, but not used (maybe not reachable)
    # Returns a list of productions.
    # ------------------------------------------------------------------------------
    def unused_rules(self):
        unused_prod = []
        for s, v in self.Nonterminals.items():
            if not v:
                p = self.Prodnames[s][0]
                unused_prod.append(p)
        return unused_prod

    # -----------------------------------------------------------------------------
    # unused_precedence()
    #
    # Returns a list of tuples (term, precedence) corresponding to precedence
    # rules that were never used by the grammar.  term is the name of the terminal
    # on which precedence was applied and precedence is a string such as 'left' or
    # 'right' corresponding to the type of precedence.
# ----------------------------------------------------------------------------- def unused_precedence(self): unused = [] for termname in self.Precedence: if not (termname in self.Terminals or termname in self.UsedPrecedence): unused.append((termname, self.Precedence[termname][0])) return unused # ------------------------------------------------------------------------- # _first() # # Compute the value of FIRST1(beta) where beta is a tuple of symbols. # # During execution of compute_first1, the result may be incomplete. # Afterward (e.g., when called from compute_follow()), it will be complete. # ------------------------------------------------------------------------- def _first(self, beta): # We are computing First(x1,x2,x3,...,xn) result = [] for x in beta: x_produces_empty = False # Add all the non- symbols of First[x] to the result. for f in self.First[x]: if f == '': x_produces_empty = True else: if f not in result: result.append(f) if x_produces_empty: # We have to consider the next x in beta, # i.e. stay in the loop. pass else: # We don't have to consider any further symbols in beta. break else: # There was no 'break' from the loop, # so x_produces_empty was true for all x in beta, # so beta produces empty as well. 
result.append('') return result # ------------------------------------------------------------------------- # compute_first() # # Compute the value of FIRST1(X) for all symbols # ------------------------------------------------------------------------- def compute_first(self): if self.First: return self.First # Terminals: for t in self.Terminals: self.First[t] = [t] self.First['$end'] = ['$end'] # Nonterminals: # Initialize to the empty set: for n in self.Nonterminals: self.First[n] = [] # Then propagate symbols until no change: while True: some_change = False for n in self.Nonterminals: for p in self.Prodnames[n]: for f in self._first(p.prod): if f not in self.First[n]: self.First[n].append(f) some_change = True if not some_change: break return self.First # --------------------------------------------------------------------- # compute_follow() # # Computes all of the follow sets for every non-terminal symbol. The # follow set is the set of all symbols that might follow a given # non-terminal. See the Dragon book, 2nd Ed. p. 189. # --------------------------------------------------------------------- def compute_follow(self, start=None): # If already computed, return the result if self.Follow: return self.Follow # If first sets not computed yet, do that first. if not self.First: self.compute_first() # Add '$end' to the follow list of the start symbol for k in self.Nonterminals: self.Follow[k] = [] if not start: start = self.Productions[1].name self.Follow[start] = ['$end'] while True: didadd = False for p in self.Productions[1:]: # Here is the production set for i, B in enumerate(p.prod): if B in self.Nonterminals: # Okay. 
We got a non-terminal in a production fst = self._first(p.prod[i+1:]) hasempty = False for f in fst: if f != '' and f not in self.Follow[B]: self.Follow[B].append(f) didadd = True if f == '': hasempty = True if hasempty or i == (len(p.prod)-1): # Add elements of follow(a) to follow(b) for f in self.Follow[p.name]: if f not in self.Follow[B]: self.Follow[B].append(f) didadd = True if not didadd: break return self.Follow # ----------------------------------------------------------------------------- # build_lritems() # # This function walks the list of productions and builds a complete set of the # LR items. The LR items are stored in two ways: First, they are uniquely # numbered and placed in the list _lritems. Second, a linked list of LR items # is built for each production. For example: # # E -> E PLUS E # # Creates the list # # [E -> . E PLUS E, E -> E . PLUS E, E -> E PLUS . E, E -> E PLUS E . ] # ----------------------------------------------------------------------------- def build_lritems(self): for p in self.Productions: lastlri = p i = 0 lr_items = [] while True: if i > len(p): lri = None else: lri = LRItem(p, i) # Precompute the list of productions immediately following try: lri.lr_after = self.Prodnames[lri.prod[i+1]] except (IndexError, KeyError): lri.lr_after = [] try: lri.lr_before = lri.prod[i-1] except IndexError: lri.lr_before = None lastlri.lr_next = lri if not lri: break lr_items.append(lri) lastlri = lri i += 1 p.lr_items = lr_items # ----------------------------------------------------------------------------- # == Class LRTable == # # This basic class represents a basic table of LR parsing information. # Methods for generating the tables are not defined here. They are defined # in the derived class LRGeneratedTable. 
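The compute_first() fixed-point iteration above can be sketched standalone on a toy grammar; the plain dicts and sets below are illustrative stand-ins for the Grammar class's state, with '' standing for an empty (epsilon) right-hand side:

```python
# FIRST sets by fixed-point iteration on a toy grammar:
#   E -> T PLUS E | T        T -> NUM
grammar = {'E': [['T', 'PLUS', 'E'], ['T']], 'T': [['NUM']]}
terminals = {'PLUS', 'NUM'}

# A terminal's FIRST set is just itself; nonterminals start empty.
first = {t: {t} for t in terminals}
first.update({n: set() for n in grammar})

changed = True
while changed:
    changed = False
    for n, prods in grammar.items():
        for prod in prods:
            for sym in prod:
                # FIRST(prod) includes FIRST(sym) minus epsilon
                new = first[sym] - {''}
                if not new <= first[n]:
                    first[n] |= new
                    changed = True
                if '' not in first[sym]:
                    break    # sym cannot derive epsilon; FIRST(prod) stops here
            else:
                # Every symbol on the right can derive epsilon, so n can too
                if '' not in first[n]:
                    first[n].add('')
                    changed = True
```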
# ----------------------------------------------------------------------------- class VersionError(YaccError): pass class LRTable(object): def __init__(self): self.lr_action = None self.lr_goto = None self.lr_productions = None self.lr_method = None def read_table(self, module): if isinstance(module, types.ModuleType): parsetab = module else: exec('import %s' % module) parsetab = sys.modules[module] if parsetab._tabversion != __tabversion__: raise VersionError('yacc table file version is out of date') self.lr_action = parsetab._lr_action self.lr_goto = parsetab._lr_goto self.lr_productions = [] for p in parsetab._lr_productions: self.lr_productions.append(MiniProduction(*p)) self.lr_method = parsetab._lr_method return parsetab._lr_signature def read_pickle(self, filename): try: import cPickle as pickle except ImportError: import pickle if not os.path.exists(filename): raise ImportError in_f = open(filename, 'rb') tabversion = pickle.load(in_f) if tabversion != __tabversion__: raise VersionError('yacc table file version is out of date') self.lr_method = pickle.load(in_f) signature = pickle.load(in_f) self.lr_action = pickle.load(in_f) self.lr_goto = pickle.load(in_f) productions = pickle.load(in_f) self.lr_productions = [] for p in productions: self.lr_productions.append(MiniProduction(*p)) in_f.close() return signature # Bind all production function names to callable objects in pdict def bind_callables(self, pdict): for p in self.lr_productions: p.bind(pdict) # ----------------------------------------------------------------------------- # === LR Generator === # # The following classes and functions are used to generate LR parsing tables on # a grammar. 
# ----------------------------------------------------------------------------- # ----------------------------------------------------------------------------- # digraph() # traverse() # # The following two functions are used to compute set valued functions # of the form: # # F(x) = F'(x) U U{F(y) | x R y} # # This is used to compute the values of Read() sets as well as FOLLOW sets # in LALR(1) generation. # # Inputs: X - An input set # R - A relation # FP - Set-valued function # ------------------------------------------------------------------------------ def digraph(X, R, FP): N = {} for x in X: N[x] = 0 stack = [] F = {} for x in X: if N[x] == 0: traverse(x, N, stack, F, X, R, FP) return F def traverse(x, N, stack, F, X, R, FP): stack.append(x) d = len(stack) N[x] = d F[x] = FP(x) # F(X) <- F'(x) rel = R(x) # Get y's related to x for y in rel: if N[y] == 0: traverse(y, N, stack, F, X, R, FP) N[x] = min(N[x], N[y]) for a in F.get(y, []): if a not in F[x]: F[x].append(a) if N[x] == d: N[stack[-1]] = MAXINT F[stack[-1]] = F[x] element = stack.pop() while element != x: N[stack[-1]] = MAXINT F[stack[-1]] = F[x] element = stack.pop() class LALRError(YaccError): pass # ----------------------------------------------------------------------------- # == LRGeneratedTable == # # This class implements the LR table generation algorithm. 
# There are no public methods except for write()
# -----------------------------------------------------------------------------

class LRGeneratedTable(LRTable):
    def __init__(self, grammar, method='LALR', log=None):
        if method not in ['SLR', 'LALR']:
            raise LALRError('Unsupported method %s' % method)

        self.grammar = grammar
        self.lr_method = method

        # Set up the logger
        if not log:
            log = NullLogger()
        self.log = log

        # Internal attributes
        self.lr_action = {}        # Action table
        self.lr_goto = {}          # Goto table
        self.lr_productions = grammar.Productions    # Copy of grammar Production array
        self.lr_goto_cache = {}    # Cache of computed gotos
        self.lr0_cidhash = {}      # Cache of closures
        self._add_count = 0        # Internal counter used to detect cycles

        # Diagnostic information filled in by the table generator
        self.sr_conflict = 0
        self.rr_conflict = 0
        self.conflicts = []        # List of conflicts

        self.sr_conflicts = []
        self.rr_conflicts = []

        # Build the tables
        self.grammar.build_lritems()
        self.grammar.compute_first()
        self.grammar.compute_follow()
        self.lr_parse_table()

    # Compute the LR(0) closure operation on I, where I is a set of LR(0) items.
    def lr0_closure(self, I):
        self._add_count += 1

        # Add everything in I to J
        J = I[:]
        didadd = True
        while didadd:
            didadd = False
            for j in J:
                for x in j.lr_after:
                    if getattr(x, 'lr0_added', 0) == self._add_count:
                        continue
                    # Add B --> .G to J
                    J.append(x.lr_next)
                    x.lr0_added = self._add_count
                    didadd = True
        return J

    # Compute the LR(0) goto function goto(I,X) where I is a set
    # of LR(0) items and X is a grammar symbol.  This function is written
    # in a way that guarantees uniqueness of the generated goto sets
    # (i.e. the same goto set will never be returned as two different Python
    # objects).  With uniqueness, we can later do fast set comparisons using
    # id(obj) instead of element-wise comparison.
def lr0_goto(self, I, x): # First we look for a previously cached entry g = self.lr_goto_cache.get((id(I), x)) if g: return g # Now we generate the goto set in a way that guarantees uniqueness # of the result s = self.lr_goto_cache.get(x) if not s: s = {} self.lr_goto_cache[x] = s gs = [] for p in I: n = p.lr_next if n and n.lr_before == x: s1 = s.get(id(n)) if not s1: s1 = {} s[id(n)] = s1 gs.append(n) s = s1 g = s.get('$end') if not g: if gs: g = self.lr0_closure(gs) s['$end'] = g else: s['$end'] = gs self.lr_goto_cache[(id(I), x)] = g return g # Compute the LR(0) sets of item function def lr0_items(self): C = [self.lr0_closure([self.grammar.Productions[0].lr_next])] i = 0 for I in C: self.lr0_cidhash[id(I)] = i i += 1 # Loop over the items in C and each grammar symbols i = 0 while i < len(C): I = C[i] i += 1 # Collect all of the symbols that could possibly be in the goto(I,X) sets asyms = {} for ii in I: for s in ii.usyms: asyms[s] = None for x in asyms: g = self.lr0_goto(I, x) if not g or id(g) in self.lr0_cidhash: continue self.lr0_cidhash[id(g)] = len(C) C.append(g) return C # ----------------------------------------------------------------------------- # ==== LALR(1) Parsing ==== # # LALR(1) parsing is almost exactly the same as SLR except that instead of # relying upon Follow() sets when performing reductions, a more selective # lookahead set that incorporates the state of the LR(0) machine is utilized. # Thus, we mainly just have to focus on calculating the lookahead sets. # # The method used here is due to DeRemer and Pennelo (1982). # # DeRemer, F. L., and T. J. Pennelo: "Efficient Computation of LALR(1) # Lookahead Sets", ACM Transactions on Programming Languages and Systems, # Vol. 4, No. 4, Oct. 1982, pp. 615-649 # # Further details can also be found in: # # J. Tremblay and P. Sorenson, "The Theory and Practice of Compiler Writing", # McGraw-Hill Book Company, (1985). 
    #
    # -----------------------------------------------------------------------------

    # -----------------------------------------------------------------------------
    # compute_nullable_nonterminals()
    #
    # Creates a dictionary containing all of the non-terminals that might produce
    # an empty production.
    # -----------------------------------------------------------------------------
    def compute_nullable_nonterminals(self):
        nullable = set()
        num_nullable = 0
        while True:
            for p in self.grammar.Productions[1:]:
                if p.len == 0:
                    nullable.add(p.name)
                    continue
                for t in p.prod:
                    if t not in nullable:
                        break
                else:
                    nullable.add(p.name)
            if len(nullable) == num_nullable:
                break
            num_nullable = len(nullable)
        return nullable

    # -----------------------------------------------------------------------------
    # find_nonterminal_transitions(C)
    #
    # Given a set of LR(0) items, this function finds all of the non-terminal
    # transitions.  These are transitions in which a dot appears immediately before
    # a non-terminal.  Returns a list of tuples of the form (state,N) where state
    # is the state number and N is the nonterminal symbol.
    #
    # The input C is the set of LR(0) items.
    # -----------------------------------------------------------------------------
    def find_nonterminal_transitions(self, C):
        trans = []
        for stateno, state in enumerate(C):
            for p in state:
                if p.lr_index < p.len - 1:
                    t = (stateno, p.prod[p.lr_index+1])
                    if t[1] in self.grammar.Nonterminals:
                        if t not in trans:
                            trans.append(t)
        return trans

    # -----------------------------------------------------------------------------
    # dr_relation()
    #
    # Computes the DR(p,A) relationships for non-terminal transitions.  The input
    # is a tuple (state,N) where state is a number and N is a nonterminal symbol.
    #
    # Returns a list of terminals.
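The nullable computation is a simple fixed-point iteration. A standalone sketch of the same idea, run on a hypothetical three-rule grammar (the rule names and symbols are invented for illustration, and plain `(name, rhs)` pairs stand in for PLY's Production objects):

```python
# Toy productions as (name, rhs) pairs; the start rule is omitted,
# mirroring the Productions[1:] slice in compute_nullable_nonterminals().
prods = [('S', ['A', 'x']), ('A', ['B']), ('B', [])]

def nullable_nonterminals(productions):
    nullable = set()
    num_nullable = 0
    while True:
        for name, rhs in productions:
            if len(rhs) == 0:
                nullable.add(name)        # empty production: trivially nullable
                continue
            for t in rhs:
                if t not in nullable:
                    break
            else:
                nullable.add(name)        # every RHS symbol is nullable
        if len(nullable) == num_nullable:
            break                         # no change: fixed point reached
        num_nullable = len(nullable)
    return nullable

result = nullable_nonterminals(prods)
```

Here `'B'` is added on the first pass, `'A'` on the second (its whole right-hand side is nullable), and `'S'` never is, because the terminal `'x'` blocks it.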
# ----------------------------------------------------------------------------- def dr_relation(self, C, trans, nullable): dr_set = {} state, N = trans terms = [] g = self.lr0_goto(C[state], N) for p in g: if p.lr_index < p.len - 1: a = p.prod[p.lr_index+1] if a in self.grammar.Terminals: if a not in terms: terms.append(a) # This extra bit is to handle the start state if state == 0 and N == self.grammar.Productions[0].prod[0]: terms.append('$end') return terms # ----------------------------------------------------------------------------- # reads_relation() # # Computes the READS() relation (p,A) READS (t,C). # ----------------------------------------------------------------------------- def reads_relation(self, C, trans, empty): # Look for empty transitions rel = [] state, N = trans g = self.lr0_goto(C[state], N) j = self.lr0_cidhash.get(id(g), -1) for p in g: if p.lr_index < p.len - 1: a = p.prod[p.lr_index + 1] if a in empty: rel.append((j, a)) return rel # ----------------------------------------------------------------------------- # compute_lookback_includes() # # Determines the lookback and includes relations # # LOOKBACK: # # This relation is determined by running the LR(0) state machine forward. # For example, starting with a production "N : . A B C", we run it forward # to obtain "N : A B C ." We then build a relationship between this final # state and the starting state. These relationships are stored in a dictionary # lookdict. # # INCLUDES: # # Computes the INCLUDE() relation (p,A) INCLUDES (p',B). # # This relation is used to determine non-terminal transitions that occur # inside of other non-terminal transition states. (p,A) INCLUDES (p', B) # if the following holds: # # B -> LAT, where T -> epsilon and p' -L-> p # # L is essentially a prefix (which may be empty), T is a suffix that must be # able to derive an empty string. State p' must lead to state p with the string L. 
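The "T derives epsilon" condition in the INCLUDES relation can be illustrated in isolation: for a hypothetical production B -> L A T1 T2, the transition on A contributes an INCLUDES edge only if every symbol after A is a nullable nonterminal. This helper is a sketch, not part of PLY's API:

```python
def suffix_nullable(prod, i, nullable, terminals):
    # True if prod[i:] (the suffix T) can derive the empty string:
    # every remaining symbol must be a nullable nonterminal.
    for sym in prod[i:]:
        if sym in terminals:
            return False
        if sym not in nullable:
            return False
    return True

# Hypothetical production B -> L A T1 T2, with T1 and T2 nullable
prod = ['L', 'A', 'T1', 'T2']
ok = suffix_nullable(prod, 2, nullable={'T1', 'T2'}, terminals=set())
bad = suffix_nullable(prod, 1, nullable={'T1', 'T2'}, terminals=set())
```

The suffix after position 2 (`T1 T2`) is nullable, so the INCLUDES condition holds there; the suffix after position 1 contains the non-nullable `'A'`, so it does not.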
# # ----------------------------------------------------------------------------- def compute_lookback_includes(self, C, trans, nullable): lookdict = {} # Dictionary of lookback relations includedict = {} # Dictionary of include relations # Make a dictionary of non-terminal transitions dtrans = {} for t in trans: dtrans[t] = 1 # Loop over all transitions and compute lookbacks and includes for state, N in trans: lookb = [] includes = [] for p in C[state]: if p.name != N: continue # Okay, we have a name match. We now follow the production all the way # through the state machine until we get the . on the right hand side lr_index = p.lr_index j = state while lr_index < p.len - 1: lr_index = lr_index + 1 t = p.prod[lr_index] # Check to see if this symbol and state are a non-terminal transition if (j, t) in dtrans: # Yes. Okay, there is some chance that this is an includes relation # the only way to know for certain is whether the rest of the # production derives empty li = lr_index + 1 while li < p.len: if p.prod[li] in self.grammar.Terminals: break # No forget it if p.prod[li] not in nullable: break li = li + 1 else: # Appears to be a relation between (j,t) and (state,N) includes.append((j, t)) g = self.lr0_goto(C[j], t) # Go to next set j = self.lr0_cidhash.get(id(g), -1) # Go to next state # When we get here, j is the final state, now we have to locate the production for r in C[j]: if r.name != p.name: continue if r.len != p.len: continue i = 0 # This look is comparing a production ". A B C" with "A B C ." while i < r.lr_index: if r.prod[i] != p.prod[i+1]: break i = i + 1 else: lookb.append((j, r)) for i in includes: if i not in includedict: includedict[i] = [] includedict[i].append((state, N)) lookdict[(state, N)] = lookb return lookdict, includedict # ----------------------------------------------------------------------------- # compute_read_sets() # # Given a set of LR(0) items, this function computes the read sets. 
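Read sets satisfy Read(p,A) = DR(p,A) U U{ Read(r,C) | (p,A) READS (r,C) }. A naive fixed-point sketch over hypothetical DR/READS tables shows the shape of the answer that `digraph()` computes far more efficiently:

```python
# Hypothetical direct-read and READS tables, keyed by (state, nonterminal)
DR = {('s0', 'E'): ['$end'], ('s1', 'T'): []}
READS = {('s0', 'E'): [], ('s1', 'T'): [('s0', 'E')]}

# Naive fixed point: propagate along READS edges until nothing changes
F = {x: list(v) for x, v in DR.items()}
changed = True
while changed:
    changed = False
    for x in F:
        for y in READS[x]:
            for a in F[y]:
                if a not in F[x]:
                    F[x].append(a)
                    changed = True
```

The transition `('s1', 'T')` has no direct reads of its own, but READS the `('s0', 'E')` transition and therefore inherits `'$end'`.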
    #
    # Inputs:  C        = Set of LR(0) items
    #          ntrans   = Set of nonterminal transitions
    #          nullable = Set of empty transitions
    #
    # Returns a set containing the read sets
    # -----------------------------------------------------------------------------
    def compute_read_sets(self, C, ntrans, nullable):
        FP = lambda x: self.dr_relation(C, x, nullable)
        R = lambda x: self.reads_relation(C, x, nullable)
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # compute_follow_sets()
    #
    # Given a set of LR(0) items, a set of non-terminal transitions, a readset,
    # and an include set, this function computes the follow sets:
    #
    #     Follow(p,A) = Read(p,A) U U {Follow(p',B) | (p,A) INCLUDES (p',B)}
    #
    # Inputs:
    #          ntrans   = Set of nonterminal transitions
    #          readsets = Readset (previously computed)
    #          inclsets = Include sets (previously computed)
    #
    # Returns a set containing the follow sets
    # -----------------------------------------------------------------------------
    def compute_follow_sets(self, ntrans, readsets, inclsets):
        FP = lambda x: readsets[x]
        R = lambda x: inclsets.get(x, [])
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # add_lookaheads()
    #
    # Attaches the lookahead symbols to grammar rules.
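A miniature run of the attachment step, with a stand-in production object (PLY's real Production class carries many more fields; `Prod`, the state numbers, and the token names here are invented for illustration):

```python
class Prod:
    # Minimal stand-in for a PLY production: only the lookaheads dict
    def __init__(self):
        self.lookaheads = {}

p = Prod()
# Transition (0, 'E') looks back to production p ending in state 2
lookbacks = {(0, 'E'): [(2, p)]}
followset = {(0, 'E'): ['PLUS', '$end']}

# Same merge loop as add_lookaheads(): copy follow symbols onto the
# production's per-state lookahead list, skipping duplicates.
for trans, lb in lookbacks.items():
    for state, prod in lb:
        if state not in prod.lookaheads:
            prod.lookaheads[state] = []
        for a in followset.get(trans, []):
            if a not in prod.lookaheads[state]:
                prod.lookaheads[state].append(a)
```

After the loop, `p` carries `['PLUS', '$end']` as its LALR lookaheads for state 2, which is what `lr_parse_table()` later consults when deciding reductions.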
# # Inputs: lookbacks - Set of lookback relations # followset - Computed follow set # # This function directly attaches the lookaheads to productions contained # in the lookbacks set # ----------------------------------------------------------------------------- def add_lookaheads(self, lookbacks, followset): for trans, lb in lookbacks.items(): # Loop over productions in lookback for state, p in lb: if state not in p.lookaheads: p.lookaheads[state] = [] f = followset.get(trans, []) for a in f: if a not in p.lookaheads[state]: p.lookaheads[state].append(a) # ----------------------------------------------------------------------------- # add_lalr_lookaheads() # # This function does all of the work of adding lookahead information for use # with LALR parsing # ----------------------------------------------------------------------------- def add_lalr_lookaheads(self, C): # Determine all of the nullable nonterminals nullable = self.compute_nullable_nonterminals() # Find all non-terminal transitions trans = self.find_nonterminal_transitions(C) # Compute read sets readsets = self.compute_read_sets(C, trans, nullable) # Compute lookback/includes relations lookd, included = self.compute_lookback_includes(C, trans, nullable) # Compute LALR FOLLOW sets followsets = self.compute_follow_sets(trans, readsets, included) # Add all of the lookaheads self.add_lookaheads(lookd, followsets) # ----------------------------------------------------------------------------- # lr_parse_table() # # This function constructs the parse tables for SLR or LALR # ----------------------------------------------------------------------------- def lr_parse_table(self): Productions = self.grammar.Productions Precedence = self.grammar.Precedence goto = self.lr_goto # Goto array action = self.lr_action # Action array log = self.log # Logger for output actionp = {} # Action production array (temporary) log.info('Parsing method: %s', self.lr_method) # Step 1: Construct C = { I0, I1, ... 
IN}, collection of LR(0) items # This determines the number of states C = self.lr0_items() if self.lr_method == 'LALR': self.add_lalr_lookaheads(C) # Build the parser table, state by state st = 0 for I in C: # Loop over each production in I actlist = [] # List of actions st_action = {} st_actionp = {} st_goto = {} log.info('') log.info('state %d', st) log.info('') for p in I: log.info(' (%d) %s', p.number, p) log.info('') for p in I: if p.len == p.lr_index + 1: if p.name == "S'": # Start symbol. Accept! st_action['$end'] = 0 st_actionp['$end'] = p else: # We are at the end of a production. Reduce! if self.lr_method == 'LALR': laheads = p.lookaheads[st] else: laheads = self.grammar.Follow[p.name] for a in laheads: actlist.append((a, p, 'reduce using rule %d (%s)' % (p.number, p))) r = st_action.get(a) if r is not None: # Whoa. Have a shift/reduce or reduce/reduce conflict if r > 0: # Need to decide on shift or reduce here # By default we favor shifting. Need to add # some precedence rules here. # Shift precedence comes from the token sprec, slevel = Precedence.get(a, ('right', 0)) # Reduce precedence comes from rule being reduced (p) rprec, rlevel = Productions[p.number].prec if (slevel < rlevel) or ((slevel == rlevel) and (rprec == 'left')): # We really need to reduce here. st_action[a] = -p.number st_actionp[a] = p if not slevel and not rlevel: log.info(' ! shift/reduce conflict for %s resolved as reduce', a) self.sr_conflicts.append((st, a, 'reduce')) Productions[p.number].reduced += 1 elif (slevel == rlevel) and (rprec == 'nonassoc'): st_action[a] = None else: # Hmmm. Guess we'll keep the shift if not rlevel: log.info(' ! shift/reduce conflict for %s resolved as shift', a) self.sr_conflicts.append((st, a, 'shift')) elif r < 0: # Reduce/reduce conflict. 
In this case, we favor the rule # that was defined first in the grammar file oldp = Productions[-r] pp = Productions[p.number] if oldp.line > pp.line: st_action[a] = -p.number st_actionp[a] = p chosenp, rejectp = pp, oldp Productions[p.number].reduced += 1 Productions[oldp.number].reduced -= 1 else: chosenp, rejectp = oldp, pp self.rr_conflicts.append((st, chosenp, rejectp)) log.info(' ! reduce/reduce conflict for %s resolved using rule %d (%s)', a, st_actionp[a].number, st_actionp[a]) else: raise LALRError('Unknown conflict in state %d' % st) else: st_action[a] = -p.number st_actionp[a] = p Productions[p.number].reduced += 1 else: i = p.lr_index a = p.prod[i+1] # Get symbol right after the "." if a in self.grammar.Terminals: g = self.lr0_goto(I, a) j = self.lr0_cidhash.get(id(g), -1) if j >= 0: # We are in a shift state actlist.append((a, p, 'shift and go to state %d' % j)) r = st_action.get(a) if r is not None: # Whoa have a shift/reduce or shift/shift conflict if r > 0: if r != j: raise LALRError('Shift/shift conflict in state %d' % st) elif r < 0: # Do a precedence check. # - if precedence of reduce rule is higher, we reduce. # - if precedence of reduce is same and left assoc, we reduce. # - otherwise we shift # Shift precedence comes from the token sprec, slevel = Precedence.get(a, ('right', 0)) # Reduce precedence comes from the rule that could have been reduced rprec, rlevel = Productions[st_actionp[a].number].prec if (slevel > rlevel) or ((slevel == rlevel) and (rprec == 'right')): # We decide to shift here... highest precedence to shift Productions[st_actionp[a].number].reduced -= 1 st_action[a] = j st_actionp[a] = p if not rlevel: log.info(' ! shift/reduce conflict for %s resolved as shift', a) self.sr_conflicts.append((st, a, 'shift')) elif (slevel == rlevel) and (rprec == 'nonassoc'): st_action[a] = None else: # Hmmm. Guess we'll keep the reduce if not slevel and not rlevel: log.info(' ! 
shift/reduce conflict for %s resolved as reduce', a) self.sr_conflicts.append((st, a, 'reduce')) else: raise LALRError('Unknown conflict in state %d' % st) else: st_action[a] = j st_actionp[a] = p # Print the actions associated with each terminal _actprint = {} for a, p, m in actlist: if a in st_action: if p is st_actionp[a]: log.info(' %-15s %s', a, m) _actprint[(a, m)] = 1 log.info('') # Print the actions that were not used. (debugging) not_used = 0 for a, p, m in actlist: if a in st_action: if p is not st_actionp[a]: if not (a, m) in _actprint: log.debug(' ! %-15s [ %s ]', a, m) not_used = 1 _actprint[(a, m)] = 1 if not_used: log.debug('') # Construct the goto table for this state nkeys = {} for ii in I: for s in ii.usyms: if s in self.grammar.Nonterminals: nkeys[s] = None for n in nkeys: g = self.lr0_goto(I, n) j = self.lr0_cidhash.get(id(g), -1) if j >= 0: st_goto[n] = j log.info(' %-30s shift and go to state %d', n, j) action[st] = st_action actionp[st] = st_actionp goto[st] = st_goto st += 1 # ----------------------------------------------------------------------------- # write() # # This function writes the LR parsing tables to a file # ----------------------------------------------------------------------------- def write_table(self, tabmodule, outputdir='', signature=''): if isinstance(tabmodule, types.ModuleType): raise IOError("Won't overwrite existing tabmodule") basemodulename = tabmodule.split('.')[-1] filename = os.path.join(outputdir, basemodulename) + '.py' try: f = open(filename, 'w') f.write(''' # %s # This file is automatically generated. Do not edit. 
_tabversion = %r _lr_method = %r _lr_signature = %r ''' % (os.path.basename(filename), __tabversion__, self.lr_method, signature)) # Change smaller to 0 to go back to original tables smaller = 1 # Factor out names to try and make smaller if smaller: items = {} for s, nd in self.lr_action.items(): for name, v in nd.items(): i = items.get(name) if not i: i = ([], []) items[name] = i i[0].append(s) i[1].append(v) f.write('\n_lr_action_items = {') for k, v in items.items(): f.write('%r:([' % k) for i in v[0]: f.write('%r,' % i) f.write('],[') for i in v[1]: f.write('%r,' % i) f.write(']),') f.write('}\n') f.write(''' _lr_action = {} for _k, _v in _lr_action_items.items(): for _x,_y in zip(_v[0],_v[1]): if not _x in _lr_action: _lr_action[_x] = {} _lr_action[_x][_k] = _y del _lr_action_items ''') else: f.write('\n_lr_action = { ') for k, v in self.lr_action.items(): f.write('(%r,%r):%r,' % (k[0], k[1], v)) f.write('}\n') if smaller: # Factor out names to try and make smaller items = {} for s, nd in self.lr_goto.items(): for name, v in nd.items(): i = items.get(name) if not i: i = ([], []) items[name] = i i[0].append(s) i[1].append(v) f.write('\n_lr_goto_items = {') for k, v in items.items(): f.write('%r:([' % k) for i in v[0]: f.write('%r,' % i) f.write('],[') for i in v[1]: f.write('%r,' % i) f.write(']),') f.write('}\n') f.write(''' _lr_goto = {} for _k, _v in _lr_goto_items.items(): for _x, _y in zip(_v[0], _v[1]): if not _x in _lr_goto: _lr_goto[_x] = {} _lr_goto[_x][_k] = _y del _lr_goto_items ''') else: f.write('\n_lr_goto = { ') for k, v in self.lr_goto.items(): f.write('(%r,%r):%r,' % (k[0], k[1], v)) f.write('}\n') # Write production table f.write('_lr_productions = [\n') for p in self.lr_productions: if p.func: f.write(' (%r,%r,%d,%r,%r,%d),\n' % (p.str, p.name, p.len, p.func, os.path.basename(p.file), p.line)) else: f.write(' (%r,%r,%d,None,None,None),\n' % (str(p), p.name, p.len)) f.write(']\n') f.close() except IOError as e: raise # 
----------------------------------------------------------------------------- # pickle_table() # # This function pickles the LR parsing tables to a supplied file object # ----------------------------------------------------------------------------- def pickle_table(self, filename, signature=''): try: import cPickle as pickle except ImportError: import pickle with open(filename, 'wb') as outf: pickle.dump(__tabversion__, outf, pickle_protocol) pickle.dump(self.lr_method, outf, pickle_protocol) pickle.dump(signature, outf, pickle_protocol) pickle.dump(self.lr_action, outf, pickle_protocol) pickle.dump(self.lr_goto, outf, pickle_protocol) outp = [] for p in self.lr_productions: if p.func: outp.append((p.str, p.name, p.len, p.func, os.path.basename(p.file), p.line)) else: outp.append((str(p), p.name, p.len, None, None, None)) pickle.dump(outp, outf, pickle_protocol) # ----------------------------------------------------------------------------- # === INTROSPECTION === # # The following functions and classes are used to implement the PLY # introspection features followed by the yacc() function itself. # ----------------------------------------------------------------------------- # ----------------------------------------------------------------------------- # get_caller_module_dict() # # This function returns a dictionary containing all of the symbols defined within # a caller further down the call stack. This is used to get the environment # associated with the yacc() call if none was provided. 
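The `smaller` encoding in `write_table()` groups actions by symbol, and the generated module inverts that grouping at import time. The round trip can be sketched with a hypothetical two-state action table (the state numbers and action values are made up):

```python
# A sparse {state: {symbol: action}} table (hypothetical values)
action = {0: {'NUM': 5}, 1: {'NUM': 7, 'PLUS': -3}}

# Factor by symbol, as write_table() does when smaller=1
items = {}
for s, nd in action.items():
    for name, v in nd.items():
        i = items.setdefault(name, ([], []))
        i[0].append(s)   # states in which this symbol has an action
        i[1].append(v)   # the corresponding action values

# Reconstruct, as the generated _lr_action loop does at import time
rebuilt = {}
for k, (states, vals) in items.items():
    for x, y in zip(states, vals):
        rebuilt.setdefault(x, {})[k] = y
```

Because grammars reuse the same terminal across many states, storing one `(states, values)` pair per symbol yields a noticeably smaller generated file than one dict entry per `(state, symbol)` cell.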
# ----------------------------------------------------------------------------- def get_caller_module_dict(levels): f = sys._getframe(levels) ldict = f.f_globals.copy() if f.f_globals != f.f_locals: ldict.update(f.f_locals) return ldict # ----------------------------------------------------------------------------- # parse_grammar() # # This takes a raw grammar rule string and parses it into production data # ----------------------------------------------------------------------------- def parse_grammar(doc, file, line): grammar = [] # Split the doc string into lines pstrings = doc.splitlines() lastp = None dline = line for ps in pstrings: dline += 1 p = ps.split() if not p: continue try: if p[0] == '|': # This is a continuation of a previous rule if not lastp: raise SyntaxError("%s:%d: Misplaced '|'" % (file, dline)) prodname = lastp syms = p[1:] else: prodname = p[0] lastp = prodname syms = p[2:] assign = p[1] if assign != ':' and assign != '::=': raise SyntaxError("%s:%d: Syntax error. Expected ':'" % (file, dline)) grammar.append((file, dline, prodname, syms)) except SyntaxError: raise except Exception: raise SyntaxError('%s:%d: Syntax error in rule %r' % (file, dline, ps.strip())) return grammar # ----------------------------------------------------------------------------- # ParserReflect() # # This class represents information extracted for building a parser including # start symbol, error function, tokens, precedence list, action functions, # etc. 
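The splitting logic in `parse_grammar()` can be exercised on a typical rule docstring. This is a simplified sketch that keeps only the `'|'` continuation handling and drops the error reporting:

```python
def split_rules(doc):
    # Sketch of parse_grammar(): first token names the rule, ':' or '::='
    # separates it from its symbols, and '|' continues the previous rule.
    rules, lastp = [], None
    for ps in doc.splitlines():
        p = ps.split()
        if not p:
            continue
        if p[0] == '|':
            rules.append((lastp, p[1:]))
        else:
            lastp = p[0]
            assert p[1] in (':', '::=')
            rules.append((lastp, p[2:]))
    return rules

doc = '''expression : expression PLUS term
                    | term'''
rules = split_rules(doc)
```

Both alternatives come back under the same `expression` name, which is why a misplaced leading `'|'` (no previous rule to continue) is a syntax error in the real function.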
# ----------------------------------------------------------------------------- class ParserReflect(object): def __init__(self, pdict, log=None): self.pdict = pdict self.start = None self.error_func = None self.tokens = None self.modules = set() self.grammar = [] self.error = False if log is None: self.log = PlyLogger(sys.stderr) else: self.log = log # Get all of the basic information def get_all(self): self.get_start() self.get_error_func() self.get_tokens() self.get_precedence() self.get_pfunctions() # Validate all of the information def validate_all(self): self.validate_start() self.validate_error_func() self.validate_tokens() self.validate_precedence() self.validate_pfunctions() self.validate_modules() return self.error # Compute a signature over the grammar def signature(self): parts = [] try: if self.start: parts.append(self.start) if self.prec: parts.append(''.join([''.join(p) for p in self.prec])) if self.tokens: parts.append(' '.join(self.tokens)) for f in self.pfuncs: if f[3]: parts.append(f[3]) except (TypeError, ValueError): pass return ''.join(parts) # ----------------------------------------------------------------------------- # validate_modules() # # This method checks to see if there are duplicated p_rulename() functions # in the parser module file. Without this function, it is really easy for # users to make mistakes by cutting and pasting code fragments (and it's a real # bugger to try and figure out why the resulting parser doesn't work). Therefore, # we just do a little regular expression pattern matching of def statements # to try and detect duplicates. 
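The duplicate check boils down to a regex scan over source lines, recording the first line on which each `p_` function was defined. Here it is run against a hypothetical module body:

```python
import re

# Same pattern the validator uses: match "def p_funcname("
fre = re.compile(r'\s*def\s+(p_[a-zA-Z_0-9]*)\(')
lines = ['def p_expr(p):', '    pass', 'def p_expr(p):', '    pass']

counthash = {}
duplicates = []
for linen, line in enumerate(lines, 1):
    m = fre.match(line)
    if m:
        name = m.group(1)
        prev = counthash.get(name)
        if not prev:
            counthash[name] = linen       # first definition: remember the line
        else:
            duplicates.append((name, prev, linen))  # redefinition detected
```

The second `def p_expr` is flagged against the first; in the real validator this becomes the "Function %s redefined" warning rather than an error, since Python itself allows the shadowing.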
# ----------------------------------------------------------------------------- def validate_modules(self): # Match def p_funcname( fre = re.compile(r'\s*def\s+(p_[a-zA-Z_0-9]*)\(') for module in self.modules: try: lines, linen = inspect.getsourcelines(module) except IOError: continue counthash = {} for linen, line in enumerate(lines): linen += 1 m = fre.match(line) if m: name = m.group(1) prev = counthash.get(name) if not prev: counthash[name] = linen else: filename = inspect.getsourcefile(module) self.log.warning('%s:%d: Function %s redefined. Previously defined on line %d', filename, linen, name, prev) # Get the start symbol def get_start(self): self.start = self.pdict.get('start') # Validate the start symbol def validate_start(self): if self.start is not None: if not isinstance(self.start, string_types): self.log.error("'start' must be a string") # Look for error handler def get_error_func(self): self.error_func = self.pdict.get('p_error') # Validate the error function def validate_error_func(self): if self.error_func: if isinstance(self.error_func, types.FunctionType): ismethod = 0 elif isinstance(self.error_func, types.MethodType): ismethod = 1 else: self.log.error("'p_error' defined, but is not a function or method") self.error = True return eline = self.error_func.__code__.co_firstlineno efile = self.error_func.__code__.co_filename module = inspect.getmodule(self.error_func) self.modules.add(module) argcount = self.error_func.__code__.co_argcount - ismethod if argcount != 1: self.log.error('%s:%d: p_error() requires 1 argument', efile, eline) self.error = True # Get the tokens map def get_tokens(self): tokens = self.pdict.get('tokens') if not tokens: self.log.error('No token list is defined') self.error = True return if not isinstance(tokens, (list, tuple)): self.log.error('tokens must be a list or tuple') self.error = True return if not tokens: self.log.error('tokens is empty') self.error = True return self.tokens = tokens # Validate the tokens def 
validate_tokens(self): # Validate the tokens. if 'error' in self.tokens: self.log.error("Illegal token name 'error'. Is a reserved word") self.error = True return terminals = set() for n in self.tokens: if n in terminals: self.log.warning('Token %r multiply defined', n) terminals.add(n) # Get the precedence map (if any) def get_precedence(self): self.prec = self.pdict.get('precedence') # Validate and parse the precedence map def validate_precedence(self): preclist = [] if self.prec: if not isinstance(self.prec, (list, tuple)): self.log.error('precedence must be a list or tuple') self.error = True return for level, p in enumerate(self.prec): if not isinstance(p, (list, tuple)): self.log.error('Bad precedence table') self.error = True return if len(p) < 2: self.log.error('Malformed precedence entry %s. Must be (assoc, term, ..., term)', p) self.error = True return assoc = p[0] if not isinstance(assoc, string_types): self.log.error('precedence associativity must be a string') self.error = True return for term in p[1:]: if not isinstance(term, string_types): self.log.error('precedence items must be strings') self.error = True return preclist.append((term, assoc, level+1)) self.preclist = preclist # Get all p_functions from the grammar def get_pfunctions(self): p_functions = [] for name, item in self.pdict.items(): if not name.startswith('p_') or name == 'p_error': continue if isinstance(item, (types.FunctionType, types.MethodType)): line = getattr(item, 'co_firstlineno', item.__code__.co_firstlineno) module = inspect.getmodule(item) p_functions.append((line, module, name, item.__doc__)) # Sort all of the actions by line number; make sure to stringify # modules to make them sortable, since `line` may not uniquely sort all # p functions p_functions.sort(key=lambda p_function: ( p_function[0], str(p_function[1]), p_function[2], p_function[3])) self.pfuncs = p_functions # Validate all of the p_functions def validate_pfunctions(self): grammar = [] # Check for non-empty 
symbols if len(self.pfuncs) == 0: self.log.error('no rules of the form p_rulename are defined') self.error = True return for line, module, name, doc in self.pfuncs: file = inspect.getsourcefile(module) func = self.pdict[name] if isinstance(func, types.MethodType): reqargs = 2 else: reqargs = 1 if func.__code__.co_argcount > reqargs: self.log.error('%s:%d: Rule %r has too many arguments', file, line, func.__name__) self.error = True elif func.__code__.co_argcount < reqargs: self.log.error('%s:%d: Rule %r requires an argument', file, line, func.__name__) self.error = True elif not func.__doc__: self.log.warning('%s:%d: No documentation string specified in function %r (ignored)', file, line, func.__name__) else: try: parsed_g = parse_grammar(doc, file, line) for g in parsed_g: grammar.append((name, g)) except SyntaxError as e: self.log.error(str(e)) self.error = True # Looks like a valid grammar rule # Mark the file in which defined. self.modules.add(module) # Secondary validation step that looks for p_ definitions that are not functions # or functions that look like they might be grammar rules. 
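The arity rule enforced above (plain rule functions take exactly one argument, the `p` object; bound methods take two because of `self`) can be checked on hypothetical rule functions:

```python
def rule_arity_ok(func, is_method=False):
    # Mirror of validate_pfunctions' argument-count check (a sketch)
    required = 2 if is_method else 1
    return func.__code__.co_argcount == required

def p_expression(p):
    'expression : expression PLUS term'

def p_bad(p, extra):
    'oops : TOO MANY ARGS'

ok = rule_arity_ok(p_expression)
bad = rule_arity_ok(p_bad)
```

`p_expression` passes; `p_bad` would trigger the "Rule %r has too many arguments" error before table generation even starts.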
for n, v in self.pdict.items(): if n.startswith('p_') and isinstance(v, (types.FunctionType, types.MethodType)): continue if n.startswith('t_'): continue if n.startswith('p_') and n != 'p_error': self.log.warning('%r not defined as a function', n) if ((isinstance(v, types.FunctionType) and v.__code__.co_argcount == 1) or (isinstance(v, types.MethodType) and v.__func__.__code__.co_argcount == 2)): if v.__doc__: try: doc = v.__doc__.split(' ') if doc[1] == ':': self.log.warning('%s:%d: Possible grammar rule %r defined without p_ prefix', v.__code__.co_filename, v.__code__.co_firstlineno, n) except IndexError: pass self.grammar = grammar # ----------------------------------------------------------------------------- # yacc(module) # # Build a parser # ----------------------------------------------------------------------------- def yacc(method='LALR', debug=yaccdebug, module=None, tabmodule=tab_module, start=None, check_recursion=True, optimize=False, write_tables=True, debugfile=debug_file, outputdir=None, debuglog=None, errorlog=None, picklefile=None): if tabmodule is None: tabmodule = tab_module # Reference to the parsing method of the last built parser global parse # If pickling is enabled, table files are not created if picklefile: write_tables = 0 if errorlog is None: errorlog = PlyLogger(sys.stderr) # Get the module dictionary used for the parser if module: _items = [(k, getattr(module, k)) for k in dir(module)] pdict = dict(_items) # If no __file__ attribute is available, try to obtain it from the __module__ instead if '__file__' not in pdict: pdict['__file__'] = sys.modules[pdict['__module__']].__file__ else: pdict = get_caller_module_dict(2) if outputdir is None: # If no output directory is set, the location of the output files # is determined according to the following rules: # - If tabmodule specifies a package, files go into that package directory # - Otherwise, files go in the same directory as the specifying module if isinstance(tabmodule, 
types.ModuleType): srcfile = tabmodule.__file__ else: if '.' not in tabmodule: srcfile = pdict['__file__'] else: parts = tabmodule.split('.') pkgname = '.'.join(parts[:-1]) exec('import %s' % pkgname) srcfile = getattr(sys.modules[pkgname], '__file__', '') outputdir = os.path.dirname(srcfile) # Determine if the module is package of a package or not. # If so, fix the tabmodule setting so that tables load correctly pkg = pdict.get('__package__') if pkg and isinstance(tabmodule, str): if '.' not in tabmodule: tabmodule = pkg + '.' + tabmodule # Set start symbol if it's specified directly using an argument if start is not None: pdict['start'] = start # Collect parser information from the dictionary pinfo = ParserReflect(pdict, log=errorlog) pinfo.get_all() if pinfo.error: raise YaccError('Unable to build parser') # Check signature against table files (if any) signature = pinfo.signature() # Read the tables try: lr = LRTable() if picklefile: read_signature = lr.read_pickle(picklefile) else: read_signature = lr.read_table(tabmodule) if optimize or (read_signature == signature): try: lr.bind_callables(pinfo.pdict) parser = LRParser(lr, pinfo.error_func) parse = parser.parse return parser except Exception as e: errorlog.warning('There was a problem loading the table file: %r', e) except VersionError as e: errorlog.warning(str(e)) except ImportError: pass if debuglog is None: if debug: try: debuglog = PlyLogger(open(os.path.join(outputdir, debugfile), 'w')) except IOError as e: errorlog.warning("Couldn't open %r. 
%s" % (debugfile, e)) debuglog = NullLogger() else: debuglog = NullLogger() debuglog.info('Created by PLY version %s (http://www.dabeaz.com/ply)', __version__) errors = False # Validate the parser information if pinfo.validate_all(): raise YaccError('Unable to build parser') if not pinfo.error_func: errorlog.warning('no p_error() function is defined') # Create a grammar object grammar = Grammar(pinfo.tokens) # Set precedence level for terminals for term, assoc, level in pinfo.preclist: try: grammar.set_precedence(term, assoc, level) except GrammarError as e: errorlog.warning('%s', e) # Add productions to the grammar for funcname, gram in pinfo.grammar: file, line, prodname, syms = gram try: grammar.add_production(prodname, syms, funcname, file, line) except GrammarError as e: errorlog.error('%s', e) errors = True # Set the grammar start symbols try: if start is None: grammar.set_start(pinfo.start) else: grammar.set_start(start) except GrammarError as e: errorlog.error(str(e)) errors = True if errors: raise YaccError('Unable to build parser') # Verify the grammar structure undefined_symbols = grammar.undefined_symbols() for sym, prod in undefined_symbols: errorlog.error('%s:%d: Symbol %r used, but not defined as a token or a rule', prod.file, prod.line, sym) errors = True unused_terminals = grammar.unused_terminals() if unused_terminals: debuglog.info('') debuglog.info('Unused terminals:') debuglog.info('') for term in unused_terminals: errorlog.warning('Token %r defined, but not used', term) debuglog.info(' %s', term) # Print out all productions to the debug log if debug: debuglog.info('') debuglog.info('Grammar') debuglog.info('') for n, p in enumerate(grammar.Productions): debuglog.info('Rule %-5d %s', n, p) # Find unused non-terminals unused_rules = grammar.unused_rules() for prod in unused_rules: errorlog.warning('%s:%d: Rule %r defined, but not used', prod.file, prod.line, prod.name) if len(unused_terminals) == 1: errorlog.warning('There is 1 unused token') if 
len(unused_terminals) > 1: errorlog.warning('There are %d unused tokens', len(unused_terminals)) if len(unused_rules) == 1: errorlog.warning('There is 1 unused rule') if len(unused_rules) > 1: errorlog.warning('There are %d unused rules', len(unused_rules)) if debug: debuglog.info('') debuglog.info('Terminals, with rules where they appear') debuglog.info('') terms = list(grammar.Terminals) terms.sort() for term in terms: debuglog.info('%-20s : %s', term, ' '.join([str(s) for s in grammar.Terminals[term]])) debuglog.info('') debuglog.info('Nonterminals, with rules where they appear') debuglog.info('') nonterms = list(grammar.Nonterminals) nonterms.sort() for nonterm in nonterms: debuglog.info('%-20s : %s', nonterm, ' '.join([str(s) for s in grammar.Nonterminals[nonterm]])) debuglog.info('') if check_recursion: unreachable = grammar.find_unreachable() for u in unreachable: errorlog.warning('Symbol %r is unreachable', u) infinite = grammar.infinite_cycles() for inf in infinite: errorlog.error('Infinite recursion detected for symbol %r', inf) errors = True unused_prec = grammar.unused_precedence() for term, assoc in unused_prec: errorlog.error('Precedence rule %r defined for unknown symbol %r', assoc, term) errors = True if errors: raise YaccError('Unable to build parser') # Run the LRGeneratedTable on the grammar if debug: errorlog.debug('Generating %s tables', method) lr = LRGeneratedTable(grammar, method, debuglog) if debug: num_sr = len(lr.sr_conflicts) # Report shift/reduce and reduce/reduce conflicts if num_sr == 1: errorlog.warning('1 shift/reduce conflict') elif num_sr > 1: errorlog.warning('%d shift/reduce conflicts', num_sr) num_rr = len(lr.rr_conflicts) if num_rr == 1: errorlog.warning('1 reduce/reduce conflict') elif num_rr > 1: errorlog.warning('%d reduce/reduce conflicts', num_rr) # Write out conflicts to the output file if debug and (lr.sr_conflicts or lr.rr_conflicts): debuglog.warning('') debuglog.warning('Conflicts:') debuglog.warning('') for state, 
tok, resolution in lr.sr_conflicts: debuglog.warning('shift/reduce conflict for %s in state %d resolved as %s', tok, state, resolution) already_reported = set() for state, rule, rejected in lr.rr_conflicts: if (state, id(rule), id(rejected)) in already_reported: continue debuglog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule) debuglog.warning('rejected rule (%s) in state %d', rejected, state) errorlog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule) errorlog.warning('rejected rule (%s) in state %d', rejected, state) already_reported.add((state, id(rule), id(rejected))) warned_never = [] for state, rule, rejected in lr.rr_conflicts: if not rejected.reduced and (rejected not in warned_never): debuglog.warning('Rule (%s) is never reduced', rejected) errorlog.warning('Rule (%s) is never reduced', rejected) warned_never.append(rejected) # Write the table file if requested if write_tables: try: lr.write_table(tabmodule, outputdir, signature) except IOError as e: errorlog.warning("Couldn't create %r. %s" % (tabmodule, e)) # Write a pickled version of the tables if picklefile: try: lr.pickle_table(picklefile, signature) except IOError as e: errorlog.warning("Couldn't create %r. %s" % (picklefile, e)) # Build the parser lr.bind_callables(pinfo.pdict) parser = LRParser(lr, pinfo.error_func) parse = parser.parse return parser pycparser-2.18/pycparser/ply/cpp.py0000664000175000017500000010104513070450745020152 0ustar elibeneliben00000000000000# ----------------------------------------------------------------------------- # cpp.py # # Author: David Beazley (http://www.dabeaz.com) # Copyright (C) 2017 # All rights reserved # # This module implements an ANSI-C style lexical preprocessor for PLY. 
# -----------------------------------------------------------------------------
from __future__ import generators

import sys

# Some Python 3 compatibility shims
if sys.version_info.major < 3:
    STRING_TYPES = (str, unicode)
else:
    STRING_TYPES = str
    xrange = range

# -----------------------------------------------------------------------------
# Default preprocessor lexer definitions.  These tokens are enough to get
# a basic preprocessor working.  Other modules may import these if they want
# -----------------------------------------------------------------------------

tokens = (
   'CPP_ID','CPP_INTEGER', 'CPP_FLOAT', 'CPP_STRING', 'CPP_CHAR', 'CPP_WS', 'CPP_COMMENT1', 'CPP_COMMENT2', 'CPP_POUND','CPP_DPOUND'
)

literals = "+-*/%|&~^<>=!?()[]{}.,;:\\\'\""

# Whitespace
def t_CPP_WS(t):
    r'\s+'
    t.lexer.lineno += t.value.count("\n")
    return t

t_CPP_POUND = r'\#'
t_CPP_DPOUND = r'\#\#'

# Identifier
t_CPP_ID = r'[A-Za-z_][\w_]*'

# Integer literal
def CPP_INTEGER(t):
    r'(((((0x)|(0X))[0-9a-fA-F]+)|(\d+))([uU][lL]|[lL][uU]|[uU]|[lL])?)'
    return t

t_CPP_INTEGER = CPP_INTEGER

# Floating literal
t_CPP_FLOAT = r'((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?'
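As a quick sanity check, the integer and floating literal patterns above can be exercised directly with the `re` module. PLY compiles token rules with `re.VERBOSE`, so the spaces inside the float pattern are ignored; this minimal sketch reproduces that behaviour outside of the lexer:

```python
import re

# Patterns copied verbatim from the preprocessor lexer definitions above.
CPP_INTEGER = r'(((((0x)|(0X))[0-9a-fA-F]+)|(\d+))([uU][lL]|[lL][uU]|[uU]|[lL])?)'
CPP_FLOAT = r'((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?'

int_re = re.compile(CPP_INTEGER)
flt_re = re.compile(CPP_FLOAT, re.VERBOSE)  # PLY uses VERBOSE mode for token rules

# Integer literals: decimal, hex, and suffixed forms all match.
for lit in ['42', '0x1F', '123uL', '077']:
    assert int_re.fullmatch(lit), lit

# Float literals: fractional part or exponent is required, suffix optional.
for lit in ['1.5', '3.25e-2f', '2e10L']:
    assert flt_re.fullmatch(lit), lit

# A plain integer is not a float under this grammar.
assert flt_re.fullmatch('42') is None
```

Note that the float rule requires either a fractional part or an exponent, which is why `42` matches only the integer pattern.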
# String literal
def t_CPP_STRING(t):
    r'\"([^\\\n]|(\\(.|\n)))*?\"'
    t.lexer.lineno += t.value.count("\n")
    return t

# Character constant 'c' or L'c'
def t_CPP_CHAR(t):
    r'(L)?\'([^\\\n]|(\\(.|\n)))*?\''
    t.lexer.lineno += t.value.count("\n")
    return t

# Comment
def t_CPP_COMMENT1(t):
    r'(/\*(.|\n)*?\*/)'
    ncr = t.value.count("\n")
    t.lexer.lineno += ncr
    # replace with one space or a number of '\n'
    t.type = 'CPP_WS'; t.value = '\n' * ncr if ncr else ' '
    return t

# Line comment
def t_CPP_COMMENT2(t):
    r'(//.*?(\n|$))'
    # replace with '\n'
    t.type = 'CPP_WS'; t.value = '\n'
    return t

def t_error(t):
    t.type = t.value[0]
    t.value = t.value[0]
    t.lexer.skip(1)
    return t

import re
import copy
import time
import os.path

# -----------------------------------------------------------------------------
# trigraph()
#
# Given an input string, this function replaces all trigraph sequences.
# The following mapping is used:
#
#     ??=    #
#     ??/    \
#     ??'    ^
#     ??(    [
#     ??)    ]
#     ??!    |
#     ??<    {
#     ??>    }
#     ??-    ~
# -----------------------------------------------------------------------------

_trigraph_pat = re.compile(r'''\?\?[=/\'\(\)\!<>\-]''')
_trigraph_rep = {
    '=':'#',
    '/':'\\',
    "'":'^',
    '(':'[',
    ')':']',
    '!':'|',
    '<':'{',
    '>':'}',
    '-':'~'
}

def trigraph(input):
    return _trigraph_pat.sub(lambda g: _trigraph_rep[g.group()[-1]],input)

# ------------------------------------------------------------------
# Macro object
#
# This object holds information about preprocessor macros
#
#    .name      - Macro name (string)
#    .value     - Macro value (a list of tokens)
#    .arglist   - List of argument names
#    .variadic  - Boolean indicating whether or not variadic macro
#    .vararg    - Name of the variadic parameter
#
# When a macro is created, the macro replacement token sequence is
# pre-scanned and used to create patch lists that are later used
# during macro expansion
# ------------------------------------------------------------------

class Macro(object):
    def __init__(self,name,value,arglist=None,variadic=False):
        self.name = name
self.value = value self.arglist = arglist self.variadic = variadic if variadic: self.vararg = arglist[-1] self.source = None # ------------------------------------------------------------------ # Preprocessor object # # Object representing a preprocessor. Contains macro definitions, # include directories, and other information # ------------------------------------------------------------------ class Preprocessor(object): def __init__(self,lexer=None): if lexer is None: lexer = lex.lexer self.lexer = lexer self.macros = { } self.path = [] self.temp_path = [] # Probe the lexer for selected tokens self.lexprobe() tm = time.localtime() self.define("__DATE__ \"%s\"" % time.strftime("%b %d %Y",tm)) self.define("__TIME__ \"%s\"" % time.strftime("%H:%M:%S",tm)) self.parser = None # ----------------------------------------------------------------------------- # tokenize() # # Utility function. Given a string of text, tokenize into a list of tokens # ----------------------------------------------------------------------------- def tokenize(self,text): tokens = [] self.lexer.input(text) while True: tok = self.lexer.token() if not tok: break tokens.append(tok) return tokens # --------------------------------------------------------------------- # error() # # Report a preprocessor error/warning of some kind # ---------------------------------------------------------------------- def error(self,file,line,msg): print("%s:%d %s" % (file,line,msg)) # ---------------------------------------------------------------------- # lexprobe() # # This method probes the preprocessor lexer object to discover # the token types of symbols that are important to the preprocessor. # If this works right, the preprocessor will simply "work" # with any suitable lexer regardless of how tokens have been named. 
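The trigraph() pass defined earlier is easy to verify in isolation. This self-contained re-implementation uses the same pattern and replacement table, purely for illustration:

```python
import re

# Same mapping as the module's _trigraph_rep table: each ??X sequence
# is keyed by its final character.
_trigraph_pat = re.compile(r'''\?\?[=/\'\(\)\!<>\-]''')
_trigraph_rep = {'=': '#', '/': '\\', "'": '^', '(': '[', ')': ']',
                 '!': '|', '<': '{', '>': '}', '-': '~'}

def trigraph(text):
    # Substitute every trigraph with its single-character equivalent.
    return _trigraph_pat.sub(lambda m: _trigraph_rep[m.group()[-1]], text)

print(trigraph("??=define ARR(x) x??(0??)"))  # -> #define ARR(x) x[0]
```

Running the translation before line grouping, as parsegen() does, means the rest of the preprocessor never has to know trigraphs exist.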
# ---------------------------------------------------------------------- def lexprobe(self): # Determine the token type for identifiers self.lexer.input("identifier") tok = self.lexer.token() if not tok or tok.value != "identifier": print("Couldn't determine identifier type") else: self.t_ID = tok.type # Determine the token type for integers self.lexer.input("12345") tok = self.lexer.token() if not tok or int(tok.value) != 12345: print("Couldn't determine integer type") else: self.t_INTEGER = tok.type self.t_INTEGER_TYPE = type(tok.value) # Determine the token type for strings enclosed in double quotes self.lexer.input("\"filename\"") tok = self.lexer.token() if not tok or tok.value != "\"filename\"": print("Couldn't determine string type") else: self.t_STRING = tok.type # Determine the token type for whitespace--if any self.lexer.input(" ") tok = self.lexer.token() if not tok or tok.value != " ": self.t_SPACE = None else: self.t_SPACE = tok.type # Determine the token type for newlines self.lexer.input("\n") tok = self.lexer.token() if not tok or tok.value != "\n": self.t_NEWLINE = None print("Couldn't determine token for newlines") else: self.t_NEWLINE = tok.type self.t_WS = (self.t_SPACE, self.t_NEWLINE) # Check for other characters used by the preprocessor chars = [ '<','>','#','##','\\','(',')',',','.'] for c in chars: self.lexer.input(c) tok = self.lexer.token() if not tok or tok.value != c: print("Unable to lex '%s' required for preprocessor" % c) # ---------------------------------------------------------------------- # add_path() # # Adds a search path to the preprocessor. # ---------------------------------------------------------------------- def add_path(self,path): self.path.append(path) # ---------------------------------------------------------------------- # group_lines() # # Given an input string, this function splits it into lines. Trailing whitespace # is removed. Any line ending with \ is grouped with the next line. 
# This function forms the lowest level of the preprocessor---grouping text into
    # a line-by-line format.
    # ----------------------------------------------------------------------
    def group_lines(self,input):
        lex = self.lexer.clone()
        lines = [x.rstrip() for x in input.splitlines()]
        for i in xrange(len(lines)):
            j = i+1
            while lines[i].endswith('\\') and (j < len(lines)):
                lines[i] = lines[i][:-1]+lines[j]
                lines[j] = ""
                j += 1

        input = "\n".join(lines)
        lex.input(input)
        lex.lineno = 1

        current_line = []
        while True:
            tok = lex.token()
            if not tok:
                break
            current_line.append(tok)
            if tok.type in self.t_WS and '\n' in tok.value:
                yield current_line
                current_line = []

        if current_line:
            yield current_line

    # ----------------------------------------------------------------------
    # tokenstrip()
    #
    # Remove leading/trailing whitespace tokens from a token list
    # ----------------------------------------------------------------------
    def tokenstrip(self,tokens):
        i = 0
        while i < len(tokens) and tokens[i].type in self.t_WS:
            i += 1
        del tokens[:i]
        i = len(tokens)-1
        while i >= 0 and tokens[i].type in self.t_WS:
            i -= 1
        del tokens[i+1:]
        return tokens

    # ----------------------------------------------------------------------
    # collect_args()
    #
    # Collects comma separated arguments from a list of tokens.  The arguments
    # must be enclosed in parenthesis.  Returns a tuple (tokencount, args, positions)
    # where tokencount is the number of tokens consumed, args is a list of arguments,
    # and positions is a list of integers containing the starting index of each
    # argument.  Each argument is represented by a list of tokens.
    #
    # When collecting arguments, leading and trailing whitespace is removed
    # from each argument.
    #
    # This function properly handles nested parenthesis and commas---these do not
    # define new arguments.
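The argument-collection behaviour described above (only top-level commas separate arguments; nested parentheses are kept intact) can be illustrated with a stripped-down sketch that works on single-character "tokens". The real method operates on lexer token objects and also reports argument start positions; this is only a model of the nesting logic:

```python
def collect_args(chars):
    """Split the chars of '(a, f(b, c), d)' into top-level arguments."""
    assert chars[0] == '(', "arguments must start with '('"
    args, current, nesting = [], [], 1
    for i, c in enumerate(chars[1:], start=1):
        if c == '(':
            nesting += 1
            current.append(c)
        elif c == ')':
            nesting -= 1
            if nesting == 0:
                # Closing paren of the argument list: emit the last argument.
                if current:
                    args.append(''.join(current).strip())
                return i + 1, args  # (chars consumed, argument list)
            current.append(c)
        elif c == ',' and nesting == 1:
            # Only commas at nesting level 1 separate arguments.
            args.append(''.join(current).strip())
            current = []
        else:
            current.append(c)
    raise SyntaxError("Missing ')' in macro arguments")

count, args = collect_args("(a, f(b, c), d)")
# args == ['a', 'f(b, c)', 'd']; the comma inside f(...) does not split.
```

Just as in the module's version, an unbalanced input falls off the end of the loop and is reported as a missing ')'.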
# ---------------------------------------------------------------------- def collect_args(self,tokenlist): args = [] positions = [] current_arg = [] nesting = 1 tokenlen = len(tokenlist) # Search for the opening '('. i = 0 while (i < tokenlen) and (tokenlist[i].type in self.t_WS): i += 1 if (i < tokenlen) and (tokenlist[i].value == '('): positions.append(i+1) else: self.error(self.source,tokenlist[0].lineno,"Missing '(' in macro arguments") return 0, [], [] i += 1 while i < tokenlen: t = tokenlist[i] if t.value == '(': current_arg.append(t) nesting += 1 elif t.value == ')': nesting -= 1 if nesting == 0: if current_arg: args.append(self.tokenstrip(current_arg)) positions.append(i) return i+1,args,positions current_arg.append(t) elif t.value == ',' and nesting == 1: args.append(self.tokenstrip(current_arg)) positions.append(i+1) current_arg = [] else: current_arg.append(t) i += 1 # Missing end argument self.error(self.source,tokenlist[-1].lineno,"Missing ')' in macro arguments") return 0, [],[] # ---------------------------------------------------------------------- # macro_prescan() # # Examine the macro value (token sequence) and identify patch points # This is used to speed up macro expansion later on---we'll know # right away where to apply patches to the value to form the expansion # ---------------------------------------------------------------------- def macro_prescan(self,macro): macro.patch = [] # Standard macro arguments macro.str_patch = [] # String conversion expansion macro.var_comma_patch = [] # Variadic macro comma patch i = 0 while i < len(macro.value): if macro.value[i].type == self.t_ID and macro.value[i].value in macro.arglist: argnum = macro.arglist.index(macro.value[i].value) # Conversion of argument to a string if i > 0 and macro.value[i-1].value == '#': macro.value[i] = copy.copy(macro.value[i]) macro.value[i].type = self.t_STRING del macro.value[i-1] macro.str_patch.append((argnum,i-1)) continue # Concatenation elif (i > 0 and 
macro.value[i-1].value == '##'): macro.patch.append(('c',argnum,i-1)) del macro.value[i-1] continue elif ((i+1) < len(macro.value) and macro.value[i+1].value == '##'): macro.patch.append(('c',argnum,i)) i += 1 continue # Standard expansion else: macro.patch.append(('e',argnum,i)) elif macro.value[i].value == '##': if macro.variadic and (i > 0) and (macro.value[i-1].value == ',') and \ ((i+1) < len(macro.value)) and (macro.value[i+1].type == self.t_ID) and \ (macro.value[i+1].value == macro.vararg): macro.var_comma_patch.append(i-1) i += 1 macro.patch.sort(key=lambda x: x[2],reverse=True) # ---------------------------------------------------------------------- # macro_expand_args() # # Given a Macro and list of arguments (each a token list), this method # returns an expanded version of a macro. The return value is a token sequence # representing the replacement macro tokens # ---------------------------------------------------------------------- def macro_expand_args(self,macro,args): # Make a copy of the macro token sequence rep = [copy.copy(_x) for _x in macro.value] # Make string expansion patches. These do not alter the length of the replacement sequence str_expansion = {} for argnum, i in macro.str_patch: if argnum not in str_expansion: str_expansion[argnum] = ('"%s"' % "".join([x.value for x in args[argnum]])).replace("\\","\\\\") rep[i] = copy.copy(rep[i]) rep[i].value = str_expansion[argnum] # Make the variadic macro comma patch. If the variadic macro argument is empty, we get rid comma_patch = False if macro.variadic and not args[-1]: for i in macro.var_comma_patch: rep[i] = None comma_patch = True # Make all other patches. The order of these matters. It is assumed that the patch list # has been sorted in reverse order of patch location since replacements will cause the # size of the replacement sequence to expand from the patch point. expanded = { } for ptype, argnum, i in macro.patch: # Concatenation. 
Argument is left unexpanded if ptype == 'c': rep[i:i+1] = args[argnum] # Normal expansion. Argument is macro expanded first elif ptype == 'e': if argnum not in expanded: expanded[argnum] = self.expand_macros(args[argnum]) rep[i:i+1] = expanded[argnum] # Get rid of removed comma if necessary if comma_patch: rep = [_i for _i in rep if _i] return rep # ---------------------------------------------------------------------- # expand_macros() # # Given a list of tokens, this function performs macro expansion. # The expanded argument is a dictionary that contains macros already # expanded. This is used to prevent infinite recursion. # ---------------------------------------------------------------------- def expand_macros(self,tokens,expanded=None): if expanded is None: expanded = {} i = 0 while i < len(tokens): t = tokens[i] if t.type == self.t_ID: if t.value in self.macros and t.value not in expanded: # Yes, we found a macro match expanded[t.value] = True m = self.macros[t.value] if not m.arglist: # A simple macro ex = self.expand_macros([copy.copy(_x) for _x in m.value],expanded) for e in ex: e.lineno = t.lineno tokens[i:i+1] = ex i += len(ex) else: # A macro with arguments j = i + 1 while j < len(tokens) and tokens[j].type in self.t_WS: j += 1 if tokens[j].value == '(': tokcount,args,positions = self.collect_args(tokens[j:]) if not m.variadic and len(args) != len(m.arglist): self.error(self.source,t.lineno,"Macro %s requires %d arguments" % (t.value,len(m.arglist))) i = j + tokcount elif m.variadic and len(args) < len(m.arglist)-1: if len(m.arglist) > 2: self.error(self.source,t.lineno,"Macro %s must have at least %d arguments" % (t.value, len(m.arglist)-1)) else: self.error(self.source,t.lineno,"Macro %s must have at least %d argument" % (t.value, len(m.arglist)-1)) i = j + tokcount else: if m.variadic: if len(args) == len(m.arglist)-1: args.append([]) else: args[len(m.arglist)-1] = tokens[j+positions[len(m.arglist)-1]:j+tokcount-1] del args[len(m.arglist):] # Get 
macro replacement text rep = self.macro_expand_args(m,args) rep = self.expand_macros(rep,expanded) for r in rep: r.lineno = t.lineno tokens[i:j+tokcount] = rep i += len(rep) del expanded[t.value] continue elif t.value == '__LINE__': t.type = self.t_INTEGER t.value = self.t_INTEGER_TYPE(t.lineno) i += 1 return tokens # ---------------------------------------------------------------------- # evalexpr() # # Evaluate an expression token sequence for the purposes of evaluating # integral expressions. # ---------------------------------------------------------------------- def evalexpr(self,tokens): # tokens = tokenize(line) # Search for defined macros i = 0 while i < len(tokens): if tokens[i].type == self.t_ID and tokens[i].value == 'defined': j = i + 1 needparen = False result = "0L" while j < len(tokens): if tokens[j].type in self.t_WS: j += 1 continue elif tokens[j].type == self.t_ID: if tokens[j].value in self.macros: result = "1L" else: result = "0L" if not needparen: break elif tokens[j].value == '(': needparen = True elif tokens[j].value == ')': break else: self.error(self.source,tokens[i].lineno,"Malformed defined()") j += 1 tokens[i].type = self.t_INTEGER tokens[i].value = self.t_INTEGER_TYPE(result) del tokens[i+1:j+1] i += 1 tokens = self.expand_macros(tokens) for i,t in enumerate(tokens): if t.type == self.t_ID: tokens[i] = copy.copy(t) tokens[i].type = self.t_INTEGER tokens[i].value = self.t_INTEGER_TYPE("0L") elif t.type == self.t_INTEGER: tokens[i] = copy.copy(t) # Strip off any trailing suffixes tokens[i].value = str(tokens[i].value) while tokens[i].value[-1] not in "0123456789abcdefABCDEF": tokens[i].value = tokens[i].value[:-1] expr = "".join([str(x.value) for x in tokens]) expr = expr.replace("&&"," and ") expr = expr.replace("||"," or ") expr = expr.replace("!"," not ") try: result = eval(expr) except Exception: self.error(self.source,tokens[0].lineno,"Couldn't evaluate expression") result = 0 return result # 
---------------------------------------------------------------------- # parsegen() # # Parse an input string/ # ---------------------------------------------------------------------- def parsegen(self,input,source=None): # Replace trigraph sequences t = trigraph(input) lines = self.group_lines(t) if not source: source = "" self.define("__FILE__ \"%s\"" % source) self.source = source chunk = [] enable = True iftrigger = False ifstack = [] for x in lines: for i,tok in enumerate(x): if tok.type not in self.t_WS: break if tok.value == '#': # Preprocessor directive # insert necessary whitespace instead of eaten tokens for tok in x: if tok.type in self.t_WS and '\n' in tok.value: chunk.append(tok) dirtokens = self.tokenstrip(x[i+1:]) if dirtokens: name = dirtokens[0].value args = self.tokenstrip(dirtokens[1:]) else: name = "" args = [] if name == 'define': if enable: for tok in self.expand_macros(chunk): yield tok chunk = [] self.define(args) elif name == 'include': if enable: for tok in self.expand_macros(chunk): yield tok chunk = [] oldfile = self.macros['__FILE__'] for tok in self.include(args): yield tok self.macros['__FILE__'] = oldfile self.source = source elif name == 'undef': if enable: for tok in self.expand_macros(chunk): yield tok chunk = [] self.undef(args) elif name == 'ifdef': ifstack.append((enable,iftrigger)) if enable: if not args[0].value in self.macros: enable = False iftrigger = False else: iftrigger = True elif name == 'ifndef': ifstack.append((enable,iftrigger)) if enable: if args[0].value in self.macros: enable = False iftrigger = False else: iftrigger = True elif name == 'if': ifstack.append((enable,iftrigger)) if enable: result = self.evalexpr(args) if not result: enable = False iftrigger = False else: iftrigger = True elif name == 'elif': if ifstack: if ifstack[-1][0]: # We only pay attention if outer "if" allows this if enable: # If already true, we flip enable False enable = False elif not iftrigger: # If False, but not triggered yet, we'll 
check expression result = self.evalexpr(args) if result: enable = True iftrigger = True else: self.error(self.source,dirtokens[0].lineno,"Misplaced #elif") elif name == 'else': if ifstack: if ifstack[-1][0]: if enable: enable = False elif not iftrigger: enable = True iftrigger = True else: self.error(self.source,dirtokens[0].lineno,"Misplaced #else") elif name == 'endif': if ifstack: enable,iftrigger = ifstack.pop() else: self.error(self.source,dirtokens[0].lineno,"Misplaced #endif") else: # Unknown preprocessor directive pass else: # Normal text if enable: chunk.extend(x) for tok in self.expand_macros(chunk): yield tok chunk = [] # ---------------------------------------------------------------------- # include() # # Implementation of file-inclusion # ---------------------------------------------------------------------- def include(self,tokens): # Try to extract the filename and then process an include file if not tokens: return if tokens: if tokens[0].value != '<' and tokens[0].type != self.t_STRING: tokens = self.expand_macros(tokens) if tokens[0].value == '<': # Include <...> i = 1 while i < len(tokens): if tokens[i].value == '>': break i += 1 else: print("Malformed #include <...>") return filename = "".join([x.value for x in tokens[1:i]]) path = self.path + [""] + self.temp_path elif tokens[0].type == self.t_STRING: filename = tokens[0].value[1:-1] path = self.temp_path + [""] + self.path else: print("Malformed #include statement") return for p in path: iname = os.path.join(p,filename) try: data = open(iname,"r").read() dname = os.path.dirname(iname) if dname: self.temp_path.insert(0,dname) for tok in self.parsegen(data,filename): yield tok if dname: del self.temp_path[0] break except IOError: pass else: print("Couldn't find '%s'" % filename) # ---------------------------------------------------------------------- # define() # # Define a new macro # ---------------------------------------------------------------------- def define(self,tokens): if 
isinstance(tokens,STRING_TYPES): tokens = self.tokenize(tokens) linetok = tokens try: name = linetok[0] if len(linetok) > 1: mtype = linetok[1] else: mtype = None if not mtype: m = Macro(name.value,[]) self.macros[name.value] = m elif mtype.type in self.t_WS: # A normal macro m = Macro(name.value,self.tokenstrip(linetok[2:])) self.macros[name.value] = m elif mtype.value == '(': # A macro with arguments tokcount, args, positions = self.collect_args(linetok[1:]) variadic = False for a in args: if variadic: print("No more arguments may follow a variadic argument") break astr = "".join([str(_i.value) for _i in a]) if astr == "...": variadic = True a[0].type = self.t_ID a[0].value = '__VA_ARGS__' variadic = True del a[1:] continue elif astr[-3:] == "..." and a[0].type == self.t_ID: variadic = True del a[1:] # If, for some reason, "." is part of the identifier, strip off the name for the purposes # of macro expansion if a[0].value[-3:] == '...': a[0].value = a[0].value[:-3] continue if len(a) > 1 or a[0].type != self.t_ID: print("Invalid macro argument") break else: mvalue = self.tokenstrip(linetok[1+tokcount:]) i = 0 while i < len(mvalue): if i+1 < len(mvalue): if mvalue[i].type in self.t_WS and mvalue[i+1].value == '##': del mvalue[i] continue elif mvalue[i].value == '##' and mvalue[i+1].type in self.t_WS: del mvalue[i+1] i += 1 m = Macro(name.value,mvalue,[x[0].value for x in args],variadic) self.macro_prescan(m) self.macros[name.value] = m else: print("Bad macro definition") except LookupError: print("Bad macro definition") # ---------------------------------------------------------------------- # undef() # # Undefine a macro # ---------------------------------------------------------------------- def undef(self,tokens): id = tokens[0].value try: del self.macros[id] except LookupError: pass # ---------------------------------------------------------------------- # parse() # # Parse input text. 
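define() above dispatches on the token that follows the macro name: nothing at all defines a bare flag, whitespace introduces an object-like replacement, and an immediately adjacent '(' introduces a function-like parameter list. A toy classifier (a hypothetical helper, not part of the module) showing just that dispatch:

```python
import re

def classify_define(body):
    """Classify the body of a #define directive, mimicking define()'s dispatch."""
    m = re.match(r'([A-Za-z_]\w*)(.*)', body, re.S)
    if not m:
        return 'bad'
    name, rest = m.groups()
    if rest.startswith('('):
        return 'function-like'   # e.g. #define SQR(x) ((x)*(x))
    return 'object-like'         # e.g. #define N 10, or a bare #define FLAG

print(classify_define('SQR(x) ((x)*(x))'))  # function-like
print(classify_define('N 10'))              # object-like
print(classify_define('FLAG'))              # object-like
```

This mirrors why `#define SQR (x)` (note the space) defines an object-like macro whose value happens to start with '(' rather than a one-parameter macro.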
# ---------------------------------------------------------------------- def parse(self,input,source=None,ignore={}): self.ignore = ignore self.parser = self.parsegen(input,source) # ---------------------------------------------------------------------- # token() # # Method to return individual tokens # ---------------------------------------------------------------------- def token(self): try: while True: tok = next(self.parser) if tok.type not in self.ignore: return tok except StopIteration: self.parser = None return None if __name__ == '__main__': import ply.lex as lex lexer = lex.lex() # Run a preprocessor import sys f = open(sys.argv[1]) input = f.read() p = Preprocessor(lexer) p.parse(input,sys.argv[1]) while True: tok = p.token() if not tok: break print(p.source, tok) pycparser-2.18/pycparser/ply/__init__.py0000664000175000017500000000014613045001676021124 0ustar elibeneliben00000000000000# PLY package # Author: David Beazley (dave@dabeaz.com) __version__ = '3.9' __all__ = ['lex','yacc'] pycparser-2.18/pycparser/_c_ast.cfg0000664000175000017500000001015413045001366020114 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: _c_ast.cfg # # Defines the AST Node classes used in pycparser. # # Each entry is a Node sub-class name, listing the attributes # and child nodes of the class: # * - a child node # ** - a sequence of child nodes # - an attribute # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- # ArrayDecl is a nested declaration of an array with the given type. # dim: the dimension (for example, constant 42) # dim_quals: list of dimension qualifiers, to support C99's allowing 'const' # and 'static' within the array dimension in function declarations. ArrayDecl: [type*, dim*, dim_quals] ArrayRef: [name*, subscript*] # op: =, +=, /= etc. 
# Assignment: [op, lvalue*, rvalue*] BinaryOp: [op, left*, right*] Break: [] Case: [expr*, stmts**] Cast: [to_type*, expr*] # Compound statement in C99 is a list of block items (declarations or # statements). # Compound: [block_items**] # Compound literal (anonymous aggregate) for C99. # (type-name) {initializer_list} # type: the typename # init: InitList for the initializer list # CompoundLiteral: [type*, init*] # type: int, char, float, etc. see CLexer for constant token types # Constant: [type, value] Continue: [] # name: the variable being declared # quals: list of qualifiers (const, volatile) # funcspec: list function specifiers (i.e. inline in C99) # storage: list of storage specifiers (extern, register, etc.) # type: declaration type (probably nested with all the modifiers) # init: initialization value, or None # bitsize: bit field size, or None # Decl: [name, quals, storage, funcspec, type*, init*, bitsize*] DeclList: [decls**] Default: [stmts**] DoWhile: [cond*, stmt*] # Represents the ellipsis (...) parameter in a function # declaration # EllipsisParam: [] # An empty statement (a semicolon ';' on its own) # EmptyStatement: [] # Enumeration type specifier # name: an optional ID # values: an EnumeratorList # Enum: [name, values*] # A name/value pair for enumeration values # Enumerator: [name, value*] # A list of enumerators # EnumeratorList: [enumerators**] # A list of expressions separated by the comma operator. # ExprList: [exprs**] # This is the top of the AST, representing a single C file (a # translation unit in K&R jargon). It contains a list of # "external-declaration"s, which is either declarations (Decl), # Typedef or function definitions (FuncDef). # FileAST: [ext**] # for (init; cond; next) stmt # For: [init*, cond*, next*, stmt*] # name: Id # args: ExprList # FuncCall: [name*, args*] # type (args) # FuncDecl: [args*, type*] # Function definition: a declarator for the function name and # a body, which is a compound statement. 
# There's an optional list of parameter declarations for old # K&R-style definitions # FuncDef: [decl*, param_decls**, body*] Goto: [name] ID: [name] # Holder for types that are a simple identifier (e.g. the built # ins void, char etc. and typedef-defined types) # IdentifierType: [names] If: [cond*, iftrue*, iffalse*] # An initialization list used for compound literals. # InitList: [exprs**] Label: [name, stmt*] # A named initializer for C99. # The name of a NamedInitializer is a sequence of Nodes, because # names can be hierarchical and contain constant expressions. # NamedInitializer: [name**, expr*] # a list of comma separated function parameter declarations # ParamList: [params**] PtrDecl: [quals, type*] Return: [expr*] # name: struct tag name # decls: declaration of members # Struct: [name, decls**] # type: . or -> # name.field or name->field # StructRef: [name*, type, field*] Switch: [cond*, stmt*] # cond ? iftrue : iffalse # TernaryOp: [cond*, iftrue*, iffalse*] # A base type declaration # TypeDecl: [declname, quals, type*] # A typedef declaration. # Very similar to Decl, but without some attributes # Typedef: [name, quals, storage, type*] Typename: [name, quals, type*] UnaryOp: [op, expr*] # name: union tag name # decls: declaration of members # Union: [name, decls**] While: [cond*, stmt*] Pragma: [string] pycparser-2.18/pycparser/c_parser.py0000664000175000017500000020144413060530765020366 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: c_parser.py # # CParser class: Parser and AST builder for the C language # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #------------------------------------------------------------------------------ import re from .ply import yacc from . 
import c_ast from .c_lexer import CLexer from .plyparser import PLYParser, Coord, ParseError, parameterized, template from .ast_transforms import fix_switch_cases @template class CParser(PLYParser): def __init__( self, lex_optimize=True, lexer=CLexer, lextab='pycparser.lextab', yacc_optimize=True, yacctab='pycparser.yacctab', yacc_debug=False, taboutputdir=''): """ Create a new CParser. Some arguments for controlling the debug/optimization level of the parser are provided. The defaults are tuned for release/performance mode. The simple rules for using them are: *) When tweaking CParser/CLexer, set these to False *) When releasing a stable parser, set to True lex_optimize: Set to False when you're modifying the lexer. Otherwise, changes in the lexer won't be used, if some lextab.py file exists. When releasing with a stable lexer, set to True to save the re-generation of the lexer table on each run. lexer: Set this parameter to define the lexer to use if you're not using the default CLexer. lextab: Points to the lex table that's used for optimized mode. Only if you're modifying the lexer and want some tests to avoid re-generating the table, make this point to a local lex table file (that's been earlier generated with lex_optimize=True) yacc_optimize: Set to False when you're modifying the parser. Otherwise, changes in the parser won't be used, if some parsetab.py file exists. When releasing with a stable parser, set to True to save the re-generation of the parser table on each run. yacctab: Points to the yacc table that's used for optimized mode. Only if you're modifying the parser, make this point to a local yacc table file yacc_debug: Generate a parser.out file that explains how yacc built the parsing table from the grammar. taboutputdir: Set this parameter to control the location of generated lextab and yacctab files. 
""" self.clex = lexer( error_func=self._lex_error_func, on_lbrace_func=self._lex_on_lbrace_func, on_rbrace_func=self._lex_on_rbrace_func, type_lookup_func=self._lex_type_lookup_func) self.clex.build( optimize=lex_optimize, lextab=lextab, outputdir=taboutputdir) self.tokens = self.clex.tokens rules_with_opt = [ 'abstract_declarator', 'assignment_expression', 'declaration_list', 'declaration_specifiers_no_type', 'designation', 'expression', 'identifier_list', 'init_declarator_list', 'id_init_declarator_list', 'initializer_list', 'parameter_type_list', 'block_item_list', 'type_qualifier_list', 'struct_declarator_list' ] for rule in rules_with_opt: self._create_opt_rule(rule) self.cparser = yacc.yacc( module=self, start='translation_unit_or_empty', debug=yacc_debug, optimize=yacc_optimize, tabmodule=yacctab, outputdir=taboutputdir) # Stack of scopes for keeping track of symbols. _scope_stack[-1] is # the current (topmost) scope. Each scope is a dictionary that # specifies whether a name is a type. If _scope_stack[n][name] is # True, 'name' is currently a type in the scope. If it's False, # 'name' is used in the scope but not as a type (for instance, if we # saw: int name; # If 'name' is not a key in _scope_stack[n] then 'name' was not defined # in this scope at all. self._scope_stack = [dict()] # Keeps track of the last token given to yacc (the lookahead token) self._last_yielded_token = None def parse(self, text, filename='', debuglevel=0): """ Parses C code and returns an AST. 
text: A string containing the C source code filename: Name of the file being parsed (for meaningful error messages) debuglevel: Debug level to yacc """ self.clex.filename = filename self.clex.reset_lineno() self._scope_stack = [dict()] self._last_yielded_token = None return self.cparser.parse( input=text, lexer=self.clex, debug=debuglevel) ######################-- PRIVATE --###################### def _push_scope(self): self._scope_stack.append(dict()) def _pop_scope(self): assert len(self._scope_stack) > 1 self._scope_stack.pop() def _add_typedef_name(self, name, coord): """ Add a new typedef name (ie a TYPEID) to the current scope """ if not self._scope_stack[-1].get(name, True): self._parse_error( "Typedef %r previously declared as non-typedef " "in this scope" % name, coord) self._scope_stack[-1][name] = True def _add_identifier(self, name, coord): """ Add a new object, function, or enum member name (ie an ID) to the current scope """ if self._scope_stack[-1].get(name, False): self._parse_error( "Non-typedef %r previously declared as typedef " "in this scope" % name, coord) self._scope_stack[-1][name] = False def _is_type_in_scope(self, name): """ Is *name* a typedef-name in the current scope? """ for scope in reversed(self._scope_stack): # If name is an identifier in this scope it shadows typedefs in # higher scopes. in_scope = scope.get(name) if in_scope is not None: return in_scope return False def _lex_error_func(self, msg, line, column): self._parse_error(msg, self._coord(line, column)) def _lex_on_lbrace_func(self): self._push_scope() def _lex_on_rbrace_func(self): self._pop_scope() def _lex_type_lookup_func(self, name): """ Looks up types that were previously defined with typedef. Passed to the lexer for recognizing identifiers that are types. """ is_type = self._is_type_in_scope(name) return is_type def _get_yacc_lookahead_token(self): """ We need access to yacc's lookahead token in certain cases. 
This is the last token yacc requested from the lexer, so we ask the lexer. """ return self.clex.last_token # To understand what's going on here, read sections A.8.5 and # A.8.6 of K&R2 very carefully. # # A C type consists of a basic type declaration, with a list # of modifiers. For example: # # int *c[5]; # # The basic declaration here is 'int c', and the pointer and # the array are the modifiers. # # Basic declarations are represented by TypeDecl (from module c_ast) and the # modifiers are FuncDecl, PtrDecl and ArrayDecl. # # The standard states that whenever a new modifier is parsed, it should be # added to the end of the list of modifiers. For example: # # K&R2 A.8.6.2: Array Declarators # # In a declaration T D where D has the form # D1 [constant-expression-opt] # and the type of the identifier in the declaration T D1 is # "type-modifier T", the type of the # identifier of D is "type-modifier array of T" # # This is what this method does. The declarator it receives # can be a list of declarators ending with TypeDecl. It # tacks the modifier to the end of this list, just before # the TypeDecl. # # Additionally, the modifier may be a list itself. This is # useful for pointers, that can come as a chain from the rule # p_pointer. In this case, the whole modifier list is spliced # into the new location. def _type_modify_decl(self, decl, modifier): """ Tacks a type modifier on a declarator, and returns the modified declarator. Note: the declarator and modifier may be modified """ #~ print '****' #~ decl.show(offset=3) #~ modifier.show(offset=3) #~ print '****' modifier_head = modifier modifier_tail = modifier # The modifier may be a nested list. Reach its tail. # while modifier_tail.type: modifier_tail = modifier_tail.type # If the decl is a basic type, just tack the modifier onto # it # if isinstance(decl, c_ast.TypeDecl): modifier_tail.type = decl return modifier else: # Otherwise, the decl is a list of modifiers. 
Reach # its tail and splice the modifier onto the tail, # pointing to the underlying basic type. # decl_tail = decl while not isinstance(decl_tail.type, c_ast.TypeDecl): decl_tail = decl_tail.type modifier_tail.type = decl_tail.type decl_tail.type = modifier_head return decl # Due to the order in which declarators are constructed, # they have to be fixed in order to look like a normal AST. # # When a declaration arrives from syntax construction, it has # these problems: # * The innermost TypeDecl has no type (because the basic # type is only known at the uppermost declaration level) # * The declaration has no variable name, since that is saved # in the innermost TypeDecl # * The typename of the declaration is a list of type # specifiers, and not a node. Here, basic identifier types # should be separated from more complex types like enums # and structs. # # This method fixes these problems. # def _fix_decl_name_type(self, decl, typename): """ Fixes a declaration. Modifies decl. """ # Reach the underlying basic type # type = decl while not isinstance(type, c_ast.TypeDecl): type = type.type decl.name = type.declname type.quals = decl.quals # The typename is a list of types. If any type in this # list isn't an IdentifierType, it must be the only # type in the list (it's illegal to declare "int enum ..") # If all the types are basic, they're collected in the # IdentifierType holder. # for tn in typename: if not isinstance(tn, c_ast.IdentifierType): if len(typename) > 1: self._parse_error( "Invalid multiple types specified", tn.coord) else: type.type = tn return decl if not typename: # Functions default to returning int # if not isinstance(decl.type, c_ast.FuncDecl): self._parse_error( "Missing type in declaration", decl.coord) type.type = c_ast.IdentifierType( ['int'], coord=decl.coord) else: # At this point, we know that typename is a list of IdentifierType # nodes. Concatenate all the names into a single list. 
# type.type = c_ast.IdentifierType( [name for id in typename for name in id.names], coord=typename[0].coord) return decl def _add_declaration_specifier(self, declspec, newspec, kind, append=False): """ Declaration specifiers are represented by a dictionary with the entries: * qual: a list of type qualifiers * storage: a list of storage type qualifiers * type: a list of type specifiers * function: a list of function specifiers This method is given a declaration specifier, and a new specifier of a given kind. If `append` is True, the new specifier is added to the end of the specifiers list, otherwise it's added at the beginning. Returns the declaration specifier, with the new specifier incorporated. """ spec = declspec or dict(qual=[], storage=[], type=[], function=[]) if append: spec[kind].append(newspec) else: spec[kind].insert(0, newspec) return spec def _build_declarations(self, spec, decls, typedef_namespace=False): """ Builds a list of declarations all sharing the given specifiers. If typedef_namespace is true, each declared name is added to the "typedef namespace", which also includes objects, functions, and enum constants. """ is_typedef = 'typedef' in spec['storage'] declarations = [] # Bit-fields are allowed to be unnamed. # if decls[0].get('bitsize') is not None: pass # When redeclaring typedef names as identifiers in inner scopes, a # problem can occur where the identifier gets grouped into # spec['type'], leaving decl as None. This can only occur for the # first declarator. # elif decls[0]['decl'] is None: if len(spec['type']) < 2 or len(spec['type'][-1].names) != 1 or \ not self._is_type_in_scope(spec['type'][-1].names[0]): coord = '?' 
for t in spec['type']: if hasattr(t, 'coord'): coord = t.coord break self._parse_error('Invalid declaration', coord) # Make this look as if it came from "direct_declarator:ID" decls[0]['decl'] = c_ast.TypeDecl( declname=spec['type'][-1].names[0], type=None, quals=None, coord=spec['type'][-1].coord) # Remove the "new" type's name from the end of spec['type'] del spec['type'][-1] # A similar problem can occur where the declaration ends up looking # like an abstract declarator. Give it a name if this is the case. # elif not isinstance(decls[0]['decl'], (c_ast.Struct, c_ast.Union, c_ast.IdentifierType)): decls_0_tail = decls[0]['decl'] while not isinstance(decls_0_tail, c_ast.TypeDecl): decls_0_tail = decls_0_tail.type if decls_0_tail.declname is None: decls_0_tail.declname = spec['type'][-1].names[0] del spec['type'][-1] for decl in decls: assert decl['decl'] is not None if is_typedef: declaration = c_ast.Typedef( name=None, quals=spec['qual'], storage=spec['storage'], type=decl['decl'], coord=decl['decl'].coord) else: declaration = c_ast.Decl( name=None, quals=spec['qual'], storage=spec['storage'], funcspec=spec['function'], type=decl['decl'], init=decl.get('init'), bitsize=decl.get('bitsize'), coord=decl['decl'].coord) if isinstance(declaration.type, (c_ast.Struct, c_ast.Union, c_ast.IdentifierType)): fixed_decl = declaration else: fixed_decl = self._fix_decl_name_type(declaration, spec['type']) # Add the type name defined by typedef to a # symbol table (for usage in the lexer) # if typedef_namespace: if is_typedef: self._add_typedef_name(fixed_decl.name, fixed_decl.coord) else: self._add_identifier(fixed_decl.name, fixed_decl.coord) declarations.append(fixed_decl) return declarations def _build_function_definition(self, spec, decl, param_decls, body): """ Builds a function definition. 
""" assert 'typedef' not in spec['storage'] declaration = self._build_declarations( spec=spec, decls=[dict(decl=decl, init=None)], typedef_namespace=True)[0] return c_ast.FuncDef( decl=declaration, param_decls=param_decls, body=body, coord=decl.coord) def _select_struct_union_class(self, token): """ Given a token (either STRUCT or UNION), selects the appropriate AST class. """ if token == 'struct': return c_ast.Struct else: return c_ast.Union ## ## Precedence and associativity of operators ## precedence = ( ('left', 'LOR'), ('left', 'LAND'), ('left', 'OR'), ('left', 'XOR'), ('left', 'AND'), ('left', 'EQ', 'NE'), ('left', 'GT', 'GE', 'LT', 'LE'), ('left', 'RSHIFT', 'LSHIFT'), ('left', 'PLUS', 'MINUS'), ('left', 'TIMES', 'DIVIDE', 'MOD') ) ## ## Grammar productions ## Implementation of the BNF defined in K&R2 A.13 ## # Wrapper around a translation unit, to allow for empty input. # Not strictly part of the C99 Grammar, but useful in practice. # def p_translation_unit_or_empty(self, p): """ translation_unit_or_empty : translation_unit | empty """ if p[1] is None: p[0] = c_ast.FileAST([]) else: p[0] = c_ast.FileAST(p[1]) def p_translation_unit_1(self, p): """ translation_unit : external_declaration """ # Note: external_declaration is already a list # p[0] = p[1] def p_translation_unit_2(self, p): """ translation_unit : translation_unit external_declaration """ if p[2] is not None: p[1].extend(p[2]) p[0] = p[1] # Declarations always come as lists (because they can be # several in one line), so we wrap the function definition # into a list as well, to make the return value of # external_declaration homogenous. 
#
    def p_external_declaration_1(self, p):
        """ external_declaration    : function_definition
        """
        p[0] = [p[1]]

    def p_external_declaration_2(self, p):
        """ external_declaration    : declaration
        """
        p[0] = p[1]

    def p_external_declaration_3(self, p):
        """ external_declaration    : pp_directive
                                    | pppragma_directive
        """
        p[0] = [p[1]]

    def p_external_declaration_4(self, p):
        """ external_declaration    : SEMI
        """
        p[0] = None

    def p_pp_directive(self, p):
        """ pp_directive  : PPHASH
        """
        self._parse_error('Directives not supported yet',
                          self._token_coord(p, 1))

    def p_pppragma_directive(self, p):
        """ pppragma_directive      : PPPRAGMA
                                    | PPPRAGMA PPPRAGMASTR
        """
        if len(p) == 3:
            p[0] = c_ast.Pragma(p[2], self._token_coord(p, 2))
        else:
            p[0] = c_ast.Pragma("", self._token_coord(p, 1))

    # In function definitions, the declarator can be followed by
    # a declaration list, for old "K&R style" function definitions.
    #
    def p_function_definition_1(self, p):
        """ function_definition : id_declarator declaration_list_opt compound_statement
        """
        # no declaration specifiers - 'int' becomes the default type
        spec = dict(
            qual=[],
            storage=[],
            type=[c_ast.IdentifierType(['int'],
                                       coord=self._token_coord(p, 1))],
            function=[])

        p[0] = self._build_function_definition(
            spec=spec,
            decl=p[1],
            param_decls=p[2],
            body=p[3])

    def p_function_definition_2(self, p):
        """ function_definition : declaration_specifiers id_declarator declaration_list_opt compound_statement
        """
        spec = p[1]

        p[0] = self._build_function_definition(
            spec=spec,
            decl=p[2],
            param_decls=p[3],
            body=p[4])

    def p_statement(self, p):
        """ statement   : labeled_statement
                        | expression_statement
                        | compound_statement
                        | selection_statement
                        | iteration_statement
                        | jump_statement
                        | pppragma_directive
        """
        p[0] = p[1]

    # In C, declarations can come several in a line:
    #   int x, *px, romulo = 5;
    #
    # However, for the AST, we will split them to separate Decl
    # nodes.
    #
    # This rule splits its declarations and always returns a list
    # of Decl nodes, even if it's one element long.
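The splitting described in the comment above can be sketched with plain dictionaries (a simplified model, not pycparser's real `_build_declarations`): one declaration line shares its specifiers across every declarator, producing one record per declared name for `int x, *px, romulo = 5;`.

```python
# Shared declaration specifiers for the whole line.
spec = {"storage": [], "qual": [], "type": ["int"]}

# One (name, initializer) pair per declarator on the line.
declarators = [("x", None), ("px", None), ("romulo", 5)]

# Split into separate per-name declarations, all sharing spec's type.
decls = [{"name": name, "type": spec["type"], "init": init}
         for name, init in declarators]

assert [d["name"] for d in decls] == ["x", "px", "romulo"]
assert decls[2] == {"name": "romulo", "type": ["int"], "init": 5}
```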
# def p_decl_body(self, p): """ decl_body : declaration_specifiers init_declarator_list_opt | declaration_specifiers_no_type id_init_declarator_list_opt """ spec = p[1] # p[2] (init_declarator_list_opt) is either a list or None # if p[2] is None: # By the standard, you must have at least one declarator unless # declaring a structure tag, a union tag, or the members of an # enumeration. # ty = spec['type'] s_u_or_e = (c_ast.Struct, c_ast.Union, c_ast.Enum) if len(ty) == 1 and isinstance(ty[0], s_u_or_e): decls = [c_ast.Decl( name=None, quals=spec['qual'], storage=spec['storage'], funcspec=spec['function'], type=ty[0], init=None, bitsize=None, coord=ty[0].coord)] # However, this case can also occur on redeclared identifiers in # an inner scope. The trouble is that the redeclared type's name # gets grouped into declaration_specifiers; _build_declarations # compensates for this. # else: decls = self._build_declarations( spec=spec, decls=[dict(decl=None, init=None)], typedef_namespace=True) else: decls = self._build_declarations( spec=spec, decls=p[2], typedef_namespace=True) p[0] = decls # The declaration has been split to a decl_body sub-rule and # SEMI, because having them in a single rule created a problem # for defining typedefs. # # If a typedef line was directly followed by a line using the # type defined with the typedef, the type would not be # recognized. This is because to reduce the declaration rule, # the parser's lookahead asked for the token after SEMI, which # was the type from the next line, and the lexer had no chance # to see the updated type symbol table. # # Splitting solves this problem, because after seeing SEMI, # the parser reduces decl_body, which actually adds the new # type into the table to be seen by the lexer before the next # line is reached. 
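The lookahead hazard explained above can be modeled in a few lines (a toy, not pycparser internals): the lexer classifies a name as TYPEID or ID by consulting a symbol table, so the table must be updated when `decl_body` is reduced, before the token that uses the new type is fetched.

```python
type_table = set()

def classify(name):
    # The lexer's view: a name is a TYPEID only if typedef'd already.
    return "TYPEID" if name in type_table else "ID"

# "typedef int T;" - reducing decl_body records T before the parser
# asks the lexer for the token after SEMI...
type_table.add("T")

# ...so when the next line "T x;" is tokenized, T is seen as a type.
assert classify("T") == "TYPEID"
assert classify("x") == "ID"
```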
def p_declaration(self, p): """ declaration : decl_body SEMI """ p[0] = p[1] # Since each declaration is a list of declarations, this # rule will combine all the declarations and return a single # list # def p_declaration_list(self, p): """ declaration_list : declaration | declaration_list declaration """ p[0] = p[1] if len(p) == 2 else p[1] + p[2] # To know when declaration-specifiers end and declarators begin, # we require declaration-specifiers to have at least one # type-specifier, and disallow typedef-names after we've seen any # type-specifier. These are both required by the spec. # def p_declaration_specifiers_no_type_1(self, p): """ declaration_specifiers_no_type : type_qualifier declaration_specifiers_no_type_opt """ p[0] = self._add_declaration_specifier(p[2], p[1], 'qual') def p_declaration_specifiers_no_type_2(self, p): """ declaration_specifiers_no_type : storage_class_specifier declaration_specifiers_no_type_opt """ p[0] = self._add_declaration_specifier(p[2], p[1], 'storage') def p_declaration_specifiers_no_type_3(self, p): """ declaration_specifiers_no_type : function_specifier declaration_specifiers_no_type_opt """ p[0] = self._add_declaration_specifier(p[2], p[1], 'function') def p_declaration_specifiers_1(self, p): """ declaration_specifiers : declaration_specifiers type_qualifier """ p[0] = self._add_declaration_specifier(p[1], p[2], 'qual', append=True) def p_declaration_specifiers_2(self, p): """ declaration_specifiers : declaration_specifiers storage_class_specifier """ p[0] = self._add_declaration_specifier(p[1], p[2], 'storage', append=True) def p_declaration_specifiers_3(self, p): """ declaration_specifiers : declaration_specifiers function_specifier """ p[0] = self._add_declaration_specifier(p[1], p[2], 'function', append=True) def p_declaration_specifiers_4(self, p): """ declaration_specifiers : declaration_specifiers type_specifier_no_typeid """ p[0] = self._add_declaration_specifier(p[1], p[2], 'type', append=True) def 
p_declaration_specifiers_5(self, p): """ declaration_specifiers : type_specifier """ p[0] = self._add_declaration_specifier(None, p[1], 'type') def p_declaration_specifiers_6(self, p): """ declaration_specifiers : declaration_specifiers_no_type type_specifier """ p[0] = self._add_declaration_specifier(p[1], p[2], 'type', append=True) def p_storage_class_specifier(self, p): """ storage_class_specifier : AUTO | REGISTER | STATIC | EXTERN | TYPEDEF """ p[0] = p[1] def p_function_specifier(self, p): """ function_specifier : INLINE """ p[0] = p[1] def p_type_specifier_no_typeid(self, p): """ type_specifier_no_typeid : VOID | _BOOL | CHAR | SHORT | INT | LONG | FLOAT | DOUBLE | _COMPLEX | SIGNED | UNSIGNED | __INT128 """ p[0] = c_ast.IdentifierType([p[1]], coord=self._token_coord(p, 1)) def p_type_specifier(self, p): """ type_specifier : typedef_name | enum_specifier | struct_or_union_specifier | type_specifier_no_typeid """ p[0] = p[1] def p_type_qualifier(self, p): """ type_qualifier : CONST | RESTRICT | VOLATILE """ p[0] = p[1] def p_init_declarator_list(self, p): """ init_declarator_list : init_declarator | init_declarator_list COMMA init_declarator """ p[0] = p[1] + [p[3]] if len(p) == 4 else [p[1]] # Returns a {decl= : init=} dictionary # If there's no initializer, uses None # def p_init_declarator(self, p): """ init_declarator : declarator | declarator EQUALS initializer """ p[0] = dict(decl=p[1], init=(p[3] if len(p) > 2 else None)) def p_id_init_declarator_list(self, p): """ id_init_declarator_list : id_init_declarator | id_init_declarator_list COMMA init_declarator """ p[0] = p[1] + [p[3]] if len(p) == 4 else [p[1]] def p_id_init_declarator(self, p): """ id_init_declarator : id_declarator | id_declarator EQUALS initializer """ p[0] = dict(decl=p[1], init=(p[3] if len(p) > 2 else None)) # Require at least one type specifier in a specifier-qualifier-list # def p_specifier_qualifier_list_1(self, p): """ specifier_qualifier_list : specifier_qualifier_list 
type_specifier_no_typeid """ p[0] = self._add_declaration_specifier(p[1], p[2], 'type', append=True) def p_specifier_qualifier_list_2(self, p): """ specifier_qualifier_list : specifier_qualifier_list type_qualifier """ p[0] = self._add_declaration_specifier(p[1], p[2], 'qual', append=True) def p_specifier_qualifier_list_3(self, p): """ specifier_qualifier_list : type_specifier """ p[0] = self._add_declaration_specifier(None, p[1], 'type') def p_specifier_qualifier_list_4(self, p): """ specifier_qualifier_list : type_qualifier_list type_specifier """ spec = dict(qual=p[1], storage=[], type=[], function=[]) p[0] = self._add_declaration_specifier(spec, p[2], 'type', append=True) # TYPEID is allowed here (and in other struct/enum related tag names), because # struct/enum tags reside in their own namespace and can be named the same as types # def p_struct_or_union_specifier_1(self, p): """ struct_or_union_specifier : struct_or_union ID | struct_or_union TYPEID """ klass = self._select_struct_union_class(p[1]) p[0] = klass( name=p[2], decls=None, coord=self._token_coord(p, 2)) def p_struct_or_union_specifier_2(self, p): """ struct_or_union_specifier : struct_or_union brace_open struct_declaration_list brace_close """ klass = self._select_struct_union_class(p[1]) p[0] = klass( name=None, decls=p[3], coord=self._token_coord(p, 2)) def p_struct_or_union_specifier_3(self, p): """ struct_or_union_specifier : struct_or_union ID brace_open struct_declaration_list brace_close | struct_or_union TYPEID brace_open struct_declaration_list brace_close """ klass = self._select_struct_union_class(p[1]) p[0] = klass( name=p[2], decls=p[4], coord=self._token_coord(p, 2)) def p_struct_or_union(self, p): """ struct_or_union : STRUCT | UNION """ p[0] = p[1] # Combine all declarations into a single list # def p_struct_declaration_list(self, p): """ struct_declaration_list : struct_declaration | struct_declaration_list struct_declaration """ if len(p) == 2: p[0] = p[1] or [] else: p[0] = p[1] 
+ (p[2] or []) def p_struct_declaration_1(self, p): """ struct_declaration : specifier_qualifier_list struct_declarator_list_opt SEMI """ spec = p[1] assert 'typedef' not in spec['storage'] if p[2] is not None: decls = self._build_declarations( spec=spec, decls=p[2]) elif len(spec['type']) == 1: # Anonymous struct/union, gcc extension, C1x feature. # Although the standard only allows structs/unions here, I see no # reason to disallow other types since some compilers have typedefs # here, and pycparser isn't about rejecting all invalid code. # node = spec['type'][0] if isinstance(node, c_ast.Node): decl_type = node else: decl_type = c_ast.IdentifierType(node) decls = self._build_declarations( spec=spec, decls=[dict(decl=decl_type)]) else: # Structure/union members can have the same names as typedefs. # The trouble is that the member's name gets grouped into # specifier_qualifier_list; _build_declarations compensates. # decls = self._build_declarations( spec=spec, decls=[dict(decl=None, init=None)]) p[0] = decls def p_struct_declaration_2(self, p): """ struct_declaration : SEMI """ p[0] = None def p_struct_declarator_list(self, p): """ struct_declarator_list : struct_declarator | struct_declarator_list COMMA struct_declarator """ p[0] = p[1] + [p[3]] if len(p) == 4 else [p[1]] # struct_declarator passes up a dict with the keys: decl (for # the underlying declarator) and bitsize (for the bitsize) # def p_struct_declarator_1(self, p): """ struct_declarator : declarator """ p[0] = {'decl': p[1], 'bitsize': None} def p_struct_declarator_2(self, p): """ struct_declarator : declarator COLON constant_expression | COLON constant_expression """ if len(p) > 3: p[0] = {'decl': p[1], 'bitsize': p[3]} else: p[0] = {'decl': c_ast.TypeDecl(None, None, None), 'bitsize': p[2]} def p_enum_specifier_1(self, p): """ enum_specifier : ENUM ID | ENUM TYPEID """ p[0] = c_ast.Enum(p[2], None, self._token_coord(p, 1)) def p_enum_specifier_2(self, p): """ enum_specifier : ENUM brace_open 
enumerator_list brace_close """ p[0] = c_ast.Enum(None, p[3], self._token_coord(p, 1)) def p_enum_specifier_3(self, p): """ enum_specifier : ENUM ID brace_open enumerator_list brace_close | ENUM TYPEID brace_open enumerator_list brace_close """ p[0] = c_ast.Enum(p[2], p[4], self._token_coord(p, 1)) def p_enumerator_list(self, p): """ enumerator_list : enumerator | enumerator_list COMMA | enumerator_list COMMA enumerator """ if len(p) == 2: p[0] = c_ast.EnumeratorList([p[1]], p[1].coord) elif len(p) == 3: p[0] = p[1] else: p[1].enumerators.append(p[3]) p[0] = p[1] def p_enumerator(self, p): """ enumerator : ID | ID EQUALS constant_expression """ if len(p) == 2: enumerator = c_ast.Enumerator( p[1], None, self._token_coord(p, 1)) else: enumerator = c_ast.Enumerator( p[1], p[3], self._token_coord(p, 1)) self._add_identifier(enumerator.name, enumerator.coord) p[0] = enumerator def p_declarator(self, p): """ declarator : id_declarator | typeid_declarator """ p[0] = p[1] @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_xxx_declarator_1(self, p): """ xxx_declarator : direct_xxx_declarator """ p[0] = p[1] @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_xxx_declarator_2(self, p): """ xxx_declarator : pointer direct_xxx_declarator """ p[0] = self._type_modify_decl(p[2], p[1]) @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_direct_xxx_declarator_1(self, p): """ direct_xxx_declarator : yyy """ p[0] = c_ast.TypeDecl( declname=p[1], type=None, quals=None, coord=self._token_coord(p, 1)) @parameterized(('id', 'ID'), ('typeid', 'TYPEID')) def p_direct_xxx_declarator_2(self, p): """ direct_xxx_declarator : LPAREN xxx_declarator RPAREN """ p[0] = p[2] @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_direct_xxx_declarator_3(self, p): """ direct_xxx_declarator : direct_xxx_declarator LBRACKET type_qualifier_list_opt 
assignment_expression_opt RBRACKET """ quals = (p[3] if len(p) > 5 else []) or [] # Accept dimension qualifiers # Per C99 6.7.5.3 p7 arr = c_ast.ArrayDecl( type=None, dim=p[4] if len(p) > 5 else p[3], dim_quals=quals, coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=arr) @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_direct_xxx_declarator_4(self, p): """ direct_xxx_declarator : direct_xxx_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET | direct_xxx_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET """ # Using slice notation for PLY objects doesn't work in Python 3 for the # version of PLY embedded with pycparser; see PLY Google Code issue 30. # Work around that here by listing the two elements separately. listed_quals = [item if isinstance(item, list) else [item] for item in [p[3],p[4]]] dim_quals = [qual for sublist in listed_quals for qual in sublist if qual is not None] arr = c_ast.ArrayDecl( type=None, dim=p[5], dim_quals=dim_quals, coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=arr) # Special for VLAs # @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_direct_xxx_declarator_5(self, p): """ direct_xxx_declarator : direct_xxx_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET """ arr = c_ast.ArrayDecl( type=None, dim=c_ast.ID(p[4], self._token_coord(p, 4)), dim_quals=p[3] if p[3] != None else [], coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=arr) @parameterized(('id', 'ID'), ('typeid', 'TYPEID'), ('typeid_noparen', 'TYPEID')) def p_direct_xxx_declarator_6(self, p): """ direct_xxx_declarator : direct_xxx_declarator LPAREN parameter_type_list RPAREN | direct_xxx_declarator LPAREN identifier_list_opt RPAREN """ func = c_ast.FuncDecl( args=p[3], type=None, coord=p[1].coord) # To see why _get_yacc_lookahead_token is needed, consider: # typedef char TT; # void 
foo(int TT) { TT = 10; } # Outside the function, TT is a typedef, but inside (starting and # ending with the braces) it's a parameter. The trouble begins with # yacc's lookahead token. We don't know if we're declaring or # defining a function until we see LBRACE, but if we wait for yacc to # trigger a rule on that token, then TT will have already been read # and incorrectly interpreted as TYPEID. We need to add the # parameters to the scope the moment the lexer sees LBRACE. # if self._get_yacc_lookahead_token().type == "LBRACE": if func.args is not None: for param in func.args.params: if isinstance(param, c_ast.EllipsisParam): break self._add_identifier(param.name, param.coord) p[0] = self._type_modify_decl(decl=p[1], modifier=func) def p_pointer(self, p): """ pointer : TIMES type_qualifier_list_opt | TIMES type_qualifier_list_opt pointer """ coord = self._token_coord(p, 1) # Pointer decls nest from inside out. This is important when different # levels have different qualifiers. For example: # # char * const * p; # # Means "pointer to const pointer to char" # # While: # # char ** const p; # # Means "const pointer to pointer to char" # # So when we construct PtrDecl nestings, the leftmost pointer goes in # as the most nested type. 
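The nesting rule in the comment above can be illustrated with dictionaries standing in for PtrDecl nodes (a sketch, not the real splice code below): in `char * const * p`, the leftmost pointer, the one carrying `const`, ends up as the most deeply nested node, giving "pointer to const pointer to char".

```python
def ptr(quals, inner):
    # A stand-in for c_ast.PtrDecl(quals=..., type=...).
    return {"kind": "ptr", "quals": quals, "type": inner}

char = {"kind": "char"}
# Declarator read left to right: '* const' first, then the plain '*';
# the leftmost pointer is spliced in as the innermost node.
decl = ptr([], ptr(["const"], char))

assert decl["quals"] == []                  # outer pointer: p itself
assert decl["type"]["quals"] == ["const"]   # inner: the const pointer
assert decl["type"]["type"]["kind"] == "char"
```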
nested_type = c_ast.PtrDecl(quals=p[2] or [], type=None, coord=coord) if len(p) > 3: tail_type = p[3] while tail_type.type is not None: tail_type = tail_type.type tail_type.type = nested_type p[0] = p[3] else: p[0] = nested_type def p_type_qualifier_list(self, p): """ type_qualifier_list : type_qualifier | type_qualifier_list type_qualifier """ p[0] = [p[1]] if len(p) == 2 else p[1] + [p[2]] def p_parameter_type_list(self, p): """ parameter_type_list : parameter_list | parameter_list COMMA ELLIPSIS """ if len(p) > 2: p[1].params.append(c_ast.EllipsisParam(self._token_coord(p, 3))) p[0] = p[1] def p_parameter_list(self, p): """ parameter_list : parameter_declaration | parameter_list COMMA parameter_declaration """ if len(p) == 2: # single parameter p[0] = c_ast.ParamList([p[1]], p[1].coord) else: p[1].params.append(p[3]) p[0] = p[1] # From ISO/IEC 9899:TC2, 6.7.5.3.11: # "If, in a parameter declaration, an identifier can be treated either # as a typedef name or as a parameter name, it shall be taken as a # typedef name." # # Inside a parameter declaration, once we've reduced declaration specifiers, # if we shift in an LPAREN and see a TYPEID, it could be either an abstract # declarator or a declarator nested inside parens. This rule tells us to # always treat it as an abstract declarator. Therefore, we only accept # `id_declarator`s and `typeid_noparen_declarator`s. 
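The disambiguation rule quoted above (ISO/IEC 9899:TC2, 6.7.5.3.11) can be reduced to a one-line policy (a toy illustration, not pycparser code): when a name inside a parameter declaration could be either a typedef name or a parameter name, the typedef interpretation wins.

```python
known_types = {"T"}  # names already recorded by typedef

def classify_param_name(name):
    # Prefer the typedef interpretation when both readings are possible.
    return "typedef-name" if name in known_types else "parameter-name"

assert classify_param_name("T") == "typedef-name"    # void f(int (T));
assert classify_param_name("n") == "parameter-name"  # void f(int n);
```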
def p_parameter_declaration_1(self, p): """ parameter_declaration : declaration_specifiers id_declarator | declaration_specifiers typeid_noparen_declarator """ spec = p[1] if not spec['type']: spec['type'] = [c_ast.IdentifierType(['int'], coord=self._token_coord(p, 1))] p[0] = self._build_declarations( spec=spec, decls=[dict(decl=p[2])])[0] def p_parameter_declaration_2(self, p): """ parameter_declaration : declaration_specifiers abstract_declarator_opt """ spec = p[1] if not spec['type']: spec['type'] = [c_ast.IdentifierType(['int'], coord=self._token_coord(p, 1))] # Parameters can have the same names as typedefs. The trouble is that # the parameter's name gets grouped into declaration_specifiers, making # it look like an old-style declaration; compensate. # if len(spec['type']) > 1 and len(spec['type'][-1].names) == 1 and \ self._is_type_in_scope(spec['type'][-1].names[0]): decl = self._build_declarations( spec=spec, decls=[dict(decl=p[2], init=None)])[0] # This truly is an old-style parameter declaration # else: decl = c_ast.Typename( name='', quals=spec['qual'], type=p[2] or c_ast.TypeDecl(None, None, None), coord=self._token_coord(p, 2)) typename = spec['type'] decl = self._fix_decl_name_type(decl, typename) p[0] = decl def p_identifier_list(self, p): """ identifier_list : identifier | identifier_list COMMA identifier """ if len(p) == 2: # single parameter p[0] = c_ast.ParamList([p[1]], p[1].coord) else: p[1].params.append(p[3]) p[0] = p[1] def p_initializer_1(self, p): """ initializer : assignment_expression """ p[0] = p[1] def p_initializer_2(self, p): """ initializer : brace_open initializer_list_opt brace_close | brace_open initializer_list COMMA brace_close """ if p[2] is None: p[0] = c_ast.InitList([], self._token_coord(p, 1)) else: p[0] = p[2] def p_initializer_list(self, p): """ initializer_list : designation_opt initializer | initializer_list COMMA designation_opt initializer """ if len(p) == 3: # single initializer init = p[2] if p[1] is None else 
c_ast.NamedInitializer(p[1], p[2]) p[0] = c_ast.InitList([init], p[2].coord) else: init = p[4] if p[3] is None else c_ast.NamedInitializer(p[3], p[4]) p[1].exprs.append(init) p[0] = p[1] def p_designation(self, p): """ designation : designator_list EQUALS """ p[0] = p[1] # Designators are represented as a list of nodes, in the order in which # they're written in the code. # def p_designator_list(self, p): """ designator_list : designator | designator_list designator """ p[0] = [p[1]] if len(p) == 2 else p[1] + [p[2]] def p_designator(self, p): """ designator : LBRACKET constant_expression RBRACKET | PERIOD identifier """ p[0] = p[2] def p_type_name(self, p): """ type_name : specifier_qualifier_list abstract_declarator_opt """ typename = c_ast.Typename( name='', quals=p[1]['qual'], type=p[2] or c_ast.TypeDecl(None, None, None), coord=self._token_coord(p, 2)) p[0] = self._fix_decl_name_type(typename, p[1]['type']) def p_abstract_declarator_1(self, p): """ abstract_declarator : pointer """ dummytype = c_ast.TypeDecl(None, None, None) p[0] = self._type_modify_decl( decl=dummytype, modifier=p[1]) def p_abstract_declarator_2(self, p): """ abstract_declarator : pointer direct_abstract_declarator """ p[0] = self._type_modify_decl(p[2], p[1]) def p_abstract_declarator_3(self, p): """ abstract_declarator : direct_abstract_declarator """ p[0] = p[1] # Creating and using direct_abstract_declarator_opt here # instead of listing both direct_abstract_declarator and the # lack of it in the beginning of _1 and _2 caused two # shift/reduce errors. 
# def p_direct_abstract_declarator_1(self, p): """ direct_abstract_declarator : LPAREN abstract_declarator RPAREN """ p[0] = p[2] def p_direct_abstract_declarator_2(self, p): """ direct_abstract_declarator : direct_abstract_declarator LBRACKET assignment_expression_opt RBRACKET """ arr = c_ast.ArrayDecl( type=None, dim=p[3], dim_quals=[], coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=arr) def p_direct_abstract_declarator_3(self, p): """ direct_abstract_declarator : LBRACKET assignment_expression_opt RBRACKET """ p[0] = c_ast.ArrayDecl( type=c_ast.TypeDecl(None, None, None), dim=p[2], dim_quals=[], coord=self._token_coord(p, 1)) def p_direct_abstract_declarator_4(self, p): """ direct_abstract_declarator : direct_abstract_declarator LBRACKET TIMES RBRACKET """ arr = c_ast.ArrayDecl( type=None, dim=c_ast.ID(p[3], self._token_coord(p, 3)), dim_quals=[], coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=arr) def p_direct_abstract_declarator_5(self, p): """ direct_abstract_declarator : LBRACKET TIMES RBRACKET """ p[0] = c_ast.ArrayDecl( type=c_ast.TypeDecl(None, None, None), dim=c_ast.ID(p[3], self._token_coord(p, 3)), dim_quals=[], coord=self._token_coord(p, 1)) def p_direct_abstract_declarator_6(self, p): """ direct_abstract_declarator : direct_abstract_declarator LPAREN parameter_type_list_opt RPAREN """ func = c_ast.FuncDecl( args=p[3], type=None, coord=p[1].coord) p[0] = self._type_modify_decl(decl=p[1], modifier=func) def p_direct_abstract_declarator_7(self, p): """ direct_abstract_declarator : LPAREN parameter_type_list_opt RPAREN """ p[0] = c_ast.FuncDecl( args=p[2], type=c_ast.TypeDecl(None, None, None), coord=self._token_coord(p, 1)) # declaration is a list, statement isn't. 
To make it consistent, block_item # will always be a list # def p_block_item(self, p): """ block_item : declaration | statement """ p[0] = p[1] if isinstance(p[1], list) else [p[1]] # Since we made block_item a list, this just combines lists # def p_block_item_list(self, p): """ block_item_list : block_item | block_item_list block_item """ # Empty block items (plain ';') produce [None], so ignore them p[0] = p[1] if (len(p) == 2 or p[2] == [None]) else p[1] + p[2] def p_compound_statement_1(self, p): """ compound_statement : brace_open block_item_list_opt brace_close """ p[0] = c_ast.Compound( block_items=p[2], coord=self._token_coord(p, 1)) def p_labeled_statement_1(self, p): """ labeled_statement : ID COLON statement """ p[0] = c_ast.Label(p[1], p[3], self._token_coord(p, 1)) def p_labeled_statement_2(self, p): """ labeled_statement : CASE constant_expression COLON statement """ p[0] = c_ast.Case(p[2], [p[4]], self._token_coord(p, 1)) def p_labeled_statement_3(self, p): """ labeled_statement : DEFAULT COLON statement """ p[0] = c_ast.Default([p[3]], self._token_coord(p, 1)) def p_selection_statement_1(self, p): """ selection_statement : IF LPAREN expression RPAREN statement """ p[0] = c_ast.If(p[3], p[5], None, self._token_coord(p, 1)) def p_selection_statement_2(self, p): """ selection_statement : IF LPAREN expression RPAREN statement ELSE statement """ p[0] = c_ast.If(p[3], p[5], p[7], self._token_coord(p, 1)) def p_selection_statement_3(self, p): """ selection_statement : SWITCH LPAREN expression RPAREN statement """ p[0] = fix_switch_cases( c_ast.Switch(p[3], p[5], self._token_coord(p, 1))) def p_iteration_statement_1(self, p): """ iteration_statement : WHILE LPAREN expression RPAREN statement """ p[0] = c_ast.While(p[3], p[5], self._token_coord(p, 1)) def p_iteration_statement_2(self, p): """ iteration_statement : DO statement WHILE LPAREN expression RPAREN SEMI """ p[0] = c_ast.DoWhile(p[5], p[2], self._token_coord(p, 1)) def p_iteration_statement_3(self, 
p): """ iteration_statement : FOR LPAREN expression_opt SEMI expression_opt SEMI expression_opt RPAREN statement """ p[0] = c_ast.For(p[3], p[5], p[7], p[9], self._token_coord(p, 1)) def p_iteration_statement_4(self, p): """ iteration_statement : FOR LPAREN declaration expression_opt SEMI expression_opt RPAREN statement """ p[0] = c_ast.For(c_ast.DeclList(p[3], self._token_coord(p, 1)), p[4], p[6], p[8], self._token_coord(p, 1)) def p_jump_statement_1(self, p): """ jump_statement : GOTO ID SEMI """ p[0] = c_ast.Goto(p[2], self._token_coord(p, 1)) def p_jump_statement_2(self, p): """ jump_statement : BREAK SEMI """ p[0] = c_ast.Break(self._token_coord(p, 1)) def p_jump_statement_3(self, p): """ jump_statement : CONTINUE SEMI """ p[0] = c_ast.Continue(self._token_coord(p, 1)) def p_jump_statement_4(self, p): """ jump_statement : RETURN expression SEMI | RETURN SEMI """ p[0] = c_ast.Return(p[2] if len(p) == 4 else None, self._token_coord(p, 1)) def p_expression_statement(self, p): """ expression_statement : expression_opt SEMI """ if p[1] is None: p[0] = c_ast.EmptyStatement(self._token_coord(p, 2)) else: p[0] = p[1] def p_expression(self, p): """ expression : assignment_expression | expression COMMA assignment_expression """ if len(p) == 2: p[0] = p[1] else: if not isinstance(p[1], c_ast.ExprList): p[1] = c_ast.ExprList([p[1]], p[1].coord) p[1].exprs.append(p[3]) p[0] = p[1] def p_typedef_name(self, p): """ typedef_name : TYPEID """ p[0] = c_ast.IdentifierType([p[1]], coord=self._token_coord(p, 1)) def p_assignment_expression(self, p): """ assignment_expression : conditional_expression | unary_expression assignment_operator assignment_expression """ if len(p) == 2: p[0] = p[1] else: p[0] = c_ast.Assignment(p[2], p[1], p[3], p[1].coord) # K&R2 defines these as many separate rules, to encode # precedence and associativity. Why work hard ? I'll just use # the built in precedence/associativity specification feature # of PLY. 
(see precedence declaration above) # def p_assignment_operator(self, p): """ assignment_operator : EQUALS | XOREQUAL | TIMESEQUAL | DIVEQUAL | MODEQUAL | PLUSEQUAL | MINUSEQUAL | LSHIFTEQUAL | RSHIFTEQUAL | ANDEQUAL | OREQUAL """ p[0] = p[1] def p_constant_expression(self, p): """ constant_expression : conditional_expression """ p[0] = p[1] def p_conditional_expression(self, p): """ conditional_expression : binary_expression | binary_expression CONDOP expression COLON conditional_expression """ if len(p) == 2: p[0] = p[1] else: p[0] = c_ast.TernaryOp(p[1], p[3], p[5], p[1].coord) def p_binary_expression(self, p): """ binary_expression : cast_expression | binary_expression TIMES binary_expression | binary_expression DIVIDE binary_expression | binary_expression MOD binary_expression | binary_expression PLUS binary_expression | binary_expression MINUS binary_expression | binary_expression RSHIFT binary_expression | binary_expression LSHIFT binary_expression | binary_expression LT binary_expression | binary_expression LE binary_expression | binary_expression GE binary_expression | binary_expression GT binary_expression | binary_expression EQ binary_expression | binary_expression NE binary_expression | binary_expression AND binary_expression | binary_expression OR binary_expression | binary_expression XOR binary_expression | binary_expression LAND binary_expression | binary_expression LOR binary_expression """ if len(p) == 2: p[0] = p[1] else: p[0] = c_ast.BinaryOp(p[2], p[1], p[3], p[1].coord) def p_cast_expression_1(self, p): """ cast_expression : unary_expression """ p[0] = p[1] def p_cast_expression_2(self, p): """ cast_expression : LPAREN type_name RPAREN cast_expression """ p[0] = c_ast.Cast(p[2], p[4], self._token_coord(p, 1)) def p_unary_expression_1(self, p): """ unary_expression : postfix_expression """ p[0] = p[1] def p_unary_expression_2(self, p): """ unary_expression : PLUSPLUS unary_expression | MINUSMINUS unary_expression | unary_operator cast_expression 
""" p[0] = c_ast.UnaryOp(p[1], p[2], p[2].coord) def p_unary_expression_3(self, p): """ unary_expression : SIZEOF unary_expression | SIZEOF LPAREN type_name RPAREN """ p[0] = c_ast.UnaryOp( p[1], p[2] if len(p) == 3 else p[3], self._token_coord(p, 1)) def p_unary_operator(self, p): """ unary_operator : AND | TIMES | PLUS | MINUS | NOT | LNOT """ p[0] = p[1] def p_postfix_expression_1(self, p): """ postfix_expression : primary_expression """ p[0] = p[1] def p_postfix_expression_2(self, p): """ postfix_expression : postfix_expression LBRACKET expression RBRACKET """ p[0] = c_ast.ArrayRef(p[1], p[3], p[1].coord) def p_postfix_expression_3(self, p): """ postfix_expression : postfix_expression LPAREN argument_expression_list RPAREN | postfix_expression LPAREN RPAREN """ p[0] = c_ast.FuncCall(p[1], p[3] if len(p) == 5 else None, p[1].coord) def p_postfix_expression_4(self, p): """ postfix_expression : postfix_expression PERIOD ID | postfix_expression PERIOD TYPEID | postfix_expression ARROW ID | postfix_expression ARROW TYPEID """ field = c_ast.ID(p[3], self._token_coord(p, 3)) p[0] = c_ast.StructRef(p[1], p[2], field, p[1].coord) def p_postfix_expression_5(self, p): """ postfix_expression : postfix_expression PLUSPLUS | postfix_expression MINUSMINUS """ p[0] = c_ast.UnaryOp('p' + p[2], p[1], p[1].coord) def p_postfix_expression_6(self, p): """ postfix_expression : LPAREN type_name RPAREN brace_open initializer_list brace_close | LPAREN type_name RPAREN brace_open initializer_list COMMA brace_close """ p[0] = c_ast.CompoundLiteral(p[2], p[5]) def p_primary_expression_1(self, p): """ primary_expression : identifier """ p[0] = p[1] def p_primary_expression_2(self, p): """ primary_expression : constant """ p[0] = p[1] def p_primary_expression_3(self, p): """ primary_expression : unified_string_literal | unified_wstring_literal """ p[0] = p[1] def p_primary_expression_4(self, p): """ primary_expression : LPAREN expression RPAREN """ p[0] = p[2] def 
p_primary_expression_5(self, p): """ primary_expression : OFFSETOF LPAREN type_name COMMA offsetof_member_designator RPAREN """ coord = self._token_coord(p, 1) p[0] = c_ast.FuncCall(c_ast.ID(p[1], coord), c_ast.ExprList([p[3], p[5]], coord), coord) def p_offsetof_member_designator(self, p): """ offsetof_member_designator : identifier | offsetof_member_designator PERIOD identifier | offsetof_member_designator LBRACKET expression RBRACKET """ if len(p) == 2: p[0] = p[1] elif len(p) == 4: field = c_ast.ID(p[3], self._token_coord(p, 3)) p[0] = c_ast.StructRef(p[1], p[2], field, p[1].coord) elif len(p) == 5: p[0] = c_ast.ArrayRef(p[1], p[3], p[1].coord) else: raise NotImplementedError("Unexpected parsing state. len(p): %u" % len(p)) def p_argument_expression_list(self, p): """ argument_expression_list : assignment_expression | argument_expression_list COMMA assignment_expression """ if len(p) == 2: # single expr p[0] = c_ast.ExprList([p[1]], p[1].coord) else: p[1].exprs.append(p[3]) p[0] = p[1] def p_identifier(self, p): """ identifier : ID """ p[0] = c_ast.ID(p[1], self._token_coord(p, 1)) def p_constant_1(self, p): """ constant : INT_CONST_DEC | INT_CONST_OCT | INT_CONST_HEX | INT_CONST_BIN """ p[0] = c_ast.Constant( 'int', p[1], self._token_coord(p, 1)) def p_constant_2(self, p): """ constant : FLOAT_CONST | HEX_FLOAT_CONST """ p[0] = c_ast.Constant( 'float', p[1], self._token_coord(p, 1)) def p_constant_3(self, p): """ constant : CHAR_CONST | WCHAR_CONST """ p[0] = c_ast.Constant( 'char', p[1], self._token_coord(p, 1)) # The "unified" string and wstring literal rules are for supporting # concatenation of adjacent string literals. # I.e. 
"hello " "world" is seen by the C compiler as a single string literal # with the value "hello world" # def p_unified_string_literal(self, p): """ unified_string_literal : STRING_LITERAL | unified_string_literal STRING_LITERAL """ if len(p) == 2: # single literal p[0] = c_ast.Constant( 'string', p[1], self._token_coord(p, 1)) else: p[1].value = p[1].value[:-1] + p[2][1:] p[0] = p[1] def p_unified_wstring_literal(self, p): """ unified_wstring_literal : WSTRING_LITERAL | unified_wstring_literal WSTRING_LITERAL """ if len(p) == 2: # single literal p[0] = c_ast.Constant( 'string', p[1], self._token_coord(p, 1)) else: p[1].value = p[1].value.rstrip()[:-1] + p[2][2:] p[0] = p[1] def p_brace_open(self, p): """ brace_open : LBRACE """ p[0] = p[1] p.set_lineno(0, p.lineno(1)) def p_brace_close(self, p): """ brace_close : RBRACE """ p[0] = p[1] p.set_lineno(0, p.lineno(1)) def p_empty(self, p): 'empty : ' p[0] = None def p_error(self, p): # If error recovery is added here in the future, make sure # _get_yacc_lookahead_token still works! # if p: self._parse_error( 'before: %s' % p.value, self._coord(lineno=p.lineno, column=self.clex.find_tok_column(p))) else: self._parse_error('At end of input', self.clex.filename) #------------------------------------------------------------------------------ if __name__ == "__main__": import pprint import time, sys #t1 = time.time() #parser = CParser(lex_optimize=True, yacc_debug=True, yacc_optimize=False) #sys.write(time.time() - t1) #buf = ''' #int (*k)(int); #''' ## set debuglevel to 2 for debugging #t = parser.parse(buf, 'x.c', debuglevel=0) #t.show(showcoord=True) pycparser-2.18/pycparser/ast_transforms.py0000664000175000017500000000676613045001366021640 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: ast_transforms.py # # Some utilities used by the parser to create a friendlier AST. 
#
# Eli Bendersky [http://eli.thegreenplace.net]
# License: BSD
#------------------------------------------------------------------------------
from . import c_ast


def fix_switch_cases(switch_node):
    """ The 'case' statements in a 'switch' come out of parsing with one
        child node, so subsequent statements are just tucked to the parent
        Compound. Additionally, consecutive (fall-through) case statements
        come out messy. This is a peculiarity of the C grammar. The following:

            switch (myvar) {
                case 10:
                    k = 10;
                    p = k + 1;
                    return 10;
                case 20:
                case 30:
                    return 20;
                default:
                    break;
            }

        Creates this tree (pseudo-dump):

            Switch
                ID: myvar
                Compound:
                    Case 10:
                        k = 10
                    p = k + 1
                    return 10
                    Case 20:
                        Case 30:
                            return 20
                    Default:
                        break

        The goal of this transform is to fix this mess, turning it into the
        following:

            Switch
                ID: myvar
                Compound:
                    Case 10:
                        k = 10
                        p = k + 1
                        return 10
                    Case 20:
                    Case 30:
                        return 20
                    Default:
                        break

        A fixed AST node is returned. The argument may be modified.
    """
    assert isinstance(switch_node, c_ast.Switch)
    if not isinstance(switch_node.stmt, c_ast.Compound):
        return switch_node

    # The new Compound child for the Switch, which will collect children in
    # the correct order
    new_compound = c_ast.Compound([], switch_node.stmt.coord)

    # The last Case/Default node
    last_case = None

    # Goes over the children of the Compound below the Switch, adding them
    # either directly below new_compound or below the last Case as
    # appropriate
    for child in switch_node.stmt.block_items:
        if isinstance(child, (c_ast.Case, c_ast.Default)):
            # If it's a Case/Default:
            # 1. Add it to the Compound and mark as "last case"
            # 2. If its immediate child is also a Case or Default, promote it
            #    to a sibling.
            new_compound.block_items.append(child)
            _extract_nested_case(child, new_compound.block_items)
            last_case = new_compound.block_items[-1]
        else:
            # Other statements are added as children to the last case, if it
            # exists.
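The regrouping loop performed by `fix_switch_cases` can be sketched standalone. The `Case` class and `regroup` function below are simplified stand-ins invented for this illustration, and the sketch deliberately ignores the nested fall-through promotion handled by `_extract_nested_case`.

```python
class Case:
    # Minimal stand-in for c_ast.Case: an expression and a statement list.
    def __init__(self, expr, stmts):
        self.expr, self.stmts = expr, stmts

def regroup(block_items):
    # Statements that follow a case at the Compound level are re-attached
    # under the most recent case; leading statements stay at the top level.
    new_items, last_case = [], None
    for child in block_items:
        if isinstance(child, Case):
            new_items.append(child)
            last_case = child
        elif last_case is None:
            new_items.append(child)
        else:
            last_case.stmts.append(child)
    return new_items

# Mirrors the docstring example: 'p = k + 1' and 'return 10' come out of the
# parser as siblings of Case 10, not its children.
items = [Case(10, ['k = 10']), 'p = k + 1', 'return 10', Case(20, ['return 20'])]
out = regroup(items)
assert [c.expr for c in out] == [10, 20]
assert out[0].stmts == ['k = 10', 'p = k + 1', 'return 10']
```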
            if last_case is None:
                new_compound.block_items.append(child)
            else:
                last_case.stmts.append(child)

    switch_node.stmt = new_compound
    return switch_node


def _extract_nested_case(case_node, stmts_list):
    """ Recursively extract consecutive Case statements that are made nested
        by the parser and add them to the stmts_list.
    """
    if isinstance(case_node.stmts[0], (c_ast.Case, c_ast.Default)):
        stmts_list.append(case_node.stmts.pop())
        _extract_nested_case(stmts_list[-1], stmts_list)
pycparser-2.18/pycparser/_ast_gen.py0000664000175000017500000002074313045001366020341 0ustar  elibeneliben00000000000000
#-----------------------------------------------------------------
# _ast_gen.py
#
# Generates the AST Node classes from a specification given in
# a configuration file
#
# The design of this module was inspired by astgen.py from the
# Python 2.5 code-base.
#
# Eli Bendersky [http://eli.thegreenplace.net]
# License: BSD
#-----------------------------------------------------------------
import pprint
from string import Template


class ASTCodeGenerator(object):
    def __init__(self, cfg_filename='_c_ast.cfg'):
        """ Initialize the code generator from a configuration file.
        """
        self.cfg_filename = cfg_filename
        self.node_cfg = [NodeCfg(name, contents)
            for (name, contents) in self.parse_cfgfile(cfg_filename)]

    def generate(self, file=None):
        """ Generates the code into file, an open file buffer.
        """
        src = Template(_PROLOGUE_COMMENT).substitute(
            cfg_filename=self.cfg_filename)

        src += _PROLOGUE_CODE
        for node_cfg in self.node_cfg:
            src += node_cfg.generate_source() + '\n\n'

        file.write(src)

    def parse_cfgfile(self, filename):
        """ Parse the configuration file and yield pairs of
            (name, contents) for each node.
""" with open(filename, "r") as f: for line in f: line = line.strip() if not line or line.startswith('#'): continue colon_i = line.find(':') lbracket_i = line.find('[') rbracket_i = line.find(']') if colon_i < 1 or lbracket_i <= colon_i or rbracket_i <= lbracket_i: raise RuntimeError("Invalid line in %s:\n%s\n" % (filename, line)) name = line[:colon_i] val = line[lbracket_i + 1:rbracket_i] vallist = [v.strip() for v in val.split(',')] if val else [] yield name, vallist class NodeCfg(object): """ Node configuration. name: node name contents: a list of contents - attributes and child nodes See comment at the top of the configuration file for details. """ def __init__(self, name, contents): self.name = name self.all_entries = [] self.attr = [] self.child = [] self.seq_child = [] for entry in contents: clean_entry = entry.rstrip('*') self.all_entries.append(clean_entry) if entry.endswith('**'): self.seq_child.append(clean_entry) elif entry.endswith('*'): self.child.append(clean_entry) else: self.attr.append(entry) def generate_source(self): src = self._gen_init() src += '\n' + self._gen_children() src += '\n' + self._gen_attr_names() return src def _gen_init(self): src = "class %s(Node):\n" % self.name if self.all_entries: args = ', '.join(self.all_entries) slots = ', '.join("'{0}'".format(e) for e in self.all_entries) slots += ", 'coord', '__weakref__'" arglist = '(self, %s, coord=None)' % args else: slots = "'coord', '__weakref__'" arglist = '(self, coord=None)' src += " __slots__ = (%s)\n" % slots src += " def __init__%s:\n" % arglist for name in self.all_entries + ['coord']: src += " self.%s = %s\n" % (name, name) return src def _gen_children(self): src = ' def children(self):\n' if self.all_entries: src += ' nodelist = []\n' for child in self.child: src += ( ' if self.%(child)s is not None:' + ' nodelist.append(("%(child)s", self.%(child)s))\n') % ( dict(child=child)) for seq_child in self.seq_child: src += ( ' for i, child in enumerate(self.%(child)s or []):\n' ' 
nodelist.append(("%(child)s[%%d]" %% i, child))\n') % ( dict(child=seq_child)) src += ' return tuple(nodelist)\n' else: src += ' return ()\n' return src def _gen_attr_names(self): src = " attr_names = (" + ''.join("%r, " % nm for nm in self.attr) + ')' return src _PROLOGUE_COMMENT = \ r'''#----------------------------------------------------------------- # ** ATTENTION ** # This code was automatically generated from the file: # $cfg_filename # # Do not modify it directly. Modify the configuration file and # run the generator again. # ** ** *** ** ** # # pycparser: c_ast.py # # AST Node classes. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- ''' _PROLOGUE_CODE = r''' import sys class Node(object): __slots__ = () """ Abstract base class for AST nodes. """ def children(self): """ A sequence of all children that are Nodes """ pass def show(self, buf=sys.stdout, offset=0, attrnames=False, nodenames=False, showcoord=False, _my_node_name=None): """ Pretty print the Node and all its attributes and children (recursively) to a buffer. buf: Open IO buffer into which the Node is printed. offset: Initial offset (amount of leading spaces) attrnames: True if you want to see the attribute names in name=value pairs. False to only see the values. nodenames: True if you want to see the actual node names within their parents. showcoord: Do you want the coordinates of each Node to be displayed. 
""" lead = ' ' * offset if nodenames and _my_node_name is not None: buf.write(lead + self.__class__.__name__+ ' <' + _my_node_name + '>: ') else: buf.write(lead + self.__class__.__name__+ ': ') if self.attr_names: if attrnames: nvlist = [(n, getattr(self,n)) for n in self.attr_names] attrstr = ', '.join('%s=%s' % nv for nv in nvlist) else: vlist = [getattr(self, n) for n in self.attr_names] attrstr = ', '.join('%s' % v for v in vlist) buf.write(attrstr) if showcoord: buf.write(' (at %s)' % self.coord) buf.write('\n') for (child_name, child) in self.children(): child.show( buf, offset=offset + 2, attrnames=attrnames, nodenames=nodenames, showcoord=showcoord, _my_node_name=child_name) class NodeVisitor(object): """ A base NodeVisitor class for visiting c_ast nodes. Subclass it and define your own visit_XXX methods, where XXX is the class name you want to visit with these methods. For example: class ConstantVisitor(NodeVisitor): def __init__(self): self.values = [] def visit_Constant(self, node): self.values.append(node.value) Creates a list of values of all the constant nodes encountered below the given node. To use it: cv = ConstantVisitor() cv.visit(node) Notes: * generic_visit() will be called for AST nodes for which no visit_XXX method was defined. * The children of nodes for which a visit_XXX was defined will not be visited - if you need this, call generic_visit() on the node. You can use: NodeVisitor.generic_visit(self, node) * Modeled after Python's own AST visiting facilities (the ast module of Python 3.0) """ def visit(self, node): """ Visit a node. """ method = 'visit_' + node.__class__.__name__ visitor = getattr(self, method, self.generic_visit) return visitor(node) def generic_visit(self, node): """ Called if no explicit visitor function exists for a node. Implements preorder visiting of the node. 
""" for c_name, c in node.children(): self.visit(c) ''' if __name__ == "__main__": import sys ast_gen = ASTCodeGenerator('_c_ast.cfg') ast_gen.generate(open('c_ast.py', 'w')) pycparser-2.18/pycparser/c_generator.py0000664000175000017500000003317713111175436021063 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: c_generator.py # # C code generator from pycparser AST nodes. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #------------------------------------------------------------------------------ from . import c_ast class CGenerator(object): """ Uses the same visitor pattern as c_ast.NodeVisitor, but modified to return a value from each visit method, using string accumulation in generic_visit. """ def __init__(self): # Statements start with indentation of self.indent_level spaces, using # the _make_indent method # self.indent_level = 0 def _make_indent(self): return ' ' * self.indent_level def visit(self, node): method = 'visit_' + node.__class__.__name__ return getattr(self, method, self.generic_visit)(node) def generic_visit(self, node): #~ print('generic:', type(node)) if node is None: return '' else: return ''.join(self.visit(c) for c_name, c in node.children()) def visit_Constant(self, n): return n.value def visit_ID(self, n): return n.name def visit_Pragma(self, n): ret = '#pragma' if n.string: ret += ' ' + n.string return ret def visit_ArrayRef(self, n): arrref = self._parenthesize_unless_simple(n.name) return arrref + '[' + self.visit(n.subscript) + ']' def visit_StructRef(self, n): sref = self._parenthesize_unless_simple(n.name) return sref + n.type + self.visit(n.field) def visit_FuncCall(self, n): fref = self._parenthesize_unless_simple(n.name) return fref + '(' + self.visit(n.args) + ')' def visit_UnaryOp(self, n): operand = self._parenthesize_unless_simple(n.expr) if n.op == 'p++': return '%s++' % operand elif n.op == 'p--': return '%s--' % operand elif n.op 
== 'sizeof': # Always parenthesize the argument of sizeof since it can be # a name. return 'sizeof(%s)' % self.visit(n.expr) else: return '%s%s' % (n.op, operand) def visit_BinaryOp(self, n): lval_str = self._parenthesize_if(n.left, lambda d: not self._is_simple_node(d)) rval_str = self._parenthesize_if(n.right, lambda d: not self._is_simple_node(d)) return '%s %s %s' % (lval_str, n.op, rval_str) def visit_Assignment(self, n): rval_str = self._parenthesize_if( n.rvalue, lambda n: isinstance(n, c_ast.Assignment)) return '%s %s %s' % (self.visit(n.lvalue), n.op, rval_str) def visit_IdentifierType(self, n): return ' '.join(n.names) def _visit_expr(self, n): if isinstance(n, c_ast.InitList): return '{' + self.visit(n) + '}' elif isinstance(n, c_ast.ExprList): return '(' + self.visit(n) + ')' else: return self.visit(n) def visit_Decl(self, n, no_type=False): # no_type is used when a Decl is part of a DeclList, where the type is # explicitly only for the first declaration in a list. # s = n.name if no_type else self._generate_decl(n) if n.bitsize: s += ' : ' + self.visit(n.bitsize) if n.init: s += ' = ' + self._visit_expr(n.init) return s def visit_DeclList(self, n): s = self.visit(n.decls[0]) if len(n.decls) > 1: s += ', ' + ', '.join(self.visit_Decl(decl, no_type=True) for decl in n.decls[1:]) return s def visit_Typedef(self, n): s = '' if n.storage: s += ' '.join(n.storage) + ' ' s += self._generate_type(n.type) return s def visit_Cast(self, n): s = '(' + self._generate_type(n.to_type) + ')' return s + ' ' + self._parenthesize_unless_simple(n.expr) def visit_ExprList(self, n): visited_subexprs = [] for expr in n.exprs: visited_subexprs.append(self._visit_expr(expr)) return ', '.join(visited_subexprs) def visit_InitList(self, n): visited_subexprs = [] for expr in n.exprs: visited_subexprs.append(self._visit_expr(expr)) return ', '.join(visited_subexprs) def visit_Enum(self, n): s = 'enum' if n.name: s += ' ' + n.name if n.values: s += ' {' for i, enumerator in 
enumerate(n.values.enumerators): s += enumerator.name if enumerator.value: s += ' = ' + self.visit(enumerator.value) if i != len(n.values.enumerators) - 1: s += ', ' s += '}' return s def visit_FuncDef(self, n): decl = self.visit(n.decl) self.indent_level = 0 body = self.visit(n.body) if n.param_decls: knrdecls = ';\n'.join(self.visit(p) for p in n.param_decls) return decl + '\n' + knrdecls + ';\n' + body + '\n' else: return decl + '\n' + body + '\n' def visit_FileAST(self, n): s = '' for ext in n.ext: if isinstance(ext, c_ast.FuncDef): s += self.visit(ext) elif isinstance(ext, c_ast.Pragma): s += self.visit(ext) + '\n' else: s += self.visit(ext) + ';\n' return s def visit_Compound(self, n): s = self._make_indent() + '{\n' self.indent_level += 2 if n.block_items: s += ''.join(self._generate_stmt(stmt) for stmt in n.block_items) self.indent_level -= 2 s += self._make_indent() + '}\n' return s def visit_CompoundLiteral(self, n): return '(' + self.visit(n.type) + '){' + self.visit(n.init) + '}' def visit_EmptyStatement(self, n): return ';' def visit_ParamList(self, n): return ', '.join(self.visit(param) for param in n.params) def visit_Return(self, n): s = 'return' if n.expr: s += ' ' + self.visit(n.expr) return s + ';' def visit_Break(self, n): return 'break;' def visit_Continue(self, n): return 'continue;' def visit_TernaryOp(self, n): s = '(' + self._visit_expr(n.cond) + ') ? 
' s += '(' + self._visit_expr(n.iftrue) + ') : ' s += '(' + self._visit_expr(n.iffalse) + ')' return s def visit_If(self, n): s = 'if (' if n.cond: s += self.visit(n.cond) s += ')\n' s += self._generate_stmt(n.iftrue, add_indent=True) if n.iffalse: s += self._make_indent() + 'else\n' s += self._generate_stmt(n.iffalse, add_indent=True) return s def visit_For(self, n): s = 'for (' if n.init: s += self.visit(n.init) s += ';' if n.cond: s += ' ' + self.visit(n.cond) s += ';' if n.next: s += ' ' + self.visit(n.next) s += ')\n' s += self._generate_stmt(n.stmt, add_indent=True) return s def visit_While(self, n): s = 'while (' if n.cond: s += self.visit(n.cond) s += ')\n' s += self._generate_stmt(n.stmt, add_indent=True) return s def visit_DoWhile(self, n): s = 'do\n' s += self._generate_stmt(n.stmt, add_indent=True) s += self._make_indent() + 'while (' if n.cond: s += self.visit(n.cond) s += ');' return s def visit_Switch(self, n): s = 'switch (' + self.visit(n.cond) + ')\n' s += self._generate_stmt(n.stmt, add_indent=True) return s def visit_Case(self, n): s = 'case ' + self.visit(n.expr) + ':\n' for stmt in n.stmts: s += self._generate_stmt(stmt, add_indent=True) return s def visit_Default(self, n): s = 'default:\n' for stmt in n.stmts: s += self._generate_stmt(stmt, add_indent=True) return s def visit_Label(self, n): return n.name + ':\n' + self._generate_stmt(n.stmt) def visit_Goto(self, n): return 'goto ' + n.name + ';' def visit_EllipsisParam(self, n): return '...' def visit_Struct(self, n): return self._generate_struct_union(n, 'struct') def visit_Typename(self, n): return self._generate_type(n.type) def visit_Union(self, n): return self._generate_struct_union(n, 'union') def visit_NamedInitializer(self, n): s = '' for name in n.name: if isinstance(name, c_ast.ID): s += '.' 
+ name.name elif isinstance(name, c_ast.Constant): s += '[' + name.value + ']' s += ' = ' + self._visit_expr(n.expr) return s def visit_FuncDecl(self, n): return self._generate_type(n) def _generate_struct_union(self, n, name): """ Generates code for structs and unions. name should be either 'struct' or 'union'. """ s = name + ' ' + (n.name or '') if n.decls: s += '\n' s += self._make_indent() self.indent_level += 2 s += '{\n' for decl in n.decls: s += self._generate_stmt(decl) self.indent_level -= 2 s += self._make_indent() + '}' return s def _generate_stmt(self, n, add_indent=False): """ Generation from a statement node. This method exists as a wrapper for individual visit_* methods to handle different treatment of some statements in this context. """ typ = type(n) if add_indent: self.indent_level += 2 indent = self._make_indent() if add_indent: self.indent_level -= 2 if typ in ( c_ast.Decl, c_ast.Assignment, c_ast.Cast, c_ast.UnaryOp, c_ast.BinaryOp, c_ast.TernaryOp, c_ast.FuncCall, c_ast.ArrayRef, c_ast.StructRef, c_ast.Constant, c_ast.ID, c_ast.Typedef, c_ast.ExprList): # These can also appear in an expression context so no semicolon # is added to them automatically # return indent + self.visit(n) + ';\n' elif typ in (c_ast.Compound,): # No extra indentation required before the opening brace of a # compound - because it consists of multiple lines it has to # compute its own indentation. # return self.visit(n) else: return indent + self.visit(n) + '\n' def _generate_decl(self, n): """ Generation from a Decl node. """ s = '' if n.funcspec: s = ' '.join(n.funcspec) + ' ' if n.storage: s += ' '.join(n.storage) + ' ' s += self._generate_type(n.type) return s def _generate_type(self, n, modifiers=[]): """ Recursive generation from a type node. n is the type node. modifiers collects the PtrDecl, ArrayDecl and FuncDecl modifiers encountered on the way down to a TypeDecl, to allow proper generation from it.
""" typ = type(n) #~ print(n, modifiers) if typ == c_ast.TypeDecl: s = '' if n.quals: s += ' '.join(n.quals) + ' ' s += self.visit(n.type) nstr = n.declname if n.declname else '' # Resolve modifiers. # Wrap in parens to distinguish pointer to array and pointer to # function syntax. # for i, modifier in enumerate(modifiers): if isinstance(modifier, c_ast.ArrayDecl): if (i != 0 and isinstance(modifiers[i - 1], c_ast.PtrDecl)): nstr = '(' + nstr + ')' nstr += '[' + self.visit(modifier.dim) + ']' elif isinstance(modifier, c_ast.FuncDecl): if (i != 0 and isinstance(modifiers[i - 1], c_ast.PtrDecl)): nstr = '(' + nstr + ')' nstr += '(' + self.visit(modifier.args) + ')' elif isinstance(modifier, c_ast.PtrDecl): if modifier.quals: nstr = '* %s %s' % (' '.join(modifier.quals), nstr) else: nstr = '*' + nstr if nstr: s += ' ' + nstr return s elif typ == c_ast.Decl: return self._generate_decl(n.type) elif typ == c_ast.Typename: return self._generate_type(n.type) elif typ == c_ast.IdentifierType: return ' '.join(n.names) + ' ' elif typ in (c_ast.ArrayDecl, c_ast.PtrDecl, c_ast.FuncDecl): return self._generate_type(n.type, modifiers + [n]) else: return self.visit(n) def _parenthesize_if(self, n, condition): """ Visits 'n' and returns its string representation, parenthesized if the condition function applied to the node returns True. """ s = self._visit_expr(n) if condition(n): return '(' + s + ')' else: return s def _parenthesize_unless_simple(self, n): """ Common use case for _parenthesize_if """ return self._parenthesize_if(n, lambda d: not self._is_simple_node(d)) def _is_simple_node(self, n): """ Returns True for nodes that are "simple" - i.e. nodes that always have higher precedence than operators. 
""" return isinstance(n,( c_ast.Constant, c_ast.ID, c_ast.ArrayRef, c_ast.StructRef, c_ast.FuncCall)) pycparser-2.18/pycparser/c_lexer.py0000664000175000017500000003417713060530765020220 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: c_lexer.py # # CLexer class: lexer for the C language # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #------------------------------------------------------------------------------ import re import sys from .ply import lex from .ply.lex import TOKEN class CLexer(object): """ A lexer for the C language. After building it, set the input text with input(), and call token() to get new tokens. The public attribute filename can be set to an initial filaneme, but the lexer will update it upon #line directives. """ def __init__(self, error_func, on_lbrace_func, on_rbrace_func, type_lookup_func): """ Create a new Lexer. error_func: An error function. Will be called with an error message, line and column as arguments, in case of an error during lexing. on_lbrace_func, on_rbrace_func: Called when an LBRACE or RBRACE is encountered (likely to push/pop type_lookup_func's scope) type_lookup_func: A type lookup function. Given a string, it must return True IFF this string is a name of a type that was defined with a typedef earlier. """ self.error_func = error_func self.on_lbrace_func = on_lbrace_func self.on_rbrace_func = on_rbrace_func self.type_lookup_func = type_lookup_func self.filename = '' # Keeps track of the last token returned from self.token() self.last_token = None # Allow either "# line" or "# " to support GCC's # cpp output # self.line_pattern = re.compile(r'([ \t]*line\W)|([ \t]*\d+)') self.pragma_pattern = re.compile(r'[ \t]*pragma\W') def build(self, **kwargs): """ Builds the lexer from the specification. Must be called after the lexer object is created. 
This method exists separately, because the PLY manual warns against calling lex.lex inside __init__ """ self.lexer = lex.lex(object=self, **kwargs) def reset_lineno(self): """ Resets the internal line number counter of the lexer. """ self.lexer.lineno = 1 def input(self, text): self.lexer.input(text) def token(self): self.last_token = self.lexer.token() return self.last_token def find_tok_column(self, token): """ Find the column of the token in its line. """ last_cr = self.lexer.lexdata.rfind('\n', 0, token.lexpos) return token.lexpos - last_cr ######################-- PRIVATE --###################### ## ## Internal auxiliary methods ## def _error(self, msg, token): location = self._make_tok_location(token) self.error_func(msg, location[0], location[1]) self.lexer.skip(1) def _make_tok_location(self, token): return (token.lineno, self.find_tok_column(token)) ## ## Reserved keywords ## keywords = ( '_BOOL', '_COMPLEX', 'AUTO', 'BREAK', 'CASE', 'CHAR', 'CONST', 'CONTINUE', 'DEFAULT', 'DO', 'DOUBLE', 'ELSE', 'ENUM', 'EXTERN', 'FLOAT', 'FOR', 'GOTO', 'IF', 'INLINE', 'INT', 'LONG', 'REGISTER', 'OFFSETOF', 'RESTRICT', 'RETURN', 'SHORT', 'SIGNED', 'SIZEOF', 'STATIC', 'STRUCT', 'SWITCH', 'TYPEDEF', 'UNION', 'UNSIGNED', 'VOID', 'VOLATILE', 'WHILE', '__INT128', ) keyword_map = {} for keyword in keywords: if keyword == '_BOOL': keyword_map['_Bool'] = keyword elif keyword == '_COMPLEX': keyword_map['_Complex'] = keyword else: keyword_map[keyword.lower()] = keyword ## ## All the tokens recognized by the lexer ## tokens = keywords + ( # Identifiers 'ID', # Type identifiers (identifiers previously defined as # types with typedef) 'TYPEID', # constants 'INT_CONST_DEC', 'INT_CONST_OCT', 'INT_CONST_HEX', 'INT_CONST_BIN', 'FLOAT_CONST', 'HEX_FLOAT_CONST', 'CHAR_CONST', 'WCHAR_CONST', # String literals 'STRING_LITERAL', 'WSTRING_LITERAL', # Operators 'PLUS', 'MINUS', 'TIMES', 'DIVIDE', 'MOD', 'OR', 'AND', 'NOT', 'XOR', 'LSHIFT', 'RSHIFT', 'LOR', 'LAND', 'LNOT', 'LT', 'LE', 'GT', 'GE', 
'EQ', 'NE', # Assignment 'EQUALS', 'TIMESEQUAL', 'DIVEQUAL', 'MODEQUAL', 'PLUSEQUAL', 'MINUSEQUAL', 'LSHIFTEQUAL','RSHIFTEQUAL', 'ANDEQUAL', 'XOREQUAL', 'OREQUAL', # Increment/decrement 'PLUSPLUS', 'MINUSMINUS', # Structure dereference (->) 'ARROW', # Conditional operator (?) 'CONDOP', # Delimiters 'LPAREN', 'RPAREN', # ( ) 'LBRACKET', 'RBRACKET', # [ ] 'LBRACE', 'RBRACE', # { } 'COMMA', 'PERIOD', # . , 'SEMI', 'COLON', # ; : # Ellipsis (...) 'ELLIPSIS', # pre-processor 'PPHASH', # '#' 'PPPRAGMA', # 'pragma' 'PPPRAGMASTR', ) ## ## Regexes for use in tokens ## ## # valid C identifiers (K&R2: A.2.3), plus '$' (supported by some compilers) identifier = r'[a-zA-Z_$][0-9a-zA-Z_$]*' hex_prefix = '0[xX]' hex_digits = '[0-9a-fA-F]+' bin_prefix = '0[bB]' bin_digits = '[01]+' # integer constants (K&R2: A.2.5.1) integer_suffix_opt = r'(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?' decimal_constant = '(0'+integer_suffix_opt+')|([1-9][0-9]*'+integer_suffix_opt+')' octal_constant = '0[0-7]*'+integer_suffix_opt hex_constant = hex_prefix+hex_digits+integer_suffix_opt bin_constant = bin_prefix+bin_digits+integer_suffix_opt bad_octal_constant = '0[0-7]*[89]' # character constants (K&R2: A.2.5.2) # Note: a-zA-Z and '.-~^_!=&;,' are allowed as escape chars to support #line # directives with Windows paths as filenames (..\..\dir\file) # For the same reason, decimal_escape allows all digit sequences. We want to # parse all correct code, even if it means to sometimes parse incorrect # code.
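The integer-constant fragments defined above are plain regex strings, so they can be exercised on their own with the stdlib `re` module. A minimal standalone sketch (the names below merely mirror the fragments defined in this file and are not part of `CLexer`):

```python
import re

# Mirrors the integer_suffix_opt / decimal_constant fragments above;
# standalone illustration only, not part of the lexer class.
integer_suffix_opt = r'(([uU]ll)|([uU]LL)|(ll[uU]?)|(LL[uU]?)|([uU][lL])|([lL][uU]?)|[uU])?'
decimal_constant = '(0' + integer_suffix_opt + ')|([1-9][0-9]*' + integer_suffix_opt + ')'

# Anchor the pattern so the whole token must match.
pattern = re.compile('(' + decimal_constant + ')$')

# All legal spellings of the standard C suffixes are accepted.
for tok in ('0', '42', '42u', '42UL', '42llu', '42ULL'):
    assert pattern.match(tok), tok

# A mixed-case 'lL' suffix is not legal C, and the anchored match fails.
assert pattern.match('42lL') is None
```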
# simple_escape = r"""([a-zA-Z._~!=&\^\-\\?'"])""" decimal_escape = r"""(\d+)""" hex_escape = r"""(x[0-9a-fA-F]+)""" bad_escape = r"""([\\][^a-zA-Z._~^!=&\^\-\\?'"x0-7])""" escape_sequence = r"""(\\("""+simple_escape+'|'+decimal_escape+'|'+hex_escape+'))' cconst_char = r"""([^'\\\n]|"""+escape_sequence+')' char_const = "'"+cconst_char+"'" wchar_const = 'L'+char_const unmatched_quote = "('"+cconst_char+"*\\n)|('"+cconst_char+"*$)" bad_char_const = r"""('"""+cconst_char+"""[^'\n]+')|('')|('"""+bad_escape+r"""[^'\n]*')""" # string literals (K&R2: A.2.6) string_char = r"""([^"\\\n]|"""+escape_sequence+')' string_literal = '"'+string_char+'*"' wstring_literal = 'L'+string_literal bad_string_literal = '"'+string_char+'*?'+bad_escape+string_char+'*"' # floating constants (K&R2: A.2.5.3) exponent_part = r"""([eE][-+]?[0-9]+)""" fractional_constant = r"""([0-9]*\.[0-9]+)|([0-9]+\.)""" floating_constant = '(((('+fractional_constant+')'+exponent_part+'?)|([0-9]+'+exponent_part+'))[FfLl]?)' binary_exponent_part = r'''([pP][+-]?[0-9]+)''' hex_fractional_constant = '((('+hex_digits+r""")?\."""+hex_digits+')|('+hex_digits+r"""\.))""" hex_floating_constant = '('+hex_prefix+'('+hex_digits+'|'+hex_fractional_constant+')'+binary_exponent_part+'[FfLl]?)' ## ## Lexer states: used for preprocessor \n-terminated directives ## states = ( # ppline: preprocessor line directives # ('ppline', 'exclusive'), # pppragma: pragma # ('pppragma', 'exclusive'), ) def t_PPHASH(self, t): r'[ \t]*\#' if self.line_pattern.match(t.lexer.lexdata, pos=t.lexer.lexpos): t.lexer.begin('ppline') self.pp_line = self.pp_filename = None elif self.pragma_pattern.match(t.lexer.lexdata, pos=t.lexer.lexpos): t.lexer.begin('pppragma') else: t.type = 'PPHASH' return t ## ## Rules for the ppline state ## @TOKEN(string_literal) def t_ppline_FILENAME(self, t): if self.pp_line is None: self._error('filename before line number in #line', t) else: self.pp_filename = t.value.lstrip('"').rstrip('"') @TOKEN(decimal_constant) def 
t_ppline_LINE_NUMBER(self, t): if self.pp_line is None: self.pp_line = t.value else: # Ignore: GCC's cpp sometimes inserts a numeric flag # after the file name pass def t_ppline_NEWLINE(self, t): r'\n' if self.pp_line is None: self._error('line number missing in #line', t) else: self.lexer.lineno = int(self.pp_line) if self.pp_filename is not None: self.filename = self.pp_filename t.lexer.begin('INITIAL') def t_ppline_PPLINE(self, t): r'line' pass t_ppline_ignore = ' \t' def t_ppline_error(self, t): self._error('invalid #line directive', t) ## ## Rules for the pppragma state ## def t_pppragma_NEWLINE(self, t): r'\n' t.lexer.lineno += 1 t.lexer.begin('INITIAL') def t_pppragma_PPPRAGMA(self, t): r'pragma' return t t_pppragma_ignore = ' \t' def t_pppragma_STR(self, t): '.+' t.type = 'PPPRAGMASTR' return t def t_pppragma_error(self, t): self._error('invalid #pragma directive', t) ## ## Rules for the normal state ## t_ignore = ' \t' # Newlines def t_NEWLINE(self, t): r'\n+' t.lexer.lineno += t.value.count("\n") # Operators t_PLUS = r'\+' t_MINUS = r'-' t_TIMES = r'\*' t_DIVIDE = r'/' t_MOD = r'%' t_OR = r'\|' t_AND = r'&' t_NOT = r'~' t_XOR = r'\^' t_LSHIFT = r'<<' t_RSHIFT = r'>>' t_LOR = r'\|\|' t_LAND = r'&&' t_LNOT = r'!' t_LT = r'<' t_GT = r'>' t_LE = r'<=' t_GE = r'>=' t_EQ = r'==' t_NE = r'!=' # Assignment operators t_EQUALS = r'=' t_TIMESEQUAL = r'\*=' t_DIVEQUAL = r'/=' t_MODEQUAL = r'%=' t_PLUSEQUAL = r'\+=' t_MINUSEQUAL = r'-=' t_LSHIFTEQUAL = r'<<=' t_RSHIFTEQUAL = r'>>=' t_ANDEQUAL = r'&=' t_OREQUAL = r'\|=' t_XOREQUAL = r'\^=' # Increment/decrement t_PLUSPLUS = r'\+\+' t_MINUSMINUS = r'--' # -> t_ARROW = r'->' # ? t_CONDOP = r'\?' # Delimiters t_LPAREN = r'\(' t_RPAREN = r'\)' t_LBRACKET = r'\[' t_RBRACKET = r'\]' t_COMMA = r',' t_PERIOD = r'\.' t_SEMI = r';' t_COLON = r':' t_ELLIPSIS = r'\.\.\.'
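The single- and multi-character operator rules above rely on PLY's longest-match ordering: string token rules such as `t_PLUSPLUS` and `t_PLUS` are sorted by decreasing regex length before being combined, so `++` is never lexed as two `+` tokens. A small sketch of the same idea using only the stdlib `re` module (the pattern names are illustrative and not part of this file):

```python
import re

# Longest alternatives first emulates PLY's sorting of string token rules
# (t_PLUSPLUS before t_PLUSEQUAL before t_PLUS).
ordered = re.compile(r'\+\+|\+=|\+')
# Shortest alternative first shows what would go wrong without that sorting,
# since Python's re alternation picks the leftmost alternative that matches.
naive = re.compile(r'\+|\+=|\+\+')

assert ordered.match('++x').group() == '++'
assert ordered.match('+= 1').group() == '+='
assert naive.match('++x').group() == '+'   # '+' wins, splitting '++'
```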
# Scope delimiters # To see why on_lbrace_func is needed, consider: # typedef char TT; # void foo(int TT) { TT = 10; } # TT x = 5; # Outside the function, TT is a typedef, but inside (starting and ending # with the braces) it's a parameter. The trouble begins with yacc's # lookahead token. If we open a new scope in brace_open, then TT has # already been read and incorrectly interpreted as TYPEID. So, we need # to open and close scopes from within the lexer. # Similar for the TT immediately outside the end of the function. # @TOKEN(r'\{') def t_LBRACE(self, t): self.on_lbrace_func() return t @TOKEN(r'\}') def t_RBRACE(self, t): self.on_rbrace_func() return t t_STRING_LITERAL = string_literal # The following floating and integer constants are defined as # functions to impose a strict order (otherwise, decimal # is placed before the others because its regex is longer, # and this is bad) # @TOKEN(floating_constant) def t_FLOAT_CONST(self, t): return t @TOKEN(hex_floating_constant) def t_HEX_FLOAT_CONST(self, t): return t @TOKEN(hex_constant) def t_INT_CONST_HEX(self, t): return t @TOKEN(bin_constant) def t_INT_CONST_BIN(self, t): return t @TOKEN(bad_octal_constant) def t_BAD_CONST_OCT(self, t): msg = "Invalid octal constant" self._error(msg, t) @TOKEN(octal_constant) def t_INT_CONST_OCT(self, t): return t @TOKEN(decimal_constant) def t_INT_CONST_DEC(self, t): return t # Must come before bad_char_const, to prevent it from # catching valid char constants as invalid # @TOKEN(char_const) def t_CHAR_CONST(self, t): return t @TOKEN(wchar_const) def t_WCHAR_CONST(self, t): return t @TOKEN(unmatched_quote) def t_UNMATCHED_QUOTE(self, t): msg = "Unmatched '" self._error(msg, t) @TOKEN(bad_char_const) def t_BAD_CHAR_CONST(self, t): msg = "Invalid char constant %s" % t.value self._error(msg, t) @TOKEN(wstring_literal) def t_WSTRING_LITERAL(self, t): return t # unmatched string literals are caught by the preprocessor @TOKEN(bad_string_literal) def t_BAD_STRING_LITERAL(self, t): 
msg = "String contains invalid escape code" self._error(msg, t) @TOKEN(identifier) def t_ID(self, t): t.type = self.keyword_map.get(t.value, "ID") if t.type == 'ID' and self.type_lookup_func(t.value): t.type = "TYPEID" return t def t_error(self, t): msg = 'Illegal character %s' % repr(t.value[0]) self._error(msg, t) pycparser-2.18/pycparser/plyparser.py0000664000175000017500000000764613060530765020621 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # plyparser.py # # PLYParser class and other utilites for simplifying programming # parsers with PLY # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- class Coord(object): """ Coordinates of a syntactic element. Consists of: - File name - Line number - (optional) column number, for the Lexer """ __slots__ = ('file', 'line', 'column', '__weakref__') def __init__(self, file, line, column=None): self.file = file self.line = line self.column = column def __str__(self): str = "%s:%s" % (self.file, self.line) if self.column: str += ":%s" % self.column return str class ParseError(Exception): pass class PLYParser(object): def _create_opt_rule(self, rulename): """ Given a rule name, creates an optional ply.yacc rule for it. The name of the optional rule is _opt """ optname = rulename + '_opt' def optrule(self, p): p[0] = p[1] optrule.__doc__ = '%s : empty\n| %s' % (optname, rulename) optrule.__name__ = 'p_%s' % optname setattr(self.__class__, optrule.__name__, optrule) def _coord(self, lineno, column=None): return Coord( file=self.clex.filename, line=lineno, column=column) def _token_coord(self, p, token_idx): """ Returns the coordinates for the YaccProduction objet 'p' indexed with 'token_idx'. The coordinate includes the 'lineno' and 'column'. Both follow the lex semantic, starting from 1. 
""" last_cr = p.lexer.lexer.lexdata.rfind('\n', 0, p.lexpos(token_idx)) if last_cr < 0: last_cr = -1 column = (p.lexpos(token_idx) - (last_cr)) return self._coord(p.lineno(token_idx), column) def _parse_error(self, msg, coord): raise ParseError("%s: %s" % (coord, msg)) def parameterized(*params): """ Decorator to create parameterized rules. Parameterized rule methods must be named starting with 'p_' and contain 'xxx', and their docstrings may contain 'xxx' and 'yyy'. These will be replaced by the given parameter tuples. For example, ``p_xxx_rule()`` with docstring 'xxx_rule : yyy' when decorated with ``@parameterized(('id', 'ID'))`` produces ``p_id_rule()`` with the docstring 'id_rule : ID'. Using multiple tuples produces multiple rules. """ def decorate(rule_func): rule_func._params = params return rule_func return decorate def template(cls): """ Class decorator to generate rules from parameterized rule templates. See `parameterized` for more information on parameterized rules. """ for attr_name in dir(cls): if attr_name.startswith('p_'): method = getattr(cls, attr_name) if hasattr(method, '_params'): delattr(cls, attr_name) # Remove template method _create_param_rules(cls, method) return cls def _create_param_rules(cls, func): """ Create ply.yacc rules based on a parameterized rule function Generates new methods (one per each pair of parameters) based on the template rule function `func`, and attaches them to `cls`. The rule function's parameters must be accessible via its `_params` attribute. 
""" for xxx, yyy in func._params: # Use the template method's body for each new method def param_rule(self, p): func(self, p) # Substitute in the params for the grammar rule and function name param_rule.__doc__ = func.__doc__.replace('xxx', xxx).replace('yyy', yyy) param_rule.__name__ = func.__name__.replace('xxx', xxx) # Attach the new method to the class setattr(cls, param_rule.__name__, param_rule) pycparser-2.18/pycparser/yacctab.py0000664000175000017500000050135313127010662020171 0ustar elibeneliben00000000000000 # yacctab.py # This file is automatically generated. Do not edit. _tabversion = '3.10' _lr_method = 'LALR' _lr_signature = 'translation_unit_or_emptyleftLORleftLANDleftORleftXORleftANDleftEQNEleftGTGELTLEleftRSHIFTLSHIFTleftPLUSMINUSleftTIMESDIVIDEMOD_BOOL _COMPLEX AUTO BREAK CASE CHAR CONST CONTINUE DEFAULT DO DOUBLE ELSE ENUM EXTERN FLOAT FOR GOTO IF INLINE INT LONG REGISTER OFFSETOF RESTRICT RETURN SHORT SIGNED SIZEOF STATIC STRUCT SWITCH TYPEDEF UNION UNSIGNED VOID VOLATILE WHILE __INT128 ID TYPEID INT_CONST_DEC INT_CONST_OCT INT_CONST_HEX INT_CONST_BIN FLOAT_CONST HEX_FLOAT_CONST CHAR_CONST WCHAR_CONST STRING_LITERAL WSTRING_LITERAL PLUS MINUS TIMES DIVIDE MOD OR AND NOT XOR LSHIFT RSHIFT LOR LAND LNOT LT LE GT GE EQ NE EQUALS TIMESEQUAL DIVEQUAL MODEQUAL PLUSEQUAL MINUSEQUAL LSHIFTEQUAL RSHIFTEQUAL ANDEQUAL XOREQUAL OREQUAL PLUSPLUS MINUSMINUS ARROW CONDOP LPAREN RPAREN LBRACKET RBRACKET LBRACE RBRACE COMMA PERIOD SEMI COLON ELLIPSIS PPHASH PPPRAGMA PPPRAGMASTRabstract_declarator_opt : empty\n| abstract_declaratorassignment_expression_opt : empty\n| assignment_expressionblock_item_list_opt : empty\n| block_item_listdeclaration_list_opt : empty\n| declaration_listdeclaration_specifiers_no_type_opt : empty\n| declaration_specifiers_no_typedesignation_opt : empty\n| designationexpression_opt : empty\n| expressionid_init_declarator_list_opt : empty\n| id_init_declarator_listidentifier_list_opt : empty\n| identifier_listinit_declarator_list_opt : 
empty\n| init_declarator_listinitializer_list_opt : empty\n| initializer_listparameter_type_list_opt : empty\n| parameter_type_liststruct_declarator_list_opt : empty\n| struct_declarator_listtype_qualifier_list_opt : empty\n| type_qualifier_list direct_id_declarator : ID\n direct_id_declarator : LPAREN id_declarator RPAREN\n direct_id_declarator : direct_id_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKET\n direct_id_declarator : direct_id_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET\n | direct_id_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET\n direct_id_declarator : direct_id_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET\n direct_id_declarator : direct_id_declarator LPAREN parameter_type_list RPAREN\n | direct_id_declarator LPAREN identifier_list_opt RPAREN\n direct_typeid_declarator : TYPEID\n direct_typeid_declarator : LPAREN typeid_declarator RPAREN\n direct_typeid_declarator : direct_typeid_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKET\n direct_typeid_declarator : direct_typeid_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET\n | direct_typeid_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET\n direct_typeid_declarator : direct_typeid_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET\n direct_typeid_declarator : direct_typeid_declarator LPAREN parameter_type_list RPAREN\n | direct_typeid_declarator LPAREN identifier_list_opt RPAREN\n direct_typeid_noparen_declarator : TYPEID\n direct_typeid_noparen_declarator : direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKET\n direct_typeid_noparen_declarator : direct_typeid_noparen_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET\n | direct_typeid_noparen_declarator LBRACKET type_qualifier_list STATIC assignment_expression 
RBRACKET\n direct_typeid_noparen_declarator : direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET\n direct_typeid_noparen_declarator : direct_typeid_noparen_declarator LPAREN parameter_type_list RPAREN\n | direct_typeid_noparen_declarator LPAREN identifier_list_opt RPAREN\n id_declarator : direct_id_declarator\n id_declarator : pointer direct_id_declarator\n typeid_declarator : direct_typeid_declarator\n typeid_declarator : pointer direct_typeid_declarator\n typeid_noparen_declarator : direct_typeid_noparen_declarator\n typeid_noparen_declarator : pointer direct_typeid_noparen_declarator\n translation_unit_or_empty : translation_unit\n | empty\n translation_unit : external_declaration\n translation_unit : translation_unit external_declaration\n external_declaration : function_definition\n external_declaration : declaration\n external_declaration : pp_directive\n | pppragma_directive\n external_declaration : SEMI\n pp_directive : PPHASH\n pppragma_directive : PPPRAGMA\n | PPPRAGMA PPPRAGMASTR\n function_definition : id_declarator declaration_list_opt compound_statement\n function_definition : declaration_specifiers id_declarator declaration_list_opt compound_statement\n statement : labeled_statement\n | expression_statement\n | compound_statement\n | selection_statement\n | iteration_statement\n | jump_statement\n | pppragma_directive\n decl_body : declaration_specifiers init_declarator_list_opt\n | declaration_specifiers_no_type id_init_declarator_list_opt\n declaration : decl_body SEMI\n declaration_list : declaration\n | declaration_list declaration\n declaration_specifiers_no_type : type_qualifier declaration_specifiers_no_type_opt\n declaration_specifiers_no_type : storage_class_specifier declaration_specifiers_no_type_opt\n declaration_specifiers_no_type : function_specifier declaration_specifiers_no_type_opt\n declaration_specifiers : declaration_specifiers type_qualifier\n declaration_specifiers : declaration_specifiers 
storage_class_specifier\n declaration_specifiers : declaration_specifiers function_specifier\n declaration_specifiers : declaration_specifiers type_specifier_no_typeid\n declaration_specifiers : type_specifier\n declaration_specifiers : declaration_specifiers_no_type type_specifier\n storage_class_specifier : AUTO\n | REGISTER\n | STATIC\n | EXTERN\n | TYPEDEF\n function_specifier : INLINE\n type_specifier_no_typeid : VOID\n | _BOOL\n | CHAR\n | SHORT\n | INT\n | LONG\n | FLOAT\n | DOUBLE\n | _COMPLEX\n | SIGNED\n | UNSIGNED\n | __INT128\n type_specifier : typedef_name\n | enum_specifier\n | struct_or_union_specifier\n | type_specifier_no_typeid\n type_qualifier : CONST\n | RESTRICT\n | VOLATILE\n init_declarator_list : init_declarator\n | init_declarator_list COMMA init_declarator\n init_declarator : declarator\n | declarator EQUALS initializer\n id_init_declarator_list : id_init_declarator\n | id_init_declarator_list COMMA init_declarator\n id_init_declarator : id_declarator\n | id_declarator EQUALS initializer\n specifier_qualifier_list : specifier_qualifier_list type_specifier_no_typeid\n specifier_qualifier_list : specifier_qualifier_list type_qualifier\n specifier_qualifier_list : type_specifier\n specifier_qualifier_list : type_qualifier_list type_specifier\n struct_or_union_specifier : struct_or_union ID\n | struct_or_union TYPEID\n struct_or_union_specifier : struct_or_union brace_open struct_declaration_list brace_close\n struct_or_union_specifier : struct_or_union ID brace_open struct_declaration_list brace_close\n | struct_or_union TYPEID brace_open struct_declaration_list brace_close\n struct_or_union : STRUCT\n | UNION\n struct_declaration_list : struct_declaration\n | struct_declaration_list struct_declaration\n struct_declaration : specifier_qualifier_list struct_declarator_list_opt SEMI\n struct_declaration : SEMI\n struct_declarator_list : struct_declarator\n | struct_declarator_list COMMA struct_declarator\n struct_declarator : declarator\n 
struct_declarator : declarator COLON constant_expression\n | COLON constant_expression\n enum_specifier : ENUM ID\n | ENUM TYPEID\n enum_specifier : ENUM brace_open enumerator_list brace_close\n enum_specifier : ENUM ID brace_open enumerator_list brace_close\n | ENUM TYPEID brace_open enumerator_list brace_close\n enumerator_list : enumerator\n | enumerator_list COMMA\n | enumerator_list COMMA enumerator\n enumerator : ID\n | ID EQUALS constant_expression\n declarator : id_declarator\n | typeid_declarator\n pointer : TIMES type_qualifier_list_opt\n | TIMES type_qualifier_list_opt pointer\n type_qualifier_list : type_qualifier\n | type_qualifier_list type_qualifier\n parameter_type_list : parameter_list\n | parameter_list COMMA ELLIPSIS\n parameter_list : parameter_declaration\n | parameter_list COMMA parameter_declaration\n parameter_declaration : declaration_specifiers id_declarator\n | declaration_specifiers typeid_noparen_declarator\n parameter_declaration : declaration_specifiers abstract_declarator_opt\n identifier_list : identifier\n | identifier_list COMMA identifier\n initializer : assignment_expression\n initializer : brace_open initializer_list_opt brace_close\n | brace_open initializer_list COMMA brace_close\n initializer_list : designation_opt initializer\n | initializer_list COMMA designation_opt initializer\n designation : designator_list EQUALS\n designator_list : designator\n | designator_list designator\n designator : LBRACKET constant_expression RBRACKET\n | PERIOD identifier\n type_name : specifier_qualifier_list abstract_declarator_opt\n abstract_declarator : pointer\n abstract_declarator : pointer direct_abstract_declarator\n abstract_declarator : direct_abstract_declarator\n direct_abstract_declarator : LPAREN abstract_declarator RPAREN direct_abstract_declarator : direct_abstract_declarator LBRACKET assignment_expression_opt RBRACKET\n direct_abstract_declarator : LBRACKET assignment_expression_opt RBRACKET\n direct_abstract_declarator : 
direct_abstract_declarator LBRACKET TIMES RBRACKET\n direct_abstract_declarator : LBRACKET TIMES RBRACKET\n direct_abstract_declarator : direct_abstract_declarator LPAREN parameter_type_list_opt RPAREN\n direct_abstract_declarator : LPAREN parameter_type_list_opt RPAREN\n block_item : declaration\n | statement\n block_item_list : block_item\n | block_item_list block_item\n compound_statement : brace_open block_item_list_opt brace_close labeled_statement : ID COLON statement labeled_statement : CASE constant_expression COLON statement labeled_statement : DEFAULT COLON statement selection_statement : IF LPAREN expression RPAREN statement selection_statement : IF LPAREN expression RPAREN statement ELSE statement selection_statement : SWITCH LPAREN expression RPAREN statement iteration_statement : WHILE LPAREN expression RPAREN statement iteration_statement : DO statement WHILE LPAREN expression RPAREN SEMI iteration_statement : FOR LPAREN expression_opt SEMI expression_opt SEMI expression_opt RPAREN statement iteration_statement : FOR LPAREN declaration expression_opt SEMI expression_opt RPAREN statement jump_statement : GOTO ID SEMI jump_statement : BREAK SEMI jump_statement : CONTINUE SEMI jump_statement : RETURN expression SEMI\n | RETURN SEMI\n expression_statement : expression_opt SEMI expression : assignment_expression\n | expression COMMA assignment_expression\n typedef_name : TYPEID assignment_expression : conditional_expression\n | unary_expression assignment_operator assignment_expression\n assignment_operator : EQUALS\n | XOREQUAL\n | TIMESEQUAL\n | DIVEQUAL\n | MODEQUAL\n | PLUSEQUAL\n | MINUSEQUAL\n | LSHIFTEQUAL\n | RSHIFTEQUAL\n | ANDEQUAL\n | OREQUAL\n constant_expression : conditional_expression conditional_expression : binary_expression\n | binary_expression CONDOP expression COLON conditional_expression\n binary_expression : cast_expression\n | binary_expression TIMES binary_expression\n | binary_expression DIVIDE binary_expression\n | 
binary_expression MOD binary_expression\n | binary_expression PLUS binary_expression\n | binary_expression MINUS binary_expression\n | binary_expression RSHIFT binary_expression\n | binary_expression LSHIFT binary_expression\n | binary_expression LT binary_expression\n | binary_expression LE binary_expression\n | binary_expression GE binary_expression\n | binary_expression GT binary_expression\n | binary_expression EQ binary_expression\n | binary_expression NE binary_expression\n | binary_expression AND binary_expression\n | binary_expression OR binary_expression\n | binary_expression XOR binary_expression\n | binary_expression LAND binary_expression\n | binary_expression LOR binary_expression\n cast_expression : unary_expression cast_expression : LPAREN type_name RPAREN cast_expression unary_expression : postfix_expression unary_expression : PLUSPLUS unary_expression\n | MINUSMINUS unary_expression\n | unary_operator cast_expression\n unary_expression : SIZEOF unary_expression\n | SIZEOF LPAREN type_name RPAREN\n unary_operator : AND\n | TIMES\n | PLUS\n | MINUS\n | NOT\n | LNOT\n postfix_expression : primary_expression postfix_expression : postfix_expression LBRACKET expression RBRACKET postfix_expression : postfix_expression LPAREN argument_expression_list RPAREN\n | postfix_expression LPAREN RPAREN\n postfix_expression : postfix_expression PERIOD ID\n | postfix_expression PERIOD TYPEID\n | postfix_expression ARROW ID\n | postfix_expression ARROW TYPEID\n postfix_expression : postfix_expression PLUSPLUS\n | postfix_expression MINUSMINUS\n postfix_expression : LPAREN type_name RPAREN brace_open initializer_list brace_close\n | LPAREN type_name RPAREN brace_open initializer_list COMMA brace_close\n primary_expression : identifier primary_expression : constant primary_expression : unified_string_literal\n | unified_wstring_literal\n primary_expression : LPAREN expression RPAREN primary_expression : OFFSETOF LPAREN type_name COMMA offsetof_member_designator RPAREN\n 
offsetof_member_designator : identifier\n | offsetof_member_designator PERIOD identifier\n | offsetof_member_designator LBRACKET expression RBRACKET\n argument_expression_list : assignment_expression\n | argument_expression_list COMMA assignment_expression\n identifier : ID constant : INT_CONST_DEC\n | INT_CONST_OCT\n | INT_CONST_HEX\n | INT_CONST_BIN\n constant : FLOAT_CONST\n | HEX_FLOAT_CONST\n constant : CHAR_CONST\n | WCHAR_CONST\n unified_string_literal : STRING_LITERAL\n | unified_string_literal STRING_LITERAL\n unified_wstring_literal : WSTRING_LITERAL\n | unified_wstring_literal WSTRING_LITERAL\n brace_open : LBRACE\n brace_close : RBRACE\n empty : ' _lr_action_items = {'VOID':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[6,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,6,-94,-109,-104,-65,-93,-110,6,-215,-107,-111,6,-63,-116,6,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,6,-53,6,-82,6,6,-61,-131,-301,-130,6,-147,-146,-160,-88,-90,6,-87,-89,-92,-81,-84,-86,-69,-30,6,6,-70,6,-83,6,6,-128,-140,-137,6,6,6,-161,6,6,-36,-35,6,6,-73,-76,-72,-74,6,-78,-193,-192,-77,-194,-75,6,6,-129,-132,-138,-302,-126,-127,-148,-71,6,-31,6,6,6,-34,6,6,6,-212,-211,6,-209,-195,-208,-196,-134,-133,-139,-150,-149,6,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'LBRACKET':([2,3,5,6,7,8,9,10,11,15,16,19,20,21,24,25,29,30,31,32,35,37,39,41,44,45,48,50,51,54,61,69,70,71,73,74,76,77,78,79,80,83,85,88,91,92,96,105,113,115,125,136,137,140,147,1
50,151,152,153,154,155,158,159,167,169,172,174,175,177,178,184,185,186,189,190,193,196,226,230,231,233,234,240,245,247,256,272,275,276,278,282,289,292,315,320,321,350,351,356,357,364,365,368,373,377,378,379,380,383,388,391,392,412,413,414,415,421,422,440,441,445,447,449,452,453,459,465,466,467,468,469,477,478,479,484,485,488,489,499,502,503,504,505,510,512,517,],[-102,-115,-113,-99,-97,59,-95,-114,-96,-100,-91,-94,-109,-104,-93,-110,-215,-107,-303,-111,-116,-29,-105,-101,-112,-106,-108,-103,-117,-98,59,-131,-301,-130,-147,-146,-28,-158,-160,-27,-88,-90,141,-37,-87,-89,-92,-30,195,-288,-128,-161,-159,141,-292,-280,-295,-299,-296,-293,-278,-279,280,-291,-265,-297,-289,-277,-294,-290,-36,-35,195,195,322,-45,326,-288,-129,-132,-302,-126,-127,-148,-38,370,-300,-298,-274,-273,-31,-34,195,195,322,326,-134,-133,-150,-149,-44,-43,-177,370,-272,-271,-270,-269,-268,-281,195,195,-33,-32,-191,-185,-187,-189,-39,-42,-180,370,-178,-266,-267,370,-51,-50,-186,-188,-190,-41,-40,-179,501,-283,-46,-49,-282,370,-275,-48,-47,-284,-276,-285,]),'WCHAR_CONST':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,153,-28,-303,153,-161,-303,153,153,-264,153,-262,153,-261,153,-260,153,153,-259,-263,153,153,153,-73,-76,-72,153,-74,153,153,-78,-193,-192,-77,-194,153,-75,-260,-302,153,153,153,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,153,-227,-228,-220,-226,153,153,153,153,153,153,1
53,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,-303,-260,153,-212,-211,153,-209,153,153,153,-195,153,-208,-196,153,153,153,-260,153,153,-12,153,153,-11,153,153,-28,-303,-260,-207,-210,153,-199,153,-197,-303,-176,153,153,-303,153,-260,153,153,153,153,-198,153,153,153,153,-11,153,-203,-202,-200,153,-303,153,153,153,-204,-201,153,-206,-205,]),'FLOAT_CONST':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,154,-28,-303,154,-161,-303,154,154,-264,154,-262,154,-261,154,-260,154,154,-259,-263,154,154,154,-73,-76,-72,154,-74,154,154,-78,-193,-192,-77,-194,154,-75,-260,-302,154,154,154,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,154,-227,-228,-220,-226,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,-303,-260,154,-212,-211,154,-209,154,154,154,-195,154,-208,-196,154,154,154,-260,154,154,-12,154,154,-11,154,154,-28,-303,-260,-207,-210,154,-199,154,-197,-303,-176,154,154,-303,154,-260,154,154,154,154,-198,154,154,154,154,-11,154,-203,-202,-200,154,-303,154,154,154,-204,-201,154,-206,-205,]),'MINUS':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,115,121,136,141,144,145,147,148,149,150,151,152,153,154,155,156,157,158,159,161,163,164,166,167,168,169,170,171,172,173,174,175,176,177,178,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,226,233,235,246,249,250,251,256,2
60,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,278,280,281,284,285,286,287,288,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,377,378,379,380,383,388,389,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,416,417,418,424,426,427,429,431,433,436,447,450,451,452,453,454,456,458,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,157,-28,-303,-288,157,-161,-303,157,157,-292,-264,-251,-280,-295,-299,-296,-293,-278,157,-262,-279,-253,-232,157,-261,157,-291,-260,-265,157,157,-297,-259,-289,-277,297,-294,-290,-263,157,157,157,-73,-76,-72,157,-74,157,157,-78,-193,-192,-77,-194,157,-75,-260,-288,-302,157,157,157,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,157,-227,-228,-220,-226,-300,157,-257,-298,-274,-273,157,157,157,-251,-256,157,-254,-255,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,-303,-260,157,-212,-211,157,-209,157,157,157,-195,157,-208,-196,157,157,157,-260,157,157,-12,157,157,-11,-272,-271,-270,-269,-268,-281,157,297,297,297,-237,297,297,297,-236,297,297,-234,-233,297,297,297,297,297,-235,157,-28,-303,-260,-207,-210,157,-199,157,-197,-303,-176,-258,-266,-267,157,157,-252,-303,157,-260,157,157,157,157,-198,157,157,157,157,-11,157,-203,-202,-200,-282,157,-303,-275,157,157,-276,157,-204,-201,157,-206,-205,]),'RPAREN':([2,3,5,6,7,8,9,10,11,15,16,19,20,21,24,25,29,30,31,32,35,37,39,41,44,45,48,50,51,54,58,60,61,69,71,73,74,76,77,78,79,80,83,85,88,91,92,96,105,109,110,111,112,113,114,115,116,118,125,136,137,138,140,142,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,180,184,185,186,187,188,189,190,191,192,193,194,196,208,224,230,231,233,234,240,245,247,252,253,272,274,275,276,278,281,282,285,286,288,289,290,291,292,293,315
,316,317,318,319,320,321,323,327,328,329,330,343,350,351,356,357,364,365,375,376,377,378,379,380,382,383,384,386,387,388,390,391,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,412,413,414,415,419,420,421,422,425,430,432,434,437,440,441,451,452,453,458,465,466,467,468,469,477,478,483,484,485,487,488,489,493,496,499,503,504,505,506,507,510,512,513,517,],[-102,-115,-113,-99,-97,-52,-95,-114,-96,-100,-91,-94,-109,-104,-93,-110,-215,-107,-303,-111,-116,-29,-105,-101,-112,-106,-108,-103,-117,-98,105,-303,-53,-131,-130,-147,-146,-28,-158,-160,-27,-88,-90,-54,-37,-87,-89,-92,-30,184,-17,185,-164,-303,-18,-288,-162,-169,-128,-161,-159,247,-55,-303,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,-230,-294,-290,-216,-36,-35,-303,-168,-2,-182,-56,-166,-1,-45,-167,-184,-14,-213,-129,-132,-302,-126,-127,-148,-38,364,365,-300,-257,-298,-274,-273,383,-31,-251,-256,-254,-34,388,389,-303,-255,-182,-23,-24,414,415,-57,-183,-303,-303,-170,-163,-165,-13,-134,-133,-150,-149,-44,-43,-217,451,-272,-271,-270,-269,-286,-268,453,456,457,-281,-181,-182,-303,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-33,-32,-191,-185,465,466,-187,-189,469,-214,472,474,476,-39,-42,-258,-266,-267,-252,-51,-50,-186,-188,-190,-41,-40,-287,499,-283,-231,-46,-49,-303,508,-282,-275,-48,-47,-303,514,-284,-276,518,-285,]),'LONG':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[21,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-6
6,21,-94,-109,-104,-65,-93,-110,21,-215,-107,-111,21,-63,-116,21,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,21,-53,21,-82,21,21,-61,-131,-301,-130,21,-147,-146,-160,-88,-90,21,-87,-89,-92,-81,-84,-86,-69,-30,21,21,-70,21,-83,21,21,-128,-140,-137,21,21,21,-161,21,21,-36,-35,21,21,-73,-76,-72,-74,21,-78,-193,-192,-77,-194,-75,21,21,-129,-132,-138,-302,-126,-127,-148,-71,21,-31,21,21,21,-34,21,21,21,-212,-211,21,-209,-195,-208,-196,-134,-133,-139,-150,-149,21,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'PLUS':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,115,121,136,141,144,145,147,148,149,150,151,152,153,154,155,156,157,158,159,161,163,164,166,167,168,169,170,171,172,173,174,175,176,177,178,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,226,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,278,280,281,284,285,286,287,288,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,377,378,379,380,383,388,389,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,416,417,418,424,426,427,429,431,433,436,447,450,451,452,453,454,456,458,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,164,-28,-303,-288,164,-161,-303,164,164,-292,-264,-251,-280,-295,-299,-296,-293,-278,164,-262,-279,-253,-232,164,-261,164,-291,-260,-265,164,164,-297,-259,-289,-277,301,-294,-290,-263,164,164,164,-73,-76,-72,164,-74,164,164,-78,-193,-192,-77,-194,164,-75,-260,-288,-302,164,164,164,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,164,-227,-228,-220,-226,-300,164,-257,-298,-274,-273,164,164,164,-251,-256,164,-254,-255,164,164,164,164,164,164,164,164,164,164,164,164,164,164,164,164,164,164,164,-303,-260,164,-
212,-211,164,-209,164,164,164,-195,164,-208,-196,164,164,164,-260,164,164,-12,164,164,-11,-272,-271,-270,-269,-268,-281,164,301,301,301,-237,301,301,301,-236,301,301,-234,-233,301,301,301,301,301,-235,164,-28,-303,-260,-207,-210,164,-199,164,-197,-303,-176,-258,-266,-267,164,164,-252,-303,164,-260,164,164,164,164,-198,164,164,164,164,-11,164,-203,-202,-200,-282,164,-303,-275,164,164,-276,164,-204,-201,164,-206,-205,]),'ELLIPSIS':([198,],[329,]),'GT':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,302,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,302,-239,-237,-241,302,-240,-236,-243,302,-234,-233,-242,302,302,302,302,-235,-258,-266,-267,-252,-282,-275,-276,]),'GOTO':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,201,-73,-76,-72,-74,201,-78,-193,-192,-77,-194,201,-75,-302,-212,-211,-209,201,-195,-208,-196,201,-207,-210,-199,201,-197,201,-198,201,201,-203,-202,-200,201,201,-204,-201,201,-206,-205,]),'ENUM':([0,1,3,7,8,9,11,12,14,17,18,19,23,24,26,34,35,36,37,40,42,47,49,51,53,54,55,56,57,60,61,64,65,67,68,70,72,78,87,101,102,103,104,105,117,120,121,122,123,124,126,127,128,129,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,232,233,254,273,282,283,284,287,289,323,327,332,333,335,336,342,345,347,354,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[28,-303,-115,-97,-52,-95,-96,-64,-60,-66,28,-94,-65,-93,28,-63,-116,28,-29,-62,-67,-303,-303,-117,-68,-98,-85,-10,-9,28,-53,-82,28,28,-61,-301,28,-160,28,-81,-84,-86,-69,-30,28,-70,28,-83,28,28
,-140,-137,28,28,-161,28,28,-36,-35,28,28,-73,-76,-72,-74,28,-78,-193,-192,-77,-194,-75,28,28,-138,-302,-71,28,-31,28,28,28,-34,28,28,-212,-211,28,-209,-195,-208,-196,-139,28,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'PERIOD':([70,115,147,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,256,272,275,276,278,368,373,377,378,379,380,383,388,445,447,449,452,453,459,479,484,485,499,502,503,510,512,517,],[-301,-288,-292,-280,-295,-299,-296,-293,-278,-279,279,-291,-265,-297,-289,-277,-294,-290,-288,-302,369,-300,-298,-274,-273,-177,369,-272,-271,-270,-269,-268,-281,-180,369,-178,-266,-267,369,-179,500,-283,-282,369,-275,-284,-276,-285,]),'GE':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,306,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,306,-239,-237,-241,306,-240,-236,-243,306,-234,-233,-242,306,306,306,306,-235,-258,-266,-267,-252,-282,-275,-276,]),'INT_CONST_DEC':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,174,-28,-303,174,-161,-303,174,174,-264,174,-262,174,-261,174
,-260,174,174,-259,-263,174,174,174,-73,-76,-72,174,-74,174,174,-78,-193,-192,-77,-194,174,-75,-260,-302,174,174,174,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,174,-227,-228,-220,-226,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,174,-303,-260,174,-212,-211,174,-209,174,174,174,-195,174,-208,-196,174,174,174,-260,174,174,-12,174,174,-11,174,174,-28,-303,-260,-207,-210,174,-199,174,-197,-303,-176,174,174,-303,174,-260,174,174,174,174,-198,174,174,174,174,-11,174,-203,-202,-200,174,-303,174,174,174,-204,-201,174,-206,-205,]),'ARROW':([115,147,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,275,276,278,377,378,379,380,383,388,452,453,499,503,512,],[-288,-292,-280,-295,-299,-296,-293,-278,-279,277,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-298,-274,-273,-272,-271,-270,-269,-268,-281,-266,-267,-282,-275,-276,]),'CHAR':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[41,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,41,-94,-109,-104,-65,-93,-110,41,-215,-107,-111,41,-63,-116,41,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,41,-53,41,-82,41,41,-61,-131,-301,-130,41,-147,-146,-160,-88,-90,41,-87,-89,-92,-81,-84,-86,-69,-30,41,41,-70,41,-83,41,41,-128,-140,-137,41,41,41,-161,41,41,-36,-35,41,41,-73,-76,-72,-74,41,-78,-193,-192,-77,-194,-75,41,41,-129,-132,-138,-302,-126,-127,-148,-71,41,-31,41,41,41,-34,41,41,41,-212,-211,41,-209,-195,-208,-196,-134,-133,-139,-150,-149,41,-3
3,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'HEX_FLOAT_CONST':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,177,-28,-303,177,-161,-303,177,177,-264,177,-262,177,-261,177,-260,177,177,-259,-263,177,177,177,-73,-76,-72,177,-74,177,177,-78,-193,-192,-77,-194,177,-75,-260,-302,177,177,177,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,177,-227,-228,-220,-226,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,177,-303,-260,177,-212,-211,177,-209,177,177,177,-195,177,-208,-196,177,177,177,-260,177,177,-12,177,177,-11,177,177,-28,-303,-260,-207,-210,177,-199,177,-197,-303,-176,177,177,-303,177,-260,177,177,177,177,-198,177,177,177,177,-11,177,-203,-202,-200,177,-303,177,177,177,-204,-201,177,-206,-205,]),'DOUBLE':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[45,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,
-91,-66,45,-94,-109,-104,-65,-93,-110,45,-215,-107,-111,45,-63,-116,45,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,45,-53,45,-82,45,45,-61,-131,-301,-130,45,-147,-146,-160,-88,-90,45,-87,-89,-92,-81,-84,-86,-69,-30,45,45,-70,45,-83,45,45,-128,-140,-137,45,45,45,-161,45,45,-36,-35,45,45,-73,-76,-72,-74,45,-78,-193,-192,-77,-194,-75,45,45,-129,-132,-138,-302,-126,-127,-148,-71,45,-31,45,45,45,-34,45,45,45,-212,-211,45,-209,-195,-208,-196,-134,-133,-139,-150,-149,45,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'MINUSEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,261,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'INT_CONST_OCT':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,178,-28,-303,178,-161,-303,178,178,-264,178,-262,178,-261,178,-260,178,178,-259,-263,178,178,178,-73,-76,-72,178,-74,178,178,-78,-193,-192,-77,-194,178,-75,-260,-302,178,178,178,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,178,-227,-228,-220,-226,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178,178
,178,-303,-260,178,-212,-211,178,-209,178,178,178,-195,178,-208,-196,178,178,178,-260,178,178,-12,178,178,-11,178,178,-28,-303,-260,-207,-210,178,-199,178,-197,-303,-176,178,178,-303,178,-260,178,178,178,178,-198,178,178,178,178,-11,178,-203,-202,-200,178,-303,178,178,178,-204,-201,178,-206,-205,]),'TIMESEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,270,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'OR':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,311,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,311,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,311,-244,-246,-247,-235,-258,-266,-267,-252,-282,-275,-276,]),'SHORT':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[2,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,2,-94,-109,-104,-65,-93,-110,2,-215,-107,-111,2,-63,-116,2,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-1
0,-9,2,-53,2,-82,2,2,-61,-131,-301,-130,2,-147,-146,-160,-88,-90,2,-87,-89,-92,-81,-84,-86,-69,-30,2,2,-70,2,-83,2,2,-128,-140,-137,2,2,2,-161,2,2,-36,-35,2,2,-73,-76,-72,-74,2,-78,-193,-192,-77,-194,-75,2,2,-129,-132,-138,-302,-126,-127,-148,-71,2,-31,2,2,2,-34,2,2,2,-212,-211,2,-209,-195,-208,-196,-134,-133,-139,-150,-149,2,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'RETURN':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,204,-73,-76,-72,-74,204,-78,-193,-192,-77,-194,204,-75,-302,-212,-211,-209,204,-195,-208,-196,204,-207,-210,-199,204,-197,204,-198,204,204,-203,-202,-200,204,204,-204,-201,204,-206,-205,]),'RSHIFTEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,271,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'RESTRICT':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,31,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,59,60,61,63,64,67,68,69,70,71,72,73,74,76,78,80,83,87,91,92,96,101,104,105,107,108,113,120,121,122,123,124,125,126,127,128,129,130,136,141,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,250,251,254,273,282,283,284,287,289,292,322,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,417,418,426,427,431,436,473,494,495,497,515,516,519,520,],[35,35,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,35,-94,-109,-104,-65,-93,-110,35,-215,-107,35,-111,35,-63,-116,-29,-105,-62,-101,-67,-112,-106,35,-108,35,-103,-117,-68,-98,35,35,-53,35,-82,35,-61,-131,-301,-130,35,-147,-146,35,-160,
-88,-90,35,-87,-89,-92,-81,-69,-30,35,35,35,-70,35,-83,35,35,-128,-140,-137,35,35,35,-161,35,35,35,-36,-35,35,35,-73,-76,-72,-74,35,-78,-193,-192,-77,-194,-75,35,35,-129,-132,-138,-302,-126,-127,-148,35,35,-71,35,-31,35,35,35,-34,35,35,35,35,-212,-211,35,-209,-195,-208,-196,-134,-133,-139,-150,-149,35,-33,-32,35,35,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'STATIC':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,59,60,61,63,64,67,68,69,70,71,73,74,78,80,83,87,91,92,96,101,104,105,107,113,120,121,122,136,141,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,250,254,282,289,322,323,327,332,333,335,336,342,345,347,350,351,356,357,392,412,413,417,426,427,431,436,473,494,495,497,515,516,519,520,],[9,9,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,9,-94,-109,-104,-65,-93,-110,9,-215,-107,-111,9,-63,-116,-29,-105,-62,-101,-67,-112,-106,9,-108,9,-103,-117,-68,-98,108,9,-53,9,-82,9,-61,-131,-301,-130,-147,-146,-160,-88,-90,9,-87,-89,-92,-81,-69,-30,182,9,-70,9,-83,-161,251,9,-36,-35,9,9,-73,-76,-72,-74,9,-78,-193,-192,-77,-194,-75,-132,-302,-148,362,-71,-31,-34,418,9,9,-212,-211,9,-209,-195,-208,-196,-134,-133,-150,-149,9,-33,-32,463,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'SIZEOF':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303
,-301,-28,-160,-27,-81,-69,156,-28,-303,156,-161,-303,156,156,-264,156,-262,156,-261,156,-260,156,156,-259,-263,156,156,156,-73,-76,-72,156,-74,156,156,-78,-193,-192,-77,-194,156,-75,-260,-302,156,156,156,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,156,-227,-228,-220,-226,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,-303,-260,156,-212,-211,156,-209,156,156,156,-195,156,-208,-196,156,156,156,-260,156,156,-12,156,156,-11,156,156,-28,-303,-260,-207,-210,156,-199,156,-197,-303,-176,156,156,-303,156,-260,156,156,156,156,-198,156,156,156,156,-11,156,-203,-202,-200,156,-303,156,156,156,-204,-201,156,-206,-205,]),'UNSIGNED':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[20,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,20,-94,-109,-104,-65,-93,-110,20,-215,-107,-111,20,-63,-116,20,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,20,-53,20,-82,20,20,-61,-131,-301,-130,20,-147,-146,-160,-88,-90,20,-87,-89,-92,-81,-84,-86,-69,-30,20,20,-70,20,-83,20,20,-128,-140,-137,20,20,20,-161,20,20,-36,-35,20,20,-73,-76,-72,-74,20,-78,-193,-192,-77,-194,-75,20,20,-129,-132,-138,-302,-126,-127,-148,-71,20,-31,20,20,20,-34,20,20,20,-212,-211,20,-209,-195,-208,-196,-134,-133,-139,-150,-149,20,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'UNION':([0,1,3,7,8,9,11,12,14,17,18,19,23,24,26,34,35,36,37,40,42,47,49,51,53,54,55,56,57,60,61,64,65,67,68,70,72,78,87,101,102,103,104,105,117,120,121,122,123
,124,126,127,128,129,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,232,233,254,273,282,283,284,287,289,323,327,332,333,335,336,342,345,347,354,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[22,-303,-115,-97,-52,-95,-96,-64,-60,-66,22,-94,-65,-93,22,-63,-116,22,-29,-62,-67,-303,-303,-117,-68,-98,-85,-10,-9,22,-53,-82,22,22,-61,-301,22,-160,22,-81,-84,-86,-69,-30,22,-70,22,-83,22,22,-140,-137,22,22,-161,22,22,-36,-35,22,22,-73,-76,-72,-74,22,-78,-193,-192,-77,-194,-75,22,22,-138,-302,-71,22,-31,22,22,22,-34,22,22,-212,-211,22,-209,-195,-208,-196,-139,22,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'COLON':([2,3,5,6,8,10,15,20,21,25,29,30,32,35,37,39,41,44,45,48,50,51,61,69,71,73,74,85,86,88,105,115,119,125,130,140,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,180,184,185,209,224,226,230,231,233,234,240,241,245,247,272,274,275,276,278,282,285,286,288,289,293,340,341,350,351,353,356,357,364,365,375,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,430,440,441,451,452,453,458,477,478,487,499,503,512,],[-102,-115,-113,-99,-52,-114,-100,-109,-104,-110,-215,-107,-111,-116,-29,-105,-101,-112,-106,-108,-103,-117,-53,-131,-130,-147,-146,-54,-157,-37,-30,-288,-156,-128,235,-55,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,-230,-294,-290,-216,-36,-35,338,-213,348,-129,-132,-302,-126,-127,355,-148,-38,-300,-257,-298,-274,-273,-31,-251,-256,-254,-34,-255,433,-229,-134,-133,235,-150,-149,-44,-43,-217,-272,-271,-270,-269,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,460,-247,-235,-33,-32,-214,-39,-42,-258,-266,-267,-252,-41,-40,-231,-282,-275,-276,]),'$end':([0,12,14,17,23,26,34,40,42,43,52,53,68,101,104,120,233,254,347,],[-303,-64,-60,-66,-65,-58,-63,-62,-67,0,-59,-68,-61,-81,-69,-70,-302,-71,-196,]),'WSTRING_LITERAL':([3,35,51,53,59,70,76,78,7
9,101,104,106,107,108,121,136,141,144,145,148,150,152,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,152,-28,-303,152,-161,-303,152,152,-264,272,-299,152,-262,152,-261,152,-260,152,152,-259,-263,152,152,152,-73,-76,-72,152,-74,152,152,-78,-193,-192,-77,-194,152,-75,-260,-302,152,152,152,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,152,-227,-228,-220,-226,-300,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,-303,-260,152,-212,-211,152,-209,152,152,152,-195,152,-208,-196,152,152,152,-260,152,152,-12,152,152,-11,152,152,-28,-303,-260,-207,-210,152,-199,152,-197,-303,-176,152,152,-303,152,-260,152,152,152,152,-198,152,152,152,152,-11,152,-203,-202,-200,152,-303,152,152,152,-204,-201,152,-206,-205,]),'DIVIDE':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,304,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,304,304,304,304,304,304,304,304,304,304,-234,-233,304,304,304,304,304,-235,-258,-266,-267,-252,-282,-275,-276,]),'FOR':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,
436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,206,-73,-76,-72,-74,206,-78,-193,-192,-77,-194,206,-75,-302,-212,-211,-209,206,-195,-208,-196,206,-207,-210,-199,206,-197,206,-198,206,206,-203,-202,-200,206,206,-204,-201,206,-206,-205,]),'PLUSPLUS':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,115,121,136,141,144,145,147,148,150,151,152,153,154,155,156,157,158,159,163,164,166,167,168,169,170,171,172,173,174,175,177,178,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,226,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,275,276,278,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,377,378,379,380,383,388,389,416,417,418,424,426,427,429,431,433,436,447,450,452,453,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,166,-28,-303,-288,166,-161,-303,166,166,-292,-264,-280,-295,-299,-296,-293,-278,166,-262,-279,278,166,-261,166,-291,-260,-265,166,166,-297,-259,-289,-277,-294,-290,-263,166,166,166,-73,-76,-72,166,-74,166,166,-78,-193,-192,-77,-194,166,-75,-260,-288,-302,166,166,166,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,166,-227,-228,-220,-226,-300,166,-298,-274,-273,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,166,-303,-260,166,-212,-211,166,-209,166,166,166,-195,166,-208,-196,166,166,166,-260,166,166,-12,166,166,-11,-272,-271,-270,-269,-268,-281,166,166,-28,-303,-260,-207,-210,166,-199,166,-197,-303,-176,-266,-267,166,166,-303,166,-260,166,166,166,166,-198,166,166,166,166,-11,166,-203,-202,-200,-282,166,-303,-275,166,166,-276,166,-204,-201,166,-206,-205,]),'EQUALS':([8,37,61,85,86,87,88,89,97,105,115,119,135,140,147,149,150,151,152,153,154,155,158,159,167,169,172,
174,175,177,178,184,185,226,233,247,272,274,275,276,278,282,285,286,288,289,293,364,365,368,373,377,378,379,380,383,388,412,413,440,441,445,449,451,452,453,458,477,478,479,499,503,512,],[-52,-29,-53,-54,-157,-156,-37,144,145,-30,-288,-156,246,-55,-292,263,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-36,-35,-288,-302,-38,-300,-257,-298,-274,-273,-31,-251,-256,-254,-34,-255,-44,-43,-177,450,-272,-271,-270,-269,-268,-281,-33,-32,-39,-42,-180,-178,-258,-266,-267,-252,-41,-40,-179,-282,-275,-276,]),'ELSE':([53,104,199,200,203,205,214,217,222,233,332,333,336,345,347,426,427,431,436,473,494,495,497,515,516,519,520,],[-68,-69,-73,-76,-72,-74,-78,-77,-75,-302,-212,-211,-209,-208,-196,-207,-210,-199,-197,-198,-203,-202,509,-204,-201,-206,-205,]),'ANDEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,268,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'EQ':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,308,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,308,-239,-237,-241,-245,-240,-236,-243,308,-234,-233,-242,308,-244,308,308,-235,-258,-266,-267,-252,-282,-275,-276,]),'AND':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,115,121,136,141,144,145,147,148,149,150,151,152,153,154,155,156,157,158,159,161,163,164,166,167,168,169,170,171,172,173,174,175,176,177,178,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,
222,225,226,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,278,280,281,284,285,286,287,288,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,377,378,379,380,383,388,389,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,416,417,418,424,426,427,429,431,433,436,447,450,451,452,453,454,456,458,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,173,-28,-303,-288,173,-161,-303,173,173,-292,-264,-251,-280,-295,-299,-296,-293,-278,173,-262,-279,-253,-232,173,-261,173,-291,-260,-265,173,173,-297,-259,-289,-277,309,-294,-290,-263,173,173,173,-73,-76,-72,173,-74,173,173,-78,-193,-192,-77,-194,173,-75,-260,-288,-302,173,173,173,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,173,-227,-228,-220,-226,-300,173,-257,-298,-274,-273,173,173,173,-251,-256,173,-254,-255,173,173,173,173,173,173,173,173,173,173,173,173,173,173,173,173,173,173,173,-303,-260,173,-212,-211,173,-209,173,173,173,-195,173,-208,-196,173,173,173,-260,173,173,-12,173,173,-11,-272,-271,-270,-269,-268,-281,173,-238,309,-239,-237,-241,-245,-240,-236,-243,309,-234,-233,-242,309,-244,-246,309,-235,173,-28,-303,-260,-207,-210,173,-199,173,-197,-303,-176,-258,-266,-267,173,173,-252,-303,173,-260,173,173,173,173,-198,173,173,173,173,-11,173,-203,-202,-200,-282,173,-303,-275,173,173,-276,173,-204,-201,173,-206,-205,]),'TYPEID':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,76,77,78,79,80,81,83,84,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,137,139,142,146,170,184,185,186,189,198,199,200,203,205,212,214,215,216,217,219,222,2
28,229,230,231,232,233,234,240,245,254,273,277,279,282,283,284,287,289,323,327,332,333,335,336,342,345,347,350,351,353,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[29,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,29,-94,-109,-104,-136,-65,-93,-110,29,69,73,-215,-107,-303,-111,88,-63,-116,29,-29,-135,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,29,-53,88,-82,29,29,-61,-131,-301,-130,29,-147,-146,-28,-158,-160,-27,-88,88,-90,88,29,-87,-89,-92,-81,-84,-86,-69,-30,193,29,-70,29,-83,29,29,-128,-140,-137,29,29,88,-161,-159,88,29,88,29,-36,-35,29,193,29,-73,-76,-72,-74,29,-78,-193,-192,-77,-194,-75,29,29,-129,-132,-138,-302,-126,-127,-148,-71,29,377,379,-31,29,29,29,-34,29,29,-212,-211,29,-209,-195,-208,-196,-134,-133,88,-139,-150,-149,29,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'LBRACE':([8,18,22,27,28,37,38,53,61,62,64,66,67,69,70,71,73,74,87,101,104,105,121,122,143,144,145,184,185,199,200,203,205,212,214,215,216,217,219,221,222,233,256,282,289,332,333,336,338,342,345,347,348,366,372,374,389,412,413,426,427,431,433,436,447,450,451,456,457,459,472,473,474,476,480,481,494,495,497,502,509,514,515,516,518,519,520,],[-52,-303,-136,70,70,-29,-135,-68,-53,-7,-82,70,-8,70,-301,70,70,70,-303,-81,-69,-30,70,-83,70,70,70,-36,-35,-73,-76,-72,-74,70,-78,-193,-192,-77,-194,70,-75,-302,-303,-31,-34,-212,-211,-209,70,-195,-208,-196,70,-12,70,-11,70,-33,-32,-207,-210,-199,70,-197,-303,-176,70,70,70,-303,70,-198,70,70,70,-11,-203,-202,-200,-303,70,70,-204,-201,70,-206,-205,]),'PPHASH':([0,12,14,17,23,26,34,40,42,53,68,101,104,120,233,254,347,],[42,-64,-60,-66,-65,42,-63,-62,-67,-68,-61,-81,-69,-70,-302,-71,-196,]),'INT':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130
,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[50,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,50,-94,-109,-104,-65,-93,-110,50,-215,-107,-111,50,-63,-116,50,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,50,-53,50,-82,50,50,-61,-131,-301,-130,50,-147,-146,-160,-88,-90,50,-87,-89,-92,-81,-84,-86,-69,-30,50,50,-70,50,-83,50,50,-128,-140,-137,50,50,50,-161,50,50,-36,-35,50,50,-73,-76,-72,-74,50,-78,-193,-192,-77,-194,-75,50,50,-129,-132,-138,-302,-126,-127,-148,-71,50,-31,50,50,50,-34,50,50,50,-212,-211,50,-209,-195,-208,-196,-134,-133,-139,-150,-149,50,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'SIGNED':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[48,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,48,-94,-109,-104,-65,-93,-110,48,-215,-107,-111,48,-63,-116,48,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,48,-53,48,-82,48,48,-61,-131,-301,-130,48,-147,-146,-160,-88,-90,48,-87,-89,-92,-81,-84,-86,-69,-30,48,48,-70,48,-83,48,48,-128,-140,-137,48,48,48,-161,48,48,-36,-35,48,48,-73,-76,-72,-74,48,-78,-193,-192,-77,-194,-75,48,48,-129,-132,-138,-302,-126,-127,-148,-71,48,-31,48,48,48,-34,48,48,48,-212,-211,48,-209,-195,-208,-196,-134,-133,-139,-150,-149,48,-33,-32,-2
07,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'CONTINUE':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,207,-73,-76,-72,-74,207,-78,-193,-192,-77,-194,207,-75,-302,-212,-211,-209,207,-195,-208,-196,207,-207,-210,-199,207,-197,207,-198,207,207,-203,-202,-200,207,207,-204,-201,207,-206,-205,]),'NOT':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,181,-28,-303,181,-161,-303,181,181,-264,181,-262,181,-261,181,-260,181,181,-259,-263,181,181,181,-73,-76,-72,181,-74,181,181,-78,-193,-192,-77,-194,181,-75,-260,-302,181,181,181,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,181,-227,-228,-220,-226,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,181,-303,-260,181,-212,-211,181,-209,181,181,181,-195,181,-208,-196,181,181,181,-260,181,181,-12,181,181,-11,181,181,-28,-303,-260,-207,-210,181,-199,181,-197,-303,-176,181,181,-303,181,-260,181,181,181,181,-198,181,181,181,181,-11,181,-203,-202,-200,181,-303,181,181,181,-204,-201,181,-206,-205,]),'OREQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,269,-280,-295,-299,-296,-
293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'MOD':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,312,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,312,312,312,312,312,312,312,312,312,312,-234,-233,312,312,312,312,312,-235,-258,-266,-267,-252,-282,-275,-276,]),'RSHIFT':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,294,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,294,-239,-237,294,294,294,-236,294,294,-234,-233,294,294,294,294,294,-235,-258,-266,-267,-252,-282,-275,-276,]),'DEFAULT':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,209,-73,-76,-72,-74,209,-78,-193,-192,-77,-194,209,-75,-302,-212,-211,-209,209,-195,-208,-196,209,-207,-210,-199,209,-197,209,-198,209,209,-203,-202,-200,209,209,-204,-201,209,-206,-205,]),'__INT128':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,1
98,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[25,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,25,-94,-109,-104,-65,-93,-110,25,-215,-107,-111,25,-63,-116,25,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,25,-53,25,-82,25,25,-61,-131,-301,-130,25,-147,-146,-160,-88,-90,25,-87,-89,-92,-81,-84,-86,-69,-30,25,25,-70,25,-83,25,25,-128,-140,-137,25,25,25,-161,25,25,-36,-35,25,25,-73,-76,-72,-74,25,-78,-193,-192,-77,-194,-75,25,25,-129,-132,-138,-302,-126,-127,-148,-71,25,-31,25,25,25,-34,25,25,25,-212,-211,25,-209,-195,-208,-196,-134,-133,-139,-150,-149,25,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'WHILE':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,346,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,210,-73,-76,-72,-74,210,-78,-193,-192,-77,-194,210,-75,-302,-212,-211,-209,210,-195,-208,435,-196,210,-207,-210,-199,210,-197,210,-198,210,210,-203,-202,-200,210,210,-204,-201,210,-206,-205,]),'DIVEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,260,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'EXTERN':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,60,61,63,64,67,68,69,70,71,73,74,80,83,87,91,92,96,101,104,105,113,120,121,122,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,254,282,289,323,327,332,333,335,336,342,345,
347,350,351,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[11,11,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,11,-94,-109,-104,-65,-93,-110,11,-215,-107,-111,11,-63,-116,-29,-105,-62,-101,-67,-112,-106,11,-108,11,-103,-117,-68,-98,11,-53,11,-82,11,-61,-131,-301,-130,-147,-146,-88,-90,11,-87,-89,-92,-81,-69,-30,11,-70,11,-83,11,-36,-35,11,11,-73,-76,-72,-74,11,-78,-193,-192,-77,-194,-75,-132,-302,-148,-71,-31,-34,11,11,-212,-211,11,-209,-195,-208,-196,-134,-133,-150,-149,11,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'CASE':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,211,-73,-76,-72,-74,211,-78,-193,-192,-77,-194,211,-75,-302,-212,-211,-209,211,-195,-208,-196,211,-207,-210,-199,211,-197,211,-198,211,211,-203,-202,-200,211,211,-204,-201,211,-206,-205,]),'LAND':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,307,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,307,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-258,-266,-267,-252,-282,-275,-276,]),'REGISTER':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,60,61,63,64,67,68,69,70,71,73,74,80,83,87,91,92,96,101,104,105,113,120,121,122,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,254,282,289,323,327,332,333,335,336,342,345,347,350,351,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[19,19,-102,-115,-113,-99,-
97,-52,-95,-114,-96,-64,-60,-100,-91,-66,19,-94,-109,-104,-65,-93,-110,19,-215,-107,-111,19,-63,-116,-29,-105,-62,-101,-67,-112,-106,19,-108,19,-103,-117,-68,-98,19,-53,19,-82,19,-61,-131,-301,-130,-147,-146,-88,-90,19,-87,-89,-92,-81,-69,-30,19,-70,19,-83,19,-36,-35,19,19,-73,-76,-72,-74,19,-78,-193,-192,-77,-194,-75,-132,-302,-148,-71,-31,-34,19,19,-212,-211,19,-209,-195,-208,-196,-134,-133,-150,-149,19,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'MODEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,262,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'NE':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,299,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,299,-239,-237,-241,-245,-240,-236,-243,299,-234,-233,-242,299,-244,299,299,-235,-258,-266,-267,-252,-282,-275,-276,]),'SWITCH':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,213,-73,-76,-72,-74,213,-78,-193,-192,-77,-194,213,-75,-302,-212,-211,-209,213,-195,-208,-196,213,-207,-210,-199,213,-197,213,-198,213,213,-203,-202,-200,213,213,-204,-201,213,-206,-205,]),'INT_CONST_HEX':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,2
11,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,167,-28,-303,167,-161,-303,167,167,-264,167,-262,167,-261,167,-260,167,167,-259,-263,167,167,167,-73,-76,-72,167,-74,167,167,-78,-193,-192,-77,-194,167,-75,-260,-302,167,167,167,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,167,-227,-228,-220,-226,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,167,-303,-260,167,-212,-211,167,-209,167,167,167,-195,167,-208,-196,167,167,167,-260,167,167,-12,167,167,-11,167,167,-28,-303,-260,-207,-210,167,-199,167,-197,-303,-176,167,167,-303,167,-260,167,167,167,167,-198,167,167,167,167,-11,167,-203,-202,-200,167,-303,167,167,167,-204,-201,167,-206,-205,]),'_COMPLEX':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[30,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,30,-94,-109,-104,-65,-93,-110,30,-215,-107,-111,30,-63,-116,30,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,30,-53,30,-82,30,30,-61,-131,-301,-130,30,-147,-146,-160,-88,-90,30,-87,-89,-92,-81
,-84,-86,-69,-30,30,30,-70,30,-83,30,30,-128,-140,-137,30,30,30,-161,30,30,-36,-35,30,30,-73,-76,-72,-74,30,-78,-193,-192,-77,-194,-75,30,30,-129,-132,-138,-302,-126,-127,-148,-71,30,-31,30,30,30,-34,30,30,30,-212,-211,30,-209,-195,-208,-196,-134,-133,-139,-150,-149,30,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'PPPRAGMASTR':([53,],[104,]),'PLUSEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,265,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'STRUCT':([0,1,3,7,8,9,11,12,14,17,18,19,23,24,26,34,35,36,37,40,42,47,49,51,53,54,55,56,57,60,61,64,65,67,68,70,72,78,87,101,102,103,104,105,117,120,121,122,123,124,126,127,128,129,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,232,233,254,273,282,283,284,287,289,323,327,332,333,335,336,342,345,347,354,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[38,-303,-115,-97,-52,-95,-96,-64,-60,-66,38,-94,-65,-93,38,-63,-116,38,-29,-62,-67,-303,-303,-117,-68,-98,-85,-10,-9,38,-53,-82,38,38,-61,-301,38,-160,38,-81,-84,-86,-69,-30,38,-70,38,-83,38,38,-140,-137,38,38,-161,38,38,-36,-35,38,38,-73,-76,-72,-74,38,-78,-193,-192,-77,-194,-75,38,38,-138,-302,-71,38,-31,38,38,38,-34,38,38,-212,-211,38,-209,-195,-208,-196,-139,38,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'CONDOP':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,310,-294,-290,-288,-302,-300,-257,-298,-274,-2
73,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-258,-266,-267,-252,-282,-275,-276,]),'BREAK':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,218,-73,-76,-72,-74,218,-78,-193,-192,-77,-194,218,-75,-302,-212,-211,-209,218,-195,-208,-196,218,-207,-210,-199,218,-197,218,-198,218,218,-203,-202,-200,218,218,-204,-201,218,-206,-205,]),'VOLATILE':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,31,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,59,60,61,63,64,67,68,69,70,71,72,73,74,76,78,80,83,87,91,92,96,101,104,105,107,108,113,120,121,122,123,124,125,126,127,128,129,130,136,141,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,250,251,254,273,282,283,284,287,289,292,322,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,417,418,426,427,431,436,473,494,495,497,515,516,519,520,],[51,51,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,51,-94,-109,-104,-65,-93,-110,51,-215,-107,51,-111,51,-63,-116,-29,-105,-62,-101,-67,-112,-106,51,-108,51,-103,-117,-68,-98,51,51,-53,51,-82,51,-61,-131,-301,-130,51,-147,-146,51,-160,-88,-90,51,-87,-89,-92,-81,-69,-30,51,51,51,-70,51,-83,51,51,-128,-140,-137,51,51,51,-161,51,51,51,-36,-35,51,51,-73,-76,-72,-74,51,-78,-193,-192,-77,-194,-75,51,51,-129,-132,-138,-302,-126,-127,-148,51,51,-71,51,-31,51,51,51,-34,51,51,51,51,-212,-211,51,-209,-195,-208,-196,-134,-133,-139,-150,-149,51,-33,-32,51,51,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'PPPRAGMA':([0,12,14,17,23,26,34,40,42,53,68,70,101,104,120,121,199,200,203,205,212,214,215,216,217,219,221,222,233,254,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[53,-64,
-60,-66,-65,53,-63,-62,-67,-68,-61,-301,-81,-69,-70,53,-73,-76,-72,-74,53,-78,-193,-192,-77,-194,53,-75,-302,-71,-212,-211,-209,53,-195,-208,-196,53,-207,-210,-199,53,-197,53,-198,53,53,-203,-202,-200,53,53,-204,-201,53,-206,-205,]),'INLINE':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,60,61,63,64,67,68,69,70,71,73,74,80,83,87,91,92,96,101,104,105,113,120,121,122,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,254,282,289,323,327,332,333,335,336,342,345,347,350,351,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[54,54,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,54,-94,-109,-104,-65,-93,-110,54,-215,-107,-111,54,-63,-116,-29,-105,-62,-101,-67,-112,-106,54,-108,54,-103,-117,-68,-98,54,-53,54,-82,54,-61,-131,-301,-130,-147,-146,-88,-90,54,-87,-89,-92,-81,-69,-30,54,-70,54,-83,54,-36,-35,54,54,-73,-76,-72,-74,54,-78,-193,-192,-77,-194,-75,-132,-302,-148,-71,-31,-34,54,54,-212,-211,54,-209,-195,-208,-196,-134,-133,-150,-149,54,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'INT_CONST_BIN':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,147,-28,-303,147,-161,-303,147,147,-264,147,-262,147,-261,147,-260,147,147,-259,-263,147,147,147,-73,-76,-72,147,-74,147,147,-78,-193,-192,-77,-194,147,-75,
-260,-302,147,147,147,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,147,-227,-228,-220,-226,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,-303,-260,147,-212,-211,147,-209,147,147,147,-195,147,-208,-196,147,147,147,-260,147,147,-12,147,147,-11,147,147,-28,-303,-260,-207,-210,147,-199,147,-197,-303,-176,147,147,-303,147,-260,147,147,147,147,-198,147,147,147,147,-11,147,-203,-202,-200,147,-303,147,147,147,-204,-201,147,-206,-205,]),'DO':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,221,-73,-76,-72,-74,221,-78,-193,-192,-77,-194,221,-75,-302,-212,-211,-209,221,-195,-208,-196,221,-207,-210,-199,221,-197,221,-198,221,221,-203,-202,-200,221,221,-204,-201,221,-206,-205,]),'LNOT':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,148,-28,-303,148,-161,-303,148,148,-264,148,-262,148,-261,148,-260,148,148,-259,-263,148,148,148,-73,-76,-72,148,-74,148,148,-78,-193,-192,-77,-194,148,-75,-260,-302,148,148,148,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,148,-227,-228,-220,-226,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,-303,-260,148,-212,-211,148,-209,148,148,148,-195,148,-208,-196,148,148,148,-260,148,148,-12,
148,148,-11,148,148,-28,-303,-260,-207,-210,148,-199,148,-197,-303,-176,148,148,-303,148,-260,148,148,148,148,-198,148,148,148,148,-11,148,-203,-202,-200,148,-303,148,148,148,-204,-201,148,-206,-205,]),'CONST':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,31,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,59,60,61,63,64,67,68,69,70,71,72,73,74,76,78,80,83,87,91,92,96,101,104,105,107,108,113,120,121,122,123,124,125,126,127,128,129,130,136,141,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,250,251,254,273,282,283,284,287,289,292,322,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,417,418,426,427,431,436,473,494,495,497,515,516,519,520,],[3,3,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,3,-94,-109,-104,-65,-93,-110,3,-215,-107,3,-111,3,-63,-116,-29,-105,-62,-101,-67,-112,-106,3,-108,3,-103,-117,-68,-98,3,3,-53,3,-82,3,-61,-131,-301,-130,3,-147,-146,3,-160,-88,-90,3,-87,-89,-92,-81,-69,-30,3,3,3,-70,3,-83,3,3,-128,-140,-137,3,3,3,-161,3,3,3,-36,-35,3,3,-73,-76,-72,-74,3,-78,-193,-192,-77,-194,-75,3,3,-129,-132,-138,-302,-126,-127,-148,3,3,-71,3,-31,3,3,3,-34,3,3,3,3,-212,-211,3,-209,-195,-208,-196,-134,-133,-139,-150,-149,3,-33,-32,3,3,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'LOR':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,295,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-258,-266,-267,-252,-282,-275,-276,]),'CHAR_CONST':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157
,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,151,-28,-303,151,-161,-303,151,151,-264,151,-262,151,-261,151,-260,151,151,-259,-263,151,151,151,-73,-76,-72,151,-74,151,151,-78,-193,-192,-77,-194,151,-75,-260,-302,151,151,151,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,151,-227,-228,-220,-226,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,-303,-260,151,-212,-211,151,-209,151,151,151,-195,151,-208,-196,151,151,151,-260,151,151,-12,151,151,-11,151,151,-28,-303,-260,-207,-210,151,-199,151,-197,-303,-176,151,151,-303,151,-260,151,151,151,151,-198,151,151,151,151,-11,151,-203,-202,-200,151,-303,151,151,151,-204,-201,151,-206,-205,]),'LSHIFT':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,296,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,296,-239,-237,296,296,296,-236,296,296,-234,-233,296,296,296,296,296,-235,-258,-266,-267,-252,-282,-275,-276,]),'RBRACE':([53,70,101,104,115,121,126,127,129,133,134,135,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,180,199,200,203,205,212,214,215,216,217,219,220,222,223,228,229,232,
233,242,243,244,256,257,272,274,275,276,278,285,286,288,293,332,333,336,341,342,345,347,354,358,359,367,371,374,375,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,426,427,431,436,444,447,448,451,452,453,458,473,482,486,487,494,495,497,498,499,502,503,512,515,516,519,520,],[-68,-301,-81,-69,-288,-303,-140,-137,233,-151,233,-154,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,-230,-294,-290,-216,-73,-76,-72,-74,-6,-78,-193,-192,-77,-194,-5,-75,233,233,233,-138,-302,233,233,-152,-303,-171,-300,-257,-298,-274,-273,-251,-256,-254,-255,-212,-211,-209,-229,-195,-208,-196,-139,-153,-155,233,-22,-21,-217,-272,-271,-270,-269,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-207,-210,-199,-197,-172,233,-174,-258,-266,-267,-252,-198,-173,233,-231,-203,-202,-200,-175,-282,233,-275,-276,-204,-201,-206,-205,]),'_BOOL':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[15,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,15,-94,-109,-104,-65,-93,-110,15,-215,-107,-111,15,-63,-116,15,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,15,-53,15,-82,15,15,-61,-131,-301,-130,15,-147,-146,-160,-88,-90,15,-87,-89,-92,-81,-84,-86,-69,-30,15,15,-70,15,-83,15,15,-128,-140,-137,15,15,15,-161,15,15,-36,-35,15,15,-73,-76,-72,-74,15,-78,-193,-192,-77,-194,-75,15,15,-129,-132,-138,-302,-126,-127,-148,-71,15,-31,15,15,15,-34,15,15,15,-212,-211,15,-209,-195,-2
08,-196,-134,-133,-139,-150,-149,15,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'LE':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,298,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,298,-239,-237,-241,298,-240,-236,-243,298,-234,-233,-242,298,298,298,298,-235,-258,-266,-267,-252,-282,-275,-276,]),'SEMI':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,46,47,48,49,50,51,53,54,55,56,57,61,63,65,68,69,70,71,72,73,74,80,82,83,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,115,119,120,121,123,124,125,126,127,129,130,140,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,180,184,185,199,200,202,203,204,205,207,208,212,214,215,216,217,218,219,220,221,222,224,226,228,229,230,231,232,233,234,236,237,238,239,240,241,245,247,248,254,255,257,258,259,272,274,275,276,278,282,285,286,288,289,293,331,332,333,334,335,336,338,341,342,343,345,347,348,350,351,352,354,356,357,364,365,375,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,412,413,426,427,428,429,430,431,433,436,438,439,440,441,444,451,452,453,458,470,471,472,473,474,476,477,478,482,487,492,494,495,497,499,503,508,509,512,514,515,516,518,519,520,],[17,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,-94,-109,-104,-65,-93,-110,17,-215,-107,-111,-303,-63,-116,-303,-29,-105,-62,-101,-67,-112,-106,101,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,-53,-303,-303,-61,-131,-301,-130,126,-147,-146,-88,-20,-90,-54,-157,-156,-37,-120,-79,-87,-89,-19,-118,-122,-92,-124,-16,-80,-15,-81,-84,-86,-69,-30,-288,-156
,-70,-303,126,126,-128,-140,-137,126,-303,-55,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,-230,-294,-290,-216,-36,-35,-73,-76,332,-72,333,-74,336,-14,-303,-78,-193,-192,-77,345,-194,-13,-303,-75,-213,-288,126,126,-129,-132,-138,-302,-126,-26,-25,354,-141,-127,-143,-148,-38,-119,-71,-121,-171,-125,-123,-300,-257,-298,-274,-273,-31,-251,-256,-254,-34,-255,426,-212,-211,427,-303,-209,-303,-229,-195,-13,-208,-196,-303,-134,-133,-145,-139,-150,-149,-44,-43,-217,-272,-271,-270,-269,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-33,-32,-207,-210,470,-303,-214,-199,-303,-197,-142,-144,-39,-42,-172,-258,-266,-267,-252,-303,493,-303,-198,-303,-303,-41,-40,-173,-231,506,-203,-202,-200,-282,-275,515,-303,-276,-303,-204,-201,-303,-206,-205,]),'LT':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,300,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,300,-239,-237,-241,300,-240,-236,-243,300,-234,-233,-242,300,300,300,300,-235,-258,-266,-267,-252,-282,-275,-276,]),'COMMA':([2,3,5,6,7,8,9,10,11,15,16,19,20,21,24,25,29,30,31,32,35,37,39,41,44,45,48,50,51,54,61,69,71,73,74,76,77,78,79,80,82,83,85,86,87,88,89,91,92,94,95,96,97,98,105,112,113,114,115,116,118,119,125,133,134,135,136,137,140,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,180,184,185,187,188,189,190,191,192,193,194,196,208,224,226,230,231,233,234,236,239,240,241,242,243,244,245,247,248,255,257,258,259,272,274,275,276,278,282,285,286,288,289,290,292,293,320,321,328,330,334,341,350,351,352,356,357,358,359,364,365,371,375,377,378,379,380,381,382,383,384,385,388,390,391,393,39
4,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,414,415,421,422,430,432,434,437,438,439,440,441,444,448,451,452,453,458,465,466,467,468,469,477,478,482,483,486,487,488,489,496,498,499,503,504,505,511,512,],[-102,-115,-113,-99,-97,-52,-95,-114,-96,-100,-91,-94,-109,-104,-93,-110,-215,-107,-303,-111,-116,-29,-105,-101,-112,-106,-108,-103,-117,-98,-53,-131,-130,-147,-146,-28,-158,-160,-27,-88,139,-90,-54,-157,-156,-37,-120,-87,-89,-118,-122,-92,-124,146,-30,-164,-303,197,-288,198,-169,-156,-128,-151,244,-154,-161,-159,-55,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,-230,-294,-290,-216,-36,-35,-168,-2,-182,-56,-166,-1,-45,-167,-184,337,-213,-288,-129,-132,-302,-126,353,-141,-127,-143,244,244,-152,-148,-38,-119,-121,-171,-125,-123,-300,-257,-298,-274,-273,-31,-251,-256,-254,-34,337,-303,-255,-57,-183,-170,-165,337,-229,-134,-133,-145,-150,-149,-153,-155,-44,-43,447,-217,-272,-271,-270,-269,337,-286,-268,454,455,-281,-181,-182,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,337,-247,-235,-33,-32,-191,-185,-187,-189,-214,337,337,337,-142,-144,-39,-42,-172,-174,-258,-266,-267,-252,-51,-50,-186,-188,-190,-41,-40,-173,-287,502,-231,-46,-49,337,-175,-282,-275,-48,-47,337,-276,]),'OFFSETOF':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,163,164,166,168,170,171,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,162,-28,-303,162,-161,-303,162,
162,-264,162,-262,162,-261,162,-260,162,162,-259,-263,162,162,162,-73,-76,-72,162,-74,162,162,-78,-193,-192,-77,-194,162,-75,-260,-302,162,162,162,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,162,-227,-228,-220,-226,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,162,-303,-260,162,-212,-211,162,-209,162,162,162,-195,162,-208,-196,162,162,162,-260,162,162,-12,162,162,-11,162,162,-28,-303,-260,-207,-210,162,-199,162,-197,-303,-176,162,162,-303,162,-260,162,162,162,162,-198,162,162,162,162,-11,162,-203,-202,-200,162,-303,162,162,162,-204,-201,162,-206,-205,]),'TYPEDEF':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,60,61,63,64,67,68,69,70,71,73,74,80,83,87,91,92,96,101,104,105,113,120,121,122,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,254,282,289,323,327,332,333,335,336,342,345,347,350,351,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[7,7,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,7,-94,-109,-104,-65,-93,-110,7,-215,-107,-111,7,-63,-116,-29,-105,-62,-101,-67,-112,-106,7,-108,7,-103,-117,-68,-98,7,-53,7,-82,7,-61,-131,-301,-130,-147,-146,-88,-90,7,-87,-89,-92,-81,-69,-30,7,-70,7,-83,7,-36,-35,7,7,-73,-76,-72,-74,7,-78,-193,-192,-77,-194,-75,-132,-302,-148,-71,-31,-34,7,7,-212,-211,7,-209,-195,-208,-196,-134,-133,-150,-149,7,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'XOR':([115,147,149,150,151,152,153,154,155,158,159,161,167,169,172,174,175,176,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,451,452,453,458,499,503,512,],[-288,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,-232,-291,-265,-297,-289,-277,303,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-238,303,-239,-237,-241,-245,-240,-236,-243,
-248,-234,-233,-242,303,-244,-246,303,-235,-258,-266,-267,-252,-282,-275,-276,]),'AUTO':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,37,39,40,41,42,44,45,47,48,49,50,51,53,54,60,61,63,64,67,68,69,70,71,73,74,80,83,87,91,92,96,101,104,105,113,120,121,122,142,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,231,233,245,254,282,289,323,327,332,333,335,336,342,345,347,350,351,356,357,392,412,413,426,427,431,436,473,494,495,497,515,516,519,520,],[24,24,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,24,-94,-109,-104,-65,-93,-110,24,-215,-107,-111,24,-63,-116,-29,-105,-62,-101,-67,-112,-106,24,-108,24,-103,-117,-68,-98,24,-53,24,-82,24,-61,-131,-301,-130,-147,-146,-88,-90,24,-87,-89,-92,-81,-69,-30,24,-70,24,-83,24,-36,-35,24,24,-73,-76,-72,-74,24,-78,-193,-192,-77,-194,-75,-132,-302,-148,-71,-31,-34,24,24,-212,-211,24,-209,-195,-208,-196,-134,-133,-150,-149,24,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'TIMES':([0,1,2,3,4,5,6,7,9,10,11,12,14,15,16,17,19,20,21,23,24,25,26,29,30,31,32,33,34,35,36,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,59,63,65,68,69,70,71,73,74,76,77,78,79,80,81,83,91,92,96,101,102,103,104,106,107,108,113,115,120,121,125,130,136,139,141,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,163,164,166,167,168,169,170,171,172,173,174,175,176,177,178,181,182,183,186,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,226,230,231,233,234,235,240,245,246,249,250,251,254,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,278,280,281,284,285,286,287,288,292,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,350,351,353,355,356,357,361,362,363,366,370,372,374,377,378,379,380,383,388,389,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,416,417,418,424,426,427,429,431,433,436,447,450,451,452,453,454,456,
458,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[31,-303,-102,-115,31,-113,-99,-97,-95,-114,-96,-64,-60,-100,-91,-66,-94,-109,-104,-65,-93,-110,31,-215,-107,-303,-111,31,-63,-116,31,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,-303,31,31,-61,-131,-301,-130,-147,-146,-28,31,-160,-27,-88,31,-90,-87,-89,-92,-81,-84,-86,-69,168,-28,-303,31,-288,-70,225,-128,31,-161,31,-303,225,225,31,-292,-264,-251,-280,-295,-299,-296,-293,-278,225,-262,-279,-253,-232,225,-261,225,-291,-260,-265,225,225,-297,-259,-289,-277,305,-294,-290,-263,225,225,31,325,-73,-76,-72,225,-74,225,225,-78,-193,-192,-77,-194,225,-75,-260,-288,-129,-132,-302,-126,225,-127,-148,225,361,-28,-303,-71,-303,-221,-224,-222,-218,-219,-223,-225,225,-227,-228,-220,-226,-300,225,-257,-298,-274,-273,225,225,225,-251,-256,225,-254,31,-255,225,225,225,225,225,225,225,225,225,225,225,225,225,225,225,225,225,225,225,-303,-260,424,-212,-211,225,-209,225,225,225,-195,225,-208,-196,225,225,-134,-133,31,225,-150,-149,-260,225,225,-12,225,225,-11,-272,-271,-270,-269,-268,-281,225,31,305,305,305,305,305,305,305,305,305,305,-234,-233,305,305,305,305,305,-235,462,-28,-303,-260,-207,-210,225,-199,225,-197,-303,-176,-258,-266,-267,225,225,-252,-303,225,-260,225,225,225,225,-198,225,225,225,225,-11,225,-203,-202,-200,-282,225,-303,-275,225,225,-276,225,-204,-201,225,-206,-205,]),'LPAREN':([0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,19,20,21,23,24,25,26,29,30,31,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,59,61,63,65,68,69,70,71,73,74,76,77,78,79,80,81,83,84,85,88,91,92,96,101,102,103,104,105,106,107,108,113,115,120,121,125,130,136,137,139,140,141,144,145,146,147,148,150,151,152,153,154,155,156,157,158,159,162,163,164,166,167,168,169,170,171,172,173,174,175,177,178,181,182,183,184,185,186,189,190,193,195,196,199,200,203,204,205,206,210,211,212,213,214,215,216,217,219,221,222,225,226,227,230,231,233,234,235,
240,245,246,247,249,250,251,254,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,275,276,278,280,281,282,284,287,289,292,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,315,320,321,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,350,351,353,355,356,357,361,362,363,364,365,366,370,372,374,377,378,379,380,383,388,389,391,392,412,413,414,415,416,417,418,421,422,424,426,427,429,431,433,435,436,440,441,447,450,452,453,454,456,459,460,462,463,464,465,466,467,468,469,470,472,473,474,475,476,477,478,480,481,488,489,493,494,495,497,499,501,502,503,504,505,506,509,512,514,515,516,518,519,520,],[4,-303,-102,-115,4,-113,-99,-97,60,-95,-114,-96,-64,4,-60,-100,-91,-66,-94,-109,-104,-65,-93,-110,4,-215,-107,-303,-111,81,-63,-116,4,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,-303,60,81,4,-61,-131,-301,-130,-147,-146,-28,-158,-160,-27,-88,81,-90,81,142,-37,-87,-89,-92,-81,-84,-86,-69,-30,170,-28,-303,186,-288,-70,170,-128,81,-161,-159,81,142,-303,170,170,81,-292,-264,-280,-295,-299,-296,-293,-278,273,-262,-279,281,283,284,-261,287,-291,-260,-265,170,287,-297,-259,-289,-277,-294,-290,-263,170,170,-36,-35,186,186,323,-45,170,327,-73,-76,-72,170,-74,335,339,284,170,344,-78,-193,-192,-77,-194,170,-75,-260,-288,349,-129,-132,-302,-126,284,-127,-148,284,-38,170,-28,-303,-71,-303,-221,-224,-222,-218,-219,-223,-225,170,-227,-228,-220,-226,-300,170,-298,-274,-273,170,170,-31,170,170,-34,392,284,284,284,284,284,284,284,284,284,284,284,284,284,284,284,284,170,284,284,186,323,327,-303,-260,170,-212,-211,170,-209,170,170,170,-195,170,-208,-196,170,170,-134,-133,81,284,-150,-149,-260,170,170,-44,-43,-12,284,170,-11,-272,-271,-270,-269,-268,-281,284,392,392,-33,-32,-191,-185,170,-28,-303,-187,-189,-260,-207,-210,170,-199,170,475,-197,-39,-42,-303,-176,-266,-267,170,284,-303,284,-260,170,170,-51,-50,-186,-188,-190,170,170,-198,170,170,170,-41,-40,170,-11,-46,-49,170,-203,-202,-200,-282,170,-303,-275,-48,-47,170,1
70,-276,170,-204,-201,170,-206,-205,]),'MINUSMINUS':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,115,121,136,141,144,145,147,148,150,151,152,153,154,155,156,157,158,159,163,164,166,167,168,169,170,171,172,173,174,175,177,178,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,226,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,272,273,275,276,278,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,377,378,379,380,383,388,389,416,417,418,424,426,427,429,431,433,436,447,450,452,453,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,499,501,502,503,506,509,512,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,171,-28,-303,-288,171,-161,-303,171,171,-292,-264,-280,-295,-299,-296,-293,-278,171,-262,-279,276,171,-261,171,-291,-260,-265,171,171,-297,-259,-289,-277,-294,-290,-263,171,171,171,-73,-76,-72,171,-74,171,171,-78,-193,-192,-77,-194,171,-75,-260,-288,-302,171,171,171,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,171,-227,-228,-220,-226,-300,171,-298,-274,-273,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,171,-303,-260,171,-212,-211,171,-209,171,171,171,-195,171,-208,-196,171,171,171,-260,171,171,-12,171,171,-11,-272,-271,-270,-269,-268,-281,171,171,-28,-303,-260,-207,-210,171,-199,171,-197,-303,-176,-266,-267,171,171,-303,171,-260,171,171,171,171,-198,171,171,171,171,-11,171,-203,-202,-200,-282,171,-303,-275,171,171,-276,171,-204,-201,171,-206,-205,]),'ID':([0,1,2,3,4,5,6,7,9,10,11,12,13,14,15,16,17,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,38,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,59,60,63,65,68,69,70,71,73,74,75,76,77,78,79,80,81,83,84,91,92,96,101,102,103,104,106,107,108,113,120,121,125,130,131,132,136,137,139,141,142,144,145,146,148,156,157,163,164,166,168,1
70,171,173,181,182,183,186,189,195,197,199,200,201,203,204,205,211,212,214,215,216,217,219,221,222,225,230,231,233,234,235,240,244,245,246,249,250,251,254,256,260,261,262,263,264,265,266,267,268,269,270,271,273,277,279,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,315,322,323,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,350,351,353,355,356,357,361,362,363,366,369,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,455,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,500,501,502,506,509,514,515,516,518,519,520,],[37,-303,-102,-115,37,-113,-99,-97,-95,-114,-96,-64,37,-60,-100,-91,-66,-94,-109,-104,-136,-65,-93,-110,37,71,74,-215,-107,-303,-111,37,-63,-116,37,-135,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,-303,115,37,37,-61,-131,-301,-130,-147,-146,135,-28,-158,-160,-27,-88,37,-90,37,-87,-89,-92,-81,-84,-86,-69,115,-28,-303,37,-70,226,-128,37,135,135,-161,-159,37,-303,115,115,115,37,-264,115,-262,115,-261,115,-260,115,115,-259,-263,115,115,37,37,115,115,-73,-76,331,-72,115,-74,115,226,-78,-193,-192,-77,-194,226,-75,-260,-129,-132,-302,-126,115,-127,135,-148,115,115,-28,-303,-71,-303,-221,-224,-222,-218,-219,-223,-225,115,-227,-228,-220,-226,115,378,380,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,115,37,-303,115,-260,115,-212,-211,115,-209,115,226,115,-195,115,-208,-196,226,115,-134,-133,37,115,-150,-149,-260,115,115,-12,115,115,115,-11,115,115,-28,-303,-260,-207,-210,115,-199,226,-197,-303,-176,115,115,115,-303,115,-260,115,115,115,226,-198,226,115,226,115,-11,115,-203,-202,-200,115,115,-303,115,226,226,-204,-201,226,-206,-205,]),'IF':([53,70,101,104,121,199,200,203,205,212,214,215,216,217,219,221,222,233,332,333,336,338,342,345,347,348,426,427,431,433,436,472,473,474,476,494,495,497,509,514,515,516,518,519,520,],[-68,-301,-81,-69,227,-73,-76,-72,-74,227,-78,-193,-192,-77,-194,227,-75,-302,-212,-21
1,-209,227,-195,-208,-196,227,-207,-210,-199,227,-197,227,-198,227,227,-203,-202,-200,227,227,-204,-201,227,-206,-205,]),'STRING_LITERAL':([3,35,51,53,59,70,76,78,79,101,104,106,107,108,121,136,141,144,145,148,156,157,158,163,164,166,168,170,171,172,173,181,182,183,195,199,200,203,204,205,211,212,214,215,216,217,219,221,222,225,233,235,246,249,250,251,256,260,261,262,263,264,265,266,267,268,269,270,271,273,275,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,322,325,326,332,333,335,336,337,338,339,342,344,345,347,348,349,355,361,362,363,366,370,372,374,389,416,417,418,424,426,427,429,431,433,436,447,450,454,456,459,460,462,463,464,470,472,473,474,475,476,480,481,493,494,495,497,501,502,506,509,514,515,516,518,519,520,],[-115,-116,-117,-68,-303,-301,-28,-160,-27,-81,-69,172,-28,-303,172,-161,-303,172,172,-264,172,-262,275,172,-261,172,-260,172,172,-297,-259,-263,172,172,172,-73,-76,-72,172,-74,172,172,-78,-193,-192,-77,-194,172,-75,-260,-302,172,172,172,-28,-303,-303,-221,-224,-222,-218,-219,-223,-225,172,-227,-228,-220,-226,172,-298,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,172,-303,-260,172,-212,-211,172,-209,172,172,172,-195,172,-208,-196,172,172,172,-260,172,172,-12,172,172,-11,172,172,-28,-303,-260,-207,-210,172,-199,172,-197,-303,-176,172,172,-303,172,-260,172,172,172,172,-198,172,172,172,172,-11,172,-203,-202,-200,172,-303,172,172,172,-204,-201,172,-206,-205,]),'FLOAT':([0,1,2,3,5,6,7,8,9,10,11,12,14,15,16,17,18,19,20,21,23,24,25,26,29,30,32,33,34,35,36,37,39,40,41,42,44,45,47,48,49,50,51,53,54,55,56,57,60,61,63,64,65,67,68,69,70,71,72,73,74,78,80,83,87,91,92,96,101,102,103,104,105,113,117,120,121,122,123,124,125,126,127,128,129,130,136,142,170,184,185,186,198,199,200,203,205,212,214,215,216,217,219,222,228,229,230,231,232,233,234,240,245,254,273,282,283,284,287,289,292,323,327,332,333,335,336,342,345,347,350,351,354,356,357,392,412,413,426,427,431,436,473,494,495,497,515,
516,519,520,],[39,-303,-102,-115,-113,-99,-97,-52,-95,-114,-96,-64,-60,-100,-91,-66,39,-94,-109,-104,-65,-93,-110,39,-215,-107,-111,39,-63,-116,39,-29,-105,-62,-101,-67,-112,-106,-303,-108,-303,-103,-117,-68,-98,-85,-10,-9,39,-53,39,-82,39,39,-61,-131,-301,-130,39,-147,-146,-160,-88,-90,39,-87,-89,-92,-81,-84,-86,-69,-30,39,39,-70,39,-83,39,39,-128,-140,-137,39,39,39,-161,39,39,-36,-35,39,39,-73,-76,-72,-74,39,-78,-193,-192,-77,-194,-75,39,39,-129,-132,-138,-302,-126,-127,-148,-71,39,-31,39,39,39,-34,39,39,39,-212,-211,39,-209,-195,-208,-196,-134,-133,-139,-150,-149,39,-33,-32,-207,-210,-199,-197,-198,-203,-202,-200,-204,-201,-206,-205,]),'XOREQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,264,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'LSHIFTEQUAL':([115,147,149,150,151,152,153,154,155,158,159,167,169,172,174,175,177,178,226,233,272,274,275,276,278,285,286,288,293,377,378,379,380,383,388,451,452,453,458,499,503,512,],[-288,-292,266,-280,-295,-299,-296,-293,-278,-279,-253,-291,-265,-297,-289,-277,-294,-290,-288,-302,-300,-257,-298,-274,-273,-251,-256,-254,-255,-272,-271,-270,-269,-268,-281,-258,-266,-267,-252,-282,-275,-276,]),'RBRACKET':([3,35,51,59,78,79,106,107,115,136,141,147,149,150,151,152,153,154,155,158,159,160,161,165,167,168,169,172,174,175,176,177,178,179,180,195,224,233,249,250,272,274,275,276,278,285,286,288,293,313,314,322,324,325,326,341,360,361,375,377,378,379,380,381,383,388,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,410,411,416,417,423,424,430,442,443,446,451,452,453,458,461,462,487,490,491,499,503,511,512,],[-115,-116,-117,-303,-160,-27,-303,-28,-288,-161,-303,-292,-251,-280,-295,-299,-296,-293,-278,-279,-253,282,-232,-4,-291,289,-265,
-297,-289,-277,-230,-294,-290,-3,-216,-303,-213,-302,-303,-28,-300,-257,-298,-274,-273,-251,-256,-254,-255,412,413,-303,421,422,-303,-229,440,441,-217,-272,-271,-270,-269,452,-268,-281,-238,-250,-239,-237,-241,-245,-240,-236,-243,-248,-234,-233,-242,-249,-244,-246,-247,-235,-303,-28,467,468,-214,477,478,479,-258,-266,-267,-252,488,489,-231,504,505,-282,-275,517,-276,]),}

_lr_action = {}
for _k, _v in _lr_action_items.items():
   for _x,_y in zip(_v[0],_v[1]):
      if not _x in _lr_action:  _lr_action[_x] = {}
      _lr_action[_x][_k] = _y
del _lr_action_items

_lr_goto_items = {'expression_statement':([121,212,221,338,348,433,472,474,476,509,514,518,],[199,199,199,199,199,199,199,199,199,199,199,199,]),'struct_or_union_specifier':([0,18,26,36,60,65,67,72,87,117,121,123,124,128,129,142,170,186,198,212,228,229,273,283,284,287,323,327,335,392,],[5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,]),'init_declarator_list':([33,63,],[82,82,]),'init_declarator_list_opt':([33,63,],[90,90,]),'iteration_statement':([121,212,221,338,348,433,472,474,476,509,514,518,],[200,200,200,200,200,200,200,200,200,200,200,200,]),'unified_string_literal':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,]),'assignment_expression_opt':([106,195,249,326,416,],[160,324,360,423,461,]),'brace_open':([27,28,66,69,71,73,74,121,143,144,145,212,221,338,348,372,389,433,451,456,457,472,474,476,480,509,514,518,],[72,75,121,123,124,131,132,
121,121,256,256,121,121,121,121,256,459,121,459,459,459,121,121,121,256,121,121,121,]),'enumerator':([75,131,132,244,],[133,133,133,358,]),'typeid_noparen_declarator':([113,],[194,]),'type_qualifier_list_opt':([31,59,108,141,251,322,418,],[77,106,183,249,363,416,464,]),'declaration_specifiers_no_type_opt':([1,47,49,],[55,102,103,]),'expression_opt':([121,212,221,335,338,348,429,433,470,472,474,476,493,506,509,514,518,],[202,202,202,428,202,202,471,202,492,202,202,202,507,513,202,202,202,]),'designation':([256,447,459,502,],[366,366,366,366,]),'parameter_list':([60,142,186,323,327,392,],[116,116,116,116,116,116,]),'labeled_statement':([121,212,221,338,348,433,472,474,476,509,514,518,],[203,203,203,203,203,203,203,203,203,203,203,203,]),'abstract_declarator':([113,186,292,392,],[188,319,188,319,]),'translation_unit':([0,],[26,]),'init_declarator':([33,63,139,146,],[94,94,248,259,]),'direct_abstract_declarator':([113,186,189,292,315,391,392,],[196,196,321,196,321,321,196,]),'designator_list':([256,447,459,502,],[373,373,373,373,]),'identifier':([60,106,121,142,144,145,156,163,166,170,171,182,183,195,197,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,323,326,335,337,338,339,344,348,349,355,362,363,369,370,372,389,416,429,433,454,455,456,460,463,464,470,472,474,475,476,480,493,500,501,506,509,514,518,],[118,175,175,118,175,175,175,175,175,175,175,175,175,175,328,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,175,118,175,175,175,175,175,175,175,175,175,175,175,445,175,175,175,175,175,175,175,485,175,175,175,175,175,175,175,175,175,175,175,510,175,175,175,175,175,]),'offsetof_member_designator':([455,],[484,]),'unary_expression':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,33
8,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[149,149,149,149,274,285,288,149,293,149,149,149,149,285,149,149,285,285,149,149,149,149,149,149,149,285,285,285,285,285,285,285,285,285,285,285,285,285,285,285,285,149,285,285,149,149,149,149,149,149,149,149,285,149,149,285,149,285,149,149,149,149,285,285,149,149,149,149,149,149,149,149,149,149,149,149,149,149,]),'abstract_declarator_opt':([113,292,],[187,390,]),'initializer':([144,145,372,480,],[255,258,448,498,]),'direct_id_declarator':([0,4,13,26,33,36,63,65,81,84,113,130,139,146,186,189,315,353,],[8,8,61,8,8,8,8,8,8,61,8,8,8,8,8,61,61,8,]),'struct_declaration_list':([72,123,124,],[129,228,229,]),'pp_directive':([0,26,],[12,12,]),'declaration_list':([18,87,],[67,67,]),'id_init_declarator':([36,65,],[95,95,]),'type_specifier':([0,18,26,36,60,65,67,72,87,117,121,123,124,128,129,142,170,186,198,212,228,229,273,283,284,287,323,327,335,392,],[16,16,16,96,16,96,16,125,16,96,16,125,125,230,125,16,125,16,16,16,125,125,125,125,125,125,16,16,16,16,]),'compound_statement':([66,121,143,212,221,338,348,433,472,474,476,509,514,518,],[120,205,254,205,205,205,205,205,205,205,205,205,205,205,]),'pointer':([0,4,26,33,36,63,65,77,81,113,130,139,146,186,292,353,392,],[13,13,13,84,13,84,13,137,84,189,84,84,84,315,391,84,391,]),'typeid_declarator':([33,63,81,130,139,146,353,],[86,86,138,86,86,86,86,]),'id_init_declarator_list':([36,65,],[98,98,]),'declarator':([33,63,130,139,146,353,],[89,89,241,89,89,241,]),'argument_expression_list':([281,],[384,]),'struct_declarator_list_opt':([130,],[238,]),'typedef_name':([0,18,26,36,60,65,67,72,87,117,121,123,124,128,129,142,170,186,198,212,228,229,273,283,284,287,323,327,335,392,],[32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32,]),'parameter_type_list_opt':([186,327,392,],[318,425,318,]),'struct_declarator':([130,353,],[239,438,]),'type_qualifier':([0,1,18,26,31,33,47,49,59,60,63
,67,72,76,87,107,108,113,121,123,124,128,129,130,141,142,170,186,198,212,228,229,250,251,273,283,284,287,292,322,323,327,335,392,417,418,],[47,47,47,47,78,91,47,47,78,47,91,47,78,136,47,136,78,91,47,78,78,136,78,240,78,47,78,47,47,47,78,78,136,78,78,78,78,78,240,78,47,47,47,47,136,78,]),'assignment_operator':([149,],[267,]),'expression':([121,170,204,212,221,273,280,284,287,310,335,338,339,344,348,349,429,433,470,472,474,475,476,493,501,506,509,514,518,],[208,290,334,208,208,290,381,290,290,409,208,208,432,434,208,437,208,208,208,208,208,496,208,208,511,208,208,208,208,]),'storage_class_specifier':([0,1,18,26,33,47,49,60,63,67,87,113,121,142,186,198,212,323,327,335,392,],[1,1,1,1,80,1,1,1,80,1,1,80,1,1,1,1,1,1,1,1,1,]),'unified_wstring_literal':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,]),'translation_unit_or_empty':([0,],[43,]),'initializer_list_opt':([256,],[367,]),'brace_close':([129,134,223,228,229,242,243,367,447,486,502,],[231,245,347,350,351,356,357,444,482,503,512,]),'direct_typeid_declarator':([33,63,81,84,130,139,146,353,],[85,85,85,140,85,85,85,85,]),'external_declaration':([0,26,],[14,68,]),'type_name':([170,273,283,284,287,],[291,376,385,386,387,]),'block_item_list':([121,],[212,]),'pppragma_directive':([0,26,121,212,221,338,348,433,472,474,476,509,514,518,],[23,23,214,214,214,214,214,214,214,214,214,214,214,214,]),'statement':([121,212,221,338,348,433,472,474,476,509,514,518,]
,[215,215,346,431,436,473,494,495,497,516,519,520,]),'cast_expression':([106,121,144,145,163,170,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[161,161,161,161,286,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,458,161,161,161,161,458,161,161,161,161,161,161,161,161,161,161,161,161,161,161,161,]),'struct_declarator_list':([130,],[236,]),'empty':([0,1,18,31,33,36,47,49,59,60,63,65,87,106,108,113,121,130,141,142,186,195,212,221,249,251,256,292,322,323,326,327,335,338,348,392,416,418,429,433,447,459,470,472,474,476,493,502,506,509,514,518,],[52,57,62,79,93,100,57,57,79,110,93,100,62,179,79,192,220,237,79,110,316,179,343,343,179,79,374,192,79,110,179,316,343,343,343,316,179,79,343,343,481,481,343,343,343,343,343,481,343,343,343,343,]),'parameter_declaration':([60,142,186,198,323,327,392,],[112,112,112,330,112,112,112,]),'primary_expression':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,169,]),'declaration':([0,18,26,67,87,121,212,335,],[34,64,34,122,64,216,216,429,]),'declaration_specifiers_no_type':([0,1,18,26,47,49,60,67,87,12
1,142,186,198,212,323,327,335,392,],[36,56,65,36,56,56,117,65,65,65,117,117,117,65,117,117,65,117,]),'jump_statement':([121,212,221,338,348,433,472,474,476,509,514,518,],[217,217,217,217,217,217,217,217,217,217,217,217,]),'enumerator_list':([75,131,132,],[134,242,243,]),'block_item':([121,212,],[219,342,]),'constant_expression':([211,235,246,355,370,],[340,352,359,439,446,]),'identifier_list_opt':([60,142,323,],[109,252,419,]),'constant':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,]),'type_specifier_no_typeid':([0,18,26,33,36,60,63,65,67,72,87,113,117,121,123,124,128,129,130,142,170,186,198,212,228,229,273,283,284,287,292,323,327,335,392,],[10,10,10,83,10,10,83,10,10,10,10,83,10,10,10,10,10,10,234,10,10,10,10,10,10,10,10,10,10,10,234,10,10,10,10,]),'struct_declaration':([72,123,124,129,228,229,],[127,127,127,232,232,232,]),'direct_typeid_noparen_declarator':([113,189,],[190,320,]),'id_declarator':([0,4,26,33,36,63,65,81,113,130,139,146,186,353,],[18,58,18,87,97,119,97,58,191,119,119,119,58,119,]),'selection_statement':([121,212,221,338,348,433,472,474,476,509,514,518,],[222,222,222,222,222,222,222,222,222,222,222,222,]),'postfix_expression':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,46
0,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,159,]),'initializer_list':([256,459,],[371,486,]),'unary_operator':([106,121,144,145,156,163,166,170,171,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,389,416,429,433,454,456,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,163,]),'struct_or_union':([0,18,26,36,60,65,67,72,87,117,121,123,124,128,129,142,170,186,198,212,228,229,273,283,284,287,323,327,335,392,],[27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,27,]),'block_item_list_opt':([121,],[223,]),'assignment_expression':([106,121,144,145,170,182,183,195,204,212,221,249,267,273,280,281,284,287,310,326,335,337,338,339,344,348,349,362,363,372,416,429,433,454,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[165,224,257,257,224,313,314,165,224,224,224,165,375,224,224,382,224,224,224,165,224,430,224,224,224,224,224,442,443,257,165,224,224,483,490,491,224,224,224,224,224,257,224,224,224,224,224,224,]),'designation_opt':([256,447,459,502,],[372,480,372,480,]),'parameter_type_list':([60,142,186,323,327,392,],[111,253,317,420,317,317,]),'type_qualifier_list':([31,59,72,108,123,124,129,141,170,228,229,251,273,283,284,287,322,418,],[76,107,128,76,128,128
,128,250,128,128,128,76,128,128,128,128,417,76,]),'designator':([256,373,447,459,502,],[368,449,368,368,368,]),'id_init_declarator_list_opt':([36,65,],[99,99,]),'declaration_specifiers':([0,18,26,60,67,87,121,142,186,198,212,323,327,335,392,],[33,63,33,113,63,63,63,113,113,113,63,113,113,63,113,]),'identifier_list':([60,142,323,],[114,114,114,]),'declaration_list_opt':([18,87,],[66,143,]),'function_definition':([0,26,],[40,40,]),'binary_expression':([106,121,144,145,170,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,326,335,337,338,339,344,348,349,355,362,363,370,372,416,429,433,454,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,176,410,411,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,176,]),'enum_specifier':([0,18,26,36,60,65,67,72,87,117,121,123,124,128,129,142,170,186,198,212,228,229,273,283,284,287,323,327,335,392,],[44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,]),'decl_body':([0,18,26,67,87,121,212,335,],[46,46,46,46,46,46,46,46,]),'function_specifier':([0,1,18,26,33,47,49,60,63,67,87,113,121,142,186,198,212,323,327,335,392,],[49,49,49,49,92,49,49,49,92,49,49,92,49,49,49,49,49,49,49,49,49,]),'specifier_qualifier_list':([72,123,124,129,170,228,229,273,283,284,287,],[130,130,130,130,292,130,130,292,292,292,292,]),'conditional_expression':([106,121,144,145,170,182,183,195,204,211,212,221,235,246,249,267,273,280,281,284,287,310,326,335,337,338,339,344,348,349,355,362,363,370,372,416,429,433,454,460,463,464,470,472,474,475,476,480,493,501,506,509,514,518,],[180,180,180,180,180,180,180,180,180,341,180,180,341,341,180,180,180,180,180,180,180,180,180,180,180,180,180,180,180,180,341,180,180,341,1
80,180,180,180,180,487,180,180,180,180,180,180,180,180,180,180,180,180,180,180,]),} _lr_goto = {} for _k, _v in _lr_goto_items.items(): for _x, _y in zip(_v[0], _v[1]): if not _x in _lr_goto: _lr_goto[_x] = {} _lr_goto[_x][_k] = _y del _lr_goto_items _lr_productions = [ ("S' -> translation_unit_or_empty","S'",1,None,None,None), ('abstract_declarator_opt -> empty','abstract_declarator_opt',1,'p_abstract_declarator_opt','plyparser.py',42), ('abstract_declarator_opt -> abstract_declarator','abstract_declarator_opt',1,'p_abstract_declarator_opt','plyparser.py',43), ('assignment_expression_opt -> empty','assignment_expression_opt',1,'p_assignment_expression_opt','plyparser.py',42), ('assignment_expression_opt -> assignment_expression','assignment_expression_opt',1,'p_assignment_expression_opt','plyparser.py',43), ('block_item_list_opt -> empty','block_item_list_opt',1,'p_block_item_list_opt','plyparser.py',42), ('block_item_list_opt -> block_item_list','block_item_list_opt',1,'p_block_item_list_opt','plyparser.py',43), ('declaration_list_opt -> empty','declaration_list_opt',1,'p_declaration_list_opt','plyparser.py',42), ('declaration_list_opt -> declaration_list','declaration_list_opt',1,'p_declaration_list_opt','plyparser.py',43), ('declaration_specifiers_no_type_opt -> empty','declaration_specifiers_no_type_opt',1,'p_declaration_specifiers_no_type_opt','plyparser.py',42), ('declaration_specifiers_no_type_opt -> declaration_specifiers_no_type','declaration_specifiers_no_type_opt',1,'p_declaration_specifiers_no_type_opt','plyparser.py',43), ('designation_opt -> empty','designation_opt',1,'p_designation_opt','plyparser.py',42), ('designation_opt -> designation','designation_opt',1,'p_designation_opt','plyparser.py',43), ('expression_opt -> empty','expression_opt',1,'p_expression_opt','plyparser.py',42), ('expression_opt -> expression','expression_opt',1,'p_expression_opt','plyparser.py',43), ('id_init_declarator_list_opt -> 
empty','id_init_declarator_list_opt',1,'p_id_init_declarator_list_opt','plyparser.py',42), ('id_init_declarator_list_opt -> id_init_declarator_list','id_init_declarator_list_opt',1,'p_id_init_declarator_list_opt','plyparser.py',43), ('identifier_list_opt -> empty','identifier_list_opt',1,'p_identifier_list_opt','plyparser.py',42), ('identifier_list_opt -> identifier_list','identifier_list_opt',1,'p_identifier_list_opt','plyparser.py',43), ('init_declarator_list_opt -> empty','init_declarator_list_opt',1,'p_init_declarator_list_opt','plyparser.py',42), ('init_declarator_list_opt -> init_declarator_list','init_declarator_list_opt',1,'p_init_declarator_list_opt','plyparser.py',43), ('initializer_list_opt -> empty','initializer_list_opt',1,'p_initializer_list_opt','plyparser.py',42), ('initializer_list_opt -> initializer_list','initializer_list_opt',1,'p_initializer_list_opt','plyparser.py',43), ('parameter_type_list_opt -> empty','parameter_type_list_opt',1,'p_parameter_type_list_opt','plyparser.py',42), ('parameter_type_list_opt -> parameter_type_list','parameter_type_list_opt',1,'p_parameter_type_list_opt','plyparser.py',43), ('struct_declarator_list_opt -> empty','struct_declarator_list_opt',1,'p_struct_declarator_list_opt','plyparser.py',42), ('struct_declarator_list_opt -> struct_declarator_list','struct_declarator_list_opt',1,'p_struct_declarator_list_opt','plyparser.py',43), ('type_qualifier_list_opt -> empty','type_qualifier_list_opt',1,'p_type_qualifier_list_opt','plyparser.py',42), ('type_qualifier_list_opt -> type_qualifier_list','type_qualifier_list_opt',1,'p_type_qualifier_list_opt','plyparser.py',43), ('direct_id_declarator -> ID','direct_id_declarator',1,'p_direct_id_declarator_1','plyparser.py',109), ('direct_id_declarator -> LPAREN id_declarator RPAREN','direct_id_declarator',3,'p_direct_id_declarator_2','plyparser.py',109), ('direct_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt 
RBRACKET','direct_id_declarator',5,'p_direct_id_declarator_3','plyparser.py',109), ('direct_id_declarator -> direct_id_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET','direct_id_declarator',6,'p_direct_id_declarator_4','plyparser.py',109), ('direct_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET','direct_id_declarator',6,'p_direct_id_declarator_4','plyparser.py',110), ('direct_id_declarator -> direct_id_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET','direct_id_declarator',5,'p_direct_id_declarator_5','plyparser.py',109), ('direct_id_declarator -> direct_id_declarator LPAREN parameter_type_list RPAREN','direct_id_declarator',4,'p_direct_id_declarator_6','plyparser.py',109), ('direct_id_declarator -> direct_id_declarator LPAREN identifier_list_opt RPAREN','direct_id_declarator',4,'p_direct_id_declarator_6','plyparser.py',110), ('direct_typeid_declarator -> TYPEID','direct_typeid_declarator',1,'p_direct_typeid_declarator_1','plyparser.py',109), ('direct_typeid_declarator -> LPAREN typeid_declarator RPAREN','direct_typeid_declarator',3,'p_direct_typeid_declarator_2','plyparser.py',109), ('direct_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKET','direct_typeid_declarator',5,'p_direct_typeid_declarator_3','plyparser.py',109), ('direct_typeid_declarator -> direct_typeid_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET','direct_typeid_declarator',6,'p_direct_typeid_declarator_4','plyparser.py',109), ('direct_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET','direct_typeid_declarator',6,'p_direct_typeid_declarator_4','plyparser.py',110), ('direct_typeid_declarator -> direct_typeid_declarator LBRACKET type_qualifier_list_opt TIMES 
RBRACKET','direct_typeid_declarator',5,'p_direct_typeid_declarator_5','plyparser.py',109), ('direct_typeid_declarator -> direct_typeid_declarator LPAREN parameter_type_list RPAREN','direct_typeid_declarator',4,'p_direct_typeid_declarator_6','plyparser.py',109), ('direct_typeid_declarator -> direct_typeid_declarator LPAREN identifier_list_opt RPAREN','direct_typeid_declarator',4,'p_direct_typeid_declarator_6','plyparser.py',110), ('direct_typeid_noparen_declarator -> TYPEID','direct_typeid_noparen_declarator',1,'p_direct_typeid_noparen_declarator_1','plyparser.py',109), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt assignment_expression_opt RBRACKET','direct_typeid_noparen_declarator',5,'p_direct_typeid_noparen_declarator_3','plyparser.py',109), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET STATIC type_qualifier_list_opt assignment_expression RBRACKET','direct_typeid_noparen_declarator',6,'p_direct_typeid_noparen_declarator_4','plyparser.py',109), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET type_qualifier_list STATIC assignment_expression RBRACKET','direct_typeid_noparen_declarator',6,'p_direct_typeid_noparen_declarator_4','plyparser.py',110), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LBRACKET type_qualifier_list_opt TIMES RBRACKET','direct_typeid_noparen_declarator',5,'p_direct_typeid_noparen_declarator_5','plyparser.py',109), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LPAREN parameter_type_list RPAREN','direct_typeid_noparen_declarator',4,'p_direct_typeid_noparen_declarator_6','plyparser.py',109), ('direct_typeid_noparen_declarator -> direct_typeid_noparen_declarator LPAREN identifier_list_opt RPAREN','direct_typeid_noparen_declarator',4,'p_direct_typeid_noparen_declarator_6','plyparser.py',110), ('id_declarator -> 
direct_id_declarator','id_declarator',1,'p_id_declarator_1','plyparser.py',109), ('id_declarator -> pointer direct_id_declarator','id_declarator',2,'p_id_declarator_2','plyparser.py',109), ('typeid_declarator -> direct_typeid_declarator','typeid_declarator',1,'p_typeid_declarator_1','plyparser.py',109), ('typeid_declarator -> pointer direct_typeid_declarator','typeid_declarator',2,'p_typeid_declarator_2','plyparser.py',109), ('typeid_noparen_declarator -> direct_typeid_noparen_declarator','typeid_noparen_declarator',1,'p_typeid_noparen_declarator_1','plyparser.py',109), ('typeid_noparen_declarator -> pointer direct_typeid_noparen_declarator','typeid_noparen_declarator',2,'p_typeid_noparen_declarator_2','plyparser.py',109), ('translation_unit_or_empty -> translation_unit','translation_unit_or_empty',1,'p_translation_unit_or_empty','c_parser.py',514), ('translation_unit_or_empty -> empty','translation_unit_or_empty',1,'p_translation_unit_or_empty','c_parser.py',515), ('translation_unit -> external_declaration','translation_unit',1,'p_translation_unit_1','c_parser.py',523), ('translation_unit -> translation_unit external_declaration','translation_unit',2,'p_translation_unit_2','c_parser.py',530), ('external_declaration -> function_definition','external_declaration',1,'p_external_declaration_1','c_parser.py',542), ('external_declaration -> declaration','external_declaration',1,'p_external_declaration_2','c_parser.py',547), ('external_declaration -> pp_directive','external_declaration',1,'p_external_declaration_3','c_parser.py',552), ('external_declaration -> pppragma_directive','external_declaration',1,'p_external_declaration_3','c_parser.py',553), ('external_declaration -> SEMI','external_declaration',1,'p_external_declaration_4','c_parser.py',558), ('pp_directive -> PPHASH','pp_directive',1,'p_pp_directive','c_parser.py',563), ('pppragma_directive -> PPPRAGMA','pppragma_directive',1,'p_pppragma_directive','c_parser.py',569), ('pppragma_directive -> PPPRAGMA 
PPPRAGMASTR','pppragma_directive',2,'p_pppragma_directive','c_parser.py',570), ('function_definition -> id_declarator declaration_list_opt compound_statement','function_definition',3,'p_function_definition_1','c_parser.py',581), ('function_definition -> declaration_specifiers id_declarator declaration_list_opt compound_statement','function_definition',4,'p_function_definition_2','c_parser.py',598), ('statement -> labeled_statement','statement',1,'p_statement','c_parser.py',609), ('statement -> expression_statement','statement',1,'p_statement','c_parser.py',610), ('statement -> compound_statement','statement',1,'p_statement','c_parser.py',611), ('statement -> selection_statement','statement',1,'p_statement','c_parser.py',612), ('statement -> iteration_statement','statement',1,'p_statement','c_parser.py',613), ('statement -> jump_statement','statement',1,'p_statement','c_parser.py',614), ('statement -> pppragma_directive','statement',1,'p_statement','c_parser.py',615), ('decl_body -> declaration_specifiers init_declarator_list_opt','decl_body',2,'p_decl_body','c_parser.py',629), ('decl_body -> declaration_specifiers_no_type id_init_declarator_list_opt','decl_body',2,'p_decl_body','c_parser.py',630), ('declaration -> decl_body SEMI','declaration',2,'p_declaration','c_parser.py',689), ('declaration_list -> declaration','declaration_list',1,'p_declaration_list','c_parser.py',698), ('declaration_list -> declaration_list declaration','declaration_list',2,'p_declaration_list','c_parser.py',699), ('declaration_specifiers_no_type -> type_qualifier declaration_specifiers_no_type_opt','declaration_specifiers_no_type',2,'p_declaration_specifiers_no_type_1','c_parser.py',709), ('declaration_specifiers_no_type -> storage_class_specifier declaration_specifiers_no_type_opt','declaration_specifiers_no_type',2,'p_declaration_specifiers_no_type_2','c_parser.py',714), ('declaration_specifiers_no_type -> function_specifier 
declaration_specifiers_no_type_opt','declaration_specifiers_no_type',2,'p_declaration_specifiers_no_type_3','c_parser.py',719), ('declaration_specifiers -> declaration_specifiers type_qualifier','declaration_specifiers',2,'p_declaration_specifiers_1','c_parser.py',725), ('declaration_specifiers -> declaration_specifiers storage_class_specifier','declaration_specifiers',2,'p_declaration_specifiers_2','c_parser.py',730), ('declaration_specifiers -> declaration_specifiers function_specifier','declaration_specifiers',2,'p_declaration_specifiers_3','c_parser.py',735), ('declaration_specifiers -> declaration_specifiers type_specifier_no_typeid','declaration_specifiers',2,'p_declaration_specifiers_4','c_parser.py',740), ('declaration_specifiers -> type_specifier','declaration_specifiers',1,'p_declaration_specifiers_5','c_parser.py',745), ('declaration_specifiers -> declaration_specifiers_no_type type_specifier','declaration_specifiers',2,'p_declaration_specifiers_6','c_parser.py',750), ('storage_class_specifier -> AUTO','storage_class_specifier',1,'p_storage_class_specifier','c_parser.py',756), ('storage_class_specifier -> REGISTER','storage_class_specifier',1,'p_storage_class_specifier','c_parser.py',757), ('storage_class_specifier -> STATIC','storage_class_specifier',1,'p_storage_class_specifier','c_parser.py',758), ('storage_class_specifier -> EXTERN','storage_class_specifier',1,'p_storage_class_specifier','c_parser.py',759), ('storage_class_specifier -> TYPEDEF','storage_class_specifier',1,'p_storage_class_specifier','c_parser.py',760), ('function_specifier -> INLINE','function_specifier',1,'p_function_specifier','c_parser.py',765), ('type_specifier_no_typeid -> VOID','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',770), ('type_specifier_no_typeid -> _BOOL','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',771), ('type_specifier_no_typeid -> 
CHAR','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',772), ('type_specifier_no_typeid -> SHORT','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',773), ('type_specifier_no_typeid -> INT','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',774), ('type_specifier_no_typeid -> LONG','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',775), ('type_specifier_no_typeid -> FLOAT','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',776), ('type_specifier_no_typeid -> DOUBLE','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',777), ('type_specifier_no_typeid -> _COMPLEX','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',778), ('type_specifier_no_typeid -> SIGNED','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',779), ('type_specifier_no_typeid -> UNSIGNED','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',780), ('type_specifier_no_typeid -> __INT128','type_specifier_no_typeid',1,'p_type_specifier_no_typeid','c_parser.py',781), ('type_specifier -> typedef_name','type_specifier',1,'p_type_specifier','c_parser.py',786), ('type_specifier -> enum_specifier','type_specifier',1,'p_type_specifier','c_parser.py',787), ('type_specifier -> struct_or_union_specifier','type_specifier',1,'p_type_specifier','c_parser.py',788), ('type_specifier -> type_specifier_no_typeid','type_specifier',1,'p_type_specifier','c_parser.py',789), ('type_qualifier -> CONST','type_qualifier',1,'p_type_qualifier','c_parser.py',794), ('type_qualifier -> RESTRICT','type_qualifier',1,'p_type_qualifier','c_parser.py',795), ('type_qualifier -> VOLATILE','type_qualifier',1,'p_type_qualifier','c_parser.py',796), ('init_declarator_list -> init_declarator','init_declarator_list',1,'p_init_declarator_list','c_parser.py',801), ('init_declarator_list -> init_declarator_list COMMA 
init_declarator','init_declarator_list',3,'p_init_declarator_list','c_parser.py',802), ('init_declarator -> declarator','init_declarator',1,'p_init_declarator','c_parser.py',810), ('init_declarator -> declarator EQUALS initializer','init_declarator',3,'p_init_declarator','c_parser.py',811), ('id_init_declarator_list -> id_init_declarator','id_init_declarator_list',1,'p_id_init_declarator_list','c_parser.py',816), ('id_init_declarator_list -> id_init_declarator_list COMMA init_declarator','id_init_declarator_list',3,'p_id_init_declarator_list','c_parser.py',817), ('id_init_declarator -> id_declarator','id_init_declarator',1,'p_id_init_declarator','c_parser.py',822), ('id_init_declarator -> id_declarator EQUALS initializer','id_init_declarator',3,'p_id_init_declarator','c_parser.py',823), ('specifier_qualifier_list -> specifier_qualifier_list type_specifier_no_typeid','specifier_qualifier_list',2,'p_specifier_qualifier_list_1','c_parser.py',830), ('specifier_qualifier_list -> specifier_qualifier_list type_qualifier','specifier_qualifier_list',2,'p_specifier_qualifier_list_2','c_parser.py',835), ('specifier_qualifier_list -> type_specifier','specifier_qualifier_list',1,'p_specifier_qualifier_list_3','c_parser.py',840), ('specifier_qualifier_list -> type_qualifier_list type_specifier','specifier_qualifier_list',2,'p_specifier_qualifier_list_4','c_parser.py',845), ('struct_or_union_specifier -> struct_or_union ID','struct_or_union_specifier',2,'p_struct_or_union_specifier_1','c_parser.py',854), ('struct_or_union_specifier -> struct_or_union TYPEID','struct_or_union_specifier',2,'p_struct_or_union_specifier_1','c_parser.py',855), ('struct_or_union_specifier -> struct_or_union brace_open struct_declaration_list brace_close','struct_or_union_specifier',4,'p_struct_or_union_specifier_2','c_parser.py',864), ('struct_or_union_specifier -> struct_or_union ID brace_open struct_declaration_list 
brace_close','struct_or_union_specifier',5,'p_struct_or_union_specifier_3','c_parser.py',873), ('struct_or_union_specifier -> struct_or_union TYPEID brace_open struct_declaration_list brace_close','struct_or_union_specifier',5,'p_struct_or_union_specifier_3','c_parser.py',874), ('struct_or_union -> STRUCT','struct_or_union',1,'p_struct_or_union','c_parser.py',883), ('struct_or_union -> UNION','struct_or_union',1,'p_struct_or_union','c_parser.py',884), ('struct_declaration_list -> struct_declaration','struct_declaration_list',1,'p_struct_declaration_list','c_parser.py',891), ('struct_declaration_list -> struct_declaration_list struct_declaration','struct_declaration_list',2,'p_struct_declaration_list','c_parser.py',892), ('struct_declaration -> specifier_qualifier_list struct_declarator_list_opt SEMI','struct_declaration',3,'p_struct_declaration_1','c_parser.py',900), ('struct_declaration -> SEMI','struct_declaration',1,'p_struct_declaration_2','c_parser.py',938), ('struct_declarator_list -> struct_declarator','struct_declarator_list',1,'p_struct_declarator_list','c_parser.py',943), ('struct_declarator_list -> struct_declarator_list COMMA struct_declarator','struct_declarator_list',3,'p_struct_declarator_list','c_parser.py',944), ('struct_declarator -> declarator','struct_declarator',1,'p_struct_declarator_1','c_parser.py',952), ('struct_declarator -> declarator COLON constant_expression','struct_declarator',3,'p_struct_declarator_2','c_parser.py',957), ('struct_declarator -> COLON constant_expression','struct_declarator',2,'p_struct_declarator_2','c_parser.py',958), ('enum_specifier -> ENUM ID','enum_specifier',2,'p_enum_specifier_1','c_parser.py',966), ('enum_specifier -> ENUM TYPEID','enum_specifier',2,'p_enum_specifier_1','c_parser.py',967), ('enum_specifier -> ENUM brace_open enumerator_list brace_close','enum_specifier',4,'p_enum_specifier_2','c_parser.py',972), ('enum_specifier -> ENUM ID brace_open enumerator_list 
brace_close','enum_specifier',5,'p_enum_specifier_3','c_parser.py',977), ('enum_specifier -> ENUM TYPEID brace_open enumerator_list brace_close','enum_specifier',5,'p_enum_specifier_3','c_parser.py',978), ('enumerator_list -> enumerator','enumerator_list',1,'p_enumerator_list','c_parser.py',983), ('enumerator_list -> enumerator_list COMMA','enumerator_list',2,'p_enumerator_list','c_parser.py',984), ('enumerator_list -> enumerator_list COMMA enumerator','enumerator_list',3,'p_enumerator_list','c_parser.py',985), ('enumerator -> ID','enumerator',1,'p_enumerator','c_parser.py',996), ('enumerator -> ID EQUALS constant_expression','enumerator',3,'p_enumerator','c_parser.py',997), ('declarator -> id_declarator','declarator',1,'p_declarator','c_parser.py',1012), ('declarator -> typeid_declarator','declarator',1,'p_declarator','c_parser.py',1013), ('pointer -> TIMES type_qualifier_list_opt','pointer',2,'p_pointer','c_parser.py',1124), ('pointer -> TIMES type_qualifier_list_opt pointer','pointer',3,'p_pointer','c_parser.py',1125), ('type_qualifier_list -> type_qualifier','type_qualifier_list',1,'p_type_qualifier_list','c_parser.py',1154), ('type_qualifier_list -> type_qualifier_list type_qualifier','type_qualifier_list',2,'p_type_qualifier_list','c_parser.py',1155), ('parameter_type_list -> parameter_list','parameter_type_list',1,'p_parameter_type_list','c_parser.py',1160), ('parameter_type_list -> parameter_list COMMA ELLIPSIS','parameter_type_list',3,'p_parameter_type_list','c_parser.py',1161), ('parameter_list -> parameter_declaration','parameter_list',1,'p_parameter_list','c_parser.py',1169), ('parameter_list -> parameter_list COMMA parameter_declaration','parameter_list',3,'p_parameter_list','c_parser.py',1170), ('parameter_declaration -> declaration_specifiers id_declarator','parameter_declaration',2,'p_parameter_declaration_1','c_parser.py',1189), ('parameter_declaration -> declaration_specifiers 
typeid_noparen_declarator','parameter_declaration',2,'p_parameter_declaration_1','c_parser.py',1190), ('parameter_declaration -> declaration_specifiers abstract_declarator_opt','parameter_declaration',2,'p_parameter_declaration_2','c_parser.py',1201), ('identifier_list -> identifier','identifier_list',1,'p_identifier_list','c_parser.py',1232), ('identifier_list -> identifier_list COMMA identifier','identifier_list',3,'p_identifier_list','c_parser.py',1233), ('initializer -> assignment_expression','initializer',1,'p_initializer_1','c_parser.py',1242), ('initializer -> brace_open initializer_list_opt brace_close','initializer',3,'p_initializer_2','c_parser.py',1247), ('initializer -> brace_open initializer_list COMMA brace_close','initializer',4,'p_initializer_2','c_parser.py',1248), ('initializer_list -> designation_opt initializer','initializer_list',2,'p_initializer_list','c_parser.py',1256), ('initializer_list -> initializer_list COMMA designation_opt initializer','initializer_list',4,'p_initializer_list','c_parser.py',1257), ('designation -> designator_list EQUALS','designation',2,'p_designation','c_parser.py',1268), ('designator_list -> designator','designator_list',1,'p_designator_list','c_parser.py',1276), ('designator_list -> designator_list designator','designator_list',2,'p_designator_list','c_parser.py',1277), ('designator -> LBRACKET constant_expression RBRACKET','designator',3,'p_designator','c_parser.py',1282), ('designator -> PERIOD identifier','designator',2,'p_designator','c_parser.py',1283), ('type_name -> specifier_qualifier_list abstract_declarator_opt','type_name',2,'p_type_name','c_parser.py',1288), ('abstract_declarator -> pointer','abstract_declarator',1,'p_abstract_declarator_1','c_parser.py',1299), ('abstract_declarator -> pointer direct_abstract_declarator','abstract_declarator',2,'p_abstract_declarator_2','c_parser.py',1307), ('abstract_declarator -> 
direct_abstract_declarator','abstract_declarator',1,'p_abstract_declarator_3','c_parser.py',1312), ('direct_abstract_declarator -> LPAREN abstract_declarator RPAREN','direct_abstract_declarator',3,'p_direct_abstract_declarator_1','c_parser.py',1322), ('direct_abstract_declarator -> direct_abstract_declarator LBRACKET assignment_expression_opt RBRACKET','direct_abstract_declarator',4,'p_direct_abstract_declarator_2','c_parser.py',1326), ('direct_abstract_declarator -> LBRACKET assignment_expression_opt RBRACKET','direct_abstract_declarator',3,'p_direct_abstract_declarator_3','c_parser.py',1337), ('direct_abstract_declarator -> direct_abstract_declarator LBRACKET TIMES RBRACKET','direct_abstract_declarator',4,'p_direct_abstract_declarator_4','c_parser.py',1346), ('direct_abstract_declarator -> LBRACKET TIMES RBRACKET','direct_abstract_declarator',3,'p_direct_abstract_declarator_5','c_parser.py',1357), ('direct_abstract_declarator -> direct_abstract_declarator LPAREN parameter_type_list_opt RPAREN','direct_abstract_declarator',4,'p_direct_abstract_declarator_6','c_parser.py',1366), ('direct_abstract_declarator -> LPAREN parameter_type_list_opt RPAREN','direct_abstract_declarator',3,'p_direct_abstract_declarator_7','c_parser.py',1376), ('block_item -> declaration','block_item',1,'p_block_item','c_parser.py',1387), ('block_item -> statement','block_item',1,'p_block_item','c_parser.py',1388), ('block_item_list -> block_item','block_item_list',1,'p_block_item_list','c_parser.py',1395), ('block_item_list -> block_item_list block_item','block_item_list',2,'p_block_item_list','c_parser.py',1396), ('compound_statement -> brace_open block_item_list_opt brace_close','compound_statement',3,'p_compound_statement_1','c_parser.py',1402), ('labeled_statement -> ID COLON statement','labeled_statement',3,'p_labeled_statement_1','c_parser.py',1408), ('labeled_statement -> CASE constant_expression COLON statement','labeled_statement',4,'p_labeled_statement_2','c_parser.py',1412), 
('labeled_statement -> DEFAULT COLON statement','labeled_statement',3,'p_labeled_statement_3','c_parser.py',1416), ('selection_statement -> IF LPAREN expression RPAREN statement','selection_statement',5,'p_selection_statement_1','c_parser.py',1420), ('selection_statement -> IF LPAREN expression RPAREN statement ELSE statement','selection_statement',7,'p_selection_statement_2','c_parser.py',1424), ('selection_statement -> SWITCH LPAREN expression RPAREN statement','selection_statement',5,'p_selection_statement_3','c_parser.py',1428), ('iteration_statement -> WHILE LPAREN expression RPAREN statement','iteration_statement',5,'p_iteration_statement_1','c_parser.py',1433), ('iteration_statement -> DO statement WHILE LPAREN expression RPAREN SEMI','iteration_statement',7,'p_iteration_statement_2','c_parser.py',1437), ('iteration_statement -> FOR LPAREN expression_opt SEMI expression_opt SEMI expression_opt RPAREN statement','iteration_statement',9,'p_iteration_statement_3','c_parser.py',1441), ('iteration_statement -> FOR LPAREN declaration expression_opt SEMI expression_opt RPAREN statement','iteration_statement',8,'p_iteration_statement_4','c_parser.py',1445), ('jump_statement -> GOTO ID SEMI','jump_statement',3,'p_jump_statement_1','c_parser.py',1450), ('jump_statement -> BREAK SEMI','jump_statement',2,'p_jump_statement_2','c_parser.py',1454), ('jump_statement -> CONTINUE SEMI','jump_statement',2,'p_jump_statement_3','c_parser.py',1458), ('jump_statement -> RETURN expression SEMI','jump_statement',3,'p_jump_statement_4','c_parser.py',1462), ('jump_statement -> RETURN SEMI','jump_statement',2,'p_jump_statement_4','c_parser.py',1463), ('expression_statement -> expression_opt SEMI','expression_statement',2,'p_expression_statement','c_parser.py',1468), ('expression -> assignment_expression','expression',1,'p_expression','c_parser.py',1475), ('expression -> expression COMMA assignment_expression','expression',3,'p_expression','c_parser.py',1476), ('typedef_name -> 
TYPEID','typedef_name',1,'p_typedef_name','c_parser.py',1488), ('assignment_expression -> conditional_expression','assignment_expression',1,'p_assignment_expression','c_parser.py',1492), ('assignment_expression -> unary_expression assignment_operator assignment_expression','assignment_expression',3,'p_assignment_expression','c_parser.py',1493), ('assignment_operator -> EQUALS','assignment_operator',1,'p_assignment_operator','c_parser.py',1506), ('assignment_operator -> XOREQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1507), ('assignment_operator -> TIMESEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1508), ('assignment_operator -> DIVEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1509), ('assignment_operator -> MODEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1510), ('assignment_operator -> PLUSEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1511), ('assignment_operator -> MINUSEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1512), ('assignment_operator -> LSHIFTEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1513), ('assignment_operator -> RSHIFTEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1514), ('assignment_operator -> ANDEQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1515), ('assignment_operator -> OREQUAL','assignment_operator',1,'p_assignment_operator','c_parser.py',1516), ('constant_expression -> conditional_expression','constant_expression',1,'p_constant_expression','c_parser.py',1521), ('conditional_expression -> binary_expression','conditional_expression',1,'p_conditional_expression','c_parser.py',1525), ('conditional_expression -> binary_expression CONDOP expression COLON conditional_expression','conditional_expression',5,'p_conditional_expression','c_parser.py',1526), ('binary_expression -> 
cast_expression','binary_expression',1,'p_binary_expression','c_parser.py',1534), ('binary_expression -> binary_expression TIMES binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1535), ('binary_expression -> binary_expression DIVIDE binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1536), ('binary_expression -> binary_expression MOD binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1537), ('binary_expression -> binary_expression PLUS binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1538), ('binary_expression -> binary_expression MINUS binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1539), ('binary_expression -> binary_expression RSHIFT binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1540), ('binary_expression -> binary_expression LSHIFT binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1541), ('binary_expression -> binary_expression LT binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1542), ('binary_expression -> binary_expression LE binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1543), ('binary_expression -> binary_expression GE binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1544), ('binary_expression -> binary_expression GT binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1545), ('binary_expression -> binary_expression EQ binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1546), ('binary_expression -> binary_expression NE binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1547), ('binary_expression -> binary_expression AND binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1548), ('binary_expression -> binary_expression OR 
binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1549), ('binary_expression -> binary_expression XOR binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1550), ('binary_expression -> binary_expression LAND binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1551), ('binary_expression -> binary_expression LOR binary_expression','binary_expression',3,'p_binary_expression','c_parser.py',1552), ('cast_expression -> unary_expression','cast_expression',1,'p_cast_expression_1','c_parser.py',1560), ('cast_expression -> LPAREN type_name RPAREN cast_expression','cast_expression',4,'p_cast_expression_2','c_parser.py',1564), ('unary_expression -> postfix_expression','unary_expression',1,'p_unary_expression_1','c_parser.py',1568), ('unary_expression -> PLUSPLUS unary_expression','unary_expression',2,'p_unary_expression_2','c_parser.py',1572), ('unary_expression -> MINUSMINUS unary_expression','unary_expression',2,'p_unary_expression_2','c_parser.py',1573), ('unary_expression -> unary_operator cast_expression','unary_expression',2,'p_unary_expression_2','c_parser.py',1574), ('unary_expression -> SIZEOF unary_expression','unary_expression',2,'p_unary_expression_3','c_parser.py',1579), ('unary_expression -> SIZEOF LPAREN type_name RPAREN','unary_expression',4,'p_unary_expression_3','c_parser.py',1580), ('unary_operator -> AND','unary_operator',1,'p_unary_operator','c_parser.py',1588), ('unary_operator -> TIMES','unary_operator',1,'p_unary_operator','c_parser.py',1589), ('unary_operator -> PLUS','unary_operator',1,'p_unary_operator','c_parser.py',1590), ('unary_operator -> MINUS','unary_operator',1,'p_unary_operator','c_parser.py',1591), ('unary_operator -> NOT','unary_operator',1,'p_unary_operator','c_parser.py',1592), ('unary_operator -> LNOT','unary_operator',1,'p_unary_operator','c_parser.py',1593), ('postfix_expression -> 
primary_expression','postfix_expression',1,'p_postfix_expression_1','c_parser.py',1598), ('postfix_expression -> postfix_expression LBRACKET expression RBRACKET','postfix_expression',4,'p_postfix_expression_2','c_parser.py',1602), ('postfix_expression -> postfix_expression LPAREN argument_expression_list RPAREN','postfix_expression',4,'p_postfix_expression_3','c_parser.py',1606), ('postfix_expression -> postfix_expression LPAREN RPAREN','postfix_expression',3,'p_postfix_expression_3','c_parser.py',1607), ('postfix_expression -> postfix_expression PERIOD ID','postfix_expression',3,'p_postfix_expression_4','c_parser.py',1612), ('postfix_expression -> postfix_expression PERIOD TYPEID','postfix_expression',3,'p_postfix_expression_4','c_parser.py',1613), ('postfix_expression -> postfix_expression ARROW ID','postfix_expression',3,'p_postfix_expression_4','c_parser.py',1614), ('postfix_expression -> postfix_expression ARROW TYPEID','postfix_expression',3,'p_postfix_expression_4','c_parser.py',1615), ('postfix_expression -> postfix_expression PLUSPLUS','postfix_expression',2,'p_postfix_expression_5','c_parser.py',1621), ('postfix_expression -> postfix_expression MINUSMINUS','postfix_expression',2,'p_postfix_expression_5','c_parser.py',1622), ('postfix_expression -> LPAREN type_name RPAREN brace_open initializer_list brace_close','postfix_expression',6,'p_postfix_expression_6','c_parser.py',1627), ('postfix_expression -> LPAREN type_name RPAREN brace_open initializer_list COMMA brace_close','postfix_expression',7,'p_postfix_expression_6','c_parser.py',1628), ('primary_expression -> identifier','primary_expression',1,'p_primary_expression_1','c_parser.py',1633), ('primary_expression -> constant','primary_expression',1,'p_primary_expression_2','c_parser.py',1637), ('primary_expression -> unified_string_literal','primary_expression',1,'p_primary_expression_3','c_parser.py',1641), ('primary_expression -> 
unified_wstring_literal','primary_expression',1,'p_primary_expression_3','c_parser.py',1642), ('primary_expression -> LPAREN expression RPAREN','primary_expression',3,'p_primary_expression_4','c_parser.py',1647), ('primary_expression -> OFFSETOF LPAREN type_name COMMA offsetof_member_designator RPAREN','primary_expression',6,'p_primary_expression_5','c_parser.py',1651), ('offsetof_member_designator -> identifier','offsetof_member_designator',1,'p_offsetof_member_designator','c_parser.py',1659), ('offsetof_member_designator -> offsetof_member_designator PERIOD identifier','offsetof_member_designator',3,'p_offsetof_member_designator','c_parser.py',1660), ('offsetof_member_designator -> offsetof_member_designator LBRACKET expression RBRACKET','offsetof_member_designator',4,'p_offsetof_member_designator','c_parser.py',1661), ('argument_expression_list -> assignment_expression','argument_expression_list',1,'p_argument_expression_list','c_parser.py',1674), ('argument_expression_list -> argument_expression_list COMMA assignment_expression','argument_expression_list',3,'p_argument_expression_list','c_parser.py',1675), ('identifier -> ID','identifier',1,'p_identifier','c_parser.py',1684), ('constant -> INT_CONST_DEC','constant',1,'p_constant_1','c_parser.py',1688), ('constant -> INT_CONST_OCT','constant',1,'p_constant_1','c_parser.py',1689), ('constant -> INT_CONST_HEX','constant',1,'p_constant_1','c_parser.py',1690), ('constant -> INT_CONST_BIN','constant',1,'p_constant_1','c_parser.py',1691), ('constant -> FLOAT_CONST','constant',1,'p_constant_2','c_parser.py',1697), ('constant -> HEX_FLOAT_CONST','constant',1,'p_constant_2','c_parser.py',1698), ('constant -> CHAR_CONST','constant',1,'p_constant_3','c_parser.py',1704), ('constant -> WCHAR_CONST','constant',1,'p_constant_3','c_parser.py',1705), ('unified_string_literal -> STRING_LITERAL','unified_string_literal',1,'p_unified_string_literal','c_parser.py',1716), ('unified_string_literal -> unified_string_literal 
STRING_LITERAL','unified_string_literal',2,'p_unified_string_literal','c_parser.py',1717), ('unified_wstring_literal -> WSTRING_LITERAL','unified_wstring_literal',1,'p_unified_wstring_literal','c_parser.py',1727), ('unified_wstring_literal -> unified_wstring_literal WSTRING_LITERAL','unified_wstring_literal',2,'p_unified_wstring_literal','c_parser.py',1728), ('brace_open -> LBRACE','brace_open',1,'p_brace_open','c_parser.py',1738), ('brace_close -> RBRACE','brace_close',1,'p_brace_close','c_parser.py',1744), ('empty -> ','empty',0,'p_empty','c_parser.py',1750), ] pycparser-2.18/pycparser/__init__.py0000664000175000017500000000553513127010757020330 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: __init__.py # # This package file exports some convenience functions for # interacting with pycparser # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- __all__ = ['c_lexer', 'c_parser', 'c_ast'] __version__ = '2.18' from subprocess import Popen, PIPE from .c_parser import CParser def preprocess_file(filename, cpp_path='cpp', cpp_args=''): """ Preprocess a file using cpp. filename: Name of the file you want to preprocess. cpp_path: cpp_args: Refer to the documentation of parse_file for the meaning of these arguments. When successful, returns the preprocessed file's contents. Errors from cpp will be printed out. """ path_list = [cpp_path] if isinstance(cpp_args, list): path_list += cpp_args elif cpp_args != '': path_list += [cpp_args] path_list += [filename] try: # Note the use of universal_newlines to treat all newlines # as \n for Python's purpose # pipe = Popen( path_list, stdout=PIPE, universal_newlines=True) text = pipe.communicate()[0] except OSError as e: raise RuntimeError("Unable to invoke 'cpp'. 
" + 'Make sure its path was passed correctly\n' + ('Original error: %s' % e)) return text def parse_file(filename, use_cpp=False, cpp_path='cpp', cpp_args='', parser=None): """ Parse a C file using pycparser. filename: Name of the file you want to parse. use_cpp: Set to True if you want to execute the C pre-processor on the file prior to parsing it. cpp_path: If use_cpp is True, this is the path to 'cpp' on your system. If no path is provided, it attempts to just execute 'cpp', so it must be in your PATH. cpp_args: If use_cpp is True, set this to the command line arguments strings to cpp. Be careful with quotes - it's best to pass a raw string (r'') here. For example: r'-I../utils/fake_libc_include' If several arguments are required, pass a list of strings. parser: Optional parser object to be used instead of the default CParser When successful, an AST is returned. ParseError can be thrown if the file doesn't parse successfully. Errors from cpp will be printed out. """ if use_cpp: text = preprocess_file(filename, cpp_path, cpp_args) else: with open(filename, 'rU') as f: text = f.read() if parser is None: parser = CParser() return parser.parse(text, filename) pycparser-2.18/PKG-INFO0000664000175000017500000000113213127011712015261 0ustar elibeneliben00000000000000Metadata-Version: 1.1 Name: pycparser Version: 2.18 Summary: C parser in Python Home-page: https://github.com/eliben/pycparser Author: Eli Bendersky Author-email: eliben@gmail.com License: BSD Description: pycparser is a complete parser of the C language, written in pure Python using the PLY parsing library. It parses C code into an AST and can serve as a front-end for C compilers or analysis tools. 
Platform: Cross Platform Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 pycparser-2.18/setup.cfg0000664000175000017500000000013013127011712016002 0ustar elibeneliben00000000000000[bdist_wheel] universal = 1 [egg_info] tag_build = tag_date = 0 tag_svn_revision = 0 pycparser-2.18/examples/0000775000175000017500000000000013127011712016005 5ustar elibeneliben00000000000000pycparser-2.18/examples/func_defs.py0000664000175000017500000000247613045001366020327 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: func_defs.py # # Using pycparser for printing out all the functions defined in a # C file. # # This is a simple example of traversing the AST generated by # pycparser. Call it from the root directory of pycparser. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py sys.path.extend(['.', '..']) from pycparser import c_parser, c_ast, parse_file # A simple visitor for FuncDef nodes that prints the names and # locations of function definitions. class FuncDefVisitor(c_ast.NodeVisitor): def visit_FuncDef(self, node): print('%s at %s' % (node.decl.name, node.decl.coord)) def show_func_defs(filename): # Note that cpp is used. Provide a path to your own cpp or # make sure one exists in PATH. 
ast = parse_file(filename, use_cpp=True, cpp_args=r'-Iutils/fake_libc_include') v = FuncDefVisitor() v.visit(ast) if __name__ == "__main__": if len(sys.argv) > 1: filename = sys.argv[1] else: filename = 'examples/c_files/memmgr.c' show_func_defs(filename) pycparser-2.18/examples/c_json.py0000664000175000017500000001437613060530765017657 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: c_json.py # # by Michael White (@mypalmike) # # This example includes functions to serialize and deserialize an ast # to and from json format. Serializing involves walking the ast and converting # each node from a python Node object into a python dict. Deserializing # involves the opposite conversion, walking the tree formed by the # dict and converting each dict into the specific Node object it represents. # The dict itself is serialized and deserialized using the python json module. # # The dict representation is a fairly direct transformation of the object # attributes. Each node in the dict gets one metadata field referring to the # specific node class name, _nodetype. Each local attribute (i.e. not linking # to child nodes) has a string value or array of string values. Each child # attribute is either another dict or an array of dicts, exactly as in the # Node object representation. The "coord" attribute, representing the # node's location within the source code, is serialized/deserialized from # a Coord object into a string of the format "filename:line[:column]". 
# # Example TypeDecl node, with IdentifierType child node, represented as a dict: # "type": { # "_nodetype": "TypeDecl", # "coord": "c_files/funky.c:8", # "declname": "o", # "quals": [], # "type": { # "_nodetype": "IdentifierType", # "coord": "c_files/funky.c:8", # "names": [ # "char" # ] # } # } #------------------------------------------------------------------------------ from __future__ import print_function import json import sys import re # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import parse_file, c_ast from pycparser.plyparser import Coord RE_CHILD_ARRAY = re.compile(r'(.*)\[(.*)\]') RE_INTERNAL_ATTR = re.compile('__.*__') class CJsonError(Exception): pass def memodict(fn): """ Fast memoization decorator for a function taking a single argument """ class memodict(dict): def __missing__(self, key): ret = self[key] = fn(key) return ret return memodict().__getitem__ @memodict def child_attrs_of(klass): """ Given a Node class, get a set of child attrs. Memoized to avoid highly repetitive string manipulation """ non_child_attrs = set(klass.attr_names) all_attrs = set([i for i in klass.__slots__ if not RE_INTERNAL_ATTR.match(i)]) return all_attrs - non_child_attrs def to_dict(node): """ Recursively convert an ast into dict representation. """ klass = node.__class__ result = {} # Metadata result['_nodetype'] = klass.__name__ # Local node attributes for attr in klass.attr_names: result[attr] = getattr(node, attr) # Coord object if node.coord: result['coord'] = str(node.coord) else: result['coord'] = None # Child attributes for child_name, child in node.children(): # Child strings are either simple (e.g. 'value') or arrays (e.g. 'block_items[1]') match = RE_CHILD_ARRAY.match(child_name) if match: array_name, array_index = match.groups() array_index = int(array_index) # arrays come in order, so we verify and append. 
result[array_name] = result.get(array_name, []) if array_index != len(result[array_name]): raise CJsonError('Internal ast error. Array {} out of order. ' 'Expected index {}, got {}'.format( array_name, len(result[array_name]), array_index)) result[array_name].append(to_dict(child)) else: result[child_name] = to_dict(child) # Any child attributes that were missing need "None" values in the json. for child_attr in child_attrs_of(klass): if child_attr not in result: result[child_attr] = None return result def to_json(node, **kwargs): """ Convert ast node to json string """ return json.dumps(to_dict(node), **kwargs) def file_to_dict(filename): """ Load C file into dict representation of ast """ ast = parse_file(filename, use_cpp=True) return to_dict(ast) def file_to_json(filename, **kwargs): """ Load C file into json string representation of ast """ ast = parse_file(filename, use_cpp=True) return to_json(ast, **kwargs) def _parse_coord(coord_str): """ Parse coord string (file:line[:column]) into Coord object. """ if coord_str is None: return None vals = coord_str.split(':') vals.extend([None] * 3) filename, line, column = vals[:3] return Coord(filename, line, column) def _convert_to_obj(value): """ Convert an object in the dict representation into an object. Note: Mutually recursive with from_dict. """ value_type = type(value) if value_type == dict: return from_dict(value) elif value_type == list: return [_convert_to_obj(item) for item in value] else: # String return value def from_dict(node_dict): """ Recursively build an ast from dict representation """ class_name = node_dict.pop('_nodetype') klass = getattr(c_ast, class_name) # Create a new dict containing the key-value pairs which we can pass # to node constructors. objs = {} for key, value in node_dict.items(): if key == 'coord': objs[key] = _parse_coord(value) else: objs[key] = _convert_to_obj(value) # Use keyword parameters, which works thanks to beautifully consistent # ast Node initializers. 
return klass(**objs) def from_json(ast_json): """ Build an ast from json string representation """ return from_dict(json.loads(ast_json)) #------------------------------------------------------------------------------ if __name__ == "__main__": if len(sys.argv) > 1: # Some test code... # Do a round trip from C -> ast -> dict -> ast -> json, then print. ast_dict = file_to_dict(sys.argv[1]) ast = from_dict(ast_dict) print(to_json(ast, sort_keys=True, indent=4)) else: print("Please provide a filename as argument") pycparser-2.18/examples/serialize_ast.py0000664000175000017500000000161113053207707021225 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: serialize_ast.py # # Simple example of serializing AST # # Hart Chu [https://github.com/CtheSky] # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import pickle from pycparser import c_parser text = r""" void func(void) { x = 1; } """ parser = c_parser.CParser() ast = parser.parse(text) # Since AST nodes use __slots__ for faster attribute access and # space saving, it needs Pickle's protocol version >= 2. # The default version is 3 for python 3.x and 1 for python 2.7. # You can always select the highest available protocol with the -1 argument. with open('ast', 'wb') as f: pickle.dump(ast, f, protocol=-1) # Deserialize. with open('ast', 'rb') as f: ast = pickle.load(f) ast.show() pycparser-2.18/examples/using_gcc_E_libc.py0000664000175000017500000000163713045001366021567 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------- # pycparser: using_gcc_E_libc.py # # Similar to the using_cpp_libc.py example, but uses 'gcc -E' instead # of 'cpp'. The same can be achieved with Clang instead of gcc. If you have # Clang installed, simply replace 'gcc' with 'clang' here. 
# # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #------------------------------------------------------------------------------- import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import parse_file if __name__ == "__main__": if len(sys.argv) > 1: filename = sys.argv[1] else: filename = 'examples/c_files/year.c' ast = parse_file(filename, use_cpp=True, cpp_path='gcc', cpp_args=['-E', r'-Iutils/fake_libc_include']) ast.show() pycparser-2.18/examples/c_files/0000775000175000017500000000000013127011712017411 5ustar elibeneliben00000000000000pycparser-2.18/examples/c_files/memmgr.c0000664000175000017500000001241713045001366021051 0ustar elibeneliben00000000000000//---------------------------------------------------------------- // Statically-allocated memory manager // // by Eli Bendersky (eliben@gmail.com) // // This code is in the public domain. //---------------------------------------------------------------- #include "memmgr.h" typedef ulong Align; union mem_header_union { struct { // Pointer to the next block in the free list // union mem_header_union* next; // Size of the block (in quantas of sizeof(mem_header_t)) // ulong size; } s; // Used to align headers in memory to a boundary // Align align_dummy; }; typedef union mem_header_union mem_header_t; // Initial empty list // static mem_header_t base; // Start of free list // static mem_header_t* freep = 0; // Static pool for new allocations // static byte pool[POOL_SIZE] = {0}; static ulong pool_free_pos = 0; void memmgr_init() { base.s.next = 0; base.s.size = 0; freep = 0; pool_free_pos = 0; } static mem_header_t* get_mem_from_pool(ulong nquantas) { ulong total_req_size; mem_header_t* h; if (nquantas < MIN_POOL_ALLOC_QUANTAS) nquantas = MIN_POOL_ALLOC_QUANTAS; total_req_size = nquantas * sizeof(mem_header_t); if (pool_free_pos + total_req_size <= POOL_SIZE) { h = (mem_header_t*) (pool + 
pool_free_pos); h->s.size = nquantas; memmgr_free((void*) (h + 1)); pool_free_pos += total_req_size; } else { return 0; } return freep; } // Allocations are done in 'quantas' of header size. // The search for a free block of adequate size begins at the point 'freep' // where the last block was found. // If a too-big block is found, it is split and the tail is returned (this // way the header of the original needs only to have its size adjusted). // The pointer returned to the user points to the free space within the block, // which begins one quanta after the header. // void* memmgr_alloc(ulong nbytes) { mem_header_t* p; mem_header_t* prevp; // Calculate how many quantas are required: we need enough to house all // the requested bytes, plus the header. The -1 and +1 are there to make sure // that if nbytes is a multiple of nquantas, we don't allocate too much // ulong nquantas = (nbytes + sizeof(mem_header_t) - 1) / sizeof(mem_header_t) + 1; // First alloc call, and no free list yet ? Use 'base' for an initial // degenerate block of size 0, which points to itself // if ((prevp = freep) == 0) { base.s.next = freep = prevp = &base; base.s.size = 0; } for (p = prevp->s.next; ; prevp = p, p = p->s.next) { // big enough ? if (p->s.size >= nquantas) { // exactly ? if (p->s.size == nquantas) { // just eliminate this block from the free list by pointing // its prev's next to its next // prevp->s.next = p->s.next; } else // too big { p->s.size -= nquantas; p += p->s.size; p->s.size = nquantas; } freep = prevp; return (void*) (p + 1); } // Reached end of free list ? // Try to allocate the block from the pool. If that succeeds, // get_mem_from_pool adds the new block to the free list and // it will be found in the following iterations. If the call // to get_mem_from_pool doesn't succeed, we've run out of // memory // else if (p == freep) { if ((p = get_mem_from_pool(nquantas)) == 0) { #ifdef DEBUG_MEMMGR_FATAL printf("!! 
Memory allocation failed !!\n"); #endif return 0; } } } } // Scans the free list, starting at freep, looking the the place to insert the // free block. This is either between two existing blocks or at the end of the // list. In any case, if the block being freed is adjacent to either neighbor, // the adjacent blocks are combined. // void memmgr_free(void* ap) { mem_header_t* block; mem_header_t* p; // acquire pointer to block header block = ((mem_header_t*) ap) - 1; // Find the correct place to place the block in (the free list is sorted by // address, increasing order) // for (p = freep; !(block > p && block < p->s.next); p = p->s.next) { // Since the free list is circular, there is one link where a // higher-addressed block points to a lower-addressed block. // This condition checks if the block should be actually // inserted between them // if (p >= p->s.next && (block > p || block < p->s.next)) break; } // Try to combine with the higher neighbor // if (block + block->s.size == p->s.next) { block->s.size += p->s.next->s.size; block->s.next = p->s.next->s.next; } else { block->s.next = p->s.next; } // Try to combine with the lower neighbor // if (p + p->s.size == block) { p->s.size += block->s.size; p->s.next = block->s.next; } else { p->s.next = block; } freep = p; } pycparser-2.18/examples/c_files/hash.c0000664000175000017500000000731613045001366020512 0ustar elibeneliben00000000000000/* ** C implementation of a hash table ADT */ typedef enum tagReturnCode {SUCCESS, FAIL} ReturnCode; typedef struct tagEntry { char* key; char* value; } Entry; typedef struct tagNode { Entry* entry; struct tagNode* next; } Node; typedef struct tagHash { unsigned int table_size; Node** heads; } Hash; static unsigned int hash_func(const char* str, unsigned int table_size) { unsigned int hash_value; unsigned int a = 127; for (hash_value = 0; *str != 0; ++str) hash_value = (a*hash_value + *str) % table_size; return hash_value; } ReturnCode HashCreate(Hash** hash, unsigned int 
table_size) { unsigned int i; if (table_size < 1) return FAIL; // // Allocate space for the Hash // if (((*hash) = malloc(sizeof(**hash))) == NULL) return FAIL; // // Allocate space for the array of list heads // if (((*hash)->heads = malloc(table_size*sizeof(*((*hash)->heads)))) == NULL) return FAIL; // // Initialize Hash info // for (i = 0; i < table_size; ++i) { (*hash)->heads[i] = NULL; } (*hash)->table_size = table_size; return SUCCESS; } ReturnCode HashInsert(Hash* hash, const Entry* entry) { unsigned int index = hash_func(entry->key, hash->table_size); Node* temp = hash->heads[index]; HashRemove(hash, entry->key); if ((hash->heads[index] = malloc(sizeof(Node))) == NULL) return FAIL; hash->heads[index]->entry = malloc(sizeof(Entry)); hash->heads[index]->entry->key = malloc(strlen(entry->key)+1); hash->heads[index]->entry->value = malloc(strlen(entry->value)+1); strcpy(hash->heads[index]->entry->key, entry->key); strcpy(hash->heads[index]->entry->value, entry->value); hash->heads[index]->next = temp; return SUCCESS; } const Entry* HashFind(const Hash* hash, const char* key) { unsigned int index = hash_func(key, hash->table_size); Node* temp = hash->heads[index]; while (temp != NULL) { if (!strcmp(key, temp->entry->key)) return temp->entry; temp = temp->next; } return NULL; } ReturnCode HashRemove(Hash* hash, const char* key) { unsigned int index = hash_func(key, hash->table_size); Node* temp1 = hash->heads[index]; Node* temp2 = temp1; while (temp1 != NULL) { if (!strcmp(key, temp1->entry->key)) { if (temp1 == hash->heads[index]) hash->heads[index] = hash->heads[index]->next; else temp2->next = temp1->next; free(temp1->entry->key); free(temp1->entry->value); free(temp1->entry); free(temp1); temp1 = NULL; return SUCCESS; } temp2 = temp1; temp1 = temp1->next; } return FAIL; } void HashPrint(Hash* hash, void (*PrintFunc)(char*, char*)) { unsigned int i; if (hash == NULL || hash->heads == NULL) return; for (i = 0; i < hash->table_size; ++i) { Node* temp = 
hash->heads[i]; while (temp != NULL) { PrintFunc(temp->entry->key, temp->entry->value); temp = temp->next; } } } void HashDestroy(Hash* hash) { unsigned int i; if (hash == NULL) return; for (i = 0; i < hash->table_size; ++i) { Node* temp = hash->heads[i]; while (temp != NULL) { Node* temp2 = temp; free(temp->entry->key); free(temp->entry->value); free(temp->entry); temp = temp->next; free(temp2); } } free(hash->heads); hash->heads = NULL; free(hash); } pycparser-2.18/examples/c_files/year.c0000664000175000017500000000212513045001366020520 0ustar elibeneliben00000000000000#include #include #include void convert(int thousands, int hundreds, int tens, int ones) { char *num[] = {"", "One", "Two", "Three", "Four", "Five", "Six", "Seven", "Eight", "Nine"}; char *for_ten[] = {"", "", "Twenty", "Thirty", "Forty", "Fifty", "Sixty", "Seventy", "Eighty", "Ninety"}; char *af_ten[] = {"Ten", "Eleven", "Twelve", "Thirteen", "Fourteen", "Fifteen", "Sixteen", "Seventeen", "Eighteen", "Nineteen"}; printf("\nThe year in words is:\n"); printf("%s thousand", num[thousands]); if (hundreds != 0) printf(" %s hundred", num[hundreds]); if (tens != 1) printf(" %s %s", for_ten[tens], num[ones]); else printf(" %s", af_ten[ones]); } int main() { int year; int n1000, n100, n10, n1; printf("\nEnter the year (4 digits): "); scanf("%d", &year); if (year > 9999 || year < 1000) { printf("\nError !! The year must contain 4 digits."); exit(EXIT_FAILURE); } n1000 = year/1000; n100 = ((year)%1000)/100; n10 = (year%100)/10; n1 = ((year%10)%10); convert(n1000, n100, n10, n1); return 0; } pycparser-2.18/examples/c_files/memmgr.h0000664000175000017500000000552713045001366021062 0ustar elibeneliben00000000000000//---------------------------------------------------------------- // Statically-allocated memory manager // // by Eli Bendersky (eliben@gmail.com) // // This code is in the public domain. 
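// // Usage sketch (illustrative only; the actual declarations follow below): // //     memmgr_init();                 // set up the static pool once //     byte* p = memmgr_alloc(100);   // grab 100 bytes from the pool //     if (p != 0) //         memmgr_free(p);            // return the block to the free list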
//---------------------------------------------------------------- #ifndef MEMMGR_H #define MEMMGR_H // // Memory manager: dynamically allocates memory from // a fixed pool that is allocated statically at link-time. // // Usage: after calling memmgr_init() in your // initialization routine, just use memmgr_alloc() instead // of malloc() and memmgr_free() instead of free(). // Naturally, you can use the preprocessor to define // malloc() and free() as aliases to memmgr_alloc() and // memmgr_free(). This way the manager will be a drop-in // replacement for the standard C library allocators, and can // be useful for debugging memory allocation problems and // leaks. // // Preprocessor flags you can define to customize the // memory manager: // // DEBUG_MEMMGR_FATAL // Allow printing out a message when allocations fail // // DEBUG_MEMMGR_SUPPORT_STATS // Allow printing out of stats in function // memmgr_print_stats. When this is disabled, // memmgr_print_stats does nothing. // // Note that in production code on an embedded system // you'll probably want to keep those undefined, because // they cause printf to be called. // // POOL_SIZE // Size of the pool for new allocations. This is // effectively the heap size of the application, and can // be changed in accordance with the available memory // resources. // // MIN_POOL_ALLOC_QUANTAS // Internally, the memory manager allocates memory in // quantas roughly the size of two ulong objects. To // minimize pool fragmentation in case of multiple allocations // and deallocations, it is advisable to not allocate // blocks that are too small. // This flag sets the minimal amount of quantas for // an allocation. If the size of a ulong is 4 and you // set this flag to 16, the minimal size of an allocation // will be 4 * 2 * 16 = 128 bytes. // If you have a lot of small allocations, keep this value // low to conserve memory. If you have mostly large // allocations, it is best to make it higher, to avoid // fragmentation. 
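// // Worked example (assuming sizeof(ulong) == 4): one quanta is // sizeof(mem_header_t) == 2 * 4 == 8 bytes, so with MIN_POOL_ALLOC_QUANTAS // set to 16 the smallest allocation is 8 * 16 == 128 bytes, which is the // 4 * 2 * 16 arithmetic above.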
// // Notes: // 1. This memory manager is *not thread safe*. Use it only // for single thread/task applications. // #define DEBUG_MEMMGR_SUPPORT_STATS 1 #define POOL_SIZE 8 * 1024 #define MIN_POOL_ALLOC_QUANTAS 16 typedef unsigned char byte; typedef unsigned long ulong; // Initialize the memory manager. This function should be called // only once in the beginning of the program. // void memmgr_init(); // 'malloc' clone // void* memmgr_alloc(ulong nbytes); // 'free' clone // void memmgr_free(void* ap); // Prints statistics about the current state of the memory // manager // void memmgr_print_stats(); #endif // MEMMGR_H pycparser-2.18/examples/c_files/funky.c0000664000175000017500000000031613045001366020714 0ustar elibeneliben00000000000000char foo(void) { return '1'; } int maxout_in(int paste, char** matrix) { char o = foo(); return (int) matrix[1][2] * 5 - paste; } int main() { auto char* multi = "a multi"; } pycparser-2.18/examples/using_cpp_libc.py0000664000175000017500000000154513045001366021347 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: using_cpp_libc.py # # Shows how to use the provided 'cpp' (on Windows, substitute for # the 'real' cpp if you're on Linux/Unix) and "fake" libc includes # to parse a file that includes standard C headers. 
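# # Example invocation (illustrative; assumes you run from the root of the # pycparser source tree): # #     python examples/using_cpp_libc.py examples/c_files/year.c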
# # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import parse_file if __name__ == "__main__": if len(sys.argv) > 1: filename = sys.argv[1] else: filename = 'examples/c_files/year.c' ast = parse_file(filename, use_cpp=True, cpp_path='cpp', cpp_args=r'-Iutils/fake_libc_include') ast.show() pycparser-2.18/examples/rewrite_ast.py0000664000175000017500000000114213053207517020715 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: rewrite_ast.py # # Tiny example of rewriting an AST node # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import sys from pycparser import c_parser text = r""" void func(void) { x = 1; } """ parser = c_parser.CParser() ast = parser.parse(text) print("Before:") ast.show(offset=2) assign = ast.ext[0].body.block_items[0] assign.lvalue.name = "y" assign.rvalue.value = 2 print("After:") ast.show(offset=2) pycparser-2.18/examples/cdecl.py0000664000175000017500000001443213054560063017444 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: cdecl.py # # Example of the CDECL tool using pycparser. CDECL "explains" C type # declarations in plain English. # # The AST generated by pycparser from the given declaration is traversed # recursively to build the explanation. Note that the declaration must be a # valid external declaration in C. As shown below, typedef can be optionally # expanded. 
# # For example: # # c_decl = 'typedef int Node; const Node* (*ar)[10];' # # explain_c_declaration(c_decl) # => ar is a pointer to array[10] of pointer to const Node # # struct and typedef can be optionally expanded: # # explain_c_declaration(c_decl, expand_typedef=True) # => ar is a pointer to array[10] of pointer to const int # # c_decl = 'struct P {int x; int y;} p;' # # explain_c_declaration(c_decl) # => p is a struct P # # explain_c_declaration(c_decl, expand_struct=True) # => p is a struct P containing {x is a int, y is a int} # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- import copy import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import c_parser, c_ast def explain_c_declaration(c_decl, expand_struct=False, expand_typedef=False): """ Parses the declaration in c_decl and returns a text explanation as a string. The last external node of the string is used, to allow earlier typedefs for used types. expand_struct=True will spell out struct definitions recursively. expand_typedef=True will expand typedef'd types. """ parser = c_parser.CParser() try: node = parser.parse(c_decl, filename='') except c_parser.ParseError: e = sys.exc_info()[1] return "Parse error:" + str(e) if (not isinstance(node, c_ast.FileAST) or not isinstance(node.ext[-1], c_ast.Decl) ): return "Not a valid declaration" try: expanded = expand_struct_typedef(node.ext[-1], node, expand_struct=expand_struct, expand_typedef=expand_typedef) except Exception as e: return "Not a valid declaration: " + str(e) return _explain_decl_node(expanded) def _explain_decl_node(decl_node): """ Receives a c_ast.Decl node and returns its explanation in English. 
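For example (an illustrative case, not exhaustive): applied to the Decl parsed from 'const int x;', this returns 'x is a const int'. 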
""" storage = ' '.join(decl_node.storage) + ' ' if decl_node.storage else '' return (decl_node.name + " is a " + storage + _explain_type(decl_node.type)) def _explain_type(decl): """ Recursively explains a type decl node """ typ = type(decl) if typ == c_ast.TypeDecl: quals = ' '.join(decl.quals) + ' ' if decl.quals else '' return quals + _explain_type(decl.type) elif typ == c_ast.Typename or typ == c_ast.Decl: return _explain_type(decl.type) elif typ == c_ast.IdentifierType: return ' '.join(decl.names) elif typ == c_ast.PtrDecl: quals = ' '.join(decl.quals) + ' ' if decl.quals else '' return quals + 'pointer to ' + _explain_type(decl.type) elif typ == c_ast.ArrayDecl: arr = 'array' if decl.dim: arr += '[%s]' % decl.dim.value return arr + " of " + _explain_type(decl.type) elif typ == c_ast.FuncDecl: if decl.args: params = [_explain_type(param) for param in decl.args.params] args = ', '.join(params) else: args = '' return ('function(%s) returning ' % (args) + _explain_type(decl.type)) elif typ == c_ast.Struct: decls = [_explain_decl_node(mem_decl) for mem_decl in decl.decls] members = ', '.join(decls) return ('struct%s ' % (' ' + decl.name if decl.name else '') + ('containing {%s}' % members if members else '')) def expand_struct_typedef(cdecl, file_ast, expand_struct=False, expand_typedef=False): """Expand struct & typedef and return a new expanded node.""" decl_copy = copy.deepcopy(cdecl) _expand_in_place(decl_copy, file_ast, expand_struct, expand_typedef) return decl_copy def _expand_in_place(decl, file_ast, expand_struct=False, expand_typedef=False): """Recursively expand struct & typedef in place, throw RuntimeError if undeclared struct or typedef are used """ typ = type(decl) if typ in (c_ast.Decl, c_ast.TypeDecl, c_ast.PtrDecl, c_ast.ArrayDecl): decl.type = _expand_in_place(decl.type, file_ast, expand_struct, expand_typedef) elif typ == c_ast.Struct: if not decl.decls: struct = _find_struct(decl.name, file_ast) if not struct: raise RuntimeError('using 
undeclared struct %s' % decl.name) decl.decls = struct.decls for i, mem_decl in enumerate(decl.decls): decl.decls[i] = _expand_in_place(mem_decl, file_ast, expand_struct, expand_typedef) if not expand_struct: decl.decls = [] elif (typ == c_ast.IdentifierType and decl.names[0] not in ('int', 'char')): typedef = _find_typedef(decl.names[0], file_ast) if not typedef: raise RuntimeError('using undeclared type %s' % decl.names[0]) if expand_typedef: return typedef.type return decl def _find_struct(name, file_ast): """Receives a struct name and returns the declared struct object in file_ast """ for node in file_ast.ext: if (type(node) == c_ast.Decl and type(node.type) == c_ast.Struct and node.type.name == name): return node.type def _find_typedef(name, file_ast): """Receives a type name and returns the typedef object in file_ast """ for node in file_ast.ext: if type(node) == c_ast.Typedef and node.name == name: return node if __name__ == "__main__": if len(sys.argv) > 1: c_decl = sys.argv[1] else: c_decl = "char *(*(**foo[][8])())[];" print("Explaining the declaration: " + c_decl + "\n") print(explain_c_declaration(c_decl) + "\n") pycparser-2.18/examples/func_calls.py0000664000175000017500000000233013053207517020476 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: func_calls.py # # Using pycparser for printing out all the calls of some function # in a C file. 
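# # Example invocation (illustrative; run from the root of the pycparser # source tree, matching the defaults below): # #     python examples/func_calls.py examples/c_files/hash.c malloc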
# # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py sys.path.extend(['.', '..']) from pycparser import c_parser, c_ast, parse_file # A visitor with some state information (the funcname it's # looking for) # class FuncCallVisitor(c_ast.NodeVisitor): def __init__(self, funcname): self.funcname = funcname def visit_FuncCall(self, node): if node.name.name == self.funcname: print('%s called at %s' % (self.funcname, node.name.coord)) def show_func_calls(filename, funcname): ast = parse_file(filename, use_cpp=True) v = FuncCallVisitor(funcname) v.visit(ast) if __name__ == "__main__": if len(sys.argv) > 2: filename = sys.argv[1] func = sys.argv[2] else: filename = 'examples/c_files/hash.c' func = 'malloc' show_func_calls(filename, func) pycparser-2.18/examples/dump_ast.py0000664000175000017500000000141113111176162020174 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: dump_ast.py # # Basic example of parsing a file and dumping its parsed AST. 
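# # Example invocation (illustrative; run from the root of the pycparser # source tree — funky.c needs no preprocessing, so use_cpp=False works): # #     python examples/dump_ast.py examples/c_files/funky.c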
# # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import argparse import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py sys.path.extend(['.', '..']) from pycparser import c_parser, c_ast, parse_file if __name__ == "__main__": argparser = argparse.ArgumentParser('Dump AST') argparser.add_argument('filename', help='name of file to parse') args = argparser.parse_args() ast = parse_file(args.filename, use_cpp=False) ast.show() pycparser-2.18/examples/c-to-c.py0000664000175000017500000000306013045001366017443 0ustar elibeneliben00000000000000#------------------------------------------------------------------------------ # pycparser: c-to-c.py # # Example of using pycparser.c_generator, serving as a simplistic translator # from C to AST and back to C. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #------------------------------------------------------------------------------ from __future__ import print_function import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import parse_file, c_parser, c_generator def translate_to_c(filename): """ Simply use the c_generator module to emit a parsed AST. 
""" ast = parse_file(filename, use_cpp=True) generator = c_generator.CGenerator() print(generator.visit(ast)) def _zz_test_translate(): # internal use src = r''' void f(char * restrict joe){} int main(void) { unsigned int long k = 4; int p = - - k; return 0; } ''' parser = c_parser.CParser() ast = parser.parse(src) ast.show() generator = c_generator.CGenerator() print(generator.visit(ast)) # tracing the generator for debugging #~ import trace #~ tr = trace.Trace(countcallers=1) #~ tr.runfunc(generator.visit, ast) #~ tr.results().write_results() #------------------------------------------------------------------------------ if __name__ == "__main__": #_zz_test_translate() if len(sys.argv) > 1: translate_to_c(sys.argv[1]) else: print("Please provide a filename as argument") pycparser-2.18/examples/explore_ast.py0000664000175000017500000001256613111176162020722 0ustar elibeneliben00000000000000#----------------------------------------------------------------- # pycparser: explore_ast.py # # This example demonstrates how to "explore" the AST created by # pycparser to understand its structure. The AST is a n-nary tree # of nodes, each node having several children, each with a name. # Just read the code, and let the comments guide you. The lines # beginning with #~ can be uncommented to print out useful # information from the AST. # It helps to have the pycparser/_c_ast.cfg file in front of you. # # Eli Bendersky [http://eli.thegreenplace.net] # License: BSD #----------------------------------------------------------------- from __future__ import print_function import sys # This is not required if you've installed pycparser into # your site-packages/ with setup.py # sys.path.extend(['.', '..']) from pycparser import c_parser, c_ast # This is some C source to parse. Note that pycparser must begin # at the top level of the C file, i.e. 
with either declarations # or function definitions (this is called "external declarations" # in C grammar lingo) # # Also, a C parser must have all the types declared in order to # build the correct AST. It doesn't matter what they're declared # to, so I've inserted the dummy typedef in the code to let the # parser know Hash and Node are types. You don't need to do it # when parsing real, correct C code. text = r""" typedef int Node, Hash; void HashPrint(Hash* hash, void (*PrintFunc)(char*, char*)) { unsigned int i; if (hash == NULL || hash->heads == NULL) return; for (i = 0; i < hash->table_size; ++i) { Node* temp = hash->heads[i]; while (temp != NULL) { PrintFunc(temp->entry->key, temp->entry->value); temp = temp->next; } } } """ # Create the parser and ask to parse the text. parse() will throw # a ParseError if there's an error in the code # parser = c_parser.CParser() ast = parser.parse(text, filename='') # Uncomment the following line to see the AST in a nice, human # readable way. show() is the most useful tool in exploring ASTs # created by pycparser. See the c_ast.py file for the options you # can pass it. #ast.show(showcoord=True) # OK, we've seen that the top node is FileAST. This is always the # top node of the AST. Its children are "external declarations", # and are stored in a list called ext[] (see _c_ast.cfg for the # names and types of Nodes and their children). # As you see from the printout, our AST has two Typedef children # and one FuncDef child. # Let's explore FuncDef more closely. As I've mentioned, the list # ext[] holds the children of FileAST. Since the function # definition is the third child, it's ext[2]. Uncomment the # following line to show it: #ast.ext[2].show() # A FuncDef consists of a declaration, a list of parameter # declarations (for K&R style function definitions), and a body. # First, let's examine the declaration. function_decl = ast.ext[2].decl # function_decl, like any other declaration, is a Decl. 
Its type child # is a FuncDecl, which has a return type and arguments stored in a # ParamList node #function_decl.type.show() #function_decl.type.args.show() # The following displays the name and type of each argument: #for param_decl in function_decl.type.args.params: #print('Arg name: %s' % param_decl.name) #print('Type:') #param_decl.type.show(offset=6) # The body of FuncDef is a Compound, which is a placeholder for a block # surrounded by {} (You should be reading _c_ast.cfg parallel to this # explanation and seeing these things with your own eyes). # Let's see the block's declarations: function_body = ast.ext[2].body # The following displays the declarations and statements in the function # body #for decl in function_body.block_items: #decl.show() # We can see a single variable declaration, i, declared to be a simple type # declaration of type 'unsigned int', followed by statements. # block_items is a list, so the third element is the For statement: for_stmt = function_body.block_items[2] #for_stmt.show() # As you can see in _c_ast.cfg, For's children are 'init, cond, # next' for the respective parts of the 'for' loop specifier, # and stmt, which is either a single stmt or a Compound if there's # a block. # # Let's dig deeper, to the while statement inside the for loop: while_stmt = for_stmt.stmt.block_items[1] #while_stmt.show() # While is simpler, it only has a condition node and a stmt node. # The condition: while_cond = while_stmt.cond #while_cond.show() # Note that it's a BinaryOp node - the basic constituent of # expressions in our AST. BinaryOp is the expression tree, with # left and right nodes as children. It also has the op attribute, # which is just the string representation of the operator. #print(while_cond.op) #while_cond.left.show() #while_cond.right.show() # That's it for the example. I hope you now see how easy it is to explore the # AST created by pycparser. 
Although on the surface it is quite complex and has # a lot of node types, this is the inherent complexity of the C language every # parser/compiler designer has to cope with. # Using the tools provided by the c_ast package it's easy to explore the # structure of AST nodes and write code that processes them. # Specifically, see the cdecl.py example for a non-trivial demonstration of what # you can do by recursively going through the AST. pycparser-2.18/MANIFEST.in0000664000175000017500000000047513045001366015736 0ustar elibeneliben00000000000000recursive-include examples *.c *.h *.py recursive-include tests *.c *.h *.py recursive-include pycparser *.py *.cfg recursive-include utils/fake_libc_include *.h include README.* include LICENSE include CHANGES include setup.* recursive-exclude tests yacctab.* lextab.* recursive-exclude examples yacctab.* lextab.* pycparser-2.18/setup.py0000664000175000017500000000333213127010727015707 0ustar elibeneliben00000000000000import os, sys try: from setuptools import setup from setuptools.command.install import install as _install from setuptools.command.sdist import sdist as _sdist except ImportError: from distutils.core import setup from distutils.command.install import install as _install from distutils.command.sdist import sdist as _sdist def _run_build_tables(dir): from subprocess import call call([sys.executable, '_build_tables.py'], cwd=os.path.join(dir, 'pycparser')) class install(_install): def run(self): _install.run(self) self.execute(_run_build_tables, (self.install_lib,), msg="Build the lexing/parsing tables") class sdist(_sdist): def make_release_tree(self, basedir, files): _sdist.make_release_tree(self, basedir, files) self.execute(_run_build_tables, (basedir,), msg="Build the lexing/parsing tables") setup( # metadata name='pycparser', description='C parser in Python', long_description=""" pycparser is a complete parser of the C language, written in pure Python using the PLY parsing library. 
It parses C code into an AST and can serve as a front-end for C compilers or analysis tools. """, license='BSD', version='2.18', author='Eli Bendersky', maintainer='Eli Bendersky', author_email='eliben@gmail.com', url='https://github.com/eliben/pycparser', platforms='Cross Platform', classifiers = [ 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3',], packages=['pycparser', 'pycparser.ply'], package_data={'pycparser': ['*.cfg']}, cmdclass={'install': install, 'sdist': sdist}, ) pycparser-2.18/tests/0000775000175000017500000000000013127011712015331 5ustar elibeneliben00000000000000pycparser-2.18/tests/test_c_lexer.py0000664000175000017500000003462613060530765020410 0ustar elibeneliben00000000000000import re import sys import unittest sys.path.insert(0, '..') from pycparser.c_lexer import CLexer def token_list(clex): return list(iter(clex.token, None)) def token_types(clex): return [i.type for i in token_list(clex)] class TestCLexerNoErrors(unittest.TestCase): """ Test lexing of strings that are not supposed to cause errors. Therefore, the error_func passed to the lexer raises an exception. 
""" def error_func(self, msg, line, column): self.fail(msg) def on_lbrace_func(self): pass def on_rbrace_func(self): pass def type_lookup_func(self, typ): if typ.startswith('mytype'): return True else: return False def setUp(self): self.clex = CLexer(self.error_func, lambda: None, lambda: None, self.type_lookup_func) self.clex.build(optimize=False) def assertTokensTypes(self, str, types): self.clex.input(str) self.assertEqual(token_types(self.clex), types) def test_trivial_tokens(self): self.assertTokensTypes('1', ['INT_CONST_DEC']) self.assertTokensTypes('-', ['MINUS']) self.assertTokensTypes('volatile', ['VOLATILE']) self.assertTokensTypes('...', ['ELLIPSIS']) self.assertTokensTypes('++', ['PLUSPLUS']) self.assertTokensTypes('case int', ['CASE', 'INT']) self.assertTokensTypes('caseint', ['ID']) self.assertTokensTypes('$dollar cent$', ['ID', 'ID']) self.assertTokensTypes('i ^= 1;', ['ID', 'XOREQUAL', 'INT_CONST_DEC', 'SEMI']) def test_id_typeid(self): self.assertTokensTypes('myt', ['ID']) self.assertTokensTypes('mytype', ['TYPEID']) self.assertTokensTypes('mytype6 var', ['TYPEID', 'ID']) def test_integer_constants(self): self.assertTokensTypes('12', ['INT_CONST_DEC']) self.assertTokensTypes('12u', ['INT_CONST_DEC']) self.assertTokensTypes('12l', ['INT_CONST_DEC']) self.assertTokensTypes('199872Ul', ['INT_CONST_DEC']) self.assertTokensTypes('199872lU', ['INT_CONST_DEC']) self.assertTokensTypes('199872LL', ['INT_CONST_DEC']) self.assertTokensTypes('199872ull', ['INT_CONST_DEC']) self.assertTokensTypes('199872llu', ['INT_CONST_DEC']) self.assertTokensTypes('1009843200000uLL', ['INT_CONST_DEC']) self.assertTokensTypes('1009843200000LLu', ['INT_CONST_DEC']) self.assertTokensTypes('077', ['INT_CONST_OCT']) self.assertTokensTypes('0123456L', ['INT_CONST_OCT']) self.assertTokensTypes('0xf7', ['INT_CONST_HEX']) self.assertTokensTypes('0b110', ['INT_CONST_BIN']) self.assertTokensTypes('0x01202AAbbf7Ul', ['INT_CONST_HEX']) # no 0 before x, so ID catches it 
self.assertTokensTypes('xf7', ['ID']) # - is MINUS, the rest a constant self.assertTokensTypes('-1', ['MINUS', 'INT_CONST_DEC']) def test_special_names(self): self.assertTokensTypes('sizeof offsetof', ['SIZEOF', 'OFFSETOF']) def test_floating_constants(self): self.assertTokensTypes('1.5f', ['FLOAT_CONST']) self.assertTokensTypes('01.5', ['FLOAT_CONST']) self.assertTokensTypes('.15L', ['FLOAT_CONST']) self.assertTokensTypes('0.', ['FLOAT_CONST']) # but just a period is a period self.assertTokensTypes('.', ['PERIOD']) self.assertTokensTypes('3.3e-3', ['FLOAT_CONST']) self.assertTokensTypes('.7e25L', ['FLOAT_CONST']) self.assertTokensTypes('6.e+125f', ['FLOAT_CONST']) self.assertTokensTypes('666e666', ['FLOAT_CONST']) self.assertTokensTypes('00666e+3', ['FLOAT_CONST']) # but this is a hex integer + 3 self.assertTokensTypes('0x0666e+3', ['INT_CONST_HEX', 'PLUS', 'INT_CONST_DEC']) def test_hexadecimal_floating_constants(self): self.assertTokensTypes('0xDE.488641p0', ['HEX_FLOAT_CONST']) self.assertTokensTypes('0x.488641p0', ['HEX_FLOAT_CONST']) self.assertTokensTypes('0X12.P0', ['HEX_FLOAT_CONST']) def test_char_constants(self): self.assertTokensTypes(r"""'x'""", ['CHAR_CONST']) self.assertTokensTypes(r"""L'x'""", ['WCHAR_CONST']) self.assertTokensTypes(r"""'\t'""", ['CHAR_CONST']) self.assertTokensTypes(r"""'\''""", ['CHAR_CONST']) self.assertTokensTypes(r"""'\?'""", ['CHAR_CONST']) self.assertTokensTypes(r"""'\012'""", ['CHAR_CONST']) self.assertTokensTypes(r"""'\x2f'""", ['CHAR_CONST']) self.assertTokensTypes(r"""'\x2f12'""", ['CHAR_CONST']) self.assertTokensTypes(r"""L'\xaf'""", ['WCHAR_CONST']) def test_on_rbrace_lbrace(self): braces = [] def on_lbrace(): braces.append('{') def on_rbrace(): braces.append('}') clex = CLexer(self.error_func, on_lbrace, on_rbrace, self.type_lookup_func) clex.build(optimize=False) clex.input('hello { there } } and again }}{') token_list(clex) self.assertEqual(braces, ['{', '}', '}', '}', '}', '{']) def test_string_literal(self): 
self.assertTokensTypes('"a string"', ['STRING_LITERAL']) self.assertTokensTypes('L"ing"', ['WSTRING_LITERAL']) self.assertTokensTypes( '"i am a string too \t"', ['STRING_LITERAL']) self.assertTokensTypes( r'''"esc\ape \"\'\? \0234 chars \rule"''', ['STRING_LITERAL']) self.assertTokensTypes( r'''"hello 'joe' wanna give it a \"go\"?"''', ['STRING_LITERAL']) self.assertTokensTypes( '"\123\123\123\123\123\123\123\123\123\123\123\123\123\123\123\123"', ['STRING_LITERAL']) def test_mess(self): self.assertTokensTypes( r'[{}]()', ['LBRACKET', 'LBRACE', 'RBRACE', 'RBRACKET', 'LPAREN', 'RPAREN']) self.assertTokensTypes( r'()||!C&~Z?J', ['LPAREN', 'RPAREN', 'LOR', 'LNOT', 'ID', 'AND', 'NOT', 'ID', 'CONDOP', 'ID']) self.assertTokensTypes( r'+-*/%|||&&&^><>=<===!=', ['PLUS', 'MINUS', 'TIMES', 'DIVIDE', 'MOD', 'LOR', 'OR', 'LAND', 'AND', 'XOR', 'GT', 'LT', 'GE', 'LE', 'EQ', 'NE']) self.assertTokensTypes( r'++--->?.,;:', ['PLUSPLUS', 'MINUSMINUS', 'ARROW', 'CONDOP', 'PERIOD', 'COMMA', 'SEMI', 'COLON']) def test_exprs(self): self.assertTokensTypes( 'bb-cc', ['ID', 'MINUS', 'ID']) self.assertTokensTypes( 'foo & 0xFF', ['ID', 'AND', 'INT_CONST_HEX']) self.assertTokensTypes( '(2+k) * 62', ['LPAREN', 'INT_CONST_DEC', 'PLUS', 'ID', 'RPAREN', 'TIMES', 'INT_CONST_DEC'],) self.assertTokensTypes( 'x | y >> z', ['ID', 'OR', 'ID', 'RSHIFT', 'ID']) self.assertTokensTypes( 'x <<= z << 5', ['ID', 'LSHIFTEQUAL', 'ID', 'LSHIFT', 'INT_CONST_DEC']) self.assertTokensTypes( 'x = y > 0 ? 
y : -6', ['ID', 'EQUALS', 'ID', 'GT', 'INT_CONST_OCT', 'CONDOP', 'ID', 'COLON', 'MINUS', 'INT_CONST_DEC']) self.assertTokensTypes( 'a+++b', ['ID', 'PLUSPLUS', 'PLUS', 'ID']) def test_statements(self): self.assertTokensTypes( 'for (int i = 0; i < n; ++i)', ['FOR', 'LPAREN', 'INT', 'ID', 'EQUALS', 'INT_CONST_OCT', 'SEMI', 'ID', 'LT', 'ID', 'SEMI', 'PLUSPLUS', 'ID', 'RPAREN']) self.assertTokensTypes( 'self: goto self;', ['ID', 'COLON', 'GOTO', 'ID', 'SEMI']) self.assertTokensTypes( """ switch (typ) { case TYPE_ID: m = 5; break; default: m = 8; }""", ['SWITCH', 'LPAREN', 'ID', 'RPAREN', 'LBRACE', 'CASE', 'ID', 'COLON', 'ID', 'EQUALS', 'INT_CONST_DEC', 'SEMI', 'BREAK', 'SEMI', 'DEFAULT', 'COLON', 'ID', 'EQUALS', 'INT_CONST_DEC', 'SEMI', 'RBRACE']) def test_preprocessor_line(self): self.assertTokensTypes('#abracadabra', ['PPHASH', 'ID']) str = r""" 546 #line 66 "kwas\df.h" id 4 dsf # 9 armo #line 10 "..\~..\test.h" tok1 #line 99999 "include/me.h" tok2 """ #~ self.clex.filename self.clex.input(str) self.clex.reset_lineno() t1 = self.clex.token() self.assertEqual(t1.type, 'INT_CONST_DEC') self.assertEqual(t1.lineno, 2) t2 = self.clex.token() self.assertEqual(t2.type, 'ID') self.assertEqual(t2.value, 'id') self.assertEqual(t2.lineno, 66) self.assertEqual(self.clex.filename, r'kwas\df.h') for i in range(3): t = self.clex.token() self.assertEqual(t.type, 'ID') self.assertEqual(t.value, 'armo') self.assertEqual(t.lineno, 9) self.assertEqual(self.clex.filename, r'kwas\df.h') t4 = self.clex.token() self.assertEqual(t4.type, 'ID') self.assertEqual(t4.value, 'tok1') self.assertEqual(t4.lineno, 10) self.assertEqual(self.clex.filename, r'..\~..\test.h') t5 = self.clex.token() self.assertEqual(t5.type, 'ID') self.assertEqual(t5.value, 'tok2') self.assertEqual(t5.lineno, 99999) self.assertEqual(self.clex.filename, r'include/me.h') def test_preprocessor_line_funny(self): str = r''' #line 10 "..\6\joe.h" 10 ''' self.clex.input(str) self.clex.reset_lineno() t1 = self.clex.token() 
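The `#line`-directive tests above rely on the lexer resetting its reported line number and filename whenever a directive is seen. As a standalone illustration (a minimal sketch only — this is not pycparser's actual implementation, and `track_lines`/`LINE_DIRECTIVE` are hypothetical names), the bookkeeping looks like this:

```python
import re

# Hypothetical sketch of #line-directive bookkeeping; pycparser's real lexer
# also handles GCC-style "# <num> <file>" markers, which this sketch omits.
LINE_DIRECTIVE = re.compile(r'^\s*#\s*line\s+(\d+)(?:\s+"([^"]*)")?')

def track_lines(source, filename='<stdin>'):
    """Yield (lineno, filename, text) for each non-directive line."""
    lineno = 1
    for raw in source.splitlines():
        m = LINE_DIRECTIVE.match(raw)
        if m:
            # The directive gives the line number of the NEXT source line.
            lineno = int(m.group(1))
            if m.group(2) is not None:
                filename = m.group(2)
            continue
        yield lineno, filename, raw
        lineno += 1

rows = list(track_lines('int a;\n#line 66 "kwas.h"\nint b;'))
```

After the directive, subsequent lines are reported against the new coordinates, which is exactly what `test_preprocessor_line` asserts via `t.lineno` and `self.clex.filename`.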
self.assertEqual(t1.type, 'INT_CONST_DEC') self.assertEqual(t1.lineno, 10) self.assertEqual(self.clex.filename, r'..\6\joe.h') def test_preprocessor_pragma(self): str = ''' 42 #pragma #pragma helo me #pragma once # pragma omp parallel private(th_id) #\tpragma {pack: 2, smack: 3} #pragma "nowit.h" #pragma "string" #pragma somestring="some_other_string" #pragma id 124124 and numbers 0235495 59 ''' # Check that pragmas are tokenized, including trailing string self.clex.input(str) self.clex.reset_lineno() t1 = self.clex.token() self.assertEqual(t1.type, 'INT_CONST_DEC') t2 = self.clex.token() self.assertEqual(t2.type, 'PPPRAGMA') t3 = self.clex.token() self.assertEqual(t3.type, 'PPPRAGMA') t4 = self.clex.token() self.assertEqual(t4.type, 'PPPRAGMASTR') self.assertEqual(t4.value, 'helo me') for i in range(3): t = self.clex.token() t5 = self.clex.token() self.assertEqual(t5.type, 'PPPRAGMASTR') self.assertEqual(t5.value, 'omp parallel private(th_id)') for i in range(5): ta = self.clex.token() self.assertEqual(ta.type, 'PPPRAGMA') tb = self.clex.token() self.assertEqual(tb.type, 'PPPRAGMASTR') t6 = self.clex.token() self.assertEqual(t6.type, 'INT_CONST_DEC') self.assertEqual(t6.lineno, 12) # Keeps all the errors the lexer spits in one place, to allow # easier modification if the error syntax changes. # ERR_ILLEGAL_CHAR = 'Illegal character' ERR_OCTAL = 'Invalid octal constant' ERR_UNMATCHED_QUOTE = 'Unmatched \'' ERR_INVALID_CCONST = 'Invalid char constant' ERR_STRING_ESCAPE = 'String contains invalid escape' ERR_FILENAME_BEFORE_LINE = 'filename before line' ERR_LINENUM_MISSING = 'line number missing' ERR_INVALID_LINE_DIRECTIVE = 'invalid #line directive' class TestCLexerErrors(unittest.TestCase): """ Test lexing of erroneous strings. Works by passing an error function that saves the error in an attribute for later perusal. 
""" def error_func(self, msg, line, column): self.error = msg def on_lbrace_func(self): pass def on_rbrace_func(self): pass def type_lookup_func(self, typ): return False def setUp(self): self.clex = CLexer(self.error_func, self.on_lbrace_func, self.on_rbrace_func, self.type_lookup_func) self.clex.build(optimize=False) self.error = "" def assertLexerError(self, str, error_like): # feed the string to the lexer self.clex.input(str) # Pulls all tokens from the string. Errors will # be written into self.error by the error_func # callback # token_types(self.clex) # compare the error to the expected self.assertTrue(re.search(error_like, self.error), "\nExpected error matching: %s\nGot: %s" % (error_like, self.error)) # clear last error, for the sake of subsequent invocations self.error = "" def test_trivial_tokens(self): self.assertLexerError('@', ERR_ILLEGAL_CHAR) self.assertLexerError('`', ERR_ILLEGAL_CHAR) self.assertLexerError('\\', ERR_ILLEGAL_CHAR) def test_integer_constants(self): self.assertLexerError('029', ERR_OCTAL) self.assertLexerError('012345678', ERR_OCTAL) def test_char_constants(self): self.assertLexerError("'", ERR_UNMATCHED_QUOTE) self.assertLexerError("'b\n", ERR_UNMATCHED_QUOTE) self.assertLexerError("'jx'", ERR_INVALID_CCONST) self.assertLexerError(r"'\*'", ERR_INVALID_CCONST) def test_string_literals(self): self.assertLexerError(r'"jx\9"', ERR_STRING_ESCAPE) self.assertLexerError(r'"hekllo\* on ix"', ERR_STRING_ESCAPE) self.assertLexerError(r'L"hekllo\* on ix"', ERR_STRING_ESCAPE) def test_preprocessor(self): self.assertLexerError('#line "ka"', ERR_FILENAME_BEFORE_LINE) self.assertLexerError('#line df', ERR_INVALID_LINE_DIRECTIVE) self.assertLexerError('#line \n', ERR_LINENUM_MISSING) if __name__ == '__main__': unittest.main() pycparser-2.18/tests/test_c_ast.py0000664000175000017500000000503313045001366020037 0ustar elibeneliben00000000000000import pprint import re import sys import unittest import weakref sys.path.insert(0, '..') import 
pycparser.c_ast as c_ast import pycparser.plyparser as plyparser class Test_c_ast(unittest.TestCase): def test_BinaryOp(self): b1 = c_ast.BinaryOp( op='+', left=c_ast.Constant(type='int', value='6'), right=c_ast.ID(name='joe')) self.assertTrue(isinstance(b1.left, c_ast.Constant)) self.assertEqual(b1.left.type, 'int') self.assertEqual(b1.left.value, '6') self.assertTrue(isinstance(b1.right, c_ast.ID)) self.assertEqual(b1.right.name, 'joe') def test_weakref_works_on_nodes(self): c1 = c_ast.Constant(type='float', value='3.14') wr = weakref.ref(c1) cref = wr() self.assertEqual(cref.type, 'float') self.assertEqual(weakref.getweakrefcount(c1), 1) def test_weakref_works_on_coord(self): coord = plyparser.Coord(file='a', line=2) wr = weakref.ref(coord) cref = wr() self.assertEqual(cref.line, 2) self.assertEqual(weakref.getweakrefcount(coord), 1) class TestNodeVisitor(unittest.TestCase): class ConstantVisitor(c_ast.NodeVisitor): def __init__(self): self.values = [] def visit_Constant(self, node): self.values.append(node.value) def test_scalar_children(self): b1 = c_ast.BinaryOp( op='+', left=c_ast.Constant(type='int', value='6'), right=c_ast.ID(name='joe')) cv = self.ConstantVisitor() cv.visit(b1) self.assertEqual(cv.values, ['6']) b2 = c_ast.BinaryOp( op='*', left=c_ast.Constant(type='int', value='111'), right=b1) b3 = c_ast.BinaryOp( op='^', left=b2, right=b1) cv = self.ConstantVisitor() cv.visit(b3) self.assertEqual(cv.values, ['111', '6', '6']) def tests_list_children(self): c1 = c_ast.Constant(type='float', value='5.6') c2 = c_ast.Constant(type='char', value='t') b1 = c_ast.BinaryOp( op='+', left=c1, right=c2) b2 = c_ast.BinaryOp( op='-', left=b1, right=c2) comp = c_ast.Compound( block_items=[b1, b2, c1, c2]) cv = self.ConstantVisitor() cv.visit(comp) self.assertEqual(cv.values, ['5.6', 't', '5.6', 't', 't', '5.6', 't']) if __name__ == '__main__': unittest.main() pycparser-2.18/tests/test_general.py0000664000175000017500000000434213111175436020371 0ustar 
elibeneliben00000000000000import os import platform import sys import unittest sys.path.insert(0, '..') from pycparser import parse_file, c_ast CPPPATH = 'cpp' # Test successful parsing # class TestParsing(unittest.TestCase): def _find_file(self, name): """ Find a c file by name, taking into account the current dir can be in a couple of typical places """ testdir = os.path.dirname(__file__) name = os.path.join(testdir, 'c_files', name) assert os.path.exists(name) return name def test_without_cpp(self): ast = parse_file(self._find_file('example_c_file.c')) self.assertTrue(isinstance(ast, c_ast.FileAST)) @unittest.skipUnless(platform.system() == 'Linux', 'cpp only works on Linux') def test_with_cpp(self): memmgr_path = self._find_file('memmgr.c') c_files_path = os.path.dirname(memmgr_path) ast = parse_file(memmgr_path, use_cpp=True, cpp_path=CPPPATH, cpp_args='-I%s' % c_files_path) self.assertTrue(isinstance(ast, c_ast.FileAST)) fake_libc = os.path.join(c_files_path, '..', '..', 'utils', 'fake_libc_include') ast2 = parse_file(self._find_file('year.c'), use_cpp=True, cpp_path=CPPPATH, cpp_args=[r'-I%s' % fake_libc]) self.assertTrue(isinstance(ast2, c_ast.FileAST)) @unittest.skipUnless(platform.system() == 'Linux', 'cpp only works on Linux') def test_cpp_funkydir(self): # This test contains Windows specific path escapes if sys.platform != 'win32': return c_files_path = os.path.join('tests', 'c_files') ast = parse_file(self._find_file('simplemain.c'), use_cpp=True, cpp_path=CPPPATH, cpp_args='-I%s' % c_files_path) self.assertTrue(isinstance(ast, c_ast.FileAST)) @unittest.skipUnless(platform.system() == 'Linux', 'cpp only works on Linux') def test_no_real_content_after_cpp(self): ast = parse_file(self._find_file('empty.h'), use_cpp=True, cpp_path=CPPPATH) self.assertTrue(isinstance(ast, c_ast.FileAST)) if __name__ == '__main__': unittest.main() pycparser-2.18/tests/test_c_parser.py0000775000175000017500000021040313060530765020555 0ustar 
elibeneliben00000000000000#!/usr/bin/env python import pprint import re import os, sys import unittest sys.path[0:0] = ['.', '..'] from pycparser import c_parser from pycparser.c_ast import * from pycparser.c_parser import CParser, Coord, ParseError _c_parser = c_parser.CParser( lex_optimize=False, yacc_debug=True, yacc_optimize=False, yacctab='yacctab') def expand_decl(decl): """ Converts the declaration into a nested list. """ typ = type(decl) if typ == TypeDecl: return ['TypeDecl', expand_decl(decl.type)] elif typ == IdentifierType: return ['IdentifierType', decl.names] elif typ == ID: return ['ID', decl.name] elif typ in [Struct, Union]: decls = [expand_decl(d) for d in decl.decls or []] return [typ.__name__, decl.name, decls] else: nested = expand_decl(decl.type) if typ == Decl: if decl.quals: return ['Decl', decl.quals, decl.name, nested] else: return ['Decl', decl.name, nested] elif typ == Typename: # for function parameters if decl.quals: return ['Typename', decl.quals, nested] else: return ['Typename', nested] elif typ == ArrayDecl: dimval = decl.dim.value if decl.dim else '' return ['ArrayDecl', dimval, decl.dim_quals, nested] elif typ == PtrDecl: if decl.quals: return ['PtrDecl', decl.quals, nested] else: return ['PtrDecl', nested] elif typ == Typedef: return ['Typedef', decl.name, nested] elif typ == FuncDecl: if decl.args: params = [expand_decl(param) for param in decl.args.params] else: params = [] return ['FuncDecl', params, nested] def expand_init(init): """ Converts an initialization into a nested list """ typ = type(init) if typ == NamedInitializer: des = [expand_init(dp) for dp in init.name] return (des, expand_init(init.expr)) elif typ in (InitList, ExprList): return [expand_init(expr) for expr in init.exprs] elif typ == Constant: return ['Constant', init.type, init.value] elif typ == ID: return ['ID', init.name] elif typ == UnaryOp: return ['UnaryOp', init.op, expand_decl(init.expr)] class TestCParser_base(unittest.TestCase): def parse(self, 
txt, filename=''): return self.cparser.parse(txt, filename) def setUp(self): self.cparser = _c_parser def assert_coord(self, node, line, column=None, file=None): self.assertEqual(node.coord.line, line) if column is not None: self.assertEqual(node.coord.column, column) if file: self.assertEqual(node.coord.file, file) class TestCParser_fundamentals(TestCParser_base): def get_decl(self, txt, index=0): """ Given a source and an index returns the expanded declaration at that index. FileAST holds a list of 'external declarations'. index is the offset of the desired declaration in that list. """ t = self.parse(txt).ext[index] return expand_decl(t) def get_decl_init(self, txt, index=0): """ Returns the expanded initializer of the declaration at index. """ t = self.parse(txt).ext[index] return expand_init(t.init) def test_FileAST(self): t = self.parse('int a; char c;') self.assertTrue(isinstance(t, FileAST)) self.assertEqual(len(t.ext), 2) # empty file t2 = self.parse('') self.assertTrue(isinstance(t2, FileAST)) self.assertEqual(len(t2.ext), 0) def test_empty_toplevel_decl(self): code = 'int foo;;' t = self.parse(code) self.assertTrue(isinstance(t, FileAST)) self.assertEqual(len(t.ext), 1) self.assertEqual(self.get_decl(code), ['Decl', 'foo', ['TypeDecl', ['IdentifierType', ['int']]]]) def test_coords(self): """ Tests the "coordinates" of parsed elements - file name, line and column numbers, with modifications inserted by #line directives. 
""" self.assert_coord(self.parse('int a;').ext[0], 1, 5) t1 = """ int a; int b;\n\n int c; """ f1 = self.parse(t1, filename='test.c') self.assert_coord(f1.ext[0], 2, 13, 'test.c') self.assert_coord(f1.ext[1], 3, 13, 'test.c') self.assert_coord(f1.ext[2], 6, 13, 'test.c') t1_1 = ''' int main() { k = p; printf("%d", b); return 0; }''' f1_1 = self.parse(t1_1, filename='test.c') self.assert_coord(f1_1.ext[0].body.block_items[0], 3, 13, 'test.c') self.assert_coord(f1_1.ext[0].body.block_items[1], 4, 13, 'test.c') t1_2 = ''' int main () { int p = (int) k; }''' f1_2 = self.parse(t1_2, filename='test.c') # make sure that the Cast has a coord (issue 23) self.assert_coord(f1_2.ext[0].body.block_items[0].init, 3, 21, file='test.c') t2 = """ #line 99 int c; """ self.assert_coord(self.parse(t2).ext[0], 99, 13) t3 = """ int dsf; char p; #line 3000 "in.h" char d; """ f3 = self.parse(t3, filename='test.c') self.assert_coord(f3.ext[0], 2, 13, 'test.c') self.assert_coord(f3.ext[1], 3, 14, 'test.c') self.assert_coord(f3.ext[2], 3000, 14, 'in.h') t4 = """ #line 20 "restore.h" int maydler(char); #line 30 "includes/daween.ph" long j, k; #line 50000 char* ro; """ f4 = self.parse(t4, filename='myb.c') self.assert_coord(f4.ext[0], 20, 13, 'restore.h') self.assert_coord(f4.ext[1], 30, 14, 'includes/daween.ph') self.assert_coord(f4.ext[2], 30, 17, 'includes/daween.ph') self.assert_coord(f4.ext[3], 50000, 13, 'includes/daween.ph') t5 = """ int #line 99 c; """ self.assert_coord(self.parse(t5).ext[0], 99, 9) # coord for ellipsis t6 = """ int foo(int j, ...) 
{ }""" f6 = self.parse(t6, filename='z.c') self.assert_coord(self.parse(t6).ext[0].decl.type.args.params[1], 3, 17) def test_forloop_coord(self): t = '''\ void foo() { for(int z=0; z<4; z++){} } ''' s = self.parse(t, filename='f.c') forloop = s.ext[0].body.block_items[0] self.assert_coord(forloop.init, 2, 13, 'f.c') self.assert_coord(forloop.cond, 2, 26, 'f.c') self.assert_coord(forloop.next, 3, 17, 'f.c') def test_simple_decls(self): self.assertEqual(self.get_decl('int a;'), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(self.get_decl('unsigned int a;'), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['unsigned', 'int']]]]) self.assertEqual(self.get_decl('_Bool a;'), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['_Bool']]]]) self.assertEqual(self.get_decl('float _Complex fcc;'), ['Decl', 'fcc', ['TypeDecl', ['IdentifierType', ['float', '_Complex']]]]) self.assertEqual(self.get_decl('char* string;'), ['Decl', 'string', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]) self.assertEqual(self.get_decl('long ar[15];'), ['Decl', 'ar', ['ArrayDecl', '15', [], ['TypeDecl', ['IdentifierType', ['long']]]]]) self.assertEqual(self.get_decl('long long ar[15];'), ['Decl', 'ar', ['ArrayDecl', '15', [], ['TypeDecl', ['IdentifierType', ['long', 'long']]]]]) self.assertEqual(self.get_decl('unsigned ar[];'), ['Decl', 'ar', ['ArrayDecl', '', [], ['TypeDecl', ['IdentifierType', ['unsigned']]]]]) self.assertEqual(self.get_decl('int strlen(char* s);'), ['Decl', 'strlen', ['FuncDecl', [['Decl', 's', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(self.get_decl('int strcmp(char* s1, char* s2);'), ['Decl', 'strcmp', ['FuncDecl', [ ['Decl', 's1', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]], ['Decl', 's2', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]] ], ['TypeDecl', ['IdentifierType', ['int']]]]]) # function return values and parameters may not have type 
information self.assertEqual(self.get_decl('extern foobar(foo, bar);'), ['Decl', 'foobar', ['FuncDecl', [ ['ID', 'foo'], ['ID', 'bar'] ], ['TypeDecl', ['IdentifierType', ['int']]]]]) def test_int128(self): self.assertEqual(self.get_decl('__int128 a;'), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['__int128']]]]) def test_nested_decls(self): # the fun begins self.assertEqual(self.get_decl('char** ar2D;'), ['Decl', 'ar2D', ['PtrDecl', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]) self.assertEqual(self.get_decl('int (*a)[1][2];'), ['Decl', 'a', ['PtrDecl', ['ArrayDecl', '1', [], ['ArrayDecl', '2', [], ['TypeDecl', ['IdentifierType', ['int']]]]]]]) self.assertEqual(self.get_decl('int *a[1][2];'), ['Decl', 'a', ['ArrayDecl', '1', [], ['ArrayDecl', '2', [], ['PtrDecl', ['TypeDecl', ['IdentifierType', ['int']]]]]]]) self.assertEqual(self.get_decl('char* const* p;'), ['Decl', 'p', ['PtrDecl', ['PtrDecl', ['const'], ['TypeDecl', ['IdentifierType', ['char']]]]]]) self.assertEqual(self.get_decl('char* * const p;'), ['Decl', 'p', ['PtrDecl', ['const'], ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]) self.assertEqual(self.get_decl('char ***ar3D[40];'), ['Decl', 'ar3D', ['ArrayDecl', '40', [], ['PtrDecl', ['PtrDecl', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]]]) self.assertEqual(self.get_decl('char (***ar3D)[40];'), ['Decl', 'ar3D', ['PtrDecl', ['PtrDecl', ['PtrDecl', ['ArrayDecl', '40', [], ['TypeDecl', ['IdentifierType', ['char']]]]]]]]) self.assertEqual(self.get_decl('int (*x[4])(char, int);'), ['Decl', 'x', ['ArrayDecl', '4', [], ['PtrDecl', ['FuncDecl', [ ['Typename', ['TypeDecl', ['IdentifierType', ['char']]]], ['Typename', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]]) self.assertEqual(self.get_decl('char *(*(**foo [][8])())[];'), ['Decl', 'foo', ['ArrayDecl', '', [], ['ArrayDecl', '8', [], ['PtrDecl', ['PtrDecl', ['FuncDecl', [], ['PtrDecl', ['ArrayDecl', '', [], ['PtrDecl', 
['TypeDecl', ['IdentifierType', ['char']]]]]]]]]]]]) # explore named and unnamed function pointer parameters, # with and without qualifiers # unnamed w/o quals self.assertEqual(self.get_decl('int (*k)(int);'), ['Decl', 'k', ['PtrDecl', ['FuncDecl', [['Typename', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) # unnamed w/ quals self.assertEqual(self.get_decl('int (*k)(const int);'), ['Decl', 'k', ['PtrDecl', ['FuncDecl', [['Typename', ['const'], ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) # named w/o quals self.assertEqual(self.get_decl('int (*k)(int q);'), ['Decl', 'k', ['PtrDecl', ['FuncDecl', [['Decl', 'q', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) # named w/ quals self.assertEqual(self.get_decl('int (*k)(const volatile int q);'), ['Decl', 'k', ['PtrDecl', ['FuncDecl', [['Decl', ['const', 'volatile'], 'q', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) # restrict qualifier self.assertEqual(self.get_decl('int (*k)(restrict int* q);'), ['Decl', 'k', ['PtrDecl', ['FuncDecl', [['Decl', ['restrict'], 'q', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['int']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) def test_func_decls_with_array_dim_qualifiers(self): self.assertEqual(self.get_decl('int zz(int p[static 10]);'), ['Decl', 'zz', ['FuncDecl', [['Decl', 'p', ['ArrayDecl', '10', ['static'], ['TypeDecl', ['IdentifierType', ['int']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(self.get_decl('int zz(int p[const 10]);'), ['Decl', 'zz', ['FuncDecl', [['Decl', 'p', ['ArrayDecl', '10', ['const'], ['TypeDecl', ['IdentifierType', ['int']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(self.get_decl('int zz(int p[restrict][5]);'), ['Decl', 'zz', ['FuncDecl', [['Decl', 'p', ['ArrayDecl', '', ['restrict'], ['ArrayDecl', '5', [], ['TypeDecl', 
['IdentifierType', ['int']]]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(self.get_decl('int zz(int p[const restrict static 10][5]);'), ['Decl', 'zz', ['FuncDecl', [['Decl', 'p', ['ArrayDecl', '10', ['const', 'restrict', 'static'], ['ArrayDecl', '5', [], ['TypeDecl', ['IdentifierType', ['int']]]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) def test_qualifiers_storage_specifiers(self): def assert_qs(txt, index, quals, storage): d = self.parse(txt).ext[index] self.assertEqual(d.quals, quals) self.assertEqual(d.storage, storage) assert_qs("extern int p;", 0, [], ['extern']) assert_qs("const long p = 6;", 0, ['const'], []) d1 = "static const int p, q, r;" for i in range(3): assert_qs(d1, i, ['const'], ['static']) d2 = "static char * const p;" assert_qs(d2, 0, [], ['static']) pdecl = self.parse(d2).ext[0].type self.assertTrue(isinstance(pdecl, PtrDecl)) self.assertEqual(pdecl.quals, ['const']) def test_sizeof(self): e = """ void foo() { int a = sizeof k; int b = sizeof(int); int c = sizeof(int**);; char* p = "just to make sure this parses w/o error..."; int d = sizeof(int()); } """ compound = self.parse(e).ext[0].body s1 = compound.block_items[0].init self.assertTrue(isinstance(s1, UnaryOp)) self.assertEqual(s1.op, 'sizeof') self.assertTrue(isinstance(s1.expr, ID)) self.assertEqual(s1.expr.name, 'k') s2 = compound.block_items[1].init self.assertEqual(expand_decl(s2.expr), ['Typename', ['TypeDecl', ['IdentifierType', ['int']]]]) s3 = compound.block_items[2].init self.assertEqual(expand_decl(s3.expr), ['Typename', ['PtrDecl', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['int']]]]]]) def test_offsetof(self): e = """ void foo() { int a = offsetof(struct S, p); a.b = offsetof(struct sockaddr, sp) + strlen(bar); int a = offsetof(struct S, p.q.r); int a = offsetof(struct S, p[5].q[4][5]); } """ compound = self.parse(e).ext[0].body s1 = compound.block_items[0].init self.assertTrue(isinstance(s1, FuncCall)) self.assertTrue(isinstance(s1.name, ID)) 
self.assertEqual(s1.name.name, 'offsetof') self.assertTrue(isinstance(s1.args.exprs[0], Typename)) self.assertTrue(isinstance(s1.args.exprs[1], ID)) s3 = compound.block_items[2].init self.assertTrue(isinstance(s3.args.exprs[1], StructRef)) s4 = compound.block_items[3].init self.assertTrue(isinstance(s4.args.exprs[1], ArrayRef)) def test_compound_statement(self): e = """ void foo() { } """ compound = self.parse(e).ext[0].body self.assertTrue(isinstance(compound, Compound)) self.assert_coord(compound, 2) # The C99 compound literal feature # def test_compound_literals(self): ps1 = self.parse(r''' void foo() { p = (long long){k}; tc = (struct jk){.a = {1, 2}, .b[0] = t}; }''') compound = ps1.ext[0].body.block_items[0].rvalue self.assertEqual(expand_decl(compound.type), ['Typename', ['TypeDecl', ['IdentifierType', ['long', 'long']]]]) self.assertEqual(expand_init(compound.init), [['ID', 'k']]) compound = ps1.ext[0].body.block_items[1].rvalue self.assertEqual(expand_decl(compound.type), ['Typename', ['TypeDecl', ['Struct', 'jk', []]]]) self.assertEqual(expand_init(compound.init), [ ([['ID', 'a']], [['Constant', 'int', '1'], ['Constant', 'int', '2']]), ([['ID', 'b'], ['Constant', 'int', '0']], ['ID', 't'])]) def test_enums(self): e1 = "enum mycolor op;" e1_type = self.parse(e1).ext[0].type.type self.assertTrue(isinstance(e1_type, Enum)) self.assertEqual(e1_type.name, 'mycolor') self.assertEqual(e1_type.values, None) e2 = "enum mysize {large=20, small, medium} shoes;" e2_type = self.parse(e2).ext[0].type.type self.assertTrue(isinstance(e2_type, Enum)) self.assertEqual(e2_type.name, 'mysize') e2_elist = e2_type.values self.assertTrue(isinstance(e2_elist, EnumeratorList)) for e2_eval in e2_elist.enumerators: self.assertTrue(isinstance(e2_eval, Enumerator)) self.assertEqual(e2_elist.enumerators[0].name, 'large') self.assertEqual(e2_elist.enumerators[0].value.value, '20') self.assertEqual(e2_elist.enumerators[2].name, 'medium') self.assertEqual(e2_elist.enumerators[2].value, 
None) # enum with trailing comma (C99 feature) e3 = """ enum { red, blue, green, } color; """ e3_type = self.parse(e3).ext[0].type.type self.assertTrue(isinstance(e3_type, Enum)) e3_elist = e3_type.values self.assertTrue(isinstance(e3_elist, EnumeratorList)) for e3_eval in e3_elist.enumerators: self.assertTrue(isinstance(e3_eval, Enumerator)) self.assertEqual(e3_elist.enumerators[0].name, 'red') self.assertEqual(e3_elist.enumerators[0].value, None) self.assertEqual(e3_elist.enumerators[1].name, 'blue') self.assertEqual(e3_elist.enumerators[2].name, 'green') def test_typedef(self): # without typedef, error s1 = """ node k; """ self.assertRaises(ParseError, self.parse, s1) # now with typedef, works s2 = """ typedef void* node; node k; """ ps2 = self.parse(s2) self.assertEqual(expand_decl(ps2.ext[0]), ['Typedef', 'node', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['void']]]]]) self.assertEqual(expand_decl(ps2.ext[1]), ['Decl', 'k', ['TypeDecl', ['IdentifierType', ['node']]]]) s3 = """ typedef int T; typedef T *pT; pT aa, bb; """ ps3 = self.parse(s3) self.assertEqual(expand_decl(ps3.ext[3]), ['Decl', 'bb', ['TypeDecl', ['IdentifierType', ['pT']]]]) s4 = ''' typedef char* __builtin_va_list; typedef __builtin_va_list __gnuc_va_list; ''' ps4 = self.parse(s4) self.assertEqual(expand_decl(ps4.ext[1]), ['Typedef', '__gnuc_va_list', ['TypeDecl', ['IdentifierType', ['__builtin_va_list']]]]) s5 = '''typedef struct tagHash Hash;''' ps5 = self.parse(s5) self.assertEqual(expand_decl(ps5.ext[0]), ['Typedef', 'Hash', ['TypeDecl', ['Struct', 'tagHash', []]]]) def test_struct_union(self): s1 = """ struct { int id; char* name; } joe; """ self.assertEqual(expand_decl(self.parse(s1).ext[0]), ['Decl', 'joe', ['TypeDecl', ['Struct', None, [ ['Decl', 'id', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'name', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]]]]) s2 = """ struct node p; """ self.assertEqual(expand_decl(self.parse(s2).ext[0]), ['Decl', 'p', ['TypeDecl', 
['Struct', 'node', []]]]) s21 = """ union pri ra; """ self.assertEqual(expand_decl(self.parse(s21).ext[0]), ['Decl', 'ra', ['TypeDecl', ['Union', 'pri', []]]]) s3 = """ struct node* p; """ self.assertEqual(expand_decl(self.parse(s3).ext[0]), ['Decl', 'p', ['PtrDecl', ['TypeDecl', ['Struct', 'node', []]]]]) s4 = """ struct node; """ self.assertEqual(expand_decl(self.parse(s4).ext[0]), ['Decl', None, ['Struct', 'node', []]]) s5 = """ union { struct { int type; } n; struct { int type; int intnode; } ni; } u; """ self.assertEqual(expand_decl(self.parse(s5).ext[0]), ['Decl', 'u', ['TypeDecl', ['Union', None, [['Decl', 'n', ['TypeDecl', ['Struct', None, [['Decl', 'type', ['TypeDecl', ['IdentifierType', ['int']]]]]]]], ['Decl', 'ni', ['TypeDecl', ['Struct', None, [['Decl', 'type', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'intnode', ['TypeDecl', ['IdentifierType', ['int']]]]]]]]]]]]) s6 = """ typedef struct foo_tag { void* data; } foo, *pfoo; """ s6_ast = self.parse(s6) self.assertEqual(expand_decl(s6_ast.ext[0]), ['Typedef', 'foo', ['TypeDecl', ['Struct', 'foo_tag', [['Decl', 'data', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['void']]]]]]]]]) self.assertEqual(expand_decl(s6_ast.ext[1]), ['Typedef', 'pfoo', ['PtrDecl', ['TypeDecl', ['Struct', 'foo_tag', [['Decl', 'data', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['void']]]]]]]]]]) s7 = r""" struct _on_exit_args { void * _fnargs[32]; void * _dso_handle[32]; long _fntypes; #line 77 "D:\eli\cpp_stuff\libc_include/sys/reent.h" long _is_cxa; }; """ s7_ast = self.parse(s7, filename='test.c') self.assert_coord(s7_ast.ext[0].type.decls[2], 6, 22, 'test.c') self.assert_coord(s7_ast.ext[0].type.decls[3], 78, 22, r'D:\eli\cpp_stuff\libc_include/sys/reent.h') s8 = """ typedef enum tagReturnCode {SUCCESS, FAIL} ReturnCode; typedef struct tagEntry { char* key; char* value; } Entry; typedef struct tagNode { Entry* entry; struct tagNode* next; } Node; typedef struct tagHash { unsigned int table_size; Node** heads; } Hash; 
""" s8_ast = self.parse(s8) self.assertEqual(expand_decl(s8_ast.ext[3]), ['Typedef', 'Hash', ['TypeDecl', ['Struct', 'tagHash', [['Decl', 'table_size', ['TypeDecl', ['IdentifierType', ['unsigned', 'int']]]], ['Decl', 'heads', ['PtrDecl', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['Node']]]]]]]]]]) def test_struct_with_extra_semis_inside(self): s1 = """ struct { int a;; } foo; """ s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[0]), ['Decl', 'foo', ['TypeDecl', ['Struct', None, [['Decl', 'a', ['TypeDecl', ['IdentifierType', ['int']]]]]]]]) s2 = """ struct { int a;;;; float b, c; ;; char d; } foo; """ s2_ast = self.parse(s2) self.assertEqual(expand_decl(s2_ast.ext[0]), ['Decl', 'foo', ['TypeDecl', ['Struct', None, [['Decl', 'a', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'b', ['TypeDecl', ['IdentifierType', ['float']]]], ['Decl', 'c', ['TypeDecl', ['IdentifierType', ['float']]]], ['Decl', 'd', ['TypeDecl', ['IdentifierType', ['char']]]]]]]]) def test_anonymous_struct_union(self): s1 = """ union { union { int i; long l; }; struct { int type; int intnode; }; } u; """ self.assertEqual(expand_decl(self.parse(s1).ext[0]), ['Decl', 'u', ['TypeDecl', ['Union', None, [['Decl', None, ['Union', None, [['Decl', 'i', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'l', ['TypeDecl', ['IdentifierType', ['long']]]]]]], ['Decl', None, ['Struct', None, [['Decl', 'type', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'intnode', ['TypeDecl', ['IdentifierType', ['int']]]]]]]]]]]) s2 = """ struct { int i; union { int id; char* name; }; float f; } joe; """ self.assertEqual(expand_decl(self.parse(s2).ext[0]), ['Decl', 'joe', ['TypeDecl', ['Struct', None, [['Decl', 'i', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', None, ['Union', None, [['Decl', 'id', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'name', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]]], ['Decl', 'f', ['TypeDecl', ['IdentifierType', ['float']]]]]]]]) # ISO/IEC 
9899:201x Committee Draft 2010-11-16, N1539 # section 6.7.2.1, par. 19, example 1 s3 = """ struct v { union { struct { int i, j; }; struct { long k, l; } w; }; int m; } v1; """ self.assertEqual(expand_decl(self.parse(s3).ext[0]), ['Decl', 'v1', ['TypeDecl', ['Struct', 'v', [['Decl', None, ['Union', None, [['Decl', None, ['Struct', None, [['Decl', 'i', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'j', ['TypeDecl', ['IdentifierType', ['int']]]]]]], ['Decl', 'w', ['TypeDecl', ['Struct', None, [['Decl', 'k', ['TypeDecl', ['IdentifierType', ['long']]]], ['Decl', 'l', ['TypeDecl', ['IdentifierType', ['long']]]]]]]]]]], ['Decl', 'm', ['TypeDecl', ['IdentifierType', ['int']]]]]]]]) s4 = """ struct v { int i; float; } v2;""" # just make sure this doesn't raise ParseError self.parse(s4) def test_struct_members_namespace(self): """ Tests that structure/union member names reside in a separate namespace and can be named after existing types. """ s1 = """ typedef int Name; typedef Name NameArray[10]; struct { Name Name; Name NameArray[3]; } sye; void main(void) { sye.Name = 1; } """ s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[2]), ['Decl', 'sye', ['TypeDecl', ['Struct', None, [ ['Decl', 'Name', ['TypeDecl', ['IdentifierType', ['Name']]]], ['Decl', 'NameArray', ['ArrayDecl', '3', [], ['TypeDecl', ['IdentifierType', ['Name']]]]]]]]]) self.assertEqual(s1_ast.ext[3].body.block_items[0].lvalue.field.name, 'Name') def test_struct_bitfields(self): # a struct with two bitfields, one unnamed s1 = """ struct { int k:6; int :2; } joe; """ parsed_struct = self.parse(s1).ext[0] # We can see here the name of the decl for the unnamed bitfield is # None, but expand_decl doesn't show bitfield widths # ... self.assertEqual(expand_decl(parsed_struct), ['Decl', 'joe', ['TypeDecl', ['Struct', None, [ ['Decl', 'k', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', None, ['TypeDecl', ['IdentifierType', ['int']]]]]]]]) # ... 
# so we test them manually self.assertEqual(parsed_struct.type.type.decls[0].bitsize.value, '6') self.assertEqual(parsed_struct.type.type.decls[1].bitsize.value, '2') def test_tags_namespace(self): """ Tests that the tags of structs/unions/enums reside in a separate namespace and can be named after existing types. """ s1 = """ typedef int tagEntry; struct tagEntry { char* key; char* value; } Entry; """ s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[1]), ['Decl', 'Entry', ['TypeDecl', ['Struct', 'tagEntry', [['Decl', 'key', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]], ['Decl', 'value', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]]]]) s2 = """ struct tagEntry; typedef struct tagEntry tagEntry; struct tagEntry { char* key; char* value; } Entry; """ s2_ast = self.parse(s2) self.assertEqual(expand_decl(s2_ast.ext[2]), ['Decl', 'Entry', ['TypeDecl', ['Struct', 'tagEntry', [['Decl', 'key', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]], ['Decl', 'value', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]]]]) s3 = """ typedef int mytag; enum mytag {ABC, CDE}; enum mytag joe; """ s3_type = self.parse(s3).ext[1].type self.assertTrue(isinstance(s3_type, Enum)) self.assertEqual(s3_type.name, 'mytag') def test_multi_decls(self): d1 = 'int a, b;' self.assertEqual(self.get_decl(d1, 0), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(self.get_decl(d1, 1), ['Decl', 'b', ['TypeDecl', ['IdentifierType', ['int']]]]) d2 = 'char* p, notp, ar[4];' self.assertEqual(self.get_decl(d2, 0), ['Decl', 'p', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]) self.assertEqual(self.get_decl(d2, 1), ['Decl', 'notp', ['TypeDecl', ['IdentifierType', ['char']]]]) self.assertEqual(self.get_decl(d2, 2), ['Decl', 'ar', ['ArrayDecl', '4', [], ['TypeDecl', ['IdentifierType', ['char']]]]]) def test_invalid_multiple_types_error(self): bad = [ 'int enum {ab, cd} fubr;', 'enum kid char brbr;'] for b in bad: 
self.assertRaises(ParseError, self.parse, b) def test_duplicate_typedef(self): """ Tests that redeclarations of existing types are parsed correctly. This is non-standard, but allowed by many compilers. """ d1 = ''' typedef int numbertype; typedef int numbertype; ''' self.assertEqual(self.get_decl(d1, 0), ['Typedef', 'numbertype', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(self.get_decl(d1, 1), ['Typedef', 'numbertype', ['TypeDecl', ['IdentifierType', ['int']]]]) d2 = ''' typedef int (*funcptr)(int x); typedef int (*funcptr)(int x); ''' self.assertEqual(self.get_decl(d2, 0), ['Typedef', 'funcptr', ['PtrDecl', ['FuncDecl', [['Decl', 'x', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) self.assertEqual(self.get_decl(d2, 1), ['Typedef', 'funcptr', ['PtrDecl', ['FuncDecl', [['Decl', 'x', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]]) d3 = ''' typedef int numberarray[5]; typedef int numberarray[5]; ''' self.assertEqual(self.get_decl(d3, 0), ['Typedef', 'numberarray', ['ArrayDecl', '5', [], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(self.get_decl(d3, 1), ['Typedef', 'numberarray', ['ArrayDecl', '5', [], ['TypeDecl', ['IdentifierType', ['int']]]]]) def test_decl_inits(self): d1 = 'int a = 16;' #~ self.parse(d1).show() self.assertEqual(self.get_decl(d1), ['Decl', 'a', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(self.get_decl_init(d1), ['Constant', 'int', '16']) d1_1 = 'float f = 0xEF.56p1;' self.assertEqual(self.get_decl_init(d1_1), ['Constant', 'float', '0xEF.56p1']) d1_2 = 'int bitmask = 0b1001010;' self.assertEqual(self.get_decl_init(d1_2), ['Constant', 'int', '0b1001010']) d2 = 'long ar[] = {7, 8, 9};' self.assertEqual(self.get_decl(d2), ['Decl', 'ar', ['ArrayDecl', '', [], ['TypeDecl', ['IdentifierType', ['long']]]]]) self.assertEqual(self.get_decl_init(d2), [ ['Constant', 'int', '7'], ['Constant', 'int', '8'], ['Constant', 'int', 
'9']]) d21 = 'long ar[4] = {};' self.assertEqual(self.get_decl_init(d21), []) d3 = 'char p = j;' self.assertEqual(self.get_decl(d3), ['Decl', 'p', ['TypeDecl', ['IdentifierType', ['char']]]]) self.assertEqual(self.get_decl_init(d3), ['ID', 'j']) d4 = "char x = 'c', *p = {0, 1, 2, {4, 5}, 6};" self.assertEqual(self.get_decl(d4, 0), ['Decl', 'x', ['TypeDecl', ['IdentifierType', ['char']]]]) self.assertEqual(self.get_decl_init(d4, 0), ['Constant', 'char', "'c'"]) self.assertEqual(self.get_decl(d4, 1), ['Decl', 'p', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]) self.assertEqual(self.get_decl_init(d4, 1), [ ['Constant', 'int', '0'], ['Constant', 'int', '1'], ['Constant', 'int', '2'], [['Constant', 'int', '4'], ['Constant', 'int', '5']], ['Constant', 'int', '6']]) def test_decl_named_inits(self): d1 = 'int a = {.k = 16};' self.assertEqual(self.get_decl_init(d1), [( [['ID', 'k']], ['Constant', 'int', '16'])]) d2 = 'int a = { [0].a = {1}, [1].a[0] = 2 };' self.assertEqual(self.get_decl_init(d2), [ ([['Constant', 'int', '0'], ['ID', 'a']], [['Constant', 'int', '1']]), ([['Constant', 'int', '1'], ['ID', 'a'], ['Constant', 'int', '0']], ['Constant', 'int', '2'])]) d3 = 'int a = { .a = 1, .c = 3, 4, .b = 5};' self.assertEqual(self.get_decl_init(d3), [ ([['ID', 'a']], ['Constant', 'int', '1']), ([['ID', 'c']], ['Constant', 'int', '3']), ['Constant', 'int', '4'], ([['ID', 'b']], ['Constant', 'int', '5'])]) def test_function_definitions(self): def parse_fdef(str): return self.parse(str).ext[0] def fdef_decl(fdef): return expand_decl(fdef.decl) f1 = parse_fdef(''' int factorial(int p) { return 3; } ''') self.assertEqual(fdef_decl(f1), ['Decl', 'factorial', ['FuncDecl', [['Decl', 'p', ['TypeDecl', ['IdentifierType', ['int']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(type(f1.body.block_items[0]), Return) f2 = parse_fdef(''' char* zzz(int p, char* c) { int a; char b; a = b + 2; return 3; } ''') self.assertEqual(fdef_decl(f2), ['Decl', 'zzz', 
['FuncDecl', [ ['Decl', 'p', ['TypeDecl', ['IdentifierType', ['int']]]], ['Decl', 'c', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]], ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]) self.assertEqual(list(map(type, f2.body.block_items)), [Decl, Decl, Assignment, Return]) f3 = parse_fdef(''' char* zzz(p, c) long p, *c; { int a; char b; a = b + 2; return 3; } ''') self.assertEqual(fdef_decl(f3), ['Decl', 'zzz', ['FuncDecl', [ ['ID', 'p'], ['ID', 'c']], ['PtrDecl', ['TypeDecl', ['IdentifierType', ['char']]]]]]) self.assertEqual(list(map(type, f3.body.block_items)), [Decl, Decl, Assignment, Return]) self.assertEqual(expand_decl(f3.param_decls[0]), ['Decl', 'p', ['TypeDecl', ['IdentifierType', ['long']]]]) self.assertEqual(expand_decl(f3.param_decls[1]), ['Decl', 'c', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['long']]]]]) # function return values and parameters may not have type information f4 = parse_fdef(''' que(p) { return 3; } ''') self.assertEqual(fdef_decl(f4), ['Decl', 'que', ['FuncDecl', [['ID', 'p']], ['TypeDecl', ['IdentifierType', ['int']]]]]) def test_unified_string_literals(self): # simple string, for reference d1 = self.get_decl_init('char* s = "hello";') self.assertEqual(d1, ['Constant', 'string', '"hello"']) d2 = self.get_decl_init('char* s = "hello" " world";') self.assertEqual(d2, ['Constant', 'string', '"hello world"']) # the test case from issue 6 d3 = self.parse(r''' int main() { fprintf(stderr, "Wrong Params?\n" "Usage:\n" "%s \n", argv[0] ); } ''') self.assertEqual( d3.ext[0].body.block_items[0].args.exprs[1].value, r'"Wrong Params?\nUsage:\n%s \n"') d4 = self.get_decl_init('char* s = "" "foobar";') self.assertEqual(d4, ['Constant', 'string', '"foobar"']) d5 = self.get_decl_init(r'char* s = "foo\"" "bar";') self.assertEqual(d5, ['Constant', 'string', r'"foo\"bar"']) def test_unified_wstring_literals(self): d1 = self.get_decl_init('char* s = L"hello" L"world";') self.assertEqual(d1, ['Constant', 'string', 'L"helloworld"']) 
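The unified string-literal behavior these assertions exercise (adjacent C string literals merging into a single `Constant`) can be illustrated standalone. A minimal sketch in plain Python, independent of pycparser's actual implementation; `unify_string_literals` is a hypothetical helper that assumes simple quoted tokens and ignores escape-sequence edge cases:

```python
def unify_string_literals(literals):
    # Merge adjacent C string literals into one, keeping a single pair of
    # quotes. If the first literal is wide (L"..."), the result is too.
    # Simplified sketch: each input includes its own quotes.
    prefix = 'L' if literals and literals[0].startswith('L') else ''
    body = ''.join(lit.lstrip('L')[1:-1] for lit in literals)
    return '%s"%s"' % (prefix, body)

print(unify_string_literals(['"hello"', '" world"']))   # "hello world"
print(unify_string_literals(['L"hello"', 'L"world"']))  # L"helloworld"
```

This mirrors what the tests above check: the parser records one `Constant` node whose value is the concatenation, not a list of the original tokens.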
d2 = self.get_decl_init('char* s = L"hello " L"world" L" and I";') self.assertEqual(d2, ['Constant', 'string', 'L"hello world and I"']) def test_inline_specifier(self): ps2 = self.parse('static inline void inlinefoo(void);') self.assertEqual(ps2.ext[0].funcspec, ['inline']) # variable length array def test_vla(self): ps2 = self.parse(r''' int main() { int size; int var[size = 5]; int var2[*]; } ''') self.assertTrue(isinstance(ps2.ext[0].body.block_items[1].type.dim, Assignment)) self.assertTrue(isinstance(ps2.ext[0].body.block_items[2].type.dim, ID)) def test_pragma(self): s1 = r''' #pragma bar void main() { #pragma foo for(;;) {} #pragma } ''' s1_ast = self.parse(s1) self.assertTrue(isinstance(s1_ast.ext[0], Pragma)) self.assertEqual(s1_ast.ext[0].string, 'bar') self.assertEqual(s1_ast.ext[0].coord.line, 2) self.assertTrue(isinstance(s1_ast.ext[1].body.block_items[0], Pragma)) self.assertEqual(s1_ast.ext[1].body.block_items[0].string, 'foo') self.assertEqual(s1_ast.ext[1].body.block_items[0].coord.line, 4) self.assertTrue(isinstance(s1_ast.ext[1].body.block_items[2], Pragma)) self.assertEqual(s1_ast.ext[1].body.block_items[2].string, '') self.assertEqual(s1_ast.ext[1].body.block_items[2].coord.line, 6) class TestCParser_whole_code(TestCParser_base): """ Testing of parsing whole chunks of code. Since I don't want to rely on the structure of ASTs too much, most of these tests are implemented with visitors. """ # A simple helper visitor that lists the values of all the # Constant nodes it sees. # class ConstantVisitor(NodeVisitor): def __init__(self): self.values = [] def visit_Constant(self, node): self.values.append(node.value) # This visitor counts the amount of references to the ID # with the name provided to it in the constructor. 
# class IDNameCounter(NodeVisitor): def __init__(self, name): self.name = name self.nrefs = 0 def visit_ID(self, node): if node.name == self.name: self.nrefs += 1 # Counts the amount of nodes of a given class # class NodeKlassCounter(NodeVisitor): def __init__(self, node_klass): self.klass = node_klass self.n = 0 def generic_visit(self, node): if node.__class__ == self.klass: self.n += 1 NodeVisitor.generic_visit(self, node) def assert_all_Constants(self, code, constants): """ Asserts that the list of all Constant values (by 'preorder' appearance) in the chunk of code is as given. """ if isinstance(code, str): parsed = self.parse(code) else: parsed = code cv = self.ConstantVisitor() cv.visit(parsed) self.assertEqual(cv.values, constants) def assert_num_ID_refs(self, code, name, num): """ Asserts the number of references to the ID with the given name. """ if isinstance(code, str): parsed = self.parse(code) else: parsed = code iv = self.IDNameCounter(name) iv.visit(parsed) self.assertEqual(iv.nrefs, num) def assert_num_klass_nodes(self, code, klass, num): """ Asserts the amount of klass nodes in the code. 
""" if isinstance(code, str): parsed = self.parse(code) else: parsed = code cv = self.NodeKlassCounter(klass) cv.visit(parsed) self.assertEqual(cv.n, num) def test_expressions(self): e1 = '''int k = (r + 10.0) >> 6 + 8 << (3 & 0x14);''' self.assert_all_Constants(e1, ['10.0', '6', '8', '3', '0x14']) e2 = r'''char n = '\n', *prefix = "st_";''' self.assert_all_Constants(e2, [r"'\n'", '"st_"']) s1 = r'''int main() { int i = 5, j = 6, k = 1; if ((i=j && k == 1) || k > j) printf("Hello, world\n"); return 0; }''' ps1 = self.parse(s1) self.assert_all_Constants(ps1, ['5', '6', '1', '1', '"Hello, world\\n"', '0']) self.assert_num_ID_refs(ps1, 'i', 1) self.assert_num_ID_refs(ps1, 'j', 2) def test_statements(self): s1 = r''' void foo(){ if (sp == 1) if (optind >= argc || argv[optind][0] != '-' || argv[optind][1] == '\0') return -1; else if (strcmp(argv[optind], "--") == 0) { optind++; return -1; } } ''' self.assert_all_Constants(s1, ['1', '0', r"'-'", '1', r"'\0'", '1', r'"--"', '0', '1']) ps1 = self.parse(s1) self.assert_num_ID_refs(ps1, 'argv', 3) self.assert_num_ID_refs(ps1, 'optind', 5) self.assert_num_klass_nodes(ps1, If, 3) self.assert_num_klass_nodes(ps1, Return, 2) self.assert_num_klass_nodes(ps1, FuncCall, 1) # strcmp self.assert_num_klass_nodes(ps1, BinaryOp, 7) # In the following code, Hash and Node were defined as # int to pacify the parser that sees they're used as # types # s2 = r''' typedef int Hash, Node; void HashDestroy(Hash* hash) { unsigned int i; if (hash == NULL) return; for (i = 0; i < hash->table_size; ++i) { Node* temp = hash->heads[i]; while (temp != NULL) { Node* temp2 = temp; free(temp->entry->key); free(temp->entry->value); free(temp->entry); temp = temp->next; free(temp2); } } free(hash->heads); hash->heads = NULL; free(hash); } ''' ps2 = self.parse(s2) self.assert_num_klass_nodes(ps2, FuncCall, 6) self.assert_num_klass_nodes(ps2, FuncDef, 1) self.assert_num_klass_nodes(ps2, For, 1) self.assert_num_klass_nodes(ps2, While, 1) 
self.assert_num_klass_nodes(ps2, StructRef, 10) # declarations don't count self.assert_num_ID_refs(ps2, 'hash', 6) self.assert_num_ID_refs(ps2, 'i', 4) s3 = r''' void x(void) { int a, b; if (a < b) do { a = 0; } while (0); else if (a == b) { a = 1; } } ''' ps3 = self.parse(s3) self.assert_num_klass_nodes(ps3, DoWhile, 1) self.assert_num_ID_refs(ps3, 'a', 4) self.assert_all_Constants(ps3, ['0', '0', '1']) def test_empty_statements(self): s1 = r''' void foo(void){ ; return;; ; } ''' ps1 = self.parse(s1) self.assert_num_klass_nodes(ps1, EmptyStatement, 3) self.assert_num_klass_nodes(ps1, Return, 1) self.assert_coord(ps1.ext[0].body.block_items[0], 3) self.assert_coord(ps1.ext[0].body.block_items[1], 4) self.assert_coord(ps1.ext[0].body.block_items[2], 4) self.assert_coord(ps1.ext[0].body.block_items[3], 6) def test_switch_statement(self): def assert_case_node(node, const_value): self.assertTrue(isinstance(node, Case)) self.assertTrue(isinstance(node.expr, Constant)) self.assertEqual(node.expr.value, const_value) def assert_default_node(node): self.assertTrue(isinstance(node, Default)) s1 = r''' int foo(void) { switch (myvar) { case 10: k = 10; p = k + 1; return 10; case 20: case 30: return 20; default: break; } return 0; } ''' ps1 = self.parse(s1) switch = ps1.ext[0].body.block_items[0] block = switch.stmt.block_items assert_case_node(block[0], '10') self.assertEqual(len(block[0].stmts), 3) assert_case_node(block[1], '20') self.assertEqual(len(block[1].stmts), 0) assert_case_node(block[2], '30') self.assertEqual(len(block[2].stmts), 1) assert_default_node(block[3]) s2 = r''' int foo(void) { switch (myvar) { default: joe = moe; return 10; case 10: case 20: case 30: case 40: break; } return 0; } ''' ps2 = self.parse(s2) switch = ps2.ext[0].body.block_items[0] block = switch.stmt.block_items assert_default_node(block[0]) self.assertEqual(len(block[0].stmts), 2) assert_case_node(block[1], '10') self.assertEqual(len(block[1].stmts), 0) assert_case_node(block[2], '20') 
self.assertEqual(len(block[2].stmts), 0) assert_case_node(block[3], '30') self.assertEqual(len(block[3].stmts), 0) assert_case_node(block[4], '40') self.assertEqual(len(block[4].stmts), 1) def test_for_statement(self): s2 = r''' void x(void) { int i; for (i = 0; i < 5; ++i) { x = 50; } } ''' ps2 = self.parse(s2) self.assert_num_klass_nodes(ps2, For, 1) # here there are 3 refs to 'i' since the declaration doesn't count as # a ref in the visitor # self.assert_num_ID_refs(ps2, 'i', 3) s3 = r''' void x(void) { for (int i = 0; i < 5; ++i) { x = 50; } } ''' ps3 = self.parse(s3) self.assert_num_klass_nodes(ps3, For, 1) # here there are 2 refs to 'i' since the declaration doesn't count as # a ref in the visitor # self.assert_num_ID_refs(ps3, 'i', 2) s4 = r''' void x(void) { for (int i = 0;;) i; } ''' ps4 = self.parse(s4) self.assert_num_ID_refs(ps4, 'i', 1) def _open_c_file(self, name): """ Find a c file by name, taking into account that the current dir can be in a couple of typical places """ testdir = os.path.dirname(__file__) name = os.path.join(testdir, 'c_files', name) assert os.path.exists(name) return open(name, 'rU') def test_whole_file(self): # See how pycparser handles a whole, real C file. # with self._open_c_file('memmgr_with_h.c') as f: code = f.read() p = self.parse(code) self.assert_num_klass_nodes(p, FuncDef, 5) # each FuncDef also has a FuncDecl. 
4 declarations # + 5 definitions, overall 9 self.assert_num_klass_nodes(p, FuncDecl, 9) self.assert_num_klass_nodes(p, Typedef, 4) self.assertEqual(p.ext[4].coord.line, 88) self.assertEqual(p.ext[4].coord.file, "./memmgr.h") self.assertEqual(p.ext[6].coord.line, 10) self.assertEqual(p.ext[6].coord.file, "memmgr.c") def test_whole_file_with_stdio(self): # Parse a whole file with stdio.h included by cpp # with self._open_c_file('cppd_with_stdio_h.c') as f: code = f.read() p = self.parse(code) self.assertTrue(isinstance(p.ext[0], Typedef)) self.assertEqual(p.ext[0].coord.line, 213) self.assertEqual(p.ext[0].coord.file, r"D:\eli\cpp_stuff\libc_include/stddef.h") self.assertTrue(isinstance(p.ext[-1], FuncDef)) self.assertEqual(p.ext[-1].coord.line, 15) self.assertEqual(p.ext[-1].coord.file, "example_c_file.c") self.assertTrue(isinstance(p.ext[-8], Typedef)) self.assertTrue(isinstance(p.ext[-8].type, TypeDecl)) self.assertEqual(p.ext[-8].name, 'cookie_io_functions_t') class TestCParser_typenames(TestCParser_base): """ Test issues related to the typedef-name problem. """ def test_innerscope_typedef(self): # should fail since TT is not a type in bar s1 = r''' void foo() { typedef char TT; TT x; } void bar() { TT y; } ''' self.assertRaises(ParseError, self.parse, s1) # should succeed since TT is not a type in bar s2 = r''' void foo() { typedef char TT; TT x; } void bar() { unsigned TT; } ''' self.assertTrue(isinstance(self.parse(s2), FileAST)) def test_ambiguous_parameters(self): # From ISO/IEC 9899:TC2, 6.7.5.3.11: # "If, in a parameter declaration, an identifier can be treated either # as a typedef name or as a parameter name, it shall be taken as a # typedef name." 
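The typedef-name disambiguation rule quoted from the standard below, and the inner-scope shadowing tests that follow, both come down to tracking which identifiers currently name types, per scope. A minimal sketch of such a scoped lookup table (a toy model of the lexer feedback a C parser needs, not pycparser's actual code):

```python
class ScopedTypeTable:
    """Track which names are typedef'd types, with block scoping.
    Toy model: declare(name, is_type) records a declaration in the
    innermost scope; is_type(name) searches scopes innermost-first."""
    def __init__(self):
        self.scopes = [{}]
    def push_scope(self):
        self.scopes.append({})
    def pop_scope(self):
        self.scopes.pop()
    def declare(self, name, is_type):
        self.scopes[-1][name] = is_type
    def is_type(self, name):
        for scope in reversed(self.scopes):
            if name in scope:
                return scope[name]
        return False

t = ScopedTypeTable()
t.declare('TT', True)     # typedef char TT;
t.push_scope()            # entering a function body
t.declare('TT', False)    # unsigned TT;  -- shadows the type
print(t.is_type('TT'))    # False: inside the block, TT is a variable
t.pop_scope()
print(t.is_type('TT'))    # True: the typedef is restored outside
```

This is the mechanism the `test_innerscope_reuse_typedef_name` cases below probe: shadowing a typedef with a variable inside a block, and having the type meaning restored when the block ends.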
# foo takes an int named aa # bar takes a function taking a TT s1 = r''' typedef char TT; int foo(int (aa)); int bar(int (TT)); ''' s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[1].type.args.params[0]), ['Decl', 'aa', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(expand_decl(s1_ast.ext[2].type.args.params[0]), ['Typename', ['FuncDecl', [['Typename', ['TypeDecl', ['IdentifierType', ['TT']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) # foo takes a function taking a char # bar takes a function taking a function taking a char s2 = r''' typedef char TT; int foo(int (aa (char))); int bar(int (TT (char))); ''' s2_ast = self.parse(s2) self.assertEqual(expand_decl(s2_ast.ext[1].type.args.params[0]), ['Decl', 'aa', ['FuncDecl', [['Typename', ['TypeDecl', ['IdentifierType', ['char']]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(expand_decl(s2_ast.ext[2].type.args.params[0]), ['Typename', ['FuncDecl', [['Typename', ['FuncDecl', [['Typename', ['TypeDecl', ['IdentifierType', ['char']]]]], ['TypeDecl', ['IdentifierType', ['TT']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) # foo takes an int array named aa # bar takes a function taking a TT array s3 = r''' typedef char TT; int foo(int (aa[])); int bar(int (TT[])); ''' s3_ast = self.parse(s3) self.assertEqual(expand_decl(s3_ast.ext[1].type.args.params[0]), ['Decl', 'aa', ['ArrayDecl', '', [], ['TypeDecl', ['IdentifierType', ['int']]]]]) self.assertEqual(expand_decl(s3_ast.ext[2].type.args.params[0]), ['Typename', ['FuncDecl', [['Typename', ['ArrayDecl', '', [], ['TypeDecl', ['IdentifierType', ['TT']]]]]], ['TypeDecl', ['IdentifierType', ['int']]]]]) def test_innerscope_reuse_typedef_name(self): # identifiers can be reused in inner scopes; the original should be # restored at the end of the block s1 = r''' typedef char TT; void foo(void) { unsigned TT; TT = 10; } TT x = 5; ''' s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[1].body.block_items[0]), 
['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) self.assertEqual(expand_decl(s1_ast.ext[2]), ['Decl', 'x', ['TypeDecl', ['IdentifierType', ['TT']]]]) # this should be recognized even with an initializer s2 = r''' typedef char TT; void foo(void) { unsigned TT = 10; } ''' s2_ast = self.parse(s2) self.assertEqual(expand_decl(s2_ast.ext[1].body.block_items[0]), ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) # before the second local variable, TT is a type; after, it's a # variable s3 = r''' typedef char TT; void foo(void) { TT tt = sizeof(TT); unsigned TT = 10; } ''' s3_ast = self.parse(s3) self.assertEqual(expand_decl(s3_ast.ext[1].body.block_items[0]), ['Decl', 'tt', ['TypeDecl', ['IdentifierType', ['TT']]]]) self.assertEqual(expand_decl(s3_ast.ext[1].body.block_items[1]), ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) # a variable and its type can even share the same name s4 = r''' typedef char TT; void foo(void) { TT TT = sizeof(TT); unsigned uu = TT * 2; } ''' s4_ast = self.parse(s4) self.assertEqual(expand_decl(s4_ast.ext[1].body.block_items[0]), ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['TT']]]]) self.assertEqual(expand_decl(s4_ast.ext[1].body.block_items[1]), ['Decl', 'uu', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) # ensure an error is raised if a type, redeclared as a variable, is # used as a type s5 = r''' typedef char TT; void foo(void) { unsigned TT = 10; TT erroneous = 20; } ''' self.assertRaises(ParseError, self.parse, s5) # reusing a type name should work with multiple declarators s6 = r''' typedef char TT; void foo(void) { unsigned TT, uu; } ''' s6_ast = self.parse(s6) items = s6_ast.ext[1].body.block_items self.assertEqual(expand_decl(items[0]), ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) self.assertEqual(expand_decl(items[1]), ['Decl', 'uu', ['TypeDecl', ['IdentifierType', ['unsigned']]]]) # reusing a type name should work after a pointer s7 = r''' typedef char TT; void 
foo(void) { unsigned * TT; } ''' s7_ast = self.parse(s7) items = s7_ast.ext[1].body.block_items self.assertEqual(expand_decl(items[0]), ['Decl', 'TT', ['PtrDecl', ['TypeDecl', ['IdentifierType', ['unsigned']]]]]) # redefine a name in the middle of a multi-declarator declaration s8 = r''' typedef char TT; void foo(void) { int tt = sizeof(TT), TT, uu = sizeof(TT); int uu = sizeof(tt); } ''' s8_ast = self.parse(s8) items = s8_ast.ext[1].body.block_items self.assertEqual(expand_decl(items[0]), ['Decl', 'tt', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(expand_decl(items[1]), ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['int']]]]) self.assertEqual(expand_decl(items[2]), ['Decl', 'uu', ['TypeDecl', ['IdentifierType', ['int']]]]) # Don't test this until we have support for it # self.assertEqual(expand_init(items[0].init), # ['UnaryOp', 'sizeof', ['Typename', ['TypeDecl', ['IdentifierType', ['TT']]]]]) # self.assertEqual(expand_init(items[2].init), # ['UnaryOp', 'sizeof', ['ID', 'TT']]) def test_parameter_reuse_typedef_name(self): # identifiers can be reused as parameter names; parameter name scope # begins and ends with the function body; it's important that TT is # used immediately before the LBRACE or after the RBRACE, to test # a corner case s1 = r''' typedef char TT; void foo(unsigned TT, TT bar) { TT = 10; } TT x = 5; ''' s1_ast = self.parse(s1) self.assertEqual(expand_decl(s1_ast.ext[1].decl), ['Decl', 'foo', ['FuncDecl', [ ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]], ['Decl', 'bar', ['TypeDecl', ['IdentifierType', ['TT']]]]], ['TypeDecl', ['IdentifierType', ['void']]]]]) # the scope of a parameter name in a function declaration ends at the # end of the declaration...so it is effectively never used; it's # important that TT is used immediately after the declaration, to # test a corner case s2 = r''' typedef char TT; void foo(unsigned TT, TT bar); TT x = 5; ''' s2_ast = self.parse(s2) self.assertEqual(expand_decl(s2_ast.ext[1]), 
['Decl', 'foo', ['FuncDecl', [ ['Decl', 'TT', ['TypeDecl', ['IdentifierType', ['unsigned']]]], ['Decl', 'bar', ['TypeDecl', ['IdentifierType', ['TT']]]]], ['TypeDecl', ['IdentifierType', ['void']]]]]) # ensure an error is raised if a type, redeclared as a parameter, is # used as a type s3 = r''' typedef char TT; void foo(unsigned TT, TT bar) { TT erroneous = 20; } ''' self.assertRaises(ParseError, self.parse, s3) def test_nested_function_decls(self): # parameter names of nested function declarations must not escape into # the top-level function _definition's_ scope; the following must # succeed because TT is still a typedef inside foo's body s1 = r''' typedef char TT; void foo(unsigned bar(int TT)) { TT x = 10; } ''' self.assertTrue(isinstance(self.parse(s1), FileAST)) def test_samescope_reuse_name(self): # a typedef name cannot be reused as an object name in the same scope s1 = r''' typedef char TT; char TT = 5; ''' self.assertRaises(ParseError, self.parse, s1) # ...and vice-versa s2 = r''' char TT = 5; typedef char TT; ''' self.assertRaises(ParseError, self.parse, s2) if __name__ == '__main__': #~ suite = unittest.TestLoader().loadTestsFromNames( #~ ['test_c_parser.TestCParser_fundamentals.test_typedef']) #~ unittest.TextTestRunner(verbosity=2).run(suite) unittest.main() pycparser-2.18/tests/all_tests.py0000775000175000017500000000057313045001366017710 0ustar elibeneliben00000000000000#!/usr/bin/env python import sys sys.path[0:0] = ['.', '..'] import unittest suite = unittest.TestLoader().loadTestsFromNames( [ 'test_c_lexer', 'test_c_ast', 'test_general', 'test_c_parser', 'test_c_generator', ] ) testresult = unittest.TextTestRunner(verbosity=1).run(suite) sys.exit(0 if testresult.wasSuccessful() else 1) pycparser-2.18/tests/c_files/0000775000175000017500000000000013127011712016735 5ustar elibeneliben00000000000000pycparser-2.18/tests/c_files/memmgr.c0000664000175000017500000001273513045001366020400 0ustar 
elibeneliben00000000000000//---------------------------------------------------------------- // Statically-allocated memory manager // // by Eli Bendersky (eliben@gmail.com) // // This code is in the public domain. //---------------------------------------------------------------- #include "memmgr.h" typedef ulong Align; union mem_header_union { struct { // Pointer to the next block in the free list // union mem_header_union* next; // Size of the block (in quantas of sizeof(mem_header_t)) // ulong size; } s; // Used to align headers in memory to a boundary // Align align_dummy; }; typedef union mem_header_union mem_header_t; // Initial empty list // static mem_header_t base; // Start of free list // static mem_header_t* freep = 0; // Static pool for new allocations // static byte pool[POOL_SIZE] = {0}; static ulong pool_free_pos = 0; void memmgr_init() { base.s.next = 0; base.s.size = 0; freep = 0; pool_free_pos = 0; } static mem_header_t* get_mem_from_pool(ulong nquantas) { ulong total_req_size; mem_header_t* h; if (nquantas < MIN_POOL_ALLOC_QUANTAS) nquantas = MIN_POOL_ALLOC_QUANTAS; total_req_size = nquantas * sizeof(mem_header_t); if (pool_free_pos + total_req_size <= POOL_SIZE) { h = (mem_header_t*) (pool + pool_free_pos); h->s.size = nquantas; memmgr_free((void*) (h + 1)); pool_free_pos += total_req_size; } else { return 0; } return freep; } // Allocations are done in 'quantas' of header size. // The search for a free block of adequate size begins at the point 'freep' // where the last block was found. // If a too-big block is found, it is split and the tail is returned (this // way the header of the original needs only to have its size adjusted). // The pointer returned to the user points to the free space within the block, // which begins one quanta after the header. // void* memmgr_alloc(ulong nbytes) { mem_header_t* p; mem_header_t* prevp; // Calculate how many quantas are required: we need enough to house all // the requested bytes, plus the header. 
The -1 and +1 are there to make sure // that if nbytes is a multiple of nquantas, we don't allocate too much // ulong nquantas = (nbytes + sizeof(mem_header_t) - 1) / sizeof(mem_header_t) + 1; // First alloc call, and no free list yet ? Use 'base' for an initial // degenerate block of size 0, which points to itself // if ((prevp = freep) == 0) { base.s.next = freep = prevp = &base; base.s.size = 0; } for (p = prevp->s.next; ; prevp = p, p = p->s.next) { // big enough ? if (p->s.size >= nquantas) { // exactly ? if (p->s.size == nquantas) { // just eliminate this block from the free list by pointing // its prev's next to its next // prevp->s.next = p->s.next; } else // too big { p->s.size -= nquantas; p += p->s.size; p->s.size = nquantas; } freep = prevp; return (void*) (p + 1); } // Reached end of free list ? // Try to allocate the block from the pool. If that succeeds, // get_mem_from_pool adds the new block to the free list and // it will be found in the following iterations. If the call // to get_mem_from_pool doesn't succeed, we've run out of // memory // else if (p == freep) { if ((p = get_mem_from_pool(nquantas)) == 0) { #ifdef DEBUG_MEMMGR_FATAL printf("!! Memory allocation failed !!\n"); #endif return 0; } } } } // Scans the free list, starting at freep, looking for the place to insert the // free block. This is either between two existing blocks or at the end of the // list. In any case, if the block being freed is adjacent to either neighbor, // the adjacent blocks are combined. // void memmgr_free(void* ap) { mem_header_t* block; mem_header_t* p; // acquire pointer to block header block = ((mem_header_t*) ap) - 1; // Find the correct place to place the block in (the free list is sorted by // address, increasing order) // for (p = freep; !(block > p && block < p->s.next); p = p->s.next) { // Since the free list is circular, there is one link where a // higher-addressed block points to a lower-addressed block. 
// This condition checks if the block should be actually // inserted between them // if (p >= p->s.next && (block > p || block < p->s.next)) break; } // Try to combine with the higher neighbor // if (block + block->s.size == p->s.next) { block->s.size += p->s.next->s.size; block->s.next = p->s.next->s.next; } else { block->s.next = p->s.next; } // Try to combine with the lower neighbor // if (p + p->s.size == block) { p->s.size += block->s.size; p->s.next = block->s.next; } else { p->s.next = block; } freep = p; } pycparser-2.18/tests/c_files/memmgr_with_h.c0000664000175000017500000000770613045001366021744 0ustar elibeneliben00000000000000#line 1 "memmgr.c" #line 1 "./memmgr.h" typedef unsigned char byte; typedef unsigned long ulong; void memmgr_init(); void* memmgr_alloc(ulong nbytes); void memmgr_free(void* ap); void memmgr_print_stats(); #line 9 "memmgr.c" typedef ulong Align; union mem_header_union { struct { union mem_header_union* next; ulong size; } s; Align align_dummy; }; typedef union mem_header_union mem_header_t; static mem_header_t base; static mem_header_t* freep = 0; static byte pool[8 * 1024] = {0}; static ulong pool_free_pos = 0; void memmgr_init() { base.s.next = 0; base.s.size = 0; freep = 0; pool_free_pos = 0; } void memmgr_print_stats() { mem_header_t* p; printf("------ Memory manager stats ------\n\n"); printf( "Pool: free_pos = %lu (%lu bytes left)\n\n", pool_free_pos,8 * 1024 - pool_free_pos); p = (mem_header_t*) pool; while (p < (mem_header_t*) (pool + pool_free_pos)) { printf( " * Addr: 0x%8lu; Size: %8lu\n", p, p->s.size); p += p->s.size; } printf("\nFree list:\n\n"); if (freep) { p = freep; while (1) { printf( " * Addr: 0x%8lu; Size: %8lu; Next: 0x%8lu\n", p, p->s.size, p->s.next); p = p->s.next; if (p == freep) break; } } else { printf("Empty\n"); } printf("\n"); } static mem_header_t* get_mem_from_pool(ulong nquantas) { ulong total_req_size; mem_header_t* h; if (nquantas < 16) nquantas = 16; total_req_size = nquantas * 
sizeof(mem_header_t); if (pool_free_pos + total_req_size <= 8 * 1024) { h = (mem_header_t*) (pool + pool_free_pos); h->s.size = nquantas; memmgr_free((void*) (h + 1)); pool_free_pos += total_req_size; } else { return 0; } return freep; } void* memmgr_alloc(ulong nbytes) { mem_header_t* p; mem_header_t* prevp; ulong nquantas = (nbytes + sizeof(mem_header_t) - 1) / sizeof(mem_header_t) + 1; if ((prevp = freep) == 0) { base.s.next = freep = prevp = &base; base.s.size = 0; } for (p = prevp->s.next; ; prevp = p, p = p->s.next) { if (p->s.size >= nquantas) { if (p->s.size == nquantas) { prevp->s.next = p->s.next; } else { p->s.size -= nquantas; p += p->s.size; p->s.size = nquantas; } freep = prevp; return (void*) (p + 1); } else if (p == freep) { if ((p = get_mem_from_pool(nquantas)) == 0) { return 0; } } } } void memmgr_free(void* ap) { mem_header_t* block; mem_header_t* p; block = ((mem_header_t*) ap) - 1; for (p = freep; !(block > p && block < p->s.next); p = p->s.next) { if (p >= p->s.next && (block > p || block < p->s.next)) break; } if (block + block->s.size == p->s.next) { block->s.size += p->s.next->s.size; block->s.next = p->s.next->s.next; } else { block->s.next = p->s.next; } if (p + p->s.size == block) { p->s.size += block->s.size; p->s.next = block->s.next; } else { p->s.next = block; } freep = p; } pycparser-2.18/tests/c_files/simplemain.c0000664000175000017500000000010013045001366021231 0ustar elibeneliben00000000000000#include "hdir\emptydir\..\9\inc.h" int main() { return 0; } pycparser-2.18/tests/c_files/cppd_with_stdio_h.c0000664000175000017500000010602313045001366022600 0ustar elibeneliben00000000000000#line 1 "example_c_file.c" #line 1 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 19 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 25 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 11 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 1 "D:\eli\cpp_stuff\libc_include/newlib.h" #line 3 
"D:\eli\cpp_stuff\libc_include/newlib.h" #line 16 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 1 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 52 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 58 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 83 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 86 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 89 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 95 "D:\eli\cpp_stuff\libc_include/machine/ieeefp.h" #line 5 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 11 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 143 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 157 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 195 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 207 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 17 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 21 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 30 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 19 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 26 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 30 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 35 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 39 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 42 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 53 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 56 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 67 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 76 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 98 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 108 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 126 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 131 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 170 "D:\eli\cpp_stuff\libc_include/stddef.h" typedef long unsigned int size_t; #line 243 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 246 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 290 
"D:\eli\cpp_stuff\libc_include/stddef.h" #line 302 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 310 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 361 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 365 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 418 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 422 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 427 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 35 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 19 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 26 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 30 "D:\eli\cpp_stuff\libc_include/stdarg.h" typedef char* __builtin_va_list; typedef __builtin_va_list __gnuc_va_list; #line 50 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 66 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 80 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 98 "D:\eli\cpp_stuff\libc_include/stdarg.h" #line 38 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 44 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 6 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 1 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 11 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 21 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 14 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 8 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 1 "D:\eli\cpp_stuff\libc_include/machine/_types.h" #line 4 "D:\eli\cpp_stuff\libc_include/machine/_types.h" #line 1 "D:\eli\cpp_stuff\libc_include/machine/_default_types.h" #line 4 "D:\eli\cpp_stuff\libc_include/machine/_default_types.h" #line 15 "D:\eli\cpp_stuff\libc_include/machine/_default_types.h" #line 17 "D:\eli\cpp_stuff\libc_include/machine/_default_types.h" #line 1 "D:\eli\cpp_stuff\libc_include/limits.h" #line 1 "D:\eli\cpp_stuff\libc_include/newlib.h" #line 3 "D:\eli\cpp_stuff\libc_include/newlib.h" #line 5 "D:\eli\cpp_stuff\libc_include/limits.h" 
#line 19 "D:\eli\cpp_stuff\libc_include/limits.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 11 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 143 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 157 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 195 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 207 "D:\eli\cpp_stuff\libc_include/sys/config.h" #line 25 "D:\eli\cpp_stuff\libc_include/limits.h" #line 79 "D:\eli\cpp_stuff\libc_include/limits.h" #line 23 "D:\eli\cpp_stuff\libc_include/machine/_default_types.h" typedef signed char __int8_t ; typedef unsigned char __uint8_t ; typedef signed short __int16_t; typedef unsigned short __uint16_t; typedef __int16_t __int_least16_t; typedef __uint16_t __uint_least16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef __int32_t __int_least32_t; typedef __uint32_t __uint_least32_t; #line 8 "D:\eli\cpp_stuff\libc_include/machine/_types.h" #line 13 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/lock.h" typedef int _LOCK_T; typedef int _LOCK_RECURSIVE_T; #line 14 "D:\eli\cpp_stuff\libc_include/sys/_types.h" typedef long _off_t; typedef short __dev_t; typedef unsigned short __uid_t; typedef unsigned short __gid_t; typedef long long _off64_t; #line 43 "D:\eli\cpp_stuff\libc_include/sys/_types.h" typedef long _fpos_t; typedef int _ssize_t; #line 1 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 19 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 26 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 30 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 35 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 39 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 42 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 53 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 56 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 67 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 76 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 98 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 108 
"D:\eli\cpp_stuff\libc_include/stddef.h" #line 126 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 131 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 170 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 243 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 246 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 290 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 302 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 310 "D:\eli\cpp_stuff\libc_include/stddef.h" typedef unsigned int wint_t; #line 361 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 365 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 418 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 422 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 427 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 64 "D:\eli\cpp_stuff\libc_include/sys/_types.h" typedef struct { int __count; union { wint_t __wch; unsigned char __wchb[4]; } __value; } _mbstate_t; typedef _LOCK_RECURSIVE_T _flock_t; typedef void *_iconv_t; #line 15 "D:\eli\cpp_stuff\libc_include/sys/reent.h" typedef unsigned long __ULong; struct _reent; #line 43 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct _Bigint { struct _Bigint *_next; int _k, _maxwds, _sign, _wds; __ULong _x[1]; }; struct __tm { int __tm_sec; int __tm_min; int __tm_hour; int __tm_mday; int __tm_mon; int __tm_year; int __tm_wday; int __tm_yday; int __tm_isdst; }; #line 68 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct _on_exit_args { void * _fnargs[32]; void * _dso_handle[32]; __ULong _fntypes; #line 77 "D:\eli\cpp_stuff\libc_include/sys/reent.h" __ULong _is_cxa; }; struct _atexit { struct _atexit *_next; int _ind; void (*_fns[32])(void); struct _on_exit_args _on_exit_args; }; #line 104 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct __sbuf { unsigned char *_base; int _size; }; #line 134 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 141 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct __sFILE { unsigned char *_p; int _r; int _w; short _flags; short _file; struct __sbuf _bf; int _lbfsize; char * 
_cookie; int(*_read)(); #line 176 "D:\eli\cpp_stuff\libc_include/sys/reent.h" int(*_write)(); #line 178 "D:\eli\cpp_stuff\libc_include/sys/reent.h" _fpos_t(*_seek)(); int(*_close)(); struct __sbuf _ub; unsigned char *_up; int _ur; unsigned char _ubuf[3]; unsigned char _nbuf[1]; struct __sbuf _lb; int _blksize; int _offset; struct _reent *_data; _flock_t _lock; }; typedef struct __sFILE __FILE; struct _glue { struct _glue *_next; int _niobs; __FILE *_iobs; }; #line 284 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct _rand48 { unsigned short _seed[3]; unsigned short _mult[3]; unsigned short _add; }; #line 313 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 344 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 350 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 420 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 452 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 474 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 478 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 482 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 494 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 496 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 503 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 505 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 508 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 531 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 533 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 536 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct _reent { int _errno; #line 571 "D:\eli\cpp_stuff\libc_include/sys/reent.h" __FILE *_stdin, *_stdout, *_stderr; int _inc; char _emergency[25]; int _current_category; char *_current_locale; int __sdidinit; void(*__cleanup)(); struct _Bigint *_result; int _result_k; struct _Bigint *_p5s; struct _Bigint **_freelist; int _cvtlen; char *_cvtbuf; union { struct { unsigned int _unused_rand; char * _strtok_last; char _asctime_buf[26]; struct __tm _localtime_buf; int _gamma_signgam; unsigned long long _rand_next; struct 
_rand48 _r48; _mbstate_t _mblen_state; _mbstate_t _mbtowc_state; _mbstate_t _wctomb_state; char _l64a_buf[8]; char _signal_buf[24]; int _getdate_err; _mbstate_t _mbrlen_state; _mbstate_t _mbrtowc_state; _mbstate_t _mbsrtowcs_state; _mbstate_t _wcrtomb_state; _mbstate_t _wcsrtombs_state; } _reent; #line 619 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct { unsigned char * _nextf[30]; unsigned int _nmalloc[30]; } _unused; } _new; struct _atexit *_atexit; struct _atexit _atexit0; void (**(_sig_func))(int); #line 637 "D:\eli\cpp_stuff\libc_include/sys/reent.h" struct _glue __sglue; __FILE __sf[3]; }; #line 689 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 751 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 791 "D:\eli\cpp_stuff\libc_include/sys/reent.h" extern struct _reent *_impure_ptr; extern struct _reent * _global_impure_ptr; void _reclaim_reent(); #line 46 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 17 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 1 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 11 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 21 "D:\eli\cpp_stuff\libc_include/_ansi.h" #line 21 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 1 "D:\eli\cpp_stuff\libc_include/machine/_types.h" #line 4 "D:\eli\cpp_stuff\libc_include/machine/_types.h" #line 26 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 33 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 8 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 43 "D:\eli\cpp_stuff\libc_include/sys/_types.h" #line 62 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 1 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 19 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 26 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 30 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 35 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 39 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 42 
"D:\eli\cpp_stuff\libc_include/stddef.h" #line 53 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 56 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 67 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 76 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 98 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 108 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 126 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 131 "D:\eli\cpp_stuff\libc_include/stddef.h" typedef long int ptrdiff_t; #line 170 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 243 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 246 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 290 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 302 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 310 "D:\eli\cpp_stuff\libc_include/stddef.h" typedef int wchar_t; #line 361 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 365 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 418 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 422 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 427 "D:\eli\cpp_stuff\libc_include/stddef.h" #line 70 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 1 "D:\eli\cpp_stuff\libc_include/machine/types.h" #line 9 "D:\eli\cpp_stuff\libc_include/machine/types.h" typedef long int __off_t; typedef int __pid_t; typedef long int __loff_t; #line 71 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 79 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef unsigned char u_char; typedef unsigned short u_short; typedef unsigned int u_int; typedef unsigned long u_long; typedef unsigned short ushort; typedef unsigned int uint; typedef unsigned long clock_t; typedef long time_t; struct timespec { time_t tv_sec; long tv_nsec; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; typedef long daddr_t; typedef char * caddr_t; #line 131 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef unsigned short ino_t; #line 160 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef _off_t off_t; typedef __dev_t dev_t; typedef 
__uid_t uid_t; typedef __gid_t gid_t; typedef int pid_t; typedef long key_t; typedef _ssize_t ssize_t; typedef unsigned int mode_t; typedef unsigned short nlink_t; #line 200 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 209 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef long fd_mask; #line 221 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef struct _types_fd_set { fd_mask fds_bits[(((64)+(((sizeof (fd_mask) * 8))-1))/((sizeof (fd_mask) * 8)))]; } _types_fd_set; #line 236 "D:\eli\cpp_stuff\libc_include/sys/types.h" typedef unsigned long clockid_t; typedef unsigned long timer_t; typedef unsigned long useconds_t; typedef long suseconds_t; #line 1 "D:\eli\cpp_stuff\libc_include/sys/features.h" #line 20 "D:\eli\cpp_stuff\libc_include/sys/features.h" #line 257 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 266 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 273 "D:\eli\cpp_stuff\libc_include/sys/types.h" #line 47 "D:\eli\cpp_stuff\libc_include/stdio.h" typedef __FILE FILE; typedef _fpos_t fpos_t; #line 1 "D:\eli\cpp_stuff\libc_include/sys/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/lock.h" #line 5 "D:\eli\cpp_stuff\libc_include/sys/stdio.h" #line 1 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 6 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 43 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 68 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 77 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 104 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 134 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 141 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 284 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 313 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 344 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 350 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 420 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 452 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 474 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 478 
"D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 482 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 494 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 496 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 503 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 505 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 508 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 531 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 533 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 536 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 571 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 619 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 637 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 689 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 751 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 791 "D:\eli\cpp_stuff\libc_include/sys/reent.h" #line 6 "D:\eli\cpp_stuff\libc_include/sys/stdio.h" #line 11 "D:\eli\cpp_stuff\libc_include/sys/stdio.h" #line 66 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 96 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 163 "D:\eli\cpp_stuff\libc_include/stdio.h" FILE * tmpfile(); char * tmpnam(); int fclose(); int fflush(); FILE * freopen(); void setbuf(); int setvbuf(); int fprintf(); #line 179 "D:\eli\cpp_stuff\libc_include/stdio.h" int fscanf(); #line 181 "D:\eli\cpp_stuff\libc_include/stdio.h" int printf(); #line 183 "D:\eli\cpp_stuff\libc_include/stdio.h" int scanf(); #line 185 "D:\eli\cpp_stuff\libc_include/stdio.h" int sscanf(); #line 187 "D:\eli\cpp_stuff\libc_include/stdio.h" int vfprintf(); #line 189 "D:\eli\cpp_stuff\libc_include/stdio.h" int vprintf(); #line 191 "D:\eli\cpp_stuff\libc_include/stdio.h" int vsprintf(); #line 193 "D:\eli\cpp_stuff\libc_include/stdio.h" int fgetc(); char * fgets(); int fputc(); int fputs(); int getc(); int getchar(); char * gets(); int putc(); int putchar(); int puts(); int ungetc(); size_t fread(); size_t fwrite(); int fgetpos(); int fseek(); int fsetpos(); long ftell(); void rewind(); void 
clearerr(); int feof(); int ferror(); void perror(); FILE * fopen(); int sprintf(); #line 227 "D:\eli\cpp_stuff\libc_include/stdio.h" int remove(); int rename(); int fseeko(); off_t ftello(); int asiprintf(); #line 241 "D:\eli\cpp_stuff\libc_include/stdio.h" char * asniprintf(); #line 243 "D:\eli\cpp_stuff\libc_include/stdio.h" char * asnprintf(); #line 245 "D:\eli\cpp_stuff\libc_include/stdio.h" int asprintf(); #line 247 "D:\eli\cpp_stuff\libc_include/stdio.h" int diprintf(); #line 250 "D:\eli\cpp_stuff\libc_include/stdio.h" int fcloseall(); int fiprintf(); #line 254 "D:\eli\cpp_stuff\libc_include/stdio.h" int fiscanf(); #line 256 "D:\eli\cpp_stuff\libc_include/stdio.h" int iprintf(); #line 258 "D:\eli\cpp_stuff\libc_include/stdio.h" int iscanf(); #line 260 "D:\eli\cpp_stuff\libc_include/stdio.h" int siprintf(); #line 262 "D:\eli\cpp_stuff\libc_include/stdio.h" int siscanf(); #line 264 "D:\eli\cpp_stuff\libc_include/stdio.h" int snprintf(); #line 266 "D:\eli\cpp_stuff\libc_include/stdio.h" int sniprintf(); #line 268 "D:\eli\cpp_stuff\libc_include/stdio.h" char * tempnam(); int vasiprintf(); #line 271 "D:\eli\cpp_stuff\libc_include/stdio.h" char * vasniprintf(); #line 273 "D:\eli\cpp_stuff\libc_include/stdio.h" char * vasnprintf(); #line 275 "D:\eli\cpp_stuff\libc_include/stdio.h" int vasprintf(); #line 277 "D:\eli\cpp_stuff\libc_include/stdio.h" int vdiprintf(); #line 279 "D:\eli\cpp_stuff\libc_include/stdio.h" int vfiprintf(); #line 281 "D:\eli\cpp_stuff\libc_include/stdio.h" int vfiscanf(); #line 283 "D:\eli\cpp_stuff\libc_include/stdio.h" int vfscanf(); #line 285 "D:\eli\cpp_stuff\libc_include/stdio.h" int viprintf(); #line 287 "D:\eli\cpp_stuff\libc_include/stdio.h" int viscanf(); #line 289 "D:\eli\cpp_stuff\libc_include/stdio.h" int vscanf(); #line 291 "D:\eli\cpp_stuff\libc_include/stdio.h" int vsiprintf(); #line 293 "D:\eli\cpp_stuff\libc_include/stdio.h" int vsiscanf(); #line 295 "D:\eli\cpp_stuff\libc_include/stdio.h" int vsniprintf(); #line 297 
"D:\eli\cpp_stuff\libc_include/stdio.h" int vsnprintf(); #line 299 "D:\eli\cpp_stuff\libc_include/stdio.h" int vsscanf(); #line 301 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 307 "D:\eli\cpp_stuff\libc_include/stdio.h" FILE * fdopen(); int fileno(); int getw(); int pclose(); FILE * popen(); int putw(); void setbuffer(); int setlinebuf(); int getc_unlocked(); int getchar_unlocked(); void flockfile(); int ftrylockfile(); void funlockfile(); int putc_unlocked(); int putchar_unlocked(); #line 331 "D:\eli\cpp_stuff\libc_include/stdio.h" int dprintf(); #line 337 "D:\eli\cpp_stuff\libc_include/stdio.h" FILE * fmemopen(); FILE * open_memstream(); int vdprintf(); #line 345 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 351 "D:\eli\cpp_stuff\libc_include/stdio.h" int _asiprintf_r(); #line 354 "D:\eli\cpp_stuff\libc_include/stdio.h" char * _asniprintf_r(); #line 356 "D:\eli\cpp_stuff\libc_include/stdio.h" char * _asnprintf_r(); #line 358 "D:\eli\cpp_stuff\libc_include/stdio.h" int _asprintf_r(); #line 360 "D:\eli\cpp_stuff\libc_include/stdio.h" int _diprintf_r(); #line 362 "D:\eli\cpp_stuff\libc_include/stdio.h" int _dprintf_r(); #line 364 "D:\eli\cpp_stuff\libc_include/stdio.h" int _fclose_r(); int _fcloseall_r(); FILE * _fdopen_r(); int _fflush_r(); char * _fgets_r(); int _fiprintf_r(); #line 371 "D:\eli\cpp_stuff\libc_include/stdio.h" int _fiscanf_r(); #line 373 "D:\eli\cpp_stuff\libc_include/stdio.h" FILE * _fmemopen_r(); FILE * _fopen_r(); int _fprintf_r(); #line 377 "D:\eli\cpp_stuff\libc_include/stdio.h" int _fputc_r(); int _fputs_r(); size_t _fread_r(); int _fscanf_r(); #line 382 "D:\eli\cpp_stuff\libc_include/stdio.h" int _fseek_r(); long _ftell_r(); size_t _fwrite_r(); int _getc_r(); int _getc_unlocked_r(); int _getchar_r(); int _getchar_unlocked_r(); char * _gets_r(); int _iprintf_r(); #line 392 "D:\eli\cpp_stuff\libc_include/stdio.h" int _iscanf_r(); #line 394 "D:\eli\cpp_stuff\libc_include/stdio.h" int _mkstemp_r(); char * _mktemp_r(); FILE * 
_open_memstream_r(); void _perror_r(); int _printf_r(); #line 400 "D:\eli\cpp_stuff\libc_include/stdio.h" int _putc_r(); int _putc_unlocked_r(); int _putchar_unlocked_r(); int _putchar_r(); int _puts_r(); int _remove_r(); int _rename_r(); #line 408 "D:\eli\cpp_stuff\libc_include/stdio.h" int _scanf_r(); #line 410 "D:\eli\cpp_stuff\libc_include/stdio.h" int _siprintf_r(); #line 412 "D:\eli\cpp_stuff\libc_include/stdio.h" int _siscanf_r(); #line 414 "D:\eli\cpp_stuff\libc_include/stdio.h" int _sniprintf_r(); #line 416 "D:\eli\cpp_stuff\libc_include/stdio.h" int _snprintf_r(); #line 418 "D:\eli\cpp_stuff\libc_include/stdio.h" int _sprintf_r(); #line 420 "D:\eli\cpp_stuff\libc_include/stdio.h" int _sscanf_r(); #line 422 "D:\eli\cpp_stuff\libc_include/stdio.h" char * _tempnam_r(); FILE * _tmpfile_r(); char * _tmpnam_r(); int _ungetc_r(); int _vasiprintf_r(); #line 428 "D:\eli\cpp_stuff\libc_include/stdio.h" char * _vasniprintf_r(); #line 430 "D:\eli\cpp_stuff\libc_include/stdio.h" char * _vasnprintf_r(); #line 432 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vasprintf_r(); #line 434 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vdiprintf_r(); #line 436 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vdprintf_r(); #line 438 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vfiprintf_r(); #line 440 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vfiscanf_r(); #line 442 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vfprintf_r(); #line 444 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vfscanf_r(); #line 446 "D:\eli\cpp_stuff\libc_include/stdio.h" int _viprintf_r(); #line 448 "D:\eli\cpp_stuff\libc_include/stdio.h" int _viscanf_r(); #line 450 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vprintf_r(); #line 452 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vscanf_r(); #line 454 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vsiprintf_r(); #line 456 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vsiscanf_r(); #line 458 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vsniprintf_r(); #line 460 
"D:\eli\cpp_stuff\libc_include/stdio.h" int _vsnprintf_r(); #line 462 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vsprintf_r(); #line 464 "D:\eli\cpp_stuff\libc_include/stdio.h" int _vsscanf_r(); #line 466 "D:\eli\cpp_stuff\libc_include/stdio.h" ssize_t __getdelim(); ssize_t __getline(); #line 493 "D:\eli\cpp_stuff\libc_include/stdio.h" int __srget_r(); int __swbuf_r(); #line 500 "D:\eli\cpp_stuff\libc_include/stdio.h" FILE * funopen(); #line 514 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 518 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 520 "D:\eli\cpp_stuff\libc_include/stdio.h" typedef ssize_t cookie_read_function_t(void *__cookie, char *__buf, size_t __n); typedef ssize_t cookie_write_function_t(void *__cookie, const char *__buf, size_t __n); typedef int cookie_seek_function_t(void *__cookie, off_t *__off, int __whence); typedef int cookie_close_function_t(void *__cookie); typedef struct { #line 535 "D:\eli\cpp_stuff\libc_include/stdio.h" cookie_read_function_t *read; cookie_write_function_t *write; cookie_seek_function_t *seek; cookie_close_function_t *close; } cookie_io_functions_t; FILE * fopencookie(); #line 542 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 549 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 574 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 580 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 603 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 613 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 621 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 626 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 657 "D:\eli\cpp_stuff\libc_include/stdio.h" #line 4 "example_c_file.c" #line 8 "example_c_file.c" char tav = 'b'; char maav = L"'guruguru\n"; char* moral = "ain't I \\\"\\\t\" a nice string?\"\""; char* comment_inside = "but you will /* see it */!!!!"; char* i_have_newlines = "line one\nline two\nline three"; int main() { auto char* multi = "a multi"; } pycparser-2.18/tests/c_files/example_c_file.c0000664000175000017500000000037213045001366022042 0ustar 
elibeneliben00000000000000char tav = 'b'; char* moral = "ain't I \\\"\\\t\" a nice string?\"\""; char* comment_inside = "but you will /* see it */!!!!"; char* i_have_newlines = "line one\nline two\nline three"; int main() { auto char* multi = "a multi"; } pycparser-2.18/tests/c_files/year.c0000664000175000017500000000240113045001366020041 0ustar elibeneliben00000000000000#include #include #include #include /* C99 bools */ _Bool just_a_flag = false; bool another_flag = true; void convert(int thousands, int hundreds, int tens, int ones) { char *num[] = {"", "One", "Two", "Three", "Four", "Five", "Six", "Seven", "Eight", "Nine"}; char *for_ten[] = {"", "", "Twenty", "Thirty", "Fourty", "Fifty", "Sixty", "Seventy", "Eighty", "Ninty"}; char *af_ten[] = {"Ten", "Eleven", "Twelve", "Thirteen", "Fourteen", "Fifteen", "Sixteen", "Seventeen", "Eighteen", "Ninteen"}; printf("\nThe year in words is:\n"); printf("%s thousand", num[thousands]); if (hundreds != 0) printf(" %s hundred", num[hundreds]); if (tens != 1) printf(" %s %s", for_ten[tens], num[ones]); else printf(" %s", af_ten[ones]); va_list jajaja; } int main() { int year; int n1000, n100, n10, n1; printf("\nEnter the year (4 digits): "); scanf("%d", &year); if (year > 9999 || year < 1000) { printf("\nError !! The year must contain 4 digits."); exit(EXIT_FAILURE); } n1000 = year/1000; n100 = ((year)%1000)/100; n10 = (year%100)/10; n1 = ((year%10)%10); convert(n1000, n100, n10, n1); return 0; } pycparser-2.18/tests/c_files/memmgr.h0000664000175000017500000000566713045001366020413 0ustar elibeneliben00000000000000//---------------------------------------------------------------- // Statically-allocated memory manager // // by Eli Bendersky (eliben@gmail.com) // // This code is in the public domain. //---------------------------------------------------------------- #ifndef MEMMGR_H #define MEMMGR_H // // Memory manager: dynamically allocates memory from // a fixed pool that is allocated statically at link-time. 
// // Usage: after calling memmgr_init() in your // initialization routine, just use memmgr_alloc() instead // of malloc() and memmgr_free() instead of free(). // Naturally, you can use the preprocessor to define // malloc() and free() as aliases to memmgr_alloc() and // memmgr_free(). This way the manager will be a drop-in // replacement for the standard C library allocators, and can // be useful for debugging memory allocation problems and // leaks. // // Preprocessor flags you can define to customize the // memory manager: // // DEBUG_MEMMGR_FATAL // Allow printing out a message when allocations fail // // DEBUG_MEMMGR_SUPPORT_STATS // Allow printing out of stats in function // memmgr_print_stats. When this is disabled, // memmgr_print_stats does nothing. // // Note that in production code on an embedded system // you'll probably want to keep those undefined, because // they cause printf to be called. // // POOL_SIZE // Size of the pool for new allocations. This is // effectively the heap size of the application, and can // be changed in accordance with the available memory // resources. // // MIN_POOL_ALLOC_QUANTAS // Internally, the memory manager allocates memory in // quantas roughly the size of two ulong objects. To // minimize pool fragmentation in case of multiple allocations // and deallocations, it is advisable to not allocate // blocks that are too small. // This flag sets the minimal amount of quantas for // an allocation. If the size of a ulong is 4 and you // set this flag to 16, the minimal size of an allocation // will be 4 * 2 * 16 = 128 bytes. // If you have a lot of small allocations, keep this value // low to conserve memory. If you have mostly large // allocations, it is best to make it higher, to avoid // fragmentation. // // Notes: // 1. This memory manager is *not thread safe*. Use it only // for single thread/task applications. 
// #define DEBUG_MEMMGR_SUPPORT_STATS 1

#define POOL_SIZE (8 * 1024)
#define MIN_POOL_ALLOC_QUANTAS 16

typedef unsigned char byte;
typedef unsigned long ulong;

// Initialize the memory manager. This function should be called
// only once at the beginning of the program.
//
void memmgr_init();

// 'malloc' clone
//
void* memmgr_alloc(ulong nbytes);

// 'free' clone
//
void memmgr_free(void* ap);

// Prints statistics about the current state of the memory
// manager.
//
void memmgr_print_stats();

#endif // MEMMGR_H
pycparser-2.18/tests/c_files/empty.h0000664000175000017500000000016213045001366020246 0ustar elibeneliben00000000000000
#define PERFECTLY
#define NORMAL
#define TO
#define HAVE
#define HEADER
#define WITH
#define ONLY
#define DEFINES
pycparser-2.18/tests/c_files/hdir/0000775000175000017500000000000013127011712017663 5ustar elibeneliben00000000000000
pycparser-2.18/tests/c_files/hdir/9/0000775000175000017500000000000013127011712020033 5ustar elibeneliben00000000000000
pycparser-2.18/tests/c_files/hdir/9/inc.h0000664000175000017500000000001713045001366020756 0ustar elibeneliben00000000000000
extern int ie;
pycparser-2.18/tests/test_c_generator.py0000664000175000017500000001702513111175436021246 0ustar elibeneliben00000000000000
import sys
import unittest

# Run from the root dir
sys.path.insert(0, '.')
from pycparser import c_parser, c_generator, c_ast

_c_parser = c_parser.CParser(
    lex_optimize=False,
    yacc_debug=True,
    yacc_optimize=False,
    yacctab='yacctab')


def compare_asts(ast1, ast2):
    if type(ast1) != type(ast2):
        return False
    if isinstance(ast1, tuple) and isinstance(ast2, tuple):
        if ast1[0] != ast2[0]:
            return False
        ast1 = ast1[1]
        ast2 = ast2[1]
        return compare_asts(ast1, ast2)
    for attr in ast1.attr_names:
        if getattr(ast1, attr) != getattr(ast2, attr):
            return False
    for i, c1 in enumerate(ast1.children()):
        if compare_asts(c1, ast2.children()[i]) == False:
            return False
    return True


def parse_to_ast(src):
    return _c_parser.parse(src)


class TestFunctionDeclGeneration(unittest.TestCase):
    class _FuncDeclVisitor(c_ast.NodeVisitor):
        def __init__(self):
            self.stubs = []

        def visit_FuncDecl(self, node):
            gen = c_generator.CGenerator()
            self.stubs.append(gen.visit(node))

    def test_partial_funcdecl_generation(self):
        src = r'''
            void noop(void);
            void *something(void *thing);
            int add(int x, int y);'''
        ast = parse_to_ast(src)
        v = TestFunctionDeclGeneration._FuncDeclVisitor()
        v.visit(ast)
        self.assertEqual(len(v.stubs), 3)
        self.assertTrue(r'void noop(void)' in v.stubs)
        self.assertTrue(r'void *something(void *thing)' in v.stubs)
        self.assertTrue(r'int add(int x, int y)' in v.stubs)


class TestCtoC(unittest.TestCase):
    def _run_c_to_c(self, src):
        ast = parse_to_ast(src)
        generator = c_generator.CGenerator()
        return generator.visit(ast)

    def _assert_ctoc_correct(self, src):
        """ Checks that the c2c translation was correct by parsing the
            code generated by c2c for src and comparing the AST with the
            original AST.
        """
        src2 = self._run_c_to_c(src)
        self.assertTrue(compare_asts(parse_to_ast(src), parse_to_ast(src2)),
                        src2)

    def test_trivial_decls(self):
        self._assert_ctoc_correct('int a;')
        self._assert_ctoc_correct('int b, a;')
        self._assert_ctoc_correct('int c, b, a;')

    def test_complex_decls(self):
        self._assert_ctoc_correct('int** (*a)(void);')
        self._assert_ctoc_correct('int** (*a)(void*, int);')
        self._assert_ctoc_correct('int (*b)(char * restrict k, float);')
        self._assert_ctoc_correct('int test(const char* const* arg);')
        self._assert_ctoc_correct('int test(const char** const arg);')
        #s = 'int test(const char* const* arg);'
        #parse_to_ast(s).show()

    def test_ternary(self):
        self._assert_ctoc_correct('''
            int main(void)
            {
                int a, b;
                (a == 0) ? (b = 1) : (b = 2);
            }''')

    def test_casts(self):
        self._assert_ctoc_correct(r'''
            int main() {
                int b = (int) f;
                int c = (int*) f;
            }''')

    def test_initlist(self):
        self._assert_ctoc_correct('int arr[] = {1, 2, 3};')

    def test_exprs(self):
        self._assert_ctoc_correct('''
            int main(void)
            {
                int a;
                int b = a++;
                int c = ++a;
                int d = a--;
                int e = --a;
            }''')

    def test_statements(self):
        # note two minuses here
        self._assert_ctoc_correct(r'''
            int main() {
                int a;
                a = 5;
                ;
                b = - - a;
                return a;
            }''')

    def test_casts2(self):
        # This was a second method also named test_casts, which silently
        # shadowed the one above; renamed so both run.
        self._assert_ctoc_correct(r'''
            int main() {
                int a = (int) b + 8;
                int t = (int) c;
            }
        ''')

    def test_struct_decl(self):
        self._assert_ctoc_correct(r'''
            typedef struct node_t {
                struct node_t* next;
                int data;
            } node;
        ''')

    def test_krstyle(self):
        self._assert_ctoc_correct(r'''
            int main(argc, argv)
            int argc;
            char** argv;
            {
                return 0;
            }
        ''')

    def test_switchcase(self):
        self._assert_ctoc_correct(r'''
            int main() {
                switch (myvar) {
                    case 10:
                    {
                        k = 10;
                        p = k + 1;
                        break;
                    }
                    case 20:
                    case 30:
                        return 20;
                    default:
                        break;
                }
            }
        ''')

    def test_nest_initializer_list(self):
        self._assert_ctoc_correct(r'''
            int main()
            {
                int i[1][1] = { { 1 } };
            }''')

    def test_nest_named_initializer(self):
        self._assert_ctoc_correct(r'''struct test
            {
                int i;
                struct test_i_t
                {
                    int k;
                } test_i;
                int j;
            };
            struct test test_var = {.i = 0, .test_i = {.k = 1}, .j = 2};
        ''')

    def test_expr_list_in_initializer_list(self):
        self._assert_ctoc_correct(r'''
            int main()
            {
                int i[1] = { (1, 2) };
            }''')

    def test_issue36(self):
        self._assert_ctoc_correct(r'''
            int main() {
            }''')

    def test_issue37(self):
        self._assert_ctoc_correct(r'''
            int main(void)
            {
                unsigned size;
                size = sizeof(size);
                return 0;
            }''')

    def test_issue83(self):
        self._assert_ctoc_correct(r'''
            void x(void) {
                int i = (9, k);
            }
        ''')

    def test_issue84(self):
        self._assert_ctoc_correct(r'''
            void x(void) {
                for (int i = 0;;)
                    i;
            }
        ''')

    def test_exprlist_with_semi(self):
        self._assert_ctoc_correct(r'''
            void x() {
                if (i < j)
                    tmp = C[i], C[i] = C[j], C[j] = tmp;
                if (i <= j)
                    i++, j--;
            }
        ''')

    def test_exprlist_with_subexprlist(self):
        self._assert_ctoc_correct(r'''
            void x() {
                (a = b, (b = c, c = a));
            }
        ''')

    def test_comma_operator_funcarg(self):
        self._assert_ctoc_correct(r'''
            void f(int x) { return x; }
            int main(void) { f((1, 2)); return 0; }
        ''')

    def test_comma_op_in_ternary(self):
        self._assert_ctoc_correct(r'''
            void f() {
                (0, 0) ? (0, 0) : (0, 0);
            }
        ''')

    def test_comma_op_assignment(self):
        self._assert_ctoc_correct(r'''
            void f() {
                i = (a, b, c);
            }
        ''')

    def test_pragma(self):
        self._assert_ctoc_correct(r'''
            #pragma foo
            void f() {
                #pragma bar
                i = (a, b, c);
            }
        ''')

    def test_compound_literal(self):
        self._assert_ctoc_correct('char **foo = (char *[]){ "x", "y", "z" };')
        self._assert_ctoc_correct('int i = ++(int){ 1 };')
        self._assert_ctoc_correct('struct foo_s foo = (struct foo_s){ 1, 2 };')


if __name__ == "__main__":
    unittest.main()
pycparser-2.18/pycparser.egg-info/0000775000175000017500000000000013127011712017671 5ustar elibeneliben00000000000000
pycparser-2.18/pycparser.egg-info/PKG-INFO0000644000175000017500000000113213127011712020761 0ustar elibeneliben00000000000000
Metadata-Version: 1.1
Name: pycparser
Version: 2.18
Summary: C parser in Python
Home-page: https://github.com/eliben/pycparser
Author: Eli Bendersky
Author-email: eliben@gmail.com
License: BSD
Description: pycparser is a complete parser of the C language, written in
        pure Python using the PLY parsing library. It parses C code into an
        AST and can serve as a front-end for C compilers or analysis tools.
Platform: Cross Platform
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 3
pycparser-2.18/pycparser.egg-info/top_level.txt0000644000175000017500000000001213127011712022412 0ustar elibeneliben00000000000000
pycparser
pycparser-2.18/pycparser.egg-info/SOURCES.txt0000644000175000017500000001072713127011712021562 0ustar elibeneliben00000000000000
CHANGES
LICENSE
MANIFEST.in
README.rst
setup.cfg
setup.py
examples/c-to-c.py
examples/c_json.py
examples/cdecl.py
examples/dump_ast.py
examples/explore_ast.py
examples/func_calls.py
examples/func_defs.py
examples/rewrite_ast.py
examples/serialize_ast.py
examples/using_cpp_libc.py
examples/using_gcc_E_libc.py
examples/c_files/funky.c
examples/c_files/hash.c
examples/c_files/memmgr.c
examples/c_files/memmgr.h
examples/c_files/year.c
pycparser/__init__.py
pycparser/_ast_gen.py
pycparser/_build_tables.py
pycparser/_c_ast.cfg
pycparser/ast_transforms.py
pycparser/c_ast.py
pycparser/c_generator.py
pycparser/c_lexer.py
pycparser/c_parser.py
pycparser/lextab.py
pycparser/plyparser.py
pycparser/yacctab.py
pycparser.egg-info/PKG-INFO
pycparser.egg-info/SOURCES.txt
pycparser.egg-info/dependency_links.txt
pycparser.egg-info/top_level.txt
pycparser/ply/__init__.py
pycparser/ply/cpp.py
pycparser/ply/ctokens.py
pycparser/ply/lex.py
pycparser/ply/yacc.py
pycparser/ply/ygen.py
tests/all_tests.py
tests/test_c_ast.py
tests/test_c_generator.py
tests/test_c_lexer.py
tests/test_c_parser.py
tests/test_general.py
tests/c_files/cppd_with_stdio_h.c
tests/c_files/empty.h
tests/c_files/example_c_file.c
tests/c_files/memmgr.c
tests/c_files/memmgr.h
tests/c_files/memmgr_with_h.c
tests/c_files/simplemain.c
tests/c_files/year.c
tests/c_files/hdir/9/inc.h
utils/fake_libc_include/_ansi.h
utils/fake_libc_include/_fake_defines.h
utils/fake_libc_include/_fake_typedefs.h
utils/fake_libc_include/_syslist.h
utils/fake_libc_include/alloca.h
utils/fake_libc_include/ar.h
utils/fake_libc_include/argz.h
utils/fake_libc_include/assert.h
utils/fake_libc_include/complex.h
utils/fake_libc_include/ctype.h
utils/fake_libc_include/dirent.h
utils/fake_libc_include/dlfcn.h
utils/fake_libc_include/endian.h
utils/fake_libc_include/envz.h
utils/fake_libc_include/errno.h
utils/fake_libc_include/fastmath.h
utils/fake_libc_include/fcntl.h
utils/fake_libc_include/features.h
utils/fake_libc_include/fenv.h
utils/fake_libc_include/float.h
utils/fake_libc_include/getopt.h
utils/fake_libc_include/grp.h
utils/fake_libc_include/iconv.h
utils/fake_libc_include/ieeefp.h
utils/fake_libc_include/inttypes.h
utils/fake_libc_include/iso646.h
utils/fake_libc_include/langinfo.h
utils/fake_libc_include/libgen.h
utils/fake_libc_include/libintl.h
utils/fake_libc_include/limits.h
utils/fake_libc_include/locale.h
utils/fake_libc_include/malloc.h
utils/fake_libc_include/math.h
utils/fake_libc_include/netdb.h
utils/fake_libc_include/newlib.h
utils/fake_libc_include/paths.h
utils/fake_libc_include/process.h
utils/fake_libc_include/pthread.h
utils/fake_libc_include/pwd.h
utils/fake_libc_include/reent.h
utils/fake_libc_include/regdef.h
utils/fake_libc_include/regex.h
utils/fake_libc_include/sched.h
utils/fake_libc_include/search.h
utils/fake_libc_include/semaphore.h
utils/fake_libc_include/setjmp.h
utils/fake_libc_include/signal.h
utils/fake_libc_include/stdarg.h
utils/fake_libc_include/stdbool.h
utils/fake_libc_include/stddef.h
utils/fake_libc_include/stdint.h
utils/fake_libc_include/stdio.h
utils/fake_libc_include/stdlib.h
utils/fake_libc_include/string.h
utils/fake_libc_include/syslog.h
utils/fake_libc_include/tar.h
utils/fake_libc_include/termios.h
utils/fake_libc_include/tgmath.h
utils/fake_libc_include/time.h
utils/fake_libc_include/unctrl.h
utils/fake_libc_include/unistd.h
utils/fake_libc_include/utime.h
utils/fake_libc_include/utmp.h
utils/fake_libc_include/wchar.h
utils/fake_libc_include/wctype.h
utils/fake_libc_include/zlib.h
utils/fake_libc_include/X11/Xlib.h
utils/fake_libc_include/arpa/inet.h
utils/fake_libc_include/asm-generic/int-ll64.h
utils/fake_libc_include/linux/socket.h
utils/fake_libc_include/linux/version.h
utils/fake_libc_include/mir_toolkit/client_types.h
utils/fake_libc_include/netinet/in.h
utils/fake_libc_include/netinet/tcp.h
utils/fake_libc_include/openssl/err.h
utils/fake_libc_include/openssl/evp.h
utils/fake_libc_include/openssl/hmac.h
utils/fake_libc_include/openssl/ssl.h
utils/fake_libc_include/openssl/x509v3.h
utils/fake_libc_include/sys/ioctl.h
utils/fake_libc_include/sys/mman.h
utils/fake_libc_include/sys/poll.h
utils/fake_libc_include/sys/resource.h
utils/fake_libc_include/sys/select.h
utils/fake_libc_include/sys/socket.h
utils/fake_libc_include/sys/stat.h
utils/fake_libc_include/sys/sysctl.h
utils/fake_libc_include/sys/time.h
utils/fake_libc_include/sys/types.h
utils/fake_libc_include/sys/uio.h
utils/fake_libc_include/sys/un.h
utils/fake_libc_include/sys/utsname.h
utils/fake_libc_include/sys/wait.h
utils/fake_libc_include/xcb/xcb.h
pycparser-2.18/pycparser.egg-info/dependency_links.txt0000644000175000017500000000000113127011712023735 0ustar elibeneliben00000000000000
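The bundled tests/test_c_generator.py verifies the C generator with a parse → generate → reparse round trip (`_assert_ctoc_correct` plus the structural `compare_asts` helper). The same testing pattern can be sketched with nothing but Python's standard-library `ast` module, used here purely as an analogy — `assert_roundtrip_correct` is a hypothetical helper, not part of pycparser:

```python
import ast


def assert_roundtrip_correct(src):
    """Parse src, regenerate source text from the AST, reparse the
    generated text, and check both ASTs are structurally equal -- the
    same check _assert_ctoc_correct performs with pycparser's
    CGenerator, transplanted onto Python's own ast module."""
    tree1 = ast.parse(src)
    src2 = ast.unparse(tree1)      # "generator" step (Python >= 3.9)
    tree2 = ast.parse(src2)
    # ast.dump omits line/column attributes by default, so comparing
    # dumps is a purely structural comparison, like compare_asts.
    assert ast.dump(tree1) == ast.dump(tree2), src2
    return src2


if __name__ == '__main__':
    print(assert_roundtrip_correct('x = (1 + 2) * 3'))
```

The key property being tested is idempotence of the generator with respect to parsing: the regenerated text need not match the input byte for byte (whitespace and redundant parentheses may differ), only produce an identical AST.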