PyYAML-6.0.1/CHANGES
====================

For a complete changelog, see:

* https://github.com/yaml/pyyaml/commits/
* https://bitbucket.org/xi/pyyaml/commits/

6.0.1 (2023-07-18)

* https://github.com/yaml/pyyaml/pull/702 -- pin Cython build dep to < 3.0

6.0 (2021-10-13)

* https://github.com/yaml/pyyaml/pull/327 -- Change README format to Markdown
* https://github.com/yaml/pyyaml/pull/483 -- Add a test for YAML 1.1 types
* https://github.com/yaml/pyyaml/pull/497 -- fix float resolver to ignore `.` and `._`
* https://github.com/yaml/pyyaml/pull/550 -- drop Python 2.7
* https://github.com/yaml/pyyaml/pull/553 -- Fix spelling of “hexadecimal”
* https://github.com/yaml/pyyaml/pull/556 -- fix representation of Enum subclasses
* https://github.com/yaml/pyyaml/pull/557 -- fix libyaml extension compiler warnings
* https://github.com/yaml/pyyaml/pull/560 -- fix ResourceWarning on leaked file descriptors
* https://github.com/yaml/pyyaml/pull/561 -- always require `Loader` arg to `yaml.load()`
* https://github.com/yaml/pyyaml/pull/564 -- remove remaining direct distutils usage

5.4.1 (2021-01-20)

* https://github.com/yaml/pyyaml/pull/480 -- Fix stub compat with older pyyaml versions that may unwittingly load it

5.4 (2021-01-19)

* https://github.com/yaml/pyyaml/pull/407 -- Build modernization, remove distutils, fix metadata, build wheels, CI to GHA
* https://github.com/yaml/pyyaml/pull/472 -- Fix for CVE-2020-14343, moves arbitrary python tags to UnsafeLoader
* https://github.com/yaml/pyyaml/pull/441 -- Fix memory leak in implicit resolver setup
* https://github.com/yaml/pyyaml/pull/392 -- Fix py2 copy support for timezone objects
* https://github.com/yaml/pyyaml/pull/378 -- Fix compatibility with Jython

5.3.1 (2020-03-18)

* https://github.com/yaml/pyyaml/pull/386 -- Prevents arbitrary code execution during python/object/new constructor

5.3 (2020-01-06)

* https://github.com/yaml/pyyaml/pull/290 -- Use `is` instead of equality for comparing with `None`
* https://github.com/yaml/pyyaml/pull/270 -- Fix typos and stylistic nit
* https://github.com/yaml/pyyaml/pull/309 -- Fix up small typo
* https://github.com/yaml/pyyaml/pull/161 -- Fix handling of __slots__
* https://github.com/yaml/pyyaml/pull/358 -- Allow calling add_multi_constructor with None
* https://github.com/yaml/pyyaml/pull/285 -- Add use of safe_load() function in README
* https://github.com/yaml/pyyaml/pull/351 -- Fix reader for Unicode code points over 0xFFFF
* https://github.com/yaml/pyyaml/pull/360 -- Enable certain unicode tests when maxunicode not > 0xffff
* https://github.com/yaml/pyyaml/pull/359 -- Use full_load in yaml-highlight example
* https://github.com/yaml/pyyaml/pull/244 -- Document that PyYAML is implemented with Cython
* https://github.com/yaml/pyyaml/pull/329 -- Fix for Python 3.10
* https://github.com/yaml/pyyaml/pull/310 -- Increase size of index, line, and column fields
* https://github.com/yaml/pyyaml/pull/260 -- Remove some unused imports
* https://github.com/yaml/pyyaml/pull/163 -- Create timezone-aware datetimes when parsed as such
* https://github.com/yaml/pyyaml/pull/363 -- Add tests for timezone

5.2 (2019-12-02)
------------------

* Repair incompatibilities introduced with 5.1.
  The default Loader was changed, but several methods like add_constructor
  still used the old default.
    * https://github.com/yaml/pyyaml/pull/279 -- A more flexible fix for custom tag constructors
    * https://github.com/yaml/pyyaml/pull/287 -- Change default loader for yaml.add_constructor
    * https://github.com/yaml/pyyaml/pull/305 -- Change default loader for add_implicit_resolver, add_path_resolver
* Make FullLoader safer by removing python/object/apply from the default FullLoader
    * https://github.com/yaml/pyyaml/pull/347 -- Move constructor for object/apply to UnsafeConstructor
* Fix bug introduced in 5.1 where quoting went wrong on systems with sys.maxunicode <= 0xffff
    * https://github.com/yaml/pyyaml/pull/276 -- Fix logic for quoting special characters
* Other PRs:
    * https://github.com/yaml/pyyaml/pull/280 -- Update CHANGES for 5.1

5.1.2 (2019-07-30)
------------------

* Re-release of 5.1 with regenerated Cython sources to build properly
  for Python 3.8b2+

5.1.1 (2019-06-05)
------------------

* Re-release of 5.1 with regenerated Cython sources to build properly
  for Python 3.8b1

5.1 (2019-03-13)
----------------

* https://github.com/yaml/pyyaml/pull/35 -- Some modernization of the test running
* https://github.com/yaml/pyyaml/pull/42 -- Install tox in a virtualenv
* https://github.com/yaml/pyyaml/pull/45 -- Allow colon in a plain scalar in a flow context
* https://github.com/yaml/pyyaml/pull/48 -- Fix typos
* https://github.com/yaml/pyyaml/pull/55 -- Improve RepresenterError creation
* https://github.com/yaml/pyyaml/pull/59 -- Resolves #57, update readme issues link
* https://github.com/yaml/pyyaml/pull/60 -- Document and test Python 3.6 support
* https://github.com/yaml/pyyaml/pull/61 -- Use Travis CI built in pip cache support
* https://github.com/yaml/pyyaml/pull/62 -- Remove tox workaround for Travis CI
* https://github.com/yaml/pyyaml/pull/63 -- Adding support to Unicode characters over codepoint 0xffff
* https://github.com/yaml/pyyaml/pull/75 -- add 3.12 changelog
* https://github.com/yaml/pyyaml/pull/76 -- Fallback to Pure Python if Compilation fails
* https://github.com/yaml/pyyaml/pull/84 -- Drop unsupported Python 3.3
* https://github.com/yaml/pyyaml/pull/102 -- Include license file in the generated wheel package
* https://github.com/yaml/pyyaml/pull/105 -- Removed Python 2.6 & 3.3 support
* https://github.com/yaml/pyyaml/pull/111 -- Remove commented out Psyco code
* https://github.com/yaml/pyyaml/pull/129 -- Remove call to `ord` in lib3 emitter code
* https://github.com/yaml/pyyaml/pull/149 -- Test on Python 3.7-dev
* https://github.com/yaml/pyyaml/pull/158 -- Support escaped slash in double quotes "\/"
* https://github.com/yaml/pyyaml/pull/175 -- Updated link to pypi in release announcement
* https://github.com/yaml/pyyaml/pull/181 -- Import Hashable from collections.abc
* https://github.com/yaml/pyyaml/pull/194 -- Reverting https://github.com/yaml/pyyaml/pull/74
* https://github.com/yaml/pyyaml/pull/195 -- Build libyaml on travis
* https://github.com/yaml/pyyaml/pull/196 -- Force cython when building sdist
* https://github.com/yaml/pyyaml/pull/254 -- Allow to turn off sorting keys in Dumper (2)
* https://github.com/yaml/pyyaml/pull/256 -- Make default_flow_style=False
* https://github.com/yaml/pyyaml/pull/257 -- Deprecate yaml.load and add FullLoader and UnsafeLoader classes
* https://github.com/yaml/pyyaml/pull/261 -- Skip certain unicode tests when maxunicode not > 0xffff
* https://github.com/yaml/pyyaml/pull/263 -- Windows Appveyor build

3.13 (2018-07-05)
-----------------

* Resolved issues around PyYAML working in Python 3.7.

3.12 (2016-08-28)
-----------------

* Wheel packages for Windows binaries.
* Adding an implicit resolver to a derived loader should not affect
  the base loader.
* Uniform representation for OrderedDict across different versions
  of Python.
* Fixed comparison to None warning.
3.11 (2014-03-26)
-----------------

* Source and binary distributions are rebuilt against the latest
  versions of Cython and LibYAML.

3.10 (2011-05-30)
-----------------

* Do not try to build LibYAML bindings on platforms other than CPython
  (Thanks to olt(at)bogosoft(dot)com).
* Clear cyclic references in the parser and the emitter
  (Thanks to kristjan(at)ccpgames(dot)com).
* Dropped support for Python 2.3 and 2.4.

3.09 (2009-08-31)
-----------------

* Fixed an obscure scanner error not reported when there is
  no line break at the end of the stream (Thanks to Ingy).
* Fixed use of uninitialized memory when emitting anchors with
  LibYAML bindings (Thanks to cegner(at)yahoo-inc(dot)com).
* Fixed emitting incorrect BOM characters for UTF-16
  (Thanks to Valentin Nechayev).
* Fixed the emitter for folded scalars not respecting the preferred
  line width (Thanks to Ingy).
* Fixed a subtle ordering issue with emitting '%TAG' directives
  (Thanks to Andrey Somov).
* Fixed performance regression with LibYAML bindings.

3.08 (2008-12-31)
-----------------

* Python 3 support (Thanks to Erick Tryzelaar).
* Use Cython instead of Pyrex to build LibYAML bindings.
* Refactored support for unicode and byte input/output streams.

3.07 (2008-12-29)
-----------------

* The emitter learned to use an optional indentation indicator for
  block scalars; thus scalars with leading whitespace can now be
  represented in a literal or folded style.
* The test suite is now included in the source distribution.  To run
  the tests, type 'python setup.py test'.
* Refactored the test suite: dropped unittest in favor of a custom
  test appliance.
* Fixed the path resolver in CDumper.
* Forced an explicit document end indicator when there is a possibility
  of parsing ambiguity.
* More setup.py improvements: the package should be usable when any
  combination of setuptools, Pyrex and LibYAML is installed.
* Windows binary packages are built against LibYAML-0.1.2.
* Minor typos and corrections (Thanks to Ingy dot Net and Andrey Somov).

3.06 (2008-10-03)
-----------------

* setup.py checks whether LibYAML is installed and if so, builds and
  installs LibYAML bindings.  To force or disable installation of
  LibYAML bindings, use '--with-libyaml' or '--without-libyaml'
  respectively.
* The source distribution includes compiled Pyrex sources, so building
  LibYAML bindings no longer requires Pyrex to be installed.
* 'yaml.load()' raises an exception if the input stream contains
  more than one YAML document.
* Fixed exceptions produced by LibYAML bindings.
* Fixed a dot '.' character being recognized as !!float.
* Fixed Python 2.3 compatibility issue in constructing !!timestamp values.
* Windows binary packages are built against the LibYAML stable branch.
* Added attributes 'yaml.__version__' and 'yaml.__with_libyaml__'.

3.05 (2007-05-13)
-----------------

* Windows binary packages were built with LibYAML trunk.
* Fixed a bug that prevented processing a live stream of YAML documents
  in a timely manner (Thanks to edward(at)sweetbytes(dot)net).
* Fixed a bug when the path in add_path_resolver contains boolean values
  (Thanks to jstroud(at)mbi(dot)ucla(dot)edu).
* Fixed loss of microsecond precision in timestamps
  (Thanks to edemaine(at)mit(dot)edu).
* Fixed loading an empty YAML stream.
* Allowed immutable subclasses of YAMLObject.
* Made the encoding of the unicode->str conversion explicit, so that
  the conversion does not depend on the default Python encoding.
* Forced emitting float values in a YAML compatible form.

3.04 (2006-08-20)
-----------------

* Include experimental LibYAML bindings.
* Fully support recursive structures.
* Sort dictionary keys.  Mapping node values are now represented as
  lists of pairs instead of dictionaries.  No longer check for
  duplicate mapping keys, as it didn't work correctly anyway.
* Fix invalid output of single-quoted scalars in cases when a single
  quote is not escaped when preceded by whitespace or line breaks.
* To make porting easier, rewrite Parser not using generators.
* Fix handling of unexpected block mapping values.
* Fix a bug in Representer.represent_object: copy_reg.dispatch_table
  was not correctly handled.
* Fix a bug when a block scalar is incorrectly emitted in the simple
  key context.
* Hold references to the objects being represented.
* Make Representer not try to guess !!pairs when a list is represented.
* Fix timestamp constructing and representing.
* Fix the 'N' plain scalar being incorrectly recognized as !!bool.

3.03 (2006-06-19)
-----------------

* Fix Python 2.5 compatibility issues.
* Fix numerous bugs in the float handling.
* Fix scanning some ill-formed documents.
* Other minor fixes.

3.02 (2006-05-15)
-----------------

* Fix win32 installer.  Apparently bdist_wininst does not work
  well under Linux.
* Fix a bug in add_path_resolver.
* Add the yaml-highlight example.  Try to run on a color terminal:
  `python yaml_hl.py`.

PyYAML-6.0.1/PKG-INFO (excerpt)
==============================

License-File: LICENSE

YAML is a data serialization format designed for human readability and
interaction with scripting languages.  PyYAML is a YAML parser and
emitter for Python.

PyYAML features a complete YAML 1.1 parser, Unicode support, pickle
support, a capable extension API, and sensible error messages.  PyYAML
supports standard YAML tags and provides Python-specific tags that
allow representing arbitrary Python objects.

PyYAML is applicable for a broad range of tasks, from complex
configuration files to object serialization and persistence.

PyYAML-6.0.1/README.md
======================

PyYAML
======

A full-featured YAML processing framework for Python

## Installation

To install, type `python setup.py install`.

By default, the `setup.py` script checks whether LibYAML is installed
and if so, builds and installs LibYAML bindings.
To skip the check and force installation of LibYAML bindings, use the
option `--with-libyaml`: `python setup.py --with-libyaml install`.

To disable the check and skip building and installing LibYAML bindings,
use `--without-libyaml`: `python setup.py --without-libyaml install`.

When LibYAML bindings are installed, you may use the fast LibYAML-based
parser and emitter as follows:

    >>> yaml.load(stream, Loader=yaml.CLoader)
    >>> yaml.dump(data, Dumper=yaml.CDumper)

If you don't trust the input YAML stream, you should use:

    >>> yaml.safe_load(stream)

## Testing

PyYAML includes a comprehensive test suite.
To run the tests, type `python setup.py test`.

## Further Information

* For more information, check the
  [PyYAML homepage](https://github.com/yaml/pyyaml).

* [PyYAML tutorial and reference](http://pyyaml.org/wiki/PyYAMLDocumentation).

* Discuss PyYAML with the maintainers on
  Matrix at https://matrix.to/#/#pyyaml:yaml.io or
  IRC #pyyaml irc.libera.chat

* Submit bug reports and feature requests to the
  [PyYAML bug tracker](https://github.com/yaml/pyyaml/issues).

## License

The PyYAML module was written by Kirill Simonov.
It is currently maintained by the YAML and Python communities.

PyYAML is released under the MIT license.

See the file LICENSE for more details.
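The two loading idioms above (prefer the LibYAML-based C classes when the extension was built, use the safe loader for untrusted input) are commonly combined with an import-time fallback. A minimal sketch, assuming PyYAML is installed; the `document` sample is made up for illustration:

```python
import yaml

# Prefer the LibYAML-based classes when the C extension is available,
# falling back to the pure-Python implementations otherwise.
try:
    from yaml import CSafeLoader as SafeLoader, CDumper as Dumper
except ImportError:
    from yaml import SafeLoader, Dumper

document = """
name: example
values:
  - 1
  - 2
"""

# Safe parsing of (possibly untrusted) input: only standard YAML tags
# are constructed, never arbitrary Python objects.
data = yaml.load(document, Loader=SafeLoader)
print(data)  # {'name': 'example', 'values': [1, 2]}

# Serialize back to YAML.
print(yaml.dump(data, Dumper=Dumper))
```

Code that uses this pattern gets the LibYAML speedup transparently on installations where the bindings were built, and still works where they were not.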
PyYAML-6.0.1/examples/pygments-lexer/example.yaml
=================================================

#
# Examples from the Preview section of the YAML specification
# (http://yaml.org/spec/1.2/#Preview)
#

# Sequence of scalars
---
- Mark McGwire
- Sammy Sosa
- Ken Griffey

# Mapping scalars to scalars
---
hr:  65    # Home runs
avg: 0.278 # Batting average
rbi: 147   # Runs Batted In

# Mapping scalars to sequences
---
american:
  - Boston Red Sox
  - Detroit Tigers
  - New York Yankees
national:
  - New York Mets
  - Chicago Cubs
  - Atlanta Braves

# Sequence of mappings
---
- name: Mark McGwire
  hr:   65
  avg:  0.278
- name: Sammy Sosa
  hr:   63
  avg:  0.288

# Sequence of sequences
---
- [name        , hr, avg  ]
- [Mark McGwire, 65, 0.278]
- [Sammy Sosa  , 63, 0.288]

# Mapping of mappings
---
Mark McGwire: {hr: 65, avg: 0.278}
Sammy Sosa: {
    hr: 63,
    avg: 0.288
  }

# Two documents in a stream
--- # Ranking of 1998 home runs
- Mark McGwire
- Sammy Sosa
- Ken Griffey
--- # Team ranking
- Chicago Cubs
- St Louis Cardinals

# Documents with the end indicator
---
time: 20:03:20
player: Sammy Sosa
action: strike (miss)
...
---
time: 20:03:47
player: Sammy Sosa
action: grand slam
...

# Comments
---
hr: # 1998 hr ranking
  - Mark McGwire
  - Sammy Sosa
rbi: # 1998 rbi ranking
  - Sammy Sosa
  - Ken Griffey

# Anchors and aliases
---
hr:
  - Mark McGwire
  # Following node labeled SS
  - &SS Sammy Sosa
rbi:
  - *SS # Subsequent occurrence
  - Ken Griffey

# Mapping between sequences
---
? - Detroit Tigers
  - Chicago cubs
: - 2001-07-23
? [ New York Yankees, Atlanta Braves ]
: [ 2001-07-02, 2001-08-12, 2001-08-14 ]

# Inline nested mapping
--- # products purchased
- item    : Super Hoop
  quantity: 1
- item    : Basketball
  quantity: 4
- item    : Big Shoes
  quantity: 1

# Literal scalars
--- | # ASCII art
  \//||\/||
  // ||  ||__

# Folded scalars
--- >
  Mark McGwire's
  year was crippled
  by a knee injury.

# Preserved indented block in a folded scalar
--- >
  Sammy Sosa completed another
  fine season with great stats.

    63 Home Runs
    0.288 Batting Average

  What a year!

# Indentation determines scope
---
name: Mark McGwire
accomplishment: >
  Mark set a major league
  home run record in 1998.
stats: |
  65 Home Runs
  0.278 Batting Average

# Quoted scalars
---
unicode: "Sosa did fine.\u263A"
control: "\b1998\t1999\t2000\n"
hex esc: "\x0d\x0a is \r\n"
single: '"Howdy!" he cried.'
quoted: ' # not a ''comment''.'
tie-fighter: '|\-*-/|'

# Multi-line flow scalars
---
plain:
  This unquoted scalar
  spans many lines.
quoted: "So does this
  quoted scalar.\n"

# Integers
---
canonical: 12345
decimal: +12_345
sexagesimal: 3:25:45
octal: 014
hexadecimal: 0xC

# Floating point
---
canonical: 1.23015e+3
exponential: 12.3015e+02
sexagesimal: 20:30.15
fixed: 1_230.15
negative infinity: -.inf
not a number: .NaN

# Miscellaneous
---
null: ~
true: boolean
false: boolean
string: '12345'

# Timestamps
---
canonical: 2001-12-15T02:59:43.1Z
iso8601: 2001-12-14t21:59:43.10-05:00
spaced: 2001-12-14 21:59:43.10 -5
date: 2002-12-14

# Various explicit tags
---
not-date: !!str 2002-04-28
picture: !!binary |
  R0lGODlhDAAMAIQAAP//9/X
  17unp5WZmZgAAAOfn515eXv
  Pz7Y6OjuDg4J+fn5OTk6enp
  56enmleECcgggoBADs=
application specific tag: !something |
  The semantics of the tag
  above may be different for
  different documents.

# Global tags
%TAG ! tag:clarkevans.com,2002:
--- !shape
  # Use the ! handle for presenting
  # tag:clarkevans.com,2002:circle
- !circle
  center: &ORIGIN {x: 73, y: 129}
  radius: 7
- !line
  start: *ORIGIN
  finish: { x: 89, y: 102 }
- !label
  start: *ORIGIN
  color: 0xFFEEBB
  text: Pretty vector drawing.

# Unordered sets
--- !!set
# sets are represented as a
# mapping where each key is
# associated with the empty string
? Mark McGwire
? Sammy Sosa
? Ken Griff

# Ordered mappings
--- !!omap
# ordered maps are represented as
# a sequence of mappings, with
# each mapping having one key
- Mark McGwire: 65
- Sammy Sosa: 63
- Ken Griffy: 58

# Full length example
--- !<tag:clarkevans.com,2002:invoice>
invoice: 34843
date   : 2001-01-23
bill-to: &id001
    given  : Chris
    family : Dumars
    address:
        lines: |
            458 Walkman Dr.
            Suite #292
        city    : Royal Oak
        state   : MI
        postal  : 48046
ship-to: *id001
product:
    - sku         : BL394D
      quantity    : 4
      description : Basketball
      price       : 450.00
    - sku         : BL4438H
      quantity    : 1
      description : Super Hoop
      price       : 2392.00
tax  : 251.42
total: 4443.52
comments:
    Late afternoon is best.
    Backup contact is Nancy
    Billsmer @ 338-4338.

# Another full-length example
---
Time: 2001-11-23 15:01:42 -5
User: ed
Warning:
  This is an error message
  for the log file
---
Time: 2001-11-23 15:02:31 -5
User: ed
Warning:
  A slightly different error
  message.
---
Date: 2001-11-23 15:03:17 -5
User: ed
Fatal:
  Unknown variable "bar"
Stack:
  - file: TopClass.py
    line: 23
    code: |
      x = MoreObject("345\n")
  - file: MoreClass.py
    line: 58
    code: |-
      foo = bar

PyYAML-6.0.1/examples/pygments-lexer/yaml.py
============================================

"""
yaml.py

Lexer for YAML, a human-friendly data serialization language
(http://yaml.org/).

Written by Kirill Simonov.

License: Whatever suitable for inclusion into the Pygments package.
""" from pygments.lexer import \ ExtendedRegexLexer, LexerContext, include, bygroups from pygments.token import \ Text, Comment, Punctuation, Name, Literal __all__ = ['YAMLLexer'] class YAMLLexerContext(LexerContext): """Indentation context for the YAML lexer.""" def __init__(self, *args, **kwds): super(YAMLLexerContext, self).__init__(*args, **kwds) self.indent_stack = [] self.indent = -1 self.next_indent = 0 self.block_scalar_indent = None def something(TokenClass): """Do not produce empty tokens.""" def callback(lexer, match, context): text = match.group() if not text: return yield match.start(), TokenClass, text context.pos = match.end() return callback def reset_indent(TokenClass): """Reset the indentation levels.""" def callback(lexer, match, context): text = match.group() context.indent_stack = [] context.indent = -1 context.next_indent = 0 context.block_scalar_indent = None yield match.start(), TokenClass, text context.pos = match.end() return callback def save_indent(TokenClass, start=False): """Save a possible indentation level.""" def callback(lexer, match, context): text = match.group() extra = '' if start: context.next_indent = len(text) if context.next_indent < context.indent: while context.next_indent < context.indent: context.indent = context.indent_stack.pop() if context.next_indent > context.indent: extra = text[context.indent:] text = text[:context.indent] else: context.next_indent += len(text) if text: yield match.start(), TokenClass, text if extra: yield match.start()+len(text), TokenClass.Error, extra context.pos = match.end() return callback def set_indent(TokenClass, implicit=False): """Set the previously saved indentation level.""" def callback(lexer, match, context): text = match.group() if context.indent < context.next_indent: context.indent_stack.append(context.indent) context.indent = context.next_indent if not implicit: context.next_indent += len(text) yield match.start(), TokenClass, text context.pos = match.end() return callback def 
set_block_scalar_indent(TokenClass): """Set an explicit indentation level for a block scalar.""" def callback(lexer, match, context): text = match.group() context.block_scalar_indent = None if not text: return increment = match.group(1) if increment: current_indent = max(context.indent, 0) increment = int(increment) context.block_scalar_indent = current_indent + increment if text: yield match.start(), TokenClass, text context.pos = match.end() return callback def parse_block_scalar_empty_line(IndentTokenClass, ContentTokenClass): """Process an empty line in a block scalar.""" def callback(lexer, match, context): text = match.group() if (context.block_scalar_indent is None or len(text) <= context.block_scalar_indent): if text: yield match.start(), IndentTokenClass, text else: indentation = text[:context.block_scalar_indent] content = text[context.block_scalar_indent:] yield match.start(), IndentTokenClass, indentation yield (match.start()+context.block_scalar_indent, ContentTokenClass, content) context.pos = match.end() return callback def parse_block_scalar_indent(TokenClass): """Process indentation spaces in a block scalar.""" def callback(lexer, match, context): text = match.group() if context.block_scalar_indent is None: if len(text) <= max(context.indent, 0): context.stack.pop() context.stack.pop() return context.block_scalar_indent = len(text) else: if len(text) < context.block_scalar_indent: context.stack.pop() context.stack.pop() return if text: yield match.start(), TokenClass, text context.pos = match.end() return callback def parse_plain_scalar_indent(TokenClass): """Process indentation spaces in a plain scalar.""" def callback(lexer, match, context): text = match.group() if len(text) <= context.indent: context.stack.pop() context.stack.pop() return if text: yield match.start(), TokenClass, text context.pos = match.end() return callback class YAMLLexer(ExtendedRegexLexer): """Lexer for the YAML language.""" name = 'YAML' aliases = ['yaml'] filenames = 
['*.yaml', '*.yml'] mimetypes = ['text/x-yaml'] tokens = { # the root rules 'root': [ # ignored whitespaces (r'[ ]+(?=#|$)', Text.Blank), # line breaks (r'\n+', Text.Break), # a comment (r'#[^\n]*', Comment.Single), # the '%YAML' directive (r'^%YAML(?=[ ]|$)', reset_indent(Name.Directive), 'yaml-directive'), # the %TAG directive (r'^%TAG(?=[ ]|$)', reset_indent(Name.Directive), 'tag-directive'), # document start and document end indicators (r'^(?:---|\.\.\.)(?=[ ]|$)', reset_indent(Punctuation.Document), 'block-line'), # indentation spaces (r'[ ]*(?![ \t\n\r\f\v]|$)', save_indent(Text.Indent, start=True), ('block-line', 'indentation')), ], # trailing whitespaces after directives or a block scalar indicator 'ignored-line': [ # ignored whitespaces (r'[ ]+(?=#|$)', Text.Blank), # a comment (r'#[^\n]*', Comment.Single), # line break (r'\n', Text.Break, '#pop:2'), ], # the %YAML directive 'yaml-directive': [ # the version number (r'([ ]+)([0-9]+\.[0-9]+)', bygroups(Text.Blank, Literal.Version), 'ignored-line'), ], # the %YAG directive 'tag-directive': [ # a tag handle and the corresponding prefix (r'([ ]+)(!|![0-9A-Za-z_-]*!)' r'([ ]+)(!|!?[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)', bygroups(Text.Blank, Name.Type, Text.Blank, Name.Type), 'ignored-line'), ], # block scalar indicators and indentation spaces 'indentation': [ # trailing whitespaces are ignored (r'[ ]*$', something(Text.Blank), '#pop:2'), # whitespaces preceding block collection indicators (r'[ ]+(?=[?:-](?:[ ]|$))', save_indent(Text.Indent)), # block collection indicators (r'[?:-](?=[ ]|$)', set_indent(Punctuation.Indicator)), # the beginning a block line (r'[ ]*', save_indent(Text.Indent), '#pop'), ], # an indented line in the block context 'block-line': [ # the line end (r'[ ]*(?=#|$)', something(Text.Blank), '#pop'), # whitespaces separating tokens (r'[ ]+', Text.Blank), # tags, anchors and aliases, include('descriptors'), # block collections and scalars include('block-nodes'), # flow collections and quoted 
scalars include('flow-nodes'), # a plain scalar (r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`-]|[?:-][^ \t\n\r\f\v])', something(Literal.Scalar.Plain), 'plain-scalar-in-block-context'), ], # tags, anchors, aliases 'descriptors' : [ # a full-form tag (r'!<[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+>', Name.Type), # a tag in the form '!', '!suffix' or '!handle!suffix' (r'!(?:[0-9A-Za-z_-]+)?' r'(?:![0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)?', Name.Type), # an anchor (r'&[0-9A-Za-z_-]+', Name.Anchor), # an alias (r'\*[0-9A-Za-z_-]+', Name.Alias), ], # block collections and scalars 'block-nodes': [ # implicit key (r':(?=[ ]|$)', set_indent(Punctuation.Indicator, implicit=True)), # literal and folded scalars (r'[|>]', Punctuation.Indicator, ('block-scalar-content', 'block-scalar-header')), ], # flow collections and quoted scalars 'flow-nodes': [ # a flow sequence (r'\[', Punctuation.Indicator, 'flow-sequence'), # a flow mapping (r'\{', Punctuation.Indicator, 'flow-mapping'), # a single-quoted scalar (r'\'', Literal.Scalar.Flow.Quote, 'single-quoted-scalar'), # a double-quoted scalar (r'\"', Literal.Scalar.Flow.Quote, 'double-quoted-scalar'), ], # the content of a flow collection 'flow-collection': [ # whitespaces (r'[ ]+', Text.Blank), # line breaks (r'\n+', Text.Break), # a comment (r'#[^\n]*', Comment.Single), # simple indicators (r'[?:,]', Punctuation.Indicator), # tags, anchors and aliases include('descriptors'), # nested collections and quoted scalars include('flow-nodes'), # a plain scalar (r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`])', something(Literal.Scalar.Plain), 'plain-scalar-in-flow-context'), ], # a flow sequence indicated by '[' and ']' 'flow-sequence': [ # include flow collection rules include('flow-collection'), # the closing indicator (r'\]', Punctuation.Indicator, '#pop'), ], # a flow mapping indicated by '{' and '}' 'flow-mapping': [ # include flow collection rules include('flow-collection'), # the closing indicator (r'\}', Punctuation.Indicator, '#pop'), ], # block 
scalar lines 'block-scalar-content': [ # line break (r'\n', Text.Break), # empty line (r'^[ ]+$', parse_block_scalar_empty_line(Text.Indent, Literal.Scalar.Block)), # indentation spaces (we may leave the state here) (r'^[ ]*', parse_block_scalar_indent(Text.Indent)), # line content (r'[^\n\r\f\v]+', Literal.Scalar.Block), ], # the content of a literal or folded scalar 'block-scalar-header': [ # indentation indicator followed by chomping flag (r'([1-9])?[+-]?(?=[ ]|$)', set_block_scalar_indent(Punctuation.Indicator), 'ignored-line'), # chomping flag followed by indentation indicator (r'[+-]?([1-9])?(?=[ ]|$)', set_block_scalar_indent(Punctuation.Indicator), 'ignored-line'), ], # ignored and regular whitespaces in quoted scalars 'quoted-scalar-whitespaces': [ # leading and trailing whitespaces are ignored (r'^[ ]+|[ ]+$', Text.Blank), # line breaks are ignored (r'\n+', Text.Break), # other whitespaces are a part of the value (r'[ ]+', Literal.Scalar.Flow), ], # single-quoted scalars 'single-quoted-scalar': [ # include whitespace and line break rules include('quoted-scalar-whitespaces'), # escaping of the quote character (r'\'\'', Literal.Scalar.Flow.Escape), # regular non-whitespace characters (r'[^ \t\n\r\f\v\']+', Literal.Scalar.Flow), # the closing quote (r'\'', Literal.Scalar.Flow.Quote, '#pop'), ], # double-quoted scalars 'double-quoted-scalar': [ # include whitespace and line break rules include('quoted-scalar-whitespaces'), # escaping of special characters (r'\\[0abt\tn\nvfre "\\N_LP]', Literal.Scalar.Flow.Escape), # escape codes (r'\\(?:x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|U[0-9A-Fa-f]{8})', Literal.Scalar.Flow.Escape), # regular non-whitespace characters (r'[^ \t\n\r\f\v\"\\]+', Literal.Scalar.Flow), # the closing quote (r'"', Literal.Scalar.Flow.Quote, '#pop'), ], # the beginning of a new line while scanning a plain scalar 'plain-scalar-in-block-context-new-line': [ # empty lines (r'^[ ]+$', Text.Blank), # line breaks (r'\n+', Text.Break), # document start and 
document end indicators (r'^(?=---|\.\.\.)', something(Punctuation.Document), '#pop:3'), # indentation spaces (we may leave the block line state here) (r'^[ ]*', parse_plain_scalar_indent(Text.Indent), '#pop'), ], # a plain scalar in the block context 'plain-scalar-in-block-context': [ # the scalar ends with the ':' indicator (r'[ ]*(?=:[ ]|:$)', something(Text.Blank), '#pop'), # the scalar ends with whitespaces followed by a comment (r'[ ]+(?=#)', Text.Blank, '#pop'), # trailing whitespaces are ignored (r'[ ]+$', Text.Blank), # line breaks are ignored (r'\n+', Text.Break, 'plain-scalar-in-block-context-new-line'), # other whitespaces are a part of the value (r'[ ]+', Literal.Scalar.Plain), # regular non-whitespace characters (r'(?::(?![ \t\n\r\f\v])|[^ \t\n\r\f\v:])+', Literal.Scalar.Plain), ], # a plain scalar is the flow context 'plain-scalar-in-flow-context': [ # the scalar ends with an indicator character (r'[ ]*(?=[,:?\[\]{}])', something(Text.Blank), '#pop'), # the scalar ends with a comment (r'[ ]+(?=#)', Text.Blank, '#pop'), # leading and trailing whitespaces are ignored (r'^[ ]+|[ ]+$', Text.Blank), # line breaks are ignored (r'\n+', Text.Break), # other whitespaces are a part of the value (r'[ ]+', Literal.Scalar.Plain), # regular non-whitespace characters (r'[^ \t\n\r\f\v,:?\[\]{}]+', Literal.Scalar.Plain), ], } def get_tokens_unprocessed(self, text=None, context=None): if context is None: context = YAMLLexerContext(text, 0) return super(YAMLLexer, self).get_tokens_unprocessed(text, context) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1689637208.6602278 PyYAML-6.0.1/examples/yaml-highlight/0000755000175100001730000000000014455350531016724 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/examples/yaml-highlight/yaml_hl.cfg0000644000175100001730000001060514455350511021032 0ustar00runnerdocker%YAML 1.1 --- ascii: header: "\e[0;1;30;40m" 
footer: "\e[0m" tokens: stream-start: stream-end: directive: { start: "\e[35m", end: "\e[0;1;30;40m" } document-start: { start: "\e[35m", end: "\e[0;1;30;40m" } document-end: { start: "\e[35m", end: "\e[0;1;30;40m" } block-sequence-start: block-mapping-start: block-end: flow-sequence-start: { start: "\e[33m", end: "\e[0;1;30;40m" } flow-mapping-start: { start: "\e[33m", end: "\e[0;1;30;40m" } flow-sequence-end: { start: "\e[33m", end: "\e[0;1;30;40m" } flow-mapping-end: { start: "\e[33m", end: "\e[0;1;30;40m" } key: { start: "\e[33m", end: "\e[0;1;30;40m" } value: { start: "\e[33m", end: "\e[0;1;30;40m" } block-entry: { start: "\e[33m", end: "\e[0;1;30;40m" } flow-entry: { start: "\e[33m", end: "\e[0;1;30;40m" } alias: { start: "\e[32m", end: "\e[0;1;30;40m" } anchor: { start: "\e[32m", end: "\e[0;1;30;40m" } tag: { start: "\e[32m", end: "\e[0;1;30;40m" } scalar: { start: "\e[36m", end: "\e[0;1;30;40m" } replaces: - "\r\n": "\n" - "\r": "\n" - "\n": "\n" - "\x85": "\n" - "\u2028": "\n" - "\u2029": "\n" html: &html tokens: stream-start: stream-end: directive: { start: , end: } document-start: { start: , end: } document-end: { start: , end: } block-sequence-start: block-mapping-start: block-end: flow-sequence-start: { start: , end: } flow-mapping-start: { start: , end: } flow-sequence-end: { start: , end: } flow-mapping-end: { start: , end: } key: { start: , end: } value: { start: , end: } block-entry: { start: , end: } flow-entry: { start: , end: } alias: { start: , end: } anchor: { start: , end: } tag: { start: , end: } scalar: { start: , end: } events: stream-start: { start:
 }
        stream-end:     { end: 
} document-start: { start: } document-end: { end: } sequence-start: { start: } sequence-end: { end: } mapping-start: { start: } mapping-end: { end: } scalar: { start: , end: } replaces: - "\r\n": "\n" - "\r": "\n" - "\n": "\n" - "\x85": "\n" - "\u2028": "\n" - "\u2029": "\n" - "&": "&" - "<": "<" - ">": ">" html-page: header: | A YAML stream footer: | <<: *html # vim: ft=yaml 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/examples/yaml-highlight/yaml_hl.py0000755000175100001730000001052214455350511020724 0ustar00runnerdocker
#!/usr/bin/python

import yaml, codecs, sys, os.path, optparse

class Style:

    def __init__(self, header=None, footer=None,
            tokens=None, events=None, replaces=None):
        self.header = header
        self.footer = footer
        self.replaces = replaces
        self.substitutions = {}
        for domain, Class in [(tokens, 'Token'), (events, 'Event')]:
            if not domain:
                continue
            for key in domain:
                name = ''.join([part.capitalize() for part in key.split('-')])
                cls = getattr(yaml, '%s%s' % (name, Class))
                value = domain[key]
                if not value:
                    continue
                start = value.get('start')
                end = value.get('end')
                if start:
                    self.substitutions[cls, -1] = start
                if end:
                    self.substitutions[cls, +1] = end

    def __setstate__(self, state):
        self.__init__(**state)

yaml.add_path_resolver('tag:yaml.org,2002:python/object:__main__.Style',
        [None], dict)
yaml.add_path_resolver('tag:yaml.org,2002:pairs',
        [None, 'replaces'], list)

class YAMLHighlight:

    def __init__(self, options):
        # Python 3 has no file() builtin; read the config with open().
        with open(options.config, 'rb') as config_file:
            config = yaml.full_load(config_file.read())
        self.style = config[options.style]
        if options.input:
            self.input = open(options.input, 'rb')
        else:
            # Read raw bytes from stdin so the BOM check below works.
            self.input = sys.stdin.buffer
        if options.output:
            self.output = open(options.output, 'w')
        else:
            self.output = sys.stdout

    def highlight(self):
        input = self.input.read()
        # Decode explicitly; Python 3 has no unicode() builtin.
        if input.startswith(codecs.BOM_UTF16_LE):
            input = input.decode('utf-16-le')
        elif input.startswith(codecs.BOM_UTF16_BE):
            input = input.decode('utf-16-be')
        else:
            input = input.decode('utf-8')
        substitutions = self.style.substitutions
        tokens = yaml.scan(input)
        events = yaml.parse(input)
        markers = []
        number = 0
        for token in tokens:
            number += 1
            if token.start_mark.index != token.end_mark.index:
                cls = token.__class__
                if (cls, -1) in substitutions:
                    markers.append([token.start_mark.index, +2, number,
                            substitutions[cls, -1]])
                if (cls, +1) in substitutions:
                    markers.append([token.end_mark.index, -2, number,
                            substitutions[cls, +1]])
        number = 0
        for event in events:
            number += 1
            cls = event.__class__
            if (cls, -1) in substitutions:
                markers.append([event.start_mark.index, +1, number,
                        substitutions[cls, -1]])
            if (cls, +1) in substitutions:
                markers.append([event.end_mark.index, -1, number,
                        substitutions[cls, +1]])
        markers.sort()
        markers.reverse()
        chunks = []
        position = len(input)
        for index, weight1, weight2, substitution in markers:
            if index < position:
                chunk = input[index:position]
                for substring, replacement in self.style.replaces:
                    chunk = chunk.replace(substring, replacement)
                chunks.append(chunk)
                position = index
            chunks.append(substitution)
        chunks.reverse()
        result = ''.join(chunks)
        if self.style.header:
            self.output.write(self.style.header)
        # The output stream is in text mode, so write the str directly.
        self.output.write(result)
        if self.style.footer:
            self.output.write(self.style.footer)

if __name__ == '__main__':
    parser = optparse.OptionParser()
    parser.add_option('-s', '--style', dest='style', default='ascii',
            help="specify the highlighting style", metavar='STYLE')
    parser.add_option('-c', '--config', dest='config',
            default=os.path.join(os.path.dirname(sys.argv[0]), 'yaml_hl.cfg'),
            help="set an alternative configuration file", metavar='CONFIG')
    parser.add_option('-i', '--input', dest='input', default=None,
            help="set the input file (default: stdin)", metavar='FILE')
    parser.add_option('-o', '--output', dest='output', default=None,
            help="set the output file (default: stdout)", metavar='FILE')
    (options, args) = parser.parse_args()
    hl = YAMLHighlight(options)
    hl.highlight()
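The highlighter above works by splicing markup into the original text at the `start_mark`/`end_mark` indices that PyYAML attaches to every token and event. A minimal sketch of that mechanism, using the standard `yaml` package API (the sample document is invented for illustration):

```python
import yaml

DOCUMENT = "- name: example\n  value: 1\n"

# yaml.scan() yields the scanner's tokens; each token carries start_mark and
# end_mark objects whose .index fields are character offsets into the input.
# yaml_hl.py uses exactly these offsets to decide where to insert ANSI/HTML
# markup around each token.
for token in yaml.scan(DOCUMENT):
    start, end = token.start_mark.index, token.end_mark.index
    if start != end:  # skip zero-width tokens such as StreamStartToken
        print(type(token).__name__, repr(DOCUMENT[start:end]))
```

`yaml.parse()` provides the same marks on events, which is why the script can layer token-level and event-level substitutions over one pass through the input.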
././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1689637208.6562278 PyYAML-6.0.1/lib/0000755000175100001730000000000014455350531012745 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1689637208.6602278 PyYAML-6.0.1/lib/PyYAML.egg-info/0000755000175100001730000000000014455350531015512 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637208.0 PyYAML-6.0.1/lib/PyYAML.egg-info/PKG-INFO0000644000175100001730000000401214455350530016603 0ustar00runnerdockerMetadata-Version: 2.1 Name: PyYAML Version: 6.0.1 Summary: YAML parser and emitter for Python Home-page: https://pyyaml.org/ Download-URL: https://pypi.org/project/PyYAML/ Author: Kirill Simonov Author-email: xi@resolvent.net License: MIT Project-URL: Bug Tracker, https://github.com/yaml/pyyaml/issues Project-URL: CI, https://github.com/yaml/pyyaml/actions Project-URL: Documentation, https://pyyaml.org/wiki/PyYAMLDocumentation Project-URL: Mailing lists, http://lists.sourceforge.net/lists/listinfo/yaml-core Project-URL: Source Code, https://github.com/yaml/pyyaml Platform: Any Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Cython Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Classifier: Topic :: Software Development :: 
Libraries :: Python Modules Classifier: Topic :: Text Processing :: Markup Requires-Python: >=3.6 License-File: LICENSE YAML is a data serialization format designed for human readability and interaction with scripting languages. PyYAML is a YAML parser and emitter for Python. PyYAML features a complete YAML 1.1 parser, Unicode support, pickle support, capable extension API, and sensible error messages. PyYAML supports standard YAML tags and provides Python-specific tags that allow to represent an arbitrary Python object. PyYAML is applicable for a broad range of tasks from complex configuration files to object serialization and persistence. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637208.0 PyYAML-6.0.1/lib/PyYAML.egg-info/SOURCES.txt0000644000175100001730000005147714455350530017413 0ustar00runnerdockerCHANGES LICENSE MANIFEST.in Makefile README.md pyproject.toml setup.py examples/pygments-lexer/example.yaml examples/pygments-lexer/yaml.py examples/yaml-highlight/yaml_hl.cfg examples/yaml-highlight/yaml_hl.py lib/PyYAML.egg-info/PKG-INFO lib/PyYAML.egg-info/SOURCES.txt lib/PyYAML.egg-info/dependency_links.txt lib/PyYAML.egg-info/top_level.txt lib/_yaml/__init__.py lib/yaml/__init__.py lib/yaml/composer.py lib/yaml/constructor.py lib/yaml/cyaml.py lib/yaml/dumper.py lib/yaml/emitter.py lib/yaml/error.py lib/yaml/events.py lib/yaml/loader.py lib/yaml/nodes.py lib/yaml/parser.py lib/yaml/reader.py lib/yaml/representer.py lib/yaml/resolver.py lib/yaml/scanner.py lib/yaml/serializer.py lib/yaml/tokens.py tests/data/a-nasty-libyaml-bug.loader-error tests/data/aliases-cdumper-bug.code tests/data/aliases.events tests/data/bool.data tests/data/bool.detect tests/data/construct-binary-py2.code tests/data/construct-binary-py2.data tests/data/construct-binary-py3.code tests/data/construct-binary-py3.data tests/data/construct-bool.code tests/data/construct-bool.data tests/data/construct-custom.code tests/data/construct-custom.data 
tests/data/construct-float.code tests/data/construct-float.data tests/data/construct-int.code tests/data/construct-int.data tests/data/construct-map.code tests/data/construct-map.data tests/data/construct-merge.code tests/data/construct-merge.data tests/data/construct-null.code tests/data/construct-null.data tests/data/construct-omap.code tests/data/construct-omap.data tests/data/construct-pairs.code tests/data/construct-pairs.data tests/data/construct-python-bool.code tests/data/construct-python-bool.data tests/data/construct-python-bytes-py3.code tests/data/construct-python-bytes-py3.data tests/data/construct-python-complex.code tests/data/construct-python-complex.data tests/data/construct-python-float.code tests/data/construct-python-float.data tests/data/construct-python-int.code tests/data/construct-python-int.data tests/data/construct-python-long-short-py2.code tests/data/construct-python-long-short-py2.data tests/data/construct-python-long-short-py3.code tests/data/construct-python-long-short-py3.data tests/data/construct-python-name-module.code tests/data/construct-python-name-module.data tests/data/construct-python-none.code tests/data/construct-python-none.data tests/data/construct-python-object.code tests/data/construct-python-object.data tests/data/construct-python-str-ascii.code tests/data/construct-python-str-ascii.data tests/data/construct-python-str-utf8-py2.code tests/data/construct-python-str-utf8-py2.data tests/data/construct-python-str-utf8-py3.code tests/data/construct-python-str-utf8-py3.data tests/data/construct-python-tuple-list-dict.code tests/data/construct-python-tuple-list-dict.data tests/data/construct-python-unicode-ascii-py2.code tests/data/construct-python-unicode-ascii-py2.data tests/data/construct-python-unicode-ascii-py3.code tests/data/construct-python-unicode-ascii-py3.data tests/data/construct-python-unicode-utf8-py2.code tests/data/construct-python-unicode-utf8-py2.data tests/data/construct-python-unicode-utf8-py3.code 
tests/data/construct-python-unicode-utf8-py3.data tests/data/construct-seq.code tests/data/construct-seq.data tests/data/construct-set.code tests/data/construct-set.data tests/data/construct-str-ascii.code tests/data/construct-str-ascii.data tests/data/construct-str-utf8-py2.code tests/data/construct-str-utf8-py2.data tests/data/construct-str-utf8-py3.code tests/data/construct-str-utf8-py3.data tests/data/construct-str.code tests/data/construct-str.data tests/data/construct-timestamp.code tests/data/construct-timestamp.data tests/data/construct-value.code tests/data/construct-value.data tests/data/document-separator-in-quoted-scalar.loader-error tests/data/documents.events tests/data/duplicate-anchor-1.loader-error tests/data/duplicate-anchor-2.loader-error tests/data/duplicate-key.former-loader-error.code tests/data/duplicate-key.former-loader-error.data tests/data/duplicate-mapping-key.former-loader-error.code tests/data/duplicate-mapping-key.former-loader-error.data tests/data/duplicate-merge-key.former-loader-error.code tests/data/duplicate-merge-key.former-loader-error.data tests/data/duplicate-tag-directive.loader-error tests/data/duplicate-value-key.former-loader-error.code tests/data/duplicate-value-key.former-loader-error.data tests/data/duplicate-yaml-directive.loader-error tests/data/emit-block-scalar-in-simple-key-context-bug.canonical tests/data/emit-block-scalar-in-simple-key-context-bug.data tests/data/emitting-unacceptable-unicode-character-bug-py3.code tests/data/emitting-unacceptable-unicode-character-bug-py3.data tests/data/emitting-unacceptable-unicode-character-bug-py3.skip-ext tests/data/emitting-unacceptable-unicode-character-bug.code tests/data/emitting-unacceptable-unicode-character-bug.data tests/data/emitting-unacceptable-unicode-character-bug.skip-ext tests/data/emoticons.unicode tests/data/emoticons2.unicode tests/data/empty-anchor.emitter-error tests/data/empty-document-bug.canonical tests/data/empty-document-bug.data 
tests/data/empty-document-bug.empty tests/data/empty-documents.single-loader-error tests/data/empty-python-module.loader-error tests/data/empty-python-name.loader-error tests/data/empty-tag-handle.emitter-error tests/data/empty-tag-prefix.emitter-error tests/data/empty-tag.emitter-error tests/data/expected-document-end.emitter-error tests/data/expected-document-start.emitter-error tests/data/expected-mapping.loader-error tests/data/expected-node-1.emitter-error tests/data/expected-node-2.emitter-error tests/data/expected-nothing.emitter-error tests/data/expected-scalar.loader-error tests/data/expected-sequence.loader-error tests/data/expected-stream-start.emitter-error tests/data/explicit-document.single-loader-error tests/data/fetch-complex-value-bug.loader-error tests/data/float-representer-2.3-bug.code tests/data/float-representer-2.3-bug.data tests/data/float.data tests/data/float.detect tests/data/forbidden-entry.loader-error tests/data/forbidden-key.loader-error tests/data/forbidden-value.loader-error tests/data/implicit-document.single-loader-error tests/data/int.data tests/data/int.detect tests/data/invalid-anchor-1.loader-error tests/data/invalid-anchor-2.loader-error tests/data/invalid-anchor.emitter-error tests/data/invalid-base64-data-2.loader-error tests/data/invalid-base64-data.loader-error tests/data/invalid-block-scalar-indicator.loader-error tests/data/invalid-character.loader-error tests/data/invalid-character.stream-error tests/data/invalid-directive-line.loader-error tests/data/invalid-directive-name-1.loader-error tests/data/invalid-directive-name-2.loader-error tests/data/invalid-escape-character.loader-error tests/data/invalid-escape-numbers.loader-error tests/data/invalid-indentation-indicator-1.loader-error tests/data/invalid-indentation-indicator-2.loader-error tests/data/invalid-item-without-trailing-break.loader-error tests/data/invalid-merge-1.loader-error tests/data/invalid-merge-2.loader-error tests/data/invalid-omap-1.loader-error 
tests/data/invalid-omap-2.loader-error tests/data/invalid-omap-3.loader-error tests/data/invalid-pairs-1.loader-error tests/data/invalid-pairs-2.loader-error tests/data/invalid-pairs-3.loader-error tests/data/invalid-python-bytes-2-py3.loader-error tests/data/invalid-python-bytes-py3.loader-error tests/data/invalid-python-module-kind.loader-error tests/data/invalid-python-module-value.loader-error tests/data/invalid-python-module.loader-error tests/data/invalid-python-name-kind.loader-error tests/data/invalid-python-name-module.loader-error tests/data/invalid-python-name-object.loader-error tests/data/invalid-python-name-value.loader-error tests/data/invalid-simple-key.loader-error tests/data/invalid-single-quote-bug.code tests/data/invalid-single-quote-bug.data tests/data/invalid-starting-character.loader-error tests/data/invalid-tag-1.loader-error tests/data/invalid-tag-2.loader-error tests/data/invalid-tag-directive-handle.loader-error tests/data/invalid-tag-directive-prefix.loader-error tests/data/invalid-tag-handle-1.emitter-error tests/data/invalid-tag-handle-1.loader-error tests/data/invalid-tag-handle-2.emitter-error tests/data/invalid-tag-handle-2.loader-error tests/data/invalid-uri-escapes-1.loader-error tests/data/invalid-uri-escapes-2.loader-error tests/data/invalid-uri-escapes-3.loader-error tests/data/invalid-uri.loader-error tests/data/invalid-utf8-byte.loader-error tests/data/invalid-utf8-byte.stream-error tests/data/invalid-yaml-directive-version-1.loader-error tests/data/invalid-yaml-directive-version-2.loader-error tests/data/invalid-yaml-directive-version-3.loader-error tests/data/invalid-yaml-directive-version-4.loader-error tests/data/invalid-yaml-directive-version-5.loader-error tests/data/invalid-yaml-directive-version-6.loader-error tests/data/invalid-yaml-version.loader-error tests/data/latin.unicode tests/data/mapping.sort tests/data/mapping.sorted tests/data/mappings.events tests/data/merge.data tests/data/merge.detect 
tests/data/more-floats.code tests/data/more-floats.data tests/data/multi-constructor.code tests/data/multi-constructor.multi tests/data/myfullloader.subclass_blacklist tests/data/negative-float-bug.code tests/data/negative-float-bug.data tests/data/no-alias-anchor.emitter-error tests/data/no-alias-anchor.skip-ext tests/data/no-block-collection-end.loader-error tests/data/no-block-mapping-end-2.loader-error tests/data/no-block-mapping-end.loader-error tests/data/no-document-start.loader-error tests/data/no-flow-mapping-end.loader-error tests/data/no-flow-sequence-end.loader-error tests/data/no-node-1.loader-error tests/data/no-node-2.loader-error tests/data/no-tag.emitter-error tests/data/null.data tests/data/null.detect tests/data/odd-utf16.stream-error tests/data/overwrite-state-new-constructor.loader-error tests/data/recursive-anchor.former-loader-error tests/data/recursive-dict.recursive tests/data/recursive-list.recursive tests/data/recursive-set.recursive tests/data/recursive-state.recursive tests/data/recursive-tuple.recursive tests/data/recursive.former-dumper-error tests/data/remove-possible-simple-key-bug.loader-error tests/data/resolver.data tests/data/resolver.path tests/data/run-parser-crash-bug.data tests/data/scalars.events tests/data/scan-document-end-bug.canonical tests/data/scan-document-end-bug.data tests/data/scan-line-break-bug.canonical tests/data/scan-line-break-bug.data tests/data/sequences.events tests/data/serializer-is-already-opened.dumper-error tests/data/serializer-is-closed-1.dumper-error tests/data/serializer-is-closed-2.dumper-error tests/data/serializer-is-not-opened-1.dumper-error tests/data/serializer-is-not-opened-2.dumper-error tests/data/single-dot-is-not-float-bug.code tests/data/single-dot-is-not-float-bug.data tests/data/sloppy-indentation.canonical tests/data/sloppy-indentation.data tests/data/spec-02-01.data tests/data/spec-02-01.structure tests/data/spec-02-01.tokens tests/data/spec-02-02.data 
tests/data/spec-02-02.structure tests/data/spec-02-02.tokens tests/data/spec-02-03.data tests/data/spec-02-03.structure tests/data/spec-02-03.tokens tests/data/spec-02-04.data tests/data/spec-02-04.structure tests/data/spec-02-04.tokens tests/data/spec-02-05.data tests/data/spec-02-05.structure tests/data/spec-02-05.tokens tests/data/spec-02-06.data tests/data/spec-02-06.structure tests/data/spec-02-06.tokens tests/data/spec-02-07.data tests/data/spec-02-07.structure tests/data/spec-02-07.tokens tests/data/spec-02-08.data tests/data/spec-02-08.structure tests/data/spec-02-08.tokens tests/data/spec-02-09.data tests/data/spec-02-09.structure tests/data/spec-02-09.tokens tests/data/spec-02-10.data tests/data/spec-02-10.structure tests/data/spec-02-10.tokens tests/data/spec-02-11.data tests/data/spec-02-11.structure tests/data/spec-02-11.tokens tests/data/spec-02-12.data tests/data/spec-02-12.structure tests/data/spec-02-12.tokens tests/data/spec-02-13.data tests/data/spec-02-13.structure tests/data/spec-02-13.tokens tests/data/spec-02-14.data tests/data/spec-02-14.structure tests/data/spec-02-14.tokens tests/data/spec-02-15.data tests/data/spec-02-15.structure tests/data/spec-02-15.tokens tests/data/spec-02-16.data tests/data/spec-02-16.structure tests/data/spec-02-16.tokens tests/data/spec-02-17.data tests/data/spec-02-17.structure tests/data/spec-02-17.tokens tests/data/spec-02-18.data tests/data/spec-02-18.structure tests/data/spec-02-18.tokens tests/data/spec-02-19.data tests/data/spec-02-19.structure tests/data/spec-02-19.tokens tests/data/spec-02-20.data tests/data/spec-02-20.structure tests/data/spec-02-20.tokens tests/data/spec-02-21.data tests/data/spec-02-21.structure tests/data/spec-02-21.tokens tests/data/spec-02-22.data tests/data/spec-02-22.structure tests/data/spec-02-22.tokens tests/data/spec-02-23.data tests/data/spec-02-23.structure tests/data/spec-02-23.tokens tests/data/spec-02-24.data tests/data/spec-02-24.structure tests/data/spec-02-24.tokens 
tests/data/spec-02-25.data tests/data/spec-02-25.structure tests/data/spec-02-25.tokens tests/data/spec-02-26.data tests/data/spec-02-26.structure tests/data/spec-02-26.tokens tests/data/spec-02-27.data tests/data/spec-02-27.structure tests/data/spec-02-27.tokens tests/data/spec-02-28.data tests/data/spec-02-28.structure tests/data/spec-02-28.tokens tests/data/spec-05-01-utf16be.data tests/data/spec-05-01-utf16be.empty tests/data/spec-05-01-utf16le.data tests/data/spec-05-01-utf16le.empty tests/data/spec-05-01-utf8.data tests/data/spec-05-01-utf8.empty tests/data/spec-05-02-utf16be.data tests/data/spec-05-02-utf16be.error tests/data/spec-05-02-utf16le.data tests/data/spec-05-02-utf16le.error tests/data/spec-05-02-utf8.data tests/data/spec-05-02-utf8.error tests/data/spec-05-03.canonical tests/data/spec-05-03.data tests/data/spec-05-04.canonical tests/data/spec-05-04.data tests/data/spec-05-05.data tests/data/spec-05-05.empty tests/data/spec-05-06.canonical tests/data/spec-05-06.data tests/data/spec-05-07.canonical tests/data/spec-05-07.data tests/data/spec-05-08.canonical tests/data/spec-05-08.data tests/data/spec-05-09.canonical tests/data/spec-05-09.data tests/data/spec-05-10.data tests/data/spec-05-10.error tests/data/spec-05-11.canonical tests/data/spec-05-11.data tests/data/spec-05-12.data tests/data/spec-05-12.error tests/data/spec-05-13.canonical tests/data/spec-05-13.data tests/data/spec-05-14.canonical tests/data/spec-05-14.data tests/data/spec-05-15.data tests/data/spec-05-15.error tests/data/spec-06-01.canonical tests/data/spec-06-01.data tests/data/spec-06-02.data tests/data/spec-06-02.empty tests/data/spec-06-03.canonical tests/data/spec-06-03.data tests/data/spec-06-04.canonical tests/data/spec-06-04.data tests/data/spec-06-05.canonical tests/data/spec-06-05.data tests/data/spec-06-06.canonical tests/data/spec-06-06.data tests/data/spec-06-07.canonical tests/data/spec-06-07.data tests/data/spec-06-08.canonical tests/data/spec-06-08.data 
tests/data/spec-07-01.canonical tests/data/spec-07-01.data tests/data/spec-07-01.skip-ext tests/data/spec-07-02.canonical tests/data/spec-07-02.data tests/data/spec-07-02.skip-ext tests/data/spec-07-03.data tests/data/spec-07-03.error tests/data/spec-07-04.canonical tests/data/spec-07-04.data tests/data/spec-07-05.data tests/data/spec-07-05.error tests/data/spec-07-06.canonical tests/data/spec-07-06.data tests/data/spec-07-07a.canonical tests/data/spec-07-07a.data tests/data/spec-07-07b.canonical tests/data/spec-07-07b.data tests/data/spec-07-08.canonical tests/data/spec-07-08.data tests/data/spec-07-09.canonical tests/data/spec-07-09.data tests/data/spec-07-10.canonical tests/data/spec-07-10.data tests/data/spec-07-11.data tests/data/spec-07-11.empty tests/data/spec-07-12a.canonical tests/data/spec-07-12a.data tests/data/spec-07-12b.canonical tests/data/spec-07-12b.data tests/data/spec-07-13.canonical tests/data/spec-07-13.data tests/data/spec-08-01.canonical tests/data/spec-08-01.data tests/data/spec-08-02.canonical tests/data/spec-08-02.data tests/data/spec-08-03.canonical tests/data/spec-08-03.data tests/data/spec-08-04.data tests/data/spec-08-04.error tests/data/spec-08-05.canonical tests/data/spec-08-05.data tests/data/spec-08-06.data tests/data/spec-08-06.error tests/data/spec-08-07.canonical tests/data/spec-08-07.data tests/data/spec-08-08.canonical tests/data/spec-08-08.data tests/data/spec-08-09.canonical tests/data/spec-08-09.data tests/data/spec-08-10.canonical tests/data/spec-08-10.data tests/data/spec-08-11.canonical tests/data/spec-08-11.data tests/data/spec-08-12.canonical tests/data/spec-08-12.data tests/data/spec-08-13.canonical tests/data/spec-08-13.data tests/data/spec-08-13.skip-ext tests/data/spec-08-14.canonical tests/data/spec-08-14.data tests/data/spec-08-15.canonical tests/data/spec-08-15.data tests/data/spec-09-01.canonical tests/data/spec-09-01.data tests/data/spec-09-02.canonical tests/data/spec-09-02.data 
tests/data/spec-09-03.canonical tests/data/spec-09-03.data tests/data/spec-09-04.canonical tests/data/spec-09-04.data tests/data/spec-09-05.canonical tests/data/spec-09-05.data tests/data/spec-09-06.canonical tests/data/spec-09-06.data tests/data/spec-09-07.canonical tests/data/spec-09-07.data tests/data/spec-09-08.canonical tests/data/spec-09-08.data tests/data/spec-09-09.canonical tests/data/spec-09-09.data tests/data/spec-09-10.canonical tests/data/spec-09-10.data tests/data/spec-09-11.canonical tests/data/spec-09-11.data tests/data/spec-09-12.canonical tests/data/spec-09-12.data tests/data/spec-09-13.canonical tests/data/spec-09-13.data tests/data/spec-09-14.data tests/data/spec-09-14.error tests/data/spec-09-15.canonical tests/data/spec-09-15.data tests/data/spec-09-16.canonical tests/data/spec-09-16.data tests/data/spec-09-17.canonical tests/data/spec-09-17.data tests/data/spec-09-18.canonical tests/data/spec-09-18.data tests/data/spec-09-19.canonical tests/data/spec-09-19.data tests/data/spec-09-20.canonical tests/data/spec-09-20.data tests/data/spec-09-20.skip-ext tests/data/spec-09-21.data tests/data/spec-09-21.error tests/data/spec-09-22.canonical tests/data/spec-09-22.data tests/data/spec-09-23.canonical tests/data/spec-09-23.data tests/data/spec-09-24.canonical tests/data/spec-09-24.data tests/data/spec-09-25.canonical tests/data/spec-09-25.data tests/data/spec-09-26.canonical tests/data/spec-09-26.data tests/data/spec-09-27.canonical tests/data/spec-09-27.data tests/data/spec-09-28.canonical tests/data/spec-09-28.data tests/data/spec-09-29.canonical tests/data/spec-09-29.data tests/data/spec-09-30.canonical tests/data/spec-09-30.data tests/data/spec-09-31.canonical tests/data/spec-09-31.data tests/data/spec-09-32.canonical tests/data/spec-09-32.data tests/data/spec-09-33.canonical tests/data/spec-09-33.data tests/data/spec-10-01.canonical tests/data/spec-10-01.data tests/data/spec-10-02.canonical tests/data/spec-10-02.data 
tests/data/spec-10-03.canonical tests/data/spec-10-03.data tests/data/spec-10-04.canonical tests/data/spec-10-04.data tests/data/spec-10-05.canonical tests/data/spec-10-05.data tests/data/spec-10-06.canonical tests/data/spec-10-06.data tests/data/spec-10-07.canonical tests/data/spec-10-07.data tests/data/spec-10-08.data tests/data/spec-10-08.error tests/data/spec-10-09.canonical tests/data/spec-10-09.data tests/data/spec-10-10.canonical tests/data/spec-10-10.data tests/data/spec-10-11.canonical tests/data/spec-10-11.data tests/data/spec-10-12.canonical tests/data/spec-10-12.data tests/data/spec-10-13.canonical tests/data/spec-10-13.data tests/data/spec-10-14.canonical tests/data/spec-10-14.data tests/data/spec-10-15.canonical tests/data/spec-10-15.data tests/data/str.data tests/data/str.detect tests/data/tags.events tests/data/test_mark.marks tests/data/timestamp-bugs.code tests/data/timestamp-bugs.data tests/data/timestamp.data tests/data/timestamp.detect tests/data/unacceptable-key.loader-error tests/data/unclosed-bracket.loader-error tests/data/unclosed-quoted-scalar.loader-error tests/data/undefined-anchor.loader-error tests/data/undefined-constructor.loader-error tests/data/undefined-tag-handle.loader-error tests/data/unknown.dumper-error tests/data/unsupported-version.emitter-error tests/data/utf16be.code tests/data/utf16be.data tests/data/utf16le.code tests/data/utf16le.data tests/data/utf8-implicit.code tests/data/utf8-implicit.data tests/data/utf8.code tests/data/utf8.data tests/data/value.data tests/data/value.detect tests/data/yaml.data tests/data/yaml.detect tests/data/yaml11.schema tests/data/yaml11.schema-skip tests/lib/canonical.py tests/lib/test_all.py tests/lib/test_appliance.py tests/lib/test_build.py tests/lib/test_build_ext.py tests/lib/test_canonical.py tests/lib/test_constructor.py tests/lib/test_dump_load.py tests/lib/test_emitter.py tests/lib/test_errors.py tests/lib/test_input_output.py tests/lib/test_mark.py 
tests/lib/test_multi_constructor.py tests/lib/test_reader.py tests/lib/test_recursive.py tests/lib/test_representer.py tests/lib/test_resolver.py tests/lib/test_schema.py tests/lib/test_sort_keys.py tests/lib/test_structure.py tests/lib/test_tokens.py tests/lib/test_yaml.py tests/lib/test_yaml_ext.py yaml/__init__.pxd yaml/_yaml.h yaml/_yaml.pxd yaml/_yaml.pyx././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637208.0 PyYAML-6.0.1/lib/PyYAML.egg-info/dependency_links.txt0000644000175100001730000000000114455350530021557 0ustar00runnerdocker ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637208.0 PyYAML-6.0.1/lib/PyYAML.egg-info/top_level.txt0000644000175100001730000000001314455350530020235 0ustar00runnerdocker_yaml yaml ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1689637208.6602278 PyYAML-6.0.1/lib/_yaml/0000755000175100001730000000000014455350531014046 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/lib/_yaml/__init__.py0000644000175100001730000000257214455350511016163 0ustar00runnerdocker# This is a stub package designed to roughly emulate the _yaml # extension module, which previously existed as a standalone module # and has been moved into the `yaml` package namespace. # It does not perfectly mimic its old counterpart, but should get # close enough for anyone who's relying on it even when they shouldn't. 
import yaml

# in some circumstances, the yaml module we imported may be from a different version, so we need
# to tread carefully when poking at it here (it may not have the attributes we expect)
if not getattr(yaml, '__with_libyaml__', False):
    from sys import version_info

    exc = ModuleNotFoundError if version_info >= (3, 6) else ImportError
    raise exc("No module named '_yaml'")
else:
    from yaml._yaml import *
    import warnings
    warnings.warn(
        'The _yaml extension module is now located at yaml._yaml'
        ' and its location is subject to change. To use the'
        ' LibYAML-based parser and emitter, import from `yaml`:'
        ' `from yaml import CLoader as Loader, CDumper as Dumper`.',
        DeprecationWarning
    )
    del warnings

    # Don't `del yaml` here because yaml is actually an existing
    # namespace member of _yaml.
    __name__ = '_yaml'
    # If the module is top-level (i.e. not a part of any specific package)
    # then the attribute should be set to ''.
    # https://docs.python.org/3.8/library/types.html
    __package__ = ''

././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1689637208.6642277 PyYAML-6.0.1/lib/yaml/0000755000175100001730000000000014455350531013707 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/lib/yaml/__init__.py0000644000175100001730000003002714455350511016020 0ustar00runnerdocker

from .error import *
from .tokens import *
from .events import *
from .nodes import *

from .loader import *
from .dumper import *

__version__ = '6.0.1'
try:
    from .cyaml import *
    __with_libyaml__ = True
except ImportError:
    __with_libyaml__ = False

import io

#------------------------------------------------------------------------------
# XXX "Warnings control" is now deprecated. Leaving in the API function to not
# break code that uses it.
#------------------------------------------------------------------------------ def warnings(settings=None): if settings is None: return {} #------------------------------------------------------------------------------ def scan(stream, Loader=Loader): """ Scan a YAML stream and produce scanning tokens. """ loader = Loader(stream) try: while loader.check_token(): yield loader.get_token() finally: loader.dispose() def parse(stream, Loader=Loader): """ Parse a YAML stream and produce parsing events. """ loader = Loader(stream) try: while loader.check_event(): yield loader.get_event() finally: loader.dispose() def compose(stream, Loader=Loader): """ Parse the first YAML document in a stream and produce the corresponding representation tree. """ loader = Loader(stream) try: return loader.get_single_node() finally: loader.dispose() def compose_all(stream, Loader=Loader): """ Parse all YAML documents in a stream and produce corresponding representation trees. """ loader = Loader(stream) try: while loader.check_node(): yield loader.get_node() finally: loader.dispose() def load(stream, Loader): """ Parse the first YAML document in a stream and produce the corresponding Python object. """ loader = Loader(stream) try: return loader.get_single_data() finally: loader.dispose() def load_all(stream, Loader): """ Parse all YAML documents in a stream and produce corresponding Python objects. """ loader = Loader(stream) try: while loader.check_data(): yield loader.get_data() finally: loader.dispose() def full_load(stream): """ Parse the first YAML document in a stream and produce the corresponding Python object. Resolve all tags except those known to be unsafe on untrusted input. """ return load(stream, FullLoader) def full_load_all(stream): """ Parse all YAML documents in a stream and produce corresponding Python objects. Resolve all tags except those known to be unsafe on untrusted input. 
""" return load_all(stream, FullLoader) def safe_load(stream): """ Parse the first YAML document in a stream and produce the corresponding Python object. Resolve only basic YAML tags. This is known to be safe for untrusted input. """ return load(stream, SafeLoader) def safe_load_all(stream): """ Parse all YAML documents in a stream and produce corresponding Python objects. Resolve only basic YAML tags. This is known to be safe for untrusted input. """ return load_all(stream, SafeLoader) def unsafe_load(stream): """ Parse the first YAML document in a stream and produce the corresponding Python object. Resolve all tags, even those known to be unsafe on untrusted input. """ return load(stream, UnsafeLoader) def unsafe_load_all(stream): """ Parse all YAML documents in a stream and produce corresponding Python objects. Resolve all tags, even those known to be unsafe on untrusted input. """ return load_all(stream, UnsafeLoader) def emit(events, stream=None, Dumper=Dumper, canonical=None, indent=None, width=None, allow_unicode=None, line_break=None): """ Emit YAML parsing events into a stream. If stream is None, return the produced string instead. """ getvalue = None if stream is None: stream = io.StringIO() getvalue = stream.getvalue dumper = Dumper(stream, canonical=canonical, indent=indent, width=width, allow_unicode=allow_unicode, line_break=line_break) try: for event in events: dumper.emit(event) finally: dumper.dispose() if getvalue: return getvalue() def serialize_all(nodes, stream=None, Dumper=Dumper, canonical=None, indent=None, width=None, allow_unicode=None, line_break=None, encoding=None, explicit_start=None, explicit_end=None, version=None, tags=None): """ Serialize a sequence of representation trees into a YAML stream. If stream is None, return the produced string instead. 
""" getvalue = None if stream is None: if encoding is None: stream = io.StringIO() else: stream = io.BytesIO() getvalue = stream.getvalue dumper = Dumper(stream, canonical=canonical, indent=indent, width=width, allow_unicode=allow_unicode, line_break=line_break, encoding=encoding, version=version, tags=tags, explicit_start=explicit_start, explicit_end=explicit_end) try: dumper.open() for node in nodes: dumper.serialize(node) dumper.close() finally: dumper.dispose() if getvalue: return getvalue() def serialize(node, stream=None, Dumper=Dumper, **kwds): """ Serialize a representation tree into a YAML stream. If stream is None, return the produced string instead. """ return serialize_all([node], stream, Dumper=Dumper, **kwds) def dump_all(documents, stream=None, Dumper=Dumper, default_style=None, default_flow_style=False, canonical=None, indent=None, width=None, allow_unicode=None, line_break=None, encoding=None, explicit_start=None, explicit_end=None, version=None, tags=None, sort_keys=True): """ Serialize a sequence of Python objects into a YAML stream. If stream is None, return the produced string instead. """ getvalue = None if stream is None: if encoding is None: stream = io.StringIO() else: stream = io.BytesIO() getvalue = stream.getvalue dumper = Dumper(stream, default_style=default_style, default_flow_style=default_flow_style, canonical=canonical, indent=indent, width=width, allow_unicode=allow_unicode, line_break=line_break, encoding=encoding, version=version, tags=tags, explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys) try: dumper.open() for data in documents: dumper.represent(data) dumper.close() finally: dumper.dispose() if getvalue: return getvalue() def dump(data, stream=None, Dumper=Dumper, **kwds): """ Serialize a Python object into a YAML stream. If stream is None, return the produced string instead. 
""" return dump_all([data], stream, Dumper=Dumper, **kwds) def safe_dump_all(documents, stream=None, **kwds): """ Serialize a sequence of Python objects into a YAML stream. Produce only basic YAML tags. If stream is None, return the produced string instead. """ return dump_all(documents, stream, Dumper=SafeDumper, **kwds) def safe_dump(data, stream=None, **kwds): """ Serialize a Python object into a YAML stream. Produce only basic YAML tags. If stream is None, return the produced string instead. """ return dump_all([data], stream, Dumper=SafeDumper, **kwds) def add_implicit_resolver(tag, regexp, first=None, Loader=None, Dumper=Dumper): """ Add an implicit scalar detector. If an implicit scalar value matches the given regexp, the corresponding tag is assigned to the scalar. first is a sequence of possible initial characters or None. """ if Loader is None: loader.Loader.add_implicit_resolver(tag, regexp, first) loader.FullLoader.add_implicit_resolver(tag, regexp, first) loader.UnsafeLoader.add_implicit_resolver(tag, regexp, first) else: Loader.add_implicit_resolver(tag, regexp, first) Dumper.add_implicit_resolver(tag, regexp, first) def add_path_resolver(tag, path, kind=None, Loader=None, Dumper=Dumper): """ Add a path based resolver for the given tag. A path is a list of keys that forms a path to a node in the representation tree. Keys can be string values, integers, or None. """ if Loader is None: loader.Loader.add_path_resolver(tag, path, kind) loader.FullLoader.add_path_resolver(tag, path, kind) loader.UnsafeLoader.add_path_resolver(tag, path, kind) else: Loader.add_path_resolver(tag, path, kind) Dumper.add_path_resolver(tag, path, kind) def add_constructor(tag, constructor, Loader=None): """ Add a constructor for the given tag. Constructor is a function that accepts a Loader instance and a node object and produces the corresponding Python object. 
""" if Loader is None: loader.Loader.add_constructor(tag, constructor) loader.FullLoader.add_constructor(tag, constructor) loader.UnsafeLoader.add_constructor(tag, constructor) else: Loader.add_constructor(tag, constructor) def add_multi_constructor(tag_prefix, multi_constructor, Loader=None): """ Add a multi-constructor for the given tag prefix. Multi-constructor is called for a node if its tag starts with tag_prefix. Multi-constructor accepts a Loader instance, a tag suffix, and a node object and produces the corresponding Python object. """ if Loader is None: loader.Loader.add_multi_constructor(tag_prefix, multi_constructor) loader.FullLoader.add_multi_constructor(tag_prefix, multi_constructor) loader.UnsafeLoader.add_multi_constructor(tag_prefix, multi_constructor) else: Loader.add_multi_constructor(tag_prefix, multi_constructor) def add_representer(data_type, representer, Dumper=Dumper): """ Add a representer for the given type. Representer is a function accepting a Dumper instance and an instance of the given data type and producing the corresponding representation node. """ Dumper.add_representer(data_type, representer) def add_multi_representer(data_type, multi_representer, Dumper=Dumper): """ Add a representer for the given type. Multi-representer is a function accepting a Dumper instance and an instance of the given data type or subtype and producing the corresponding representation node. """ Dumper.add_multi_representer(data_type, multi_representer) class YAMLObjectMetaclass(type): """ The metaclass for YAMLObject. 
""" def __init__(cls, name, bases, kwds): super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds) if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None: if isinstance(cls.yaml_loader, list): for loader in cls.yaml_loader: loader.add_constructor(cls.yaml_tag, cls.from_yaml) else: cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml) cls.yaml_dumper.add_representer(cls, cls.to_yaml) class YAMLObject(metaclass=YAMLObjectMetaclass): """ An object that can dump itself to a YAML stream and load itself from a YAML stream. """ __slots__ = () # no direct instantiation, so allow immutable subclasses yaml_loader = [Loader, FullLoader, UnsafeLoader] yaml_dumper = Dumper yaml_tag = None yaml_flow_style = None @classmethod def from_yaml(cls, loader, node): """ Convert a representation node to a Python object. """ return loader.construct_yaml_object(node, cls) @classmethod def to_yaml(cls, dumper, data): """ Convert a Python object to a representation node. """ return dumper.represent_yaml_object(cls.yaml_tag, data, cls, flow_style=cls.yaml_flow_style) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/lib/yaml/composer.py0000644000175100001730000001142314455350511016107 0ustar00runnerdocker __all__ = ['Composer', 'ComposerError'] from .error import MarkedYAMLError from .events import * from .nodes import * class ComposerError(MarkedYAMLError): pass class Composer: def __init__(self): self.anchors = {} def check_node(self): # Drop the STREAM-START event. if self.check_event(StreamStartEvent): self.get_event() # If there are more documents available? return not self.check_event(StreamEndEvent) def get_node(self): # Get the root node of the next document. if not self.check_event(StreamEndEvent): return self.compose_document() def get_single_node(self): # Drop the STREAM-START event. self.get_event() # Compose a document if the stream is not empty. 
document = None if not self.check_event(StreamEndEvent): document = self.compose_document() # Ensure that the stream contains no more documents. if not self.check_event(StreamEndEvent): event = self.get_event() raise ComposerError("expected a single document in the stream", document.start_mark, "but found another document", event.start_mark) # Drop the STREAM-END event. self.get_event() return document def compose_document(self): # Drop the DOCUMENT-START event. self.get_event() # Compose the root node. node = self.compose_node(None, None) # Drop the DOCUMENT-END event. self.get_event() self.anchors = {} return node def compose_node(self, parent, index): if self.check_event(AliasEvent): event = self.get_event() anchor = event.anchor if anchor not in self.anchors: raise ComposerError(None, None, "found undefined alias %r" % anchor, event.start_mark) return self.anchors[anchor] event = self.peek_event() anchor = event.anchor if anchor is not None: if anchor in self.anchors: raise ComposerError("found duplicate anchor %r; first occurrence" % anchor, self.anchors[anchor].start_mark, "second occurrence", event.start_mark) self.descend_resolver(parent, index) if self.check_event(ScalarEvent): node = self.compose_scalar_node(anchor) elif self.check_event(SequenceStartEvent): node = self.compose_sequence_node(anchor) elif self.check_event(MappingStartEvent): node = self.compose_mapping_node(anchor) self.ascend_resolver() return node def compose_scalar_node(self, anchor): event = self.get_event() tag = event.tag if tag is None or tag == '!': tag = self.resolve(ScalarNode, event.value, event.implicit) node = ScalarNode(tag, event.value, event.start_mark, event.end_mark, style=event.style) if anchor is not None: self.anchors[anchor] = node return node def compose_sequence_node(self, anchor): start_event = self.get_event() tag = start_event.tag if tag is None or tag == '!': tag = self.resolve(SequenceNode, None, start_event.implicit) node = SequenceNode(tag, [], 
start_event.start_mark, None, flow_style=start_event.flow_style) if anchor is not None: self.anchors[anchor] = node index = 0 while not self.check_event(SequenceEndEvent): node.value.append(self.compose_node(node, index)) index += 1 end_event = self.get_event() node.end_mark = end_event.end_mark return node def compose_mapping_node(self, anchor): start_event = self.get_event() tag = start_event.tag if tag is None or tag == '!': tag = self.resolve(MappingNode, None, start_event.implicit) node = MappingNode(tag, [], start_event.start_mark, None, flow_style=start_event.flow_style) if anchor is not None: self.anchors[anchor] = node while not self.check_event(MappingEndEvent): #key_event = self.peek_event() item_key = self.compose_node(node, None) #if item_key in node.value: # raise ComposerError("while composing a mapping", start_event.start_mark, # "found duplicate key", key_event.start_mark) item_value = self.compose_node(node, item_key) #node.value[item_key] = item_value node.value.append((item_key, item_value)) end_event = self.get_event() node.end_mark = end_event.end_mark return node

PyYAML-6.0.1/lib/yaml/constructor.py:

__all__ = [ 'BaseConstructor', 'SafeConstructor', 'FullConstructor', 'UnsafeConstructor', 'Constructor', 'ConstructorError' ] from .error import * from .nodes import * import collections.abc, datetime, base64, binascii, re, sys, types class ConstructorError(MarkedYAMLError): pass class BaseConstructor: yaml_constructors = {} yaml_multi_constructors = {} def __init__(self): self.constructed_objects = {} self.recursive_objects = {} self.state_generators = [] self.deep_construct = False def check_data(self): # If there are more documents available?
return self.check_node() def check_state_key(self, key): """Block special attributes/methods from being set in a newly created object, to prevent user-controlled methods from being called during deserialization""" if self.get_state_keys_blacklist_regexp().match(key): raise ConstructorError(None, None, "blacklisted key '%s' in instance state found" % (key,), None) def get_data(self): # Construct and return the next document. if self.check_node(): return self.construct_document(self.get_node()) def get_single_data(self): # Ensure that the stream contains a single document and construct it. node = self.get_single_node() if node is not None: return self.construct_document(node) return None def construct_document(self, node): data = self.construct_object(node) while self.state_generators: state_generators = self.state_generators self.state_generators = [] for generator in state_generators: for dummy in generator: pass self.constructed_objects = {} self.recursive_objects = {} self.deep_construct = False return data def construct_object(self, node, deep=False): if node in self.constructed_objects: return self.constructed_objects[node] if deep: old_deep = self.deep_construct self.deep_construct = True if node in self.recursive_objects: raise ConstructorError(None, None, "found unconstructable recursive node", node.start_mark) self.recursive_objects[node] = None constructor = None tag_suffix = None if node.tag in self.yaml_constructors: constructor = self.yaml_constructors[node.tag] else: for tag_prefix in self.yaml_multi_constructors: if tag_prefix is not None and node.tag.startswith(tag_prefix): tag_suffix = node.tag[len(tag_prefix):] constructor = self.yaml_multi_constructors[tag_prefix] break else: if None in self.yaml_multi_constructors: tag_suffix = node.tag constructor = self.yaml_multi_constructors[None] elif None in self.yaml_constructors: constructor = self.yaml_constructors[None] elif isinstance(node, ScalarNode): constructor = self.__class__.construct_scalar 
elif isinstance(node, SequenceNode): constructor = self.__class__.construct_sequence elif isinstance(node, MappingNode): constructor = self.__class__.construct_mapping if tag_suffix is None: data = constructor(self, node) else: data = constructor(self, tag_suffix, node) if isinstance(data, types.GeneratorType): generator = data data = next(generator) if self.deep_construct: for dummy in generator: pass else: self.state_generators.append(generator) self.constructed_objects[node] = data del self.recursive_objects[node] if deep: self.deep_construct = old_deep return data def construct_scalar(self, node): if not isinstance(node, ScalarNode): raise ConstructorError(None, None, "expected a scalar node, but found %s" % node.id, node.start_mark) return node.value def construct_sequence(self, node, deep=False): if not isinstance(node, SequenceNode): raise ConstructorError(None, None, "expected a sequence node, but found %s" % node.id, node.start_mark) return [self.construct_object(child, deep=deep) for child in node.value] def construct_mapping(self, node, deep=False): if not isinstance(node, MappingNode): raise ConstructorError(None, None, "expected a mapping node, but found %s" % node.id, node.start_mark) mapping = {} for key_node, value_node in node.value: key = self.construct_object(key_node, deep=deep) if not isinstance(key, collections.abc.Hashable): raise ConstructorError("while constructing a mapping", node.start_mark, "found unhashable key", key_node.start_mark) value = self.construct_object(value_node, deep=deep) mapping[key] = value return mapping def construct_pairs(self, node, deep=False): if not isinstance(node, MappingNode): raise ConstructorError(None, None, "expected a mapping node, but found %s" % node.id, node.start_mark) pairs = [] for key_node, value_node in node.value: key = self.construct_object(key_node, deep=deep) value = self.construct_object(value_node, deep=deep) pairs.append((key, value)) return pairs @classmethod def add_constructor(cls, tag, 
constructor): if not 'yaml_constructors' in cls.__dict__: cls.yaml_constructors = cls.yaml_constructors.copy() cls.yaml_constructors[tag] = constructor @classmethod def add_multi_constructor(cls, tag_prefix, multi_constructor): if not 'yaml_multi_constructors' in cls.__dict__: cls.yaml_multi_constructors = cls.yaml_multi_constructors.copy() cls.yaml_multi_constructors[tag_prefix] = multi_constructor class SafeConstructor(BaseConstructor): def construct_scalar(self, node): if isinstance(node, MappingNode): for key_node, value_node in node.value: if key_node.tag == 'tag:yaml.org,2002:value': return self.construct_scalar(value_node) return super().construct_scalar(node) def flatten_mapping(self, node): merge = [] index = 0 while index < len(node.value): key_node, value_node = node.value[index] if key_node.tag == 'tag:yaml.org,2002:merge': del node.value[index] if isinstance(value_node, MappingNode): self.flatten_mapping(value_node) merge.extend(value_node.value) elif isinstance(value_node, SequenceNode): submerge = [] for subnode in value_node.value: if not isinstance(subnode, MappingNode): raise ConstructorError("while constructing a mapping", node.start_mark, "expected a mapping for merging, but found %s" % subnode.id, subnode.start_mark) self.flatten_mapping(subnode) submerge.append(subnode.value) submerge.reverse() for value in submerge: merge.extend(value) else: raise ConstructorError("while constructing a mapping", node.start_mark, "expected a mapping or list of mappings for merging, but found %s" % value_node.id, value_node.start_mark) elif key_node.tag == 'tag:yaml.org,2002:value': key_node.tag = 'tag:yaml.org,2002:str' index += 1 else: index += 1 if merge: node.value = merge + node.value def construct_mapping(self, node, deep=False): if isinstance(node, MappingNode): self.flatten_mapping(node) return super().construct_mapping(node, deep=deep) def construct_yaml_null(self, node): self.construct_scalar(node) return None bool_values = { 'yes': True, 'no': False, 
'true': True, 'false': False, 'on': True, 'off': False, } def construct_yaml_bool(self, node): value = self.construct_scalar(node) return self.bool_values[value.lower()] def construct_yaml_int(self, node): value = self.construct_scalar(node) value = value.replace('_', '') sign = +1 if value[0] == '-': sign = -1 if value[0] in '+-': value = value[1:] if value == '0': return 0 elif value.startswith('0b'): return sign*int(value[2:], 2) elif value.startswith('0x'): return sign*int(value[2:], 16) elif value[0] == '0': return sign*int(value, 8) elif ':' in value: digits = [int(part) for part in value.split(':')] digits.reverse() base = 1 value = 0 for digit in digits: value += digit*base base *= 60 return sign*value else: return sign*int(value) inf_value = 1e300 while inf_value != inf_value*inf_value: inf_value *= inf_value nan_value = -inf_value/inf_value # Trying to make a quiet NaN (like C99). def construct_yaml_float(self, node): value = self.construct_scalar(node) value = value.replace('_', '').lower() sign = +1 if value[0] == '-': sign = -1 if value[0] in '+-': value = value[1:] if value == '.inf': return sign*self.inf_value elif value == '.nan': return self.nan_value elif ':' in value: digits = [float(part) for part in value.split(':')] digits.reverse() base = 1 value = 0.0 for digit in digits: value += digit*base base *= 60 return sign*value else: return sign*float(value) def construct_yaml_binary(self, node): try: value = self.construct_scalar(node).encode('ascii') except UnicodeEncodeError as exc: raise ConstructorError(None, None, "failed to convert base64 data into ascii: %s" % exc, node.start_mark) try: if hasattr(base64, 'decodebytes'): return base64.decodebytes(value) else: return base64.decodestring(value) except binascii.Error as exc: raise ConstructorError(None, None, "failed to decode base64 data: %s" % exc, node.start_mark) timestamp_regexp = re.compile( r'''^(?P<year>[0-9][0-9][0-9][0-9]) -(?P<month>[0-9][0-9]?) -(?P<day>[0-9][0-9]?) (?:(?:[Tt]|[ \t]+) (?P<hour>[0-9][0-9]?)
:(?P<minute>[0-9][0-9]) :(?P<second>[0-9][0-9]) (?:\.(?P<fraction>[0-9]*))? (?:[ \t]*(?P<tz>Z|(?P<tz_sign>[-+])(?P<tz_hour>[0-9][0-9]?) (?::(?P<tz_minute>[0-9][0-9]))?))?)?$''', re.X) def construct_yaml_timestamp(self, node): value = self.construct_scalar(node) match = self.timestamp_regexp.match(node.value) values = match.groupdict() year = int(values['year']) month = int(values['month']) day = int(values['day']) if not values['hour']: return datetime.date(year, month, day) hour = int(values['hour']) minute = int(values['minute']) second = int(values['second']) fraction = 0 tzinfo = None if values['fraction']: fraction = values['fraction'][:6] while len(fraction) < 6: fraction += '0' fraction = int(fraction) if values['tz_sign']: tz_hour = int(values['tz_hour']) tz_minute = int(values['tz_minute'] or 0) delta = datetime.timedelta(hours=tz_hour, minutes=tz_minute) if values['tz_sign'] == '-': delta = -delta tzinfo = datetime.timezone(delta) elif values['tz']: tzinfo = datetime.timezone.utc return datetime.datetime(year, month, day, hour, minute, second, fraction, tzinfo=tzinfo) def construct_yaml_omap(self, node): # Note: we do not check for duplicate keys, because it's too # CPU-expensive.
omap = [] yield omap if not isinstance(node, SequenceNode): raise ConstructorError("while constructing an ordered map", node.start_mark, "expected a sequence, but found %s" % node.id, node.start_mark) for subnode in node.value: if not isinstance(subnode, MappingNode): raise ConstructorError("while constructing an ordered map", node.start_mark, "expected a mapping of length 1, but found %s" % subnode.id, subnode.start_mark) if len(subnode.value) != 1: raise ConstructorError("while constructing an ordered map", node.start_mark, "expected a single mapping item, but found %d items" % len(subnode.value), subnode.start_mark) key_node, value_node = subnode.value[0] key = self.construct_object(key_node) value = self.construct_object(value_node) omap.append((key, value)) def construct_yaml_pairs(self, node): # Note: the same code as `construct_yaml_omap`. pairs = [] yield pairs if not isinstance(node, SequenceNode): raise ConstructorError("while constructing pairs", node.start_mark, "expected a sequence, but found %s" % node.id, node.start_mark) for subnode in node.value: if not isinstance(subnode, MappingNode): raise ConstructorError("while constructing pairs", node.start_mark, "expected a mapping of length 1, but found %s" % subnode.id, subnode.start_mark) if len(subnode.value) != 1: raise ConstructorError("while constructing pairs", node.start_mark, "expected a single mapping item, but found %d items" % len(subnode.value), subnode.start_mark) key_node, value_node = subnode.value[0] key = self.construct_object(key_node) value = self.construct_object(value_node) pairs.append((key, value)) def construct_yaml_set(self, node): data = set() yield data value = self.construct_mapping(node) data.update(value) def construct_yaml_str(self, node): return self.construct_scalar(node) def construct_yaml_seq(self, node): data = [] yield data data.extend(self.construct_sequence(node)) def construct_yaml_map(self, node): data = {} yield data value = self.construct_mapping(node) 
data.update(value) def construct_yaml_object(self, node, cls): data = cls.__new__(cls) yield data if hasattr(data, '__setstate__'): state = self.construct_mapping(node, deep=True) data.__setstate__(state) else: state = self.construct_mapping(node) data.__dict__.update(state) def construct_undefined(self, node): raise ConstructorError(None, None, "could not determine a constructor for the tag %r" % node.tag, node.start_mark) SafeConstructor.add_constructor( 'tag:yaml.org,2002:null', SafeConstructor.construct_yaml_null) SafeConstructor.add_constructor( 'tag:yaml.org,2002:bool', SafeConstructor.construct_yaml_bool) SafeConstructor.add_constructor( 'tag:yaml.org,2002:int', SafeConstructor.construct_yaml_int) SafeConstructor.add_constructor( 'tag:yaml.org,2002:float', SafeConstructor.construct_yaml_float) SafeConstructor.add_constructor( 'tag:yaml.org,2002:binary', SafeConstructor.construct_yaml_binary) SafeConstructor.add_constructor( 'tag:yaml.org,2002:timestamp', SafeConstructor.construct_yaml_timestamp) SafeConstructor.add_constructor( 'tag:yaml.org,2002:omap', SafeConstructor.construct_yaml_omap) SafeConstructor.add_constructor( 'tag:yaml.org,2002:pairs', SafeConstructor.construct_yaml_pairs) SafeConstructor.add_constructor( 'tag:yaml.org,2002:set', SafeConstructor.construct_yaml_set) SafeConstructor.add_constructor( 'tag:yaml.org,2002:str', SafeConstructor.construct_yaml_str) SafeConstructor.add_constructor( 'tag:yaml.org,2002:seq', SafeConstructor.construct_yaml_seq) SafeConstructor.add_constructor( 'tag:yaml.org,2002:map', SafeConstructor.construct_yaml_map) SafeConstructor.add_constructor(None, SafeConstructor.construct_undefined) class FullConstructor(SafeConstructor): # 'extend' is blacklisted because it is used by # construct_python_object_apply to add `listitems` to a newly generate # python instance def get_state_keys_blacklist(self): return ['^extend$', '^__.*__$'] def get_state_keys_blacklist_regexp(self): if not hasattr(self, 
'state_keys_blacklist_regexp'): self.state_keys_blacklist_regexp = re.compile('(' + '|'.join(self.get_state_keys_blacklist()) + ')') return self.state_keys_blacklist_regexp def construct_python_str(self, node): return self.construct_scalar(node) def construct_python_unicode(self, node): return self.construct_scalar(node) def construct_python_bytes(self, node): try: value = self.construct_scalar(node).encode('ascii') except UnicodeEncodeError as exc: raise ConstructorError(None, None, "failed to convert base64 data into ascii: %s" % exc, node.start_mark) try: if hasattr(base64, 'decodebytes'): return base64.decodebytes(value) else: return base64.decodestring(value) except binascii.Error as exc: raise ConstructorError(None, None, "failed to decode base64 data: %s" % exc, node.start_mark) def construct_python_long(self, node): return self.construct_yaml_int(node) def construct_python_complex(self, node): return complex(self.construct_scalar(node)) def construct_python_tuple(self, node): return tuple(self.construct_sequence(node)) def find_python_module(self, name, mark, unsafe=False): if not name: raise ConstructorError("while constructing a Python module", mark, "expected non-empty name appended to the tag", mark) if unsafe: try: __import__(name) except ImportError as exc: raise ConstructorError("while constructing a Python module", mark, "cannot find module %r (%s)" % (name, exc), mark) if name not in sys.modules: raise ConstructorError("while constructing a Python module", mark, "module %r is not imported" % name, mark) return sys.modules[name] def find_python_name(self, name, mark, unsafe=False): if not name: raise ConstructorError("while constructing a Python object", mark, "expected non-empty name appended to the tag", mark) if '.' 
in name:
            module_name, object_name = name.rsplit('.', 1)
        else:
            module_name = 'builtins'
            object_name = name
        if unsafe:
            try:
                __import__(module_name)
            except ImportError as exc:
                raise ConstructorError("while constructing a Python object", mark,
                        "cannot find module %r (%s)" % (module_name, exc), mark)
        if module_name not in sys.modules:
            raise ConstructorError("while constructing a Python object", mark,
                    "module %r is not imported" % module_name, mark)
        module = sys.modules[module_name]
        if not hasattr(module, object_name):
            raise ConstructorError("while constructing a Python object", mark,
                    "cannot find %r in the module %r"
                    % (object_name, module.__name__), mark)
        return getattr(module, object_name)

    def construct_python_name(self, suffix, node):
        value = self.construct_scalar(node)
        if value:
            raise ConstructorError("while constructing a Python name", node.start_mark,
                    "expected the empty value, but found %r" % value, node.start_mark)
        return self.find_python_name(suffix, node.start_mark)

    def construct_python_module(self, suffix, node):
        value = self.construct_scalar(node)
        if value:
            raise ConstructorError("while constructing a Python module", node.start_mark,
                    "expected the empty value, but found %r" % value, node.start_mark)
        return self.find_python_module(suffix, node.start_mark)

    def make_python_instance(self, suffix, node,
            args=None, kwds=None, newobj=False, unsafe=False):
        if not args:
            args = []
        if not kwds:
            kwds = {}
        cls = self.find_python_name(suffix, node.start_mark)
        if not (unsafe or isinstance(cls, type)):
            raise ConstructorError("while constructing a Python instance", node.start_mark,
                    "expected a class, but found %r" % type(cls), node.start_mark)
        if newobj and isinstance(cls, type):
            return cls.__new__(cls, *args, **kwds)
        else:
            return cls(*args, **kwds)

    def set_python_instance_state(self, instance, state, unsafe=False):
        if hasattr(instance, '__setstate__'):
            instance.__setstate__(state)
        else:
            slotstate = {}
            if isinstance(state, tuple) and len(state) == 2:
                state, slotstate = state
            if hasattr(instance, '__dict__'):
                if not unsafe and state:
                    for key in state.keys():
                        self.check_state_key(key)
                instance.__dict__.update(state)
            elif state:
                slotstate.update(state)
            for key, value in slotstate.items():
                if not unsafe:
                    self.check_state_key(key)
                setattr(instance, key, value)

    def construct_python_object(self, suffix, node):
        # Format:
        #   !!python/object:module.name { ... state ... }
        instance = self.make_python_instance(suffix, node, newobj=True)
        yield instance
        deep = hasattr(instance, '__setstate__')
        state = self.construct_mapping(node, deep=deep)
        self.set_python_instance_state(instance, state)

    def construct_python_object_apply(self, suffix, node, newobj=False):
        # Format:
        #   !!python/object/apply       # (or !!python/object/new)
        #   args: [ ... arguments ... ]
        #   kwds: { ... keywords ... }
        #   state: ... state ...
        #   listitems: [ ... listitems ... ]
        #   dictitems: { ... dictitems ... }
        # or short format:
        #   !!python/object/apply [ ... arguments ... ]
        # The difference between !!python/object/apply and !!python/object/new
        # is how an object is created, check make_python_instance for details.
        if isinstance(node, SequenceNode):
            args = self.construct_sequence(node, deep=True)
            kwds = {}
            state = {}
            listitems = []
            dictitems = {}
        else:
            value = self.construct_mapping(node, deep=True)
            args = value.get('args', [])
            kwds = value.get('kwds', {})
            state = value.get('state', {})
            listitems = value.get('listitems', [])
            dictitems = value.get('dictitems', {})
        instance = self.make_python_instance(suffix, node, args, kwds, newobj)
        if state:
            self.set_python_instance_state(instance, state)
        if listitems:
            instance.extend(listitems)
        if dictitems:
            for key in dictitems:
                instance[key] = dictitems[key]
        return instance

    def construct_python_object_new(self, suffix, node):
        return self.construct_python_object_apply(suffix, node, newobj=True)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/none',
    FullConstructor.construct_yaml_null)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/bool',
    FullConstructor.construct_yaml_bool)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/str',
    FullConstructor.construct_python_str)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/unicode',
    FullConstructor.construct_python_unicode)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/bytes',
    FullConstructor.construct_python_bytes)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/int',
    FullConstructor.construct_yaml_int)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/long',
    FullConstructor.construct_python_long)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/float',
    FullConstructor.construct_yaml_float)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/complex',
    FullConstructor.construct_python_complex)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/list',
    FullConstructor.construct_yaml_seq)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/tuple',
    FullConstructor.construct_python_tuple)

FullConstructor.add_constructor(
    'tag:yaml.org,2002:python/dict',
    FullConstructor.construct_yaml_map)

FullConstructor.add_multi_constructor(
    'tag:yaml.org,2002:python/name:',
    FullConstructor.construct_python_name)

class UnsafeConstructor(FullConstructor):

    def find_python_module(self, name, mark):
        return super(UnsafeConstructor, self).find_python_module(name, mark, unsafe=True)

    def find_python_name(self, name, mark):
        return super(UnsafeConstructor, self).find_python_name(name, mark, unsafe=True)

    def make_python_instance(self, suffix, node, args=None, kwds=None, newobj=False):
        return super(UnsafeConstructor, self).make_python_instance(
            suffix, node, args, kwds, newobj, unsafe=True)

    def set_python_instance_state(self, instance, state):
        return super(UnsafeConstructor, self).set_python_instance_state(
            instance, state, unsafe=True)

UnsafeConstructor.add_multi_constructor(
    'tag:yaml.org,2002:python/module:',
    UnsafeConstructor.construct_python_module)

UnsafeConstructor.add_multi_constructor(
    'tag:yaml.org,2002:python/object:',
    UnsafeConstructor.construct_python_object)

UnsafeConstructor.add_multi_constructor(
    'tag:yaml.org,2002:python/object/new:',
    UnsafeConstructor.construct_python_object_new)

UnsafeConstructor.add_multi_constructor(
    'tag:yaml.org,2002:python/object/apply:',
    UnsafeConstructor.construct_python_object_apply)

# Constructor is same as UnsafeConstructor. Need to leave this in place in case
# people have extended it directly.
class Constructor(UnsafeConstructor):
    pass

# ===== PyYAML-6.0.1/lib/yaml/cyaml.py =====

__all__ = [
    'CBaseLoader', 'CSafeLoader', 'CFullLoader', 'CUnsafeLoader', 'CLoader',
    'CBaseDumper', 'CSafeDumper', 'CDumper'
]

from yaml._yaml import CParser, CEmitter

from .constructor import *
from .serializer import *
from .representer import *
from .resolver import *

class CBaseLoader(CParser, BaseConstructor, BaseResolver):

    def __init__(self, stream):
        CParser.__init__(self, stream)
        BaseConstructor.__init__(self)
        BaseResolver.__init__(self)

class CSafeLoader(CParser, SafeConstructor, Resolver):

    def __init__(self, stream):
        CParser.__init__(self, stream)
        SafeConstructor.__init__(self)
        Resolver.__init__(self)

class CFullLoader(CParser, FullConstructor, Resolver):

    def __init__(self, stream):
        CParser.__init__(self, stream)
        FullConstructor.__init__(self)
        Resolver.__init__(self)

class CUnsafeLoader(CParser, UnsafeConstructor, Resolver):

    def __init__(self, stream):
        CParser.__init__(self, stream)
        UnsafeConstructor.__init__(self)
        Resolver.__init__(self)

class CLoader(CParser, Constructor, Resolver):

    def __init__(self, stream):
        CParser.__init__(self, stream)
        Constructor.__init__(self)
        Resolver.__init__(self)

class CBaseDumper(CEmitter, BaseRepresenter, BaseResolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None, explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        CEmitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width, encoding=encoding,
                allow_unicode=allow_unicode, line_break=line_break,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        Representer.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style,
                sort_keys=sort_keys)
        Resolver.__init__(self)

class CSafeDumper(CEmitter, SafeRepresenter, Resolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None, explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        CEmitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width, encoding=encoding,
                allow_unicode=allow_unicode, line_break=line_break,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        SafeRepresenter.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style,
                sort_keys=sort_keys)
        Resolver.__init__(self)

class CDumper(CEmitter, Serializer, Representer, Resolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None, explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        CEmitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width, encoding=encoding,
                allow_unicode=allow_unicode, line_break=line_break,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        Representer.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style,
                sort_keys=sort_keys)
        Resolver.__init__(self)

# ===== PyYAML-6.0.1/lib/yaml/dumper.py =====

__all__ = ['BaseDumper', 'SafeDumper', 'Dumper']

from .emitter import *
from .serializer import *
from .representer import *
from .resolver import *

class BaseDumper(Emitter, Serializer, BaseRepresenter, BaseResolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None,
            explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        Emitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width,
                allow_unicode=allow_unicode, line_break=line_break)
        Serializer.__init__(self, encoding=encoding,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        Representer.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style, sort_keys=sort_keys)
        Resolver.__init__(self)

class SafeDumper(Emitter, Serializer, SafeRepresenter, Resolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None, explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        Emitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width,
                allow_unicode=allow_unicode, line_break=line_break)
        Serializer.__init__(self, encoding=encoding,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        SafeRepresenter.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style, sort_keys=sort_keys)
        Resolver.__init__(self)

class Dumper(Emitter, Serializer, Representer, Resolver):

    def __init__(self, stream,
            default_style=None, default_flow_style=False,
            canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None,
            encoding=None, explicit_start=None, explicit_end=None,
            version=None, tags=None, sort_keys=True):
        Emitter.__init__(self, stream, canonical=canonical,
                indent=indent, width=width,
                allow_unicode=allow_unicode, line_break=line_break)
        Serializer.__init__(self, encoding=encoding,
                explicit_start=explicit_start, explicit_end=explicit_end,
                version=version, tags=tags)
        Representer.__init__(self, default_style=default_style,
                default_flow_style=default_flow_style, sort_keys=sort_keys)
        Resolver.__init__(self)
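An editorial aside, not part of the PyYAML distribution: the Dumper classes above mostly forward keyword arguments to `Emitter`, `Serializer`, and `Representer`. A minimal sketch of how two of those arguments show up in output, using the package's own top-level `yaml.dump` entry point:

```python
import yaml

# sort_keys=True (the default) makes the Representer emit mapping keys in
# sorted order; sort_keys=False preserves insertion order.
print(yaml.dump({'b': 1, 'a': 2}))                   # keys sorted: a before b
print(yaml.dump({'b': 1, 'a': 2}, sort_keys=False))  # insertion order kept

# default_flow_style=False (the default) selects block style for
# collections; passing True forces flow style instead.
print(yaml.dump({'a': [1, 2]}))                          # block style
print(yaml.dump({'a': [1, 2]}, default_flow_style=True)) # {a: [1, 2]}
```

The same keywords can be passed directly to `yaml.dump(..., Dumper=SafeDumper)` or any of the C-accelerated dumpers above.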
# ===== PyYAML-6.0.1/lib/yaml/emitter.py =====

# Emitter expects events obeying the following grammar:
# stream ::= STREAM-START document* STREAM-END
# document ::= DOCUMENT-START node DOCUMENT-END
# node ::= SCALAR | sequence | mapping
# sequence ::= SEQUENCE-START node* SEQUENCE-END
# mapping ::= MAPPING-START (node node)* MAPPING-END

__all__ = ['Emitter', 'EmitterError']

from .error import YAMLError
from .events import *

class EmitterError(YAMLError):
    pass

class ScalarAnalysis:

    def __init__(self, scalar, empty, multiline,
            allow_flow_plain, allow_block_plain,
            allow_single_quoted, allow_double_quoted,
            allow_block):
        self.scalar = scalar
        self.empty = empty
        self.multiline = multiline
        self.allow_flow_plain = allow_flow_plain
        self.allow_block_plain = allow_block_plain
        self.allow_single_quoted = allow_single_quoted
        self.allow_double_quoted = allow_double_quoted
        self.allow_block = allow_block

class Emitter:

    DEFAULT_TAG_PREFIXES = {
        '!' : '!',
        'tag:yaml.org,2002:' : '!!',
    }

    def __init__(self, stream, canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None):

        # The stream should have the methods `write` and possibly `flush`.
        self.stream = stream

        # Encoding can be overridden by STREAM-START.
        self.encoding = None

        # Emitter is a state machine with a stack of states to handle nested
        # structures.
        self.states = []
        self.state = self.expect_stream_start

        # Current event and the event queue.
        self.events = []
        self.event = None

        # The current indentation level and the stack of previous indents.
        self.indents = []
        self.indent = None

        # Flow level.
        self.flow_level = 0

        # Contexts.
        self.root_context = False
        self.sequence_context = False
        self.mapping_context = False
        self.simple_key_context = False

        # Characteristics of the last emitted character:
        #  - current position.
        #  - is it a whitespace?
        #  - is it an indention character
        #    (indentation space, '-', '?', or ':')?
        self.line = 0
        self.column = 0
        self.whitespace = True
        self.indention = True

        # Whether the document requires an explicit document indicator
        self.open_ended = False

        # Formatting details.
        self.canonical = canonical
        self.allow_unicode = allow_unicode
        self.best_indent = 2
        if indent and 1 < indent < 10:
            self.best_indent = indent
        self.best_width = 80
        if width and width > self.best_indent*2:
            self.best_width = width
        self.best_line_break = '\n'
        if line_break in ['\r', '\n', '\r\n']:
            self.best_line_break = line_break

        # Tag prefixes.
        self.tag_prefixes = None

        # Prepared anchor and tag.
        self.prepared_anchor = None
        self.prepared_tag = None

        # Scalar analysis and style.
        self.analysis = None
        self.style = None

    def dispose(self):
        # Reset the state attributes (to clear self-references)
        self.states = []
        self.state = None

    def emit(self, event):
        self.events.append(event)
        while not self.need_more_events():
            self.event = self.events.pop(0)
            self.state()
            self.event = None

    # In some cases, we wait for a few next events before emitting.

    def need_more_events(self):
        if not self.events:
            return True
        event = self.events[0]
        if isinstance(event, DocumentStartEvent):
            return self.need_events(1)
        elif isinstance(event, SequenceStartEvent):
            return self.need_events(2)
        elif isinstance(event, MappingStartEvent):
            return self.need_events(3)
        else:
            return False

    def need_events(self, count):
        level = 0
        for event in self.events[1:]:
            if isinstance(event, (DocumentStartEvent, CollectionStartEvent)):
                level += 1
            elif isinstance(event, (DocumentEndEvent, CollectionEndEvent)):
                level -= 1
            elif isinstance(event, StreamEndEvent):
                level = -1
            if level < 0:
                return False
        return (len(self.events) < count+1)

    def increase_indent(self, flow=False, indentless=False):
        self.indents.append(self.indent)
        if self.indent is None:
            if flow:
                self.indent = self.best_indent
            else:
                self.indent = 0
        elif not indentless:
            self.indent += self.best_indent

    # States.

    # Stream handlers.
    def expect_stream_start(self):
        if isinstance(self.event, StreamStartEvent):
            if self.event.encoding and not hasattr(self.stream, 'encoding'):
                self.encoding = self.event.encoding
            self.write_stream_start()
            self.state = self.expect_first_document_start
        else:
            raise EmitterError("expected StreamStartEvent, but got %s"
                    % self.event)

    def expect_nothing(self):
        raise EmitterError("expected nothing, but got %s" % self.event)

    # Document handlers.

    def expect_first_document_start(self):
        return self.expect_document_start(first=True)

    def expect_document_start(self, first=False):
        if isinstance(self.event, DocumentStartEvent):
            if (self.event.version or self.event.tags) and self.open_ended:
                self.write_indicator('...', True)
                self.write_indent()
            if self.event.version:
                version_text = self.prepare_version(self.event.version)
                self.write_version_directive(version_text)
            self.tag_prefixes = self.DEFAULT_TAG_PREFIXES.copy()
            if self.event.tags:
                handles = sorted(self.event.tags.keys())
                for handle in handles:
                    prefix = self.event.tags[handle]
                    self.tag_prefixes[prefix] = handle
                    handle_text = self.prepare_tag_handle(handle)
                    prefix_text = self.prepare_tag_prefix(prefix)
                    self.write_tag_directive(handle_text, prefix_text)
            implicit = (first and not self.event.explicit and not self.canonical
                    and not self.event.version and not self.event.tags
                    and not self.check_empty_document())
            if not implicit:
                self.write_indent()
                self.write_indicator('---', True)
                if self.canonical:
                    self.write_indent()
            self.state = self.expect_document_root
        elif isinstance(self.event, StreamEndEvent):
            if self.open_ended:
                self.write_indicator('...', True)
                self.write_indent()
            self.write_stream_end()
            self.state = self.expect_nothing
        else:
            raise EmitterError("expected DocumentStartEvent, but got %s"
                    % self.event)

    def expect_document_end(self):
        if isinstance(self.event, DocumentEndEvent):
            self.write_indent()
            if self.event.explicit:
                self.write_indicator('...', True)
                self.write_indent()
            self.flush_stream()
            self.state = self.expect_document_start
        else:
            raise EmitterError("expected DocumentEndEvent, but got %s"
                    % self.event)

    def expect_document_root(self):
        self.states.append(self.expect_document_end)
        self.expect_node(root=True)

    # Node handlers.

    def expect_node(self, root=False, sequence=False, mapping=False,
            simple_key=False):
        self.root_context = root
        self.sequence_context = sequence
        self.mapping_context = mapping
        self.simple_key_context = simple_key
        if isinstance(self.event, AliasEvent):
            self.expect_alias()
        elif isinstance(self.event, (ScalarEvent, CollectionStartEvent)):
            self.process_anchor('&')
            self.process_tag()
            if isinstance(self.event, ScalarEvent):
                self.expect_scalar()
            elif isinstance(self.event, SequenceStartEvent):
                if self.flow_level or self.canonical or self.event.flow_style \
                        or self.check_empty_sequence():
                    self.expect_flow_sequence()
                else:
                    self.expect_block_sequence()
            elif isinstance(self.event, MappingStartEvent):
                if self.flow_level or self.canonical or self.event.flow_style \
                        or self.check_empty_mapping():
                    self.expect_flow_mapping()
                else:
                    self.expect_block_mapping()
        else:
            raise EmitterError("expected NodeEvent, but got %s" % self.event)

    def expect_alias(self):
        if self.event.anchor is None:
            raise EmitterError("anchor is not specified for alias")
        self.process_anchor('*')
        self.state = self.states.pop()

    def expect_scalar(self):
        self.increase_indent(flow=True)
        self.process_scalar()
        self.indent = self.indents.pop()
        self.state = self.states.pop()

    # Flow sequence handlers.
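An editorial aside, not part of the PyYAML distribution: the flow sequence handlers that follow have two emission branches. Ordinary flow style separates items with `, `, while canonical mode (`expect_flow_sequence_item`'s `self.canonical` branch) writes a trailing comma and a line break before the closing `]`. A short sketch via the package's `yaml.dump`:

```python
import yaml

# Ordinary flow style: items joined with ', ' on one line.
print(yaml.dump([1, 2], default_flow_style=True))  # [1, 2]

# Canonical mode: explicit '---', explicit tags, one item per line,
# and a trailing ',' before the closing bracket.
print(yaml.dump([1, 2], canonical=True))
```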
    def expect_flow_sequence(self):
        self.write_indicator('[', True, whitespace=True)
        self.flow_level += 1
        self.increase_indent(flow=True)
        self.state = self.expect_first_flow_sequence_item

    def expect_first_flow_sequence_item(self):
        if isinstance(self.event, SequenceEndEvent):
            self.indent = self.indents.pop()
            self.flow_level -= 1
            self.write_indicator(']', False)
            self.state = self.states.pop()
        else:
            if self.canonical or self.column > self.best_width:
                self.write_indent()
            self.states.append(self.expect_flow_sequence_item)
            self.expect_node(sequence=True)

    def expect_flow_sequence_item(self):
        if isinstance(self.event, SequenceEndEvent):
            self.indent = self.indents.pop()
            self.flow_level -= 1
            if self.canonical:
                self.write_indicator(',', False)
                self.write_indent()
            self.write_indicator(']', False)
            self.state = self.states.pop()
        else:
            self.write_indicator(',', False)
            if self.canonical or self.column > self.best_width:
                self.write_indent()
            self.states.append(self.expect_flow_sequence_item)
            self.expect_node(sequence=True)

    # Flow mapping handlers.
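An editorial aside, not part of the PyYAML distribution: the flow mapping handlers that follow wrap entries in `{`/`}`, writing each simple key, then `:`, then the value, with `,` between entries. A minimal sketch via the package's `yaml.dump`:

```python
import yaml

# Flow-style mapping: '{' key ':' value (',' key ':' value)* '}'.
# Nested collections inherit flow style because flow_level is non-zero.
print(yaml.dump({'a': 1, 'b': [2, 3]}, default_flow_style=True))
# {a: 1, b: [2, 3]}
```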
    def expect_flow_mapping(self):
        self.write_indicator('{', True, whitespace=True)
        self.flow_level += 1
        self.increase_indent(flow=True)
        self.state = self.expect_first_flow_mapping_key

    def expect_first_flow_mapping_key(self):
        if isinstance(self.event, MappingEndEvent):
            self.indent = self.indents.pop()
            self.flow_level -= 1
            self.write_indicator('}', False)
            self.state = self.states.pop()
        else:
            if self.canonical or self.column > self.best_width:
                self.write_indent()
            if not self.canonical and self.check_simple_key():
                self.states.append(self.expect_flow_mapping_simple_value)
                self.expect_node(mapping=True, simple_key=True)
            else:
                self.write_indicator('?', True)
                self.states.append(self.expect_flow_mapping_value)
                self.expect_node(mapping=True)

    def expect_flow_mapping_key(self):
        if isinstance(self.event, MappingEndEvent):
            self.indent = self.indents.pop()
            self.flow_level -= 1
            if self.canonical:
                self.write_indicator(',', False)
                self.write_indent()
            self.write_indicator('}', False)
            self.state = self.states.pop()
        else:
            self.write_indicator(',', False)
            if self.canonical or self.column > self.best_width:
                self.write_indent()
            if not self.canonical and self.check_simple_key():
                self.states.append(self.expect_flow_mapping_simple_value)
                self.expect_node(mapping=True, simple_key=True)
            else:
                self.write_indicator('?', True)
                self.states.append(self.expect_flow_mapping_value)
                self.expect_node(mapping=True)

    def expect_flow_mapping_simple_value(self):
        self.write_indicator(':', False)
        self.states.append(self.expect_flow_mapping_key)
        self.expect_node(mapping=True)

    def expect_flow_mapping_value(self):
        if self.canonical or self.column > self.best_width:
            self.write_indent()
        self.write_indicator(':', True)
        self.states.append(self.expect_flow_mapping_key)
        self.expect_node(mapping=True)

    # Block sequence handlers.
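An editorial aside, not part of the PyYAML distribution: in the block sequence handlers that follow, `expect_block_sequence` computes `indentless` when the sequence is a mapping value, so the `-` indicators sit at the key's own indentation level rather than being indented further. A minimal sketch via the package's `yaml.dump`:

```python
import yaml

# A sequence nested in a mapping is emitted "indentless": the '-' markers
# line up with the key instead of being indented under it.
print(yaml.dump({'items': [1, 2]}))
# items:
# - 1
# - 2
```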
    def expect_block_sequence(self):
        indentless = (self.mapping_context and not self.indention)
        self.increase_indent(flow=False, indentless=indentless)
        self.state = self.expect_first_block_sequence_item

    def expect_first_block_sequence_item(self):
        return self.expect_block_sequence_item(first=True)

    def expect_block_sequence_item(self, first=False):
        if not first and isinstance(self.event, SequenceEndEvent):
            self.indent = self.indents.pop()
            self.state = self.states.pop()
        else:
            self.write_indent()
            self.write_indicator('-', True, indention=True)
            self.states.append(self.expect_block_sequence_item)
            self.expect_node(sequence=True)

    # Block mapping handlers.

    def expect_block_mapping(self):
        self.increase_indent(flow=False)
        self.state = self.expect_first_block_mapping_key

    def expect_first_block_mapping_key(self):
        return self.expect_block_mapping_key(first=True)

    def expect_block_mapping_key(self, first=False):
        if not first and isinstance(self.event, MappingEndEvent):
            self.indent = self.indents.pop()
            self.state = self.states.pop()
        else:
            self.write_indent()
            if self.check_simple_key():
                self.states.append(self.expect_block_mapping_simple_value)
                self.expect_node(mapping=True, simple_key=True)
            else:
                self.write_indicator('?', True, indention=True)
                self.states.append(self.expect_block_mapping_value)
                self.expect_node(mapping=True)

    def expect_block_mapping_simple_value(self):
        self.write_indicator(':', False)
        self.states.append(self.expect_block_mapping_key)
        self.expect_node(mapping=True)

    def expect_block_mapping_value(self):
        self.write_indent()
        self.write_indicator(':', True, indention=True)
        self.states.append(self.expect_block_mapping_key)
        self.expect_node(mapping=True)

    # Checkers.
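An editorial aside, not part of the PyYAML distribution: two of the checkers that follow are directly visible in output. `check_empty_sequence`/`check_empty_mapping` force empty collections into flow style (`[]`/`{}`), and `check_simple_key` rejects keys of 128 characters or more, so such keys are emitted with an explicit `? ` indicator. A minimal sketch via the package's `yaml.dump`:

```python
import yaml

# Empty collections are always emitted in flow style, even under the
# default block style, because check_empty_sequence() forces it.
print(yaml.dump({'a': []}))  # a: []

# A key shorter than 128 characters is a "simple key"; a longer one is
# written with the explicit '? key\n: value' form.
print(yaml.dump({'k' * 200: 1}))
```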
    def check_empty_sequence(self):
        return (isinstance(self.event, SequenceStartEvent) and self.events
                and isinstance(self.events[0], SequenceEndEvent))

    def check_empty_mapping(self):
        return (isinstance(self.event, MappingStartEvent) and self.events
                and isinstance(self.events[0], MappingEndEvent))

    def check_empty_document(self):
        if not isinstance(self.event, DocumentStartEvent) or not self.events:
            return False
        event = self.events[0]
        return (isinstance(event, ScalarEvent) and event.anchor is None
                and event.tag is None and event.implicit and event.value == '')

    def check_simple_key(self):
        length = 0
        if isinstance(self.event, NodeEvent) and self.event.anchor is not None:
            if self.prepared_anchor is None:
                self.prepared_anchor = self.prepare_anchor(self.event.anchor)
            length += len(self.prepared_anchor)
        if isinstance(self.event, (ScalarEvent, CollectionStartEvent)) \
                and self.event.tag is not None:
            if self.prepared_tag is None:
                self.prepared_tag = self.prepare_tag(self.event.tag)
            length += len(self.prepared_tag)
        if isinstance(self.event, ScalarEvent):
            if self.analysis is None:
                self.analysis = self.analyze_scalar(self.event.value)
            length += len(self.analysis.scalar)
        return (length < 128 and (isinstance(self.event, AliasEvent)
            or (isinstance(self.event, ScalarEvent)
                    and not self.analysis.empty and not self.analysis.multiline)
            or self.check_empty_sequence() or self.check_empty_mapping()))

    # Anchor, Tag, and Scalar processors.
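An editorial aside, not part of the PyYAML distribution: `process_anchor` below writes the `&anchor`/`*alias` indicators that appear when the same Python object occurs more than once in a dumped document. A minimal sketch via the package's `yaml.dump`:

```python
import yaml

# The same list object appears twice, so the serializer assigns it an
# anchor; the emitter writes '&id001' at the first occurrence and the
# alias '*id001' at the second.
shared = [1, 2]
out = yaml.dump({'a': shared, 'b': shared})
print(out)
```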
    def process_anchor(self, indicator):
        if self.event.anchor is None:
            self.prepared_anchor = None
            return
        if self.prepared_anchor is None:
            self.prepared_anchor = self.prepare_anchor(self.event.anchor)
        if self.prepared_anchor:
            self.write_indicator(indicator+self.prepared_anchor, True)
        self.prepared_anchor = None

    def process_tag(self):
        tag = self.event.tag
        if isinstance(self.event, ScalarEvent):
            if self.style is None:
                self.style = self.choose_scalar_style()
            if ((not self.canonical or tag is None) and
                ((self.style == '' and self.event.implicit[0])
                        or (self.style != '' and self.event.implicit[1]))):
                self.prepared_tag = None
                return
            if self.event.implicit[0] and tag is None:
                tag = '!'
                self.prepared_tag = None
        else:
            if (not self.canonical or tag is None) and self.event.implicit:
                self.prepared_tag = None
                return
        if tag is None:
            raise EmitterError("tag is not specified")
        if self.prepared_tag is None:
            self.prepared_tag = self.prepare_tag(tag)
        if self.prepared_tag:
            self.write_indicator(self.prepared_tag, True)
        self.prepared_tag = None

    def choose_scalar_style(self):
        if self.analysis is None:
            self.analysis = self.analyze_scalar(self.event.value)
        if self.event.style == '"' or self.canonical:
            return '"'
        if not self.event.style and self.event.implicit[0]:
            if (not (self.simple_key_context and
                    (self.analysis.empty or self.analysis.multiline))
                and (self.flow_level and self.analysis.allow_flow_plain
                    or (not self.flow_level and self.analysis.allow_block_plain))):
                return ''
        if self.event.style and self.event.style in '|>':
            if (not self.flow_level and not self.simple_key_context
                    and self.analysis.allow_block):
                return self.event.style
        if not self.event.style or self.event.style == '\'':
            if (self.analysis.allow_single_quoted and
                    not (self.simple_key_context and self.analysis.multiline)):
                return '\''
        return '"'

    def process_scalar(self):
        if self.analysis is None:
            self.analysis = self.analyze_scalar(self.event.value)
        if self.style is None:
            self.style = self.choose_scalar_style()
        split = (not self.simple_key_context)
        #if self.analysis.multiline and split    \
        #        and (not self.style or self.style in '\'\"'):
        #    self.write_indent()
        if self.style == '"':
            self.write_double_quoted(self.analysis.scalar, split)
        elif self.style == '\'':
            self.write_single_quoted(self.analysis.scalar, split)
        elif self.style == '>':
            self.write_folded(self.analysis.scalar)
        elif self.style == '|':
            self.write_literal(self.analysis.scalar)
        else:
            self.write_plain(self.analysis.scalar, split)
        self.analysis = None
        self.style = None

    # Analyzers.

    def prepare_version(self, version):
        major, minor = version
        if major != 1:
            raise EmitterError("unsupported YAML version: %d.%d" % (major, minor))
        return '%d.%d' % (major, minor)

    def prepare_tag_handle(self, handle):
        if not handle:
            raise EmitterError("tag handle must not be empty")
        if handle[0] != '!' or handle[-1] != '!':
            raise EmitterError("tag handle must start and end with '!': %r" % handle)
        for ch in handle[1:-1]:
            if not ('0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z'
                    or ch in '-_'):
                raise EmitterError("invalid character %r in the tag handle: %r"
                        % (ch, handle))
        return handle

    def prepare_tag_prefix(self, prefix):
        if not prefix:
            raise EmitterError("tag prefix must not be empty")
        chunks = []
        start = end = 0
        if prefix[0] == '!':
            end = 1
        while end < len(prefix):
            ch = prefix[end]
            if '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z' \
                    or ch in '-;/?!:@&=+$,_.~*\'()[]':
                end += 1
            else:
                if start < end:
                    chunks.append(prefix[start:end])
                start = end = end+1
                data = ch.encode('utf-8')
                # Iterating over a bytes object yields ints on Python 3, so
                # each byte is already an integer; the original `ord(ch)`
                # here would raise TypeError for non-ASCII prefixes.
                for ch in data:
                    chunks.append('%%%02X' % ch)
        if start < end:
            chunks.append(prefix[start:end])
        return ''.join(chunks)

    def prepare_tag(self, tag):
        if not tag:
            raise EmitterError("tag must not be empty")
        if tag == '!':
            return tag
        handle = None
        suffix = tag
        prefixes = sorted(self.tag_prefixes.keys())
        for prefix in prefixes:
            if tag.startswith(prefix) \
                    and (prefix == '!' or len(prefix) < len(tag)):
                handle = self.tag_prefixes[prefix]
                suffix = tag[len(prefix):]
        chunks = []
        start = end = 0
        while end < len(suffix):
            ch = suffix[end]
            if '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z' \
                    or ch in '-;/?:@&=+$,_.~*\'()[]' \
                    or (ch == '!' and handle != '!'):
                end += 1
            else:
                if start < end:
                    chunks.append(suffix[start:end])
                start = end = end+1
                data = ch.encode('utf-8')
                for ch in data:
                    chunks.append('%%%02X' % ch)
        if start < end:
            chunks.append(suffix[start:end])
        suffix_text = ''.join(chunks)
        if handle:
            return '%s%s' % (handle, suffix_text)
        else:
            return '!<%s>' % suffix_text

    def prepare_anchor(self, anchor):
        if not anchor:
            raise EmitterError("anchor must not be empty")
        for ch in anchor:
            if not ('0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z'
                    or ch in '-_'):
                raise EmitterError("invalid character %r in the anchor: %r"
                        % (ch, anchor))
        return anchor

    def analyze_scalar(self, scalar):

        # Empty scalar is a special case.
        if not scalar:
            return ScalarAnalysis(scalar=scalar, empty=True, multiline=False,
                    allow_flow_plain=False, allow_block_plain=True,
                    allow_single_quoted=True, allow_double_quoted=True,
                    allow_block=False)

        # Indicators and special characters.
        block_indicators = False
        flow_indicators = False
        line_breaks = False
        special_characters = False

        # Important whitespace combinations.
        leading_space = False
        leading_break = False
        trailing_space = False
        trailing_break = False
        break_space = False
        space_break = False

        # Check document indicators.
        if scalar.startswith('---') or scalar.startswith('...'):
            block_indicators = True
            flow_indicators = True

        # First character or preceded by a whitespace.
        preceded_by_whitespace = True

        # Last character or followed by a whitespace.
        followed_by_whitespace = (len(scalar) == 1 or
                scalar[1] in '\0 \t\r\n\x85\u2028\u2029')

        # The previous character is a space.
        previous_space = False

        # The previous character is a break.
        previous_break = False

        index = 0
        while index < len(scalar):
            ch = scalar[index]

            # Check for indicators.
            if index == 0:
                # Leading indicators are special characters.
                if ch in '#,[]{}&*!|>\'\"%@`':
                    flow_indicators = True
                    block_indicators = True
                if ch in '?:':
                    flow_indicators = True
                    if followed_by_whitespace:
                        block_indicators = True
                if ch == '-' and followed_by_whitespace:
                    flow_indicators = True
                    block_indicators = True
            else:
                # Some indicators cannot appear within a scalar as well.
                if ch in ',?[]{}':
                    flow_indicators = True
                if ch == ':':
                    flow_indicators = True
                    if followed_by_whitespace:
                        block_indicators = True
                if ch == '#' and preceded_by_whitespace:
                    flow_indicators = True
                    block_indicators = True

            # Check for line breaks, special, and unicode characters.
            if ch in '\n\x85\u2028\u2029':
                line_breaks = True
            if not (ch == '\n' or '\x20' <= ch <= '\x7E'):
                if (ch == '\x85' or '\xA0' <= ch <= '\uD7FF'
                        or '\uE000' <= ch <= '\uFFFD'
                        or '\U00010000' <= ch < '\U0010ffff') and ch != '\uFEFF':
                    unicode_characters = True
                    if not self.allow_unicode:
                        special_characters = True
                else:
                    special_characters = True

            # Detect important whitespace combinations.
            if ch == ' ':
                if index == 0:
                    leading_space = True
                if index == len(scalar)-1:
                    trailing_space = True
                if previous_break:
                    break_space = True
                previous_space = True
                previous_break = False
            elif ch in '\n\x85\u2028\u2029':
                if index == 0:
                    leading_break = True
                if index == len(scalar)-1:
                    trailing_break = True
                if previous_space:
                    space_break = True
                previous_space = False
                previous_break = True
            else:
                previous_space = False
                previous_break = False

            # Prepare for the next character.
            index += 1
            preceded_by_whitespace = (ch in '\0 \t\r\n\x85\u2028\u2029')
            followed_by_whitespace = (index+1 >= len(scalar) or
                    scalar[index+1] in '\0 \t\r\n\x85\u2028\u2029')

        # Let's decide what styles are allowed.
        allow_flow_plain = True
        allow_block_plain = True
        allow_single_quoted = True
        allow_double_quoted = True
        allow_block = True

        # Leading and trailing whitespaces are bad for plain scalars.
        if (leading_space or leading_break
                or trailing_space or trailing_break):
            allow_flow_plain = allow_block_plain = False

        # We do not permit trailing spaces for block scalars.
        if trailing_space:
            allow_block = False

        # Spaces at the beginning of a new line are only acceptable for block
        # scalars.
        if break_space:
            allow_flow_plain = allow_block_plain = allow_single_quoted = False

        # Spaces followed by breaks, as well as special characters, are only
        # allowed for double quoted scalars.
        if space_break or special_characters:
            allow_flow_plain = allow_block_plain = \
                    allow_single_quoted = allow_block = False

        # Although the plain scalar writer supports breaks, we never emit
        # multiline plain scalars.
        if line_breaks:
            allow_flow_plain = allow_block_plain = False

        # Flow indicators are forbidden for flow plain scalars.
        if flow_indicators:
            allow_flow_plain = False

        # Block indicators are forbidden for block plain scalars.
        if block_indicators:
            allow_block_plain = False

        return ScalarAnalysis(scalar=scalar,
                empty=False, multiline=line_breaks,
                allow_flow_plain=allow_flow_plain,
                allow_block_plain=allow_block_plain,
                allow_single_quoted=allow_single_quoted,
                allow_double_quoted=allow_double_quoted,
                allow_block=allow_block)

    # Writers.

    def flush_stream(self):
        if hasattr(self.stream, 'flush'):
            self.stream.flush()

    def write_stream_start(self):
        # Write BOM if needed.
        if self.encoding and self.encoding.startswith('utf-16'):
            self.stream.write('\uFEFF'.encode(self.encoding))

    def write_stream_end(self):
        self.flush_stream()

    def write_indicator(self, indicator, need_whitespace,
            whitespace=False, indention=False):
        if self.whitespace or not need_whitespace:
            data = indicator
        else:
            data = ' '+indicator
        self.whitespace = whitespace
        self.indention = self.indention and indention
        self.column += len(data)
        self.open_ended = False
        if self.encoding:
            data = data.encode(self.encoding)
        self.stream.write(data)

    def write_indent(self):
        indent = self.indent or 0
        if not self.indention or self.column > indent \
                or (self.column == indent and not self.whitespace):
            self.write_line_break()
        if self.column < indent:
            self.whitespace = True
            data = ' '*(indent-self.column)
            self.column = indent
            if self.encoding:
                data = data.encode(self.encoding)
            self.stream.write(data)

    def write_line_break(self, data=None):
        if data is None:
            data = self.best_line_break
        self.whitespace = True
        self.indention = True
        self.line += 1
        self.column = 0
        if self.encoding:
            data = data.encode(self.encoding)
        self.stream.write(data)

    def write_version_directive(self, version_text):
        data = '%%YAML %s' % version_text
        if self.encoding:
            data = data.encode(self.encoding)
        self.stream.write(data)
        self.write_line_break()

    def write_tag_directive(self, handle_text, prefix_text):
        data = '%%TAG %s %s' % (handle_text, prefix_text)
        if self.encoding:
            data = data.encode(self.encoding)
        self.stream.write(data)
        self.write_line_break()

    # Scalar streams.
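An editorial aside, not part of the PyYAML distribution: the scalar writers that follow are chosen by `choose_scalar_style` above, and the choice is visible in ordinary output. A plain string stays unquoted; a string that would otherwise resolve to another type (like `yes`, a YAML 1.1 bool) is single-quoted; a string with control characters is double-quoted using `ESCAPE_REPLACEMENTS`. A minimal sketch via the package's `yaml.dump`:

```python
import yaml

# Plain style; a plain root scalar leaves the document "open ended",
# so PyYAML appends an explicit '...' end-of-document marker.
print(yaml.dump('hello'))    # hello\n...\n

# Single-quoted, because plain 'yes' would parse as a boolean.
print(yaml.dump('yes'))      # 'yes'

# Double-quoted with an escape from ESCAPE_REPLACEMENTS ('\x07' -> '\a').
print(yaml.dump('\x07'))     # "\a"
```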
def write_single_quoted(self, text, split=True): self.write_indicator('\'', True) spaces = False breaks = False start = end = 0 while end <= len(text): ch = None if end < len(text): ch = text[end] if spaces: if ch is None or ch != ' ': if start+1 == end and self.column > self.best_width and split \ and start != 0 and end != len(text): self.write_indent() else: data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end elif breaks: if ch is None or ch not in '\n\x85\u2028\u2029': if text[start] == '\n': self.write_line_break() for br in text[start:end]: if br == '\n': self.write_line_break() else: self.write_line_break(br) self.write_indent() start = end else: if ch is None or ch in ' \n\x85\u2028\u2029' or ch == '\'': if start < end: data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end if ch == '\'': data = '\'\'' self.column += 2 if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end + 1 if ch is not None: spaces = (ch == ' ') breaks = (ch in '\n\x85\u2028\u2029') end += 1 self.write_indicator('\'', False) ESCAPE_REPLACEMENTS = { '\0': '0', '\x07': 'a', '\x08': 'b', '\x09': 't', '\x0A': 'n', '\x0B': 'v', '\x0C': 'f', '\x0D': 'r', '\x1B': 'e', '\"': '\"', '\\': '\\', '\x85': 'N', '\xA0': '_', '\u2028': 'L', '\u2029': 'P', } def write_double_quoted(self, text, split=True): self.write_indicator('"', True) start = end = 0 while end <= len(text): ch = None if end < len(text): ch = text[end] if ch is None or ch in '"\\\x85\u2028\u2029\uFEFF' \ or not ('\x20' <= ch <= '\x7E' or (self.allow_unicode and ('\xA0' <= ch <= '\uD7FF' or '\uE000' <= ch <= '\uFFFD'))): if start < end: data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end if ch is not None: if ch in self.ESCAPE_REPLACEMENTS: data = 
'\\'+self.ESCAPE_REPLACEMENTS[ch] elif ch <= '\xFF': data = '\\x%02X' % ord(ch) elif ch <= '\uFFFF': data = '\\u%04X' % ord(ch) else: data = '\\U%08X' % ord(ch) self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end+1 if 0 < end < len(text)-1 and (ch == ' ' or start >= end) \ and self.column+(end-start) > self.best_width and split: data = text[start:end]+'\\' if start < end: start = end self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) self.write_indent() self.whitespace = False self.indention = False if text[start] == ' ': data = '\\' self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) end += 1 self.write_indicator('"', False) def determine_block_hints(self, text): hints = '' if text: if text[0] in ' \n\x85\u2028\u2029': hints += str(self.best_indent) if text[-1] not in '\n\x85\u2028\u2029': hints += '-' elif len(text) == 1 or text[-2] in '\n\x85\u2028\u2029': hints += '+' return hints def write_folded(self, text): hints = self.determine_block_hints(text) self.write_indicator('>'+hints, True) if hints[-1:] == '+': self.open_ended = True self.write_line_break() leading_space = True spaces = False breaks = True start = end = 0 while end <= len(text): ch = None if end < len(text): ch = text[end] if breaks: if ch is None or ch not in '\n\x85\u2028\u2029': if not leading_space and ch is not None and ch != ' ' \ and text[start] == '\n': self.write_line_break() leading_space = (ch == ' ') for br in text[start:end]: if br == '\n': self.write_line_break() else: self.write_line_break(br) if ch is not None: self.write_indent() start = end elif spaces: if ch != ' ': if start+1 == end and self.column > self.best_width: self.write_indent() else: data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end else: if ch is None or ch in ' 
\n\x85\u2028\u2029': data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) if ch is None: self.write_line_break() start = end if ch is not None: breaks = (ch in '\n\x85\u2028\u2029') spaces = (ch == ' ') end += 1 def write_literal(self, text): hints = self.determine_block_hints(text) self.write_indicator('|'+hints, True) if hints[-1:] == '+': self.open_ended = True self.write_line_break() breaks = True start = end = 0 while end <= len(text): ch = None if end < len(text): ch = text[end] if breaks: if ch is None or ch not in '\n\x85\u2028\u2029': for br in text[start:end]: if br == '\n': self.write_line_break() else: self.write_line_break(br) if ch is not None: self.write_indent() start = end else: if ch is None or ch in '\n\x85\u2028\u2029': data = text[start:end] if self.encoding: data = data.encode(self.encoding) self.stream.write(data) if ch is None: self.write_line_break() start = end if ch is not None: breaks = (ch in '\n\x85\u2028\u2029') end += 1 def write_plain(self, text, split=True): if self.root_context: self.open_ended = True if not text: return if not self.whitespace: data = ' ' self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) self.whitespace = False self.indention = False spaces = False breaks = False start = end = 0 while end <= len(text): ch = None if end < len(text): ch = text[end] if spaces: if ch != ' ': if start+1 == end and self.column > self.best_width and split: self.write_indent() self.whitespace = False self.indention = False else: data = text[start:end] self.column += len(data) if self.encoding: data = data.encode(self.encoding) self.stream.write(data) start = end elif breaks: if ch not in '\n\x85\u2028\u2029': if text[start] == '\n': self.write_line_break() for br in text[start:end]: if br == '\n': self.write_line_break() else: self.write_line_break(br) self.write_indent() self.whitespace = False self.indention = False 
                    start = end
            else:
                if ch is None or ch in ' \n\x85\u2028\u2029':
                    data = text[start:end]
                    self.column += len(data)
                    if self.encoding:
                        data = data.encode(self.encoding)
                    self.stream.write(data)
                    start = end
            if ch is not None:
                spaces = (ch == ' ')
                breaks = (ch in '\n\x85\u2028\u2029')
            end += 1

# ---- PyYAML-6.0.1/lib/yaml/error.py ----

__all__ = ['Mark', 'YAMLError', 'MarkedYAMLError']

class Mark:

    def __init__(self, name, index, line, column, buffer, pointer):
        self.name = name
        self.index = index
        self.line = line
        self.column = column
        self.buffer = buffer
        self.pointer = pointer

    def get_snippet(self, indent=4, max_length=75):
        if self.buffer is None:
            return None
        head = ''
        start = self.pointer
        while start > 0 and self.buffer[start-1] not in '\0\r\n\x85\u2028\u2029':
            start -= 1
            if self.pointer-start > max_length/2-1:
                head = ' ... '
                start += 5
                break
        tail = ''
        end = self.pointer
        while end < len(self.buffer) and self.buffer[end] not in '\0\r\n\x85\u2028\u2029':
            end += 1
            if end-self.pointer > max_length/2-1:
                tail = ' ... '
                end -= 5
                break
        snippet = self.buffer[start:end]
        return ' '*indent + head + snippet + tail + '\n'  \
                + ' '*(indent+self.pointer-start+len(head)) + '^'

    def __str__(self):
        snippet = self.get_snippet()
        where = " in \"%s\", line %d, column %d"   \
                % (self.name, self.line+1, self.column+1)
        if snippet is not None:
            where += ":\n"+snippet
        return where

class YAMLError(Exception):
    pass

class MarkedYAMLError(YAMLError):

    def __init__(self, context=None, context_mark=None,
            problem=None, problem_mark=None, note=None):
        self.context = context
        self.context_mark = context_mark
        self.problem = problem
        self.problem_mark = problem_mark
        self.note = note

    def __str__(self):
        lines = []
        if self.context is not None:
            lines.append(self.context)
        if self.context_mark is not None  \
                and (self.problem is None or self.problem_mark is None
                        or self.context_mark.name != self.problem_mark.name
                        or self.context_mark.line != self.problem_mark.line
                        or self.context_mark.column != self.problem_mark.column):
            lines.append(str(self.context_mark))
        if self.problem is not None:
            lines.append(self.problem)
        if self.problem_mark is not None:
            lines.append(str(self.problem_mark))
        if self.note is not None:
            lines.append(self.note)
        return '\n'.join(lines)

# ---- PyYAML-6.0.1/lib/yaml/events.py ----

# Abstract classes.
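The `Mark` and `MarkedYAMLError` machinery above is what formats PyYAML's parse errors: `get_snippet` produces the excerpt with a `^` pointer, and `MarkedYAMLError.__str__` stitches context and problem marks together. A minimal way to see it from the public API:

```python
import yaml

# A deliberately malformed document: the flow sequence is never closed.
try:
    yaml.safe_load("a: [1, 2")
except yaml.YAMLError as exc:
    # Parser/scanner errors subclass MarkedYAMLError and carry Mark objects;
    # str(exc) includes the get_snippet() output with the '^' pointer.
    print(exc)
    assert exc.problem_mark is not None
```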
class Event(object):
    def __init__(self, start_mark=None, end_mark=None):
        self.start_mark = start_mark
        self.end_mark = end_mark
    def __repr__(self):
        attributes = [key for key in ['anchor', 'tag', 'implicit', 'value']
                if hasattr(self, key)]
        arguments = ', '.join(['%s=%r' % (key, getattr(self, key))
                for key in attributes])
        return '%s(%s)' % (self.__class__.__name__, arguments)

class NodeEvent(Event):
    def __init__(self, anchor, start_mark=None, end_mark=None):
        self.anchor = anchor
        self.start_mark = start_mark
        self.end_mark = end_mark

class CollectionStartEvent(NodeEvent):
    def __init__(self, anchor, tag, implicit, start_mark=None, end_mark=None,
            flow_style=None):
        self.anchor = anchor
        self.tag = tag
        self.implicit = implicit
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.flow_style = flow_style

class CollectionEndEvent(Event):
    pass

# Implementations.

class StreamStartEvent(Event):
    def __init__(self, start_mark=None, end_mark=None, encoding=None):
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.encoding = encoding

class StreamEndEvent(Event):
    pass

class DocumentStartEvent(Event):
    def __init__(self, start_mark=None, end_mark=None,
            explicit=None, version=None, tags=None):
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.explicit = explicit
        self.version = version
        self.tags = tags

class DocumentEndEvent(Event):
    def __init__(self, start_mark=None, end_mark=None, explicit=None):
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.explicit = explicit

class AliasEvent(NodeEvent):
    pass

class ScalarEvent(NodeEvent):
    def __init__(self, anchor, tag, implicit, value,
            start_mark=None, end_mark=None, style=None):
        self.anchor = anchor
        self.tag = tag
        self.implicit = implicit
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.style = style

class SequenceStartEvent(CollectionStartEvent):
    pass

class SequenceEndEvent(CollectionEndEvent):
    pass

class MappingStartEvent(CollectionStartEvent):
    pass

class MappingEndEvent(CollectionEndEvent):
    pass

# ---- PyYAML-6.0.1/lib/yaml/loader.py ----

__all__ = ['BaseLoader', 'FullLoader', 'SafeLoader', 'Loader', 'UnsafeLoader']

from .reader import *
from .scanner import *
from .parser import *
from .composer import *
from .constructor import *
from .resolver import *

class BaseLoader(Reader, Scanner, Parser, Composer, BaseConstructor, BaseResolver):

    def __init__(self, stream):
        Reader.__init__(self, stream)
        Scanner.__init__(self)
        Parser.__init__(self)
        Composer.__init__(self)
        BaseConstructor.__init__(self)
        BaseResolver.__init__(self)

class FullLoader(Reader, Scanner, Parser, Composer, FullConstructor, Resolver):

    def __init__(self, stream):
        Reader.__init__(self, stream)
        Scanner.__init__(self)
        Parser.__init__(self)
        Composer.__init__(self)
        FullConstructor.__init__(self)
        Resolver.__init__(self)

class SafeLoader(Reader, Scanner, Parser, Composer, SafeConstructor, Resolver):

    def __init__(self, stream):
        Reader.__init__(self, stream)
        Scanner.__init__(self)
        Parser.__init__(self)
        Composer.__init__(self)
        SafeConstructor.__init__(self)
        Resolver.__init__(self)

class Loader(Reader, Scanner, Parser, Composer, Constructor, Resolver):

    def __init__(self, stream):
        Reader.__init__(self, stream)
        Scanner.__init__(self)
        Parser.__init__(self)
        Composer.__init__(self)
        Constructor.__init__(self)
        Resolver.__init__(self)

# UnsafeLoader is the same as Loader (which is and was always unsafe on
# untrusted input). Use of either Loader or UnsafeLoader should be rare, since
# FullLoader should be able to load almost all YAML safely. Loader is left
# intact to ensure backwards compatibility.
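As the comment above notes, the loader hierarchy exists so callers can pick a trust level; the `safe_load`/`full_load` shorthands are just `load` with the corresponding Loader class. A quick sketch of the equivalences (all public API, nothing assumed):

```python
import yaml

doc = "name: demo\ncount: 3\n"

# Since the removal of the Loader default, yaml.load() requires an explicit
# Loader argument; the shorthand functions supply it for you.
assert yaml.load(doc, Loader=yaml.SafeLoader) == yaml.safe_load(doc)
assert yaml.load(doc, Loader=yaml.FullLoader) == yaml.full_load(doc)
assert yaml.safe_load(doc) == {"name": "demo", "count": 3}
```

Prefer `safe_load` for untrusted input; `FullLoader` additionally resolves standard Python-native tags, and `Loader`/`UnsafeLoader` can construct arbitrary objects.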
class UnsafeLoader(Reader, Scanner, Parser, Composer, Constructor, Resolver):

    def __init__(self, stream):
        Reader.__init__(self, stream)
        Scanner.__init__(self)
        Parser.__init__(self)
        Composer.__init__(self)
        Constructor.__init__(self)
        Resolver.__init__(self)

# ---- PyYAML-6.0.1/lib/yaml/nodes.py ----

class Node(object):
    def __init__(self, tag, value, start_mark, end_mark):
        self.tag = tag
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark
    def __repr__(self):
        value = self.value
        #if isinstance(value, list):
        #    if len(value) == 0:
        #        value = '<empty>'
        #    elif len(value) == 1:
        #        value = '<1 item>'
        #    else:
        #        value = '<%d items>' % len(value)
        #else:
        #    if len(value) > 75:
        #        value = repr(value[:70]+u' ... ')
        #    else:
        #        value = repr(value)
        value = repr(value)
        return '%s(tag=%r, value=%s)' % (self.__class__.__name__,
                self.tag, value)

class ScalarNode(Node):
    id = 'scalar'
    def __init__(self, tag, value,
            start_mark=None, end_mark=None, style=None):
        self.tag = tag
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.style = style

class CollectionNode(Node):
    def __init__(self, tag, value,
            start_mark=None, end_mark=None, flow_style=None):
        self.tag = tag
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.flow_style = flow_style

class SequenceNode(CollectionNode):
    id = 'sequence'

class MappingNode(CollectionNode):
    id = 'mapping'

# ---- PyYAML-6.0.1/lib/yaml/parser.py ----

# The following YAML grammar is LL(1) and is parsed by a recursive descent
# parser.
#
# stream ::= STREAM-START implicit_document?
explicit_document* STREAM-END # implicit_document ::= block_node DOCUMENT-END* # explicit_document ::= DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END* # block_node_or_indentless_sequence ::= # ALIAS # | properties (block_content | indentless_block_sequence)? # | block_content # | indentless_block_sequence # block_node ::= ALIAS # | properties block_content? # | block_content # flow_node ::= ALIAS # | properties flow_content? # | flow_content # properties ::= TAG ANCHOR? | ANCHOR TAG? # block_content ::= block_collection | flow_collection | SCALAR # flow_content ::= flow_collection | SCALAR # block_collection ::= block_sequence | block_mapping # flow_collection ::= flow_sequence | flow_mapping # block_sequence ::= BLOCK-SEQUENCE-START (BLOCK-ENTRY block_node?)* BLOCK-END # indentless_sequence ::= (BLOCK-ENTRY block_node?)+ # block_mapping ::= BLOCK-MAPPING_START # ((KEY block_node_or_indentless_sequence?)? # (VALUE block_node_or_indentless_sequence?)?)* # BLOCK-END # flow_sequence ::= FLOW-SEQUENCE-START # (flow_sequence_entry FLOW-ENTRY)* # flow_sequence_entry? # FLOW-SEQUENCE-END # flow_sequence_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)? # flow_mapping ::= FLOW-MAPPING-START # (flow_mapping_entry FLOW-ENTRY)* # flow_mapping_entry? # FLOW-MAPPING-END # flow_mapping_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)? 
# # FIRST sets: # # stream: { STREAM-START } # explicit_document: { DIRECTIVE DOCUMENT-START } # implicit_document: FIRST(block_node) # block_node: { ALIAS TAG ANCHOR SCALAR BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START } # flow_node: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START } # block_content: { BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START SCALAR } # flow_content: { FLOW-SEQUENCE-START FLOW-MAPPING-START SCALAR } # block_collection: { BLOCK-SEQUENCE-START BLOCK-MAPPING-START } # flow_collection: { FLOW-SEQUENCE-START FLOW-MAPPING-START } # block_sequence: { BLOCK-SEQUENCE-START } # block_mapping: { BLOCK-MAPPING-START } # block_node_or_indentless_sequence: { ALIAS ANCHOR TAG SCALAR BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START BLOCK-ENTRY } # indentless_sequence: { ENTRY } # flow_collection: { FLOW-SEQUENCE-START FLOW-MAPPING-START } # flow_sequence: { FLOW-SEQUENCE-START } # flow_mapping: { FLOW-MAPPING-START } # flow_sequence_entry: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START KEY } # flow_mapping_entry: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START KEY } __all__ = ['Parser', 'ParserError'] from .error import MarkedYAMLError from .tokens import * from .events import * from .scanner import * class ParserError(MarkedYAMLError): pass class Parser: # Since writing a recursive-descendant parser is a straightforward task, we # do not give many comments here. DEFAULT_TAGS = { '!': '!', '!!': 'tag:yaml.org,2002:', } def __init__(self): self.current_event = None self.yaml_version = None self.tag_handles = {} self.states = [] self.marks = [] self.state = self.parse_stream_start def dispose(self): # Reset the state attributes (to clear self-references) self.states = [] self.state = None def check_event(self, *choices): # Check the type of the next event. 
if self.current_event is None: if self.state: self.current_event = self.state() if self.current_event is not None: if not choices: return True for choice in choices: if isinstance(self.current_event, choice): return True return False def peek_event(self): # Get the next event. if self.current_event is None: if self.state: self.current_event = self.state() return self.current_event def get_event(self): # Get the next event and proceed further. if self.current_event is None: if self.state: self.current_event = self.state() value = self.current_event self.current_event = None return value # stream ::= STREAM-START implicit_document? explicit_document* STREAM-END # implicit_document ::= block_node DOCUMENT-END* # explicit_document ::= DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END* def parse_stream_start(self): # Parse the stream start. token = self.get_token() event = StreamStartEvent(token.start_mark, token.end_mark, encoding=token.encoding) # Prepare the next state. self.state = self.parse_implicit_document_start return event def parse_implicit_document_start(self): # Parse an implicit document. if not self.check_token(DirectiveToken, DocumentStartToken, StreamEndToken): self.tag_handles = self.DEFAULT_TAGS token = self.peek_token() start_mark = end_mark = token.start_mark event = DocumentStartEvent(start_mark, end_mark, explicit=False) # Prepare the next state. self.states.append(self.parse_document_end) self.state = self.parse_block_node return event else: return self.parse_document_start() def parse_document_start(self): # Parse any extra document end indicators. while self.check_token(DocumentEndToken): self.get_token() # Parse an explicit document. 
        if not self.check_token(StreamEndToken):
            token = self.peek_token()
            start_mark = token.start_mark
            version, tags = self.process_directives()
            if not self.check_token(DocumentStartToken):
                raise ParserError(None, None,
                        "expected '<document start>', but found %r"
                        % self.peek_token().id, self.peek_token().start_mark)
            token = self.get_token()
            end_mark = token.end_mark
            event = DocumentStartEvent(start_mark, end_mark,
                    explicit=True, version=version, tags=tags)
            self.states.append(self.parse_document_end)
            self.state = self.parse_document_content
        else:
            # Parse the end of the stream.
            token = self.get_token()
            event = StreamEndEvent(token.start_mark, token.end_mark)
            assert not self.states
            assert not self.marks
            self.state = None
        return event

    def parse_document_end(self):
        # Parse the document end.
        token = self.peek_token()
        start_mark = end_mark = token.start_mark
        explicit = False
        if self.check_token(DocumentEndToken):
            token = self.get_token()
            end_mark = token.end_mark
            explicit = True
        event = DocumentEndEvent(start_mark, end_mark,
                explicit=explicit)
        # Prepare the next state.
self.state = self.parse_document_start return event def parse_document_content(self): if self.check_token(DirectiveToken, DocumentStartToken, DocumentEndToken, StreamEndToken): event = self.process_empty_scalar(self.peek_token().start_mark) self.state = self.states.pop() return event else: return self.parse_block_node() def process_directives(self): self.yaml_version = None self.tag_handles = {} while self.check_token(DirectiveToken): token = self.get_token() if token.name == 'YAML': if self.yaml_version is not None: raise ParserError(None, None, "found duplicate YAML directive", token.start_mark) major, minor = token.value if major != 1: raise ParserError(None, None, "found incompatible YAML document (version 1.* is required)", token.start_mark) self.yaml_version = token.value elif token.name == 'TAG': handle, prefix = token.value if handle in self.tag_handles: raise ParserError(None, None, "duplicate tag handle %r" % handle, token.start_mark) self.tag_handles[handle] = prefix if self.tag_handles: value = self.yaml_version, self.tag_handles.copy() else: value = self.yaml_version, None for key in self.DEFAULT_TAGS: if key not in self.tag_handles: self.tag_handles[key] = self.DEFAULT_TAGS[key] return value # block_node_or_indentless_sequence ::= ALIAS # | properties (block_content | indentless_block_sequence)? # | block_content # | indentless_block_sequence # block_node ::= ALIAS # | properties block_content? # | block_content # flow_node ::= ALIAS # | properties flow_content? # | flow_content # properties ::= TAG ANCHOR? | ANCHOR TAG? 
# block_content ::= block_collection | flow_collection | SCALAR # flow_content ::= flow_collection | SCALAR # block_collection ::= block_sequence | block_mapping # flow_collection ::= flow_sequence | flow_mapping def parse_block_node(self): return self.parse_node(block=True) def parse_flow_node(self): return self.parse_node() def parse_block_node_or_indentless_sequence(self): return self.parse_node(block=True, indentless_sequence=True) def parse_node(self, block=False, indentless_sequence=False): if self.check_token(AliasToken): token = self.get_token() event = AliasEvent(token.value, token.start_mark, token.end_mark) self.state = self.states.pop() else: anchor = None tag = None start_mark = end_mark = tag_mark = None if self.check_token(AnchorToken): token = self.get_token() start_mark = token.start_mark end_mark = token.end_mark anchor = token.value if self.check_token(TagToken): token = self.get_token() tag_mark = token.start_mark end_mark = token.end_mark tag = token.value elif self.check_token(TagToken): token = self.get_token() start_mark = tag_mark = token.start_mark end_mark = token.end_mark tag = token.value if self.check_token(AnchorToken): token = self.get_token() end_mark = token.end_mark anchor = token.value if tag is not None: handle, suffix = tag if handle is not None: if handle not in self.tag_handles: raise ParserError("while parsing a node", start_mark, "found undefined tag handle %r" % handle, tag_mark) tag = self.tag_handles[handle]+suffix else: tag = suffix #if tag == '!': # raise ParserError("while parsing a node", start_mark, # "found non-specific tag '!'", tag_mark, # "Please check 'http://pyyaml.org/wiki/YAMLNonSpecificTag' and share your opinion.") if start_mark is None: start_mark = end_mark = self.peek_token().start_mark event = None implicit = (tag is None or tag == '!') if indentless_sequence and self.check_token(BlockEntryToken): end_mark = self.peek_token().end_mark event = SequenceStartEvent(anchor, tag, implicit, start_mark, 
end_mark) self.state = self.parse_indentless_sequence_entry else: if self.check_token(ScalarToken): token = self.get_token() end_mark = token.end_mark if (token.plain and tag is None) or tag == '!': implicit = (True, False) elif tag is None: implicit = (False, True) else: implicit = (False, False) event = ScalarEvent(anchor, tag, implicit, token.value, start_mark, end_mark, style=token.style) self.state = self.states.pop() elif self.check_token(FlowSequenceStartToken): end_mark = self.peek_token().end_mark event = SequenceStartEvent(anchor, tag, implicit, start_mark, end_mark, flow_style=True) self.state = self.parse_flow_sequence_first_entry elif self.check_token(FlowMappingStartToken): end_mark = self.peek_token().end_mark event = MappingStartEvent(anchor, tag, implicit, start_mark, end_mark, flow_style=True) self.state = self.parse_flow_mapping_first_key elif block and self.check_token(BlockSequenceStartToken): end_mark = self.peek_token().start_mark event = SequenceStartEvent(anchor, tag, implicit, start_mark, end_mark, flow_style=False) self.state = self.parse_block_sequence_first_entry elif block and self.check_token(BlockMappingStartToken): end_mark = self.peek_token().start_mark event = MappingStartEvent(anchor, tag, implicit, start_mark, end_mark, flow_style=False) self.state = self.parse_block_mapping_first_key elif anchor is not None or tag is not None: # Empty scalars are allowed even if a tag or an anchor is # specified. 
                event = ScalarEvent(anchor, tag, (implicit, False), '',
                        start_mark, end_mark)
                self.state = self.states.pop()
            else:
                if block:
                    node = 'block'
                else:
                    node = 'flow'
                token = self.peek_token()
                raise ParserError("while parsing a %s node" % node, start_mark,
                        "expected the node content, but found %r" % token.id,
                        token.start_mark)
        return event

    # block_sequence ::= BLOCK-SEQUENCE-START (BLOCK-ENTRY block_node?)* BLOCK-END

    def parse_block_sequence_first_entry(self):
        token = self.get_token()
        self.marks.append(token.start_mark)
        return self.parse_block_sequence_entry()

    def parse_block_sequence_entry(self):
        if self.check_token(BlockEntryToken):
            token = self.get_token()
            if not self.check_token(BlockEntryToken, BlockEndToken):
                self.states.append(self.parse_block_sequence_entry)
                return self.parse_block_node()
            else:
                self.state = self.parse_block_sequence_entry
                return self.process_empty_scalar(token.end_mark)
        if not self.check_token(BlockEndToken):
            token = self.peek_token()
            raise ParserError("while parsing a block collection", self.marks[-1],
                    "expected <block end>, but found %r" % token.id,
                    token.start_mark)
        token = self.get_token()
        event = SequenceEndEvent(token.start_mark, token.end_mark)
        self.state = self.states.pop()
        self.marks.pop()
        return event

    # indentless_sequence ::= (BLOCK-ENTRY block_node?)+

    def parse_indentless_sequence_entry(self):
        if self.check_token(BlockEntryToken):
            token = self.get_token()
            if not self.check_token(BlockEntryToken,
                    KeyToken, ValueToken, BlockEndToken):
                self.states.append(self.parse_indentless_sequence_entry)
                return self.parse_block_node()
            else:
                self.state = self.parse_indentless_sequence_entry
                return self.process_empty_scalar(token.end_mark)
        token = self.peek_token()
        event = SequenceEndEvent(token.start_mark, token.start_mark)
        self.state = self.states.pop()
        return event

    # block_mapping ::= BLOCK-MAPPING-START
    #                       ((KEY block_node_or_indentless_sequence?)?
    #                       (VALUE block_node_or_indentless_sequence?)?)*
    #                   BLOCK-END

    def parse_block_mapping_first_key(self):
        token = self.get_token()
        self.marks.append(token.start_mark)
        return self.parse_block_mapping_key()

    def parse_block_mapping_key(self):
        if self.check_token(KeyToken):
            token = self.get_token()
            if not self.check_token(KeyToken, ValueToken, BlockEndToken):
                self.states.append(self.parse_block_mapping_value)
                return self.parse_block_node_or_indentless_sequence()
            else:
                self.state = self.parse_block_mapping_value
                return self.process_empty_scalar(token.end_mark)
        if not self.check_token(BlockEndToken):
            token = self.peek_token()
            raise ParserError("while parsing a block mapping", self.marks[-1],
                    "expected <block end>, but found %r" % token.id,
                    token.start_mark)
        token = self.get_token()
        event = MappingEndEvent(token.start_mark, token.end_mark)
        self.state = self.states.pop()
        self.marks.pop()
        return event

    def parse_block_mapping_value(self):
        if self.check_token(ValueToken):
            token = self.get_token()
            if not self.check_token(KeyToken, ValueToken, BlockEndToken):
                self.states.append(self.parse_block_mapping_key)
                return self.parse_block_node_or_indentless_sequence()
            else:
                self.state = self.parse_block_mapping_key
                return self.process_empty_scalar(token.end_mark)
        else:
            self.state = self.parse_block_mapping_key
            token = self.peek_token()
            return self.process_empty_scalar(token.start_mark)

    # flow_sequence ::= FLOW-SEQUENCE-START
    #                       (flow_sequence_entry FLOW-ENTRY)*
    #                       flow_sequence_entry?
    #                   FLOW-SEQUENCE-END
    # flow_sequence_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)?
    #
    # Note that while production rules for both flow_sequence_entry and
    # flow_mapping_entry are equal, their interpretations are different.
    # For `flow_sequence_entry`, the part `KEY flow_node? (VALUE flow_node?)?`
    # generates an inline mapping (set syntax).
def parse_flow_sequence_first_entry(self): token = self.get_token() self.marks.append(token.start_mark) return self.parse_flow_sequence_entry(first=True) def parse_flow_sequence_entry(self, first=False): if not self.check_token(FlowSequenceEndToken): if not first: if self.check_token(FlowEntryToken): self.get_token() else: token = self.peek_token() raise ParserError("while parsing a flow sequence", self.marks[-1], "expected ',' or ']', but got %r" % token.id, token.start_mark) if self.check_token(KeyToken): token = self.peek_token() event = MappingStartEvent(None, None, True, token.start_mark, token.end_mark, flow_style=True) self.state = self.parse_flow_sequence_entry_mapping_key return event elif not self.check_token(FlowSequenceEndToken): self.states.append(self.parse_flow_sequence_entry) return self.parse_flow_node() token = self.get_token() event = SequenceEndEvent(token.start_mark, token.end_mark) self.state = self.states.pop() self.marks.pop() return event def parse_flow_sequence_entry_mapping_key(self): token = self.get_token() if not self.check_token(ValueToken, FlowEntryToken, FlowSequenceEndToken): self.states.append(self.parse_flow_sequence_entry_mapping_value) return self.parse_flow_node() else: self.state = self.parse_flow_sequence_entry_mapping_value return self.process_empty_scalar(token.end_mark) def parse_flow_sequence_entry_mapping_value(self): if self.check_token(ValueToken): token = self.get_token() if not self.check_token(FlowEntryToken, FlowSequenceEndToken): self.states.append(self.parse_flow_sequence_entry_mapping_end) return self.parse_flow_node() else: self.state = self.parse_flow_sequence_entry_mapping_end return self.process_empty_scalar(token.end_mark) else: self.state = self.parse_flow_sequence_entry_mapping_end token = self.peek_token() return self.process_empty_scalar(token.start_mark) def parse_flow_sequence_entry_mapping_end(self): self.state = self.parse_flow_sequence_entry token = self.peek_token() return 
MappingEndEvent(token.start_mark, token.start_mark) # flow_mapping ::= FLOW-MAPPING-START # (flow_mapping_entry FLOW-ENTRY)* # flow_mapping_entry? # FLOW-MAPPING-END # flow_mapping_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)? def parse_flow_mapping_first_key(self): token = self.get_token() self.marks.append(token.start_mark) return self.parse_flow_mapping_key(first=True) def parse_flow_mapping_key(self, first=False): if not self.check_token(FlowMappingEndToken): if not first: if self.check_token(FlowEntryToken): self.get_token() else: token = self.peek_token() raise ParserError("while parsing a flow mapping", self.marks[-1], "expected ',' or '}', but got %r" % token.id, token.start_mark) if self.check_token(KeyToken): token = self.get_token() if not self.check_token(ValueToken, FlowEntryToken, FlowMappingEndToken): self.states.append(self.parse_flow_mapping_value) return self.parse_flow_node() else: self.state = self.parse_flow_mapping_value return self.process_empty_scalar(token.end_mark) elif not self.check_token(FlowMappingEndToken): self.states.append(self.parse_flow_mapping_empty_value) return self.parse_flow_node() token = self.get_token() event = MappingEndEvent(token.start_mark, token.end_mark) self.state = self.states.pop() self.marks.pop() return event def parse_flow_mapping_value(self): if self.check_token(ValueToken): token = self.get_token() if not self.check_token(FlowEntryToken, FlowMappingEndToken): self.states.append(self.parse_flow_mapping_key) return self.parse_flow_node() else: self.state = self.parse_flow_mapping_key return self.process_empty_scalar(token.end_mark) else: self.state = self.parse_flow_mapping_key token = self.peek_token() return self.process_empty_scalar(token.start_mark) def parse_flow_mapping_empty_value(self): self.state = self.parse_flow_mapping_key return self.process_empty_scalar(self.peek_token().start_mark) def process_empty_scalar(self, mark): return ScalarEvent(None, None, (True, False), '', mark, mark) 
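The state-machine methods above produce the event stream that the composer consumes. The low-level `yaml.parse` entry point exposes that stream directly, which is the easiest way to see the parser in action:

```python
import yaml

# Parse a two-item block sequence and inspect the raw event stream.
events = list(yaml.parse("- a\n- b\n"))
names = [type(e).__name__ for e in events]

# The sequence is wrapped in stream and (implicit) document events.
assert names == ['StreamStartEvent', 'DocumentStartEvent',
                 'SequenceStartEvent', 'ScalarEvent', 'ScalarEvent',
                 'SequenceEndEvent', 'DocumentEndEvent', 'StreamEndEvent']
```

Each `ScalarEvent` also carries `start_mark`/`end_mark` attributes pointing back into the source, which is what error reporting and round-tripping tools build on.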
# ==== file: PyYAML-6.0.1/lib/yaml/reader.py ====

# This module contains abstractions for the input stream. You don't need to
# look further; there is no pretty code here.
#
# We define two classes here.
#
#   Mark(source, line, column)
# It's just a record and its only use is producing nice error messages.
# The Parser does not use it for any other purpose.
#
#   Reader(source, data)
# Reader determines the encoding of `data` and converts it to unicode.
# Reader provides the following methods and attributes:
#   reader.peek(length=1) - return the next `length` characters
#   reader.forward(length=1) - move the current position `length` characters forward
#   reader.index - the number of the current character
#   reader.line, reader.column - the line and the column of the current character

__all__ = ['Reader', 'ReaderError']

from .error import YAMLError, Mark

import codecs, re

class ReaderError(YAMLError):

    def __init__(self, name, position, character, encoding, reason):
        self.name = name
        self.character = character
        self.position = position
        self.encoding = encoding
        self.reason = reason

    def __str__(self):
        if isinstance(self.character, bytes):
            return "'%s' codec can't decode byte #x%02x: %s\n" \
                    "  in \"%s\", position %d" \
                    % (self.encoding, ord(self.character), self.reason,
                            self.name, self.position)
        else:
            return "unacceptable character #x%04x: %s\n" \
                    "  in \"%s\", position %d" \
                    % (self.character, self.reason,
                            self.name, self.position)

class Reader(object):
    # Reader:
    # - determines the data encoding and converts it to a unicode string,
    # - checks if characters are in the allowed range,
    # - adds '\0' to the end.

    # Reader accepts
    #  - a `bytes` object,
    #  - a `str` object,
    #  - a file-like object with its `read` method returning `str`,
    #  - a file-like object with its `read` method returning `unicode`.

    # Yeah, it's ugly and slow.
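Before the methods themselves, it may help to see the core of the encoding detection in isolation. `determine_encoding` (defined below) checks for a UTF-16 BOM and otherwise assumes UTF-8, YAML's default. A stand-alone sketch of just that check (`sniff_encoding` is an illustrative helper name, not PyYAML API):

```python
import codecs

def sniff_encoding(data: bytes) -> str:
    # Mirrors the branch order in Reader.determine_encoding: a UTF-16 BOM
    # selects the matching codec; anything else is decoded as UTF-8.
    if data.startswith(codecs.BOM_UTF16_LE):
        return 'utf-16-le'
    if data.startswith(codecs.BOM_UTF16_BE):
        return 'utf-16-be'
    return 'utf-8'
```

Note the real Reader goes further: it also installs the matching incremental `codecs.*_decode` function so that `update()` can decode the stream chunk by chunk.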
    def __init__(self, stream):
        self.name = None
        self.stream = None
        self.stream_pointer = 0
        self.eof = True
        self.buffer = ''
        self.pointer = 0
        self.raw_buffer = None
        self.raw_decode = None
        self.encoding = None
        self.index = 0
        self.line = 0
        self.column = 0
        if isinstance(stream, str):
            self.name = "<unicode string>"
            self.check_printable(stream)
            self.buffer = stream+'\0'
        elif isinstance(stream, bytes):
            self.name = "<byte string>"
            self.raw_buffer = stream
            self.determine_encoding()
        else:
            self.stream = stream
            self.name = getattr(stream, 'name', "<file>")
            self.eof = False
            self.raw_buffer = None
            self.determine_encoding()

    def peek(self, index=0):
        try:
            return self.buffer[self.pointer+index]
        except IndexError:
            self.update(index+1)
            return self.buffer[self.pointer+index]

    def prefix(self, length=1):
        if self.pointer+length >= len(self.buffer):
            self.update(length)
        return self.buffer[self.pointer:self.pointer+length]

    def forward(self, length=1):
        if self.pointer+length+1 >= len(self.buffer):
            self.update(length+1)
        while length:
            ch = self.buffer[self.pointer]
            self.pointer += 1
            self.index += 1
            if ch in '\n\x85\u2028\u2029'  \
                    or (ch == '\r' and self.buffer[self.pointer] != '\n'):
                self.line += 1
                self.column = 0
            elif ch != '\uFEFF':
                self.column += 1
            length -= 1

    def get_mark(self):
        if self.stream is None:
            return Mark(self.name, self.index, self.line, self.column,
                    self.buffer, self.pointer)
        else:
            return Mark(self.name, self.index, self.line, self.column,
                    None, None)

    def determine_encoding(self):
        while not self.eof and (self.raw_buffer is None or len(self.raw_buffer) < 2):
            self.update_raw()
        if isinstance(self.raw_buffer, bytes):
            if self.raw_buffer.startswith(codecs.BOM_UTF16_LE):
                self.raw_decode = codecs.utf_16_le_decode
                self.encoding = 'utf-16-le'
            elif self.raw_buffer.startswith(codecs.BOM_UTF16_BE):
                self.raw_decode = codecs.utf_16_be_decode
                self.encoding = 'utf-16-be'
            else:
                self.raw_decode = codecs.utf_8_decode
                self.encoding = 'utf-8'
        self.update(1)

    NON_PRINTABLE = re.compile('[^\x09\x0A\x0D\x20-\x7E\x85\xA0-\uD7FF\uE000-\uFFFD\U00010000-\U0010ffff]')

    def check_printable(self, data):
        match = self.NON_PRINTABLE.search(data)
        if match:
            character = match.group()
            position = self.index+(len(self.buffer)-self.pointer)+match.start()
            raise ReaderError(self.name, position, ord(character),
                    'unicode', "special characters are not allowed")

    def update(self, length):
        if self.raw_buffer is None:
            return
        self.buffer = self.buffer[self.pointer:]
        self.pointer = 0
        while len(self.buffer) < length:
            if not self.eof:
                self.update_raw()
            if self.raw_decode is not None:
                try:
                    data, converted = self.raw_decode(self.raw_buffer,
                            'strict', self.eof)
                except UnicodeDecodeError as exc:
                    character = self.raw_buffer[exc.start]
                    if self.stream is not None:
                        position = self.stream_pointer-len(self.raw_buffer)+exc.start
                    else:
                        position = exc.start
                    raise ReaderError(self.name, position, character,
                            exc.encoding, exc.reason)
            else:
                data = self.raw_buffer
                converted = len(data)
            self.check_printable(data)
            self.buffer += data
            self.raw_buffer = self.raw_buffer[converted:]
            if self.eof:
                self.buffer += '\0'
                self.raw_buffer = None
                break

    def update_raw(self, size=4096):
        data = self.stream.read(size)
        if self.raw_buffer is None:
            self.raw_buffer = data
        else:
            self.raw_buffer += data
        self.stream_pointer += len(data)
        if not data:
            self.eof = True

# ==== file: PyYAML-6.0.1/lib/yaml/representer.py ====

__all__ = ['BaseRepresenter', 'SafeRepresenter', 'Representer',
    'RepresenterError']

from .error import *
from .nodes import *

import datetime, copyreg, types, base64, collections

class RepresenterError(YAMLError):
    pass

class BaseRepresenter:

    yaml_representers = {}
    yaml_multi_representers = {}

    def __init__(self, default_style=None, default_flow_style=False, sort_keys=True):
        self.default_style = default_style
        self.sort_keys = sort_keys
self.default_flow_style = default_flow_style self.represented_objects = {} self.object_keeper = [] self.alias_key = None def represent(self, data): node = self.represent_data(data) self.serialize(node) self.represented_objects = {} self.object_keeper = [] self.alias_key = None def represent_data(self, data): if self.ignore_aliases(data): self.alias_key = None else: self.alias_key = id(data) if self.alias_key is not None: if self.alias_key in self.represented_objects: node = self.represented_objects[self.alias_key] #if node is None: # raise RepresenterError("recursive objects are not allowed: %r" % data) return node #self.represented_objects[alias_key] = None self.object_keeper.append(data) data_types = type(data).__mro__ if data_types[0] in self.yaml_representers: node = self.yaml_representers[data_types[0]](self, data) else: for data_type in data_types: if data_type in self.yaml_multi_representers: node = self.yaml_multi_representers[data_type](self, data) break else: if None in self.yaml_multi_representers: node = self.yaml_multi_representers[None](self, data) elif None in self.yaml_representers: node = self.yaml_representers[None](self, data) else: node = ScalarNode(None, str(data)) #if alias_key is not None: # self.represented_objects[alias_key] = node return node @classmethod def add_representer(cls, data_type, representer): if not 'yaml_representers' in cls.__dict__: cls.yaml_representers = cls.yaml_representers.copy() cls.yaml_representers[data_type] = representer @classmethod def add_multi_representer(cls, data_type, representer): if not 'yaml_multi_representers' in cls.__dict__: cls.yaml_multi_representers = cls.yaml_multi_representers.copy() cls.yaml_multi_representers[data_type] = representer def represent_scalar(self, tag, value, style=None): if style is None: style = self.default_style node = ScalarNode(tag, value, style=style) if self.alias_key is not None: self.represented_objects[self.alias_key] = node return node def represent_sequence(self, tag, 
sequence, flow_style=None): value = [] node = SequenceNode(tag, value, flow_style=flow_style) if self.alias_key is not None: self.represented_objects[self.alias_key] = node best_style = True for item in sequence: node_item = self.represent_data(item) if not (isinstance(node_item, ScalarNode) and not node_item.style): best_style = False value.append(node_item) if flow_style is None: if self.default_flow_style is not None: node.flow_style = self.default_flow_style else: node.flow_style = best_style return node def represent_mapping(self, tag, mapping, flow_style=None): value = [] node = MappingNode(tag, value, flow_style=flow_style) if self.alias_key is not None: self.represented_objects[self.alias_key] = node best_style = True if hasattr(mapping, 'items'): mapping = list(mapping.items()) if self.sort_keys: try: mapping = sorted(mapping) except TypeError: pass for item_key, item_value in mapping: node_key = self.represent_data(item_key) node_value = self.represent_data(item_value) if not (isinstance(node_key, ScalarNode) and not node_key.style): best_style = False if not (isinstance(node_value, ScalarNode) and not node_value.style): best_style = False value.append((node_key, node_value)) if flow_style is None: if self.default_flow_style is not None: node.flow_style = self.default_flow_style else: node.flow_style = best_style return node def ignore_aliases(self, data): return False class SafeRepresenter(BaseRepresenter): def ignore_aliases(self, data): if data is None: return True if isinstance(data, tuple) and data == (): return True if isinstance(data, (str, bytes, bool, int, float)): return True def represent_none(self, data): return self.represent_scalar('tag:yaml.org,2002:null', 'null') def represent_str(self, data): return self.represent_scalar('tag:yaml.org,2002:str', data) def represent_binary(self, data): if hasattr(base64, 'encodebytes'): data = base64.encodebytes(data).decode('ascii') else: data = base64.encodestring(data).decode('ascii') return 
self.represent_scalar('tag:yaml.org,2002:binary', data, style='|') def represent_bool(self, data): if data: value = 'true' else: value = 'false' return self.represent_scalar('tag:yaml.org,2002:bool', value) def represent_int(self, data): return self.represent_scalar('tag:yaml.org,2002:int', str(data)) inf_value = 1e300 while repr(inf_value) != repr(inf_value*inf_value): inf_value *= inf_value def represent_float(self, data): if data != data or (data == 0.0 and data == 1.0): value = '.nan' elif data == self.inf_value: value = '.inf' elif data == -self.inf_value: value = '-.inf' else: value = repr(data).lower() # Note that in some cases `repr(data)` represents a float number # without the decimal parts. For instance: # >>> repr(1e17) # '1e17' # Unfortunately, this is not a valid float representation according # to the definition of the `!!float` tag. We fix this by adding # '.0' before the 'e' symbol. if '.' not in value and 'e' in value: value = value.replace('e', '.0e', 1) return self.represent_scalar('tag:yaml.org,2002:float', value) def represent_list(self, data): #pairs = (len(data) > 0 and isinstance(data, list)) #if pairs: # for item in data: # if not isinstance(item, tuple) or len(item) != 2: # pairs = False # break #if not pairs: return self.represent_sequence('tag:yaml.org,2002:seq', data) #value = [] #for item_key, item_value in data: # value.append(self.represent_mapping(u'tag:yaml.org,2002:map', # [(item_key, item_value)])) #return SequenceNode(u'tag:yaml.org,2002:pairs', value) def represent_dict(self, data): return self.represent_mapping('tag:yaml.org,2002:map', data) def represent_set(self, data): value = {} for key in data: value[key] = None return self.represent_mapping('tag:yaml.org,2002:set', value) def represent_date(self, data): value = data.isoformat() return self.represent_scalar('tag:yaml.org,2002:timestamp', value) def represent_datetime(self, data): value = data.isoformat(' ') return self.represent_scalar('tag:yaml.org,2002:timestamp', 
value) def represent_yaml_object(self, tag, data, cls, flow_style=None): if hasattr(data, '__getstate__'): state = data.__getstate__() else: state = data.__dict__.copy() return self.represent_mapping(tag, state, flow_style=flow_style) def represent_undefined(self, data): raise RepresenterError("cannot represent an object", data) SafeRepresenter.add_representer(type(None), SafeRepresenter.represent_none) SafeRepresenter.add_representer(str, SafeRepresenter.represent_str) SafeRepresenter.add_representer(bytes, SafeRepresenter.represent_binary) SafeRepresenter.add_representer(bool, SafeRepresenter.represent_bool) SafeRepresenter.add_representer(int, SafeRepresenter.represent_int) SafeRepresenter.add_representer(float, SafeRepresenter.represent_float) SafeRepresenter.add_representer(list, SafeRepresenter.represent_list) SafeRepresenter.add_representer(tuple, SafeRepresenter.represent_list) SafeRepresenter.add_representer(dict, SafeRepresenter.represent_dict) SafeRepresenter.add_representer(set, SafeRepresenter.represent_set) SafeRepresenter.add_representer(datetime.date, SafeRepresenter.represent_date) SafeRepresenter.add_representer(datetime.datetime, SafeRepresenter.represent_datetime) SafeRepresenter.add_representer(None, SafeRepresenter.represent_undefined) class Representer(SafeRepresenter): def represent_complex(self, data): if data.imag == 0.0: data = '%r' % data.real elif data.real == 0.0: data = '%rj' % data.imag elif data.imag > 0: data = '%r+%rj' % (data.real, data.imag) else: data = '%r%rj' % (data.real, data.imag) return self.represent_scalar('tag:yaml.org,2002:python/complex', data) def represent_tuple(self, data): return self.represent_sequence('tag:yaml.org,2002:python/tuple', data) def represent_name(self, data): name = '%s.%s' % (data.__module__, data.__name__) return self.represent_scalar('tag:yaml.org,2002:python/name:'+name, '') def represent_module(self, data): return self.represent_scalar( 'tag:yaml.org,2002:python/module:'+data.__name__, '') def 
represent_object(self, data): # We use __reduce__ API to save the data. data.__reduce__ returns # a tuple of length 2-5: # (function, args, state, listitems, dictitems) # For reconstructing, we calls function(*args), then set its state, # listitems, and dictitems if they are not None. # A special case is when function.__name__ == '__newobj__'. In this # case we create the object with args[0].__new__(*args). # Another special case is when __reduce__ returns a string - we don't # support it. # We produce a !!python/object, !!python/object/new or # !!python/object/apply node. cls = type(data) if cls in copyreg.dispatch_table: reduce = copyreg.dispatch_table[cls](data) elif hasattr(data, '__reduce_ex__'): reduce = data.__reduce_ex__(2) elif hasattr(data, '__reduce__'): reduce = data.__reduce__() else: raise RepresenterError("cannot represent an object", data) reduce = (list(reduce)+[None]*5)[:5] function, args, state, listitems, dictitems = reduce args = list(args) if state is None: state = {} if listitems is not None: listitems = list(listitems) if dictitems is not None: dictitems = dict(dictitems) if function.__name__ == '__newobj__': function = args[0] args = args[1:] tag = 'tag:yaml.org,2002:python/object/new:' newobj = True else: tag = 'tag:yaml.org,2002:python/object/apply:' newobj = False function_name = '%s.%s' % (function.__module__, function.__name__) if not args and not listitems and not dictitems \ and isinstance(state, dict) and newobj: return self.represent_mapping( 'tag:yaml.org,2002:python/object:'+function_name, state) if not listitems and not dictitems \ and isinstance(state, dict) and not state: return self.represent_sequence(tag+function_name, args) value = {} if args: value['args'] = args if state or not isinstance(state, dict): value['state'] = state if listitems: value['listitems'] = listitems if dictitems: value['dictitems'] = dictitems return self.represent_mapping(tag+function_name, value) def represent_ordered_dict(self, data): # Provide 
uniform representation across different Python versions. data_type = type(data) tag = 'tag:yaml.org,2002:python/object/apply:%s.%s' \ % (data_type.__module__, data_type.__name__) items = [[key, value] for key, value in data.items()] return self.represent_sequence(tag, [items]) Representer.add_representer(complex, Representer.represent_complex) Representer.add_representer(tuple, Representer.represent_tuple) Representer.add_multi_representer(type, Representer.represent_name) Representer.add_representer(collections.OrderedDict, Representer.represent_ordered_dict) Representer.add_representer(types.FunctionType, Representer.represent_name) Representer.add_representer(types.BuiltinFunctionType, Representer.represent_name) Representer.add_representer(types.ModuleType, Representer.represent_module) Representer.add_multi_representer(object, Representer.represent_object) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/lib/yaml/resolver.py0000644000175100001730000002145414455350511016126 0ustar00runnerdocker __all__ = ['BaseResolver', 'Resolver'] from .error import * from .nodes import * import re class ResolverError(YAMLError): pass class BaseResolver: DEFAULT_SCALAR_TAG = 'tag:yaml.org,2002:str' DEFAULT_SEQUENCE_TAG = 'tag:yaml.org,2002:seq' DEFAULT_MAPPING_TAG = 'tag:yaml.org,2002:map' yaml_implicit_resolvers = {} yaml_path_resolvers = {} def __init__(self): self.resolver_exact_paths = [] self.resolver_prefix_paths = [] @classmethod def add_implicit_resolver(cls, tag, regexp, first): if not 'yaml_implicit_resolvers' in cls.__dict__: implicit_resolvers = {} for key in cls.yaml_implicit_resolvers: implicit_resolvers[key] = cls.yaml_implicit_resolvers[key][:] cls.yaml_implicit_resolvers = implicit_resolvers if first is None: first = [None] for ch in first: cls.yaml_implicit_resolvers.setdefault(ch, []).append((tag, regexp)) @classmethod def add_path_resolver(cls, tag, path, kind=None): # Note: `add_path_resolver` is 
experimental. The API could be changed. # `new_path` is a pattern that is matched against the path from the # root to the node that is being considered. `node_path` elements are # tuples `(node_check, index_check)`. `node_check` is a node class: # `ScalarNode`, `SequenceNode`, `MappingNode` or `None`. `None` # matches any kind of a node. `index_check` could be `None`, a boolean # value, a string value, or a number. `None` and `False` match against # any _value_ of sequence and mapping nodes. `True` matches against # any _key_ of a mapping node. A string `index_check` matches against # a mapping value that corresponds to a scalar key which content is # equal to the `index_check` value. An integer `index_check` matches # against a sequence value with the index equal to `index_check`. if not 'yaml_path_resolvers' in cls.__dict__: cls.yaml_path_resolvers = cls.yaml_path_resolvers.copy() new_path = [] for element in path: if isinstance(element, (list, tuple)): if len(element) == 2: node_check, index_check = element elif len(element) == 1: node_check = element[0] index_check = True else: raise ResolverError("Invalid path element: %s" % element) else: node_check = None index_check = element if node_check is str: node_check = ScalarNode elif node_check is list: node_check = SequenceNode elif node_check is dict: node_check = MappingNode elif node_check not in [ScalarNode, SequenceNode, MappingNode] \ and not isinstance(node_check, str) \ and node_check is not None: raise ResolverError("Invalid node checker: %s" % node_check) if not isinstance(index_check, (str, int)) \ and index_check is not None: raise ResolverError("Invalid index checker: %s" % index_check) new_path.append((node_check, index_check)) if kind is str: kind = ScalarNode elif kind is list: kind = SequenceNode elif kind is dict: kind = MappingNode elif kind not in [ScalarNode, SequenceNode, MappingNode] \ and kind is not None: raise ResolverError("Invalid node kind: %s" % kind) 
cls.yaml_path_resolvers[tuple(new_path), kind] = tag def descend_resolver(self, current_node, current_index): if not self.yaml_path_resolvers: return exact_paths = {} prefix_paths = [] if current_node: depth = len(self.resolver_prefix_paths) for path, kind in self.resolver_prefix_paths[-1]: if self.check_resolver_prefix(depth, path, kind, current_node, current_index): if len(path) > depth: prefix_paths.append((path, kind)) else: exact_paths[kind] = self.yaml_path_resolvers[path, kind] else: for path, kind in self.yaml_path_resolvers: if not path: exact_paths[kind] = self.yaml_path_resolvers[path, kind] else: prefix_paths.append((path, kind)) self.resolver_exact_paths.append(exact_paths) self.resolver_prefix_paths.append(prefix_paths) def ascend_resolver(self): if not self.yaml_path_resolvers: return self.resolver_exact_paths.pop() self.resolver_prefix_paths.pop() def check_resolver_prefix(self, depth, path, kind, current_node, current_index): node_check, index_check = path[depth-1] if isinstance(node_check, str): if current_node.tag != node_check: return elif node_check is not None: if not isinstance(current_node, node_check): return if index_check is True and current_index is not None: return if (index_check is False or index_check is None) \ and current_index is None: return if isinstance(index_check, str): if not (isinstance(current_index, ScalarNode) and index_check == current_index.value): return elif isinstance(index_check, int) and not isinstance(index_check, bool): if index_check != current_index: return return True def resolve(self, kind, value, implicit): if kind is ScalarNode and implicit[0]: if value == '': resolvers = self.yaml_implicit_resolvers.get('', []) else: resolvers = self.yaml_implicit_resolvers.get(value[0], []) wildcard_resolvers = self.yaml_implicit_resolvers.get(None, []) for tag, regexp in resolvers + wildcard_resolvers: if regexp.match(value): return tag implicit = implicit[1] if self.yaml_path_resolvers: exact_paths = 
self.resolver_exact_paths[-1] if kind in exact_paths: return exact_paths[kind] if None in exact_paths: return exact_paths[None] if kind is ScalarNode: return self.DEFAULT_SCALAR_TAG elif kind is SequenceNode: return self.DEFAULT_SEQUENCE_TAG elif kind is MappingNode: return self.DEFAULT_MAPPING_TAG class Resolver(BaseResolver): pass Resolver.add_implicit_resolver( 'tag:yaml.org,2002:bool', re.compile(r'''^(?:yes|Yes|YES|no|No|NO |true|True|TRUE|false|False|FALSE |on|On|ON|off|Off|OFF)$''', re.X), list('yYnNtTfFoO')) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:float', re.compile(r'''^(?:[-+]?(?:[0-9][0-9_]*)\.[0-9_]*(?:[eE][-+][0-9]+)? |\.[0-9][0-9_]*(?:[eE][-+][0-9]+)? |[-+]?[0-9][0-9_]*(?::[0-5]?[0-9])+\.[0-9_]* |[-+]?\.(?:inf|Inf|INF) |\.(?:nan|NaN|NAN))$''', re.X), list('-+0123456789.')) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:int', re.compile(r'''^(?:[-+]?0b[0-1_]+ |[-+]?0[0-7_]+ |[-+]?(?:0|[1-9][0-9_]*) |[-+]?0x[0-9a-fA-F_]+ |[-+]?[1-9][0-9_]*(?::[0-5]?[0-9])+)$''', re.X), list('-+0123456789')) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:merge', re.compile(r'^(?:<<)$'), ['<']) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:null', re.compile(r'''^(?: ~ |null|Null|NULL | )$''', re.X), ['~', 'n', 'N', '']) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:timestamp', re.compile(r'''^(?:[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] |[0-9][0-9][0-9][0-9] -[0-9][0-9]? -[0-9][0-9]? (?:[Tt]|[ \t]+)[0-9][0-9]? :[0-9][0-9] :[0-9][0-9] (?:\.[0-9]*)? (?:[ \t]*(?:Z|[-+][0-9][0-9]?(?::[0-9][0-9])?))?)$''', re.X), list('0123456789')) Resolver.add_implicit_resolver( 'tag:yaml.org,2002:value', re.compile(r'^(?:=)$'), ['=']) # The following resolver is only for documentation purposes. It cannot work # because plain scalars cannot start with '!', '&', or '*'. 
Resolver.add_implicit_resolver(
        'tag:yaml.org,2002:yaml',
        re.compile(r'^(?:!|&|\*)$'),
        list('!&*'))

# ==== file: PyYAML-6.0.1/lib/yaml/scanner.py ====

# Scanner produces tokens of the following types:
# STREAM-START
# STREAM-END
# DIRECTIVE(name, value)
# DOCUMENT-START
# DOCUMENT-END
# BLOCK-SEQUENCE-START
# BLOCK-MAPPING-START
# BLOCK-END
# FLOW-SEQUENCE-START
# FLOW-MAPPING-START
# FLOW-SEQUENCE-END
# FLOW-MAPPING-END
# BLOCK-ENTRY
# FLOW-ENTRY
# KEY
# VALUE
# ALIAS(value)
# ANCHOR(value)
# TAG(value)
# SCALAR(value, plain, style)
#
# Read comments in the Scanner code for more details.

__all__ = ['Scanner', 'ScannerError']

from .error import MarkedYAMLError
from .tokens import *

class ScannerError(MarkedYAMLError):
    pass

class SimpleKey:
    # See the simple keys treatment below.

    def __init__(self, token_number, required, index, line, column, mark):
        self.token_number = token_number
        self.required = required
        self.index = index
        self.line = line
        self.column = column
        self.mark = mark

class Scanner:

    def __init__(self):
        """Initialize the scanner."""

        # It is assumed that Scanner and Reader will have a common descendant.
        # Reader does the dirty work of checking for a BOM and converting the
        # input data to Unicode. It also adds NUL to the end.
        #
        # Reader supports the following methods
        #   self.peek(i=0)    # peek the next i-th character
        #   self.prefix(l=1)  # peek the next l characters
        #   self.forward(l=1) # read the next l characters and move the pointer

        # Have we reached the end of the stream?
        self.done = False

        # The number of unclosed '{' and '['. `flow_level == 0` means block
        # context.
        self.flow_level = 0

        # List of processed tokens that are not yet emitted.
        self.tokens = []

        # Add the STREAM-START token.
        self.fetch_stream_start()

        # Number of tokens that were emitted through the `get_token` method.
self.tokens_taken = 0 # The current indentation level. self.indent = -1 # Past indentation levels. self.indents = [] # Variables related to simple keys treatment. # A simple key is a key that is not denoted by the '?' indicator. # Example of simple keys: # --- # block simple key: value # ? not a simple key: # : { flow simple key: value } # We emit the KEY token before all keys, so when we find a potential # simple key, we try to locate the corresponding ':' indicator. # Simple keys should be limited to a single line and 1024 characters. # Can a simple key start at the current position? A simple key may # start: # - at the beginning of the line, not counting indentation spaces # (in block context), # - after '{', '[', ',' (in the flow context), # - after '?', ':', '-' (in the block context). # In the block context, this flag also signifies if a block collection # may start at the current position. self.allow_simple_key = True # Keep track of possible simple keys. This is a dictionary. The key # is `flow_level`; there can be no more that one possible simple key # for each level. The value is a SimpleKey record: # (token_number, required, index, line, column, mark) # A simple key may start with ALIAS, ANCHOR, TAG, SCALAR(flow), # '[', or '{' tokens. self.possible_simple_keys = {} # Public methods. def check_token(self, *choices): # Check if the next token is one of the given types. while self.need_more_tokens(): self.fetch_more_tokens() if self.tokens: if not choices: return True for choice in choices: if isinstance(self.tokens[0], choice): return True return False def peek_token(self): # Return the next token, but do not delete if from the queue. # Return None if no more tokens. while self.need_more_tokens(): self.fetch_more_tokens() if self.tokens: return self.tokens[0] else: return None def get_token(self): # Return the next token. 
while self.need_more_tokens(): self.fetch_more_tokens() if self.tokens: self.tokens_taken += 1 return self.tokens.pop(0) # Private methods. def need_more_tokens(self): if self.done: return False if not self.tokens: return True # The current token may be a potential simple key, so we # need to look further. self.stale_possible_simple_keys() if self.next_possible_simple_key() == self.tokens_taken: return True def fetch_more_tokens(self): # Eat whitespaces and comments until we reach the next token. self.scan_to_next_token() # Remove obsolete possible simple keys. self.stale_possible_simple_keys() # Compare the current indentation and column. It may add some tokens # and decrease the current indentation level. self.unwind_indent(self.column) # Peek the next character. ch = self.peek() # Is it the end of stream? if ch == '\0': return self.fetch_stream_end() # Is it a directive? if ch == '%' and self.check_directive(): return self.fetch_directive() # Is it the document start? if ch == '-' and self.check_document_start(): return self.fetch_document_start() # Is it the document end? if ch == '.' and self.check_document_end(): return self.fetch_document_end() # TODO: support for BOM within a stream. #if ch == '\uFEFF': # return self.fetch_bom() <-- issue BOMToken # Note: the order of the following checks is NOT significant. # Is it the flow sequence start indicator? if ch == '[': return self.fetch_flow_sequence_start() # Is it the flow mapping start indicator? if ch == '{': return self.fetch_flow_mapping_start() # Is it the flow sequence end indicator? if ch == ']': return self.fetch_flow_sequence_end() # Is it the flow mapping end indicator? if ch == '}': return self.fetch_flow_mapping_end() # Is it the flow entry indicator? if ch == ',': return self.fetch_flow_entry() # Is it the block entry indicator? if ch == '-' and self.check_block_entry(): return self.fetch_block_entry() # Is it the key indicator? if ch == '?' 
and self.check_key(): return self.fetch_key() # Is it the value indicator? if ch == ':' and self.check_value(): return self.fetch_value() # Is it an alias? if ch == '*': return self.fetch_alias() # Is it an anchor? if ch == '&': return self.fetch_anchor() # Is it a tag? if ch == '!': return self.fetch_tag() # Is it a literal scalar? if ch == '|' and not self.flow_level: return self.fetch_literal() # Is it a folded scalar? if ch == '>' and not self.flow_level: return self.fetch_folded() # Is it a single quoted scalar? if ch == '\'': return self.fetch_single() # Is it a double quoted scalar? if ch == '\"': return self.fetch_double() # It must be a plain scalar then. if self.check_plain(): return self.fetch_plain() # No? It's an error. Let's produce a nice error message. raise ScannerError("while scanning for the next token", None, "found character %r that cannot start any token" % ch, self.get_mark()) # Simple keys treatment. def next_possible_simple_key(self): # Return the number of the nearest possible simple key. Actually we # don't need to loop through the whole dictionary. We may replace it # with the following code: # if not self.possible_simple_keys: # return None # return self.possible_simple_keys[ # min(self.possible_simple_keys.keys())].token_number min_token_number = None for level in self.possible_simple_keys: key = self.possible_simple_keys[level] if min_token_number is None or key.token_number < min_token_number: min_token_number = key.token_number return min_token_number def stale_possible_simple_keys(self): # Remove entries that are no longer possible simple keys. According to # the YAML specification, simple keys # - should be limited to a single line, # - should be no longer than 1024 characters. # Disabling this procedure will allow simple keys of any length and # height (may cause problems if indentation is broken though). 
for level in list(self.possible_simple_keys): key = self.possible_simple_keys[level] if key.line != self.line \ or self.index-key.index > 1024: if key.required: raise ScannerError("while scanning a simple key", key.mark, "could not find expected ':'", self.get_mark()) del self.possible_simple_keys[level] def save_possible_simple_key(self): # The next token may start a simple key. We check if it's possible # and save its position. This function is called for # ALIAS, ANCHOR, TAG, SCALAR(flow), '[', and '{'. # Check if a simple key is required at the current position. required = not self.flow_level and self.indent == self.column # The next token might be a simple key. Let's save it's number and # position. if self.allow_simple_key: self.remove_possible_simple_key() token_number = self.tokens_taken+len(self.tokens) key = SimpleKey(token_number, required, self.index, self.line, self.column, self.get_mark()) self.possible_simple_keys[self.flow_level] = key def remove_possible_simple_key(self): # Remove the saved possible key position at the current flow level. if self.flow_level in self.possible_simple_keys: key = self.possible_simple_keys[self.flow_level] if key.required: raise ScannerError("while scanning a simple key", key.mark, "could not find expected ':'", self.get_mark()) del self.possible_simple_keys[self.flow_level] # Indentation functions. def unwind_indent(self, column): ## In flow context, tokens should respect indentation. ## Actually the condition should be `self.indent >= column` according to ## the spec. But this condition will prohibit intuitively correct ## constructions such as ## key : { ## } #if self.flow_level and self.indent > column: # raise ScannerError(None, None, # "invalid indentation or unclosed '[' or '{'", # self.get_mark()) # In the flow context, indentation is ignored. We make the scanner less # restrictive then specification requires. if self.flow_level: return # In block context, we may need to issue the BLOCK-END tokens. 
        while self.indent > column:
            mark = self.get_mark()
            self.indent = self.indents.pop()
            self.tokens.append(BlockEndToken(mark, mark))

    def add_indent(self, column):
        # Check if we need to increase indentation.
        if self.indent < column:
            self.indents.append(self.indent)
            self.indent = column
            return True
        return False

    # Fetchers.

    def fetch_stream_start(self):
        # We always add STREAM-START as the first token and STREAM-END as the
        # last token.

        # Read the token.
        mark = self.get_mark()

        # Add STREAM-START.
        self.tokens.append(StreamStartToken(mark, mark,
            encoding=self.encoding))

    def fetch_stream_end(self):

        # Set the current indentation to -1.
        self.unwind_indent(-1)

        # Reset simple keys.
        self.remove_possible_simple_key()
        self.allow_simple_key = False
        self.possible_simple_keys = {}

        # Read the token.
        mark = self.get_mark()

        # Add STREAM-END.
        self.tokens.append(StreamEndToken(mark, mark))

        # The stream is finished.
        self.done = True

    def fetch_directive(self):

        # Set the current indentation to -1.
        self.unwind_indent(-1)

        # Reset simple keys.
        self.remove_possible_simple_key()
        self.allow_simple_key = False

        # Scan and add DIRECTIVE.
        self.tokens.append(self.scan_directive())

    def fetch_document_start(self):
        self.fetch_document_indicator(DocumentStartToken)

    def fetch_document_end(self):
        self.fetch_document_indicator(DocumentEndToken)

    def fetch_document_indicator(self, TokenClass):

        # Set the current indentation to -1.
        self.unwind_indent(-1)

        # Reset simple keys. Note that there could not be a block collection
        # after '---'.
        self.remove_possible_simple_key()
        self.allow_simple_key = False

        # Add DOCUMENT-START or DOCUMENT-END.
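# fetch_stream_start() and fetch_stream_end() guarantee that every scan is
# bracketed by STREAM-START and STREAM-END. A quick sketch against the public
# API (variable names are illustrative):

```python
import yaml

# Even an empty input yields exactly the STREAM-START / STREAM-END pair.
stream_names = [type(tok).__name__ for tok in yaml.scan("")]
print(stream_names)
```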
start_mark = self.get_mark() self.forward(3) end_mark = self.get_mark() self.tokens.append(TokenClass(start_mark, end_mark)) def fetch_flow_sequence_start(self): self.fetch_flow_collection_start(FlowSequenceStartToken) def fetch_flow_mapping_start(self): self.fetch_flow_collection_start(FlowMappingStartToken) def fetch_flow_collection_start(self, TokenClass): # '[' and '{' may start a simple key. self.save_possible_simple_key() # Increase the flow level. self.flow_level += 1 # Simple keys are allowed after '[' and '{'. self.allow_simple_key = True # Add FLOW-SEQUENCE-START or FLOW-MAPPING-START. start_mark = self.get_mark() self.forward() end_mark = self.get_mark() self.tokens.append(TokenClass(start_mark, end_mark)) def fetch_flow_sequence_end(self): self.fetch_flow_collection_end(FlowSequenceEndToken) def fetch_flow_mapping_end(self): self.fetch_flow_collection_end(FlowMappingEndToken) def fetch_flow_collection_end(self, TokenClass): # Reset possible simple key on the current level. self.remove_possible_simple_key() # Decrease the flow level. self.flow_level -= 1 # No simple keys after ']' or '}'. self.allow_simple_key = False # Add FLOW-SEQUENCE-END or FLOW-MAPPING-END. start_mark = self.get_mark() self.forward() end_mark = self.get_mark() self.tokens.append(TokenClass(start_mark, end_mark)) def fetch_flow_entry(self): # Simple keys are allowed after ','. self.allow_simple_key = True # Reset possible simple key on the current level. self.remove_possible_simple_key() # Add FLOW-ENTRY. start_mark = self.get_mark() self.forward() end_mark = self.get_mark() self.tokens.append(FlowEntryToken(start_mark, end_mark)) def fetch_block_entry(self): # Block context needs additional checks. if not self.flow_level: # Are we allowed to start a new entry? if not self.allow_simple_key: raise ScannerError(None, None, "sequence entries are not allowed here", self.get_mark()) # We may need to add BLOCK-SEQUENCE-START. 
            if self.add_indent(self.column):
                mark = self.get_mark()
                self.tokens.append(BlockSequenceStartToken(mark, mark))

        # It's an error for the block entry to occur in the flow context,
        # but we let the parser detect this.
        else:
            pass

        # Simple keys are allowed after '-'.
        self.allow_simple_key = True

        # Reset possible simple key on the current level.
        self.remove_possible_simple_key()

        # Add BLOCK-ENTRY.
        start_mark = self.get_mark()
        self.forward()
        end_mark = self.get_mark()
        self.tokens.append(BlockEntryToken(start_mark, end_mark))

    def fetch_key(self):

        # Block context needs additional checks.
        if not self.flow_level:

            # Are we allowed to start a key (not necessarily a simple one)?
            if not self.allow_simple_key:
                raise ScannerError(None, None,
                        "mapping keys are not allowed here",
                        self.get_mark())

            # We may need to add BLOCK-MAPPING-START.
            if self.add_indent(self.column):
                mark = self.get_mark()
                self.tokens.append(BlockMappingStartToken(mark, mark))

        # Simple keys are allowed after '?' in the block context.
        self.allow_simple_key = not self.flow_level

        # Reset possible simple key on the current level.
        self.remove_possible_simple_key()

        # Add KEY.
        start_mark = self.get_mark()
        self.forward()
        end_mark = self.get_mark()
        self.tokens.append(KeyToken(start_mark, end_mark))

    def fetch_value(self):

        # Do we determine a simple key?
        if self.flow_level in self.possible_simple_keys:

            # Add KEY.
            key = self.possible_simple_keys[self.flow_level]
            del self.possible_simple_keys[self.flow_level]
            self.tokens.insert(key.token_number-self.tokens_taken,
                    KeyToken(key.mark, key.mark))

            # If this key starts a new block mapping, we need to add
            # BLOCK-MAPPING-START.
            if not self.flow_level:
                if self.add_indent(key.column):
                    self.tokens.insert(key.token_number-self.tokens_taken,
                            BlockMappingStartToken(key.mark, key.mark))

            # There cannot be two simple keys one after another.
            self.allow_simple_key = False

        # It must be a part of a complex key.
        else:

            # Block context needs additional checks.
            # (Do we really need them?
They will be caught by the parser # anyway.) if not self.flow_level: # We are allowed to start a complex value if and only if # we can start a simple key. if not self.allow_simple_key: raise ScannerError(None, None, "mapping values are not allowed here", self.get_mark()) # If this value starts a new block mapping, we need to add # BLOCK-MAPPING-START. It will be detected as an error later by # the parser. if not self.flow_level: if self.add_indent(self.column): mark = self.get_mark() self.tokens.append(BlockMappingStartToken(mark, mark)) # Simple keys are allowed after ':' in the block context. self.allow_simple_key = not self.flow_level # Reset possible simple key on the current level. self.remove_possible_simple_key() # Add VALUE. start_mark = self.get_mark() self.forward() end_mark = self.get_mark() self.tokens.append(ValueToken(start_mark, end_mark)) def fetch_alias(self): # ALIAS could be a simple key. self.save_possible_simple_key() # No simple keys after ALIAS. self.allow_simple_key = False # Scan and add ALIAS. self.tokens.append(self.scan_anchor(AliasToken)) def fetch_anchor(self): # ANCHOR could start a simple key. self.save_possible_simple_key() # No simple keys after ANCHOR. self.allow_simple_key = False # Scan and add ANCHOR. self.tokens.append(self.scan_anchor(AnchorToken)) def fetch_tag(self): # TAG could start a simple key. self.save_possible_simple_key() # No simple keys after TAG. self.allow_simple_key = False # Scan and add TAG. self.tokens.append(self.scan_tag()) def fetch_literal(self): self.fetch_block_scalar(style='|') def fetch_folded(self): self.fetch_block_scalar(style='>') def fetch_block_scalar(self, style): # A simple key may follow a block scalar. self.allow_simple_key = True # Reset possible simple key on the current level. self.remove_possible_simple_key() # Scan and add SCALAR. 
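# fetch_value() applies the same simple-key machinery in flow context: once
# the ':' is reached, a KEY token is inserted before the already-scanned
# scalar. A sketch via yaml.scan() (the name `flow_names` is ours):

```python
import yaml

# Scan a flow mapping and observe the retroactively inserted KEY token.
flow_names = [type(tok).__name__ for tok in yaml.scan("{a: 1}")]
print(flow_names)
```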
self.tokens.append(self.scan_block_scalar(style)) def fetch_single(self): self.fetch_flow_scalar(style='\'') def fetch_double(self): self.fetch_flow_scalar(style='"') def fetch_flow_scalar(self, style): # A flow scalar could be a simple key. self.save_possible_simple_key() # No simple keys after flow scalars. self.allow_simple_key = False # Scan and add SCALAR. self.tokens.append(self.scan_flow_scalar(style)) def fetch_plain(self): # A plain scalar could be a simple key. self.save_possible_simple_key() # No simple keys after plain scalars. But note that `scan_plain` will # change this flag if the scan is finished at the beginning of the # line. self.allow_simple_key = False # Scan and add SCALAR. May change `allow_simple_key`. self.tokens.append(self.scan_plain()) # Checkers. def check_directive(self): # DIRECTIVE: ^ '%' ... # The '%' indicator is already checked. if self.column == 0: return True def check_document_start(self): # DOCUMENT-START: ^ '---' (' '|'\n') if self.column == 0: if self.prefix(3) == '---' \ and self.peek(3) in '\0 \t\r\n\x85\u2028\u2029': return True def check_document_end(self): # DOCUMENT-END: ^ '...' (' '|'\n') if self.column == 0: if self.prefix(3) == '...' \ and self.peek(3) in '\0 \t\r\n\x85\u2028\u2029': return True def check_block_entry(self): # BLOCK-ENTRY: '-' (' '|'\n') return self.peek(1) in '\0 \t\r\n\x85\u2028\u2029' def check_key(self): # KEY(flow context): '?' if self.flow_level: return True # KEY(block context): '?' (' '|'\n') else: return self.peek(1) in '\0 \t\r\n\x85\u2028\u2029' def check_value(self): # VALUE(flow context): ':' if self.flow_level: return True # VALUE(block context): ':' (' '|'\n') else: return self.peek(1) in '\0 \t\r\n\x85\u2028\u2029' def check_plain(self): # A plain scalar may start with any non-space character except: # '-', '?', ':', ',', '[', ']', '{', '}', # '#', '&', '*', '!', '|', '>', '\'', '\"', # '%', '@', '`'. 
        #
        # It may also start with
        #   '-', '?', ':'
        # if it is followed by a non-space character.
        #
        # Note that we limit the last rule to the block context (except the
        # '-' character) because we want the flow context to be space
        # independent.
        ch = self.peek()
        return ch not in '\0 \t\r\n\x85\u2028\u2029-?:,[]{}#&*!|>\'\"%@`'  \
                or (self.peek(1) not in '\0 \t\r\n\x85\u2028\u2029'
                        and (ch == '-' or (not self.flow_level and ch in '?:')))

    # Scanners.

    def scan_to_next_token(self):
        # We ignore spaces, line breaks and comments.
        # If we find a line break in the block context, we set the flag
        # `allow_simple_key` on.
        # The byte order mark is stripped if it's the first character in the
        # stream. We do not yet support BOM inside the stream as the
        # specification requires. Any such mark will be considered as a part
        # of the document.
        #
        # TODO: We need to make tab handling rules more sane. A good rule is:
        #   Tabs cannot precede tokens
        #   BLOCK-SEQUENCE-START, BLOCK-MAPPING-START, BLOCK-END,
        #   KEY(block), VALUE(block), BLOCK-ENTRY.
        # So the checking code is
        #   if <TAB>:
        #       self.allow_simple_keys = False
        # We also need to add the check for `allow_simple_keys == True` to
        # `unwind_indent` before issuing BLOCK-END.
        # Scanners for block, flow, and plain scalars need to be modified.

        if self.index == 0 and self.peek() == '\uFEFF':
            self.forward()
        found = False
        while not found:
            while self.peek() == ' ':
                self.forward()
            if self.peek() == '#':
                while self.peek() not in '\0\r\n\x85\u2028\u2029':
                    self.forward()
            if self.scan_line_break():
                if not self.flow_level:
                    self.allow_simple_key = True
            else:
                found = True

    def scan_directive(self):
        # See the specification for details.
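# scan_to_next_token() strips a leading byte order mark and skips comments
# between tokens. A small sketch of the resulting behavior through the
# high-level loader:

```python
import yaml

# The leading BOM is dropped and the trailing comment is ignored.
data = yaml.safe_load("\ufeffkey: 1  # trailing comment")
print(data)
```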
start_mark = self.get_mark() self.forward() name = self.scan_directive_name(start_mark) value = None if name == 'YAML': value = self.scan_yaml_directive_value(start_mark) end_mark = self.get_mark() elif name == 'TAG': value = self.scan_tag_directive_value(start_mark) end_mark = self.get_mark() else: end_mark = self.get_mark() while self.peek() not in '\0\r\n\x85\u2028\u2029': self.forward() self.scan_directive_ignored_line(start_mark) return DirectiveToken(name, value, start_mark, end_mark) def scan_directive_name(self, start_mark): # See the specification for details. length = 0 ch = self.peek(length) while '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z' \ or ch in '-_': length += 1 ch = self.peek(length) if not length: raise ScannerError("while scanning a directive", start_mark, "expected alphabetic or numeric character, but found %r" % ch, self.get_mark()) value = self.prefix(length) self.forward(length) ch = self.peek() if ch not in '\0 \r\n\x85\u2028\u2029': raise ScannerError("while scanning a directive", start_mark, "expected alphabetic or numeric character, but found %r" % ch, self.get_mark()) return value def scan_yaml_directive_value(self, start_mark): # See the specification for details. while self.peek() == ' ': self.forward() major = self.scan_yaml_directive_number(start_mark) if self.peek() != '.': raise ScannerError("while scanning a directive", start_mark, "expected a digit or '.', but found %r" % self.peek(), self.get_mark()) self.forward() minor = self.scan_yaml_directive_number(start_mark) if self.peek() not in '\0 \r\n\x85\u2028\u2029': raise ScannerError("while scanning a directive", start_mark, "expected a digit or ' ', but found %r" % self.peek(), self.get_mark()) return (major, minor) def scan_yaml_directive_number(self, start_mark): # See the specification for details. 
ch = self.peek() if not ('0' <= ch <= '9'): raise ScannerError("while scanning a directive", start_mark, "expected a digit, but found %r" % ch, self.get_mark()) length = 0 while '0' <= self.peek(length) <= '9': length += 1 value = int(self.prefix(length)) self.forward(length) return value def scan_tag_directive_value(self, start_mark): # See the specification for details. while self.peek() == ' ': self.forward() handle = self.scan_tag_directive_handle(start_mark) while self.peek() == ' ': self.forward() prefix = self.scan_tag_directive_prefix(start_mark) return (handle, prefix) def scan_tag_directive_handle(self, start_mark): # See the specification for details. value = self.scan_tag_handle('directive', start_mark) ch = self.peek() if ch != ' ': raise ScannerError("while scanning a directive", start_mark, "expected ' ', but found %r" % ch, self.get_mark()) return value def scan_tag_directive_prefix(self, start_mark): # See the specification for details. value = self.scan_tag_uri('directive', start_mark) ch = self.peek() if ch not in '\0 \r\n\x85\u2028\u2029': raise ScannerError("while scanning a directive", start_mark, "expected ' ', but found %r" % ch, self.get_mark()) return value def scan_directive_ignored_line(self, start_mark): # See the specification for details. while self.peek() == ' ': self.forward() if self.peek() == '#': while self.peek() not in '\0\r\n\x85\u2028\u2029': self.forward() ch = self.peek() if ch not in '\0\r\n\x85\u2028\u2029': raise ScannerError("while scanning a directive", start_mark, "expected a comment or a line break, but found %r" % ch, self.get_mark()) self.scan_line_break() def scan_anchor(self, TokenClass): # The specification does not restrict characters for anchors and # aliases. This may lead to problems, for instance, the document: # [ *alias, value ] # can be interpreted in two ways, as # [ "value" ] # and # [ *alias , "value" ] # Therefore we restrict aliases to numbers and ASCII letters. 
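# scan_directive() and its helpers turn a `%YAML major.minor` line into a
# DIRECTIVE token carrying the parsed version tuple. A sketch using the
# public scanning and loading APIs:

```python
import yaml

doc = "%YAML 1.1\n---\nkey: value\n"
directive_names = [type(tok).__name__ for tok in yaml.scan(doc)]
loaded = yaml.safe_load(doc)
```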
start_mark = self.get_mark() indicator = self.peek() if indicator == '*': name = 'alias' else: name = 'anchor' self.forward() length = 0 ch = self.peek(length) while '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z' \ or ch in '-_': length += 1 ch = self.peek(length) if not length: raise ScannerError("while scanning an %s" % name, start_mark, "expected alphabetic or numeric character, but found %r" % ch, self.get_mark()) value = self.prefix(length) self.forward(length) ch = self.peek() if ch not in '\0 \t\r\n\x85\u2028\u2029?:,]}%@`': raise ScannerError("while scanning an %s" % name, start_mark, "expected alphabetic or numeric character, but found %r" % ch, self.get_mark()) end_mark = self.get_mark() return TokenClass(value, start_mark, end_mark) def scan_tag(self): # See the specification for details. start_mark = self.get_mark() ch = self.peek(1) if ch == '<': handle = None self.forward(2) suffix = self.scan_tag_uri('tag', start_mark) if self.peek() != '>': raise ScannerError("while parsing a tag", start_mark, "expected '>', but found %r" % self.peek(), self.get_mark()) self.forward() elif ch in '\0 \t\r\n\x85\u2028\u2029': handle = None suffix = '!' self.forward() else: length = 1 use_handle = False while ch not in '\0 \r\n\x85\u2028\u2029': if ch == '!': use_handle = True break length += 1 ch = self.peek(length) handle = '!' if use_handle: handle = self.scan_tag_handle('tag', start_mark) else: handle = '!' self.forward() suffix = self.scan_tag_uri('tag', start_mark) ch = self.peek() if ch not in '\0 \r\n\x85\u2028\u2029': raise ScannerError("while scanning a tag", start_mark, "expected ' ', but found %r" % ch, self.get_mark()) value = (handle, suffix) end_mark = self.get_mark() return TagToken(value, start_mark, end_mark) def scan_block_scalar(self, style): # See the specification for details. if style == '>': folded = True else: folded = False chunks = [] start_mark = self.get_mark() # Scan the header. 
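# scan_anchor() restricts anchor and alias names to ASCII letters, digits,
# '-' and '_', which keeps documents like the one below unambiguous. A quick
# round-trip sketch:

```python
import yaml

# '&shared' defines the anchor; '*shared' references the same value.
anchored = yaml.safe_load("a: &shared 1\nb: *shared\n")
```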
self.forward() chomping, increment = self.scan_block_scalar_indicators(start_mark) self.scan_block_scalar_ignored_line(start_mark) # Determine the indentation level and go to the first non-empty line. min_indent = self.indent+1 if min_indent < 1: min_indent = 1 if increment is None: breaks, max_indent, end_mark = self.scan_block_scalar_indentation() indent = max(min_indent, max_indent) else: indent = min_indent+increment-1 breaks, end_mark = self.scan_block_scalar_breaks(indent) line_break = '' # Scan the inner part of the block scalar. while self.column == indent and self.peek() != '\0': chunks.extend(breaks) leading_non_space = self.peek() not in ' \t' length = 0 while self.peek(length) not in '\0\r\n\x85\u2028\u2029': length += 1 chunks.append(self.prefix(length)) self.forward(length) line_break = self.scan_line_break() breaks, end_mark = self.scan_block_scalar_breaks(indent) if self.column == indent and self.peek() != '\0': # Unfortunately, folding rules are ambiguous. # # This is the folding according to the specification: if folded and line_break == '\n' \ and leading_non_space and self.peek() not in ' \t': if not breaks: chunks.append(' ') else: chunks.append(line_break) # This is Clark Evans's interpretation (also in the spec # examples): # #if folded and line_break == '\n': # if not breaks: # if self.peek() not in ' \t': # chunks.append(' ') # else: # chunks.append(line_break) #else: # chunks.append(line_break) else: break # Chomp the tail. if chomping is not False: chunks.append(line_break) if chomping is True: chunks.extend(breaks) # We are done. return ScalarToken(''.join(chunks), False, start_mark, end_mark, style) def scan_block_scalar_indicators(self, start_mark): # See the specification for details. 
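# The folding logic above is easiest to see from the loader's output: folded
# ('>') scalars join lines with a space, literal ('|') scalars keep the line
# breaks; both clip to a single trailing newline by default. A sketch:

```python
import yaml

folded = yaml.safe_load(">\n  one\n  two\n")
literal = yaml.safe_load("|\n  one\n  two\n")
```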
chomping = None increment = None ch = self.peek() if ch in '+-': if ch == '+': chomping = True else: chomping = False self.forward() ch = self.peek() if ch in '0123456789': increment = int(ch) if increment == 0: raise ScannerError("while scanning a block scalar", start_mark, "expected indentation indicator in the range 1-9, but found 0", self.get_mark()) self.forward() elif ch in '0123456789': increment = int(ch) if increment == 0: raise ScannerError("while scanning a block scalar", start_mark, "expected indentation indicator in the range 1-9, but found 0", self.get_mark()) self.forward() ch = self.peek() if ch in '+-': if ch == '+': chomping = True else: chomping = False self.forward() ch = self.peek() if ch not in '\0 \r\n\x85\u2028\u2029': raise ScannerError("while scanning a block scalar", start_mark, "expected chomping or indentation indicators, but found %r" % ch, self.get_mark()) return chomping, increment def scan_block_scalar_ignored_line(self, start_mark): # See the specification for details. while self.peek() == ' ': self.forward() if self.peek() == '#': while self.peek() not in '\0\r\n\x85\u2028\u2029': self.forward() ch = self.peek() if ch not in '\0\r\n\x85\u2028\u2029': raise ScannerError("while scanning a block scalar", start_mark, "expected a comment or a line break, but found %r" % ch, self.get_mark()) self.scan_line_break() def scan_block_scalar_indentation(self): # See the specification for details. chunks = [] max_indent = 0 end_mark = self.get_mark() while self.peek() in ' \r\n\x85\u2028\u2029': if self.peek() != ' ': chunks.append(self.scan_line_break()) end_mark = self.get_mark() else: self.forward() if self.column > max_indent: max_indent = self.column return chunks, max_indent, end_mark def scan_block_scalar_breaks(self, indent): # See the specification for details. 
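# The chomping indicator parsed by scan_block_scalar_indicators() maps to the
# three trailing-newline behaviors: '-' strips, the default clips to one
# newline, and '+' keeps all trailing breaks. A sketch:

```python
import yaml

stripped = yaml.safe_load("|-\n  text\n")
clipped = yaml.safe_load("|\n  text\n")
kept = yaml.safe_load("|+\n  text\n\n")
```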
        chunks = []
        end_mark = self.get_mark()
        while self.column < indent and self.peek() == ' ':
            self.forward()
        while self.peek() in '\r\n\x85\u2028\u2029':
            chunks.append(self.scan_line_break())
            end_mark = self.get_mark()
            while self.column < indent and self.peek() == ' ':
                self.forward()
        return chunks, end_mark

    def scan_flow_scalar(self, style):
        # See the specification for details.
        # Note that we loosen the indentation rules for quoted scalars. Quoted
        # scalars don't need to adhere to indentation because " and ' clearly
        # mark the beginning and the end of them. Therefore we are less
        # restrictive than the specification requires. We only need to check
        # that document separators are not included in scalars.
        if style == '"':
            double = True
        else:
            double = False
        chunks = []
        start_mark = self.get_mark()
        quote = self.peek()
        self.forward()
        chunks.extend(self.scan_flow_scalar_non_spaces(double, start_mark))
        while self.peek() != quote:
            chunks.extend(self.scan_flow_scalar_spaces(double, start_mark))
            chunks.extend(self.scan_flow_scalar_non_spaces(double, start_mark))
        self.forward()
        end_mark = self.get_mark()
        return ScalarToken(''.join(chunks), False, start_mark, end_mark,
                style)

    ESCAPE_REPLACEMENTS = {
        '0':    '\0',
        'a':    '\x07',
        'b':    '\x08',
        't':    '\x09',
        '\t':   '\x09',
        'n':    '\x0A',
        'v':    '\x0B',
        'f':    '\x0C',
        'r':    '\x0D',
        'e':    '\x1B',
        ' ':    '\x20',
        '\"':   '\"',
        '\\':   '\\',
        '/':    '/',
        'N':    '\x85',
        '_':    '\xA0',
        'L':    '\u2028',
        'P':    '\u2029',
    }

    ESCAPE_CODES = {
        'x':    2,
        'u':    4,
        'U':    8,
    }

    def scan_flow_scalar_non_spaces(self, double, start_mark):
        # See the specification for details.
chunks = [] while True: length = 0 while self.peek(length) not in '\'\"\\\0 \t\r\n\x85\u2028\u2029': length += 1 if length: chunks.append(self.prefix(length)) self.forward(length) ch = self.peek() if not double and ch == '\'' and self.peek(1) == '\'': chunks.append('\'') self.forward(2) elif (double and ch == '\'') or (not double and ch in '\"\\'): chunks.append(ch) self.forward() elif double and ch == '\\': self.forward() ch = self.peek() if ch in self.ESCAPE_REPLACEMENTS: chunks.append(self.ESCAPE_REPLACEMENTS[ch]) self.forward() elif ch in self.ESCAPE_CODES: length = self.ESCAPE_CODES[ch] self.forward() for k in range(length): if self.peek(k) not in '0123456789ABCDEFabcdef': raise ScannerError("while scanning a double-quoted scalar", start_mark, "expected escape sequence of %d hexadecimal numbers, but found %r" % (length, self.peek(k)), self.get_mark()) code = int(self.prefix(length), 16) chunks.append(chr(code)) self.forward(length) elif ch in '\r\n\x85\u2028\u2029': self.scan_line_break() chunks.extend(self.scan_flow_scalar_breaks(double, start_mark)) else: raise ScannerError("while scanning a double-quoted scalar", start_mark, "found unknown escape character %r" % ch, self.get_mark()) else: return chunks def scan_flow_scalar_spaces(self, double, start_mark): # See the specification for details. chunks = [] length = 0 while self.peek(length) in ' \t': length += 1 whitespaces = self.prefix(length) self.forward(length) ch = self.peek() if ch == '\0': raise ScannerError("while scanning a quoted scalar", start_mark, "found unexpected end of stream", self.get_mark()) elif ch in '\r\n\x85\u2028\u2029': line_break = self.scan_line_break() breaks = self.scan_flow_scalar_breaks(double, start_mark) if line_break != '\n': chunks.append(line_break) elif not breaks: chunks.append(' ') chunks.extend(breaks) else: chunks.append(whitespaces) return chunks def scan_flow_scalar_breaks(self, double, start_mark): # See the specification for details. 
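# The ESCAPE_REPLACEMENTS / ESCAPE_CODES tables above drive double-quoted
# escape handling; single-quoted scalars only support doubling the quote.
# A sketch of each case:

```python
import yaml

tab = yaml.safe_load(r'"a\tb"')       # single-char escape
smiley = yaml.safe_load(r'"\u263A"')  # 4-digit \u escape
hex_a = yaml.safe_load(r'"\x41"')     # 2-digit \x escape
quoted = yaml.safe_load("'it''s'")    # '' inside single quotes
```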
        chunks = []
        while True:
            # Instead of checking indentation, we check for document
            # separators.
            prefix = self.prefix(3)
            if (prefix == '---' or prefix == '...')   \
                    and self.peek(3) in '\0 \t\r\n\x85\u2028\u2029':
                raise ScannerError("while scanning a quoted scalar", start_mark,
                        "found unexpected document separator", self.get_mark())
            while self.peek() in ' \t':
                self.forward()
            if self.peek() in '\r\n\x85\u2028\u2029':
                chunks.append(self.scan_line_break())
            else:
                return chunks

    def scan_plain(self):
        # See the specification for details.
        # We add an additional restriction for the flow context:
        #   plain scalars in the flow context cannot contain ',' or '?'.
        # We also keep track of the `allow_simple_key` flag here.
        # Indentation rules are loosened for the flow context.
        chunks = []
        start_mark = self.get_mark()
        end_mark = start_mark
        indent = self.indent+1
        # We allow zero indentation for scalars, but then we need to check for
        # document separators at the beginning of the line.
        #if indent == 0:
        #    indent = 1
        spaces = []
        while True:
            length = 0
            if self.peek() == '#':
                break
            while True:
                ch = self.peek(length)
                if ch in '\0 \t\r\n\x85\u2028\u2029'    \
                        or (ch == ':' and
                            self.peek(length+1) in '\0 \t\r\n\x85\u2028\u2029'
                                  + (u',[]{}' if self.flow_level else u''))\
                        or (self.flow_level and ch in ',?[]{}'):
                    break
                length += 1
            if length == 0:
                break
            self.allow_simple_key = False
            chunks.extend(spaces)
            chunks.append(self.prefix(length))
            self.forward(length)
            end_mark = self.get_mark()
            spaces = self.scan_plain_spaces(indent, start_mark)
            if not spaces or self.peek() == '#' \
                    or (not self.flow_level and self.column < indent):
                break
        return ScalarToken(''.join(chunks), True, start_mark, end_mark)

    def scan_plain_spaces(self, indent, start_mark):
        # See the specification for details.
        # The specification is really confusing about tabs in plain scalars.
        # We just forbid them completely. Do not use tabs in YAML!
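# scan_plain() only treats ':' as a value indicator when it is followed by
# whitespace (in block context), so a bare `a:b` remains one plain scalar.
# A sketch of the distinction:

```python
import yaml

plain = yaml.safe_load("a:b")     # ':' not followed by space: plain scalar
mapping = yaml.safe_load("a: b")  # ':' followed by space: key/value
```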
        chunks = []
        length = 0
        while self.peek(length) in ' ':
            length += 1
        whitespaces = self.prefix(length)
        self.forward(length)
        ch = self.peek()
        if ch in '\r\n\x85\u2028\u2029':
            line_break = self.scan_line_break()
            self.allow_simple_key = True
            prefix = self.prefix(3)
            if (prefix == '---' or prefix == '...')   \
                    and self.peek(3) in '\0 \t\r\n\x85\u2028\u2029':
                return
            breaks = []
            while self.peek() in ' \r\n\x85\u2028\u2029':
                if self.peek() == ' ':
                    self.forward()
                else:
                    breaks.append(self.scan_line_break())
                    prefix = self.prefix(3)
                    if (prefix == '---' or prefix == '...')   \
                            and self.peek(3) in '\0 \t\r\n\x85\u2028\u2029':
                        return
            if line_break != '\n':
                chunks.append(line_break)
            elif not breaks:
                chunks.append(' ')
            chunks.extend(breaks)
        elif whitespaces:
            chunks.append(whitespaces)
        return chunks

    def scan_tag_handle(self, name, start_mark):
        # See the specification for details.
        # For some strange reason, the specification does not allow '_' in
        # tag handles. I have allowed it anyway.
        ch = self.peek()
        if ch != '!':
            raise ScannerError("while scanning a %s" % name, start_mark,
                    "expected '!', but found %r" % ch, self.get_mark())
        length = 1
        ch = self.peek(length)
        if ch != ' ':
            while '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z'  \
                    or ch in '-_':
                length += 1
                ch = self.peek(length)
            if ch != '!':
                self.forward(length)
                raise ScannerError("while scanning a %s" % name, start_mark,
                        "expected '!', but found %r" % ch, self.get_mark())
            length += 1
        value = self.prefix(length)
        self.forward(length)
        return value

    def scan_tag_uri(self, name, start_mark):
        # See the specification for details.
        # Note: we do not check if URI is well-formed.
        chunks = []
        length = 0
        ch = self.peek(length)
        while '0' <= ch <= '9' or 'A' <= ch <= 'Z' or 'a' <= ch <= 'z'  \
                or ch in '-;/?:@&=+$,_.!~*\'()[]%':
            if ch == '%':
                chunks.append(self.prefix(length))
                self.forward(length)
                length = 0
                chunks.append(self.scan_uri_escapes(name, start_mark))
            else:
                length += 1
            ch = self.peek(length)
        if length:
            chunks.append(self.prefix(length))
            self.forward(length)
            length = 0
        if not chunks:
            raise ScannerError("while parsing a %s" % name, start_mark,
                    "expected URI, but found %r" % ch, self.get_mark())
        return ''.join(chunks)

    def scan_uri_escapes(self, name, start_mark):
        # See the specification for details.
        codes = []
        mark = self.get_mark()
        while self.peek() == '%':
            self.forward()
            for k in range(2):
                if self.peek(k) not in '0123456789ABCDEFabcdef':
                    raise ScannerError("while scanning a %s" % name, start_mark,
                            "expected URI escape sequence of 2 hexadecimal numbers, but found %r"
                            % self.peek(k), self.get_mark())
            codes.append(int(self.prefix(2), 16))
            self.forward(2)
        try:
            value = bytes(codes).decode('utf-8')
        except UnicodeDecodeError as exc:
            raise ScannerError("while scanning a %s" % name, start_mark, str(exc), mark)
        return value

    def scan_line_break(self):
        # Transforms:
        #   '\r\n'      :   '\n'
        #   '\r'        :   '\n'
        #   '\n'        :   '\n'
        #   '\x85'      :   '\n'
        #   '\u2028'    :   '\u2028'
        #   '\u2029'    :   '\u2029'
        #   default     :   ''
        ch = self.peek()
        if ch in '\r\n\x85':
            if self.prefix(2) == '\r\n':
                self.forward(2)
            else:
                self.forward()
            return '\n'
        elif ch in '\u2028\u2029':
            self.forward()
            return ch
        return ''

# PyYAML-6.0.1/lib/yaml/serializer.py

__all__ = ['Serializer', 'SerializerError']

from .error import YAMLError
from .events import *
from .nodes import *

class SerializerError(YAMLError):
    pass

class Serializer:

    ANCHOR_TEMPLATE = 'id%03d'

    def __init__(self, encoding=None,
            explicit_start=None, explicit_end=None,
            version=None, tags=None):
self.use_encoding = encoding self.use_explicit_start = explicit_start self.use_explicit_end = explicit_end self.use_version = version self.use_tags = tags self.serialized_nodes = {} self.anchors = {} self.last_anchor_id = 0 self.closed = None def open(self): if self.closed is None: self.emit(StreamStartEvent(encoding=self.use_encoding)) self.closed = False elif self.closed: raise SerializerError("serializer is closed") else: raise SerializerError("serializer is already opened") def close(self): if self.closed is None: raise SerializerError("serializer is not opened") elif not self.closed: self.emit(StreamEndEvent()) self.closed = True #def __del__(self): # self.close() def serialize(self, node): if self.closed is None: raise SerializerError("serializer is not opened") elif self.closed: raise SerializerError("serializer is closed") self.emit(DocumentStartEvent(explicit=self.use_explicit_start, version=self.use_version, tags=self.use_tags)) self.anchor_node(node) self.serialize_node(node, None, None) self.emit(DocumentEndEvent(explicit=self.use_explicit_end)) self.serialized_nodes = {} self.anchors = {} self.last_anchor_id = 0 def anchor_node(self, node): if node in self.anchors: if self.anchors[node] is None: self.anchors[node] = self.generate_anchor(node) else: self.anchors[node] = None if isinstance(node, SequenceNode): for item in node.value: self.anchor_node(item) elif isinstance(node, MappingNode): for key, value in node.value: self.anchor_node(key) self.anchor_node(value) def generate_anchor(self, node): self.last_anchor_id += 1 return self.ANCHOR_TEMPLATE % self.last_anchor_id def serialize_node(self, node, parent, index): alias = self.anchors[node] if node in self.serialized_nodes: self.emit(AliasEvent(alias)) else: self.serialized_nodes[node] = True self.descend_resolver(parent, index) if isinstance(node, ScalarNode): detected_tag = self.resolve(ScalarNode, node.value, (True, False)) default_tag = self.resolve(ScalarNode, node.value, (False, True)) implicit 
= (node.tag == detected_tag), (node.tag == default_tag)
                self.emit(ScalarEvent(alias, node.tag, implicit, node.value,
                    style=node.style))
            elif isinstance(node, SequenceNode):
                implicit = (node.tag
                            == self.resolve(SequenceNode, node.value, True))
                self.emit(SequenceStartEvent(alias, node.tag, implicit,
                    flow_style=node.flow_style))
                index = 0
                for item in node.value:
                    self.serialize_node(item, node, index)
                    index += 1
                self.emit(SequenceEndEvent())
            elif isinstance(node, MappingNode):
                implicit = (node.tag
                            == self.resolve(MappingNode, node.value, True))
                self.emit(MappingStartEvent(alias, node.tag, implicit,
                    flow_style=node.flow_style))
                for key, value in node.value:
                    self.serialize_node(key, node, None)
                    self.serialize_node(value, node, key)
                self.emit(MappingEndEvent())
            self.ascend_resolver()

# PyYAML-6.0.1/lib/yaml/tokens.py

class Token(object):
    def __init__(self, start_mark, end_mark):
        self.start_mark = start_mark
        self.end_mark = end_mark
    def __repr__(self):
        attributes = [key for key in self.__dict__
                if not key.endswith('_mark')]
        attributes.sort()
        arguments = ', '.join(['%s=%r' % (key, getattr(self, key))
                for key in attributes])
        return '%s(%s)' % (self.__class__.__name__, arguments)

#class BOMToken(Token):
#    id = '<byte order mark>'

class DirectiveToken(Token):
    id = '<directive>'
    def __init__(self, name, value, start_mark, end_mark):
        self.name = name
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark

class DocumentStartToken(Token):
    id = '<document start>'

class DocumentEndToken(Token):
    id = '<document end>'

class StreamStartToken(Token):
    id = '<stream start>'
    def __init__(self, start_mark=None, end_mark=None,
            encoding=None):
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.encoding = encoding

class StreamEndToken(Token):
    id = '<stream end>'

class BlockSequenceStartToken(Token):
    id = '<block sequence start>'

class BlockMappingStartToken(Token):
    id = '<block mapping start>'

class BlockEndToken(Token):
    id = '<block end>'
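# The Serializer's anchor_node()/generate_anchor() pass means a node reached
# twice is emitted once with an '&idNNN' anchor and referenced with '*idNNN'
# (ANCHOR_TEMPLATE is 'id%03d'), and the token classes carry a descriptive
# class-level `id`. A sketch of both, using the public API:

```python
import yaml
from yaml.tokens import ScalarToken

# A list referenced twice serializes as one anchored node plus one alias.
shared = ["x"]
dumped = yaml.dump([shared, shared])

# At the token level, scalar values are still plain strings.
scalar_values = [tok.value for tok in yaml.scan("a: 1")
                 if isinstance(tok, ScalarToken)]
```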
class FlowSequenceStartToken(Token):
    id = '['

class FlowMappingStartToken(Token):
    id = '{'

class FlowSequenceEndToken(Token):
    id = ']'

class FlowMappingEndToken(Token):
    id = '}'

class KeyToken(Token):
    id = '?'

class ValueToken(Token):
    id = ':'

class BlockEntryToken(Token):
    id = '-'

class FlowEntryToken(Token):
    id = ','

class AliasToken(Token):
    id = '<alias>'
    def __init__(self, value, start_mark, end_mark):
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark

class AnchorToken(Token):
    id = '<anchor>'
    def __init__(self, value, start_mark, end_mark):
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark

class TagToken(Token):
    id = '<tag>'
    def __init__(self, value, start_mark, end_mark):
        self.value = value
        self.start_mark = start_mark
        self.end_mark = end_mark

class ScalarToken(Token):
    id = '<scalar>'
    def __init__(self, value, plain, start_mark, end_mark, style=None):
        self.value = value
        self.plain = plain
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.style = style

PyYAML-6.0.1/pyproject.toml

[build-system]
requires = ["setuptools", "wheel", "Cython<3.0"]
build-backend = "setuptools.build_meta"

PyYAML-6.0.1/setup.cfg

[egg_info]
tag_build =
tag_date = 0

PyYAML-6.0.1/setup.py

NAME = 'PyYAML'
VERSION = '6.0.1'
DESCRIPTION = "YAML parser and emitter for Python"
LONG_DESCRIPTION = """\
YAML is a data serialization format designed for human readability
and interaction with scripting languages.  PyYAML is a YAML parser
and emitter for Python.
PyYAML features a complete YAML 1.1 parser, Unicode support, pickle
support, capable extension API, and sensible error messages.  PyYAML
supports standard YAML tags and provides Python-specific tags that
allow representing an arbitrary Python object.

PyYAML is applicable for a broad range of tasks from complex
configuration files to object serialization and persistence."""
AUTHOR = "Kirill Simonov"
AUTHOR_EMAIL = 'xi@resolvent.net'
LICENSE = "MIT"
PLATFORMS = "Any"
URL = "https://pyyaml.org/"
DOWNLOAD_URL = "https://pypi.org/project/PyYAML/"
CLASSIFIERS = [
    "Development Status :: 5 - Production/Stable",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Cython",
    "Programming Language :: Python",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.6",
    "Programming Language :: Python :: 3.7",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: Implementation :: CPython",
    "Programming Language :: Python :: Implementation :: PyPy",
    "Topic :: Software Development :: Libraries :: Python Modules",
    "Topic :: Text Processing :: Markup",
]
PROJECT_URLS = {
    'Bug Tracker': 'https://github.com/yaml/pyyaml/issues',
    'CI': 'https://github.com/yaml/pyyaml/actions',
    'Documentation': 'https://pyyaml.org/wiki/PyYAMLDocumentation',
    'Mailing lists': 'http://lists.sourceforge.net/lists/listinfo/yaml-core',
    'Source Code': 'https://github.com/yaml/pyyaml',
}

LIBYAML_CHECK = """
#include <yaml.h>

int main(void) {
    yaml_parser_t parser;
    yaml_emitter_t emitter;

    yaml_parser_initialize(&parser);
    yaml_parser_delete(&parser);

    yaml_emitter_initialize(&emitter);
    yaml_emitter_delete(&emitter);

    return 0;
}
"""

import sys, os, os.path, pathlib, platform, shutil, tempfile, warnings

# for newer setuptools, enable the embedded distutils before importing
# setuptools/distutils to avoid warnings
os.environ['SETUPTOOLS_USE_DISTUTILS'] = 'local'

from setuptools import setup, Command, Distribution as _Distribution, Extension as _Extension
from setuptools.command.build_ext import build_ext as _build_ext
# NB: distutils imports must remain below setuptools to ensure we use the embedded version
from distutils import log
from distutils.errors import DistutilsError, CompileError, LinkError, DistutilsPlatformError

with_cython = False
if 'sdist' in sys.argv or os.environ.get('PYYAML_FORCE_CYTHON') == '1':
    # we need cython here
    with_cython = True
try:
    from Cython.Distutils.extension import Extension as _Extension
    from Cython.Distutils import build_ext as _build_ext
    with_cython = True
except ImportError:
    if with_cython:
        raise

try:
    from wheel.bdist_wheel import bdist_wheel
except ImportError:
    bdist_wheel = None

# on Windows, disable wheel generation warning noise
windows_ignore_warnings = [
    "Unknown distribution option: 'python_requires'",
    "Config variable 'Py_DEBUG' is unset",
    "Config variable 'WITH_PYMALLOC' is unset",
    "Config variable 'Py_UNICODE_SIZE' is unset",
    "Cython directive 'language_level' not set",
]

if platform.system() == 'Windows':
    for w in windows_ignore_warnings:
        warnings.filterwarnings('ignore', w)

class Distribution(_Distribution):
    def __init__(self, attrs=None):
        _Distribution.__init__(self, attrs)
        if not self.ext_modules:
            return
        for idx in range(len(self.ext_modules)-1, -1, -1):
            ext = self.ext_modules[idx]
            if not isinstance(ext, Extension):
                continue
            setattr(self, ext.attr_name, None)
            self.global_options = [
                (ext.option_name, None,
                    "include %s (default if %s is available)"
                    % (ext.feature_description, ext.feature_name)),
                (ext.neg_option_name, None,
                    "exclude %s" % ext.feature_description),
            ] + self.global_options
            self.negative_opt = self.negative_opt.copy()
            self.negative_opt[ext.neg_option_name] = ext.option_name

    def has_ext_modules(self):
        if not self.ext_modules:
            return False
        for ext in self.ext_modules:
            with_ext = self.ext_status(ext)
            if with_ext is None or with_ext:
                return True
        return False

    def ext_status(self, ext):
        implementation = platform.python_implementation()
        if implementation not in ['CPython', 'PyPy']:
            return False
        if isinstance(ext, Extension):
            # the "build by default" behavior is implemented by this returning None
            with_ext = getattr(self, ext.attr_name) or os.environ.get('PYYAML_FORCE_{0}'.format(ext.feature_name.upper()))
            try:
                with_ext = int(with_ext)  # attempt coerce envvar to int
            except TypeError:
                pass
            return with_ext
        else:
            return True

class Extension(_Extension):
    def __init__(self, name, sources, feature_name, feature_description,
            feature_check, **kwds):
        if not with_cython:
            for filename in sources[:]:
                base, ext = os.path.splitext(filename)
                if ext == '.pyx':
                    sources.remove(filename)
                    sources.append('%s.c' % base)
        _Extension.__init__(self, name, sources, **kwds)
        self.feature_name = feature_name
        self.feature_description = feature_description
        self.feature_check = feature_check
        self.attr_name = 'with_' + feature_name.replace('-', '_')
        self.option_name = 'with-' + feature_name
        self.neg_option_name = 'without-' + feature_name

class build_ext(_build_ext):
    def run(self):
        optional = True
        disabled = True
        for ext in self.extensions:
            with_ext = self.distribution.ext_status(ext)
            if with_ext is None:
                disabled = False
            elif with_ext:
                optional = False
                disabled = False
                break
        if disabled:
            return
        try:
            _build_ext.run(self)
        except DistutilsPlatformError:
            exc = sys.exc_info()[1]
            if optional:
                log.warn(str(exc))
                log.warn("skipping build_ext")
            else:
                raise

    def get_source_files(self):
        self.check_extensions_list(self.extensions)
        filenames = []
        for ext in self.extensions:
            if with_cython:
                self.cython_sources(ext.sources, ext)
            for filename in ext.sources:
                filenames.append(filename)
                base = os.path.splitext(filename)[0]
                for ext in ['c', 'h', 'pyx', 'pxd']:
                    filename = '%s.%s' % (base, ext)
                    if filename not in filenames and os.path.isfile(filename):
                        filenames.append(filename)
        return filenames
    def get_outputs(self):
        self.check_extensions_list(self.extensions)
        outputs = []
        for ext in self.extensions:
            fullname = self.get_ext_fullname(ext.name)
            filename = os.path.join(self.build_lib,
                    self.get_ext_filename(fullname))
            if os.path.isfile(filename):
                outputs.append(filename)
        return outputs

    def build_extensions(self):
        self.check_extensions_list(self.extensions)
        for ext in self.extensions:
            with_ext = self.distribution.ext_status(ext)
            if with_ext is not None and not with_ext:
                continue
            if with_cython:
                ext.sources = self.cython_sources(ext.sources, ext)
            try:
                self.build_extension(ext)
            except (CompileError, LinkError):
                if with_ext is not None:
                    raise
                log.warn("Error compiling module, falling back to pure Python")

class test(Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        build_cmd = self.get_finalized_command('build')
        build_cmd.run()

        # running the tests this way can pollute the post-MANIFEST build sources
        # (see https://github.com/yaml/pyyaml/issues/527#issuecomment-921058344)
        # until we remove the test command, run tests from an ephemeral copy of
        # the intermediate build sources
        tempdir = tempfile.TemporaryDirectory(prefix='test_pyyaml')

        try:
            # have to create a subdir since we don't get dirs_exist_ok on
            # copytree until 3.8
            temp_test_path = pathlib.Path(tempdir.name) / 'pyyaml'
            shutil.copytree(build_cmd.build_lib, temp_test_path)
            sys.path.insert(0, str(temp_test_path))
            sys.path.insert(0, 'tests/lib')

            import test_all
            if not test_all.main([]):
                raise DistutilsError("Tests failed")
        finally:
            try:
                # this can fail under Windows; best-effort cleanup
                tempdir.cleanup()
            except Exception:
                pass

cmdclass = {
    'build_ext': build_ext,
    'test': test,
}
if bdist_wheel:
    cmdclass['bdist_wheel'] = bdist_wheel

if __name__ == '__main__':
    setup(
        name=NAME,
        version=VERSION,
        description=DESCRIPTION,
        long_description=LONG_DESCRIPTION,
        author=AUTHOR,
        author_email=AUTHOR_EMAIL,
        license=LICENSE,
        platforms=PLATFORMS,
        url=URL,
        download_url=DOWNLOAD_URL,
        classifiers=CLASSIFIERS,
        project_urls=PROJECT_URLS,

        package_dir={'': 'lib'},
        packages=['yaml', '_yaml'],
        ext_modules=[
            Extension('yaml._yaml', ['yaml/_yaml.pyx'],
                'libyaml', "LibYAML bindings", LIBYAML_CHECK,
                libraries=['yaml']),
        ],

        distclass=Distribution,
        cmdclass=cmdclass,
        python_requires='>=3.6',
    )

PyYAML-6.0.1/tests/data/a-nasty-libyaml-bug.loader-error

[ [

PyYAML-6.0.1/tests/data/aliases-cdumper-bug.code

[ today, today ]

PyYAML-6.0.1/tests/data/aliases.events

- !StreamStart
- !DocumentStart
- !SequenceStart
- !Scalar { anchor: 'myanchor', tag: '!mytag', value: 'data' }
- !Alias { anchor: 'myanchor' }
- !SequenceEnd
- !DocumentEnd
- !StreamEnd

PyYAML-6.0.1/tests/data/bool.data

- yes
- NO
- True
- on

PyYAML-6.0.1/tests/data/bool.detect

tag:yaml.org,2002:bool
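The `bool.data` fixture above exercises YAML 1.1's boolean spellings; a quick sketch of how PyYAML resolves them (assuming PyYAML is installed):

```python
import yaml

# YAML 1.1 resolves yes/no/on/off (in any case) as booleans,
# which is exactly what the bool.data fixture checks.
doc = "- yes\n- NO\n- True\n- on\n"
values = yaml.safe_load(doc)
print(values)  # all four entries resolve to booleans
```

This is also why the `but:` mapping in `construct-bool.data` quotes nothing yet still gets strings: bare `y` and `n` are not boolean spellings in PyYAML's resolver.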
PyYAML-6.0.1/tests/data/construct-binary-py2.code

{
    "canonical": "GIF89a\x0c\x00\x0c\x00\x84\x00\x00\xff\xff\xf7\xf5\xf5\xee\xe9\xe9\xe5fff\x00\x00\x00\xe7\xe7\xe7^^^\xf3\xf3\xed\x8e\x8e\x8e\xe0\xe0\xe0\x9f\x9f\x9f\x93\x93\x93\xa7\xa7\xa7\x9e\x9e\x9eiiiccc\xa3\xa3\xa3\x84\x84\x84\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9!\xfe\x0eMade with GIMP\x00,\x00\x00\x00\x00\x0c\x00\x0c\x00\x00\x05, \x8e\x810\x9e\xe3@\x14\xe8i\x10\xc4\xd1\x8a\x08\x1c\xcf\x80M$z\xef\xff0\x85p\xb8\xb01f\r\x1b\xce\x01\xc3\x01\x1e\x10' \x82\n\x01\x00;",
    "generic": "GIF89a\x0c\x00\x0c\x00\x84\x00\x00\xff\xff\xf7\xf5\xf5\xee\xe9\xe9\xe5fff\x00\x00\x00\xe7\xe7\xe7^^^\xf3\xf3\xed\x8e\x8e\x8e\xe0\xe0\xe0\x9f\x9f\x9f\x93\x93\x93\xa7\xa7\xa7\x9e\x9e\x9eiiiccc\xa3\xa3\xa3\x84\x84\x84\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9!\xfe\x0eMade with GIMP\x00,\x00\x00\x00\x00\x0c\x00\x0c\x00\x00\x05, \x8e\x810\x9e\xe3@\x14\xe8i\x10\xc4\xd1\x8a\x08\x1c\xcf\x80M$z\xef\xff0\x85p\xb8\xb01f\r\x1b\xce\x01\xc3\x01\x1e\x10' \x82\n\x01\x00;",
    "description": "The binary value above is a tiny arrow encoded as a gif image.",
}

PyYAML-6.0.1/tests/data/construct-binary-py2.data

canonical: !!binary "\
 R0lGODlhDAAMAIQAAP//9/X17unp5WZmZgAAAOfn515eXvPz7Y6OjuDg4J+fn5\
 OTk6enp56enmlpaWNjY6Ojo4SEhP/++f/++f/++f/++f/++f/++f/++f/++f/+\
 +f/++f/++f/++f/++f/++SH+Dk1hZGUgd2l0aCBHSU1QACwAAAAADAAMAAAFLC\
 AgjoEwnuNAFOhpEMTRiggcz4BNJHrv/zCFcLiwMWYNG84BwwEeECcgggoBADs="
generic: !!binary |
 R0lGODlhDAAMAIQAAP//9/X17unp5WZmZgAAAOfn515eXvPz7Y6OjuDg4J+fn5
 OTk6enp56enmlpaWNjY6Ojo4SEhP/++f/++f/++f/++f/++f/++f/++f/++f/+
 +f/++f/++f/++f/++f/++SH+Dk1hZGUgd2l0aCBHSU1QACwAAAAADAAMAAAFLC
 AgjoEwnuNAFOhpEMTRiggcz4BNJHrv/zCFcLiwMWYNG84BwwEeECcgggoBADs=
description: The binary value above is a tiny arrow encoded as a gif image.

PyYAML-6.0.1/tests/data/construct-binary-py3.code

{
    "canonical": b"GIF89a\x0c\x00\x0c\x00\x84\x00\x00\xff\xff\xf7\xf5\xf5\xee\xe9\xe9\xe5fff\x00\x00\x00\xe7\xe7\xe7^^^\xf3\xf3\xed\x8e\x8e\x8e\xe0\xe0\xe0\x9f\x9f\x9f\x93\x93\x93\xa7\xa7\xa7\x9e\x9e\x9eiiiccc\xa3\xa3\xa3\x84\x84\x84\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9!\xfe\x0eMade with GIMP\x00,\x00\x00\x00\x00\x0c\x00\x0c\x00\x00\x05, \x8e\x810\x9e\xe3@\x14\xe8i\x10\xc4\xd1\x8a\x08\x1c\xcf\x80M$z\xef\xff0\x85p\xb8\xb01f\r\x1b\xce\x01\xc3\x01\x1e\x10' \x82\n\x01\x00;",
    "generic": b"GIF89a\x0c\x00\x0c\x00\x84\x00\x00\xff\xff\xf7\xf5\xf5\xee\xe9\xe9\xe5fff\x00\x00\x00\xe7\xe7\xe7^^^\xf3\xf3\xed\x8e\x8e\x8e\xe0\xe0\xe0\x9f\x9f\x9f\x93\x93\x93\xa7\xa7\xa7\x9e\x9e\x9eiiiccc\xa3\xa3\xa3\x84\x84\x84\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9\xff\xfe\xf9!\xfe\x0eMade with GIMP\x00,\x00\x00\x00\x00\x0c\x00\x0c\x00\x00\x05, \x8e\x810\x9e\xe3@\x14\xe8i\x10\xc4\xd1\x8a\x08\x1c\xcf\x80M$z\xef\xff0\x85p\xb8\xb01f\r\x1b\xce\x01\xc3\x01\x1e\x10' \x82\n\x01\x00;",
    "description": "The binary value above is a tiny arrow encoded as a gif image.",
}

PyYAML-6.0.1/tests/data/construct-binary-py3.data

canonical: !!binary "\
 R0lGODlhDAAMAIQAAP//9/X17unp5WZmZgAAAOfn515eXvPz7Y6OjuDg4J+fn5\
 OTk6enp56enmlpaWNjY6Ojo4SEhP/++f/++f/++f/++f/++f/++f/++f/++f/+\
 +f/++f/++f/++f/++f/++SH+Dk1hZGUgd2l0aCBHSU1QACwAAAAADAAMAAAFLC\
 AgjoEwnuNAFOhpEMTRiggcz4BNJHrv/zCFcLiwMWYNG84BwwEeECcgggoBADs="
generic: !!binary |
 R0lGODlhDAAMAIQAAP//9/X17unp5WZmZgAAAOfn515eXvPz7Y6OjuDg4J+fn5
 OTk6enp56enmlpaWNjY6Ojo4SEhP/++f/++f/++f/++f/++f/++f/++f/++f/+
 +f/++f/++f/++f/++f/++SH+Dk1hZGUgd2l0aCBHSU1QACwAAAAADAAMAAAFLC
 AgjoEwnuNAFOhpEMTRiggcz4BNJHrv/zCFcLiwMWYNG84BwwEeECcgggoBADs=
description: The binary value above is a tiny arrow encoded as a gif image.

PyYAML-6.0.1/tests/data/construct-bool.code

{
    "canonical": True,
    "answer": False,
    "logical": True,
    "option": True,
    "but": { "y": "is a string", "n": "is a string" },
}

PyYAML-6.0.1/tests/data/construct-bool.data

canonical: yes
answer: NO
logical: True
option: on
but:
    y: is a string
    n: is a string

PyYAML-6.0.1/tests/data/construct-custom.code

[
    MyTestClass1(x=1),
    MyTestClass1(x=1, y=2, z=3),
    MyTestClass2(x=10),
    MyTestClass2(x=10, y=20, z=30),
    MyTestClass3(x=1),
    MyTestClass3(x=1, y=2, z=3),
    MyTestClass3(x=1, y=2, z=3),
    YAMLObject1(my_parameter='foo', my_another_parameter=[1,2,3]),
]
PyYAML-6.0.1/tests/data/construct-custom.data

---
- !tag1
  x: 1
- !tag1
  x: 1
  'y': 2
  z: 3
- !tag2
  10
- !tag2
  =: 10
  'y': 20
  z: 30
- !tag3
  x: 1
- !tag3
  x: 1
  'y': 2
  z: 3
- !tag3
  =: 1
  'y': 2
  z: 3
- !foo
  my-parameter: foo
  my-another-parameter: [1,2,3]

PyYAML-6.0.1/tests/data/construct-float.code

{
    "canonical": 685230.15,
    "exponential": 685230.15,
    "fixed": 685230.15,
    "sexagesimal": 685230.15,
    "negative infinity": -1e300000,
    "not a number": 1e300000/1e300000,
}

PyYAML-6.0.1/tests/data/construct-float.data

canonical: 6.8523015e+5
exponential: 685.230_15e+03
fixed: 685_230.15
sexagesimal: 190:20:30.15
negative infinity: -.inf
not a number: .NaN

PyYAML-6.0.1/tests/data/construct-int.code

{
    "canonical": 685230,
    "decimal": 685230,
    "octal": 685230,
    "hexadecimal": 685230,
    "binary": 685230,
    "sexagesimal": 685230,
}

PyYAML-6.0.1/tests/data/construct-int.data

canonical: 685230
decimal: +685_230
octal: 02472256
hexadecimal: 0x_0A_74_AE
binary: 0b1010_0111_0100_1010_1110
sexagesimal: 190:20:30

PyYAML-6.0.1/tests/data/construct-map.code

{
    "Block style": {
        "Clark" : "Evans",
        "Brian" : "Ingerson",
        "Oren" : "Ben-Kiki",
    },
    "Flow style": {
        "Clark" : "Evans",
        "Brian" : "Ingerson",
        "Oren" : "Ben-Kiki",
    },
}

PyYAML-6.0.1/tests/data/construct-map.data

# Unordered set of key: value pairs.
Block style: !!map
  Clark : Evans
  Brian : Ingerson
  Oren : Ben-Kiki
Flow style: !!map { Clark: Evans, Brian: Ingerson, Oren: Ben-Kiki }

PyYAML-6.0.1/tests/data/construct-merge.code

[
    { "x": 1, "y": 2 },
    { "x": 0, "y": 2 },
    { "r": 10 },
    { "r": 1 },
    { "x": 1, "y": 2, "r": 10, "label": "center/big" },
    { "x": 1, "y": 2, "r": 10, "label": "center/big" },
    { "x": 1, "y": 2, "r": 10, "label": "center/big" },
    { "x": 1, "y": 2, "r": 10, "label": "center/big" },
]

PyYAML-6.0.1/tests/data/construct-merge.data

---
- &CENTER { x: 1, 'y': 2 }
- &LEFT { x: 0, 'y': 2 }
- &BIG { r: 10 }
- &SMALL { r: 1 }

# All the following maps are equal:

- # Explicit keys
  x: 1
  'y': 2
  r: 10
  label: center/big

- # Merge one map
  << : *CENTER
  r: 10
  label: center/big

- # Merge multiple maps
  << : [ *CENTER, *BIG ]
  label: center/big

- # Override
  << : [ *BIG, *LEFT, *SMALL ]
  x: 1
  label: center/big

PyYAML-6.0.1/tests/data/construct-null.code

[
    None,
    {
        "empty": None,
        "canonical": None,
        "english": None,
        None: "null key",
    },
    {
        "sparse": [
            None,
            "2nd entry",
            None,
            "4th entry",
            None,
        ],
    },
]

PyYAML-6.0.1/tests/data/construct-null.data

# A document may be null.
---
---
# This mapping has four keys,
# one has a value.
empty:
canonical: ~
english: null
~: null key
---
# This sequence has five
# entries, two have values.
sparse:
  - ~
  - 2nd entry
  -
  - 4th entry
  - Null

PyYAML-6.0.1/tests/data/construct-omap.code

{
    "Bestiary": [
        ("aardvark", "African pig-like ant eater. Ugly."),
        ("anteater", "South-American ant eater. Two species."),
        ("anaconda", "South-American constrictor snake. Scaly."),
    ],
    "Numbers": [ ("one", 1), ("two", 2), ("three", 3) ],
}

PyYAML-6.0.1/tests/data/construct-omap.data

# Explicitly typed ordered map (dictionary).
Bestiary: !!omap
  - aardvark: African pig-like ant eater. Ugly.
  - anteater: South-American ant eater. Two species.
  - anaconda: South-American constrictor snake. Scaly.
  # Etc.
# Flow style
Numbers: !!omap [ one: 1, two: 2, three : 3 ]

PyYAML-6.0.1/tests/data/construct-pairs.code

{
    "Block tasks": [
        ("meeting", "with team."),
        ("meeting", "with boss."),
        ("break", "lunch."),
        ("meeting", "with client."),
    ],
    "Flow tasks": [ ("meeting", "with team"), ("meeting", "with boss") ],
}

PyYAML-6.0.1/tests/data/construct-pairs.data

# Explicitly typed pairs.
Block tasks: !!pairs
  - meeting: with team.
  - meeting: with boss.
  - break: lunch.
  - meeting: with client.
Flow tasks: !!pairs [ meeting: with team, meeting: with boss ]

PyYAML-6.0.1/tests/data/construct-python-bool.code

[ True, False ]

PyYAML-6.0.1/tests/data/construct-python-bool.data

[ !!python/bool True, !!python/bool False ]

PyYAML-6.0.1/tests/data/construct-python-bytes-py3.code

b'some binary data'

PyYAML-6.0.1/tests/data/construct-python-bytes-py3.data

--- !!python/bytes 'c29tZSBiaW5hcnkgZGF0YQ=='

PyYAML-6.0.1/tests/data/construct-python-complex.code

[0.5+0j, 0.5+0.5j, 0.5j, -0.5+0.5j, -0.5+0j, -0.5-0.5j, -0.5j, 0.5-0.5j]

PyYAML-6.0.1/tests/data/construct-python-complex.data

- !!python/complex 0.5+0j
- !!python/complex 0.5+0.5j
- !!python/complex 0.5j
- !!python/complex -0.5+0.5j
- !!python/complex -0.5+0j
- !!python/complex -0.5-0.5j
- !!python/complex -0.5j
- !!python/complex 0.5-0.5j

PyYAML-6.0.1/tests/data/construct-python-float.code

123.456

PyYAML-6.0.1/tests/data/construct-python-float.data

!!python/float 123.456

PyYAML-6.0.1/tests/data/construct-python-int.code

123

PyYAML-6.0.1/tests/data/construct-python-int.data

!!python/int 123

PyYAML-6.0.1/tests/data/construct-python-long-short-py2.code

123L

PyYAML-6.0.1/tests/data/construct-python-long-short-py2.data

!!python/long 123

PyYAML-6.0.1/tests/data/construct-python-long-short-py3.code

123

PyYAML-6.0.1/tests/data/construct-python-long-short-py3.data

!!python/long 123

PyYAML-6.0.1/tests/data/construct-python-name-module.code

[str, yaml.Loader, yaml.dump, abs, yaml.tokens, signal.Handlers]
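The `construct-python-name-module` fixtures exercise the `!!python/name` and `!!python/module` tags; since CVE-2020-14343 (see the 5.4 changelog entry above), such arbitrary-object tags are only constructed by the unsafe loader. A minimal sketch, assuming PyYAML is installed:

```python
import yaml

# !!python/name resolves a dotted name to the live Python object.
# It requires UnsafeLoader, because it can reach arbitrary attributes;
# never use UnsafeLoader on untrusted input.
fn = yaml.load("!!python/name:abs", Loader=yaml.UnsafeLoader)
print(fn is abs)
```

With `yaml.SafeLoader` the same document raises a `ConstructorError` instead, which is the behavior these fixtures pin down.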
PyYAML-6.0.1/tests/data/construct-python-name-module.data

- !!python/name:str
- !!python/name:yaml.Loader
- !!python/name:yaml.dump
- !!python/name:abs
- !!python/module:yaml.tokens
- !!python/name:signal.Handlers

PyYAML-6.0.1/tests/data/construct-python-none.code

None

PyYAML-6.0.1/tests/data/construct-python-none.data

!!python/none

PyYAML-6.0.1/tests/data/construct-python-object.code

[
    AnObject(1, 'two', [3,3,3]),
    AnInstance(1, 'two', [3,3,3]),
    AnObject(1, 'two', [3,3,3]),
    AnInstance(1, 'two', [3,3,3]),
    AState(1, 'two', [3,3,3]),
    ACustomState(1, 'two', [3,3,3]),
    InitArgs(1, 'two', [3,3,3]),
    InitArgsWithState(1, 'two', [3,3,3]),
    NewArgs(1, 'two', [3,3,3]),
    NewArgsWithState(1, 'two', [3,3,3]),
    Reduce(1, 'two', [3,3,3]),
    ReduceWithState(1, 'two', [3,3,3]),
    Slots(1, 'two', [3,3,3]),
    MyInt(3),
    MyList(3),
    MyDict(3),
]

PyYAML-6.0.1/tests/data/construct-python-object.data

- !!python/object:test_constructor.AnObject { foo: 1, bar: two, baz: [3,3,3] }
- !!python/object:test_constructor.AnInstance { foo: 1, bar: two, baz: [3,3,3] }
- !!python/object/new:test_constructor.AnObject { args: [1, two], kwds: {baz: [3,3,3]} }
- !!python/object/apply:test_constructor.AnInstance { args: [1, two], kwds: {baz: [3,3,3]} }
- !!python/object:test_constructor.AState { _foo: 1, _bar: two, _baz: [3,3,3] }
- !!python/object/new:test_constructor.ACustomState { state: !!python/tuple [1, two, [3,3,3]] }
- !!python/object/new:test_constructor.InitArgs [1, two, [3,3,3]]
- !!python/object/new:test_constructor.InitArgsWithState { args: [1, two], state: [3,3,3] }
- !!python/object/new:test_constructor.NewArgs [1, two, [3,3,3]]
- !!python/object/new:test_constructor.NewArgsWithState { args: [1, two], state: [3,3,3] }
- !!python/object/apply:test_constructor.Reduce [1, two, [3,3,3]]
- !!python/object/apply:test_constructor.ReduceWithState { args: [1, two], state: [3,3,3] }
- !!python/object/new:test_constructor.Slots { state: !!python/tuple [null, { foo: 1, bar: 'two', baz: [3,3,3] } ] }
- !!python/object/new:test_constructor.MyInt [3]
- !!python/object/new:test_constructor.MyList { listitems: [~, ~, ~] }
- !!python/object/new:test_constructor.MyDict { dictitems: {0, 1, 2} }

PyYAML-6.0.1/tests/data/construct-python-str-ascii.code

"ascii string"

PyYAML-6.0.1/tests/data/construct-python-str-ascii.data

--- !!python/str "ascii string"

PyYAML-6.0.1/tests/data/construct-python-str-utf8-py2.code

u'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430'.encode('utf-8')

PyYAML-6.0.1/tests/data/construct-python-str-utf8-py2.data

--- !!python/str "Это уникодная строка"

PyYAML-6.0.1/tests/data/construct-python-str-utf8-py3.code

'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430'

PyYAML-6.0.1/tests/data/construct-python-str-utf8-py3.data

--- !!python/str "Это уникодная строка"

PyYAML-6.0.1/tests/data/construct-python-tuple-list-dict.code

[
    [1, 2, 3, 4],
    (1, 2, 3, 4),
    {1: 2, 3: 4},
    {(0,0): 0, (0,1): 1, (1,0): 1, (1,1): 0},
]

PyYAML-6.0.1/tests/data/construct-python-tuple-list-dict.data

- !!python/list [1, 2, 3, 4]
- !!python/tuple [1, 2, 3, 4]
- !!python/dict {1: 2, 3: 4}
- !!python/dict
  !!python/tuple [0,0]: 0
  !!python/tuple [0,1]: 1
  !!python/tuple [1,0]: 1
  !!python/tuple [1,1]: 0

PyYAML-6.0.1/tests/data/construct-python-unicode-ascii-py2.code

u"ascii string"

PyYAML-6.0.1/tests/data/construct-python-unicode-ascii-py2.data

--- !!python/unicode "ascii string"

PyYAML-6.0.1/tests/data/construct-python-unicode-ascii-py3.code

"ascii string"

PyYAML-6.0.1/tests/data/construct-python-unicode-ascii-py3.data

--- !!python/unicode "ascii string"

PyYAML-6.0.1/tests/data/construct-python-unicode-utf8-py2.code

u'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430'

PyYAML-6.0.1/tests/data/construct-python-unicode-utf8-py2.data

--- !!python/unicode "Это уникодная строка"

PyYAML-6.0.1/tests/data/construct-python-unicode-utf8-py3.code

'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430'

PyYAML-6.0.1/tests/data/construct-python-unicode-utf8-py3.data

--- !!python/unicode "Это уникодная строка"

PyYAML-6.0.1/tests/data/construct-seq.code

{
    "Block style": ["Mercury", "Venus", "Earth", "Mars",
"Jupiter", "Saturn", "Uranus", "Neptune", "Pluto"], "Flow style": ["Mercury", "Venus", "Earth", "Mars", "Jupiter", "Saturn", "Uranus", "Neptune", "Pluto"], } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-seq.data0000644000175100001730000000102414455350511017712 0ustar00runnerdocker# Ordered sequence of nodes Block style: !!seq - Mercury # Rotates - no light/dark sides. - Venus # Deadliest. Aptly named. - Earth # Mostly dirt. - Mars # Seems empty. - Jupiter # The king. - Saturn # Pretty. - Uranus # Where the sun hardly shines. - Neptune # Boring. No rings. - Pluto # You call this a planet? Flow style: !!seq [ Mercury, Venus, Earth, Mars, # Rocks Jupiter, Saturn, Uranus, Neptune, # Gas Pluto ] # Overrated ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-set.code0000644000175100001730000000024514455350511017722 0ustar00runnerdocker{ "baseball players": set(["Mark McGwire", "Sammy Sosa", "Ken Griffey"]), "baseball teams": set(["Boston Red Sox", "Detroit Tigers", "New York Yankees"]), } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-set.data0000644000175100001730000000027014455350511017717 0ustar00runnerdocker# Explicitly typed set. baseball players: !!set ? Mark McGwire ? Sammy Sosa ? 
Ken Griffey # Flow style baseball teams: !!set { Boston Red Sox, Detroit Tigers, New York Yankees } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-ascii.code0000644000175100001730000000001714455350511021022 0ustar00runnerdocker"ascii string" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-ascii.data0000644000175100001730000000003114455350511021015 0ustar00runnerdocker--- !!str "ascii string" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-utf8-py2.code0000644000175100001730000000016214455350511021331 0ustar00runnerdockeru'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-utf8-py2.data0000644000175100001730000000006314455350511021330 0ustar00runnerdocker--- !!str "Это уникодная строка" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-utf8-py3.code0000644000175100001730000000016114455350511021331 0ustar00runnerdocker'\u042d\u0442\u043e \u0443\u043d\u0438\u043a\u043e\u0434\u043d\u0430\u044f \u0441\u0442\u0440\u043e\u043a\u0430' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str-utf8-py3.data0000644000175100001730000000006314455350511021331 0ustar00runnerdocker--- !!str "Это уникодная строка" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/construct-str.code0000644000175100001730000000002514455350511017733 0ustar00runnerdocker{ "string": "abcd" } 
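The `construct-set` pair above exercises the `!!set` tag, which PyYAML's safe loader resolves to a Python `set`. A minimal sketch of loading a document in that style (the document text here is adapted from the fixture, not the harness itself):

```python
import yaml

# !!set maps to a Python set; block-style "? key" entries and
# flow-style "{ a, b }" entries both denote set members.
doc = """
baseball players: !!set
  ? Mark McGwire
  ? Sammy Sosa
  ? Ken Griffey
baseball teams: !!set { Boston Red Sox, Detroit Tigers }
"""
data = yaml.safe_load(doc)
assert data["baseball players"] == {"Mark McGwire", "Sammy Sosa", "Ken Griffey"}
assert data["baseball teams"] == {"Boston Red Sox", "Detroit Tigers"}
```

Note that `safe_load` handles `!!set`, `!!seq`, and `!!str`, while the `!!python/*` tags in the fixtures above require the unsafe loader.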
==> PyYAML-6.0.1/tests/data/construct-str.data <==
string: abcd

==> PyYAML-6.0.1/tests/data/construct-timestamp.code <==
{
    "canonical": datetime.datetime(2001, 12, 15, 2, 59, 43, 100000),
    "valid iso8601": datetime.datetime(2001, 12, 15, 2, 59, 43, 100000),
    "space separated": datetime.datetime(2001, 12, 15, 2, 59, 43, 100000),
    "no time zone (Z)": datetime.datetime(2001, 12, 15, 2, 59, 43, 100000),
    "date (00:00:00Z)": datetime.date(2002, 12, 14),
}

==> PyYAML-6.0.1/tests/data/construct-timestamp.data <==
canonical: 2001-12-15T02:59:43.1Z
valid iso8601: 2001-12-14t21:59:43.10-05:00
space separated: 2001-12-14 21:59:43.10 -5
no time zone (Z): 2001-12-15 2:59:43.10
date (00:00:00Z): 2002-12-14

==> PyYAML-6.0.1/tests/data/construct-value.code <==
[
    { "link with": [ "library1.dll", "library2.dll" ] },
    { "link with": [
        { "=": "library1.dll", "version": 1.2 },
        { "=": "library2.dll", "version": 2.3 },
    ] },
]

==> PyYAML-6.0.1/tests/data/construct-value.data <==
---     # Old schema
link with:
- library1.dll
- library2.dll
---     # New schema
link with:
- = : library1.dll
  version: 1.2
- = : library2.dll
  version: 2.3

==> PyYAML-6.0.1/tests/data/document-separator-in-quoted-scalar.loader-error <==
--- "this --- is correct"
--- "this ...
is also correct"
--- "a quoted scalar cannot contain
--- document separators"

==> PyYAML-6.0.1/tests/data/documents.events <==
- !StreamStart
- !DocumentStart { explicit: false }
- !Scalar { implicit: [true,false], value: 'data' }
- !DocumentEnd
- !DocumentStart
- !Scalar { implicit: [true,false] }
- !DocumentEnd
- !DocumentStart { version: [1,1], tags: { '!': '!foo', '!yaml!': 'tag:yaml.org,2002:', '!ugly!': '!!!!!!!' } }
- !Scalar { implicit: [true,false] }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/duplicate-anchor-1.loader-error <==
- &foo bar
- &bar bar
- &foo bar

==> PyYAML-6.0.1/tests/data/duplicate-anchor-2.loader-error <==
&foo [1, 2, 3, &foo 4]

==> PyYAML-6.0.1/tests/data/duplicate-key.former-loader-error.code <==
{ 'foo': 'baz' }

==> PyYAML-6.0.1/tests/data/duplicate-key.former-loader-error.data <==
---
foo: bar
foo: baz

==> PyYAML-6.0.1/tests/data/duplicate-mapping-key.former-loader-error.code <==
{ 'foo': { 'baz': 'bat', 'foo': 'duplicate key' } }

==> PyYAML-6.0.1/tests/data/duplicate-mapping-key.former-loader-error.data <==
---
&anchor foo:
  foo: bar
  *anchor: duplicate key
  baz: bat
  *anchor: duplicate key

==> PyYAML-6.0.1/tests/data/duplicate-merge-key.former-loader-error.code <==
{ 'x': 1, 'y': 2, 'foo': 'bar', 'z': 3, 't': 4 }

==> PyYAML-6.0.1/tests/data/duplicate-merge-key.former-loader-error.data <==
---
<<: {x: 1, y: 2}
foo: bar
<<: {z: 3, t: 4}

==> PyYAML-6.0.1/tests/data/duplicate-tag-directive.loader-error <==
%TAG !foo! bar
%TAG !foo! baz
--- foo

==> PyYAML-6.0.1/tests/data/duplicate-value-key.former-loader-error.code <==
{ 'foo': 'bar', '=': 2 }

==> PyYAML-6.0.1/tests/data/duplicate-value-key.former-loader-error.data <==
---
=: 1
foo: bar
=: 2

==> PyYAML-6.0.1/tests/data/duplicate-yaml-directive.loader-error <==
%YAML 1.1
%YAML 1.1
--- foo

==> PyYAML-6.0.1/tests/data/emit-block-scalar-in-simple-key-context-bug.canonical <==
%YAML 1.1
--- !!map
{
  ? !!str "foo"
  : !!str "bar"
}

==> PyYAML-6.0.1/tests/data/emit-block-scalar-in-simple-key-context-bug.data <==
? |-
  foo
: |-
  bar

==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug-py3.code <==
"\udd00"

==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug-py3.data <==
"\udd00"

==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug-py3.skip-ext <==
==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug.code <==
u"\udd00"

==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug.data <==
"\udd00"

==> PyYAML-6.0.1/tests/data/emitting-unacceptable-unicode-character-bug.skip-ext <==
==> PyYAML-6.0.1/tests/data/emoticons.unicode <==
😀😁😂😃😄😅😆😇
😈😉😊😋😌😍😎😏
😐😑😒😓😔😕😖😗
😘😙😚😛😜😝😞😟
😠😡😢😣😤😥😦😧
😨😩😪😫😬😭😮😯
😰😱😲😳😴😵😶😷
😸😹😺😻😼😽😾😿
🙀🙁🙂🙃🙄🙅🙆🙇
🙈🙉🙊🙋🙌🙍🙎🙏

==> PyYAML-6.0.1/tests/data/emoticons2.unicode <==
😀
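The `duplicate-merge-key` pair above relies on YAML's `<<` merge key, which folds an anchored mapping into the current one. A short sketch of the behavior with a single merge (document text is illustrative, not taken from the fixture):

```python
import yaml

# "<<: *anchor" merges the referenced mapping into the current one;
# keys written out explicitly take precedence over merged-in keys.
doc = """
base: &b {x: 1, y: 2}
child:
  <<: *b
  y: 3
"""
data = yaml.safe_load(doc)
assert data["child"] == {"x": 1, "y": 3}
```

The "former-loader-error" suffix on those fixtures records that older releases rejected duplicate merge keys, while current loaders accept and combine them.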
==> PyYAML-6.0.1/tests/data/empty-anchor.emitter-error <==
- !StreamStart
- !DocumentStart
- !Scalar { anchor: '', value: 'foo' }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/empty-document-bug.canonical <==
# This YAML stream contains no YAML documents.

==> PyYAML-6.0.1/tests/data/empty-document-bug.data <==
==> PyYAML-6.0.1/tests/data/empty-document-bug.empty <==
==> PyYAML-6.0.1/tests/data/empty-documents.single-loader-error <==
--- # first document
--- # second document

==> PyYAML-6.0.1/tests/data/empty-python-module.loader-error <==
--- !!python:module:

==> PyYAML-6.0.1/tests/data/empty-python-name.loader-error <==
--- !!python/name: empty

==> PyYAML-6.0.1/tests/data/empty-tag-handle.emitter-error <==
- !StreamStart
- !DocumentStart { tags: { '': 'bar' } }
- !Scalar { value: 'foo' }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/empty-tag-prefix.emitter-error <==
- !StreamStart
- !DocumentStart { tags: { '!': '' } }
- !Scalar { value: 'foo' }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/empty-tag.emitter-error <==
- !StreamStart
- !DocumentStart
- !Scalar { tag: '', value: 'key', implicit: [false,false] }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/expected-document-end.emitter-error <==
- !StreamStart
- !DocumentStart
- !Scalar { value: 'data 1' }
- !Scalar { value: 'data 2' }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/expected-document-start.emitter-error <==
- !StreamStart
- !MappingStart
- !MappingEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/expected-mapping.loader-error <==
--- !!map [not, a, map]

==> PyYAML-6.0.1/tests/data/expected-node-1.emitter-error <==
- !StreamStart
- !DocumentStart
- !DocumentEnd
- !StreamEnd
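The `*.events` and `*.emitter-error` fixtures above describe event sequences fed to the emitter; the error variants are streams like the one below with exactly one constraint violated (an empty anchor, a scalar after `DocumentEnd`, and so on). A minimal well-formed stream, sketched with PyYAML's public event classes (the exact output formatting is up to the emitter, so only the payload is checked):

```python
import yaml

# A well-formed event stream: one document containing one plain scalar.
events = [
    yaml.StreamStartEvent(),
    yaml.DocumentStartEvent(explicit=True),
    yaml.ScalarEvent(anchor=None, tag=None, implicit=(True, True), value="foo"),
    yaml.DocumentEndEvent(explicit=False),
    yaml.StreamEndEvent(),
]
text = yaml.emit(events)  # returns the serialized YAML as a string
assert "foo" in text
```

Dropping or reordering any of these events reproduces the conditions the `expected-*.emitter-error` fixtures test for.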
==> PyYAML-6.0.1/tests/data/expected-node-2.emitter-error <==
- !StreamStart
- !DocumentStart
- !MappingStart
- !Scalar { value: 'key' }
- !MappingEnd
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/expected-nothing.emitter-error <==
- !StreamStart
- !StreamEnd
- !StreamStart
- !StreamEnd

==> PyYAML-6.0.1/tests/data/expected-scalar.loader-error <==
--- !!str [not a scalar]

==> PyYAML-6.0.1/tests/data/expected-sequence.loader-error <==
--- !!seq {foo, bar, baz}

==> PyYAML-6.0.1/tests/data/expected-stream-start.emitter-error <==
- !DocumentStart
- !DocumentEnd

==> PyYAML-6.0.1/tests/data/explicit-document.single-loader-error <==
---
foo: bar
---
foo: bar

==> PyYAML-6.0.1/tests/data/fetch-complex-value-bug.loader-error <==
? "foo"
: "bar"

==> PyYAML-6.0.1/tests/data/float-representer-2.3-bug.code <==
{
#    0.0: 0,
    1.0: 1,
    1e300000: +10,
    -1e300000: -10,
    1e300000/1e300000: 100,
}

==> PyYAML-6.0.1/tests/data/float-representer-2.3-bug.data <==
#0.0: # hash(0) == hash(nan) and 0 == nan in Python 2.3
1.0: 1
+.inf: 10
-.inf: -10
.nan: 100

==> PyYAML-6.0.1/tests/data/float.data <==
- 6.8523015e+5
- 685.230_15e+03
- 685_230.15
- 190:20:30.15
- -.inf
- .NaN

==> PyYAML-6.0.1/tests/data/float.detect <==
tag:yaml.org,2002:float

==> PyYAML-6.0.1/tests/data/forbidden-entry.loader-error <==
test: - foo
      - bar

==> PyYAML-6.0.1/tests/data/forbidden-key.loader-error <==
test: ? foo
      : bar

==> PyYAML-6.0.1/tests/data/forbidden-value.loader-error <==
test: key: value

==> PyYAML-6.0.1/tests/data/implicit-document.single-loader-error <==
foo: bar
---
foo: bar

==> PyYAML-6.0.1/tests/data/int.data <==
- 685230
- +685_230
- 02472256
- 0x_0A_74_AE
- 0b1010_0111_0100_1010_1110
- 190:20:30

==> PyYAML-6.0.1/tests/data/int.detect <==
tag:yaml.org,2002:int

==> PyYAML-6.0.1/tests/data/invalid-anchor-1.loader-error <==
--- &? foo # we allow only ascii and numeric characters in anchor names.
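The `int.data` and `float.data` fixtures above list YAML 1.1 number spellings that all resolve to the same implicit tag. Every entry in `int.data` in fact denotes the same integer, which a quick check makes visible:

```python
import yaml

# YAML 1.1 integers: plain, underscored, hexadecimal, and
# sexagesimal (base-60) forms all spell 685230 here.
assert yaml.safe_load("685_230") == 685230
assert yaml.safe_load("0x_0A_74_AE") == 685230       # 0x0A74AE
assert yaml.safe_load("190:20:30") == 685230          # 190h 20m 30s in seconds

# Float resolution covers the special values as well.
assert yaml.safe_load("-.inf") == float("-inf")
```

The matching `.detect` files simply name the implicit tag (`tag:yaml.org,2002:int`, `tag:yaml.org,2002:float`) the resolver is expected to assign.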
==> PyYAML-6.0.1/tests/data/invalid-anchor-2.loader-error <==
---
- [ &correct foo, *correct, *correct]   # still correct
- *correct: still correct
- &correct-or-not[foo, bar]

==> PyYAML-6.0.1/tests/data/invalid-anchor.emitter-error <==
- !StreamStart
- !DocumentStart
- !Scalar { anchor: '5*5=25', value: 'foo' }
- !DocumentEnd
- !StreamEnd

==> PyYAML-6.0.1/tests/data/invalid-base64-data-2.loader-error <==
--- !!binary двоичные данные в base64

==> PyYAML-6.0.1/tests/data/invalid-base64-data.loader-error <==
--- !!binary binary data encoded in base64 should be here.

==> PyYAML-6.0.1/tests/data/invalid-block-scalar-indicator.loader-error <==
--- > what is this? # a comment
data

==> PyYAML-6.0.1/tests/data/invalid-character.loader-error <==
[filler lines of '-' characters]
Control character ('\x0'): <--
[filler line of '-' characters]

==> PyYAML-6.0.1/tests/data/invalid-character.stream-error <==
[filler lines of '#' characters]
Control character ('\x0'): <--
[filler line of '#' characters]

==> PyYAML-6.0.1/tests/data/invalid-directive-line.loader-error <==
%YAML 1.1 ? # extra symbol
---

==> PyYAML-6.0.1/tests/data/invalid-directive-name-1.loader-error <==
% # no name at all
---

==> PyYAML-6.0.1/tests/data/invalid-directive-name-2.loader-error <==
%invalid-characters:in-directive name
---

==> PyYAML-6.0.1/tests/data/invalid-escape-character.loader-error <==
"some escape characters are \ncorrect, but this one \?\nis not\n"

==> PyYAML-6.0.1/tests/data/invalid-escape-numbers.loader-error <==
"hm.... \u123?"
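The `*.loader-error` fixtures above are documents the loader must reject. All loader failures surface as subclasses of `yaml.YAMLError`, so a caller needs only one except clause; a sketch using an invalid escape like the one in `invalid-escape-character.loader-error`:

```python
import yaml

# "\?" is not a valid escape in a double-quoted scalar, so the
# scanner raises an error (a yaml.YAMLError subclass) rather
# than returning partial data.
failed = False
try:
    yaml.safe_load('"this one \\? is not a valid escape"')
except yaml.YAMLError:
    failed = True
assert failed
```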
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-indentation-indicator-1.loader-error0000644000175100001730000000003114455350511024631 0ustar00runnerdocker--- >0 # not valid data ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-indentation-indicator-2.loader-error0000644000175100001730000000001514455350511024634 0ustar00runnerdocker--- >-0 data ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-item-without-trailing-break.loader-error0000644000175100001730000000000414455350511025535 0ustar00runnerdocker- -0././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-merge-1.loader-error0000644000175100001730000000002114455350511021441 0ustar00runnerdockerfoo: bar <<: baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-merge-2.loader-error0000644000175100001730000000004314455350511021446 0ustar00runnerdockerfoo: bar <<: [x: 1, y: 2, z, t: 4] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-omap-1.loader-error0000644000175100001730000000003514455350511021303 0ustar00runnerdocker--- !!omap foo: bar baz: bat ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-omap-2.loader-error0000644000175100001730000000003414455350511021303 0ustar00runnerdocker--- !!omap - foo: bar - baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-omap-3.loader-error0000644000175100001730000000005414455350511021306 0ustar00runnerdocker--- !!omap - foo: bar - baz: bar bar: bar 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-pairs-1.loader-error0000644000175100001730000000003614455350511021466 0ustar00runnerdocker--- !!pairs foo: bar baz: bat ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-pairs-2.loader-error0000644000175100001730000000003514455350511021466 0ustar00runnerdocker--- !!pairs - foo: bar - baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-pairs-3.loader-error0000644000175100001730000000005514455350511021471 0ustar00runnerdocker--- !!pairs - foo: bar - baz: bar bar: bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-bytes-2-py3.loader-error0000644000175100001730000000007714455350511023534 0ustar00runnerdocker--- !!python/bytes двоичные данные в base64 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-bytes-py3.loader-error0000644000175100001730000000010514455350511023365 0ustar00runnerdocker--- !!python/bytes binary data encoded in base64 should be here. 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-module-kind.loader-error0000644000175100001730000000005514455350511023742 0ustar00runnerdocker--- !!python/module:sys { must, be, scalar } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-module-value.loader-error0000644000175100001730000000005214455350511024126 0ustar00runnerdocker--- !!python/module:sys "non-empty value" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-module.loader-error0000644000175100001730000000004314455350511023014 0ustar00runnerdocker--- !!python/module:no.such.module ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-name-kind.loader-error0000644000175100001730000000004114455350511023370 0ustar00runnerdocker--- !!python/name:sys.modules {} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-name-module.loader-error0000644000175100001730000000004314455350511023732 0ustar00runnerdocker--- !!python/name:sys.modules.keys ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-name-object.loader-error0000644000175100001730000000004014455350511023710 0ustar00runnerdocker--- !!python/name:os.path.rm_rf ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-python-name-value.loader-error0000644000175100001730000000004014455350511023556 0ustar00runnerdocker--- !!python/name:sys.modules 5 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 
PyYAML-6.0.1/tests/data/invalid-simple-key.loader-error0000644000175100001730000000006314455350511022271 0ustar00runnerdockerkey: value invalid simple key next key: next value ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-single-quote-bug.code0000644000175100001730000000003414455350511021714 0ustar00runnerdocker["foo 'bar'", "foo\n'bar'"] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-single-quote-bug.data0000644000175100001730000000003514455350511021714 0ustar00runnerdocker- "foo 'bar'" - "foo\n'bar'" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-starting-character.loader-error0000644000175100001730000000002414455350511023774 0ustar00runnerdocker@@@@@@@@@@@@@@@@@@@ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-1.loader-error0000644000175100001730000000002114455350511021115 0ustar00runnerdocker- ! baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-2.loader-error0000644000175100001730000000002614455350511021123 0ustar00runnerdocker- !prefix!foo#bar baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-directive-handle.loader-error0000644000175100001730000000002114455350511024164 0ustar00runnerdocker%TAG !!! !!! --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-directive-prefix.loader-error0000644000175100001730000000010114455350511024225 0ustar00runnerdocker%TAG ! 
tag:zz.com/foo#bar # '#' is not allowed in URLs --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-handle-1.emitter-error0000644000175100001730000000016314455350511022560 0ustar00runnerdocker- !StreamStart - !DocumentStart { tags: { '!foo': 'bar' } } - !Scalar { value: 'foo' } - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-handle-1.loader-error0000644000175100001730000000002414455350511022351 0ustar00runnerdocker%TAG foo bar --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-handle-2.emitter-error0000644000175100001730000000016214455350511022560 0ustar00runnerdocker- !StreamStart - !DocumentStart { tags: { '!!!': 'bar' } } - !Scalar { value: 'foo' } - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-tag-handle-2.loader-error0000644000175100001730000000003014455350511022347 0ustar00runnerdocker%TAG !foo bar --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-uri-escapes-1.loader-error0000644000175100001730000000002414455350511022565 0ustar00runnerdocker--- ! foo ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-uri-escapes-2.loader-error0000644000175100001730000000001714455350511022570 0ustar00runnerdocker--- !<%FF> foo ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-uri-escapes-3.loader-error0000644000175100001730000000004114455350511022566 0ustar00runnerdocker--- ! 
baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-uri.loader-error0000644000175100001730000000002014455350511021002 0ustar00runnerdocker--- !foo! bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-utf8-byte.loader-error0000644000175100001730000001013514455350511022042 0ustar00runnerdocker############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### 
############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### 
############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### Invalid byte ('\xFF'): <-- ############################################################### ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-utf8-byte.stream-error0000644000175100001730000001013514455350511022067 0ustar00runnerdocker############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### 
############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### 
############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### Invalid byte ('\xFF'): <-- ############################################################### ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-1.loader-error0000644000175100001730000000003714455350511024752 0ustar00runnerdocker# No version at all. %YAML --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-2.loader-error0000644000175100001730000000002114455350511024744 0ustar00runnerdocker%YAML 1e-5 --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-3.loader-error0000644000175100001730000000001514455350511024750 0ustar00runnerdocker%YAML 1. 
--- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-4.loader-error0000644000175100001730000000002414455350511024751 0ustar00runnerdocker%YAML 1.132.435 --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-5.loader-error0000644000175100001730000000001614455350511024753 0ustar00runnerdocker%YAML A.0 --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-directive-version-6.loader-error0000644000175100001730000000002014455350511024747 0ustar00runnerdocker%YAML 123.C --- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/invalid-yaml-version.loader-error0000644000175100001730000000002414455350511022634 0ustar00runnerdocker%YAML 2.0 --- foo ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/latin.unicode0000644000175100001730000016510014455350511016732 0ustar00runnerdockerABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ 
єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ 
ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ 
ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ 
ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ 
ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ ҞҟҠҡҢңҤҥҦҧҨҩҪҫҬҭҮүҰұҲҳҴҵҶҷҸҹҺһҼҽҾҿӀӁӂӃӄӅӆӇӈӉӊӋӌӍӎӐӑӒӓӔӕӖӗӘәӚӛӜӝӞӟӠ ӡӢӣӤӥӦӧӨөӪӫӬӭӮӯӰӱӲӳӴӵӶӷӸӹԀԁԂԃԄԅԆԇԈԉԊԋԌԍԎԏԱԲԳԴԵԶԷԸԹԺԻԼԽԾԿՀՁՂՃՄՅՆՇՈՉ ՊՋՌՍՎՏՐՑՒՓՔՕՖաբգդեզէըթժիլխծկհձղճմյնշոչպջռսվտրցւփքօֆևႠႡႢႣႤႥႦႧႨႩႪႫႬႭ ႮႯႰႱႲႳႴႵႶႷႸႹႺႻႼႽႾႿჀჁჂჃჄჅᴀᴁᴂᴃᴄᴅᴆᴇᴈᴉᴊᴋᴌᴍᴎᴏᴐᴑᴒᴓᴔᴕᴖᴗᴘᴙᴚᴛᴜᴝᴞᴟᴠᴡᴢᴣᴤᴥᴦᴧᴨᴩ ᴪᴫᵢᵣᵤᵥᵦᵧᵨᵩᵪᵫᵬᵭᵮᵯᵰᵱᵲᵳᵴᵵᵶᵷᵹᵺᵻᵼᵽᵾᵿᶀᶁᶂᶃᶄᶅᶆᶇᶈᶉᶊᶋᶌᶍᶎᶏᶐᶑᶒᶓᶔᶕᶖᶗᶘᶙᶚḀḁḂḃḄḅḆḇ ḈḉḊḋḌḍḎḏḐḑḒḓḔḕḖḗḘḙḚḛḜḝḞḟḠḡḢḣḤḥḦḧḨḩḪḫḬḭḮḯḰḱḲḳḴḵḶḷḸḹḺḻḼḽḾḿṀṁṂṃṄṅṆṇṈṉ ṊṋṌṍṎṏṐṑṒṓṔṕṖṗṘṙṚṛṜṝṞṟṠṡṢṣṤṥṦṧṨṩṪṫṬṭṮṯṰṱṲṳṴṵṶṷṸṹṺṻṼṽṾṿẀẁẂẃẄẅẆẇẈẉẊẋ ẌẍẎẏẐẑẒẓẔẕẖẗẘẙẚẛẠạẢảẤấẦầẨẩẪẫẬậẮắẰằẲẳẴẵẶặẸẹẺẻẼẽẾếỀềỂểỄễỆệỈỉỊịỌọỎỏỐố ỒồỔổỖỗỘộỚớỜờỞởỠỡỢợỤụỦủỨứỪừỬửỮữỰựỲỳỴỵỶỷỸỹἀἁἂἃἄἅἆἇἈἉἊἋἌἍἎἏἐἑἒἓἔἕἘἙἚἛ ἜἝἠἡἢἣἤἥἦἧἨἩἪἫἬἭἮἯἰἱἲἳἴἵἶἷἸἹἺἻἼἽἾἿὀὁὂὃὄὅὈὉὊὋὌὍὐὑὒὓὔὕὖὗὙὛὝὟὠὡὢὣὤὥὦὧ ὨὩὪὫὬὭὮὯὰάὲέὴήὶίὸόὺύὼώᾀᾁᾂᾃᾄᾅᾆᾇᾐᾑᾒᾓᾔᾕᾖᾗᾠᾡᾢᾣᾤᾥᾦᾧᾰᾱᾲᾳᾴᾶᾷᾸᾹᾺΆιῂῃῄῆῇῈΈῊ ΉῐῑῒΐῖῗῘῙῚΊῠῡῢΰῤῥῦῧῨῩῪΎῬῲῳῴῶῷῸΌῺΏⁱⁿℂℇℊℋℌℍℎℏℐℑℒℓℕℙℚℛℜℝℤΩℨKÅℬℭℯℰℱℳℴℹ ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzªµºÀÁÂÃÄÅÆÇÈÉÊ ËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿĀāĂ㥹ĆćĈĉĊċČčĎ ďĐđĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħĨĩĪīĬĭĮįİıIJijĴĵĶķĸĹĺĻļĽľĿŀŁłŃńŅņŇňʼnŊŋŌōŎŏŐ őŒœŔŕŖŗŘřŚśŜŝŞşŠšŢţŤťŦŧŨũŪūŬŭŮůŰűŲųŴŵŶŷŸŹźŻżŽžſƀƁƂƃƄƅƆƇƈƉƊƋƌƍƎƏƐƑƒ ƓƔƕƖƗƘƙƚƛƜƝƞƟƠơƢƣƤƥƦƧƨƩƪƫƬƭƮƯưƱƲƳƴƵƶƷƸƹƺƼƽƾƿDŽdžLJljNJnjǍǎǏǐǑǒǓǔǕǖǗǘǙǚǛǜ ǝǞǟǠǡǢǣǤǥǦǧǨǩǪǫǬǭǮǯǰDZdzǴǵǶǷǸǹǺǻǼǽǾǿȀȁȂȃȄȅȆȇȈȉȊȋȌȍȎȏȐȑȒȓȔȕȖȗȘșȚțȜȝȞȟ ȠȡȢȣȤȥȦȧȨȩȪȫȬȭȮȯȰȱȲȳȴȵȶȷȸȹȺȻȼȽȾȿɀɁɐɑɒɓɔɕɖɗɘəɚɛɜɝɞɟɠɡɢɣɤɥɦɧɨɩɪɫɬɭɮɯ ɰɱɲɳɴɵɶɷɸɹɺɻɼɽɾɿʀʁʂʃʄʅʆʇʈʉʊʋʌʍʎʏʐʑʒʓʔʕʖʗʘʙʚʛʜʝʞʟʠʡʢʣʤʥʦʧʨʩʪʫʬʭʮʯΆΈ ΉΊΌΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύ ώϐϑϒϓϔϕϖϗϘϙϚϛϜϝϞϟϠϡϢϣϤϥϦϧϨϩϪϫϬϭϮϯϰϱϲϳϴϵϷϸϹϺϻϼϽϾϿЀЁЂЃЄЅІЇЈЉЊЋЌЍЎЏАБ ВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюяѐёђѓ єѕіїјљњћќѝўџѠѡѢѣѤѥѦѧѨѩѪѫѬѭѮѯѰѱѲѳѴѵѶѷѸѹѺѻѼѽѾѿҀҁҊҋҌҍҎҏҐґҒғҔҕҖҗҘҙҚқҜҝ 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/mapping.sort0000644000175100001730000000003614455350511016613 0ustar00runnerdockerz: 1 a: 2 y: 3 b: 4 x: 5 c: 6 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 
mtime=1689637193.0 PyYAML-6.0.1/tests/data/mapping.sorted0000644000175100001730000000003614455350511017124 0ustar00runnerdockera: 2 b: 4 c: 6 x: 5 y: 3 z: 1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/mappings.events0000644000175100001730000000267014455350511017321 0ustar00runnerdocker- !StreamStart - !DocumentStart - !MappingStart - !Scalar { implicit: [true,true], value: 'key' } - !Scalar { implicit: [true,true], value: 'value' } - !Scalar { implicit: [true,true], value: 'empty mapping' } - !MappingStart - !MappingEnd - !Scalar { implicit: [true,true], value: 'empty mapping with tag' } - !MappingStart { tag: '!mytag', implicit: false } - !MappingEnd - !Scalar { implicit: [true,true], value: 'block mapping' } - !MappingStart - !MappingStart - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !MappingEnd - !MappingStart - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !MappingEnd - !MappingEnd - !Scalar { implicit: [true,true], value: 'flow mapping' } - !MappingStart { flow_style: true } - !Scalar { implicit: [true,true], value: 'key' } - !Scalar { implicit: [true,true], value: 'value' } - !MappingStart - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !MappingEnd - !MappingStart - !Scalar { implicit: [true,true], value: 'complex' } - !Scalar { implicit: [true,true], value: 'key' } - !MappingEnd - !MappingEnd - !MappingEnd - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/merge.data0000644000175100001730000000000514455350511016175 
0ustar00runnerdocker- << ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/merge.detect0000644000175100001730000000003014455350511016532 0ustar00runnerdockertag:yaml.org,2002:merge ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/more-floats.code0000644000175100001730000000020114455350511017325 0ustar00runnerdocker[0.0, +1.0, -1.0, +1e300000, -1e300000, 1e300000/1e300000, -(1e300000/1e300000)] # last two items are ind and qnan respectively. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/more-floats.data0000644000175100001730000000005414455350511017332 0ustar00runnerdocker[0.0, +1.0, -1.0, +.inf, -.inf, .nan, .nan] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/multi-constructor.code0000644000175100001730000000010414455350511020614 0ustar00runnerdocker[ {'Tag1': ['a', 1, 'b', 2]}, {'Tag2': ['a', 1, 'b', 2]}, ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/multi-constructor.multi0000644000175100001730000000005714455350511021043 0ustar00runnerdocker--- - !Tag1 [a, 1, b, 2] - !!Tag2 [a, 1, b, 2] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/myfullloader.subclass_blacklist0000644000175100001730000000014314455350511022536 0ustar00runnerdocker- !!python/object/new:yaml.MappingNode args: state: mymethod: test wrong_method: test2 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/negative-float-bug.code0000644000175100001730000000000514455350511020557 0ustar00runnerdocker-1.0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 
PyYAML-6.0.1/tests/data/negative-float-bug.data0000644000175100001730000000000514455350511020556 0ustar00runnerdocker-1.0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-alias-anchor.emitter-error0000644000175100001730000000021614455350511021744 0ustar00runnerdocker- !StreamStart - !DocumentStart - !SequenceStart - !Scalar { anchor: A, value: data } - !Alias { } - !SequenceEnd - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-alias-anchor.skip-ext0000644000175100001730000000000014455350511020677 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-block-collection-end.loader-error0000644000175100001730000000002514455350511023165 0ustar00runnerdocker- foo - bar baz: bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-block-mapping-end-2.loader-error0000644000175100001730000000002214455350511022621 0ustar00runnerdocker? 
foo : bar : baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-block-mapping-end.loader-error0000644000175100001730000000002114455350511022461 0ustar00runnerdockerfoo: "bar" "baz" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-document-start.loader-error0000644000175100001730000000003614455350511022151 0ustar00runnerdocker%YAML 1.1 # no --- foo: bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-flow-mapping-end.loader-error0000644000175100001730000000001514455350511022341 0ustar00runnerdocker{ foo: bar ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-flow-sequence-end.loader-error0000644000175100001730000000001314455350511022514 0ustar00runnerdocker[foo, bar} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-node-1.loader-error0000644000175100001730000000001114455350511020254 0ustar00runnerdocker- !foo ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-node-2.loader-error0000644000175100001730000000001514455350511020261 0ustar00runnerdocker- [ !foo } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/no-tag.emitter-error0000644000175100001730000000016014455350511020154 0ustar00runnerdocker- !StreamStart - !DocumentStart - !Scalar { value: 'foo', implicit: [false,false] } - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/null.data0000644000175100001730000000001514455350511016051 0ustar00runnerdocker- - ~ - null 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/null.detect0000644000175100001730000000002714455350511016413 0ustar00runnerdockertag:yaml.org,2002:null ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/odd-utf16.stream-error0000644000175100001730000000243714455350511020333 0ustar00runnerdocker############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### ############################################################### This file contains odd number of bytes, so it cannot be a valid UTF-16 stream. 
###############################################################././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/overwrite-state-new-constructor.loader-error0000644000175100001730000000013414455350511025103 0ustar00runnerdocker- !!python/object/new:yaml.MappingNode args: state: extend: test __test__: test ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-anchor.former-loader-error0000644000175100001730000000004214455350511023332 0ustar00runnerdocker- &foo [1 2, 3, *foo] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-dict.recursive0000644000175100001730000000011214455350511021123 0ustar00runnerdockervalue = {} instance = AnInstance(value, value) value[instance] = instance ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-list.recursive0000644000175100001730000000003714455350511021161 0ustar00runnerdockervalue = [] value.append(value) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-set.recursive0000644000175100001730000000024514455350511021002 0ustar00runnerdockertry: set except NameError: from sets import Set as set value = set() value.add(AnInstance(foo=value, bar=value)) value.add(AnInstance(foo=value, bar=value)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-state.recursive0000644000175100001730000000007314455350511021326 0ustar00runnerdockervalue = [] value.append(AnInstanceWithState(value, value)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive-tuple.recursive0000644000175100001730000000010214455350511021330 
0ustar00runnerdockervalue = ([], []) value[0].append(value) value[1].append(value[0]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/recursive.former-dumper-error0000644000175100001730000000004714455350511022115 0ustar00runnerdockerdata = [] data.append(data) dump(data) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/remove-possible-simple-key-bug.loader-error0000644000175100001730000000016414455350511024533 0ustar00runnerdockerfoo: &A bar *A ] # The ']' indicator triggers remove_possible_simple_key, # which should raise an error. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/resolver.data0000644000175100001730000000117114455350511016744 0ustar00runnerdocker--- "this scalar should be selected" --- key11: !foo key12: is: [selected] key22: key13: [not, selected] key23: [not, selected] key32: key31: [not, selected] key32: [not, selected] key33: {not: selected} key21: !bar - not selected - selected - not selected key31: !baz key12: key13: key14: {selected} key23: key14: [not, selected] key33: key14: {selected} key24: {not: selected} key22: - key14: {selected} key24: {not: selected} - key14: {selected} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/resolver.path0000644000175100001730000000143014455350511016765 0ustar00runnerdocker--- !root/scalar "this scalar should be selected" --- !root key11: !foo key12: !root/key11/key12/* is: [selected] key22: key13: [not, selected] key23: [not, selected] key32: key31: [not, selected] key32: [not, selected] key33: {not: selected} key21: !bar - not selected - !root/key21/1/* selected - not selected key31: !baz key12: key13: key14: !root/key31/*/*/key14/map {selected} key23: key14: [not, selected] key33: key14: !root/key31/*/*/key14/map {selected} key24: 
{not: selected} key22: - key14: !root/key31/*/*/key14/map {selected} key24: {not: selected} - key14: !root/key31/*/*/key14/map {selected} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/run-parser-crash-bug.data0000644000175100001730000000027514455350511021056 0ustar00runnerdocker--- - Harry Potter and the Prisoner of Azkaban - Harry Potter and the Goblet of Fire - Harry Potter and the Order of the Phoenix --- - Memoirs Found in a Bathtub - Snow Crash - Ghost World ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/scalars.events0000644000175100001730000000242214455350511017126 0ustar00runnerdocker- !StreamStart - !DocumentStart - !MappingStart - !Scalar { implicit: [true,true], value: 'empty scalar' } - !Scalar { implicit: [true,false], value: '' } - !Scalar { implicit: [true,true], value: 'implicit scalar' } - !Scalar { implicit: [true,true], value: 'data' } - !Scalar { implicit: [true,true], value: 'quoted scalar' } - !Scalar { value: 'data', style: '"' } - !Scalar { implicit: [true,true], value: 'block scalar' } - !Scalar { value: 'data', style: '|' } - !Scalar { implicit: [true,true], value: 'empty scalar with tag' } - !Scalar { implicit: [false,false], tag: '!mytag', value: '' } - !Scalar { implicit: [true,true], value: 'implicit scalar with tag' } - !Scalar { implicit: [false,false], tag: '!mytag', value: 'data' } - !Scalar { implicit: [true,true], value: 'quoted scalar with tag' } - !Scalar { value: 'data', style: '"', tag: '!mytag', implicit: [false,false] } - !Scalar { implicit: [true,true], value: 'block scalar with tag' } - !Scalar { value: 'data', style: '|', tag: '!mytag', implicit: [false,false] } - !Scalar { implicit: [true,true], value: 'single character' } - !Scalar { value: 'a', implicit: [true,true] } - !Scalar { implicit: [true,true], value: 'single digit' } - !Scalar { value: '1', implicit: [true,false] } - 
!MappingEnd - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/scan-document-end-bug.canonical0000644000175100001730000000003014455350511022171 0ustar00runnerdocker%YAML 1.1 --- !!null "" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/scan-document-end-bug.data0000644000175100001730000000002314455350511021155 0ustar00runnerdocker# Ticket #4 --- ...././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/scan-line-break-bug.canonical0000644000175100001730000000007014455350511021624 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "foo" : !!str "bar baz" } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/scan-line-break-bug.data0000644000175100001730000000003014455350511020602 0ustar00runnerdockerfoo: bar baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/sequences.events0000644000175100001730000000323614455350511017475 0ustar00runnerdocker- !StreamStart - !DocumentStart - !SequenceStart - !SequenceEnd - !DocumentEnd - !DocumentStart - !SequenceStart { tag: '!mytag', implicit: false } - !SequenceEnd - !DocumentEnd - !DocumentStart - !SequenceStart - !SequenceStart - !SequenceEnd - !SequenceStart { tag: '!mytag', implicit: false } - !SequenceEnd - !SequenceStart - !Scalar - !Scalar { value: 'data' } - !Scalar { tag: '!mytag', implicit: [false,false], value: 'data' } - !SequenceEnd - !SequenceStart - !SequenceStart - !SequenceStart - !Scalar - !SequenceEnd - !SequenceEnd - !SequenceEnd - !SequenceStart - !SequenceStart { tag: '!mytag', implicit: false } - !SequenceStart - !Scalar { value: 'data' } - !SequenceEnd - !SequenceEnd - !SequenceEnd - !SequenceEnd - !DocumentEnd - !DocumentStart - !SequenceStart - 
!MappingStart - !Scalar { value: 'key1' } - !SequenceStart - !Scalar { value: 'data1' } - !Scalar { value: 'data2' } - !SequenceEnd - !Scalar { value: 'key2' } - !SequenceStart { tag: '!mytag1', implicit: false } - !Scalar { value: 'data3' } - !SequenceStart - !Scalar { value: 'data4' } - !Scalar { value: 'data5' } - !SequenceEnd - !SequenceStart { tag: '!mytag2', implicit: false } - !Scalar { value: 'data6' } - !Scalar { value: 'data7' } - !SequenceEnd - !SequenceEnd - !MappingEnd - !SequenceEnd - !DocumentEnd - !DocumentStart - !SequenceStart - !SequenceStart { flow_style: true } - !SequenceStart - !SequenceEnd - !Scalar - !Scalar { value: 'data' } - !Scalar { tag: '!mytag', implicit: [false,false], value: 'data' } - !SequenceStart { tag: '!mytag', implicit: false } - !Scalar { value: 'data' } - !Scalar { value: 'data' } - !SequenceEnd - !SequenceEnd - !SequenceEnd - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/serializer-is-already-opened.dumper-error0000644000175100001730000000007514455350511024270 0ustar00runnerdockerdumper = yaml.Dumper(StringIO()) dumper.open() dumper.open() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/serializer-is-closed-1.dumper-error0000644000175100001730000000011414455350511023000 0ustar00runnerdockerdumper = yaml.Dumper(StringIO()) dumper.open() dumper.close() dumper.open() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/serializer-is-closed-2.dumper-error0000644000175100001730000000017114455350511023004 0ustar00runnerdockerdumper = yaml.Dumper(StringIO()) dumper.open() dumper.close() dumper.serialize(yaml.ScalarNode(tag='!foo', value='bar')) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 
PyYAML-6.0.1/tests/data/serializer-is-not-opened-1.dumper-error0000644000175100001730000000006014455350511023577 0ustar00runnerdockerdumper = yaml.Dumper(StringIO()) dumper.close() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/serializer-is-not-opened-2.dumper-error0000644000175100001730000000013414455350511023602 0ustar00runnerdockerdumper = yaml.Dumper(StringIO()) dumper.serialize(yaml.ScalarNode(tag='!foo', value='bar')) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/single-dot-is-not-float-bug.code0000644000175100001730000000000414455350511022230 0ustar00runnerdocker'.' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/single-dot-is-not-float-bug.data0000644000175100001730000000000214455350511022225 0ustar00runnerdocker. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/sloppy-indentation.canonical0000644000175100001730000000101314455350511021754 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "in the block context" : !!map { ? !!str "indentation should be kept" : !!map { ? !!str "but in the flow context" : !!seq [ !!str "it may be violated" ] } } } --- !!str "the parser does not require scalars to be indented with at least one space" --- !!str "the parser does not require scalars to be indented with at least one space" --- !!map { ? !!str "foo": { ? 
!!str "bar" : !!str "quoted scalars may not adhere indentation" } }

PyYAML-6.0.1/tests/data/sloppy-indentation.data:
--- in the block context: indentation should be kept: { but in the flow context: [ it may be violated] } --- the parser does not require scalars to be indented with at least one space ... --- "the parser does not require scalars to be indented with at least one space" --- foo: bar: 'quoted scalars may not adhere indentation'

PyYAML-6.0.1/tests/data/spec-02-01.data:
- Mark McGwire
- Sammy Sosa
- Ken Griffey

PyYAML-6.0.1/tests/data/spec-02-01.structure:
[True, True, True]

PyYAML-6.0.1/tests/data/spec-02-01.tokens:
[[ , _ , _ , _ ]}

PyYAML-6.0.1/tests/data/spec-02-02.data:
hr: 65 # Home runs
avg: 0.278 # Batting average
rbi: 147 # Runs Batted In

PyYAML-6.0.1/tests/data/spec-02-02.structure:
[(True, True), (True, True), (True, True)]
PyYAML-6.0.1/tests/data/spec-02-02.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-03.data:
american:
- Boston Red Sox
- Detroit Tigers
- New York Yankees
national:
- New York Mets
- Chicago Cubs
- Atlanta Braves

PyYAML-6.0.1/tests/data/spec-02-03.structure:
[(True, [True, True, True]), (True, [True, True, True])]

PyYAML-6.0.1/tests/data/spec-02-03.tokens:
{{ ? _ : [[ , _ , _ , _ ]} ? _ : [[ , _ , _ , _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-04.data:
- name: Mark McGwire
  hr: 65
  avg: 0.278
- name: Sammy Sosa
  hr: 63
  avg: 0.288

PyYAML-6.0.1/tests/data/spec-02-04.structure:
[ [(True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True)], ]

PyYAML-6.0.1/tests/data/spec-02-04.tokens:
[[ , {{ ? _ : _ ? _ : _ ? _ : _ ]} , {{ ? _ : _ ? _ : _ ?
_ : _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-05.data:
- [name , hr, avg ]
- [Mark McGwire, 65, 0.278]
- [Sammy Sosa , 63, 0.288]

PyYAML-6.0.1/tests/data/spec-02-05.structure:
[ [True, True, True], [True, True, True], [True, True, True], ]

PyYAML-6.0.1/tests/data/spec-02-05.tokens:
[[ , [ _ , _ , _ ] , [ _ , _ , _ ] , [ _ , _ , _ ] ]}

PyYAML-6.0.1/tests/data/spec-02-06.data:
Mark McGwire: {hr: 65, avg: 0.278}
Sammy Sosa: { hr: 63, avg: 0.288 }

PyYAML-6.0.1/tests/data/spec-02-06.structure:
[ (True, [(True, True), (True, True)]), (True, [(True, True), (True, True)]), ]

PyYAML-6.0.1/tests/data/spec-02-06.tokens:
{{ ? _ : { ? _ : _ , ? _ : _ } ? _ : { ? _ : _ , ?
_ : _ } ]}

PyYAML-6.0.1/tests/data/spec-02-07.data:
# Ranking of 1998 home runs
---
- Mark McGwire
- Sammy Sosa
- Ken Griffey
# Team ranking
---
- Chicago Cubs
- St Louis Cardinals

PyYAML-6.0.1/tests/data/spec-02-07.structure:
[ [True, True, True], [True, True], ]

PyYAML-6.0.1/tests/data/spec-02-07.tokens:
--- [[ , _ , _ , _ ]} --- [[ , _ , _ ]}

PyYAML-6.0.1/tests/data/spec-02-08.data:
---
time: 20:03:20
player: Sammy Sosa
action: strike (miss)
...
---
time: 20:03:47
player: Sammy Sosa
action: grand slam
...

PyYAML-6.0.1/tests/data/spec-02-08.structure:
[ [(True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True)], ]

PyYAML-6.0.1/tests/data/spec-02-08.tokens:
--- {{ ? _ : _ ? _ : _ ? _ : _ ]} ... --- {{ ? _ : _ ? _ : _ ? _ : _ ]} ...
PyYAML-6.0.1/tests/data/spec-02-09.data:
---
hr: # 1998 hr ranking
- Mark McGwire
- Sammy Sosa
rbi: # 1998 rbi ranking
- Sammy Sosa
- Ken Griffey

PyYAML-6.0.1/tests/data/spec-02-09.structure:
[(True, [True, True]), (True, [True, True])]

PyYAML-6.0.1/tests/data/spec-02-09.tokens:
--- {{ ? _ : [[ , _ , _ ]} ? _ : [[ , _ , _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-10.data:
---
hr:
- Mark McGwire
# Following node labeled SS
- &SS Sammy Sosa
rbi:
- *SS # Subsequent occurrence
- Ken Griffey

PyYAML-6.0.1/tests/data/spec-02-10.structure:
[(True, [True, True]), (True, ['*', True])]

PyYAML-6.0.1/tests/data/spec-02-10.tokens:
--- {{ ? _ : [[ , _ , & _ ]} ? _ : [[ , * , _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-11.data:
? - Detroit Tigers
  - Chicago cubs
: - 2001-07-23
?
  [ New York Yankees, Atlanta Braves ]
: [ 2001-07-02, 2001-08-12, 2001-08-14 ]

PyYAML-6.0.1/tests/data/spec-02-11.structure:
[ ([True, True], [True]), ([True, True], [True, True, True]), ]

PyYAML-6.0.1/tests/data/spec-02-11.tokens:
{{ ? [[ , _ , _ ]} : [[ , _ ]} ? [ _ , _ ] : [ _ , _ , _ ] ]}

PyYAML-6.0.1/tests/data/spec-02-12.data:
---
# products purchased
- item : Super Hoop
  quantity: 1
- item : Basketball
  quantity: 4
- item : Big Shoes
  quantity: 1

PyYAML-6.0.1/tests/data/spec-02-12.structure:
[ [(True, True), (True, True)], [(True, True), (True, True)], [(True, True), (True, True)], ]

PyYAML-6.0.1/tests/data/spec-02-12.tokens:
--- [[ , {{ ? _ : _ ? _ : _ ]} , {{ ? _ : _ ? _ : _ ]} , {{ ? _ : _ ?
_ : _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-13.data:
# ASCII Art
--- | \//||\/|| // || ||__

PyYAML-6.0.1/tests/data/spec-02-13.structure:
True

PyYAML-6.0.1/tests/data/spec-02-13.tokens:
--- _

PyYAML-6.0.1/tests/data/spec-02-14.data:
---
Mark McGwire's year was crippled by a knee injury.

PyYAML-6.0.1/tests/data/spec-02-14.structure:
True

PyYAML-6.0.1/tests/data/spec-02-14.tokens:
--- _

PyYAML-6.0.1/tests/data/spec-02-15.data:
> Sammy Sosa completed another fine season with great stats. 63 Home Runs 0.288 Batting Average What a year!
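The spec-02-13 through spec-02-16 fixtures cover block scalars. A quick sketch of the literal/folded distinction they test (the document text here is illustrative, not taken from the fixtures):

```python
import yaml

doc = """\
literal: |
  line one
  line two
folded: >
  line one
  line two
"""
data = yaml.safe_load(doc)
# Literal ('|') keeps the newlines; folded ('>') joins lines with spaces.
print(repr(data['literal']))  # → 'line one\nline two\n'
print(repr(data['folded']))   # → 'line one line two\n'
```

Both styles default to "clip" chomping, which is why each value above ends in exactly one newline.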
PyYAML-6.0.1/tests/data/spec-02-15.structure:
True

PyYAML-6.0.1/tests/data/spec-02-15.tokens:
_

PyYAML-6.0.1/tests/data/spec-02-16.data:
name: Mark McGwire
accomplishment: >
  Mark set a major league home run record in 1998.
stats: |
  65 Home Runs
  0.278 Batting Average

PyYAML-6.0.1/tests/data/spec-02-16.structure:
[(True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-16.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-17.data:
unicode: "Sosa did fine.\u263A"
control: "\b1998\t1999\t2000\n"
hexesc: "\x13\x10 is \r\n"
single: '"Howdy!" he cried.'
quoted: ' # not a ''comment''.'
tie-fighter: '|\-*-/|'

PyYAML-6.0.1/tests/data/spec-02-17.structure:
[(True, True), (True, True), (True, True), (True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-17.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-18.data:
plain: This unquoted scalar spans many lines.
quoted: "So does this quoted scalar.\n"

PyYAML-6.0.1/tests/data/spec-02-18.structure:
[(True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-18.tokens:
{{ ? _ : _ ?
_ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-19.data:
canonical: 12345
decimal: +12,345
sexagesimal: 3:25:45
octal: 014
hexadecimal: 0xC

PyYAML-6.0.1/tests/data/spec-02-19.structure:
[(True, True), (True, True), (True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-19.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-20.data:
canonical: 1.23015e+3
exponential: 12.3015e+02
sexagesimal: 20:30.15
fixed: 1,230.15
negative infinity: -.inf
not a number: .NaN

PyYAML-6.0.1/tests/data/spec-02-20.structure:
[(True, True), (True, True), (True, True), (True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-20.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ?
_ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-21.data:
null: ~
true: y
false: n
string: '12345'

PyYAML-6.0.1/tests/data/spec-02-21.structure:
[(True, True), (True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-21.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-22.data:
canonical: 2001-12-15T02:59:43.1Z
iso8601: 2001-12-14t21:59:43.10-05:00
spaced: 2001-12-14 21:59:43.10 -5
date: 2002-12-14

PyYAML-6.0.1/tests/data/spec-02-22.structure:
[(True, True), (True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-22.tokens:
{{ ? _ : _ ? _ : _ ? _ : _ ?
_ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-23.data:
---
not-date: !!str 2002-04-28
picture: !!binary |
  R0lGODlhDAAMAIQAAP//9/X
  17unp5WZmZgAAAOfn515eXv
  Pz7Y6OjuDg4J+fn5OTk6enp
  56enmleECcgggoBADs=
application specific tag: !something |
  The semantics of the tag above may be different for different documents.

PyYAML-6.0.1/tests/data/spec-02-23.structure:
[(True, True), (True, True), (True, True)]

PyYAML-6.0.1/tests/data/spec-02-23.tokens:
--- {{ ? _ : ! _ ? _ : ! _ ? _ : ! _ ]}

PyYAML-6.0.1/tests/data/spec-02-24.data:
%TAG ! tag:clarkevans.com,2002:
--- !shape
# Use the ! handle for presenting
# tag:clarkevans.com,2002:circle
- !circle
  center: &ORIGIN {x: 73, y: 129}
  radius: 7
- !line
  start: *ORIGIN
  finish: { x: 89, y: 102 }
- !label
  start: *ORIGIN
  color: 0xFFEEBB
  text: Pretty vector drawing.
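spec-02-24 relies on application-specific tags such as `!circle`. A hedged sketch of how such tags are typically wired up with `add_constructor` (mapping `!circle` onto a plain dict is an illustration, not what the test harness itself does):

```python
import yaml

# Construct '!circle' nodes as plain dicts. Note that registering on
# SafeLoader mutates the class, affecting every later safe load in
# this process.
def construct_circle(loader, node):
    return {'shape': 'circle', **loader.construct_mapping(node)}

yaml.SafeLoader.add_constructor('!circle', construct_circle)

data = yaml.load('!circle {x: 73, y: 129}', Loader=yaml.SafeLoader)
print(data)  # → {'shape': 'circle', 'x': 73, 'y': 129}
```

To keep the registration local, you can subclass `SafeLoader` and call `add_constructor` on the subclass instead.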
PyYAML-6.0.1/tests/data/spec-02-24.structure:
[ [(True, [(True, True), (True, True)]), (True, True)], [(True, '*'), (True, [(True, True), (True, True)])], [(True, '*'), (True, True), (True, True)], ]

PyYAML-6.0.1/tests/data/spec-02-24.tokens:
% --- ! [[ , ! {{ ? _ : & { ? _ : _ , ? _ : _ } ? _ : _ ]} , ! {{ ? _ : * ? _ : { ? _ : _ , ? _ : _ } ]} , ! {{ ? _ : * ? _ : _ ? _ : _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-25.data:
# sets are represented as a
# mapping where each key is
# associated with the empty string
--- !!set
? Mark McGwire
? Sammy Sosa
? Ken Griff

PyYAML-6.0.1/tests/data/spec-02-25.structure:
[(True, None), (True, None), (True, None)]

PyYAML-6.0.1/tests/data/spec-02-25.tokens:
--- ! {{ ? _ ? _ ?
_ ]}

PyYAML-6.0.1/tests/data/spec-02-26.data:
# ordered maps are represented as
# a sequence of mappings, with
# each mapping having one key
--- !!omap
- Mark McGwire: 65
- Sammy Sosa: 63
- Ken Griffy: 58

PyYAML-6.0.1/tests/data/spec-02-26.structure:
[ [(True, True)], [(True, True)], [(True, True)], ]

PyYAML-6.0.1/tests/data/spec-02-26.tokens:
--- ! [[ , {{ ? _ : _ ]} , {{ ? _ : _ ]} , {{ ? _ : _ ]} ]}

PyYAML-6.0.1/tests/data/spec-02-27.data:
--- !
invoice: 34843
date : 2001-01-23
bill-to: &id001
  given : Chris
  family : Dumars
  address:
    lines: |
      458 Walkman Dr.
      Suite #292
    city : Royal Oak
    state : MI
    postal : 48046
ship-to: *id001
product:
- sku : BL394D
  quantity : 4
  description : Basketball
  price : 450.00
- sku : BL4438H
  quantity : 1
  description : Super Hoop
  price : 2392.00
tax : 251.42
total: 4443.52
comments: Late afternoon is best. Backup contact is Nancy Billsmer @ 338-4338.
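The `&id001`/`*id001` pair in spec-02-27 is the behavior that the structure file encodes with `'*'`: an alias resolves to the very object its anchor produced, not to a copy. A minimal sketch:

```python
import yaml

doc = """\
bill-to: &id001
  given: Chris
  family: Dumars
ship-to: *id001
"""
data = yaml.safe_load(doc)
# The alias is not a copy: both keys reference one dict object.
print(data['bill-to'] is data['ship-to'])  # → True
```

This identity is also why mutating `data['bill-to']` after loading is visible through `data['ship-to']`.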
PyYAML-6.0.1/tests/data/spec-02-27.structure:
[ (True, True), (True, True), (True, [ (True, True), (True, True), (True, [(True, True), (True, True), (True, True), (True, True)]), ]), (True, '*'), (True, [ [(True, True), (True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True), (True, True)], ]), (True, True), (True, True), (True, True), ]

PyYAML-6.0.1/tests/data/spec-02-27.tokens:
--- ! {{ ? _ : _ ? _ : _ ? _ : & {{ ? _ : _ ? _ : _ ? _ : {{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]} ]} ? _ : * ? _ : [[ , {{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]} , {{ ? _ : _ ? _ : _ ? _ : _ ? _ : _ ]} ]} ? _ : _ ? _ : _ ? _ : _ ]}

PyYAML-6.0.1/tests/data/spec-02-28.data:
---
Time: 2001-11-23 15:01:42 -5
User: ed
Warning: This is an error message for the log file
---
Time: 2001-11-23 15:02:31 -5
User: ed
Warning: A slightly different error message.
---
Date: 2001-11-23 15:03:17 -5
User: ed
Fatal: Unknown variable "bar"
Stack:
- file: TopClass.py
  line: 23
  code: |
    x = MoreObject("345\n")
- file: MoreClass.py
  line: 58
  code: |-
    foo = bar

PyYAML-6.0.1/tests/data/spec-02-28.structure:
[ [(True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True), (True, [ [(True, True), (True, True), (True, True)], [(True, True), (True, True), (True, True)], ]), ] ]

PyYAML-6.0.1/tests/data/spec-02-28.tokens:
--- {{ ? _ : _ ? _ : _ ? _ : _ ]} --- {{ ? _ : _ ? _ : _ ? _ : _ ]} --- {{ ? _ : _ ? _ : _ ? _ : _ ? _ : [[ , {{ ? _ : _ ? _ : _ ? _ : _ ]} , {{ ? _ : _ ? _ : _ ? _ : _ ]} ]} ]}

PyYAML-6.0.1/tests/data/spec-05-01-utf16be.data:
# Comment only.

PyYAML-6.0.1/tests/data/spec-05-01-utf16be.empty:
# This stream contains no
# documents, only comments.

PyYAML-6.0.1/tests/data/spec-05-01-utf16le.data:
# Comment only.
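The `spec-05-01-*` fixtures feed the same comment-only stream to the reader in UTF-8, UTF-16-LE, and UTF-16-BE. PyYAML detects the encoding of a byte stream from its leading BOM, and a stream that contains no documents loads as `None`; a sketch under those two assumptions:

```python
import yaml

text = '# This stream contains no\n# documents, only comments.\n'
for codec in ('utf-8', 'utf-16-le', 'utf-16-be'):
    # Prepend an explicit BOM so the reader can identify the codec.
    payload = '\ufeff'.encode(codec) + text.encode(codec)
    assert yaml.safe_load(payload) is None
```

Without a BOM, byte streams are assumed to be UTF-8, which is why only the UTF-16 variants of these fixtures must carry one.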
PyYAML-6.0.1/tests/data/spec-05-01-utf16le.empty:
# This stream contains no
# documents, only comments.

PyYAML-6.0.1/tests/data/spec-05-01-utf8.data:
# Comment only.

PyYAML-6.0.1/tests/data/spec-05-01-utf8.empty:
# This stream contains no
# documents, only comments.

PyYAML-6.0.1/tests/data/spec-05-02-utf16be.data:
# Invalid use of BOM
# inside a
# document.

PyYAML-6.0.1/tests/data/spec-05-02-utf16be.error:
ERROR: A BOM must not appear inside a document.

PyYAML-6.0.1/tests/data/spec-05-02-utf16le.data:
# Invalid use of BOM
# inside a
# document.

PyYAML-6.0.1/tests/data/spec-05-02-utf16le.error:
ERROR: A BOM must not appear inside a document.
PyYAML-6.0.1/tests/data/spec-05-02-utf8.data:
# Invalid use of BOM
# inside a
# document.

PyYAML-6.0.1/tests/data/spec-05-02-utf8.error:
ERROR: A BOM must not appear inside a document.

PyYAML-6.0.1/tests/data/spec-05-03.canonical:
%YAML 1.1
--- !!map { ? !!str "sequence" : !!seq [ !!str "one", !!str "two" ], ? !!str "mapping" : !!map { ? !!str "sky" : !!str "blue",
# ? !!str "sea" : !!str "green",
? !!map { ? !!str "sea" : !!str "green" } : !!null "", } }

PyYAML-6.0.1/tests/data/spec-05-03.data:
sequence:
- one
- two
mapping:
  ? sky
  : blue
  ? sea
  : green

PyYAML-6.0.1/tests/data/spec-05-04.canonical:
%YAML 1.1
--- !!map { ? !!str "sequence" : !!seq [ !!str "one", !!str "two" ], ? !!str "mapping" : !!map { ? !!str "sky" : !!str "blue", ?
!!str "sea" : !!str "green", } }

PyYAML-6.0.1/tests/data/spec-05-04.data:
sequence: [ one, two, ]
mapping: { sky: blue, sea: green }

PyYAML-6.0.1/tests/data/spec-05-05.data:
# Comment only.

PyYAML-6.0.1/tests/data/spec-05-05.empty:
# This stream contains no
# documents, only comments.

PyYAML-6.0.1/tests/data/spec-05-06.canonical:
%YAML 1.1
--- !!map { ? !!str "anchored" : &A1 !local "value", ? !!str "alias" : *A1, }

PyYAML-6.0.1/tests/data/spec-05-06.data:
anchored: !local &anchor value
alias: *anchor

PyYAML-6.0.1/tests/data/spec-05-07.canonical:
%YAML 1.1
--- !!map { ? !!str "literal" : !!str "text\n", ?
!!str "folded" : !!str "text\n", }

PyYAML-6.0.1/tests/data/spec-05-07.data:
literal: |
  text
folded: >
  text

PyYAML-6.0.1/tests/data/spec-05-08.canonical:
%YAML 1.1
--- !!map { ? !!str "single" : !!str "text", ? !!str "double" : !!str "text", }

PyYAML-6.0.1/tests/data/spec-05-08.data:
single: 'text'
double: "text"

PyYAML-6.0.1/tests/data/spec-05-09.canonical:
%YAML 1.1
--- !!str "text"

PyYAML-6.0.1/tests/data/spec-05-09.data:
%YAML 1.1
--- text

PyYAML-6.0.1/tests/data/spec-05-10.data:
commercial-at: @text
grave-accent: `text

PyYAML-6.0.1/tests/data/spec-05-10.error:
ERROR: Reserved indicators can't start a plain scalar.
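spec-05-10 pins down the reserved indicators: `@` and `` ` `` may not start a plain scalar. A sketch of the failure mode (the concrete exception class raised here is an implementation detail; catching `yaml.YAMLError` is the stable contract):

```python
import yaml

try:
    yaml.safe_load('commercial-at: @text')
except yaml.YAMLError as exc:
    # The scanner rejects '@' as the start of a plain scalar.
    print(type(exc).__name__)
```

Quoting the value (`commercial-at: '@text'`) makes the same document parse cleanly.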
==> PyYAML-6.0.1/tests/data/spec-05-11.canonical <==
%YAML 1.1
--- !!str
"Generic line break (no glyph)\n\
 Generic line break (glyphed)\n\
 Line separator\u2028\
 Paragraph separator\u2029"

==> PyYAML-6.0.1/tests/data/spec-05-11.data <==
|
  Generic line break (no glyph)
  Generic line break (glyphed)…
  Line separator
  Paragraph separator
==> PyYAML-6.0.1/tests/data/spec-05-12.data <==
# Tabs do's and don'ts:
# comment:
quoted: "Quoted 	"
block:	|
  void main() {
  	printf("Hello, world!\n");
  }
elsewhere:	# separation
	indentation, in plain scalar

==> PyYAML-6.0.1/tests/data/spec-05-12.error <==
ERROR:
Tabs may appear inside comments and quoted or block scalar content.
Tabs must not appear elsewhere, such as in indentation and separation spaces.

==> PyYAML-6.0.1/tests/data/spec-05-13.canonical <==
%YAML 1.1
---
!!str "Text containing \
both space and \
tab characters"

==> PyYAML-6.0.1/tests/data/spec-05-13.data <==
 "Text containing
  both space and
  tab characters"

==> PyYAML-6.0.1/tests/data/spec-05-14.canonical <==
%YAML 1.1
---
"Fun with \x5C
\x22 \x07 \x08 \x1B \x0C
\x0A \x0D \x09 \x0B \x00
\x20 \xA0 \x85 \u2028 \u2029
A A A"

==> PyYAML-6.0.1/tests/data/spec-05-14.data <==
"Fun with \\ \" \a \b \e \f \… \n \r \t \v \0 \
 \ \_ \N \L \P \
 \x41 \u0041 \U00000041"

==> PyYAML-6.0.1/tests/data/spec-05-15.data <==
Bad escapes:
  "\c
  \xq-"

==> PyYAML-6.0.1/tests/data/spec-05-15.error <==
ERROR:
- c is an invalid escaped character.
- q and - are invalid hex digits.

==> PyYAML-6.0.1/tests/data/spec-06-01.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "Not indented"
  : !!map {
      ? !!str "By one space"
      : !!str "By four\n  spaces\n",
      ? !!str "Flow style"
      : !!seq [
          !!str "By two",
          !!str "Also by two",
          !!str "Still by two",
        ]
    }
}

==> PyYAML-6.0.1/tests/data/spec-06-01.data <==
# Leading comment line spaces are
# neither content nor indentation.
Not indented:
 By one space: |
    By four
      spaces
 Flow style: [    # Leading spaces
   By two,        # in flow style
  Also by two,    # are neither
# Tabs are not allowed:
#  Still by two   # content nor
  Still by two    # content nor
 ]                # indentation.

==> PyYAML-6.0.1/tests/data/spec-06-02.data <==
  # Comment

==> PyYAML-6.0.1/tests/data/spec-06-02.empty <==
# This stream contains no
# documents, only comments.
==> PyYAML-6.0.1/tests/data/spec-06-03.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "key"
  : !!str "value"
}

==> PyYAML-6.0.1/tests/data/spec-06-03.data <==
key:    # Comment
  value

==> PyYAML-6.0.1/tests/data/spec-06-04.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "key"
  : !!str "value"
}

==> PyYAML-6.0.1/tests/data/spec-06-04.data <==
key:    # Comment
        # lines
  value

==> PyYAML-6.0.1/tests/data/spec-06-05.canonical <==
%YAML 1.1
---
!!map {
  ? !!map {
      ? !!str "first"
      : !!str "Sammy",
      ? !!str "last"
      : !!str "Sosa"
    }
  : !!map {
      ? !!str "hr"
      : !!int "65",
      ? !!str "avg"
      : !!float "0.278"
    }
}

==> PyYAML-6.0.1/tests/data/spec-06-05.data <==
{ first: Sammy, last: Sosa }:
# Statistics:
  hr:  # Home runs
    65
  avg: # Average
    0.278

==> PyYAML-6.0.1/tests/data/spec-06-06.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "plain"
  : !!str "text lines",
  ? !!str "quoted"
  : !!str "text lines",
  ? !!str "block"
  : !!str "text\n lines\n"
}

==> PyYAML-6.0.1/tests/data/spec-06-06.data <==
plain: text
  lines
quoted: "text
  lines"
block: |
  text
   lines

==> PyYAML-6.0.1/tests/data/spec-06-07.canonical <==
%YAML 1.1
---
!!seq [
  !!str "foo\nbar",
  !!str "foo\n\nbar"
]

==> PyYAML-6.0.1/tests/data/spec-06-07.data <==
- foo

  bar
- |-
  foo

  bar

==> PyYAML-6.0.1/tests/data/spec-06-08.canonical <==
%YAML 1.1
--- !!str
"specific\L\
trimmed\n\n\n\
as space"

==> PyYAML-6.0.1/tests/data/spec-06-08.data <==
>-
  specific
  trimmed… … …… as… space

==> PyYAML-6.0.1/tests/data/spec-07-01.canonical <==
%YAML 1.1
---
!!str "foo"

==> PyYAML-6.0.1/tests/data/spec-07-01.data <==
%FOO  bar baz # Should be ignored
              # with a warning.
--- "foo"

==> PyYAML-6.0.1/tests/data/spec-07-01.skip-ext <==

==> PyYAML-6.0.1/tests/data/spec-07-02.canonical <==
%YAML 1.1
---
!!str "foo"

==> PyYAML-6.0.1/tests/data/spec-07-02.data <==
%YAML 1.2 # Attempt parsing
          # with a warning
--- "foo"

==> PyYAML-6.0.1/tests/data/spec-07-02.skip-ext <==

==> PyYAML-6.0.1/tests/data/spec-07-03.data <==
%YAML 1.1
%YAML 1.1
foo

==> PyYAML-6.0.1/tests/data/spec-07-03.error <==
ERROR:
The YAML directive must only be given at most once per document.
==> PyYAML-6.0.1/tests/data/spec-07-04.canonical <==
%YAML 1.1
---
!!str "foo"

==> PyYAML-6.0.1/tests/data/spec-07-04.data <==
%TAG !yaml! tag:yaml.org,2002:
---
!yaml!str "foo"

==> PyYAML-6.0.1/tests/data/spec-07-05.data <==
%TAG ! !foo
%TAG ! !foo bar

==> PyYAML-6.0.1/tests/data/spec-07-05.error <==
ERROR:
The TAG directive must only be given at most once per handle in the same document.

==> PyYAML-6.0.1/tests/data/spec-07-06.canonical <==
%YAML 1.1
---
!!seq [
  !<!foobar> "baz",
  !<tag:yaml.org,2002:str> "string"
]

==> PyYAML-6.0.1/tests/data/spec-07-06.data <==
%TAG ! !foo
%TAG !yaml! tag:yaml.org,2002:
---
- !bar "baz"
- !yaml!str "string"

==> PyYAML-6.0.1/tests/data/spec-07-07a.canonical <==
%YAML 1.1
---
!<!foo> "bar"

==> PyYAML-6.0.1/tests/data/spec-07-07a.data <==
# Private application:
!foo "bar"

==> PyYAML-6.0.1/tests/data/spec-07-07b.canonical <==
%YAML 1.1
---
!<tag:ben-kiki.org,2000:app/foo> "bar"

==> PyYAML-6.0.1/tests/data/spec-07-07b.data <==
# Migrated to global:
%TAG ! tag:ben-kiki.org,2000:app/
--- !foo "bar"

==> PyYAML-6.0.1/tests/data/spec-07-08.canonical <==
%YAML 1.1
---
!!seq [
  !<!foo> "bar",
  !<tag:yaml.org,2002:str> "string",
  !<tag:ben-kiki.org,2000:type> "baz"
]

==> PyYAML-6.0.1/tests/data/spec-07-08.data <==
# Explicitly specify default settings:
%TAG ! !
%TAG !! tag:yaml.org,2002:
# Named handles have no default:
%TAG !o! tag:ben-kiki.org,2000:
---
- !foo "bar"
- !!str "string"
- !o!type "baz"

==> PyYAML-6.0.1/tests/data/spec-07-09.canonical <==
%YAML 1.1
---
!!str "foo"
%YAML 1.1
---
!!str "bar"
%YAML 1.1
---
!!str "baz"

==> PyYAML-6.0.1/tests/data/spec-07-09.data <==
--- foo
...
# Repeated end marker.
...
--- bar
# No end marker.
--- baz
...
==> PyYAML-6.0.1/tests/data/spec-07-10.canonical <==
%YAML 1.1
---
!!str "Root flow scalar"
%YAML 1.1
---
!!str "Root block scalar\n"
%YAML 1.1
---
!!map {
  ? !!str "foo"
  : !!str "bar"
}
---
#!!str ""
!!null ""

==> PyYAML-6.0.1/tests/data/spec-07-10.data <==
"Root flow scalar"
--- !!str >
 Root block scalar
---
# Root collection:
foo : bar
... # Is optional.
---
# Explicit document may be empty.

==> PyYAML-6.0.1/tests/data/spec-07-11.data <==
# A stream may contain
# no documents.

==> PyYAML-6.0.1/tests/data/spec-07-11.empty <==
# This stream contains no
# documents, only comments.

==> PyYAML-6.0.1/tests/data/spec-07-12a.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "foo"
  : !!str "bar"
}

==> PyYAML-6.0.1/tests/data/spec-07-12a.data <==
# Implicit document. Root
# collection (mapping) node.
foo : bar

==> PyYAML-6.0.1/tests/data/spec-07-12b.canonical <==
%YAML 1.1
---
!!str "Text content\n"

==> PyYAML-6.0.1/tests/data/spec-07-12b.data <==
# Explicit document. Root
# scalar (literal) node.
--- |
 Text content

==> PyYAML-6.0.1/tests/data/spec-07-13.canonical <==
%YAML 1.1
---
!!str "First document"
---
!<!foo> "No directives"
---
!<!foobar> "With directives"
---
!<!baz> "Reset settings"

==> PyYAML-6.0.1/tests/data/spec-07-13.data <==
! "First document"
---
!foo "No directives"
%TAG ! !foo
---
!bar "With directives"
%YAML 1.1
---
!baz "Reset settings"

==> PyYAML-6.0.1/tests/data/spec-08-01.canonical <==
%YAML 1.1
---
!!map {
  ? &A1 !!str "foo"
  : !!str "bar",
  ? &A2 !!str "baz"
  : *A1
}

==> PyYAML-6.0.1/tests/data/spec-08-01.data <==
!!str &a1 "foo" : !!str bar
&a2 baz : *a1

==> PyYAML-6.0.1/tests/data/spec-08-02.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "First occurrence"
  : &A !!str "Value",
  ? !!str "Second occurrence"
  : *A
}

==> PyYAML-6.0.1/tests/data/spec-08-02.data <==
First occurrence: &anchor Value
Second occurrence: *anchor

==> PyYAML-6.0.1/tests/data/spec-08-03.canonical <==
%YAML 1.1
---
!!map {
  ? !<tag:yaml.org,2002:str> "foo"
  : !<!bar> "baz"
}

==> PyYAML-6.0.1/tests/data/spec-08-03.data <==
!<tag:yaml.org,2002:str> foo :
  !<!bar> baz

==> PyYAML-6.0.1/tests/data/spec-08-04.data <==
- !<!> foo
- !<$:?> bar

==> PyYAML-6.0.1/tests/data/spec-08-04.error <==
ERROR:
- Verbatim tags aren't resolved,
  so !<!> is invalid.
- The $:? tag is neither a global
  URI tag nor a local tag starting
  with “!”.

==> PyYAML-6.0.1/tests/data/spec-08-05.canonical <==
%YAML 1.1
---
!!seq [
  !<!local> "foo",
  !<tag:yaml.org,2002:str> "bar",
  !<tag:ben-kiki.org,2000:type> "baz",
]

==> PyYAML-6.0.1/tests/data/spec-08-05.data <==
%TAG !o! tag:ben-kiki.org,2000:
---
- !local foo
- !!str bar
- !o!type baz

==> PyYAML-6.0.1/tests/data/spec-08-06.data <==
%TAG !o! tag:ben-kiki.org,2000:
---
- !$a!b foo
- !o! bar
- !h!type baz

==> PyYAML-6.0.1/tests/data/spec-08-06.error <==
ERROR:
- The !$a! looks like a handle.
- The !o! handle has no suffix.
- The !h! handle wasn't declared.

==> PyYAML-6.0.1/tests/data/spec-08-07.canonical <==
%YAML 1.1
---
!!seq [
  !<tag:yaml.org,2002:str> "12",
  !<tag:yaml.org,2002:int> "12",
  # !<tag:yaml.org,2002:str> "12",
  !<tag:yaml.org,2002:str> "12",
]

==> PyYAML-6.0.1/tests/data/spec-08-07.data <==
# Assuming conventional resolution:
- "12"
- 12
- ! 12

==> PyYAML-6.0.1/tests/data/spec-08-08.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "foo"
  : !!str "bar baz"
}
%YAML 1.1
---
!!str "foo bar"
%YAML 1.1
---
!!str "foo bar"
%YAML 1.1
---
!!str "foo\n"

==> PyYAML-6.0.1/tests/data/spec-08-08.data <==
---
foo: "bar baz"
--- "foo bar"
--- foo bar
--- |
 foo
...
==> PyYAML-6.0.1/tests/data/spec-08-09.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "scalars"
  : !!map {
      ? !!str "plain"
      : !!str "some text",
      ? !!str "quoted"
      : !!map {
          ? !!str "single"
          : !!str "some text",
          ? !!str "double"
          : !!str "some text"
        }
    },
  ? !!str "collections"
  : !!map {
      ? !!str "sequence"
      : !!seq [
          !!str "entry",
          !!map {
            ? !!str "key"
            : !!str "value"
          }
        ],
      ? !!str "mapping"
      : !!map {
          ? !!str "key"
          : !!str "value"
        }
    }
}

==> PyYAML-6.0.1/tests/data/spec-08-09.data <==
---
scalars:
  plain: !!str some text
  quoted:
    single: 'some text'
    double: "some text"
collections:
  sequence: !!seq [ !!str entry,
    # Mapping entry:
      key: value ]
  mapping: { key: value }

==> PyYAML-6.0.1/tests/data/spec-08-10.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "block styles"
  : !!map {
      ? !!str "scalars"
      : !!map {
          ? !!str "literal"
          : !!str "#!/usr/bin/perl\n\
            print \"Hello, world!\\n\";\n",
          ? !!str "folded"
          : !!str "This sentence is false.\n"
        },
      ? !!str "collections"
      : !!map {
          ? !!str "sequence"
          : !!seq [
              !!str "entry",
              !!map {
                ? !!str "key"
                : !!str "value"
              }
            ],
          ? !!str "mapping"
          : !!map {
              ? !!str "key"
              : !!str "value"
            }
        }
    }
}

==> PyYAML-6.0.1/tests/data/spec-08-10.data <==
block styles:
  scalars:
    literal: !!str |
      #!/usr/bin/perl
      print "Hello, world!\n";
    folded: >
      This sentence
      is false.
  collections: !!map
    sequence: !!seq # Entry:
      - entry # Plain
      # Mapping entry:
      - key: value
    mapping:
      key: value

==> PyYAML-6.0.1/tests/data/spec-08-11.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "First occurrence"
  : &A !!str "Value",
  ? !!str "Second occurrence"
  : *A
}

==> PyYAML-6.0.1/tests/data/spec-08-11.data <==
First occurrence: &anchor Value
Second occurrence: *anchor

==> PyYAML-6.0.1/tests/data/spec-08-12.canonical <==
%YAML 1.1
---
!!seq [
  !!str "Without properties",
  &A !!str "Anchored",
  !!str "Tagged",
  *A,
  !!str "",
  !!str "",
]

==> PyYAML-6.0.1/tests/data/spec-08-12.data <==
[
  Without properties,
  &anchor "Anchored",
  !!str 'Tagged',
  *anchor, # Alias node
  !!str ,  # Empty plain scalar
  '',      # Empty plain scalar
]

==> PyYAML-6.0.1/tests/data/spec-08-13.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "foo"
# : !!str "",
# ? !!str ""
  : !!null "",
  ? !!null ""
  : !!str "bar",
}

==> PyYAML-6.0.1/tests/data/spec-08-13.data <==
{
  ? foo :,
  ?
  : bar,
}

==> PyYAML-6.0.1/tests/data/spec-08-13.skip-ext <==

==> PyYAML-6.0.1/tests/data/spec-08-14.canonical <==
%YAML 1.1
---
!!seq [
  !!str "flow in block",
  !!str "Block scalar\n",
  !!map {
    ? !!str "foo"
    : !!str "bar"
  }
]

==> PyYAML-6.0.1/tests/data/spec-08-14.data <==
- "flow in block"
- >
 Block scalar
- !!map # Block collection
  foo : bar

==> PyYAML-6.0.1/tests/data/spec-08-15.canonical <==
%YAML 1.1
---
!!seq [
  !!null "",
  !!map {
    ? !!str "foo"
    : !!null "",
    ? !!null ""
    : !!str "bar",
  }
]

==> PyYAML-6.0.1/tests/data/spec-08-15.data <==
- # Empty plain scalar
- ? foo
  :
  ?
  : bar

==> PyYAML-6.0.1/tests/data/spec-09-01.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "simple key"
  : !!map {
      ? !!str "also simple"
      : !!str "value",
      ? !!str "not a simple key"
      : !!str "any value"
    }
}

==> PyYAML-6.0.1/tests/data/spec-09-01.data <==
"simple key" : {
  "also simple" : value,
  ? "not a simple key"
  : "any value"
}

==> PyYAML-6.0.1/tests/data/spec-09-02.canonical <==
%YAML 1.1
---
!!str "as space \
trimmed\n\
specific\L\n\
escaped\t\n\
none"

==> PyYAML-6.0.1/tests/data/spec-09-02.data <==
 "as space trimmed specific
  escaped \
  none"

==> PyYAML-6.0.1/tests/data/spec-09-03.canonical <==
%YAML 1.1
---
!!seq [
  !!str " last",
  !!str " last",
  !!str " \tfirst last",
]

==> PyYAML-6.0.1/tests/data/spec-09-03.data <==
- " last"
- " last"
- " first last"

==> PyYAML-6.0.1/tests/data/spec-09-04.canonical <==
%YAML 1.1
---
!!str "first \
inner 1 \
inner 2 \
last"

==> PyYAML-6.0.1/tests/data/spec-09-04.data <==
 "first inner 1 \
  inner 2 \
 last"

==> PyYAML-6.0.1/tests/data/spec-09-05.canonical <==
%YAML 1.1
---
!!seq [
  !!str "first ",
  !!str "first\nlast",
  !!str "first inner \tlast",
]

==> PyYAML-6.0.1/tests/data/spec-09-05.data <==
- "first "
- "first
  last"
- "first inner \
  last"

==> PyYAML-6.0.1/tests/data/spec-09-06.canonical <==
%YAML 1.1
---
!!str "here's to \"quotes\""
==> PyYAML-6.0.1/tests/data/spec-09-06.data <==
 'here''s to "quotes"'

==> PyYAML-6.0.1/tests/data/spec-09-07.canonical <==
%YAML 1.1
---
!!map {
  ? !!str "simple key"
  : !!map {
      ? !!str "also simple"
      : !!str "value",
      ? !!str "not a simple key"
      : !!str "any value"
    }
}

==> PyYAML-6.0.1/tests/data/spec-09-07.data <==
'simple key' : {
  'also simple' : value,
  ? 'not a simple key'
  : 'any value'
}

==> PyYAML-6.0.1/tests/data/spec-09-08.canonical <==
%YAML 1.1
---
!!str "as space \
trimmed\n\
specific\L\n\
none"

==> PyYAML-6.0.1/tests/data/spec-09-08.data <==
 'as space … trimmed …… specific
 … none'

==> PyYAML-6.0.1/tests/data/spec-09-09.canonical <==
%YAML 1.1
---
!!seq [
  !!str " last",
  !!str " last",
  !!str " \tfirst last",
]

==> PyYAML-6.0.1/tests/data/spec-09-09.data <==
- ' last'
- ' last'
- ' first last'

==> PyYAML-6.0.1/tests/data/spec-09-10.canonical <==
%YAML 1.1
---
!!str "first \
inner \
last"

==> PyYAML-6.0.1/tests/data/spec-09-10.data <==
 'first inner last'

==> PyYAML-6.0.1/tests/data/spec-09-11.canonical <==
%YAML 1.1
---
!!seq [
  !!str "first ",
  !!str "first\nlast",
]

==> PyYAML-6.0.1/tests/data/spec-09-11.data <==
- 'first '
- 'first
  last'

==> PyYAML-6.0.1/tests/data/spec-09-12.canonical <==
%YAML 1.1
---
!!seq [
  !!str "::std::vector",
  !!str "Up, up, and away!",
  !!int "-123",
  !!seq [
    !!str "::std::vector",
    !!str "Up, up, and away!",
    !!int "-123",
  ]
]
PyYAML-6.0.1/tests/data/spec-09-12.data0000644000175100001730000000022514455350511016502 0ustar00runnerdocker# Outside flow collection: - ::std::vector - Up, up, and away! - -123 # Inside flow collection: - [ '::std::vector', "Up, up, and away!", -123 ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-13.canonical0000644000175100001730000000024514455350511017523 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "simple key" : !!map { ? !!str "also simple" : !!str "value", ? !!str "not a simple key" : !!str "any value" } } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-13.data0000644000175100001730000000011514455350511016501 0ustar00runnerdockersimple key : { also simple : value, ? not a simple key : any value } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-14.data0000644000175100001730000000011614455350511016503 0ustar00runnerdocker--- --- ||| : foo ... >>>: bar --- [ --- , ... , { --- : ... # Nested } ] ... ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-14.error0000644000175100001730000000021314455350511016721 0ustar00runnerdockerERROR: The --- and ... document start and end markers must not be specified as the first content line of a non-indented plain scalar. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-15.canonical0000644000175100001730000000030114455350511017516 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "---" : !!str "foo", ? !!str "..." : !!str "bar" } %YAML 1.1 --- !!seq [ !!str "---", !!str "...", !!map { ? !!str "---" : !!str "..." 
} ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-15.data0000644000175100001730000000007714455350511016512 0ustar00runnerdocker--- "---" : foo ...: bar --- [ ---, ..., { ? --- : ... } ] ... ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-16.canonical0000644000175100001730000000010514455350511017521 0ustar00runnerdocker%YAML 1.1 --- !!str "as space \ trimmed\n\ specific\L\n\ none" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-16.data0000644000175100001730000000014414455350511016506 0ustar00runnerdocker# Tabs are confusing: # as space/trimmed/specific/none as space … trimmed …… specific
… none ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-17.canonical0000644000175100001730000000006414455350511017526 0ustar00runnerdocker%YAML 1.1 --- !!str "first line\n\ more line" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-17.data0000644000175100001730000000003514455350511016506 0ustar00runnerdocker first line more line ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-18.canonical0000644000175100001730000000015014455350511017523 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!str "literal\n", !!str " folded\n", !!str "keep\n\n", !!str " strip", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-18.data0000644000175100001730000000020514455350511016506 0ustar00runnerdocker- | # Just the style literal - >1 # Indentation indicator folded - |+ # Chomping indicator keep - >-1 # Both indicators strip ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-19.canonical0000644000175100001730000000010114455350511017520 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!str "literal\n", !!str "folded\n", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-19.data0000644000175100001730000000003114455350511016504 0ustar00runnerdocker- | literal - > folded ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-20.canonical0000644000175100001730000000017314455350511017521 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!str "detected\n", !!str "\n\n# detected\n", !!str " explicit\n", !!str "\t\ndetected\n", ] 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-20.data0000644000175100001730000000010514455350511016476 0ustar00runnerdocker- | detected - > # detected - |1 explicit - > detected ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-20.skip-ext0000644000175100001730000000000014455350511017323 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-21.data0000644000175100001730000000005114455350511016477 0ustar00runnerdocker- | text - > text text - |1 text ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-21.error0000644000175100001730000000026014455350511016721 0ustar00runnerdockerERROR: - A leading all-space line must not have too many spaces. - A following text line must not be less indented. - The text is less indented than the indicated level. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-22.canonical0000644000175100001730000000020614455350511017520 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "strip" : !!str "text", ? !!str "clip" : !!str "text\n", ? !!str "keep" : !!str "text\L", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-22.data0000644000175100001730000000006514455350511016505 0ustar00runnerdockerstrip: |- text
clip: | text…keep: |+ text
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-23.canonical0000644000175100001730000000021614455350511017522 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "strip" : !!str "# text", ? !!str "clip" : !!str "# text\n", ? !!str "keep" : !!str "# text\L\n", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-23.data0000644000175100001730000000024414455350511016505 0ustar00runnerdocker # Strip # Comments: strip: |- # text
 
 # Clip # comments: …clip: | # text… 
 # Keep # comments: …keep: |+ # text
… # Trail # comments. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-24.canonical0000644000175100001730000000017014455350511017522 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "strip" : !!str "", ? !!str "clip" : !!str "", ? !!str "keep" : !!str "\n", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-24.data0000644000175100001730000000003614455350511016505 0ustar00runnerdockerstrip: >- clip: > keep: |+ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-25.canonical0000644000175100001730000000006014455350511017521 0ustar00runnerdocker%YAML 1.1 --- !!str "literal\n\ \ttext\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-25.data0000644000175100001730000000005014455350511016502 0ustar00runnerdocker| # Simple block scalar literal text ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-26.canonical0000644000175100001730000000005414455350511017525 0ustar00runnerdocker%YAML 1.1 --- !!str "\n\nliteral\n\ntext\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-26.data0000644000175100001730000000004614455350511016510 0ustar00runnerdocker| literal text # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-27.canonical0000644000175100001730000000005414455350511017526 0ustar00runnerdocker%YAML 1.1 --- !!str "\n\nliteral\n\ntext\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-27.data0000644000175100001730000000004614455350511016511 0ustar00runnerdocker| 
literal text # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-28.canonical0000644000175100001730000000005414455350511017527 0ustar00runnerdocker%YAML 1.1 --- !!str "\n\nliteral\n\ntext\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-28.data0000644000175100001730000000004614455350511016512 0ustar00runnerdocker| literal text # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-29.canonical0000644000175100001730000000006514455350511017532 0ustar00runnerdocker%YAML 1.1 --- !!str "folded text\n\ \tlines\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-29.data0000644000175100001730000000005714455350511016515 0ustar00runnerdocker> # Simple folded scalar folded text lines ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-30.canonical0000644000175100001730000000016614455350511017524 0ustar00runnerdocker%YAML 1.1 --- !!str "folded line\n\ next line\n\n\ \ * bullet\n\ \ * list\n\n\ last line\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-30.data0000644000175100001730000000011414455350511016477 0ustar00runnerdocker> folded line next line * bullet * list last line # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-31.canonical0000644000175100001730000000016614455350511017525 0ustar00runnerdocker%YAML 1.1 --- !!str "folded line\n\ next line\n\n\ \ * bullet\n\ \ * list\n\n\ last line\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 
PyYAML-6.0.1/tests/data/spec-09-31.data0000644000175100001730000000011414455350511016500 0ustar00runnerdocker> folded line next line * bullet * list last line # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-32.canonical0000644000175100001730000000016614455350511017526 0ustar00runnerdocker%YAML 1.1 --- !!str "folded line\n\ next line\n\n\ \ * bullet\n\ \ * list\n\n\ last line\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-32.data0000644000175100001730000000011414455350511016501 0ustar00runnerdocker> folded line next line * bullet * list last line # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-33.canonical0000644000175100001730000000016614455350511017527 0ustar00runnerdocker%YAML 1.1 --- !!str "folded line\n\ next line\n\n\ \ * bullet\n\ \ * list\n\n\ last line\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-09-33.data0000644000175100001730000000011414455350511016502 0ustar00runnerdocker> folded line next line * bullet * list last line # Comment ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-01.canonical0000644000175100001730000000020114455350511017500 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!seq [ !!str "inner", !!str "inner", ], !!seq [ !!str "inner", !!str "last", ], ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-01.data0000644000175100001730000000004314455350511016466 0ustar00runnerdocker- [ inner, inner, ] - [inner,last] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 
PyYAML-6.0.1/tests/data/spec-10-02.canonical0000644000175100001730000000027114455350511017510 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!str "double quoted", !!str "single quoted", !!str "plain text", !!seq [ !!str "nested", ], !!map { ? !!str "single" : !!str "pair" } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-02.data0000644000175100001730000000013214455350511016466 0ustar00runnerdocker[ "double quoted", 'single quoted', plain text, [ nested ], single: pair , ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-03.canonical0000644000175100001730000000020714455350511017510 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "block" : !!seq [ !!str "one", !!map { ? !!str "two" : !!str "three" } ] } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-03.data0000644000175100001730000000006514455350511016474 0ustar00runnerdockerblock: # Block # sequence - one - two : three ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-04.canonical0000644000175100001730000000015714455350511017515 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "block" : !!seq [ !!str "one", !!seq [ !!str "two" ] ] } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-04.data0000644000175100001730000000002614455350511016472 0ustar00runnerdockerblock: - one - - two ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-05.canonical0000644000175100001730000000024114455350511017510 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!null "", !!str "block node\n", !!seq [ !!str "one", !!str "two", ], !!map { ? 
!!str "one" : !!str "two", } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-05.data0000644000175100001730000000015114455350511016472 0ustar00runnerdocker- # Empty - | block node - - one # in-line - two # sequence - one: two # in-line # mapping ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-06.canonical0000644000175100001730000000032514455350511017514 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!map { ? !!str "inner" : !!str "entry", ? !!str "also" : !!str "inner" }, !!map { ? !!str "inner" : !!str "entry", ? !!str "last" : !!str "entry" } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-06.data0000644000175100001730000000010214455350511016467 0ustar00runnerdocker- { inner : entry , also: inner , } - {inner: entry,last : entry} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-07.canonical0000644000175100001730000000035514455350511017520 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!null "" : !!str "value", ? !!str "explicit key" : !!str "value", ? !!str "simple key" : !!str "value", ? !!seq [ !!str "collection", !!str "simple", !!str "key" ] : !!str "value" } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-07.data0000644000175100001730000000015214455350511016475 0ustar00runnerdocker{ ? : value, # Empty key ? 
explicit key: value, simple key : value, [ collection, simple, key ]: value } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-08.data0000644000175100001730000000410614455350511016501 0ustar00runnerdocker{ multi-line simple key : value, very long ...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................(>1KB)...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
................................................................................................................................................................................................................................................................................................................................................................................................ key: value } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-08.error0000644000175100001730000000016214455350511016717 0ustar00runnerdockerERROR: - A simple key is restricted to only one line. - A simple key must not be longer than 1024 characters. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-09.canonical0000644000175100001730000000013414455350511017515 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "key" : !!str "value", ? !!str "empty" : !!null "", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-09.data0000644000175100001730000000005114455350511016475 0ustar00runnerdocker{ key : value, empty: # empty value↓ } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-10.canonical0000644000175100001730000000044214455350511017507 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "explicit key1" : !!str "explicit value", ? !!str "explicit key2" : !!null "", ? !!str "explicit key3" : !!null "", ? !!str "simple key1" : !!str "explicit value", ? !!str "simple key2" : !!null "", ? !!str "simple key3" : !!null "", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-10.data0000644000175100001730000000032414455350511016470 0ustar00runnerdocker{ ? explicit key1 : explicit value, ? explicit key2 : , # Explicit empty ? 
explicit key3, # Empty value simple key1 : explicit value, simple key2 : , # Explicit empty simple key3, # Empty value } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-11.canonical0000644000175100001730000000053214455350511017510 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!map { ? !!str "explicit key1" : !!str "explicit value", }, !!map { ? !!str "explicit key2" : !!null "", }, !!map { ? !!str "explicit key3" : !!null "", }, !!map { ? !!str "simple key1" : !!str "explicit value", }, !!map { ? !!str "simple key2" : !!null "", }, ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-11.data0000644000175100001730000000026414455350511016474 0ustar00runnerdocker[ ? explicit key1 : explicit value, ? explicit key2 : , # Explicit empty ? explicit key3, # Implicit empty simple key1 : explicit value, simple key2 : , # Explicit empty ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-12.canonical0000644000175100001730000000014014455350511017504 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "block" : !!map { ? !!str "key" : !!str "value" } } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-12.data0000644000175100001730000000005114455350511016467 0ustar00runnerdockerblock: # Block # mapping key: value ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-13.canonical0000644000175100001730000000021214455350511017505 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "explicit key" : !!null "", ? 
!!str "block key\n" : !!seq [ !!str "one", !!str "two", ] } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-13.data0000644000175100001730000000014114455350511016470 0ustar00runnerdocker? explicit key # implicit value ? | block key : - one # explicit in-line - two # block value ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-14.canonical0000644000175100001730000000020614455350511017511 0ustar00runnerdocker%YAML 1.1 --- !!map { ? !!str "plain key" : !!null "", ? !!str "quoted key" : !!seq [ !!str "one", !!str "two", ] } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-14.data0000644000175100001730000000012614455350511016474 0ustar00runnerdockerplain key: # empty value "quoted key": - one # explicit next-line - two # block value ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-15.canonical0000644000175100001730000000033214455350511017512 0ustar00runnerdocker%YAML 1.1 --- !!seq [ !!map { ? !!str "sun" : !!str "yellow" }, !!map { ? !!map { ? !!str "earth" : !!str "blue" } : !!map { ? !!str "moon" : !!str "white" } } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/spec-10-15.data0000644000175100001730000000005614455350511016477 0ustar00runnerdocker- sun: yellow - ? 
earth: blue : moon: white ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/str.data0000644000175100001730000000000714455350511015710 0ustar00runnerdocker- abcd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/str.detect0000644000175100001730000000002614455350511016250 0ustar00runnerdockertag:yaml.org,2002:str ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/tags.events0000644000175100001730000000057614455350511016444 0ustar00runnerdocker- !StreamStart - !DocumentStart - !SequenceStart - !Scalar { value: 'data' } #- !Scalar { tag: '!', value: 'data' } - !Scalar { tag: 'tag:yaml.org,2002:str', value: 'data' } - !Scalar { tag: '!myfunnytag', value: 'data' } - !Scalar { tag: '!my!ugly!tag', value: 'data' } - !Scalar { tag: 'tag:my.domain.org,2002:data!? #', value: 'data' } - !SequenceEnd - !DocumentEnd - !StreamEnd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/test_mark.marks0000644000175100001730000000103614455350511017300 0ustar00runnerdocker--- *The first line. The last line. --- The first*line. The last line. --- The first line.* The last line. --- The first line. *The last line. --- The first line. The last*line. --- The first line. The last line.* --- The first line. *The selected line. The last line. --- The first line. The selected*line. The last line. --- The first line. The selected line.* The last line. --- *The only line. --- The only*line. 
--- The only line.* --- Loooooooooooooooooooooooooooooooooooooooooooooong*Liiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiine ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/timestamp-bugs.code0000644000175100001730000000071014455350511020043 0ustar00runnerdocker[ [datetime.datetime(2001, 12, 15, 3, 29, 43, 100000), 'UTC-05:30'], [datetime.datetime(2001, 12, 14, 16, 29, 43, 100000), 'UTC+05:30'], [datetime.datetime(2001, 12, 14, 21, 59, 43, 1010), None], [datetime.datetime(2001, 12, 14, 21, 59, 43, 0, FixedOffset(60, "+1")), 'UTC+01:00'], [datetime.datetime(2001, 12, 14, 21, 59, 43, 0, FixedOffset(-90, "-1:30")), 'UTC-01:30'], [datetime.datetime(2005, 7, 8, 17, 35, 4, 517600), None], ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/timestamp-bugs.data0000644000175100001730000000036214455350511020045 0ustar00runnerdocker- !MyTime - 2001-12-14 21:59:43.10 -5:30 - !MyTime - 2001-12-14 21:59:43.10 +5:30 - !MyTime - 2001-12-14 21:59:43.00101 - !MyTime - 2001-12-14 21:59:43+1 - !MyTime - 2001-12-14 21:59:43-1:30 - !MyTime - 2005-07-08 17:35:04.517600 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/timestamp.data0000644000175100001730000000017114455350511017105 0ustar00runnerdocker- 2001-12-15T02:59:43.1Z - 2001-12-14t21:59:43.10-05:00 - 2001-12-14 21:59:43.10 -5 - 2001-12-15 2:59:43.10 - 2002-12-14 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/timestamp.detect0000644000175100001730000000003414455350511017442 0ustar00runnerdockertag:yaml.org,2002:timestamp ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/unacceptable-key.loader-error0000644000175100001730000000003214455350511021776 0ustar00runnerdocker--- ? 
- foo - bar : baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/unclosed-bracket.loader-error0000644000175100001730000000032714455350511022016 0ustar00runnerdockertest: - [ foo: bar # comment the rest of the stream to let the scanner detect the problem. # - baz #"we could have detected the unclosed bracket on the above line, but this would forbid such syntax as": { #} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/unclosed-quoted-scalar.loader-error0000644000175100001730000000001214455350511023136 0ustar00runnerdocker'foo bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/undefined-anchor.loader-error0000644000175100001730000000003014455350511021771 0ustar00runnerdocker- foo - &bar baz - *bat ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/undefined-constructor.loader-error0000644000175100001730000000001514455350511023107 0ustar00runnerdocker--- !foo bar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/undefined-tag-handle.loader-error0000644000175100001730000000002414455350511022526 0ustar00runnerdocker--- !foo!bar baz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/unknown.dumper-error0000644000175100001730000000002714455350511020313 0ustar00runnerdockeryaml.safe_dump(object) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/unsupported-version.emitter-error0000644000175100001730000000015014455350511023041 0ustar00runnerdocker- !StreamStart - !DocumentStart { version: [5,6] } - !Scalar { value: foo } - !DocumentEnd - !StreamEnd 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf16be.code0000644000175100001730000000001414455350511016353 0ustar00runnerdocker"UTF-16-BE" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf16be.data0000644000175100001730000000003614455350511016356 0ustar00runnerdocker--- UTF-16-BE ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf16le.code0000644000175100001730000000001414455350511016365 0ustar00runnerdocker"UTF-16-LE" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf16le.data0000644000175100001730000000003614455350511016370 0ustar00runnerdocker--- UTF-16-LE ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf8-implicit.code0000644000175100001730000000002114455350511017573 0ustar00runnerdocker"implicit UTF-8" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf8-implicit.data0000644000175100001730000000002314455350511017574 0ustar00runnerdocker--- implicit UTF-8 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf8.code0000644000175100001730000000001014455350511015761 0ustar00runnerdocker"UTF-8" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/utf8.data0000644000175100001730000000001514455350511015765 0ustar00runnerdocker--- UTF-8 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/data/value.data0000644000175100001730000000000414455350511016211 0ustar00runnerdocker- = ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 
==> PyYAML-6.0.1/tests/data/value.detect <==
tag:yaml.org,2002:value

==> PyYAML-6.0.1/tests/data/yaml.data <==
- !!yaml '!'
- !!yaml '&'
- !!yaml '*'

==> PyYAML-6.0.1/tests/data/yaml.detect <==
tag:yaml.org,2002:yaml

==> PyYAML-6.0.1/tests/data/yaml11.schema <==
# https://github.com/perlpunk/yaml-test-schema/blob/master/data/schema-yaml11.yaml
---
'!!bool FALSE': ['bool', 'false()', 'false']
'!!bool False': ['bool', 'false()', 'false']
'!!bool N': ['bool', 'false()', "false"]
'!!bool NO': ['bool', 'false()', "false"]
'!!bool No': ['bool', 'false()', "false"]
'!!bool OFF': ['bool', 'false()', "false"]
'!!bool ON': ['bool', 'true()', "true"]
'!!bool Off': ['bool', 'false()', "false"]
'!!bool On': ['bool', 'true()', "true"]
'!!bool TRUE': ['bool', 'true()', 'true']
'!!bool True': ['bool', 'true()', 'true']
'!!bool Y': ['bool', 'true()', "true"]
'!!bool YES': ['bool', 'true()', "true"]
'!!bool Yes': ['bool', 'true()', "true"]
'!!bool false': ['bool', 'false()', 'false']
'!!bool n': ['bool', 'false()', "false"]
'!!bool no': ['bool', 'false()', "false"]
'!!bool off': ['bool', 'false()', "false"]
'!!bool on': ['bool', 'true()', "true"]
'!!bool true': ['bool', 'true()', 'true']
'!!bool y': ['bool', 'true()', "true"]
'!!bool yes': ['bool', 'true()', "true"]
'!!float +.INF': ['inf', 'inf()', '.inf']
'!!float +.Inf': ['inf', 'inf()', '.inf']
'!!float +.inf': ['inf', 'inf()', '.inf']
'!!float +0.3e+3': ['float', '300.0', '300.0']
'!!float -.INF': ['inf', 'inf-neg()', '-.inf']
'!!float -.Inf': ['inf', 'inf-neg()', '-.inf']
'!!float -.inf': ['inf', 'inf-neg()', '-.inf']
'!!float -3.14': ['float', '-3.14', '-3.14']
'!!float .0': ['float', '0.0', '0.0']
'!!float .14': ['float', '0.14', '0.14']
'!!float .1_4': ['float', '0.14', '0.14']
'!!float .3E-1': ['float', '0.03', '0.03']
'!!float .3e+3': ['float', '300.0', '300.0']
'!!float .INF': ['inf', 'inf()', '.inf']
'!!float .Inf': ['inf', 'inf()', '.inf']
'!!float .NAN': ['nan', 'nan()', '.nan']
'!!float .NaN': ['nan', 'nan()', '.nan']
'!!float .inf': ['inf', 'inf()', '.inf']
'!!float .nan': ['nan', 'nan()', '.nan']
'!!float 0.0': ['float', '0.0', '0.0']
'!!float 001.23': ['float', '1.23', '1.23']
'!!float 190:20:30.15': ['float', '685230.15', '685230.15']
'!!float 3.': ['float', '3.0', '3.0']
'!!float 3.14': ['float', '3.14', '3.14']
'!!float 3.3e+3': ['float', '3300.0', '3300.0']
'!!float 85.230_15e+03': ['float', '85230.15', '85230.15']
'!!float 85_230.15': ['float', '85230.15', '85230.15']
'!!int +0': ['int', '0', '0']
'!!int +0100_200': ['int', '32896', '32896']
'!!int +0b100': ['int', '4', '4']
'!!int +190:20:30': ['int', '685230', '685230']
'!!int +23': ['int', '23', '23']
'!!int -0': ['int', '0', '0']
'!!int -0100_200': ['int', '-32896', '-32896']
'!!int -0b101': ['int', '-5', '-5']
'!!int -0x30': ['int', '-48', '-48']
'!!int -190:20:30': ['int', '-685230', '-685230']
'!!int -23': ['int', '-23', '-23']
'!!int 0': ['int', '0', '0']
'!!int 00': ['int', '0', '0']
'!!int 0011': ['int', '9', '9']
'!!int 010': ['int', '8', '8']
'!!int 02_0': ['int', '16', '16']
'!!int 07': ['int', '7', '7']
'!!int 0b0': ['int', '0', '0']
'!!int 0b100_101': ['int', '37', '37']
'!!int 0x0': ['int', '0', '0']
'!!int 0x10': ['int', '16', '16']
'!!int 0x2_0': ['int', '32', '32']
'!!int 0x42': ['int', '66', '66']
'!!int 0xa': ['int', '10', '10']
'!!int 100_000': ['int', '100000', '100000']
'!!int 190:20:30': ['int', '685230', '685230']
'!!int 23': ['int', '23', '23']
'!!null #empty': ['null', 'null()', "null"]
'!!null NULL': ['null', 'null()', "null"]
'!!null Null': ['null', 'null()', "null"]
'!!null null': ['null', 'null()', 'null']
'!!null ~': ['null', 'null()', 'null']
'!!str #empty': ['str', '', "''"]
'!!str +.INF': ['str', '+.INF', "'+.INF'"]
'!!str +.Inf': ['str', '+.Inf', "'+.Inf'"]
'!!str +.inf': ['str', '+.inf', "'+.inf'"]
'!!str +0': ['str', '+0', "'+0'"]
'!!str +0.3e+3': ['str', '+0.3e+3', "'+0.3e+3'"]
'!!str +0.3e3': ['str', '+0.3e3', "+0.3e3"]
'!!str +0100_200': ['str', '+0100_200', "'+0100_200'"]
'!!str +0b100': ['str', '+0b100', "'+0b100'"]
'!!str +190:20:30': ['str', '+190:20:30', "'+190:20:30'"]
'!!str +23': ['str', '+23', "'+23'"]
'!!str -.INF': ['str', '-.INF', "'-.INF'"]
'!!str -.Inf': ['str', '-.Inf', "'-.Inf'"]
'!!str -.inf': ['str', '-.inf', "'-.inf'"]
'!!str -0': ['str', '-0', "'-0'"]
'!!str -0100_200': ['str', '-0100_200', "'-0100_200'"]
'!!str -0b101': ['str', '-0b101', "'-0b101'"]
'!!str -0x30': ['str', '-0x30', "'-0x30'"]
'!!str -190:20:30': ['str', '-190:20:30', "'-190:20:30'"]
'!!str -23': ['str', '-23', "'-23'"]
'!!str -3.14': ['str', '-3.14', "'-3.14'"]
'!!str .': ['str', '.', '.']
'!!str .0': ['str', '.0', "'.0'"]
'!!str .14': ['str', '.14', "'.14'"]
'!!str .1_4': ['str', '.1_4', "'.1_4'"]
'!!str .3E-1': ['str', '.3E-1', "'.3E-1'"]
'!!str .3e+3': ['str', '.3e+3', "'.3e+3'"]
'!!str .3e3': ['str', '.3e3', ".3e3"]
'!!str .INF': ['str', '.INF', "'.INF'"]
'!!str .Inf': ['str', '.Inf', "'.Inf'"]
'!!str .NAN': ['str', '.NAN', "'.NAN'"]
'!!str .NaN': ['str', '.NaN', "'.NaN'"]
'!!str ._': ['str', '._', '._']
'!!str ._14': ['str', '._14', '._14']
'!!str .inf': ['str', '.inf', "'.inf'"]
'!!str .nan': ['str', '.nan', "'.nan'"]
'!!str 0': ['str', '0', "'0'"]
'!!str 0.0': ['str', '0.0', "'0.0'"]
'!!str 0.3e3': ['str', '0.3e3', "0.3e3"]
'!!str 00': ['str', '00', "'00'"]
'!!str 001.23': ['str', '001.23', "'001.23'"]
'!!str 0011': ['str', '0011', "'0011'"]
'!!str 010': ['str', '010', "'010'"]
'!!str 02_0': ['str', '02_0', "'02_0'"]
'!!str 07': ['str', '07', "'07'"]
'!!str 0b0': ['str', '0b0', "'0b0'"]
'!!str 0b100_101': ['str', '0b100_101', "'0b100_101'"]
'!!str 0o0': ['str', '0o0', "0o0"]
'!!str 0o10': ['str', '0o10', "0o10"]
'!!str 0o7': ['str', '0o7', "0o7"]
'!!str 0x0': ['str', '0x0', "'0x0'"]
'!!str 0x2_0': ['str', '0x2_0', "'0x2_0'"]
'!!str 0xa': ['str', '0xa', "'0xa'"]
'!!str 100_000': ['str', '100_000', "'100_000'"]
'!!str 190:20:30': ['str', '190:20:30', "'190:20:30'"]
'!!str 190:20:30.15': ['str', '190:20:30.15', "'190:20:30.15'"]
'!!str 23': ['str', '23', "'23'"]
'!!str 3.': ['str', '3.', "'3.'"]
'!!str 3.14': ['str', '3.14', "'3.14'"]
'!!str 3.3e+3': ['str', '3.3e+3', "'3.3e+3'"]
'!!str 85.230_15e+03': ['str', '85.230_15e+03', "'85.230_15e+03'"]
'!!str 85_230.15': ['str', '85_230.15', "'85_230.15'"]
'!!str FALSE': ['str', 'FALSE', "'FALSE'"]
'!!str False': ['str', 'False', "'False'"]
'!!str N': ['str', 'N', "'N'"]
'!!str NO': ['str', 'NO', "'NO'"]
'!!str NULL': ['str', 'NULL', "'NULL'"]
'!!str Null': ['str', 'Null', "'Null'"]
'!!str OFF': ['str', 'OFF', "'OFF'"]
'!!str ON': ['str', 'ON', "'ON'"]
'!!str Off': ['str', 'Off', "'Off'"]
'!!str On': ['str', 'On', "'On'"]
'!!str TRUE': ['str', 'TRUE', "'TRUE'"]
'!!str True': ['str', 'True', "'True'"]
'!!str Y': ['str', 'Y', "'Y'"]
'!!str YES': ['str', 'YES', "'YES'"]
'!!str Yes': ['str', 'Yes', "'Yes'"]
'!!str _._': ['str', '_._', '_._']
'!!str false': ['str', 'false', "'false'"]
'!!str n': ['str', 'n', "'n'"]
'!!str no': ['str', 'no', "'no'"]
'!!str null': ['str', 'null', "'null'"]
'!!str off': ['str', 'off', "'off'"]
'!!str on': ['str', 'on', "'on'"]
'!!str true': ['str', 'true', "'true'"]
'!!str y': ['str', 'y', "'y'"]
'!!str yes': ['str', 'yes', "'yes'"]
'!!str ~': ['str', '~', "'~'"]
'#empty': ['null', 'null()', "null"]
'+.INF': ['inf', 'inf()', '.inf']
'+.Inf': ['inf', 'inf()', '.inf']
'+.inf': ['inf', 'inf()', '.inf']
'+0': ['int', '0', '0']
'+0.3e+3': ['float', '300.0', '300.0']
'+0.3e3': ['str', '+0.3e3', '+0.3e3']
'+0100_200': ['int', '32896', '32896']
'+0b100': ['int', '4', '4']
'+190:20:30': ['int', '685230', '685230']
'+23': ['int', '23', '23']
'+3.14': ['float', '3.14', '3.14']
'-.INF': ['inf', 'inf-neg()', '-.inf']
'-.Inf': ['inf', 'inf-neg()', '-.inf']
'-.inf': ['inf', 'inf-neg()', '-.inf']
'-0': ['int', '0', '0']
'-0100_200': ['int', '-32896', '-32896']
'-0b101': ['int', '-5', '-5']
'-0x30': ['int', '-48', '-48']
'-190:20:30': ['int', '-685230', '-685230']
'-23': ['int', '-23', '-23']
'-3.14': ['float', '-3.14', '-3.14']
'.': ['str', '.', '.']
'.0': ['float', '0.0', '0.0']
'.14': ['float', '0.14', '0.14']
'.1_4': ['float', '0.14', '0.14']
'.3E-1': ['float', '0.03', '0.03']
'.3e+3': ['float', '300.0', '300.0']
'.3e3': ['str', '.3e3', '.3e3']
'.INF': ['inf', 'inf()', '.inf']
'.Inf': ['inf', 'inf()', '.inf']
'.NAN': ['nan', 'nan()', '.nan']
'.NaN': ['nan', 'nan()', '.nan']
'._': ['str', '._', '._']
'._14': ['str', '._14', '._14']
'.inf': ['inf', 'inf()', '.inf']
'.nan': ['nan', 'nan()', '.nan']
'0': ['int', '0', '0']
'0.0': ['float', '0.0', '0.0']
'0.3e3': ['str', '0.3e3', '0.3e3']
'00': ['int', '0', '0']
'001.23': ['float', '1.23', '1.23']
'0011': ['int', '9', '9']
'010': ['int', '8', '8']
'02_0': ['int', '16', '16']
'07': ['int', '7', '7']
'08': ['str', '08', '08']
'0b0': ['int', '0', '0']
'0b100_101': ['int', '37', '37']
'0o0': ['str', '0o0', '0o0']
'0o10': ['str', '0o10', '0o10']
'0o7': ['str', '0o7', '0o7']
'0x0': ['int', '0', '0']
'0x10': ['int', '16', '16']
'0x2_0': ['int', '32', '32']
'0x42': ['int', '66', '66']
'0xa': ['int', '10', '10']
'100_000': ['int', '100000', '100000']
'190:20:30': ['int', '685230', '685230']
'190:20:30.15': ['float', '685230.15', '685230.15']
'23': ['int', '23', '23']
'3.': ['float', '3.0', '3.0']
'3.14': ['float', '3.14', '3.14']
'3.3e+3': ['float', '3300', '3300.0']
'3e3': ['str', '3e3', '3e3']
'85.230_15e+03': ['float', '85230.15', '85230.15']
'85_230.15': ['float', '85230.15', '85230.15']
'FALSE': ['bool', 'false()', 'false']
'False': ['bool', 'false()', 'false']
'N': ['bool', 'false()', "false"]
'NO': ['bool', 'false()', "false"]
'NULL': ['null', 'null()', "null"]
'Null': ['null', 'null()', "null"]
'OFF': ['bool', 'false()', "false"]
'ON': ['bool', 'true()', "true"]
'Off': ['bool', 'false()', "false"]
'On': ['bool', 'true()', "true"]
'TRUE': ['bool', 'true()', 'true']
'True': ['bool', 'true()', 'true']
'Y': ['bool', 'true()', "true"]
'YES': ['bool', 'true()', "true"]
'Yes': ['bool', 'true()', "true"]
'_._': ['str', '_._', '_._']
'false': ['bool', 'false()', 'false']
'n': ['bool', 'false()', "false"]
'no': ['bool', 'false()', "false"]
'null': ['null', 'null()', "null"]
'off': ['bool', 'false()', "false"]
'on': ['bool', 'true()', "true"]
'true': ['bool', 'true()', 'true']
'y': ['bool', 'true()', "true"]
'yes': ['bool', 'true()', "true"]
'~': ['null', 'null()', "null"]

==> PyYAML-6.0.1/tests/data/yaml11.schema-skip <==
load: {
  'Y': 1, 'y': 1, 'N': 1, 'n': 1,
  '!!bool Y': 1, '!!bool N': 1, '!!bool n': 1, '!!bool y': 1,
}
dump: {
  '!!str N': 1, '!!str Y': 1, '!!str n': 1, '!!str y': 1,
}

==> PyYAML-6.0.1/tests/lib/canonical.py <==
import yaml, yaml.composer, yaml.constructor, yaml.resolver

class CanonicalError(yaml.YAMLError):
    pass

class CanonicalScanner:

    def __init__(self, data):
        if isinstance(data, bytes):
            try:
                data = data.decode('utf-8')
            except UnicodeDecodeError:
                raise CanonicalError("utf-8 stream is expected")
        self.data = data+'\0'
        self.index = 0
        self.tokens = []
        self.scanned = False
    def check_token(self, *choices):
        if not self.scanned:
            self.scan()
        if self.tokens:
            if not choices:
                return True
            for choice in choices:
                if isinstance(self.tokens[0], choice):
                    return True
        return False

    def peek_token(self):
        if not self.scanned:
            self.scan()
        if self.tokens:
            return self.tokens[0]

    def get_token(self, choice=None):
        if not self.scanned:
            self.scan()
        token = self.tokens.pop(0)
        if choice and not isinstance(token, choice):
            raise CanonicalError("unexpected token "+repr(token))
        return token

    def get_token_value(self):
        token = self.get_token()
        return token.value

    def scan(self):
        self.tokens.append(yaml.StreamStartToken(None, None))
        while True:
            self.find_token()
            ch = self.data[self.index]
            if ch == '\0':
                self.tokens.append(yaml.StreamEndToken(None, None))
                break
            elif ch == '%':
                self.tokens.append(self.scan_directive())
            elif ch == '-' and self.data[self.index:self.index+3] == '---':
                self.index += 3
                self.tokens.append(yaml.DocumentStartToken(None, None))
            elif ch == '[':
                self.index += 1
                self.tokens.append(yaml.FlowSequenceStartToken(None, None))
            elif ch == '{':
                self.index += 1
                self.tokens.append(yaml.FlowMappingStartToken(None, None))
            elif ch == ']':
                self.index += 1
                self.tokens.append(yaml.FlowSequenceEndToken(None, None))
            elif ch == '}':
                self.index += 1
                self.tokens.append(yaml.FlowMappingEndToken(None, None))
            elif ch == '?':
                self.index += 1
                self.tokens.append(yaml.KeyToken(None, None))
            elif ch == ':':
                self.index += 1
                self.tokens.append(yaml.ValueToken(None, None))
            elif ch == ',':
                self.index += 1
                self.tokens.append(yaml.FlowEntryToken(None, None))
            elif ch == '*' or ch == '&':
                self.tokens.append(self.scan_alias())
            elif ch == '!':
                self.tokens.append(self.scan_tag())
            elif ch == '"':
                self.tokens.append(self.scan_scalar())
            else:
                raise CanonicalError("invalid token")
        self.scanned = True

    DIRECTIVE = '%YAML 1.1'

    def scan_directive(self):
        if self.data[self.index:self.index+len(self.DIRECTIVE)] == self.DIRECTIVE and \
                self.data[self.index+len(self.DIRECTIVE)] in ' \n\0':
            self.index += len(self.DIRECTIVE)
            return yaml.DirectiveToken('YAML', (1, 1), None, None)
        else:
            raise CanonicalError("invalid directive")

    def scan_alias(self):
        if self.data[self.index] == '*':
            TokenClass = yaml.AliasToken
        else:
            TokenClass = yaml.AnchorToken
        self.index += 1
        start = self.index
        while self.data[self.index] not in ', \n\0':
            self.index += 1
        value = self.data[start:self.index]
        return TokenClass(value, None, None)

    def scan_tag(self):
        self.index += 1
        start = self.index
        while self.data[self.index] not in ' \n\0':
            self.index += 1
        value = self.data[start:self.index]
        if not value:
            value = '!'
        elif value[0] == '!':
            value = 'tag:yaml.org,2002:'+value[1:]
        elif value[0] == '<' and value[-1] == '>':
            value = value[1:-1]
        else:
            value = '!'+value
        return yaml.TagToken(value, None, None)

    QUOTE_CODES = {
        'x': 2,
        'u': 4,
        'U': 8,
    }

    QUOTE_REPLACES = {
        '\\': '\\',
        '\"': '\"',
        ' ': ' ',
        'a': '\x07',
        'b': '\x08',
        'e': '\x1B',
        'f': '\x0C',
        'n': '\x0A',
        'r': '\x0D',
        't': '\x09',
        'v': '\x0B',
        'N': '\u0085',
        'L': '\u2028',
        'P': '\u2029',
        '_': '_',
        '0': '\x00',
    }

    def scan_scalar(self):
        self.index += 1
        chunks = []
        start = self.index
        ignore_spaces = False
        while self.data[self.index] != '"':
            if self.data[self.index] == '\\':
                ignore_spaces = False
                chunks.append(self.data[start:self.index])
                self.index += 1
                ch = self.data[self.index]
                self.index += 1
                if ch == '\n':
                    ignore_spaces = True
                elif ch in self.QUOTE_CODES:
                    length = self.QUOTE_CODES[ch]
                    code = int(self.data[self.index:self.index+length], 16)
                    chunks.append(chr(code))
                    self.index += length
                else:
                    if ch not in self.QUOTE_REPLACES:
                        raise CanonicalError("invalid escape code")
                    chunks.append(self.QUOTE_REPLACES[ch])
                start = self.index
            elif self.data[self.index] == '\n':
                chunks.append(self.data[start:self.index])
                chunks.append(' ')
                self.index += 1
                start = self.index
                ignore_spaces = True
            elif ignore_spaces and self.data[self.index] == ' ':
                self.index += 1
                start = self.index
            else:
                ignore_spaces = False
                self.index += 1
        chunks.append(self.data[start:self.index])
        self.index += 1
        return yaml.ScalarToken(''.join(chunks), False, None, None)

    def find_token(self):
        found = False
        while not found:
            while self.data[self.index] in ' \t':
                self.index += 1
            if self.data[self.index] == '#':
                while self.data[self.index] != '\n':
                    self.index += 1
            if self.data[self.index] == '\n':
                self.index += 1
            else:
                found = True

class CanonicalParser:

    def __init__(self):
        self.events = []
        self.parsed = False

    def dispose(self):
        pass

    # stream: STREAM-START document* STREAM-END
    def parse_stream(self):
        self.get_token(yaml.StreamStartToken)
        self.events.append(yaml.StreamStartEvent(None, None))
        while not self.check_token(yaml.StreamEndToken):
            if self.check_token(yaml.DirectiveToken, yaml.DocumentStartToken):
                self.parse_document()
            else:
                raise CanonicalError("document is expected, got "+repr(self.tokens[0]))
        self.get_token(yaml.StreamEndToken)
        self.events.append(yaml.StreamEndEvent(None, None))

    # document: DIRECTIVE? DOCUMENT-START node
    def parse_document(self):
        node = None
        if self.check_token(yaml.DirectiveToken):
            self.get_token(yaml.DirectiveToken)
        self.get_token(yaml.DocumentStartToken)
        self.events.append(yaml.DocumentStartEvent(None, None))
        self.parse_node()
        self.events.append(yaml.DocumentEndEvent(None, None))

    # node: ALIAS | ANCHOR? TAG? (SCALAR|sequence|mapping)
    def parse_node(self):
        if self.check_token(yaml.AliasToken):
            self.events.append(yaml.AliasEvent(self.get_token_value(), None, None))
        else:
            anchor = None
            if self.check_token(yaml.AnchorToken):
                anchor = self.get_token_value()
            tag = None
            if self.check_token(yaml.TagToken):
                tag = self.get_token_value()
            if self.check_token(yaml.ScalarToken):
                self.events.append(yaml.ScalarEvent(anchor, tag, (False, False),
                        self.get_token_value(), None, None))
            elif self.check_token(yaml.FlowSequenceStartToken):
                self.events.append(yaml.SequenceStartEvent(anchor, tag, None, None))
                self.parse_sequence()
            elif self.check_token(yaml.FlowMappingStartToken):
                self.events.append(yaml.MappingStartEvent(anchor, tag, None, None))
                self.parse_mapping()
            else:
                raise CanonicalError("SCALAR, '[', or '{' is expected, got "+repr(self.tokens[0]))

    # sequence: SEQUENCE-START (node (ENTRY node)*)? ENTRY? SEQUENCE-END
    def parse_sequence(self):
        self.get_token(yaml.FlowSequenceStartToken)
        if not self.check_token(yaml.FlowSequenceEndToken):
            self.parse_node()
            while not self.check_token(yaml.FlowSequenceEndToken):
                self.get_token(yaml.FlowEntryToken)
                if not self.check_token(yaml.FlowSequenceEndToken):
                    self.parse_node()
        self.get_token(yaml.FlowSequenceEndToken)
        self.events.append(yaml.SequenceEndEvent(None, None))

    # mapping: MAPPING-START (map_entry (ENTRY map_entry)*)? ENTRY? MAPPING-END
    def parse_mapping(self):
        self.get_token(yaml.FlowMappingStartToken)
        if not self.check_token(yaml.FlowMappingEndToken):
            self.parse_map_entry()
            while not self.check_token(yaml.FlowMappingEndToken):
                self.get_token(yaml.FlowEntryToken)
                if not self.check_token(yaml.FlowMappingEndToken):
                    self.parse_map_entry()
        self.get_token(yaml.FlowMappingEndToken)
        self.events.append(yaml.MappingEndEvent(None, None))

    # map_entry: KEY node VALUE node
    def parse_map_entry(self):
        self.get_token(yaml.KeyToken)
        self.parse_node()
        self.get_token(yaml.ValueToken)
        self.parse_node()

    def parse(self):
        self.parse_stream()
        self.parsed = True

    def get_event(self):
        if not self.parsed:
            self.parse()
        return self.events.pop(0)

    def check_event(self, *choices):
        if not self.parsed:
            self.parse()
        if self.events:
            if not choices:
                return True
            for choice in choices:
                if isinstance(self.events[0], choice):
                    return True
        return False

    def peek_event(self):
        if not self.parsed:
            self.parse()
        return self.events[0]

class CanonicalLoader(CanonicalScanner, CanonicalParser,
        yaml.composer.Composer, yaml.constructor.Constructor, yaml.resolver.Resolver):

    def __init__(self, stream):
        if hasattr(stream, 'read'):
            stream = stream.read()
        CanonicalScanner.__init__(self, stream)
        CanonicalParser.__init__(self)
        yaml.composer.Composer.__init__(self)
        yaml.constructor.Constructor.__init__(self)
        yaml.resolver.Resolver.__init__(self)

yaml.CanonicalLoader = CanonicalLoader

def canonical_scan(stream):
    return yaml.scan(stream, Loader=CanonicalLoader)
yaml.canonical_scan = canonical_scan

def canonical_parse(stream):
    return yaml.parse(stream, Loader=CanonicalLoader)
yaml.canonical_parse = canonical_parse

def canonical_compose(stream):
    return yaml.compose(stream, Loader=CanonicalLoader)
yaml.canonical_compose = canonical_compose

def canonical_compose_all(stream):
    return yaml.compose_all(stream, Loader=CanonicalLoader)
yaml.canonical_compose_all = canonical_compose_all

def canonical_load(stream):
    return yaml.load(stream, Loader=CanonicalLoader)
yaml.canonical_load = canonical_load

def canonical_load_all(stream):
    return yaml.load_all(stream, Loader=CanonicalLoader)
yaml.canonical_load_all = canonical_load_all

==> PyYAML-6.0.1/tests/lib/test_all.py <==
import sys, yaml, test_appliance

def main(args=None):
    collections = []
    import test_yaml
    collections.append(test_yaml)
    if yaml.__with_libyaml__:
        import test_yaml_ext
        collections.append(test_yaml_ext)
    return test_appliance.run(collections, args)

if __name__ == '__main__':
    main()

==> PyYAML-6.0.1/tests/lib/test_appliance.py <==
import sys, os, os.path, types, traceback, pprint

DATA = 'tests/data'

def find_test_functions(collections):
    if not isinstance(collections, list):
        collections = [collections]
    functions = []
    for collection in collections:
        if not isinstance(collection, dict):
            collection = vars(collection)
        for key in sorted(collection):
            value = collection[key]
            if isinstance(value, types.FunctionType) and hasattr(value, 'unittest'):
                functions.append(value)
    return functions

def find_test_filenames(directory):
    filenames = {}
    for filename in os.listdir(directory):
        if os.path.isfile(os.path.join(directory, filename)):
            base, ext = os.path.splitext(filename)
            if base.endswith('-py2'):
                continue
            filenames.setdefault(base, []).append(ext)
    filenames = sorted(filenames.items())
    return filenames

def parse_arguments(args):
    if args is None:
        args = sys.argv[1:]
    verbose = False
    if '-v' in args:
        verbose = True
        args.remove('-v')
    if '--verbose' in args:
        verbose = True
        args.remove('--verbose')
    if 'YAML_TEST_VERBOSE' in os.environ:
        verbose = True
    include_functions = []
    if args:
        include_functions.append(args.pop(0))
    if 'YAML_TEST_FUNCTIONS' in os.environ:
        include_functions.extend(os.environ['YAML_TEST_FUNCTIONS'].split())
    include_filenames = []
    include_filenames.extend(args)
    if 'YAML_TEST_FILENAMES' in os.environ:
        include_filenames.extend(os.environ['YAML_TEST_FILENAMES'].split())
    return include_functions, include_filenames, verbose

def execute(function, filenames, verbose):
    name = function.__name__
    if verbose:
        sys.stdout.write('='*75+'\n')
        sys.stdout.write('%s(%s)...\n' % (name, ', '.join(filenames)))
    try:
        function(verbose=verbose, *filenames)
    except Exception as exc:
        info = sys.exc_info()
        if isinstance(exc, AssertionError):
            kind = 'FAILURE'
        else:
            kind = 'ERROR'
        if verbose:
            traceback.print_exc(limit=1, file=sys.stdout)
        else:
            sys.stdout.write(kind[0])
            sys.stdout.flush()
    else:
        kind = 'SUCCESS'
        info = None
        if not verbose:
            sys.stdout.write('.')
            sys.stdout.flush()
    return (name, filenames, kind, info)

def display(results, verbose):
    if results and not verbose:
        sys.stdout.write('\n')
    total = len(results)
    failures = 0
    errors = 0
    for name, filenames, kind, info in results:
        if kind == 'SUCCESS':
            continue
        if kind == 'FAILURE':
            failures += 1
        if kind == 'ERROR':
            errors += 1
        sys.stdout.write('='*75+'\n')
        sys.stdout.write('%s(%s): %s\n' % (name, ', '.join(filenames), kind))
        if kind == 'ERROR':
            traceback.print_exception(file=sys.stdout, *info)
        else:
            sys.stdout.write('Traceback (most recent call last):\n')
            traceback.print_tb(info[2], file=sys.stdout)
            sys.stdout.write('%s: see below\n' % info[0].__name__)
            sys.stdout.write('~'*75+'\n')
            for arg in info[1].args:
                pprint.pprint(arg, stream=sys.stdout)
        for filename in filenames:
            sys.stdout.write('-'*75+'\n')
            sys.stdout.write('%s:\n' % filename)
            with open(filename, 'r', errors='replace') as file:
                data = file.read()
            sys.stdout.write(data)
            if data and data[-1] != '\n':
                sys.stdout.write('\n')
    sys.stdout.write('='*75+'\n')
    sys.stdout.write('TESTS: %s\n' % total)
    if failures:
        sys.stdout.write('FAILURES: %s\n' % failures)
    if errors:
        sys.stdout.write('ERRORS: %s\n' % errors)
    return not (failures or errors)

def run(collections, args=None):
    test_functions = find_test_functions(collections)
    test_filenames = find_test_filenames(DATA)
    include_functions, include_filenames, verbose = parse_arguments(args)
    results = []
    for function in test_functions:
        if include_functions and function.__name__ not in include_functions:
            continue
        if function.unittest and function.unittest is not True:
            for base, exts in test_filenames:
                if include_filenames and base not in include_filenames:
                    continue
                filenames = []
                for ext in function.unittest:
                    if ext not in exts:
                        break
                    filenames.append(os.path.join(DATA, base+ext))
                else:
                    skip_exts = getattr(function, 'skip', [])
                    for skip_ext in skip_exts:
                        if skip_ext in exts:
                            break
                    else:
                        result = execute(function, filenames, verbose)
                        results.append(result)
        else:
            result = execute(function, [], verbose)
            results.append(result)
    return display(results, verbose=verbose)

==> PyYAML-6.0.1/tests/lib/test_build.py <==
if __name__ == '__main__':
    import sys, os, distutils.util
    build_lib = 'build/lib'
    build_lib_ext = os.path.join('build',
            'lib.{}-{}.{}'.format(distutils.util.get_platform(), *sys.version_info))
    sys.path.insert(0, build_lib)
    sys.path.insert(0, build_lib_ext)
    import test_yaml, test_appliance
    test_appliance.run(test_yaml)

==> PyYAML-6.0.1/tests/lib/test_build_ext.py <==
if __name__ == '__main__':
    import sys, os, distutils.util
    build_lib = 'build/lib'
    build_lib_ext = os.path.join('build',
            'lib.{}-{}.{}'.format(distutils.util.get_platform(), *sys.version_info))
    sys.path.insert(0, build_lib)
    sys.path.insert(0, build_lib_ext)
    import test_yaml_ext, test_appliance
    test_appliance.run(test_yaml_ext)
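The test harness in test_appliance.py pairs each test function with data files by extension: a function declaring `unittest = ['.data', '.code']` runs once per base name that has both extensions. A hypothetical condensed version of that grouping step (mirroring `find_test_filenames`, but over a plain list of names so it is self-contained):

```python
import os

# Group sibling file names by base name, collecting their extensions,
# so a test declaring unittest = ['.data', '.code'] can be handed the
# matching pair for each base. Mirrors test_appliance.find_test_filenames.
def group_by_base(names):
    grouped = {}
    for name in names:
        base, ext = os.path.splitext(name)
        if base.endswith('-py2'):   # legacy Python 2 fixtures are skipped
            continue
        grouped.setdefault(base, []).append(ext)
    return sorted(grouped.items())
```

For example, `group_by_base(['a.data', 'a.code', 'b.data'])` groups `a`'s two fixtures together while `b` keeps only its `.data` file, so a `['.data', '.code']` test would run for `a` but not `b`.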
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_canonical.py0000644000175100001730000000230514455350511017445 0ustar00runnerdocker import yaml, canonical def test_canonical_scanner(canonical_filename, verbose=False): with open(canonical_filename, 'rb') as file: data = file.read() tokens = list(yaml.canonical_scan(data)) assert tokens, tokens if verbose: for token in tokens: print(token) test_canonical_scanner.unittest = ['.canonical'] def test_canonical_parser(canonical_filename, verbose=False): with open(canonical_filename, 'rb') as file: data = file.read() events = list(yaml.canonical_parse(data)) assert events, events if verbose: for event in events: print(event) test_canonical_parser.unittest = ['.canonical'] def test_canonical_error(data_filename, canonical_filename, verbose=False): with open(data_filename, 'rb') as file: data = file.read() try: output = list(yaml.canonical_load_all(data)) except yaml.YAMLError as exc: if verbose: print(exc) else: raise AssertionError("expected an exception") test_canonical_error.unittest = ['.data', '.canonical'] test_canonical_error.skip = ['.empty'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_constructor.py0000644000175100001730000002373214455350511020112 0ustar00runnerdocker import yaml import pprint import datetime import yaml.tokens # Import any packages here that need to be referenced in .code files. 
import signal def execute(code): global value exec(code) return value def _make_objects(): global MyLoader, MyDumper, MyTestClass1, MyTestClass2, MyTestClass3, YAMLObject1, YAMLObject2, \ AnObject, AnInstance, AState, ACustomState, InitArgs, InitArgsWithState, \ NewArgs, NewArgsWithState, Reduce, ReduceWithState, Slots, MyInt, MyList, MyDict, \ FixedOffset, today, execute, MyFullLoader class MyLoader(yaml.Loader): pass class MyDumper(yaml.Dumper): pass class MyTestClass1: def __init__(self, x, y=0, z=0): self.x = x self.y = y self.z = z def __eq__(self, other): if isinstance(other, MyTestClass1): return self.__class__, self.__dict__ == other.__class__, other.__dict__ else: return False def construct1(constructor, node): mapping = constructor.construct_mapping(node) return MyTestClass1(**mapping) def represent1(representer, native): return representer.represent_mapping("!tag1", native.__dict__) def my_time_constructor(constructor, node): seq = constructor.construct_sequence(node) dt = seq[0] tz = None try: tz = dt.tzinfo.tzname(dt) except: pass return [dt, tz] yaml.add_constructor("!tag1", construct1, Loader=MyLoader) yaml.add_constructor("!MyTime", my_time_constructor, Loader=MyLoader) yaml.add_representer(MyTestClass1, represent1, Dumper=MyDumper) class MyTestClass2(MyTestClass1, yaml.YAMLObject): yaml_loader = MyLoader yaml_dumper = MyDumper yaml_tag = "!tag2" def from_yaml(cls, constructor, node): x = constructor.construct_yaml_int(node) return cls(x=x) from_yaml = classmethod(from_yaml) def to_yaml(cls, representer, native): return representer.represent_scalar(cls.yaml_tag, str(native.x)) to_yaml = classmethod(to_yaml) class MyTestClass3(MyTestClass2): yaml_tag = "!tag3" def from_yaml(cls, constructor, node): mapping = constructor.construct_mapping(node) if '=' in mapping: x = mapping['='] del mapping['='] mapping['x'] = x return cls(**mapping) from_yaml = classmethod(from_yaml) def to_yaml(cls, representer, native): return 
representer.represent_mapping(cls.yaml_tag, native.__dict__) to_yaml = classmethod(to_yaml) class YAMLObject1(yaml.YAMLObject): yaml_loader = MyLoader yaml_dumper = MyDumper yaml_tag = '!foo' def __init__(self, my_parameter=None, my_another_parameter=None): self.my_parameter = my_parameter self.my_another_parameter = my_another_parameter def __eq__(self, other): if isinstance(other, YAMLObject1): return self.__class__, self.__dict__ == other.__class__, other.__dict__ else: return False class YAMLObject2(yaml.YAMLObject): yaml_loader = MyLoader yaml_dumper = MyDumper yaml_tag = '!bar' def __init__(self, foo=1, bar=2, baz=3): self.foo = foo self.bar = bar self.baz = baz def __getstate__(self): return {1: self.foo, 2: self.bar, 3: self.baz} def __setstate__(self, state): self.foo = state[1] self.bar = state[2] self.baz = state[3] def __eq__(self, other): if isinstance(other, YAMLObject2): return self.__class__, self.__dict__ == other.__class__, other.__dict__ else: return False class AnObject: def __new__(cls, foo=None, bar=None, baz=None): self = object.__new__(cls) self.foo = foo self.bar = bar self.baz = baz return self def __cmp__(self, other): return cmp((type(self), self.foo, self.bar, self.baz), (type(other), other.foo, other.bar, other.baz)) def __eq__(self, other): return type(self) is type(other) and \ (self.foo, self.bar, self.baz) == (other.foo, other.bar, other.baz) class AnInstance: def __init__(self, foo=None, bar=None, baz=None): self.foo = foo self.bar = bar self.baz = baz def __cmp__(self, other): return cmp((type(self), self.foo, self.bar, self.baz), (type(other), other.foo, other.bar, other.baz)) def __eq__(self, other): return type(self) is type(other) and \ (self.foo, self.bar, self.baz) == (other.foo, other.bar, other.baz) class AState(AnInstance): def __getstate__(self): return { '_foo': self.foo, '_bar': self.bar, '_baz': self.baz, } def __setstate__(self, state): self.foo = state['_foo'] self.bar = state['_bar'] self.baz = state['_baz'] class 
ACustomState(AnInstance): def __getstate__(self): return (self.foo, self.bar, self.baz) def __setstate__(self, state): self.foo, self.bar, self.baz = state class NewArgs(AnObject): def __getnewargs__(self): return (self.foo, self.bar, self.baz) def __getstate__(self): return {} class NewArgsWithState(AnObject): def __getnewargs__(self): return (self.foo, self.bar) def __getstate__(self): return self.baz def __setstate__(self, state): self.baz = state InitArgs = NewArgs InitArgsWithState = NewArgsWithState class Reduce(AnObject): def __reduce__(self): return self.__class__, (self.foo, self.bar, self.baz) class ReduceWithState(AnObject): def __reduce__(self): return self.__class__, (self.foo, self.bar), self.baz def __setstate__(self, state): self.baz = state class Slots: __slots__ = ("foo", "bar", "baz") def __init__(self, foo=None, bar=None, baz=None): self.foo = foo self.bar = bar self.baz = baz def __eq__(self, other): return type(self) is type(other) and \ (self.foo, self.bar, self.baz) == (other.foo, other.bar, other.baz) class MyInt(int): def __eq__(self, other): return type(self) is type(other) and int(self) == int(other) class MyList(list): def __init__(self, n=1): self.extend([None]*n) def __eq__(self, other): return type(self) is type(other) and list(self) == list(other) class MyDict(dict): def __init__(self, n=1): for k in range(n): self[k] = None def __eq__(self, other): return type(self) is type(other) and dict(self) == dict(other) class FixedOffset(datetime.tzinfo): def __init__(self, offset, name): self.__offset = datetime.timedelta(minutes=offset) self.__name = name def utcoffset(self, dt): return self.__offset def tzname(self, dt): return self.__name def dst(self, dt): return datetime.timedelta(0) class MyFullLoader(yaml.FullLoader): def get_state_keys_blacklist(self): return super().get_state_keys_blacklist() + ['^mymethod$', '^wrong_.*$'] today = datetime.date.today() def _load_code(expression): return eval(expression) def _serialize_value(data): 
if isinstance(data, list): return '[%s]' % ', '.join(map(_serialize_value, data)) elif isinstance(data, dict): items = [] for key, value in data.items(): key = _serialize_value(key) value = _serialize_value(value) items.append("%s: %s" % (key, value)) items.sort() return '{%s}' % ', '.join(items) elif isinstance(data, datetime.datetime): return repr(data.utctimetuple()) elif isinstance(data, float) and data != data: return '?' else: return str(data) def test_constructor_types(data_filename, code_filename, verbose=False): _make_objects() native1 = None native2 = None try: with open(data_filename, 'rb') as file: native1 = list(yaml.load_all(file, Loader=MyLoader)) if len(native1) == 1: native1 = native1[0] with open(code_filename, 'rb') as file: native2 = _load_code(file.read()) try: if native1 == native2: return except TypeError: pass if verbose: print("SERIALIZED NATIVE1:") print(_serialize_value(native1)) print("SERIALIZED NATIVE2:") print(_serialize_value(native2)) assert _serialize_value(native1) == _serialize_value(native2), (native1, native2) finally: if verbose: print("NATIVE1:") pprint.pprint(native1) print("NATIVE2:") pprint.pprint(native2) test_constructor_types.unittest = ['.data', '.code'] def test_subclass_blacklist_types(data_filename, verbose=False): _make_objects() try: with open(data_filename, 'rb') as file: yaml.load(file.read(), MyFullLoader) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_subclass_blacklist_types.unittest = ['.subclass_blacklist'] if __name__ == '__main__': import sys, test_constructor sys.modules['test_constructor'] = sys.modules['__main__'] import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_dump_load.py0000644000175100001730000000113714455350511017464 0ustar00runnerdockerimport yaml def test_dump(verbose=False): 
    assert yaml.dump(['foo'])
test_dump.unittest = True

def test_load_no_loader(verbose=False):
    try:
        yaml.load("- foo\n")
    except TypeError:
        return True
    # `assert(False, "...")` is always true (a non-empty tuple), so raise explicitly.
    raise AssertionError("load() requires Loader=...")
test_load_no_loader.unittest = True

def test_load_safeloader(verbose=False):
    assert yaml.load("- foo\n", Loader=yaml.SafeLoader)
test_load_safeloader.unittest = True

if __name__ == '__main__':
    import sys, test_load
    sys.modules['test_load'] = sys.modules['__main__']
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_emitter.py

import yaml

def _compare_events(events1, events2):
    assert len(events1) == len(events2), (events1, events2)
    for event1, event2 in zip(events1, events2):
        assert event1.__class__ == event2.__class__, (event1, event2)
        if isinstance(event1, yaml.NodeEvent):
            assert event1.anchor == event2.anchor, (event1, event2)
        if isinstance(event1, yaml.CollectionStartEvent):
            assert event1.tag == event2.tag, (event1, event2)
        if isinstance(event1, yaml.ScalarEvent):
            if True not in event1.implicit+event2.implicit:
                assert event1.tag == event2.tag, (event1, event2)
            assert event1.value == event2.value, (event1, event2)

def test_emitter_on_data(data_filename, canonical_filename, verbose=False):
    with open(data_filename, 'rb') as file:
        events = list(yaml.parse(file))
    output = yaml.emit(events)
    if verbose:
        print("OUTPUT:")
        print(output)
    new_events = list(yaml.parse(output))
    _compare_events(events, new_events)
test_emitter_on_data.unittest = ['.data', '.canonical']

def test_emitter_on_canonical(canonical_filename, verbose=False):
    with open(canonical_filename, 'rb') as file:
        events = list(yaml.parse(file))
    for canonical in [False, True]:
        output = yaml.emit(events, canonical=canonical)
        if verbose:
            print("OUTPUT (canonical=%s):" % canonical)
            print(output)
        new_events = list(yaml.parse(output))
_compare_events(events, new_events) test_emitter_on_canonical.unittest = ['.canonical'] def test_emitter_styles(data_filename, canonical_filename, verbose=False): for filename in [data_filename, canonical_filename]: with open(filename, 'rb') as file: events = list(yaml.parse(file)) for flow_style in [False, True]: for style in ['|', '>', '"', '\'', '']: styled_events = [] for event in events: if isinstance(event, yaml.ScalarEvent): event = yaml.ScalarEvent(event.anchor, event.tag, event.implicit, event.value, style=style) elif isinstance(event, yaml.SequenceStartEvent): event = yaml.SequenceStartEvent(event.anchor, event.tag, event.implicit, flow_style=flow_style) elif isinstance(event, yaml.MappingStartEvent): event = yaml.MappingStartEvent(event.anchor, event.tag, event.implicit, flow_style=flow_style) styled_events.append(event) output = yaml.emit(styled_events) if verbose: print("OUTPUT (filename=%r, flow_style=%r, style=%r)" % (filename, flow_style, style)) print(output) new_events = list(yaml.parse(output)) _compare_events(events, new_events) test_emitter_styles.unittest = ['.data', '.canonical'] class EventsLoader(yaml.Loader): def construct_event(self, node): if isinstance(node, yaml.ScalarNode): mapping = {} else: mapping = self.construct_mapping(node) class_name = str(node.tag[1:])+'Event' if class_name in ['AliasEvent', 'ScalarEvent', 'SequenceStartEvent', 'MappingStartEvent']: mapping.setdefault('anchor', None) if class_name in ['ScalarEvent', 'SequenceStartEvent', 'MappingStartEvent']: mapping.setdefault('tag', None) if class_name in ['SequenceStartEvent', 'MappingStartEvent']: mapping.setdefault('implicit', True) if class_name == 'ScalarEvent': mapping.setdefault('implicit', (False, True)) mapping.setdefault('value', '') value = getattr(yaml, class_name)(**mapping) return value EventsLoader.add_constructor(None, EventsLoader.construct_event) def test_emitter_events(events_filename, verbose=False): with open(events_filename, 'rb') as file: events = 
list(yaml.load(file, Loader=EventsLoader)) output = yaml.emit(events) if verbose: print("OUTPUT:") print(output) new_events = list(yaml.parse(output)) _compare_events(events, new_events) if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_errors.py0000644000175100001730000000420614455350511017034 0ustar00runnerdocker import yaml, test_emitter def test_loader_error(error_filename, verbose=False): try: with open(error_filename, 'rb') as file: list(yaml.load_all(file, yaml.FullLoader)) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_loader_error.unittest = ['.loader-error'] def test_loader_error_string(error_filename, verbose=False): try: with open(error_filename, 'rb') as file: list(yaml.load_all(file.read(), yaml.FullLoader)) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_loader_error_string.unittest = ['.loader-error'] def test_loader_error_single(error_filename, verbose=False): try: with open(error_filename, 'rb') as file: yaml.load(file.read(), yaml.FullLoader) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_loader_error_single.unittest = ['.single-loader-error'] def test_emitter_error(error_filename, verbose=False): with open(error_filename, 'rb') as file: events = list(yaml.load(file, Loader=test_emitter.EventsLoader)) try: yaml.emit(events) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_emitter_error.unittest = ['.emitter-error'] def test_dumper_error(error_filename, verbose=False): with open(error_filename, 'rb') as file: code = file.read() 
try: import yaml from io import StringIO exec(code) except yaml.YAMLError as exc: if verbose: print("%s:" % exc.__class__.__name__, exc) else: raise AssertionError("expected an exception") test_dumper_error.unittest = ['.dumper-error'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_input_output.py0000644000175100001730000001270414455350511020301 0ustar00runnerdocker import yaml import codecs, io, tempfile, os, os.path def test_unicode_input(unicode_filename, verbose=False): with open(unicode_filename, 'rb') as file: data = file.read().decode('utf-8') value = ' '.join(data.split()) output = yaml.full_load(data) assert output == value, (output, value) output = yaml.full_load(io.StringIO(data)) assert output == value, (output, value) for input in [data.encode('utf-8'), codecs.BOM_UTF8+data.encode('utf-8'), codecs.BOM_UTF16_BE+data.encode('utf-16-be'), codecs.BOM_UTF16_LE+data.encode('utf-16-le')]: if verbose: print("INPUT:", repr(input[:10]), "...") output = yaml.full_load(input) assert output == value, (output, value) output = yaml.full_load(io.BytesIO(input)) assert output == value, (output, value) test_unicode_input.unittest = ['.unicode'] def test_unicode_input_errors(unicode_filename, verbose=False): with open(unicode_filename, 'rb') as file: data = file.read().decode('utf-8') for input in [data.encode('utf-16-be'), data.encode('utf-16-le'), codecs.BOM_UTF8+data.encode('utf-16-be'), codecs.BOM_UTF8+data.encode('utf-16-le')]: try: yaml.full_load(input) except yaml.YAMLError as exc: if verbose: print(exc) else: raise AssertionError("expected an exception") try: yaml.full_load(io.BytesIO(input)) except yaml.YAMLError as exc: if verbose: print(exc) else: raise AssertionError("expected an exception") test_unicode_input_errors.unittest = ['.unicode'] def test_unicode_output(unicode_filename, verbose=False): with 
open(unicode_filename, 'rb') as file: data = file.read().decode('utf-8') value = ' '.join(data.split()) for allow_unicode in [False, True]: data1 = yaml.dump(value, allow_unicode=allow_unicode) for encoding in [None, 'utf-8', 'utf-16-be', 'utf-16-le']: stream = io.StringIO() yaml.dump(value, stream, encoding=encoding, allow_unicode=allow_unicode) data2 = stream.getvalue() data3 = yaml.dump(value, encoding=encoding, allow_unicode=allow_unicode) if encoding is not None: assert isinstance(data3, bytes) data3 = data3.decode(encoding) stream = io.BytesIO() if encoding is None: try: yaml.dump(value, stream, encoding=encoding, allow_unicode=allow_unicode) except TypeError as exc: if verbose: print(exc) data4 = None else: raise AssertionError("expected an exception") else: yaml.dump(value, stream, encoding=encoding, allow_unicode=allow_unicode) data4 = stream.getvalue() if verbose: print("BYTES:", data4[:50]) data4 = data4.decode(encoding) assert isinstance(data1, str), (type(data1), encoding) assert isinstance(data2, str), (type(data2), encoding) test_unicode_output.unittest = ['.unicode'] def test_file_output(unicode_filename, verbose=False): with open(unicode_filename, 'rb') as file: data = file.read().decode('utf-8') handle, filename = tempfile.mkstemp() os.close(handle) try: stream = io.StringIO() yaml.dump(data, stream, allow_unicode=True) data1 = stream.getvalue() stream = io.BytesIO() yaml.dump(data, stream, encoding='utf-16-le', allow_unicode=True) data2 = stream.getvalue().decode('utf-16-le')[1:] with open(filename, 'w', encoding='utf-16-le') as stream: yaml.dump(data, stream, allow_unicode=True) with open(filename, 'r', encoding='utf-16-le') as file: data3 = file.read() with open(filename, 'wb') as stream: yaml.dump(data, stream, encoding='utf-8', allow_unicode=True) with open(filename, 'r', encoding='utf-8') as file: data4 = file.read() assert data1 == data2, (data1, data2) assert data1 == data3, (data1, data3) assert data1 == data4, (data1, data4) finally: if 
os.path.exists(filename):
            os.unlink(filename)
test_file_output.unittest = ['.unicode']

def test_unicode_transfer(unicode_filename, verbose=False):
    with open(unicode_filename, 'rb') as file:
        data = file.read().decode('utf-8')
    for encoding in [None, 'utf-8', 'utf-16-be', 'utf-16-le']:
        input = data
        if encoding is not None:
            input = ('\ufeff'+input).encode(encoding)
        output1 = yaml.emit(yaml.parse(input), allow_unicode=True)
        if encoding is None:
            stream = io.StringIO()
        else:
            stream = io.BytesIO()
        yaml.emit(yaml.parse(input), stream, allow_unicode=True)
        output2 = stream.getvalue()
        assert isinstance(output1, str), (type(output1), encoding)
        if encoding is None:
            assert isinstance(output2, str), (type(output2), encoding)
        else:
            assert isinstance(output2, bytes), (type(output2), encoding)
            output2.decode(encoding)
test_unicode_transfer.unittest = ['.unicode']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_mark.py

import yaml

def test_marks(marks_filename, verbose=False):
    with open(marks_filename, 'r') as file:
        inputs = file.read().split('---\n')[1:]
    for input in inputs:
        index = 0
        line = 0
        column = 0
        while input[index] != '*':
            if input[index] == '\n':
                line += 1
                column = 0
            else:
                column += 1
            index += 1
        mark = yaml.Mark(marks_filename, index, line, column, input, index)
        snippet = mark.get_snippet(indent=2, max_length=79)
        if verbose:
            print(snippet)
        assert isinstance(snippet, str), type(snippet)
        assert snippet.count('\n') == 1, snippet.count('\n')
        data, pointer = snippet.split('\n')
        assert len(data) < 82, len(data)
        assert data[len(pointer)-1] == '*', data[len(pointer)-1]
test_marks.unittest = ['.marks']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())
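test_marks above locates the `*` marker by counting newlines and resetting the column on each one. The same zero-based line/column bookkeeping, as a self-contained sketch (the helper name is made up for illustration):

```python
def line_column_of(text, target):
    """Return (index, line, column) of the first `target` char, all 0-based."""
    line = column = 0
    for index, ch in enumerate(text):
        if ch == target:
            return index, line, column
        if ch == '\n':
            line += 1      # a newline starts the next line...
            column = 0     # ...and resets the column counter
        else:
            column += 1
    raise ValueError("marker not found")

# '*' is at absolute index 5, on the second line (line 1), third column (column 2).
assert line_column_of("ab\ncd*e", '*') == (5, 1, 2)
```

This is exactly the triple (`index`, `line`, `column`) that `yaml.Mark` carries to render error snippets.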
# PyYAML-6.0.1/tests/lib/test_multi_constructor.py

import yaml
import pprint
import sys

def _load_code(expression):
    return eval(expression)

def myconstructor1(constructor, tag, node):
    seq = constructor.construct_sequence(node)
    return {tag: seq}

def myconstructor2(constructor, tag, node):
    seq = constructor.construct_sequence(node)
    # i stays -1 when the tag contains neither '!' nor ':', so the
    # slicing below is skipped and the full tag is used as the key.
    i = -1
    try:
        i = tag.index('!') + 1
    except ValueError:
        try:
            i = tag.rindex(':') + 1
        except ValueError:
            pass
    if i >= 0:
        tag = tag[i:]
    return {tag: seq}

class Multi1(yaml.FullLoader):
    pass
class Multi2(yaml.FullLoader):
    pass

def test_multi_constructor(input_filename, code_filename, verbose=False):
    with open(input_filename, 'rb') as file:
        input = file.read().decode('utf-8')
    with open(code_filename, 'rb') as file:
        native = _load_code(file.read())

    # default multi constructor for ! and !! tags
    Multi1.add_multi_constructor('!', myconstructor1)
    Multi1.add_multi_constructor('tag:yaml.org,2002:', myconstructor1)
    data = yaml.load(input, Loader=Multi1)
    if verbose:
        print('Multi1:')
        print(data)
        print(native)
    assert data == native

    # default multi constructor for all tags
    Multi2.add_multi_constructor(None, myconstructor2)
    data = yaml.load(input, Loader=Multi2)
    if verbose:
        print('Multi2:')
        print(data)
        print(native)
    assert data == native
test_multi_constructor.unittest = ['.multi', '.code']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_reader.py

import yaml.reader

def _run_reader(data, verbose):
    try:
        stream = yaml.reader.Reader(data)
        while stream.peek() != '\0':
            stream.forward()
    except yaml.reader.ReaderError as exc:
        if verbose:
            print(exc)
    else:
        raise AssertionError("expected an exception")

def test_stream_error(error_filename, verbose=False):
    with open(error_filename, 'rb') as file:
        _run_reader(file, verbose)
    with open(error_filename, 'rb') as file:
        _run_reader(file.read(), verbose)
    for encoding in ['utf-8', 'utf-16-le', 'utf-16-be']:
        try:
            with open(error_filename, 'rb') as file:
                data = file.read().decode(encoding)
            break
        except UnicodeDecodeError:
            pass
    else:
        return
    _run_reader(data, verbose)
    with open(error_filename, encoding=encoding) as file:
        _run_reader(file, verbose)
test_stream_error.unittest = ['.stream-error']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_recursive.py

import yaml

class AnInstance:
    def __init__(self, foo, bar):
        self.foo = foo
        self.bar = bar
    def __repr__(self):
        try:
            return "%s(foo=%r, bar=%r)" % (self.__class__.__name__,
                    self.foo, self.bar)
        except RuntimeError:
            return "%s(foo=..., bar=...)" % self.__class__.__name__

class AnInstanceWithState(AnInstance):
    def __getstate__(self):
        return {'attributes': [self.foo, self.bar]}
    def __setstate__(self, state):
        self.foo, self.bar = state['attributes']

def test_recursive(recursive_filename, verbose=False):
    context = globals().copy()
    with open(recursive_filename, 'rb') as file:
        exec(file.read(), context)
    value1 = context['value']
    output1 = None
    value2 = None
    output2 = None
    try:
        output1 = yaml.dump(value1)
        value2 = yaml.unsafe_load(output1)
        output2 = yaml.dump(value2)
        assert output1 == output2, (output1, output2)
    finally:
        if verbose:
            print("VALUE1:", value1)
            print("VALUE2:", value2)
            print("OUTPUT1:")
            print(output1)
            print("OUTPUT2:")
            print(output2)
test_recursive.unittest = ['.recursive']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())
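test_multi_constructor above registers one handler for an entire tag prefix via `add_multi_constructor`; the handler then receives everything after the prefix as `tag_suffix`. A self-contained sketch of that API (the `!wrap-` tag and `DemoLoader` name are made up for illustration):

```python
import yaml

class DemoLoader(yaml.SafeLoader):
    pass

def wrap(loader, tag_suffix, node):
    # tag_suffix is the part of the tag after the registered '!wrap-' prefix.
    return {tag_suffix: loader.construct_sequence(node)}

# One registration covers every tag starting with '!wrap-'.
DemoLoader.add_multi_constructor('!wrap-', wrap)

data = yaml.load('!wrap-colors [red, green]', Loader=DemoLoader)
assert data == {'colors': ['red', 'green']}
```

Registering on a subclass (rather than on `yaml.SafeLoader` itself) keeps the handler from leaking into unrelated loads, which is why the tests above define `Multi1`/`Multi2`.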
PyYAML-6.0.1/tests/lib/test_representer.py0000644000175100001730000000310514455350511020053 0ustar00runnerdocker import yaml import test_constructor import pprint def test_representer_types(code_filename, verbose=False): test_constructor._make_objects() for allow_unicode in [False, True]: for encoding in ['utf-8', 'utf-16-be', 'utf-16-le']: with open(code_filename, 'rb') as file: native1 = test_constructor._load_code(file.read()) native2 = None try: output = yaml.dump(native1, Dumper=test_constructor.MyDumper, allow_unicode=allow_unicode, encoding=encoding) native2 = yaml.load(output, Loader=test_constructor.MyLoader) try: if native1 == native2: continue except TypeError: pass value1 = test_constructor._serialize_value(native1) value2 = test_constructor._serialize_value(native2) if verbose: print("SERIALIZED NATIVE1:") print(value1) print("SERIALIZED NATIVE2:") print(value2) assert value1 == value2, (native1, native2) finally: if verbose: print("NATIVE1:") pprint.pprint(native1) print("NATIVE2:") pprint.pprint(native2) print("OUTPUT:") print(output) test_representer_types.unittest = ['.code'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_resolver.py0000644000175100001730000000664414455350511017371 0ustar00runnerdocker import yaml import pprint def test_implicit_resolver(data_filename, detect_filename, verbose=False): correct_tag = None node = None try: with open(detect_filename, 'r') as file: correct_tag = file.read().strip() with open(data_filename, 'rb') as file: node = yaml.compose(file) assert isinstance(node, yaml.SequenceNode), node for scalar in node.value: assert isinstance(scalar, yaml.ScalarNode), scalar assert scalar.tag == correct_tag, (scalar.tag, correct_tag) finally: if verbose: print("CORRECT TAG:", correct_tag) if hasattr(node, 'value'): print("CHILDREN:") pprint.pprint(node.value) 
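test_implicit_resolver checks that each composed scalar picked up the tag recorded in the `.detect` file. The same implicit resolution can be observed directly on a composed node graph (a sketch, assuming the default loader's resolver):

```python
import yaml

# Each flow-sequence item resolves to a different implicit tag.
node = yaml.compose('[42, 4.2, true, abc]')
tags = [scalar.tag.rsplit(':', 1)[1] for scalar in node.value]
assert tags == ['int', 'float', 'bool', 'str']
```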
test_implicit_resolver.unittest = ['.data', '.detect'] def _make_path_loader_and_dumper(): global MyLoader, MyDumper class MyLoader(yaml.Loader): pass class MyDumper(yaml.Dumper): pass yaml.add_path_resolver('!root', [], Loader=MyLoader, Dumper=MyDumper) yaml.add_path_resolver('!root/scalar', [], str, Loader=MyLoader, Dumper=MyDumper) yaml.add_path_resolver('!root/key11/key12/*', ['key11', 'key12'], Loader=MyLoader, Dumper=MyDumper) yaml.add_path_resolver('!root/key21/1/*', ['key21', 1], Loader=MyLoader, Dumper=MyDumper) yaml.add_path_resolver('!root/key31/*/*/key14/map', ['key31', None, None, 'key14'], dict, Loader=MyLoader, Dumper=MyDumper) return MyLoader, MyDumper def _convert_node(node): if isinstance(node, yaml.ScalarNode): return (node.tag, node.value) elif isinstance(node, yaml.SequenceNode): value = [] for item in node.value: value.append(_convert_node(item)) return (node.tag, value) elif isinstance(node, yaml.MappingNode): value = [] for key, item in node.value: value.append((_convert_node(key), _convert_node(item))) return (node.tag, value) def test_path_resolver_loader(data_filename, path_filename, verbose=False): _make_path_loader_and_dumper() with open(data_filename, 'rb') as file: nodes1 = list(yaml.compose_all(file.read(), Loader=MyLoader)) with open(path_filename, 'rb') as file: nodes2 = list(yaml.compose_all(file.read())) try: for node1, node2 in zip(nodes1, nodes2): data1 = _convert_node(node1) data2 = _convert_node(node2) assert data1 == data2, (data1, data2) finally: if verbose: print(yaml.serialize_all(nodes1)) test_path_resolver_loader.unittest = ['.data', '.path'] def test_path_resolver_dumper(data_filename, path_filename, verbose=False): _make_path_loader_and_dumper() for filename in [data_filename, path_filename]: with open(filename, 'rb') as file: output = yaml.serialize_all(yaml.compose_all(file), Dumper=MyDumper) if verbose: print(output) nodes1 = yaml.compose_all(output) with open(data_filename, 'rb') as file: nodes2 = 
yaml.compose_all(file) for node1, node2 in zip(nodes1, nodes2): data1 = _convert_node(node1) data2 = _convert_node(node2) assert data1 == data2, (data1, data2) test_path_resolver_dumper.unittest = ['.data', '.path'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_schema.py0000644000175100001730000000757014455350511016767 0ustar00runnerdockerimport yaml import sys import pprint import math def check_bool(value, expected): if expected == 'false()' and value is False: return 1 if expected == 'true()' and value is True: return 1 print(value) print(expected) return 0 def check_int(value, expected): if (int(expected) == value): return 1 print(value) print(expected) return 0 def check_float(value, expected): if expected == 'inf()': if value == math.inf: return 1 elif expected == 'inf-neg()': if value == -math.inf: return 1 elif expected == 'nan()': if math.isnan(value): return 1 elif (float(expected) == value): return 1 else: print(value) print(expected) return 0 def check_str(value, expected): if value == expected: return 1 print(value) print(expected) return 0 def _fail(input, test): print("Input: >>" + input + "<<") print(test) # The tests/data/yaml11.schema file is copied from # https://github.com/perlpunk/yaml-test-schema/blob/master/data/schema-yaml11.yaml def test_implicit_resolver(data_filename, skip_filename, verbose=False): types = { 'str': [str, check_str], 'int': [int, check_int], 'float': [float, check_float], 'inf': [float, check_float], 'nan': [float, check_float], 'bool': [bool, check_bool], } with open(skip_filename, 'rb') as file: skipdata = yaml.load(file, Loader=yaml.SafeLoader) skip_load = skipdata['load'] skip_dump = skipdata['dump'] if verbose: print(skip_load) with open(data_filename, 'rb') as file: tests = yaml.load(file, Loader=yaml.SafeLoader) i = 0 fail = 0 for i, (input, test) in 
enumerate(sorted(tests.items())): if verbose: print('-------------------- ' + str(i)) # Skip known loader bugs if input in skip_load: continue exp_type = test[0] data = test[1] exp_dump = test[2] # Test loading try: loaded = yaml.safe_load(input) except: print("Error:", sys.exc_info()[0], '(', sys.exc_info()[1], ')') fail+=1 _fail(input, test) continue if verbose: print(input) print(test) print(loaded) print(type(loaded)) if exp_type == 'null': if loaded is None: pass else: fail+=1 _fail(input, test) else: t = types[exp_type][0] code = types[exp_type][1] if isinstance(loaded, t): if code(loaded, data): pass else: fail+=1 _fail(input, test) else: fail+=1 _fail(input, test) # Skip known dumper bugs if input in skip_dump: continue dump = yaml.safe_dump(loaded, explicit_end=False) # strip trailing newlines and footers if dump.endswith('\n...\n'): dump = dump[:-5] if dump.endswith('\n'): dump = dump[:-1] if dump == exp_dump: pass else: print("Compare: >>" + dump + "<< >>" + exp_dump + "<<") fail+=1 _fail(input, test) # if i >= 80: # break if fail > 0: print("Failed " + str(fail) + " / " + str(i) + " tests") assert(False) else: print("Passed " + str(i) + " tests") print("Skipped " + str(len(skip_load)) + " load tests") print("Skipped " + str(len(skip_dump)) + " dump tests") test_implicit_resolver.unittest = ['.schema', '.schema-skip'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_sort_keys.py0000644000175100001730000000173614455350511017547 0ustar00runnerdockerimport yaml import pprint import sys def test_sort_keys(input_filename, sorted_filename, verbose=False): with open(input_filename, 'rb') as file: input = file.read().decode('utf-8') with open(sorted_filename, 'rb') as file: sorted = file.read().decode('utf-8') data = yaml.load(input, Loader=yaml.FullLoader) dump_sorted = yaml.dump(data, default_flow_style=False, 
sort_keys=True) dump_unsorted = yaml.dump(data, default_flow_style=False, sort_keys=False) dump_unsorted_safe = yaml.dump(data, default_flow_style=False, sort_keys=False, Dumper=yaml.SafeDumper) if verbose: print("INPUT:") print(input) print("DATA:") print(data) assert dump_sorted == sorted if sys.version_info>=(3,7): assert dump_unsorted == input assert dump_unsorted_safe == input test_sort_keys.unittest = ['.sort', '.sorted'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_structure.py0000644000175100001730000001600114455350511017554 0ustar00runnerdocker import yaml, canonical import pprint def _convert_structure(loader): if loader.check_event(yaml.ScalarEvent): event = loader.get_event() if event.tag or event.anchor or event.value: return True else: return None elif loader.check_event(yaml.SequenceStartEvent): loader.get_event() sequence = [] while not loader.check_event(yaml.SequenceEndEvent): sequence.append(_convert_structure(loader)) loader.get_event() return sequence elif loader.check_event(yaml.MappingStartEvent): loader.get_event() mapping = [] while not loader.check_event(yaml.MappingEndEvent): key = _convert_structure(loader) value = _convert_structure(loader) mapping.append((key, value)) loader.get_event() return mapping elif loader.check_event(yaml.AliasEvent): loader.get_event() return '*' else: loader.get_event() return '?' 
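_convert_structure above walks the event stream that `yaml.parse` produces, consuming start/end events around each collection. For orientation, this is the full event sequence for a one-pair mapping (a sketch, assuming the default parser):

```python
import yaml

events = [event.__class__.__name__ for event in yaml.parse('a: 1')]
assert events == ['StreamStartEvent', 'DocumentStartEvent',
                  'MappingStartEvent', 'ScalarEvent', 'ScalarEvent',
                  'MappingEndEvent', 'DocumentEndEvent', 'StreamEndEvent']
```

The stream/document framing events are exactly the ones test_structure skips before handing the loader to `_convert_structure`.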
def test_structure(data_filename, structure_filename, verbose=False): nodes1 = [] with open(structure_filename, 'r') as file: nodes2 = eval(file.read()) try: with open(data_filename, 'rb') as file: loader = yaml.Loader(file) while loader.check_event(): if loader.check_event( yaml.StreamStartEvent, yaml.StreamEndEvent, yaml.DocumentStartEvent, yaml.DocumentEndEvent ): loader.get_event() continue nodes1.append(_convert_structure(loader)) if len(nodes1) == 1: nodes1 = nodes1[0] assert nodes1 == nodes2, (nodes1, nodes2) finally: if verbose: print("NODES1:") pprint.pprint(nodes1) print("NODES2:") pprint.pprint(nodes2) test_structure.unittest = ['.data', '.structure'] def _compare_events(events1, events2, full=False): assert len(events1) == len(events2), (len(events1), len(events2)) for event1, event2 in zip(events1, events2): assert event1.__class__ == event2.__class__, (event1, event2) if isinstance(event1, yaml.AliasEvent) and full: assert event1.anchor == event2.anchor, (event1, event2) if isinstance(event1, (yaml.ScalarEvent, yaml.CollectionStartEvent)): if (event1.tag not in [None, '!'] and event2.tag not in [None, '!']) or full: assert event1.tag == event2.tag, (event1, event2) if isinstance(event1, yaml.ScalarEvent): assert event1.value == event2.value, (event1, event2) def test_parser(data_filename, canonical_filename, verbose=False): events1 = None events2 = None try: with open(data_filename, 'rb') as file: events1 = list(yaml.parse(file)) with open(canonical_filename, 'rb') as file: events2 = list(yaml.canonical_parse(file)) _compare_events(events1, events2) finally: if verbose: print("EVENTS1:") pprint.pprint(events1) print("EVENTS2:") pprint.pprint(events2) test_parser.unittest = ['.data', '.canonical'] def test_parser_on_canonical(canonical_filename, verbose=False): events1 = None events2 = None try: with open(canonical_filename, 'rb') as file: events1 = list(yaml.parse(file)) with open(canonical_filename, 'rb') as file: events2 = 
list(yaml.canonical_parse(file)) _compare_events(events1, events2, full=True) finally: if verbose: print("EVENTS1:") pprint.pprint(events1) print("EVENTS2:") pprint.pprint(events2) test_parser_on_canonical.unittest = ['.canonical'] def _compare_nodes(node1, node2): assert node1.__class__ == node2.__class__, (node1, node2) assert node1.tag == node2.tag, (node1, node2) if isinstance(node1, yaml.ScalarNode): assert node1.value == node2.value, (node1, node2) else: assert len(node1.value) == len(node2.value), (node1, node2) for item1, item2 in zip(node1.value, node2.value): if not isinstance(item1, tuple): item1 = (item1,) item2 = (item2,) for subnode1, subnode2 in zip(item1, item2): _compare_nodes(subnode1, subnode2) def test_composer(data_filename, canonical_filename, verbose=False): nodes1 = None nodes2 = None try: with open(data_filename, 'rb') as file: nodes1 = list(yaml.compose_all(file)) with open(canonical_filename, 'rb') as file: nodes2 = list(yaml.canonical_compose_all(file)) assert len(nodes1) == len(nodes2), (len(nodes1), len(nodes2)) for node1, node2 in zip(nodes1, nodes2): _compare_nodes(node1, node2) finally: if verbose: print("NODES1:") pprint.pprint(nodes1) print("NODES2:") pprint.pprint(nodes2) test_composer.unittest = ['.data', '.canonical'] def _make_loader(): global MyLoader class MyLoader(yaml.Loader): def construct_sequence(self, node): return tuple(yaml.Loader.construct_sequence(self, node)) def construct_mapping(self, node): pairs = self.construct_pairs(node) pairs.sort(key=(lambda i: str(i))) return pairs def construct_undefined(self, node): return self.construct_scalar(node) MyLoader.add_constructor('tag:yaml.org,2002:map', MyLoader.construct_mapping) MyLoader.add_constructor(None, MyLoader.construct_undefined) def _make_canonical_loader(): global MyCanonicalLoader class MyCanonicalLoader(yaml.CanonicalLoader): def construct_sequence(self, node): return tuple(yaml.CanonicalLoader.construct_sequence(self, node)) def construct_mapping(self, 
node): pairs = self.construct_pairs(node) pairs.sort(key=(lambda i: str(i))) return pairs def construct_undefined(self, node): return self.construct_scalar(node) MyCanonicalLoader.add_constructor('tag:yaml.org,2002:map', MyCanonicalLoader.construct_mapping) MyCanonicalLoader.add_constructor(None, MyCanonicalLoader.construct_undefined) def test_constructor(data_filename, canonical_filename, verbose=False): _make_loader() _make_canonical_loader() native1 = None native2 = None try: with open(data_filename, 'rb') as file: native1 = list(yaml.load_all(file, Loader=MyLoader)) with open(canonical_filename, 'rb') as file: native2 = list(yaml.load_all(file, Loader=MyCanonicalLoader)) assert native1 == native2, (native1, native2) finally: if verbose: print("NATIVE1:") pprint.pprint(native1) print("NATIVE2:") pprint.pprint(native2) test_constructor.unittest = ['.data', '.canonical'] if __name__ == '__main__': import test_appliance test_appliance.run(globals()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/tests/lib/test_tokens.py0000644000175100001730000000447214455350511017030 0ustar00runnerdocker import yaml import pprint # Tokens mnemonic: # directive: % # document_start: --- # document_end: ... # alias: * # anchor: & # tag: ! # scalar _ # block_sequence_start: [[ # block_mapping_start: {{ # block_end: ]} # flow_sequence_start: [ # flow_sequence_end: ] # flow_mapping_start: { # flow_mapping_end: } # entry: , # key: ? 
# value: :

_replaces = {
    yaml.DirectiveToken: '%',
    yaml.DocumentStartToken: '---',
    yaml.DocumentEndToken: '...',
    yaml.AliasToken: '*',
    yaml.AnchorToken: '&',
    yaml.TagToken: '!',
    yaml.ScalarToken: '_',
    yaml.BlockSequenceStartToken: '[[',
    yaml.BlockMappingStartToken: '{{',
    yaml.BlockEndToken: ']}',
    yaml.FlowSequenceStartToken: '[',
    yaml.FlowSequenceEndToken: ']',
    yaml.FlowMappingStartToken: '{',
    yaml.FlowMappingEndToken: '}',
    yaml.BlockEntryToken: ',',
    yaml.FlowEntryToken: ',',
    yaml.KeyToken: '?',
    yaml.ValueToken: ':',
}

def test_tokens(data_filename, tokens_filename, verbose=False):
    tokens1 = []
    with open(tokens_filename, 'r') as file:
        tokens2 = file.read().split()
    try:
        with open(data_filename, 'rb') as file:
            for token in yaml.scan(file):
                if not isinstance(token, (yaml.StreamStartToken, yaml.StreamEndToken)):
                    tokens1.append(_replaces[token.__class__])
    finally:
        if verbose:
            print("TOKENS1:", ' '.join(tokens1))
            print("TOKENS2:", ' '.join(tokens2))
    assert len(tokens1) == len(tokens2), (tokens1, tokens2)
    for token1, token2 in zip(tokens1, tokens2):
        assert token1 == token2, (token1, token2)

test_tokens.unittest = ['.data', '.tokens']

def test_scanner(data_filename, canonical_filename, verbose=False):
    for filename in [data_filename, canonical_filename]:
        tokens = []
        try:
            with open(filename, 'rb') as file:
                for token in yaml.scan(file):
                    tokens.append(token.__class__.__name__)
        finally:
            if verbose:
                pprint.pprint(tokens)

test_scanner.unittest = ['.data', '.canonical']

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_yaml.py

from test_dump_load import *
from test_mark import *
from test_reader import *
from test_canonical import *
from test_tokens import *
from test_structure import *
from test_errors import *
from test_resolver import *
from test_constructor import *
from test_emitter import *
from test_representer import *
from test_recursive import *
from test_input_output import *
from test_sort_keys import *
from test_multi_constructor import *
from test_schema import *

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/tests/lib/test_yaml_ext.py

import yaml._yaml, yaml
import types, pprint, tempfile, sys, os

yaml.PyBaseLoader = yaml.BaseLoader
yaml.PySafeLoader = yaml.SafeLoader
yaml.PyLoader = yaml.Loader
yaml.PyBaseDumper = yaml.BaseDumper
yaml.PySafeDumper = yaml.SafeDumper
yaml.PyDumper = yaml.Dumper

old_scan = yaml.scan
def new_scan(stream, Loader=yaml.CLoader):
    return old_scan(stream, Loader)

old_parse = yaml.parse
def new_parse(stream, Loader=yaml.CLoader):
    return old_parse(stream, Loader)

old_compose = yaml.compose
def new_compose(stream, Loader=yaml.CLoader):
    return old_compose(stream, Loader)

old_compose_all = yaml.compose_all
def new_compose_all(stream, Loader=yaml.CLoader):
    return old_compose_all(stream, Loader)

old_load = yaml.load
def new_load(stream, Loader=yaml.CLoader):
    return old_load(stream, Loader)

old_load_all = yaml.load_all
def new_load_all(stream, Loader=yaml.CLoader):
    return old_load_all(stream, Loader)

old_safe_load = yaml.safe_load
def new_safe_load(stream):
    return old_load(stream, yaml.CSafeLoader)

old_safe_load_all = yaml.safe_load_all
def new_safe_load_all(stream):
    return old_load_all(stream, yaml.CSafeLoader)

old_emit = yaml.emit
def new_emit(events, stream=None, Dumper=yaml.CDumper, **kwds):
    return old_emit(events, stream, Dumper, **kwds)

old_serialize = yaml.serialize
def new_serialize(node, stream, Dumper=yaml.CDumper, **kwds):
    return old_serialize(node, stream, Dumper, **kwds)

old_serialize_all = yaml.serialize_all
def new_serialize_all(nodes, stream=None, Dumper=yaml.CDumper, **kwds):
    return old_serialize_all(nodes, stream, Dumper, **kwds)

old_dump = yaml.dump
def new_dump(data, stream=None, Dumper=yaml.CDumper, **kwds):
    return old_dump(data, stream, Dumper, **kwds)

old_dump_all = yaml.dump_all
def new_dump_all(documents, stream=None, Dumper=yaml.CDumper, **kwds):
    return old_dump_all(documents, stream, Dumper, **kwds)

old_safe_dump = yaml.safe_dump
def new_safe_dump(data, stream=None, **kwds):
    return old_dump(data, stream, yaml.CSafeDumper, **kwds)

old_safe_dump_all = yaml.safe_dump_all
def new_safe_dump_all(documents, stream=None, **kwds):
    return old_dump_all(documents, stream, yaml.CSafeDumper, **kwds)

def _set_up():
    yaml.BaseLoader = yaml.CBaseLoader
    yaml.SafeLoader = yaml.CSafeLoader
    yaml.Loader = yaml.CLoader
    yaml.BaseDumper = yaml.CBaseDumper
    yaml.SafeDumper = yaml.CSafeDumper
    yaml.Dumper = yaml.CDumper
    yaml.scan = new_scan
    yaml.parse = new_parse
    yaml.compose = new_compose
    yaml.compose_all = new_compose_all
    yaml.load = new_load
    yaml.load_all = new_load_all
    yaml.safe_load = new_safe_load
    yaml.safe_load_all = new_safe_load_all
    yaml.emit = new_emit
    yaml.serialize = new_serialize
    yaml.serialize_all = new_serialize_all
    yaml.dump = new_dump
    yaml.dump_all = new_dump_all
    yaml.safe_dump = new_safe_dump
    yaml.safe_dump_all = new_safe_dump_all

def _tear_down():
    yaml.BaseLoader = yaml.PyBaseLoader
    yaml.SafeLoader = yaml.PySafeLoader
    yaml.Loader = yaml.PyLoader
    yaml.BaseDumper = yaml.PyBaseDumper
    yaml.SafeDumper = yaml.PySafeDumper
    yaml.Dumper = yaml.PyDumper
    yaml.scan = old_scan
    yaml.parse = old_parse
    yaml.compose = old_compose
    yaml.compose_all = old_compose_all
    yaml.load = old_load
    yaml.load_all = old_load_all
    yaml.safe_load = old_safe_load
    yaml.safe_load_all = old_safe_load_all
    yaml.emit = old_emit
    yaml.serialize = old_serialize
    yaml.serialize_all = old_serialize_all
    yaml.dump = old_dump
    yaml.dump_all = old_dump_all
    yaml.safe_dump = old_safe_dump
    yaml.safe_dump_all = old_safe_dump_all

def test_c_version(verbose=False):
    # Only yaml._yaml is imported at module scope, so the C extension
    # must be referenced as yaml._yaml here (bare _yaml would raise
    # NameError).
    if verbose:
        print(yaml._yaml.get_version())
        print(yaml._yaml.get_version_string())
    assert ("%s.%s.%s" % yaml._yaml.get_version()) == yaml._yaml.get_version_string(), \
        (yaml._yaml.get_version(), yaml._yaml.get_version_string())

def test_deprecate_yaml_module():
    import _yaml
    assert _yaml.__package__ == ''
    assert isinstance(_yaml.get_version(), str)

def _compare_scanners(py_data, c_data, verbose):
    py_tokens = list(yaml.scan(py_data, Loader=yaml.PyLoader))
    c_tokens = []
    try:
        for token in yaml.scan(c_data, Loader=yaml.CLoader):
            c_tokens.append(token)
        assert len(py_tokens) == len(c_tokens), (len(py_tokens), len(c_tokens))
        for py_token, c_token in zip(py_tokens, c_tokens):
            assert py_token.__class__ == c_token.__class__, (py_token, c_token)
            if hasattr(py_token, 'value'):
                assert py_token.value == c_token.value, (py_token, c_token)
            if isinstance(py_token, yaml.StreamEndToken):
                continue
            py_start = (py_token.start_mark.index, py_token.start_mark.line, py_token.start_mark.column)
            py_end = (py_token.end_mark.index, py_token.end_mark.line, py_token.end_mark.column)
            c_start = (c_token.start_mark.index, c_token.start_mark.line, c_token.start_mark.column)
            c_end = (c_token.end_mark.index, c_token.end_mark.line, c_token.end_mark.column)
            assert py_start == c_start, (py_start, c_start)
            assert py_end == c_end, (py_end, c_end)
    finally:
        if verbose:
            print("PY_TOKENS:")
            pprint.pprint(py_tokens)
            print("C_TOKENS:")
            pprint.pprint(c_tokens)

def test_c_scanner(data_filename, canonical_filename, verbose=False):
    with open(data_filename, 'rb') as file1, open(data_filename, 'rb') as file2:
        _compare_scanners(file1, file2, verbose)
    with open(data_filename, 'rb') as file1, open(data_filename, 'rb') as file2:
        _compare_scanners(file1.read(), file2.read(), verbose)
    with open(canonical_filename, 'rb') as file1, open(canonical_filename, 'rb') as file2:
        _compare_scanners(file1, file2, verbose)
    with open(canonical_filename, 'rb') as file1, open(canonical_filename, 'rb') as file2:
        _compare_scanners(file1.read(), file2.read(), verbose)
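# The test harness above selects the LibYAML-backed C classes by
# rebinding names on the yaml module. Outside the test suite, the same
# choice is usually made with PyYAML's documented import-fallback idiom.
# A minimal sketch (the name FastSafeLoader is chosen here for
# illustration; it is not part of the PyYAML API):

```python
import yaml

# Prefer the LibYAML-based CSafeLoader when the C extension is built,
# falling back to the pure-Python SafeLoader otherwise.
try:
    from yaml import CSafeLoader as FastSafeLoader
except ImportError:
    from yaml import SafeLoader as FastSafeLoader

data = yaml.load("a: 1\nb: [2, 3]\n", Loader=FastSafeLoader)
print(data)  # {'a': 1, 'b': [2, 3]}
```

# Both loaders accept the same input and produce the same native
# objects; only the parsing backend differs, which is exactly the
# equivalence the _compare_* helpers in this file assert.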
test_c_scanner.unittest = ['.data', '.canonical']
test_c_scanner.skip = ['.skip-ext']

def _compare_parsers(py_data, c_data, verbose):
    py_events = list(yaml.parse(py_data, Loader=yaml.PyLoader))
    c_events = []
    try:
        for event in yaml.parse(c_data, Loader=yaml.CLoader):
            c_events.append(event)
        assert len(py_events) == len(c_events), (len(py_events), len(c_events))
        for py_event, c_event in zip(py_events, c_events):
            for attribute in ['__class__', 'anchor', 'tag', 'implicit',
                              'value', 'explicit', 'version', 'tags']:
                py_value = getattr(py_event, attribute, None)
                c_value = getattr(c_event, attribute, None)
                assert py_value == c_value, (py_event, c_event, attribute)
    finally:
        if verbose:
            print("PY_EVENTS:")
            pprint.pprint(py_events)
            print("C_EVENTS:")
            pprint.pprint(c_events)

def test_c_parser(data_filename, canonical_filename, verbose=False):
    with open(data_filename, 'rb') as file1, open(data_filename, 'rb') as file2:
        _compare_parsers(file1, file2, verbose)
    with open(data_filename, 'rb') as file1, open(data_filename, 'rb') as file2:
        _compare_parsers(file1.read(), file2.read(), verbose)
    with open(canonical_filename, 'rb') as file1, open(canonical_filename, 'rb') as file2:
        _compare_parsers(file1, file2, verbose)
    with open(canonical_filename, 'rb') as file1, open(canonical_filename, 'rb') as file2:
        _compare_parsers(file1.read(), file2.read(), verbose)

test_c_parser.unittest = ['.data', '.canonical']
test_c_parser.skip = ['.skip-ext']

def _compare_emitters(data, verbose):
    events = list(yaml.parse(data, Loader=yaml.PyLoader))
    c_data = yaml.emit(events, Dumper=yaml.CDumper)
    if verbose:
        print(c_data)
    py_events = list(yaml.parse(c_data, Loader=yaml.PyLoader))
    c_events = list(yaml.parse(c_data, Loader=yaml.CLoader))
    try:
        assert len(events) == len(py_events), (len(events), len(py_events))
        assert len(events) == len(c_events), (len(events), len(c_events))
        for event, py_event, c_event in zip(events, py_events, c_events):
            for attribute in ['__class__', 'anchor', 'tag', 'implicit',
                              'value', 'explicit', 'version', 'tags']:
                value = getattr(event, attribute, None)
                py_value = getattr(py_event, attribute, None)
                c_value = getattr(c_event, attribute, None)
                if attribute == 'tag' and value in [None, '!'] \
                        and py_value in [None, '!'] and c_value in [None, '!']:
                    continue
                if attribute == 'explicit' and (py_value or c_value):
                    continue
                assert value == py_value, (event, py_event, attribute)
                assert value == c_value, (event, c_event, attribute)
    finally:
        if verbose:
            print("EVENTS:")
            pprint.pprint(events)
            print("PY_EVENTS:")
            pprint.pprint(py_events)
            print("C_EVENTS:")
            pprint.pprint(c_events)

def test_c_emitter(data_filename, canonical_filename, verbose=False):
    with open(data_filename, 'rb') as file:
        _compare_emitters(file.read(), verbose)
    with open(canonical_filename, 'rb') as file:
        _compare_emitters(file.read(), verbose)

test_c_emitter.unittest = ['.data', '.canonical']
test_c_emitter.skip = ['.skip-ext']

def test_large_file(verbose=False):
    SIZE_LINE = 24
    SIZE_ITERATION = 0
    SIZE_FILE = 31
    if sys.maxsize <= 2**32:
        return
    if os.environ.get('PYYAML_TEST_GROUP', '') != 'all':
        return
    with tempfile.TemporaryFile() as temp_file:
        for i in range(2**(SIZE_FILE-SIZE_ITERATION-SIZE_LINE) + 1):
            temp_file.write(bytes(('-' + (' ' * (2**SIZE_LINE-4)) + '{}\n')*(2**SIZE_ITERATION), 'utf-8'))
        temp_file.seek(0)
        yaml.load(temp_file, Loader=yaml.CLoader)

test_large_file.unittest = None

def wrap_ext_function(function):
    def wrapper(*args, **kwds):
        _set_up()
        try:
            function(*args, **kwds)
        finally:
            _tear_down()
    wrapper.__name__ = '%s_ext' % function.__name__
    wrapper.unittest = function.unittest
    wrapper.skip = getattr(function, 'skip', [])+['.skip-ext']
    return wrapper

def wrap_ext(collections):
    functions = []
    if not isinstance(collections, list):
        collections = [collections]
    for collection in collections:
        if not isinstance(collection, dict):
            collection = vars(collection)
        for key in sorted(collection):
            value = collection[key]
            if isinstance(value, types.FunctionType) and hasattr(value, 'unittest'):
                functions.append(wrap_ext_function(value))
    for function in functions:
        assert function.__name__ not in globals()
        globals()[function.__name__] = function

import test_tokens, test_structure, test_errors, test_resolver, test_constructor, \
        test_emitter, test_representer, test_recursive, test_input_output
wrap_ext([test_tokens, test_structure, test_errors, test_resolver, test_constructor,
        test_emitter, test_representer, test_recursive, test_input_output])

if __name__ == '__main__':
    import test_appliance
    test_appliance.run(globals())

# PyYAML-6.0.1/yaml/__init__.pxd (empty file)

# PyYAML-6.0.1/yaml/_yaml.h

#include <yaml.h>

#define PyUnicode_FromYamlString(s) PyUnicode_FromString((const char *)(void *)(s))
#define PyBytes_AS_Yaml_STRING(s) ((yaml_char_t *)PyBytes_AS_STRING(s))

#ifdef _MSC_VER /* MS Visual C++ 6.0 */
#if _MSC_VER == 1200
#define PyLong_FromUnsignedLongLong(z) PyInt_FromLong(i)
#endif
#endif

# PyYAML-6.0.1/yaml/_yaml.pxd

cdef extern from "_yaml.h":

    void malloc(int l)
    void memcpy(void *d, void *s, int l)
    int strlen(char *s)
    int PyString_CheckExact(object o)
    int PyUnicode_CheckExact(object o)
    char *PyString_AS_STRING(object o)
    object PyUnicode_FromString(char *u)
    object PyUnicode_DecodeUTF8(char *u, int s, char *e)
    object PyUnicode_AsUTF8String(object o)
    int PY_MAJOR_VERSION
    ctypedef unsigned char yaml_char_t
    object
PyUnicode_FromYamlString(void *u) yaml_char_t *PyBytes_AS_Yaml_STRING(object o) const char *PyBytes_AS_STRING(object o) int PyBytes_CheckExact(object o) int PyBytes_GET_SIZE(object o) object PyBytes_FromStringAndSize(char *v, int l) ctypedef enum: SIZEOF_VOID_P ctypedef enum yaml_encoding_t: YAML_ANY_ENCODING YAML_UTF8_ENCODING YAML_UTF16LE_ENCODING YAML_UTF16BE_ENCODING ctypedef enum yaml_break_t: YAML_ANY_BREAK YAML_CR_BREAK YAML_LN_BREAK YAML_CRLN_BREAK ctypedef enum yaml_error_type_t: YAML_NO_ERROR YAML_MEMORY_ERROR YAML_READER_ERROR YAML_SCANNER_ERROR YAML_PARSER_ERROR YAML_WRITER_ERROR YAML_EMITTER_ERROR ctypedef enum yaml_scalar_style_t: YAML_ANY_SCALAR_STYLE YAML_PLAIN_SCALAR_STYLE YAML_SINGLE_QUOTED_SCALAR_STYLE YAML_DOUBLE_QUOTED_SCALAR_STYLE YAML_LITERAL_SCALAR_STYLE YAML_FOLDED_SCALAR_STYLE ctypedef enum yaml_sequence_style_t: YAML_ANY_SEQUENCE_STYLE YAML_BLOCK_SEQUENCE_STYLE YAML_FLOW_SEQUENCE_STYLE ctypedef enum yaml_mapping_style_t: YAML_ANY_MAPPING_STYLE YAML_BLOCK_MAPPING_STYLE YAML_FLOW_MAPPING_STYLE ctypedef enum yaml_token_type_t: YAML_NO_TOKEN YAML_STREAM_START_TOKEN YAML_STREAM_END_TOKEN YAML_VERSION_DIRECTIVE_TOKEN YAML_TAG_DIRECTIVE_TOKEN YAML_DOCUMENT_START_TOKEN YAML_DOCUMENT_END_TOKEN YAML_BLOCK_SEQUENCE_START_TOKEN YAML_BLOCK_MAPPING_START_TOKEN YAML_BLOCK_END_TOKEN YAML_FLOW_SEQUENCE_START_TOKEN YAML_FLOW_SEQUENCE_END_TOKEN YAML_FLOW_MAPPING_START_TOKEN YAML_FLOW_MAPPING_END_TOKEN YAML_BLOCK_ENTRY_TOKEN YAML_FLOW_ENTRY_TOKEN YAML_KEY_TOKEN YAML_VALUE_TOKEN YAML_ALIAS_TOKEN YAML_ANCHOR_TOKEN YAML_TAG_TOKEN YAML_SCALAR_TOKEN ctypedef enum yaml_event_type_t: YAML_NO_EVENT YAML_STREAM_START_EVENT YAML_STREAM_END_EVENT YAML_DOCUMENT_START_EVENT YAML_DOCUMENT_END_EVENT YAML_ALIAS_EVENT YAML_SCALAR_EVENT YAML_SEQUENCE_START_EVENT YAML_SEQUENCE_END_EVENT YAML_MAPPING_START_EVENT YAML_MAPPING_END_EVENT ctypedef int yaml_read_handler_t(void *data, unsigned char *buffer, size_t size, size_t *size_read) except 0 ctypedef int 
yaml_write_handler_t(void *data, unsigned char *buffer, size_t size) except 0 ctypedef struct yaml_mark_t: size_t index size_t line size_t column ctypedef struct yaml_version_directive_t: int major int minor ctypedef struct yaml_tag_directive_t: yaml_char_t *handle yaml_char_t *prefix ctypedef struct _yaml_token_stream_start_data_t: yaml_encoding_t encoding ctypedef struct _yaml_token_alias_data_t: char *value ctypedef struct _yaml_token_anchor_data_t: char *value ctypedef struct _yaml_token_tag_data_t: char *handle char *suffix ctypedef struct _yaml_token_scalar_data_t: char *value size_t length yaml_scalar_style_t style ctypedef struct _yaml_token_version_directive_data_t: int major int minor ctypedef struct _yaml_token_tag_directive_data_t: char *handle char *prefix ctypedef union _yaml_token_data_t: _yaml_token_stream_start_data_t stream_start _yaml_token_alias_data_t alias _yaml_token_anchor_data_t anchor _yaml_token_tag_data_t tag _yaml_token_scalar_data_t scalar _yaml_token_version_directive_data_t version_directive _yaml_token_tag_directive_data_t tag_directive ctypedef struct yaml_token_t: yaml_token_type_t type _yaml_token_data_t data yaml_mark_t start_mark yaml_mark_t end_mark ctypedef struct _yaml_event_stream_start_data_t: yaml_encoding_t encoding ctypedef struct _yaml_event_document_start_data_tag_directives_t: yaml_tag_directive_t *start yaml_tag_directive_t *end ctypedef struct _yaml_event_document_start_data_t: yaml_version_directive_t *version_directive _yaml_event_document_start_data_tag_directives_t tag_directives int implicit ctypedef struct _yaml_event_document_end_data_t: int implicit ctypedef struct _yaml_event_alias_data_t: char *anchor ctypedef struct _yaml_event_scalar_data_t: char *anchor char *tag char *value size_t length int plain_implicit int quoted_implicit yaml_scalar_style_t style ctypedef struct _yaml_event_sequence_start_data_t: char *anchor char *tag int implicit yaml_sequence_style_t style ctypedef struct 
_yaml_event_mapping_start_data_t: char *anchor char *tag int implicit yaml_mapping_style_t style ctypedef union _yaml_event_data_t: _yaml_event_stream_start_data_t stream_start _yaml_event_document_start_data_t document_start _yaml_event_document_end_data_t document_end _yaml_event_alias_data_t alias _yaml_event_scalar_data_t scalar _yaml_event_sequence_start_data_t sequence_start _yaml_event_mapping_start_data_t mapping_start ctypedef struct yaml_event_t: yaml_event_type_t type _yaml_event_data_t data yaml_mark_t start_mark yaml_mark_t end_mark ctypedef struct yaml_parser_t: yaml_error_type_t error char *problem size_t problem_offset int problem_value yaml_mark_t problem_mark char *context yaml_mark_t context_mark ctypedef struct yaml_emitter_t: yaml_error_type_t error char *problem char *yaml_get_version_string() void yaml_get_version(int *major, int *minor, int *patch) void yaml_token_delete(yaml_token_t *token) int yaml_stream_start_event_initialize(yaml_event_t *event, yaml_encoding_t encoding) int yaml_stream_end_event_initialize(yaml_event_t *event) int yaml_document_start_event_initialize(yaml_event_t *event, yaml_version_directive_t *version_directive, yaml_tag_directive_t *tag_directives_start, yaml_tag_directive_t *tag_directives_end, int implicit) int yaml_document_end_event_initialize(yaml_event_t *event, int implicit) int yaml_alias_event_initialize(yaml_event_t *event, yaml_char_t *anchor) int yaml_scalar_event_initialize(yaml_event_t *event, yaml_char_t *anchor, yaml_char_t *tag, yaml_char_t *value, int length, int plain_implicit, int quoted_implicit, yaml_scalar_style_t style) int yaml_sequence_start_event_initialize(yaml_event_t *event, yaml_char_t *anchor, yaml_char_t *tag, int implicit, yaml_sequence_style_t style) int yaml_sequence_end_event_initialize(yaml_event_t *event) int yaml_mapping_start_event_initialize(yaml_event_t *event, yaml_char_t *anchor, yaml_char_t *tag, int implicit, yaml_mapping_style_t style) int 
yaml_mapping_end_event_initialize(yaml_event_t *event) void yaml_event_delete(yaml_event_t *event) int yaml_parser_initialize(yaml_parser_t *parser) void yaml_parser_delete(yaml_parser_t *parser) void yaml_parser_set_input_string(yaml_parser_t *parser, const unsigned char *input, size_t size) void yaml_parser_set_input(yaml_parser_t *parser, yaml_read_handler_t *handler, void *data) void yaml_parser_set_encoding(yaml_parser_t *parser, yaml_encoding_t encoding) int yaml_parser_scan(yaml_parser_t *parser, yaml_token_t *token) except * int yaml_parser_parse(yaml_parser_t *parser, yaml_event_t *event) except * int yaml_emitter_initialize(yaml_emitter_t *emitter) void yaml_emitter_delete(yaml_emitter_t *emitter) void yaml_emitter_set_output_string(yaml_emitter_t *emitter, char *output, size_t size, size_t *size_written) void yaml_emitter_set_output(yaml_emitter_t *emitter, yaml_write_handler_t *handler, void *data) void yaml_emitter_set_encoding(yaml_emitter_t *emitter, yaml_encoding_t encoding) void yaml_emitter_set_canonical(yaml_emitter_t *emitter, int canonical) void yaml_emitter_set_indent(yaml_emitter_t *emitter, int indent) void yaml_emitter_set_width(yaml_emitter_t *emitter, int width) void yaml_emitter_set_unicode(yaml_emitter_t *emitter, int unicode) void yaml_emitter_set_break(yaml_emitter_t *emitter, yaml_break_t line_break) int yaml_emitter_emit(yaml_emitter_t *emitter, yaml_event_t *event) except * int yaml_emitter_flush(yaml_emitter_t *emitter) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1689637193.0 PyYAML-6.0.1/yaml/_yaml.pyx0000644000175100001730000016346414455350511015020 0ustar00runnerdocker import yaml def get_version_string(): cdef const char *value value = yaml_get_version_string() return PyUnicode_FromString(value) def get_version(): cdef int major, minor, patch yaml_get_version(&major, &minor, &patch) return (major, minor, patch) #Mark = yaml.error.Mark YAMLError = yaml.error.YAMLError ReaderError = 
yaml.reader.ReaderError
ScannerError = yaml.scanner.ScannerError
ParserError = yaml.parser.ParserError
ComposerError = yaml.composer.ComposerError
ConstructorError = yaml.constructor.ConstructorError
EmitterError = yaml.emitter.EmitterError
SerializerError = yaml.serializer.SerializerError
RepresenterError = yaml.representer.RepresenterError

StreamStartToken = yaml.tokens.StreamStartToken
StreamEndToken = yaml.tokens.StreamEndToken
DirectiveToken = yaml.tokens.DirectiveToken
DocumentStartToken = yaml.tokens.DocumentStartToken
DocumentEndToken = yaml.tokens.DocumentEndToken
BlockSequenceStartToken = yaml.tokens.BlockSequenceStartToken
BlockMappingStartToken = yaml.tokens.BlockMappingStartToken
BlockEndToken = yaml.tokens.BlockEndToken
FlowSequenceStartToken = yaml.tokens.FlowSequenceStartToken
FlowMappingStartToken = yaml.tokens.FlowMappingStartToken
FlowSequenceEndToken = yaml.tokens.FlowSequenceEndToken
FlowMappingEndToken = yaml.tokens.FlowMappingEndToken
KeyToken = yaml.tokens.KeyToken
ValueToken = yaml.tokens.ValueToken
BlockEntryToken = yaml.tokens.BlockEntryToken
FlowEntryToken = yaml.tokens.FlowEntryToken
AliasToken = yaml.tokens.AliasToken
AnchorToken = yaml.tokens.AnchorToken
TagToken = yaml.tokens.TagToken
ScalarToken = yaml.tokens.ScalarToken

StreamStartEvent = yaml.events.StreamStartEvent
StreamEndEvent = yaml.events.StreamEndEvent
DocumentStartEvent = yaml.events.DocumentStartEvent
DocumentEndEvent = yaml.events.DocumentEndEvent
AliasEvent = yaml.events.AliasEvent
ScalarEvent = yaml.events.ScalarEvent
SequenceStartEvent = yaml.events.SequenceStartEvent
SequenceEndEvent = yaml.events.SequenceEndEvent
MappingStartEvent = yaml.events.MappingStartEvent
MappingEndEvent = yaml.events.MappingEndEvent

ScalarNode = yaml.nodes.ScalarNode
SequenceNode = yaml.nodes.SequenceNode
MappingNode = yaml.nodes.MappingNode

cdef class Mark:
    cdef readonly object name
    cdef readonly size_t index
    cdef readonly size_t line
    cdef readonly size_t column
    cdef readonly buffer
    cdef
readonly pointer def __init__(self, object name, size_t index, size_t line, size_t column, object buffer, object pointer): self.name = name self.index = index self.line = line self.column = column self.buffer = buffer self.pointer = pointer def get_snippet(self): return None def __str__(self): where = " in \"%s\", line %d, column %d" \ % (self.name, self.line+1, self.column+1) return where #class YAMLError(Exception): # pass # #class MarkedYAMLError(YAMLError): # # def __init__(self, context=None, context_mark=None, # problem=None, problem_mark=None, note=None): # self.context = context # self.context_mark = context_mark # self.problem = problem # self.problem_mark = problem_mark # self.note = note # # def __str__(self): # lines = [] # if self.context is not None: # lines.append(self.context) # if self.context_mark is not None \ # and (self.problem is None or self.problem_mark is None # or self.context_mark.name != self.problem_mark.name # or self.context_mark.line != self.problem_mark.line # or self.context_mark.column != self.problem_mark.column): # lines.append(str(self.context_mark)) # if self.problem is not None: # lines.append(self.problem) # if self.problem_mark is not None: # lines.append(str(self.problem_mark)) # if self.note is not None: # lines.append(self.note) # return '\n'.join(lines) # #class ReaderError(YAMLError): # # def __init__(self, name, position, character, encoding, reason): # self.name = name # self.character = character # self.position = position # self.encoding = encoding # self.reason = reason # # def __str__(self): # if isinstance(self.character, str): # return "'%s' codec can't decode byte #x%02x: %s\n" \ # " in \"%s\", position %d" \ # % (self.encoding, ord(self.character), self.reason, # self.name, self.position) # else: # return "unacceptable character #x%04x: %s\n" \ # " in \"%s\", position %d" \ # % (ord(self.character), self.reason, # self.name, self.position) # #class ScannerError(MarkedYAMLError): # pass # #class 
ParserError(MarkedYAMLError): # pass # #class EmitterError(YAMLError): # pass # #cdef class Token: # cdef readonly Mark start_mark # cdef readonly Mark end_mark # def __init__(self, Mark start_mark, Mark end_mark): # self.start_mark = start_mark # self.end_mark = end_mark # #cdef class StreamStartToken(Token): # cdef readonly object encoding # def __init__(self, Mark start_mark, Mark end_mark, encoding): # self.start_mark = start_mark # self.end_mark = end_mark # self.encoding = encoding # #cdef class StreamEndToken(Token): # pass # #cdef class DirectiveToken(Token): # cdef readonly object name # cdef readonly object value # def __init__(self, name, value, Mark start_mark, Mark end_mark): # self.name = name # self.value = value # self.start_mark = start_mark # self.end_mark = end_mark # #cdef class DocumentStartToken(Token): # pass # #cdef class DocumentEndToken(Token): # pass # #cdef class BlockSequenceStartToken(Token): # pass # #cdef class BlockMappingStartToken(Token): # pass # #cdef class BlockEndToken(Token): # pass # #cdef class FlowSequenceStartToken(Token): # pass # #cdef class FlowMappingStartToken(Token): # pass # #cdef class FlowSequenceEndToken(Token): # pass # #cdef class FlowMappingEndToken(Token): # pass # #cdef class KeyToken(Token): # pass # #cdef class ValueToken(Token): # pass # #cdef class BlockEntryToken(Token): # pass # #cdef class FlowEntryToken(Token): # pass # #cdef class AliasToken(Token): # cdef readonly object value # def __init__(self, value, Mark start_mark, Mark end_mark): # self.value = value # self.start_mark = start_mark # self.end_mark = end_mark # #cdef class AnchorToken(Token): # cdef readonly object value # def __init__(self, value, Mark start_mark, Mark end_mark): # self.value = value # self.start_mark = start_mark # self.end_mark = end_mark # #cdef class TagToken(Token): # cdef readonly object value # def __init__(self, value, Mark start_mark, Mark end_mark): # self.value = value # self.start_mark = start_mark # 
self.end_mark = end_mark # #cdef class ScalarToken(Token): # cdef readonly object value # cdef readonly object plain # cdef readonly object style # def __init__(self, value, plain, Mark start_mark, Mark end_mark, style=None): # self.value = value # self.plain = plain # self.start_mark = start_mark # self.end_mark = end_mark # self.style = style cdef class CParser: cdef yaml_parser_t parser cdef yaml_event_t parsed_event cdef object stream cdef object stream_name cdef object current_token cdef object current_event cdef object anchors cdef object stream_cache cdef int stream_cache_len cdef int stream_cache_pos cdef int unicode_source def __init__(self, stream): cdef is_readable if yaml_parser_initialize(&self.parser) == 0: raise MemoryError self.parsed_event.type = YAML_NO_EVENT is_readable = 1 try: stream.read except AttributeError: is_readable = 0 self.unicode_source = 0 if is_readable: self.stream = stream try: self.stream_name = stream.name except AttributeError: self.stream_name = u'' self.stream_cache = None self.stream_cache_len = 0 self.stream_cache_pos = 0 yaml_parser_set_input(&self.parser, input_handler, self) else: if PyUnicode_CheckExact(stream) != 0: stream = PyUnicode_AsUTF8String(stream) self.stream_name = u'' self.unicode_source = 1 else: self.stream_name = u'' if PyBytes_CheckExact(stream) == 0: raise TypeError(u"a string or stream input is required") self.stream = stream yaml_parser_set_input_string(&self.parser, PyBytes_AS_Yaml_STRING(stream), PyBytes_GET_SIZE(stream)) self.current_token = None self.current_event = None self.anchors = {} def __dealloc__(self): yaml_parser_delete(&self.parser) yaml_event_delete(&self.parsed_event) def dispose(self): pass cdef object _parser_error(self): if self.parser.error == YAML_MEMORY_ERROR: return MemoryError elif self.parser.error == YAML_READER_ERROR: return ReaderError(self.stream_name, self.parser.problem_offset, self.parser.problem_value, u'?', PyUnicode_FromString(self.parser.problem)) elif 
self.parser.error == YAML_SCANNER_ERROR    \
                or self.parser.error == YAML_PARSER_ERROR:
            context_mark = None
            problem_mark = None
            if self.parser.context != NULL:
                context_mark = Mark(self.stream_name,
                        self.parser.context_mark.index,
                        self.parser.context_mark.line,
                        self.parser.context_mark.column, None, None)
            if self.parser.problem != NULL:
                problem_mark = Mark(self.stream_name,
                        self.parser.problem_mark.index,
                        self.parser.problem_mark.line,
                        self.parser.problem_mark.column, None, None)
            context = None
            if self.parser.context != NULL:
                context = PyUnicode_FromString(self.parser.context)
            problem = PyUnicode_FromString(self.parser.problem)
            if self.parser.error == YAML_SCANNER_ERROR:
                return ScannerError(context, context_mark, problem, problem_mark)
            else:
                return ParserError(context, context_mark, problem, problem_mark)
        raise ValueError(u"no parser error")

    def raw_scan(self):
        cdef yaml_token_t token
        cdef int done
        cdef int count
        count = 0
        done = 0
        while done == 0:
            if yaml_parser_scan(&self.parser, &token) == 0:
                error = self._parser_error()
                raise error
            if token.type == YAML_NO_TOKEN:
                done = 1
            else:
                count = count+1
            yaml_token_delete(&token)
        return count

    cdef object _scan(self):
        cdef yaml_token_t token
        if yaml_parser_scan(&self.parser, &token) == 0:
            error = self._parser_error()
            raise error
        token_object = self._token_to_object(&token)
        yaml_token_delete(&token)
        return token_object

    cdef object _token_to_object(self, yaml_token_t *token):
        start_mark = Mark(self.stream_name,
                token.start_mark.index,
                token.start_mark.line,
                token.start_mark.column,
                None, None)
        end_mark = Mark(self.stream_name,
                token.end_mark.index,
                token.end_mark.line,
                token.end_mark.column,
                None, None)
        if token.type == YAML_NO_TOKEN:
            return None
        elif token.type == YAML_STREAM_START_TOKEN:
            encoding = None
            if token.data.stream_start.encoding == YAML_UTF8_ENCODING:
                if self.unicode_source == 0:
                    encoding = u"utf-8"
            elif token.data.stream_start.encoding == YAML_UTF16LE_ENCODING:
                encoding = u"utf-16-le"
            elif token.data.stream_start.encoding == YAML_UTF16BE_ENCODING:
                encoding = u"utf-16-be"
            return StreamStartToken(start_mark, end_mark, encoding)
        elif token.type == YAML_STREAM_END_TOKEN:
            return StreamEndToken(start_mark, end_mark)
        elif token.type == YAML_VERSION_DIRECTIVE_TOKEN:
            return DirectiveToken(u"YAML",
                    (token.data.version_directive.major,
                        token.data.version_directive.minor),
                    start_mark, end_mark)
        elif token.type == YAML_TAG_DIRECTIVE_TOKEN:
            handle = PyUnicode_FromYamlString(token.data.tag_directive.handle)
            prefix = PyUnicode_FromYamlString(token.data.tag_directive.prefix)
            return DirectiveToken(u"TAG", (handle, prefix),
                    start_mark, end_mark)
        elif token.type == YAML_DOCUMENT_START_TOKEN:
            return DocumentStartToken(start_mark, end_mark)
        elif token.type == YAML_DOCUMENT_END_TOKEN:
            return DocumentEndToken(start_mark, end_mark)
        elif token.type == YAML_BLOCK_SEQUENCE_START_TOKEN:
            return BlockSequenceStartToken(start_mark, end_mark)
        elif token.type == YAML_BLOCK_MAPPING_START_TOKEN:
            return BlockMappingStartToken(start_mark, end_mark)
        elif token.type == YAML_BLOCK_END_TOKEN:
            return BlockEndToken(start_mark, end_mark)
        elif token.type == YAML_FLOW_SEQUENCE_START_TOKEN:
            return FlowSequenceStartToken(start_mark, end_mark)
        elif token.type == YAML_FLOW_SEQUENCE_END_TOKEN:
            return FlowSequenceEndToken(start_mark, end_mark)
        elif token.type == YAML_FLOW_MAPPING_START_TOKEN:
            return FlowMappingStartToken(start_mark, end_mark)
        elif token.type == YAML_FLOW_MAPPING_END_TOKEN:
            return FlowMappingEndToken(start_mark, end_mark)
        elif token.type == YAML_BLOCK_ENTRY_TOKEN:
            return BlockEntryToken(start_mark, end_mark)
        elif token.type == YAML_FLOW_ENTRY_TOKEN:
            return FlowEntryToken(start_mark, end_mark)
        elif token.type == YAML_KEY_TOKEN:
            return KeyToken(start_mark, end_mark)
        elif token.type == YAML_VALUE_TOKEN:
            return ValueToken(start_mark, end_mark)
        elif token.type == YAML_ALIAS_TOKEN:
            value = PyUnicode_FromYamlString(token.data.alias.value)
            return AliasToken(value, start_mark, end_mark)
        elif token.type == YAML_ANCHOR_TOKEN:
            value = PyUnicode_FromYamlString(token.data.anchor.value)
            return AnchorToken(value, start_mark, end_mark)
        elif token.type == YAML_TAG_TOKEN:
            handle = PyUnicode_FromYamlString(token.data.tag.handle)
            suffix = PyUnicode_FromYamlString(token.data.tag.suffix)
            if not handle:
                handle = None
            return TagToken((handle, suffix), start_mark, end_mark)
        elif token.type == YAML_SCALAR_TOKEN:
            value = PyUnicode_DecodeUTF8(token.data.scalar.value,
                    token.data.scalar.length, 'strict')
            plain = False
            style = None
            if token.data.scalar.style == YAML_PLAIN_SCALAR_STYLE:
                plain = True
                style = u''
            elif token.data.scalar.style == YAML_SINGLE_QUOTED_SCALAR_STYLE:
                style = u'\''
            elif token.data.scalar.style == YAML_DOUBLE_QUOTED_SCALAR_STYLE:
                style = u'"'
            elif token.data.scalar.style == YAML_LITERAL_SCALAR_STYLE:
                style = u'|'
            elif token.data.scalar.style == YAML_FOLDED_SCALAR_STYLE:
                style = u'>'
            return ScalarToken(value, plain,
                    start_mark, end_mark, style)
        else:
            raise ValueError(u"unknown token type")

    def get_token(self):
        if self.current_token is not None:
            value = self.current_token
            self.current_token = None
        else:
            value = self._scan()
        return value

    def peek_token(self):
        if self.current_token is None:
            self.current_token = self._scan()
        return self.current_token

    def check_token(self, *choices):
        if self.current_token is None:
            self.current_token = self._scan()
        if self.current_token is None:
            return False
        if not choices:
            return True
        token_class = self.current_token.__class__
        for choice in choices:
            if token_class is choice:
                return True
        return False

    def raw_parse(self):
        cdef yaml_event_t event
        cdef int done
        cdef int count
        count = 0
        done = 0
        while done == 0:
            if yaml_parser_parse(&self.parser, &event) == 0:
                error = self._parser_error()
                raise error
            if event.type == YAML_NO_EVENT:
                done = 1
            else:
                count = count+1
            yaml_event_delete(&event)
        return count

    cdef object _parse(self):
        cdef yaml_event_t event
        if yaml_parser_parse(&self.parser, &event) == 0:
            error = self._parser_error()
            raise error
        event_object = self._event_to_object(&event)
        yaml_event_delete(&event)
        return event_object

    cdef object _event_to_object(self, yaml_event_t *event):
        cdef yaml_tag_directive_t *tag_directive
        start_mark = Mark(self.stream_name,
                event.start_mark.index,
                event.start_mark.line,
                event.start_mark.column,
                None, None)
        end_mark = Mark(self.stream_name,
                event.end_mark.index,
                event.end_mark.line,
                event.end_mark.column,
                None, None)
        if event.type == YAML_NO_EVENT:
            return None
        elif event.type == YAML_STREAM_START_EVENT:
            encoding = None
            if event.data.stream_start.encoding == YAML_UTF8_ENCODING:
                if self.unicode_source == 0:
                    encoding = u"utf-8"
            elif event.data.stream_start.encoding == YAML_UTF16LE_ENCODING:
                encoding = u"utf-16-le"
            elif event.data.stream_start.encoding == YAML_UTF16BE_ENCODING:
                encoding = u"utf-16-be"
            return StreamStartEvent(start_mark, end_mark, encoding)
        elif event.type == YAML_STREAM_END_EVENT:
            return StreamEndEvent(start_mark, end_mark)
        elif event.type == YAML_DOCUMENT_START_EVENT:
            explicit = False
            if event.data.document_start.implicit == 0:
                explicit = True
            version = None
            if event.data.document_start.version_directive != NULL:
                version = (event.data.document_start.version_directive.major,
                        event.data.document_start.version_directive.minor)
            tags = None
            if event.data.document_start.tag_directives.start != NULL:
                tags = {}
                tag_directive = event.data.document_start.tag_directives.start
                while tag_directive != event.data.document_start.tag_directives.end:
                    handle = PyUnicode_FromYamlString(tag_directive.handle)
                    prefix = PyUnicode_FromYamlString(tag_directive.prefix)
                    tags[handle] = prefix
                    tag_directive = tag_directive+1
            return DocumentStartEvent(start_mark, end_mark,
                    explicit, version, tags)
        elif event.type == YAML_DOCUMENT_END_EVENT:
            explicit = False
            if event.data.document_end.implicit == 0:
                explicit = True
            return DocumentEndEvent(start_mark, end_mark, explicit)
        elif event.type == YAML_ALIAS_EVENT:
            anchor = PyUnicode_FromYamlString(event.data.alias.anchor)
            return AliasEvent(anchor, start_mark, end_mark)
        elif event.type == YAML_SCALAR_EVENT:
            anchor = None
            if event.data.scalar.anchor != NULL:
                anchor = PyUnicode_FromYamlString(event.data.scalar.anchor)
            tag = None
            if event.data.scalar.tag != NULL:
                tag = PyUnicode_FromYamlString(event.data.scalar.tag)
            value = PyUnicode_DecodeUTF8(event.data.scalar.value,
                    event.data.scalar.length, 'strict')
            plain_implicit = False
            if event.data.scalar.plain_implicit == 1:
                plain_implicit = True
            quoted_implicit = False
            if event.data.scalar.quoted_implicit == 1:
                quoted_implicit = True
            style = None
            if event.data.scalar.style == YAML_PLAIN_SCALAR_STYLE:
                style = u''
            elif event.data.scalar.style == YAML_SINGLE_QUOTED_SCALAR_STYLE:
                style = u'\''
            elif event.data.scalar.style == YAML_DOUBLE_QUOTED_SCALAR_STYLE:
                style = u'"'
            elif event.data.scalar.style == YAML_LITERAL_SCALAR_STYLE:
                style = u'|'
            elif event.data.scalar.style == YAML_FOLDED_SCALAR_STYLE:
                style = u'>'
            return ScalarEvent(anchor, tag,
                    (plain_implicit, quoted_implicit),
                    value, start_mark, end_mark, style)
        elif event.type == YAML_SEQUENCE_START_EVENT:
            anchor = None
            if event.data.sequence_start.anchor != NULL:
                anchor = PyUnicode_FromYamlString(event.data.sequence_start.anchor)
            tag = None
            if event.data.sequence_start.tag != NULL:
                tag = PyUnicode_FromYamlString(event.data.sequence_start.tag)
            implicit = False
            if event.data.sequence_start.implicit == 1:
                implicit = True
            flow_style = None
            if event.data.sequence_start.style == YAML_FLOW_SEQUENCE_STYLE:
                flow_style = True
            elif event.data.sequence_start.style == YAML_BLOCK_SEQUENCE_STYLE:
                flow_style = False
            return SequenceStartEvent(anchor, tag, implicit,
                    start_mark, end_mark, flow_style)
        elif event.type == YAML_MAPPING_START_EVENT:
            anchor = None
            if event.data.mapping_start.anchor != NULL:
                anchor = PyUnicode_FromYamlString(event.data.mapping_start.anchor)
            tag = None
            if event.data.mapping_start.tag != NULL:
                tag = PyUnicode_FromYamlString(event.data.mapping_start.tag)
            implicit = False
            if event.data.mapping_start.implicit == 1:
                implicit = True
            flow_style = None
            if event.data.mapping_start.style == YAML_FLOW_MAPPING_STYLE:
                flow_style = True
            elif event.data.mapping_start.style == YAML_BLOCK_MAPPING_STYLE:
                flow_style = False
            return MappingStartEvent(anchor, tag, implicit,
                    start_mark, end_mark, flow_style)
        elif event.type == YAML_SEQUENCE_END_EVENT:
            return SequenceEndEvent(start_mark, end_mark)
        elif event.type == YAML_MAPPING_END_EVENT:
            return MappingEndEvent(start_mark, end_mark)
        else:
            raise ValueError(u"unknown event type")

    def get_event(self):
        if self.current_event is not None:
            value = self.current_event
            self.current_event = None
        else:
            value = self._parse()
        return value

    def peek_event(self):
        if self.current_event is None:
            self.current_event = self._parse()
        return self.current_event

    def check_event(self, *choices):
        if self.current_event is None:
            self.current_event = self._parse()
        if self.current_event is None:
            return False
        if not choices:
            return True
        event_class = self.current_event.__class__
        for choice in choices:
            if event_class is choice:
                return True
        return False

    def check_node(self):
        self._parse_next_event()
        if self.parsed_event.type == YAML_STREAM_START_EVENT:
            yaml_event_delete(&self.parsed_event)
            self._parse_next_event()
        if self.parsed_event.type != YAML_STREAM_END_EVENT:
            return True
        return False

    def get_node(self):
        self._parse_next_event()
        if self.parsed_event.type != YAML_STREAM_END_EVENT:
            return self._compose_document()

    def get_single_node(self):
        self._parse_next_event()
        yaml_event_delete(&self.parsed_event)
        self._parse_next_event()
        document = None
        if self.parsed_event.type != YAML_STREAM_END_EVENT:
            document = self._compose_document()
        self._parse_next_event()
        if self.parsed_event.type != YAML_STREAM_END_EVENT:
            mark = Mark(self.stream_name,
                    self.parsed_event.start_mark.index,
                    self.parsed_event.start_mark.line,
                    self.parsed_event.start_mark.column,
                    None, None)
            raise ComposerError(u"expected a single document in the stream",
                    document.start_mark, u"but found another document", mark)
        return document

    cdef object _compose_document(self):
        yaml_event_delete(&self.parsed_event)
        node = self._compose_node(None, None)
        self._parse_next_event()
        yaml_event_delete(&self.parsed_event)
        self.anchors = {}
        return node

    cdef object _compose_node(self, object parent, object index):
        self._parse_next_event()
        if self.parsed_event.type == YAML_ALIAS_EVENT:
            anchor = PyUnicode_FromYamlString(self.parsed_event.data.alias.anchor)
            if anchor not in self.anchors:
                mark = Mark(self.stream_name,
                        self.parsed_event.start_mark.index,
                        self.parsed_event.start_mark.line,
                        self.parsed_event.start_mark.column,
                        None, None)
                raise ComposerError(None, None, u"found undefined alias", mark)
            yaml_event_delete(&self.parsed_event)
            return self.anchors[anchor]
        anchor = None
        if self.parsed_event.type == YAML_SCALAR_EVENT  \
                and self.parsed_event.data.scalar.anchor != NULL:
            anchor = PyUnicode_FromYamlString(self.parsed_event.data.scalar.anchor)
        elif self.parsed_event.type == YAML_SEQUENCE_START_EVENT    \
                and self.parsed_event.data.sequence_start.anchor != NULL:
            anchor = PyUnicode_FromYamlString(self.parsed_event.data.sequence_start.anchor)
        elif self.parsed_event.type == YAML_MAPPING_START_EVENT    \
                and self.parsed_event.data.mapping_start.anchor != NULL:
            anchor = PyUnicode_FromYamlString(self.parsed_event.data.mapping_start.anchor)
        if anchor is not None:
            if anchor in self.anchors:
                mark = Mark(self.stream_name,
                        self.parsed_event.start_mark.index,
                        self.parsed_event.start_mark.line,
                        self.parsed_event.start_mark.column,
                        None, None)
                raise ComposerError(u"found duplicate anchor; first occurrence",
                        self.anchors[anchor].start_mark, u"second occurrence", mark)
        self.descend_resolver(parent, index)
        if self.parsed_event.type == YAML_SCALAR_EVENT:
            node = self._compose_scalar_node(anchor)
        elif self.parsed_event.type == YAML_SEQUENCE_START_EVENT:
            node = self._compose_sequence_node(anchor)
        elif self.parsed_event.type == YAML_MAPPING_START_EVENT:
            node = self._compose_mapping_node(anchor)
        self.ascend_resolver()
        return node

    cdef _compose_scalar_node(self, object anchor):
        start_mark = Mark(self.stream_name,
                self.parsed_event.start_mark.index,
                self.parsed_event.start_mark.line,
                self.parsed_event.start_mark.column,
                None, None)
        end_mark = Mark(self.stream_name,
                self.parsed_event.end_mark.index,
                self.parsed_event.end_mark.line,
                self.parsed_event.end_mark.column,
                None, None)
        value = PyUnicode_DecodeUTF8(self.parsed_event.data.scalar.value,
                self.parsed_event.data.scalar.length, 'strict')
        plain_implicit = False
        if self.parsed_event.data.scalar.plain_implicit == 1:
            plain_implicit = True
        quoted_implicit = False
        if self.parsed_event.data.scalar.quoted_implicit == 1:
            quoted_implicit = True
        if self.parsed_event.data.scalar.tag == NULL    \
                or (self.parsed_event.data.scalar.tag[0] == c'!'
                        and self.parsed_event.data.scalar.tag[1] == c'\0'):
            tag = self.resolve(ScalarNode, value, (plain_implicit, quoted_implicit))
        else:
            tag = PyUnicode_FromYamlString(self.parsed_event.data.scalar.tag)
        style = None
        if self.parsed_event.data.scalar.style == YAML_PLAIN_SCALAR_STYLE:
            style = u''
        elif self.parsed_event.data.scalar.style == YAML_SINGLE_QUOTED_SCALAR_STYLE:
            style = u'\''
        elif self.parsed_event.data.scalar.style == YAML_DOUBLE_QUOTED_SCALAR_STYLE:
            style = u'"'
        elif self.parsed_event.data.scalar.style == YAML_LITERAL_SCALAR_STYLE:
            style = u'|'
        elif self.parsed_event.data.scalar.style == YAML_FOLDED_SCALAR_STYLE:
            style = u'>'
        node = ScalarNode(tag, value, start_mark, end_mark, style)
        if anchor is not None:
            self.anchors[anchor] = node
        yaml_event_delete(&self.parsed_event)
        return node

    cdef _compose_sequence_node(self, object anchor):
        cdef int index
        start_mark = Mark(self.stream_name,
                self.parsed_event.start_mark.index,
                self.parsed_event.start_mark.line,
                self.parsed_event.start_mark.column,
                None, None)
        implicit = False
        if self.parsed_event.data.sequence_start.implicit == 1:
            implicit = True
        if self.parsed_event.data.sequence_start.tag == NULL    \
                or (self.parsed_event.data.sequence_start.tag[0] == c'!'
                        and self.parsed_event.data.sequence_start.tag[1] == c'\0'):
            tag = self.resolve(SequenceNode, None, implicit)
        else:
            tag = PyUnicode_FromYamlString(self.parsed_event.data.sequence_start.tag)
        flow_style = None
        if self.parsed_event.data.sequence_start.style == YAML_FLOW_SEQUENCE_STYLE:
            flow_style = True
        elif self.parsed_event.data.sequence_start.style == YAML_BLOCK_SEQUENCE_STYLE:
            flow_style = False
        value = []
        node = SequenceNode(tag, value, start_mark, None, flow_style)
        if anchor is not None:
            self.anchors[anchor] = node
        yaml_event_delete(&self.parsed_event)
        index = 0
        self._parse_next_event()
        while self.parsed_event.type != YAML_SEQUENCE_END_EVENT:
            value.append(self._compose_node(node, index))
            index = index+1
            self._parse_next_event()
        node.end_mark = Mark(self.stream_name,
                self.parsed_event.end_mark.index,
                self.parsed_event.end_mark.line,
                self.parsed_event.end_mark.column,
                None, None)
        yaml_event_delete(&self.parsed_event)
        return node

    cdef _compose_mapping_node(self, object anchor):
        start_mark = Mark(self.stream_name,
                self.parsed_event.start_mark.index,
                self.parsed_event.start_mark.line,
                self.parsed_event.start_mark.column,
                None, None)
        implicit = False
        if self.parsed_event.data.mapping_start.implicit == 1:
            implicit = True
        if self.parsed_event.data.mapping_start.tag == NULL    \
                or (self.parsed_event.data.mapping_start.tag[0] == c'!'
                        and self.parsed_event.data.mapping_start.tag[1] == c'\0'):
            tag = self.resolve(MappingNode, None, implicit)
        else:
            tag = PyUnicode_FromYamlString(self.parsed_event.data.mapping_start.tag)
        flow_style = None
        if self.parsed_event.data.mapping_start.style == YAML_FLOW_MAPPING_STYLE:
            flow_style = True
        elif self.parsed_event.data.mapping_start.style == YAML_BLOCK_MAPPING_STYLE:
            flow_style = False
        value = []
        node = MappingNode(tag, value, start_mark, None, flow_style)
        if anchor is not None:
            self.anchors[anchor] = node
        yaml_event_delete(&self.parsed_event)
        self._parse_next_event()
        while self.parsed_event.type != YAML_MAPPING_END_EVENT:
            item_key = self._compose_node(node, None)
            item_value = self._compose_node(node, item_key)
            value.append((item_key, item_value))
            self._parse_next_event()
        node.end_mark = Mark(self.stream_name,
                self.parsed_event.end_mark.index,
                self.parsed_event.end_mark.line,
                self.parsed_event.end_mark.column,
                None, None)
        yaml_event_delete(&self.parsed_event)
        return node

    cdef int _parse_next_event(self) except 0:
        if self.parsed_event.type == YAML_NO_EVENT:
            if yaml_parser_parse(&self.parser, &self.parsed_event) == 0:
                error = self._parser_error()
                raise error
        return 1

cdef int input_handler(void *data, unsigned char *buffer,
        size_t size, size_t *read) except 0:
    cdef CParser parser
    parser = <CParser>data
    if parser.stream_cache is None:
        value = parser.stream.read(size)
        if PyUnicode_CheckExact(value) != 0:
            value = PyUnicode_AsUTF8String(value)
            parser.unicode_source = 1
        if PyBytes_CheckExact(value) == 0:
            raise TypeError(u"a string value is expected")
        parser.stream_cache = value
        parser.stream_cache_pos = 0
        parser.stream_cache_len = PyBytes_GET_SIZE(value)
    if (parser.stream_cache_len - parser.stream_cache_pos) < size:
        size = parser.stream_cache_len - parser.stream_cache_pos
    if size > 0:
        memcpy(buffer, PyBytes_AS_STRING(parser.stream_cache)
                            + parser.stream_cache_pos, size)
    read[0] = size
    parser.stream_cache_pos += size
    if parser.stream_cache_pos == parser.stream_cache_len:
        parser.stream_cache = None
    return 1

cdef class CEmitter:

    cdef yaml_emitter_t emitter

    cdef object stream

    cdef int document_start_implicit
    cdef int document_end_implicit
    cdef object use_version
    cdef object use_tags

    cdef object serialized_nodes
    cdef object anchors
    cdef int last_alias_id
    cdef int closed
    cdef int dump_unicode
    cdef object use_encoding

    def __init__(self, stream, canonical=None, indent=None, width=None,
            allow_unicode=None, line_break=None, encoding=None,
            explicit_start=None, explicit_end=None, version=None, tags=None):
        if yaml_emitter_initialize(&self.emitter) == 0:
            raise MemoryError
        self.stream = stream
        self.dump_unicode = 0
        if hasattr(stream, u'encoding'):
            self.dump_unicode = 1
        self.use_encoding = encoding
        yaml_emitter_set_output(&self.emitter, output_handler, <void *>self)
        if canonical:
            yaml_emitter_set_canonical(&self.emitter, 1)
        if indent is not None:
            yaml_emitter_set_indent(&self.emitter, indent)
        if width is not None:
            yaml_emitter_set_width(&self.emitter, width)
        if allow_unicode:
            yaml_emitter_set_unicode(&self.emitter, 1)
        if line_break is not None:
            if line_break == '\r':
                yaml_emitter_set_break(&self.emitter, YAML_CR_BREAK)
            elif line_break == '\n':
                yaml_emitter_set_break(&self.emitter, YAML_LN_BREAK)
            elif line_break == '\r\n':
                yaml_emitter_set_break(&self.emitter, YAML_CRLN_BREAK)
        self.document_start_implicit = 1
        if explicit_start:
            self.document_start_implicit = 0
        self.document_end_implicit = 1
        if explicit_end:
            self.document_end_implicit = 0
        self.use_version = version
        self.use_tags = tags
        self.serialized_nodes = {}
        self.anchors = {}
        self.last_alias_id = 0
        self.closed = -1

    def __dealloc__(self):
        yaml_emitter_delete(&self.emitter)

    def dispose(self):
        pass

    cdef object _emitter_error(self):
        if self.emitter.error == YAML_MEMORY_ERROR:
            return MemoryError
        elif self.emitter.error == YAML_EMITTER_ERROR:
            problem = PyUnicode_FromString(self.emitter.problem)
            return EmitterError(problem)
        raise ValueError(u"no emitter error")

    cdef int _object_to_event(self, object event_object, yaml_event_t *event) except 0:
        cdef yaml_encoding_t encoding
        cdef yaml_version_directive_t version_directive_value
        cdef yaml_version_directive_t *version_directive
        cdef yaml_tag_directive_t tag_directives_value[128]
        cdef yaml_tag_directive_t *tag_directives_start
        cdef yaml_tag_directive_t *tag_directives_end
        cdef int implicit
        cdef int plain_implicit
        cdef int quoted_implicit
        cdef yaml_char_t *anchor
        cdef yaml_char_t *tag
        cdef yaml_char_t *value
        cdef int length
        cdef yaml_scalar_style_t scalar_style
        cdef yaml_sequence_style_t sequence_style
        cdef yaml_mapping_style_t mapping_style
        event_class = event_object.__class__
        if event_class is StreamStartEvent:
            encoding = YAML_UTF8_ENCODING
            if event_object.encoding == u'utf-16-le' or event_object.encoding == 'utf-16-le':
                encoding = YAML_UTF16LE_ENCODING
            elif event_object.encoding == u'utf-16-be' or event_object.encoding == 'utf-16-be':
                encoding = YAML_UTF16BE_ENCODING
            if event_object.encoding is None:
                self.dump_unicode = 1
            if self.dump_unicode == 1:
                encoding = YAML_UTF8_ENCODING
            yaml_stream_start_event_initialize(event, encoding)
        elif event_class is StreamEndEvent:
            yaml_stream_end_event_initialize(event)
        elif event_class is DocumentStartEvent:
            version_directive = NULL
            if event_object.version:
                version_directive_value.major = event_object.version[0]
                version_directive_value.minor = event_object.version[1]
                version_directive = &version_directive_value
            tag_directives_start = NULL
            tag_directives_end = NULL
            if event_object.tags:
                if len(event_object.tags) > 128:
                    raise ValueError(u"too many tags")
                tag_directives_start = tag_directives_value
                tag_directives_end = tag_directives_value
                cache = []
                for handle in event_object.tags:
                    prefix = event_object.tags[handle]
                    if PyUnicode_CheckExact(handle):
                        handle = PyUnicode_AsUTF8String(handle)
                        cache.append(handle)
                    if not PyBytes_CheckExact(handle):
                        raise TypeError(u"tag handle must be a string")
                    tag_directives_end.handle = PyBytes_AS_Yaml_STRING(handle)
                    if PyUnicode_CheckExact(prefix):
                        prefix = PyUnicode_AsUTF8String(prefix)
                        cache.append(prefix)
                    if not PyBytes_CheckExact(prefix):
                        raise TypeError(u"tag prefix must be a string")
                    tag_directives_end.prefix = PyBytes_AS_Yaml_STRING(prefix)
                    tag_directives_end = tag_directives_end+1
            implicit = 1
            if event_object.explicit:
                implicit = 0
            if yaml_document_start_event_initialize(event, version_directive,
                    tag_directives_start, tag_directives_end, implicit) == 0:
                raise MemoryError
        elif event_class is DocumentEndEvent:
            implicit = 1
            if event_object.explicit:
                implicit = 0
            yaml_document_end_event_initialize(event, implicit)
        elif event_class is AliasEvent:
            anchor = NULL
            anchor_object = event_object.anchor
            if PyUnicode_CheckExact(anchor_object):
                anchor_object = PyUnicode_AsUTF8String(anchor_object)
            if not PyBytes_CheckExact(anchor_object):
                raise TypeError(u"anchor must be a string")
            anchor = PyBytes_AS_Yaml_STRING(anchor_object)
            if yaml_alias_event_initialize(event, anchor) == 0:
                raise MemoryError
        elif event_class is ScalarEvent:
            anchor = NULL
            anchor_object = event_object.anchor
            if anchor_object is not None:
                if PyUnicode_CheckExact(anchor_object):
                    anchor_object = PyUnicode_AsUTF8String(anchor_object)
                if not PyBytes_CheckExact(anchor_object):
                    raise TypeError(u"anchor must be a string")
                anchor = PyBytes_AS_Yaml_STRING(anchor_object)
            tag = NULL
            tag_object = event_object.tag
            if tag_object is not None:
                if PyUnicode_CheckExact(tag_object):
                    tag_object = PyUnicode_AsUTF8String(tag_object)
                if not PyBytes_CheckExact(tag_object):
                    raise TypeError(u"tag must be a string")
                tag = PyBytes_AS_Yaml_STRING(tag_object)
            value_object = event_object.value
            if PyUnicode_CheckExact(value_object):
                value_object = PyUnicode_AsUTF8String(value_object)
            if not PyBytes_CheckExact(value_object):
                raise TypeError(u"value must be a string")
            value = PyBytes_AS_Yaml_STRING(value_object)
            length = PyBytes_GET_SIZE(value_object)
            plain_implicit = 0
            quoted_implicit = 0
            if event_object.implicit is not None:
                plain_implicit = event_object.implicit[0]
                quoted_implicit = event_object.implicit[1]
            style_object = event_object.style
            scalar_style = YAML_PLAIN_SCALAR_STYLE
            if style_object == "'" or style_object == u"'":
                scalar_style = YAML_SINGLE_QUOTED_SCALAR_STYLE
            elif style_object == "\"" or style_object == u"\"":
                scalar_style = YAML_DOUBLE_QUOTED_SCALAR_STYLE
            elif style_object == "|" or style_object == u"|":
                scalar_style = YAML_LITERAL_SCALAR_STYLE
            elif style_object == ">" or style_object == u">":
                scalar_style = YAML_FOLDED_SCALAR_STYLE
            if yaml_scalar_event_initialize(event, anchor, tag, value, length,
                    plain_implicit, quoted_implicit, scalar_style) == 0:
                raise MemoryError
        elif event_class is SequenceStartEvent:
            anchor = NULL
            anchor_object = event_object.anchor
            if anchor_object is not None:
                if PyUnicode_CheckExact(anchor_object):
                    anchor_object = PyUnicode_AsUTF8String(anchor_object)
                if not PyBytes_CheckExact(anchor_object):
                    raise TypeError(u"anchor must be a string")
                anchor = PyBytes_AS_Yaml_STRING(anchor_object)
            tag = NULL
            tag_object = event_object.tag
            if tag_object is not None:
                if PyUnicode_CheckExact(tag_object):
                    tag_object = PyUnicode_AsUTF8String(tag_object)
                if not PyBytes_CheckExact(tag_object):
                    raise TypeError(u"tag must be a string")
                tag = PyBytes_AS_Yaml_STRING(tag_object)
            implicit = 0
            if event_object.implicit:
                implicit = 1
            sequence_style = YAML_BLOCK_SEQUENCE_STYLE
            if event_object.flow_style:
                sequence_style = YAML_FLOW_SEQUENCE_STYLE
            if yaml_sequence_start_event_initialize(event, anchor, tag,
                    implicit, sequence_style) == 0:
                raise MemoryError
        elif event_class is MappingStartEvent:
            anchor = NULL
            anchor_object = event_object.anchor
            if anchor_object is not None:
                if PyUnicode_CheckExact(anchor_object):
                    anchor_object = PyUnicode_AsUTF8String(anchor_object)
                if not PyBytes_CheckExact(anchor_object):
                    raise TypeError(u"anchor must be a string")
                anchor = PyBytes_AS_Yaml_STRING(anchor_object)
            tag = NULL
            tag_object = event_object.tag
            if tag_object is not None:
                if PyUnicode_CheckExact(tag_object):
                    tag_object = PyUnicode_AsUTF8String(tag_object)
                if not PyBytes_CheckExact(tag_object):
                    raise TypeError(u"tag must be a string")
                tag = PyBytes_AS_Yaml_STRING(tag_object)
            implicit = 0
            if event_object.implicit:
                implicit = 1
            mapping_style = YAML_BLOCK_MAPPING_STYLE
            if event_object.flow_style:
                mapping_style = YAML_FLOW_MAPPING_STYLE
            if yaml_mapping_start_event_initialize(event, anchor, tag,
                    implicit, mapping_style) == 0:
                raise MemoryError
        elif event_class is SequenceEndEvent:
            yaml_sequence_end_event_initialize(event)
        elif event_class is MappingEndEvent:
            yaml_mapping_end_event_initialize(event)
        else:
            raise TypeError(u"invalid event %s" % event_object)
        return 1

    def emit(self, event_object):
        cdef yaml_event_t event
        self._object_to_event(event_object, &event)
        if yaml_emitter_emit(&self.emitter, &event) == 0:
            error = self._emitter_error()
            raise error

    def open(self):
        cdef yaml_event_t event
        cdef yaml_encoding_t encoding
        if self.closed == -1:
            if self.use_encoding == u'utf-16-le' or self.use_encoding == 'utf-16-le':
                encoding = YAML_UTF16LE_ENCODING
            elif self.use_encoding == u'utf-16-be' or self.use_encoding == 'utf-16-be':
                encoding = YAML_UTF16BE_ENCODING
            else:
                encoding = YAML_UTF8_ENCODING
            if self.use_encoding is None:
                self.dump_unicode = 1
            if self.dump_unicode == 1:
                encoding = YAML_UTF8_ENCODING
            yaml_stream_start_event_initialize(&event, encoding)
            if yaml_emitter_emit(&self.emitter, &event) == 0:
                error = self._emitter_error()
                raise error
            self.closed = 0
        elif self.closed == 1:
            raise SerializerError(u"serializer is closed")
        else:
            raise SerializerError(u"serializer is already opened")

    def close(self):
        cdef yaml_event_t event
        if self.closed == -1:
            raise SerializerError(u"serializer is not opened")
        elif self.closed == 0:
            yaml_stream_end_event_initialize(&event)
            if yaml_emitter_emit(&self.emitter, &event) == 0:
                error = self._emitter_error()
                raise error
            self.closed = 1

    def serialize(self, node):
        cdef yaml_event_t event
        cdef yaml_version_directive_t version_directive_value
        cdef yaml_version_directive_t *version_directive
        cdef yaml_tag_directive_t tag_directives_value[128]
        cdef yaml_tag_directive_t *tag_directives_start
        cdef yaml_tag_directive_t *tag_directives_end
        if self.closed == -1:
            raise SerializerError(u"serializer is not opened")
        elif self.closed == 1:
            raise SerializerError(u"serializer is closed")
        cache = []
        version_directive = NULL
        if self.use_version:
            version_directive_value.major = self.use_version[0]
            version_directive_value.minor = self.use_version[1]
            version_directive = &version_directive_value
        tag_directives_start = NULL
        tag_directives_end = NULL
        if self.use_tags:
            if len(self.use_tags) > 128:
                raise ValueError(u"too many tags")
            tag_directives_start = tag_directives_value
            tag_directives_end = tag_directives_value
            for handle in self.use_tags:
                prefix = self.use_tags[handle]
                if PyUnicode_CheckExact(handle):
                    handle = PyUnicode_AsUTF8String(handle)
                    cache.append(handle)
                if not PyBytes_CheckExact(handle):
                    raise TypeError(u"tag handle must be a string")
                tag_directives_end.handle = PyBytes_AS_Yaml_STRING(handle)
                if PyUnicode_CheckExact(prefix):
                    prefix = PyUnicode_AsUTF8String(prefix)
                    cache.append(prefix)
                if not PyBytes_CheckExact(prefix):
                    raise TypeError(u"tag prefix must be a string")
                tag_directives_end.prefix = PyBytes_AS_Yaml_STRING(prefix)
                tag_directives_end = tag_directives_end+1
        if yaml_document_start_event_initialize(&event, version_directive,
                tag_directives_start, tag_directives_end,
                self.document_start_implicit) == 0:
            raise MemoryError
        if yaml_emitter_emit(&self.emitter, &event) == 0:
            error = self._emitter_error()
            raise error
        self._anchor_node(node)
        self._serialize_node(node, None, None)
        yaml_document_end_event_initialize(&event, self.document_end_implicit)
        if yaml_emitter_emit(&self.emitter, &event) == 0:
            error = self._emitter_error()
            raise error
        self.serialized_nodes = {}
        self.anchors = {}
        self.last_alias_id = 0

    cdef int _anchor_node(self, object node) except 0:
        if node in self.anchors:
            if self.anchors[node] is None:
                self.last_alias_id = self.last_alias_id+1
                self.anchors[node] = u"id%03d" % self.last_alias_id
        else:
            self.anchors[node] = None
            node_class = node.__class__
            if node_class is SequenceNode:
                for item in node.value:
                    self._anchor_node(item)
            elif node_class is MappingNode:
                for key, value in node.value:
                    self._anchor_node(key)
                    self._anchor_node(value)
        return 1

    cdef int _serialize_node(self, object node, object parent, object index) except 0:
        cdef yaml_event_t event
        cdef int implicit
        cdef int plain_implicit
        cdef int quoted_implicit
        cdef yaml_char_t *anchor
        cdef yaml_char_t *tag
        cdef yaml_char_t *value
        cdef int length
        cdef int item_index
        cdef yaml_scalar_style_t scalar_style
        cdef yaml_sequence_style_t sequence_style
        cdef yaml_mapping_style_t mapping_style
        anchor_object = self.anchors[node]
        anchor = NULL
        if anchor_object is not None:
            if PyUnicode_CheckExact(anchor_object):
                anchor_object = PyUnicode_AsUTF8String(anchor_object)
            if not PyBytes_CheckExact(anchor_object):
                raise TypeError(u"anchor must be a string")
            anchor = PyBytes_AS_Yaml_STRING(anchor_object)
        if node in self.serialized_nodes:
            if yaml_alias_event_initialize(&event, anchor) == 0:
                raise MemoryError
            if yaml_emitter_emit(&self.emitter, &event) == 0:
                error = self._emitter_error()
                raise error
        else:
            node_class = node.__class__
            self.serialized_nodes[node] = True
            self.descend_resolver(parent, index)
            if node_class is ScalarNode:
                plain_implicit = 0
                quoted_implicit = 0
                tag_object = node.tag
                if self.resolve(ScalarNode, node.value, (True, False)) == tag_object:
                    plain_implicit = 1
                if self.resolve(ScalarNode, node.value, (False, True)) == tag_object:
                    quoted_implicit = 1
                tag = NULL
                if tag_object is not None:
                    if PyUnicode_CheckExact(tag_object):
                        tag_object = PyUnicode_AsUTF8String(tag_object)
                    if not PyBytes_CheckExact(tag_object):
                        raise TypeError(u"tag must be a string")
                    tag = PyBytes_AS_Yaml_STRING(tag_object)
                value_object = node.value
                if PyUnicode_CheckExact(value_object):
                    value_object = PyUnicode_AsUTF8String(value_object)
                if not PyBytes_CheckExact(value_object):
                    raise TypeError(u"value must be a string")
                value = PyBytes_AS_Yaml_STRING(value_object)
                length = PyBytes_GET_SIZE(value_object)
                style_object = node.style
                scalar_style = YAML_PLAIN_SCALAR_STYLE
                if style_object == "'" or style_object == u"'":
                    scalar_style = YAML_SINGLE_QUOTED_SCALAR_STYLE
                elif style_object == "\"" or style_object == u"\"":
                    scalar_style = YAML_DOUBLE_QUOTED_SCALAR_STYLE
                elif style_object == "|" or style_object == u"|":
                    scalar_style = YAML_LITERAL_SCALAR_STYLE
                elif style_object == ">" or style_object == u">":
                    scalar_style = YAML_FOLDED_SCALAR_STYLE
                if yaml_scalar_event_initialize(&event, anchor, tag, value, length,
                        plain_implicit, quoted_implicit, scalar_style) == 0:
                    raise MemoryError
                if yaml_emitter_emit(&self.emitter, &event) == 0:
                    error = self._emitter_error()
                    raise error
            elif node_class is SequenceNode:
                implicit = 0
                tag_object = node.tag
                if self.resolve(SequenceNode, node.value, True) == tag_object:
                    implicit = 1
                tag = NULL
                if tag_object is not None:
                    if PyUnicode_CheckExact(tag_object):
                        tag_object = PyUnicode_AsUTF8String(tag_object)
                    if not PyBytes_CheckExact(tag_object):
                        raise TypeError(u"tag must be a string")
                    tag = PyBytes_AS_Yaml_STRING(tag_object)
                sequence_style = YAML_BLOCK_SEQUENCE_STYLE
                if node.flow_style:
                    sequence_style = YAML_FLOW_SEQUENCE_STYLE
                if yaml_sequence_start_event_initialize(&event, anchor, tag,
                        implicit, sequence_style) == 0:
                    raise MemoryError
                if yaml_emitter_emit(&self.emitter, &event) == 0:
                    error = self._emitter_error()
                    raise error
                item_index = 0
                for item in node.value:
                    self._serialize_node(item, node, item_index)
                    item_index = item_index+1
                yaml_sequence_end_event_initialize(&event)
                if yaml_emitter_emit(&self.emitter, &event) == 0:
                    error = self._emitter_error()
                    raise error
            elif node_class is MappingNode:
                implicit = 0
                tag_object = node.tag
                if self.resolve(MappingNode, node.value, True) == tag_object:
                    implicit = 1
                tag = NULL
                if tag_object is not None:
                    if PyUnicode_CheckExact(tag_object):
                        tag_object = PyUnicode_AsUTF8String(tag_object)
                    if not PyBytes_CheckExact(tag_object):
                        raise TypeError(u"tag must be a string")
                    tag = PyBytes_AS_Yaml_STRING(tag_object)
                mapping_style = YAML_BLOCK_MAPPING_STYLE
                if node.flow_style:
                    mapping_style = YAML_FLOW_MAPPING_STYLE
                if yaml_mapping_start_event_initialize(&event, anchor, tag,
                        implicit, mapping_style) == 0:
                    raise MemoryError
                if yaml_emitter_emit(&self.emitter, &event) == 0:
                    error = self._emitter_error()
                    raise error
                for item_key, item_value in node.value:
                    self._serialize_node(item_key, node, None)
                    self._serialize_node(item_value, node, item_key)
                yaml_mapping_end_event_initialize(&event)
                if yaml_emitter_emit(&self.emitter, &event) == 0:
                    error = self._emitter_error()
                    raise error
            self.ascend_resolver()
        return 1

cdef int output_handler(void *data, unsigned char *bufferu, size_t size) except 0:
    cdef CEmitter emitter
    cdef char *buffer
    buffer = <char *>bufferu
    emitter = <CEmitter>data
    if emitter.dump_unicode == 0:
        value = PyBytes_FromStringAndSize(buffer, size)
    else:
        value = PyUnicode_DecodeUTF8(buffer, size, 'strict')
    emitter.stream.write(value)
    return 1