wheezy.template-3.2.1/LICENSE

Copyright (C) 2011 by Andriy Kornatskyy
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
wheezy.template-3.2.1/MANIFEST.in

include LICENSE README.md
wheezy.template-3.2.1/PKG-INFO

Metadata-Version: 2.1
Name: wheezy.template
Version: 3.2.1
Summary: A lightweight template library
Home-page: https://github.com/akornatskyy/wheezy.template
Author: Andriy Kornatskyy
Author-email: andriy.kornatskyy@live.com
License: MIT
Keywords: html markup template preprocessor
Platform: any
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Software Development :: Widget Sets
Classifier: Topic :: Text Processing :: Markup :: HTML
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
# wheezy.template
[](https://github.com/akornatskyy/wheezy.template/actions/workflows/tests.yml)
[](https://coveralls.io/github/akornatskyy/wheezy.template?branch=master)
[](https://wheezytemplate.readthedocs.io/en/latest/?badge=latest)
[](https://badge.fury.io/py/wheezy.template)
[wheezy.template](https://pypi.org/project/wheezy.template/) is a
[python](https://www.python.org) package written in pure Python code. It
is a lightweight template library. The design goals achieved:
- **Compact, Expressive, Clean:** Minimizes the number of keystrokes
required to build a template and enables fast, readable coding. You
do not need to explicitly denote statement blocks within HTML
(unlike other template systems); the parser is smart enough to
understand your code. The result is a compact, expressive syntax
that is clean and a pleasure to type.
- **Intuitive, No Time to Learn:** Basic Python programming skills
plus HTML markup are all you need. You are productive right from the
start. Use the full power of Python with minimal markup to denote
Python statements.
- **Do Not Repeat Yourself:** Master layout templates for inheritance;
include and import directives for maximum reuse.
- **Blazingly Fast:** Maximum rendering performance, with ultimate
speed and context preprocessor features.
Simple template:
```txt
@require(user, items)
Welcome, @user.name!
@if items:
@for i in items:
@i.name: @i.price!s.
@end
@else:
No items found.
@end
```
It is optimized for performance, well tested and documented.
Resources:
- [source code](https://github.com/akornatskyy/wheezy.template),
[examples](https://github.com/akornatskyy/wheezy.template/tree/master/demos)
and [issues](https://github.com/akornatskyy/wheezy.template/issues)
tracker are available on
[github](https://github.com/akornatskyy/wheezy.template)
- [documentation](https://wheezytemplate.readthedocs.io/en/latest/)
## Install
[wheezy.template](https://pypi.org/project/wheezy.template/) requires
[python](https://www.python.org) version 3.8+. It is independent of
operating system. You can install it from
[pypi](https://pypi.org/project/wheezy.template/) site:
```sh
pip install -U wheezy.template
```
If you run into any issues or have comments, go ahead and add them on
[github](https://github.com/akornatskyy/wheezy.template).
wheezy.template-3.2.1/README.md

# wheezy.template
[](https://github.com/akornatskyy/wheezy.template/actions/workflows/tests.yml)
[](https://coveralls.io/github/akornatskyy/wheezy.template?branch=master)
[](https://wheezytemplate.readthedocs.io/en/latest/?badge=latest)
[](https://badge.fury.io/py/wheezy.template)
[wheezy.template](https://pypi.org/project/wheezy.template/) is a
[python](https://www.python.org) package written in pure Python code. It
is a lightweight template library. The design goals achieved:
- **Compact, Expressive, Clean:** Minimizes the number of keystrokes
required to build a template and enables fast, readable coding. You
do not need to explicitly denote statement blocks within HTML
(unlike other template systems); the parser is smart enough to
understand your code. The result is a compact, expressive syntax
that is clean and a pleasure to type.
- **Intuitive, No Time to Learn:** Basic Python programming skills
plus HTML markup are all you need. You are productive right from the
start. Use the full power of Python with minimal markup to denote
Python statements.
- **Do Not Repeat Yourself:** Master layout templates for inheritance;
include and import directives for maximum reuse.
- **Blazingly Fast:** Maximum rendering performance, with ultimate
speed and context preprocessor features.
Simple template:
```txt
@require(user, items)
Welcome, @user.name!
@if items:
@for i in items:
@i.name: @i.price!s.
@end
@else:
No items found.
@end
```
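The same kind of template can be rendered programmatically. Below is a
minimal, illustrative sketch using the public API (`Engine`, `DictLoader`,
`CoreExtension`); the template text and names are made up for the example:

```python
from wheezy.template import CoreExtension, DictLoader, Engine

engine = Engine(
    loader=DictLoader({"welcome.txt": "@require(name)\nWelcome, @name!\n"}),
    extensions=[CoreExtension()],
)
template = engine.get_template("welcome.txt")
print(template.render({"name": "John"}))  # Welcome, John!
```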
It is optimized for performance, well tested and documented.
Resources:
- [source code](https://github.com/akornatskyy/wheezy.template),
[examples](https://github.com/akornatskyy/wheezy.template/tree/master/demos)
and [issues](https://github.com/akornatskyy/wheezy.template/issues)
tracker are available on
[github](https://github.com/akornatskyy/wheezy.template)
- [documentation](https://wheezytemplate.readthedocs.io/en/latest/)
## Install
[wheezy.template](https://pypi.org/project/wheezy.template/) requires
[python](https://www.python.org) version 3.8+. It is independent of
operating system. You can install it from
[pypi](https://pypi.org/project/wheezy.template/) site:
```sh
pip install -U wheezy.template
```
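The package also installs a `wheezy.template` console script that renders a
template with a given context (a JSON file or a JSON string); the file and
directory names below are illustrative only:

```sh
wheezy.template -s templates welcome.txt '{"name": "World"}'
```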
If you run into any issues or have comments, go ahead and add them on
[github](https://github.com/akornatskyy/wheezy.template).
wheezy.template-3.2.1/setup.cfg

[egg_info]
tag_build =
tag_date = 0
wheezy.template-3.2.1/setup.py

#!/usr/bin/env python
import multiprocessing
import os
import re
from setuptools import setup # type: ignore[import]
extra = {}
try:
from Cython.Build import cythonize # type: ignore[import]
p = os.path.join("src", "wheezy", "template")
extra["ext_modules"] = cythonize(
[os.path.join(p, "*.py"), os.path.join(p, "ext", "*.py")],
exclude=[
os.path.join(p, "__init__.py"),
os.path.join(p, "ext", "__init__.py"),
],
# https://github.com/cython/cython/issues/3262
nthreads=0 if multiprocessing.get_start_method() == "spawn" else 2,
compiler_directives={"language_level": 3},
quiet=True,
)
except ImportError:
pass
README = open("README.md").read()
VERSION = (
re.search( # type: ignore
r'__version__ = "(.+)"',
open("src/wheezy/template/__init__.py").read(),
)
.group(1)
.strip()
)
setup(
name="wheezy.template",
version=VERSION,
python_requires=">=3.8",
description="A lightweight template library",
long_description=README,
long_description_content_type="text/markdown",
url="https://github.com/akornatskyy/wheezy.template",
author="Andriy Kornatskyy",
author_email="andriy.kornatskyy@live.com",
license="MIT",
classifiers=[
"Environment :: Web Environment",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"Topic :: Internet :: WWW/HTTP",
"Topic :: Internet :: WWW/HTTP :: Dynamic Content",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Widget Sets",
"Topic :: Text Processing :: Markup :: HTML",
],
keywords="html markup template preprocessor",
packages=["wheezy", "wheezy.template", "wheezy.template.ext"],
package_data={"wheezy.template": ["py.typed"]},
package_dir={"": "src"},
namespace_packages=["wheezy"],
zip_safe=False,
install_requires=[],
entry_points={
"console_scripts": ["wheezy.template=wheezy.template.console:main"]
},
platforms="any",
**extra
)
wheezy.template-3.2.1/src/wheezy/__init__.py

import typing
# See
# http://peak.telecommunity.com/DevCenter/setuptools#namespace-packages
try:
__import__("pkg_resources").declare_namespace(__name__)
except ImportError:
from pkgutil import extend_path
__path__: typing.List[str] = extend_path( # type: ignore[no-redef]
__path__, __name__
)
wheezy.template-3.2.1/src/wheezy/template/__init__.py

"""
"""
from wheezy.template.engine import Engine
from wheezy.template.ext.code import CodeExtension
from wheezy.template.ext.core import CoreExtension
from wheezy.template.loader import DictLoader, FileLoader, PreprocessLoader
from wheezy.template.preprocessor import Preprocessor
__all__ = (
"Engine",
"CodeExtension",
"CoreExtension",
"DictLoader",
"FileLoader",
"PreprocessLoader",
"Preprocessor",
)
__version__ = "3.2.1"
wheezy.template-3.2.1/src/wheezy/template/builder.py

import typing
from wheezy.template.typing import Builder, BuilderRule, Token
def builder_scan(
extensions: typing.List[typing.Any],
) -> typing.Mapping[str, typing.Any]:
builder_rules: typing.Dict[str, typing.List[BuilderRule]] = {}
for extension in extensions:
if hasattr(extension, "builder_rules"):
rules = extension.builder_rules
for token, builder in rules:
builder_rules.setdefault(token, []).append(builder)
return {"builder_rules": builder_rules}
class BlockBuilder(Builder):
__slots__ = ("rules", "indent", "lineno", "buf")
def __init__(
self,
rules: typing.Dict[str, typing.List[BuilderRule]],
indent: str = "",
lineno: int = 0,
) -> None:
self.rules = rules
self.indent = indent
self.lineno = lineno
self.buf: typing.List[str] = []
def start_block(self) -> None:
self.indent += " "
def end_block(self) -> None:
if len(self.indent) < 4:
raise SyntaxError("Unexpected end of block.")
self.indent = self.indent[:-4]
def add(self, lineno: int, code: str) -> None:
if lineno < self.lineno:
raise SyntaxError(
"Inconsistence at %s : %s" % (self.lineno, lineno)
)
if lineno == self.lineno:
line = self.buf[-1]
if code:
if line:
if line[-1:] == ":":
self.buf[-1] = line + code
else:
self.buf[-1] = line + "; " + code
else:
self.buf[-1] = self.indent + code
else:
pad = lineno - self.lineno - 1
if pad > 0:
self.buf.extend([""] * pad)
if code:
self.buf.append(self.indent + code)
else:
self.buf.append("")
self.lineno = lineno + code.count("\n")
def build_block(self, nodes: typing.Iterable[Token]) -> None:
for lineno, token, value in nodes:
self.build_token(lineno, token, value)
def build_token(
self,
lineno: int,
token: str,
value: typing.Union[str, typing.Iterable[Token]],
) -> None:
if token in self.rules:
for rule in self.rules[token]:
if rule(self, lineno, token, value):
break
else:
raise SyntaxError(
'No rule to build "%s" token at line %d.' % (token, lineno)
)
def to_string(self) -> str:
return "\n".join(self.buf)
class SourceBuilder(object):
__slots__ = ("rules", "lineno")
def __init__(
self,
builder_rules: typing.Dict[str, typing.List[BuilderRule]],
builder_offset: int = 2,
**ignore: typing.Any
) -> None:
self.rules = builder_rules
self.lineno = 0 - builder_offset
def build_source(self, nodes: typing.Iterable[Token]) -> str:
builder = BlockBuilder(self.rules)
builder.build_block(nodes)
return builder.to_string()
def build_render(self, nodes: typing.Iterable[Token]) -> str:
builder = BlockBuilder(self.rules, lineno=self.lineno)
builder.add(
self.lineno + 1, "def render(ctx, local_defs, super_defs):"
)
builder.start_block()
builder.build_token(self.lineno + 2, "render", nodes)
return builder.to_string()
def build_module(self, nodes: typing.Iterable[Token]) -> str:
builder = BlockBuilder(self.rules, lineno=self.lineno)
builder.add(self.lineno + 1, "local_defs = {}; super_defs = {}")
builder.build_token(self.lineno + 2, "module", nodes)
return builder.to_string()
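# Illustrative sketch, not part of the library: with the builder rules
# contributed by the core extension, ``build_render`` wraps the parsed nodes
# into a module-level render function, producing source roughly like:
#
#   def render(ctx, local_defs, super_defs):
#       _b = []; w = _b.append
#       w('Welcome, '); w(name); w('!\n')
#       return ''.join(_b)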
wheezy.template-3.2.1/src/wheezy/template/comp.py

import ast
import sys
import typing
from _thread import allocate_lock # noqa
def adjust_source_lineno(source: str, name: str, lineno: int) -> typing.Any:
node = compile(source, name, "exec", ast.PyCF_ONLY_AST)
ast.increment_lineno(node, lineno)
return node
if sys.version_info <= (3, 9, 0): # pragma: nocover
from typing import List, Tuple
else: # pragma: nocover
Tuple = tuple # type: ignore
List = list # type: ignore
__all__ = (
"adjust_source_lineno",
"Tuple",
"List",
)
wheezy.template-3.2.1/src/wheezy/template/compiler.py

import typing
from types import ModuleType
from wheezy.template.comp import adjust_source_lineno
class Compiler(object):
def __init__(
self, global_vars: typing.Dict[str, typing.Any], source_lineno: int
) -> None:
self.global_vars = global_vars
self.source_lineno = source_lineno
def compile_module(self, source: str, name: str) -> ModuleType:
node = adjust_source_lineno(source, name, self.source_lineno)
compiled = compile(node, name, "exec")
module = ModuleType(name)
module.__name__ = name
module.__dict__.update(self.global_vars)
exec(compiled, module.__dict__)
return module
def compile_source(
self, source: str, name: str
) -> typing.Dict[str, typing.Any]:
node = adjust_source_lineno(source, name, self.source_lineno)
compiled = compile(node, name, "exec")
local_vars: typing.Dict[str, typing.Any] = {}
exec(compiled, self.global_vars, local_vars)
return local_vars
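# Illustrative sketch, not part of the library: ``compile_source`` executes
# the generated code and returns the names it defines, e.g.
#
#   compiler = Compiler(global_vars={}, source_lineno=0)
#   ns = compiler.compile_source(
#       "def render(ctx, local_defs, super_defs):\n    return ''", "example"
#   )
#   assert ns["render"]({}, {}, {}) == ""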
wheezy.template-3.2.1/src/wheezy/template/console.py

import getopt
import json
import os
import sys
import typing
from wheezy.template.engine import Engine
from wheezy.template.ext.code import CodeExtension
from wheezy.template.ext.core import CoreExtension
from wheezy.template.loader import FileLoader
try:
from wheezy.html.utils import escape_html as escape # type: ignore[import]
except ImportError: # pragma: nocover
from html import escape
class Options:
def __init__(
self,
token_start: str = "@",
line_join: str = "\\",
searchpath: typing.Optional[typing.List[str]] = None,
extensions: typing.Optional[typing.List[str]] = None,
) -> None:
self.token_start = token_start
self.line_join = line_join
self.searchpath = searchpath or ["."]
self.extensions = extensions or []
self.template = ""
self.context: typing.List[str] = []
def main(argv: typing.Optional[typing.List[str]] = None) -> int:
args = parse_args(argv or sys.argv[1:])
if not args:
return 2
ts = args.token_start
extensions = [CoreExtension(ts, args.line_join), CodeExtension(ts)]
extensions.extend(args.extensions)
engine = Engine(FileLoader(args.searchpath), extensions)
engine.global_vars.update({"h": escape})
t = engine.get_template(args.template)
sys.stdout.write(t.render(load_context(args.context)))
return 0
def load_context(sources: typing.List[str]) -> typing.Mapping[str, typing.Any]:
c: typing.Dict[str, typing.Any] = {}
for s in sources:
if os.path.isfile(s):
d = json.load(open(s))
else:
d = json.loads(s)
c.update(d)
return c
def parse_args( # noqa: C901
args: typing.List[str],
) -> typing.Optional[Options]:
try:
opts, value = getopt.getopt(args, "s:t:j:wh")
except getopt.GetoptError:
e = sys.exc_info()[1]
usage()
print("error: %s" % e)
return None
d = Options(token_start="@", searchpath=["."], extensions=[])
for o, a in opts:
if o == "-h":
return None
elif o == "-t":
d.token_start = a
elif o == "-j":
d.line_join = a
elif o == "-s":
d.searchpath = a.split(";")
elif o == "-w": # pragma: nocover
from wheezy.html.ext.template import ( # type: ignore[import]
WhitespaceExtension,
)
d.extensions.append(WhitespaceExtension())
if not value:
usage()
return None
d.template = value[0]
d.context = value[1:]
return d
def usage() -> None:
from datetime import datetime
from os.path import basename
from wheezy.template import __version__
print(
"""\
wheezy.template %s
Copyright (C) 2012-%d by Andriy Kornatskyy
renders a template with the given context.
usage: %s template [ context ... ]
positional arguments:
template a filename
context a filename or JSON string
optional arguments:
-s path search path for templates ( . )
-t token token start ( @ )
-j token line join ( \\ )
-w whitespace clean up
-h show this help message
"""
% (__version__, datetime.now().year, basename(sys.argv[0]))
)
if __name__ == "__main__": # pragma: nocover
sys.exit(main())
wheezy.template-3.2.1/src/wheezy/template/engine.py

import sys
import typing
from types import ModuleType
from wheezy.template.builder import SourceBuilder, builder_scan
from wheezy.template.comp import allocate_lock # type: ignore[attr-defined]
from wheezy.template.compiler import Compiler
from wheezy.template.lexer import Lexer, lexer_scan
from wheezy.template.parser import Parser, parser_scan
from wheezy.template.typing import (
Loader,
RenderTemplate,
SupportsRender,
TemplateClass,
Token,
)
class Template(SupportsRender):
"""Simple template class."""
__slots__ = ("name", "render_template")
def __init__(self, name: str, render_template: RenderTemplate) -> None:
self.name = name
self.render_template = render_template
def render(self, ctx: typing.Mapping[str, typing.Any]) -> str:
return self.render_template(ctx, {}, {})
class Engine(object):
"""The core component of template engine."""
def __init__(
self,
loader: Loader,
extensions: typing.List[typing.Any],
template_class: typing.Optional[TemplateClass] = None,
) -> None:
self.lock = allocate_lock()
self.templates: typing.Dict[str, SupportsRender] = {}
self.renders: typing.Dict[str, RenderTemplate] = {}
self.modules: typing.Dict[str, ModuleType] = {}
self.global_vars = {"_r": self.render, "_i": self.import_name}
self.loader = loader
self.template_class = template_class or Template
self.compiler = Compiler(
self.global_vars, sys.version_info >= (3, 11, 0) and -1 or -2
)
self.lexer = Lexer(**lexer_scan(extensions))
self.parser = Parser(**parser_scan(extensions))
self.builder = SourceBuilder(**builder_scan(extensions))
def get_template(self, name: str) -> SupportsRender:
"""Returns compiled template."""
try:
return self.templates[name]
except KeyError:
self.compile_template(name)
return self.templates[name]
def render(
self,
name: str,
ctx: typing.Mapping[str, typing.Any],
local_defs: typing.Mapping[str, typing.Any],
super_defs: typing.Mapping[str, typing.Any],
) -> str:
"""Renders template by name in given context."""
try:
return self.renders[name](ctx, local_defs, super_defs)
except KeyError:
self.compile_template(name)
return self.renders[name](ctx, local_defs, super_defs)
def remove(self, name: str) -> None:
"""Removes given ``name`` from internal cache."""
self.lock.acquire(True)
try:
if name in self.renders:
del self.templates[name]
del self.renders[name]
if name in self.modules:
del self.modules[name]
finally:
self.lock.release()
# region: internal details
def import_name(self, name: str) -> ModuleType:
try:
return self.modules[name]
except KeyError:
self.compile_import(name)
return self.modules[name]
def compile_template(self, name: str) -> None:
self.lock.acquire(True)
try:
if name not in self.renders:
template_source = self.loader.load(name)
if template_source is None:
raise IOError('Template "%s" not found.' % name)
tokens = self.lexer.tokenize(template_source)
nodes = self.parser.parse(tokens)
source = self.builder.build_render(nodes)
# print_debug(name, tokens, nodes, source)
try:
render_template = self.compiler.compile_source(
source, name
)["render"]
except SyntaxError as e:
raise complement_syntax_error(e, template_source, source)
self.renders[name] = render_template
self.templates[name] = self.template_class(
name, render_template
)
finally:
self.lock.release()
def compile_import(self, name: str) -> None:
self.lock.acquire(True)
try:
if name not in self.modules:
template_source = self.loader.load(name)
if template_source is None:
raise IOError('Import "%s" not found.' % name)
tokens = self.lexer.tokenize(template_source)
nodes = self.parser.parse(tokens)
source = self.builder.build_module(nodes)
# print_debug(name, tokens, nodes, source)
try:
self.modules[name] = self.compiler.compile_module(
source, name
)
except SyntaxError as e:
raise complement_syntax_error(e, template_source, source)
finally:
self.lock.release()
# region: internal details
def print_debug(
name: str,
tokens: typing.List[Token],
nodes: typing.List[typing.Any],
source: str,
) -> None: # pragma: nocover
print(name.center(80, "-"))
from pprint import pprint
# pprint(tokens)
pprint(nodes)
from wheezy.template.utils import print_source
print_source(source, -1)
def complement_syntax_error(
err: SyntaxError, template_source: str, source: str
) -> SyntaxError:
"""Complements SyntaxError with template and source snippets,
like one below:
.. code-block:: none
File "shared/snippet/widget.html", line 4
if :
template snippet:
02
03 @msg!h
04 @if :
05 sd
06 @end
generated snippet:
02 _b = []; w = _b.append; w('\\n ')
03 w(h(msg)); w('\\n')
04 if :
05 w(' sd\\n')
06
if :
^
SyntaxError: invalid syntax
"""
lineno = err.lineno or 0
text = """\
%s
template snippet:
%s
generated snippet:
%s
%s""" % (
err.text,
source_chunk(template_source, lineno - 2, 1),
source_chunk(source, lineno, -1),
err.text and err.text.strip() or "",
)
return err.__class__(err.msg, (err.filename, lineno - 2, err.offset, text))
def source_chunk(source: str, lineno: int, offset: int, extra: int = 2) -> str:
lines = source.split("\n", lineno + extra)
s = max(0, lineno - extra - 1)
e = min(len(lines), lineno + extra)
r = []
for i in range(s, e):
line = lines[i]
r.append(" %02d %s" % (i + offset, line))
return "\n".join(r)
wheezy.template-3.2.1/src/wheezy/template/ext/__init__.py
wheezy.template-3.2.1/src/wheezy/template/ext/code.py

import re
import typing
from wheezy.template.comp import Tuple
from wheezy.template.typing import Builder, LexerRule, ParserRule, Token
from wheezy.template.utils import find_balanced
# region: lexer extensions
def code_token(m: typing.Match[str]) -> Token:
source = m.string
start = m.end()
end = find_balanced(source, start)
if source[end : end + 1] == "\n":
end += 1
return end, "code", source[start:end]
# region: parser
def parse_code(value: str) -> typing.List[str]:
lines = value.rstrip("\n")[1:-1].split("\n")
lines[0] = lines[0].lstrip()
if len(lines) == 1:
return lines
line = lines[1]
n = len(line) - len(line.lstrip())
return [s[:n].lstrip() + s[n:] for s in lines]
# region: block_builders
def build_code(
builder: Builder, lineno: int, token: str, lines: typing.List[str]
) -> bool:
for line in lines:
builder.add(lineno, line)
lineno += 1
return True
# region: core extension
class CodeExtension:
"""Includes support for embedded python code."""
def __init__(self, token_start: str = "@") -> None:
self.lexer_rules: typing.Mapping[int, LexerRule] = {
300: (re.compile(r"\s*%s(?=\()" % token_start), code_token),
}
parser_rules: typing.Mapping[str, ParserRule] = {"code": parse_code}
builder_rules: typing.List[Tuple[str, typing.Any]] = [("code", build_code)]
wheezy.template-3.2.1/src/wheezy/template/ext/core.py

import re
import typing
from wheezy.template.comp import Tuple
from wheezy.template.typing import Builder, LexerRule, ParserConfig, Token
from wheezy.template.utils import find_all_balanced
# region: config
end_tokens = ["end"]
continue_tokens = ["else:", "elif "]
compound_tokens = ["for ", "if ", "def ", "extends"] + continue_tokens
reserved_tokens = ["require", "#", "include", "import ", "from "]
all_tokens = end_tokens + compound_tokens + reserved_tokens
out_tokens = ["markup", "var", "include"]
extends_tokens = ["def ", "require", "import ", "from "]
known_var_filters = {"s": "str"}
WRITER_DECLARE = "_b = []; w = _b.append"
WRITER_RETURN = "return ''.join(_b)"
# region: lexer extensions
def stmt_token(m: typing.Match[str]) -> Token:
"""Produces statement token."""
return m.end(), str(m.group(2)), str(m.group(1)).replace("\\\n", "")
RE_VAR = re.compile(r"(\.\w+)+")
RE_VAR_FILTER = re.compile(r"(?<!!)!\w+(!\w+)*")
def var_token(m: typing.Match[str]) -> Token:
"""Produces variable token."""
start = m.start(1)
pos = m.end()
source = m.string
while 1:
end = find_all_balanced(source, pos)
if pos == end:
break
mm = RE_VAR.match(source, end)
if not mm:
break
pos = mm.end()
value = source[start:end]
mm = RE_VAR_FILTER.match(source, end)
if mm:
end = mm.end()
value += "!" + mm.group()
return end, "var", value
def rvalue_token(m: typing.Match[str]) -> Token:
"""Produces variable token as r-value expression."""
return m.end(), "var", m.group(1).strip()
# region: parser config
def configure_parser(parser: ParserConfig) -> None:
parser.end_tokens.extend(end_tokens)
parser.continue_tokens.extend(continue_tokens)
parser.compound_tokens.extend(compound_tokens)
parser.out_tokens.extend(out_tokens)
# region: parser
def parse_require(value: str) -> typing.List[str]:
return [v.strip(" ") for v in value.rstrip()[8:-1].split(",")]
def parse_extends(value: str) -> str:
return value.rstrip()[8:-1]
def parse_include(value: str) -> str:
return value.rstrip()[8:-1]
def parse_import(value: str) -> Tuple[str, str]:
name, var = value[7:].rsplit(" as ", 1)
return name, var
def parse_from(value: str) -> Tuple[str, str, str]:
name, var = value[5:].rsplit(" import ", 1)
s = var.rsplit(" as ", 1)
if len(s) == 2:
var, alias = s
else:
alias = var
return name, var, alias
def parse_var(
value: str,
) -> Tuple[str, typing.Optional[typing.List[str]]]:
if "!!" not in value:
return value, None
var, var_filter = value.rsplit("!!", 1)
return var, var_filter.strip().split("!")
# region: block_builders
def build_extends(
builder: Builder, lineno: int, token: str, nodes: typing.List[typing.Any]
) -> bool:
assert token == "render"
n = len(nodes)
if not n:
return False
lineno, token, value = nodes[-1]
if token != "extends":
return False
extends, nested = value
if n > 1:
nested = nodes[:-1] + nested
for lineno, token, value in nested:
if token in extends_tokens:
builder.build_token(lineno, token, value)
builder.add(
builder.lineno + 1,
"return _r(" + extends + ", ctx, local_defs, super_defs)",
)
return True
def build_module(
builder: Builder, lineno: int, token: str, nodes: typing.List[typing.Any]
) -> bool:
assert token == "module"
for lineno, token, value in nodes:
if token == "def ":
builder.build_token(lineno, token, value)
return True
def build_import(
builder: Builder, lineno: int, token: str, value: Tuple[str, str]
) -> bool:
assert token == "import "
name, var = value
builder.add(lineno, var + " = _i(" + name + ")")
return True
def build_from(
builder: Builder,
lineno: int,
token: str,
value: Tuple[str, str, str],
) -> bool:
assert token == "from "
name, var, alias = value
builder.add(
lineno, alias + " = _i(" + name + ").local_defs['" + var + "']"
)
return True
def build_render_single_markup(
builder: Builder, lineno: int, token: str, nodes: typing.List[typing.Any]
) -> bool:
assert lineno <= 0
assert token == "render"
if len(nodes) > 1:
return False
if len(nodes) == 0:
builder.add(lineno, "return ''")
return True
ln, token, nodes = nodes[0]
if token != "out" or len(nodes) > 1:
return False
ln, token, value = nodes[0]
if token != "markup":
return False
if value:
builder.add(ln, "return " + value)
else:
builder.add(ln, "return ''")
return True
def build_render(
builder: Builder, lineno: int, token: str, nodes: typing.Iterable[Token]
) -> bool:
assert lineno <= 0
assert token == "render"
builder.add(lineno, WRITER_DECLARE)
builder.build_block(nodes)
lineno = builder.lineno
builder.add(lineno + 1, WRITER_RETURN)
return True
def build_def_syntax_check(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token == "def "
stmt, nodes = value
lineno, token, value = nodes[0]
if token in compound_tokens:
builder.add(lineno, stmt)
builder.start_block()
token = token.rstrip()
error = """\
The compound statement '%s' is not allowed here. \
Add a line before it with @#ignore or leave it empty.
%s
@#ignore
@%s ...""" % (
token,
stmt,
token,
)
builder.add(lineno, "raise SyntaxError(%s)" % repr(error))
builder.end_block()
return True
elif len(nodes) > 1:
# token before @end
lineno, token, value = nodes[-2]
if token == "#":
del nodes[-2]
return False
def build_def_empty(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token == "def "
stmt, nodes = value
if len(nodes) > 1:
return False
def_name = stmt[4 : stmt.index("(", 5)]
builder.add(lineno, stmt)
builder.start_block()
builder.add(lineno, "return ''")
builder.end_block()
builder.add(
lineno + 1,
def_name.join(
(
"super_defs['",
"'] = ",
"; ",
" = local_defs.setdefault('",
"', ",
")",
)
),
)
return True
def build_def_single_markup(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token == "def "
stmt, nodes = value
if len(nodes) > 2:
return False
ln, token, nodes = nodes[0]
if token != "out" or len(nodes) > 1:
return False
ln, token, value = nodes[0]
if token != "markup":
return False
def_name = stmt[4 : stmt.index("(", 5)]
builder.add(lineno, stmt)
builder.start_block()
builder.add(ln, "return " + value)
builder.end_block()
builder.add(
ln + 1,
def_name.join(
(
"super_defs['",
"'] = ",
"; ",
" = local_defs.setdefault('",
"', ",
")",
)
),
)
return True
def build_def(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token == "def "
stmt, nodes = value
def_name = stmt[4 : stmt.index("(", 5)]
builder.add(lineno, stmt)
builder.start_block()
builder.add(lineno + 1, WRITER_DECLARE)
builder.build_block(nodes)
lineno = builder.lineno
builder.add(lineno, WRITER_RETURN)
builder.end_block()
builder.add(
lineno + 1,
def_name.join(
(
"super_defs['",
"'] = ",
"; ",
" = local_defs.setdefault('",
"', ",
")",
)
),
)
return True
def build_out(
builder: Builder, lineno: int, token: str, nodes: typing.Iterable[Token]
) -> bool:
assert token == "out"
builder.build_block(nodes)
return True
def build_include(
builder: Builder, lineno: int, token: str, value: str
) -> bool:
assert token == "include"
builder.add(lineno, "w(_r(" + value + ", ctx, local_defs, super_defs))")
return True
def build_var(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token == "var"
var, var_filters = value
if var_filters:
for f in var_filters:
var = known_var_filters.get(f, f) + "(" + var + ")"
builder.add(lineno, "w(" + var + ")")
return True
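# For example (illustrative): the template expression ``@price!s`` reaches
# this builder as ("price", ["s"]) and, because known_var_filters maps "s"
# to "str", produces the line ``w(str(price))``.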
def build_markup(
builder: Builder, lineno: int, token: str, value: str
) -> bool:
assert token == "markup"
if value:
builder.add(lineno, "w(" + value + ")")
return True
def build_compound(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
assert token in compound_tokens
stmt, nodes = value
builder.add(lineno, stmt)
builder.start_block()
builder.build_block(nodes)
builder.end_block()
return True
def build_require(
builder: Builder, lineno: int, token: str, variables: typing.List[str]
) -> bool:
assert token == "require"
builder.add(
lineno,
"; ".join([name + " = ctx['" + name + "']" for name in variables]),
)
return True
def build_comment(
builder: Builder, lineno: int, token: str, comment: str
) -> bool:
assert token == "#"
builder.add(lineno, comment)
return True
def build_end(
builder: Builder, lineno: int, token: str, value: typing.Any
) -> bool:
if builder.lineno != lineno:
builder.add(lineno - 1, "")
return True
# region: core extension
class CoreExtension(object):
"""Includes basic statements, variables processing and markup."""
def __init__(self, token_start: str = "@", line_join: str = "\\"):
def markup_token(m: typing.Match[str]) -> Token:
"""Produces markup token."""
return (
m.end(),
"markup",
m.group().replace(token_start + token_start, token_start),
)
self.lexer_rules: typing.Dict[int, LexerRule] = {
100: (
re.compile(
r"%s((%s).*?(? str:
"""Cleans leading whitespace before token start for all control
tokens. Ignores escaped token start.
"""
return re_clean2.sub(
r"\n%s\2" % token_start,
re_clean1.sub(
r"%s\2" % token_start, source.replace("\r\n", "\n")
),
)
self.preprocessors = [clean_source]
# region: parser
if line_join:
line_join = re.escape(line_join)
re_join1 = re.compile(r"(? typing.Optional[str]:
value = re_join2.sub("\\n", re_join1.sub("", value))
if value:
return repr(value)
else:
return None
else:
def parse_markup(value: str) -> typing.Optional[str]: # noqa
if value:
return repr(value)
else:
return None
self.parser_rules = {
"require": parse_require,
"extends": parse_extends,
"include": parse_include,
"import ": parse_import,
"from ": parse_from,
"var": parse_var,
"markup": parse_markup,
}
parser_configs = [configure_parser]
builder_rules = [
("render", build_extends),
("render", build_render_single_markup),
("render", build_render),
("module", build_module),
("import ", build_import),
("from ", build_from),
("require", build_require),
("out", build_out),
("markup", build_markup),
("var", build_var),
("include", build_include),
("def ", build_def_syntax_check),
("def ", build_def_empty),
("def ", build_def_single_markup),
("def ", build_def),
("if ", build_compound),
("elif ", build_compound),
("else:", build_compound),
("for ", build_compound),
("#", build_comment),
("end", build_end),
]
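# Illustrative sketch, not part of the library: the tokens wired above
# support master/child templates along these lines (file names assumed):
#
#   master.html:
#       @def content():
#       @end
#       <html><body>@content()</body></html>
#
#   welcome.html:
#       @extends("master.html")
#       @def content():
#       Welcome!
#       @end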
wheezy.template-3.2.1/src/wheezy/template/ext/determined.py

import re
import typing
from wheezy.template.comp import Tuple
from wheezy.template.utils import find_balanced
RE_ARGS = re.compile(r'\s*(?P(([\'"]).*?\3|.+?))\s*\,')
RE_KWARGS = re.compile(
r'\s*(?P\w+)\s*=\s*(?P([\'"].*?[\'"]|.+?))\s*\,'
)
RE_STR_VALUE = re.compile(r'^[\'"](?P.+)[\'"]$')
RE_INT_VALUE = re.compile(r"^(?P(\d+))$")
# region: core extension
class DeterminedExtension(object):
"""Tranlates function calls between template engines.
Strictly determined known calls are converted to preprocessor
calls, e.g.::
@_('Name:')
@path_for('default')
@path_for('static', path='/static/css/site.css')
Those that are not strictly determined are ignored and processed
by runtime engine.
"""
def __init__(
self,
known_calls: typing.List[str],
runtime_token_start: str = "@",
token_start: str = "#",
) -> None:
self.token_start = token_start
self.pattern = re.compile(
r"%s(%s)(?=\()" % (runtime_token_start, "|".join(known_calls))
)
self.preprocessors = [self.preprocess]
def preprocess(self, source: str) -> str:
result = []
start = 0
for m in self.pattern.finditer(source):
pstart = m.end()
pend = find_balanced(source, pstart)
if determined(source[pstart + 1 : pend - 1]):
name = m.group(1)
result.append(source[start : m.start()])
result.append(self.token_start + "ctx['" + name + "']")
start = pstart
if start:
result.append(source[start:])
return "".join(result)
else:
return source
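# For example (illustrative): with known_calls=["path_for"] the call
#   @path_for('static', path='/static/css/site.css')
# is rewritten by preprocess to
#   #ctx['path_for']('static', path='/static/css/site.css')
# while a call with non-literal arguments is left to the runtime engine.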
def determined(expression: str) -> bool:
"""Checks if expresion is strictly determined.
>>> determined("'default'")
True
>>> determined('name')
False
>>> determined("'default', id=id")
False
>>> determined("'default', lang=100")
True
>>> determined('')
True
"""
args, kwargs = parse_params(expression)
for arg in args:
if not str_or_int(arg):
return False
for arg in kwargs.values():
if not str_or_int(arg):
return False
return True
def parse_kwargs(text: str) -> typing.Mapping[str, str]:
"""Parses key-value type of parameters.
>>> parse_kwargs('id=item.id')
{'id': 'item.id'}
>>> sorted(parse_kwargs('lang="en", id=12').items())
[('id', '12'), ('lang', '"en"')]
"""
kwargs = {}
for m in RE_KWARGS.finditer(text + ","):
groups = m.groupdict()
kwargs[groups["name"].rstrip("_")] = groups["expr"]
return kwargs
def parse_args(text: str) -> typing.List[str]:
"""Parses argument type of parameters.
>>> parse_args('')
[]
>>> parse_args('10, "x"')
['10', '"x"']
>>> parse_args("'x', 100")
["'x'", '100']
>>> parse_args('"default"')
['"default"']
"""
args = []
for m in RE_ARGS.finditer(text + ","):
args.append(m.group("expr"))
return args
def parse_params(
text: str,
) -> Tuple[typing.List[str], typing.Mapping[str, str]]:
"""Parses function parameters.
>>> parse_params('')
([], {})
>>> parse_params('id=item.id')
([], {'id': 'item.id'})
>>> parse_params('"default"')
(['"default"'], {})
>>> parse_params('"default", lang="en"')
(['"default"'], {'lang': '"en"'})
"""
if "=" in text:
args = text.split("=")[0]
if "," in args:
args = args.rsplit(",", 1)[0]
kwargs = text[len(args) :]
return parse_args(args), parse_kwargs(kwargs)
else:
return [], parse_kwargs(text)
else:
return parse_args(text), {}
def str_or_int(text: str) -> bool:
"""Ensures ``text`` as string or int expression.
>>> str_or_int('"default"')
True
>>> str_or_int("'Hello'")
True
>>> str_or_int('100')
True
>>> str_or_int('item.id')
False
"""
m = RE_STR_VALUE.match(text)
if m:
return True
else:
m = RE_INT_VALUE.match(text)
if m:
return True
return False
wheezy.template-3.2.1/src/wheezy/template/lexer.py

import typing
from wheezy.template.typing import (
LexerRule,
PostProcessorRule,
PreProcessorRule,
Token,
)
def lexer_scan(
extensions: typing.List[typing.Any],
) -> typing.Mapping[str, typing.Any]:
"""Scans extensions for ``lexer_rules`` and ``preprocessors``
attributes.
"""
lexer_rules: typing.Dict[int, LexerRule] = {}
preprocessors: typing.List[PreProcessorRule] = []
postprocessors: typing.List[PostProcessorRule] = []
for extension in extensions:
if hasattr(extension, "lexer_rules"):
lexer_rules.update(extension.lexer_rules)
if hasattr(extension, "preprocessors"):
preprocessors.extend(extension.preprocessors)
if hasattr(extension, "postprocessors"):
postprocessors.extend(extension.postprocessors)
return {
"lexer_rules": [lexer_rules[k] for k in sorted(lexer_rules.keys())],
"preprocessors": preprocessors,
"postprocessors": postprocessors,
}
class Lexer(object):
"""Tokenizes input source per rules supplied."""
def __init__(
self,
lexer_rules: typing.List[LexerRule],
preprocessors: typing.Optional[typing.List[PreProcessorRule]] = None,
postprocessors: typing.Optional[typing.List[PostProcessorRule]] = None,
**ignore: typing.Any
) -> None:
"""Initializes with ``rules``. Rules must be a list of
two elements tuple: ``(regex, tokenizer)`` where
tokenizer if a callable of the following contract::
def tokenizer(match):
return end_index, token, value
"""
self.rules = lexer_rules
self.preprocessors = preprocessors or []
self.postprocessors = postprocessors or []
def tokenize(self, source: str) -> typing.List[Token]:
"""Translates ``source`` accoring to lexer rules into
an iteratable of tokens.
"""
for preprocessor in self.preprocessors:
source = preprocessor(source)
tokens: typing.List[Token] = []
append = tokens.append
pos = 0
lineno = 1
end = len(source)
while pos < end:
for regex, tokenizer in self.rules:
m = regex.match(source, pos, end)
if m is not None:
npos, token, value = tokenizer(m)
assert npos > pos
append((lineno, token, value))
lineno += source[pos:npos].count("\n")
pos = npos
break
else:
raise AssertionError("Lexer pattern mismatch.")
for postprocessor in self.postprocessors:
postprocessor(tokens)
return tokens
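# Illustrative sketch, not part of the library: a lexer rule is a
# ``(compiled_regex, tokenizer)`` pair, and the tokenizer maps a regex match
# to ``(end_index, token_name, value)``:
#
#   import re
#
#   def text_token(m):
#       return m.end(), "markup", m.group()
#
#   Lexer(lexer_rules=[(re.compile(r".+"), text_token)]).tokenize("hello")
#   # -> [(1, "markup", "hello")]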
wheezy.template-3.2.1/src/wheezy/template/loader.py

import os
import os.path
import stat
import time
import typing
from wheezy.template.comp import Tuple
from wheezy.template.engine import Engine
from wheezy.template.typing import Loader, SupportsRender
class FileLoader(Loader):
"""Loads templates from file system.
``directories`` - search path of directories to scan for template.
``encoding`` - decode template content per encoding.
"""
def __init__(
self, directories: typing.List[str], encoding: str = "UTF-8"
) -> None:
searchpath: typing.List[str] = []
for path in directories:
abspath = os.path.abspath(path)
assert os.path.exists(abspath)
assert os.path.isdir(abspath)
searchpath.append(abspath)
self.searchpath = searchpath
self.encoding = encoding
def list_names(self) -> Tuple[str, ...]:
"""Return a list of names relative to directories. Ignores any files
and directories that start with dot.
"""
names = []
for path in self.searchpath:
pathlen = len(path) + 1
for dirpath, dirnames, filenames in os.walk(path):
for i in [
i
for i, name in enumerate(dirnames)
if name.startswith(".")
]:
del dirnames[i]
for filename in filenames:
if filename.startswith("."):
continue
name = os.path.join(dirpath, filename)[pathlen:]
name = name.replace("\\", "/")
names.append(name)
return tuple(sorted(names))
def get_fullname(self, name: str) -> typing.Optional[str]:
"""Returns a full path by a template name."""
for path in self.searchpath:
filename = os.path.join(path, name)
if not os.path.exists(filename):
continue
if not os.path.isfile(filename):
continue
return filename
else:
return None
def load(self, name: str) -> typing.Optional[str]:
"""Loads a template by name from file system."""
filename = self.get_fullname(name)
if filename:
f = open(filename, "rb")
try:
return f.read().decode(self.encoding)
finally:
f.close()
return None
class DictLoader(Loader):
"""Loads templates from python dictionary.
``templates`` - a dict where key corresponds to template name and
value to template content.
"""
def __init__(self, templates: typing.Mapping[str, str]) -> None:
self.templates = templates
def list_names(self) -> Tuple[str, ...]:
"""List all keys from internal dict."""
return tuple(sorted(self.templates.keys()))
def load(self, name: str) -> typing.Optional[str]:
"""Returns template by name."""
if name not in self.templates:
return None
return self.templates[name]
class ChainLoader(Loader):
"""Loads templates from ``loaders`` until first succeed."""
def __init__(self, loaders: typing.List[Loader]) -> None:
self.loaders = loaders
def list_names(self) -> Tuple[str, ...]:
"""Returns as list of names from all loaders."""
names = set()
for loader in self.loaders:
names |= set(loader.list_names())
return tuple(sorted(names))
def load(self, name: str) -> typing.Optional[str]:
"""Returns template by name from the first loader that succeed."""
for loader in self.loaders:
source = loader.load(name)
if source is not None:
return source
return None
class PreprocessLoader(Loader):
"""Performs preprocessing of loaded template."""
def __init__(
self,
engine: Engine,
ctx: typing.Optional[typing.Mapping[str, typing.Any]] = None,
) -> None:
self.engine = engine
self.ctx = ctx or {}
def list_names(self) -> Tuple[str, ...]:
return self.engine.loader.list_names()
def load(self, name: str) -> str:
return self.engine.render(name, self.ctx, {}, {})
def autoreload(engine: Engine, enabled: bool = True) -> Engine:
"""Auto reload template if changes are detected in file.
Limitation: master (inherited), imported and preprocessed templates.
It is recommended to use application server that supports
file reload instead.
"""
if not enabled:
return engine
return AutoReloadProxy(engine)
# region: internal details
class AutoReloadProxy(Engine):
def __init__(self, engine: Engine):
from warnings import warn
self.engine = engine
self.names: typing.Dict[str, float] = {}
warn(
"autoreload limitation: master (inherited), imported "
"and preprocessed templates. It is recommended to use "
"application server that supports file reload instead.",
stacklevel=3,
)
def get_template(self, name: str) -> SupportsRender:
if self.file_changed(name):
self.remove(name)
return self.engine.get_template(name)
def render(
self,
name: str,
ctx: typing.Mapping[str, typing.Any],
local_defs: typing.Mapping[str, typing.Any],
super_defs: typing.Mapping[str, typing.Any],
) -> str:
if self.file_changed(name):
self.remove(name)
return self.engine.render(name, ctx, local_defs, super_defs)
def remove(self, name: str) -> None:
self.engine.remove(name)
# region: internal details
def __getattr__(self, name: str) -> typing.Any:
return getattr(self.engine, name)
def file_changed(self, name: str) -> bool:
try:
last_known_stamp = self.names[name]
current_time = time.time()
if current_time - last_known_stamp <= 2:
return False
except KeyError:
last_known_stamp = 0
loader = self.engine.loader
abspath = loader.get_fullname(name) # type: ignore[attr-defined]
if not abspath:
return False
last_modified_stamp = os.stat(abspath)[stat.ST_MTIME]
if last_modified_stamp <= last_known_stamp:
return False
self.names[name] = last_modified_stamp
return True
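# Illustrative sketch, not part of the library: loaders compose, e.g. an
# in-memory override with a file system fallback (the "templates" directory
# is an assumption):
#
#   loader = ChainLoader(
#       [DictLoader({"debug.html": "@debug"}), FileLoader(["templates"])]
#   )
#   loader.load("debug.html")  # resolved by DictLoader, no disk access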
wheezy.template-3.2.1/src/wheezy/template/parser.py

import typing
from wheezy.template.typing import ParserConfig, ParserRule, Token
def parser_scan(
extensions: typing.List[typing.Any],
) -> typing.Mapping[str, typing.Any]:
parser_rules = {}
parser_configs = []
for extension in extensions:
if hasattr(extension, "parser_rules"):
parser_rules.update(extension.parser_rules)
if hasattr(extension, "parser_configs"):
parser_configs.extend(extension.parser_configs)
return {
"parser_rules": parser_rules,
"parser_configs": parser_configs,
}
class Parser(ParserConfig):
"""
``continue_tokens`` are used to insert an ``end`` node right
before them to simulate a block end. Such nodes have an empty
token value.
``out_tokens`` are combined together into a single node.
"""
def __init__(
self,
parser_rules: typing.Dict[str, ParserRule],
parser_configs: typing.Optional[
typing.List[typing.Callable[[ParserConfig], None]]
] = None,
**ignore: typing.Any
) -> None:
self.end_tokens: typing.List[str] = []
self.continue_tokens: typing.List[str] = []
self.compound_tokens: typing.List[str] = []
self.out_tokens: typing.List[str] = []
self.rules = parser_rules
if parser_configs:
for config in parser_configs:
config(self)
def end_continue(
self, tokens: typing.List[Token]
) -> typing.Iterator[Token]:
"""If token is in ``continue_tokens`` prepend it
with end token so it simulate a closed block.
"""
for t in tokens:
if t[1] in self.continue_tokens:
yield (t[0], "end", "")
yield t
def parse_iter(
self, tokens: typing.Iterator[Token]
) -> typing.Iterator[typing.Any]:
operands = []
for lineno, token, value in tokens:
if token in self.rules:
value = self.rules[token](value) # type: ignore[assignment]
if token in self.out_tokens:
operands.append((lineno, token, value))
else:
if operands:
yield operands[0][0], "out", operands
operands = []
if token in self.compound_tokens:
yield lineno, token, (value, list(self.parse_iter(tokens)))
else:
yield lineno, token, value
if token in self.end_tokens:
break
if operands:
yield operands[0][0], "out", operands
def parse(self, tokens: typing.List[Token]) -> typing.List[typing.Any]:
return list(self.parse_iter(self.end_continue(tokens)))
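# Illustrative sketch, not part of the library: ``parse`` yields nodes of the
# form ``(lineno, token, value)``; ``out_tokens`` are grouped and compound
# tokens nest their body, e.g.
#
#   (1, "require", ["name"])
#   (2, "out", [(2, "markup", "'Welcome, '"), (2, "var", ("name", None))])
#   (3, "if ", ("if name:", [...]))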
wheezy.template-3.2.1/src/wheezy/template/preprocessor.py

import typing
from wheezy.template.comp import allocate_lock # type: ignore[attr-defined]
from wheezy.template.engine import Engine
from wheezy.template.loader import ChainLoader, DictLoader
from wheezy.template.typing import Loader, SupportsRender
class Preprocessor(object):
"""Preprocess templates with ``engine`` and vary runtime templates
by ``key_factory`` function using ``runtime_engine_factory``.
"""
def __init__(
self,
runtime_engine_factory: typing.Callable[[Loader], Engine],
engine: Engine,
key_factory: typing.Callable[[typing.Mapping[str, typing.Any]], str],
) -> None:
self.lock = allocate_lock()
self.runtime_engines: typing.Dict[str, Engine] = {}
self.runtime_engine_factory = runtime_engine_factory
self.engine = engine
self.loader = engine.loader
self.key_factory = key_factory
template_class = self.engine.template_class
self.engine.template_class = lambda name, _: template_class(
name,
lambda ctx, local_defs, super_defs: self.render(
name, ctx, local_defs, super_defs
),
)
def get_template(self, name: str) -> SupportsRender:
return self.engine.get_template(name)
def render(
self,
name: str,
ctx: typing.Mapping[str, typing.Any],
local_defs: typing.Mapping[str, typing.Any],
super_defs: typing.Mapping[str, typing.Any],
) -> str:
try:
runtime_engine = self.runtime_engines[self.key_factory(ctx)]
except KeyError:
runtime_engine = self.ensure_runtime_engine(self.key_factory(ctx))
try:
return runtime_engine.renders[name](ctx, local_defs, super_defs)
except KeyError:
self.preprocess_template(runtime_engine, name, ctx)
return runtime_engine.renders[name](ctx, local_defs, super_defs)
def remove(self, name: str) -> None:
self.lock.acquire(True)
try:
self.engine.remove(name)
for runtime_engine in self.runtime_engines.values():
runtime_engine.remove(name)
finally:
self.lock.release()
# region: internal details
def ensure_runtime_engine(self, key: str) -> Engine:
self.lock.acquire(True)
try:
engines = self.runtime_engines
if key in engines: # pragma: nocover
return engines[key]
engine = engines[key] = self.runtime_engine_factory(
ChainLoader([DictLoader({}), self.engine.loader])
)
def render(
name: str,
ctx: typing.Mapping[str, typing.Any],
local_defs: typing.Mapping[str, typing.Any],
super_defs: typing.Mapping[str, typing.Any],
) -> str:
try:
return engine.renders[name](ctx, local_defs, super_defs)
except KeyError:
self.preprocess_template(engine, name, ctx)
return engine.renders[name](ctx, local_defs, super_defs)
engine.global_vars["_r"] = render
return engine
finally:
self.lock.release()
def preprocess_template(
self,
runtime_engine: Engine,
name: str,
ctx: typing.Mapping[str, typing.Any],
) -> None:
self.lock.acquire(True)
try:
if name not in runtime_engine.renders:
source = self.engine.render(name, ctx, {}, {})
loader = runtime_engine.loader.loaders[0] # type: ignore
loader.templates[name] = source
runtime_engine.compile_template(name)
del loader.templates[name]
finally:
self.lock.release()
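# Illustrative sketch, not part of the library: vary compiled runtime
# templates by a value taken from the rendering context (the engine
# factories and the "locale" key are assumptions):
#
#   preprocessor = Preprocessor(
#       runtime_engine_factory,
#       preprocess_engine,
#       key_factory=lambda ctx: str(ctx["locale"]),
#   )
#   preprocessor.get_template("welcome.html").render({"locale": "en"})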
wheezy.template-3.2.1/src/wheezy/template/py.typed
wheezy.template-3.2.1/src/wheezy/template/typing.py

import typing
from abc import abstractmethod
from wheezy.template.comp import List, Tuple
Token = Tuple[int, str, str]
class Builder:
lineno: int
@abstractmethod
def start_block(self) -> None:
... # pragma: nocover
@abstractmethod
def end_block(self) -> None:
... # pragma: nocover
@abstractmethod
def add(self, lineno: int, code: str) -> None:
... # pragma: nocover
@abstractmethod
def build_block(self, nodes: typing.Iterable[Token]) -> None:
... # pragma: nocover
@abstractmethod
def build_token(
self,
lineno: int,
token: str,
value: typing.Union[str, typing.Iterable[Token]],
) -> None:
... # pragma: nocover
Tokenizer = typing.Callable[[typing.Match], Token]
LexerRule = Tuple[typing.Pattern, Tokenizer]
PreProcessorRule = typing.Callable[[str], str]
PostProcessorRule = typing.Callable[[List[Token]], str]
BuilderRule = typing.Callable[
[
Builder,
int,
str,
typing.Union[str, List[str], typing.Iterable[Token]],
],
bool,
]
ParserRule = typing.Callable[[str], typing.Union[str, List[str]]]
class ParserConfig:
end_tokens: List[str]
continue_tokens: List[str]
compound_tokens: List[str]
out_tokens: List[str]
RenderTemplate = typing.Callable[
[
typing.Mapping[str, typing.Any],
typing.Mapping[str, typing.Any],
typing.Mapping[str, typing.Any],
],
str,
]
class SupportsRender:
@abstractmethod
def render(self, ctx: typing.Mapping[str, typing.Any]) -> str:
... # pragma: nocover
TemplateClass = typing.Callable[[str, RenderTemplate], SupportsRender]
class Loader:
@abstractmethod
def list_names(self) -> Tuple[str, ...]:
... # pragma: nocover
@abstractmethod
def load(self, name: str) -> typing.Optional[str]:
... # pragma: nocover
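The abstract `Loader` above is the extension point that the loaders in `loader.py` implement. As a rough illustration (the `MemoryLoader` class below is hypothetical, not part of the package), a custom loader only needs `list_names` and `load`:

```python
# Hypothetical loader implementing the Loader interface above;
# wheezy.template ships its own loaders (see loader.py), this is
# only an illustration of the two required methods.
import typing

from wheezy.template.typing import Loader


class MemoryLoader(Loader):
    """Serves template sources from an in-memory mapping."""

    def __init__(self, templates: typing.Mapping[str, str]) -> None:
        self.templates = dict(templates)

    def list_names(self) -> typing.Tuple[str, ...]:
        # Sorted so the listing is deterministic.
        return tuple(sorted(self.templates))

    def load(self, name: str) -> typing.Optional[str]:
        # None signals "template not found" to the caller.
        return self.templates.get(name)


loader = MemoryLoader({"hello": "Hello, @name!\n"})
assert loader.list_names() == ("hello",)
assert loader.load("missing") is None
```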
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553443.0
wheezy.template-3.2.1/src/wheezy/template/utils.py 0000644 0001751 0000173 00000002407 14460746143 021706 0 ustar 00runner docker def find_all_balanced(text: str, start: int = 0) -> int:
"""Finds balanced ``([`` with ``])`` assuming
that ``start`` is pointing to ``(`` or ``[`` in ``text``.
"""
if start >= len(text) or text[start] not in "([":
return start
while 1:
pos = find_balanced(text, start)
pos = find_balanced(text, pos, "[", "]")
if pos != start:
start = pos
else:
return pos
def find_balanced(
text: str, start: int = 0, start_sep: str = "(", end_sep: str = ")"
) -> int:
"""Finds balanced ``start_sep`` with ``end_sep`` assuming
that ``start`` is pointing to ``start_sep`` in ``text``.
"""
if start >= len(text) or start_sep != text[start]:
return start
balanced = 1
pos = start + 1
while pos < len(text):
token = text[pos]
pos += 1
if token == end_sep:
if balanced == 1:
return pos
balanced -= 1
elif token == start_sep:
balanced += 1
return start
def print_source(source: str, lineno: int = 1) -> None: # pragma: nocover
lines = []
for line in source.split("\n"):
lines.append("%02d " % lineno + line)
lineno += line.count("\n") + 1
print("\n".join(lines))
././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1690553458.2237494
wheezy.template-3.2.1/src/wheezy.template.egg-info/ 0000755 0001751 0000173 00000000000 14460746162 021663 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/PKG-INFO 0000644 0001751 0000173 00000007607 14460746162 022772 0 ustar 00runner docker Metadata-Version: 2.1
Name: wheezy.template
Version: 3.2.1
Summary: A lightweight template library
Home-page: https://github.com/akornatskyy/wheezy.template
Author: Andriy Kornatskyy
Author-email: andriy.kornatskyy@live.com
License: MIT
Keywords: html markup template preprocessor
Platform: any
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Software Development :: Widget Sets
Classifier: Topic :: Text Processing :: Markup :: HTML
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
# wheezy.template
[](https://github.com/akornatskyy/wheezy.template/actions/workflows/tests.yml)
[](https://coveralls.io/github/akornatskyy/wheezy.template?branch=master)
[](https://wheezytemplate.readthedocs.io/en/latest/?badge=latest)
[](https://badge.fury.io/py/wheezy.template)
[wheezy.template](https://pypi.org/project/wheezy.template/) is a
[python](https://www.python.org) package written in pure Python code. It
is a lightweight template library. The design goals achieved:
- **Compact, Expressive, Clean:** Minimizes the number of keystrokes
  required to build a template and enables fast, readable coding. You
  do not need to explicitly denote statement blocks within HTML
  (unlike in other template systems); the parser is smart enough to
  understand your code. The result is a compact, expressive syntax
  that is clean and a pleasure to type.
- **Intuitive, No Time to Learn:** Basic Python programming skills
  plus HTML markup are all you need. You are productive right from
  the start. Use the full power of Python with minimal markup to
  denote Python statements.
- **Do Not Repeat Yourself:** Master layout templates for inheritance;
include and import directives for maximum reuse.
- **Blazingly Fast:** Maximum rendering performance, with ultimate
  speed and context preprocessor features.
Simple template:
```txt
@require(user, items)
Welcome, @user.name!
@if items:
@for i in items:
@i.name: @i.price!s.
@end
@else:
No items found.
@end
```
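The snippet below shows one way to render this template with the engine, loader and core extension modules shipped in the package (see SOURCES.txt further down); the `User` and `Item` context classes are invented here purely for illustration.

```python
# Rendering sketch for the template above; User and Item are
# made-up context classes, not part of wheezy.template.
from wheezy.template.engine import Engine
from wheezy.template.ext.core import CoreExtension
from wheezy.template.loader import DictLoader

source = """\
@require(user, items)
Welcome, @user.name!
@if items:
    @for i in items:
        @i.name: @i.price!s.
    @end
@else:
    No items found.
@end
"""

engine = Engine(
    loader=DictLoader({"list.txt": source}),
    extensions=[CoreExtension()],
)


class User:
    name = "John"


class Item:
    def __init__(self, name: str, price: float) -> None:
        self.name = name
        self.price = price


print(
    engine.get_template("list.txt").render(
        {"user": User(), "items": [Item("book", 19.99)]}
    )
)
```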
It is optimized for performance, well tested and documented.
Resources:
- [source code](https://github.com/akornatskyy/wheezy.template),
[examples](https://github.com/akornatskyy/wheezy.template/tree/master/demos)
and [issues](https://github.com/akornatskyy/wheezy.template/issues)
tracker are available on
[github](https://github.com/akornatskyy/wheezy.template)
- [documentation](https://wheezytemplate.readthedocs.io/en/latest/)
## Install
[wheezy.template](https://pypi.org/project/wheezy.template/) requires
[python](https://www.python.org) version 3.8+. It is operating system
independent. You can install it from the
[pypi](https://pypi.org/project/wheezy.template/) site:
```sh
pip install -U wheezy.template
```
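One optional way to sanity-check the install from the command line:

```sh
python -c "import wheezy.template"
pip show wheezy.template
```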
If you run into any issues or have comments, open an issue on
[github](https://github.com/akornatskyy/wheezy.template).
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/SOURCES.txt 0000644 0001751 0000173 00000001614 14460746162 023551 0 ustar 00runner docker LICENSE
MANIFEST.in
README.md
setup.py
src/wheezy/__init__.py
src/wheezy.template.egg-info/PKG-INFO
src/wheezy.template.egg-info/SOURCES.txt
src/wheezy.template.egg-info/dependency_links.txt
src/wheezy.template.egg-info/entry_points.txt
src/wheezy.template.egg-info/namespace_packages.txt
src/wheezy.template.egg-info/not-zip-safe
src/wheezy.template.egg-info/top_level.txt
src/wheezy/template/__init__.py
src/wheezy/template/builder.py
src/wheezy/template/comp.py
src/wheezy/template/compiler.py
src/wheezy/template/console.py
src/wheezy/template/engine.py
src/wheezy/template/lexer.py
src/wheezy/template/loader.py
src/wheezy/template/parser.py
src/wheezy/template/preprocessor.py
src/wheezy/template/py.typed
src/wheezy/template/typing.py
src/wheezy/template/utils.py
src/wheezy/template/ext/__init__.py
src/wheezy/template/ext/code.py
src/wheezy/template/ext/core.py
src/wheezy/template/ext/determined.py ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/dependency_links.txt 0000644 0001751 0000173 00000000001 14460746162 025731 0 ustar 00runner docker
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/entry_points.txt 0000644 0001751 0000173 00000000101 14460746162 025151 0 ustar 00runner docker [console_scripts]
wheezy.template = wheezy.template.console:main
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/namespace_packages.txt 0000644 0001751 0000173 00000000007 14460746162 026213 0 ustar 00runner docker wheezy
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/not-zip-safe 0000644 0001751 0000173 00000000001 14460746162 024111 0 ustar 00runner docker
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1690553458.0
wheezy.template-3.2.1/src/wheezy.template.egg-info/top_level.txt 0000644 0001751 0000173 00000000007 14460746162 024412 0 ustar 00runner docker wheezy