==> .devcontainer/devcontainer.json <==
{
    "name": "EinopsDev",
    "build": {
        "context": "..",
        "dockerfile": "./einops.Dockerfile",
        "target": "einops-devimage"
    },
    "runArgs": [
        // to use GPUs in container uncomment next line
        // "--gpus=all",
        "--name=einops-devbox"
    ],
    "shutdownAction": "none",
    "postCreateCommand": "uv pip install --system -e . && pre-commit install -f",
    "customizations": {
        "vscode": {
            "settings": {
                "python.defaultInterpreterPath": "/opt/venv/bin/python"
            },
            "extensions": [
                "ms-azuretools.vscode-docker",
                "ms-python.python",
                "ms-python.vscode-pylance",
                "ms-python.mypy-type-checker",
                "charliermarsh.ruff",
                "ms-toolsai.jupyter",
                // very optional git-specific stuff
                "arturock.gitstash",
                "mhutchie.git-graph"
            ]
        }
    }
}

==> .devcontainer/einops.Dockerfile <==
FROM python:3.11-bookworm AS einops-devimage
# note: this will pull aarch64 or x86 depending on the machine it is running on.
RUN pip install uv \
    && uv pip install --system pre-commit hatch

==> .github/ISSUE_TEMPLATE/bug_report.md <==
---
name: Bug report
about: Create a report to help us improve
title: "[BUG]"
labels: bug
assignees: ''
---

**Describe the bug**
A clear and concise description of what the bug is.

**Reproduction steps**
Steps to reproduce the behavior:

**Expected behavior**
What you expected to happen.

**Your platform**
Version of einops, python and DL package that you used

==> .github/ISSUE_TEMPLATE/extensions--introducing-new-operation---improve-notion.md <==
---
name: 'Extensions: Introducing new operation / improve notion'
about: 'You want to suggest a new operation or extend an existing one?'
title: "[Feature suggestion]"
labels: feature suggestion
assignees: ''
---

Einops sparks a lot of interest in introducing new operations that either meet specific requirements or look like a plausible extension of existing ones.

Einops is not a purely experimental project — it is already used in a number of large projects, so there are high expectations for any introduced improvement. Let's see what those expectations are and how to check them:

1. Try to collect **use-cases**.
2. **Implementation** (optional). Implementing a sketch of your proposal in code allows detecting possible conflicts and revealing possible caveats.
3. **Integrity** — does it interplay well with existing operations and notation in einops?
4. **Readability**. This is harder to check, but give it a try.
A simple but useful test is to write an exercise sheet with several examples of your extension, leaving the meaning of each operation for others to guess. Send this sheet to a couple of friends to collect feedback and see how well your notation is understood. This should also help you refine your ideas and make them more digestible.

==> .github/workflows/deploy_docs.yml <==
name: deploy-docs

on:
  push:
    branches:
      # triggers on new commits / merged PRs to main or a special branch
      - main
      - docs-deploy-testing

permissions:
  contents: write

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure Git Credentials
        # credentials of GitHub's public bot, so that commits 'belong' to the bot
        run: |
          git config user.name github-actions[bot]
          git config user.email 41898282+github-actions[bot]@users.noreply.github.com
      - uses: actions/setup-python@v5
        with:
          python-version: 3.12
      ## Switched off cache, deployment is fast anyway.
      # - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
      # - uses: actions/cache@v4
      #   with:
      #     key: mkdocs-material-${{ env.cache_id }}
      #     path: .cache
      #     restore-keys: |
      #       mkdocs-material-
      ## original deploy instruction has --force
      # - run: mkdocs gh-deploy --force
      - run: pip install hatch && hatch run docs:deploy_force

==> .github/workflows/deploy_to_pypi.yml <==
name: Deploy to pypi

on:
  release:
    types: [created]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout sources
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.8"
      - name: Install dependencies
        run: |
          pip install .
          pip install hatch
      - name: Publish einops to PyPi
        env:
          HATCH_INDEX_USER: "__token__"
          HATCH_INDEX_AUTH: ${{ secrets.PYPI_TOKEN }}
        run: |
          hatch run pypi:deploy

==> .github/workflows/run_tests.yml <==
name: Run tests

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.10', '3.12']
        # currently there is a conflict between tf, oneflow and paddle in protobuf versions.
        # cupy is not tested because it demands a gpu
        # oneflow testing is dropped, see details at https://github.com/Oneflow-Inc/oneflow/issues/10340
        # paddle was switched off because of divergence with numpy in py3.10, paddle==2.6.1
        # The last pytensor release that supports python 3.8 doesn't include einsum, so we skip that combination.
        frameworks: ['numpy pytorch tensorflow jax', 'pytensor']
        exclude:
          - python-version: '3.8'
            frameworks: 'pytensor'
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
      - name: Check for ruff compliance
        run: |
          pip install ruff==0.6.5 && ruff check . && ruff format . --check
      - name: Run tests
        run: |
          pip install -e . && python -m einops.tests.run_tests ${{ matrix.frameworks }} --pip-install

==> .github/workflows/test_notebooks.yml <==
name: Test notebooks

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.12']
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
      - name: Install dependencies to run/test notebooks
        run: |
          pip install nbformat nbconvert jupyter pillow pytest numpy torch tensorflow
      - name: Testing notebooks
        run: |
          pip install -e .
          pytest scripts/test_notebooks.py

==> .gitignore <==
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/

# pycharm
.idea

# design-document
design-einops.txt

# logo and video generation
logo

# any local explorations
*.personal*

# oneflow's output
trash
log

# this file is auto-generated
docs_src/index.md

# vs code
*.code-workspace

# macos system files
.DS_Store

==> .pre-commit-config.yaml <==
---
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    # Ruff version, synced with CI
    rev: v0.6.5
    hooks:
      # Run the linter.
      - id: ruff
        types_or: [ python, pyi, jupyter ]
      # Run the formatter.
      - id: ruff-format
        types_or: [ python, pyi, jupyter ]

==> CITATION.cff <==
cff-version: 1.2.0
title: einops
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Alex
    family-names: Rogozhnikov
repository-code: 'https://github.com/arogozhnikov/einops'
abstract: >-
  Flexible and powerful tensor operations for readable
  and reliable code (for pytorch, jax, TF and others)
license: MIT
preferred-citation:
  type: article
  authors:
    - given-names: Alex
      family-names: Rogozhnikov
  journal: "International Conference on Learning Representations"
  title: "Einops: Clear and Reliable Tensor Manipulations with Einstein-like Notation"
  year: 2022
  url: https://openreview.net/forum?id=oapKSVM2bc

==> LICENSE <==
MIT License

Copyright (c) 2018 Alex Rogozhnikov

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

==> README.md <==
https://user-images.githubusercontent.com/6318811/177030658-66f0eb5d-e136-44d8-99c9-86ae298ead5b.mp4

# einops

[![Run tests](https://github.com/arogozhnikov/einops/actions/workflows/run_tests.yml/badge.svg)](https://github.com/arogozhnikov/einops/actions/workflows/run_tests.yml)
[![PyPI version](https://badge.fury.io/py/einops.svg)](https://badge.fury.io/py/einops)
[![Documentation](https://img.shields.io/badge/documentation-link-blue.svg)](https://einops.rocks/)
![Supported python versions](https://raw.githubusercontent.com/arogozhnikov/einops/main/docs/resources/python_badge.svg)

Flexible and powerful tensor operations for readable and reliable code.
Supports numpy, pytorch, tensorflow, jax, and [others](#supported-frameworks).

## Recent updates:

- 0.8.0: tinygrad backend added, small fixes
- 0.7.0: no-hassle `torch.compile`, support of [array api standard](https://data-apis.org/array-api/latest/API_specification/index.html) and more
- 10'000🎉: github reports that more than 10k projects use einops
- einops 0.6.1: paddle backend added
- einops 0.6 introduces [packing and unpacking](https://github.com/arogozhnikov/einops/blob/main/docs/4-pack-and-unpack.ipynb)
- einops 0.5: einsum is now a part of einops
- [Einops paper](https://openreview.net/pdf?id=oapKSVM2bcj) is accepted for oral presentation at ICLR 2022 (yes, it is worth reading). Talk recordings are [available](https://iclr.cc/virtual/2022/oral/6603)
Previous updates

- flax and oneflow backend added
- torch.jit.script is supported for pytorch layers
- powerful `EinMix` added to einops. [Einmix tutorial notebook](https://github.com/arogozhnikov/einops/blob/main/docs/3-einmix-layer.ipynb)
## Tweets

> In case you need convincing arguments for setting aside time to learn about einsum and einops... [Tim Rocktäschel](https://twitter.com/_rockt/status/1230818967205425152)

> Writing better code with PyTorch and einops 👌 [Andrej Karpathy](https://twitter.com/karpathy/status/1290826075916779520)

> Slowly but surely, einops is seeping in to every nook and cranny of my code. If you find yourself shuffling around bazillion dimensional tensors, this might change your life [Nasim Rahaman](https://twitter.com/nasim_rahaman/status/1216022614755463169)

[More testimonials](https://einops.rocks/pages/testimonials/)

## Contents

- [Installation](#Installation)
- [Documentation](https://einops.rocks/)
- [Tutorial](#Tutorials)
- [API micro-reference](#API)
- [Why use einops](#Why-use-einops-notation)
- [Supported frameworks](#Supported-frameworks)
- [Citing](#Citing)
- [Repository](https://github.com/arogozhnikov/einops) and [discussions](https://github.com/arogozhnikov/einops/discussions)

## Installation

Plain and simple:

```bash
pip install einops
```

## Tutorials

Tutorials are the most convenient way to see `einops` in action

- part 1: [einops fundamentals](https://github.com/arogozhnikov/einops/blob/main/docs/1-einops-basics.ipynb)
- part 2: [einops for deep learning](https://github.com/arogozhnikov/einops/blob/main/docs/2-einops-for-deep-learning.ipynb)
- part 3: [packing and unpacking](https://github.com/arogozhnikov/einops/blob/main/docs/4-pack-and-unpack.ipynb)
- part 4: [improve pytorch code with einops](http://einops.rocks/pytorch-examples.html)

Kapil Sachdeva recorded a small [intro to einops](https://www.youtube.com/watch?v=xGy75Pjsqzo).

## API

`einops` has a minimalistic yet powerful API.
Three core operations are provided (the [einops tutorial](https://github.com/arogozhnikov/einops/blob/main/docs/) shows that they cover stacking, reshape, transposition, squeeze/unsqueeze, repeat, tile, concatenate, view and numerous reductions)

```python
from einops import rearrange, reduce, repeat

# rearrange elements according to the pattern
output_tensor = rearrange(input_tensor, 't b c -> b c t')
# combine rearrangement and reduction
output_tensor = reduce(input_tensor, 'b c (h h2) (w w2) -> b h w c', 'mean', h2=2, w2=2)
# copy along a new axis
output_tensor = repeat(input_tensor, 'h w -> h w c', c=3)
```

Later additions to the family are `pack` and `unpack` functions (better than stack/split/concatenate):

```python
from einops import pack, unpack

# pack and unpack allow reversibly 'packing' multiple tensors into one.
# Packed tensors may be of different dimensionality:
packed, ps = pack([class_token_bc, image_tokens_bhwc, text_tokens_btc], 'b * c')
class_emb_bc, image_emb_bhwc, text_emb_btc = unpack(transformer(packed), ps, 'b * c')
```

Finally, einops provides einsum with support for multi-letter axis names:

```python
from einops import einsum, pack, unpack

# einsum is like ... einsum, generic and flexible dot-product
# but 1) axes can be multi-lettered 2) pattern goes last 3) works with multiple frameworks
C = einsum(A, B, 'b t1 head c, b t2 head c -> b head t1 t2')
```

### EinMix

`EinMix` is a generic linear layer, perfect for MLP Mixers and similar architectures.

### Layers

Einops provides layers (`einops` keeps a separate version for each framework) that reflect the corresponding functions

```python
from einops.layers.torch import Rearrange, Reduce
from einops.layers.tensorflow import Rearrange, Reduce
from einops.layers.flax import Rearrange, Reduce
from einops.layers.paddle import Rearrange, Reduce
```
Example of using layers within a pytorch model

Example given for pytorch, but code in other frameworks is almost identical

```python
from torch.nn import Sequential, Conv2d, MaxPool2d, Linear, ReLU
from einops.layers.torch import Rearrange

model = Sequential(
    ...,
    Conv2d(6, 16, kernel_size=5),
    MaxPool2d(kernel_size=2),
    # flattening without need to write forward
    Rearrange('b c h w -> b (c h w)'),
    Linear(16*5*5, 120),
    ReLU(),
    Linear(120, 10),
)
```

No more flatten needed!

Additionally, torch layers like these are script-able and compile-able.
Operations [are torch.compile-able](https://github.com/arogozhnikov/einops/wiki/Using-torch.compile-with-einops), but not script-able due to limitations of torch.jit.script.
## Naming

`einops` stands for Einstein-Inspired Notation for operations (though "Einstein operations" is more attractive and easier to remember).

Notation was loosely inspired by Einstein summation (in particular by the `numpy.einsum` operation).

## Why use `einops` notation?!

### Semantic information (being verbose in expectations)

```python
y = x.view(x.shape[0], -1)
y = rearrange(x, 'b c h w -> b (c h w)')
```

While these two lines do the same job in *some* context, the second one provides information about the input and output. In other words, `einops` focuses on the interface: *what are the input and output*, not *how* the output is computed.

The next operation looks similar:

```python
y = rearrange(x, 'time c h w -> time (c h w)')
```

but it gives the reader a hint: this is not an independent batch of images we are processing, but rather a sequence (video).

Semantic information makes the code easier to read and maintain.

### Convenient checks

Reconsider the same example:

```python
y = x.view(x.shape[0], -1)  # x: (batch, 256, 19, 19)
y = rearrange(x, 'b c h w -> b (c h w)')
```

The second line checks that the input has four dimensions, but you can also specify particular dimensions. That's opposed to just writing comments about shapes: comments don't prevent mistakes, aren't tested, and without code review tend to become outdated

```python
y = x.view(x.shape[0], -1)  # x: (batch, 256, 19, 19)
y = rearrange(x, 'b c h w -> b (c h w)', c=256, h=19, w=19)
```

### Result is strictly determined

Below we have at least two ways to define the depth-to-space operation

```python
# depth-to-space
rearrange(x, 'b c (h h2) (w w2) -> b (c h2 w2) h w', h2=2, w2=2)
rearrange(x, 'b c (h h2) (w w2) -> b (h2 w2 c) h w', h2=2, w2=2)
```

There are at least four more ways to do it. Which one is used by the framework? These details are ignored, since *usually* it makes no difference, but it can make a big difference (e.g.
if you use grouped convolutions in the next stage), and you'd like to specify this in your code.

### Uniformity

```python
reduce(x, 'b c (x dx) -> b c x', 'max', dx=2)
reduce(x, 'b c (x dx) (y dy) -> b c x y', 'max', dx=2, dy=3)
reduce(x, 'b c (x dx) (y dy) (z dz) -> b c x y z', 'max', dx=2, dy=3, dz=4)
```

These examples demonstrate that we don't need separate operations for 1d/2d/3d pooling: they are all defined in a uniform way.

Space-to-depth and depth-to-space are defined in many frameworks, but how about width-to-height? Here you go:

```python
rearrange(x, 'b c h (w w2) -> b c (h w2) w', w2=2)
```

### Framework independent behavior

Even simple functions are defined differently by different frameworks

```python
y = x.flatten()  # or flatten(x)
```

Suppose `x`'s shape was `(3, 4, 5)`, then `y` has shape ...

- numpy, pytorch, cupy, chainer, jax: `(60,)`
- keras, tensorflow.layers, gluon: `(3, 20)`

`einops` works the same way in all frameworks.

### Independence of framework terminology

Example: `tile` vs `repeat` causes lots of confusion. To copy an image along width:

```python
np.tile(image, (1, 2))  # in numpy
image.repeat(1, 2)      # pytorch's repeat ~ numpy's tile
```

With einops you don't need to decipher which axis was repeated:

```python
repeat(image, 'h w -> h (tile w)', tile=2)  # in numpy
repeat(image, 'h w -> h (tile w)', tile=2)  # in pytorch
repeat(image, 'h w -> h (tile w)', tile=2)  # in tf
repeat(image, 'h w -> h (tile w)', tile=2)  # in jax
repeat(image, 'h w -> h (tile w)', tile=2)  # in cupy
... (etc.)
```

[Testimonials](https://einops.rocks/pages/testimonials/) provide users' perspective on the same question.

## Supported frameworks

Einops works with ...
- [numpy](http://www.numpy.org/)
- [pytorch](https://pytorch.org/)
- [tensorflow](https://www.tensorflow.org/)
- [jax](https://github.com/google/jax)
- [cupy](https://github.com/cupy/cupy)
- [flax](https://github.com/google/flax) (community)
- [paddle](https://github.com/PaddlePaddle/Paddle) (community)
- [oneflow](https://github.com/Oneflow-Inc/oneflow) (community)
- [tinygrad](https://github.com/tinygrad/tinygrad) (community)
- [pytensor](https://github.com/pymc-devs/pytensor) (community)

Additionally, einops can be used with any framework that supports the [Python array API standard](https://data-apis.org/array-api/latest/API_specification/index.html), which includes

- numpy >= 2.0
- [MLX](https://github.com/ml-explore/mlx)
- [pydata/sparse](https://github.com/pydata/sparse) >= 0.15
- [quantco/ndonnx](https://github.com/Quantco/ndonnx)
- recent releases of jax and cupy
- dask, supported via [array-api-compat](https://github.com/data-apis/array-api-compat)

## Development

A devcontainer is provided; this environment can be used locally, on your server, or within github codespaces.
To start with devcontainers in vs code, clone the repo and click 'Reopen in Devcontainer'.

Starting from the next version, einops will distribute tests as part of the package.

To run tests:

```bash
# pip install einops
python -m einops.tests.run_tests numpy pytorch jax --pip-install
```

`numpy pytorch jax` is an example; any subset of testable frameworks can be provided. Every framework is tested against numpy, so numpy is a requirement for tests.

Specifying `--pip-install` will install requirements in the current virtualenv; it should be omitted if dependencies are installed locally.
To build/test docs:

```bash
hatch run docs:serve
# Serving on http://localhost:8000/
```

## Citing einops

Please use the following bibtex record

```text
@inproceedings{
    rogozhnikov2022einops,
    title={Einops: Clear and Reliable Tensor Manipulations with Einstein-like Notation},
    author={Alex Rogozhnikov},
    booktitle={International Conference on Learning Representations},
    year={2022},
    url={https://openreview.net/forum?id=oapKSVM2bcj}
}
```

## Supported python versions

`einops` works with python 3.8 or later.

==> docs/1-einops-basics.ipynb <==

# Einops tutorial, part 1: basics

## Welcome to einops-land!

We don't write

```python
y = x.transpose(0, 2, 3, 1)
```

We write comprehensible code

```python
y = rearrange(x, 'b c h w -> b h w c')
```

`einops` supports widely used tensor packages (such as `numpy`, `pytorch`, `jax`, `tensorflow`), and extends them.

## What's in this tutorial?

- fundamentals: reordering, composition and decomposition of axes
- operations: `rearrange`, `reduce`, `repeat`
- how much you can do with a single operation!

## Preparations

```python
# Examples are given for numpy. This code also sets up ipython/jupyter
# so that numpy arrays in the output are displayed as images
import numpy
from utils import display_np_arrays_as_images

display_np_arrays_as_images()
```

## Load a batch of images to play with

```python
ims = numpy.load("./resources/test_images.npy", allow_pickle=False)
# There are 6 images of shape 96x96 with 3 color channels packed into tensor
print(ims.shape, ims.dtype)
```

```
(6, 96, 96, 3) float64
```

```python
# display the first image (whole 4d tensor can't be rendered)
ims[0]
```

[image output]

```python
# second image in a batch
ims[1]
```

[image output]

```python
# we'll use three operations
from einops import rearrange, reduce, repeat
```

```python
# rearrange, as the name suggests, rearranges elements
# below we swapped height and width.
# In other words, transposed first two axes (dimensions)
rearrange(ims[0], "h w c -> w h c")
```

[image output]
yFHCkEkDgz8XBoTovz93VVcC3RNN3ly6hM5gvX77eywuVCQhUBQhcwv7+WmFhxO7dAu6lmGB7DAVI+E0IVJZDAwMlvvpKszbPt9pfrl+POXTot6YmPgFJP1ebny+Gx/GO7u7bd+7cbm+f/fefDxSdQVmcmSmxTqkOQBAonMgUlpV9lppKUl+cj2B2BiWLf7DLB15eHh4C+aXQ0X861QSI8wovEF7Iy4MNPRTzCDcl8rHKAuI8SmyoGxAXLjwE19TWwttWt1pb4WkYKjODQ0MCR3//O0AcKa4B7zs9GBwcGx+XyWTyItIS+NHWhvNA+CsoKaeCnAsRjQXLIBGxLsgQFX7NL8h6qDtlgBCkDBADhBBAzCyDGCCEAGJmGcQAIQQQM8sgBgghgJhZBjFACAHEzDKIAUIIIGaWQQwQQgAxswxigBACiJllEAOEEEDMLIMYIIQAYmYZhAD6G5tIOOckeJAKAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# we could use more verbose names for axes, and result is the same:\n", "rearrange(ims[0], \"height width color -> width height color\")\n", "# when you operate on same set of axes many times,\n", "# you usually come up with short names.\n", "# That's what we do throughout tutorial - we'll use b (for batch), h, w, and c" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Composition of axes\n", "transposition is very common and useful, but let's move to other capabilities provided by einops" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAGAAAAJACAIAAAD9yfRwAAAipklEQVR4Ae2dCXxN1/bHn8pIEgSJIZKIGIoQIirmmYhEW6SqNHjKowP/Ruu1WqpaSutPDZ2lNVS1QiliChXzGBKEJKZEBjFGZvN/eff/7rvvDmetc88+9+amq5989Jy9fmfvvb53nX33dM6t9CQn52/8n2kCz5g2seUpAQaExAEDYkAIAcTMEcSAEAKImSOIASEEEDNHEANCCCBmjiAGhBBAzBxBDAghgJg5ghgQQgAxcwQxIIQAYuYIYkAIAcTMEcSAEAKImSMIAWSH2NU037t//0hi4t7Dh5PPnUu/dCknL6+ouBgSnZ2cqjg713J3b+jt7eftHRwY2LFdu0a+vmrWxWTelSy/cPjkyZPte/asWrdu4/btQMRk1f7b0Lhhw8iIiKjISDj4b4u6ZxYF9Pjx45WxsXOXLj2Xnm6eW88888zgsLAZb7/domlT83KQe5XlAMHd9Pr7759ITpZbRUO9vb199PjxM6KjnRwdDa1iUywBCAJn9qJFH82f/+jRI4G1b9e69e8xMV516wrM0zAr1QGVlJYOHTcubtcuw7KVp9Tx8Nizbl3TRo2UZ2UqB3W/5u/cvds7MlIlOuDStevXew0dejkz05R7ytNVBFR2715EVNShEyeU11Iih+xr1waNHl1aViahUWJSEdDIN9/cf/SoksoRrz197tzk6dOJYrkytQAt/fHH2M2b5dbGbP13q1YdOHbM7MslLlSlkT5/4ULbvn3VC3uj/rRp2fLE9u2VKlUyajU7UZUImvThhxamA/6fPHNmS3y82SBMXSg+gjbv3BkeFWWqPMN0N1fXwQMG9O7atW1AQM0aNaq7ud0tLLx5+/aZ8+fh6+/3rVvzCwoMrzKa0rVDh4T1642azE4UD6hd//7E7nLlypXfmTDh3ddfr1GtmikHoKPw6ZdfLvjuO+htmtLopqcfPOgvdFgr+BbbvX8/kY6ri8uWlSvnvP++BB3wHKxfTJ8e+/33xFHFmg0bdHkpPxYMaHFMDKVO0JSuWrKkX/fuFDFoXggNXTpnDkUMtyRFRteIBHS3oGDr7t2Usl975ZWIvn0pSq1mzLBhfbt1056aOjh19ixUw5TVjHSRgDZs2wbTXWglHB0cPoqORmWGgncmTjRM1EuBpkps71QkoB0JCXrVNXo6oFevup6eRk3Sid1DQlyqVpXWgJXYCKL5aAQiAR08fpxSKjQoFJmhxs7ODroChul6KSlpaXopSk6FAcq7cePK1auUqsA8DkVmVONZu7bRdN1Es6crdTPRHgubtIfhhTZT6YPmhLZWOgdpK4zvpQWyrMIiiBg+sipnnvh2fv6DBw/Mu9bwqgoICFZNbty+beiqeSnCAMGYwLwaqHFVcUmJqGyFAYK5Z1F1Up5PmbgJRmGALD+/IcGR0l+VuFzXJAyQbqYV6VgYIFhNLz9cHBwcRFVGGCDYcSCqTsrzcbC3V56JJgdhgGAzhqg6Kc+napUqyjMRDMjHy0tUnZTn4169uvJMNDkIG2r4NmhArNO1pCTKkIqYm9oyYbcYfT+KqivFwnkJAwRR3cTPj1I/2D1FkZUTjTBA4E/H4GCKV18tX15YVERRlgeNSEDhffpQXLp+8+Y/pk6FISVFbHWNSEAwlwqLORSXVv/++9joaIGTEpRCzdOIBARLV5Hh4cR6xKxZExwaCltciXqjMtiyBhPhUZMmzZw/36hAeaLglVWY7mzRvbus26dz+/Zvjx8Pa2T0wQos7MTv27dp505Y5r515w5QgEWkjT/9pByHYQ6CAUEBg8eOXR8XZ1iSdApEX8/OndsHBjZv0qSpvz8s0rtUqQJrGDBJAGvz+XfvAghYrT+enHw8KQk+Br3tjrAL7/y+fdJFmGcVDwi6ORBEFp79gAWP0kuX4F/zKEhcJbIN0hQD2+OnTZokUaQapocPH15SZ6eieEDg/z/feAPuFzVASOSZevG
ihNVskyqAYF/LL1995V2/vtnVMuPCVPK6k6zMVQEENfCoVWv32rX169SRVRslYluKII2f8HzOn+vWid3OJEHQ9gCBM/BkztGtW+n7gCT8R002CQi8gi1iW3/+eens2bAXEXVSiQCGePTdjPSC1GqDdGsA+8kmjhqVkpAA+6bgQR1dk9hjNdpp8R1FaZ8zsrIWLVsGT9PBBy6tpFtr16w5qF+/IQMH9urcWXhf0dKANG5Dv27bn39ujo+HoaZ5E4ywTS2kXbseHTtChyskKAg6FnSgspTWAaRbxas5ObCxMOns2bRLl2DnSnZuLizzw0gF/mDQC+sTmkEZTKRAx6qZv7/mr1Xz5sR9r7plmXFsfUBmVNqSl1iikbakP8LLYkAIUgbEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEbJer7ivAkOLLv5lvMeQzEr/jCCnwX+b0c+mrf1i9b9e+rIwsWL6oWbtmQNuA/oP6Dx4x2N5BxZVFSt30NJVynuToJal6ev/e/Rn/M2PltyuNvs2lvnf9hT8t7NSjk6p1kJW5RQFBsET2jjx+8LhEFWEJcP6y+ZFRkRIaS5os2gZFj42WpgOew+7Md15759iBY5akIFGW5SLo1LFTA9oPkKiKrikwODDuqOytsro5iDq2XAT9/P3P9EoDzTMnz9D16iktB+jA7gOy3Ngbv1eWXiWx5QBlZ2bL8iHrSpYsvUpiCwEqKS6R++hK/p18lXyWla2FADlXcZa7tcmtmpssT1QSWwgQ7MKr6yVv1Ofl46WSz7KytRAgqFPnXvL23nfp3UWWJyqJLQdo+NjhdB9gaNYqqBVdr57ScoCCOgRFvBRB8QRaq5kLZsJdSRGrrbEcIPBkQcyCts+1lXYJfvdg3rfzOnTtIC2zmNWigOC7LPbP2FdeewUoGPUQGvJftv8ybMwwo1arJFpuLKbrXlpKGow89u/en52RXVpa+nQ+qE1Av0H9howc4uDooKu0+rF1AFndbXoFjIc6/foKr2RAyEfMgBgQQgAxcwQhgIQt++Rm5QY1CEJKk2nedGgT9L9lXiRYzhGEAGVADAghgJg5ghgQQgAxcwQxIIQAYuYIYkAIAcTMEYQAEjbUgNlS4l6s2zdvt6zdEqlXuTFzBCEfBQNiQAgBxMwRxIAQAoiZI4gBIQQQM0cQA0IIIGaOIAaEEEDMHEEMCCGAmDmCGBBCADFzBDEghABitkIElZP9vQiYf5utAEi9967+2ymR/7cCIDt76kT4o4ePRPpqVl5WAOTo5EisanFRMVGpnswKgOAWc3F1obh08/pNikxVjRUAgT/V3atTvEo9k0qRqaqxDiDPep4Urw7uOUiRqaqxDiDiw3Lw7POF8xdU9R/N3DqA/Br7oTXTCGZGzyQqVZJZB1CzgGZEf3bF7Xpv4nvoA8HwkvyEHQnvjHsnKjyKmDNRZp2HWeClL+192xOrCDLfRr5RE6O69OpSr0E912qu8A6QwruF2Vez4dHxlOSUxMOJcDNq+gSubq6pd0U27dYBBD6HNArJuJRBZ0RXJmYl1qkv7DeFrHOLgbd9I/rSfZalhIfRZOmlxVYDNPTVodI1M9taQQC1bNNSpQdT4fVWZsM1vNBqEQRVmTJzimGFlKdUkAgCEB27d4SHVJUT0cuh4gACxz5Z/An8Fo2ehwpP79y6c+vGLYWZaC+35i0GlYA3mKzcshJ6N9oKCTkQGERWBgQ4fPx8Nh/aLPBFFPCiQYETSdYHBIygXwfPzkGbDW8eUBJBEIlvT3/7WMax3mG9leSje63VetK6ldAewwzZD1/+sG7VOlmvq2rg2wCIhL4YCq2+qXc6aIuQe1C+AGlqDz8rlnwi+ej+o0nHk2A4knM1p6igqLSkFJZDXNxcYLRVy6OWXxO/Rk0bQQPfpn0b4U2YLsTyCEi3flY/LhdtkNUpSFSAAUnAeWpiQAwIIYCYOYI
YEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMx2f+MfGJNExLeYJB6eUUTwMCAGhBJABNwGMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCEIA2SF2oea8vJzMzMtZWRnZ2ZnwL/zdunWjtLSkrAx+KvPpv/AHD2o4O1dxcnKuVcvDy8vH29svMDA4KCjE17eR0LpQM6uUk/OEqpWpe/z4cVpaSlLSseTkxJSUpHPnThcU5MvM4z9yf/9mzz//8ssvj6lb1+s/qeofiQcEFJYv//rgwT0nThwqKioU64K9vQMwmjJlJsSX2JxN5SYeUFLS8dDQYFPlCUmvUaPm7NlLBg2yxA/62mQjfefOrQkTXp4370MhuKUzsUlAGpcWLvxkzpz3pd1TbrVhQOD84sVz1q5doZyCRA62DQgc++CDN3NzsyQ8VGiyeUCFhQUz1XnFhYaszQMCNzZt+u3cuWSFkWLq8ooACDrfMTFLTHmoML0iAAIEGzeugcGKQhZGL7foWEyvBh9/vLBRo6Z16tRzd6/t/PS/KjA6uXnz+tWrV+LjN69fv/ratWy9S0ydQpcd+u69eg0wJTA73Zo96VOncj08TL5sDFrfadPeiI1dSfTt739/a9asL4liuqz83mKurm4LF/7UtWsfojOnTycSlbJk5RcQuAEv4pg79xviC7ph5kCW50RxuQYEPvj4+HXq1JPiTH7+bWi/KEpZmvIOCJzp3TuM6FJ6+jmiki6zAUAwnUj0JyPjIlFJl9kAoBYtWhP9gS8+opIuswFAMD8NM2QUl4qLiygyWRobAAT+1KlTn+KV8BleKNQ2ALm7kyKopOSvGkEODqQfUlBjOGYbEeTo6ES5xWBYT5HJ0tgGIGIEyfKcKLYNQERn1JAxIIQqA2JACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZvHbgJECbc3MtxjyiTEgBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMx2/PNi0oT4FpPmYyPPrCJOqGnmCELoMiAGhBBAzBxBDAghgJg5ghgQQgAxcwQxIIQAYuYIYkAIAcTMEcSAEAKImSOIASEEEDNHEANCCCBmjiAGhBBAzBxBDAghgJit+U777MuXT+zZk3L8eGZ6etbFi4X5+WXFxfBae+eqVau4utbx9vZu3Lhxq1ZB3bo1CQyEl28jrqhjrnRChZczSlc158qVuJUr41atykhLk1ZqrdVq1uwTGRkeFdXyuee0iZY5sCigSykpyz79dMevvz5+9Mg89wI7dx43ffpzfagvujevFN2rLASotLj4m+nTV3/5pdlodCvdLSJi6pIlng0a6CaqdGwJQOcTE98dMgRaHIE+QCM1Iyam95AhAvM0mpXqLd+21atHd+oklg54UlJYOHXo0KXTphn1SmCiuoA2LFv24ciR98vKBNZYN6uY2bPnvfmmborwYxUBbV+z5pPXXoOvbeGV1s3w1yVLvnz3Xd0UscdqtUHQuxnbteu90lKx1TWV28zlywe++qopq5J0VQCVFBVFtmyZm5GhpGayrnVwclpz6pRP06ayrqKIVbnFFk6ZYkk64Cc0czPHjFHjdhYP6NyJE+u/+47y4YjVJB08uGXFCrF5Qm7ib7GJffociY+nV9TR2bnLwIF9X3rJt1kzj/r17R0cbuTk5F29unfTJuhzwzE9q3q+vutTUyEH+iWoUjCg5EOHRnfsiJaqFYT06wf9vdr16mlTdA8ePXz43cyZMXPm0Pvf05ctGzRmjG4mCo8F32LwpUuv0MgpU5Zs22aKDuRT2c5uwqxZCzd
teqZyZWK2sV9/TVQSZSIB3b11a1dsLLHgXoMHT5o3jyLuFBo6ae5cihI00L1IPXmSKKbIRALas3Hjg/v3KaW6ubvDvUD8ZTXIcER0dLO2bSk5g2b3+vVEJUUmFNCGDZQiQfP3adNcqlUjijWy16ZPJ+qhdScqKTJhjTT0QbpVqwZdRLTUKi4u8devw5cXqtQThPn4XMvM1Es0errrxo3qtWoZNclNFBZBl86epdCB+nUNDzeDDlzYfdAgonvQEhGVqEwYoLPHjqGFaQQ9Bw8mKvVk0F3SSzF1Wh4BwcS7qerqpTdv104vhXgKU/dE5dULF4hKVCYsgq7RhqbQNtf18UGrZVTg7uFBbFmITZXRUvQShQHKy8rSy9roqU+TJkbTiYkN/P0pyuu0ylCyEgaI2EK7VK9OqZYpTVU3N1Mm3XRYI9A9VXIsDFBZSQmlHi40D01lVdXV1ZRJN51YGd1LTB0LA/TwwQNTZeimm/cFr82BeDmxQ6/NVuJAGCBHJ9Kv7CmchCXeyESOEly0JnGAaD1joofa+ukdFBcU6KUYPXWiVcbotXqJwgARx1awQ0GvBrJOiZcTK0MpWhggTy8vSnn0/qRhbvAjmBmpqYbphikCV6WFAYLdKoYVNUwpuH37dl6eYTolBbbIEL+/iZWhFCoMUMNnn6WUB5pziYlEpZ7s7NGjeimmTmF625RJbrowQM8GBRHL3rd5M1GpJ4snT1e2CA7Wu9bsU2HzQdBAdK9Ro+juXbQq0EBsycigTydqMoTdCr08PCjL/DCTnZCfD9vU0JpQBMIiCBzuQNvXBEs6e8hzj1ofVi9cSKEDethkJYoO5CYMEOTVJTxc64/0wfezZkkL9Kz5N2+u+OILvURTpzAhZ8pkRrpIQFAzWCOnVAIWHlZ8/jlFCRqYzJ0xahSxiwh7PWG9hJgzRSYSkFuNGn2GDqWUCprF7713YOtWVAxN2/zJk/dv2YIqNYKQ/v3Nnm8yWoRIQFDAS2+8YbQYw0RYLJ0cHv71hx/C8qmhVZMCc0ywkL1m8WJTAsP0yIkTDROVpAj7FtNW4q2wsANxcdpT9ABWVmFhHm5P6N3BMQzEr2dnXz53DhbmYQGH2DBrSoHJ3JXkqXG0YhqBeECpp06NCApSYycK6hIsZMNiPyqTJRB8i0HZTQMDX1J536BRD2HHq3A6UJD4CIJMYcQ0rHVrGDoZ9USNRFjLXpeS4u7pKTxz8REEVYR+2ufr1gnsrUm7DXs/5vzyixp0oFxVAEG+TVq3/njFCss8gQK7RDr07SsN0WyrWoCgQj1ffNECjMZ/9NGIt98223/0QhUBQdmhr7zy2W+/qXSvQXi+NXfuuBkzUCeVCFRppPUqlJ6c/M7gwQKXgyF/6LXPWrmyc1iYXlnCT9WNIE114aG4NUlJcCPQd9JJ+wk3b2xKigXoQDUsEUFab+F5MdiUGb92LYywtImyDmAqY8LHH7fr0UPWVUrEFgWkqSjca3/8+CM8cUjfYvB0GBwZGTF6dAV/4lDvk7xy/vyJhAR4mgyWOnIuX4bZSOhhwhgFGnXYhfb0mdUmTRoHBLTt1q1Zmzaibk+9OqCnVoggtE7lSmCJRrpcOSy3MgwIIcaAGBBCADFzBDEghABi5ghiQAgBxMwRxIAQAoiZI4gBIQQQM0cQA0IIIGaOIAaEEEDMHEEMCCGAmCs9OWHmAgOScUUx8y2GfJIMiAEhBBAzRxADQgggZo4gBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMx2iF2cufReaVpGWkZuRha8BjIvK/tGNvx7484NSIe/krKS0rLS+w/vOzs6V3Gq4u7m7gNvUKjrE9g0MKRVSIB/gF1ly1VV12kVFw4LSwoPJR86cubIyfMnk9KSruReMfth+mou1QZ1HxTZJzK0Y6hlnoPVMhIPKL8wf8mvS7bs33Ls7LFHjx9pSxJ
y4N/Af8rIKWMGjbG3sxeSIZqJeEDHU44HjxT2AiijDrRo1OKrf37VtW1Xo1axiTb5LXb24tnu47pPWzrN7HuWDtEmAYF78FDn7JjZ4ZPDoYGne2uG0lYBaVyNOxAX8T8RqjKybUCAKf5I/IgPRpgRGsRLbB4Q+Ll+9/rPfvqM6LBcWUUABD5/sPSDk6kn5TpP0VcQQNDhGvfJOOHdLiBYQQCBJ9D/WhW3ihIUsjTWGeBoqrggekFj78ae7p7u1dxdnF1cqrg42DsUFBekZ6bvTdwL3ianJ8tyBr74RwwYUfmZyrKukhZbsyeduyO3Ts06EvVbt2vdxM8mXr99XUKjZ1r/xfoXerygl6jktFzfYoN7DT6+6jiMv+gexmyMoYspynINCBxo4Nlg+9LtMJqnOAOarQe2whQKUUyRlXdA4INffb/PJ1PfawpfZDsO76B4TtTYACDwZHTE6CY+1J90+SsCgunEaWOmET/zA6cOEJUUmW1EEHjyYs8XnRxIrxq+lH0JJjMpzlM0NgMIekmhnUIpLsFMyJkLZyhKisZmAIEzPdpR38wFSwMU5ykaWwIU0DiA4hJoYMmEqERltgSoVeNWqD8awbWb14hKVGZLgGCxjLiYUVxajHpOFNgSIHDJtYorxTGBk7A2BqiaK2nMcf/BfQpHisbGABHfUAnTJhTnKRobA1RUUkTxytHBkSKjaGwMELH1repcleI8RWNLgGAAQWx9PWp4UJynaGwJUOoV0q+vgdse7n9JQGmZaZTPHDSwsYioRGW2FEEwk4/6oxE09WlKVKIyWwIUtz8O9QcE0Jms71GfoqRobAbQwaSDV/OuUlwKbiFyd5LNAPp02acUOqDp2LojUUmR2QYg2MIBO10o/oCGPm1EydAGAF3JuTLsvWEUZ0ADC0Rd2nQhiikyawJ6a95bZy4ic6PbD20PGRVy6+4tijOgGdhlIHFKhJihHVGnhmwt/D5C/Nrmfs3DOoe1bdYWNkN71vSEEIDxRM6NnGMpx2B5ftfRXcQBqqaGsAFWbFWtCUjjSQr8mMSlFCFewVYIsQ0Q1Mqat5gQKLqZ/HPUP+X+fKvu5UaPKw4g2OPw6sBXjTqpJLHiAFoydYkaz3NUEEAjw0b2C+mnJFJMXVsRAMHX39fvfW3KQ4XpNg/Iu473Hwv+EDiFqAfUtgE18mqU8H2Cbz1fPa8EntowoPCu4bBBT1U6ANqagPo818e87526tequnLUS7qzqrtUFBovRrKzZk14xawU8z7Rs47LY+Fjijl8Yl7we+XrUwCj1Gh09TOVlGzBMhu08vDPxfCKQyryWCY8twhKYk6MTTA96eXo92/DZoGeD+nfsL3AuVQ+EqdPyAshU/ayebs02yOrOUyrAgBBKDIgBIQQQM0cQA0IIIGaOIAaEEEDMHEEMCCGAmDmCGBBCADFzBDEghABi5ghiQAgBxCx+wgwp0NbMfIshn1ilJ09OIJK/tpkjCPn8GRADQgggZo4gBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQgggZo4gBoQQQMwcQQwIIYCYrfkwy4ULVw8fPg0PhZ8/fyU392Ze3u38/MKysnv37j2wt7erWtWpKjyyUvXpy+7hl8YawbNh//pr2dLf09MdcUuc2dILh/CWgISExF9/3bF58z74HTbzHPH1rde1a9t+/UJCQzvWqOFmXibEqywH6MGDhzExGxcs+Dk1VdibaB0c7AcM6PSPfwzp27eD8CfmNQQtBCg+/sgbb8wViEbv84+JmTF6dIReopBT1dsgeGTu/feXzpu3XNYrOOT61rixt9xLiHp1AT18+GjEiA+gxSHWxmxZy5aNzL5W+kJ1+0Hjx39qATpeXp7Vq5NegivNwqhVRUDff/87tMpGSxWbqF74QD3VAnT9+u3o6P8VC8JUbjYJaO7c5YWFJaZcEpsOXUexGermpkoE3bt3/8cf/9AtRtV
j24ug3buP3blToCoUbebww73Nm/tpT4UfqPI1v2ePjH2P7u5uw4b169KlDfRl4PsIhmDOzk7Qeyoru19YWJyTcyMr6/rp0xdOnUrduzfx5s18PQR+fvWdnYW9PVovczhVBRD4Y1iS0RTo/i5a9A4MR/WslSs/A+NVV9cq9erVbteu+fPPdwcBdDUTE89v3Lhnw4Y92iJUvb+gUFWGGq1bD0tOTtfz2fA0IqLbxo1mftNdvJgFpDZuTOjZM3jGjHGGmYtKUQWQv/8gcACt4po1c156qS8qs65AlW8xR0cHilcwH0SRWVejCiBix3/+/FWXLwv7jRmVOKoCqFkzX0p1oSsQEjJqx47DFLG1NKoAeu65lkR/YJq1X7/Xn38++siRM8RLLCxTpZG+du1W/fr95f4YfFDQs2PHPj90aJ+aNUk/v2IZUqoAgqoPHz7tl1+2meEDdH969Wr/wgs9Bg3qbsnJeVNVVQvQlSs5bdoMh1UKUwWj6dBXhD7O8OGhL77Y081N2A+JoOXqCdQCBMX88UcCNC7KZ1qdnBzCwroMH94/LKwzsQOh56SSUxUBQbU++ujbmTO/U1I/3Wtr164xYcKQiRMjLXnrqQsI3Fuy5NfJk7949OixrqtKjiGIoC2fNWuC2itimkqqDgiK2bbt4JgxM2HtVAkXvWshmubOfUulpR7dsiwBCMqD2UW43RYvXgPLh7rFKzweOTLshx8+hOVDhflIXG4hQJoawOBr4cLVy5dvLioSNhvbrVvQhg3ziYMbCRCmTBYFpKkEfPevWLEFloMOHUpW/h0Hefbv3zEubpFtLz0b/Xyys6/Hxu5au3bnwYNKSX322ZtTp44yWorCRCtEkGGNYV7199//hHnChIQT5jVSdnaVExNXBwT4G2auMKVcANL6cPduUVzcfoAFm2NKS+9p0ykHQ4b0Xrt2LkUpS1O+AGmrDt960EhBHyopKU2bKH0Ag7i8vJ3CO0eqTHdIe0KxwnQ99AZPnlz988+fEr+h4N5UY2qpnALSQIQvJhiC7d37A3Fhhx5ulA9Jo1EFkNyZIOnqQtM7fvxgaY3GevlyDkUmS6MKoEOHTjduDMOlHzIycmXVxpQ4MLCpKZNuekFBke6pkGNVAJ0+nQ6d5unTv27YMLxHj3HffBMLU6tKqpuenqnkciXXqgTo/1dWoaMMy9ATJsypV69ft26vLVq0xoylnq1bD8AAheIksTmnZKXV2GmPBB4YLqtCqwQr6/A3adLnsOm5U6fWcNfApoMGDTxhcRk2Q8Os2OPHT+CbqLi4FMYiMKsNtyc0uvDFBKvyxLr5+XkRlXSZKv2gGjW6K5lspddeTwkL2bCcrZeo8FT8LXb1ap5V6MCkB4zsFeIwvFw8IO2+C8PCVE0JD+9arZqL8CLUAITv6xDuBmQ4dWqUGtmKB2TYQqtRb708Ye41OLiFXqKQU/GALH+LQVcbdmEJwWGYiWBA8D2t3gMZhrWHFOgr7NjxleEeNaNiMxIFA4Jpitatm5hRD/Muga15hw79VKdOTfMup1ylSj8Itmp8++26337bCb0+SiXM0Hh714Fp1pdf7m/GtbIuUQWQpgYQTb/9tgPm5/ftOylkcl6TLXTBYX111KhwVVd7tBBVBKQtIzPzGvxm8ebNew8cSDJvyhk2MgAXWJuHXR/Ekb22dIUHlgCkrSLE1MmT50+dSjt79iJ0uGFVAx7pKCmBp1Qf3L//AChAUMCgDMactWpV9/Ss2bBhPX//BgEBjYODm8N4TZuPJQ8sCsiSjokqS/C3mKhqlZ98GBDyWTAgBoQQQMwcQQwIIYCYOYIYEEIAMXMEMSCEAGLmCGJACAHEzBHEgBACiJkjiAEhBBAzRxADQggg5v8D/Y6g1f2etLsAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } 
], "source": [ "# einops allows seamlessly composing batch and height to a new height dimension\n", "# We just rendered all images by collapsing to 3d tensor!\n", "rearrange(ims, \"b h w c -> (b h) w c\")" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAgNElEQVR4Ae1dB1RUR/c30ruCSi8iiA1E0SjYMCqIBrBij+WLGrufWJIYNcYWNUaMGJOoxEKMBSUqghosELEiAipFsNCLighSVND/zd/vcHCX3b27zNt6PXs878385s69v3ns3Tdz585H7/Lzm9A/wQwUmAuuoxpggAgS+hgQPULpadI1QXi9ytcSQUIfgaZCa6mSGCAGiAFigBiQUwbU5VQv1VYrIzXj4O6D/5z/Jzcrt7qq2qSliXNX58H+g0dOHKmhqaHa3JD1xAAxQAz8j4GPaApR+LMg5Smg169er/rvqgO/Hnj79i2/YpY2lkF7g3r178VfJbMSKRMkMzsl7JjoEU4czZAJ54fmWIXzQ1OIwvmRai28bI36ZNS+nfsa9F6gSl523thBY4/sOyJVtagzYoAYIAbkkgFyYHI0LIGfB8ZfiReuUG1t7ZLpS27G3RQOo1pigBggBpSeAXJg8jLEiTcTww+GY7R58+YNTDNikIQhBogBYkCJGSAHJi+D+8euP/CqgLe7e/suHk9IYoAYIAaUjwFyYPIypnEX4sRSJTY6Viw8gYkBYoAYUDIGyIHJy4BCgIZYquQ+zhULT2BigBggBpSMAXJgcjGglRWVsLIlliqlz0vFwhOYGCAGiAElY4AcmFwMqI6ujrq6eJvKDY0M5UJ1UoIYIAaIARkxQA5MRsR/2O1HH31kbiVe1kUrW6sPZdAdMUAMEAOqxQA5MHkZ794DeoulSp+BfcTCE5gYIAaIASVjgByYvAzo+M/H41WB1Igubi54PCGJAWKAGFA+BsiBycuYuvV08xvjh9EGVstWb10Ns44YMGGIAWKAGFBWBsiBydHIbg3Z2rVHV+EKNW3adNOvm3r27SkcRrXEADFADCg9A+TA5GiIIRYx7GLYhOkTwEs1qBYEevx59s+x08Y2WEuFxAAxQAyoFAN0nIqI4ZbJcRj3U+5DZqnLFy7nZeVVVVX9ex5YF2dvf+9Rk0ZpammK0FjK1TIhSMo2NqI7okc4eXScinB+6DgV4fyQAxPOTxP6AiKCRDAgtJqeH6H00PezcHqaEEHCCWp4qkp4G6olBogBYoAYIAZkzgA5MJkPASlADBADxAAxIAkD5MAkYY3aEAPEADFADMicAXJgMh8CUoAYIAaIAWJAEgbESyArSQ+C27x6/fp6QkLstWvJqakZDx/mFxW9rKiAQh1tbV0dnRbGxq1tbOxtbLq7unp069bGzk6wJKpRCQaKivKzsx/l5mbl5WXD//B59uxJVVVldTWEav77P3zevXuno6Orra3TokUrKytbeIJcXbu7ubnb2bVRCY7qGZn36NGtS5dS4uOzMzJyHzwoLy2trqh4+/atjp6eroGBmY2NjaOjo4uLW79+bV1dBe3cqCdPsS+rXlXdz7qfVZCVW5ybW5Sb9yQP/n/y/AmUw6eyurKquup1zWsdLXh6dI0NjW3NbeHj6uTq7uLu7OCsribLr0qZUJ+ZmXPt2p2UlIdpaY8LCp4WFZWUlpZXV7969eqNhoa6np62nh48Sjr6+rrAVJs2Vu8/nTo5mJoaS01hGUQhwlfM2UuXQo8dO3H2LHgspKmOrVsH+PlNDgiAC2QTJjBWUWQFuQVu1m5MVKoTcurqKcjfUXcrm
wtWBPFpD1+19++nJCXdTE5OSElJSk29U1ZWyofCFjg4tBs2bNy4cdPMzaWaBJkzegQanv/4ceSBA5GhoVn37wsEfVhhZGIyKCDAd/LkTj16fFjD+R13YfTlleVXk69ev3v9dtrtpPtJjwsewxMlmT1G+kb+nv4BgwJ8PHyk7em5I6ghLuDLOSYm4fDhcxER/4CfbwgiuszOzqJv367e3u4+Ph7Nm3N7aIZUHRg8QAfCwjbu2JGakSGahoYQ8PSMHDp01aJFHZ2cGqpnX8bqC4gcGHJswEvt27fzypVLt25dffmyHNkKCdPQ0AQftnjxang/QzZpJIzV84NR42FKyp51684dPvy2thaD58e49u49Y+XKHoMG8VdxVML8+7m0vDT4cPDpy6dv3rtZ+1ZCHgQZ62DtsHjS4mn+0zTUNQRhGJczJ0iAfm/e1ISEnNi69Y/09CwBELGLNTU1hgzp9cUXo7y8enKU+k56DgxmC+d8/fWt5GSxaeBroKGhEThz5qrAQG0tLb5KxgWsvoDIgSEHJikp3senOxIsGax5c5P164P9/aWR0ITV8yPc0qqKil9Wrjy4bZvErqu+/H5+fsuCg02tresXcnTN/Ps5PiW++yRun5+ObTr+/OXPfbv25YiTD8QyJ+gD6f+7iY6+PnfuRoaui6eTkJBVU6eiEr3yNBR5K40gDnjxWhsU1Mvfn4n3ApPg8OLvg4P7DBuWW1Ag0kICEAM8DDx//mzWrHGbNq3gKVfQ27SEhDHOzqE//sjEewEJMSdPjurYMTosTEEJ4Vrtew/uec7wXL5jucRzklxriJcPJnz55XYvrznceS9QxtHRBq+SWEjOHVhlVRXMra/YtKlW0mkNQfbEJyV19/FJf/BAEIDKiQEhDAQFrd2w4WshAIWoOnPw4NRevSBeg622leXly0aP3rF8OVuxSiMN1orWh6z3XegLASCKa1RNTe348cs3btwL5nBqRadOXIVQcevAnr94MTAgIPL8eY7YKSwuHjB69KPsbI7kk1jlZmD79g1Hj+5XXBv/2rNnxaRJr6urOTIhZP36TfPmcSRcCcRGxkX6/ddPcX3YzJnrIF6D64GwsjJt1syAo144dGAQb+k3efLVW7c4Uv292LzCQv+pUyGAmtNeSLiyMvDNN/MKCnIV0bqzhw6tnT6d61msw8HB25YuVUR+pKNz9PXoid9MlE5fbHvZtSscojbYymxQGnevX9Adhw5s0rx5l2/caNAktoV3UlMXrlzJViZJUxEGysvLVq9erHDGwu6u1dOmcT3z856W/Zs3R+xX4PdUrgf3+IXj3+/9nute2MovLi4JDPyRrUxB0hTSge34/fewiAhBJjEv/y00NO7mTeZiSaAqMHDq1JHU1GQFsrTy5culo0bBBlyp6bxu5sys9HSpdadwHX2z45vb6bcVSO2NG/eVl1dKR2HY2sxdR5y8gaVlZi5Zs4Y7pRuUPG/5cun8IG2wdypUXAbgsQkJCVYg/YMWLy7IYrZZB2M4LLPBCx/X05UYTeQTAxvOZqydwXzbGUfGvnr1+vffT3IknF+s4r2BLVixQvqLUrfv3j0dHc1PH5UQAyIZOHHiECSjEgmTB0DqrVvHf/tN+pokXblymiYSBfMO+89CI0MF18tRzYULN58/L5OOQpB6okMHe+76UmcuOuLvv8/FxODFGhoYjBwyZCDkHnF2NmnevJmh4Yvy8qclJXfT0iB8MTwqqrQMy/XmnTs/lWISAbyNgDS3Ms9/l49pUvK0pFPLThikimO++y6oTRsnMzMLY+OWOv/+04VXhKdPi3NyHkdHRxw/frCwMA9JEaT8gNwfAwYMQeJlCNv+5ZdizTRo6ej0+fRTrzFj7Nq1a2VpqaGp+SQ/vygnJ/bUKcjZAdd4W35bvXrw+PEgAd9EnpFbA7c62jiaGpsaGxnr/5vST19TQ7OsoiwjOyM2IRa8UXJGslj6Q2D9xCET1ZqqidVK+uBLl8QIrDM2Nhw71
rtPny6wlwviCSEFoo6ONvyhVVe/Li+vyM9/kptbfOdOZmJiemxswtOnpTzm2Ntb6uhwmG6CfSaOboMHIzcsq6mpLZk1a+mcOc2NjHjMrruFQPx127Zt/e035PRFxpUrDkzT/konk0KdvXCBd2BKmQsRn4kjMbGgVSuz+tTVv4bojOXL54aFHahfKOT6P/+Zv2bNNiEAyarYPj/JV69O9fDAa+Lu7b0qJKSlhUWDTWprasAnhWzYgN8BvXLPHv9p0xqUJlkh80QT+EwcBecKzEwEPj9gzrHzx2Z/P7u4pBhv2vEfjg/vPxyPF41kTlCTJkOGzI+KihPddZMmkD7jp5+WgG/HgOF3VUJC2okTl/766xK4tPdNhg3zDA/fgmkuGaapZM0Etbpw+TLSexno658+cGDD118L8V7QC9T+sHJl2K5dyKxRh/76S5BuVK5SDBgYGAYF7e3bdxDS6jt3EpBIGcIgqB3f+6TFi4PPnBHkvUCOmrr6rDVrgk6daqqGfWkI27kTr4CiI0cOGBkfGg/5D/GGhJwIwYNlhczLQ7lkP79+kAIK6b3AFsh26ObW/rvvZiUnH87MPLFly39hWs3V1YlTMxk7sO0hqPEDU0ODg709PZG2Dffx2bFhAwYMU44YGGFUgQGYf9+48RdkFlHIfC/nnLx49uw8Or3TgJEjF2zahLGol4/Pgo0bMUjAQPh++m1FCrdD2iUIZm1qfXbHWchGLwjAUx4VFwVHtPAUytttRUUVRqXx4wdjYA1i4GiVRYsmxsTsWrVqRoMAVoUsHdiLsrKoCxcwmk2fMMHPywuDrMNMGzvWq1+/ultBF4n37oEagmqpXNUYsLW179XrE4zVpaUlsH6GQcoKc+nEiTevX2N6NzQ2hrk+pOcGgRMDA9t17YqRDJgLx48jkcoBs7e037xwM9IWCEQ8d+0cEiwrmJYWahUTzgOTlYb4flk6sL/OnIHjKEX2raWp+W1goEgYP2DJ7Nn8hTwlsFQmnd3TPP3SrdwyMHDgUKRuGRmpSKRMYLC2gOz3P8uX6wteV25QyHR0KgCI/mhQghIXTvWb2ta2LdJA+XdgyMROW7aEPnqEDYNCksMcxtKBIYMPhwwYYG5qKoElnu7u+np6IhsiF+FEyiGAcjAAxzEjDcnKeoBESh8Gv8ziL17E9Kurrz961iwMsj7G098fzmiuXyLo+n5SUunTp4JqlbIcjmNePm050rS4RFR8BFIaF7B27ewwYiHU3t19yrlz1zBgWWFYOrAr8fEYM2BBCwPjx6irq0OoPX85T0kK+iBanoZ0q5QMdOzYGWkXBC4ikdKHPbx3DxJwYPrt6+sLofMYJA8GfBhPiaBbWAkTVKWs5SM+GaGtqY2x7mHeQzgMGoOUFaZHj07IrouKSry95wwbFnj9+l1kEynDmDmwoidPHueg5ky7dcZ+ofBzYdqyJX8hT4nExz3zyKFb5WBAW1sHTrDE2FJRgfIQGFHMMffQmdI+GTlSst5huxiyoQo6MNgl5tML9csbosnvZsrp1/378YXwQohvQo41wCAyvmfPyd26Tfzll7Bnz17gG0oBqc6qD0gfhRTVARGLgRTVIAzy0zdYToUqy4CZmSUcYinSfNjOLBIjK0B2Rgay6w7duiGRPLC2rq48JYJuc9B/7IIkKGJ5/279wy+GYzTPKshyd3HHIGWCMTMzGTPG688/z4jV+61bkAQmdf78zQMGfDx8eH9/f09TU2OxJHABFsMPC+8e+folXAiT2pLSUjiymYkoEqIcDBgbo97AKivl9w2sEJf8EGI3zG1tJRs141atmrVogWlbqJIn8Dk7il6/eM9e3hN5j31Yv34OMpSD53l486bmzJkrcJCYpaW3l9fsvXtPlZVV8GCkeauEDgxe4Z+UlEiTROpLzhnQ1NTCaCjP6RCLcnMxJti2xQbLNSjN2sGhwXKewmKcMjytFP3WxdEFaULhU3mfBLKzs9i3bzV+owW/4bW1b//++/rUq
d+amg4cNWrp8eMXIEcwP4zrEmYODHI+ca0rXn5FZSUeTEilZ0BLC7X8LlaOQSmThozg0G/WrDGK6RkaYppXVcjyRzdGQy4wxobGGuoaGMkVVQrAD6yErVw5HWOOcAwkRTx27PzIkUusrYesWvULxH0Ix7OtZebAKqV4OpFICqrpgGaRHKkSAPkGJs+UVON+k+njPJAgS/UMDARV1S9HKlO/iXJcG+ii+Kl6hUp1IXNOvv125vbtS9XU2HiBJ0+ef/fdLlvboXPnbpRetntWJEr//BQhmmP2UwtpTlXEgLwxUINb1pUsgL7OWGRzZEKQOrFKc2FkYISx5fUbGUymYRTjx8ydOyYiYpu5OWrtk785fwlMJO7YccTJaYR0jhxj43v5zaASYoAYYMiAljZqFrSRxzQjJyqRfo6h+XIiCjnJDMeyyInCGDUGD/ZITw+H1IUaGuoYPAYDb2PTpq3+7LOVr19zG0/HzIHpSrR3EsOFBBhNZTmySALbqYlSMoD0GUgPJIiiClweUTgSSpAE5S5/iYtT1cIFDckPVwYGupA8PiUlbM6cAHz6eZH6Hzhw2strTmkph7tTmDkwHdwvRJE2MwFoaqDWWpn0RUKIASkwgMxtWF5a2hhlkM2RyjRGE/lsi4zO0NPRk0/9hWvl4GAdHLwsJydy27YlHh6dGxOjWNdRTMytceO+Rr651rXCXzBzYC2MjfG9co3U09Xlugvu5DN5brhTjyTLhAFTKytMv/j9zvzS4FsmKz2dv5y/xNTamr9Q6UsgQRQyOqNV81aKywbsD5s/f2xcXAh4sqCgxb16NdaTwb6xTZv2cUQIMwdmi/sD48gMHrHGjQsm5pEm5Vs4qFrKPVJ38s8AMtNuWUlJSVGRZObkPniAjI9HKiOZGnLbKv0xyruD/q2MFdiB1fFvadlqwYJxly+H5OZGwZvZwIE9JF4k++abn+vOaK6Tz+SC2aqdHfpHWWFSEialIRPzFFGIOnoptbamVhENJJ0lYKB1+/bIVqkJCXBGJRJcH3bvxo36t0Ku7dq1E1KrrFX3s+8jTbM1lzAZClK+lGEWFi1hbQw+L168jIy8HB5+MSLin6qqV3g1ampqIcL+6NGN+CZIJLM3sI5OTsguH6lkHhokOQDT0kaljQBkxUsF2C+JN5yQQhho7+YmpLZ+1T8REfVv8dfR6OOeO3bvjherNMjYhFikLU622C9DpEA5gRkZ6Y8bN/jIkY1FRdG7dq3o3LktXjHICMzF5jBmDgxm7dra22PsOXvpEgamshiYQtQ30MeY/7T4KQZGGCVgwKZtW2ToBBw4KcGaeWV5eVxUFIYoNXV1py5dMEglw0RejsRYBJudLVtZYpCKi4Goxc8/H3b79sE//liHzKkISRS5OFqMmQODwfDA/S77ed++ctzJRoo7wI3UvJlxM4yE9LvpGBhhlIABCO3pOWgQxpCinBz82c11Ag8GBb3G5a9x7d1bB3GubJ1k5bi4knQlpygHY0v3jqryegrP5Pjxg2Njd+vooCaNkpKwc7AYnt9jWDowX9wfWPHTp18sWybBj0S8VYqONLUwxZhw5dIVDIwwysFAH19fpCG71qxBIt/D4ITl/T/8gGwCB2YikcoEW7dnHdIcj84eSKSsYHC6N8OunZ0dZs4ciRH46FE+BiYWhqUDGzJggIE+au7rYHj454GBdOiJoKGysrUSVFW/PPFmYmZaZv0SulZiBsBzaOJ2W6bfvr1/82YkFfB1tmrKFOQWZjgIcYCkB2Yi9ZFDWPT16Mg41PwhKA/HhsmhCfVVunr1jqPjsDVrdmdlFdQvl/ja1dUJ07as7CUGJhaGpQPT1tIKQP86Czl0qLuPT+y1a2KpywOura09FxMzecGC1Vu28FQp9K29oz1S/9WBq5FIgik6A4bNmw8aPRppxfavvsKsacFEyJaFCy+fPo0U6z54sMTnjSG7kDfY4/zHY78ai9TKSN+oT5c+SLCsYHfuZGRm5qxcubN1a
9/+/WfAOcuNTCGfkZEtK1tYOjCwIfCLL/D7cJNSUvqNGNFn2LDwqCixktm/KCs7dvr0lIULTV1cvMeN23/0aMKdO7JikIt+2zljw5TPR57/avZXIt9la2pqYs7FLJmxZLLvZC4UJpnSYWDM3LnIjt7W1i709d25YkVtTY2gJnDG2OxBgw5t3y4IwF8eMHs2f6GClszfNP/ug7vClT979az7FPdnL54Jh9XVftrnU+SRK3VNpH9RtyULfr5cunRr1qwNFhbe/fpN/+mnQ+DYxNUnKiouKOggphUy3AMjqg7DbB/Ye4ntHR2H+/gcj8S+bkOryzduwAfe3j7p3ftjV9cObds6OTiYNG+ur6urr6cHSe5Ly8pKX7x49vz53bS0+OTk+KSk1IwMePeqswEu0h88qH+r6NddPhYjymvfzn3gnCbPntxnQB8LawsDI4PqquryF+V5OXm5j3NTklMSriXAZOP7mHsDQwNFJ0eV9e/48ce9hgyJw/19gQ/bvXbtiZAQrzFjYPoRdh+3tLCARPLFeXmPUlPPHT4M8YrIwI33nHfo1q330KFKw//R6KPw6WDfYWjvoV3bdXV2cDY1MYVXKMgXlf8k/2bKzdDI0PM3zou1Wj/Nf5r885OcnMGjJEwjx8IugdiEBQs229qaQ/YNmBXs0MHe2toUNoHp6UGiQM23b99BJGFFRRXkNiwsfAbTjxCUAYGFiYnYUDJ7e9TKCI9uwm8/epfPeGENtnl19PSU8ukq6urqVQ8fwv/CrZWgtsBcgkYMmri3cc96mMVAEJ+IhNwEM0szvmJJC1gTlJQU7+ODiuNKTCxo1QplyBdfjD158rBICwMCJgcF7RUJEwvAmp4m6YmJE93c2K7DIy0KPnPG3dsbCUbCuiYggVhYfEp890mo5wcrEY1ztHFMP56On4JCCWZOUJMmzZt7cppgV5BdJ078CKdoCqqVrJzxFCIo0drGZvmCBZJpI3ErmCJ7qFz7o738vCRmQ3jD+ynsg1mF90i1DBlwcnUdM28eQ4FIUXBuPHPvhexaUWBfTvmSsffiwPKcnCKZeC9NTY1+/dyYG8TegYGKX86dC/OBzHUVLlDJZhFHf4ZdrhdOC38tOTB+ThSrZM66dVZt2khTZ0Nj42XBwdLsUeH6crB2+OzTz+Rf7boFMCmr6uvbFxJ5MO+UEwcGuST+/PlnG0upbkdPz8xkzo4MBXbq0qln355cKJCRyjsDzkUvJJM7BmAf8eZjx6S2m7ipmtqGP/80NkXtTeTOajmXHLwsWF2N/RIGc6shBJG5TIzAZcs4CR/jxIGBPa1atLhw9KilGWqJAmO/SIySvYGBvYtXLxZptQQAegOTgDR5a9K2c+fv9u+HXVlSUGzBpk09vbia0JaC/lLoYtLQSd7ujFcHOVKbP4KDo47qi5061a979471S1hdc/gH0MbO7uKxYw52dqx0FS5H+RyYh6fHqEmjhFstQS05MAlIk8Mmn4wYIQUfNvPbbycuWiSH5suPShC+uPOrnfKjj3BNpD+FCKk6fvppiXCtJK7l0IGBTo6tW9+IivL29JRYP3xD5XNgYPva7Wsd2jngScAgnz97/uzJMwySMHLOgM+ECd8fOcLRXCK83s3fuHHGqlVyToJs1bMxszm59aSiHMEMcfDp6ZzENgsaBYjFP3fuZ319XUGARpZz68BAueZGRlF//LFj/XpDA253IEGKRdgx1kg65K25oZHhgdMHYHcXW8XoJYwtnzKUBomdfr9yxdqB8a8cyPqx9eTJyUuXytA0+e+6jVWbmF0xdhZ28q/qew3LyyvFOgOlkXYNG+Z59epeMzOTRsoR0pxzBwZ9Q2jp7ClTUmJipk+YoKGhIUSbRlYpWRzHezZs7W0jrka4uLk0kpy65hqaGnSQWB0bSnDh6OJyKCkJJvog2oKJOTA5GZaSokx7lpnQwiPEt69vfGi8Ankv0N/Y2PDGjf3Xru2DRSnYnsxjEcNbGxuzgwfXhYdvM
TTUYyiWX5Q0HNj7XiGg47fNmzPi4hbNnAkhHvyqSFzS0sTk8/Hjzxw86ObC7FteYmW4aAj7jk9dPQUxHTq6jXrm4E1u0cpFN7NuDhw6kAs9SaasGNDW1f3vli2Hk5MHBQQ0ZisSHJXy64ULEOJoIsXwK1mRNqjHIMniBs1bmB9YcwBmDpsZNJOV8o3pt0ePTiEhqwoKzu3evaJv366NeWD41YAUHr/+ujwj4y84+pK/lnkJ+0wcGBVh3/GZixcjoqMhFa9kBzRraWq6d+vW38MDNpy5u7lB4D6mXwkwzDMpSKBDXRM4wXL3tt3HQo/lZefVFYq8sLazBo/lM8IHokLYx62xJogycYgcUOGAnMzMk7//HhkaWoje2v9vmuCAAL+pUzv16CFcOPNa5okm8Jk4Cs4VQEKTPSf2hEWHJWckY0yDvFNzAuZM/nSy9Ba9mBPEZ2d2diHk1IqIiI2LS4JFMr560QVqak3Bbw0d2nv48P7IzPSiheIQsnFg9XXLyc9PvHcv6d69+w8f5hUW5hUUPH/xAjJRwQeykOlBRsT/T4oIB7XAxrJ2DhDT8O/HpUMHSJ9YXw5H16y/nxmoCbQk30qGDJJJ8UmQbio/J/9l2cuqyir4JaVvqA/ZDlu0amHf1r6NUxtgCtIqMl9C+8AGOSToA/1kfCNDeh6npd2KiUlLSMjOyMh/9Ogl/FlVVMBXNgR96OrrQ2pEOOXZ0dm5a79+7bp0YTX9KC7dzL+fxXJgZib/2+cDh1X+fe3vhLQE8GTZhdml5aUvK19qa2nD8cpWplbtW7d3a+822GOwk62TuAY2Fs+cIMEKwQrZ7dtpiYn37917AAk78vKKi4tLKiurX72CDJpvwEtBNg1Iigg5eVu0aGZqatK6tYWDg7Wzs2P37h04nZAUrHIT2TswIcrJQ5UMv4DkwXzROhBBQjkieoTS04T597NkDky4krKsZU6QLI1h37f01sDY604SiQFigBggBlSYAXJgKjz4ZDoxQAwQA4rMADkwRR490p0YIAaIARVmgByYCg8+mU4MEAPEgCIzQA5MkUePdCcGiAFiQIUZIAemwoNPphMDxAAxoMgMkANT5NEj3YkBYoAYUGEGyIGp8OCT6cQAMUAMKDID5MAUefRId2KAGCAGVJgBcmAqPPhkOjFADBADiswAOTBFHj3SnRggBogBFWaAHJgKDz6ZTgwQA8SAIjNADkyRR490JwaIAWJAhRkgB6bCg0+mEwPEADGgyAyQA1Pk0SPdiQFigBhQYQbIganw4JPpxAAxQAwoMgPqiqw86U4MEAPEwAcMdOvQ7d2tdx8U0Y3yMkBvYMo7tmQZMUAMEANKzcD/AX05csVXPtckAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# or compose a new dimension of batch and width\n", "rearrange(ims, \"b h w c -> h (b w) c\")" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "(96, 576, 3)" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# resulting dimensions are computed very simply\n", "# length of newly composed axis is a product of components\n", "# [6, 96, 96, 3] -> [96, (6 * 96), 3]\n", 
"rearrange(ims, \"b h w c -> h (b w) c\").shape" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "(165888,)" ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# we can compose more than two axes.\n", "# let's flatten a 4d array into 1d; the resulting array has as many elements as the original\n", "rearrange(ims, \"b h w c -> (b h w c)\").shape" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Decomposition of axis" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "(2, 3, 96, 96, 3)" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# decomposition is the inverse process: represent an axis as a combination of new axes\n", "# several decompositions are possible, so we pass b1=2 to decompose 6 into b1=2 and b2=3\n", "rearrange(ims, \"(b1 b2) h w c -> b1 b2 h w c\", b1=2).shape" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "image/png":
"iVBORw0KGgoAAAANSUhEUgAAASAAAADACAIAAAAr0inhAAAglklEQVR4Ae1dB1RVx9OPhSqgYgGlKYIdRJG/gr2haEBjwRINaqLGEjX2BMWoUaPGiBE10UiiGGLBjh2NGMFCEVBBwEKvighSxPZNDjl8+uBd5t279zXGw/G8uzs7O/N79/fu3t3Z2VrvMjI+on/SEchsJr2OagABAojzNqjNWUuVhAAhIAiBuoJaU2NxEEiMS/T/zf+fS/+kJaeVlpQ2atLIpovNkOFDRk0cpaGpIU6fpFUUBGrREJEbVzmPgMpelq38eqXfr35v376tbJiJuYn3H949+vWoXKWwEjkDpDA/eXZMQ0SewInRDB5Wo/uP3rtzb5Xsgh7TU9LHDRp3aO8hMXonnWIgQAQTA1WeOhd+sTA8NJy78Zs3bxZPWxwWEsYtRrVKggARTEm+iI+iwqKO+R/DWPPq1SsYRmIkSUbhCBDBFP4V/GfAn7v/xJsCbLx7+y5eniQVhQARTFHIS/YbcjlEsojz+mrQVc56qlQKBIhgSvE1gBEwgSGTKWlJaTLJk7BCECCCKQR2yU6Li4rhzUqylPM6/1k+Zz1VKgUCRDCl+Bp0dHXq1pVt0d+gvoFSmE5GcCJABOOER16VtWrVamYqW9SjqYWpvKyjfvgjQATjjx3blj0H9JRJYa+BvWSSJ2GFIEAEUwjsVXQ64YsJVZRKKYLQRFt7WymVVKxECBDBlOXLsO9u7zbWDWMNvK2t2rIKRpUYYZJRLAJEMMXi/0HvW3y3dOnW5YOiShe1a9fe+OvG7r27V6qhAmVEgAimRN8KzCUG/B3w6bRPgUVVmgUTIX+d/2vc1HFV1lKhEiJA21Wq+VIUshsjITYBIqeuXb6WnpxeUlLy736wzjaDhw8ePWm0ppZmNRbLuVohAMnZRwHdEcGqAY/uHwKoGgQ4q6seinA2oUpCgBDAIkAEwyJFcoQADwSIYDxAoyaEABYBIhgWKZIjBHggIFuAKY8OOJq8LCu7GRl59caNmLi4xEePMrKzXxQVQaGOtraujk5jQ8OW5uaW5uYOdnZOXbu2atGCQxVV1QQEsrMzUlIep6Ulp6enwP/w9/RpbklJcWkpTLX++z/8vXv3TkdHV1tbp3HjpqamFnAH2dk52Ns7tmjRSiEQKWAWESA4f+XK/iNHTpw/D4xCum3dsqW7m5uHuzt8QDZhIsZqFjEzLdPezJ6JSRVKTl0/BfEfFZeK+cAKoErWQ+afhITY6OiwmJjI2NjouLg7BQX5laSwBVZWbUeMGD9+/NRmzeQaJC1XggFkfgEBG7Zvj0tMxALzoRyswI4aNmzlggUd2rT5sEasK1b3DxEM+Q0Bi/bu3RkaeiUi4vqLF4XIVkgxDQ1N4NiiRavg+YZsIlBMfu9gMBr839Chk+fP580ucBUoevjUqc7Ozt+sW1f68qVA56m5EiLw+PGD9eu/DQ6+wJxd4OyrV2X79v3Sp0/7EycOyMd3eRAMWPG9t3eP4cMjYmKYeAWbf3/w8ek1YkRaZiYThaSkRiHw7NnTmTPHb9y4Qg5ei06w4pISVw+PFRs3QkI/tv6ER0c7uLjEP3zIVi1pqyEIeHt/D49KsZ0Vl2DPnj8f6O5+5tIlkdzIyskZMGbM45QUkfSTWvVGYNu29YcP7xPVRxEJBu9Ibh4e1yMiRHUgPStr+JQpMEErai+kXF0RWL78q8xMEfNziUiwSV99de3WLTl8MXfi4uZ7ecmhI+pC/RAoLCxYtWqReH6JRbDtv/8eEBgont0Smnft3x8SRunaJVChSxQCp04diouLQYnKLiQKwe4/eLB4zRrZjRHU4itPT1jCFqSCGtdIBOC28fX1Ecl1UQg2b8UK+b8U3b5793RQkEgwkVr1RgCWxSDYSgwf6zJXGnjx4oXgYLxaA339UUOHDuzdu4uNTaOGDRsYGDwvLHySl3f3/n2Yfjx29mx+QQFS26adOz8eNAgpLGcx2O2f8Q5
u7My7eU1oOGnYpMGOjN9ORcItJkYBBINzxhwcOojhkVgEA1v7jxwpB47N+O67iQsWiAGN2uiE6ced3+xUFXfkP0SEUA84xU8kfEQkGFjs8umnPxw6JNJYER6PczdsmL5ypUjQqIdac2Pzk1tOqsoWZphnj4+Xa9ZhmOu/cGEHwzQEEreNuASDziBw6ffQUGTCUAnjOC4hamTLyZMeS5ZwyFBVK9NWwbuDWzRvoSpQFBYWy7THRKBfcH7s9et/QKZugXo4motOMOjb2tb2QHQ0DOTwJ8FyWAxVMPgMiI3tOWwYt1gNr3Xt7QoHzKoQu+D7ghPNb93ad+PGXngpguVj8b5Bc3Nj2JoJpzMbGNQTrxfQLA+CQTfaurpfb958MCZmkLu7kKUY2Iry6+XLMEXZyNhYVFyUQfmgboP4zfs1a9zMb40fjAwb6DdQBkdktQGOL4LDlzMzL/z22wo4RlnIDVO5awgB+fVXz8TE47A1s3It8xL2kRzVmghHn5z8/fcz+/fjj2j4N4zY3d1typSO3bpVq5+tAPNABXwkR+aFTAiI2XNiT0BQQExiDMav9pbtZ7vP9vjYQ34vXcwBquRnSkrW4cNBgYFXQ0Ki4SWtUn31BZAbGHg1bFjPTz7ph4ysr14pTkIBBKswLOn+/Yjg4PuRkZBqKuPx4xfPn0NOG7ilYFIETsGE0ETYJW1tY9OlT5+2nTuzGl5W9I78wPz+kYlgxo3+e1DDZsqLNy5G3o8EpqVkpeQX5kMKRG0tbdiebGpk2q5lO/t29kOchiggFwBzgKR/MfCGdvv2/aiohHv3HkLAR3p6Tk5OXnFx6cuXEMH6ClgE0RgQlAgxu40bNzAyatSyZXPI9GZjY+3g0F7UAad0kz9SJME4zFKeKub3Dz+CKQ8gkpYwB0iyA9W+ltM7mGqDRNYTAnwRIILxRY7aEQIIBIhgCJBIhBDgiwARjC9y1I4QQCBABEOARCKEAF8EiGB8kaN2hAACASIYAiQSIQT4IkAE44sctSMEEAgQwRAgkQghwBcBIhhf5KgdIYBAgAiGAIlECAG+CBDB+CJH7QgBBAJEMARIJEII8EWACMYXOWpHCCAQIIIhQCIRQoAvAkQwvshRO0IAgYAKpHpFeKFKIl3bd30X8U6VLCZbBSBATzAB4FFTQqA6BP4P8QdhNT1qmtsAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# finally, combine composition and decomposition:\n", "rearrange(ims, \"(b1 b2) h w c -> (b1 h) (b2 w) c \", b1=2)" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": 
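The cell above ("# finally, combine composition and decomposition") both splits the batch axis and merges the pieces into the spatial axes. As a plain-NumPy sketch of what `(b1 b2) h w c -> (b1 h) (b2 w) c` does under the hood (a random array stands in for the notebook's `ims` batch, an assumption made here so the example is self-contained):

```python
import numpy as np

# dummy stand-in for the notebook's batch of 6 RGB images, 96x96 each
b1, b2, h, w, c = 2, 3, 96, 96, 3
ims = np.random.rand(b1 * b2, h, w, c)

# equivalent of: rearrange(ims, "(b1 b2) h w c -> (b1 h) (b2 w) c", b1=2)
x = ims.reshape(b1, b2, h, w, c)     # decompose the batch axis into (b1, b2)
x = x.transpose(0, 2, 1, 3, 4)       # reorder to (b1, h, b2, w, c)
grid = x.reshape(b1 * h, b2 * w, c)  # compose (b1 h) and (b2 w)
print(grid.shape)  # (192, 288, 3)
```

Each `b1` index tiles downward and each `b2` index tiles across, which is why the images stack into a 2x3 grid.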
"iVBORw0KGgoAAAANSUhEUgAAAMAAAAEgCAIAAABtjetmAAAhmElEQVR4Ae1dCXhN19q+JbMkiCGJjDKgKoRQEvMYQUINQUuDW1VDyxWqiigtSq9fVFRLpSWaKkERc6goMUVIkEhiyjwZIpFBDP2//vnvuXlOzvCdvdfeZy1Zfc6je6/9rm+96/ve7L32mvYbf+Xm/oP/p94DCba26i/yK/+ox33APSDGA1xAYrzH8/I7ENeAOA/wO5A4/9X53FxAdV4C4hzABSTOf3U+NxdQnZeAOAdwAYnzX53PzQVU5yUgzgFcQOL8V+dzcwHVeQmIcwAXkDj/1fncXEB1XgLiHMAFJM5/dT43F1Cdl4A4B3ABifNfnc/NBVTnJSDOAVxA4vxX53NzAdV5CYhzABeQOP/V+dxcQHVeAuIcwAUkzn91PjcXUJ2XgDgHGIjLLir3s6qqiwkJZy5cSEpJSb97N7eg4GlZGSSampiYmZo2tbJq6ejo4ujYxdPTp3NnV2dnUYUxmDnn3r0rp08nx8dnpqdn37lTWlxcWVb26tUr0wYNzCwsbBwdHd3d3du39+rdu5WnZ716+rkXvCH/wsK//vrr2OnTO/bs2X/sGCgGGVn3li0DAwKCAgPhAJmFCEz+hYW59+8fjog4vGNHRloasgoNmzQZGBjoHxTUrmtXZBZSMFkFBH89EVFRqzduTElPF1YB+DsbNXTo0rlz32rdWpgFXXPJKaC7yclbV6w4/ttvr16+1JVnNd6zR48PQ0K6DhwoLLuAXPIJCJ5WMz///EpSkgCWSlkMDQ2Dp01bGhxsYmysdIn4qTwCqigr+z4kJHL9esHSqVnx3gEBC8LCrB0caiZKdCyHgODGs/Lbb79Yu/al0D8slZXv3KHDvvBwe4nXrssgoFsJCZ+OHg0tHpXVFJYIjaSl4eEDRo8Wlh2fS/KWV3lFBTybl6xZQ1Y9UMP4xMQufn6pd+7ga0sh8mhk5OTu3cmqB6pZXlq6YMyYjYsWSV1laQX0+MmTAYGBh0+elKga+YWF/ceMuZeZKZF9qc3+vnXrkokTqyorJSoofOXKNR9/LJHxarMSCqjy2bOAoKDzV65IWoGc/PzhkydXSBYD6cgf27nzq6lT4fkuXRFg+bewsPWffipdERIKaOLHH5+9dEk66grL11NS5oSEKE6ZOIDenWVTpkCPhgxst3/zTfT27RIVJJWANv70U1R0tESka5vdvGPHucuXa6fTmVL+9Cm0mp9VVMhGb8W0aRmpqVIUJ4mAbt2+Pf/LL6Wgq8Hmx4sWyfMHrYED8lLovHl5GRlIMBEYNLPghifF41ISAc1eskT+RsnVGzcOxcQQcbekRlKuXNm7ebOkRag0nhgXd0iCBxn5fqDoEyfgvV1lHVQmWlpYjBoyZECvXp08PJo0btzI0vJJaemDR49u3LoFr2/7jhwpLilRmbF2Yq9u3WL37q2dLiaFeD/QjIEDL+oidGNT057Dhg0aO9a5TZvmdnaGRkZFubkFWVlnDh6EPms4xteuhbPz3tRUsIDPohVJXkCdBw9GdjfXr19//vTpn86c2bhhQ3VEoSNgxfr16zZvRt5+0+Pi3IgOu5IVUNL585N9fNRVtna6t68v9Ac2a9Gi9iVIefnixeZly8JXrcL3X4ds3Tp8yhSV1oQlEn6EnTp7FqkeC3PzQxERqz7/XIN6oEpw9d8hIVFbtiBHLXb+/rswR8iTC16q8QVNnDcv7OhRdeoBO/UNDKZ/+WXowYP16tdHmo3atAmJRMIIC2hDeDim4DfeeGNHWJhvnz4YMGDe8fPbuGoVBgyPPAxML5gnDx+ejIpCFt1/1KjZa9ZgwN39/GavXo1BAga6D1KvXkWCMTCSAnpSUnLk1ClMqVPfey9g0CAMUoGZMm7coN69FafqDq7dvAk01F3Vb/rp/fufV1VhOFhaWcGzBv7MMGDATAgObtOpExJ8imgzkaSAfj96FKaDaa2GsZHRF8HBWmG1AfN
nzKidqJQCTSV5ei+VysWcnkY/Xv+5aJG5+nahyrKmortSofWt0oKwRJICOh4biyExpH9/W2trDFIJ08fb27xBA6XE2qfIRljtjJKmgLLj//gDU4SZufmY6dMxyJqYPsOHwxzFminqjtMSE4sfPFB3Vdd0kgKKi4/HFA8NGgysNsbAwABe9WunK6UkoyfyKWWU9PTuzZvQAY0pope/P7y6Y5BKGNCQUoq6U2gJqbukazoxARUUFd3PysIUD/N4MDCVGOtmzVSm10wUPN2xphHixzfRIy39Ro0SVjp0FyEzEhSQAbJIrTAYvtCKqQa0RbSFkaZUwmB8XmW6fhNhYjySQNvOnZFIJRhMrVdKUXeahQ6WOguKdGJ3IOTtR1GwdAePioufP38unX1hlvNxg1/QdrZ1chJWhFXz5o2aNsXkzSc3g+o1FBAMqRY9eoTxo5yYguxsTHFOrVphYOowDm5u6i7VTC/EkamZRd0xMQHBmIO6MuRPLysvl79QzSUiW9DmjRpptqP5agNLS82A6qswhx8Dw2CICQjmPmPKkwdTSd8ExUqcps1xClDnxgYWFuou1UxHkqmZRd0xMQHJP39DXZUgHdOfqSG7FJde4Jplwl7gFYSR2ZEd4gqzGg6ICUhDGfwSeMDYxATjB5HTFJEPSqTOMISJCQhWs2PKkwdjRHTKCxHOyJghFaCOUhluHNCEXLCICQh2RFBXK/nTjQwN5S9Uc4nIsS3YQUGzHc1XkdmRZDSXVX2VmIBgMw1MefJgGpiZyVMQvhRre3sMGN/fWNsa9F8gZ84TXPVMTEBOOAfVrrYUKVbiXoaloIQc6Sx59OhRQYEwArAFDPL9HEkGQ8MAA8JgnNFL+fMTEzFDWphCGcK0fPNNJNuUhASYI4YE14TdRK/Cg+nVNTOKOSZ2B8Lvt8LuSmQxjn7TywuZ/U+h6+li0NMd3+rSBUlGK4yYgOCp0crFRWt5AIDdpTCw1wzj2KoVsukKE74ELHCD3RTO4abzwkzq1h07knIvMQEBIR+crr/btq0UNzOGVCVpsAPzU7vh9n2CJTv4uYuKqkWGhiI3aYBNqGCTPEVGkQckBeSPc1DhgwcfLVgg4I9MZFX1nr2nvz+SwxYd1/XCDMPt//430jhMWEMiMTCSAoK5qrBYB1Nq5L59HwQHUzjpAkNeMAYiZ4TrLYOFE7AjArIgmCy7dNIkZBci7BEI6z2QljEwkgKCpVuBaHWH79wJ20PBFq0YluowsGkVTMQOmj172dq16jD0pFs2bjxwzBgknw0LF2LaNHAjXztnztlDh5BmvQcPFjzfSGURJAUEBQR/9BF+MUpicnLvkSN7jhgBi7l0GsyHhTt7Dh2aNGeOdfv2vuPHb9+9O+H6dZXVoy1x7KxZSEqw2HSOv/+mJUtg+am6LDDHCBZK79ywQR2gdnogYmVL7VwaUsgvbR71wQd7Dx/WUKTKS3D36tejx9uenm1btWrt5gaL5M3NzGANBgzyw9r44idPHj5+DKvl45OSYGc7mPWstGFea1fXW3/+qdKyyESyS5uBzCdDh57TxT+wMhUWxsPjD3r/4BgG0gtzcu6lpMDCeHhfQzacq50Ak2Uj0FOzkX4jLyDo5nmrTx+ZZ3fAgo2Ku3fhX2S18TDiAkq9dm2ClxdyqT+eJwYJC6VhsT0GiccQfoRBwbC9/KLZs/EMiCBfvHhxl9w8XyKU1Blp7ek5VuJ9C1UWDTu2ElcPFEReQGD0s1mz4HmkshrSJTK0XevMFSvsXV2lc0Vty7BWGnaOrp0uPkUSAcG+Lb9+952jnZ14fngLqeSWquALFYaEfrxv9uwh2JunmQbs3bHq11+tBK0G1mwZrkoiILDbvGnTU7t329nYaGVACsDQHQiq3KpDh+Xbt8vzhRTY5aObjltZ4IMilYCAAXxf5489e8hu96ShYmwJCCrSb+RIGTQ07YsvJsydq8FvIi9JKCBgBl/WuXTkCH4fIDGVYU5AUFm/9977etc
uiZ5lcHv7ZPXqD5cuFeNVrXmlFRAUD1uMHfnll40rV8JeiFrZiAHAEBt+N0UxBZHNCwMLP8XFIRcE4ouGXu91Bw4ESbnFeDUZyQUExUDf9IxJk5JjY2FfKfjQDt4LuiIZakfXrBp8NG5nYiI8aPA71dXMXvsYHo5Ryck9hg6tfYl4CvmORM0UM7Kzv926Fb42BzcMzUj81WZNmgz39R09bFj/Hj2I9yUS70jUUC/4Xhhsmhmze7fgqQowVWP68uWd+/bVUArZS3ILqJo99Psd/eOP6JgYGAoVNkERtjnz7ty5r48PdDh5e3lBxwFZvyisySmg6kJh64wDP/0EXyzEb4Hw9zBtYGDA5Mmv+RcLFVGpeZCVmwsbGybevJl29y7szJKTlwfL7GEkBH7whwjrK6oHxWCiCHQstXFzq/61b9sWuW9rzbIEHMsvIAXJ+7duXYmNha+JwVKN3Hv3noJbFN9MNTeHoTGY5eju4dGpd+82HTuSevwpSkce6OcOhCRHA0yPAqKh+lo5yNGI1kqCA9j1ABcQu7GjgjkXEBVhYJcEFxC7saOCORcQFWFglwQXELuxo4I5FxAVYWCXBBcQu7GjgjkXEBVhYJcEFxC7saOCORcQFWFglwQXELuxo4I5FxAVYWCXBBcQu7GjgjkXEBVhYJcEFxC7saOCORcQFWFglwQXELuxo4K5QZ4tFTyoJdEpgVpqVBDjdyAqwsAuCfI7MmF8kZ6SHvlj5J8n/8zOyIblF02aNfHo5DF4+OBRE0YZGkm48hDDjWN08sAbuX/l6pRBJLjqWdXSfy2N+CFC5RZddo52oT+Hdu/bXWQpBLPbJvBnvCZ3yvoIg5vN6H6jt23aplI9QDMnM2fcwHG7tu3SRJlfo8kDsgoo+IPg+Lh4zdWH3TPnT51/+dxlzTB+lRIPyCega5ev7Yvch6k2bEAOjzkMkmP07gH5BPTLll/wtQW13bh6A4/nSH15QD4BnTt1TqdKnok5oxOeg/XiAfkEBA1knWqYfT9bJzwH68UDMgmovKxc10+rFD8u1otHeKE6eUAmAZmameq69ZNlQ0udasLBevGATAKCXe5s7XXrkbN3steLR3ihOnlAJgEBpx79ddu7vueAnjrVhIP14gH5BPTuB+/iawhDY+292uPxHKkvD8gnIK9uXgFjAzD1hNbSsnXL8N8dw9jkGIk8IJ+AoALrwtd16tpJc01ge+w1P6zp1qubZhi/SokHZBUQvItF/RH13tT31H0jAhravx77ddyUcZR4h9PQ6gG5p3NUE0pLToORjbOnzuZk5FRUVPw9H6ijh+9w39ETRxsZG2klLSeAT+fQ7G39CEgzJ6qucgFpDoesjzDNVPhVFj3ABcRi1CjizAVEUTBYpMIFxGLUKOKsn1UZFDmAGioVzyrSMtIy8jKyC7OzC7JzinLg36LHRZAOv/LK8orKiqoXVabGpmYmZlaWVk62TvDzbO3p3d7bw83DoL5+QknsLSwvO8/LwYtsOA6ePwj912Rt6mpNurew0vLS80nnL964ePXW1cS0xPt599WtNdDKuaF5w+F9hgcODPTz8VPXx6bViDCAfmQrjOvrkau4tDjst7BDZw9dvnn55auXRCr15OmT7dHb4efm4DZv4rwpw6cYGsi0vI63gYhEUAcjt7NuL9m05ML1C6TUU7NsMP7Ryo86vtvxTMKZmunSHXMBSedbvVm+eedmnw/7LNq4SPAzEU+dCwjvK5aQ8K2+leEr/ef4QwNcUt5cQJK6V8/GD587HPCvAEk1xAWk5xhLXXzMxZgJiydIVwoXkHS+pcXy3lN7v/75a4nYcAFJ5Fi6zC7euPhq6lUpOHEBSeFV6mxCl8GHX30oRccBFxB1wZaIUHxy/I7DO4gbJ9YTDbNRkXtVPXrwqF2zdsRr8voZXBe8zt3R3drK2qqhlbmpubmZuZGhUUlZSXpmOvQTghqS0pN0qjW82E8YMqF+vfo65dIMJiYgzcXwqwI8MM5
3nE0TG6WMMIzatV1X+M1/f/6ek3tmfD2j8FGhEkbdKQzWHog98E7fd9QBBKTzR5gAp9GSZVT/UfE74mH8C08ofH84HoxBcgFhvEQvxsHa4djGYzAaj6R45NwRmCKCBGNgXEAYL1GNcbFz+WbON0iK8CJ2/MJxJBgD4wLCeIl2zOSAya2cWiFZcgEhHVWHYDAdcdGURcgKn7um21Zxms3yO5Bm/zBzdWS/kSZGJhi6d3PuwmRIDBKD4QLCeIkBDPQS+XX3wxCFmR43bt/AIDEYLiCMl9jA9O3cF0kUpu4jkVphXEBaXcQMwMPdA8kVlnwgkVphXEBaXcQMoL07dkuu/Af5pGrFBUTKk/q3A6McyMUYZRVlpOhyAZHyJBV2LMwsMDwITnLlAsI4nBlMQwvUmEbV8ypSVeICIuVJKuzAKzqGB0wLwcAwGC4gjJeYwTwtf4rhamxkjIFhMFxAGC8xg0G2jhuYNiBVJT0IiO/fSyp4SnZggALZOm7euLlSXsGnehBQ/fokp1QKrvnrlzH1fiqyUs2tWBaQgSF2Hu3LF2Q2r0C6lXVYWmYasgqwsRASqRWmhzuQsQm2BVf2lFh/l1ZHvAYA/I4crZ1ak6qvHgQEjzBzC3NMBR4UPsDAOKbaA4fPHsa4Ajob7ZrbYZAYjB4EBLQaWTXCkEu9gX2oY6y93pi4xLisgixMHbu81QUDQ2L0IyDrFtYYfnGn4zAwjgEPrNi6AukHnw4+SCQGph8BIT8mB99uvn3rNqYadRwDW3DATi5IJ+CnDWEM6kdALu4uGHKAWRa8DImss7D7uffHLRyHrD4sAOrZsScSjIHpR0BtPNpgyAHm5OGTC2cs1PrB3hcvXsQej53/4fwg/yCkZfphn6z55MYdLXNPj50/5j3J++GTh8jqDOs5DDnlA2mQ2Da/yPKqYdkZ2W87v43P4uzqHDQjqGf/ni0cWlg0tKisqCx9UpqTlQOfBk9OSk64kAAPu+p3fgtLi9QnJJvexLf5hU0OukzUoRnb1qXt0B5DO7XpBJtBWzexhlsIjFfkFuVeTr4My+NPXjqJHECt9vbJ70/269IP73mtSP0ICGh5u3pn3CU2M7dmPROyE2zsbGqmiDnWu4DEkFfKC1s1pO5NJTuUpJ9HGFRsUMAgpeqROoWPkZEy9ZrZ+WzSZ2TVA/7Rm4DGvD9GovBwAal0LOzB8P6w91VeEpOoNwG169hOog+jpqeki/HI65o3bEGYFN/T0JuAIE7zls2TIlr8DlTbqxOHTvT19q2dLj5FnwLy6eMDH0kVXwclC1xASg6B17dNCzcpJZI61aeAoA5fbfjKrY0bqcpU23n88PHDImy/CNmiKbTmaON4YN0BglMQleqoZwFZNrSMOBQBvTtKtESe8ptQtQNd7V1jt8Q6t3AW6U8N2fUsIGDm5OIUfT66vRd2VaWGylRfMjQy5BOJwBX+vfxhAzxJ1QOl6F9AQAL6/eDbctCmNjUzrRaBsH/hTjY3ZO7ljMsDhg4QZoGqXAO7DhT23mTb1Dbiywh4cjWyaCR1jfTWE62yYjCD7Mf1P+7ZsScnM0clQGWig7MDKMZvpB+0yol/r0+PPdF5x/Pge01b92+NiolC7ugL4x4zA2cGDQuSrtGjFAK6BFRNDgZ3kq4kXTp7KTE+EYY7crNyn5Y8rSivgF5Uc0tzGO1q2rypSysX19au0ADv+HZH4k2omj7Sr4AU2/zCZLETF04k3EoAJWXmZ8JnD2EJmImxCUwvtLe2f7Plm15veg32GUxwrmpNJ2g4plFAGujKf4kSAclfcWSJVLSBkFw5jEIPcAFRGBSWKHEBsRQtCrlyAVEYFJYocQGxFC0KuXIBURgUlihxAbEULQq5cgFRGBSWKHEBsRQtCrlyAVEYFJYocQGxFC0KuXIBURgUlihxAbEULQq5cgFRGBSWKHEBsRQtCrlyAVEYFJYo8QllWqJFfEKZlvJYu8zvQKxFjDK+Bv/Is6WMEmV0OiV
QRoguOvwORFc8mGPDBcRcyOgizAVEVzyYY8MFxFzI6CLMBURXPJhjwwXEXMjoIswFRFc8mGPDBcRcyOgizAVEVzyYY8MFxFzI6CLMBURXPJhjwwXEXMjoIswFRFc8mGPDBcRcyOgizAVEVzyYY8MFxFzI6CLMBURXPJhjwwXEXMjoIswFRFc8mGPDBcRcyOgizAVEVzyYY8MFxFzI6CLMBURXPJhjYyAn44KC3MzMe9nZGTk5mfAv/B4+LKqoKK+srKj+Fw7gQxmmpmYmJqZNmza3t3dydHTx9Ozi5eXt7OwqJ1Uayrp9O+vChevJyXdv3bqfl/egoOBRcXFpZeWzZ8+eGxoaNGhg0gA+qdLA1NzczMnJ1hW+DfZ/v3bt3KytrWTj/0Zu7l8SFQZfmklLS05MvJyUlJCcnJiScr2kpFhwWW5ubUaMGD9+/BRbW3vBRgRktLWVdWEh/P3Exib89tvx6Og/s7MLBBCGLM7OLXr16uTr6+3n59O4saUwI8hc5AUEKtm2bVNc3OkrV84/fVqK5IGEGRoagYbmzVsG9ydkFpEw2QT0/PmL8PD969b9kpqaIZKzIruRkeGQId0/+mj0oEHdiH8xvroU8gJKTIz38+uiqIMUB40bN1m5Mmz48HFSGFeyKY+AYmIuzpq1mqB0lGoRHr508uQApUQip0w2oh8/fjh9+vg1a5YQcYF+jcCD/rPPNgwaNFM69UAF3d0dJaqmrI1osnUIDf3q5cuXCxeuJGtWTmsvXrycMGExtHikLrRdO6leQZi8AyncvWHDqt27tytOmTuYNm2FDOqxt7du1MhCIuewLSBwyuLFH+flZUvkHUnNbtmyD1rNkhZRbVy62w/YZ15ApaUly5bNkyEMZIsoLHwUHPw/ZG2qs8YFpM4z/59+8OCulJQkLSDKLq9eva20tFweUtC1KF1BzN+BwDXQ+RYeHiadj4hbfvas6qefDhA3q84gvwOp88x/0/fv3wmDIf89p/vo1KnLjx+XyMOxXr16bdu6SFeWgXSmtVpevjzU1bW1jU0LK6tmpn//ZwadIg8eFGZl3Y+Jid67NzI/P0erkWoAdHlD33f//kOQeP3CTp++gidgZWU5bpxvz54doS8H3qdgCMzU1AQcVVlZVVpalptblJ1deP367WvXUs+cSXjwoFjJsouLnampsVIiwVN9CiggYGzz5jZKlbGzc4Rft269Zs9evGjRrKioCCWAutPTp4+xIiCIt7paKKVD9/G3386H4VKl9Pr168F4qoWFWYsWzTp3bjtiRB8AwKM8IeHW/v2nf//9tKIISZ9fUKg+BaTkFKVTCwvL0NCfCwvzz5w5oXRJ5en16wkq0ylMzMkpxLAKCOgNQxAYZDUGRru8vN6E3/Ll0+/cyQYl7d8f6+nZGm9BAJJeAUFl4Pm9evX3Pj5u8LeltW4w8q8VQwmgrKwCw+TddwdjYCoxMLVj7twJ8FN5lWAi7W9hTk4u3bv3w1S4uPgRtJ8wSL1jjI2NMBxgPhAGpl8M7QIC7wwYMBTpo/T0FCRSvzDkwMLatTvu3cO+RuirRgwICKYjIr2TkXEHidQvrE0bZwwBeNX39p50/PgFDFhfGAYE9NZbHZDegWENJFK/sK5d2yEJwDRWX9+ZI0YEX7x4A5lFZhgDAoL50TCDDOOXsrKnGJjeMfB6Be8HeBrwPtWtW1DnzhO+/z7q4cMn+IwyIHWohgxs1BVhY2On7lLNdOIzaGsaJ3hsY9Nk7NhBuhq8ciVl+vRVtraD/Pw+3rx5L9ycdLUgBZ4NAVlZoe5A5eVs3IEgkCtXzkQ2pZWiDlOnjx6Ng4lEdna+gwbN+PnngyUlZUoYOU/ZEJCREaoznqHhMFg4sW3bMjET3V++fHXixMXJk7+wth4wevSne/eegjFaOaVTXRYbAjI2NsG4BtPfiLEjDwZaQiEhU8WXBYNie/acHDVqvoPDkKVLv5f50caGgJB3IPHBkNnCF19M27DhUxjYIlJ
uUdHj5cu3ODkNhQUe8o32E6HOjQj2wKxZY6Oj19vaNhVsQSkjPMg2btzVuvVIeaYckdG+Uh34qU4eGDzYJzV1H4xbwQC7Thk1gOFuNGXKsvffD6mqeq4BJv4SF5B4HxKwABMz1q79V3Jy1MyZgbUnbwguICLiEKw4gxX1gi1ozcgFpNVF8gHc3BzCwhZkZR1ev36+j08HMe9oCtKxsVfGj/9cutcLLiCFq2k5gP6hTz4Zd+5cOCgpNHRe9+5ilQT9RmvWbJOoelxAEjmWgFk7u+azZ48/ezY8O/sI3JkGDOgquJG0ePF3ijmKBJjVMMEFVMMZtB7CvFVoG5048V1R0cnIyBVjxgzQdZozrKGGN3wp6scFJIVXpbLZsKH5+PGDd+1aXVAQs2XLkg4dWuFLghFZKTqHuIDwIaAICW9tH3ww4urVyF9+WYEcU4NBNCmmFnEBUSQLXanAaxrMmz5z5kfkEy0xMU3XIrTiuYC0uog8AFZ1ETTq4eE2bdoojMF793IxMJ0wXEA6uYsM+Pz56+7uI7788seMjDwiFpFrd0pKyE934QIiEkHdjFy/ng4rLkJCNrVs6d+374cwz1DkEHp6eqZuDMihuYDI+RJtSdElAx3EsMwZ5hm2aOHbu/fUb7/dKWApz5Ej50JDIzGFI5vbGFMKDLHRO4VFfqDVA0lJ6UoYaBXBynb4zZ79DWz6DL3P8FSCTREcHKyhEwg2gzYxMXr16i94k4JFiTC2lZ//EB5/0CiGFytYFa9kTd2piwv5HZK5gNR5W8L0Gzc0LT8CZcAvMvIocQZdurQlbpM/woi7VIvBrKwCSYfH1RUPe0b37u2l7qrgdC4gwa4TmFHRABKYX2g2f/9e0JEtNLfafFxAal0j0QV4BZPIsmazCxYEaQYIu8oFJMxvwnPVbkELt4XOCfsMdenyFhquA5ALSAdnEYHK/wiDrmrYpYoI+dpGuIBq+0TCFHgPl/STBrWpQ1/A8ePfEZwmq1QEF5CSQ6Q9ha19dZqDIZINbH13/vzPsJJapB0N2bmANDiH/CXYMfPSpe0XLmyDRgl0D5Iv4D8WHR1tYOrZvn1rLS0b/CdNkv9zAUniVs1GYXsX2PwwL+/4jz8ugS/DEZk8rygRurB/+GFRevrvMPVMkSjdAe+Jls63WizDpLB//nME/DIz83fvjomOPnPuXCI0krRkU3UZ1raCboYO7fHOO32RI/OqzAhJI//BOSEsKM4jzwfnqh0ALaSrV29du5Z28+Yd6LCGzVzhkxrl5fCV1OewPhBUAr3JMCgGY6JNmzaytm7SsmULWAnk4eEOYxSSPhA1xIcLSINz/r4kp4C0UKHyMm8DURkWdkhxAbETKyqZcgFRGRZ2SHEBsRMrKplyAVEZFnZIcQGxEysqmXIBURkWdkhxAbETKyqZcgFRGRZ2SHEBsRMrKplyAVEZFnZIcQGxEysqmXIBURkWdkhxAbETKyqZcgFRGRZ2SHEBsRMrKplyAVEZFnZIcQGxEysqmf4vPLVVQdr85mIAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# slightly different composition: b1 is merged with width, b2 with height\n", "# ... 
so letters are ordered by w then by h\n", "rearrange(ims, \"(b1 b2) h w c -> (b2 h) (b1 w) c \", b1=2)" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAASAAAADACAIAAAAr0inhAAAtg0lEQVR4Ae1dB1gVR9e2gIoK1gjYsGCNLSIW7KJGUYkNe9dYsMSexJhgi4ktmmgS+RIFuwbFAlEsGDs2UIgtWEGKXQQBC+p//twv91tnZvee2V2ul5vJkyeZeeecM2fe5dyzOzs7k/tNUlIuS/on2dmSvAFfLMwhC3MnV/0oC7teFuZQHgujR7gjGLAqBkSAWdXlFIOxNAZEgFnaFRH+WBUDVhVgs6fMrl6suls5t82rN1vVVRKDybEMWE+AXb189Wj40WeZz2rXr91jQI8ce0WE41bFgPUE2Orlqy9FX3rx/MW+Xftc8rsc/+O4VV0oMZicyYCVBBjE1Tr/dcZLUKZ8maatmxqroiAYeFcMWEmAvf77HyOJifGJ5WzKGauiIBh4VwxYSYD1atuLYHDJqiUEIqqCAfMzYCUBVq4Cma8mDpl45vgZ8xMqehQMSBmwkgDbvnG7dFRQtrW1dW/qToCiKhgwMwNWEmA0ay9fvvRq6EXjAhEMmJMBawiw82fOMynbfXo3ExegYMBsDFhDgNVzr8fkq3Tu0kxcgIIBszFgDQE2beQ0Jl/7ovYxcQEKBszGQG7r+B5MLlklvdH8tZuFfYBlYe6I78FMhKo1ZDAPVw/mKGcunMnEBSgYMBsDIoOZotrCUoaFuSMymIm/H2vIYC75XJijHOI7hIkLUDBgNgasIcAuP77M5Cvwp0AmLkDBgNkYsIYAcy3syuSra9+uTFyAggGzMWANAZb4OpHJ145NO5i4AAUDZmPAGgJM7hls0OhBZuNRdCQYYDJgDQEW/zKeOba1K9cycQEKBszGgDUEWKOKjZh8zfhmBhMXoGDAbAxYQ4CdunmKydf8z+czcQEKBszGgDUE2JQRU5h87Tmzh4kLUDBgNgasIcCW/MreHWD6qOlm41F0JBhgMmANARZ5MpI5trCzYUxcgIIBszFgDQHm1tjNu7c3Tdmpo+xnM1pSIIKBbGLAGgIMqNm1ZRdBkI2NTeMWjQlQVAUDZmbASgLsevp1gjjYKJFARFUwYH4GrCTAfNr4ENwt/mUxgYiqYMD8DFhJgIWeDO3/cX8pfbB7trQqyoKBd8KAlQQYHKqyadUmI4POZZ2bt21urIqCYOBdMWAlAVbArkDCq4SPJ37sUMTBqYzTFD/2q+d3xbLo91/LgJUEGFy/2Euxxw4ey8zMrP1B7Z4De/5rr6gYuEUxYGNR3mhxpmrNquHR4VosCF3BgO4MWE8G050aYVAwoJ0BEWDaORQWBAOyDIgAk6VGNAgGtDMgAkw7h8KCYECWARFgstSIBsGAdgZEgGnnUFgQDMgyIAJMlhrRIBjQzoAIMO0cCguCAVkGRIDJUiMaBAPaGdC6kuPIyZMrAgIOR0Q8fPy4RLFi7vXq9fb2Htjz375S6e7dpBMnDp06dTQ6+uzjx8DNw8zMjGLFStSr5+7t3btHjwHar5w6C7sCAsI2bbocGZmRllbCyamhp2ef8eOr16+vzppqrejY6OPRx09dOHXuyrlHqY8epjzMmzdv2VJlW9RvMcZnzAfVPlBtWaPi+vW7g4MPnj8fe+fOg9ev35QqVczRsUSbNu7t2zf29GyowrimDPb8xQsIsKs3bjxNT7crUKBk8eKVypf3aNBAhR9WphIffzMxMT4hIQ7iCv599izTzq5gyZKlypev5ObW5F0NNvHmzUtnzyZcv/4MrlehQk7ly1epU6dqvXpm9ifzeWZcclzC3QT4F8oZzzJeZL
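As the comment notes, only the grouping order changed between this cell and the previous one. A plain-NumPy sketch of the two patterns side by side makes the difference concrete (small dummy dimensions replace the notebook's `ims`, an assumption made here so the example is self-contained):

```python
import numpy as np

b1, b2, h, w, c = 2, 3, 4, 4, 3
ims = np.arange(b1 * b2 * h * w * c, dtype=float).reshape(b1 * b2, h, w, c)
x = ims.reshape(b1, b2, h, w, c)

# "(b1 b2) h w c -> (b1 h) (b2 w) c": b1 tiles run down, b2 tiles run across
grid_a = x.transpose(0, 2, 1, 3, 4).reshape(b1 * h, b2 * w, c)
# "(b1 b2) h w c -> (b2 h) (b1 w) c": b2 tiles run down, b1 tiles run across
grid_b = x.transpose(1, 2, 0, 3, 4).reshape(b2 * h, b1 * w, c)
print(grid_a.shape, grid_b.shape)  # (8, 12, 3) (12, 8, 3)
```

The top-left tile is `ims[0]` in both layouts, but the tile to its right is `ims[1]` in the first and `ims[3]` in the second: the order of axes inside a composition decides which axis varies fastest.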
0oWKCgi7NLvWr1arvWNrM/xu6uXbt96dKN5OQHKSlpz5+/LFSoQKFCdi4uzpUrl61Vy9UoxlVQfz7YmzdvBo4fvyE4mO6vSsWKv69fD/+lm0wimPOvkhOS3cq5mTQFAiERIbBjB0ZSVgbj0D/K8Bl1UNCazZsDIHf9gzH+7+pafcuW/c7OZRltpiAed96ylXTr1rgPP4yLjX0L/bvSc8yYz3/6icYxSP0ojNR/ZdIy0pZtXBa4K/BG4g0FtUGdBwX4BeTJo+rXn8shiROHDkUOHPglBL0Ee6tYoULpr78e269fh7dQUxVVY/jb6N5Dh3bu3cu038vbW110Ma3RIDK6QFFrdNF9KyKxsZdiYqIuX/5TUSpX16591UWXslnl1t3r1jGjq0iJEl0GD1bW1as1IiYC7glvJd9SMFikcJFe7XqpjC4Fu4pNkC22bNmnEF2g3aJF/Y4d2WepKthWmcHgp7pW69aXr16VMw0ExYSHv1+tmpyAHI75hbbMDLZ8+Tfw79OnaXJDk+K2tvkiI2/DTaMUxJQx/NB2bly61LtOndevXtFNBqRes2arjiplXTlFfMKY9+u8Wf6zXr2W9UHahWs510tbL9na2EpBVBnvkMScv/+20aPnSwDZYteurYKDF+fOnVtW4u0GlRls3datCtEFXfTo1ElFdL3tm2zNAjNYamoKzGogowsG1rfvMBXRJcuIqYZVX3+tEF2gPfKrr0zZ0NSekpby+7HfkdEFPU0dOFVNdKny8eXLrKVLN2BU8+WzHT26Jz66wKbKDNagQ4fImBhln2xtbVNjYwvkz68sRrRifqEtM4OVLo39VTMMGSYVL158QAzfZBXDD2EkMz29RZEiygEGKi29vb/buZPQNVnFJ4zcbnz8vF/5/Qu/XTDpACmAd+gfzQMHTrVr5/tPzfT/V6/2GzrU27Tc3xJqMtipqCiT0QXGp4waxRtdSKctMIPBdDzSeaPY/PkrjOVsLaz86iuT0QUOfLoiG/05e4mbn58+UznvwkvmuHELuFTw0QVm1QTY/iNHMA59u2KFe8eOGElemcjbkUgVuV21kep4sbp1G+CFDZJjxvRduPBLXi0V8n9s347R8ipf/sDWrRhJFTINanLz0/Ljll/8+IWKvrhUYDYhNjaeS+XYsfN4ee5bRHAon4vLK/nHZWnftyMjyzo7SxGTZcwtUOncpU3aMQgkvUlCSsqKYRzKlQsyWMeO7rJGZBqSkt7ItMjCOHf+p34lKqq/m9v/6vKlgvb2R1NT5dvZLcg7Mshg7gP5+IFHnazTWdzTiUiH/hnNZ58tX7Ag8J8a6v9v3mB/38EcdwZ79vw5MrrAejk3t7+uX0d5zSNkHRkMRsz72MZD0n9lb125gtSCtR1u6MkxpE2jmIoMBlPned3zwptoo5HsKNy6xf0TDK+h8Z5wB5jPyJF4606lSlWrXBkvj5S0jmcwGOz48Z8jh6xabPbw4XjdYTNm4IW5JFU8g4F9r6ZedvntuDriEs7KegWvv7hUQLhoUXu8CneAPU5JwVu/c+9e2WxY52Y1GQzemwUFrcXzqUIy68ULvNbq+fMXjh+Pl8dLqshgYHz38d3tfNvhe+GVtLHJy6vCK88XYI+fPImI5LgBBW+O7tjB65NJeavJYDBSH59BJserWmDHqlXwzMylPn35ci55pLC6DAbGdy3dhexChdjw4XN4tcqWdeRS4QswWNHLZR2EKzVunPnsGa+WsrzVZDAYZrVqRZQHq6UVVvTyqmfTY5i6DAbOF/QoyDsEvHzjxrXxwgZJ5eVUtDW+AGvXuzdtQhmpXaOGirBUtmlNGezQoYvKg9XS+kW/frzqg6ZN41XByKvOYN3bdMfYVyczcuQ8XsUOHTy4VPgCDCYtuKyD8J+XL4+aPp1XS1nemjKYm1s55cFqac3Hf8exdtGi0LVrtXTK1FWdwYIPBn8b+C3TpnbQ3p47PYaFneDqly/AtoaGclk3CPsvXKhCS0HFmjIYfH
+pMFItTfDp1/NMNXPcnQfp/1ioOoMBA58N+UwLD3K69+49SkvLkGuVw6dOHSjXxMT5AoxpwiSYu3RpkzJcAtaUwXbt2nL5cgzX8JHCFapXR0oSYk3s9J8ZV53BwDcbdxvCQ12qpUoVV2Fn8eJ1XFocAfZjQACXaaPwMf4lpEZdZsGaMhisV6hRow5zmBrBXrVqqbOw+fx5dYoKWloy2Jn1ZxQsq26aMmWpCt3AwNlcWhwB5tm8OZdpo3Czjz6Ct/LGqvaCNWUwYGbatJHaOaEteHToQIMYpHv16ryT+ybNaslg9fvVx3/nYtITo8D8+WONZXxhyBA/vDBIcgRY/fbtuUwbhT+oVYvrExqjolzBmjIYjHHRov/IjVQLvs3fX516XQ8P7hWApnrSksEgOPPm0f+NsLPzh6a8ZrSfPbuegcpDHAHWvFEjeTtKLecuXAjdv19JgrPNmjIYDL1qVQdOAlDiqn/Uok+cCAkMRPWBFtKSwSA414SuQXeFFdywgXuOHkw3bMg3A8Sxml7LXMWbpCTkuDGrxa1mNb2Bk+vX02HPKR35AVOwMdsADdt7RaJv6ZGL1yFIeFfTSwl5E4l+xEA6BN8a53aTdoEsZ2ScsLPLjxQGMY4MJmd0aO/el48ceREXB/+FMlOsZffuTFwdCB+h0P9euH+BtvbOvwfbsGHPuXNJcXHPz5yJ8/Vlv8OtXJl7vQU9UgKpIf+JSlsfn98uXDienu5/8OB7MhO8XVTtCEb4IK3KZbDQ70Ov77qefjz94R8PF34i+zqnWvdqUmu6lKdPH8y04+vrA/eBaWlHX7w4lZgYtn37ku7d2xgl69Rh/4UbBYgCNsDk7vEc7O3btmgBW44+SUu7cOXK9j17iA4M1cOs3d2YkqrBWu8xJs3MtqsU84vmOXOWOTmVtrOzgzmD27dvBQdvZI5u3brfmbgW0LddO6Z6fju79r17lypT5n5S0pGQEPgvUyz4r7+YuGqQ+Qy2dMpSx+KOhe0K57PNdzX+6vrdso83sAGO6q7lFBcuZNx2Fi/u0Lz5B7Dg0M6uwLNnL5KS7v/557UjR6KMRq5e3WEsYwpabxE/Gzfum7e/cZg6Z86SlSvpvpF3iZhbRNo4II8ePKJjzJz7IjI/7iI+qZwwYdDWrYwXKYQYc4AGEMlPTETEUA8P2k6TDz9cERZmxF9lZTUuUIC5oQDyLhF9R5aLuScHce+3LXxbz+k9je5JC4SktOmtMtohL68Je/Ycf0s3Vy7YDgC23CBAqEZGXh46dBYEG+wqBTmNFpBDsBkMdrmhTcB2x9PHvjXXCcvtl/6HMSd29cQJWl1fhI4usP9uM9j588nSMaalpTKja/jwCVIxXcrM6ALLfqtXS+3/Z/ZsZnR9tWqVVEx7mZnBkve9xQ/04vutL7Ov4MXBTFwLSEcXWPvhB8ZtPLxK2bnzEEQXCHBFF8hjM9jBY8c8e/UixmNfuDDsG0WAULWrWBE+fCZwkcGAELhXLFuWMeOsewb7on//sI0biUsAVTovHd+zZ4KXF0aSlgEEnTBQGQwMVula5drt//9TJv7RPYPVrdsnJuYq2YvidgDffQfPZhl+fiMJLYWqjUKbtImOLmgNYiUreAyjo2uu3ut9pb4ZypafwcBP5sbLDRs2o4ejEWFG18CpU2mzE7t0ocGaGmYgaWuAIDPY7bu3mdHVuXlnplktIB1d3t4tFQxev56wc+fhw4d/UZChm7C3iF1Zt4hw+ANtsVvHjqu++47Av9R7vS9hH6oWOItYr54z4efmzfvo11OnTx8jxLRXbfPlo40k3bxJg2eysmgQFgr/de4cjatGmLOIzu1Jfso5loNJRbqX0KOh9x/fp3EtCBzpQKjv2nWYQKRVkOeNLlDHBtgOyZOxoVf4Q/GWWdsxfPJkqWdQrl+7NoHoXs0RGSwu7ga9aqxo0eL6svHk4cOXrN++BUFBdEfrlyyhQUCqffABE1cHIjMYGIeFUXQXsJLjvWLv0b
gWBDISob558zcEor2KDbD81C8i/Ydi9IZ+3Ir6888n/FuCGQ1iCjkig7m4VGrWzJMYTkrKowcP7hGgliqc58BUT2ddggFTpjDPB/v5yy+ZRtSByAwGxlMOp9BdwFrEDXs20LgWpGbNSoR6nz6fE4j2KirAIDbou8GRAwbIdf9h375EE6xtK+LgQID6VnNEBoMhHzsWTg9c333qd749VWjozqF48cJFitBdAwJ7J9L4mLlzaVA1gs9gv2xnP+T079hfde9MRTgKjMDnzfMlEO1VVIAxY2PNb7/Jdb930yaiCWbPfj9wgAD1reaIDAZDnjWLfEAFMCJC6e6fl6iPhg2jVVIfPaJBA7KEtTFRX10P5sNnsI+7fVzVpSrt6mC/wTSoBfHwqEuoz5z5082biQSosYoKMGYs3Tx1Sq7vA6yDcDq1bSsnrwueUzLYrFmT6fE2adKSBlUjk7y9ad1JMs9aIDmla1dafpOuX4XhM1jWq6zYuFjanzWzGQsvaDE8cuJENCFcrJhDxYplCFBjFRVgg6k3YNDr2LcXcEj9aNu8eWFqP6M51NSiVEV7OadksJCQCHqwmzevpkHVCDMj+fv5yRmkX46BpL7bS+EzmE1eG2YsuX7kKue/OnzYsI8IxcePU52c2hGgxirqRXP/sWM3UgcI0DMZRleysrJsy5c3Vg0FBXmpJHIpkFTFUGYusX+3e9PDSo5SpZwIVzErqggVaRXDT3N7+4ynT6VaUD6RkQELEQnQUO3k4nInPp5oCr9/v2jJkgRIV5EvmiGD0avpYSWHUwmSH+gCs6iK9uS/CNIhmdX0e/f+COedyxrnb0BlsA0//khbhjP4aNCA2NjYtGhMetln9Gg5eV3wnJLBbtzIoMf7yy/LaFA1EsBamDZHfg/t3+Pi6L4833uPBlUj+AwGXaQdTaM7ytMA9bdKK8oh/v5f0E0ffji2a9cpNK4aQWWwio0a3bp9m+hDOSPRH48pyxuNY36hjcLSgshgRjaYd3fM+0CDyqIJEzZTG/ou37MHs+kAMmFwZTC7JnbPXjwzDsdQSD2aal/QngAZVaRDMhkMDJ48uaZRo1oMy6og1K/CSdZubcpHyPpQC3Dqenqq8hCrlFMyGHPZ4bJl87DjRMgxl+qGb9smpzrthx/opvG6nu3GlcEyIzJpfxyaO9CgFiQ5eR9TvXHjwQ0aDHj48AmzlRdEBZhTXXJCE7qpUaWKQmdBISFEa3Q44/0PIaOlmlNmEd9/n/FgM3HiTC1jJ3SZd4OePXoQYsbqyX2MP7V39R4MvOo+tbvRN2PhRMAJY1mXQpkyHeTsjBjRtUSJInKtXDjqFvFwREQr+cuD7A8+ynxw8aJJYdW3iDniezDD8Ol5Dl/f6TNnLjBJDghg+Fn++eeB335LWAu9dcvZxYUAjVXeu0qjIvqOjG/qQv08B94h+btEGJ2trc2KFZ+OHMkIdePYMQVUBtMeXeAKJrowHsvJ5JQM5unJuB1ARpfc2Amcji4QUIiuR/fuERag2nnwYBpUjeDfg0EXyzcvpzvaNJ9cvUDLcCH9+jEmOYwWPD0bfvRRK2NVdQEVYIHLlqnuwKiYp4zOr/CMlg2FnPIMFh5Ovt8E/2fMGEsMR0u1A7VUTdla8VKl6Bn50DVrlLW4Wrmewcb3GU8b7zuDXH9Hy3AhGzd+rXCUHuxBD+/E2rf3TU1N5zJLCKMCbMjEiYSaiurzW7dUaOFVckoG8/FpQw9q/vwfaVA1EkYtVZNbhWjsIuXBA2PZUGjQujWBaKlyZbBDkYfovhZPWkyDWhA4PFb5MNi8efP069fRwaGQll5QATZryhQtfRh04eh07UYULOSUDBYUdJAexaRJQ2lQNfIBtQfz0ydPlK3Vpt5bnv3jD2UVrlauDNbKrRVtfOrSqTSoBalQofTOnd8pWHj16jXsw2Fn1yQ4mHHJFBSlTbgAk1/GJrWlXE7U9QM+uq+cksH69+9IO790aQANqkbOUWtB33d3V7
b258mThEB5xVliQthklSuDPUp9RBucOlDnAIMuMO+UO3Vq3qlTM9ofJIIKsE9GjECaUxAro+sHfHRHOSWDwU6JtPMTJw6hQdUI/a3kxTNnlK01pr6djb96VVmFq5UrgxV3KG5rY0vYX7xO51tEsP/69VmiF7q6bVt4gQJN/PxW0k0YBBVg3//6K8aWskzs8ePKAhpbc0oGGzasGz3SZcsCaVA1Qn/t30hmm0RjF/SrMLk9SY0qXAWuDAaWX2a9JOyP6TmGQLRXZ83yxxh5771ivr69MJK0DCrAPu7fn9bkRao2bcqrwiWfUzLY6tXb6XHpm8FcqlUjujhl6mwA+jW03J6khGVklSuDgU1IYoTln7f+TCDaq7NmjVq+fLpJO/fvP4YZxXHjFpiUpAVQAfbLhg20Ji8SI1Zy/E3Z6NF9aOr0zWBx1L68dPwQPtALqeyLFiVktFR5Mxj9GDakyxAtDsjpTpyIuvPMnz/f3LlqUigqwAZoXsYBw6sj1iL+fZFXrtxMX2x9M1iZSpWILuj4IQS8qA0g0lJSCBktVd4MVrFMRaK7wJBAAtGlmpV1xtmZsXiNMP78+YvixVsHBOwicJNVVICtl18narIDo8AZmW3rjQIaCznlGcwMGSzxxg2CTDp+CIHd69cTSIGCBQlES5U3g91MvEl0169DPwLRpQovlJOTH2BMwZMYbKyNkZTKoAJMlwzmruvqbOkYDOWc8gxmhgxWqWZNgh86fgiBVtSuAc8yGN+tEVr4Km8Gq1C6AmF8Y9hGAtGl2qGDR2rqUVh5aNIaPInBiUcvXpCzL8qKqAATGUyZRGhlnq5C7E1vMGKGDHbj0iXCYZPPYIeofW/k9n4jLCOrvBnsVtItwnI2PYNBLzCX+PJlFtEdszpwYKd8+cj3B0xJI4gKMIUd2oyGTBZEBjNQZIYMRh8OZvIZzL1NG+IKwu6lBKKlypvBShYln4uy6RkMBrVkySQ4lKhwYdO3xHDQVKtWI7l4QAXYf6gbdK4+DMLiezADD2bIYHC8JXGB4NQiAiGqZw6Sq4Ecy5YlZLRUeTPYgxTyuWhUj1FaHFDWXbZs49OnqFviHTuWKJsiWlEBpstKDvFFs4F6M2QwOh1F7N1LXHiiSm/uezchgZDRUuXNYHb57Yju/Lf5E4iOVfj06/HjQ/ROiXQXxYq1UtjTmpZHBZguKznggFm6ex0RMYtoJJNOR7VMHWBPb+5b+f33jQa1F3gzWObzTKLTyQMmE4i+1bVrf4+IiDFpEyZF6OM7FLRQAabL4UM1WrRQ8EN708UHF2kj7/aMZvp0FfDQDBms86BBBBUX5HeJNUjaUftYXkd8fk70olDlzWC0qe/Wf0eDOiITJvSBpYnLlk1Vjh+Y1l+wIBDfLyrAdDl8KP7sWbxbKiTfL8n4xbWoEy4NgzLDM1jo2rUEgbAxPYFIq3DPk5meLkWgjNlSilBRqHJlsLSMNNrUt+O/pUF9kcTEe0FB+03eAX766RB8v6gAW8vadQjfh0GyvN5nuhEOXHl8hUCg+u/MYCNmziSogI3pH929S4DGKvM3+wR1YJVRXkWBK4Mxt2f7bPlnKvrlUilTptSxY6sTE8Patm2koGhr21ChlWhCBdigCRMINRXVp9euqdDCq1QvVp0W/ndmsF/nzaOpKO7oSIMG5Dbr0nQfyTcfLWfcgHNlMKbwar/Vyl3o1bp9+x+HD5PTsFLjUVEbpVXlMmpXqaOnTrXoRn5kcSc62lHXzV8NjmJ2TWIOKSM9w7WwK9G04+iOhs04fm8I9f+voh2i94oCbeYuiLRkr16Dket9Me7A+bFwRjMxlh92724qv5jm37mrFEGRtLppU9jw4XMyM59LQWP5jeJRzkYxKKAyGB1doJkd0SX1jLdMRxdY0BpdaCcsaiUHHV0wDoXoYh7oPFnXwzqYSQn2pmcSzFwVFb4ynCmcHeCTJ08hj8lFV8+ebfGdojLYo5SUEtTytoiQkMZubviekJKYX2
imqVevXpWzKUc0wbGIrTu0JkC+KtohOi9BR+8kg8HnKt2rkzfMPr6+n7HOGDCwITIY/VeRlpbRvPnw6OhYuunRoz/grCMapxFUBqOjCwxlR3TR/uEROrpAV2t0obu3qAxGRxeMQyG6pnbvTg90FbWxBy2DR7gy2Oj5o2nLCXv0fPFN26eRLVv2MaMLVgYjowtsogKMeW7DbD12wqFHpRphnlQUtDZItUEuxbp1G9Dy7+o92NnXr2lnvKgDpYwyc1hbII7Udds2rlnElTNWGn0zFmr0qGEsm6cAG2jDmzF670RYGQyxh/QBFWDVmjWjzfnpsZcbbVY1UtWhKq3rM8iHBrMDsagM1qpYMXqMzDOKDGKepUrR8od1/eCSK4OV9ypP+wOnq9BgdiObNu1l7p3Yu3d7ZNeoAPvr2DHanGOdOjT4DpHYVMa98rzp88zjkkVlMGZsNMgje62Hsw4r/aRzZx2p48pgm7/ZTHftOdqTBrMb6devQ0zMFrqXGTNW0CATkSVdKj100iRp1VC+xjrljRYzG9KwAmM6fuZC8pVrNvljURnsUx9G3l4cHCw39p+/+opu+o+uG49yZbCmw5rS/phzFlHae6NGg6RVQ3n+/HE0yERQARawdCmt7FC1qslFJbRW9iGnb52mjXs19KLB7EAsKoMtCGI8eTJnMgxUFHJwoDlZr+s0PVcG82rKuGpf/vwl7aQW5DXrSZU2mJFxggb79p1Bg0wEFWDBu3fTyv26dWMusaElzYN08ehCd7T7NMNzWkw7YlEZjF6LCAOkdyM1jBp2pU9PTaUZGDB5Mg2qRrgy2O7jjKs2d8xc1b0zFfPI3zNL5SdNWiKtGsqbNs2nQSaCCrDuXl72hQsT+nAs+nBdrwFhn7caciKEVmGeK0uLaUcsKoPBavp8BQoQg4LdSNcuWkSAUJVbB5zMOriZVkciXBmMafPgmYNMXDV4/Hj03Lm/mlRfunQKLdOp0wQaZCI2TJQAnz1/nkadWg8yKxcsICTfYXVM3zF070cuH6HB7EAsKoOlPn784tkzepiDpk2jwUne3jQIv+4K54nR8iYRfAY7cOoA01ob9zZMXDU4ZIjfgQM/m1RfsyaUlvn99x9okImgMliB/PmHs46cggNTjlCHBhDd7Dt82DxvzH7exCCrRY0WhD/ZVLWoDOZQrFingQPpkbrbMH5Py1KbKIIi8vmE7kIOwWcw13KuTCP0ZtpMMTw4ZcqAChU6t2498u7dRwpagwd3plt1zmDQwSrqyCkA69as2YI69kbqDSxf2hAcvOb776VgNpUXfcW4//H0MtPcrkVlMGD493XraJ6XUQdnwzTV5uXLacmmXoxpBloMj+AzWIMBDWizRQoXoY+DoMW4kDFjvoFJBHij5ehYXEFx5syf6FadMxh0cOnwYbqb6EuXcpcuvV1mR9EnqanDp0xZGxT00ZAhtK7uyLQ5jPuf8N3hn/t+TvSVlZU1bSRDmBDjqlpUBgPP17I+YZ7g5fXzl2/Nxd1LTGQO8zhrWospiQTlMtiF6xcIC8xAevL0CSGmverr6wO/LxBmefO6//AD482boYumTevSffXv/wUNMhHULSJovt+qFVO/WcOGH8o0HTh6NPTvMwd2BgYydfUFQ4JCmAbnfD+HwI8fPH4v+R4BaqxaWgYbxNqEI0/evCP9/KQjnSXz2/d9aKhUTHuZmcF82vrUqlxLanxvxN47D+9IEUO5f8f+NKgR+emnIIOFZs3qeXk1lbPm4/Mp3bRhw9c0yESwAfZa5qfu2OnThSpXhjv+oJCQxDt34J4wITn5cEQEvJvuPXr0w8ePodfq1IGLTFc0gl18GNP0YNMln4v/d/4pj1LSn6bDB87fzPhmRI8R+0P3a+yOULe0DBb55g3hIVRfv3rV0Nb2u8mTE65fhzMvYbPE8zJnSvnPmkWra0GYGSzoQFBut9zTv59+8fpFeMS6dOOS3CFg8FWElt6ZuhBXBvzIkagqVbrC89iiRW
tjYq7CVypwtuXDh09gmvHTT39IT8+k1b/8kvHAT4sBgg2wHvJn8MEUSMN69aq5uhYuWDDz2bOUJ08uXLlyNjoags3Q5RVd12UzhwFgQlwCs6lC5QrNPZvbF7FPe5KWcCsh6mQURJq9gz1TWDVoaRlsQqdOzLHAqV8tunSB/8LN4b4tW5iTjaC4ztSBfUzjCiAzg4F8zUo161ev71jCMT0z/cylM+Gnw5lGsmMZx7Fj56V9wa1gzZqVSpd+r0CBfLCcF5YgxsUl79t3UipjLONPWkF9D2awW7BSJYgfYx/4go2Nzcv4eKQ8+vMrhr0mlZvE3YhjNLCgqIQopzJOrJa3MbRDlvM9mGEA7nnzqp4MXBEWZnKvUkMv9aPepku+BslKvtFES5XyVWK3x5oQMjSjHYIdDpkLeTG9wOHO3t4tMZLYDHYzPl5ddIETmdRhHxjPVMjgowuMo6IL7YSlZbC/zp9XHV0waGR0oenJJZfBkBb+Cv4LKYkXUx1d0AUyukASG2AVy5ef9ynjaQ8zHlv5L5Ew6niZjyd+jBc+sv8IXtikpKU9g1WrV6/vJ5+YdNtsAsxnMHzvAbsC8MJmkIQ9BZC9YAMMzM1Uu26D+bUL0j8usV+W/YKXb9FOz3fQlpbBgIdNal8/tu3ZE08jUlJjBhv20TBkR0ix27fvIiVpMThgpUiRwjTORDgCLOv2baYJkyDze02TWioE9kXtw2v9+v2veGGTkpaWwcDhY6zVbSYHAgKnqYMgMFrKMhozWNarLGX7vK3lyjnyqhjluY4I4wiw9qzVUsZeFQq7WF+kK8irbmpfvz1ed8QnI/DCJiUtMIP1qVvXpNtMgW3U8WJMMS5QSwaDxVM2eW24ujMpvHv3cZMycgI9enjKNdE4R4CF//Zb+TJlaBMmEe/Bg03K6CLA3JZDzvKno1U+UjINWmAG23ntGr3jPNN5AoyNjiYQ7VUtGeza7WvaHSAsKLxZJiTp6rZt7HcJtCQgHAF278GDeJnXzUzTRnDR28tzjLjuhe4tu+NtLli5AC9sUtICMxjECb3jvMmBwGqPxu05bgRMGjQIaMlgYSvCkL3gxbgOcCDMnj69lkAUqhwBVqpkSXXbBEybO1fBAx2bgg8H4611a9ENL2xS0gIzWNW6dRfxn14Pqz1MDlaFgJYMlh3LOLgOcCDG+/PPWwlEocoRYGClJWsDPQXrhqYR/fqZlNFF4MShE3g7249sxwublLTADAY+MzfnUB7LpOzZjU9LBls7hyNjKI/O2IpfrWtUMRZWr/Yzlk0W+AIsISrKtUIFk0YJgV83biSQbKp6tPLoORA7xcw87ki1YxaYwWAsZ169Qn4Ybxx4RlqasaxjQUsGg1VUOnpiMIVfrUt3jTxs1qDIF2BXb968dusW3aUy0py1sltZRXXr1nXY9M08sE91v5aZwfZs2MC7noNYbq+aEEJRdQar7Vq7kF0hwpr2ap06vdUZqV3bFXNcutE4X4BVqVjx0eXLRmVkAQ5nQUpqF7uScgVpRN/tOiwzg3Xs33/hVuwvDvDGm+6QVIOY6gyWHV+CgT+RkSqX58NCe/yoQZIvwECh75gxXB2AMMyO8Kqolu/cuDNS9897fyIlMWKWmcHAcz+edyTjvvkGM1gVMqoz2OFfDqvozqRK4cLNTMowBcLDVzJxOZA7wOComx/nz5czx8Rhfj+FtTEYU1gjCLvclC5XGmOkdqnaGDGkjGVmMHAe1nOUc3VFjkLfg8+lnarOYFIjOpaTkzkW/Uj7LViwgLRqsswdYPCV9ef8v3NFWVtbmnROhQAsqE+6nYRR3HZoG0YMKWOxGexqTAzzAEt6XLBVTjOZr8hoYV5EXQarXLZyhdIVePvCyHfoMA4jRsh07drKwYHvgZA7wGCfkCd//fUxdYAi4QpRPRWF/myI0OSsulRygQ+9MEp92vfBiCFlLDaDValT53h6Orw+NjmQBroep0J0py6DweeYhB29qvCy+OTJNYUK2X
EZ7NWrHZc8CHMHGOjA1gCBv/3G1VOj+vW55LUID/loCEZ9VfAqjBhSxmIzGPgPG91gXh8rHCCGJEFBTF0Gy443YEYn/f23MbcDMAoQhfLlnfr27UCAJqtqAqyMk9OLuLjJo0aZtG4QgBfNsJETUli7WNjZsKmzpyrbgee0Om51lGW4Wi02g8Eo4N1x0MWLyvuc12vWrISTE9eQuYTlMpjCKl7nks5F7Yty9cIlDO+LU1OPtmiB/en/4ovhXPYNwmoCDDTjEhLW45bhvFeiRM/OnWHXABXOqVN5+fLljwt+VNbtM7RPKadSyjJcrZacwWAg/5k9W/mkjjFz5nCNl1eYmcHaNWqn8B3Kwk8W8vbCK//bb/uOHj2H0YJNZ4YM6YKRJGRUBphL2bJ3Y2JC1qyBL50Ji9JqKw+P4FWr5PZ1k0rqWLa1tb2efn3CjAlyNoeOHTrZb7JcqzrckjMYjOjbLVt2XL3qJHOxeowala0PYOAAM4PtP7U/MSyxThXGrQQ8fXVrrediUeZlHT78/8+wXLx4EpwKyxQwgG5uNQIC/OA7SwUZuSaVAQbm4K4v9MAB2KtDznT+fPlae3g0yYaD0uV6NOIP7j3Ytp49SViuQrmO3Tvq/kbVwjMYMLMrIOAO62LB5KH30KFG6rKpwMxgyfuSV+1cFXM1hu50bK+x2bGAg+4oPv5OaOgR2EaKbjIgefPm6dSpGWQwOQFlnGNXKTlDIfv3w6aIp8+dS7p7NyMz07FkSWdHx/YtW3bt0KHhBx/Iacnh6E2c5Az8D48+Gw2bCEQcjrh/5z5s21bRtWK7Lu3gUFnki7L/GtLRof+5pr6k0Z1t/v6wW1vs+fPpaWklHB0btm3be9y4mg0aqHYIvYlTLuauUm8i30DXq3euDgwJhEiDO8YaFWsM8BrwSd9PVLqEd0jSQVpaRkDArrCwE+fOXXn8OA12HITpeFfXco0b1+7WrU2rVm4SWb6i+gxm6Od2UlL0xYswr/j4yRPYdqpQwYKFCxWC7zKru7rWqZldc6yYIcIjx+ljp+G12NPUp/B8DxshVqpaybW6K190YXrKOTK3rly5EhWVdPMmfCcG32LCHWOV2rWr8/8IqhixXAYDU7fv3o66EhV/J/5pxlP7gvYQYB08uCfrVLgkVYG4unjxemLivYyMZ7AjANwNwtnnjo4lIMbc3TX9GeuQwaSOai9r/IXW7gBpwcIcsjB3cuEThkIGIznXUsc7pKUXtK7WDIbuSAj+qxlQyGDWzYsIMOu+vpYyOuYsonN7Z0vxL9v8EAGWbdQKwxIGRAaTkCGKggG9GRAZTG9GhT3BgIQBkcEkZIiiYEBvBkQG05tRYU8wIGFAZDAJGaIoGNCbAZHB9GZU2BMMSBgQGUxChigKBvRmQGQwvRkV9gQDEgZEBpOQIYqCAb0ZEBlMb0aFPcGAhAGRwSRkiKJgQG8GRAbTm1FhTzAgYUBkMAkZoigY0JsBkcH0ZlTYEwxIGBAZTEKGKAoG9GZAZDC9GRX2BAMSBkQGk5AhioIBvRkQGUxvRoU9wYCEAZHBJGSIomBAbwZEBtObUWFPMCBhQGQwCRmiKBjQmwGRwfRmVNgTDEgYEBlMQoYoCgb0ZkBkML0ZFfYEAxIGRAaTkCGKggG9GRAZTG9GhT3BgIQBkcEkZIiiYEBvBkQG05tRYU8wIGFAZDAJGaIoGNCbAZHB9GZU2BMMSBj412YwpcPVJfyIomBAEwOQwQzHMWuykgOVxflgOfCiCZdzDgMiwHLOtRKe5kAGRIDlwIsmXM45DPwfiC5OIXdjYREAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 15, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# move part of width dimension to height.\n", "# we should call 
this width-to-height as image width shrunk by 2 and height doubled.\n", "# but all pixels are the same!\n", "# Can you write reverse operation (height-to-width)?\n", "rearrange(ims, \"b h (w w2) c -> (h w2) (b w) c\", w2=2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Order of axes matters" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAgNElEQVR4Ae1dB1RUR/c30ruCSi8iiA1E0SjYMCqIBrBij+WLGrufWJIYNcYWNUaMGJOoxEKMBSUqghosELEiAipFsNCLighSVND/zd/vcHCX3b27zNt6PXs878385s69v3ns3Tdz585H7/Lzm9A/wQwUmAuuoxpggAgS+hgQPULpadI1QXi9ytcSQUIfgaZCa6mSGCAGiAFigBiQUwbU5VQv1VYrIzXj4O6D/5z/Jzcrt7qq2qSliXNX58H+g0dOHKmhqaHa3JD1xAAxQAz8j4GPaApR+LMg5Smg169er/rvqgO/Hnj79i2/YpY2lkF7g3r178VfJbMSKRMkMzsl7JjoEU4czZAJ54fmWIXzQ1OIwvmRai28bI36ZNS+nfsa9F6gSl523thBY4/sOyJVtagzYoAYIAbkkgFyYHI0LIGfB8ZfiReuUG1t7ZLpS27G3RQOo1pigBggBpSeAXJg8jLEiTcTww+GY7R58+YNTDNikIQhBogBYkCJGSAHJi+D+8euP/CqgLe7e/suHk9IYoAYIAaUjwFyYPIypnEX4sRSJTY6Viw8gYkBYoAYUDIGyIHJy4BCgIZYquQ+zhULT2BigBggBpSMAXJgcjGglRWVsLIlliqlz0vFwhOYGCAGiAElY4AcmFwMqI6ujrq6eJvKDY0M5UJ1UoIYIAaIARkxQA5MRsR/2O1HH31kbiVe1kUrW6sPZdAdMUAMEAOqxQA5MHkZ794DeoulSp+BfcTCE5gYIAaIASVjgByYvAzo+M/H41WB1Igubi54PCGJAWKAGFA+BsiBycuYuvV08xvjh9EGVstWb10Ns44YMGGIAWKAGFBWBsiBydHIbg3Z2rVHV+EKNW3adNOvm3r27SkcRrXEADFADCg9A+TA5GiIIRYx7GLYhOkTwEs1qBYEevx59s+x08Y2WEuFxAAxQAyoFAN0nIqI4ZbJcRj3U+5DZqnLFy7nZeVVVVX9ex5YF2dvf+9Rk0ZpammK0FjK1TIhSMo2NqI7okc4eXScinB+6DgV4fyQAxPOTxP6AiKCRDAgtJqeH6H00PezcHqaEEHCCWp4qkp4G6olBogBYoAYIAZkzgA5MJkPASlADBADxAAxIAkD5MAkYY3aEAPEADFADMicAXJgMh8CUoAYIAaIAWJAEgbESyArSQ+C27x6/fp6QkLstWvJqakZDx/mFxW9rKiAQh1tbV0dnRbGxq1tbOxtbLq7unp069bGzk6wJKpRCQaKivKzsx/l5mbl5WXD//B59uxJVVVldTWEav77P3zevXuno6Orra3TokUrKytbeIJcXbu7ubnb2bVRCY7qGZn36NGtS5dS4uOzMzJyHzwoLy2trqh4+/atjp6eroGBmY2NjaOjo4uLW79+bV1dBe3cqCdPsS+rXlXdz7qfVZCVW5ybW5Sb9yQP/n/y/AmUw6eyurKquup1zWsdLXh6dI0NjW3NbeHj6uTq7uLu7OCsribLr0qZUJ+ZmXPt2p2UlIdpaY8LCp4WFZWUlpZXV7969eqNhoa6np62nh
48Sjr6+rrAVJs2Vu8/nTo5mJoaS01hGUQhwlfM2UuXQo8dO3H2LHgspKmOrVsH+PlNDgiAC2QTJjBWUWQFuQVu1m5MVKoTcurqKcjfUXcrmwtWBPFpD1+19++nJCXdTE5OSElJSk29U1ZWyofCFjg4tBs2bNy4cdPMzaWaBJkzegQanv/4ceSBA5GhoVn37wsEfVhhZGIyKCDAd/LkTj16fFjD+R13YfTlleVXk69ev3v9dtrtpPtJjwsewxMlmT1G+kb+nv4BgwJ8PHyk7em5I6ghLuDLOSYm4fDhcxER/4CfbwgiuszOzqJv367e3u4+Ph7Nm3N7aIZUHRg8QAfCwjbu2JGakSGahoYQ8PSMHDp01aJFHZ2cGqpnX8bqC4gcGHJswEvt27fzypVLt25dffmyHNkKCdPQ0AQftnjxang/QzZpJIzV84NR42FKyp51684dPvy2thaD58e49u49Y+XKHoMG8VdxVML8+7m0vDT4cPDpy6dv3rtZ+1ZCHgQZ62DtsHjS4mn+0zTUNQRhGJczJ0iAfm/e1ISEnNi69Y/09CwBELGLNTU1hgzp9cUXo7y8enKU+k56DgxmC+d8/fWt5GSxaeBroKGhEThz5qrAQG0tLb5KxgWsvoDIgSEHJikp3senOxIsGax5c5P164P9/aWR0ITV8yPc0qqKil9Wrjy4bZvErqu+/H5+fsuCg02tresXcnTN/Ps5PiW++yRun5+ObTr+/OXPfbv25YiTD8QyJ+gD6f+7iY6+PnfuRoaui6eTkJBVU6eiEr3yNBR5K40gDnjxWhsU1Mvfn4n3ApPg8OLvg4P7DBuWW1Ag0kICEAM8DDx//mzWrHGbNq3gKVfQ27SEhDHOzqE//sjEewEJMSdPjurYMTosTEEJ4Vrtew/uec7wXL5jucRzklxriJcPJnz55XYvrznceS9QxtHRBq+SWEjOHVhlVRXMra/YtKlW0mkNQfbEJyV19/FJf/BAEIDKiQEhDAQFrd2w4WshAIWoOnPw4NRevSBeg622leXly0aP3rF8OVuxSiMN1orWh6z3XegLASCKa1RNTe348cs3btwL5nBqRadOXIVQcevAnr94MTAgIPL8eY7YKSwuHjB69KPsbI7kk1jlZmD79g1Hj+5XXBv/2rNnxaRJr6urOTIhZP36TfPmcSRcCcRGxkX6/ddPcX3YzJnrIF6D64GwsjJt1syAo144dGAQb+k3efLVW7c4Uv292LzCQv+pUyGAmtNeSLiyMvDNN/MKCnIV0bqzhw6tnT6d61msw8HB25YuVUR+pKNz9PXoid9MlE5fbHvZtSscojbYymxQGnevX9Adhw5s0rx5l2/caNAktoV3UlMXrlzJViZJUxEGysvLVq9erHDGwu6u1dOmcT3z856W/Zs3R+xX4PdUrgf3+IXj3+/9nute2MovLi4JDPyRrUxB0hTSge34/fewiAhBJjEv/y00NO7mTeZiSaAqMHDq1JHU1GQFsrTy5culo0bBBlyp6bxu5sys9HSpdadwHX2z45vb6bcVSO2NG/eVl1dKR2HY2sxdR5y8gaVlZi5Zs4Y7pRuUPG/5cun8IG2wdypUXAbgsQkJCVYg/YMWLy7IYrZZB2M4LLPBCx/X05UYTeQTAxvOZqydwXzbGUfGvnr1+vffT3IknF+s4r2BLVixQvqLUrfv3j0dHc1PH5UQAyIZOHHiECSjEgmTB0DqrVvHf/tN+pokXblymiYSBfMO+89CI0MF18tRzYULN58/L5OOQpB6okMHe+76UmcuOuLvv8/FxODFGhoYjBwyZCDkHnF2NmnevJmh4Yvy8qclJXfT0iB8MTwqqrQMy/XmnTs/lWISAbyNgDS3Ms9/l49pUvK0pFPLThikimO++y6oTRsnMzMLY+OWOv/+04VXhKdPi3NyHkdHRxw/frCwMA9JEaT8gNwfAwYMQeJlCNv+5ZdizTRo6ej0+fRTrzFj7Nq1a2VpqaGp+SQ/vygnJ/bUKc
jZAdd4W35bvXrw+PEgAd9EnpFbA7c62jiaGpsaGxnr/5vST19TQ7OsoiwjOyM2IRa8UXJGslj6Q2D9xCET1ZqqidVK+uBLl8QIrDM2Nhw71rtPny6wlwviCSEFoo6ONvyhVVe/Li+vyM9/kptbfOdOZmJiemxswtOnpTzm2Ntb6uhwmG6CfSaOboMHIzcsq6mpLZk1a+mcOc2NjHjMrruFQPx127Zt/e035PRFxpUrDkzT/konk0KdvXCBd2BKmQsRn4kjMbGgVSuz+tTVv4bojOXL54aFHahfKOT6P/+Zv2bNNiEAyarYPj/JV69O9fDAa+Lu7b0qJKSlhUWDTWprasAnhWzYgN8BvXLPHv9p0xqUJlkh80QT+EwcBecKzEwEPj9gzrHzx2Z/P7u4pBhv2vEfjg/vPxyPF41kTlCTJkOGzI+KihPddZMmkD7jp5+WgG/HgOF3VUJC2okTl/766xK4tPdNhg3zDA/fgmkuGaapZM0Etbpw+TLSexno658+cGDD118L8V7QC9T+sHJl2K5dyKxRh/76S5BuVK5SDBgYGAYF7e3bdxDS6jt3EpBIGcIgqB3f+6TFi4PPnBHkvUCOmrr6rDVrgk6daqqGfWkI27kTr4CiI0cOGBkfGg/5D/GGhJwIwYNlhczLQ7lkP79+kAIK6b3AFsh26ObW/rvvZiUnH87MPLFly39hWs3V1YlTMxk7sO0hqPEDU0ODg709PZG2Dffx2bFhAwYMU44YGGFUgQGYf9+48RdkFlHIfC/nnLx49uw8Or3TgJEjF2zahLGol4/Pgo0bMUjAQPh++m1FCrdD2iUIZm1qfXbHWchGLwjAUx4VFwVHtPAUytttRUUVRqXx4wdjYA1i4GiVRYsmxsTsWrVqRoMAVoUsHdiLsrKoCxcwmk2fMMHPywuDrMNMGzvWq1+/ultBF4n37oEagmqpXNUYsLW179XrE4zVpaUlsH6GQcoKc+nEiTevX2N6NzQ2hrk+pOcGgRMDA9t17YqRDJgLx48jkcoBs7e037xwM9IWCEQ8d+0cEiwrmJYWahUTzgOTlYb4flk6sL/OnIHjKEX2raWp+W1goEgYP2DJ7Nn8hTwlsFQmnd3TPP3SrdwyMHDgUKRuGRmpSKRMYLC2gOz3P8uX6wteV25QyHR0KgCI/mhQghIXTvWb2ta2LdJA+XdgyMROW7aEPnqEDYNCksMcxtKBIYMPhwwYYG5qKoElnu7u+np6IhsiF+FEyiGAcjAAxzEjDcnKeoBESh8Gv8ziL17E9Kurrz961iwMsj7G098fzmiuXyLo+n5SUunTp4JqlbIcjmNePm050rS4RFR8BFIaF7B27ewwYiHU3t19yrlz1zBgWWFYOrAr8fEYM2BBCwPjx6irq0OoPX85T0kK+iBanoZ0q5QMdOzYGWkXBC4ikdKHPbx3DxJwYPrt6+sLofMYJA8GfBhPiaBbWAkTVKWs5SM+GaGtqY2x7mHeQzgMGoOUFaZHj07IrouKSry95wwbFnj9+l1kEynDmDmwoidPHueg5ky7dcZ+ofBzYdqyJX8hT4nExz3zyKFb5WBAW1sHTrDE2FJRgfIQGFHMMffQmdI+GTlSst5huxiyoQo6MNgl5tML9csbosnvZsrp1/378YXwQohvQo41wCAyvmfPyd26Tfzll7Bnz17gG0oBqc6qD0gfhRTVARGLgRTVIAzy0zdYToUqy4CZmSUcYinSfNjOLBIjK0B2Rgay6w7duiGRPLC2rq48JYJuc9B/7IIkKGJ5/279wy+GYzTPKshyd3HHIGWCMTMzGTPG688/z4jV+61bkAQmdf78zQMGfDx8eH9/f09TU2OxJHABFsMPC+8e+folXAiT2pLSUjiymYkoEqIcDBgbo97AKivl9w2sEJf8EGI3zG1tJRs141atmrVogWlbqJIn8Dk7il6/eM9e3hN5j31Yv34OMpSD53l486bmzJkrcJCYpaW3l9fsvXtPlZVV8GCkeauEDg
xe4Z+UlEiTROpLzhnQ1NTCaCjP6RCLcnMxJti2xQbLNSjN2sGhwXKewmKcMjytFP3WxdEFaULhU3mfBLKzs9i3bzV+owW/4bW1b//++/rUqd+amg4cNWrp8eMXIEcwP4zrEmYODHI+ca0rXn5FZSUeTEilZ0BLC7X8LlaOQSmThozg0G/WrDGK6RkaYppXVcjyRzdGQy4wxobGGuoaGMkVVQrAD6yErVw5HWOOcAwkRTx27PzIkUusrYesWvULxH0Ix7OtZebAKqV4OpFICqrpgGaRHKkSAPkGJs+UVON+k+njPJAgS/UMDARV1S9HKlO/iXJcG+ii+Kl6hUp1IXNOvv125vbtS9XU2HiBJ0+ef/fdLlvboXPnbpRetntWJEr//BQhmmP2UwtpTlXEgLwxUINb1pUsgL7OWGRzZEKQOrFKc2FkYISx5fUbGUymYRTjx8ydOyYiYpu5OWrtk785fwlMJO7YccTJaYR0jhxj43v5zaASYoAYYMiAljZqFrSRxzQjJyqRfo6h+XIiCjnJDMeyyInCGDUGD/ZITw+H1IUaGuoYPAYDb2PTpq3+7LOVr19zG0/HzIHpSrR3EsOFBBhNZTmySALbqYlSMoD0GUgPJIiiClweUTgSSpAE5S5/iYtT1cIFDckPVwYGupA8PiUlbM6cAHz6eZH6Hzhw2strTmkph7tTmDkwHdwvRJE2MwFoaqDWWpn0RUKIASkwgMxtWF5a2hhlkM2RyjRGE/lsi4zO0NPRk0/9hWvl4GAdHLwsJydy27YlHh6dGxOjWNdRTMytceO+Rr651rXCXzBzYC2MjfG9co3U09Xlugvu5DN5brhTjyTLhAFTKytMv/j9zvzS4FsmKz2dv5y/xNTamr9Q6UsgQRQyOqNV81aKywbsD5s/f2xcXAh4sqCgxb16NdaTwb6xTZv2cUQIMwdmi/sD48gMHrHGjQsm5pEm5Vs4qFrKPVJ38s8AMtNuWUlJSVGRZObkPniAjI9HKiOZGnLbKv0xyruD/q2MFdiB1fFvadlqwYJxly+H5OZGwZvZwIE9JF4k++abn+vOaK6Tz+SC2aqdHfpHWWFSEialIRPzFFGIOnoptbamVhENJJ0lYKB1+/bIVqkJCXBGJRJcH3bvxo36t0Ku7dq1E1KrrFX3s+8jTbM1lzAZClK+lGEWFi1hbQw+L168jIy8HB5+MSLin6qqV3g1ampqIcL+6NGN+CZIJLM3sI5OTsguH6lkHhokOQDT0kaljQBkxUsF2C+JN5yQQhho7+YmpLZ+1T8REfVv8dfR6OOeO3bvjherNMjYhFikLU622C9DpEA5gRkZ6Y8bN/jIkY1FRdG7dq3o3LktXjHICMzF5jBmDgxm7dra22PsOXvpEgamshiYQtQ30MeY/7T4KQZGGCVgwKZtW2ToBBw4KcGaeWV5eVxUFIYoNXV1py5dMEglw0RejsRYBJudLVtZYpCKi4Goxc8/H3b79sE//liHzKkISRS5OFqMmQODwfDA/S77ed++ctzJRoo7wI3UvJlxM4yE9LvpGBhhlIABCO3pOWgQxpCinBz82c11Ag8GBb3G5a9x7d1bB3GubJ1k5bi4knQlpygHY0v3jqryegrP5Pjxg2Njd+vooCaNkpKwc7AYnt9jWDowX9wfWPHTp18sWybBj0S8VYqONLUwxZhw5dIVDIwwysFAH19fpCG71qxBIt/D4ITl/T/8gGwCB2YikcoEW7dnHdIcj84eSKSsYHC6N8OunZ0dZs4ciRH46FE+BiYWhqUDGzJggIE+au7rYHj454GBdOiJoKGysrUSVFW/PPFmYmZaZv0SulZiBsBzaOJ2W6bfvr1/82YkFfB1tmrKFOQWZjgIcYCkB2Yi9ZFDWPT16Mg41PwhKA/HhsmhCfVVunr1jqPjsDVrdmdlFdQvl/ja1dUJ07as7CUGJhaGpQPT1tIKQP86Czl0qLuPT+y1a2KpywOura09FxMzecGC1Vu28FQp9K29oz
1S/9WBq5FIgik6A4bNmw8aPRppxfavvsKsacFEyJaFCy+fPo0U6z54sMTnjSG7kDfY4/zHY78ai9TKSN+oT5c+SLCsYHfuZGRm5qxcubN1a9/+/WfAOcuNTCGfkZEtK1tYOjCwIfCLL/D7cJNSUvqNGNFn2LDwqCixktm/KCs7dvr0lIULTV1cvMeN23/0aMKdO7JikIt+2zljw5TPR57/avZXIt9la2pqYs7FLJmxZLLvZC4UJpnSYWDM3LnIjt7W1i709d25YkVtTY2gJnDG2OxBgw5t3y4IwF8eMHs2f6GClszfNP/ug7vClT979az7FPdnL54Jh9XVftrnU+SRK3VNpH9RtyULfr5cunRr1qwNFhbe/fpN/+mnQ+DYxNUnKiouKOggphUy3AMjqg7DbB/Ye4ntHR2H+/gcj8S+bkOryzduwAfe3j7p3ftjV9cObds6OTiYNG+ur6urr6cHSe5Ly8pKX7x49vz53bS0+OTk+KSk1IwMePeqswEu0h88qH+r6NddPhYjymvfzn3gnCbPntxnQB8LawsDI4PqquryF+V5OXm5j3NTklMSriXAZOP7mHsDQwNFJ0eV9e/48ce9hgyJw/19gQ/bvXbtiZAQrzFjYPoRdh+3tLCARPLFeXmPUlPPHT4M8YrIwI33nHfo1q330KFKw//R6KPw6WDfYWjvoV3bdXV2cDY1MYVXKMgXlf8k/2bKzdDI0PM3zou1Wj/Nf5r885OcnMGjJEwjx8IugdiEBQs229qaQ/YNmBXs0MHe2toUNoHp6UGiQM23b99BJGFFRRXkNiwsfAbTjxCUAYGFiYnYUDJ7e9TKCI9uwm8/epfPeGENtnl19PSU8ukq6urqVQ8fwv/CrZWgtsBcgkYMmri3cc96mMVAEJ+IhNwEM0szvmJJC1gTlJQU7+ODiuNKTCxo1QplyBdfjD158rBICwMCJgcF7RUJEwvAmp4m6YmJE93c2K7DIy0KPnPG3dsbCUbCuiYggVhYfEp890mo5wcrEY1ztHFMP56On4JCCWZOUJMmzZt7cppgV5BdJ078CKdoCqqVrJzxFCIo0drGZvmCBZJpI3ErmCJ7qFz7o738vCRmQ3jD+ynsg1mF90i1DBlwcnUdM28eQ4FIUXBuPHPvhexaUWBfTvmSsffiwPKcnCKZeC9NTY1+/dyYG8TegYGKX86dC/OBzHUVLlDJZhFHf4ZdrhdOC38tOTB+ThSrZM66dVZt2khTZ0Nj42XBwdLsUeH6crB2+OzTz+Rf7boFMCmr6uvbFxJ5MO+UEwcGuST+/PlnG0upbkdPz8xkzo4MBXbq0qln355cKJCRyjsDzkUvJJM7BmAf8eZjx6S2m7ipmtqGP/80NkXtTeTOajmXHLwsWF2N/RIGc6shBJG5TIzAZcs4CR/jxIGBPa1atLhw9KilGWqJAmO/SIySvYGBvYtXLxZptQQAegOTgDR5a9K2c+fv9u+HXVlSUGzBpk09vbia0JaC/lLoYtLQSd7ujFcHOVKbP4KDo47qi5061a979471S1hdc/gH0MbO7uKxYw52dqx0FS5H+RyYh6fHqEmjhFstQS05MAlIk8Mmn4wYIQUfNvPbbycuWiSH5suPShC+uPOrnfKjj3BNpD+FCKk6fvppiXCtJK7l0IGBTo6tW9+IivL29JRYP3xD5XNgYPva7Wsd2jngScAgnz97/uzJMwySMHLOgM+ECd8fOcLRXCK83s3fuHHGqlVyToJs1bMxszm59aSiHMEMcfDp6ZzENgsaBYjFP3fuZ319XUGARpZz68BAueZGRlF//LFj/XpDA253IEGKRdgx1kg65K25oZHhgdMHYHcXW8XoJYwtnzKUBomdfr9yxdqB8a8cyPqx9eTJyUuXytA0+e+6jVWbmF0xdhZ28q/qew3LyyvFOgOlkXYNG+Z59epeMzOTRsoR0pxzBwZ9Q2jp7ClTUmJipk+YoKGhIUSbRlYpWRzHezZs7W0jrka4uLk0kp
y65hqaGnSQWB0bSnDh6OJyKCkJJvog2oKJOTA5GZaSokx7lpnQwiPEt69vfGi8Ankv0N/Y2PDGjf3Xru2DRSnYnsxjEcNbGxuzgwfXhYdvMTTUYyiWX5Q0HNj7XiGg47fNmzPi4hbNnAkhHvyqSFzS0sTk8/Hjzxw86ObC7FteYmW4aAj7jk9dPQUxHTq6jXrm4E1u0cpFN7NuDhw6kAs9SaasGNDW1f3vli2Hk5MHBQQ0ZisSHJXy64ULEOJoIsXwK1mRNqjHIMniBs1bmB9YcwBmDpsZNJOV8o3pt0ePTiEhqwoKzu3evaJv366NeWD41YAUHr/+ujwj4y84+pK/lnkJ+0wcGBVh3/GZixcjoqMhFa9kBzRraWq6d+vW38MDNpy5u7lB4D6mXwkwzDMpSKBDXRM4wXL3tt3HQo/lZefVFYq8sLazBo/lM8IHokLYx62xJogycYgcUOGAnMzMk7//HhkaWoje2v9vmuCAAL+pUzv16CFcOPNa5okm8Jk4Cs4VQEKTPSf2hEWHJWckY0yDvFNzAuZM/nSy9Ba9mBPEZ2d2diHk1IqIiI2LS4JFMr560QVqak3Bbw0d2nv48P7IzPSiheIQsnFg9XXLyc9PvHcv6d69+w8f5hUW5hUUPH/xAjJRwQeykOlBRsT/T4oIB7XAxrJ2DhDT8O/HpUMHSJ9YXw5H16y/nxmoCbQk30qGDJJJ8UmQbio/J/9l2cuqyir4JaVvqA/ZDlu0amHf1r6NUxtgCtIqMl9C+8AGOSToA/1kfCNDeh6npd2KiUlLSMjOyMh/9Ogl/FlVVMBXNgR96OrrQ2pEOOXZ0dm5a79+7bp0YTX9KC7dzL+fxXJgZib/2+cDh1X+fe3vhLQE8GTZhdml5aUvK19qa2nD8cpWplbtW7d3a+822GOwk62TuAY2Fs+cIMEKwQrZ7dtpiYn37917AAk78vKKi4tLKiurX72CDJpvwEtBNg1Iigg5eVu0aGZqatK6tYWDg7Wzs2P37h04nZAUrHIT2TswIcrJQ5UMv4DkwXzROhBBQjkieoTS04T597NkDky4krKsZU6QLI1h37f01sDY604SiQFigBggBlSYAXJgKjz4ZDoxQAwQA4rMADkwRR490p0YIAaIARVmgByYCg8+mU4MEAPEgCIzQA5MkUePdCcGiAFiQIUZIAemwoNPphMDxAAxoMgMkANT5NEj3YkBYoAYUGEGyIGp8OCT6cQAMUAMKDID5MAUefRId2KAGCAGVJgBcmAqPPhkOjFADBADiswAOTBFHj3SnRggBogBFWaAHJgKDz6ZTgwQA8SAIjNADkyRR490JwaIAWJAhRkgB6bCg0+mEwPEADGgyAyQA1Pk0SPdiQFigBhQYQbIganw4JPpxAAxQAwoMgPqiqw86U4MEAPEwAcMdOvQ7d2tdx8U0Y3yMkBvYMo7tmQZMUAMEANKzcD/AX05csVXPtckAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# compare with the next example\n", "rearrange(ims, \"b h w c -> h (b w) c\")" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAmZUlEQVR4Ae1dB1xURxNHKaICKhY6KIgNwR57wUawfyrWWDCxd7HXYO+iYmwRBUuMDXsXxYiiIIoaFMGCIogloNgB+eZu75bHPAREyj2Y90vW2f/O7s7+l7u53bdFLTkqKkr+//Xk5GT2PyFRCiZkjBAbxAZ9LuhTQJ8ClfwUFFajhxggBogBYoAYkCAD5MAk2GlksoCBsLthgphMdHVxRUjrmq0RUqVUFYSU1y6PkDpmdRAysPNAhOz22I2QhC8JCKEoMUAM5BAD5MByiFgqNhsY+PL5Cypl2ohpCLGvbo+QjSs3IiQkOAQhb+PeIkRcV3RkNNI5ffg0Qib8OgEhjawbIcTvvB9CKEoMEAPZwgA5sGyhkQrJBgY+ffyESunesjtCPNd7IuTr168IydvosyfPkAG92vRCyB7PPQihKDFADGSBAXJgWSCNsuQIAy6/uaByAy8HIkSK0aSkJGT2pMGTEBLgF4AQihIDxECGDJADy5AiUsgRBm4G3ETleu/yRkh+jSYk4Pdkc8bPya+NpXYRAznHADmwnOOWSk6PgZ2bd6aXXMDSxO78zo07BYwDai4x8N0MkAP7bsooQ7Yw4OdDSxvSI/Li2YvpJVMaMUAMqKmRA6O/grxhQLzYIW/sUNVaIx9HqqppZBcxoCoMkANTlZ7I33Z8eP8BNVD8HggpFPBoXGxcAWeAmk8MZMgAObAMKSKFbGCgaLGiqBQNDQ2EUFTIgF4JPWGUZGKAGBAzQA5MzAkh2c9AoUKFUKFGpkYIoaiQAVMLU2GUZGKAGBAzQA5MzAkhucFAk1ZNcqMaydbRtHVTydpOhhMDucQAObBcIpqqQQz0+a0PQgpy1La2LWq+XR07hFCUGCAGEAPkwBAhFM0lBuo0wEfldurZKZfqzutqxO//XFfhA4jFk655bTXVTwyoHAPkwFSuSwqsQas8VqG2165fGyFSjBYujD9lSzcuRQ1p0KwBQihKDBADGTKAP1oZZiAFYiCHGBCvVNx3fh+qq+/gvggRuwekkMtR8eKUv079hWzoNQgf74sUKEoMEAOZYYAcWGZYIp28YUC7qDaqeNmmZQjxue2DkMHjBiOkql1VhIgXqWtqaSIdQxNDhLTp0AYhyzcvR8iV8CsIoeUYiBCKEgPZxQA5sOxiksrJGwYqVauEKha/TzoXfA7p3Iu7h5CIzxEICYoMQojnEXyZi3gpilYRLZSLosQAMZBDDJADyyFiqVhigBggBoiBnGWAHFjO8kulEwPEADFADOQQA+TAcohYKpYYIAaIAWIgZxkgB5az/FLpxAAxQAwQAznEADmwHCKWis0sAx8/f+SqMTFRIAffD+ZIQIDs2rDDvoc54u29C+Q/9v7BEXf3xSBPd5/OkalTh4M8YM4ACD9/+QLhL7+0e/boUathrcLDn1709wfE0bHe4a1b7Xra7dhxfL6bGyAtWtgsHjnSuov19OnuPYYOBaRBA8sRbduaOpp27TqxVhvZEsQaNQw7WVnp2+s3bDjQwE52WEblyiXsS5cu2rCopWWn4lZWgFhYaDXU1lavp66vb69hZgaIublmfS0tnSY6xYo10rW2ZrlalCpVrnU5dfV6rBwouZ25eZWuVbS1G9q0aAE6rVrZDWrSxGGkQ+nS9h369wdkwICOvzs7D5k/xNy83ZiZMwGZNWvszlWrgI2qVbtt3yfbdbB//457QUE3Qm/Uq9fvwePHgDx+/ODr16+JSYndu0+GqPCZMmWNMArypk0HEHLu3DWExMT8hxCKEgN5wgCdCJ4ntP9QpdGR0UbqqU7CNS5kHBWVLCw0G5HIyCS4Ny7+Q7yuvAIo2cfndgmj6qf9T7etXQYwQFat8mg2znnen/Nm/dGeIc7Oo0Z5rO06sesBH9n3LOjUr9/0D/+LVp2sHkTuZYieXsnzb2Lhaz0pKYAhEF5PTq7Zu2Zyr+tCpPOEzl/
HBaolJxsXMgH8SLtGIxePrOaw6VNUcce+fQGJ1U5Yv3VRbJGw+Bdfdx6QfQX7+vtE3L/vY3YqLvbdu/fvAfGxtITQ2toHQvYEDxoEQr9+tyGctVS2ufj+/RAIFy0Kh3BvoSMQPnnyCEJv72cQQisgVHspC/z9YyFUIDJA7dEjgY4ciY19C/9yncSEBDn8mSPx8viLV684EiNHQu7f54gcUDt29ux1/+tnzhxlUQjXenhM2zJ/y5YUD9R/zJio0d1Hj+7HdSo2ahSVHNWoUcUSpUurfU588+5NpaDGvg/9mjWrWrONfbGn772Oeml+LjnSbcrKlXNtuztEX301dOwC++rtdIwqREdHqhmZgudr3Xp4clRnXiYIhoZtkqOihEiFCh0fXQ4UIgMGzPFcvEmI7Np1so+9sxABfkqppfpLFqaSTAxkyEDhDDVIId8w8PZtHLQlLl4Wsmft2kUgzP9zvhJQ693bAeSGAxtypEqVkiCXal6KIy1b2oIMIwOOjB8v8wSz18/myNat7iB7n/fmyNWr/4D88NlDjjB74CsyISERQoY/DJG5kI0b93vu2cOQGXIXBSMMNi4BsHPFihDa2w9h3gvk9bNlVW/YsI95L5DBe0EYGRnDvBfIBfl58/o1eC9gIOyRzB+Hh9/bt349eC+QF6xeDeHy5XMGNGjQYWwHkKs0bQphgwZWvzZtWrlrZZDZeBTGtVfPnNm4fyMg/4aGQvjq1QsIExITHj9O5c8A9PJK8bUQhadv3xlM4KGhYVsuM+F//3NByKlTVxCSnJyMEIoWWAbIgRWgrn/0SDawCH8qC9mzaNF0EGatn6UE1Hx9T4Psf1s2ycaed+9k44Skr0lKIOv/fpSPhM6evXo1KIiVsspF9oVla9vjp3btGNJTPi83bNjCgePGMeTkrl0ghIZG3A0LY8jXpGwwhhVFoZgB9mMiIeHLzUuX2F/L3iNHQM3LawNMqA5bOAzkWm1lvqd582oTOneu1acWyNMWLoTw0KHdMU+fXgy6CPKnz58hTOf58oUNSVNUDh68kBKRSz//PAoh27bJjKGHGAAGNIgFYiAbGZB99ynHUvAmBkqeOnWtTnI5VkVPW1sQ2rYdyc9/2rFyJSDgnJgChOScOBWqLLALtWNjX/sePszsXOzuPmbt9OHDexfT1f347h2ATbt0OXzzxNKls4wa19h/4gYgkdHR6kZGTJ+PuVk086G1tXnmlUkzfzNADix/92/Oti4xMenLx4+sDjZO6tNnRvwrxfDIuXFjSFqyZBs3ApZRgAxTQEk0hOKk5DvhQ3w8a1NgcDAIbm7z1dwUjazn6BgUcxPG/e91Cz33vQlo6IMHekZWLFm4nEeRIa1/qle3UlP80aWVTFhBYoAcWEHq7R9ua+ybN6yMg1u2gDB06IK7wZEMmdWvHwh//y2bgWTPl0+flCL9SwzIGHj+4gWE7M0rY6SVk1NAdNDevV6XwwJeB9wD8NGTJ9pGijEWuDRTUwOmycOSJXXJgXE2CrhADqyA/wGk13z+DuPU7t2gt3mzt9efCv80f/BgQDw8DvH8WZ4R4iWQUAAZePb8ObR65szR8fFvWfM7OzufCjkHayBXr1rz7voDGG99lP0S0i6A5FCTM2SAFnFkSFGBUwgJDIQ2v3jxX7/Ro1njXeXLzV1cVl66do0htBKswP1Z5GSDufeCSm7fvQuhq+tEr2XLDvgcAAc2Tr7EFMCjXl6Lty3OSUOobIkxQA5MYh2W7eZ+kL9vh2LXbd3KCp/cvTsIS5Z47jt6lCGf5S+64uM/sCiFxEBOM3DkyB5WRfXqFTft2AHy3bu3FgwdOnPdTJD9AgJYakRoKGzZZjKFBZABcmAFsNNlTb4XrlhM7zZxIkQ/f/4yad48xkV0hGxN4Nath1mUQmIg9xngQ3zZkg354+HhDm9V2XaO0TNmMBDmBuBcEpC5PkxlZ8uWD1Y+hSrOADkwFe+gbDbv7vXrrMSxs2Yx4cCmTSD4+AT
I3zSkVMdOkUiJk0QM5DwDfH8Fr6paNUsmww4zDt64cwfkjx8/BF++HBgim/GGk0pY6jEvrx3HZSM2egoCA+TApNfL4kvr4bgg1AyEHD1zhiFrp05lmqd9fRnCfrpeuCBzbCiXCiJ3Xsq+ubLwzJ3rBrlWuazieXfuPAHy0dVHIdTT1YXw3LngIkWLBmwP0NfXc+7ZE5AbN6JaOzk9OPxgxAinnevWAXLvXuzSfftizsT89dfCuxcvAhIR8XnPnTvv/d4HBu54cfs2II8efbwQG5twLeH58zNf5MPZgIAIv/fvX59/HR//z0v5l++BA74Hw8L8Pf0TEwNYOSNGTNro47N07NIvX67ulf+qMDQ06Tt+vJ213du3/zB7oPCyxsYQPnt2sqSeHgj8CQjYzmUmeHuvQMjcucMR0rVrS4SUKVMSIbkZtbQ0QdUVLVqEIWxDvTD18uULPLps/Xomb3J1XeixkOMgJHz5QmMyISH5SSYHlp96E7fl1pUrDPp9heK77KryhypSvX1bMaOI8HwQVVdXh1Z06iRzSL0cekE4ddQoCO3tf27o4NC+SXtn506P5YtTqla1OxQeXrda3YiIYx6rZK7OwMBoyZ49liaW69ZN7fO//wEC5ze26tatnH65Xr0cqshPtNLU1LKysSmmXaxOnapl4chBNbUiRbR1S5bUUNcwMNDX1NQExMTEXLtYMX09fR2dYmX09QFp0KCZWcWK9avXV1cvzMqZOXNpXXv7Sf0naWpqdO/QAXR8fUMmrFwZvDtYV7cYs6d7937HIiL2Ld1nbFz24dWroAPPbzNngj1161ZzGTaMIYXlTe7SpQUa0Mya9RtT4OH+/cu4zIQXLxRDGY6DY+YyE8Tl2NpWRDpZi/IJw8xkv3DhFFdjBzRDNOrx4/sR90EIlx9kDMLxHTuEh0HzLCTkAwbIgeWDTsRN8Ll0iUF/u7sz4fqtW1gpdfzZsxepAanGdHV0kOnHtiuGJv3kb/s6dWq+aPp0puN+8iQIHh5zSpUowRA2vgE3gwrJk6iubqoxFtjg5rZNXUOjW6tuIDObmzVrM3zevMAdgYAsV67WcztypKKZzKPs27wZQniYSwNBu4hiQCOH0wgKFSqEUHDMCBGP5G7d+hvphIenbLFgSStWjEc6zZrVRkjNmpURkk709u2gdFJ3HzzIUuHUR49DHuloUpJ0GSAHJt2+w5bDga0MgnPKmXBOfr8G1ksr/v79x7RgVcfE37Y7lD4bxknMeocWLZgwVn7efJ8+P6t6q75tHxpRgeKSJRsgNDMw45mAk8aOjqfWyUYn/3N0ZPjYJUtK6Mic9LpFi7hmjgpWVqao/AkTfkGIr6/Cv3J8zhzZioxMPuzqgG8pe584wZJgW8gJP4XMkNAbN17GvvxWRsIlxAA5MAl1Vtqmvnn7liVcOKT4zXvCx4chMPufdh4RWqSIlgiTADBYflA9GKonn5cDoZP8kFkQZsvPChG2gXk7uA9MCEpdtrCwRE1o3LglIDDtKcR/cXFZNk42WzioVy+GV6ldmwltmzcXakpIjov7D1nLTsdn4M1//+Wp7B0Y/6T4HDgAlwHxVBKky0Bh6ZpOljMGDsrnwUC+oJwzYVc4fhc/suN5VP4pooW97O/yw+zB8F+V66p5I3SUs4IcYcKKFTsQwu7xQqB0o61bt0/TeOdOzkJ88OzZlSwqATJpxAghLmk5LOwut198NAzfhn/xyBFyYJwoSQvkwKTaffzzCesJWRsCz5/PcmOqVCmf5by5lrFdq1asrmLKF11GBgYMcRo+PJNmiLcHwN3KKO/p0/4IkVC0Tp2GaVoLi0qEeIvOnWcMmgFIi4YKfUNzc6agU7y4UFNCckTEg3Ss5W+C7wcH+930E2rGvXoljJIsFQbIgUmlp7CdD5UzJJflJz9BMj9TA6tmIl6/fvVMaOWxCn+j06xjR2QKLH9HSOajMTF4JsrBYSTK3qWLC0KuXr2DEBWJ2tjUyKQlXVt2BU0
NDYVjA5emraUNSG35rTeZLESl1IRHUokNg3uuOSi8WBVAeE8Gd47zVBKkwgA5MKn0lMLOmJcvmfSv8jSdx0+z4aUOrM1DRIjXCyCF3I/WraH4am6pXKCRazYcOnQB1dWgwQCE1K2LFynADdFIBy5GRki2R7W1sS8vVap0mrXoFNMR4k07dHBs7AiIQdmyQlxC8vv379Kxll+ICjr85A6mDw7sTriK/iJJp0WURA5MYn8D/AioJ8rribOlAYaG+DuuZ8+22VJyNhZSTbncoFrduqzYQvJdvcIq8hC5fv0uqn348EUIMTJqixBHx9EI2bTpAEJgjIgQaHLmEdgQzSlKJ1elmjXt69qDJrt/mWeRkCDe7Cw0np18L0S4/DQ8PCI6gkdJkAoD5MCk0lMKO/l467n8iIecs37hQjyNllcLPcTLMYwsLHKu4TlackJCIir/5MnLCIFb1hBiYuKAkLZt8cqLbduOIJ23b98zRF8f/zpBmgq1cuVsrW2FSSXLlBFGVV/+8CG9Edh/cXGoCexeaQCfP3ny7OUzlEpR1WeAHJjq91EqC7kDi4mMTJWQ3ZHy5Y1RkZ6erggRb8NCCtkStagkWyxXkJ+kpK+o+WfOXEWIs/PvCDEwaM0QLa0iTOjefTLSOXDAR4jAsVVwrDNH4KwQLktCgNMR07ETTRuC5sv/FK8/X0RGPn/1PJ28lKSaDJADU81++aZV/E7kH1my8c3S000QvyebPXtwujmyJ1GnZMnsKaiAlfLpk8IVwdFWrOn7959DHHTrNkmIwGFXZmbtOFJcT09TQ3POnA0cYYJ42QtSyKuo2EWlb8n7DwqH9/E9bOVXDFjTz0KpKsUAOTCV6o6Mjfkgv5oL9D4pP3sZ58kxjd9/H4rKXrsW/8aHs/6QzvdGdVKfWvu92Umfj8AyQ8XLl7FcrbgunMKoO3fuZo4wwcKiPUJGjVqCEPF2BaSgCtFPsrueZQ98mj5+/shkCiXEwI9+uUioqfnDVH7pSWJCggq2aNSonsiqo0dXI8TIqAxC0o/+yBL59Eum1PQZAOZL6JYQ6winGVnqunV7kFrlyl0RooI3zPEt/3BmzZeElLlTZDlFVZYBcmAq2zUZGFZEWzEvlIFeXif//HMjZEJoqDdCxKfkwYnsXIfdB82jJOQaAzBN/b2Tctw24UiOgYMG4Xeo/fvP5vpMADeCkNyJgqvW0tTKnbqolmxkgBxYNpKZG0UVU+7Yle64BKalEFPic8pDQlJ2ULG3fSNH9kC5VOTMeGRVfoq+f/v2Xbrr+n6wsdu3H0MltG07EiFxcfEIycaolvJwMm24dky51CUby6eicpoBcmA5zXA2l19UOfASLy7P5prytLiKFc14/fHy1c/u7lM4woSnT48jZPXqVEsSILVRoxpIJ3dWTqJKJRoF5nN5aYOv73XEVe/e0xGS5UEhKgeiWvLb2kCAT1PxosXFCoSoOAPkwFS8g7B57DpEQA1MTXFaPo0Lt2wL3Y94X9qYMb0QB35+HggRuz03t4lIp3Hjguj2wDGg45QiQkPzfGmDeJ/c0qWeqL+yHC1eTDEZYGBmVq5UuSyXQxnzigFyYHnFfBbrtVD6LX70ahYLkk62t8rNOmAyu175v5iYLJtvYoK/p8aO7Y1Ku3QJu73IyBNIRzwibN26PtIRvslDSSoYjXzwIPRxqNAwWFwujKqIPHPmH8iSLN8nrq/coQGfJrjVGhVLUdVngByY6vdRKgvLmynm1ipUrZoqoQBEngcHa8gXd9wNCmLNTY6KQu3OIcTYuCwqGd7JIeTMmT8Q8vLlOYTs2rUAIU5OrRECr2MQAm0EJCkxSdjYNHWECixX5pF/r127/+T+9+bi5We7Pd8qOTExCdUFC/0RkslW8BFY+SpVLIwseI0kSIUBcmBS6SmFnTaVKzOpap06EjP9h8199ORJEe0iUMw/R4/+cGG5UUCJEqlOy4Uqe/f+GVW8Z88ShMTEnEXI5s2zAHn/LmU8VKN
GJaTz49Gz+/ZdDLr44+Xkfgnio5a/dxeaTb16lS0Un6zct59qzDID5MCyTF3eZOSTHubKA5YqWVoyU/L3sg5o46kLF9gUIlxIyNnX0ZU5iWx8sc9LzitBvErzt9+6gDGvXrziJt24sYvLTNi5cwFCxO8IhQof4vHqPr8TJ45fOi7UkYosPmEy/Rvd1DU0YIO2sHWVa9UyKWciREiWBAPkwCTRTWkYyZczNKpXjyU3aNMmDb18BP3hqXh7H6O8QSb+3buS+iWhifw26nzUXNyU0DspL6h473OlPn3w2O7ixT95KhNgcpIju9zcLgdf5lEQvnz69DTmqRCRrhwcLJsL/dZTs0mTejaKTw3TKSrZOzy/1cYCgqdsFy0gDc5/zeyo9FtNRdc8Zr6x/H7nzGfJfc0Xomtzh02ZYmBsEBkRuXnePGZPfhqKIYYvX7isNh5h6UVtbSuqRSsU5DcOlxk6tBvP4LV8eYStP4/mM+HRo6h0WgQXoha/G5uOAiVJhQEagUmlp75pZ7tWrVgav6dYV0c2qwaPlnLTGIumE165chulWlt3Qci8efgXfUSE8gsSqeZWdJe3t6mFKdQWeuMGq/M3FxcmeC1bxgR+ZQaLSje8GXATGR9+LxwhLCr+OTJn4MCzV8/WrJnymgc2KR/3k+SEYZpNRiDswOaI+GrWVt26sZvPuA4JEmWAHJhEOy7FbO0iinkhvVKlGNpDORRr4+SUopeudPt2GEoPD3+KkNmz1yOkQoWOCLG3H4IQ8a3E2XuWuaW14hUgq9dj924mrJ02jQn1HB2ZAO94mHDRP5+MPFxdXKFF4kHninHjHkc9Zo1l4aVjx3pN6xUW9kQI5r6clJRqISUYcNrXF5kxYOxYhLiuWIGQ74o2/PnnEjolhFngPrmmtZoKEZIlygA5MIl2XHpmuwwbxpJ7jhrFBPErE5Q/aztpxF+dFy5cRyXDrcQIMTZ2QEjz5oMRsmaNwg9xXOxQWVIV2ypcRyh8VX5XBoeEMHxcx45jlo4BuXnXrgxZP2vWnQd3QG7apQtDkhITmeCt9HYsCiG/B4AjeSII74E7d/wc2DCiTZtTV04xY6aNkLnt3WvXNhzY8MQJP5D5APT1m9dubruY2o+Hb96+RYXsP3YMIQPHjUOIgZ0dQhx690aI1969CAm6fTtR2S88yfc09nyThkziqWwBy4COAwDpMWJEh6YdeBIT4JoYhFBUigyQA5Nir2Vgc1Vra6Zh89NPTPifchTSuF27NDPfuoVHYGmqZQsonuC6eFGxr4uXP3asYgKQI+IpzfLlZd9KtX6qxXX69p3BZSYsW+bFEXBpe8/uPXbsEkf+nD/ftocttP3StWsMbG9hYeNk8+pVXNdff2XIygkTJq+e/PHj59LVqjEk8Pz53ad2g/Nu368fQ2AL8L8P/oVrJ/lYAQ4gfhX3Cm7k2qtcMPnuzZuExITXr9/cunuX5XoYEhIXH/fgQSS/6v7c/v0hD0P8/IK5e5jSo4fnUc9du07ysUsXa+s2I9pMmbLmnXKX8bVz5xxHO9aq1ScyOtpzvScr/Pnr5+3ajblz717zqs0ZAiHceeV75crGlRs5AsK6rVtDghU+nuHO48fH/Rcn1LFt2ZIbwPDSNjbCZf0Adh88+Pmz58Jcnnv2XPdP9YPmdWzsod2HhDogL5qOf+I4tXQS6oQ+eFC1VFUhAnJvB+z5dm7eCXi1unUhtLQ0hdD/oj+ETdq3H9R5EAj05D8GNPJfk6hFYgaWz57NwJELFqjdkQ0y+JmK8IYAPMqdOw/EuVQcYW/g2DswZip80YNgYWkR8TCCIZMnr0at6NABz1DVqNGL67yMinqpFlW2bCuO7Fy1CuRlXikOdWjLloAIj+jrLLq5uJHyjCJeTnPloQ8ccbKx4TITJouuTD67F3zuXkjtq6bwzbBWEN5mnb2qJjxRCbzpzZuhZqKtgeB4IK+dTX1eUYtu3bjMhFEzZijLVqRs+/tv+E+oBo7QTMN
MiIA/s9ZV/E7ieG3T2lxmQseGHREyvPdwhKxdtBYhfuf9hMiDiAg0AkNRofKI+fNXDxldr57s10b823iWJH/jdQNkcLGGhY2E+iRLmoHCkraejM8kAxXMzZlm5Zo1QXj6NGaG8k1Dz9GjAcnRM79Z1bkWtu3UNtfqoopygQGxu3r45Amrt3X37iBoaWlyMxo6OEwdOLV58zocAYFPod8PuS/ESZY6A+TApN6DWbEf3nhNVb4ek43J5E/LJk2YYGpllZVCVSaPU38nlbGFDMlOBvT09VlxMKnIhCnu7hXNKnbs2ExYTf8O/cVnoDAFcmBCovKBTA4sH3TidzcB1hyyIy0gJ9/C+dcff7CClu3fzwRzExMmcB0WVfGweq3qyMIGzRoghKKqz0BhdXVk5KK//mJIaHg4E/QNDNynuE+ZMkCoqaH+zTcjYXfDhJokS50BcmBS78Gs2J/mko1yZcqwsirVqMEEH+V6sLleXgwxMTRkgnhvDcNVM5zoOlE1DSOr0mFg7NKlKLVB27b92vdzdu7ER2Cg4NDQoV49G6T5rSiNwL7FjERxcmAS7bgfMjuTi+atypdn1bRUrjs/rxyccZdWUamjyi6tUYtGiK/u/WTvTuhRHQaG/v47MuaXCRNsK9rKzhMRPOunrV+zZpLQgQkSMxbJgWXMkaQ0yIFJqruyZKz4qNPQUMUive8tz7pCBZbFsW9fJlxT7pdavGcPQxxatGCCKk88zl87nxnJw4pVUn1RcpyEH2dA/ONmzJIlqNghc+aYG5pXq2YpxA+vOnz6tGJmm+Fwb7KOTrEsO7DY17HC8kF+/fI1QigqIQbIgUmos7Joanz8B5QzGy/jKFWiBCscjudhwomdO5mw9fJlJqxbuJAJZsrl5nq6ugzJq1CvhB6qevux7QgxNjNGCEUzwwA/EYYrrzp8mMtMGDB5spWpVZcuLYS472bYpbZNiJQ3Lm9oWFqIMFl8KmacaGO1OFeaCI3J0qRFKiA5MKn0VNbt1NfHX9bXrnmh4vz9PRECbxoQUrx4UYSkGeVLlq2Vxy6MGDiQae4ODmZCiPIAIZgmYshg5ZCOv7rX1ExZG810cjSE3WOo/KNXjiLEro4dQijKp5c5FfvkR5+YmytelwIOW4k7NusIl3lyHRACdwR6e68QIuCu9PSKC5HMy3xZR/pZNAUL7pkm2o6dfnZKVTUGyIGpWo/kjT316+OVex4ec5Ap0dGnEfLnn7MQ0qxZbYRwfwa4tnJ7L18MMl55zN0m5dm7f9+6xUoI8/NjQpsePZgwYehQJvAy+cIThmcYtqnfBumks2LN0CTlK5jlOnLlCMouXh5StFim3DwqRzWjcO0IMmyjjw8gwkOBYc2qURmjjRtncM3Shobb520PCzvIERBgMhBd5llSt6RQgctlS+Mh1299+vBUJpzctQshdezsxCPmCbMVP4+4ckBEAJeZ0Lp9a4RQVEIMkAOTUGflsaniixZ//bULssnXdzNCHj/G45jly8cjnRYt6nDEUn5ik6amhoWpKQMXK0+FWDFH4VO9799nSc+CgpgwaPp0JhxR3hlmqNy7PVR54BMoeM3zgpBv6wY54lgEy8jDucPncpkJdtaKgZd4UCj+irz66CrKPmb6GISYmJsgJKejbFpPXT3l895N/mugTp2qvGpP+RnHs2cP5siWf/6pZllNeHlmXXv7dVPXCRFQDjsYNmRI1yJaWjzjL+1+gc3FLRo14ggTXCdORMg/Bw8iJPrmTbPyZkJw8/LlziOdhQi8Z91zbo8Q0dDQuPb4mhABWfzzopxhOaRDUUkzkPIHLelmkPGZYaBGjbpILSoqOacRPpXE63Jx+YVVypHz5zch5PXr8wjhrhFysXdpq1dPgq8tUAOEbceGr9EO8tvRADkWIfNMjo6NN8jXC5QrZ7hffgihnZ31Q+Vp9NM3bDAuawxvWZ4EBrLqug4ZMuu3WTCRdVjpCOErO3h3cJEiWvMmT2Y6xuXLPzn+BEa
B/eTHQAAI7mHL7C0aGuqwGbxMuTKAaGppjewxslgx7cpWVlMXTAUEbgGGE9BhOhdGjeyrFm7QtjCysLQ0getvTgScAJ1yJiZwbvpPP9lAu1xXuQJi17ChemF12KgL+/a69u0KSIcBA2Ahw5gxveA9Yp2GMsc/fvlyw9KGsDbP2MDAyNQIkO0BAXWr1QVW4bZuXT3dDRt2n3v5Ehagv3lzEUYqUDjwA213m+gWGLgDzskExwxI9fr17x245+o6bNro0aVKlwIEivp3778w3vJcvdq8gjlDRjiNAPzqsWO169dmCNgDSFxoKKztZAhE4Tm/b9+U+VOEyOwJEzbv2yxEmvz0U+CTQCECLYUfAUIEilrgvgAhTVo2QQgsFXFz2yarmJ4Cw4Ds808PMaBqDMhGe+9SGSWbnIxOhcCXOEJkE1mpdY4fX8OR8lWqqD19Hhy8myOyUUiQmmxqNNqI5ZuxcSMg8EXPEdmkWZDap09XOHLk0SOzILWvXwM5cv6//2oHqQ1yrckR/8+fAXH/exBHriUkAKIG/0NdhWQN8Y2LS0HA70SrnYiMRAgshEGI67ZtR8Zsg3JWT10ZLc/1i4vLSnsXQEZ3H88QONA2YHsAIKGXLjEE3ix6zfVSKx4UePIkQ8CAsb3HqqkFHdiyhSOVLSoDsnDatNFrpnGWQLO/k1ObMU5C5KdatY76HxUicK3PGq8UtmUthCpmjBXqANK+W3uEyKb+UveaPCsFxEDGDNAILGOOSIMYIAaIAWJABRkgB6aCnUImEQPEADFADGTMADmwjDkiDWKAGCAGiAEVZIAcmAp2CplEDBADxAAxkDED5MAy5og0iAFigBggBlSQAXJgKtgpZBIxQAwQA8RAxgyQA8uYI9IgBogBYoAYUEEGyIGpYKeQScQAMUAMEAMZM0AOLGOOSIMYIAaIAWJABRkgB6aCnUImEQPEADFADGTMADmwjDkiDWKAGCAGiAEVZIAcmAp2CplEDBADxAAxkDED5MAy5og0iAFigBggBlSQAXJgKtgpZBIxQAwQA8RAxgyQA8uYI9IgBogBYoAYUEEGyIGpYKeQScQAMUAMEAMZM0AOLGOO8pkG3NWLWpR8Hd/LnF8RuLkYtZ2ixAAxIF0GyIFJt+/IcmKAGCAGCjQD/wfn2aSnEzDdAQAAAABJRU5ErkJggg==", "text/plain": [] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# order of axes in composition is different\n", "# rule is just as for digits in the number: leftmost digit is the most significant,\n", "# while neighboring numbers differ in the rightmost axis.\n", "\n", "# you can also think of this as lexicographic sort\n", "rearrange(ims, \"b h w c -> h (w b) c\")" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAgSElEQVR4Ae1dB1RUR/c30ruCSi8iiA1E0SjYo4JoACv2WL6osRuxJUaMsUWNnxgxJlGJBYkFJSqCGiwQsSICKkWw0IuKCFJU0O/mz/9wOAu73H2893Yfez17PG9nfnPn3t889r43c+fOJx9zcprRP/EMxBobi6+kmmbGzXKJBQkMGMfS/SOBnma5RI8kepoZG8dKrFf0yuaKTgDZTwwQA8QAMSBMBpSFqTZprdAMpCalBu0L+ufSP1npWRXlFQatDey72w/zGjZmyhgVVRWFpoaMJwYUiYFPaApR8nDTFKJkfnieQnz39t3ar9ce/u3whw8f6ipmamHqd8Cvz6A+datkVUJTiJKZpylEyfzQFKJkfmgKUTI/VCtHDMDL1tjPxh7cc7Be7wWKZmdkTxg64fjB43KkNKlCDBADnDFADowzakkw2wz4fOkTcz1GstSqqqrls5bfib4jGUa1xAAx0AQYIAfWBAZRIUyIuxMXEhSCMfX9+/cwzYhBEoYYIAYEzQA5MEEPnwIpf2TvEby14O0e3HuAxxOSGCAGhMgAOTAhjpoi6hx9OVoqs6MioqTCE5gYIAYExwA5MMENmYIqDAEaUlme9SxLKjyBiQFiQHAMkAMT3JAposJlpWWwsiWV5UWviqTCE5gYIAYExwA5MMENmSIqrKGpoaws3aZ7XT1dRWSKbCYGFIkBcmCKNNqCtfWTTz4xNpMua56ZpZlgzSXFiQFiAMUAOTAUTQSSOQN9B/eVSod+Q/pJhScwMUAMCI4BcmCCGzIFVXjSl5PwlkNqRAcnBzyekMQAMSBEBsiBCXHUFFFnp95OnuM9MZbDatm6Hetg1hEDJgwxQAwIlwFyYMIdO4XTfEfAju69uks2u3nz5lt/29q7f2/JMKolBoiBJsAAObAmMIiKYgLEIgZfCZ48azJ4qXpthkCPPy/8OWHmhHprqZAYIAaaGAN0nEoDA0rHqUgmiOfjVKqVeZT4CDJLXbt8LTs9u7y8/N/zwLrZu3m5jZ06VlVNVbLCPNfScSqSCafjVCTzQ8epSOaHHJhkfpqRA5NMkEwcmGSV5KqWHJjk4SAHJpkfcmCS+al/KkZyG6olBogBYoAYIAZkzgA5MJkPASlADBADxAAxwIQBcmBMWKM2xAAxQAwQAzJngByYzIeAFCAGiAFigBhgwoB0CVKZ9CC+zdt3727FxkbdvJmQlJT65ElOfv6b0lIo1FBX19TQaKWv39bCwtrCoqejo0uPHu2srMRLapo12U+f3r16NTEmJiM1Nevx45KioorS0g8fPmhoaWnq6BhZWFjY2to6ODgNGNDe0VFcZHnTpIasqsNA+dvyR+mP0nPTswqysvKzsp9nw//PXz2HcviUVZSVV5S/q3ynoaahqa6pr6tvaWwJH0c7R2cHZ3sbe2UlWf4U1LGGk4L8/JyMjKdZWenZ2RnwP3xevnxeDtxUQCjrv//D5+PHjxrAkLpGq1ZtzMws4RfI0bGnk5OzlVU7TnSSY6FpaZk3b95PTHySnPwsN/dFfn5hUVFJRcXbt2/fq6goa2mpa2nBT5GGtrYm3Ent2plVf7p0sTE01OfNLBlEIcItcuHq1cCTJ09fuAAeC2mqbdu23p6e07y94QLZhBUY/1GIOc+ehR0+HBYYmP7oEdIEPQODod7eHtOmdenVC9mELRhbUYi5WblO5k5saVUt5+yNs5C/g12Z0krjLgqxpKzkRsKNWw9u3Uu+F/8o/lnuM3i4kVa9aryetp7XQC/vod7uLu48PwlxF4UIbDx6lBgffychITYxMT4p6X5xcREzfqCVjU2HkSMnTpw409iY1yTRPEchwo9zZGTssWMXQ0P/gecgZnRZWZn079/dzc3Z3d2lZUtuD4Xg1YHBLXU4OHjL7t1JqanMqIG/rjEjRqxdurSznR0zCdK24tOBPUlM3L9x48Vjxz5UVUmrZzX
esW/f2b6+vYYOZdacQStyYJJJY92BFZUU+R/zP3ft3J2Hd6o+MLxPxOlsY26zbOqymV4zVZRVxGHYLWfdgYGXOnhwz/XrV+/evfHmTQm72qqoqIIPW7ZsHbyfsStZnDTeHNj795UBAad37DiSkpIuThlpy1VVVYYP7/PVV2NdXXtzlNqNPwcGs4Xzv/32bkKCtCzUxauoqPjMmbPWx0ddTa1uLbsl/Diw8tLSX319g3buZOy6als9wNNzpb+/obl57UKOrsmBSSaWdQcWkxjTc2pPyZ02srZzu86/rPqlf/f+jZSDac66A4uPj3F355afli0NNm3y9/LiI+ELPw4sIuLWggVbWHRdIkMfELB2xgxUIlORhg1+5SOIA168Nvj59fHyYsV7gUlwOO+P/v79Ro7Mys1t0EL5ByTHxo63tw/8739Z8V5gb+SZM2M7d44IDpZ/20lDOWTg4eOHA2cPXL17NeM5STk0ikWVXr16OXfuxK1b17AoU1aiYIhXrdrl6jqfO+8FptnaWnBkIOcOrKy8HNZm1mzdWsV0Wkyc5THx8T3d3VMePxYHEET5+aCgGX36QLwGu9qWlZSsHDdu9+rV7IolaQrCAKyFbArY5LHEAwJAFMRkac3089uwefO30raSK3xlZdWkSau3bDkAw82pYl26cBUCw60De/X69RBv77BLlzhiJ6+gYPC4cU8zMjiSz7XYv/bvXzN16ruKCo46Cti0aevChRwJJ7FNnoGw6DDPrz3Jh4kb6F27Np84cUhcrfyXz5mzEeI1uNbTzMywRQsdjnrh0IFBvKXntGk37t7lSPVqsdl5eV4zZkAALKe9cCH8wtGjG2bN4nqW5pi//84VK7jQn2QqAgMRtyKmfDdFESxlZuN33y3Mzc1i1la2rfbuDYGoDR504O71C5Tn0IFNXbjw2u3bPBB0Pylpia8vDx2x2AXs7lo3cybXb+7VCh/ati30kICfE1mknUQxYODU5VM/HviRQUNFaFJSUrxu3TLBWVpQUOjj819+1BakA9v9xx/BoaH8EAS9/B4YGH3nDm/dNbKjsjdvVowdCxtMGykH33zjnDnpKSl4PCGJgdoMfLf7u3sp92qX0HUNA2fPHk9KSqj5KoiLLVsOlpSU8aMqbG3mriNO3sCS09KWr1/PndL1Sl64ejU/LzT19i5Vod+yZbnprG22wHQNy2zwwsf1dCVGE8IIkQHYcDZ7w2zWt50JkYq6OsPPTkCAf91yuS15+/bdH3+c4U094b2BLV6zhv9FqXsPHpyLiOBtVBh3lHT37qnff2fcnHHD+OvXz9FEImP6FL4h7D8LDAtUeBrqJ+D06aOQjKr+OvkrvXz5zqtXxfzoBaknOnWy5q4vZdZFh/7998XISLxYXR2dMcOHD4HcI/b2Bi1bttDVfV1S8qKw8EFyMoQvhoSHFxVjud62Z8/nPCahwNtYG7lr1Sqp3hTVNDT6ff656/jxVh06tDE1VVFVfZ6Tk5+ZGXX2LOTsgOvawiVf/75u3bBJk0CCZJhMao3NjHM+omwpfFHYpXUXmSgprE53+OywtbA11DfU19PX/jdlnbaqimpxaXFqRmpUbBR4o4TUBKksgsD6KcOnKDVXkqqV3IJ/+MGvXTs7IyMTff3WGv/+04QpihcvCjIzn0VEhJ46FZSXl41UHlJ+QO6PwYOHI/GyhV29KkVgnb6+7oQJbv36dYO9XBBPCCkQNTTUgaiKinclJaU5Oc+zsgru30+Li0uJiop98aJIxDRra1MNDQ7TTbCfiaPHsGHIDctKSkrL585dMX9+Sz09EbNrvkIg/sadO3f8/jty+iv1+nUbVtP+spuJI+HGjRkuLjXWNXjh7Oa2NiCgtYlJvciqykrwSQGbN+N3QPvu3+81c2a90pgVspWJA9873oE1yVyI+EwcuRdzjQyMJBB78tLJeT/OKygskIARqTr106lRg0aJFDbmqwwzccTF5bZpI5YfiM5YvXpBcPBhpHX/+c+
i9et3IsF4GBeZOIYPXxQeHo3RAdJn/Pzzcnj2wYDhuTw2Nvn06at//XUVXFp1k5EjB4aEbMc0Z4ZpzqyZuFaXr11Dei8dbe1zhw9v/vZbCd4LeoHan3x9g/fuRWaNOvrXX+J0k4dyCGrHqzF12TL/8+fFeS+Qo6SsPHf9er+zZ5srYR+Kg/fswStAyKbNwJjBY2ICYyD/Id7MgNMBeLCgkTo6un5+B/r3H4q04v79WCRS5rDsbNQji6fnAEgBhfReYBRkO3Ry6vjDD3MTEo6lpZ3evv1rmFZzdLTj1F6WHdiuANT9DaYG+vu7DRyItG2Uu/vuzZsxYJhyxMBkgnn98uUldHqnwWPGLN66FaNnH3f3xVu2YJCAgfD9lHsUToZkq+nDzA3NL+y+ANnokaaGR4fDES1IsNBhsH6zZcuvyCy0kPleKPaWlpZjVJ00aRgGVi8GjlZZunRKZOTetWtn1wtgq5BNB/a6uDj88mWMZrMmT/Z0dcUgazAzJ0xwHTCg5qu4i7iHD0ENcbWyLb96+vT7d+8wOujq68NcH/IvBwRO8fHp0L07RjJgLp86hUQSTBEYsDa13rZkG9JSCES8ePMiEtwEYJaW1n36fIYxpKioENbPMEiZY9TUVDE6wHlgGJhsMWw6sL/On4fjKBu0R01V9XsfnwZhdQHL582rWyhSAktl/OyeFukX8xXmhjEwwPxn9Wpt8euC9QqZhd7KDdEf9UqgQoVlYIbnjPaW7ZHmK5QDA06GDBmBZCY1NQmJlC0Mmdhp+/bAp0+xYSyysohNB4YMPhw+eLCxoSEDgwc6O2traTXYELkI16AcdgHgWWOuXMHI1NTWHjd3LgZZGzPQywvOaK5dIu76UXx80YsX4mqpXAEZgOOYV89cjTQ8Og61/o+UJv8wOI4ZqWR6+mMkUrawDh2sMApAqL2z8/SLF29iwLLCsOnArsfEYMyABS0MrC5GWVkZQu3rlouUJKIPMhZpyOnXJw8fQgIOTBf9PTwgdB6DFMGADxMpEfcVVsLEVVG5YjIw+rPR6qrqGNufZD+Bw6AxyKaB6dy5K9IQCFxEImUL69WrC1KB/PxCN7f5I0f63Lr1ANmEZxhrDiz/+fNnmag50x5dsTdEXS4MW7euWyhSwvi4ZxE57H59iM509dmYMcy6hu1iyIbkwJBEKQ4Mdom590E9WUK09IM0Of0542K81NU14ARLjOTSUtQTKkYUpxgIL4T4FHwXEBnfu/e0Hj2m/Ppr8MuXr/ENeUAqs9UHpI9CiuqEiMVAiqoXBvnp6y2XbWFGaipSgU49eiCRIrD2jo4iJeK+ZqIHS5wEKm96DAzqMSjkSgjGrvTcdGcH7MQaRqCcY4yMTOEQywaVhO3MDWLkAWBkZDB+vOuff56XSpm7dyGJUNKiRdsGD/501KhBXl4DDQ31pZLABVgKPyy5e+Trl2QhrNQWFhXBkc2siGJRSB4u+SHEbhhbWjLrV79NmxatWmHa5gn2BDWMdYRhxoC9bcPz89WSs5/L+9o+MwbEtdLXR72BlZUJ4w0MzNy0aT4ylEOEk/fvK8+fvw4HiZmaurm6zjtw4GxxcakIhs+vTdCBwRTH88JCPknE9JWflYWBWbbHBoPVK83cxqbecpHCApwyIq3oa9NmwMHWAWlg3gt5nORAKs8ApqqqhmkloHSIVlYmBw+uw2/UqWt+VdWHv/++NWPG94aGQ8aOXXHq1GXIEVwXxnUJaw4Mcj5xrStefmlZGR7MDxIZwaHdokVj9NHS1cU0Ly+V5UMTRkPC8M+Avq6+irIKpt/ScsW6f9TUUOEtUuU4xfDMKQZWwnx9ZzW+C0iKePLkpTFjlpubD1+79leI+2i8TLwE1hxYGY+nWzVoXoX8HdBcgfOp2jgPJI4BLR0dcVW1y5HK1G5C14rAgI4m6v4pf4tK5dBkGEO+gQnO3u+/n7Nr1wolJXa8wPPnr374Ya+l5YgFC7bwl+2eLdL5Pz9FguaY/dQSmnN
RVYlblmMWQF+jMLI5MiFIjVi6UBAG9HT0MJa+ey+DySKMYoSRloEFC8aHhu40NkatnWOEw0Ti7t3H7exG83PkGDu+F2OYgmPU1FGzEI08phk5UYn0cwo+ZApoPnISDI5lUUBymqrJw4a5pKSEQOpCFRVltmyEt7GZM9d98YXvu3fcxtOx5sA0Ge29ZYsvETmq8nfkFdJnID2QiL01X0txeSDhSJ+aJnRBDNQw8AYXR6eGC2qoEUsXcs6Ajo4mJI9PTAyeP98bn36+QaMOHz7n6jq/qIjD3QWsOTAN3BtGgzazAlBVQa1Fs9IXUggyt2FJURFSYL0wZHOkMvV2QYVNmAFkdIaWhlYTJkFhTbOxMff3X5mZGbZz53IXl66NiVGs4TAy8u7Eid8i3+xrWuEvWHNgrfT18b1yjdTS1OS6C2nlG5qZYZrg9zvXlQZ3SXpKSt3yuiWG5uZ1C4VSwsrflVCM5VNPSBCFjM5o07INn4pRX3wyAPvDFi2aEB0dAJ7Mz29Znz6N9WSwb2zr1oMcmcCaA7PE/UBzZIaIWP3GBaOLSGPlKzLTbnFhYWF+PrMesx4/RsbHI5VhpgbXreAgb667UEz5Kc9QTz9ATht9cmBN/x4xNW2zePHEa9cCsrLC4c1syJBejBfJvvvul5ozmtkljrVVOyv0Q31efDwmpSG7dspcWtuOHZE6JMXGwhmVSHBt2MPbt2t/lXBt1aGDhFo5r1JGLzVXVVbJuS1ypd6jjEdIfSyNGSaLQconmFwxYGLSGtbG4PP69ZuwsGshIVdCQ/8pL3+LV7Kysgoi7E+c2IJvgkSy9gbW2c4O2eVThcxj1NHJCcnPP6GhSKQILAJ93HPnnj1F2groq5o6Ki0CWFT6RrH22zZyEKNio5AS7Cyxf+xIgQQTBAN6etoTJw47fnxLfn7E3r1runZtj1cbMgJzsTmMNQcGs3btra0x9ly4ehUDa2IYi/btkaETcOAkgzXPspKS6PBwDGlKysp23bphkPKJgSlEbR1tjG4vCl5gYISpZiDsWhiGCtjsbNrGFIMkTFNlAKIWv/xy5L17QUeObETmVIQkilwcLcaaA4OhcsE91/9y8GAJ7mSspjT8EHrQe+hQjEX5mZn4s5trBAb5+b3D5R9x7NtXA3EuaI1kObxood8Co1XKgxQMjDDAwPX465n5mRgqenYW8Os7xkDCIBmA37RJk4ZFRe3T0EBNisTHY+eokQoAjE0H5oH7gS548eKrlSsZvGTgrZJPZD8PD6Rie9evRyKrYXDC8qGffkI2gQMzkUi5hRmaGGJ0u371OgZGGGBg4/6NSB5curogkQSTTwbgdHgWFbO3t5kzZwxG4NOnORiYVBg2HdjwwYN1tFFzO0EhIV/6+MjhoSdScSctGDyHKm63XMq9e4e2bUPKh9tx7fTpyC3McJDdYKYHZiL14QFmZmmG6SXuTlxachoGqeCYiFsRYdGo+UMgCo4NU3C6hG7+jRv3bW1Hrl+/Lz09lxVbHB3tMHKKi99gYFJh2HRg6mpq3uin+4CjR3u6u0fdvCmVuiLgqqqqi5GR0xYvXrd9u0iVHH7Vbdly6LhxSMV2ffMNZk0LXmS3L1ly7dw5pFjnYcMYnzeG7IIHmLWtNbKXdT7rkEiFhT3LeTbhmwlI8/W09fp164cEE0w+Gbh/PzUtLdPXd0/bth6DBs2Gc5YbmUI+NTVDVpay6cDABp+vvsLvM41PTBwwenS/kSNDwsOlSmb/urj45Llz05csMXRwcJs48dCJE7H378uKQan6Hb9gARL/oapqiYfHnjVrqiorxTWBM8bmDR16dNcucYC65d7z5tUtFFxJB3vsNoBLYZe+mfdNg+/6lZWVkRcjl89ePs1jmuDYEKfwoq2LHjx+IK62uvzCjQvO051fvn4pGVZT+3m/z5FHrtQ0oQt5Y6BmSxY8/l69enfu3M0mJm4DBsz6+eej4Nik1TY8PNrPLwjTChnugRFVg2FtH1i1xI6
2tqPc3U+FYacjoNW127fhA29vn/Xt+6mjY6f27e1sbAxattTW1NTW0oIk90XFxUWvX7989epBcnJMQkJMfHxSaiq8e9XYABcpjx/X/iq3150//bTP8OHROH7Ah+3bsOF0QIDr+PEw/Qi7j1ubmEAi+YLs7KdJSRePHYN4RWTgRjUhnXr06DtihNySg1es26dSRFEe3HMQnNO0edP6De5nYm6io6dTUV5R8rokOzM761lWYkJi7M1YmGysjrnX0dXBqyHnyBMRJ+DTybrTiL4junfobm9jb2hgCK9QkC8q53nOncQ7gWGBl25fkmo1eqbXTDm3mtRrkIGEhFQRDCxDRMEuiqjYxYu3WVoaQ/YNmBXs1Mna3NwQNoFpaUGiQNUPHz5CJGFpaTnkNszLewnTjxCUAYGFcXHYUClra9TMv4hukr+y7MCgs598fcMvX5b2dJWKt2/DLsHj8iXJ6oqrfZyeDg/RysrsmyOuR8bl8zduvHH+PH4d9XlOzpEdO+DDuMeahvM2bKi5FvQFrIFZWlumP0lHWvHs8TPkXGJJcUledp6RqRFSsvzDEp8kwocVPW0tbGkBjBUmZSvkwQNJj/vgmeATFHSedSV79uzEukyWpxBBv7YWFqsXL2ZdUckCwXs9Ecj+aDtHx/ELF0o2h4taOPfb2c2NC8kykenq6cpRv48S2Q/25UhVnsWumr4Kv0DAs27UHZKBzMx8TtPDi1NDVVVlwAAncbWMy9l3YKDKqgULYD6QsU7MGgplFhGsg5cws3btmJnJrJWuvv5Kf39mbeWz1bgvsOEw0upPDqxexmzMbb74/It6q6hQQAzULIDxrLOHR39I5MF6p5w4MMiV8Ocvv1iY8rpdPyUtjXV2OBII+4i3nTzJ227i5kpKm//8U98QtXeKI5NZF9ulW5fe/XuzLhYEpiaJrhBw0YvgZPqv9FdWEsAUveCI5VlhCEHkucfq7lau5CQ8ihMHBhq3adXq8okTpkb8rSUI6A0M+GnftesPhw7BriwebqbFW7f2duVqwo0H/cV1sWzdMnFVjSmnN7C67E0dMdXNuenMP9c1UHFK6kZw8GD7jBmePXt25qIjDn9A21lZXTl50sbKigu968oUlgMD/T8bPZoHHzbn+++nLF1al64mUOIy0GXs1LGsG0IOTIRSCF/c880ekUL6KlAG+J9ChFQdP/+8nCO6OHRgoLFt27a3w8PdBg7kSPvaYgXnwEB598mTfzx+nKO5RHi9W7Rly+y1a2uz1MSuN+zaYNPBhl2jXr189fL5S3ZlCleahZHFmR1n6Ahm4Y5gbc0hDj4lBRu7W7sh42uIxb948RdtbU3GEiQ35NaBQd8t9fTCjxzZvWmTro6OZFUaWQspFmHHWCOF8N8cEjv9cf26uQ3Lv8KQ9WPHmTPTVqzg3yI+e9TV0z187jDs7mK3U3oJq+aznVm7yL2RViZW7NJL0mTFQElJmVRnoDRSz5EjB964ccDIyKCRciQ059yBQd8Qejtv+vTEyMhZkyerqKhI0KaRVQKK46htqa2Dw9H4eJjog2iL2uWMr2FyMjgxsWnsWW6QBNgQFnoj1MHJoUEkEqCiqkIHiQFXHv09YgJjyHshbxtBwPT1dW/fPnTz5kFYlILtydzpbGFhFBS0MSRku66uFne9gGQ+HFi1ARDQ8fu2banR0UvnzIEQDxatam1g8OWkSeeDgpwcWPsVY1E9jCh1Tc2vt28/lpAw1Nu7MVtt4KiU3y5fhhBHAx7DZzAGcoqBfcdnb5yFmA4NzUb9TcKb3FLfpXfS7wwZMYRThfkRPrTXUGZxg8atjA+vPwwzhy10WvCjKvXCJwO9enUJCFibm3tx3741/ft3b8wPTl21IYXHb7+tTk39C46+rFvLesknH3NyWBfaoEDYd3z+ypXQiAhIxcvsgGY1VVXnHj0GubjAhjNnJycI3G+wU2aAWGNjZg0Zt8pMSzvzxx9hgYF56K3Z/6YJ9vb2nDGjS69ejPt
l1tC4WS6zhly0ghMs9+3cdzLwZHZGNl6+uZU5eCz30e4QFcJ6XKhxLMv3T0xiTM+pPTHW5V7MhYQv+0/vD44ITkhNwDSBvFNwdvy0z6fxtuiVyzI9zeLjY9zdUfzExeW2aYMKk/7qqwlnzhxrkEBv72l+fgcahEkFMDaOlQrPAJyRkQc5x0JDo6Kj42GRjIEEJaXm4LdGjOg7atQgZGZ6Br3U20Q2Dqy2Kpk5OXEPH8Y/fPjoyZPsvLzs3NxXr19DJir4QJY2LciI+H9JEeGgFthY1sEG1uz//Th06gTpE2vL4eiafwdWY8iz5OS7kZHJsbEZqak5T5++AVpKS+EnCYI+NLW1ITUinPJsa2/ffcCADt26sTX9WNM78kKuHFi1znDbJNxNgAyb8THxkG4qJzPnTfGb8rJyeNLU1tWGbIet2rSybm/dzq4d3EmQVpH1JbTa1MnWgRkZ/P8PNBxW+ffNv2OTY8GTZeRlFJUUvSl7o66mDscrmxmadWzb0amj0zCXYXaWdrWV5+GadQfGg858dsGDA6sxB1bI7t1Ljot79PDhY0jYkZ1dUFBQWFYGaf4gA+t78FKQTQOSIkJO3latWhgaGrRta2JjY25vbws5ojidkKzRsO6F7B1YXZ3kqkSGDkyueBCnjBw6MHGqyqRcThyYTGzHdEoOTDJLfDowyZrIZy1/a2DyaT9pRQwQA8QAMSBQBsiBCXTgSG1igBggBhSdAXJgin4HkP3EADFADAiUAXJgAh04UpsYIAaIAUVngByYot8BZD8xQAwQAwJlgByYQAeO1CYGiAFiQNEZIAem6HcA2U8MEAPEgEAZIAcm0IEjtYkBYoAYUHQGyIEp+h1A9hMDxAAxIFAGyIEJdOBIbWKAGCAGFJ0BcmCKfgeQ/cQAMUAMCJQBcmACHThSmxggBogBRWeAHJii3wFkPzFADBADAmWAHJhAB47UJgaIAWJA0RkgB6bodwDZTwwQA8SAQBkgBybQgSO1iQFigBhQdAaUFZ0Asp8YEBQDPTr1+Hj3o6BUJmWJAa4YoDcwrpglucQAMUAMEAOcMvA/yZt6vw8tf2IAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# what if b1 and b2 are reordered before composing to width?\n", "rearrange(ims, \"(b1 b2) h w c -> h (b1 b2 w) c \", b1=2) # produces 'einops'\n", "rearrange(ims, \"(b1 b2) h w c -> h (b2 b1 w) c \", b1=2) # produces 'eoipns'" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Meet einops.reduce\n", "\n", "In einops-land you don't need to guess what happened\n", "```python\n", "x.mean(-1)\n", "```\n", "Because you write what the operation does\n", "```python\n", "reduce(x, 'b h w c -> b h w', 'mean')\n", "```\n", "\n", "if axis is not present in the output — you guessed it — axis was reduced." 
] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGAAAABgCAIAAABt+uBvAAAN70lEQVR4Ae1cWXMbxxHGfSyAxUUcJMADPEDSkmJbthXJciq2K84vSFVeUnnIQ35OHpO3VOUhVankIUkl5XLks8oVW4osS7JFSqIsigBBnMS1i/vMt1xyBQKLXRwDyKlgCxJmp3tnej70zPR091KZyWQUs6s/Aqr+pBmFQ0DzQmCoVWv5bL5ULNbrjXa7rVar9Qa9mTbTVlqpVL4Qkfp1qpzyFAMcyXgS6CjaIiJptBqvz0uZKBHaC6qa6hRrtVqHB4f5jDg6QKBRb0RCESbHvCA0RLqdKkCJaKJSqohI0VnVVsSj8XKp3Fn3AsvTA6hSrrB5dqChthWpeHIgzskzTQ8gbt0Z+KqUq9WKnK4N3No4jNMDqFQsDSVosTAc/1CND848PYDq9frgYoETC/ZQ/BNinhJA2L9E93WJUTWbTQnq1EhTAkilUimGNABV6inJJo319ISAESgtShdVq9V21byQ2+kBNKx9PCz/hOCbHkBWu3XwMeiNeoPRMDj/5DinB5DRaLRYLQONRKlwe1wDcU6eaXoAYSyeBY+8Xig5NuP35rw6VYCwl/lX/Nxc67OjYSH3L/uttiEm46R1aNruDn48tWo1x/mDSrAGYSKpNWqDwWC2mGnb/70/aNI/OPH2pzrFiEs/hQZnAMmAPANoBpAMAjLk4c5HMo1JklvtVqlcKpWLZfhTK+VSpZRKpZ493Y8cHSUTKcQOWJatVCqtVhunMIPB6LA73W6Pb8G3sb65sRF02G06nRYfvV6n0agluyJJJAYQNuz9vf1O0ZqtZo7Bbp7JMVmGzbFFttVstRVtJsc+frj3aPdRIpFstcWCG2etLAYWYX/jDpBtbGz98I03L7/yOkWZdDoNRRnMZspsNiJkdMY+kW9idpAAUL1R2w8/TR4nMrk05wY6u04CPqk7t+/uPd6r1mpn1VLfAkACk9Vmf+ft99798XsmyoRKxNAsFspupwGWwEO2QB6gLJP57IuPu6Rk2eKdW1/fv/8NJlEXSeK2FyCe2eXx/vxnv3jl0qtClNHnc9lsgx30JPoTI018F8OcCoeO/vaXv9+69Z+h0BGT9rQulYj/9ne/+fNf/1RvnLpxsTZJ8I9DIrYGiQqBafXwwd5nn3zGFgqiDCNXYvLe+Nc/0unUr375awPi1nrdyE1JPzhZDdp98PijGx8RR0cY0p2vbv7hj79vtZvqiflnJwhQ+Fn4kw8/KQ+z6AgjH7zw1e0vPvjon527weDPDsI5KYCKhcIH798olSceQQY0d7768ubtLwcZ7Qg8EwEIQn/+6efZ3BCh1BFEFx6p18sff3ojkYwLNQQLEwEoGgrt7D4mKKV0U1ihs5njDz58H3uCNOcIVPIAVcuVu3fvNRrTC/sBIIbJR6NHDx/vjgCB9CPEtnl4S4MXgugsl0/aXHNBZd+WETJ9+vgpOC9tvWw2WWiavnTp4kZw3ePxwKmoULRy+dxh5ODeN19X6iXsgNVqrVKtVSu1YrHSG26FBaRUKXHQK5eLN2/9eyu4zQUpyV19hzFaF5VSKRQOVSoDnST8C0tvvH75nffeNhq4A5dwATX/wuKV19+stct37t18tLubzma4ydNWVCpVli0xbBF48fx6/amJWCwWUsepUOhZILAmNDV+gTBAsUgEY2g05PMO8Dtfv37tJz99t98Y1Cp1MLAN/bLQ1p379yKxKML7Bi5epne57fVanWFLLFM0GPR8C5Uqd4j59sH97y9A0P9wOAwpB7FKgsHgpVd/0A+d03qlwu9bgmqUS0U0fpR4vk9pdVqn04qP0AICASgfhPdLxQJlwlQlc5GcrmwuxxvNSqVMs06HYzUYrNXkE2K0Wp
3NatsMXuBWqJMTfL9xw7sCzS1XqrHYUT+eEeplRjJUi/FYjN9opQ1/HMGDW1uYYul0vjZA0pDJZLFZbJ4F35zDLi1PrXaiRAfn3FLSj8hSiQEEaI6TSb4/6bM1bbE4XFxkGbPm2bOjglzmGbyL8PssL63StFUlqZv1OrdyxxKxZlN+EZSFhmcgBlC9Wi2WTpPmjNTpwikqhHd+XvDjwFwKhWLhwzi8sKLMqNTrDfifNlksVqvxbEkWZebXPhZrlhzooo+LVhIDCIcvYb5YLJy7r98153J3keBOgyrt70cyWabRk1iGyahSafDP7ZnXqKW2XR4g2EwMk+vqYuRbqf6GarTQ4fGBU91qNefzBdEWDhPx0H4YpGX/Hlw5PTxKeJoBMXypbL1soDgTKXSwj6zXVOY4X5BKJOYBwlKNjayn2REriAGEgEWnCG63o1AoNZvPfdI8FYl1GpnUsTYexCcWU6h1Bs+8i7aYeM883PUo9BrTQr+wp1FGpwifCJVjFiYFkFanWfC5D8PPLRdeUK1OannqGkyhWG4eJWNKZT5/rNEqKEqPuE+zvwulfRYjKBaLXU2NfEsMoHqP9Yw54nLZU6lsp3AjpGZifzzRKRbqk04zBp0K2SCdbfaW+e2st36EGmIAiWo+zgQYTDx2LEimUsmMTeDsLPC7HrrIZgtsPme3WzCFJbAe5KzT2b5Emdgu1u944XDQS8teIRYqbPASMkmQNBoNFCqTYZ58d5jL9l2wRX8tiWYlSMQAkhg5onrr64s4N4GHN7UlBJImCT9Ds9GMRlOIWos2CJtAup3BqcSmGL/R9OsY08HjddoddLnaHEf6xlkgjO8onyvgncWlRW/XdFNriI2LHNIDyIQjiMdtDwaXvN45I9VrAfWD93l97wscpWIlEkl2vedAMAmdGEB6/UD7d71ag645nHQgsBAMLnu9TuPA+dA4r8NcfI7WWQl7XDqdO7vjvikj1Xk7TpmYKhqpgWTCIIWTJFZus8nkcFhxImOYIg4ccKpyfsM+FxxDgqXTxZJMZrj8Bfq0mqA/iBhAJvOgPqqAz0+pOHW7sLlp0J2baICvUCgDrAJbglm8uLJkPDlq2C0mls0fHYVfCga7oBFuadoMqtfrQ43FTCyRgRhAtPW5c08QWrSQz50zHTt54Ga10mZ8mq0WwxSQB9RJTUpGvqCAwskGXtrOB8cpE1uDDEh1MpxTh35ipRIJlapPJvnZM2qVym6jV5bn/X43737DBp9MxM7oIt/Y74vFMghA2UROg4gBBMmcJ24wEdnPVyVTKfhGzteJ31WrVXgFVlYWYEDFjyK1E3+YOOtJLeIF+LY7HONYEl3tkwTI4/Wett53neXo8PjsPXqEvKEuUXpveb+XwaCz0tR3Tx/1MnTV1GqcI3FuztNVP84tSYDcXi9vLpbLle+eHOKYWj+RuFe+bC77cGdH1AjuZMa7z/gjFmCLHYXyrHykH9MQzln41TobGbNMEiAclBZ83CaCwCEiFqlk9smT8MFBNJthcDLoFFSn0z3de/JgZ6co57hJJVLxyOGD3W86H+8sw88bjcdT6TRfabJYaQuxFRptntsmOjserby6vv7tt/cRLBYeh6WLTyx2jLxUmsbqScGe5hPCbn75eT6XW1sPOh12GES9DnkcLO7dvnUQCSWSp+Pnm4WmABeEmGA58edSPI5XY44TKWVLLXEqFKQavEAYIGz2nnmvaOi5hMxo/F2KeFqrhU8Rnp3cUTyWyWcf7N6fX/C73V63y221WvVafbvZhMkYjyFJ5Fk0kSgVy4lkTKFqw3NSbzWhnEjKAkZ4z7xRa2AhR951u9laXV6mL1rstrnBBz8IJ2GA0OX2hQv8Ytmve4wLHySOh8NxzEqFIqK4u9OPma+PxSIsy0jzYKkKrK3B8sR5bdgXiCVaJrkG8d0YDFRgdVWiS55kkox8yD7ey6CnqK3NbaVCCZ3qpY5cQx4gpKoEt7dsNpu0TGRfK1Sp1Ssbaw
4rF3rFxifd9VBU8gAhMQV+1deuXMFWJSEKUneNJi6kQ+Ry+3zbG5tQH7SGRYpIm3wj5AHizVnKZLpy9apW0kk05yKzoDo8nu3tbUS0+SHVzlKHiMA0AYDONNzudL5x9apEnJ4yGfFHy8Ycht3lWg4ELl+8KCgOjMsx2+x8nDBA2Eo6k1pwOrt2/bqFS6wTv9zzbt0YSfIOt3txZeWtK1ewGwovyLQaSIQ5Z5eK9z1YLWGAYJ7gtYDOrmmb/dpbP1ry+0UPkKj0LflG2JXxoHdpaS24/s7169TJK1OwGOEk4bsmuJERBghnscCqLxDw4d0b5Fby4uoNhkuvcZfdZus1c+E/Xgos4bXwTlily4jbr25vXb788o+vvdn5e5wqkVLRxntphC7yhiIEMxr1RqML/mY4vXK5Agxo7gf3+5EWlIxGw6FQnuGyOExGc7VexazEBFlcXcqk0pl0tp9TlR+vgaLmF/1LK4ubq2t2mtvUceFXsZjxVxjNmMtOl9PqsJ7Ynzxx3P8nAhAvFECx2Wh8YDdzLudCEbuwPxDw+HxMNhuPRi9uvASjKZlJMgWmUqtgbDaHLZvOMnkWtnbnyGDmuOe984sLfr9vweO1mW0ABf45E/IZKArHUxOcrLTZO7/gdDs7Hxy/TB4gTJbg5kuikuF9VHgwcFLDaRaoLc75K0V4oJEFzmSy6XQ2zRaYgst9Er2oItgPRTDBq+/EWuxGgiJUxmqh4bhE0AJwII6CBR6OzBGWMFHxRCvJAyTaDV8JTytlMuDD324ur0knREk0NTUS4UV6anJPraMZQDJQzwCaASSDgAx5pkEzgGQQkCHPNGgGkAwCMuSZBs0AkkFAhjzToBlAMgjIkGcaNANIBgEZ8kyDZgDJICBDJvYnumT6+Z8lz6aYzE/3X6WiqRkUPTziAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 19, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# average over batch\n", "reduce(ims, \"b h w c -> h w c\", \"mean\")" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAGAAAABgCAIAAABt+uBvAAAN70lEQVR4Ae1cWXMbxxHGfSyAxUUcJMADPEDSkmJbthXJciq2K84vSFVeUnnIQ35OHpO3VOUhVankIUkl5XLks8oVW4osS7JFSqIsigBBnMS1i/vMt1xyBQKLXRwDyKlgCxJmp3tnej70zPR091KZyWQUs6s/Aqr+pBmFQ0DzQmCoVWv5bL5ULNbrjXa7rVar9Qa9mTbTVlqpVL4Qkfp1qpzyFAMcyXgS6CjaIiJptBqvz0uZKBHaC6qa6hRrtVqHB4f5jDg6QKBRb0RCESbHvCA0RLqdKkCJaKJSqohI0VnVVsSj8XKp3Fn3AsvTA6hSrrB5dqChthWpeHIgzskzTQ8gbt0Z+KqUq9WKnK4N3No4jNMDqFQsDSVosTAc/1CND848PYDq9frgYoETC/ZQ/BNinhJA2L9E93WJUTWbTQnq1EhTAkilUimGNABV6inJJo319ISAESgtShdVq9V21byQ2+kBNKx9PCz/hOCbHkBWu3XwMeiNeoPRMDj/5DinB5DRaLRYLQONRKlwe1wDcU6eaXoAYSyeBY+8Xig5NuP35rw6VYCwl/lX/Nxc67OjYSH3L/uttiEm46R1aNruDn48tWo1x/mDSrAGYSKpNWqDwWC2mGnb/70/aNI/OPH2pzrFiEs/hQZnAMmAPANoBpAMAjLk4c5HMo1JklvtVqlcKpWLZfhTK+VSpZRKpZ493Y8cHSUTKcQOWJatVCqtVhunMIPB6LA73W6Pb8G3sb65sRF02G06nRYfvV6n0agluyJJJAYQNuz9vf1O0ZqtZo7Bbp7JMVmGzbFFttVstRVtJsc+frj3aPdRIpFstcWCG2etLAYWYX/jDpBtbGz98I03L7/yOkWZdDoNRRnMZspsNiJkdMY+kW9idpAAUL1R2w8/TR4nMrk05wY6u04CPqk7t+/uPd6r1mpn1VLfAkACk9Vmf+ft99798XsmyoRKxNAsFspupwGWwEO2QB6gLJP57IuPu6Rk2eKdW1/fv/8NJlEXSeK2FyCe2eXx/vxnv3jl0qtClNHnc9lsgx30JPoTI018F8OcCoeO/vaXv9+69Z+h0BGT9rQulYj/9ne/+fNf/1RvnLpxsTZJ8I9DIrYGiQqBafXwwd5nn3zGFgqiDCNXYvLe+Nc/0unUr375awPi1nrdyE1JPzhZDdp98PijGx8RR0cY0p2vbv7hj79vtZvqiflnJwhQ+Fn4kw8/KQ+z6AgjH7zw1e0vPvjon527weDPDsI5KYCKhcIH798olSceQQY0d7768ubtLwcZ7Qg8EwEIQn/+6efZ3BCh1BFEFx6p18sff3ojkYwLNQQLEwEoGgrt7D4mKKV0U1ihs5njDz58H3uCNOcIVPIAVcuVu3fvNRrTC/sBIIbJR6NHDx/vjgCB9CPEtnl4S4MXgugsl0/aXHNBZd+WETJ9+vgpOC9tvWw2WWiavnTp4kZw3ePxwKmoULRy+dxh5ODeN19X6iXsgNVqrVKtVSu1YrHSG26FBaRUKXHQK5eLN2/9eyu4zQUpyV19hzFaF5VSKRQOVSoDnST8C0tvvH75nffeNhq4A5dwATX/wuKV19+stct37t18tLubzma4ydNWVCpVli0xbBF48fx6/amJWCwWUsepUOhZILAmNDV+gTBAsUgEY2g05PMO8Dtfv37tJz99t98Y1Cp1MLAN/bLQ1p379yKxKML7Bi5epne57fVanWFLLFM0GPR8C5Uqd4j59sH97y9A0P9wOAwpB7FKgsHgpVd/0A+d03qlwu9bgmqUS0U0fpR4vk9pdVqn04qP0AICASgfhPdLxQJlwlQlc5GcrmwuxxvNSqVMs06HYzUYrNXkE2K0Wp3NatsMXuBWqJMTfL9xw7sCzS1XqrHYUT+eEeplRjJUi/FYjN9opQ1/HMGDW1uYYul0vjZA0pDJZLFZbJ4F35zDLi1PrXaiRAfn3FLSj8h
SiQEEaI6TSb4/6bM1bbE4XFxkGbPm2bOjglzmGbyL8PssL63StFUlqZv1OrdyxxKxZlN+EZSFhmcgBlC9Wi2WTpPmjNTpwikqhHd+XvDjwFwKhWLhwzi8sKLMqNTrDfifNlksVqvxbEkWZebXPhZrlhzooo+LVhIDCIcvYb5YLJy7r98153J3keBOgyrt70cyWabRk1iGyahSafDP7ZnXqKW2XR4g2EwMk+vqYuRbqf6GarTQ4fGBU91qNefzBdEWDhPx0H4YpGX/Hlw5PTxKeJoBMXypbL1soDgTKXSwj6zXVOY4X5BKJOYBwlKNjayn2REriAGEgEWnCG63o1AoNZvPfdI8FYl1GpnUsTYexCcWU6h1Bs+8i7aYeM883PUo9BrTQr+wp1FGpwifCJVjFiYFkFanWfC5D8PPLRdeUK1OannqGkyhWG4eJWNKZT5/rNEqKEqPuE+zvwulfRYjKBaLXU2NfEsMoHqP9Yw54nLZU6lsp3AjpGZifzzRKRbqk04zBp0K2SCdbfaW+e2st36EGmIAiWo+zgQYTDx2LEimUsmMTeDsLPC7HrrIZgtsPme3WzCFJbAe5KzT2b5Emdgu1u944XDQS8teIRYqbPASMkmQNBoNFCqTYZ58d5jL9l2wRX8tiWYlSMQAkhg5onrr64s4N4GHN7UlBJImCT9Ds9GMRlOIWos2CJtAup3BqcSmGL/R9OsY08HjddoddLnaHEf6xlkgjO8onyvgncWlRW/XdFNriI2LHNIDyIQjiMdtDwaXvN45I9VrAfWD93l97wscpWIlEkl2vedAMAmdGEB6/UD7d71ag645nHQgsBAMLnu9TuPA+dA4r8NcfI7WWQl7XDqdO7vjvikj1Xk7TpmYKhqpgWTCIIWTJFZus8nkcFhxImOYIg4ccKpyfsM+FxxDgqXTxZJMZrj8Bfq0mqA/iBhAJvOgPqqAz0+pOHW7sLlp0J2baICvUCgDrAJbglm8uLJkPDlq2C0mls0fHYVfCga7oBFuadoMqtfrQ43FTCyRgRhAtPW5c08QWrSQz50zHTt54Ga10mZ8mq0WwxSQB9RJTUpGvqCAwskGXtrOB8cpE1uDDEh1MpxTh35ipRIJlapPJvnZM2qVym6jV5bn/X43737DBp9MxM7oIt/Y74vFMghA2UROg4gBBMmcJ24wEdnPVyVTKfhGzteJ31WrVXgFVlYWYEDFjyK1E3+YOOtJLeIF+LY7HONYEl3tkwTI4/Wett53neXo8PjsPXqEvKEuUXpveb+XwaCz0tR3Tx/1MnTV1GqcI3FuztNVP84tSYDcXi9vLpbLle+eHOKYWj+RuFe+bC77cGdH1AjuZMa7z/gjFmCLHYXyrHykH9MQzln41TobGbNMEiAclBZ83CaCwCEiFqlk9smT8MFBNJthcDLoFFSn0z3de/JgZ6co57hJJVLxyOGD3W86H+8sw88bjcdT6TRfabJYaQuxFRptntsmOjserby6vv7tt/cRLBYeh6WLTyx2jLxUmsbqScGe5hPCbn75eT6XW1sPOh12GES9DnkcLO7dvnUQCSWSp+Pnm4WmABeEmGA58edSPI5XY44TKWVLLXEqFKQavEAYIGz2nnmvaOi5hMxo/F2KeFqrhU8Rnp3cUTyWyWcf7N6fX/C73V63y221WvVafbvZhMkYjyFJ5Fk0kSgVy4lkTKFqw3NSbzWhnEjKAkZ4z7xRa2AhR951u9laXV6mL1rstrnBBz8IJ2GA0OX2hQv8Ytmve4wLHySOh8NxzEqFIqK4u9OPma+PxSIsy0jzYKkKrK3B8sR5bdgXiCVaJrkG8d0YDFRgdVWiS55kkox8yD7ey6CnqK3NbaVCCZ3qpY5cQx4gpKoEt7dsNpu0TGRfK1Sp1Ssbaw4rF3rFxifd9VBU8gAhMQV+1deuXMFWJSEKUneNJi6kQ+Ry+3zbG5tQH7SGRYpIm3wj5AHizVnKZLpy9apW0kk05yKzoDo8nu3tbUS0+SH
VzlKHiMA0AYDONNzudL5x9apEnJ4yGfFHy8Ycht3lWg4ELl+8KCgOjMsx2+x8nDBA2Eo6k1pwOrt2/bqFS6wTv9zzbt0YSfIOt3txZeWtK1ewGwovyLQaSIQ5Z5eK9z1YLWGAYJ7gtYDOrmmb/dpbP1ry+0UPkKj0LflG2JXxoHdpaS24/s7169TJK1OwGOEk4bsmuJERBghnscCqLxDw4d0b5Fby4uoNhkuvcZfdZus1c+E/Xgos4bXwTlily4jbr25vXb788o+vvdn5e5wqkVLRxntphC7yhiIEMxr1RqML/mY4vXK5Agxo7gf3+5EWlIxGw6FQnuGyOExGc7VexazEBFlcXcqk0pl0tp9TlR+vgaLmF/1LK4ubq2t2mtvUceFXsZjxVxjNmMtOl9PqsJ7Ynzxx3P8nAhAvFECx2Wh8YDdzLudCEbuwPxDw+HxMNhuPRi9uvASjKZlJMgWmUqtgbDaHLZvOMnkWtnbnyGDmuOe984sLfr9vweO1mW0ABf45E/IZKArHUxOcrLTZO7/gdDs7Hxy/TB4gTJbg5kuikuF9VHgwcFLDaRaoLc75K0V4oJEFzmSy6XQ2zRaYgst9Er2oItgPRTDBq+/EWuxGgiJUxmqh4bhE0AJwII6CBR6OzBGWMFHxRCvJAyTaDV8JTytlMuDD324ur0knREk0NTUS4UV6anJPraMZQDJQzwCaASSDgAx5pkEzgGQQkCHPNGgGkAwCMuSZBs0AkkFAhjzToBlAMgjIkGcaNANIBgEZ8kyDZgDJICBDJvYnumT6+Z8lz6aYzE/3X6WiqRkUPTziAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# the previous is identical to familiar:\n", "ims.mean(axis=0)\n", "# but is so much more readable" ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAGAAAABgCAAAAADH8yjkAAADw0lEQVR4Ae2ZT0hUURTGz5vUHFEhrcxpsFIrQoS0IM0/ERS6qZWQ5KJlQW2DdiLYojZF0KbaRFkEQYuoXSFqUZAgQYYWkaXVGKlhjiNjU9b43tx73v3uu+OMu3Hzzvfdc77fvDtv3jwZa5BW98+3uvFEWYaAX3f6xyPF1a0HDPudNstsi67cisVHNl8ud2aNCrMtOnlzOZ8m2vuNcp0mI0D3K6effp8dTwiDygQQeiAGRTtF5VmbAHrklKEfstYrE8AzFtHHtFaaACZYQlJvggkgygAzTGulCYB/GAu1iWzRBFDKZoJMa6UJoJElNDGtlSaA43JC9UZZ65UJIHBUzMjqEpVnbQKgztpEju/iloQwqIwAdL3D7iu9m+QN2/B2TbM9AxPzxdUtBw1etNRiCpCGkhH2qSczk1RvBuC5XfxGphpYGPs6Gfoeml5YiEQWrT9ZuX5/XmkwGKwsUnUzD19Fe1inQgZq6/d73VpTAiwxsxva6iwF27FSvYqivWceOmGqIlXAUmaZKtfx0gCocMJUReqAkgJVruOlDihyspRF6oC3l5TBtpk6gG6P2GGqYxoA1B1TJS976QAMP1oRYHBwn2ZOWrrGn/2EVc0ZvH4p9Ellm6SIvjxmhiA1gHtCm1yeaJY13WdakBjw84nQxsoL7P42jC8kDOjVbGxOAyM+ZTohNYBEk7s6xKw+phMSAmLCP36Jdrvi30ajM/YKP0LAhzBvFXVAFP/qYW7YGgLe2B3q4zpmQwD80v/EEkR5RBTx+rPbijvwDL6hCbUP2yEgpA5C7iRagADte+xOm3dbcQcCImhC7cN2CFhUByEXfuwhYC2KUvuwPV2AXDWXCALy0YTah+0QUKIOQi5sh4BNKErtw3YI2KYOQu5WtAABu9DEkr/DvVbltuIOBJTBt43o9Hn+PLpmZ9IAqw6NEI203mDX/W4/6oZnQE1ohOg5VbbIqzWyFBQGNOcIbXI5NBebkp29shQU/MKhwsP4ibBrw4CQQZSPzwAD6BgG8EemehyDt4iq+MOP9KIl8UJSktAA6JTUqROzeFEHwFOulXcuxzZ0gGm7yfs4Clt0APyyXHG4NU2AlZ3Be9cLhcaKAItjMM+1gN8uzRaFFXdlV7JtfLQLftQACs/xZqyz59Aa/oyjCYUfaO9QuHFLcwZwxrXQiPONfwkk2j4ViUZ9vpifwlZeXnjeyi9YX15RWWO5eJJhvkVXi6VBU5GWLdLBMgDd7vxfy2xRZos8d8CzIXMVZbbIcwc8G/BvOJ6jZg2rfpn+BUFLlGkzaXw/AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 21, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Example of reducing of several axes\n", "# besides mean, there are also min, max, sum, prod\n", "reduce(ims, \"b h w c -> h w\", \"min\")" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAAAwCAIAAAA+QDD4AAAP1UlEQVR4Ae1deVhV1RYXgcukzJPMYSAoCKgBDwVxABMHEtHUQkoyrfeixHzVk4K0slABfU9F1IBKtM/UBMxAM5QQBZlBZgUJQaYLwr1cuAxv9fge32Xvw3WfOyDC8bt/7PU7a9rrsM7eZ+91tjIDjx5NGkv/6qeNJW/AlzHm0BhzZ9KcnDF2v8aYQ5PHWHhEdKe/v//koZNu1m6mLFMHQ4fgwOB2druIuhgxJgKSi4DM+BjBggKCzn57VjAsljMtE24lqKqpCoKitMfYkDHG3GFGsKf8TY2HESz/bj6SXdDp8nvlJyNPPqX3zGUmAlKOwHhIsPTr6ZRRSruWRokzIBOBUYvAeEgwTieHMl6dnZ2UOAMyERi1CIyHBDOdbkoZrxdefIESZ0AmAqMWgfGQYMvXLNfS0UJCJiMj47fdDwEZkonAKEdgPCTYVNWpcYlxetP0hmLHUmB9deQr1yWuQwjTYCLwTCIg90ysStzoHKc56RXpKYkpNVU1MJotXbFU31Bf4lYYhUwE6EZgnCQYdFtZRfmVDa/Q7T/Dz0RAqhEYD1NEqQaIUc5EQJwIiD6CtbDZ5xITU27cyCsurn/8mN/bO1VFxcTQ0MHW9uVFi1Z7eiorKYnjmRDZ1OTU7IxsIQxDl94Oehve0IZIaTc4nM6iotzi4rzKytLa2pq6upqWlqauri4ejwumVVSmGhoa29g4LFni5enpraCgIG1/BPVXl5am/Phj9s2b1SUl7S0tAwMDUzU0TCws7BcsWOrrO3PePEFmKbWb2E2ZxZkFFQXlNeW1j2vrGuua25q7urvgx5JjaalpTTee7mTr5O3m7WLnAstUUnIDVwvRyMq6l5SUlplZWFFR29jI5nJ5cnKyKiqK6uqq5uaGlpYmjo42Hh5Ohoa6uLgQRJRSqScdHXsjIo7GxXG7ukZSraOlFRIU9I6//+TJ9AZJklKg0KDQ6IjokUwL4ndr7xoYGQgitNskDk2a1NbG3rDBA1Krr6+PxIS+vmFo6MHVq18lYRbkIXNHUGJSTXl5xM6daUlJw9DhhIOra1B4uAhpRlhby37Cdn/bHVJruNkRKTsLu7APwjydPUfkGOkCoUMC4leupH/yyX/y88sFMOom5Ly7+7zTp7+YNk2bmgND6f31g3huUZG9h8eBqCgh2QVsTS0t/9i9e/Ubbwhnw/x5XgEYowoKsgmzCzrZ0FC3ffuGsLDPpN3hhJiYDXZ2wrMLfMhNS/N3do4LC5OSP9xuLnl2gQ/5FfnL/r5sx8Edff1EDyzR3IaBa+fOCC+vQJLsAhPA//vvWSwWjXkfvQS7k5Pj7uPz4OFDwv5cvnbNd+vWvv5+Qv6JxhYZuffChdPS6/WZQ4c+37Klh8cjMdHf13f4o4/+/cknJMyjwxMZH7nti23SsxUaGh0e/gMt/TB2aWmpk4vQSLC6+vpV/v5PaNYfXbl+PTwqityhicYZEhIEb27S6PWtX389GBREV3Ps118nxsXRlZIe/6lLp46eOyoN/SUlD778knY5uI3NdFrO0Eiwrbt2wcSPlvZB5j0REc2trSIITgSRlpbGixfjJd5TTkfHnoCAAZHmDvsDA5vr6yXuksgKPz78cX2z5P05fvx8Xx/tuZWNzYu0OkKaYCmpqTAW4ar9fH1vXLjALi3lVVdXZ2Ye+/prYwN0UaGTwzlxWpITodDw0EfwHdvw37u73sXde7aIr69fVlZNZWVnbS0/P78hMjJWR0cPd+nKlYs4KCYSHxnZRPWtutuqVdG//36jrS2to+P7rKxX3noLN8R58iRm3z4clzjit8LvQeKD9hvt/Ex+8/Xm8/vP21va41Y6uB3hP4TjuJhIamo2rmHjxmVpaaeam6/z+Xd6eu40Nf2WlfV9VNS/fHwWq6j8tSo+a5Y5LiUEIU2wsKMUw/Sp8PDvDh92c3ZWV1VVYLFMjYy2b96cnZx
sbWGBmDxzUfJ/Q4iJMUhqa+saGpooK6vIyspBaq1f7//ttz/jfublZeGgOEgvn3+O6n69tmNHRELCXHf3KWpqylOmwJrhpydOfHjoEG4rITaW29GB45JFdDV1zQzMVKeoysnKwRq9z2KfjLiMpY5LcSsxCTE9/B4cFwepq2tExF1dHeLjv1qwwF5LS01OTk5eXk5bW33evJnbtq09f35/U9O1hISIJUscESnhJFGC/Vlffz09HVHku3Lllg0bEBBIWKA/ij3/CktLHzc14cwTDZk719nObh7Saza7pbkZvdkIDy3ybmpqS0MDImI6Y8b7VIuEGwMDbZ2dEWbILniFQ8BRIBVZirF7YpUU0B3UlvaWjIIMyToA+YMobGhogS05BBwilZQUV61yMzMzGEJIGkQJdvXGDVigRNRt9/NDkCFyobOzprr6EDnYyC4oQJCJSb700ny843/+WYODIiOZ167hspvef19WDv2TGmRb9+67OH8W1RsBziZxxFDHcPOKzbja9Hz0EY/z0EIsLEwQ/oqKh25uAbdu5SO4OCR1xBGNWfkUJpNTU2HfG+EcImVlZYfag43y+/e9lixBwAlIGhub4b3mcCQ5HyvNzcVNuK5ciYODiLOnJ36pLC8PB0cHWbN4zfELxxFb9+7fQxAxSXitunkzB1Fy927J/Plb5syxev11rzVrFpuZTUMY6JJECfaghuL5uv/YMVrGmCniYLg0NDTxuHE4HBwUGamvrkZkoSpKz9gYAYdILT09NS0tqJ8aQqBRT3XTBRmk155jNQdXDnVVOCgOsm2bT2xsYl5eGa4kJ6cUfkFB4c7Otps2vbx+vaeeHsVdwwVxZDIO4Ugzm42DdBG6G2h09T8v/CyWIu5qv0TrFTra2xETGjo6CIKQkGAI0okpQRikR+po6MjLySP6n3CeIIiYpKKiwsWLB/CJoqDa27cLAwP3Gxsvf/314Hv37gteImwTJVhPTw+hOiFsfD5fyNWJc4lucaYIkYGaDERKXh79e0UYYMkMQcjLvhBBiZAqSiqInt6+XgQRn4QVi9u3Y/39VwovLObze0+fvmJvvzEkJApO4KRllyjBJFL3DeuetDxjmEWOgKKyMiLLfVr9Dex9ISJKmBKEQapkF68L0Q8LjAgiEVJTUy029vO8vDMBAd6qqlOE6IQ027PnxBtvhOILfkKkiBJMW0NDiArCS1NU0GcSoSAhm/CHEKGS8cGmMw19NW+sq+OPPA3hcblNdXVI33WwggGEQXoku4Pdze9G9Gupo5NYhEEccvZsi5MnP2tsvAo7XZs3r9TUVB1J2/ffX4YSkJGu4jjRqGJuZoZLNhUVaWtq4vizQuSoapx7+ZKfVzyrDpLbNbO2Lhq+wNvX2wuv7fh+16DOwtu38ZkPKCG3KFnOosoiXKGpvikOShZRUGDBThf8env70tJyzpxJjo//lcNBx9KQkOMBAa/g22iUzhCNYI729rjwL7/9hoPPEFFVpXjqPK5//AxdelamZ//tb7jpq+fO4eAgkvTdd/glSiU4mzSQX/74BVc7a/osHJQSAp9aLlr0UnR0cFXVJRcXO8RKY2NrdnYJAo5EEiXYMnd3fF9r1969JRUVI+kdfVx3mi5uNP23dBwc98iCFStksO9cL0RHN1B9Z1SSnf3LDz/gMYEnOQ6OAgKfZkZfiMYNwTfOOCgOkpKSUVZWLVyDnp7WN98E4jw1NfU4SIkQJZiejs4KbI+4sbn5peXLQw8cGGmDC84ROBkf77lxY9LVq5S2JQuaW5rjCqMjo6vKqhCc3cK+/NNlBBxPpJ6RkcuyZUiPujo7P1i1qu7+fUG86M6dIG9vfNVx7sKFcJqAIKc02pUPKzu7OgU1t7a3+v7Tt/VJqyAIbQMdA/sZ9ggoJrlvX4yV1VpHx81hYXFQwzGStqys4pEukeBE72Cg6NMdO5KuXUNm6hwu9/PwcDg+wN7GxsbKSk9bm8ViwYECDY2NOYWFVf/fqVy6YMF
KDw8Sb8ThmTl7poKiQjdv2MtxW2ubh72Hp7en2XQzeB9reNRQCmWRxaUyk2SqOFVwfKI4FseybEBwcPqVK4iHFQUFa62tnTw8XrC2hlX4stzcnJs3KT9pCdi9G5GVBnnpxiW9pXquDq5WZlawLl/bUJv0RxKMYLgtKJ6aLEM0GOCyIyGFhZVwCfIHfh99dNjERN/BwQpKN1RVVaAukMvtgtLE3Nwy+GwM12BuboSDlAhpgs2zswsMCIg8cQLX0j8wAOkEP/zSIFI2/Kk5EpuYOGSX+zL35EvJiB4ej5fwYwICAllVXmVta43j4wOxc3FZ/eabcF4A0h1YS/zj8mX4IbgguWTtWkhCQUR6bS6Pm5yRDD8hJqD2970N7wlhEOHSo0fwbeOw7fiHD2EG3UCiSldX08FhBgkn8NB4KnwTHOzh5kaoV5CtrPKvR8Uo/Nv6/lZyKxX3xtALJLnb5Jy7Dh+2mD2bnH+Q0+jFF4OjKV6B6OqRIP/ut3bDFFGCCkFVYaHod//DD/1gFYTQHxoJxpKX/zkmZj39d9+yKvQtiNA5umwui1x8/XwJpcpLygk5n1M2+OLrSEoKrRwzMjc/dvWq6ljaffGa7/Wx/8cSvwWD80MR1Hp7uwcFvUYuSCPBQCkcdXg2Kiru0CEDfX1yG3BeQGtbGzm/OJxhx8M8VhFNb2of1Ipj6LmQhSreb9PTvbdsIfF2sY/Pd5mZBlR7niTi0uB51fNV+MwZX8EW35ar6xwvr/m0ytZgl+zTT9/66acwWv7QSzDoGBRMbF63riojIzYy0nPhQsWRT88ETvtZs3a9886txEQNqk0q8cOEa1BUUoz5OebgyYOUi4rAD69qi70WHzl9ZP+J/bj4+ENgHPvs1CnInKXr1rGobhZ8JAbL+lHXr+8/fx4v+ZVqQFYsWLFp+Sa87BCM2lnanQs7d3bfWUUFqVRIOTnZXL58uKYmad++9+bOtRJeBgQH3YSEvF1ZeWnPnnfIJ4eDoRPl4FHBoPO6u4vLyioePIBFec7/ziGdoqysoaZmaW5ubWkJRwkIMpO0RThYk1ItFIxVllbC2gusHHZ3dSupKOnq606fMd3KxgpyjFKEGpSUQ9TaaaPiuNPF5RZnZtaUlra3tkJ8pqqrm1ha2jo5qdC/TUN+E57zWddUZ/QyuvK202/ngQ8O8Hp4mUWZxfeL4bNleVl5Q11DRxtHSxPLIRP0GoQODVcKCx5wNGJ19aPm5jYer2fyZBklJQU4L8DUdJqtrQWcIDCcnQYlboLRMEXGKs4fEJkFmlxjzKEx5g7pf4IuJMFo3o+nsYuUYE9TKvp12lNE0U0xkkwEJl4EmASbePec6fEoRoBJsFEMNmNq4kWASbCJd8+ZHo9iBJgEG8VgM6YmXgSYBJt495zp8ShGgEmwUQw2Y2riRYC0mn7iRYbpsSQjAOf1DmQPSFLjc6Lrv4BueAUYFFRyAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 22, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# this is mean-pooling with 2x2 kernel\n", "# image is split into 2x2 patches, each patch is averaged\n", "reduce(ims, \"b (h h2) (w w2) c -> h (b w) c\", \"mean\", h2=2, w2=2)" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAAAwCAIAAAA+QDD4AAALi0lEQVR4Ae1dCVAUVxoWZjw4hssVOQXxigVqdILHBI2uCq6skpJ4FRo1RkLWFEaT0i2TRa21YjxILNm4uOWuujFKFBKDF2LwWG4FAYsFFRUvGC1QkEMCI7B/ZJ0M73UP73XP0Q4vRVXe/73/el/3P9393uvWqr2ysoeU/lO7SykbyEViCUksnR5jrkjseEksIWuJ0SMqnU2fbHrN+bWlYUs1LRpRjpgxY8BADMgN5MfMbn5p+sXP1q8jidTkVJ/ePjv375y3ZJ6Z02Lhuz0DFnIF01aX9oB+vPRjbZs1GAPmYsBCCoyTvt3bd3PiDGQMmIwBSyiw9vZ2Tr7u3b7HiTOQMWAyBiyhwKysrDj5Wrh8ISfOQMaAyRiwhAIDsmbPn41QJpfLR70xCgGZyBgwMQMWUmDxCfERKyK03Ll7ud/TsPtDLR+sYTYGrNhCcxfcS2xlV2LpsIXmLk4fC7mCdTFK1s0YMBMDrMDMRDwL2z0YELuT4z85OX/bt+9idvazpqYBnp5Tg4JWvf/+IF/f7sEe7ygfParMyrqQm5teVJRXU/MY/kDVy8tHpZqyfHm0r+8gXksjdyTv25dy+HBpfn57W5ufv//U8PCFq1ZZW5v6d7boRlFmUWZucW7BtYIndU8e1z5W2ClGDxsdNjksck6kXCb2tBTM4sGDp3744Vxh4Y2HD6ttbHq7urr4+w8KDh4fFja5f38XAW4FPoO1tbUFTJlSWlbGF3LurFlH9uzh69WDkzxjqB+old5KPU60XZXtorcykyT0Il5c3Bb4a2io10bX03j33agvv/y7HgW+LuJ0Ojm4XVIyf+TIttbWTqiOsDs1ddz06ToAaZN8b+3mvZs37tnY2sabg27I+PXxH4R/oIuQtskT0vG4Z09SVNQXOgB3s1evnsnJX4eETODu5kKFFNjmnTv/sm0blzcUu5+f7+VOtz2e5ASSZoF5eHAvx6Gk6MiVldxL5DoqaJOEH8Rmtp9fRXk5AuKirUKRXleH4/oR8vPZSknHDyxvPr/0nPrqSp7Qi4HBpUIuH8u3V4Fz7O3t+Zw4J0h9b6CaNYuwuiCet1JZfu8eZ2AGAgMCapKWt0CZjKS6wO2z+nolz5I9bVCD6MNJLwuUNTU3GcQbnxOZLJCqumbMUPG54sTpCmxuZGR2PkX5Qki/8eM5AzOwg4GoqAXGo0Jlawu/0FT+JVVjkLmtypYqfyplB4eJVPqgvGBBCJUJRYFdu3kz8cQJKu8dymOCgwVYdROT5OTvjTTSL6KimpuE/Py/9+abRkpJmNvAxYHCDPVbNTe31Nc/06+D986bR/ekSlFgwydNQuLJZLInpaWwVK3790lUFKJWUFyMICJF2KjRKeSL8MVVBo4iMkkwLyxUd+am/Z13FuNu09JO4aB4JIlrkimloiIfniFe/l3SaKxlMiRWUVaWpqUFAY0hqlPVv6XyIqfEbYl4oLySPMKpEdxWDxITE4/31tenv+RGS9Kvjby8gyNGDAZ9mFrErfQgpAV2LiMD8fLrM+j9+86Ojgi+Iybmn199hYAwL4Ig3VPctevf+NZkmHs0OBufRfy2cUzrPK+trZ+Hh1aEhkwuv/z8uS7S0TbXRSx8avit5Ft4Pm+veRsHRSIpKVmIh4SELfb23HekSuXwq1e/h0pDTLoUSQts5qJFiK+KK1cQRCu+twB9rtiwY4e2t5s3NmyIRRi4dAn98UIUBIgphw4hVqtjY/Ha7tCJPXYMUS7Jy0MQk4l+nn5DfYYi4U6kn0AQ8SJeSzdv3hfvFvFAWmDN2D2De//+iC9dcVLnuQ3aR21dVxbWhhUwY4+Ik+1Fa9bwxZ0cFoZ31VZX46BpkPyD1BcKAYktWfJHxOrzz2E5MAcBRYpEBfaoqoo2TP9+/WhNuol+nz42xh7p8f37xYdIjOd4RBHvlsSDva09rpZ9NRsHxSCRkXNw85C
QlVZWyvj4RLxLGEJUYCnnz+PerWARh//v6PHjiIlGo0EQJhqJgUs//4x49g/sYiJuROc7DjC/fO4c4sS8YtZV9JFJfD4//YROFnT4/PDDLVBmcnng/v3oaUwblKjADDINeOfBA9rkmL4wBsqvXUMM3QYMQBBEdPX0RJC7168jiClFFwcXJNyt+xyTH4gOrTh79ltxcWv5rFpb25Yt2wiV5uo67dGjJ3xq+nGiAoPNqvq9kPQ+pd+GQ+KW6eAMaJqbEbC3TRf3pbhCC+YE8WlU0VGBzk4baUvHRx/Nr6tL79lTrmc4VVU1bm7Tocz06PB1ERVYX2dnPnty3N7OjlyZaYphoK+bG2Je09VTdBX2geffYU4Qn0YV6xvrEf8ujug1DVEQLCoUti0tuTU1F1SqUXqcQJnB1ay2Fk1Mjwl0ERWYcuRI/V5Ier07r8CQmFDpwEc4qPQtWHmUSoWM7r+XLyMIIhbn5iLIKLPu56iurUbyCRgUgCCGFZ2cFJmZ/4KVroqKlGnTxvE5d3aezNfFiROdlKHTOC6OUvvWgJ2CXSH/f4hh0XLv5s26x7vuSRePEE2Njbr60A5dvBhBzCsGjw82TQIeHv3Ont0NsWAj1cSJy4uKbiBx585dd/ToVgTkE4muYC5OTnz20sFh35Z0kjFvJj7DhuEJ6NkzDvvocf3Xg4Jw0IyIpys6DWPsZODWsbDwMGzgQAIlJqKTtIiCrkhUYGCgsLfXNYN2xMqVCMJE6TDQq08fJJkIpRJBtOIMLy9tu6NB/RYWYi9ODI0OFefAkNYdWxAFeyQtsNyTJ5EYh3788XWuW0dELfXiRQQxpbg4VFr3OSYb+16M9usFBZmnT+MJbI+ObsQmeL9OTsY1TYPcqbxzKvMUEiviDxEIIl6E1WRr6zcMuKbMmRJpgQ0fMsQG+1EsKimBtWabgQM3xcZeLS2FifjW1tYHajV8omPZ6tVyb2/oDVm4kDOwMUAfPx/EbdqpNNVgVUlRCSTW2NCYn5O/Zf2WIYohwxw5bqIQ21da9B87Fr8KRc+cOcPTM+/8eXiNpeHp07SkpAk2NglxcfhIg0JNcQ1xD3b3n+ufcCYBpjQ0zzUlt0um/2n6wFkD8Xy+/eu3OCgS+e6703Db3LGmDNODERGfnTyZUV1dCyCsgD1+/DQzs2jdul3QJSYQxScD4ByFmhEQ7HpGxlA/P0JDAa/Eaz0XFxQHjyF9FCb9XAdxQvjryfC6iqsrOmMO2eKa5N8OIE6nB8xbBGE39lqu9DTOPnzoonejqa4t+Rv6tJ8M0EYZ7D247FiZVuyiQZyQ4MoJD5+amLitizRedpNewUAfZhFuZmW9NKT4f/KZMxTaIlQDRgeIsLY0Uxs7u+1JSbSjgk335NVF61yYPkV1CQtAaUVeXeCYosBAG77HBm9YUubTIzk1ldZEsP61WnSXkGBXFmD4+zlztiUmkg8keutWPZvuyf0YULP8eLkBvYl3pVbTncx0BQb5wRuWsAK2gut9Pr7s07FFTD5N8biDo8OVB1dI/GRfzCZRe9V14MuHmY2N+GvL+LhS1eola9fiuBmRmgs1vh6+RkogJ+eAnZ0NufMBA9xgGdrNrS+5CWhSF1iH939s3w5ldvzAgYH8u0h79+q16dNP4a1nEy9Ju3m6wfNV9PpoTiK8fb2PpB0BhQlvTeBUsDywj60tvLZ8rKyMc8uvg7PzgZwceC0e32BlbCrgkwEVKRUjh3DsE/rmz9/A1wScFE7Gy2HcuICGhgyomR07VuvZiyiTWcfErAC1u3fRiXSS3CgmOUjcidchf4gXH4vIg8QSklg6FP/4Az7JAQXm1pdjEojouPApEU9y8DkwLC7wCmbYJJg3xoClMsAKzFKPLBuXJBhgBSaJw8CSsFQGWIFZ6pFl45IEA6zAJHEYWBKWygArMEs9smxckmCAFZgkDgNLwlIZYAVmqUeWjUsSDBB9MkASmbIkXnEGYGfGKz4CIen/D3pQsiIrI2X
cAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 23, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# max-pooling is similar\n", "# result is not as smooth as for mean-pooling\n", "reduce(ims, \"b (h h2) (w w2) c -> h (b w) c\", \"max\", h2=2, w2=2)" ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMAAAAEgCAAAAADHhCPtAAARU0lEQVR4Ae1daXQUVRZ+2VdCIgSBJJAQwx4IMJFFFtnCIiCIoqw6I44O26DsHpkZzjB4AMVBUEQcRlEYiAzIiGEngDEHASGEBEPYDAkJypaE7ITOdCfp7nur6r16XV3d1Tmn/NP3u9u7372vlq4uots90rD/c2/Y5ROiE9B6gvoE9AnY2QF9C9nZQLvD9QnY3UI7E+gTsLOBdofrE7C7hXYm0CdgZwPtDtcnYHcL7UygT8DOBtodrk/A7hbamaDBT8CTuwFVZ1Izr94qrfL1a9K6dfcno7gDOR1zUtKuXi8qM/g3Cm/T6alY3s668T3YqjmamFSKKmkz7qVopLAL3NiReBUlCBk7qQdS0AAXAcOOtdniBO6jF7UXa5VoLr2/yyCO67VgoFgp0vAQOLPgvCiuVuE1c5GPtMUWbdmKTyTKN2UYsSpMNpE8AcOalY+oaeK+akm1cRrOv5JD9Qxc9yzVVm+QJVD+yiFWjmZ7n2CZ5W07Z1eynN56h2U12uQIFL54mp2hRVJrtgPb+uWblO1jDnttpVmS/pQ5W1VOkqmfFEyukM7Mpd01V6Z+sumv7EQyBF4/yQ43Wi8ukXWhOpybVUO1mQ3rtpslyU/2FvpsoWSQQLmvp0DBC0v75HK4+nzPOsyYE7j8F478hCyUb6N0nqU89ZPKWaxtxiSwmG97XzgoXZ+cNu0LOY86+ynWJmJtoQMT8QKNRj/dNaRx8b2Lh/YWI0ufvQjygnHHsafvsHFtW3gX3Ny/+xY2RJz2xgqAWAQGoguwx+w5webAwvc3oLGeaWO22PB5ehh2HrS+eb2ietUalJ58OAW7AsTYQidQ/YHb/xJsiQv+++foHmKXxWKDsAn7ztpprp94vr0d17UZu0KEHaGFfAqR28bBEJJR70GoZAvd2wMzkNHLIByCEElLh0Yk0wkUH4aO00ZAZJQnDwSKjGIAOMWkh9AxeJ0bhGRmFwTpHaIT+K4KpPBeBECdOBtoDCcB4BS/Q37zghA0npsR3o8QBHQCydBtqGV/WrR9AywiIWlA5hMNKdAv4FWITPLIcKjJuAsRlOkEfoRuoyCokz27At0lIPOJWegL3nBfURTes2kie72CSuA3dJXsJhEfCnS2EzgLogkZjVAtwCfZc2KHOo0nzXAZGXohJAYFYpWM5hqySzQoFjlgd2CiTuAGcJIXC9EpRd6fEDTgoAhxSOhjUJcHAZRVIlBDPcjgYlDOhyAaArPcxiyYPpE7NFAJFEIveRkdkvLuhJRAp8YQmOVGZsH0WQYBlKkEyqGXvFwh74I9UH5Uq9kv0CyYPh1OAF714MJUGR004pOoMQ4pkTtMSp0AdHKEjMqTnB/alcgd1kMl4A+95GVveRfsgSpCtZr9HpgF06cfBFCmEkALwAhp2WYC6N6nSCopUiJ36E0l0AR6ycs2DoyQMJjzKgT1cs0VqETu0EAlEAG95OUQeRfsge7VCm9jowldRyce5A6dqbcSraAXyWqGoAqgLcpxfgiCJnAWaWIQAoA6gfbAiZAchNQAcS
jJAYRqwR6k6o4QAFQCIejyfhSEqCNGo8Nyf40wa8lhqPHoAhGUqQRIT+j2L3TlhxalstvTMPLmdxCZ5A2VUNOLepKgExgGE9x5S9QiaFYiD0dBqxEi5O56pMDO0EQnkBAI/XbOeQihCvJwH5jkwjqIiGEGuoy5j0FWCOgEfMZCP7J1UCrC9eDR0T+tlNLL64KfRT7L4J6vWXIIGQfTT+qMJ3PZvQXbpteMwfiKXnzswP77ZPg2tBg3+GkocnV/a6H5nJ4/4wQyke0JGAPEIECm7QWOtaJP/+7tYh4LCKgoKrz/c9q57Ecm7ROnhG6c+EXcZtJ83PDwFlUFl3bvRwcwIXGMkyCLQE7vCp5SPPPNnePxBj4XBhoAYohfD6Yb6ccAIa3n0eOApVrpVS72NZCFIY5h1M/+BxBz+zPSWk2XraJt0tJIHv/g1Swv1gSIxybqPRTMqZiA/xbq9cma3/2zUCsQS0wCJHRPC3GISHNFpOFVdN7AXt+UZ9kgZjaZBFH/i2KG1xqVEyCjZRksmskuQIYAiT7CboApu+ItZIx9YTNzF7n/bRG7fvZBbIoN/np1I5kcd4pkHFjmMQcYMw7eNocVa7LJTcD4MsKrJ6d5sdPYMwLSKWUGrYhRJ+lXYHNFrAuZ2cf4IHNj4h0rwlKTkWMGKLySmRNdWrlHcNdisvRa0s/swPjkI0BI9ZEDyeILlnd8v/7xHoz0vKZr2xLzkG/w2Mk9kIIGeAmY4m9eyLhyK7+woqLGPyAgMLxtTExndEtMW4NPfzkl/WrOg1JDQEB4dKc+XXjbYgsBvkKc7EU7fpxchvLldALKe6dOpD4BdfqoPIs+AeW9UydSn4A6fVSeRZ+A8t6pE6lPQJ0+Ks+iT0B579SJ1CegTh+VZ9EnoLx36kQ2+AnwPtHJ3nI8t6JJ12cmyDzjUqetNmTheypR9fbn9T+mhH/E87TJhgLsdeUiUDHW+jOYx4cT7V1T1XiuY2COtX7yaO6PqhZgbzKeCZwdglbpdgRBjQHPBLbgGs+lY6wt4iHwvaDEYwKsKeQhgJ8bC94a1rR64+IcBMoeCoosFGBNIQcBP+HFLkjTigWLcxBwE/7UGiHIoSnkIEAGCCoUYoHZuZCHwFRcUpc4jLVFPATix8EaPVe4Qai1zEOArAe/t7l/0EfrmtH6XAT8vp1m9muxczKK1xzw3AuZirz0xYm88qZdRr7orXnJuABeAjjKhZB5a7hQSbaVohOwrV/qezf4CQhv1NRvkXzGiiu5+fkFN++WV5SXP/T1C4mIiI3vyFsY9SyU35m+8oF4us02S8mpn9IzcuufeMDQoJFjh3DtDl6iMLlactGmg2claq9NX7x9e9SsKRwPobhYqlWxIM+1FWdo9Zs8r88bkCqIkIBaEpAoB6uyRi9nMax1dmkCpGbNRLnXt12bACGH5Bi4OgFy/I94WwmRyxMge/8prBlh1ydAlqejigWgARAwzH0kKBrCBkCApCXCigUy9Urc8h70vBsDkUPkf0SHhgQEej24lrrjomCBNRPob5G6zgTGJ3SLDPXzDOkxO+XzppjB1X0YQ+Q6BEBVY5IFb7R/BYwC0SUJkLD/BqE6D1NfPed5Oo1SOQlELkMLGZIRhMA1J2D8+0vRsErS8Ah4zkMETiIEgatOgIxGb/bnlMCioeyyBAKGwDJrfoYIyi5LgPSDZeK/5gMtrkugIyyTFCAEgOsS6ASqJORXhABwXQIhnqBMUgoBlF2XAAmEdVK/GrswgcaQgPC3aovNhQmgJyrUR1wuTABte3RZs7TfKLgwgTJYpz8EUOYkoMEvqyXouA2FRUOZkwD9Kx1Mpqp8GWVrihAAnATwMcR6SgBy2ydeReERCAHASQAfQ+jwAslUFfGT6Rhabk4CHuiqcpuWTU39IZgsUPjGjMXISYCEWCKMAvXeFjrZKZ+6CRN0gwDJvASaw6gUCBwkv4/y9kQIAl4C6CA6h88QMJ9a8jG0g0hfal5eAvg79j
vUfCoZbryKEgX1RhACXgIdYBA5NP+hFVcfnTvRihRLi8CRdSThPsozDJ/GoY36Myt0Msq5XbEi6g8DwoIqivNyM8+cLSWNcrCVD50bjP3aJXTtGBpUduts4okabPmmP8YA8RIg3X8BUSIxk3qaE7laFUICVotAanOafivDu4XIcEFSDLMwVBnNpdfPfzf6ErOoS0yrncYo1trcE+jCfFUu284ameGr0LdjgSs3AbJYEImgIycwQXCso4X5txDpOwFHIuTAY6DjGrSSEPBPgKyi3hEScv+OMLFaOHwb9ctY7RI2EAjaEUavylF7KHJvK/qiJosNBEjkQcHVzJraq9QqqykNS5ap3yYCpMXBxX5S5YUtSE+Q0tuke9pD5P74J/9pLFIKFNxX4vq42xsFfwiIRAwb1deWOVoLwFfiLMOXey5ajYS0mz6Rvf1rnW0lQEhN2sm0X/JKyt0Cg5pGx8T0YBwYsBwJWUCgmfEvGCWnZ+YVlfgGtmwbN+QJiRCxynYC4hxKNWICCjIpG76ChRwVohNwVGd58+oT4O2Uo/z0CTiqs7x59QnwdspRfvoEHNVZ3rz6BHg75Sg/fQKO6ixv3gY/AS2/0PA2menX4CegE2DO1wlGfQJOaDJzCX0CzPY4wahPwAlNZi6hT4DZHicY9Qk4ocnMJfQJMNvjBKM+ASc0mbmEPgFme5xg1CfghCYzl9AnwGyPE4z6BJzQZOYSrNfRLIG3cnLzcnPvlFeUGf8/TH6+oRGtu8dHWawqCdfOXMr+9XZRRZWnf4B/QERUVGTHUI7UMo8WDVlpaZmZxRKJYsZPaSmhVqaq+WH3gXxxaMRTg4YEi9VIwyJQtDnldAnyRsBrymKeFqEYSfBw68dXJA1GpdfQ3w9ivDVq/P+93aOFEoLfJpHwC1n9nITWVtWxhdTya1Otm8zKaNdZ6P70FazcXDbDsvHs+kk0Mw/XQUzP8N6jpXQjj6X69d1ybh2YDnZNwJj5g+3M9LLGN2Xrb8l+bc5eAmSRxNlDtmyLw5atFpEmsAdg21uLUms8sGcP3X5HKiXWOZoA+SYTL2gLWss4S5vzOJxAzSbzUjZ/Vm7jCHE4AbKrnKMMSZfvCyXVSOneHkER4D+Nrohp3sTP33D7xoGdBShNScpQhPlBCnYNHt+7TZi/r6GypCD/4oXUuktsa1/sJET8BMY9XhsbHt5n/oJElOaIUgIXUZpJKwNqsYdXYPNuz5Ca80lJRgeZHUT4CVgXa/Txb8esiJB0CGyR0Rl4xHoc6hYX9/b1pH2xWCtCSggQ9zU9akAmxf/6oQwkIc9DUC9HzZwpoUUqZReyyH4wSeFtiGyQfaDvNQj4ZWUESAJaIRshfoBuEj7K4Q8EngoJxIMUhPyCED+Iga6FCUch5JUVEsCHVjHvagK/3yF8+/kpPyEFF1BIwDcEZi+FwAZ5hGD1pKEDNzO+YElmFqSQ9JFStoDKEghskJuNEzqfn9/+hS9sOicoJdAELq10AmRpEExTK1cfebPDc9seiPQ0hVIC3jCh4puhVhukvrAbjs1q+/K3lXAFuqyUADqFw6safSkpy4gFUlpCKr99ufO7XFtJFQLSRXBpF6+kVXB3dezCQvkctHD5SJU8XttRd5Moka7qs/itEmqs0pwAGXx6BvWG7O7sN6pwvSKkPQESuPzk9ABRYfWKxOeKaKY6vQsQIKTNqox3n5Q6HxlrTJ3OPkW4BAFCGr++P2NFT0kORz5kjsBFCBhrbPHGvsxVAyQOh+UXWQxch4CxyubTd1/Z9KzwS/CjVQ2GgLHQoPH/zl7bGVecVIgxQi41gbrKAqce/xTdI1Uno5IxcEECxh8tnk9C+ygD14yQpgQMqBQIOr4CUQ4EAllTAqd7rM4V1GOG6Csf6+ZaUwKZ19+NG71Z8qbzmpmJ3KemBH42/gP3H+Z3GLVRVO7hDbBw9PQCGoyyxIVD4OFAmFmb25CauiSiZ2y7sOYBPobq0qLfcjOSL6BVIx
HCQFMC1ktsbu5OXBdC3RDCQMstdLMY10JDXk/RLEa9lgSsA2AUaDQNR5c1gW9DIPBnQc0Iakmg7hhG5UiBSd2ltGadlgT4tlDHleZaJT81JPDwsmRFAmW7XdSvm7WeGhIoEdw1CyqvgyMPNpPUW5QaEgg5cmiSv6UQSSF801eNJA1WpYYECOmxPmttH8kvwrUFxn5wZry1Uoqk6ZWYkMCpU/P27P+xWlSde2zCKHRHKvKoV7BeeKLFqK4vSb+QdTP/TnnlQ3dvn8ZNmrVq07G7zOay1OASBCzVKBA0PQYU1CsK0QmIWuJkhT4BJzdctJw+AVFLnKzQJ+DkhouW0ycgaomTFfoEnNxw0XL6BEQtcbJCn4CTGy5arsFP4P+4MGYl5KXrvAAAAABJRU5ErkJggg==", "text/plain": [] }, "execution_count": 24, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# yet another example. Can you compute result shape?\n", "reduce(ims, \"(b1 b2) h w c -> (b2 h) (b1 w)\", \"mean\", b1=2)" ] }, { "cell_type": "markdown", "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "source": [ "## Stack and concatenate" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " with 6 tensors of shape (96, 96, 3)\n" ] }, { "data": { "text/plain": [ "(6, 96, 96, 3)" ] }, "execution_count": 25, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# rearrange can also take care of lists of arrays with the same shape\n", "x = list(ims)\n", "print(type(x), \"with\", len(x), \"tensors of shape\", x[0].shape)\n", "# that's how we can stack inputs\n", "# \"list axis\" becomes first (\"b\" in this case), and we left it there\n", "rearrange(x, \"b h w c -> b h w c\").shape" ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "(96, 96, 3, 6)" ] }, "execution_count": 26, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# but new axis can appear in the other place:\n", "rearrange(x, \"b h w c -> h w c b\").shape" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ 
{ "data": { "text/plain": [ "True" ] }, "execution_count": 27, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# that's equivalent to numpy stacking, but written more explicitly\n", "numpy.array_equal(rearrange(x, \"b h w c -> h w c b\"), numpy.stack(x, axis=3))" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "(96, 576, 3)" ] }, "execution_count": 28, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# ... or we can concatenate along axes\n", "rearrange(x, \"b h w c -> h (b w) c\").shape" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 29, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# which is equivalent to concatenation\n", "numpy.array_equal(rearrange(x, \"b h w c -> h (b w) c\"), numpy.concatenate(x, axis=1))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Addition or removal of axes\n", "\n", "You can write 1 to create a new axis of length 1. Similarly, you can remove such an axis.\n", "\n", "There is also a synonym `()` that you can use. That's a composition of zero axes, and it also has unit length."
] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "(6, 1, 96, 96, 1, 3)\n", "(6, 96, 96, 3)\n" ] } ], "source": [ "x = rearrange(ims, \"b h w c -> b 1 h w 1 c\") # functionality of numpy.expand_dims\n", "print(x.shape)\n", "print(rearrange(x, \"b 1 h w 1 c -> b h w c\").shape) # functionality of numpy.squeeze" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAfBElEQVR4Ae2dB1QU1/fHfxTp4IpUqSpdBERUFAVRREFQUYgVC0aTKLYYNbbYY4wtthhTJEZjFxWiIDYkCBoEUcDQpEtTem/6f7/D73D8LyzcXWZmZ5fr4Xhm3nzfffd+ZuHuzLy57z//wX9IAAkgASSABJAAEkACSAAJIAEkwAwBCWaGwVH4ImBsZjxnyZwx48foGujKycuVvC1JiEsIvRF67c9rTY1NfJlCMRJAAkgACSABJgjIyMrs/XFvXkte/of89j8x2TEOzg5M+IFjIAEkgARYTwCvwFh0iuTk5C7fv2w3yq4Tn1paWtYuXnv5zOVONHgICSABJNATCGACY9FZPnHuhNdcry4dampq8h7rHRMV06USBUgACSABMSaACYwtJ9dmmM3tf24DvYmPiXcf7g4UowwJIAEkIJYEJMUyKlEMau6SuXC3SbaztLGE61GJBJAAEhA/ApjA2HJOHcbxNzvDcYIjW1xHP5AAEkACwiCACUwY1DsaU0dfp6Nmnm1khj3PY3gACSABJNADCGACY8VJlleQ79WrF1+ucFQ5fOlRjASQABIQMwKYwFhxQutq65qbm/lypbKiki89ipEAEkACYkYAExhbTmhBXgFfruRl5/GlRzESQAJIQMwIYAJjywmNvB/Jlyt/3/ubLz2KkQASQAJiRgATGFtO6Plfz8NdIaURXzx7AdejEgkgASQgfgQwgbHlnMY+iQ26FATxhjwt27ZmG0SJGiSABJCAGBPABMaik7t60eq4p3GdO/T+/fv1S9c/iXjSuQyPIgEkgASQABJglACp57v/5/28qtHH5saOcRnDqEM4GBJAAkiArQSwFiIbz4yJucncpXNHjxutY6AjLy//3/XAnieE3gy9dvZaY0MjGz1Gn5AAEkACSAAJIAEkgASQABJAAkgACSABJIAEkAASQAJIAAkgASSABJAAEkACSAAJIAEkgASQABJAAkigJxIQ5ixEWRmZEba2jvb2VubmxgMG9NPUVFJUJI119fW1dXXvSkszcnIys7NjXrx4/OxZRlZWTzw/GPNHBDQ1tfX1++vqGujoGLT+r6amLi+vICdHpmr+93/yIyEhUVdXW19f9+5dcV5edk5ORnx8TGxsVFZWxkeWesSmjqHhUGdni6FD9U1MdAcOVOZw5BQVJSUl62pqaquqCrOzc9LT016+jA0PT42Pf//hg3hDkZOVMzUwNdA20NHQ0dPU01HX0dXUVe+jLi8rT34U5BTk5eRlpGXqGsinp7a0sjS7IJv8PE95Hv0yOjE9sbmFv1rbYgDTyEjP3n6whcUAMzNDbW01TU1VDkdZTk5WVrZXU1NzTU19TQ35KNVVV9cSUq9f55Gf9PS8pKT0oqJSxsIXTgKb5Ow8d/r0aZMmkYwFDDUtM/NSUNAfly6liWYm09LVisvt4iVlIIo2medIT1K/o21XzDZIK
jI1HWRtPczKytbCwtrcfLCKCkfgGNPTk2/cuHD+/OnCQjEvgtzP0NDd19d93jwDExMgroqSkruXLwefOZP49CmwC/tlygrKI61GjrAcMcRsiLWJtaG2IUnegrldUV1xM/zmpbuXQqJCPrwX80w/duzQmTNdPTzG6OpqCoYrKys/IiIuNDQ6JCSqvLxSMCPAXowmMPInab6Pz4bly82NjYH+cclIHYprt25tP3ToVUoK1yGW72ICA54gkqUWLPhi1KixQ4eOVFJSBvYCypqaGi9cOH3gwDZyfQbsIkKyAebmizdvdp01S1JKSjC34yMjf9658+ndu4J1Z0MvjjLHf6b/5NGThw0aJiUpIAdegaTnph84e+D0zdNNzU28NCLa3quXtJ/f1DVr5pqaGlAVQmNj0+3bj3/66cqdO3RVDmIugY0YMuTE3r1Dray6T6epqengqVM7Dh6sb2jovjVmLGACA3K2trYLCYkBigWTlZWVbNrkf/PmRcG6s7CXnILCF7t2zVm1SuDU9XFQj4KC9vn7F+XmftwoKtt2FnYxZ+n9/CS+Tlz+3fKIuAhRYdKlny4uI44f30Bh6uIa0c9vR0BAEFcjJbsCXlPzNTa58NqyevXjoCBKshcZmixe/LW//983buhqafHlCYqRACHQp0/fkycvrF+/SzxomA0Zcjkxcd6XX1KSvQgTpylTriYluXh7iwcfyqOwHGgZ/nP47uW7JSSZuwCgPIpWgxISkt99tyIs7AR92YsMlJaWQ5P/tCcw8mA9+I8/dq1fLyXobQ1ekdtZW8eEhpoOHMhLgO1IoBMCq1dv2bjx204EInFo0uzZAVFROv37U+utgrLyvitXlu/eTa1ZsbFGvpRv9tsc/EMwmRsiukFJS0tduLBnw4aFJBxao0hMfE2TfXoTGKd37/tXrkweP54m77U0NIh9Q319muyjWfEmsGLFRh+f+aIb4zQ/v13nzsnI0fU31G/z5vXHjokuH7o9n+wwOfiwCOewU6c2k/kadFPKyysqL6+iaRQaExiZb0nmNY0cOpQm11vN6mhpBQUEkCLutI6CxsWVwO7dx7S0dEUxuomzZm359VeBZ9YBQ57p77/q+++B4h4ocxnh8ufuP0Ux8CVLvMisDQY8p+/yizhPYwI7e+zY6OHDGQA02Nz8yM6dDAyEQ4gfAWVlle3bD4hcXBZ2dttOn6b7zk8rlvnr1nnMF+HrVLpP7vRx079e+DXdo1BrX0ND9eDBL6m1ycuaSCawZYsWeXt48AqJ8val8+aNGjaMcrNosCcQ8PT8xMyMgsmxjLFSUFL6/upV8gIuYyNuPnXKwNSUseFEbiAyocPG1EaE3N6wYYGysgIzDicmptM3EC1XYGZGRge2bqXP6Q4tH9+zp8N2bEQCnRMg1zGLF/t3rmHV0dUHDmgbUPayDiQ08piNXPBJ0vyoH+IJOzXkhbNftvxC9+1cqmKXlZVZtGgKVda6tJOQIGoJ7MiuXWTyYZeBUSsYYmnpMWECtTbRWg8hMHXqLFKGSiSCNbO1nb50KfOuWo8aNXnBAubHFZURyftnvu6+IuHtuHHD+vRRYcZVUnri1atM+saSptw0ySKuTk5ws5VVVddu374bEfE8IaGkrKy8srK3kpJa376WZmbu48d7ublxVKCs133xxV9sLSJQmFfYT6IfBEsftT5Jb5Mgyh6u+eab1a9fpxQU5JeVFdfV1ZMSiOQrsJqahp6eoYuLx/Tpc7S0dICISMkPB4ex9++HAPVClK3ct4+vR18NdXV///VX2MWLWcnJxfn5TQ0N6jo6mnp6jp6erjNnqvcDfSZb4126bVvo+fNNjY1CDJ/CodccXJOak1pcWlxSUUKq+lXVVpEKCcqKyib6Jo62jvPc51kZ83djeZPfprO3z5I/2RQ6SYcpUiwKbra0tPLChTuRkc9TU7PfvCkmJRBra+ulpCTl5GSUlBR1dNR1dTUGDzaysTF1dLRVU+NwWc7IeFNfT2O5C
eqn/z8LDQW+sNzS0rL/5Ml9J06UV1Rwhd22Sybib1m1as3SpcDLc+NRo9JFs1hiW8jwBCaWtRDhlThsbLSLiwvbuHFtkNkZe/Yc9/aGfin+7bejW7eu4jLCtl0re/uA6Gi4V9F37uxYtOhtQUGHXaSkpUlO8tu4Ef4G9E4/v5sBAR1aY0kjvBKHtqt2YQnPzw8JZ/r46Se/PqmhqgEPbfpX068/vA7XC0V5+/ZRNzcHyNCkfMbKlftJuV6ImGhsbc2mTXOeNm0sSWmtXW7cCPfyWgvsLoCM4mdgzqNHA7NXVXX1ZF/fjd9+20n2IvGQo1/t3Om9ZAmwatSsadMEoIBdxI9AVVXlqlULIiLuAkMbPNgWqBSijExqh49+9sAB/0mTeGUvYqelufnk1q2rPT3ft7QAzXovWwZUioEs8H6g3Tw7Uv8QHovfVD+4WFhKHR1QSg4KekRKQMGzFwknLi75m29OWlnNHDhw6pdfHiYlfePjU2gNk+IEttIPdP4+fPgwz9//Tng4MLbrISHLNm6EiMktR4gMNT2BAPmYbdjwGfkfEqyJiQVEJkRNb1XV8T4+QAfuX7v2w7p1EPHjkJAj69dDlERDpu+b2tgAxWIgyy3KdV3uSqrRA2Nxc3BT56gDxcKSKSrKQ4Y+fz4UIutQk5GRd/jwOSenJTt2/NyhgKpGKhNYbxUVt3HjIJ798uefQWFhEGWbJuDixbBHj9p2eW3YDBpE3OB1FNt7GoHs7MzHjx9AouZwVMnzM4hSWJqxU6f2kpGBjF5ZWrpz8WKIslVz7tCh5Lg4oH7cjBlApXjIMt9krvsB9FWAxEumI04YOYHlgTc0NEI8HDhQDyITrobKBEbW9yLLUXYZT0Nj47aDB7uUtRfs//HH9o1cLeRRmQMjb09zjYu7rCVw794toG9GRuZApVBkY728gOP+tnt3Ne/nyh0a+QVcCoDM/ujQghg3BgQFpGanAgOcaD8RqBSWDFjY6auv5hkaQqdBCSsWKhPYBNjkw9v37xcWFQkQcHh0dHVNTZcd7ahYsaXLUVAgKgRiY6GzHgwNB7A2KPIOlp2zM8S92urqKydPQpQfa8Jv3izMyfm4hde2ibU1R02N11GxbCfLMe85vQcYmoMNaH4E0BodsuTkLIhZMtX+yZPfXV3tIWJhaahMYA52dpAwyAMtiKy9prm5OS4hoX07V4sFeCFaro64K5YEkpLigXEpK/cGKpmXDbC0JAU4IONGBAc31NdDlFwaksO4WnjtkidhvA6Ja3vgg8D6RhDVAToDyGLQbObw9Gki0D1NTdU7d07cuHFwxAhLYBeGZZQlME11dUM90D3TmBcvBA6y6O3bLvsKvNxzl5ZRIIoE6uvryQqWEM8VFUEZAmKKcs0gcM54cPWqYKOT18WAHXtgAquurQ55DPrmTd7SG2Q0CEhSKDIyvZCvl9WmTh375MmZZ8/Off65t6oqu77kSVNF0Ay8Lte/gLkY3fGK1KfvTnfsK34ECgvfkEUsu4yLvM7cpUZYAn3wfYWk2FjBnEyNjwd21DMyAirFSfbw2UMvZ9BjSENtwycvn7A29sLCkkuXwmbPnsSXh0OHmpOfo0fX3b//z/XrD2/eDC8qKuXLAh1iyq7A2LMolyqHQ5ZspgMW2hRRAqWloCswBQVF1gaoBSt+SOZuFGZnCxZFaXFx+bt3kL5aPXIFvoS0rp9ftNLrp94PglGImk2bTgCncnA52auX9KRJo8hCYm/e3AkL+3HhQk8VFWH+1lCXwGD3D7lw0LFLLuHVVFXpsIw2RZRAY2MDxHN5eWH+KnbuoaaubueC1qPZqdDJch1ay01P77Cdq1ED5gxXL1HffZEGffahrabN8mCzsvIXLNgGfEWyw1hINakJE0YEBGwvKrp39er3Xl7jSI3gDpW0NlKWwPr0ZtG9USUGV5qg9fSgcUoINDSAHr/zVWOQEsfgRoAzOKrLy+E22ytrKivbN7ZvkVdkb6Zv7y1VLWWVZ
U3NTRBriiz+JtTmP3kStnPnL227Am+QoogzZowPDNyfm3t7x47PybwPgU0J0JGyBKbAppyBCzQL8FEQ4y7AKzA2E5BTUIC4Vw3LQLxM1VRV8Tr0cTvQmY+7iMc2KfgLCUReVh4iE7pm+/ZTK1Z839LynhJP1NX7fPPNkuzsW8eObeBwVCix2aURyhIY8+undBKbrKxsJ0fxEBIQOQLSsMe6pPZ8d0IDdgcWBOmOJ+zsW1FVAXFMhJ7BHz9+ycNjVUEB6NknJHZyI9Hf/5PU1EBmlhyjLIFBYkMNEkACghEAvtrVzWWagTcqgXlOsEjZ3At4k5ksy8LmKLh8Cw2NMjX1OnToXFNTM9chgXfJ1djp09v++GOnjAy98+koS2C13fvqJzCpDjs2NoAe2nfYFxuRAAsJAHMGMAPxClARVkeUrL3Gy4J4tyspKEECbIBNGoKYYkZTVVW7du1hCwvv48cv81V+vnP3fH0nh4Wd4HBofDuFsgRWJ9DL/53HL/DRxmbKvkoI7AN2RAIUEgDWNlTmcLozKLB7N6eKdMdD4fYFzs4gy2MK10/BRk9Pz12xYp+envuqVfujol50Z45imwNOTkMvXPi2bZfyDcoS2LtS4b/U1kanBlAysU3Mto0P70HLf7DNbfSHVgJFeXkQ+/rGxhAZL42BqSmvQx+3A535uIsYbJMCUcDZGcVlxaIbL3k/7OjRiw4Ofrq67qtXH4iM7G4mI++NbdiwkCYglCWwbNgvGE1hcJkt6d5kYi5rDO/yVeWFYd9wOGERAFbaVVFVVdXUFMxJUl8DOD8e6IxgbrC2l6khKLsT/4tLRTiBtfHPzy8+cuTCmDF+Ojpuy5fvu3fvqcAPyXbvXta2RnObfUo2KCsllZWbC3RIy9oaUtIQaE38ZE2N0CfAUtJS4hc+RtQhgcxXrzpsb99obmtL1qhs395ly6Bhw7rUtAqykpOBSnGSGetDr26zCrLEKfCCgrc//niZ/PTureTuPtrLy9nDY4y8vCw8RmlpKTLD3sdnA7wLUEnZFVhiSgpwyP49sg4NEA6RNdY3AsWKSopAJcpEncC/4AUnx0yeLFiwLuDlnpNiYgQbQqR7Odk6Af1PyYb+MQQaZImsoqL6woXQTz7ZoKnpsmTJrhcvUuGOkYrAdLwcRlkCKysvT83IgMQzcexYiKzHalret1RXVUPCV9NQg8hQIwYEslNSgPM4HKdMESBeBWVlBzc3SMeW5mb48s0Qg6KicR/tDnGVvOycX5wPUYquhsxa/PXXGzY2s+fM2QysqUiKKE6caE95yJQlMOJZFOx72bIFC5RhKxtRHq2oGCwvLYe4amoJvSkPsYYalhN4cvcuxENNPT1n8NrNbQbnrF4tIyfXttvJRnxkZH1tbScCsTw00nqknqYeJLSYpB50eUouyBwdP62rA722ZG1tAgHIl4bKBBYUFgYZW0NN7eS+fRBlj9UU5RdBYh81dhREhhrxIPB3cDAwkCVbtwKVrTKywvL8r74CdokICgIqxUm2efFmYDhRL6KASmHJJCSo/LOfkJB+6tQ1SCz9+/eDyPjSUBlJyIMHVdWge19zvbx+O3RIhAqu8MW0++K87DyIEZthNkZmRhAlasSAAMkcjbC3LU2HDJm/bh0wZEkJiR0BAcBXmMkU2XuBgUDLYiNzGeEy2WEyMJwHzx4AlcKSjRw5OC3txtatnxoYaFHiQ3x8CsSOsjLoNXCIqTYNlQmsvqHhMvhLot+sWTEhIY723borKikl5erkdObIkW1r17aFJAYbGWmgp4kk0m0HtolBvBgChEBlefndK1cgSqJZsXcv8JnW2iNHRnt4AM1Gh4QIvN4YcAi2yQz7GV7cexHoVUV1ReTzSKBYWDIrK2MjI72dO7/IzPzr4cOfyTrL3Swhb2SkL6xYqExgJIYDP/0Ef3/b2sLiUWDg3zdueLm58VULuLeKyvTJk3//4Yfily/vXLgw38fHdvBgYRGkY9zkhGSg2fGTx
+/9cW+X17LS0tJOrk77f95/JvgM0DLKWEjg0vHjQK/Id7sfgoO/2LVLSlqaVxeyxtjJe/dmrVjBS9C+nUymbt8ooi1H1x+1HGjZufMT7SdG/x7dt3ffzmVtR4P/DgYuudLWhfkNS8v/3bYhpR3Hjh168uTG/Pw7jx79snLlLJLY+PXHzc1hzZo5kF7A6R4QU20anh/uNgVfG8lpaddDQqa7u8N7jR4+nPyQq7f7kZEx8fGvUlNT0tLelZfX1NZW1dQoyMlxlEmBG05fDmeQufkwKys7a2tzY2Mpqf/3CpTpwIHwEdmvfP7Pc7iTC75YQJLTmRNnIh5EFOQUVFVUySnIKaso6+jr6BrqWlhZ2NrbkpuNrXPuqyqr4JZRyTYCSf/88/jWLQfYRHmSwz7dsmWqn1/YpUsRwcHkyultfn4vWVmNfv36m5u7zprl6OkJnLjRyuHVs2eRt2+zjYnA/viQ9wZcfF5lvLoVeSsuOe5l+svikmJyCUXqRWmraw+3GD7Pfd744eOBBXxb3Th987TA/jDW0dqa+4U2SUlJR0db8nPkyLrs7AJSfePFi5SkpIy8vKL8/LfV1aRQYAPRyMhIKyjIk9qGWlp9DQy0bWxMXF3tbWygU8kyM/Moj1GCcouG+vqvwsP5uqLqvg/Nzc3yAwaQ/7tviiUWol9HGwwwoMMZWx3bwvxCOixTYtPa2i4kBDSPy8ZGu7gYFMhPP12cMmVml+5dvnxm9eqFXcqEKzC1sTkXG0v+mjDvxvJJk57cucP8uHyNaGdhF3MW9PnhyyxEnJaTZuJF/UQ7yNB8acrKwmktsMvLmalTvySraPI6Klg79b8GWTk5u48cEcwbgXuRW2QD9Pi++BV4OAY6hgWF0TSKySAR+B2jKXYxMJsSH3/p2DHmAyHrxrM/ezGP5eMR9/6+9+Nddm7r6WkKJXs1NjY9ehRLORPqExhx8bvjxx9ERlLua+cGTY3+d2O3c5moHL1yBvq4nt+ITCwwgfHLjF3645s25b1+zaRPlaWl+/z9mRxR5MZKz00/+9dZ9rvd9gCMYVeDgyNIIQ/KB6Ulgb1vaZm9bFnOmzeUu9uJQTFLYInxiU8innQSr8CHjM2574ALbAo7CoUAeY943YwZdUwtuUB+nTfOnl1aBHo3UShA2DCo/3f+zS0i8AiDTEEUCq59+87QMS4tCYw4WvzunbOPz5tC0CMKSgITs3kchMn+bfspIcNlBK/AuICI4m7qixff+Poys3DBkXXrnsBqFIgiSUp8Pnvr7J0nbH862Brp4MFCSGABAUExMUmUoOYyQlcCI8NkZGU5T5+enpXFNSRNu+KXwKLDo6+evUo5LkxglCMVisEH168zkMNObd9+7vBhoQQoKoMmpCd8/u3nouKtlRXTj1pIqY6VK2n5Lk6Y05jAiPW0rKxhbm53wsMZOLvil8AItC0rtqQnp1NLr0/fPn3V+1JrE60JhUDI+fNf+/jQdC+RXN4d3bDh5x07hBKaqAyaU5gzZc2U2vpakXCYVNQ1NaVlbjOv8F+9ynB1XVZdTRcfehMYiaq8omLSnDnLNm2qrKriFSQl7aTEIkdFhRJT7DFSWVHp6+6bn5tPrUvGFkK4jUBtCGitlcD9wMBFo0blplP8LaeyrGyNp+eZ779Hzp0QeJ332mmJU1Z+VicaVh1SUlLgaw2Ubjp/40b4yJELCwtLummnk+60J7DWsU/+/ru5k9Mvf/7Z1NTUiTfdPCSWF2HZmdke9h4vY192E05bd7JgppKiUtsubog6gbSXL2dZWZ07dIjMtqAklgeBgd4WFuL0zjIlWLiMBEcE282zE6HsRfwvK6scPny+vf0C8lCqpqaOKyIKd3NyCslKK15eaysrayg0294UQwmMDJxfWLh03TpjB4dDp06RKR7tXRG45W1Jya/nz5PrvNiEBIGNsLkjee/Yc6TngW0H6mq79ZkjV3KHdh4aZjDs3u17bI4XfeOXQH1d3eG1a2cOHnz38
mV4Lbf2o5ClUj4bN45McSxhcPpVezeYabn79K5g8wYL3hX4bvUldw7Lq8qZcZXaUZ4+TfTz26Gt7bp48a6IiLjufGDaO0YK+3722R5j42lkpZX2RylvkaDcIsQgee94krOzh4sLKcUr2ALNDY2N0c+ePYyKIi+cRcXGUvXdE+K8EDVkBctPV346w3cGKRMFdyM3K/feX/dCAkMehz+m9sMK9wGuxEoccFYdKvWMjKYsXOju66sFXvqc3DAkmS8oICDx6dMObYpQI7wSh7artoSkxKdTP/V28bYytoLESOpOHb98nBQUFZWHXpCg9PW1vL1dPD0dHRysyUMySBcuTUvLe5K3bt2KvH79IbAyPZcFgXeFk8A+dldPW9vG0tJ60CCTAQN0tLR0tLX79O5NKlGRH1KFjFRErCY/NTVkoRbyYllyOpnTkE4qLr78919SPvFjOz1q29rOmlSQtBpqZTjQsJ9ePyUVJXkFeZKcqiurSbXDd8XvMlIzXie/JrDiYuIKcgt6FBwMtpWAoZnZUCcnM1tbfWPjfv37K5FfK0VFUoOKTPqora4uzMnJSU0ltx/jIiLICsvMzMhn4NTwlcAKS/73ng9ZrNLF3sXWzNba2FpfS5+jzFFSUKpvqCfLK+cV5f2b+W/sv7GhUaEp2SkMhCCsIZSVFYYMMSMVDgcNGkgKdujoaGhoqCooyMnK9pKR6UWyFKmmUV/fSGryvntXXlRUkpmZn56em5CQ9s8/SbW19UJxW/gJTChh46BIAAmIJQHBEphYougJQTH3DKwn0MQYkQASQAJIgDECmMAYQ40DIQEkgASQAJUEMIFRSRNtIQEkgASQAGMEMIExhhoHQgJIAAkgASoJYAKjkibaQgJIAAkgAcYIYAJjDDUOhASQABJAAlQSwARGJU20hQSQABJAAowRwATGGGocCAkgASSABKgkgAmMSppoCwkgASSABBgjgAmMMdQ4EBJAAkgACVBJABMYlTTRFhJAAkgACTBGABMYY6hxICSABJAAEqCSACYwKmmiLSSABJAAEmCMACYwxlDjQEgACSABJEAlAUxgVNJEW0gACSABJMAYAUxgjKHGgZAAEkACSAAJIAEkgASQABJAAkgACSABJIAEkIAoEvg/u1KxSzV+aO4AAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 31, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# compute max in each image individually, then show a difference\n", "x = reduce(ims, \"b h w c -> b () () c\", \"max\") - ims\n", "rearrange(x, \"b h w c -> h (b w) c\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Repeating elements\n", "\n", "Third operation we introduce is `repeat`" ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(96, 5, 96, 3)" ] }, "execution_count": 32, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# repeat along a new axis. 
New axis can be placed anywhere\n", "repeat(ims[0], \"h w c -> h new_axis w c\", new_axis=5).shape" ] }, { "cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(96, 5, 96, 3)" ] }, "execution_count": 33, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# shortcut\n", "repeat(ims[0], \"h w c -> h 5 w c\").shape" ] }, { "cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAASAAAABgCAIAAAAyzjgPAAAJXElEQVR4Ae3de0wUVxQG8KKCivhWxKgUFKWiVKtohBCltaL4ahpbNNEG01ipr9BIrI2Nj/5RjUlJoxXTaESNmJrIKqYiUhBFrEUFW6gKitLWB2q1KL7f9DSbbDasnBl37p3duf0a0wxz7txZf3c+Z9mZvePTUFv7Gv6DAATkCDST0y16hQAE/hNAwHAcQECiAAImERddQwABwzEAAYkCCJhEXHQNAQQMxwAEJAogYBJx0TUEEDAcAxCQKICAScRF1xBAwHAMQECiAAImERddQwABwzEAAYkCCJhEXHQNAQQMxwAEJAogYBJx0TUEEDAcAxCQKICAScRF1xBAwHAMQECiAAImERddQwABwzEAAYkCCJhEXHQNAQQMxwAEJAogYBJx0TUEWniQ4PGTJ8dOnjxcUlJRWVldU1N7/fq9+/dpZetWrfxbt+7SqVNocHDv4OBhgwfHREX1CQnx4Ev1yK7hw7NbwsfH/IlHGxoa8g4dyrTZ9uTlUaJ4REe1b2ho4uTJSYmJtOBYqeQCfPhhtZaPqQF78eLFtqys1enpldXVPGJT1WbNmk2ZMGH5woUDwsObamPd9fDhx86KPuYFjN4NzluypKyigkfUU/X19U1NTl6emtqqZUs97S3RBj78MFnUx4yA0T88K9euXZGW9vz5cx7xlapRgwbtzsjo2b37K23lhY3hww+KpX2kB+zBw4cfzp6978ABHtG9alBg4CGbLbxPH/c294at4MOPgtV95AbsVn39hBkzfikr4xGNVHsEBRVnZ9PnjUY68dS28OHlFfCRGLBHjx+PmTr1yPHjPKLxamT//sdycujDfeNdmdkDfHhtNXwkXmj+aMECE9JFg/R7ZeVny5bxo+WFVfjwg6KGj6yApW/enLV3Ly8osLohM/PnEycEdii7K/jwwsr4SHmLWHX+/JD4+IePHvGIYqtvDRxYlpfn4+MjtlsZvcGHV1XJR8oZLGXpUpPTRQP266lTOQUF/Mh5SRU+/ECo5CP+DLY3P39SUhIv6Fxt17btlPHj3x05ckhkZOeOHTu0a1d/9+7NurpTVVX04f7u3Nzbd+44t2eWR44YUbRrF9PAG0rw4UdBMR/xAYsaN07n7RrNmzdfNGfO5/PmdWzfvil0+qD26zVrvt2wga42NtXGeX310aNh3n1bMHycx8t1WTEfwW8RC48c0ZmutgEBOdu2rVqyhEkX6VP1m2XLsjZu1HlX1I7sbNcx85418OHHQj0fwQH7LiODF7RX6aOIzHXrxsbF6WlMbd5PSEhftUpPY3pLqaeZp9rAh5dXz0dkwOrv3MktLOQF7dVPpk+fHB+vp6WjzcfTpsWPGuX4samF306fppfRVNWz6+HD+yvpIzJg2fv303fgeESqtvTzW5GaqtnMtcGiuXNdVzZaQ7+qmXN1u9F+9fwIH15JSR+RAfupqIgXtFfHjx7dvVs3PS0btYmLjg5o06b
RStcfdf4S6Lqh7DXw4YWV9BEZsKOlpbygvUq/UOlp5tqmRYsW9FG+6/pGa86cO9dojZf8CB9+IJT0ERaw6zdu/HnpEi9or9L3uPQ0e2mbbl27vnS980q3vy7t3InwZfjwpKr6CJv0hm5v4QUd1Qgdn1U4GruxcOXaNTe2kr0JfHhhVX2EncF0nr54ZSHVutu3nz59KqQrgZ3Ah8dU1UfBgNGsQzfq6vjhNL/qPQcQfPjRF+sjLGB0TxP/us2s3n/wwMzd6dkXfHglVX2EBYzmTuAFzaw+MvebMnr+avDhlVT1ERYw87+fwgyYnuvdzOYySvDhVVX1ERYwng9VCPw/BYQFjGaT9x5BPz8/73kx9lcCH35EVPURFjCvmtTJz9eXH07zq/DhzVX1ERYwehgKL2hmtY2/v5m707Mv+PBKqvoIC9jrPXvygmZWO3XoYObu9OwLPrySqj7CbpUK6dWLF3RUr5WX67ml0NFejQX48OOoqo+wM5j+5wn9cfEib61kFT78sKrqIyxg9K6sX+/ePKK9Sk/f09NMsTbw4QdUVR9hASO+mGHDeER7df3WrXfv3dPTUrE28OEHVEkfkQGbNGYML2iv/n3z5qeLF9MtlXoaq9QGPvxoKukjcl5EehxGYGSkzrMTTWLz/erV9KxKHl2lKnz40VTSR+QZjKYuTJw0iUd0VDN27BiWkHC4pMSxxo0FemQmTeSQlJLyVVqaG5ubvAl8eHAlfUSewYiPvq4/IC7uld7+xQ4fvjA5meZI1H+zDM3vVVBc/GN+Pk2z/M+tW7RfmgRuz5Yt/Ph5QxU+/Cio5yM4YMQ3ZdasXfv28Y6uVfrX653Y2OGDB0f06xceFkaT1Af4+9McUnSTNc1Nf7u+noJEs9WXVlSUlpfTMDR63DM9RbaquNi1Wy9cAx9+UBTzER8wusxFJzGTv31AE049rKmh//OD5w1V+PCjoJiPyN/B7HD0uOQvU1J4ROHVZ8+e1Vjk+jV8+NFXzEd8wIjvi/nz6f0e7yi8evbCBeF9SuoQPjysSj5SAkbPJfph/frgHj14R7HVs7rnjRO7Xzd6gw+PppKPlIARX2CXLoU7d/YICuIpBVYtdAaDj+a4K3P8yAoYCfYJCTlos5n2ODxrBQw+mhlT4/iRGDAS7Bsaejw3V/9zwDTRmQaWCxh8mNG0lxQ4fuQGjJjoEZW527enr1xJz2LWBDXSgG5x1P80ZyM7ErstfHhPq/tIDxjx0fMs586ceaaoiJ67J/XmQwt9zuF8VMHHWcN12dI+4i80uwI5r/nr8uW1mzZl2mx0wnFeb2S5a+fO740d+8HEiaNjYy1xrZn5y8KHwaGS5XzMDpidj64L7z94cG9BAd2q694XnOkxmdFRUW/HxNAFt+ihQ+mDXX5grFWFDz9eFvLxTMCc+S7V1tKDlctPnz5XU0NPHrpy9SpNU053WtEfummY5oey35TYNiCALqy9ERZm//NmRATdvujcj6rL8OFH1st9PB8wng9VCFhawIwPOSwNhBcPASMCCJgRPWwLAQ0BBEwDCGUIGBFAwIzoYVsIaAggYBpAKEPAiAACZkQP20JAQwAB0wBCGQJGBBAwI3rYFgIaAgiYBhDKEDAigIAZ0cO2ENAQQMA0gFCGgBEBBMyIHraFgIYAAqYBhDIEjAggYEb0sC0ENAQQMA0glCFgRAABM6KHbSGgIYCAaQChDAEjAgiYET1sCwENgX8B86gmBRxzD24AAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 34, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# repeat along w (existing axis)\n", "repeat(ims[0], \"h w c -> h (repeat w) c\", repeat=3)" ] }, { "cell_type": "code", 
"execution_count": 35, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMAAAADACAIAAADdvvtQAAAP1UlEQVR4Ae2de0xUxxfHfyCgIr4VNSq1yk8qSqUKRghRWiuKik1jiybaYBor9RUaibWhEe0f1ZiUNFoxjUbUiKmJoJiCSEGUYq0PsIWKoOi29YFaLQI+8O3v/LLJZrMLy+zemXPvPTmEkLuzZ+bc+Z4P9zFz51yvVw0N/+EfVsBTBbw9rcj1WIH/K8AAMQeaFGCANMnHlRkgZkCTAgyQJvm4MgPEDGhSgAHSJB9XZoCYAU0KMECa5OPKDBAzoEkBBkiTfFyZAWIGNCnAAGmSjyszQMyAJgUYIE3ycWUGiBnQpAADpEk+rswAMQOaFGCANMnHlRkgZkCTAgyQJvm4MgPEDGhSgAHSJB9X9tFRgidPn54+d+7nU6eqa2vrLZaG27cfPHwIhV27dPHv2rVfnz6vBwUNDwqKDA+PjogYMWyYjruqi2tT6OOFv7Dw1atXRcePZ+fmHioqAmIEY/Pf119PnD07KTERNgSrmNTMXPqgAvTy5cs9OTkbMzNr6+s9i663t/ecmTPXrlw5OiTEsxaMXMuM+uABBGerZWlpldXV2kPo6+ubmpy8NjW1S+fO2lszSAsm1QcDIPjHWr9587qMjBcvXkiMVsTYsQezsoYMGiSxTV2aMrU+ygF61Nr64eLFh48eVRGbgYGBx3NzQ0aMUNE4Tptm10ctQPeam2cuWPBrZaW6YAweOLA8Lw/u19S5UNcyAX0UAvT4yZOpc+eeOHNGXQCsLYeNGnW6oABu/lU7kts+DX0UDiR+tGIFAj0Q1D9qaz9LT5cbXYTWaOijCqDMnTtz8vMRwmB1sS07+5ezZ9HcaXdERh8lp7C6y5fHxcW1Pn6sXWjxFt4aM6ayqMjLy0u8il6WlPRRcgRKWbMGmR5A4bfz5wtKSvRiwi2/lPSRfwTKLy5OSEoSF7RH9+5zZsx4d9KkcWFhfXv37tWjR/P9+3cbG8/X1cHN/8HCwqaWFsHWJk2cWHbggKCxXmbE9JEPUMT06YLDzZ06dVq1ZMnny5b17tmzvXDCje7XmzZ9u20bjLa1Z2NfXn/yZLCxp12J6SP5FFZ64oQgPd0DAgr27NmQluaCHiADvv0mPT1n+3bBWYt9eXn2PBltm54+kgH6LitLJGZwqZu9Zcu02FgRY7B5Pz4+c8MGEWM45YmY6WVDTx+ZADW3tBSWlorE5pP582fHxYlY2mw+njcvbvJk28f2Nn6vqYHdaO9bfctJ6iMToLwjR+AZqA6D1NnPb11qaodmzgarli51LnQogUslnNFLB78iH0nqIxOgn8rKRHScMWXKoAEDRCwdbGKjogK6dXModP4oeBHmXFF1CUl9ZAJ0sqJCJAZwQSNi5mzj4+MDt/rO5Q4lFy5dcigxyEeS+kgD6PadO39duyYSKniOR8SsTZsB/fu3WW5f6PHjjvaNSN+mqo+0h+pheF5Q9FCBa2HBpto0u3HrVpvl+hZS1UfaEUjw8IMQxcampmfPniE4cssFVX0IAgSrGu40NroVXQRj4wAkVx9pAMGcA0IYBF08fPRI0BLNjKo+0gCCZ3vRgtGho8e4T5J0uD9gQFUfaQDhP7/hImwi45kuqqv4iqo+0gBSITq3aXwFpAEEq9mN01s/Pz/j7Ix1T6jqIw0gQy2K8PP1NRpAVPWRBhAk0zBOzLr5+xtnZ6x7QlUfaQC9NmSIcWLWp1cv4+yMdU+o6iNtKmPY0KGCMbtVVSUypSXYmlnMqOoj7Qgknm/lz6tXzRJ1iftJVR9pAMFZY+Tw4SKKQ3YpETNiNlT1kQYQxDs6MlIk6lt3777/4IGIJTEbkvrIBChh6lSRkP9z9+6nq1fDlJ6IMSUbkvrIXBcG6SYCw8IEjy7wkPz3GzdCrjFKiLjuC0
l9ZB6BYOlWYkKCaxFt32bt2xcZHw8pWm0lHmxAyjN40DgpJeWrjAwPqiNXIamPzCMQxAMeJx0dG+vW6SlmwoSVycmwRkx8sB/Wx5SUl/9YXAzLhP+9dw/8wiKhQ7t2IQPhgTt6+kgGCDSds2jRgcOH3RUX/jvfiYmZEB4eOnJkSHAwLJIP8PeHNRgwiQ1r45uamwEUWC1fUV1dUVUFYXBItwhZ7urKy911qos9MX3kAwTDPHAQQn56ARZstFos8FcXJtxySkwfmddAVh0hXeGXKSluaard+Pnz5xaTjE8S00c+QEDDF8uXw/lIOxZutXDxyhW37HU0pqSPEoAgb8sPW7cGDR6MGaSLwuuKMPeqTV+U9FECEKgW2K9f6f79kIO3TQVVFJroCERJH1UAgUbwfp1jublo6Z7MBRAZfRQCBBrBm3XOFBaK5wGCKh7/mA4g6CkBfdQCBBpBirHCvXsz16+HXIgewyFSEabYxLMpijSIY2N2fZQDBGGAfGRLFy68UFYGeaWUTn6Z6Drank5T6yN/INFeGuftv69f37xjB7xtDg4Yzt96VtK/b9/3pk37YNasKTExphhLdNFN0+mDDZBVOxj3O3LsWH5JCUyFevaAIqQ5i4qIeDs6GgacosaPhxtjF1Ex3Vcm0kcfgOwjeq2hARIbVtXUXLJYIDPLjZs3YRk5zITAL0zKwvoK66QYZHWFgaU3goOtv2+GhgrmbbX3ZcZtg+ujP0BmDCrvs00BjItomzPeoKcAA0Qvpqg9YoBQ5abnjAGiF1PUHjFAqHLTc8YA0Yspao8YIFS56TljgOjFFLVHDBCq3PScMUD0YoraIwYIVW56zhggejFF7REDhCo3PWcMEL2YovaIAUKVm54zBoheTFF7xAChyk3PGQNEL6aoPWKAUOWm54wBohdT1B4xQKhy03PGANGLKWqPGCBUuek5Y4DoxRS1RwwQqtz0nDFA9GKK2iMGCFVues4YIHoxRe0RA4QqNz1nDBC9mKL2iAFClZueMwaIXkxRe8QAocpNzxkDRC+mqD1igFDlpueMAaIXU9QeMUCoctNzxgDRiylqjxggVLnpOWOA6MUUtUcMEKrc9JwxQPRiitojPd+T/eTp09Pnzv186lR1bW29xdJw+/aDhw+hsGuXLv5du/br0wdecDw8KCgyPDw6IgLef4gqjAGcmUIfHd6VAa9QKTp+HN74dKioCIgRjBS83C9x9uykxETYEKxiUjNz6YMK0MuXL/fk5GzMzKytr/csut7e3nNmzly7cuXokBDPWjByLTPqgwcQnK2WpaVVVldrDyG89jA1OXltaiqlNz6ZVB8MgOAfa/3mzesyMl68eKGdHlsLEWPHHszKGjJokK3EpBum1kc5QI9aWz9cvPjw0aMqojswMPB4bm7IiBEqGsdp0+z6qAUI3j04c8GCXysr1QVj8MCB5Xl5cL+mzoW6lgnooxCgx0+eTJ0798SZM+oCYG05bNSo0wUFcPOv2pHc9mnoo3Ag8aMVKxDogaD+UVv7WXq63OgitEZDH1UAZe7cmZOfjxAGq4tt2dm/nD2L5k67IzL6KDmF1V2+PC4uDl67rF1o8RbeGjOmsqjIy8tLvIpelpT0UXIESlmzBpkeQOG38+cLSkr0YsItv5T0kX8Eyi8uTkhKEhe0R/fuc2bMeHfSpHFhYX179+7Vo0fz/ft3GxvP19XBzf/BwsKmlhbB1iZNnFh24ICgsV5mxPSRD1DE9OmCw82dOnVatWTJ58uW9e7Zs71wwo3u15s2fbttG4y2tWdjX15/8mSwsaddiekj+RRWeuKEID3dAwIK9uzZkJbmgh4gA779Jj09Z/t2wVmLfXl59jwZbZuePpIB+i4rSyRmcKmbvWXLtNhYEWOweT8+PnPDBhFjOOWJmOllQ08fmQA1t7QUlpaKxOaT+fNnx8WJWNpsPp43L27yZNvH9jZ+r6mB3WjvW33LSeojE6C8I0fgGagOg9TZz29damqHZs
4Gq5YudS50KIFLJZzRSwe/Ih9J6iMToJ/KykR0nDFlyqABA0QsHWxio6ICunVzKHT+KHgR5lxRdQlJfWQCdLKiQiQGcEEjYuZs4+PjA7f6zuUOJRcuXXIoMchHkvpIA+j2nTt/XbsmEip4jkfErE2bAf37t1luX+jx4472jUjfpqqPtIfqYXheUPRQgWthwabaNLtx61ab5foWUtVH2hFI8PCDEMXGpqZnz54hOHLLBVV9CAIEqxruNDa6FV0EY+MAJFcfaQDBnANCGARdPHz0SNASzYyqPtIAgmd70YLRoaPHuE+SdLg/YEBVH2kA4T+/4SJsIuOZLqqr+IqqPtIAUiE6t2l8BaQBBKvZjdNbPz8/4+yMdU+o6iMNIEMtivDz9TUaQFT1kQYQJNMwTsy6+fsbZ2ese0JVH2kAvTZkiHFi1qdXL+PsjHVPqOojbSpj2NChgjG7VVUlMqUl2JpZzKjqI+0IJJ5v5c+rV80SdYn7SVUfaQDBWWPk8OEiikN2KREzYjZU9ZEGEMQ7OjJSJOpbd+++/+CBiCUxG5L6yAQoYepUkZD/c/fup6tXw5SeiDElG5L6yFwXBukmAsPCBI8u8JD89xs3Qq4xSoi47gtJfWQegWDpVmJCgmsRbd9m7dsXGR8PKVptJR5sQMozeNA4KSXlq4wMD6ojVyGpj8wjEMQDHicdHRvr1ukpZsKElcnJsEZMfLAf1seUlJf/WFwMy4T/vXcP/MIioUO7diED4YE7evpIBgg0nbNo0YHDh90VF/4734mJmRAeHjpyZEhwMCySD/D3hzUYMIkNa+ObmpsBFFgtX1FdXVFVBWFwSLcIWe7qysvddaqLPTF95AMEwzxwEEJ+egEWbLRaLPBXFybcckpMH5nXQFYdIV3hlykpbmmq3fj58+cWk4xPEtNHPkBAwxfLl8P5SDsWbrVw8coVt+x1NKakjxKAIG/LD1u3Bg0ejBmki8LrijD3qk1flPRRAhCoFtivX+n+/ZCDt00FVRSa6AhESR9VAIFG8H6dY7m5aOmezAUQGX0UAgQawZt1zhQWiucBgioe/5gOIOgpAX3UAgQaQYqxwr17M9evh1yIHsMhUhGm2MSzKYo0iGNjdn2UAwRhgHxkSxcuvFBWBnmllE5+meg62p5OU+sjfyDRXhrn7b+vX9+8Ywe8bQ4OGM7felbSv2/f96ZN+2DWrCkxMaYYS3TRTdPpgw2QVTsY9zty7Fh+SQlMhXr2gCKkOYuKiHg7OhoGnKLGj4cbYxdRMd1XJtJHH4DsI3qtoQESG1bV1FyyWCAzy42bN2EZOcyEwC9MysL6CuukGGR1hYGlN4KDrb9vhoYK5m2192XGbYProz9AZgwq77NNAYyLaJsz3qCnAANEL6aoPWKAUOWm54wBohdT1B4xQKhy03PGANGLKWqPGCBUuek5Y4DoxRS1RwwQqtz0nDFA9GKK2iMGCFVues4YIHoxRe0RA4QqNz1nDBC9mKL2iAFClZueMwaIXkxRe8QAocpNzxkDRC+mqD36Hz0nU7/Orcn7AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 35, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# repeat along two existing axes\n", "repeat(ims[0], \"h w c -> (2 h) (2 w) c\")" ] }, { "cell_type": "code", "execution_count": 36, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAABgCAIAAAAyzjgPAAAJxUlEQVR4Ae2deWwWRRjGbYFy37dA5RKUWywISKAKiFLEIIqCJEQkEIUEhUgUNGI8CCaNgQAeCYgRgwpFiAVByyWIQECFcF8iRwHBclNu1DzMG0qLu939Zna/2ad/fHm/nZn3nfnNPN1vd2dnEq5nZ9/BPxIgAT0EEvW4pVcSIIH/CFBgHAckoJEABaYRLl2TAAXGMUACGglQYBrh0jUJUGAcAySgkQAFphEuXZMABcYxQAIaCVBgGuHSNQlQYBwDJKCRAAWmES5dkwAFxjFAAhoJUGAa4dI1CVBgHAMkoJEABaYRLl2TAAXGMUACGglQYBrh0jUJUGAcAySgkQAFphEuXZMABcYxQAIaCVBgGuHSNQlQYBwDJKCRAAWmES5dk0BRIjBM4OKlS4i49tdfYfy0Zg2MTdu2wdi1dy+M7KNHYZw9dw6GFC9ZogSOlCpZEkaVSpVg1EtOhlFfGW1atcKRDikpMBrUrQuDn1oJJHDhUR18r1+/DreLly+HMTMjA8b8xYthiGbw1fDn3fXqIWLfXr1gDOzbF4YkGa6SleEosBh067Vr1+DlizlzYEyYMgXGtl27YhDAiIvExBvXC33S0hDwrZEjYTRt3NhIFSwMQoF571T5jTdszBh42bBpk3d34StZrFgxVGrU0KEw3ho1CkaJ4sXDV98w1ogCc9srcpp6f9IklBmXng7j6tWrbr3Eeb6Uli3Rgm+nT4dRu2bNOG+T3upTYA58z+fmIsfTQ4bAWLhkiUOZCCTXqFYNrVyuri0bN2gQgXYXuokUWMHITpw6hYS0AQNg/LJhQ8FZo320Vo0aALBy3jwYcg8z2mButJ4CyzMMLly8iO/dnnkGxqp16/Lk4JfbEGh+771IWbtgAQx5kHCbEpE4TIHl6Wb5HTgnMzNPAr+4JjBEnfM/+eAD14WszUiB/de1Uz77DD08fOxYa7vaeMNWzZ+PmA+2aWM8eFgCRlpg23fvRj+0fuQRGLkXLoSlZ+K/Hvc1a4ZGbFDP1hMSEuK/WYVrQaSnSo14803Qoq4KN2rc5f5t82ZkXJCVBaNnt27uitqTK4pnsMwff0QHPj5wYBh6slzZsqhGnx49YHTt1AlG6+bNYVSuWBFGhXLlYJw6cwbG8ZwcGJu3b4chDxK+/f57HDl5+jSMQD47tWuHuCvmzg2kAgEGjaLAUh59FMQDmXhRpEgRRH/1xRdhjB42DEbF8uVhxOpTHja8N3EifH746acw5Ll5rGK58bNr9WpkaxiZqcYREtjSVavQu13UrFY3YyJWecqWKQNXs9UQ756aGivn7v3IOa3/Sy+hlDyZcO/Ec853Ro9G2Tdeftmzk/gqGCGB9R40CH0zb9EiY50kl/Xz1I3KXuqGirE6FBho+ldf4fgLakZvgdlie1B+8cptj9j6D6E3+wV2Sl1+VG/RAh0gr1QZ6I/wPxTq3q8fOPywYoVuIDJhP2frVsQqry4pdYcOyr/9dxHlfGVSV8WTktCj49T086A62DHuq+q3ogGByYWfzI9J69rVsYZxncF+gRkYN/lHQI8uXXCwZvXq+VNDdSS1fXvUp0zp0jAMvAkqt5cosFANBi+VWb1+vZdi/sr0fuwxfw7MlS5a9MY/WblAkiUM9FVi686d+pyHyrO1Z7Cjx44B9L4DB8wTl/emzIf2HLF61aqeyxa2YBy96F3Ypt2S31qByTSoWxps5muTzp3NBIrTKIeOHInTmhe22tYKLJATV2HpRzZ/zsmTaPvly5dhyPIEljGhwCzr0Phojqy6dUzN87oz9HeDvJG1VmAyS8gbF5YyQ+Dc+fNmAgUVxVqByVoaQZFlXDcELtj+fpC1AuMbKG7Gd+B5TD79D6SxiYFEZVASiAgBa89gsmJ7RDoyTpuZpOaUxWn9HattrcC4pJFj34chQ5JaPDgMldFRB2sFJluN6KBGn7EiULpUqVi5Cqc
fawV2V+3a4STOWt1MoFKFCjd/tc+2VmB169QJsLeObNyI6CYn+AXYXoa+HQFrBRbsjjt/7N9Pgd1uzEXquLUCk98ejerXR4/uVNtGGuhg2Xev3f33GwjHEKElYP+SAc+/8groz/j6a2PdUK1KFcTardZRkkVvjNWBgcJAwH6BzV24EKD7DB5snnj/3r0RdObkyTBkGRzzlWFE8wTsF5gsS1ZNLeJ55uxZ86AHPfssgn48YQIMW1/QMM82zBHtF5jQH6zWn5k2a5YcNG+0bNIEQSe9+y4MWfjWfGU8R5RNPZeo1Sa/VKv21k9OhlvZb9ZzFAsKRkhg8pp609RU9Jy8lRRsR3Zs2xYVGKm2QpY1SYOd8CUr3mWtXIkafqdWHZflx/8+ceIWerLw4/wZM25JiuDXCAlMelcuxuTyTJJCYsgW4w937IgqtW3VCkaTRo1gNG7YEIYsW19GzYqQ9aHklQJZm/6k2rlThCEr2q9XO7ivVw/x5F+SnK8Q8f8/ZS/Z7UqW/5/f7tQoCkweUsmpTAai3Z1tpnWyTFWuei4iR8xUIFRRioaqNmYqI5sIjx0xAhHfUDcezFTA7ihXrlxBA/eqp+3yKNLuhhfYuigKTEC8Nnw47KU//3zDUJfskoeGZwI79uxBWQrMM8P4Lig7Cc2aOhUtaaMWDN1/6FB8ty0Etd+hNhB9PHr77gn+SJ/BhIJMvFg6ezYOdn7ySRjRWcFPaMTKkDNYrBzGox8KLE+vNVAbwy3LyEBCj+eeg7F73748WfnFiQAF9i+hKN5FdBoYedJl+bd+akNKmcibJx+/5CMgvwuOqgcA+bLYf4ACc9vH8lT6o88/R5nXx4+HcVptl+zWV8TynVCbR8sG09EBQIF572u5PHs7PR1eZnzzDQxZEdq7d4tKrsnMRGseaN3aoma5agoF5gqTy0x/HjyInJOmTYMxU13L/XX8uEsncZqtauXKqPkT3bvDeKpnTxhd1HyUCD5xpsD0jmd56rpo2TJEyszKgiE7A8rMEr1V8eFdNuxsn5ICNw916ABDJnO1V6+WysMPHwHtKUqBBd+XB7KzUYnft2yBsVEZ8ha2/Bw9dPgw8sjdF5nnJYZcLsqaTfmnKcoLoMm1asHhPWpyY36jhXoDQCZJogg/HQlQYI6ImIEEvBNI9F6UJUmABJwIUGBOhJhOAj4IUGA+4LEoCTgRoMCcCDGdBHwQoMB8wGNREnAiQIE5EWI6CfggQIH5gMeiJOBEgAJzIsR0EvBBgALzAY9FScCJAAXmRIjpJOCDAAXmAx6LkoATAQrMiRDTScAHAQrMBzwWJQEnAhSYEyGmk4APAhSYD3gsSgJOBCgwJ0JMJwEfBCgwH/BYlAScCFBgToSYTgI+CPwD+Cx4nR09n5kAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 36, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# order of axes matters as usual - you can repeat each element (pixel) 3 times\n", "# by changing order in parenthesis\n", "repeat(ims[0], \"h w c -> h (w repeat) c\", repeat=3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note: `repeat` operation covers functionality identical to `numpy.repeat`, `numpy.tile` and actually more than that." 
] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Reduce ⇆ repeat\n", "\n", "reduce and repeat are like opposite of each other: first one reduces amount of elements, second one increases.\n", "\n", "In the following example each image is repeated first, then we reduce over new axis to get back original tensor. Notice that operation patterns are \"reverse\" of each other" ] }, { "cell_type": "code", "execution_count": 37, "metadata": {}, "outputs": [], "source": [ "repeated = repeat(ims, \"b h w c -> b h new_axis w c\", new_axis=2)\n", "reduced = reduce(repeated, \"b h new_axis w c -> b h w c\", \"min\")\n", "assert numpy.array_equal(ims, reduced)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Fancy examples in random order\n", "\n", "(a.k.a. mad designer gallery)" ] }, { "cell_type": "code", "execution_count": 38, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAASAAAADACAIAAAAr0inhAAAhX0lEQVR4Ae2dB3gU1frGI2kEktAhISQECCDBBKRL7xcQkN5iaAoIigoGFARpol4uIKBcBQQJCEGaIEgoUiItEnqRXOklREQBpXjp/3fzcs9/ZZclZWezO/Pt83B4c2Z25pwz33y//c6cc+apBxcupAa6uaUGBrqlipBGEEuw713gtufBgwdp/0SgIaQRpBHsezs8JQQTn21fny2/AswtSghmIriASxrBvuBSFuUGgl1I+ycCDaFFIyT8lMAj9xvcjyK8QjiFf15/Ci9vL4rAYoEUTVs3pZg8ezLFmVtnKOR6aXGZNGpVIZgQDPSSRtCqEYRgJoLb0SOe/u9pHrBH/x4UOXLkoHBzc8uKCAoJ4teXbFpCoZHTtWNrSAmFYOK8tXLeKg4xshCC2YdgJ2+eJFWq1KySFUylh3Lu7u48xZS5UwRlTg5JIZgQTAimYTeyEMw+BGvbra3W4LKEm6enJ0+6cttKQZlzokwIJgQTggnB0jq2nLB3a82uNY4HlyXKKlatKAQTgmnoJ4zcTyV1B4KdthEkBstSDBbVJ8oZCKaYtn7vekGZU6FMYjCJwSQG0xCAQrAsESy0VKhTEWzEhBFCMCGYhg7DaX+LS8GcOVLS7uoIwbJEMPUkSkVB2St6DugpBBOCCcEk8DNK4CcEywzBjl8/7lShl8Jmm65thGBCMCGYEEwIZtdZUk7lVLI+IiTlfgpB4eHh4VQo6/5KdyGYUxmbPAcTmBgFJtp1Fdo4ssRgmYnBlI8MDg12KoIN/3C4EExdHWcQQjAhmBBMw04BIViWCNb1pa5ORbD4pHghmDOAS8X5QjAhmBBMCOas88FW7VzlDASLqBTBYqjuTafy4sqdG1AIwYRgQjAhmLMSTIGidefWjkeZegq3PGG5hF7qWjiVEIIJwYRgQjCnJ9iJGyfIkErVK2mNMrVUsFq23ql8tgEDLRtVFoIJwYRgQjCnJ5hiiFriVy3XoYCjxrxnTqgXr3y94WuJuFSDO7kQ
ggnBhGBCMNchmKVD3XJkC4HT580+FOUiy1H45/Gn8PR6uEZvQFAAc5q0bEIxcdZECvXeFstTSI6NKCh7NwnBhGBCMA0JJu9odjN/o668X1haw75vrBaCCcGEYEKwQOFMqn09q7DaMawWggnBhGBCMCGYmxAs0BWpKwQTggnBhGBCMCFYqhAsbXEdGyvsyCbAQhrBUI0gz8Gkf1KiOw3ZKDGYxGASg2n4s0JWlcrSqlLmgwD37UvhsLeVK7dRTJ++gGL48A8pund/5b+nT+NbDRs2T1i+HKJChSrjhg6FKFMmvGOrVhAhISUqli8PUahQkcIFC0L4+fnn8vGBwMtc3N3dITCX2Td3bm7iPtg5vEwZ5JQrF/F848YQTZq0HNi7N8RLL70+b9o0iE8+mX98xw6IHTuOszwc5YgcERo1ghAsSwS7ueMmI4r9cfspVk5eSTH9nekUw3oNo+jesjvEtydPNqza8NixFaPmzIkIi5g/f1zHAQPCgsOGDetVvUmToMJBbds2CCpZMp9/vho1Ivzz58/plbNEiSAvb29Me8mXz9/D0zO3T24fH2+/vHkL5SuEzCLBwWWLl/X29qpQq1bTGk3z5/dv1bNnn7Z9goOLDJ48GcV4+unQBXv27F24t0qV8KR79+7sutO+fSOGgkOH9qCYMeNdiu+//4zil182ULDwD9L+kJyMtoaeCbbn3B465ozOvzp//h58Or61adMhio8/nkPRq9drFNWr16Hw989Lgf2VuJ+Sgj/xiV+wgCKqXTsKkIdC0xTLXfH4cMxWRekSJbApLOzpd994AyImZszP27dD7NlzDrWAEKbZpRGEYI8S7M+tf9JLrZu+jmJs/7EUbRu0pSgZVJICAKGAEStx//7uVadOIWfz5pn9x46FeOWVDsXLlIEoVqwI0mz85ClQII9vHhSgQ//+ICpEbGLi6qmrIUCne0n3KMArJUJDixJc3bu3pFiwYDzF5cubKVh3oZxlaxiXYMnJVwicYcM+oKhXrymFr68fBYwsPeLe+fPYE5+5U6ak/e9WrnRpiuxKn0gwFAwemsWjMM28NoV5XgwFES4e3rwZOQcPXkQjQAjTMtEIBiXYlS1X6HTH9R9HUSOiBoV7DncKGF96xO3bPy45cgQ7f/758GbdukGULVs8h7s7bdeF0oq1ayMURIH/vX7958M/h4A/vv3jbQpzpqkcLy9P+uw2bepTrF37KQUwTsE2NCzcjEuw+Pik9NDJxj6Jq02/rPCpHBlJ4TxpJgiGwsNDswoUpjdQP7iQL1+Bd157DeKzz+L+OnUKQlCW/kYwKMGS5ielh05W99l2/TqscMOGf784eDCEi/KKN5LttF7r1uVLlcc+a86eTZiVAEEopUfMmTNKCAb7EYKZbMwGptQmFWjhsZXpO25ueCRF4YSpXQiGesFVs3YUVSpUgHjzzRHn9uxJvxdPY55BuScES1egde9eEh4lwdTefrtnUIkSEE899RQtzwhpLj8/1vefS5YM7z0cVQadrHY5chMirq1bZwvBhGAP3bPClKW4ceIEb6EWjRq50L2kEcHQAkRZQOHCEAMHDkveulVQZgPRQjBbBLtzZ9f4tCfFnTs39cqZ04VuMK2L2nv48Ba1WuAswBSGs1CY9zReubJFCCYEeyzBLh89Sht9rnJlrY1Vi+NrTTCUmSgLCgiAmDo19mRioqDMEmVCMOsEG/nFF7Ch3r1fwONXLW4A3Ryz82uvNa7eGNUhyjBURcDFzmc++pNeRJOpq9ALz3lo+rWrVaNw0dRhBEP7AFwY7x9RrhwEhjLePHlSUKZQJgT7G8E+iIuDxcycOcJQnYR2cSLdhwxp17Bds2Y1hWBCMFPoZXUkR4eWLe1ibdl+EAcTjPUFuFq37tz3xReFYEKwv43kmJ+UBBO5eHGDt48PhJ9frmy/Q1y0ADEx0WNiYz8c+KG5Fzcy0yQGM1nyp+PHu6hBP67Y2UUw/rQGwTZuPLBt5UpBmaFjsK3X
rsFA4WsDixeHGDz4xcfZq+SnswXmzh2DB4ackbA8ORnTqA2OMkMT7OgPP9BufHT3EDm7CMb2BLiiovo8+8wzJBjmdxsWZYYmWPt+/WAQ//3vTpoFVrygkDTTLbB791f8LsBVoWbNKuFVILAQyN2ku8ZEmaEJ1rRevUxbkpN/MdsJhlnhaCKA68SJG6tiY4VgGi4N52ydSF/t3s35YIzI16yZ5uR3i/MXT413uXlzB0sLXikx+ssv546ZKwQzyroLdKh4DgYLgGelHdhdHL50mEe2mo4dO4X5CxbE+/uZnD263Xp17gyxb9+FBdOnQ2DVEEaJZ87c+vXQIeScOvXX7TNnIJKSzlw6bDr+8uUJ3GfAgCFLZs5ETkBAEI5DguX197d7vTJ6wPnzv6tbowa+Zcx50EaMwao3bkyCwRzxwdqAFA5LP37rY56Lyznh4RvKg/UMG3fseOLbEwMGdJywdOnFDRfj4j5YfPjwje03ENhsuXIF6xlircLtN278vvn3a9e2rjh2LDE28e7dpBmbNk14YwKWBokaNCiydOSff24tVLQojp+SspZnSUqaT/HNN5Moxo7tT9GuXUOKggXzUmQuDQsL5hfNwaVyioaGlileBpsSb90yWjCm5xjscV4TS2hgU3YRjPOg9+9PxUIXMEH4dY7cp4NnDsdGZi7nP//5A18HwTp0iMaRCZy3XnmFgktH0fSZYzqjZhjHkbGuME+B6hxLW1eY9cImIwgDEezLHTseDnDGT6y0NTloZ82b16JwWJq6PhXneu4f/0B5evVqvTYlBQJQslewuuvOnaUTloIYL48YUTh/YQiucmUVL6w1FoGiUN2AI0e+zJyIiDAKGylWleJWG6fApvdmz14+cTmEvWrq/McxEME2Ll5MrwlTgHAwwfx8fWmCaxcuhADBsoKpJ8INBOP6xKhp3bpN0mhhijaXz56NHIic3t4U0JqKatVq8xQoM1fhNwK42OCoqYEIhkULrRIsMtKhi4RGx8SAYK1b13OM9z275iyIMW3NGqx5CPFElGEf3A/4WIrjx1dy06RJgyjq1q1EMXq06YkiPpbfMs8Jr1KlZZ2WyHFM3Xm5s/dcBiJYm2bN4KpNTsWBBFPTXlbOnWsywP/NPXMAwRQodu48QYeKwiiUzZ48WVNwsbI4Rd68+aHZ8pUiIiiUg9e90D/BNv32Gz2Zp5eXVYKVKlWM1qBp2qh9ex5/9/37INiiRR862LOifxLoGDRxItamN6cKS+WwHA5TXLh376/f/2oElOmfYFeTk+k1vb28KGBSEI6JwTA56qEFp72uBRo+G6kjCaZQVrt2I0UwJTCcBa3BQmoqsMY9+zBRHnVRhGCPvnzEGX7XZsj3o+dKgUsJ817E8PCSNC8tUrzgi4dNuHqVAoUHwd5/f0CGamHHnWeOmAl0PF3JFD45DFyPnAvdm1+9/5UQDL7G5R9W4I0nrAWusbnQlGCgJUdyXNi3j/cVGtJcZAvBRo+erMClxIavv9YUXKw1TrFs2RZoXoLV8+ZRsBiKsfoT+o/B6rZqpcClhDnBatasQCPQIh006eHgCSCIxyfBMHKfUDp58ls70okVtH1AjAgBOiatWMHRFSyVg1FWpkIFvu/TdlHTUx0n30f/MVi3tm3pLGFJ5kJTgrVt3pwEewRcKANzsoVgq1btVOBS4s7Zs2gWFIwvfeb9xhy2mL1yJk+ezQMCU2NiYngtWAz9gUvVS7cEwxwk+rZcvr4UuLpKmBMMix/ShuyY4qQ82o6bNyngqpXgSA467yJF8lOsWzfdwe48dkwswBUQEoKCOZhgOGOpYqVw0o2XLjk5grJ4UfRPsNDgYHNwwVni6iJHU4LNmzbNCQl28uRNBS5LwTHv9AJoH7uLMWM+ZsvjEnRu3ZoXRXl6vQrdEuzrgwcVryyFOcFmzHiXxmTHtFnXrjwa/J+lMCcYttJHKvHCC/WZk5gYm0X3afvr17ZeA0O6DByY0yun4wlmeur9YM8n
8fHqpdi6RJluCfbLgQP0kTBcq0JTgv2UkOCEBFOhDl5aaUkwvpqZ7kALgg0dOk5di8i0ZYBVeXQsdEswq4+/FMrMCZaaup5Wpabl8s+spJjQxa+DIZbCNsGwP8mjROXK5Zjz2WfDKH77bRMFIJBFMX3durYN2jqeYKgdTtp/3LgdX+7Iei2y2AjafV23BNuybJlVcMFZ4tJik6YEwylIMAirH/QiMp/lgbavwGh6VVNLUa5cpOmMGgRaDytl7cgDBgzlSVGeAvny8eroNfRS9dItwXq+847ilaUwJ5hy3l27NqN9ZD1dffo0DwLXaCkySjAcgS5WCU9PD+aoteARSTIHs54p0omFDRcvThsyTTWCg0XLHj3iPohLZ1EzVC8n2Vm3BHvcAA7lzo1MsFq1GuBedTDBevYcwJPiEqCHgwTDEiP6RpluCYZ+PEtwqRyrBDt1ahUsAJ+8ef0oMpr65snDr8B92hBZJxgOTg9tKdzdc3BTkybVKb78cjTFH3/8QPEIMTbP3IycvAUL4mgOJliVBg0mDpr4SHl4mawW1eU26ZZgo996i64RRmNVGJlgDRqYfgyjWZA6THTu3JPnUj8iIFKwhlZavKKCFp0J3RLs2Tp1FK8shVWCKee9cuVkk925uanpkvzziWn5qlW5D7yvDaEpwXBe+n5LkTOnFze1b9+IYtmyf0FgmSosbxxRowa+ohrBMSKkdOmY6BghGHyNiw2rf+Pll62CS7lPIxOsWbM2uJccTLBOnXrwpOoSQPy8fbvLmVaGGKtbgpV99llLcKkc2wRTznvUqL6wifR/qjdpwp2BBRsiuwiGIhFclqJQoXw1mjb19PB8770+LDl6IylUa2ghsIRj/w79hWCuR7A+UVFCMNwhVhsB76HkJt5CjkGZVYId3LhRCJbVsQL0mg72UsXLllW8shTpJJjy2Z98YnpIig/66CispmrhDZcjGKqDwuf3fzj/Gn96e3uxjq++2oni8uXNFKpZsi788ubt2aqng23DkQap217EF9u3t+q8VQBg5BjMeQiWFB8vBHNJggWVLGkJLpWTUYIpVx0f/wm9eGCg6anRI58W/1vixhUJhsKXCCrxSI3M/0Scxj/nzBlFoZol0yJnrlzdmnUTgrleDCYEU6y2FEIwh2FTt72IJcPDFa8sRaYJplw1XmJCL67e7IzxgS4dg9Vv0ya0aCgrlc40Ovp57nnrViKFap90ijwFCkgM9iBDHf9OsjMWJJQYDEZvtRGch2AHpBfRRX8fl6tc2RJcKifrBLP00MeOrcALU2DT6HZjDObrm8tqMOaEz8FQ7KoNGxbMayWwJJpsp/XqVeYOV65sobBsH6s5RYoV69e+n4vaWHp6I3XbiygjOSxDL5XjPATD6zkdFg5ly28r3cZg8MeKV5ZCC4LBDT9TvTr8N/0xBNw5CTZ16hAKrMEIAYJhlCP9H3ZzEoG1fn28fcifTKeYn8bvqheOWQUX98GmUuXLD35xMFuMl4mtoZsc3RJMvYoK19JqHKL1c7Ajvx2xYabZO6PZeQh2dvduIZhLPgdr2b27JbhUjkYE40r09L64ux4nQLDz5+PpqqdMiaGoVcsEN5TQwXDD216wrhNK65M7tw2PkNFNH300kF9BlWyIms2afTTwIzYUr44QzDWG1WNlQqvgUnGI1gRLvpJswyiFYLw6148fF4K5JMHw/g7FK0uhEcGUq97wyy/QNghmw2enpKylF//007cpGjd+ODdZLcWBg3NT1sWKY8fQGiy5HVMPD3ce7eDBrymsoqxd375zRs2x0Rquvkm3MdgP33yTjQTDqozHrx+HYaHn6qF5/V1ovTb9iq0rwOrH9ZshBuMmlhAN5QCB0fSPK4+Norr6Jt32Io5fsMASXCpHa4Lhncgmk01btNBSIAbLnGO+ejWB4Fq4cDxFx46NKXx8vDPHNDTUgvELUEiNPh06NOaRUWVLMXjy5I2fb8xca7jEt3RLsN9/+ikbCbZz1apz
d8/BnrKLYAviF9jw/UIwhwV+uiXY8uRkxStLoTXBOg4wLVFGF2spMk0wGz4bYyNJsFmzRlJUqFAmPUxr0LYtxlKQLVqkiBt5WKvTyWZv3Xo+/ryNern6Jt0STP3cL1OypFWUadqLyDWtTLfWhQu+fqZXGT2CMq1jsKmxU4VgDsOUMjZLoVuC4fEOwYW1Ch1PsCLBwaab6sEeFIMCMFFCC4JZenqMpSDBEGNRYL1HxbStf/6JZkGRvHLmDC5iKq3Wn0WLPuQpUFR3Dw+/XH4Q265f59VhwSxr4eo5+idYz86dHU+wwgUL4qSwpz9//rlY8WIQDibYgCEDhGBCMEc8YWvcoYPjCaZc9cTly01319+7Ex1DMEvfj0dSBAW6HPuPHbt9znaW0zHpsGG9VLNUrl+/YdWGliXUX47+Cbbsiy8cTzBYEgmGN0RXfs40leN+SspD80p7IKZ1DFaxakUhmBDMEQQbExtrm2D37j0cxwD3qVysvQSWZ8Sh6JiVyC6CoRh8J/Kbb3bL7e/folYLVtMxaZcupsly+KAYgyZNGvHyCDaLxGCPHRBgw0c6z6a/Tp2iJ/Pz9TVHmaa9iDAjEgzihS4vIO3dpQtSfPA+EaRaEwyn4CX44egPltdCnoM5DG667UVUvVKbL1+mj0R3mUKZ+XOwbdvmmAzfzS0s7GFn2tix/Zlz+vRqCvjaLIo3JkzAEeizHU+wpHv3WPfazz+/4d8b5s4dw+o4Mm3RohbfIYpGwPvT9D2AQ5mf/mMw9Wjipa5ds4Vgg0YOMrfjCuHh+BMES0jr/yBekKORaNSikRYEu3vuHBG9Li6OonvHjhTmzwCZw0sgYxHhWUwTktTNpzPxfHS0VYLhxcfmN4C5Vm9XqV//4ZoTaueMLt2ew900uhzNi2GKDiOYehjYZeDAU6tO4ewoQ4E8Bd599yXzajpGg2C1WrTI45uHNnb7x9s6NjZ17xiIYD8lJMChwp3jtoFwWAw2Y/EMSwsGwZiJ8tSuVg0aYvns2RQ3Tpyg4NMzIiiLOT369+Bxztw+k84Y7GpyMhG0dNYsih6dOlHwJcsokqlUf3/PWOumTc/eOYt8FD5uXRxFVJ8oEqxJyyYsBr5lBKH/GEz5knk//miVYAMGdDSZSQY/DCfwpbp1K/GrWHiDAstLUZAY0OYCKOvYuCNyMGPt0OJDELvu3LHvb4c1587xgNUaNVr76VqePaBAwJo101iw3Ll9KByZduvWbOrq1VHNo1AedVF0LwxEsDSPafKa7Vq0gPt0GMF2nd5lacfmBONWFEyJnN7e0Mhp0agRBQIbisUzTDzEJqwoSHF+714KAIcCARLFtWPHKM7tMf04xLcObdoUWioUBNuybNmoSaOQ+en48d/v/x4CQ15++v0niGeefppTAdzd3Y9dMx0Bn73nTWfBZ9XOVRSfxX1GMXDYQIpaDWpBlC1VKrdvbgg0OPMpSDA/fz8UQ10L3QsDEUw5S0QC5r2ItWtXpB3YPS1ePJDHhPOmmDDhDYrVq6dSHDiwCO/ICi8ZfunSxqhBg4Z0H3Lz5o4ZmzbFfRCHwYQrjx8/vPjw3btJO27evLTx0l9/7Uy4ehXRy2+/bVpy5MiVLVeOH185YenSI0uOoC+0cceOc8fMxchD9Jc2rt546NAeOAV+D1esWJbnyq40vEoVnHrEiJdxCQzSeaiMzYgEO5mYSIL55Mz5SPwAO9Aup3jJ4srEbROMu8HTayRAMFUSuwsPj4fzU+6cNUVi+JgTDH+SYECiEVBmRIIt3LePBGMchTHmaWYgiVYt8OnataVDSuPN1/Dr93fflxgMnkX/PTyIwd5/+23teGV55D5v9lEmrGOCqTr+Z9s2aqsEW7R+kRAMvkafT8ZAsLNn13R942FEpGxChB1bAPMYcDQvL08Y0ez3ZqsFRYRgRiGYGo7QsHZtS+DYPWf93vXKfI1AsG9jY1lfqwQbO2WsEEzPBPvuu2mYTksLKFaqlDJ9EVlsAS5v
jINgcciw4LD27RuBYHd23TEUuNJ++Zl+/RmxFzEtwDQ9B6OAKVw8eJC8CgkKsju4LA9Yo24NIxDsXyNH8l61SrDoftFCMD0TDIunw83QAuL276ew7/rsPKZBUg62RGWnr1vHKqN5MY5k1655yp0bUAjBTKMNFMqO79hB4AQFBFiSx145SzcvNQLBXu7WjXeaVYJVr1NdCKZngmF0hSKYEv9atow2oYYa8k9Jn9gCmKTMfdCY0c9H9+rV2oC8sqyyEOxvBFMo+3n7dvIqLDTUXuAyPw4J1iG6A7w47dKRQtORHOpWrJP2OkL8aZVg+QrkE4LpmWAREWEKXJbi/a++oqEIytQNY1X0Gz2a+WjDiLAItur1bdevXXu40jBsyNKvGydHCGadYApll48eJXn+Ub++OYJoVZnOIcGSryYTXGFPh+mPYFgckq1klWDYRIId+vWQjlFmuLGIt2//CPeJkRwcYaC87+MExqpzk3Qwoh0Uz1//5z9Vi4UEhISHl0SrYtJ0aup649ApPTUVgj2BYAplWNiQvJr+wQcU/n5+WSQYvk5w7Tyxk6JocFEKmK92wjExGO9ApFfSJqpxPhjrRaahghDLtiyjYGuwwXWTYziC/f77ZhKsatXyEDSC9IhFBw5w5+CwMArjpP758rGymJJMgRYrVaxUmzb1IQCuP/74AUKvI1ezUi8hWHoJRs8K81ICU4lJsD5RURSenp4U2M22YAyGfYgpJThLCl+PrBzJTaZD2bun0cEES0y7LR8hmKeXJzE1b/U8CjaCEEwP4+s5H4zgSkyMpeCjGxg0lqxgDrSl2H7jBvLxeXHwYAo1iIF/6iZt2K4d67I+NRUiJCQArdGqbiu+XxPzqQ3eQ4jWeCLchGCZJ5hCmRKnd+0iuAb360ehXrMCA2UOxeMIRi+OfbD2E8EVMyaGwieXD4XpCFljmhYEK1SgAAqGjxrAsXbhQuZgajNiSxJs8HuDUXi02P7U/RSsss7ApapjuBiMTtecYMyBKSjBt0Ui54svRpJgWDqKAktcUGCrElgeg5bUpFMnCrWgIv98JG1SvQlzPNw9HtmUjX9WrF2bZ8eKIBBYyQMVDCwYOGPGuxDzx82/dSsR4ok+W/ahIbGhhGD2JJhCmRJw3gTXqthYin7R0RAgWImQEDpvWHN6xMGLBwmu14e/ThEUEkRhOkJGmJZRgnl7efHeq1+zJsWYmBiKrStWUGByXXBoMDTq3uvVXhSLNy6mOH/vPAnGmioHr3shBPt/cCmC2RBnznwHDw2jmThxEAUW/aXAy4gpsHVF2nppEL2HD0eKT0BICAVSrOxLnbI2hWJs/7EUkaUjKTRK2R/o7p6jfb9+OEXlyuViExMh3nuvDwqPxa327VsIMf2d6WwEjMkw98fYJATLUCMYl2Bw+U7iPs/u3k2/jinAFOOGDqWI7tCBgnOuUWasOsgcRHcp91OQg7fGxCfFQ2A5pzEfj4HAeobtotpB4Ekd3k4GUbRIkcBigRB4YzVXJqwcGYn9kdO2eXNT5+eFB8MGDuT4wNipU0NKhCDnx+++q1S9EgTeUMNhk2ixt99/GzkQs5bOoth9djcFC+YkreokxRCCZYxgNuCmIreEhFlwcsAC1vql6Nu3HUXz5rWWHT0KgkVGlh7++eegQUBAgXZ9+0L4++eu0qABhLe3V9HQ0LNrziKQA3CwmoWHh7unl9ernV7NlSsn3m5c59k6+fP7493TxQOLlywZVDgoCAu+V6tWPvK559xzuLdqVbdljx65fXK//nqXQRMnYkHfadOGzE9KqhJeZfPmmUl372Ko+/Xr21CeKTFTWJ3k5cmCKY3I/BRu9FQsj5kaGOiWKkIaQSzBvneBEMxuBLMBt0c2OeztKsCURo5ZjpxO5gvBhN7yy0XDn29CMCGYUE7DGWtCMCGYEEwIlvaT346/+22P5HgkZEJ/oH1zJAaz46V08iBTCCYEE4IJwYRg0h/ommt7CMGEYEIwIZgQTAgm
BLPvI3AZVCGDKnRgUfIczM49hOnpcpReROlFlMhEIhMNIxPj/DwRggnBZCSHjOSQ8f4y18E1J3wIwYRgQjAhmBBMCCYEc/JRYeY9VzIW0bw1XOjCuVxRZSSH9JdKf6mG/aUSg0kMJjGYxGASg0kMJjGYC/0+lhhMYjDO8dPaaCUGkxhMYjCJwdL8rR2drhDMjo2pNQRcuqhCMCGYEEwIJgST+WAyH0wHs3eMM8pbauqY6XbyHEyeg8lzMHkOJs/B5DmYPAdzod4k6UV06a45Fyq89CJKL6L0IkovovQiSi+i9CJKL6J0zTmma86F2ll6EaUXUXoRpRdRehGlF1F6EV2oF1GK6kIdcS5dVOlFlF5E6UWUXkR79yIKwVwaCy5U+P8DZvQ0NEpeM5cAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 38, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# interweaving pixels of different pictures\n", "# all letters are observable\n", "rearrange(ims, \"(b1 b2) h w c -> (h b1) (w b2) c \", b1=2)" ] }, { "cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAASAAAADACAIAAAAr0inhAAApfElEQVR4Ae1dCVyN2fu/tBCVJVuWLGUXSUlklzZCyK4sfxRjmDBjGOvYGbsZTIjssqZIoUXWUqGiUNqUrU2FjP9Dc+/cuve+59zb896L3/Hp4573eZ5zznO+5/Z8e973vOdU+JSeLmD/ZCOQoS9bxzSAAAOI82tQIeLTJ06D/3WlaeT/OgKE8TOAOAGqwBiMEx8WoLnhYQxGwIcxGAEgFqAZQAQEONWMwTjhUVGATohLOPT3odCg0NTk1KLCIr3aesamxraDbIeOHaqhqUHwWMlqloNxAs4YjBMegYAxGAOIgACnmjEYJzxKZ7D3794vnr34wM4D//zzj6RnDQwabNq3qVvvbpIqlUkYg3FCzxiMEx7GYAR4GEAEgBiDEQBSZoCGdMu5n/Od8DscPqmpqW3w3ODs4sxho1SVMgFS6sBwOmMMRsCR5WAMIAICnGrGYJzwKDcHmz5m+qlDpwgOCQQaGhonrpww72ZOtFSGAWMwTpQZg3HCw1IMAjwMIAJAjMEIACktQEfdjrLvbE/wRqg2MTfxu+UnvFLpp9IAUukoFe6cMRgBOpaDMYAICHCqGYNxwqPEHGzulLkHdx8keCOmDogMaNexnZhARUXGYJzAMwbjhIelGAR4GEAEgBiDEQBSWoDuatQ16XESwRsx9cK1C93nuosJVFRUGkAqGl85u2UMRgCQ5WAMIAICnGrGYJzwKDEHa6zZ+MOHDwRvxNSu7q4rt68UE6ioyBiME3jGYJzwsBSDAA8DiAAQYzACQMoJ0AVvC4y0jQiulFYPHjV4x6EdpWWquFIOQKoYGUqfjMEIMLIcjAFEQIBTzRiMEx5l5WCfPn2CHKy4uJjgjZh6/LTxq/9cLSZQUZExGCfwjME44WEpBgEeBhABIMZgBICUFqAtmlqkJKUQvBFT/7rq1xm/zBATqKioNIBUNL5ydssYjAAgy8EYQAQEONWMwTjhUVYOBk54TPY47HmY4I2Y2v+2fwezDmICFRUZg3ECzxiMEx6WYhDgYQARAGIMRgBIaQE64kbEQMuBBG+Eatgm8cKdCxUqVBAKVPepNIBUN8Ty9MwYjIAey8EYQAQEONWMwTjhUWIOBn5MGznt7NGzBIcEAnV19WNBx7r06EK0VIYBYzBOlBmDccLDUgwCPAwgAkCMwQgAKTNAFxYUDu
8zPPIm14lJFStWXL97/ciJIwl+K02tTICUNii8jhiDEbBkORgDiIAAp5oxGCc8ys3BwBXY3Pe3H3+DB2JS96bXb6i/ae+m7v26E5xWppoxGCfajME44WEpBgEeBhABIMZgBIBUEqAfxT6CHabCLoelJacVFhZ+Ph+so7HNIJth44ZpVtIkeKxktUoAUvIYy9EdYzACeCwHYwAREOBUMwbjhEfpORjBm69QzRiMc1IYg3HCw1IMAjwMIAJAjMEIALEAzQAiIMCpZgzGCQ8L0AR4GEAEgBiDEQBiDMYAIiDAqWYMxgkPC9AEeBhABIAYgxEAYgzGACIgwKlmDMYJDwvQBHgYQASAGIMRAGIMxgAiIMCpZgzGCQ8L0AR4GEAEgFTJYO/ev78ZGRly40ZMXFzCkyfpmZn5b9+CUKty5SpaWrVq1mxqYNDMwMDcxKSrmZlhkyaEofCjZgxGwFWJAGVmpj979jQ1NTkt7Rn8Dz+vXr0oLCwoKoIFm5//hx/YI1lLq0rlylq1atVp2LAxfINMTMw7dbJs0sSQMBB+1KpksLSnTyOuXo29c+dZQkLq48d52dlFb9/CaxpaVatW0dGpZ2Bg0Lx58/btO/Xs2cLEBF405AcBQqtKW4tY+K7wUfKj5Izk1KzU1MzUtBdp8P+LNy9ADj8FRQWFRYXvi99rVYKvT5WaujUb6zeGH5OWJpbtLY2NjNXV1Akj4UmtNIAk/E9MTLlx415s7JP4+KSMjJeZma+zs/OKit69e/dBQ0O9atXKVavCV0lLW7sKIGVo2LDkp107o7p1a0o0xpdABQwGMebi1avePj5nLl4EyqIcWfOmTZ0dHV2cnaFAWQXFDCtAZ6RmdGrUCcUlUSPnrp/r1AW5TVHjtAUsgCT6g1D76FFsdPTtmJjI2NjouLh7ubnZEla0AiOjVoMHjxo1aqK+fkPaOhh2KmCw9KQkvwMH/Ly9kx89ohxCNT09a2fngS4u7SwsKKtgmfEXoPMK8q7HXL95/+bd+LvRj6KTMpKkvmRJM5Bq2tUG9RrkbO1s19VO2VTPH0DSRg7ROTg48ujRAF/fUCB6aSZkWZMm9Xv0MLWxsbSz61qjhi65QjkslMpg8AU6cOLEmu3b4xISFPMZvj1DHRwW//RT25YtFWtB3lpYAZoxGCXyQFNeXn+Gh1+NiLien59HWYvSTENDE0hszpylkKFRVimnmVIZ7ElsrOeKFQFHj/7z8aNifptYWU1ZtMjC2lqx6grUQg/Q2XnZ245uOx92/vaD2x//URAHWQMxamQ0Z9yciYMmaqhryLJBlqMDJMO/Dx+K9+w5s3HjwYcPk2WYyC3W1NSwt+82bdqw/v278LSLq/IYDG4YTv/114iYGLlhkKigoaHhMXXqYg+PypUqSSiRBYzBCIBiASTsJjr6jp2dufCKl88aNfRWrtw2aJAyduZSEoMVvn3716JFhzZvVpi7xJHu6ej487ZtdRs1EhfyVEYP0Hdi75iP4/cL1Naw7Y5fdvQw7cETJqWaRQeoVOv/XgQG3pwxYw0id5XpZM+exRMmOJYRolwqg8Eg9Vq5ZcuSDRs+KvqXodShmnXocGrPnob6+lK1WEKsAM1yMMoZUQKDlXgya9bCefOWU3qlsJkyGCw+MnLesGHw1EthLyUrwoOyxXv29Bs2TFKFK0EP0EpgMEAAMor5E+Yvd1vO+01FdIBKzx9E519/3b52rRfcPyytwbwKDfW0sjLBbFHYFu8MVlBYOHzKFL+gIGGPmJ/16tS56uPT0tAQs9HSbTEGK42HxBUWQMKGlcZg0OEPP8yfP3+lsGdePnlnsAuHDi2dNOl9UREf7k/89dfpK1bw0bKoTfQArRwGK/Hfvpv9iXUnYPGHaDj4BXSAxFwsLv44duxCeOolJuOl+ObN1erVdfhoml8Ge5OT4zB27PWICD5cL2mzQb16oadPw6pFnrrACtAsB6OcIGUyGLi0ebPX8OHjKX1TwIxfBjvt6bliyh
SFFyjQjGfEjBnztm6lsVTMBj1AK5PBYMj9LPqd3XiWRxJDB0hsniZNWgbPvsQEvBQbNqybkuLHS9OQDH9KT+epaVh1aT1iRNitWzy1L2rWuHXrm+fPwxp8kQSxwBiMACYWQMJulMxgOjq6V68+4G+BIo8MdvHIkQWjR/N686dkUsbPnfvj2rXCCUL+RA/QSmYwgMOpj5PPOh9kXETNoQMkbHn37lNTpvwuvOLx09a2q78/X38EqZtmZPDk+6qDB5Xw2wXO71+3rnJGxk5+fscyTHHgUW0OliHga5YFWAAJYVYyg0G3mpqNMrBHIRyNQD2Snwe18JbXOX5uzYtcFy80sLbmaSBYAVpfoP8p4vOTHEQGoz80fdUPq35x/UUcMbQyFkBCh/T1K376FJGV9drIaFBeXoFQzONnixaa+vqRPHXAVw62fe/eGQsW8OS01GbDzpzpZo6/BAkrxVAtg0lFDEeIBZDQG+UzGDyUDwyMat26vdAFzE9ecrCC/Hzndu0yktFWPRNHrFm58pGoqMY8vMOCHaAxGYwIi8hAraLabe/bHVt2FEnQCugAffHMw2PjH394oznJ2dC+fUtdXAZwmiiu5IXB4hMTTfv3hx0SFPdL/pod27WLuHgR/aUDrADNGIxySpXPYODYmDH/t27dLkoP5TLjhcFWTpvms3OnXH6U37hD165/h4aiL71DD9CIOZhcoJm1MbvhdQOoTK5aZGN0gASCd+/e6+vbvHmTS+4dw+LOHe9OnVpjtCSlDV4YzGbUqIDgYCm98Sw65+U1APtdTMZghEnDAkjYjUoYTFtbJzr6OWxHJfQC7ROfweIiIsaZmyvnBn0ZGJbs3TvQ1bWMsJyX6AFaVQwGOOxbus9lgEs5ASlbHR0ggcDf/5q9/cyyHfFzDX/y5OeHaWnx9eYuPoP5XroEu9PQo6GrozPU3r4f7EJibKxXo0Z1Xd2cvLyXr1/fj4+HNfin/P2zc2n/VOjRpUvwyZP0XdNYYgdocp+vX75uV7sd2U4g+C53laJnsGXLNhkatqxXr37NmrW1Pv+rAovyXr7MSklJCgz0PXny0PPnaTQwltgcOHC+b197entKS3wGc7e2vhkYSNk9mFXS0uo+YED/ESOatGpVp0EDDU3NF+npmSkpIefOwe4dUKZvqn6TJicfPoQW6KsQLdEDND2DbfTY2Nyged2adWtWq6n9eXs/bU0Nzdy3uQnPEkIiQ7z9vGMS5Nt/oUXjFrEnYpHTMHSABIKff94CL4ARp6bEoGZN3ZEjbbp379i8uQGsKoTtELW0KsNvWlHR+7y8t+npL1JTs+7dS4yKehgSEvnyZXaZZo2MGiUknC4jRLzEZzAzW1vKjTfU1NTmurnNmz69RrVqsoYE6/FXbN68cdcuyhXDCeHhRqh7ADMGkzU1/8qxAaJnsKiojDp16slyLy8vd8GCGSdOHJBlUEY+adLM5cs3lxGW/xJ5JUfM9euUv13gOuwgEJyQ0EtXt7aM1SQfi4t37dixZ/duyt8uaPNAXNwgS8vy4yJqAT1A0zNYRkBGPT2ZXyDw0CfIx8TBJOt1lshbYuHk+pNDeg8hmslhgA0QrOSws+sGaRjRBxMTfdhIY8uWuUDuksbwnTp+fJZAAD///oP7ApGR8WfOXD19+ipwWol08OBep05tEJrgr+dAZrDLYWF9nZ2F7nJ96mhrH9+1y6ZXLy4joQ4ysdHu7rA8XyiQ+bl83ryFs2bJVMuvwA7QZA9YDka5qxQ3gwHQEJdHj7YNCblEBl0g6NzZ6vTpUBpLuWyQc7AFY8bAK8yUHoybM2fWunVE42v+/rMGDqTcjqqNmdmB27eJbdIbYAdoOVZyEBkMRpGSmdJnap/ElH/jMXFcA7oPOLfpHNFMDgN0gASCDh1GxsQkEH1wdOx55swfRDOpBo8fpwKVnTkT3KeP+eLFU6TaoAiRGWzIxImnL1wgegbrLU7v3evYvz/RUmSw58iRST/9JLqUVYBbkb
CeQ5ZWATljMAJo2ABh5WAlbicnP+na1YjmoVH16jVjY18RBiu/GpPBcl69sqlf/8P79zRu9B06dM3x45Qrm7w3bNg4Zw5Ns2BzKDKyZceOlMZEM/QAjZiDlTj/JO2J6WjTnPwc4ljAAG4hAjHWrlGbxpjKBh0ggQDW0QPDEHs/cmTViBFyxGhig3wYYDJYTm5u3fbt4YAvoqNTxo5V4PUtmgUi8NzwdWxsNV1dog+UBtgBmtwty8GwcrASrJ2d+4WFBZFxFwhiYjLR96zHZLAze/YsmzSJZiS6NWuee/JEW/bdeclGxnTqBPsrSsolJZMXLnRbvlxSrpgEPUCjMxiMa/ep3VN+p00kvH/3HmM3RjE0pNRCB0ggaNt2OJz6JaWv0qLff3dfsIDq+1a6nlKvMBnM69gxV4o7eJU0NZ/evKlft668Aw0MDYVNPoi1fPfvd+jXj2hGacAYjAAUNkC4ORg4v2vXxiVLyNk7WPr4XLW07EkYr5xqTAab7egIyy9oHJi9YcNYijsW4k3BHR+PwYPFJbLKLTp0OBwVJUsrrxw9QPPBYMUfi9sObwsHZNKMbvyA8V5LvWgsqWzQARIIunWbGB4eTewdjvaKiPBu2rQB0VKFBpgMNmb69EOnThEHM8TO7qSnJ9FM0qC4uLhG69bEQzGXzpmzSM7fXsm+RBLsAC1qWGaB5WC4OVhExI2BA6nWHvzxh+fIkRNlToxCCjQGg4d6PatVg3eZiW5U0dYOzMqCJYhEyzIGDo0bP3/2rIxQ6mXQixfVa9WSqpJXiB6g+WAwGNR+3/0ui6nWWMNhxYlnaJ+bkeFCB0ggoN8OEU5b3r9/OZzuRfZTRRaYDNbUwiIpJYU4kP1btoxT9NCGnk5OITducHcxwtHxyF9/cdvQaxmDEbDCBgg9BysqKmzWTMpaKslxLV268f/+b5akvDwSNAZLvHdvRPv2NK7Yjhq1gnq1h3iD62bOPEK3ie9Wf/+utrbidRUuowdonhgsvyC/dt/aRe+LiCOFZ485ITk6VXSIllQG6AB9vi1xcurUFVS9fzEaNKjX/PkTLCza0VdRmiUag2W+eFGvQwcav2ODg1s3b05jKWnjPHXqcdJ9lPatW0fj7RiHHaAlx1RWwnIw3BwM8G3bttabN+RVGnBcGBwaVnY+yneNtpr+jL8/pSe+L14U9uhBaSxudiMggPjbBfYxcXHunp7wNEy8rsJl9ACNzGBijwad5jidukK+yQRQhO8Nt2xPlfeTccMGCFbTZ2QENGhgS3yFAlbTi7sH+2pMnjx4+HBrPb1q4nIov34dUEYi+1IMUNlG9Bo0Bgu+fr3X0KH0HfNnCa9Fv3zwAKt9xmAEJLEBQs/BwP++fTvExcUQBiIQuLvPW7hwDdFMLgO0HGzr/Pn7Vq+m6ds3KUm/cWMayzI2r7OyrOkeTw9wcVm6b1+Z6opdYgdo5NX04oPaemTrzHUzxSWyyodXHh5pM1KWVj45OkBfuh89esHhwxfk8+SLtYaGet++nYcM6Q2JGdxjVKAF3CpoDEa5jAPXe6mtQRL/LilJQ0NDqlZeIXaAJvfPcjD0HGz48D7Xrl0hQu/q6r5y5XaimVwGaAwGB6lcOHyY2DesPwzOziaayTLoW7t29suXsrQiuVnv3jsvXxZdlqeAHqCRczCxsV2NuNp7Sm8xgczi+tnrPcZ6yFTLpUAH6EvvSUnpHTuOzs7Ok8sXcWM1tYrwrtfo0XZOTn10dauKq5RZRmOwpRs2LNmwQZmuc/SVdvdufbo/JjkaKVExBiNAhA0QHznYmDF2V66Q/+AcMcJ148a9hPHKqUZjsMk9etwNDSX23tbcfH85juRztbS8R3rQDD4YNG9+6hHVwjyiw+gBmj8Ge537Wq+3HnFEYDBn3Jx1s9bRWJJt0AESdnn2bPDgwR4070oKa0j/rFxZ08GhO+wd4OBgVamSpnQj3qRoDDZr0aLNf//Nm5/yNfzo2rXmTZvKV0eGNXaAlt
GNmJjlYOg52MSJQy5cOC2GsfSis7PLpk37pOsUlaIx2GhT04d37xLdsLC23hFA/1CibHvTbWzgaVhZqcR17fr1L6SlSYgVEaAHaP4YDIanaaH5ofgDcZxuw9x2zN9BNKMyQAdIrNclS3YuXbpLTFCuYu3aNdzchrm7Oyvz7iIag02ZO3f3wYPlAgCvckxQEBzcjNIeYzACjNgA8ZGDTZs28uzZo4SBCARfNYM5tWqV/PAhcQywFcfaEyeIZrIM5g0bFuTjI0srkutUr371zRvRZXkK6AGaVwaDHAwyMeJ4XQe67l2ClM2jA1Ta+23bjs6atf7jx39KixW/gjQMFnwsX+4Gr5Mp3gp1TTQGG/fDD94UX31qx8pleNvf34xuYSSxG+wATexQwHIw9Bzse2AwR0PDtCdPiF8f+7Fjlx84QDSTZfDbuHF+3t6ytCJ55SpVrr19K7osTwE9QPPKYM0cmz1Ne0oc72jb0QdXIP09jw6QhPcXLoRPnLg0I4P8/FOiqkwB5GNr1syEjYFlWiApGIMRgGQMpmSAWA4mHfDhbds+iY2VrhOTljMH8xgyBHYWF2tPerGant5ligUf0iuXlqIHaF4ZrOnApknpSaVHIOXqG8rBSrzPyyuAO4pbtx758KFYyngUFY0b5/D3379pamoo2gC5HhqDTZ03bxfFH29kjzAs4H0weCsMoyUBYzACjNgAMQaTDvhYMzM421K6TkxqaWOzjWJvbbEapYrT+va9TbHIsG7Dhn4UmxeUalrGxbfFYPBS88tscq4ydejUv379S8aI5RSjAyS7/8TElE2bDnl5+ebnF8i2kk/Ts2en06c3VK+uI181ams0BvuqVnLEhYS0MjKiBoHLEDtAc/VVomN3EdldRCnfEkpuaWdh4UWxmFBKB19ElPv7GrZte+z+fVmNyCVHD9C85mBVulYpfFdIHOBPY3/aMBtpZTY6QCTvYYn9/v3njx4NuH49pvwrFaE3W9uufn5bKI9JIHlXVo/GYL9v2vTb2rVlm1fR9bM7dxrVr4/SufIZ7M2rN21rtaVx/n/8jGbi+WAiDFX4HAxtT46bz5+LxsNRgF3pr7x6xWHAoYJw1V1Hp5DiAdfIhQthYymOpuhV6AEamcEi/xtKXkGebneq1Qmrf1j9s+vP/9UsTwkbINiT49MncjIPLqelRR07FnT8+KXw8PJS2erVP/z8s+sXGMQALQ8swrpoDHbgxInxM2cKm1XxZ35iYtUqVVCcUD6D5WbntqrRisZ5xmAcZzSLA/g9MFhIVpb4kDjKl54/r6nQ25ApiYmD6fZ7G7927YKdOzl8oFdhB2jsPTnEAi49N+5ZvGeC4wR6ELgssQGiZzCBIP3LjyA9/cWpU1fg5OXg4AjFHpSpq6tFRh4yNoYbY2KAcg2bVofGYKE3b/YYMoSm2+fR0XVr16ax/BpslM9gBW8LjLSpboGeDj3d2aqzilHCBug7ew6GloOFFtA+mjiWltaNbg/gMl8dOP0ZzoAuI5R6+dMff4yZPVuqSl4hdoDmkcEOXTg0ZgEVPkF/BfUx7yMvFNLtsQFSgMFEjuXk5Pv5hQGb+fqGFha+E8lpCsOG9Tt6FHKcr5XBXmdn67VpQzOS6+fOdenUicbya7DBDtDkMX38+LGReiOynUBw0P9gb9veNJY82mAD9J0xGNobzfAyGLwSRjORw93df9m+ncayjM0cJ6crFMcjQS3P0FATK6sy1RW7xA7Q2AwmNqppK6ft9KHKPFP9UxvUaSBWtRxFdIDK4YuoKqxdhAdl8C5ZdPQjkZC7ABsqZmZeQn9JDC0HA+9bWlk9onhjZYmHx2IPD+7Rfj1a7ABNNbIWui3y8/KJppu9Ng8fP5xoxq8BNkCMwaTPFzyk6lWjRn5OjnS1mLRuo0bnk5PlfXBekJfXt06d90VFYi1JL6qpq8PWi1pVq0pXyylFD9D09/oyAjLq6dWj99fA3iAlM4VoD+eq5IbmEs1oDdABou2YbA
ffycOHL06fvppyf8UjR1aNGNGf3K48FpgMNmH27H1HjxJ7r1OrVmJ4uI62NtHyazDADtBUY+rcpHNqcirR1H2u+8K1C4lm/BpgA/SdMRjaXUSYxdSPH2nmMuvly1337/emu6cvavDv5cv/XLRIdMlR6NSr164rVzgM5FKhB2hkBhPe9AqPDu82sRvN0OD+IdxFpLGkssEGqDx3EaU6fO9e4qVLp6WqyggNDSvq6wsBLaNT9BKTwU76+Q2dPJnGk9FDhnhv2ybvX4k0LaPbYAdoKgcHdh0YcZ28XMjE3MTvlh9Vi/wZYQPEGEzmXN2nPm8hIinprjyLcWE/etjz420uVeYwe8OGsXiHoGMHaOy7iMKA6zDTwe8a1S/bwskLl7stlzmL8iqwAQIG+/jxdsWKFSkcSS9ZyUG0bNZMjWgDBhUrPv+qGazo3bs6xsZ5FOegw2Amjhz515o1WGeg0MCnmA12gKbywm2U25kjZ2hMQ+JCjFpRLfugaU0RG2yAGIPJnIXcN2/eFRfLVJdW6Jia3jMwKC2TfgUnHc52dAw7f166urQUIt/ZJ08UO3+sdEv/XmEHaGwGK/i89iDwZqC1u7VU/yWFmMs4oHVsgIDBwsL2uLouHj9+wPjxDo0b60sOQSihZTAnpzZnzworyf7Mz0/8qhkMPJ/s4eFJcYhRyRg7tGmz5fffe3TpInvIBA0seggKCzt48mQzAwOenq1hB2jCiErU6xat27h8I41pX/u+B84foLHkywYbIMZgXDP1XF2dSy2mq6imNmn9em07O2ESIaYTK8KjjPU//nhk61YxmfRiyeFJ3eztt9BxnfRWJKTYARqbwQSCJN8ks7Fmr3JeSfguRVBNu9qLoBca6hpSdIqJsAECBvvzz/lubqvAHbgN1rOnKTybGjKkj7QN5WkZzNhY+zthsLiEhLa9esFvBf1kWXXu/NPUqTa9elXR0qKslZObGxgaeu7SJd9Ll1592SLbsX//M0hnxpbxATtAl2le+uW54+emOk+VrpOQuri5LNu8jDubLS4uvnb5mu8J36yMLK9zXhJtlEOADRBjMK7JeJCSQv/bBSQ2cf78KYsXw9qLeGmtZqamLnF1vRVEfmhzNiCgwpfbkpt9fa0cHKQ1pqAMO0DLwWDD+w1fNGVRO8N2HK5fvH7RdbGr15+0vzNj7MboR3JkNRxdyVBhAwQM5u4+fMeO4+L9QWptZWUydGhfe/tuRkaNhCoqBvP3vxYURJGBCQTq6plfew4GI4dHYfBATAgB7WflSpX6WFl1NjFp06JFSyMjvRo1tKtU0a5atbCoKDs3NzsnB5jqfnz8nZiYO9HRwJOQfYk33dLQMJ7i+D/xKpRl7ABN1S0s44DFHFSmX4yaGDZxcXfp3rd7/Ub1darpFBUW5eXkpaWkpSalxsbERt6IjLod9Tb/Ldjq6Oo8zHlI3zLZEhsgxmAEzDMJ+rJqOMur/4gRPQYOrGdgAOUP798/Skt7GhcXcPRoyLlzNIsPocWHjx8Dg7UxMztw+3bZDsp3jR2g5WCwEsfbNGvjYOVg2srU2Mi4rl5dSKLeFr5Nf5F+O/a2t5930K0g+JMh6XHSUo+lNAM9vfb0189gQFZhYVGyhgP3Fbt162Bi0jI19ViFCnXr169dtaoWHGP5zz+f4HXmt28LYeXh8+evkpMzYCl9QMCNqCjagNKihcY3wGBPnz2DNAyYRxZAfMjV1dULnzyB/9Ebxw7QtA5aGlomP0mmtZbHLjI1sl4DOdYQE9rGBogxGAHwh5mZ796/JxhhqyGJ1zAwgD2DYedg3LZVzmCIw2lu0PzhyYcVMisI5P0zg8MJbIAgB4N9dmnWv5uY6HP4pYBqzJgO3wCDwcBWbN68cM0aBUZYnioPw8JaNGtWnhak1sUO0FI7kSJcPHvx7k27pSjKLToScKSHdY9yNyNsABsgxmBCZGV/PlDu34cljkzftGnN8VK3nmQ7KIcGO0DLnYPJ4SvJ1HOR58
RBEz9bcT98JLVTSo8NUHHxi1Lty77AZTA4Y2XXronVqiFC89l1zNX0IijgFl//UaMuh4WJJEoonPXyGmhNu1yI3h/sAE3b8/279/ub9qe1lsdu2aZlk3+cLE8NTltsgBiDccL9RQmb7z7KyCDb4VnAhsFaNjaRCm23yO0FdoBWGYMZNTKK84lTV/tyHwgxTGMDdPfude4ZEWlxGQwesn3Vu0qJhl1SgLcqze3snqWllZHzd7nut9/muLmht48doOVw0Kmn042QG3JUoDMdN3Xcmr/wMmRsgBiDUU3jo6ysF9nZVKblNoIVIVv9/Lr0748Yl0VOYQdolTHYhW0XbCyFt1gRkcIGaP/+/SLwuQu4DHbr1n5z87ao6eln93nJwUpweZyU1NPJKY3uUAhuKGm0k0eP3r1+PY2lXDbYAVqOzsOvhg/rPUyOCnSmFt0tToWcorOlsMIGiDEYBehfTC5fv55BvWE9baPS7ESvMCPGZVE/2AFaNQw2zmHc/mVizICIFDZAc+b8JgKfu4DIYBMmOO7Zs/hLd4jQfG6PRwaD1hOePrUfMyYxKelzVzz/625hEUK3LalcjmAHaLk6F8wcP/PEgRPy1SFZ19Cr8eDlA5IVtR4bIMZg1NALBP6XLz959kyOCvKbTl2yBNbjl9RDDj5fGsUO0CpgMFjEeH3f9apaVf9DFxEpbIBsbEb95ydnCYvB4FCV8PC92tpVvvSGCM3n9vhlMOjgTU7OKDe3i1evfu6Nz3+w3WJmTAx6D9gBWj4Hc3NyB3QZkBifKF81kvW9rHt6tfVIVnR6bIC+MwZD25te1mwE+fgsdnGhOZZSVguy5PCO0IxVq1zmzZNlgCLHDtDKZjCDegbBu4Ob1G+CgoaURlABguXw2tpW799/kNIRP6I2bZoFBf1Vrx5SuJFwkncGgx7hfYo/vbzmr1qVm5cn4QCm4E18fHVdXcwWBQLsAC23d7Cmfmivoekp6XLXlF3B56qPZU9L2Xp5NNgAMQaTB/0vtgkxMXOHDoXzKeWuKbuCbo0ayw8cwH15WWpvqAH6cw/0O/tK9UcuoWFDw8A/A3mkL/AGFaDXr3NtbWfcvo13D4YTr8GDe3l5LdPVFctOOe0VUCqDwUrcggdiSzds2Hfs2IcPfP0BcMPX18LUVAEUOKpgB2iOrmSqnqc9dx3kGhOBk2FqaGp4nvTs59BPZn9yKbABYgwmF/z/GhcVFPz522+HNm/+p/Sr/oq0JRD0cXKCE8b06uG9NSjbD9QA/bkbpTHYwB4D4dlXdZ3qsgeHoUEHSCC4efP+zp0+x45dgteTMVyU0oaBQb3Vq38YNcpWig5VpDwGK3E7OTV1i6ent48PLFbEGkhtPb1BNjbDBgzoa2WF/lIzdoBWcNBA+1tXbd2+ZnthgeLfOdixY+SEkePdxtepV0dBPySrYQPEGEwSY1rJk9jYXUuXBh4/Tr/zVJmm4dxKt2XLzHr3LiPn7xI9QNMzmLWF9ZU7V4o/Fss7Ov1a+mt/XDvWfqy8FRWxRwdI6AScUnnsWMD+/edDQ+8q/IURNvbfJ2zm4eY2zNV1ILwA9p+Ut5KyGaxkIPCG/4UrV3wDAwOCg2EPDwVGV0lT09LMrHfXrrAXlWWnTmpqago0QlMFO0DT9CnT5mXWy783/+3j7ZP2TI7XFBo1aQQZl52TXddeXekOVZDpgBQFNkCMwaSALJcIbiee3bvXz9v7OfWvFtwztHZ2dpwwoZ2FhVx9ld8YPUDTMxiccAn78nue8TwReCImgeoOB2xBNd15ussAl1LrNsqPAkcL6ABJ9PXs2fPjxwN9fUOuXYuGB2USerJATa0iEJeDg9WQIb2hQK6AZ6EaBhP3PyU9PerBg+gHD+B8Z7jTmJaRAYs/YFMq+IE/DKrC7ohfNkiEEzENGjRoZQRniXz+ad+mDWylKN4OT2XsAI3gJsACdxRvhd2KvhMNT8ngEVl+bj7kZrDRtLauNux8WKtOrWYtmhm2NA
SkOnbuCKkXQq+ymvgKAZLlqirkvK/k4BhUUnx8RHBwfGTks4SE9KdP4XxnWPABMRuOV66irQ3bJBq0aNHc2Ni0Z89WHTvCS18cTfGnQg/QcjGY6IxmOHz50o1LkfGRQGXPnj/LzsvOL8ivXKkyHLjcsG7D1k1bd2rdybarbcvGSg3Pn2FHB0j2XEJWdvdufFTUowcPHqekZKalZWVlvS4ogGOzYDfND0BTkFbBBomwKVWtWtXr1tVr2rQ+bANsbNzc3LwN7J0ou2EeNapnMB4Hh9E0C9AEFBlAnACpksE4HftalOgBWjEG+1rgkPQDHSDJLr5lCWMwwuyxAM0AIiDAqWYMxgkPDykGYzAC4t+XmjEYYT4ZgzGACAhwqhmDccLDGIwADw8AEXv8pgwYgxGmizEYA4iAAKeaMRgnPDwEaJaDERD/vtSMwQjzyRiMAURAgFPNGIwTHsZgBHh4AIjY4zdlwBiMMF2MwRhABAQ41YzBOOHhIUCzHIyA+PelZgxGmE/GYAwgAgKcasZgnPAwBiPAwwNAxB6/KQPGYITpYgzGACIgwKlmDMYJDw8BmuVgBMS/LzVjMMJ8MgZjABEQ4FQzBuOEhzEYAR4eACL2+E0ZMAYjTBdjMAYQAQFONWMwTnh4CNAsByMg/n2pGYMR5pMxGAOIgACnmjEYJzyMwQjw8AAQscdvyoAxGGG6GIMxgAgIcKoZg3HCw0OAZjkYAfHvS80YjDCfjMEYQAQEONWMwTjhYQxGgIcHgIg9flMGjMEI08UYjAFEQIBTzRiMEx4eAjTLwQiIf19qxmCE+WQMxgAiIMCpZgzGCQ9jMAI8PABE7PGbMmAMRpguxmAMIAICnGrGYJzw8BCgWQ5GQPz7UjMGI8wnYzAGEAEBTjVjME54GIMR4OEBIGKP35QBYzDCdDEGYwAREOBUMwbjhIcFaAI8DCACQIzBCAAxBmMAERDgVDMG44SHBWgCPAwgAkD/D0hrUOWHI2I5AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 39, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# interweaving along vertical for couples of images\n", "rearrange(ims, \"(b1 b2) h w c -> (h b1) (b2 w) c\", b1=2)" ] }, { "cell_type": "code", "execution_count": 40, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAABgCAIAAAAyzjgPAAATWUlEQVR4Ae1dCVRUR9Z+7PuusskSFkUFgzDGQBgxkwk6KkQTRZmJMvGcaFxJNLjEKG7RA/yOUXBJ9JAxmDj+aFREUUeN+Ku4okAkIIgLS4dFFtkR8b+Itk2v9bqrul9319Oj9erduvfW997Xr5Zb9XSeP3/O0IMiQBEgg4AuGbVUK0WAItCDgD6FgYsI/M4wbeL88mUYQ3H5NI+rCOjQJiK3bk0Hw9yR5VGALAF6nTMI0CYiZ24FOAJvLZnsArEcLvlMfZGKACWYVHiUfBFahogH5RgiUKoWowRT9R3g26ec4UOhQQlKMLW9mbfU1nNtcpwSTG3vNp2/VIdbRwmmDneJ+qi2CFCCcePWtXDDDeoFbgQowXAjKp8+U/mK0VJcR4ASjBt3SIcbblAvcCNACYYbUaqPIiCAACWYABjqlRyhXu5qqbeUYJy58QEsPaGtSpaAqUScEkwlsEsw6iMhXzSbLRtFNdAcpSBACaYUmBGNwFjiEARRyi4EkDgiQpercORG9HWjgGHa++b0ng1jGCNx+TSPqwhQgnH1zlC/NAIB2kTUiNtIK8FVBCjBuHpnqF8agQAlmEbcRloJriJACcbVO0P90ggEVLmrVMfDh1dzci5cuZL3++/FpaWVVVXNLS2dXV0mZmamFhYOrq6u3t7ew4cHhoaO8vf31NXw34I2pq2BaZDvoerP9NfXig3CKhnmDz5EJSVlV67kFxSUFhY+4PFqq6rqGhqa2ts79PWNjI1N+vUbMHCgm6urh7//yMDAIHd3z96C8BzZ2/N1EE+oYBTxeWXlqfPn9x06dPTUKWAUShWt7Ozej4wMj44OGjXKG6UAB2XKGaa6j1tNrU3ZedlXf7t6q/BW7t3cB7wH3d3dfSRknRzLPhb4dqCglD1jr8tozC8RoAHzFZ2CFYRN0LKycg4cOJ2R8X/l5VWCl6Snvbx8Jk2Kioqa5eg4kC9pbMzY2PDPiCSUSrDuiorUgwfjt2//vbhYvtr4h4TMXr169Pvvw4SQmh2vCNbQ1JB8IPn4xePX71x/1v1MkVqIEqxXWz+mnwFjoIhmlZZtYJhSUQeePu1KSTm6ZctPRUUPRa8i5hgYGALHvvxyLbzfBIs4ODA6ZELPlEWwlpar58/P/+qrm3l5ghWTLx0aEbEsOfldFxdj+cqrpNQrgt0ouDFyxkgsLkgiWK9yR8YRixWlKxGz+8+ZM1cXLIhXhFqCtbCxsdu4MfmDD6YLZjqSQUsZzQl4cW1Yt+6dDz7Awi4AJSs9fcqwYdsOHoSHlh6SEOAxvG6GXZtTkioV5kOzefnypLCw+bjYBXWpr388d25UQsIqJdSLOMFaS0uh77QqIeHZM4WaQ0JYtDY1LZs6NW7lyiKhC/RUAIEqpgrGTgQy1CzZ1fXs739fGR//bxL7T3/77YZNm77qRcTEhBQyZAlWX1j418jIE2fPEnI/ZePGuIUL7xPSrhFqYWRSfTk2Z843MJ5B7j4kJW1KS/sR9FtbkzJCkGDtDx5EREdn37xJyvcXeg8kJ69ZulSNf6WJovNCudyj/+Rdk2Zh9+7DMKohTQLHta+/XsjjEexqECMYjzdj4cKL167hAEGGjh8TExN/7PkdoockBKA/JukSN/Orq+uWLPmXEnxranqyf//LhiIJc2QI1ty8/YcfDmZkkPBYrM5v5sy5VES7Y2KxeZmpXhyLj9/b1NQqrT74rvn6ejFMHj59fTQRIVjhrVux69f3sUP4pLO9feGsWc9ZTtQSdopz6p8xOMeZyFWvo6Pzhx/SyekX0uzrC0EeXUKZuE4JEIzHi1m1qq1d7IJBXG6L0XPr8uXj+/aJuUCzXiFQLRRL8iqfa/+fO3e9vv6JcrzS1dUdOtTjhS0iL0x97NXI+O9/T2dloau1tLD4aPz4v06dGuDjY+fsbG1o2FhZWcvj/VZYCMOPhzMzG56gYp24c+f
E6dMZQ0N068qThACdFzE6zgHOlR9XSrFbV1vn298XBLYs2eLt6m1va29rZWtuYm5uam5oYPik5Unxo+ILORf2nZDn1wReYnqMnhTrXLh0/jzSwNi6dd96eg52cHCyte1v0nOYGhl1m5lVm5hUw+xZe3tnU1NLZWVNeXl1fn7J7dtFFy7k1NY2CFXQw8PZxMToRWYhwwQIXVX8FHckB4/3p3HjECeU9fT0YufOXbpihY2Tk/iadHXVl5R8s3Xrlu+/R4zTK7582SsoSLw2buTK7AvxCcY7zXOwc5DidUZnhlAsohRh/iXOR3jkjB+/KDPzEt9hSYnbt3kDBjhYWDDm5mJF4He5RPACTKbl5BQePXr+yJHzQLneS5MmjTl8ePMrMfwE032lGs//5y5eRGSXhbn58dTUTUlJEtkFHunr2/j4/M/q1Qd37zY26v2ZkeHnf44ckSGhQZcnGk7UoNq8rkpFRfXrE6kpiG+SwC4oZskw/oKldXR0AgOHrFs3Ny/vQEnJ0c2bvxg9OsDff7CgDPY0VoI9fpyUkoLiIlR1X3LyWGjOoRyOjpP/9rftmzahyEKTkuHxUCQ1Q0aO11ENU8Pxure0tGHyEB5va7GqPD0HLl78cVbW7ri42WIFcGXiJFhjbW3muXMonn36j39ETJrEIoDZ0XHW9OlhoaEyld++c6cRuc8mU5taCLDlWBexETNccBkZYexFe7DxCvXNia4TJ8GOnDzZ0dkp07aRoeGaJUsYKyuZkkICsfPmCeWInkJXTTmz26KmVZgDCy5VaB27aWtrC0SduBsr5Yh20cXwEay7G3HwcPx77zkOZt/wdXQcExRkbmYms249ncDaWplimiSgYcuZfXzc0e8Obo6hW0aSxEewqqrLN26g2IQOFSNX9LK+vn6An59MEwV37zJPn8oU0zABCwb1Vx8q3gR/OHyMGuXLyjvgmFSaDWGlDa8wNoJV1dQ8KCtDce5Pb76JIiZWxr6/7LaQ3MulxVpUl0xzxhzd1WamGV1Y+ZIREaEw/8vWbi/NHj8WLUdsLYqoKZEcbBPNhSUlIsrFZ5S6uck3SFSnp5d27Jh4pQK5FX/8IXBGk+qHgIOD3bRpYfv3n5TDdRgE6H2bKXlzG0musv6dkKQI8fVlbmXl6OYmSYn0fNsBA6z79ZMuA1dhb6Gn2tdElAmLegls3DgffahDbNUgLrX3nabaQWVlE8xt0CCxcCBmunh5yZSECfuaujqZYponYKRBH4Zwd3fau3ctzJcqfptg47IpU1b+8ss5iCFWXBtbDdgIVt/YiGLbXLG1o2aWMD0v+2hpbZUtpHEStoytJtUJemKrV3+KpUYtLcYffRTr4jI+Lm4XbJ+IRSeiEmwEa21D6liZozFEkvdmEHmGcLQrPZYfwSkqwhqBNWvmJCUt1dNT9Ck1NDQC2zU19evW7XZzmwAbVCkvWp91pSUUQFyfYiTXAD3fJmJxlPluvk6a4DICCxZMy8jY6ugou++NWAtoKG7f/r+DB3+onCVniv42INaKL9aB9qLjywslWpubhXLoqcYjMG5ccFHRYQgdNDDQx1VZeJvNmrV25szVnZ1kp0yxEcwU7dWkIENa0IaEDLm5JAzX06F9eiwsTCH4vaDg4Pz5kbAwDhcAqanHYcdF2NEel0JRPdgIZgL7fCMcTQ0NCFISRRCLGxoYSFRBL6gtAl5eLsnJy8rKTmzdGhscLH+4giAAWVk3o6K+IrHvYq8VbATrZ2sr6Lek9CN5d6UHhYDCQ7SdbcxMsf3ISaoIuXy5x6Y5HgCFCzGYH1u0aPqlSykIc6JINk+evJyQsBdJlL0QNoK5DRyIYv1JXV1dVRWKpKhM+b17bWhfY7FVbDJA1LQyc2Cht3zmOB4AJV+lpJRydmZgwSWWPeW//noHf42zFItyXMLWa3R3cUE071dVZS/XF5oCzMzg00eIVtRXTB9fV159QWDlOZ9jMBcr3wwo7NENI/hpafGs7KIIY3uDDRu
CGrN8/9EjFM+0VsbIuGfShh5yIABrDHvfaZL3EZCoFfbqIDE5ho1gtoMGDfLwkOi+wAX4+h50pwQy0JJNbIZ6FJvORnOIlBQ0Ec0tWITGy+EH20XQcphQbRGIRwCmwVe/0A/4/tjp01fQ5RElsREM1v8HjxyJYnXH3r1NyKH3rxWymgFDWJf5WjP3Uta21mydkrlZFVuFGiAPkYz81iNKdXJz76KIsZLBRzCGCR8/HsV2dW3tZ8uWsXuFadkKZXsnexQkNVUGcYs+xOoDxzw8kMaN7t/vRtSJLoaTYOPDw2EzNhTbPx8+/EViIuoUOiw80LLlJwPdkIZke6Au7PlHw15f2dn53t6T1q/f8/Ahr6d6Ch+Ie7OR2HkdJ8GMbWwiJ09GRCNpxYpdmZk5MqWhtyZ5WB8+6gcbgUTHxKzdvPm1JlbNgtfFOJTy8PZA9aaVNbvgC86oylUkl59fXFJStnr1zjfeCH/33dm7dh1UMAS+uPiRiqqC+4v0SxYsQJwn7X727PPw8J2rVl3v6gKaiV9eAivmxC1Pho3ZDh0//s/PP7cfPnxsVNSPaWk5+fmqQpCEXR8/H1C7KGHRb/d+k67/VPapFfNWyFxg2tXVlXU6K3Z2bGx4LPe/j86fkoLQAthGe+7cTU5OY0NDP9227T9APOmAiF6Fhyg/v1k0XzRHH9uk1WvdmFUOeest2NPmlxMnXluQnAKO7dmw4WhKSti0aaPDwx1cXfs7OT3t7LSuqLDr6DA3NYU9pCBIH/amb2hsfFxfD7vV38jLu5GbC7tuCH2QtujevZd21P/1BRUZ8dYI+DftTBr8HeoxdELIhACfAD8vP3s7eytzq5a2lsqayusF12F7+rPXzsJTCOSJnhf95/f+7OTiZGFl0d7W3tTYVFFWUf6gvCCvIOdKzu3rt1uaW0Dn2V1nXwLF4f/y8oqFvINeGewsD39jYhLd3BzfeedNaPWFhS0fMICBqDgY0oJAvd6eBEQiIGwdKKT+5amHB/4IO9x70zPM/du3hwUFIa5eEV9R9rmw4VRbaSn8y27YiL0hxUsg9peCPIMelj5U3JygBviURNEvRToOOoyzYDbX0jk2NmNQAnB796ZH8f6zz6anpx+QKXn06L8iIr6QKcZKAGcfrNfwG/7+K2NiWDmhuDA0gUph/lojXl+9aIRFhCkOi5CG5f9c3tOAlzNSTUgZqdOysm4UdmE3b2hoEBoaiF0tfoKBi8tXrfpLSAh2X6UotLS1fYSwXYcUDVy7NHXmVLwuebl4zZw4E69OEtru3CGhVbbO8PDRVlZIY+CydQlIECGYnpnZ/h07XCEYUymHrp7epv37beWKb1SKg/IY8R3h+/bot+UpKaFM8rJkfT3MXW4JptQye9myaBJ+EyEYODrAz+9cWpozq2AVeesXk5Dwdhj+BpW87mAr9+XaL3HpmjFhxtigsbi0aZ6eTz6JGDlyGIl6kSIY+OoZFPTroUNe7u4k/ObrnLNmzceLF/NPNSkRPCZ4yowpitcIhh93rtipuB5N1eDn57VtWyyh2hEkGHjsHRx8LTNz7JgxJLyH3ZUXxcfPjosjoZwjOjckbfDy8VLEGVcH1/Qt6WYmZn2U1PQ5486J8iN24APNp0/vwLgNgRCYZAkGxuATlZk//bR940b4FrOQbUVOLW1stqSnRy9dKqikQfBEI9KWVpapx1Nhdku+2ngO9MzaneXu5C5cnPVsrbACQueslkwo7gN8PzY7+9+wU7fiqiRpIE4wMKzj5DQvJqYgKwu+u2eAY7eMv3z44cGCgpAJE4RqVSp0rhGnbh5uGdkZwwOHs61N+OjwG/tuiGEXW0VKlId9J8aN071yZS90iszMTMhZdnV1+Pnnb+DrzJaWfd/tuE0qg2A9PpuaOo8Y8X1iYvGlS4vnzBkg73YK/iEh3507l3jokJ244RPWzyBuNAnpc3B2OJZ9DMY8TEyRnjnHfo6p61OhZWhtYS3GJXgdBojJ5k4WfL4
oJSWOxzu9Z88q+IwyYvAdov8QAvLddyuLi49ERY1DLKKIGP5IDtne8HgwL3zy118zzpyBUF2UBc7QIHw/MjLik098R40S1f8mw+iJ5nI1BzGSA9zXYXSeM88F61FbXbtn655D+w5VPKoQzOenIa5qfuT86InRwp2uXgl/BnfwKd8yxkSOkK5Hj/6AmLGMjAuXLuXCskj+VVaRHBUVeRMmhEye/K6syHrMvz2qIBgfoRffmSmrrIQPK+feuXO3tBS+PFTB4/Vsc29kBKGJroMGefv5BYSG+owYAZNd/HKQGMowxoLn6pNmRTAHxqG3ZoKlIPgw72betYvXcm/kQjiVTqPOkDeGBA4JHBc8brDbYGEk+jOMi3Cemp5DD43Vslt+NWGBu6qW4KqUYHwAtCkhSBXp9YY3GJ9g0iXpVc4ioKw+GGcBoI5RBEgiQAlGEl2qW+sRoATT+keAAkASAUowkuhS3VqPACWY1j8CFACSCFCCkUSX6tZ6BCjBtP4RoACQRIASjCS6VLfWI0AJpvWPAAWAJAKUYCTRpbq1HgFKMK1/BCgAJBGgBCOJLtWt9QhQgmn9I0ABIIkAJRhJdKlurUeAEkzrHwEKAEkEKMFIokt1az0CdMGl1j8CFACSCNA3GEl0qW6tR+D/AXbsvnE7WUo9AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 40, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# interweaving lines for couples of images\n", "# exercise: achieve the same result without einops in your favourite framework\n", "reduce(ims, \"(b1 b2) h w c -> h (b2 w) c\", \"max\", b1=2)" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAACQCAAAAACNVemdAAASuUlEQVR4Ae1deXxVxRWeR0JWSIBHCBCWCAaDBhMiAipEZAkKRBTRoi3Qiri1osbSTRREWxUUkFZFRFkUtD8EZSsCmoIQ0UTCEiKBsAXMQhZC9kC25s2Zu8zcZe5994ZfSB9/vDnLd87M/e7MXV7eYRyNyNy/PHNw02iz+WNN92AuoI05uBq6YUVc74FzStVcrcDmsD6DEj938dBvc5AtdLS+GXQY84NOrLCFnxaXxPoSSybHtLfFHZstA7JOUCUZR4Ut42lxSawT1Jsc03Ut7thsGZB1gu5x4oE4ptoynhaXxDpB7VeHNh2Vzz+Gt7hjs2VA3tazxCbvzHaO7mo9UYvMYANBKOC+Fnlo9gzK+hKzZxwtNovaDCpev/NQXm37XgPvvjdAd+C7D8jdj7eXaxpy5dGMk+dziqtrUGBY1Kh4Xw0YZT6788DZ0sb2vWJG30jZ1ZTClCMnzucUVVf7OPsOmXi7Qw3D2BpTt6ZkFVR5B3bo02/wmDDGq3zVKHv1vSoBFDL3KXaKyV8F5i0XgK72p+5yTVW+NCWjXu7oOu9eueqS5fmxL3ux9Ag6MFFBUaw8QcmII3IVRS+Ip3Q1ZftfD8vMjhFru8lUhNjjRwdj3hL5QYV/uFdSqDj3lJojFD8o/8kFvESbp0j8oIPTV+viq2h+0OGxz9MdKqIbXxgn5wc1/teHxrAE/TjiDAXYNpnTA4U2ryzZqB/z2StX5ICGpf+Uq3x5yRP6mHmLGH83eKwTrQxBOQlloguE7WwGxm9VnSu8qqgm+v5t1rxqC2vR1z96T89/7O+sN4oxMATNLGT8CM0vUpjsNBR/qZOtcr7y25iFJsfzF8VVTdbhB4r1oU/Qzu0QPHVPSc3Z93tipeJDWUJanJeL/z1NW/na5NST5w8vCQEg6VI1ah05X3HL9+z95D6AVK5UhcqNU8+U1hZtiAFTud4S2A2Yh/cW1V4pTF02KRDdBBbxk55B5JL50Zq4Dr69nzzQH8M+E8F2CZ3DArxCHvoY0h3Szlq3Hny/XnxLu4AbX/ojaJu5940u4UHezkn7RwN+JXUVo3vLwerwdcOc3m07D3piQ+HmUTSAvov9koS9kx8FUAgs3/QLTIxN6i3ROFGJ9pL5qRgjej8LPT48ALdV34PK+/Rb5Y8hxfu1kW2xK79aQPgnhAsiaakZtAtW/JMC5s5OWDog6Da3t0K+XzTTpoDnES+CeBDaVM0A2hE2DfRk2izXIrCSFafNOfUkTXresVfIASM7MU7Q7W3hGocqNbNmgme4ABgKwnFB57X3f4ARP2vjJn2HfT/dEfub+8NVYRRBZwCykEE20xJDHaEfbYLyMKB9qDAeZ3CpSwSzYNRpY8GXow15YtUhcKalJQ595CGxJymCWmIaV4MyCW6r5APZGjSTlmMP4dElB2NDBf408BHSFoN0xu/3ZYSY54dZPX+jnGwUQRqX+1oxh70C1bdaaqAOjhL7Yb7Xq2FVbYHYWqfqA2P4D9Mdort2bcxc9nRRg/QVoZRALUPK08yKH85fJfVSiUW4OUlWbakauyCNBqrTqkMzgkRf7fzfwo1KtFAEdRbNlNCO0pSKdAqUPkuWEBxdIM7gmkJsALOBzCWXMcipD715RcHmaZ0EzCdwYRdU+jmoD5gLG+l/C0S0ukBmmN5MVg/kWcMxoJ7czBBKh/kPZl5wk/8oYHrzoL4Jqy8kzYT1iOaK5wOHUTNoMGT6Dy8h4ycz1P6b3c3Q0S6hv60gELNg1W7JgdykjRA93nctP3U71groxz6KoLHw3DP7mBhnSOgCqGRDYDOgYbB4N+ZD0DFywHEGc5QsByAcuGrQzuOSOfRNkLMlU5NEERQ6HvsKbp0nzIa8FfHktFFRtEJW5vJT2FyyjfZa0ELh0Kqfy3ElOZrYgHPd0ouX8mSFC3Fx8kUM7B6DG9WP1yMHL8gSPKmCIG+95Qp
6aSseQ+Urr8ZEhfqU5ac1HfPoCRRERbnR97LLemlMfHhdfmam45SPCsgt0wyYlVkPDLmu/ngaub/M4KbaFDo8MvD81hIATqMmAR2cjlJT/9xrYHhQY1X+QbJwyPkmQJqgQbOWgL0hLU3IJJuEgolpfUfswJaazeA41Z8BuK1G3wspa/ftE3OMGiKKmkLVDhiRC+D/jCYM5Ra7fOfOyRFdBso1eokh9OYYyutS+AShmXSQOGdpszva7Ag2qscc1sLRX+yuDUhXcf2RnjNtaIjPVw/RBkME3T6ZCjpBaZaUgHcZhnq8H2Qu4bi/6OBVCJqYSOMZglDA56sZxovgUkeHMdoCauKdZ7xWVOfHE+XhI9cwo5M71eRfbfBSMxPb8HHM8fu+9AWDZwAIOaadWhUvPpw7YmZ/31GnB+LyW/l2HyL6jnx3IT/AOCLg5TWjyUXfa9iyhcFGIsc/Egiw6PWfi4eiFjhkW/brt8CzhMsdNffkfHqBNfFBbg10eE1GVl4latexX/8OtEP7u4bGk+n5l/279I30ZSPM6Xkq8OqMbNdfVgeQw6YQsXItpwfWXnirJiWjuG3Y4H5yp5ZcfPhsUU0b/869BzhVIOoEqQAFk9oBCD47WrP51QmyYySQQ7HE7EvdOjJ5COKcRw9BHoI4DHDcnhnkIYjDAMftmUEcgtgHRw68hbvDVB97LQ3akWsyvJtJvNkHP7P5xa9lDI4r1iBOgNmwxBre6e8TNqtEyNjKWhsImvlcZm3uP4eVtjJmyOFYJ+inj3Gqn5d4CFJnIImYv1F3X+tW6zOoglAgtNc6I8z4rRPUl2S8nsncSlTrBN0fgqlwPNlKGGEOwzpBQVtcjy6+745iMrcS1YYn6SFZW06FjA9rJYSwh2EDQShwCpu1FenWl1grIkPtUNRmUMmWPRkX6gLDBtwV768WI9p27BfFJiExSK5pyO7Ui2X++7tjxY0dI4ZNHqSRVTKXZGRlXyi4dPmyd3DPAXHR0p90JAgrcerFFC+r5YtXVws5nInT2Skmf5lMXCwAXe15+JuL3NQkUy+rBurF5PlxqhMvbBVTDl+koIh6WS17PEvEuoSI54ZSepMSyxrM1osdHbNM5AcVv/hbSWEzu6G7US+2MlriB+0dukC318s0Pyjr92836AYg0/ViaZPOURm/mcnpgUKbV3j1Yu88WiNPWv/nv8pVvrzuNX2M2XqxvOkVTMKkZYzBZlW/XuzrRLa7N1azFn1903o9v+l6sdn4BzNUysUXKdVuRbderHyGcv7Ooq5q/OEsLdLBmK0X250EySZvzDyb8gb8kKJyrWYHi+DXsLM1ARoOw/ViS8gtJOG/l8pTH4NsZa9rZJXM47fsSUlaSP4wX/Wp5FBIu8FiuF7sPcAvWjo0yKfHtB0RWP0SjDZ+Gq0XqyXjeX7ziOB2gz58B4awqpw3lE7d23kFj1w9GHCba7XxOdhluF4sLxnjJ5AnYyecq8xC7Q6sePj1Yrvzcf4byK1r1lCsln9trFef+b4YWHpEG98WuwzXi+2BPwpMFRIO7YAlnQ4EpFvtrRD1i2Yw+Q7uWW+CeBpach3QDBMcIeNBOiwYlG0ENhmuFyOZdqcImbywcLqZXtR7QjeVQm+K9iBYJgiOeBAOCTqvHbkRI05r40zWi2VDpveZhM20xPj1YmfxQDoSIhEKdeKbLBkmM0oVNRJsBSouYjJZL1ainqlC3WzZ6gMZGjQTlWJPiOR3YhHMklVT6ghrU3uKomu8XqweH3pbiQAQwSxZtSV419bDX9v1YgH40GUTuAwbwKxNi+SpwSKZqZJZLpmqF+soj5TkQElUlRyqVhuM8Gafc0VIVZWDpe6CzmvLazGigz5Ov15MuIPiHOGQ6Wgn/Yysl5whGA3rtKL3T3FF16XB8w9CP8Dlqr/RnCcB2JWH901IqNv72Tq4Vs2dIVvTTLVPDGT6lpeQ8QeBnseYrau3QYr1QqY1IBCzYNVu94GrrzZC9BirFxs
Bzz2vZolxhgRYCcgsr/zc49tgzPJzAD3wKbQJ/EiMKNsIwGhtvMl6sRB4Iiy6561CkvPCuod3aacnnn7QLjmO2+IvuAFGAT3GYmRFwmlX++NEuB3dGcGLP1ftQpT+qQwDQ27QxputF3v+G7zKqxYtjors7FNekJ6N0LAx2vnBc7MfvltcjJnYtzY3PQNV+vIijPrnbMfII/3H9K8/+B0eG0IvcqP3jB4Y7p+/D/hB43VuImbrxaJnfAi9N6anC8PAZ09QVFu/sZuwvebf4D4xQBXmhvH2363EUVe2bROjH+CeMIRq9u8X8b5TRFEhmK8XmxOnSEJuBQq7zPCsTG4Sf6ZVK9rSm9no65ezFo7+WIg2QJwFMoh+vVjblYor4ClZsIZ411TKcYzSLCntdjIM9dnVyVzCO6br4FUI4tWL+S97h3lsuHhJpwfi+oCi9Qw/wDAiNPlROXZSSrhc5cvxC710QG7Viz24f8md4lXWcdNTW4J0eiAu/69WkFsZ8hu3llzH+GFGEO0+SnmQDMd7fNIGp5GYYff4A6zfgtfFQ1ELNFIvlqsWePn4mQvVKCC4Tz8FO+ShRxHVmJmWWx3Y9YYoP9qVR6tcTS1/VUrmxcYO/YYoBtOULU2esfBurE197srR06VeXaJ6yZ0gxypMnHoxVYIUSSSD2gFIXqVkB0HKrJJFnSDJz0pKglgErbehVY/GMuAhiGWE0T0EMYSwqocglhFG9xDEEMKqHoJYRhjdQxBDCKtSX7myzmtODzlg+5BN14uZHYHZB0uz+c3iqQdLA8E2LLFWvr+Y2VcNJen6+4t5ZpBnfzHlnKEsyUTbS1lbjWL9GlRJuKhoNZxQB2KdoN4k33VU3lajWCfIs78YZzJ49hfjEIQ8+4vxGPLsL8ZjqDX71V5WDdeLubG/WMXBQ5nZ2YXVVah9z4HjJur+UUbg3Uy9mDv7i/28N/18SY2XX1BYr6ghXYReSat8WeXUi8njjewvRr1qlIw5VC9PEPb2r+SqqsypF6NiDOwvpnhZTf7XCVkOx6DXOstU5gdULk+z1otVHaD4QTlTXqZGo6KYqhdzY3+xxbPk/KDGVGZNsc9BV7te7NW1KqTITFbrxXj7iy3/VNaZS+zcgTYwBF39erHECnpAtGa9Xkx/f7EzK+j+EOrLGBiCrn69WME6ZkRy1Y56Md39xTY0yLtzydczBpogk/Vi7u4vNjW7ojZ/VSgM5UtmRHLVzXox4/uLka9ox36U9OOP337yt5H+qI+8/yaZvos9tA+7F8GPsoonZbnUyCRs1P547T3sU98lk7qLif8ZrSvgh9twmLMIN2oftb3ysfn5RbhZ+ixu2ue0VwO7bFT+moRvMM6Z6yPDp8lkhEZdcqkDxYVWk3J9dwpA/4/kV7debOggPJTiAnpEMs1avZiR/cW8YQiXhU794hh+aIKucr3YHTCsph+KavyDGYDcrRczsL9YL9zzuRmHNUaAEFBI3AR2terFwqHbcs3BHQTPBAEQD8IhQee1/P3FRsKKO/Zo5LiR1MVATE0RRM7l+6IXhEJGt0vtBImErySVac9ik/v1YrGQMkeZWbBM2nIcxMzMRQPujicjEryuto1cKZErklwhibZK5Ldo9MO1vIdSrIRIJvgBHpglq6bE31/M9y1YZK4U6QvvmXNakYsi6IrCjQ216mbLVqpvtWxAXVvJBaI2oxISpEDc1LFmmd591QSHqNZtf3hZg6iBQA1S492aWoZMfLOqATi7bAKXYQOYjfRcjUFkpqoHBL/y2cR2oqvuw3mNooIFiqCOtE/Q4DwImrKVToHSZ8kCl03368WM7S8W8fKuxROChIFu2yBI0FKzIxxsZuvFSA69mUx3alSzWi92FDrqzevPJy6uPm3H1zDfPriP4oSaQTGQ6VteQsZP2L/AmK2rt0GK9UKmNSAQs2DVbv8Drpu0EaLH69Y5m6KxdvGYaHQJFEFu1ot1gYzJVGI7FIv1Ygb2F9t/Vhqncxb
IeZKpSaIIcrNerA9ktH9/MXfrxYzvL7bygWmrzwmEZAiCvKXWG3KvXqz59hdzs17M+P5iJ1FGxtKukd0CUXXx8TNATA85P/SrBnKvXqz59hdzt17M6P5ihfiZMx++MyC8dLqBIohaYgi5Vy82k0qJsmjVimZDvZje/mJqI53qRQ2YIci9erHm21/Mer2Y7v5iJykusDLi17SNIQi5Vy/WfPuLWa0X099fLPYO5vh9HltATyD6LubizuFOvVgz7i/mTr2Y4f3FopZufSZSehHo+/impxh+mK9chdmlUy8mQNhWa38xeF1g0eZ0vXoxKpP4lav2/mJpVIBLKT2Re+mKw7dDt4hghY/9TloFYNVkB0GGxyASpB2hJEgb6/Iwa1Af/P/o9RDEOesegjwEcRjguD0zyEMQhwGO2zODOATRX3dwwC3e3Qz7i/0Pa9iNS4qutycAAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 41, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# color can be also composed into dimension\n", "# ... while image is downsampled\n", "reduce(ims, \"b (h 2) (w 2) c -> (c h) (b w)\", \"mean\")" ] }, { "cell_type": "code", "execution_count": 42, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMAAAAAYCAAAAACJ2q2ZAAAEV0lEQVR4AdVWfUxTVxQ//bJUBlXp6uaGCgUJAksYW2KiBkl0E5Nln0adzcBhZtZkX/zhR5a4zLDEOb8WE7fFuWSbMzqNNRRNVmNAYWEZ4GfiprKw0mEFSgtSBEpLd+5r332Pdx+blyVdvH+8e87v9zv3nnPfvfc9TQAevIVqfTmrDA+uT4ZSzzFJ/7JOgGfO8oRwjD5FqZYjrhHzh9YbHBFJkPIUUDANE8qYn4SsOKbQ8JyBy4d8OY4sjtGTIOUqIAn5cE8hP5GROtclX8RsW2LPZsfZ0ZLAHOUsSZBzTe1dPQNhMGeXVj6hKgn96L7eG7MsXLF2hgrvrm+/4x8M6zLylq2zqvAA3ccbb/pHDWmPL1hkTxEVsjfQ4vgjgeqrt2hEgdivdSesPRtEaGL/8gXqmw6+SG3JcG7uSzjpNXYJFi15/PZNIirrv986LHpGDzmOQpMOcf0LYv4Q2fWZyE+pH95Eh5Liv6kS84d77+6XcBVreNtuFm1+n+YPxTR/oAX0bQwDFDs7u5srMHi3VznCsUAgcFcJKv0yr7/9YwTD3yoZuLUNIdtxb5erCI2aK4yAAGWe3o5Dj6Kx8wbDO2N4BTo9ft+Vo++tkVhawOEg4UsfMeTtKwOInJYkHNa0VO2sd57GgFYm6OAYgPHUilTT4hPTAca/YAQEMKbpzK8eQWOcXQGy6CNBk9Y4d+VHZI0TjR7iegSi6wW0A59XE/wUuqJLAL1MXCMiyzMJbF1eC9DECCjwbEkbQAt1RaPihwEYetO8dHFpnkbEsKcF+NDp/5kyPJ8HGhQ3cIEBt6OikZKE/AHmoOlX0HLXhgWw8+f+tLUBYKCuDua+/nYa1dMtZKKQYPyHXzb5+kiDpqPZH3dJcsSdrN1HQpEOkS441VJTPhONzp1LyCaJN/oGCm8CFDWoTy6KDboowJDo8fUldwAaRo0YNHQeHyWTR4+R7ZOnxtscjtjv7gMB8G4+IfL0DbyByPXKazFC+OvU7xvNPCRPhvARuEd0PK0SxXc/wOqDb/WhuUE1NhTB3fdhN3LlDO/cUY8za/KFK0i6pOgbWFp1GMDlmjFv+khXD3y1mhmAAM99CXCtIH/c17WvQlUwOVhmPwJ
wrDY/+tsoil57XlXZlG0z3g4ilfsKw5927ddmWg0jPnJaCyhNC4BPLXvHcJv2C9Rl9QKqz+D3YfBXlLTxFgB7U74GuN8mDL9+j9Cxj9BVAZv9nUHJxX7By9XjicOPfUJpuoVAu6W1uiDuGp4ie0WlWdz2VAJrCwtV2H+G9LvcLwnBplW1B8ilrtJmC/OmV11kj0Ds88qseIS+eHtzLg2W/QsRbMQbjBgtc5j6qR4iHX2amZkmCeCxon/6YxlZ0luXx5J/oZVHe7xhS7ZOjsvsgb8Go8ZZT06oXjFYilSaLFBu6nP/VSKXT7R1NttEgPGsVgaSALNZskVL2kIi8pD1D30Bii30Py7/mkUAOfzz/w3PLyd1G8omRQAAAABJRU5ErkJggg==", "text/plain": [] }, "execution_count": 42, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# disproportionate resize\n", "reduce(ims, \"b (h 4) (w 3) c -> (h) (b w)\", \"mean\")" ] }, { "cell_type": "code", "execution_count": 43, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAAAwCAAAAADcTJoCAAAODElEQVR4AdVcV3Mb1xW+ICobCJBir2IXJVIWZUtWtVUdi1ZiJ3GJ7STOTOIHOzPOm3+E35JJ4skkk3g845KJ3E3LomyPClVsWbQos4IFJMUGNhBgQ93sgiD2nO1YLCUID8L3nXbP3j333LtLkbp5Ivz56oawXET6sl1EoUy8+mdo9+siyFTjqbeg6x/TIfvAAdiOFkAUwn+7gGHFs4BswPedG4j+zngVEIVQKn86xBthEGfXSUDigJfbxYzTHmhGMyZilyIiJwExhbA8KCxOWqk+0czQBZuFoiFhnNMpFI8rM0GB2vgHj+tgGIBXrvy9bQ1wEWgQkRM0QWJGrDzEQlVo7pbTE0wtqNme8J1VNXz8TnD5E8FZRMJE50cgQTO8vSgdAWNR0W5767KIMnSz99FGEV1MjK4yJr3bIHTuXxQ95tLAQPup8rs9uLrxUKELrjY/DGyEZBOw6E4iO1bly5e/Fyu/ldaRx9GF8qOJDhxnZckMwx8YSoLv3mTqh/l43r+9DpL9XzQ/qFY2MkdCZL5hkdi3VvFNR3/fLFrfXe/DPieQsGgBiYYUCEKLROMIm2PpF+Msp87eYUkSIwvMTXCSkRAdiKCreoyOPXHeLjyq/cQrx4uxKMbGPtlY2zERAqI3PhWZyZJELmCym4CTXPgr2cGSwSATJrEASRRT81CIzKFCNfajfTNNdZyIo2X3i68cE66h4euSoUVbq1XSjaeMs96Q/y0CC4hMTSNtkhI0P2vL6bw03ahDIHOerRoBqk/CHz/emJkPPrjU7xgN8/wuVuXyZKxAtICyWBsa4ZcoSJU4GSEEHaGc/ItIfBCtI+SggNOViDJkEkmwOVKpJLiArCqjILeM5mbfYP8gam2EUO1PIitMRAsIV91i4hWOx4XMwymgRXQ8hJZJhAtQLgP8AupFBoWIaUHGUJBsxNQTc0ODv+emCwVwrFkQRyQFMUAsdkDIECQa4wDdcFLgW7G1FY1H2IxwdnQsHuCdNP1oynT5mucwCCOaNDxjmXa+dBpdXHgYDsXBogVESqBlxyY2BQOTA6xx8wwcOkmxrgIm5nVAxuDvQlBSmshDBgwUw+PeGKSBtg1O1/AC2plwQ4LjSj1+V0PDlS95SwyqE8I6ZvVkgBBZuDsDTTJBND+knZPayrdIgI2RSiW5gvyKEVNKxO9p7gMwxiIkHCzegSrhrkK6v0ArihMlMVpOu2eBEOWTc4AlK6zWw8xc1yEjVCtq2bpapNWAONEOS
cpUhRz/R7tYbaAt1ycRXbyADPXI7fZbgn0hPPz5ZWSnguykfcCJK7+AfK0iyt12seD5uQDvKPXVIEpnK1wfSKOSLH6CHM3ovIFUUmRm4fKb73YsC5nMCwmFZAYh4bpsTyfSzbxT8tBWvJX7nAODqyTh9lxU30tyY2OlHNORoXPH2PUdHulb+mVMnTxgdxfMhfrfvgMbq9H7+QhUEdKMqSrWdoCdo+HPV1GMKna2kFyGMGfN0dG2krpKsH4jPkM3oCs8oEI5gyUKKKe2H1vfuaMvL9ySYzEZg7611dmpyTmKMVBcrDgaYKc8E7FToO6xUlrTMdxcbjUFfd5F18RkgKDdFDjeU1hYCZsOoa501tdYM0LeuZ4BznZfUKVBon19OVUFuWnmwNJUl5MTr4nDFdL1hxVqbIxYS/K2ZGaY9FQ44FvyuIbxsdkmEU+igMiRoSDHMzSE5mxd6w5vrDyOtWJq/NX5Tps7Yp7RUhH5dn8d+Yr+4/dmQpok+PBwZAHFslm6gdZtTE4OszARNDcn4m0vE1HIiNcLiDHydHdL2BZI6KQKyLbvkoRnTBV2Z8ewSmD4yYPuM55AWn7NDuFmPJeMBZS/W6RgOLNQt5Uj0Jo+rFMV0SN1OAYR9VL1KVVA5OFRzl4OogI4n3ABEbLlNclz5mwFGC9p4OEBt4JcLCcUGCViYt+hzpttQNL+1ei1IsdWcvdJOW3lmAvSeUFpnML8UikHseYt5bP5OuNTRvlBdD9NlzdKyOKE5E0UD620gPaKh5B6kch4pT+XIeUc1Wlzdw9KjTQrpbx3urwn5DePI5u9gW1XO4DCAmqMPeAITbRM8dqftwl5YZkmHYiUbcdREdOmRlFITUhti1wFHXhIk4HEg+Q+Jq6T1igroNzjklFkCojYfytf39oUEDmRLZ7p6oq47p5qtv9MchfTPSrZWDVI3foLyQQkRggpum85z0q/Q5E8RDOjW57uuBB5LR+aGHMtLPlDBmOqLauw2M6mtuKTOmWxdjLI/PQ7XlGT2TJRlbxiye3xLOalBILBQJAYjGlWW6Fb3kuZRZ39o4WIpXt0amHBF6CMJqs9rzQ/0pksT2jxBig4v+j1LnlWmfzDBkOqNSuvODe68m3PST58SF1DIG8SqhcmZueWl33BUIrRyFyDzZabTkhNi8y9lS0gomuuab8dGu5yBNaHCwRWmA3FXr8j1jHmimAmyrD3b3w775npiPDFYq4uJTo4Vy7HqdmpKddM5HH1QPQ45/cvzxDysauhKVPOW5E+73eXblCLP3atlxEhPp93nF539Y1FpPZkuqIQ4kb+8Ylpl4cCBoHAKtM5zDX1lXSNVrdIvSUGbgLQ8puJH3rX55Ua6xmMLd5QaI2QKcbBuuOlUzL1Q3RK+hh15a8X+XuIrvbgFnqUtOr6crmNUCB9oQIioWvXgrQtt4Aym3apuRPOP41ORJpnZPhXowW0nsvHvSSl6WA0qprfTIWX1PfGh/AeR1UPv/4INIofO18bnGTjVoywOBLLtmf/se3xR0Ue/t4fx0jo9neCVZC3q1Gvr9xVEWmmyA0Q+Q5EyMS56Z2NQ4PDi8CPhlSfY98zleVFKqoHB2KZ/sAD33d5WM6grKraMslLwOaALVwEhA/DP/Se3MYXxy0JXLpR84fbnLSZDnRt9oQ17mjAYeESIOQJ6lbfDBToKVNC4ZlYpqYmz9k3O8MwbgTr8qpq82kUcjhONfK0QCDfgairl6Ol73G55pe8a8Egvc/Tu2ROTnZeyc/V7gOCHYhJjJq6cyLN6wsSsyndnp1TqH6S8O+Wg4tm4f5DLFaJpj9cX1hzo9MLbr+fMhmt2bml+ZFlZXq8XmVUxq3jGHRm/lu6x0lvaD6/wZSZU1CZTWv3HVK3ttjA1IXrxD89Pev1rgRCIZ3eYE5Lz7LnFrJH8xdKWGs+ku1AgY+GNrys1uoNuPE99dbzzHVo+dEVFp5WcaZSl8KVcILbDOluDa0PnZPDT8H/seswX
6peYm3i+F6dfkr2DnJcMA1/2kv3odJSLEVsC2JcElkoXCHga+/F6gdIWbj8rpsl9yG69mNiSXd+Fq0fkTBX20QUGomHzjCHRvWfL+n6kf5kWiT1MgUUPDMh6U7/PvsHiV2BTPhNV7fFnj7UDNXzBedkywty8xueSFOB89NEwt3qlPWWbkByv5L82R3ZAWbOy5oks4H/6wSym2qVd/42wR4nN0L/NTkLcf2ysovPFXdnNNId6GaftHdEe0u+yBREuWcmvejRJq40/B8q6b5n5+MKGrfxxfWXZ3H70Q7X2bccou6JFNCcsvbbJtfGRZNLCsX3qrP4xqPENdS6ufNDnQ0rSUPAJnRbQMgVJbKFnVeywAhxDXIHva94T0BlulM/KHMc3+RNbKpbWR48q5E1nogn0MkUkIHnwQoGnCxmkKmuIj/V7F+ZGeqP/HAgpv22OgYVg8zXoSn+G4lQoxk+lp2ZajRQK4sD3UsoqH+0CnHF5ALH0lC1LSdDv+R19OIBSHuDnmOqgh4tTbcYTXr//FgXd9O92iB9EhEbbRQrLNtK7Zn0FIX8S16Xa2w1orVJVQhtIaW+jOLr9u6NPNClpubUH71yA7blsQU7Mk1Ksi09kpbVWrq/rQtlOKyugMadKArZeiqDEdjtZUfar8LpIYtdTdhUDWvIi3hZior29p1bQRHmB2oRV0pwITYeN6076k0ZBTWEmnY4aIMtMsEkCmgEHc5MT25lQ1mOlnwC33/07Gd1yY/MLctOmCW6TqiQxjexes+RGE85VPw/VEEdGhRQLDohdYXvLQBKSKe6AvLCINWnIKP/4k5BwSG3w5GHpTwm0fvw4fI0qB86Su1JGKofkuTHusdQjrOIKSWrfciy7lFIKxHT/E8eWZ8xw9HI0AqiSkkAGjZAEsW2h54/KCCGIvEC8g1Bu53VkNG4qQIIXPhMBDTJCW3lMK+1ZciUYgdswcTyuA457slHVOsVZmPbHTMONYxGU0rQyQz3NKUhJN4D9cMJ0vMLcS8Yg7oDyP0A8alnTk3KDuS0H7cEQg4g9QBiGpDGbBTEiZhSgn5I8Z1bqRuyE+9ATmhXmQFZBJcZgWgK4PsBFqMk3YgpI9QItDPugozBNVYocanbY2AIjFP2Ia5uAefAGGtvq2pj4gU0DqPXQrKOUwqAbBbg+wHmoST9iCkjs+gAUc1/GKlBcTRfYbVo/3GruQTOXxVa+e8HEyhnRUS0gJYXoX8hJFGcDmSqNgHgf7ehIRWOqGb2J2EAUodYhOBdUvMCMlWiIWcQU0hqdNjQ8fZ/OlaxSJbxV07UBZfEP2UCLcnok06dASdKTQHhMyfsxtFrxU0Om2sxHeUOGMVTDJlCnF7fw7GcPne+vLYGtgaOAY+KFhBqQDw3rmA1hDoqV518POEOhObHnMW/wvRUWKMevoGspPB1aMK9q7lQSbyIKSWPDPm4puHh4XPlDbVmrlyMi25haILEvFk5nCxWmrwI1XtQRZ7ojmULBbBDITKHCtUYd7hlVXGyWoTcKGfrXz7qgw/hQlZR2f8BXGmSVKeqey8AAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 43, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# split each image in two halves, compute mean of the two\n", "reduce(ims, \"b (h1 h2) w c -> h2 (b w)\", \"mean\", h1=2)" ] }, { "cell_type": "code", "execution_count": 44, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAn9ElEQVR4Ae1dB1gVR9eOBQQpKhYUARuIBQyKhNiwi4gdJFZA9NNgT+wVa4wao7HEFs1niR0big0VRayAghUFpQtWpAsS/5Pc71/X2Zm9hd17917GZx+fmXfOnDnz7uWeO7tzzpT7lJ7+Ff1HZuBFHXKb+C2Lpi7a8usW9jjpn/53vyIuRQzuPrikpETWyuBsYXWUNUuQOmZYqjEoPfz0tYrmby/zrZQg3o9Aed5W2qhhBsIvhJMsaNe53ertq/X09EgCFKcMUAYoA7rNAHVgkr6/qUmpPPZ5+3ofvnTY0dmRR4Y2UQYoA5QBXWWAOjDJ3dm7t+9OHzNdZlZhQSG/fc7tnENuhZyLPscvRlspA5QByoDuMUAdmITuKbir8cPG9/qm11/b/pKZVb1mdUXss29pr4gYlaEMUAYoA7rEQEVdmoxWz6XoQ5F3N+/Ia5HsWTi0cniR+oKN0DJlgDJAGaAMyBigKzCpfBICfwhEvBdY1rNfT6nYR+2gDFAGKAMSY4A6MEnckKePnu7esptriudwz7rWdbk4RSgDlAHKAGWAPkKUxGfAtqltaskXGw4tyllgLSPhGosDw1pJQcoAZYAyID4DdAUmPsfij1CxIv0hIj7LdATKAGVAYgxQByaJG5Kfl//p0yfVTIGNHgcvHFStL+1FGaAMUAa0lwH6y13D966tTdu05LTi4mJYRdWxrNO+a/uho4c6fetEMmveynmpialZ77JMq5ha1rPs0K1DC6cW5cqVI8lTnDJAGaAM6CoD5WguRP5bK3YuO+w7rb7f9T1x4ATWMMm96xKbICwL2gNSevjvFU31x8/PV5QgXoLoI0ReejTUSPJeGjKHDksZoAxQBqTIAHVgUrwr1CbKAGWAMkAZkMsAdWByKaIClAHKAGWAMiBFBqgD0/BdgXda2Etm1n+m/AdptaxgGR5KPGNFw5Ohw1MGKAOUATUyQB2YGslWfqirF68inX7Z9gvsPERAWqUMUAYoA2WQAerAJH3T05LSEPsG+w9GEFqlDFAGKANlkwHqwCR93wsKCiRtHzWOMkAZoAxojgHqwDTH/b8jF+TzuSgFzwPT8Bzo8JQBygBlQBMMUAemCdZZYw7qMojn2GWHlg4sWVqkDFAGKAOUgc8MUAf2mQuNlKJvRs+fPJ80tFs/N1ITxSkDlAHKQBlngDowzX8A9m3f9+ThE64dkBrRa4QXF6cIZYAyQBmgDAAD6k7mm19Q0M3b+3pUFIQ/DZ8w4a8jR5DbUKFChZKSEgSEKsg3btfu6fPnSBOSyzEuIaHroEFpGRkIjvTSSPVF6gvuuBDmBekQOzXvhDTJ8PoG9bE4AkIV5LmgbiAHDvz5ww/+yFx8fL7ftWszAkIVciJ//PiRi0O0natr0/j4x0gT4Ahy6NCuefMmxsW9R3CtqJ7Zu3f+iBF///131KdPA+3skp6gP4wq6ul9LC7mzgXkl48bd3jTJqQJcARZOXHigQ0buDgiJp3qkj+WLNi0ALHHp7fPrpO7EBCqRoZGeQV5XPxT1CffQF9uF8DZwgUfCvr+0Df0ZiiCs2UkW4aPzdChcw8cOAcWfvoUZWXVKzU1E7HW0LBSQcEHBITqjBm+K1fu5OKgB8ABA6YeOxbGbpXhbES1slpXYEDQoDFjwHuBrR+Kio6fPcs12tDAgAvKEO++fUlNDG7XqFH4sWMOTZsyiHQKTlZO3Es65knWktjYaK5tqalJXBAQQ8PKWBzA/v2HkJrY+KBBPmFhD9iItpQfR0cvGjUK/srA4LTnz7neC3BDIyPSdPr4+pKa2PiM9et9pk9nIxIv33l8h2thamYqFwSksgHx8+Pd3RvbhQ0aVjI8sebEwC4D2aC2lOfM2SjzXmBwfHwK13sB/uED5tcP4I0aWfJM8/vvPz9JsrQ079mzLY+wUk1qdWA/rVsXcuGCzL6b0dG5eZhfOpUNDUk
T8PX2Ll9evsENrK1vnjpFUkJxrWPg4cMYrs1v3rzigoAYGBA/P0OG+Ovp6WN7IWCdOnx/jYiwRKoFeXkzvLyKCgtl9kSFhWENq2xigsUBtHdxcWzfntTKxievXMmuSrwc8wTz+Xn1Dv/5MTM1I03Hva27jZUNqZXBwYcFrQpiqtpSCA29yV5C3bhxD2u5nh7+oR2/A+vR49sdOwLDw7e/exeWkhJy+vR6rHIVQPn+QAWl2C7gsRauXs00XblxgymzCzXMiB8g2wYNPD082MKkMs8yjtSF4lJgIDs7KyYmkn2BVY8eYf6WCgrysQbXqFELiwMIbgl8GKlV2/HNCxbAqouZxcPISKbMLtS2tmZXkfKYBQsQRLuqWTlZkTBz1gXP9BJfJHJnATgXBKRenXpYHED49TxtxDRSq1bjxcUfJ0xYwT5T9+HDZ9gZGRnhn5DZ2/O5djiwcOTIvu3bO1atSvz9hB1OLoh3p3K7KSsAjzXGz5nDfrkV++gRVgmsnx5yHtwzkoE//njszBk4/pFBymAh+Hqwrs56585Ny5fPYc/uzp008GpsRFYuLMR/AVla1sM6PFmvadMWBQcfevfuDVehViPPHj7c+9tv7CkkP33KrjJla1vbmIgIpooUXLp379i37+UTJxBcW6obDmyYv2k+29q7++7KnqmyQSir4MCgl38///UH1j9IeIBo0/bqjh3H4+K+eCb/+HEidlJGRoZv32Zzm8zNzerXt0hMTOc2iYqUF1U7o3z34cNRsbFMFQpPn+E9fEPeX4jN7eymjh3L1lMGy3BeM/fSDR6uXQtDJpKc/HlVwW4ircCsrRuyxZAyrM9++mkDAupAdfuyZX9/ufUpNSEBOy/bFi2wOAPO3LCB5zEjIybNwqmr6LuDpBdffC8zZucX4lfwjnaOjAy3oFdR7/dZv+vYAeiw8Fqz5i9ksi9evEYQWRUcGBYH0NW1FalJPFxcB1b44QOYDgSt2LgRmUN6ZiaCyKrOjo5YnAEDp05t/fXXTJUWdIYB+KUcFXUdmQ5pswZpBebo6IxoQKr9+g2eMmUeAmpp9ez+/WB5emLiuQMHkCnkZGUhiKzq1LEjFmdAcyurwB07mKoWFXLyc24/uI0YnPoyFUFk1QLCCr5NizZYeQZ0beU6e+RspqrVhW3bjoL9ly9HI8svADMz32KnZmxM3OTi5iaHOqzCUoIiOrB37993/+47sO9sWNgjzgMN7A4OEG7bujX/lAwqVTq6Y0ftWsRXHfzdaatkGXjy5GFubg5iXlpaMoLIqiQH5uQk/69oxowlEydq/XfQse3b5w4dCmyE7N6NLL8ALMTtkAK8saNjlerVZRyS/u/m5eU/54sHuSRJSeHXY6+X/F2CmETabVj0sQiRlFUdbByqGFfBNjHgkoAlvdr1YqraWPj4sWTUqMVjxiwF45mdh+yJZGWhf4my1nr16rDF2GV397b6+npsRA3lcmLHS5WzsFBwGibGxns2bOjn58fIFyUl6en9jxGuHvBkG5cvH/Xjj4w8uyDUvF4Q7xd7NPlldhAYRCgzHSAIjCmzY7nYOCPAU2D35RETvkkoggiWWViUI7TIh2/fTqpb11omp6webnyY/PFwEiLT85VTOSX46erpeSEoiDEzIi/PoHJlWVUpPdBFqDiwVtGMOaIUyjkpwQ9iwZtLb5hNicrqESwOTGSCypVzQmateDUnJ5xZkLH1DBzYJSholUwPg8MWmIYN69rbN3J0tAsMHKP4KPySFfmb1dYK8cuHtm5169SJPeL7nByeTYnwfJLkvdhKJFKGCDDGEo05G8YC3SosXry2b99/1vrwz9Hx84+DlJRExoHJWrn/Q8fg4IPs/VdcGcki79+8Udy2EdOmwfb31qxAlFfp6VY2Nopr0FXJNVPXDHYbLJtdnR6fPz9Pk5+62Lvo6qyVnVffvh2HDu05ePBspqOhIX5H4pUr0fAHJXtTOG3aCNigCH6rWbOGEATN9BWqIOIjxIt
Xrw7w91fQ0OkBAYj3go6v3+Kfw7J1jhk+nF2l5TLIQKNGdrVqwUPlfy729ENDT7Kr2PLmzftDQ+8OG/YfY2MTrIA0wdjr1+cOG+am8OONNm5uU1atQnYfZKakyJ3dop079cm5BeR21woBW2vb2tVryy62wVeir7Cr2PLyicsrlK+AbdIlEDbBHz/+63ff9WBPCru9EwRev86Kjn4sk1y1aoqvb28np6ZieC8YQpQV2Mnz5yHkC9l2yJ45UjY1MZkxfjwCQvX+48dN5P1C3LJypc+gQRPnzr1z/z5XA0W0hQGI/VLZ1Nq1Pz+GZSs5cmTv5MnzTExM2SC33LRpi1WrtsIyDvZAhoWdvXcvGt7GccUkgjyKilo/a9bN0FDF7alkaIjdl3ElOLh15878enr7+Di4uCzy94+5do1fUrOtEPulsgHmZubYvntC9kz3mY5tYsBZfrPc2riNWTqmNAYw2qRZMDMzXbcOw0NhYREprvn48TBwWmqYjsAO7HF8/OT5889dvqyU6Z69elWrgnlxCmk7vHr3lquqnbNz1Nmzp0JDV23aRIqPlqtEWwR0NQjM3d1Z5VtgZlYT2zcjI23u3Alr1/5XkQQukIOqa9decGFVSQHMz81dO23aka1blX3g2aF375q4tRrsXYSHihUqyvkSqGdn90d4+Kldu7YuWgQ7HqVABdcG5xGl+PxUMeMqBCT2aWzQhSDPrp7YVgZsadfyxs4b4O1+2vHTk6QnDC7BwocPRRcv3g4Li7p3Lz4t7WVeXoEiRg4e7Ma862LL5+TkmZj87x0qG4cyZD5cvDgAAcWoyvnsKjXkxj//nL5kScH/J7NRvG83V1es8NHTp1cvXIj1bYg8PBvp3b07XPGJifuPHYOOiIDOVCECTGfmItREDMnpxw4f3v3yZcaKFZuFGktTeiC5BGSKepGUpIIBPf7dDMztCO/AwCcFLFnCbUIQ+AXQx8+v59ChIXv2QMJfUpoPpJe2VI0NjUmmjvt53Df231iZW5EEZDg8RfTt7Tu81/ATl0/sOL7jdITkvn9evny7YsXOP/888e5dNv9cuK0dOrTkgoCkp7+ysMD/dgQHmZCQyp9fCqtTWbC8sh1I8pCld8LcuSp4L1DYysEBqzYrO3vZl/kFsGJs0KZ+/XlTpsCCjA1qS/ktvPT7/0tbbJaCnTwJfMG8K1fOt21rIwU7S2PDaFdX1bwXDFq/SRPS0DuWL49Q+Neenr5+P3//3bdv740Wee8gyVxxcOPKRAf28u3LLmO7PEt7psjI4MYGdB4QvDb4xbkXisirU8bGpt+vv+5RwXuBkba21lhTU1NfYnEZCE8ReVqFahLMgR0+eVJxm5CXydWrVSP1XbN1qw4vp5BZ29e0Zy6kiVZ5GCC9TGa6KPvMjekoncKHAoWe9mANrlW3LhYHEALIpvTps4eVpJQkycbtWuJ/krNltKisz5viOT4lvtXQVkpNp2Y1/LpEKSXCCufk5KusEPLHY/vCMguLy8Djxy/ztArVJEocWOarV/AyLBF2MaeksBP4so1G4rSYMC9lcbZOdhnRw25Sqix2HA/JGBoHJmNGqPgtofSQ7hcJF/bzA6762YMHD27fhlSHGUlJmampd8LDsUNfzsoyxr1XxgoDOMzJ6T8LFnTq1w8RIMWH6WocGCl+C+LAGtdrPNd/LhyVwl6xkeLDSHoQeuVXJRYHxj7H682b93Xq9IBEwDALNs5M6siRixATxlTFKAj2Dux9djak2YXtG9ciI8FvCW6r25Ah08eN69SmDZxYKLhyqlDrGLh7V3JPaUTl8PiOHWHHjkVeugRbORQZSCnvBQrhLLGp/ftDrnrwYbDpAxJ2mOl0shsVnvLBBg0403LssrHu7dw7t+7sYOvQwraFIvdCG2WWLh23erWcR47Vq1fp2vWbM2eI21M9PNqLPXfBVmAG9evDGZWKm4uskOSuwGSajY2M4IWZec2ah4KD+cdC9PML87QK+wuaZyCkia7
AZISQVk7KZsog6UFoZ6rK6mc6IgWhPj+klRAyHFNVdoXE1V+1Rg0Ic75HOPZIWf2MYUhB5AXGV0KtkEh6kOkwVW1fgcGK6vnztDZt/JCkiMhKa+vWI2PHLoNZIzjDg9gFwVYzSnkvlWcFGRR1fqO8yuRodcfSxIFp9cQla3wWxKO+fi1Z8xDDdDgMC5mp2qoNGtTdtWuJmxsmQpexoV+/TuPGLS8p+ZtB1FwQzIGp2e4yOxyNAxP21sfGZj59+igpKSEnJzsvLxeyCefn55LOahF2aKpNQAZKEwcmoBmSVdW8+SA4TLJJk/ouLvaQFAryjihiKpykDC6KZz8hHAPWpYvz+fM3FdEmhozWOzCIUJG7CU0M4jSlk8aBCcs8nBAGV5s2HYVVq0Ft8ParXuPGGjSADi1BBmQnLF+7FgNnVwYELIekUD/9NB6OoJRr6uzZI3kcGHQfOtRdxx1YJX19yHAwwN0dzvFqJu84IrmEIgJvHz68eusWpK2Cc5zh0Ja0jIy3WVlatG0aAr+QGdFqaRjw9u7WrZsHHKrSvPnXBgaGpVGlRX0rGxu79unTxdOzWevWderVE9Dy0fPmQcapJzExAurUJVV7lu45d+NcxN0IiBXTlq8d+MW/b9+Z06cjdu5cJPdewIoNkkJFRT0iScI+w4CAn0itYuOir8Ag2e7CqVPrmOMjCUo/vSqmph7wjdWtG6OquLj4lQJZgBl5zRYg8EuzBujY6FevXoBLNqlq1SBFa10zs+r6+pV0bJrMdEzNzEbNnTsoIAASHjKggAXI0wEXvAyD7BtwpcTHZyQnv0xNFXAISanadnTbyL4jK1ZQ9ItxmPswuGAKcJzm/fj7cAB02qu0jNcZkpoU1hg48at//6nYJgQcPbo/jwMzNTXy8OiAdFFbVbBdiFMWLMgvKJBl4qgMuX0MDOAklHqWlj6TJqltMuyB6C5ENhsiloXaZkcwUajdg0LpIZhJhIWiZ2irVoX5+R+LiysZGICvgueE5paWsOv9j6VLsWMLtUuQuztRNpxQ+jW1CxFLGoDK7h4k7U5UVg/Jnq9EJgjO61q/fsaECd8hBjDneMnwtm2/jojYAWUER3pxq8zuxKKi4h49xl++HCWTYXBuF6UQRX9oyFX62x9/yJWhApQBoRgoa3FgcXfuCEWdgnrggVhqQoKCwlonpkIcGDJHWHLFJcYhoDZWp0z5xcbGqmfPtjzGX78eC8l/69atRZKpWbPaq1fvSK2Aw2HNx46tHjJkDk/cGE93UpNgDow0AMUpA2IwgBz9JcYQZUrneDc3IxMTWNtBoHRednZOVlZSXFxBXp6ukgCnfyk1NX0XfZPKJlVMqoBfz83PzSvIK/hQoJQGyQrDJnh//0VxcUdJqeXBcpj14cMXJk8eQppFQIDX4sXbSK0yHLZBhoSsW7ly57x5v/NLKt5KHZjiXFFJERmgcWAikquA6hvnzikgJV0RsePAij8Wv81+C5d0KSiFZS9evF64cMvq1T/w6Dh06DyPAxs3zhuy3cNxLTwaoAmy4M6c6derl2AZOqgD4ydccq00DkzYW+Lt7SusQqpNIwzQOLBS0r5+/X5YRcGzRJKea9di4fwUUisEhMFej40bD5IE2LiDgw27WpoydWClYU8DfWkcmLCkw3GXwipUszaX7t2NTU3h0R+kq4enf/DoD/L8wlJBzWbQ4bSdAcjJu3bt3g0bZpImAk8Rjx69RGoFHParHjx4nv9NGE931ZqoA1ONN8F6cePAYJVdoUIFwQaginSagd9xj/7eZmbq9KTp5ERhYOfOk5DDF95UkbTDOcukJsCrVTNdsWISvE7jkRG8Sesd2JOIiMLCQlkmRn34p6dnVLmyWdWqgjMlkkIaByYssf7+A3Q46ktBrsyEC7usaWEB2/eLi4pgkWcg275vZQXb9xW0ROfFArwCYDdHUXERHCpWSb+SkaFRrWq1apkRd+tpihB4xbVw4Vj+0XNz83ftOjVp0mCSGLM
JniQwcmTfS5cid+8+RRIQHBcsDiz85s3mdnZcz8GkmUdMZ8dpweJ09NSpO/bvBxk2jnRRsApHusDZK6XXIxtOqDgeBY1nxGg2ehkVQsVvCaWHuUEKFoT6/JzZu7epk5N148bIYbAKxmntWrVq/ezZcHwlf/xWfk7O3rVrd/3yC2xE5J8gvx7+vuxWkcOciNno2Tawy3Ljt67FXFu2fVlIRAi7F7csVw+3Cx4RmyDOqJ07jwkLg9v7v4AtpB1SeAwZ0hMBoapyfBhXlVKIYCsw1wEDYODGDRu2dXbu07075I4yqFRJQVPgb3LzihWQDirm4UMFu/CI/XXkCE8rbdINBspaHNjcYf+ke4D45W+7d+/Qpw8kjjIln2POvcU+06c3sreHw5e5TTIEHGTo4cMRp08XFRaSZHQJx8aBTVo5iTTHvWf2Xom+EnI1JCUzhSSjGzikSQQHRpoLvAbDOjCSvNi4YCswZKVlYmzs3afP1O+/JyU/5K6Q4JyUjgMHcnFlKVi0enX0vXvH//tfZTti5YX6BY1VzgOKugL7+PGjYOeCikwQaeWk7HldJD2kW6CsfpIeoehBVlr6BgbdBw36bsIEHxcX7NDYFdKm+fMhLxRWHtGPlWGDWP1sAQXLYi8wlMqUcT/hvn0jfGo3kh7SNLV3BQYHgFlYuJWU3MZOrXLltpmZodyIMa1fgSGzzcnN3b5vn+ypINJEqrp++237b74htSqOB06dqrhw2ZSMuBjRsUdHSc2dxoEpdTtgnXQKXjXs3q1UrzGBgUrJa5GwIHFgJO+lRTyU3lTYEN++vSNJT0HBhwMHzsGOeZIAgnt5dYNk9rDFEcGFqgr2CBFrELzcwuIk8Mexcl4zIh3fZ2dDMl8E1O3qsfBjebl5r1++jrsfdy3s2t3bd1WY78nDJ6XmwNzdnVWYSOm7lKk4sAoV8X/vmdqfnJfGgSn+twDZ6K9fv3fv3tN79+I3bpzF7ejp2ZULMsiGDQdGjeqHvItlWpHCoUMr3r3LPnfuRkzMk+fP07Ozc0GAZ6Mj0l1uFf+BlttNJAG3Tp2U0hwaHu7p4aFUF20X/qb9F4vU+Mfxi6YuuhByQal5vXzxUil5HRbW9jiw0t8a+JW50M9PQT1w/F6bnj29x41TUF43xM5eP6v4RKoYV+ndobfi8uqXhJvo5xcYH58CQ2MdWK9e7XisAle0b9/ZoUMxWzmwvWB7PbxXgwvbWkqwfCn7y+2eERNzPTgYTlSBcwPlCkMae7kybIHg8+fZVW0s5+fll5SUqGy5TRMbeJDkG+CrlIadwTuVkqfCkmVgXUjIoHHjzK2sVLMQfoz/MnnyrQtyfgDB0s2pU6cfVq8+8ezZulOn2pelX42J6Yl+gX789EKaxC7OXeaNnndh84VXF17BIWH88hpv9fHhc7E8+Thklo8f/zOs3phZwNFiTFnNBbE2cTDTYDZlwFux72fO3Hv0qKyJwRlJZQvwvW/eosXrBw+U7aiUvFAv4UmDyjZrGJsYVzWram5hblnPsqFtwyYOTcZ645+mpn9K56qCI9D09PS4uDoQgQgSapPF998Pxs76xIkDWFyozRpY5QAKRM9XpE0Wss0UsIoKO3Zs25IlTNJ6RTZZwClfgX5+V0+dIhkP+I+//trc2dmuZUtDIyMeMZWbhNrEIdImi9CboYNnD37z/g1pguCx7OrZ1a1VlyRQWlwogr60IynpRYMGfeBjQ9ou//59bpUqxl92+qfGbNYwNKw0dqyno6Pd06fJkMIjL6+AKwwIST9WWAVQMAdGGhvZnQhiNvXr3zp92qxpU2yXg1u2eOPehIUFBXXy9MR2wYKld5AytUJ9AWGN5AHBsdVrWK9H3x6DfAbZt7RnJEm7EwNXB8KzREaMKUycPXH98vVMlSlgHSHTqkRBZIKEcmxC6VGCmX9FRaYH49jgAdHPBw/O8PLCmno5K6tj6cL8xy5cKOBmEHG+nz9PXVnHVnSzCBLPf+6vUsm6tnX
SqSSVunI6iUwQ45CQgVeunDxjxm8ICNXXry/WqNGFiyuFNGvW8MGDQ0p1IQlr4B1YfGLikIAAkkF2NjbYpvuPH2NxHQaTniVtW7sNrm9dv522aFrbTm15Jtuhawdsa/SNaCyu7WBZiwNT/H7BU8FAX+Ij5ZdpaYqrQiTLV6gweeXK4T/+iOC6VIVzUko5HQcbhxNrTpRSica7g4/B2gDnOGNxxUHI5Hvu3O+Ky/NLasCBgUFnw8JIZlUnhGdGxsZiu9g1apSQlASBTdhW3QBvXLnh1dnLa4TX0vVLSTOysLLANqm2TRGrSlIgPQ+M53bwnOP1/NEjno48TaZmZsv37fu2hyiv4nnGVXNT+ivMI3rFbRjhMWLT7E2QUErxLhqU5NndbmVljjUsI4P4NBUrj4CQa2rduunGxpURXOWqYA7Mc/ToXxYsaFDqJGnGlfFzi4yJwU7ycXg4eK9nyclxCQlx8fH//P/vhRXWavDw7sM83gjO2sPODvbcY3GpgTQOjP+OTPLwGL9smZ2jI78Y0wo5DF+lY76Lzx3AvwtkOmIL3by8Zm7YIGCKRewopQGVjQNr1rDZw2cPuSPefnibCyqC2FjZbJi5wa2NmyLCmpJJScmEzRewgT429p899HFxxOecFhY1sUbCyzMszg9CjlrIHjNzpq+zc3N+SWVbBXNgR0JCTl+8OHfy5FkTJpQmmbox4V3xo6dPSXODpBKQwgouSGFFktENHDbNkyZSWFBIatIKnMaB8d+miJCQ62fOfDdxIrgxRfZTwLblv9as4eq8EhzMBXmQZq1bj1u6tI2bpL+XwX5l48A82ntgHdiekD08bGCbbK1tZ/nN8untU7GCYF+n2IFKA3boMOr+/QTFHwAaGRlih4M99FicBM6bN9rZGdIxOWG3hJB6KY4LyXhBYeG8FSsuRkTs+/13uZvmIVNi4YcPXENBCRcEpDR7zbEKdQzMeY9/Nm1iapKTjW/SMQZUm44WxYHBy619v/0WfvLkqqCgxl9/zT9fSJaIdWDKpjrcfVvFFQm/eRpvbdWkFdaGC7cuYHESCLsQO7furGBUL0mJGvCrV+8qNYqBgT5WHkKSsTgJhEPCSE2C4OUF0cJWcvHqVWd394TERDbILXdp354LApIlLw02thcF01LSsCTEvY+LTo3ef27/4rWLR4wd4dLBpVr1alhJCmoFA6kJCf7t2l2Ul7FakBNPYEOjVnCigpGw1QLbC/aVY3ESCOFf0vdeJOMBr1evDrb177/xPNy9G4eV1xQo5AqMmUNyWhqk5b0UFGTboAEDIoVvHB1DcOGTWe/fI5K6XU1NSoXYL8XnWL9R/cSERK58amIqF5QhtevWhsu1uysj8OZVqd7EMnokWCDFgUnQVJVNgj0aMwcNWrxrl/u/KeqxeuAdGBZXCpywfLlS8lokbF7dXIusFc/Udu2+xr7W4tnfIZ4xKmgWPQ6MZBMTHxa8c2dv1rsrBoeOsMMQ9mjINLBxpxYtIs+cQfDwY8cEyQWMGCx2HA8ynNwqEwc2ac6kWctmMfIMDgi8FEwuTpY1sfHI5EjSZkVGj9IFkQlix2+xg45JOMl+kjwJJ+lRFheZHqI5TOAzLMVOJX1+V8/g0BPya9wqLpapYHDPsWPnbN5M1Ct0g8hhTkqfB8aeX4eWHa78cUWGsOPJ1k5bO3nIZBkOeTrqW9Rn9xK4LDJBsjgwV9dWly9vY1vOjg9zd28XErJO1srGzcxM37y5hOBOTk0jI/cwqrKz80xNjZiq4AXRHxHIXZKfDA0lzeodYTVmXbcu0qVm9eptnJwQULerQXuCSNxin2lY1bcS3ntpjmKIA2MuzVmh+ZFhPxm/ERnJyYmEGErsZpC+I0fyK9Sl1u4u3eFgMPbFnl1yRjK7ypR7tu3JlMX1XswwIhfCw+8kJ2eQBklLe4ltwubk9fD44t2QqN4LrBLdgW3auRM7eQaEA5SZMlIgbehowgl27ufmVpqtj8i4WlFNS06
LjcJ/eRmbGnOn0M2jGxfUXgTiwJhLe2dResune3oW5ufz64ki/IlVNkY/J47t29sTDhjjH0JLWy9FXoKtMbWr12Yu9kSycrLYVVkZ9t9D7iguLnHk7dtsHgvhp/ChQ8SFxMuXb7F9a9SoysUHDOjMBcVDRHdgs5cvT8sg+naY2PPk5BRcwAo0Ke7AvHr3Fo8jjWiGNBxyx7119RZWBnYecnH3ge5cUDoIxIExl3Sskr4lKfHxcEwlv52Po/HZWLgbPQIWL+ZXJdlWiAPDXvwGfyz5uP34dpJMbn4ut2m893guKH2kZ88J/EaePHmFJJCfX4htMjevjuCQGhEuBBS1WlFU7aA8OycHjkjeumoVz0B3Hzywwr1zJj0iY6/A3j1+DPHL8FaMR782Nnl28jx5/SRsvuAxHr7zsa01atVAcHh4yJ+JCpFXf5UdB8Z+1yW2JTpwHtje337rN2pUw2bNSFwlE2IorRs3ZnfpMnBg685q/fnMHr2UZWXjwJjhDocenj8a/wvAoJIBIyYr1KlRx7e3LwJqRfX27Qc3b953cbEnWRsREZOTk889ahnkP3woxvZq0ADdKBQQ4IWVFA8UfQUGpv/34MEk3hPzYggZ5Y1wWTkq6eu3YP2tVjU1dWnVCrYtiMeRRjSnp6T79fODNPM8o5NWaRDUjfQaPHKwDm+JRiarVBXiwJhLqY7SEf67pGTrokU89qQ/f45ttXVwYHDTatVmbdzIVMtOIfZpbEpmCna+cEgKgq+cvFJb0kQhlkN1y5YgLsggsO3wzp3HTJVdKCrCfwshp65YW9f28+vD7qiGsjocGHwLr9u+nWcyT549w7Zi00q1ad0agqCx8joGwisubCJ5Zprg5Jgyu9DIrhG7qqev5xPgw0ZoWccYCD10CJ4lkiaVS9gM1apjR6bLkt274U0QUy1ThfM3zmPna2luycb7uPYZ3ms4G9Gu8sGD52GNxWPz3btPsK0VKuDdhIODLVv+558nQsooNqKGMt4ywQfeExTEk2+X9JIMm1aqc9u2gpsnWYUbV2x8/fI1ybzcf8/n5rbCKZcMGBQWtP3IdtjxwCC6XYA4MObS7ZmyZwcP20/8+ScbYZexuX1hydWkZUtGrEydUcnMWlaIfox/R9i0QVNGspFlo12LdzFVbSzAkV0HD57jsfzBgwRsK9YtgVeDHFGMfP/+nYYM6clU1VbQWByY2mZYyoE0FcdTSrPV150SxMs1pYeXnq9EDnNSOg7sU9QnfoPV3So2Qeqej8Dj6dqrI4HpoeooA5QBXWQAYr90cVplbk7UgZW5W04nTBmgDEDgFyVBBxigDkwHbiKdAmWgrDOg7HlgZZ0vXZk/dWC6cifpPCgDZZgBlePAyjBnujB1Ne1C1AWq6BwoA5QBygBlQEoMUAcmpbtBbaEMUAYoA5QBhRmgDkxhqqggZYAyQBmgDEiJAfoOTEp3g9pCGaAMqMQAKX6LfY6XTLFMkoSrNDjtpDEGqAPTGPV0YMoAZUBTDNA4ME0xL+y41IEJyyfVRhmgDGgBAzQOTAtukgImUgemAElUhDJAGZA2A9w4sNbNWkvbZGqdAAxQByYAiVQFZYAyoFkGuHFgpLdimrWTji4sA3QXorB8Um2UAcoAZYAyoCYGqANTE9F0GMoAZYAyQBkQlgHqwITlk2qjDFAGKAOUATUx8H93yH8qB9qfjQAAAABJRU5ErkJggg==", "text/plain": [] }, "execution_count": 44, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# split in small patches and transpose each patch\n", "rearrange(ims, \"b (h1 h2) (w1 w2) c -> (h1 w2) (b w1 h2) c\", h2=8, w2=8)" ] }, { "cell_type": "code", 
"execution_count": 45, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAArZUlEQVR4Ae1dCVzN2RdHC9qQrbSREkKmRWmyDSFkyZqQvWFsY8kMYexj3w1jLXu27FuWJlmmQg1RQqmQbSgpKf7ndfT7v6ne73ffe7/fW+r+Pn1+zu/ec8899/t73nnnnnvPLf/12bNy9JKMwHNjyXXC18ydMjf8YnjonVDoysHMoal9053HdgrfrTQ9KBcgaTRVCi+Fhx12+1vs9WW+lgLE+hHQZK2llcpBIPdTrnZFbeh788rNjAbPU5/DHz5GXI74vt33TBUlKAIUAYpAGUSgQhkcs+oPOflxMippUNUALRk8rty2MvnTt3JqvVT/JVINKQIUAaERoAZMaISllv/r2F/bNWmHzTLeZYA3hvTkEZNdrV2lFkcbUAQoAhSBUooANWAq9GJzsnO6f9898I/AL1++1ClfB/5QOYZOe5pmpmkWHBjMlKiQ9lQVigBFgCKgWASoAVMs3qy9TRk5JepaFCtLufz8/GmjprHz0FqKAEWAIlAWEKAGTFXe8p3IO0f3HuXU5tnXZ8m5yc2dmgPByUwZKAIUAYpAKUaAGjBVebl7tuwhUSUyIhLYwNrdvX2XhJ/yUAQoAhSB0opAeboPjP3VKnIfDxP0YleJqVUJJ0yRADEjVx+CwsP+rug2J3Z8ylGAWAGiHhgrPLSSIkARoAhQBFQVAWrAlP9mXK1cP2Z9JHe/ApYGgNI9vXsqX3WqAUWAIkARUB4C1IApD/vCnmFxfGWdypqapFlRUpNSoalBFYNCAfRfigBFgCJQFhGgMTCOt67IGAa5E4ZK0xgYx8tTgWpFfn5UYLhSq0BDPByQUYBYAaIeGCs8yqjs3r87SbfkHhuJNMpDEaAIUATUDgFqwJT/ypzrOYMS0TeiUZXjB46z6zRj8QxISw+7wdjZaC1FgCJAESjdCJDGXUo3CsodnVt7N1DAwcWBUI1WHVo5ujqWL1+ekJ+yUQQoAhSBUokA9cCU/1pXbF2BSjzKemTvbA+RLSa4NWrSqEbNGmGtkYmRezf3ChUq3I+979LaRfl6Uw0oAhQBioBSEaCLODjgV3AQHvL5VqpcCXTCBR1oycTp8NBw8MA4lFZktYIBUuTQ+OiLwsOOIl2jwI4P3cjMjg/1wNjxUWgtGCpLHUvoMiEuATtuWLWhRUULpPdu3QtHq6iW9VIoPLQzigBFgCLwHwSoAfsPHMp98BnlAzOEoEODxg1QkwfvHuAhlsamxgNHDmQOt1SunrR3igBFgCKgCgjQKUSOt0CngChAHAiwVtPPDys8dIaMHZ5yFCB2gKgHxo4PraUIUAQoAhQBFUWAGjAVfTFULYoARYAiQBFgR4AaMHZ8aC1FgCJAEaAIqCgCSjBgX758CQwOBjw+5eaWrwMr7+r08/P7zt0daU0zM31ra6S7DRkyISAAacRPnL4XH6+ioEpW63nqc1wTDyyikZevg7zy0EwKD8ndqmXNunWLvb07fcOnAKD09GeRkREFH5nyv/wyZtCgLki7uFja2RkhbW6uZWNTBWlfX89ZsyYiLS4H6devX6olLlxKP46Lm+njA1xpT544lC8Pf2M7duxevz7SztrabatVQ/q3YcP2rFqFNEoVp7n6UfX6BVsXtBzaErUUoeBQPiYh5njYcaR95/i2/7E90qYepobtDJHWc9Or1aEW0qMXjN54cCPS4nKQ/pz3WdUhkEm/zZsPN2zoBU0TE1MKPj4OXl5TW7YcirSOjquGhhPS5uZdGjXqjfT06Wv//PMI0titON2r15Rz567LpA5HIyUs4nDs3Dk6NtbHy+vYuXMfsrI4FCystq5Xr1/37gvXrIECPIRT28Jiip/f4hkzClkE+ZevIDyYLgczh+iUaLjzpeiJ6ycgfwcYMPIsHn
x1/X85fAH0f4nfKDA8QK1atT029taOHeuL1UtXYGXVsGdP7+XL50Az2CgOd1vbGosWre/RY4B0gqTkFgweiXo4aWp+yc+3aNAgOSFBIlOxij5jxnj6+vq6iDbIR38V4TO5R4/p69fXNjMrxstngXD7wMDwgKLzxsy7/eD20ctH5VR6SLch/dz7dZvYDeR8jRbh06Rfk42/bGxt31pOyRzNhQOopI6/fv1aoYIj1Jia1k5NTS+Jhbusbt06rVvbBwWdBNavX0UZ8sCYMTR3e2k4lGDAGrdpc//hw7AjR/66cWPW0qWgba0aNcCSfczORrqGoWFcwf+9oLVrXR0drVxdoRyNloapae+uXYM3b5ZmjHLx8vsFhC5X8e3J8pTLNTz5G/MLULlyGRnvDAyqgl4NGhh8+JCJxgaN2YYNe9LSni5a9CvU2tk5Zmd/TEiIA1pf3yAnJ/vzZ9Ev4po1a9eoUev+/X+AXrdul4NDS1dXK6BRjoVFRW/v4b///geUKObiGx5urdGAzdm+PS4q6uDGjdDAwNAwJysr99MnoMEgGZmbx0REAL0nOrpB8+ZOGhpAo9Ea0arV6Nmznd3doUQxF+/fz+8y31XVrwrKazpp5n/JB2OT/Slbx1UHSjb8siE1PXXxjsVAN7VqCuWJKYlAV9KulJuXCzNDQNtY2FgYW5y/cR7oW3tvAZtWCy2g0WhZ97SeOniqX28/KFHQxTtAEvT+/DlPS0sTKtHY7No1Py7u8eLFO6CkXj2Td+8y//03A+iKFbV1dSu9fSuiHR0bW1gYHz58EejQ0D+aNLEyMhJ9chijpa2t9enTjYKSrwKlvlOcAbt561aL7747d+WKR8H8BoxKqishIgKcMJh1g1ZozHI+fapUsaJUQmRg5usLCE2UDAqwNGEyTrHwCF7FF0CFisbERDVtag+W6YcfmkKZs3MrsEZg1QrrZfw3OjrF2NgUDSEaMxkFSdmMb3hK7j47K2vT7Nn9x48/vWvXH7Nnl8xUUmmV6tXfv3kDNWjAYKaNoUti57+M9+/nqLgom7o212Ovd/qpE6hraWKZ9DwJjZMM2lfRq/L+w3tomB+ZD3s00atDYyaDNFma8A5QSUqEht4cN27Jpk0zDxw4v2nToZJYpCjbs2ehh4eroWE7aIPGbMeO48OGER2yIUU3BayKMGDw6XHr0eN6tMiX5P1CY2Zqbx8eElLP3Jx3+Xx9AdEpRPJXg2aGnF8eTjRmBw8G9e07RB45ktry9fmRJB/KH9y6NdjJSebv6BIl6+jrf8zMhCo0bEvHj/dft65ETjkLhfh+RjMjp2LszcGfgNk24EFj5j7W/fiq45UrVmZvJUutEACJ6QEfmxkzNixZslOsTBAS5iRTUk7zLlrwRRwwMQhz6wJZL4Aj/tEjuKe9eNFj2DDe0aECywgCAQHjnz9PVcfBnt27d9j33/NrvQAHtF4MIAfWr1/j7888UgKtF+AA85BwD70ZOihgkDrCMnDgTAVYL0CmSZP6QuAjrAH79/37lZs3w7Qhv6o72tmlFPpzNvVFuIAfFnvxIr+98CsN127wuIID1MO1G8pcwcErRjB5CPLwzqvgosImTQooMosYH/8eJhiL8qn8c8i2bRsDAnJzcvjVdMnBg+h4MWLhcWJBuJopUUECJg9BK7wLqt6M4TNgRhG7QK8L/LDDyw4L2qkQwkeMmBccfEEIyYzM8PBtMIsIf2fOCOLBi6J2Al0Qo+ru63v17795lx8VE+Pk4SEuNjsnp3KlSuIllKYISEJg9eoF+fnfvoAk8ah++bn9+xeMGsW4AjwqPL1v3+ECL+7lUVvFi1q0fdGd+DuK75fHHvPy8iEutX37MR5llihKIMeL6UvAGNiDxET7jh3BtDCdCU1gPAz+S/O44oWvGAaNgZG/fUXGwECr7t37Hz9+AIgibhm5wiycfH1+inQBKwxDtm49rMDluKAAumUwXYkpp4uoJNujECEeBcTAShwsxsNg6aNGBY
0SGWQpFAKgcuVevnxbpYqesXEnXFsoi2LEbaZOHbx8+a6oqN0ODo2IG5EyCjWFuGHHjomzZinSejEjPhUaytCUoAhwInDiRDAnj0oxfPzwwb9PnyN//qkUrU4FBSmlX3XpdPfp3aqv6pIlgZcuRSrAegEUsLwefvE0bmwpBCyCGLC+o0ePmznzfFiYEBoXkTl60CB0vL5r0gSruilwF0sRZVgeaQyMBRysUlgMTFwT8LrS0r74+IwSwv0S74hHukOtWs+Tk4WYPCyu5NzAQHS87Aq2YwKD59ChxdlUoURhMTDxwS4evxgdL8fGjlju281XnEEFaQODVitX7u7SZYICdIPol69vt/z8yMqVBdnyJIgBO3TypAKgwS7+3L07IjIS6Nt37yqsU9pRqUHg/v1YGMuxY/thW7RaDOp+dPSngi3/itF2oZ9fckHOtphr1xTTo3r1ErAh4Hb8bdBZAYtH+EImM1M9Puok4xXEgJF0jDz6enqidcD9++/ZsAEcKfjLTU5+dfcu0lBe1cAAaeQvcecybDLD2jZeXkDAZucFq1djiercRWOT5o9Ec8yCWGpyIUJyDRg13kmGP2/eamC7c+f57dvPHjz4F/GNjEw+ciQMaRSSmprfurXE1BLt29tByA1SftSvr0vSqdJ5Gjk4SKXD4KlTO/Ttu/TQIdFSsK9fI7KyQh4+RLpmQVoApCXJhCWOXg0bYq1nvXqfc3NhszNsO5PEr8Ry9IEYT4hTk1VTVgHP8/PPHx1/lH4hvWCt3Nc3l9/cCLyBNEoA+unpp5KkQcTLfqA91tp42cAjROA8J3lK4leFcn9/kY/YrJk1iTLdu7cZO7bvvn2LcDFhZmZ4Xl4k0mlpZyMjdyHt5fVDjRpVkUaxQCcmHrO27gmPc+cKNd2tTAOmoaGxe/36Tm3bbl+1amCvXjhsLS0tSCWFNJQ/vnkTabxvWLxY/LEIDbmpEpOSoPDomTNFquhj6UMAFl/goGrXNsbsU/BoYmLu4tIay/v0GQwEzL8vWbIJS0rBHRNnEA6kfe/esPx9SXAwENikko6OmZUV0qeSk0cGBBCKArZnSUmnd+8GApaQkLdSWc4BnQagbpCto5ZhLaQNDQydmzgjfWjpISw3q21GMoqE5ARIFgycZyJU+vvnn38SQcmsrGySQQ0c2HnDhl8GDOiEzHp6Ohoa36xGnTo1IZsUlh8+vOzly1Ckmzb99gGrX98UVyHOmTOapC8ZePhchXjywgWdypXb9+tHqIeBvv77gtkJWK9YvVq1mtWrQ0Og7z540KdbN6CZxFFgkLxGjOAUCx4OJ4+0DAKtIitRjbev3zap2eTuq7twL5GhSGEpSyUFMTBwv+Du4cH9A3/PnjNGRnUaNWoGmHz+LEpkV7FiJaAXLPA/cmTvrVupQBOuZty169TgwV1btHALCQkvgrD8jzx+fsa6u1erVQtmLAi1qli58rWPH4H50b17tUxM9KtWBTrq8uW/TpyYvHIl0Jg4au3p05M8PSH/L5SwX+CrsTPIUMvjIjuYxAP3C+5Og7k/PyfXnKxtWBvdtY85H7W1tDU1NEH/ZUHLYBVGzP4YoHE1o5WZFaZM5Bwd+GqcPFIz8AhQYd+Y7bDwSeK/hoYGb95churo6PuQR6N2bZFfASkTc3Jy9fV1gA4JuQK2cNaskUCLp+tFesWKn48dCwsL2wK1wl3fbCkvHfy2YsW67dtJRMEy91/GjUsq3CLW0MoKrRe0BRqtF9BTfvwRF+z2+u+uL5IuKE/pRqBdu85ovWCYWlraaL2ADghYGhYWJz52zj0VV66cA37MCyzeUKXo2OvXb4aGXjx0iFCrlp06HUtMROb6trZoveDRsV07tF5AVyhI4/u9h8fEJUsIxQJb/O3b5Mwqy9nVrSsz2ahTSQetF2g7bcg0tF6M5uc2nIOMiMwjJ/Hq31ecPKrPAKkLk5NPoZ6w/B2tFzxCwl+0XkD37NkWrRfQsEqeobHV5MmDhLZe0B
YjWz998TTpZBLQem56+jr6wAD0mn1rOrt2hvMqgVboJQRAEgYAPtnt2w8gHgb1XbpMSEt7Cfk4gK5UqWVu7mfMW1+9ejs4D+XRo+NQPnHiMisrs/Hj/7/lWYJgAYtVwoAJOD65RSvxC0hu3RUigALECjOFhxUeVTFg7Eoqs1aBBkyZw5S17wqyNuShHSysQylDJkxo368f0nBYs1bhvKLJd9/ZuLlh+YzFi8vgkZXFUY6JimHy0zuYOdhU+faT0LaGrYulC/KvWbjm1OFTxdvSkrKDwOHNmxf6+eF42xkauhROV3iYmg5zdcXyuKgoDJuVDlikioHBydbgeOF9+7Ht45aMQxDgMGVNp29rmJ0GOw2Z/S1uVDogYh8FHGXp57cQeYyNO1ap0hppLa0WurrfI+3sPKR795+RhmMws7KykVbWnXpgHMjTX9AUIA4EWKvp54cVHj49MDBgjo0d4Q6Gh71TqAXTxcmjEgzUA2N9Dcr0wFgVo5UUAYoARUAKBMB6ATfeOZvBkSsv3rzAOyczZVBZBKgBU9lXQxWjCFAEKAIUATYEqAFjQ4fWUQQoAuqCgMwxMHUZINWzOALUgBXHhJZQBCgCFAGKgBogQA2YGrwkqiJFgCLAiQCNgXFCVPoYqAErfe+UjogiQBGgCJQJBKgBKxOvmQ6SIlDqEaAxsFL/iosPkBqw4pjQEooARYAiQBFQAwSoAVODl0RVpAhQBDgRoDEwTohKHwM1YKXvndIRUQQoAhSBMoEANWBl4jXTQVIESj0CNAZW6l9x8QFSA1YcE1pCEaAIUAQoAmqAADVgavCSqIoUAYoAJwJFYmCQrpfJ2FucplkQOfFUCwZqwNTiNVElKQIUAYoARaAoAtSAFUWEPlMEKALqiADGwAg1B58MzgMjZKZsKosANWAq+2qoYhQBigBFgCLAhsD/AOOgkaI1HP03AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 45, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# stop me someone!\n", "rearrange(ims, \"b (h1 h2 h3) (w1 w2 w3) c -> (h1 w2 h3) (b w1 h2 w3) c\", h2=2, w2=2, w3=2, h3=2)" ] }, { "cell_type": "code", "execution_count": 46, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASAAAADACAIAAAAr0inhAAAjUklEQVR4Ae1dB1hUR9cWKdJRUCnSFBCjoggSBXsD0YgmKmpijyWWqLFEEw3GGPVT4ydGjIkmJCoaC1asBIwYQSNFQKXaqEtRQboU+Y/h+zfr3d27t8ywd91RHp6ZM2fOOfPOfblz586d0WjMz2+B85/IEqd1sI3ZgWULEdYGYA6f4KOo9zB3QEtF/kk5QYAgwB0BLe5VSU1sCGSmZh75+chfkX/lZuXWVNeYtTNzcXMZOXbk+KnjtXW0sbklhtEjoEGGiPSgNvMQsfZl7frP1h/66dCrV6+kA+tg2yHwt8B+Q/pJF8mVYB4CNTM+cpvJuQAzPmSIyLln0FeEm9WEoRMO7D0gk13gLy87b/KIyccPHEfvm1jEgwAhGB5cOVldMWdFXEwcfdWGhoZVc1fFRsfSq5FSgSBACCaQjmiRGJt4+shpJtHU1dXBMJKJJtFROgKEYErvgv8FcHj/YeahABvv3bnHXJ9oKgsBQjBlIU/1G301miqizV+PuE5bTgoFgQAhmCC6AYKACQxWoeQ+yWWlT5SVggAhmFJgpzqtqqyCJyuqlDZfWlJKW04KBYEAIZggukFPX09Li91Lf2MTY0GEToKgRYAQjBae5irU0NCwtGa3atPazrq5oiN+uCNACMYdO7Q1+w/rz8rggOEDWOkTZaUgQAimFNhlOP1wzocypHJEsDSxh3sPOYVELCAECMGE0hnufd39JvkxiQae1jbs3ACjSibKREe5CBCCKRf/N7zvDN7p1sftDZFUpmXLltt+2tZ3YF+pEiIQIgKEYALqFZhLDP0z9KO5HwGLZIYFEyG/X/l98uzJMkuJUIAIkM9VFHSKUj7HyEjJgJVTN67eyMvKq66ufv09WC8Xn7E+E6ZN0GmloyBiSjHmzzGUgg+libyymPEhBFPQO+QCogeI4EOPj+yhCH0dUkoQIAgwRIAQjCFQRI0gwAUBQjAuqJE6BAGGCBCCMQSKqBEEuCDAboEpFw9vb52XtbV/JyRcv3UrOTU189Gj/MLCispKEOrp6urr6bU1Ne1oa9vJ1tbD1dWrd28He3ulI1FYmJ+d/Tg3NysvLxt+w8+zZ8XV1VU1NTBV+fo3/DQ2Nurp6evq6rVt297a2g5a4Orq4e7uaW/voPT4VTEAlZlFFOWK3G3cpSHOz2+UFiKRwM4zGRkpQ7qYUazBJXjl2rWQkyfPXrkCjKKUyss6dezo7+c3w98fEpI6GlZWkllW6bCbYbD+g77KscDLyckJKSlJqal3y8pK6ZVpSh0du4wbN2XKlNmWlm8sMiaziDSgQREhGBUfuAoPHNgbE3MtPv5mRUW55LZ2QLlDoaFb9+xJzcykVmOWhzfI40ePXr98eTdn56YauAlmpcGdwNJt0tbWAY6tXLkB7m9NpYRg0ihJSsgzmCQar9OPHz/YsuXLqKhwYJdkGYwG3x01auayZZzZBdaAoifCwnp5e3+xeXPNy5eS9lUiXVdXe/Dgj4MGdT179qhKBKz0IAnBFHcBsOLbwMB+Y8fGJycr1magAR8v/ycoaMC4cbkivBvfM4iFi0pJybMFC6Zs2/YVl8pqVodMcijo8Krq6onz5l2MjFSgx744LinJw9eXfT2h1AgM/BY2aVzkP8rZwUEoMQkvDrx3sJIXL4TXZHYRDff3x8GupiAKiorYRSMw7d27twybOPFxdrbA4mIRzokTB1los1fFSDB4xvCbMYN9SMKqcTM+XlgBCSyavIKCsbNmwQS/wOJiGs66dZ+KRBj359KIb8Q1zd3URLcEpk3lqIfZQUILN46BMauGOfwWLTA7IPjQ9zPGOxi9Y1JKEFAHBAjB1KGXSRuVhgAhmNKgJ47VAQFCMHXoZdJGpSFACKY06IljdUCAEEwdepm0UWkIEIIpDXriWB0QIARTh14mbVQaAoRgSoOeOFYHBAjB1KGXSRuVhgAhmNKgJ47VAQF
", "text/plain": [] }, "execution_count": 46, "metadata": {}, "output_type": "execute_result" } ], "source": [ "rearrange(ims, \"(b1 b2) (h1 h2) (w1 w2) c -> (h1 b1 h2) (w1 b2 w2) c\", h1=3, w1=3, b2=3)" ] }, { "cell_type": "code", "execution_count": 47, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgo
larhjLAZPmHksNfKp6sW0s32E8kRXArldAl7zp0CGCGum8kL0LP4crMprFGsARkOvyY952F/cBYbcDHHj0S5vITVqwAU6lmTYFHhWDIztuxA7hA0aKM4Y9mEGMPMNRAiceJXKb9G6axKM8OdRGfmeqUgr9sWkTKXeMtEdf8+VGsVtKNHXQ6W8fqeUL9/v6bwEydOojnseoHfK/BHQaje4wPD9+Dhy/YdeHRDGy8mTd3XsYYA0h2PzB0jK12rwscHR2D+k+evCC0gq1VwAwcOFPgW7QYjv3AsHAU8bQ3GDpmbd0ENk+eBrB8srCojU1HhaWk1v6+Nlu2GkPnDM2Zsy5fWJyh8MckzvQK2BYrJlyju7ow/CR1vX/+0MYlS5A9or5mxPMSfyIKlNBYlIe84YnqMMaLQDdyd4WG8iQwpiOwvhrLbArFZNbIFaD4GT3uwXcVW6sgi5uuPAm8d6/y04GlewX+7t2DYHADVuBfvDhrW8RWWMy3b+u+cXHnl41b9vz5ab68afn69ZHH1iRsTyY6bFWuXJB/UDY7u9WLVvMnYCmpod2H5nFwwM6/PF+8alVnW2eHevUc8iRZo6x6s2Z22ew8v/zS3tyeLz9h9mzHgo7Y+qRWqVo8f/7ixVa1WmEx2eE9lSk/nxbPXIzsnh17eBL4ToTiUclkKAVqqE+UYEEpoQPN1adL+qj3e/lDtGHKJo11gYP9/VGMtkzky0uciRVo0K9BgQKuJVqVKF3ag79M7JJVs2ZPMxczD49RPJ+rXq4RI+ZZN7FeujTJD6JLDxdsxtjzu57Pn8fy5RdvUX40QsJDeDLT4GT3A8PVwY80KsvvB0ZYWx+B2dU//7xrYoKlj+5mSVInNmy8a6085gBrghKJCYv5grG+ayLwkRcvJsv779+fLD9r/PhhS8ajni+Gd+Drr1G16m7f3RZ3TZZsXMLzaH/ExBFgWrZrKfDKY99cDxN7Kv/qSYHzcXHQH0v6Cp/Cvtu3waw7c0bgp65fD6Y7dmZJ+qlhkVww/DNyeroA2YzhFDi55qRJtQsmF6qp9l0/npx4kiyPl4LBL+61RSiPh8jBNLLqb5Lzwrta8KPRZQQWBnewd4DleYkzhwLyFmLm+BzlVUgFpAKfnAIGiYFBZSHK9UGG4l5CDEwrn5YcwLQio6xEKiAVkApIBfStgBzA9K24bE8qIBWQCmhFARkDkwOYVr5IshKpgFRAKiAV0LcCcgDTt+KyPamAVEAqoBUFZAxMDmBa+SLJSqQCUgGpgFRA3wrIAUzfisv2pAJSAamAVhSQMTA5gGnliyQrkQpIBaQCUgF9KyAHMH0rLtuTCkgFpAJaUUDGwOQAppUvkqxEKiAVkApIBfStgBzA9K24bE8qIBWQCmhFARkDkwOYVr5IshKpgFRAKiAV0LcCcgDTt+KyPamAVEAqoBUF+BgYVcivUsg3YVie7QHGd0krWA5gWpFRViIVkApIBaQC+lZADmD6Vly2JxWQCkgFtKIAHwOjCvmdwPgmDMtrdw8w/rrkAMarIbFUQCogFZAKfDQK/B8ECT91932FJAAAAABJRU5ErkJggg==", "text/plain": [] }, "execution_count": 47, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# patterns can be arbitrarily complicated\n", "reduce(ims, \"(b1 b2) (h1 h2 h3) (w1 w2 w3) c -> (h1 w1 h3) (b1 w2 h2 w3 b2) c\", \"mean\", h2=2, w1=2, w3=2, h3=2, b2=2)" ] }, { "cell_type": "code", "execution_count": 48, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAcg0lEQVR4Ae2dCVxV1dbAuZdJGRQ1B2QQVEQGFUecnooJpKk5NKhfhdn0Si2fT3t9Jc7mkNbPMClLjbIvcwZTE4ecTUBlEJVJVByAVEaJ2beKPrpe7rl3nXvPsM9l8fPn75x91t57rf/e96xz9tl7bQsL+iMCRIAIEAEiQASIABEgAkSACBABItB4Cfh09Vm9ZnViYmJBQUF5eXlOTk50dPQrr7xibWPdeKGQ5USACBABIsAyARtbm8/Xf15dU/1I19+NGzeCgoJY1p90IwJEgAgQgcZIoEmTJqdPn9bluf5Oq6quCgsLa4x0yGYiQASIABFglsD3W77/21NxH1VUVgwcOJBZK0gxIkAEiAARaFwE+vbty+2ztK+cizvXuOiQtUSACBABIsAsgQ0bNmi7Kb3nAQEBzNpCihEBIkAEJCCglqAOqgJDIGg4v9kZwcHBmGJJhggQASJgrgTIgbHSsu7u7rxU6dChAy95EiYCRIAImBkBcmBMNKi9nb2NtQ0vVVq2bMlLnoSJABEgAmZGgBwYEw1aVlYG8+N5qVJYVMhLnoSJABEgAmZGgBwYEw36yOLRrVu3eKkCi5p5yZMwESACRMDMCJADY6VBjx45ykuVI4eP8JInYSJABIgAESACohDo37+/3mnzj108f/68ykIlih5UKBEgAkSACBABvgS2bt36mJviOKmsqhwyZAjfwkmeCBABIkAEiIBYBJo2bXr217McbuuvZIjzC2HpxdKAyiUCRIAIEAEiYBwBiOcLITm4otHDvipPjnjSuJIpFxEgAkSACBAB0Qn4+vh++umnSUlJBYUFFRUVMEExZm/Mq6+9CputiF43VUAEiAARIAJEgAgQASJABIgAESACRIAIEAEiQASIABEgAkSACBABIkAEiAARIAJEgAgQASJABIgAESACRABLQM5oDrY2NoH9+8Oa3O7dunl16dLe2dnB0RESf//9dwhue+/evWvZf/zFx8efPnPmWlYW1iaSM1MCzs7Onp6esI+Muzv898df69at7ezsYP1c3f9woFKpoPNAF8rPz4dwkdnZ16D/nD17JivrmplS4TTL08NjWFBQn9694cfVqVMnJycnO3t7S7W69OHD0pKSmzduZGRmJicnHzt2LCkxsfbRI86CzOICLFDx9vaGPuPi4uLm5ubS3sXVzRX6T9Mm0Gv+7D92TWFHiLqbz4OCB9B54O/ixYtnz569lHIJVraYBQYeRnTu3BnCA/n6+nbt2hV+em3btoEu1KRJU1tbm6qqqocPyx7++VdaWgKgsv74u5aZmZWampKXl8+jGuWJqlRPPfXUd1u2lBQX61+0q3k1PT19yZIlXp07K8/ePzV2cXXRNEeQY+heCqWBUVutVvn7+0+d+kpERMSJEycKCgpMgXblypX58+e7uLhiqla0jIeHx7zw8KtpaXhc8LC4fv36foGBijZcS3lHB8eQ4JDweeG7du2Cm2tNdQ0eiJZkYWFh1DdRo0aNUqnlfOLXMlCcU9WwYUMjI9fn5NzUgoA/hafGqKhvJk+e7OTUQhwlZSpVpVaHhYVdTk3Fs9CSrKmu3vbjj75+fjJZYHy15MCQ7Jo3d3r//f89ePBgURGP5xutfsJ1Covq1q+PbNOmDVIZZYn5+vh8v2VLdXU1l/kG00+ePDlC4Tt9w1vCvHnz4LXJFA5coDLSM958801rK2tldQyMttbWVmDa1atXuWw3Ir2ionz37t2hoSEWZhC4NTAwMCE+3ggKDbNUVlQsX768ia0tpmEYkSEHhmyIPn36NGxxYVPgheOFFyYh9VGEmL2d3eo1a4S6Ze+JjnZ1c1OE4Q2VlKD/pKSkmFkw0hEjRgjrurR+sMqOfgcvXvBMVF1VpWWViafxcXGuLi4NezCbKeTAkO0iwQ2oruMtXrwEqRLjYj179sy6ds3EX5NW9uLi4meffZZxw3WqJ03
/qa2pXbpsqRmMKKpU6hUrltfWGj++qtVzdJ4OHjxIZ2MpIBE+kP700086rTI98e6dO95duiiAgoUFOTBkM0lzA6rrex999BFSK2bF4EsDzDsw/aeks4RlS5cyaziXYlL2n5/2/QRzQ7g0YT/dyspy69YfdDa9sIkwrss+DR0agt5nzpwRloVWaRAn0MPTU0fdjCWRA0M2iJQ3IOhLL7/8MlIxBsVenTatukbcZ+fPIiIYNFyPShL3n0Oxh5TrwzZu/FrrdirGKcwH0dNe7F6Cb1TwTVgMIlplJiclsd+HyIEhe6rEN6DCwiKFTk2cNGlSTS3Mfhf9b9WqVci2Y0FM4v4D9Hfu2MmC4Xx1eP3110TvOn9WcODAfr66MSG/fds2aQBBLV9++SUTNnMrQQ6Mm81jV6S/AcE+oo9poIQToATL3ST7fSnoPVX6/gOt8P5/3ldCr/lbR5iIC585pek/H3/88d8VK+Xo7enTpaFTX8vAgQNZhkMODNk60t+Aampqu3XrjlSPBTEHB4fs69fre74EB/CZrYu3Nwu2G9RB+v4D/GH+Z0BAgEHd2BFYs2a1BN2mroqwMBFH6dViMO3q7b1acq+7bt06C5XZLzMUo7kae5mwYnrmzBkKovDx6tUeHTpIqTAM0W/atElNvy8O6JaWll9t+EqtFuV2ylGn8cm2trZSzmtPSblkvK6y5IRVqJK5d82KRo8eLYu9mErpDQxDCWRkeYKGRdMwXRapobxivXr1kubTl+Yv669H6alT5bUdU7ss/ecvPi+HYTSUXWbkyJENG1eklJqaavYnKDzWIuBFeLEoKiyEhzuYDQzvbRCazNra+olWreAYlqFAOq8AQsePHXtMFWWetHqiFRKgWYaSwt+A3nnnXYhJBqN/zs7tIH6HtbWNrW0TNzd3WGS6cuUqmJ6KxFgnNmrUSEX0l9hDh3jZBZ/Ktm3bNnHCBD9fX5gVDH4aYtxBjMQ1n3xy6/ZtXkVBbFIbGxvGKeH7z6x3Z0F0KJCHAJvwTQjIWFlatWjRIrBf4Nw5c5MSk3jBAeG0q2mKeAlbuXIF3rT79++vW/c5zBiCJycIhwjD12AjhO1wdHRwdm4P9MaNeyY8fN7OnTt+++23hsVmZKQz3mG01cOH24ClzRBQQ/8SAbgKo5EQQaohGp0pnTt10lZIaefkwHS2bMPEdu3a6WnbZs2aRUV92zAXV8ratWv1lMbIpQH9+3PprzP9559/hhjZXMpbWlktXrKEV/yOaa+8wlUaI+l4B9aurb7+A+ZMmDghLzdPJ1iuxPHjxjPCQY8a+/fv49JfK33Tpo3gsfQU9fglFTi5xYsXQ4To+nIglNTjMmyfBQ0fXq+6/oPioqLQkBCkNePHjUOu1oSQH8gymRUjB6a/59Rf1e/AoH3h41ZsbGy9vP4DWPLBbJeoVwxCHeq3QvMqcvYXDCjhfVhcfHy9MmweCOjAwEA3VzeIf6hJVf/x3r172cSiqVVSUqJ+K+quRkfv0czF67hjx47/+tfs48ePLVgwn1dGmYXB32LQ1NbUjB0zhpeu8NURU/L58+d5FcugMDkwTEODjEEHBo3bsaMnTDLEFAhDJQx2Bk2VWrZsWV5RgbEFZLbv2IGf0TR79mxksSDG+HQ7YR0Y8IcBRohGj+QDjwLwHUSz1Rg8zszMxJjzwgsvMKi8iCo1b968vLwcg8a4ZVuYuSEw2AhqiGik+EWTA8N0IZDBODBorsOHDyMLZDxKPQzfIQ0BZ9yM568AnvyQhcOoo/g/AuNrENyBgSqvv/Y6Eg6ITZkyxXjtJcmZmnoJY84HH3woiTomVSLkvM9x48bBBE2D6sCWFgsWLjQo1lDgY0REALWl5aDBgxvmpZRGS2Dfvn1I2318fJCSsog9Mx77fQViGMIQPS8l4dMFUn4Mz7ETZLEsi23evDk9DTsZITQklGVbQDd4ocRoOGfObA8PT4ykjDJCOrBg3Det/fv35969a4TNx44fh51kDWa
ELWgNypBA4yEAW0MhjYWBe6Sk9GKwBisoKAhTb0lp6frISIykpkx0dPTNmzc1U7iOA3r0aPXEE1xXzTIdtmNe9tEypGmDmI+8DjunYGxp0aLlr7+eDQkJxgjLJSOkAxs0YADGDKPnpcD48oULFwxWAXtgG5QhgcZDIDExEWlss2bsDj77+vs74uaDwTwCGMlHmqwpBtuAaZ7qOe7bp4+eq2Z5CbZ1RlLt6NkRNoNmGcK5c3FI9dq2bXvwYOyePbsDA/shs0gsJpgDA1ORUeHjExKMNjIvL89gXsYHggzqTwLCEoD7DnKCBixtEbZqAUvrh/YZu3bsMK5e2PYImRG+MyElzUastLT0wIEDGHNgkzC/bn4YSblkYmKiYQMwfO3PPDPu11/PJSQk/POf/4SZRPiMEkgK5sC6du2KVPfK5cuYT4g6ZZ57/nmDtbgoZ5dLg7aQgCAEbt++jSnH0ZHdB2cv9L53CcZOxE1Cv6rCUmgMTzOT+eWXX5AWeXTwQErKIpabm/fjjz/yrbp3796RkZG5ubkQXf6NN96ARc18SxBDXjAH5uHhIYZ+RpTZskULCOdhREbKYq4E7t1DTZG3t7dnloA7LvhhUVHRjRs3jLMiPz//3r17mLzu7u4YMTOTSUlOQVrUvn17pKRcYjDDsLCwwIja4db61FMjYRr57dt3YJHl1KlhzZrJ+dhnhg5MpVY/0cg+MhvRERtVFpj4irHXzo5dB+bm6ooxIS0dO1lOZ2mwQkhnulaiC04ZrVxKP01KTkKa4Mwd/QRZgthi169fDwub+uhRrdEVQfzi4ODgzZu/ycvL37Fj+/jxEzBT0I2ujiujYA4MYohx1SF9ugPDj9LS06AaKypQkxpUDEdbR0b0gciipjQ37BGFyc7yqypGf+NkIC5rVVUVJq8i+MTExCxeLMCSPojVO3His7t27YSdlxctWiTx0KJgDszOzg7TtNLINFFIZHFpaFAtyDcwlkEhf19FOA/EZWkxYpkK5EUqw1WLctNLcHyUsrPBwoULZ86cWVPDY0KHnrZr3brN/PnzYQQ7ImKdk5NE7zOCObCmTZrosU3iS7K8zEpsI1XXqAggP+tCyFBTsCCz2zIfk94UCHryFhUW6blaf8naRjHf4GEbRdg/5K5RC3Pr7dU8gB0hZsyYnp5+VZotxwRzYJo20DERIALCEvgdt7TLxGd/5FIzpJ8TlgALpSEHmasqUSONLFgEOsCWBd7eXT75ZA1ygBSjNryNwWZY334bJfb+O4I5sDLTHv0wUPAylbiP9vgCSZIIyEugHPf7QnogLltgDxquS5rpTP3YNRUT+9gBt1Kwohw1aUhsbfHll5SU/vvfcyAEBGz9VVpqONoRsuSXXno5Nvag/g2zkEVxiQnmwJh6KKusrOQymNKJgBIJwPx4jNom3iyQ2U2cKoIxhE0Z5OyMhw8fsqm/fq1gDurMmTNgS9h3333nzJnTpsxRrK9o6NBhP/zwfxYWqvoUYQ8Ec2DIFSTCas9VmkI7UJ05j2ofcdlF6Y2WQM6tWxjbvby8MGI6ZeAe08XbW+clrUSkMlq5lH4KAaKQI7T5v+Ur11gI9fvZZxGDBg12dXWfNevdU6dM9WSwbuw//3lPJCCCOTCjl0+KYdj9AmPW6ImhiRFl1vCJ8mJE+ZRFiQSQkXYh0k+btm2NM7BT587I9Sc5uLC/xqnBbC7vrijvDvrDknBmrcArdufO7bVrP/vHPwa7uLhNnz798OFDRn8kW7p0abdu/viq8ZKCObDr2dnIWmEbJ/gWKupfmTJf4esAVldWI0laWlkiJUlM6QQgABvShN69eiEltcT69u2rlcJ1igxnzpVdoen4t1tYJqxQG3WqfffunfXr1wcHh7Ru/QTsdrZ9+3a+H4ysrKxghr3Owk1MFMyBXUpNRaoCO5wiJRunGDLoNcBhPOh142w+kaw+j9iHoa7q0U8/bZwOzz33HDJjXHw8UtKcxIYOGYo0Jy09DSmpLLG
iouIffvjh+eefh9XKr7/+WlJSIl5/iAgsxuIwwRwYLFNPx4WxCQ0NxZvdCCVhCLG4pBhjOOM7CGNMIBkkgYy0NOQ8jjFjxxrxxdzB0XHkyJEYZWBXo0S0N8UUqBSZUU+PwqgKi53v3L6DkVSuDMxa/PrrjQEBveCFDBlTERYyhoaGCG6yYA4MNDtz+jRGv7ffeovlsN8YE8SWefDgAaYKP38/jBjJmAEBmNgTe+gQxhA3Nzf83s31Bc6aNQtiAtWf6jk4derUw7IyPQJmeWnAwAEAFmNafCN6PX0EL2RDhgxBjij26NEDA5CXjJAOLAa3nxB8ZI784gsLhuPO8SIohvDdO3cxxQ4bNgwjRjLmQeCnvXuRhswPD0dK1onBDstz58xBZtkbE4OUNCexDz/8EGnOmdNnkJJyialUQt72U1IuQXB6jC2enh4YMV4yQlpyYN++Elwotv+ZMmXj118jo+Pwssc8hJFTOvv17YefGWUeZBqzFeA5kN9He/bsOWfuXCQrtUoVtXkzcglzTW3tzl27kCWbjdiIESOeHvU00pyjvxxFSsolNmBA/4yM9PDweR1we/QY1DMx8aJBGRBwdGyOEeMlI6QDK6+o2LZ9O7L6adOmwbs2vH4i5XWKqS0tQ4KDo775ZsGCBToFFJqYnpGO1HzN6jVISRJTOoGCwkL872vF8uWYb1rwtezTtWufHj0aCefggQPIpytkgeyLwU6HW7duReoJ3ylPnTyFFJZLrHv37p07e0Eo+uzsa7BLJ+yzbGIIeShNLlsErrerj09tTY3OzZS5Ek+ePDl+3DjkCsE6dZs3bz5h4sRvNm+G1dN1xUZHRwtsiazFwXwwLlwN0z9f/7nBd1mYxhocErxhw4aYvayP/8B29Q1t1JkC6zGQrQQ3IJ0laCXC5kbIAuUS69uvn5bOek5htsXiJUssray4tHV1dT10+LCeEhpeGjUKNZGBq0YJ0vH9Z9u2bf5+BhYnhYaEQqDbhhy4Ur7b8p0ENppYBcSL0tK/pqb6+PFj77wz04i9tuE5CaJPaRWo83TLlu9N1Lxhds7O3VAUk3L1ypXdu3eDd8EI18kM/vMPxkaOHDkC72SXL19Ou3r13v37EE2jpLTUrmlTp+bNnVq0aNWqlZ+/f9/evaGD+vj4aP0svXERBPBaySsZFxeHV+Dtt94OCQmJ/Dzy8NHDOTdziouKm9o1hREh2Da3g0eHHt17BPYPhCU+dXPui4qL8CWTJGsE4uPi9u/bNwo3UR62HAyfNw+GOmD/+L179968cePOnTs2trau7dvDL+iFSZPGjBmDnLhRxyE+IWH//v2sMTFaH3hMhL/LqZf37d934cKF5JTk/Nx8eIWyd7B3bu/cr0+/F1988cknn1Sp4TUV+7dp4yasqHxyPXp016pcrbYcMmQo/Fu71uLGjeunTp1JSrqYmnr51q1bsJy5tPRheXmFWq2ysbGGTV8h3hg8O8LwY0BAD7jzBAT01CqN6zQ7O4vrEkPpHp6eZWVlOj2weIlVlZXwksEQBZNVyczKFAkX4/ud45+gG+EbGHSrgICAap6DHEJ1pBAlLIDB9x+hsNSXk56WLlrMP5NvKBoFFBQ8qNdZyoOxY8dqaCHMoZDfwOo0gpAcS5ctE0Y7dClW1tYdzWt99N4Y7JQzNKS/BH39fPlmIXl2CCQmJq6LiJBen+07dsQePCh9vQqqcfnK5RaPWNfXzc1VjAXFBs2urKyAUUqDYnwFhHdgoMGKFSuOHjnCVxUT5c1sFDEqKspEIFzZYdMErkuUrggCH37wQVaWpKMxsDBxxowZioAjl5KZGZnfRSngA5i/fzdZEMEgNgTyELxqURwYzOOYPGUKMvyoUCZ5d+0qVFEslAMP2sdPHBdDE/j+IUaxVKZkBGAd8cSJE0ulCvgJW85Pmjw5Py9PMgOVWBE4+OoabBRTGQ2EKYiy1L5y5Sox6hXFgYGiEI85aPjw27dvi6G0zjLN7A0MbFy4YKF
OS01M9PP1M7EEyi47gaSkpLCXXoJVWRJo8t7cuYdiYyWoSLlVfPftdwdjlTG+2q2bDA5s8+ZNigxQ4tW5c0ZGhjTfCU+cOKHcHwCX5lHfRglOj6md2xoajv8I3zgncWgSg0h0Yk/omK+0FZb4/iPULys5OdnO3k6zXVg+Bm2FMhxZDtTo4ODAMhN9usGcy59//hlpqiliebm5+vRQ5jVY8XblyhVTsOjM27p1a2Z54G9A5MCgESdOmACrTXS2somJ4Brfe+89ZvsJl2L4/mMin7rssKwbVjpzKcNaurW1VUVFuSCGIwtJTb3Urp2RG9Rh6Ik1hFhfN+zv+dTIkW9Pn16M2xO9PiPfAwixCCvG+OZiXB5WpUAMm5ycHGH1pHkcwvKUsTQI7DRo4MCMzExhdXhQUDB2zJhVq0T5biGsqjKWBlNphg4dqqDdvxwcHGHwWTJie/bsGTBgQG6uWXw9heVHEAmisqIC6bqNEAsMDJSsbaSsCNDFJ8QbAURnloqKCnxgNynNrKsL/wRNb2D1rQOBbFavWQOhN3S2ON/EHTt3tkVHOanXgZEDfP/hi0VLPiY6BoaXGLGalxqBgf02bdqIDJ+hZTXyFF5MJ0+ezEsrZQjD+u01q1fDcB8SBEYM5kd99dVXoSEhZraWWbNFIVgUbGkK0UkwQLhkYF7owkUL8fd9TQUkO8bfgPCGmE0oKf2t4OvjA0E3YGYHVwcwmA5x3YYFBemvhfGr+P4TGxtbVVVlkElDAQhoAkE6GOdgUD1HR4dp016FtVm1tfyC/zUEoply8eKFN954w8bGxqACChYATzN69OgvIiOvZWVpGo8/htBTvxw9Cvf0wYMGQUhfBbPgozrsYLls6TJ4usGDAslr2dciIiKGBw1XC7qNAh/Fecjib0DkwHRihXB2y5Yu5dVJ7j94ADsc9TOLAQwe/adtOwgZFT4vPCkxCfmDSr2U+tZbbyloyobOHqKV6O7uNnv2bIjqW1lZieSgJQav/gkJCYsWLYIwMVqFi32qErsCg+W7ubqC2T0CArp4ebnAn6trCycnGBKBP5VKBe8csN6lFF53i4tv5uRc/f+/5KQkCH5vsHCzFFBZqHr36T1o8KC+vft27NQR9tlzbOZoZ2cHvaqkuASiHcIaBtgdO/1qOtA6F3/uVs4ts+RARuknACsjhw0d2qtXLy8vLwjwBhOC7O3tLdXquiij8DqekZ4OM8Rg+i5EAqyVZEa+foUFuQoODDlj27mdc27eXzO/3FzdRoSMAFYQO9Tdzd2phRNMnCv/vbyktAR+PjCR6vz58zAZLS09TRAl2SwE3sl69uwFN2M/P38I2AE34zZt2sKNxdYWXqhsampqwcPBmwNMa4CZzHl5udnZ1zMzM1NSkuPiEsrKHrJpFGlFBIgAEVAMAV5vYIqxihTlICD6LESOeimZCBABIkAEiIBJBMiBmYSPMhMBIkAEiIBcBMiByUWe6iUCRIAIEAGTCJADMwkfZSYCRIAIEAG5CJADk4s81UsEiAARIAImESAHZhI+ykwEiAARIAJyESAHJhd5qpcIEAEiQARMIkAOzCR8lJkIEAEiQATkIkAOTC7yVC8RIAJEgAiYRIAcmEn4KDMRIAJEgAjIRYAcmFzkqV4iQASIABEwiQA5MJPwUWYiQASIABGQiwA5MLnIU71EgAgQASJgEgFyYCbho8xEgAgQASIgFwFyYHKRp3qJABEgAkTAJALkwEzCR5mJABEgAkSACBABIkAEiAARIAJEgAgQASJABIiAuRP4L8YVQkRxmbXeAAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 48, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# subtract background in each image individually and normalize\n", "# pay 
attention to () - this is composition of 0 axis, a dummy axis with 1 element.\n", "im2 = reduce(ims, \"b h w c -> b () () c\", \"max\") - ims\n", "im2 /= reduce(im2, \"b h w c -> b () () c\", \"max\")\n", "rearrange(im2, \"b h w c -> h (b w) c\")" ] }, { "cell_type": "code", "execution_count": 49, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAKyElEQVR4Ae3dfWxX1R3HcVpsGTS2UDREghprJQIN3SiI1ofxIE3JDCuKmDQpPqJbQuIYpQHpim4I1YQ2ghCWlLo/GimKESrRSDQVExq3QaVQH8pqK5ZSlGYpTz7UlY7s78+99q7n3Ht+17d/fs75nYfX+dkvNzn53aTB7u4R/OctcHqid5uJlrbWNjlMzUs1Ml+7ca3MM6/NlLn10DaQ9Q3YneCM5e+P3dXbH31Gs/05EnoGgHyPL9m3lUYEEEAAAQQcFaCAOXowLAsBBBBAwF+AAubvQysCCCCAgKMCFDBHD4ZlIYAAAgj4C1DA/H1oRQABBBBwVOAqR9f1s1lW7dZauddXa16V+dTcqTJ/dMWjMidEAAEE4irAE1hcT5Z9IYAAAjEXoIDF/IDZHgIIIBBXAQpYXE+WfSGAAAIxF6CAxfyA2R4CCCAQVwEKWFxPln0hgAACMRfgFmLEB3zTLTcFWkHWLVmB+tMZAQQQiKsAT2BxPVn2hQACCMRcgAIW8wNmewgggEBcBShgcT1Z9oUAAgjEXIACFvMDZnsIIIBAXAUoYHE9WfaFAAIIxFwgiTcy+5+w9RcOD+r5e071yIbrrr9O5pGF1oEi25mRiXkjsz8jLxz29xkBkC8QT2C+PDQigAACCLgqQAFz9WRYFwIIIICArwAFzJeHRgQQQAABVwUoYK6eDOtCAAEEEPAVoID58tCIAAIIIOCqgPXfQqzZtUvufV1lpcy/+/57mf9+2TKZV65bJ/MkmUYXLp2/VE5+9sxZmXuFxU8Uy6blK5fLPFHC/ftfk0utqvqzzDs722Wempoq8+XL/yDz0lI9vuzsYFhXVSVXVfv88zK/PKivvT5SVib7P7JmjcxdC3cf2C2XtP6v62XeeapT5mlj0mS+snilzMufKJd5UpJbf4EGPc59w4Yauf4tW+plfu7cRZlPmaJ/07W+flOg/rKzT8gTmA8OTQgggAAC7gpQwNw9G1aGAAIIIOAjQAHzwaEJAQQQQMBdAQqYu2fDyhBAAAEEfAQoYD44NCGAAAIIuCtg7BbiwY8+krt8cvVqmXvdipGdr4Qvbt8um3KnTZN5cVGRzKMKO050yKm9fvNQdr4S9n7T69WU0Hlr61G5/ra2T2TuFfb3/yCbqqv/IvO5cwtlnpeXL/OowoMNDXLq6lWrZB403Lp2rfzIpOxsmd+7ZInMowpbTrTIqdu+bJO5V9h/vl82VeyokPnkGyfL/KGCh2QeVfjee/+QU1dU7JB50PDYsX/Jj/T3/yhzUyFPYKYkGQcBBBBAIFQBClio3EyGAAIIIGBKgAJmSpJxEEAAAQRCFaCAhcrNZAgggAACpgQoYKYkGQcBBBBAIFQBY7cQt+zcKReempIi86a33pL5uIwMmc9auFDmu/bulblrtxAPdx2W6zz5xUmZ52e7dQtOLjLC8ODBT+XsR4/+U+ZPP/2wzJuaPpC5a7cQX9u2Ta4zeeRImb9y6JDMUzx+K7Lktttk/9c9bv+6dgtRLt4n/HSP/v709Oo3oc//3Xw52s69+u+ea7cQjxzR+5WbuhI2NFTLpoKC22Xe2Kj/vmVnXy/7mwp5AjMlyTgIII
AAAqEKUMBC5WYyBBBAAAFTAhQwU5KMgwACCCAQqgAFLFRuJkMAAQQQMCVAATMlyTgIIIAAAqEKGLuF+PePP5YLzxw7VuYNBw7I3Cscm54um9ra22VOGG+BlJRUucH8/Dky9wp7e7/xanIq/+ywvuU1JS9PrjNn9myZe4Ve43ze3Oz1kYTOUz2+P/NmzZP7mpY1TebH24/L3LUwK2tSoCWVl2+X/QcGBmS+aNGvZZ6cbPcZye7ockuECCCAAAIIDF+AAjZ8Q0ZAAAEEEIhAgAIWATpTIoAAAggMX4ACNnxDRkAAAQQQiECAAhYBOlMigAACCAxfwNgtxHPnz8vVXLx0SebPbd4s86DhxAkTgn6E/jEWGDXqF4F2NzDwn0D9o+p86cIFOXXG+PEyDxqmjxsnP/Ktx7yyc4zDCeP135mTZ/RvmbpG8cAD8+SSHn/8tzKvrW2Q+eLFpTLPz58u83feeVnm6elpMg8a8gQWVIz+CCCAAAJOCFDAnDgGFoEAAgggEFSAAhZUjP4IIIAAAk4IUMCcOAYWgQACCCAQVIACFlSM/ggggAACTggYu4Xo9VuF12Zmyo2+UF4u86Dh6NGjg37Eqf4jPd6o67XIywOXvZrI/yeQFEuHsddcI/d19vRpmQcNvz51Sn7Ea17ZOcZhb1+v3F1GWobMXQu9/s7U1FTIpa5aVSLzjRtrZV5X97bMd+zYI/OysodlHjTkCSyoGP0RQAABBJwQoIA5cQwsAgEEEEAgqAAFLKgY/RFAAAEEnBCggDlxDCwCAQQQQCCoAAUsqBj9EUAAAQScEDB2C/GOmTPlhvbs3y/zL7u6ZL7gnntkfqKzU+YTPG5nyc4OhuPG69+g81rqocZDsun4Ef1m2L5/98n+dy+4W+aEbgrk3nmnXNj7b7wh87fr6mT+Y3+/zL9obZX53MWLZZ7o4YfNH8otNH/eLHOvNy8X3F4g+7sW1te/K5fU06NvV86Zo/+eL1kyX47jdQuxq+tr2d9UyBOYKUnGQQABBBAIVYACFio3kyGAAAIImBKggJmSZBwEEEAAgVAFKGChcjMZAggggIApAQqYKUnGQQABBBAIVeAqU7OtWbFCDtXwrr79UrZhg+wfNKx69ln5kVm5uTJ3LUy7Wr+ZdHredLnUlsMtMi+cWSjzyVMny7zxk0aZE7opsKy0VC7sgzfflPmfSvRv2cnOV8LkZP1v2RKPeb3GSZT8seceM7LUkt8EczYy6f8xSFPTMfmprVvrZW4qvOuuX5oaSo6jv7WyKyECCCCAAALuCFDA3DkLVoIAAgggEECAAhYAi64IIIAAAu4IUMDcOQtWggACCCAQQIACFgCLrggggAAC7ggYu4U4IydH7qpxj34j5wvbtsn+HV99JfMbJk6U+cwEuW0oF+8TVtdWy9b1f1wv84vnL8q8sEjfTpSdIwxvvVV/f+6770G5qjFj9O3NUaNSZX+vcXJyfiX7uxbmzJ4tl7R53z6Z/62yUuaDg4MyX7Z6tcxz8/Nlnujh/fPul1vo6O6QedGcIpkXFxbL3LXwqaf0fvv6LsiltrSckHl6uv7/rrh4oey/dOkCmZsKeQIzJck4CCCAAAKhClDAQuVmMgQQQAABUwIUMFOSjIMAAgggEKoABSxUbiZDAAEEEDAlQAEzJck4CCCAAAKhCiQNdneHOmGiTXZaX35MtG3YWy9AvrZn+P74+szQL0D2/YxqfOblZ1Q8YtMrm2Tevq9d5jdPulnmkYWmgCLbgN2JeQKz68voCCCAAAKWBChglmAZFgEEEEDArgAFzK4voyOAAAIIWBKggFmCZVgEEEAAAbsCFDC7voyOAAIIIGBJgAJmCZZhEUAAAQTsClDA7PoyOgIIIICAJQEKmCVYhkUAAQQQsCtAAbPry+gIIIAAApYEKGCWYBkWAQQQQMCuAAXMri+jI4AAAghYEjD2RmZL62NYBBBA4CcFcrL1G70fvFe/0TtttH6z8E9ORAenBHgCc+o4WA
wCCCCAwFAFKGBDlaIfAggggIBTAhQwp46DxSCAAAIIDFWAAjZUKfohgAACCDglQAFz6jhYDAIIIIDAUAX+Cxyhjd1yPQF/AAAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 49, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# pixelate: first downscale by averaging, then upscale back using the same pattern\n", "averaged = reduce(ims, \"b (h h2) (w w2) c -> b h w c\", \"mean\", h2=6, w2=8)\n", "repeat(averaged, \"b h w c -> (h h2) (b w w2) c\", h2=6, w2=8)" ] }, { "cell_type": "code", "execution_count": 50, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAIAAAB6y1p+AAAiNElEQVR4Ae1dCXhN1/ZvSyaSIMhAJpKY1RAxBDFGCEGFGIoY2hhb/kRbPGJOY3hoqD4qamiqJKYSUxBzkIQEQRJkFjFF5pH/er3fu991h733vfecc0909cun5+z122ut/bv3nnX2tPan77OzP8H/VDPwzEq1DCXAABJE/BogPUR6PukYR5b/46VIEPEr8BlRikJkABlABpABZECkDGAAE+kHg24hA8gAMoAMkBnAAEbmB6XIADKADCADImUAA5hIPxh0CxlABpABZIDMAAYwMj8oRQaQAWQAGRApAxjARPrBoFvIADKADCADZAYwgJH5QSkygAwgA8iASBnAACbSDwbdQgaQAWQAGSAzgAGMzA9KkQFkABlABkTKAAYwkX4w6BYygAwgA8gAmQEMYGR+UIoMIAPIADIgUgYwgIn0g0G3kAFkABlABsgMYAAj84NSZAAZQAaQAZEygAFMpB8MuoUMIAPIADJAZgADGJkflCIDyAAygAyIlAEMYCL9YNAtZAAZQAaQATIDGMDI/KAUGUAGkAFkQKQMYAAT6QeDbiEDyAAygAyQGcAARuYHpcgAMoAMIAMiZQADmEg/GHQLGUAGkAFkgMwABjAyPyhFBpABZAAZECkDGMBE+sGgW8gAMoAMIANkBjCAkflBKTKADCADyIBIGagpBr+ev3jxMCUlNSMD/t68fVtcUlJSWgqO1TIyMjI0bGBmZmdtbW9j07p5c7O6dcXgsGY+PMt8Rq5oZW1FBqAUGVCLgXfv3j25f//+rVvpyck5aWnPMzOLCwtLi4srKyoMDA0NjIyM69SxsLa2tLVt0rJlS2dn22bNPv30U7VMIBgZ0CEDOgtgb/Pzj5w6debixWsxMRC3GClo1rSpq4uLl7u7Z79+hgYGjLVEAnO2cSZ7kv0+mwxAKTLAwsDbV6+ijh6NOnIk5sIFiFgsVSQYiGdd3d17enm5eXmZ1qvHXhGRyIBOGPj0fbbQD83zV64Eh4ScPH++rLxc4zabGBv7eHnNnz69pZOTxkpYKj7jrlPU6NNGZIvVMoBxSBCZneopFZiehOvX/9yy5VxYWIUWPy5gWt/Q0H3UqNGzZ7fu3JlX4jvG8aq++iv/GAmqqKgsKCg2MzPV/uMRNIAdP3t22YYNsQkJ2vst0QDDHV8MGrR+6dImtrZc6ZTTw+EDCAOYHLeKt/n5eU+fpiiW67akXbtOGjvA4feH7MOD2NjgH364ERlJhqkr7e7pOWv16ubt26tbkRHP7fM5ryAvJUN03x9GKuRgnVr9/a3jliA5G0LdZmQ8v3s35e7d5ISEZLh49CitXbtmN2/u0d6+QAEMprjmLFkCA4bae6yoAebJFs+Z88Ps2TVq1FCUalnC4QMIAxj1swgODgwM
XESFCQzIzn6vsUUOvz+qfIBBwk3+/oe2b3//XnM/VSmH8s8++2z0N99AGDOqXZsA00zE7fN51a+rlmxbopknYqv1PvbvT5NbglQ0sqys/Pz5W1FRsRBdsrJyi4pKDAz069Y1adHCvkuXNkOH9rK0rK+iqpJimHm9fv0uhCvQBhHr3r3HeXkFirjo6N2gXLFcrRIhAtjWXbsWrFwpWZehlnNqgfv26PHHzz+bN2igVi0qmMMHEAYwKttjx3pcvHiGChMYIOYAlhgT893Ikc/S0vjmxNrBYV14eLN27bg1xO3zudukbtF3o7n1UFfahAlgubmvg4J279p17M2bfFUt/e8bzOgBa9bMsrenTIJINTg5DU9JyZDeKr2YPHloSEiAUhF7Ib/L6EvLykb5+c1evJjv6AUNhqk1l0GDHqemsjcekaJiAF7cYmOvi8olkTtzev/+r9zcBIhewEPm48dTunc/f+iQaDkpKC64df+WaN0ToWM7dhx2dBz273/vI0QvcBt+mH/8capDh3HHjrEOoU2cOITa3gMHzsJMGBVGBvAYwGBBvPvo0WHHj5M94FCanpXVa8SI5KdPOdSJqgRjICkpsbBQyVCDYA5UL0NHdu5cPG5cWUmJYG6XFBV9P2rUyd9/F8yiWoauJ1yvelelVpV/LLiysmrq1BV+fqvYQwgMAw4fPn/Zsv+wkDZx4mDqfgwYqDxwQNvhFr4CGOzlGjx+/JWbN1layyEmKyfH88svIXZyqBNVCcNAfDy+PrMyfSo0dLWfH0+TXgQn4GV86cSJ58LDCRhdiW7cu6Er09XLLnyI48f/KyTkqLpuw/dt+fLtW7b8Sa1oZ2fVq1dHKmzPnhNUDBnASwADgmDk8HpsLNk2T9KU1NSxM2YI/9vmqTn/HLUJCbikmunTfhgXt3zqVPiVMaG5BoHdAF9fWE/GtWJt9d1+eFtbFf+M+osWbf3zT827PnPnrj916hqVKpg2o2IuX76dnp5DhREAvASwNT/9FHHuHMEq36LTUVHbdu/m2wroLy4qxkjJFc+JifFcqfqI9cA4HqzaKP87VY2umgk+LPD2howeunJAqd34JPz+KCXmg8LIyBtr12r1bKyqejdlynLq2OMXX/SF1R8f2Fa4gYfnwYNabfygGFCwSC+4ERcHm73oOJ4RCwMDYTiRDyN3bt1Z4LfA1dHVTt/O0dgR/u3SpMv8r+bHRuumx8lHG3Wi88GDuzqxW72M/rJ0aZYIZnkzUlK2LVkiHupKykpSn6WKxx9xegI7iGfPDtL+nfvZs5fUyTALC7MePdpTeTh+/BIVQwBwHMBgeGHWokVVVVUEk8KI8gsKlnMdR0tLSmd9Ocuzs+fvO35PfZxaUVEBbamsrIQcjn/s/MOrm9f0MdNLioWbVBeGSWGsPH+eDbuYhbFVfa08SUwM3bxZJP6DJ+CPSJxJSkvS1ZiqSBhgcQPmvWATMQuSigkO3k9dKO/t3Y+q5+rVeGpnjqCE4wC2NyyMw0QbBL9ZRL8dOJCWmcmCZMGUl5X79Pc5HHqYAD7257FRfUdBnCNgUKSUgfR0XDuqlJgPCneuXv1O07dDyHPY2sWli7t7P29vz/Hj4d9uHh5tunQxNTP7wAbzDXiyfflyZji/wLRn3DyX+fVSp9qh47VxI2crSKEzt2lTKLlBnp7dyQCQgp7btx9SYaoANVUJNCgHgoK2bmWvaKCvDzl5IRdUp3btWvXqRa6YEx//ND0dJrd+3r079+VLMlgihR7STzt3bggIYAFTMQH/FxBzLYYKi7sRt2TOknXb11GRioDXL18rFlJLYLkqpCCpqVfTwNCAj1wkVAc4AWRm4gOIQmR2auqZP+kLwGS11DI2hrS8fb29W3XqZGVnJyuSvX79/PmDuLjLMJrz11/PmTNrg4bIgwczVq+2cXSU1aaT68xczl5VdeK/AEYvXozjqvsl8Xb37uOrVs2EhB2qnHd0tIHliGlpz1QBJOV37iS5uXUkY1RJ
uQxgEF0eJCersiRbDs/cr7/8ctn8+VYWFrLlhGuLhg3hr6uz87xp06Z//33o4cMEsFS0Lzw8aPHimjW1bWbyg+S9/9krVUu+gOHEr+d+3axVMzJMUdqmYRvFQrVKjE2M65rVtWhkYW1n3dSpaYu2LTp07gDXainRCTgrK10ndquR0Yi9e9m7X9Cvmrp48agZM+DMFGobzSwsug8aBH/fb9kCOex3rFz56PZtai0AwDvrsV27IMsUC5hXTOZzDGAUgrVZeahUdWFhMayD//bbMUqlksLu3dtRA9j9+48JGsgiLocQIVqQjUmkkEj+yK5d/1m7lj16yaqF6vu2bJkyhsSaFA99tVMXLkhvNb4I/TWUfYQdkDBJprEtbSoWFhRmpmXGXo89uv/oxpUbp/lM62zfuZtDN+g+3rt9TxvNfNfFHhiV4Yh9+6gYCQCGB/968mT8vHks0UtWJ7xZ9vnii9C4uDlr137GllmU3StZQ5xfZ73I4lznR6bw+PHLnLeIGhTbt29ONQqpfqkYVQBtuyZSvXA2ytHTp6W3qi5gjOvg9u0evXurArCUw8/sl6AgmGyLZ5hDPh4ZOcTdnUUtAXP5nHqf/ZXzVwjaBBalPUnbsWkH/HV16+q/3N+1t6vADrCYYwlgK1ZsGjp0NIu2jw8DKw/TkpJY2jXB3x/CD/xGWMCqMBMXLHBo02aulxe1z5eTnp768KF9ixaqVAlTztID2zh/4xgPphdfRp+/XfvtwciDBLB7F/c9K/cQAIKJYMFFZiZTnIDUvePGDQT8hg2UFFPg/PXrCZD8t3Fjc1UNadWqqSqRtBw0SK/VveAsgMHq+cKiIqr5BTNmaBm9JCb09PR+WrUKEkdRLXKSAh+6NVRDsoCsNDG+D0Zfih7ZZ+TICSNXBa8yrWMq67DOr1+9ekH1wcGhubm5JRX2UQJio6JY2gXrMuau02T+VVE5jChOWbjw11WrFEVyJbEXL+o8gL14Q//+ONk6Wdbn8vuz1G8pOYBdiLkA4zGNGjaSY0z42+jouyxGZRPsQhjr1m3S8+ekiXkYQw4LOzdnzlhVym1s6JNEkFBYVXVqOWdDiJeio6nGTE1Mvps1iwpjBLh17dqD4bQ9WPqRofWhneouLCwRMEMdI11SWNjesCFdh0C3TFoihouSkmKqG5aWun8QUJ3kCQBZ56maYcAwICSECmMH+AUENGxE5xwyg7Dr5AkJ+8Comi3M6A9TqhJZQBuHNq2atpItkbuurKrceXSnXKFObhMTn1DtwgmTP/20QApr0qTxnj0rpbeqLg4ePKtKBOWNGjUkSCWi4uJSKkYVgLMAlvDggSob0nJvT896depIb7W/gAUdLEru3L/PAiNg6jesT5AqitTFK2rgtSTlYYp3b++crBxerailvLSU/gAyM6P/GNQyWo3A6QzLo3oOGcISb9hbXaNmzQGjR1PxLL5RlWgJYAlgZnXMtLSiWH1wj8GKhbIlYZFhsre6un74MJVqeswYD2PjWrKwAQO6DhvWW7ZE8fratYTsbJXd39q1jRSryJWUlVXIlbDfchbAkp/QI3x/Nzd2z1iQMBppaGBARcZrHcDadmxLtSILaNtBPbxsXWGuszOyJw2bJNmLLYxFshWWHpgRw4I6spXqK4XTTKjOswQbqhI5AKzClytRvM0WQWaQ4lJ6D97YyFjReS1LOrboSNaQkJyQ8TyDjBFACrkzqFZ69uygiFm4cLJioWwJjCIePqxyoZyhob4sWOl1ebkIAlj2c/oMYce2HD/WaxkZwTmWSkmRLUxiCK6yeMXrgcMGKhYSSjyGeRCkIhElxCYEBwaLxBmWHpiR0QevhyLxXBg3CvLyqIb4mIiytLWl2i0UweEPJQw9eONa3Aewto70Z9rZ6LNUDvkGkKeyJNadnJR81nBosrNzS7J7R45EqQK8e/delUhaXqOG5v0ozWtKzUsuWFZw1K9XT66W9red27enKtE+KaL3eO/Gto2phiQAK2sr
WCjBCNYtbGvQ1pe59FczAZxkCWDsOxkEcFhgE6UMK6TMG7N+RdmdZxmThNy+7Ap5QpZXllM16+vRewNUJXIAi/oWciWKt3EP4xQLBS6B07yoFq2tlbflq6+Gk+tevBj79m2hUgwk2lBaLluor68ne6vWNWcBDJbRUw3XNTWlYtQFtGrWjFol69kzKoYM0NPX2/TbJpY8F5CAedOuTfoG3P9UyB5qJoXMjb9u/lWzutzWgoEIqsKXL3OpmI8VwBK89fS5/9ZVMPyuWXzj+3MxMqDPteQX5XPuRh1j+qQ+jCJyblddhaWlZdQqtWsbKsWMGuWup1dTqUhSCFEqIuKKUgCcWqm0XLaQZZhRFi97zVkAMzJU3nhZY28L6G8BsniW6+YMaWw4Od+ye5/uG3ZugOX7BK8g5cf6Het79u9JwIhNFL4vnCV48O02y/BgRkYq326IVr9R7dpU315ovdpW0URuVpZioVwJi29yVTi/rWVIH15OTk/m3G5RCb33mZ6TzrlddRWyLJQwMlL+DK9fv06/fp3JFlVNg7H0/AjJqMhGQcpZAIPpKKqxl69fUzHqAliGJUs4OjzJx9cn7EJYe5f2Sp2EhR4Hzh0YM2WMUqloC7PSs2AyTOfuGRrSvz+Rkcd17qeuHKhlYkI1rVYaQ6o2CeApw+piyLjIqI0/mJmpGVX5pbhLVIy6gOwX2dQqeQV5VAzfAHIXSmKd0JP+4os+ZA8hzYfSpPI5Oa/IFUHaoEFdKkYVgLMA1oAhp/W9hw9V+aFxuXEt+psXVwEMnHTp7hJxM+JM3Jl/rf3XpJmTho8dPnH6xEWBi07eOnkq5hSkutC4ITqsePPKTR1al5hu0MCc6sOhQ6EFBflU2EcJYFlMAal4OW87S/pgFt84d0xOoZ2VnVyJ4u2+iH2KhVqW3Eq8RdVQWFxIxfANUDU8KGu3tLRc9lb2GhbTk5dalJSUKU0rRU2ECFYsLOrL2lLruqZaaAK4ia1tIi3VDRzTPHLIEIISDUTGDEMrnA+RtenQBv7U9fav69w/X6Q+VFVWFRUWwYqMR/ceXYu6BqduSkXUi/iYeCqGb4C1tR31QMucnKzFi2dv2vQb9aRXvr0VXr+tk1P81atkuxBsIIkUbN4iw9ilzzMzWYKiLcM8NLtRzZAsAQzmosLPhXv389bMhNJaLEHR0ED50JxShTwVwn6s168pL38FBUUmJsr7A3A6Zd++LmfP3iC4t2XLn1OnDpPLYRYfn0SoIhE1adKIilEF4Oy73pRhue3hkyc3LFvG7V5mlt5VbYZemiqCOCx37urMoTayKtiqvHz+8nMR58gwiVQMWTlsbZuyuBoWtjc3Nyco6Bc7OyY8i85qgXH6/HOqnzAHBgd0zVi5kopkAcBr37JJk8oZht+duN4ew+KeHKZ98/ZyJUpvZ/44s3ObzjYWNkql6haevn763E36T8yklom6mjnHs2wohv3IhMQZ48YNIgcwiFV//HEaElDJOn/mTLTsrdJrOHVFaTlL4WcsIBaMC8Ny9rz8/NVcnycLOqnusQwzUpVUL4BjC8e9J/b6zvBlcRs2NbPAeMW0b+/CqP/SpbOuro4+Pv23b98YGxvNsv6eUbOYYc60A/MkzocEBl49eVL7hsB0yPo5c26eoz+dwVZHNt+094qgodvn3QhSqSj3dW7faX2fZD2Rlmh8kZqdOilgEsvojrWFtcZWuKool2JDqdrMzFyl5ZLCESP6UpcLzpr14927KVIlJ09evXPnkfRW1UXbtk6qRNRyznpgrp06UY0BYOP27d1dXOAQSxYwCyaPYRMlyzAji61qh1mxecXFMxdTH6eSPS/MLyQDBJA6OzM9gCSewFPjypVz8Ce5rVevvqVlYzOz+vr6BgYGhvAvtw7/8st+bhVqoK1Z+/Z16td/++oVuS4kj4cU8nOCgsbPn09GEqR5L18GTJp05cQJAkYqMq1Xr0WHDtJbXV3AhmJY0f628C3VgZSMlI7jOq6b
u27y0Mk1a2j4AIy8ETlm4ZhXbykfh8SZlk1aUr3iGwAHS8bEJJKtQOwZPry3Koypae3Bg3uGh5PeaWDNYZcuE6dN84ZTVJKT06lHNoMtmFpzcWmlyii1XMPPT1Gvg729U5MmybSkMvBmN27mzK2BgYwHeikakit59eaNXIniLRwhplj4TyiBRf++M31hLJHcWNgNRgYIILW3d3B0bJGS8lADW2/ewLeA6VGigXKoIoYABtN+7j4+Ydu2UZsAMWyjv//J0NCvly7tPWwYFS8LKC4oCN20ac/69UUMAxuSiuAV48lhsoY4v4ZQNKz3sD3H97Bohjjnt8pv/d71i6csHtF3hFoZOq7FX1u9c3XE1QgWQxKMc0vh5g5UeeXgQO8FUntLMDxIDmBgHVZzsMQtqZ8Q6liGN6V4uQvOhhBBr8/QoXLald6WlpVNnTfPY+zYyMuXKysrlWLYC1lWNtrykKGA3UPdInv260l1QG7elYrnCTB8+FieNH8car18mQaEJY2FDPHzhw8fbGe37ttvo8+ceZ2rcnQIurMZKSmnQkP9R4zoZ26+belS9ugFtoZOniwSen3cfdTyJCktyTfAt2G/hiP8RwTvD46KjXqdr2SfT0FxQUxiTOip0Olrptt62naf0l2t6AUuDXT9YFpILSe5ArMEsEuX4sgjooMH92jYsB5XLkn0gE5tFHLWAwMnfH18AoODCZsJZB2FY7rgDwb3IEGiRcOGsiK1rmMSEqj4FgybnalKqimgkQ19hY+xqSh6qGPHTtm8eXVFRXk1pZpvt9t06dK+R487V66wG4LTJvcHw8M5GKrUbdDAxtGxNowEmZjAwSvFhYUQqCDFYtqjRxrnggJ/wCt2f3hFDnId5GjjCCOEalkpLS89fAG24R6W1NKrqQdrLuqY1IFHOSx/h33KLHnuCRbhvJXmds0JAGFELAHs5cu8uLiHhMyHBgb6M2aMXLFiB4c+U3eYkW1xGcBgCNF78OCD6mxGgQyKLAeJEdoQEx9PkEpE/+QABr9FKj8mpnQMVYn2ACsra4hhe/b8or2qj1WD39KlMwcM0Kx1MLMFf5rVVVVrxooVqkTCl8Moq/8Ef+gnaWO6orIC+mFKu2KaqZ3lM0uzitzWatPGkUXh0aNRhAAGGmbO9AkK2l1Wxs1bJowfwh+LY6owXA4hgo2AefPIyZZU+aFZ+dv8/AcM5yT9kwMYy1GcDcwbaMY/57X8/ZfDigzO1X40Cru4u/diG6gXoMl9R4zo1KePAIbYTUwZNqW1Q2t2PN9IqwZWvkPUGPjlzx/YyGVvTx+MIeSVl/gGeqi5fdlbAf05drBSJMcBrHXz5vPZDplU6o26hTCLVlVVRa5loK//eatWZMxHLC14W0BtXdNmTakYYQCQj2PNmi3C2KqmVr7fsoUlrRTfrYPFhz9s3cq3FXX1wwDgzz/8LJI5XXB+7Zy1tY1qq9sKnvBubh2pmmEh4uPHmWTYypUzOJkJs7W1nDTJi2yLKuU4gIG9gPnzO7VrRzXMCeCvs2eperp16sRy6CVVTzUFZGVkUT13aO5AxQgGGDZszNy5/xLMXLUzZGFjExASolu3YbBu5d69sHdBt24ote7W0W3h5IVKRQIXerl5jfccL7BRgjkPD6adKjCKSFAConr1TIOCviVjWKQ//viNNgepSExwH8AgWhwOCbE0N2dpgzYY6HsdZwhgfVxdtbFS3etmplLep6CBsOtZVM387ruV33wjimeQqGiROtN/5MgpixZJb4W/mB0Y2GPwYOHtMlpcOWOlZ3dPRjBPMAdrhz0r9vCkXDO1gwa5sgSMo0cvUvVPnjx0wgStvgCw4WzsWA4WZ3IfwKDx1lZWUeHhjXl+QYONrCybwFiObKZ+YNUXkJiQSHW+Q+cOVIzAgIUL12zevNvExFRgu9XF3KzVq0fPnq0Tb6ctW+b73Xc6Mc1oFDqIYevC+nfpz4jnHGZraRu5LbKuSV3ONWujEHpOnp7dVWkA
0iClE8QVyHmoCiNb/uuvS3r1cpYtYb9u1arp7t0r2PEEJC8BDOw1d3C4fORI25YtCba1FP1+6BBVQ8P69bs5a8gyVXm1AMRFx5H9tLG3YVlqT1bCh3TUqIlRUfeHDh0tnikNPpqpsc7vgoMnLligcXUNKsKG5f/bsMEvIECDugJXgfMtj208BpuUBbYL5iAnyMUdF+0b2Qtvmmpx+vSRUgycvzxwoKu//4TfflseE7OvsPBKcvKRw4c3BAT4STGEC+jMHTmyATQQMEpFbds6njv3C+zmUCpVt5CvAAZ+QH76GydO+I3naxQY0gcPHTAAIiUcI6mq2cM8PFiOUVZVvbqXwz5xalr6/oN19qJKpRcW1kMWjMjIO19++bWxsSjW+lN9FhIAueeX796tz3CWrPZemZqZBUdEjJ83T3tVwmiAGBa+Ljzwm8Aan9UQxiJYmTB4wvXfroszeoF7AwZ0DQkJuHx555s3URkZESdPBq9bN9fXdwgsnTcyMlCXJTiIMiLiJ5jKqlmTlWEYe7x2bZelZX11banCf/qeh1Nc5YxdvXXrm8WLb9+7J1eu1i3BT3hMP0lPf/T48aOUlP/++/df7t9bXiC/gEfv3moZkgM/s5IrqE63sFd8rMdYssdwCGePvj3IGJJUKIJKSoqvXYuKijp9925cUlJiXt5rklecyrKz32usTwB6YCfy8ilT4q9d09hJakWYdYPVj2YWFlSkuoCOlAECdfUpwd9+dBsSR0E2DSUy7opgD/WW77d4dPPgTuXfmgQgSGuPYe0i7G6G1R8VFZVKlUF3zcvL7fvvfV1cWisFaFwoRAAD52Bb+4nIyHXbtmm8bZkQwJQ2HrLUQzxz/vxzQv9MaUW5QgEeQHIWObxd4Lfg9x2/ExTC4OHN1Jsw/E3AUEQ6Iujly9zk5AdpaY/hiMuiosLCwoLi4kIIcuRcOJS2qBDDCWQqJPRiYeiB9Dcn9uyBs1SyU1PpPqmDaNWp08xVq7p5cP1c/p8Pwjyfq95VwcFda0LWQPqo/1nm7P9Otk4/TPph4pCJGqcGJrkiDEEkD1hlb97kw+EpcKjK06fZ+X/nB4cuWtOm1pCrF2bL6tQxZlWkDk6gACZ1KSU1df+RI3Aw2J379xmTTknqqhvApBa1vBDmAaSlk6qq+3r5Rl+KLsgvUAWYt3Se/3J/VVKm8mpNEFMLtQIJSU9FeXnEvn2Q8DcxRtveBrzTdBs40GfmTL5XGwr5fIYwduzisZCjISevnoRrrT7XTz6B5PdDeg6BrdN9OvXhcZpWSIK0ZEQX1YUOYNI2QhKNKzdvxiYkwDnOkE0jKyfndV4e4fUZA5iUOnUvcrJykoDlxKTkB8mSC8jeDkr09PVupd0ytzRXV+EHeCGf0B8Yrh43OqHn0e3b5w8dgsOUkxgSrcnyCKc5Q3pDNy+vft7eVnZ2siKernXyfH7x5sUZyHAcfebqnatwNhjhsSPXakiT6NLaxbWdKwStnh16wr5pOQD3tzohiPtm8KVRZwFMsUEVFRUvXr8uKi4uLS0tKy8HgD78p6cH5ymb1a2rq1OVdfIAUiSH25JXL15BJCsqLOJgBcdHSRB3dOuWHkh+CL0x+IN885DYNzczE/L2lhYXQ18N8vkaGhkZ16kDO6MtbW3tW7Ro7eLSvEMHo9rcLA9jpFDnz2dINn8v5V7as7SsF1k5L3Mk2XvLK8r19fQN9A0gj4Z5PXNzM3M7KzvIydvYvDFjuziD6ZwgzlrCiyIRBTBe2qe1Ut0+gLR2n38FSBCRY6SHSM8n+Hwm84MEkfnRYvaerBilyAAygAwgA8gAnwxgAOOTXdSNDCADyAAywBsDGMB4oxYVIwPIADKADPDJAAYwPtlF3cgAMoAMIAO8MYABjDdqUTEygAwgA8gAnwxgAOOTXdSNDCADyAAywBsDGMB4oxYVIwPIADKADPDJAAYwPtlF3cgAMoAMIAO8MYABjDdqUTEygAwgA8gAnwxgAOOTXdSN
DCADyAAywBsDGMB4oxYVIwPIADKADPDJAAYwPtlF3cgAMoAMIAO8MYABjDdqUTEygAwgA8gAnwxgAOOTXdSNDCADyAAywBsDGMB4oxYVIwPIADKADPDJAAYwPtlF3cgAMoAMIAO8MYABjDdqUTEygAwgA8gAnwxgAOOTXdSNDCADyAAywBsDGMB4oxYVIwPIADKADPDJAAYwPtlF3cgAMoAMIAO8MYABjDdqUTEygAwgA8gAnwxgAOOTXdSNDCADyAAywBsDGMB4oxYVIwPIADKADPDJwP8DNFzetjDeuc8AAAAASUVORK5CYII=", "text/plain": [] }, "execution_count": 50, "metadata": {}, "output_type": "execute_result" } ], "source": [ "rearrange(ims, \"b h w c -> w (b h) c\")" ] }, { "cell_type": "code", "execution_count": 51, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAkAAAABgCAAAAADQwpL1AAAWzklEQVR4Ae1dCXRc1Xn+NPtIGu3b2MZ4EbYBbxqDAwQ3EAjGFBtIICYkEEgTtpM0pUA4bUMcltMeOJSUQk/YaqApKQETzE6bGgg2BmNbAmMZL7JlWZZGy0iaGc2mWfvPyG9035u3zZuRPK7nnjm6/73/f//7v3/+ud+7y3tCord3Uj8qte/z7Fv/2vqNmzeqlJ8Qm6Am5Tp2JhKT+plc7WT6JHdgQGGkI8HBWCzW1GAsDHOKVqj1QKEEkCs+bDKbZthNag0vyhWGBwoigLbXub7HueNc89irHF3MC98DBRFAr1iDaU99yzaapotE4XugIAJosz427ilrSeK7jf2F77WihWkPFEQAdXHm0EC0EHirdsjB1RTzAvdAQQSQwEcOu1NQUywWrAeOfwCtK/Oz3qkqSbDFIl3gHjj+AXSf3fkM4yQ3cJc1+AhTUyQL2QPHP4Ci0ajAQY8UIUzgkQIuHv8Aml3A3imapugBnaLEZAt0R7sFXezYukNQUywWrAeO/wiU6ZqzihCW6ZRCrTn+AfRL/iyssjgLK9RYEbXr+AfQg3bnmtqhKznrTjeFObKYnwAeOP4BRE56uTSQdlVxLyztihOCKIgA2sLthVmKe2EnRNQwRhZEAH1qd4429X1c5a6pHaqtFy4LMdYWyYLzQEEEEHnlSMCVPJFYXzyRWHAhIm9QoQRQ8kSixTS9eCJR/usqPG6hBNA3aO2HPnp94bmoaJGcB47/SrScdUVewXugGEAF/xUVtoGqICyRwOMebApjTxx0YtkWxxlhrBnFulHYtV7eiGnkHjoKLfF5Wqtatl0E4XdQ/SHKd8DcAeMR6Gi5iZYpy6GbDdMK2H6IECuvmfZ58bSdVrPQrgN1UWFAsw3LP8NyzRpTDcOR8KuH93020NDuregOWZwxfSBREgfqIsZFo7bV3Vt+Go/nZQA4iKMvo38P/E7EfEgYELch1ozgcgx/D50tkDuhpSqAugbQ5cdgFMEE9AmUxVA7hlkeVE6s/2XtqmT0SKR7GgYkONlVH4Z/BAkvon5EgwjHkKCj13STVQpdDcynosoMS3YaJaQ7zXCG4R1DRAd9CUrNqK7BzHkS0qqrO7ptrpFy96jJH9IHw4lYDGS/Ma4rDZvq9NXNaNbp8hI/2I8uJ4a9CI0hnnJRSSki5UA94jMRKkGJjMmqAmjvMLpD8MRASzQGCqAomgJYMCSjNifWXLstp/ZcYzLQg7gXkSBiQejiqQAyJQNI34DSeajjBHPNDwNDUQQiiMdh1MFmRVMj5pyBntwUO13OYe+w128MjumCkVgsDhp+TDF9VdA6w9g4Pzo/N/XHWpPWQ+gZhGcU4TASFJIGxMoRqgQaEZ2bHFLlknIAbR7Cj+L0
AAAAElFTkSuQmCC", "text/plain": [] }, "execution_count": 51, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# let's bring color dimension as part of horizontal axis\n", "# at the same time horizontal axis is downsampled by 2x\n", "reduce(ims, \"b (h h2) (w w2) c -> (h w2) (b w c)\", \"mean\", h2=3, w2=3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Ok, numpy is fun, but how do I use einops with some other framework?\n", "\n", "If that's what you've done with `ims` being numpy array:\n", "```python\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "```\n", "That's how you adapt the code for other frameworks:\n", "\n", "```python\n", "# pytorch:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "# tensorflow:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "# gluon:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "# cupy:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "# jax:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "# paddle:\n", "rearrange(ims, 'b h w c -> w (b h) c')\n", "\n", "...well, you got the idea.\n", "```\n", "\n", "Einops allows backpropagation as if all operations were native to framework.\n", "Operations do not change when moving to another framework - einops notation is universal" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "# Summary\n", "\n", "- `rearrange` doesn't change number of elements and covers different numpy functions (like `transpose`, `reshape`, `stack`, `concatenate`, `squeeze` and `expand_dims`)\n", "- `reduce` combines same 
reordering syntax with reductions (`mean`, `min`, `max`, `sum`, `prod`, and any others)\n", "- `repeat` additionally covers repeating and tiling\n", "- composition and decomposition of axes are a cornerstone, they can and should be used together\n", "\n", "\n", "\n", "- [Second part of tutorial](https://github.com/arogozhnikov/einops/tree/main/docs) shows how einops works with other frameworks\n", "- [Third part of tutorial](https://arogozhnikov.github.io/einops/pytorch-examples.html) shows how to improve your DL code with einops" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.8" } }, "nbformat": 4, "nbformat_minor": 4 }
arogozhnikov-einops-ad2c8d6/docs/2-einops-for-deep-learning.ipynb
{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Einops tutorial, part 2: deep learning\n", "\n", "Previous part of tutorial provides visual examples with numpy.\n", "\n", "## What's in this tutorial?\n", "\n", "- working with deep learning packages\n", "- important cases for deep learning models\n", "- `einops.asnumpy` and `einops.layers`" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "from einops import rearrange, reduce" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "\n", "x = np.random.RandomState(42).normal(size=[10, 32, 100, 200])" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "# utility to hide answers\n", "from utils import guess" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "## Select your 
flavour\n", "\n", "Switch to the framework you're most comfortable with. " ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "# select \"tensorflow\" or \"pytorch\"\n", "flavour = \"pytorch\"" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "selected pytorch backend\n" ] } ], "source": [ "print(\"selected {} backend\".format(flavour))\n", "if flavour == \"tensorflow\":\n", " import tensorflow as tf\n", "\n", " tape = tf.GradientTape(persistent=True)\n", " tape.__enter__()\n", " x = tf.Variable(x) + 0\n", "else:\n", " assert flavour == \"pytorch\"\n", " import torch\n", "\n", " x = torch.from_numpy(x)\n", " x.requires_grad = True" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(torch.Tensor, torch.Size([10, 32, 100, 200]))" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "type(x), x.shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Simple computations \n", "\n", "- converting bchw to bhwc format and back is a common operation in CV\n", "- try to predict output shape and then check your guess!" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 100, 200, 32) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b c h w -> b h w c\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Worked!\n", "\n", "Did you notice? The code above worked for your backend of choice.
\n", "Einops functions work with any tensor as if they were native to the framework.\n", "\n", "## Backpropagation\n", "\n", "- gradients are a cornerstone of deep learning\n", "- You can back-propagate through einops operations
\n", " (just as with framework native operations)" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "tensor(320., dtype=torch.float64)\n" ] } ], "source": [ "y0 = x\n", "y1 = reduce(y0, \"b c h w -> b c\", \"max\")\n", "y2 = rearrange(y1, \"b c -> c b\")\n", "y3 = reduce(y2, \"c b -> \", \"sum\")\n", "\n", "if flavour == \"tensorflow\":\n", " print(reduce(tape.gradient(y3, x), \"b c h w -> \", \"sum\"))\n", "else:\n", " y3.backward()\n", " print(reduce(x.grad, \"b c h w -> \", \"sum\"))" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Meet `einops.asnumpy`\n", "\n", "Just converts tensors to numpy (and pulls from gpu if necessary)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n" ] } ], "source": [ "from einops import asnumpy\n", "\n", "y3_numpy = asnumpy(y3)\n", "\n", "print(type(y3_numpy))" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Common building blocks of deep learning\n", "\n", "Let's check how some familiar operations can be written with `einops`\n", "\n", "**Flattening** is common operation, frequently appears at the boundary\n", "between convolutional layers and fully connected layers" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 640000) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b c h w -> b (c h w)\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "**space-to-depth**" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 128, 50, 100) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b c (h h1) (w w1) -> b (h1 w1 c) h w\", h1=2, w1=2)\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**depth-to-space** (notice that it's reverse of the previous)" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 8, 200, 400) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b (h1 w1 c) h w -> b c (h h1) (w w1)\", h1=2, w1=2)\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Reductions\n", "\n", "Simple **global average pooling**." ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = reduce(x, \"b c h w -> b c\", reduction=\"mean\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "**max-pooling** with a kernel 2x2" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 50, 100) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = reduce(x, \"b c (h h1) (w w1) -> b c h w\", reduction=\"max\", h1=2, w1=2)\n", "guess(y.shape)" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 50, 100) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# you can skip names for reduced axes\n", "y = reduce(x, \"b c (h 2) (w 2) -> b c h w\", reduction=\"max\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1d, 2d and 3d pooling are defined in a similar way\n", "\n", "for sequential 1-d models, you'll probably want pooling over time\n", "```python\n", "reduce(x, '(t 2) b c -> t b c', reduction='max')\n", "```\n", "\n", "for volumetric models, all three dimensions are pooled\n", "```python\n", "reduce(x, 'b c (x 2) (y 2) (z 2) -> b c x y z', reduction='max')\n", "```\n", "\n", "Uniformity is a strong point of `einops`, and you don't need specific operation for each particular case.\n", "\n", "\n", "### Good exercises \n", "\n", "- write a version of space-to-depth for 1d and 3d (2d is provided above)\n", "- write an average / max pooling for 1d models. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Squeeze and unsqueeze (expand_dims)" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "# models typically work only with batches,\n", "# so to predict a single image ...\n", "image = rearrange(x[0, :3], \"c h w -> h w c\")\n", "# ... create a dummy 1-element axis ...\n", "y = rearrange(image, \"h w c -> () c h w\")\n", "# ... imagine you predicted this with a convolutional network for classification,\n", "# we'll just flatten axes ...\n", "predictions = rearrange(y, \"b c h w -> b (c h w)\")\n", "# ... 
finally, decompose (remove) dummy axis\n", "predictions = rearrange(predictions, \"() classes -> classes\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## keepdims-like behavior for reductions\n", "\n", "- empty composition `()` provides dimensions of length 1, which are broadcastable.\n", "- alternatively, you can use just `1` to introduce new axis, that's a synonym to `()`\n", "\n", "per-channel mean-normalization for *each image*:" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = x - reduce(x, \"b c h w -> b c 1 1\", \"mean\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "per-channel mean-normalization for *whole batch*:" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = x - reduce(y, \"b c h w -> 1 c 1 1\", \"mean\")\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Stacking\n", "\n", "let's take a list of tensors" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [], "source": [ "list_of_tensors = list(x)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "New axis (one that enumerates tensors) appears *first* on the left side of expression.\n", "Just as if you were indexing list - first you'd get tensor by index" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 100, 200, 32) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "tensors = rearrange(list_of_tensors, \"b c h w -> b h w c\")\n", "guess(tensors.shape)" ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (100, 200, 32, 10) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# or maybe stack along last dimension?\n", "tensors = rearrange(list_of_tensors, \"b c h w -> h w c b\")\n", "guess(tensors.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Concatenation\n", "\n", "concatenate over the first dimension?" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (1000, 200, 32) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "tensors = rearrange(list_of_tensors, \"b c h w -> (b h) w c\")\n", "guess(tensors.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "or maybe concatenate along last dimension?" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (100, 200, 320) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "tensors = rearrange(list_of_tensors, \"b c h w -> h w (b c)\")\n", "guess(tensors.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Shuffling within a dimension\n", "\n", "**channel shuffle** (as it is drawn in shufflenet paper)" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b (g1 g2 c) h w-> b (g2 g1 c) h w\", g1=4, g2=4)\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "simpler version of **channel shuffle**" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 32, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y = rearrange(x, \"b (g c) h w-> b (c g) h w\", g=4)\n", "guess(y.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Split a dimension\n", "\n", "Here's a super-convenient trick.\n", "\n", "Example: when a network predicts several bboxes for each position\n", "\n", "Assume we got 8 bboxes, 4 coordinates each.
\n", "To get coordinates into 4 separate variables, you move the corresponding dimension to the front and unpack the tuple." ] }, { "cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "

Answer is: (10, 8, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", "

Answer is: (10, 100, 200) (hover to see)

" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "bbox_x, bbox_y, bbox_w, bbox_h = rearrange(x, \"b (coord bbox) h w -> coord b bbox h w\", coord=4, bbox=8)\n", "# now you can operate on individual variables\n", "max_bbox_area = reduce(bbox_w * bbox_h, \"b bbox h w -> b h w\", \"max\")\n", "guess(bbox_x.shape)\n", "guess(max_bbox_area.shape)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "## Getting into the weeds of tensor packing\n", "\n", "you can skip this part - it explains why it pays off to define splits and packs explicitly\n", "\n", "when implementing a custom gated activation (like GLU), a split is needed:\n", "```python\n", "y1, y2 = rearrange(x, 'b (split c) h w -> split b c h w', split=2)\n", "result = y1 * sigmoid(y2) # or tanh\n", "```\n", "... but we could split differently\n", "```python\n", "y1, y2 = rearrange(x, 'b (c split) h w -> split b c h w', split=2)\n", "```\n", "\n", "- the first one splits channels into consecutive groups: `y1 = x[:, :x.shape[1] // 2, :, :]`\n", "- while the second takes channels with a step: `y1 = x[:, 0::2, :, :]`\n", "\n", "This may lead to very *surprising* results when the input is\n", "- a result of group convolution\n", "- a result of bidirectional LSTM/RNN\n", "- multi-head attention\n", "\n", "Let's focus on the second case (LSTM/RNN), since it is less obvious.\n", "\n", "For instance, cudnn concatenates LSTM outputs for forward-in-time and backward-in-time\n", "\n", "Also, in pytorch GLU splits channels into consecutive groups (the first way).\n", "So when LSTM's output comes to GLU,\n", "- forward-in-time produces the linear part, and backward-in-time produces the activation ...\n", "- the roles of the two directions differ, and the gradients coming to the two parts are different\n", " - that's not what you expect from a simple `GLU(BLSTM(x))`, right?\n", "\n", "`einops` notation makes such inconsistencies explicit and easy to detect" ] }, { "cell_type": "markdown", "metadata": 
{}, "source": [ "## Shape parsing\n", "Just a handy utility" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "jupyter": { "outputs_hidden": false }, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ "from einops import parse_shape" ] }, { "cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [], "source": [ "def convolve_2d(x):\n", " # imagine we have a simple 2d convolution with padding,\n", " # so output has same shape as input.\n", " # Sorry for laziness, use imagination!\n", " return x" ] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [], "source": [ "# imagine we are working with 3d data\n", "x_5d = rearrange(x, \"b c x (y z) -> b c x y z\", z=20)\n", "# but we have only 2d convolutions.\n", "# That's not a problem, since we can fold z into the batch axis and apply them\n", "y = rearrange(x_5d, \"b c x y z -> (b z) c x y\")\n", "y = convolve_2d(y)\n", "# parse_shape not only supplies the axis sizes, but also verifies that all dimensions match\n", "y = rearrange(y, \"(b z) c x y -> b c x y z\", **parse_shape(x_5d, \"b c x y z\"))" ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'b': 10, 'c': 32, 'x': 100, 'y': 10, 'z': 20}" ] }, "execution_count": 30, "metadata": {}, "output_type": "execute_result" } ], "source": [ "parse_shape(x_5d, \"b c x y z\")" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'batch': 10, 'c': 32}" ] }, "execution_count": 31, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# we can skip some dimensions by writing underscore\n", "parse_shape(x_5d, \"batch c _ _ _\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Striding anything\n", "\n", "Finally, how to convert any operation into a strided operation?
\n", "(like convolution with strides, aka dilated/atrous convolution)" ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [], "source": [ "# each image is split into subgrids, each subgrid now is a separate \"image\"\n", "y = rearrange(x, \"b c (h hs) (w ws) -> (hs ws b) c h w\", hs=2, ws=2)\n", "y = convolve_2d(y)\n", "# pack subgrids back to an image\n", "y = rearrange(y, \"(hs ws b) c h w -> b c (h hs) (w ws)\", hs=2, ws=2)\n", "\n", "assert y.shape == x.shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Layers\n", "\n", "For frameworks that prefer operating with layers, layers are available.\n", "\n", "You'll need to import a proper one depending on your backend:\n", "\n", "```python\n", "from einops.layers.torch import Rearrange, Reduce\n", "from einops.layers.flax import Rearrange, Reduce\n", "from einops.layers.tensorflow import Rearrange, Reduce\n", "from einops.layers.chainer import Rearrange, Reduce\n", "```\n", "\n", "`Einops` layers are identical to operations, and have same parameters.
\n", "(with the exception of the first argument, which is passed when the layer is called)\n", "\n", "```python\n", "layer = Rearrange(pattern, **axes_lengths)\n", "layer = Reduce(pattern, reduction, **axes_lengths)\n", "\n", "# apply layer to tensor\n", "x = layer(x)\n", "```\n", "\n", "Usually it is more convenient to use layers, not operations, to build models\n", "```python\n", "# example given for pytorch, but code in other frameworks is almost identical\n", "from torch.nn import Sequential, Conv2d, MaxPool2d, Linear, ReLU\n", "from einops.layers.torch import Reduce\n", "\n", "model = Sequential(\n", " Conv2d(3, 6, kernel_size=5),\n", " MaxPool2d(kernel_size=2),\n", " Conv2d(6, 16, kernel_size=5),\n", " # combined pooling and flattening in a single step\n", " Reduce('b c (h 2) (w 2) -> b (c h w)', 'max'), \n", " Linear(16*5*5, 120), \n", " ReLU(),\n", " Linear(120, 10), \n", " # In flax, the {'axis': value} syntax for specifying values for axes is mandatory:\n", " # Rearrange('(b1 b2) d -> b1 b2 d', {'b1': 12}), \n", ")\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## What's next?\n", "\n", "- rush through [writing better code with einops+pytorch](https://arogozhnikov.github.io/einops/pytorch-examples.html)\n", "\n", "Using a different framework? Not a big issue, most recommendations transfer well to other frameworks.
\n", "`einops` works the same way in any framework.\n", "\n", "Finally - just write your code with einops!" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" } }, "nbformat": 4, "nbformat_minor": 4 }
arogozhnikov-einops-ad2c8d6/docs/3-einmix-layer.ipynb
{ "cells": [ { "cell_type": "markdown", "id": "0c6e8fa3-2900-4879-a610-3c038f8c8d23", "metadata": {}, "source": [ "# EinMix: universal toolkit for advanced MLP architectures\n", "\n", "Recent progress in MLP-based architectures demonstrated that *very specific* MLPs can compete with convnets and transformers (and even outperform them).\n", "\n", "EinMix allows writing such architectures in a more uniform and readable way." ] }, { "cell_type": "markdown", "id": "dc9a3913-5d5f-41da-8946-f17b3e537ec4", "metadata": {}, "source": [ "## EinMix — building block of MLPs" ] }, { "cell_type": "code", "execution_count": 1, "id": "5f24a8f2-6680-4298-9736-081719c53f88", "metadata": {}, "outputs": [], "source": [ "from einops.layers.torch import EinMix as Mix" ] }, { "cell_type": "markdown", "id": "eacd97e9-7ae2-4321-a3ad-3fb9b122fd81", "metadata": {}, "source": [ "The logic of EinMix is very close to that of `einsum`. \n", "If you're not familiar with einsum, follow these guides first:\n", "\n", "- https://rockt.github.io/2018/04/30/einsum\n", "- https://towardsdatascience.com/einsum-an-underestimated-function-99ca96e2942e\n", "- https://theaisummer.com/einsum-attention/\n", "\n", "Einsum uniformly describes a number of operations, however `EinMix` is defined slightly differently.\n", "\n", "Here is a linear layer, a common block in sequence modelling (e.g. 
in NLP/speech), written with einsum\n", "```python\n", "weight = <...create tensor...>\n", "result = torch.einsum('tbc,cd->tbd', embeddings, weight)\n", "```\n", "\n", "The EinMix counterpart is:\n", "```python\n", "mix_channels = Mix('t b c -> t b c_out', weight_shape='c c_out', ...)\n", "result = mix_channels(embeddings)\n", "```\n", "\n", "Main differences compared to plain `einsum` are:\n", "\n", "- the layer takes care of the weight initialization & management hassle\n", "- the weight does not appear among the inputs - it is described by `weight_shape` and owned by the layer\n", "\n", "We'll discuss other differences a bit later; for now, let's implement ResMLP" ] }, { "cell_type": "code", "execution_count": 2, "id": "d6983748-20d9-4825-a79f-a0641e58085c", "metadata": {}, "outputs": [], "source": [ "# let's start\n", "import torch\n", "from torch import nn" ] }, { "cell_type": "markdown", "id": "928fc321-7b41-4737-bbf4-644021c1c209", "metadata": {}, "source": [ "## ResMLP — original implementation\n", "\n", "Building blocks of ResMLP consist only of linear/affine layers and one activation (GELU).
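To make the einsum/EinMix correspondence concrete, here is a minimal numpy sketch of the `'tbc,cd->tbd'` contraction that such a layer performs (axis sizes are invented for illustration; `np.einsum` stands in for `torch.einsum`):

```python
import numpy as np

t, b, c, c_out = 5, 2, 3, 4  # time, batch, channels, output channels
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(t, b, c))
weight = rng.normal(size=(c, c_out))

# only the channel axis is mixed; t and b pass through untouched
result = np.einsum("tbc,cd->tbd", embeddings, weight)
assert result.shape == (t, b, c_out)
# equivalent to a matrix multiplication over the last axis
assert np.allclose(result, embeddings @ weight)
```

EinMix wraps exactly this contraction, but creates and stores `weight` itself.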
\n", "Let's see how we can rewrite all of the components with Mix. \n", "\n", "We start from a reference code for ResMLP block published in the [paper](https://arxiv.org/pdf/2105.03404.pdf):" ] }, { "cell_type": "code", "execution_count": 3, "id": "eed8edf4-0024-482a-bba2-03e84d4086ab", "metadata": {}, "outputs": [], "source": [ "# No norm layer\n", "class Affine(nn.Module):\n", " def __init__(self, dim):\n", " super().__init__()\n", " self.alpha = nn.Parameter(torch.ones(dim))\n", " self.beta = nn.Parameter(torch.zeros(dim))\n", "\n", " def forward(self, x):\n", " return self.alpha * x + self.beta\n", "\n", "\n", "class Mlp(nn.Module):\n", " def __init__(self, dim):\n", " super().__init__()\n", " self.fc1 = nn.Linear(dim, 4 * dim)\n", " self.act = nn.GELU()\n", " self.fc2 = nn.Linear(4 * dim, dim)\n", "\n", " def forward(self, x):\n", " x = self.fc1(x)\n", " x = self.act(x)\n", " x = self.fc2(x)\n", " return x\n", "\n", "class ResMLP_Blocks(nn.Module):\n", " def __init__(self, nb_patches, dim, layerscale_init):\n", " super().__init__()\n", " self.affine_1 = Affine(dim)\n", " self.affine_2 = Affine(dim)\n", " self.linear_patches = nn.Linear(nb_patches, nb_patches) #Linear layer on patches\n", " self.mlp_channels = Mlp(dim) #MLP on channels\n", " self.layerscale_1 = nn.Parameter(layerscale_init * torch.ones((dim))) # LayerScale\n", " self.layerscale_2 = nn.Parameter(layerscale_init * torch.ones((dim))) # parameters\n", "\n", " def forward(self, x):\n", " res_1 = self.linear_patches(self.affine_1(x).transpose(1,2)).transpose(1,2)\n", " x = x + self.layerscale_1 * res_1\n", " res_2 = self.mlp_channels(self.affine_2(x))\n", " x = x + self.layerscale_2 * res_2\n", " return x" ] }, { "cell_type": "markdown", "id": "7496cf7b-a3e1-4648-b62b-30de26c2e718", "metadata": {}, "source": [ "## ResMLP — rewritten\n", "\n", "Code below is the result of first rewriting: \n", "- combination [transpose -> linear -> transpose back] got nicely packed into a single `EinMix` 
(`mix_patches`)
\n", "    `Mix('b t c -> b t0 c', weight_shape='t t0', bias_shape='t0', t=nb_patches, t0=nb_patches)` \n", "    - the pattern `'b t c -> b t0 c'` tells us that `b` and `c` are unperturbed, while tokens `t->t0` were mixed\n", "    - explicit parameter shapes are also quite insightful\n", "    \n", "- In the new implementation the affine layer is also handled by `EinMix`:
\n", "    `Mix('b t c -> b t c', weight_shape='c', bias_shape='c', c=dim)`\n", "    - from the pattern you can see that there is no mixing at all, only multiplication and shift\n", "    - multiplication and shift are defined by weight and bias - and those depend only on a channel\n", "    - thus the affine transform is per-channel\n", "    \n", "- The linear layer is also handled by EinMix; the only difference compared to the affine layer is the absence of bias\n", "- We specified that input is 3d and order is `btc`, not `tbc` - this is not written explicitly in the original code\n", "\n", "The only step back we had to take is changing the initialization schema of EinMix for the affine and linear layers" ] }, { "cell_type": "code", "execution_count": 4, "id": "a9cc8c91-d80a-4c35-a010-dbafad22b8c7", "metadata": {}, "outputs": [], "source": [ "def Mlp(dim):\n", " return nn.Sequential(\n", " nn.Linear(dim, 4 * dim),\n", " nn.GELU(),\n", " nn.Linear(4 * dim, dim),\n", " )\n", "\n", "def init(Mix_layer, scale=1.):\n", " Mix_layer.weight.data[:] = scale\n", " if Mix_layer.bias is not None:\n", " Mix_layer.bias.data[:] = 0\n", " return Mix_layer\n", "\n", "class ResMLP_Blocks2(nn.Module):\n", " def __init__(self, nb_patches, dim, layerscale_init):\n", " super().__init__()\n", "\n", " self.affine1 = init(Mix('b t c -> b t c', weight_shape='c', bias_shape='c', c=dim))\n", " self.affine2 = init(Mix('b t c -> b t c', weight_shape='c', bias_shape='c', c=dim))\n", " self.mix_patches = Mix('b t c -> b t0 c', weight_shape='t t0', bias_shape='t0', t=nb_patches, t0=nb_patches)\n", " self.mlp_channels = Mlp(dim)\n", " self.linear1 = init(Mix('b t c -> b t c', weight_shape='c', c=dim), scale=layerscale_init)\n", " self.linear2 = init(Mix('b t c -> b t c', weight_shape='c', c=dim), scale=layerscale_init)\n", "\n", " def forward(self, x):\n", " res1 = self.mix_patches(self.affine1(x))\n", " x = x + self.linear1(res1)\n", " res2 = self.mlp_channels(self.affine2(x))\n", " x = x + self.linear2(res2)\n", " return x" ] }, { 
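For intuition, here is a small numpy sketch (an illustration only, not einops internals; `mix_patches_sketch` is a made-up name) of the computation that `Mix('b t c -> b t0 c', weight_shape='t t0', bias_shape='t0')` performs: tokens are mixed along the second axis while batch and channels stay untouched.

```python
import numpy as np

def mix_patches_sketch(x, weight, bias):
    # x: (b, t, c); weight: (t, t0); bias: (t0,)
    # contract over t: every output token is a weighted sum of input tokens
    out = np.einsum("btc,td->bdc", x, weight)
    return out + bias[None, :, None]

x = np.random.rand(2, 5, 3)   # batch of 2, 5 tokens, 3 channels
w = np.random.rand(5, 7)      # mixes 5 tokens into 7
b = np.zeros(7)
print(mix_patches_sketch(x, w, b).shape)  # (2, 7, 3)
```

Note how the bias depends only on the output token axis, matching `bias_shape='t0'` in the layer above.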
"cell_type": "markdown", "id": "761d0d39-b041-4389-a19d-c3d512d64624", "metadata": {}, "source": [ "## ResMLP — rewritten more\n", "\n", "Since here in einops-land we care about code being easy to follow, let's make one more transformation.\n", "\n", "We group layers from both branches, and now the order of operations matches the order in which they are written in the code.\n", "\n", "Could we go further? Actually, yes - `nn.Linear` layers can also be replaced by EinMix,\n", "however they are quite organic here, since the first and last operations in `branch_channels` already show the components.\n", "\n", "Brevity of `nn.Linear` is beneficial when the context specifies tensor shapes.\n", "\n", "Other interesting observations:\n", "- it is hard to notice in the original code that the first `nn.Linear` is preceded by a linear layer (thus the latter is redundant or can be fused into the former)\n", "- it is hard to notice in the original code that the second `nn.Linear` is followed by an affine layer (thus the latter is again redundant)\n", "\n", "Take time to reorganize your code. This may be quite insightful." 
] }, { "cell_type": "code", "execution_count": 5, "id": "6fb50fac-1ef4-45a2-acdb-48f1050df7ca", "metadata": {}, "outputs": [], "source": [ "def init(layer: Mix, scale=1.):\n", " layer.weight.data[:] = scale\n", " if layer.bias is not None:\n", " layer.bias.data[:] = 0\n", " return layer\n", "\n", "class ResMLP_Blocks3(nn.Module):\n", " def __init__(self, nb_patches, dim, layerscale_init):\n", " super().__init__()\n", " self.branch_patches = nn.Sequential(\n", " init(Mix('b t c -> b t c', weight_shape='c', c=dim), scale=layerscale_init),\n", " Mix('b t c -> b t0 c', weight_shape='t t0', bias_shape='t0', t=nb_patches, t0=nb_patches),\n", " init(Mix('b t c -> b t c', weight_shape='c', bias_shape='c', c=dim)),\n", " )\n", "\n", " self.branch_channels = nn.Sequential(\n", " init(Mix('b t c -> b t c', weight_shape='c', c=dim), scale=layerscale_init),\n", " nn.Linear(dim, 4 * dim),\n", " nn.GELU(),\n", " nn.Linear(4 * dim, dim),\n", " init(Mix('b t c -> b t c', weight_shape='c', bias_shape='c', c=dim)),\n", " )\n", "\n", " def forward(self, x):\n", " x = x + self.branch_patches(x)\n", " x = x + self.branch_channels(x)\n", " return x" ] }, { "cell_type": "markdown", "id": "ad7f343e-b81d-4b25-96bd-4c48ff33f9c8", "metadata": {}, "source": [ "## ResMLP — performance\n", "\n", "There is some fear of using einsum because historically it lagged in performance.\n", "\n", "Below we run a test and verify that performance didn't change after the transition to `EinMix`" ] }, { "cell_type": "code", "execution_count": 6, "id": "3987bbbc-ccc2-4374-8979-fd3fb29c4910", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "28.1 ms ± 1.61 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "26.3 ms ± 620 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "25.9 ms ± 706 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "26.8 ms ± 2.99 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "25.9 ms ± 794 µs per loop (mean ± std. dev. 
of 7 runs, 10 loops each)\n", "25.6 ms ± 723 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)\n" ] } ], "source": [ "x = torch.zeros([32, 128, 128])\n", "for layer in [\n", " ResMLP_Blocks(128, dim=128, layerscale_init=1.),\n", " ResMLP_Blocks2(128, dim=128, layerscale_init=1.),\n", " ResMLP_Blocks3(128, dim=128, layerscale_init=1.),\n", " # scripted versions\n", " torch.jit.script(ResMLP_Blocks(128, dim=128, layerscale_init=1.)),\n", " torch.jit.script(ResMLP_Blocks2(128, dim=128, layerscale_init=1.)),\n", " torch.jit.script(ResMLP_Blocks3(128, dim=128, layerscale_init=1.)),\n", "]:\n", " %timeit -n 10 y = layer(x)" ] }, { "cell_type": "markdown", "id": "05428b35-5788-4e7f-867e-aa742d7215e7", "metadata": {}, "source": [ "## TokenMixer from MLPMixer — original code\n", "\n", "Let's now delve into MLPMixer. We start from pytorch [implementation](https://github.com/jaketae/mlp-mixer/blob/e7d68dfc31e94721724689e6ec90f05806b50124/mlp_mixer/core.py) by Jake Tae.\n", "\n", "We'll focus on two components of MLPMixer that don't exist in convnets. 
First component is TokenMixer:" ] }, { "cell_type": "code", "execution_count": 7, "id": "8f6a6073-44af-40ec-8154-02ae8370f749", "metadata": {}, "outputs": [], "source": [ "from torch.nn import functional as F\n", "\n", "class MLP(nn.Module):\n", " def __init__(self, num_features, expansion_factor, dropout):\n", " super().__init__()\n", " num_hidden = num_features * expansion_factor\n", " self.fc1 = nn.Linear(num_features, num_hidden)\n", " self.dropout1 = nn.Dropout(dropout)\n", " self.fc2 = nn.Linear(num_hidden, num_features)\n", " self.dropout2 = nn.Dropout(dropout)\n", "\n", " def forward(self, x):\n", " x = self.dropout1(F.gelu(self.fc1(x)))\n", " x = self.dropout2(self.fc2(x))\n", " return x\n", "\n", "\n", "class TokenMixer(nn.Module):\n", " def __init__(self, num_features, num_patches, expansion_factor, dropout):\n", " super().__init__()\n", " self.norm = nn.LayerNorm(num_features)\n", " self.mlp = MLP(num_patches, expansion_factor, dropout)\n", "\n", " def forward(self, x):\n", " # x.shape == (batch_size, num_patches, num_features)\n", " residual = x\n", " x = self.norm(x)\n", " x = x.transpose(1, 2)\n", " # x.shape == (batch_size, num_features, num_patches)\n", " x = self.mlp(x)\n", " x = x.transpose(1, 2)\n", " # x.shape == (batch_size, num_patches, num_features)\n", " out = x + residual\n", " return out" ] }, { "cell_type": "markdown", "id": "362fa6ac-7146-4f7b-9e1d-0765486c26da", "metadata": {}, "source": [ "## TokenMixer from MLPMixer — reimplemented\n", "\n", "We can significantly reduce amount of code by using `EinMix`. \n", "\n", "- Main caveat addressed by original code is that `nn.Linear` mixes only last axis. 
`EinMix` can mix any axis.\n", "- Sequential structure is always preferred as it is easier to follow\n", "- Intentionally there is no residual connection in `TokenMixer`, because honestly it's not the Mixer's job and should be done by the caller\n" ] }, { "cell_type": "code", "execution_count": 8, "id": "f447c6c5-932f-4929-991a-f97b14776618", "metadata": {}, "outputs": [], "source": [ "def TokenMixer(num_features: int, n_patches: int, expansion_factor: int, dropout: float):\n", " n_hidden = n_patches * expansion_factor\n", " return nn.Sequential(\n", " nn.LayerNorm(num_features),\n", " Mix('b hw c -> b hid c', weight_shape='hw hid', bias_shape='hid', hw=n_patches, hid=n_hidden),\n", " nn.GELU(),\n", " nn.Dropout(dropout),\n", " Mix('b hid c -> b hw c', weight_shape='hid hw', bias_shape='hw', hw=n_patches, hid=n_hidden),\n", " nn.Dropout(dropout),\n", " )" ] }, { "cell_type": "markdown", "id": "19caacfc-a066-4727-918e-b3a98a190014", "metadata": {}, "source": [ "You may also like an independent [implementation](https://github.com/lucidrains/mlp-mixer-pytorch/blob/main/mlp_mixer_pytorch/mlp_mixer_pytorch.py) of MLPMixer from Phil Wang.
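As a quick sanity check, a small numpy snippet (illustration only, not part of either library) showing that mixing the second axis directly, the way the `'b hw c -> b hid c'` pattern describes, matches the transpose-linear-transpose dance from the original `TokenMixer`:

```python
import numpy as np

b, hw, hid, c = 2, 4, 6, 3
x = np.random.rand(b, hw, c)
w = np.random.rand(hw, hid)

# mixing the token axis in one shot, as the EinMix pattern describes
direct = np.einsum("btc,th->bhc", x, w)

# the transpose -> matmul -> transpose-back route from the original code
roundabout = (x.transpose(0, 2, 1) @ w).transpose(0, 2, 1)

print(np.allclose(direct, roundabout))  # True
```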
\n", "Phil solves the issue by repurposing `nn.Conv1d` to mix on the second dimension. Hacky, but does the job\n" ] }, { "cell_type": "markdown", "id": "5210dfd1-0195-458f-a8b9-819547123bad", "metadata": {}, "source": [ "## MLPMixer's patch embeddings — original\n", "\n", "The second interesting part of MLPMixer is derived from vision transformers.\n", "\n", "In the very beginning an image is split into patches, and each patch is linearly projected into an embedding.\n", "\n", "I've taken the part of Jake's code responsible for embedding patches:" ] }, { "cell_type": "code", "execution_count": 9, "id": "24752920-006c-49e1-acb9-227543f84afb", "metadata": {}, "outputs": [], "source": [ "def check_sizes(image_size, patch_size):\n", " sqrt_num_patches, remainder = divmod(image_size, patch_size)\n", " assert remainder == 0, \"`image_size` must be divisible by `patch_size`\"\n", " num_patches = sqrt_num_patches ** 2\n", " return num_patches\n", "\n", "class Patcher(nn.Module):\n", " def __init__(\n", " self,\n", " image_size=256,\n", " patch_size=16,\n", " in_channels=3,\n", " num_features=128,\n", " ):\n", " _num_patches = check_sizes(image_size, patch_size)\n", " super().__init__()\n", " # per-patch fully-connected is equivalent to strided conv2d\n", " self.patcher = nn.Conv2d(\n", " in_channels, num_features, kernel_size=patch_size, stride=patch_size\n", " )\n", "\n", " def forward(self, x):\n", " patches = self.patcher(x)\n", " batch_size, num_features, _, _ = patches.shape\n", " patches = patches.permute(0, 2, 3, 1)\n", " patches = patches.view(batch_size, -1, num_features)\n", "\n", " return patches" ] }, { "cell_type": "markdown", "id": "b5ecaac5-d5fe-48b7-9339-fe84fdccd9ac", "metadata": {}, "source": [ "## MLPMixer's patch embeddings — reimplemented\n", "\n", "`EinMix` does this in a single operation. 
This may require some training at first to understand.\n", "\n", "Let's go step-by-step:\n", "\n", "- `b c_in (h hp) (w wp) ->` - 4-dimensional input tensor (BCHW-ordered) is split into patches of shape `hp x wp`\n", "- `weight_shape='c_in hp wp c'`. Axes `c_in`, `hp` and `wp` are all absent in the output: the three-dimensional patch tensor was *mixed* to produce a vector of length `c`\n", "- `-> b (h w) c` - output is 3-dimensional. All patches were reorganized from an `h x w` grid into a one-dimensional sequence of vectors\n", "\n", "\n", "We don't need to provide image_size beforehand; the new implementation handles images of different dimensions as long as they can be divided into patches" ] }, { "cell_type": "code", "execution_count": 10, "id": "2b786a62-a742-49d3-9e66-d02eb2f4e384", "metadata": {}, "outputs": [], "source": [ "def patcher(patch_size=16, in_channels=3, num_features=128):\n", " return Mix('b c_in (h hp) (w wp) -> b (h w) c', weight_shape='c_in hp wp c', bias_shape='c',\n", " c=num_features, hp=patch_size, wp=patch_size, c_in=in_channels)" ] }, { "cell_type": "markdown", "id": "a26b2f34-07cd-4fc2-bdf9-0043dd3451ea", "metadata": {}, "source": [ "## Vision Permutator\n", "\n", "As a third example we consider pytorch-like code from [ViP paper](https://arxiv.org/pdf/2106.12368.pdf).\n", "\n", "Vision permutator is only slightly more nuanced than previous models, because \n", "1. it operates on spatial dimensions separately, while MLPMixer and its friends just pack all spatial info into one axis. \n", "2. it splits channels into groups called 'segments'\n", "\n", "The paper provides pseudo-code, so I reworked it into a complete module with minimal changes. 
Enjoy:" ] }, { "cell_type": "code", "execution_count": 11, "id": "2c94453b-bc78-4f06-8d51-f110e4b6501a", "metadata": {}, "outputs": [], "source": [ "class WeightedPermuteMLP(nn.Module):\n", " def __init__(self, H, W, C, S):\n", " super().__init__()\n", "\n", " self.proj_h = nn.Linear(H * S, H * S)\n", " self.proj_w = nn.Linear(W * S, W * S)\n", " self.proj_c = nn.Linear(C, C)\n", " self.proj = nn.Linear(C, C)\n", " self.S = S\n", "\n", " def forward(self, x):\n", " B, H, W, C = x.shape\n", " S = self.S\n", " N = C // S\n", " x_h = x.reshape(B, H, W, N, S).permute(0, 3, 2, 1, 4).reshape(B, N, W, H*S)\n", " x_h = self.proj_h(x_h).reshape(B, N, W, H, S).permute(0, 3, 2, 1, 4).reshape(B, H, W, C)\n", "\n", " x_w = x.reshape(B, H, W, N, S).permute(0, 1, 3, 2, 4).reshape(B, H, N, W*S)\n", " x_w = self.proj_w(x_w).reshape(B, H, N, W, S).permute(0, 1, 3, 2, 4).reshape(B, H, W, C)\n", "\n", " x_c = self.proj_c(x)\n", "\n", " x = x_h + x_w + x_c\n", " x = self.proj(x)\n", " return x" ] }, { "cell_type": "markdown", "id": "27a4aa53-e719-4079-973a-83b07d507c1f", "metadata": {}, "source": [ "That didn't look readable, right? \n", "\n", "This code is also very inflexible: code in the paper did not support batch dimension, and multiple changes were necessary to allow batch processing.
\n", "This process is fragile and can easily result in virtually uncatchable bugs.\n", "\n", "Now the good news: each of these long method chains can be replaced with a single `EinMix` layer:" ] }, { "cell_type": "code", "execution_count": 12, "id": "a0a699c0-f0b9-4e2c-8dc1-b33f5a7dfa1e", "metadata": {}, "outputs": [], "source": [ "class WeightedPermuteMLP_new(nn.Module):\n", " def __init__(self, H, W, C, seg_len):\n", " super().__init__()\n", " assert C % seg_len == 0, f\"can't divide {C} into segments of length {seg_len}\"\n", " self.mlp_c = Mix('b h w c -> b h w c0', weight_shape='c c0', bias_shape='c0', c=C, c0=C)\n", " self.mlp_h = Mix('b h w (n c) -> b h0 w (n c0)', weight_shape='h c h0 c0', bias_shape='h0 c0',\n", " h=H, h0=H, c=seg_len, c0=seg_len)\n", " self.mlp_w = Mix('b h w (n c) -> b h w0 (n c0)', weight_shape='w c w0 c0', bias_shape='w0 c0',\n", " w=W, w0=W, c=seg_len, c0=seg_len)\n", " self.proj = nn.Linear(C, C)\n", "\n", " def forward(self, x):\n", " x = self.mlp_c(x) + self.mlp_h(x) + self.mlp_w(x)\n", " return self.proj(x)" ] }, { "cell_type": "markdown", "id": "7a9a7165-6c58-409f-b277-f4f7314e57fe", "metadata": {}, "source": [ "Great, now let's confirm that performance did not deteriorate." ] }, { "cell_type": "code", "execution_count": 13, "id": "f375fcd7-8238-470d-aeaf-976892062911", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "90.5 ms ± 1.22 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "91.5 ms ± 616 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "91.8 ms ± 626 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)\n", "87.4 ms ± 3.59 ms per loop (mean ± std. dev. 
of 7 runs, 10 loops each)\n" ] } ], "source": [ "x = torch.zeros([32, 32, 32, 128])\n", "\n", "for layer in [\n", " WeightedPermuteMLP(H=32, W=32, C=128, S=4),\n", " WeightedPermuteMLP_new(H=32, W=32, C=128, seg_len=4),\n", " # scripted versions\n", " torch.jit.script(WeightedPermuteMLP(H=32, W=32, C=128, S=4)),\n", " torch.jit.script(WeightedPermuteMLP_new(H=32, W=32, C=128, seg_len=4)),\n", "]:\n", " %timeit -n 10 y = layer(x)" ] }, { "cell_type": "markdown", "id": "6694ade8-557d-4461-8f3a-4bd5f74dbea2", "metadata": {}, "source": [ "## Final remarks\n", "\n", "`EinMix` has incredible potential: \n", "it helps with MLPs that don't fit into a limited 'mix all in the last axis' paradigm.\n", "\n", "However, existing research is ... very limited; it does not cover the real possibilities of densely connected architectures.\n", "\n", "Most of its *systematic* novelty is \"mix along spatial axes too\". \n", "But `EinMix` provides **an astonishing amount of other possibilities!**" ] }, { "cell_type": "markdown", "id": "a1ee15e6-933a-4c3e-842e-e908a0aeee15", "metadata": {}, "source": [ "### Groups of mixers\n", "\n", "You can find two settings compared in the MLPMixer paper (Supplementary A1)\n", "```python\n", "'b hw c -> b hw_out c', weight_shape='hw hw_out'\n", "```\n", "and\n", "```python\n", "'b hw c -> b hw_out c', weight_shape='c hw hw_out'\n", "```\n", "While the latter makes more sense (why should mixing work similarly for all channels?), the former performs better.\n", "\n", "So one more question is reasonable: what if channels are split into groups, and mixing is defined for each group?\n", "```python\n", "'b hw (group c) -> b hw_out (group c)', weight_shape='group hw hw_out'\n", "```\n", "Implementing such a setting without einops is considerably harder.\n", "\n", "### Mixing within patch on a grid\n", "\n", "What if you make mixing 'local' in space? 
Completely doable:\n", "\n", "```python\n", "'b c (h hI) (w wI) -> b c (h hO) (w wO)', weight_shape='c hI wI hO wO'\n", "```\n", "\n", "We split the tensor into patches of shape `hI wI` and mix things within each channel.\n", " \n", "### Mixing in subgrids\n", "\n", "Ok, done with local mixing. How to collect information from the whole image?
\n", "Well, you can again densely connect all the tokens, but all-to-all connection is too expensive.\n", "\n", "\n", "\n", "TODO need some image here to show sub-grids and information exchange. \n", "\n", "\n", "Here is the EinMix way: split the image into subgrids (each subgrid has steps `h` and `w`), and densely connect the tokens within each subgrid\n", "\n", "```python\n", "'b c (hI h) (wI w) -> b c (hO h) (wO w)', weight_shape='c hI wI hO wO'\n", "```\n", "\n", "\n", "### Going deeper\n", "And that's just the very tip of the iceberg.
\n", "\n", "- Want to mix part of axis? — No problems!\n", "- ... in a grid-like manner — Supported! \n", "- ... while mixing channels within group? — Welcome! \n", "- In 2d/3d/4d? — Sure!\n", "- I don't use pytorch — EinMix is available for multiple frameworks!\n", "\n", "Hopefully this guide helped you to find MLPs more interesting and intriguing. And simpler to experiment with." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.0" } }, "nbformat": 4, "nbformat_minor": 5 } arogozhnikov-einops-ad2c8d6/docs/4-pack-and-unpack.ipynb000066400000000000000000000412741475201674600233370ustar00rootroot00000000000000{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# einops.pack and einops.unpack\n", "\n", "einops 0.6 introduces two more functions to the family: `pack` and `unpack`.\n", "\n", "Here is what they do:\n", "\n", "- `unpack` reverses `pack`\n", "- `pack` reverses `unpack`\n", "\n", "Enlightened with this exhaustive description, let's move to examples.\n", "\n" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# we'll use numpy for demo purposes\n", "# operations work the same way with other frameworks\n", "import numpy as np" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Stacking data layers\n", "\n", "Assume we have RGB image along with a corresponding depth image that we want to stack:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "from einops import pack, unpack\n", "\n", "h, w = 100, 200\n", "# image_rgb is 3-dimensional (h, w, 3) and 
depth is 2-dimensional (h, w)\n", "image_rgb = np.random.random([h, w, 3])\n", "image_depth = np.random.random([h, w])\n", "# but we can stack them\n", "image_rgbd, ps = pack([image_rgb, image_depth], 'h w *')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## How to read packing patterns\n", "\n", "pattern `h w *` means that\n", "- output is 3-dimensional\n", "- first two axes (`h` and `w`) are shared across all inputs and also shared with output\n", "- inputs, however, do not have to be 3-dimensional. They can be 2-dim, 3-dim, 4-dim, etc.
\n", "  Regardless of input dimensionality, they will all be packed into the 3-dim output, and information about how they were packed is stored in `PS`" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "data": { "text/plain": [ "((100, 200, 3), (100, 200), (100, 200, 4))" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# as you see, pack properly appended depth as one more layer\n", "# and correctly aligned axes!\n", "# this won't work off the shelf with np.concatenate or torch.cat or alike\n", "image_rgb.shape, image_depth.shape, image_rgbd.shape" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "data": { "text/plain": [ "[(3,), ()]" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# now let's see what PS keeps.\n", "# PS means Packed Shapes, not PlayStation or Post Script\n", "ps" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "which reads: the first tensor had shape `h, w, *and 3*`, while the second tensor had shape `h, w *and nothing more*`.\n", "That's just enough to reverse packing:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "data": { "text/plain": [ "((100, 200, 3), (100, 200))" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# remove 1-axis in depth image during unpacking. 
Results are (h, w, 3) and (h, w)\n", "unpacked_rgb, unpacked_depth = unpack(image_rgbd, ps, 'h w *')\n", "unpacked_rgb.shape, unpacked_depth.shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "we can unpack tensor in different ways manually:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# simple unpack by splitting the axis. Results are (h, w, 3) and (h, w, 1)\n", "rgb, depth = unpack(image_rgbd, [[3], [1]], 'h w *')\n", "# different split, both outputs have shape (h, w, 2)\n", "rg, bd = unpack(image_rgbd, [[2], [2]], 'h w *')\n", "# unpack to 4 tensors of shape (h, w). More like 'unstack over last axis'\n", "[r, g, b, d] = unpack(image_rgbd, [[], [], [], []], 'h w *')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Short summary so far\n", "\n", "- `einops.pack` is a 'more generic concatenation' (that can stack too)\n", "- `einops.unpack` is a 'more generic split'\n", "\n", "And, of course, `einops` functions are more verbose, and *reversing* concatenation now is *dead simple*\n", "\n", "Compared to other `einops` functions, `pack` and `unpack` have a compact pattern without arrow, and the same pattern can be used in `pack` and `unpack`. These patterns are very simplistic: just a sequence of space-separated axes names.\n", "One axis is `*`, all other axes are valid identifiers.\n", "\n", "Now let's discuss some practical cases" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Auto-batching\n", "\n", "ML models by default accept batches: batch of images, or batch of sentences, or batch of audios, etc.\n", "\n", "During debugging or inference, however, it is common to pass a single image instead (and thus output should be a single prediction)
\n", "In this example we'll write `universal_predict` that can handle both cases." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "from einops import reduce\n", "def image_classifier(images_bhwc):\n", " # mock for image classifier\n", " predictions = reduce(images_bhwc, 'b h w c -> b c', 'mean', h=100, w=200, c=3)\n", " return predictions\n", "\n", "\n", "def universal_predict(x):\n", " x_packed, ps = pack([x], '* h w c')\n", " predictions_packed = image_classifier(x_packed)\n", " [predictions] = unpack(predictions_packed, ps, '* cls')\n", " return predictions" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "(3,)\n", "(5, 3)\n", "(5, 7, 3)\n" ] } ], "source": [ "# works with a single image\n", "print(universal_predict(np.zeros([h, w, 3])).shape)\n", "# works with a batch of images\n", "batch = 5\n", "print(universal_predict(np.zeros([batch, h, w, 3])).shape)\n", "# or even a batch of videos\n", "n_frames = 7\n", "print(universal_predict(np.zeros([batch, n_frames, h, w, 3])).shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**what we can learn from this example**:\n", "\n", "- `pack` and `unpack` play nicely together. That's not a coincidence :)\n", "- patterns in `pack` and `unpack` may differ, and that's quite common for applications\n", "- unlike other operations in `einops`, `(un)pack` does not provide arbitrary reordering of axes" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Class token in VIT\n", "\n", "Let's assume we have a simple transformer model that works with `BTC`-shaped tensors." 
] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def transformer_mock(x_btc):\n", " # imagine this is a transformer model, a very efficient one\n", " assert len(x_btc.shape) == 3\n", " return x_btc" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's implement vision transformer (ViT) with a class token (i.e. static token, corresponding output is used to classify an image)" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# below it is assumed that you already\n", "# 1) split batch of images into patches 2) applied linear projection and 3) used positional embedding.\n", "\n", "# We'll skip that here. But hey, here is an einops-style way of doing all of that in a single shot!\n", "# from einops.layers.torch import EinMix\n", "# patcher_and_posembedder = EinMix('b (h h2) (w w2) c -> b h w c_out', weight_shape='h2 w2 c c_out',\n", "# bias_shape='h w c_out', h2=..., w2=...)\n", "# patch_tokens_bhwc = patcher_and_posembedder(images_bhwc)" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# preparations\n", "batch, height, width, c = 6, 16, 16, 256\n", "patch_tokens = np.random.random([batch, height, width, c])\n", "class_tokens = np.zeros([batch, c])" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "data": { "text/plain": [ "((6, 256), (6, 16, 16, 256))" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "def vit_einops(class_tokens, patch_tokens):\n", " input_packed, ps = pack([class_tokens, patch_tokens], 'b * c')\n", " output_packed = transformer_mock(input_packed)\n", " return unpack(output_packed, ps, 'b * 
c_out')\n", "\n", "class_token_emb, patch_tokens_emb = vit_einops(class_tokens, patch_tokens)\n", "\n", "class_token_emb.shape, patch_tokens_emb.shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "At this point, let's make a small pause and appreciate the conveniences of this pipeline, by contrasting it to more 'standard' code" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def vit_vanilla(class_tokens, patch_tokens):\n", " b, h, w, c = patch_tokens.shape\n", " class_tokens_b1c = class_tokens[:, np.newaxis, :]\n", " patch_tokens_btc = np.reshape(patch_tokens, [b, -1, c])\n", " input_packed = np.concatenate([class_tokens_b1c, patch_tokens_btc], axis=1)\n", " output_packed = transformer_mock(input_packed)\n", " class_token_emb = np.squeeze(output_packed[:, :1, :], 1)\n", " patch_tokens_emb = np.reshape(output_packed[:, 1:, :], [b, h, w, -1])\n", " return class_token_emb, patch_tokens_emb\n", "\n", "class_token_emb2, patch_tokens_emb2 = vit_vanilla(class_tokens, patch_tokens)\n", "assert np.allclose(class_token_emb, class_token_emb2)\n", "assert np.allclose(patch_tokens_emb, patch_tokens_emb2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notably, we have put all packing and unpacking, reshapes, adding and removing of dummy axes into a couple of lines." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Packing different modalities together\n", "\n", "We can extend the previous example: it is quite common to mix elements of different types of inputs in transformers.\n", "\n", "The simplest one is to mix tokens from all inputs:\n", "\n", "```python\n", "all_inputs = [text_tokens_btc, image_bhwc, task_token_bc, static_tokens_bnc]\n", "inputs_packed, ps = pack(all_inputs, 'b * c')\n", "```\n", "\n", "and you can `unpack` the resulting tokens to the same structure." 
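For intuition, here is a rough numpy sketch (an illustration only, not how einops implements it; `pack_btc_sketch` is a made-up helper) of what packing with pattern `'b * c'` boils down to: flatten everything between the first and last axes into one token axis, then concatenate, remembering the flattened shapes.

```python
import numpy as np

def pack_btc_sketch(tensors):
    # middle axes are flattened into one token axis; 'b c' inputs get a dummy token axis
    flat = [t.reshape(t.shape[0], -1, t.shape[-1]) if t.ndim > 2 else t[:, None, :]
            for t in tensors]
    packed_shapes = [t.shape[1:-1] for t in tensors]  # just enough info to reverse packing
    return np.concatenate(flat, axis=1), packed_shapes

task_token_bc = np.zeros([6, 256])
image_tokens_bhwc = np.zeros([6, 16, 16, 256])
packed, ps = pack_btc_sketch([task_token_bc, image_tokens_bhwc])
print(packed.shape, ps)  # (6, 257, 256) [(), (16, 16)]
```

The stored shapes correspond to what `pack` returns as `ps`, which is why the matching `unpack` can restore every input to its original dimensionality.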
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Packing data coming from different sources together\n", "\n", "The most notable example is of course GANs:\n", "\n", "```python\n", "input_ims, ps = pack([true_images, fake_images], '* h w c')\n", "true_pred, fake_pred = unpack(model(input_ims), ps, '* c')\n", "```\n", "`true_pred` and `fake_pred` are handled differently, that's why we separated them" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Predicting multiple outputs at the same time\n", "\n", "It is quite common to pack predictions of multiple target values into a single layer.\n", "\n", "This is more efficient, but code is less readable. For example, this is how detection code may look:" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def loss_detection(model_output_bhwc, mask_h: int, mask_w: int, n_classes: int):\n", " output = model_output_bhwc\n", "\n", " confidence = output[..., 0].sigmoid()\n", " bbox_x_shift = output[..., 1].sigmoid()\n", " bbox_y_shift = output[..., 2].sigmoid()\n", " bbox_w = output[..., 3]\n", " bbox_h = output[..., 4]\n", " mask_logits = output[..., 5: 5 + mask_h * mask_w]\n", " mask_logits = mask_logits.reshape([*mask_logits.shape[:-1], mask_h, mask_w])\n", " class_logits = output[..., 5 + mask_h * mask_w:]\n", " assert class_logits.shape[-1] == n_classes, class_logits.shape[-1]\n", "\n", " # downstream computations\n", " return confidence, bbox_x_shift, bbox_y_shift, bbox_h, bbox_w, mask_logits, class_logits" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "When the same logic is implemented in einops, there is no need to memorize offsets. 
\n", "Additionally, reshapes and shape checks are automatic:" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def loss_detection_einops(model_output, mask_h: int, mask_w: int, n_classes: int):\n", " confidence, bbox_x_shift, bbox_y_shift, bbox_w, bbox_h, mask_logits, class_logits \\\n", " = unpack(model_output, [[]] * 5 + [[mask_h, mask_w], [n_classes]], 'b h w *')\n", "\n", " confidence = confidence.sigmoid()\n", " bbox_x_shift = bbox_x_shift.sigmoid()\n", " bbox_y_shift = bbox_y_shift.sigmoid()\n", "\n", " # downstream computations\n", " return confidence, bbox_x_shift, bbox_y_shift, bbox_h, bbox_w, mask_logits, class_logits" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# check that results are identical\n", "import torch\n", "dims = dict(mask_h=6, mask_w=8, n_classes=19)\n", "model_output = torch.randn([3, 5, 7, 5 + dims['mask_h'] * dims['mask_w'] + dims['n_classes']])\n", "for a, b in zip(loss_detection(model_output, **dims), loss_detection_einops(model_output, **dims)):\n", " assert torch.allclose(a, b)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Or maybe **reinforcement learning** is closer to your mind?\n", "\n", "If so, predicting multiple outputs is valuable there too:\n", "\n", "```python\n", "action_logits, reward_expectation, q_values, expected_entropy_after_action = \\\n", " unpack(predictions_btc, [[n_actions], [], [n_actions], [n_actions]], 'b step *')\n", "\n", "\n", "```\n", "\n", "\n", "## That's all for today!\n", "\n", "happy packing and unpacking!" 
] }, { "cell_type": "markdown", "metadata": {}, "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.6" } }, "nbformat": 4, "nbformat_minor": 4 }

docs/README.md:

# Einops tutorial

You can read notebooks from github by clicking on files above. If you prefer nbviewer, use the links below:

- Part 1 notebook at [nbviewer](https://nbviewer.jupyter.org/github/arogozhnikov/einops/blob/main/docs/1-einops-basics.ipynb)
- Part 2 notebook at [nbviewer](https://nbviewer.jupyter.org/github/arogozhnikov/einops/blob/main/docs/2-einops-for-deep-learning.ipynb)
- Part 3: EinMix for great MLPs at [nbviewer](https://nbviewer.jupyter.org/github/arogozhnikov/einops/blob/main/docs/3-einmix-layer.ipynb)
- Part 4: einops.pack and einops.unpack [nbviewer](https://nbviewer.jupyter.org/github/arogozhnikov/einops/blob/main/docs/4-pack-and-unpack.ipynb)
- Writing better code with einops and pytorch: [web link](https://einops.rocks/pytorch-examples.html)

docs/pytorch-examples.html:

Writing better code with pytorch+einops
einops package logo
[github],    tutorials [1] and [2]

Writing a better code with pytorch and einops



Rewriting building blocks of deep learning

Now let's get to examples from the real world. These code fragments are taken from official tutorials and popular repositories.

Learn how to improve code and how einops can help you.

Left: as it was, Right: improved version

# start from importing some stuff
import torch
import torch.nn as nn 
import torch.nn.functional as F
import numpy as np
import math

from einops import rearrange, reduce, asnumpy, parse_shape
from einops.layers.torch import Rearrange, Reduce

Simple ConvNet

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)

conv_net_old = Net()
conv_net_new = nn.Sequential(
    nn.Conv2d(1, 10, kernel_size=5),
    nn.MaxPool2d(kernel_size=2),
    nn.ReLU(),
    nn.Conv2d(10, 20, kernel_size=5),
    nn.MaxPool2d(kernel_size=2),
    nn.ReLU(),
    nn.Dropout2d(),
    Rearrange('b c h w -> b (c h w)'),
    nn.Linear(320, 50),
    nn.ReLU(),
    nn.Dropout(),
    nn.Linear(50, 10),
    nn.LogSoftmax(dim=1)
)

Reasons to prefer new implementation:

  • in the original code (to the left), if the input size changes, x.view(-1, 320) silently reinterprets the batch axis (it succeeds whenever the total number of elements happens to be divisible by 320), and we get something senseless after reshaping
    • new code will explicitly raise an error in this case
  • with the new version, dropout's self.training flag is handled automatically and can't be forgotten
  • code is straightforward to read and analyze
  • sequential makes printing / saving / passing trivial, and no custom class code is needed to load the model (which also has a number of benefits)
  • don't need logsoftmax? Now you can use conv_net_new[:-1]. One more reason to prefer nn.Sequential
  • ... and we could also add inplace for ReLU
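The first point can be demonstrated without pytorch. Below is a minimal numpy sketch (the helper names are invented for illustration) of what `x.view(-1, 320)` does versus a shape-checked flatten in the spirit of `Rearrange('b c h w -> b (c h w)')`:

```python
import numpy as np

def flatten_unchecked(x):
    # mimics x.view(-1, 320): the batch axis is silently reinterpreted
    return x.reshape(-1, 320)

def flatten_checked(x):
    # mimics Rearrange('b c h w -> b (c h w)') feeding nn.Linear(320, ...):
    # the per-sample feature count is verified before flattening
    b, c, h, w = x.shape
    if c * h * w != 320:
        raise ValueError(f"expected 320 features per sample, got {c * h * w}")
    return x.reshape(b, c * h * w)

x_ok = np.zeros((16, 20, 4, 4))   # 20 * 4 * 4 == 320
x_bad = np.zeros((16, 20, 8, 8))  # 20 * 8 * 8 == 1280: wrong, but divisible by 320

print(flatten_unchecked(x_bad).shape)  # (64, 320): batch of 16 silently became 64
```

The unchecked version returns garbage with the wrong batch size; the checked version raises immediately.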

Super-resolution

class SuperResolutionNetOld(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNetOld, self).__init__()

        self.relu = nn.ReLU()
        self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))
        self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1))
        self.conv3 = nn.Conv2d(64, 32, (3, 3), (1, 1), (1, 1))
        self.conv4 = nn.Conv2d(32, upscale_factor ** 2, (3, 3), (1, 1), (1, 1))
        self.pixel_shuffle = nn.PixelShuffle(upscale_factor)

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.relu(self.conv2(x))
        x = self.relu(self.conv3(x))
        x = self.pixel_shuffle(self.conv4(x))
        return x
def SuperResolutionNetNew(upscale_factor):
    return nn.Sequential(
        nn.Conv2d(1, 64, kernel_size=5, padding=2),
        nn.ReLU(inplace=True),
        nn.Conv2d(64, 64, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(64, 32, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(32, upscale_factor ** 2, kernel_size=3, padding=1),
        Rearrange('b (h2 w2) h w -> b (h h2) (w w2)', h2=upscale_factor, w2=upscale_factor),
    )

Here is the difference:

  • no need for the special pixel_shuffle instruction (and the result is transferable between frameworks)
  • output doesn't contain a fake axis (and we could do the same for the input)
  • inplace ReLU is used now; for high-resolution pictures this becomes critical and saves a lot of memory
  • and all the benefits of nn.Sequential again
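To see what the Rearrange pattern does here, a numpy sketch (the function name is ours) of the same 'b (h2 w2) h w -> b (h h2) (w w2)' mapping: each of the r² channels lands in its own position of an r×r output block.

```python
import numpy as np

def pixel_shuffle_np(x, r):
    # numpy rendition of Rearrange('b (h2 w2) h w -> b (h h2) (w w2)', h2=r, w2=r)
    b, c, h, w = x.shape
    assert c == r * r, "channel count must be upscale_factor ** 2"
    x = x.reshape(b, r, r, h, w)    # b h2 w2 h w
    x = x.transpose(0, 3, 1, 4, 2)  # b h h2 w w2
    return x.reshape(b, h * r, w * r)

x = np.arange(2 * 4 * 3 * 3).reshape(2, 4, 3, 3)
out = pixel_shuffle_np(x, 2)
print(out.shape)  # (2, 6, 6)
```

Note the output is 3-dimensional: the fake channel axis is gone, exactly as in the new model.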

Restyling Gram matrix for style transfer

Original code is already good - first line shows what kind of input is expected

  • the einsum operation should be read as: for each batch and for each pair of channels, sum over h and w
  • I've also changed the normalization, because that's how the Gram matrix is defined; otherwise we should call it a normalized Gram matrix or similar
def gram_matrix_old(y):
    (b, ch, h, w) = y.size()
    features = y.view(b, ch, w * h)
    features_t = features.transpose(1, 2)
    gram = features.bmm(features_t) / (ch * h * w)
    return gram
def gram_matrix_new(y):
    b, ch, h, w = y.shape
    return torch.einsum('bchw,bdhw->bcd', [y, y]) / (h * w)

It would be great to use just 'b c1 h w,b c2 h w->b c1 c2', but einsum supports only one-letter axes
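The equivalence of the bmm-based and einsum-based formulations is easy to check in numpy (a sketch with invented function names; both variants normalize by h * w here, matching the new version):

```python
import numpy as np

def gram_bmm(y):
    # old style: flatten spatial dims, then batched matrix multiply
    b, ch, h, w = y.shape
    feats = y.reshape(b, ch, h * w)
    return feats @ feats.transpose(0, 2, 1) / (h * w)

def gram_einsum(y):
    # new style: sum over h and w for each pair of channels
    _, _, h, w = y.shape
    return np.einsum('bchw,bdhw->bcd', y, y) / (h * w)

y = np.random.rand(2, 3, 4, 5)
assert np.allclose(gram_bmm(y), gram_einsum(y))
```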

Recurrent model

All we did here was make the shape information explicit, so no deciphering is needed

class RNNModelOld(nn.Module):
    """Container module with an encoder, a recurrent module, and a decoder."""
    def __init__(self, ntoken, ninp, nhid, nlayers, dropout=0.5):
        super(RNNModelOld, self).__init__()
        self.drop = nn.Dropout(dropout)
        self.encoder = nn.Embedding(ntoken, ninp)
        self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout)
        self.decoder = nn.Linear(nhid, ntoken)

    def forward(self, input, hidden):
        emb = self.drop(self.encoder(input))
        output, hidden = self.rnn(emb, hidden)
        output = self.drop(output)
        decoded = self.decoder(output.view(output.size(0)*output.size(1), output.size(2)))
        return decoded.view(output.size(0), output.size(1), decoded.size(1)), hidden
    
class RNNModelNew(nn.Module):
    """Container module with an encoder, a recurrent module, and a decoder."""
    def __init__(self, ntoken, ninp, nhid, nlayers, dropout=0.5):
        super().__init__()
        self.drop = nn.Dropout(p=dropout)
        self.encoder = nn.Embedding(ntoken, ninp)
        self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout)
        self.decoder = nn.Linear(nhid, ntoken)

    def forward(self, input, hidden):
        t, b = input.shape
        emb = self.drop(self.encoder(input))
        output, hidden = self.rnn(emb, hidden)
        output = rearrange(self.drop(output), 't b nhid -> (t b) nhid')
        decoded = rearrange(self.decoder(output), '(t b) token -> t b token', t=t, b=b)
        return decoded, hidden

Channel shuffle (from shufflenet)

def channel_shuffle_old(x, groups):
    batchsize, num_channels, height, width = x.data.size()

    channels_per_group = num_channels // groups
    
    # reshape
    x = x.view(batchsize, groups, 
        channels_per_group, height, width)

    # transpose
    # - contiguous() required if transpose() is used before view().
    #   See https://github.com/pytorch/pytorch/issues/764
    x = torch.transpose(x, 1, 2).contiguous()

    # flatten
    x = x.view(batchsize, -1, height, width)

    return x
def channel_shuffle_new(x, groups):
    return rearrange(x, 'b (c1 c2) h w -> b (c2 c1) h w', c1=groups)

While progress is obvious, this is not the limit. As you'll see below, we don't even need to write these couple of lines.
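A numpy sketch (function name ours) makes the permutation explicit: 'b (c1 c2) h w -> b (c2 c1) h w' with c1=groups sends input channel g * cpg + k to output channel k * groups + g.

```python
import numpy as np

def channel_shuffle_np(x, groups):
    # numpy rendition of rearrange(x, 'b (c1 c2) h w -> b (c2 c1) h w', c1=groups)
    b, c, h, w = x.shape
    cpg = c // groups  # channels per group
    return x.reshape(b, groups, cpg, h, w).transpose(0, 2, 1, 3, 4).reshape(b, c, h, w)

x = np.arange(1 * 6 * 2 * 2).reshape(1, 6, 2, 2)
out = channel_shuffle_np(x, groups=3)

# input channel g * cpg + k ends up at output channel k * groups + g
groups, cpg = 3, 2
for g in range(groups):
    for k in range(cpg):
        assert (out[:, k * groups + g] == x[:, g * cpg + k]).all()
```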

Shufflenet

from collections import OrderedDict

def channel_shuffle(x, groups):
    batchsize, num_channels, height, width = x.data.size()

    channels_per_group = num_channels // groups
    
    # reshape
    x = x.view(batchsize, groups, 
        channels_per_group, height, width)

    # transpose
    # - contiguous() required if transpose() is used before view().
    #   See https://github.com/pytorch/pytorch/issues/764
    x = torch.transpose(x, 1, 2).contiguous()

    # flatten
    x = x.view(batchsize, -1, height, width)

    return x

class ShuffleUnitOld(nn.Module):
    def __init__(self, in_channels, out_channels, groups=3,
                 grouped_conv=True, combine='add'):
        
        super(ShuffleUnitOld, self).__init__()

        self.in_channels = in_channels
        self.out_channels = out_channels
        self.grouped_conv = grouped_conv
        self.combine = combine
        self.groups = groups
        self.bottleneck_channels = self.out_channels // 4

        # define the type of ShuffleUnit
        if self.combine == 'add':
            # ShuffleUnit Figure 2b
            self.depthwise_stride = 1
            self._combine_func = self._add
        elif self.combine == 'concat':
            # ShuffleUnit Figure 2c
            self.depthwise_stride = 2
            self._combine_func = self._concat
            
            # ensure output of concat has the same channels as 
            # original output channels.
            self.out_channels -= self.in_channels
        else:
            raise ValueError("Cannot combine tensors with \"{}\"" \
                             "Only \"add\" and \"concat\" are" \
                             "supported".format(self.combine))

        # Use a 1x1 grouped or non-grouped convolution to reduce input channels
        # to bottleneck channels, as in a ResNet bottleneck module.
        # NOTE: Do not use group convolution for the first conv1x1 in Stage 2.
        self.first_1x1_groups = self.groups if grouped_conv else 1

        self.g_conv_1x1_compress = self._make_grouped_conv1x1(
            self.in_channels,
            self.bottleneck_channels,
            self.first_1x1_groups,
            batch_norm=True,
            relu=True
            )

        # 3x3 depthwise convolution followed by batch normalization
        self.depthwise_conv3x3 = conv3x3(
            self.bottleneck_channels, self.bottleneck_channels,
            stride=self.depthwise_stride, groups=self.bottleneck_channels)
        self.bn_after_depthwise = nn.BatchNorm2d(self.bottleneck_channels)

        # Use 1x1 grouped convolution to expand from 
        # bottleneck_channels to out_channels
        self.g_conv_1x1_expand = self._make_grouped_conv1x1(
            self.bottleneck_channels,
            self.out_channels,
            self.groups,
            batch_norm=True,
            relu=False
            )


    @staticmethod
    def _add(x, out):
        # residual connection
        return x + out


    @staticmethod
    def _concat(x, out):
        # concatenate along channel axis
        return torch.cat((x, out), 1)


    def _make_grouped_conv1x1(self, in_channels, out_channels, groups,
        batch_norm=True, relu=False):

        modules = OrderedDict()
        conv = conv1x1(in_channels, out_channels, groups=groups)
        modules['conv1x1'] = conv

        if batch_norm:
            modules['batch_norm'] = nn.BatchNorm2d(out_channels)
        if relu:
            modules['relu'] = nn.ReLU()
        if len(modules) > 1:
            return nn.Sequential(modules)
        else:
            return conv


    def forward(self, x):
        # save for combining later with output
        residual = x
        if self.combine == 'concat':
            residual = F.avg_pool2d(residual, kernel_size=3, 
                stride=2, padding=1)

        out = self.g_conv_1x1_compress(x)
        out = channel_shuffle(out, self.groups)
        out = self.depthwise_conv3x3(out)
        out = self.bn_after_depthwise(out)
        out = self.g_conv_1x1_expand(out)
        
        out = self._combine_func(residual, out)
        return F.relu(out)
class ShuffleUnitNew(nn.Module):
    def __init__(self, in_channels, out_channels, groups=3, 
                 grouped_conv=True, combine='add'):
        super().__init__()
        first_1x1_groups = groups if grouped_conv else 1
        bottleneck_channels = out_channels // 4
        self.combine = combine
        if combine == 'add':
            # ShuffleUnit Figure 2b
            self.left = Rearrange('...->...') # identity
            depthwise_stride = 1
        else:
            # ShuffleUnit Figure 2c
            self.left = nn.AvgPool2d(kernel_size=3, stride=2, padding=1)
            depthwise_stride = 2
            # ensure output of concat has the same channels as original output channels.
            out_channels -= in_channels
            assert out_channels > 0

        self.right = nn.Sequential(
            # Use a 1x1 grouped or non-grouped convolution to reduce input channels
            # to bottleneck channels, as in a ResNet bottleneck module.
            conv1x1(in_channels, bottleneck_channels, groups=first_1x1_groups),
            nn.BatchNorm2d(bottleneck_channels),
            nn.ReLU(inplace=True),
            # channel shuffle
            Rearrange('b (c1 c2) h w -> b (c2 c1) h w', c1=groups),
            # 3x3 depthwise convolution followed by batch 
            conv3x3(bottleneck_channels, bottleneck_channels,
                    stride=depthwise_stride, groups=bottleneck_channels),
            nn.BatchNorm2d(bottleneck_channels),
            # Use 1x1 grouped convolution to expand from 
            # bottleneck_channels to out_channels
            conv1x1(bottleneck_channels, out_channels, groups=groups),
            nn.BatchNorm2d(out_channels),
        )        
        
    def forward(self, x):
        if self.combine == 'add':
            combined = self.left(x) + self.right(x)
        else:
            combined = torch.cat([self.left(x), self.right(x)], dim=1)
        return F.relu(combined, inplace=True)

Rewriting the code helped to identify:

  • There is no point in reshuffling channels when the first convolution doesn't use groups (indeed, the paper doesn't do this). The result, however, is an equivalent model.
  • It is also strange that the first convolution may not be grouped, while the last convolution is always grouped (and that is different from the paper)

Other comments:

  • An identity layer is emulated here with Rearrange('...->...'); recent versions of pytorch provide nn.Identity for this
  • The last thing left is to get rid of the conv1x1 and conv3x3 helpers in the code - they are no better than the standard nn.Conv2d

Simplifying ResNet

class ResNetOld(nn.Module):

    def __init__(self, block, layers, num_classes=1000):
        self.inplanes = 64
        super(ResNetOld, self).__init__()
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
                               bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.layer1 = self._make_layer(block, 64, layers[0])
        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
        self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
        self.avgpool = nn.AvgPool2d(7, stride=1)

        self.fc = nn.Linear(512 * block.expansion, num_classes)

        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
                m.weight.data.normal_(0, math.sqrt(2. / n))
            elif isinstance(m, nn.BatchNorm2d):
                m.weight.data.fill_(1)
                m.bias.data.zero_()

    def _make_layer(self, block, planes, blocks, stride=1):
        downsample = None
        if stride != 1 or self.inplanes != planes * block.expansion:
            downsample = nn.Sequential(
                nn.Conv2d(self.inplanes, planes * block.expansion,
                          kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(planes * block.expansion),
            )

        layers = []
        layers.append(block(self.inplanes, planes, stride, downsample))
        self.inplanes = planes * block.expansion
        for i in range(1, blocks):
            layers.append(block(self.inplanes, planes))

        return nn.Sequential(*layers)

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.maxpool(x)

        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        x = self.avgpool(x)
        x = x.view(x.size(0), -1)
        x = self.fc(x)

        return x
def make_layer(inplanes, planes, block, n_blocks, stride=1):
    downsample = None
    if stride != 1 or inplanes != planes * block.expansion:
        # output size won't match input, so adjust residual
        downsample = nn.Sequential(
            nn.Conv2d(inplanes, planes * block.expansion,
                      kernel_size=1, stride=stride, bias=False),
            nn.BatchNorm2d(planes * block.expansion),
        )
    return nn.Sequential(
        block(inplanes, planes, stride, downsample),
        *[block(planes * block.expansion, planes) for _ in range(1, n_blocks)]
    )


def ResNetNew(block, layers, num_classes=1000):    
    e = block.expansion
    
    resnet = nn.Sequential(
        Rearrange('b c h w -> b c h w', c=3, h=224, w=224),
        nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False),
        nn.BatchNorm2d(64),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
        make_layer(64,      64,  block, layers[0], stride=1),
        make_layer(64 * e,  128, block, layers[1], stride=2),
        make_layer(128 * e, 256, block, layers[2], stride=2),
        make_layer(256 * e, 512, block, layers[3], stride=2),
        # combined AvgPool and view in one averaging operation
        Reduce('b c h w -> b c', 'mean'),
        nn.Linear(512 * e, num_classes),
    )
    
    # initialization
    for m in resnet.modules():
        if isinstance(m, nn.Conv2d):
            n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
            m.weight.data.normal_(0, math.sqrt(2. / n))
        elif isinstance(m, nn.BatchNorm2d):
            m.weight.data.fill_(1)
            m.bias.data.zero_()
    return resnet

Changes:

  • explicit check for input shape
  • no views and simple sequential structure, output is just nn.Sequential, so can always be saved/passed/etc
  • no need for AvgPool and additional views; this place is much clearer now
  • make_layer doesn't rely on internal state (the old self.inplanes bookkeeping was an error-prone spot)
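The last point deserves a tiny illustration. Here is a hypothetical stateful builder (names invented, not from the source): the channel count each call uses depends on every call made before it, so reordering the calls silently changes the network, while the functional style receives inplanes explicitly.

```python
class StatefulBuilder:
    """Hypothetical mimic of the old self.inplanes bookkeeping."""
    def __init__(self):
        self.inplanes = 64

    def make(self, planes):
        spec = (self.inplanes, planes)  # in_channels depends on call history
        self.inplanes = planes
        return spec

b = StatefulBuilder()
specs = [b.make(p) for p in (64, 128, 256)]
print(specs)  # [(64, 64), (64, 128), (128, 256)]

# the functional style is order-independent: everything is in the arguments
def make_spec(inplanes, planes):
    return (inplanes, planes)
```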

Improving RNN language modelling

class RNNOld(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim, n_layers, bidirectional, dropout):
        super().__init__()
        
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.rnn = nn.LSTM(embedding_dim, hidden_dim, num_layers=n_layers, 
                           bidirectional=bidirectional, dropout=dropout)
        self.fc = nn.Linear(hidden_dim*2, output_dim)
        self.dropout = nn.Dropout(dropout)
        
    def forward(self, x):
        #x = [sent len, batch size]
        
        embedded = self.dropout(self.embedding(x))
        
        #embedded = [sent len, batch size, emb dim]
        
        output, (hidden, cell) = self.rnn(embedded)
        
        #output = [sent len, batch size, hid dim * num directions]
        #hidden = [num layers * num directions, batch size, hid dim]
        #cell = [num layers * num directions, batch size, hid dim]
        
        #concat the final forward (hidden[-2,:,:]) and backward (hidden[-1,:,:]) hidden layers
        #and apply dropout
        
        hidden = self.dropout(torch.cat((hidden[-2,:,:], hidden[-1,:,:]), dim=1))
                
        #hidden = [batch size, hid dim * num directions]
            
        return self.fc(hidden.squeeze(0))
class RNNNew(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim, n_layers, bidirectional, dropout):
        super().__init__()
        
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.rnn = nn.LSTM(embedding_dim, hidden_dim, num_layers=n_layers, 
                           bidirectional=bidirectional, dropout=dropout)
        self.dropout = nn.Dropout(dropout)
        self.directions = 2 if bidirectional else 1
        self.fc = nn.Linear(hidden_dim * self.directions, output_dim)
        
    def forward(self, x):
        #x = [sent len, batch size]        
        embedded = self.dropout(self.embedding(x))
        
        #embedded = [sent len, batch size, emb dim]
        output, (hidden, cell) = self.rnn(embedded)
        
        hidden = rearrange(hidden, '(layer dir) b c -> layer b (dir c)', 
                           dir=self.directions)
        # take the final layer's hidden
        return self.fc(self.dropout(hidden[-1]))
  • original code misbehaves for non-bidirectional models
  • ... and fails when bidirectional = False, and there is only one layer
  • modification of the code shows both how hidden is structured and how it is modified
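The equivalence (and the failure mode) can be verified with a numpy sketch of the hidden-state layout, without torch:

```python
import numpy as np

n_layers, dirs, b, c = 3, 2, 4, 5
hidden = np.random.rand(n_layers * dirs, b, c)  # '(layer dir) b c', as LSTM returns it

# old: concatenate the last two slices -- implicitly assumes dirs == 2
old = np.concatenate([hidden[-2], hidden[-1]], axis=1)

# new: numpy rendition of rearrange(hidden, '(layer dir) b c -> layer b (dir c)', dir=dirs),
# then take the final layer
new = hidden.reshape(n_layers, dirs, b, c).transpose(0, 2, 1, 3).reshape(n_layers, b, dirs * c)

assert np.allclose(old, new[-1])  # identical for a bidirectional model

# with dirs == 1, hidden[-2] would be the *previous layer*, not another direction:
# the old code silently mixes layers instead of directions
```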

Writing FastText faster

class FastTextOld(nn.Module):
    def __init__(self, vocab_size, embedding_dim, output_dim):
        super().__init__()
        
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.fc = nn.Linear(embedding_dim, output_dim)
        
    def forward(self, x):
        
        #x = [sent len, batch size]
        
        embedded = self.embedding(x)
                
        #embedded = [sent len, batch size, emb dim]
        
        embedded = embedded.permute(1, 0, 2)
        
        #embedded = [batch size, sent len, emb dim]
        
        pooled = F.avg_pool2d(embedded, (embedded.shape[1], 1)).squeeze(1) 
        
        #pooled = [batch size, embedding_dim]
                
        return self.fc(pooled)
def FastTextNew(vocab_size, embedding_dim, output_dim):
    return nn.Sequential(
        Rearrange('t b -> t b'),
        nn.Embedding(vocab_size, embedding_dim),
        Reduce('t b c -> b c', 'mean'),
        nn.Linear(embedding_dim, output_dim),
        Rearrange('b c -> b c'),
    )

Some comments on new code:

  • first and last operations do nothing and can be removed
    • but were added to explicitly show expected input and output
  • this also gives you the flexibility of changing the interface by editing a single line. Should you need to accept inputs as (batch, time), you just change first line to Rearrange('b t -> t b'),
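The avg_pool2d trick in the old code is just a mean over the time axis; a quick numpy check (array names ours) of the Reduce('t b c -> b c', 'mean') step:

```python
import numpy as np

emb = np.random.rand(7, 3, 5)  # [sent len, batch size, emb dim]

# old: permute to (b, t, c), then average-pool over the whole sent-len axis
pooled_old = emb.transpose(1, 0, 2).mean(axis=1)

# new: Reduce('t b c -> b c', 'mean') is simply a mean over the time axis
pooled_new = emb.mean(axis=0)

assert np.allclose(pooled_old, pooled_new)
print(pooled_new.shape)  # (3, 5)
```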

CNNs for text classification

class CNNOld(nn.Module):
    def __init__(self, vocab_size, embedding_dim, n_filters, filter_sizes, output_dim, dropout):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.conv_0 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[0],embedding_dim))
        self.conv_1 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[1],embedding_dim))
        self.conv_2 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[2],embedding_dim))
        self.fc = nn.Linear(len(filter_sizes)*n_filters, output_dim)
        self.dropout = nn.Dropout(dropout)
        
    def forward(self, x):
        
        #x = [sent len, batch size]
        
        x = x.permute(1, 0)
                
        #x = [batch size, sent len]
        
        embedded = self.embedding(x)
                
        #embedded = [batch size, sent len, emb dim]
        
        embedded = embedded.unsqueeze(1)
        
        #embedded = [batch size, 1, sent len, emb dim]
        
        conved_0 = F.relu(self.conv_0(embedded).squeeze(3))
        conved_1 = F.relu(self.conv_1(embedded).squeeze(3))
        conved_2 = F.relu(self.conv_2(embedded).squeeze(3))
            
        #conv_n = [batch size, n_filters, sent len - filter_sizes[n]]
        
        pooled_0 = F.max_pool1d(conved_0, conved_0.shape[2]).squeeze(2)
        pooled_1 = F.max_pool1d(conved_1, conved_1.shape[2]).squeeze(2)
        pooled_2 = F.max_pool1d(conved_2, conved_2.shape[2]).squeeze(2)
        
        #pooled_n = [batch size, n_filters]
        
        cat = self.dropout(torch.cat((pooled_0, pooled_1, pooled_2), dim=1))

        #cat = [batch size, n_filters * len(filter_sizes)]
            
        return self.fc(cat)
class CNNNew(nn.Module):
    def __init__(self, vocab_size, embedding_dim, n_filters, filter_sizes, output_dim, dropout):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.convs = nn.ModuleList([
            nn.Conv1d(embedding_dim, n_filters, kernel_size=size) for size in filter_sizes
        ])
        self.fc = nn.Linear(len(filter_sizes) * n_filters, output_dim)
        self.dropout = nn.Dropout(dropout)
        
    def forward(self, x):
        x = rearrange(x, 't b -> t b')
        emb = rearrange(self.embedding(x), 't b c -> b c t')
        pooled = [reduce(conv(emb), 'b c t -> b c', 'max') for conv in self.convs]
        concatenated = rearrange(pooled, 'filter b c -> b (filter c)')
        return self.fc(self.dropout(F.relu(concatenated)))
  • Original code misuses Conv2d, while Conv1d is the right choice
  • Fixed code can work with any number of filter_sizes (and won't fail)
  • First line in new code does nothing, but was added for simplicity
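One subtlety in the rewrite: the old code applies ReLU before max-pooling over time, the new one applies it after pooling. Since ReLU is monotonic, the two orders agree, as a quick numpy check confirms:

```python
import numpy as np

x = np.random.randn(2, 3, 7)  # batch, filters, time

relu_then_max = np.maximum(x, 0).max(axis=2)  # old order
max_then_relu = np.maximum(x.max(axis=2), 0)  # new order

# max is taken over a whole axis and ReLU preserves ordering, so the two commute
assert np.allclose(relu_then_max, max_then_relu)
```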

Highway convolutions

  • Highway convolutions are common in TTS systems. Code below makes splitting a bit more explicit.
  • Splitting policy may eventually turn out to be important if input had previously groups over channel axes (group convolutions or bidirectional LSTMs/GRUs)
  • Same applies to GLU and gated units in general
class HighwayConv1dOld(nn.Conv1d):
    def forward(self, inputs):
        L = super(HighwayConv1dOld, self).forward(inputs)
        H1, H2 = torch.chunk(L, 2, 1)  # chunk at the feature dim
        torch.sigmoid_(H1)
        return H1 * H2 + (1.0 - H1) * inputs
class HighwayConv1dNew(nn.Conv1d):
    def forward(self, inputs):
        L = super().forward(inputs)
        H1, H2 = rearrange(L, 'b (split c) t -> split b c t', split=2)
        torch.sigmoid_(H1)
        return H1 * H2 + (1.0 - H1) * inputs
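The split itself can be checked without torch. A numpy sketch (array names ours) showing that 'b (split c) t -> split b c t' with split=2 yields exactly the two halves torch.chunk(L, 2, 1) would produce:

```python
import numpy as np

b, c, t = 2, 3, 4
L = np.arange(b * 2 * c * t).reshape(b, 2 * c, t)

# torch.chunk(L, 2, 1) analogue: split the channel axis in half
H1_chunk, H2_chunk = np.split(L, 2, axis=1)

# numpy rendition of rearrange(L, 'b (split c) t -> split b c t', split=2)
stacked = L.reshape(b, 2, c, t).transpose(1, 0, 2, 3)
H1_re, H2_re = stacked

assert np.array_equal(H1_chunk, H1_re)
assert np.array_equal(H2_chunk, H2_re)
```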

Tacotron's CBHG module

class CBHG_Old(nn.Module):
    """CBHG module: a recurrent neural network composed of:
        - 1-d convolution banks
        - Highway networks + residual connections
        - Bidirectional gated recurrent units
    """

    def __init__(self, in_dim, K=16, projections=[128, 128]):
        super(CBHG_Old, self).__init__()
        self.in_dim = in_dim
        self.relu = nn.ReLU()
        self.conv1d_banks = nn.ModuleList(
            [BatchNormConv1d(in_dim, in_dim, kernel_size=k, stride=1,
                             padding=k // 2, activation=self.relu)
             for k in range(1, K + 1)])
        self.max_pool1d = nn.MaxPool1d(kernel_size=2, stride=1, padding=1)

        in_sizes = [K * in_dim] + projections[:-1]
        activations = [self.relu] * (len(projections) - 1) + [None]
        self.conv1d_projections = nn.ModuleList(
            [BatchNormConv1d(in_size, out_size, kernel_size=3, stride=1,
                             padding=1, activation=ac)
             for (in_size, out_size, ac) in zip(
                 in_sizes, projections, activations)])

        self.pre_highway = nn.Linear(projections[-1], in_dim, bias=False)
        self.highways = nn.ModuleList(
            [Highway(in_dim, in_dim) for _ in range(4)])

        self.gru = nn.GRU(
            in_dim, in_dim, 1, batch_first=True, bidirectional=True)
def forward_old(self, inputs):
    # (B, T_in, in_dim)
    x = inputs

    # Needed to perform conv1d on time-axis
    # (B, in_dim, T_in)
    if x.size(-1) == self.in_dim:
        x = x.transpose(1, 2)

    T = x.size(-1)

    # (B, in_dim*K, T_in)
    # Concat conv1d bank outputs
    x = torch.cat([conv1d(x)[:, :, :T] for conv1d in self.conv1d_banks], dim=1)
    assert x.size(1) == self.in_dim * len(self.conv1d_banks)
    x = self.max_pool1d(x)[:, :, :T]

    for conv1d in self.conv1d_projections:
        x = conv1d(x)

    # (B, T_in, in_dim)
    # Back to the original shape
    x = x.transpose(1, 2)

    if x.size(-1) != self.in_dim:
        x = self.pre_highway(x)

    # Residual connection
    x += inputs
    for highway in self.highways:
        x = highway(x)

    # (B, T_in, in_dim*2)
    outputs, _ = self.gru(x)

    return outputs
def forward_new(self, inputs, input_lengths=None):
    x = rearrange(inputs, 'b t c -> b c t')
    _, _, T = x.shape
    # Concat conv1d bank outputs
    x = rearrange([conv1d(x)[:, :, :T] for conv1d in self.conv1d_banks], 
                 'bank b c t -> b (bank c) t', c=self.in_dim)
    x = self.max_pool1d(x)[:, :, :T]

    for conv1d in self.conv1d_projections:
        x = conv1d(x)
    x = rearrange(x, 'b c t -> b t c')
    if x.size(-1) != self.in_dim:
        x = self.pre_highway(x)

    # Residual connection
    x += inputs
    for highway in self.highways:
        x = highway(x)

    # (B, T_in, in_dim*2)
    outputs, _ = self.gru(x)

    return outputs    

There is still plenty of room for improvement, but in this example only the forward function was changed

Simple attention

Good news: there is no longer any need to guess the order of dimensions, neither for inputs nor for outputs

class Attention(nn.Module):
    def __init__(self):
        super(Attention, self).__init__()
    
    def forward(self, K, V, Q):
        A = torch.bmm(K.transpose(1,2), Q) / np.sqrt(Q.shape[1])
        A = F.softmax(A, 1)
        R = torch.bmm(V, A)
        return torch.cat((R, Q), dim=1)
def attention(K, V, Q):
    _, n_channels, _ = K.shape
    A = torch.einsum('bct,bcl->btl', [K, Q])
    A = F.softmax(A * n_channels ** (-0.5), 1)
    R = torch.einsum('bct,btl->bcl', [V, A])
    return torch.cat((R, Q), dim=1)
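To convince yourself that the einsum pattern matches the transpose-plus-bmm version, here is a small numpy sketch (illustrative sizes): 'bct,bcl->btl' contracts the channel axis, which is exactly K transposed times Q per batch element.

```python
import numpy as np

K = np.random.rand(2, 8, 5)   # (batch, channels, t) -- made-up sizes
Q = np.random.rand(2, 8, 3)   # (batch, channels, l)

# the named-axis contraction: no transpose calls, no guessing which axis is which
A = np.einsum('bct,bcl->btl', K, Q)

# reference: explicit per-batch matmul of K^T with Q
reference = np.matmul(K.transpose(0, 2, 1), Q)
print(A.shape, np.allclose(A, reference))  # (2, 5, 3) True
```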

Transformer's attention needs more attention

class ScaledDotProductAttention(nn.Module):
    ''' Scaled Dot-Product Attention '''

    def __init__(self, temperature, attn_dropout=0.1):
        super().__init__()
        self.temperature = temperature
        self.dropout = nn.Dropout(attn_dropout)
        self.softmax = nn.Softmax(dim=2)

    def forward(self, q, k, v, mask=None):

        attn = torch.bmm(q, k.transpose(1, 2))
        attn = attn / self.temperature

        if mask is not None:
            attn = attn.masked_fill(mask, -np.inf)

        attn = self.softmax(attn)
        attn = self.dropout(attn)
        output = torch.bmm(attn, v)

        return output, attn



class MultiHeadAttentionOld(nn.Module):
    ''' Multi-Head Attention module '''

    def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):
        super().__init__()

        self.n_head = n_head
        self.d_k = d_k
        self.d_v = d_v

        self.w_qs = nn.Linear(d_model, n_head * d_k)
        self.w_ks = nn.Linear(d_model, n_head * d_k)
        self.w_vs = nn.Linear(d_model, n_head * d_v)
        nn.init.normal_(self.w_qs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))
        nn.init.normal_(self.w_ks.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))
        nn.init.normal_(self.w_vs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_v)))

        self.attention = ScaledDotProductAttention(temperature=np.power(d_k, 0.5))
        self.layer_norm = nn.LayerNorm(d_model)

        self.fc = nn.Linear(n_head * d_v, d_model)
        nn.init.xavier_normal_(self.fc.weight)

        self.dropout = nn.Dropout(dropout)


    def forward(self, q, k, v, mask=None):
        
        d_k, d_v, n_head = self.d_k, self.d_v, self.n_head
        
        sz_b, len_q, _ = q.size()
        sz_b, len_k, _ = k.size()
        sz_b, len_v, _ = v.size()
        
        residual = q
        
        q = self.w_qs(q).view(sz_b, len_q, n_head, d_k)
        k = self.w_ks(k).view(sz_b, len_k, n_head, d_k)
        v = self.w_vs(v).view(sz_b, len_v, n_head, d_v)
        
        q = q.permute(2, 0, 1, 3).contiguous().view(-1, len_q, d_k) # (n*b) x lq x dk
        k = k.permute(2, 0, 1, 3).contiguous().view(-1, len_k, d_k) # (n*b) x lk x dk
        v = v.permute(2, 0, 1, 3).contiguous().view(-1, len_v, d_v) # (n*b) x lv x dv
        
        mask = mask.repeat(n_head, 1, 1) # (n*b) x .. x ..
        output, attn = self.attention(q, k, v, mask=mask)
        
        output = output.view(n_head, sz_b, len_q, d_v)
        output = output.permute(1, 2, 0, 3).contiguous().view(sz_b, len_q, -1) # b x lq x (n*dv)
        
        output = self.dropout(self.fc(output))
        output = self.layer_norm(output + residual)
        
        return output, attn
class MultiHeadAttentionNew(nn.Module):
    def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):
        super().__init__()
        self.n_head = n_head
        
        self.w_qs = nn.Linear(d_model, n_head * d_k)
        self.w_ks = nn.Linear(d_model, n_head * d_k)
        self.w_vs = nn.Linear(d_model, n_head * d_v)
        
        nn.init.normal_(self.w_qs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))
        nn.init.normal_(self.w_ks.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))
        nn.init.normal_(self.w_vs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_v)))
        
        self.fc = nn.Linear(n_head * d_v, d_model)
        nn.init.xavier_normal_(self.fc.weight)
        self.dropout = nn.Dropout(p=dropout)
        self.layer_norm = nn.LayerNorm(d_model)

    def forward(self, q, k, v, mask=None):
        residual = q
        q = rearrange(self.w_qs(q), 'b l (head k) -> head b l k', head=self.n_head)
        k = rearrange(self.w_ks(k), 'b t (head k) -> head b t k', head=self.n_head)
        v = rearrange(self.w_vs(v), 'b t (head v) -> head b t v', head=self.n_head)
        attn = torch.einsum('hblk,hbtk->hblt', [q, k]) / np.sqrt(q.shape[-1])
        if mask is not None:
            attn = attn.masked_fill(mask[None], -np.inf)
        attn = torch.softmax(attn, dim=3)
        output = torch.einsum('hblt,hbtv->hblv', [attn, v])
        output = rearrange(output, 'head b l v -> b l (head v)')
        output = self.dropout(self.fc(output))
        output = self.layer_norm(output + residual)
        return output, attn
    

Benefits of new implementation

  • we have one module, not two
  • now the code does not fail for a None mask
  • the number of caveats in the original code that we removed is huge. Try erasing the comments and deciphering what happens there

Self-attention GANs

SAGANs are currently SotA for image generation, and can be simplified using the same tricks.

class Self_Attn_Old(nn.Module):
    """ Self attention Layer"""
    def __init__(self,in_dim,activation):
        super(Self_Attn_Old,self).__init__()
        self.chanel_in = in_dim
        self.activation = activation
        
        self.query_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim//8 , kernel_size= 1)
        self.key_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim//8 , kernel_size= 1)
        self.value_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim , kernel_size= 1)
        self.gamma = nn.Parameter(torch.zeros(1))
        
        self.softmax  = nn.Softmax(dim=-1) #

    def forward(self, x):
        """
            inputs :
                x : input feature maps( B X C X W X H)
            returns :
                out : self attention value + input feature 
                attention: B X N X N (N is Width*Height)
        """
        
        m_batchsize,C,width ,height = x.size()
        proj_query  = self.query_conv(x).view(m_batchsize,-1,width*height).permute(0,2,1) # B X CX(N)
        proj_key =  self.key_conv(x).view(m_batchsize,-1,width*height) # B X C x (*W*H)
        energy =  torch.bmm(proj_query,proj_key) # transpose check
        attention = self.softmax(energy) # BX (N) X (N) 
        proj_value = self.value_conv(x).view(m_batchsize,-1,width*height) # B X C X N

        out = torch.bmm(proj_value,attention.permute(0,2,1) )
        out = out.view(m_batchsize,C,width,height)
        
        out = self.gamma*out + x
        return out,attention
class Self_Attn_New(nn.Module):
    """ Self attention Layer"""
    def __init__(self, in_dim):
        super().__init__()
        self.query_conv = nn.Conv2d(in_dim, out_channels=in_dim//8, kernel_size=1)
        self.key_conv = nn.Conv2d(in_dim, out_channels=in_dim//8, kernel_size=1)
        self.value_conv = nn.Conv2d(in_dim, out_channels=in_dim, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros([1]))

    def forward(self, x):
        proj_query = rearrange(self.query_conv(x), 'b c h w -> b (h w) c')
        proj_key = rearrange(self.key_conv(x), 'b c h w -> b c (h w)')
        proj_value = rearrange(self.value_conv(x), 'b c h w -> b (h w) c')
        energy = torch.bmm(proj_query, proj_key)
        attention = F.softmax(energy, dim=2)
        out = torch.bmm(attention, proj_value)
        out = x + self.gamma * rearrange(out, 'b (h w) c -> b c h w',
                                         **parse_shape(x, 'b c h w'))
        return out, attention

Improving time sequence prediction

While this example was considered to be simplistic, I had to analyze the surrounding code to understand what kind of input was expected. You can try it yourself.

Additionally, the new code works with any dtype, not only double, and it supports running on GPU.

class SequencePredictionOld(nn.Module):
    def __init__(self):
        super(SequencePredictionOld, self).__init__()
        self.lstm1 = nn.LSTMCell(1, 51)
        self.lstm2 = nn.LSTMCell(51, 51)
        self.linear = nn.Linear(51, 1)

    def forward(self, input, future = 0):
        outputs = []
        h_t = torch.zeros(input.size(0), 51, dtype=torch.double)
        c_t = torch.zeros(input.size(0), 51, dtype=torch.double)
        h_t2 = torch.zeros(input.size(0), 51, dtype=torch.double)
        c_t2 = torch.zeros(input.size(0), 51, dtype=torch.double)

        for i, input_t in enumerate(input.chunk(input.size(1), dim=1)):
            h_t, c_t = self.lstm1(input_t, (h_t, c_t))
            h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))
            output = self.linear(h_t2)
            outputs += [output]
            
        for i in range(future):# if we should predict the future
            h_t, c_t = self.lstm1(output, (h_t, c_t))
            h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))
            output = self.linear(h_t2)
            outputs += [output]
        outputs = torch.stack(outputs, 1).squeeze(2)
        return outputs
class SequencePredictionNew(nn.Module):
    def __init__(self):
        super(SequencePredictionNew, self).__init__()
        self.lstm1 = nn.LSTMCell(1, 51)
        self.lstm2 = nn.LSTMCell(51, 51)
        self.linear = nn.Linear(51, 1)

    def forward(self, input, future=0):
        b, t = input.shape
        h_t, c_t, h_t2, c_t2 = torch.zeros(4, b, 51, dtype=self.linear.weight.dtype, 
                                           device=self.linear.weight.device)

        outputs = []
        for input_t in rearrange(input, 'b t -> t b ()'):
            h_t, c_t = self.lstm1(input_t, (h_t, c_t))
            h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))
            output = self.linear(h_t2)
            outputs += [output]
            
        for i in range(future): # if we should predict the future
            h_t, c_t = self.lstm1(output, (h_t, c_t))
            h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))
            output = self.linear(h_t2)
            outputs += [output]
        return rearrange(outputs, 't b () -> b t')

Transforming spatial transformer network (STN)

class SpacialTransformOld(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        # Spatial transformer localization-network
        self.localization = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=7),
            nn.MaxPool2d(2, stride=2),
            nn.ReLU(True),
            nn.Conv2d(8, 10, kernel_size=5),
            nn.MaxPool2d(2, stride=2),
            nn.ReLU(True)
        )

        # Regressor for the 3 * 2 affine matrix
        self.fc_loc = nn.Sequential(
            nn.Linear(10 * 3 * 3, 32),
            nn.ReLU(True),
            nn.Linear(32, 3 * 2)
        )

        # Initialize the weights/bias with identity transformation
        self.fc_loc[2].weight.data.zero_()
        self.fc_loc[2].bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    # Spatial transformer network forward function
    def stn(self, x):
        xs = self.localization(x)
        xs = xs.view(-1, 10 * 3 * 3)
        theta = self.fc_loc(xs)
        theta = theta.view(-1, 2, 3)

        grid = F.affine_grid(theta, x.size())
        x = F.grid_sample(x, grid)

        return x
class SpacialTransformNew(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # Spatial transformer localization-network
        linear = nn.Linear(32, 3 * 2)
        # Initialize the weights/bias with identity transformation
        linear.weight.data.zero_()
        linear.bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))
        
        self.compute_theta = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=7),
            nn.MaxPool2d(2, stride=2),
            nn.ReLU(True),
            nn.Conv2d(8, 10, kernel_size=5),
            nn.MaxPool2d(2, stride=2),
            nn.ReLU(True),
            Rearrange('b c h w -> b (c h w)', h=3, w=3),
            nn.Linear(10 * 3 * 3, 32),
            nn.ReLU(True),
            linear,
            Rearrange('b (row col) -> b row col', row=2, col=3),
        )

    # Spatial transformer network forward function
    def stn(self, x):
        grid = F.affine_grid(self.compute_theta(x), x.size())
        return F.grid_sample(x, grid)
  • the new code will give reasonable errors when the passed image size differs from the expected one
  • with the old code, if the batch size is divisible by 18, whatever you input, it will fail no sooner than affine_grid.

Improving GLOW

That's a good old depth-to-space written manually!

Since GLOW is invertible, it frequently relies on rearrange-like operations.

def unsqueeze2d_old(input, factor=2):
    assert factor >= 1 and isinstance(factor, int)
    factor2 = factor ** 2
    if factor == 1:
        return input
    size = input.size()
    B = size[0]
    C = size[1]
    H = size[2]
    W = size[3]
    assert C % (factor2) == 0, "{}".format(C)
    x = input.view(B, C // factor2, factor, factor, H, W)
    x = x.permute(0, 1, 4, 2, 5, 3).contiguous()
    x = x.view(B, C // (factor2), H * factor, W * factor)
    return x

def squeeze2d_old(input, factor=2):
    assert factor >= 1 and isinstance(factor, int)
    if factor == 1:
        return input
    size = input.size()
    B = size[0]
    C = size[1]
    H = size[2]
    W = size[3]
    assert H % factor == 0 and W % factor == 0, "{}".format((H, W))
    x = input.view(B, C, H // factor, factor, W // factor, factor)
    x = x.permute(0, 1, 3, 5, 2, 4).contiguous()
    x = x.view(B, C * factor * factor, H // factor, W // factor)
    return x
def unsqueeze2d_new(input, factor=2):
    return rearrange(input, 'b (c h2 w2) h w -> b c (h h2) (w w2)', h2=factor, w2=factor)

def squeeze2d_new(input, factor=2):
    return rearrange(input, 'b c (h h2) (w w2) -> b (c h2 w2) h w', h2=factor, w2=factor)
  • the term squeeze isn't very helpful: which dimension is squeezed? There is torch.squeeze, but it's very different.
  • in fact, we could skip creating these functions completely - it is a single call to einops anyway

Detecting problems in YOLO detection

def YOLO_prediction_old(input, num_classes, num_anchors, anchors, stride_h, stride_w):
    bs = input.size(0)
    in_h = input.size(2)
    in_w = input.size(3)
    scaled_anchors = [(a_w / stride_w, a_h / stride_h) for a_w, a_h in anchors]

    prediction = input.view(bs, num_anchors,
                            5 + num_classes, in_h, in_w).permute(0, 1, 3, 4, 2).contiguous()
    # Get outputs
    x = torch.sigmoid(prediction[..., 0])  # Center x
    y = torch.sigmoid(prediction[..., 1])  # Center y
    w = prediction[..., 2]  # Width
    h = prediction[..., 3]  # Height
    conf = torch.sigmoid(prediction[..., 4])  # Conf
    pred_cls = torch.sigmoid(prediction[..., 5:])  # Cls pred.

    FloatTensor = torch.cuda.FloatTensor if x.is_cuda else torch.FloatTensor
    LongTensor = torch.cuda.LongTensor if x.is_cuda else torch.LongTensor
    # Calculate offsets for each grid
    grid_x = torch.linspace(0, in_w - 1, in_w).repeat(in_w, 1).repeat(
        bs * num_anchors, 1, 1).view(x.shape).type(FloatTensor)
    grid_y = torch.linspace(0, in_h - 1, in_h).repeat(in_h, 1).t().repeat(
        bs * num_anchors, 1, 1).view(y.shape).type(FloatTensor)
    # Calculate anchor w, h
    anchor_w = FloatTensor(scaled_anchors).index_select(1, LongTensor([0]))
    anchor_h = FloatTensor(scaled_anchors).index_select(1, LongTensor([1]))
    anchor_w = anchor_w.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(w.shape)
    anchor_h = anchor_h.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(h.shape)
    # Add offset and scale with anchors
    pred_boxes = FloatTensor(prediction[..., :4].shape)
    pred_boxes[..., 0] = x.data + grid_x
    pred_boxes[..., 1] = y.data + grid_y
    pred_boxes[..., 2] = torch.exp(w.data) * anchor_w
    pred_boxes[..., 3] = torch.exp(h.data) * anchor_h
    # Results
    _scale = torch.Tensor([stride_w, stride_h] * 2).type(FloatTensor)
    output = torch.cat((pred_boxes.view(bs, -1, 4) * _scale,
                        conf.view(bs, -1, 1), pred_cls.view(bs, -1, num_classes)), -1)
    return output
def YOLO_prediction_new(input, num_classes, num_anchors, anchors, stride_h, stride_w):
    raw_predictions = rearrange(input, 'b (anchor prediction) h w -> prediction b anchor h w', 
                                anchor=num_anchors, prediction=5 + num_classes)
    anchors = torch.FloatTensor(anchors).to(input.device)
    anchor_sizes = rearrange(anchors, 'anchor dim -> dim () anchor () ()')

    _, _, _, in_h, in_w = raw_predictions.shape
    grid_h = rearrange(torch.arange(in_h).float(), 'h -> () () h ()').to(input.device)
    grid_w = rearrange(torch.arange(in_w).float(), 'w -> () () () w').to(input.device)

    predicted_bboxes = torch.zeros_like(raw_predictions)
    predicted_bboxes[0] = (raw_predictions[0].sigmoid() + grid_w) * stride_w  # center x
    predicted_bboxes[1] = (raw_predictions[1].sigmoid() + grid_h) * stride_h  # center y
    predicted_bboxes[2:4] = (raw_predictions[2:4].exp()) * anchor_sizes  # bbox width and height
    predicted_bboxes[4] = raw_predictions[4].sigmoid()  # confidence
    predicted_bboxes[5:] = raw_predictions[5:].sigmoid()  # class predictions
    # merging all predicted bboxes for each image
    return rearrange(predicted_bboxes, 'prediction b anchor h w -> b (anchor h w) prediction')

We changed and fixed a lot:

  • new code won't fail if input is not on the first GPU
  • old code has wrong grid_x and grid_y for non-square images
  • new code doesn't use replication when broadcasting is sufficient
  • the old code strangely sometimes takes .data, but this has no real effect, as some branches preserve the gradient till the end
    • if gradients are not needed, torch.no_grad should be used, so taking .data is redundant

Simpler output for a bunch of pictures

Next time you need to output images from your generative models, you can use this trick

device = 'cpu'
plt.imshow(np.transpose(vutils.make_grid(fake_batch.to(device)[:64], padding=2, normalize=True).cpu(),(1,2,0)))
padded = F.pad(fake_batch[:64], [1, 1, 1, 1])
plt.imshow(rearrange(padded, '(b1 b2) c h w -> (b1 h) (b2 w) c', b1=8).cpu())

Instead of conclusion

Better code is a vague term; to be specific, code is expected to be:

  • reliable: does what is expected and does not fail silently; fails explicitly for wrong inputs
  • maintainable and modifiable
  • reusable: understanding and modifying code should be easier than writing from scratch
  • fast: in my measurements, proposed versions have speed similar to the original code
  • readability counts, as a means to achieve the previous goals

The provided examples show how to improve these criteria for deep learning code. And einops helps a lot.

Links

  • pytorch and einops
  • a significant part of the code was taken from official examples and tutorials. All code fragments are used for educational purposes.
  • (references for other code are given in source of this html)
  • einops has a tutorial for a more gentle introduction
arogozhnikov-einops-ad2c8d6/docs/resources/000077500000000000000000000000001475201674600212005ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs/resources/einops_logo_350x350.png000066400000000000000000001024741475201674600252420ustar00rootroot00000000000000‰PNG  IHDR^^Í+w IDATx^ì½ ”$×užùçžU•µï[oÕ ô  èn\°pm™ )7Ó#K¢(Qc[”Æcid[çHçXmÉ"5©…mk8Mj(’CqHI t7zïê¥z©}¯Ê}™ó¿¨—ù22"3"·ª®ŽçS†ØË‹‘_Þ¸ïÞÿº2™LÎp,àXÀ±€cºYÀ倷n¶vNäXÀ±€ca¼Îƒ°nXZMb9’DcЃ–F/lŠY0œÀPC V •\´áJ¬çì»ÞpÀ»Þw`œŸá„…•¤X4[!!Üò¢)è„#Öã8ç´g¼öìål­X€ ]XI`5šÚ0vñyÝ"+¢=äEÐïÙ0ór&âX@µ€^çy°e¦ƒ1”0¿’¬jüÖÖ$,nì@Ø¢¡œÍên¼u7ùí{©…˜X0»‡áÛñ®mÞ9;àݼ÷¶&W§„·Ëô0}DMNX¥ƒ¼=­~‘’æTÈUɨÎaʶ€Þ²M·ywäb=[B¶½9€¶&ÆKÝ|;@¸)èAg³\9( µ°Êê¹8ü'Gxó>ÊöÊðnØ[S߉1vK/Ö,÷–™„pGÈ ¿¯Â2³a)œ¨y*Y)ËÈ,‡î¶üÞœþC2•ÁÌb3K…á'QʪÎçÕ´€ÞjZó6<–ÝR^^"ÁÖÙ™–~Hñ›zC˜sa•[g‹//œ@Ï|r!Že‹Õs„oÃù6›²ÞÛì†UkºÕʽ%¤º¶&¼žBu±z@˜áž¿½Ù—N ø'æ+«žs \­'Î9Žj¼wØó )ƒ1~kA©Æ¦m¬B¸Z…Ò»UóuÓ™ ¦æ™îVýê9Í£ö¢-äÏ aØ4“³¹cG÷Nxdîm=Jy¥=EA›_,\é=aO^Óâµ+ž#Ãm!_üâÉ´ˆO³ £ƒ‹œµ#Ô8r=Îíœãö·€ãñÞþ÷Ðô Û¤2Xµ<ÌrMEtÐðÀ­;· a‚ŽÙ úpB8–ęغ¦¶9.÷ɸs÷sÀ» ïýF,å•f&¤zÛ­C˜å¿Œ!« 'Ì-'0½[÷ ýãã@x~¡jpIxk`Ôõ8d±Îë1+ç$¤ú:h¤'ìÊ_˜ãõPáL}g8ázñ·ÃpÄÜo‡»´>stÀ»>v¯êY )-¾Yý¥ªN´ÈÁØòg°+¯Aß56ž^¬Üd­®— ü‘qªæjeáÛë¸xo¯ûUr¶õHß*9 ˆLˆf-L…«ËtN°è×V)UuS§T¹ªæÜ4sÀ{›Þʹ¥(FoÌãž]=øŒå%¤Ö{qÍÈÄF¹·ÜŽùÅ“óq„cšÔd_{@È<ê«å6:„bÓ«1 )p›>pδ«j¼U5gý6zcú•WÅ ìì‘.Ü»«~o$HåÞjÚ ,WŽ"‘̘q£CبT™”'—€cWþ–¼ÿþú=#Ι6®ðnÜ{Stf*xÕ 7"„eî­¾”7™Ì`Z,–Ù—šìj@sЯ¢Å@;È™ÙåډݣpI< \ž^º̬h{ u8à½M¿nUŸ¶Þª›´>4¯zö£ûú°¤ûGº 'UkOØ(÷–‰'Ò¸1͆*±˜ÏëBO[ÐÂÔ¡XZM%²ZTê)Ÿ-F€³Àkc@X÷{‷’;½¹öuÀ»î'Áa§Ê xå¥5¼"Q ÂóËÕà úþg '¬D’¸5W<œPÉí¨„õá’tFój]Λ_AµÁËx¸Óc®’'fýöuÀ»~¶ IÌI•e³v’ïí€W½D a.ÊíÝÞixõåxŠ 'Pà³%_Ç@ä/S¬&VWKK·4z R¸Ê¹>#å3Æo¯ÎhñÛ[‹¥/¯àÕwrvº-—¶ûFÜÂoïJ)Ý[9ñšÞâ+úF¹Ÿ·¦Wð‡_|¹¢ÙKÝ7€íƒ­eAØ,]ŠáJ1.®®± 4º[ý5xàÒ娕s7 —¬Ds“põá„b7¤ðZ)ÿv”Ô*ú:Ôug¼u2·ÞS±sZ£ ¨p4ó׿qmb Wo-ŠÿV2:[‚Ø¿³ „pWSQÏ.'DX„"8ê«.à švBí •\#÷µ á¥&•vñ '0~+à ‰2+—ÞrË¿Wú¤Ôv¼µµ¯',¬$«&T#!ìug06±œ}<‘ª:„Ü?€¾Nc«fceÙÂj 7g"5¶fu_ ÂòL 'ÜZвÆæ*;¿UðV»“³áÊî[-övÀ[ «´µÒ½å”éñÎ.„ÁR[}8¢Þ.õº^#Wí°ÓpwPxÃrDÀ¥iàGí…* 5ð­ˆÂ?µìÜáh Wí±©è@x+2_þÎÕöTŠMà½rK /[`!„/ŒÍáâõÌò]¹‚ÁpÄüÈCEp;AبºŒwmøöëÕ®4ض.àgš†Ì;^–¿¨à~ÚYÌ­à4ήpÀ[…ÇÂÊÂGN“w¼êÅ 
¼‰‹xp%þÔo<&N÷µ\DGKP”,·…Œë`7RLi£bÕe!?ÐÚvÕVP«Vu™Uð2fKà2¬ G1›öµkÝ–ƒ®ìâ÷«µH‘Ýû"··ÓÉÙ‰—ge¼ŠÝìT—­7„Kw°3(*¯T/ÝNž¨ÙjßMßáëÏâ…“7ÁtµJF% jzo¬Òê²Rà¥g»»/?œ mjEl¾-äÆÁmÖ}Vb×rö5+ÿfš›VXü­Î°u«;à]ëzPIuÙz@ؼ|íêj”ÔQiö…ú…:x—&1Éðj$k‹¸6¾$þw%¶"Þ#wض/o¬Vu™xNñ[5œPŽMe¨öËv[7úäñ)gYÜ^³NÎ + LΗ×ÉYt ùLµF¬#jsnyGƒ·Õe­M>tµúô1&œÿÐT3¯U/ÏÙÛæ/'PõŒM0Ky*víûww‹/ÔÜb‰Dþ }=!Ì9W»ºLïØ¬\þ©áí™)/¦¯‚Wµ9¸{Úý†Ý–kµ˜k”ÏÌ9Õ¢“³“Qø »ãÀ+cqÌ¥dûðZŽb®4¯•àÇâhiÊW/ãµ–y¢2Ô@7•J™*¨I_Ór}ËÒ~üðVô´7ŠÃ̇ӷ€WÇ€ #yÓ’à¥&T娖MÍÀ«N‚?jlüÙèw›.æV"ìnÖ\”žõÔB¬æ¶aínß1à]ê2õ U-Ët0·+#^ïåÐkû– ºRû©àe¨A³/Tµt#íîÅ{Û…¹ˆ_>Vj–ö>g‹eÜ¥5£œ]ÃZãMU/ÙÞQ ·¶ÞZAØ(ŸYëu—¨ú[‘U;Ég†±ú;mlzð–+«WËÁ(4 ÏgÖ˜¯£C]Ál8A†j.)vífàU÷©„k^Æo¹X&à m@Ð\šLàâ­ê†hh»àUmÚÑLOØ žpþ’9CKú°¡ÖÝÈË¿å[ßôB3eôº«Å÷âNsß”àÝ(ÕeVÒRsù%c±ƒ*†Î/Ú­™U;7Sóp‰þ¬€×„)à3>³ZÒTÕ/ÓÀöÍŠÄmÚèc5ÇÔ²æñV{T^=„;a¿»`ŠR¤ˆÏ‹¾¹(?£8ýrxãVÏÝ)Þtà½9­©¬^µ¿ŒêñÌ ¬nÃ룗[*¬Vó´ ^9õ 5ÐÕ€† WdA,®ÄñÓ×oáõÑ™¢S®xRxôn­àñá•X×&c˜YJ@z¼2ÔPmV ¼ªMÛ›hkbñK!„¹»ù±áçÄ|õ=øjÛG¼Í áMÞš”nå!5«.S÷•×79Á‰K*s[™”n›rÁëv»ÐÖÀÎþÚ[|YžbîO¿tU¤¤Õ/½ÝwÞ›¯£ -¸&ŽÆ‘N§³1Þ2ÌSt—jƒW _¸ÑÙì rf£V™Õ¶ÑñDZZ“¶à¸™Æ¦ïÒj ËQ ö4ç½’ß..V]¶I‰Ø^sЯRžI÷ÒEQ~ZÉ*·Ý‡Ù.x½†{š0ÔÝ€æÆÜ—'™Êˆ<àÏ|ù¸¥üßj—×ûžƒZ,wf~ƒ 4Ý þç5µÀÕ™WÓU åT¼F¹·²“óôbÜð™á5Þ.Ökmpî|C៙†Ýgy=·ßàU;î¾åðܳ³[úZòìº!l·ºLm_O&³BèõüBY/=•ឆa¦hͦpúF÷hõõ“–žÿj‚÷Ý÷i§dÙóÉ%ÞÝ…·<0ˆ=[š…OiédÚ¢U·æ5³p£’Q)x‹ur^ ·^*VÜSlQ®’ë,w_Yª¬MÏ­/_NßdÊ_¹çØ(ûm:ðJÃRœû͇°wGz;´üO9ÖóW¿ZÕe‰d«áx'\k/à ­M~ìè¡«ÍŸ}ûˆ'S×xöL7æRP{U ^z?v:ýÒ>*x¥¦0ÿ}{·èÅþ]}ìns^Õg†žðäbù.¼Õêä¼–V@g.áÜÔðýsÀ¤…rÀkÅ¢uÚFõxN¹ \íê2uqMzÂmMÆ j,ý”*d•Þ#ð2œÐ×Ñ€-=MhSbq aà¥Ñ8~t>¥H®Z¥àe¬– e¬áëçøbî¯Ô5šW]\ƒÛ‡÷÷à‘ûú1ÐÕ˜afŒÏ'1½”ÄJĺ'l¼F¹·¼6æWÚÉy#@ØHùŒ÷òüðÂ(> ï¤ÞROw?/^u*õ†0à µ¨.3Ëj¨õ*¿r-¡îFlémÌJJÒS™\Ìàû§c8v9W`¡ÞƒJÀ»šð È,ÎgÂVÀ«¦“µ†rêiÊ[G°a+à5+å­e'g>3½¢d™¢Jù_\™žV­Žfá’å(ðêuàØÕâàpÀ[G°–:•ðê!üއ·aÿŽn45䯚jJeåyвº,Ôÿ W³ÊJ:Y- LðòÕ×ãÊM ™_̘虛I<{&Ž+ÓÅËì‚7ÔèÇ?}ÃvÙÓ‡DÚ-Ú­sÈâÎ+ßüS‡„ðè4 öù´ ^õ˜„ðG†pdOú:ó Ž¥01ŸÄÔš'¬n‹×HKX„iPï™eàõÕª°ÔÏÕ,\2±üä peÚÚÕ9àµf§ºlU.xÕÉmémÆÃ÷BØjç\}u™<~-ªË¬€W½¾J!Ìø-+ îé©=áX©TKà•« <{:žN(vã­‚·¿« ûGº°µ¯^WH㜋c~¥°J«XÞ'ãÀ GP‡á±»µÙé5…íæñrNï|Ã\úê2¾žÕJŒÇ.xó¼¶" jzñ ¨3v;ØM¹I/šýbuÿælOŠâ•Ë Ä’ö–ûK÷®-íØ·£ l\¶6‚>x½>L-&³o±G°„»›5e³Ïã ^<=™=Œ]ðªçgŒûŸ<4Œ{GÚ LÉÑ™¥$à ¬éüÀ(]JÓN ¢µoíÙ´Ö_G;6 
—D⚸ÑóËŸ­ÞòmWõ=« ^#ßwWoAóF†#|W^\¬^b<•€× „[š4Àò5ß¿–?ÌpÂr—®ÏŸÖôxËFàåyîn®_¾•º·^¯Gx¼‰”ËxÕyé!Ü·Ö9‚"C,½>7¶ˆS£ p§#"—·ÒÊ5Bø]oÚŠýÛ á`v*n7J&áóz ´(ßY¯pB9÷,ïG¦= tvÕ®&òsÞ¯¦OžÂp‰›Z:X¥Ão¥¬âþµ¯:ͽÛ;ñè¡!léo5ì Ëb†‰ùhU“î‹™‰%·,™œ«Ž° 5vô7¡§=˜'IHïðÄXÿøjÃýíbJR½œÛ¨‚÷ž»(`»kX;.‡‘Ö† 5”^uŽŒ5ÞÕïE{È‹‰ÙÕ¼•Õ.ߜǫ£K¸>]˜ê¶¾ž8:„C»ÚÑ+ìšo1–rŒµ–(-ç>YÙ‡edf‹:ø£ÍØ:¤Õt0+Ç,¶ÞJ-XÅýë^=„?úî{ ¯¢R­ÝrL%1³Úv‡Yuóo_Mâ›Ç£Ùp‚ÕŠbs xá>„|y?Åto«^ÎKz¼7§WÄ"!!ÌFŸôvÙ’1ÈÓœÇùëKeÙT^?ápi»½}OÁBîF,î±ò …K¸ߌ¾ðcûñ[+çtÀkÅJuÚf=ÀËK“w?ÿµ“¦žp­ Lù¿h<)b®ê`œ0·a³ê²X"…h,s7øÊ+ù7±ð¼.Ü·Ívø°«Wó¨Õ`%Û£àU½vmÕÝ‹Îf/<OžÀúäœ={=K«~ [}ü÷açP;^»0 Þ«]Ã)¥ÔÂŪ˘wK‰ÍXøì³µùò;à­]Ë:êzƒ÷7>•{ÊŽîïÇà t#xa²uS+2ÆË=Æc›ùgÂVªËÞ‘ÂP;p}UoKƒGwú±£×Ê"4øµ½Ìö8vnÖRk¢Zƒ—÷Cz¼p¹ÁTÀ¶P.¾-ï!üÒÙá 3ž¯ŽPbñLíÍÆWîÕ˜Ö5ƒ@þý_ÒÀûÝŸ^Áw~¢%­²Äý½½âßÖt#øï ÂVªËÚá€×Êw{Ó– [¹øJ·‘¯ ^õ˜R7¢â=F‹kÂTk$E”Aï*às‰ãPc®¦Qu™ô**ïá?öúÐÚ˜_z‹Çáwi­—èñZõ¯c0ö»×š6æškÊùNÌEp}j'/L"àM‰0…ônY„±ÒkE{L'3¯zý„ðƒú±} A.KBxv9Žh¼Æ=«” Ù©.sÀkåIÖ¶qÀkÝV[–o-!\*«Áïu /¸£%(u(<Â8&½°ñ…4ž:ÁÎE ®«ð2œ°£×‹wÐÜ¿àÂÂØTä =^Ž^Õ8„pwk­MZa ÍÉWoÙcîüØf–S`%–~X¯Ü‡™÷ŒtáОÞ×Zk¤Üê2¼Öaâ€×º­*¯„+QP+^ d3ÿ¶¯3ŸÇ /¿Mn7®Ï¹DYæ­EmF,E]'D×`‚‘£ð2œpßVö ùE8AzÚú•|v'ù“B°þÊ­íñÊëZ ·.¯‰^(½v¦Ûñ¿‰´æÝž¿¾ˆ cKá;àUŸ á7Ü7h(ÞC¡ójHƒVZ]æ€×:LðZ·UÕÀ+T‰n„xeuÙ¶Þ:[s‹njuSÃè 3¡¾Êª~ë®Ú’–b¼C±`¶£'ÿµ˜iR²ºLŸ|/c¼<çë£óXŽh.–Rµ^¡¦K±R/ès‰-}x­œ\Ë»¥=ƒ^fn«ZD‹§|™'|áú’è¹V*ÔPêq$„ïßÝ‹#ûŒÔÊpµªËð–º{¹ÏðZ·UÕÁ«Ð.„Uðê«Ëx\†æV2xæLÜ´ºŒáˆÎÖ š›yê[ôxÛ2x},ŽÏ}/Œ8–kCf5øÆî~7º›óã·Ó 1±hf6áánvtka5ÔP,¤žà¥;ÛM»4z…÷.à |#`ÚÜrBd=(fÉ^®Y£OBxôÖÞr¨C=Íy‹kå>‚,:¹wïíGO{¡‚øŒ`ò|µ¨.sÀkýN:àµn«š‚×.„Çç"¸9µ‚-½!ôvóªË(RóýÓqœ»e={"ð =þðƒî¼¬†d*•pÑhhrŒÔ¦V‡¬.[ZÓ/½É…¥HA”Ñj~=ÀÛò`¡ŸG,”ÉÅ2©µ±M ØZ-vÐC˜°»{K š‚>œ»6‹ï¼p×&Š·<²úxªîïlß~äPµFøo¢cqNàˆÿVê2¼Vï–³¸fÝR[ÚY\+÷Dô„Ô›e›æÜòÿæ+/;;|ýëb5fsú¹7xp`8€…X_;á%[Ùû­D˜^Œ‰ ú–⥮U 5H·T·e·Ç¦Æ`Y%Ãúù¨%Ã,{¦ÖFW‹- ¹òo­›.‰ÄÓÂÃe¼ºÜ!Zõ4ûpxO‡ïô|SóÑöˆðµÚmÙÊù áG áÀHºÛó ¬î_íê2¼VãñZ·UÝ<^³)© jLK±1ãr Ïž‰áù³qÛb5fç‘‹ká¤sQ?¶õøÅ✌oHŒq&ÓüY5ggWÝßÂôx{:š0»”ÂùqíºÜ!Á›ˆÇEÎ-3ïd8AjmÌ­ðHe*®:Gžcï¶ôw4"eZX~Hf%ÇÕ[‹¸x}³ªŽe¹ Mkã­‡·ˆª¹ÎÖF±(È‘×ãcÕ­.sÀkýF9àµn«u¯:_ùÙƒðø›t_¼d Õ_æu1ìcû1ÜåC2¤a|s1œ02iá§ÁXf©Q ¼êþƒ] "÷¸³-'’Sn^+å1ïÝ@¨Áƒ[SK¢BC —0œÀréj5Ô@—pmiô¡™ s:-áZ@øß~ø0†zBøò+À4¨vÀký‰qÀkÝV 
¼¿ðä}Èx›ðâ¥xUÀ«V—mïÒ¼@¾ r– 'Èê8¾"»Ü\é×/¦VòÏh%¯Uvrîhv£;¤¥ƒÞ\Äé°—°(aQéÒ+¬öЃ—¡9ŠÉXV ÂÿàýØÖß‷Ú7¶Œã9à-Ãhr—zÄxͦW-ð2 ìà6?;rÕl]!†â¸9)eêà•BèüÌëÕò„­B¸ðª‹kÓ+.ô´-5ú”­—º›]hiÐÌjf_œ¸4+2èá2~«ŠäÔ¼ª™O˘°ïQ?#„_êp+6»|:à­àË^å]ðV`ÐÛ¼Û{¼øù7‡XBåhFÀH¦K5xYW¶Ù‘f2¯ÂzÓ¦ÓDbI|å™ ˜¤€°…á€×‚‘ê´‰Þ };ƒ÷5áÁ]¹"‹X"ƒ›31\Ÿ#äO• ^iN—‹a-!ªæ”ÁB‚¾æŒXÑç뾕a%Œá.ÌÑ£¦:˜*VÖ«Sôà“hökipzMá^Õ6ýAŒ †„(’O¿t ?|õ†Â¯%3Õe#¼˜ùvïѼe­®¼n™ –Vc¸6ÇU]ãÊR¯‘9õV{Y•±,^Y]Ör¡­AËP`ëz×Ô§XŠº„v½\޶†”H;9:‡ªØ¬ ÞáîFlëkBsSNøˆÙ$‹+1üðµ8~.×¾¨Øã쀷‚/{•wuÀ[AowðRª‘’šßúÑ%¼õÈÚ×v¡bžçÔb ×gâŸO Ke¥'Õ¯ÂÏÙÒY!Öt#ä±õà•œÛB^t4”d”áV—M.$DgaWßèSÍã]Œ 3=ÍãÒØTu2;×VjÛb‹kú}y}ì^Aà2æ+ÇÙ$ž=ÇH§&Ýyüü¤ÞR†ß€Ÿ;à­à¦lðþéW^ÍZaG3ÞÿØïéASƒ_d6ȱNaz9éE–ÍÆ³•kvM(×Ü.–Ãñ¢ºR¼G‚×çõ"ã =äA{аæÊê²é¥´bŒš4â „î èóˆÂ…¼¾q™ؾ‰íÔÕë¶{}fÛ[/Óʨµ1ÔÓ˜= =õkÓ ü¿Çâ¸5¯…Id®µÞjÝúÇoûÊð IDATöÞŒà¥9d·Ý¾Îfì¿;ç ó3¯%èÆ¦sž°3e5HK3ñ ¹0§—ç§'Îø­¬.›]Nbn%-<\N(6éñÎ/ED<˜f™j˜]JZï±sÝÅÀËp;9w´²‡ŒÄ38v9ާN%Ĩ:ðÚ±üÆÛÖo÷d³ƒ—^ŸÌj8¸« ?ó†-¸g¤C¼ºz,Uæ`¦Àõ6úŒ#ÁR¯£T:™ á‘þ"W¥%euÙôRó«Àj<'6^êÜü\_2Ì£xÏP—~H/SÓɪվIÞùå(»›°½/ÊMÊ1³”Âóçãxþœy‡^+wzãn〷‚{s'Wš‰‹k÷îìÄã÷àî-­ 'ªŽ‰¹&ŠC¸x öÞö€8¶TPkmb5’;0°Lš•Í ¥n©x¹ôxYŽÝò¡aM8^=^%–à¥þ»4ñúÔøíèDONàü­ÒÕxKÝåý¹Þ îÏ ^šL.®IOxÏÖ6Ë6¯¬.£Ì¥²Õ{*C ˆšˆäÈse5”ïÑ´„­Õ÷¶ûñð¾.ÑË©t8b8AŠéà ÅM¼|q7À®x+¸ xó{Iß;Ò®ÊyÂ×gbÙf—üœêd²ºL—ÕeR ÝJ¯Õ[YÊã5Ëj(ÂìNÑßÑ€Ý[)É&ššãÜrϼÆ·Ošk;à-´€ÓeØêS^‡í6R—á:\®8E¥%ÃÌã•édjV-×Ô¯jP=^³ë}ç·â¡}½Ø5Ô’aÆ€g@& V›Í̇óâ·f­Þ7xÕk-áh"…ÎlíÍ ”3N½õàì¤Ͻ®ÉB–;·\ËmŒý6…ÇËEž[Ó+øþ±1¼z~ªn–u<^kÝnõfF‚¬*›Yˆ Mâælªþ&Ö¼©TKḀ?»l”[@!Ô¼^„ ¼M ¹VHÌ´xåJÏœI ÔÊÓã-÷a½ÓÀKGîvvöð+׆ë¹ß¦¯j@êŒÏÔÂx­WÞ–ñ¾çÍ[ñè¡at±Ä 2’T3£úÙä\SóQL̶è­xÕç&™L#‘`ÓÏ0Vcšð¹±µ§#ƒM";BvªøÎ‰(ž?ŸÄR$#*êd )„nçê¶w xÛ€'ö­ZÌf›¼sKQáñîéÊ»/µ†°ÞÒà¥PMO³–1 Ç£‡¶à½ï‚Ïë?’}MBÔ¨„ð©0nΆ…N.G-À»¼GãZÌ•ço8–}ãøLŒßŽ „02_]6µ˜D<ÇÂj Ÿ}6w¼Ö±yh+px+ДKk†ÔÛW»X[?âÆÚrS€WÆx^Þׇ#{û1@ÑUe¯_žÅŽá²Ea–R·Ê¯9x 1½PM.Ä’.<|ßVüò?Û)Ìû±O> ¦W=þÀ0ìì)€p$–ÆI¢¹©6­(:Ã4¹î6?|¨å. 12”Lai5ŽI毭Ö&¯.ã·rpûËS |ù¥8\Hãý÷C€Âo©oQîs†Žnö äúÝñÓée­[Æé[¹Š=ëGݘ[n*ðª&îl bÿÎ.C'’)œ½:_1„ð‚—º R÷VÞtÆ…HÒ•˜ é4ðÄƒÃøßžÜ)*Ðþåï}G¬òÓ{dƒÆ§Ø`2¯Û—Û-²ÎÞJ`%RÚÛ6ûÊËj`ªW0àG0èÏkÕN·6yÐÒèFƒ?—îŽgðâÅ8¾s*! 
CpÀ«xú¥ÐÇpÂÃ;!´'ÔpA{z<¿[†“ÕPÊšuü¼TVC­ ì€WƒŸÔ½U_ ùQÜü‡ámG†ñ±wŽˆ$xå x)%I¯S…ðý{úE/M´1Û‚ù¯S \ŸÕºGØVÓÉè S4ˆb5}°-’s«¡{Ëm°š‡˜N¨xÕsè!Lµw<8daVÊq¾A¿++æÎt°“cq<{6k3ö (Ô¹ìèqã#oö£!À_¿ €»Ý@píÎ3÷|ÆZ“J#[nÄŠ7ío+œ-à "CáVå$p<ÞÊmXµ#Ô¼ê$ìì‘®<³‹œ¸8…oþèrÕ®«Ø>ò®ƒð64ã…óQüø|é…@ý±Ì´NhÎ_sDnáÝF’.¸]Z*Uä&o ·à2ªåñšÙ‡fw ù£¡‡°ìZÁpÉb8…gOÇDu…yªÑúG]\ûüÝ"ÜÞ@s£Jñõfµ\¤ ÕQ©‡h#WV—©Ù œ¿ÈN¸L/•ºëŸ;àµn«šoY/ðê!ü‹ïàâõyüÉÿ<ž-çå\‹u /Ó±D —‰ýjV@¡äñêO­†ÔÏøƒâÿ·ùƘƒWÍj`¼Ûè¸f×^‰Çkö8ÊB þ—–¡;ë¥À[iu™^ë0qÀkÝVU¯zJÄ{ŒÀkEVçç—õ®Þ XU4z}!ëñf åâöµd =^î«5ˆ_oEm¬C¸ª¯I8óâ :ôÃ*xÕý;6–EÅ›Z€W=áË<á½C„å(a3ðV«ºÌ¯u˜8àµn«ªƒ·«àe|ÖHVe™Ì£¼1§‰^(;=ðoÿ  ûÝbqMßs­¿‰þ‘Fvs>ͼz@•ò„ß~d¨¤HŽÞØÙ’aéñ®WfYdᣈñ˜ÝZ;à%h™ý•å[ÖË7ï‡Pkðòº¤Ç;³Fc‹ºù>FÖƒ—‹¬Õ¬.sÀk&x­Ûª¦à5‚p1µÖæ„BM`1@£?··‘¬˜wKàÊq`ȼ2Ô ÓÉè ‹Ðuq×\>+édôÆôñOž_¯ÇKÕ²RC‚—"勹â¸"Ë¢ô¡`¼<.ã·z¼Š©Š’¾Ü¢>$QOð??‰ãç&…Ž® >é3j$„ùƒHyMj]ô´ŠÕTZ]怷ÔSœûܯu[Õ ¼ê‰Š)¨©ÛÉê¶Ì5›ð™È xõž°„°/?g Î݇“lýcáæF_VìV^n±ö£`¸r>¥À«¿^ÉÚ<‰5u#„× ¼rJÅsõ×VÍê2¼Öaâ€×º­Ö¼¥ LÝ„ç.äôaéݶôn‹•òV ^ÎKÓg°®ÑÀ}èu½ãÁ-ø×ïÚ)¼F‚W*„yl¿×#Úóê+à§,$ps!§í[ÎíÓƒ—ñißZu™z< ¸Ö¼èì~.`¸Ó…ÜÄS®¼Ö?ÜÆ(·œk°šÇkaÚ‹¬¬)W¬ÆhÎx­ßI¼ÖmµîàU'ð±÷j%ÃgÆ3·4Qk@ '»´j€7˜6 ,×øÿKŸxz­ãDñáó¸á÷{DÜùÖ|7 Ȇþë·Wpöz´ œ XkW€B™RÖãMŸ{N†[´ E½Á«ZªTVCeVðZ· ^ë¶ÚPàU×ÎŽ³Í½\óWuýä« ^;6Êj`<˜RŽ\ÂsççT‡ÓçÛ½}ŒÛþÃï?.Îóéï¬àÔU­r¤d8ÁƉŒB .ïv9àµaG£M­† XÍÝ7JÉp5¯©Ô±Ìòxïd™i)×¼¥ ¬‚÷#ÿé)ÑèÒãÉ_ÆÒKRÒãe¨!ÊPüý¸nÐ44åµÏüZï§¾C­†¨ýpB‰›£‚÷s?p<ÞRϲÏðÚ±V·½“Á{v<ƒó“Æ©MÅ |h«w÷¹°¸Ãß|ët^Á†>«¡Z·OÆ„Ÿ8:Œõ®]"÷WñjD™º°6Db;ûz²1^¹¸&c°Å×ÚP"ûC‚—çú¯ß^Æëkoµ®‘Çê>øÛÄ×&ÆË|íBàôåüä”5¡['ÔPÍ»\Ù±œPCö³Z2\Á)Lw•ïË£q%Š*‰'Ì%-™NÆÌÂŒéPú¡B˜Ù­ šw«êßüûŸÿÖ[ÅîU/ IÖ×ÞN,†s£Ð:ïÁÝ[4u2U$Ǫ]Ê¿ét°lØÊpÀkÅJõÙÆovÞ(à}êÄrö*Љ¬d!œÎ`w?«×ŒÓ°Ú’¸pu_ÿÑUœ³/²®7©¾”W¦“¼ýƒ§‹§Dx¤X>¯¾€Â¨<˜ñÛöFÍ”¹Ç Þjy̹,Ïÿû·U¼Zü–‡Ô~Èd¨?\ò=@¦O—ÛsÍL÷–çRóxK=ÊxKY¨~Ÿ;à­ÀÖ¼êåƒ0uøw鯾úì<°·ûwhÝ–ÕPÃù›¥»-›™P”¯õCS·yûÑaüÊ?ÿô±O~/û‘$!¬Ï*(¨\SÈPB[£+¯D~LWû«2x×rwõ•s;F%Úr^x+ø²WyW¼´ð’±\DR_}¦P •€—ç“=×®M,áÏ¿zB|i%R µ€ý#Ø·ƒÍ>ó!ÌyÎ-EqR×è“À5ÓN° ^Î×çÕ~ôCF0»ÌŒ œè Ëû&òx¯Å²ÕwFE2ÿVÜTZónù_ý°^«º·œ[V'BÉ!xù'‡PQ[³•>'Úo_ö*ïꀷƒÚ/!K¤‘JYPqÑ͉_~ŸGû“2€¼§n°Kn‘hBx|V‡Q—azˆð–N·H¿2Êjàaö˜»gW”–÷ôj§Âxõ|ñnË倗/Â!³ʱ%÷ÿüx›ø‘‘àUm&K µvñ šNŽÙ(Þþv „áËDò7„ÇÛ܆S7Ò89–,h S 
ÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?
ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜ
ì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜ
ÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?
ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?
ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜ
ÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýü
üüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?
ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?
ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜ
ÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕ
ÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?÷öööööæ?÷öööööæ?yyyyyyé?ê?ê?ýüüüüüì?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?”“““““Ó?”“““““Ó?ÖÕÕÕÕÕÕ?–•••••Õ?–•••••Õ?Ø××××××?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?
ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?
ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜÜÜÜÜì?ÝÜÜÜÜÜì?ð?ÝÜ
ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹
¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?
º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?
ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹
¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?
º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?
ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹
¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?
º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?
ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹
¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?
º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?VVVVVVæ?ÜÛÛÛÛÛë?ÜÛÛÛÛÛë?àßßßßßß?ÔÓÓÓÓÓã?ÔÓÓÓÓÓã?UUUUUUÕ?›šššššÚ?›šššššÚ?œ›››››Ë?QQQQQQÑ?QQQQQQÑ?»?Á?Á?¬?²?²?˜?œ?œ??”?”?ª?°?°?·?½?½?˜—————Ç?žÍ?žÍ?SSSSSSÓ?Ø?Ø?œœœœœÜ?ÒÑÑÑÑÑá?ÒÑÑÑÑÑá?µ´´´´´ä?º¹¹¹¹¹é?º¹¹¹¹¹é?yyyyyyé?À¿¿¿¿¿ï?À¿¿¿¿¿ï?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?vvvvvvæ?üûûûûûë?üûûûûûë?ZZZZZZÚ?PPPPPPà?PPPPPPà?¿?Ã?Ã?®?³?³?Õ?ZZZZZZÚ?ZZZZZZÚ?444444ä?é?é?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹
¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?uuuuuuå?»ºººººê?»ºººººê?Ò?—–––––Ö?—–––––Ö?€?ˆ?ˆ?È?Î?Î?ÓÒÒÒÒÒâ?WWWWWWç?WWWWWWç?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?™˜˜˜˜˜è?Ÿžžžžžî?Ÿžžžžžî?×?œœœœœÜ?œœœœœÜ?€?ˆ?ˆ? 
ŸŸŸŸŸÏ?”“““““Ó?”“““““Ó?×ÖÖÖÖÖæ?\\\\\\ì?\\\\\\ì?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?µ´´´´´ä?º¹¹¹¹¹é?º¹¹¹¹¹é?»?‘À?‘À?¤?ª?ª?111111á?uuuuuuå?uuuuuuå?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?±°°°°°à?µ´´´´´ä?µ´´´´´ä?˜? ? 
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?¶µµµµµå?ûúúúúúê?ûúúúúúê?ŸžžžžžÞ?ã?ã?•”””””Ô?š™™™™™Ù?š™™™™™Ù?Ê?PPPPPPÐ?PPPPPPÐ?º?À?À?¬?±?±?”?˜?˜?ˆ???¨?®?®?¶?»?»?Ç?œœœœœÌ?œœœœœÌ?Ó?Ø××××××?Ø××××××?\\\\\\Ü?’‘‘‘‘‘á?’‘‘‘‘‘á?ttttttä?yyyyyyé?yyyyyyé?yyyyyyé? ŸŸŸŸŸï? ŸŸŸŸŸï?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º
¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º
¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º
¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ï?YYYYYYé?ï?Ã?¿?Ã?˜—————×?Ó?˜—————×?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?444444ä?000000à?444444ä?’‘‘‘‘‘Á?¼?’‘‘‘‘‘Á?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?
ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?~~~~~~î?xxxxxxè?~~~~~~î? ?˜? ?p?p?p?^^^^^^î?xxxxxxè?^^^^^^î?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?TTTTTTÔ?PPPPPPÐ?TTTTTTÔ?ÙØØØØØè?ôóóóóóã?ÙØØØØØè?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?RRRRRRâ?]]]]]]Ý?RRRRRRâ?õôôôôôä?ÑÐÐÐÐÐà?õôôôôôä?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹
¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ø÷÷÷÷÷ç?SSSSSSã?ø÷÷÷÷÷ç?’‘‘‘‘‘á?\\\\\\Ü?’‘‘‘‘‘á?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?¾½½½½½í?ø÷÷÷÷÷ç?¾½½½½½í?Ý?WWWWWW×?Ý?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?
º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?”??”?ZZZZZZÚ?Õ?ZZZZZZÚ?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?²?¬?²?Ø?SSSSSSÓ?Ø?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?‘À?»?‘À?—–––––Ö?Ò?—–––––Ö?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?À?º?À?Ø?SSSSSSÓ?Ø?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?´?°?´?ÚÙÙÙÙÙÙ?ÕÔÔÔÔÔÔ?ÚÙÙÙÙÙÙ?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?¢?œ?¢?œœœœœÜ?×?œœœœœÜ?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?^^^^^^î?xxxxxxè?^^^^^^î?á?[[[[[[Û?á?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ÔÓÓÓÓÓã?àßßßßßß?ÔÓÓÓÓÓã?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?µ´´´´´ä?±°°°°°à?µ´´´´´ä?¸·····ç?ã?¸·····ç?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ÚÙÙÙÙÙÙ?ÕÔÔÔÔÔÔ?ÚÙÙÙÙÙÙ?œœœœœì?÷öööööæ?œœœœœì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?½?·?½?ª?¤?ª?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?¹¸¸¸¸¸è?ÔÓÓÓÓÓã?¹¸¸¸¸¸è?‘Ð?›šššššÊ?‘Ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?“’’’’’Ò?žÍ?“’’’’’Ò?ß?Ù?ß?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?wwwwwwç?ÓÒÒÒÒÒâ?wwwwwwç?¸·····ç?ã?¸·····ç?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹
é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?üûûûûûë?vvvvvvæ?üûûûûûë?³?®?³?®?¨?®?______ï?999999é?______ï?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?UUUUUUÕ?Ñ?UUUUUUÕ?zzzzzzê?UUUUUUå?zzzzzzê?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ç?“’’’’’â?ç?¸?³?¸?×ÖÖÖÖÖÖ?RRRRRRÒ?×ÖÖÖÖÖÖ?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹
¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?Å?Á?Å?]]]]]]Ý?˜—————×?]]]]]]Ý?õôôôôôä?ÑÐÐÐÐÐà?õôôôôôä?›šššššê?UUUUUUå?›šššššê?}}}}}}í?¸·····ç?}}}}}}í?______ï?999999é?______ï?þýýýýýí?è?þýýýýýí?»ºººººê?uuuuuuå?»ºººººê?ä?000000à?ä?Ñ?œ›››››Ë?Ñ?p?p?p?¸·····ç?ã?¸·····ç?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?“’’’’’Â?¾?“’’’’’Â?àßßßßßï?š™™™™™é?àßßßßßï?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?´³³³³³ã? 
ŸŸŸŸŸß?´³³³³³ã?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?Â?½?Â?ï?YYYYYYé?ï?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ç?“’’’’’â?ç?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?WWWWWW×?ÓÒÒÒÒÒÒ?WWWWWW×?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?—–––––Æ?Â?—–––––Æ?Ÿžžžžžî?™˜˜˜˜˜è?Ÿžžžžžî?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?¾?¸?¾?üûûûûûë?vvvvvvæ?üûûûûûë?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?’‘‘‘‘‘Á?¼?’‘‘‘‘‘Á?üûûûûûë?vvvvvvæ?üûûûûûë?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?SSSSSSÓ?Ï?SSSSSSÓ?þýýýýýí?è?þýýýýýí?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?QQQQQQÑ?œ›››››Ë?QQQQQQÑ??ˆ??¹?´?¹?ttttttä?ppppppà?ttttttä?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð
?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?½¼¼¼¼¼ì?ç?½¼¼¼¼¼ì?PPPPPPà?ZZZZZZÚ?PPPPPPà?Ç?“’’’’’Â?Ç?Å?Á?Å?³²²²²²â?Þ?³²²²²²â?Ÿžžžžžî?™˜˜˜˜˜è?Ÿžžžžžî?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?yyyyyyé?ttttttä?yyyyyyé?ýüüüüüì?WWWWWWç?ýüüüüüì?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?\\\\\\ì?×ÖÖÖÖÖæ?\\\\\\ì?´³³³³³ã? 
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?{{{{{{ë?{{{{{{ë?æ?ttttttä?ttttttä?ppppppà?^^^^^^Þ?^^^^^^Þ?XXXXXXØ?TTTTTTÔ?TTTTTTÔ?PPPPPPÐ?Ì?Ì?—–––––Æ?‘À?‘À?»?±?±?¬?¢?¢?œ?€?€?€?œ?œ?˜?²?²?¬?½?½?·?Ä?Ä?À?Ê?Ê?Å?ÕÔÔÔÔÔÔ?ÕÔÔÔÔÔÔ?ÑÐÐÐÐÐÐ?žÝ?žÝ?Ø××××××?333333ã?333333ã?ßÞÞÞÞÞÞ?˜—————ç?˜—————ç?óòòòòòâ?í?í?wwwwwwç?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?àßßßßßï?àßßßßßï?š™™™™™é?ø÷÷÷÷÷ç?ø÷÷÷÷÷ç?SSSSSSã?Ú?Ú?Õ?¼?¼?·?À?À?º?˜—————×?˜—————×?Ó?µ´´´´´ä?µ´´´´´ä?±°°°°°à?ÞÝÝÝÝÝí?ÞÝÝÝÝÝí?ø÷÷÷÷÷ç?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?______ï?______ï?999999é?ã?ã?ŸžžžžžÞ?”“““““Ã?”“““““Ã?¿?˜?˜?”?Ô?Ô?Ð?ç?ç?“’’’’’â?àßßßßßï?àßßßßßï?š™™™™™é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?wwwwwwç?wwwwwwç?ÓÒÒÒÒÒâ?”“““““Ã?”“““““Ã?¿?ÔÓÓÓÓÓã?ÔÓÓÓÓÓã?àßßßßßß?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?À¿¿¿¿¿ï?À¿¿¿¿¿ï?yyyyyyé?ÞÝÝÝÝÝÝ?ÞÝÝÝÝÝÝ?Ø?p?p?p?xxxxxxè?xxxxxxè?´³³³³³ã?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ï?ï?YYYYYYé?ÖÕÕÕÕÕÕ?ÖÕÕÕÕÕÕ?’‘‘‘‘‘Ñ?þýýýýýí?þýýýýýí?è?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?™˜˜˜˜˜Ø?™˜˜˜˜˜Ø?ÔÓÓÓÓÓÓ?¼?¼?·?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?SSSSSSã?SSSSSSã?ß?Ò?Ò?Í?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?======í?======í?wwwwwwç?ª?ª?¤?Ý?Ý?WWWWWW×?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?œœœœœÜ?œœœœœÜ?×?ä?ä?000000à?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹
¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ï?ï?ùøøøøøè?±?±?¬?š™™™™™é?š™™™™™é?•”””””ä?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?×ÖÖÖÖÖæ?×ÖÖÖÖÖæ?RRRRRRâ?€?€?€?ßÞÞÞÞÞî?ßÞÞÞÞÞî?ÙØØØØØè?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º
¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?žÝ?žÝ?Ø××××××?²?²?¬?›šššššÚ?›šššššÚ?UUUUUUÕ?õôôôôôä?õôôôôôä?ÑÐÐÐÐÐà?š™™™™™é?š™™™™™é?•”””””ä?í?í?wwwwwwç?~~~~~~î?~~~~~~î?xxxxxxè? ŸŸŸŸŸï? ŸŸŸŸŸï?yyyyyyé?______ï?______ï?999999é?^^^^^^î?^^^^^^î?xxxxxxè?½¼¼¼¼¼ì?½¼¼¼¼¼ì?ç?yyyyyyé?yyyyyyé?ttttttä?öõõõõõå?öõõõõõå?²±±±±±á?PPPPPPà?PPPPPPà?ZZZZZZÚ?•”””””Ô?•”””””Ô?‘Ð?´?´?°?Â?Â?½?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?RRRRRRÒ?RRRRRRÒ?žÍ?XXXXXXØ?XXXXXXØ?”“““““Ó?~~~~~~î?~~~~~~î?xxxxxxè?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?àßßßßßï?àßßßßßï?š™™™™™é?š™™™™™é?š™™™™™é?•”””””ä?______ß?______ß?Ù?—–––––Æ?—–––––Æ?Â?Ô?Ô?Ð?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?Ç?Ç?“’’’’’Â?”“““““Ó?”“““““Ó? ŸŸŸŸŸÏ?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?œ›››››ë?œ›››››ë?666666æ?²±±±±±á?²±±±±±á?\\\\\\Ü?Ï?Ï?É?à?à?ÚÙÙÙÙÙÙ?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?’‘‘‘‘‘Á?’‘‘‘‘‘Á?¼?å?å?ñðððððà?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?Å?Å?Á?‘à?‘à?›šššššÚ?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?Í?Í?˜—————Ç?±?±?¬?ÜÛÛÛÛÛë?ÜÛÛÛÛÛë?VVVVVVæ?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é
?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹
¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð?º¹¹¹¹¹é?ð?ð
arogozhnikov-einops-ad2c8d6/docs/utils/000077500000000000000000000000001475201674600203265ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs/utils/__init__.py000066400000000000000000000022111475201674600224330ustar00rootroot00000000000000
import numpy as np
from PIL.Image import fromarray
from IPython import get_ipython
from IPython.display import display_html


def display_np_arrays_as_images():
    def np_to_png(a):
        if 2 <= len(a.shape) <= 3:
            return fromarray(np.array(np.clip(a, 0, 1) * 255, dtype="uint8"))._repr_png_()
        else:
            return fromarray(np.zeros([1, 1], dtype="uint8"))._repr_png_()

    def np_to_text(obj, p, cycle):
        if len(obj.shape) < 2:
            print(repr(obj))
        if 2 <= len(obj.shape) <= 3:
            pass
        else:
            print("".format(obj.shape))

    get_ipython().display_formatter.formatters["image/png"].for_type(np.ndarray, np_to_png)
    get_ipython().display_formatter.formatters["text/plain"].for_type(np.ndarray, np_to_text)


_style_inline = """ """


def guess(x):
    display_html(
        _style_inline + "
Answer is: {x} (hover to see)
".format(x=tuple(x)), raw=True, ) arogozhnikov-einops-ad2c8d6/docs_src/000077500000000000000000000000001475201674600200355ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/1-einops-basics.ipynb000077700000000000000000000000001475201674600310732../docs/1-einops-basics.ipynbustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/2-einops-for-deep-learning.ipynb000077700000000000000000000000001475201674600351612../docs/2-einops-for-deep-learning.ipynbustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/3-einmix-layer.ipynb000077700000000000000000000000001475201674600306072../docs/3-einmix-layer.ipynbustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/4-pack-and-unpack.ipynb000077700000000000000000000000001475201674600314752../docs/4-pack-and-unpack.ipynbustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/CNAME000066400000000000000000000000141475201674600205760ustar00rootroot00000000000000einops.rocksarogozhnikov-einops-ad2c8d6/docs_src/api/000077500000000000000000000000001475201674600206065ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/api/asnumpy.md000066400000000000000000000000441475201674600226220ustar00rootroot00000000000000# einops.asnumpy ::: einops.asnumpyarogozhnikov-einops-ad2c8d6/docs_src/api/einsum.md000066400000000000000000000000421475201674600224240ustar00rootroot00000000000000# einops.einsum ::: einops.einsumarogozhnikov-einops-ad2c8d6/docs_src/api/pack_unpack.md000066400000000000000000000001551475201674600234100ustar00rootroot00000000000000# einops.pack and einops.unpack ## einops.pack ::: einops.pack
## einops.unpack ::: einops.unpackarogozhnikov-einops-ad2c8d6/docs_src/api/parse_shape.md000066400000000000000000000000541475201674600234210ustar00rootroot00000000000000# einops.parse_shape ::: einops.parse_shapearogozhnikov-einops-ad2c8d6/docs_src/api/rearrange.md000066400000000000000000000000501475201674600230710ustar00rootroot00000000000000# einops.rearrange ::: einops.rearrangearogozhnikov-einops-ad2c8d6/docs_src/api/reduce.md000066400000000000000000000000421475201674600223730ustar00rootroot00000000000000# einops.reduce ::: einops.reducearogozhnikov-einops-ad2c8d6/docs_src/api/repeat.md000066400000000000000000000000421475201674600224040ustar00rootroot00000000000000# einops.repeat ::: einops.repeatarogozhnikov-einops-ad2c8d6/docs_src/css/000077500000000000000000000000001475201674600206255ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/css/codehilite.css000066400000000000000000000101611475201674600234470ustar00rootroot00000000000000/*, .codehilite pre*/ .codehilite code{ color:#3F3F3F; background-color:#F7F7F7; overflow-x: scroll; box-sizing: border-box; padding: 0.01em; border: 1px solid #CCC; } .codehilite .hll { background-color: #ffffcc } .codehilite .c { color: #999988; font-style: italic } /* Comment */ .codehilite .err { color: #a61717; background-color: #e3d2d2 } /* Error */ .codehilite .k { color: #000000; font-weight: bold } /* Keyword */ .codehilite .o { color: #000000; font-weight: bold } /* Operator */ .codehilite .cm { color: #999988; font-style: italic } /* Comment.Multiline */ .codehilite .cp { color: #999999; font-weight: bold; font-style: italic } /* Comment.Preproc */ .codehilite .c1 { color: #999988; font-style: italic } /* Comment.Single */ .codehilite .cs { color: #999999; font-weight: bold; font-style: italic } /* Comment.Special */ .codehilite .gd { color: #000000; background-color: #ffdddd } /* Generic.Deleted */ .codehilite .ge { color: #000000; font-style: italic } /* Generic.Emph */ .codehilite .gr { color: #aa0000 } 
/* Generic.Error */ .codehilite .gh { color: #999999 } /* Generic.Heading */ .codehilite .gi { color: #000000; background-color: #ddffdd } /* Generic.Inserted */ .codehilite .go { color: #888888 } /* Generic.Output */ .codehilite .gp { color: #555555 } /* Generic.Prompt */ .codehilite .gs { font-weight: bold } /* Generic.Strong */ .codehilite .gu { color: #aaaaaa } /* Generic.Subheading */ .codehilite .gt { color: #aa0000 } /* Generic.Traceback */ .codehilite .kc { color: #000000; font-weight: bold } /* Keyword.Constant */ .codehilite .kd { color: #000000; font-weight: bold } /* Keyword.Declaration */ .codehilite .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */ .codehilite .kp { color: #000000; font-weight: bold } /* Keyword.Pseudo */ .codehilite .kr { color: #000000; font-weight: bold } /* Keyword.Reserved */ .codehilite .kt { color: #445588; font-weight: bold } /* Keyword.Type */ .codehilite .m { color: #009999 } /* Literal.Number */ .codehilite .s { color: #d01040 } /* Literal.String */ .codehilite .na { color: #008080 } /* Name.Attribute */ .codehilite .nb { color: #0086B3 } /* Name.Builtin */ .codehilite .nc { color: #445588; font-weight: bold } /* Name.Class */ .codehilite .no { color: #008080 } /* Name.Constant */ .codehilite .nd { color: #3c5d5d; font-weight: bold } /* Name.Decorator */ .codehilite .ni { color: #800080 } /* Name.Entity */ .codehilite .ne { color: #990000; font-weight: bold } /* Name.Exception */ .codehilite .nf { color: #990000; font-weight: bold } /* Name.Function */ .codehilite .nl { color: #990000; font-weight: bold } /* Name.Label */ .codehilite .nn { color: #555555 } /* Name.Namespace */ .codehilite .nt { color: #000080 } /* Name.Tag */ .codehilite .nv { color: #008080 } /* Name.Variable */ .codehilite .ow { color: #000000; font-weight: bold } /* Operator.Word */ .codehilite .w { color: #bbbbbb } /* Text.Whitespace */ .codehilite .mf { color: #009999 } /* Literal.Number.Float */ .codehilite .mh { color: #009999 } /* 
Literal.Number.Hex */ .codehilite .mi { color: #009999 } /* Literal.Number.Integer */ .codehilite .mo { color: #009999 } /* Literal.Number.Oct */ .codehilite .sb { color: #d01040 } /* Literal.String.Backtick */ .codehilite .sc { color: #d01040 } /* Literal.String.Char */ .codehilite .sd { color: #d01040 } /* Literal.String.Doc */ .codehilite .s2 { color: #d01040 } /* Literal.String.Double */ .codehilite .se { color: #d01040 } /* Literal.String.Escape */ .codehilite .sh { color: #d01040 } /* Literal.String.Heredoc */ .codehilite .si { color: #d01040 } /* Literal.String.Interpol */ .codehilite .sx { color: #d01040 } /* Literal.String.Other */ .codehilite .sr { color: #009926 } /* Literal.String.Regex */ .codehilite .s1 { color: #d01040 } /* Literal.String.Single */ .codehilite .ss { color: #990073 } /* Literal.String.Symbol */ .codehilite .bp { color: #999999 } /* Name.Builtin.Pseudo */ .codehilite .vc { color: #008080 } /* Name.Variable.Class */ .codehilite .vg { color: #008080 } /* Name.Variable.Global */ .codehilite .vi { color: #008080 } /* Name.Variable.Instance */ .codehilite .il { color: #009999 } /* Literal.Number.Integer.Long */arogozhnikov-einops-ad2c8d6/docs_src/css/mkdocs.css000066400000000000000000000012611475201674600226170ustar00rootroot00000000000000div.highlight>pre { padding: 3px; } .md-typeset ul li, .md-typeset ol li { /*original padding makes lists huge*/ margin-bottom: 0.2em; } html { /* override 125% default, which looks terrible on smaller screens */ font-size: 100% !important; } .jupyter-wrapper { /* set css var used by markdown */ --jp-code-font-size: 90% !important; } .md-grid { /* change max width of page, looks much better on bigger screens */ max-width: 72rem !important; } div.highlight { /* this was introduced to fix size of documentation script. 
This applies to multiline code fragments in markdown */
    font-size: 13px;
    background-color: #f2f2f2;
}
arogozhnikov-einops-ad2c8d6/docs_src/docs000077700000000000000000000000001475201674600220502../docs/ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/pages/000077500000000000000000000000001475201674600211345ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/docs_src/pages/projects.md000066400000000000000000000055551475201674600233170ustar00rootroot00000000000000
Einops tutorials cover multiple einops usages (and you'd better first follow the tutorials),
but it can also help to see einops in action.

## Selected projects

Here are some open-source projects that can teach you how to leverage einops for your problems:

- [@lucidrains](https://github.com/lucidrains) has a dramatic [collection of vision transformers](https://github.com/lucidrains/vit-pytorch)
  - there are plenty of good examples of how to use einops efficiently in your projects
- lambda networks (non-conventional architecture) implemented by @lucidrains
  - a nice demonstration of how much clearer code can be with einops, even compared to the description in the paper
  - [implementation](https://github.com/lucidrains/lambda-networks) and [video](https://www.youtube.com/watch?v=3qxJ2WD8p4w)
- capsule networks (aka capsnets) [implemented in einops](https://github.com/arogozhnikov/readable_capsnet)
  - blazingly fast, concise (3-10 times less code), and memory-efficient capsule networks, written with einops
- [NuX](https://github.com/Information-Fusion-Lab-Umass/NuX) — normalizing flows in Jax
  - different rearrangement patterns in normalizing flows map nicely to einops
- For video recognition, look at the [MotionFormer](https://github.com/facebookresearch/Motionformer) and [TimeSFormer](https://github.com/lucidrains/TimeSformer-pytorch) implementations
- For protein folding, see the [implementation](https://github.com/lucidrains/invariant-point-attention) of invariant point attention from AlphaFold 2

## Community introductions to einops

Tutorial in the AI summer about einops and einsum:

Introduction to einops by Kapil Sachdeva

Implementing visual transformer in pytorch:

Refactoring machine learning code, one of the posts in a series is devoted to einops:

ML TLDR thread on einops:

Book "Deep Reinforcement Learning in Action" by Brandon Brown & Alexander Zai contains an introduction to einops in chapter 10.

## Related projects:

- [numpy.einsum](https://numpy.org/doc/stable/reference/generated/numpy.einsum.html) — grand-dad of einops, this operation is now available in all mainstream DL frameworks
- einops in Rust language
- einops in C++ for torch:
- tensorcast in Julia language
- for those chasing an extreme compactness of API, provides 'one op', as the name suggests
- goes in the opposite direction and creates einops-style operations for anything
arogozhnikov-einops-ad2c8d6/docs_src/pages/testimonials.md000066400000000000000000000120101475201674600241630ustar00rootroot00000000000000
Einops was created three years ago, and never hit big ML news.
However, little by little and step by step it sneaked into every major AI lab.
This all happened by word of mouth and by sharing code snippets:

> Einops simplifies and clarifies array/tensor manipulation. 👍
> You really have to try it out, you'll love it: https://github.com/arogozhnikov/einops
> Plus it supports NumPy, TensorFlow, PyTorch, and more.

Aurélien Geron,
author of "Hands-On Machine Learning with Scikit-Learn and TensorFlow."
Former PM of YouTube video classification. [(ref)](https://twitter.com/aureliengeron/status/1382829421967515648)

> This is your daily reminder to *never* trust pytorch's native reshape/view.
> Always use einops!
>
> (Just spend 1 h debugging code and it turned out tensor.view was shuffling the tensor in a weird way)

Tom Lieberum, University of Amsterdam [(ref)](https://twitter.com/lieberum_t/status/1427282842250358787)

> TIL einops can be faster than raw PyTorch 🤯

Zach Mueller, Novetta, author of "walk with fastai" [(ref)](https://twitter.com/TheZachMueller/status/1418003372494426113)

> einops are also a joy to work with!

Norman Casagrande, Deepmind [(ref)](https://twitter.com/nova77t/status/1419405150805008387)

> — And btw I estimate that AI research suffers globally from a 5% loss of productivity because einops are not included in @PyTorch by default.
>
> — Might be true for research, but not likely to be true for engineering. I don't think a lot of people in the industry use PyTorch directly [...]
>
> — That's why it's 5% and not 30%
>
> — E-xa-ctly

[Discussion thread](https://twitter.com/francoisfleuret/status/1409141186326106114)

> After a while, it feels like einsum+einops is all you need ;) [...]

Tim Rocktäschel, Facebook AI Research [(ref)](https://twitter.com/_rockt/status/1390049226193788930)

> Yes, I am also using einops in that snippet! It's great!
> I wished I knew about it from the start when it was created

Tim Dettmers, PhD Student at UoW and visiting researcher at Facebook AI [(ref)](https://twitter.com/Tim_Dettmers/status/1390027329351520256)

> A little late to the party, but einops (https://github.com/arogozhnikov/einops) is a massive improvement to deep learning code readability. I love this!

Daniel Havir, Apple [(ref)](https://twitter.com/danielhavir/status/1389070232853966849)

> I recently discovered the beauty of torch.einsum and einops.rearrange
> and at this point I'm confused why I even bothered with other tensor operations in the first place.

Robin M. Schmidt, AI/ML Resident at Apple [(ref)](https://twitter.com/robinschmidt_/status/1363709832788852736)

> I end up using einops for ridiculously simple things,
> simply to be nice to my future self (because the code takes so little effort to read).

Nasim Rahaman, MILA [(ref)](https://twitter.com/nasim_rahaman/status/1390027557546901504)

> I love Einops for this kind of stuff, it makes code very readable,
> even if you are just doing a simple squeeze or expand_dims. [...]

Cristian Garcia, ML Engineer @quansightai, [(ref)](https://twitter.com/cgarciae88/status/1331968395110211586)

> i might be late to the party, but einsum and the einops package are unbelievably useful

Samson Koelle, Stat PhD candidate at UoW, [(ref)](https://twitter.com/SeattleStatSam/status/1338673646898794496)

> I really recommend einops for tensor shape manipulation

Alex Mordvintsev, DeepDream creator, [(ref)](https://twitter.com/zzznah/status/1315297985585123328)

> The einops Python package is worth checking out:
> it provides a powerful declarative interface
> for manipulating and reshaping multi-dimensional arrays https://github.com/arogozhnikov/einops

Jake VanderPlas,
Google Research, core contributor to Altair, AstroML, scikit-learn, etc. [(ref)](https://twitter.com/jakevdp/status/1299012119761833989)

> I can't believe after this many years of programming with NumPy/PyTorch/TensorFlow, I didn't know about 𝚎𝚒𝚗𝚜𝚞𝚖. [...]
>
> — Arash Vahdat
>
> They smell like regexp to me -- concise, but I know it is going to take effort to understand or modify them in the future.
>
> — John Carmack
>
> I have a much easier time to read einsum than any equivalent combinations of matmul, reshape, broadcasting... you name it.
> Regexps are ad-hoc, subtle and cryptic.
>
> Einstein summation is uniform, succinct with simple, clear semantics.
> Ein sum to rule them all ;)
>
> — Christian Szegedy
>
> The einops library, in particular, is magical. Best thing since baked bread and brewed beer.
> — @aertherks

[Discussion thread](https://twitter.com/aertherks/status/1357054506656165889)

# ---- symlink: docs_src/pytorch-examples.html -> ../docs/pytorch-examples.html ----

# ---- file: einops/__init__.py ----

# imports can use EinopsError class
# ruff: noqa: E402

__author__ = "Alex Rogozhnikov"
__version__ = "0.8.1"


class EinopsError(RuntimeError):
    """Runtime error thrown by einops"""

    pass


__all__ = ["rearrange", "reduce", "repeat", "einsum", "pack", "unpack", "parse_shape", "asnumpy", "EinopsError"]

from .einops import rearrange, reduce, repeat, einsum, parse_shape, asnumpy
from .packing import pack, unpack

# ---- file: einops/_backends.py ----

"""
Backends in `einops` are organized to meet the following requirements
- backends are not imported unless those are actually needed, because
    - backends may not be installed
    - importing all available backends will drive to significant memory footprint
    - backends may be present but installed with errors (but never used), importing may drive to crashes
- backend should be either symbolic or imperative
    - this determines which methods (from_numpy/to_numpy or create_symbol/eval_symbol) should be defined
- if backend can't provide symbols for shape dimensions, UnknownSize objects are used
"""

import sys

__author__ = "Alex Rogozhnikov"

_loaded_backends: dict = {}
_type2backend: dict = {}
_debug_importing = False


def get_backend(tensor) -> "AbstractBackend":
    """
    Takes a correct backend (e.g. numpy backend if tensor is numpy.ndarray) for a tensor.
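    (Illustrative example: for a numpy.ndarray argument this returns the
    NumpyBackend instance; the chosen backend is cached per tensor type in
    _type2backend, so subsequent calls reduce to a dict lookup.)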
    If needed, imports package and creates backend
    """
    _type = type(tensor)
    _result = _type2backend.get(_type, None)
    if _result is not None:
        return _result

    for framework_name, backend in list(_loaded_backends.items()):
        if backend.is_appropriate_type(tensor):
            _type2backend[_type] = backend
            return backend

    # Find backend subclasses recursively
    backend_subclasses = []
    backends = AbstractBackend.__subclasses__()
    while backends:
        backend = backends.pop()
        backends += backend.__subclasses__()
        backend_subclasses.append(backend)

    for BackendSubclass in backend_subclasses:
        if _debug_importing:
            print("Testing for subclass of ", BackendSubclass)
        if BackendSubclass.framework_name not in _loaded_backends:
            # check that module was already imported. Otherwise it can't be imported
            if BackendSubclass.framework_name in sys.modules:
                if _debug_importing:
                    print("Imported backend for ", BackendSubclass.framework_name)
                backend = BackendSubclass()
                _loaded_backends[backend.framework_name] = backend
                if backend.is_appropriate_type(tensor):
                    _type2backend[_type] = backend
                    return backend

    raise RuntimeError("Tensor type unknown to einops {}".format(type(tensor)))


class AbstractBackend:
    """Base backend class, major part of methods are only for debugging purposes."""

    framework_name: str

    def is_appropriate_type(self, tensor):
        """helper method should recognize tensors it can handle"""
        raise NotImplementedError()

    def from_numpy(self, x):
        raise NotImplementedError("framework doesn't support imperative execution")

    def to_numpy(self, x):
        raise NotImplementedError("framework doesn't support imperative execution")

    def create_symbol(self, shape):
        raise NotImplementedError("framework doesn't support symbolic computations")

    def eval_symbol(self, symbol, symbol_value_pairs):
        # symbol-value pairs is list[tuple[symbol, value-tensor]]
        raise NotImplementedError("framework doesn't support symbolic computations")

    def arange(self, start, stop):
        # supplementary method used only in testing, so should implement CPU version
        raise NotImplementedError("framework doesn't implement arange")

    def shape(self, x):
        """shape should return a tuple with integers or "shape symbols" (which will evaluate to actual size)"""
        return x.shape

    def reshape(self, x, shape):
        return x.reshape(shape)

    def transpose(self, x, axes):
        return x.transpose(axes)

    def reduce(self, x, operation, axes):
        return getattr(x, operation)(axis=axes)

    def stack_on_zeroth_dimension(self, tensors: list):
        raise NotImplementedError()

    def add_axis(self, x, new_position):
        raise NotImplementedError()

    def add_axes(self, x, n_axes, pos2len):
        repeats = [1] * n_axes
        for axis_position, axis_length in pos2len.items():
            x = self.add_axis(x, axis_position)
            repeats[axis_position] = axis_length
        return self.tile(x, tuple(repeats))

    def tile(self, x, repeats):
        """repeats - same lengths as x.shape"""
        raise NotImplementedError()

    def concat(self, tensors, axis: int):
        """concatenates tensors along axis.
        Assume identical across tensors: devices, dtypes and shapes except selected axis."""
        raise NotImplementedError()

    def is_float_type(self, x):
        # some backends (torch) can't compute average for non-floating types.
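        # (illustrative aside, not library code: e.g. calling .mean() on an
        # integer torch tensor raises a RuntimeError, which is why einops
        # disallows mean-reduction for non-floating tensors on every backend)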
        # Decided to drop average for all backends if type is not floating
        raise NotImplementedError()

    def layers(self):
        raise NotImplementedError("backend does not provide layers")

    def __repr__(self):
        return "<einops backend for {}>".format(self.framework_name)

    def einsum(self, pattern, *x):
        raise NotImplementedError("backend does not support einsum")


class UnknownSize:
    """pseudo-symbol for symbolic frameworks which do not provide symbols for shape elements"""

    def __floordiv__(self, other):
        return self

    def __eq__(self, other):
        return True  # we don't know actual size

    def __mul__(self, other):
        return self

    def __rmul__(self, other):
        return self

    def __hash__(self):
        return hash(None)


class NumpyBackend(AbstractBackend):
    framework_name = "numpy"

    def __init__(self):
        import numpy

        self.np = numpy

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, self.np.ndarray)

    def from_numpy(self, x):
        return x

    def to_numpy(self, x):
        return x

    def arange(self, start, stop):
        return self.np.arange(start, stop)

    def stack_on_zeroth_dimension(self, tensors: list):
        return self.np.stack(tensors)

    def tile(self, x, repeats):
        return self.np.tile(x, repeats)

    def concat(self, tensors, axis: int):
        return self.np.concatenate(tensors, axis=axis)

    def is_float_type(self, x):
        return x.dtype in ("float16", "float32", "float64", "float128", "bfloat16")

    def add_axis(self, x, new_position):
        return self.np.expand_dims(x, new_position)

    def einsum(self, pattern, *x):
        return self.np.einsum(pattern, *x)


class JaxBackend(NumpyBackend):
    framework_name = "jax"

    def __init__(self):
        super(JaxBackend, self).__init__()
        self.onp = self.np

        import jax.numpy

        self.np = jax.numpy

    def from_numpy(self, x):
        return self.np.asarray(x)

    def to_numpy(self, x):
        return self.onp.asarray(x)


class TorchBackend(AbstractBackend):
    framework_name = "torch"

    def __init__(self):
        import torch

        self.torch = torch
        # importing would register operations in torch._dynamo for torch.compile
        from . import _torch_specific  # noqa

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, self.torch.Tensor)

    def from_numpy(self, x):
        variable = self.torch.from_numpy(x)
        if self.is_float_type(variable):
            # attach grad only to floating types
            variable.requires_grad = True
        return variable

    def to_numpy(self, x):
        return x.detach().cpu().numpy()

    def arange(self, start, stop):
        return self.torch.arange(start, stop, dtype=self.torch.int64)

    def reduce(self, x, operation, reduced_axes):
        if operation == "min":
            return x.amin(dim=reduced_axes)
        elif operation == "max":
            return x.amax(dim=reduced_axes)
        elif operation == "sum":
            return x.sum(dim=reduced_axes)
        elif operation == "mean":
            return x.mean(dim=reduced_axes)
        elif operation in ("any", "all", "prod"):
            # pytorch supports reducing only one operation at a time
            for i in list(sorted(reduced_axes))[::-1]:
                x = getattr(x, operation)(dim=i)
            return x
        else:
            raise NotImplementedError("Unknown reduction ", operation)

    def transpose(self, x, axes):
        return x.permute(axes)

    def stack_on_zeroth_dimension(self, tensors: list):
        return self.torch.stack(tensors)

    def add_axes(self, x, n_axes, pos2len):
        repeats = [-1] * n_axes
        for axis_position, axis_length in pos2len.items():
            x = self.add_axis(x, axis_position)
            repeats[axis_position] = axis_length
        return x.expand(repeats)

    def tile(self, x, repeats):
        return x.repeat(repeats)

    def concat(self, tensors, axis: int):
        return self.torch.cat(tensors, dim=axis)

    def add_axis(self, x, new_position):
        return self.torch.unsqueeze(x, new_position)

    def is_float_type(self, x):
        return x.dtype in [self.torch.float16, self.torch.float32, self.torch.float64, self.torch.bfloat16]

    def layers(self):
        from .layers import torch

        return torch

    def einsum(self, pattern, *x):
        return self.torch.einsum(pattern, *x)


class CupyBackend(AbstractBackend):
    framework_name = "cupy"

    def __init__(self):
        import cupy

        self.cupy = cupy

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, self.cupy.ndarray)

    def from_numpy(self, x):
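        # (illustrative note: cupy.asarray copies the host numpy array to the
        # current GPU device, and to_numpy below copies it back to the host)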
        return self.cupy.asarray(x)

    def to_numpy(self, x):
        return self.cupy.asnumpy(x)

    def arange(self, start, stop):
        return self.cupy.arange(start, stop)

    def stack_on_zeroth_dimension(self, tensors: list):
        return self.cupy.stack(tensors)

    def tile(self, x, repeats):
        return self.cupy.tile(x, repeats)

    def concat(self, tensors, axis: int):
        return self.cupy.concatenate(tensors, axis=axis)

    def add_axis(self, x, new_position):
        return self.cupy.expand_dims(x, new_position)

    def is_float_type(self, x):
        return x.dtype in ("float16", "float32", "float64", "float128", "bfloat16")

    def einsum(self, pattern, *x):
        return self.cupy.einsum(pattern, *x)


class HashableTuple:
    """Overcomes non-hashability of symbolic elements"""

    def __init__(self, elements: tuple):
        self.elements = elements

    def __iter__(self):
        for x in self.elements:
            yield x

    def __len__(self):
        return len(self.elements)

    def __getitem__(self, item):
        return self.elements[item]

    # default equality and hash is used (True only with itself, hash taken of id)


class TensorflowBackend(AbstractBackend):
    framework_name = "tensorflow"

    def __init__(self):
        import tensorflow

        self.tf = tensorflow

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, (self.tf.Tensor, self.tf.Variable))

    def from_numpy(self, x):
        assert self.tf.executing_eagerly()
        return self.tf.convert_to_tensor(x)

    def to_numpy(self, x):
        assert self.tf.executing_eagerly()
        return x.numpy()

    def arange(self, start, stop):
        return self.tf.range(start, stop)

    def shape(self, x):
        if self.tf.executing_eagerly():
            return tuple(UnknownSize() if d is None else int(d) for d in x.shape)
        else:
            static_shape = x.shape.as_list()
            tf_shape = self.tf.shape(x)
            # use the static shape where known, otherwise use the TF shape components
            shape = tuple([s or tf_shape[dim] for dim, s in enumerate(static_shape)])
            try:
                hash(shape)
                return shape
            except BaseException:
                # unhashable symbols in shape. Wrap tuple to be hashable.
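                # (illustrative: symbolic shape entries, e.g. graph-mode tf
                # tensors, may be unhashable, so hash(shape) raises above;
                # HashableTuple keeps the default identity-based __eq__/__hash__,
                # which is enough for it to serve as a cache key)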
                return HashableTuple(shape)

    def reduce(self, x, operation, axes):
        return getattr(self.tf, "reduce_" + operation)(x, axis=axes)

    def reshape(self, x, shape):
        return self.tf.reshape(x, shape)

    def transpose(self, x, axes):
        return self.tf.transpose(x, axes)

    def stack_on_zeroth_dimension(self, tensors: list):
        return self.tf.stack(tensors)

    def tile(self, x, repeats):
        return self.tf.tile(x, repeats)

    def concat(self, tensors, axis: int):
        return self.tf.concat(tensors, axis=axis)

    def add_axis(self, x, new_position):
        return self.tf.expand_dims(x, new_position)

    def is_float_type(self, x):
        return x.dtype in ("float16", "float32", "float64", "float128", "bfloat16")

    def layers(self):
        from .layers import tensorflow

        return tensorflow

    def einsum(self, pattern, *x):
        return self.tf.einsum(pattern, *x)


class TFKerasBackend(AbstractBackend):
    framework_name = "tensorflow.keras"

    def __init__(self):
        import tensorflow as tf

        self.tf = tf
        self.keras = tf.keras
        self.K = tf.keras.backend

    def is_appropriate_type(self, tensor):
        return self.tf.is_tensor(tensor) and self.K.is_keras_tensor(tensor)

    def create_symbol(self, shape):
        return self.keras.Input(batch_shape=shape)

    def eval_symbol(self, symbol, symbol_value_pairs):
        model = self.keras.models.Model([var for (var, _) in symbol_value_pairs], symbol)
        return model.predict_on_batch([val for (_, val) in symbol_value_pairs])

    def arange(self, start, stop):
        return self.K.arange(start, stop)

    def shape(self, x):
        shape = self.K.shape(x)  # tf tensor
        return HashableTuple(tuple(shape))

    def reduce(self, x, operation, axes):
        return getattr(self.K, operation)(x, axis=axes)

    def reshape(self, x, shape):
        return self.K.reshape(x, shape)

    def transpose(self, x, axes):
        return self.K.permute_dimensions(x, axes)

    def stack_on_zeroth_dimension(self, tensors: list):
        return self.K.stack(tensors)

    def tile(self, x, repeats):
        return self.K.tile(x, repeats)

    def concat(self, tensors, axis: int):
        return self.K.concatenate(tensors, axis=axis)

    def add_axis(self, x, new_position):
return self.K.expand_dims(x, new_position) def is_float_type(self, x): return "float" in self.K.dtype(x) def layers(self): from .layers import keras return keras class OneFlowBackend(AbstractBackend): framework_name = "oneflow" def __init__(self): import oneflow as flow self.flow = flow def is_appropriate_type(self, tensor): return isinstance(tensor, self.flow.Tensor) def from_numpy(self, x): variable = self.flow.from_numpy(x) if self.is_float_type(variable): # attach grad only to floating types variable.requires_grad = True return variable def to_numpy(self, x): return x.detach().cpu().numpy() def arange(self, start, stop): return self.flow.arange(start, stop, dtype=self.flow.int64) def reduce(self, x, operation, reduced_axes): for axis in sorted(reduced_axes, reverse=True): if operation == "min": x, _ = x.min(dim=axis) elif operation == "max": x, _ = x.max(dim=axis) elif operation in ["sum", "mean", "prod", "any", "all"]: x = getattr(x, operation)(dim=axis) else: raise NotImplementedError("Unknown reduction ", operation) return x def transpose(self, x, axes): return x.permute(axes) def stack_on_zeroth_dimension(self, tensors: list): return self.flow.stack(tensors) def add_axes(self, x, n_axes, pos2len): repeats = [-1] * n_axes for axis_position, axis_length in pos2len.items(): x = self.add_axis(x, axis_position) repeats[axis_position] = axis_length return x.expand(*repeats) def tile(self, x, repeats): return x.repeat(repeats) def concat(self, tensors, axis: int): return self.flow.concat(tensors, dim=axis) def add_axis(self, x, new_position): return self.flow.unsqueeze(x, new_position) def is_float_type(self, x): return x.dtype in [self.flow.float16, self.flow.float32, self.flow.float64] def layers(self): from .layers import oneflow return oneflow def einsum(self, pattern, *x): return self.flow.einsum(pattern, *x) class PaddleBackend(AbstractBackend): framework_name = "paddle" def __init__(self): import paddle self.paddle = paddle def is_appropriate_type(self, 
tensor): return self.paddle.is_tensor(tensor) def from_numpy(self, x): tensor = self.paddle.to_tensor(x) tensor.stop_gradient = False return tensor def to_numpy(self, x): return x.detach().numpy() def arange(self, start, stop): return self.paddle.arange(start, stop, dtype=self.paddle.int64) def reduce(self, x, operation, axes): if len(axes) == x.ndim: # currently paddle returns 1d tensor instead of 0d return super().reduce(x, operation, axes).squeeze(0) else: return super().reduce(x, operation, axes) def transpose(self, x, axes): return x.transpose(axes) def add_axes(self, x, n_axes, pos2len): repeats = [-1] * n_axes for axis_position, axis_length in pos2len.items(): x = self.add_axis(x, axis_position) repeats[axis_position] = axis_length return x.expand(repeats) def stack_on_zeroth_dimension(self, tensors: list): return self.paddle.stack(tensors) def reshape(self, x, shape): return x.reshape(shape) def tile(self, x, repeats): return x.tile(repeats) def concat(self, tensors, axis: int): return self.paddle.concat(tensors, axis=axis) def add_axis(self, x, new_position): return x.unsqueeze(new_position) def is_float_type(self, x): return x.dtype in [self.paddle.float16, self.paddle.float32, self.paddle.float64] def layers(self): from .layers import paddle return paddle def einsum(self, pattern, *x): return self.paddle.einsum(pattern, *x) def shape(self, x): return tuple(x.shape) class TinygradBackend(AbstractBackend): framework_name = "tinygrad" def __init__(self): import tinygrad self.tinygrad = tinygrad def is_appropriate_type(self, tensor): return isinstance(tensor, self.tinygrad.Tensor) def from_numpy(self, x): return self.tinygrad.Tensor(x) def to_numpy(self, x): return x.numpy() def arange(self, start, stop): return self.tinygrad.Tensor.arange(start, stop) def shape(self, x): return x.shape def reshape(self, x, shape): return x.reshape(shape) def transpose(self, x, axes): return x.permute(axes) def reduce(self, x, operation, axes): for axis in sorted(axes, 
reverse=True): x = getattr(x, operation)(axis=axis) return x def stack_on_zeroth_dimension(self, tensors: list): return self.tinygrad.Tensor.stack(tensors) def add_axis(self, x, new_position): return x.unsqueeze(new_position) def tile(self, x, repeats): return x.repeat(repeats) def concat(self, tensors, axis: int): return tensors[0].cat(*tensors[1:], dim=axis) if len(tensors) > 1 else tensors[0] def is_float_type(self, x): return self.tinygrad.dtypes.is_float(x.dtype) def einsum(self, pattern, *x): return self.tinygrad.Tensor.einsum(pattern, *x) class PyTensorBackend(AbstractBackend): framework_name = "pytensor" def __init__(self): from pytensor import tensor self.pt = tensor def is_appropriate_type(self, tensor): return isinstance(tensor, self.pt.TensorVariable) def is_float_type(self, x): return x.dtype in self.pt.type.float_dtypes def from_numpy(self, x): return self.pt.as_tensor(x) def to_numpy(self, x): return x.eval() # Will only work if there are no symbolic inputs def create_symbol(self, shape): if not isinstance(shape, tuple | list): shape = (shape,) return self.pt.tensor(shape=shape) def eval_symbol(self, symbol, symbol_value_pairs): return symbol.eval(dict(symbol_value_pairs)) def arange(self, start, stop): return self.pt.arange(start, stop) def shape(self, x): # use the static shape dimensions where known return tuple( static_dim if static_dim is not None else symbolic_dim for static_dim, symbolic_dim in zip(x.type.shape, x.shape) ) def stack_on_zeroth_dimension(self, tensors: list): return self.pt.stack(tensors) def tile(self, x, repeats): return self.pt.tile(x, repeats) def concat(self, tensors, axis: int): return self.pt.concatenate(tensors, axis=axis) def add_axis(self, x, new_position): return self.pt.expand_dims(x, new_position) def einsum(self, pattern, *x): return self.pt.einsum(pattern, *x) arogozhnikov-einops-ad2c8d6/einops/_torch_specific.py000066400000000000000000000100521475201674600232260ustar00rootroot00000000000000""" Specialization of 
einops for torch. Unfortunately, torch's jit scripting mechanism isn't strong enough, and to have scripting supported at least for layers, a number of additional moves is needed. Design of main operations (dynamic resolution by lookup) is unlikely to be implemented by torch.jit.script, but torch.compile seems to work with operations just fine. """ import warnings from typing import Dict, List, Tuple import torch from einops.einops import TransformRecipe, _reconstruct_from_shape_uncached class TorchJitBackend: """ Completely static backend that mimics part of normal backend functionality but restricted to be within torchscript. """ @staticmethod def reduce(x: torch.Tensor, operation: str, reduced_axes: List[int]): if operation == "min": return x.amin(dim=reduced_axes) elif operation == "max": return x.amax(dim=reduced_axes) elif operation == "sum": return x.sum(dim=reduced_axes) elif operation == "mean": return x.mean(dim=reduced_axes) elif operation == "prod": for i in list(sorted(reduced_axes))[::-1]: x = x.prod(dim=i) return x else: raise NotImplementedError("Unknown reduction ", operation) @staticmethod def transpose(x, axes: List[int]): return x.permute(axes) @staticmethod def stack_on_zeroth_dimension(tensors: List[torch.Tensor]): return torch.stack(tensors) @staticmethod def tile(x, repeats: List[int]): return x.repeat(repeats) @staticmethod def add_axes(x, n_axes: int, pos2len: Dict[int, int]): repeats = [-1] * n_axes for axis_position, axis_length in pos2len.items(): x = torch.unsqueeze(x, axis_position) repeats[axis_position] = axis_length return x.expand(repeats) @staticmethod def is_float_type(x): return x.dtype in [torch.float16, torch.float32, torch.float64, torch.bfloat16] @staticmethod def shape(x): return x.shape @staticmethod def reshape(x, shape: List[int]): return x.reshape(shape) # mirrors einops.einops._apply_recipe def apply_for_scriptable_torch( recipe: TransformRecipe, tensor: torch.Tensor, reduction_type: str, axes_dims: List[Tuple[str, 
int]] ) -> torch.Tensor: backend = TorchJitBackend ( init_shapes, axes_reordering, reduced_axes, added_axes, final_shapes, n_axes_w_added, ) = _reconstruct_from_shape_uncached(recipe, backend.shape(tensor), axes_dims=axes_dims) if init_shapes is not None: tensor = backend.reshape(tensor, init_shapes) if axes_reordering is not None: tensor = backend.transpose(tensor, axes_reordering) if len(reduced_axes) > 0: tensor = backend.reduce(tensor, operation=reduction_type, reduced_axes=reduced_axes) if len(added_axes) > 0: tensor = backend.add_axes(tensor, n_axes=n_axes_w_added, pos2len=added_axes) if final_shapes is not None: tensor = backend.reshape(tensor, final_shapes) return tensor def allow_ops_in_compiled_graph(): if hasattr(torch, "__version__") and torch.__version__[0] < "2": # torch._dynamo and torch.compile appear in pytorch 2.0 return try: from torch._dynamo import allow_in_graph except ImportError: warnings.warn("allow_ops_in_compiled_graph failed to import torch: ensure pytorch >=2.0", ImportWarning) return from .einops import rearrange, reduce, repeat, einsum from .packing import pack, unpack allow_in_graph(rearrange) allow_in_graph(reduce) allow_in_graph(repeat) allow_in_graph(einsum) allow_in_graph(pack) allow_in_graph(unpack) # CF: https://github.com/pytorch/pytorch/blob/2df939aacac68e9621fbd5d876c78d86e72b41e2/torch/_dynamo/__init__.py#L222 global _ops_were_registered_in_torchdynamo _ops_were_registered_in_torchdynamo = True # module import automatically registers ops in torchdynamo allow_ops_in_compiled_graph() arogozhnikov-einops-ad2c8d6/einops/array_api.py000066400000000000000000000121771475201674600220640ustar00rootroot00000000000000from typing import List, Tuple, Sequence from .einops import Tensor, Reduction, EinopsError, _prepare_transformation_recipe, _apply_recipe_array_api from .packing import analyze_pattern, prod def reduce(tensor: Tensor, pattern: str, reduction: Reduction, **axes_lengths: int) -> Tensor: if isinstance(tensor, list): if 
len(tensor) == 0: raise TypeError("Einops can't be applied to an empty list") xp = tensor[0].__array_namespace__() tensor = xp.stack(tensor) else: xp = tensor.__array_namespace__() try: hashable_axes_lengths = tuple(axes_lengths.items()) recipe = _prepare_transformation_recipe(pattern, reduction, axes_names=tuple(axes_lengths), ndim=tensor.ndim) return _apply_recipe_array_api( xp, recipe=recipe, tensor=tensor, reduction_type=reduction, axes_lengths=hashable_axes_lengths, ) except EinopsError as e: message = ' Error while processing {}-reduction pattern "{}".'.format(reduction, pattern) if not isinstance(tensor, list): message += "\n Input tensor shape: {}. ".format(tensor.shape) else: message += "\n Input is list. " message += "Additional info: {}.".format(axes_lengths) raise EinopsError(message + "\n {}".format(e)) def repeat(tensor: Tensor, pattern: str, **axes_lengths) -> Tensor: return reduce(tensor, pattern, reduction="repeat", **axes_lengths) def rearrange(tensor: Tensor, pattern: str, **axes_lengths) -> Tensor: return reduce(tensor, pattern, reduction="rearrange", **axes_lengths) def asnumpy(tensor: Tensor): import numpy as np return np.from_dlpack(tensor) Shape = Tuple def pack(tensors: Sequence[Tensor], pattern: str) -> Tuple[Tensor, List[Shape]]: n_axes_before, n_axes_after, min_axes = analyze_pattern(pattern, "pack") xp = tensors[0].__array_namespace__() reshaped_tensors: List[Tensor] = [] packed_shapes: List[Shape] = [] for i, tensor in enumerate(tensors): shape = tensor.shape if len(shape) < min_axes: raise EinopsError( f"packed tensor #{i} (enumeration starts with 0) has shape {shape}, " f"while pattern {pattern} assumes at least {min_axes} axes" ) axis_after_packed_axes = len(shape) - n_axes_after packed_shapes.append(shape[n_axes_before:axis_after_packed_axes]) reshaped_tensors.append(xp.reshape(tensor, (*shape[:n_axes_before], -1, *shape[axis_after_packed_axes:]))) return xp.concat(reshaped_tensors, axis=n_axes_before), packed_shapes def 
unpack(tensor: Tensor, packed_shapes: List[Shape], pattern: str) -> List[Tensor]: xp = tensor.__array_namespace__() n_axes_before, n_axes_after, min_axes = analyze_pattern(pattern, opname="unpack") # backend = get_backend(tensor) input_shape = tensor.shape if len(input_shape) != n_axes_before + 1 + n_axes_after: raise EinopsError(f"unpack(..., {pattern}) received input of wrong dim with shape {input_shape}") unpacked_axis: int = n_axes_before lengths_of_composed_axes: List[int] = [-1 if -1 in p_shape else prod(p_shape) for p_shape in packed_shapes] n_unknown_composed_axes = sum(x == -1 for x in lengths_of_composed_axes) if n_unknown_composed_axes > 1: raise EinopsError( f"unpack(..., {pattern}) received more than one -1 in {packed_shapes} and can't infer dimensions" ) # following manipulations allow to skip some shape verifications # and leave it to backends # [[], [2, 3], [4], [-1, 5], [6]] < examples of packed_axis # split positions when computed should be # [0, 1, 7, 11, N-6 , N ], where N = length of axis split_positions = [0] * len(packed_shapes) + [input_shape[unpacked_axis]] if n_unknown_composed_axes == 0: for i, x in enumerate(lengths_of_composed_axes[:-1]): split_positions[i + 1] = split_positions[i] + x else: unknown_composed_axis: int = lengths_of_composed_axes.index(-1) for i in range(unknown_composed_axis): split_positions[i + 1] = split_positions[i] + lengths_of_composed_axes[i] for j in range(unknown_composed_axis + 1, len(lengths_of_composed_axes))[::-1]: split_positions[j] = split_positions[j + 1] - lengths_of_composed_axes[j] shape_start = input_shape[:unpacked_axis] shape_end = input_shape[unpacked_axis + 1 :] slice_filler = (slice(None, None),) * unpacked_axis try: return [ xp.reshape( # shortest way slice arbitrary axis tensor[(*slice_filler, slice(split_positions[i], split_positions[i + 1]), ...)], (*shape_start, *element_shape, *shape_end), ) for i, element_shape in enumerate(packed_shapes) ] except Exception: # this hits if there is an 
error during reshapes, which means passed shapes were incorrect raise RuntimeError( f'Error during unpack(..., "{pattern}"): could not split axis of size {split_positions[-1]}' f" into requested {packed_shapes}" ) arogozhnikov-einops-ad2c8d6/einops/einops.py000066400000000000000000001113011475201674600213770ustar00rootroot00000000000000import functools import itertools import string import typing from collections import OrderedDict from typing import Set, Tuple, List, Dict, Union, Callable, Optional, TypeVar, cast, Any if typing.TYPE_CHECKING: # for docstrings in pycharm import numpy as np # noqa E401 from . import EinopsError from ._backends import get_backend from .parsing import ParsedExpression, _ellipsis, AnonymousAxis Tensor = TypeVar("Tensor") ReductionCallable = Callable[[Tensor, Tuple[int, ...]], Tensor] Reduction = Union[str, ReductionCallable] Size = typing.Any _reductions = ("min", "max", "sum", "mean", "prod", "any", "all") # magic integers are required to stay within # traceable subset of language _unknown_axis_length = -999999 _expected_axis_length = -99999 def _product(sequence: List[int]) -> int: """minimalistic product that works both with numbers and symbols. 
Supports empty lists""" result = 1 for element in sequence: result *= element return result def _reduce_axes(tensor, reduction_type: Reduction, reduced_axes: List[int], backend): if callable(reduction_type): # custom callable return reduction_type(tensor, tuple(reduced_axes)) else: # one of built-in operations assert reduction_type in _reductions if reduction_type == "mean": if not backend.is_float_type(tensor): raise NotImplementedError("reduce_mean is not available for non-floating tensors") return backend.reduce(tensor, reduction_type, tuple(reduced_axes)) def _optimize_transformation(init_shapes, reduced_axes, axes_reordering, final_shapes): # 'collapses' neighboring axes if those participate in the result pattern in the same order # TODO add support for added_axes assert len(axes_reordering) + len(reduced_axes) == len(init_shapes) # joining consecutive axes that will be reduced # possibly we can skip this if all backends can optimize this (not sure) reduced_axes = tuple(sorted(reduced_axes)) for i in range(len(reduced_axes) - 1)[::-1]: if reduced_axes[i] + 1 == reduced_axes[i + 1]: removed_axis = reduced_axes[i + 1] removed_length = init_shapes[removed_axis] init_shapes = init_shapes[:removed_axis] + init_shapes[removed_axis + 1 :] init_shapes[removed_axis - 1] *= removed_length reduced_axes = reduced_axes[: i + 1] + tuple(axis - 1 for axis in reduced_axes[i + 2 :]) # removing axes that are moved together during reshape def build_mapping(): init_to_final = {} for axis in range(len(init_shapes)): if axis in reduced_axes: init_to_final[axis] = None else: after_reduction = sum(x is not None for x in init_to_final.values()) init_to_final[axis] = list(axes_reordering).index(after_reduction) return init_to_final init_axis_to_final_axis = build_mapping() for init_axis in range(len(init_shapes) - 1)[::-1]: if init_axis_to_final_axis[init_axis] is None: continue if init_axis_to_final_axis[init_axis + 1] is None: continue if init_axis_to_final_axis[init_axis] + 1 == 
init_axis_to_final_axis[init_axis + 1]: removed_axis = init_axis + 1 removed_length = init_shapes[removed_axis] removed_axis_after_reduction = sum(x not in reduced_axes for x in range(removed_axis)) reduced_axes = tuple(axis if axis < removed_axis else axis - 1 for axis in reduced_axes) init_shapes = init_shapes[:removed_axis] + init_shapes[removed_axis + 1 :] init_shapes[removed_axis - 1] *= removed_length old_reordering = axes_reordering axes_reordering = [] for axis in old_reordering: if axis == removed_axis_after_reduction: pass elif axis < removed_axis_after_reduction: axes_reordering.append(axis) else: axes_reordering.append(axis - 1) init_axis_to_final_axis = build_mapping() return init_shapes, reduced_axes, axes_reordering, final_shapes CookedRecipe = Tuple[Optional[List[int]], Optional[List[int]], List[int], Dict[int, int], Optional[List[int]], int] # Actual type is tuple[tuple[str, int], ...] # However torch.jit.script does not "understand" the correct type, # and torch_specific will use list version. HashableAxesLengths = Tuple[Tuple[str, int], ...] FakeHashableAxesLengths = List[Tuple[str, int]] class TransformRecipe: """ Recipe describes actual computation pathway. Recipe can be applied to a tensor or variable. """ # structure is non-mutable. In future, this can be non-mutable dataclass (python 3.7+) # update: pytorch 2.0 torch.jit.script seems to have problems with dataclasses unless they were explicitly provided def __init__( self, # list of sizes (or just sizes) for elementary axes as they appear in left expression. # this is what (after computing unknown parts) will be a shape after first transposition. # This does not include any ellipsis dimensions. elementary_axes_lengths: List[int], # if additional axes are provided, they should be set in prev array # This shows mapping from name to position axis_name2elementary_axis: Dict[str, int], # each dimension in input can help to reconstruct length of one elementary axis # or verify one of dimensions. 
Each element points to element of elementary_axes_lengths. input_composition_known_unknown: List[Tuple[List[int], List[int]]], # permutation applied to elementary axes, if ellipsis is absent axes_permutation: List[int], # permutation puts reduced axes in the end, we only need to know the first position. first_reduced_axis: int, # at which positions which of elementary axes should appear. Axis position -> axis index. added_axes: Dict[int, int], # ids of axes as they appear in result, again pointers to elementary_axes_lengths, # only used to infer result dimensions output_composite_axes: List[List[int]], ): self.elementary_axes_lengths: List[int] = elementary_axes_lengths self.axis_name2elementary_axis: Dict[str, int] = axis_name2elementary_axis self.input_composition_known_unknown: List[Tuple[List[int], List[int]]] = input_composition_known_unknown self.axes_permutation: List[int] = axes_permutation self.first_reduced_axis: int = first_reduced_axis self.added_axes: Dict[int, int] = added_axes self.output_composite_axes: List[List[int]] = output_composite_axes def _reconstruct_from_shape_uncached( self: TransformRecipe, shape: List[int], axes_dims: FakeHashableAxesLengths ) -> CookedRecipe: """ Reconstruct all actual parameters using shape. Shape is a tuple that may contain integers, shape symbols (tf, theano) and UnknownSize (tf, previously mxnet) known axes can be integers or symbols, but not Nones. 
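The inference step performed here can be sketched in plain Python (a simplified, hypothetical stand-in for the real recipe machinery): multiply the known elementary-axis lengths, check divisibility, and deduce the single unknown axis.

```python
# Simplified sketch of composite-axis inference: given the total length of one
# input dimension and the known lengths of all but one elementary axis inside
# it, deduce the remaining length; fail if the product does not divide evenly.
def infer_unknown_axis(total_length, known_lengths):
    known_product = 1
    for length in known_lengths:
        known_product *= length
    if total_length % known_product != 0:
        raise ValueError(
            f"can't divide axis of length {total_length} in chunks of {known_product}"
        )
    return total_length // known_product

# e.g. pattern '(h h2) w -> ...' with h2=2 on an axis of length 30 gives h=15
print(infer_unknown_axis(30, [2]))  # 15
```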
""" # magic number need_init_reshape = False # last axis is allocated for collapsed ellipsis axes_lengths: List[int] = list(self.elementary_axes_lengths) for axis, dim in axes_dims: axes_lengths[self.axis_name2elementary_axis[axis]] = dim for input_axis, (known_axes, unknown_axes) in enumerate(self.input_composition_known_unknown): length = shape[input_axis] if len(known_axes) == 0 and len(unknown_axes) == 1: # shortcut for the most common case axes_lengths[unknown_axes[0]] = length continue known_product = 1 for axis in known_axes: known_product *= axes_lengths[axis] if len(unknown_axes) == 0: if isinstance(length, int) and isinstance(known_product, int) and length != known_product: raise EinopsError(f"Shape mismatch, {length} != {known_product}") else: # assert len(unknown_axes) == 1, 'this is enforced when recipe is created, so commented out' if isinstance(length, int) and isinstance(known_product, int) and length % known_product != 0: raise EinopsError(f"Shape mismatch, can't divide axis of length {length} in chunks of {known_product}") unknown_axis = unknown_axes[0] inferred_length: int = length // known_product axes_lengths[unknown_axis] = inferred_length if len(known_axes) + len(unknown_axes) != 1: need_init_reshape = True # at this point all axes_lengths are computed (either have values or variables, but not Nones) # elementary axes are ordered as they appear in input, then all added axes init_shapes: Optional[List[int]] = axes_lengths[: len(self.axes_permutation)] if need_init_reshape else None need_final_reshape = False final_shapes: List[int] = [] for grouping in self.output_composite_axes: lengths = [axes_lengths[elementary_axis] for elementary_axis in grouping] final_shapes.append(_product(lengths)) if len(lengths) != 1: need_final_reshape = True added_axes: Dict[int, int] = { pos: axes_lengths[pos_in_elementary] for pos, pos_in_elementary in self.added_axes.items() } # this list can be empty reduced_axes = list(range(self.first_reduced_axis, 
len(self.axes_permutation))) n_axes_after_adding_axes = len(added_axes) + len(self.axes_permutation) axes_reordering: Optional[List[int]] = self.axes_permutation if self.axes_permutation == list(range(len(self.axes_permutation))): axes_reordering = None _final_shapes = final_shapes if need_final_reshape else None return init_shapes, axes_reordering, reduced_axes, added_axes, _final_shapes, n_axes_after_adding_axes _reconstruct_from_shape = functools.lru_cache(1024)(_reconstruct_from_shape_uncached) def _apply_recipe( backend, recipe: TransformRecipe, tensor: Tensor, reduction_type: Reduction, axes_lengths: HashableAxesLengths ) -> Tensor: # this method implements actual work for all backends for 3 operations try: init_shapes, axes_reordering, reduced_axes, added_axes, final_shapes, n_axes_w_added = _reconstruct_from_shape( recipe, backend.shape(tensor), axes_lengths ) except TypeError: # shape or one of passed axes lengths is not hashable (i.e. they are symbols) _result = _reconstruct_from_shape_uncached(recipe, backend.shape(tensor), axes_lengths) (init_shapes, axes_reordering, reduced_axes, added_axes, final_shapes, n_axes_w_added) = _result if init_shapes is not None: tensor = backend.reshape(tensor, init_shapes) if axes_reordering is not None: tensor = backend.transpose(tensor, axes_reordering) if len(reduced_axes) > 0: tensor = _reduce_axes(tensor, reduction_type=reduction_type, reduced_axes=reduced_axes, backend=backend) if len(added_axes) > 0: tensor = backend.add_axes(tensor, n_axes=n_axes_w_added, pos2len=added_axes) if final_shapes is not None: tensor = backend.reshape(tensor, final_shapes) return tensor def _apply_recipe_array_api( xp, recipe: TransformRecipe, tensor: Tensor, reduction_type: Reduction, axes_lengths: HashableAxesLengths ) -> Tensor: # completely-inline implementation init_shapes, axes_reordering, reduced_axes, added_axes, final_shapes, n_axes_w_added = _reconstruct_from_shape( recipe, tensor.shape, axes_lengths ) if init_shapes is not 
None: tensor = xp.reshape(tensor, init_shapes) if axes_reordering is not None: tensor = xp.permute_dims(tensor, axes_reordering) if len(reduced_axes) > 0: if callable(reduction_type): # custom callable tensor = reduction_type(tensor, tuple(reduced_axes)) else: # one of built-in operations assert reduction_type in _reductions tensor = getattr(xp, reduction_type)(tensor, axis=tuple(reduced_axes)) if len(added_axes) > 0: # we use broadcasting for axis_position, axis_length in added_axes.items(): tensor = xp.expand_dims(tensor, axis=axis_position) final_shape = list(tensor.shape) for axis_position, axis_length in added_axes.items(): final_shape[axis_position] = axis_length tensor = xp.broadcast_to(tensor, final_shape) if final_shapes is not None: tensor = xp.reshape(tensor, final_shapes) return tensor @functools.lru_cache(256) def _prepare_transformation_recipe( pattern: str, operation: Reduction, axes_names: Tuple[str, ...], ndim: int, ) -> TransformRecipe: """Perform initial parsing of pattern and provided supplementary info axes_lengths is a tuple of tuples (axis_name, axis_length) """ left_str, rght_str = pattern.split("->") left = ParsedExpression(left_str) rght = ParsedExpression(rght_str) # checking that axes are in agreement - new axes appear only in repeat, while disappear only in reduction if not left.has_ellipsis and rght.has_ellipsis: raise EinopsError("Ellipsis found in right side, but not left side of a pattern {}".format(pattern)) if left.has_ellipsis and left.has_ellipsis_parenthesized: raise EinopsError("Ellipsis inside parenthesis in the left side is not allowed: {}".format(pattern)) if operation == "rearrange": if left.has_non_unitary_anonymous_axes or rght.has_non_unitary_anonymous_axes: raise EinopsError("Non-unitary anonymous axes are not supported in rearrange (exception is length 1)") difference = set.symmetric_difference(left.identifiers, rght.identifiers) if len(difference) > 0: raise EinopsError("Identifiers only on one side of expression 
(should be on both): {}".format(difference)) elif operation == "repeat": difference = set.difference(left.identifiers, rght.identifiers) if len(difference) > 0: raise EinopsError("Unexpected identifiers on the left side of repeat: {}".format(difference)) axes_without_size = set.difference( {ax for ax in rght.identifiers if not isinstance(ax, AnonymousAxis)}, {*left.identifiers, *axes_names}, ) if len(axes_without_size) > 0: raise EinopsError("Specify sizes for new axes in repeat: {}".format(axes_without_size)) elif operation in _reductions or callable(operation): difference = set.difference(rght.identifiers, left.identifiers) if len(difference) > 0: raise EinopsError("Unexpected identifiers on the right side of reduce {}: {}".format(operation, difference)) else: raise EinopsError("Unknown reduction {}. Expect one of {}.".format(operation, _reductions)) if left.has_ellipsis: n_other_dims = len(left.composition) - 1 if ndim < n_other_dims: raise EinopsError(f"Wrong shape: expected >={n_other_dims} dims. Received {ndim}-dim tensor.") ellipsis_ndim = ndim - n_other_dims ell_axes = [_ellipsis + str(i) for i in range(ellipsis_ndim)] left_composition = [] for composite_axis in left.composition: if composite_axis == _ellipsis: for axis in ell_axes: left_composition.append([axis]) else: left_composition.append(composite_axis) rght_composition = [] for composite_axis in rght.composition: if composite_axis == _ellipsis: for axis in ell_axes: rght_composition.append([axis]) else: group = [] for axis in composite_axis: if axis == _ellipsis: group.extend(ell_axes) else: group.append(axis) rght_composition.append(group) left.identifiers.update(ell_axes) left.identifiers.remove(_ellipsis) if rght.has_ellipsis: rght.identifiers.update(ell_axes) rght.identifiers.remove(_ellipsis) else: if ndim != len(left.composition): raise EinopsError(f"Wrong shape: expected {len(left.composition)} dims. 
Received {ndim}-dim tensor.") left_composition = left.composition rght_composition = rght.composition # parsing all dimensions to find out lengths axis_name2known_length: Dict[Union[str, AnonymousAxis], int] = OrderedDict() for composite_axis in left_composition: for axis_name in composite_axis: if isinstance(axis_name, AnonymousAxis): axis_name2known_length[axis_name] = axis_name.value else: axis_name2known_length[axis_name] = _unknown_axis_length # axis_ids_after_first_reshape = range(len(axis_name2known_length)) at this point repeat_axes_names = [] for axis_name in rght.identifiers: if axis_name not in axis_name2known_length: if isinstance(axis_name, AnonymousAxis): axis_name2known_length[axis_name] = axis_name.value else: axis_name2known_length[axis_name] = _unknown_axis_length repeat_axes_names.append(axis_name) axis_name2position = {name: position for position, name in enumerate(axis_name2known_length)} # axes provided as kwargs for elementary_axis in axes_names: if not ParsedExpression.check_axis_name(elementary_axis): raise EinopsError("Invalid name for an axis", elementary_axis) if elementary_axis not in axis_name2known_length: raise EinopsError("Axis {} is not used in transform".format(elementary_axis)) axis_name2known_length[elementary_axis] = _expected_axis_length input_axes_known_unknown = [] # some shapes are inferred later - all information is prepared for faster inference for i, composite_axis in enumerate(left_composition): known: Set[str] = {axis for axis in composite_axis if axis_name2known_length[axis] != _unknown_axis_length} unknown: Set[str] = {axis for axis in composite_axis if axis_name2known_length[axis] == _unknown_axis_length} if len(unknown) > 1: raise EinopsError("Could not infer sizes for {}".format(unknown)) assert len(unknown) + len(known) == len(composite_axis) input_axes_known_unknown.append( ([axis_name2position[axis] for axis in known], [axis_name2position[axis] for axis in unknown]) ) axis_position_after_reduction: Dict[str, 
int] = {} for axis_name in itertools.chain(*left_composition): if axis_name in rght.identifiers: axis_position_after_reduction[axis_name] = len(axis_position_after_reduction) result_axes_grouping: List[List[int]] = [ [axis_name2position[axis] for axis in composite_axis] for i, composite_axis in enumerate(rght_composition) ] ordered_axis_left = list(itertools.chain(*left_composition)) ordered_axis_rght = list(itertools.chain(*rght_composition)) reduced_axes = [axis for axis in ordered_axis_left if axis not in rght.identifiers] order_after_transposition = [axis for axis in ordered_axis_rght if axis in left.identifiers] + reduced_axes axes_permutation = [ordered_axis_left.index(axis) for axis in order_after_transposition] added_axes = { i: axis_name2position[axis_name] for i, axis_name in enumerate(ordered_axis_rght) if axis_name not in left.identifiers } first_reduced_axis = len(order_after_transposition) - len(reduced_axes) return TransformRecipe( elementary_axes_lengths=list(axis_name2known_length.values()), axis_name2elementary_axis={axis: axis_name2position[axis] for axis in axes_names}, input_composition_known_unknown=input_axes_known_unknown, axes_permutation=axes_permutation, first_reduced_axis=first_reduced_axis, added_axes=added_axes, output_composite_axes=result_axes_grouping, ) def _prepare_recipes_for_all_dims( pattern: str, operation: Reduction, axes_names: Tuple[str, ...] ) -> Dict[int, TransformRecipe]: """ Internal function, used in layers. 
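The pre-computation of candidate ranks can be sketched stand-alone (a simplified illustration; the real code builds a full TransformRecipe per ndim): without an ellipsis the pattern fixes the input rank exactly, while an ellipsis allows a small range of ranks.

```python
# Sketch of the ndim pre-computation used by layers: an ellipsis-free pattern
# admits exactly one input rank; with an ellipsis, a range of ranks is
# prepared up front (8 variants, mirroring range(8) in the code above).
def candidate_ndims(n_composite_axes, has_ellipsis):
    if not has_ellipsis:
        return [n_composite_axes]
    return [n_composite_axes - 1 + extra for extra in range(8)]

print(candidate_ndims(3, has_ellipsis=False))  # [3]
print(candidate_ndims(3, has_ellipsis=True))   # [2, 3, 4, 5, 6, 7, 8, 9]
```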
Layer makes all recipe creation when it is initialized, thus to keep recipes simple we pre-compute for all dims """ left_str, rght_str = pattern.split("->") left = ParsedExpression(left_str) dims = [len(left.composition)] if left.has_ellipsis: dims = [len(left.composition) - 1 + ellipsis_dims for ellipsis_dims in range(8)] return {ndim: _prepare_transformation_recipe(pattern, operation, axes_names, ndim=ndim) for ndim in dims} def reduce(tensor: Union[Tensor, List[Tensor]], pattern: str, reduction: Reduction, **axes_lengths: Size) -> Tensor: """ einops.reduce combines rearrangement and reduction using reader-friendly notation. Some examples: ```python >>> x = np.random.randn(100, 32, 64) # perform max-reduction on the first axis # Axis t does not appear on RHS - thus we reduced over t >>> y = reduce(x, 't b c -> b c', 'max') # same as previous, but using verbose names for axes >>> y = reduce(x, 'time batch channel -> batch channel', 'max') # let's pretend now that x is a batch of images # with 4 dims: batch=10, height=20, width=30, channel=40 >>> x = np.random.randn(10, 20, 30, 40) # 2d max-pooling with kernel size = 2 * 2 for image processing >>> y1 = reduce(x, 'b c (h1 h2) (w1 w2) -> b c h1 w1', 'max', h2=2, w2=2) # same as previous, using anonymous axes, # note: only reduced axes can be anonymous >>> y1 = reduce(x, 'b c (h1 2) (w1 2) -> b c h1 w1', 'max') # adaptive 2d max-pooling to 3 * 4 grid, # each element is max of 10x10 tile in the original tensor. 
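To make the pooling semantics above concrete without any tensor library, here is a hand-written plain-Python analogue of `reduce(x, '(h1 h2) (w1 w2) -> h1 w1', 'max', h2=2, w2=2)` on a 2D list (my own illustration, not library code):

```python
# Plain-Python sketch of 2x2 max-pooling, i.e. the semantics of
# reduce(x, '(h1 h2) (w1 w2) -> h1 w1', 'max', h2=2, w2=2):
# every output cell is the max over a non-overlapping 2x2 tile.
def max_pool_2x2(grid):
    h, w = len(grid), len(grid[0])
    assert h % 2 == 0 and w % 2 == 0, "axes must be divisible by the kernel size"
    return [
        [
            max(grid[2 * i][2 * j], grid[2 * i][2 * j + 1],
                grid[2 * i + 1][2 * j], grid[2 * i + 1][2 * j + 1])
            for j in range(w // 2)
        ]
        for i in range(h // 2)
    ]

x = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [9, 10, 11, 12],
     [13, 14, 15, 16]]
print(max_pool_2x2(x))  # [[6, 8], [14, 16]]
```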
>>> reduce(x, 'b c (h1 h2) (w1 w2) -> b c h1 w1', 'max', h1=3, w1=4).shape (10, 20, 3, 4) # Global average pooling >>> reduce(x, 'b c h w -> b c', 'mean').shape (10, 20) # subtracting mean over batch for each channel; # similar to x - np.mean(x, axis=(0, 2, 3), keepdims=True) >>> y = x - reduce(x, 'b c h w -> 1 c 1 1', 'mean') # Subtracting per-image mean for each channel >>> y = x - reduce(x, 'b c h w -> b c 1 1', 'mean') # same as previous, but using empty compositions >>> y = x - reduce(x, 'b c h w -> b c () ()', 'mean') ``` Parameters: tensor: tensor: tensor of any supported library (e.g. numpy.ndarray, tensorflow, pytorch). list of tensors is also accepted, those should be of the same type and shape pattern: string, reduction pattern reduction: one of available reductions ('min', 'max', 'sum', 'mean', 'prod', 'any', 'all'). Alternatively, a callable f(tensor, reduced_axes) -> tensor can be provided. This allows using various reductions like: np.max, np.nanmean, tf.reduce_logsumexp, torch.var, etc. axes_lengths: any additional specifications for dimensions Returns: tensor of the same type as input """ try: if isinstance(tensor, list): if len(tensor) == 0: raise TypeError("Rearrange/Reduce/Repeat can't be applied to an empty list") backend = get_backend(tensor[0]) tensor = backend.stack_on_zeroth_dimension(tensor) else: backend = get_backend(tensor) hashable_axes_lengths = tuple(axes_lengths.items()) shape = backend.shape(tensor) recipe = _prepare_transformation_recipe(pattern, reduction, axes_names=tuple(axes_lengths), ndim=len(shape)) return _apply_recipe( backend, recipe, cast(Tensor, tensor), reduction_type=reduction, axes_lengths=hashable_axes_lengths ) except EinopsError as e: message = ' Error while processing {}-reduction pattern "{}".'.format(reduction, pattern) if not isinstance(tensor, list): message += "\n Input tensor shape: {}. ".format(shape) else: message += "\n Input is list. 
" message += "Additional info: {}.".format(axes_lengths) raise EinopsError(message + "\n {}".format(e)) def rearrange(tensor: Union[Tensor, List[Tensor]], pattern: str, **axes_lengths: Size) -> Tensor: """ einops.rearrange is a reader-friendly smart element reordering for multidimensional tensors. This operation includes functionality of transpose (axes permutation), reshape (view), squeeze, unsqueeze, stack, concatenate and other operations. Examples: ```python # suppose we have a set of 32 images in "h w c" format (height-width-channel) >>> images = [np.random.randn(30, 40, 3) for _ in range(32)] # stack along first (batch) axis, output is a single array >>> rearrange(images, 'b h w c -> b h w c').shape (32, 30, 40, 3) # stacked and reordered axes to "b c h w" format >>> rearrange(images, 'b h w c -> b c h w').shape (32, 3, 30, 40) # concatenate images along height (vertical axis), 960 = 32 * 30 >>> rearrange(images, 'b h w c -> (b h) w c').shape (960, 40, 3) # concatenated images along horizontal axis, 1280 = 32 * 40 >>> rearrange(images, 'b h w c -> h (b w) c').shape (30, 1280, 3) # flattened each image into a vector, 3600 = 30 * 40 * 3 >>> rearrange(images, 'b h w c -> b (c h w)').shape (32, 3600) # split each image into 4 smaller (top-left, top-right, bottom-left, bottom-right), 128 = 32 * 2 * 2 >>> rearrange(images, 'b (h1 h) (w1 w) c -> (b h1 w1) h w c', h1=2, w1=2).shape (128, 15, 20, 3) # space-to-depth operation >>> rearrange(images, 'b (h h1) (w w1) c -> b h w (c h1 w1)', h1=2, w1=2).shape (32, 15, 20, 12) ``` When composing axes, C-order enumeration used (consecutive elements have different last axis). Find more examples in einops tutorial. Parameters: tensor: tensor of any supported library (e.g. numpy.ndarray, tensorflow, pytorch). list of tensors is also accepted, those should be of the same type and shape pattern: string, rearrangement pattern axes_lengths: any additional specifications for dimensions Returns: tensor of the same type as input. 
If possible, a view to the original tensor is returned. """ return reduce(tensor, pattern, reduction="rearrange", **axes_lengths) def repeat(tensor: Union[Tensor, List[Tensor]], pattern: str, **axes_lengths: Size) -> Tensor: """ einops.repeat allows reordering elements and repeating them in arbitrary combinations. This operation includes functionality of repeat, tile, and broadcast functions. Examples for repeat operation: ```python # a grayscale image (of shape height x width) >>> image = np.random.randn(30, 40) # change it to RGB format by repeating in each channel >>> repeat(image, 'h w -> h w c', c=3).shape (30, 40, 3) # repeat image 2 times along height (vertical axis) >>> repeat(image, 'h w -> (repeat h) w', repeat=2).shape (60, 40) # repeat image 2 time along height and 3 times along width >>> repeat(image, 'h w -> (h2 h) (w3 w)', h2=2, w3=3).shape (60, 120) # convert each pixel to a small square 2x2. Upsample image by 2x >>> repeat(image, 'h w -> (h h2) (w w2)', h2=2, w2=2).shape (60, 80) # pixelate image first by downsampling by 2x, then upsampling >>> downsampled = reduce(image, '(h h2) (w w2) -> h w', 'mean', h2=2, w2=2) >>> repeat(downsampled, 'h w -> (h h2) (w w2)', h2=2, w2=2).shape (30, 40) ``` When composing axes, C-order enumeration used (consecutive elements have different last axis). Find more examples in einops tutorial. Parameters: tensor: tensor of any supported library (e.g. numpy.ndarray, tensorflow, pytorch). list of tensors is also accepted, those should be of the same type and shape pattern: string, rearrangement pattern axes_lengths: any additional specifications for dimensions Returns: Tensor of the same type as input. If possible, a view to the original tensor is returned. """ return reduce(tensor, pattern, reduction="repeat", **axes_lengths) def parse_shape(x: Tensor, pattern: str) -> dict: """ Parse a tensor shape to dictionary mapping axes names to their lengths. ```python # Use underscore to skip the dimension in parsing. 
>>> x = np.zeros([2, 3, 5, 7]) >>> parse_shape(x, 'batch _ h w') {'batch': 2, 'h': 5, 'w': 7} # `parse_shape` output can be used to specify axes_lengths for other operations: >>> y = np.zeros([700]) >>> rearrange(y, '(b c h w) -> b c h w', **parse_shape(x, 'b _ h w')).shape (2, 10, 5, 7) ``` For symbolic frameworks may return symbols, not integers. Parameters: x: tensor of any supported framework pattern: str, space separated names for axes, underscore means skip axis Returns: dict, maps axes names to their lengths """ exp = ParsedExpression(pattern, allow_underscore=True) shape = get_backend(x).shape(x) if exp.has_composed_axes(): raise RuntimeError(f"Can't parse shape with composite axes: {pattern} {shape}") if len(shape) != len(exp.composition): if exp.has_ellipsis: if len(shape) < len(exp.composition) - 1: raise RuntimeError(f"Can't parse shape with this number of dimensions: {pattern} {shape}") else: raise RuntimeError(f"Can't parse shape with different number of dimensions: {pattern} {shape}") if exp.has_ellipsis: ellipsis_idx = exp.composition.index(_ellipsis) composition = ( exp.composition[:ellipsis_idx] + ["_"] * (len(shape) - len(exp.composition) + 1) + exp.composition[ellipsis_idx + 1 :] ) else: composition = exp.composition result = {} for axes, axis_length in zip(composition, shape): # type: ignore # axes either [], or [AnonymousAxis] or ['axis_name'] if len(axes) == 0: if axis_length != 1: raise RuntimeError(f"Length of axis is not 1: {pattern} {shape}") else: [axis] = axes if isinstance(axis, str): if axis != "_": result[axis] = axis_length else: if axis.value != axis_length: raise RuntimeError(f"Length of anonymous axis does not match: {pattern} {shape}") return result # _enumerate_directions is not exposed in the public API def _enumerate_directions(x): """ For an n-dimensional tensor, returns tensors to enumerate each axis. 
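The reshaping trick behind this helper can be shown shape-only in plain Python (a hedged sketch; the real function returns backend tensors built with arange):

```python
# Shape-only sketch of the ogrid-like trick used above: for each axis, an
# arange of that axis's length is reshaped to be broadcastable along all
# other axes (length 1 everywhere else).
def direction_shapes(shape):
    result = []
    for axis_id, axis_length in enumerate(shape):
        s = [1] * len(shape)
        s[axis_id] = axis_length
        result.append(s)
    return result

print(direction_shapes([2, 3, 4]))  # [[2, 1, 1], [1, 3, 1], [1, 1, 4]]
```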
```python x = np.zeros([2, 3, 4]) # or any other tensor i, j, k = _enumerate_directions(x) result = i + 2*j + 3*k ``` `result[i, j, k] = i + 2j + 3k`, and also has the same shape as result Works very similarly to numpy.ogrid (open indexing grid) """ backend = get_backend(x) shape = backend.shape(x) result = [] for axis_id, axis_length in enumerate(shape): shape = [1] * len(shape) shape[axis_id] = axis_length result.append(backend.reshape(backend.arange(0, axis_length), shape)) return result # to avoid importing numpy np_ndarray = Any def asnumpy(tensor: Tensor) -> np_ndarray: """ Convert a tensor of an imperative framework (i.e. numpy/cupy/torch/jax/etc.) to `numpy.ndarray` Parameters: tensor: tensor of any known imperative framework Returns: `numpy.ndarray`, converted to numpy """ return get_backend(tensor).to_numpy(tensor) def _validate_einsum_axis_name(axis_name): if len(axis_name) == 0: raise NotImplementedError("Singleton () axes are not yet supported in einsum.") if len(axis_name) > 1: raise NotImplementedError("Shape rearrangement is not yet supported in einsum.") axis_name = axis_name[0] if isinstance(axis_name, AnonymousAxis): raise NotImplementedError("Anonymous axes are not yet supported in einsum.") if len(axis_name) == 0: raise RuntimeError("Encountered empty axis name in einsum.") if not isinstance(axis_name, str): raise RuntimeError("Axis name in einsum must be a string.") @functools.lru_cache(256) def _compactify_pattern_for_einsum(pattern: str) -> str: if "->" not in pattern: # numpy allows this, so make sure users # don't accidentally do something like this. 
raise ValueError("Einsum pattern must contain '->'.") lefts_str, right_str = pattern.split("->") lefts = [ParsedExpression(left, allow_underscore=True, allow_duplicates=True) for left in lefts_str.split(",")] right = ParsedExpression(right_str, allow_underscore=True) # Start from 'a' and go up to 'Z' output_axis_names = string.ascii_letters i = 0 axis_name_mapping = {} left_patterns = [] for left in lefts: left_pattern = "" for raw_axis_name in left.composition: if raw_axis_name == _ellipsis: left_pattern += "..." continue _validate_einsum_axis_name(raw_axis_name) axis_name = raw_axis_name[0] if axis_name not in axis_name_mapping: if i >= len(output_axis_names): raise RuntimeError("Too many axes in einsum.") axis_name_mapping[axis_name] = output_axis_names[i] i += 1 left_pattern += axis_name_mapping[axis_name] left_patterns.append(left_pattern) compact_pattern = ",".join(left_patterns) + "->" for raw_axis_name in right.composition: if raw_axis_name == _ellipsis: compact_pattern += "..." continue _validate_einsum_axis_name(raw_axis_name) axis_name = raw_axis_name[0] if axis_name not in axis_name_mapping: raise EinopsError(f"Unknown axis {axis_name} on right side of einsum {pattern}.") compact_pattern += axis_name_mapping[axis_name] return compact_pattern @typing.overload def einsum(tensor: Tensor, pattern: str, /) -> Tensor: ... @typing.overload def einsum(tensor1: Tensor, tensor2: Tensor, pattern: str, /) -> Tensor: ... @typing.overload def einsum(tensor1: Tensor, tensor2: Tensor, tensor3: Tensor, pattern: str, /) -> Tensor: ... @typing.overload def einsum(tensor1: Tensor, tensor2: Tensor, tensor3: Tensor, tensor4: Tensor, pattern: str, /) -> Tensor: ... def einsum(*tensors_and_pattern: Union[Tensor, str]) -> Tensor: r""" einops.einsum calls einsum operations with einops-style named axes indexing, computing tensor products with an arbitrary number of tensors. Unlike typical einsum syntax, here you must pass tensors first, and then the pattern. 
Also, note that rearrange operations such as `"(batch chan) out"`, or singleton axes `()`, are not currently supported. Examples: For a given pattern such as: ```python >>> x, y, z = np.random.randn(3, 20, 20, 20) >>> output = einsum(x, y, z, "a b c, c b d, a g k -> a b k") ``` the following formula is computed: ```tex output[a, b, k] = \sum_{c, d, g} x[a, b, c] * y[c, b, d] * z[a, g, k] ``` where the summation over `c`, `d`, and `g` is performed because those axes names do not appear on the right-hand side. Let's see some additional examples: ```python # Filter a set of images: >>> batched_images = np.random.randn(128, 16, 16) >>> filters = np.random.randn(16, 16, 30) >>> result = einsum(batched_images, filters, ... "batch h w, h w channel -> batch channel") >>> result.shape (128, 30) # Matrix multiplication, with an unknown input shape: >>> batch_shape = (50, 30) >>> data = np.random.randn(*batch_shape, 20) >>> weights = np.random.randn(10, 20) >>> result = einsum(weights, data, ... "out_dim in_dim, ... in_dim -> ... out_dim") >>> result.shape (50, 30, 10) # Matrix trace on a single tensor: >>> matrix = np.random.randn(10, 10) >>> result = einsum(matrix, "i i ->") >>> result.shape () ``` Parameters: tensors_and_pattern: tensors: tensors of any supported library (numpy, tensorflow, pytorch, jax). pattern: string, einsum pattern, with commas separating specifications for each tensor. pattern should be provided after all tensors. Returns: Tensor of the same type as input, after processing with einsum. """ if len(tensors_and_pattern) <= 1: raise ValueError( "`einops.einsum` takes at minimum two arguments: the tensors (at least one), followed by the pattern." ) pattern = tensors_and_pattern[-1] if not isinstance(pattern, str): raise ValueError( "The last argument passed to `einops.einsum` must be a string, representing the einsum pattern." 
) tensors = tensors_and_pattern[:-1] pattern = _compactify_pattern_for_einsum(pattern) return get_backend(tensors[0]).einsum(pattern, *tensors) arogozhnikov-einops-ad2c8d6/einops/experimental/000077500000000000000000000000001475201674600222305ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/einops/experimental/__init__.py000066400000000000000000000000001475201674600243270ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/einops/experimental/indexing.py000066400000000000000000000001711475201674600244060ustar00rootroot00000000000000""" This file contained some thoughts on indexing. These ideas were developed further in eindex (separate package). """ arogozhnikov-einops-ad2c8d6/einops/layers/000077500000000000000000000000001475201674600210325ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/einops/layers/__init__.py000066400000000000000000000072431475201674600231510ustar00rootroot00000000000000__author__ = "Alex Rogozhnikov" from typing import Any, Dict from ..einops import TransformRecipe, _apply_recipe, _prepare_recipes_for_all_dims, get_backend from .. import EinopsError class RearrangeMixin: """ Rearrange layer behaves identically to einops.rearrange operation. :param pattern: str, rearrangement pattern :param axes_lengths: any additional specification of dimensions See einops.rearrange for source_examples. 
""" def __init__(self, pattern: str, **axes_lengths: Any) -> None: super().__init__() self.pattern = pattern self.axes_lengths = axes_lengths # self._recipe = self.recipe() # checking parameters self._multirecipe = self.multirecipe() self._axes_lengths = tuple(self.axes_lengths.items()) def __repr__(self) -> str: params = repr(self.pattern) for axis, length in self.axes_lengths.items(): params += ", {}={}".format(axis, length) return "{}({})".format(self.__class__.__name__, params) def multirecipe(self) -> Dict[int, TransformRecipe]: try: return _prepare_recipes_for_all_dims( self.pattern, operation="rearrange", axes_names=tuple(self.axes_lengths) ) except EinopsError as e: raise EinopsError(" Error while preparing {!r}\n {}".format(self, e)) def _apply_recipe(self, x): backend = get_backend(x) return _apply_recipe( backend=backend, recipe=self._multirecipe[len(x.shape)], tensor=x, reduction_type="rearrange", axes_lengths=self._axes_lengths, ) def __getstate__(self): return {"pattern": self.pattern, "axes_lengths": self.axes_lengths} def __setstate__(self, state): self.__init__(pattern=state["pattern"], **state["axes_lengths"]) class ReduceMixin: """ Reduce layer behaves identically to einops.reduce operation. :param pattern: str, rearrangement pattern :param reduction: one of available reductions ('min', 'max', 'sum', 'mean', 'prod'), case-sensitive :param axes_lengths: any additional specification of dimensions See einops.reduce for source_examples. 
""" def __init__(self, pattern: str, reduction: str, **axes_lengths: Any): super().__init__() self.pattern = pattern self.reduction = reduction self.axes_lengths = axes_lengths self._multirecipe = self.multirecipe() self._axes_lengths = tuple(self.axes_lengths.items()) def __repr__(self): params = "{!r}, {!r}".format(self.pattern, self.reduction) for axis, length in self.axes_lengths.items(): params += ", {}={}".format(axis, length) return "{}({})".format(self.__class__.__name__, params) def multirecipe(self) -> Dict[int, TransformRecipe]: try: return _prepare_recipes_for_all_dims( self.pattern, operation=self.reduction, axes_names=tuple(self.axes_lengths) ) except EinopsError as e: raise EinopsError(" Error while preparing {!r}\n {}".format(self, e)) def _apply_recipe(self, x): backend = get_backend(x) return _apply_recipe( backend=backend, recipe=self._multirecipe[len(x.shape)], tensor=x, reduction_type=self.reduction, axes_lengths=self._axes_lengths, ) def __getstate__(self): return {"pattern": self.pattern, "reduction": self.reduction, "axes_lengths": self.axes_lengths} def __setstate__(self, state): self.__init__(pattern=state["pattern"], reduction=state["reduction"], **state["axes_lengths"]) arogozhnikov-einops-ad2c8d6/einops/layers/_einmix.py000066400000000000000000000255651475201674600230510ustar00rootroot00000000000000from typing import Any, List, Optional, Dict from einops import EinopsError from einops.parsing import ParsedExpression, _ellipsis import warnings import string from ..einops import _product def _report_axes(axes: set, report_message: str): if len(axes) > 0: raise EinopsError(report_message.format(axes)) class _EinmixMixin: def __init__(self, pattern: str, weight_shape: str, bias_shape: Optional[str] = None, **axes_lengths: Any): """ EinMix - Einstein summation with automated tensor management and axis packing/unpacking. 
EinMix is a combination of einops and MLP, see tutorial: https://github.com/arogozhnikov/einops/blob/main/docs/3-einmix-layer.ipynb Imagine taking einsum with two arguments, one of each input, and one - tensor with weights >>> einsum('time batch channel_in, channel_in channel_out -> time batch channel_out', input, weight) This layer manages weights for you, syntax highlights a special role of weight matrix >>> EinMix('time batch channel_in -> time batch channel_out', weight_shape='channel_in channel_out') But otherwise it is the same einsum under the hood. Plus einops-rearrange. Simple linear layer with a bias term (you have one like that in your framework) >>> EinMix('t b cin -> t b cout', weight_shape='cin cout', bias_shape='cout', cin=10, cout=20) There is no restriction to mix the last axis. Let's mix along height >>> EinMix('h w c-> hout w c', weight_shape='h hout', bias_shape='hout', h=32, hout=32) Example of channel-wise multiplication (like one used in normalizations) >>> EinMix('t b c -> t b c', weight_shape='c', c=128) Multi-head linear layer (each head is own linear layer): >>> EinMix('t b (head cin) -> t b (head cout)', weight_shape='head cin cout', ...) ... and yes, you need to specify all dimensions of weight shape/bias shape in parameters. Use cases: - when channel dimension is not last, use EinMix, not transposition - patch/segment embeddings - when need only within-group connections to reduce number of weights and computations - next-gen MLPs (follow tutorial link above to learn more!) - in general, any time you want to combine linear layer and einops.rearrange Uniform He initialization is applied to weight tensor. This accounts for the number of elements mixed and produced. Parameters :param pattern: transformation pattern, left side - dimensions of input, right side - dimensions of output :param weight_shape: axes of weight. A tensor of this shape is created, stored, and optimized in a layer If bias_shape is not specified, bias is not created. 
:param bias_shape: axes of bias added to output. Weights of this shape are created and stored. If `None` (the default), no bias is added. :param axes_lengths: dimensions of weight tensor """ super().__init__() self.pattern = pattern self.weight_shape = weight_shape self.bias_shape = bias_shape self.axes_lengths = axes_lengths self.initialize_einmix( pattern=pattern, weight_shape=weight_shape, bias_shape=bias_shape, axes_lengths=axes_lengths ) def initialize_einmix(self, pattern: str, weight_shape: str, bias_shape: Optional[str], axes_lengths: dict): left_pattern, right_pattern = pattern.split("->") left = ParsedExpression(left_pattern) right = ParsedExpression(right_pattern) weight = ParsedExpression(weight_shape) _report_axes( set.difference(right.identifiers, {*left.identifiers, *weight.identifiers}), "Unrecognized identifiers on the right side of EinMix {}", ) if weight.has_ellipsis: raise EinopsError("Ellipsis is not supported in weight, as its shape should be fully specified") if left.has_ellipsis or right.has_ellipsis: if not (left.has_ellipsis and right.has_ellipsis): raise EinopsError(f"Ellipsis in EinMix should be on both sides, {pattern}") if left.has_ellipsis_parenthesized: raise EinopsError(f"Ellipsis on left side can't be in parenthesis, got {pattern}") if any(x.has_non_unitary_anonymous_axes for x in [left, right, weight]): raise EinopsError("Anonymous axes (numbers) are not allowed in EinMix") if "(" in weight_shape or ")" in weight_shape: raise EinopsError(f"Parenthesis is not allowed in weight shape: {weight_shape}") pre_reshape_pattern = None pre_reshape_lengths = None post_reshape_pattern = None if any(len(group) != 1 for group in left.composition): names: List[str] = [] for group in left.composition: names += group names = [name if name != _ellipsis else "..." 
for name in names] composition = " ".join(names) pre_reshape_pattern = f"{left_pattern}-> {composition}" pre_reshape_lengths = {name: length for name, length in axes_lengths.items() if name in names} if any(len(group) != 1 for group in right.composition) or right.has_ellipsis_parenthesized: names = [] for group in right.composition: names += group names = [name if name != _ellipsis else "..." for name in names] composition = " ".join(names) post_reshape_pattern = f"{composition} ->{right_pattern}" self._create_rearrange_layers(pre_reshape_pattern, pre_reshape_lengths, post_reshape_pattern, {}) for axis in weight.identifiers: if axis not in axes_lengths: raise EinopsError("Dimension {} of weight should be specified".format(axis)) _report_axes( set.difference(set(axes_lengths), {*left.identifiers, *weight.identifiers}), "Axes {} are not used in pattern", ) _report_axes( set.difference(weight.identifiers, {*left.identifiers, *right.identifiers}), "Weight axes {} are redundant" ) if len(weight.identifiers) == 0: warnings.warn("EinMix: weight has no dimensions (means multiplication by a number)") _weight_shape = [axes_lengths[axis] for (axis,) in weight.composition] # single output element is a combination of fan_in input elements _fan_in = _product([axes_lengths[axis] for (axis,) in weight.composition if axis not in right.identifiers]) if bias_shape is not None: # maybe I should put ellipsis in the beginning for simplicity? 
if not isinstance(bias_shape, str): raise EinopsError("bias shape should be string specifying which axes bias depends on") bias = ParsedExpression(bias_shape) _report_axes( set.difference(bias.identifiers, right.identifiers), "Bias axes {} not present in output", ) _report_axes( set.difference(bias.identifiers, set(axes_lengths)), "Sizes not provided for bias axes {}", ) _bias_shape = [] used_non_trivial_size = False for axes in right.composition: if axes == _ellipsis: if used_non_trivial_size: raise EinopsError("all bias dimensions should go after ellipsis in the output") else: # handles ellipsis correctly for axis in axes: if axis == _ellipsis: if used_non_trivial_size: raise EinopsError("all bias dimensions should go after ellipsis in the output") elif axis in bias.identifiers: _bias_shape.append(axes_lengths[axis]) used_non_trivial_size = True else: _bias_shape.append(1) else: _bias_shape = None weight_bound = (3 / _fan_in) ** 0.5 bias_bound = (1 / _fan_in) ** 0.5 self._create_parameters(_weight_shape, weight_bound, _bias_shape, bias_bound) # rewrite einsum expression with single-letter latin identifiers so that # expression will be understood by any framework mapped_identifiers = {*left.identifiers, *right.identifiers, *weight.identifiers} if _ellipsis in mapped_identifiers: mapped_identifiers.remove(_ellipsis) mapped_identifiers = list(sorted(mapped_identifiers)) mapping2letters = {k: letter for letter, k in zip(string.ascii_lowercase, mapped_identifiers)} mapping2letters[_ellipsis] = "..." 
# preserve ellipsis def write_flat_remapped(axes: ParsedExpression): result = [] for composed_axis in axes.composition: if isinstance(composed_axis, list): result.extend([mapping2letters[axis] for axis in composed_axis]) else: assert composed_axis == _ellipsis result.append("...") return "".join(result) self.einsum_pattern: str = "{},{}->{}".format( write_flat_remapped(left), write_flat_remapped(weight), write_flat_remapped(right), ) def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): raise NotImplementedError("Should be defined in framework implementations") def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): """Shape and implementations""" raise NotImplementedError("Should be defined in framework implementations") def __repr__(self): params = repr(self.pattern) params += f", '{self.weight_shape}'" if self.bias_shape is not None: params += f", '{self.bias_shape}'" for axis, length in self.axes_lengths.items(): params += ", {}={}".format(axis, length) return "{}({})".format(self.__class__.__name__, params) class _EinmixDebugger(_EinmixMixin): """Used only to test mixin""" def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_reshape_pattern = pre_reshape_pattern self.pre_reshape_lengths = pre_reshape_lengths self.post_reshape_pattern = post_reshape_pattern self.post_reshape_lengths = post_reshape_lengths def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): self.saved_weight_shape = weight_shape self.saved_bias_shape = bias_shape arogozhnikov-einops-ad2c8d6/einops/layers/flax.py000066400000000000000000000047501475201674600223440ustar00rootroot00000000000000from dataclasses import field from typing import Optional, Dict, cast import 
flax.linen as nn import jax import jax.numpy as jnp from . import RearrangeMixin, ReduceMixin from ._einmix import _EinmixMixin __author__ = "Alex Rogozhnikov" class Reduce(nn.Module): pattern: str reduction: str sizes: dict = field(default_factory=lambda: {}) def setup(self): self.reducer = ReduceMixin(self.pattern, self.reduction, **self.sizes) def __call__(self, input): return self.reducer._apply_recipe(input) class Rearrange(nn.Module): pattern: str sizes: dict = field(default_factory=lambda: {}) def setup(self): self.rearranger = RearrangeMixin(self.pattern, **self.sizes) def __call__(self, input): return self.rearranger._apply_recipe(input) class EinMix(nn.Module, _EinmixMixin): pattern: str weight_shape: str bias_shape: Optional[str] = None sizes: dict = field(default_factory=lambda: {}) def setup(self): self.initialize_einmix( pattern=self.pattern, weight_shape=self.weight_shape, bias_shape=self.bias_shape, axes_lengths=self.sizes, ) def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): self.weight = self.param("weight", jax.nn.initializers.uniform(weight_bound), weight_shape) if bias_shape is not None: self.bias = self.param("bias", jax.nn.initializers.uniform(bias_bound), bias_shape) else: self.bias = None def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_rearrange = None if pre_reshape_pattern is not None: self.pre_rearrange = Rearrange(pre_reshape_pattern, sizes=cast(dict, pre_reshape_lengths)) self.post_rearrange = None if post_reshape_pattern is not None: self.post_rearrange = Rearrange(post_reshape_pattern, sizes=cast(dict, post_reshape_lengths)) def __call__(self, input): if self.pre_rearrange is not None: input = self.pre_rearrange(input) result = jnp.einsum(self.einsum_pattern, input, self.weight) if self.bias is not None: result += self.bias if self.post_rearrange is not 
None: result = self.post_rearrange(result) return result arogozhnikov-einops-ad2c8d6/einops/layers/keras.py000066400000000000000000000003241475201674600225100ustar00rootroot00000000000000__author__ = "Alex Rogozhnikov" from ..layers.tensorflow import Rearrange, Reduce, EinMix keras_custom_objects = { Rearrange.__name__: Rearrange, Reduce.__name__: Reduce, EinMix.__name__: EinMix, } arogozhnikov-einops-ad2c8d6/einops/layers/oneflow.py000066400000000000000000000035101475201674600230540ustar00rootroot00000000000000from typing import Optional, Dict, cast import oneflow as flow from . import RearrangeMixin, ReduceMixin from ._einmix import _EinmixMixin __author__ = "Tianhe Ren & Depeng Liang" class Rearrange(RearrangeMixin, flow.nn.Module): def forward(self, input): return self._apply_recipe(input) class Reduce(ReduceMixin, flow.nn.Module): def forward(self, input): return self._apply_recipe(input) class EinMix(_EinmixMixin, flow.nn.Module): def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): self.weight = flow.nn.Parameter( flow.zeros(weight_shape).uniform_(-weight_bound, weight_bound), requires_grad=True ) if bias_shape is not None: self.bias = flow.nn.Parameter(flow.zeros(bias_shape).uniform_(-bias_bound, bias_bound), requires_grad=True) else: self.bias = None def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_rearrange = None if pre_reshape_pattern is not None: self.pre_rearrange = Rearrange(pre_reshape_pattern, **cast(dict, pre_reshape_lengths)) self.post_rearrange = None if post_reshape_pattern is not None: self.post_rearrange = Rearrange(post_reshape_pattern, **cast(dict, post_reshape_lengths)) def forward(self, input): if self.pre_rearrange is not None: input = self.pre_rearrange(input) result = flow.einsum(self.einsum_pattern, input, self.weight) if self.bias is not None: result += 
self.bias if self.post_rearrange is not None: result = self.post_rearrange(result) return result arogozhnikov-einops-ad2c8d6/einops/layers/paddle.py000066400000000000000000000035631475201674600226440ustar00rootroot00000000000000from typing import Optional, Dict, cast import paddle from . import RearrangeMixin, ReduceMixin from ._einmix import _EinmixMixin __author__ = "PaddlePaddle" class Rearrange(RearrangeMixin, paddle.nn.Layer): def forward(self, input): return self._apply_recipe(input) class Reduce(ReduceMixin, paddle.nn.Layer): def forward(self, input): return self._apply_recipe(input) class EinMix(_EinmixMixin, paddle.nn.Layer): def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): self.weight = self.create_parameter( weight_shape, default_initializer=paddle.nn.initializer.Uniform(-weight_bound, weight_bound) ) if bias_shape is not None: self.bias = self.create_parameter( bias_shape, default_initializer=paddle.nn.initializer.Uniform(-bias_bound, bias_bound) ) else: self.bias = None def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_rearrange = None if pre_reshape_pattern is not None: self.pre_rearrange = Rearrange(pre_reshape_pattern, **cast(dict, pre_reshape_lengths)) self.post_rearrange = None if post_reshape_pattern is not None: self.post_rearrange = Rearrange(post_reshape_pattern, **cast(dict, post_reshape_lengths)) def forward(self, input): if self.pre_rearrange is not None: input = self.pre_rearrange(input) result = paddle.einsum(self.einsum_pattern, input, self.weight) if self.bias is not None: result += self.bias if self.post_rearrange is not None: result = self.post_rearrange(result) return result arogozhnikov-einops-ad2c8d6/einops/layers/tensorflow.py000066400000000000000000000063741475201674600236200ustar00rootroot00000000000000""" Comment about tensorflow layers: unfortunately 
instructions on creation of TF layers change constantly, and changed way too many times at this point to remember what-compatible-where. Layers in einops==0.7.0 (and several prior versions) are compatible with TF 2.13 Layers in einops==0.8.0 were re-implemented according to official instructions for TF 2.16 """ from typing import Optional, Dict, cast import tensorflow as tf from tensorflow.keras.layers import Layer from . import RearrangeMixin, ReduceMixin from ._einmix import _EinmixMixin __author__ = "Alex Rogozhnikov" class Rearrange(RearrangeMixin, Layer): def build(self, input_shape): pass # layer does not have any parameters to be initialized def call(self, inputs): return self._apply_recipe(inputs) def get_config(self): return {"pattern": self.pattern, **self.axes_lengths} class Reduce(ReduceMixin, Layer): def build(self, input_shape): pass # layer does not have any parameters to be initialized def call(self, inputs): return self._apply_recipe(inputs) def get_config(self): return {"pattern": self.pattern, "reduction": self.reduction, **self.axes_lengths} class EinMix(_EinmixMixin, Layer): def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): # this method is called in __init__, # but we postpone actual creation to build(), as TF instruction suggests self._params = [weight_shape, weight_bound, bias_shape, bias_bound] def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_rearrange = None if pre_reshape_pattern is not None: self.pre_rearrange = Rearrange(pre_reshape_pattern, **cast(dict, pre_reshape_lengths)) self.post_rearrange = None if post_reshape_pattern is not None: self.post_rearrange = Rearrange(post_reshape_pattern, **cast(dict, post_reshape_lengths)) def build(self, input_shape): [weight_shape, weight_bound, bias_shape, bias_bound] = self._params self.weight = self.add_weight( 
shape=weight_shape, initializer=tf.random_uniform_initializer(-weight_bound, weight_bound), trainable=True, ) if bias_shape is not None: self.bias = self.add_weight( shape=bias_shape, initializer=tf.random_uniform_initializer(-bias_bound, bias_bound), trainable=True, ) else: self.bias = None def call(self, inputs): if self.pre_rearrange is not None: inputs = self.pre_rearrange(inputs) result = tf.einsum(self.einsum_pattern, inputs, self.weight) if self.bias is not None: result = result + self.bias if self.post_rearrange is not None: result = self.post_rearrange(result) return result def get_config(self): return { "pattern": self.pattern, "weight_shape": self.weight_shape, "bias_shape": self.bias_shape, **self.axes_lengths, } arogozhnikov-einops-ad2c8d6/einops/layers/torch.py000066400000000000000000000045371475201674600225340ustar00rootroot00000000000000from typing import Optional, Dict, cast import torch from . import RearrangeMixin, ReduceMixin from ._einmix import _EinmixMixin from .._torch_specific import apply_for_scriptable_torch __author__ = "Alex Rogozhnikov" class Rearrange(RearrangeMixin, torch.nn.Module): def forward(self, input): recipe = self._multirecipe[input.ndim] return apply_for_scriptable_torch(recipe, input, reduction_type="rearrange", axes_dims=self._axes_lengths) def _apply_recipe(self, x): # overriding parent method to prevent it's scripting pass class Reduce(ReduceMixin, torch.nn.Module): def forward(self, input): recipe = self._multirecipe[input.ndim] return apply_for_scriptable_torch(recipe, input, reduction_type=self.reduction, axes_dims=self._axes_lengths) def _apply_recipe(self, x): # overriding parent method to prevent it's scripting pass class EinMix(_EinmixMixin, torch.nn.Module): def _create_parameters(self, weight_shape, weight_bound, bias_shape, bias_bound): self.weight = torch.nn.Parameter( torch.zeros(weight_shape).uniform_(-weight_bound, weight_bound), requires_grad=True ) if bias_shape is not None: self.bias = 
torch.nn.Parameter( torch.zeros(bias_shape).uniform_(-bias_bound, bias_bound), requires_grad=True ) else: self.bias = None def _create_rearrange_layers( self, pre_reshape_pattern: Optional[str], pre_reshape_lengths: Optional[Dict], post_reshape_pattern: Optional[str], post_reshape_lengths: Optional[Dict], ): self.pre_rearrange = None if pre_reshape_pattern is not None: self.pre_rearrange = Rearrange(pre_reshape_pattern, **cast(dict, pre_reshape_lengths)) self.post_rearrange = None if post_reshape_pattern is not None: self.post_rearrange = Rearrange(post_reshape_pattern, **cast(dict, post_reshape_lengths)) def forward(self, input): if self.pre_rearrange is not None: input = self.pre_rearrange(input) result = torch.einsum(self.einsum_pattern, input, self.weight) if self.bias is not None: result += self.bias if self.post_rearrange is not None: result = self.post_rearrange(result) return result arogozhnikov-einops-ad2c8d6/einops/packing.py000066400000000000000000000167421475201674600215330ustar00rootroot00000000000000from functools import lru_cache from typing import List, Union, TypeVar, Tuple, Sequence from einops import EinopsError from einops._backends import get_backend from einops.parsing import ParsedExpression Tensor = TypeVar("Tensor") Shape = Union[Tuple[int, ...], List[int]] @lru_cache(maxsize=128) def analyze_pattern(pattern: str, opname: str) -> Tuple[int, int, int]: # Maybe some validation of identifiers? 
axes = pattern.split() axes_set = set(axes) if len(axes) != len(axes_set): raise EinopsError(f'Duplicates in axes names in {opname}(..., "{pattern}")') if "*" not in axes_set: raise EinopsError(f'No *-axis in {opname}(..., "{pattern}")') for axis in axes: if axis != "*": is_valid, reason = ParsedExpression.check_axis_name_return_reason(axis) if not is_valid: raise EinopsError(f'Invalid axis name {axis} in {opname}(..., "{pattern}")') n_axes_before = axes.index("*") n_axes_after = len(axes) - n_axes_before - 1 min_axes = n_axes_before + n_axes_after return n_axes_before, n_axes_after, min_axes def pack(tensors: Sequence[Tensor], pattern: str) -> Tuple[Tensor, List[Shape]]: """ Packs several tensors into one. See einops tutorial for introduction into packing (and how it replaces stack and concatenation). Parameters: tensors: tensors to be packed, can be of different dimensionality pattern: pattern that is shared for all inputs and output, e.g. "i j * k" or "batch seq *" Returns: (packed_tensor, packed_shapes aka PS) Example: ```python >>> from numpy import zeros as Z >>> inputs = [Z([2, 3, 5]), Z([2, 3, 7, 5]), Z([2, 3, 7, 9, 5])] >>> packed, ps = pack(inputs, 'i j * k') >>> packed.shape, ps ((2, 3, 71, 5), [(), (7,), (7, 9)]) ``` In this example, axes were matched to: i=2, j=3, k=5 based on order (first, second, and last). All other axes were 'packed' and concatenated. PS (packed shapes) contains information about axes that were matched to '*' in every input. Resulting tensor has as many elements as all inputs in total. Packing can be reversed with unpack, which additionally needs PS (packed shapes) to reconstruct order. ```python >>> inputs_unpacked = unpack(packed, ps, 'i j * k') >>> [x.shape for x in inputs_unpacked] [(2, 3, 5), (2, 3, 7, 5), (2, 3, 7, 9, 5)] ``` Read the tutorial for introduction and application scenarios. 
""" n_axes_before, n_axes_after, min_axes = analyze_pattern(pattern, "pack") # packing zero tensors is illegal backend = get_backend(tensors[0]) reshaped_tensors: List[Tensor] = [] packed_shapes: List[Shape] = [] for i, tensor in enumerate(tensors): shape = backend.shape(tensor) if len(shape) < min_axes: raise EinopsError( f"packed tensor #{i} (enumeration starts with 0) has shape {shape}, " f"while pattern {pattern} assumes at least {min_axes} axes" ) axis_after_packed_axes = len(shape) - n_axes_after packed_shapes.append(shape[n_axes_before:axis_after_packed_axes]) reshaped_tensors.append(backend.reshape(tensor, (*shape[:n_axes_before], -1, *shape[axis_after_packed_axes:]))) return backend.concat(reshaped_tensors, axis=n_axes_before), packed_shapes def prod(x: Shape) -> int: result = 1 for i in x: result *= i return result def unpack(tensor: Tensor, packed_shapes: List[Shape], pattern: str) -> List[Tensor]: """ Unpacks a single tensor into several by splitting over a selected axes. See einops tutorial for introduction into packing (and how it replaces stack and concatenation). Parameters: tensor: tensor to be unpacked packed_shapes: packed_shapes (aka PS) is a list of shapes that take place of '*' in each output. output will contain a single tensor for every provided shape pattern: pattern that is shared for input and all outputs, e.g. "i j * k" or "batch seq *", where * designates an axis to be unpacked Returns: list of tensors If framework supports views, results are views to the original tensor. Example: ```python >>> from numpy import zeros as Z >>> inputs = [Z([2, 3, 5]), Z([2, 3, 7, 5]), Z([2, 3, 7, 9, 5])] >>> packed, ps = pack(inputs, 'i j * k') >>> packed.shape, ps ((2, 3, 71, 5), [(), (7,), (7, 9)]) ``` In this example, axes were matched to: i=2, j=3, k=5 based on order (first, second, and last). All other axes were 'packed' and concatenated. PS (packed shapes) contains information about axes that were matched to '*' in every input. 
Resulting tensor has as many elements as all inputs in total. Packing can be reversed with unpack, which additionally needs PS (packed shapes) to reconstruct order. ```python >>> inputs_unpacked = unpack(packed, ps, 'i j * k') >>> [x.shape for x in inputs_unpacked] [(2, 3, 5), (2, 3, 7, 5), (2, 3, 7, 9, 5)] ``` Read the tutorial for introduction and application scenarios. """ n_axes_before, n_axes_after, min_axes = analyze_pattern(pattern, opname="unpack") backend = get_backend(tensor) input_shape = backend.shape(tensor) if len(input_shape) != n_axes_before + 1 + n_axes_after: raise EinopsError(f"unpack(..., {pattern}) received input of wrong dim with shape {input_shape}") unpacked_axis: int = n_axes_before lengths_of_composed_axes: List[int] = [-1 if -1 in p_shape else prod(p_shape) for p_shape in packed_shapes] n_unknown_composed_axes = sum(int(x == -1) for x in lengths_of_composed_axes) if n_unknown_composed_axes > 1: raise EinopsError( f"unpack(..., {pattern}) received more than one -1 in {packed_shapes} and can't infer dimensions" ) # following manipulations allow to skip some shape verifications # and leave it to backends # [[], [2, 3], [4], [-1, 5], [6]] < examples of packed_axis # split positions when computed should be # [0, 1, 7, 11, N-6 , N ], where N = length of axis split_positions = [0] * len(packed_shapes) + [input_shape[unpacked_axis]] if n_unknown_composed_axes == 0: for i, x in enumerate(lengths_of_composed_axes[:-1]): split_positions[i + 1] = split_positions[i] + x else: unknown_composed_axis: int = lengths_of_composed_axes.index(-1) for i in range(unknown_composed_axis): split_positions[i + 1] = split_positions[i] + lengths_of_composed_axes[i] for j in range(unknown_composed_axis + 1, len(lengths_of_composed_axes))[::-1]: split_positions[j] = split_positions[j + 1] - lengths_of_composed_axes[j] shape_start = input_shape[:unpacked_axis] shape_end = input_shape[unpacked_axis + 1 :] slice_filler = (slice(None, None),) * unpacked_axis try: return [ 
backend.reshape( # shortest way slice arbitrary axis tensor[(*slice_filler, slice(split_positions[i], split_positions[i + 1]))], (*shape_start, *element_shape, *shape_end), ) for i, element_shape in enumerate(packed_shapes) ] except Exception: # this hits if there is an error during reshapes, which means passed shapes were incorrect raise RuntimeError( f'Error during unpack(..., "{pattern}"): could not split axis of size {split_positions[-1]}' f" into requested {packed_shapes}" ) arogozhnikov-einops-ad2c8d6/einops/parsing.py000066400000000000000000000151321475201674600215520ustar00rootroot00000000000000from einops import EinopsError import keyword import warnings from typing import List, Optional, Set, Tuple, Union _ellipsis: str = "…" # NB, this is a single unicode symbol. String is used as it is not a list, but can be iterated class AnonymousAxis(object): """Important thing: all instances of this class are not equal to each other""" def __init__(self, value: str): self.value = int(value) if self.value <= 1: if self.value == 1: raise EinopsError("No need to create anonymous axis of length 1. Report this as an issue") else: raise EinopsError("Anonymous axis should have positive length, not {}".format(self.value)) def __repr__(self): return "{}-axis".format(str(self.value)) class ParsedExpression: """ non-mutable structure that contains information about one side of expression (e.g. 'b c (h w)') and keeps some information important for downstream """ def __init__(self, expression: str, *, allow_underscore: bool = False, allow_duplicates: bool = False): self.has_ellipsis: bool = False self.has_ellipsis_parenthesized: Optional[bool] = None self.identifiers: Set[str] = set() # that's axes like 2, 3, 4 or 5. 
Axes with size 1 are exceptional and replaced with empty composition self.has_non_unitary_anonymous_axes: bool = False # composition keeps structure of composite axes, see how different corner cases are handled in tests self.composition: List[Union[List[str], str]] = [] if "." in expression: if "..." not in expression: raise EinopsError("Expression may contain dots only inside ellipsis (...)") if str.count(expression, "...") != 1 or str.count(expression, ".") != 3: raise EinopsError( "Expression may contain dots only inside ellipsis (...); only one ellipsis for tensor " ) expression = expression.replace("...", _ellipsis) self.has_ellipsis = True bracket_group: Optional[List[str]] = None def add_axis_name(x): if x in self.identifiers: if not (allow_underscore and x == "_") and not allow_duplicates: raise EinopsError('Indexing expression contains duplicate dimension "{}"'.format(x)) if x == _ellipsis: self.identifiers.add(_ellipsis) if bracket_group is None: self.composition.append(_ellipsis) self.has_ellipsis_parenthesized = False else: bracket_group.append(_ellipsis) self.has_ellipsis_parenthesized = True else: is_number = str.isdecimal(x) if is_number and int(x) == 1: # handling the case of anonymous axis of length 1 if bracket_group is None: self.composition.append([]) else: pass # no need to think about 1s inside parenthesis return is_axis_name, reason = self.check_axis_name_return_reason(x, allow_underscore=allow_underscore) if not (is_number or is_axis_name): raise EinopsError("Invalid axis identifier: {}\n{}".format(x, reason)) if is_number: x = AnonymousAxis(x) self.identifiers.add(x) if is_number: self.has_non_unitary_anonymous_axes = True if bracket_group is None: self.composition.append([x]) else: bracket_group.append(x) current_identifier = None for char in expression: if char in "() ": if current_identifier is not None: add_axis_name(current_identifier) current_identifier = None if char == "(": if bracket_group is not None: raise EinopsError("Axis 
composition is one-level (brackets inside brackets not allowed)") bracket_group = [] elif char == ")": if bracket_group is None: raise EinopsError("Brackets are not balanced") self.composition.append(bracket_group) bracket_group = None elif str.isalnum(char) or char in ["_", _ellipsis]: if current_identifier is None: current_identifier = char else: current_identifier += char else: raise EinopsError("Unknown character '{}'".format(char)) if bracket_group is not None: raise EinopsError('Imbalanced parentheses in expression: "{}"'.format(expression)) if current_identifier is not None: add_axis_name(current_identifier) def flat_axes_order(self) -> List: result = [] for composed_axis in self.composition: assert isinstance(composed_axis, list), "does not work with ellipsis" for axis in composed_axis: result.append(axis) return result def has_composed_axes(self) -> bool: # this will ignore 1 inside brackets for axes in self.composition: if isinstance(axes, list) and len(axes) > 1: return True return False @staticmethod def check_axis_name_return_reason(name: str, allow_underscore: bool = False) -> Tuple[bool, str]: if not str.isidentifier(name): return False, "not a valid python identifier" elif name[0] == "_" or name[-1] == "_": if name == "_" and allow_underscore: return True, "" return False, "axis name should not start or end with underscore" else: if keyword.iskeyword(name): warnings.warn("It is discouraged to use axes names that are keywords: {}".format(name), RuntimeWarning) if name in ["axis"]: warnings.warn( "It is discouraged to use 'axis' as an axis name " "and will raise an error in future", FutureWarning, ) return True, "" @staticmethod def check_axis_name(name: str) -> bool: """ Valid axes names are python identifiers except keywords, and additionally should not start or end with underscore """ is_valid, _reason = ParsedExpression.check_axis_name_return_reason(name) return is_valid 
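The identifier rules enforced by `ParsedExpression.check_axis_name_return_reason` above can be restated as a minimal stdlib-only sketch. Note `check_valid_axis` is a hypothetical helper name, not part of einops, and the real method additionally warns (rather than rejects) on python keywords and on the name `axis`:

```python
def check_valid_axis(name: str) -> bool:
    # a valid axis name is a python identifier that neither starts
    # nor ends with an underscore (mirrors the checks above)
    if not name.isidentifier():
        return False  # rejects e.g. "2d" or "h-w"
    if name.startswith("_") or name.endswith("_"):
        return False  # leading/trailing underscores are reserved
    return True

assert check_valid_axis("height")
assert not check_valid_axis("_h")
assert not check_valid_axis("2d")
```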
arogozhnikov-einops-ad2c8d6/einops/py.typed  (empty file)

arogozhnikov-einops-ad2c8d6/einops/tests/__init__.py

"""
Common utils for testing.
These functions allow testing only some frameworks, not all.
"""

import logging
import os
from functools import lru_cache
from typing import List, Tuple

from einops import _backends
import warnings

__author__ = "Alex Rogozhnikov"

# minimize noise in tests logging
logging.getLogger("tensorflow").disabled = True
logging.getLogger("matplotlib").disabled = True

FLOAT_REDUCTIONS = ("min", "max", "sum", "mean", "prod")  # not includes any/all


def find_names_of_all_frameworks() -> List[str]:
    backend_subclasses = []
    backends = _backends.AbstractBackend.__subclasses__()
    while backends:
        backend = backends.pop()
        backends += backend.__subclasses__()
        backend_subclasses.append(backend)
    return [b.framework_name for b in backend_subclasses]


ENVVAR_NAME = "EINOPS_TEST_BACKENDS"


def unparse_backends(backend_names: List[str]) -> Tuple[str, str]:
    _known_backends = find_names_of_all_frameworks()
    for backend_name in backend_names:
        if backend_name not in _known_backends:
            raise RuntimeError(f"Unknown framework: {backend_name}")
    return ENVVAR_NAME, ",".join(backend_names)


@lru_cache(maxsize=1)
def parse_backends_to_test() -> List[str]:
    if ENVVAR_NAME not in os.environ:
        raise RuntimeError(f"Testing frameworks were not specified, env var {ENVVAR_NAME} not set")
    parsed_backends = os.environ[ENVVAR_NAME].split(",")
    _known_backends = find_names_of_all_frameworks()
    for backend_name in parsed_backends:
        if backend_name not in _known_backends:
            raise RuntimeError(f"Unknown framework: {backend_name}")
    return parsed_backends


def is_backend_tested(backend: str) -> bool:
    """Used to skip test if corresponding backend is not tested"""
    if backend not in find_names_of_all_frameworks():
        raise RuntimeError(f"Unknown framework {backend}")
    return backend in parse_backends_to_test()


def collect_test_backends(symbolic=False, layers=False) -> List[_backends.AbstractBackend]:
    """
    :param symbolic: symbolic or imperative frameworks?
    :param layers: layers or operations?
    :return: list of backends satisfying set conditions
    """
    if not symbolic:
        if not layers:
            backend_types = [
                _backends.NumpyBackend,
                _backends.JaxBackend,
                _backends.TorchBackend,
                _backends.TensorflowBackend,
                _backends.OneFlowBackend,
                _backends.PaddleBackend,
                _backends.CupyBackend,
            ]
        else:
            backend_types = [
                _backends.TorchBackend,
                _backends.OneFlowBackend,
                _backends.PaddleBackend,
            ]
    else:
        if not layers:
            backend_types = [
                _backends.PyTensorBackend,
            ]
        else:
            backend_types = [
                _backends.TFKerasBackend,
            ]

    backend_names_to_test = parse_backends_to_test()
    result = []
    for backend_type in backend_types:
        if backend_type.framework_name not in backend_names_to_test:
            continue
        try:
            result.append(backend_type())
        except ImportError:
            # problem with backend installation fails a specific test function,
            # but will be skipped in all other test cases
            warnings.warn("backend could not be initialized for tests: {}".format(backend_type))
    return result


arogozhnikov-einops-ad2c8d6/einops/tests/run_tests.py

"""
Runs tests that are appropriate for framework.
""" import os import sys from subprocess import Popen from pathlib import Path __author__ = "Alex Rogozhnikov" def run(cmd, **env): # keeps printing output when testing cmd = cmd.split(" ") if isinstance(cmd, str) else cmd print("running:", cmd) p = Popen(cmd, cwd=str(Path(__file__).parent), env={**os.environ, **env}) p.communicate() return p.returncode def main(): _executable, *args = sys.argv frameworks = [x for x in args if x != "--pip-install"] pip_install_is_set = "--pip-install" in args framework_name2installation = { "numpy": ["numpy"], "torch": ["torch --index-url https://download.pytorch.org/whl/cpu"], "jax": ["jax[cpu]", "flax"], "tensorflow": ["tensorflow"], "cupy": ["cupy"], # switch to stable paddlepaddle, because of https://github.com/PaddlePaddle/Paddle/issues/63927 # "paddle": ["paddlepaddle==0.0.0 -f https://www.paddlepaddle.org.cn/whl/linux/cpu-mkl/develop.html"], "paddle": ["paddlepaddle"], "oneflow": ["oneflow==0.9.0"], "pytensor": ["pytensor"], } usage = f""" Usage: python -m einops.tests.run_tests [--pip-install] Example: python -m einops.tests.run_tests numpy pytorch --pip-install Available frameworks: {list(framework_name2installation)} When --pip-install is set, auto-installs requirements with pip. 
(make sure which pip points to right pip) """ if len(frameworks) == 0: print(usage) return else: synonyms = { "tf": "tensorflow", "pytorch": "torch", "paddlepaddle": "paddle", } frameworks = [synonyms.get(f, f) for f in frameworks] wrong_frameworks = [f for f in frameworks if f not in framework_name2installation] if wrong_frameworks: print(usage) raise RuntimeError(f"Unrecognized frameworks: {wrong_frameworks}") if pip_install_is_set: print("Install testing infra") other_dependencies = ["pytest"] assert 0 == run("pip install {} --progress-bar off -q".format(" ".join(other_dependencies))) for framework in frameworks: print(f"Installing {framework}") pip_instructions = framework_name2installation[framework] assert 0 == run("pip install {} --progress-bar off -q".format(" ".join(pip_instructions))) # we need to inform testing script which frameworks to use # this is done by setting an envvar EINOPS_TEST_BACKENDS from einops.tests import unparse_backends envvar_name, envvar_value = unparse_backends(backend_names=frameworks) return_code = run( "python -m pytest .", **{envvar_name: envvar_value}, ) assert return_code == 0 if __name__ == "__main__": main() arogozhnikov-einops-ad2c8d6/einops/tests/test_einsum.py000066400000000000000000000253411475201674600236130ustar00rootroot00000000000000from typing import Any, Callable from einops.tests import collect_test_backends from einops.einops import _compactify_pattern_for_einsum, einsum, EinopsError import numpy as np import pytest import string class Arguments: def __init__(self, *args: Any, **kargs: Any): self.args = args self.kwargs = kargs def __call__(self, function: Callable): return function(*self.args, **self.kwargs) test_layer_cases = [ ( Arguments("b c_in h w -> w c_out h b", "c_in c_out", bias_shape=None, c_out=13, c_in=12), (2, 12, 3, 4), (4, 13, 3, 2), ), ( Arguments("b c_in h w -> w c_out h b", "c_in c_out", bias_shape="c_out", c_out=13, c_in=12), (2, 12, 3, 4), (4, 13, 3, 2), ), ( Arguments("b c_in h w -> w c_in h 
b", "", bias_shape=None, c_in=12), (2, 12, 3, 4), (4, 12, 3, 2), ), ( Arguments("b c_in h w -> b c_out", "c_in h w c_out", bias_shape=None, c_in=12, h=3, w=4, c_out=5), (2, 12, 3, 4), (2, 5), ), ( Arguments("b t head c_in -> b t head c_out", "head c_in c_out", bias_shape=None, head=4, c_in=5, c_out=6), (2, 3, 4, 5), (2, 3, 4, 6), ), ] # Each of the form: # (Arguments, true_einsum_pattern, in_shapes, out_shape) test_functional_cases = [ ( # Basic: "b c h w, b w -> b h", "abcd,ad->ac", ((2, 3, 4, 5), (2, 5)), (2, 4), ), ( # Three tensors: "b c h w, b w, b c -> b h", "abcd,ad,ab->ac", ((2, 3, 40, 5), (2, 5), (2, 3)), (2, 40), ), ( # Ellipsis, and full names: "... one two three, three four five -> ... two five", "...abc,cde->...be", ((32, 5, 2, 3, 4), (4, 5, 6)), (32, 5, 3, 6), ), ( # Ellipsis at the end: "one two three ..., three four five -> two five ...", "abc...,cde->be...", ((2, 3, 4, 32, 5), (4, 5, 6)), (3, 6, 32, 5), ), ( # Ellipsis on multiple tensors: "... one two three, ... three four five -> ... two five", "...abc,...cde->...be", ((32, 5, 2, 3, 4), (32, 5, 4, 5, 6)), (32, 5, 3, 6), ), ( # One tensor, and underscores: "first_tensor second_tensor -> first_tensor", "ab->a", ((5, 4),), (5,), ), ( # Trace (repeated index) "i i -> ", "aa->", ((5, 5),), (), ), ( # Too many spaces in string: " one two , three four->two four ", "ab,cd->bd", ((2, 3), (4, 5)), (3, 5), ), # The following tests were inspired by numpy's einsum tests # https://github.com/numpy/numpy/blob/v1.23.0/numpy/core/tests/test_einsum.py ( # Trace with other indices "i middle i -> middle", "aba->b", ((5, 10, 5),), (10,), ), ( # Ellipsis in the middle: "i ... i -> ...", "a...a->...", ((5, 3, 2, 1, 4, 5),), (3, 2, 1, 4), ), ( # Product of first and last axes: "i ... i -> i ...", "a...a->a...", ((5, 3, 2, 1, 4, 5),), (5, 3, 2, 1, 4), ), ( # Triple diagonal "one one one -> one", "aaa->a", ((5, 5, 5),), (5,), ), ( # Axis swap: "i j k -> j i k", "abc->bac", ((1, 2, 3),), (2, 1, 3), ), ( # Identity: "... 
-> ...", "...->...", ((5, 4, 3, 2, 1),), (5, 4, 3, 2, 1), ), ( # Elementwise product of three tensors "..., ..., ... -> ...", "...,...,...->...", ((3, 2), (3, 2), (3, 2)), (3, 2), ), ( # Basic summation: "index ->", "a->", ((10,)), (()), ), ] def test_layer(): for backend in collect_test_backends(layers=True, symbolic=False): if backend.framework_name in ["tensorflow", "torch", "oneflow", "paddle"]: layer_type = backend.layers().EinMix for args, in_shape, out_shape in test_layer_cases: layer = args(layer_type) print("Running", layer.einsum_pattern, "for", backend.framework_name) input = np.random.uniform(size=in_shape).astype("float32") input_framework = backend.from_numpy(input) output_framework = layer(input_framework) output = backend.to_numpy(output_framework) assert output.shape == out_shape valid_backends_functional = [ "tensorflow", "torch", "jax", "numpy", "oneflow", "cupy", "tensorflow.keras", "paddle", "pytensor", ] def test_functional(): # Functional tests: backends = filter(lambda x: x.framework_name in valid_backends_functional, collect_test_backends()) for backend in backends: for einops_pattern, true_pattern, in_shapes, out_shape in test_functional_cases: print(f"Running '{einops_pattern}' for {backend.framework_name}") # Create pattern: predicted_pattern = _compactify_pattern_for_einsum(einops_pattern) assert predicted_pattern == true_pattern # Generate example data: rstate = np.random.RandomState(0) in_arrays = [rstate.uniform(size=shape).astype("float32") for shape in in_shapes] in_arrays_framework = [backend.from_numpy(array) for array in in_arrays] # Loop over whether we call it manually with the backend, # or whether we use `einops.einsum`. 
for do_manual_call in [True, False]: # Actually run einsum: if do_manual_call: out_array = backend.einsum(predicted_pattern, *in_arrays_framework) else: out_array = einsum(*in_arrays_framework, einops_pattern) # Check shape: if tuple(out_array.shape) != out_shape: raise ValueError(f"Expected output shape {out_shape} but got {out_array.shape}") # Check values: true_out_array = np.einsum(true_pattern, *in_arrays) predicted_out_array = backend.to_numpy(out_array) np.testing.assert_array_almost_equal(predicted_out_array, true_out_array, decimal=5) def test_functional_symbolic(): backends = filter( lambda x: x.framework_name in valid_backends_functional, collect_test_backends(symbolic=True, layers=False) ) for backend in backends: for einops_pattern, true_pattern, in_shapes, out_shape in test_functional_cases: print(f"Running '{einops_pattern}' for symbolic {backend.framework_name}") # Create pattern: predicted_pattern = _compactify_pattern_for_einsum(einops_pattern) assert predicted_pattern == true_pattern rstate = np.random.RandomState(0) in_syms = [backend.create_symbol(in_shape) for in_shape in in_shapes] in_data = [rstate.uniform(size=in_shape).astype("float32") for in_shape in in_shapes] expected_out_data = np.einsum(true_pattern, *in_data) for do_manual_call in [True, False]: if do_manual_call: predicted_out_symbol = backend.einsum(predicted_pattern, *in_syms) else: predicted_out_symbol = einsum(*in_syms, einops_pattern) predicted_out_data = backend.eval_symbol( predicted_out_symbol, list(zip(in_syms, in_data)), ) if predicted_out_data.shape != out_shape: raise ValueError(f"Expected output shape {out_shape} but got {predicted_out_data.shape}") np.testing.assert_array_almost_equal(predicted_out_data, expected_out_data, decimal=5) def test_functional_errors(): # Specific backend does not matter, as errors are raised # during the pattern creation. 
rstate = np.random.RandomState(0) def create_tensor(*shape): return rstate.uniform(size=shape).astype("float32") # raise NotImplementedError("Singleton () axes are not yet supported in einsum.") with pytest.raises(NotImplementedError, match="^Singleton"): einsum( create_tensor(5, 1), "i () -> i", ) # raise NotImplementedError("Shape rearrangement is not yet supported in einsum.") with pytest.raises(NotImplementedError, match="^Shape rearrangement"): einsum( create_tensor(5, 1), "a b -> (a b)", ) with pytest.raises(NotImplementedError, match="^Shape rearrangement"): einsum( create_tensor(10, 1), "(a b) -> a b", ) # raise RuntimeError("Encountered empty axis name in einsum.") # raise RuntimeError("Axis name in einsum must be a string.") # ^ Not tested, these are just a failsafe in case an unexpected error occurs. # raise NotImplementedError("Anonymous axes are not yet supported in einsum.") with pytest.raises(NotImplementedError, match="^Anonymous axes"): einsum( create_tensor(5, 1), "i 2 -> i", ) # ParsedExpression error: with pytest.raises(EinopsError, match="^Invalid axis identifier"): einsum( create_tensor(5, 1), "i 2j -> i", ) # raise ValueError("Einsum pattern must contain '->'.") with pytest.raises(ValueError, match="^Einsum pattern"): einsum( create_tensor(5, 3, 2), "i j k", ) # raise RuntimeError("Too many axes in einsum.") with pytest.raises(RuntimeError, match="^Too many axes"): einsum( create_tensor(1), " ".join(string.ascii_letters) + " extra ->", ) # raise RuntimeError("Unknown axis on right side of einsum.") with pytest.raises(RuntimeError, match="^Unknown axis"): einsum( create_tensor(5, 1), "i j -> k", ) # raise ValueError( # "The last argument passed to `einops.einsum` must be a string," # " representing the einsum pattern." # ) with pytest.raises(ValueError, match="^The last argument"): einsum( "i j k -> i", create_tensor(5, 4, 3), ) # raise ValueError( # "`einops.einsum` takes at minimum two arguments: the tensors," # " followed by the pattern." 
# ) with pytest.raises(ValueError, match="^`einops.einsum` takes"): einsum( "i j k -> i", ) with pytest.raises(ValueError, match="^`einops.einsum` takes"): einsum( create_tensor(5, 1), ) # TODO: Include check for giving normal einsum pattern rather than einops. arogozhnikov-einops-ad2c8d6/einops/tests/test_examples.py000066400000000000000000000264221475201674600241320ustar00rootroot00000000000000import numpy import pytest from einops import rearrange, parse_shape, reduce from einops.tests import is_backend_tested from einops.tests.test_ops import imp_op_backends def test_rearrange_examples(): def test1(x): # transpose y = rearrange(x, "b c h w -> b h w c") assert tuple(y.shape) == (10, 30, 40, 20) return y def test2(x): # view / reshape y = rearrange(x, "b c h w -> b (c h w)") assert tuple(y.shape) == (10, 20 * 30 * 40) return y def test3(x): # depth-to-space y = rearrange(x, "b (c h1 w1) h w -> b c (h h1) (w w1)", h1=2, w1=2) assert tuple(y.shape) == (10, 5, 30 * 2, 40 * 2) return y def test4(x): # space-to-depth y = rearrange(x, "b c (h h1) (w w1) -> b (h1 w1 c) h w", h1=2, w1=2) assert tuple(y.shape) == (10, 20 * 4, 30 // 2, 40 // 2) return y def test5(x): # simple transposition y = rearrange(x, "b1 sound b2 letter -> b1 b2 sound letter") assert tuple(y.shape) == (10, 30, 20, 40) return y def test6(x): # parsing parameters t = rearrange(x, "b c h w -> (b h w) c") t = t[:, ::2] # replacement for dot-product, just changes size of second axis assert tuple(t.shape) == (10 * 30 * 40, 10) y = rearrange(t, "(b h w) c2 -> b c2 h w", **parse_shape(x, "b _ h w")) assert tuple(y.shape) == (10, 10, 30, 40) return y def test7(x): # split of embedding into groups y1, y2 = rearrange(x, "b (c g) h w -> g b c h w", g=2) assert tuple(y1.shape) == (10, 10, 30, 40) assert tuple(y2.shape) == (10, 10, 30, 40) return y1 + y2 # only one tensor is expected in output def test8(x): # max-pooling y = reduce(x, "b c (h h1) (w w1) -> b c h w", reduction="max", h1=2, w1=2) assert 
tuple(y.shape) == (10, 20, 30 // 2, 40 // 2) return y def test9(x): # squeeze - unsqueeze y = reduce(x, "b c h w -> b c () ()", reduction="max") assert tuple(y.shape) == (10, 20, 1, 1) y = rearrange(y, "b c () () -> c b") assert tuple(y.shape) == (20, 10) return y def test10(x): # stack tensors = list(x + 0) # 0 is needed https://github.com/tensorflow/tensorflow/issues/23185 tensors = rearrange(tensors, "b c h w -> b h w c") assert tuple(tensors.shape) == (10, 30, 40, 20) return tensors def test11(x): # concatenate tensors = list(x + 0) # 0 is needed https://github.com/tensorflow/tensorflow/issues/23185 tensors = rearrange(tensors, "b c h w -> h (b w) c") assert tuple(tensors.shape) == (30, 10 * 40, 20) return tensors def shufflenet(x, convolve, c1, c2): # shufflenet reordering example x = convolve(x) x = rearrange(x, "b (c1 c2) h w-> b (c2 c1) h w", c1=c1, c2=c2) x = convolve(x) return x def convolve_strided_1d(x, stride, usual_convolution): x = rearrange(x, "b c t1 t2 -> b c (t1 t2)") # reduce dimensionality x = rearrange(x, "b c (t stride) -> (stride b) c t", stride=stride) x = usual_convolution(x) x = rearrange(x, "(stride b) c t -> b c (t stride)", stride=stride) return x def convolve_strided_2d(x, h_stride, w_stride, usual_convolution): x = rearrange(x, "b c (h hs) (w ws) -> (hs ws b) c h w", hs=h_stride, ws=w_stride) x = usual_convolution(x) x = rearrange(x, "(hs ws b) c h w -> b c (h hs) (w ws)", hs=h_stride, ws=w_stride) return x def unet_like_1d(x, usual_convolution): # u-net like steps for increasing / reducing dimensionality x = rearrange(x, "b c t1 t2 -> b c (t1 t2)") # reduce dimensionality y = rearrange(x, "b c (t dt) -> b (dt c) t", dt=2) y = usual_convolution(y) x = x + rearrange(y, "b (dt c) t -> b c (t dt)", dt=2) return x # mock for convolution (works for all backends) def convolve_mock(x): return x tests = [ test1, test2, test3, test4, test5, test6, test7, test8, test9, test10, test11, lambda x: shufflenet(x, convolve=convolve_mock, c1=4, 
c2=5), lambda x: convolve_strided_1d(x, stride=2, usual_convolution=convolve_mock), lambda x: convolve_strided_2d(x, h_stride=2, w_stride=2, usual_convolution=convolve_mock), lambda x: unet_like_1d(x, usual_convolution=convolve_mock), ] for backend in imp_op_backends: print("testing source_examples for ", backend.framework_name) for test in tests: x = numpy.arange(10 * 20 * 30 * 40).reshape([10, 20, 30, 40]) result1 = test(x) result2 = backend.to_numpy(test(backend.from_numpy(x))) assert numpy.array_equal(result1, result2) # now with strides x = numpy.arange(10 * 2 * 20 * 3 * 30 * 1 * 40).reshape([10 * 2, 20 * 3, 30 * 1, 40 * 1]) # known torch bug - torch doesn't support negative steps last_step = -1 if (backend.framework_name != "torch" and backend.framework_name != "oneflow") else 1 indexing_expression = numpy.index_exp[::2, ::3, ::1, ::last_step] result1 = test(x[indexing_expression]) result2 = backend.to_numpy(test(backend.from_numpy(x)[indexing_expression])) assert numpy.array_equal(result1, result2) def tensor_train_example_numpy(): # kept here just for a collection, only tested for numpy # https://arxiv.org/pdf/1509.06569.pdf, (5) x = numpy.ones([3, 4, 5, 6]) rank = 4 if numpy.__version__ < "1.15.0": # numpy.einsum fails here, skip test return # creating appropriate Gs Gs = [numpy.ones([d, d, rank, rank]) for d in x.shape] Gs[0] = Gs[0][:, :, :1, :] Gs[-1] = Gs[-1][:, :, :, :1] # einsum way y = x.reshape((1,) + x.shape) for G in Gs: # taking partial results left-to-right # y = numpy.einsum('i j alpha beta, alpha i ... -> beta ... j', G, y) y = numpy.einsum("i j a b, a i ... -> b ... 
j", G, y) y1 = y.reshape(-1) # alternative way y = x.reshape(-1) for G in Gs: i, j, alpha, beta = G.shape y = rearrange(y, "(i rest alpha) -> rest (alpha i)", alpha=alpha, i=i) y = y @ rearrange(G, "i j alpha beta -> (alpha i) (j beta)") y = rearrange(y, "rest (beta j) -> (beta rest j)", beta=beta, j=j) y2 = y assert numpy.allclose(y1, y2) # yet another way y = x for G in Gs: i, j, alpha, beta = G.shape y = rearrange(y, "i ... (j alpha) -> ... j (alpha i)", alpha=alpha, i=i) y = y @ rearrange(G, "i j alpha beta -> (alpha i) (j beta)") y3 = y.reshape(-1) assert numpy.allclose(y1, y3) def test_pytorch_yolo_fragment(): if not is_backend_tested("torch"): pytest.skip() import torch def old_way(input, num_classes, num_anchors, anchors, stride_h, stride_w): # https://github.com/BobLiu20/YOLOv3_PyTorch/blob/c6b483743598b5f64d520d81e7e5f47ba936d4c9/nets/yolo_loss.py#L28-L44 bs = input.size(0) in_h = input.size(2) in_w = input.size(3) scaled_anchors = [(a_w / stride_w, a_h / stride_h) for a_w, a_h in anchors] prediction = input.view(bs, num_anchors, 5 + num_classes, in_h, in_w).permute(0, 1, 3, 4, 2).contiguous() # Get outputs x = torch.sigmoid(prediction[..., 0]) # Center x y = torch.sigmoid(prediction[..., 1]) # Center y w = prediction[..., 2] # Width h = prediction[..., 3] # Height conf = torch.sigmoid(prediction[..., 4]) # Conf pred_cls = torch.sigmoid(prediction[..., 5:]) # Cls pred. 
# https://github.com/BobLiu20/YOLOv3_PyTorch/blob/c6b483743598b5f64d520d81e7e5f47ba936d4c9/nets/yolo_loss.py#L70-L92 FloatTensor = torch.cuda.FloatTensor if x.is_cuda else torch.FloatTensor LongTensor = torch.cuda.LongTensor if x.is_cuda else torch.LongTensor # Calculate offsets for each grid grid_x = ( torch.linspace(0, in_w - 1, in_w) .repeat(in_w, 1) .repeat(bs * num_anchors, 1, 1) .view(x.shape) .type(FloatTensor) ) grid_y = ( torch.linspace(0, in_h - 1, in_h) .repeat(in_h, 1) .t() .repeat(bs * num_anchors, 1, 1) .view(y.shape) .type(FloatTensor) ) # Calculate anchor w, h anchor_w = FloatTensor(scaled_anchors).index_select(1, LongTensor([0])) anchor_h = FloatTensor(scaled_anchors).index_select(1, LongTensor([1])) anchor_w = anchor_w.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(w.shape) anchor_h = anchor_h.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(h.shape) # Add offset and scale with anchors pred_boxes = FloatTensor(prediction[..., :4].shape) pred_boxes[..., 0] = x.data + grid_x pred_boxes[..., 1] = y.data + grid_y pred_boxes[..., 2] = torch.exp(w.data) * anchor_w pred_boxes[..., 3] = torch.exp(h.data) * anchor_h # Results _scale = torch.Tensor([stride_w, stride_h] * 2).type(FloatTensor) output = torch.cat( (pred_boxes.view(bs, -1, 4) * _scale, conf.view(bs, -1, 1), pred_cls.view(bs, -1, num_classes)), -1 ) return output def new_way(input, num_classes, num_anchors, anchors, stride_h, stride_w): raw_predictions = rearrange(input, " b (anchor prediction) h w -> prediction b anchor h w", anchor=num_anchors) anchors = torch.FloatTensor(anchors).to(input.device) anchor_sizes = rearrange(anchors, "anchor dim -> dim () anchor () ()") _, _, _, in_h, in_w = raw_predictions.shape grid_h = rearrange(torch.arange(in_h).float(), "h -> () () h ()").to(input.device) grid_w = rearrange(torch.arange(in_w).float(), "w -> () () () w").to(input.device) predicted_bboxes = torch.zeros_like(raw_predictions) predicted_bboxes[0] = (raw_predictions[0].sigmoid() + grid_h) * stride_h # 
center y predicted_bboxes[1] = (raw_predictions[1].sigmoid() + grid_w) * stride_w # center x predicted_bboxes[2:4] = (raw_predictions[2:4].exp()) * anchor_sizes # bbox width and height predicted_bboxes[4] = raw_predictions[4].sigmoid() # confidence predicted_bboxes[5:] = raw_predictions[5:].sigmoid() # class predictions # only to match results of original code, not needed return rearrange(predicted_bboxes, "prediction b anchor h w -> b anchor h w prediction") stride_h = 4 stride_w = 4 batch_size = 5 num_classes = 12 anchors = [[50, 100], [100, 50], [75, 75]] num_anchors = len(anchors) input = torch.randn([batch_size, num_anchors * (5 + num_classes), 1, 1]) result1 = old_way( input=input, num_anchors=num_anchors, num_classes=num_classes, stride_h=stride_h, stride_w=stride_w, anchors=anchors, ) result2 = new_way( input=input, num_anchors=num_anchors, num_classes=num_classes, stride_h=stride_h, stride_w=stride_w, anchors=anchors, ) result1 = result1.reshape(result2.shape) assert torch.allclose(result1, result2) arogozhnikov-einops-ad2c8d6/einops/tests/test_layers.py000066400000000000000000000440701475201674600236120ustar00rootroot00000000000000import pickle from collections import namedtuple import numpy import pytest from einops import rearrange, reduce, EinopsError from einops.tests import collect_test_backends, is_backend_tested, FLOAT_REDUCTIONS as REDUCTIONS __author__ = "Alex Rogozhnikov" testcase = namedtuple("testcase", ["pattern", "axes_lengths", "input_shape", "wrong_shapes"]) rearrangement_patterns = [ testcase( "b c h w -> b (c h w)", dict(c=20), (10, 20, 30, 40), [(), (10,), (10, 10, 10), (10, 21, 30, 40), [1, 20, 1, 1, 1]], ), testcase( "b c (h1 h2) (w1 w2) -> b (c h2 w2) h1 w1", dict(h2=2, w2=2), (10, 20, 30, 40), [(), (1, 1, 1, 1), (1, 10, 3), ()], ), testcase( "b ... 
c -> c b ...", dict(b=10), (10, 20, 30), [(), (10,), (5, 10)], ), ] def test_rearrange_imperative(): for backend in collect_test_backends(symbolic=False, layers=True): print("Test layer for ", backend.framework_name) for pattern, axes_lengths, input_shape, wrong_shapes in rearrangement_patterns: x = numpy.arange(numpy.prod(input_shape), dtype="float32").reshape(input_shape) result_numpy = rearrange(x, pattern, **axes_lengths) layer = backend.layers().Rearrange(pattern, **axes_lengths) for shape in wrong_shapes: try: layer(backend.from_numpy(numpy.zeros(shape, dtype="float32"))) except BaseException: pass else: raise AssertionError("Failure expected") # simple pickling / unpickling layer2 = pickle.loads(pickle.dumps(layer)) result1 = backend.to_numpy(layer(backend.from_numpy(x))) result2 = backend.to_numpy(layer2(backend.from_numpy(x))) assert numpy.allclose(result_numpy, result1) assert numpy.allclose(result1, result2) just_sum = backend.layers().Reduce("...->", reduction="sum") variable = backend.from_numpy(x) result = just_sum(layer(variable)) result.backward() assert numpy.allclose(backend.to_numpy(variable.grad), 1) def test_rearrange_symbolic(): for backend in collect_test_backends(symbolic=True, layers=True): print("Test layer for ", backend.framework_name) for pattern, axes_lengths, input_shape, wrong_shapes in rearrangement_patterns: x = numpy.arange(numpy.prod(input_shape), dtype="float32").reshape(input_shape) result_numpy = rearrange(x, pattern, **axes_lengths) layer = backend.layers().Rearrange(pattern, **axes_lengths) input_shape_of_nones = [None] * len(input_shape) shapes = [input_shape, input_shape_of_nones] for shape in shapes: symbol = backend.create_symbol(shape) eval_inputs = [(symbol, x)] result_symbol1 = layer(symbol) result1 = backend.eval_symbol(result_symbol1, eval_inputs) assert numpy.allclose(result_numpy, result1) layer2 = pickle.loads(pickle.dumps(layer)) result_symbol2 = layer2(symbol) result2 = backend.eval_symbol(result_symbol2, 
eval_inputs) assert numpy.allclose(result1, result2) # now testing back-propagation just_sum = backend.layers().Reduce("...->", reduction="sum") result_sum1 = backend.eval_symbol(just_sum(result_symbol1), eval_inputs) result_sum2 = numpy.sum(x) assert numpy.allclose(result_sum1, result_sum2) reduction_patterns = rearrangement_patterns + [ testcase("b c h w -> b ()", dict(b=10), (10, 20, 30, 40), [(10,), (10, 20, 30)]), testcase("b c (h1 h2) (w1 w2) -> b c h1 w1", dict(h1=15, h2=2, w2=2), (10, 20, 30, 40), [(10, 20, 31, 40)]), testcase("b ... c -> b", dict(b=10), (10, 20, 30, 40), [(10,), (11, 10)]), ] def test_reduce_imperative(): for backend in collect_test_backends(symbolic=False, layers=True): print("Test layer for ", backend.framework_name) for reduction in REDUCTIONS: for pattern, axes_lengths, input_shape, wrong_shapes in reduction_patterns: print(backend, reduction, pattern, axes_lengths, input_shape, wrong_shapes) x = numpy.arange(1, 1 + numpy.prod(input_shape), dtype="float32").reshape(input_shape) x /= x.mean() result_numpy = reduce(x, pattern, reduction, **axes_lengths) layer = backend.layers().Reduce(pattern, reduction, **axes_lengths) for shape in wrong_shapes: try: layer(backend.from_numpy(numpy.zeros(shape, dtype="float32"))) except BaseException: pass else: raise AssertionError("Failure expected") # simple pickling / unpickling layer2 = pickle.loads(pickle.dumps(layer)) result1 = backend.to_numpy(layer(backend.from_numpy(x))) result2 = backend.to_numpy(layer2(backend.from_numpy(x))) assert numpy.allclose(result_numpy, result1) assert numpy.allclose(result1, result2) just_sum = backend.layers().Reduce("...->", reduction="sum") variable = backend.from_numpy(x) result = just_sum(layer(variable)) result.backward() grad = backend.to_numpy(variable.grad) if reduction == "sum": assert numpy.allclose(grad, 1) if reduction == "mean": assert numpy.allclose(grad, grad.min()) if reduction in ["max", "min"]: assert numpy.all(numpy.in1d(grad, [0, 1])) assert 
numpy.sum(grad) > 0.5 def test_reduce_symbolic(): for backend in collect_test_backends(symbolic=True, layers=True): print("Test layer for ", backend.framework_name) for reduction in REDUCTIONS: for pattern, axes_lengths, input_shape, wrong_shapes in reduction_patterns: x = numpy.arange(1, 1 + numpy.prod(input_shape), dtype="float32").reshape(input_shape) x /= x.mean() result_numpy = reduce(x, pattern, reduction, **axes_lengths) layer = backend.layers().Reduce(pattern, reduction, **axes_lengths) input_shape_of_nones = [None] * len(input_shape) shapes = [input_shape, input_shape_of_nones] for shape in shapes: symbol = backend.create_symbol(shape) eval_inputs = [(symbol, x)] result_symbol1 = layer(symbol) result1 = backend.eval_symbol(result_symbol1, eval_inputs) assert numpy.allclose(result_numpy, result1) layer2 = pickle.loads(pickle.dumps(layer)) result_symbol2 = layer2(symbol) result2 = backend.eval_symbol(result_symbol2, eval_inputs) assert numpy.allclose(result1, result2) def create_torch_model(use_reduce=False, add_scripted_layer=False): if not is_backend_tested("torch"): pytest.skip() else: from torch.nn import Sequential, Conv2d, MaxPool2d, Linear, ReLU from einops.layers.torch import Rearrange, Reduce, EinMix import torch.jit return Sequential( Conv2d(3, 6, kernel_size=(5, 5)), Reduce("b c (h h2) (w w2) -> b c h w", "max", h2=2, w2=2) if use_reduce else MaxPool2d(kernel_size=2), Conv2d(6, 16, kernel_size=(5, 5)), Reduce("b c (h h2) (w w2) -> b c h w", "max", h2=2, w2=2), torch.jit.script(Rearrange("b c h w -> b (c h w)")) if add_scripted_layer else Rearrange("b c h w -> b (c h w)"), Linear(16 * 5 * 5, 120), ReLU(), Linear(120, 84), ReLU(), EinMix("b c1 -> (b c2)", weight_shape="c1 c2", bias_shape="c2", c1=84, c2=84), EinMix("(b c2) -> b c3", weight_shape="c2 c3", bias_shape="c3", c2=84, c3=84), Linear(84, 10), ) def test_torch_layer(): if not is_backend_tested("torch"): pytest.skip() else: # checked that torch present import torch import torch.jit model1 = 
create_torch_model(use_reduce=True) model2 = create_torch_model(use_reduce=False) input = torch.randn([10, 3, 32, 32]) # random models have different predictions assert not torch.allclose(model1(input), model2(input)) model2.load_state_dict(pickle.loads(pickle.dumps(model1.state_dict()))) assert torch.allclose(model1(input), model2(input)) # tracing (freezing) model3 = torch.jit.trace(model2, example_inputs=input) torch.testing.assert_close(model1(input), model3(input), atol=1e-3, rtol=1e-3) torch.testing.assert_close(model1(input + 1), model3(input + 1), atol=1e-3, rtol=1e-3) model4 = torch.jit.trace(model2, example_inputs=input) torch.testing.assert_close(model1(input), model4(input), atol=1e-3, rtol=1e-3) torch.testing.assert_close(model1(input + 1), model4(input + 1), atol=1e-3, rtol=1e-3) def test_torch_layers_scripting(): if not is_backend_tested("torch"): pytest.skip() else: import torch for script_layer in [False, True]: model1 = create_torch_model(use_reduce=True, add_scripted_layer=script_layer) model2 = torch.jit.script(model1) input = torch.randn([10, 3, 32, 32]) torch.testing.assert_close(model1(input), model2(input), atol=1e-3, rtol=1e-3) def test_keras_layer(): if not is_backend_tested("tensorflow"): pytest.skip() else: import tensorflow as tf if tf.__version__ < "2.16.": # current implementation of layers follows new TF interface pytest.skip() from tensorflow.keras.models import Sequential from tensorflow.keras.layers import Conv2D as Conv2d, Dense as Linear, ReLU from einops.layers.keras import Rearrange, Reduce, EinMix, keras_custom_objects def create_keras_model(): return Sequential( [ Conv2d(6, kernel_size=5, input_shape=[32, 32, 3]), Reduce("b c (h h2) (w w2) -> b c h w", "max", h2=2, w2=2), Conv2d(16, kernel_size=5), Reduce("b c (h h2) (w w2) -> b c h w", "max", h2=2, w2=2), Rearrange("b c h w -> b (c h w)"), Linear(120), ReLU(), Linear(84), ReLU(), EinMix("b c1 -> (b c2)", weight_shape="c1 c2", bias_shape="c2", c1=84, c2=84), EinMix("(b c2) 
-> b c3", weight_shape="c2 c3", bias_shape="c3", c2=84, c3=84), Linear(10), ] ) model1 = create_keras_model() model2 = create_keras_model() input = numpy.random.normal(size=[10, 32, 32, 3]).astype("float32") # two randomly initialized models should provide different outputs assert not numpy.allclose(model1.predict_on_batch(input), model2.predict_on_batch(input)) # get some temp filename tmp_model_filename = "/tmp/einops_tf_model.h5" # save arch + weights print("temp_path_keras1", tmp_model_filename) tf.keras.models.save_model(model1, tmp_model_filename) model3 = tf.keras.models.load_model(tmp_model_filename, custom_objects=keras_custom_objects) numpy.testing.assert_allclose(model1.predict_on_batch(input), model3.predict_on_batch(input)) weight_filename = "/tmp/einops_tf_model.weights.h5" # save arch as json model4 = tf.keras.models.model_from_json(model1.to_json(), custom_objects=keras_custom_objects) model1.save_weights(weight_filename) model4.load_weights(weight_filename) model2.load_weights(weight_filename) # check that a differently-initialized model receives the same weights numpy.testing.assert_allclose(model1.predict_on_batch(input), model2.predict_on_batch(input)) # ultimate test # save-load architecture, and then load weights - should return the same result numpy.testing.assert_allclose(model1.predict_on_batch(input), model4.predict_on_batch(input)) def test_flax_layers(): """ One-off simple tests for Flax layers. Unfortunately, Flax layers have a different interface from other layers.
""" if not is_backend_tested("jax"): pytest.skip() else: import jax import jax.numpy as jnp import flax from flax import linen as nn from einops.layers.flax import EinMix, Reduce, Rearrange class NN(nn.Module): @nn.compact def __call__(self, x): x = EinMix( "b (h h2) (w w2) c -> b h w c_out", "h2 w2 c c_out", "c_out", sizes=dict(h2=2, w2=3, c=4, c_out=5) )(x) x = Rearrange("b h w c -> b (w h c)", sizes=dict(c=5))(x) x = Reduce("b hwc -> b", "mean", dict(hwc=2 * 3 * 5))(x) return x model = NN() fixed_input = jnp.ones([10, 2 * 2, 3 * 3, 4]) params = model.init(jax.random.PRNGKey(0), fixed_input) def eval_at_point(params): return jnp.linalg.norm(model.apply(params, fixed_input)) vandg = jax.value_and_grad(eval_at_point) value0 = eval_at_point(params) value1, grad1 = vandg(params) assert jnp.allclose(value0, value1) params2 = jax.tree_map(lambda x1, x2: x1 - x2 * 0.001, params, grad1) value2 = eval_at_point(params2) assert value0 >= value2, (value0, value2) # check serialization fbytes = flax.serialization.to_bytes(params) _loaded = flax.serialization.from_bytes(params, fbytes) def test_einmix_decomposition(): """ Testing that einmix correctly decomposes into smaller transformations. """ from einops.layers._einmix import _EinmixDebugger mixin1 = _EinmixDebugger( "a b c d e -> e d c b a", weight_shape="d a b", d=2, a=3, b=5, ) # fmt: off assert mixin1.pre_reshape_pattern is None assert mixin1.post_reshape_pattern is None assert mixin1.einsum_pattern == "abcde,dab->edcba" assert mixin1.saved_weight_shape == [2, 3, 5] assert mixin1.saved_bias_shape is None mixin2 = _EinmixDebugger( "a b c d e -> e d c b a", weight_shape="d a b", bias_shape="a b c d e", a=1, b=2, c=3, d=4, e=5, ) # fmt: off assert mixin2.pre_reshape_pattern is None assert mixin2.post_reshape_pattern is None assert mixin2.einsum_pattern == "abcde,dab->edcba" assert mixin2.saved_weight_shape == [4, 1, 2] assert mixin2.saved_bias_shape == [5, 4, 3, 2, 1] mixin3 = _EinmixDebugger( "... 
-> ...", weight_shape="", bias_shape="", ) # fmt: off assert mixin3.pre_reshape_pattern is None assert mixin3.post_reshape_pattern is None assert mixin3.einsum_pattern == "...,->..." assert mixin3.saved_weight_shape == [] assert mixin3.saved_bias_shape == [] mixin4 = _EinmixDebugger( "b a ... -> b c ...", weight_shape="b a c", a=1, b=2, c=3, ) # fmt: off assert mixin4.pre_reshape_pattern is None assert mixin4.post_reshape_pattern is None assert mixin4.einsum_pattern == "ba...,bac->bc..." assert mixin4.saved_weight_shape == [2, 1, 3] assert mixin4.saved_bias_shape is None mixin5 = _EinmixDebugger( "(b a) ... -> b c (...)", weight_shape="b a c", a=1, b=2, c=3, ) # fmt: off assert mixin5.pre_reshape_pattern == "(b a) ... -> b a ..." assert mixin5.pre_reshape_lengths == dict(a=1, b=2) assert mixin5.post_reshape_pattern == "b c ... -> b c (...)" assert mixin5.einsum_pattern == "ba...,bac->bc..." assert mixin5.saved_weight_shape == [2, 1, 3] assert mixin5.saved_bias_shape is None mixin6 = _EinmixDebugger( "b ... (a c) -> b ... (a d)", weight_shape="c d", bias_shape="a d", a=1, c=3, d=4, ) # fmt: off assert mixin6.pre_reshape_pattern == "b ... (a c) -> b ... a c" assert mixin6.pre_reshape_lengths == dict(a=1, c=3) assert mixin6.post_reshape_pattern == "b ... a d -> b ... (a d)" assert mixin6.einsum_pattern == "b...ac,cd->b...ad" assert mixin6.saved_weight_shape == [3, 4] assert mixin6.saved_bias_shape == [1, 1, 4] # (b) a d, ellipsis does not participate mixin7 = _EinmixDebugger( "a ... (b c) -> a (... d b)", weight_shape="c d b", bias_shape="d b", b=2, c=3, d=4, ) # fmt: off assert mixin7.pre_reshape_pattern == "a ... (b c) -> a ... b c" assert mixin7.pre_reshape_lengths == dict(b=2, c=3) assert mixin7.post_reshape_pattern == "a ... d b -> a (... 
d b)" assert mixin7.einsum_pattern == "a...bc,cdb->a...db" assert mixin7.saved_weight_shape == [3, 4, 2] assert mixin7.saved_bias_shape == [1, 4, 2] # (a) d b, ellipsis does not participate def test_einmix_restrictions(): """ Testing incorrect patterns that einmix should reject """ from einops.layers._einmix import _EinmixDebugger with pytest.raises(EinopsError): _EinmixDebugger( "a b c d e -> e d c b a", weight_shape="d a b", d=2, a=3, # missing b ) # fmt: off with pytest.raises(EinopsError): _EinmixDebugger( "a b c d e -> e d c b a", weight_shape="w a b", d=2, a=3, b=1 # missing d ) # fmt: off with pytest.raises(EinopsError): _EinmixDebugger( "(...) a -> ... a", weight_shape="a", a=1, # ellipsis on the left ) # fmt: off with pytest.raises(EinopsError): _EinmixDebugger( "(...) a -> a ...", weight_shape="a", a=1, # ellipsis on the right side after bias axis bias_shape='a', ) # fmt: off
# ===== einops/tests/test_ops.py =====
import itertools import numpy import numpy as np import pytest from einops import EinopsError from einops.einops import rearrange, reduce, repeat, _enumerate_directions from einops.tests import collect_test_backends, is_backend_tested, FLOAT_REDUCTIONS as REDUCTIONS imp_op_backends = collect_test_backends(symbolic=False, layers=False) sym_op_backends = collect_test_backends(symbolic=True, layers=False) identity_patterns = [ "...->...", "a b c d e-> a b c d e", "a b c d e ...-> ... a b c d e", "a b c d e ...-> a ... b c d e", "... a b c d e -> ... a b c d e", "a ... e-> a ... e", "a ... -> a ... ", "a ... c d e -> a (...) c d e", ] equivalent_rearrange_patterns = [ ("a b c d e -> (a b) c d e", "a b ... -> (a b) ... "), ("a b c d e -> a b (c d) e", "... c d e -> ... (c d) e"), ("a b c d e -> a b c d e", "... -> ... "), ("a b c d e -> (a b c d e)", "... -> (...)"), ("a b c d e -> b (c d e) a", "a b ... -> b (...) a"), ("a b c d e -> b (a c d) e", "a b ... e -> b (a ...)
e"), ] equivalent_reduction_patterns = [ ("a b c d e -> ", " ... -> "), ("a b c d e -> (e a)", "a ... e -> (e a)"), ("a b c d e -> d (a e)", " a b c d e ... -> d (a e) "), ("a b c d e -> (a b)", " ... c d e -> (...) "), ] def test_collapsed_ellipsis_errors_out(): x = numpy.zeros([1, 1, 1, 1, 1]) rearrange(x, "a b c d ... -> a b c ... d") with pytest.raises(EinopsError): rearrange(x, "a b c d (...) -> a b c ... d") rearrange(x, "... -> (...)") with pytest.raises(EinopsError): rearrange(x, "(...) -> (...)") def test_ellipsis_ops_numpy(): x = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]) for pattern in identity_patterns: assert numpy.array_equal(x, rearrange(x, pattern)), pattern for pattern1, pattern2 in equivalent_rearrange_patterns: assert numpy.array_equal(rearrange(x, pattern1), rearrange(x, pattern2)) for reduction in ["min", "max", "sum"]: for pattern1, pattern2 in equivalent_reduction_patterns: assert numpy.array_equal(reduce(x, pattern1, reduction=reduction), reduce(x, pattern2, reduction=reduction)) # now just check coincidence with numpy all_rearrange_patterns = [*identity_patterns] for pattern_pairs in equivalent_rearrange_patterns: all_rearrange_patterns.extend(pattern_pairs) def check_op_against_numpy(backend, numpy_input, pattern, axes_lengths, reduction="rearrange", is_symbolic=False): """ Helper to test result of operation (rearrange or transpose) against numpy if reduction == 'rearrange', rearrange op is tested, otherwise reduce """ def operation(x): if reduction == "rearrange": return rearrange(x, pattern, **axes_lengths) else: return reduce(x, pattern, reduction, **axes_lengths) numpy_result = operation(numpy_input) check_equal = numpy.array_equal p_none_dimension = 0.5 if is_symbolic: symbol_shape = [d if numpy.random.random() >= p_none_dimension else None for d in numpy_input.shape] symbol = backend.create_symbol(shape=symbol_shape) result_symbol = operation(symbol) backend_result = backend.eval_symbol(result_symbol, [(symbol, 
numpy_input)]) else: backend_result = operation(backend.from_numpy(numpy_input)) backend_result = backend.to_numpy(backend_result) check_equal(numpy_result, backend_result) def test_ellipsis_ops_imperative(): """Checking various patterns against numpy""" x = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]) for is_symbolic in [True, False]: for backend in collect_test_backends(symbolic=is_symbolic, layers=False): for pattern in identity_patterns + list(itertools.chain(*equivalent_rearrange_patterns)): check_op_against_numpy( backend, x, pattern, axes_lengths={}, reduction="rearrange", is_symbolic=is_symbolic ) for reduction in ["min", "max", "sum"]: for pattern in itertools.chain(*equivalent_reduction_patterns): check_op_against_numpy( backend, x, pattern, axes_lengths={}, reduction=reduction, is_symbolic=is_symbolic ) def test_rearrange_array_api(): import numpy as xp from einops import array_api as AA if xp.__version__ < "2.0.0": pytest.skip() x = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]) for pattern in identity_patterns + list(itertools.chain(*equivalent_rearrange_patterns)): expected = rearrange(x, pattern) result = AA.rearrange(xp.from_dlpack(x), pattern) assert numpy.array_equal(AA.asnumpy(result + 0), expected) def test_reduce_array_api(): import numpy as xp from einops import array_api as AA if xp.__version__ < "2.0.0": pytest.skip() x = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]) for pattern in itertools.chain(*equivalent_reduction_patterns): for reduction in ["min", "max", "sum"]: expected = reduce(x, pattern, reduction=reduction) result = AA.reduce(xp.from_dlpack(x), pattern, reduction=reduction) assert numpy.array_equal(AA.asnumpy(np.asarray(result + 0)), expected) def test_rearrange_consistency_numpy(): shape = [1, 2, 3, 5, 7, 11] x = numpy.arange(numpy.prod(shape)).reshape(shape) for pattern in [ "a b c d e f -> a b c d e f", "b a c d e f -> a b d e f c", "a b c d e f -> f e d c b a", "a b c d e f -> (f e) d (c b 
a)", "a b c d e f -> (f e d c b a)", ]: result = rearrange(x, pattern) assert len(numpy.setdiff1d(x, result)) == 0 assert result.dtype == x.dtype result = rearrange(x, "a b c d e f -> a (b) (c d e) f") assert numpy.array_equal(x.flatten(), result.flatten()) result = rearrange(x, "a aa aa1 a1a1 aaaa a11 -> a aa aa1 a1a1 aaaa a11") assert numpy.array_equal(x, result) result1 = rearrange(x, "a b c d e f -> f e d c b a") result2 = rearrange(x, "f e d c b a -> a b c d e f") assert numpy.array_equal(result1, result2) result = rearrange(rearrange(x, "a b c d e f -> (f d) c (e b) a"), "(f d) c (e b) a -> a b c d e f", b=2, d=5) assert numpy.array_equal(x, result) sizes = dict(zip("abcdef", shape)) temp = rearrange(x, "a b c d e f -> (f d) c (e b) a", **sizes) result = rearrange(temp, "(f d) c (e b) a -> a b c d e f", **sizes) assert numpy.array_equal(x, result) x2 = numpy.arange(2 * 3 * 4).reshape([2, 3, 4]) result = rearrange(x2, "a b c -> b c a") assert x2[1, 2, 3] == result[2, 3, 1] assert x2[0, 1, 2] == result[1, 2, 0] def test_rearrange_permutations_numpy(): # tests random permutation of axes against two independent numpy ways for n_axes in range(1, 10): input = numpy.arange(2**n_axes).reshape([2] * n_axes) permutation = numpy.random.permutation(n_axes) left_expression = " ".join("i" + str(axis) for axis in range(n_axes)) right_expression = " ".join("i" + str(axis) for axis in permutation) expression = left_expression + " -> " + right_expression result = rearrange(input, expression) for pick in numpy.random.randint(0, 2, [10, n_axes]): assert input[tuple(pick)] == result[tuple(pick[permutation])] for n_axes in range(1, 10): input = numpy.arange(2**n_axes).reshape([2] * n_axes) permutation = numpy.random.permutation(n_axes) left_expression = " ".join("i" + str(axis) for axis in range(n_axes)[::-1]) right_expression = " ".join("i" + str(axis) for axis in permutation[::-1]) expression = left_expression + " -> " + right_expression result = rearrange(input, expression) 
assert result.shape == input.shape expected_result = numpy.zeros_like(input) for original_axis, result_axis in enumerate(permutation): expected_result |= ((input >> original_axis) & 1) << result_axis assert numpy.array_equal(result, expected_result) def test_reduction_imperatives(): for backend in imp_op_backends: print("Reduction tests for ", backend.framework_name) for reduction in REDUCTIONS: # slight redundancy for simpler order - numpy version is evaluated multiple times input = numpy.arange(2 * 3 * 4 * 5 * 6, dtype="int64").reshape([2, 3, 4, 5, 6]) if reduction in ["mean", "prod"]: input = input / input.astype("float64").mean() test_cases = [ ["a b c d e -> ", {}, getattr(input, reduction)()], ["a ... -> ", {}, getattr(input, reduction)()], ["(a1 a2) ... (e1 e2) -> ", dict(a1=1, e2=2), getattr(input, reduction)()], [ "a b c d e -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], [ "a ... c d e -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], [ "a b c d e ... -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], ["a b c d e -> (e c a)", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1])], ["(a a2) ... 
-> (a2 a) ...", dict(a2=1), input], ] for pattern, axes_lengths, expected_result in test_cases: result = reduce(backend.from_numpy(input.copy()), pattern, reduction=reduction, **axes_lengths) result = backend.to_numpy(result) assert numpy.allclose(result, expected_result), f"Failed at {pattern}" def test_reduction_symbolic(): for backend in sym_op_backends: print("Reduction tests for ", backend.framework_name) for reduction in REDUCTIONS: input = numpy.arange(2 * 3 * 4 * 5 * 6, dtype="int64").reshape([2, 3, 4, 5, 6]) input = input / input.astype("float64").mean() # slight redundancy for simpler order - numpy version is evaluated multiple times test_cases = [ ["a b c d e -> ", {}, getattr(input, reduction)()], ["a ... -> ", {}, getattr(input, reduction)()], ["(a a2) ... (e e2) -> ", dict(a2=1, e2=1), getattr(input, reduction)()], [ "a b c d e -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], [ "a ... c d e -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], [ "a b c d e ... -> (e c) a", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1, 2]), ], ["a b c d e -> (e c a)", {}, getattr(input, reduction)(axis=(1, 3)).transpose(2, 1, 0).reshape([-1])], ["(a a2) ... 
-> (a2 a) ...", dict(a2=1), input], ] for pattern, axes_lengths, expected_numpy_result in test_cases: shapes = [input.shape, [None for _ in input.shape]] for shape in shapes: sym = backend.create_symbol(shape) result_sym = reduce(sym, pattern, reduction=reduction, **axes_lengths) result = backend.eval_symbol(result_sym, [(sym, input)]) assert numpy.allclose(result, expected_numpy_result) if True: shape = [] _axes_lengths = {**axes_lengths} for axis, length in zip("abcde", input.shape): # filling as much as possible with Nones if axis in pattern: shape.append(None) _axes_lengths[axis] = length else: shape.append(length) sym = backend.create_symbol(shape) result_sym = reduce(sym, pattern, reduction=reduction, **_axes_lengths) result = backend.eval_symbol(result_sym, [(sym, input)]) assert numpy.allclose(result, expected_numpy_result) def test_reduction_stress_imperatives(): for backend in imp_op_backends: print("Stress-testing reduction for ", backend.framework_name) for reduction in REDUCTIONS + ("rearrange",): dtype = "int64" coincide = numpy.array_equal if reduction in ["mean", "prod"]: dtype = "float64" coincide = numpy.allclose max_dim = 11 if "oneflow" in backend.framework_name: max_dim = 7 if "paddle" in backend.framework_name: max_dim = 9 for n_axes in range(max_dim): shape = numpy.random.randint(2, 4, size=n_axes) permutation = numpy.random.permutation(n_axes) skipped = 0 if reduction == "rearrange" else numpy.random.randint(n_axes + 1) left = " ".join("x" + str(i) for i in range(n_axes)) right = " ".join("x" + str(i) for i in permutation[skipped:]) pattern = left + "->" + right x = numpy.arange(1, 1 + numpy.prod(shape), dtype=dtype).reshape(shape) if reduction == "prod": x /= x.mean() # to avoid overflows result1 = reduce(x, pattern, reduction=reduction) result2 = x.transpose(permutation) if skipped > 0: result2 = getattr(result2, reduction)(axis=tuple(range(skipped))) assert coincide(result1, result2) check_op_against_numpy(backend, x, pattern, 
reduction=reduction, axes_lengths={}, is_symbolic=False) def test_reduction_with_callable_imperatives(): x_numpy = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]).astype("float32") x_numpy /= x_numpy.max() def logsumexp_torch(x, tuple_of_axes): return x.logsumexp(tuple_of_axes) def logsumexp_tf(x, tuple_of_axes): import tensorflow as tf return tf.reduce_logsumexp(x, tuple_of_axes) def logsumexp_keras(x, tuple_of_axes): import tensorflow.keras.backend as k return k.logsumexp(x, tuple_of_axes) def logsumexp_numpy(x, tuple_of_axes): # very naive logsumexp to compare to minused = x.max(tuple_of_axes) y = x - x.max(tuple_of_axes, keepdims=True) y = numpy.exp(y) y = numpy.sum(y, axis=tuple_of_axes) return numpy.log(y) + minused from einops._backends import TorchBackend, TensorflowBackend, TFKerasBackend, NumpyBackend backend2callback = { TorchBackend.framework_name: logsumexp_torch, TensorflowBackend.framework_name: logsumexp_tf, TFKerasBackend.framework_name: logsumexp_keras, NumpyBackend.framework_name: logsumexp_numpy, } for backend in imp_op_backends: if backend.framework_name not in backend2callback: continue backend_callback = backend2callback[backend.framework_name] x_backend = backend.from_numpy(x_numpy) for pattern1, pattern2 in equivalent_reduction_patterns: print("Test reduction with callable for ", backend.framework_name, pattern1, pattern2) output_numpy = reduce(x_numpy, pattern1, reduction=logsumexp_numpy) output_backend = reduce(x_backend, pattern1, reduction=backend_callback) assert numpy.allclose( output_numpy, backend.to_numpy(output_backend), ) def test_enumerating_directions(): for backend in imp_op_backends: print("testing directions for", backend.framework_name) for shape in [[], [1], [1, 1, 1], [2, 3, 5, 7]]: x = numpy.arange(numpy.prod(shape)).reshape(shape) axes1 = _enumerate_directions(x) axes2 = _enumerate_directions(backend.from_numpy(x)) assert len(axes1) == len(axes2) == len(shape) for ax1, ax2 in zip(axes1, axes2): ax2 = 
backend.to_numpy(ax2) assert ax1.shape == ax2.shape assert numpy.allclose(ax1, ax2) def test_concatenations_and_stacking(): for backend in imp_op_backends: print("testing shapes for ", backend.framework_name) for n_arrays in [1, 2, 5]: shapes = [[], [1], [1, 1], [2, 3, 5, 7], [1] * 6] for shape in shapes: arrays1 = [numpy.arange(i, i + numpy.prod(shape)).reshape(shape) for i in range(n_arrays)] arrays2 = [backend.from_numpy(array) for array in arrays1] result0 = numpy.asarray(arrays1) result1 = rearrange(arrays1, "...->...") result2 = rearrange(arrays2, "...->...") assert numpy.array_equal(result0, result1) assert numpy.array_equal(result1, backend.to_numpy(result2)) result1 = rearrange(arrays1, "b ... -> ... b") result2 = rearrange(arrays2, "b ... -> ... b") assert numpy.array_equal(result1, backend.to_numpy(result2)) def test_gradients_imperatives(): # lazy - just checking reductions for reduction in REDUCTIONS: if reduction in ("any", "all"): continue # non-differentiable ops x = numpy.arange(1, 1 + 2 * 3 * 4).reshape([2, 3, 4]).astype("float32") results = {} for backend in imp_op_backends: y0 = backend.from_numpy(x) if not hasattr(y0, "grad"): continue y1 = reduce(y0, "a b c -> c a", reduction=reduction) y2 = reduce(y1, "c a -> a c", reduction=reduction) y3 = reduce(y2, "a (c1 c2) -> a", reduction=reduction, c1=2) y4 = reduce(y3, "... 
-> ", reduction=reduction) y4.backward() grad = backend.to_numpy(y0.grad) results[backend.framework_name] = grad print("comparing gradients for", results.keys()) for name1, grad1 in results.items(): for name2, grad2 in results.items(): assert numpy.allclose(grad1, grad2), [name1, name2, "provided different gradients"] def test_tiling_imperatives(): for backend in imp_op_backends: print("Tiling tests for ", backend.framework_name) input = numpy.arange(2 * 3 * 5, dtype="int64").reshape([2, 1, 3, 1, 5]) test_cases = [ (1, 1, 1, 1, 1), (1, 2, 1, 3, 1), (3, 1, 1, 4, 1), ] for repeats in test_cases: expected = numpy.tile(input, repeats) converted = backend.from_numpy(input) repeated = backend.tile(converted, repeats) result = backend.to_numpy(repeated) assert numpy.array_equal(result, expected) def test_tiling_symbolic(): for backend in sym_op_backends: print("Tiling tests for ", backend.framework_name) input = numpy.arange(2 * 3 * 5, dtype="int64").reshape([2, 1, 3, 1, 5]) test_cases = [ (1, 1, 1, 1, 1), (1, 2, 1, 3, 1), (3, 1, 1, 4, 1), ] for repeats in test_cases: expected = numpy.tile(input, repeats) sym = backend.create_symbol(input.shape) result = backend.eval_symbol(backend.tile(sym, repeats), [[sym, input]]) assert numpy.array_equal(result, expected) sym = backend.create_symbol([None] * len(input.shape)) result = backend.eval_symbol(backend.tile(sym, repeats), [[sym, input]]) assert numpy.array_equal(result, expected) repeat_test_cases = [ # all assume that input has shape [2, 3, 5] ("a b c -> c a b", dict()), ("a b c -> (c copy a b)", dict(copy=2, a=2, b=3, c=5)), ("a b c -> (a copy) b c ", dict(copy=1)), ("a b c -> (c a) (copy1 b copy2)", dict(a=2, copy1=1, copy2=2)), ("a ... -> a ... copy", dict(copy=4)), ("... c -> ... (copy1 c copy2)", dict(copy1=1, copy2=2)), ("... -> ... ", dict()), (" ... -> copy1 ... 
copy2 ", dict(copy1=2, copy2=3)), ("a b c -> copy1 a copy2 b c () ", dict(copy1=2, copy2=1)), ] def check_reversion(x, repeat_pattern, **sizes): """Checks repeat pattern by running reduction""" left, right = repeat_pattern.split("->") reduce_pattern = right + "->" + left repeated = repeat(x, repeat_pattern, **sizes) reduced_min = reduce(repeated, reduce_pattern, reduction="min", **sizes) reduced_max = reduce(repeated, reduce_pattern, reduction="max", **sizes) assert numpy.array_equal(x, reduced_min) assert numpy.array_equal(x, reduced_max) def test_repeat_numpy(): # check repeat vs reduce. Repeat works ok if reverse reduction with min and max work well x = numpy.arange(2 * 3 * 5).reshape([2, 3, 5]) x1 = repeat(x, "a b c -> copy a b c ", copy=1) assert numpy.array_equal(x[None], x1) for pattern, axis_dimensions in repeat_test_cases: check_reversion(x, pattern, **axis_dimensions) def test_repeat_imperatives(): x = numpy.arange(2 * 3 * 5).reshape([2, 3, 5]) for backend in imp_op_backends: print("Repeat tests for ", backend.framework_name) for pattern, axis_dimensions in repeat_test_cases: expected = repeat(x, pattern, **axis_dimensions) converted = backend.from_numpy(x) repeated = repeat(converted, pattern, **axis_dimensions) result = backend.to_numpy(repeated) assert numpy.array_equal(result, expected) def test_repeat_symbolic(): x = numpy.arange(2 * 3 * 5).reshape([2, 3, 5]) for backend in sym_op_backends: print("Repeat tests for ", backend.framework_name) for pattern, axis_dimensions in repeat_test_cases: expected = repeat(x, pattern, **axis_dimensions) sym = backend.create_symbol(x.shape) result = backend.eval_symbol(repeat(sym, pattern, **axis_dimensions), [[sym, x]]) assert numpy.array_equal(result, expected) def test_repeat_array_api(): import numpy as xp from einops import array_api as AA if xp.__version__ < "2.0.0": pytest.skip() x = numpy.arange(2 * 3 * 5).reshape([2, 3, 5]) for pattern, axis_dimensions in repeat_test_cases: expected = repeat(x, pattern, 
**axis_dimensions) result = AA.repeat(xp.from_dlpack(x), pattern, **axis_dimensions) assert numpy.array_equal(AA.asnumpy(result + 0), expected) test_cases_repeat_anonymous = [ # all assume that input has shape [1, 2, 4, 6] ("a b c d -> c a d b", dict()), ("a b c d -> (c 2 d a b)", dict(a=1, c=4, d=6)), ("1 b c d -> (d copy 1) 3 b c ", dict(copy=3)), ("1 ... -> 3 ... ", dict()), ("() ... d -> 1 (copy1 d copy2) ... ", dict(copy1=2, copy2=3)), ("1 b c d -> (1 1) (1 b) 2 c 3 d (1 1)", dict()), ] def test_anonymous_axes(): x = numpy.arange(1 * 2 * 4 * 6).reshape([1, 2, 4, 6]) for pattern, axis_dimensions in test_cases_repeat_anonymous: check_reversion(x, pattern, **axis_dimensions) def test_list_inputs(): x = numpy.arange(2 * 3 * 4 * 5 * 6).reshape([2, 3, 4, 5, 6]) assert numpy.array_equal( rearrange(list(x), "... -> (...)"), rearrange(x, "... -> (...)"), ) assert numpy.array_equal( reduce(list(x), "a ... e -> (...)", "min"), reduce(x, "a ... e -> (...)", "min"), ) assert numpy.array_equal( repeat(list(x), "... -> b (...)", b=3), repeat(x, "... -> b (...)", b=3), ) def test_torch_compile_with_dynamic_shape(): if not is_backend_tested("torch"): pytest.skip() import torch # somewhat reasonable debug messages torch._dynamo.config.verbose = True def func1(x): # test contains ellipsis a, b, c, *other = x.shape x = rearrange(x, "(a a2) b c ... -> b (c a2) (a ...)", a2=2) # test contains passing expression as axis length x = reduce(x, "b ca2 A -> b A", "sum", ca2=c * 2) return x # it seems static and dynamic compilation can't be tested in the same test run.
# func1_compiled_static = torch.compile(func1, dynamic=False, fullgraph=True, backend='aot_eager') func1_compiled_dynamic = torch.compile(func1, dynamic=True, fullgraph=True, backend="aot_eager") x = torch.randn(size=[4, 5, 6, 3]) assert torch.equal(func1_compiled_dynamic(x), func1(x)) # check with input of different dimensionality, and with all shape elements changed x = torch.randn(size=[6, 3, 4, 2, 3]) assert torch.equal(func1_compiled_dynamic(x), func1(x)) def bit_count(x): return sum((x >> i) & 1 for i in range(20)) def test_reduction_imperatives_booleans(): """Checks that any/all reduction works in all frameworks""" x_np = numpy.asarray([(bit_count(x) % 2) == 0 for x in range(2**6)]).reshape([2] * 6) for backend in imp_op_backends: print("Reduction any/all tests for ", backend.framework_name) for axis in range(6): expected_result_any = numpy.any(x_np, axis=axis, keepdims=True) expected_result_all = numpy.all(x_np, axis=axis, keepdims=True) assert not numpy.array_equal(expected_result_any, expected_result_all) axes = list("abcdef") axes_in = list(axes) axes_out = list(axes) axes_out[axis] = "1" pattern = (" ".join(axes_in)) + " -> " + (" ".join(axes_out)) res_any = reduce(backend.from_numpy(x_np), pattern, reduction="any") res_all = reduce(backend.from_numpy(x_np), pattern, reduction="all") assert numpy.array_equal(expected_result_any, backend.to_numpy(res_any)) assert numpy.array_equal(expected_result_all, backend.to_numpy(res_all)) # expected result: any/all expected_result_any = numpy.any(x_np, axis=(0, 1), keepdims=True) expected_result_all = numpy.all(x_np, axis=(0, 1), keepdims=True) pattern = "a b ... -> 1 1 ..." 
res_any = reduce(backend.from_numpy(x_np), pattern, reduction="any") res_all = reduce(backend.from_numpy(x_np), pattern, reduction="all") assert numpy.array_equal(expected_result_any, backend.to_numpy(res_any)) assert numpy.array_equal(expected_result_all, backend.to_numpy(res_all))
# ===== einops/tests/test_other.py =====
from doctest import testmod import numpy import pytest import einops import einops.layers import einops.parsing from einops._backends import AbstractBackend from einops.einops import rearrange, parse_shape, _optimize_transformation from einops.tests import collect_test_backends, is_backend_tested __author__ = "Alex Rogozhnikov" def test_doctests_examples(): # additionally tests docstrings testmod(einops.layers, raise_on_error=True, extraglobs=dict(np=numpy)) testmod(einops.einops, raise_on_error=True, extraglobs=dict(np=numpy)) def test_backends_installed(): """ This test will fail if some backends are not installed or can't be imported. Other tests will just work and only test installed backends. """ from .
import parse_backends_to_test backends_to_test = parse_backends_to_test() errors = [] for backend_type in AbstractBackend.__subclasses__(): if backend_type.framework_name not in backends_to_test: continue try: # instantiate backend_type() except Exception as e: errors.append((backend_type.framework_name, e)) assert len(errors) == 0, errors def test_optimize_transformations_numpy(): print("Testing optimizations") shapes = [[2] * n_dimensions for n_dimensions in range(14)] shapes += [[3] * n_dimensions for n_dimensions in range(6)] shapes += [[2, 3, 5, 7]] shapes += [[2, 3, 5, 7, 11, 17]] for shape in shapes: for attempt in range(5): n_dimensions = len(shape) x = numpy.random.randint(0, 2**12, size=shape).reshape([-1]) init_shape = shape[:] n_reduced = numpy.random.randint(0, n_dimensions + 1) reduced_axes = tuple(numpy.random.permutation(n_dimensions)[:n_reduced]) axes_reordering = numpy.random.permutation(n_dimensions - n_reduced) final_shape = numpy.random.randint(0, 1024, size=333) # just random init_shape2, reduced_axes2, axes_reordering2, final_shape2 = combination2 = _optimize_transformation( init_shape, reduced_axes, axes_reordering, final_shape ) assert numpy.array_equal(final_shape, final_shape2) result1 = x.reshape(init_shape).sum(axis=reduced_axes).transpose(axes_reordering).reshape([-1]) result2 = x.reshape(init_shape2).sum(axis=reduced_axes2).transpose(axes_reordering2).reshape([-1]) assert numpy.array_equal(result1, result2) # testing we can't optimize this formula again combination3 = _optimize_transformation(*combination2) for a, b in zip(combination2, combination3): assert numpy.array_equal(a, b) _IMPERATIVE_BACKENDS = collect_test_backends(symbolic=False, layers=False) x_np = numpy.zeros([10, 20, 30, 40]) def test_parse_shape_imperative(): for backend in _IMPERATIVE_BACKENDS: print("Shape parsing for ", backend.framework_name) parsed1 = parse_shape(x_np, "a b c d") parsed2 = parse_shape(backend.from_numpy(x_np), "a b c d") assert parsed1 == parsed2 
== dict(a=10, b=20, c=30, d=40) assert parsed1 != dict(a=1, b=20, c=30, d=40) != parsed2 def test_underscore(): for backend in _IMPERATIVE_BACKENDS: parsed1 = parse_shape(x_np, "_ _ _ _") parsed2 = parse_shape(backend.from_numpy(x_np), "_ _ _ _") assert parsed1 == parsed2 == dict() def test_underscore_one(): for backend in _IMPERATIVE_BACKENDS: parsed1 = parse_shape(x_np, "_ _ _ hello") parsed2 = parse_shape(backend.from_numpy(x_np), "_ _ _ hello") assert parsed1 == parsed2 == dict(hello=40) def test_underscore_several(): for backend in _IMPERATIVE_BACKENDS: parsed1 = parse_shape(x_np, "_ _ a1 a1a111a") parsed2 = parse_shape(backend.from_numpy(x_np), "_ _ a1 a1a111a") assert parsed1 == parsed2 == dict(a1=30, a1a111a=40) def test_repeating(): with pytest.raises(einops.EinopsError): parse_shape(x_np, "a a b b") for backend in _IMPERATIVE_BACKENDS: with pytest.raises(einops.EinopsError): parse_shape(backend.from_numpy(x_np), "a a b b") def test_ellipsis(): for backend in _IMPERATIVE_BACKENDS: for shape, pattern, expected in [ ([10, 20], "...", dict()), ([10], "... a", dict(a=10)), ([10, 20], "... a", dict(a=20)), ([10, 20, 30], "... a", dict(a=30)), ([10, 20, 30, 40], "... a", dict(a=40)), ([10], "a ...", dict(a=10)), ([10, 20], "a ...", dict(a=10)), ([10, 20, 30], "a ...", dict(a=10)), ([10, 20, 30, 40], "a ...", dict(a=10)), ([10, 20, 30, 40], " a ... b", dict(a=10, b=40)), ([10, 40], " a ... 
b", dict(a=10, b=40)), ]: x = numpy.ones(shape) parsed1 = parse_shape(x, pattern) parsed2 = parse_shape(backend.from_numpy(x), pattern) assert parsed1 == parsed2 == expected def test_parse_with_anonymous_axes(): for backend in _IMPERATIVE_BACKENDS: for shape, pattern, expected in [ ([1, 2, 3, 4], "1 2 3 a", dict(a=4)), ([10, 1, 2], "a 1 2", dict(a=10)), ([10, 1, 2], "a () 2", dict(a=10)), ]: x = numpy.ones(shape) parsed1 = parse_shape(x, pattern) parsed2 = parse_shape(backend.from_numpy(x), pattern) assert parsed1 == parsed2 == expected def test_failures(): for backend in _IMPERATIVE_BACKENDS: # every test should fail for shape, pattern in [ ([1, 2, 3, 4], "a b c"), ([1, 2, 3, 4], "2 a b c"), ([1, 2, 3, 4], "a b c ()"), ([1, 2, 3, 4], "a b c d e"), ([1, 2, 3, 4], "a b c d e ..."), ([1, 2, 3, 4], "a b c ()"), ]: with pytest.raises(RuntimeError): x = numpy.ones(shape) parse_shape(backend.from_numpy(x), pattern) _SYMBOLIC_BACKENDS = [ *collect_test_backends(symbolic=True, layers=False), *collect_test_backends(symbolic=True, layers=True), ] # tensorflow.keras needs special way to compile, # shape vars can be used only inside layers but not as outputs _SYMBOLIC_BACKENDS = [backend for backend in _SYMBOLIC_BACKENDS if backend.framework_name != "tensorflow.keras"] @pytest.mark.parametrize("backend", _SYMBOLIC_BACKENDS) def test_parse_shape_symbolic(backend): for shape in [ [10, 20, 30, 40], [10, 20, None, None], [None, None, None, None], ]: print( f"special shape parsing {backend.framework_name=} {shape=}", ) input_symbol = backend.create_symbol(shape) shape_placeholder = parse_shape(input_symbol, "a b c d") shape = {} for name, symbol in shape_placeholder.items(): shape[name] = ( symbol if isinstance(symbol, int) else backend.eval_symbol(symbol, [(input_symbol, numpy.zeros([10, 20, 30, 40]))]) ) print(shape) result_placeholder = rearrange( input_symbol, "a b (c1 c2) (d1 d2) -> (a b d1) c1 (c2 d2)", **parse_shape(input_symbol, "a b c1 _"), d2=2 ) result = 
backend.eval_symbol(result_placeholder, [(input_symbol, numpy.zeros([10, 20, 30, 40]))]) print(result.shape) assert result.shape == (10 * 20 * 20, 30, 1 * 2) assert numpy.allclose(result, 0) @pytest.mark.parametrize("backend", _SYMBOLIC_BACKENDS) def test_parse_shape_symbolic_ellipsis(backend): for static_shape, shape, pattern, expected in [ ([10, 20], [None, None], "...", dict()), ([10], [None], "... a", dict(a=10)), ([10, 20], [None, None], "... a", dict(a=20)), ([10, 20, 30], [None, None, None], "... a", dict(a=30)), ([10, 20, 30, 40], [None, None, None, None], "... a", dict(a=40)), ([10], [None], "a ...", dict(a=10)), ([10, 20], [None, None], "a ...", dict(a=10)), ([10, 20, 30], [None, None, None], "a ...", dict(a=10)), ([10, 20, 30, 40], [None, None, None, None], "a ...", dict(a=10)), ([10, 20, 30, 40], [None, None, None, None], " a ... b", dict(a=10, b=40)), ([10, 40], [None, None], " a ... b ", dict(a=10, b=40)), ]: input_symbol = backend.create_symbol(shape) shape_placeholder = parse_shape(input_symbol, pattern) out_shape = {} for name, symbol in shape_placeholder.items(): if isinstance(symbol, int): out_shape[name] = symbol else: out_shape[name] = backend.eval_symbol(symbol, [(input_symbol, numpy.zeros(static_shape))]) assert out_shape == expected def test_is_float_type(): backends = collect_test_backends(symbolic=False, layers=False) backends += collect_test_backends(symbolic=False, layers=True) for backend in backends: for dtype in ["int32", "int64", "float32", "float64"]: is_float = "float" in dtype input = numpy.zeros([3, 4, 5], dtype=dtype) input = backend.from_numpy(input) assert backend.is_float_type(input) == is_float, (dtype, backend, input.dtype) def test_torch_compile(): """ Test ensures that allow_ops_in_compiled_graph allows compiling in a single graph Additionally we ensure that after compilation cache works properly (by changing shapes and patterns) We additionally check that pack/unpack still can be handled despite variable number of 
inputs/outputs """ if not is_backend_tested("torch"): pytest.skip() import torch from torch import nn from einops import repeat, reduce, pack, unpack, einsum from einops._torch_specific import allow_ops_in_compiled_graph allow_ops_in_compiled_graph() class TorchModuleWithOperations(nn.Module): def __init__(self) -> None: super().__init__() def forward(self, x_abc, suffix=""): a, b, c = x_abc.shape def suf(pattern): parts = pattern.split() return " ".join([p if p[-1] not in "acd" else p + suffix for p in parts]) # patterns look a bit strange because names a, c, d will be modified on every run # by suf function x_abcd = repeat(x_abc, suf("a b c -> a b c 4")) x_abc = reduce(x_abcd, suf("a b c d -> a b c"), "min") x_abdc, ps = pack([x_abc] * (2 + len(suffix)), suf("a b * c")) x_array = unpack(rearrange(x_abdc, suf("a b d c -> (a b ) 1 c d")), ps, "ab one1 c *") x1 = x_array[0] + len(x_array) x1 = rearrange(x1, suf("(a b ) 1 c -> a b c"), b=b) addition = einsum(x_abc, x_abcd, suf("a b c , a b c d -> d"))[0] return x1 + addition original = TorchModuleWithOperations() compiled = torch.compile(original, fullgraph=True, backend="aot_eager") for size in [10, 20, 40]: x = torch.rand([size, size + 1, size + 2]) for suffix in ["", "suf1", "other_suffix"]: result1 = compiled(x, suffix) result2 = original(x, suffix) assert torch.allclose(result1, result2) arogozhnikov-einops-ad2c8d6/einops/tests/test_packing.py000066400000000000000000000243141475201674600237260ustar00rootroot00000000000000import dataclasses import typing import numpy as np import pytest from einops import EinopsError, asnumpy, pack, unpack from einops.tests import collect_test_backends def pack_unpack(xs, pattern): x, ps = pack(xs, pattern) unpacked = unpack(xs, ps, pattern) assert len(unpacked) == len(xs) for a, b in zip(unpacked, xs): assert np.allclose(asnumpy(a), asnumpy(b)) def unpack_and_pack(x, ps, pattern: str): unpacked = unpack(x, ps, pattern) packed, ps2 = pack(unpacked, pattern=pattern) assert 
np.allclose(asnumpy(packed), asnumpy(x)) return unpacked def unpack_and_pack_against_numpy(x, ps, pattern: str): capturer_backend = CaptureException() capturer_numpy = CaptureException() with capturer_backend: unpacked = unpack(x, ps, pattern) packed, ps2 = pack(unpacked, pattern=pattern) with capturer_numpy: x_np = asnumpy(x) unpacked_np = unpack(x_np, ps, pattern) packed_np, ps3 = pack(unpacked_np, pattern=pattern) assert type(capturer_numpy.exception) == type(capturer_backend.exception) # noqa E721 if capturer_numpy.exception is not None: # both failed return else: # neither failed, check results are identical assert np.allclose(asnumpy(packed), asnumpy(x)) assert np.allclose(asnumpy(packed_np), asnumpy(x)) assert len(unpacked) == len(unpacked_np) for a, b in zip(unpacked, unpacked_np): assert np.allclose(asnumpy(a), b) class CaptureException: def __enter__(self): self.exception = None def __exit__(self, exc_type, exc_val, exc_tb): self.exception = exc_val return True def test_numpy_trivial(H=13, W=17): def rand(*shape): return np.random.random(shape) def check(a, b): assert a.dtype == b.dtype assert a.shape == b.shape assert np.all(a == b) r, g, b = rand(3, H, W) embeddings = rand(H, W, 32) check( np.stack([r, g, b], axis=2), pack([r, g, b], "h w *")[0], ) check( np.stack([r, g, b], axis=1), pack([r, g, b], "h * w")[0], ) check( np.stack([r, g, b], axis=0), pack([r, g, b], "* h w")[0], ) check( np.concatenate([r, g, b], axis=1), pack([r, g, b], "h *")[0], ) check( np.concatenate([r, g, b], axis=0), pack([r, g, b], "* w")[0], ) i = np.index_exp[:, :, None] check( np.concatenate([r[i], g[i], b[i], embeddings], axis=2), pack([r, g, b, embeddings], "h w *")[0], ) with pytest.raises(EinopsError): pack([r, g, b, embeddings], "h w nonexisting_axis *") pack([r, g, b], "some_name_for_H some_name_for_w1 *") with pytest.raises(EinopsError): pack([r, g, b, embeddings], "h _w *") # no leading underscore with pytest.raises(EinopsError): pack([r, g, b, embeddings], "h_ w *") 
# no trailing underscore with pytest.raises(EinopsError): pack([r, g, b, embeddings], "1h_ w *") with pytest.raises(EinopsError): pack([r, g, b, embeddings], "1 w *") with pytest.raises(EinopsError): pack([r, g, b, embeddings], "h h *") # capital and non-capital are different pack([r, g, b, embeddings], "h H *") @dataclasses.dataclass class UnpackTestCase: shape: typing.Tuple[int, ...] pattern: str def dim(self): return self.pattern.split().index("*") def selfcheck(self): assert self.shape[self.dim()] == 5 cases = [ # NB: in all cases unpacked axis is of length 5. # that's actively used in tests below UnpackTestCase((5,), "*"), UnpackTestCase((5, 7), "* seven"), UnpackTestCase((7, 5), "seven *"), UnpackTestCase((5, 3, 4), "* three four"), UnpackTestCase((4, 5, 3), "four * three"), UnpackTestCase((3, 4, 5), "three four *"), ] def test_pack_unpack_with_numpy(): case: UnpackTestCase for case in cases: shape = case.shape pattern = case.pattern x = np.random.random(shape) # all correct, no minus 1 unpack_and_pack(x, [[2], [1], [2]], pattern) # no -1, asking for wrong shapes with pytest.raises(BaseException): unpack_and_pack(x, [[2], [1], [2]], pattern + " non_existent_axis") with pytest.raises(BaseException): unpack_and_pack(x, [[2], [1], [1]], pattern) with pytest.raises(BaseException): unpack_and_pack(x, [[4], [1], [1]], pattern) # all correct, with -1 unpack_and_pack(x, [[2], [1], [-1]], pattern) unpack_and_pack(x, [[2], [-1], [2]], pattern) unpack_and_pack(x, [[-1], [1], [2]], pattern) _, _, last = unpack_and_pack(x, [[2], [3], [-1]], pattern) assert last.shape[case.dim()] == 0 # asking for more elements than available with pytest.raises(BaseException): unpack(x, [[2], [4], [-1]], pattern) # this one does not raise, because indexing x[2:1] just returns zero elements # with pytest.raises(BaseException): # unpack(x, [[2], [-1], [4]], pattern) with pytest.raises(BaseException): unpack(x, [[-1], [1], [5]], pattern) # all correct, -1 nested rs = unpack_and_pack(x, [[1, 
2], [1, 1], [-1, 1]], pattern) assert all(len(r.shape) == len(x.shape) + 1 for r in rs) rs = unpack_and_pack(x, [[1, 2], [1, -1], [1, 1]], pattern) assert all(len(r.shape) == len(x.shape) + 1 for r in rs) rs = unpack_and_pack(x, [[2, -1], [1, 2], [1, 1]], pattern) assert all(len(r.shape) == len(x.shape) + 1 for r in rs) # asking for more elements, -1 nested with pytest.raises(BaseException): unpack(x, [[-1, 2], [1], [5]], pattern) with pytest.raises(BaseException): unpack(x, [[2, 2], [2], [5, -1]], pattern) # asking for non-divisible number of elements with pytest.raises(BaseException): unpack(x, [[2, 1], [1], [3, -1]], pattern) with pytest.raises(BaseException): unpack(x, [[2, 1], [3, -1], [1]], pattern) with pytest.raises(BaseException): unpack(x, [[3, -1], [2, 1], [1]], pattern) # -1 takes zero unpack_and_pack(x, [[0], [5], [-1]], pattern) unpack_and_pack(x, [[0], [-1], [5]], pattern) unpack_and_pack(x, [[-1], [5], [0]], pattern) # -1 takes zero, -1 unpack_and_pack(x, [[2, -1], [1, 5]], pattern) def test_pack_unpack_against_numpy(): for backend in collect_test_backends(symbolic=False, layers=False): print(f"test packing against numpy for {backend.framework_name}") check_zero_len = True for case in cases: unpack_and_pack = unpack_and_pack_against_numpy shape = case.shape pattern = case.pattern x = np.random.random(shape) x = backend.from_numpy(x) # all correct, no minus 1 unpack_and_pack(x, [[2], [1], [2]], pattern) # no -1, asking for wrong shapes with pytest.raises(BaseException): unpack(x, [[2], [1], [1]], pattern) with pytest.raises(BaseException): unpack(x, [[4], [1], [1]], pattern) # all correct, with -1 unpack_and_pack(x, [[2], [1], [-1]], pattern) unpack_and_pack(x, [[2], [-1], [2]], pattern) unpack_and_pack(x, [[-1], [1], [2]], pattern) # asking for more elements than available with pytest.raises(BaseException): unpack(x, [[2], [4], [-1]], pattern) # this one does not raise, because indexing x[2:1] just returns zero elements # with 
pytest.raises(BaseException): # unpack(x, [[2], [-1], [4]], pattern) with pytest.raises(BaseException): unpack(x, [[-1], [1], [5]], pattern) # all correct, -1 nested unpack_and_pack(x, [[1, 2], [1, 1], [-1, 1]], pattern) unpack_and_pack(x, [[1, 2], [1, -1], [1, 1]], pattern) unpack_and_pack(x, [[2, -1], [1, 2], [1, 1]], pattern) # asking for more elements, -1 nested with pytest.raises(BaseException): unpack(x, [[-1, 2], [1], [5]], pattern) with pytest.raises(BaseException): unpack(x, [[2, 2], [2], [5, -1]], pattern) # asking for non-divisible number of elements with pytest.raises(BaseException): unpack(x, [[2, 1], [1], [3, -1]], pattern) with pytest.raises(BaseException): unpack(x, [[2, 1], [3, -1], [1]], pattern) with pytest.raises(BaseException): unpack(x, [[3, -1], [2, 1], [1]], pattern) if check_zero_len: # -1 takes zero unpack_and_pack(x, [[2], [3], [-1]], pattern) unpack_and_pack(x, [[0], [5], [-1]], pattern) unpack_and_pack(x, [[0], [-1], [5]], pattern) unpack_and_pack(x, [[-1], [5], [0]], pattern) # -1 takes zero, -1 unpack_and_pack(x, [[2, -1], [1, 5]], pattern) def test_pack_unpack_array_api(): from einops import array_api as AA import numpy as xp if xp.__version__ < "2.0.0": pytest.skip() for case in cases: shape = case.shape pattern = case.pattern x_np = np.random.random(shape) x_xp = xp.from_dlpack(x_np) for ps in [ [[2], [1], [2]], [[1], [1], [-1]], [[1], [1], [-1, 3]], [[2, 1], [1, 1, 1], [-1]], ]: x_np_split = unpack(x_np, ps, pattern) x_xp_split = AA.unpack(x_xp, ps, pattern) for a, b in zip(x_np_split, x_xp_split): assert np.allclose(a, AA.asnumpy(b + 0)) x_agg_np, ps1 = pack(x_np_split, pattern) x_agg_xp, ps2 = AA.pack(x_xp_split, pattern) assert ps1 == ps2 assert np.allclose(x_agg_np, AA.asnumpy(x_agg_xp)) for ps in [ [[2, 3]], [[1], [5]], [[1], [5], [-1]], [[1], [2, 3]], [[1], [5], [-1, 2]], ]: with pytest.raises(BaseException): unpack(x_np, ps, pattern) 
arogozhnikov-einops-ad2c8d6/einops/tests/test_parsing.py000066400000000000000000000104451475201674600237550ustar00rootroot00000000000000import pytest from einops import EinopsError from einops.parsing import ParsedExpression, AnonymousAxis, _ellipsis __author__ = "Alex Rogozhnikov" class AnonymousAxisPlaceholder: def __init__(self, value: int): self.value = value assert isinstance(self.value, int) def __eq__(self, other): return isinstance(other, AnonymousAxis) and self.value == other.value def test_anonymous_axes(): a, b = AnonymousAxis("2"), AnonymousAxis("2") assert a != b c, d = AnonymousAxisPlaceholder(2), AnonymousAxisPlaceholder(3) assert a == c and b == c assert a != d and b != d assert [a, 2, b] == [c, 2, c] def test_elementary_axis_name(): for name in [ "a", "b", "h", "dx", "h1", "zz", "i9123", "somelongname", "Alex", "camelCase", "u_n_d_e_r_score", "unreasonablyLongAxisName", ]: assert ParsedExpression.check_axis_name(name) for name in ["", "2b", "12", "_startWithUnderscore", "endWithUnderscore_", "_", "...", _ellipsis]: assert not ParsedExpression.check_axis_name(name) def test_invalid_expressions(): # double ellipsis should raise an error ParsedExpression("... a b c d") with pytest.raises(EinopsError): ParsedExpression("... a b c d ...") with pytest.raises(EinopsError): ParsedExpression("... a b c (d ...)") with pytest.raises(EinopsError): ParsedExpression("(... 
a) b c (d ...)") # double/missing/enclosed parenthesis ParsedExpression("(a) b c (d ...)") with pytest.raises(EinopsError): ParsedExpression("(a)) b c (d ...)") with pytest.raises(EinopsError): ParsedExpression("(a b c (d ...)") with pytest.raises(EinopsError): ParsedExpression("(a) (()) b c (d ...)") with pytest.raises(EinopsError): ParsedExpression("(a) ((b c) (d ...))") # invalid identifiers ParsedExpression("camelCase under_scored cApiTaLs ß ...") with pytest.raises(EinopsError): ParsedExpression("1a") with pytest.raises(EinopsError): ParsedExpression("_pre") with pytest.raises(EinopsError): ParsedExpression("...pre") with pytest.raises(EinopsError): ParsedExpression("pre...") def test_parse_expression(): parsed = ParsedExpression("a1 b1 c1 d1") assert parsed.identifiers == {"a1", "b1", "c1", "d1"} assert parsed.composition == [["a1"], ["b1"], ["c1"], ["d1"]] assert not parsed.has_non_unitary_anonymous_axes assert not parsed.has_ellipsis parsed = ParsedExpression("() () () ()") assert parsed.identifiers == set() assert parsed.composition == [[], [], [], []] assert not parsed.has_non_unitary_anonymous_axes assert not parsed.has_ellipsis parsed = ParsedExpression("1 1 1 ()") assert parsed.identifiers == set() assert parsed.composition == [[], [], [], []] assert not parsed.has_non_unitary_anonymous_axes assert not parsed.has_ellipsis aap = AnonymousAxisPlaceholder parsed = ParsedExpression("5 (3 4)") assert len(parsed.identifiers) == 3 and {i.value for i in parsed.identifiers} == {3, 4, 5} assert parsed.composition == [[aap(5)], [aap(3), aap(4)]] assert parsed.has_non_unitary_anonymous_axes assert not parsed.has_ellipsis parsed = ParsedExpression("5 1 (1 4) 1") assert len(parsed.identifiers) == 2 and {i.value for i in parsed.identifiers} == {4, 5} assert parsed.composition == [[aap(5)], [], [aap(4)], []] parsed = ParsedExpression("name1 ... 
a1 12 (name2 14)") assert len(parsed.identifiers) == 6 assert parsed.identifiers.difference({"name1", _ellipsis, "a1", "name2"}).__len__() == 2 assert parsed.composition == [["name1"], _ellipsis, ["a1"], [aap(12)], ["name2", aap(14)]] assert parsed.has_non_unitary_anonymous_axes assert parsed.has_ellipsis assert not parsed.has_ellipsis_parenthesized parsed = ParsedExpression("(name1 ... a1 12) name2 14") assert len(parsed.identifiers) == 6 assert parsed.identifiers.difference({"name1", _ellipsis, "a1", "name2"}).__len__() == 2 assert parsed.composition == [["name1", _ellipsis, "a1", aap(12)], ["name2"], [aap(14)]] assert parsed.has_non_unitary_anonymous_axes assert parsed.has_ellipsis assert parsed.has_ellipsis_parenthesized arogozhnikov-einops-ad2c8d6/mkdocs.yml000066400000000000000000000026111475201674600202410ustar00rootroot00000000000000site_name: Einops repo_name: arogozhnikov/einops repo_url: https://github.com/arogozhnikov/einops site_url: https://einops.rocks docs_dir: docs_src theme: name: material favicon: images/favicon.png icon: logo: fontawesome/solid/infinity repo: octicons/mark-github-16 nav: - Introduction: index.md - Tutorials: - Einops Basics: 1-einops-basics.ipynb - Einops for Deep Learning: 2-einops-for-deep-learning.ipynb - Einops.pack and unpack: 4-pack-and-unpack.ipynb - Einmix for great MLPs: 3-einmix-layer.ipynb - Pytorch: pytorch-examples.html - API Reference: - asnumpy: api/asnumpy.md - parse_shape: api/parse_shape.md - rearrange: api/rearrange.md - reduce: api/reduce.md - repeat: api/repeat.md - einsum: api/einsum.md - pack and unpack: api/pack_unpack.md - Testimonials: pages/testimonials.md - Community/Ecosystem: pages/projects.md extra: search: language: en markdown_extensions: - admonition - codehilite - pymdownx.arithmatex - markdown.extensions.md_in_html plugins: - search - mkdocstrings: handlers: python: options: docstring_options: warn_unknown_params: false - mkdocs-jupyter extra_javascript: - 
https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML extra_css: - css/mkdocs.css - css/codehilite.cssarogozhnikov-einops-ad2c8d6/pyproject.toml000066400000000000000000000062241475201674600211560ustar00rootroot00000000000000[build-system] requires = ["hatchling>=1.10.0"] build-backend = "hatchling.build" [project] name = "einops" description = "A new flavour of deep learning operations" readme = "README.md" requires-python = ">=3.8" # in sync with target_version keywords = [ 'deep learning', 'neural networks', 'tensor manipulation', 'machine learning', 'scientific computations', 'einops', ] license = { text = 'MIT' } classifiers = [ 'Intended Audience :: Science/Research', 'Programming Language :: Python :: 3', 'License :: OSI Approved :: MIT License', ] dependencies = [ # no run-time or installation-time dependencies ] dynamic = ["version"] authors = [{ name = 'Alex Rogozhnikov' }] [project.urls] Homepage = 'https://github.com/arogozhnikov/einops' [tool.setuptools] packages = ['einops', 'einops.layers'] [tool.hatch.version] path = "einops/__init__.py" [tool.hatch.build.targets.sdist] exclude = [ "/.devcontainer", "/.github", "/.idea", "/.pytest_cache", "/build", "/dist", "/docs", "/docs_src", "/scripts", "/log", ] [tool.hatch.build.targets.wheel] # should use packages from main section [tool.hatch.envs.docs] dependencies = [ "mkdocs~=1.6.1", "mkdocs-material~=9.5.34", "mkdocstrings[python]~=0.26.1", "mkdocs-jupyter~=0.25.0", # pygments is required by codehilite (highlighting in mkdocs) "pygments~=2.18.0", ] [tool.hatch.envs.docs.scripts] # For examples to build one has to run: # hatch run docs:build convert = "python scripts/convert_readme.py" build = "convert && mkdocs build --clean --strict {args}" serve = "convert && mkdocs serve --dev-addr localhost:8000 {args}" deploy = "convert && mkdocs build --clean --strict && mkdocs gh-deploy" # when mkdocs deployed from github actions, it requires --force. Reason unclear. 
deploy_force = "convert && mkdocs build --clean --strict && mkdocs gh-deploy --force" [tool.hatch.envs.pypi.scripts] # hatch run pypi:deploy_test deploy_test = "hatch build --clean && hatch publish -r test" deploy = "hatch build --clean && hatch publish" [tool.pytest.ini_options] # suppressing irrelevant warnings from google's tensorflow and pb2 on m1 mac # should be removed in 2023 filterwarnings = [ "ignore:Call to deprecated create function FieldDescriptor", "ignore:Call to deprecated create function Descriptor", "ignore:Call to deprecated create function EnumDescriptor", "ignore:Call to deprecated create function EnumValueDescriptor", "ignore:Call to deprecated create function FileDescriptor", "ignore:Call to deprecated create function OneofDescriptor", ] [tool.ruff] line-length = 120 target-version = 'py38' cache-dir = "/tmp/ruff_cache" # move cache out of workdir [tool.ruff.format] docstring-code-format = false # do not reformat notebooks exclude = ["*.ipynb"] [tool.ruff.lint] select = ["E4", "E7", "E9", "F", "W"] # this notebook is not really a notebook, # but a set of examples to be compared exclude = ["*Pytorch.ipynb"] [tool.ruff.lint.per-file-ignores] "*.ipynb" = [ "E402", # Module level import not at top of cell "F811", # redefinition of unused "E702", # Multiple statements on one line (semicolon) ] arogozhnikov-einops-ad2c8d6/scripts/000077500000000000000000000000001475201674600177255ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/scripts/convert_readme.py000066400000000000000000000016531475201674600233010ustar00rootroot00000000000000""" Converts readme from github repo page to mkdocs-friendly """ from pathlib import Path original_text = Path(__file__).parent.parent.joinpath("README.md").read_text(encoding="utf-8") def replace_with_video_tag(line: str): if line.startswith("https://") and line.endswith(".mp4") and " " not in line: # treating as link to mp4 file. return "" # return f""" # \n\n
\n\n
# """.strip() else: # other lines are not touched return line new_content = "\n".join([replace_with_video_tag(line) for line in original_text.splitlines()]) # save contents docs_index = Path(__file__).parent.parent.joinpath("docs_src", "index.md") assert docs_index.parent.exists() docs_index.write_bytes(new_content.encode("utf-8")) print("Converted README.md") arogozhnikov-einops-ad2c8d6/scripts/pytorch_examples_source/000077500000000000000000000000001475201674600246735ustar00rootroot00000000000000arogozhnikov-einops-ad2c8d6/scripts/pytorch_examples_source/Pytorch.ipynb000066400000000000000000002764441475201674600274070ustar00rootroot00000000000000{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "
\n", " \n", " \"einops\n", " \n", "
\n", " [github],   \n", " tutorials\n", " [1] and\n", " [2]\n", "
\n", "
\n", "
\n", "
\n", "\n", "# Writing a better code with pytorch and einops\n", "\n", "\n", "

\n", "\n", "## Rewriting building blocks of deep learning \n", "\n", "Now let's get to examples from real world.\n", "These code fragments taken from official tutorials and popular repositories.\n", "\n", "Learn how to improve code and how `einops` can help you.\n", "\n", "**Left**: as it was, **Right**: improved version" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "#right\n", "# start from importing some stuff\n", "import torch\n", "import torch.nn as nn\n", "import torch.nn.functional as F\n", "import numpy as np\n", "import math\n", "\n", "from einops import rearrange, reduce, asnumpy, parse_shape\n", "from einops.layers.torch import Rearrange, Reduce" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "def initialize(model):\n", " for p in model.parameters():\n", " p.data[:] = torch.from_numpy(np.random.RandomState(sum(p.shape)).randn(*p.shape))\n", " return model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Simple ConvNet" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "#left\n", "class Net(nn.Module):\n", " def __init__(self):\n", " super(Net, self).__init__()\n", " self.conv1 = nn.Conv2d(1, 10, kernel_size=5)\n", " self.conv2 = nn.Conv2d(10, 20, kernel_size=5)\n", " self.conv2_drop = nn.Dropout2d()\n", " self.fc1 = nn.Linear(320, 50)\n", " self.fc2 = nn.Linear(50, 10)\n", "\n", " def forward(self, x):\n", " x = F.relu(F.max_pool2d(self.conv1(x), 2))\n", " x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))\n", " x = x.view(-1, 320)\n", " x = F.relu(self.fc1(x))\n", " x = F.dropout(x, training=self.training)\n", " x = self.fc2(x)\n", " return F.log_softmax(x, dim=1)\n", "\n", "conv_net_old = Net()" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "#right\n", "conv_net_new = nn.Sequential(\n", " nn.Conv2d(1, 10, kernel_size=5),\n", " 
nn.MaxPool2d(kernel_size=2),\n", " nn.ReLU(),\n", " nn.Conv2d(10, 20, kernel_size=5),\n", " nn.MaxPool2d(kernel_size=2),\n", " nn.ReLU(),\n", " nn.Dropout2d(),\n", " Rearrange('b c h w -> b (c h w)'),\n", " nn.Linear(320, 50),\n", " nn.ReLU(),\n", " nn.Dropout(),\n", " nn.Linear(50, 10),\n", " nn.LogSoftmax(dim=1)\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Reasons to prefer the new implementation:\n", "\n", "- in the original code (to the left), if the input size is changed and the batch size is divisible by 16 (which is usually so), we'll get something senseless after reshaping\n", " - the new code will explicitly raise an error in this case\n", "- we won't forget to use dropout with the flag self.training in the new version\n", "- the code is straightforward to read and analyze\n", "- sequential makes printing / saving / passing trivial. And your code is not needed to load a model (which also has a number of benefits)\n", "- don't need logsoftmax? Now you can use `conv_net_new[:-1]`. One more reason to prefer `nn.Sequential`\n", "- ... 
and we could also add inplace for ReLU\n" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "torch.Size([4, 10])" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "conv_net_old(torch.zeros([16, 1, 20, 20])).shape\n", "# conv_net_new(torch.zeros([16, 1, 20, 20])).shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Super-resolution\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "#left\n", "class SuperResolutionNetOld(nn.Module):\n", " def __init__(self, upscale_factor):\n", " super(SuperResolutionNetOld, self).__init__()\n", "\n", " self.relu = nn.ReLU()\n", " self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))\n", " self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1))\n", " self.conv3 = nn.Conv2d(64, 32, (3, 3), (1, 1), (1, 1))\n", " self.conv4 = nn.Conv2d(32, upscale_factor ** 2, (3, 3), (1, 1), (1, 1))\n", " self.pixel_shuffle = nn.PixelShuffle(upscale_factor)\n", "\n", " def forward(self, x):\n", " x = self.relu(self.conv1(x))\n", " x = self.relu(self.conv2(x))\n", " x = self.relu(self.conv3(x))\n", " x = self.pixel_shuffle(self.conv4(x))\n", " return x" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "#right\n", "def SuperResolutionNetNew(upscale_factor):\n", " return nn.Sequential(\n", " nn.Conv2d(1, 64, kernel_size=5, padding=2),\n", " nn.ReLU(inplace=True),\n", " nn.Conv2d(64, 64, kernel_size=3, padding=1),\n", " nn.ReLU(inplace=True),\n", " nn.Conv2d(64, 32, kernel_size=3, padding=1),\n", " nn.ReLU(inplace=True),\n", " nn.Conv2d(32, upscale_factor ** 2, kernel_size=3, padding=1),\n", " Rearrange('b (h2 w2) h w -> b (h h2) (w w2)', h2=upscale_factor, w2=upscale_factor),\n", " )" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here is the difference:\n", "\n", "- no need in special instruction pixel_shuffle (and result is 
transferable between frameworks)\n", "- output doesn't contain a fake axis (and we could do the same for the input)\n", "- inplace ReLU is used now; for high-resolution pictures that becomes critical and saves us much memory\n", "- and all the benefits of nn.Sequential again" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "model1 = initialize(SuperResolutionNetOld(upscale_factor=3))\n", "model2 = initialize(SuperResolutionNetNew(upscale_factor=3))" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "assert torch.allclose(model1(torch.zeros(1, 1, 30, 30)), model2(torch.zeros(1, 1, 30, 30))[None])" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "## that's how this code was meant to be used\n", "\n", "# from PIL import Image\n", "# img = Image.open(opt.input_image).convert('YCbCr')\n", "# y, cb, cr = img.split()\n", "\n", "# model = torch.load(opt.model)\n", "# img_to_tensor = ToTensor()\n", "# input = img_to_tensor(y).view(1, -1, y.size[1], y.size[0])\n", "\n", "# if opt.cuda:\n", "# model = model.cuda()\n", "# input = input.cuda()\n", "\n", "# out = model(input)\n", "# out = out.cpu()\n", "# out_img_y = out[0].detach().numpy()\n", "# out_img_y *= 255.0\n", "# out_img_y = out_img_y.clip(0, 255)\n", "# out_img_y = Image.fromarray(np.uint8(out_img_y[0]), mode='L')\n", "\n", "# out_img_cb = cb.resize(out_img_y.size, Image.BICUBIC)\n", "# out_img_cr = cr.resize(out_img_y.size, Image.BICUBIC)\n", "# out_img = Image.merge('YCbCr', [out_img_y, out_img_cb, out_img_cr]).convert('RGB')\n", "\n", "## Benefits\n", "\n", "# - no need to remember the order of components in PIL.Image.size (as you see, it is actually different)\n", "# - code explicitly shows shapes passed in and out\n", "# - normalization to [0, 1] range and back is also explicit (one has to remember that in the original code division by 255 is done by ToTensor)\n", "\n", 
"input_image = '../../logo/einops_logo_350x350.png'\n", "from PIL import Image\n", "import numpy as np\n", "\n", "from torchvision.transforms import ToTensor\n", "model = SuperResolutionNetOld(upscale_factor=2)\n", "\n", "img = Image.open(input_image).convert('YCbCr')\n", "y, cb, cr = img.split()\n", "\n", "img_to_tensor = ToTensor()\n", "input = img_to_tensor(y).view(1, -1, y.size[1], y.size[0])\n", "out = model(input)\n", "\n", "out_img_y = out[0].detach().numpy()\n", "out_img_y = np.clip(out_img_y[0] * 255, 0, 255)\n", "\n", "model = SuperResolutionNetNew(upscale_factor=2)\n", "\n", "img = Image.open(input_image).convert('YCbCr')\n", "y, cb, cr = img.split()\n", "\n", "# TODO numpy.asarray\n", "y = torch.from_numpy(np.array(y, dtype='float32') / 255)\n", "out = model(rearrange(y, 'h w -> () () h w'))\n", "\n", "out_img_y = asnumpy(rearrange(out, '() h w -> h w'))\n", "out_img_y = np.clip(out_img_y * 255, 0, 255)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Restyling Gram matrix for style transfer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "\n", "The original code is already good: the first line shows what kind of input is expected\n", "\n", "- the einsum operation should be read as:\n", " - for each batch and for each pair of channels, we sum over h and w.\n", "- I've also changed normalization, because that's how the Gram matrix is defined; otherwise we should call it a normalized Gram matrix or the like" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [], "source": [ "#left\n", "def gram_matrix_old(y):\n", " (b, ch, h, w) = y.size()\n", " features = y.view(b, ch, w * h)\n", " features_t = features.transpose(1, 2)\n", " gram = features.bmm(features_t) / (ch * h * w)\n", " return gram" ] }, 
{ "cell_type": "markdown", "metadata": {}, "source": [ "It would be great to use just `'b c1 h w,b c2 h w->b c1 c2'`, but einsum supports only one-letter axes" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "x = torch.randn([32, 128, 40, 40])" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "7.58 ms ± 492 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n", "9.66 ms ± 258 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" ] } ], "source": [ "%timeit gram_matrix_old(x).sum()\n", "%timeit gram_matrix_new(x).sum()" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [], "source": [ "assert torch.allclose(gram_matrix_old(x), gram_matrix_new(x) / 128)" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "# x = x.to('cuda')\n", "# %timeit -n100 gram_matrix_old(x).sum(); torch.cuda.synchronize()\n", "# %timeit -n100 gram_matrix_new(x).sum(); torch.cuda.synchronize()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Recurrent model\n", "\n", "All we did here was make information about shapes explicit to skip deciphering\n", "\n", "\n" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [], "source": [ "#left\n", "class RNNModelOld(nn.Module):\n", " \"\"\"Container module with an encoder, a recurrent module, and a decoder.\"\"\"\n", " def __init__(self, ntoken, ninp, nhid, nlayers, dropout=0.5):\n", " super(RNNModelOld, self).__init__()\n", " self.drop = nn.Dropout(dropout)\n", " self.encoder = nn.Embedding(ntoken, ninp)\n", " self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout)\n", " self.decoder = nn.Linear(nhid, ntoken)\n", "\n", " def forward(self, input, hidden):\n", " emb = self.drop(self.encoder(input))\n", " output, hidden = self.rnn(emb, hidden)\n", " output = self.drop(output)\n", " decoded = 
self.decoder(output.view(output.size(0)*output.size(1), output.size(2)))\n", " return decoded.view(output.size(0), output.size(1), decoded.size(1)), hidden\n" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [], "source": [ "#right\n", "class RNNModelNew(nn.Module):\n", " \"\"\"Container module with an encoder, a recurrent module, and a decoder.\"\"\"\n", " def __init__(self, ntoken, ninp, nhid, nlayers, dropout=0.5):\n", " super(RNNModelNew, self).__init__()\n", " self.drop = nn.Dropout(p=dropout)\n", " self.encoder = nn.Embedding(ntoken, ninp)\n", " self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout)\n", " self.decoder = nn.Linear(nhid, ntoken)\n", "\n", " def forward(self, input, hidden):\n", " t, b = input.shape\n", " emb = self.drop(self.encoder(input))\n", " output, hidden = self.rnn(emb, hidden)\n", " output = rearrange(self.drop(output), 't b nhid -> (t b) nhid')\n", " decoded = rearrange(self.decoder(output), '(t b) token -> t b token', t=t, b=b)\n", " return decoded, hidden" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Channel shuffle (from shufflenet) \n", "" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [], "source": [ "#left\n", "def channel_shuffle_old(x, groups):\n", " batchsize, num_channels, height, width = x.data.size()\n", "\n", " channels_per_group = num_channels // groups\n", "\n", " # reshape\n", " x = x.view(batchsize, groups,\n", " channels_per_group, height, width)\n", "\n", " # transpose\n", " # - contiguous() required if transpose() is used before view().\n", " # See https://github.com/pytorch/pytorch/issues/764\n", " x = torch.transpose(x, 1, 2).contiguous()\n", "\n", " # flatten\n", " x = x.view(batchsize, -1, height, width)\n", "\n", " return x" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [], "source": [ "#right\n", "def channel_shuffle_new(x, groups):\n", " return rearrange(x, 'b (c1 c2) h w -> b (c2 c1) h w', c1=groups)" 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "While progress is obvious, this is not the limit. As you'll see below, we don't even need to write these couple of lines." ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [], "source": [ "x = torch.zeros([32, 64, 100, 100])" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "51.2 ms ± 2.18 ms per loop (mean ± std. dev. of 7 runs, 100 loops each)\n", "49.9 ms ± 594 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" ] } ], "source": [ "%timeit -n100 channel_shuffle_old(x, 8); torch.cuda.synchronize()\n", "%timeit -n100 channel_shuffle_new(x, 8); torch.cuda.synchronize()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Shufflenet" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [], "source": [ "def conv3x3(in_channels, out_channels, stride=1,\n", " padding=1, bias=True, groups=1):\n", " \"\"\"3x3 convolution with padding\n", " \"\"\"\n", " return nn.Conv2d(\n", " in_channels,\n", " out_channels,\n", " kernel_size=3,\n", " stride=stride,\n", " padding=padding,\n", " bias=bias,\n", " groups=groups)\n", "\n", "\n", "def conv1x1(in_channels, out_channels, groups=1):\n", " \"\"\"1x1 convolution with padding\n", " - Normal pointwise convolution When groups == 1\n", " - Grouped pointwise convolution when groups > 1\n", " \"\"\"\n", " return nn.Conv2d(\n", " in_channels,\n", " out_channels,\n", " kernel_size=1,\n", " groups=groups,\n", " stride=1)" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [], "source": [ "#left\n", "from collections import OrderedDict\n", "\n", "def channel_shuffle(x, groups):\n", " batchsize, num_channels, height, width = x.data.size()\n", "\n", " channels_per_group = num_channels // groups\n", "\n", " # reshape\n", " x = x.view(batchsize, groups,\n", " channels_per_group, height, width)\n", 
"\n", " # transpose\n", " # - contiguous() required if transpose() is used before view().\n", " # See https://github.com/pytorch/pytorch/issues/764\n", " x = torch.transpose(x, 1, 2).contiguous()\n", "\n", " # flatten\n", " x = x.view(batchsize, -1, height, width)\n", "\n", " return x\n", "\n", "class ShuffleUnitOld(nn.Module):\n", " def __init__(self, in_channels, out_channels, groups=3,\n", " grouped_conv=True, combine='add'):\n", "\n", " super(ShuffleUnitOld, self).__init__()\n", "\n", " self.in_channels = in_channels\n", " self.out_channels = out_channels\n", " self.grouped_conv = grouped_conv\n", " self.combine = combine\n", " self.groups = groups\n", " self.bottleneck_channels = self.out_channels // 4\n", "\n", " # define the type of ShuffleUnit\n", " if self.combine == 'add':\n", " # ShuffleUnit Figure 2b\n", " self.depthwise_stride = 1\n", " self._combine_func = self._add\n", " elif self.combine == 'concat':\n", " # ShuffleUnit Figure 2c\n", " self.depthwise_stride = 2\n", " self._combine_func = self._concat\n", "\n", " # ensure output of concat has the same channels as\n", " # original output channels.\n", " self.out_channels -= self.in_channels\n", " else:\n", " raise ValueError(\"Cannot combine tensors with \\\"{}\\\"\" \\\n", " \"Only \\\"add\\\" and \\\"concat\\\" are\" \\\n", " \"supported\".format(self.combine))\n", "\n", " # Use a 1x1 grouped or non-grouped convolution to reduce input channels\n", " # to bottleneck channels, as in a ResNet bottleneck module.\n", " # NOTE: Do not use group convolution for the first conv1x1 in Stage 2.\n", " self.first_1x1_groups = self.groups if grouped_conv else 1\n", "\n", " self.g_conv_1x1_compress = self._make_grouped_conv1x1(\n", " self.in_channels,\n", " self.bottleneck_channels,\n", " self.first_1x1_groups,\n", " batch_norm=True,\n", " relu=True\n", " )\n", "\n", " # 3x3 depthwise convolution followed by batch normalization\n", " self.depthwise_conv3x3 = conv3x3(\n", " self.bottleneck_channels, 
self.bottleneck_channels,\n", " stride=self.depthwise_stride, groups=self.bottleneck_channels)\n", " self.bn_after_depthwise = nn.BatchNorm2d(self.bottleneck_channels)\n", "\n", " # Use 1x1 grouped convolution to expand from\n", " # bottleneck_channels to out_channels\n", " self.g_conv_1x1_expand = self._make_grouped_conv1x1(\n", " self.bottleneck_channels,\n", " self.out_channels,\n", " self.groups,\n", " batch_norm=True,\n", " relu=False\n", " )\n", "\n", "\n", " @staticmethod\n", " def _add(x, out):\n", " # residual connection\n", " return x + out\n", "\n", "\n", " @staticmethod\n", " def _concat(x, out):\n", " # concatenate along channel axis\n", " return torch.cat((x, out), 1)\n", "\n", "\n", " def _make_grouped_conv1x1(self, in_channels, out_channels, groups,\n", " batch_norm=True, relu=False):\n", "\n", " modules = OrderedDict()\n", " conv = conv1x1(in_channels, out_channels, groups=groups)\n", " modules['conv1x1'] = conv\n", "\n", " if batch_norm:\n", " modules['batch_norm'] = nn.BatchNorm2d(out_channels)\n", " if relu:\n", " modules['relu'] = nn.ReLU()\n", " if len(modules) > 1:\n", " return nn.Sequential(modules)\n", " else:\n", " return conv\n", "\n", "\n", " def forward(self, x):\n", " # save for combining later with output\n", " residual = x\n", " if self.combine == 'concat':\n", " residual = F.avg_pool2d(residual, kernel_size=3,\n", " stride=2, padding=1)\n", "\n", " out = self.g_conv_1x1_compress(x)\n", " out = channel_shuffle(out, self.groups)\n", " out = self.depthwise_conv3x3(out)\n", " out = self.bn_after_depthwise(out)\n", " out = self.g_conv_1x1_expand(out)\n", "\n", " out = self._combine_func(residual, out)\n", " return F.relu(out)" ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [], "source": [ "#right\n", "class ShuffleUnitNew(nn.Module):\n", " def __init__(self, in_channels, out_channels, groups=3,\n", " grouped_conv=True, combine='add'):\n", " super().__init__()\n", " first_1x1_groups = groups if grouped_conv 
else 1\n", " bottleneck_channels = out_channels // 4\n", " self.combine = combine\n", " if combine == 'add':\n", " # ShuffleUnit Figure 2b\n", " self.left = Rearrange('...->...') # identity\n", " depthwise_stride = 1\n", " else:\n", " # ShuffleUnit Figure 2c\n", " self.left = nn.AvgPool2d(kernel_size=3, stride=2, padding=1)\n", " depthwise_stride = 2\n", " # ensure output of concat has the same channels as original output channels.\n", " out_channels -= in_channels\n", " assert out_channels > 0\n", "\n", " self.right = nn.Sequential(\n", " # Use a 1x1 grouped or non-grouped convolution to reduce input channels\n", " # to bottleneck channels, as in a ResNet bottleneck module.\n", " conv1x1(in_channels, bottleneck_channels, groups=first_1x1_groups),\n", " nn.BatchNorm2d(bottleneck_channels),\n", " nn.ReLU(inplace=True),\n", " # channel shuffle\n", " Rearrange('b (c1 c2) h w -> b (c2 c1) h w', c1=groups),\n", " # 3x3 depthwise convolution followed by batch\n", " conv3x3(bottleneck_channels, bottleneck_channels,\n", " stride=depthwise_stride, groups=bottleneck_channels),\n", " nn.BatchNorm2d(bottleneck_channels),\n", " # Use 1x1 grouped convolution to expand from\n", " # bottleneck_channels to out_channels\n", " conv1x1(bottleneck_channels, out_channels, groups=groups),\n", " nn.BatchNorm2d(out_channels),\n", " )\n", "\n", " def forward(self, x):\n", " if self.combine == 'add':\n", " combined = self.left(x) + self.right(x)\n", " else:\n", " combined = torch.cat([self.left(x), self.right(x)], dim=1)\n", " return F.relu(combined, inplace=True)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Rewriting the code helped to identify:\n", "\n", "- There is no sense in doing reshuffling and not using groups in the first convolution\n", " (indeed, in the paper it is not so). 
However, the result is an equivalent model.\n", "- It is also strange that the first convolution may not be grouped, while the last convolution is always grouped\n", " (and that is different from the paper)\n", "\n", "Other comments:\n", "\n", "- There is an identity layer for pytorch introduced here\n", "- The last thing left is to get rid of conv1x1 and conv3x3 in the code - those are no better than the standard ones" ] }, { "cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [], "source": [ "model1 = ShuffleUnitOld(32, 32, groups=4, grouped_conv=True, combine='add')\n", "model2 = ShuffleUnitNew(32, 32, groups=4, grouped_conv=True, combine='add')" ] }, { "cell_type": "code", "execution_count": 27, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 27, "metadata": {}, "output_type": "execute_result" } ], "source": [ "x = torch.randn(1, 32, 14, 14)\n", "initialize(model1)\n", "initialize(model2)\n", "torch.allclose(model1(x), model2(x))" ] }, { "cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [], "source": [ "import pickle\n", "dump1 = pickle.dumps(model1._combine_func)\n", "dump2 = pickle.dumps(model2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Simplifying ResNet" ] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [], "source": [ "#left\n", "class ResNetOld(nn.Module):\n", "\n", " def __init__(self, block, layers, num_classes=1000):\n", " self.inplanes = 64\n", " super(ResNetOld, self).__init__()\n", " self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,\n", " bias=False)\n", " self.bn1 = nn.BatchNorm2d(64)\n", " self.relu = nn.ReLU(inplace=True)\n", " self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)\n", " self.layer1 = self._make_layer(block, 64, layers[0])\n", " self.layer2 = self._make_layer(block, 128, layers[1], stride=2)\n", " self.layer3 = self._make_layer(block, 256, layers[2], stride=2)\n", " self.layer4 = 
self._make_layer(block, 512, layers[3], stride=2)\n", " self.avgpool = nn.AvgPool2d(7, stride=1)\n", "\n", " self.fc = nn.Linear(512 * block.expansion, num_classes)\n", "\n", " for m in self.modules():\n", " if isinstance(m, nn.Conv2d):\n", " n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels\n", " m.weight.data.normal_(0, math.sqrt(2. / n))\n", " elif isinstance(m, nn.BatchNorm2d):\n", " m.weight.data.fill_(1)\n", " m.bias.data.zero_()\n", "\n", " def _make_layer(self, block, planes, blocks, stride=1):\n", " downsample = None\n", " if stride != 1 or self.inplanes != planes * block.expansion:\n", " downsample = nn.Sequential(\n", " nn.Conv2d(self.inplanes, planes * block.expansion,\n", " kernel_size=1, stride=stride, bias=False),\n", " nn.BatchNorm2d(planes * block.expansion),\n", " )\n", "\n", " layers = []\n", " layers.append(block(self.inplanes, planes, stride, downsample))\n", " self.inplanes = planes * block.expansion\n", " for i in range(1, blocks):\n", " layers.append(block(self.inplanes, planes))\n", "\n", " return nn.Sequential(*layers)\n", "\n", " def forward(self, x):\n", " x = self.conv1(x)\n", " x = self.bn1(x)\n", " x = self.relu(x)\n", " x = self.maxpool(x)\n", "\n", " x = self.layer1(x)\n", " x = self.layer2(x)\n", " x = self.layer3(x)\n", " x = self.layer4(x)\n", " x = self.avgpool(x)\n", " x = x.view(x.size(0), -1)\n", " x = self.fc(x)\n", "\n", " return x" ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [], "source": [ "#right\n", "def make_layer(inplanes, planes, block, n_blocks, stride=1):\n", " downsample = None\n", " if stride != 1 or inplanes != planes * block.expansion:\n", " # output size won't match input, so adjust residual\n", " downsample = nn.Sequential(\n", " nn.Conv2d(inplanes, planes * block.expansion,\n", " kernel_size=1, stride=stride, bias=False),\n", " nn.BatchNorm2d(planes * block.expansion),\n", " )\n", " return nn.Sequential(\n", " block(inplanes, planes, stride, downsample),\n", " 
*[block(planes * block.expansion, planes) for _ in range(1, n_blocks)]\n", " )\n", "\n", "\n", "def ResNetNew(block, layers, num_classes=1000):\n", " e = block.expansion\n", "\n", " resnet = nn.Sequential(\n", " Rearrange('b c h w -> b c h w', c=3, h=224, w=224),\n", " nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False),\n", " nn.BatchNorm2d(64),\n", " nn.ReLU(inplace=True),\n", " nn.MaxPool2d(kernel_size=3, stride=2, padding=1),\n", " make_layer(64, 64, block, layers[0], stride=1),\n", " make_layer(64 * e, 128, block, layers[1], stride=2),\n", " make_layer(128 * e, 256, block, layers[2], stride=2),\n", " make_layer(256 * e, 512, block, layers[3], stride=2),\n", " # combined AvgPool and view in one averaging operation\n", " Reduce('b c h w -> b c', 'mean'),\n", " nn.Linear(512 * e, num_classes),\n", " )\n", "\n", " # initialization\n", " for m in resnet.modules():\n", " if isinstance(m, nn.Conv2d):\n", " n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels\n", " m.weight.data.normal_(0, math.sqrt(2. 
/ n))\n", " elif isinstance(m, nn.BatchNorm2d):\n", " m.weight.data.fill_(1)\n", " m.bias.data.zero_()\n", " return resnet" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Changes:\n", "\n", "- explicit check for input shape\n", "- no views and a simple sequential structure; the output is just nn.Sequential, so it can always be saved/passed/etc.\n", "- no need for AvgPool and additional views; this place is much clearer now\n", "- `make_layer` doesn't use internal state (that was quite a faulty place)" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [], "source": [ "from torchvision.models.resnet import BasicBlock, Bottleneck, ResNet" ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [], "source": [ "x = torch.randn(2, 3, 224, 224)\n", "with torch.no_grad():\n", " model_old = ResNetOld(BasicBlock, layers=[2, 2, 2, 3])\n", " model_new = ResNetNew(BasicBlock, layers=[2, 2, 2, 3])\n", " initialize(model_old)\n", " initialize(model_new)\n", " assert torch.allclose(model_old(x), model_new(x), atol=1e-3)" ] }, { "cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [], "source": [ "# with torch.no_grad():\n", "# x = torch.randn([2, 512, 7, 7])\n", "# torch.allclose(nn.AvgPool2d(7)(x), reduce(x, 'b c h w -> b c', 'mean'), atol=1e-8)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Improving RNN language modelling" ] }, { "cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [], "source": [ "#left\n", "class RNNOld(nn.Module):\n", " def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim, n_layers, bidirectional, dropout):\n", " super().__init__()\n", "\n", " self.embedding = nn.Embedding(vocab_size, embedding_dim)\n", " self.rnn = nn.LSTM(embedding_dim, hidden_dim, num_layers=n_layers,\n", " bidirectional=bidirectional, dropout=dropout)\n", " self.fc = nn.Linear(hidden_dim*2, output_dim)\n", " self.dropout = nn.Dropout(dropout)\n", "\n", " def 
forward(self, x):\n", " #x = [sent len, batch size]\n", "\n", " embedded = self.dropout(self.embedding(x))\n", "\n", " #embedded = [sent len, batch size, emb dim]\n", "\n", " output, (hidden, cell) = self.rnn(embedded)\n", "\n", " #output = [sent len, batch size, hid dim * num directions]\n", " #hidden = [num layers * num directions, batch size, hid dim]\n", " #cell = [num layers * num directions, batch size, hid dim]\n", "\n", " #concat the final forward (hidden[-2,:,:]) and backward (hidden[-1,:,:]) hidden layers\n", " #and apply dropout\n", "\n", " hidden = self.dropout(torch.cat((hidden[-2,:,:], hidden[-1,:,:]), dim=1))\n", "\n", " #hidden = [batch size, hid dim * num directions]\n", "\n", " return self.fc(hidden.squeeze(0))\n" ] }, { "cell_type": "code", "execution_count": 35, "metadata": {}, "outputs": [], "source": [ "#right\n", "class RNNNew(nn.Module):\n", " def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim, n_layers, bidirectional, dropout):\n", " super().__init__()\n", "\n", " self.embedding = nn.Embedding(vocab_size, embedding_dim)\n", " self.rnn = nn.LSTM(embedding_dim, hidden_dim, num_layers=n_layers,\n", " bidirectional=bidirectional, dropout=dropout)\n", " self.dropout = nn.Dropout(dropout)\n", " self.directions = 2 if bidirectional else 1\n", " self.fc = nn.Linear(hidden_dim * self.directions, output_dim)\n", "\n", " def forward(self, x):\n", " #x = [sent len, batch size]\n", " embedded = self.dropout(self.embedding(x))\n", "\n", " #embedded = [sent len, batch size, emb dim]\n", " output, (hidden, cell) = self.rnn(embedded)\n", "\n", " hidden = rearrange(hidden, '(layer dir) b c -> layer b (dir c)',\n", " dir=self.directions)\n", " # take the final layer's hidden\n", " return self.fc(self.dropout(hidden[-1]))\n" ] }, { "cell_type": "code", "execution_count": 36, "metadata": {}, "outputs": [], "source": [ "model_old = initialize(RNNOld(10, 10, 10, output_dim=15, n_layers=2, bidirectional=True, dropout=0.1)).eval()\n", "model_new = 
initialize(RNNNew(10, 10, 10, output_dim=15, n_layers=2, bidirectional=True, dropout=0.1)).eval()\n", "\n", "x = torch.randint(0, 10, size=[23, 10]).long()\n", "\n", "assert torch.allclose(model_old(x), model_new(x))" ] }, { "cell_type": "code", "execution_count": 37, "metadata": {}, "outputs": [], "source": [ "# this code fails\n", "# model_old = initialize(RNNOld(10, 10, 10, output_dim=15, n_layers=1, bidirectional=False, dropout=0.1)).eval()\n", "# model_old(x).shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- original code misbehaves for non-bidirectional models\n", "- ... and fails when bidirectional = False, and there is only one layer\n", "- modification of the code shows both how hidden is structured and how it is modified" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Writing FastText faster\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 38, "metadata": {}, "outputs": [], "source": [ "#left\n", "class FastTextOld(nn.Module):\n", " def __init__(self, vocab_size, embedding_dim, output_dim):\n", " super().__init__()\n", "\n", " self.embedding = nn.Embedding(vocab_size, embedding_dim)\n", " self.fc = nn.Linear(embedding_dim, output_dim)\n", "\n", " def forward(self, x):\n", "\n", " #x = [sent len, batch size]\n", "\n", " embedded = self.embedding(x)\n", "\n", " #embedded = [sent len, batch size, emb dim]\n", "\n", " embedded = embedded.permute(1, 0, 2)\n", "\n", " #embedded = [batch size, sent len, emb dim]\n", "\n", " pooled = F.avg_pool2d(embedded, (embedded.shape[1], 1)).squeeze(1)\n", "\n", " #pooled = [batch size, embedding_dim]\n", "\n", " return self.fc(pooled)" ] }, { "cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [], "source": [ "#right\n", "def FastTextNew(vocab_size, embedding_dim, output_dim):\n", " return nn.Sequential(\n", " Rearrange('t b -> t b'),\n", " nn.Embedding(vocab_size, embedding_dim),\n", " Reduce('t b c -> b c', 'mean'),\n", " nn.Linear(embedding_dim, 
output_dim),\n", " Rearrange('b c -> b c'),\n", " )" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Some comments on new code:\n", "\n", "- first and last operations do nothing and can be removed\n", " - but were added to explicitly show expected input and output\n", "- this also gives you a flexibility of changing interface by editing a single line. Should you need to accept inputs as (batch, time), \n", " you just change first line to `Rearrange('b t -> t b'),`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# CNNs for text classification" ] }, { "cell_type": "code", "execution_count": 40, "metadata": {}, "outputs": [], "source": [ "#left\n", "class CNNOld(nn.Module):\n", " def __init__(self, vocab_size, embedding_dim, n_filters, filter_sizes, output_dim, dropout):\n", " super().__init__()\n", " self.embedding = nn.Embedding(vocab_size, embedding_dim)\n", " self.conv_0 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[0],embedding_dim))\n", " self.conv_1 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[1],embedding_dim))\n", " self.conv_2 = nn.Conv2d(in_channels=1, out_channels=n_filters, kernel_size=(filter_sizes[2],embedding_dim))\n", " self.fc = nn.Linear(len(filter_sizes)*n_filters, output_dim)\n", " self.dropout = nn.Dropout(dropout)\n", "\n", " def forward(self, x):\n", "\n", " #x = [sent len, batch size]\n", "\n", " x = x.permute(1, 0)\n", "\n", " #x = [batch size, sent len]\n", "\n", " embedded = self.embedding(x)\n", "\n", " #embedded = [batch size, sent len, emb dim]\n", "\n", " embedded = embedded.unsqueeze(1)\n", "\n", " #embedded = [batch size, 1, sent len, emb dim]\n", "\n", " conved_0 = F.relu(self.conv_0(embedded).squeeze(3))\n", " conved_1 = F.relu(self.conv_1(embedded).squeeze(3))\n", " conved_2 = F.relu(self.conv_2(embedded).squeeze(3))\n", "\n", " #conv_n = [batch size, n_filters, sent len - filter_sizes[n]]\n", "\n", " pooled_0 = F.max_pool1d(conved_0, 
conved_0.shape[2]).squeeze(2)\n", " pooled_1 = F.max_pool1d(conved_1, conved_1.shape[2]).squeeze(2)\n", " pooled_2 = F.max_pool1d(conved_2, conved_2.shape[2]).squeeze(2)\n", "\n", " #pooled_n = [batch size, n_filters]\n", "\n", " cat = self.dropout(torch.cat((pooled_0, pooled_1, pooled_2), dim=1))\n", "\n", " #cat = [batch size, n_filters * len(filter_sizes)]\n", "\n", " return self.fc(cat)" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [], "source": [ "#right\n", "class CNNNew(nn.Module):\n", " def __init__(self, vocab_size, embedding_dim, n_filters, filter_sizes, output_dim, dropout):\n", " super().__init__()\n", " self.embedding = nn.Embedding(vocab_size, embedding_dim)\n", " self.convs = nn.ModuleList([\n", " nn.Conv1d(embedding_dim, n_filters, kernel_size=size) for size in filter_sizes\n", " ])\n", " self.fc = nn.Linear(len(filter_sizes) * n_filters, output_dim)\n", " self.dropout = nn.Dropout(dropout)\n", "\n", " def forward(self, x):\n", " x = rearrange(x, 't b -> t b')\n", " emb = rearrange(self.embedding(x), 't b c -> b c t')\n", " pooled = [reduce(conv(emb), 'b c t -> b c', 'max') for conv in self.convs]\n", " concatenated = rearrange(pooled, 'filter b c -> b (filter c)')\n", " return self.fc(self.dropout(F.relu(concatenated)))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- Original code misuses Conv2d, while Conv1d is the right choice\n", "- Fixed code can work with any number of filter_sizes (and won't fail)\n", "- First line in new code does nothing, but was added for simplicity" ] }, { "cell_type": "code", "execution_count": 42, "metadata": {}, "outputs": [], "source": [ "# old_model = initialize(CNNOld(32, 32, 32, [1, 2, 4], 32, dropout=0.1)).eval()\n", "# new_model = initialize(CNNNew(32, 32, 32, [1, 2, 4], 32, dropout=0.1)).eval()\n", "\n", "# x = torch.zeros([10, 20]).long()\n", "# assert torch.allclose(old_model(x), new_model(x), atol=1e-3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": 
[ "# Highway convolutions\n", "\n", "- Highway convolutions are common in TTS systems. The code below makes the splitting a bit more explicit.\n", "- The splitting policy may eventually turn out to be important if the input previously had groups over the channel axis (group convolutions or bidirectional LSTMs/GRUs)\n", "- The same applies to GLU and gated units in general" ] }, { "cell_type": "code", "execution_count": 43, "metadata": {}, "outputs": [], "source": [ "#left\n", "class HighwayConv1dOld(nn.Conv1d):\n", " def forward(self, inputs):\n", " L = super(HighwayConv1dOld, self).forward(inputs)\n", " H1, H2 = torch.chunk(L, 2, 1) # chunk at the feature dim\n", " torch.sigmoid_(H1)\n", " return H1 * H2 + (1.0 - H1) * inputs" ] }, { "cell_type": "code", "execution_count": 44, "metadata": {}, "outputs": [], "source": [ "#right\n", "class HighwayConv1dNew(nn.Conv1d):\n", " def forward(self, inputs):\n", " L = super().forward(inputs)\n", " H1, H2 = rearrange(L, 'b (split c) t -> split b c t', split=2)\n", " torch.sigmoid_(H1)\n", " return H1 * H2 + (1.0 - H1) * inputs" ] }, { "cell_type": "code", "execution_count": 45, "metadata": {}, "outputs": [], "source": [ "hc1 = HighwayConv1dOld(10, 20, kernel_size=3, padding=1)\n", "hc2 = HighwayConv1dNew(10, 20, kernel_size=3, padding=1)\n", "initialize(hc1)\n", "initialize(hc2)\n", "fw1 = hc1(torch.zeros(1, 10, 100))\n", "fw2 = hc2(torch.zeros(1, 10, 100))\n", "assert torch.allclose(fw1, fw2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Tacotron's CBHG module\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 46, "metadata": {}, "outputs": [], "source": [ "#right\n", "class CBHG_Old(nn.Module):\n", " \"\"\"CBHG module: a recurrent neural network composed of:\n", " - 1-d convolution banks\n", " - Highway networks + residual connections\n", " - Bidirectional gated recurrent units\n", " \"\"\"\n", "\n", " def __init__(self, in_dim, K=16, projections=[128, 128]):\n", " super(CBHG_Old, self).__init__()\n", " self.in_dim = 
in_dim\n", " self.relu = nn.ReLU()\n", " self.conv1d_banks = nn.ModuleList(\n", " [BatchNormConv1d(in_dim, in_dim, kernel_size=k, stride=1,\n", " padding=k // 2, activation=self.relu)\n", " for k in range(1, K + 1)])\n", " self.max_pool1d = nn.MaxPool1d(kernel_size=2, stride=1, padding=1)\n", "\n", " in_sizes = [K * in_dim] + projections[:-1]\n", " activations = [self.relu] * (len(projections) - 1) + [None]\n", " self.conv1d_projections = nn.ModuleList(\n", " [BatchNormConv1d(in_size, out_size, kernel_size=3, stride=1,\n", " padding=1, activation=ac)\n", " for (in_size, out_size, ac) in zip(\n", " in_sizes, projections, activations)])\n", "\n", " self.pre_highway = nn.Linear(projections[-1], in_dim, bias=False)\n", " self.highways = nn.ModuleList(\n", " [Highway(in_dim, in_dim) for _ in range(4)])\n", "\n", " self.gru = nn.GRU(\n", " in_dim, in_dim, 1, batch_first=True, bidirectional=True)" ] }, { "cell_type": "code", "execution_count": 47, "metadata": {}, "outputs": [], "source": [ "#left\n", "def forward_old(self, inputs):\n", " # (B, T_in, in_dim)\n", " x = inputs\n", "\n", " # Needed to perform conv1d on time-axis\n", " # (B, in_dim, T_in)\n", " if x.size(-1) == self.in_dim:\n", " x = x.transpose(1, 2)\n", "\n", " T = x.size(-1)\n", "\n", " # (B, in_dim*K, T_in)\n", " # Concat conv1d bank outputs\n", " x = torch.cat([conv1d(x)[:, :, :T] for conv1d in self.conv1d_banks], dim=1)\n", " assert x.size(1) == self.in_dim * len(self.conv1d_banks)\n", " x = self.max_pool1d(x)[:, :, :T]\n", "\n", " for conv1d in self.conv1d_projections:\n", " x = conv1d(x)\n", "\n", " # (B, T_in, in_dim)\n", " # Back to the original shape\n", " x = x.transpose(1, 2)\n", "\n", " if x.size(-1) != self.in_dim:\n", " x = self.pre_highway(x)\n", "\n", " # Residual connection\n", " x += inputs\n", " for highway in self.highways:\n", " x = highway(x)\n", "\n", " # (B, T_in, in_dim*2)\n", " outputs, _ = self.gru(x)\n", "\n", " return outputs" ] }, { "cell_type": "code", "execution_count": 48, 
"metadata": {}, "outputs": [], "source": [ "#right\n", "def forward_new(self, inputs, input_lengths=None):\n", " x = rearrange(inputs, 'b t c -> b c t')\n", " _, _, T = x.shape\n", " # Concat conv1d bank outputs\n", " x = rearrange([conv1d(x)[:, :, :T] for conv1d in self.conv1d_banks],\n", " 'bank b c t -> b (bank c) t', c=self.in_dim)\n", " x = self.max_pool1d(x)[:, :, :T]\n", "\n", " for conv1d in self.conv1d_projections:\n", " x = conv1d(x)\n", " x = rearrange(x, 'b c t -> b t c')\n", " if x.size(-1) != self.in_dim:\n", " x = self.pre_highway(x)\n", "\n", " # Residual connection\n", " x += inputs\n", " for highway in self.highways:\n", " x = highway(x)\n", "\n", " # (B, T_in, in_dim*2)\n", " outputs, _ = self.gru(x)\n", "\n", " return outputs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There is still large room for improvement, but in this example only the forward function was changed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Simple attention\n", "Good news: there is no more need to guess the order of dimensions. 
Neither for inputs nor for outputs" ] }, { "cell_type": "code", "execution_count": 49, "metadata": {}, "outputs": [], "source": [ "#left\n", "class Attention(nn.Module):\n", " def __init__(self):\n", " super(Attention, self).__init__()\n", "\n", " def forward(self, K, V, Q):\n", " A = torch.bmm(K.transpose(1,2), Q) / np.sqrt(Q.shape[1])\n", " A = F.softmax(A, 1)\n", " R = torch.bmm(V, A)\n", " return torch.cat((R, Q), dim=1)" ] }, { "cell_type": "code", "execution_count": 50, "metadata": {}, "outputs": [], "source": [ "#right\n", "def attention(K, V, Q):\n", " _, n_channels, _ = K.shape\n", " A = torch.einsum('bct,bcl->btl', [K, Q])\n", " A = F.softmax(A * n_channels ** (-0.5), 1)\n", " R = torch.einsum('bct,btl->bcl', [V, A])\n", " return torch.cat((R, Q), dim=1)" ] }, { "cell_type": "code", "execution_count": 51, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "336 µs ± 7.9 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n", "336 µs ± 7.48 µs per loop (mean ± std. dev. 
of 7 runs, 100 loops each)\n" ] } ], "source": [ "args = dict(\n", " K=torch.zeros(32, 128, 40).cuda(),\n", " V=torch.zeros(32, 128, 40).cuda(),\n", " Q=torch.zeros(32, 128, 30).cuda(),\n", ")\n", "\n", "%timeit -n100 result_old = Attention()(**args); torch.cuda.synchronize()\n", "%timeit -n100 result_new = attention(**args); torch.cuda.synchronize()" ] }, { "cell_type": "code", "execution_count": 52, "metadata": {}, "outputs": [], "source": [ "result_old = Attention()(**args); torch.cuda.synchronize()\n", "result_new = attention(**args); torch.cuda.synchronize()\n", "assert torch.allclose(result_old, result_new)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Transformer's attention needs more attention" ] }, { "cell_type": "code", "execution_count": 53, "metadata": {}, "outputs": [], "source": [ "#left\n", "class ScaledDotProductAttention(nn.Module):\n", " ''' Scaled Dot-Product Attention '''\n", "\n", " def __init__(self, temperature, attn_dropout=0.1):\n", " super().__init__()\n", " self.temperature = temperature\n", " self.dropout = nn.Dropout(attn_dropout)\n", " self.softmax = nn.Softmax(dim=2)\n", "\n", " def forward(self, q, k, v, mask=None):\n", "\n", " attn = torch.bmm(q, k.transpose(1, 2))\n", " attn = attn / self.temperature\n", "\n", " if mask is not None:\n", " attn = attn.masked_fill(mask, -np.inf)\n", "\n", " attn = self.softmax(attn)\n", " attn = self.dropout(attn)\n", " output = torch.bmm(attn, v)\n", "\n", " return output, attn\n", "\n", "\n", "\n", "class MultiHeadAttentionOld(nn.Module):\n", " ''' Multi-Head Attention module '''\n", "\n", " def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):\n", " super().__init__()\n", "\n", " self.n_head = n_head\n", " self.d_k = d_k\n", " self.d_v = d_v\n", "\n", " self.w_qs = nn.Linear(d_model, n_head * d_k)\n", " self.w_ks = nn.Linear(d_model, n_head * d_k)\n", " self.w_vs = nn.Linear(d_model, n_head * d_v)\n", " nn.init.normal_(self.w_qs.weight, mean=0, std=np.sqrt(2.0 / (d_model + 
d_k)))\n", " nn.init.normal_(self.w_ks.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))\n", " nn.init.normal_(self.w_vs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_v)))\n", "\n", " self.attention = ScaledDotProductAttention(temperature=np.power(d_k, 0.5))\n", " self.layer_norm = nn.LayerNorm(d_model)\n", "\n", " self.fc = nn.Linear(n_head * d_v, d_model)\n", " nn.init.xavier_normal_(self.fc.weight)\n", "\n", " self.dropout = nn.Dropout(dropout)\n", "\n", "\n", " def forward(self, q, k, v, mask=None):\n", "\n", " d_k, d_v, n_head = self.d_k, self.d_v, self.n_head\n", "\n", " sz_b, len_q, _ = q.size()\n", " sz_b, len_k, _ = k.size()\n", " sz_b, len_v, _ = v.size()\n", "\n", " residual = q\n", "\n", " q = self.w_qs(q).view(sz_b, len_q, n_head, d_k)\n", " k = self.w_ks(k).view(sz_b, len_k, n_head, d_k)\n", " v = self.w_vs(v).view(sz_b, len_v, n_head, d_v)\n", "\n", " q = q.permute(2, 0, 1, 3).contiguous().view(-1, len_q, d_k) # (n*b) x lq x dk\n", " k = k.permute(2, 0, 1, 3).contiguous().view(-1, len_k, d_k) # (n*b) x lk x dk\n", " v = v.permute(2, 0, 1, 3).contiguous().view(-1, len_v, d_v) # (n*b) x lv x dv\n", "\n", " mask = mask.repeat(n_head, 1, 1) # (n*b) x .. 
x ..\n", " output, attn = self.attention(q, k, v, mask=mask)\n", "\n", " output = output.view(n_head, sz_b, len_q, d_v)\n", " output = output.permute(1, 2, 0, 3).contiguous().view(sz_b, len_q, -1) # b x lq x (n*dv)\n", "\n", " output = self.dropout(self.fc(output))\n", " output = self.layer_norm(output + residual)\n", "\n", " return output, attn\n" ] }, { "cell_type": "code", "execution_count": 54, "metadata": {}, "outputs": [], "source": [ "#right\n", "class MultiHeadAttentionNew(nn.Module):\n", " def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):\n", " super().__init__()\n", " self.n_head = n_head\n", "\n", " self.w_qs = nn.Linear(d_model, n_head * d_k)\n", " self.w_ks = nn.Linear(d_model, n_head * d_k)\n", " self.w_vs = nn.Linear(d_model, n_head * d_v)\n", "\n", " nn.init.normal_(self.w_qs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))\n", " nn.init.normal_(self.w_ks.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_k)))\n", " nn.init.normal_(self.w_vs.weight, mean=0, std=np.sqrt(2.0 / (d_model + d_v)))\n", "\n", " self.fc = nn.Linear(n_head * d_v, d_model)\n", " nn.init.xavier_normal_(self.fc.weight)\n", " self.dropout = nn.Dropout(p=dropout)\n", " self.layer_norm = nn.LayerNorm(d_model)\n", "\n", " def forward(self, q, k, v, mask=None):\n", " residual = q\n", " q = rearrange(self.w_qs(q), 'b l (head k) -> head b l k', head=self.n_head)\n", " k = rearrange(self.w_ks(k), 'b t (head k) -> head b t k', head=self.n_head)\n", " v = rearrange(self.w_vs(v), 'b t (head v) -> head b t v', head=self.n_head)\n", " attn = torch.einsum('hblk,hbtk->hblt', [q, k]) / np.sqrt(q.shape[-1])\n", " if mask is not None:\n", " attn = attn.masked_fill(mask[None], -np.inf)\n", " attn = torch.softmax(attn, dim=3)\n", " output = torch.einsum('hblt,hbtv->hblv', [attn, v])\n", " output = rearrange(output, 'head b l v -> b l (head v)')\n", " output = self.dropout(self.fc(output))\n", " output = self.layer_norm(output + residual)\n", " return output, attn\n" ] }, { "cell_type": 
"markdown", "metadata": {}, "source": [ "Benefits of new implementation\n", "\n", "- we have one module, not two\n", "- now code does not fail for None mask\n", "- the amount of caveats in the original code that we removed is huge. \n", " Try erasing comments and deciphering what happens there" ] }, { "cell_type": "code", "execution_count": 55, "metadata": {}, "outputs": [], "source": [ "# Poor implementation of torch.einsum, so code below doesn't work\n", "class MultiHeadAttentionHard(nn.Module):\n", " def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):\n", " super().__init__()\n", " self.w_qs = nn.Parameter(torch.randn(d_model, n_head, d_k) * np.sqrt(2.0 / (d_model + d_k)))\n", " self.w_ks = nn.Parameter(torch.randn(d_model, n_head, d_k) * np.sqrt(2.0 / (d_model + d_k)))\n", " self.w_vs = nn.Parameter(torch.randn(d_model, n_head, d_v) * np.sqrt(2.0 / (d_model + d_v)))\n", " self.w_fc = nn.Parameter(torch.randn(d_model, n_head, d_v) * np.sqrt(2.0 / (d_model + n_head * d_v)))\n", "\n", " self.dropout = nn.Dropout(p=dropout)\n", " self.layer_norm = nn.LayerNorm(d_model)\n", "\n", " def forward(self, q, k, v, mask=None):\n", " attn = torch.einsum('bld,dhc,bte,ehc->hblt', [q, self.w_qs, k, self.w_ks])\n", " if mask is not None:\n", " attn = attn.masked_fill(mask[None], -np.inf)\n", " attn = torch.softmax(attn, dim=3)\n", " output = torch.einsum('hblt,bte,ehv,dhv->hbd', [attn, v, self.w_vs, self.w_fc])\n", " output = self.dropout(output)\n", " output = self.layer_norm(output + q)\n", " return output, attn" ] }, { "cell_type": "code", "execution_count": 56, "metadata": {}, "outputs": [], "source": [ "n_heads = 8\n", "d_k = 32\n", "d_v = 64\n", "d_model = 100\n", "t = 51\n", "l = 53\n", "batch = 30\n", "\n", "layer1 = initialize(MultiHeadAttentionOld(n_heads, d_k=d_k, d_v=d_v, d_model=d_model)).eval().cuda()\n", "layer2 = initialize(MultiHeadAttentionNew(n_heads, d_k=d_k, d_v=d_v, d_model=d_model)).eval().cuda()\n", "\n", "args = dict(\n", " q=torch.randn(batch, 
l, d_model),\n", " k=torch.randn(batch, t, d_model) * 0.1,\n", " v=torch.randn(batch, t, d_model),\n", " mask=torch.randn(batch, l, t) > 0,\n", ")\n", "args = {k:v.cuda() for k, v in args.items()}\n", "\n", "o1, a1 = layer1(**args)\n", "o2, a2 = layer2(**args)" ] }, { "cell_type": "code", "execution_count": 57, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(torch.Size([240, 53, 51]), torch.Size([8, 30, 53, 51]))" ] }, "execution_count": 57, "metadata": {}, "output_type": "execute_result" } ], "source": [ "a1.shape, a2.shape" ] }, { "cell_type": "code", "execution_count": 58, "metadata": {}, "outputs": [], "source": [ "assert torch.allclose(o1, o2)" ] }, { "cell_type": "code", "execution_count": 59, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "4.82 ms ± 73.9 µs per loop (mean ± std. dev. of 7 runs, 200 loops each)\n", "4.6 ms ± 116 µs per loop (mean ± std. dev. of 7 runs, 200 loops each)\n" ] } ], "source": [ "%timeit -n 200 layer1(**args); torch.cuda.synchronize()\n", "%timeit -n 200 layer2(**args); torch.cuda.synchronize()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Self-attention GANs\n", "\n", "SAGANs are currently SotA for image generation, and can be simplified using same tricks.\n", "\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 60, "metadata": {}, "outputs": [], "source": [ "#left\n", "class Self_Attn_Old(nn.Module):\n", " \"\"\" Self attention Layer\"\"\"\n", " def __init__(self,in_dim,activation):\n", " super(Self_Attn_Old,self).__init__()\n", " self.chanel_in = in_dim\n", " self.activation = activation\n", "\n", " self.query_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim//8 , kernel_size= 1)\n", " self.key_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim//8 , kernel_size= 1)\n", " self.value_conv = nn.Conv2d(in_channels = in_dim , out_channels = in_dim , kernel_size= 1)\n", " self.gamma = nn.Parameter(torch.zeros(1))\n", "\n", " self.softmax 
= nn.Softmax(dim=-1) #\n", "\n", " def forward(self, x):\n", " \"\"\"\n", " inputs :\n", " x : input feature maps( B X C X W X H)\n", " returns :\n", " out : self attention value + input feature\n", " attention: B X N X N (N is Width*Height)\n", " \"\"\"\n", "\n", " m_batchsize,C,width ,height = x.size()\n", " proj_query = self.query_conv(x).view(m_batchsize,-1,width*height).permute(0,2,1) # B X CX(N)\n", " proj_key = self.key_conv(x).view(m_batchsize,-1,width*height) # B X C x (*W*H)\n", " energy = torch.bmm(proj_query,proj_key) # transpose check\n", " attention = self.softmax(energy) # BX (N) X (N)\n", " proj_value = self.value_conv(x).view(m_batchsize,-1,width*height) # B X C X N\n", "\n", " out = torch.bmm(proj_value,attention.permute(0,2,1) )\n", " out = out.view(m_batchsize,C,width,height)\n", "\n", " out = self.gamma*out + x\n", " return out,attention" ] }, { "cell_type": "code", "execution_count": 61, "metadata": {}, "outputs": [], "source": [ "#right\n", "class Self_Attn_New(nn.Module):\n", " \"\"\" Self attention Layer\"\"\"\n", " def __init__(self, in_dim):\n", " super().__init__()\n", " self.query_conv = nn.Conv2d(in_dim, out_channels=in_dim//8, kernel_size=1)\n", " self.key_conv = nn.Conv2d(in_dim, out_channels=in_dim//8, kernel_size=1)\n", " self.value_conv = nn.Conv2d(in_dim, out_channels=in_dim, kernel_size=1)\n", " self.gamma = nn.Parameter(torch.zeros([1]))\n", "\n", " def forward(self, x):\n", " proj_query = rearrange(self.query_conv(x), 'b c h w -> b (h w) c')\n", " proj_key = rearrange(self.key_conv(x), 'b c h w -> b c (h w)')\n", " proj_value = rearrange(self.value_conv(x), 'b c h w -> b (h w) c')\n", " energy = torch.bmm(proj_query, proj_key)\n", " attention = F.softmax(energy, dim=2)\n", " out = torch.bmm(attention, proj_value)\n", " out = x + self.gamma * rearrange(out, 'b (h w) c -> b c h w',\n", " **parse_shape(x, 'b c h w'))\n", " return out, attention" ] }, { "cell_type": "code", "execution_count": 62, "metadata": {}, "outputs": [], 
"source": [ "model_old = initialize(Self_Attn_Old(128, None))\n", "model_new = initialize(Self_Attn_New(128))" ] }, { "cell_type": "code", "execution_count": 63, "metadata": {}, "outputs": [], "source": [ "x = torch.randn(2, 128, 30, 30)\n", "assert torch.allclose(model_old(x)[0], model_new(x)[0], atol=1e-4)\n", "# returned attention is transposed\n", "assert torch.allclose(model_old(x)[1], model_new(x)[1], atol=1e-4)" ] }, { "cell_type": "code", "execution_count": 64, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "15.9 ms ± 990 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n", "14.5 ms ± 81.6 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" ] } ], "source": [ "%timeit model_old(x)[0].sum().item()\n", "%timeit model_new(x)[0].sum().item()\n", "# surprise - I had slow down here due to the order of softmax, not einops" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Improving time sequence prediction\n", "\n", "\n", "\n", "While this example was considered to be simplistic, I had to analyze surrounding code to understand what kind of input was expected.\n", "You can try yourself. \n", "\n", "Additionally now the code works with any dtype, not only double; and new code supports using GPU." 
] }, { "cell_type": "code", "execution_count": 65, "metadata": {}, "outputs": [], "source": [ "#left\n", "class SequencePredictionOld(nn.Module):\n", " def __init__(self):\n", " super(SequencePredictionOld, self).__init__()\n", " self.lstm1 = nn.LSTMCell(1, 51)\n", " self.lstm2 = nn.LSTMCell(51, 51)\n", " self.linear = nn.Linear(51, 1)\n", "\n", " def forward(self, input, future = 0):\n", " outputs = []\n", " h_t = torch.zeros(input.size(0), 51, dtype=torch.double)\n", " c_t = torch.zeros(input.size(0), 51, dtype=torch.double)\n", " h_t2 = torch.zeros(input.size(0), 51, dtype=torch.double)\n", " c_t2 = torch.zeros(input.size(0), 51, dtype=torch.double)\n", "\n", " for i, input_t in enumerate(input.chunk(input.size(1), dim=1)):\n", " h_t, c_t = self.lstm1(input_t, (h_t, c_t))\n", " h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))\n", " output = self.linear(h_t2)\n", " outputs += [output]\n", "\n", " for i in range(future):# if we should predict the future\n", " h_t, c_t = self.lstm1(output, (h_t, c_t))\n", " h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))\n", " output = self.linear(h_t2)\n", " outputs += [output]\n", " outputs = torch.stack(outputs, 1).squeeze(2)\n", " return outputs" ] }, { "cell_type": "code", "execution_count": 66, "metadata": {}, "outputs": [], "source": [ "#right\n", "class SequencePredictionNew(nn.Module):\n", " def __init__(self):\n", " super(SequencePredictionNew, self).__init__()\n", " self.lstm1 = nn.LSTMCell(1, 51)\n", " self.lstm2 = nn.LSTMCell(51, 51)\n", " self.linear = nn.Linear(51, 1)\n", "\n", " def forward(self, input, future=0):\n", " b, t = input.shape\n", " h_t, c_t, h_t2, c_t2 = torch.zeros(4, b, 51, dtype=self.linear.weight.dtype,\n", " device=self.linear.weight.device)\n", "\n", " outputs = []\n", " for input_t in rearrange(input, 'b t -> t b ()'):\n", " h_t, c_t = self.lstm1(input_t, (h_t, c_t))\n", " h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))\n", " output = self.linear(h_t2)\n", " outputs += [output]\n", "\n", " for i in 
range(future): # if we should predict the future\n", " h_t, c_t = self.lstm1(output, (h_t, c_t))\n", " h_t2, c_t2 = self.lstm2(h_t, (h_t2, c_t2))\n", " output = self.linear(h_t2)\n", " outputs += [output]\n", " return rearrange(outputs, 't b () -> b t')" ] }, { "cell_type": "code", "execution_count": 67, "metadata": {}, "outputs": [], "source": [ "seq_old = SequencePredictionOld().double()\n", "seq_new = SequencePredictionNew().double()\n", "initialize(seq_old)\n", "initialize(seq_new)\n", "x = torch.randn([10, 10], dtype=torch.double)" ] }, { "cell_type": "code", "execution_count": 68, "metadata": {}, "outputs": [], "source": [ "result_old = seq_old(x)\n", "result_new = seq_new(x)\n", "assert torch.allclose(result_old, result_new)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Transforming spacial transformer network (STN)\n", "\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 69, "metadata": {}, "outputs": [], "source": [ "#left\n", "class SpacialTransformOld(nn.Module):\n", " def __init__(self):\n", " super(Net, self).__init__()\n", "\n", " # Spatial transformer localization-network\n", " self.localization = nn.Sequential(\n", " nn.Conv2d(1, 8, kernel_size=7),\n", " nn.MaxPool2d(2, stride=2),\n", " nn.ReLU(True),\n", " nn.Conv2d(8, 10, kernel_size=5),\n", " nn.MaxPool2d(2, stride=2),\n", " nn.ReLU(True)\n", " )\n", "\n", " # Regressor for the 3 * 2 affine matrix\n", " self.fc_loc = nn.Sequential(\n", " nn.Linear(10 * 3 * 3, 32),\n", " nn.ReLU(True),\n", " nn.Linear(32, 3 * 2)\n", " )\n", "\n", " # Initialize the weights/bias with identity transformation\n", " self.fc_loc[2].weight.data.zero_()\n", " self.fc_loc[2].bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))\n", "\n", " # Spatial transformer network forward function\n", " def stn(self, x):\n", " xs = self.localization(x)\n", " xs = xs.view(-1, 10 * 3 * 3)\n", " theta = self.fc_loc(xs)\n", " theta = theta.view(-1, 2, 3)\n", "\n", " grid = F.affine_grid(theta, 
x.size())\n", " x = F.grid_sample(x, grid)\n", "\n", " return x\n" ] }, { "cell_type": "code", "execution_count": 70, "metadata": {}, "outputs": [], "source": [ "#right\n", "class SpacialTransformNew(nn.Module):\n", " def __init__(self):\n", " super(Net, self).__init__()\n", " # Spatial transformer localization-network\n", " linear = nn.Linear(32, 3 * 2)\n", " # Initialize the weights/bias with identity transformation\n", " linear.weight.data.zero_()\n", " linear.bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))\n", "\n", " self.compute_theta = nn.Sequential(\n", " nn.Conv2d(1, 8, kernel_size=7),\n", " nn.MaxPool2d(2, stride=2),\n", " nn.ReLU(True),\n", " nn.Conv2d(8, 10, kernel_size=5),\n", " nn.MaxPool2d(2, stride=2),\n", " nn.ReLU(True),\n", " Rearrange('b c h w -> b (c h w)', h=3, w=3),\n", " nn.Linear(10 * 3 * 3, 32),\n", " nn.ReLU(True),\n", " linear,\n", " Rearrange('b (row col) -> b row col', row=2, col=3),\n", " )\n", "\n", " # Spatial transformer network forward function\n", " def stn(self, x):\n", " grid = F.affine_grid(self.compute_theta(x), x.size())\n", " return F.grid_sample(x, grid)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- new code will give reasonable errors when passed image size is different from expected\n", "- if batch size is divisible by 18, whatever you input in the old code, it'll fail no sooner than affine_grid." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Improving GLOW\n", "\n", "That's a good old depth-to-space written manually!\n", "\n", "Since GLOW is revertible, it will frequently rely on `rearrange`-like operations.\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 71, "metadata": {}, "outputs": [], "source": [ "#left\n", "def unsqueeze2d_old(input, factor=2):\n", " assert factor >= 1 and isinstance(factor, int)\n", " factor2 = factor ** 2\n", " if factor == 1:\n", " return input\n", " size = input.size()\n", " B = size[0]\n", " C = size[1]\n", " H = size[2]\n", " W = size[3]\n", " assert C % (factor2) == 0, \"{}\".format(C)\n", " x = input.view(B, C // factor2, factor, factor, H, W)\n", " x = x.permute(0, 1, 4, 2, 5, 3).contiguous()\n", " x = x.view(B, C // (factor2), H * factor, W * factor)\n", " return x\n", "\n", "def squeeze2d_old(input, factor=2):\n", " assert factor >= 1 and isinstance(factor, int)\n", " if factor == 1:\n", " return input\n", " size = input.size()\n", " B = size[0]\n", " C = size[1]\n", " H = size[2]\n", " W = size[3]\n", " assert H % factor == 0 and W % factor == 0, \"{}\".format((H, W))\n", " x = input.view(B, C, H // factor, factor, W // factor, factor)\n", " x = x.permute(0, 1, 3, 5, 2, 4).contiguous()\n", " x = x.view(B, C * factor * factor, H // factor, W // factor)\n", " return x" ] }, { "cell_type": "code", "execution_count": 72, "metadata": {}, "outputs": [], "source": [ "#right\n", "def unsqueeze2d_new(input, factor=2):\n", " return rearrange(input, 'b (c h2 w2) h w -> b c (h h2) (w w2)', h2=factor, w2=factor)\n", "\n", "def squeeze2d_new(input, factor=2):\n", " return rearrange(input, 'b c (h h2) (w w2) -> b (c h2 w2) h w', h2=factor, w2=factor)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- term `squeeze` isn't very helpful: which dimension is squeezed? 
There is `torch.squeeze`, but it's very different.\n", "- in fact, we could skip creating functions completely - it is a single call to `einops` anyway" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Detecting problems in YOLO detection\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 73, "metadata": {}, "outputs": [], "source": [ "#left\n", "def YOLO_prediction_old(input, num_classes, num_anchors, anchors, stride_h, stride_w):\n", " bs = input.size(0)\n", " in_h = input.size(2)\n", " in_w = input.size(3)\n", " scaled_anchors = [(a_w / stride_w, a_h / stride_h) for a_w, a_h in anchors]\n", "\n", " prediction = input.view(bs, num_anchors,\n", " 5 + num_classes, in_h, in_w).permute(0, 1, 3, 4, 2).contiguous()\n", " # Get outputs\n", " x = torch.sigmoid(prediction[..., 0]) # Center x\n", " y = torch.sigmoid(prediction[..., 1]) # Center y\n", " w = prediction[..., 2] # Width\n", " h = prediction[..., 3] # Height\n", " conf = torch.sigmoid(prediction[..., 4]) # Conf\n", " pred_cls = torch.sigmoid(prediction[..., 5:]) # Cls pred.\n", "\n", " FloatTensor = torch.cuda.FloatTensor if x.is_cuda else torch.FloatTensor\n", " LongTensor = torch.cuda.LongTensor if x.is_cuda else torch.LongTensor\n", " # Calculate offsets for each grid\n", " grid_x = torch.linspace(0, in_w - 1, in_w).repeat(in_w, 1).repeat(\n", " bs * num_anchors, 1, 1).view(x.shape).type(FloatTensor)\n", " grid_y = torch.linspace(0, in_h - 1, in_h).repeat(in_h, 1).t().repeat(\n", " bs * num_anchors, 1, 1).view(y.shape).type(FloatTensor)\n", " # Calculate anchor w, h\n", " anchor_w = FloatTensor(scaled_anchors).index_select(1, LongTensor([0]))\n", " anchor_h = FloatTensor(scaled_anchors).index_select(1, LongTensor([1]))\n", " anchor_w = anchor_w.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(w.shape)\n", " anchor_h = anchor_h.repeat(bs, 1).repeat(1, 1, in_h * in_w).view(h.shape)\n", " # Add offset and scale with anchors\n", " pred_boxes = FloatTensor(prediction[..., :4].shape)\n", " 
pred_boxes[..., 0] = x.data + grid_x\n", " pred_boxes[..., 1] = y.data + grid_y\n", " pred_boxes[..., 2] = torch.exp(w.data) * anchor_w\n", " pred_boxes[..., 3] = torch.exp(h.data) * anchor_h\n", " # Results\n", " _scale = torch.Tensor([stride_w, stride_h] * 2).type(FloatTensor)\n", " output = torch.cat((pred_boxes.view(bs, -1, 4) * _scale,\n", " conf.view(bs, -1, 1), pred_cls.view(bs, -1, num_classes)), -1)\n", " return output" ] }, { "cell_type": "code", "execution_count": 74, "metadata": {}, "outputs": [], "source": [ "#right\n", "def YOLO_prediction_new(input, num_classes, num_anchors, anchors, stride_h, stride_w):\n", " raw_predictions = rearrange(input, 'b (anchor prediction) h w -> prediction b anchor h w',\n", " anchor=num_anchors, prediction=5 + num_classes)\n", " anchors = torch.FloatTensor(anchors).to(input.device)\n", " anchor_sizes = rearrange(anchors, 'anchor dim -> dim () anchor () ()')\n", "\n", " _, _, _, in_h, in_w = raw_predictions.shape\n", " grid_h = rearrange(torch.arange(in_h).float(), 'h -> () () h ()').to(input.device)\n", " grid_w = rearrange(torch.arange(in_w).float(), 'w -> () () () w').to(input.device)\n", "\n", " predicted_bboxes = torch.zeros_like(raw_predictions)\n", " predicted_bboxes[0] = (raw_predictions[0].sigmoid() + grid_w) * stride_w # center x\n", " predicted_bboxes[1] = (raw_predictions[1].sigmoid() + grid_h) * stride_h # center y\n", " predicted_bboxes[2:4] = (raw_predictions[2:4].exp()) * anchor_sizes # bbox width and height\n", " predicted_bboxes[4] = raw_predictions[4].sigmoid() # confidence\n", " predicted_bboxes[5:] = raw_predictions[5:].sigmoid() # class predictions\n", " # merging all predicted bboxes for each image\n", " return rearrange(predicted_bboxes, 'prediction b anchor h w -> b (anchor h w) prediction')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We changed and fixed a lot:\n", "\n", "- new code won't fail if input is not on the first GPU\n", "- old code has wrong grid_x and grid_y for 
non-square images\n", "- new code doesn't use replication when broadcasting is sufficient\n", "- old code strangely sometimes takes `.data`, but this has no real effect, as some branches preserve gradient till the end\n", " - if gradients not needed, torch.no_grad should be used, so it's redundant" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Simpler output for a bunch of pictures\n", "\n", "Next time you need to output drawings of you generative models, you can use this trick\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 75, "metadata": {}, "outputs": [], "source": [ "fake_batch = torch.rand([100, 3, 1, 1]) + torch.zeros([100, 3, 32, 32])\n", "from matplotlib import pyplot as plt\n", "import torchvision.utils as vutils" ] }, { "cell_type": "code", "execution_count": 76, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 76, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQUAAAD8CAYAAAB+fLH0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAD4FJREFUeJzt3X2wXHV9x/H3tyiIiMNDNA0hBXRiNSgNMSIdHpShYhK1wdHhoVMakc7FTqhS6bQB2oJt0ar1oVQLxBIJVIVM0RIxIJBhBJ2i3MQYEiIQNRkSQ1IeFAYQBb79Y09gf+lN9t6793d3Ke/XzJlz9rdn9/vNyc1nzjm7N7/ITCRpu9/qdQOS+ouhIKlgKEgqGAqSCoaCpIKhIKlQLRQiYlZE3BMR6yNiQa06ksZW1PieQkTsBtwLvAPYBNwJnJqZd495MUljqtaZwhHA+sz8aWb+GrgamFuplqQx9JJK7zsZuL/t8SbgrTvbOSL8WqVU34OZ+apOO9UKhY4iYgAY6FV96UVo43B2qhUKm4EpbY8PbMaek5kLgYXw/JnCvzxwXqV2nveR3/44AD88/9nqtQ6/qHV1tvFVJ1WvddD/LAFg3cqV1Wu9YcYMAN5y1ceq17rztAsAuPnk+zvs2b13XNP6kb3sb9ZUr3XmP74RgA8v/Gb1WhcPvGdE+9e6p3AnMDUiDomI3YFTgKWVakkaQ1XOFDLz6Yg4C/g2sBuwKDPX1qglaWxVu6eQmcuAZbXeX1IdfqNRUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUqHKBLMjbsJp46TxsCIzZ3bayTMFSYWezSU5lA2fu6F6jYP/YjYA1/3ZrdVrzb3kOAD+ff9Dq9f604dac+089fH51Wvtcd4XW+sJx1av9dSDtwFw3RfeXb3W3LOuB2D6nu+sXmvVk98GY
Nvb/rx6rVd/519HtL9nCpIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqdDV9xQiYgPwGPAM8HRmzoyI/YBrgIOBDcBJmflId21KGi9jcaZwXGZOb/v65AJgeWZOBZY3jyW9QNS4fJgLLG62FwMnVqghqZJuQyGBmyJiRUQMNGMTM3NLs/0AMHGoF0bEQEQMRsRglz1IGkPd/u7D0Zm5OSJeDdwcET9ufzIzc2e/AZmZC4GF4G9JSv2kqzOFzNzcrLcB3wCOALZGxCSAZr2t2yYljZ9Rh0JE7BURe2/fBk4A1gBLgXnNbvOA67ptUtL46ebyYSLwjYjY/j5fzcwbI+JOYElEnAFsBE7qvk1J42XUoZCZPwV+b4jxh4Dju2lKUu/4jUZJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVnDZOevFw2jhJI9dX08Y9+eGrqtfY8+LTADhmTf2p3G5/Y2sqt31+/z+r1/rFf78fgE8d/sHqtf7qh4sA+K+lv6pe68Q/fBkAR16/snqtO949A4AvD55dvdbpMz8PwAWL6v9sfOyD7x/R/p4pSCoYCpIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqdAxFCJiUURsi4g1bWP7RcTNEXFfs963GY+IuDgi1kfE6oiYUbN5SWNvOGcKVwCzdhhbACzPzKnA8uYxwGxgarMMAJeMTZuSxkvHUMjM24CHdxieCyxuthcDJ7aNX5ktdwD7RMSksWpWUn2jvacwMTO3NNsPABOb7cnA/W37bWrG/o+IGIiIwYgYHGUPkmrIzI4LcDCwpu3xL3Z4/pFmfT1wdNv4cmDmMN4/XVxcqi+Dw/n3Ptozha3bLwua9bZmfDMwpW2/A5sxSS8Qo502bikwD/inZn1d2/hZEXE18Fbgl22XGR395tKfj7Kd4Xvphw4A4Ng511avdduy9wFw7mWfqF7rE2eeC8DZJw95tTamPn9NK+c/+cUPVK/11/OvAOCm7321eq0TjvojAP7hrnOq1/rbN30GgDcsWVG91rqT3jyi/TuGQkR8DXg7MCEiNgEX0AqDJRFxBrAROKnZfRkwB1gPPAGcPqJuJPVcx1DIzFN38tTxQ+ybwPxum5LUO36jUVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUiGbatt42EdH7JqT//1Zk5sxOO3mmIKkw2mnjqvjCgxur1zhrwkEArHrl8uq1pj/ami/nzoNur17rLRuPAeC+M/evXmvqZQ8BsPruw6rXOmzaagAmbl1ZvdbWiTMA+N3Hn6pe65699gDg8DOOqV7rh5eP7OfPMwVJBUNBUsFQkFQwFCQVDAVJhY6hEBGLImJbRKxpG7swIjZHxKpmmdP23LkRsT4i7omId9ZqXFIdwzlTuAKYNcT45zJzerMsA4iIacApwKHNa/4tInYbq2Yl1dcxFDLzNuDhYb7fXODqzHwqM38GrAeO6KI/SeOsm3sKZ0XE6ubyYt9mbDJwf9s+m5oxSS8Qow2FS4DXAtOBLcBnRvoGETEQEYMRMTjKHiRVMKpQyMytmflMZj4LfInnLxE2A1Padj2wGRvqPRZm5szh/IKGpPEzqlCIiEltD98LbP9kYilwSkTsERGHAFOBH3TXoqTx1PEXoiLia8DbgQkRsQm4AHh7REwHEtgAnAmQmWsjYglwN/A0MD8zn6nTuqQaOoZCZp46xPDlu9j/IuCibpqS1Dt+o1FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVHDaOOnFw2njJI1cX00b94mrrq9e49zT3g3AN0+eXb3We665AYDZ73l99Vo3fPPHAEyf9NHqtVZt+SwAnz1jYfVaH718AIAr5txWv
dYHlh0LwIG/qj9F3aaXtaao+9a806vXetfiL49of88UJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUmFjqEQEVMi4taIuDsi1kbER5rx/SLi5oi4r1nv24xHRFwcEesjYnVEzKj9h5A0doZzpvA0cE5mTgOOBOZHxDRgAbA8M6cCy5vHALOBqc0yAFwy5l1LqqZjKGTmlsxc2Ww/BqwDJgNzgcXNbouBE5vtucCV2XIHsE9ETBrzziVVMaJ7ChFxMHA48H1gYmZuaZ56AJjYbE8G7m972aZmbMf3GoiIwYgYHGHPkioadihExCuAa4GzM/PR9ueyNaPMiCZ0ycyFmTlzOJNTSBo/wwqFiHgprUD4SmZ+vRneuv2yoFlva8Y3A1PaXn5gMybphSAzd7kAAVwJfH6H8U8DC5rtBcCnmu13ATc0rzsS+MEwaqSLi0v1ZbDTv8XM7DyXZEQcDdwO3AU82wyfR+u+whLgd4CNwEmZ+XBEBPAFYBbwBHB6Zu7yvoFzSUrjYlhzSfbVBLOvZ7fqtX7MMwAc9b5HO+zZve9d+0oA9vzJG6rXevK16wD4u0NfWb3W369tHbtZx3+seq0bl18AwMlP7Fu91jUvfwSAw7778eq1Vh99HgCXHvOu6rU+dPu3tm86waykkTMUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSoa8mg5FUlZPBSBq5l/S6gXZ/cNFD1Wvccv7+AOw/7+XVaz20+AkA7p35ePVarxvcC4A/vvH71Wv9x6y3AnDpgz+vXutDEw4AYOVNZ1SvNeOEywG499Z11Wu97rjWVIJvftN3qtdacdfbRrS/ZwqSCoaCpIKhIKlgKEgqdAyFiJgSEbdGxN0RsTYiPtKMXxgRmyNiVbPMaXvNuRGxPiLuiYh31vwDSBpbw/n04WngnMxcGRF7Aysi4ubmuc9l5j+37xwR04BTgEOBA4BbIuJ1mfnMWDYuqY6OZwqZuSUzVzbbjwHrgMm7eMlc4OrMfCozfwasB44Yi2Yl1TeiewoRcTBwOLD9w/CzImJ1RCyKiH2bscnA/W0v28QQIRIRAxExGBGDI+5aUjXDDoWIeAVwLXB2Zj4KXAK8FpgObAE+M5LCmbkwM2cO52uXksbPsEIhIl5KKxC+kplfB8jMrZn5TGY+C3yJ5y8RNgNT2l5+YDMm6QVgOJ8+BHA5sC4zP9s2Pqltt/cCa5rtpcApEbFHRBwCTAV+MHYtS6ppOJ8+HAWcBtwVEauasfOAUyNiOpDABuBMgMxcGxFLgLtpfXIx308epBeOjqGQmd8FYoinlu3iNRcBF3XRl6Qe8RuNkgqGgqSCoSCpYChIKhgKkgqGgqSCoSCpYChIKhgKkgqGgqSCoSCp4LRx0ouH08ZJGrl+mTbuQeDxZt2PJtC/vYH9daOfe4Ox7e+g4ezUF5cPABEx2K//NVs/9wb2141+7g1605+XD5IKhoKkQj+FwsJeN7AL/dwb2F83+rk36EF/fXNPQVJ/6KczBUl9oOehEBGzmolo10fEgl73AxARGyLirmbi3MFmbL+IuDki7mvW+3Z6nzHsZ1FEbIuINW1jQ/YTLRc3x3N1RMzoQW99M/nwLiZI7vnx69vJmzOzZwuwG/AT4DXA7sCPgGm97KnpawMwYYexTwELmu0FwCfHsZ9jgRnAmk79AHOAG2j9D9xHAt/vQW8XAn85xL7Tmr/jPYBDmr/73Sr3NwmY0WzvDdzb9NHz47eL3np6/Hp9pnAEsD4zf5qZvwaupjVBbT+aCyxuthcDJ45X4cy8DXh4mP3MBa7MljuAfXaYuGc8etuZcZ98OHc+QXLPj98uetuZcTl+vQ6FYU1G2wMJ3BQRKyJioBmbm
Jlbmu0HgIm9ae05O+unX47pqCcfrmWHCZL76viN5eTN3ep1KPSrozNzBjAbmB8Rx7Y/ma1zub752Kbf+qHLyYdrGGKC5Of0+viN9eTN3ep1KPTlZLSZublZbwO+QesUbev208hmva13HcIu+un5Mc0+m3x4qAmS6ZPj14+TN/c6FO4EpkbEIRGxO3AKrQlqeyYi9oqIvbdvAyfQmjx3KTCv2W0ecF1vOnzOzvpZCvxJcxf9SOCXbafJ46KfJh/e2QTJ9MHx21lvPT9+Ne/8DvMO7Bxad11/ApzfB/28htYd3h8Ba7f3BOwPLAfuA24B9hvHnr5G6zTyN7SuI8/YWT+07pp/sTmedwEze9DbVU3t1c0P8qS2/c9versHmD0Ox+5oWpcGq4FVzTKnH47fLnrr6fHzG42SCr2+fJDUZwwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBU+F/ejqya4rYvVQAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "#right\n", "device = 'cpu'\n", "plt.imshow(np.transpose(vutils.make_grid(fake_batch.to(device)[:64], padding=2, normalize=True).cpu(),(1,2,0)))" ] }, { "cell_type": "code", "execution_count": 77, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 77, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQUAAAD8CAYAAAB+fLH0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAD4tJREFUeJzt3XvQXHV9x/H3VypIgQoYGmOIgjYqYaAhRqQjopUWkigGpzbCOBg004c6QaHa6QRoRW3jhRYviE0JQ4bAUCGtosGGa3QIOAVyIeYCBiKEkhgSLtYLgpbw7R97gvtLnye7eZ49u0t9v2bOnLO/Pbvfb06ST845u09+kZlI0k4v6XUDkvqLoSCpYChIKhgKkgqGgqSCoSCpUFsoRMSUiNgQERsjYk5ddSR1VtTxPYWI2At4APhTYDOwHDg9M+/reDFJHVXXmcKxwMbMfCgzfw1cC0yvqZakDvqdmt53LPBo0+PNwFuG2jki/FqlVL8nMvOQVjvVFQotRcQAMLDz8VceO7/2mue88rMArLrg+dprAUya+xI2HTKjK7UOe3wRAPevWlV7rSMmTQLgzVd9uvZayz94IQC3vP+/aq8FcNJ1r+ayv13blVpn/cNRAHzsshtqr3XJWacAPNLOvnWFwhZgXNPjQ6uxF2TmfGA+eKYg9ZO67iksB8ZHxOERsTdwGrC4plqSOqiWM4XMfC4izgZuBvYCFmTm+jpqSeqs2u4pZOYSYEld7y+pHn6jUVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFQwFSYVa5pLc4yac90HqhpWZObnVTp4pSCr0bNq4XT38pRtrr3H4X00F4Fsf+W7ttQBOnfdOLh91ZFdq/cUTjWk1np07u/ZaL7vga431qBNqr/XsE8sA+NZX3117LYBTP/odJu57cldqrX7mZgC2vf2jtdcafftX297XMwVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVRvTlpYjYBPwc2AE8l5mTI+Jg4DrgMGATMCMzfzKyNiV1SyfOFP44Myc2fad6DrA0M8cDS6vHkl4k6rh8mA4srLYXAqfWUENSTUYaCgncEhErI2KgGhudmVur7ceA0SOsIamLRvoDUcdn5paI+H3g1oj4YfOTmZlD/Vh0FSIDgz0nqXdGdKaQmVuq9XbgeuBYYFtEjAGo1tuHeO38zJzczs93S+qeYYdCROwXEQfs3AZOAtYBi4GZ1W4zgW+PtElJ3TOSy4fRwPURsfN9/jUzb4qI5cCiiJgFPALMGHmbkrpl2KGQmQ8BfzjI+JPAiSNpSlLv+I1GSQVDQVLBUJBUMBQkFQwFSQVDQVLBUJBUMBQkFZxLUvrt0dZckn0zbdwvP3Z17TV+95IzAHjb2u5M5XbHUes56I/+rSu1fvKffw7ARcd8uPZaf3PvAgCuX/xM7bXe+559ATjuhlW11wK465RJLFh+bldqffjNXwbgk1f8e+21PjPrfW3v
6+WDpIKhIKlgKEgqGAqSCoaCpIKhIKlgKEgqGAqSCoaCpIKhIKlgKEgqGAqSCoaCpIKhIKlgKEgqGAqSCoaCpIKhIKlgKEgqGAqSCoaCpELLUIiIBRGxPSLWNY0dHBG3RsSD1fqgajwi4pKI2BgRayJiUp3NS+q8ds4UrgSm7DI2B1iameOBpdVjgKnA+GoZAOZ1pk1J3dIyFDJzGfDULsPTgYXV9kLg1Kbxq7LhLuDAiBjTqWYl1W+49xRGZ+bWavsxYHS1PRZ4tGm/zdWYpBeLzGy5AIcB65oe//cuz/+kWn8HOL5pfCkweYj3HABWVEu6uLjUvqxo5+/7cM8Utu28LKjW26vxLcC4pv0Orcb+j8ycn5mT25nbTlL3DHcuycXATODz1frbTeNnR8S1wFuAnzZdZuzWr+cNmh0dtfdHGlcyJ0yrf+4+gGVL3sd5l32uK7U+d9Z5AJzz/vqv1r5yXeP36vOXnll7rTlnXwnAzd+/pvZaACe/9QP8/dpPdKXW3x11MQBHXLei9lr3v7/9f3tbhkJEfB14BzAqIjYDF9IIg0URMQt4BJhR7b4EmAZsBH4JfGhPGpfUey1DITNPH+KpEwfZN4HZI21KUu/4jUZJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUsFQkFQwFCQVDAVJBUNBUiGqKdx620RE75uQ/v9b2c6MbJ4pSCoMd9q4jrv08U211zj7kMMAuPf3ltZeC+CYn53IPa9Z1pVaxz5yAgAPDLyi9lqvn/8kAGvWH117raOPXAPA6MdW1l4LYNsr38QbfvFsV2pt2P9lABwz622117r3ijva3tczBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSoWUoRMSCiNgeEeuaxj4VEVsiYnW1TGt67ryI2BgRGyLi5Loal1SPds4UrgSmDDL+pcycWC1LACJiAnAacGT1mn+OiL061ayk+rUMhcxcBjzV5vtNB67NzF9l5sPARuDYEfQnqctGck/h7IhYU11eHFSNjQUebdpnczUm6UViuKEwD3gdMBHYCly8p28QEQMRsSIiVgyzB0k1GFYoZOa2zNyRmc8Dl/ObS4QtwLimXQ+txgZ7j/mZObmdn9qS1D3DCoWIGNP08L3Azk8mFgOnRcQ+EXE4MB64Z2QtSuqmlj8lGRFfB94BjIqIzcCFwDsiYiKQwCbgLIDMXB8Ri4D7gOeA2Zm5o57WJdWhZShk5umDDF+xm/3nAnNH0pSk3vEbjZIKhoKkgqEgqWAoSCoYCpIKhoKkgqEgqWAoSCoYCpIKThsn/fZw2jhJe65vpo377NU31F7j/DNOAWDxjKm11wJ4z6IbmXrKG7tS68YbfgjAxFd9vPZaq3/8RQC+OGt+7bU+fsUAAFdOu732WgBnLnk7hz67qiu1Nr9sEgDfmXlm7bXevfDKtvf1TEFSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBVahkJEjIuI70XEfRGxPiLOqcYPjohbI+LBan1QNR4RcUlEbIyINRExqe5fhKTOaedM4TngE5k5ATgOmB0RE4A5wNLMHA8srR4DTAXGV8sAMK/jXUuqTctQyMytmbmq2v45cD8wFpgOLKx2WwicWm1PB67KhruAAyNiTMc7l1SLPbqnEBGHAccAdwOjM3Nr9dRjwOhqeyzwaNPLNldjkl4E2p73ISL2B74BnJuZP4uIF57LzNzTWZ4iYoDG5YWkPtLWmUJEvJRGIFyTmd+shrftvCyo1tur8S3AuKaXH1qNFTJzfmZObmcaK0ldlJm7XYAArgK+vMv4PwJzqu05wEXV9ruAG6vXHQfc00aNdHFxqX1Z0ervYma2nmA2Io4H7gDWAs9Xw+fTuK+wCHg18AgwIzOfisZ1xaXAFOCX
wIcyc0WLGvlG9tptH53wQ3YA8NY/+2nttQC+/42Xs+/GI7pS65k/uB+ATx758tprfWZ94/hNOfHTtde6aemFAMx4+uDaawEs2u8pjr7zs12pteb48wH4l7e9q/Zaf3nHf0CbE8y2vKeQmXfS+Fd/MCcOsn8Cs1u9r6T+5DcaJRUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVGg5Q1RXmtjDyWklDUtbM0R5piCp0PZU9HX7k7lP1F7jtgtGAfCKmfvVXgvgyYVPs2HyL7pS6w0r9gfgAzfdVXuta6YcB8C8x39ce62PHPIqAFbePKv2WgBvOvkKNnz3vq7UesM7JzRqHnV77bVWrn172/t6piCpYChIKhgKkgqGgqRCy1CIiHER8b2IuC8i1kfEOdX4pyJiS0SsrpZpTa85LyI2RsSGiDi5zl+ApM5q59OH54BPZOaqiDgAWBkRt1bPfSkz/6l554iYAJwGHAm8CrgtIl6fmTs62bikerQ8U8jMrZm5qtr+OXA/MHY3L5kOXJuZv8rMh4GNwLGdaFZS/fbonkJEHAYcA9xdDZ0dEWsiYkFEHFSNjQUebXrZZnYfIpL6SNuhEBH7A98Azs3MnwHzgNcBE4GtwMV7UjgiBiJiRUSs2JPXSapXW6EQES+lEQjXZOY3ATJzW2buyMzngcv5zSXCFmBc08sPrcYKmTk/Mye3811sSd3TzqcPAVwB3J+ZX2waH9O023uBddX2YuC0iNgnIg4HxgP3dK5lSXVq59OHtwJnAGsjYnU1dj5wekRMBBLYBJwFkJnrI2IRcB+NTy5m+8mD9OLRMhQy804gBnlqyW5eMxeYO4K+JPWI32iUVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFQ0FSwVCQVDAUJBUMBUkFp42Tfnu0NW1cv8wQ9QTwdLXuV6Owv5Ho5/76uTfoXH+vaWenvjhTAIiIFf38H67Y38j0c3/93Bt0vz/vKUgqGAqSCv0UCvN73UAL9jcy/dxfP/cGXe6vb+4pSOoP/XSmIKkP9DwUImJKNefkxoiY0+t+ACJiU0SsrebIXFGNHRwRt0bEg9X6oFbv08F+FkTE9ohY1zQ2aD/RcEl1PNdExKQe9dc3c43uZj7UvjiGfTdfa2b2bAH2An4EvBbYG/gBMKGXPVV9bQJG7TJ2ETCn2p4DfKGL/ZwATALWteoHmAbcSOM/2z0OuLtH/X0K+OtB9p1Q/T7vAxxe/f7vVXN/Y4BJ1fYBwANVH31xDHfTX0+OYa/PFI4FNmbmQ5n5a+BaGnNR9qPpwMJqeyFwarcKZ+Yy4Kk2+5kOXJUNdwEH7jJHR7f6G0rX5xrNoedD7YtjuJv+hlLrMex1KPTrvJMJ3BIRKyNioBobnZlbq+3HgNG9ae0FQ/XTT8e07+Ya3WU+1L47hv0wX2uvQ6FfHZ+Zk4CpwOyIOKH5yWycw/XNxzb91k9lRHON1mGQ+VBf0A/HsNPztQ5Xr0OhrXknuy0zt1Tr7cD1NE7Ntu08hazW23vXIeymn744pjnCuUY7bbD5UOmjY1jHfK3D1etQWA6Mj4jDI2Jv4DQac1H2TETsFxEH7NwGTqIxT+ZiYGa120zg273p8AVD9bMY+GB1B/044KdNp8hd009zjQ41Hyp9cgyH6q9nx7DOu6pt3nmdRuNu64+AC/qgn9fSuLP7A2D9zp6AVwBLgQeB24CDu9jT12mcPv4PjevHWUP1Q+OO+deq47kWmNyj/q6u6q+p/hCPadr/gqq/DcDULvR3PI1LgzXA6mqZ1i/HcDf99eQY+o1GSYVeXz5I6jOGgqSCoSCpYChIKhgKkgqGgqSCoSCpYChIKvwvZzqwY4oVzloAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "#right\n", "padded = F.pad(fake_batch[:64], [1, 1, 1, 1])\n", "plt.imshow(rearrange(padded, '(b1 b2) c h w -> (b1 h) (b2 w) c', b1=8).cpu())" ] }, { "cell_type": "code", "execution_count": 78, "metadata": {}, "outputs": [], "source": [ "# TODO: Hierarchical softmax\n", "# TODO: some reinforcement stuff would also be needed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Instead of conclusion\n", "\n", "Better code is a vague term; to be specific, code is expected to be:\n", "\n", "- reliable: does what expected and does not fail. Explicitly fails for wrong inputs\n", "- maintainable and modifiable\n", "- reusable: understanding and modifying code should be easier than writing from scratch\n", "- fast: in my measurements, proposed versions have speed similar to the original code\n", "- readability counts, as a mean to achieve previous goals\n", "\n", "Provided examples show how to improve these criteria for deep learning code. And `einops` helps a lot." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Links\n", "\n", "- [pytorch](http://github.com/pytorch/pytorch) and [einops](https://github.com/arogozhnikov/einops)\n", "- significant part of the code was taken from the official [examples](https://github.com/pytorch/examples) and [tutorials](https://github.com/pytorch/tutorials). 
All code fragments were taken for educational purposes.\n", "- (references for other code are given in the source of this html)\n", "- einops has a [tutorial](https://github.com/arogozhnikov/einops/tree/main/docs) for a gentler introduction" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.5" } }, "nbformat": 4, "nbformat_minor": 2 } arogozhnikov-einops-ad2c8d6/scripts/pytorch_examples_source/converter.py000066400000000000000000000104161475201674600272560ustar00rootroot00000000000000""" Just run this script with `python converter.py`. It converts Pytorch.ipynb to the html page docs/pytorch-examples.html """ import nbformat import markdown from pygments import highlight from pygments.lexers import PythonLexer from pygments.formatters import HtmlFormatter notebook = nbformat.read("Pytorch.ipynb", as_version=nbformat.NO_CONVERT) content = "" cache = "" for cell in notebook["cells"]: if cell["cell_type"] == "code": source = cell["source"] if source.startswith("#left") or source.startswith("#right"): trimmed_source = source[source.index("\n") + 1 :] cache += "
<div>{}</div>".format(highlight(trimmed_source, PythonLexer(), HtmlFormatter())) if source.startswith("#right"): content += "<div class='leftright-wrapper'><div class='leftright-cells'>{}</div></div>".format(cache) cache = "" elif cell["cell_type"] == "markdown": content += "<div class='markdown-cell'>{}</div>
".format(markdown.markdown(cell["source"])) else: raise RuntimeError("not expected type of cell" + cell["cell_type"]) styles = HtmlFormatter().get_style_defs(".highlight") styles += """ body { padding: 50px 10px; } .leftright-wrapper { text-align: center; overflow-x: auto; } .leftright-cells { display: inline-flex; text-align: left; } .leftright-cells > div { padding: 0px 10px; min-width: 350px; } .markdown-cell{ max-width: 700px; margin: 0px auto; } h1 { text-align: center; padding: 10px 0px 0px; } """ meta_tags = """ """ github_ribbon = """ """ result = f""" {meta_tags} Writing better code with pytorch+einops {github_ribbon} {content} """ with open("../../docs/pytorch-examples.html", "w") as f: f.write(result) arogozhnikov-einops-ad2c8d6/scripts/setup.py000066400000000000000000000015551475201674600214450ustar00rootroot00000000000000""" This is a fake script, it is not used. Github does not count dependent python projects unless you have a setup.py """ __author__ = "Alex Rogozhnikov" from setuptools import setup setup( name="einops", version="0.7.0", description="A new flavour of deep learning operations", long_description=open("README.md", encoding="utf-8").read(), long_description_content_type="text/markdown", url="https://github.com/arogozhnikov/einops", author="Alex Rogozhnikov", classifiers=[ "Intended Audience :: Science/Research", "Programming Language :: Python :: 3 ", "License :: OSI Approved :: MIT License", ], keywords="deep learning, neural networks, tensor manipulation, machine learning, " "scientific computations, einops", install_requires=[ # no run-time or installation-time dependencies ], ) arogozhnikov-einops-ad2c8d6/scripts/test_notebooks.py000066400000000000000000000041361475201674600233450ustar00rootroot00000000000000""" Script assumes torch, tf and numpy are already installed. 
Also needs: "nbformat", "nbconvert", "jupyter", "pillow", """ from typing import Dict from io import StringIO __author__ = "Alex Rogozhnikov" from pathlib import Path import nbformat from nbconvert.preprocessors import ExecutePreprocessor def render_notebook(filename: Path, replacements: Dict[str, str]) -> str: """Takes path to the notebook, returns executed and rendered version :param filename: notebook :param replacements: dictionary with text replacements done before executing :return: notebook, rendered as string """ with filename.open("r") as f: nb_as_str = f.read() for original, replacement in replacements.items(): assert original in nb_as_str, f"not found in notebook: {original}" nb_as_str = nb_as_str.replace(original, replacement) nb = nbformat.read(StringIO(nb_as_str), nbformat.NO_CONVERT) ep = ExecutePreprocessor(timeout=120, kernel_name="python3") ep.preprocess(nb, {"metadata": {"path": str(filename.parent.absolute())}}) result_as_stream = StringIO() nbformat.write(nb, result_as_stream) return result_as_stream.getvalue() def test_notebook_1(): [notebook] = Path(__file__).parent.with_name("docs").glob("1-*.ipynb") render_notebook(notebook, replacements={}) def test_notebook_2_with_all_backends(): [notebook] = Path(__file__).parent.with_name("docs").glob("2-*.ipynb") for backend in ["pytorch", "tensorflow"]: print("Testing {} with backend {}".format(notebook, backend)) replacements = {r"flavour = \"pytorch\"": r"flavour = \"{}\"".format(backend)} expected_string = "selected {} backend".format(backend) result = render_notebook(notebook, replacements=replacements) assert expected_string in result def test_notebook_3(): [notebook] = Path(__file__).parent.with_name("docs").glob("3-*.ipynb") render_notebook(notebook, replacements={}) def test_notebook_4(): [notebook] = Path(__file__).parent.with_name("docs").glob("4-*.ipynb") render_notebook(notebook, replacements={})
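The grid-tiling pattern used in the notebook above, `rearrange(padded, '(b1 b2) c h w -> (b1 h) (b2 w) c', b1=8)`, can be sketched in plain NumPy to show what the axis bookkeeping does. The shapes below are toy values chosen for illustration, not taken from the notebook:

```python
import numpy as np

# a toy "batch" of 6 single-channel 2x3 images, laid out as a 2x3 grid
b1, b2, c, h, w = 2, 3, 1, 2, 3
batch = np.arange(b1 * b2 * c * h * w).reshape(b1 * b2, c, h, w)

# einops: rearrange(batch, '(b1 b2) c h w -> (b1 h) (b2 w) c', b1=b1)
# equivalent plain-numpy steps: split the batch axis, reorder, merge
grid = (
    batch.reshape(b1, b2, c, h, w)   # (b1 b2) -> b1, b2
         .transpose(0, 3, 1, 4, 2)   # -> b1, h, b2, w, c
         .reshape(b1 * h, b2 * w, c) # merge grid rows and grid columns
)

print(grid.shape)  # (4, 9, 1): 2 rows of height 2, 3 columns of width 3, channel last
```

Rows of the output grid come from `b1`, columns from `b2`, which is exactly why the notebook can tile 64 images into an 8x8 grid with a single `b1=8` argument.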