agate-1.9.1/.github/CONTRIBUTING.md

Contributing
============

Principles
----------

agate is intended to fill a very particular programming niche. It should not be allowed to become as complex as [numpy] or [pandas]. Please bear in mind the following principles when contemplating an addition:

- Humans have less time than computers. Optimize for humans.
- Most datasets are small. Don’t optimize for “big data”.
- Text is data. It must always be a first-class citizen.
- Python gets it right. Make it work like Python does.
- Human lives are nasty, brutish and short. Make it easy.
- Mutability leads to confusion. Processes that alter data must create new copies.
- Extensions are the way. Don’t add it to core unless everybody needs it.

Process for contributing code
-----------------------------

Contributors should use the following roadmap to guide them through the process of submitting a contribution:

1. Fork the project on [GitHub].
2. Check out the [issue tracker] and find a task that needs to be done and is of a scope you can realistically expect to complete in a few days. Don’t worry about the priority of the issues at first, but try to choose something you’ll enjoy. You’re much more likely to finish something to the point it can be merged if it’s something you really enjoy hacking on.
3. Comment on the ticket letting everyone know you’re going to be hacking on it so that nobody duplicates your effort. It’s also good practice to provide some general idea of how you plan on resolving the issue so that other developers can make suggestions.
4. Write tests for the feature you’re building. Follow the format of the existing tests in the test directory to see how this works. You can run all the tests with the command `pytest`.
5. Write the code. Try to stay consistent with the style and organization of the existing codebase. A good patch won’t be refused for stylistic reasons, but large parts of it may be rewritten and nobody wants that.
6. As you are coding, periodically merge in work from the master branch and verify you haven’t broken anything by running the test suite.
7. Write documentation. Seriously.
8. Once it works, is tested, and has documentation, submit a pull request on GitHub.
9. Wait for it to either be merged or to receive a comment about what needs to be fixed.
10. Rejoice.

Legalese
--------

To the extent that they care, contributors should keep in mind that the source of agate, and therefore any contributions, is licensed under the permissive [MIT license]. By submitting a patch or pull request you are agreeing to release your code under this license.
You will be acknowledged in the AUTHORS list, the commit history and the hearts and minds of journalists everywhere.

[numpy]: https://numpy.org/
[pandas]: https://pandas.pydata.org/
[GitHub]: https://github.com/wireservice/agate
[issue tracker]: https://github.com/wireservice/agate/issues
[MIT license]: https://opensource.org/license/mit/

agate-1.9.1/.github/workflows/ci.yml

name: CI
on: [push, pull_request]
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [macos-latest, windows-latest, ubuntu-latest]
        python-version: [3.8, 3.9, '3.10', '3.11', '3.12', pypy-3.9]
    steps:
      - if: matrix.os == 'ubuntu-latest'
        name: Install UTF-8 locales and lxml requirements
        run: |
          sudo apt install libxml2-dev libxslt-dev
          sudo locale-gen de_DE.UTF-8
          sudo locale-gen en_US.UTF-8
          sudo locale-gen ko_KR.UTF-8
          sudo update-locale
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
          cache-dependency-path: setup.py
      - run: pip install .[test] coveralls
      - env:
          LANG: en_US.UTF-8
          PYTHONIOENCODING: utf-8
          PYTHONUTF8: 1
        run: pytest --cov agate
      - run: python charts.py
      - env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: coveralls --service=github

agate-1.9.1/.github/workflows/lint.yml

name: Lint
on: [push, pull_request]
env:
  BASEDIR: https://raw.githubusercontent.com/open-contracting/standard-maintenance-scripts/main
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: setup.py
      - run: pip install --upgrade check-manifest flake8 isort setuptools
      - run: check-manifest
      - run: flake8 .
      - run: isort . --check-only

agate-1.9.1/.github/workflows/pypi.yml

name: Publish to PyPI
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - run: pip install --upgrade build
      - run: python -m build --sdist --wheel
      - name: Publish to TestPyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.TEST_PYPI_API_TOKEN }}
          repository-url: https://test.pypi.org/legacy/
          skip-existing: true
      - name: Publish to PyPI
        if: startsWith(github.ref, 'refs/tags')
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}

agate-1.9.1/.gitignore

.DS_Store
*.pyc
*.swp
*.swo
*.egg-info
docs/_build
dist
.coverage
build
.proof
.ipynb_checkpoints
.idea

agate-1.9.1/.pre-commit-config.yaml

repos:
  - repo: https://github.com/pycqa/flake8
    rev: 3.9.2
    hooks:
      - id: flake8
  - repo: https://github.com/pycqa/isort
    rev: 5.8.0
    hooks:
      - id: isort
  - repo: https://github.com/mgedmin/check-manifest
    rev: "0.46"
    hooks:
      - id: check-manifest

agate-1.9.1/.readthedocs.yaml

version: 2
build:
  os: ubuntu-20.04
  tools:
    python: "3.9"
python:
  install:
    - path: .
    - requirements: docs/requirements.txt
sphinx:
  fail_on_warning: true

agate-1.9.1/AUTHORS.rst

agate is made by a community.
The following individuals have contributed code, documentation, or expertise to agate:

* `Christopher Groskopf `_
* `Jeff Larson `_
* `Eric Sagara `_
* `John Heasly `_
* `Mick O'Brien `_
* `David Eads `_
* `Nikhil Sonnad `_
* `Matt Riggott `_
* `Tyler Fisher `_
* `William P. Davis `_
* `Ryan Murphy `_
* `Raphael Deem `_
* `Robin Linderborg `_
* `Chris Keller `_
* `Neil Bedi `_
* `Geoffrey Hing `_
* `Taurus Olson `_
* `Danny Page `_
* `James McKinney `_
* `Tony Papousek `_
* `Mila Frerichs `_
* `Paul Fitzpatrick `_
* `Ben Welsh `_
* `Kevin Schaul `_
* `sandyp `_
* `Lexie Heinle `_
* `Will Skora `_
* `Joe Germuska `_
* `Eli Murray `_
* `Derek Swingley `_
* `Or Sharir `_
* `Anthony DeBarros `_
* `Apoorv Anand `_
* `Ghislain Antony Vaillant `_
* `Neil Martinsen-Burrell `_
* `Aliaksei Urbanski `_
* `Forest Gregg `_
* `Robert Schütz `_
* `Wouter de Vries `_
* `Kartik Agaram `_
* `Loïc Corbasson `_
* `Danny Sepler `_
* `brian-from-quantrocket `_
* `mathdesc `_
* `Tim Gates `_
* `castorf `_
* `Julien Enselme `__

agate-1.9.1/CHANGELOG.rst

1.9.1 - December 21, 2023
-------------------------

* Add Babel 2.14 support.

1.9.0 - October 17, 2023
------------------------

* feat: Add a ``text_truncation_chars`` configuration for values that exceed ``max_column_width`` in :meth:`.Table.print_table` and :meth:`.Table.print_html`.
* feat: Add a ``number_truncation_chars`` configuration for values that exceed ``max_precision`` in :meth:`.Table.print_table` and :meth:`.Table.print_html`.

1.8.0 - October 10, 2023
------------------------

* feat: Lowercase the ``null_values`` provided to individual data types, since all comparisons to ``null_values`` are case-insensitive. (#770)
* feat: :class:`.Mean` works with :class:`.TimeDelta`. (#761)
* Switch from ``pytz`` to ``ZoneInfo``.
* Add Python 3.12 support.
* Drop Python 3.7 support (end-of-life was June 27, 2023).
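The 1.8.0 switch from ``pytz`` to ``ZoneInfo`` replaces a third-party dependency with the standard library's ``zoneinfo`` module. The sketch below illustrates the standard-library API itself, not agate's internals:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9; replaces pytz

# With zoneinfo, the timezone is attached at construction time;
# no pytz-style localize()/normalize() dance is needed.
dt = datetime(2023, 10, 10, 12, 0, tzinfo=ZoneInfo("America/New_York"))

print(dt.utcoffset() == timedelta(hours=-4))  # True: New York observes EDT in October
```

Arithmetic across a daylight-saving boundary also does the right thing automatically, which is one reason the standard library module is the simpler choice.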
1.7.1 - January 4, 2023
-----------------------

* Allow parsedatetime 2.6.

1.7.0 - January 3, 2023
-----------------------

* Add Python 3.10 and 3.11 support.
* Drop support for Python 2.7 (EOL 2020-01-01), 3.6 (2021-12-23).

1.6.3 - July 15, 2021
---------------------

* feat: :meth:`.Table.from_csv` accepts a ``row_limit`` keyword argument. (#740)
* feat: :meth:`.Table.from_json` accepts an ``encoding`` keyword argument. (#734)
* feat: :meth:`.Table.print_html` accepts a ``max_precision`` keyword argument, like :meth:`.Table.print_table`. (#753)
* feat: :class:`.TypeTester` accepts a ``null_values`` keyword argument, like individual data types. (#745)
* feat: :class:`.Min`, :class:`.Max` and :class:`.Sum` work with :class:`.TimeDelta`. (#735)
* feat: :class:`.FieldSizeLimitError` includes the line number in the error message. (#681)
* feat: :class:`.csv.Sniffer` warns on error while sniffing CSV dialect.
* fix: :meth:`.Table.normalize` works with basic processing methods. (#691)
* fix: :meth:`.Table.homogenize` works with basic processing methods. (#756)
* fix: :meth:`.Table.homogenize` casts ``compare_values`` and ``default_row``. (#700)
* fix: :meth:`.Table.homogenize` accepts tuples. (#710)
* fix: :meth:`.TableSet.group_by` accepts input with no rows. (#703)
* fix: :class:`.TypeTester` warns if a column specified by the ``force`` argument is not in the table, instead of raising an error. (#747)
* fix: Aggregations return ``None`` if all values are ``None``, instead of raising an error. Note that ``Sum``, ``MaxLength`` and ``MaxPrecision`` continue to return ``0`` if all values are ``None``. (#706)
* fix: Ensure files are closed when errors occur. (#734)
* build: Make PyICU an optional dependency.
* Drop support for Python 3.4 (2019-03-18), 3.5 (2020-09-13).

1.6.2 - March 10, 2021
----------------------

* feat: :meth:`.Date.__init__` and :meth:`.DateTime.__init__` accept a ``locale`` keyword argument (e.g. :code:`en_US`) for parsing formatted dates.
  (#730)
* feat: :meth:`.Number.cast` casts ``True`` to ``1`` and ``False`` to ``0``. (#733)
* fix: :meth:`.utils.max_precision` ignores infinity when calculating precision. (#726)
* fix: :meth:`.Date.cast` catches ``OverflowError`` when type testing. (#720)
* Included examples in Python package. (#716)

1.6.1 - March 11, 2018
----------------------

* feat: :meth:`.Table.to_json` can use Decimal as keys. (#696)
* fix: :meth:`.Date.cast` and :meth:`.DateTime.cast` no longer parse non-date strings that contain date sub-strings as dates. (#705)
* docs: Link to tutorial now uses version through Sphinx to avoid bad links on future releases. (#682)

1.6.0 - February 28, 2017
-------------------------

This update should not cause any breaking changes; however, it is being classified as a major release because the dependency on awesome-slugify, which is licensed with GPLv3, has been replaced with python-slugify, which is licensed with MIT.

* Suppress warning from babel about Time Zone expressions on Python 3.6. (#665)
* Reimplemented slugify with python-slugify instead of awesome-slugify. (#660)
* Slugify renaming of duplicate values is now consistent with :meth:`.Table.init`. (#615)

1.5.5 - December 29, 2016
-------------------------

* Added a "full outer join" example to the SQL section of the cookbook. (#658)
* Warnings are now more explicit when column names are missing. (#652)
* :meth:`.Date.cast` will no longer parse strings like :code:`05_leslie3d_base` as dates. (#653)
* :meth:`.Text.cast` will no longer strip leading or trailing whitespace. (#654)
* Fixed :code:`'NoneType' object has no attribute 'groupdict'` error in :meth:`.TimeDelta.cast`. (#656)

1.5.4 - December 27, 2016
-------------------------

* Cleaned up handling of warnings in tests.
* Blank column names are now treated as unspecified (letter names will be generated).

1.5.3 - December 26, 2016
-------------------------

This is a minor release that adds one feature: sequential joins (by row number).
It also fixes several small bugs blocking a downstream release of csvkit.

* Fixed a bug where empty :class:`.Table` column names would be initialized as a list instead of a tuple.
* :meth:`.Table.join` can now join by row numbers—a sequential join.
* :meth:`.Table.join` now supports full outer joins via the ``full_outer`` keyword.
* :meth:`.Table.join` can now accept column indices instead of column names.
* :meth:`.Table.from_csv` now buffers input files to prevent issues with using STDIN as an input.

1.5.2 - December 24, 2016
-------------------------

* Improved handling of non-ascii encoded CSV files under Python 2.

1.5.1 - December 23, 2016
-------------------------

This is a minor release fixing several small bugs that were blocking a downstream release of csvkit.

* Documented differing behavior of :class:`.MaxLength` under Python 2. (#649)
* agate is now tested against Python 3.6. (#650)
* Fix bug when :class:`.MaxLength` was called on an all-null column.
* Update extensions documentation to match new API. (#645)
* Fix bug in :class:`.Change` and :class:`.PercentChange` where ``0`` values could cause ``None`` to be returned incorrectly.

1.5.0 - November 16, 2016
-------------------------

This release adds SVG charting via the `leather `_ charting library. Chart methods have been added for both :class:`.Table` and :class:`.TableSet`. (The latter create lattice plots.) See the revised tutorial and new cookbook entries for examples.

Leather is still an early library. Please `report any bugs `_.

Also in this release are a :class:`.Slugify` computation and a variety of small fixes and improvements. The complete list of changes is as follows:

* Remove support for monkey-patching of extensions. (#594)
* :class:`.TableSet` methods which proxy :class:`.Table` methods now appear in the API docs. (#640)
* :class:`.Any` and :class:`.All` aggregations no longer behave differently for boolean data.
  (#636)
* :class:`.Any` and :class:`.All` aggregations now accept a single value as a test argument, in addition to a function.
* :class:`.Any` and :class:`.All` aggregations now require a test argument.
* Tables rendered by :meth:`.Table.print_table` are now GitHub Flavored Markdown (GFM) compatible. (#626)
* The agate tutorial has been converted to a Jupyter Notebook.
* :class:`.Table` now supports ``len`` as a proxy for ``len(table.rows)``.
* Simple SVG charting is now integrated via `leather `_.
* Added :class:`.First` computation. (#634)
* :meth:`.Table.print_table` now has a `max_precision` argument to limit Number precision. (#544)
* Slug computation now accepts an array of column names to merge. (#617)
* Cookbook: standardize column values with :class:`.Slugify` computation. (#613)
* Cookbook: slugify/standardize row and column names. (#612)
* Fixed condition that prevents integer row names to allow bools in :meth:`.Table.__init__`. (#627)
* :class:`.PercentChange` is now null-safe, returns None for null values. (#623)
* :class:`.Table` can now be iterated, yielding :class:`Row` instances. (Previously it was necessary to iterate :code:`table.rows`.)

1.4.0 - May 26, 2016
--------------------

This release adds several new features, fixes numerous small bugs, and improves performance for common use cases. There are some minor breaking changes, but few users are likely to encounter them. The most important changes in this release are:

1. There is now a :meth:`.TableSet.having` method, which behaves similarly to SQL's ``HAVING`` keyword.
2. :meth:`.Table.from_csv` is much faster. In particular, the type inference routines for parsing numbers have been optimized.
3. The :meth:`.Table.compute` method now accepts a ``replace`` keyword which allows new columns to replace existing columns "in place." (As with all agate operations, a new table is still created.)
4. There is now a :class:`.Slug` computation which can be used to compute a column of slugs.
   The :meth:`.Table.rename` method has also added new options for slugifying column and row names.

The complete list of changes is as follows:

* Added a deprecation warning for ``patch`` methods. New extensions should not use it. (#594)
* Added :class:`.Slug` computation. (#466)
* Added ``slug_columns`` and ``slug_rows`` arguments to :meth:`Table.rename`. (#466)
* Added :meth:`.utils.slugify` to standardize a sequence of strings. (#466)
* :meth:`.Table.__init__` now prints row and column on ``CastError``. (#593)
* Fix null sorting in :meth:`.Table.order_by` when ordering by multiple columns. (#607)
* Implemented configuration system.
* Fixed bug in :meth:`.Table.print_bars` when ``value_column`` contains ``None``. (#608)
* :meth:`.Table.print_table` now restricts header on max_column_width. (#605)
* Cookbook: filling gaps in a dataset with Table.homogenize. (#538)
* Reduced memory usage and improved performance of :meth:`.Table.from_csv`.
* :meth:`.Table.from_csv` no longer accepts a sequence of row ids for :code:`skip_lines`.
* :meth:`.Number.cast` is now three times as fast.
* :class:`.Number` now accepts :code:`group_symbol`, :code:`decimal_symbol` and :code:`currency_symbols` arguments. (#224)
* Tutorial: clean up state data under computing columns. (#570)
* :meth:`.Table.__init__` now explicitly checks that ``row_names`` are not ints. (#322)
* Cookbook: CPI deflation, agate-lookup. (#559)
* :meth:`.Table.bins` now includes values outside ``start`` or ``end`` in computed ``column_names``. (#596)
* Fixed bug in :meth:`.Table.bins` where ``start`` or ``end`` arguments were ignored when specified alone. (#599)
* :meth:`.Table.compute` now accepts a :code:`replace` argument that allows columns to be overwritten. (#597)
* :meth:`.Table.from_fixed` now creates an agate table from a fixed-width file. (#358)
* :mod:`.fixed` now implements a general-purpose fixed-width file reader. (#358)
* :class:`TypeTester` now correctly parses negative currency values as Number.
  (#595)
* Cookbook: removing a column (`select` and `exclude`). (#592)
* Cookbook: overriding specific column types. (#591)
* :class:`.TableSet` now has a :meth:`.TableSet._fork` method used internally for deriving new tables.
* Added an example of SQL's :code:`HAVING` to the cookbook.
* :meth:`.Table.aggregate` interface has been revised to be more similar to :meth:`.TableSet.aggregate`.
* :meth:`.TableSet.having` is now implemented. (#587)
* There is now a better error when a forced column name does not exist. (#591)
* Arguments to :meth:`.Table.print_html` now mirror :meth:`.Table.print_table`.

1.3.1 - March 30, 2016
----------------------

The major feature of this release is new API documentation. Several minor features and bug fixes are also included. There are no major breaking changes in this release. Internally, the agate codebase has been reorganized to be more modular, but this should be invisible to most users.

* The :class:`.MaxLength` aggregation now returns a `Decimal` object. (#574)
* Fixed an edge case where datetimes were parsed as dates. (#568)
* Fixed column alignment in tutorial tables. (#572)
* :meth:`.Table.print_table` now defaults to printing ``20`` rows and ``6`` columns. (#589)
* Added Eli Murray to AUTHORS.
* :meth:`.Table.__init__` now accepts a dict to specify partial column types. (#580)
* :meth:`.Table.from_csv` now accepts a ``skip_lines`` argument. (#581)
* Moved every :class:`.Aggregation` and :class:`.Computation` into their own modules. (#565)
* :class:`.Column` and :class:`.Row` are now importable from `agate`.
* Completely reorganized the API documentation.
* Moved unit tests into modules to match new code organization.
* Moved major :class:`.Table` and :class:`.TableSet` methods into their own modules.
* Fixed bug when using non-unicode encodings with :meth:`.Table.from_csv`. (#560)
* :meth:`.Table.homogenize` now accepts an array of values as compare values if key is a single column name.
  (#539)

1.3.0 - February 28, 2016
-------------------------

This version implements several new features and includes two major breaking changes. Please take note of the following breaking changes:

1. There is no longer a :code:`Length` aggregation. The more obvious :class:`.Count` is now used instead.
2. Agate's replacements for Python's CSV reader and writer have been moved to the :code:`agate.csv` namespace. To use as a drop-in replacement: :code:`from agate import csv`.

The major new features in this release are primarily related to transforming (reshaping) tables. They are:

1. :meth:`.Table.normalize` for converting columns to rows.
2. :meth:`.Table.denormalize` for converting rows to columns.
3. :meth:`.Table.pivot` for generating "crosstabs".
4. :meth:`.Table.homogenize` for filling gaps in data series.

Please see the following complete list of changes for a variety of other bug fixes and improvements.

* Moved CSV reader/writer to :code:`agate.csv` namespace.
* Added numerous new examples to the R section of the cookbook. (#529-#535)
* Updated Excel cookbook entry for pivot tables. (#536)
* Updated Excel cookbook entry for VLOOKUP. (#537)
* Fix number rendering in :meth:`.Table.print_table` on Windows. (#528)
* Added cookbook examples of using :meth:`.Table.pivot` to count frequency/distribution.
* :meth:`.Table.bins` now has smarter output column names. (#524)
* :meth:`.Table.bins` is now a wrapper around pivot. (#522)
* :meth:`.Table.counts` has been removed. Use :meth:`.Table.pivot` instead. (#508)
* :class:`.Count` can now count non-null values in a column.
* Removed :class:`.Length`. :class:`.Count` now works without any arguments. (#520)
* :meth:`.Table.pivot` implemented. (#495)
* :meth:`.Table.denormalize` implemented. (#493)
* Added ``columns`` argument to :meth:`Table.join`. (#479)
* Cookbook: Custom statistics/agate.Summary
* Added Kevin Schaul to AUTHORS.
* :meth:`Quantiles.locate` now correctly returns `Decimal` instances.
  (#509)
* Cookbook: Filter for distinct values of a column. (#498)
* Added :meth:`.Column.values_distinct()`. (#498)
* Cookbook: Fuzzy phonetic search example. (#207)
* Cookbook: Create a table from a remote file. (#473)
* Added ``printable`` argument to :meth:`.Table.print_bars` to use only printable characters. (#500)
* :class:`.MappedSequence` now throws an explicit error on __setitem__. (#499)
* Added ``require_match`` argument to :meth:`.Table.join`. (#480)
* Cookbook: Rename columns in a table. (#469)
* :meth:`.Table.normalize` implemented. (#487)
* Added :class:`.Percent` computation with example in Cookbook. (#490)
* Added Ben Welsh to AUTHORS.
* :meth:`.Table.__init__` now throws a warning if auto-generated columns are used. (#483)
* :meth:`.Table.__init__` no longer fails on duplicate columns. Instead it renames them and throws a warning. (#484)
* :meth:`.Table.merge` now takes a ``column_names`` argument to specify columns included in new table. (#481)
* :meth:`.Table.select` now accepts a single column name as a key.
* :meth:`.Table.exclude` now accepts a single column name as a key.
* Added :meth:`.Table.homogenize` to find gaps in a table and fill them with default rows. (#407)
* :meth:`.Table.distinct` now accepts sequences of column names as a key.
* :meth:`.Table.join` now accepts sequences of column names as either a left or right key. (#475)
* :meth:`.Table.order_by` now accepts a sequence of column names as a key.
* :meth:`.Table.distinct` now accepts a sequence of column names as a key.
* :meth:`.Table.join` now accepts a sequence of column names as either a left or right key. (#475)
* Cookbook: Create a table from a DBF file. (#472)
* Cookbook: Create a table from an Excel spreadsheet.
* Added explicit error if a filename is passed to the :class:`.Table` constructor. (#438)

1.2.2 - February 5, 2016
------------------------

This release adds several minor features.
The only breaking change is that default column names will now be lowercase instead of uppercase. If you depended on these names in your scripts you will need to update them accordingly.

* :class:`.TypeTester` no longer takes a ``locale`` argument. Use ``types`` instead.
* :class:`.TypeTester` now takes a ``types`` argument that is a list of possible types to test. (#461)
* Null conversion can now be disabled for :class:`.Text` by passing ``cast_nulls=False``. (#460)
* Default column names are now lowercase letters instead of uppercase. (#464)
* :meth:`.Table.merge` can now merge tables with different columns or columns in a different order. (#465)
* :meth:`.MappedSequence.get` will no longer raise ``KeyError`` if a default is not provided. (#467)
* :class:`.Number` can now test/cast the ``long`` type on Python 2.

1.2.1 - February 5, 2016
------------------------

This release implements several new features and bug fixes. There are no significant breaking changes.

Special thanks to `Neil Bedi `_ for his extensive contributions to this release.

* Added a ``max_column_width`` argument to :meth:`.Table.print_table`. Defaults to ``20``. (#442)
* :meth:`.Table.from_json` now defers most functionality to :meth:`.Table.from_object`.
* Implemented :meth:`.Table.from_object` for parsing JSON-like Python objects.
* Fixed a bug that prevented :meth:`.Table.order_by` on an empty table. (#454)
* :meth:`.Table.from_json` and :meth:`TableSet.from_json` now have ``column_types`` as an optional argument. (#451)
* :class:`.csv.Reader` now has ``line_numbers`` and ``header`` options to add a column for line numbers. (#447)
* Renamed ``maxfieldsize`` to ``field_size_limit`` in :class:`.csv.Reader` for consistency. (#447)
* :meth:`.Table.from_csv` now has a ``sniff_limit`` option to use :class:`.csv.Sniffer`. (#444)
* :class:`.csv.Sniffer` implemented. (#444)
* :meth:`.Table.__init__` no longer fails on empty rows. (#445)
* :meth:`.TableSet.from_json` implemented.
  (#373)
* Fixed a bug that broke :meth:`TypeTester.run` on variable row length. (#440)
* Added :meth:`.TableSet.__str__` to display :class:`.Table` keys and row counts. (#418)
* Fixed a bug that incorrectly checked for column_types equivalence in :meth:`.Table.merge` and :meth:`.TableSet.__init__`. (#435)
* :meth:`.TableSet.merge` now has the ability to specify grouping factors with ``group``, ``group_name`` and ``group_type``. (#406)
* :class:`.Table` can now be constructed with ``None`` for some column names. Those columns will receive letter names. (#432)
* Slightly changed the parsing of dates and datetimes from strings.
* Numbers are now written to CSV without extra zeros after the decimal point. (#429)
* Made it possible for ``datetime.date`` instances to be considered valid :class:`.DateTime` inputs. (#427)
* Changed preference order in type testing so :class:`.Date` is preferred to :class:`.DateTime`.
* Removed ``float_precision`` argument from :class:`.Number`. (#428)
* :class:`.AgateTestCase` is now available as ``agate.AgateTestCase``. (#426)
* :meth:`.TableSet.to_json` now has an ``indent`` option for use with ``nested``.
* :meth:`.TableSet.to_json` now has a ``nested`` option for writing a single, nested JSON file. (#417)
* :meth:`.TestCase.assertRowNames` and :meth:`.TestCase.assertColumnNames` now validate the row and column instance keys.
* Fixed a bug that prevented :meth:`.Table.rename` from renaming column names in :class:`.Row` instances. (#423)

1.2.0 - January 18, 2016
------------------------

This version introduces one breaking change, which is only relevant if you are using custom :class:`.Computation` subclasses.

1. :class:`.Computation` has been modified so that :meth:`.Computation.run` takes a :class:`.Table` instance as its argument, rather than a single row. It must return a sequence of values to use for a new column.
   In addition, the :meth:`.Computation._prepare` method has been renamed to :meth:`.Computation.validate` to more accurately describe its function.

These changes were made to facilitate computing moving averages, streaks and other values that require data for the full column.

* Existing :class:`.Aggregation` subclasses have been updated to use :meth:`.Aggregate.validate`. (This brings a noticeable performance boost.)
* :class:`.Aggregation` now has a :meth:`.Aggregation.validate` method that functions identically to :meth:`.Computation.validate`. (#421)
* :meth:`.Change.validate` now correctly raises :class:`.DataTypeError`.
* Added a ``SimpleMovingAverage`` implementation to the cookbook's examples of custom :class:`.Computation` classes.
* :meth:`.Computation._prepare` has been renamed to :meth:`.Computation.validate`.
* :meth:`.Computation.run` now takes a :class:`.Table` instance as an argument. (#415)
* Fix a bug in Python 2 where printing a table could raise ``decimal.InvalidOperation``. (#412)
* Fix :class:`.Rank` so it returns Decimal. (#411)
* Added Taurus Olson to AUTHORS.
* Printing a table will now print the table's structure.
* :meth:`.Table.print_structure` implemented. (#393)
* Added Geoffrey Hing to AUTHORS.
* :meth:`.Table.print_html` implemented. (#408)
* Instances of :class:`.Date` and :class:`.DateTime` can now be pickled. (#362)
* :class:`.AgateTestCase` is available as ``agate.testcase.AgateTestCase`` for extensions to use. (#384)
* :meth:`.Table.exclude` implemented. Opposite of :meth:`.Table.select`. (#388)
* :meth:`.Table.merge` now accepts a ``row_names`` argument. (#403)
* :class:`.Formula` now automatically casts computed values to specified data type unless ``cast`` is set to ``False``. (#398)
* Added Neil Bedi to AUTHORS.
* :meth:`.Table.rename` is implemented. (#389)
* :meth:`.TableSet.to_json` is implemented. (#374)
* :meth:`.Table.to_csv` and :meth:`.Table.to_json` will now create the target directory if it does not exist.
  (#392)
* :class:`.Boolean` will now correctly cast numerical ``0`` and ``1``. (#386)
* :meth:`.Table.merge` now consistently maps column names to rows. (#402)

1.1.0 - November 4, 2015
------------------------

This version of agate introduces three major changes.

1. :class:`.Table`, :meth:`.Table.from_csv` and :meth:`.TableSet.from_csv` now all take ``column_names`` and ``column_types`` as separate arguments instead of as a sequence of tuples. This was done to enable more flexible type inference and to streamline the API.
2. The interfaces for :meth:`.TableSet.aggregate` and :meth:`.Table.compute` have been changed. In both cases the new column name now comes first. Aggregations have also been modified so that the input column name is an argument to the aggregation class, rather than a third element in the tuple.
3. This version drops support for Python 2.6. Testing and bug-fixing for this version was taking substantial time with no evidence that anyone was actually using it. Also, multiple dependencies claim to not support 2.6, even though agate's tests were passing.

* DataTypes now have :meth:`.DataType.csvify` and :meth:`.DataType.jsonify` methods for serializing native values.
* Added a dependency on `isodate `_ for handling ISO8601 formatted dates. (#233)
* :class:`.Aggregation` results are no longer cached. (#378)
* Removed `Column.aggregate` method. Use :meth:`.Table.aggregate` instead. (#378)
* Added :meth:`.Table.aggregate` for aggregating single column results. (#378)
* :class:`.Aggregation` subclasses now take column names as their first argument. (#378)
* :meth:`.TableSet.aggregate` and :meth:`.Table.compute` now take the new column name as the first argument. (#378)
* Remove support for Python 2.6.
* :meth:`.Table.to_json` is implemented. (#345)
* :meth:`.Table.from_json` is implemented. (#344, #347)
* :class:`.Date` and :class:`.DateTime` type testing now takes specified format into account.
  (#361)
* :class:`.Number` data type now takes a ``float_precision`` argument.
* :class:`.Number` data types now work with native float values. (#370)
* :class:`.TypeTester` can now validate Python native types (not just strings). (#367)
* :class:`.TypeTester` can now be used with the :class:`.Table` constructor, not just :meth:`.Table.from_csv`. (#350)
* :class:`.Table`, :meth:`.Table.from_csv` and :meth:`.TableSet.from_csv` now take ``column_names`` and ``column_types`` as separate parameters. (#350)
* :const:`.DEFAULT_NULL_VALUES` (the list of strings that mean null) is now importable from ``agate``.
* :meth:`.Table.from_csv` and :meth:`.Table.to_csv` are now unicode-safe without separately importing csvkit.
* ``agate`` can now be used as a drop-in replacement for Python's ``csv`` module.
* Migrated `csvkit `_'s unicode CSV reading/writing support into agate. (#354)

1.0.1 - October 29, 2015
------------------------

* TypeTester now takes a "limit" arg that restricts how many rows it tests. (#332)
* Table.from_csv now supports CSVs with neither headers nor manual column names.
* Tables can now be created with automatically generated column names. (#331)
* File handles passed to Table.to_csv are now left open. (#330)
* Added Table.print_csv method. (#307, #339)
* Fixed stripping currency symbols when casting Numbers from strings. (#333)
* Fixed two major join issues. (#336)

1.0.0 - October 22, 2015
------------------------

* Table.from_csv now defaults to TypeTester() if column_info is not provided. (#324)
* New tutorial section: "Navigating table data" (#315)
* 100% test coverage reached. (#312)
* NullCalculationError is now a warning instead of an error. (#311)
* TableSet is now a subclass of MappedSequence.
* Rows and Columns are now subclasses of MappedSequence.
* Add Column.values_without_nulls_sorted().
* Column.get_data_without_nulls() is now Column.values_without_nulls().
* Column.get_data_sorted() is now Column.values_sorted().
* Column.get_data() is now Column.values().
* Columns can now be sliced.
* Columns can now be indexed by row name. (#301)
* Added support for Python 3.5.
* Row objects can now be sliced. (#303)
* Replaced RowSequence and ColumnSequence with MappedSequence.
* Replaced RowDoesNotExistError with KeyError.
* Replaced ColumnDoesNotExistError with IndexError.
* Removed unnecessary custom RowIterator, ColumnIterator and CellIterator.
* Performance improvements for Table "forks". (where, limit, etc.)
* TableSet keys are now converted to row names during aggregation. (#291)
* Removed fancy __repr__ implementations. Use __str__ instead. (#290)
* Rows can now be accessed by name as well as index. (#282)
* Added row_names argument to Table constructor. (#282)
* Removed Row.table and Row.index properties. (#287)
* Columns can now be accessed by index as well as name. (#281)
* Added column name and type validation to Table constructor. (#285)
* Table now supports variable-length rows during construction. (#39)
* aggregations.Summary implemented for generic aggregations. (#181)
* Fix TableSet.key_type being lost after proxying Table methods. (#278)
* Massive performance increases for joins. (#277)
* Added join benchmark. (#73)

0.11.0 - October 6, 2015
------------------------

* Implemented __repr__ for Table, TableSet, Column and Row. (#261)
* Row.index property added.
* Column constructor no longer takes a data_type argument.
* Column.index and Column.name properties added.
* Table.counts implemented. (#271)
* Table.bins implemented. (#267, #227)
* Table.join now raises ColumnDoesNotExistError. (#264)
* Table.select now raises ColumnDoesNotExistError.
* computations.ZScores moved into agate-stats.
* computations.Rank cmp argument renamed comparer.
* aggregations.MaxPrecision added. (#265)
* Table.print_bars added.
* Table.pretty_print renamed Table.print_table.
* Reimplement Table method proxying via @allow_tableset_proxy decorator. (#263)
* Add agate-stats references to docs.
* Move stdev_outliers, mad_outliers and pearson_correlation into agate-stats. (#260)
* Prevent issues with applying patches multiple times. (#258)

0.10.0 - September 22, 2015
---------------------------

* Add reverse and cmp arguments to Rank computation. (#248)
* Document how to use agate-sql to read/write SQL tables. (#238, #241)
* Document how to write extensions.
* Add monkeypatching extensibility pattern via utils.Patchable.
* Reversed order of argument pairs for Table.compute. (#249)
* TableSet.merge method can be used to ungroup data. (#253)
* Columns with identical names are now suffixed "2" after a Table.join.
* Duplicate key columns are no longer included in the result of a Table.join. (#250)
* Table.join right_key no longer necessary if identical to left_key. (#254)
* Table.inner_join is now deprecated. Use the ``inner`` keyword to Table.join.
* Table.left_outer_join is now Table.join.

0.9.0 - September 14, 2015
--------------------------

* Add many missing unit tests. Up to 99% coverage.
* Add property accessors for TableSet.key_name and TableSet.key_type. (#247)
* Table.rows and Table.columns are now behind properties. (#247)
* Column.data_type is now a property. (#247)
* Table[Set].get_column_types() is now the Table[Set].column_types property. (#247)
* Table[Set].get_column_names() is now the Table[Set].column_names property. (#247)
* Table.pretty_print now displays consistent decimal places for each Number column.
* Discrete data types (Number, Date etc.) are now right-aligned in Table.pretty_print.
* Implement aggregation result caching. (#245)
* Reimplement Percentiles, Quartiles, etc. as aggregations.
* UnsupportedAggregationError is now used to disable TableSet aggregations.
* Replaced several exceptions with more general DataTypeError.
* Column type information can now be accessed as Column.data_type.
* Eliminated Column subclasses. Restructured around DataType classes.
* Table.merge implemented. (#9)
* Cookbook: guess column types.
  (#230)
* Fix issue where all group keys were being cast to text. (#235)
* Table.group_by will now default key_type to the type of the grouping column. (#234)
* Add Matt Riggott to AUTHORS. (#231)
* Support file-like objects in Table.to_csv and Table.from_csv. (#229)
* Fix bug when applying multiple computations with Table.compute.

0.8.0 - September 9, 2015
-------------------------

* Cookbook: dealing with locales. (#220)
* Cookbook: working with dates and times.
* Add timezone support to DateTimeType.
* Use pytimeparse instead of python-dateutil. (#221)
* Handle percents and currency symbols when casting numbers. (#217)
* Table.format is now Table.pretty_print. (#223)
* Rename TextType to Text, NumberType to Number, etc.
* Rename agate.ColumnType to agate.DataType. (#216)
* Rename agate.column_types to agate.data_types.
* Implement locale support for number parsing. (#116)
* Cookbook: ranking. (#110)
* Cookbook: date change and date ranking. (#113)
* Add tests for unicode support. (#138)
* Fix computations.ZScores calculation. (#123)
* Differentiate sample and population variance and stdev. (#208)
* Support for overriding column inference with "force".
* Competition ranking implemented as default. (#125)
* TypeTester: robust type inference. (#210)

0.7.0 - September 3, 2015
-------------------------

* Cookbook: USA Today diversity index.
* Cookbook: filter to top x%. (#47)
* Cookbook: fuzzy string search example. (#176)
* Values to coerce to true/false can now be overridden for BooleanType.
* Values to coerce to null can now be overridden for all ColumnType subclasses. (#206)
* Add key_type argument to TableSet and Table.group_by. (#205)
* Nested TableSets and multi-dimensional aggregates. (#204)
* TableSet.aggregate will now use key_name as the group column name. (#203)
* Added key_name argument to TableSet and Table.group_by.
* Added Length aggregation and removed count from TableSet.aggregate output.
  (#203)
* Fix error messages for RowDoesNotExistError and ColumnDoesNotExistError.

0.6.0 - September 1, 2015
-------------------------

* Fix missing package definition in setup.py.
* Split Analysis off into the proof library.
* Change computation now works with DateType, DateTimeType and TimeDeltaType. (#159)
* TimeDeltaType and TimeDeltaColumn implemented.
* NonNullAggregation class removed.
* Some private Column methods made public. (#183)
* Rename agate.aggegators to agate.aggregations.
* TableSet.to_csv implemented. (#195)
* TableSet.from_csv implemented. (#194)
* Table.to_csv implemented. (#169)
* Table.from_csv implemented. (#168)
* Added Table.format method for pretty-printing tables. (#191)
* Analysis class now implements a caching workflow. (#171)

0.5.0 - August 28, 2015
-----------------------

* Table now takes (column_name, column_type) pairs. (#180)
* Renamed the library to agate. (#179)
* Results of common column operations are now cached using a common memoize decorator. (#162)
* Removed support for Python version 3.2.
* Added support for Python wheel packaging. (#127)
* Add PercentileRank computation and usage example to cookbook. (#152)
* Add indexed change example to cookbook. (#151)
* Add annual change example to cookbook. (#150)
* Column.aggregate now invokes Aggregations.
* Column.any, NumberColumn.sum, etc. converted to Aggregations.
* Implement Aggregation and subclasses. (#155)
* Move ColumnType subclasses and ColumnOperation subclasses into new modules.
* Table.percent_change, Table.rank and Table.zscores reimplemented as Computers.
* Computer implemented. Table.compute reimplemented. (#147)
* NumberColumn.iqr (inter-quartile range) implemented. (#102)
* Remove Column.counts as it is not the best way.
* Implement ColumnOperation and subclasses.
* Table.aggregate migrated to TableSet.aggregate.
* Table.group_by now supports grouping by a key function. (#140)
* NumberColumn.deciles implemented.
* NumberColumn.quintiles implemented.
  (#46)
* NumberColumn.quartiles implemented. (#45)
* Added robust test case for NumberColumn.percentiles. (#129)
* NumberColumn.percentiles reimplemented using new method. (#130)
* Reorganized and modularized column implementations.
* Table.group_by now returns a TableSet.
* Implement TableSet object. (#141)

0.4.0 - September 27, 2014
--------------------------

* Upgrade to python-dateutil 2.2. (#134)
* Wrote introductory tutorial. (#133)
* Reorganize documentation. (#132)
* Add John Heasly to AUTHORS.
* Implement percentile. (#35)
* no_null_computations now accepts args. (#122)
* Table.z_scores implemented. (#123)
* DateTimeColumn implemented. (#23)
* Column.counts now returns dict instead of Table. (#109)
* ColumnType.create_column renamed _create_column. (#118)
* Added Mick O'Brien to AUTHORS. (#121)
* Pearson correlation implemented. (#103)

0.3.0
-----

* DateType.date_format implemented. (#112)
* Create ColumnType classes to simplify data parsing.
* DateColumn implemented. (#7)
* Cookbook: Excel pivot tables. (#41)
* Cookbook: statistics, including outlier detection. (#82)
* Cookbook: emulating Underscore's any and all. (#107)
* Parameter documentation for method parameters. (#108)
* Table.rank now accepts a column name or key function.
* Optionally use cdecimal for improved performance. (#106)
* Smart naming of aggregate columns.
* Duplicate column names are now an error. (#92)
* BooleanColumn implemented. (#6)
* TextColumn.max_length implemented. (#95)
* Table.find implemented. (#14)
* Better error handling in Table.__init__. (#38)
* Collapse IntColumn and FloatColumn into NumberColumn. (#64)
* Table.mad_outliers implemented. (#93)
* Column.mad implemented. (#93)
* Table.stdev_outliers implemented. (#86)
* Table.group_by implemented. (#3)
* Cookbook: emulating R. (#81)
* Table.left_outer_join now accepts column names or key functions. (#80)
* Table.inner_join now accepts column names or key functions.
  (#80)
* Table.distinct now accepts a column name or key function. (#80)
* Table.order_by now accepts a column name or key function. (#80)
* Table.rank implemented. (#15)
* Reached 100% test coverage. (#76)
* Tests for Column._cast methods. (#20)
* Table.distinct implemented. (#83)
* Use assertSequenceEqual in tests. (#84)
* Docs: features section. (#87)
* Cookbook: emulating SQL. (#79)
* Table.left_outer_join implemented. (#11)
* Table.inner_join implemented. (#11)

0.2.0
-----

* Python 3.2, 3.3 and 3.4 support. (#52)
* Documented supported platforms.
* Cookbook: csvkit. (#36)
* Cookbook: glob syntax. (#28)
* Cookbook: filter to values in range. (#30)
* RowDoesNotExistError implemented. (#70)
* ColumnDoesNotExistError implemented. (#71)
* Cookbook: percent change. (#67)
* Cookbook: sampling. (#59)
* Cookbook: random sort order. (#68)
* Eliminate Table.get_data.
* Use tuples everywhere. (#66)
* Fixes for Python 2.6 compatibility. (#53)
* Cookbook: multi-column sorting. (#13)
* Cookbook: simple sorting.
* Destructive Table ops now deepcopy row data. (#63)
* Non-destructive Table ops now share row data. (#63)
* Table.sort_by now accepts a function. (#65)
* Cookbook: pygal.
* Cookbook: Matplotlib.
* Cookbook: VLOOKUP. (#40)
* Cookbook: Excel formulas. (#44)
* Cookbook: Rounding to two decimal places. (#49)
* Better repr for Column and Row. (#56)
* Cookbook: Filter by regex. (#27)
* Cookbook: Underscore filter & reject. (#57)
* Table.limit implemented. (#58)
* Cookbook: writing a CSV. (#51)
* Kill Table.filter and Table.reject. (#55)
* Column.map removed. (#43)
* Column instance & data caching implemented. (#42)
* Table.select implemented. (#32)
* Eliminate repeated column index lookups. (#25)
* Precise DecimalColumn tests.
* Use Decimal type everywhere internally.
* FloatColumn converted to DecimalColumn. (#17)
* Added Eric Sagara to AUTHORS. (#48)
* NumberColumn.variance implemented. (#1)
* Cookbook: loading a CSV. (#37)
* Table.percent_change implemented.
  (#16)
* Table.compute implemented. (#31)
* Table.filter and Table.reject now take funcs. (#24)
* Column.count implemented. (#12)
* Column.counts implemented. (#8)
* Column.all implemented. (#5)
* Column.any implemented. (#4)
* Added Jeff Larson to AUTHORS. (#18)
* NumberColumn.mode implemented. (#18)

0.1.0
-----

* Initial prototype

agate-1.9.1/COPYING

The MIT License

Copyright (c) 2017 Christopher Groskopf and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
agate-1.9.1/MANIFEST.in

include *.ipynb
include *.py
include *.rst
include COPYING
recursive-include benchmarks *.py
recursive-include docs *.py
recursive-include docs *.rst
recursive-include docs *.svg
recursive-include docs *.txt
recursive-include docs Makefile
recursive-include examples *.csv
recursive-include examples *.json
recursive-include examples testfixed
recursive-include tests *.py
exclude .pre-commit-config.yaml
exclude .readthedocs.yaml
global-exclude *.pyc

agate-1.9.1/README.rst

.. image:: https://github.com/wireservice/agate/workflows/CI/badge.svg
    :target: https://github.com/wireservice/agate/actions
    :alt: Build status

.. image:: https://coveralls.io/repos/wireservice/agate/badge.svg?branch=master
    :target: https://coveralls.io/r/wireservice/agate
    :alt: Coverage status

.. image:: https://img.shields.io/pypi/dm/agate.svg
    :target: https://pypi.python.org/pypi/agate
    :alt: PyPI downloads

.. image:: https://img.shields.io/pypi/v/agate.svg
    :target: https://pypi.python.org/pypi/agate
    :alt: Version

.. image:: https://img.shields.io/pypi/l/agate.svg
    :target: https://pypi.python.org/pypi/agate
    :alt: License

.. image:: https://img.shields.io/pypi/pyversions/agate.svg
    :target: https://pypi.python.org/pypi/agate
    :alt: Support Python versions

agate is a Python data analysis library that is optimized for humans instead of machines. It is an alternative to numpy and pandas that solves real-world problems with readable code.

agate was previously known as journalism.
Important links:

* Documentation: https://agate.rtfd.org
* Repository: https://github.com/wireservice/agate
* Issues: https://github.com/wireservice/agate/issues

agate-1.9.1/agate/__init__.py

import agate.csv_py3 as csv
from agate.aggregations import *
from agate.columns import Column
from agate.computations import *
from agate.config import get_option, set_option, set_options
from agate.data_types import *
from agate.exceptions import *
# import agate.fixed as fixed
from agate.mapped_sequence import MappedSequence
from agate.rows import Row
from agate.table import Table
from agate.tableset import TableSet
from agate.testcase import AgateTestCase
from agate.type_tester import TypeTester
from agate.utils import *
from agate.warns import DuplicateColumnWarning, NullCalculationWarning, warn_duplicate_column, warn_null_calculation

agate-1.9.1/agate/aggregations/__init__.py

"""
Aggregations create a new value by summarizing a :class:`.Column`. For
example, :class:`.Mean`, when applied to a column containing :class:`.Number`
data, returns a single :class:`decimal.Decimal` value which is the average of
all values in that column.

Aggregations can be applied to single columns using the
:meth:`.Table.aggregate` method. The result is a single value if one
aggregation was applied, or a tuple of values if a sequence of aggregations
was applied.

Aggregations can be applied to instances of :class:`.TableSet` using the
:meth:`.TableSet.aggregate` method. The result is a new :class:`.Table` with a
column for each aggregation and a row for each table in the set.
""" from agate.aggregations.all import All from agate.aggregations.any import Any from agate.aggregations.base import Aggregation from agate.aggregations.count import Count from agate.aggregations.deciles import Deciles from agate.aggregations.first import First from agate.aggregations.has_nulls import HasNulls from agate.aggregations.iqr import IQR from agate.aggregations.mad import MAD from agate.aggregations.max import Max from agate.aggregations.max_length import MaxLength from agate.aggregations.max_precision import MaxPrecision from agate.aggregations.mean import Mean from agate.aggregations.median import Median from agate.aggregations.min import Min from agate.aggregations.mode import Mode from agate.aggregations.percentiles import Percentiles from agate.aggregations.quartiles import Quartiles from agate.aggregations.quintiles import Quintiles from agate.aggregations.stdev import PopulationStDev, StDev from agate.aggregations.sum import Sum from agate.aggregations.summary import Summary from agate.aggregations.variance import PopulationVariance, Variance agate-1.9.1/agate/aggregations/all.py000066400000000000000000000017641454111471500174730ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Boolean class All(Aggregation): """ Check if all values in a column pass a test. :param column_name: The name of the column to check. :param test: Either a single value that all values in the column are compared against (for equality) or a function that takes a column value and returns `True` or `False`. 
""" def __init__(self, column_name, test): self._column_name = column_name if callable(test): self._test = test else: self._test = lambda d: d == test def get_aggregate_data_type(self, table): return Boolean() def validate(self, table): table.columns[self._column_name] def run(self, table): """ :returns: :class:`bool` """ column = table.columns[self._column_name] data = column.values() return all(self._test(d) for d in data) agate-1.9.1/agate/aggregations/any.py000066400000000000000000000016611454111471500175060ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Boolean class Any(Aggregation): """ Check if any value in a column passes a test. :param column_name: The name of the column to check. :param test: Either a single value that all values in the column are compared against (for equality) or a function that takes a column value and returns `True` or `False`. """ def __init__(self, column_name, test): self._column_name = column_name if callable(test): self._test = test else: self._test = lambda d: d == test def get_aggregate_data_type(self, table): return Boolean() def validate(self, table): table.columns[self._column_name] def run(self, table): column = table.columns[self._column_name] data = column.values() return any(self._test(d) for d in data) agate-1.9.1/agate/aggregations/base.py000066400000000000000000000031351454111471500176270ustar00rootroot00000000000000from agate.exceptions import UnsupportedAggregationError class Aggregation: # pragma: no cover """ Aggregations create a new value by summarizing a :class:`.Column`. Aggregations are applied with :meth:`.Table.aggregate` and :meth:`.TableSet.aggregate`. When creating a custom aggregation, ensure that the values returned by :meth:`.Aggregation.run` are of the type specified by :meth:`.Aggregation.get_aggregate_data_type`. This can be ensured by using the :meth:`.DataType.cast` method. See :class:`.Summary` for an example. 
""" def __str__(self): """ String representation of this column. May be used as a column name in generated tables. """ return self.__class__.__name__ def get_aggregate_data_type(self, table): """ Get the data type that should be used when using this aggregation with a :class:`.TableSet` to produce a new column. Should raise :class:`.UnsupportedAggregationError` if this column does not support aggregation into a :class:`.TableSet`. (For example, if it does not return a single value.) """ raise UnsupportedAggregationError() def validate(self, table): """ Perform any checks necessary to verify this aggregation can run on the provided table without errors. This is called by :meth:`.Table.aggregate` before :meth:`run`. """ pass def run(self, table): """ Execute this aggregation on a given column and return the result. """ raise NotImplementedError() agate-1.9.1/agate/aggregations/count.py000066400000000000000000000023241454111471500200440ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Number from agate.utils import default class Count(Aggregation): """ Count occurences of a value or values. This aggregation can be used in three ways: 1. If no arguments are specified, then it will count the number of rows in the table. 2. If only :code:`column_name` is specified, then it will count the number of non-null values in that column. 3. If both :code:`column_name` and :code:`value` are specified, then it will count occurrences of a specific value. :param column_name: The column containing the values to be counted. :param value: Any value to be counted, including :code:`None`. 
""" def __init__(self, column_name=None, value=default): self._column_name = column_name self._value = value def get_aggregate_data_type(self, table): return Number() def run(self, table): if self._column_name is not None: if self._value is not default: return table.columns[self._column_name].values().count(self._value) return len(table.columns[self._column_name].values_without_nulls()) return len(table.rows) agate-1.9.1/agate/aggregations/deciles.py000066400000000000000000000027501454111471500203270ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.percentiles import Percentiles from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import Quantiles from agate.warns import warn_null_calculation class Deciles(Aggregation): """ Calculate the deciles of a column based on its percentiles. Deciles will be equivalent to the 10th, 20th ... 90th percentiles. "Zeroth" (min value) and "Tenth" (max value) deciles are included for reference and intuitive indexing. See :class:`Percentiles` for implementation details. This aggregation can not be applied to a :class:`.TableSet`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Deciles can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): """ :returns: An instance of :class:`Quantiles`. 
""" percentiles = Percentiles(self._column_name).run(table) return Quantiles([percentiles[i] for i in range(0, 101, 10)]) agate-1.9.1/agate/aggregations/first.py000066400000000000000000000023061454111471500200430ustar00rootroot00000000000000from agate.aggregations.base import Aggregation class First(Aggregation): """ Returns the first value that passes a test. If the test is omitted, the aggregation will return the first value in the column. If no values pass the test, the aggregation will raise an exception. :param column_name: The name of the column to check. :param test: A function that takes a value and returns `True` or `False`. Test may be omitted when checking :class:`.Boolean` data. """ def __init__(self, column_name, test=None): self._column_name = column_name self._test = test def get_aggregate_data_type(self, table): return table.columns[self._column_name].data_type def validate(self, table): column = table.columns[self._column_name] data = column.values() if self._test is not None and len([d for d in data if self._test(d)]) == 0: raise ValueError('No values pass the given test.') def run(self, table): column = table.columns[self._column_name] data = column.values() if self._test is None: return data[0] return next(d for d in data if self._test(d)) agate-1.9.1/agate/aggregations/has_nulls.py000066400000000000000000000007451454111471500207110ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Boolean class HasNulls(Aggregation): """ Check if the column contains null values. :param column_name: The name of the column to check. 
""" def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): return Boolean() def run(self, table): return None in table.columns[self._column_name].values() agate-1.9.1/agate/aggregations/iqr.py000066400000000000000000000023111454111471500175030ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.percentiles import Percentiles from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class IQR(Aggregation): """ Calculate the interquartile range of a column. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name self._percentiles = Percentiles(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('IQR can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): percentiles = self._percentiles.run(table) if percentiles[75] is not None and percentiles[25] is not None: return percentiles[75] - percentiles[25] agate-1.9.1/agate/aggregations/mad.py000066400000000000000000000025051454111471500174560ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.median import Median from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import median from agate.warns import warn_null_calculation class MAD(Aggregation): """ Calculate the `median absolute deviation `_ of a column. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name self._median = Median(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('MAD can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls_sorted() if data: m = self._median.run(table) return median(tuple(abs(n - m) for n in data)) agate-1.9.1/agate/aggregations/max.py000066400000000000000000000022201454111471500174740ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Date, DateTime, Number, TimeDelta from agate.exceptions import DataTypeError class Max(Aggregation): """ Find the maximum value in a column. This aggregation can be applied to columns containing :class:`.Date`, :class:`.DateTime`, or :class:`.Number` data. :param column_name: The name of the column to be searched. 
""" def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): column = table.columns[self._column_name] if isinstance(column.data_type, (Date, DateTime, Number, TimeDelta)): return column.data_type def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, (Date, DateTime, Number, TimeDelta)): raise DataTypeError('Min can only be applied to columns containing DateTime, Date or Number data.') def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: return max(data) agate-1.9.1/agate/aggregations/max_length.py000066400000000000000000000023221454111471500210400ustar00rootroot00000000000000from decimal import Decimal from agate.aggregations.base import Aggregation from agate.data_types import Number, Text from agate.exceptions import DataTypeError class MaxLength(Aggregation): """ Find the length of the longest string in a column. Note: On Python 2.7 this function may miscalcuate the length of unicode strings that contain "wide characters". For details see this StackOverflow answer: https://stackoverflow.com/a/35462951 :param column_name: The name of a column containing :class:`.Text` data. """ def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Text): raise DataTypeError('MaxLength can only be applied to columns containing Text data.') def run(self, table): """ :returns: :class:`int`. 
""" column = table.columns[self._column_name] lens = [len(d) for d in column.values_without_nulls()] if not lens: return Decimal('0') return Decimal(max(lens)) agate-1.9.1/agate/aggregations/max_precision.py000066400000000000000000000015641454111471500215610ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import max_precision class MaxPrecision(Aggregation): """ Find the most decimal places present for any value in this column. :param column_name: The name of the column to be searched. """ def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('MaxPrecision can only be applied to columns containing Number data.') def run(self, table): column = table.columns[self._column_name] return max_precision(column.values_without_nulls()) agate-1.9.1/agate/aggregations/mean.py000066400000000000000000000025231454111471500176350ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.sum import Sum from agate.data_types import Number, TimeDelta from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Mean(Aggregation): """ Calculate the mean of a column. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name self._sum = Sum(column_name) def get_aggregate_data_type(self, table): column = table.columns[self._column_name] if isinstance(column.data_type, (Number, TimeDelta)): return column.data_type def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, (Number, TimeDelta)): raise DataTypeError('Mean can only be applied to columns containing Number or TimeDelta data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: sum_total = self._sum.run(table) return sum_total / len(data) agate-1.9.1/agate/aggregations/median.py000066400000000000000000000023171454111471500201530ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.percentiles import Percentiles from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Median(Aggregation): """ Calculate the median of a column. Median is equivalent to the 50th percentile. See :class:`Percentiles` for implementation details. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name self._percentiles = Percentiles(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Median can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): percentiles = self._percentiles.run(table) return percentiles[50] agate-1.9.1/agate/aggregations/min.py000066400000000000000000000022201454111471500174720ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.data_types import Date, DateTime, Number, TimeDelta from agate.exceptions import DataTypeError class Min(Aggregation): """ Find the minimum value in a column. This aggregation can be applied to columns containing :class:`.Date`, :class:`.DateTime`, or :class:`.Number` data. :param column_name: The name of the column to be searched. 
""" def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): column = table.columns[self._column_name] if isinstance(column.data_type, (Date, DateTime, Number, TimeDelta)): return column.data_type def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, (Date, DateTime, Number, TimeDelta)): raise DataTypeError('Min can only be applied to columns containing DateTime, Date or Number data.') def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: return min(data) agate-1.9.1/agate/aggregations/mode.py000066400000000000000000000023211454111471500176350ustar00rootroot00000000000000from collections import defaultdict from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Mode(Aggregation): """ Calculate the mode of a column. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Sum can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: state = defaultdict(int) for n in data: state[n] += 1 return max(state.keys(), key=lambda x: state[x]) agate-1.9.1/agate/aggregations/percentiles.py000066400000000000000000000043721454111471500212360ustar00rootroot00000000000000import math from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import Quantiles from agate.warns import warn_null_calculation class Percentiles(Aggregation): """ Divide a column into 100 equal-size groups using the "CDF" method. See `this explanation `_ of the various methods for computing percentiles. "Zeroth" (min value) and "Hundredth" (max value) percentiles are included for reference and intuitive indexing. A reference implementation was provided by `pycalcstats `_. This aggregation can not be applied to a :class:`.TableSet`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Percentiles can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): """ :returns: An instance of :class:`Quantiles`. 
""" column = table.columns[self._column_name] data = column.values_without_nulls_sorted() if not data: return Quantiles([None for percentile in range(101)]) # Zeroth percentile is first datum quantiles = [data[0]] for percentile in range(1, 100): k = len(data) * (float(percentile) / 100) low = max(1, int(math.ceil(k))) high = min(len(data), int(math.floor(k + 1))) # No remainder if low == high: value = data[low - 1] # Remainder else: value = (data[low - 1] + data[high - 1]) / 2 quantiles.append(value) # Hundredth percentile is final datum quantiles.append(data[-1]) return Quantiles(quantiles) agate-1.9.1/agate/aggregations/quartiles.py000066400000000000000000000027651454111471500207360ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.percentiles import Percentiles from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import Quantiles from agate.warns import warn_null_calculation class Quartiles(Aggregation): """ Calculate the quartiles of column based on its percentiles. Quartiles will be equivalent to the the 25th, 50th and 75th percentiles. "Zeroth" (min value) and "Fourth" (max value) quartiles are included for reference and intuitive indexing. See :class:`Percentiles` for implementation details. This aggregation can not be applied to a :class:`.TableSet`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Quartiles can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): """ :returns: An instance of :class:`Quantiles`. 
""" percentiles = Percentiles(self._column_name).run(table) return Quantiles([percentiles[i] for i in range(0, 101, 25)]) agate-1.9.1/agate/aggregations/quintiles.py000066400000000000000000000027701454111471500207360ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.percentiles import Percentiles from agate.data_types import Number from agate.exceptions import DataTypeError from agate.utils import Quantiles from agate.warns import warn_null_calculation class Quintiles(Aggregation): """ Calculate the quintiles of a column based on its percentiles. Quintiles will be equivalent to the 20th, 40th, 60th and 80th percentiles. "Zeroth" (min value) and "Fifth" (max value) quintiles are included for reference and intuitive indexing. See :class:`Percentiles` for implementation details. This aggregation can not be applied to a :class:`.TableSet`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Quintiles can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): """ :returns: An instance of :class:`Quantiles`. 
""" percentiles = Percentiles(self._column_name).run(table) return Quantiles([percentiles[i] for i in range(0, 101, 20)]) agate-1.9.1/agate/aggregations/stdev.py000066400000000000000000000043211454111471500200400ustar00rootroot00000000000000from agate.aggregations import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.variance import PopulationVariance, Variance from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class StDev(Aggregation): """ Calculate the sample standard of deviation of a column. For the population standard of deviation see :class:`.PopulationStDev`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name self._variance = Variance(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('StDev can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): variance = self._variance.run(table) if variance is not None: return variance.sqrt() class PopulationStDev(StDev): """ Calculate the population standard of deviation of a column. For the sample standard of deviation see :class:`.StDev`. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name self._population_variance = PopulationVariance(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('PopulationStDev can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): variance = self._population_variance.run(table) if variance is not None: return variance.sqrt() agate-1.9.1/agate/aggregations/sum.py000066400000000000000000000021101454111471500175110ustar00rootroot00000000000000import datetime from agate.aggregations.base import Aggregation from agate.data_types import Number, TimeDelta from agate.exceptions import DataTypeError class Sum(Aggregation): """ Calculate the sum of a column. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name def get_aggregate_data_type(self, table): column = table.columns[self._column_name] if isinstance(column.data_type, (Number, TimeDelta)): return column.data_type def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, (Number, TimeDelta)): raise DataTypeError('Sum can only be applied to columns containing Number or TimeDelta data.') def run(self, table): column = table.columns[self._column_name] start = 0 if isinstance(column.data_type, TimeDelta): start = datetime.timedelta() return sum(column.values_without_nulls(), start) agate-1.9.1/agate/aggregations/summary.py000066400000000000000000000020051454111471500204050ustar00rootroot00000000000000from agate.aggregations.base import Aggregation class Summary(Aggregation): """ Apply an arbitrary function to a column. :param column_name: The name of a column to be summarized. 
:param data_type: The return type of this aggregation. :param func: A function which will be passed the column for processing. :param cast: If :code:`True`, each return value will be cast to the specified :code:`data_type` to ensure it is valid. Only disable this if you are certain your summary always returns the correct type. """ def __init__(self, column_name, data_type, func, cast=True): self._column_name = column_name self._data_type = data_type self._func = func self._cast = cast def get_aggregate_data_type(self, table): return self._data_type def run(self, table): v = self._func(table.columns[self._column_name]) if self._cast: v = self._data_type.cast(v) return v agate-1.9.1/agate/aggregations/variance.py000066400000000000000000000045071454111471500205110ustar00rootroot00000000000000from agate.aggregations.base import Aggregation from agate.aggregations.has_nulls import HasNulls from agate.aggregations.mean import Mean from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Variance(Aggregation): """ Calculate the sample variance of a column. For the population variance see :class:`.PopulationVariance`. :param column_name: The name of a column containing :class:`.Number` data. 
""" def __init__(self, column_name): self._column_name = column_name self._mean = Mean(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Variance can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: mean = self._mean.run(table) return sum((n - mean) ** 2 for n in data) / (len(data) - 1) class PopulationVariance(Variance): """ Calculate the population variance of a column. For the sample variance see :class:`.Variance`. :param column_name: The name of a column containing :class:`.Number` data. """ def __init__(self, column_name): self._column_name = column_name self._mean = Mean(column_name) def get_aggregate_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('PopulationVariance can only be applied to columns containing Number data.') has_nulls = HasNulls(self._column_name).run(table) if has_nulls: warn_null_calculation(self, column) def run(self, table): column = table.columns[self._column_name] data = column.values_without_nulls() if data: mean = self._mean.run(table) return sum((n - mean) ** 2 for n in data) / len(data) agate-1.9.1/agate/columns.py000066400000000000000000000070411454111471500157230ustar00rootroot00000000000000""" This module contains the :class:`Column` class, which defines a "vertical" array of tabular data. Whereas :class:`.Row` instances are independent of their parent :class:`.Table`, columns depend on knowledge of both their position in the parent (column name, data type) as well as the rows that contain their data. 
""" from agate.mapped_sequence import MappedSequence from agate.utils import NullOrder, memoize def null_handler(k): """ Key method for sorting nulls correctly. """ if k is None: return NullOrder() return k class Column(MappedSequence): """ Proxy access to column data. Instances of :class:`Column` should not be constructed directly. They are created by :class:`.Table` instances and are unique to them. Columns are implemented as subclass of :class:`.MappedSequence`. They deviate from the underlying implementation in that loading of their data is deferred until it is needed. :param name: The name of this column. :param data_type: An instance of :class:`.DataType`. :param rows: A :class:`.MappedSequence` that contains the :class:`.Row` instances containing the data for this column. :param row_names: An optional list of row names (keys) for this column. """ __slots__ = ['_index', '_name', '_data_type', '_rows', '_row_names'] def __init__(self, index, name, data_type, rows, row_names=None): self._index = index self._name = name self._data_type = data_type self._rows = rows self._keys = row_names def __getstate__(self): """ Return state values to be pickled. This is necessary on Python2.7 when using :code:`__slots__`. """ return { '_index': self._index, '_name': self._name, '_data_type': self._data_type, '_rows': self._rows, '_keys': self._keys } def __setstate__(self, data): """ Restore pickled state. This is necessary on Python2.7 when using :code:`__slots__`. """ self._index = data['_index'] self._name = data['_name'] self._data_type = data['_data_type'] self._rows = data['_rows'] self._keys = data['_keys'] @property def index(self): """ This column's index. """ return self._index @property def name(self): """ This column's name. """ return self._name @property def data_type(self): """ This column's data type. """ return self._data_type @memoize def values(self): """ Get the values in this column, as a tuple. 
""" return tuple(row[self._index] for row in self._rows) @memoize def values_distinct(self): """ Get the distinct values in this column, as a tuple. """ return tuple(set(self.values())) @memoize def values_without_nulls(self): """ Get the values in this column with any null values removed. """ return tuple(d for d in self.values() if d is not None) @memoize def values_sorted(self): """ Get the values in this column sorted. """ return sorted(self.values(), key=null_handler) @memoize def values_without_nulls_sorted(self): """ Get the values in this column with any null values removed and sorted. """ return sorted(self.values_without_nulls(), key=null_handler) agate-1.9.1/agate/computations/000077500000000000000000000000001454111471500164145ustar00rootroot00000000000000agate-1.9.1/agate/computations/__init__.py000066400000000000000000000020251454111471500205240ustar00rootroot00000000000000""" Computations create a new value for each :class:`.Row` in a :class:`.Table`. When used with :meth:`.Table.compute` these new values become a new column. For instance, the :class:`.PercentChange` computation takes two column names as arguments and computes the percentage change between them for each row. There are a variety of basic computations, such as :class:`.Change` and :class:`.Percent`. If none of these meet your needs you can use the :class:`Formula` computation to apply an arbitrary function to the row. If this still isn't flexible enough, it's simple to create a custom computation class by inheriting from :class:`Computation`. 
""" from agate.computations.base import Computation from agate.computations.change import Change from agate.computations.formula import Formula from agate.computations.percent import Percent from agate.computations.percent_change import PercentChange from agate.computations.percentile_rank import PercentileRank from agate.computations.rank import Rank from agate.computations.slug import Slug agate-1.9.1/agate/computations/base.py000066400000000000000000000024251454111471500177030ustar00rootroot00000000000000class Computation: # pragma: no cover """ Computations produce a new column by performing a calculation on each row. Computations are applied with :class:`.TableSet.compute`. When implementing a custom computation, ensure that the values returned by :meth:`.Computation.run` are of the type specified by :meth:`.Computation.get_computed_data_type`. This can be ensured by using the :meth:`.DataType.cast` method. See :class:`.Formula` for an example. """ def __str__(self): """ String representation of this column. May be used as a column name in generated tables. """ return self.__class__.__name__ def get_computed_data_type(self, table): """ Returns an instantiated :class:`.DataType` which will be appended to the table. """ raise NotImplementedError() def validate(self, table): """ Perform any checks necessary to verify this computation can run on the provided table without errors. This is called by :meth:`.Table.compute` before :meth:`run`. """ pass def run(self, table): """ When invoked with a table, returns a sequence of new column values. 
""" raise NotImplementedError() agate-1.9.1/agate/computations/change.py000066400000000000000000000046641454111471500202250ustar00rootroot00000000000000from agate.aggregations.has_nulls import HasNulls from agate.computations.base import Computation from agate.data_types import Date, DateTime, Number, TimeDelta from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Change(Computation): """ Calculate the difference between two columns. This calculation can be applied to :class:`.Number` columns to calculate numbers. It can also be applied to :class:`.Date`, :class:`.DateTime`, and :class:`.TimeDelta` columns to calculate time deltas. :param before_column_name: The name of a column containing the "before" values. :param after_column_name: The name of a column containing the "after" values. """ def __init__(self, before_column_name, after_column_name): self._before_column_name = before_column_name self._after_column_name = after_column_name def get_computed_data_type(self, table): before_column = table.columns[self._before_column_name] if isinstance(before_column.data_type, (Date, DateTime, TimeDelta)): return TimeDelta() if isinstance(before_column.data_type, Number): return Number() def validate(self, table): before_column = table.columns[self._before_column_name] after_column = table.columns[self._after_column_name] for data_type in (Number, Date, DateTime, TimeDelta): if isinstance(before_column.data_type, data_type): if not isinstance(after_column.data_type, data_type): raise DataTypeError('Specified columns must be of the same type') if HasNulls(self._before_column_name).run(table): warn_null_calculation(self, before_column) if HasNulls(self._after_column_name).run(table): warn_null_calculation(self, after_column) return raise DataTypeError('Change before and after columns must both contain data that is one of: ' 'Number, Date, DateTime or TimeDelta.') def run(self, table): new_column = [] for row in table.rows: before = 
row[self._before_column_name] after = row[self._after_column_name] if before is not None and after is not None: new_column.append(after - before) else: new_column.append(None) return new_column agate-1.9.1/agate/computations/formula.py from agate.computations.base import Computation class Formula(Computation): """ Apply an arbitrary function to each row. :param data_type: The data type this formula will return. :param func: The function to be applied to each row. Must return a valid value for the specified data type. :param cast: If :code:`True`, each return value will be cast to the specified :code:`data_type` to ensure it is valid. Only disable this if you are certain your formula always returns the correct type. """ def __init__(self, data_type, func, cast=True): self._data_type = data_type self._func = func self._cast = cast def get_computed_data_type(self, table): return self._data_type def run(self, table): new_column = [] for row in table.rows: v = self._func(row) if self._cast: v = self._data_type.cast(v) new_column.append(v) return new_column agate-1.9.1/agate/computations/percent.py from agate.aggregations.has_nulls import HasNulls from agate.aggregations.sum import Sum from agate.computations.base import Computation from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class Percent(Computation): """ Calculate each value's percentage of a total. :param column_name: The name of a column containing the :class:`.Number` values. :param total: If specified, the total value for each number to be divided into. By default, the :class:`.Sum` of the values in the column will be used.
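Percent, described above, divides each value by a supplied or computed total, multiplies by 100, and passes nulls through. A standalone sketch of that calculation on a plain list (the function name and data are hypothetical; agate's version works on Decimal columns and emits a null warning):

```python
def percent_of_total(values, total=None):
    # Each value as a percentage of the column total, as the Percent
    # computation does. None values pass through unchanged.
    if total is None:
        total = sum(v for v in values if v is not None)
    if total <= 0:
        # Mirrors Percent's guard against non-positive denominators.
        raise ValueError('The sum of column values must be a positive number')
    return [None if v is None else v / total * 100 for v in values]
```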
""" def __init__(self, column_name, total=None): self._column_name = column_name self._total = total def get_computed_data_type(self, table): return Number() def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('Percent column must contain Number data.') if self._total is not None and self._total <= 0: raise DataTypeError('The total must be a positive number') # Throw a warning if there are nulls in there if HasNulls(self._column_name).run(table): warn_null_calculation(self, column) def run(self, table): """ :returns: :class:`decimal.Decimal` """ # If the user has provided a total, use that if self._total is not None: total = self._total # Otherwise compute the sum of all the values in that column to # act as our denominator else: total = table.aggregate(Sum(self._column_name)) # Raise error if sum is less than or equal to zero if total <= 0: raise DataTypeError('The sum of column values must be a positive number') # Create a list new rows new_column = [] # Loop through the existing rows for row in table.rows: # Pull the value value = row[self._column_name] if value is None: new_column.append(None) continue # Try to divide it out of the total percent = value / total # And multiply it by 100 percent = percent * 100 # Append the value to the new list new_column.append(percent) # Pass out the list return new_column agate-1.9.1/agate/computations/percent_change.py000066400000000000000000000036271454111471500217430ustar00rootroot00000000000000from agate.aggregations.has_nulls import HasNulls from agate.computations.base import Computation from agate.data_types import Number from agate.exceptions import DataTypeError from agate.warns import warn_null_calculation class PercentChange(Computation): """ Calculate the percent difference between two columns. :param before_column_name: The name of a column containing the "before" :class:`.Number` values. 
:param after_column_name: The name of a column containing the "after" :class:`.Number` values. """ def __init__(self, before_column_name, after_column_name): self._before_column_name = before_column_name self._after_column_name = after_column_name def get_computed_data_type(self, table): return Number() def validate(self, table): before_column = table.columns[self._before_column_name] after_column = table.columns[self._after_column_name] if not isinstance(before_column.data_type, Number): raise DataTypeError('PercentChange before column must contain Number data.') if not isinstance(after_column.data_type, Number): raise DataTypeError('PercentChange after column must contain Number data.') if HasNulls(self._before_column_name).run(table): warn_null_calculation(self, before_column) if HasNulls(self._after_column_name).run(table): warn_null_calculation(self, after_column) def run(self, table): """ :returns: :class:`decimal.Decimal` """ new_column = [] for row in table.rows: before = row[self._before_column_name] after = row[self._after_column_name] if before is not None and after is not None: new_column.append((after - before) / before * 100) else: new_column.append(None) return new_column agate-1.9.1/agate/computations/percentile_rank.py000066400000000000000000000017061454111471500221370ustar00rootroot00000000000000from agate.aggregations.percentiles import Percentiles from agate.computations.rank import Rank from agate.data_types import Number from agate.exceptions import DataTypeError class PercentileRank(Rank): """ Calculate the percentile into which each value falls. See :class:`.Percentiles` for implementation details. :param column_name: The name of a column containing the :class:`.Number` values. 
""" def validate(self, table): column = table.columns[self._column_name] if not isinstance(column.data_type, Number): raise DataTypeError('PercentileRank column must contain Number data.') def run(self, table): """ :returns: :class:`int` """ percentiles = Percentiles(self._column_name).run(table) new_column = [] for row in table.rows: new_column.append(percentiles.locate(row[self._column_name])) return new_column agate-1.9.1/agate/computations/rank.py000066400000000000000000000031501454111471500177200ustar00rootroot00000000000000from decimal import Decimal from functools import cmp_to_key from agate.computations.base import Computation from agate.data_types import Number class Rank(Computation): """ Calculate rank order of the values in a column. Uses the "competition" ranking method: if there are four values and the middle two are tied, then the output will be `[1, 2, 2, 4]`. Null values will always be ranked last. :param column_name: The name of the column to rank. :param comparer: An optional comparison function. If not specified ranking will be ascending, with nulls ranked last. :param reverse: Reverse sort order before ranking. 
""" def __init__(self, column_name, comparer=None, reverse=None): self._column_name = column_name self._comparer = comparer self._reverse = reverse def get_computed_data_type(self, table): return Number() def run(self, table): """ :returns: :class:`int` """ column = table.columns[self._column_name] if self._comparer: data_sorted = sorted(column.values(), key=cmp_to_key(self._comparer)) else: data_sorted = column.values_sorted() if self._reverse: data_sorted.reverse() ranks = {} rank = 0 for c in data_sorted: rank += 1 if c in ranks: continue ranks[c] = Decimal(rank) new_column = [] for row in table.rows: new_column.append(ranks[row[self._column_name]]) return new_column agate-1.9.1/agate/computations/slug.py000066400000000000000000000040131454111471500177360ustar00rootroot00000000000000from agate.aggregations.has_nulls import HasNulls from agate.computations.base import Computation from agate.data_types import Text from agate.exceptions import DataTypeError from agate.utils import issequence, slugify class Slug(Computation): """ Convert text values from one or more columns into slugs. If multiple column names are given, values from those columns will be appended in the given order before standardizing. :param column_name: The name of a column or a sequence of column names containing :class:`.Text` values. :param ensure_unique: If True, any duplicate values will be appended with unique identifers. Defaults to False. 
""" def __init__(self, column_name, ensure_unique=False, **kwargs): self._column_name = column_name self._ensure_unique = ensure_unique self._slug_args = kwargs def get_computed_data_type(self, table): return Text() def validate(self, table): if issequence(self._column_name): column_names = self._column_name else: column_names = [self._column_name] for column_name in column_names: column = table.columns[column_name] if not isinstance(column.data_type, Text): raise DataTypeError('Slug column must contain Text data.') if HasNulls(column_name).run(table): raise ValueError('Slug column cannot contain `None`.') def run(self, table): """ :returns: :class:`string` """ new_column = [] for row in table.rows: if issequence(self._column_name): column_value = '' for column_name in self._column_name: column_value = column_value + ' ' + row[column_name] new_column.append(column_value) else: new_column.append(row[self._column_name]) return slugify(new_column, ensure_unique=self._ensure_unique, **self._slug_args) agate-1.9.1/agate/config.py000066400000000000000000000107711454111471500155140ustar00rootroot00000000000000""" This module contains the global configuration for agate. Users should use :meth:`get_option` and :meth:`set_option` to modify the global configuration. 
**Available configuration options:**

+--------------------------+------------------------------------------+-----------------------------------------+
| Option                   | Description                              | Default value                           |
+==========================+==========================================+=========================================+
| default_locale           | Default locale for number formatting     | default_locale('LC_NUMERIC') or 'en_US' |
+--------------------------+------------------------------------------+-----------------------------------------+
| horizontal_line_char     | Character to render for horizontal lines | '-'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| vertical_line_char       | Character to render for vertical lines   | '|'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| bar_char                 | Character to render for bar chart units  | '░'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| printable_bar_char       | Printable character for bar chart units  | ':'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| zero_line_char           | Character to render for zero line units  | '▓'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| printable_zero_line_char | Printable character for zero line units  | '|'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| tick_char                | Character to render for axis ticks       | '+'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
| ellipsis_chars           | Characters to render for ellipsis        | '...'                                   |
+--------------------------+------------------------------------------+-----------------------------------------+
| text_truncation_chars    | Characters for truncated text values     | '...'                                   |
+--------------------------+------------------------------------------+-----------------------------------------+
| number_truncation_chars  | Characters for truncated number values   | '…'                                     |
+--------------------------+------------------------------------------+-----------------------------------------+
"""
from babel.core import default_locale

_options = {
    #: Default locale for number formatting
    'default_locale': default_locale('LC_NUMERIC') or 'en_US',
    #: Character to render for horizontal lines
    'horizontal_line_char': '-',
    #: Character to render for vertical lines
    'vertical_line_char': '|',
    #: Character to render for bar chart units
    'bar_char': '░',
    #: Printable character to render for bar chart units
    'printable_bar_char': ':',
    #: Character to render for zero line units
    'zero_line_char': '▓',
    #: Printable character to render for zero line units
    'printable_zero_line_char': '|',
    #: Character to render for axis ticks
    'tick_char': '+',
    #: Characters to render for ellipsis
    'ellipsis_chars': '...',
    #: Characters for truncated text values
    'text_truncation_chars': '...',
    #: Characters for truncated number values
    'number_truncation_chars': '…',
}


def get_option(key):
    """
    Get a global configuration option for agate.

    :param key:
        The name of the configuration option.
    """
    return _options[key]


def set_option(key, value):
    """
    Set a global configuration option for agate.

    :param key:
        The name of the configuration option.
    :param value:
        The new value to set for the configuration option.
    """
    _options[key] = value


def set_options(options):
    """
    Set a dictionary of options simultaneously.

    :param options:
        A dictionary of option names and values.
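    The getter/setter pattern used by this module can be exercised like so.
    This sketch uses a local copy of the options dict for illustration; in
    agate itself you would call ``agate.get_option`` / ``agate.set_option``::

```python
# A local stand-in for this module's private _options dict.
_options = {'bar_char': '░', 'tick_char': '+'}

def get_option(key):
    return _options[key]

def set_option(key, value):
    _options[key] = value

# Swap the bar character for one that renders in any terminal.
set_option('bar_char', '*')
```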
""" _options.update(options) agate-1.9.1/agate/csv_py3.py000066400000000000000000000107271454111471500156360ustar00rootroot00000000000000""" This module contains the Python 3 replacement for :mod:`csv`. """ import csv import warnings from agate.exceptions import FieldSizeLimitError POSSIBLE_DELIMITERS = [',', '\t', ';', ' ', ':', '|'] class Reader: """ A wrapper around Python 3's builtin :func:`csv.reader`. """ def __init__(self, f, field_size_limit=None, line_numbers=False, header=True, **kwargs): self.line_numbers = line_numbers self.header = header if field_size_limit: csv.field_size_limit(field_size_limit) self.reader = csv.reader(f, **kwargs) def __iter__(self): return self def __next__(self): try: row = next(self.reader) except csv.Error as e: # Terrible way to test for this exception, but there is no subclass if 'field larger than field limit' in str(e): raise FieldSizeLimitError(csv.field_size_limit(), self.line_num) else: raise e if not self.line_numbers: return row if self.line_numbers: if self.header and self.line_num == 1: row.insert(0, 'line_numbers') else: row.insert(0, str(self.line_num - 1 if self.header else self.line_num)) return row @property def dialect(self): return self.reader.dialect @property def line_num(self): return self.reader.line_num class Writer: """ A wrapper around Python 3's builtin :func:`csv.writer`. 
""" def __init__(self, f, line_numbers=False, **kwargs): self.row_count = 0 self.line_numbers = line_numbers if 'lineterminator' not in kwargs: kwargs['lineterminator'] = '\n' self.writer = csv.writer(f, **kwargs) def _append_line_number(self, row): if self.row_count == 0: row.insert(0, 'line_number') else: row.insert(0, self.row_count) self.row_count += 1 def writerow(self, row): if self.line_numbers: row = list(row) self._append_line_number(row) # Convert embedded Mac line endings to unix style line endings so they get quoted row = [i.replace('\r', '\n') if isinstance(i, str) else i for i in row] self.writer.writerow(row) def writerows(self, rows): for row in rows: self.writerow(row) class DictReader(csv.DictReader): """ A wrapper around Python 3's builtin :class:`csv.DictReader`. """ pass class DictWriter(csv.DictWriter): """ A wrapper around Python 3's builtin :class:`csv.DictWriter`. """ def __init__(self, f, fieldnames, line_numbers=False, **kwargs): self.row_count = 0 self.line_numbers = line_numbers if 'lineterminator' not in kwargs: kwargs['lineterminator'] = '\n' if self.line_numbers: fieldnames.insert(0, 'line_number') csv.DictWriter.__init__(self, f, fieldnames, **kwargs) def _append_line_number(self, row): if self.row_count == 0: row['line_number'] = 'line_number' else: row['line_number'] = self.row_count self.row_count += 1 def writerow(self, row): # Convert embedded Mac line endings to unix style line endings so they get quoted row = dict([(k, v.replace('\r', '\n')) if isinstance(v, str) else (k, v) for k, v in row.items()]) if self.line_numbers: self._append_line_number(row) csv.DictWriter.writerow(self, row) def writerows(self, rows): for row in rows: self.writerow(row) class Sniffer: """ A functional wrapper of ``csv.Sniffer()``. """ def sniff(self, sample): """ A functional version of ``csv.Sniffer().sniff``, that extends the list of possible delimiters to include some seen in the wild. 
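        The same behavior can be reproduced with the standard library by
        passing an explicit set of candidate delimiters, as this wrapper does.
        The pipe-delimited sample below is hypothetical::

```python
import csv

# '|' is one of the extra delimiters agate adds to the stock candidates.
sample = 'name|city\nalice|berlin\nbob|lisbon\n'

# Restricting the sniffer to a known candidate set makes detection
# much more reliable than letting it guess freely.
dialect = csv.Sniffer().sniff(sample, '|,;')
```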
""" try: dialect = csv.Sniffer().sniff(sample, POSSIBLE_DELIMITERS) except csv.Error as e: warnings.warn('Error sniffing CSV dialect: %s' % e, RuntimeWarning, stacklevel=2) dialect = None return dialect def reader(*args, **kwargs): """ A replacement for Python's :func:`csv.reader` that uses :class:`.csv_py3.Reader`. """ return Reader(*args, **kwargs) def writer(*args, **kwargs): """ A replacement for Python's :func:`csv.writer` that uses :class:`.csv_py3.Writer`. """ return Writer(*args, **kwargs) agate-1.9.1/agate/data_types/000077500000000000000000000000001454111471500160245ustar00rootroot00000000000000agate-1.9.1/agate/data_types/__init__.py000066400000000000000000000012731454111471500201400ustar00rootroot00000000000000""" Data types define how data should be imported during the creation of a :class:`.Table`. If column types are not explicitly specified when a :class:`.Table` is created, agate will attempt to guess them. The :class:`.TypeTester` class can be used to control how types are guessed. """ from agate.data_types.base import DEFAULT_NULL_VALUES, DataType from agate.data_types.boolean import DEFAULT_FALSE_VALUES, DEFAULT_TRUE_VALUES, Boolean from agate.data_types.date import Date from agate.data_types.date_time import DateTime from agate.data_types.number import Number from agate.data_types.text import Text from agate.data_types.time_delta import TimeDelta from agate.exceptions import CastError agate-1.9.1/agate/data_types/base.py000066400000000000000000000025551454111471500173170ustar00rootroot00000000000000from agate.exceptions import CastError #: Default values which will be automatically cast to :code:`None` DEFAULT_NULL_VALUES = ('', 'na', 'n/a', 'none', 'null', '.') class DataType: # pragma: no cover """ Specifies how values should be parsed when creating a :class:`.Table`. :param null_values: A sequence of values which should be cast to :code:`None` when encountered by this data type. 
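    The null-handling contract described above is a case-insensitive
    membership test against the sentinel values. A minimal stdlib sketch
    (the constant mirrors ``DEFAULT_NULL_VALUES`` from this module; the
    function name is invented for illustration)::

```python
DEFAULT_NULL_VALUES = ('', 'na', 'n/a', 'none', 'null', '.')

def cast_text(d, null_values=DEFAULT_NULL_VALUES):
    # Strings matching a null sentinel (compared lowercase, stripped)
    # become None; everything else passes through unchanged.
    if isinstance(d, str) and d.strip().lower() in null_values:
        return None
    return d
```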
""" def __init__(self, null_values=DEFAULT_NULL_VALUES): self.null_values = [v.lower() for v in null_values] def test(self, d): """ Test, for purposes of type inference, if a value could possibly be coerced to this data type. This is really just a thin wrapper around :meth:`DataType.cast`. """ try: self.cast(d) except CastError: return False return True def cast(self, d): """ Coerce a given string value into this column's data type. """ raise NotImplementedError def csvify(self, d): """ Format a given native value for CSV serialization. """ if d is None: return None return str(d) def jsonify(self, d): """ Format a given native value for JSON serialization. """ if d is None: return None return str(d) agate-1.9.1/agate/data_types/boolean.py000066400000000000000000000036771454111471500200320ustar00rootroot00000000000000from decimal import Decimal from agate.data_types.base import DEFAULT_NULL_VALUES, DataType from agate.exceptions import CastError #: Default values which will be automatically cast to :code:`True`. DEFAULT_TRUE_VALUES = ('yes', 'y', 'true', 't', '1') #: Default values which will be automatically cast to :code:`False`. DEFAULT_FALSE_VALUES = ('no', 'n', 'false', 'f', '0') class Boolean(DataType): """ Data representing true and false. Note that by default numerical `1` and `0` are considered valid boolean values, but other numbers are not. :param true_values: A sequence of values which should be cast to :code:`True` when encountered with this type. :param false_values: A sequence of values which should be cast to :code:`False` when encountered with this type. """ def __init__(self, true_values=DEFAULT_TRUE_VALUES, false_values=DEFAULT_FALSE_VALUES, null_values=DEFAULT_NULL_VALUES): super().__init__(null_values=null_values) self.true_values = true_values self.false_values = false_values def cast(self, d): """ Cast a single value to :class:`bool`. :param d: A value to cast. :returns: :class:`bool` or :code:`None`. 
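        A condensed stdlib sketch of the casting rules described above
        (illustrative only; the real method below also handles
        :class:`decimal.Decimal` inputs and comma stripping more carefully)::

```python
TRUE_VALUES = ('yes', 'y', 'true', 't', '1')
FALSE_VALUES = ('no', 'n', 'false', 'f', '0')
NULL_VALUES = ('', 'na', 'n/a', 'none', 'null', '.')

def cast_bool(d):
    # Booleans and None pass through; numeric 1/0 are valid booleans;
    # strings are matched case-insensitively against the sentinel tuples.
    if isinstance(d, bool) or d is None:
        return d
    if d == 1:
        return True
    if d == 0:
        return False
    s = str(d).replace(',', '').strip().lower()
    if s in NULL_VALUES:
        return None
    if s in TRUE_VALUES:
        return True
    if s in FALSE_VALUES:
        return False
    raise ValueError('Can not convert value %s to bool.' % d)
```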
""" if d is None: return d if type(d) is bool and type(d) is not int: return d if type(d) is int or isinstance(d, Decimal): if d == 1: return True if d == 0: return False if isinstance(d, str): d = d.replace(',', '').strip() d_lower = d.lower() if d_lower in self.null_values: return None if d_lower in self.true_values: return True if d_lower in self.false_values: return False raise CastError('Can not convert value %s to bool.' % d) def jsonify(self, d): return d agate-1.9.1/agate/data_types/date.py000066400000000000000000000065711454111471500173240ustar00rootroot00000000000000import locale from datetime import date, datetime, time import parsedatetime from agate.data_types.base import DataType from agate.exceptions import CastError ZERO_DT = datetime.combine(date.min, time.min) class Date(DataType): """ Data representing dates alone. :param date_format: A formatting string for :meth:`datetime.datetime.strptime` to use instead of using regex-based parsing. :param locale: A locale specification such as :code:`en_US` or :code:`de_DE` to use for parsing formatted dates. """ def __init__(self, date_format=None, locale=None, **kwargs): super().__init__(**kwargs) self.date_format = date_format self.locale = locale self._constants = parsedatetime.Constants(localeID=self.locale) self._parser = parsedatetime.Calendar(constants=self._constants, version=parsedatetime.VERSION_CONTEXT_STYLE) def __getstate__(self): """ Return state values to be pickled. Exclude _constants and _parser because parsedatetime cannot be pickled. """ odict = self.__dict__.copy() del odict['_constants'] del odict['_parser'] return odict def __setstate__(self, ndict): """ Restore state from the unpickled state values. Set _constants to an instance of the parsedatetime Constants class, and _parser to an instance of the parsedatetime Calendar class. 
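        The same exclude-and-rebuild pattern applies to any object holding an
        unpicklable member. A self-contained sketch (``LocaleParser`` is a
        hypothetical stand-in; a :class:`threading.Lock` plays the role of the
        unpicklable parsedatetime ``Calendar``)::

```python
import pickle
import threading

class LocaleParser:
    def __init__(self, locale_id='en_US'):
        self.locale = locale_id
        self._lock = threading.Lock()  # locks cannot be pickled

    def __getstate__(self):
        # Drop the unpicklable member before pickling.
        odict = self.__dict__.copy()
        del odict['_lock']
        return odict

    def __setstate__(self, ndict):
        # Restore plain state, then rebuild the excluded member.
        self.__dict__.update(ndict)
        self._lock = threading.Lock()

restored = pickle.loads(pickle.dumps(LocaleParser('de_DE')))
```

        Without the two methods, ``pickle.dumps`` would raise on the lock.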
""" self.__dict__.update(ndict) self._constants = parsedatetime.Constants(localeID=self.locale) self._parser = parsedatetime.Calendar(constants=self._constants, version=parsedatetime.VERSION_CONTEXT_STYLE) def cast(self, d): """ Cast a single value to a :class:`datetime.date`. If both `date_format` and `locale` have been specified in the `agate.Date` instance, the `cast()` function is not thread-safe. :returns: :class:`datetime.date` or :code:`None`. """ if type(d) is date or d is None: return d if isinstance(d, str): d = d.strip() if d.lower() in self.null_values: return None else: raise CastError('Can not parse value "%s" as date.' % d) if self.date_format: orig_locale = None if self.locale: orig_locale = locale.getlocale(locale.LC_TIME) locale.setlocale(locale.LC_TIME, (self.locale, 'UTF-8')) try: dt = datetime.strptime(d, self.date_format) except (ValueError, TypeError): raise CastError('Value "%s" does not match date format.' % d) finally: if orig_locale: locale.setlocale(locale.LC_TIME, orig_locale) return dt.date() try: (value, ctx, _, _, matched_text), = self._parser.nlp(d, sourceTime=ZERO_DT) except (TypeError, ValueError, OverflowError): raise CastError('Value "%s" does not match date format.' % d) else: if matched_text == d and ctx.hasDate and not ctx.hasTime: return value.date() raise CastError('Can not parse value "%s" as date.' % d) def csvify(self, d): if d is None: return None return d.isoformat() def jsonify(self, d): return self.csvify(d) agate-1.9.1/agate/data_types/date_time.py000066400000000000000000000101611454111471500203300ustar00rootroot00000000000000import datetime import locale import isodate import parsedatetime from agate.data_types.base import DataType from agate.exceptions import CastError class DateTime(DataType): """ Data representing dates with times. :param datetime_format: A formatting string for :meth:`datetime.datetime.strptime` to use instead of using regex-based parsing. 
:param timezone: A ``ZoneInfo`` timezone to apply to each parsed date. :param locale: A locale specification such as :code:`en_US` or :code:`de_DE` to use for parsing formatted datetimes. """ def __init__(self, datetime_format=None, timezone=None, locale=None, **kwargs): super().__init__(**kwargs) self.datetime_format = datetime_format self.timezone = timezone self.locale = locale now = datetime.datetime.now() self._source_time = datetime.datetime( now.year, now.month, now.day, 0, 0, 0, 0, None ) self._constants = parsedatetime.Constants(localeID=self.locale) self._parser = parsedatetime.Calendar(constants=self._constants, version=parsedatetime.VERSION_CONTEXT_STYLE) def __getstate__(self): """ Return state values to be pickled. Exclude _parser because parsedatetime cannot be pickled. """ odict = self.__dict__.copy() del odict['_constants'] del odict['_parser'] return odict def __setstate__(self, ndict): """ Restore state from the unpickled state values. Set _constants to an instance of the parsedatetime Constants class, and _parser to an instance of the parsedatetime Calendar class. """ self.__dict__.update(ndict) self._constants = parsedatetime.Constants(localeID=self.locale) self._parser = parsedatetime.Calendar(constants=self._constants, version=parsedatetime.VERSION_CONTEXT_STYLE) def cast(self, d): """ Cast a single value to a :class:`datetime.datetime`. If both `date_format` and `locale` have been specified in the `agate.DateTime` instance, the `cast()` function is not thread-safe. :returns: :class:`datetime.datetime` or :code:`None`. """ if isinstance(d, datetime.datetime) or d is None: return d if isinstance(d, datetime.date): return datetime.datetime.combine(d, datetime.time(0, 0, 0)) if isinstance(d, str): d = d.strip() if d.lower() in self.null_values: return None else: raise CastError('Can not parse value "%s" as datetime.' 
                                % d)

        if self.datetime_format:
            orig_locale = None

            if self.locale:
                orig_locale = locale.getlocale(locale.LC_TIME)
                locale.setlocale(locale.LC_TIME, (self.locale, 'UTF-8'))

            try:
                dt = datetime.datetime.strptime(d, self.datetime_format)
            except (ValueError, TypeError):
                raise CastError('Value "%s" does not match date format.' % d)
            finally:
                if orig_locale:
                    locale.setlocale(locale.LC_TIME, orig_locale)

            return dt

        try:
            (_, _, _, _, matched_text), = self._parser.nlp(d, sourceTime=self._source_time)
        except Exception:
            matched_text = None
        else:
            value, ctx = self._parser.parseDT(
                d,
                sourceTime=self._source_time,
                tzinfo=self.timezone
            )

            if matched_text == d and ctx.hasDate and ctx.hasTime:
                return value

            if matched_text == d and ctx.hasDate and not ctx.hasTime:
                return datetime.datetime.combine(value.date(), datetime.time.min)

        try:
            dt = isodate.parse_datetime(d)
            return dt
        except Exception:
            pass

        raise CastError('Can not parse value "%s" as datetime.' % d)

    def csvify(self, d):
        if d is None:
            return None

        return d.isoformat()

    def jsonify(self, d):
        return self.csvify(d)


agate-1.9.1/agate/data_types/number.py

import warnings
from decimal import Decimal, InvalidOperation

from babel.core import Locale

from agate.data_types.base import DataType
from agate.exceptions import CastError

#: A list of currency symbols sourced from `Xe `_.
DEFAULT_CURRENCY_SYMBOLS = ['؋', '$', 'ƒ', '៛', '¥', '₡', '₱', '£', '€', '¢', '﷼', '₪', '₩', '₭', '₮', '₦', '฿',
                            '₤', '₫']

POSITIVE = Decimal('1')
NEGATIVE = Decimal('-1')


class Number(DataType):
    """
    Data representing numbers.

    :param locale:
        A locale specification such as :code:`en_US` or :code:`de_DE` to use
        for parsing formatted numbers.
    :param group_symbol:
        A grouping symbol used in the numbers. Overrides the value provided by
        the specified :code:`locale`.
    :param decimal_symbol:
        A decimal separator symbol used in the numbers. Overrides the value
        provided by the specified :code:`locale`.
:param currency_symbols: A sequence of currency symbols to strip from numbers. """ def __init__(self, locale='en_US', group_symbol=None, decimal_symbol=None, currency_symbols=DEFAULT_CURRENCY_SYMBOLS, **kwargs): super().__init__(**kwargs) self.locale = Locale.parse(locale) self.currency_symbols = currency_symbols # Suppress Babel warning on Python 3.6 # See #665 with warnings.catch_warnings(): warnings.simplefilter("ignore") # Babel 2.14 support. # https://babel.pocoo.org/en/latest/changelog.html#possibly-backwards-incompatible-changes number_symbols = self.locale.number_symbols.get('latn', self.locale.number_symbols) self.group_symbol = group_symbol or number_symbols.get('group', ',') self.decimal_symbol = decimal_symbol or number_symbols.get('decimal', '.') def cast(self, d): """ Cast a single value to a :class:`decimal.Decimal`. :returns: :class:`decimal.Decimal` or :code:`None`. """ if isinstance(d, Decimal) or d is None: return d t = type(d) if t is int: return Decimal(d) if t is float: return Decimal(repr(d)) if d is False: return Decimal(0) if d is True: return Decimal(1) if not isinstance(d, str): raise CastError('Can not parse value "%s" as Decimal.' % d) d = d.strip() if d.lower() in self.null_values: return None d = d.strip('%') if len(d) > 0 and d[0] == '-': d = d[1:] sign = NEGATIVE else: sign = POSITIVE for symbol in self.currency_symbols: d = d.strip(symbol) d = d.replace(self.group_symbol, '') d = d.replace(self.decimal_symbol, '.') try: return Decimal(d) * sign # The Decimal class will return an InvalidOperation exception on most Python implementations, # but PyPy3 may return a ValueError if the string is not translatable to ASCII except (InvalidOperation, ValueError): pass raise CastError('Can not parse value "%s" as Decimal.' 
                        % d)

    def jsonify(self, d):
        if d is None:
            return d

        return float(d)


agate-1.9.1/agate/data_types/text.py

from agate.data_types.base import DataType


class Text(DataType):
    """
    Data representing text.

    :param cast_nulls:
        If :code:`True`, values in :data:`.DEFAULT_NULL_VALUES` will be
        converted to `None`. Disable to retain them as strings.
    """
    def __init__(self, cast_nulls=True, **kwargs):
        super().__init__(**kwargs)

        self.cast_nulls = cast_nulls

    def cast(self, d):
        """
        Cast a single value to :func:`unicode` (:func:`str` in Python 3).

        :param d:
            A value to cast.
        :returns:
            :func:`unicode` (:func:`str` in Python 3) or :code:`None`
        """
        if d is None:
            return d

        if isinstance(d, str):
            if self.cast_nulls and d.strip().lower() in self.null_values:
                return None

        return str(d)


agate-1.9.1/agate/data_types/time_delta.py

import datetime

import pytimeparse

from agate.data_types.base import DataType
from agate.exceptions import CastError


class TimeDelta(DataType):
    """
    Data representing the interval between two dates and/or times.
    """
    def cast(self, d):
        """
        Cast a single value to :class:`datetime.timedelta`.

        :param d:
            A value to cast.
        :returns:
            :class:`datetime.timedelta` or :code:`None`
        """
        if isinstance(d, datetime.timedelta) or d is None:
            return d

        if isinstance(d, str):
            d = d.strip()

            if d.lower() in self.null_values:
                return None
        else:
            raise CastError('Can not parse value "%s" as timedelta.' % d)

        try:
            seconds = pytimeparse.parse(d)
        except AttributeError:
            seconds = None

        if seconds is None:
            raise CastError('Can not parse value "%s" as timedelta.' % d)

        return datetime.timedelta(seconds=seconds)


agate-1.9.1/agate/exceptions.py

"""
This module contains various exceptions raised by agate.
""" class DataTypeError(TypeError): # pragma: no cover """ A calculation was attempted with an invalid :class:`.DataType`. """ pass class UnsupportedAggregationError(TypeError): # pragma: no cover """ An :class:`.Aggregation` was attempted which is not supported. For example, if a :class:`.Percentiles` is applied to a :class:`.TableSet`. """ pass class CastError(Exception): # pragma: no cover """ A column value can not be cast to the correct type. """ pass class FieldSizeLimitError(Exception): # pragma: no cover """ A field in a CSV file exceeds the maximum length. This length may be the default or one set by the user. """ def __init__(self, limit, line_number): super().__init__( 'CSV contains a field longer than the maximum length of %i characters on line %i. Try raising the maximum ' 'with the field_size_limit parameter, or try setting quoting=csv.QUOTE_NONE.' % (limit, line_number) ) agate-1.9.1/agate/fixed.py000066400000000000000000000040111454111471500153340ustar00rootroot00000000000000""" This module contains a generic parser for fixed-width files. It operates similar to Python's built-in CSV reader. """ from collections import OrderedDict, namedtuple Field = namedtuple('Field', ['name', 'start', 'length']) class Reader: """ Reads a fixed-width file using a column schema in CSV format. This works almost exactly like Python's built-in CSV reader. Schemas must be in the "ffs" format, with :code:`column`, :code:`start`, and :code:`length` columns. There is a repository of such schemas maintained at `wireservice/ffs `_. 
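    The slicing behavior can be sketched with the standard library alone.
    The schema and data line below are hypothetical, following the "ffs"
    convention described above::

```python
import csv
import io

# A two-column schema in the "column,start,length" format.
schema = io.StringIO('column,start,length\nname,0,6\nage,6,3\n')
fields = [(r['column'], int(r['start']), int(r['length']))
          for r in csv.DictReader(schema)]

def parse_line(line):
    # Slice each field out of the fixed-width line and strip the
    # space padding, as Reader.__next__ does.
    return [line[start:start + length].strip() for _, start, length in fields]
```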
""" def __init__(self, f, schema_f): from agate import csv self.file = f self.fields = [] reader = csv.reader(schema_f) header = next(reader) if header != ['column', 'start', 'length']: raise ValueError('Schema must contain exactly three columns: "column", "start", and "length".') for row in reader: self.fields.append(Field(row[0], int(row[1]), int(row[2]))) def __iter__(self): return self def __next__(self): line = next(self.file) values = [] for field in self.fields: values.append(line[field.start:field.start + field.length].strip()) return values @property def fieldnames(self): """ The names of the columns read from the schema. """ return [field.name for field in self.fields] class DictReader(Reader): """ A fixed-width reader that returns :class:`collections.OrderedDict` rather than a list. """ def __next__(self): line = next(self.file) values = OrderedDict() for field in self.fields: values[field.name] = line[field.start:field.start + field.length].strip() return values def reader(*args, **kwargs): """ A wrapper around :class:`.fixed.Reader`, so that it can be used in the same way as a normal CSV reader. """ return Reader(*args, **kwargs) agate-1.9.1/agate/mapped_sequence.py000066400000000000000000000103171454111471500174010ustar00rootroot00000000000000""" This module contains the :class:`MappedSequence` class that forms the foundation for agate's :class:`.Row` and :class:`.Column` as well as for named sequences of rows and columns. """ from collections import OrderedDict from collections.abc import Sequence from agate.utils import memoize class MappedSequence(Sequence): """ A generic container for immutable data that can be accessed either by numeric index or by key. This is similar to an :class:`collections.OrderedDict` except that the keys are optional and iteration over it returns the values instead of keys. This is the base class for both :class:`.Column` and :class:`.Row`. :param values: A sequence of values. :param keys: A sequence of keys. 
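    The dual index/key access can be sketched as follows. This is a minimal
    illustration, not the real class, which also supports slicing,
    memoization and pickling::

```python
class Mapped:
    def __init__(self, values, keys=None):
        self._values = tuple(values)
        self._keys = tuple(keys) if keys is not None else None

    def __getitem__(self, key):
        # bool is a subclass of int, so a strict type check is used,
        # just as in MappedSequence.__getitem__.
        if type(key) is int:
            return self._values[key]
        return dict(zip(self._keys, self._values))[key]

row = Mapped([1, 2, 3], keys=['a', 'b', 'c'])
```

    ``row[0]`` and ``row['a']`` then both return the first value.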
""" __slots__ = ['_values', '_keys'] def __init__(self, values, keys=None): self._values = tuple(values) if keys is not None: self._keys = keys else: self._keys = None def __getstate__(self): """ Return state values to be pickled. This is necessary on Python2.7 when using :code:`__slots__`. """ return { '_values': self._values, '_keys': self._keys } def __setstate__(self, data): """ Restore pickled state. This is necessary on Python2.7 when using :code:`__slots__`. """ self._values = data['_values'] self._keys = data['_keys'] def __unicode__(self): """ Print a unicode sample of the contents of this sequence. """ sample = ', '.join(repr(d) for d in self.values()[:5]) if len(self) > 5: sample = '%s, ...' % sample return f'' def __str__(self): """ Print an ascii sample of the contents of this sequence. """ return str(self.__unicode__()) def __repr__(self): return self.__str__() def __getitem__(self, key): """ Retrieve values from this array by index, slice or key. """ if isinstance(key, slice): indices = range(*key.indices(len(self))) values = self.values() return tuple(values[i] for i in indices) # Note: can't use isinstance because bool is a subclass of int elif type(key) is int: return self.values()[key] return self.dict()[key] def __setitem__(self, key, value): """ Set values by index, which we want to fail loudly. """ raise TypeError('Rows and columns can not be modified directly. You probably need to compute a new column.') def __iter__(self): """ Iterate over values. """ return iter(self.values()) @memoize def __len__(self): return len(self.values()) def __eq__(self, other): """ Equality test with other sequences. """ if not isinstance(other, Sequence): return False return self.values() == tuple(other) def __ne__(self, other): """ Inequality test with other sequences. """ return not self.__eq__(other) def __contains__(self, value): return self.values().__contains__(value) def keys(self): """ Equivalent to :meth:`collections.OrderedDict.keys`. 
""" return self._keys def values(self): """ Equivalent to :meth:`collections.OrderedDict.values`. """ return self._values @memoize def items(self): """ Equivalent to :meth:`collections.OrderedDict.items`. """ return tuple(zip(self.keys(), self.values())) def get(self, key, default=None): """ Equivalent to :meth:`collections.OrderedDict.get`. """ try: return self.dict()[key] except KeyError: if default: return default return None @memoize def dict(self): """ Retrieve the contents of this sequence as an :class:`collections.OrderedDict`. """ if self.keys() is None: raise KeyError return OrderedDict(self.items()) agate-1.9.1/agate/rows.py000066400000000000000000000013361454111471500152360ustar00rootroot00000000000000""" This module contains agate's :class:`Row` implementation. Rows are independent of both the :class:`.Table` that contains them as well as the :class:`.Columns` that access their data. This independence, combined with rows immutability allows them to be safely shared between table instances. """ from agate.mapped_sequence import MappedSequence class Row(MappedSequence): """ A row of data. Values within a row can be accessed by column name or column index. Row are immutable and may be shared between :class:`.Table` instances. Currently row instances are a no-op subclass of :class:`MappedSequence`. They are being maintained in this fashion in order to support future features. """ pass agate-1.9.1/agate/table/000077500000000000000000000000001454111471500147565ustar00rootroot00000000000000agate-1.9.1/agate/table/__init__.py000066400000000000000000000312561454111471500170760ustar00rootroot00000000000000""" The :class:`.Table` object is the most important class in agate. Tables are created by supplying row data, column names and subclasses of :class:`.DataType` to the constructor. Once created, the data in a table **can not be changed**. This concept is central to agate. 
Instead of modifying the data, various methods can be used to create new, derivative tables. For example, the :meth:`.Table.select` method creates a new table with only the specified columns. The :meth:`.Table.where` method creates a new table with only those rows that pass a test. And :meth:`.Table.order_by` creates a sorted table. In all of these cases the output is a new :class:`.Table` and the existing table remains unmodified. Tables are not themselves iterable, but the columns of the table can be accessed via :attr:`.Table.columns` and the rows via :attr:`.Table.rows`. Both sequences can be accessed either by numeric index or by name. (In the case of rows, row names are optional.) """ import sys import warnings from io import StringIO from itertools import chain from agate import utils from agate.columns import Column from agate.data_types import DataType from agate.exceptions import CastError from agate.mapped_sequence import MappedSequence from agate.rows import Row from agate.type_tester import TypeTester class Table: """ A dataset consisting of rows and columns. Columns refer to "vertical" slices of data that must all be of the same type. Rows refer to "horizontal" slices of data that may (and usually do) contain mixed types. The sequence of :class:`.Column` instances are retrieved via the :attr:`.Table.columns` property. They may be accessed by either numeric index or by unique column name. The sequence of :class:`.Row` instances are retrieved via the :attr:`.Table.rows` property. They may be accessed by either numeric index or, if specified, unique row names. :param rows: The data as a sequence of any sequences: tuples, lists, etc. If any row has fewer values than the number of columns, it will be filled out with nulls. No row may have more values than the number of columns. :param column_names: A sequence of string names for each column or `None`, in which case column names will be automatically assigned using :func:`.letter_name`. 
:param column_types: A sequence of instances of :class:`.DataType` or an instance of :class:`.TypeTester` or `None` in which case a generic TypeTester will be used. Alternatively, a dictionary with column names as keys and instances of :class:`.DataType` as values to specify some types. :param row_names: Specifies unique names for each row. This parameter is optional. If specified it may be 1) the name of a single column that contains a unique identifier for each row, 2) a key function that takes a :class:`.Row` and returns a unique identifier or 3) a sequence of unique identifiers of the same length as the sequence of rows. The uniqueness of resulting identifiers is not validated, so be certain the values you provide are truly unique. :param _is_fork: Used internally to skip certain validation steps when data is propagated from an existing table. When :code:`True`, rows are assumed to be :class:`.Row` instances, rather than raw data. """ def __init__(self, rows, column_names=None, column_types=None, row_names=None, _is_fork=False): if isinstance(rows, str): raise ValueError('When created directly, the first argument to Table must be a sequence of rows. ' 'Did you want agate.Table.from_csv?') # Validate column names if column_names: self._column_names = utils.deduplicate(column_names, column_names=True) elif rows: self._column_names = tuple(utils.letter_name(i) for i in range(len(rows[0]))) warnings.warn('Column names not specified. "%s" will be used as names.' 
% str(self._column_names), RuntimeWarning, stacklevel=2) else: self._column_names = tuple() len_column_names = len(self._column_names) # Validate column_types if column_types is None: column_types = TypeTester() elif isinstance(column_types, dict): for v in column_types.values(): if not isinstance(v, DataType): raise ValueError('Column types must be instances of DataType.') column_types = TypeTester(force=column_types) elif not isinstance(column_types, TypeTester): for column_type in column_types: if not isinstance(column_type, DataType): raise ValueError('Column types must be instances of DataType.') if isinstance(column_types, TypeTester): self._column_types = column_types.run(rows, self._column_names) else: self._column_types = tuple(column_types) if len_column_names != len(self._column_types): raise ValueError('column_names and column_types must be the same length.') if not _is_fork: new_rows = [] cast_funcs = [c.cast for c in self._column_types] for i, row in enumerate(rows): len_row = len(row) if len_row > len_column_names: raise ValueError( 'Row %i has %i values, but Table only has %i columns.' 
                        % (i, len_row, len_column_names)
                    )
                elif len(row) < len_column_names:
                    row = chain(row, [None] * (len_column_names - len_row))

                row_values = []

                for j, d in enumerate(row):
                    try:
                        row_values.append(cast_funcs[j](d))
                    except CastError as e:
                        raise CastError(str(e) + f' Error at row {i} column {self._column_names[j]}.')

                new_rows.append(Row(row_values, self._column_names))
        else:
            new_rows = rows

        if row_names:
            computed_row_names = []

            if isinstance(row_names, str):
                for row in new_rows:
                    name = row[row_names]
                    computed_row_names.append(name)
            elif hasattr(row_names, '__call__'):
                for row in new_rows:
                    name = row_names(row)
                    computed_row_names.append(name)
            elif utils.issequence(row_names):
                computed_row_names = row_names
            else:
                raise ValueError('row_names must be a column name, function or sequence')

            for row_name in computed_row_names:
                if type(row_name) is int:
                    raise ValueError('Row names cannot be of type int. Use Decimal for numbered row names.')

            self._row_names = tuple(computed_row_names)
        else:
            self._row_names = None

        self._rows = MappedSequence(new_rows, self._row_names)

        # Build columns
        new_columns = []

        for i in range(len_column_names):
            name = self._column_names[i]
            data_type = self._column_types[i]

            column = Column(i, name, data_type, self._rows, row_names=self._row_names)
            new_columns.append(column)

        self._columns = MappedSequence(new_columns, self._column_names)

    def __str__(self):
        """
        Print the table's structure using :meth:`.Table.print_structure`.
        """
        structure = StringIO()

        self.print_structure(output=structure)

        return structure.getvalue()

    def __len__(self):
        """
        Shorthand for :code:`len(table.rows)`.
        """
        return self._rows.__len__()

    def __iter__(self):
        """
        Shorthand for :code:`iter(table.rows)`.
        """
        return self._rows.__iter__()

    def __getitem__(self, key):
        """
        Shorthand for :code:`table.rows[foo]`.
        """
        return self._rows.__getitem__(key)

    @property
    def column_types(self):
        """
        A tuple of :class:`.DataType` instances.
""" return self._column_types @property def column_names(self): """ An tuple of strings. """ return self._column_names @property def row_names(self): """ An tuple of strings, if this table has row names. If this table does not have row names, then :code:`None`. """ return self._row_names @property def columns(self): """ A :class:`.MappedSequence` with column names for keys and :class:`.Column` instances for values. """ return self._columns @property def rows(self): """ A :class:`.MappedSeqeuence` with row names for keys (if specified) and :class:`.Row` instances for values. """ return self._rows def _fork(self, rows, column_names=None, column_types=None, row_names=None): """ Create a new table using the metadata from this one. This method is used internally by functions like :meth:`.Table.order_by`. :param rows: Row data for the forked table. :param column_names: Column names for the forked table. If not specified, fork will use this table's column names. :param column_types: Column types for the forked table. If not specified, fork will use this table's column names. :param row_names: Row names for the forked table. If not specified, fork will use this table's row names. """ if column_names is None: column_names = self._column_names if column_types is None: column_types = self._column_types if row_names is None: row_names = self._row_names return Table(rows, column_names, column_types, row_names=row_names, _is_fork=True) def print_csv(self, **kwargs): """ Print this table as a CSV. This is the same as passing :code:`sys.stdout` to :meth:`.Table.to_csv`. :code:`kwargs` will be passed on to :meth:`.Table.to_csv`. """ self.to_csv(sys.stdout, **kwargs) def print_json(self, **kwargs): """ Print this table as JSON. This is the same as passing :code:`sys.stdout` to :meth:`.Table.to_json`. :code:`kwargs` will be passed on to :meth:`.Table.to_json`. 
""" self.to_json(sys.stdout, **kwargs) from agate.table.aggregate import aggregate from agate.table.bar_chart import bar_chart from agate.table.bins import bins from agate.table.column_chart import column_chart from agate.table.compute import compute from agate.table.denormalize import denormalize from agate.table.distinct import distinct from agate.table.exclude import exclude from agate.table.find import find from agate.table.from_csv import from_csv from agate.table.from_fixed import from_fixed from agate.table.from_json import from_json from agate.table.from_object import from_object from agate.table.group_by import group_by from agate.table.homogenize import homogenize from agate.table.join import join from agate.table.limit import limit from agate.table.line_chart import line_chart from agate.table.merge import merge from agate.table.normalize import normalize from agate.table.order_by import order_by from agate.table.pivot import pivot from agate.table.print_bars import print_bars from agate.table.print_html import print_html from agate.table.print_structure import print_structure from agate.table.print_table import print_table from agate.table.rename import rename from agate.table.scatterplot import scatterplot from agate.table.select import select from agate.table.to_csv import to_csv from agate.table.to_json import to_json from agate.table.where import where Table.aggregate = aggregate Table.bar_chart = bar_chart Table.bins = bins Table.column_chart = column_chart Table.compute = compute Table.denormalize = denormalize Table.distinct = distinct Table.exclude = exclude Table.find = find Table.from_csv = from_csv Table.from_fixed = from_fixed Table.from_json = from_json Table.from_object = from_object Table.group_by = group_by Table.homogenize = homogenize Table.join = join Table.limit = limit Table.line_chart = line_chart Table.merge = merge Table.normalize = normalize Table.order_by = order_by Table.pivot = pivot Table.print_bars = print_bars 
Table.print_html = print_html Table.print_structure = print_structure Table.print_table = print_table Table.rename = rename Table.scatterplot = scatterplot Table.select = select Table.to_csv = to_csv Table.to_json = to_json Table.where = where agate-1.9.1/agate/table/aggregate.py000066400000000000000000000016441454111471500172630ustar00rootroot00000000000000from collections import OrderedDict from agate import utils def aggregate(self, aggregations): """ Apply one or more :class:`.Aggregation` instances to this table. :param aggregations: A single :class:`.Aggregation` instance or a sequence of tuples in the format :code:`(name, aggregation)`, where each :code:`aggregation` is an instance of :class:`.Aggregation`. :returns: If the input was a single :class:`Aggregation` then a single result will be returned. If it was a sequence then an :class:`.OrderedDict` of results will be returned. """ if utils.issequence(aggregations): results = OrderedDict() for name, agg in aggregations: agg.validate(self) for name, agg in aggregations: results[name] = agg.run(self) return results aggregations.validate(self) return aggregations.run(self) agate-1.9.1/agate/table/bar_chart.py000066400000000000000000000023171454111471500172600ustar00rootroot00000000000000import leather def bar_chart(self, label=0, value=1, path=None, width=None, height=None): """ Render a bar chart using :class:`leather.Chart`. :param label: The name or index of a column to plot as the labels of the chart. Defaults to the first column in the table. :param value: The name or index of a column to plot as the values of the chart. Defaults to the second column in the table. :param path: If specified, the resulting SVG will be saved to this location. If :code:`None` and running in IPython, then the SVG will be rendered inline. Otherwise, the SVG data will be returned as a string. :param width: The width of the output SVG. :param height: The height of the output SVG. 
""" if type(label) is int: label_name = self.column_names[label] else: label_name = label if type(value) is int: value_name = self.column_names[value] else: value_name = value chart = leather.Chart() chart.add_x_axis(name=value_name) chart.add_y_axis(name=label_name) chart.add_bars(self, x=value, y=label) return chart.to_svg(path=path, width=width, height=height) agate-1.9.1/agate/table/bins.py000066400000000000000000000062331454111471500162670ustar00rootroot00000000000000from decimal import Decimal from babel.numbers import format_decimal from agate import utils from agate.aggregations import Max, Min def bins(self, column_name, count=10, start=None, end=None): """ Generates (approximately) evenly sized bins for the values in a column. Bins may not be perfectly even if the spread of the data does not divide evenly, but all values will always be included in some bin. The resulting table will have two columns. The first will have the same name as the specified column, but will be type :class:`.Text`. The second will be named :code:`count` and will be of type :class:`.Number`. :param column_name: The name of the column to bin. Must be of type :class:`.Number` :param count: The number of bins to create. If not specified then each value will be counted as its own bin. :param start: The minimum value to start the bins at. If not specified the minimum value in the column will be used. :param end: The maximum value to end the bins at. If not specified the maximum value in the column will be used. :returns: A new :class:`Table`. 
""" minimum, maximum = utils.round_limits( Min(column_name).run(self), Max(column_name).run(self) ) # Infer bin start/end positions start = minimum if not start else Decimal(start) end = maximum if not end else Decimal(end) # Calculate bin size spread = abs(end - start) size = spread / count breaks = [start] # Calculate breakpoints for i in range(1, count + 1): top = start + (size * i) breaks.append(top) # Format bin names decimal_places = utils.max_precision(breaks) break_formatter = utils.make_number_formatter(decimal_places) def name_bin(i, j, first_exclusive=True, last_exclusive=False): inclusive = format_decimal(i, format=break_formatter) exclusive = format_decimal(j, format=break_formatter) output = '[' if first_exclusive else '(' output += f'{inclusive} - {exclusive}' output += ']' if last_exclusive else ')' return output # Generate bins bin_names = [] for i in range(1, len(breaks)): last_exclusive = (i == len(breaks) - 1) if i == 1 and minimum < start: name = name_bin(minimum, breaks[i], last_exclusive=last_exclusive) elif i == len(breaks) - 1 and maximum > end: name = name_bin(breaks[i - 1], maximum, last_exclusive=last_exclusive) else: name = name_bin(breaks[i - 1], breaks[i], last_exclusive=last_exclusive) bin_names.append(name) bin_names.append(None) # Lambda method for actually assigning values to bins def binner(row): value = row[column_name] if value is None: return None i = 1 try: while value >= breaks[i]: i += 1 except IndexError: i -= 1 return bin_names[i - 1] # Pivot by lambda table = self.pivot(binner, key_name=column_name) # Sort by bin order return table.order_by(lambda r: bin_names.index(r[column_name])) agate-1.9.1/agate/table/column_chart.py000066400000000000000000000023301454111471500200040ustar00rootroot00000000000000import leather def column_chart(self, label=0, value=1, path=None, width=None, height=None): """ Render a column chart using :class:`leather.Chart`. 
:param label: The name or index of a column to plot as the labels of the chart. Defaults to the first column in the table. :param value: The name or index of a column to plot as the values of the chart. Defaults to the second column in the table. :param path: If specified, the resulting SVG will be saved to this location. If :code:`None` and running in IPython, then the SVG will be rendered inline. Otherwise, the SVG data will be returned as a string. :param width: The width of the output SVG. :param height: The height of the output SVG. """ if type(label) is int: label_name = self.column_names[label] else: label_name = label if type(value) is int: value_name = self.column_names[value] else: value_name = value chart = leather.Chart() chart.add_x_axis(name=label_name) chart.add_y_axis(name=value_name) chart.add_columns(self, x=label, y=value) return chart.to_svg(path=path, width=width, height=height) agate-1.9.1/agate/table/compute.py000066400000000000000000000040071454111471500170050ustar00rootroot00000000000000from collections import OrderedDict from copy import copy from agate.rows import Row def compute(self, computations, replace=False): """ Create a new table by applying one or more :class:`.Computation` instances to each row. :param computations: A sequence of pairs of new column names and :class:`.Computation` instances. :param replace: If :code:`True` then new column names can match existing names, and those columns will be replaced with the computed data. :returns: A new :class:`.Table`. """ column_names = list(copy(self._column_names)) column_types = list(copy(self._column_types)) for new_column_name, computation in computations: new_column_type = computation.get_computed_data_type(self) if new_column_name in column_names: if not replace: raise ValueError( 'New column name "%s" already exists. Specify replace=True to replace with computed data.' 
                    % new_column_name
                )

            i = column_names.index(new_column_name)
            column_types[i] = new_column_type
        else:
            column_names.append(new_column_name)
            column_types.append(new_column_type)

        computation.validate(self)

    new_columns = OrderedDict()

    for new_column_name, computation in computations:
        new_columns[new_column_name] = computation.run(self)

    new_rows = []

    for i, row in enumerate(self._rows):
        # Slow version if using replace
        if replace:
            values = []

            for j, column_name in enumerate(column_names):
                if column_name in new_columns:
                    values.append(new_columns[column_name][i])
                else:
                    values.append(row[j])
        # Faster version if not using replace
        else:
            values = row.values() + tuple(c[i] for c in new_columns.values())

        new_rows.append(Row(values, column_names))

    return self._fork(new_rows, column_names, column_types)


# ---- agate-1.9.1/agate/table/denormalize.py ----

from collections import OrderedDict
from decimal import Decimal

from agate import utils
from agate.data_types import Number
from agate.rows import Row
from agate.type_tester import TypeTester


def denormalize(self, key=None, property_column='property', value_column='value', default_value=utils.default,
                column_types=None):
    """
    Create a new table with row values converted into columns.

    For example:

    +---------+-----------+---------+
    | name    | property  | value   |
    +=========+===========+=========+
    | Jane    | gender    | female  |
    +---------+-----------+---------+
    | Jane    | race      | black   |
    +---------+-----------+---------+
    | Jane    | age       | 24      |
    +---------+-----------+---------+
    | ...     | ...       | ...     |
    +---------+-----------+---------+

    Can be denormalized so that each unique value in the property column
    becomes a column, with the corresponding value used for its values.
    +---------+----------+--------+-------+
    | name    | gender   | race   | age   |
    +=========+==========+========+=======+
    | Jane    | female   | black  | 24    |
    +---------+----------+--------+-------+
    | Jack    | male     | white  | 35    |
    +---------+----------+--------+-------+
    | Joe     | male     | black  | 28    |
    +---------+----------+--------+-------+

    If one or more keys are specified then the resulting table will
    automatically have :code:`row_names` set to those keys.

    This is the opposite of :meth:`.Table.normalize`.

    :param key:
        A column name or a sequence of column names that should be
        maintained as they are in the normalized table. Typically these are
        the table's unique identifiers and any metadata about them. Or,
        :code:`None` if there are no key columns.
    :param property_column:
        The column whose values should become column names in the new table.
    :param value_column:
        The column whose values should become the values of the new columns
        in the new table.
    :param default_value:
        Value to be used for missing values in the denormalized table. If not
        specified :code:`Decimal(0)` will be used for aggregations that
        return :class:`.Number` data and :code:`None` will be used for all
        others.
    :param column_types:
        A sequence of column types with length equal to number of unique
        values in property_column or an instance of :class:`.TypeTester`.
        Defaults to a generic :class:`.TypeTester`.
    :returns:
        A new :class:`.Table`.
""" from agate.table import Table if key is None: key = [] elif not utils.issequence(key): key = [key] field_names = [] row_data = OrderedDict() for row in self.rows: row_key = tuple(row[k] for k in key) if row_key not in row_data: row_data[row_key] = OrderedDict() f = str(row[property_column]) v = row[value_column] if f not in field_names: field_names.append(f) row_data[row_key][f] = v if default_value == utils.default: if isinstance(self.columns[value_column].data_type, Number): default_value = Decimal(0) else: default_value = None new_column_names = key + field_names new_rows = [] row_names = [] for k, v in row_data.items(): row = list(k) if len(k) == 1: row_names.append(k[0]) else: row_names.append(k) for f in field_names: if f in v: row.append(v[f]) else: row.append(default_value) new_rows.append(Row(row, new_column_names)) key_column_types = [self.column_types[self.column_names.index(name)] for name in key] if column_types is None or isinstance(column_types, TypeTester): tester = TypeTester() if column_types is None else column_types force_update = dict(zip(key, key_column_types)) force_update.update(tester._force) tester._force = force_update new_column_types = tester.run(new_rows, new_column_names) else: new_column_types = key_column_types + list(column_types) return Table(new_rows, new_column_names, new_column_types, row_names=row_names) agate-1.9.1/agate/table/distinct.py000066400000000000000000000022401454111471500171470ustar00rootroot00000000000000from agate import utils def distinct(self, key=None): """ Create a new table with only unique rows. :param key: Either the name of a single column to use to identify unique rows, a sequence of such column names, a :class:`function` that takes a row and returns a value to identify unique rows, or `None`, in which case the entire row will be checked for uniqueness. :returns: A new :class:`.Table`. 
""" key_is_row_function = hasattr(key, '__call__') key_is_sequence = utils.issequence(key) uniques = [] rows = [] if self._row_names is not None: row_names = [] else: row_names = None for i, row in enumerate(self._rows): if key_is_row_function: k = key(row) elif key_is_sequence: k = (row[j] for j in key) elif key is None: k = tuple(row) else: k = row[key] if k not in uniques: uniques.append(k) rows.append(row) if self._row_names is not None: row_names.append(self._row_names[i]) return self._fork(rows, row_names=row_names) agate-1.9.1/agate/table/exclude.py000066400000000000000000000007101454111471500167570ustar00rootroot00000000000000from agate import utils def exclude(self, key): """ Create a new table without the specified columns. :param key: Either the name of a single column to exclude or a sequence of such names. :returns: A new :class:`.Table`. """ if not utils.issequence(key): key = [key] selected_column_names = tuple(n for n in self._column_names if n not in key) return self.select(selected_column_names) agate-1.9.1/agate/table/find.py000066400000000000000000000006061454111471500162520ustar00rootroot00000000000000def find(self, test): """ Find the first row that passes a test. :param test: A function that takes a :class:`.Row` and returns :code:`True` if it matches. :type test: :class:`function` :returns: A single :class:`.Row` if found, or `None`. """ for row in self._rows: if test(row): return row return None agate-1.9.1/agate/table/from_csv.py000066400000000000000000000054151454111471500171530ustar00rootroot00000000000000import itertools from io import StringIO @classmethod def from_csv(cls, path, column_names=None, column_types=None, row_names=None, skip_lines=0, header=True, sniff_limit=0, encoding='utf-8', row_limit=None, **kwargs): """ Create a new table from a CSV. This method uses agate's builtin CSV reader, which supplies encoding support for both Python 2 and Python 3. :code:`kwargs` will be passed through to the CSV reader. 
:param path: Filepath or file-like object from which to read CSV data. If a file-like object is specified, it must be seekable. If using Python 2, the file should be opened in binary mode (`rb`). :param column_names: See :meth:`.Table.__init__`. :param column_types: See :meth:`.Table.__init__`. :param row_names: See :meth:`.Table.__init__`. :param skip_lines: The number of lines to skip from the top of the file. :param header: If :code:`True`, the first row of the CSV is assumed to contain column names. If :code:`header` and :code:`column_names` are both specified then a row will be skipped, but :code:`column_names` will be used. :param sniff_limit: Limit CSV dialect sniffing to the specified number of bytes. Set to None to sniff the entire file. Defaults to 0 (no sniffing). :param encoding: Character encoding of the CSV file. Note: if passing in a file handle it is assumed you have already opened it with the correct encoding specified. :param row_limit: Limit how many rows of data will be read. 
""" from agate import csv from agate.table import Table close = False try: if hasattr(path, 'read'): f = path else: f = open(path, encoding=encoding) close = True if isinstance(skip_lines, int): while skip_lines > 0: f.readline() skip_lines -= 1 else: raise ValueError('skip_lines argument must be an int') contents = StringIO(f.read()) if sniff_limit is None: kwargs['dialect'] = csv.Sniffer().sniff(contents.getvalue()) elif sniff_limit > 0: kwargs['dialect'] = csv.Sniffer().sniff(contents.getvalue()[:sniff_limit]) reader = csv.reader(contents, header=header, **kwargs) if header: if column_names is None: column_names = next(reader) else: next(reader) if row_limit is None: rows = tuple(reader) else: rows = tuple(itertools.islice(reader, row_limit)) finally: if close: f.close() return Table(rows, column_names, column_types, row_names=row_names) agate-1.9.1/agate/table/from_fixed.py000066400000000000000000000040211454111471500174470ustar00rootroot00000000000000from agate import fixed, utils @classmethod def from_fixed(cls, path, schema_path, column_names=utils.default, column_types=None, row_names=None, encoding='utf-8', schema_encoding='utf-8'): """ Create a new table from a fixed-width file and a CSV schema. Schemas must be in the "ffs" format. There is a repository of such schemas maintained at `wireservice/ffs `_. :param path: File path or file-like object from which to read fixed-width data. :param schema_path: File path or file-like object from which to read schema (CSV) data. :param column_names: By default, these will be parsed from the schema. For alternatives, see :meth:`.Table.__init__`. :param column_types: See :meth:`.Table.__init__`. :param row_names: See :meth:`.Table.__init__`. :param encoding: Character encoding of the fixed-width file. Note: if passing in a file handle it is assumed you have already opened it with the correct encoding specified. :param schema_encoding: Character encoding of the schema file. 
        Note: if passing in a file handle it is assumed you have already
        opened it with the correct encoding specified.
    """
    from agate.table import Table

    close_f = False
    close_schema_f = False

    try:
        if not hasattr(path, 'read'):
            f = open(path, encoding=encoding)
            close_f = True
        else:
            f = path

        if not hasattr(schema_path, 'read'):
            schema_f = open(schema_path, encoding=schema_encoding)
            close_schema_f = True
        else:
            # Use the schema file handle itself, not the data path.
            schema_f = schema_path

        reader = fixed.reader(f, schema_f)
        rows = list(reader)
    finally:
        if close_f:
            f.close()

        if close_schema_f:
            schema_f.close()

    if column_names == utils.default:
        column_names = reader.fieldnames

    return Table(rows, column_names, column_types, row_names=row_names)


# ---- agate-1.9.1/agate/table/from_json.py ----

import json
from collections import OrderedDict
from decimal import Decimal


@classmethod
def from_json(cls, path, row_names=None, key=None, newline=False, column_types=None, encoding='utf-8', **kwargs):
    """
    Create a new table from a JSON file.

    Once the JSON has been deserialized, the resulting Python object is
    passed to :meth:`.Table.from_object`.

    If the file contains a top-level dictionary you may specify what
    property contains the row list using the :code:`key` parameter.

    :code:`kwargs` will be passed through to :meth:`json.load`.

    :param path:
        Filepath or file-like object from which to read JSON data.
    :param row_names:
        See :meth:`.Table.__init__`.
    :param key:
        The key of the top-level dictionary that contains a list of row
        arrays.
    :param newline:
        If `True` then the file will be parsed as "newline-delimited JSON".
    :param column_types:
        See :meth:`.Table.__init__`.
    :param encoding:
        According to RFC4627, JSON text shall be encoded in Unicode; the
        default encoding is UTF-8. You can override this by using any
        encoding supported by your Python's open() function if :code:`path`
        is a filepath.
If passing in a file handle, it is assumed you have already opened it with the correct encoding specified. """ from agate.table import Table if key is not None and newline: raise ValueError('key and newline may not be specified together.') close = False try: if newline: js = [] if hasattr(path, 'read'): for line in path: js.append(json.loads(line, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs)) else: f = open(path, encoding=encoding) close = True for line in f: js.append(json.loads(line, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs)) else: if hasattr(path, 'read'): js = json.load(path, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs) else: f = open(path, encoding=encoding) close = True js = json.load(f, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs) if isinstance(js, dict): if not key: raise TypeError( 'When converting a JSON document with a top-level dictionary element, a key must be specified.' ) js = js[key] finally: if close: f.close() return Table.from_object(js, row_names=row_names, column_types=column_types) agate-1.9.1/agate/table/from_object.py000066400000000000000000000037241454111471500176270ustar00rootroot00000000000000from agate import utils @classmethod def from_object(cls, obj, row_names=None, column_types=None): """ Create a new table from a Python object. The object should be a list containing a dictionary for each "row". Nested objects or lists will also be parsed. For example, this object: .. code-block:: python { 'one': { 'a': 1, 'b': 2, 'c': 3 }, 'two': [4, 5, 6], 'three': 'd' } Would generate these columns and values: .. code-block:: python { 'one/a': 1, 'one/b': 2, 'one/c': 3, 'two.0': 4, 'two.1': 5, 'two.2': 6, 'three': 'd' } Column names and types will be inferred from the data. Not all rows are required to have the same keys. Missing elements will be filled in with null values. Keys containing a slash (``/``) can collide with other keys. For example: .. 
code-block:: python { 'a/b': 2, 'a': { 'b': False } } Would generate: .. code-block:: python { 'a/b': false } :param obj: Filepath or file-like object from which to read JSON data. :param row_names: See :meth:`.Table.__init__`. :param column_types: See :meth:`.Table.__init__`. """ from agate.table import Table column_names = [] row_objects = [] for sub in obj: parsed = utils.parse_object(sub) for key in parsed.keys(): if key not in column_names: column_names.append(key) row_objects.append(parsed) rows = [] for sub in row_objects: r = [] for name in column_names: r.append(sub.get(name, None)) rows.append(r) return Table(rows, column_names, row_names=row_names, column_types=column_types) agate-1.9.1/agate/table/group_by.py000066400000000000000000000037441454111471500171660ustar00rootroot00000000000000from collections import OrderedDict from agate.data_types import Text from agate.tableset import TableSet def group_by(self, key, key_name=None, key_type=None): """ Create a :class:`.TableSet` with a table for each unique key. Note that group names will always be coerced to a string, regardless of the format of the input column. :param key: Either the name of a column from the this table to group by, or a :class:`function` that takes a row and returns a value to group by. :param key_name: A name that describes the grouped properties. Defaults to the column name that was grouped on or "group" if grouping with a key function. See :class:`.TableSet` for more. :param key_type: An instance of any subclass of :class:`.DataType`. If not provided it will default to a :class`.Text`. :returns: A :class:`.TableSet` mapping where the keys are unique values from the :code:`key` and the values are new :class:`.Table` instances containing the grouped rows. 
""" key_is_row_function = hasattr(key, '__call__') if key_is_row_function: key_name = key_name or 'group' key_type = key_type or Text() else: column = self._columns[key] key_name = key_name or column.name key_type = key_type or column.data_type groups = OrderedDict() for row in self._rows: if key_is_row_function: group_name = key(row) else: group_name = row[column.name] group_name = key_type.cast(group_name) if group_name not in groups: groups[group_name] = [] groups[group_name].append(row) if not groups: return TableSet([self._fork([])], [], key_name=key_name, key_type=key_type) output = OrderedDict() for group, rows in groups.items(): output[group] = self._fork(rows) return TableSet(output.values(), output.keys(), key_name=key_name, key_type=key_type) agate-1.9.1/agate/table/homogenize.py000066400000000000000000000057271454111471500175070ustar00rootroot00000000000000from agate import utils from agate.rows import Row def homogenize(self, key, compare_values, default_row=None): """ Fill in missing rows in a series. This can be used, for instance, to add rows for missing years in a time series. Missing rows are found by comparing the values in the :code:`key` columns with those provided as :code:`compare_values`. Values not found in the table will be used to generate new rows with the given :code:`default_row`. :code:`default_row` should be an array of values or an array-generating function. If not specified, the new rows will have :code:`None` in columns all columns not specified in :code:`key`. If :code:`default_row` is an array of values, its length should be row length minus the number of column names provided in the :code:`key`. If it is an array-generating function, the function should take an array of missing values for each new row and output a full row including those values. :param key: Either a column name or a sequence of such names. 
    :param compare_values:
        Either an array of column values if key is a single column name or a
        sequence of arrays of values if key is a sequence of names. It can
        also be a generator that yields either of the two. A row is created
        for each value or list of values not found in the rows of the table.
    :param default_row:
        An array of values or a function to generate new rows. The length of
        the input array should be equal to row length minus column_names
        count. The length of array generated by the function should be the
        row length.
    :returns:
        A new :class:`.Table`.
    """
    rows = list(self._rows)

    if not utils.issequence(key):
        key = [key]

    if len(key) == 1:
        if any(not utils.issequence(compare_value) for compare_value in compare_values):
            compare_values = [[compare_value] for compare_value in compare_values]

    column_values = [self._columns.get(name) for name in key]
    column_indexes = [self._column_names.index(name) for name in key]

    compare_values = [[column_values[i].data_type.cast(v) for i, v in enumerate(values)] for values in compare_values]

    column_values = zip(*column_values)

    differences = list(set(map(tuple, compare_values)) - set(column_values))

    for difference in differences:
        if callable(default_row):
            new_row = default_row(difference)
        else:
            if default_row is not None:
                new_row = list(default_row)
            else:
                new_row = [None] * (len(self._column_names) - len(key))

            for i, d in zip(column_indexes, difference):
                new_row.insert(i, d)

        new_row = [self._columns[i].data_type.cast(v) for i, v in enumerate(new_row)]

        rows.append(Row(new_row, self._column_names))

    # Do not copy the row_names, since this function adds rows.
    return self._fork(rows, row_names=[])


# agate-1.9.1/agate/table/join.py
from agate import utils
from agate.rows import Row


def join(self, right_table, left_key=None, right_key=None, inner=False, full_outer=False, require_match=False,
         columns=None):
    """
    Create a new table by joining two tables on common values.

    This method implements most varieties of SQL join, in addition to some
    unique features.

    If :code:`left_key` and :code:`right_key` are both :code:`None` then this
    method will perform a "sequential join", which is to say it will join on
    row number. The :code:`inner` and :code:`full_outer` arguments will
    determine whether dangling left-hand and right-hand rows are included,
    respectively.

    If :code:`left_key` is specified, then a "left outer join" will be
    performed. This will combine columns from the :code:`right_table`
    anywhere that :code:`left_key` and :code:`right_key` are equal. Unmatched
    rows from the left table will be included with the right-hand columns set
    to :code:`None`.

    If :code:`inner` is :code:`True` then an "inner join" will be performed.
    Unmatched rows from either table will be left out.

    If :code:`full_outer` is :code:`True` then a "full outer join" will be
    performed. Unmatched rows from both tables will be included, with the
    columns in the other table set to :code:`None`.

    In all cases, if :code:`right_key` is :code:`None` then :code:`left_key`
    will be used for both tables.

    If :code:`left_key` and :code:`right_key` are column names, the
    right-hand identifier column will not be included in the output table.

    If :code:`require_match` is :code:`True` unmatched rows will raise an
    exception. This is like an "inner join" except any row that doesn't have
    a match will raise an exception instead of being dropped. This is useful
    for enforcing expectations about datasets that should match.
    Column names from the right table which also exist in this table will be
    suffixed "2" in the new table.

    A subset of columns from the right-hand table can be included in the
    joined table using the :code:`columns` argument.

    :param right_table:
        The "right" table to join to.
    :param left_key:
        Either the name of a column from this table to join on, the index of
        a column, a sequence of such column identifiers, a :class:`function`
        that takes a row and returns a value to join on, or :code:`None` in
        which case the tables will be joined on row number.
    :param right_key:
        Either the name of a column from :code:`right_table` to join on, the
        index of a column, a sequence of such column identifiers, or a
        :class:`function` that takes a row and returns a value to join on.
        If :code:`None` then :code:`left_key` will be used for both. If
        :code:`left_key` is :code:`None` then this value is ignored.
    :param inner:
        Perform a SQL-style "inner join" instead of a left outer join. Rows
        which have no match for :code:`left_key` will not be included in the
        output table.
    :param full_outer:
        Perform a SQL-style "full outer" join rather than a left or a right.
        May not be used in combination with :code:`inner`.
    :param require_match:
        If true, an exception will be raised if there is a left_key with no
        matching right_key.
    :param columns:
        A sequence of column names from :code:`right_table` to include in
        the final output table. Defaults to all columns not in
        :code:`right_key`. Ignored when :code:`full_outer` is :code:`True`.
    :returns:
        A new :class:`.Table`.
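    The hash-join strategy used below (index the right-hand rows by key, then
    scan the left rows once) can be sketched on plain dicts. This is a
    stdlib illustration, not agate's API; ``left_outer_join`` and the sample
    rows are hypothetical.

    ```python
    def left_outer_join(left, right, left_key, right_key):
        # Index right-hand rows by key value, like the right_hash lookup.
        right_hash = {}
        for row in right:
            right_hash.setdefault(row[right_key], []).append(row)

        joined = []
        for row in left:
            matches = right_hash.get(row[left_key])
            if matches:
                for match in matches:
                    combined = dict(row)
                    # Right-hand columns that collide get a "2" suffix.
                    for k, v in match.items():
                        combined[k if k not in row else k + '2'] = v
                    joined.append(combined)
            else:
                # Unmatched left rows survive -- the "left outer" part.
                joined.append(dict(row))
        return joined

    rows = left_outer_join(
        [{'id': 1, 'a': 'x'}, {'id': 2, 'a': 'y'}],
        [{'id': 1, 'b': 'z'}],
        'id', 'id')
    ```

    Unlike agate, this sketch keeps the right-hand key (as ``id2``) rather
    than dropping it, and does not fill unmatched rows with ``None`` columns.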
""" if inner and full_outer: raise ValueError('A join can not be both "inner" and "full_outer".') if right_key is None: right_key = left_key # Get join columns right_key_indices = [] left_key_is_func = hasattr(left_key, '__call__') left_key_is_sequence = utils.issequence(left_key) # Left key is None if left_key is None: left_data = tuple(range(len(self._rows))) # Left key is a function elif left_key_is_func: left_data = [left_key(row) for row in self._rows] # Left key is a sequence elif left_key_is_sequence: left_columns = [self._columns[key] for key in left_key] left_data = zip(*[column.values() for column in left_columns]) # Left key is a column name/index else: left_data = self._columns[left_key].values() right_key_is_func = hasattr(right_key, '__call__') right_key_is_sequence = utils.issequence(right_key) # Sequential join if left_key is None: right_data = tuple(range(len(right_table._rows))) # Right key is a function elif right_key_is_func: right_data = [right_key(row) for row in right_table._rows] # Right key is a sequence elif right_key_is_sequence: right_columns = [right_table._columns[key] for key in right_key] right_data = zip(*[column.values() for column in right_columns]) right_key_indices = [right_table._columns._keys.index(key) for key in right_key] # Right key is a column name/index else: right_column = right_table._columns[right_key] right_data = right_column.values() right_key_indices = [right_table._columns.index(right_column)] # Build names and type lists column_names = list(self._column_names) column_types = list(self._column_types) for i, column in enumerate(right_table._columns): name = column.name if not full_outer: if columns is None and i in right_key_indices: continue if columns is not None and name not in columns: continue if name in self.column_names: column_names.append('%s2' % name) else: column_names.append(name) column_types.append(column.data_type) if columns is not None and not full_outer: right_table = right_table.select([n for n 
in right_table._column_names if n in columns]) right_hash = {} for i, value in enumerate(right_data): if value not in right_hash: right_hash[value] = [] right_hash[value].append(right_table._rows[i]) # Collect new rows rows = [] if self._row_names is not None and not full_outer: row_names = [] else: row_names = None # Iterate over left column for left_index, left_value in enumerate(left_data): matching_rows = right_hash.get(left_value, None) if require_match and matching_rows is None: raise ValueError('Left key "%s" does not have a matching right key.' % left_value) # Rows with matches if matching_rows: for right_row in matching_rows: new_row = list(self._rows[left_index]) for k, v in enumerate(right_row): if columns is None and k in right_key_indices and not full_outer: continue new_row.append(v) rows.append(Row(new_row, column_names)) if self._row_names is not None and not full_outer: row_names.append(self._row_names[left_index]) # Rows without matches elif not inner: new_row = list(self._rows[left_index]) for k, v in enumerate(right_table._column_names): if columns is None and k in right_key_indices and not full_outer: continue new_row.append(None) rows.append(Row(new_row, column_names)) if self._row_names is not None and not full_outer: row_names.append(self._row_names[left_index]) # Full outer join if full_outer: left_set = set(left_data) for right_index, right_value in enumerate(right_data): if right_value in left_set: continue new_row = ([None] * len(self._columns)) + list(right_table.rows[right_index]) rows.append(Row(new_row, column_names)) return self._fork(rows, column_names, column_types, row_names=row_names) agate-1.9.1/agate/table/limit.py000066400000000000000000000014771454111471500164570ustar00rootroot00000000000000def limit(self, start_or_stop=None, stop=None, step=None): """ Create a new table with fewer rows. See also: Python's builtin :func:`slice`. 
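    The argument dispatch used below mirrors :func:`slice` directly and can
    be demonstrated on a plain list; ``limit`` here is a stdlib sketch of the
    same logic, not agate's method, and ``data`` is hypothetical.

    ```python
    def limit(rows, start_or_stop=None, stop=None, step=None):
        # A single argument is treated as a stop; otherwise build
        # a full slice, exactly as slice() itself behaves.
        if stop or step:
            s = slice(start_or_stop, stop, step)
        else:
            s = slice(start_or_stop)
        return rows[s]

    data = list(range(10))
    first_three = limit(data, 3)
    every_other = limit(data, 2, 8, 2)
    ```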
    :param start_or_stop:
        If the only argument, then how many rows to include, otherwise, the
        index of the first row to include.
    :param stop:
        The index of the last row to include.
    :param step:
        The size of the jump between rows to include. (`step=2` will return
        every other row.)
    :returns:
        A new :class:`.Table`.
    """
    if stop or step:
        s = slice(start_or_stop, stop, step)
    else:
        s = slice(start_or_stop)

    rows = self._rows[s]

    if self._row_names is not None:
        row_names = self._row_names[s]
    else:
        row_names = None

    return self._fork(rows, row_names=row_names)


# agate-1.9.1/agate/table/line_chart.py
import leather


def line_chart(self, x=0, y=1, path=None, width=None, height=None):
    """
    Render a line chart using :class:`leather.Chart`.

    :param x:
        The name or index of a column to plot as the x-axis. Defaults to the
        first column in the table.
    :param y:
        The name or index of a column to plot as the y-axis. Defaults to the
        second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
    """
    if type(x) is int:
        x_name = self.column_names[x]
    else:
        x_name = x

    if type(y) is int:
        y_name = self.column_names[y]
    else:
        y_name = y

    chart = leather.Chart()
    chart.add_x_axis(name=x_name)
    chart.add_y_axis(name=y_name)
    chart.add_line(self, x=x, y=y)

    return chart.to_svg(path=path, width=width, height=height)


# agate-1.9.1/agate/table/merge.py
from collections import OrderedDict

from agate.exceptions import DataTypeError
from agate.rows import Row


@classmethod
def merge(cls, tables, row_names=None, column_names=None):
    """
    Create a new table from a sequence of similar tables.
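    The column-union behavior described below can be sketched with lists of
    dicts; ``merge_rows`` and the sample tables are hypothetical stdlib
    stand-ins, not agate's API.

    ```python
    def merge_rows(tables):
        # Collect the union of column names in first-seen order,
        # as merge() does with its OrderedDict of columns.
        columns = []
        for table in tables:
            for row in table:
                for name in row:
                    if name not in columns:
                        columns.append(name)

        # Missing values are filled with None, like row.get(column_key, None).
        merged = [[row.get(name) for name in columns]
                  for table in tables for row in table]
        return columns, merged

    columns, merged = merge_rows([[{'a': 1}], [{'a': 2, 'b': 3}]])
    ```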
    This method will not carry over row names from the merged tables, but
    new row names can be specified with the :code:`row_names` argument.

    It is possible to limit the columns included in the new :class:`.Table`
    with :code:`column_names` argument. For example, to only include columns
    from a specific table, set :code:`column_names` equal to
    :code:`table.column_names`.

    :param tables:
        A sequence of :class:`.Table` instances.
    :param row_names:
        See :class:`.Table` for the usage of this parameter.
    :param column_names:
        A sequence of column names to include in the new :class:`.Table`. If
        not specified, all distinct column names from `tables` are included.
    :returns:
        A new :class:`.Table`.
    """
    from agate.table import Table

    new_columns = OrderedDict()

    for table in tables:
        for i in range(0, len(table.columns)):
            if column_names is None or table.column_names[i] in column_names:
                column_name = table.column_names[i]
                column_type = table.column_types[i]

                if column_name in new_columns:
                    if not isinstance(column_type, type(new_columns[column_name])):
                        raise DataTypeError('Tables contain columns with the same names, but different types.')
                else:
                    new_columns[column_name] = column_type

    column_keys = tuple(new_columns.keys())
    column_types = tuple(new_columns.values())

    rows = []

    for table in tables:
        # Performance optimization for identical table structures
        if table.column_names == column_keys and table.column_types == column_types:
            rows.extend(table.rows)
        else:
            for row in table.rows:
                data = []

                for column_key in column_keys:
                    data.append(row.get(column_key, None))

                rows.append(Row(data, column_keys))

    return Table(rows, column_keys, column_types, row_names=row_names, _is_fork=True)


# agate-1.9.1/agate/table/normalize.py
from agate import utils
from agate.rows import Row
from agate.type_tester import TypeTester


def normalize(self, key, properties, property_column='property', value_column='value', column_types=None):
    """
    Create a new table with columns converted into row values.

    For example:

    +---------+----------+--------+-------+
    | name    | gender   | race   | age   |
    +=========+==========+========+=======+
    | Jane    | female   | black  | 24    |
    +---------+----------+--------+-------+
    | Jack    | male     | white  | 35    |
    +---------+----------+--------+-------+
    | Joe     | male     | black  | 28    |
    +---------+----------+--------+-------+

    can be normalized on columns 'gender', 'race' and 'age':

    +---------+-----------+---------+
    | name    | property  | value   |
    +=========+===========+=========+
    | Jane    | gender    | female  |
    +---------+-----------+---------+
    | Jane    | race      | black   |
    +---------+-----------+---------+
    | Jane    | age       | 24      |
    +---------+-----------+---------+
    | ...     | ...       | ...     |
    +---------+-----------+---------+

    This is the opposite of :meth:`.Table.denormalize`.

    :param key:
        A column name or a sequence of column names that should be
        maintained as they are in the normalized table. Typically these are
        the table's unique identifiers and any metadata about them.
    :param properties:
        A column name or a sequence of column names that should be converted
        to properties in the new table.
    :param property_column:
        The name to use for the column containing the property names.
    :param value_column:
        The name to use for the column containing the property values.
    :param column_types:
        A sequence of two column types for the property and value column in
        that order or an instance of :class:`.TypeTester`. Defaults to a
        generic :class:`.TypeTester`.
    :returns:
        A new :class:`.Table`.
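    The wide-to-long reshaping shown in the tables above can be sketched
    with plain dicts; ``normalize`` here is a stdlib illustration, not
    agate's method, and ``wide_rows`` is hypothetical.

    ```python
    def normalize(rows, key, properties):
        # Emit one output row per (key, property) pair.
        long_rows = []
        for row in rows:
            for prop in properties:
                long_rows.append({key: row[key],
                                  'property': prop,
                                  'value': row[prop]})
        return long_rows

    wide_rows = [{'name': 'Jane', 'gender': 'female', 'age': 24}]
    long_form = normalize(wide_rows, 'name', ['gender', 'age'])
    ```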
""" from agate.table import Table new_rows = [] if not utils.issequence(key): key = [key] if not utils.issequence(properties): properties = [properties] new_column_names = key + [property_column, value_column] row_names = [] for row in self._rows: k = tuple(row[n] for n in key) left_row = list(k) if len(k) == 1: row_names.append(k[0]) else: row_names.append(k) for f in properties: new_rows.append(Row((left_row + [f, row[f]]), new_column_names)) key_column_types = [self._column_types[self._column_names.index(name)] for name in key] if column_types is None or isinstance(column_types, TypeTester): tester = TypeTester() if column_types is None else column_types force_update = dict(zip(key, key_column_types)) force_update.update(tester._force) tester._force = force_update new_column_types = tester.run(new_rows, new_column_names) else: new_column_types = key_column_types + list(column_types) return Table(new_rows, new_column_names, new_column_types) agate-1.9.1/agate/table/order_by.py000066400000000000000000000023201454111471500171320ustar00rootroot00000000000000from agate import utils def order_by(self, key, reverse=False): """ Create a new table that is sorted. :param key: Either the name of a single column to sort by, a sequence of such names, or a :class:`function` that takes a row and returns a value to sort by. :param reverse: If `True` then sort in reverse (typically, descending) order. :returns: A new :class:`.Table`. 
""" if len(self._rows) == 0: return self._fork(self._rows) key_is_row_function = hasattr(key, '__call__') key_is_sequence = utils.issequence(key) def sort_key(data): row = data[1] if key_is_row_function: k = key(row) elif key_is_sequence: k = tuple(utils.NullOrder() if row[n] is None else row[n] for n in key) else: k = row[key] if k is None: return utils.NullOrder() return k results = sorted(enumerate(self._rows), key=sort_key, reverse=reverse) indices, rows = zip(*results) if self._row_names is not None: row_names = [self._row_names[i] for i in indices] else: row_names = None return self._fork(rows, row_names=row_names) agate-1.9.1/agate/table/pivot.py000066400000000000000000000110561454111471500164740ustar00rootroot00000000000000from agate import utils from agate.aggregations import Count def pivot(self, key=None, pivot=None, aggregation=None, computation=None, default_value=utils.default, key_name=None): """ Create a new table by grouping the data, aggregating those groups, applying a computation, and then organizing the groups into new rows and columns. This is sometimes called a "crosstab". +---------+---------+--------+ | name | race | gender | +=========+=========+========+ | Joe | white | male | +---------+---------+--------+ | Jane | black | female | +---------+---------+--------+ | Josh | black | male | +---------+---------+--------+ | Jim | asian | female | +---------+---------+--------+ This table can be pivoted with :code:`key` equal to "race" and :code:`columns` equal to "gender". The default aggregation is :class:`.Count`. This would result in the following table. +---------+---------+--------+ | race | male | female | +=========+=========+========+ | white | 1 | 0 | +---------+---------+--------+ | black | 1 | 1 | +---------+---------+--------+ | asian | 0 | 1 | +---------+---------+--------+ If one or more keys are specified then the resulting table will automatically have :code:`row_names` set to those keys. 
    See also the related method :meth:`.Table.denormalize`.

    :param key:
        Either the name of a column from this table to group by, a sequence
        of such column names, a :class:`function` that takes a row and
        returns a value to group by, or :code:`None`, in which case there
        will be only a single row in the output table.
    :param pivot:
        A column name whose unique values will become columns in the new
        table, or :code:`None` in which case there will be a single value
        column in the output table.
    :param aggregation:
        An instance of an :class:`.Aggregation` to perform on each group of
        data in the pivot table. (Each cell is the result of an aggregation
        of the grouped data.)

        If not specified this defaults to :class:`.Count` with no arguments.
    :param computation:
        An optional :class:`.Computation` instance to be applied to the
        aggregated sequence of values before they are transposed into the
        pivot table.

        Use the class name of the aggregation as your column name argument
        when constructing your computation. (This is "Count" if using the
        default value for :code:`aggregation`.)
    :param default_value:
        Value to be used for missing values in the pivot table. Defaults to
        :code:`Decimal(0)`. If performing non-mathematical aggregations you
        may wish to set this to :code:`None`.
    :param key_name:
        A name for the key column in the output table. This is most useful
        when the provided key is a function. This argument is not valid when
        :code:`key` is a sequence.
    :returns:
        A new :class:`.Table`.
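    The crosstab in the example above (key "race", pivot "gender", default
    :class:`.Count` aggregation, missing cells defaulting to 0) can be
    sketched with a :class:`collections.Counter`; ``crosstab`` and the
    sample rows are hypothetical stdlib stand-ins, not agate's API.

    ```python
    from collections import Counter

    def crosstab(rows, key, pivot):
        # Count rows for each (key, pivot) pair -- the default aggregation.
        counts = Counter((row[key], row[pivot]) for row in rows)

        keys, pivots = [], []
        for row in rows:
            if row[key] not in keys:
                keys.append(row[key])
            if row[pivot] not in pivots:
                pivots.append(row[pivot])

        # Missing combinations default to 0, like default_value=Decimal(0).
        return {k: {p: counts[(k, p)] for p in pivots} for k in keys}

    rows_in = [
        {'race': 'white', 'gender': 'male'},
        {'race': 'black', 'gender': 'female'},
        {'race': 'black', 'gender': 'male'},
        {'race': 'asian', 'gender': 'female'},
    ]
    table = crosstab(rows_in, 'race', 'gender')
    ```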
""" if key is None: key = [] elif not utils.issequence(key): key = [key] elif key_name: raise ValueError('key_name is not a valid argument when key is a sequence.') if aggregation is None: aggregation = Count() groups = self for k in key: groups = groups.group_by(k, key_name=key_name) aggregation_name = str(aggregation) computation_name = str(computation) if computation else None def apply_computation(table): computed = table.compute([ (computation_name, computation) ]) excluded = computed.exclude([aggregation_name]) return excluded if pivot is not None: groups = groups.group_by(pivot) column_type = aggregation.get_aggregate_data_type(self) table = groups.aggregate([ (aggregation_name, aggregation) ]) pivot_count = len(set(table.columns[pivot].values())) if computation is not None: column_types = computation.get_computed_data_type(table) table = apply_computation(table) column_types = [column_type] * pivot_count table = table.denormalize(key, pivot, computation_name or aggregation_name, default_value=default_value, column_types=column_types) else: table = groups.aggregate([ (aggregation_name, aggregation) ]) if computation: table = apply_computation(table) return table agate-1.9.1/agate/table/print_bars.py000066400000000000000000000166501454111471500175030ustar00rootroot00000000000000import sys from collections import OrderedDict from decimal import Decimal from babel.numbers import format_decimal from agate import config, utils from agate.aggregations import Max, Min from agate.data_types import Number from agate.exceptions import DataTypeError def print_bars(self, label_column_name='group', value_column_name='Count', domain=None, width=120, output=sys.stdout, printable=False): """ Print a text-based bar chart based on this table. :param label_column_name: The column containing the label values. Defaults to :code:`group`, which is the default output of :meth:`.Table.pivot` or :meth:`.Table.bins`. :param value_column_name: The column containing the bar values. 
        Defaults to :code:`Count`, which is the default output of
        :meth:`.Table.pivot` or :meth:`.Table.bins`.
    :param domain:
        A 2-tuple containing the minimum and maximum values for the chart's
        x-axis. The domain must be large enough to contain all values in the
        column.
    :param width:
        The width, in characters, to use for the bar chart. Defaults to
        :code:`120`.
    :param output:
        A file-like object to print to. Defaults to :code:`sys.stdout`.
    :param printable:
        If true, only printable characters will be output.
    """
    tick_mark = config.get_option('tick_char')
    horizontal_line = config.get_option('horizontal_line_char')
    locale = config.get_option('default_locale')

    if printable:
        bar_mark = config.get_option('printable_bar_char')
        zero_mark = config.get_option('printable_zero_line_char')
    else:
        bar_mark = config.get_option('bar_char')
        zero_mark = config.get_option('zero_line_char')

    y_label = label_column_name
    label_column = self._columns[label_column_name]

    # if not isinstance(label_column.data_type, Text):
    #     raise ValueError('Only Text data is supported for bar chart labels.')

    x_label = value_column_name
    value_column = self._columns[value_column_name]

    if not isinstance(value_column.data_type, Number):
        raise DataTypeError('Only Number data is supported for bar chart values.')

    # Format numbers
    decimal_places = utils.max_precision(value_column)
    value_formatter = utils.make_number_formatter(decimal_places)

    formatted_labels = []

    for label in label_column:
        formatted_labels.append(str(label))

    formatted_values = []

    for value in value_column:
        if value is None:
            formatted_values.append('-')
        else:
            formatted_values.append(format_decimal(
                value,
                format=value_formatter,
                locale=locale
            ))

    max_label_width = max(max([len(label) for label in formatted_labels]), len(y_label))
    max_value_width = max(max([len(value) for value in formatted_values]), len(x_label))

    plot_width = width - (max_label_width + max_value_width + 2)

    min_value = Min(value_column_name).run(self)
    max_value = Max(value_column_name).run(self)

    # Calculate dimensions
    if domain:
        x_min = Decimal(domain[0])
        x_max = Decimal(domain[1])

        if min_value < x_min or max_value > x_max:
            raise ValueError('Column contains values outside specified domain')
    else:
        x_min, x_max = utils.round_limits(min_value, max_value)

    # All positive
    if x_min >= 0:
        x_min = Decimal('0')
        plot_negative_width = 0
        zero_line = 0
        plot_positive_width = plot_width - 1
    # All negative
    elif x_max <= 0:
        x_max = Decimal('0')
        plot_negative_width = plot_width - 1
        zero_line = plot_width - 1
        plot_positive_width = 0
    # Mixed signs
    else:
        spread = x_max - x_min
        negative_portion = (x_min.copy_abs() / spread)

        # Subtract one for zero line
        plot_negative_width = int(((plot_width - 1) * negative_portion).to_integral_value())
        zero_line = plot_negative_width
        plot_positive_width = plot_width - (plot_negative_width + 1)

    def project(value):
        if value >= 0:
            return plot_negative_width + int((plot_positive_width * (value / x_max)).to_integral_value())

        return plot_negative_width - int((plot_negative_width * (value / x_min)).to_integral_value())

    # Calculate ticks
    ticks = OrderedDict()

    # First tick
    ticks[0] = x_min

    # Last tick
    ticks[plot_width - 1] = x_max

    tick_fractions = [Decimal('0.25'), Decimal('0.5'), Decimal('0.75')]

    # All positive
    if x_min >= 0:
        for fraction in tick_fractions:
            value = x_max * fraction
            ticks[project(value)] = value
    # All negative
    elif x_max <= 0:
        for fraction in tick_fractions:
            value = x_min * fraction
            ticks[project(value)] = value
    # Mixed signs
    else:
        # Zero tick
        ticks[zero_line] = Decimal('0')

        # Halfway between min and 0
        value = x_min * Decimal('0.5')
        ticks[project(value)] = value

        # Halfway between 0 and max
        value = x_max * Decimal('0.5')
        ticks[project(value)] = value

    decimal_places = utils.max_precision(ticks.values())
    tick_formatter = utils.make_number_formatter(decimal_places)

    ticks_formatted = OrderedDict()

    for k, v in ticks.items():
        ticks_formatted[k] = format_decimal(
            v,
            format=tick_formatter,
            locale=locale
        )

    def write(line):
        output.write(line + '\n')

    # Chart top
    top_line = f'{y_label.ljust(max_label_width)} {x_label.rjust(max_value_width)}'
    write(top_line)

    # Bars
    for i, label in enumerate(formatted_labels):
        value = value_column[i]

        if value == 0 or value is None:
            bar_width = 0
        elif value > 0:
            bar_width = project(value) - plot_negative_width
        elif value < 0:
            bar_width = plot_negative_width - project(value)

        label_text = label.ljust(max_label_width)
        value_text = formatted_values[i].rjust(max_value_width)

        bar = bar_mark * bar_width

        if value is not None and value >= 0:
            gap = (' ' * plot_negative_width)

            # All positive
            if x_min <= 0:
                bar = gap + zero_mark + bar
            else:
                bar = bar + gap + zero_mark
        else:
            bar = ' ' * (plot_negative_width - bar_width) + bar

            # All negative or mixed signs
            if value is None or x_max > value:
                bar = bar + zero_mark

        bar = bar.ljust(plot_width)

        write(f'{label_text} {value_text} {bar}')

    # Axis & ticks
    axis = horizontal_line * plot_width
    tick_text = ' ' * width

    for i, (tick, label) in enumerate(ticks_formatted.items()):
        # First tick
        if tick == 0:
            offset = 0
        # Last tick
        elif tick == plot_width - 1:
            offset = -(len(label) - 1)
        else:
            offset = int(-(len(label) / 2))

        pos = (width - plot_width) + tick + offset

        # Don't print intermediate ticks that would overlap
        if tick != 0 and tick != plot_width - 1:
            if tick_text[pos - 1:pos + len(label) + 1] != ' ' * (len(label) + 2):
                continue

        tick_text = tick_text[:pos] + label + tick_text[pos + len(label):]
        axis = axis[:tick] + tick_mark + axis[tick + 1:]

    write(axis.rjust(width))
    write(tick_text)


# agate-1.9.1/agate/table/print_html.py
import math
import sys

from babel.numbers import format_decimal

from agate import config, utils
from agate.data_types import Number, Text


def print_html(self, max_rows=20, max_columns=6, output=sys.stdout, max_column_width=20, locale=None,
               max_precision=3):
    """
    Print an HTML version of this table.
    :param max_rows:
        The maximum number of rows to display before truncating the data.
        This defaults to :code:`20` to prevent accidental printing of the
        entire table. Pass :code:`None` to disable the limit.
    :param max_columns:
        The maximum number of columns to display before truncating the data.
        This defaults to :code:`6` to prevent wrapping in most cases. Pass
        :code:`None` to disable the limit.
    :param output:
        A file-like object to print to. Defaults to :code:`sys.stdout`,
        unless running in Jupyter. (See above.)
    :param max_column_width:
        Truncate all columns to at most this width. The remainder will be
        replaced with ellipsis.
    :param locale:
        Provide a locale you would like to be used to format the output. By
        default it will use the system's setting.
    :param max_precision:
        Puts a limit on the maximum precision displayed for number types.
        Numbers with lesser precision won't be affected. This defaults to
        :code:`3`. Pass :code:`None` to disable limit.
    """
    if max_rows is None:
        max_rows = len(self._rows)

    if max_columns is None:
        max_columns = len(self._columns)

    if max_precision is None:
        max_precision = float('inf')

    ellipsis = config.get_option('ellipsis_chars')
    truncation = config.get_option('text_truncation_chars')
    len_truncation = len(truncation)
    locale = locale or config.get_option('default_locale')

    rows_truncated = max_rows < len(self._rows)
    columns_truncated = max_columns < len(self._column_names)

    column_names = list(self._column_names[:max_columns])

    if columns_truncated:
        column_names.append(ellipsis)

    number_formatters = []
    formatted_data = []

    # Determine correct number of decimal places for each Number column
    for i, c in enumerate(self._columns):
        if i >= max_columns:
            break

        if isinstance(c.data_type, Number):
            max_places = utils.max_precision(c[:max_rows])
            add_ellipsis = False

            if max_places > max_precision:
                add_ellipsis = True
                max_places = max_precision

            number_formatters.append(utils.make_number_formatter(max_places, add_ellipsis))
        else:
            number_formatters.append(None)

    # Format data
    for i, row in enumerate(self._rows):
        if i >= max_rows:
            break

        formatted_row = []

        for j, v in enumerate(row):
            if j >= max_columns:
                v = ellipsis
            elif v is None:
                v = ''
            elif number_formatters[j] is not None and not math.isinf(v):
                v = format_decimal(
                    v,
                    format=number_formatters[j],
                    locale=locale
                )
            else:
                v = str(v)

            if max_column_width is not None and len(v) > max_column_width:
                v = '%s%s' % (v[:max_column_width - len_truncation], truncation)

            formatted_row.append(v)

            if j >= max_columns:
                break

        formatted_data.append(formatted_row)

    def write(line):
        output.write(line + '\n')

    def write_row(formatted_row):
        """
        Helper function that formats individual rows.
        """
        write('<tr>')

        for j, d in enumerate(formatted_row):
            # Text is left-justified, all other values are right-justified
            if isinstance(self._column_types[j], Text):
                write('<td style="text-align: left">%s</td>' % d)
            else:
                write('<td style="text-align: right">%s</td>' % d)

        write('</tr>')

    # Header
    write('<table>')
    write('<thead>')
    write('<tr>')

    for i, col in enumerate(column_names):
        write('<th>%s</th>' % col)

    write('</tr>')
    write('</thead>')
    write('<tbody>')

    # Rows
    for formatted_row in formatted_data:
        write_row(formatted_row)

    # Row indicating data was truncated
    if rows_truncated:
        write_row([ellipsis for n in column_names])

    # Footer
    write('</tbody>')
    write('</table>')


# agate-1.9.1/agate/table/print_structure.py
import sys

from agate.data_types import Text


def print_structure(self, output=sys.stdout, max_rows=None):
    """
    Print this table's column names and types as a plain-text table.

    :param output:
        The output to print to.
    """
    from agate.table import Table

    name_column = [n for n in self._column_names]
    type_column = [t.__class__.__name__ for t in self._column_types]
    rows = zip(name_column, type_column)
    column_names = ['column', 'data_type']

    text = Text()
    column_types = [text, text]

    table = Table(rows, column_names, column_types)

    return table.print_table(output=output, max_column_width=None, max_rows=max_rows)


# agate-1.9.1/agate/table/print_table.py
import math
import sys

from babel.numbers import format_decimal

from agate import config, utils
from agate.data_types import Number, Text


def print_table(self, max_rows=20, max_columns=6, output=sys.stdout, max_column_width=20, locale=None,
                max_precision=3):
    """
    Print a text-based view of the data in this table.

    The output of this method is GitHub Flavored Markdown (GFM) compatible.

    :param max_rows:
        The maximum number of rows to display before truncating the data.
        This defaults to :code:`20` to prevent accidental printing of the
        entire table. Pass :code:`None` to disable the limit.
    :param max_columns:
        The maximum number of columns to display before truncating the data.
        This defaults to :code:`6` to prevent wrapping in most cases. Pass
        :code:`None` to disable the limit.
    :param output:
        A file-like object to print to.
    :param max_column_width:
        Truncate all columns to at most this width. The remainder will be
        replaced with ellipsis.
    :param locale:
        Provide a locale you would like to be used to format the output. By
        default it will use the system's setting.
    :param max_precision:
        Puts a limit on the maximum precision displayed for number types.
Numbers with lesser precision won't be affected. This defaults to :code:`3`. Pass :code:`None` to disable limit. """ if max_rows is None: max_rows = len(self._rows) if max_columns is None: max_columns = len(self._columns) if max_precision is None: max_precision = float('inf') ellipsis = config.get_option('ellipsis_chars') truncation = config.get_option('text_truncation_chars') len_truncation = len(truncation) h_line = config.get_option('horizontal_line_char') v_line = config.get_option('vertical_line_char') locale = locale or config.get_option('default_locale') rows_truncated = max_rows < len(self._rows) columns_truncated = max_columns < len(self._column_names) column_names = [] for column_name in self.column_names[:max_columns]: if max_column_width is not None and len(column_name) > max_column_width: column_names.append('%s%s' % (column_name[:max_column_width - len_truncation], truncation)) else: column_names.append(column_name) if columns_truncated: column_names.append(ellipsis) widths = [len(n) for n in column_names] number_formatters = [] formatted_data = [] # Determine correct number of decimal places for each Number column for i, c in enumerate(self._columns): if i >= max_columns: break if isinstance(c.data_type, Number): max_places = utils.max_precision(c[:max_rows]) add_ellipsis = False if max_places > max_precision: add_ellipsis = True max_places = max_precision number_formatters.append(utils.make_number_formatter(max_places, add_ellipsis)) else: number_formatters.append(None) # Format data and display column widths for i, row in enumerate(self._rows): if i >= max_rows: break formatted_row = [] for j, v in enumerate(row): if j >= max_columns: v = ellipsis elif v is None: v = '' elif number_formatters[j] is not None and not math.isinf(v): v = format_decimal( v, format=number_formatters[j], locale=locale ) else: v = str(v) if max_column_width is not None and len(v) > max_column_width: v = '%s%s' % (v[:max_column_width - len_truncation], truncation) if len(v) 
> widths[j]: widths[j] = len(v) formatted_row.append(v) if j >= max_columns: break formatted_data.append(formatted_row) def write(line): output.write(line + '\n') def write_row(formatted_row): """ Helper function that formats individual rows. """ row_output = [] for j, d in enumerate(formatted_row): # Text is left-justified, all other values are right-justified if isinstance(self._column_types[j], Text): output = ' %s ' % d.ljust(widths[j]) else: output = ' %s ' % d.rjust(widths[j]) row_output.append(output) text = v_line.join(row_output) write(f'{v_line}{text}{v_line}') divider = '{v_line} {columns} {v_line}'.format( v_line=v_line, columns=' | '.join(h_line * w for w in widths) ) # Headers write_row(column_names) write(divider) # Rows for formatted_row in formatted_data: write_row(formatted_row) # Row indicating data was truncated if rows_truncated: write_row([ellipsis for n in column_names]) agate-1.9.1/agate/table/rename.py000066400000000000000000000045751454111471500166120ustar00rootroot00000000000000from agate import utils def rename(self, column_names=None, row_names=None, slug_columns=False, slug_rows=False, **kwargs): """ Create a copy of this table with different column names or row names. By enabling :code:`slug_columns` or :code:`slug_rows` and not specifying new names you may slugify the table's existing names. :code:`kwargs` will be passed to the slugify method in python-slugify. See: https://github.com/un33k/python-slugify :param column_names: New column names for the renamed table. May be either an array or a dictionary mapping existing column names to new names. If not specified, will use this table's existing column names. :param row_names: New row names for the renamed table. May be either an array or a dictionary mapping existing row names to new names. If not specified, will use this table's existing row names. :param slug_columns: If True, column names will be converted to slugs and duplicate names will have unique identifiers appended. 
    :param slug_rows:
        If True, row names will be converted to slugs and duplicate names
        will have unique identifiers appended.
    """
    from agate.table import Table

    if isinstance(column_names, dict):
        column_names = [column_names[name] if name in column_names else name for name in self._column_names]

    if isinstance(row_names, dict):
        row_names = [row_names[name] if name in row_names else name for name in self._row_names]

    if slug_columns:
        column_names = column_names or self._column_names

        if column_names is not None:
            if column_names == self._column_names:
                column_names = utils.slugify(column_names, ensure_unique=False, **kwargs)
            else:
                column_names = utils.slugify(column_names, ensure_unique=True, **kwargs)

    if slug_rows:
        row_names = row_names or self.row_names

        if row_names is not None:
            row_names = utils.slugify(row_names, ensure_unique=True, **kwargs)

    if column_names is not None and column_names != self._column_names:
        if row_names is None:
            row_names = self._row_names

        return Table(self._rows, column_names, self._column_types, row_names=row_names, _is_fork=False)

    return self._fork(self._rows, column_names, self._column_types, row_names=row_names)

agate-1.9.1/agate/table/scatterplot.py

import leather


def scatterplot(self, x=0, y=1, path=None, width=None, height=None):
    """
    Render a scatterplot using :class:`leather.Chart`.

    :param x:
        The name or index of a column to plot as the x-axis. Defaults to the
        first column in the table.
    :param y:
        The name or index of a column to plot as the y-axis. Defaults to the
        second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
    """
    if type(x) is int:
        x_name = self.column_names[x]
    else:
        x_name = x

    if type(y) is int:
        y_name = self.column_names[y]
    else:
        y_name = y

    chart = leather.Chart()
    chart.add_x_axis(name=x_name)
    chart.add_y_axis(name=y_name)
    chart.add_dots(self, x=x, y=y)

    return chart.to_svg(path=path, width=width, height=height)

agate-1.9.1/agate/table/select.py

from agate import utils
from agate.rows import Row


def select(self, key):
    """
    Create a new table with only the specified columns.

    :param key:
        Either the name of a single column to include or a sequence of such
        names.
    :returns:
        A new :class:`.Table`.
    """
    if not utils.issequence(key):
        key = [key]

    indexes = tuple(self._column_names.index(k) for k in key)
    column_types = tuple(self._column_types[i] for i in indexes)

    new_rows = []

    for row in self._rows:
        new_rows.append(Row((row[i] for i in indexes), key))

    return self._fork(new_rows, key, column_types)

agate-1.9.1/agate/table/to_csv.py

import os


def to_csv(self, path, **kwargs):
    """
    Write this table to a CSV. This method uses agate's builtin CSV writer,
    which supports unicode on both Python 2 and Python 3.

    ``kwargs`` will be passed through to the CSV writer.

    The ``lineterminator`` defaults to the newline character (LF, ``\\n``).

    :param path:
        Filepath or file-like object to write to.
    """
    from agate import csv

    if 'lineterminator' not in kwargs:
        kwargs['lineterminator'] = '\n'

    close = True
    f = None

    try:
        if hasattr(path, 'write'):
            f = path
            close = False
        else:
            dirpath = os.path.dirname(path)

            if dirpath and not os.path.exists(dirpath):
                os.makedirs(dirpath)

            f = open(path, 'w')

        writer = csv.writer(f, **kwargs)
        writer.writerow(self._column_names)

        csv_funcs = [c.csvify for c in self._column_types]

        for row in self._rows:
            writer.writerow(tuple(csv_funcs[i](d) for i, d in enumerate(row)))
    finally:
        if close and f is not None:
            f.close()

agate-1.9.1/agate/table/to_json.py

import json
import os
from collections import OrderedDict


def to_json(self, path, key=None, newline=False, indent=None, **kwargs):
    """
    Write this table to a JSON file or file-like object.

    :code:`kwargs` will be passed through to the JSON encoder.

    :param path:
        File path or file-like object to write to.
    :param key:
        If specified, JSON will be output as a hash instead of a list. May
        be either the name of a column from this table containing unique
        values or a :class:`function` that takes a row and returns a unique
        value.
    :param newline:
        If `True`, output will be in the form of "newline-delimited JSON".
    :param indent:
        If specified, the number of spaces to indent the JSON for
        formatting.
    """
    if key is not None and newline:
        raise ValueError('key and newline may not be specified together.')

    if newline and indent is not None:
        raise ValueError('newline and indent may not be specified together.')

    key_is_row_function = hasattr(key, '__call__')

    json_kwargs = {
        'ensure_ascii': False,
        'indent': indent
    }

    # Pass remaining kwargs through to JSON encoder
    json_kwargs.update(kwargs)

    json_funcs = [c.jsonify for c in self._column_types]

    close = True
    f = None

    try:
        if hasattr(path, 'write'):
            f = path
            close = False
        else:
            if os.path.dirname(path) and not os.path.exists(os.path.dirname(path)):
                os.makedirs(os.path.dirname(path))

            f = open(path, 'w')

        def dump_json(data):
            json.dump(data, f, **json_kwargs)

            if newline:
                f.write('\n')

        # Keyed
        if key is not None:
            output = OrderedDict()

            for row in self._rows:
                if key_is_row_function:
                    k = key(row)
                else:
                    k = str(row[key])

                if k in output:
                    raise ValueError('Value %s is not unique in the key column.' % str(k))

                values = tuple(json_funcs[i](d) for i, d in enumerate(row))
                output[k] = OrderedDict(zip(row.keys(), values))

            dump_json(output)
        # Newline-delimited
        elif newline:
            for row in self._rows:
                values = tuple(json_funcs[i](d) for i, d in enumerate(row))
                dump_json(OrderedDict(zip(row.keys(), values)))
        # Normal
        else:
            output = []

            for row in self._rows:
                values = tuple(json_funcs[i](d) for i, d in enumerate(row))
                output.append(OrderedDict(zip(row.keys(), values)))

            dump_json(output)
    finally:
        if close and f is not None:
            f.close()

agate-1.9.1/agate/table/where.py

def where(self, test):
    """
    Create a new :class:`.Table` with only those rows that pass a test.

    :param test:
        A function that takes a :class:`.Row` and returns :code:`True` if
        it should be included in the new :class:`.Table`.
    :type test:
        :class:`function`
    :returns:
        A new :class:`.Table`.
    """
    rows = []

    if self._row_names is not None:
        row_names = []
    else:
        row_names = None

    for i, row in enumerate(self._rows):
        if test(row):
            rows.append(row)

            if row_names is not None:
                row_names.append(self._row_names[i])

    return self._fork(rows, row_names=row_names)

agate-1.9.1/agate/tableset/

agate-1.9.1/agate/tableset/__init__.py

"""
The :class:`.TableSet` class collects a set of related tables in a single data
structure. The most common way of creating a :class:`.TableSet` is using the
:meth:`.Table.group_by` method, which is similar to SQL's ``GROUP BY`` keyword.
The resulting set of tables will all have an identical column structure.

:class:`.TableSet` functions as a dictionary. Individual tables in the set can
be accessed by using their name as a key. If the table set was created using
:meth:`.Table.group_by` then the names of the tables will be the grouping
factors found in the original data.

:class:`.TableSet` replicates the majority of the features of :class:`.Table`.
When methods such as :meth:`.TableSet.select`, :meth:`.TableSet.where` or
:meth:`.TableSet.order_by` are used, the operation is applied to *each* table
in the set and the result is a new :class:`TableSet` instance made up of
entirely new :class:`.Table` instances.

:class:`.TableSet` instances can also contain other TableSets. This means you
can chain calls to :meth:`.Table.group_by` and :meth:`.TableSet.group_by` and
end up with data grouped across multiple dimensions.
:meth:`.TableSet.aggregate` on nested TableSets will then group across multiple
dimensions.
"""

from io import StringIO
from itertools import zip_longest

from agate.data_types import Text
from agate.mapped_sequence import MappedSequence


class TableSet(MappedSequence):
    """
    A group of named tables with identical column definitions.
    Supports (almost) all the same operations as :class:`.Table`. When
    executed on a :class:`TableSet`, any operation that would have returned a
    new :class:`.Table` instead returns a new :class:`TableSet`. Any operation
    that would have returned a single value instead returns a dictionary of
    values.

    TableSet is implemented as a subclass of :class:`.MappedSequence`.

    :param tables:
        A sequence of :class:`Table` instances.
    :param keys:
        A sequence of keys corresponding to the tables. These may be any type
        except :class:`int`.
    :param key_name:
        A name that describes the grouping properties. Used as the column
        header when the groups are aggregated. Defaults to the column name
        that was grouped on.
    :param key_type:
        An instance of some subclass of :class:`.DataType`. If not provided
        it will default to a :class:`.Text`.
    :param _is_fork:
        Used internally to skip certain validation steps when data is
        propagated from an existing tableset.
    """
    def __init__(self, tables, keys, key_name='group', key_type=None, _is_fork=False):
        tables = tuple(tables)
        keys = tuple(keys)

        self._key_name = key_name
        self._key_type = key_type or Text()

        self._sample_table = tables[0]

        while isinstance(self._sample_table, TableSet):
            self._sample_table = self._sample_table[0]

        self._column_types = self._sample_table.column_types
        self._column_names = self._sample_table.column_names

        if not _is_fork:
            for table in tables:
                if any(not isinstance(a, type(b)) for a, b in zip_longest(table.column_types, self._column_types)):
                    raise ValueError('Not all tables have the same column types!')

                if table.column_names != self._column_names:
                    raise ValueError('Not all tables have the same column names!')

        MappedSequence.__init__(self, tables, keys)

    def __str__(self):
        """
        Print the tableset's structure via :meth:`TableSet.print_structure`.
        """
        structure = StringIO()

        self.print_structure(output=structure)

        return structure.getvalue()

    @property
    def key_name(self):
        """
        Get the name of the key this TableSet is grouped by. (If created using
        :meth:`.Table.group_by` then this is the original column name.)
        """
        return self._key_name

    @property
    def key_type(self):
        """
        Get the :class:`.DataType` this TableSet is grouped by. (If created
        using :meth:`.Table.group_by` then this is the original column type.)
        """
        return self._key_type

    @property
    def column_types(self):
        """
        Get an ordered list of this :class:`.TableSet`'s column types.

        :returns:
            A :class:`tuple` of :class:`.DataType` instances.
        """
        return self._column_types

    @property
    def column_names(self):
        """
        Get an ordered list of this :class:`TableSet`'s column names.

        :returns:
            A :class:`tuple` of strings.
        """
        return self._column_names

    def _fork(self, tables, keys, key_name=None, key_type=None):
        """
        Create a new :class:`.TableSet` using the metadata from this one.

        This method is used internally by functions like
        :meth:`.TableSet.having`.
        """
        if key_name is None:
            key_name = self._key_name

        if key_type is None:
            key_type = self._key_type

        return TableSet(tables, keys, key_name, key_type, _is_fork=True)

    def _proxy(self, method_name, *args, **kwargs):
        """
        Calls a method on each table in this :class:`.TableSet`.
        """
        tables = []

        for key, table in self.items():
            tables.append(getattr(table, method_name)(*args, **kwargs))

        return self._fork(
            tables,
            self.keys()
        )


from agate.tableset.aggregate import aggregate
from agate.tableset.bar_chart import bar_chart
from agate.tableset.column_chart import column_chart
from agate.tableset.from_csv import from_csv
from agate.tableset.from_json import from_json
from agate.tableset.having import having
from agate.tableset.line_chart import line_chart
from agate.tableset.merge import merge
from agate.tableset.print_structure import print_structure
from agate.tableset.proxy_methods import (bins, compute, denormalize, distinct, exclude, find, group_by, homogenize,
                                          join, limit, normalize, order_by, pivot, select, where)
from agate.tableset.scatterplot import scatterplot
from agate.tableset.to_csv import to_csv
from agate.tableset.to_json import to_json

TableSet.aggregate = aggregate
TableSet.bar_chart = bar_chart
TableSet.bins = bins
TableSet.column_chart = column_chart
TableSet.compute = compute
TableSet.denormalize = denormalize
TableSet.distinct = distinct
TableSet.exclude = exclude
TableSet.find = find
TableSet.from_csv = from_csv
TableSet.from_json = from_json
TableSet.group_by = group_by
TableSet.having = having
TableSet.homogenize = homogenize
TableSet.join = join
TableSet.limit = limit
TableSet.line_chart = line_chart
TableSet.merge = merge
TableSet.normalize = normalize
TableSet.order_by = order_by
TableSet.pivot = pivot
TableSet.print_structure = print_structure
TableSet.scatterplot = scatterplot
TableSet.select = select
TableSet.to_csv = to_csv
TableSet.to_json = to_json
TableSet.where = where

agate-1.9.1/agate/tableset/aggregate.py

from agate.table import Table


def _aggregate(self, aggregations=[]):
    """
    Recursive aggregation allowing for TableSets to be nested inside
    one another.
    """
    from agate.tableset import TableSet

    output = []

    # Process nested TableSets
    if isinstance(self._values[0], TableSet):
        for key, nested_tableset in self.items():
            column_names, column_types, nested_output, row_name_columns = _aggregate(nested_tableset, aggregations)

            for row in nested_output:
                row.insert(0, key)
                output.append(row)

        column_names.insert(0, self._key_name)
        column_types.insert(0, self._key_type)
        row_name_columns.insert(0, self._key_name)
    # Regular Tables
    else:
        column_names = [self._key_name]
        column_types = [self._key_type]
        row_name_columns = [self._key_name]

        for new_column_name, aggregation in aggregations:
            column_names.append(new_column_name)
            column_types.append(aggregation.get_aggregate_data_type(self._sample_table))

        for name, table in self.items():
            for new_column_name, aggregation in aggregations:
                aggregation.validate(table)

        for name, table in self.items():
            new_row = [name]

            for new_column_name, aggregation in aggregations:
                new_row.append(aggregation.run(table))

            output.append(new_row)

    return column_names, column_types, output, row_name_columns


def aggregate(self, aggregations):
    """
    Aggregate data from the tables in this set by performing some set of
    column operations on the groups and coalescing the results into a new
    :class:`.Table`.

    :code:`aggregations` must be a sequence of tuples, where each has two
    parts: a :code:`new_column_name` and a :class:`.Aggregation` instance.

    The resulting table will have the keys from this :class:`TableSet` (and
    any nested TableSets) set as its :code:`row_names`. See
    :meth:`.Table.__init__` for more details.

    :param aggregations:
        A list of tuples in the format
        :code:`(new_column_name, aggregation)`, where each
        :code:`aggregation` is an instance of :class:`.Aggregation`.
    :returns:
        A new :class:`.Table`.
    """
    column_names, column_types, output, row_name_columns = _aggregate(self, aggregations)

    if len(row_name_columns) == 1:
        row_names = row_name_columns[0]
    else:
        def row_names(r):
            return tuple(r[n] for n in row_name_columns)

    return Table(output, column_names, column_types, row_names=row_names)

agate-1.9.1/agate/tableset/bar_chart.py

import leather


def bar_chart(self, label=0, value=1, path=None, width=None, height=None):
    """
    Render a lattice/grid of bar charts using :class:`leather.Lattice`.

    :param label:
        The name or index of a column to plot as the labels of the chart.
        Defaults to the first column in the table.
    :param value:
        The name or index of a column to plot as the values of the chart.
        Defaults to the second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
    """
    if type(label) is int:
        label_name = self.column_names[label]
    else:
        label_name = label

    if type(value) is int:
        value_name = self.column_names[value]
    else:
        value_name = value

    chart = leather.Lattice(shape=leather.Bars())
    chart.add_x_axis(name=value_name)
    chart.add_y_axis(name=label_name)
    chart.add_many(self.values(), x=value, y=label, titles=self.keys())

    return chart.to_svg(path=path, width=width, height=height)

agate-1.9.1/agate/tableset/column_chart.py

import leather


def column_chart(self, label=0, value=1, path=None, width=None, height=None):
    """
    Render a lattice/grid of column charts using :class:`leather.Lattice`.

    :param label:
        The name or index of a column to plot as the labels of the chart.
        Defaults to the first column in the table.
    :param value:
        The name or index of a column to plot as the values of the chart.
        Defaults to the second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
    """
    if type(label) is int:
        label_name = self.column_names[label]
    else:
        label_name = label

    if type(value) is int:
        value_name = self.column_names[value]
    else:
        value_name = value

    chart = leather.Lattice(shape=leather.Columns())
    chart.add_x_axis(name=label_name)
    chart.add_y_axis(name=value_name)
    chart.add_many(self.values(), x=label, y=value, titles=self.keys())

    return chart.to_svg(path=path, width=width, height=height)

agate-1.9.1/agate/tableset/from_csv.py

import os
from collections import OrderedDict
from glob import glob

from agate.table import Table


@classmethod
def from_csv(cls, dir_path, column_names=None, column_types=None, row_names=None, header=True, **kwargs):
    """
    Create a new :class:`TableSet` from a directory of CSVs.

    See :meth:`.Table.from_csv` for additional details.

    :param dir_path:
        Path to a directory full of CSV files. All CSV files in this
        directory will be loaded.
    :param column_names:
        See :meth:`Table.__init__`.
    :param column_types:
        See :meth:`Table.__init__`.
    :param row_names:
        See :meth:`Table.__init__`.
    :param header:
        See :meth:`Table.from_csv`.
    """
    from agate.tableset import TableSet

    if not os.path.isdir(dir_path):
        raise OSError('Specified path doesn\'t exist or isn\'t a directory.')

    tables = OrderedDict()

    for path in glob(os.path.join(dir_path, '*.csv')):
        name = os.path.split(path)[1].strip('.csv')

        tables[name] = Table.from_csv(path, column_names, column_types, row_names=row_names, header=header, **kwargs)

    return TableSet(tables.values(), tables.keys())

agate-1.9.1/agate/tableset/from_json.py

import json
import os
from collections import OrderedDict
from decimal import Decimal
from glob import glob

from agate.table import Table


@classmethod
def from_json(cls, path, column_names=None, column_types=None, keys=None, **kwargs):
    """
    Create a new :class:`TableSet` from a directory of JSON files or a
    single JSON object with key value (Table key and list of row objects)
    pairs for each :class:`Table`.

    See :meth:`.Table.from_json` for additional details.

    :param path:
        Path to a directory containing JSON files or filepath/file-like
        object of nested JSON file.
    :param keys:
        A list of keys of the top-level dictionaries for each file. If
        specified, length must be equal to number of JSON files in path.
    :param column_types:
        See :meth:`Table.__init__`.
    """
    from agate.tableset import TableSet

    if isinstance(path, str) and not os.path.isdir(path) and not os.path.isfile(path):
        raise OSError('Specified path doesn\'t exist.')

    tables = OrderedDict()

    if isinstance(path, str) and os.path.isdir(path):
        filepaths = glob(os.path.join(path, '*.json'))

        if keys is not None and len(keys) != len(filepaths):
            raise ValueError('If specified, keys must have length equal to number of JSON files')

        for i, filepath in enumerate(filepaths):
            name = os.path.split(filepath)[1].strip('.json')

            if keys is not None:
                tables[name] = Table.from_json(filepath, keys[i], column_types=column_types, **kwargs)
            else:
                tables[name] = Table.from_json(filepath, column_types=column_types, **kwargs)
    else:
        if hasattr(path, 'read'):
            js = json.load(path, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs)
        else:
            with open(path) as f:
                js = json.load(f, object_pairs_hook=OrderedDict, parse_float=Decimal, **kwargs)

        for key, value in js.items():
            tables[key] = Table.from_object(value, column_types=column_types, **kwargs)

    return TableSet(tables.values(), tables.keys())

agate-1.9.1/agate/tableset/having.py

def having(self, aggregations, test):
    """
    Create a new :class:`.TableSet` with only those tables that pass a test.

    This works by applying a sequence of :class:`Aggregation` instances to
    each table. The resulting dictionary of properties is then passed to
    the :code:`test` function.

    This method does not modify the underlying tables in any way.

    :param aggregations:
        A list of tuples in the format :code:`(name, aggregation)`, where
        each :code:`aggregation` is an instance of :class:`.Aggregation`.
    :param test:
        A function that takes a dictionary of aggregated properties and
        returns :code:`True` if it should be included in the new
        :class:`.TableSet`.
    :type test:
        :class:`function`
    :returns:
        A new :class:`.TableSet`.
    """
    new_tables = []
    new_keys = []

    for key, table in self.items():
        props = table.aggregate(aggregations)

        if test(props):
            new_tables.append(table)
            new_keys.append(key)

    return self._fork(new_tables, new_keys)

agate-1.9.1/agate/tableset/line_chart.py

import leather


def line_chart(self, x=0, y=1, path=None, width=None, height=None):
    """
    Render a lattice/grid of line charts using :class:`leather.Lattice`.

    :param x:
        The name or index of a column to plot as the x axis of the chart.
        Defaults to the first column in the table.
    :param y:
        The name or index of a column to plot as the y axis of the chart.
        Defaults to the second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
    """
    if type(x) is int:
        x_name = self.column_names[x]
    else:
        x_name = x

    if type(y) is int:
        y_name = self.column_names[y]
    else:
        y_name = y

    chart = leather.Lattice(shape=leather.Line())
    chart.add_x_axis(name=x_name)
    chart.add_y_axis(name=y_name)
    chart.add_many(self.values(), x=x, y=y, titles=self.keys())

    return chart.to_svg(path=path, width=width, height=height)

agate-1.9.1/agate/tableset/merge.py

from agate.rows import Row
from agate.table import Table


def merge(self, groups=None, group_name=None, group_type=None):
    """
    Convert this TableSet into a single table. This is the inverse of
    :meth:`.Table.group_by`.

    Any `row_names` set on the merged tables will be lost in this process.

    :param groups:
        A list of grouping factors to add to merged rows in a new column.
        If specified, it should have exactly one element per :class:`Table`
        in the :class:`TableSet`. If not specified or None, the grouping
        factor will be the name of the :class:`Row`'s original Table.
    :param group_name:
        This will be the column name of the grouping factors. If None,
        defaults to the :attr:`TableSet.key_name`.
    :param group_type:
        This will be the column type of the grouping factors. If None,
        defaults to the :attr:`TableSet.key_type`.
    :returns:
        A new :class:`Table`.
    """
    if type(groups) is not list and groups is not None:
        raise ValueError('Groups must be None or a list.')

    if type(groups) is list and len(groups) != len(self):
        raise ValueError('Groups length must be equal to TableSet length.')

    column_names = list(self._column_names)
    column_types = list(self._column_types)

    column_names.insert(0, group_name if group_name else self._key_name)
    column_types.insert(0, group_type if group_type else self._key_type)

    rows = []

    for index, (key, table) in enumerate(self.items()):
        for row in table._rows:
            if groups is None:
                rows.append(Row((key,) + tuple(row), column_names))
            else:
                rows.append(Row((groups[index],) + tuple(row), column_names))

    return Table(rows, column_names, column_types)

agate-1.9.1/agate/tableset/print_structure.py

import sys

from agate.data_types import Text
from agate.table import Table


def print_structure(self, max_rows=20, output=sys.stdout):
    """
    Print the keys and row counts of each table in the tableset.

    :param max_rows:
        The maximum number of rows to display before truncating the data.
        Defaults to 20.
    :param output:
        The output used to print the structure of the :class:`Table`.
    :returns:
        None
    """
    max_length = min(len(self.items()), max_rows)

    name_column = self.keys()[0:max_length]
    type_column = [str(len(table.rows)) for key, table in self.items()[0:max_length]]
    rows = zip(name_column, type_column)

    column_names = ['table', 'rows']
    text = Text()
    column_types = [text, text]

    table = Table(rows, column_names, column_types)

    return table.print_table(output=output, max_column_width=None)

agate-1.9.1/agate/tableset/proxy_methods.py

def bins(self, *args, **kwargs):
    """
    Calls :meth:`.Table.bins` on each table in the TableSet.
    """
    return self._proxy('bins', *args, **kwargs)


def compute(self, *args, **kwargs):
    """
    Calls :meth:`.Table.compute` on each table in the TableSet.
    """
    return self._proxy('compute', *args, **kwargs)


def denormalize(self, *args, **kwargs):
    """
    Calls :meth:`.Table.denormalize` on each table in the TableSet.
    """
    return self._proxy('denormalize', *args, **kwargs)


def distinct(self, *args, **kwargs):
    """
    Calls :meth:`.Table.distinct` on each table in the TableSet.
    """
    return self._proxy('distinct', *args, **kwargs)


def exclude(self, *args, **kwargs):
    """
    Calls :meth:`.Table.exclude` on each table in the TableSet.
    """
    return self._proxy('exclude', *args, **kwargs)


def find(self, *args, **kwargs):
    """
    Calls :meth:`.Table.find` on each table in the TableSet.
    """
    return self._proxy('find', *args, **kwargs)


def group_by(self, *args, **kwargs):
    """
    Calls :meth:`.Table.group_by` on each table in the TableSet.
    """
    return self._proxy('group_by', *args, **kwargs)


def homogenize(self, *args, **kwargs):
    """
    Calls :meth:`.Table.homogenize` on each table in the TableSet.
    """
    return self._proxy('homogenize', *args, **kwargs)


def join(self, *args, **kwargs):
    """
    Calls :meth:`.Table.join` on each table in the TableSet.
    """
    return self._proxy('join', *args, **kwargs)


def limit(self, *args, **kwargs):
    """
    Calls :meth:`.Table.limit` on each table in the TableSet.
    """
    return self._proxy('limit', *args, **kwargs)


def normalize(self, *args, **kwargs):
    """
    Calls :meth:`.Table.normalize` on each table in the TableSet.
    """
    return self._proxy('normalize', *args, **kwargs)


def order_by(self, *args, **kwargs):
    """
    Calls :meth:`.Table.order_by` on each table in the TableSet.
    """
    return self._proxy('order_by', *args, **kwargs)


def pivot(self, *args, **kwargs):
    """
    Calls :meth:`.Table.pivot` on each table in the TableSet.
    """
    return self._proxy('pivot', *args, **kwargs)


def select(self, *args, **kwargs):
    """
    Calls :meth:`.Table.select` on each table in the TableSet.
    """
    return self._proxy('select', *args, **kwargs)


def where(self, *args, **kwargs):
    """
    Calls :meth:`.Table.where` on each table in the TableSet.
    """
    return self._proxy('where', *args, **kwargs)

agate-1.9.1/agate/tableset/scatterplot.py

import leather


def scatterplot(self, x=0, y=1, path=None, width=None, height=None):
    """
    Render a lattice/grid of scatterplots using :class:`leather.Lattice`.

    :param x:
        The name or index of a column to plot as the x axis of the chart.
        Defaults to the first column in the table.
    :param y:
        The name or index of a column to plot as the y axis of the chart.
        Defaults to the second column in the table.
    :param path:
        If specified, the resulting SVG will be saved to this location. If
        :code:`None` and running in IPython, then the SVG will be rendered
        inline. Otherwise, the SVG data will be returned as a string.
    :param width:
        The width of the output SVG.
    :param height:
        The height of the output SVG.
""" if type(x) is int: x_name = self.column_names[x] else: x_name = x if type(y) is int: y_name = self.column_names[y] else: y_name = y chart = leather.Lattice(shape=leather.Dots()) chart.add_x_axis(name=x_name) chart.add_y_axis(name=y_name) chart.add_many(self.values(), x=x, y=y, titles=self.keys()) return chart.to_svg(path=path, width=width, height=height) agate-1.9.1/agate/tableset/to_csv.py000066400000000000000000000007331454111471500173440ustar00rootroot00000000000000import os def to_csv(self, dir_path, **kwargs): """ Write each table in this set to a separate CSV in a given directory. See :meth:`.Table.to_csv` for additional details. :param dir_path: Path to the directory to write the CSV files to. """ if not os.path.exists(dir_path): os.makedirs(dir_path) for name, table in self.items(): path = os.path.join(dir_path, '%s.csv' % name) table.to_csv(path, **kwargs) agate-1.9.1/agate/tableset/to_json.py000066400000000000000000000034031454111471500175170ustar00rootroot00000000000000import json import os from collections import OrderedDict from io import StringIO def to_json(self, path, nested=False, indent=None, **kwargs): """ Write :class:`TableSet` to either a set of JSON files for each table or a single nested JSON file. See :meth:`.Table.to_json` for additional details. :param path: Path to the directory to write the JSON file(s) to. If nested is `True`, this should be a file path or file-like object to write to. :param nested: If `True`, the output will be a single nested JSON file with each Table's key paired with a list of row objects. Otherwise, the output will be a set of files for each table. Defaults to `False`. :param indent: See :meth:`Table.to_json`. 
""" if not nested: if not os.path.exists(path): os.makedirs(path) for name, table in self.items(): filepath = os.path.join(path, '%s.json' % name) table.to_json(filepath, indent=indent, **kwargs) else: close = True tableset_dict = OrderedDict() for name, table in self.items(): output = StringIO() table.to_json(output, **kwargs) tableset_dict[name] = json.loads(output.getvalue(), object_pairs_hook=OrderedDict) if hasattr(path, 'write'): f = path close = False else: dirpath = os.path.dirname(path) if dirpath and not os.path.exists(dirpath): os.makedirs(dirpath) f = open(path, 'w') json_kwargs = {'ensure_ascii': False, 'indent': indent} json_kwargs.update(kwargs) json.dump(tableset_dict, f, **json_kwargs) if close and f is not None: f.close() agate-1.9.1/agate/testcase.py000066400000000000000000000035351454111471500160620ustar00rootroot00000000000000import unittest import agate class AgateTestCase(unittest.TestCase): """ Unittest case for quickly asserting logic about tables. """ def assertColumnNames(self, table, names): """ Verify the column names in the given table match what is expected. """ self.assertIsInstance(table, agate.Table) self.assertSequenceEqual(table.column_names, names) self.assertSequenceEqual( [c.name for c in table.columns], names ) for row in table.rows: self.assertSequenceEqual( row.keys(), names ) def assertColumnTypes(self, table, types): """ Verify the column types in the given table are of the expected types. """ self.assertIsInstance(table, agate.Table) table_types = table.column_types column_types = [c.data_type for c in table.columns] for i, test_type in enumerate(types): self.assertIsInstance(table_types[i], test_type) self.assertIsInstance(column_types[i], test_type) def assertRows(self, table, rows): """ Verify the row data in the given table match what is expected. 
""" self.assertIsInstance(table, agate.Table) for i, row in enumerate(rows): self.assertSequenceEqual(table.rows[i], row) def assertRowNames(self, table, names): """ Verify the row names in the given table match what is expected. """ self.assertIsInstance(table, agate.Table) self.assertSequenceEqual(table.row_names, names) self.assertSequenceEqual( table.rows.keys(), names ) for column in table.columns: self.assertSequenceEqual( column.keys(), names ) agate-1.9.1/agate/type_tester.py000066400000000000000000000112651454111471500166150ustar00rootroot00000000000000import warnings from copy import copy from agate.data_types.base import DEFAULT_NULL_VALUES from agate.data_types.boolean import Boolean from agate.data_types.date import Date from agate.data_types.date_time import DateTime from agate.data_types.number import Number from agate.data_types.text import Text from agate.data_types.time_delta import TimeDelta class TypeTester: """ Control how data types are inferred for columns in a given set of data. This class is used by passing it to the :code:`column_types` argument of the :class:`.Table` constructor, or the same argument for any other method that create a :class:`.Table` Type inference can be a slow process. To limit the number of rows of data to be tested, pass the :code:`limit` argument. Note that may cause errors if your data contains different types of values after the specified number of rows. By default, data types will be tested against each column in this order: 1. :class:`.Boolean` 2. :class:`.Number` 3. :class:`.TimeDelta` #. :class:`.Date` #. :class:`.DateTime` #. :class:`.Text` Individual types may be specified using the :code:`force` argument. The type order by be changed, or entire types disabled, by using the :code:`types` argument. Beware that changing the order of the types may cause unexpected behavior. :param force: A dictionary where each key is a column name and each value is a :class:`.DataType` instance that overrides inference. 
:param limit: An optional limit on how many rows to evaluate before selecting the most likely type. Note that applying a limit may mean errors arise when the data is cast--if the guess is proved incorrect in further rows of data. :param types: A sequence of possible types to test against. This can be used to specify what data formats you want to test against. For instance, you may want to exclude :class:`TimeDelta` from testing. It can also be used to pass options such as ``locale`` to :class:`.Number` or ``cast_nulls`` to :class:`.Text`. Take care in specifying the order of the list. It is the order they are tested in. :class:`.Text` should always be last. :param null_values: If :code:`types` is :code:`None`, a sequence of values which should be cast to :code:`None` when encountered by the default data types. """ def __init__(self, force={}, limit=None, types=None, null_values=DEFAULT_NULL_VALUES): self._force = force self._limit = limit if types: self._possible_types = types else: # In order of preference self._possible_types = [ Boolean(null_values=null_values), Number(null_values=null_values), TimeDelta(null_values=null_values), Date(null_values=null_values), DateTime(null_values=null_values), Text(null_values=null_values) ] def run(self, rows, column_names): """ Apply type inference to the provided data and return an array of column types. :param rows: The data as a sequence of any sequences: tuples, lists, etc. """ num_columns = len(column_names) hypotheses = [set(self._possible_types) for i in range(num_columns)] force_indices = [] for name in self._force.keys(): try: force_indices.append(column_names.index(name)) except ValueError: warnings.warn('"%s" does not match the name of any column in this table.'
% name, RuntimeWarning) if self._limit: sample_rows = rows[:self._limit] elif self._limit == 0: text = Text() return tuple([text] * num_columns) else: sample_rows = rows for row in sample_rows: for i in range(num_columns): if i in force_indices: continue h = hypotheses[i] if len(h) == 1: continue for column_type in copy(h): if len(row) > i and not column_type.test(row[i]): h.remove(column_type) column_types = [] for i in range(num_columns): if i in force_indices: column_types.append(self._force[column_names[i]]) continue h = hypotheses[i] # Select in preference order for t in self._possible_types: if t in h: column_types.append(t) break return tuple(column_types) agate-1.9.1/agate/utils.py000066400000000000000000000210501454111471500153770ustar00rootroot00000000000000""" This module contains a collection of utility classes and functions used in agate. """ import math import string from collections import OrderedDict from collections.abc import Sequence from decimal import ROUND_CEILING, ROUND_FLOOR, Decimal, getcontext from functools import wraps from slugify import slugify as pslugify from agate import config from agate.warns import warn_duplicate_column, warn_unnamed_column #: Sentinel for use when `None` is a valid argument value default = object() def memoize(func): """ Dead-simple memoize decorator for instance methods that take no arguments. This is especially useful since so many of our classes are immutable. """ memo = None @wraps(func) def wrapper(self): nonlocal memo if memo is not None: return memo memo = func(self) return memo return wrapper class NullOrder: """ Dummy object used for sorting in place of None. Sorts as "greater than everything but other nulls." """ def __lt__(self, other): return False def __gt__(self, other): if other is None: return False return True class Quantiles(Sequence): """ A class representing quantiles (percentiles, quartiles, etc.) for a given column of Number data.
""" def __init__(self, quantiles): self._quantiles = quantiles def __getitem__(self, i): return self._quantiles.__getitem__(i) def __iter__(self): return self._quantiles.__iter__() def __len__(self): return self._quantiles.__len__() def __repr__(self): return repr(self._quantiles) def __eq__(self, other): return self._quantiles == other._quantiles def locate(self, value): """ Identify which quantile a given value is part of. """ i = 0 if value < self._quantiles[0]: raise ValueError('Value is less than minimum quantile value.') if value > self._quantiles[-1]: raise ValueError('Value is greater than maximum quantile value.') if value == self._quantiles[-1]: return Decimal(len(self._quantiles) - 1) while value >= self._quantiles[i + 1]: i += 1 return Decimal(i) def median(data_sorted): """ Finds the median value of a given series of values. :param data_sorted: The values to find the median of. Must be sorted. """ length = len(data_sorted) if length % 2 == 1: return data_sorted[((length + 1) // 2) - 1] half = length // 2 a = data_sorted[half - 1] b = data_sorted[half] return (a + b) / 2 def max_precision(values): """ Given a series of values (such as a :class:`.Column`) returns the most significant decimal places present in any value. :param values: The values to analyze. """ max_whole_places = 1 max_decimal_places = 0 precision = getcontext().prec for value in values: if value is None or math.isnan(value) or math.isinf(value): continue sign, digits, exponent = value.normalize().as_tuple() exponent_places = exponent * -1 whole_places = len(digits) - exponent_places if whole_places > max_whole_places: max_whole_places = whole_places if exponent_places > max_decimal_places: max_decimal_places = exponent_places # In Python 2 it was possible for the total digits to exceed the # available context precision. This ensures that can't happen. 
See #412 if max_whole_places + max_decimal_places > precision: # pragma: no cover max_decimal_places = precision - max_whole_places return max_decimal_places def make_number_formatter(decimal_places, add_ellipsis=False): """ Given a number of decimal places creates a formatting string that will display numbers with that precision. :param decimal_places: The number of decimal places :param add_ellipsis: Optionally add an ellipsis symbol at the end of a number """ fraction = '0' * decimal_places ellipsis = config.get_option('number_truncation_chars') if add_ellipsis else '' return ''.join(['#,##0.', fraction, ellipsis, ';-#,##0.', fraction, ellipsis]) def round_limits(minimum, maximum): """ Rounds a pair of minimum and maximum values to form reasonable "round" values suitable for use as axis minimum and maximum values. Values are rounded "out": up for maximum and down for minimum, and "off": to one higher than the first significant digit shared by both. See unit tests for examples. """ min_bits = minimum.normalize().as_tuple() max_bits = maximum.normalize().as_tuple() max_digits = max( len(min_bits.digits) + min_bits.exponent, len(max_bits.digits) + max_bits.exponent ) # Whole number rounding if max_digits > 0: multiplier = Decimal('10') ** (max_digits - 1) min_fraction = (minimum / multiplier).to_integral_value(rounding=ROUND_FLOOR) max_fraction = (maximum / multiplier).to_integral_value(rounding=ROUND_CEILING) return ( min_fraction * multiplier, max_fraction * multiplier ) max_exponent = max(min_bits.exponent, max_bits.exponent) # Fractional rounding q = Decimal('10') ** (max_exponent + 1) return ( minimum.quantize(q, rounding=ROUND_FLOOR).normalize(), maximum.quantize(q, rounding=ROUND_CEILING).normalize() ) def letter_name(index): """ Given a column index, assign a "letter" column name equivalent to Excel. For example, index ``4`` would return ``E``. Index ``30`` would return ``EE``. 
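The `median` helper in `utils.py` above assumes its input is already sorted and branches on odd versus even length. Reproduced here with a quick check of both branches:

```python
# The median helper from utils.py above, checked on odd- and even-length
# input. The input must already be sorted, as its docstring notes.
def median(data_sorted):
    length = len(data_sorted)

    if length % 2 == 1:
        # Odd length: return the single middle element.
        return data_sorted[((length + 1) // 2) - 1]

    # Even length: average the two middle elements.
    half = length // 2
    a = data_sorted[half - 1]
    b = data_sorted[half]
    return (a + b) / 2

print(median([1, 2, 3]))     # 2
print(median([1, 2, 3, 4]))  # 2.5
```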
""" letters = string.ascii_lowercase count = len(letters) return letters[index % count] * ((index // count) + 1) def parse_object(obj, path=''): """ Recursively parse JSON-like Python objects as a dictionary of paths/keys and values. Inspired by JSONPipe (https://github.com/dvxhouse/jsonpipe). """ if isinstance(obj, dict): iterator = obj.items() elif isinstance(obj, (list, tuple)): iterator = enumerate(obj) else: return {path.strip('/'): obj} d = OrderedDict() for key, value in iterator: key = str(key) d.update(parse_object(value, path + key + '/')) return d def issequence(obj): """ Returns :code:`True` if the given object is an instance of :class:`.Sequence` that is not also a string. """ return isinstance(obj, Sequence) and not isinstance(obj, str) def deduplicate(values, column_names=False, separator='_'): """ Append a unique identifer to duplicate strings in a given sequence of strings. Identifers are an underscore followed by the occurance number of the specific string. ['abc', 'abc', 'cde', 'abc'] -> ['abc', 'abc_2', 'cde', 'abc_3'] :param column_names: If True, values are treated as column names. Warnings will be thrown if column names are None or duplicates. None values will be replaced with letter indices. """ final_values = [] for i, value in enumerate(values): if column_names: if not value: new_value = letter_name(i) warn_unnamed_column(i, new_value) elif isinstance(value, str): new_value = value else: raise ValueError('Column names must be strings or None.') else: new_value = value final_value = new_value duplicates = 0 while final_value in final_values: final_value = new_value + separator + str(duplicates + 2) duplicates += 1 if column_names and duplicates > 0: warn_duplicate_column(new_value, final_value) final_values.append(final_value) return tuple(final_values) def slugify(values, ensure_unique=False, **kwargs): """ Given a sequence of strings, returns a standardized version of the sequence. 
If ``ensure_unique`` is True, any duplicate strings will be appended with a unique identifier. agate uses an underscore as a default separator but this can be changed with kwargs. Any kwargs will be passed to the slugify method in python-slugify. See: https://github.com/un33k/python-slugify """ slug_args = {'separator': '_'} slug_args.update(kwargs) if ensure_unique: new_values = tuple(pslugify(value, **slug_args) for value in values) return deduplicate(new_values, separator=slug_args['separator']) return tuple(pslugify(value, **slug_args) for value in values) agate-1.9.1/agate/warns.py000066400000000000000000000024651454111471500154020ustar00rootroot00000000000000import warnings class NullCalculationWarning(RuntimeWarning): # pragma: no cover """ Warning raised if a calculation which cannot logically account for null values is performed on a :class:`.Column` containing nulls. """ pass def warn_null_calculation(operation, column): warnings.warn('Column "{}" contains nulls. These will be excluded from {} calculation.'.format( column.name, operation.__class__.__name__ ), NullCalculationWarning, stacklevel=2) class DuplicateColumnWarning(RuntimeWarning): # pragma: no cover """ Warning raised if multiple columns with the same name are added to a new :class:`.Table`. """ pass def warn_duplicate_column(column_name, column_rename): warnings.warn('Column name "{}" already exists in Table. Column will be renamed to "{}".'.format( column_name, column_rename ), DuplicateColumnWarning, stacklevel=2) class UnnamedColumnWarning(RuntimeWarning): # pragma: no cover """ Warning raised when a column has no name and a programmatically generated name is used. """ pass def warn_unnamed_column(column_id, new_column_name): warnings.warn('Column %i has no name. Using "%s".'
% ( column_id, new_column_name ), UnnamedColumnWarning, stacklevel=2) agate-1.9.1/benchmarks/000077500000000000000000000000001454111471500147235ustar00rootroot00000000000000agate-1.9.1/benchmarks/__init__.py000066400000000000000000000000001454111471500170220ustar00rootroot00000000000000agate-1.9.1/benchmarks/test_joins.py000066400000000000000000000013771454111471500174660ustar00rootroot00000000000000import unittest from random import shuffle from timeit import Timer import agate class TestTableJoin(unittest.TestCase): def test_join(self): left_rows = [(str(i), i) for i in range(100000)] right_rows = [(str(i), i) for i in range(100000)] shuffle(left_rows) shuffle(right_rows) column_names = ['text', 'number'] column_types = [agate.Text(), agate.Number()] left = agate.Table(left_rows, column_names, column_types) right = agate.Table(right_rows, column_names, column_types) def test(): left.join(right, 'text') results = Timer(test).repeat(10, 1) min_time = min(results) self.assertLess(min_time, 20) # CI unreliable, 15s witnessed on PyPy agate-1.9.1/charts.py000077500000000000000000000017421454111471500144530ustar00rootroot00000000000000#!/usr/bin/env python import agate table = agate.Table.from_csv('examples/realdata/Datagov_FY10_EDU_recp_by_State.csv') table.limit(10).bar_chart('State Name', 'TOTAL', 'docs/images/bar_chart.svg') table.limit(10).column_chart('State Name', 'TOTAL', 'docs/images/column_chart.svg') table = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv') by_year_exonerated = table.group_by('exonerated') counts = by_year_exonerated.aggregate([ ('count', agate.Count()) ]) counts.order_by('exonerated').line_chart('exonerated', 'count', 'docs/images/line_chart.svg') table.scatterplot('exonerated', 'age', 'docs/images/dots_chart.svg') top_crimes = table.group_by('crime').having([ ('count', agate.Count()) ], lambda t: t['count'] > 100) by_year = top_crimes.group_by('exonerated') counts = by_year.aggregate([ ('count', agate.Count()) ]) by_crime 
= counts.group_by('crime') by_crime.order_by('exonerated').line_chart('exonerated', 'count', 'docs/images/lattice.svg') agate-1.9.1/docs/000077500000000000000000000000001454111471500135365ustar00rootroot00000000000000agate-1.9.1/docs/Makefile000066400000000000000000000107521454111471500152030ustar00rootroot00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. 
The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/agate.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/agate.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/agate" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/agate" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." make -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 
text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." agate-1.9.1/docs/about.rst000066400000000000000000000025761454111471500154120ustar00rootroot00000000000000=========== About agate =========== Why agate? ========== * A readable and user-friendly API. * A complete set of SQL-like operations. * Unicode support everywhere. * Decimal precision everywhere. * Exhaustive user documentation. * Pluggable `extensions `_ that add SQL integration, Excel support, and more. * Designed with `iPython `_, `Jupyter `_ and `atom/hydrogen `_ in mind. * Pure Python. No C dependencies to compile. * Exhaustive test coverage. * MIT licensed and free for all purposes. * Zealously `zen `_. * Made with love. Principles ========== agate is intended to fill a very particular programming niche. It should not be allowed to become as complex as `numpy `_ or `pandas `_. Please bear in mind the following principles when considering a new feature: * Humans have less time than computers. Optimize for humans. * Most datasets are small. Don't optimize for "big data". * Text is data. It must always be a first-class citizen. * Python gets it right. Make it work like Python does. * Human lives are nasty, brutish and short. Make it easy. * Mutability leads to confusion.
Processes that alter data must create new copies. * Extensions are the way. Don't add it to core unless everybody needs it. agate-1.9.1/docs/api.rst000066400000000000000000000004351454111471500150430ustar00rootroot00000000000000=== API === .. toctree:: :maxdepth: 1 api/table api/tableset api/columns_and_rows api/data_types api/type_tester api/aggregations api/computations api/csv api/fixed api/misc api/exceptions api/warns api/testcase api/config agate-1.9.1/docs/api/000077500000000000000000000000001454111471500143075ustar00rootroot00000000000000agate-1.9.1/docs/api/aggregations.rst000066400000000000000000000027731454111471500175240ustar00rootroot00000000000000============ Aggregations ============ .. automodule:: agate.aggregations :no-members: .. autosummary:: :nosignatures: agate.Aggregation agate.Summary Basic aggregations ------------------ .. autosummary:: :nosignatures: agate.All agate.Any agate.Count agate.HasNulls agate.Min agate.Max agate.MaxPrecision Statistical aggregations ------------------------ .. autosummary:: :nosignatures: agate.Deciles agate.IQR agate.MAD agate.Mean agate.Median agate.Mode agate.Percentiles agate.PopulationStDev agate.PopulationVariance agate.Quartiles agate.Quintiles agate.StDev agate.Sum agate.Variance Text aggregations ----------------- .. autosummary:: :nosignatures: agate.MaxLength Detailed list ------------- .. autoclass:: agate.Aggregation .. autoclass:: agate.All .. autoclass:: agate.Any .. autoclass:: agate.Count .. autoclass:: agate.Deciles .. autoclass:: agate.HasNulls .. autoclass:: agate.IQR .. autoclass:: agate.MAD .. autoclass:: agate.Min .. autoclass:: agate.Max .. autoclass:: agate.MaxLength .. autoclass:: agate.MaxPrecision .. autoclass:: agate.Mean .. autoclass:: agate.Median .. autoclass:: agate.Mode .. autoclass:: agate.Percentiles .. autoclass:: agate.PopulationStDev .. autoclass:: agate.PopulationVariance .. autoclass:: agate.Quartiles .. autoclass:: agate.Quintiles .. autoclass:: agate.StDev .. 
autoclass:: agate.Sum .. autoclass:: agate.Summary .. autoclass:: agate.Variance agate-1.9.1/docs/api/columns_and_rows.rst000066400000000000000000000003551454111471500204200ustar00rootroot00000000000000================ Columns and rows ================ .. autosummary:: :nosignatures: agate.MappedSequence agate.Column agate.Row .. autoclass:: agate.MappedSequence .. autoclass:: agate.Column .. autoclass:: agate.Row agate-1.9.1/docs/api/computations.rst000066400000000000000000000011701454111471500175650ustar00rootroot00000000000000============ Computations ============ .. automodule:: agate.computations :no-members: .. autosummary:: :nosignatures: agate.Computation agate.Formula Mathematical computations ------------------------- .. autosummary:: :nosignatures: agate.Change agate.Percent agate.PercentChange agate.PercentileRank agate.Rank Detailed list ------------- .. autoclass:: agate.Change .. autoclass:: agate.Computation .. autoclass:: agate.Formula .. autoclass:: agate.Percent .. autoclass:: agate.PercentChange .. autoclass:: agate.PercentileRank .. autoclass:: agate.Rank .. autoclass:: agate.Slug agate-1.9.1/docs/api/config.rst000066400000000000000000000001311454111471500163010ustar00rootroot00000000000000====== Config ====== .. automodule:: agate.config :members: :inherited-members: agate-1.9.1/docs/api/csv.rst000066400000000000000000000022251454111471500156350ustar00rootroot00000000000000===================== CSV reader and writer ===================== Agate contains CSV readers and writers that are intended to be used as a drop-in replacement for :mod:`csv`. These versions add unicode support for Python 2 and several other minor features. Agate methods will use these versions automatically. If you would like to use them in your own code, you can import them, like this: .. code-block:: python from agate import csv Due to nuanced differences between the versions, these classes are implemented separately for Python 2 and Python 3.
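The csv.rst passage above describes agate's readers and writers as a drop-in replacement for the standard library's :mod:`csv`. This stdlib-only sketch shows the round-trip interface the agate versions mirror; per the docs, swapping `import csv` for `from agate import csv` is the intended usage (not shown here, since it requires agate to be installed):

```python
# Round trip through the stdlib csv interface that agate.csv mirrors.
import csv
from io import StringIO

buf = StringIO()
writer = csv.writer(buf)
writer.writerow(['name', 'value'])
writer.writerow(['a', '1'])

buf.seek(0)
rows = list(csv.reader(buf))
print(rows)  # [['name', 'value'], ['a', '1']]
```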
The documentation for both versions is provided below, but only the one for your version of Python is imported with the above code. Python 3 -------- .. autosummary:: :nosignatures: agate.csv_py3.reader agate.csv_py3.writer agate.csv_py3.Reader agate.csv_py3.Writer agate.csv_py3.DictReader agate.csv_py3.DictWriter Python 3 details ---------------- .. autofunction:: agate.csv_py3.reader .. autofunction:: agate.csv_py3.writer .. autoclass:: agate.csv_py3.Reader .. autoclass:: agate.csv_py3.Writer .. autoclass:: agate.csv_py3.DictReader .. autoclass:: agate.csv_py3.DictWriter agate-1.9.1/docs/api/data_types.rst000066400000000000000000000010421454111471500171730ustar00rootroot00000000000000========== Data types ========== .. automodule:: agate.data_types :no-members: .. autosummary:: :nosignatures: agate.DataType Supported types --------------- .. autosummary:: :nosignatures: agate.Text agate.Number agate.Boolean agate.Date agate.DateTime agate.TimeDelta Detailed list ------------- .. autoclass:: agate.DataType .. autoclass:: agate.Text .. autoclass:: agate.Number .. autoclass:: agate.Boolean .. autoclass:: agate.Date .. autoclass:: agate.DateTime .. autoclass:: agate.TimeDelta agate-1.9.1/docs/api/exceptions.rst000066400000000000000000000005501454111471500172220ustar00rootroot00000000000000============ Exceptions ============ .. autosummary:: :nosignatures: agate.DataTypeError agate.UnsupportedAggregationError agate.CastError agate.FieldSizeLimitError .. autoexception:: agate.DataTypeError .. autoexception:: agate.UnsupportedAggregationError .. autoexception:: agate.CastError .. autoexception:: agate.FieldSizeLimitError agate-1.9.1/docs/api/fixed.rst000066400000000000000000000010251454111471500161360ustar00rootroot00000000000000================== Fixed-width reader ================== Agate contains a fixed-width file reader that is designed to work like Python's :mod:`csv`. These readers work with CSV-formatted schemas, such as those maintained at `wireservice/ffs `_. 
.. autosummary:: :nosignatures: agate.fixed.reader agate.fixed.Reader agate.fixed.DictReader Detailed list ------------- .. autofunction:: agate.fixed.reader .. autoclass:: agate.fixed.Reader .. autoclass:: agate.fixed.DictReader agate-1.9.1/docs/api/misc.rst000066400000000000000000000002671454111471500160010ustar00rootroot00000000000000============= Miscellaneous ============= .. autosummary:: :nosignatures: agate.NullOrder agate.Quantiles .. autoclass:: agate.NullOrder .. autoclass:: agate.Quantiles agate-1.9.1/docs/api/table.rst000066400000000000000000000032711454111471500161330ustar00rootroot00000000000000===== Table ===== .. automodule:: agate.table :no-members: .. autosummary:: :nosignatures: agate.Table Properties ---------- .. autosummary:: :nosignatures: agate.Table.columns agate.Table.column_names agate.Table.column_types agate.Table.rows agate.Table.row_names Creating -------- .. autosummary:: :nosignatures: agate.Table.from_csv agate.Table.from_json agate.Table.from_fixed agate.Table.from_object Saving ------ .. autosummary:: :nosignatures: agate.Table.to_csv agate.Table.to_json Basic processing ---------------- .. autosummary:: :nosignatures: agate.Table.distinct agate.Table.exclude agate.Table.find agate.Table.limit agate.Table.order_by agate.Table.select agate.Table.where Calculating new data -------------------- .. autosummary:: :nosignatures: agate.Table.aggregate agate.Table.compute Advanced processing ------------------- .. autosummary:: :nosignatures: agate.Table.bins agate.Table.denormalize agate.Table.group_by agate.Table.homogenize agate.Table.join agate.Table.merge agate.Table.normalize agate.Table.pivot agate.Table.rename Previewing ---------- .. autosummary:: :nosignatures: agate.Table.print_bars agate.Table.print_csv agate.Table.print_html agate.Table.print_json agate.Table.print_structure agate.Table.print_table Charting -------- .. 
autosummary:: :nosignatures: agate.Table.bar_chart agate.Table.column_chart agate.Table.line_chart agate.Table.scatterplot Detailed list ------------- .. autoclass:: agate.Table :members: :inherited-members: agate-1.9.1/docs/api/tableset.rst000066400000000000000000000027521454111471500166520ustar00rootroot00000000000000======== TableSet ======== .. automodule:: agate.tableset :no-members: .. autosummary:: :nosignatures: agate.TableSet Properties ---------- .. autosummary:: :nosignatures: agate.TableSet.key_name agate.TableSet.key_type agate.TableSet.column_types agate.TableSet.column_names Creating -------- .. autosummary:: :nosignatures: agate.TableSet.from_csv agate.TableSet.from_json Saving ------ .. autosummary:: :nosignatures: agate.TableSet.to_csv agate.TableSet.to_json Processing ---------- .. autosummary:: :nosignatures: agate.TableSet.aggregate agate.TableSet.having agate.TableSet.merge Previewing ---------- .. autosummary:: :nosignatures: agate.TableSet.print_structure Charting -------- .. autosummary:: :nosignatures: agate.TableSet.bar_chart agate.TableSet.column_chart agate.TableSet.line_chart agate.TableSet.scatterplot Table Proxy Methods ------------------- .. autosummary:: :nosignatures: agate.TableSet.bins agate.TableSet.compute agate.TableSet.denormalize agate.TableSet.distinct agate.TableSet.exclude agate.TableSet.find agate.TableSet.group_by agate.TableSet.homogenize agate.TableSet.join agate.TableSet.limit agate.TableSet.normalize agate.TableSet.order_by agate.TableSet.pivot agate.TableSet.select agate.TableSet.where Detailed list ------------- .. autoclass:: agate.TableSet :inherited-members: agate-1.9.1/docs/api/testcase.rst000066400000000000000000000002621454111471500166540ustar00rootroot00000000000000==================== Unit testing helpers ==================== .. 
autoclass:: agate.AgateTestCase :members: assertColumnNames, assertColumnTypes, assertRows, assertRowNames agate-1.9.1/docs/api/type_tester.rst000066400000000000000000000002021454111471500174020ustar00rootroot00000000000000============== Type inference ============== .. automodule:: agate.type_tester :no-members: .. autoclass:: agate.TypeTester agate-1.9.1/docs/api/warns.rst000066400000000000000000000003211454111471500161670ustar00rootroot00000000000000======== Warnings ======== .. autoclass:: agate.NullCalculationWarning .. autoclass:: agate.DuplicateColumnWarning .. autofunction:: agate.warn_null_calculation .. autofunction:: agate.warn_duplicate_column agate-1.9.1/docs/changelog.rst000066400000000000000000000000751454111471500162210ustar00rootroot00000000000000========= Changelog ========= .. include:: ../CHANGELOG.rst agate-1.9.1/docs/conf.py000066400000000000000000000025241454111471500150400ustar00rootroot00000000000000# Configuration file for the Sphinx documentation builder. # # For the full list of built-in configuration values, see the documentation: # https://www.sphinx-doc.org/en/master/usage/configuration.html import os import sys sys.path.insert(0, os.path.abspath('..')) # -- Project information ----------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information project = 'agate' copyright = '2017, Christopher Groskopf' version = '1.9.1' release = version # -- General configuration --------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration extensions = [ 'sphinx.ext.autosummary', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx' ] templates_path = ['_templates'] exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] # -- Options for HTML output ------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output html_theme = 'furo' 
htmlhelp_basename = 'agatedoc' autodoc_default_options = { 'members': None, 'member-order': 'bysource', 'show-inheritance': True, } intersphinx_mapping = { 'python': ('https://docs.python.org/3', None), 'leather': ('https://leather.readthedocs.io/en/latest/', None) } agate-1.9.1/docs/contributing.rst000066400000000000000000000077661454111471500170170ustar00rootroot00000000000000============ Contributing ============ agate actively encourages contributions from people of all genders, races, ethnicities, ages, creeds, nationalities, persuasions, alignments, sizes, shapes, and journalistic affiliations. You are welcome here. We seek contributions from developers and non-developers of all skill levels. We will typically accept bug fixes, documentation updates, and new cookbook recipes with minimal fuss. If you want to work on a larger feature—great! The maintainers will be happy to provide feedback and code review on your implementation. Before making any changes or additions to agate, please be sure to read about the principles of agate in the `About `_ section of the documentation. Process for documentation ========================= Not a developer? That's fine! As long as you can use `git` (there are many tutorials) then you can contribute to agate. Please follow this process: #. Fork the project on `GitHub `_. #. If you don't have a specific task in mind, check out the `issue tracker `_ and find a documentation ticket that needs to be done. #. Comment on the ticket letting everyone know you're going to be working on it so that nobody duplicates your effort. #. Write the documentation. Documentation files live in the `docs` directory and are in Restructured Text Format. #. Add yourself to the AUTHORS file if you aren't already there. #. Once your contribution is complete, submit a pull request on GitHub. #. Wait for it to either be merged by a maintainer or to receive feedback about what needs to be revised. #. Rejoice! Process for code ================ Hacker? 
We'd love to have you hack with us. Please follow this process to make your contribution: #. Fork the project on `GitHub `_. #. If you don't have a specific task in mind, check out the `issue tracker `_ and find a task that needs to be done and is of a scope you can realistically expect to complete in a few days. Don't worry about the priority of the issues at first, but try to choose something you'll enjoy. You're much more likely to finish something to the point it can be merged if it's something you really enjoy hacking on. #. If you already have a task you know you want to work on, open a ticket or comment on the existing ticket letting everyone know you're going to be working on it. It's also good practice to provide some general idea of how you plan on resolving the issue so that other developers can make suggestions. #. Write tests for the feature you're building. Follow the format of the existing tests in the test directory to see how this works. You can run all the tests with the command ``pytest``. #. Write the code. Try to stay consistent with the style and organization of the existing codebase. A good patch won't be refused for stylistic reasons, but large parts of it may be rewritten and nobody wants that. #. As you are coding, periodically merge in work from the master branch and verify you haven't broken anything by running the test suite. #. Write documentation. This means docstrings on all classes and methods, including parameter explanations. It also means, when relevant, cookbook recipes and updates to the agate user tutorial. #. Add yourself to the AUTHORS file if you aren't already there. #. Once your contribution is complete, tested, and has documentation, submit a pull request on GitHub. #. Wait for it to either be merged by a maintainer or to receive feedback about what needs to be revisited. #. Rejoice! 
Licensing ========= To the extent that they care, contributors should keep in mind that the source of agate and therefore of any contributions are licensed under the permissive `MIT license `_. By submitting a patch or pull request you are agreeing to release your code under this license. You will be acknowledged in the AUTHORS list, the commit history and the hearts and minds of journalists everywhere. agate-1.9.1/docs/cookbook.rst000066400000000000000000000037211454111471500161010ustar00rootroot00000000000000======== Cookbook ======== Welcome to the agate cookbook, a source of how-to's and use cases. .. toctree:: :hidden: :maxdepth: 2 cookbook/create cookbook/save cookbook/remove cookbook/filter cookbook/sort cookbook/search cookbook/standardize cookbook/statistics cookbook/compute cookbook/datetime cookbook/sql cookbook/excel cookbook/r cookbook/underscore cookbook/homogenize cookbook/columns cookbook/transform cookbook/locale cookbook/rank cookbook/charting cookbook/lookup Basics ====== * `Creating tables from various data types `_ * `Saving data to various data types `_ * `Removing columns from a table `_ * `Filtering rows of data `_ * `Sorting rows of data `_ * `Searching through a table `_ * `Standardize names and values `_ * `Calculating statistics `_ * `Computing new columns `_ * `Handling dates and times `_ Coming from other tools ======================= * `SQL `_ * `Excel `_ * `R `_ * `Underscore.js `_ * Pandas (coming soon!) Advanced techniques =================== * `Filling missing rows in a dataset `_ * `Renaming and reordering columns `_ * `Transforming data (pivot/normalize/denormalize) `_ * `Setting your locale and working with foreign data `_ * `Ranking a sequence of data `_ * `Creating simple charts `_ * `Mapping columns to common lookup tables `_ Have a common use case that isn't covered? Please `submit an issue `_ on the GitHub repository. 
agate-1.9.1/docs/cookbook/000077500000000000000000000000001454111471500153445ustar00rootroot00000000000000agate-1.9.1/docs/cookbook/charting.rst000066400000000000000000000104331454111471500176760ustar00rootroot00000000000000====== Charts ====== Agate offers two kinds of built in charting: very simple text bar charts and SVG charting via `leather `_. Both are intended for efficiently exploring data, rather than producing publication-ready charts. Text-based bar chart ==================== agate has a builtin text-based bar-chart generator: .. code-block:: python table.limit(10).print_bars('State Name', 'TOTAL', width=80) .. code-block:: bash State Name TOTAL ALABAMA 19,582 ▓░░░░░░░░░░░░░ ALASKA 2,705 ▓░░ ARIZONA 46,743 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ ARKANSAS 7,932 ▓░░░░░ CALIFORNIA 76,639 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ COLORADO 21,485 ▓░░░░░░░░░░░░░░░ CONNECTICUT 4,350 ▓░░░ DELAWARE 1,904 ▓░ DIST. OF COLUMBIA 2,185 ▓░ FLORIDA 59,519 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ +-------------+------------+------------+-------------+ 0 20,000 40,000 60,000 80,000 Text-based histogram ==================== :meth:`.Table.print_bars` can be combined with :meth:`.Table.pivot` or :meth:`.Table.bins` to produce fast histograms: .. code-block:: Python table.bins('TOTAL', start=0, end=100000).print_bars('TOTAL', width=80) .. code-block:: bash TOTAL Count [0 - 10,000) 30 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ [10,000 - 20,000) 12 ▓░░░░░░░░░░░░░░░░░░░░░░ [20,000 - 30,000) 7 ▓░░░░░░░░░░░░░ [30,000 - 40,000) 1 ▓░░ [40,000 - 50,000) 2 ▓░░░░ [50,000 - 60,000) 1 ▓░░ [60,000 - 70,000) 1 ▓░░ [70,000 - 80,000) 1 ▓░░ [80,000 - 90,000) 0 ▓ [90,000 - 100,000] 0 ▓ +-------------+------------+------------+-------------+ 0.0 7.5 15.0 22.5 30.0 SVG bar chart ============= .. code-block:: Python table.limit(10).bar_chart('State Name', 'TOTAL', 'docs/images/bar_chart.svg') .. figure:: ../images/bar_chart.svg SVG column chart ================ .. 
code-block:: Python table.limit(10).column_chart('State Name', 'TOTAL', 'docs/images/column_chart.svg') .. figure:: ../images/column_chart.svg SVG line chart ============== .. code-block:: Python by_year_exonerated = table.group_by('exonerated') counts = by_year_exonerated.aggregate([ ('count', agate.Count()) ]) counts.order_by('exonerated').line_chart('exonerated', 'count', 'docs/images/line_chart.svg') .. figure:: ../images/line_chart.svg SVG dots chart ============== .. code-block:: Python table.scatterplot('exonerated', 'age', 'docs/images/dots_chart.svg') .. figure:: ../images/dots_chart.svg SVG lattice chart ================== .. code-block:: Python top_crimes = table.group_by('crime').having([ ('count', agate.Count()) ], lambda t: t['count'] > 100) by_year = top_crimes.group_by('exonerated') counts = by_year.aggregate([ ('count', agate.Count()) ]) by_crime = counts.group_by('crime') by_crime.order_by('exonerated').line_chart('exonerated', 'count', 'docs/images/lattice.svg') .. figure:: ../images/lattice.svg Using matplotlib ================ If you need to make more complex charts, you can always use agate with `matplotlib `_. Here is an example of how you might generate a line chart: .. code-block:: python import pylab pylab.plot(table.columns['homeruns'], table.columns['wins']) pylab.xlabel('Homeruns') pylab.ylabel('Wins') pylab.title('How homeruns correlate to wins') pylab.show() agate-1.9.1/docs/cookbook/columns.rst000066400000000000000000000015031454111471500175550ustar00rootroot00000000000000=============================== Renaming and reordering columns =============================== Rename columns =============== You can rename the columns in a table by using the :meth:`.Table.rename` method and specifying the new column names as an array or dictionary mapping old column names to new ones. .. 
code-block:: python table = Table(rows, column_names = ['a', 'b', 'c']) new_table = table.rename(column_names = ['one', 'two', 'three']) # or new_table = table.rename(column_names = {'a': 'one', 'b': 'two', 'c': 'three'}) Reorder columns =============== You can reorder the columns in a table by using the :meth:`.Table.select` method and specifying the column names in the order you want: .. code-block:: python new_table = table.select(['3rd_column_name', '1st_column_name', '2nd_column_name']) agate-1.9.1/docs/cookbook/compute.rst000066400000000000000000000213051454111471500175530ustar00rootroot00000000000000================== Compute new values ================== Change ====== .. code-block:: python new_table = table.compute([ ('2000_change', agate.Change('2000', '2001')), ('2001_change', agate.Change('2001', '2002')), ('2002_change', agate.Change('2002', '2003')) ]) Or, better yet, compute the whole decade using a loop: .. code-block:: Python computations = [] for year in range(2000, 2010): change = agate.Change(year, year + 1) computations.append(('%i_change' % year, change)) new_table = table.compute(computations) Percent ======= Calculate the percentage for each value in a column with :class:`.Percent`. Values are divided into the sum of the column by default. .. code-block:: python columns = ('value',) rows = ([1],[2],[2],[5]) new_table = agate.Table(rows, columns) new_table = new_table.compute([ ('percent', agate.Percent('value')) ]) new_table.print_table() | value | percent | | ----- | ------- | | 1 | 10 | | 2 | 20 | | 2 | 20 | | 5 | 50 | Override the denominator with a keyword argument. .. code-block:: python new_table = new_table.compute([ ('percent', agate.Percent('value', 5)) ]) new_table.print_table() | value | percent | | ----- | ------- | | 1 | 20 | | 2 | 40 | | 2 | 40 | | 5 | 100 | Percent change ============== Want percent change instead of value change? Just swap out the :class:`.Computation`: .. 
code-block:: Python computations = [] for year in range(2000, 2010): change = agate.PercentChange(year, year + 1) computations.append(('%i_change' % year, change)) new_table = table.compute(computations) Indexed/cumulative change ========================= Need your change indexed to a starting year? Just fix the first argument: .. code-block:: Python computations = [] for year in range(2000, 2010): change = agate.Change(2000, year + 1) computations.append(('%i_change' % year, change)) new_table = table.compute(computations) Of course you can also use :class:`.PercentChange` if you need percents rather than values. Round to two decimal places =========================== agate stores numerical values using Python's :class:`decimal.Decimal` type. This data type ensures numerical precision beyond what is supported by the native :func:`float` type, however, because of this we can not use Python's builtin :func:`round` function. Instead we must use :meth:`decimal.Decimal.quantize`. We can use :meth:`.Table.compute` to apply the quantize to generate a rounded column from an existing one: .. code-block:: python from decimal import Decimal number_type = agate.Number() def round_price(row): return row['price'].quantize(Decimal('0.01')) new_table = table.compute([ ('price_rounded', agate.Formula(number_type, round_price)) ]) To round to one decimal place you would simply change :code:`0.01` to :code:`0.1`. .. _difference_between_dates: Difference between dates ======================== Calculating the difference between dates (or dates and times) works exactly the same as it does for numbers: .. code-block:: python new_table = table.compute([ ('age_at_death', agate.Change('born', 'died')) ]) Levenshtein edit distance ========================= The Levenshtein edit distance is a common measure of string similarity. It can be used, for instance, to check for typos between manually-entered names and a version that is known to be spelled correctly. 
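To build intuition for the metric before wiring it into agate, here is a minimal pure-Python edit distance. This is an illustration only; the recipe below delegates the real work to the python-Levenshtein library.

```python
def levenshtein(a, b):
    """Count the single-character edits needed to turn a into b."""
    # previous[j] is the distance between the empty prefix of a and b[:j].
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        current = [i]
        for j, cb in enumerate(b, 1):
            current.append(min(
                previous[j] + 1,               # deletion
                current[j - 1] + 1,            # insertion
                previous[j - 1] + (ca != cb),  # substitution
            ))
        previous = current
    return previous[-1]
```

For example, ``levenshtein('kitten', 'sitting')`` is 3: two substitutions and one insertion.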
Implementing Levenshtein requires writing a custom :class:`.Computation`. To save ourselves building the whole thing from scratch, we will lean on the `python-Levenshtein `_ library for the actual algorithm. .. code-block:: python import agate from Levenshtein import distance class LevenshteinDistance(agate.Computation): """ Computes Levenshtein edit distance between the column and a given string. """ def __init__(self, column_name, compare_string): self._column_name = column_name self._compare_string = compare_string def get_computed_data_type(self, table): """ The return value is a numerical distance. """ return agate.Number() def validate(self, table): """ Verify the column is text. """ column = table.columns[self._column_name] if not isinstance(column.data_type, agate.Text): raise agate.DataTypeError('Can only be applied to Text data.') def run(self, table): """ Find the distance, returning null when the input column was null. """ new_column = [] for row in table.rows: val = row[self._column_name] if val is None: new_column.append(None) else: new_column.append(distance(val, self._compare_string)) return new_column This code can now be applied to any :class:`.Table` just as any other :class:`.Computation` would be: .. code-block:: python new_table = table.compute([ ('distance', LevenshteinDistance('column_name', 'string to compare')) ]) The resulting column will contain an integer measuring the edit distance between the value in the column and the comparison string. USA Today Diversity Index ========================= The `USA Today Diversity Index `_ is a widely cited method for evaluating the racial diversity of a given area. Using a custom :class:`.Computation` makes it simple to calculate. Assuming that your data has a column for the total population, another for the population of each race and a final column for the hispanic population, you can implement the diversity index like this: .. 
code-block:: python class USATodayDiversityIndex(agate.Computation): def get_computed_data_type(self, table): return agate.Number() def run(self, table): new_column = [] for row in table.rows: race_squares = 0 for race in ['white', 'black', 'asian', 'american_indian', 'pacific_islander']: race_squares += (row[race] / row['population']) ** 2 hispanic_squares = (row['hispanic'] / row['population']) ** 2 hispanic_squares += (1 - (row['hispanic'] / row['population'])) ** 2 new_column.append((1 - (race_squares * hispanic_squares)) * 100) return new_column We apply the diversity index like any other computation: .. code-block:: Python with_index = table.compute([ ('diversity_index', USATodayDiversityIndex()) ]) Simple Moving Average ===================== A simple moving average is the average of some number of prior values in a series. It is typically used to smooth out variation in time series data. The following custom :class:`.Computation` will compute a simple moving average. This example assumes your data is already sorted. .. code-block:: python class SimpleMovingAverage(agate.Computation): """ Computes the simple moving average of a column over some interval. """ def __init__(self, column_name, interval): self._column_name = column_name self._interval = interval def get_computed_data_type(self, table): """ The return value is a numerical average. """ return agate.Number() def validate(self, table): """ Verify the column is numerical. """ column = table.columns[self._column_name] if not isinstance(column.data_type, agate.Number): raise agate.DataTypeError('Can only be applied to Number data.') def run(self, table): new_column = [] for i, row in enumerate(table.rows): if i < self._interval: new_column.append(None) else: values = tuple(r[self._column_name] for r in table.rows[i - self._interval:i]) if None in values: new_column.append(None) else: new_column.append(sum(values) / self._interval) return new_column You would use the simple moving average like so: .. 
code-block:: Python with_average = table.compute([ ('six_month_moving_average', SimpleMovingAverage('price', 6)) ]) agate-1.9.1/docs/cookbook/create.rst000066400000000000000000000115561454111471500173510ustar00rootroot00000000000000=============== Creating tables =============== From data in memory =================== From a list of lists. .. code-block:: python column_names = ['letter', 'number'] column_types = [agate.Text(), agate.Number()] rows = [ ('a', 1), ('b', 2), ('c', None) ] table = agate.Table(rows, column_names, column_types) From a list of dictionaries. .. code-block:: python rows = [ dict(letter='a', number=1), dict(letter='b', number=2), dict(letter='c', number=None) ] table = agate.Table.from_object(rows) From a CSV ========== By default, loading a table from a CSV will use agate's builtin :class:`.TypeTester` to infer column types: .. code-block:: python table = agate.Table.from_csv('filename.csv') Override type inference ======================= In some cases agate's :class:`.TypeTester` may guess incorrectly. To override the type for some columns and use TypeTester for the rest, pass a dictionary to the ``column_types`` argument. .. code-block:: python specified_types = { 'column_name_one': agate.Text(), 'column_name_two': agate.Number() } table = agate.Table.from_csv('filename.csv', column_types=specified_types) This will use a generic TypeTester and override your specified columns with ``TypeTester.force``. Limit type inference ==================== For large datasets :class:`.TypeTester` may be unreasonably slow. In order to limit the amount of data it uses you can specify the ``limit`` argument. Note that if data after the limit invalidates the TypeTester's inference you may get errors when the data is loaded. .. 
code-block:: python tester = agate.TypeTester(limit=100) table = agate.Table.from_csv('filename.csv', column_types=tester) Manually specify columns ======================== If you know the types of your data you may find it more efficient to manually specify the names and types of your columns. This also gives you an opportunity to rename columns when you load them. .. code-block:: python text_type = agate.Text() number_type = agate.Number() column_names = ['city', 'area', 'population'] column_types = [text_type, number_type, number_type] table = agate.Table.from_csv('population.csv', column_names, column_types) Or, you can use this method to load data from a file that does not have a header row: .. code-block:: python table = agate.Table.from_csv('population.csv', column_names, column_types, header=False) From a unicode CSV ================== You don't have to do anything special. It just works! From a latin1 CSV ================= .. code-block:: python table = agate.Table.from_csv('census.csv', encoding='latin1') From a semicolon delimited CSV ============================== Normally, agate will automatically guess the delimiter of your CSV, but if that guess fails you can specify it manually: .. code-block:: python table = agate.Table.from_csv('filename.csv', delimiter=';') From a TSV (tab-delimited CSV) ============================== This is the same as the previous example, but in this case we specify that the delimiter is a tab: .. code-block:: python table = agate.Table.from_csv('filename.csv', delimiter='\t') From JSON ========= .. code-block:: python table = agate.Table.from_json('filename.json') From newline-delimited JSON =========================== .. code-block:: python table = agate.Table.from_json('filename.json', newline=True) .. _load_a_table_from_a_sql_database: From a SQL database =================== Use the `agate-sql `_ extension. .. 
code-block:: python

    import agatesql

    table = agate.Table.from_sql('postgresql:///database', 'input_table')

From an Excel spreadsheet
=========================

Use the `agate-excel `_ extension. It supports both .xls and .xlsx files.

.. code-block:: python

    import agateexcel

    table = agate.Table.from_xls('test.xls', sheet='data')

    table2 = agate.Table.from_xlsx('test.xlsx', sheet='data')

From a DBF table
================

DBF is the file format used to hold tabular data for ArcGIS shapefiles. Use the `agate-dbf `_ extension.

.. code-block:: python

    import agatedbf

    table = agate.Table.from_dbf('test.dbf')

From a remote file
==================

Use the `agate-remote `_ extension.

.. code-block:: python

    import agateremote

    table = agate.Table.from_url('https://raw.githubusercontent.com/wireservice/agate/master/examples/test.csv')

agate-remote also lets you create an Archive, which is a reference to a group of tables with a known path structure.

.. code-block:: python

    archive = agateremote.Archive('https://github.com/vincentarelbundock/Rdatasets/raw/master/csv/')

    table = archive.get_table('sandwich/PublicSchools.csv')

===============
Dates and times
===============

Specify a date format
=====================

By default agate will attempt to guess the format of a :class:`.Date` or :class:`.DateTime` column. In some cases, it may not be possible to automatically figure out the format of a date. In this case you can specify a :meth:`datetime.datetime.strptime` formatting string to specify how the dates should be parsed. For example, if your dates were formatted as "15-03-15" (March 15th, 2015) then you could specify:

.. code-block:: python

    date_type = agate.Date('%d-%m-%y')

Another use for this feature is if you have a column that contains extraneous data. For instance, imagine that your column contains hours and minutes, but they are always zero.
It would make more sense to load that data as type :class:`.Date` and ignore the extra time information: .. code-block:: python date_type = agate.Date('%m/%d/%Y 00:00') .. _specify_a_timezone: Specify a timezone ================== Timezones are hard. Under normal circumstances (no arguments specified), agate will not try to parse timezone information, nor will it apply a timezone to the :class:`datetime.datetime` instances it creates. (They will be *naive* in Python parlance.) There are two ways to force timezone data into your agate columns. The first is to use a format string, as shown above, and specify a pattern for timezone information: .. code-block:: python datetime_type = agate.DateTime('%Y-%m-%d %H:%M:%S%z') The second way is to specify a timezone as an argument to the type constructor: .. code-block:: python try: from zoneinfo import ZoneInfo except ImportError: # Fallback for Python < 3.9 from backports.zoneinfo import ZoneInfo eastern = ZoneInfo('US/Eastern') datetime_type = agate.DateTime(timezone=eastern) In this case all timezones that are processed will be set to have the Eastern timezone. Note, the timezone will be **set**, not converted. You cannot use this method to convert your timezones from UTC to another timezone. To do that see :ref:`convert_timezones`. Calculate a time difference ============================= See :ref:`difference_between_dates`. Sort by date ============ See :ref:`sort_by_date`. .. _convert_timezones: Convert timezones ==================== If you load data from a spreadsheet in one timezone and you need to convert it to another, you can do this using a :class:`.Formula`. Your datetime column must have timezone data for the following example to work. See :ref:`specify_a_timezone`. .. 
code-block:: python

    try:
        from zoneinfo import ZoneInfo
    except ImportError:
        # Fallback for Python < 3.9
        from backports.zoneinfo import ZoneInfo

    us_eastern = ZoneInfo('US/Eastern')

    text_type = agate.Text()
    datetime_type = agate.DateTime(timezone=us_eastern)

    column_names = ['what', 'when']
    column_types = [text_type, datetime_type]

    table = agate.Table.from_csv('events.csv', column_names, column_types)

    rome = ZoneInfo('Europe/Rome')
    timezone_shifter = agate.Formula(datetime_type, lambda r: r['when'].astimezone(rome))

    table = table.compute([
        ('when_in_rome', timezone_shifter)
    ])

=============
Emulate Excel
=============

One of agate's most powerful assets is that instead of a wimpy "formula" language, you have the entire Python language at your disposal. Here are examples of how to translate a few common Excel operations.

Simple formulas
===============

If you need to simulate a simple Excel formula you can use the :class:`.Formula` class to apply an arbitrary function.

Excel:

.. code::

    =($A1 + $B1) / $C1

agate:

.. code-block:: python

    def f(row):
        return (row['a'] + row['b']) / row['c']

    new_table = table.compute([
        ('new_column', agate.Formula(agate.Number(), f))
    ])

If this still isn't enough flexibility, you can also create your own subclass of :class:`.Computation`.

SUM
===

.. code-block:: python

    number_type = agate.Number()

    def five_year_total(row):
        columns = ('2009', '2010', '2011', '2012', '2013')
        return sum(row[c] for c in columns)

    formula = agate.Formula(number_type, five_year_total)

    new_table = table.compute([
        ('five_year_total', formula)
    ])

TRIM
====

.. code-block:: python

    new_table = table.compute([
        ('name_stripped', agate.Formula(text_type, lambda r: r['name'].strip()))
    ])

CONCATENATE
===========

.. code-block:: python

    new_table = table.compute([
        ('full_name', agate.Formula(text_type, lambda r: '%(first_name)s %(middle_name)s %(last_name)s' % r))
    ])

IF
==

..
code-block:: python

    new_table = table.compute([
        ('mvp_candidate', agate.Formula(boolean_type, lambda r: r['batting_average'] > 0.3))
    ])

VLOOKUP
=======

There are two ways to get the equivalent of Excel's VLOOKUP with agate.

If your lookup source is another agate :class:`.Table`, then you'll want to use the :meth:`.Table.join` method:

.. code-block:: python

    new_table = mvp_table.join(states, 'state_abbr')

This will add all the columns from the `states` table to the `mvp_table`, where their `state_abbr` columns match.

If your lookup source is a Python dictionary or some other object you can implement the lookup using a :class:`.Formula` computation:

.. code-block:: python

    states = {
        'AL': 'Alabama',
        'AK': 'Alaska',
        'AZ': 'Arizona',
        ...
    }

    new_table = table.compute([
        ('state_name', agate.Formula(text_type, lambda r: states[r['state_abbr']]))
    ])

Pivot tables as cross-tabulations
=================================

Pivot tables in Excel implement a tremendous range of functionality. Agate divides this functionality into a few different methods.

If what you want is to convert rows to columns to create a "crosstab", then you'll want to use the :meth:`.Table.pivot` method:

.. code-block:: python

    jobs_by_state_and_year = employees.pivot('state', 'year')

This will generate a table with a row for each value in the `state` column and a column for each value in the `year` column. The intersecting cells will contain the counts grouped by state and year. You can pass the `aggregation` keyword to aggregate some other value, such as :class:`.Mean` or :class:`.Median`.

Pivot tables as summaries
=========================

On the other hand, if what you want is to summarize your table with descriptive statistics, then you'll want to use :meth:`.Table.group_by` and :meth:`.TableSet.aggregate`:

..
.. code-block:: python

    jobs = employees.group_by('job_title')

    summary = jobs.aggregate([
        ('employee_count', agate.Count()),
        ('salary_mean', agate.Mean('salary')),
        ('salary_median', agate.Median('salary'))
    ])

The resulting ``summary`` table will have four columns: ``job_title``, ``employee_count``, ``salary_mean`` and ``salary_median``.

You may also want to look at the :meth:`.Table.normalize` and :meth:`.Table.denormalize` methods for examples of functionality frequently accomplished with Excel's pivot tables.

===========
Filter rows
===========

By regex
========

You can use Python's builtin :mod:`re` module to introduce a regular expression into a :meth:`.Table.where` query. For example, here we find all states that start with "C":

.. code-block:: python

    import re

    new_table = table.where(lambda row: re.match('^C', str(row['state'])))

This can also be useful for finding values that **don't** match your expectations. For example, finding all values in the "phone number" column that don't look like phone numbers:

.. code-block:: python

    new_table = table.where(lambda row: not re.match(r'\d{3}-\d{3}-\d{4}', str(row['phone'])))

By glob
=======

Hate regexes? You can use glob (:mod:`fnmatch`) syntax too!

.. code-block:: python

    from fnmatch import fnmatch

    new_table = table.where(lambda row: fnmatch('C*', row['state']))

Values within a range
=====================

This snippet filters the dataset to incomes between 100,000 and 200,000:

.. code-block:: python

    new_table = table.where(lambda row: 100000 < row['income'] < 200000)

Dates within a range
====================

This snippet filters the dataset to events during the summer of 2015:

.. code-block:: python

    import datetime

    new_table = table.where(lambda row: datetime.datetime(2015, 6, 1) <= row['date'] <= datetime.datetime(2015, 8, 31))

If you want to filter to events during the summer of any year:
.. code-block:: python

    new_table = table.where(lambda row: 6 <= row['date'].month <= 8)

Top N percent
=============

To filter a dataset to the top 10 percent of values, we first compute the percentiles for the column and then use the result in the :meth:`.Table.where` truth test:

.. code-block:: python

    percentiles = table.aggregate(agate.Percentiles('salary'))
    top_ten_percent = table.where(lambda r: r['salary'] >= percentiles[90])

Random sample
=============

By combining a random sort with limiting, we can effectively get a random sample from a table.

.. code-block:: python

    import random

    randomized = table.order_by(lambda row: random.random())
    sampled = randomized.limit(10)

Ordered sample
==============

We can also get an ordered sample by simply using the ``step`` parameter of the :meth:`.Table.limit` method to get every Nth row:

.. code-block:: python

    sampled = table.limit(step=10)

Distinct values
===============

You can retrieve a distinct list of values in a column using :meth:`.Column.values_distinct` or :meth:`.Table.distinct`.

:meth:`.Table.distinct` returns the entire row, so it's necessary to chain a select on the specific column:

.. code-block:: python

    columns = ('value',)
    rows = ([1], [2], [2], [5])
    new_table = agate.Table(rows, columns)

    new_table.columns['value'].values_distinct()
    # or
    new_table.distinct('value').columns['value'].values()
    # (Decimal('1'), Decimal('2'), Decimal('5'))

===============
Homogenize rows
===============

Fill in missing rows in a series. This can be used, for instance, to add rows for missing years in a time series.

Create rows for missing values
==============================

We can insert a default row for each value that is missing in a table from a given sequence of values.
Starting with a table like this, we can fill in rows for all missing years:

+------+--------------+------------+
| year | female_count | male_count |
+======+==============+============+
| 1997 | 2            | 1          |
+------+--------------+------------+
| 2000 | 4            | 3          |
+------+--------------+------------+
| 2002 | 4            | 5          |
+------+--------------+------------+
| 2003 | 1            | 2          |
+------+--------------+------------+

.. code-block:: python

    key = 'year'

    expected_values = (1997, 1998, 1999, 2000, 2001, 2002, 2003)

    # Your default row should specify column values not in `key`
    default_row = (0, 0)

    new_table = table.homogenize(key, expected_values, default_row)

The result will be:

+------+--------------+------------+
| year | female_count | male_count |
+======+==============+============+
| 1997 | 2            | 1          |
+------+--------------+------------+
| 1998 | 0            | 0          |
+------+--------------+------------+
| 1999 | 0            | 0          |
+------+--------------+------------+
| 2000 | 4            | 3          |
+------+--------------+------------+
| 2001 | 0            | 0          |
+------+--------------+------------+
| 2002 | 4            | 5          |
+------+--------------+------------+
| 2003 | 1            | 2          |
+------+--------------+------------+

Create dynamic rows based on missing values
===========================================

We can also specify new row values with a value-generating function:
.. code-block:: python

    key = 'year'

    expected_values = (1997, 1998, 1999, 2000, 2001, 2002, 2003)

    # If default_row is a function, it should return a full row
    def default_row(missing_value):
        return (missing_value, missing_value - 1997, missing_value - 1997)

    new_table = table.homogenize(key, expected_values, default_row)

The new table will be:

+------+--------------+------------+
| year | female_count | male_count |
+======+==============+============+
| 1997 | 2            | 1          |
+------+--------------+------------+
| 1998 | 1            | 1          |
+------+--------------+------------+
| 1999 | 2            | 2          |
+------+--------------+------------+
| 2000 | 4            | 3          |
+------+--------------+------------+
| 2001 | 4            | 4          |
+------+--------------+------------+
| 2002 | 4            | 5          |
+------+--------------+------------+
| 2003 | 1            | 2          |
+------+--------------+------------+

=======
Locales
=======

agate strives to work equally well for users from all parts of the world. This means properly handling foreign currencies, date formats, etc. To facilitate this, agate makes a hard distinction between *your* locale and the locale of *the data* you are working with. This allows you to work seamlessly with data from other countries.

Set your locale
===============

Setting your locale will change how numbers are displayed when you print an agate :class:`.Table` or serialize it to, for example, a CSV file. This works the same as it does for any other Python module. See the :mod:`locale` documentation for details.

Changing your locale will not affect how numbers are parsed from the files you are using. To change how data is parsed, see :ref:`specify_locale_of_numbers`.

.. _specify_locale_of_numbers:

Specify locale of numbers
=========================

To correctly parse numbers from non-US locales, you must pass a :code:`locale` parameter to the :class:`.Number` constructor.
For example, to parse Dutch numbers (which use a period to separate thousands and a comma to separate fractions):

.. code-block:: python

    dutch_numbers = agate.Number(locale='nl_NL')

    column_names = ['city', 'population']
    column_types = [text_type, dutch_numbers]

    table = agate.Table.from_csv('dutch_cities.csv', column_names, column_types)

======
Lookup
======

Generate new columns by mapping existing data to common `lookup `_ tables.

CPI deflation
=============

The `agate-lookup `_ extension adds a ``lookup`` method to agate's Table class.

Starting with a table that looks like this:

+------+------+
| year | cost |
+======+======+
| 1995 | 2.0  |
+------+------+
| 1997 | 2.2  |
+------+------+
| 1996 | 2.3  |
+------+------+
| 2003 | 4.0  |
+------+------+
| 2007 | 5.0  |
+------+------+
| 2005 | 6.0  |
+------+------+

We can map the ``year`` column to its annual CPI index in one lookup call:

.. code-block:: python

    import agatelookup

    join_year_cpi = table.lookup('year', 'cpi')

The returned table will now have a new column:

+------+------+---------+
| year | cost | cpi     |
+======+======+=========+
| 1995 | 2.0  | 152.383 |
+------+------+---------+
| 1997 | 2.2  | 160.525 |
+------+------+---------+
| 1996 | 2.3  | 156.858 |
+------+------+---------+
| 2003 | 4.0  | 184.000 |
+------+------+---------+
| 2007 | 5.0  | 207.344 |
+------+------+---------+
| 2005 | 6.0  | 195.267 |
+------+------+---------+

A simple computation tacked on to this lookup can then get the 2015 equivalent values of each cost:
.. code-block:: python

    from decimal import Decimal

    cpi_2015 = Decimal('216.909')

    def cpi_adjust_2015(row):
        return (row['cost'] * (cpi_2015 / row['cpi'])).quantize(Decimal('0.01'))

    cost_2015 = join_year_cpi.compute([
        ('cost_2015', agate.Formula(agate.Number(), cpi_adjust_2015))
    ])

And the final table will look like this:

+------+------+---------+-----------+
| year | cost | cpi     | cost_2015 |
+======+======+=========+===========+
| 1995 | 2.0  | 152.383 | 2.85      |
+------+------+---------+-----------+
| 1997 | 2.2  | 160.525 | 2.97      |
+------+------+---------+-----------+
| 1996 | 2.3  | 156.858 | 3.18      |
+------+------+---------+-----------+
| 2003 | 4.0  | 184.000 | 4.72      |
+------+------+---------+-----------+
| 2007 | 5.0  | 207.344 | 5.23      |
+------+------+---------+-----------+
| 2005 | 6.0  | 195.267 | 6.66      |
+------+------+---------+-----------+

=========
Emulate R
=========

c()
===

agate's :meth:`.Table.select` and :meth:`.Table.exclude` are the equivalent of R's :code:`c` for selecting columns.

R:

.. code-block:: r

    selected <- data[c("last_name", "first_name", "age")]
    excluded <- data[, !(names(data) %in% c("last_name", "first_name", "age"))]

agate:

.. code-block:: python

    selected = table.select(['last_name', 'first_name', 'age'])
    excluded = table.exclude(['last_name', 'first_name', 'age'])

subset
======

agate's :meth:`.Table.where` is the equivalent of R's :code:`subset`.

R:

.. code-block:: r

    newdata <- subset(data, age >= 20 | age < 10)

agate:

.. code-block:: python

    new_table = table.where(lambda row: row['age'] >= 20 or row['age'] < 10)

order
=====

agate's :meth:`.Table.order_by` is the equivalent of R's :code:`order`.

R:

.. code-block:: r

    newdata <- employees[order(last_name),]

agate:

.. code-block:: python

    new_table = employees.order_by('last_name')

merge
=====

agate's :meth:`.Table.join` is the equivalent of R's :code:`merge`.

R:
.. code-block:: r

    joined <- merge(employees, states, by="usps")

agate:

.. code-block:: python

    joined = employees.join(states, 'usps')

rbind
=====

agate's :meth:`.Table.merge` is the equivalent of R's :code:`rbind`.

R:

.. code-block:: r

    merged <- rbind(first_year, second_year)

agate:

.. code-block:: python

    merged = agate.Table.merge(first_year, second_year)

aggregate
=========

agate's :meth:`.Table.group_by` and :meth:`.TableSet.aggregate` can be used to recreate the functionality of R's :code:`aggregate`.

R:

.. code-block:: r

    aggregates = aggregate(employees$salary, list(job = employees$job), mean)

agate:

.. code-block:: python

    jobs = employees.group_by('job')
    aggregates = jobs.aggregate([
        ('mean', agate.Mean('salary'))
    ])

melt
====

agate's :meth:`.Table.normalize` is the equivalent of R's :code:`melt`.

R:

.. code-block:: r

    melt(employees, id=c("last_name", "first_name"))

agate:

.. code-block:: python

    employees.normalize(['last_name', 'first_name'])

cast
====

agate's :meth:`.Table.denormalize` is the equivalent of R's :code:`cast`.

R:

.. code-block:: r

    melted = melt(employees, id=c("name"))
    casted = cast(melted, name~variable, mean)

agate:

.. code-block:: python

    normalized = employees.normalize(['name'])
    denormalized = normalized.denormalize('name')

====
Rank
====

There are many ways to rank a sequence of values. agate strives to find a balance between simple, intuitive ranking and flexibility when you need it.

Competition rank
================

The basic rank supported by agate is standard "competition ranking". In this model the values :code:`[3, 4, 4, 5]` would be ranked :code:`[1, 2, 2, 4]`. You can apply competition ranking using the :class:`.Rank` computation:
.. code-block:: python

    new_table = table.compute([
        ('rank', agate.Rank('value'))
    ])

Rank descending
===============

Descending competition ranking is specified using the :code:`reverse` argument.

.. code-block:: python

    new_table = table.compute([
        ('rank', agate.Rank('value', reverse=True))
    ])

Rank change
===========

You can compute the change from one rank to another by combining the :class:`.Rank` and :class:`.Change` computations:

.. code-block:: python

    new_table = table.compute([
        ('rank2014', agate.Rank('value2014')),
        ('rank2015', agate.Rank('value2015'))
    ])

    new_table2 = new_table.compute([
        ('rank_change', agate.Change('rank2014', 'rank2015'))
    ])

Percentile rank
===============

"Percentile rank" is a bit of a misnomer. Really, this is the percentile in which each value in a column is located. This column can be computed for your data using the :class:`.PercentileRank` computation:

.. code-block:: python

    new_table = table.compute([
        ('percentile_rank', agate.PercentileRank('value'))
    ])

Note that there is no entirely standard method for computing percentiles. The percentiles computed in this manner may not agree precisely with those generated by other software. See the :class:`.Percentiles` class documentation for implementation details.

==============
Remove columns
==============

Include specific columns
========================

Create a new table with only a specific set of columns:

.. code-block:: python

    include_columns = ['column_name_one', 'column_name_two']

    new_table = table.select(include_columns)

Exclude specific columns
========================

Create a new table without a specific set of columns:
.. code-block:: python

    exclude_columns = ['column_name_one', 'column_name_two']

    new_table = table.exclude(exclude_columns)

============
Save a table
============

To a CSV
========

.. code-block:: python

    table.to_csv('filename.csv')

To JSON
=======

.. code-block:: python

    table.to_json('filename.json')

To newline-delimited JSON
=========================

.. code-block:: python

    table.to_json('filename.json', newline=True)

To a SQL database
=================

Use the `agate-sql `_ extension.

.. code-block:: python

    import agatesql

    table.to_sql('postgresql:///database', 'output_table')

======
Search
======

Exact search
============

Find all individuals with the last_name "Groskopf":

.. code-block:: python

    family = table.where(lambda r: r['last_name'] == 'Groskopf')

Fuzzy search by edit distance
=============================

By leveraging an `existing Python library `_ for computing the `Levenshtein edit distance `_ it is trivially easy to implement a fuzzy string search. For example, to find all names within 2 edits of "Groskopf":

.. code-block:: python

    from Levenshtein import distance

    fuzzy_family = table.where(lambda r: distance(r['last_name'], 'Groskopf') <= 2)

These results will now include all those "Grosskopfs" and "Groskoffs" whose mail I am always getting.

Fuzzy search by phonetic similarity
===================================

By using `Fuzzy `_ to calculate phonetic similarity, it is possible to implement a fuzzy phonetic search. For example, to find all rows with ``first_name`` phonetically similar to "Catherine":
.. code-block:: python

    import fuzzy

    dmetaphone = fuzzy.DMetaphone(4)
    phonetic_search = dmetaphone('Catherine')

    def phonetic_match(r):
        return any(x in dmetaphone(r['first_name']) for x in phonetic_search)

    phonetic_family = table.where(lambda r: phonetic_match(r))

====
Sort
====

Alphabetical
============

Order a table by the :code:`last_name` column:

.. code-block:: python

    new_table = table.order_by('last_name')

Numerical
=========

Order a table by the :code:`cost` column:

.. code-block:: python

    new_table = table.order_by('cost')

.. _sort_by_date:

By date
=======

Order a table by the :code:`birth_date` column:

.. code-block:: python

    new_table = table.order_by('birth_date')

Reverse order
=============

The order of any sort can be reversed by using the :code:`reverse` keyword:

.. code-block:: python

    new_table = table.order_by('birth_date', reverse=True)

Multiple columns
================

Because Python's internal sorting works natively with sequences, we can implement multi-column sort by returning a tuple from the key function.

.. code-block:: python

    new_table = table.order_by(lambda row: (row['last_name'], row['first_name']))

This table will now be ordered by :code:`last_name`, then :code:`first_name`.

Random order
============

.. code-block:: python

    import random

    new_table = table.order_by(lambda row: random.random())

===========
Emulate SQL
===========

agate's command structure is very similar to SQL. The primary difference between agate and SQL is that commands like :code:`SELECT` and :code:`WHERE` explicitly create new tables. You can chain them together as you would with SQL, but be aware each command is actually creating a new table.

.. note::

    All examples in this section use the `PostgreSQL `_ dialect for comparison.
    If you want to read and write data from SQL, see :ref:`load_a_table_from_a_sql_database`.

SELECT
======

SQL:

.. code-block:: postgres

    SELECT state, total FROM table;

agate:

.. code-block:: python

    new_table = table.select(['state', 'total'])

WHERE
=====

SQL:

.. code-block:: postgres

    SELECT * FROM table WHERE LOWER(state) = 'california';

agate:

.. code-block:: python

    new_table = table.where(lambda row: row['state'].lower() == 'california')

ORDER BY
========

SQL:

.. code-block:: postgres

    SELECT * FROM table ORDER BY total DESC;

agate:

.. code-block:: python

    new_table = table.order_by(lambda row: row['total'], reverse=True)

DISTINCT
========

SQL:

.. code-block:: postgres

    SELECT DISTINCT ON (state) * FROM table;

agate:

.. code-block:: python

    new_table = table.distinct('state')

.. note::

    Unlike most SQL implementations, agate always returns the full row. Use :meth:`.Table.select` if you want to filter the columns first.

INNER JOIN
==========

SQL (two ways):

.. code-block:: postgres

    SELECT * FROM patient, doctor WHERE patient.doctor = doctor.id;

    SELECT * FROM patient INNER JOIN doctor ON (patient.doctor = doctor.id);

agate:

.. code-block:: python

    joined = patients.join(doctors, 'doctor', 'id', inner=True)

LEFT OUTER JOIN
===============

SQL:

.. code-block:: postgres

    SELECT * FROM patient LEFT OUTER JOIN doctor ON (patient.doctor = doctor.id);

agate:

.. code-block:: python

    joined = patients.join(doctors, 'doctor', 'id')

FULL OUTER JOIN
===============

SQL:

.. code-block:: postgres

    SELECT * FROM patient FULL OUTER JOIN doctor ON (patient.doctor = doctor.id);

agate:

.. code-block:: python

    joined = patients.join(doctors, 'doctor', 'id', full_outer=True)

GROUP BY
========

agate's :meth:`.Table.group_by` works slightly differently than SQL's. It does not require an aggregate function. Instead it returns a :py:class:`.TableSet`. To see how to perform the equivalent of a SQL aggregate, see below.
.. code-block:: python

    doctors = patients.group_by('doctor')

You can group by two or more columns by chaining the command.

.. code-block:: python

    doctors_by_state = patients.group_by('state').group_by('doctor')

HAVING
======

agate's :meth:`.TableSet.having` works very similarly to SQL's keyword of the same name.

.. code-block:: python

    doctors = patients.group_by('doctor')
    popular_doctors = doctors.having([
        ('patient_count', agate.Count())
    ], lambda t: t['patient_count'] > 100)

This filters to only those doctors whose table includes more than 100 results. You can add as many aggregations as you want to the list, and each will be available by name in the test function you pass. For example, here we filter to popular doctors with an average review of at least three stars:

.. code-block:: python

    doctors = patients.group_by('doctor')
    popular_doctors = doctors.having([
        ('patient_count', agate.Count()),
        ('average_stars', agate.Mean('stars'))
    ], lambda t: t['patient_count'] > 100 and t['average_stars'] >= 3)

Chain commands together
=======================

SQL:

.. code-block:: postgres

    SELECT state, total FROM table WHERE LOWER(state) = 'california' ORDER BY total DESC;

agate:

.. code-block:: python

    new_table = table \
        .select(['state', 'total']) \
        .where(lambda row: row['state'].lower() == 'california') \
        .order_by('total', reverse=True)

.. note::

    Chaining commands in this way is sometimes not a good idea. Being explicit about each step can lead to clearer code.

Aggregate functions
===================

SQL:

.. code-block:: postgres

    SELECT mean(age), median(age) FROM patients GROUP BY doctor;

agate:

.. code-block:: python

    doctors = patients.group_by('doctor')
    patient_ages = doctors.aggregate([
        ('patient_count', agate.Count()),
        ('age_mean', agate.Mean('age')),
        ('age_median', agate.Median('age'))
    ])

The resulting table will have four columns: ``doctor``, ``patient_count``, ``age_mean`` and ``age_median``.
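For intuition, here is roughly what that aggregation computes, sketched with only the Python standard library (the ``patients`` data here is invented for illustration; agate itself is not required):

.. code-block:: python

    from statistics import mean, median

    # Hypothetical rows of (doctor, age) pairs
    patients = [
        ('Lee', 20), ('Lee', 30), ('Lee', 40),
        ('Cho', 25), ('Cho', 35),
    ]

    # Group ages by doctor, as Table.group_by would
    groups = {}
    for doctor, age in patients:
        groups.setdefault(doctor, []).append(age)

    # One output row per doctor: (patient_count, age_mean, age_median)
    result = {
        doctor: (len(ages), mean(ages), median(ages))
        for doctor, ages in groups.items()
    }

    print(result)

The real :meth:`.TableSet.aggregate` does the same grouping and per-group summarizing, but returns a new :class:`.Table` with typed columns.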
============================
Standardize names and values
============================

Standardize row and column names
================================

The :meth:`Table.rename` method has arguments to convert row or column names to slugs and append unique identifiers to duplicate values.

Using an existing table object:

.. code-block:: python

    # Convert column names to unique slugs
    table.rename(slug_columns=True)

    # Convert row names to unique slugs
    table.rename(slug_rows=True)

    # Convert both column and row names to unique slugs
    table.rename(slug_columns=True, slug_rows=True)

Standardize column values
=========================

agate has a :class:`Slug` computation that can be used to also standardize text column values. The computation has an option to also append unique identifiers to duplicate values.

Using an existing table object:

.. code-block:: python

    # Convert the values in column 'title' to slugs
    new_table = table.compute([
        ('title-slug', agate.Slug('title'))
    ])

    # Convert the values in column 'title' to unique slugs
    new_table = table.compute([
        ('title-slug', agate.Slug('title', ensure_unique=True))
    ])

==========
Statistics
==========

Common descriptive and aggregate statistics are included with the core agate library. For additional statistical methods beyond the scope of agate, consider using the `agate-stats `_ extension or integrating with `scipy `_.

Descriptive statistics
======================

agate includes a full set of standard descriptive statistics that can be applied to any column containing :class:`.Number` data.
.. code-block:: python

    table.aggregate(agate.Sum('salary'))
    table.aggregate(agate.Min('salary'))
    table.aggregate(agate.Max('salary'))
    table.aggregate(agate.Mean('salary'))
    table.aggregate(agate.Median('salary'))
    table.aggregate(agate.Mode('salary'))
    table.aggregate(agate.Variance('salary'))
    table.aggregate(agate.StDev('salary'))
    table.aggregate(agate.MAD('salary'))

Or, get several at once:

.. code-block:: python

    table.aggregate([
        ('salary_min', agate.Min('salary')),
        ('salary_ave', agate.Mean('salary')),
        ('salary_max', agate.Max('salary')),
    ])

Aggregate statistics
====================

You can also generate aggregate statistics for subsets of data (sometimes referred to as "rolling up"):

.. code-block:: python

    doctors = patients.group_by('doctor')
    patient_ages = doctors.aggregate([
        ('patient_count', agate.Count()),
        ('age_mean', agate.Mean('age')),
        ('age_median', agate.Median('age'))
    ])

The resulting table will have four columns: ``doctor``, ``patient_count``, ``age_mean`` and ``age_median``.

You can roll up by multiple columns by chaining agate's :meth:`.Table.group_by` method.

.. code-block:: python

    doctors_by_state = patients.group_by('state').group_by('doctor')

Distribution by count (frequency)
=================================

Counting the number of each unique value in a column can be accomplished with the :meth:`.Table.pivot` method:

.. code-block:: python

    # Counts of a single column's values
    table.pivot('doctor')

    # Counts of all combinations of more than one column's values
    table.pivot(['doctor', 'hospital'])

The resulting tables will have a column for each key column and another :code:`Count` column counting the number of instances of each value.

Distribution by percent
=======================

:meth:`.Table.pivot` can also be used to calculate the distribution of values as a percentage of the total number:
.. code-block:: python

    # Percents of a single column's values
    table.pivot('doctor', computation=agate.Percent('Count'))

    # Percents of all combinations of more than one column's values
    table.pivot(['doctor', 'hospital'], computation=agate.Percent('Count'))

The output table will be the same format as the previous example, except the value column will be named :code:`Percent`.

Identify outliers
=================

The `agate-stats `_ extension adds methods for finding outliers.

.. code-block:: python

    import agatestats

    outliers = table.stdev_outliers('salary', deviations=3, reject=False)

By specifying :code:`reject=True` you can instead return a table including only those values **not** identified as outliers.

.. code-block:: python

    not_outliers = table.stdev_outliers('salary', deviations=3, reject=True)

The second, more robust, method for identifying outliers is by identifying values which are more than some number of "median absolute deviations" from the median (typically 3).

.. code-block:: python

    outliers = table.mad_outliers('salary', deviations=3, reject=False)

As with the first example, you can specify :code:`reject=True` to exclude outliers in the resulting table.

Custom statistics
=================

You can also generate custom aggregated statistics for your data by defining your own "summary" aggregation. This might be especially useful for performing calculations unique to your data. Here's a simple example:

.. code-block:: python

    # Create a custom summary aggregation with agate.Summary
    # Input a column name, a return data type and a function to apply on the column
    count_millionaires = agate.Summary('salary', agate.Number(), lambda r: sum(salary > 1000000 for salary in r.values()))

    table.aggregate([
        ('count_millionaires', count_millionaires)
    ])

Your custom aggregation can be used to compute both the descriptive and aggregate statistics shown above.
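The heart of that custom aggregation is the function it applies to the column's values. Here is that logic in plain Python, with an invented list of salaries standing in for the column:

.. code-block:: python

    # Hypothetical column values, as the Summary function would receive them
    salaries = [50_000, 1_200_000, 80_000, 3_000_000]

    # True counts as 1, so summing the comparisons counts the matches
    count_millionaires = sum(salary > 1_000_000 for salary in salaries)

    print(count_millionaires)  # → 2

Any function with this shape — take a sequence of values, return one value — can serve as a custom summary.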
=========
Transform
=========

Pivot by a single column
========================

The :meth:`.Table.pivot` method is a general process for grouping data by row and, optionally, by column, and then calculating some aggregation for each group. Consider the following table:

+-------+--------+--------+-----+
| name  | race   | gender | age |
+=======+========+========+=====+
| Joe   | white  | female | 20  |
+-------+--------+--------+-----+
| Jane  | asian  | male   | 20  |
+-------+--------+--------+-----+
| Jill  | black  | female | 20  |
+-------+--------+--------+-----+
| Jim   | latino | male   | 25  |
+-------+--------+--------+-----+
| Julia | black  | female | 25  |
+-------+--------+--------+-----+
| Joan  | asian  | female | 25  |
+-------+--------+--------+-----+

In the very simplest case, this table can be pivoted to count the number of occurrences of values in a column:

.. code-block:: python

    transformed = table.pivot('race')

Result:

+--------+-------+
| race   | pivot |
+========+=======+
| white  | 1     |
+--------+-------+
| asian  | 2     |
+--------+-------+
| black  | 2     |
+--------+-------+
| latino | 1     |
+--------+-------+

Pivot by multiple columns
=========================

You can pivot by multiple columns either as additional row-groups, or as intersecting columns. For example, given the table in the previous example:

.. code-block:: python

    transformed = table.pivot(['race', 'gender'])

Result:

+--------+--------+-------+
| race   | gender | pivot |
+========+========+=======+
| white  | female | 1     |
+--------+--------+-------+
| asian  | male   | 1     |
+--------+--------+-------+
| black  | female | 2     |
+--------+--------+-------+
| latino | male   | 1     |
+--------+--------+-------+
| asian  | female | 1     |
+--------+--------+-------+

For the column version, you would do:
.. code-block:: python

    transformed = table.pivot('race', 'gender')

Result:

+--------+------+--------+
| race   | male | female |
+========+======+========+
| white  | 0    | 1      |
+--------+------+--------+
| asian  | 1    | 1      |
+--------+------+--------+
| black  | 0    | 2      |
+--------+------+--------+
| latino | 1    | 0      |
+--------+------+--------+

Pivot to sum
============

The default pivot aggregation is :class:`.Count` but you can also supply other operations. For example, to aggregate each group by :class:`.Sum` of their ages:

.. code-block:: python

    transformed = table.pivot('race', 'gender', aggregation=agate.Sum('age'))

+--------+------+--------+
| race   | male | female |
+========+======+========+
| white  | 0    | 20     |
+--------+------+--------+
| asian  | 20   | 25     |
+--------+------+--------+
| black  | 0    | 45     |
+--------+------+--------+
| latino | 25   | 0      |
+--------+------+--------+

Pivot to percent of total
=========================

Pivot allows you to apply a :class:`.Computation` to each row of aggregated results prior to returning the table. Use the stringified name of the aggregation as the column argument to your computation:

.. code-block:: python

    transformed = table.pivot('race', 'gender', aggregation=agate.Sum('age'), computation=agate.Percent('sum'))

+--------+------+--------+
| race   | male | female |
+========+======+========+
| white  | 0    | 14.8   |
+--------+------+--------+
| asian  | 14.8 | 18.4   |
+--------+------+--------+
| black  | 0    | 33.3   |
+--------+------+--------+
| latino | 18.4 | 0      |
+--------+------+--------+

*Note: actual computed percentages will be much more precise.*

It's helpful when constructing these cases to think of all the cells in the pivot table as a single sequence.
Denormalize key/value columns into separate columns
===================================================

It's common for very large datasets to be distributed in a "normalized" format, such as:

+------+----------+--------+
| name | property | value  |
+======+==========+========+
| Jane | gender   | female |
+------+----------+--------+
| Jane | race     | black  |
+------+----------+--------+
| Jane | age      | 24     |
+------+----------+--------+
| ...  | ...      | ...    |
+------+----------+--------+

The :meth:`.Table.denormalize` method can be used to transform the table so that each unique property has its own column.

.. code-block:: python

    transformed = table.denormalize('name', 'property', 'value')

Result:

+------+--------+-------+-----+
| name | gender | race  | age |
+======+========+=======+=====+
| Jane | female | black | 24  |
+------+--------+-------+-----+
| Jack | male   | white | 35  |
+------+--------+-------+-----+
| Joe  | male   | black | 28  |
+------+--------+-------+-----+

Normalize separate columns into key/value columns
=================================================

Sometimes you have a dataset where each property has its own column, but your analysis would be easier if all properties were stored together. Consider this table:

+------+--------+-------+-----+
| name | gender | race  | age |
+======+========+=======+=====+
| Jane | female | black | 24  |
+------+--------+-------+-----+
| Jack | male   | white | 35  |
+------+--------+-------+-----+
| Joe  | male   | black | 28  |
+------+--------+-------+-----+

The :meth:`.Table.normalize` method can be used to transform the table so that all the properties and their values share two columns.
code-block:: python transformed = table.normalize('name', ['gender', 'race', 'age']) Result: +---------+-----------+---------+ | name | property | value | +=========+===========+=========+ | Jane | gender | female | +---------+-----------+---------+ | Jane | race | black | +---------+-----------+---------+ | Jane | age | 24 | +---------+-----------+---------+ | ... | ... | ... | +---------+-----------+---------+ agate-1.9.1/docs/cookbook/underscore.rst000066400000000000000000000021271454111471500202510ustar00rootroot00000000000000===================== Emulate underscore.js ===================== filter ====== agate's :meth:`.Table.where` functions exactly like Underscore's :code:`filter`. .. code-block:: python new_table = table.where(lambda row: row['state'] == 'Texas') reject ====== To simulate Underscore's :code:`reject`, simply negate the return value of the function you pass into agate's :meth:`.Table.where`. .. code-block:: python new_table = table.where(lambda row: not (row['state'] == 'Texas')) find ==== agate's :meth:`.Table.find` works exactly like Underscore's :code:`find`. .. code-block:: python row = table.find(lambda row: row['state'].startswith('T')) any === The :class:`.Any` aggregation works like Underscore's :code:`any`. .. code-block:: python true_or_false = table.aggregate(Any('salaries', lambda d: d > 100000)) You can also use :meth:`.Table.where` to filter to columns that pass the truth test. all === The :class:`.All` aggregation works like Underscore's :code:`all`. .. code-block:: python true_or_false = table.aggregate(All('salaries', lambda d: d > 100000)) agate-1.9.1/docs/extensions.rst000066400000000000000000000047461454111471500165020ustar00rootroot00000000000000========== Extensions ========== The core agate library is designed rely on as few dependencies as possible. However, in the real world you're often going to want to interface with more specialized tools, or with other formats, such as SQL or Excel. 
Using extensions
================

agate supports plugin-style extensions using a monkey-patching pattern. Libraries can be created that add new methods onto :class:`.Table` and :class:`.TableSet`. For example, `agate-sql `_ adds the ability to read and write tables from a SQL database:

.. code-block:: python

    import agate
    import agatesql

    # After the import, the from_sql and to_sql methods are part of the Table class
    table = agate.Table.from_sql('postgresql:///database', 'input_table')

    table.to_sql('postgresql:///database', 'output_table')

List of extensions
==================

Here is a list of agate extensions that are known to be actively maintained:

* `agate-sql `_: Read and write tables in SQL databases
* `agate-stats `_: Additional statistical methods
* `agate-excel `_: Read Excel tables (xls and xlsx)
* `agate-dbf `_: Read dbf tables (from shapefiles)
* `agate-remote `_: Read from remote files
* `agate-lookup `_: Instantly join to hosted `lookup `_ tables.

Writing your own extensions
===========================

Writing your own extensions is straightforward. Create a function that acts as your "patch" and then dynamically add it to :class:`.Table` or :class:`.TableSet`.

.. code-block:: python

    import agate

    def new_method(self):
        print('I do something to a Table when you call me.')

    agate.Table.new_method = new_method

You can also create new classmethods:

.. code-block:: python

    def new_class_method(cls):
        print('I make Tables when you call me.')

    agate.Table.new_class_method = classmethod(new_class_method)

These methods can now be called on the :class:`.Table` class in your code:

.. code-block:: python

    >>> import agate
    >>> import myextension
    >>> table = agate.Table(rows, column_names, column_types)
    >>> table.new_method()
    'I do something to a Table when you call me.'
    >>> agate.Table.new_class_method()
    'I make Tables when you call me.'

The same pattern also works for adding methods to :class:`.TableSet`.
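Because the patch is ordinary attribute assignment, the pattern can be exercised without agate installed. A minimal sketch using a stand-in ``Table`` class (hypothetical, not agate's real API):

```python
# Stand-in Table class; agate's real Table is assumed only by analogy.
class Table:
    def __init__(self, rows):
        self.rows = rows

# An "extension" module attaches new behavior after the class exists.
def row_count(self):
    """Instance method added by the patch."""
    return len(self.rows)

def from_pairs(cls, pairs):
    """Alternate constructor added by the patch."""
    return cls([list(pair) for pair in pairs])

Table.row_count = row_count
Table.from_pairs = classmethod(from_pairs)

# Both the patched instance method and classmethod now work.
table = Table.from_pairs([(1, 'a'), (2, 'b')])
print(table.row_count())  # 2
```

Wrapping the constructor function in ``classmethod()`` before assignment is what makes Python pass the class, rather than an instance, as the first argument.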
agate-1.9.1/docs/images/

[bar_chart.svg: horizontal bar chart of TOTAL (0 to 80,000) by State Name, ALABAMA through FLORIDA]
[column_chart.svg: column chart of TOTAL (0 to 80,000) by State Name, ALABAMA through FLORIDA]
[dots_chart.svg: dot chart of age (0 to 100) by year exonerated (1980 to 2020)]
[lattice.svg: lattice of line charts of count (0 to 50) by year exonerated (1980 to 2020) for Child Sex Abuse, Sexual Assault, Murder, and Drug Possession or Sale]
[line_chart.svg: line chart of count (0 to 150) by year exonerated (1980 to 2020)]

agate-1.9.1/docs/index.rst

===============
agate |release|
===============

.. include:: ../README.rst

.. toctree::
    :hidden:
    :maxdepth: 2

    about
    install
    tutorial
    cookbook
    extensions
    api
    contributing
    release_process
    license
    changelog

Show me docs
============

* `About `_ - why you should use agate and the principles that guide its development
* `Install `_ - how to install for users and developers
* `Tutorial `_ - a step-by-step guide to start using agate
* `Cookbook `_ - sample code showing how to accomplish dozens of common tasks, including comparisons to SQL, R, etc.
* `Extensions `_ - a list of libraries that extend agate functionality and how to build your own
* `API `_ - technical documentation for every agate feature
* `Changelog `_ - a record of every change made to agate for each release

Show me code
============

.. code-block:: python

    import agate

    purchases = agate.Table.from_csv('examples/realdata/ks_1033_data.csv')

    by_county = purchases.group_by('county')

    totals = by_county.aggregate([
        ('county_cost', agate.Sum('total_cost'))
    ])

    totals = totals.order_by('county_cost', reverse=True)
    totals.limit(10).print_bars('county', 'county_cost', width=80)

::

    county       county_cost
    SEDGWICK      977,174.45 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░
    COFFEY        691,749.03 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░
    MONTGOMERY    447,581.20 ▓░░░░░░░░░░░░░░░░░░░░░░░░░
    JOHNSON       420,628.00 ▓░░░░░░░░░░░░░░░░░░░░░░░░
    SALINE        245,450.24 ▓░░░░░░░░░░░░░░
    FINNEY        171,862.20 ▓░░░░░░░░░░
    BROWN         145,254.96 ▓░░░░░░░░
    KIOWA          97,974.00 ▓░░░░░
    WILSON         74,747.10 ▓░░░░
    FORD           70,780.00 ▓░░░░
                             +-------------+-------------+-------------+-------------+
                             0          250,000       500,000       750,000  1,000,000

This example, along with detailed comments, is available as a `Jupyter notebook `_.

Join us
=======

* `Contributing `_ - guidance for developers who want to contribute to agate
* `Release process `_ - the process for maintainers to publish new releases
* `License `_ - a copy of the MIT open source license covering agate

Who we are
==========

.. include:: ../AUTHORS.rst

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

agate-1.9.1/docs/install.rst

============
Installation
============

Users
-----

To use agate install it with pip::

    pip install agate

For non-English locale support, `install PyICU `__.
Developers
----------

If you are a developer who also wants to hack on agate, install it from git::

    git clone git://github.com/wireservice/agate.git
    cd agate
    mkvirtualenv agate
    pip install -e .[test]
    python setup.py develop

.. note::

    To run the agate tests with coverage::

        pytest --cov agate

Supported platforms
-------------------

agate supports the following versions of Python:

* Python 2.7
* Python 3.5+
* `PyPy `_ versions >= 4.0.0

It is tested primarily on OSX, but due to its minimal dependencies it should work perfectly on both Linux and Windows.

.. note::

    `iPython `_ or `Jupyter `_ user? Agate works great there too.

agate-1.9.1/docs/license.rst

=======
License
=======

.. include:: ../COPYING

agate-1.9.1/docs/release_process.rst

===============
Release process
===============

If substantial changes were made to the code:

#. Ensure any new modules have been added to setup.py's ``packages`` list
#. Ensure any new public interfaces have been added to the documentation
#. Ensure TableSet proxy methods have been added for new Table methods

Then:

#. All tests pass on continuous integration
#. The changelog is up-to-date and dated
#. The version number is correct in:

   * setup.py
   * docs/conf.py

#. Check for new authors: ``git log --invert-grep --author='James McKinney'``
#. Run ``python charts.py`` to update images in the documentation
#. Tag the release: ``git tag -a x.y.z -m 'x.y.z release.'; git push --follow-tags``
#. Upload to PyPI: ``rm -rf dist; python setup.py sdist bdist_wheel; twine upload dist/*``
#. Build the documentation on ReadTheDocs manually

agate-1.9.1/docs/requirements.txt

furo
sphinx>2
docutils>=0.18

agate-1.9.1/docs/tutorial.rst

========
Tutorial
========

The agate tutorial is now available in new-and-improved Jupyter Notebook format.

Find it |link-pre|\ |version|\ |link-post|

.. |link-pre| raw:: html

   on GitHub!

agate-1.9.1/example.py

#!/usr/bin/env python

import agate

tester = agate.TypeTester(force={
    'fips': agate.Text()
})

table = agate.Table.from_csv('examples/realdata/ks_1033_data.csv', column_types=tester)

# Question 1: What was the total cost to Kansas City area counties?

# Filter to counties containing Kansas City
kansas_city = table.where(lambda r: r['county'] in ('JACKSON', 'CLAY', 'CASS', 'PLATTE'))

# Sum total_cost of four counties
print('Total for Kansas City area: %i' % kansas_city.aggregate(agate.Sum('total_cost')))

# Question 2: Which counties spent the most?

# Group by counties
counties = table.group_by('county')

# Aggregate totals for all counties
totals = counties.aggregate([
    ('total_cost_sum', agate.Sum('total_cost'))
])

totals = totals.order_by('total_cost_sum', reverse=True)
totals.limit(20).print_bars('county', 'total_cost_sum', width=80)

print('Five most spendy counties:')
totals.print_table(5)

# Question 3: What are the most recent purchases?
recent = table.order_by('ship_date', reverse=True)

print('Five most recent purchases:')
recent.print_table(5, 5)

# Question 4: What is the standard deviation of the cost of all purchases?
stdev = table.aggregate(agate.StDev('total_cost'))

print('Standard deviation of total_cost: %.2f' % stdev)

# Question 5: How many robots were purchased?
robots = table.where(lambda r: 'ROBOT' in (r['item_name'] or [])).aggregate(agate.Sum('quantity')) print('Number of robots purchased: %i' % robots) agate-1.9.1/example.py.ipynb000066400000000000000000000202741454111471500157400ustar00rootroot00000000000000{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Using agate in a Jupyter notebook\n", "\n", "First we import agate. Then we create an agate Table by loading data from a CSV file." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n" ] } ], "source": [ "import agate\n", "\n", "table = agate.Table.from_csv('examples/realdata/ks_1033_data.csv')\n", "\n", "table" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Question 1: What was the total cost to Kansas City area counties?\n", "\n", "To answer this question, we first must filter the table to only those rows which refer to a Kansas City area `county`." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "1575\n", "15\n" ] } ], "source": [ "kansas_city = table.where(lambda r: r['county'] in ('JACKSON', 'CLAY', 'CASS', 'PLATTE'))\n", "\n", "print(len(table.rows))\n", "print(len(kansas_city.rows))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can then print the Sum of the costs of all those rows. (The cost column is named `total_cost`.)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "$3716\n" ] } ], "source": [ "print('$%d' % kansas_city.aggregate(agate.Sum('total_cost')))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Question 2: Which counties spent the most?\n", "\n", "This question is more complicated. First we group the data by `county`, which gives us a TableSet named `counties`. 
A TableSet is a group of tables with the same columns." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "('ALLEN', 'ANDERSON', 'BARTON', 'BOURBON', 'BROWN', 'BUTLER', 'CHAUTAUQUA', 'CHEROKEE', 'CHEYENNE', 'CLAY', 'COFFEY', 'COMANCHE', 'COWLEY', 'CRAWFORD', 'DECATUR', 'DICKINSON', 'DOUGLAS', 'ELLIS', 'ELLSWORTH', 'FINNEY', 'FORD', 'FRANKLIN', 'GEARY', 'PRATT', 'GRAHAM', 'GRANT', 'GRAY', 'GREELEY', 'GREENWOOD', 'HAMILTON', 'HARVEY', 'HASKELL', 'JACKSON', 'JEFFERSON', 'JOHNSON', 'KIOWA', 'LABETTE', 'LEAVENWORTH', 'LINN', 'LOGAN', 'LYON', 'MARION', 'MARSHALL', 'MCPHERSON', 'MEADE', 'MIAMI', 'MONTGOMERY', 'NEMAHA', 'NEOSHO', 'NORTON', 'OSAGE', 'OTTAWA', 'PAWNEE', 'POTTAWATOMIE', 'RAWLINS', 'RENO', 'RICE', 'RILEY', 'ROOKS', 'SALINE', 'SCOTT', 'SEDGWICK', 'SHAWNEE', 'SHERMAN', 'SMITH', 'STAFFORD', 'SUMNER', 'THOMAS', 'WABAUNSEE', 'WICHITA', 'WILSON', 'WOODSON', 'WYANDOTTE')\n" ] } ], "source": [ "# Group by county\n", "counties = table.group_by('county')\n", "\n", "print(counties.keys())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We then use the `aggregate` function to sum the `total_cost` column for each table in the group. The resulting values are collapsed into a new table, `totals`, which has a row for each county and a column named `total_cost_sum` containing the new total." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "('county', 'total_cost_sum')\n" ] } ], "source": [ "# Aggregate totals for all counties\n", "totals = counties.aggregate([\n", " ('total_cost_sum', agate.Sum('total_cost'),)\n", "])\n", "\n", "print(totals.column_names)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, we sort the counties by their total cost, limit the results to the top 10 and then print the results as a text bar chart." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "county total_cost_sum\n", "SEDGWICK 977,174.45 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "COFFEY 691,749.03 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "MONTGOMERY 447,581.20 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "JOHNSON 420,628.00 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "SALINE 245,450.24 ▓░░░░░░░░░░░░░░░░░░ \n", "FINNEY 171,862.20 ▓░░░░░░░░░░░░░ \n", "BROWN 145,254.96 ▓░░░░░░░░░░░ \n", "KIOWA 97,974.00 ▓░░░░░░░ \n", "WILSON 74,747.10 ▓░░░░░ \n", "FORD 70,780.00 ▓░░░░░ \n", "GREENWOOD 69,722.00 ▓░░░░░ \n", "DOUGLAS 68,069.42 ▓░░░░░ \n", "MIAMI 64,691.09 ▓░░░░░ \n", "LYON 51,236.00 ▓░░░░ \n", "HAMILTON 47,989.00 ▓░░░░ \n", "WYANDOTTE 45,259.50 ▓░░░ \n", "CHEYENNE 35,970.92 ▓░░░ \n", "PRATT 35,745.60 ▓░░░ \n", "FRANKLIN 34,309.00 ▓░░░ \n", "HARVEY 27,387.42 ▓░░ \n", " +-----------------+-----------------+------------------+-----------------+\n", " 0 250,000 500,000 750,000 1,000,000\n" ] } ], "source": [ "totals.order_by('total_cost_sum', reverse=True).limit(20).print_bars('county', 'total_cost_sum', width=100)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", 
"language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.0" } }, "nbformat": 4, "nbformat_minor": 0 } agate-1.9.1/examples/000077500000000000000000000000001454111471500144245ustar00rootroot00000000000000agate-1.9.1/examples/realdata/000077500000000000000000000000001454111471500162015ustar00rootroot00000000000000agate-1.9.1/examples/realdata/Datagov_FY10_EDU_recp_by_State.csv000066400000000000000000000053111454111471500244020ustar00rootroot00000000000000State Name,State Abbreviate,Post-9/11GI Bill Program,Montgomery GI Bill - Active Duty,Montgomery GI Bill - Selective Reserve,Dependents' Educational Assistance,Reserve Educational Assistance Program,Post-Vietnam Era Veteran's Educational Assistance Program,TOTAL ALABAMA,AL,7738,5779,2075,3102,883,5,"19,582" ALASKA,AK,1781,561,170,164,28,1,"2,705" ARIZONA,AZ,18588,21012,2132,3571,1433,7,"46,743" ARKANSAS,AR,2579,1639,1219,1760,733,2,"7,932" CALIFORNIA,CA,44906,20642,2977,7283,800,31,"76,639" COLORADO,CO,11379,6987,905,1735,472,7,"21,485" CONNECTICUT,CT,2208,1169,507,412,51,3,"4,350" DELAWARE,DE,1061,374,176,238,54,1,"1,904" DIST. 
OF COLUMBIA,DC,1485,434,50,191,23,2,"2,185" FLORIDA,FL,28292,20414,2986,6582,1227,18,"59,519" GEORGIA,GA,12464,9303,2473,3765,618,8,"28,631" HAWAII,HI,4012,802,352,444,87,2,"5,699" IDAHO,ID,1576,1134,618,423,193,1,"3,945" ILLINOIS,IL,13522,12000,2653,2218,1602,12,"32,007" INDIANA,IN,5061,2925,1892,1249,840,5,"11,972" IOWA,IA,4640,6237,1834,889,887,3,"14,490" KANSAS,KS,3495,2354,797,680,415,3,"7,744" KENTUCKY,KY,3852,2282,1130,1819,502,4,"9,589" LOUISIANA,LA,3992,2357,1222,1564,390,3,"9,528" MAINE,ME,1144,535,286,699,68,1,"2,733" MARYLAND,MD,11006,5022,683,1115,244,8,"18,078" MASSACHUSETTS,MA,4644,1947,854,1047,293,7,"8,792" MICHIGAN,MI,6634,3426,1439,1805,444,11,"13,759" MINNESOTA,MN,7125,3698,1957,1625,762,7,"15,174" MISSISSIPPI,MS,2928,1307,1428,916,571,2,"7,152" MISSOURI,MO,7512,7831,2154,1673,1643,8,"20,821" MONTANA,MT,1251,651,372,359,78,1,"2,712" NEBRASKA,NE,2483,2295,854,854,404,2,"6,892" NEVADA,NV,2919,1711,411,589,97,1,"5,728" NEW HAMPSHIRE,NH,1101,425,245,264,54,1,"2,090" NEW JERSEY,NJ,4619,2274,947,901,312,5,"9,058" NEW MEXICO,NM,2372,2116,462,1322,126,3,"6,401" NEW YORK,NY,12199,5420,1937,2517,591,14,"22,678" NORTH CAROLINA,NC,10068,7973,1598,4267,595,7,"24,508" NORTH DAKOTA,ND,857,519,529,259,255,1,"2,420" OHIO,OH,8994,5295,3160,1991,1174,11,"20,625" OKLAHOMA,OK,4129,3943,1418,2314,900,4,"12,708" OREGON,OR,3937,2437,638,1382,218,4,"8,616" PENNSYLVANIA,PA,11815,4904,2353,2225,690,11,"21,998" RHODE ISLAND,RI,953,333,238,243,80,1,"1,848" SOUTH CAROLINA,SC,6067,2852,1597,2195,341,4,"13,056" SOUTH DAKOTA,SD,826,564,684,284,226,1,"2,585" TENNESSEE,TN,6613,3430,1382,1829,481,5,"13,740" TEXAS,TX,35596,19027,3244,8041,1089,18,"67,015" UTAH,UT,3133,2250,1170,744,533,2,"7,832" VERMONT,VT,690,316,104,125,37,0,"1,272" VIRGINIA,VA,26426,10092,1549,3322,693,10,"42,092" WASHINGTON,WA,10231,5059,743,2381,307,8,"18,729" WEST VIRGINIA,WV,4145,5834,1075,840,1022,1,"12,917" WISCONSIN,WI,5436,3131,1509,1402,825,5,"12,308" WYOMING,WY,464,561,227,150,84,1,"1,487" PUERTO 
RICO,PR,957,635,1285,1580,670,2,"5,129" PHILIPPINES,PH,171,291,0,10,0,0,472 agate-1.9.1/examples/realdata/README.csv000066400000000000000000000002231454111471500176500ustar00rootroot00000000000000title,filename,url,retrieval_date "FY 10 Education Recipients by State",Datagov_FY10_EDU_recp_by_State.csv,http://www.data.gov/raw/4509,2011-04-17 agate-1.9.1/examples/realdata/exonerations-20150828.csv000066400000000000000000003676761454111471500223740ustar00rootroot00000000000000last_name,first_name,age,race,state,tags,crime,sentence,convicted,exonerated,dna,dna_essential,mistake_witness,false_confession,perjury,false_evidence,official_misconduct,inadequate_defense Abbitt,Joseph Lamont,31,Black,NC,CV,Child Sex Abuse,Life,1995,2009,Y,,Y,,,,, Abdal,Warith Habib,43,Black,NY,,Sexual Assault,20 to Life,1983,1999,Y,,Y,,,Y,Y, Abernathy,Christopher,17,Caucasian,IL,"CIU, CV",Murder,Life without parole,1987,2015,Y,,,Y,Y,,Y, Acero,Longino,35,Hispanic,CA,"NC, P",Sex Offender Registration,2 years and 4 months,1994,2006,,,,,,,,Y Adams,Anthony,26,Hispanic,CA,P,Manslaughter,12 years,1996,2001,,,,,Y,,Y, Adams,Cheryl,26,Caucasian,MA,"F, NC, P",Theft,"Probation and $5,500 restitution",1989,1993,,,,Y,Y,,, Adams,Don Ray,32,Black,PA,,Murder,Life,1992,2011,,,,,Y,,, Adams,Jarrett M.,17,Black,WI,NC,Sexual Assault,28 years,2000,2007,,,,,Y,,,Y Adams,Johnathan,12,Caucasian,GA,"CV, P",Murder,1 year,2005,2006,,,,Y,,,, Adams,Kenneth,21,Black,IL,"CDC, JI",Murder,75 years,1978,1996,Y,,Y,,Y,Y,Y, Adams,Laurence,19,Black,MA,JI,Murder,Death,1974,2004,,,,,Y,,Y,Y Adams,Randall D.,28,Caucasian,TX,CDC,Murder,Death,1977,1989,,,Y,,Y,,Y, Adams,Reginald,26,Black,LA,,Murder,Life without parole,1983,2014,,,,Y,Y,,Y, Adams,Sandra,29,Black,NY,"F, M, NC",Menacing,Not sentenced,1998,1999,,,,,Y,,Y, Adams,Sean,21,Black,CT,,Murder,Life,1999,2013,,,,,Y,,Y, Adams,Seneca,20,Black,IL,"M, NC",Assault,Probation,2006,2006,,,,,Y,,Y, Adams,Tari,18,Black,IL,"M, NC",Assault,Probation,2006,2006,,,,,Y,,Y, 
Addison,Ronald,19,Black,MD,,Murder,30 years,1998,2005,,,,,Y,,Y,Y Aguirre,Omar,28,Hispanic,IL,CDC,Murder,55 years,1999,2003,,,,Y,Y,,Y, Ajamu,Kwame,17,Black,OH,,Murder,Death,1975,2014,,,,,Y,,Y, Aldape Guerra,Ricardo,20,Hispanic,TX,,Murder,Death,1982,1997,,,,,Y,,Y, Aldridge,Robert,24,Caucasian,OH,"CSH, CV, NC",Child Sex Abuse,Life,1985,1997,,,,,Y,,Y, Alejandro,Gilbert,35,Hispanic,TX,,Sexual Assault,12 years,1990,1994,Y,,Y,,Y,Y,Y, Alexander,John Randall,22,Caucasian,MS,CV,Murder,Life,1989,2010,,,,,Y,,, Alexander,Richard,29,Black,IN,,Sexual Assault,70 years,1998,2001,Y,,Y,,,Y,, Algarin,Albert,21,Hispanic,NY,"CSH, CV, NC",Child Sex Abuse,25 to 50 years,1985,1990,,,,,Y,,Y, Allen,Billy Frederick,37,Caucasian,TX,,Murder,99 years,1983,2011,,,,,,,,Y Allen,Dante,16,Black,OH,,Kidnapping,Not sentenced,2005,2005,,,Y,,,,, "Allen, Jr.",George,25,Black,MO,,Murder,95 years,1983,2013,Y,*,,Y,Y,Y,Y, Alowonle,Rilwan,50,Black,MN,NC,Sexual Assault,4 years,2007,2010,,,,,Y,,,Y Alvarado,Victor,26,Hispanic,IL,"NC, P",Drug Possession or Sale,6 years,2012,2013,,,,,Y,,Y, Alvarez,Jesse,19,Hispanic,CA,P,Manslaughter,12 years,1996,2001,,,,,Y,,Y, Alvarez,Jorge,23,Hispanic,CA,P,Manslaughter,12 years,1996,2001,,,,,Y,,Y, Alvarez,Roy,38,Hispanic,CA,,Robbery,14 years,1995,2002,,,Y,,,,Y,Y Alvirez,Daniel,32,Hispanic,F-DC,"FED, NC, P",Conspiracy,Not sentenced,2011,2012,,,,,,,, Ambler,Peter,24,Caucasian,WI,,Murder,Life,1987,1989,,,,,Y,,, Amezquita,Gilbert,20,Hispanic,TX,,Assault,15 years,1998,2007,,,Y,,,,,Y Amirault,Violet,59,Caucasian,MA,"CSH, CV, F, NC, PH",Child Sex Abuse,20 years,1987,1998,,,,,Y,,Y,Y Amon-Ra,Nubian,43,Black,AZ,NC,Assault,7 years and 6 months,2006,2012,,,,,Y,,, Amrine,Joseph,28,Black,MO,JI,Murder,Death,1986,2003,,,,,Y,,Y,Y Andersen,Daniel,19,Caucasian,IL,,Murder,55 years,1982,2015,Y,*,,Y,Y,,Y, Anderson,Candice,21,Caucasian,TX,"F, NC, P",Manslaughter,Probation,2007,2014,,,,,,,, Anderson,David,34,Caucasian,PA,NC,Sexual Assault,5 to 15 years,2001,2011,,,,,,,Y, Anderson,James 
C.,39,Black,NY,CV,Burglary/Unlawful Entry,Not sentenced,1990,1990,,,Y,,,,, Anderson,James S.,26,Black,WA,,Robbery,17 years,2005,2009,,,,,Y,,,Y Anderson,Mark,51,Caucasian,HI,NC,Other Violent Felony,9 days,2011,2014,,,,,Y,,, Anderson,Marvin,18,Black,VA,,Sexual Assault,Life,1982,2002,Y,,Y,,,,, Anderson,Roland,32,Black,MS,CV,Assault,15 years,1997,2007,,,Y,,,,,Y Andre,Eddie,41,Black,NY,,Murder,25 to Life,1988,1994,,,,,Y,,, Andrews,James,21,Black,IL,CDC,Murder,Life,1985,2008,,,,Y,Y,,Y, Anthony,Obie,19,Black,CA,,Murder,Life without parole,1995,2011,,,,,Y,,Y,Y Antoine,Narcisse,27,Black,FL,,Attempted Murder,40 years,2011,2015,Y,*,,,,,, Appling,Riolordo,28,Black,CA,,Assault,Not sentenced,2012,2013,,,Y,,,,,Y Arledge,Randolph,26,Caucasian,TX,,Murder,99 years,1984,2013,Y,,,,Y,Y,, Armstrong,LaMonte,38,Black,NC,"CDC, JI",Murder,Life,1995,2013,Y,*,,,Y,Y,Y, Armstrong,Ralph,27,Caucasian,WI,,Murder,Life,1981,2009,Y,*,,,Y,Y,, Armstrong,Richard,25,Caucasian,MI,"CV, NC",Sexual Assault,14 to 30 years,2006,2012,,,,,Y,,Y,Y Arrington,Jermaine,19,Black,MD,,Murder,25 years,1995,2010,Y,*,Y,,,,,Y Arroyo,Rogelio,25,Hispanic,IL,CDC,Murder,Life without parole,1982,1991,,,Y,,Y,,, Arteaga,Josephine,21,Hispanic,TX,"CIU, F, M, NC, P",Drug Possession or Sale,Probation,2014,2015,,,,,,,, Ashe,Carlos,18,Black,CT,,Murder,90 years,2000,2013,,,,,Y,,Y, Atkins,Herman,19,Black,CA,,Sexual Assault,47 years,1988,2000,Y,,Y,,,Y,, Atkins,Kenneth,16,Caucasian,FL,"CV, NC",Sexual Assault,10 years,2004,2008,,,,,Y,,, Atkins,Timothy,17,Black,CA,,Murder,32 to Life,1987,2007,,,Y,,Y,,Y, Atlas,Gerald,25,Black,CA,,Attempted Murder,28 years to life,1990,1998,,,Y,,Y,,Y, Austin,Darryl,22,Black,NY,"CIU, PH",Murder,18 to life,1988,2014,,,,,Y,,Y, Austin,Michael,25,Black,MD,,Murder,Life,1975,2002,,,,,Y,,Y,Y Avery,Chamar,18,Black,MI,,Murder,20 to 50 years,2000,2010,,,Y,,,,,Y Avery,Steven,22,Caucasian,WI,,Sexual Assault,32 years,1985,2003,Y,,Y,,,,, Avery,William,25,Black,WI,JI,Murder,40 years,2005,2010,Y,,,,Y,,, 
Aviles,Anselmo,22,Hispanic,MA,,Sexual Assault,16 to 18 years,1989,1997,,,Y,,,,,Y Awe,Joseph,36,Caucasian,WI,"A, NC",Arson,3 years,2007,2013,,,,,Y,Y,Y,Y Ayers,David,42,Black,OH,JI,Murder,Life without parole,2000,2011,Y,,,,Y,,Y, Ayers,Devon,18,Black,NY,,Murder,25 years to life,1997,2012,,,,,Y,,, Ayers,Devon,18,Black,NY,,Murder,25 years to life,1997,2013,,,,,Y,,Y, Ayers,Randall Lynn,17,Caucasian,OH,CV,Attempted Murder,14 to 50 years,1982,1990,,,Y,,,,, Baba-Ali,Amine,31,Other,NY,"CV, NC",Child Sex Abuse,8 1/3 to 25 years,1989,1992,,,,,Y,Y,Y,Y Baddeley,Gabriel,19,Black,WA,"A, P",Arson,2 months,2002,2004,,,,Y,,,, Bailey,Darryl,17,Black,TN,,Murder,Life,1994,2000,,,,Y,,,, Baillie,Nissa,20,Caucasian,FL,"F, NC",Manslaughter,15 years,1997,2001,,,,,Y,,Y, Bain,James,19,Black,FL,CV,Child Sex Abuse,Life,1974,2009,Y,,Y,,,,, Baker,Edward,17,Black,PA,CDC,Murder,Life,1974,2002,,,,Y,Y,,Y,Y Baker,Jimmy Lee,22,Caucasian,OK,,Assault,Life,2009,2010,,,,,Y,,Y,Y Baker,Terry,30,Black,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,Y,, Ballard,Frances,54,Caucasian,TN,"CSH, CV, F, NC",Child Sex Abuse,5 years,1987,1993,,,,,Y,Y,Y, Ballinger,Gene Curtis,45,Caucasian,NM,,Murder,9 years,1981,1993,,,,,Y,,Y, Baltierrez,Joseph,21,Don't Know,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,Y,, Baltrip,Henry,22,Black,TX,"CIU, NC, P",Drug Possession or Sale,7 years,2013,2015,,,,,,Y,,Y Baniani,Borzou,27,Other,CA,NC,Drug Possession or Sale,Probation,2013,2014,,,,,,,, Banks,Brian,16,Black,CA,"CV, NC, P",Sexual Assault,6 years,2003,2012,,,,,Y,,, Banks,Medell,27,Black,AL,"CDC, CV, NC, P",Manslaughter,15 years,2001,2003,,,,Y,Y,,Y, Baran,Bernard,18,Caucasian,MA,"CSH, CV, NC",Child Sex Abuse,Life,1985,2009,,,,,Y,Y,Y,Y Barber,Jason,21,Caucasian,TX,CV,Murder,35 years,1997,2000,,,,,Y,,, Barbour,Bennett,22,Black,VA,,Sexual Assault,10 years,1978,2012,Y,,Y,,,,, Barnes,Steven,19,Caucasian,NY,"CV, JI",Murder,25 to Life,1989,2009,Y,,Y,,Y,Y,, "Barnes, Jr.",Donald,32,Caucasian,FL,"CV, 
NC",Child Sex Abuse,Life without parole,2011,2014,,,,,Y,,Y, Barr,Jonathan,14,Black,IL,"CDC, CV",Murder,85 years,1997,2011,Y,,,,Y,,Y, Baruxes,Kevin,18,Caucasian,CA,NC,Sexual Assault,18 to Life,1996,2003,,,,,Y,,, Bass,Jimmy,18,Black,MS,,Robbery,50 years,1988,2010,,,,,Y,,Y,Y Bates,David,18,Black,IL,CDC,Murder,20 years,1985,2015,,,,Y,Y,,Y, Batts,Rosa Sade,25,Black,TX,"F, NC, P",Drug Possession or Sale,2 years,2012,2013,,,,,,Y,, Bauer,Chester,26,Caucasian,MT,,Sexual Assault,30 years,1983,1997,Y,,Y,,,Y,, Baumer,Julie,27,Caucasian,MI,"CV, F, NC, SBS",Child Abuse,10 to 15 years,2005,2010,,,,,,Y,,Y Baylor,Ronnie,25,Black,CA,,Sexual Assault,16 years,1987,1996,,,,Y,,,Y,Y Beal,Kenneth,23,Black,TX,"CIU, NC, P",Drug Possession or Sale,7 months,2007,2015,,,,,,Y,, Beaman,Alan W.,20,Caucasian,IL,,Murder,50 years,1995,2009,,,,,,,Y, Beamon,Melvin Todd,24,Black,AL,,Murder,25 years,1989,1990,,,,,Y,,Y, Beard,Jacob,34,Caucasian,WV,"CDC, JI",Murder,Life without parole,1993,2000,,,Y,,Y,Y,Y, Beauchamp,Franklin,27,Hispanic,NY,"CSH, CV, NC",Child Sex Abuse,25 to 75 years,1986,1989,,,,,Y,,Y, Beaver,Antonio,30,Black,MO,,Robbery,18 years,1997,2007,Y,,Y,,,,, Becerra,Jeanie,21,Caucasian,KS,"F, M, NC",Obstruction of Justice,Not sentenced,2014,2014,,,,,Y,,Y,Y Beckett,Dale,33,Black,OH,JI,Murder,15 years to Life,1997,2003,,,Y,,Y,,Y, Belcher,Michael,21,Caucasian,OH,,Manslaughter,20 years,1999,2001,,,,,,,Y,Y Bell,Derrick,22,Black,NY,,Robbery,12 1/2 to 25 years,1997,2007,,,Y,,,,,Y Bell,Jerrell,19,Black,TX,"NC, P",Drug Possession or Sale,120 days,2011,2014,,,,,,Y,, "Bell, Jr.",James,27,Black,PA,,Drug Possession or Sale,2 to 5 years,2008,2014,,,,,Y,,Y,Y Bellamy,Kareem,26,Black,NY,,Murder,25 to Life,1995,2011,,,Y,,Y,,Y, Berghoff,Stacey,27,Caucasian,TX,"CIU, F, M, NC, P",Drug Possession or Sale,15 days,2014,2014,,,,,,,, Beridon,Cheryle,23,Black,LA,"F, NC",Drug Possession or Sale,Life,1979,2003,,,,,Y,,Y, Bermudez,Fernando,21,Hispanic,NY,CV,Murder,23 to Life,1992,2009,,,,,Y,,Y, Berry,Wilder,20,Black,IL,"CV, 
NC",Sexual Assault,35 years,1992,2000,,,,,Y,,,Y Berryman,Earl,24,Black,NJ,,Sexual Assault,50 years,1985,1997,,,Y,,Y,,,Y Bianco,Thomas,19,Caucasian,NY,,Murder,25 to Life,1986,1993,,,,,Y,,Y, Bibbins,Gene,29,Black,LA,CV,Child Sex Abuse,Life,1987,2003,Y,,Y,,Y,Y,Y, bin Wahad,Dhoruba,26,Black,NY,,Attempted Murder,25 to Life,1973,1995,,,,,,,Y, Birden,Eric,23,Black,TX,"CIU, NC, P",Drug Possession or Sale,30 days in jail,2007,2015,,,,,,Y,, Bivens,Phillip,28,Black,MS,"CDC, P",Murder,Life,1980,2010,Y,,,Y,Y,,Y, "Bivens, Jr.",Darrell Wayne,33,Black,TX,"NC, P",Sex Offender Registration,3 years,2013,2013,,,,,,,,Y Bjerklie,Donna,42,Caucasian,ND,"F, NC, P",Theft,Probation,2005,2006,,,,,,,, Blackshire,James,18,Black,TX,"CIU, CV, NC, P",Child Sex Abuse,40 years,1995,2009,,,,,Y,,Y,Y Blair,Michael,23,Caucasian,TX,CV,Murder,Death,1994,2008,Y,,Y,,,Y,, Blake,Bryan,22,Caucasian,NY,,Murder,25 to life,1985,1989,,,Y,,,,,Y Blake,Jeffrey,21,Black,NY,,Murder,36 to Life,1991,1998,,,,,Y,,Y, Bloodsworth,Kirk,23,Caucasian,MD,CV,Murder,Death,1985,1993,Y,,Y,,,,Y, Blyden,Malisha,22,Black,NY,"CDC, F",Attempted Murder,40 years,2007,2014,,,Y,,Y,,Y, Bolduc,Frank,42,Caucasian,F-WI,FED,Robbery,48 years and 4 months,1991,1999,,,Y,,,,Y, Bolstad,Daniel,37,Caucasian,WI,,Sexual Assault,20 years,2007,2015,,,Y,,Y,,,Y Booker,Donte,18,Black,OH,,Sexual Assault,10 to 25 years,1987,2005,Y,,Y,,,,, Boots,Christopher,19,Caucasian,OR,JI,Murder,20 to Life,1987,1995,Y,*,,,Y,Y,, Boquete,Orlando,27,Hispanic,FL,,Sexual Assault,65 years,1983,2006,Y,,Y,,,Y,, Bostic,Larry,31,Black,FL,P,Sexual Assault,8 years,1989,2007,Y,*,,,Y,,, Bostick,Yeidja,21,Caucasian,PA,"CV, F",Manslaughter,5 to 10 years,1990,1993,,,,,Y,Y,Y, Bowman,James,18,Black,MO,CDC,Murder,Life,1986,2001,,,,,Y,,Y, Boyce,David,19,Caucasian,VA,JI,Murder,Life,1991,2013,Y,*,Y,,Y,Y,Y,Y Boyd,Latherial,24,Black,IL,CIU,Murder,82 years,1990,2013,,,,,Y,,Y,Y "Boyd, Jr.",Kenneth Wayne,22,Black,TX,"CV, JI",Murder,Life,1999,2013,,,,,Y,,Y,Y Boyette,Calvin,19,Black,NY,A,Attempted 
Murder,12 1/2 to 25 years,1984,2001,,,Y,,,,Y,
Bozella,Dewey,16,Black,NY,CDC,Murder,20 to Life,1983,2009,,,,,Y,,Y,
Bradford,Marcellius,17,Black,IL,"CDC, P",Kidnapping,12 years,1988,2001,Y,,,Y,Y,Y,Y,
Bradford,Ted,23,Caucasian,WA,,Sexual Assault,13 years,1996,2010,Y,,Y,Y,,,,
Bradford,Ty,39,Caucasian,MT,NC,Assault,10 years,1999,2003,,,,,Y,,,Y
Bradley,Adam,48,Caucasian,NY,"M, NC",Assault,Probation,2010,2013,,,,,Y,,,
Bragdon,Anthony,19,Black,DC,CV,Sexual Assault,30 years,1992,2003,,,Y,,Y,Y,Y,
Bragg,Rodney L.,31,Black,AR,,Drug Possession or Sale,Life,1996,2000,,,,,Y,,Y,Y
Branch,Lamont,23,Black,NY,,Murder,25 years,1990,2002,,,,,Y,,Y,
Branch,Michael,45,Caucasian,AR,NC,Murder,20 years,1995,1998,,,,,Y,,Y,
Brandley,Clarence,29,Black,TX,CV,Murder,Death,1981,1990,,,,,Y,,Y,
Braunskill,Clarence,38,Black,NY,,Drug Possession or Sale,20 to 40 years,1990,1997,,,Y,,,,,
Bravo,Mark,31,Hispanic,CA,,Sexual Assault,8 years,1990,1994,Y,,Y,,Y,Y,Y,Y
Brewer,Kennedy,21,Black,MS,"CV, JI",Murder,Death,1995,2008,Y,,,,Y,Y,Y,
Bridgeman,Wiley,20,Black,OH,,Murder,Death,1975,2014,,,,,Y,,,
Briggs,Brandy,19,Caucasian,TX,"CV, F, NC, P, SBS",Child Abuse,17 years,2000,2006,,,,,Y,Y,,Y
Bright,Dan L.,35,Black,LA,,Murder,Death,1996,2004,,,Y,,,,Y,Y
Bright,Gregory,20,Black,LA,CV,Murder,Life,1976,2003,,,,,Y,,Y,Y
Brim,Dominique,14,Black,MI,F,Assault,Not sentenced,2002,2002,,,Y,,Y,,,
Brinson,Jeremiah,28,Black,NY,NC,Robbery,18 years,1999,2009,,,,,Y,,,
Briscoe,Johnny,28,Black,MO,,Sexual Assault,45 years,1983,2006,Y,,Y,,,Y,Y,
Brison,Dale,27,Black,PA,,Sexual Assault,18 to 42 years,1991,1994,Y,,Y,,,Y,,
Britt,Cheydrick,29,Black,FL,CV,Child Sex Abuse,30 years,2004,2013,Y,*,,,Y,Y,,Y
Britton,Robert,38,Black,IL,NC,Sexual Assault,50 years,1996,2014,,,,Y,,,,Y
Broam,Jack Ray,29,Caucasian,NV,"CSH, CV, NC",Child Sex Abuse,Life,1990,1998,,,,,Y,,,
Brock,Donald,,Black,IL,NC,Theft,Not sentenced,1989,1989,,,Y,,Y,,,
Brock,Jerry,35,Black,WA,"CV, NC",Child Sex Abuse,Life,1995,2014,,,,,Y,,,
Brodie,Stephen,19,Caucasian,TX,"CIU, CV, P",Child Sex Abuse,5 years,1993,2010,,,,Y,,,Y,
Bromgard,Jimmy Ray,18,Caucasian,MT,CV,Child Sex Abuse,40 years,1987,2002,Y,,Y,,Y,Y,Y,Y
Bronson,Melinda,44,Caucasian,NY,"CV, F, NC",Child Sex Abuse,Probation,2002,2011,,,,,Y,,,
Brooks,Gwendolyn,52,Black,TX,"CIU, F, NC, P",Drug Possession or Sale,6 months,2010,2014,,,,,,Y,,
Brooks,Levon,26,Black,MS,CV,Murder,Life without parole,1992,2008,,,Y,,Y,Y,Y,
Brown,Alfred,21,Black,TX,CDC,Murder,Death,2005,2015,,,Y,,Y,,Y,
Brown,Charlotte,32,Black,TX,"F, NC, P",Drug Possession or Sale,6 years,2011,2012,,,,,,Y,,
Brown,Danny,24,Black,OH,,Murder,Life,1982,2001,Y,,Y,,,,,
Brown,Debra,35,Caucasian,UT,F,Murder,Life,1995,2013,,,,,,Y,Y,Y
Brown,Dennis,16,Black,LA,,Sexual Assault,Life,1985,2005,Y,,Y,Y,,Y,Y,
Brown,Joyce Ann,33,Black,TX,"F, JI",Murder,25 to Life,1980,1990,,,Y,,Y,,Y,
Brown,Keith,25,Black,NC,"CV, P",Sexual Assault,35 years,1993,1999,Y,,,Y,,,,
Brown,Leon,15,Black,NC,"CDC, CV",Murder,Death,1984,2014,Y,,,Y,Y,,Y,
Brown,Nathan,23,Black,LA,,Sexual Assault,25 years,1997,2014,Y,,Y,,,,,Y
Brown,Patrick,21,Black,PA,,Robbery,22 to 70 years,2002,2010,Y,,Y,,,,,
Brown,Robert,,Black,IL,,Murder,35 years,1984,1989,,,Y,,,,,
Brown,Roy,29,Caucasian,NY,JI,Murder,25 to Life,1992,2007,Y,,,,Y,Y,Y,
Brown,Timothy,14,Black,FL,,Murder,Life,1993,2003,,,,Y,Y,,Y,
Bryan,Sheila,43,Caucasian,GA,"A, F, NC",Murder,Life,1998,2000,,,,,,Y,,
Bryant,James,38,Black,CA,NC,Drug Possession or Sale,12 years,1997,2000,,,,,Y,,Y,
Bryson,David,28,Caucasian,OK,,Sexual Assault,85 years,1983,2003,Y,,Y,,Y,Y,Y,
Buchli II,Richard,50,Caucasian,MO,,Murder,Life without parole,2002,2012,,,,,Y,,Y,Y
Buckley,Gyronne,44,Black,AR,NC,Drug Possession or Sale,Life,1999,2010,,,,,Y,,Y,
Bullock,Keith,27,Caucasian,MN,P,Murder,Not sentenced,1992,1992,,,,,,,,
Bullock,Ronnie,27,Black,IL,CV,Child Sex Abuse,60 years,1984,1994,Y,,Y,,,,,
Bumbury,Michael,23,Black,NY,NC,Sexual Assault,12 to Life,1999,2002,,,,,Y,,Y,
Bunch,Kristine,21,Caucasian,IN,"A, CV, F, NC",Murder,60 years,1996,2012,,,,,Y,Y,Y,
Bunge,Charles,38,Caucasian,NY,,"Attempt, Violent",6 years,2007,2010,,,Y,,,,,
Buntin,Harold,15,Black,IN,,Sexual Assault,50 years,1986,2007,Y,,Y,,,Y,,
Burge,Gerald,29,Caucasian,LA,CDC,Murder,Life without parole,1986,1992,,,,,Y,,Y,
Burgess,Sabein,24,Black,MD,CIU,Murder,Life,1995,2014,,,,,,Y,Y,Y
Burnette,Victor,26,Caucasian,VA,,Sexual Assault,20 years,1979,2009,Y,,Y,,,,,
Burnside,Thomas,41,Black,F-IL,"CDC, FED",Drug Possession or Sale,Not sentenced,1991,1994,,,,,Y,,Y,
Burrell,Albert,30,Caucasian,LA,JI,Murder,Death,1987,2001,,,,,Y,,Y,
Burrowes,Christopher,21,Black,WI,CV,Child Sex Abuse,3 years,2007,2009,,,,,Y,,,
Burrows,Joe,35,Caucasian,IL,CDC,Murder,Death,1989,1996,,,,,Y,,Y,
Burt,Lazaro,20,Black,NY,,Murder,25 to Life,1994,2002,,,Y,,Y,,Y,
Burton,Darryl,22,Black,MO,JI,Murder,Life,1985,2008,,,,,Y,,Y,
Butler,Robert,34,Black,TN,NC,Sexual Assault,Life,1977,1995,,,,,Y,,,Y
Butler,Sabrina,18,Black,MS,"CV, F, NC",Murder,Death,1990,1995,,,,Y,,Y,Y,
"Butler, Jr.",A.B.,27,Black,TX,,Sexual Assault,Life,1983,2000,Y,,Y,,,,,
Byars,Barry,24,Caucasian,TX,"CV, NC, P",Child Abuse,10 years,2004,2005,,,,,Y,,,
Byrd,Kevin,23,Black,TX,,Sexual Assault,Life,1985,1997,Y,,Y,,,Y,,
Cage,Dean,28,Black,IL,CV,Child Sex Abuse,40 years,1996,2008,Y,,Y,,,,,
Caine,Eric,19,Black,IL,CDC,Murder,Life,1989,2011,,,,Y,Y,,Y,
"Calderon, III",Domingo,26,Hispanic,TX,"CV, NC, P",Child Sex Abuse,10 years,2005,2010,,,,,Y,,,
Caldwell,Maurice,23,Black,CA,,Murder,27 to Life,1991,2011,,,Y,,,,,Y
Callace,Leonard,30,Caucasian,NY,,Sexual Assault,25 to 50 years,1987,1992,Y,,Y,,,,,
Cameron,Darrell,19,Black,IL,,Robbery,Not sentenced,1991,1992,,,Y,,,,Y,
Caminata,Victor,33,Caucasian,MI,"A, NC",Arson,9 to 40 years,2009,2014,,,,,,Y,,Y
Camm,David,36,Caucasian,IN,"CDC, CV",Murder,Life,2002,2013,Y,,,,Y,,,
Canen,Lana,43,Caucasian,IN,F,Murder,55 years,2005,2012,,,,,Y,Y,,Y
Cannon,Phillip Scott,32,Caucasian,OR,,Murder,Life without parole,2000,2009,,,,,,Y,,
Canter,Mark,22,Caucasian,MI,CDC,Murder,Life,1988,1996,,,,,Y,,Y,
Cantu,Raul,34,Hispanic,TX,"NC, P",Drug Possession or Sale,6 months,2005,2005,,,,,,Y,,
Capozzi,Anthony,27,Caucasian,NY,,Sexual Assault,11 to 35 years,1987,2007,Y,,Y,,,,,
Carabajal,Joey,25,Hispanic,TX,"NC, P",Drug Possession or Sale,2 years,2013,2014,,,,,,,,
Caravella,Anthony,15,Caucasian,FL,,Murder,Life,1984,2010,Y,,,Y,,,Y,
Cardenas,Carlos,23,Hispanic,NY,,Robbery,8 to 25 years,1996,2007,,,Y,,,,,
Cardenas,Napoleon,24,Hispanic,NY,,Robbery,15 to 30 years,1999,2007,,,Y,,,,,
Cardwell,M. Donald,63,Caucasian,F-CT,"FED, NC, P",Tax Evasion/Fraud,Not sentenced,2000,2000,,,,,Y,,,
Carmona,Arthur,16,Hispanic,CA,,Robbery,12 years,1998,2000,,,Y,,,,,Y
Carnes,Linda,26,Caucasian,TX,"F, NC, P",Drug Possession or Sale,5 years,2014,2014,,,,,,Y,,
Carney,John,32,Caucasian,F-TX,"FED, NC",Conspiracy,"87 months and $895,000 restitution",1995,1997,,,,,Y,,Y,
Carr,Weldon Wayne,57,Caucasian,GA,"A, NC",Murder,Life,1994,2004,,,,,,Y,Y,
Carrillo,Francisco,15,Hispanic,CA,CV,Murder,Life,1992,2011,,,,,Y,,Y,
Carter,Brad,38,Caucasian,IL,"CV, NC",Child Sex Abuse,Not sentenced,2013,2015,,,,,Y,,,
Carter,Edward,19,Black,MI,,Sexual Assault,Life,1975,2010,,,Y,,,Y,,Y
Carter,Quentin,16,Black,MI,CV,Child Sex Abuse,6 to 20 years,1992,2015,,,,,Y,,,
Carter,Raymond,40,Black,PA,,Murder,Life,1988,1996,,,,,Y,,Y,
Carter,Richy,30,Black,WA,NC,Sexual Assault,14 months,2007,2013,,,,,Y,Y,Y,
Carter,Steven,19,Black,FL,,Assault,Not sentenced,1990,1990,,,Y,,,,,
Carter,William Douglas,55,Caucasian,VA,NC,Assault,14 years,1988,1992,,,,,Y,,Y,
Cash,Jacob,23,Caucasian,FL,,Murder,50 years,1998,2008,,,,,,,,
Castillo,Miguel,35,Hispanic,IL,,Murder,48 years,1991,2001,,,,Y,Y,,Y,
Castor,Gregory,34,Black,TX,"CIU, M, NC, P",Drug Possession or Sale,90 days,2014,2014,,,,,,Y,,
Catalan,Moises,25,Hispanic,TX,,Assault,35 years,1997,2003,,,,,Y,,,Y
Catron,Xavier,22,Black,IL,CV,Murder,35 years,1992,2000,,,,,Y,,Y,Y
Catton,James,43,Caucasian,F-IL,"FED, NC",Fraud,1 year and 9 months,1994,2004,,,,,Y,,Y,
Celestino,Juan,41,Hispanic,OH,"CV, NC, PH",Child Sex Abuse,Life,1991,1995,,,,,Y,,Y,
Cerrano,Dennis,46,Caucasian,CA,"CV, NC",Child Sex Abuse,6 years,1998,2000,,,,,Y,,,
Cesar,Owen,26,Black,PA,"CV, NC",Child Sex Abuse,1 to 5 years,2005,2011,,,,,Y,,,Y
Cesar,Owen,24,Black,PA,"CV, NC",Child Sex Abuse,16 to 37 years,2004,2009,,,,,Y,,,Y
Chalmers,Terry,20,Black,NY,,Sexual Assault,12 to 24 years,1987,1995,Y,,Y,,,,,
Chance,Clarence,23,Black,CA,JI,Murder,Life,1975,1992,,,,,Y,,Y,
Chandler,Edwin,21,Black,KY,,Manslaughter,30 years,1995,2009,,,Y,Y,,,,
Chaparro,Anthony,48,Hispanic,NJ,,Sexual Assault,Life,2003,2014,Y,,Y,,,Y,,Y
Chaplin,Michael T.,24,Caucasian,VA,"CV, NC",Child Sex Abuse,1 year,2008,2009,,,,,Y,,,
Chapman,Glen Edward,24,Black,NC,,Murder,Death,1994,2008,,,,,Y,,Y,Y
Chapman,Vernon,23,Black,LA,,Sexual Assault,Life,1980,1994,,,Y,,,,Y,
Charles,Clyde,26,Black,LA,,Sexual Assault,Life,1982,1999,Y,,Y,,,Y,,
Charles,Lambert,16,Black,NY,P,Manslaughter,10 to 30 years,1993,1998,,,,Y,Y,,,
Charles,Ulysses Rodriguez,30,Black,MA,,Sexual Assault,80 years,1984,2001,Y,,Y,,Y,Y,Y,
Chasstion,Donald,50,Black,TX,"CIU, NC, P",Drug Possession or Sale,30 days,2009,2015,,,,,,Y,,
Chatman,Carl,47,Black,IL,"CIU, NC",Sexual Assault,30 years,2004,2013,,,,Y,Y,,Y,Y
Chatman,Charles,20,Black,TX,CIU,Sexual Assault,Life,1981,2008,Y,,Y,,,,,
Chatmon,Timothy,30,Black,IL,CDC,Murder,35 years,1990,1995,,,,,Y,,,
Chavez,China,22,Caucasian,TX,"CIU, F, M, NC, P",Drug Possession or Sale,45 days,2014,2014,,,,,,,,
Chavez,Moses,22,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,45 days,2014,2014,,,,,,Y,,
Chen,Boping,49,Asian,CA,"CV, NC",Child Sex Abuse,14 years,2006,2009,,,,,Y,,,Y
Cheshire,Jesse Allen,37,Caucasian,OK,"CV, NC",Child Sex Abuse,16 years,2004,2008,,,,,Y,Y,,
Cheung,Kum Yet,,Asian,CA,,"Attempt, Violent",29 years and 8 months,1995,2002,,,Y,,Y,,Y,Y
Chretien,Roland,44,Caucasian,NH,NC,Sexual Assault,6 to 12 years,2005,2008,,,,,Y,,,
Christoph,Dayna,16,Caucasian,WA,"CV, F, NC, P",Child Sex Abuse,21 to 28 weeks,1995,2000,,,,Y,,,,Y
Chumley,Tom Edwin,52,Caucasian,GA,,Murder,Life,2005,2009,,,,Y,,,,Y
Clancy,Michael,22,Hispanic,NY,,Murder,25 to Life,1999,2009,,,Y,,,,,
Clark,Mark,41,Caucasian,WA,NC,Sexual Assault,Not sentenced,1998,1999,,,,,Y,,,
Clark,Robert,20,Black,GA,,Sexual Assault,Life,1982,2005,Y,,Y,,,,,
"Clay, Sr.",David,37,Black,MO,,Murder,Life without parole,1988,1999,,,Y,,,,,Y
Cleary,Mark,27,Caucasian,MI,"CV, NC",Child Sex Abuse,20 to 30 years,1989,2005,,,,,Y,,,
Clemmons,Artis,29,Black,F-LA,FED,Drug Possession or Sale,Not sentenced,1994,1995,,,,,Y,,,
Clemmons,Eric,23,Black,MO,,Murder,Death,1987,2000,,,Y,,,,Y,
Clugston,Christopher,20,Caucasian,FL,CDC,Murder,Life,1983,2001,,,Y,,Y,,,
Coco,Allen,26,Black,LA,,Sexual Assault,Life,1997,2006,Y,,Y,,,,,
Coker,Edgar,15,Black,VA,"CV, NC, P",Sexual Assault,1 year and 5 months,2007,2014,,,,,Y,,,Y
Cole,Christopher,19,Caucasian,FL,NC,Manslaughter,15 years,1997,2001,,,,,Y,,Y,
Cole,Reggie,18,Black,CA,,Murder,Life,1994,2009,,,,,Y,,Y,Y
Cole,Timothy B.,24,Black,TX,PH,Sexual Assault,25 years,1986,2009,Y,,Y,,,Y,,
Coleman,Christopher,20,Black,IL,"CDC, CV",Sexual Assault,60 years,1995,2014,,,Y,,Y,,Y,
Coleman,Keith,24,Black,TX,"NC, P",Drug Possession or Sale,8 months,2006,2013,,,,,,Y,,
Collins,Jabbar,20,Black,NY,,Murder,34 to Life,1995,2010,,,,,Y,,Y,
Colomb,Edward,26,Black,F-LA,"CDC, FED, JI",Drug Possession or Sale,Not sentenced,2006,2006,,,,,Y,,,
Colomb,Mary Ann,50,Black,F-LA,"CDC, F, FED, JI",Drug Possession or Sale,Not sentenced,2006,2006,,,,,Y,,,
Colon,Danny,26,Hispanic,NY,,Murder,50 years to life,1993,2011,,,,,Y,,Y,
Colton,Murray,20,Caucasian,CT,,Murder,50 years,1991,1998,,,,,Y,,,
Comstock,Stephen,43,Caucasian,NV,NC,Possession of Stolen Property,10 to 25 years,2004,2015,,,,,,,Y,
Coney,Robert Carroll,33,Black,TX,P,Robbery,Life,1966,2004,,,Y,Y,,,Y,Y
Conley,Caramad,18,Black,CA,,Murder,Life without parole,1994,2011,,,,,Y,,Y,
Conley,Kenneth,27,Caucasian,MA,,Perjury,2 years and 10 months,1998,2005,,,,,Y,,Y,
Connor,Matthew,41,Black,PA,CV,Murder,Life,1980,1990,,,Y,,,,Y,
Connor,Reginald,23,Black,NY,CV,Kidnapping,12.5 to 25 years,1993,2015,Y,*,,,Y,,Y,Y
Contee,Darian,18,Black,TX,"NC, P",Gun Possession or Sale,3 years,2007,2010,,,,,,,,Y
Conteh,Maligie,19,Black,VA,,Robbery,1 year and 11 months,2010,2013,,,Y,,,,Y,Y
Copeland,Darrell A.,18,Black,VA,NC,Gun Possession or Sale,5 years,2007,2008,,,,,,,,
Corner,Damon,23,Black,FL,,Murder,Life,2002,2012,,,,,Y,,Y,Y
Correia,Humberto,47,Other,F-MA,"A, FED, NC",Arson,Not Sentenced,2002,2004,,,,,,Y,,Y
Cortez,Arturo,34,Hispanic,CA,,Drug Possession or Sale,7 years,1998,2003,,,,,Y,,Y,Y
Cosme,Michael,19,Black,NY,,Murder,25 years to life,1997,2012,,,,,Y,,,
Cosme,Michael,19,Black,NY,,Murder,25 years to life,1997,2013,,,,,Y,,Y,
Cotton,Ronald,22,Black,NC,,Sexual Assault,Life,1985,1995,Y,,Y,,,,,
Coulter,Phillip,38,Caucasian,OK,CV,Child Sex Abuse,15 years,2000,2002,,,,,Y,,,
Courteau,Paul,20,Caucasian,RI,,Robbery,15 years,1981,1999,,,Y,,,,,
Courtney,Sedrick,23,Black,OK,,Robbery,30 years,1996,2012,Y,,Y,,,Y,,
Courtney,Uriah,25,Caucasian,CA,CV,Sexual Assault,Life,2006,2013,Y,,Y,,,,,
Cousin,Shareef,16,Black,LA,,Murder,Death,1996,1999,,,Y,,,,Y,
Covender,Joel,26,Caucasian,OH,"CV, NC",Child Sex Abuse,15 to 50 years,1996,2014,,,,,Y,Y,,
Cowans,Stephan,26,Black,MA,,Attempted Murder,30 to 45 years,1998,2004,Y,,Y,,,Y,,
Cox,Jeffrey,22,Caucasian,VA,,Murder,Life,1991,2001,,,Y,,Y,,Y,
Cox,Michael,43,Caucasian,CA,,Drug Possession or Sale,6 years,1999,2001,,,,Y,Y,,Y,
Cox,Richard,47,Caucasian,CA,"CSH, CV, NC, P",Child Sex Abuse,Unknown,1985,1991,,,,,Y,,Y,Y
Cox,Teresa Lynne,17,Caucasian,CA,"CSH, CV, F, JI, NC",Child Sex Abuse,10 years,1985,2000,,,,,Y,,Y,
"Cox, Sr.",Terry,27,Caucasian,IA,"CV, NC",Child Sex Abuse,25 years,1997,2001,,,,,Y,,,
Craig,Sandra,37,Black,MD,"CSH, F, NC",Child Sex Abuse,10 years,1987,1991,,,,,Y,,Y,
Craine,Leonard,37,Black,NV,CV,Child Sex Abuse,Life,1990,2002,,,,,Y,,,
Crawford,Bradley,47,Caucasian,OK,"M, NC",Stalking,6 months,2003,2004,,,,,Y,,,
Crawford,Steven,14,Black,PA,CV,Murder,Life,1974,2002,,,,,Y,Y,Y,
Credell,Cory,27,Black,SC,,Murder,Life,2001,2012,,,Y,,Y,,Y,Y
Cridelle,Richard,41,Black,NY,NC,Sexual Assault,20 years,2010,2014,,,,,Y,,,Y
Criner,Roy,21,Caucasian,TX,CV,Sexual Assault,Life,1990,2000,Y,,,,,Y,,
Cristini,Michael,26,Caucasian,MI,,Sexual Assault,44 to 60 years,1991,2004,,,,,Y,Y,,
Crivens,Algie,18,Black,IL,,Murder,20 years,1992,2000,,,,,Y,,Y,
Cromedy,McKinley,29,Black,NJ,,Sexual Assault,60 years,1994,1999,Y,,Y,,,,,
Crosby,Daniel,31,Caucasian,MT,"CV, NC",Child Sex Abuse,3 years,1996,2008,,,,,Y,,,
Crosby,Timothy,27,Black,NY,"CDC, CV",Assault,10 to Life,1989,1999,,,,,Y,,Y,
Crotzer,Alan,19,Black,FL,CV,Sexual Assault,Life,1982,2006,Y,,Y,,,Y,,
Croy,Norma Jean,23,Native American,CA,F,Murder,Life,1979,1997,,,,,,,,Y
Croy,Patrick,23,Native American,CA,,Murder,Death,1979,2004,,,,,,,,
Cruz,Efren,22,Hispanic,CA,,Murder,41 years to Life,1997,2001,,,Y,,,,Y,
Cruz,Jerome,24,Hispanic,NY,NC,Drug Possession or Sale,1 year,2007,2007,,,,,Y,,Y,
Cruz,Robert,33,Hispanic,AZ,JI,Murder,Death,1981,1995,,,,,Y,,Y,
Cruz,Rolando,19,Hispanic,IL,"CDC, CV, JI",Murder,Death,1985,1995,Y,,,Y,Y,,Y,
Cservak,Wayne,20,Caucasian,GA,"CV, NC",Child Sex Abuse,10 years,1997,1997,,,,,Y,,,Y
Cuevas,Roberto,21,Hispanic,CA,,Robbery,12 years,2003,2008,,,Y,,,,,
Cullipher,Ricky,15,Caucasian,VA,CV,Assault,9 years,1997,2001,,,Y,Y,,,Y,Y
Cunningham,Calvin Wayne,26,Black,VA,,Sexual Assault,16 years,1981,2011,Y,,Y,,,,,
Cunningham,Connie,43,Caucasian,WA,"CSH, CV, F, NC",Child Sex Abuse,46 years and 6 months,1994,1997,,,,,Y,,Y,
Cunningham,Henry,46,Caucasian,WA,"CSH, CV, NC, P",Child Sex Abuse,47 years,1994,1999,,,,Y,Y,,Y,Y
Curlin,Anthony,39,Black,TX,NC,Drug Possession or Sale,35 years,1997,2000,,,,,Y,,Y,
Curry,Albert,53,Black,F-SD,"FED, NC",Sexual Assault,Not sentenced,2002,2003,,,,,Y,,Y,
Curtis,Exzavian,23,Black,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2013,2014,,,,,,Y,,
Cvijanovich,Daniel,26,Caucasian,F-ND,"FED, JI, NC",Other Nonviolent Felony,19 months,2007,2011,,,,,Y,,Y,
Dabb,Wayne,56,Caucasian,MI,CV,Child Sex Abuse,4 3/4 to 15 years,2005,2008,,,,,,,,Y
Dabbs,Charles,27,Black,NY,,Sexual Assault,12 1/2 to 20 years,1984,1991,Y,,Y,,,Y,,
Daidone,Albert,38,Caucasian,PA,CDC,Murder,Life without parole,1984,1999,,,,,Y,,Y,
Dail,Dwayne Allen,17,Caucasian,NC,CV,Child Sex Abuse,Life,1989,2007,Y,,Y,,,Y,,
Dake,Jeffrey,27,Caucasian,WI,CV,Child Sex Abuse,16 years,1998,2007,,,Y,,,,Y,
Dallas,Peter,23,Caucasian,FL,P,Murder,Not sentenced,1991,1992,,,,Y,,,Y,
D'Ambrosio,Joe,26,Caucasian,OH,,Murder,Death,1989,2012,,,,,Y,Y,Y,
Damien,Michael,,Caucasian,DC,PH,Murder,20 to Life,1975,2005,,,,,Y,,Y,
Daniels,Erick,14,Black,NC,,Robbery,10 to 14 years,2001,2008,,,Y,,,,Y,Y
Danziger,Richard,18,Caucasian,TX,CDC,Sexual Assault,Life,1990,2002,Y,,,,Y,Y,Y,
Davalos,Luis,16,Hispanic,CA,P,Manslaughter,12 years,1996,2001,,,,,Y,,Y,
Davenport-Fritsche,Donna,30,Caucasian,TX,"F, NC, P",Drug Possession or Sale,60 days,2012,2013,,,,,,Y,,
Davidson,Willie,24,Black,VA,,Sexual Assault,20 years,1981,2005,Y,,Y,,,Y,,
Davis,Andre,19,Black,IL,CV,Murder,Life,1981,2012,Y,*,,,Y,Y,,
Davis,Anthony,20,Caucasian,IA,"CV, NC",Sexual Assault,Not sentenced,1990,1992,,,,,Y,,Y,
Davis,Carlos,18,Black,NY,CIU,Gun Possession or Sale,7.5 to 15 years,1991,2015,,,,,Y,,,
Davis,Cody,21,Caucasian,FL,,Robbery,3 years,2006,2007,Y,,Y,,,,,
Davis,Danny,25,Black,F-LA,"CDC, FED, JI",Drug Possession or Sale,Not sentenced,2006,2006,,,,,Y,,,
Davis,Dewey,53,Caucasian,WV,,Sexual Assault,10 to 20 years,1987,1995,Y,,,,Y,Y,Y,
Davis,Donya,28,Black,MI,,Sexual Assault,22 years,2007,2014,Y,,Y,,Y,,,
Davis,Gerald,26,Caucasian,WV,,Sexual Assault,14 to 35 years,1986,1995,Y,,,,Y,Y,Y,
Davis,Glenn,18,Black,LA,,Murder,Life without parole,1993,2010,,,,,Y,,Y,Y
Davis,Jeramie,36,Caucasian,WA,,Murder,40 years,2008,2013,Y,,,,,,,
Davis,Larry,36,Caucasian,WA,,Sexual Assault,20 years and 6 months,1993,2010,Y,,Y,,,,,
Davis,LaVelle,20,Black,IL,,Murder,45 years,1997,2009,,,,,Y,Y,Y,
"Davis, Jr.",Sammy,32,Black,F-LA,"CDC, FED, JI",Drug Possession or Sale,Not sentenced,2006,2006,,,,,Y,,,
Day,Lee Antione,28,Black,IL,,Murder,60 years,1992,2002,,,Y,,,,,Y
Daye,Frederick Renee,25,Black,CA,,Sexual Assault,Life,1984,1994,Y,,Y,,,Y,,
Deacon,Derrick,34,Black,NY,CV,Murder,25 to life,1989,2013,,,,,Y,,Y,
Dean,James,20,Caucasian,NE,"CDC, P",Murder,10 years,1989,2009,Y,,,Y,Y,Y,,
DeAngelo,Angel M.,26,Hispanic,F-NY,"CDC, FED",Murder,Life,2003,2004,,,,,Y,,Y,
Deckinga,Shaun,26,Caucasian,F-MN,"FED, JI",Robbery,10 years,1993,1993,,,Y,,Y,,,
Dedge,Wilton,20,Caucasian,FL,JI,Sexual Assault,30 years,1982,2004,Y,,Y,,Y,Y,,
Deeb,Muneer,23,Other,TX,"CV, JI",Murder,Death,1985,1993,,,,,Y,Y,,
DeJac,Lynn,30,Caucasian,NY,"CV, F, JI, NC",Murder,25 years,1994,2008,Y,*,,,Y,,,
Delmore,Larry,22,Black,LA,,Murder,Life without parole,1993,2010,,,,,Y,,Y,Y
Demings,Barry,55,Black,TX,"CIU, NC, P",Drug Possession or Sale,6 months,2008,2015,,,,,,Y,,
DeSimone,David,45,Caucasian,IA,"CV, NC",Sexual Assault,15 years,2005,2012,,,,,Y,,Y,
Deskovic,Jeffrey,16,Caucasian,NY,CV,Murder,15 to Life,1990,2006,Y,,,Y,,,,
Dewey,Robert,33,Caucasian,CO,,Murder,Life,1996,2012,Y,,,,,,,
Dewitt,Steven,20,Black,DC,,Murder,15 to Life,1992,2004,,,Y,,Y,,Y,
"Dexter, Jr.",Clarence Richard,49,Caucasian,MO,,Murder,Death,1991,1999,,,,,,Y,Y,Y
Diamond,Garry,24,Caucasian,VA,CV,Sexual Assault,15 years,1977,2013,Y,,Y,,,Y,Y,
Diaz,Luis,40,Hispanic,FL,,Sexual Assault,Life,1980,2005,Y,,Y,,,Y,,
Diaz,Luis,20,Hispanic,CA,CIU,Sexual Assault,15 years,1984,2012,,,Y,,,,,
Diaz,Victor,17,Hispanic,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2011,2014,,,,,,Y,,
Diaz Morales,Robert Anel,20,Hispanic,PR,,Murder,172 years,2004,2012,,,,,Y,,,
Dick,Steven,18,Caucasian,IL,,Drug Possession or Sale,6 months work release,1983,2001,,,,,Y,,Y,
Dill,Grace,50,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y,
"Dill, Jr.",Wayne,26,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y,
Dillon,William,21,Caucasian,FL,JI,Murder,Life,1981,2008,Y,,Y,,Y,Y,Y,
Dilosa,Douglas,35,Caucasian,LA,,Murder,Life,1987,2003,,,,,,,Y,
DiNicola,Louis,29,Caucasian,PA,"A, CV, NC",Murder,Life,1980,1994,,,Y,,,Y,Y,
Dinning,Jack,38,Caucasian,GA,,Murder,Life,1993,1997,,,,,Y,,Y,
Dixon,Bobby Ray,22,Black,MS,"CDC, P, PH",Murder,Life,1980,2010,Y,,,Y,Y,,Y,
Dixon,Donald,18,Black,MO,CDC,Murder,Life,1985,2001,,,,,Y,,Y,
Dixon,Isaac,26,Black,TX,"CIU, NC, P",Drug Possession or Sale,90 days,2014,2015,,,,,,Y,,
Dixon,John,27,Black,NJ,P,Sexual Assault,45 years,1992,2001,Y,,Y,,,,,
Doggett,Carol,36,Caucasian,WA,"CSH, CV, F, NC",Child Sex Abuse,10 years and 10 months,1995,2000,,,,,Y,,Y,
Doggett,Mark,35,Caucasian,WA,"CSH, CV, NC",Child Sex Abuse,10 years and 10 months,1995,2000,,,,,Y,,Y,
Dollard,Jermaine,38,Black,DE,NC,Drug Possession or Sale,20 years,2013,2015,,,,,Y,Y,Y,
Dombrowski,Peter,37,Caucasian,NY,,Robbery,9 to 18 years,1985,1990,,,Y,,,,Y,Y
Domenech,Alfredo,23,Caucasian,PA,,Murder,Life,1988,2005,,,Y,,,,,
Dominguez,Alejandro,16,Hispanic,IL,,Sexual Assault,9 years,1990,2002,Y,,Y,,,Y,Y,
Dominguez,Jonathan,15,Hispanic,CA,NC,Assault,Not sentenced,2006,2006,,,,,Y,,Y,
Doswell,Thomas,24,Black,PA,,Sexual Assault,13 to 26 years,1986,2005,Y,,Y,,,,,
Dotson,Gary,20,Caucasian,IL,"CV, NC",Sexual Assault,25 to 50 years,1979,1989,Y,,,,Y,Y,,
Douglas,Yancy,19,Black,OK,CV,Murder,Death,1995,2009,,,,,Y,,Y,
Dove,Gayle,41,Caucasian,TX,"CSH, CV, F, NC",Child Sex Abuse,Life,1986,1989,,,,,Y,,Y,
Dowaliby,David,31,Caucasian,IL,CV,Murder,45 years,1990,1992,,,Y,,,Y,,
Doyle,Robert E.,44,Caucasian,AL,"CV, NC",Child Sex Abuse,20 years,1992,2001,,,,,Y,,Y,
Drennen,Gabriel,36,Caucasian,WY,NC,Murder,Life,2011,2013,,,,,,Y,Y,Y
Drinkard,Gary,37,Caucasian,AL,,Murder,Death,1995,2001,,,,,,,,Y
Drumgold,Shawn,23,Black,MA,CV,Murder,Life,1989,2003,,,Y,,Y,,Y,
Dubbs,Charles T.,36,Caucasian,PA,,Sexual Assault,12 to 40 years,2002,2007,,,Y,,,,,
Dudley,Ronald,19,Black,NY,,Assault,15 years,2000,2001,,,Y,,,,,Y
Duke,Dale,42,Caucasian,TX,"CIU, CV, NC, P",Child Sex Abuse,20 years,1992,2011,,,,,Y,,Y,
Duncan,Cornelia,33,Black,TX,"CIU, F, NC, P",Drug Possession or Sale,6 months,2013,2015,,,,,,Y,,
"Dupree, Jr.",Cornelius,20,Black,TX,CIU,Robbery,75 years,1980,2011,Y,,Y,,,,,
Durham,Timothy,28,Caucasian,OK,CV,Child Sex Abuse,Life,1993,1997,Y,,Y,,,Y,,
Dutcher,David,45,Caucasian,CA,"NC, P",Traffic Offense,4 month driver's license suspension,2009,2012,,,,,Y,,Y,
Duval,John,21,Black,NY,CDC,Murder,25 to Life,1973,2000,,,,Y,Y,,Y,
Dwight,Nathan Christopher,21,Black,GA,,Robbery,Life,2010,2013,Y,*,Y,,,,,
Dziubak,Richard Paul,34,Caucasian,MN,"NC, P",Manslaughter,6 years and 9 months,1987,1989,,,,,Y,Y,,
Eardley,James,23,Caucasian,IL,CDC,Conspiracy,Not sentenced,1990,1991,,,,,Y,,,
Earle,Margaret,21,Caucasian,MA,"CV, F",Murder,Life,2005,2010,,,,,,,,Y
Earnest,Michael,26,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,,,
Eason,Corey,23,Black,IL,NC,Sex Offender Registration,Not sentenced,2005,2005,,,,,,,,Y
Eastridge,Joseph,28,Caucasian,DC,,Murder,20 to Life,1975,2005,,,,,Y,,Y,
Echavarria,Angel,27,Hispanic,MA,,Murder,Life without parole,1996,2015,Y,*,Y,,,,Y,Y
Echols,Douglas,25,Black,GA,,Sexual Assault,31 years,1987,2002,Y,,Y,,,,,
Edmonds,Tyler,13,Caucasian,MS,,Murder,Life,2004,2008,,,,Y,Y,Y,,
Edmunds,Audrey,34,Caucasian,WI,"CV, F, NC, SBS",Murder,18 years,1996,2008,,,,,,Y,,
Edwards,James,44,Black,IL,,Murder,Life without parole,1996,2012,Y,,,Y,Y,,Y,
Eldridge,Yvonne,41,Caucasian,CA,"CV, F, NC",Child Abuse,3 years and 4 months,1996,2003,,,,,,,,Y
Elizondo,Joe,49,Hispanic,TX,"CV, NC",Child Sex Abuse,Life,1984,1997,,,,,Y,,,
Elizondo,Mary Ann,27,Hispanic,TX,"CV, F, NC, P",Child Sex Abuse,35 years,1984,2005,,,,,Y,,,
Elkins,Clarence,35,Caucasian,OH,CV,Murder,55 to Life,1999,2005,Y,,Y,,,,,
Elliff,Hicks,23,Caucasian,TX,,Murder,55 years,1982,2007,,,,,,,Y,
Ellis,Andre,35,Black,AL,NC,Sexual Assault,85 years,2013,2014,,,,,Y,,Y,
Ellis,Bernard,26,Black,IL,,Murder,55 years,1996,2001,,,,,Y,,Y,Y
Ellison,Jason,23,Caucasian,KS,NC,Sexual Assault,23 years,2006,2011,,,,,Y,,,
Elmardoudi,Abdel-Ilah,35,Other,F-MI,FED,Supporting Terrorism,15 years,2003,2004,,,,,Y,,Y,
Embry,Anthony,23,Black,AL,P,Murder,20 years,1993,1997,,,Y,,,,,
Emory,Malcolm,19,Native American,MA,NC,Assault,six months (suspended) and three years probation,1970,1990,,,,,Y,,Y,
Engberg-Lehmer,Teresa,24,Caucasian,IA,"CV, F, NC, P, SBS",Manslaughter,15 years,1997,1998,,,,,,Y,,
Engel,Gary,33,Caucasian,MO,JI,Kidnapping,90 years,1991,2010,,,,,Y,,Y,
Engesser,Oakley,41,Caucasian,SD,,Manslaughter,25 years,2001,2015,,,,,,,Y,Y
Ennis,Sean,39,Caucasian,OH,,Sexual Assault,5 to 25 years,1990,1994,,,Y,,,,Y,
Erby,Lonnie,31,Black,MO,CV,Child Sex Abuse,Life,1986,2003,Y,,Y,,,Y,,
Escalera,Robert,21,Hispanic,NY,,Murder,15 to Life,1976,1990,,,Y,,,,,Y
Esse,Eric,33,Caucasian,IA,,Murder,Life without parole,2003,2006,,,,Y,,Y,Y,Y
Evans,Jerry Lee,24,Black,TX,CIU,Sexual Assault,Life,1987,2009,Y,,Y,,,,Y,
Evans,Michael,17,Black,IL,CV,Murder,Life,1976,2003,Y,,Y,Y,,,Y,
Everett,Harold,65,Caucasian,WA,"CSH, CV, NC, P",Child Sex Abuse,23 years,1994,1998,,,,,Y,,Y,
Everett,Idella,41,Caucasian,WA,"CSH, CV, F, NC, P",Child Sex Abuse,4 years and 8 months,1994,1998,,,,Y,Y,,Y,
Eze,Louis,31,Black,NY,"CV, NC",Child Sex Abuse,35 2/3 to life,1993,2007,,,,,Y,Y,,Y
Fain,Charles,33,Caucasian,ID,"CV, JI",Murder,Death,1983,2001,Y,,,,Y,Y,Y,
Faison,Anthony,21,Black,NY,,Murder,20 to Life,1988,2001,,,,,Y,,,
Fappiano,Scott,21,Caucasian,NY,,Sexual Assault,20 to 50 years,1985,2006,Y,,Y,,,,,
Farnbaugh,William,25,Caucasian,FL,NC,Sexual Assault,Life,1997,2004,,,,,Y,,,Y
Farnsworth,Robert,29,Caucasian,MI,NC,Theft,Probation,1999,2000,,,,Y,,,,
Faulkner,Kenneth,39,Caucasian,CA,"CV, NC",Kidnapping,30 to Life,2000,2003,,,,,Y,,,
Fauntleroy,David,22,Black,IL,CDC,Murder,Life,1986,2009,,,,Y,Y,,Y,
"Fears, Jr.",Joseph R.,34,Black,OH,,Sexual Assault,15 to 25 years,1984,2009,Y,,Y,,,,,
Felder,Lance,16,Black,PA,,Murder,Life,1998,2014,,,Y,,,,,
Felix,Darren,19,Black,NY,,Attempted Murder,12 years,2004,2010,,,Y,,Y,,Y,
Fergeson,Latanya,29,Black,TX,"CIU, F, NC, P",Drug Possession or Sale,7 months,2008,2015,,,,,,Y,,
Ferguson,Ryan,17,Caucasian,MO,CDC,Murder,40 years,2005,2013,,,,,Y,,Y,
Ferrara,Vincent,36,Caucasian,F-MA,"CDC, FED, P",Murder,22 years,1992,2005,,,,,Y,,Y,
Field,Denis,46,Caucasian,F-NY,"CDC, FED",Tax Evasion/Fraud,Not sentenced,2011,2013,,,,,Y,,Y,
Fields,Hilliard,25,Black,TX,"CIU, CV, NC, P",Child Sex Abuse,5 years,1997,2011,,,,,Y,,,
Fields,Nathson Edgar,29,Black,IL,,Murder,Death,1986,2009,,,,,Y,,Y,Y
Figueroa,Fancy,16,Hispanic,NY,"F, M, NC, P",Filing a False Report,Community service,1997,2004,Y,*,,Y,,,Y,
Filler,Vladek,37,Caucasian,ME,NC,Sexual Assault,Not sentenced,2009,2015,,,,,Y,,Y,
Fish,Harold,57,Caucasian,AZ,NC,Murder,10 years,2006,2009,,,,,,,,
Fisher,James,32,Caucasian,F-TX,"FED, NC",Conspiracy,"87 months and $895,000 restitution",1995,1997,,,,,Y,,Y,
Fitzgerald,John,40,Caucasian,F-CA,FED,Tax Evasion/Fraud,Not sentenced,2007,2009,,,,,Y,,Y,
Fleming,Jonathan,27,Black,NY,CIU,Murder,25 to life,1990,2014,,,,,Y,,Y,
"Flores, Jr.",Ernesto,23,Hispanic,TX,P,Burglary/Unlawful Entry,2 years,2004,2015,,,Y,,,,Y,Y
Foley,Kenneth Wayne,26,Caucasian,CA,CIU,Robbery,25 to Life,1995,2007,,,Y,,,,,
Foley,Thomas,40,Caucasian,MI,,Murder,Life,2009,2011,,,,,,,,
Forbes,Todd,35,Caucasian,F-MIL,"FED, NC",Sexual Assault,15 years,1998,2006,,,,,Y,,,
Ford,Glenn,34,Black,LA,,Murder,Death,1984,2014,,,,,Y,Y,Y,Y
Forsythe,Colleen Dill,26,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y,
Forsythe,Wayne,28,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y,
Foster,Demetrius,24,Black,MI,,Murder,32 to 50 years,2000,2014,,,Y,,,,,Y
Fountain,Wiley,28,Black,TX,,Sexual Assault,40 years,1986,2003,Y,,Y,,,,,
Fowler,Joel,17,Black,NY,CIU,Murder,25 years to life,2009,2015,,,,Y,Y,,Y,Y
Fowler,Lawrence,37,Black,NY,CV,Murder,25 to Life,1998,2006,,,Y,,,,,
Fox,Bretton,24,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,10 days,2008,2015,,,,,,Y,,
Franklin,George,30,Caucasian,CA,CV,Murder,Life,1990,1996,Y,*,,,Y,,,
Frederick,Sherri,33,Black,TX,"F, NC, P",Drug Possession or Sale,six months,2010,2012,,,,,,Y,,
Frey,Joseph,31,Caucasian,WI,,Sexual Assault,Life,1994,2013,Y,,Y,,,,,
Fritz,Dennis,22,Caucasian,OK,JI,Murder,Life,1988,1999,Y,,,,Y,Y,Y,
Fritz,Judith,35,Caucasian,PA,"F, NC",Manslaughter,3 years,1995,1997,,,Y,,,Y,,Y
Frye,Ralph,21,Caucasian,IL,CDC,Murder,23 years,1989,1996,,,,Y,Y,,Y,
Fuentes,Israel,19,Hispanic,TX,"NC, P",Drug Possession or Sale,5 years,2013,2013,,,,,,,,
Fuller,Larry,31,Black,TX,,Sexual Assault,50 years,1981,2006,Y,,Y,,,Y,,
Funes,Jeffery,15,Hispanic,CA,NC,Assault,Not sentenced,2006,2006,,,,,Y,,Y,
Futrell,Rayshard Darnell,17,Black,MI,,Murder,Life,2009,2010,,,Y,,,,Y,
Gagnon,Richard,32,Caucasian,SC,JI,Murder,Life without parole,2008,2015,Y,,,,Y,,,
Gainer,Donald,31,Caucasian,PA,"A, JI, NC",Arson,22 to 45 years,1985,1992,,,,,Y,,Y,
Gaines,Freddie Lee,22,Black,AL,,Murder,30 years,1974,1991,,,Y,,,,,
Gaines,Lynie,36,Black,MI,P,Drug Possession or Sale,10 to 30 years,1990,1997,,,,Y,Y,,,
Gainus,Rhonda,44,Caucasian,TX,"F, NC, P",Drug Possession or Sale,3 years,2012,2013,,,,,,Y,,
Galicia,Luis,21,Hispanic,CA,"CV, NC",Child Sex Abuse,15 to Life,2008,2011,,,,Y,Y,Y,Y,
Galli,Aaron,22,Caucasian,UT,CDC,Murder,10 to life,1993,1993,,,,,Y,,,
Gantt,Timothy,47,Black,CA,,Murder,Life without parole,1994,2008,,,Y,,Y,Y,Y,
Garcia,Jose,27,Hispanic,NY,,Murder,25 years to life,1993,2007,,,,,Y,,Y,Y
Garcia,Roy Lopez,45,Hispanic,CA,JI,Murder,Life,2000,2006,,,,,Y,,,
Gardner,Lewis,15,Black,IL,"CDC, CIU",Murder,30 years,1995,2014,,,,Y,Y,,Y,Y
Gariepy,Ronnie Mark,32,Caucasian,TX,"CV, NC, P",Child Sex Abuse,12 years,1992,2000,,,,Y,Y,,,
Garner,David,34,Black,F-OH,"CDC, FED",Robbery,13 years,2005,2008,,,,,Y,,Y,
Garner,Terence,16,Black,NC,,Attempted Murder,32 to 43 years,1998,2002,,,Y,,Y,,,
Garza,Michael,28,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,10 months,2014,2014,,,,,,Y,,
Gassman,Tyler W.,21,Caucasian,WA,CDC,Robbery,26 years,2009,2013,,,,,Y,,,Y
Gaston,Jeffrey,34,Black,TX,"CIU, M, NC, P",Drug Possession or Sale,60 days,2006,2015,,,,,,Y,,
Gates,Donald Eugene,30,Black,DC,,Murder,20 to Life,1982,2009,Y,,,,Y,Y,Y,
Gathers,Gary,17,Black,DC,,Murder,36 years to life,1994,2015,,,,Y,Y,,Y,Y
Gauger,Gary,41,Caucasian,IL,JI,Murder,Death,1993,1996,,,,Y,Y,,Y,
Gausvik,Ralph,39,Caucasian,WA,"CSH, CV, NC",Child Sex Abuse,23 years and 4 months,1995,2000,,,,,Y,,Y,
Gavin,Willie,35,Black,WI,"CV, NC, P",Child Sex Abuse,8 years,1997,2014,,,,,Y,,,
Gavitt,David Lee,26,Caucasian,MI,"A, CV, NC",Murder,Life without parole,1986,2012,,,,,,Y,,Y
Gayol,Robert,33,Hispanic,IL,CDC,Murder,Life,2001,2003,,,,,Y,,Y,
Gell,Alan,20,Caucasian,NC,CDC,Murder,Death,1998,2004,,,,,Y,Y,Y,
"Geralds, Jr.",Hubert,30,Black,IL,,Murder,Death,1997,2000,,,,Y,,,,
Geri,Haim,50,Caucasian,F-DC,"FED, NC, P",Conspiracy,Not sentenced,2011,2012,,,,,,,,
Gibbs,Barry,38,Caucasian,NY,JI,Murder,20 to Life,1988,2005,,,,,Y,,Y,
Gibson,Roland,20,Black,LA,CDC,Murder,Life,1968,1993,,,,,Y,,Y,
Giles,James Curtis,26,Black,TX,,Sexual Assault,30 years,1983,2007,Y,,Y,,,,Y,
Gillard,Larry,22,Black,IL,,Sexual Assault,24 years,1984,2009,Y,,Y,,,Y,,
Gilman,William,60,Caucasian,NY,CDC,Fraud,90 days,2008,2011,,,,,Y,,Y,
Gilyard,Eugene,16,Black,PA,,Murder,Life,1998,2014,,,Y,,,,,
"Girdler, Jr.",Ray,36,Caucasian,AZ,"A, CV, JI, NC",Murder,Life,1982,1991,,,,,Y,Y,Y,
Girts,Jason,22,Caucasian,KY,"CV, NC",Child Sex Abuse,7 years,2006,2008,,,,,Y,,,
Gladden,David,36,Black,PA,"A, CDC, JI",Murder,Life,1995,2007,,,,,Y,,Y,
Glassman,Donald,36,Caucasian,NY,NC,Sexual Assault,Not sentenced,2007,2009,,,,,Y,,,Y
Glenn,Eugene,41,Black,WI,,Robbery,20 years,2002,2003,,,Y,,,,,
Glisson,Eric,18,Black,NY,,Murder,25 years to life,1997,2012,,,,,Y,,,
Godschalk,Bruce,25,Caucasian,PA,JI,Sexual Assault,10 to 20 years,1987,2002,Y,,Y,Y,Y,,Y,
Goff,Antoine,19,Black,CA,,Murder,27 to Life,1990,2003,,,,,Y,,Y,
Golden,Andrew,45,Caucasian,FL,NC,Murder,Death,1991,1994,,,,,,,,Y
Golding,Jerry,43,Caucasian,F-VA,"FED, NC",Gun Possession or Sale,3 years and 1 month,1998,1999,,,,,,,Y,
Goldstein,Thomas Lee,30,Caucasian,CA,JI,Murder,25 to Life,1980,2004,,,,,Y,,Y,
Gomez,Alfonso,21,Hispanic,CA,,Murder,41 years to life,1998,2012,,,Y,,,,,
Gondor,Robert,24,Caucasian,OH,CDC,Manslaughter,10 to 25 years,1990,2007,,,,,Y,Y,,Y
Gonzalez,Angel,20,Hispanic,IL,CIU,Sexual Assault,55 years,1995,2015,Y,,Y,Y,,,,
Gonzalez,Daniel,18,Hispanic,CA,"NC, P",Assault,Not sentenced,2006,2007,,,,,Y,,Y,
Gonzalez,David,33,Hispanic,SD,"CV, NC",Child Sex Abuse,1 year and 6 months,1999,2001,,,,,Y,,Y,
Gonzalez,Hector,18,Hispanic,NY,,Murder,15 to Life,1996,2002,Y,,Y,,,,,
Gonzalez,Kathy,24,Hispanic,NE,"CDC, F, P",Murder,10 years,1989,2009,Y,,,Y,Y,Y,,
Good,Donald Wayne,21,Caucasian,TX,CV,Sexual Assault,Life,1984,2004,Y,,Y,,,Y,,
Goodman,Bruce Dallas,34,Caucasian,UT,,Murder,5 to Life,1986,2004,Y,,Y,,,,,
Goodman,Warren,30,Black,WI,CDC,Robbery,22 years,1994,2007,,,Y,,Y,,,Y
Goods,Darron,19,Black,MD,CV,Attempted Murder,Not sentenced,2005,2006,,,Y,,,,,Y
Googe,Michael,26,Caucasian,GA,P,Burglary/Unlawful Entry,2 years,2008,2015,,,,,Y,,,
Gossett,Andrew,37,Caucasian,TX,,Sexual Assault,50 years,2000,2007,Y,,Y,,,,,
Goudy,Walter,25,Black,IN,CDC,Murder,110 years,1995,2012,,,Y,,Y,,Y,
Grady,Nathaniel,43,Black,NY,"CSH, CV, NC",Child Sex Abuse,45 years,1986,1997,,,,,,,Y,
Grafton,Marjorie,52,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,16 years,1985,1990,,,,,Y,,Y,
Graham,Michael,23,Caucasian,LA,JI,Murder,Death,1987,2000,,,,,Y,,Y,
Grajeda,Arthur,19,Hispanic,CA,,Murder,32 years to life,1987,1991,,,Y,,Y,,Y,
Grajeda,Senon,31,Hispanic,CA,,Murder,45 years to life,1987,1993,,,Y,,Y,,Y,
Grandson,Dwight,17,Black,DC,,Murder,42 years and 8 months,2006,2011,,,,,Y,,Y,
Grannis,David,21,Caucasian,AZ,,Murder,Death,1991,1996,,,,,Y,,,
Graves,Anthony,26,Black,TX,"A, CDC, CV",Murder,Death,1994,2010,,,,,Y,Y,Y,
Gray,Anthony,23,Black,MD,P,Murder,Life,1991,1999,Y,,,Y,,,Y,Y
Gray,David A.,25,Black,IL,JI,Attempted Murder,60 years,1978,1999,Y,,Y,,Y,,,
Gray,Paula,17,Black,IL,"F, JI",Murder,50 years,1978,2002,Y,,Y,Y,Y,Y,Y,
Gray,Russell Leroy,20,Black,VA,,Murder,52 years,1987,1990,,,Y,,,,,
Greco,Louis,47,Caucasian,MA,"CDC, PH",Murder,Death,1968,2001,,,,,Y,,Y,
Green,Anthony Michael,22,Black,OH,,Sexual Assault,20 to 50 years,1988,2001,Y,,Y,,,Y,,
Green,Debra,31,Black,IL,"F, M, NC",Assault,Probation,2010,2011,,,,,Y,,Y,
Green,Doris,34,Caucasian,WA,"CSH, CV, F, NC",Child Sex Abuse,23 years,1995,1999,,,,Y,Y,,Y,
Green,Edward,19,Black,DC,CV,Sexual Assault,Not sentenced,1989,1990,Y,,Y,,,,,
Green,James,46,Black,TN,"CV, NC",Child Sex Abuse,16 years,2006,2008,,,,,Y,,,
Green,Joseph,36,Black,FL,,Murder,Death,1993,2000,,,,,Y,,,
Green,Kevin Lee,20,Caucasian,CA,CV,Murder,15 to Life,1980,1996,Y,,Y,,,,,
Green,Lafayette,38,Black,FL,NC,Sexual Assault,11 years,1996,1997,,,,,Y,,,
Green,Michael Anthony,18,Black,TX,CIU,Sexual Assault,75 years,1983,2010,Y,,Y,,,,Y,
Green,Willie Earl,31,Black,CA,,Murder,33 to Life,1984,2008,,,,,Y,,Y,
Greene,Cy,19,Black,NY,,Murder,15 to life,1985,2006,Y,*,Y,,Y,,Y,Y
Greer,Rashawn,21,Black,OK,CV,Murder,Life,1998,2001,,,,,,,,Y
Grega,John,32,Caucasian,VT,,Murder,Life without parole,1995,2013,Y,*,,,,,,
Gregory,Wassillie,48,Native American,AK,"M, NC, P",Harassment,Probation,2014,2015,,,,,Y,,Y,
Gregory,William,42,Black,KY,,Sexual Assault,70 years,1993,2000,Y,,Y,,,Y,Y,
Gressman,Jed Allen,22,Caucasian,UT,,Sexual Assault,5 to Life,1993,1996,Y,*,,,Y,,,
Griffin,Codell,41,Black,F-IL,"CDC, FED",Drug Possession or Sale,Not sentenced,1991,1994,,,,,Y,,Y,
Griffin,Patrick,40,Caucasian,NY,NC,Sexual Assault,3.3 to 10 years,1996,2000,,,,,Y,,,
Griffin,Reginald,22,Black,MO,,Murder,Death,1988,2013,,,,,Y,,Y,
Grimes,Willie,41,Black,NC,,Sexual Assault,Life,1988,2012,,,Y,,,Y,,
Grissom,James,43,Caucasian,MI,NC,Sexual Assault,15 to 35 years,2003,2012,,,,,Y,,,
Gristwood,Daniel,29,Caucasian,NY,,Attempted Murder,12 1/2 to 25 years,1996,2006,,,,Y,,,Y,
Groce,Tyrone,28,Black,NY,,Robbery,4 to 12 years,1993,1995,,,Y,,,,,
Gross,George,38,Caucasian,NJ,"CV, NC",Child Sex Abuse,Not sentenced,1998,2001,,,,,Y,,,Y
Guce,Teobaldo,42,Asian,NY,"CV, NC",Child Sex Abuse,3 to 9 years,1987,1992,,,,,Y,Y,,
Guillory,Lavont,21,Black,CA,,Murder,Life,1994,2005,Y,*,Y,,,,Y,
Gurley,Larry,20,Black,NY,NC,Murder,20 to life,1972,1994,,,,,Y,,Y,
Guzman,Humberto,40,Hispanic,MA,,Drug Possession or Sale,18 to 20 years,1992,1997,,,Y,,,,,Y
Hairston,Kim,32,Black,OH,,Murder,Life,1993,1995,,,,,Y,,Y,
Hales,Warren,25,Caucasian,UT,"CV, NC, SBS",Murder,5 years to Life,2004,2007,,,,,,Y,,Y
Haley,James,24,Black,MA,,Murder,Life without parole,1972,2008,,,,,Y,,Y,
Hall,Erin,35,Caucasian,F-WI,"FED, NC",Conspiracy,2 months,2011,2014,,,,,,,,
Hall,Harold,37,Black,CA,JI,Murder,Life,1990,2004,,,,Y,Y,,,
Hall,Jennifer,20,Caucasian,MO,"A, F, NC",Arson,3 years,2001,2005,,,,Y,,Y,,Y
Hall,Paula,36,Caucasian,MO,"CDC, F, JI",Murder,20 years,2009,2013,,,,,Y,,Y,
Hall,Tony,25,Caucasian,TX,"CV, NC",Child Sex Abuse,15 years,1993,2011,,,,,Y,,,
Halsey,Byron,24,Black,NJ,CV,Murder,Life,1988,2007,Y,,,Y,Y,,,
Halstead,Dennis,26,Caucasian,NY,"CDC, CV, JI",Murder,33 years and 4 months,1986,2005,Y,,,,Y,Y,Y,
Hamilton,Derrick,25,Black,NY,CIU,Murder,25 years to life,1992,2015,,,,,Y,,Y,Y
Hampton,Patrick,18,Black,IL,,Sexual Assault,60 years,1982,2011,,,Y,,Y,,Y,Y
Hancock,Troy Jon,25,Caucasian,UT,,Sexual Assault,5 to Life,1993,1996,Y,*,,,Y,,,
Handley,Zachary,13,Caucasian,PA,"A, CV",Arson,6 to 12 months,2008,2015,,,,Y,Y,,,
Hanline,Michael,31,Caucasian,CA,,Murder,Life without parole,1980,2015,Y,*,,,Y,,Y,
Hannan,Ahmed,32,Other,F-MI,FED,Fraud,5 years,2003,2004,,,,,Y,,Y,
Hannon,Donald,18,Caucasian,IA,"CV, NC",Sexual Assault,Not sentenced,1990,1992,,,,,Y,,Y,
Hansen,Lisa,25,Caucasian,MI,"F, NC, P",Theft,Community service,2006,2006,,,,,,,,
Hansen,Michael,27,Caucasian,MN,"CV, JI, NC",Murder,14 1/2 years,2006,2011,,,,,Y,Y,,
Harden,James,16,Black,IL,"CDC, CV",Murder,80 years,1995,2011,Y,,,,Y,,Y,
Harding,Christopher,36,Black,MA,,Assault,10 to 12 years,1990,1998,,,Y,,Y,,Y,Y
Harmon,Ricky Dale,35,Caucasian,TX,"CV, NC",Child Sex Abuse,30 years,1994,2004,,,,,Y,,,
Harper,Maurice,23,Black,IL,,Murder,25 years,2002,2007,,,,,Y,,Y,Y
Harrington,Terry,17,Black,IA,CDC,Murder,Life,1978,2003,,,,,Y,,Y,
Harris,Anthony,12,Black,OH,CV,Murder,7 years,1999,2000,,,,Y,,,Y,
Harris,Benjamin,37,Black,WA,,Murder,Death,1984,1997,,,,Y,,,,Y
Harris,Gerald,23,Black,NY,,Robbery,9 to 18 years,1992,2000,,,Y,,,,,
Harris,Keith,19,Black,IL,,Attempted Murder,50 years,1979,2003,,,Y,,,,Y,Y
Harris,Lynnette,33,Caucasian,F-WI,"F, FED, NC",Tax Evasion/Fraud,10 months,1990,1991,,,,,,,,
Harris,Nicole,23,Black,IL,"CIU, CV, F, NC",Murder,30 years,2005,2013,,,,Y,,,Y,Y
Harris,Warren,29,Black,IL,,Murder,50 to Life,1977,1990,,,,,Y,,,Y
Harris,William,17,Black,WV,,Sexual Assault,10 to 20 years,1987,1995,Y,,Y,,Y,Y,Y,
Harrison,Clarence,27,Black,GA,,Sexual Assault,Life,1987,2004,Y,,Y,,,Y,,
Hart,James Thomas,43,Caucasian,CA,"CV, NC",Child Sex Abuse,12 years,1990,2001,,,,,Y,,,Y
Harvey,John Michael,24,Caucasian,TX,"CV, NC",Child Sex Abuse,40 years,1992,2005,,,,,Y,,Y,
Hash,Michael,15,Caucasian,VA,"CDC, JI",Murder,Life without parole,2001,2012,,,,,Y,,Y,Y
Hatchett,Nathaniel,17,Black,MI,,Sexual Assault,25 to 40 years,1998,2008,Y,,Y,Y,,Y,Y,
Hayes,Reginald,14,Black,NV,"CDC, CV",Murder,Life without parole,1985,1999,,,Y,,Y,,,Y
Hayes,Travis,17,Black,LA,,Murder,Life,1998,2007,Y,,,Y,,,Y,
Haygood,Andre,24,Black,TX,,Murder,Life,2002,2011,,,Y,,,,,
Haynesworth,Thomas,18,Black,VA,,Sexual Assault,74,1984,2011,Y,,Y,,,,,
Hays,Robert,33,Caucasian,NV,"CV, NC",Child Sex Abuse,Life,1993,2007,,,,Y,Y,Y,Y,Y
Hebrard,Alexander,25,Caucasian,TX,"NC, P",Gun Possession or Sale,4 years,2011,2013,,,,,,,,Y
Hebshie,James,55,Caucasian,F-MA,"A, FED, NC",Arson,15 years,2006,2011,,,,,,Y,,Y
Heins,Chad,18,Caucasian,FL,JI,Murder,Life,1996,2007,Y,,,,Y,Y,Y,
Heistand,Donald,29,Caucasian,MI,"CDC, P",Accessory to Murder,3 to 5 years,1989,2002,,,,,Y,,Y,
Helmig,Dale,37,Caucasian,MO,,Murder,Life without parole,1996,2011,,,,Y,Y,,Y,Y
Henly,Brock,44,Black,OH,"NC, P",Sex Offender Registration,90 days,2003,2007,,,,,,,,Y
Henry,Darcus,21,Black,CT,,Murder,Life,1999,2013,,,,,Y,,Y, Henton,Eugene,16,Black,TX,P,Sexual Assault,4 years,1984,2006,Y,,Y,,,,, Hernandez,Alejandro,19,Hispanic,IL,"CDC, CV, JI",Murder,Death,1985,1995,Y,,,Y,Y,,Y, Hernandez,Alejandro,22,Hispanic,TX,CDC,Murder,99 years,1994,2006,,,,,Y,,Y,Y Hernandez,Francisco,26,Hispanic,CA,,Murder,48 years to life,2002,2005,,,,,Y,,, Hernandez,Maria,39,Hispanic,F-TX,"F, FED",Drug Possession or Sale,17 years,2004,2014,,,,,,,,Y Herrera,Bobby Paiste,17,Hispanic,CA,P,Assault,5 years,1998,2000,,,Y,,Y,,,Y Herrera,Eric,42,Hispanic,TX,"NC, P",Drug Possession or Sale,6 months,2011,2012,,,,,,Y,, Herrera,Juan,21,Hispanic,CA,,Murder,80 to life,1999,2006,,,,,Y,,,Y Herrera,Moses,18,Hispanic,TX,,Attempted Murder,7 years,1990,1992,,,Y,,,,, Hicks,Anthony,26,Black,WI,,Sexual Assault,19 years,1991,1997,Y,,Y,,,Y,,Y Hicks,Tyrone,41,Black,NY,,Sexual Assault,8 years,2000,2014,Y,*,Y,,,,, Hidalgo,Olmedo,24,Hispanic,NY,,Murder,25 to Life,1992,2005,,,Y,,,,Y, Highers,Raymond,20,Caucasian,MI,JI,Murder,Life without parole,1988,2013,,,Y,,Y,,, Highers,Thomas,21,Caucasian,MI,JI,Murder,Life without parole,1988,2013,,,Y,,Y,,, Hill,Darrin,26,Black,LA,,Sexual Assault,"Found Not Guilty by reason of Insanity, committed to state mental facility",1999,2012,Y,*,Y,,,,, Hill,Dartangnan,25,Black,OH,CV,Kidnapping,24 years,2001,2005,,,Y,,,,,Y Hill,Harold,16,Black,IL,"A, CDC",Murder,Life without parole,1994,2005,Y,*,,Y,Y,Y,Y, Hill,James,17,Black,IN,,Sexual Assault,35 years,1982,2009,,,Y,,Y,,Y, Hill,Michael,18,Black,MA,,Murder,Life without parole,1996,2002,,,,,Y,,Y,Y Hill,Robert,26,Black,NY,CIU,Murder,18 to life,1988,2014,,,,,Y,,Y,Y Hinton,Anthony,29,Black,AL,,Murder,Death,1986,2015,,,Y,,,Y,,Y Hobley,Madison,26,Black,IL,"A, CV",Murder,Death,1990,2003,,,Y,Y,Y,Y,Y, Hockersmith,Letha Jean,31,Caucasian,OK,"CV, F, SBS",Murder,Life,1994,1999,,,,,,Y,, Hodges,Devron,21,Black,TX,"NC, P",Robbery,10 years,2013,2015,,,,,Y,,Y, Hoehmann,Joseph,35,Caucasian,NY,"CV, NC",Child Sex Abuse,1 
year,1996,1998,,,,,Y,,, Hoffman,Jonathon,41,Black,NC,,Murder,Death,1996,2007,,,,,Y,,Y, Hogan,Samuel,49,Black,WI,NC,Sexual Assault,6 years,1995,1997,,,,,Y,,,Y Holdren,Larry,27,Caucasian,WV,,Sexual Assault,30 to 60 years,1984,2000,Y,,Y,,,Y,, Holemon,Jeffrey,23,Caucasian,AL,,Sexual Assault,Life,1988,1999,Y,,Y,,,,, Holland,Dana,25,Black,IL,,Sexual Assault,90 years,1997,2003,Y,,Y,,Y,Y,Y, Holland,Dana,25,Black,IL,,Attempted Murder,28 years,1995,2003,Y,*,,,Y,,, Holmes,Phillip,33,Black,TX,"CIU, NC, P",Drug Possession or Sale,6 months,2014,2015,,,,,,Y,, Holton,Rudolph,33,Black,FL,"CV, JI",Murder,Death,1987,2003,Y,*,Y,,Y,,Y, Honaker,Edward,33,Caucasian,VA,,Sexual Assault,Life,1985,1994,Y,,Y,,Y,Y,Y, Hood,Tyrone,29,Black,IL,"CDC, CIU",Murder,75 years,1996,2015,,,Y,,Y,,Y, Hooper,John,48,Black,NY,"NC, P",Gun Possession or Sale,1 year,2013,2015,,,,,Y,,Y, Hopkins,Troy,23,Black,VA,,Murder,28 years,1990,2005,,,Y,,,,, Hornoff,Jeffrey Scott,26,Caucasian,RI,,Murder,Life,1996,2003,,,,,,,, Horton,Alice,20,Caucasian,TX,"F, NC, P",Drug Possession or Sale,3 years,2012,2013,,,,,,Y,, House,Paul G.,23,Caucasian,TN,,Murder,Death,1986,2009,Y,*,,,,Y,, House,Roger,35,Black,F-MIL,FED,Military Justice Offense,Fine,2002,2009,,,,,Y,Y,, "Housler, Jr.",David,19,Caucasian,TN,JI,Murder,Life without parole,1997,2014,,,,Y,Y,,,Y Houston,Darrell,23,Black,OH,,Murder,33 to Life,1992,2010,,,,,Y,,, Houston,Elton,28,Black,IL,,Murder,35 years,1984,1989,,,Y,,,,, Howard,DeAndre,20,Black,CA,,Murder,Life,2003,2013,,,Y,,,,,Y Howard,Stanley,21,Black,IL,,Murder,Death,1987,2003,,,,Y,Y,,Y, Howard,Timothy,23,Black,OH,,Murder,Death,1977,2003,,,Y,,Y,Y,Y, Hubbard,Donna Sue,30,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1995,,,,Y,Y,,Y, Hudson,Larry,19,Black,LA,CDC,Murder,Death,1967,1993,,,Y,,Y,,Y, Hugee,Larry Lane,48,Black,MD,,Robbery,25 years,2004,2013,,,Y,,Y,,Y, Hughes,Elicia,32,Black,MS,F,Murder,Life,2007,2007,,,,,,,,Y Hungerford,Jordan,24,Caucasian,TX,"NC, P",Drug Possession or Sale,3 years,2014,2014,,,,,,Y,, 
Hunt,Darryl,19,Black,NC,,Murder,Life,1985,2004,Y,,Y,,Y,,Y, Hunt,H. Guy,54,Caucasian,AL,NC,Official Misconduct,Probation,1993,1998,,,,,,,, Hunt,Kenya,17,Black,CA,,Robbery,Not sentenced,1991,1991,,,Y,,,,, Hunter,Darrell,22,Black,CA,,Murder,Life without parole,2000,2008,,,Y,,,,, Hurt,Davie,17,Black,WV,CDC,Murder,Life,1998,2014,,,,,Y,,Y,Y Ireland,Kenneth,16,Caucasian,CT,,Murder,50 years,1989,2009,Y,,,,Y,Y,, Isham,Rommain,31,Native American,WI,"CV, NC",Child Sex Abuse,10 years,1990,2010,,,,,Y,,, "Islas, Jr.",Francisco,47,Hispanic,AZ,P,Drug Possession or Sale,90 days,2013,2013,,,,,Y,,Y, Jackson,Dwayne,18,Black,NV,P,Robbery,4 years,2003,2011,Y,,,,,Y,, Jackson,John,59,Black,WA,,Drug Possession or Sale,6 years and 4 months,1996,2001,,,,,Y,,,Y Jackson,Raymond,38,Black,TX,CIU,Sexual Assault,Life,1984,2012,Y,,Y,,,,, Jackson,Rickie,40,Black,PA,,Murder,Life without parole,1998,2000,,,,,Y,Y,, Jackson,Ricky,18,Black,OH,,Murder,Death,1975,2014,,,,,Y,,Y, Jackson,Willie,17,Black,LA,,Sexual Assault,40 years,1989,2006,Y,,Y,,,Y,, Jackson-Knight,Eric,21,Black,NY,"A, JI, NC",Murder,25 to Life,1980,1994,,,,Y,Y,Y,Y,Y "Jacobs, Jr.",Ralph A.,30,Caucasian,IN,"CDC, P",Murder,8 years,1991,1993,,,,Y,Y,,Y, Jacobson,Nathan,51,Caucasian,F-CA,"FED, NC, P",Fraud,Not sentenced,2008,2014,,,,,,,, James,Ernestine,42,Native American,F-WA,"F, FED",Manslaughter,Probation,1995,1999,,,,,,,, James,Gary Lamar,23,Black,OH,,Murder,Death,1977,2003,,,Y,,Y,Y,Y, James,Henry,20,Black,LA,,Sexual Assault,Life without parole,1982,2011,Y,,Y,,,,,Y James,Maurice,27,Black,NY,,Drug Possession or Sale,1 1/2 to 3 years,1990,1992,,,,,Y,,Y, Jamison,Derrick,23,Black,OH,CDC,Murder,Death,1985,2005,,,,,Y,,Y, Jardine,Alvin,20,,HI,,Sexual Assault,35 years,1992,2011,Y,*,Y,,,,, Jasin,Thomas,46,Caucasian,F-PA,FED,Conspiracy,"two years and $10,000 fine",1992,2003,,,,,,,,Y Jean,Lesly,22,Black,NC,,Sexual Assault,Life,1982,1991,Y,,Y,,,,Y, Jeffers,John,14,Caucasian,IN,"P, PH",Murder,34 years,1977,2003,,,,Y,,,, Jenkins,Jerry 
Lee,25,Caucasian,MD,P,Sexual Assault,Life,1987,2013,Y,,Y,,,,, Jennette,Alvena,21,Black,NY,CIU,Murder,18 to life,1988,2014,,,,,Y,,Y, Jernigan,Rachel,30,Hispanic,F-AZ,"F, FED",Robbery,14 years,2001,2008,,,Y,,,,Y, Jimenez,Thaddeus,13,Hispanic,IL,,Murder,50 years,1994,2009,,,Y,,Y,,, Jimerson,Verneal,25,Black,IL,"CDC, JI",Murder,Death,1985,1996,Y,,Y,,Y,,Y, Johnson,Albert K.,30,Black,CA,,Sexual Assault,24 years,1992,2002,Y,,Y,,,,Y, Johnson,Andrew,39,Black,WY,,Sexual Assault,Life,1989,2013,Y,,,,Y,,,Y Johnson,Anthony,27,Black,LA,,Murder,Life,1986,2010,Y,,Y,Y,,Y,Y, Johnson,Anthony,17,Black,IL,,Murder,47 years,2007,2014,,,,,Y,,Y,Y Johnson,Antrone,17,Black,TX,"CIU, CV, NC, P",Child Sex Abuse,Life,1996,2009,,,,,Y,,Y, Johnson,Arthur,33,Black,MS,,Sexual Assault,55 years,1993,2008,Y,,Y,,,,, Johnson,Calvin Crawford,25,Black,GA,,Sexual Assault,Life,1983,1999,Y,,Y,,,Y,, Johnson,Craig,27,Black,LA,NC,Robbery,70 years,1996,1997,,,,,Y,,, Johnson,Donnell,16,Black,MA,CV,Murder,18 to 20 years,1996,2000,,,Y,,,,Y, Johnson,Johnnie,18,Black,CT,,Murder,75 years,2001,2013,,,,,Y,,Y, Johnson,Juan,19,Black,IL,,Murder,30 years,1991,2004,,,,,Y,,Y,Y Johnson,Larry,30,Black,MO,,Sexual Assault,Life,1984,2002,Y,,Y,,,,, Johnson,LaShawn,28,Black,F-MT,"CDC, FED",Drug Possession or Sale,35 years,2006,2015,,,,,Y,,Y, Johnson,Latisha,18,Black,NY,"CDC, F",Attempted Murder,40 years,2007,2014,,,Y,Y,Y,,Y, Johnson,Nathaniel,40,Black,NY,,Robbery,5 years,2010,2013,,,Y,,,,Y, Johnson,Patrick,22,Black,NY,,Murder,15 to Life,1983,1993,,,,,Y,,Y,Y Johnson,Richard,31,Black,IL,,Sexual Assault,36 years,1992,1996,Y,,Y,,,,,Y Johnson,Rickey,26,Black,LA,,Sexual Assault,Life without parole,1983,2008,Y,,Y,,,,Y, Johnson,Sophia,23,Other,WA,"CDC, F",Murder,43 years,2003,2005,,,,,Y,,Y, Johnson,Terrell,18,Black,PA,,Murder,Life without parole,1995,2012,,,,,Y,,,Y Johnson,Timothy,22,Black,GA,P,Murder,Life,1984,2013,,,,,Y,,,Y Johnston,Dale,49,Caucasian,OH,,Murder,Death,1984,1990,,,Y,,,Y,Y, Jones,David Allen,32,Black,CA,,Murder,36 to 
Life,1995,2004,Y,,,Y,,,, Jones,Dewey,30,Caucasian,OH,JI,Murder,Life,1995,2014,Y,*,Y,,Y,,Y, Jones,Elroy Lucky,26,Black,MI,,Murder,Life without parole,2006,2014,,,,,Y,,Y, Jones,Jerry Jamaal,24,Black,WI,NC,Assault,2 years,2010,2011,,,,,Y,,,Y Jones,Joe C.,23,Black,KS,,Sexual Assault,Life,1986,1992,Y,,Y,,,,, Jones,Levon Junior,28,Black,NC,,Murder,Death,1993,2008,,,,,Y,,Y,Y Jones,Lonnie,34,Black,NY,,Murder,37 years,2002,2007,,,,,Y,,Y, Jones,Lydia Diane,,Black,AL,F,Drug Possession or Sale,Life without parole,2000,2006,,,,,,,,Y Jones,Melvin,30,Black,IL,,Murder,Life without parole,1983,1989,,,,Y,Y,,Y, Jones,Morris S.,16,Black,TX,,Assault,15 years,1998,2001,,,Y,,,,, Jones,Ronald,34,Black,IL,,Murder,Death,1989,1999,Y,,Y,Y,,,Y, Jones,Stephen,23,Black,F-MO,FED,Drug Possession or Sale,20 years,1998,2010,,,,,Y,,Y, Jones,Troy Lee,30,Black,CA,,Murder,Death,1982,1996,,,,,,,,Y Jones,Tyrone,20,Black,MD,CV,Murder,Life,1999,2010,,,Y,,,Y,Y, Jordan,Queen Esther,45,Black,TX,"F, NC, P",Drug Possession or Sale,6 months,2008,2009,,,,,,Y,, Jordan,Queen Esther,45,Black,TX,"F, NC, P",Drug Possession or Sale,Probation,2009,2009,,,,,,Y,, Jose,Terrence,32,Hispanic,MI,"CV, NC",Child Sex Abuse,25 to 40 years,2010,2014,,,,,,,,Y Joseph,Malenne,27,Black,FL,F,Destruction of Property,Not sentenced,2010,2010,,,Y,,,,, Jovanovic,Oliver,30,Caucasian,NY,NC,Sexual Assault,15 to Life,1998,2001,,,,,Y,,, Julian,Billy,20,Caucasian,IN,"A, CDC, JI",Arson,15 years,2003,2010,,,,,Y,,Y, Kagonyera,Kenneth,20,Black,NC,"CDC, P",Murder,12 to 15 years,2001,2011,Y,,,Y,Y,,Y, Kamienski,Paul,35,Caucasian,NJ,,Murder,Life,1988,2010,,,,,Y,,Y, Karage,Entre Nax,21,Asian,TX,CV,Murder,Life,1997,2005,Y,,,,,Y,, Karibi-Ikiriko,Abere,27,Black,MD,"F, NC",Murder,15 years,2005,2007,,,,,,,, "Kauffman, Jr.",Carl Joe,21,Caucasian,PA,CDC,Robbery,5 to 15 years,2001,2004,,,,,Y,,Y, Keifer,Lee,25,Caucasian,OK,"CV, NC",Child Sex Abuse,Life,1993,1997,,,,,Y,,, Keko,Anthony,62,Caucasian,LA,,Murder,Life,1993,1998,,,,,Y,Y,, 
Kelly,Donald,26,Caucasian,PA,,Murder,Life,1990,1993,,,,,Y,,,Y Kelly,Robert,40,Caucasian,NC,"CSH, CV, NC",Child Sex Abuse,Life,1992,1997,,,,,Y,,Y, "Kelly, Jr.",William M.,37,Caucasian,PA,P,Murder,10 to 20 years,1990,1993,Y,,Y,Y,,,, Kennedy,Thomas,32,Caucasian,WA,"CV, NC",Child Sex Abuse,15 years,2002,2012,,,,,Y,,, Keuylian,Viken,45,Caucasian,F-CA,"FED, NC, P",Fraud,Not sentenced,2009,2014,,,,,Y,,, Kezer,Joshua,17,Caucasian,MO,JI,Murder,30 years,1994,2009,Y,*,Y,,Y,,Y, Khatibi,Kian,22,Caucasian,NY,,Assault,7 to 14 years,1999,2008,,,,,Y,,Y, Kiel,Archie,48,Caucasian,MI,NC,Drug Possession or Sale,5 months,2010,2012,,,,,,,, Killian,Gloria,35,Caucasian,CA,"CDC, F",Murder,32 to Life,1986,2002,,,,,Y,,Y, Kimbell,Thomas,32,Caucasian,PA,"CV, JI",Murder,Death,1998,2002,,,,,Y,,, Kimsey,Martin,45,Caucasian,TX,,Robbery,Life,1985,1990,,,Y,,,,, Kindle,Jason,31,Black,CA,,Robbery,70 to Life,2000,2003,,,Y,,,,,Y King,Susan,38,Caucasian,KY,"F, P",Manslaughter,10 years,2008,2014,,,,,Y,Y,Y, Kinge,Shirley,54,Black,NY,"A, F",Arson,18 to 44 years,1990,1992,,,,,Y,Y,Y, Kiper,Ben,26,Caucasian,KY,"CV, NC",Child Sex Abuse,55 years,2000,2007,,,,,Y,,, Kirkwood,Justin,22,Caucasian,PA,,Robbery,3 1/2 to 7 years,2003,2006,,,Y,,,,Y, Kitchen,Ronald,22,Black,IL,"CV, JI",Murder,Death,1990,2009,,,,Y,Y,,Y, Kittilstad,Richard L.,50,Caucasian,WI,NC,Solicitation,10 years,2001,2007,,,,,Y,,, Kittler,Eric,15,Black,IL,,Murder,35 years,1999,2002,,,,Y,,,Y, Kluppelberg,James,18,Caucasian,IL,"A, CV, NC",Murder,Life,1989,2012,,,,Y,Y,Y,Y,Y Knapper,Isaac,17,Black,LA,,Murder,Life,1979,1991,,,,,Y,,Y, Kniffen,Brenda,29,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1984,1996,,,,,Y,Y,Y, Kniffen,Scott,28,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,Life,1984,1996,,,,,Y,Y,Y, Knight,Curtis,35,Black,NJ,,Murder,30 years,1990,2001,,,,,Y,,Y, Knupp,Richard,40,Caucasian,NY,"CV, NC",Child Sex Abuse,13 to 39 years,1989,1992,,,,,Y,,,Y Kogut,John,19,Caucasian,NY,"CV, JI",Murder,33 years and 4 months,1986,2005,Y,,,Y,Y,Y,Y, Kordonowy,Paul 
D.,24,Caucasian,MT,,Sexual Assault,30 years,1990,2003,Y,,,,Y,Y,Y, Kotler,Kerry,22,Caucasian,NY,,Sexual Assault,25 to 50 years,1982,1992,Y,,Y,,Y,Y,Y, Koubriti,Karim,22,Other,F-MI,FED,Supporting Terrorism,5 years,2003,2004,,,,,Y,,Y, Krone,Ray,34,Caucasian,AZ,,Murder,Death,1992,2002,Y,,,,,Y,Y, Kunze,David,44,Caucasian,WA,"CV, JI",Murder,Life without parole,1997,2001,,,,,Y,Y,Y, Kyles,Curtis,24,Black,LA,,Murder,Death,1984,1998,,,Y,,Y,,Y, LaBatte,Beth,24,Caucasian,WI,"F, JI",Murder,Life,1997,2006,Y,*,,Y,Y,,, "LaBolt, Jr.",William,42,Caucasian,NY,P,Burglary/Unlawful Entry,3 to 6 years,1990,1994,,,,,Y,Y,Y, LaBran,Dwight,23,Black,LA,,Murder,Life,1997,2001,,,,,Y,,, Lackey,Dan,28,Caucasian,NY,NC,Sexual Assault,8 years,2004,2007,,,,Y,Y,,Y,Y Lamb,Larry,36,Black,NC,,Murder,Life,1993,2013,,,,,Y,,Y,Y Landano,Vincent James,36,Caucasian,NJ,CDC,Murder,Life,1977,1998,,,,,Y,,Y, Lane,Lionel,33,Black,IL,JI,Murder,60 years,1995,1995,,,,,Y,,, Lantigua,Milton,19,Hispanic,NY,,Murder,20 to Life,1993,1996,,,,,Y,,Y, Lape,Michael Bruce,27,Caucasian,TX,"NC, P",Drug Possession or Sale,25 years,2012,2013,,,,,,Y,, Larkin,Francis,55,Caucasian,F-WI,FED,Robbery,32 years and 6 months,1991,1999,,,Y,,,,Y, Larkins,Ronald,33,Black,OH,CDC,Murder,Life without parole,1986,2006,,,,,Y,,Y, Larsen,Daniel,30,Caucasian,CA,,Other Nonviolent Felony,28 to life,1999,2014,,,Y,,Y,,Y,Y Larson,Robert E.,27,Caucasian,WA,CDC,Robbery,20 years,2009,2013,,,,,Y,,,Y Laughman,Barry,24,Caucasian,PA,,Murder,Life,1988,2004,Y,,,Y,,Y,, Lavernia,Carlos Marcos,29,Hispanic,TX,,Sexual Assault,Life,1985,2000,Y,,Y,,,Y,, Lawrence,Darrian Mark,20,Black,FL,,Murder,Life,2000,2005,,,,,Y,,,Y Lawson,Carl,23,Black,IL,CV,Murder,Death,1990,1996,,,,,,Y,,Y Lawson,Samuel,28,Caucasian,OR,,Murder,Life,2005,2014,,,Y,,,,Y, Lazzell,David,32,Caucasian,LA,"CV, NC, P",Child Sex Abuse,Probation,1991,2007,,,,,Y,,, Lea,Joe,24,Hispanic,CA,"CV, NC, P",Child Sex Abuse,16 years to life,2000,2009,,,,,Y,,, Leal-Del Carmen,Jonathan,44,Hispanic,F-CA,FED,Immigration,18 
months,2010,2012,,,,,Y,,Y, Leblanc,David,43,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,10 days,2014,2014,,,,,,,, Lee,Ah,20,Asian,PA,,Murder,Life,1988,2004,,,Y,,,,Y, Lee,Koua Fong,28,Asian,MN,CV,Manslaughter,8 years,2007,2010,,,,,,,,Y Lee,William,17,Black,MI,NC,Sexual Assault,15 to 30 years,2007,2010,Y,*,,,Y,,Y, LeFever,Virginia,37,Caucasian,OH,"F, NC",Murder,Life,1990,2011,,,,,Y,Y,, Lehmer,Joel,32,Caucasian,IA,"CV, NC, P, SBS",Manslaughter,15 years,1997,1998,,,,,,Y,, Leka,Sami,33,Caucasian,NY,,Murder,20 years,1990,2002,,,Y,,,,Y, Lemons,Anthony,19,Black,OH,,Murder,21 years to life,1995,2014,,,,,Y,,Y, Lemus,David,22,Hispanic,NY,,Murder,25 to Life,1992,2007,,,Y,,,,Y, Letcher,Donald,29,Caucasian,SD,"CV, NC",Child Sex Abuse,5 years,1995,1996,,,,,Y,,, Leverett,Ron,29,Black,GA,,Drug Possession or Sale,3 years,1993,1994,,,Y,,,,,Y Lewchuk,Dennis,33,Caucasian,NE,NC,Assault,5 to 10 years,1980,1996,,,,,Y,,, Lewis,Brandon,23,Caucasian,AZ,NC,Assault,Probation,2013,2014,,,,,Y,,Y, Lewis,David,46,Caucasian,F-CA,"FED, NC",Assault,7 years and 8 months,2000,2005,,,,,,,, Lewis,George,28,Black,MN,,Sexual Assault,12 years and 8 months,1988,1991,,,Y,,,,, Lewis,Nathaniel,18,Black,OH,NC,Sexual Assault,8 years,1997,2004,,,,,Y,,, Lewis,Scott,25,Black,CT,,Murder,120 years,1995,2015,,,,,Y,,Y, Likine,Selesa,42,Black,MI,"F, NC",Failure to Pay Child Support,48 days in jail and one year probation,2008,2012,,,,,,,, Limone,Peter,31,Caucasian,MA,CDC,Murder,Death,1968,2001,,,,,Y,,Y, Lind,Frank,44,Caucasian,CA,"CV, NC",Child Sex Abuse,30 to life,2002,2005,,,,,Y,,, Lindsey,Johnnie,28,Black,TX,CIU,Sexual Assault,Life,1983,2009,Y,,Y,,,,, Lindstadt,George,42,Caucasian,NY,"CV, NC",Child Sex Abuse,12 1/2 to 25 years,1989,2001,,,,,Y,Y,,Y Linehan,Mechele,23,Caucasian,AK,F,Murder,99 years,2007,2012,,,,,,,Y, Linscott,Steven,26,Caucasian,IL,,Murder,40 years,1982,1992,Y,,,Y,,Y,Y, Lisker,Bruce,17,Caucasian,CA,JI,Murder,16 to Life,1985,2009,,,,,Y,Y,Y, Little,Anthony,43,Black,MI,NC,Failure to Pay Child 
Support,Probation,2009,2013,,,,,,,, Livingston,Christopher,19,Black,NY,,Murder,20 to Life,1987,1991,,,,,,Y,,Y Lloyd,Eddie Joe,34,Black,MI,CV,Murder,Life,1985,2002,Y,,,Y,,,,Y Locke,Ruben,35,Black,F-IL,"CDC, FED",Drug Possession or Sale,Not sentenced,1996,2000,,,,,Y,,Y, Loftin,Tyrone,48,Black,NY,NC,Sexual Assault,17 years,2008,2010,,,,,Y,,, Loftus,Brendan,23,Caucasian,CA,"CV, NC",Child Sex Abuse,5 years,1997,2000,,,,,Y,,, Logan,Alton,28,Black,IL,,Murder,Life,1982,2008,,,Y,,,,Y, Logan,Roger,35,Black,NY,CIU,Murder,25 to life,1999,2014,,,Y,,Y,,Y, Lomax,Walter,20,Black,MD,CIU,Murder,Life,1968,2014,,,Y,,,,Y,Y Long,Christopher,23,Black,MI,NC,Assault,3 1/2 to 7 years,2008,2008,,,,,Y,,, Long,Lee,34,Black,NY,,Sexual Assault,8 to 24 years,1995,2002,,,Y,,,,, Longus,Michael,26,Black,DC,,Murder,25 years,2005,2013,,,,,Y,,Y, Lopez,George,17,Hispanic,CA,,Robbery,13 years,2000,2002,,,Y,,,,,Y Lopez,William,30,Hispanic,NY,CIU,Murder,25 to life,1990,2014,,,,,Y,,Y,Y Love,Corey Anthony,20,Black,TX,"NC, P",Drug Possession or Sale,210 days,2006,2013,,,,,,Y,, Love,Dwight,22,Black,MI,,Murder,Life,1982,2001,,,Y,,Y,,Y, Loveless,Debbie,31,Caucasian,TX,"CV, F",Murder,Life,1989,1994,Y,*,,,,Y,Y,Y Lowery,Eddie,22,Caucasian,KS,,Sexual Assault,11 to Life,1982,2003,Y,,,Y,Y,Y,Y, Loyd,Courtney,32,Black,TX,"NC, P",Drug Possession or Sale,6 months,2014,2015,,,,,,Y,, Lumpkins,Todd,27,Black,NY,,Murder,24 years to life,1987,1989,,,Y,,,,Y, Lunsford,Kacie,29,Caucasian,TX,"CIU, F, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,Y,, Luster,Albert,18,Black,WI,"CV, NC",Child Sex Abuse,Unknown,1990,1992,,,,,Y,,,Y Lynn,Eric,25,Black,MD,,Murder,Life,1994,2007,,,,,Y,,Y,Y Lyons,Antonino,40,Black,F-FL,"FED, NC",Robbery,Not sentenced,2001,2004,,,,,Y,,Y, Lyons,Marcus,29,Black,IL,,Sexual Assault,6 years,1988,2007,Y,,Y,,,,, Mable,Kendrick,38,Black,TX,"CIU, NC, P",Drug Possession or Sale,2 years,2014,2014,,,,,,Y,, Macias,Federico,31,Hispanic,TX,"CDC, JI",Murder,Death,1984,1993,,,,,Y,,,Y Mack,Norman,25,Black,TX,"NC, 
P",Drug Possession or Sale,six months,2005,2006,,,,,,,, Madrigal,Rafael,25,Hispanic,CA,,Attempted Murder,53 to Life,2002,2009,,,Y,,,,,Y Magnan,Paul Philip,46,Caucasian,CA,,Drug Possession or Sale,25 to Life,2000,2006,,,,,,,,Y Mahan,Dale,22,Caucasian,AL,,Sexual Assault,35 years,1986,1998,Y,,Y,,,,, Mahan,Ronnie,26,Caucasian,AL,,Sexual Assault,Life without parole,1986,1998,Y,,Y,,,,, Maher,Dennis,22,Caucasian,MA,,Sexual Assault,Life,1984,2003,Y,,Y,,,,, Mahoney,Michael,,Caucasian,F-MIL,"FED, NC",Drug Possession or Sale,"Loss of 6 months pay, bad conduct discharge, reduction in grade",2000,2003,,,,,,,Y, Mai,Hien Juan,34,Caucasian,TX,"NC, P",Drug Possession or Sale,120 days,2012,2014,,,,,,Y,, Maldonado,Robert,20,Black,NY,,Attempted Murder,8 to 16 years,1998,2004,,,Y,,,,Y, Manfredi,John,48,Caucasian,NY,NC,Bribery,2 to 6 years,1989,1994,,,,,Y,,Y, Manning,Jay Cee,28,Caucasian,NV,"CSH, CV, NC",Child Sex Abuse,Life,1990,1998,,,,,Y,,, Manning,Warren Douglas,30,Black,SC,,Murder,Death,1989,1999,,,,,,,, Manning,Willie,24,Black,MS,,Murder,Death,1996,2015,,,,,Y,,Y, Mardlin,Frederick,31,Caucasian,MI,"A, NC",Arson,3 to 20 years,2007,2013,,,,,,Y,, Marsh,Kenneth,28,Caucasian,CA,"CV, NC",Murder,15 to Life,1983,2004,,,,,,Y,, Marshall,Michael,41,Black,GA,P,Robbery,4 years,2008,2009,Y,,Y,,,,, Marshall,Ronnie,28,Caucasian,TN,P,Murder,Life,1990,1996,,,,,Y,,Y, Martin,Kevin,17,Black,DC,"CDC, P",Manslaughter,15 to Life,1984,2014,Y,*,,,Y,Y,Y,Y Martin,Mario,17,Black,TX,"NC, P",Drug Possession or Sale,30 days,2009,2010,,,,,,Y,, Martinez,Angelo,19,Hispanic,NY,,Murder,25 to Life,1986,2002,,,Y,,Y,,, Martinez,Joaquin Jose,23,Hispanic,FL,JI,Murder,Death,1997,2001,,,,,Y,,Y, Martorano,Raymond,53,Caucasian,PA,CDC,Murder,Life without parole,1984,1999,,,,,Y,,Y, Mason,Terrence,32,Black,NY,CDC,Robbery,10 to 20 years,1987,1995,,,Y,,Y,,,Y Massey,Shawn Giovanni,25,Black,NC,CV,Kidnapping,11 1/2 to 15 1/4 years,1999,2010,,,Y,,,,Y, Massingill,Anthony,18,Black,TX,CIU,Robbery,Life,1980,2014,Y,,Y,,,,, 
Masters,Timothy,15,Caucasian,CO,,Murder,Life,1999,2008,Y,*,,,Y,,Y, Matthews,Ryan,17,Black,LA,CDC,Murder,Death,1999,2004,Y,,Y,,Y,,Y, Mawson,Kimberly,32,Caucasian,RI,"CV, F",Murder,35 years,2007,2012,,,,,Y,,, Mayes,Larry,30,Black,IN,,Sexual Assault,80 years,1982,2001,Y,,Y,,,,Y, McAlister,Michael,29,Caucasian,VA,,Sexual Assault,35 years,1986,2015,,,Y,,,,, McAnally,Tamara,38,Caucasian,CA,"F, P",Fraud,"Probation and Restitution of $422,000",2004,2011,,,,,,,Y, McCabe,Travis,39,Caucasian,F-LA,"FED, NC",Perjury,Not sentenced,2010,2014,,,,,Y,,, McCaffrey,William,28,Caucasian,NY,NC,Sexual Assault,20 years,2006,2009,,,,,Y,,, McCallum,David,16,Black,NY,"CDC, CIU",Murder,25 years to life,1986,2014,Y,*,,Y,Y,,Y, McCarty,Curtis,20,Caucasian,OK,,Murder,Death,1986,2007,Y,,,,Y,Y,Y, McCaughn,Patrick,25,Caucasian,IL,,Robbery,12 years,1996,1999,,,Y,,,,, McClain,Anthony,35,Black,OH,CV,Murder,18 years to life,1995,2006,,,,,,,Y, McClaugherty,Charles,18,Caucasian,NM,"CDC, CV",Murder,Life,2001,2008,,,,,Y,,Y, McClendon,Robert,34,Black,OH,CV,Child Sex Abuse,15 to Life,1991,2008,Y,,Y,,,,, McCollum,Claude,26,Black,MI,,Murder,Life,2006,2007,,,,Y,,,Y,Y McCollum,Henry,19,Black,NC,"CDC, CV",Murder,Death,1984,2014,Y,,,Y,Y,,Y, McCord,Todd,21,Black,NY,,Murder,25 years to life,1987,1994,,,Y,,,,, McCormick,Michael Lee,32,Caucasian,TN,,Murder,Death,1987,2007,Y,*,,,,Y,,Y McCracken,Terence,18,Caucasian,PA,,Murder,Life,1983,1995,,,,,Y,Y,Y, McCray,Antron,14,Black,NY,CDC,Sexual Assault,5 to 10 years,1990,2002,Y,,,Y,Y,Y,, McCray,Brian,24,Black,VA,,Murder,43 years,1993,1994,,,,,Y,,Y, McCrimmon,Christopher,20,Black,AZ,,Murder,Death,1993,1997,,,,,Y,,Y, McCuan,Alvin,28,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,Life,1984,1996,,,,,Y,Y,Y, McCuan,Deborah,25,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1984,1996,,,,,Y,Y,Y, McCullough,Robert Wayne,30,Caucasian,TX,"CV, NC",Child Sex Abuse,Probation,2000,2011,,,,,Y,,, McDowell,Vidale,17,Black,MI,CDC,Murder,Life,2003,2004,,,,,Y,,, McGee,Arvin,25,Black,OK,,Sexual 
Assault,Life,1989,2002,Y,,Y,,,,, McGee,Leroy,23,Black,FL,,Robbery,4 years and 6 months,1991,1995,,,Y,,,,,Y McGhee,Curtis,17,Black,IA,"CDC, JI, P",Murder,Life,1978,2011,,,,,Y,,Y,Y McGowan,Thomas,24,Black,TX,CIU,Sexual Assault,Life,1985,2008,Y,,Y,,,,, McInnis,Edward,27,Black,NC,P,Sexual Assault,Life,1988,2015,Y,*,,Y,Y,,,Y McIntyre,Richard,31,Caucasian,ND,NC,Assault,60 days,1991,1992,,,,,,,, McKay,Cornell,22,Black,MO,,Robbery,12 years,2013,2015,,,Y,,,,Y, McKenzie,Gerald,24,Black,MI,CV,Attempted Murder,Life,1984,2004,,,,,,,,Y McKinney,DeWayne,19,Black,CA,,Murder,Life,1981,2000,,,Y,,,,, McKinney,Lawrence,22,Black,TN,,Sexual Assault,100 years,1978,2009,Y,,Y,,,,, McLaughlin,Bruce,45,Caucasian,VA,"CV, NC",Child Sex Abuse,13 years,1998,2002,,,,,Y,Y,,Y McLaughlin,Mark,30,Caucasian,F-PA,FED,Tax Evasion/Fraud,Not sentenced,1996,2000,,,,,Y,,Y, McLaughlin,Russell,36,Caucasian,F-PA,FED,Tax Evasion/Fraud,Not sentenced,1996,1998,,,,,Y,,Y, McMahan,David,40,Caucasian,ME,"M, NC",Assault,Probation,2000,2001,,,,,Y,,, McMillan,Clark Jerome,22,Black,TN,CV,Sexual Assault,Life,1980,2002,Y,,Y,,,,, McMillian,Walter,45,Black,AL,CDC,Murder,Death,1988,1993,,,,,Y,,Y, McNair,Daniel,35,Black,SC,NC,Murder,Life,1986,2000,,,,,,Y,, McNary,Willie Earl,26,Black,CA,,Assault,11 years and 8 months,1984,1996,,,Y,,,,, McNenney,Edward,40,Caucasian,NY,CDC,Fraud,90 days,2008,2011,,,,,Y,,Y, McSherry,Leonard,37,Caucasian,CA,CV,Child Sex Abuse,48 years,1988,2001,Y,,Y,,,,, Medina,Alarico Joe,25,Hispanic,CO,,Murder,Life,1991,1995,,,Y,,,,, Melendez,Juan Roberto,32,Hispanic,FL,CDC,Murder,Death,1984,2002,,,,,Y,,Y, Melendez,Renee,29,Hispanic,NY,F,Theft,Not sentenced,2014,2014,,,,,,,, Mellen,Susan,42,Caucasian,CA,F,Murder,Life without parole,1998,2014,,,,,Y,,Y,Y Mendez,Alexandra,18,Caucasian,TX,"CIU, F, NC, P",Drug Possession or Sale,6 months,2014,2014,,,,,,Y,, Menendez,Ceaser,23,Hispanic,CA,P,Manslaughter,12 years,1996,2001,,,,,Y,,Y, Mengloi,Mary,30,Caucasian,FL,"F, NC",Theft,2.5 years probation,1990,1993,,,,Y,Y,,, 
Mercer,Michael,40,Black,NY,CV,Sexual Assault,21 to 41 years,1992,2003,Y,,Y,,,,, Merrill,Thomas,24,Caucasian,CA,,Murder,Life,1991,1995,,,,,,,Y,Y Meyers,Terrence,21,Black,LA,,Murder,Life without parole,1993,2010,,,,,Y,,Y,Y Michaels,Margaret Kelly,22,Caucasian,NJ,"CSH, CV, F, NC",Child Sex Abuse,47 years,1988,1994,,,,,Y,,Y, Mikes,Melvin,19,Black,CA,,Murder,25 to Life,1985,1992,,,,,,,,Y Miles,Richard,19,Black,TX,CIU,Murder,60 years,1995,2012,,,,,Y,Y,Y, Milke,Debra,25,Caucasian,AZ,"CDC, CV, F",Murder,Death,1990,2015,,,,Y,Y,,Y, Miller,Benjamin,42,Caucasian,CT,CV,Murder,25 years,1973,1989,,,,Y,,,Y,Y Miller,Billy Wayne,31,Black,TX,,Sexual Assault,Life,1984,2006,Y,,Y,,,,, Miller,Gina,25,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y, Miller,Harry,45,Black,UT,,Robbery,5 to Life,2004,2007,,,Y,,,,Y,Y Miller,Jerry,23,Black,IL,,Sexual Assault,45 years,1982,2007,Y,,Y,,,,, Miller,Joel,35,Caucasian,F-CA,FED,Gun Possession or Sale,2 years,2012,2014,,,,,Y,,, Miller,John,42,Caucasian,TX,"CV, NC",Murder,Life,1989,1994,Y,*,,,,Y,Y,Y Miller,Maurice Ladon,18,Black,OK,,Murder,Life,2005,2007,,,,Y,,,, Miller,Morgan,27,Black,OH,,Robbery,20 to 40 years,1984,1992,,,Y,,,,,Y Miller,Neil,22,Black,MA,,Sexual Assault,26 to 45 years,1990,2000,Y,,Y,,,Y,, Miller,Thomas,19,Caucasian,FL,NC,Manslaughter,15 years,1997,2001,,,,,Y,,Y, "Miller, Jr.",Jesse,17,Black,FL,JI,Murder,Life,2009,2014,,,,,Y,Y,,Y "Miller, Jr.",Lawrence J.,36,Caucasian,CT,CV,Assault,32 years,1983,1997,,,Y,,,,Y, "Miller, Jr.",Robert Lee,27,Black,OK,,Murder,Death,1988,1998,Y,,,Y,Y,Y,, Mills,Randall,43,Caucasian,TN,CV,Child Sex Abuse,20 years,2000,2014,Y,*,,,Y,Y,,Y Minnitt,Andre,21,Black,AZ,,Murder,Death,1993,2002,,,,,Y,,Y, Minsky,Gerald,42,Caucasian,F-KY,"CDC, FED, NC",Conspiracy,"18 months; $100,000 fine, $82,000 restitution",1991,1992,,,,,Y,,Y, Miranda,Adam,21,Hispanic,CA,P,Murder,Life,1983,2009,,,,,Y,,Y, Miranda,Roberto,38,Black,NV,CDC,Murder,Death,1982,1996,,,,,Y,,,Y Mitchel,Albert,42,Black,NY,"NC, P",Drug 
Possession or Sale,Not sentenced,2007,2008,,,,,Y,,Y, Mitchell,Charlie,34,Black,MI,,Murder,10 to 15 years,1989,2006,,,,,Y,,,Y Mitchell,Keith,18,Black,DC,,Murder,36 years to life,1994,2015,,,,,Y,,Y,Y Mitchell,Marvin,21,Black,MA,CV,Child Sex Abuse,9 to 25 years,1990,1997,Y,,Y,,,Y,Y, Mitchell,Perry,20,Black,SC,CV,Sexual Assault,30 years,1984,1998,Y,,Y,,,Y,, Modahl,Jeffrey,29,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,48 years,1986,1999,,,,,Y,,Y, Moldowan,Jeffrey,20,Caucasian,MI,,Sexual Assault,60 to 90 years,1991,2003,,,,,Y,Y,, Monroe,Beverly,54,Caucasian,VA,"F, NC",Murder,22 years,1992,2003,,,,Y,Y,Y,Y, Montalvo,Ruben,16,Hispanic,NY,,Murder,15 to Life,1988,2001,,,Y,,,,Y, Montgomery,Carl,44,Caucasian,IL,CDC,Burglary/Unlawful Entry,7 years,1984,1989,,,,,Y,,,Y Montgomery,Johnathan,14,Caucasian,VA,"CV, NC",Sexual Assault,7 years and six months,2008,2013,,,,,Y,,, Montgomery,LaDondrell,34,Black,TX,,Robbery,Life,2010,2011,,,Y,,,,,Y Montgomery,Melvin,31,Black,F-MO,FED,Drug Possession or Sale,6 years and 6 months,1995,1997,,,,,,,, Montgomery,Patrick,31,Caucasian,TX,"CV, NC",Child Sex Abuse,20 years,1987,2009,,,,,Y,,, Montoya,Lorenzo,14,Hispanic,CO,JI,Murder,Life without parole,2000,2014,Y,*,,Y,Y,,Y,Y Moon,Brandon,26,Caucasian,TX,,Sexual Assault,75 years,1988,2005,Y,,Y,,,Y,, Mooney,John,19,Caucasian,MD,CIU,Murder,Life,2010,2014,,,,,Y,,, Moore,Anthony,32,Black,NY,,Murder,25 to Life,1996,1999,,,Y,Y,,,Y, Moore,Clarence,36,Black,NJ,,Sexual Assault,Life,1987,2006,,,Y,,,,Y, Moore,Curtis,32,Black,VA,PH,Murder,Life,1978,2010,Y,*,,Y,,,Y, Moore,Daniel Wade,24,Caucasian,AL,,Murder,Death,2002,2009,,,,,,,Y, Moore,Jonathan,18,Black,IL,CV,Murder,76 years,2002,2012,,,Y,,,,, Moore,Joshua,19,Caucasian,CA,,Robbery,12 years,1999,2001,,,Y,,,,,Y Moore,Kenneth,45,Caucasian,OH,P,Murder,15 to Life,1995,2004,,,,,,,, Moore,Kenneth,29,Black,TX,"NC, P",Drug Possession or Sale,2 years,2014,2015,,,,,,Y,Y, Moore,Walter,37,Caucasian,MI,CDC,Murder,Life,1988,1996,,,,,Y,,Y, Morales,Jose,17,Hispanic,NY,,Murder,15 
to Life,1988,2001,,,Y,,,,Y, Morales,Raul,41,Hispanic,NY,NC,Sexual Assault,12 1/2 to 25 years,1988,1997,,,,,Y,,, Morales,Santiago Ventura,19,Hispanic,OR,,Murder,10 to Life,1986,1991,,,,,Y,,,Y Moreno,David,19,Hispanic,CA,"CDC, CV",Murder,Not sentenced,1998,2000,,,,,Y,,Y, Moreno,Noe,26,Hispanic,NC,P,Murder,18 to 22 years,2007,2012,,,,,,,,Y Morillo,Carlos,29,Hispanic,NY,,Murder,25 years to life,1993,2011,,,,,Y,,Y, Morris,Arthur,34,Caucasian,KS,"M, NC",Assault,Not sentenced,2014,2014,,,,,Y,,Y,Y Morris,Oscar,33,Black,CA,,Murder,Death,1983,2000,,,,,Y,,Y, Morton,Michael,32,Caucasian,TX,,Murder,Life,1987,2011,Y,,,,,Y,Y, Mosley,Sheldon,29,Black,TX,"CV, NC",Child Sex Abuse,60 years,1996,2013,,,,,Y,,,Y Moto,Vincent,22,Black,PA,,Sexual Assault,12 to 24 years,1987,1996,Y,,Y,,,,, Mowbray,Fredda Susie,39,Caucasian,TX,F,Murder,Life,1988,1998,,,,,Y,Y,Y, Mumphrey,Arthur,21,Black,TX,"CDC, CV",Child Sex Abuse,35 years,1986,2006,Y,,,,Y,,, Munchinski,David,25,Caucasian,PA,CDC,Murder,Life,1986,2013,,,,,Y,,Y, Munoz,Cesar,21,Hispanic,IL,"CIU, NC",Murder,45 years,2000,2013,,,,,,Y,,Y Munson,Adolph,37,Black,OK,JI,Murder,Death,1985,1995,,,Y,,Y,Y,Y, Murphy,Michelle,17,Caucasian,OK,"CV, F",Murder,Life without parole,1995,2014,Y,*,,Y,Y,Y,Y,Y Murphy,Thomas,39,Caucasian,WI,,Sexual Assault,20 years,1998,1999,,,,,,Y,, Murray,Lacresha,11,Black,TX,"CV, F",Murder,20 years,1996,2001,,,,Y,,Y,Y,Y Murray,Patrick,34,Caucasian,F-NY,"CDC, FED",Drug Possession or Sale,5 years,2010,2014,,,,,Y,Y,, Nash,A. 
Jabir,39,Black,NJ,"CV, NC",Child Sex Abuse,22 years,2002,2013,,,,,Y,,, Nash,Alprentiss,20,Black,IL,"CDC, CIU",Murder,80 years,1997,2012,Y,*,Y,,Y,,, Neely,Todd,18,Caucasian,FL,,Attempted Murder,15 years,1987,1989,,,Y,,,,Y, Neirynck,Brian,33,Caucasian,IN,"CV, NC",Child Sex Abuse,30 years,2003,2012,,,,,,,,Y Nelson,Bruce,24,Black,PA,CDC,Murder,Life,1982,1991,Y,,,,Y,,Y, Nelson,Gary,29,Black,GA,CV,Murder,Death,1980,1991,,,,,Y,Y,Y,Y Nelson,Jimmie,28,Caucasian,MI,,Murder,25 to 50 years,2010,2014,,,,,,,, Nelson,Robert,20,Black,MO,,Sexual Assault,98 years,1984,2013,Y,,Y,,,,, Nelson,Terry Lee,37,Caucasian,IL,JI,Murder,40 years,1989,1994,,,,Y,Y,,, Nesmith,Willie James,20,Black,PA,,Sexual Assault,9 to 25 years,1982,2001,Y,,Y,,,,, Newburn,Calvin,26,Black,CA,NC,Drug Possession or Sale,7 years,1997,1999,,,,,Y,,Y, Newsome,James,24,Black,IL,,Murder,Life,1980,1995,,,Y,,Y,,Y, Newton,Alan,21,Black,NY,,Sexual Assault,13 1/2 to 40 years,1985,2006,Y,,Y,,,,, Nguyen,Anthony,17,Asian,TX,"CIU, M, NC, P",Drug Possession or Sale,Probation,2013,2014,,,,,,,, Nickerson,Glen,29,Caucasian,CA,,Murder,Life without parole,1987,2003,Y,*,Y,,Y,,Y, Nieves,William,26,Hispanic,PA,,Murder,Death,1994,2000,,,Y,,Y,,Y,Y Nnodimele,Martin,47,Black,NY,CIU,Robbery,10 years,2008,2012,,,Y,,,,,Y Nobles,Glen,52,Black,TX,"NC, P",Sex Offender Registration,10 years,2013,2015,,,,,,,,Y Noel,Michael,55,Black,TX,"CIU, M, NC, P",Drug Possession or Sale,10 days,2014,2014,,,,,,Y,, Northrop,Alan,28,Caucasian,WA,,Sexual Assault,23 years and 6 months,1993,2010,Y,,Y,,,,, Norton,Roger,49,Caucasian,MA,"CV, NC",Child Sex Abuse,16 to 20 years,1992,2005,,,,,Y,,Y, Norwood,Matthew,15,Black,MS,P,Robbery,15 years,1997,2010,,,Y,,,,, Nowacki,Michael,57,Caucasian,CT,NC,Other Nonviolent Felony,15 months,2012,2015,,,,,,,, Nowak,Kevin,19,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,,, Nunez,Lorenzo,23,Hispanic,CA,,Murder,40 to Life,1995,2001,,,,,,,Y, Oakes,William,30,Caucasian,NY,,Murder,25 to Life,1985,1992,,,,Y,Y,,Y, 
Ochoa,Christopher,22,Hispanic,TX,P,Murder,Life,1989,2002,Y,,,Y,,,Y, Ochoa,James,19,Hispanic,CA,P,Robbery,2 years,2005,2006,Y,,Y,,,,, O'Connell,Frank,25,Caucasian,CA,,Murder,25 to Life,1985,2012,,,,,Y,,Y, O'Dell,Karen,37,Caucasian,FL,"F, NC, P",Drug Possession or Sale,"Probation, 1 yr + 50 hrs community service",1998,2000,,,,,Y,,Y, Odom,Kirk,18,Black,DC,,Sexual Assault,20 to 66 years,1981,2012,Y,,Y,,,Y,, O'Donnell,James,32,Caucasian,NY,,Sexual Assault,3 1/2 to 7 years,1998,2000,Y,,Y,,,Y,Y, Okongwu,Dominic,41,Black,NY,"CV, NC",Child Sex Abuse,50 years,1993,2011,,,,,Y,Y,,Y Olebar,Brandon,19,Native American,WA,,Robbery,16 years and six months,2003,2013,,,Y,,,,, Ollins,Calvin,14,Black,IL,CDC,Murder,Life without parole,1988,2001,Y,,,Y,Y,Y,Y, Ollins,Larry,16,Black,IL,CDC,Murder,Life,1988,2001,Y,,,,Y,Y,Y, O'Neal,Johnnie,25,Black,NY,CIU,Sexual Assault,10 to 20 years,1985,2013,,,Y,,Y,,Y, Orange,Leroy,32,Black,IL,"A, CDC, CV",Murder,Death,1985,2003,,,,Y,Y,,Y,Y Ortiz,Anthony,20,Hispanic,NY,,Murder,50 years to life,1993,2011,,,,,Y,,Y, Ortiz,Armando Rodriguez,15,Hispanic,CA,,Murder,Life,2003,2007,,,Y,,,,,Y Ortiz,Josue,22,Hispanic,NY,P,Manslaughter,25 years,2006,2015,,,,Y,,,, Ortiz,Juan,44,Hispanic,CA,,Fraud,5 years,2005,2010,,,,,,,, Ortiz,Luis,18,Hispanic,IL,"CDC, P",Murder,Life,2000,2003,,,,Y,Y,,Y, Ortiz,Sandra,30,Hispanic,NJ,"F, NC",Murder,Not sentenced,2001,2003,,,,,,,Y,Y Ortiz,Victor,24,Hispanic,NY,CV,Sexual Assault,12 1/2 to 25 years,1984,1996,Y,,Y,,,,, Osborne,Larry,17,Caucasian,KY,CDC,Murder,Death,1999,2002,,,,,Y,,, Ott,Chaunte,21,Black,WI,"CDC, CV, JI",Murder,Life,1996,2009,Y,,,,Y,,, Overton,Hannah,29,Caucasian,TX,"CV, F, NC",Murder,Life without parole,2007,2015,,,,,,Y,Y,Y Owens,James L.,22,Caucasian,MD,JI,Murder,Life,1988,2008,Y,*,,,Y,,Y, "Oxley, Jr.",Wayne,39,Caucasian,NY,JI,Murder,25 years to life,2006,2012,,,,,Y,,, Pacek,Jerry,13,Caucasian,PA,,Murder,10 to 20 years,1959,1991,,,,Y,,,, Pacheco,Justin,18,Hispanic,CA,"CDC, CV",Murder,Not sentenced,1998,2000,,,,,Y,,Y, 
Pack,Phillip,23,Caucasian,TN,"CDC, NC",Murder,25 years,2010,2013,,,,,Y,,, Pacyon,Douglas,28,Caucasian,NY,,Sexual Assault,3 years and 6 months,1985,2010,Y,,Y,,,,, Padgett,Larry Randal,40,Caucasian,AL,,Murder,Death,1992,1997,,,,,,Y,Y, Padilla,Fidel,53,Hispanic,NY,"NC, P",Other Nonviolent Felony,Not sentenced,2012,2014,,,,,,,Y,Y Paey,Richard,38,Caucasian,FL,NC,Drug Possession or Sale,25 years,2004,2007,,,,,Y,,, Palazzolo,John,48,Caucasian,MI,NC,Sexual Assault,6 to 15 years,2006,2012,,,,,Y,,,Y Palmer,Lamar,19,Black,NY,,Assault,9 years,2000,2001,,,Y,,,,,Y "Palmer, Sr.",Dale,37,Black,OH,"CV, NC",Child Sex Abuse,Life,1994,1997,,,,,Y,,, Palomo,Tim,28,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,14 years,1985,1990,,,,,Y,,Y, Paradis,Donald,30,Caucasian,ID,,Murder,Death,1981,2001,,,,,Y,Y,Y,Y Pardue,Michael,17,Caucasian,AL,CDC,Murder,Life,1973,1997,,,,Y,Y,,Y,Y Parish,Christopher,21,Black,IN,,Attempted Murder,30 years,1998,2006,Y,*,Y,,Y,,Y,Y Parker,Darrel,24,Caucasian,NE,,Murder,Life,1956,1991,,,,Y,,,Y, Parker,Michael,35,Caucasian,NC,"CSH, CV, NC",Child Sex Abuse,Life,1994,2014,,,,,Y,Y,Y,Y Parkes,Timothy,49,Caucasian,F-TN,FED,Fraud,"6.5 years, $4.25 million restitution, $125,000 fine",2009,2012,,,,,,,Y, Passley,Marlon,23,Black,MA,CV,Murder,Life,1996,2000,,,Y,,,,, Patrick,Deon,20,Black,IL,"CDC, CIU",Murder,Life without parole,1995,2014,,,,Y,Y,,Y,Y Patterson,Aaron,21,Black,IL,CDC,Murder,Death,1989,2003,,,,Y,Y,,Y, Patterson,Javon,25,Black,IL,NC,Gun Possession or Sale,6 years,2007,2010,,,,,Y,,Y, Patterson,Maurice,35,Black,IL,,Murder,30 years,2003,2010,Y,,Y,,Y,Y,Y, Pavel,Kenneth,41,Caucasian,NY,"CV, NC",Child Sex Abuse,8 to 24 years,1989,2001,,,,,Y,,,Y Pavlinac,Laverne,57,Caucasian,OR,F,Murder,10 to Life,1991,1995,,,,Y,,,, Payne,Dana,28,Caucasian,MN,"CV, NC",Other Violent Felony,six months in jail and six months work/release,1989,1991,,,,,Y,Y,Y, Peacock,Freddie,33,Black,NY,,Sexual Assault,5 years,1976,2010,Y,,Y,Y,,,, Peak,Carolyn June,39,Caucasian,AZ,F,Murder,Not 
sentenced,2000,2003,,,,,,,Y, Peel,John,18,Caucasian,FL,"CV, NC, P, SBS",Manslaughter,10 years,2000,2002,,,,,Y,Y,, Pejcinovic,Adam,19,Caucasian,NY,CV,Assault,3 to 9 years,1988,1993,,,Y,,,,Y, Pejcinovic,Frank,17,Caucasian,NY,CV,Assault,1 to 3,1989,1993,,,Y,,,,Y, Pena,Jose Luis,36,Hispanic,TX,NC,Drug Possession or Sale,Life,2003,2011,,,,,Y,,Y, Penalver,Seth,21,Caucasian,FL,JI,Murder,Death,1999,2012,,,,,Y,,Y, Pendleton,Marlon,35,Black,IL,,Sexual Assault,20 years,1996,2006,Y,,Y,,Y,Y,, Peralta,David,23,Hispanic,GA,JI,Murder,Life,2001,2013,,,Y,,Y,,, Perez,Carlos,25,Hispanic,NY,,Murder,25 years to life,1997,2012,,,,,Y,,, Perez,Carlos,25,Hispanic,NY,,Murder,25 years to life,1997,2013,,,,,Y,,Y, Perez,Duarnis,27,Hispanic,F-NY,"FED, NC, P",Immigration,4.75 years,2000,2006,,,,,,,Y, Perez,Josefina,31,Hispanic,RI,F,Drug Possession or Sale,3 years,2002,2005,,,,,,,, Perez,Richard,19,Hispanic,CA,,Robbery,Not sentenced,2001,2001,,,Y,,,,,Y Perrington,Donald,30,Black,NY,,Gun Possession or Sale,8 years,2008,2014,,,,,,,, Perry,James Norman,31,Caucasian,MI,"CV, NC",Child Sex Abuse,Not sentenced,2006,2008,,,,,Y,,Y, Persad,Vishnu,23,Other,FL,,Assault,43 years,2001,2007,,,Y,,,,,Y Peters,Harold,39,Black,TX,"CIU, NC, P",Drug Possession or Sale,7 months,2014,2014,,,,,,Y,, Peterson,Jamie Lee,21,Caucasian,MI,,Murder,Life without parole,1998,2014,Y,,,Y,,,Y, Peterson,Kevin,32,Caucasian,UT,"CV, NC, P",Child Sex Abuse,1 to 15 years,1990,2012,,,,,Y,Y,, Peterson,Kevin K.,33,Caucasian,NE,,Murder,Not sentenced,1994,1995,,,Y,,Y,,Y, Peterson,Larry,36,Black,NJ,JI,Murder,Life,1989,2006,Y,,,,Y,Y,, Peterson,Mark,29,Caucasian,WI,NC,Sexual Assault,Not sentenced,1990,1990,,,,Y,,,Y, Pettiford,Antoine,23,Black,MD,,Murder,Life,1995,2000,,,Y,,Y,,Y, Pettit,Leona,27,Caucasian,OH,"F, NC",Murder,15 years to life,1999,2001,,,,,,,,Y Philen,Randall,50,Caucasian,TX,,Murder,Life,2011,2012,,,,,,Y,, Phillips,Lacey,22,Caucasian,F-WI,"F, FED, NC",Conspiracy,2 months,2011,2014,,,,,,,, Phillips,Michael,32,Black,TX,"CIU, CV, 
P",Sexual Assault,12 years,1990,2014,Y,,Y,,,,,Y Phillips,Paul,17,Black,IL,"CDC, CIU",Murder,30 years,1995,2014,,,,Y,Y,,Y,Y Phillips,Steven,22,Caucasian,TX,CIU,Sexual Assault,40 years,1982,2008,Y,,Y,,,,Y, Phinney,Roland,36,Caucasian,MA,,Murder,Life,1990,2008,,,,Y,,,Y, Piaskowski,Michael,43,Caucasian,WI,JI,Murder,Life,1995,2001,,,,,Y,,Y, Pichardo,Juan Carlos,23,Hispanic,NY,,Murder,20 years to life,1994,2000,,,,,Y,,Y,Y Pickett,Daniel,43,Caucasian,CA,"CV, NC",Child Sex Abuse,Not sentenced,2004,2004,,,,,Y,,Y, Pierce,Jeffrey Todd,23,Caucasian,OK,,Sexual Assault,65 years,1986,2001,Y,,Y,,Y,Y,Y, Pierce,Milyn,32,Caucasian,TX,"F, NC, P",Drug Possession or Sale,6 months,2014,2015,,,,,,,, Pierpoint,Jonathan Scott,28,Caucasian,NC,"CV, NC",Child Sex Abuse,Life,1992,2010,,,,,Y,Y,, Pierre,Charles,36,Black,NY,A,Murder,25 years to life,2003,2015,,,Y,,,,,Y Pinchback,Johnny,28,Black,TX,"CIU, CV",Child Sex Abuse,Life,1984,2011,Y,,Y,,,,, Pineda,Lisa,26,Hispanic,CA,F,Murder,40 years to life,2005,2009,,,,,,,, Piszczek,Brian,22,Caucasian,OH,,Sexual Assault,15 to 25 years,1991,1994,Y,,Y,,,,, Pitre,Derick,40,Black,TX,"CIU, NC, P",Drug Possession or Sale,6 months,2014,2014,,,,,,Y,, Pitts,Marcella,29,Caucasian,CA,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y, Pitts,Ricky Lynn,31,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,Life,1985,1991,,,,,Y,,Y, Plotnick,Samuel,28,Caucasian,KY,"CV, NC",Child Sex Abuse,20 years,2001,2008,,,,,Y,Y,,Y Polk,Eric,27,Black,TX,"CIU, NC, P",Drug Possession or Sale,6 months,2014,2015,,,,,,Y,, Polonia,Diomedes,26,Hispanic,NY,,Attempted Murder,14 to 18 years,1998,2003,,,Y,,,,,Y Pondexter,Ronald,22,Black,NY,,Murder,25 to life,1993,1997,,,Y,,Y,,Y, Pope,David Shawn,23,Caucasian,TX,,Sexual Assault,45 years,1986,2001,Y,,Y,,,Y,, Porter,Allen Wayne,18,Black,TX,CIU,Sexual Assault,Life,1991,2010,,,Y,,,,, Porter,Anthony,27,Black,IL,,Murder,Death,1983,1999,,,Y,,Y,,, Porter,Kerry,34,Black,KY,JI,Murder,60 years,1998,2011,Y,*,Y,,Y,,, Porter,Michael,26,Caucasian,FL,,Sexual 
Assault,Life,1988,2002,,,Y,,,,Y, Potts,Clinton,30,Caucasian,OK,JI,Murder,Life without parole,2009,2012,,,,,Y,,Y,Y Powell,Anthony,23,Black,MA,,Sexual Assault,12 to 20 years,1992,2004,Y,,Y,,,,, Powell,Benny,26,Black,CA,JI,Murder,Life,1975,1992,,,,,Y,,Y, Powell,Paris,19,Black,OK,CV,Murder,Death,1997,2009,,,,,Y,,Y, Pratt,Elmer,21,Black,CA,,Murder,25 to Life,1972,1999,,,Y,,Y,,Y, Prentice,Mark,27,Caucasian,NY,P,Robbery,12.5 to 25 years,1989,1995,,,Y,,Y,Y,Y, Prince,Christopher E.,18,Black,VA,"CV, NC, P",Burglary/Unlawful Entry,6 to 12 years,1994,1995,,,,,Y,,, Prineas,Anthony,21,Caucasian,WI,NC,Sexual Assault,10 years,2004,2012,,,,,Y,,, Prion,Lemuel,30,Caucasian,AZ,,Murder,Death,1999,2003,,,,,,,, Proctor,Eric,18,Caucasian,OR,JI,Murder,Life,1986,1995,Y,*,,,Y,,, Provience,Dwayne,27,Black,MI,,Murder,32 to 62 years,2001,2010,,,,,Y,,Y,Y Purtell,Daniel,21,Caucasian,PA,NC,Sexual Assault,3 1/2 to 6 years,2002,2004,,,,,Y,,, Purvis,John,41,Caucasian,FL,CV,Murder,Life,1985,1993,,,,Y,,,, "Quattrocchi, III",John,39,Caucasian,RI,"CV, NC",Child Sex Abuse,40 years,1994,2001,,,,,,Y,, Quick,Wesley,18,Caucasian,AL,,Murder,Death,1997,2003,,,,,,,, Quindt,David,21,Caucasian,CA,CV,Murder,Not sentenced,1999,2000,,,Y,,,,, "Raby, Jr.",Earl,28,Black,MI,,Destruction of Property,Probation,1998,1999,,,Y,,,,,Y Rachell,Ricardo,45,Hispanic,TX,CV,Child Sex Abuse,40 years,2003,2009,Y,,Y,,,,, "Radillo, Jr.",Sergio,24,Hispanic,FL,CV,Attempted Murder,Life,1995,2005,,,Y,,,,,Y Rainbolt,Thomas,51,Black,TX,"CIU, M, NC, P",Drug Possession or Sale,45 days,2014,2014,,,,,,,, Rainge,Willie,20,Black,IL,"CDC, JI",Murder,Life,1978,1996,Y,,Y,,Y,Y,Y,Y Ralston,Sean,18,Caucasian,MA,"CV, NC, SBS",Manslaughter,10 to 15 years,1989,1992,,,,Y,,Y,,Y Ramchair,Racky,22,Other,NY,,Robbery,10 to 20 years,1997,2010,,,Y,,,,, Ramirez,Jesus,48,Hispanic,TX,JI,Murder,Life,1998,2008,,,Y,,Y,,Y,Y Ramirez-Lopez,Juan,44,Hispanic,F-CA,FED,Immigration,6 1/2 years,2000,2003,,,,,Y,,Y, Ramos,Alberto,21,Hispanic,NY,"CSH, CV, NC",Child Sex Abuse,8 1/3 to 
25 years,1985,1994,,,,,Y,,Y, Ramos,Jesse,24,Hispanic,TX,"CV, NC",Child Sex Abuse,Probation,1993,2000,,,,,,Y,,Y Randolph,Guy,33,Black,MA,"CV, P",Child Sex Abuse,10 years,1991,2008,,,Y,,,,,Y Ranta,David,35,Caucasian,NY,"CDC, CIU",Murder,37 years and 6 months to life,1991,2013,,,Y,Y,Y,,Y, Rea,Julie,28,Caucasian,IL,"CV, F",Murder,65 years,2002,2006,,,,,,,, Reasonover,Ellen,24,Black,MO,"F, JI",Murder,Life,1983,1999,,,,,Y,,Y,Y Recendez-Lopez,Edgar,20,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,3 months,2010,2014,,,,,,Y,, Redick,Neal,28,Caucasian,MI,"CV, NC",Child Sex Abuse,16 to 30 years,1992,2007,,,,,Y,,, Reed,Cornelius,15,Black,WI,,Murder,Life,1993,1997,,,Y,,,,, Reed,DeShawn,24,Black,MI,,Attempted Murder,20 to 30 years,2001,2009,,,,,Y,,Y, Reed,Marvin,33,Black,MI,,Attempted Murder,20 to 32 years,2001,2009,,,,,Y,,Y, Reeves,Marvin,29,Black,IL,"A, CDC, CV, JI",Murder,Life without parole,1991,2009,,,,,Y,,Y, Register,Kash,18,Black,CA,,Murder,Life without parole,1979,2013,,,Y,,Y,,Y,Y Reid,Mark,22,Black,CT,,Sexual Assault,12 years,1997,2003,Y,*,Y,,,Y,, Reno,Ronald,32,Caucasian,CA,P,Gun Possession or Sale,25 to Life,1997,2002,,,,,,,, Reser,Pamela Sue,29,Caucasian,OR,"CV, F, NC",Child Sex Abuse,Life,1999,2002,,,,,Y,,, Resh,Randy,24,Caucasian,OH,CDC,Murder,20 to Life,1990,2007,,,,,Y,Y,,Y Restivo,John,25,Caucasian,NY,"CDC, CV, JI",Murder,33 years and 4 months,1986,2005,Y,,,,Y,Y,Y, Revelle,George,36,Caucasian,MO,,Murder,Life without parole,1996,1998,,,,,,Y,, Reyes,George,33,Hispanic,TX,"CIU, NC, P",Drug Possession or Sale,10 months,2013,2015,,,,,,,, Reynolds,Donald,20,Black,IL,,Sexual Assault,69 years,1988,1997,Y,,Y,,Y,Y,Y, Rhoades,Nick,33,Caucasian,IA,"NC, P",Other Violent Felony,25 years,2009,2014,,,,,,,,Y Richardson,Gerard,28,Black,NJ,,Murder,30 years,1995,2013,Y,,,,,Y,, Richardson,Harold,16,Black,IL,CDC,Murder,40 years,1997,2012,Y,,,Y,Y,,, Richardson,James Joseph,31,Black,FL,"CV, JI",Murder,Death,1968,1989,,,,,Y,,Y, Richardson,Julie,44,Caucasian,CA,F,Drug Possession or 
Sale,1 year,1999,2001,,,,,Y,,Y, Richardson,Kevin,14,Black,NY,CDC,Sexual Assault,5 to 10 years,1990,2002,Y,,,Y,Y,Y,, "Richardson, Jr.",James E.,36,Caucasian,WV,,Murder,Life,1989,1999,Y,,,,Y,Y,Y, Rigel,Sierra,17,Caucasian,OR,"F, NC",Manslaughter,100 hours of community service,2011,2014,,,,Y,,,,Y Risha,Jesse,47,Caucasian,F-PA,"A, FED","Attempt, Violent",Not sentenced,2004,2006,,,,,Y,,Y, Rivas,Edgar,37,Hispanic,F-NY,FED,Drug Possession or Sale,10 years and 1 month,2003,2004,,,,,Y,,Y, Rivera,Alfred,25,Hispanic,NC,CDC,Murder,Death,1997,1999,,,,,Y,,Y, Rivera,Jacques,23,Hispanic,IL,CV,Murder,80 years,1990,2011,,,Y,,Y,,Y, Rivera,Jonathan,22,Hispanic,PR,,Murder,Life,2007,2008,,,Y,,,,Y, Rivera,Juan,19,Hispanic,IL,"CV, JI",Murder,Life,1993,2012,Y,,Y,Y,Y,,Y, Rivera,Simon Angel,24,Hispanic,TX,"NC, P",Sex Offender Registration,2 years,2002,2005,,,,,,,,Y Roberts,Amber,19,Caucasian,TX,"CIU, F, NC, P",Drug Possession or Sale,Probation,2014,2015,,,,,,Y,, Roberts,Henry Myron,62,Caucasian,MD,PH,Murder,50 years,1992,2002,,,,,,,, Roberts,Lisa,37,Black,OR,"F, P",Manslaughter,15 years,2004,2014,Y,*,,,,Y,,Y Roberts,Rodney,29,Black,NJ,"CV, P",Kidnapping,7 years,1996,2014,Y,,Y,,,,, Robertson,Shakara,18,Black,TX,"CIU, P",Robbery,Probation,1996,2012,,,Y,,,,, Robinson,Anthony,25,Black,TX,,Sexual Assault,27 years,1987,2000,Y,,Y,,,,, Robinson,Davonn,17,Black,WI,"CV, NC, P",Child Sex Abuse,5 years,2006,2010,,,,Y,Y,,, Robinson,Derrick,30,Black,FL,P,Murder,7 years,1989,1991,,,,,Y,,, Robinson,Eric,23,Black,CA,,Murder,Life,1994,2007,Y,*,,,,,Y, Robinson,Tobaris,27,Black,TX,"NC, P",Drug Possession or Sale,seven months,2011,2012,,,,,,Y,, Robison,James Albert,53,Caucasian,AZ,CDC,Murder,Death,1977,1993,,,,,Y,,, Robles,Willie,18,Hispanic,CA,,Robbery,Not sentenced,1991,1991,,,Y,,,,,Y Rocha,Mario,16,Hispanic,CA,CV,Murder,35 to Life,1998,2008,,,Y,,,,,Y Rockette,Xavier,19,Black,WI,JI,Murder,Life,2003,2013,,,,,Y,,, Rodrigues,Shaun,20,Other,HI,,Kidnapping,20 years,2002,2014,,,Y,,,,, 
Rodriguez,Angel,,Hispanic,IL,,Murder,60 years,1998,2000,,,Y,,,,, Rodriguez,George,25,Hispanic,TX,CV,Child Sex Abuse,60 years,1987,2005,Y,,Y,,,Y,, Rodriguez,Jeffrey,24,Hispanic,CA,,Robbery,25 to Life,2003,2007,,,Y,,,Y,,Y Rodriguez,Jose,19,Hispanic,NY,,Robbery,10 years,2007,2011,,,Y,,,,Y, Rodriguez,Manual Hidalgo,36,Hispanic,WA,"CSH, CV, NC",Child Sex Abuse,5 years and 6 months,1995,2000,,,,,Y,Y,Y,Y Roesser,Christopher,25,Caucasian,GA,NC,Murder,Life,2008,2013,,,,,Y,,,Y Rogers,Mandel,18,Black,TX,"CIU, P",Robbery,12 years,1999,2014,Y,,Y,,,,, Rojas,Carlos,34,Hispanic,F-AL,"FED, P",Drug Possession or Sale,20 years,1990,2002,,,,,Y,,Y, Rojas,Luis Kevin,18,Hispanic,NY,,Murder,15 to Life,1992,1998,,,Y,,,,,Y Rollin,Joseph Pierre,45,Caucasian,CA,NC,Dependent Adult Abuse,8 years,2004,2008,,,,,,Y,,Y Rollins,Lafonso,17,Black,IL,,Sexual Assault,75 years,1994,2004,Y,,Y,Y,,,Y,Y "Rollins, IV",Offord,17,Black,CA,CV,Murder,29 years to life,1992,1996,,,Y,,,Y,Y, Roman,Miguel,31,Hispanic,CT,"CV, JI",Murder,60 years,1990,2009,Y,,,,Y,,, "Rome, Sr.",Anthony,47,Caucasian,SD,"CV, NC",Kidnapping,Probation,1987,1990,,,,,,,, Rose,Michael,25,Caucasian,WA,"CSH, CV, NC",Child Sex Abuse,23 years,1995,2000,,,,,Y,Y,Y, Rose,Peter,37,Caucasian,CA,CV,Child Sex Abuse,27 years,1995,2005,Y,,Y,,Y,Y,Y,Y Ross,Ronald,44,Black,CA,,Attempted Murder,25 years to life,2006,2013,,,,,Y,,Y,Y Rowan,Jeffrey,26,Caucasian,OK,"CV, NC",Child Sex Abuse,35 years,2009,2012,,,,Y,,,, Ruby,Donald,39,Caucasian,PA,,Murder,Life,1987,1993,Y,*,,,,Y,,Y Ruffin,Julius,28,Black,VA,,Sexual Assault,Life,1982,2003,Y,,Y,,,,, Ruffin,Larry Donnell,19,Black,MS,"CDC, PH",Murder,Life,1980,2011,Y,,,Y,Y,,Y, Ruiz Cortez,Refugio,29,Hispanic,F-IL,"FED, NC",Drug Possession or Sale,17.5 years,1999,2010,,,,,Y,,Y, Russell,Stephen Lynn,26,Caucasian,TX,,Robbery,50 years,1980,1990,,,Y,,Y,,Y, Saecker,Frederic,38,Caucasian,WI,,Sexual Assault,15 years,1990,1996,Y,,,,,,, Salaam,Yusef,14,Black,NY,CDC,Sexual Assault,5 to 10 years,1990,2002,Y,,,Y,Y,Y,, 
Salazar,Ben,23,Hispanic,TX,,Sexual Assault,30 years,1992,1997,Y,,Y,,,,, Salce,Lydia,50,Caucasian,NY,"F, NC",Attempted Murder,16 years,2012,2015,,,,,Y,,, Salcedo,Claudia,21,Hispanic,IL,"F, NC",Assault,Probation,2005,2007,,,,,Y,,Y, Salcedo,Gloria,46,Hispanic,IL,"F, NC",Assault,Probation,2005,2007,,,,,Y,,Y, Salcedo,Teresa,17,Hispanic,IL,"F, NC",Assault,Probation,2005,2007,,,,,Y,,Y, Salvati,Joseph,31,Caucasian,MA,CDC,Murder,Life,1968,2001,,,,,Y,,Y, Samueli,Henry,48,Caucasian,F-CA,"FED, NC, P",Perjury,Not sentenced,2008,2009,,,,,,,Y, Sanchez,Isauro,20,Hispanic,IL,CDC,Murder,Life without parole,1982,1991,,,Y,,Y,,, Sanders,David,39,Caucasian,WI,CV,Child Sex Abuse,15 years,2006,2007,,,Y,,,,, Sanders,Rodell,28,Black,IL,CDC,Murder,80 years,1995,2014,,,Y,,Y,,Y,Y Sanders,Trina,49,Black,TX,"F, NC, P",Drug Possession or Sale,2 years,2014,2015,,,,,,Y,, Sandoval-Gonzalez,Gerardo,51,Hispanic,F-CA,FED,Immigration,7 years,2009,2012,,,,,,,Y, Santaliz Acosta,Luis,34,Hispanic,PR,,Murder,148 years,2000,2009,,,,,Y,,Y,Y Santana,Raymond,14,Black,NY,CDC,Sexual Assault,5 to 10 years,1990,2002,Y,,,Y,Y,Y,, Santos,Edar Duarte,26,Hispanic,IL,"CDC, P",Murder,12 years,2002,2003,,,,Y,Y,,Y, Santos,Jeffrey,40,Black,NY,NC,Assault,6 years,1998,2004,,,,,Y,,Y, Santos,Louis,20,Black,MA,,Murder,Life,1985,1990,,,Y,,,,Y, Saraceno,David,18,Caucasian,CT,A,Arson,Not sentenced,1998,1999,,,,Y,,,, Sarita,Danny,25,Hispanic,NY,CIU,Assault,4 years,2009,2012,,,,,Y,,Y, Sarsfield,Eric,23,Caucasian,MA,,Sexual Assault,10 to 15 years,1987,2000,Y,,Y,,Y,,, Saunders,Michael,15,Black,IL,CDC,Murder,40 years,1998,2012,Y,,,Y,Y,,, Saunders,Omar,18,Black,IL,CDC,Murder,Life without parole,1988,2001,Y,,,,Y,Y,Y, Savory,Johnnie,14,Black,IL,CV,Murder,50 to 100 years,1977,2015,Y,*,,Y,Y,,Y,Y Sawyer,Thomas,49,Caucasian,MI,JI,Sexual Assault,20 to 30 years plus 2 years,1992,2003,,,Y,,Y,,Y, "Scamardo, Jr.",Joseph Anthony,35,Caucasian,AR,"CV, NC",Child Sex Abuse,12 years,2010,2013,,,,,Y,,, Schaaf,David,40,Caucasian,PA,NC,Attempted Murder,14 to 
28 years,2003,2006,,,,,Y,,, Schand,Mark,21,Black,MA,,Murder,Life without parole,1987,2013,,,Y,,Y,,Y, Scheer,Dana,34,Caucasian,F-FL,FED,Fraud,Probation,1995,1999,,,,,,,Y, Schledwitz,Karl,30,Caucasian,F-TN,FED,Fraud,6 months,1992,1999,,,,,,,Y, Schmieder,Jeff,41,Caucasian,WA,NC,Sexual Assault,Not sentenced,1998,1999,,,,,Y,,, Schulz,Stephen,35,Caucasian,NY,,Robbery,11 years,1999,2009,,,Y,,,,,Y Scott,Calvin Lee,26,Black,OK,,Sexual Assault,25 years,1983,2003,Y,,,,,Y,, Scott,Christopher Shun,27,Black,TX,CIU,Murder,Life,1997,2009,,,Y,,,,, Scott,David L.,15,Caucasian,IN,,Murder,50 years,1985,2008,Y,*,,Y,,,, Scott,Michael,17,Caucasian,TX,"CDC, CV",Murder,Life,2002,2009,Y,*,,Y,Y,,, Scott,Samuel,28,Black,GA,,Sexual Assault,Life,1987,2002,Y,,Y,,,,, Scott,Scotty,20,Caucasian,AR,,Murder,25 years,1983,1989,,,Y,,,,, Scruggs,Dwayne D.,32,Black,IN,CV,Sexual Assault,40 years,1986,1993,Y,,Y,,,,, Sealie,Frank,23,Black,AL,,Murder,Life without parole,2014,2015,,,Y,,Y,,, Seeland,Benjamin,32,Caucasian,AK,"M, NC",Assault,2 months,2014,2015,,,,,Y,,,Y Seiber,George,41,Caucasian,OH,,Assault,10 to 25 years,1987,1999,,,Y,,Y,,Y, Self,Grant,31,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,31 years,1985,2008,,,,,Y,,Y, Seranno,Ivan,17,Caucasian,PA,,Murder,Life,1988,2005,,,Y,,,,, Seri,Michael Caesar,42,Caucasian,CT,CV,Child Sex Abuse,6 months to 5 years,2001,2003,,,Y,,,Y,Y, Sermeno,Miguel Angel,38,Hispanic,CA,,Traffic Offense,8 months,1995,1997,,,Y,,,,Y,Y Settle,Daniel Roy,18,Caucasian,TX,"NC, P",Drug Possession or Sale,10 years,1999,2011,,,,,Y,,Y, Shabazz,Omar,27,Black,NY,,Gun Possession or Sale,8 years,2008,2014,,,,,,,, Shakur,Shabaka,23,Black,NY,,Murder,20 years to life,1989,2015,,,,Y,Y,,Y,Y Sharp,Shainnie,16,Black,IL,"CDC, CV, P",Murder,20 years,1994,2011,Y,,,Y,Y,,Y, Shaughnessy,Eric,21,Caucasian,MA,CV,Assault,5 years,1997,1998,,,Y,,,,, Shelden,Debra,26,Caucasian,NE,"CDC, F, P",Murder,10 years,1989,2009,Y,,,Y,Y,Y,, Shepard,Charles,24,Black,NY,,Murder,15 to Life,1988,2001,,,,,Y,,, Shephard,David 
L.,21,Black,NJ,,Sexual Assault,30 years,1984,1995,Y,,Y,,,,, Sherwin,James T.,53,Caucasian,F-NY,FED,Fraud,6 months,1989,1991,,,,,,,, Shomberg,Forest,38,Caucasian,WI,,Sexual Assault,12 years,2003,2009,Y,*,Y,,,,Y, Shortt,James,21,Black,CA,"CDC, JI",Murder,Life without parole,1982,2010,,,,,Y,,Y, Showers,Kurtis DeAngelo,28,Black,MI,,Robbery,Life,1994,1998,,,Y,,,Y,, Shreve,Thomas,31,Caucasian,NC,"M, NC",Threats,Probation,1988,1989,,,,,,,, Shull,George Edward,30,Caucasian,CA,"CIU, P",Sexual Assault,5 years,1989,2009,,,Y,,,,, Shved,Olga,23,Caucasian,WA,"CV, F, NC",Assault,10 years,2009,2014,,,,,,Y,, Sifuentes,Alberto,22,Hispanic,TX,JI,Murder,Life,1998,2008,,,Y,,Y,,Y,Y Siller,Thomas,41,Caucasian,OH,CDC,Murder,30 to life,1998,2011,Y,*,,,Y,Y,Y, "Silva, Sr.",Juan,43,Hispanic,WI,P,Manslaughter,5 years,2014,2015,,,,Y,,,, Silva-Santiago,Jesus,28,Hispanic,MA,,Murder,Life without parole,2006,2010,,,Y,,,,Y, Simmons,James,47,Black,WA,,Drug Possession or Sale,1 year and 1 day,2007,2010,,,,,Y,,, Simmons,Lawrence,19,Black,NJ,CDC,Murder,Life,1977,2000,,,,,Y,,Y, Simmons III,Tommy,35,Black,CA,,Murder,Life,1998,2003,,,Y,,,,,Y "Simmons, Jr.",Claude,42,Black,TX,CIU,Murder,Life,1997,2009,,,Y,,,,, Simon,Alstory,32,Black,IL,"CIU, P",Murder,37 years,1999,2014,,,,Y,Y,,, Sims,Lennie Darrold,24,Black,CA,,Robbery,7 years,1992,1994,,,Y,,,,,Y Sinegal,Layo,37,Black,AK,,Sexual Assault,5 years,1995,2000,,,Y,,,,,Y Sipe,David,27,Caucasian,F-TX,FED,Assault,Not sentenced,2001,2007,,,,,Y,,Y, Slater,Willie Lee,43,Black,FL,CV,Assault,Life,1999,2009,,,Y,,,,,Y Sledge,Joseph,34,Black,NC,JI,Murder,Life,1978,2015,Y,,,,Y,Y,Y, Smith,Billy James,34,Black,TX,,Sexual Assault,Life,1987,2006,Y,,Y,,,,, Smith,Charles,29,Black,IN,CDC,Murder,Death,1983,1991,,,,,Y,,Y,Y Smith,Christopher C.,32,Caucasian,IN,"CDC, P",Murder,38 years,1991,1993,,,,Y,Y,,Y, Smith,Demetrius,25,Black,MD,,Murder,Life,2010,2012,,,,,Y,,, Smith,Frank Lee,37,Black,FL,"CV, PH",Murder,Death,1986,2000,Y,,Y,,Y,,Y, Smith,Jay C.,50,Caucasian,PA,"CV, 
JI",Murder,Death,1986,1992,,,,,Y,Y,Y, Smith,John Edward,18,Black,CA,CV,Murder,Life,1994,2012,,,,,Y,,Y,Y Smith,Jon Keith,17,Black,MO,CDC,Murder,Life,1987,2000,,,,,Y,,Y, Smith,Joseph,32,Caucasian,NY,"CV, JI",Murder,Not sentenced,2002,2002,,,,,Y,,Y, Smith,Marcus Lashun,21,Black,TX,"CIU, P",Robbery,Probation,1995,2012,,,Y,,,,, Smith,Michael,32,Black,CA,,Murder,Life without parole,1994,2009,,,Y,,Y,Y,Y, Smith,Sarah,19,Caucasian,FL,"F, NC",Drug Possession or Sale,1 month,1998,2000,,,,,Y,,Y, Smith,Steven,37,Black,IL,,Murder,Death,1986,1999,,,,,Y,,, Smith,Sylvester,32,Black,NC,CV,Child Sex Abuse,Life,1984,2004,,,,,Y,,, Smith,Tammy,32,Caucasian,IA,"CV, F, NC",Child Abuse,10 years,2007,2011,,,,,,,,Y Smith,Timothy,35,Caucasian,KY,"CV, NC",Child Sex Abuse,20 years,2001,2006,,,,,Y,,,Y Smith,Walter D.,27,Black,OH,,Sexual Assault,78 to Life,1986,1996,Y,,Y,,,,, Smith,William,24,Caucasian,NM,,Murder,Life,1978,1996,,,,,Y,Y,Y, Smith,Zachary Noah,24,Caucasian,AL,"CV, NC",Child Sex Abuse,Probation,2006,2012,,,,,Y,,, Smythe,Jamar,30,Black,NY,,Drug Possession or Sale,15 years,2013,2015,,,,,Y,,Y, Snipes,Steve E.,34,Black,NC,,Robbery,7 years and 6 months,1998,2003,,,Y,,,,,Y Snowden,Harold Grant,38,Caucasian,FL,"CSH, CV, NC",Child Sex Abuse,50 to Life,1986,1998,,,,,Y,Y,Y, Snyder,Walter,20,Black,VA,,Sexual Assault,45 years,1986,1993,Y,,Y,Y,,,Y, Sommer,Cynthia,32,Caucasian,CA,F,Murder,Life,2007,2008,,,,,,Y,,Y Sorrels,Ross,44,Caucasian,WA,CV,Child Sex Abuse,10 years,1995,2003,,,,,Y,,, Sosnovske,John,39,Caucasian,OR,"CDC, P",Murder,Life,1991,1995,,,,,Y,,, Sousa,Joseph,26,Caucasian,DC,,Murder,20 to Life,1975,2005,,,,,Y,,Y, Souter,Larry Pat,26,Caucasian,MI,,Murder,20 to 60 years,1992,2005,,,,,Y,Y,Y, Span,Darlene,39,Caucasian,F-AZ,"F, FED, NC",Assault,Probation,1990,1996,,,,,Y,,Y,Y Span,Jerry,57,Caucasian,F-AZ,"FED, NC",Assault,Probation,1990,1996,,,,,Y,,Y,Y Spencer,Clyde Ray,37,Caucasian,WA,"CV, NC, P",Child Sex Abuse,Life,1985,2010,,,,,Y,,Y, Spiller,Jonathan,58,Caucasian,F-DC,"FED, NC, 
P",Conspiracy,Not sentenced,2011,2012,,,,,,,, Springsteen,Robert,17,Caucasian,TX,"CDC, CV",Murder,Death,2001,2009,Y,*,,Y,Y,,, Spurlock,Robert,22,Caucasian,TN,,Murder,Life,1990,1996,,,,,Y,,Y, Stallings,Patricia,24,Caucasian,MO,"CV, F, NC",Murder,Life,1991,1991,,,,,,Y,,Y Starks,Bennie,26,Black,IL,,Sexual Assault,60 years,1986,2013,Y,,,,Y,Y,Y, Statler,Paul E.,21,Caucasian,WA,CDC,Robbery,42 years,2009,2013,,,,,Y,,,Y Steidl,Gordon,35,Caucasian,IL,JI,Murder,Death,1987,2004,,,,,Y,,Y,Y Sterling,Frank,25,Caucasian,NY,,Murder,25 to Life,1992,2010,Y,,,Y,,,Y, Stevens,Richard,32,Black,F-NJ,FED,Sexual Assault,14 years,1990,1992,,,Y,,,,, Stevens,Ted,83,Caucasian,F-DC,FED,Official Misconduct,Not sentenced,2008,2009,,,,,,,Y, Stevenson,Macen,20,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,,, Stewart,Arthur,35,Black,NY,,Robbery,7 to 14 years,1995,2001,,,Y,,,,,Y Stewart,Dail,22,Caucasian,UT,,Murder,5 years to life,1984,1992,,,,,Y,,, Stewart,Kia,17,Black,LA,CIU,Murder,Life without parole,2009,2015,,,Y,,,,Y,Y Stewart,Nicholas,29,Caucasian,F-MIL,"FED, NC",Sexual Assault,2 years,2009,2012,,,,,,,, Stewart,Zackary Lee,18,Caucasian,MO,JI,Murder,Life without parole,2008,2010,Y,*,,,Y,,, Stiers,Donald,25,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,2 years,2014,2015,,,,,,,, Stinson,Robert Lee,20,Black,WI,,Murder,Life,1985,2009,Y,,,,,Y,,Y Stoll,John,41,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,40 years,1985,2004,,,,,Y,,Y, Stonehouse,Carol,41,Caucasian,PA,"F, NC",Murder,7 to 14 years,1983,1990,,,,,,,,Y Street,John,36,Caucasian,F-MO,"FED, JI",Murder,Life without parole,2006,2009,,,,,Y,,, Strickland,Reshenda,24,Black,WA,F,Robbery,6 months,2004,2004,,,Y,,,,, Strong,Jason,24,Caucasian,IL,"CDC, CIU",Murder,46 years,2000,2015,,,,Y,Y,Y,, Stuckey,Willie,16,Black,NY,"CDC, CIU, PH",Murder,25 years to life,1986,2014,Y,*,,Y,Y,,Y, Sturgeon,Michael,16,Caucasian,WI,"CDC, P",Burglary/Unlawful Entry,120 days,1997,2000,,,,Y,Y,,Y, Sturgeon,Richard,32,Black,TX,,Robbery,50 
years,1999,2009,,,Y,,,,Y,Y Su,Shih-Wei,17,Asian,NY,,Attempted Murder,16 years,1992,2003,,,,,Y,,Y, Suarez,Rafael,31,Hispanic,AZ,,Assault,5 years,1997,2000,,,,,Y,,,Y Sullivan,Harold,31,Caucasian,MA,,Murder,Life,1986,1990,,,,,Y,,Y, Sutherlin,David Brian,23,Black,MN,,Sexual Assault,3 years and 7 months,1985,2002,Y,,Y,,,,, Sutton,Josiah,16,Black,TX,,Sexual Assault,25 years,1999,2004,Y,,Y,,,Y,Y, Swanagan,Leonard,54,Black,GA,NC,Sex Offender Registration,1 year,2011,2012,,,,,,,,Y Swenson,Nathaniel,21,Black,KS,,Attempted Murder,16 years and 11 months,2000,2013,,,,,Y,,,Y Swift,Rubin,32,Black,PA,"CV, NC, P",Child Sex Abuse,28 to 60 3/4 years,2009,2012,,,,,Y,Y,, Swift,Terrill,17,Black,IL,CDC,Murder,36 years,1998,2012,Y,,,Y,Y,,, Swift,Walter,21,Black,MI,,Sexual Assault,30 to 55 years,1982,2008,,,Y,,,Y,Y,Y Talamantez,Jesse,20,Hispanic,TX,,Murder,99 years,1999,2004,,,Y,,,,,Y Talley,Dell,23,Black,VA,,Sexual Assault,Not sentenced,1991,1991,,,Y,,,,Y,Y Tameleo,Henry,63,Caucasian,MA,"CDC, PH",Murder,Death,1968,2001,,,,,Y,,Y, Tankleff,Martin,17,Caucasian,NY,,Murder,50 to Life,1990,2008,,,,Y,Y,,Y, "Tate, Jr.",Marcus,20,Black,TX,"CIU, NC, P",Drug Possession or Sale,5 months,2006,2015,,,,,,Y,, Tatum,Adam,36,Black,TN,"NC, P",Assault,2 years,2012,2013,,,,,Y,,Y, Taylor,Ada JoAnn,21,Caucasian,NE,"CDC, F, P",Murder,40 years,1989,2009,Y,,,Y,Y,Y,, Taylor,Andre,,Black,CA,,Attempted Murder,28 years to life,1990,1998,,,Y,,,,Y, Taylor,Daniel,17,Black,IL,"CDC, CIU",Murder,Life without parole,1995,2013,,,,Y,Y,,Y,Y Taylor,Don,20,Black,NY,,Murder,22 1/2 years to Life,1989,2004,,,,,Y,,Y, Taylor,Gregory,28,Caucasian,NC,JI,Murder,Life,1993,2010,Y,*,,,Y,Y,Y, Taylor,Robert,14,Black,IL,"CDC, CV",Murder,80 years,1997,2011,Y,,,Y,Y,,Y, Taylor,Ronald Gene,32,Black,TX,,Sexual Assault,60 years,1995,2008,Y,,Y,,,Y,, Taylor,Ruth,31,Caucasian,CA,"CSH, CV, F, NC, P",Child Sex Abuse,6 years,1985,2001,,,,,Y,,Y,Y Tears,Jonathan,29,Black,TN,NC,Attempted Murder,25 years,2009,2014,,,,,Y,,,Y 
Tedtaotao,Jathan,18,Other,GU,P,Robbery,21 years,2014,2015,,,,Y,,,, Tennison,John,17,Black,CA,,Murder,25 to Life,1990,2003,,,,,Y,,Y, Terens,Daniel,27,Caucasian,WI,CV,Manslaughter,20 years,1991,1993,,,,Y,Y,,,Y Terry,Paul,17,Black,IL,CV,Murder,Life,1977,2003,Y,,Y,,,,Y, Thagard,Jerome,16,Black,NY,,Murder,25 years to life,2010,2014,,,Y,,Y,,Y, Thain,Leeann,40,Caucasian,MI,"F, NC",Theft,90 days,2011,2014,,,,,,,,Y Thames,Vincent,16,Black,IL,"CDC, P",Murder,30 years,1998,2012,Y,,,Y,Y,,, Thibodeaux,Damon,21,Caucasian,LA,CV,Murder,Death,1997,2012,Y,,Y,Y,,,Y, Thomas,Adrian,26,Black,NY,"CV, NC",Murder,25 to life,2009,2014,,,,Y,,Y,Y, Thomas,Cameron,24,Black,TX,"NC, P",Drug Possession or Sale,1 year,2014,2014,,,,,,Y,, Thomas,Cherice,18,Black,CA,F,Murder,25 to Life,2009,2012,,,,,Y,,,Y Thomas,Donnell,22,Black,NY,"CV, NC",Child Sex Abuse,17 years,2007,2011,,,,,Y,,, Thomas,Marvin,20,Caucasian,WV,,Murder,Life,1987,1993,,,,,,Y,, Thomas,Rickey Dale,25,Black,TX,,Robbery,Life,1992,1993,,,Y,,,,,Y Thomas,Teresa,29,Black,OH,"F, NC",Murder,18 years to life,1993,1997,,,,,,,Y, Thomas,Victor Larue,26,Black,TX,,Sexual Assault,Life,1986,2001,Y,,Y,,,,, Thompson,Franklin,45,Black,IL,,Murder,24 years,1997,2003,,,,Y,,,, Thompson,Hubert,27,Black,CT,,Sexual Assault,12 years,1998,2012,Y,,Y,,,,, Thompson,John,22,Black,LA,,Murder,Death,1985,2003,,,Y,,Y,,Y, Thompson,John,22,Black,LA,,Robbery,49 years and 6 months,1985,1999,,,Y,,,,Y, Thompson,Patrick,35,Black,IL,NC,Sexual Assault,Not sentenced,2006,2008,,,,,Y,,,Y Thompson,Stephen,24,Caucasian,TX,"CV, NC",Child Sex Abuse,30 years,1991,2006,,,,,Y,,,Y Thompson,Teddy,16,Black,VA,,Robbery,16 years,2001,2007,,,Y,,,,, Thompson,Terrance,36,Black,IL,NC,Gun Possession or Sale,8 years,2003,2006,,,,,Y,,Y, Thorn,Michael,46,Caucasian,MI,,Fraud,5 years,1995,1999,,,,,,,,Y Thurman,Joe,29,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,2 years,2014,2014,,,,,,Y,, Thurman,Philip Leon,29,Black,VA,,Sexual Assault,31 years,1985,2005,Y,,Y,,,,, 
Tice,Derek,27,Caucasian,VA,CDC,Murder,Life,2000,2011,,,,Y,Y,,Y,Y Tillman,James Calvin,25,Black,CT,,Sexual Assault,45 years,1989,2006,Y,,Y,,,Y,, Tillman,Michael,20,Black,IL,CDC,Murder,Life,1986,2010,,,,Y,Y,Y,Y,Y "Tingle, Jr.",John,25,Black,VA,,Assault,10 years,1993,1994,,,Y,,,,, Tinney,Glenn,26,Caucasian,OH,P,Murder,15 to life,1992,2015,,,,Y,,,Y, Tiscareno,Abigail,40,Hispanic,UT,"CV, F, NC, SBS",Child Abuse,Not sentenced,2004,2005,,,,Y,,Y,Y,Y Tomaino,John,41,Caucasian,NY,NC,Murder,20 years to life,1996,1999,,,,,,Y,,Y Tomlin,Charles,26,Black,CA,CDC,Murder,25 to life,1979,1994,,,Y,,Y,,,Y Toney,Michael,19,Caucasian,TX,"CV, JI",Murder,Death,1999,2009,,,,,Y,,Y, Toney,Steven,35,Black,MO,,Sexual Assault,Life,1983,1996,Y,,Y,,,,,Y Toro,Angel,30,Caucasian,MA,,Murder,Life,1983,2004,,,,,Y,,Y, Torres,Alfredo,20,Hispanic,F-NM,FED,Drug Possession or Sale,5 years and 10 months,2006,2009,,,Y,,,,Y, Torres,Jesus,29,Hispanic,NY,"CSH, CV, NC",Child Sex Abuse,25 to 40 years,1986,1990,,,,,Y,,Y, Towler,Raymond,24,Black,OH,CV,Child Sex Abuse,Life,1981,2010,Y,,Y,,,,, Town,Meredith,36,Caucasian,WA,"CSH, CV, NC, P",Child Sex Abuse,20 years,1994,2000,,,,Y,Y,,Y, Townsend,Jerry,26,Black,FL,"CV, P",Murder,Life,1980,2001,Y,,Y,Y,,,, Trakhtenberg,Jacob,67,Caucasian,MI,"CV, NC",Child Sex Abuse,15 years,2006,2013,,,,,Y,,Y,Y Treadway,Clinton,21,Caucasian,FL,,Theft,10 years,2006,2012,,,,,,,, Treadwell,Kenneth,35,Caucasian,F-FL,FED,Fraud,Not sentenced,1995,1995,,,,,,,Y, Trevino,Michael,27,Hispanic,TX,"CV, P",Child Sex Abuse,Probation,1993,2012,,,Y,,,,, Tribble,Santae,17,Black,DC,,Murder,20 to Life,1980,2012,Y,,,,Y,Y,, Triplett,Eddie,38,Black,LA,,Drug Possession or Sale,Life without parole,1998,2011,,,,,Y,,Y, Trujillo,Nicolas,19,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,Probation,2014,2014,,,,,,,, Trulove,Jamal,25,Black,CA,,Murder,50 years to life,2010,2015,,,Y,,,,Y,Y Truvia,Earl,17,Black,LA,CV,Murder,Life,1976,2003,,,,,Y,,Y,Y Tucker,David,28,Black,MI,,Assault,6 to 10 years,1992,1999,,,Y,,,,,Y 
Tuley,Wesley,27,Caucasian,TX,"CV, NC, P",Child Sex Abuse,10 years,1997,2003,,,,,Y,,, Tullos,Nancy,56,Caucasian,F-CA,"F, FED, NC, P",Obstruction of Justice,Not sentenced,2008,2010,,,,,,,Y, Turner,Clinton,31,Black,NY,NC,Robbery,10 to 20 years,1988,2005,,,,,Y,,Y, Turner,Keith,22,Black,TX,,Sexual Assault,20 years,1983,2005,Y,,Y,,,,, Turner,Kenneth,30,Caucasian,CA,,Murder,Not sentenced,1995,1995,,,Y,,,,, Tyson,Betty,24,Black,NY,"CDC, F",Murder,25 to Life,1973,1998,,,,Y,Y,,Y, Tyson,Dequinncy,28,Black,TX,"CIU, NC, P",Drug Possession or Sale,30 days,2012,2015,,,,,,Y,, Valdez,Gilbert,17,Hispanic,TX,"CV, P",Sexual Assault,Probation,1997,2012,,,,,,,Y,Y Vamvakas,Evangelo,43,Caucasian,NY,,Robbery,3 1/2 to 10 years,1994,1996,,,Y,,,,,Y Vann,Gussie,42,Caucasian,TN,"CV, NC",Murder,Death,1994,2011,,,,,,Y,,Y Varela,Ignacio,42,Hispanic,IL,CDC,Murder,Life without parole,1982,1991,,,Y,,Y,,, Varela,Joaquin,22,Hispanic,IL,,Murder,Life without parole,1982,1991,,,Y,Y,,,, Vargas,Alfredo,62,Hispanic,CT,"CV, NC",Child Sex Abuse,12 years,2002,2006,,,,,Y,,,Y Vargas-Cintron,Johnny,34,Hispanic,MA,"NC, P",Drug Possession or Sale,7 years,2011,2013,,,,,Y,,, Vasquez,David,36,Hispanic,VA,P,Murder,35 years,1985,1989,Y,,Y,Y,,Y,Y, Vasquez,Gerardo,20,Caucasian,TX,"CIU, NC, P",Drug Possession or Sale,Probation,2008,2014,,,,,,Y,, Vasquez,Mario,34,Hispanic,WI,CV,Child Sex Abuse,20 years,1998,2015,,,Y,,,,,Y Vasquez,Michael,40,Hispanic,NY,CIU,Robbery,20 to life,1997,2012,,,,,Y,,, Vasquez,Rogelio,24,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,10 days,2014,2014,,,,,,,, Vaughan III,James,28,Black,OH,"CV, NC",Child Sex Abuse,Life without parole,2008,2009,,,,,Y,Y,,Y Vazquez,Hector,28,Hispanic,F-TX,FED,Drug Possession or Sale,Not sentenced,2004,2005,,,Y,,,,, Veal,Robert,14,Black,IL,"CDC, CV, P",Murder,20 years,1994,2011,Y,,,Y,Y,,Y, Vega,Damaso,37,Hispanic,NJ,CV,Murder,Life,1982,1989,,,,,Y,,Y, Velasquez,Eduardo,21,Hispanic,MA,,Sexual Assault,12 to 18 years,1988,2001,Y,,Y,,,Y,, 
Veltmann,Carl,62,Caucasian,F-FL,"A, FED, JI",Arson,Life,1992,1996,,,,,Y,,,Y Veltmann,Christopher,35,Caucasian,F-FL,"A, FED, JI",Arson,Life,1992,1996,,,,,Y,,,Y Vera,Francisco,29,Hispanic,IL,,Assault,17 years,1992,1998,,,Y,,,,,Y Vera,John,20,Hispanic,NY,CV,Robbery,6 to 18 years,1995,2000,,,Y,,,,, Verkuilen,Maxwell,25,Caucasian,WI,NC,Sexual Assault,4 years,2003,2007,,,,,Y,Y,,Y Villasana,Armand,44,Hispanic,MO,NC,Sexual Assault,Not sentenced,1999,2000,Y,,,,Y,,, von Schlieffen,Wolfgang,47,Caucasian,F-FL,FED,Conspiracy,10 years,1994,2000,,,,,Y,,Y, VonAllmen,Michael,24,Caucasian,KY,,Sexual Assault,35 years,1982,2010,,,Y,,,,, Wagstaffe,Everton,23,Black,NY,CV,Kidnapping,12.5 to 25 years,1993,2015,Y,*,,,Y,,Y,Y Waithe,Michael,22,Black,NY,NC,Burglary/Unlawful Entry,1 1/2 to 4 years,1987,2015,,,,,Y,,, Walker,James,30,Black,NY,,Murder,20 to Life,1971,1990,,,Y,,Y,,Y, Walker,Joseph,44,Black,SC,,Sexual Assault,24 years,2003,2014,,,Y,,,,,Y Walker,Quedillis Ricardo,34,Hispanic,CA,"CDC, CIU",Murder,26 to Life,1991,2003,Y,*,,,Y,,Y, Walker,William,36,Black,OH,,Robbery,3 years,2002,2005,,,Y,,,,,Y Wallach,E. 
Robert,50,Caucasian,F-NY,"CDC, FED",Fraud,"6 years, $250,000 fine, forfeiture of $425,000",1989,1993,,,,,Y,,Y, Waller,James,25,Black,TX,CV,Child Sex Abuse,30 years,1983,2007,Y,,Y,,,,, Waller,Patrick,20,Black,TX,CIU,Kidnapping,Life,1992,2008,Y,,Y,,,,, Wallis,Gregory,27,Caucasian,TX,,Sexual Assault,50 years,1989,2007,Y,,Y,,Y,,, Walls,George,39,Black,PA,NC,Sexual Assault,5 to 10 years,2003,2011,,,,,Y,,,Y Wanzer,Terry Lee,20,Caucasian,GA,CV,Sexual Assault,Life,1973,1991,,,Y,,,,, Ward,Bernard,28,Caucasian,MD,P,Murder,Life,1989,1994,,,Y,Y,,,,Y Ward,Madeleine,52,Caucasian,IL,"F, NC",Theft,6 years and 7 months,2006,2008,,,,,,,, Ward,Nathaniel,35,Black,TX,,Robbery,45 years,1999,2001,,,Y,,,,, Wardell,Billy,21,Black,IL,,Sexual Assault,69 years,1988,1997,Y,,Y,,Y,Y,Y, Ware,Melonie,30,Black,GA,"CV, F, NC, SBS",Murder,Life,2005,2009,,,,,Y,Y,,Y Warner,Colin,18,Black,NY,CV,Murder,15 to Life,1982,2001,,,Y,,Y,,Y, Warney,Douglas,33,Caucasian,NY,,Murder,25 to Life,1997,2006,Y,,,Y,,,, Washington,Calvin E.,30,Black,TX,,Murder,Life,1987,2001,Y,,,,Y,Y,Y, Washington,Darryl,24,Black,TX,CIU,Robbery,Life,1996,2012,,,Y,,,,Y, Washington,Earl,22,Black,VA,,Murder,Death,1984,2000,Y,,,Y,,Y,Y,Y Washington,Vonaire,24,Black,WI,,Robbery,22 years,1991,2001,,,Y,,,,,Y Washington,Wayne,20,Black,IL,"CIU, P",Murder,25 years,1996,2015,,,,Y,Y,,Y, Waters,Kenneth,25,Caucasian,MA,,Murder,Life,1983,2001,Y,,,,Y,,Y, Waters,Leo,30,Caucasian,NC,,Sexual Assault,Life,1982,2003,Y,,Y,,,,, Watkins,Cathy,27,Black,NY,F,Murder,25 years to life,1997,2012,,,,,Y,,, Watkins,Jerry,26,Caucasian,IN,"CV, JI",Murder,60 years,1986,2000,Y,,,,Y,Y,Y, Watkins,John,18,Caucasian,AZ,P,Sexual Assault,14 years,2004,2010,Y,,Y,Y,,,Y, Weaver,Mary,41,Caucasian,IA,"CV, F, NC, SBS",Murder,Life without parole,1994,1997,,,,,Y,Y,, Webb,Mark,22,Caucasian,TX,,Sexual Assault,30 years,1987,2001,Y,,Y,,,,, Webb,Troy,19,Black,VA,,Sexual Assault,47 years,1989,1996,Y,,Y,,,Y,, "Webb, III",Thomas,22,Black,OK,,Sexual Assault,60 years,1983,1996,Y,,Y,,,Y,, 
Webster,Bernard,18,Black,MD,,Sexual Assault,30 years,1983,2002,Y,,Y,,Y,Y,, Weimer,Howard,57,Caucasian,CA,"CSH, CV, NC",Child Sex Abuse,42 years,1985,2005,,,,,Y,,Y, Weiner,Mark,52,Caucasian,VA,NC,Kidnapping,8 years,2013,2015,,,,,Y,,Y,Y Weiner,Murray,65,Caucasian,CA,,Murder,Not sentenced,1994,1996,Y,*,,,,Y,Y, Weitzel,Robert,39,Caucasian,UT,NC,Manslaughter,1 to 15 years,2000,2002,,,,,,Y,, Wells,Harold,24,Black,MI,,Possession of Stolen Property,1 1/2 to 5 years,1992,1993,,,,,,,,Y West,Mark,47,Black,TX,"CIU, NC, P",Drug Possession or Sale,6 months,2014,2014,,,,,,Y,, Westbrooks,Lashane,29,Black,NY,NC,Drug Possession or Sale,3.5 years,2007,2007,,,,,Y,,Y, Westly,William,47,Black,NY,"JI, NC",Sexual Assault,20 to Life,1999,2002,,,,,Y,,Y, Wheeler-Whichard,Jonathan,15,Black,NY,,Murder,25 to Life,1997,2009,,,,,Y,,,Y White,Curtis,31,Black,VA,JI,Murder,Not sentenced,1993,1994,,,,,Y,,Y, White,George,36,Caucasian,AL,,Murder,Life,1987,1992,,,,,,,, White,John Jerome,20,Black,GA,,Sexual Assault,Life,1980,2007,Y,,Y,,,Y,, White,Joseph,22,Caucasian,NE,CDC,Murder,Life,1989,2008,Y,,,,Y,Y,, White,Michael,35,Native American,NY,NC,Sexual Assault,5 years,2006,2015,Y,*,,,Y,,, "White, Jr.",Theodore,35,Caucasian,MO,"CV, NC",Child Sex Abuse,50 years,1999,2005,,,,,Y,,Y, Whitehead,Ronnie,31,Caucasian,TX,"CIU, M, NC, P",Drug Possession or Sale,6 months,2004,2014,,,,,,Y,, Whitfield,Arthur Lee,26,Black,VA,,Sexual Assault,63 years,1982,2009,Y,,Y,,,,, Whitley,Drew,32,Black,PA,JI,Murder,Life,1989,2006,Y,,Y,,Y,,, Whitlock,Herbert,39,Caucasian,IL,JI,Murder,Life,1987,2008,,,,,Y,,Y,Y Wiggins,David Lee,24,Caucasian,TX,CV,Child Sex Abuse,Life,1989,2012,Y,,Y,,,Y,, Wilcox,Jennifer,20,Caucasian,OH,"CSH, CV, F, NC",Child Sex Abuse,Life,1985,1997,,,,,Y,,Y, Wilcoxson,Robert,21,Black,NC,"CDC, P",Murder,12 1/2 to 15 3/4 years,2002,2011,Y,,,,Y,,Y, Wiley,Kelvin,29,Black,CA,,Assault,4 years,1990,1992,,,,,Y,,,Y Wilhite,Charles,25,Black,MA,,Murder,Life,2010,2013,,,Y,,Y,,Y, 
Wilhoit,Gregory,30,Caucasian,OK,,Murder,Death,1987,1993,,,,,,Y,,Y Wilkins,Geary,40,Black,TX,"NC, P",Drug Possession or Sale,6 months,2013,2014,,,,,,Y,, Willett,Jeffery,29,Caucasian,WI,"CV, NC",Child Sex Abuse,30 years,2007,2010,,,,,Y,,,Y Williams,Alfred,53,Black,TX,NC,Drug Possession or Sale,30 years,1987,1989,,,,,Y,,Y, Williams,Antonio,35,Black,AL,CV,Child Sex Abuse,Life,2007,2011,,,,,Y,,Y,Y Williams,Calvin,,Black,LA,,Murder,Life,1977,1996,,,,,,,, Williams,Darrell,21,Black,OK,,Sexual Assault,Probation,2012,2014,,,Y,,,,Y, Williams,Dennis,21,Black,IL,"CDC, JI",Murder,Death,1978,1996,Y,,Y,,Y,Y,Y,Y Williams,Derrick Raphel,29,Black,FL,,Sexual Assault,Life,1993,2011,Y,,Y,,,,, Williams,Edward,25,Black,OH,NC,Assault,Probation,1997,2015,,,,,Y,,Y, Williams,Emmaline,43,Black,IL,"CDC, CV, F, NC",Child Sex Abuse,12 years,1986,1995,,,,Y,Y,,Y,Y Williams,Hayes,19,Black,LA,P,Murder,Life,1968,1997,,,,,Y,,Y, Williams,James,50,Caucasian,GA,NC,Murder,Life,1982,1989,,,,,Y,,Y,Y Williams,James Curtis,25,Black,TX,CIU,Sexual Assault,Life,1984,2012,Y,,Y,,,,, Williams,Jerron,31,Black,FL,CV,Manslaughter,50 years,2013,2015,,,Y,,,,Y, Williams,Jimmy,20,Black,OH,CV,Child Sex Abuse,Life,1991,2001,,,Y,,Y,Y,, Williams,Joe Sidney,19,Black,TX,JI,Murder,Life,1987,1993,Y,*,,,Y,Y,Y, Williams,Johnny,23,Black,CA,CV,Child Sex Abuse,16 years,2000,2013,Y,,Y,Y,,,Y, Williams,Lawrence,32,Black,NY,CIU,Assault,10 years,2009,2012,,,Y,,,,, Williams,Michael,31,Black,LA,,Murder,Life,1997,2011,,,,,Y,,Y, Williams,Michael Anthony,16,Black,LA,,Sexual Assault,Life without parole,1981,2005,Y,,Y,,,,, Williams,Thomas Wayne,38,Black,TX,NC,Drug Possession or Sale,Life,1998,2000,,,,,Y,,Y, Williams,Willie,23,Black,GA,,Sexual Assault,45 years,1985,2007,Y,,Y,,,,, Williamson,Ronald Keith,29,Caucasian,OK,JI,Murder,Death,1988,1999,Y,,,Y,Y,Y,Y,Y Willis,Calvin,21,Black,LA,CV,Child Sex Abuse,Life without parole,1982,2003,Y,,Y,,,,Y, Willis,Cedric,19,Black,MS,CV,Murder,Life,1997,2006,Y,*,Y,,,,, Willis,Ernest Ray,39,Caucasian,TX,"A, 
NC",Murder,Death,1987,2004,,,,,,Y,Y,Y Willis,John,42,Black,IL,,Sexual Assault,Life,1993,1999,Y,,Y,,Y,Y,Y, Willis,Marrio D'Shane,23,Black,OK,,Robbery,10 years,2003,2006,,,Y,,,,,Y Willoughby,Troy,20,Caucasian,WY,JI,Murder,life,2010,2012,,,,,Y,,Y, Wilson,Carol Jean,57,Black,MI,"F, NC",Forgery,Probation,2011,2013,,,,,Y,,,Y Wilson,Edwin,49,Caucasian,F-VA,FED,Gun Possession or Sale,17 years,1983,2004,,,,,Y,,Y, Wilson,Harold C.,30,Black,PA,,Murder,Death,1989,2005,Y,*,,,,,Y, Wilson,Johnny Lee,20,Caucasian,MO,P,Murder,Life without parole,1987,1995,,,,Y,Y,,Y, Wilson,Kathryn Dawn,22,Caucasian,NC,"CSH, CV, F, NC",Child Sex Abuse,Life,1993,1997,,,,,Y,,Y, Wilson,Robert,41,Black,IL,,Attempted Murder,30 years,1999,2006,,,Y,Y,,,Y, Wilson,Sharrif,15,Black,NY,"CDC, CIU, CV",Murder,9 to Life,1994,2014,Y,,,Y,Y,Y,Y,Y Wilson,Theopolis,22,Black,WI,NC,Burglary/Unlawful Entry,5 years,1989,1991,,,,,,,, Winfrey,Megan,16,Caucasian,TX,F,Murder,Life,2008,2013,,,,,,Y,, "Winfrey, Sr.",Richard,49,Caucasian,TX,JI,Murder,75 years,2007,2010,,,,,Y,Y,, Winslow,Thomas,18,Caucasian,NE,"CDC, P",Murder,50 years,1990,2009,Y,,,Y,Y,Y,, Winston,Michael,21,Black,IL,,Murder,40 years,2007,2012,,,Y,,Y,,Y, Wise,Korey,16,Black,NY,CDC,Sexual Assault,5 to 15 years,1990,2002,Y,,,Y,Y,Y,, Witt,Drayton,18,Caucasian,AZ,"CV, NC, SBS",Murder,20 years,2002,2012,,,,,,Y,, Woidtke,Rodney,27,Caucasian,IL,,Murder,45 years,1989,2001,,,,Y,,,Y,Y Womble,Willie,21,Black,NC,,Murder,80 years,1976,2014,,,,Y,,,Y, Wong,David,23,Asian,NY,JI,Murder,25 to Life,1987,2004,,,,,Y,,Y,Y Woodall,Glen,28,Caucasian,WV,,Sexual Assault,Life,1987,1992,Y,,Y,,Y,Y,, Woodard,James Lee,26,Black,TX,CIU,Murder,Life,1981,2008,Y,,Y,,,,Y, Woodley,Collin,30,Caucasian,NY,,Drug Possession or Sale,10 to 20 years,1990,1998,,,Y,,,,, Woods,Anthony,20,Black,MO,CV,Child Sex Abuse,25 years,1984,2005,Y,,Y,,,,, Woods,Cathy,26,Caucasian,NV,F,Murder,Life without parole,1980,2015,Y,,,Y,,,, Woods,Declan,43,Caucasian,CA,"NC, P",Traffic Offense,Fine of unk amt/alcohol awaremess 
program,2007,2012,,,,,Y,,Y, "Woodside, Jr.",Gary,20,Caucasian,NY,JI,Manslaughter,1 1/3 to 4 years,1989,2007,,,,Y,Y,,, Woodworth,Mark,16,Caucasian,MO,,Murder,31 years,1995,2014,,,,,,,Y, Word,Lathan,18,Black,GA,CV,Robbery,15 years,2000,2011,,,,,Y,,Y,Y Wosu,Joy,33,Black,NY,"CV, F, NC",Child Sex Abuse,25 years,1993,2009,,,,,Y,Y,,Y Woten,Michael Anthony,19,Caucasian,TX,,Robbery,55 years,1982,1990,,,Y,,,,, Wrice,Stanley,28,Black,IL,,Sexual Assault,Life,1983,2013,,,,Y,Y,,Y, Wright,Cleveland,20,Black,DC,,Murder,20 to Life,1979,2014,Y,*,,,Y,Y,Y, "Wright, Jr.",Harold,32,Black,WA,NC,Sexual Assault,6 months,2007,2013,,,,,Y,Y,Y, Wyatt,Rickey Dale,25,Black,TX,CIU,Sexual Assault,99 years,1981,2012,Y,,Y,,Y,Y,Y,Y Wyniemko,Kenneth,43,Caucasian,MI,,Sexual Assault,40 to 60 years,1994,2003,Y,,Y,,,,, Wynn,Morris,21,Caucasian,IL,,Drug Possession or Sale,8 years,2005,2006,,,,,Y,,Y, Yarbough,Anthony,18,Black,NY,"CDC, CIU, CV",Murder,75 to life,1994,2014,Y,,,Y,Y,Y,Y,Y Yarris,Nicholas,20,Caucasian,PA,JI,Murder,Death,1982,2003,Y,,,Y,Y,,Y,Y York,Kenneth,38,Caucasian,MO,,Sexual Assault,Life without parole,1994,2010,Y,,Y,,,,, Young,Dan,30,Black,IL,"A, CDC",Murder,Life without parole,1994,2005,Y,*,,Y,Y,Y,Y, Youngblood,Larry,30,Black,AZ,CV,Child Sex Abuse,10 years and 6 months,1985,2000,Y,,Y,,,,, Zimmer,Walter,40,Caucasian,OH,CDC,Manslaughter,50 years,1998,2011,Y,*,,,Y,Y,Y, Zimmerman,Evan,53,Caucasian,WI,,Murder,Life,2001,2005,Y,*,,Y,,,,Y Zinkiewicz,Tyrone,38,Caucasian,OH,"CV, NC",Other Nonviolent Felony,5 to 15 years,1988,1992,,,,,,,Y, Zomber,Michael,53,Caucasian,F-PA,FED,Fraud,2 years and 6 months,2003,2009,,,,,,,Y,
agate-1.9.1/examples/realdata/ks_1033_data.csv
state,county,fips,nsn,item_name,quantity,ui,acquisition_cost,total_cost,ship_date,federal_supply_category,federal_supply_category_name,federal_supply_class,federal_supply_class_name
KS,ALLEN,20001,1005-00-073-9421,"RIFLE,5.56 
MILLIMETER",1,Each,499,499,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,5/19/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER 
.45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ALLEN,20001,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ANDERSON,20003,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/11/2007 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/11/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BARTON,20009,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,11/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,11/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,11/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,11/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BOURBON,20011,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,11/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BROWN,20013,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,6/29/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,BROWN,20013,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,8/2/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,BROWN,20013,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,8/2/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,BROWN,20013,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,8/2/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,BROWN,20013,2320-01-107-7156,"TRUCK,UTILITY",1,Each,41253,41253,6/30/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,BROWN,20013,3590-DS-MIS-CSVS,AIR JACK,1,EA,250,250,1/11/2012 0:00:00,35,SERVICE AND TRADE EQPT,3590,Miscellaneous Service and Trade Equipment KS,BROWN,20013,4110-01-302-0350,REFRIGERATOR-FREEZE,1,EA,1905.71,1905.71,1/11/2012 0:00:00,41,"REFRIG, AIR CONDIT/CIRCULAT EQPT",4110,Refrigeration Equipment 
KS,BROWN,20013,5120-DS-JAC-KHAN,3 TON FLOOR JACK,1,EA,100,100,1/11/2012 0:00:00,51,HAND TOOLS,5120,"Hand Tools, Nonedged, Nonpowered" KS,BROWN,20013,5995-01-054-9744,"CABLE ASSEMBLY,POWER,ELECTRICAL",1,Each,164.55,164.55,8/22/2011 0:00:00,59,ELECTRICAL/ELECTRONIC EQPT COMPNTS,5995,"Cable, Cord, and Wire Assemblies: Communication Equipment" KS,BROWN,20013,5995-01-269-0092,"CABLE ASSEMBLY,POWER,ELECTRICAL",2,Each,154,308,8/22/2011 0:00:00,59,ELECTRICAL/ELECTRONIC EQPT COMPNTS,5995,"Cable, Cord, and Wire Assemblies: Communication Equipment" KS,BROWN,20013,6150-01-029-6766,"CABLE ASSEMBLY,POWER,ELECTRICAL",1,Each,557.34,557.34,8/22/2011 0:00:00,61,"ELECTRIC WIRE, POWER DISTRIB EQPT",6150,Miscellaneous Electric Power and Distribution Equipment KS,BROWN,20013,7110-00-CHA-IRS,OFFICE CHAIRS,15,EA,20,300,1/11/2012 0:00:00,71,FURNITURE,7110,Office Furniture KS,BROWN,20013,9535-00-236-7082,"SHEET,METAL",2,SH,208.18,416.36,1/11/2012 0:00:00,95,"METAL BARS, SHEETS, SHAPES",9535,"Plate, Sheet, Strip, and Foil: Nonferrous Base Metal" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-073-9421,"RIFLE,5.56 
MILLIMETER",1,Each,499,499,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,BUTLER,20015,1005-01-128-9936,"RIFLE,5.56 
MILLIMETER",1,Each,749,749,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,9/4/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHAUTAUQUA,20019,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,0,0,3/27/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/8/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/8/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/29/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEROKEE,20021,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/7/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/15/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,3/19/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/7/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/20/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/20/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/20/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/20/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/20/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CHEYENNE,20023,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,7/11/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,CHEYENNE,20023,2510-01-187-3606,"PANEL,BODY,VEHICULAR",1,Assembly,1335.69,1335.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CHEYENNE,20023,2510-01-254-1500,"DOOR,VEHICULAR",1,Each,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CHEYENNE,20023,2510-01-254-1501,"DOOR,VEHICULAR",1,Each,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CHEYENNE,20023,2510-01-257-3903,"DOOR,VEHICULAR",1,Assembly,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CHEYENNE,20023,2510-01-257-3904,"DOOR,VEHICULAR",1,Assembly,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CHEYENNE,20023,2510-01-262-9520,"DOOR,HATCH,VEHICLE",1,Each,3718.69,3718.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,CLAY,20027,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CLAY,20027,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COFFEY,20031,1080-01-457-2956,"CAMOUFLAGE NET SYSTEM,RADAR SCATTERING",1,Each,1445,1445,2/25/2010 0:00:00,10,WEAPONS,1080,Camouflage and Deception Equipment
KS,COFFEY,20031,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,4/17/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,COFFEY,20031,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,4/17/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,COFFEY,20031,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,4/17/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,COFFEY,20031,2355-01-553-4634,MINE RESISTANT VEHICLE,1,Each,658000,658000,9/5/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2355,"Combat, Assault, and Tactical Vehicles, Wheeled"
KS,COFFEY,20031,8465-01-416-4632,"SPECTACLES,BALLISTI",7,EA,21.69,151.83,3/4/2010 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,COMANCHE,20033,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,6/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/14/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/14/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,COWLEY,20035,1240-00-930-3833,BINOCULAR,2,Each,278.05,556.1,5/21/2010 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment
KS,COWLEY,20035,1240-01-411-1265,"SIGHT,REFLEX",5,EA,328,1640,4/8/2009 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/14/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,11/14/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,9/25/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,9/25/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,CRAWFORD,20037,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DECATUR,20039,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DECATUR,20039,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/27/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DICKINSON,20041,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,DOUGLAS,20045,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,1/10/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,DOUGLAS,20045,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,2/16/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,DOUGLAS,20045,2510-01-187-3606,"PANEL,BODY,VEHICULAR",1,Assembly,1335.69,1335.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-187-3606,"PANEL,BODY,VEHICULAR",1,Assembly,1335.69,1335.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-254-1500,"DOOR,VEHICULAR",1,Each,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-254-1500,"DOOR,VEHICULAR",1,Each,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-254-1501,"DOOR,VEHICULAR",1,Each,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-254-1501,"DOOR,VEHICULAR",1,Each,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-257-3903,"DOOR,VEHICULAR",1,Assembly,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-257-3903,"DOOR,VEHICULAR",1,Assembly,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-257-3904,"DOOR,VEHICULAR",1,Assembly,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-257-3904,"DOOR,VEHICULAR",1,Assembly,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-262-9520,"DOOR,HATCH,VEHICLE",1,Each,3718.69,3718.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,DOUGLAS,20045,2510-01-262-9520,"DOOR,HATCH,VEHICLE",1,Each,3718.69,3718.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,ELLIS,20051,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/13/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/13/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/12/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/20/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/20/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLIS,20051,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56
MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ELLSWORTH,20053,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,1/12/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,1/12/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,1/12/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,1/12/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER 
.45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 
mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FINNEY,20055,1240-01-411-1265,"SIGHT,REFLEX",5,Each,328,1640,12/29/2011 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment KS,FINNEY,20055,2320-01-074-7642,"TRUCK,ARMORED",1,Each,65070,65070,5/17/2007 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,FINNEY,20055,2320-01-107-7153,"TRUCK,UTILITY",1,Each,47069,47069,1/24/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,FINNEY,20055,2320-01-107-7153,"TRUCK,UTILITY",1,Each,47069,47069,1/24/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/21/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/21/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/21/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/18/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/18/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/18/2008 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/18/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/18/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FORD,20057,1010-00-691-1382,"LAUNCHER,GRENADE",1,Each,720,720,8/20/2008 0:00:00,10,WEAPONS,1010,"Guns, over 30 mm up to 75 mm" KS,FORD,20057,2320-01-074-7642,"TRUCK,ARMORED",1,Each,65070,65070,6/24/2010 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,FRANKLIN,20059,1005-00-072-5011,"RIFLE,7.62 MILLIMETER",1,Each,206,206,6/21/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-072-5011,"RIFLE,7.62 MILLIMETER",1,Each,206,206,6/21/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,FRANKLIN,20059,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,7/20/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/10/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,8465-01-109-3369,"MAT,SLEEPING",12,EA,9.02,108.24,2/2/2012 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/9/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GEARY,20061,1240-00-930-3833,BINOCULAR,4,Each,278.05,1112.2,3/14/2011 0:00:00,12,FIRE 
CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment KS,GEARY,20061,4110-DS-REF-RIGE,REFRIGERATOR 110 CUBIC FOOT,1,EA,525,525,6/7/2012 0:00:00,41,"REFRIG, AIR CONDIT/CIRCULAT EQPT",4110,Refrigeration Equipment KS,GRAHAM,20065,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GRAHAM,20065,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GRAHAM,20065,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GRAHAM,20065,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GRANT,20067,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,8/9/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,GRAY,20069,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,2/29/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,GREELEY,20071,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GREELEY,20071,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,GREENWOOD,20073,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,6/13/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,GREENWOOD,20073,2320-01-371-9577,"TRUCK,UTILITY",1,Each,44722,44722,9/1/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,HAMILTON,20075,2320-01-128-9551,"TRUCK,UTILITY",1,Each,47989,47989,7/8/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,HARVEY,20079,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,HARVEY,20079,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HARVEY,20079,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,6/24/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,HASKELL,20081,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,HASKELL,20081,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HASKELL,20081,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HASKELL,20081,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,HASKELL,20081,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JACKSON,20085,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JEFFERSON,20087,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, 
through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,5/3/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,5/3/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,5/3/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,JOHNSON,20091,2355-01-555-0908,MINE RESISTANT VEHICLE,1,Each,412000,412000,10/4/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2355,"Combat, Assault, and Tactical Vehicles, Wheeled" KS,KIOWA,20097,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,KIOWA,20097,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,KIOWA,20097,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,KIOWA,20097,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,KIOWA,20097,2320-01-128-9551,"TRUCK,UTILITY",1,Each,47989,47989,12/30/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,KIOWA,20097,2320-01-128-9551,"TRUCK,UTILITY",1,Each,47989,47989,12/30/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,LABETTE,20099,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LABETTE,20099,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LABETTE,20099,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/9/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LABETTE,20099,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/9/2008 0:00:00,10,WEAPONS,1005,"Guns, 
through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LEAVENWORTH,20103,1005-01-128-9936,"RIFLE,5.56 MILLIMETER",1,Each,749,749,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/24/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 
MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/23/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/15/2010 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/15/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/12/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LINN,20107,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LOGAN,20109,1005-00-589-1271,"RIFLE,7.62 
MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LOGAN,20109,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LOGAN,20109,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/28/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,LYON,20111,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,3/25/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,LYON,20111,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,12/22/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 
MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/4/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARION,20115,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARSHALL,20117,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARSHALL,20117,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARSHALL,20117,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MARSHALL,20117,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/6/2006 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 
mm" KS,MCPHERSON,20113,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/30/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/30/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/30/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/30/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MEADE,20119,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,6/10/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, 
through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-921-5483,"SHOTGUN,12 GAGE,RIOT TYPE",1,Each,108,108,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,MIAMI,20121,1005-00-921-5483,"SHOTGUN,12 GAGE,RIOT TYPE",1,Each,108,108,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,1005-00-921-5483,"SHOTGUN,12 GAGE,RIOT TYPE",1,Each,108,108,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MIAMI,20121,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,8/3/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,MIAMI,20121,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,5/15/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,MIAMI,20121,2510-01-254-1500,"DOOR,VEHICULAR",1,Each,635.96,635.96,5/23/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components" KS,MIAMI,20121,2510-01-254-1501,"DOOR,VEHICULAR",1,Each,633.97,633.97,5/23/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components" KS,MIAMI,20121,2510-01-257-3903,"DOOR,VEHICULAR",1,Assembly,635.96,635.96,5/23/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components" KS,MIAMI,20121,2510-01-257-3904,"DOOR,VEHICULAR",1,Assembly,633.97,633.97,5/23/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components" KS,MONTGOMERY,20125,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/24/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,MONTGOMERY,20125,2355-01-555-0908,MINE RESISTANT VEHICLE,1,Each,412000,412000,10/2/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2355,"Combat, Assault, and Tactical Vehicles, Wheeled" KS,MONTGOMERY,20125,2541-01-580-1324,"ASSEMBLY,COVER",1,Each,1190.8,1190.8,1/15/2014 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2541,Weapons Systems Specific Vehicular Accessories KS,MONTGOMERY,20125,5340-01-571-4006,"HARDWARE KIT,ELECTRONIC EQUIPMENT",1,Kit,13999,13999,2/7/2014 0:00:00,53,HARDWARE AND ABRASIVES,5340,Hardware KS,MONTGOMERY,20125,5340-01-571-4006,"HARDWARE KIT,ELECTRONIC EQUIPMENT",1,Kit,13999,13999,4/17/2014 0:00:00,53,HARDWARE AND ABRASIVES,5340,Hardware KS,MONTGOMERY,20125,8470-01-441-0885,BALLISTIC BLANKET,2,EA,1228.2,2456.4,7/28/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8470,"Armor, Personal" KS,NEMAHA,20131,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEMAHA,20131,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 
mm" KS,NEMAHA,20131,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEMAHA,20131,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/15/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/15/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NEOSHO,20133,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,NORTON,20137,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/21/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OSAGE,20139,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OSAGE,20139,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,0,0,5/4/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OSAGE,20139,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,0,0,5/4/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 
mm" KS,OSAGE,20139,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,0,0,5/4/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OSAGE,20139,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,1/24/2014 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,OTTAWA,20143,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PAWNEE,20145,8470-00-ARM-OR,,2,Each,2499.99,4999.98,11/5/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8470,"Armor, Personal" KS,PAWNEE,20145,8470-01-359-0427,"CARRIER,SMALL ARMS PROTECTIVE BODY ARMOR",2,Each,1696.5,3393,11/4/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8470,"Armor, Personal" 
KS,PAWNEE,20145,8470-01-359-0428,"CARRIER,SMALL ARMS PROTECTIVE BODY ARMOR",4,Each,1696.5,6786,11/4/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8470,"Armor, Personal" KS,PAWNEE,20145,8470-01-359-0429,"CARRIER,SMALL ARMS PROTECTIVE BODY ARMOR",4,Each,1696.5,6786,11/4/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8470,"Armor, Personal" KS,POTTAWATOMIE,20149,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/29/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,POTTAWATOMIE,20149,1240-01-411-1265,"SIGHT,REFLEX",1,EA,328,328,8/13/2009 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment KS,POTTAWATOMIE,20149,1240-01-411-1265,"SIGHT,REFLEX",5,EA,328,1640,8/13/2009 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment 
KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/4/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,PRATT,20151,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,4/18/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,PRATT,20151,6115-01-143-3311,"GENERATOR SET,GASOL",1,EA,4446,4446,6/14/2012 0:00:00,61,"ELECTRIC WIRE, POWER DISTRIB EQPT",6115,"Generators and Generator Sets, Electrical" KS,PRATT,20151,6115-01-150-4140,"GENERATOR SET,DIESE",1,EA,100,100,6/14/2012 0:00:00,61,"ELECTRIC WIRE, POWER DISTRIB EQPT",6115,"Generators and Generator Sets, Electrical" KS,PRATT,20151,8465-01-109-3369,"MAT,SLEEPING",10,EA,9.02,90.2,2/2/2012 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment KS,PRATT,20151,8465-01-109-3369,"MAT,SLEEPING",58,EA,9.02,523.16,2/2/2012 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment KS,RAWLINS,20153,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,RAWLINS,20153,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,2/13/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/15/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RAWLINS,20153,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,2/25/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,RENO,20155,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 
MILLIMETER",1,Each,120,120,8/2/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RENO,20155,7110-DS-OFF-URN,DESK,1,EA,75,75,2/9/2012 0:00:00,71,FURNITURE,7110,Office Furniture KS,RENO,20155,7110-DS-OFF-URN,CHAIR OFFICE,62,EA,60,3720,2/9/2012 0:00:00,71,FURNITURE,7110,Office Furniture KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-589-1271,"RIFLE,7.62 
MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RICE,20159,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,RILEY,20161,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,ROOKS,20163,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2011 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,4/13/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/22/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SALINE,20169,1010-00-691-1382,"LAUNCHER,GRENADE",1,Each,720,720,10/2/2006 0:00:00,10,WEAPONS,1010,"Guns, over 30 mm up to 75 mm"
KS,SALINE,20169,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,6/13/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,SALINE,20169,2355-DS-COM-BTV2,ONLY COMPLETE COMBAT/ASSAULT/TACTICAL WHEELED VEHICLES,1,Each,200000,200000,10/28/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2355,"Combat, Assault, and Tactical Vehicles, Wheeled"
KS,SALINE,20169,2510-01-187-3606,"PANEL,BODY,VEHICULAR",1,Assembly,1335.69,1335.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SALINE,20169,2510-01-254-1500,"DOOR,VEHICULAR",1,Each,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SALINE,20169,2510-01-254-1501,"DOOR,VEHICULAR",1,Each,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SALINE,20169,2510-01-257-3903,"DOOR,VEHICULAR",1,Assembly,635.96,635.96,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SALINE,20169,2510-01-257-3904,"DOOR,VEHICULAR",1,Assembly,633.97,633.97,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SALINE,20169,2510-01-262-9520,"DOOR,HATCH,VEHICLE",1,Each,3718.69,3718.69,5/22/2012 0:00:00,25,VEHICULAR EQUIPMENT COMPONENTS,2510,"Vehicular Cab, Body, and Frame Structural Components"
KS,SCOTT,20171,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SCOTT,20171,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,12/6/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,7/19/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/16/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/20/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SEDGWICK,20173,1240-01-411-1265,"SIGHT,REFLEX",1,EA,328,328,7/10/2009 0:00:00,12,FIRE CONTROL EQPT.,1240,Optical Sighting and Ranging Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,1385-01-574-4707,"ROBOT,EXPLOSIVE ORDNANCE DISPOSAL",1,Each,10000,10000,2/6/2014 0:00:00,13,AMMUNITION AND EXPLOSIVES,1385,Surface Use Explosive Ordnance Disposal Tools and Equipment
KS,SEDGWICK,20173,2320-01-074-7642,"TRUCK,ARMORED",1,EA,65070,65070,6/9/2010 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,SEDGWICK,20173,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,2/16/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,SEDGWICK,20173,2355-01-553-4634,MINE RESISTANT VEHICLE,1,Each,658000,658000,11/12/2013 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2355,"Combat, Assault, and Tactical Vehicles, Wheeled"
KS,SEDGWICK,20173,4110-01-030-8044,"ICE MAKING MACHINE,",2,EA,4066.68,8133.36,2/8/2012 0:00:00,41,"REFRIG, AIR CONDIT/CIRCULAT EQPT",4110,Refrigeration Equipment
KS,SEDGWICK,20173,5855-01-420-0821,"ILLUMINATOR,INFRARED",5,Each,1875.44,9377.2,11/28/2006 0:00:00,58,COMM/DETECT/COHERENT RADIATION,5855,"Night Vision Equipment, Emitted and Reflected Radiation"
KS,SEDGWICK,20173,5855-01-468-4169,"ILLUMINATOR,INFRARED",1,Each,756.38,756.38,11/30/2006 0:00:00,58,COMM/DETECT/COHERENT RADIATION,5855,"Night Vision Equipment, Emitted and Reflected Radiation"
KS,SEDGWICK,20173,5855-01-501-9529,"VIEWER,NIGHT VISION",7,Each,10427.1,72989.7,3/3/2007 0:00:00,58,COMM/DETECT/COHERENT RADIATION,5855,"Night Vision Equipment, Emitted and Reflected Radiation"
KS,SEDGWICK,20173,6515-01-466-0971,"OXIMETER,PULSE",5,EA,572.92,2864.6,3/8/2012 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,SEDGWICK,20173,6515-01-521-5730,"SPLINT,LEG",4,EA,261.8,1047.2,3/21/2012 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,SEDGWICK,20173,6530-01-265-3583,"EXTRACTION DEVICE,S",4,EA,260.17,1040.68,3/21/2012 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6530,"Hospital Furniture, Equipment, Utensils, and Supplies"
KS,SEDGWICK,20173,6650-00-RBI-NOCU,BINOCULAR,1,EA,221,221,3/21/2012 0:00:00,66,INSTRUMENTS AND LABORATORY EQPT,6650,"Optical Instruments, Test Equipment, Components and Accessories"
KS,SEDGWICK,20173,6720-DS-CAM-ERA2,CAMERA AND EQUIP SECURITY,1,EA,5515.75,5515.75,3/8/2012 0:00:00,67,PHOTOGRAPHIC EQPT,6720,"Cameras, Still Picture"
KS,SEDGWICK,20173,6910-01-C06-6627,,10,Each,84.31,843.1,6/29/2011 0:00:00,69,TRAINING AIDS AND DEVICES,6910,Training Aids
KS,SEDGWICK,20173,7025-DS-MON-IT08,MONITOR,4,EA,429.99,1719.96,2/28/2012 0:00:00,70,ADP EQPT/SOFTWARE/SUPPLIES AND EQPT,7025,ADP Input/Output and Storage Devices
KS,SEDGWICK,20173,7025-DS-MON-IT08,MONITOR,4,EA,315,1260,2/28/2012 0:00:00,70,ADP EQPT/SOFTWARE/SUPPLIES AND EQPT,7025,ADP Input/Output and Storage Devices
KS,SEDGWICK,20173,8340-01-521-6438,"SHELTER,ICS IMPROVE",2,EA,297.32,594.64,1/13/2009 0:00:00,83,TEXTILE/LEATHER/FUR; TENT; FLAG,8340,Tents and Tarpaulins
KS,SEDGWICK,20173,8465-00-ACU-VEST,CPCV ACU VEST,25,EA,200,5000,2/19/2009 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,SEDGWICK,20173,8465-01-416-4632,"SPECTACLES,BALLISTIC AND LASER PROTECTIVE",20,Each,21.69,433.8,10/4/2010 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,SEDGWICK,20173,8465-01-416-4632,"SPECTACLES,BALLISTIC AND LASER PROTECTIVE",13,Each,21.69,281.97,1/5/2011 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,SEDGWICK,20173,8465-01-416-4636,"SPECTACLES SET,BALL",5,EA,15.4,77,2/18/2011 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,SEDGWICK,20173,8465-01-467-0721,FACE SHIELD,9,Each,58.03,522.27,1/5/2011 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipment
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,9/9/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-179-0300,"RIFLE,7.62 MILLIMETER",1,Each,1278,1278,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHAWNEE,20177,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,10/21/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,1/25/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SHERMAN,20181,2320-01-215-7631,"TRUCK,VAN",1,EA,14173,14173,2/1/2012 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/17/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/17/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SMITH,20183,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,4/16/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,STAFFORD,20185,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,STAFFORD,20185,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/22/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SUMNER,20191,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,SUMNER,20191,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/23/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,THOMAS,20193,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,12/16/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WABAUNSEE,20197,2320-01-107-7155,"TRUCK,UTILITY",1,Each,25000,25000,3/12/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled"
KS,WABAUNSEE,20197,5835-DS-REC-ORD0,RECORDER,3,Each,0,0,11/14/2013 0:00:00,58,COMM/DETECT/COHERENT RADIATION,5835,Sound Recording and Reproducing Equipment
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,11/14/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,11/14/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,11/14/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,11/14/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,11/14/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,12/4/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,12/4/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,12/4/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,12/4/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WABAUNSEE,20197,6515-01-139-7844,"PADDLES,DEFIBRILLATOR,EXTERNAL",1,Each,71.23,71.23,12/4/2013 0:00:00,65,MEDICAL/DENTAL/VETERINARY EQPT/SUPP,6515,"Medical and Surgical Instruments, Equipment, and Supplies"
KS,WICHITA,20203,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/17/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WICHITA,20203,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/17/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WICHITA,20203,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/17/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WICHITA,20203,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/17/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WICHITA,20203,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,10/17/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,5/18/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-726-5655,"PISTOL,CALIBER .45,AUTOMATIC",1,Each,58.71,58.71,1/25/2011 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm"
KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/14/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WILSON,20205,2320-01-456-1282,"TRUCK,UTILITY",1,Each,35000,35000,6/27/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,WILSON,20205,2320-01-456-1282,"TRUCK,UTILITY",1,EA,35000,35000,6/27/2011 0:00:00,23,"MOTOR VEHICLES, CYCLES, TRAILERS",2320,"Trucks and Truck Tractors, Wheeled" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 
0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WOODSON,20207,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,3/13/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-073-9421,"RIFLE,5.56 MILLIMETER",1,Each,499,499,1/26/2009 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,10/2/2006 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,11/28/2007 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,9/22/2008 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-589-1271,"RIFLE,7.62 MILLIMETER",1,Each,138,138,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,8/3/2010 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" 
KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-856-6885,"RIFLE,5.56 MILLIMETER",1,Each,120,120,4/12/2012 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-00-921-5004,"MAGAZINE,CARTRIDGE",50,Each,9.98,499,3/3/2014 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,1005-01-373-2774,"MAGAZINE,CARTRIDGE",50,Each,29.67,1483.5,7/2/2013 0:00:00,10,WEAPONS,1005,"Guns, through 30 mm" KS,WYANDOTTE,20209,7830-DS-BIC-YCEE,"BICYCLE, EXERCISE",1,Each,0,0,3/25/2014 0:00:00,78,RECREATIONAL/ATHLETIC EQPT,7830,Recreational and Gymnastic Equipment KS,WYANDOTTE,20209,7830-DS-BIC-YCEE,"BICYCLE, EXERCISE",1,Each,0,0,3/25/2014 0:00:00,78,RECREATIONAL/ATHLETIC EQPT,7830,Recreational and Gymnastic Equipment KS,WYANDOTTE,20209,7830-DS-ELL-IPTI,ELLIPTICAL,1,Each,0,0,3/25/2014 0:00:00,78,RECREATIONAL/ATHLETIC EQPT,7830,Recreational and Gymnastic Equipment KS,WYANDOTTE,20209,8465-01-524-5250,ASSAULT PACK,25,Each,90.28,2257,4/18/2013 0:00:00,84,"CLOTHING/INDIVIDUAL EQPT, INSIGNIA",8465,Individual Equipmentagate-1.9.1/examples/tableset/000077500000000000000000000000001454111471500162275ustar00rootroot00000000000000agate-1.9.1/examples/tableset/table1.csv000066400000000000000000000000321454111471500201070ustar00rootroot00000000000000letter,number a,1 a,3 b,2 
agate-1.9.1/examples/tableset/table1.json000066400000000000000000000002371454111471500202740ustar00rootroot00000000000000[ { "letter": "a", "number": 1 }, { "letter": "a", "number": 3 }, { "letter": "b", "number": 2 } ] agate-1.9.1/examples/tableset/table2.csv000066400000000000000000000000321454111471500201100ustar00rootroot00000000000000letter,number b,0 a,2 c,5 agate-1.9.1/examples/tableset/table2.json000066400000000000000000000002371454111471500202750ustar00rootroot00000000000000[ { "letter": "b", "number": 0 }, { "letter": "a", "number": 2 }, { "letter": "c", "number": 5 } ] agate-1.9.1/examples/tableset/table3.csv000066400000000000000000000000321454111471500201110ustar00rootroot00000000000000letter,number a,1 a,2 c,3 agate-1.9.1/examples/tableset/table3.json000066400000000000000000000002371454111471500202760ustar00rootroot00000000000000[ { "letter": "a", "number": 1 }, { "letter": "a", "number": 2 }, { "letter": "c", "number": 3 } ] agate-1.9.1/examples/tableset/type_error/000077500000000000000000000000001454111471500204215ustar00rootroot00000000000000agate-1.9.1/examples/tableset/type_error/table1.csv000066400000000000000000000000321454111471500223010ustar00rootroot00000000000000letter,number a,a a,a b,v agate-1.9.1/examples/tableset/type_error/table2.csv000066400000000000000000000000321454111471500223020ustar00rootroot00000000000000letter,number b,0 a,2 c,5 agate-1.9.1/examples/test.csv000066400000000000000000000002271454111471500161210ustar00rootroot00000000000000number,text,boolean,date,datetime,timedelta 1,a,True,2015-11-04,2015-11-04T12:22:00,0:04:15 2,👍,False,2015-11-05,2015-11-04T12:45:00,0:06:18 ,b,,,, agate-1.9.1/examples/test.json000066400000000000000000000010261454111471500162750ustar00rootroot00000000000000[ { "number": 1.0, "text": "a", "boolean": true, "date": "2015-11-04", "datetime": "2015-11-04T12:22:00", "timedelta": "0:04:15" }, { "number": 2, "text": "👍", "boolean": false, "date": "2015-11-05", "datetime": "2015-11-04T12:45:00", 
"timedelta": "0:06:18" }, { "number": null, "text": "b", "boolean": null, "date": null, "datetime": null, "timedelta": null } ] agate-1.9.1/examples/test_cr.csv000066400000000000000000000002271454111471500166050ustar00rootroot00000000000000number,text,boolean,date,datetime,timedelta 1,a,True,2015-11-04,2015-11-04T12:22:00,0:04:15 2,👍,False,2015-11-05,2015-11-04T12:45:00,0:06:18 ,b,,,, agate-1.9.1/examples/test_crlf.csv000066400000000000000000000002331454111471500171240ustar00rootroot00000000000000number,text,boolean,date,datetime,timedelta 1,a,True,2015-11-04,2015-11-04T12:22:00,0:04:15 2,👍,False,2015-11-05,2015-11-04T12:45:00,0:06:18 ,b,,,, agate-1.9.1/examples/test_csv_sniff.csv000066400000000000000000000002271454111471500201610ustar00rootroot00000000000000number|text|boolean|date|datetime|timedelta 1|a|True|2015-11-04|2015-11-04T12:22:00|0:04:15 2|👍|False|2015-11-05|2015-11-04T12:45:00|0:06:18 |b|||| agate-1.9.1/examples/test_from_json_ambiguous.json000066400000000000000000000001221454111471500224200ustar00rootroot00000000000000[ { "a/b": 2, "a": { "b": false } } ] agate-1.9.1/examples/test_key.json000066400000000000000000000012451454111471500171500ustar00rootroot00000000000000{ "data": [ { "number": 1.0, "text": "a", "boolean": true, "date": "2015-11-04", "datetime": "2015-11-04T12:22:00", "timedelta": "0:04:15" }, { "number": 2, "text": "👍", "boolean": false, "date": "2015-11-05", "datetime": "2015-11-04T12:45:00", "timedelta": "0:06:18" }, { "number": null, "text": "b", "boolean": null, "date": null, "datetime": null, "timedelta": null } ], "other": "ignore me!" 
} agate-1.9.1/examples/test_keyed.json000066400000000000000000000010501454111471500174530ustar00rootroot00000000000000{ "a": { "number": 1.0, "text": "a", "boolean": true, "date": "2015-11-04", "datetime": "2015-11-04T12:22:00", "timedelta": "0:04:15" }, "👍": { "number": 2, "text": "👍", "boolean": false, "date": "2015-11-05", "datetime": "2015-11-04T12:45:00", "timedelta": "0:06:18" }, "b": { "number": null, "text": "b", "boolean": null, "date": null, "datetime": null, "timedelta": null } } agate-1.9.1/examples/test_latin1.csv000066400000000000000000000000211454111471500173610ustar00rootroot00000000000000a,b,c 1,2,3 4,5,agate-1.9.1/examples/test_mixed.json000066400000000000000000000004031454111471500174610ustar00rootroot00000000000000[ { "one": 1, "two": 4, "three": "a" }, { "one": 2, "two": 3, "three": "b", "four": "d" }, { "one": null, "two": 2, "three": "👍", "five": 5 } ] agate-1.9.1/examples/test_nested.json000066400000000000000000000004351454111471500176420ustar00rootroot00000000000000[ { "one": 1, "two": { "two_a": "a", "two_b": "b" }, "three": ["a", 2, "c"] }, { "one": 2, "two": { "two_a": "c", "two_b": "d" }, "three": ["d", 2, "f"] } ] agate-1.9.1/examples/test_newline.json000066400000000000000000000005501454111471500200170ustar00rootroot00000000000000{ "number": 1.0, "text": "a", "boolean": true, "date": "2015-11-04", "datetime": "2015-11-04T12:22:00", "timedelta": "0:04:15" } { "number": 2, "text": "👍", "boolean": false, "date": "2015-11-05", "datetime": "2015-11-04T12:45:00", "timedelta": "0:06:18" } { "number": null, "text": "b", "boolean": null, "date": null, "datetime": null, "timedelta": null } agate-1.9.1/examples/test_no_header.csv000066400000000000000000000001461454111471500201250ustar00rootroot000000000000001,a,True,2015-11-04,2015-11-04T12:22:00,04:15 2,👍,False,2015-11-05,2015-11-04T12:45:00,06:18 ,,,,, agate-1.9.1/examples/test_non_string_keyed.json000066400000000000000000000010521454111471500217150ustar00rootroot00000000000000{ "1": { 
"number": 1.0, "text": "a", "boolean": true, "date": "2015-11-04", "datetime": "2015-11-04T12:22:00", "timedelta": "0:04:15" }, "2": { "number": 2.0, "text": "👍", "boolean": false, "date": "2015-11-05", "datetime": "2015-11-04T12:45:00", "timedelta": "0:06:18" }, "None": { "number": null, "text": "b", "boolean": null, "date": null, "datetime": null, "timedelta": null } } agate-1.9.1/examples/test_tableset.json000066400000000000000000000011301454111471500201540ustar00rootroot00000000000000{ "table1":[ { "letter":"a", "number":1 }, { "letter":"a", "number":3 }, { "letter":"b", "number":2 } ], "table2":[ { "letter":"b", "number":0 }, { "letter":"a", "number":2 }, { "letter":"c", "number":5 } ], "table3":[ { "letter":"a", "number":1 }, { "letter":"a", "number":2 }, { "letter":"c", "number":3 } ] } agate-1.9.1/examples/test_utf16_big.csv000066400000000000000000000000441454111471500177640ustar00rootroot00000000000000a,b,c 1,2,3 4,5,agate-1.9.1/examples/test_utf16_little.csv000066400000000000000000000000441454111471500205200ustar00rootroot00000000000000a,b,c 1,2,3 4,5,agate-1.9.1/examples/testfixed000066400000000000000000000003741454111471500163520ustar00rootroot00000000000000Chicago Reader 1971-01-01 40 True 1.0 4:14:001971-01-01T04:14:00 Chicago Sun-Times 1948-01-01 63 True 1.24 14:57:131948-01-01T14:57:13 Chicago Tribune 1920-01-01 164 False41800000.01, 0:00:001920-01-01T00:00:00 Row with blanks agate-1.9.1/examples/testfixed_converted.csv000066400000000000000000000004531454111471500212130ustar00rootroot00000000000000text,date,integer,boolean,float,time,datetime,empty_column Chicago Reader,1971-01-01,40,True,1.0,4:14:00,1971-01-01T04:14:00, Chicago Sun-Times,1948-01-01,63,True,1.24,14:57:13,1948-01-01T14:57:13, Chicago Tribune,1920-01-01,164,False,41800000.01,0:00:00,1920-01-01T00:00:00, Row with blanks,,,,,,, agate-1.9.1/examples/testfixed_schema.csv000066400000000000000000000001721454111471500204600ustar00rootroot00000000000000column,start,length text,0,18 date,18,10 
integer,29,4 boolean,33,5 float,38,11 time,50,8 datetime,58,19 empty_column,77,2 agate-1.9.1/exonerations.py000077500000000000000000000051621454111471500157050ustar00rootroot00000000000000#!/usr/bin/env python import proof import agate def load_data(data): data['exonerations'] = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv') print(data['exonerations']) def confessions(data): num_false_confessions = data['exonerations'].aggregate(agate.Count('false_confession', True)) print('False confessions: %i' % num_false_confessions) @proof.never_cache def median_age(data): median_age = data['exonerations'].aggregate(agate.Median('age')) print('Median age at time of arrest: %i' % median_age) data['exonerations'].bins('age', 10, 0, 100).print_bars('age', width=80) data['exonerations'].pivot('age').order_by('age').print_bars('age', width=80) data['exonerations'].bins('age').print_bars('age', width=80) def years_in_prison(data): data['with_years_in_prison'] = data['exonerations'].compute([ ('years_in_prison', agate.Change('convicted', 'exonerated')) ]) def youth(data): sorted_by_age = data['exonerations'].order_by('age') youngest_ten = sorted_by_age.limit(10) youngest_ten.print_table(max_columns=7) def states(data): by_state = data['with_years_in_prison'].group_by('state') state_totals = by_state.aggregate([ ('count', agate.Count()) ]) sorted_totals = state_totals.order_by('count', reverse=True) sorted_totals.print_table(max_rows=5) medians = by_state.aggregate([ ('count', agate.Count()), ('median_years_in_prison', agate.Median('years_in_prison')) ]) sorted_medians = medians.order_by('median_years_in_prison', reverse=True) sorted_medians.print_table(max_rows=5) def race_and_age(data): # Filters rows without age data only_with_age = data['with_years_in_prison'].where( lambda r: r['age'] is not None ) # Group by race race_groups = only_with_age.group_by('race') # Sub-group by age cohorts (20s, 30s, etc.) 
    race_and_age_groups = race_groups.group_by(
        lambda r: '%i0s' % (r['age'] // 10),
        key_name='age_group'
    )

    # Aggregate medians for each group
    medians = race_and_age_groups.aggregate([
        ('count', agate.Count()),
        ('median_years_in_prison', agate.Median('years_in_prison'))
    ])

    # Sort the results
    sorted_groups = medians.order_by('median_years_in_prison', reverse=True)

    # Print out the results
    sorted_groups.print_table(max_rows=10)


analysis = proof.Analysis(load_data)
analysis.then(confessions)
analysis.then(median_age)
analysis.then(youth)

years_analysis = analysis.then(years_in_prison)
years_analysis.then(states)
years_analysis.then(race_and_age)

analysis.run()

==> agate-1.9.1/setup.cfg <==
[flake8]
max-line-length = 119
per-file-ignores =
    # imported but unused, unable to detect undefined names
    agate/__init__.py: F401,F403
    agate/aggregations/__init__.py: F401
    agate/computations/__init__.py: F401
    agate/data_types/__init__.py: F401
    # module level import not at top of file
    agate/tableset/__init__.py: E402
    agate/table/__init__.py: E402

[isort]
line_length = 119

[bdist_wheel]
universal = 1

==> agate-1.9.1/setup.py <==
from setuptools import find_packages, setup

with open('README.rst') as f:
    long_description = f.read()

setup(
    name='agate',
    version='1.9.1',
    description='A data analysis library that is optimized for humans instead of machines.',
    long_description=long_description,
    long_description_content_type='text/x-rst',
    author='Christopher Groskopf',
    author_email='chrisgroskopf@gmail.com',
    url='https://agate.readthedocs.org/',
    project_urls={
        'Source': 'https://github.com/wireservice/agate',
    },
    license='MIT',
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Framework :: IPython',
        'Intended Audience :: Developers',
        'Intended Audience :: Science/Research',
        'License :: OSI Approved :: MIT License',
        'Natural Language :: English',
        'Operating System :: OS Independent',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Programming Language :: Python :: Implementation :: CPython',
        'Programming Language :: Python :: Implementation :: PyPy',
        'Topic :: Scientific/Engineering :: Information Analysis',
        'Topic :: Software Development :: Libraries :: Python Modules',
    ],
    packages=find_packages(exclude=['benchmarks', 'tests', 'tests.*']),
    install_requires=[
        'Babel>=2.0',
        'isodate>=0.5.4',
        'leather>=0.3.2',
        # KeyError: 's' https://github.com/bear/parsedatetime/pull/233 https://github.com/wireservice/agate/issues/743
        'parsedatetime>=2.1,!=2.5',
        'python-slugify>=1.2.1',
        'pytimeparse>=1.1.5',
        'tzdata>=2023.3;platform_system=="Windows"',
    ],
    extras_require={
        'test': [
            'coverage>=3.7.1',
            'cssselect>=0.9.1',
            'lxml>=3.6.0',
            # CI is not configured to install PyICU on macOS and Windows.
            'PyICU>=2.4.2;sys_platform=="linux"',
            'pytest',
            'pytest-cov',
            'backports.zoneinfo;python_version<"3.9"',
        ],
    }
)

==> agate-1.9.1/tests/__init__.py <==
import locale

# The test fixtures can break if the locale is non-US.
locale.setlocale(locale.LC_ALL, 'en_US.UTF-8')

==> agate-1.9.1/tests/test_agate.py <==
import unittest

import agate


class TestCSV(unittest.TestCase):
    def test_agate(self):
        self.assertIs(agate.csv.reader, agate.csv_py3.reader)
        self.assertIs(agate.csv.writer, agate.csv_py3.writer)
        self.assertIs(agate.csv.DictReader, agate.csv_py3.DictReader)
        self.assertIs(agate.csv.DictWriter, agate.csv_py3.DictWriter)

==> agate-1.9.1/tests/test_aggregations.py <==
import datetime
import sys
import unittest
import warnings
from decimal import Decimal

from agate import Table
from agate.aggregations import (IQR, MAD, All, Any, Count, Deciles, First, HasNulls, Max, MaxLength, MaxPrecision,
                                Mean, Median, Min, Mode, Percentiles, PopulationStDev, PopulationVariance, Quartiles,
                                Quintiles, StDev, Sum, Summary, Variance)
from agate.data_types import Boolean, DateTime, Number, Text, TimeDelta
from agate.exceptions import DataTypeError
from agate.utils import Quantiles
from agate.warns import NullCalculationWarning


class TestSimpleAggregation(unittest.TestCase):
    def setUp(self):
        self.rows = (
            (1, 2, 'a'),
            (2, 3, 'b'),
            (None, 4, 'c')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

        self.table = Table(self.rows, self.column_names, self.column_types)

    def test_summary(self):
        summary = Summary('one', Boolean(), lambda c: 2 in c)

        self.assertIsInstance(summary.get_aggregate_data_type(None), Boolean)

        summary.validate(self.table)

        self.assertEqual(summary.run(self.table), True)

    def test_has_nulls(self):
        has_nulls = HasNulls('one')

        self.assertIsInstance(has_nulls.get_aggregate_data_type(None), Boolean)

        has_nulls.validate(self.table)

        self.assertEqual(has_nulls.run(self.table), True)

    def test_any(self):
        Any('one', lambda d: d).validate(self.table)
self.assertIsInstance(Any('one', 2).get_aggregate_data_type(None), Boolean) self.assertEqual(Any('one', 2).run(self.table), True) self.assertEqual(Any('one', 5).run(self.table), False) self.assertEqual(Any('one', lambda d: d == 2).run(self.table), True) self.assertEqual(Any('one', lambda d: d == 5).run(self.table), False) def test_all(self): All('one', lambda d: d).validate(self.table) self.assertIsInstance(All('one', 5).get_aggregate_data_type(None), Boolean) self.assertEqual(All('one', lambda d: d != 5).run(self.table), True) self.assertEqual(All('one', lambda d: d == 2).run(self.table), False) def test_first(self): with self.assertRaises(ValueError): First('one', lambda d: d == 5).validate(self.table) First('one', lambda d: d).validate(self.table) self.assertIsInstance(First('one').get_aggregate_data_type(self.table), Number) self.assertEqual(First('one').run(self.table), 1) self.assertEqual(First('one', lambda d: d == 2).run(self.table), 2) self.assertEqual(First('one', lambda d: not d).run(self.table), None) def test_count(self): rows = ( (1, 2, 'a'), (2, 3, 'b'), (None, 4, 'c'), (1, 2, 'a'), (1, 2, 'a') ) table = Table(rows, self.column_names, self.column_types) self.assertIsInstance(Count().get_aggregate_data_type(table), Number) Count().validate(self.table) self.assertEqual(Count().run(table), 5) self.assertEqual(Count().run(table), 5) def test_count_column(self): rows = ( (1, 2, 'a'), (2, 3, 'b'), (None, 4, 'c'), (1, 2, 'a'), (1, 2, 'a') ) table = Table(rows, self.column_names, self.column_types) self.assertIsInstance(Count('one').get_aggregate_data_type(table), Number) Count('one').validate(self.table) self.assertEqual(Count('one').run(table), 4) self.assertEqual(Count('two').run(table), 5) def test_count_value(self): rows = ( (1, 2, 'a'), (2, 3, 'b'), (None, 4, 'c'), (1, 2, 'a'), (1, 2, 'a') ) table = Table(rows, self.column_names, self.column_types) self.assertIsInstance(Count('one', 1).get_aggregate_data_type(table), Number) Count('one', 
1).validate(self.table) self.assertEqual(Count('one', 1).run(table), 3) self.assertEqual(Count('one', 4).run(table), 0) self.assertEqual(Count('one', None).run(table), 1) class TestBooleanAggregation(unittest.TestCase): def test_any(self): rows = [ [True], [False], [None] ] table = Table(rows, ['test'], [Boolean()]) Any('test', True).validate(table) self.assertEqual(Any('test', True).run(table), True) rows = [ [False], [False], [None] ] table = Table(rows, ['test'], [Boolean()]) Any('test', True).validate(table) self.assertEqual(Any('test', True).run(table), False) self.assertEqual(Any('test', lambda r: r).run(table), False) self.assertEqual(Any('test', False).run(table), True) self.assertEqual(Any('test', lambda r: not r).run(table), True) def test_all(self): rows = [ [True], [True], [None] ] table = Table(rows, ['test'], [Boolean()]) All('test', True).validate(table) self.assertEqual(All('test', True).run(table), False) rows = [ [True], [True], [True] ] table = Table(rows, ['test'], [Boolean()]) All('test', True).validate(table) self.assertEqual(All('test', True).run(table), True) self.assertEqual(All('test', lambda r: r).run(table), True) self.assertEqual(All('test', False).run(table), False) self.assertEqual(All('test', lambda r: not r).run(table), False) class TestDateTimeAggregation(unittest.TestCase): def setUp(self): self.rows = [ [datetime.datetime(1994, 3, 3, 6, 31)], [datetime.datetime(1994, 3, 3, 6, 30, 30)], [datetime.datetime(1994, 3, 3, 6, 30)], ] self.table = Table(self.rows, ['test', 'null'], [DateTime(), DateTime()]) self.time_delta_rows = [ [datetime.timedelta(seconds=10), datetime.timedelta(seconds=15), None], [datetime.timedelta(seconds=20), None, None], ] self.time_delta_table = Table( self.time_delta_rows, ['test', 'mixed', 'null'], [TimeDelta(), TimeDelta(), TimeDelta()] ) def test_min(self): self.assertIsInstance(Min('test').get_aggregate_data_type(self.table), DateTime) Min('test').validate(self.table) 
self.assertEqual(Min('test').run(self.table), datetime.datetime(1994, 3, 3, 6, 30)) def test_min_all_nulls(self): self.assertIsNone(Min('null').run(self.table)) def test_min_time_delta(self): self.assertIsInstance(Min('test').get_aggregate_data_type(self.time_delta_table), TimeDelta) Min('test').validate(self.time_delta_table) self.assertEqual(Min('test').run(self.time_delta_table), datetime.timedelta(0, 10)) def test_max(self): self.assertIsInstance(Max('test').get_aggregate_data_type(self.table), DateTime) Max('test').validate(self.table) self.assertEqual(Max('test').run(self.table), datetime.datetime(1994, 3, 3, 6, 31)) def test_max_all_nulls(self): self.assertIsNone(Max('null').run(self.table)) def test_max_time_delta(self): self.assertIsInstance(Max('test').get_aggregate_data_type(self.time_delta_table), TimeDelta) Max('test').validate(self.time_delta_table) self.assertEqual(Max('test').run(self.time_delta_table), datetime.timedelta(0, 20)) def test_mean(self): with self.assertWarns(NullCalculationWarning): Mean('mixed').validate(self.time_delta_table) Mean('test').validate(self.time_delta_table) self.assertEqual(Mean('test').run(self.time_delta_table), datetime.timedelta(seconds=15)) def test_mean_all_nulls(self): self.assertIsNone(Mean('null').run(self.time_delta_table)) def test_mean_with_nulls(self): warnings.simplefilter('ignore') try: Mean('mixed').validate(self.time_delta_table) finally: warnings.resetwarnings() self.assertAlmostEqual(Mean('mixed').run(self.time_delta_table), datetime.timedelta(seconds=15)) def test_sum(self): self.assertIsInstance(Sum('test').get_aggregate_data_type(self.time_delta_table), TimeDelta) Sum('test').validate(self.time_delta_table) self.assertEqual(Sum('test').run(self.time_delta_table), datetime.timedelta(seconds=30)) def test_sum_all_nulls(self): self.assertEqual(Sum('null').run(self.time_delta_table), datetime.timedelta(0)) class TestNumberAggregation(unittest.TestCase): def setUp(self): self.rows = ( (Decimal('1.1'), 
Decimal('2.19'), 'a', None), (Decimal('2.7'), Decimal('3.42'), 'b', None), (None, Decimal('4.1'), 'c', None), (Decimal('2.7'), Decimal('3.42'), 'c', None) ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three', 'four'] self.column_types = [self.number_type, self.number_type, self.text_type, self.number_type] self.table = Table(self.rows, self.column_names, self.column_types) def test_max_precision(self): with self.assertRaises(DataTypeError): MaxPrecision('three').validate(self.table) self.assertIsInstance(MaxPrecision('one').get_aggregate_data_type(self.table), Number) MaxPrecision('one').validate(self.table) self.assertEqual(MaxPrecision('one').run(self.table), 1) self.assertEqual(MaxPrecision('two').run(self.table), 2) def test_max_precision_all_nulls(self): self.assertEqual(MaxPrecision('four').run(self.table), 0) def test_sum(self): with self.assertRaises(DataTypeError): Sum('three').validate(self.table) Sum('one').validate(self.table) self.assertEqual(Sum('one').run(self.table), Decimal('6.5')) self.assertEqual(Sum('two').run(self.table), Decimal('13.13')) def test_sum_all_nulls(self): self.assertEqual(Sum('four').run(self.table), Decimal('0')) def test_min(self): with self.assertRaises(DataTypeError): Min('three').validate(self.table) Min('one').validate(self.table) self.assertEqual(Min('one').run(self.table), Decimal('1.1')) self.assertEqual(Min('two').run(self.table), Decimal('2.19')) def test_min_all_nulls(self): self.assertIsNone(Min('four').run(self.table)) def test_max(self): with self.assertRaises(DataTypeError): Max('three').validate(self.table) Max('one').validate(self.table) self.assertEqual(Max('one').run(self.table), Decimal('2.7')) self.assertEqual(Max('two').run(self.table), Decimal('4.1')) def test_max_all_nulls(self): self.assertIsNone(Max('four').run(self.table)) def test_mean(self): with self.assertWarns(NullCalculationWarning): Mean('one').validate(self.table) Mean('two').validate(self.table) with 
self.assertRaises(DataTypeError): Mean('three').validate(self.table) self.assertEqual(Mean('two').run(self.table), Decimal('3.2825')) def test_mean_all_nulls(self): self.assertIsNone(Mean('four').run(self.table)) def test_mean_with_nulls(self): warnings.simplefilter('ignore') try: Mean('one').validate(self.table) finally: warnings.resetwarnings() self.assertAlmostEqual(Mean('one').run(self.table), Decimal('2.16666666')) def test_median(self): with self.assertWarns(NullCalculationWarning): Median('one').validate(self.table) warnings.simplefilter('ignore') try: with self.assertRaises(DataTypeError): Median('three').validate(self.table) finally: warnings.resetwarnings() Median('two').validate(self.table) self.assertIsInstance(Median('two').get_aggregate_data_type(self.table), Number) self.assertEqual(Median('two').run(self.table), Decimal('3.42')) def test_median_all_nulls(self): self.assertIsNone(Median('four').run(self.table)) def test_mode(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): Mode('one').validate(self.table) with self.assertRaises(DataTypeError): Mode('three').validate(self.table) warnings.simplefilter('ignore') try: Mode('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(Mode('two').get_aggregate_data_type(self.table), Number) self.assertEqual(Mode('two').run(self.table), Decimal('3.42')) def test_mode_all_nulls(self): self.assertIsNone(Mode('four').run(self.table)) def test_iqr(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): IQR('one').validate(self.table) with self.assertRaises(DataTypeError): IQR('three').validate(self.table) warnings.simplefilter('ignore') try: IQR('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(IQR('two').get_aggregate_data_type(self.table), Number) self.assertEqual(IQR('two').run(self.table), Decimal('0.955')) def 
test_irq_all_nulls(self): self.assertIsNone(IQR('four').run(self.table)) def test_variance(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): Variance('one').validate(self.table) with self.assertRaises(DataTypeError): Variance('three').validate(self.table) warnings.simplefilter('ignore') try: Variance('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(Variance('two').get_aggregate_data_type(self.table), Number) self.assertEqual( Variance('two').run(self.table).quantize(Decimal('0.0001')), Decimal('0.6332') ) def test_variance_all_nulls(self): self.assertIsNone(Variance('four').run(self.table)) def test_population_variance(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): PopulationVariance('one').validate(self.table) with self.assertRaises(DataTypeError): PopulationVariance('three').validate(self.table) warnings.simplefilter('ignore') try: PopulationVariance('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(PopulationVariance('two').get_aggregate_data_type(self.table), Number) self.assertEqual( PopulationVariance('two').run(self.table).quantize(Decimal('0.0001')), Decimal('0.4749') ) def test_population_variance_all_nulls(self): self.assertIsNone(PopulationVariance('four').run(self.table)) def test_stdev(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): StDev('one').validate(self.table) with self.assertRaises(DataTypeError): StDev('three').validate(self.table) warnings.simplefilter('ignore') try: StDev('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(StDev('two').get_aggregate_data_type(self.table), Number) self.assertAlmostEqual( StDev('two').run(self.table).quantize(Decimal('0.0001')), Decimal('0.7958') ) def test_stdev_all_nulls(self): 
self.assertIsNone(StDev('four').run(self.table)) def test_population_stdev(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): PopulationStDev('one').validate(self.table) with self.assertRaises(DataTypeError): PopulationStDev('three').validate(self.table) warnings.simplefilter('ignore') try: PopulationStDev('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(PopulationStDev('two').get_aggregate_data_type(self.table), Number) self.assertAlmostEqual( PopulationStDev('two').run(self.table).quantize(Decimal('0.0001')), Decimal('0.6891') ) def test_population_stdev_all_nulls(self): self.assertIsNone(PopulationStDev('four').run(self.table)) def test_mad(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): MAD('one').validate(self.table) with self.assertRaises(DataTypeError): MAD('three').validate(self.table) warnings.simplefilter('ignore') try: MAD('two').validate(self.table) finally: warnings.resetwarnings() self.assertIsInstance(MAD('two').get_aggregate_data_type(self.table), Number) self.assertAlmostEqual(MAD('two').run(self.table), Decimal('0')) def test_mad_all_nulls(self): self.assertIsNone(MAD('four').run(self.table)) def test_percentiles(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): Percentiles('one').validate(self.table) with self.assertRaises(DataTypeError): Percentiles('three').validate(self.table) warnings.simplefilter('ignore') try: Percentiles('two').validate(self.table) finally: warnings.resetwarnings() rows = [(n,) for n in range(1, 1001)] table = Table(rows, ['ints'], [self.number_type]) percentiles = Percentiles('ints').run(table) self.assertEqual(percentiles[0], Decimal('1')) self.assertEqual(percentiles[25], Decimal('250.5')) self.assertEqual(percentiles[50], Decimal('500.5')) self.assertEqual(percentiles[75], 
Decimal('750.5')) self.assertEqual(percentiles[99], Decimal('990.5')) self.assertEqual(percentiles[100], Decimal('1000')) def test_percentiles_all_nulls(self): self.assertEqual(Percentiles('four').run(self.table), Quantiles([None] * 101)) def test_percentiles_locate(self): rows = [(n,) for n in range(1, 1001)] table = Table(rows, ['ints'], [self.number_type]) percentiles = Percentiles('ints').run(table) self.assertEqual(percentiles.locate(251), Decimal('25')) self.assertEqual(percentiles.locate(260), Decimal('25')) self.assertEqual(percentiles.locate(261), Decimal('26')) with self.assertRaises(ValueError): percentiles.locate(0) with self.assertRaises(ValueError): percentiles.locate(1012) def test_quartiles(self): """ CDF quartile tests from: http://www.amstat.org/publications/jse/v14n3/langford.html#Parzen1979 """ with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): Quartiles('one').validate(self.table) with self.assertRaises(DataTypeError): Quartiles('three').validate(self.table) warnings.simplefilter('ignore') try: Quartiles('two').validate(self.table) finally: warnings.resetwarnings() # N = 4 rows = [(n,) for n in [1, 2, 3, 4]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '1.5', '2.5', '3.5', '4']): self.assertEqual(quartiles[i], Decimal(v)) # N = 5 rows = [(n,) for n in [1, 2, 3, 4, 5]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '3', '4', '5']): self.assertEqual(quartiles[i], Decimal(v)) # N = 6 rows = [(n,) for n in [1, 2, 3, 4, 5, 6]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '3.5', '5', '6']): self.assertEqual(quartiles[i], Decimal(v)) # N = 7 rows = [(n,) for n in [1, 2, 3, 4, 5, 6, 7]] table = Table(rows, ['ints'], [self.number_type]) quartiles = 
Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '4', '6', '7']): self.assertEqual(quartiles[i], Decimal(v)) # N = 8 (doubled) rows = [(n,) for n in [1, 1, 2, 2, 3, 3, 4, 4]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '1.5', '2.5', '3.5', '4']): self.assertEqual(quartiles[i], Decimal(v)) # N = 10 (doubled) rows = [(n,) for n in [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '3', '4', '5']): self.assertEqual(quartiles[i], Decimal(v)) # N = 12 (doubled) rows = [(n,) for n in [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '3.5', '5', '6']): self.assertEqual(quartiles[i], Decimal(v)) # N = 14 (doubled) rows = [(n,) for n in [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) for i, v in enumerate(['1', '2', '4', '6', '7']): self.assertEqual(quartiles[i], Decimal(v)) def test_quartiles_all_nulls(self): self.assertEqual(Quartiles('four').run(self.table), Quantiles([None] * 5)) def test_quartiles_locate(self): """ CDF quartile tests from: http://www.amstat.org/publications/jse/v14n3/langford.html#Parzen1979 """ # N = 4 rows = [(n,) for n in [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]] table = Table(rows, ['ints'], [self.number_type]) quartiles = Quartiles('ints').run(table) self.assertEqual(quartiles.locate(2), Decimal('0')) self.assertEqual(quartiles.locate(4), Decimal('1')) self.assertEqual(quartiles.locate(6), Decimal('2')) self.assertEqual(quartiles.locate(8), Decimal('3')) with self.assertRaises(ValueError): quartiles.locate(0) with self.assertRaises(ValueError): quartiles.locate(11) def test_quintiles(self): with warnings.catch_warnings(): warnings.simplefilter('error') with 
self.assertRaises(NullCalculationWarning): Quintiles('one').validate(self.table) with self.assertRaises(DataTypeError): Quintiles('three').validate(self.table) warnings.simplefilter('ignore') try: Quintiles('two').validate(self.table) finally: warnings.resetwarnings() rows = [(n,) for n in range(1, 1000)] table = Table(rows, ['ints'], [self.number_type]) quintiles = Quintiles('ints').run(table) for i, v in enumerate(['1', '200', '400', '600', '800', '999']): self.assertEqual(quintiles[i], Decimal(v)) def test_quintiles_all_nulls(self): self.assertEqual(Quintiles('four').run(self.table), Quantiles([None] * 6)) def test_deciles(self): with warnings.catch_warnings(): warnings.simplefilter('error') with self.assertRaises(NullCalculationWarning): Deciles('one').validate(self.table) with self.assertRaises(DataTypeError): Deciles('three').validate(self.table) warnings.simplefilter('ignore') try: Deciles('two').validate(self.table) finally: warnings.resetwarnings() rows = [(n,) for n in range(1, 1000)] table = Table(rows, ['ints'], [self.number_type]) deciles = Deciles('ints').run(table) for i, v in enumerate(['1', '100', '200', '300', '400', '500', '600', '700', '800', '900', '999']): self.assertEqual(deciles[i], Decimal(v)) def test_deciles_all_nulls(self): self.assertEqual(Deciles('four').run(self.table), Quantiles([None] * 11)) class TestTextAggregation(unittest.TestCase): def setUp(self): self.rows = [ ['a', None], ['gobble', None], ['w', None] ] self.table = Table(self.rows, ['test', 'null'], [Text(), Text()]) def test_max_length(self): MaxLength('test').validate(self.table) self.assertEqual(MaxLength('test').run(self.table), 6) self.assertIsInstance(MaxLength('test').run(self.table), Decimal) def test_max_length_all_nulls(self): self.assertEqual(MaxLength('null').run(self.table), 0) def test_max_length_unicode(self): """ This text documents different handling of wide-unicode characters in Python 2 and Python 3. 
        The former's behavior is broken, but can not be easily fixed.

        Bug: https://github.com/wireservice/agate/issues/649
        Reference: http://stackoverflow.com/a/35462951
        """
        rows = [
            ['a'],
            ['👍'],
            ['w']
        ]

        table = Table(rows, ['test'], [Text()])

        MaxLength('test').validate(table)

        # Non 4-byte versions of Python 2 (but not PyPy)
        if sys.maxunicode <= 65535:
            self.assertEqual(MaxLength('test').run(table), 2)
        # Modern versions of Python
        else:
            self.assertEqual(MaxLength('test').run(table), 1)

        self.assertIsInstance(MaxLength('test').run(table), Decimal)

    def test_max_length_invalid(self):
        rows = [
            [1],
            [2],
            [3]
        ]

        table = Table(rows, ['test'], [Number()])

        with self.assertRaises(DataTypeError):
            MaxLength('test').validate(table)

==> agate-1.9.1/tests/test_columns.py <==
import pickle
import unittest
from decimal import Decimal

from agate import Table
from agate.data_types import Number, Text


class TestColumn(unittest.TestCase):
    def setUp(self):
        self.rows = (
            (1, 2, 'a'),
            (2, 3, 'b'),
            (None, 4, 'c')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

        self.table = Table(self.rows, self.column_names, self.column_types)

    def test_index(self):
        self.assertEqual(self.table.columns['one'].index, 0)
        self.assertEqual(self.table.columns['two'].index, 1)
        self.assertEqual(self.table.columns['three'].index, 2)

    def test_name(self):
        self.assertEqual(self.table.columns['one'].name, 'one')

    def test_data_type(self):
        self.assertIs(self.table.columns['one'].data_type, self.number_type)

    def test_pickleable(self):
        pickle.dumps(self.table.columns['one'])

    def test_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')
        column = table.columns['one']

        self.assertSequenceEqual(column._keys, ['a', 'b', 'c'])
        self.assertEqual(column['b'], 2)

    def test_keys(self):
        table = Table(self.rows,
                      self.column_names, self.column_types, row_names='three')

        self.assertIs(self.table.columns['one'].keys(), None)
        self.assertSequenceEqual(table.columns['one'].keys(), ['a', 'b', 'c'])

    def test_values(self):
        self.assertSequenceEqual(
            self.table.columns['one'].values(),
            [Decimal('1'), Decimal('2'), None]
        )

    def test_values_distinct(self):
        rows = (
            (1, 2),
            (2, 3),
            (None, 3)
        )

        table = Table(rows, ('one', 'two'), [self.number_type, self.number_type])

        self.assertSequenceEqual(
            table.columns['two'].values_distinct(),
            [Decimal('2'), Decimal('3')]
        )

    def test_items(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')

        self.assertSequenceEqual(table.columns['one'].items(), [
            ('a', Decimal('1')),
            ('b', Decimal('2')),
            ('c', None)
        ])

    def test_dict(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')

        self.assertDictEqual(table.columns['one'].dict(), {
            'a': Decimal('1'),
            'b': Decimal('2'),
            'c': None
        })

    def test_values_without_nulls(self):
        self.assertSequenceEqual(
            self.table.columns['one'].values_without_nulls(),
            [Decimal('1'), Decimal('2')]
        )

    def test_values_sorted(self):
        rows = (
            (2, 2, 'a'),
            (None, 3, 'b'),
            (1, 4, 'c')
        )

        table = Table(rows, self.column_names, self.column_types)

        self.assertSequenceEqual(
            table.columns['one'].values_sorted(),
            [Decimal('1'), Decimal('2'), None]
        )

    def test_values_without_nulls_sorted(self):
        rows = (
            (2, 2, 'a'),
            (None, 3, 'b'),
            (1, 4, 'c')
        )

        table = Table(rows, self.column_names, self.column_types)

        self.assertSequenceEqual(
            table.columns['one'].values_without_nulls_sorted(),
            [Decimal('1'), Decimal('2')]
        )

==> agate-1.9.1/tests/test_computations.py <==
import datetime
import unittest
import warnings
from decimal import Decimal

from agate import Table
from agate.computations import Change, Formula, Percent, PercentChange, PercentileRank, Rank, Slug
from agate.data_types import Boolean, Date, DateTime, Number,
Text, TimeDelta from agate.exceptions import CastError, DataTypeError from agate.warns import NullCalculationWarning class TestTableComputation(unittest.TestCase): def setUp(self): self.rows = ( ('a', 2, 3, 4), (None, 3, 5, None), ('a', 2, 4, None), ('b', 3, 4, None) ) self.number_type = Number() self.text_type = Text() self.column_names = [ 'one', 'two', 'three', 'four' ] self.column_types = [ self.text_type, self.number_type, self.number_type, self.number_type ] self.table = Table(self.rows, self.column_names, self.column_types) def test_formula(self): new_table = self.table.compute([ ('test', Formula(self.number_type, lambda r: r['two'] + r['three'])) ]) self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), Decimal('5'))) self.assertEqual(new_table.columns['test'][0], Decimal('5')) self.assertEqual(new_table.columns['test'][1], Decimal('8')) self.assertEqual(new_table.columns['test'][2], Decimal('6')) self.assertEqual(new_table.columns['test'][3], Decimal('7')) def test_formula_invalid(self): with self.assertRaises(CastError): self.table.compute([ ('test', Formula(self.number_type, lambda r: r['one'])) ]) def test_formula_no_validate(self): new_table = self.table.compute([ ('test', Formula(self.number_type, lambda r: r['one'], cast=False)) ]) # Now everything is screwed up self.assertSequenceEqual(new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), 'a')) self.assertEqual(new_table.columns['test'][0], 'a') def test_change(self): new_table = self.table.compute([ ('test', Change('two', 'three')) ]) self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), Decimal('1'))) self.assertEqual(new_table.columns['test'][0], Decimal('1')) 
self.assertEqual(new_table.columns['test'][1], Decimal('2')) self.assertEqual(new_table.columns['test'][2], Decimal('2')) self.assertEqual(new_table.columns['test'][3], Decimal('1')) def test_change_mixed_types(self): rows = ( ('1', '10/24/1978'), ('2', '11/13/1974') ) column_names = ['number', 'date'] column_types = [Number(), Date()] table = Table(rows, column_names, column_types) with self.assertRaises(DataTypeError): table.compute([ ('test', Change('number', 'date')) ]) def test_changed_invalid_types(self): rows = ( (False, True), (True, False) ) column_names = ['before', 'after'] column_types = [Boolean(), Boolean()] table = Table(rows, column_names, column_types) with self.assertRaises(DataTypeError): table.compute([ ('test', Change('before', 'after')) ]) def test_change_nulls(self): with self.assertWarns(NullCalculationWarning): new_table = self.table.compute([ ('test', Change('three', 'four')) ]) with self.assertWarns(NullCalculationWarning): new_table = self.table.compute([ ('test', Change('four', 'three')) ]) warnings.simplefilter('ignore') try: new_table = self.table.compute([ ('test', Change('three', 'four')) ]) finally: warnings.resetwarnings() self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), Decimal('1'))) self.assertEqual(new_table.columns['test'][0], Decimal('1')) self.assertEqual(new_table.columns['test'][1], None) self.assertEqual(new_table.columns['test'][2], None) self.assertEqual(new_table.columns['test'][3], None) def test_percent(self): new_table = self.table.compute([ ('test', Percent('two')) ]) self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertSequenceEqual( new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), 
Decimal('20.0')) ) self.assertEqual(to_one_place(new_table.columns['test'][0]), Decimal('20.0')) self.assertEqual(to_one_place(new_table.columns['test'][1]), Decimal('30.0')) self.assertEqual(to_one_place(new_table.columns['test'][2]), Decimal('20.0')) self.assertEqual(to_one_place(new_table.columns['test'][3]), Decimal('30.0')) def test_percent_total_override(self): new_table = self.table.compute([ ('test', Percent('two', 5)) ]) def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertEqual(to_one_place(new_table.columns['test'][0]), Decimal('40.0')) self.assertEqual(to_one_place(new_table.columns['test'][1]), Decimal('60.0')) self.assertEqual(to_one_place(new_table.columns['test'][2]), Decimal('40.0')) self.assertEqual(to_one_place(new_table.columns['test'][3]), Decimal('60.0')) with self.assertRaises(DataTypeError): self.table.compute([ ('test', Percent('two', 0)) ]) with self.assertRaises(DataTypeError): self.table.compute([ ('test', Percent('two', -1)) ]) with self.assertRaises(DataTypeError): zero_table = Table([[0]], ['zero'], [self.number_type]) new_table = zero_table.compute([('test', Percent('zero'))]) def test_percent_zeros(self): column_names = ['label', 'value'] rows = ( ('one', 25), ('two', 25), ('three', 0) ) new_table = Table(rows, column_names) new_table = new_table.compute([ ('test', Percent('value')), ]) def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertEqual(to_one_place(new_table.columns['test'][0]), Decimal('50.0')) self.assertEqual(to_one_place(new_table.columns['test'][1]), Decimal('50.0')) self.assertEqual(to_one_place(new_table.columns['test'][2]), Decimal('0.0')) def test_percent_nulls(self): warnings.simplefilter('ignore') try: new_table = self.table.compute([ ('test', Percent('four')) ]) finally: warnings.resetwarnings() def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertEqual( to_one_place(new_table.columns['test'][0]), Decimal('100.0') ) self.assertEqual(new_table.columns['test'][1], None) 
self.assertEqual(new_table.columns['test'][2], None) self.assertEqual(new_table.columns['test'][3], None) def test_percent_change(self): new_table = self.table.compute([ ('test', PercentChange('two', 'three')) ]) self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertSequenceEqual(new_table.rows[0], ('a', Decimal('2'), Decimal('3'), Decimal('4'), Decimal('50.0'))) self.assertEqual(to_one_place(new_table.columns['test'][0]), Decimal('50.0')) self.assertEqual(to_one_place(new_table.columns['test'][1]), Decimal('66.7')) self.assertEqual(to_one_place(new_table.columns['test'][2]), Decimal('100.0')) self.assertEqual(to_one_place(new_table.columns['test'][3]), Decimal('33.3')) def test_percent_change_invalid_columns(self): with self.assertRaises(DataTypeError): self.table.compute([ ('test', PercentChange('one', 'three')) ]) with self.assertRaises(DataTypeError): self.table.compute([ ('test', PercentChange('three', 'one')) ]) def test_percent_change_nulls(self): with self.assertWarns(NullCalculationWarning): new_table = self.table.compute([ ('test', PercentChange('three', 'four')) ]) with self.assertWarns(NullCalculationWarning): new_table = self.table.compute([ ('test', PercentChange('four', 'three')) ]) warnings.simplefilter('ignore') try: new_table = self.table.compute([ ('test', PercentChange('three', 'four')) ]) finally: warnings.resetwarnings() self.assertIsNot(new_table, self.table) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) def to_one_place(d): return d.quantize(Decimal('0.1')) self.assertSequenceEqual(new_table.rows[2], ('a', Decimal('2'), Decimal('4'), None, None)) self.assertEqual(to_one_place(new_table.columns['test'][0]), Decimal('33.3')) self.assertEqual(new_table.columns['test'][1], None) self.assertEqual(new_table.columns['test'][2], None) 
self.assertEqual(new_table.columns['test'][3], None) def test_rank_number(self): new_table = self.table.compute([ ('rank', Rank('two')) ]) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.columns['rank'], (1, 3, 1, 3)) self.assertIsInstance(new_table.columns['rank'][0], Decimal) def test_rank_number_reverse(self): new_table = self.table.compute([ ('rank', Rank('two', reverse=True)) ]) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.columns['rank'], (3, 1, 3, 1)) def test_rank_number_key(self): new_table = self.table.compute([ ('rank', Rank('two', comparer=lambda x, y: int(y - x))) ]) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.columns['rank'], (3, 1, 3, 1)) def test_rank_number_reverse_key(self): new_table = self.table.compute([ ('rank', Rank('two', comparer=lambda x, y: int(y - x), reverse=True)) ]) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.columns['rank'], (1, 3, 1, 3)) def test_rank_text(self): new_table = self.table.compute([ ('rank', Rank('one')) ]) self.assertEqual(len(new_table.rows), 4) self.assertEqual(len(new_table.columns), 5) self.assertSequenceEqual(new_table.columns['rank'], (1, 4, 1, 3)) def test_percentile_rank(self): rows = [(n,) for n in range(1, 1001)] table = Table(rows, ['ints'], [self.number_type]) new_table = table.compute([ ('percentiles', PercentileRank('ints')) ]) self.assertEqual(len(new_table.rows), 1000) self.assertEqual(len(new_table.columns), 2) self.assertSequenceEqual(new_table.rows[0], (1, 0)) self.assertSequenceEqual(new_table.rows[50], (51, 5)) self.assertSequenceEqual(new_table.rows[499], (500, 49)) self.assertSequenceEqual(new_table.rows[500], (501, 50)) self.assertSequenceEqual(new_table.rows[998], (999, 99)) 
self.assertSequenceEqual(new_table.rows[999], (1000, 100)) self.assertIsInstance(new_table.columns['percentiles'][0], Decimal) self.assertIsInstance(new_table.columns['percentiles'][-1], Decimal) def test_percentile_rank_invalid_types(self): with self.assertRaises(DataTypeError): self.table.compute([ ('test', PercentileRank('one')) ]) def test_slug(self): rows = ( ('hello world', 2), ('Ab*c #e', 2), ('He11O W0rld', 3) ) expected = ['hello_world', 'ab_c_e', 'he11o_w0rld'] table = Table(rows, ['one', 'two'], [self.text_type, self.number_type]).compute([ ('slugs', Slug('one')) ]) self.assertSequenceEqual(table.columns['slugs'], expected) def test_slug_column_name_sequence(self): rows = ( ('hello world', 2, 'Ab*c #e'), ('Ab*c #e', 2, 'He11O W0rld'), ('He11O W0rld', 3, 'hello world') ) expected = ['hello_world_ab_c_e', 'ab_c_e_he11o_w0rld', 'he11o_w0rld_hello_world'] table1 = Table(rows, ['one', 'two', 'three'], [self.text_type, self.number_type, self.text_type]) table2 = table1.compute([ ('slugs', Slug(['one', 'three'])) ]) self.assertSequenceEqual(table2.columns['slugs'], expected) def test_slug_ensure_unique(self): rows = ( ('hello world', 2), ('Ab*c #e', 2), ('He11O W0rld', 3), ('HellO WOrld ', 3) ) expected = ['hello_world', 'ab_c_e', 'he11o_w0rld', 'hello_world_2'] table = Table(rows, ['one', 'two'], [self.text_type, self.number_type]).compute([ ('slugs', Slug('one', ensure_unique=True)) ]) self.assertSequenceEqual(table.columns['slugs'], expected) def test_slug_contains_null_error(self): with self.assertRaises(ValueError): self.table.compute([ ('slugs', Slug('one', ensure_unique=True)) ]) class TestDateAndTimeComputations(unittest.TestCase): def test_change_dates(self): rows = ( ('10/4/2015', '10/7/2015'), ('10/2/2015', '9/28/2015'), ('9/28/2015', '9/1/2015') ) date_type = Date() column_names = ['one', 'two'] column_types = [date_type, date_type] table = Table(rows, column_names, column_types) new_table = table.compute([ ('test', Change('one', 'two')) ]) 
self.assertIsNot(new_table, table) self.assertEqual(len(new_table.rows), 3) self.assertEqual(len(new_table.columns), 3) self.assertSequenceEqual(new_table.rows[0], ( datetime.date(2015, 10, 4), datetime.date(2015, 10, 7), datetime.timedelta(days=3) )) self.assertEqual(new_table.columns['test'][0], datetime.timedelta(days=3)) self.assertEqual(new_table.columns['test'][1], datetime.timedelta(days=-4)) self.assertEqual(new_table.columns['test'][2], datetime.timedelta(days=-27)) def test_change_datetimes(self): rows = ( ('10/4/2015 4:43', '10/7/2015 4:50'), ('10/2/2015 12 PM', '9/28/2015 12 PM'), ('9/28/2015 12:00:00', '9/1/2015 6 PM') ) datetime_type = DateTime() column_names = ['one', 'two'] column_types = [datetime_type, datetime_type] table = Table(rows, column_names, column_types) new_table = table.compute([ ('test', Change('one', 'two')) ]) self.assertIsNot(new_table, table) self.assertEqual(len(new_table.rows), 3) self.assertEqual(len(new_table.columns), 3) self.assertSequenceEqual(new_table.rows[0], ( datetime.datetime(2015, 10, 4, 4, 43), datetime.datetime(2015, 10, 7, 4, 50), datetime.timedelta(days=3, minutes=7) )) self.assertEqual(new_table.columns['test'][0], datetime.timedelta(days=3, minutes=7)) self.assertEqual(new_table.columns['test'][1], datetime.timedelta(days=-4)) self.assertEqual(new_table.columns['test'][2], datetime.timedelta(days=-26, hours=-18)) def test_change_timedeltas(self): rows = ( ('4:15', '8:18'), ('4h 2m', '2h'), ('4 weeks', '27 days') ) timedelta_type = TimeDelta() column_names = ['one', 'two'] column_types = [timedelta_type, timedelta_type] table = Table(rows, column_names, column_types) new_table = table.compute([ ('test', Change('one', 'two')) ]) self.assertIsNot(new_table, table) self.assertEqual(len(new_table.rows), 3) self.assertEqual(len(new_table.columns), 3) self.assertSequenceEqual(new_table.rows[0], ( datetime.timedelta(minutes=4, seconds=15), datetime.timedelta(minutes=8, seconds=18), datetime.timedelta(minutes=4, 
seconds=3) )) self.assertEqual(new_table.columns['test'][0], datetime.timedelta(minutes=4, seconds=3)) self.assertEqual(new_table.columns['test'][1], datetime.timedelta(hours=-2, minutes=-2)) self.assertEqual(new_table.columns['test'][2], datetime.timedelta(days=-1)) agate-1.9.1/tests/test_data_types.py000066400000000000000000000517001454111471500175210ustar00rootroot00000000000000import datetime import pickle import unittest from decimal import Decimal import parsedatetime try: from zoneinfo import ZoneInfo except ImportError: # Fallback for Python < 3.9 from backports.zoneinfo import ZoneInfo from agate.data_types import Boolean, Date, DateTime, Number, Text, TimeDelta from agate.exceptions import CastError class TestText(unittest.TestCase): def setUp(self): self.type = Text() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), True) self.assertEqual(self.type.test('True'), True) self.assertEqual(self.type.test(1), True) self.assertEqual(self.type.test(Decimal('1')), True) self.assertEqual(self.type.test('2.7'), True) self.assertEqual(self.type.test(2.7), True) self.assertEqual(self.type.test('3/1/1994'), True) self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), True) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), True) self.assertEqual(self.type.test('2015-01-01 02:34'), True) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), True) self.assertEqual(self.type.test('4:10'), True) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), True) self.assertEqual(self.type.test('a'), True) self.assertEqual(self.type.test('A\nB'), True) self.assertEqual(self.type.test('👍'), True) self.assertEqual(self.type.test('05_leslie3d_base'), True) self.assertEqual(self.type.test('2016-12-29'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), True) 
self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), True) def test_cast(self): values = ('a', 1, None, Decimal('2.7'), 'n/a', '👍', ' foo', 'foo ') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ('a', '1', None, '2.7', None, '👍', ' foo', 'foo ')) def test_no_cast_nulls(self): values = ('', 'N/A', None) t = Text() casted = tuple(t.cast(v) for v in values) self.assertSequenceEqual(casted, (None, None, None)) t = Text(cast_nulls=False) casted = tuple(t.cast(v) for v in values) self.assertSequenceEqual(casted, ('', 'N/A', None)) def test_null_values(self): t = Text(null_values=['Bad Value']) self.assertEqual(t.cast('Bad Value'), None) class TestBoolean(unittest.TestCase): def setUp(self): self.type = Boolean() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), True) self.assertEqual(self.type.test('True'), True) self.assertEqual(self.type.test('1'), True) self.assertEqual(self.type.test(1), True) self.assertEqual(self.type.test(Decimal('1')), True) self.assertEqual(self.type.test('0'), True) self.assertEqual(self.type.test(0), True) self.assertEqual(self.type.test(Decimal('0')), True) self.assertEqual(self.type.test('2.7'), False) self.assertEqual(self.type.test(2.7), False) self.assertEqual(self.type.test('3/1/1994'), False) self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), False) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), False) self.assertEqual(self.type.test('2015-01-01 02:34'), False) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), False) self.assertEqual(self.type.test('4:10'), False) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), False) self.assertEqual(self.type.test('a'), False) self.assertEqual(self.type.test('A\nB'), False) self.assertEqual(self.type.test('👍'), False) self.assertEqual(self.type.test('05_leslie3d_base'), False) 
self.assertEqual(self.type.test('2016-12-29'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), False) def test_cast(self): values = (True, 'yes', None, False, 'no', 'n/a', '1', 0) casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, (True, True, None, False, False, None, True, False)) def test_cast_custom_strings(self): values = ('a', 'b', 'c', 'd', 'e', 'f') boolean_type = Boolean( true_values=('a', 'b'), false_values=('d', 'e'), null_values=('c', 'f') ) casted = tuple(boolean_type.cast(v) for v in values) self.assertSequenceEqual(casted, (True, True, None, False, False, None)) def test_cast_error(self): with self.assertRaises(CastError): self.type.cast('quack') class TestNumber(unittest.TestCase): def setUp(self): self.type = Number() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), True) self.assertEqual(self.type.test('True'), False) self.assertEqual(self.type.test(1), True) self.assertEqual(self.type.test(Decimal('1')), True) self.assertEqual(self.type.test('2.7'), True) self.assertEqual(self.type.test(2.7), True) self.assertEqual(self.type.test('3/1/1994'), False) self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), False) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), False) self.assertEqual(self.type.test('2015-01-01 02:34'), False) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), False) self.assertEqual(self.type.test('4:10'), False) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), False) self.assertEqual(self.type.test('a'), False) self.assertEqual(self.type.test('A\nB'), False) self.assertEqual(self.type.test('👍'), False) self.assertEqual(self.type.test('05_leslie3d_base'), False) 
self.assertEqual(self.type.test('2016-12-29'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), False) def test_cast(self): values = (2, 1, None, Decimal('2.7'), 'n/a', '2.7', '200,000,000') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual( casted, (Decimal('2'), Decimal('1'), None, Decimal('2.7'), None, Decimal('2.7'), Decimal('200000000')) ) def test_boolean_cast(self): values = (True, False) casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, (Decimal('1'), Decimal('0'))) def test_currency_cast(self): values = ('$2.70', '-$0.70', '€14', '50¢', '-75¢', '-$1,287') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual( casted, (Decimal('2.7'), Decimal('-0.7'), Decimal('14'), Decimal('50'), Decimal('-75'), Decimal('-1287')) ) def test_cast_locale(self): values = (2, 1, None, Decimal('2.7'), 'n/a', '2,7', '200.000.000') casted = tuple(Number(locale='de_DE.UTF-8').cast(v) for v in values) self.assertSequenceEqual( casted, (Decimal('2'), Decimal('1'), None, Decimal('2.7'), None, Decimal('2.7'), Decimal('200000000')) ) def test_cast_text(self): with self.assertRaises(CastError): self.type.cast('a') def test_cast_floats(self): self.assertAlmostEqual(self.type.cast(0.1 + 0.2), Decimal('0.3')) self.assertEqual(self.type.cast(0.12345123456), Decimal('0.12345123456')) def test_cast_error(self): with self.assertRaises(CastError): self.type.cast('quack') class TestDate(unittest.TestCase): def setUp(self): self.type = Date() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), False) self.assertEqual(self.type.test('True'), False) self.assertEqual(self.type.test(1), False) self.assertEqual(self.type.test(Decimal('1')), False) 
self.assertEqual(self.type.test('2.7'), True) self.assertEqual(self.type.test(2.7), False) self.assertEqual(self.type.test('3/1/1994'), True) self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), True) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), False) self.assertEqual(self.type.test('2015-01-01 02:34'), False) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), False) self.assertEqual(self.type.test('4:10'), False) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), False) self.assertEqual(self.type.test('a'), False) self.assertEqual(self.type.test('A\nB'), False) self.assertEqual(self.type.test('👍'), False) self.assertEqual(self.type.test('05_leslie3d_base'), False) self.assertEqual(self.type.test('2016-12-29'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), False) self.assertEqual(self.type.test('MC 5.7.10 Per Dorothy Carroll'), False) self.assertEqual(self.type.test('testing workgroup fix - 4/7/2010 - Marcy Liberty'), False) def test_test_format(self): date_type = Date(date_format='%m-%d-%Y') self.assertEqual(date_type.test('3/1/1994'), False) self.assertEqual(date_type.test('03-01-1994'), True) self.assertEqual(date_type.test(datetime.date(1994, 3, 1)), True) def test_iso_format(self): values = ('1994-03-01', '2011-02-17') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.date(1994, 3, 1), datetime.date(2011, 2, 17) )) def test_cast_parser(self): values = ('3/1/1994', '2/17/2011', None, 'January 5th, 1984', 'n/a') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.date(1994, 3, 1), datetime.date(2011, 2, 17), None, datetime.date(1984, 1, 5), None )) def test_cast_format(self): date_type = Date(date_format='%m-%d-%Y') values = ('03-01-1994', '02-17-2011', None, 
'01-05-1984', 'n/a') casted = tuple(date_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.date(1994, 3, 1), datetime.date(2011, 2, 17), None, datetime.date(1984, 1, 5), None )) def test_cast_format_locale(self): date_type = Date(date_format='%d-%b-%Y', locale='de_DE.UTF-8') # March can be abbreviated to Mrz or Mär depending on the locale version, # so we use December in the first value to ensure the test passes everywhere values = ('01-Dez-1994', '17-Feb-2011', None, '05-Jan-1984', 'n/a') casted = tuple(date_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.date(1994, 12, 1), datetime.date(2011, 2, 17), None, datetime.date(1984, 1, 5), None )) def test_cast_locale(self): date_type = Date(locale='fr_FR') values = ('01 mars 1994', 'jeudi 17 février 2011', None, '5 janvier 1984', 'n/a') casted = tuple(date_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.date(1994, 3, 1), datetime.date(2011, 2, 17), None, datetime.date(1984, 1, 5), None )) def test_cast_error(self): with self.assertRaises(CastError): self.type.cast('quack') def test_pickle_parser(self): from_pickle = pickle.loads(pickle.dumps(self.type)) self.assertEqual(from_pickle.date_format, self.type.date_format) self.assertEqual(from_pickle.locale, self.type.locale) self.assertIsInstance(from_pickle._constants, parsedatetime.Constants) self.assertIsInstance(from_pickle._parser, parsedatetime.Calendar) class TestDateTime(unittest.TestCase): def setUp(self): self.type = DateTime() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), False) self.assertEqual(self.type.test('True'), False) self.assertEqual(self.type.test(1), False) self.assertEqual(self.type.test(Decimal('1')), False) self.assertEqual(self.type.test('2.7'), True) self.assertEqual(self.type.test(2.7), False) self.assertEqual(self.type.test('3/1/1994'), True) 
self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), True) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), True) self.assertEqual(self.type.test('2015-01-01 02:34'), True) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), True) self.assertEqual(self.type.test('4:10'), False) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), False) self.assertEqual(self.type.test('a'), False) self.assertEqual(self.type.test('A\nB'), False) self.assertEqual(self.type.test('👍'), False) self.assertEqual(self.type.test('05_leslie3d_base'), False) self.assertEqual(self.type.test('2016-12-29'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), True) self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), True) self.assertEqual(self.type.test('720(38-4)31A-1.1(A)'), False) def test_test_format(self): datetime_type = DateTime(datetime_format='%m-%d-%Y %I:%M %p') self.assertEqual(datetime_type.test('3/1/1994 12:30 PM'), False) self.assertEqual(datetime_type.test('03-01-1994 12:30 PM'), True) self.assertEqual(datetime_type.test(datetime.datetime(1994, 3, 1, 12, 30)), True) def test_iso_format(self): values = ('1994-03-01T12:30:00', '2011-02-17T06:30') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0), datetime.datetime(2011, 2, 17, 6, 30, 0) )) def test_cast_parser(self): values = ('3/1/1994 12:30 PM', '2/17/2011 06:30', None, 'January 5th, 1984 22:37', 'n/a', '2015-01-01 02:34') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0), datetime.datetime(2011, 2, 17, 6, 30, 0), None, datetime.datetime(1984, 1, 5, 22, 37, 0), None, datetime.datetime(2015, 1, 1, 2, 34, 0) )) def test_cast_parser_timezone(self): tzinfo = ZoneInfo('US/Pacific') datetime_type = DateTime(timezone=tzinfo) values = ('3/1/1994 12:30 
PM', '2/17/2011 06:30', None, 'January 5th, 1984 22:37', 'n/a') casted = tuple(datetime_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0, 0, tzinfo=tzinfo), datetime.datetime(2011, 2, 17, 6, 30, 0, 0, tzinfo=tzinfo), None, datetime.datetime(1984, 1, 5, 22, 37, 0, 0, tzinfo=tzinfo), None )) def test_cast_format(self): datetime_type = DateTime(datetime_format='%m-%d-%Y %I:%M %p') values = ('03-01-1994 12:30 PM', '02-17-2011 06:30 AM', None, '01-05-1984 06:30 PM', 'n/a') casted = tuple(datetime_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0), datetime.datetime(2011, 2, 17, 6, 30, 0), None, datetime.datetime(1984, 1, 5, 18, 30, 0), None )) def test_cast_format_locale(self): date_type = DateTime(datetime_format='%Y-%m-%d %I:%M %p', locale='ko_KR.UTF-8') # Date formats depend on the platform's strftime/strptime implementation; # some platforms like macOS always return AM/PM for day periods (%p), # so we will catch any CastError that may arise from the conversion possible_values = ( ('1994-03-01 12:30 오후', '2011-02-17 06:30 오전', None, '1984-01-05 06:30 오후', 'n/a'), ('1994-03-01 12:30 PM', '2011-02-17 06:30 AM', None, '1984-01-05 06:30 PM', 'n/a'), ) valid = False exceptions = [] for values in possible_values: try: casted = tuple(date_type.cast(v) for v in values) except CastError as e: exceptions.append(repr(e)) continue self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0), datetime.datetime(2011, 2, 17, 6, 30, 0), None, datetime.datetime(1984, 1, 5, 18, 30, 0), None )) valid = True if not valid: raise AssertionError('\n\n'.join(exceptions)) def test_cast_locale(self): date_type = DateTime(locale='fr_FR') values = ('01/03/1994 12:30', '17/2/11 6:30', None, '5/01/84 18:30', 'n/a') casted = tuple(date_type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.datetime(1994, 3, 1, 12, 30, 0), datetime.datetime(2011, 2, 17, 6, 
30, 0), None, datetime.datetime(1984, 1, 5, 18, 30, 0), None )) def test_cast_error(self): with self.assertRaises(CastError): self.type.cast('quack') def test_pickle_parser(self): from_pickle = pickle.loads(pickle.dumps(self.type)) self.assertEqual(from_pickle.datetime_format, self.type.datetime_format) self.assertEqual(from_pickle.timezone, self.type.timezone) self.assertEqual(from_pickle.locale, self.type.locale) self.assertEqual(from_pickle._source_time, self.type._source_time) self.assertIsInstance(from_pickle._constants, parsedatetime.Constants) self.assertIsInstance(from_pickle._parser, parsedatetime.Calendar) class TestTimeDelta(unittest.TestCase): def setUp(self): self.type = TimeDelta() def test_test(self): self.assertEqual(self.type.test(None), True) self.assertEqual(self.type.test('N/A'), True) self.assertEqual(self.type.test(True), False) self.assertEqual(self.type.test('True'), False) self.assertEqual(self.type.test(1), False) self.assertEqual(self.type.test(Decimal('1')), False) self.assertEqual(self.type.test('2.7'), False) self.assertEqual(self.type.test(2.7), False) self.assertEqual(self.type.test('3/1/1994'), False) self.assertEqual(self.type.test(datetime.date(1994, 3, 1)), False) self.assertEqual(self.type.test('3/1/1994 12:30 PM'), False) self.assertEqual(self.type.test('2015-01-01 02:34'), False) self.assertEqual(self.type.test(datetime.datetime(1994, 3, 1, 12, 30)), False) self.assertEqual(self.type.test('4:10'), True) self.assertEqual(self.type.test(datetime.timedelta(hours=4, minutes=10)), True) self.assertEqual(self.type.test('a'), False) self.assertEqual(self.type.test('A\nB'), False) self.assertEqual(self.type.test('👍'), False) self.assertEqual(self.type.test('05_leslie3d_base'), False) self.assertEqual(self.type.test('2016-12-29'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30Z'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30+06:00'), False) self.assertEqual(self.type.test('2016-12-29T11:43:30-06:00'), False) 
def test_cast_parser(self): values = ('4:10', '1.2m', '172 hours', '5 weeks, 2 days', 'n/a') casted = tuple(self.type.cast(v) for v in values) self.assertSequenceEqual(casted, ( datetime.timedelta(minutes=4, seconds=10), datetime.timedelta(minutes=1, seconds=12), datetime.timedelta(hours=172), datetime.timedelta(weeks=5, days=2), None )) def test_cast_error(self): with self.assertRaises(CastError): self.type.cast('quack') agate-1.9.1/tests/test_fixed.py000066400000000000000000000032271454111471500164640ustar00rootroot00000000000000import unittest from agate import csv, fixed class TestFixed(unittest.TestCase): def test_reader(self): with open('examples/testfixed_converted.csv') as f: csv_reader = csv.Reader(f) csv_header = next(csv_reader) csv_data = list(csv_reader) with open('examples/testfixed') as f: with open('examples/testfixed_schema.csv') as schema_f: fixed_reader = fixed.Reader(f, schema_f) fixed_data = list(fixed_reader) self.assertEqual(csv_header, fixed_reader.fieldnames) self.assertEqual(csv_data, fixed_data) def test_reader_func(self): with open('examples/testfixed_converted.csv') as f: csv_reader = csv.reader(f) csv_header = next(csv_reader) csv_data = list(csv_reader) with open('examples/testfixed') as f: with open('examples/testfixed_schema.csv') as schema_f: fixed_reader = fixed.reader(f, schema_f) fixed_data = list(fixed_reader) self.assertEqual(csv_header, fixed_reader.fieldnames) self.assertEqual(csv_data, fixed_data) def test_dict_reader(self): with open('examples/testfixed_converted.csv') as f: csv_reader = csv.DictReader(f) csv_data = list(csv_reader) with open('examples/testfixed') as f: with open('examples/testfixed_schema.csv') as schema_f: fixed_reader = fixed.DictReader(f, schema_f) fixed_data = list(fixed_reader) self.assertEqual(csv_reader.fieldnames, fixed_reader.fieldnames) self.assertEqual(csv_data, fixed_data) agate-1.9.1/tests/test_from_json.py000066400000000000000000000072471454111471500173670ustar00rootroot00000000000000from 
agate import Table from agate.data_types import Boolean, Date, DateTime, Number, Text, TimeDelta from agate.rows import Row from agate.testcase import AgateTestCase from agate.type_tester import TypeTester class TestJSON(AgateTestCase): def setUp(self): self.rows = ( (1, 'a', True, '11/4/2015', '11/4/2015 12:22 PM', '4:15'), (2, '👍', False, '11/5/2015', '11/4/2015 12:45 PM', '6:18'), (None, 'b', None, None, None, None) ) self.column_names = [ 'number', 'text', 'boolean', 'date', 'datetime', 'timedelta' ] self.column_types = [ Number(), Text(), Boolean(), Date(), DateTime(), TimeDelta() ] def test_from_json(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_json('examples/test.json') self.assertColumnNames(table2, self.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_json_file_like_object(self): table1 = Table(self.rows, self.column_names, self.column_types) with open('examples/test.json', encoding='utf-8') as f: table2 = Table.from_json(f) self.assertColumnNames(table2, self.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_json_with_key(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_json('examples/test_key.json', key='data') self.assertColumnNames(table2, self.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_json_mixed_keys(self): table = Table.from_json('examples/test_mixed.json') self.assertColumnNames(table, ['one', 'two', 'three', 'four', 'five']) self.assertColumnTypes(table, [Number, Number, Text, Text, Number]) self.assertRows(table, [ [1, 4, 'a', None, None], [2, 3, 'b', 'd', None], [None, 2, '👍', None, 5] ]) def test_from_json_nested(self): table = 
Table.from_json('examples/test_nested.json') self.assertColumnNames(table, ['one', 'two/two_a', 'two/two_b', 'three/0', 'three/1', 'three/2']) self.assertColumnTypes(table, [Number, Text, Text, Text, Number, Text]) self.assertRows(table, [ [1, 'a', 'b', 'a', 2, 'c'], [2, 'c', 'd', 'd', 2, 'f'] ]) def test_from_json_newline_delimited(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_json('examples/test_newline.json', newline=True) self.assertColumnNames(table2, self.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_json_no_type_tester(self): tester = TypeTester(limit=0) table = Table.from_json('examples/test.json', column_types=tester) self.assertColumnTypes(table, [Text, Text, Text, Text, Text, Text]) def test_from_json_error_newline_key(self): with self.assertRaises(ValueError): Table.from_json('examples/test.json', newline=True, key='test') def test_from_json_ambiguous(self): table = Table.from_json('examples/test_from_json_ambiguous.json') self.assertColumnNames(table, ('a/b',)) self.assertColumnTypes(table, [Boolean]) self.assertRows(table, [Row([False])]) agate-1.9.1/tests/test_mapped_sequence.py000066400000000000000000000070231454111471500205210ustar00rootroot00000000000000import unittest from agate.mapped_sequence import MappedSequence class TestMappedSequence(unittest.TestCase): def setUp(self): self.column_names = ('one', 'two', 'three') self.data = ('a', 'b', 'c') self.row = MappedSequence(self.data, self.column_names) def test_is_immutable(self): with self.assertRaises(TypeError): self.row[0] = 'foo' with self.assertRaises(TypeError): self.row['one'] = 100 def test_stringify(self): self.assertEqual(str(self.row), "") def test_stringify_long(self): column_names = ('one', 'two', 'three', 'four', 'five', 'six') data = ('a', 'b', 'c', 'd', 'e', 'f') row = MappedSequence(data, column_names) self.assertEqual(str(row), "") 
    def test_length(self):
        self.assertEqual(len(self.row), 3)

    def test_eq(self):
        row2 = MappedSequence(self.data, self.column_names)

        self.assertTrue(self.row == ('a', 'b', 'c'))
        self.assertTrue(self.row == ['a', 'b', 'c'])
        self.assertTrue(self.row == row2)

        self.assertFalse(self.row == ('a', 'b', 'c', 'd'))
        self.assertFalse(self.row == 1)

    def test_ne(self):
        row2 = MappedSequence(self.data, self.column_names)

        self.assertFalse(self.row != ('a', 'b', 'c'))
        self.assertFalse(self.row != ['a', 'b', 'c'])
        self.assertFalse(self.row != row2)

        self.assertTrue(self.row != ('a', 'b', 'c', 'd'))
        self.assertTrue(self.row != 1)

    def test_contains(self):
        self.assertTrue('a' in self.row)
        self.assertFalse('d' in self.row)

    def test_set_item(self):
        with self.assertRaises(TypeError):
            self.row['one'] = 't'

        with self.assertRaises(TypeError):
            self.row['five'] = 'g'

    def test_get_item(self):
        self.assertEqual(self.row['one'], 'a')
        self.assertEqual(self.row['two'], 'b')
        self.assertEqual(self.row['three'], 'c')

    def test_get_by_key(self):
        self.assertEqual(self.row['one'], 'a')
        self.assertEqual(self.row[0], 'a')

    def test_get_by_slice(self):
        self.assertSequenceEqual(self.row[1:], ('b', 'c'))

    def test_get_invalid(self):
        with self.assertRaises(IndexError):
            self.row[3]

        with self.assertRaises(KeyError):
            self.row['foo']

    def test_keys(self):
        self.assertIs(self.row.keys(), self.column_names)

    def test_values(self):
        self.assertIs(self.row.values(), self.data)

    def test_items(self):
        self.assertSequenceEqual(self.row.items(), [
            ('one', 'a'),
            ('two', 'b'),
            ('three', 'c')
        ])

    def test_get(self):
        self.assertEqual(self.row.get('one'), 'a')

    def test_get_default(self):
        self.assertEqual(self.row.get('four'), None)
        self.assertEqual(self.row.get('four', 'foo'), 'foo')

    def test_dict(self):
        self.assertDictEqual(self.row.dict(), {
            'one': 'a',
            'two': 'b',
            'three': 'c'
        })

    def test_dict_no_keys(self):
        row = MappedSequence(self.data)

        with self.assertRaises(KeyError):
            row.dict()

    def test_iterate(self):
        it = iter(self.row)
        self.assertSequenceEqual(next(it), 'a')
        self.assertSequenceEqual(next(it), 'b')
        self.assertSequenceEqual(next(it), 'c')

        with self.assertRaises(StopIteration):
            next(it)


# agate-1.9.1/tests/test_py3.py
import csv
import os
import platform
import sys
import unittest
from io import StringIO

from agate import csv_py3
from agate.exceptions import FieldSizeLimitError


class TestReader(unittest.TestCase):
    def setUp(self):
        self.rows = [
            ['number', 'text', 'boolean', 'date', 'datetime', 'timedelta'],
            ['1', 'a', 'True', '2015-11-04', '2015-11-04T12:22:00', '0:04:15'],
            ['2', '👍', 'False', '2015-11-05', '2015-11-04T12:45:00', '0:06:18'],
            ['', 'b', '', '', '', '']
        ]

    def test_utf8(self):
        with open('examples/test.csv', encoding='utf-8') as f:
            rows = list(csv_py3.Reader(f))

        for a, b in zip(self.rows, rows):
            self.assertEqual(a, b)

    def test_reader_alias(self):
        with open('examples/test.csv', encoding='utf-8') as f:
            rows = list(csv_py3.reader(f))

        for a, b in zip(self.rows, rows):
            self.assertEqual(a, b)

    def test_properties(self):
        with open('examples/test.csv', encoding='utf-8') as f:
            reader = csv_py3.Reader(f)

            self.assertEqual(reader.dialect.delimiter, ',')
            self.assertEqual(reader.line_num, 0)

            next(reader)

            self.assertEqual(reader.line_num, 1)

    def test_line_numbers(self):
        with open('examples/test.csv', encoding='utf-8') as f:
            rows = list(csv_py3.Reader(f, line_numbers=True))

        sample_rows = [
            ['line_numbers', 'number', 'text', 'boolean', 'date', 'datetime', 'timedelta'],
            ['1', '1', 'a', 'True', '2015-11-04', '2015-11-04T12:22:00', '0:04:15'],
            ['2', '2', '👍', 'False', '2015-11-05', '2015-11-04T12:45:00', '0:06:18'],
            ['3', '', 'b', '', '', '', '']
        ]

        for a, b in zip(sample_rows, rows):
            self.assertEqual(a, b)


class TestFieldSizeLimit(unittest.TestCase):
    def setUp(self):
        self.lim = csv.field_size_limit()

        with open('.test.csv', 'w', encoding='utf-8') as f:
            f.write('a' * 10)

    def tearDown(self):
        # Resetting limit to avoid failure in other tests.
        csv.field_size_limit(self.lim)

        os.remove('.test.csv')

    def test_field_size_limit(self):
        # Testing field_size_limit for failure. Creating data using str * int.
        with open('.test.csv', encoding='utf-8') as f:
            c = csv_py3.Reader(f, field_size_limit=9)

            try:
                c.__next__()
            except FieldSizeLimitError:
                pass
            else:
                raise AssertionError('Expected FieldSizeLimitError')

        # Now testing higher field_size_limit.
        with open('.test.csv', encoding='utf-8') as f:
            c = csv_py3.Reader(f, field_size_limit=11)

            self.assertEqual(['a' * 10], c.__next__())


class TestWriter(unittest.TestCase):
    def test_utf8(self):
        output = StringIO()
        writer = csv_py3.Writer(output)
        writer.writerow(['a', 'b', 'c'])
        writer.writerow(['1', '2', '3'])
        writer.writerow(['4', '5', 'ʤ'])

        written = StringIO(output.getvalue())

        reader = csv_py3.Reader(written)

        self.assertEqual(next(reader), ['a', 'b', 'c'])
        self.assertEqual(next(reader), ['1', '2', '3'])
        self.assertEqual(next(reader), ['4', '5', 'ʤ'])

    def test_writer_alias(self):
        output = StringIO()
        writer = csv_py3.writer(output)
        writer.writerow(['a', 'b', 'c'])
        writer.writerow(['1', '2', '3'])
        writer.writerow(['4', '5', 'ʤ'])

        written = StringIO(output.getvalue())

        reader = csv_py3.reader(written)

        self.assertEqual(next(reader), ['a', 'b', 'c'])
        self.assertEqual(next(reader), ['1', '2', '3'])
        self.assertEqual(next(reader), ['4', '5', 'ʤ'])

    def test_line_numbers(self):
        output = StringIO()
        writer = csv_py3.Writer(output, line_numbers=True)
        writer.writerow(['a', 'b', 'c'])
        writer.writerow(['1', '2', '3'])
        writer.writerow(['4', '5', 'ʤ'])

        written = StringIO(output.getvalue())

        reader = csv_py3.Reader(written)

        self.assertEqual(next(reader), ['line_number', 'a', 'b', 'c'])
        self.assertEqual(next(reader), ['1', '1', '2', '3'])
        self.assertEqual(next(reader), ['2', '4', '5', 'ʤ'])

    def test_writerows(self):
        output = StringIO()
        writer = csv_py3.Writer(output)
        writer.writerows([
            ['a', 'b', 'c'],
            ['1', '2', '3'],
            ['4', '5', 'ʤ']
        ])

        written = StringIO(output.getvalue())
        reader = csv_py3.Reader(written)

        self.assertEqual(next(reader), ['a', 'b', 'c'])
        self.assertEqual(next(reader), ['1', '2', '3'])
        self.assertEqual(next(reader), ['4', '5', 'ʤ'])


class TestDictReader(unittest.TestCase):
    def setUp(self):
        self.rows = [
            ['number', 'text', 'boolean', 'date', 'datetime', 'timedelta'],
            ['1', 'a', 'True', '2015-11-04', '2015-11-04T12:22:00', '0:04:15'],
            ['2', '👍', 'False', '2015-11-05', '2015-11-04T12:45:00', '0:06:18'],
            ['', 'b', '', '', '', '']
        ]

        self.f = open('examples/test.csv', encoding='utf-8')

    def tearDown(self):
        self.f.close()

    def test_reader(self):
        reader = csv_py3.DictReader(self.f)

        self.assertEqual(next(reader), dict(zip(self.rows[0], self.rows[1])))

    def test_reader_alias(self):
        reader = csv_py3.DictReader(self.f)

        self.assertEqual(next(reader), dict(zip(self.rows[0], self.rows[1])))


class TestDictWriter(unittest.TestCase):
    def setUp(self):
        self.output = StringIO()

    def tearDown(self):
        self.output.close()

    def test_writer(self):
        writer = csv_py3.DictWriter(self.output, ['a', 'b', 'c'])
        writer.writeheader()
        writer.writerow({
            'a': '1',
            'b': '2',
            'c': '☃'
        })

        result = self.output.getvalue()

        self.assertEqual(result, 'a,b,c\n1,2,☃\n')

    def test_writer_alias(self):
        writer = csv_py3.DictWriter(self.output, ['a', 'b', 'c'])
        writer.writeheader()
        writer.writerow({
            'a': '1',
            'b': '2',
            'c': '☃'
        })

        result = self.output.getvalue()

        self.assertEqual(result, 'a,b,c\n1,2,☃\n')

    def test_line_numbers(self):
        writer = csv_py3.DictWriter(self.output, ['a', 'b', 'c'], line_numbers=True)
        writer.writeheader()
        writer.writerow({
            'a': '1',
            'b': '2',
            'c': '☃'
        })

        result = self.output.getvalue()

        self.assertEqual(result, 'line_number,a,b,c\n1,1,2,☃\n')

    def test_writerows(self):
        writer = csv_py3.DictWriter(self.output, ['a', 'b', 'c'], line_numbers=True)
        writer.writeheader()
        writer.writerows([{
            'a': '1',
            'b': '2',
            'c': '☃'
        }])

        result = self.output.getvalue()

        self.assertEqual(result, 'line_number,a,b,c\n1,1,2,☃\n')


class TestSniffer(unittest.TestCase):
    @unittest.skipIf(
        platform.system() == 'Darwin' and sys.version_info[:2] == (3, 10),
        reason='The (macos-latest, 3.10) job fails on GitHub Actions'
    )
    def test_sniffer(self):
        with open('examples/test.csv', encoding='utf-8') as f:
            contents = f.read()

        direct = csv.Sniffer().sniff(contents, csv_py3.POSSIBLE_DELIMITERS).__dict__
        actual = csv_py3.Sniffer().sniff(contents).__dict__
        expected = csv.Sniffer().sniff(contents).__dict__

        self.assertEqual(direct, expected, f'{direct!r} != {expected!r}')
        self.assertEqual(actual, expected, f'{actual!r} != {expected!r}')


# agate-1.9.1/tests/test_table/__init__.py
import warnings
from decimal import Decimal

from agate import Table
from agate.computations import Formula
from agate.data_types import Number, Text
from agate.exceptions import CastError
from agate.testcase import AgateTestCase
from agate.warns import DuplicateColumnWarning


class TestBasic(AgateTestCase):
    def setUp(self):
        self.rows = (
            (1, 4, 'a'),
            (2, 3, 'b'),
            (None, 2, '👍')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

    def test_create_table(self):
        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['a', 'b', 'c'])
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_create_filename(self):
        with self.assertRaises(ValueError):
            Table('foo.csv')

    def test_create_empty_table(self):
        table = Table([])
        table2 = Table([], self.column_names, self.column_types)

        self.assertColumnNames(table, [])
        self.assertColumnTypes(table, [])
        self.assertRows(table, [])

        self.assertColumnNames(table2, self.column_names)
        self.assertColumnTypes(table2, [Number, Number, Text])
        self.assertRows(table2, [])

    def test_create_table_column_types(self):
        column_types = [self.number_type, self.text_type, self.text_type]

        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows, column_types=column_types)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['a', 'b', 'c'])
        self.assertColumnTypes(table, [Number, Text, Text])
        self.assertRows(table, [
            (1, '4', 'a'),
            (2, '3', 'b'),
            (None, '2', '👍')
        ])

    def test_create_table_column_types_dict(self):
        column_types = dict(zip(self.column_names, self.column_types))

        table = Table(self.rows, column_names=self.column_names, column_types=column_types)

        self.assertColumnNames(table, ['one', 'two', 'three'])
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_create_table_partial_column_types_dict(self):
        column_types = dict(zip(self.column_names[:2], self.column_types[:2]))

        table = Table(self.rows, column_names=self.column_names, column_types=column_types)

        self.assertColumnNames(table, ['one', 'two', 'three'])
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_create_table_column_names(self):
        table = Table(self.rows, self.column_names)

        self.assertColumnNames(table, self.column_names)
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_create_table_column_types_and_names(self):
        table = Table(self.rows, self.column_names, self.column_types)

        self.assertColumnNames(table, self.column_names)
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_create_table_non_string_columns(self):
        column_names = ['one', 'two', 3]

        with self.assertRaises(ValueError):
            Table(self.rows, column_names, self.column_types)

    def test_create_table_cast_error(self):
        column_types = [self.number_type, self.number_type, self.number_type]

        with self.assertRaises(CastError) as e:
            Table(self.rows, self.column_names, column_types)

        self.assertIn('Error at row 0 column three.', str(e.exception))

    def test_create_table_null_column_names(self):
        column_names = ['one', None, 'three']

        with self.assertWarns(RuntimeWarning):
            Table(self.rows, column_names, self.column_types)

        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows, column_names, self.column_types)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['one', 'b', 'three'])

    def test_create_table_empty_column_names(self):
        column_names = ['one', '', 'three']

        with self.assertWarns(RuntimeWarning):
            Table(self.rows, column_names, self.column_types)

        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows, column_names, self.column_types)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['one', 'b', 'three'])

    def test_create_table_non_datatype_columns(self):
        column_types = [self.number_type, self.number_type, 'foo']

        with self.assertRaises(ValueError):
            Table(self.rows, self.column_names, column_types)

    def test_create_duplicate_column_names(self):
        column_names = ['one', 'two', 'two']

        with self.assertWarns(DuplicateColumnWarning):
            table = Table(self.rows, column_names, self.column_types)

        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows, column_names, self.column_types)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['one', 'two', 'two_2'])
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_column_names_types_different_lengths(self):
        column_names = ['one', 'two', 'three', 'four']

        with self.assertRaises(ValueError):
            Table(self.rows, column_names, self.column_types)

    def test_create_variable_length_rows(self):
        rows = (
            (1, 4, 'a'),
            (2,),
            (None, 2)
        )

        table = Table(rows, self.column_names, self.column_types)

        warnings.simplefilter('ignore')

        try:
            table2 = Table(rows)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, self.column_names)
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, [
            (1, 4, 'a'),
            (2, None, None),
            (None, 2, None)
        ])
        self.assertColumnTypes(table2, [Number, Number, Text])
        self.assertRows(table2, [
            (1, 4, 'a'),
            (2, None, None),
            (None, 2, None)
        ])

    def test_create_table_no_column_names(self):
        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows, None, self.column_types)
        finally:
            warnings.resetwarnings()

        self.assertEqual(len(table.rows), 3)
        self.assertEqual(len(table.columns), 3)

        self.assertSequenceEqual(table.columns[0], (1, 2, None))
        self.assertSequenceEqual(table.columns['a'], (1, 2, None))

        with self.assertRaises(KeyError):
            table.columns[None]

        with self.assertRaises(KeyError):
            table.columns['one']

        self.assertSequenceEqual(table.columns[2], ('a', 'b', '👍'))
        self.assertSequenceEqual(table.columns['c'], ('a', 'b', '👍'))

        with self.assertRaises(KeyError):
            table.columns['']

    def test_no_column_name_warning(self):
        with self.assertWarnsRegex(RuntimeWarning, 'Column 1 has no name. Using "b".'):
            Table(self.rows, ['a', None, 'c'], self.column_types)

        with self.assertWarnsRegex(RuntimeWarning, 'Column 1 has no name. Using "b".'):
            Table(self.rows, ['a', '', 'c'], self.column_types)

    def test_row_too_long(self):
        rows = (
            (1, 4, 'a', 'foo'),
            (2,),
            (None, 2)
        )

        with self.assertRaises(ValueError):
            Table(rows, self.column_names, self.column_types)

    def test_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')

        self.assertRowNames(table, ['a', 'b', '👍'])

    def test_row_names_non_string(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names=[Decimal('2'), True, None])

        self.assertSequenceEqual(table.row_names, [
            Decimal('2'),
            True,
            None,
        ])
        self.assertSequenceEqual(table.rows[Decimal('2')], (1, 4, 'a'))
        self.assertSequenceEqual(table.rows[True], (2, 3, 'b'))
        self.assertSequenceEqual(table.rows[None], (None, 2, '👍'))

    def test_row_names_int(self):
        with self.assertRaises(ValueError):
            Table(self.rows, self.column_names, self.column_types, row_names=['a', 'b', 3])

    def test_row_names_func(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names=lambda r: (r['one'], r['three']))

        self.assertSequenceEqual(table.row_names, [
            (Decimal('1'), 'a'),
            (Decimal('2'), 'b'),
            (None, '👍')
        ])
        self.assertSequenceEqual(table.rows[(Decimal('1'), 'a')], (1, 4, 'a'))
        self.assertSequenceEqual(table.rows[(Decimal('2'), 'b')], (2, 3, 'b'))
        self.assertSequenceEqual(table.rows[(None, '👍')], (None, 2, '👍'))

    def test_row_names_invalid(self):
        with self.assertRaises(ValueError):
            Table(
                self.rows,
                self.column_names,
                self.column_types,
                row_names={'a': 1, 'b': 2, 'c': 3}
            )

    def test_stringify(self):
        column_names = ['foo', 'bar', '👍']

        table = Table(self.rows, column_names)

        u = str(table)

        self.assertIn('foo', u)
        self.assertIn('bar', u)
        self.assertIn('👍', u)

    def test_str(self):
        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows)
        finally:
            warnings.resetwarnings()

        self.assertColumnNames(table, ['a', 'b', 'c'])
        self.assertColumnTypes(table, [Number, Number, Text])
        self.assertRows(table, self.rows)

    def test_iter(self):
        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows)
        finally:
            warnings.resetwarnings()

        for row, table_row, row_row in zip(self.rows, table, table.rows):
            self.assertEqual(row, table_row, row_row)

    def test_indexing(self):
        warnings.simplefilter('ignore')

        try:
            table = Table(self.rows)
        finally:
            warnings.resetwarnings()

        self.assertEqual(table[1], self.rows[1])
        self.assertEqual(table[-1], self.rows[-1])
        self.assertEqual(table[1:2], self.rows[1:2])

    def test_get_column_types(self):
        table = Table(self.rows, self.column_names, self.column_types)

        self.assertSequenceEqual(table.column_types, self.column_types)

    def test_get_column_names(self):
        table = Table(self.rows, self.column_names, self.column_types)

        self.assertSequenceEqual(table.column_names, self.column_names)

    def test_select(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.select(('two', 'three'))

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, ['two', 'three'])
        self.assertColumnTypes(new_table, [Number, Text])
        self.assertRows(new_table, [
            [4, 'a'],
            [3, 'b'],
            [2, '👍']
        ])

    def test_select_single(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.select('three')

        self.assertColumnNames(new_table, ['three'])
        self.assertColumnTypes(new_table, [Text])
        self.assertRows(new_table, [
            ['a'],
            ['b'],
            ['👍']
        ])

    def test_select_with_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')
        new_table = table.select(('three',))

        self.assertRowNames(new_table, ['a', 'b', '👍'])

    def test_select_does_not_exist(self):
        table = Table(self.rows, self.column_names, self.column_types)

        with self.assertRaises(ValueError):
            table.select(('four',))

    def test_exclude(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.exclude(('one', 'two'))

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, ['three'])
        self.assertColumnTypes(new_table, [Text])
        self.assertRows(new_table, [
            ['a'],
            ['b'],
            ['👍']
        ])

    def test_exclude_single(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.exclude('one')

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, ['two', 'three'])
        self.assertColumnTypes(new_table, [Number, Text])
        self.assertRows(new_table, [
            [4, 'a'],
            [3, 'b'],
            [2, '👍']
        ])

    def test_exclude_with_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')
        new_table = table.exclude(('one', 'two'))

        self.assertRowNames(new_table, ['a', 'b', '👍'])

    def test_where(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.where(lambda r: r['one'] in (2, None))

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, [
            self.rows[1],
            self.rows[2]
        ])

    def test_where_with_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')
        new_table = table.where(lambda r: r['one'] in (2, None))

        self.assertRowNames(new_table, ['b', '👍'])

    def test_find(self):
        table = Table(self.rows, self.column_names, self.column_types)

        row = table.find(lambda r: r['two'] - r['one'] == 1)

        self.assertIs(row, table.rows[1])

    def test_find_none(self):
        table = Table(self.rows, self.column_names, self.column_types)

        row = table.find(lambda r: r['one'] == 'FOO')

        self.assertIs(row, None)

    def test_limit(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.limit(2)

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, self.rows[:2])

    def test_limit_slice(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.limit(0, 3, 2)

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, self.rows[0:3:2])

    def test_limit_slice_negative(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.limit(-2, step=-1)

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, self.rows[-2:-1])

    def test_limit_step_only(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.limit(step=2)

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, self.rows[::2])

    def test_limit_with_row_names(self):
        table = Table(self.rows, self.column_names, self.column_types, row_names='three')
        new_table = table.limit(2)

        self.assertRowNames(new_table, ['a', 'b'])

    def test_distinct_column(self):
        rows = (
            (1, 2, 'a'),
            (2, None, None),
            (1, 1, 'c'),
            (1, None, None)
        )

        table = Table(rows, self.column_names, self.column_types)

        new_table = table.distinct('one')

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, [
            rows[0],
            rows[1]
        ])

    def test_distinct_multiple_columns(self):
        rows = (
            (1, 2, 'a'),
            (2, None, None),
            (1, 1, 'c'),
            (1, None, None)
        )

        table = Table(rows, self.column_names, self.column_types)

        new_table = table.distinct(['two', 'three'])

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, [
            rows[0],
            rows[1],
            rows[2]
        ])

    def test_distinct_func(self):
        rows = (
            (1, 2, 'a'),
            (2, None, None),
            (1, 1, 'c'),
            (1, None, None)
        )

        table = Table(rows, self.column_names, self.column_types)

        new_table = table.distinct(lambda row: (row['two'], row['three']))

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, [
            rows[0],
            rows[1],
            rows[2]
        ])

    def test_distinct_none(self):
        rows = (
            (1, 2, 'a'),
            (1, None, None),
            (1, 1, 'c'),
            (1, None, None)
        )

        table = Table(rows, self.column_names, self.column_types)

        new_table = table.distinct()

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names)
        self.assertColumnTypes(new_table, [Number, Number, Text])
        self.assertRows(new_table, [
            rows[0],
            rows[1],
            rows[2]
        ])

    def test_distinct_with_row_names(self):
        rows = (
            (1, 2, 'a'),
            (2, None, None),
            (1, 1, 'c'),
            (1, None, 'd')
        )

        table = Table(rows, self.column_names, self.column_types, row_names='three')
        new_table = table.distinct('one')

        self.assertRowNames(new_table, ['a', None])

    def test_chain_select_where(self):
        table = Table(self.rows, self.column_names, self.column_types)

        new_table = table.select(('one', 'two')).where(lambda r: r['two'] == 3)

        self.assertIsNot(new_table, table)

        self.assertColumnNames(new_table, self.column_names[:2])
        self.assertColumnTypes(new_table, [Number, Number])
        self.assertRows(new_table, [
            self.rows[1][:2],
        ])


class TestData(AgateTestCase):
    def setUp(self):
        self.rows = (
            (1, 4, 'a'),
            (2, 3, 'b'),
            (None, 2, 'c')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

    def test_data_immutable(self):
        rows = [
            [1, 4, 'a'],
            [2, 3, 'b'],
            [None, 2, 'c']
        ]

        table = Table(rows, self.column_names, self.column_types)
        rows[0] = [2, 2, 2]

        self.assertSequenceEqual(table.rows[0], [1, 4, 'a'])

    def test_fork_preserves_data(self):
        table = Table(self.rows, self.column_names, self.column_types)
        table2 = table._fork(table.rows)

        self.assertIs(table.rows[0], table2.rows[0])
        self.assertIs(table.rows[1], table2.rows[1])
        self.assertIs(table.rows[2], table2.rows[2])

    def test_where_preserves_rows(self):
        table = Table(self.rows, self.column_names, self.column_types)
        table2 = table.where(lambda r: r['one'] == 1)
        table3 = table2.where(lambda r: r['one'] == 1)

        self.assertIs(table.rows[0], table2.rows[0])
        self.assertIs(table2.rows[0], table3.rows[0])

    def test_order_by_preserves_rows(self):
        table = Table(self.rows, self.column_names, self.column_types)
        table2 = table.order_by(lambda r: r['one'])
        table3 = table2.order_by(lambda r: r['one'])

        self.assertIs(table.rows[0], table2.rows[0])
        self.assertIs(table2.rows[0], table3.rows[0])

    def test_limit_preserves_rows(self):
        table = Table(self.rows, self.column_names, self.column_types)
        table2 = table.limit(2)
        table3 = table2.limit(2)

        self.assertIs(table.rows[0], table2.rows[0])
        self.assertIs(table2.rows[0], table3.rows[0])

    def test_compute_creates_rows(self):
        table = Table(self.rows, self.column_names, self.column_types)
        table2 = table.compute([
            ('new2', Formula(self.number_type, lambda r: r['one']))
        ])
        table3 = table2.compute([
            ('new3', Formula(self.number_type, lambda r: r['one']))
        ])

        self.assertIsNot(table.rows[0], table2.rows[0])
        self.assertNotEqual(table.rows[0], table2.rows[0])
        self.assertIsNot(table2.rows[0], table3.rows[0])
        self.assertNotEqual(table2.rows[0], table3.rows[0])
        self.assertSequenceEqual(table.rows[0], (1, 4, 'a'))


# agate-1.9.1/tests/test_table/test_aggregate.py
from agate import Table
from agate.aggregations import Count, Sum
from agate.data_types import Number, Text
from agate.testcase import AgateTestCase


class TestAggregate(AgateTestCase):
    def setUp(self):
        self.rows = (
            (1, 4, 'a'),
            (2, 3, 'b'),
            (None, 2, '👍')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

        self.table = Table(self.rows, self.column_names, self.column_types)

    def test_count(self):
        self.assertEqual(self.table.aggregate(Count()), 3)

    def test_sum(self):
        self.assertEqual(self.table.aggregate(Sum('two')), 9)

    def test_multiple(self):
        self.assertEqual(
            self.table.aggregate([
                ('count', Count()),
                ('sum', Sum('two'))
            ]),
            {
                'count': 3,
                'sum': 9
            }
        )


# agate-1.9.1/tests/test_table/test_bins.py
from decimal import Decimal

from babel.numbers import get_decimal_symbol

from agate import Table
from agate.data_types import Number, Text
from agate.testcase import AgateTestCase


class TestBins(AgateTestCase):
    def setUp(self):
        self.number_type = Number()
        self.column_names = ['number']
        self.column_types = [self.number_type]

    def test_bins(self):
        rows = []

        for i in range(0, 100):
            rows.append([i])

        new_table = Table(rows, self.column_names, self.column_types).bins('number')

        self.assertColumnNames(new_table, ['number', 'Count'])
        self.assertColumnTypes(new_table, [Text, Number])

        self.assertSequenceEqual(new_table.rows[0], ['[0 - 10)', 10])
        self.assertSequenceEqual(new_table.rows[3], ['[30 - 40)', 10])
        self.assertSequenceEqual(new_table.rows[9], ['[90 - 100]', 10])

        self.assertRowNames(new_table, [
            '[0 - 10)',
            '[10 - 20)',
            '[20 - 30)',
            '[30 - 40)',
            '[40 - 50)',
            '[50 - 60)',
            '[60 - 70)',
            '[70 - 80)',
            '[80 - 90)',
            '[90 - 100]',
        ])

    def test_bins_negative(self):
        rows = []

        for i in range(0, -100, -1):
            rows.append([i])

        new_table = Table(rows, self.column_names, self.column_types).bins('number', 10, start=-100)

        self.assertColumnNames(new_table, ['number', 'Count'])
        self.assertColumnTypes(new_table, [Text, Number])

        self.assertSequenceEqual(new_table.rows[0], ['[-100 - -90)', 9])
        self.assertSequenceEqual(new_table.rows[3], ['[-70 - -60)', 10])
        self.assertSequenceEqual(new_table.rows[9], ['[-10 - 0]', 11])

    def test_bins_mixed_signs(self):
        rows = []

        for i in range(0, -100, -1):
            rows.append([i + 50])

        new_table = Table(rows, self.column_names, self.column_types).bins('number')

        self.assertColumnNames(new_table, ['number', 'Count'])
        self.assertColumnTypes(new_table, [Text, Number])

        self.assertSequenceEqual(new_table.rows[0], ['[-50 - -40)', 9])
        self.assertSequenceEqual(new_table.rows[3], ['[-20 - -10)', 10])
        self.assertSequenceEqual(new_table.rows[9], ['[40 - 50]', 11])

    def test_bins_small_numbers(self):
        rows = []

        for i in range(0, 100):
            rows.append([Decimal(i) / Decimal('10')])

        new_table = Table(rows, self.column_names, self.column_types).bins('number')

        self.assertSequenceEqual(new_table.rows[0], ['[0 - 1)', 10])
        self.assertSequenceEqual(new_table.rows[3], ['[3 - 4)', 10])
        self.assertSequenceEqual(new_table.rows[9], ['[9 - 10]', 10])

    def test_bins_values_outside_start_end(self):
        rows = []

        for i in range(0, 100):
            rows.append([Decimal(i) / Decimal('10')])

        table_one = Table(rows, self.column_names, self.column_types).bins('number', start=1, end=11)
        table_two = Table(rows, self.column_names, self.column_types).bins('number', start=-1, end=9)

        self.assertSequenceEqual(table_one.rows[0], ['[0 - 2)', 20])
        self.assertSequenceEqual(table_two.rows[8], ['[8 - 10]', 20])

    def test_bins_decimals(self):
        rows = []

        for i in range(0, 100):
            rows.append([Decimal(i) / Decimal('100')])

        new_table = Table(rows, self.column_names, self.column_types).bins('number')

        self.assertColumnNames(new_table, ['number', 'Count'])
        self.assertColumnTypes(new_table, [Text, Number])

        self.assertSequenceEqual(
            new_table.rows[0],
            ['[0' + get_decimal_symbol() + '0 - 0' + get_decimal_symbol() + '1)', 10]
        )
        self.assertSequenceEqual(
            new_table.rows[3],
            ['[0' + get_decimal_symbol() + '3 - 0' + get_decimal_symbol() + '4)', 10]
        )
        self.assertSequenceEqual(
            new_table.rows[9],
            ['[0' + get_decimal_symbol() + '9 - 1' + get_decimal_symbol() + '0]', 10]
        )

    def test_bins_nulls(self):
        rows = []

        for i in range(0, 100):
            rows.append([Decimal(i) / Decimal('100')])

        rows.append([None])

        new_table = Table(rows, self.column_names, self.column_types).bins('number')

        self.assertColumnNames(new_table, ['number', 'Count'])
        self.assertColumnTypes(new_table, [Text, Number])

        self.assertSequenceEqual(
            new_table.rows[0],
            ['[0' + get_decimal_symbol() + '0 - 0' + get_decimal_symbol() + '1)', 10]
        )
        self.assertSequenceEqual(
            new_table.rows[3],
            ['[0' + get_decimal_symbol() + '3 - 0' + get_decimal_symbol() + '4)', 10]
        )
        self.assertSequenceEqual(
            new_table.rows[9],
            ['[0' + get_decimal_symbol() + '9 - 1' + get_decimal_symbol() + '0]', 10]
        )
        self.assertSequenceEqual(new_table.rows[10], [None, 1])


# agate-1.9.1/tests/test_table/test_charting.py
import leather

from agate import Table
from agate.data_types import Number, Text


class TestTableCharts(leather.LeatherTestCase):
    def setUp(self):
        self.rows = (
            (1, 4, 'a'),
            (2, 3, 'b'),
            (None, 2, '👍')
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = ['one', 'two', 'three']
        self.column_types = [self.number_type, self.number_type, self.text_type]

        self.table = Table(self.rows, self.column_names, self.column_types)

    def test_bar_chart(self):
        text = self.table.bar_chart(label='three', value='one')
        svg = self.parse_svg(text)

        self.assertElementCount(svg, '.axis', 2)
        self.assertElementCount(svg, '.series', 1)
        self.assertElementCount(svg, '.bars', 1)
        self.assertElementCount(svg, 'rect', 3)

        text2 = self.table.bar_chart(label=2, value=0)

        self.assertEqual(text, text2)

    def test_column_chart(self):
        text = self.table.column_chart(label='three', value='one')
        svg = self.parse_svg(text)

        self.assertElementCount(svg, '.axis', 2)
        self.assertElementCount(svg, '.series', 1)
        self.assertElementCount(svg, '.columns', 1)
        self.assertElementCount(svg, 'rect', 3)

        text2 = self.table.column_chart(label=2, value=0)

        self.assertEqual(text, text2)

    def test_line_chart(self):
        text = self.table.line_chart(x='one', y='two')
        svg = self.parse_svg(text)

        self.assertElementCount(svg, '.axis', 2)
        self.assertElementCount(svg, '.series', 1)
        self.assertElementCount(svg, 'path', 1)

        text2 = self.table.line_chart(x=0, y=1)

        self.assertEqual(text, text2)

    def test_scatterplot(self):
        text = self.table.scatterplot(x='one', y='two')
        svg = self.parse_svg(text)

        self.assertElementCount(svg, '.axis', 2)
        self.assertElementCount(svg, '.series', 1)
        self.assertElementCount(svg, '.dots', 1)
        self.assertElementCount(svg, 'circle', 2)

        text2 = self.table.scatterplot(x=0, y=1)

        self.assertEqual(text, text2)


# agate-1.9.1/tests/test_table/test_compute.py
from agate import Table
from agate.computations import Formula
from agate.data_types import Number, Text
from agate.testcase import AgateTestCase


class TestCompute(AgateTestCase):
    def setUp(self):
        self.rows = (
            ('a', 2, 3, 4),
            (None, 3, 5, None),
            ('a', 2, 4, None),
            ('b', 3, 6, None)
        )

        self.number_type = Number()
        self.text_type = Text()

        self.column_names = [
            'one', 'two', 'three', 'four'
        ]
        self.column_types = [
            self.text_type, self.number_type, self.number_type, self.number_type
        ]

        self.table = Table(self.rows, self.column_names, self.column_types)

    def test_compute(self):
        new_table = self.table.compute([
            ('test', Formula(self.number_type, lambda r: r['two'] + r['three']))
        ])

        self.assertIsNot(new_table, self.table)

        self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'test'])
        self.assertColumnTypes(new_table, [Text, Number, Number, Number, Number])

        self.assertSequenceEqual(new_table.rows[0], ('a', 2, 3, 4, 5))
        self.assertSequenceEqual(new_table.columns['test'], (5, 8, 6, 9))

    def test_compute_multiple(self):
        new_table = self.table.compute([
            ('number', Formula(self.number_type, lambda r: r['two'] + r['three'])),
            ('text', Formula(self.text_type, lambda r: (r['one'] or '-') + str(r['three'])))
        ])

        self.assertIsNot(new_table, self.table)

        self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'number', 'text'])
        self.assertColumnTypes(new_table, [Text, Number, Number, Number, Number, Text])

        self.assertSequenceEqual(new_table.rows[0], ('a', 2, 3, 4, 5, 'a3'))
        self.assertSequenceEqual(new_table.columns['number'], (5, 8, 6, 9))
self.assertSequenceEqual(new_table.columns['text'], ('a3', '-5', 'a4', 'b6')) def test_compute_with_row_names(self): table = Table(self.rows, self.column_names, self.column_types, row_names='three') new_table = table.compute([ ('number', Formula(self.number_type, lambda r: r['two'] + r['three'])), ('text', Formula(self.text_type, lambda r: (r['one'] or '-') + str(r['three']))) ]) self.assertRowNames(new_table, [3, 5, 4, 6]) def test_compute_replace(self): new_table = self.table.compute([ ('two', Formula(self.number_type, lambda r: r['two'] + r['three'])) ], replace=True) self.assertIsNot(new_table, self.table) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four']) self.assertColumnTypes(new_table, [Text, Number, Number, Number]) self.assertSequenceEqual(new_table.rows[0], ('a', 5, 3, 4)) self.assertSequenceEqual(new_table.columns['two'], (5, 8, 6, 9)) def test_compute_replace_change_type(self): new_table = self.table.compute([ ('two', Formula(self.text_type, lambda r: 'a')) ], replace=True) self.assertIsNot(new_table, self.table) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four']) self.assertColumnTypes(new_table, [Text, Text, Number, Number]) self.assertSequenceEqual(new_table.rows[0], ('a', 'a', 3, 4)) self.assertSequenceEqual(new_table.columns['two'], ('a', 'a', 'a', 'a')) def test_compute_replace_partial(self): new_table = self.table.compute([ ('two', Formula(self.number_type, lambda r: r['two'] + r['three'])), ('test', Formula(self.number_type, lambda r: 1)) ], replace=True) self.assertIsNot(new_table, self.table) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'test']) self.assertColumnTypes(new_table, [Text, Number, Number, Number, Number]) self.assertSequenceEqual(new_table.rows[0], ('a', 5, 3, 4, 1)) self.assertSequenceEqual(new_table.columns['two'], (5, 8, 6, 9)) agate-1.9.1/tests/test_table/test_denormalize.py000066400000000000000000000100051454111471500220140ustar00rootroot00000000000000from agate import 
Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase from agate.type_tester import TypeTester class TestDenormalize(AgateTestCase): def setUp(self): self.rows = ( ('Jane', 'Code', 'gender', 'female'), ('Jane', 'Code', 'age', '27'), ('Jim', 'Program', 'gender', 'male'), ('Jim', 'Bytes', 'age', '24') ) self.text_type = Text() self.column_names = ['first_name', 'last_name', 'property', 'value'] self.column_types = [self.text_type, self.text_type, self.text_type, self.text_type] def test_denormalize(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.denormalize('first_name', 'property', 'value') normal_rows = ( ('Jane', 'female', 27), ('Jim', 'male', 24), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['first_name', 'gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Text, Number]) self.assertRowNames(normalized_table, ['Jane', 'Jim']) def test_denormalize_no_key(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.denormalize(None, 'property', 'value') # NB: value has been overwritten normal_rows = ( ('male', 24), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Number]) def test_denormalize_multiple_keys(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.denormalize(['first_name', 'last_name'], 'property', 'value') normal_rows = ( ('Jane', 'Code', 'female', 27), ('Jim', 'Program', 'male', None), ('Jim', 'Bytes', None, 24), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['first_name', 'last_name', 'gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Text, Text, Number]) self.assertRowNames(normalized_table, [('Jane', 'Code'), ('Jim', 'Program'), ('Jim', 'Bytes')]) def 
test_denormalize_default_value(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.denormalize(['first_name', 'last_name'], 'property', 'value', default_value='hello') normal_rows = ( ('Jane', 'Code', 'female', '27'), ('Jim', 'Program', 'male', 'hello'), ('Jim', 'Bytes', 'hello', '24'), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['first_name', 'last_name', 'gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Text, Text, Text]) def test_denormalize_column_types(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.denormalize(None, 'property', 'value', column_types=[Text(), Number()]) # NB: value has been overwritten normal_rows = ( ('male', 24), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Number]) def test_denormalize_column_type_tester(self): table = Table(self.rows, self.column_names, self.column_types) type_tester = TypeTester(force={'gender': Text()}) normalized_table = table.denormalize(None, 'property', 'value', column_types=type_tester) # NB: value has been overwritten normal_rows = ( ('male', 24), ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['gender', 'age']) self.assertColumnTypes(normalized_table, [Text, Number]) agate-1.9.1/tests/test_table/test_from_csv.py000066400000000000000000000160431454111471500213310ustar00rootroot00000000000000import warnings from agate import Table from agate.data_types import Boolean, Date, DateTime, Number, Text, TimeDelta from agate.testcase import AgateTestCase from agate.type_tester import TypeTester class TestFromCSV(AgateTestCase): def setUp(self): self.rows = ( (1, 'a', True, '11/4/2015', '11/4/2015 12:22 PM', '4:15'), (2, '👍', False, '11/5/2015', '11/4/2015 12:45 PM', '6:18'), (None, 'b', None, None, None, None) ) 
self.column_names = [ 'number', 'text', 'boolean', 'date', 'datetime', 'timedelta' ] self.column_types = [ Number(), Text(), Boolean(), Date(), DateTime(), TimeDelta() ] def test_from_csv(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test.csv') self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_crlf(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test_crlf.csv') self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_cr(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test_cr.csv') self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_file_like_object(self): table1 = Table(self.rows, self.column_names, self.column_types) f = open('examples/test.csv', encoding='utf-8') table2 = Table.from_csv(f) f.close() self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_type_tester(self): tester = TypeTester(force={ 'number': Text() }) table = Table.from_csv('examples/test.csv', column_types=tester) self.assertColumnTypes(table, [Text, Text, Boolean, Date, DateTime, TimeDelta]) def test_from_csv_no_type_tester(self): tester = TypeTester(limit=0) table = Table.from_csv('examples/test.csv', column_types=tester) self.assertColumnTypes(table, [Text, Text, Text, Text, Text, Text]) def test_from_csv_no_header(self): warnings.simplefilter('ignore') try: table = 
Table.from_csv('examples/test_no_header.csv', header=False) finally: warnings.resetwarnings() self.assertColumnNames(table, ['a', 'b', 'c', 'd', 'e', 'f']) self.assertColumnTypes(table, [Number, Text, Boolean, Date, DateTime, TimeDelta]) def test_from_csv_no_header_columns(self): table = Table.from_csv('examples/test_no_header.csv', self.column_names, header=False) self.assertColumnNames(table, self.column_names) self.assertColumnTypes(table, [Number, Text, Boolean, Date, DateTime, TimeDelta]) def test_from_csv_sniff_limit_0(self): table2 = Table.from_csv('examples/test_csv_sniff.csv', sniff_limit=0) self.assertColumnNames(table2, ['number|text|boolean|date|datetime|timedelta']) self.assertColumnTypes(table2, [Text]) def test_from_csv_sniff_limit_200(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test_csv_sniff.csv', sniff_limit=200) self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_sniff_limit_none(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test_csv_sniff.csv', sniff_limit=None) self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_skip_lines(self): warnings.simplefilter('ignore') try: table1 = Table(self.rows[1:], column_types=self.column_types) table2 = Table.from_csv('examples/test.csv', header=False, skip_lines=2) finally: warnings.resetwarnings() self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_skip_lines_crlf(self): warnings.simplefilter('ignore') try: table1 = Table(self.rows[1:], column_types=self.column_types) table2 = 
Table.from_csv('examples/test_crlf.csv', header=False, skip_lines=2) finally: warnings.resetwarnings() self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_skip_lines_cr(self): warnings.simplefilter('ignore') try: table1 = Table(self.rows[1:], column_types=self.column_types) table2 = Table.from_csv('examples/test_cr.csv', header=False, skip_lines=2) finally: warnings.resetwarnings() self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_row_limit(self): table1 = Table(self.rows[:2], self.column_names, self.column_types) table2 = Table.from_csv('examples/test.csv', row_limit=2) self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_row_limit_no_header_columns(self): table1 = Table(self.rows[:2], self.column_names, self.column_types) table2 = Table.from_csv('examples/test_no_header.csv', self.column_names, header=False, row_limit=2) self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) def test_from_csv_row_limit_too_high(self): table1 = Table(self.rows, self.column_names, self.column_types) table2 = Table.from_csv('examples/test.csv', row_limit=200) self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [Number, Text, Boolean, Date, DateTime, TimeDelta]) self.assertRows(table2, table1.rows) agate-1.9.1/tests/test_table/test_from_fixed.py000066400000000000000000000007351454111471500216360ustar00rootroot00000000000000from agate import Table from agate.testcase import AgateTestCase class TestFromFixed(AgateTestCase): def 
test_from_fixed(self): table1 = Table.from_csv('examples/testfixed_converted.csv') table2 = Table.from_fixed('examples/testfixed', 'examples/testfixed_schema.csv') self.assertColumnNames(table2, table1.column_names) self.assertColumnTypes(table2, [type(c) for c in table1.column_types]) self.assertRows(table2, table1.rows) agate-1.9.1/tests/test_table/test_group_by.py000066400000000000000000000074011454111471500213370ustar00rootroot00000000000000from decimal import Decimal from agate import Table, TableSet from agate.data_types import Boolean, Number, Text from agate.testcase import AgateTestCase class TestGroupBy(AgateTestCase): def setUp(self): self.rows = ( ('a', 2, 3, 4), (None, 3, 5, None), ('a', 2, 4, None), ('b', 3, 4, None) ) self.number_type = Number() self.text_type = Text() self.column_names = [ 'one', 'two', 'three', 'four' ] self.column_types = [ self.text_type, self.number_type, self.number_type, self.number_type ] def test_group_by(self): table = Table(self.rows, self.column_names, self.column_types) tableset = table.group_by('one') self.assertIsInstance(tableset, TableSet) self.assertEqual(len(tableset), 3) self.assertEqual(tableset.key_name, 'one') self.assertIsInstance(tableset.key_type, Text) self.assertIn('a', tableset.keys()) self.assertIn('b', tableset.keys()) self.assertIn(None, tableset.keys()) self.assertSequenceEqual(tableset['a'].columns['one'], ('a', 'a')) self.assertSequenceEqual(tableset['b'].columns['one'], ('b',)) def test_group_by_number(self): table = Table(self.rows, self.column_names, self.column_types) tableset = table.group_by('two') self.assertIsInstance(tableset, TableSet) self.assertEqual(len(tableset), 2) self.assertEqual(tableset.key_name, 'two') self.assertIsInstance(tableset.key_type, Number) self.assertIn(Decimal('2'), tableset.keys()) self.assertIn(Decimal('3'), tableset.keys()) self.assertSequenceEqual(tableset[Decimal('2')].columns['one'], ('a', 'a')) self.assertSequenceEqual(tableset[Decimal('3')].columns['one'], 
(None, 'b')) def test_group_by_key_name(self): table = Table(self.rows, self.column_names, self.column_types) tableset = table.group_by('one', key_name='test') self.assertIsInstance(tableset, TableSet) self.assertEqual(tableset.key_name, 'test') self.assertIsInstance(tableset.key_type, Text) self.assertIn('a', tableset.keys()) self.assertIn('b', tableset.keys()) self.assertIn(None, tableset.keys()) self.assertSequenceEqual(tableset['a'].columns['one'], ('a', 'a')) self.assertSequenceEqual(tableset['b'].columns['one'], ('b',)) def test_group_by_key_type(self): table = Table(self.rows, self.column_names, self.column_types) tableset = table.group_by('two', key_type=Text()) self.assertIsInstance(tableset, TableSet) self.assertEqual(tableset.key_name, 'two') self.assertIsInstance(tableset.key_type, Text) self.assertIn('2', tableset.keys()) self.assertIn('3', tableset.keys()) self.assertSequenceEqual(tableset['2'].columns['one'], ('a', 'a')) self.assertSequenceEqual(tableset['3'].columns['one'], (None, 'b')) def test_group_by_function(self): table = Table(self.rows, self.column_names, self.column_types) tableset = table.group_by(lambda r: r['three'] < 5, key_type=Boolean()) self.assertIsInstance(tableset, TableSet) self.assertEqual(len(tableset), 2) self.assertEqual(tableset.key_name, 'group') self.assertIn(True, tableset.keys()) self.assertIn(False, tableset.keys()) self.assertSequenceEqual(tableset[True].columns['one'], ('a', 'a', 'b')) self.assertSequenceEqual(tableset[False].columns['one'], (None,)) def test_group_by_bad_column(self): table = Table(self.rows, self.column_names, self.column_types) with self.assertRaises(KeyError): table.group_by('bad') agate-1.9.1/tests/test_table/test_homogenize.py000066400000000000000000000056231454111471500216610ustar00rootroot00000000000000from agate import Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TestHomogenize(AgateTestCase): def setUp(self): self.rows = ( (0, 4, 'a'), (1, 3, 
'b'), (None, 2, 'c') ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three'] self.column_types = [self.number_type, self.number_type, self.text_type] def test_homogenize_column_name(self): table = Table(self.rows, self.column_names, self.column_types) compare_values = range(3) homogenized = table.homogenize('one', compare_values, [3, 'd']) rows = ( (0, 4, 'a'), (1, 3, 'b'), (None, 2, 'c'), (2, 3, 'd') ) homogenized.print_table() self.assertColumnNames(homogenized, self.column_names) self.assertColumnTypes(homogenized, [Number, Number, Text]) self.assertRows(homogenized, rows) def test_homogenize_default_row(self): table = Table(self.rows, self.column_names, self.column_types) compare_values = [0, 1, 2] homogenized = table.homogenize(['one'], compare_values) rows = ( (0, 4, 'a'), (1, 3, 'b'), (None, 2, 'c'), (2, None, None) ) homogenized.print_table() self.assertColumnNames(homogenized, self.column_names) self.assertColumnTypes(homogenized, [Number, Number, Text]) self.assertRows(homogenized, rows) def test_homogenize_multiple_columns(self): table = Table(self.rows, self.column_names, self.column_types) def column_two(count): return [chr(ord('a') + c) for c in range(count)] homogenized = table.homogenize(['one', 'three'], zip(range(3), column_two(3)), [5]) rows = ( (0, 4, 'a'), (1, 3, 'b'), (None, 2, 'c'), (2, 5, 'c') ) homogenized.print_table() self.assertColumnNames(homogenized, self.column_names) self.assertColumnTypes(homogenized, [Number, Number, Text]) self.assertRows(homogenized, rows) def test_homogenize_lambda_default(self): table = Table(self.rows, self.column_names, self.column_types) def default_row(d): return [d[0], d[0] * 2, d[1]] def column_two(count): return [chr(ord('a') + c) for c in range(count)] homogenized = table.homogenize(['one', 'three'], zip(range(3), column_two(3)), default_row) rows = ( (0, 4, 'a'), (1, 3, 'b'), (None, 2, 'c'), (2, 4, 'c') ) homogenized.print_table() 
self.assertColumnNames(homogenized, self.column_names) self.assertColumnTypes(homogenized, [Number, Number, Text]) self.assertRows(homogenized, rows) agate-1.9.1/tests/test_table/test_join.py000066400000000000000000000274501454111471500204560ustar00rootroot00000000000000from agate import Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TestJoin(AgateTestCase): def setUp(self): self.left_rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, 'c') ) self.right_rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, 'c') ) self.number_type = Number() self.text_type = Text() self.left_column_names = ['one', 'two', 'three'] self.right_column_names = ['four', 'five', 'six'] self.column_types = [self.number_type, self.number_type, self.text_type] self.left = Table(self.left_rows, self.left_column_names, self.column_types) self.right = Table(self.right_rows, self.right_column_names, self.column_types) def test_join(self): new_table = self.left.join(self.right, 'one', 'four') self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 4, 'a'), (2, 3, 'b', 3, 'b'), (None, 2, 'c', 2, 'c') ]) def test_join_column_indicies(self): new_table = self.left.join(self.right, 0, 0) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 4, 'a'), (2, 3, 'b', 3, 'b'), (None, 2, 'c', 2, 'c') ]) def test_join_match_multiple(self): left_rows = ( (1, 4, 'a'), (2, 3, 'b') ) right_rows = ( (1, 1, 'a'), (1, 2, 'a'), (2, 2, 'b') ) left = Table(left_rows, self.left_column_names, self.column_types) right = Table(right_rows, self.right_column_names, 
self.column_types) new_table = left.join(right, 'one', 'five') self.assertIsNot(new_table, left) self.assertIsNot(new_table, right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 'a'), (2, 3, 'b', 1, 'a'), (2, 3, 'b', 2, 'b') ]) def test_join2(self): new_table = self.left.join(self.right, 'one', 'five') self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', None, None), (2, 3, 'b', None, 'c'), (None, 2, 'c', None, None) ]) def test_join_same_column_name(self): right_column_names = ['four', 'one', 'six'] right = Table(self.right_rows, right_column_names, self.column_types) new_table = self.left.join(right, 'one') self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', None, None), (2, 3, 'b', None, 'c'), (None, 2, 'c', None, None) ]) def test_join_multiple_columns(self): new_table = self.left.join( self.right, ['two', 'three'], ['five', 'six'] ) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four']) self.assertColumnTypes(new_table, [Number, Number, Text, Number]) self.assertRows(new_table, [ (1, 4, 'a', 1), (2, 3, 'b', 2), (None, 2, 'c', None) ]) def test_join_func(self): new_table = self.left.join( self.right, lambda left: '%i%s' % (left['two'], left['three']), lambda right: '%i%s' % (right['five'], right['six']) ) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, 
['one', 'two', 'three', 'four', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a'), (2, 3, 'b', 2, 3, 'b'), (None, 2, 'c', None, 2, 'c') ]) def test_join_column_does_not_exist(self): with self.assertRaises(KeyError): self.left.join(self.right, 'one', 'seven') def test_inner_join(self): new_table = self.left.join(self.right, 'one', 'four', inner=True) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 4, 'a'), (2, 3, 'b', 3, 'b'), (None, 2, 'c', 2, 'c') ]) def test_inner_join2(self): new_table = self.left.join(self.right, 'one', 'five', inner=True) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (2, 3, 'b', None, 'c') ]) def test_inner_join_same_column_name(self): right_column_names = ['four', 'one', 'six'] right = Table(self.right_rows, right_column_names, self.column_types) new_table = self.left.join(right, 'one', inner=True) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (2, 3, 'b', None, 'c') ]) def test_inner_join_func(self): new_table = self.left.join( self.right, lambda left: '%i%s' % (left['two'], left['three']), lambda right: '%i%s' % (right['five'], right['six']), inner=True ) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'five', 'six']) 
self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a') ]) def test_join_with_row_names(self): left = Table(self.left_rows, self.left_column_names, self.column_types, row_names='three') new_table = left.join(self.right, 'one', 'four') self.assertRowNames(new_table, ('a', 'b', 'c')) def test_join_require_match(self): with self.assertRaises(ValueError): self.left.join(self.right, 'one', 'five', require_match=True) with self.assertRaises(ValueError): self.left.join(self.right, 'one', 'five', require_match=True) self.left.join(self.right, 'one', 'four', require_match=True) def test_join_columns_kwarg(self): new_table = self.left.join(self.right, 'one', 'four', columns=['six']) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Text]) self.assertRows(new_table, [ (1, 4, 'a', 'a'), (2, 3, 'b', 'b'), (None, 2, 'c', 'c') ]) def test_join_columns_kwarg_right_key(self): new_table = self.left.join(self.right, 'one', 'four', columns=['four', 'six']) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 'a'), (2, 3, 'b', 2, 'b'), (None, 2, 'c', None, 'c') ]) def test_join_rows_are_tuples(self): new_table = self.left.join(self.right, 'one', 'four', columns=['four', 'six']) self.assertIsInstance(new_table.rows[0].values(), tuple) def test_full_outer(self): left_rows = ( (1, 4, 'a'), (2, 3, 'b'), (3, 2, 'c') ) right_rows = ( (1, 4, 'a'), (2, 3, 'b'), (4, 2, 'c') ) left = Table(left_rows, self.left_column_names, self.column_types) right = Table(right_rows, self.right_column_names, self.column_types) new_table = left.join(right, 'one', 'four', 
full_outer=True) self.assertIsNot(new_table, left) self.assertIsNot(new_table, right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a'), (2, 3, 'b', 2, 3, 'b'), (3, 2, 'c', None, None, None), (None, None, None, 4, 2, 'c') ]) def test_join_by_row_number(self): new_table = self.left.join(self.right, full_outer=True) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a'), (2, 3, 'b', 2, 3, 'b'), (None, 2, 'c', None, 2, 'c') ]) def test_join_by_row_number_short_right(self): right_rows = self.right_rows + ((7, 9, 'z'),) right = Table(right_rows, self.right_column_names, self.column_types) new_table = self.left.join(right, full_outer=True) self.assertIsNot(new_table, self.left) self.assertIsNot(new_table, right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a'), (2, 3, 'b', 2, 3, 'b'), (None, 2, 'c', None, 2, 'c'), (None, None, None, 7, 9, 'z') ]) def test_join_by_row_number_short_left(self): left_rows = self.left_rows + ((7, 9, 'z'),) left = Table(left_rows, self.left_column_names, self.column_types) new_table = left.join(self.right, full_outer=True) self.assertIsNot(new_table, left) self.assertIsNot(new_table, self.right) self.assertColumnNames(new_table, ['one', 'two', 'three', 'four', 'five', 'six']) self.assertColumnTypes(new_table, [Number, Number, Text, Number, Number, Text]) self.assertRows(new_table, [ (1, 4, 'a', 1, 4, 'a'), (2, 3, 'b', 2, 3, 'b'), (None, 2, 'c', None, 2, 'c'), (7, 9, 'z', None, None, 
None) ]) agate-1.9.1/tests/test_table/test_merge.py000066400000000000000000000076741454111471500206240ustar00rootroot00000000000000from agate import Table from agate.data_types import Number, Text from agate.exceptions import DataTypeError from agate.testcase import AgateTestCase class TestMerge(AgateTestCase): def setUp(self): self.rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, 'c') ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three'] self.column_types = [self.number_type, self.number_type, self.text_type] def test_merge(self): table_a = Table(self.rows, self.column_names, self.column_types) table_b = Table(self.rows, self.column_names) table_c = Table.merge([table_a, table_b]) self.assertIsNot(table_c, table_a) self.assertIsNot(table_c, table_b) self.assertColumnNames(table_c, self.column_names) self.assertColumnTypes(table_c, [Number, Number, Text]) self.assertRows(table_c, self.rows + self.rows) def test_merge_different_names(self): table_a = Table(self.rows, self.column_names, self.column_types) column_names = ['a', 'b', 'c'] table_b = Table(self.rows, column_names, self.column_types) table_c = Table.merge([table_a, table_b]) self.assertIsNot(table_c, table_a) self.assertIsNot(table_c, table_b) self.assertColumnNames(table_c, self.column_names + column_names) self.assertColumnTypes(table_c, [Number, Number, Text, Number, Number, Text]) self.assertSequenceEqual(table_c.rows[0], [1, 4, 'a', None, None, None]) self.assertSequenceEqual(table_c.rows[3], [None, None, None, 1, 4, 'a']) for row in table_c.rows: self.assertSequenceEqual(row.keys(), self.column_names + column_names) def test_merge_mixed_names(self): table_a = Table(self.rows, self.column_names, self.column_types) column_names = ['two', 'one', 'four'] table_b = Table(self.rows, column_names, self.column_types) table_c = Table.merge([table_a, table_b]) self.assertIsNot(table_c, table_a) self.assertIsNot(table_c, table_b) self.assertColumnNames(table_c, ['one', 
'two', 'three', 'four']) self.assertColumnTypes(table_c, [Number, Number, Text, Text]) self.assertSequenceEqual(table_c.rows[0], [1, 4, 'a', None]) self.assertSequenceEqual(table_c.rows[3], [4, 1, None, 'a']) for row in table_c.rows: self.assertSequenceEqual(row.keys(), ['one', 'two', 'three', 'four']) def test_merge_different_types(self): table_a = Table(self.rows, self.column_names, self.column_types) column_types = [self.number_type, self.text_type, self.text_type] table_b = Table(self.rows, self.column_names, column_types) with self.assertRaises(DataTypeError): Table.merge([table_a, table_b]) def test_merge_with_row_names(self): table_a = Table(self.rows, self.column_names, self.column_types, row_names='three') b_rows = ( (1, 4, 'd'), (2, 3, 'e'), (None, 2, 'f') ) table_b = Table(b_rows, self.column_names, self.column_types, row_names='three') table_c = Table.merge([table_a, table_b], row_names='three') self.assertRowNames(table_c, ['a', 'b', 'c', 'd', 'e', 'f']) def test_merge_with_column_names(self): table_a = Table(self.rows, self.column_names, self.column_types, row_names='three') b_rows = ( (1, 4, 'd'), (2, 3, 'e'), (None, 2, 'f') ) c_rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, 'c'), (None, 4, 'd'), (None, 3, 'e'), (None, 2, 'f') ) table_b = Table(b_rows, ['a', 'two', 'three'], self.column_types, row_names='three') table_c = Table.merge([table_a, table_b], column_names=table_a.column_names) self.assertRows(table_c, c_rows) agate-1.9.1/tests/test_table/test_normalize.py000066400000000000000000000077221454111471500215170ustar00rootroot00000000000000from agate import Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase from agate.type_tester import TypeTester class TestNormalize(AgateTestCase): def setUp(self): self.rows = ( (1, 'c', 4, 'a'), (2, 'e', 3, 'b'), (None, 'g', 2, 'c') ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three', 'four'] self.column_types = [self.number_type, 
self.text_type, self.number_type, self.text_type] def test_normalize(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize('one', 'three') normal_rows = ( (1, 'three', 4), (2, 'three', 3), (None, 'three', 2) ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['one', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Number]) def test_normalize_column_types(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize('one', 'three', column_types=[Text(), Text()]) normal_rows = ( (1, 'three', '4'), (2, 'three', '3'), (None, 'three', '2') ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['one', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Text]) def test_normalize_column_type_tester(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize('one', 'three', column_types=TypeTester(force={'value': Text()})) normal_rows = ( (1, 'three', '4'), (2, 'three', '3'), (None, 'three', '2') ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['one', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Text]) def test_normalize_multiple_fields(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize('one', ['three', 'four']) normal_rows = ( (1, 'three', '4'), (1, 'four', 'a'), (2, 'three', '3'), (2, 'four', 'b'), (None, 'three', '2'), (None, 'four', 'c') ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['one', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Text]) def test_normalize_multiple_keys(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize(['one', 'two'], ['three', 
'four']) normal_rows = ( (1, 'c', 'three', '4'), (1, 'c', 'four', 'a'), (2, 'e', 'three', '3'), (2, 'e', 'four', 'b'), (None, 'g', 'three', '2'), (None, 'g', 'four', 'c') ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['one', 'two', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Text, Text]) def test_normalize_change_order(self): table = Table(self.rows, self.column_names, self.column_types) normalized_table = table.normalize('three', ['one', 'four']) normal_rows = ( (4, 'one', '1'), (4, 'four', 'a'), (3, 'one', '2'), (3, 'four', 'b'), (2, 'one', None), (2, 'four', 'c') ) self.assertRows(normalized_table, normal_rows) self.assertColumnNames(normalized_table, ['three', 'property', 'value']) self.assertColumnTypes(normalized_table, [Number, Text, Text]) agate-1.9.1/tests/test_table/test_order_py.py000066400000000000000000000101721454111471500213330ustar00rootroot00000000000000from agate import Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TestOrderBy(AgateTestCase): def setUp(self): self.rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, '👍') ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three'] self.column_types = [self.number_type, self.number_type, self.text_type] def test_order_by(self): table = Table(self.rows, self.column_names, self.column_types) new_table = table.order_by('two') self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ self.rows[2], self.rows[1], self.rows[0] ]) # Verify old table not changed self.assertRows(table, self.rows) def test_order_by_multiple_columns(self): rows = ( (1, 2, 'a'), (2, 1, 'b'), (1, 1, 'c') ) table = Table(rows, self.column_names, self.column_types) new_table = table.order_by(['one', 'two']) self.assertIsNot(new_table, table) 
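The normalization tests above all check the same wide-to-long reshaping: one key column is kept, and each selected property column becomes a (key, property, value) row. As an illustrative stdlib sketch of that reshaping only — not agate's `Table.normalize` implementation, and without agate's type coercion of the resulting value column:

```python
def normalize(rows, column_names, key, properties):
    # Reshape wide rows into (key, property, value) triples, mirroring
    # the wide-to-long reshaping exercised by the tests above.
    key_index = column_names.index(key)
    long_rows = []
    for row in rows:
        for prop in properties:
            long_rows.append((row[key_index], prop, row[column_names.index(prop)]))
    return long_rows

rows = [
    (1, 'c', 4, 'a'),
    (2, 'e', 3, 'b'),
    (None, 'g', 2, 'c'),
]
column_names = ['one', 'two', 'three', 'four']
print(normalize(rows, column_names, 'one', ['three', 'four']))
```

With the test fixture above this yields one output row per (input row, property) pair, in the same order the tests assert.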
self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ rows[2], rows[0], rows[1] ]) def test_order_by_func(self): rows = ( (1, 2, 'a'), (2, 1, 'b'), (1, 1, 'c') ) table = Table(rows, self.column_names, self.column_types) new_table = table.order_by(lambda r: (r['one'], r['two'])) self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ rows[2], rows[0], rows[1] ]) def test_order_by_reverse(self): table = Table(self.rows, self.column_names, self.column_types) new_table = table.order_by(lambda r: r['two'], reverse=True) self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ self.rows[0], self.rows[1], self.rows[2] ]) def test_order_by_nulls(self): rows = ( (1, 2, None), (2, None, None), (1, 1, 'c'), (1, None, 'a') ) table = Table(rows, self.column_names, self.column_types) new_table = table.order_by('two') self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ rows[2], rows[0], rows[1], rows[3] ]) new_table = table.order_by('three') self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ rows[3], rows[2], rows[0], rows[1] ]) new_table = table.order_by(('two', 'three')) self.assertIsNot(new_table, table) self.assertColumnNames(new_table, self.column_names) self.assertColumnTypes(new_table, [Number, Number, Text]) self.assertRows(new_table, [ rows[2], rows[0], rows[1], rows[3] ]) def test_order_by_with_row_names(self): table = Table(self.rows, self.column_names, self.column_types, row_names='three') new_table = 
table.order_by('two') self.assertRowNames(new_table, ['👍', 'b', 'a']) def test_order_by_empty_table(self): table = Table([], self.column_names) table.order_by('three') agate-1.9.1/tests/test_table/test_pivot.py000066400000000000000000000171071454111471500206560ustar00rootroot00000000000000import sys from decimal import Decimal from agate import Table from agate.aggregations import Sum from agate.computations import Percent from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TestPivot(AgateTestCase): def setUp(self): self.rows = ( ('joe', 'white', 'male', 20, 'blue'), ('jane', 'white', 'female', 20, 'blue'), ('josh', 'black', 'male', 20, 'blue'), ('jim', 'latino', 'male', 25, 'blue'), ('julia', 'white', 'female', 25, 'green'), ('joan', 'asian', 'female', 25, 'green') ) self.number_type = Number() self.text_type = Text() self.column_names = ['name', 'race', 'gender', 'age', 'color'] self.column_types = [self.text_type, self.text_type, self.text_type, self.number_type, self.text_type] def test_pivot(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('race', 'gender') pivot_rows = ( ('white', 1, 2), ('black', 1, 0), ('latino', 1, 0), ('asian', 0, 1) ) self.assertColumnNames(pivot_table, ['race', 'male', 'female']) self.assertRowNames(pivot_table, ['white', 'black', 'latino', 'asian']) self.assertColumnTypes(pivot_table, [Text, Number, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_by_lambda(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot(lambda r: r['gender']) pivot_rows = ( ('male', 3), ('female', 3) ) self.assertColumnNames(pivot_table, ['group', 'Count']) self.assertRowNames(pivot_table, ['male', 'female']) self.assertColumnTypes(pivot_table, [Text, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_by_lambda_group_name(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = 
table.pivot(lambda r: r['gender'], key_name='gender') pivot_rows = ( ('male', 3), ('female', 3) ) self.assertColumnNames(pivot_table, ['gender', 'Count']) self.assertRowNames(pivot_table, ['male', 'female']) self.assertColumnTypes(pivot_table, [Text, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_by_lambda_group_name_sequence_invalid(self): table = Table(self.rows, self.column_names, self.column_types) with self.assertRaises(ValueError): table.pivot(['race', 'gender'], key_name='foo') def test_pivot_no_key(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot(pivot='gender') pivot_rows = ( (3, 3), ) self.assertColumnNames(pivot_table, ['male', 'female']) self.assertColumnTypes(pivot_table, [Number, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_no_pivot(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('race') pivot_rows = ( ('white', 3), ('black', 1), ('latino', 1), ('asian', 1) ) self.assertColumnNames(pivot_table, ['race', 'Count']) self.assertColumnTypes(pivot_table, [Text, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_sum(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('race', 'gender', Sum('age')) pivot_rows = ( ('white', 20, 45), ('black', 20, 0), ('latino', 25, 0), ('asian', 0, 25) ) self.assertColumnNames(pivot_table, ['race', 'male', 'female']) self.assertColumnTypes(pivot_table, [Text, Number, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_multiple_keys(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot(['race', 'gender'], 'age') pivot_rows = ( ('white', 'male', 1, 0), ('white', 'female', 1, 1), ('black', 'male', 1, 0), ('latino', 'male', 0, 1), ('asian', 'female', 0, 1), ) self.assertRows(pivot_table, pivot_rows) self.assertColumnNames(pivot_table, ['race', 'gender', '20', '25']) 
self.assertRowNames(pivot_table, [ ('white', 'male'), ('white', 'female'), ('black', 'male'), ('latino', 'male'), ('asian', 'female'), ]) self.assertColumnTypes(pivot_table, [Text, Text, Number, Number]) def test_pivot_multiple_keys_no_pivot(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot(['race', 'gender']) pivot_rows = ( ('white', 'male', 1), ('white', 'female', 2), ('black', 'male', 1), ('latino', 'male', 1), ('asian', 'female', 1), ) self.assertRows(pivot_table, pivot_rows) self.assertColumnNames(pivot_table, ['race', 'gender', 'Count']) self.assertColumnTypes(pivot_table, [Text, Text, Number]) def test_pivot_default_value(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('race', 'gender', default_value=None) pivot_rows = ( ('white', 1, 2), ('black', 1, None), ('latino', 1, None), ('asian', None, 1) ) self.assertColumnNames(pivot_table, ['race', 'male', 'female']) self.assertColumnTypes(pivot_table, [Text, Number, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_compute(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('gender', computation=Percent('Count')) pivot_table.print_table(output=sys.stdout) pivot_rows = ( ('male', Decimal(50)), ('female', Decimal(50)), ) self.assertColumnNames(pivot_table, ['gender', 'Percent']) self.assertColumnTypes(pivot_table, [Text, Number]) self.assertRows(pivot_table, pivot_rows) def test_pivot_compute_pivots(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('gender', 'color', computation=Percent('Count')) pivot_table.print_table(output=sys.stdout) pivot_rows = ( ('male', Decimal(50), 0), ('female', Decimal(1) / Decimal(6) * Decimal(100), Decimal(1) / Decimal(3) * Decimal(100)), ) self.assertColumnNames(pivot_table, ['gender', 'blue', 'green']) self.assertColumnTypes(pivot_table, [Text, Number, Number]) 
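Most of the pivot tests above exercise the default Count aggregation: one output row per distinct key value, one output column per distinct pivot value, with zero filling in absent combinations. A plain-Python cross-tabulation conveys the same idea — an illustrative sketch only, without agate's handling of custom aggregations, default values, or computations:

```python
from collections import Counter

def pivot_counts(rows, key_index, pivot_index):
    # Cross-tabulate: count each (key value, pivot value) pair,
    # then lay the counts out as a grid with 0 for missing cells.
    counts = Counter((row[key_index], row[pivot_index]) for row in rows)
    keys = list(dict.fromkeys(row[key_index] for row in rows))
    pivots = list(dict.fromkeys(row[pivot_index] for row in rows))
    header = ['group'] + pivots
    body = [[k] + [counts.get((k, p), 0) for p in pivots] for k in keys]
    return header, body

rows = [
    ('joe', 'white', 'male'),
    ('jane', 'white', 'female'),
    ('josh', 'black', 'male'),
    ('jim', 'latino', 'male'),
]
header, body = pivot_counts(rows, key_index=1, pivot_index=2)
print(header)
for row in body:
    print(row)
```

Both row order and column order follow first appearance in the input, which is why the expected pivot rows in the tests above are listed in encounter order.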
self.assertRows(pivot_table, pivot_rows) def test_pivot_compute_kwargs(self): table = Table(self.rows, self.column_names, self.column_types) pivot_table = table.pivot('gender', 'color', computation=Percent('Count', total=8)) pivot_table.print_table(output=sys.stdout) pivot_rows = ( ('male', Decimal(3) / Decimal(8) * Decimal(100), 0), ('female', Decimal(1) / Decimal(8) * Decimal(100), Decimal(2) / Decimal(8) * Decimal(100)), ) self.assertColumnNames(pivot_table, ['gender', 'blue', 'green']) self.assertColumnTypes(pivot_table, [Text, Number, Number]) self.assertRows(pivot_table, pivot_rows) agate-1.9.1/tests/test_table/test_print_bars.py000066400000000000000000000066241454111471500216620ustar00rootroot00000000000000from io import StringIO from babel.numbers import format_decimal from agate import Table from agate.data_types import Number, Text from agate.exceptions import DataTypeError from agate.testcase import AgateTestCase class TestPrintBars(AgateTestCase): def setUp(self): self.rows = ( ('1.7', 2000, 'a'), ('11.18', None, None), ('0', 1, 'c') ) self.number_type = Number() self.international_number_type = Number(locale='de_DE.UTF-8') self.text_type = Text() self.column_names = ['one', 'two', 'three'] self.column_types = [ self.number_type, self.international_number_type, self.text_type ] def test_print_bars(self): table = Table(self.rows, self.column_names, self.column_types) output = StringIO() table.print_bars('three', 'one', output=output) output.getvalue().split('\n') def test_print_bars_width(self): table = Table(self.rows, self.column_names, self.column_types) output = StringIO() table.print_bars('three', 'one', width=40, output=output) lines = output.getvalue().split('\n') self.assertEqual(max([len(line) for line in lines]), 40) def test_print_bars_width_overlap(self): table = Table(self.rows, self.column_names, self.column_types) output = StringIO() table.print_bars('three', 'one', width=20, output=output) lines = output.getvalue().split('\n') 
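The width tests above assert that every rendered line fits exactly within the requested width. The heart of that rendering is a value-to-characters scaling step, sketched here in plain Python — illustrative only; agate's `print_bars` additionally handles negative values, domains, nulls, and axis ticks:

```python
def bar(value, max_value, width):
    # Scale a value into a fixed-width run of bar characters,
    # padding with spaces so every line has the same length.
    filled = int(round(width * (value / max_value))) if max_value else 0
    return (':' * filled).ljust(width)

for label, value in [('a', 2000), ('b', 500), ('c', 1)]:
    print('%-2s |%s|' % (label, bar(value, 2000, 20)))
```

Because the bar is always padded to the full width, a `max(len(line) for line in lines)` check like the one in the tests above holds for any input values.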
self.assertEqual(max([len(line) for line in lines]), 20) def test_print_bars_domain(self): table = Table(self.rows, self.column_names, self.column_types) table.print_bars('three', 'one', domain=(0, 300)) def test_print_bars_domain_invalid(self): table = Table(self.rows, self.column_names, self.column_types) with self.assertRaises(ValueError): table.print_bars('three', 'one', domain=(5, 0)) def test_print_bars_negative(self): rows = ( ('-1.7', 2, 'a'), ('-11.18', None, None), ('0', 1, 'c') ) table = Table(rows, self.column_names, self.column_types) table.print_bars('three', 'one') def test_print_bars_mixed_signs(self): rows = ( ('-1.7', 2, 'a'), ('11.18', None, None), ('0', 1, 'c') ) table = Table(rows, self.column_names, self.column_types) table.print_bars('three', 'one') def test_print_bars_invalid_values(self): table = Table(self.rows, self.column_names, self.column_types) with self.assertRaises(DataTypeError): table.print_bars('one', 'three') def test_print_bars_with_nulls(self): table = Table(self.rows, self.column_names, self.column_types) output = StringIO() table.print_bars('three', 'two', width=20, printable=True, output=output) self.assertEqual(output.getvalue(), "three two\n" "a " + format_decimal(2000, format='#,##0') + " |:::::::\n" "None - | \n" "c 1 | \n" " +------+\n" " 0 " + format_decimal(2000, format='#,##0') + "\n") agate-1.9.1/tests/test_table/test_print_html.py000066400000000000000000000110321454111471500216640ustar00rootroot00000000000000import warnings from html.parser import HTMLParser from io import StringIO from agate import Table from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TableHTMLParser(HTMLParser): """ Parser for use in testing HTML rendering of tables. 
""" def __init__(self): warnings.simplefilter('ignore') try: HTMLParser.__init__(self) finally: warnings.resetwarnings() self.has_table = False self.has_thead = False self.has_tbody = False self.header_rows = [] self.body_rows = [] self._in_table = False self._in_thead = False self._in_tbody = False self._in_cell = False self._cell_data = None def handle_starttag(self, tag, attrs): if tag == 'table': self._in_table = True return if tag == 'thead': self._in_thead = True return if tag == 'tbody': self._in_tbody = True return if tag == 'tr': self._current_row = [] return if tag in ('td', 'th'): self._in_cell = True return def handle_endtag(self, tag): if tag == 'table': if self._in_table: self.has_table = True self._in_table = False return if tag == 'thead': if self._in_thead: self.has_thead = True self._in_thead = False return if tag == 'tbody': if self._in_tbody: self.has_tbody = True self._in_tbody = False return if tag == 'tr': if self._in_tbody: self.body_rows.append(self._current_row) elif self._in_thead: self.header_rows.append(self._current_row) return if tag in ('td', 'th'): self._current_row.append(self._cell_data) self._cell_data = None self._in_cell = False return def handle_data(self, data): if self._in_cell: self._cell_data = data return class TestPrintHTML(AgateTestCase): def setUp(self): self.rows = ( (1, 4, 'a'), (2, 3, 'b'), (None, 2, '👍') ) self.number_type = Number() self.text_type = Text() self.column_names = ['one', 'two', 'three'] self.column_types = [self.number_type, self.number_type, self.text_type] def test_print_html(self): table = Table(self.rows, self.column_names, self.column_types) table_html = StringIO() table.print_html(output=table_html) table_html = table_html.getvalue() parser = TableHTMLParser() parser.feed(table_html) self.assertIs(parser.has_table, True) self.assertIs(parser.has_tbody, True) self.assertIs(parser.has_thead, True) self.assertEqual(len(parser.header_rows), 1) self.assertEqual(len(parser.body_rows), len(table.rows)) 
header_cols = parser.header_rows[0] self.assertEqual(len(header_cols), len(table.column_names)) for i, column_name in enumerate(table.column_names): self.assertEqual(header_cols[i], column_name) for row_num, row in enumerate(table.rows): html_row = parser.body_rows[row_num] self.assertEqual(len(html_row), len(row)) def test_print_html_tags(self): table = Table(self.rows, self.column_names, self.column_types) output = StringIO() table.print_html(output=output) html = output.getvalue() self.assertEqual(html.count(''), 4) self.assertEqual(html.count(''), 3) self.assertEqual(html.count(''), 4) self.assertEqual(html.count(''), 3) self.assertEqual(html.count(''), 4) self.assertEqual(html.count(''), 3) self.assertEqual(html.count('= 3 and t['number_sum'] > 6) self.assertIsInstance(new_tableset, TableSet) self.assertSequenceEqual(new_tableset.keys(), ['table2']) self.assertIs(new_tableset.values()[0], tableset['table2']) self.assertEqual(new_tableset.key_name, 'test') agate-1.9.1/tests/test_tableset/test_merge.py000066400000000000000000000052251454111471500213260ustar00rootroot00000000000000from collections import OrderedDict from agate import Table, TableSet from agate.data_types import Number, Text from agate.testcase import AgateTestCase class TestAggregate(AgateTestCase): def setUp(self): self.table1 = ( ('a', 1), ('a', 3), ('b', 2) ) self.table2 = ( ('b', 0), ('a', 2), ('c', 5) ) self.table3 = ( ('a', 1), ('a', 2), ('c', 3) ) self.text_type = Text() self.number_type = Number() self.column_names = ['letter', 'number'] self.column_types = [self.text_type, self.number_type] self.tables = OrderedDict([ ('table1', Table(self.table1, self.column_names, self.column_types)), ('table2', Table(self.table2, self.column_names, self.column_types)), ('table3', Table(self.table3, self.column_names, self.column_types)) ]) def test_merge(self): tableset = TableSet(self.tables.values(), self.tables.keys()) table = tableset.merge() self.assertColumnNames(table, ['group', 'letter', 
'number']) self.assertColumnTypes(table, [Text, Text, Number]) self.assertEqual(len(table.rows), 9) self.assertSequenceEqual(table.rows[0], ['table1', 'a', 1]) self.assertSequenceEqual(table.rows[8], ['table3', 'c', 3]) def test_merge_key_name(self): tableset = TableSet(self.tables.values(), self.tables.keys(), key_name='foo') table = tableset.merge() self.assertColumnNames(table, ['foo', 'letter', 'number']) self.assertColumnTypes(table, [Text, Text, Number]) def test_merge_groups(self): tableset = TableSet(self.tables.values(), self.tables.keys(), key_name='foo') table = tableset.merge(groups=['red', 'blue', 'green'], group_name='color_code') self.assertColumnNames(table, ['color_code', 'letter', 'number']) self.assertColumnTypes(table, [Text, Text, Number]) self.assertEqual(len(table.rows), 9) self.assertSequenceEqual(table.rows[0], ['red', 'a', 1]) self.assertSequenceEqual(table.rows[8], ['green', 'c', 3]) def test_merge_groups_invalid_length(self): tableset = TableSet(self.tables.values(), self.tables.keys()) with self.assertRaises(ValueError): tableset.merge(groups=['red', 'blue'], group_name='color_code') def test_merge_groups_invalid_type(self): tableset = TableSet(self.tables.values(), self.tables.keys()) with self.assertRaises(ValueError): tableset.merge(groups='invalid', group_name='color_code') agate-1.9.1/tests/test_type_tester.py000066400000000000000000000115111454111471500177270ustar00rootroot00000000000000import unittest from agate.data_types import Boolean, Date, DateTime, Number, Text, TimeDelta from agate.type_tester import TypeTester class TestTypeTester(unittest.TestCase): def setUp(self): self.tester = TypeTester() def test_empty(self): rows = [ (None,), (None,), (None,), ] inferred = self.tester.run(rows, ['one']) # This behavior is not necessarily desirable. 
See https://github.com/wireservice/agate/issues/371 self.assertIsInstance(inferred[0], Boolean) def test_text_type(self): rows = [ ('a',), ('b',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Text) def test_number_type(self): rows = [ ('1.7',), ('200000000',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) def test_number_percent(self): rows = [ ('1.7%',), ('200000000%',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) def test_number_currency(self): rows = [ ('$1.7',), ('$200000000',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) def test_number_currency_locale(self): rows = [ ('£1.7',), ('£200000000',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) def test_boolean_type(self): rows = [ ('True',), ('FALSE',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Boolean) def test_date_type(self): rows = [ ('5/7/1984',), ('2/28/1997',), ('3/19/2020',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Date) def test_date_type_iso_format(self): rows = [ ('1984-05-07',), ('1997-02-28',), ('2020-03-19',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Date) def test_date_time_type(self): rows = [ ('5/7/84 3:44:12',), ('2/28/1997 3:12 AM',), ('3/19/20 4:40 PM',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], DateTime) def test_date_time_type_isoformat(self): rows = [ ('1984-07-05T03:44:12',), ('1997-02-28T03:12:00',), ('2020-03-19T04:40:00',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], DateTime) def test_time_delta_type(self): rows = [ ('1:42',), ('1w 27h',), ('',) ] inferred = self.tester.run(rows, ['one']) self.assertIsInstance(inferred[0], TimeDelta) def 
test_force_type(self): rows = [ ('1.7',), ('200000000',), ('',) ] tester = TypeTester(force={ 'one': Text() }) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Text) def test_limit(self): rows = [ ('1.7',), ('foo',), ('',) ] tester = TypeTester(limit=1) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) tester = TypeTester(limit=2) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Text) def test_types_force_text(self): rows = [ ('1.7',), ('200000000',), ('',) ] tester = TypeTester(types=[Text()]) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Text) def test_types_no_boolean(self): rows = [ ('True',), ('False',), ('False',) ] tester = TypeTester(types=[Number(), Text()]) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Text) def test_types_number_locale(self): rows = [ ('1,7',), ('200.000.000',), ('',) ] tester = TypeTester(types=[Number(locale='de_DE.UTF-8'), Text()]) inferred = tester.run(rows, ['one']) self.assertIsInstance(inferred[0], Number) self.assertEqual(str(inferred[0].locale), 'de_DE') agate-1.9.1/tests/test_utils.py000066400000000000000000000043451454111471500165270ustar00rootroot00000000000000import unittest from decimal import Decimal from agate.utils import Quantiles, letter_name, round_limits class TestQuantiles(unittest.TestCase): def setUp(self): self.values = [0, 10, 20, 30, 40, 50] self.quantiles = Quantiles(self.values) def test_methods(self): self.assertEqual(len(self.quantiles), 6) self.assertEqual(self.quantiles[2], 20) self.assertSequenceEqual(list(self.quantiles), self.values) self.assertEqual(repr(self.quantiles), repr(self.values)) def test_locate(self): self.assertEqual(self.quantiles.locate(25), 2) self.assertEqual(self.quantiles.locate(40), 4) with self.assertRaises(ValueError): self.quantiles.locate(-10) with self.assertRaises(ValueError): self.quantiles.locate(51) class TestMisc(unittest.TestCase): def 
test_round_limits(self): self.assertEqual( round_limits(Decimal('-2.7'), Decimal('2.7')), (Decimal('-3'), Decimal('3')) ) self.assertEqual( round_limits(Decimal('-2.2'), Decimal('2.2')), (Decimal('-3'), Decimal('3')) ) self.assertEqual( round_limits(Decimal('-2.22'), Decimal('2.22')), (Decimal('-3'), Decimal('3')) ) self.assertEqual( round_limits(Decimal('0'), Decimal('75')), (Decimal('0'), Decimal('80')) ) self.assertEqual( round_limits(Decimal('45'), Decimal('300')), (Decimal('0'), Decimal('300')) ) self.assertEqual( round_limits(Decimal('200.75'), Decimal('715.345')), (Decimal('200'), Decimal('800')) ) self.assertEqual( round_limits(Decimal('0.75'), Decimal('0.800')), (Decimal('0'), Decimal('1')) ) self.assertEqual( round_limits(Decimal('-0.505'), Decimal('0.47')), (Decimal('-0.6'), Decimal('0.5')) ) def test_letter_name(self): self.assertEqual(letter_name(0), 'a') self.assertEqual(letter_name(4), 'e') self.assertEqual(letter_name(25), 'z') self.assertEqual(letter_name(30), 'ee') self.assertEqual(letter_name(77), 'zzz') agate-1.9.1/tutorial.ipynb000066400000000000000000003653241454111471500155310ustar00rootroot00000000000000{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# The Agate Tutorial\n", "\n", "The best way to learn to use any tool is to actually use it. In this tutorial we will use agate to answer some basic questions about a dataset.\n", "\n", "The data we will be using is a copy of the [National Registry of Exonerations]( https://www.law.umich.edu/special/exoneration/Pages/detaillist.aspx) made on August 28th, 2015. This dataset lists individuals who are known to have been exonerated after having been wrongly convicted in United States courts. At the time this data was copied there were 1,651 entries in the registry." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Installing agate\n", "\n", "Installing agate from the command line is easy:\n", "\n", " pip install agate\n", "\n", "Note: You should be installing agate inside a [virtualenv](https://virtualenv.readthedocs.io/en/stable/>). If for some crazy reason you aren't using virtualenv you will need to add a ``sudo`` to the previous command.*\n", "\n", "For more detailed installation instructions, see the [Installation](https://agate.readthedocs.io/en/latest/install.html) section of the documentation." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Getting the data\n", "\n", "If you're just reading this tutorial you can skip this section. If you want to try working through it on your own then you'll need to download the data.\n", "\n", "It can be downloaded from\n", "\n", " curl -L -O https://github.com/wireservice/agate/raw/master/examples/realdata/exonerations-20150828.csv\n", "\n", "The rest of this tutorial will expect that data to be located in `examples/realdata`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Importing agate\n", "\n", "Let's get started!" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import agate" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Loading data from a CSV\n", "\n", "The [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#module-agate.table) is the basic class in agate. 
To create a table from a CSV we use the [`Table.from_csv`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.from_csv) class method:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [], "source": [ "exonerations = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "With no other arguments specified, agate will automatically create an instance of [`TypeTester`](https://agate.readthedocs.io/en/latest/api/type_tester.html#agate.TypeTester) and use it to figure out the type of each column. TypeTester is a \"best guess\" approach to determining the kinds of data in your table. It can guess wrong. In that case you can create a TypeTester manually and use the ``force`` argument to override its guess for a specific column:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [], "source": [ "tester = agate.TypeTester(force={\n", " 'false_evidence': agate.Boolean()\n", "})\n", "\n", "exonerations = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv', column_types=tester)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you already know the types of your data you may wish to skip the TypeTester entirely. You may pass sequences of column names and column types to [`Table.from_csv`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.from_csv) as the ``column_names`` and ``column_types`` arguments, respectively.\n", "\n", "For larger datasets the [`TypeTester`](https://agate.readthedocs.io/en/latest/api/type_tester.html#agate.TypeTester) can be slow to evaluate the data. 
In that case you can specify a `limit` argument to restrict the amount of data it will use to infer types:" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [], "source": [ "tester = agate.TypeTester(limit=100)\n", "\n", "exonerations = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv', column_types=tester)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The dataset we are using in this tutorial is simple enough that we can rely on the built-in TypeTester to guess quickly and accurately.\n", "\n", "**Note:** agate's CSV reader and writer support unicode and other encodings for both Python 2 and Python 3. Try using them as a drop-in replacement for Python's builtin module: `from agate import csv`.\n", "\n", "**Note:** agate also has [`Table.from_json`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.from_json) for creating tables from JSON data." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Describing the table\n", "====================\n", "\n", "If you're working with new data, or you just need a refresher, you may want to review what columns are in the table. 
You can do this with the [`.Table.print_structure`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.print_structure) method or by just calling `print` on the table:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "| column | data_type |\n", "| ------------------- | --------- |\n", "| last_name | Text |\n", "| first_name | Text |\n", "| age | Number |\n", "| race | Text |\n", "| state | Text |\n", "| tags | Text |\n", "| crime | Text |\n", "| sentence | Text |\n", "| convicted | Number |\n", "| exonerated | Number |\n", "| dna | Boolean |\n", "| dna_essential | Text |\n", "| mistake_witness | Boolean |\n", "| false_confession | Boolean |\n", "| perjury | Boolean |\n", "| false_evidence | Boolean |\n", "| official_misconduct | Boolean |\n", "| inadequate_defense | Boolean |\n", "\n" ] } ], "source": [ "print(exonerations)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Navigating table data\n", "=====================\n", "\n", "agate goes to great pains to make accessing the data in your tables work seamlessly for a wide variety of use-cases. Access by both [`Column`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Column) and [`Row`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Row) is supported, via the [`Table.columns`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.columns) and [`Table.rows`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.rows) attributes respectively.\n", "\n", "All four of these objects are examples of [`.MappedSequence`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.MappedSequence), the foundational type that underlies much of agate's functionality. 
A MappedSequence functions very similarly to a standard Python [`dict`](https://docs.python.org/3/tutorial/datastructures.html#dictionaries), with a few important exceptions:\n", "\n", "* Data may be accessed either by numeric index (e.g. column number) or by a non-integer key (e.g. column name).\n", "* Items are ordered, just like an instance of [`collections.OrderedDict`](https://docs.python.org/3/library/collections.html#collections.OrderedDict).\n", "* Iterating over the sequence returns its *values*, rather than its *keys*.\n", "\n", "To demonstrate the first point, these two lines are both valid ways of getting the first column in the `exonerations` table:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.columns['last_name']" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.columns[0]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In the same way, rows can be accessed either by numeric index or by an optional, unique \"row name\" specified when the table is created. 
In this tutorial we won't use row names, but here is an example of how they work:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations = agate.Table.from_csv('examples/realdata/exonerations-20150828.csv', row_names=lambda r: '%(last_name)s, %(first_name)s' % (r))\n", "\n", "exonerations.rows[0]" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.rows['Abbitt, Joseph Lamont']" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this case we create our row names using a [`lambda`](https://docs.python.org/3/tutorial/controlflow.html#lambda-expressions) function that takes a row and returns a unique identifier. If your data has a unique column, you can also just pass the column name. (For example, a column of USPS abbreviations or FIPS codes.) Note, however, that your row names can never be `int`, because that is reserved for indexing by numeric order. 
(A [`decimal.Decimal`](https://docs.python.org/3/library/decimal.html#decimal.Decimal) or stringified integer is just fine.)\n", "\n", "Once you've got a specific row, you can then access its individual values (cells, in spreadsheet-speak) either by numeric index or column name:" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "'Abbitt'" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "row = exonerations.rows[0]\n", "\n", "row[0]" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "'Abbitt'" ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "row['last_name']" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "And the same goes for columns, which can be indexed numerically or by row name (if one has been set up):" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "'Child Sex Abuse'" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "column = exonerations.columns['crime']\n", "\n", "column[0]" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "'Child Sex Abuse'" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "column['Abbitt, Joseph Lamont']" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For any instance of [`MappedSequence`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.MappedSequence), iteration returns values, *in order*. 
Here we print only the first ten:" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Abbitt\n", "Abdal\n", "Abernathy\n", "Acero\n", "Adams\n", "Adams\n", "Adams\n", "Adams\n", "Adams\n", "Adams\n" ] } ], "source": [ "for row in exonerations.rows[:10]:\n", " print(row['last_name'])" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "To summarize, the four most common data structures in agate ([`Column`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Column), [`Row`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Row), [`Table.columns`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.columns) and [`Table.rows`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.rows)) are all instances of [`MappedSequence`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.MappedSequence) and therefore all behave in a uniform way. This is also true of [`TableSet`](https://agate.readthedocs.io/en/latest/api/tableset.html), which we will discuss later on." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Aggregating column data\n", "=======================\n", "\n", "With the basics out of the way, let's do some actual analysis. Analysis begins with questions, so let's ask some.\n", "\n", "**Question:** How many exonerations involved a false confession?\n", "\n", "Answering this question involves counting the number of ``True`` values in the ``false_confession`` column. When we created the table we specified that the data in this column contained [`Boolean`](https://agate.readthedocs.io/en/latest/api/data_types.html#agate.Boolean) data. 
Because of this, agate has taken care of coercing the original text data from the CSV into Python's ``True`` and ``False`` values.\n", "\n", "We'll answer the question using an instance of [`Count`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Count), which is a type of [`Aggregation`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Aggregation). Aggregations are used to perform \"column-wise\" calculations. That is, they derive a new single value from the contents of a column. The [`Count`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Count) aggregation can count either all values in a column, or how many times a particular value appears.\n", "\n", "An Aggregation is applied to a table using [`Table.aggregate`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.aggregate).\n", "\n", "It sounds complicated, but it's really simple. Putting it all together looks like this:" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "211" ] }, "execution_count": 15, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.aggregate(agate.Count('false_confession', True))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's look at another example, this time using a numerical aggregation.\n", "\n", "**Question:** What was the median age of exonerated individuals at time of arrest?" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/cgroskopf/src/agate/agate/aggregations/median.py:37: NullCalculationWarning: Column \"age\" contains nulls. 
These will be excluded from Median calculation.\n" ] }, { "data": { "text/plain": [ "Decimal('26')" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.aggregate(agate.Median('age'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The answer to our question is \"26 years old\", however, as the warnings indicate, not every exonerated individual in the data has a value for the ``age`` column. The [`Median`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Median) statistical operation has no standard way of accounting for null values, so it leaves them out of the calculation.\n", "\n", "**Question:** How many individuals do not have an age specified in the data?\n", "\n", "Now that we know there are null values in the ``age`` column, we might worry about our sample size. What if most of the rows don't have an age?" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "9" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "exonerations.aggregate(agate.Count('age', None))" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "Only nine rows in this dataset don't have age, so it's certainly still useful to compute a median. However, we might still want to filter those rows out so we could have a consistent sample for all of our calculations. In the next section you'll learn how to do just that.\n", "\n", "Different [`aggregations`](https://agate.readthedocs.io/en/latest/api/aggregations.html) can be applied depending on the type of data in each column. If none of the provided aggregations suit your needs you can use [`Summary`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Summary) to apply an arbitrary function to a column. 
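\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For example, ``Summary`` can compute the range of a numeric column. The following is a minimal sketch using a small inline table (hypothetical data, not the exonerations dataset), assuming only that agate is installed:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import agate\n", "\n", "# A small inline table so the sketch is self-contained.\n", "numbers = agate.Table(\n", "    [['a', 1], ['b', 4], ['c', 9]],\n", "    ['name', 'value'],\n", "    [agate.Text(), agate.Number()]\n", ")\n", "\n", "# Summary applies an arbitrary function to a Column; here, max minus min.\n", "value_range = numbers.aggregate(\n", "    agate.Summary('value', agate.Number(), lambda col: max(col.values()) - min(col.values()))\n", ")\n", "\n", "value_range" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "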
If that still doesn't suit your needs you can always create your own aggregation from scratch by subclassing [`Aggregation`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Aggregation)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Selecting and filtering data\n", "============================\n", "\n", "So what if those rows with no age were going to flummox our analysis? Agate's [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table) class provides a full suite of SQL-like operations including [`Table.select`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.select) for grabbing specific columns, [`Table.where`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.where) for selecting particular rows and [`Table.group_by`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.group_by) for grouping rows by common values.\n", "\n", "Let's use [`Table.where`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.where) to filter our exonerations table to only those individuals that have an age specified." ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "collapsed": true }, "outputs": [], "source": [ "with_age = exonerations.where(lambda row: row['age'] is not None)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You'll notice we provide a [`lambda`](https://docs.python.org/3/tutorial/controlflow.html#lambda-expressions) function to the [`Table.where`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.where). This function is applied to each row and if it returns ``True``, then the row is included in the output table.\n", "\n", "A crucial thing to understand about these table methods is that they return **new tables**. 
In our example above ``exonerations`` was a [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table) instance and we applied [`Table.where`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.where), so ``with_age`` is a new, different [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table). The tables themselves can't be changed. You can create new tables with these methods, but you can't modify them in-place. (If this seems weird, just trust me. There are lots of good computer science-y reasons to do it this way.)\n", "\n", "We can verify this did what we expected by counting the rows in the original table and rows in the new table:" ] }, { "cell_type": "code", "execution_count": 19, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "9" ] }, "execution_count": 19, "metadata": {}, "output_type": "execute_result" } ], "source": [ "len(exonerations.rows) - len(with_age.rows)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Nine rows were removed, which is the number of nulls we had already identified in the column.\n", "\n", "Now if we calculate the median age of these individuals, we don't see the warning anymore." ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "Decimal('26')" ] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "with_age.aggregate(agate.Median('age'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Computing new columns\n", "=====================\n", "\n", "In addition to \"column-wise\" [`aggregations`](https://agate.readthedocs.io/en/latest/api/aggregations.html#module-agate.aggregations) there are also \"row-wise\" [`computations`](https://agate.readthedocs.io/en/latest/api/computations.html#module-agate.computations). 
Computations go through a [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table) row-by-row and derive a new column using the existing data. To perform row computations in agate we use subclasses of [`Computation`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Computation).\n", "\n", "When one or more instances of [`Computation`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Computation) are applied with the [`Table.compute`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.compute) method, a new table is created with additional columns.\n", "\n", "**Question:** How long did individuals remain in prison before being exonerated?\n", "\n", "To answer this question we will apply the [`Change`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Change) computation to the ``convicted`` and ``exonerated`` columns. Each of these columns contains the year in which the event occurred. All that [`Change`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Change) does is compute the difference between two numbers. 
(In this case each of these columns contains a [`Number`](https://agate.readthedocs.io/en/latest/api/data_types.html#agate.Number), but this will also work with [`Date`](https://agate.readthedocs.io/en/latest/api/data_types.html#agate.Date) or [`DateTime`](https://agate.readthedocs.io/en/latest/api/data_types.html#agate.DateTime).)" ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "Decimal('8')" ] }, "execution_count": 21, "metadata": {}, "output_type": "execute_result" } ], "source": [ "with_years_in_prison = exonerations.compute([\n", " ('years_in_prison', agate.Change('convicted', 'exonerated'))\n", "])\n", "\n", "with_years_in_prison.aggregate(agate.Median('years_in_prison'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The median number of years an exonerated individual spent in prison was 8 years.\n", "\n", "Sometimes, the built-in computations, such as [`Change`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Change), won't suffice. I mentioned before that you could perform arbitrary column-wise aggregations using [`Summary`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Summary). You can do the same thing for row-wise computations using [`Formula`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Formula). 
This is somewhat analogous to Excel's cell formulas.\n", "\n", "For example, this code will create a ``full_name`` column from the ``first_name`` and ``last_name`` columns in the data:" ] }, { "cell_type": "code", "execution_count": 22, "metadata": { "collapsed": true }, "outputs": [], "source": [ "full_names = exonerations.compute([\n", " ('full_name', agate.Formula(agate.Text(), lambda row: '%(first_name)s %(last_name)s' % row))\n", "])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For efficiency's sake, agate allows you to perform several computations at once (though their results can't depend on one another):" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "collapsed": true }, "outputs": [], "source": [ "with_computations = exonerations.compute([\n", " ('full_name', agate.Formula(agate.Text(), lambda row: '%(first_name)s %(last_name)s' % row)),\n", " ('years_in_prison', agate.Change('convicted', 'exonerated'))\n", "])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also compute new columns to clean up your raw data. In the initial data, the ``state`` column has some values with a 'F-' prefix on the state abbreviation. Cases with that prefix are federal cases as opposed to state prosecutions. 
To make the data easier to use, we can create a new ``federal`` column to tag federal cases and clean up the original state column:" ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "collapsed": true }, "outputs": [], "source": [ "clean_state_data = exonerations.compute([\n", " ('federal', agate.Formula(agate.Boolean(), lambda row: row['state'].startswith('F-'))),\n", " ('state', agate.Formula(agate.Text(), lambda row: row['state'][2:] if row['state'].startswith('F-') else row['state']))\n", "], replace=True)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "We add the ``replace`` argument to our ``compute`` method to replace the state column in place.\n", "\n", "If [`Formula`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Formula) is not flexible enough (for instance, if you needed to compute a new value based on the distribution of data in a column) you can always implement your own subclass of [`Computation`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Computation). See the API documentation for [`computations`](https://agate.readthedocs.io/en/latest/api/computations.html#module-agate.computations) to see all of the supported ways to compute new data." 
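] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For instance, here is a minimal sketch of a custom [`Computation`](https://agate.readthedocs.io/en/latest/api/computations.html#agate.Computation) that derives each row's deviation from a column's mean. (The class name and inline data are our own invention, and the sketch assumes the ``run(table)`` interface, which returns one computed value per row, and a column without nulls.)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import agate\n", "\n", "class DeviationFromMean(agate.Computation):\n", "    # A sketch: derive each row's deviation from the column mean.\n", "    def __init__(self, column_name):\n", "        self._column_name = column_name\n", "\n", "    def get_computed_data_type(self, table):\n", "        return agate.Number()\n", "\n", "    def run(self, table):\n", "        mean = table.aggregate(agate.Mean(self._column_name))\n", "        return [row[self._column_name] - mean for row in table.rows]\n", "\n", "# Hypothetical inline data, to keep the sketch self-contained.\n", "ages = agate.Table([[20], [30], [40]], ['age'], [agate.Number()])\n", "\n", "ages.compute([('age_deviation', DeviationFromMean('age'))]).columns['age_deviation']"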
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Sorting and slicing\n", "===================\n", "\n", "**Question:** Who are the ten exonerated individuals who were youngest at the time they were arrested?\n", "\n", "Remembering that methods of tables return tables, we will use [`Table.order_by`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.order_by) to sort our table:" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "collapsed": true }, "outputs": [], "source": [ "sorted_by_age = exonerations.order_by('age')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can then use [`Table.limit`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.limit) to get only the first ten rows of the data." ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "collapsed": true }, "outputs": [], "source": [ "youngest_ten = sorted_by_age.limit(10)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's use [`Table.print_table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.print_table) to help us pretty-print the results in a way we can easily review:" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "| last_name | first_name | age | race | state | tags | crime | ... |\n", "| --------- | ---------- | --- | --------- | ----- | ------- | ------- | --- |\n", "| Murray | Lacresha | 11 | Black | TX | CV, F | Murder | ... |\n", "| Adams | Johnathan | 12 | Caucasian | GA | CV, P | Murder | ... |\n", "| Harris | Anthony | 12 | Black | OH | CV | Murder | ... |\n", "| Edmonds | Tyler | 13 | Caucasian | MS | | Murder | ... |\n", "| Handley | Zachary | 13 | Caucasian | PA | A, CV | Arson | ... |\n", "| Jimenez | Thaddeus | 13 | Hispanic | IL | | Murder | ... |\n", "| Pacek | Jerry | 13 | Caucasian | PA | | Murder | ... 
|\n", "| Barr | Jonathan | 14 | Black | IL | CDC, CV | Murder | ... |\n", "| Brim | Dominique | 14 | Black | MI | F | Assault | ... |\n", "| Brown | Timothy | 14 | Black | FL | | Murder | ... |\n" ] } ], "source": [ "youngest_ten.print_table(max_columns=7)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you find it impossible to believe that an eleven-year-old was convicted of murder, I encourage you to read the Registry's [description of the case](https://www.law.umich.edu/special/exoneration/Pages/casedetail.aspx?caseid=3499).\n", "\n", "**Note:** In the previous example we could have omitted the [`Table.limit`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.limit) and passed a ``max_rows=10`` to [`Table.print_table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.print_table) instead. In this case they accomplish exactly the same goal.\n", "\n", "What if we were more curious about the *distribution* of ages, rather than the highest or lowest? agate includes the [`Table.pivot`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.pivot) and [`Table.bins`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.bins) methods for counting values individually or by ranges. Let's try binning the ages. Then, instead of using [`Table.print_table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.print_table), we'll use [`Table.print_bars`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.print_bars) to generate a simple, text bar chart." 
] }, { "cell_type": "code", "execution_count": 28, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "age Count\n", "[10 - 19) 227 ▓░░░░░░░░░░░░░░░░░░░░ \n", "[19 - 28) 688 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "[28 - 37) 405 ▓░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ \n", "[37 - 46) 213 ▓░░░░░░░░░░░░░░░░░░░ \n", "[46 - 55) 81 ▓░░░░░░░ \n", "[55 - 64) 24 ▓░░ \n", "[64 - 73) 3 ▓ \n", "[82 - 91) 1 ▓ \n", "None 9 ▓░ \n", " +---------------+---------------+--------------+---------------+\n", " 0 175 350 525 700\n" ] } ], "source": [ "binned_ages = exonerations.bins('age', 10, 0, 100)\n", "binned_ages.print_bars('age', 'Count', width=80)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "Notice that we specify we want `10` bins spanning the range `0` to `100`. If these values are omitted, agate will attempt to infer good defaults. We also specify that we want our bar chart to span a width of `80` characters. This can be adjusted to a suitable width for your terminal or document.\n", "\n", "**Note:** If you use a monospaced font, such as Courier, you can copy and paste agate bar charts into emails or documents. No screenshots required." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Grouping and aggregating\n", "========================\n", "\n", "**Question:** Which state has seen the most exonerations?\n", "\n", "This question can't be answered by operating on a single column. What we need is the equivalent of SQL's ``GROUP BY``. agate supports a full set of SQL-like operations on tables. Unlike SQL, agate breaks grouping and aggregation into two discrete steps.\n", "\n", "First, we use [`Table.group_by`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.group_by) to group the data by state." 
] }, { "cell_type": "code", "execution_count": 29, "metadata": { "collapsed": true }, "outputs": [], "source": [ "by_state = clean_state_data.group_by('state')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This takes our original [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table) and groups it into a [`TableSet`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet), which contains one table per state. As mentioned much earlier in this tutorial, TableSets are instances of [`MappedSequence`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.MappedSequence). That means they work very much like [`Column`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Column) and [`Row`](https://agate.readthedocs.io/en/latest/api/columns_and_rows.html#agate.Row).\n", "\n", "Now we need to aggregate the total for each state. This works in a very similar way to how it did when we were aggregating columns of a single table, except that we'll use the [`Count`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Count) aggregation to count the total number of rows in each group." ] }, { "cell_type": "code", "execution_count": 30, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "| state | count |\n", "| ----- | ----- |\n", "| TX | 217 |\n", "| NY | 209 |\n", "| CA | 164 |\n", "| IL | 158 |\n", "| MI | 63 |\n", "| ... | ... |\n" ] } ], "source": [ "state_totals = by_state.aggregate([\n", " ('count', agate.Count())\n", "])\n", "\n", "sorted_totals = state_totals.order_by('count', reverse=True)\n", "\n", "sorted_totals.print_table(max_rows=5)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You'll notice we pass a sequence of tuples to [`TableSet.aggregate`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet.aggregate). Each one includes two elements. 
The first is the new column name being created. The second is an instance of some [`Aggregation`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Aggregation). Unsurprisingly, in this case the results appear to be roughly proportional to population.\n", "\n", "**Question:** What state has the longest median time in prison prior to exoneration?\n", "\n", "This is a much more complicated question that's going to pull together a lot of the features we've been using. We'll repeat the computations we applied before, but this time we're going to roll those computations up in state-by-state groups and then take the [`Median`](https://agate.readthedocs.io/en/latest/api/aggregations.html#agate.Median) of each group. Then we'll sort the data and see where people have been stuck in prison the longest." ] }, { "cell_type": "code", "execution_count": 31, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "| state | count | median_years_in_p... |\n", "| ----- | ----- | -------------------- |\n", "| DC | 15 | 27 |\n", "| NE | 9 | 20 |\n", "| ID | 2 | 19 |\n", "| VT | 1 | 18 |\n", "| LA | 45 | 16 |\n", "| ... | ... | ... |\n" ] } ], "source": [ "with_years_in_prison = exonerations.compute([\n", " ('years_in_prison', agate.Change('convicted', 'exonerated'))\n", "])\n", "\n", "state_totals = with_years_in_prison.group_by('state')\n", "\n", "medians = state_totals.aggregate([\n", " ('count', agate.Count()),\n", " ('median_years_in_prison', agate.Median('years_in_prison'))\n", "])\n", "\n", "sorted_medians = medians.order_by('median_years_in_prison', reverse=True)\n", "\n", "sorted_medians.print_table(max_rows=5)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "DC? Nebraska? What accounts for these states having the longest times in prison before exoneration? I have no idea! 
Given that the group sizes are small, it would probably be wise to look for outliers.\n", "\n", "As with [`Table.aggregate`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.aggregate) and [`Table.compute`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.compute), the [`TableSet.aggregate`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet.aggregate) method takes a list of aggregations to perform. You can aggregate as many columns as you like in a single step and they will all appear in the output table." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Multi-dimensional aggregation\n", "=============================\n", "\n", "I've already shown you that you can use [`TableSet`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet) to group instances of [`Table`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table). However, you can also use a [`TableSet`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet) to group *other TableSets*. To put that another way, instances of [`TableSet`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet) can be *nested*.\n", "\n", "The key to nesting data in this way is to use [`TableSet.group_by`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet.group_by). This is one of many methods that can be called on a TableSet, which will then be applied to all the tables it contains. In the last section we used [`Table.group_by`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.group_by) to split data up into a group of tables. Calling [`TableSet.group_by`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet.group_by) essentially calls ``group_by`` on each table and collects the results. 
This can be pretty hard to wrap your head around, so let's look at a concrete example.\n", "\n", "**Question:** Is there a collective relationship between race, age and time spent in prison prior to exoneration?\n", "\n", "I'm not going to explain every stage of this analysis as most of it repeats patterns used previously. The key part to look for is the two separate uses of ``group_by``:" ] }, { "cell_type": "code", "execution_count": 32, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "| race | age_group | count | median_years_in_p... |\n", "| --------------- | --------- | ----- | -------------------- |\n", "| Native American | 20s | 2 | 21.5 |\n", "| | 20s | 1 | 19.0 |\n", "| Native American | 10s | 2 | 15.0 |\n", "| Native American | 30s | 2 | 14.5 |\n", "| Black | 10s | 188 | 14.0 |\n", "| Black | 20s | 358 | 13.0 |\n", "| Asian | 20s | 4 | 12.0 |\n", "| Black | 30s | 156 | 10.0 |\n", "| Caucasian | 10s | 76 | 8.0 |\n", "| Caucasian | 20s | 255 | 8.0 |\n", "| ... | ... | ... | ... 
|\n" ] } ], "source": [ "# Filter rows without age data\n", "only_with_age = with_years_in_prison.where(\n", " lambda r: r['age'] is not None\n", ")\n", "\n", "# Group by race\n", "race_groups = only_with_age.group_by('race')\n", "\n", "# Sub-group by age cohorts (20s, 30s, etc.)\n", "race_and_age_groups = race_groups.group_by(\n", " lambda r: '%i0s' % (r['age'] // 10),\n", " key_name='age_group'\n", ")\n", "\n", "# Aggregate medians for each group\n", "medians = race_and_age_groups.aggregate([\n", " ('count', agate.Count()),\n", " ('median_years_in_prison', agate.Median('years_in_prison'))\n", "])\n", "\n", "# Sort the results\n", "sorted_groups = medians.order_by('median_years_in_prison', reverse=True)\n", "\n", "# Print out the results\n", "sorted_groups.print_table(max_rows=10)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Exploratory charting\n", "\n", "Beginning with version 1.5.0, agate includes the pure-Python SVG charting library [leather](https://leather.readthedocs.io/en/latest/). Leather allows you to generate \"good enough\" charts with as little as one line of code. It's especially useful if you're working in a Jupyter Notebook, as the results will render inline.\n", "\n", "There are currently four supported chart types: [`Table.bar_chart`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.bar_chart), [`Table.column_chart`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.column_chart), [`Table.line_chart`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.line_chart), and [`Table.scatterplot`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.scatterplot).\n", "\n", "Let's create charts from a few slices of data we've made in this tutorial." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exonerations by state" ] }, { "cell_type": "code", "execution_count": 35, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/svg+xml": [ "count1002003000stateGUVTDENHMEIDWYPRNDHIAKMILCONMSDRIARSCKSMTNENVWVMNORKYUTIAMSTNINCTDCNJAZMDALGAOKNCMOWAVAMAWILAOHPAFLMIILCANYTX" ], "text/plain": [ "" ] }, "execution_count": 35, "metadata": {}, "output_type": "execute_result" } ], "source": [ "sorted_totals.bar_chart('state', 'count', height=1000)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Leather will try to maintain a reasonable aspect ratio for chart. In this case the chart is too short to display correctly. We've used the `height` argument to make the chart a little taller." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exonerations by age bracket\n", "\n", "When creating a chart you may omit the column name arguments. If you do so the first and second columns in the table will be used. This is especially useful for charting the output of [`TableSet.aggregate`](https://agate.readthedocs.io/en/latest/api/tableset.html#agate.TableSet.aggregate) or [`Table.bins`](https://agate.readthedocs.io/en/latest/api/table.html#agate.Table.bins)." 
] }, { "cell_type": "code", "execution_count": 36, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/svg+xml": [ "Count2505007500ageNone[82 - 91)[64 - 73)[55 - 64)[46 - 55)[37 - 46)[28 - 37)[19 - 28)[10 - 19)" ], "text/plain": [ "" ] }, "execution_count": 36, "metadata": {}, "output_type": "execute_result" } ], "source": [ "binned_ages.bar_chart()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exonerations by year" ] }, { "cell_type": "code", "execution_count": 38, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/svg+xml": [ "exonerated19801990200020102020count501001500" ], "text/plain": [ "" ] }, "execution_count": 38, "metadata": {}, "output_type": "execute_result" } ], "source": [ "by_year_exonerated = exonerations.group_by('exonerated')\n", "\n", "counts = by_year_exonerated.aggregate([\n", " ('count', agate.Count())\n", "])\n", "\n", "counts.order_by('exonerated').line_chart('exonerated', 'count')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exonerations over time, for most commonly exonerated crimes\n", "\n", "The real power of agate's exploratory charting comes when we want to compare different facets of data. With leather, agate can automatically render a of chart for each group in a TableSet." 
] }, { "cell_type": "code", "execution_count": 41, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/svg+xml": [ "Child Sex Abuseexonerated19801990200020102020count10203040500Sexual Assaultexonerated19801990200020102020count10203040500Murderexonerated19801990200020102020count10203040500Drug Possession or Saleexonerated19801990200020102020count10203040500" ], "text/plain": [ "" ] }, "execution_count": 41, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Filter to crimes with at least 100 exonerations\n", "top_crimes = exonerations.group_by('crime').having([\n", " ('count', agate.Count())\n", "], lambda t: t['count'] > 100)\n", "\n", "# Group by year of exoneration\n", "by_year = top_crimes.group_by('exonerated')\n", "\n", "# Count number of exonerations in each year\n", "counts = by_year.aggregate([\n", " ('count', agate.Count())\n", "])\n", "\n", "# Group by crime\n", "by_crime = counts.group_by('crime')\n", "\n", "# Sort each group of exonerations by year and chart the results\n", "by_crime.order_by('exonerated').line_chart('exonerated', 'count')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Styling charts\n", "\n", "As mentioned above, leather is designed for making \"good enough\" charts. You are never going to create a polished chart. However, sometimes you may want more control than agate offers through it's own methods. You can take more control over how your charts are presented by using [leather](https://leather.readthedocs.io/) directly." 
] }, { "cell_type": "code", "execution_count": 52, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/svg+xml": [ "Total exonerations by stateNumber of exonerations1002003000StateGUVTDENHMEIDWYPRNDHIAKMILCONMSDRIARSCKSMTNENVWVMNORKYUTIAMSTNINCTDCNJAZMDALGAOKNCMOWAVAMAWILAOHPAFLMIILCANYTX" ], "text/plain": [ "" ] }, "execution_count": 52, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import leather\n", "\n", "chart = leather.Chart('Total exonerations by state')\n", "chart.add_y_axis(name='State')\n", "chart.add_x_axis(name='Number of exonerations')\n", "chart.add_bars(sorted_totals, x='count', y='state')\n", "chart.to_svg(height=1000)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "Where to go next\n", "================\n", "\n", "This tutorial only scratches the surface of agate's features. For many more ideas on how to apply agate, check out the [`Cookbook`](https://agate.readthedocs.io/en/latest/cookbook.html), which includes dozens of examples of specific features of agate as well as recipes for substituting agate for Excel, SQL, R and more. Also check out the agate's [`Extensions`](https://agate.readthedocs.io/en/latest/extensions.html) which add support for reading/writing SQL tables, performing statistical analysis and more.\n", "\n", "Also, if you're going to be doing data processing in Python you really ought to check out [`proof`](https://proof.readthedocs.io/en/0.3.0/), a library for building data processing pipelines that are repeatable and self-documenting. It will make your code cleaner and save you tons of time.\n", "\n", "Good luck in your reporting!" 
] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.0" } }, "nbformat": 4, "nbformat_minor": 0 }