django-treebeard-2.0b1/CHANGES
Release 2.0b1 (May 29, 2013)
----------------------------
This is a beta release.
* Added support for Django 1.5 and Python 3.X
* Updated docs: the library supports python 2.5+ and Django 1.4+. Dropped
support for older versions
* Revamped admin interface for MP and NS trees, supporting drag&drop to reorder
nodes. Work on this patch was sponsored by the
`Oregon Center for Applied Science`_, inspired by `FeinCMS`_ developed by
`Jesús del Carpio`_ with tests from `Fernando Gutierrez`_. Thanks ORCAS!
* Updated setup.py to use distribute/setuptools instead of distutils
* Now using pytest for testing
* Small optimization to ns_tree.is_root
* Moved treebeard.tests to its own directory (instead of tests.py)
* Added the runtests.py test runner
* Added tox support
* Fixed drag&drop bug in the admin
* Fixed a bug when moving MP_Nodes
* Using .pk instead of .id when accessing nodes.
* Removed the Benchmark (tbbench) and example (tbexample) apps.
* Fixed url parts join issues in the admin.
* Fixed: Now installing the static resources
* Fixed ManyToMany form field save handling
* In the admin, the node is now saved when moving so it can trigger handlers
and/or signals.
* Improved translation files, including javascript.
* Renamed Node.get_database_engine() to Node.get_database_vendor(). As the name
implies, it returns the database vendor instead of the engine used. Treebeard
will get the value from Django, but you can subclass the method if needed.
Release 1.61 (Jul 24, 2010)
---------------------------
* Added admin i18n. Included translations: es, ru
* Fixed a bug when trying to introspect the database engine used in Django 1.2+
while using new style db settings (DATABASES). Added
Node.get_database_engine to deal with this.
Release 1.60 (Apr 18, 2010)
---------------------------
* Added get_annotated_list
* Complete revamp of the documentation. It's now divided into sections for easier
reading, and the package includes .rst files instead of the html build.
* Added raw id fields support in the admin
* Fixed setup.py to make it work in 2.4 again
* The correct ordering in NS/MP trees is now enforced in the queryset.
* Cleaned up code, removed some unnecessary statements.
* Tests refactoring, to make it easier to spot the model being tested.
* Fixed support of trees using proxied models. It was broken due to a bug in
Django.
* Fixed a bug in add_child when adding nodes to a non-leaf in sorted MP.
* There are now 648 unit tests. Test coverage is 96%
* This will be the last version compatible with Django 1.0. There will be a
  1.6.X branch maintained for urgent bug fixes, but the main development will
focus on recent Django versions.
Release 1.52 (Dec 18, 2009)
---------------------------
* Really fixed the installation of templates.
Release 1.51 (Dec 16, 2009)
---------------------------
* Forgot to include treebeard/templates/\*.html in MANIFEST.in
Release 1.5 (Dec 15, 2009)
--------------------------
New features added
~~~~~~~~~~~~~~~~~~
* Forms
- Added MoveNodeForm
* Django Admin
- Added TreeAdmin
* MP_Node
- Added 2 new checks in MP_Node.find_problems():
4. a list of ids of nodes with the wrong depth value for
their path
5. a list of ids of nodes that report a wrong number of children
- Added a new (safer and faster but less comprehensive) MP_Node.fix_tree()
approach.
* Documentation
- Added warnings in the documentation when subclassing MP_Node or NS_Node
and adding a new Meta.
- HTML documentation is now included in the package.
- CHANGES file and section in the docs.
* Other changes:
- script to build documentation
- updated numconv.py
Bugs fixed
~~~~~~~~~~
* Added table quoting to all the SQL queries that bypass the ORM.
  Solves a bug in PostgreSQL when the table isn't created by syncdb.
* Removed the unused method NS_Node._find_next_node
* Fixed MP_Node.get_tree to include the given parent when given a leaf node
Release 1.1 (Nov 20, 2008)
--------------------------
Bugs fixed
~~~~~~~~~~
* Added exceptions.py
Release 1.0 (Nov 19, 2008)
--------------------------
* First public release.
.. _Oregon Center for Applied Science: http://www.orcasinc.com/
.. _FeinCMS: http://www.feinheit.ch/media/labs/feincms/admin.html
.. _Jesús del Carpio: http://www.isgeek.net
.. _Fernando Gutierrez: http://xbito.pe
django-treebeard-2.0b1/django_treebeard.egg-info/dependency_links.txt
django-treebeard-2.0b1/django_treebeard.egg-info/PKG-INFO
Metadata-Version: 1.1
Name: django-treebeard
Version: 2.0b1
Summary: Efficient tree implementations for Django 1.4+
Home-page: https://tabo.pe/projects/django-treebeard/
Author: Gustavo Picon
Author-email: tabo@tabo.pe
License: Apache License 2.0
Description:
django-treebeard
================
django-treebeard is a library that provides efficient tree implementations
for the Django Web Framework 1.4+, written by Gustavo Picón and licensed under
the Apache License 2.0.
django-treebeard is:
- **Flexible**: Includes 3 different tree implementations with the same API:
1. Adjacency List
2. Materialized Path
3. Nested Sets
- **Fast**: Optimized non-naive tree operations (see Benchmarks).
- **Easy**: Uses Django Model Inheritance with abstract classes to define your own
models.
- **Clean**: Testable and well tested code base. Code/branch test coverage is above
96%. Tests are available in Jenkins:
- Test suite running on different versions of Python, Django and database
engine: https://tabo.pe/jenkins/job/django-treebeard/
- Code quality: https://tabo.pe/jenkins/job/django-treebeard-quality/
You can find the documentation in
https://tabo.pe/projects/django-treebeard/docs/tip/
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Environment :: Web Environment
Classifier: Framework :: Django
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Utilities
django-treebeard-2.0b1/django_treebeard.egg-info/requires.txt
Django>=1.4
django-treebeard-2.0b1/django_treebeard.egg-info/SOURCES.txt
CHANGES
LICENSE
MANIFEST.in
NOTICE
README.rst
UPDATING
setup.py
django_treebeard.egg-info/PKG-INFO
django_treebeard.egg-info/SOURCES.txt
django_treebeard.egg-info/dependency_links.txt
django_treebeard.egg-info/requires.txt
django_treebeard.egg-info/top_level.txt
docs/Makefile
docs/admin.rst
docs/al_tree.rst
docs/api.rst
docs/benchmarks.rst
docs/changes.rst
docs/conf.py
docs/exceptions.rst
docs/forms.rst
docs/index.rst
docs/intro.rst
docs/mp_tree.rst
docs/ns_tree.rst
docs/tests.rst
treebeard/__init__.py
treebeard/admin.py
treebeard/al_tree.py
treebeard/exceptions.py
treebeard/forms.py
treebeard/models.py
treebeard/mp_tree.py
treebeard/ns_tree.py
treebeard/numconv.py
treebeard/static/treebeard/expand-collapse.png
treebeard/static/treebeard/jquery-ui-1.8.5.custom.min.js
treebeard/static/treebeard/treebeard-admin.css
treebeard/static/treebeard/treebeard-admin.js
treebeard/templates/admin/tree_change_list.html
treebeard/templates/admin/tree_change_list_results.html
treebeard/templates/admin/tree_list.html
treebeard/templates/admin/tree_list_results.html
treebeard/templatetags/__init__.py
treebeard/templatetags/admin_tree.py
treebeard/templatetags/admin_tree_list.py
treebeard/tests/__init__.py
treebeard/tests/conftest.py
treebeard/tests/models.py
treebeard/tests/settings.py
treebeard/tests/test_treebeard.py
treebeard/tests/jenkins/rm_workspace_coverage.py
treebeard/tests/jenkins/toxhelper.py
django-treebeard-2.0b1/django_treebeard.egg-info/top_level.txt
treebeard
django-treebeard-2.0b1/docs/admin.rst
Admin
=====
.. module:: treebeard.admin
.. autoclass:: TreeAdmin
:show-inheritance:
To be used by Django's admin.site.register
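A minimal registration sketch (``Category`` and ``myapp`` are hypothetical names
borrowed from the introduction; depending on your treebeard version you may also
want to plug in a tree-aware form)::

    from django.contrib import admin
    from treebeard.admin import TreeAdmin

    from myapp.models import Category  # hypothetical node model


    class CategoryAdmin(TreeAdmin):
        # inherits the tree change list and drag&drop reordering
        pass

    admin.site.register(Category, CategoryAdmin)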
django-treebeard-2.0b1/docs/al_tree.rst
Adjacency List trees
====================
.. module:: treebeard.al_tree
This is a simple implementation of the traditional Adjacency List Model for
storing trees in relational databases.
In the adjacency list model, every node will have a
":attr:`~AL_Node.parent`" key that will be NULL for root nodes.
Since ``django-treebeard`` must return trees ordered in a predictable way,
models without the :attr:`~AL_Node.node_order_by` attribute will have an
extra attribute that stores the relative position of a node among its
siblings: :attr:`~AL_Node.sib_order`.
The adjacency list model has the advantage of fast writes at the cost of
slow reads. If you read more than you write, use
:class:`~treebeard.mp_tree.MP_Node` instead.
.. inheritance-diagram:: AL_Node
.. autoclass:: AL_Node
:show-inheritance:
.. attribute:: node_order_by
Attribute: a list of model fields that will be used for node
ordering. When enabled, all tree operations will assume this ordering.
Example::
node_order_by = ['field1', 'field2', 'field3']
.. attribute:: parent
``ForeignKey`` to itself. This attribute **MUST** be defined in the
subclass (sadly, this isn't inherited correctly from the ABC in
`Django 1.0`). Just copy&paste these lines to your model::
parent = models.ForeignKey('self',
related_name='children_set',
null=True,
db_index=True)
.. attribute:: sib_order
``PositiveIntegerField`` used to store the relative position of a node
among its siblings. This attribute is mandatory *ONLY* if you don't
set a :attr:`node_order_by` field. You can define it by copy&pasting this
line in your model::
sib_order = models.PositiveIntegerField()
Examples::
class AL_TestNode(AL_Node):
parent = models.ForeignKey('self',
related_name='children_set',
null=True,
db_index=True)
sib_order = models.PositiveIntegerField()
desc = models.CharField(max_length=255)
class AL_TestNodeSorted(AL_Node):
parent = models.ForeignKey('self',
related_name='children_set',
null=True,
db_index=True)
node_order_by = ['val1', 'val2', 'desc']
val1 = models.IntegerField()
val2 = models.IntegerField()
desc = models.CharField(max_length=255)
Read the API reference of :class:`treebeard.Node` for info on methods
available in this class, or read the following section for methods with
particular arguments or exceptions.
.. automethod:: get_depth
See: :meth:`treebeard.Node.get_depth`
django-treebeard-2.0b1/docs/api.rst
API
===
.. module:: treebeard.models
.. inheritance-diagram:: Node
.. autoclass:: Node
:show-inheritance:
This is the base class that defines the API of all tree models in this
library:
- :class:`treebeard.mp_tree.MP_Node` (materialized path)
- :class:`treebeard.ns_tree.NS_Node` (nested sets)
- :class:`treebeard.al_tree.AL_Node` (adjacency list)
.. warning::
Please note that ``django-treebeard`` uses Django raw SQL queries for
some write operations, and raw queries don't update the objects in the
ORM, since the ORM is bypassed.
Because of this, if you have a node in memory and plan to use it after a
tree modification (adding/removing/moving nodes), you need to reload it.
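For example (``MyNode`` is a hypothetical model; the pattern is simply
re-fetching by primary key after a write)::

    node = MyNode.objects.get(pk=some_pk)
    node.add_child(desc='child')           # write operation, may run raw SQL
    node = MyNode.objects.get(pk=node.pk)  # reload before using the node again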
.. automethod:: Node.add_root
Example::
MyNode.add_root(numval=1, strval='abcd')
.. automethod:: add_child
Example::
node.add_child(numval=1, strval='abcd')
.. automethod:: add_sibling
Examples::
node.add_sibling('sorted-sibling', numval=1, strval='abc')
.. automethod:: delete
.. note::
Call our queryset's delete to handle children removal. Subclasses
will handle extra maintenance.
.. automethod:: get_tree
.. automethod:: get_depth
Example::
node.get_depth()
.. automethod:: get_ancestors
Example::
node.get_ancestors()
.. automethod:: get_children
Example::
node.get_children()
.. automethod:: get_children_count
Example::
node.get_children_count()
.. automethod:: get_descendants
Example::
node.get_descendants()
.. automethod:: get_descendant_count
Example::
node.get_descendant_count()
.. automethod:: get_first_child
Example::
node.get_first_child()
.. automethod:: get_last_child
Example::
node.get_last_child()
.. automethod:: get_first_sibling
Example::
node.get_first_sibling()
.. automethod:: get_last_sibling
Example::
node.get_last_sibling()
.. automethod:: get_prev_sibling
Example::
node.get_prev_sibling()
.. automethod:: get_next_sibling
Example::
node.get_next_sibling()
.. automethod:: get_parent
Example::
node.get_parent()
.. automethod:: get_root
Example::
node.get_root()
.. automethod:: get_siblings
Example::
node.get_siblings()
.. automethod:: is_child_of
Example::
node.is_child_of(node2)
.. automethod:: is_descendant_of
Example::
node.is_descendant_of(node2)
.. automethod:: is_sibling_of
Example::
node.is_sibling_of(node2)
.. automethod:: is_root
Example::
node.is_root()
.. automethod:: is_leaf
Example::
node.is_leaf()
.. automethod:: move
.. note:: The node can be moved under another root node.
Examples::
node.move(node2, 'sorted-child')
node.move(node2, 'prev-sibling')
.. automethod:: save
.. automethod:: get_first_root_node
Example::
MyNodeModel.get_first_root_node()
.. automethod:: get_last_root_node
Example::
MyNodeModel.get_last_root_node()
.. automethod:: get_root_nodes
Example::
MyNodeModel.get_root_nodes()
.. automethod:: load_bulk
.. note::
Any internal data that you may have stored in your
nodes' data (:attr:`path`, :attr:`depth`) will be
ignored.
.. note::
If your node model has a ForeignKey this method will try to load
the related object before loading the data. If the related object
doesn't exist it won't load anything and will raise a DoesNotExist
exception. This is done because the dump_data method uses integers
to dump related objects.
.. note::
If your node model has :attr:`node_order_by` enabled, it will
take precedence over the order in the structure.
Example::
data = [{'data':{'desc':'1'}},
{'data':{'desc':'2'}, 'children':[
{'data':{'desc':'21'}},
{'data':{'desc':'22'}},
{'data':{'desc':'23'}, 'children':[
{'data':{'desc':'231'}},
]},
{'data':{'desc':'24'}},
]},
{'data':{'desc':'3'}},
{'data':{'desc':'4'}, 'children':[
{'data':{'desc':'41'}},
]},
]
# parent = None
MyNodeModel.load_bulk(data, None)
Will create:
.. digraph:: load_bulk_digraph
"1";
"2";
"2" -> "21";
"2" -> "22";
"2" -> "23" -> "231";
"2" -> "24";
"3";
"4";
"4" -> "41";
.. automethod:: dump_bulk
Example::
tree = MyNodeModel.dump_bulk()
branch = MyNodeModel.dump_bulk(node_obj)
.. automethod:: find_problems
.. automethod:: fix_tree
.. automethod:: get_descendants_group_count
Example::
# get a list of the root nodes
root_nodes = MyModel.get_descendants_group_count()
for node in root_nodes:
print '%s by %s (%d replies)' % (node.comment, node.author,
node.descendants_count)
.. automethod:: get_annotated_list
Example::
annotated_list = MyModel.get_annotated_list()
With data:
.. digraph:: get_annotated_list_digraph
"a";
"a" -> "ab";
"ab" -> "aba";
"ab" -> "abb";
"ab" -> "abc";
"a" -> "ac";
Will return::
[
(a, {'open':True, 'close':[], 'level': 0})
(ab, {'open':True, 'close':[], 'level': 1})
(aba, {'open':True, 'close':[], 'level': 2})
(abb, {'open':False, 'close':[], 'level': 2})
(abc, {'open':False, 'close':[0,1], 'level': 2})
(ac, {'open':False, 'close':[0], 'level': 1})
]
This can be used with a template like::
{% for item, info in annotated_list %}
{% if info.open %}
<ul><li>
{% else %}
</li><li>
{% endif %}
{{ item }}
{% for close in info.close %}
</li></ul>
{% endfor %}
{% endfor %}
.. note:: This method was contributed originally by
`Alexey Kinyov `_, using an idea borrowed
from `django-mptt`.
.. versionadded:: 1.55
.. automethod:: get_database_vendor
Example::
@classmethod
def get_database_vendor(cls):
return "mysql"
.. versionadded:: 1.61
django-treebeard-2.0b1/docs/benchmarks.rst
Benchmarks
==========
``tbbench`` is a django app that isn't installed by default. I wrote it to
find spots that could be optimized, and it may help you to tweak your database
settings.
To run the benchmarks:
1. Add ``tbbench`` to your Python path
2. Add ``'tbbench'`` to the ``INSTALLED_APPS`` section in your django
settings file.
3. Run :command:`python manage.py syncdb`
4. In the ``tbbench`` dir, run :command:`python run.py`
.. note::
If the `django-mptt`_ package is also installed, both libraries will
be tested with the exact same data and operations.
Currently, the available tests are:
1. Inserts: adds 1000 nodes to a tree, in different places: root
nodes, normal nodes, leaf nodes
2. Descendants: retrieves the full branch under every node several times.
3. Move: moves nodes several times. This operation can be expensive
because it involves reordering and data maintenance.
4. Delete: Removes groups of nodes.
For every available library (treebeard and mptt), two models are tested: a
vanilla model, and a model with a "tree order by" attribute enabled
(:attr:`~treebeard.MP_Node.node_order_by` in treebeard,
``order_insertion_by`` in mptt).
Also, every test will be tested with and without database transactions
(``tx``).
The output of the script is a reST table, with the time for every test in
milliseconds (so small numbers are better).
By default, these tests use the default tables created by ``syncdb``. Even
when the results of ``treebeard`` are good, they can be improved *a lot*
with better indexing. The Materialized Path Tree approach used by
``treebeard`` is *very* sensitive to database indexing, so you'll
probably want to ``EXPLAIN`` your most common queries involving the
:attr:`~treebeard.MP_Node.path` field and add proper indexes.
.. note::
Test results on Ubuntu 8.04.1 on a Thinkpad T61 with 4GB of RAM.
.. warning::
These results shouldn't be taken as *"X is faster than Y"*,
but as *"both X and Y are very fast"*.
Databases tested:
- MySQL InnoDB 5.0.51a, default settings
- MySQL MyISAM 5.0.51a, default settings
- PostgreSQL 8.2.7, default settings, mounted on RAM
- PostgreSQL 8.3.3, default settings, mounted on RAM
- SQLite3, mounted on RAM
+-------------+--------------+-------------------+-------------------+-------------------+-------------------+-------------------+
| Test | Model | innodb | myisam | pg82 | pg83 | sqlite |
| | +---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | | no tx | tx | no tx | tx | no tx | tx | no tx | tx | no tx | tx |
+=============+==============+=========+=========+=========+=========+=========+=========+=========+=========+=========+=========+
| Inserts | TB MP | 3220 | 2660 | 3181 | 2766 | 2859 | 2542 | 2540 | 2309 | 2205 | 1934 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL | 1963 | 1905 | 1998 | 1936 | 1937 | 1775 | 1736 | 1631 | 1583 | 1457 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS | 3386 | 3438 | 3359 | 3420 | 4061 | 7242 | 3536 | 4401 | 2794 | 2554 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT | 7559 | 9280 | 7525 | 9028 | 5202 | 14969 | 4764 | 6022 | 3781 | 3484 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB MP Sorted | 4732 | 5627 | 5038 | 5215 | 4022 | 4808 | 3415 | 3942 | 3250 | 3045 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL Sorted | 1096 | 1052 | 1092 | 1033 | 1239 | 999 | 1049 | 896 | 860 | 705 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS Sorted | 6637 | 6373 | 6283 | 6313 | 7548 | 10053 | 6717 | 10941 | 5907 | 5461 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT Sorted | 8564 | 10729 | 7947 | 10221 | 6077 | 7567 | 5490 | 6894 | 4842 | 4284 |
+-------------+--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| Descendants | TB MP | 6298 | N/A | 6460 | N/A | 7643 | N/A | 7132 | N/A | 10415 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL | 56850 | N/A | 116550 | N/A | 54249 | N/A | 50682 | N/A | 50521 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS | 5595 | N/A | 5824 | N/A | 10080 | N/A | 5840 | N/A | 5965 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT | 5268 | N/A | 5306 | N/A | 9394 | N/A | 8745 | N/A | 5197 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB MP Sorted | 6698 | N/A | 6408 | N/A | 8248 | N/A | 7265 | N/A | 10513 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL Sorted | 59817 | N/A | 59718 | N/A | 56767 | N/A | 52574 | N/A | 53458 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS Sorted | 5631 | N/A | 5858 | N/A | 9980 | N/A | 9210 | N/A | 6026 | N/A |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT Sorted | 5186 | N/A | 5453 | N/A | 9723 | N/A | 8912 | N/A | 5333 | N/A |
+-------------+--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| Move | TB MP | 837 | 1156 | 992 | 1211 | 745 | 1040 | 603 | 740 | 497 | 468 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL | 8708 | 8684 | 9798 | 8890 | 7243 | 7213 | 6721 | 6757 | 7051 | 6863 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS | 683 | 658 | 660 | 679 | 1266 | 2000 | 650 | 907 | 672 | 637 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT | 6449 | 7793 | 6356 | 7003 | 4993 | 20743 | 4445 | 8977 | 921 | 896 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB MP Sorted | 6730 | 7036 | 6743 | 7023 | 6410 | 19294 | 3622 | 12380 | 2622 | 2487 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL Sorted | 3866 | 3731 | 3873 | 3717 | 3587 | 3599 | 3394 | 3371 | 3491 | 3416 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS Sorted | 2017 | 2017 | 1958 | 2078 | 4397 | 7981 | 3892 | 8110 | 1543 | 1496 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT Sorted | 6563 | 10540 | 6427 | 9358 | 5132 | 20426 | 4601 | 9428 | 957 | 955 |
+-------------+--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| Delete | TB MP | 714 | 651 | 733 | 686 | 699 | 689 | 595 | 561 | 636 | 557 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL | 975 | 1093 | 2199 | 991 | 758 | 847 | 714 | 804 | 843 | 921 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS | 745 | 745 | 742 | 763 | 555 | 698 | 430 | 506 | 530 | 513 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT | 2928 | 4473 | 2914 | 4814 | 69385 | 167777 | 18186 | 26270 | 1617 | 1635 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB MP Sorted | 811 | 751 | 808 | 737 | 798 | 1180 | 648 | 1101 | 612 | 565 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB AL Sorted | 1030 | 1030 | 1055 | 987 | 797 | 1023 | 760 | 969 | 884 | 859 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | TB NS Sorted | 756 | 750 | 728 | 758 | 807 | 847 | 576 | 748 | 501 | 490 |
| +--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
| | MPTT Sorted | 3729 | 5108 | 3833 | 4776 | 86545 | 148596 | 34059 | 127125 | 2024 | 1787 |
+-------------+--------------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+
.. _`django-mptt`: http://code.google.com/p/django-mptt/
django-treebeard-2.0b1/docs/changes.rst
Changelog
=========
.. include:: ../CHANGES
django-treebeard-2.0b1/docs/conf.py
# -*- coding: utf-8 -*-
"""
Configuration for the Sphinx documentation generator.
Reference: http://sphinx.pocoo.org/config.html
"""
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'treebeard.tests.settings'
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.coverage',
'sphinx.ext.graphviz', 'sphinx.ext.inheritance_diagram',
'sphinx.ext.todo']
templates_path = ['_templates']
source_suffix = '.rst'
master_doc = 'index'
project = 'django-treebeard'
copyright = '2008-2013, Gustavo Picon'
version = '2.0b1'
release = '2.0b1'
exclude_trees = ['_build']
pygments_style = 'sphinx'
html_theme = 'default'
html_static_path = ['_static']
htmlhelp_basename = 'django-treebearddoc'
latex_documents = [(
'index',
'django-treebeard.tex',
'django-treebeard Documentation',
'Gustavo Picon',
'manual')]
django-treebeard-2.0b1/docs/exceptions.rst
Exceptions
==========
.. module:: treebeard.exceptions
.. autoexception:: InvalidPosition
.. autoexception:: InvalidMoveToDescendant
.. autoexception:: PathOverflow
.. autoexception:: MissingNodeOrderBy
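A short usage sketch showing where these exceptions typically surface
(``node`` and ``descendant`` are hypothetical node instances)::

    from treebeard.exceptions import InvalidMoveToDescendant, InvalidPosition

    try:
        node.move(descendant, 'last-child')
    except InvalidMoveToDescendant:
        # a node can't be moved under one of its own descendants
        pass
    except InvalidPosition:
        # the position keyword isn't valid for this model
        pass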
django-treebeard-2.0b1/docs/forms.rst
Forms
=====
.. module:: treebeard.forms
.. autoclass:: MoveNodeForm
:show-inheritance:
Read the `Django Form objects documentation`_ for reference.
.. _`Django Form objects documentation`:
http://docs.djangoproject.com/en/dev/topics/forms/#form-objects
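One plausible way to wire the form to a node model, assuming the usual
``ModelForm`` subclassing conventions (``Category`` and ``myapp`` are the
hypothetical names from the introduction)::

    from treebeard.forms import MoveNodeForm

    from myapp.models import Category  # hypothetical node model


    class CategoryForm(MoveNodeForm):
        class Meta:
            model = Category

    # in a view: bind to an existing node, validate, and save (which may move it)
    form = CategoryForm(request.POST, instance=some_category)
    if form.is_valid():
        form.save()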
django-treebeard-2.0b1/docs/index.rst
django-treebeard
================
`django-treebeard <https://tabo.pe/projects/django-treebeard/>`_
is a library that provides efficient tree implementations for the
`Django Web Framework 1.4+ <https://www.djangoproject.com/>`_, written by
`Gustavo Picón <https://tabo.pe/>`_ and licensed under the Apache License 2.0.
``django-treebeard`` is:
- **Flexible**: Includes 3 different tree implementations with the same API:
1. :doc:`Adjacency List <al_tree>`
2. :doc:`Materialized Path <mp_tree>`
3. :doc:`Nested Sets <ns_tree>`
- **Fast**: Optimized non-naive tree operations (see :doc:`benchmarks`).
- **Easy**: Uses `Django Model Inheritance with abstract classes`_
to define your own models.
- **Clean**: Testable and well tested code base. Code/branch test coverage
is above 96%. Tests are available in Jenkins:
- `Tests running on different versions of Python, Django and DB engines`_
- `Code Quality`_
Contents
--------
.. toctree::
:maxdepth: 2
intro
api
mp_tree
ns_tree
al_tree
admin
forms
exceptions
benchmarks
tests
changes
.. _`Django Model Inheritance with abstract classes`:
https://docs.djangoproject.com/en/1.4/topics/db/models/#abstract-base-classes
.. _`Tests running on different versions of Python, Django and DB engines`:
https://tabo.pe/jenkins/job/django-treebeard/
.. _`Code Quality`: https://tabo.pe/jenkins/job/django-treebeard-quality/
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
django-treebeard-2.0b1/docs/intro.rst
Introduction
============
Everything you need to get working quickly.
Prerequisites
-------------
``django-treebeard`` needs at least **Python 2.5** to run, and
**Django 1.4 or better**.
Installation
------------
You have several ways to install ``django-treebeard``. If you're not sure,
`just use pip `_
pip (or easy_install)
~~~~~~~~~~~~~~~~~~~~~
You can install the release versions from
`django-treebeard's PyPI page`_ using ``pip``::
pip install django-treebeard
or if for some reason you can't use ``pip``, you can try ``easy_install``,
(at your own risk)::
easy_install --always-unzip django-treebeard
setup.py
~~~~~~~~
Download a release from the `treebeard download page`_ and unpack it, then
run::
python setup.py install
.deb packages
~~~~~~~~~~~~~
Both Debian and Ubuntu include ``django-treebeard`` as a package, so you can
just use::
apt-get install python-django-treebeard
or::
aptitude install python-django-treebeard
Remember that the packages included in Linux distributions are usually not the
most recent versions.
Configuration
-------------
Add ``'treebeard'`` to the `INSTALLED_APPS`_ section in your django settings
file.
.. note::
If you are going to use the :class:`TreeAdmin <treebeard.admin.TreeAdmin>`
class, you need to add the path to treebeard's templates in
`TEMPLATE_DIRS`_.
Also you need to enable `django.core.context_processors.request`_
in the `TEMPLATE_CONTEXT_PROCESSORS`_ setting in your django settings file.
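A sketch of the relevant settings (the template path is illustrative; point it
at wherever treebeard is installed in your environment)::

    INSTALLED_APPS = (
        # ...
        'treebeard',
    )

    TEMPLATE_CONTEXT_PROCESSORS = (
        # Django's defaults, plus:
        'django.core.context_processors.request',
    )

    # Only needed for the admin templates:
    # TEMPLATE_DIRS = ('/path/to/site-packages/treebeard/templates',)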
Basic Usage
-----------
Create a basic model for your tree. In this example we'll use a Materialized
Path tree::
from django.db import models
from treebeard.mp_tree import MP_Node
class Category(MP_Node):
name = models.CharField(max_length=30)
node_order_by = ['name']
def __unicode__(self):
return 'Category: %s' % self.name
Run syncdb::
python manage.py syncdb
Let's create some nodes::
>>> get = lambda node_id: Category.objects.get(pk=node_id)
>>> root = Category.add_root(name='Computer Hardware')
>>> node = get(root.id).add_child(name='Memory')
>>> get(node.id).add_sibling(name='Hard Drives')
>>> get(node.id).add_sibling(name='SSD')
>>> get(node.id).add_child(name='Desktop Memory')
>>> get(node.id).add_child(name='Laptop Memory')
>>> get(node.id).add_child(name='Server Memory')
.. note::
Why retrieve every node again after the first operation? Because
``django-treebeard`` uses raw queries for most write operations,
and raw queries don't update the django objects of the db entries they
modify.
We just created this tree:
.. digraph:: introduction_digraph
"Computer Hardware";
"Computer Hardware" -> "Hard Drives";
"Computer Hardware" -> "Memory";
"Memory" -> "Desktop Memory";
"Memory" -> "Laptop Memory";
"Memory" -> "Server Memory";
"Computer Hardware" -> "SSD";
You can see the tree structure with code::
>>> Category.dump_bulk()
[{'id': 1, 'data': {'name': u'Computer Hardware'},
'children': [
{'id': 3, 'data': {'name': u'Hard Drives'}},
{'id': 2, 'data': {'name': u'Memory'},
'children': [
{'id': 5, 'data': {'name': u'Desktop Memory'}},
{'id': 6, 'data': {'name': u'Laptop Memory'}},
{'id': 7, 'data': {'name': u'Server Memory'}}]},
{'id': 4, 'data': {'name': u'SSD'}}]}]
>>> Category.get_annotated_list()
[(<Category: Category: Computer Hardware>,
{'close': [], 'level': 0, 'open': True}),
(<Category: Category: Hard Drives>,
{'close': [], 'level': 1, 'open': True}),
(<Category: Category: Memory>,
{'close': [], 'level': 1, 'open': False}),
(<Category: Category: Desktop Memory>,
{'close': [], 'level': 2, 'open': True}),
(<Category: Category: Laptop Memory>,
{'close': [], 'level': 2, 'open': False}),
(<Category: Category: Server Memory>,
{'close': [0], 'level': 2, 'open': False}),
(<Category: Category: SSD>,
{'close': [0, 1], 'level': 1, 'open': False})]
Read the :class:`treebeard.models.Node` API reference for detailed info.
.. _`django-treebeard's PyPI page`:
http://pypi.python.org/pypi/django-treebeard
.. _`treebeard download page`:
https://tabo.pe/projects/django-treebeard/download/
.. _`treebeard mercurial repository`:
http://code.tabo.pe/django-treebeard
.. _`latest treebeard version from PyPi`:
http://pypi.python.org/pypi/django-treebeard/
.. _`django.core.context_processors.request`:
http://docs.djangoproject.com/en/dev/ref/templates/api/#django-core-context-processors-request
.. _`INSTALLED_APPS`:
http://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
.. _`TEMPLATE_DIRS`:
http://docs.djangoproject.com/en/dev/ref/settings/#template-dirs
.. _`TEMPLATE_CONTEXT_PROCESSORS`:
http://docs.djangoproject.com/en/dev/ref/settings/#template-context-processors
django-treebeard-2.0b1/docs/Makefile
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest
help:
@echo "Please use \`make ' where is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage Coverage"
clean:
-rm -rf $(BUILDDIR)/*
html:
mkdir -p _static
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/django-treebeard.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/django-treebeard.qhc"
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \
"run these through (pdf)latex."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Coverage, " \
"results in $(BUILDDIR)/coverage"
django-treebeard-2.0b1/docs/mp_tree.rst
Materialized Path trees
=======================
.. module:: treebeard.mp_tree
This is an efficient implementation of Materialized Path
trees for Django 1.4+, as described by `Vadim Tropashko`_ in `SQL Design
Patterns`_. Materialized Path is probably the fastest way of working with
trees in SQL without the need of extra work in the database, like Oracle's
``CONNECT BY`` or sprocs and triggers for nested intervals.
In a materialized path approach, every node in the tree will have a
:attr:`~MP_Node.path` attribute, where the full path from the root
to the node will be stored. This has the advantage of needing very simple
and fast queries, at the risk of inconsistency because of the
denormalization of ``parent``/``child`` foreign keys. This can be prevented
with transactions.
``django-treebeard`` uses a particular approach: every step in the path has
a fixed width and has no separators. This makes queries predictable and
faster at the cost of using more characters to store a step. To address
this problem, every step number is encoded.
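A toy illustration of the idea (this is *not* treebeard's actual
implementation, which uses **numconv**): with a step length of 4 and the
default 0-9A-Z alphabet, each child number becomes a fixed-width base-36
chunk, and a node's path is just the concatenation of its ancestors' chunks::

    ALPHABET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'

    def encode_step(number, steplen=4):
        # encode a 1-based child position as a fixed-width base-36 string
        chars = []
        while number:
            number, rem = divmod(number, len(ALPHABET))
            chars.append(ALPHABET[rem])
        return ''.join(reversed(chars)).rjust(steplen, '0')

    # first root node -> its second child -> that child's third child:
    path = encode_step(1) + encode_step(2) + encode_step(3)
    # path == '000100020003'; the node's depth is len(path) // 4 == 3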
Also, two extra fields are stored in every node:
:attr:`~MP_Node.depth` and :attr:`~MP_Node.numchild`.
This makes the read operations faster, at the cost of a little more
maintenance on tree updates/inserts/deletes. Don't worry, even with these
extra steps, materialized path is more efficient than other approaches.
.. note::
The materialized path approach makes heavy use of ``LIKE`` in your
database, with clauses like ``WHERE path LIKE '002003%'``. If you think
that ``LIKE`` is too slow, you're right, but in this case the
:attr:`~MP_Node.path` field is indexed in the database, and all
``LIKE`` clauses that don't **start** with a ``%`` character will use
the index. This is what makes the materialized path approach so fast.
.. _`Vadim Tropashko`: http://vadimtropashko.wordpress.com/
.. _`Sql Design Patterns`:
http://www.rampant-books.com/book_2006_1_sql_coding_styles.htm
.. inheritance-diagram:: MP_Node
.. autoclass:: MP_Node
:show-inheritance:
.. warning::
Do not change the values of :attr:`path`, :attr:`depth` or
:attr:`numchild` directly: use one of the included methods instead.
Consider these values *read-only*.
.. warning::
Do not change the values of the :attr:`steplen`, :attr:`alphabet` or
:attr:`node_order_by` after saving your first model. Doing so will
corrupt the tree. If you *must* do it:
1. Backup the tree with :meth:`dump_bulk`
2. Empty your model's table
3. Change :attr:`depth`, :attr:`alphabet` and/or
:attr:`node_order_by` in your model
4. Restore your backup using :meth:`load_bulk` with
``keep_ids=True`` to keep the same primary keys you had (a sketch of
this cycle follows below).
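A minimal sketch of that backup/restore cycle (``MyNodeModel`` is hypothetical)::

    # 1. backup the whole tree; the dump includes the primary keys
    data = MyNodeModel.dump_bulk()

    # 2. empty the table, then edit steplen/alphabet/node_order_by in the model
    MyNodeModel.objects.all().delete()

    # 3. restore the backup, reusing the original primary keys
    MyNodeModel.load_bulk(data, keep_ids=True)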
Example::
class SortedNode(MP_Node):
node_order_by = ['numval', 'strval']
numval = models.IntegerField()
strval = models.CharField(max_length=255)
Read the API reference of :class:`treebeard.Node` for info on methods
available in this class, or read the following section for methods with
particular arguments or exceptions.
.. attribute:: steplen
Attribute that defines the length of each step in the :attr:`path` of
a node. The default value of *4* allows a maximum of
*1679615* children per node. Increase this value if you plan to store
large trees (a ``steplen`` of *5* allows more than *60M* children per
node). Note that increasing this value, while increasing the number of
children per node, will decrease the max :attr:`depth` of the tree (by
default: *63*). To increase the max :attr:`depth`, increase the
max_length attribute of the :attr:`path` field in your model.
.. attribute:: alphabet
Attribute: the alphabet that will be used in base conversions
when encoding the path steps into strings. The default value,
``0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ``, is the best value that is
portable across the supported databases (which means:
their default collation will order the :attr:`path` field correctly).
.. note::
In case you know what you are doing, there is a test, disabled by
default, that can tell you the optimal default alphabet for
your environment. To run the test you must enable the
:envvar:`TREEBEARD_TEST_ALPHABET` environment variable::
$ TREEBEARD_TEST_ALPHABET=1 python manage.py test treebeard.TestTreeAlphabet
On my Mountain Lion system, these are the optimal values for the
three supported databases in their *default* configuration:
================ ================
Database Optimal Alphabet
================ ================
MySQL 5.6.10 0-9A-Z
PostgreSQL 9.2.4 0-9A-Z
Sqlite3 0-9A-Z
================ ================
.. attribute:: node_order_by
Attribute: a list of model fields that will be used for node
ordering. When enabled, all tree operations will assume this ordering.
Example::
node_order_by = ['field1', 'field2', 'field3']
.. attribute:: path
``CharField``, stores the full materialized path for each node. The
default value of its ``max_length``, *255*, is the most efficient and
portable value for a ``varchar``. Increase it to allow deeper trees (max
depth by default: *63*).
.. note::
`django-treebeard` uses Django's abstract model inheritance, so:
1. To change the max_length value of the path in your model, you
can't just define it since you'd get a django exception, you have
to modify the already defined attribute::
class MyNodeModel(MP_Node):
pass
MyNodeModel._meta.get_field('path').max_length = 1024
2. You can't rely on Django's `auto_now` properties in date fields
for sorting, you'll have to manually set the value before creating
a node::
class TestNodeSortedAutoNow(MP_Node):
desc = models.CharField(max_length=255)
created = models.DateTimeField(auto_now_add=True)
node_order_by = ['created']
TestNodeSortedAutoNow.add_root(desc='foo',
created=datetime.datetime.now())
.. note::
For performance, and if your database allows it, you can safely
define the path column as ASCII (not utf-8/unicode/iso8859-1/etc) to
keep the index smaller (and faster). Also note that some databases
(mysql) have a small index size limit. InnoDB for instance has a
limit of 765 bytes per index, so that would be the limit if your path
is ASCII encoded. If your path column in InnoDB is using unicode,
the index limit will be 255 characters since in MySQL's indexes,
unicode means 3 bytes.
.. note::
treebeard uses **numconv** for path encoding:
https://tabo.pe/projects/numconv/
.. attribute:: depth
``PositiveIntegerField``, depth of a node in the tree. A root node
has a depth of *1*.
.. attribute:: numchild
``PositiveIntegerField``, the number of children of the node.
.. automethod:: add_root
See: :meth:`treebeard.Node.add_root`
.. automethod:: add_child
See: :meth:`treebeard.Node.add_child`
.. automethod:: add_sibling
See: :meth:`treebeard.Node.add_sibling`
.. automethod:: move
See: :meth:`treebeard.Node.move`
.. automethod:: get_tree
See: :meth:`treebeard.Node.get_tree`
.. note::
This method returns a queryset.
.. automethod:: find_problems
.. note::
A node won't appear in more than one list, even when it exhibits
more than one problem. This method stops checking a node when it
finds a problem and continues to the next node.
.. note::
Problems 1, 2 and 3 can't be solved automatically.
Example::
MyNodeModel.find_problems()
.. automethod:: fix_tree
Example::
MyNodeModel.fix_tree()
django-treebeard-2.0b1/docs/ns_tree.rst
Nested Sets trees
=================
.. module:: treebeard.ns_tree
An implementation of Nested Sets trees for Django 1.4+, as described by
`Joe Celko`_ in `Trees and Hierarchies in SQL for Smarties`_.
Nested sets have very efficient reads at the cost of high maintenance on
write/delete operations.
.. _`Joe Celko`: http://en.wikipedia.org/wiki/Joe_Celko
.. _`Trees and Hierarchies in SQL for Smarties`:
http://www.elsevier.com/wps/product/cws_home/702605
.. inheritance-diagram:: NS_Node
.. autoclass:: NS_Node
:show-inheritance:
.. attribute:: node_order_by
Attribute: a list of model fields that will be used for node
ordering. When enabled, all tree operations will assume this ordering.
Example::
node_order_by = ['field1', 'field2', 'field3']
.. attribute:: depth
``PositiveIntegerField``, depth of a node in the tree. A root node
has a depth of *1*.
.. attribute:: lft
``PositiveIntegerField``
.. attribute:: rgt
``PositiveIntegerField``
.. attribute:: tree_id
``PositiveIntegerField``
.. automethod:: get_tree
See: :meth:`treebeard.Node.get_tree`
.. note::
This method returns a queryset.
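A minimal model sketch, mirroring the Materialized Path example (field names
are illustrative)::

    from django.db import models
    from treebeard.ns_tree import NS_Node

    class NS_Category(NS_Node):
        name = models.CharField(max_length=30)
        node_order_by = ['name']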
django-treebeard-2.0b1/docs/tests.rst
Running the Test Suite
======================
``django-treebeard`` includes a comprehensive test suite. It is highly
recommended that you run and update the test suite when you send patches.
py.test
-------
You will need `pytest`_ to run the test suite.
To run the test suite::
$ py.test
You can use all the features and plugins of pytest this way.
By default the test suite will run using a sqlite3 database in RAM, but you can
change this by setting environment variables:
.. option:: DATABASE_ENGINE
.. option:: DATABASE_NAME
.. option:: DATABASE_USER
.. option:: DATABASE_PASSWORD
.. option:: DATABASE_HOST
.. option:: DATABASE_PORT
Sets the database settings to be used by the test suite. Useful if you
want to test the same database engine/version you use in production.
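For example, to run the suite against a local PostgreSQL database (the values
shown are illustrative; check ``treebeard/tests/settings.py`` for the exact
names and values the settings expect)::

    $ DATABASE_ENGINE=psycopg2 DATABASE_NAME=treebeard DATABASE_USER=postgres py.test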
tox
---
``django-treebeard`` uses `tox`_ to run the test suite in all the supported
environments:
- py24-dj12-sqlite
- py24-dj12-mysql
- py24-dj12-postgres
- py24-dj13-sqlite
- py24-dj13-mysql
- py24-dj13-postgres
- py25-dj12-sqlite
- py25-dj12-mysql
- py25-dj12-postgres
- py25-dj13-sqlite
- py25-dj13-mysql
- py25-dj13-postgres
- py26-dj12-sqlite
- py26-dj12-mysql
- py26-dj12-postgres
- py26-dj13-sqlite
- py26-dj13-mysql
- py26-dj13-postgres
- py27-dj12-sqlite
- py27-dj12-mysql
- py27-dj12-postgres
- py27-dj13-sqlite
- py27-dj13-mysql
- py27-dj13-postgres
This means that the test suite will run 24 times to test every
environment supported by ``django-treebeard``. This takes a long time.
If you want to test only one or a few environments, please use the `-e`
option in `tox`_, like::
$ tox -e py27-dj13-postgres
.. _verbosity level:
   http://docs.djangoproject.com/en/dev/ref/django-admin/#django-admin-option---verbosity
.. _pytest: http://pytest.org/
.. _coverage: http://nedbatchelder.com/code/coverage/
.. _tox: http://codespeak.net/tox/
django-treebeard-2.0b1/LICENSE
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
django-treebeard-2.0b1/MANIFEST.in 0000644 0000765 0000024 00000000235 12151306605 016600 0 ustar tabo staff 0000000 0000000 include CHANGES LICENSE NOTICE README.rst UPDATING MANIFEST.in
recursive-include docs Makefile README.rst *.py *.rst
recursive-include treebeard *.py *.html
django-treebeard-2.0b1/NOTICE 0000644 0000765 0000024 00000000000 12150510123 015723 0 ustar tabo staff 0000000 0000000 django-treebeard-2.0b1/PKG-INFO 0000644 0000765 0000024 00000004127 12151313146 016141 0 ustar tabo staff 0000000 0000000 Metadata-Version: 1.1
Name: django-treebeard
Version: 2.0b1
Summary: Efficient tree implementations for Django 1.4+
Home-page: https://tabo.pe/projects/django-treebeard/
Author: Gustavo Picon
Author-email: tabo@tabo.pe
License: Apache License 2.0
Description:
django-treebeard
================
django-treebeard is a library of efficient tree implementations for the
Django Web Framework 1.4+, written by Gustavo Picón and licensed under
the Apache License 2.0.
django-treebeard is:
- **Flexible**: Includes 3 different tree implementations with the same API:
1. Adjacency List
2. Materialized Path
3. Nested Sets
- **Fast**: Optimized non-naive tree operations (see Benchmarks).
- **Easy**: Uses Django Model Inheritance with abstract classes to define your own
models.
- **Clean**: Testable and well tested code base. Code/branch test coverage is above
96%. Tests are available in Jenkins:
- Test suite running on different versions of Python, Django and database
engine: https://tabo.pe/jenkins/job/django-treebeard/
- Code quality: https://tabo.pe/jenkins/job/django-treebeard-quality/
You can find the documentation in
https://tabo.pe/projects/django-treebeard/docs/tip/
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Environment :: Web Environment
Classifier: Framework :: Django
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Utilities
django-treebeard-2.0b1/README.rst 0000644 0000765 0000024 00000001700 12151311067 016525 0 ustar tabo staff 0000000 0000000
django-treebeard
================
django-treebeard is a library of efficient tree implementations for the
Django Web Framework 1.4+, written by Gustavo Picón and licensed under
the Apache License 2.0.
django-treebeard is:
- **Flexible**: Includes 3 different tree implementations with the same API:
1. Adjacency List
2. Materialized Path
3. Nested Sets
- **Fast**: Optimized non-naive tree operations (see Benchmarks).
- **Easy**: Uses Django Model Inheritance with abstract classes to define your own
models.
- **Clean**: Testable and well tested code base. Code/branch test coverage is above
96%. Tests are available in Jenkins:
- Test suite running on different versions of Python, Django and database
engine: https://tabo.pe/jenkins/job/django-treebeard/
- Code quality: https://tabo.pe/jenkins/job/django-treebeard-quality/
You can find the documentation in
https://tabo.pe/projects/django-treebeard/docs/tip/
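For example, defining a Materialized Path model and adding a few nodes could
look like this (an illustrative sketch, not part of the shipped package)::

    from django.db import models
    from treebeard.mp_tree import MP_Node

    class Category(MP_Node):
        name = models.CharField(max_length=30)
        node_order_by = ['name']

    root = Category.add_root(name='Computer Hardware')
    root.add_child(name='Memory')
    root.add_child(name='Hard Drives')

Because some write operations run raw SQL, nodes already held in memory can
become stale after the tree changes; re-fetching a node from the database
before performing further tree operations on it is the safe pattern.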
django-treebeard-2.0b1/setup.cfg 0000644 0000765 0000024 00000000073 12151313146 016661 0 ustar tabo staff 0000000 0000000 [egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0
django-treebeard-2.0b1/setup.py 0000644 0000765 0000024 00000003375 12151311076 016562 0 ustar tabo staff 0000000 0000000 #!/usr/bin/env python
import os
from setuptools import setup
from setuptools.command.test import test
def root_dir():
rd = os.path.dirname(__file__)
if rd:
return rd
return '.'
class pytest_test(test):
def finalize_options(self):
test.finalize_options(self)
self.test_args = []
self.test_suite = True
def run_tests(self):
import pytest
pytest.main([])
setup_args = dict(
name='django-treebeard',
version='2.0b1',
url='https://tabo.pe/projects/django-treebeard/',
author='Gustavo Picon',
author_email='tabo@tabo.pe',
license='Apache License 2.0',
packages=['treebeard', 'treebeard.templatetags'],
package_dir={'treebeard': 'treebeard'},
package_data={
'treebeard': ['templates/admin/*.html', 'static/treebeard/*']},
description='Efficient tree implementations for Django 1.4+',
long_description=open(root_dir() + '/README.rst').read(),
cmdclass={'test': pytest_test},
install_requires=['Django>=1.4'],
tests_require=['pytest'],
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Environment :: Web Environment',
'Framework :: Django',
'Programming Language :: Python',
'Programming Language :: Python :: 2.5',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3.2',
'Programming Language :: Python :: 3.3',
'Operating System :: OS Independent',
'Topic :: Software Development :: Libraries',
'Topic :: Utilities'])
if __name__ == '__main__':
setup(**setup_args)
django-treebeard-2.0b1/treebeard/ 0000755 0000765 0000024 00000000000 12151313146 016775 5 ustar tabo staff 0000000 0000000 django-treebeard-2.0b1/treebeard/__init__.py 0000644 0000765 0000024 00000000026 12151310464 021104 0 ustar tabo staff 0000000 0000000 __version__ = '2.0b1'
django-treebeard-2.0b1/treebeard/admin.py 0000644 0000765 0000024 00000010023 12150510123 020424 0 ustar tabo staff 0000000 0000000 """Django admin support for treebeard"""
import sys
from django.conf.urls import patterns, url
from django.contrib import admin, messages
from django.http import HttpResponse, HttpResponseBadRequest
from django.utils.translation import ugettext_lazy as _
if sys.version_info >= (3, 0):
from django.utils.encoding import force_str
else:
from django.utils.encoding import force_unicode as force_str
from treebeard.exceptions import (InvalidPosition, MissingNodeOrderBy,
InvalidMoveToDescendant, PathOverflow)
from treebeard.al_tree import AL_Node
class TreeAdmin(admin.ModelAdmin):
"""Django Admin class for treebeard"""
change_list_template = 'admin/tree_change_list.html'
def queryset(self, request):
if issubclass(self.model, AL_Node):
# AL trees return a list instead of a QuerySet for .get_tree(),
# so we return the regular .queryset because we will use the old
# admin for them
return super(TreeAdmin, self).queryset(request)
else:
return self.model.get_tree()
def changelist_view(self, request, extra_context=None):
if issubclass(self.model, AL_Node):
# For AL trees, use the old admin display
self.change_list_template = 'admin/tree_list.html'
return super(TreeAdmin, self).changelist_view(request, extra_context)
def get_urls(self):
"""
Adds a url to move nodes to this admin
"""
urls = super(TreeAdmin, self).get_urls()
new_urls = patterns(
'',
url('^move/$', self.admin_site.admin_view(self.move_node), ),
url(r'^jsi18n/$', 'django.views.i18n.javascript_catalog',
{'packages': ('treebeard',)}),
)
return new_urls + urls
def get_node(self, node_id):
return self.model.objects.get(pk=node_id)
def move_node(self, request):
try:
node_id = request.POST['node_id']
target_id = request.POST['sibling_id']
as_child = bool(int(request.POST.get('as_child', 0)))
except (KeyError, ValueError):
# Some parameters were missing; return a BadRequest
return HttpResponseBadRequest('Malformed POST params')
node = self.get_node(node_id)
target = self.get_node(target_id)
is_sorted = node.node_order_by is not None
pos = {
(True, True): 'sorted-child',
(True, False): 'last-child',
(False, True): 'sorted-sibling',
(False, False): 'left',
}[as_child, is_sorted]
try:
node.move(target, pos=pos)
# Call the save method on the (reloaded) node in order to trigger
# possible signal handlers etc.
node = self.get_node(node.pk)
node.save()
except (MissingNodeOrderBy, PathOverflow, InvalidMoveToDescendant,
InvalidPosition):
e = sys.exc_info()[1]
# An error was raised while trying to move the node. Set an error
# message and return 400; this will cause a reload on the client
# to show the message
messages.error(request,
_('Exception raised while moving node: %s') % _(
force_str(e)))
return HttpResponseBadRequest('Exception raised during move')
if as_child:
msg = _('Moved node "%(node)s" as child of "%(other)s"')
else:
msg = _('Moved node "%(node)s" as sibling of "%(other)s"')
messages.info(request, msg % {'node': node, 'other': target})
return HttpResponse('OK')
def admin_factory(form_class):
"""Dynamically build a TreeAdmin subclass for the given form class.
:param form_class:
:return: A TreeAdmin subclass.
"""
return type(
form_class.__name__ + 'Admin',
(TreeAdmin,),
dict(form=form_class))
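# Example usage (an illustrative sketch, not part of the original module):
# register a hypothetical ``MyNode`` treebeard model in the admin by
# combining ``movenodeform_factory`` with ``admin_factory``.
#
#     from django.contrib import admin
#     from treebeard.admin import admin_factory
#     from treebeard.forms import movenodeform_factory
#     from myapp.models import MyNode  # hypothetical concrete node model
#
#     MyNodeForm = movenodeform_factory(MyNode)
#     admin.site.register(MyNode, admin_factory(MyNodeForm))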
django-treebeard-2.0b1/treebeard/al_tree.py 0000644 0000765 0000024 00000025302 12150510123 020755 0 ustar tabo staff 0000000 0000000 """Adjacency List"""
from django.core import serializers
from django.db import connection, models, transaction
from django.utils.translation import ugettext_noop as _
from treebeard.exceptions import InvalidMoveToDescendant
from treebeard.models import Node
class AL_NodeManager(models.Manager):
"""Custom manager for nodes."""
def get_query_set(self):
"""Sets the custom queryset as the default."""
if self.model.node_order_by:
order_by = ['parent'] + list(self.model.node_order_by)
else:
order_by = ['parent', 'sib_order']
return super(AL_NodeManager, self).get_query_set().order_by(*order_by)
class AL_Node(Node):
"""Abstract model to create your own Adjacency List Trees."""
objects = AL_NodeManager()
node_order_by = None
@classmethod
def add_root(cls, **kwargs):
"""Adds a root node to the tree."""
newobj = cls(**kwargs)
newobj._cached_depth = 1
if not cls.node_order_by:
try:
max = cls.objects.filter(parent__isnull=True).order_by(
'sib_order').reverse()[0].sib_order
except IndexError:
max = 0
newobj.sib_order = max + 1
newobj.save()
transaction.commit_unless_managed()
return newobj
@classmethod
def get_root_nodes(cls):
""":returns: A queryset containing the root nodes in the tree."""
return cls.objects.filter(parent__isnull=True)
def get_depth(self, update=False):
"""
:returns: the depth (level) of the node
Caches the result in the object itself to help in loops.
:param update: Updates the cached value.
"""
if self.parent_id is None:
return 1
try:
if update:
del self._cached_depth
else:
return self._cached_depth
except AttributeError:
pass
depth = 0
node = self
while node:
node = node.parent
depth += 1
self._cached_depth = depth
return depth
def get_children(self):
""":returns: A queryset of all the node's children"""
return self.__class__.objects.filter(parent=self)
def get_parent(self, update=False):
""":returns: the parent node of the current node object."""
return self.parent
def get_ancestors(self):
"""
:returns: A *list* containing the current node object's ancestors,
starting with the root node and descending to the parent.
"""
ancestors = []
node = self.parent
while node:
ancestors.insert(0, node)
node = node.parent
return ancestors
def get_root(self):
""":returns: the root node for the current node object."""
ancestors = self.get_ancestors()
if ancestors:
return ancestors[0]
return self
def is_descendant_of(self, node):
"""
:returns: ``True`` if the node is a descendant of another node given
as an argument, else, returns ``False``
"""
return self.pk in [obj.pk for obj in node.get_descendants()]
@classmethod
def dump_bulk(cls, parent=None, keep_ids=True):
"""Dumps a tree branch to a python data structure."""
serializable_cls = cls._get_serializable_model()
if (
parent and serializable_cls != cls and
parent.__class__ != serializable_cls
):
parent = serializable_cls.objects.get(pk=parent.pk)
# a list of nodes: not really a queryset, but it works
objs = serializable_cls.get_tree(parent)
ret, lnk = [], {}
for node, pyobj in zip(objs, serializers.serialize('python', objs)):
depth = node.get_depth()
# django's serializer stores the attributes in 'fields'
fields = pyobj['fields']
del fields['parent']
# non-sorted trees have this
if 'sib_order' in fields:
del fields['sib_order']
if 'id' in fields:
del fields['id']
newobj = {'data': fields}
if keep_ids:
newobj['id'] = pyobj['pk']
if (not parent and depth == 1) or\
(parent and depth == parent.get_depth()):
ret.append(newobj)
else:
parentobj = lnk[node.parent_id]
if 'children' not in parentobj:
parentobj['children'] = []
parentobj['children'].append(newobj)
lnk[node.pk] = newobj
return ret
def add_child(self, **kwargs):
"""Adds a child to the node."""
newobj = self.__class__(**kwargs)
try:
newobj._cached_depth = self._cached_depth + 1
except AttributeError:
pass
if not self.__class__.node_order_by:
try:
max = self.__class__.objects.filter(parent=self).reverse(
)[0].sib_order
except IndexError:
max = 0
newobj.sib_order = max + 1
newobj.parent = self
newobj.save()
transaction.commit_unless_managed()
return newobj
@classmethod
def _get_tree_recursively(cls, results, parent, depth):
if parent:
nodes = parent.get_children()
else:
nodes = cls.get_root_nodes()
for node in nodes:
node._cached_depth = depth
results.append(node)
cls._get_tree_recursively(results, node, depth + 1)
@classmethod
def get_tree(cls, parent=None):
"""
:returns: A list of nodes ordered as DFS, including the parent. If
no parent is given, the entire tree is returned.
"""
if parent:
depth = parent.get_depth() + 1
results = [parent]
else:
depth = 1
results = []
cls._get_tree_recursively(results, parent, depth)
return results
def get_descendants(self):
"""
:returns: A *list* of all the node's descendants, doesn't
include the node itself
"""
return self.__class__.get_tree(parent=self)[1:]
def get_descendant_count(self):
""":returns: the number of descendants of a node."""
return len(self.get_descendants())
def get_siblings(self):
"""
:returns: A queryset of all the node's siblings, including the node
itself.
"""
if self.parent:
return self.__class__.objects.filter(parent=self.parent)
return self.__class__.get_root_nodes()
def add_sibling(self, pos=None, **kwargs):
"""Adds a new node as a sibling to the current node object."""
pos = self._prepare_pos_var_for_add_sibling(pos)
newobj = self.__class__(**kwargs)
if not self.node_order_by:
newobj.sib_order = self.__class__._get_new_sibling_order(pos,
self)
newobj.parent_id = self.parent_id
newobj.save()
transaction.commit_unless_managed()
return newobj
@classmethod
def _is_target_pos_the_last_sibling(cls, pos, target):
return pos == 'last-sibling' or (
pos == 'right' and target == target.get_last_sibling())
@classmethod
def _make_hole_in_db(cls, min, target_node):
qset = cls.objects.filter(sib_order__gte=min)
if target_node.is_root():
qset = qset.filter(parent__isnull=True)
else:
qset = qset.filter(parent=target_node.parent)
qset.update(sib_order=models.F('sib_order') + 1)
@classmethod
def _make_hole_and_get_sibling_order(cls, pos, target_node):
siblings = target_node.get_siblings()
siblings = {
'left': siblings.filter(sib_order__gte=target_node.sib_order),
'right': siblings.filter(sib_order__gt=target_node.sib_order),
'first-sibling': siblings
}[pos]
sib_order = {
'left': target_node.sib_order,
'right': target_node.sib_order + 1,
'first-sibling': 1
}[pos]
try:
min = siblings.order_by('sib_order')[0].sib_order
except IndexError:
min = 0
if min:
cls._make_hole_in_db(min, target_node)
return sib_order
@classmethod
def _get_new_sibling_order(cls, pos, target_node):
if cls._is_target_pos_the_last_sibling(pos, target_node):
sib_order = target_node.get_last_sibling().sib_order + 1
else:
sib_order = cls._make_hole_and_get_sibling_order(pos, target_node)
return sib_order
def move(self, target, pos=None):
"""
Moves the current node and all its descendants to a new position
relative to another node.
"""
pos = self._prepare_pos_var_for_move(pos)
sib_order = None
parent = None
if pos in ('first-child', 'last-child', 'sorted-child'):
# moving to a child
if not target.is_leaf():
target = target.get_last_child()
pos = {'first-child': 'first-sibling',
'last-child': 'last-sibling',
'sorted-child': 'sorted-sibling'}[pos]
else:
parent = target
if pos == 'sorted-child':
pos = 'sorted-sibling'
else:
pos = 'first-sibling'
sib_order = 1
if target.is_descendant_of(self):
raise InvalidMoveToDescendant(
_("Can't move node to a descendant."))
if self == target and (
(pos == 'left') or
(pos in ('right', 'last-sibling') and
target == target.get_last_sibling()) or
(pos == 'first-sibling' and
target == target.get_first_sibling())):
# special cases, not actually moving the node so no need to UPDATE
return
if pos == 'sorted-sibling':
if parent:
self.parent = parent
else:
self.parent = target.parent
else:
if sib_order:
self.sib_order = sib_order
else:
self.sib_order = self.__class__._get_new_sibling_order(pos,
target)
if parent:
self.parent = parent
else:
self.parent = target.parent
self.save()
transaction.commit_unless_managed()
class Meta:
"""Abstract model."""
abstract = True
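# Example (an illustrative sketch, not part of the original module): a
# concrete, unsorted Adjacency List model must supply the ``parent`` and
# ``sib_order`` fields that AL_Node expects (a sorted model would define
# ``node_order_by`` instead of ``sib_order``). ``Category`` is hypothetical.
#
#     from django.db import models
#     from treebeard.al_tree import AL_Node
#
#     class Category(AL_Node):
#         parent = models.ForeignKey('self', related_name='children_set',
#                                    null=True, db_index=True)
#         sib_order = models.PositiveIntegerField()
#         name = models.CharField(max_length=255)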
django-treebeard-2.0b1/treebeard/exceptions.py 0000644 0000765 0000024 00000001165 12150510123 021524 0 ustar tabo staff 0000000 0000000 """Treebeard exceptions"""
class InvalidPosition(Exception):
"""Raised when passing an invalid pos value"""
class InvalidMoveToDescendant(Exception):
"""Raised when attempting to move a node to one of its descendants."""
class MissingNodeOrderBy(Exception):
"""
Raised when an operation needs a missing
:attr:`~treebeard.MP_Node.node_order_by` attribute
"""
class PathOverflow(Exception):
"""
Raised when trying to add or move a node to a position where no more nodes
can be added (see :attr:`~treebeard.MP_Node.path` and
:attr:`~treebeard.MP_Node.alphabet` for more info)
"""
django-treebeard-2.0b1/treebeard/forms.py 0000644 0000765 0000024 00000016544 12150510123 020500 0 ustar tabo staff 0000000 0000000 """Forms for treebeard."""
from django import forms
from django.db.models.query import QuerySet
from django.forms.models import BaseModelForm, ErrorList, model_to_dict
from django.forms.models import modelform_factory as django_modelform_factory
from django.utils.safestring import mark_safe
from django.utils.translation import ugettext_lazy as _
from treebeard.al_tree import AL_Node
from treebeard.mp_tree import MP_Node
from treebeard.ns_tree import NS_Node
class MoveNodeForm(forms.ModelForm):
"""
Form to handle moving a node in a tree.
Handles sorted/unsorted trees.
"""
__position_choices_sorted = (
('sorted-child', _('Child of')),
('sorted-sibling', _('Sibling of')),
)
__position_choices_unsorted = (
('first-child', _('First child of')),
('left', _('Before')),
('right', _('After')),
)
_position = forms.ChoiceField(required=True, label=_("Position"))
_ref_node_id = forms.TypedChoiceField(required=False,
coerce=int,
label=_("Relative to"))
def _get_position_ref_node(self, instance):
if self.is_sorted:
position = 'sorted-child'
node_parent = instance.get_parent()
if node_parent:
ref_node_id = node_parent.pk
else:
ref_node_id = ''
else:
prev_sibling = instance.get_prev_sibling()
if prev_sibling:
position = 'right'
ref_node_id = prev_sibling.pk
else:
position = 'first-child'
if instance.is_root():
ref_node_id = ''
else:
ref_node_id = instance.get_parent().pk
return {'_ref_node_id': ref_node_id,
'_position': position}
def __init__(self, data=None, files=None, auto_id='id_%s', prefix=None,
initial=None, error_class=ErrorList, label_suffix=':',
empty_permitted=False, instance=None):
opts = self._meta
if instance is None:
if opts.model is None:
raise ValueError('MoveNodeForm has no model class specified.')
else:
opts.model = type(instance)
self.is_sorted = getattr(opts.model, 'node_order_by', False)
if self.is_sorted:
choices_sort_mode = self.__class__.__position_choices_sorted
else:
choices_sort_mode = self.__class__.__position_choices_unsorted
self.declared_fields['_position'].choices = choices_sort_mode
if instance is None:
# if we didn't get an instance, instantiate a new one
instance = opts.model()
object_data = {}
choices_for_node = None
else:
object_data = model_to_dict(instance, opts.fields, opts.exclude)
object_data.update(self._get_position_ref_node(instance))
choices_for_node = instance
choices = self.mk_dropdown_tree(opts.model, for_node=choices_for_node)
self.declared_fields['_ref_node_id'].choices = choices
self.instance = instance
# if initial was provided, it should override the values from instance
if initial is not None:
object_data.update(initial)
super(BaseModelForm, self).__init__(data, files, auto_id, prefix,
object_data, error_class,
label_suffix, empty_permitted)
def _clean_cleaned_data(self):
"""Delete auxiliary fields not belonging to the node model."""
reference_node_id = 0
if '_ref_node_id' in self.cleaned_data:
reference_node_id = self.cleaned_data['_ref_node_id']
del self.cleaned_data['_ref_node_id']
position_type = self.cleaned_data['_position']
del self.cleaned_data['_position']
return position_type, reference_node_id
def save(self, commit=True):
position_type, reference_node_id = self._clean_cleaned_data()
if self.instance.pk is None:
cl_data = {}
for field in self.cleaned_data:
if not isinstance(self.cleaned_data[field], (list, QuerySet)):
cl_data[field] = self.cleaned_data[field]
if reference_node_id:
reference_node = self._meta.model.objects.get(
pk=reference_node_id)
self.instance = reference_node.add_child(**cl_data)
self.instance.move(reference_node, pos=position_type)
else:
self.instance = self._meta.model.add_root(**cl_data)
else:
self.instance.save()
if reference_node_id:
reference_node = self._meta.model.objects.get(
pk=reference_node_id)
self.instance.move(reference_node, pos=position_type)
else:
if self.is_sorted:
pos = 'sorted-sibling'
else:
pos = 'first-sibling'
self.instance.move(self._meta.model.get_first_root_node(), pos)
# Reload the instance
self.instance = self._meta.model.objects.get(pk=self.instance.pk)
super(MoveNodeForm, self).save(commit=commit)
return self.instance
@staticmethod
def is_loop_safe(for_node, possible_parent):
if for_node is not None:
# a reference node is only safe if it is neither the node being
# moved nor one of its descendants
return not (
possible_parent == for_node
) and not (possible_parent.is_descendant_of(for_node))
return True
@staticmethod
def mk_indent(level):
return ' ' * (level - 1)
@classmethod
def add_subtree(cls, for_node, node, options):
""" Recursively build options tree. """
if cls.is_loop_safe(for_node, node):
options.append(
(node.pk,
mark_safe(cls.mk_indent(node.get_depth()) + str(node))))
for subnode in node.get_children():
cls.add_subtree(for_node, subnode, options)
@classmethod
def mk_dropdown_tree(cls, model, for_node=None):
""" Creates a tree-like list of choices """
options = [(0, _('-- root --'))]
for node in model.get_root_nodes():
cls.add_subtree(for_node, node, options)
return options
def movenodeform_factory(model, form=MoveNodeForm, fields=None, exclude=None,
formfield_callback=None, widgets=None):
"""Dynamically build a MoveNodeForm subclass with the proper Meta.
:param model:
:return: A MoveNodeForm subclass
Example of a generated class::
class AL_TestNodeForm(MoveNodeForm):
class Meta:
model = models.AL_TestNode
exclude = ('sib_order', 'parent')
"""
_exclude = _get_exclude_for_model(model, exclude)
return django_modelform_factory(
model, form, fields, _exclude, formfield_callback, widgets)
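# Example usage (an illustrative sketch; ``MyNode`` and ``myapp`` are
# hypothetical), e.g. inside a Django view:
#
#     from treebeard.forms import movenodeform_factory
#     from myapp.models import MyNode
#
#     MyNodeForm = movenodeform_factory(MyNode)
#
#     form = MyNodeForm(request.POST or None)
#     if form.is_valid():
#         node = form.save()  # creates the node or moves an existing one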
def _get_exclude_for_model(model, exclude):
if exclude:
_exclude = tuple(exclude)
else:
_exclude = ()
if issubclass(model, AL_Node):
_exclude += ('sib_order', 'parent')
elif issubclass(model, MP_Node):
_exclude += ('depth', 'numchild', 'path')
elif issubclass(model, NS_Node):
_exclude += ('depth', 'lft', 'rgt', 'tree_id')
return _exclude
django-treebeard-2.0b1/treebeard/models.py 0000644 0000765 0000024 00000050234 12151051647 020643 0 ustar tabo staff 0000000 0000000 """Models and base API"""
import sys
import operator
if sys.version_info >= (3, 0):
from functools import reduce
from django.db.models import Q
from django.db import models, transaction, router, connections
from treebeard.exceptions import InvalidPosition, MissingNodeOrderBy
class Node(models.Model):
"""Node class"""
_db_connection = None
@classmethod
def add_root(cls, **kwargs): # pragma: no cover
"""
Adds a root node to the tree. The new root node will be the new
rightmost root node. If you want to insert a root node at a specific
position, use :meth:`add_sibling` in an already existing root node
instead.
:param \*\*kwargs: object creation data that will be passed to the
inherited Node model
:returns: the created node object. It will be save()d by this method.
"""
raise NotImplementedError
@classmethod
def get_foreign_keys(cls):
""" Get foreign keys and models they refer to, so we can pre-process the
data for load_bulk """
foreign_keys = {}
for field in cls._meta.fields:
if field.get_internal_type() == 'ForeignKey' and \
field.name != 'parent':
foreign_keys[field.name] = field.rel.to
return foreign_keys
@classmethod
def _process_foreign_keys(cls, foreign_keys, node_data):
""" For each foreign key try to load the actual object so load_bulk
doesn't fail trying to load an int where django expects a model instance
"""
for key in foreign_keys.keys():
if key in node_data:
node_data[key] = foreign_keys[key].objects.get(
pk=node_data[key])
@classmethod
def load_bulk(cls, bulk_data, parent=None, keep_ids=False):
"""
Loads a list/dictionary structure to the tree.
:param bulk_data:
The data that will be loaded, the structure is a list of
dictionaries with 2 keys:
- ``data``: will store arguments that will be passed for object
creation, and
- ``children``: a list of dictionaries, each one has its own
``data`` and ``children`` keys (a recursive structure)
:param parent:
The node that will receive the structure as children, if not
specified the first level of the structure will be loaded as root
nodes
:param keep_ids:
If enabled, loads the nodes with the same id that are given in the
structure. Will error if there are nodes without id info or if the
ids are already used.
:returns: A list of the added node ids.
"""
# tree, iterative preorder
added = []
# stack of nodes to analyze
stack = [(parent, node) for node in bulk_data[::-1]]
foreign_keys = cls.get_foreign_keys()
while stack:
parent, node_struct = stack.pop()
# shallow copy of the data structure so it doesn't persist...
node_data = node_struct['data'].copy()
cls._process_foreign_keys(foreign_keys, node_data)
if keep_ids:
node_data['id'] = node_struct['id']
if parent:
node_obj = parent.add_child(**node_data)
else:
node_obj = cls.add_root(**node_data)
added.append(node_obj.pk)
if 'children' in node_struct:
# extending the stack with the current node as the parent of
# the new nodes
stack.extend([
(node_obj, node)
for node in node_struct['children'][::-1]
])
transaction.commit_unless_managed()
return added
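# Example of the ``bulk_data`` structure accepted by load_bulk (an
# illustrative sketch; ``desc`` is a hypothetical field of a concrete
# ``MyNode`` model):
#
#     data = [
#         {'data': {'desc': '1'}},
#         {'data': {'desc': '2'}, 'children': [
#             {'data': {'desc': '21'}},
#             {'data': {'desc': '22'}},
#         ]},
#     ]
#     node_ids = MyNode.load_bulk(data, parent=None)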
@classmethod
def dump_bulk(cls, parent=None, keep_ids=True): # pragma: no cover
"""
Dumps a tree branch to a python data structure.
:param parent:
The node whose descendants will be dumped. The node itself will be
included in the dump. If not given, the entire tree will be dumped.
:param keep_ids:
Stores the id value (primary key) of every node. Enabled by
default.
:returns: A python data structure, described with detail in
:meth:`load_bulk`
"""
raise NotImplementedError
@classmethod
def get_root_nodes(cls): # pragma: no cover
""":returns: A queryset containing the root nodes in the tree."""
raise NotImplementedError
@classmethod
def get_first_root_node(cls):
"""
:returns:
The first root node in the tree or ``None`` if it is empty.
"""
try:
return cls.get_root_nodes()[0]
except IndexError:
return None
@classmethod
def get_last_root_node(cls):
"""
:returns:
The last root node in the tree or ``None`` if it is empty.
"""
try:
return cls.get_root_nodes().reverse()[0]
except IndexError:
return None
@classmethod
def find_problems(cls): # pragma: no cover
"""Checks for problems in the tree structure."""
raise NotImplementedError
@classmethod
def fix_tree(cls): # pragma: no cover
"""
Solves problems that can appear when transactions are not used and
a piece of code breaks, leaving the tree in an inconsistent state.
"""
raise NotImplementedError
@classmethod
def get_tree(cls, parent=None):
"""
:returns:
A list of nodes ordered as DFS, including the parent. If
no parent is given, the entire tree is returned.
"""
raise NotImplementedError
@classmethod
def get_descendants_group_count(cls, parent=None):
"""
Helper for a very common case: get a group of siblings and the number
of *descendants* (not only children) in every sibling.
:param parent:
The parent of the siblings to return. If no parent is given, the
root nodes will be returned.
:returns:
A `list` (**NOT** a Queryset) of node objects with an extra
attribute: `descendants_count`.
"""
if parent is None:
qset = cls.get_root_nodes()
else:
qset = parent.get_children()
nodes = list(qset)
for node in nodes:
node.descendants_count = node.get_descendant_count()
return nodes
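# Example (an illustrative sketch; ``MyNode`` is a hypothetical concrete
# model): list every root node together with its total number of descendants.
#
#     for node in MyNode.get_descendants_group_count():
#         print('%s has %d descendants' % (node, node.descendants_count))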
def get_depth(self): # pragma: no cover
""":returns: the depth (level) of the node"""
raise NotImplementedError
def get_siblings(self): # pragma: no cover
"""
:returns:
A queryset of all the node's siblings, including the node
itself.
"""
raise NotImplementedError
def get_children(self): # pragma: no cover
""":returns: A queryset of all the node's children"""
raise NotImplementedError
def get_children_count(self):
""":returns: The number of the node's children"""
return self.get_children().count()
def get_descendants(self):
"""
:returns:
A queryset of all the node's descendants, doesn't
include the node itself (some subclasses may return a list).
"""
raise NotImplementedError
def get_descendant_count(self):
""":returns: the number of descendants of a node."""
return self.get_descendants().count()
def get_first_child(self):
"""
:returns:
The node's leftmost child, or None if it has no children.
"""
try:
return self.get_children()[0]
except IndexError:
return None
def get_last_child(self):
"""
:returns:
The node's rightmost child, or None if it has no children.
"""
try:
return self.get_children().reverse()[0]
except IndexError:
return None
def get_first_sibling(self):
"""
:returns:
The node's leftmost sibling; can return the node itself if
it is the leftmost sibling.
"""
return self.get_siblings()[0]
def get_last_sibling(self):
"""
:returns:
The node's rightmost sibling; can return the node itself if
it is the rightmost sibling.
"""
return self.get_siblings().reverse()[0]
def get_prev_sibling(self):
"""
:returns:
The node's previous sibling, or None if it is the leftmost
sibling.
"""
siblings = self.get_siblings()
ids = [obj.pk for obj in siblings]
if self.pk in ids:
idx = ids.index(self.pk)
if idx > 0:
return siblings[idx - 1]
def get_next_sibling(self):
"""
:returns:
The node's next sibling, or None if it is the rightmost
sibling.
"""
siblings = self.get_siblings()
ids = [obj.pk for obj in siblings]
if self.pk in ids:
idx = ids.index(self.pk)
if idx < len(siblings) - 1:
return siblings[idx + 1]
def is_sibling_of(self, node):
"""
:returns: ``True`` if the node is a sibling of another node given as an
argument, else, returns ``False``
:param node:
The node that will be checked as a sibling
"""
return self.get_siblings().filter(pk=node.pk).exists()
def is_child_of(self, node):
"""
:returns: ``True`` if the node is a child of another node given as an
argument, else, returns ``False``
:param node:
The node that will be checked as a parent
"""
return node.get_children().filter(pk=self.pk).exists()
def is_descendant_of(self, node): # pragma: no cover
"""
:returns: ``True`` if the node is a descendant of another node given
as an argument, else, returns ``False``
:param node:
The node that will be checked as an ancestor
"""
raise NotImplementedError
def add_child(self, **kwargs): # pragma: no cover
"""
Adds a child to the node. The new node will be the new rightmost
child. If you want to insert a node at a specific position,
use the :meth:`add_sibling` method of an already existing
child node instead.
:param \*\*kwargs:
Object creation data that will be passed to the inherited Node
model
:returns: The created node object. It will be save()d by this method.
"""
raise NotImplementedError
def add_sibling(self, pos=None, **kwargs): # pragma: no cover
"""
Adds a new node as a sibling to the current node object.
:param pos:
The position, relative to the current node object, where the
new node will be inserted, can be one of:
- ``first-sibling``: the new node will be the new leftmost sibling
- ``left``: the new node will take the node's place, which will be
moved to the right 1 position
- ``right``: the new node will be inserted at the right of the node
- ``last-sibling``: the new node will be the new rightmost sibling
- ``sorted-sibling``: the new node will be at the right position
according to the value of node_order_by
:param \*\*kwargs:
Object creation data that will be passed to the inherited
Node model
:returns:
The created node object. It will be saved by this method.
:raise InvalidPosition: when passing an invalid ``pos`` parm
:raise InvalidPosition: when :attr:`node_order_by` is enabled and the
``pos`` parm wasn't ``sorted-sibling``
:raise MissingNodeOrderBy: when passing ``sorted-sibling`` as ``pos``
and the :attr:`node_order_by` attribute is missing
"""
raise NotImplementedError
def get_root(self): # pragma: no cover
""":returns: the root node for the current node object."""
raise NotImplementedError
def is_root(self):
""":returns: True if the node is a root node (else, returns False)"""
return self.get_root().pk == self.pk
def is_leaf(self):
""":returns: True if the node is a leaf node (else, returns False)"""
return not self.get_children().exists()
def get_ancestors(self): # pragma: no cover
"""
:returns:
A queryset containing the current node object's ancestors,
starting with the root node and descending to the parent.
(some subclasses may return a list)
"""
raise NotImplementedError
def get_parent(self, update=False): # pragma: no cover
"""
:returns: the parent node of the current node object.
Caches the result in the object itself to help in loops.
:param update: Updates the cached value.
"""
raise NotImplementedError
def move(self, target, pos=None): # pragma: no cover
"""
Moves the current node and all its descendants to a new position
relative to another node.
:param target:
The node that will be used as a relative child/sibling when moving
:param pos:
The position, relative to the target node, where the
current node object will be moved to, can be one of:
- ``first-child``: the node will be the new leftmost child of the
``target`` node
- ``last-child``: the node will be the new rightmost child of the
``target`` node
- ``sorted-child``: the new node will be moved as a child of the
``target`` node according to the value of :attr:`node_order_by`
- ``first-sibling``: the node will be the new leftmost sibling of
the ``target`` node
- ``left``: the node will take the ``target`` node's place, which
will be moved to the right 1 position
- ``right``: the node will be moved to the right of the ``target``
node
- ``last-sibling``: the node will be the new rightmost sibling of
the ``target`` node
- ``sorted-sibling``: the new node will be moved as a sibling of
the ``target`` node according to the value of
:attr:`node_order_by`
.. note::
If no ``pos`` is given the library will use ``last-sibling``,
or ``sorted-sibling`` if :attr:`node_order_by` is enabled.
:returns: None
:raise InvalidPosition: when passing an invalid ``pos`` parm
:raise InvalidPosition: when :attr:`node_order_by` is enabled and the
``pos`` parm wasn't ``sorted-sibling`` or ``sorted-child``
:raise InvalidMoveToDescendant: when trying to move a node to one of
its own descendants
:raise PathOverflow: when the library can't make room for the
node's new position
:raise MissingNodeOrderBy: when passing ``sorted-sibling`` or
``sorted-child`` as ``pos`` and the :attr:`node_order_by`
attribute is missing
"""
raise NotImplementedError
def delete(self):
"""Removes a node and all its descendants."""
self.__class__.objects.filter(id=self.pk).delete()
def _prepare_pos_var(self, pos, method_name, valid_pos, valid_sorted_pos):
if pos is None:
if self.node_order_by:
pos = 'sorted-sibling'
else:
pos = 'last-sibling'
if pos not in valid_pos:
raise InvalidPosition('Invalid relative position: %s' % (pos, ))
if self.node_order_by and pos not in valid_sorted_pos:
raise InvalidPosition(
'Must use %s in %s when node_order_by is enabled' % (
' or '.join(valid_sorted_pos), method_name))
if pos in valid_sorted_pos and not self.node_order_by:
raise MissingNodeOrderBy('Missing node_order_by attribute.')
return pos
_valid_pos_for_add_sibling = ('first-sibling', 'left', 'right',
'last-sibling', 'sorted-sibling')
_valid_pos_for_sorted_add_sibling = ('sorted-sibling',)
def _prepare_pos_var_for_add_sibling(self, pos):
return self._prepare_pos_var(
pos,
'add_sibling',
self._valid_pos_for_add_sibling,
self._valid_pos_for_sorted_add_sibling)
_valid_pos_for_move = _valid_pos_for_add_sibling + (
'first-child', 'last-child', 'sorted-child')
_valid_pos_for_sorted_move = _valid_pos_for_sorted_add_sibling + (
'sorted-child',)
def _prepare_pos_var_for_move(self, pos):
return self._prepare_pos_var(
pos,
'move',
self._valid_pos_for_move,
self._valid_pos_for_sorted_move)
def get_sorted_pos_queryset(self, siblings, newobj):
"""
:returns: A queryset of the nodes that must be moved
to the right. Called only for Node models with :attr:`node_order_by`
This function is based on _insertion_target_filters from django-mptt
(BSD licensed) by Jonathan Buchanan:
https://github.com/django-mptt/django-mptt/blob/0.3.0/mptt/signals.py
"""
fields, filters = [], []
for field in self.node_order_by:
value = getattr(newobj, field)
filters.append(
Q(
*[Q(**{f: v}) for f, v in fields] +
[Q(**{'%s__gt' % field: value})]
)
)
fields.append((field, value))
return siblings.filter(reduce(operator.or_, filters))
@classmethod
def get_annotated_list(cls, parent=None):
"""
Gets an annotated list from a tree branch.
:param parent:
The node whose descendants will be annotated. The node itself
will be included in the list. If not given, the entire tree
will be annotated.
"""
result, info = [], {}
start_depth, prev_depth = (None, None)
for node in cls.get_tree(parent):
depth = node.get_depth()
if start_depth is None:
start_depth = depth
open = (depth and (prev_depth is None or depth > prev_depth))
if prev_depth is not None and depth < prev_depth:
info['close'] = list(range(0, prev_depth - depth))
info = {'open': open, 'close': [], 'level': depth - start_depth}
result.append((node, info,))
prev_depth = depth
if start_depth and start_depth > 0:
info['close'] = list(range(0, prev_depth - start_depth + 1))
return result
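# Example (an illustrative sketch; ``MyNode`` is a hypothetical concrete
# model): render the annotated list as nested HTML lists, mirroring the
# usual template loop over ``info['open']`` and ``info['close']``.
#
#     parts = []
#     for node, info in MyNode.get_annotated_list():
#         parts.append('<ul><li>' if info['open'] else '</li><li>')
#         parts.append(str(node))
#         parts.extend('</li></ul>' for _ in info['close'])
#     html = ''.join(parts)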
@classmethod
def _get_serializable_model(cls):
"""
Returns a model with a valid _meta.local_fields (serializable).
Basically, this means the original model, not a proxied model.
(this is a workaround for a bug in django)
"""
current_class = cls
while current_class._meta.proxy:
current_class = current_class._meta.proxy_for_model
return current_class
@classmethod
def _get_database_connection(cls, action):
if cls._db_connection is None:
cls._db_connection = {
'read': connections[router.db_for_read(cls)],
'write': connections[router.db_for_write(cls)]
}
return cls._db_connection[action]
@classmethod
def get_database_vendor(cls, action):
"""
returns the supported database vendor used by a treebeard model when
performing read (select) or write (update, insert, delete) operations.
:param action:
`read` or `write`
:returns: postgresql, mysql or sqlite
"""
return cls._get_database_connection(action).vendor
@classmethod
def _get_database_cursor(cls, action):
return cls._get_database_connection(action).cursor()
class Meta:
"""Abstract model."""
abstract = True
django-treebeard-2.0b1/treebeard/mp_tree.py 0000644 0000765 0000024 00000110322 12150701650 021001 0 ustar tabo staff 0000000 0000000 """Materialized Path Trees"""
import sys
import operator
if sys.version_info >= (3, 0):
from functools import reduce
from django.core import serializers
from django.db import models, transaction, connection
from django.db.models import F, Q
from django.utils.translation import ugettext_noop as _
from treebeard.numconv import NumConv
from treebeard.models import Node
from treebeard.exceptions import InvalidMoveToDescendant, PathOverflow
class MP_NodeQuerySet(models.query.QuerySet):
"""
Custom queryset for the tree node manager.
Needed only for the customized delete method.
"""
def delete(self):
"""
Custom delete method, will remove all descendant nodes to ensure a
consistent tree (no orphans)
:returns: ``None``
"""
# we'll have to manually run through all the nodes that are going
# to be deleted and remove nodes from the list if an ancestor is
# already getting removed, since that would be redundant
removed = {}
for node in self.order_by('depth', 'path'):
found = False
for depth in range(1, int(len(node.path) / node.steplen)):
path = node._get_basepath(node.path, depth)
if path in removed:
# we are already removing a parent of this node
# skip
found = True
break
if not found:
removed[node.path] = node
# ok, got the minimal list of nodes to remove...
# we must also remove their children
# and update every parent node's numchild attribute
# LOTS OF FUN HERE!
parents = {}
toremove = []
for path, node in removed.items():
parentpath = node._get_basepath(node.path, node.depth - 1)
if parentpath:
if parentpath not in parents:
parents[parentpath] = node.get_parent(True)
parent = parents[parentpath]
if parent and parent.numchild > 0:
parent.numchild -= 1
parent.save()
if not node.is_leaf():
toremove.append(Q(path__startswith=node.path))
else:
toremove.append(Q(path=node.path))
# Django will handle this as a SELECT and then a DELETE of
# ids, and will deal with removing related objects
if toremove:
qset = self.model.objects.filter(reduce(operator.or_, toremove))
super(MP_NodeQuerySet, qset).delete()
transaction.commit_unless_managed()
class MP_NodeManager(models.Manager):
"""Custom manager for nodes."""
def get_query_set(self):
"""Sets the custom queryset as the default."""
return MP_NodeQuerySet(self.model).order_by('path')
class MP_Node(Node):
"""Abstract model to create your own Materialized Path Trees."""
steplen = 4
alphabet = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'
node_order_by = []
path = models.CharField(max_length=255, unique=True)
depth = models.PositiveIntegerField()
numchild = models.PositiveIntegerField(default=0)
objects = MP_NodeManager()
numconv_obj_ = None
@classmethod
def _int2str(cls, num):
return cls.numconv_obj().int2str(num)
@classmethod
def _str2int(cls, num):
return cls.numconv_obj().str2int(num)
@classmethod
def numconv_obj(cls):
if cls.numconv_obj_ is None:
cls.numconv_obj_ = NumConv(len(cls.alphabet), cls.alphabet)
return cls.numconv_obj_
@classmethod
def add_root(cls, **kwargs):
"""
Adds a root node to the tree.
:raise PathOverflow: when no more root objects can be added
"""
# do we have a root node already?
last_root = cls.get_last_root_node()
if last_root and last_root.node_order_by:
# there are root nodes and node_order_by has been set
# delegate sorted insertion to add_sibling
return last_root.add_sibling('sorted-sibling', **kwargs)
if last_root:
# adding the new root node as the last one
newpath = cls._inc_path(last_root.path)
else:
# adding the first root node
newpath = cls._get_path(None, 1, 1)
# creating the new object
newobj = cls(**kwargs)
newobj.depth = 1
newobj.path = newpath
# saving the instance before returning it
newobj.save()
transaction.commit_unless_managed()
return newobj
@classmethod
def dump_bulk(cls, parent=None, keep_ids=True):
"""Dumps a tree branch to a python data structure."""
# Because of fix_tree, this method assumes that the depth
# and numchild properties in the nodes can be incorrect,
# so no helper methods are used
qset = cls._get_serializable_model().objects.all()
if parent:
qset = qset.filter(path__startswith=parent.path)
ret, lnk = [], {}
for pyobj in serializers.serialize('python', qset):
# django's serializer stores the attributes in 'fields'
fields = pyobj['fields']
path = fields['path']
depth = int(len(path) / cls.steplen)
# this will be useless in load_bulk
del fields['depth']
del fields['path']
del fields['numchild']
if 'id' in fields:
# this happens immediately after a load_bulk
del fields['id']
newobj = {'data': fields}
if keep_ids:
newobj['id'] = pyobj['pk']
if (not parent and depth == 1) or\
(parent and len(path) == len(parent.path)):
ret.append(newobj)
else:
parentpath = cls._get_basepath(path, depth - 1)
parentobj = lnk[parentpath]
if 'children' not in parentobj:
parentobj['children'] = []
parentobj['children'].append(newobj)
lnk[path] = newobj
return ret
@classmethod
def find_problems(cls):
"""
Checks for problems in the tree structure, problems can occur when:
1. your code breaks and you get incomplete transactions (always
use transactions!)
2. changing the ``steplen`` value in a model (you must
:meth:`dump_bulk` first, change ``steplen`` and then
:meth:`load_bulk`
:returns: A tuple of five lists:
1. a list of ids of nodes with characters not found in the
``alphabet``
2. a list of ids of nodes with a wrong ``path`` length
according to ``steplen``
3. a list of ids of orphaned nodes
4. a list of ids of nodes with the wrong depth value for
their path
5. a list of ids of nodes that report a wrong number of children
"""
evil_chars, bad_steplen, orphans = [], [], []
wrong_depth, wrong_numchild = [], []
for node in cls.objects.all():
found_error = False
for char in node.path:
if char not in cls.alphabet:
evil_chars.append(node.pk)
found_error = True
break
if found_error:
continue
if len(node.path) % cls.steplen:
bad_steplen.append(node.pk)
continue
try:
node.get_parent(True)
except cls.DoesNotExist:
orphans.append(node.pk)
continue
if node.depth != int(len(node.path) / cls.steplen):
wrong_depth.append(node.pk)
continue
real_numchild = cls.objects.filter(
path__range=cls._get_children_path_interval(node.path)
).extra(
where=['LENGTH(path)/%d=%d' % (cls.steplen, node.depth + 1)]
).count()
if real_numchild != node.numchild:
wrong_numchild.append(node.pk)
continue
return evil_chars, bad_steplen, orphans, wrong_depth, wrong_numchild
@classmethod
def fix_tree(cls, destructive=False):
"""
Solves some problems that can appear when transactions are not used and
a piece of code breaks, leaving the tree in an inconsistent state.
The problems this method solves are:
1. Nodes with an incorrect ``depth`` or ``numchild`` values due to
incorrect code and lack of database transactions.
2. "Holes" in the tree. This is normal if you move/delete nodes a
lot. Holes in a tree don't affect performance.
3. Incorrect ordering of nodes when ``node_order_by`` is enabled.
Ordering is enforced on *node insertion*, so if an attribute in
``node_order_by`` is modified after the node is inserted, the
tree ordering will be inconsistent.
:param destructive:
A boolean value. If True, a more aggressive fix_tree method will be
attempted. If False (the default), it will use a safe (and fast!)
fix approach, but it will only solve the ``depth`` and
``numchild`` nodes, it won't fix the tree holes or broken path
ordering.
.. warning::
Currently what the ``destructive`` method does is:
1. Backup the tree with :meth:`dump_bulk`
2. Remove all nodes in the tree.
3. Restore the tree with :meth:`load_bulk`
So, even when the primary keys of your nodes will be preserved,
this method isn't foreign-key friendly. That needs complex
in-place tree reordering, not available at the moment (hint:
patches are welcome).
"""
if destructive:
dump = cls.dump_bulk(None, True)
cls.objects.all().delete()
cls.load_bulk(dump, None, True)
else:
cursor = cls._get_database_cursor('write')
# fix the depth field
# we need the WHERE to speed up postgres
sql = "UPDATE %s "\
"SET depth=LENGTH(path)/%%s "\
"WHERE depth!=LENGTH(path)/%%s" % (
connection.ops.quote_name(cls._meta.db_table), )
vals = [cls.steplen, cls.steplen]
cursor.execute(sql, vals)
# fix the numchild field
vals = ['_' * cls.steplen]
# the cake and sql portability are a lie
if cls.get_database_vendor('read') == 'mysql':
sql = "SELECT tbn1.path, tbn1.numchild, ("\
"SELECT COUNT(1) "\
"FROM %(table)s AS tbn2 "\
"WHERE tbn2.path LIKE "\
"CONCAT(tbn1.path, %%s)) AS real_numchild "\
"FROM %(table)s AS tbn1 "\
"HAVING tbn1.numchild != real_numchild" % {
'table': connection.ops.quote_name(
cls._meta.db_table)}
else:
subquery = "(SELECT COUNT(1) FROM %(table)s AS tbn2"\
" WHERE tbn2.path LIKE tbn1.path||%%s)"
sql = ("SELECT tbn1.path, tbn1.numchild, " + subquery +
" FROM %(table)s AS tbn1 WHERE tbn1.numchild != " +
subquery)
sql = sql % {
'table': connection.ops.quote_name(cls._meta.db_table)}
# we include the subquery twice
vals *= 2
cursor.execute(sql, vals)
sql = "UPDATE %(table)s "\
"SET numchild=%%s "\
"WHERE path=%%s" % {
'table': connection.ops.quote_name(cls._meta.db_table)}
for node_data in cursor.fetchall():
vals = [node_data[2], node_data[0]]
cursor.execute(sql, vals)
transaction.commit_unless_managed()
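# Example (an illustrative sketch; ``MyNode`` is a hypothetical concrete
# MP_Node model): diagnose and repair a tree.
#
#     (evil_chars, bad_steplen, orphans,
#      wrong_depth, wrong_numchild) = MyNode.find_problems()
#     if wrong_depth or wrong_numchild:
#         MyNode.fix_tree()  # safe, non-destructive fix
#     # MyNode.fix_tree(destructive=True) rebuilds the whole tree instead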
@classmethod
def get_tree(cls, parent=None):
"""
:returns:
A *queryset* of nodes ordered as DFS, including the parent.
If no parent is given, the entire tree is returned.
"""
if parent is None:
# return the entire tree
return cls.objects.all()
if not parent.is_leaf():
return cls.objects.filter(path__startswith=parent.path,
depth__gte=parent.depth)
return cls.objects.filter(pk=parent.pk)
@classmethod
def get_root_nodes(cls):
""":returns: A queryset containing the root nodes in the tree."""
return cls.objects.filter(depth=1)
@classmethod
def get_descendants_group_count(cls, parent=None):
"""
Helper for a very common case: get a group of siblings and the number
of *descendants* in every sibling.
"""
#~
# disclaimer: this is the FOURTH implementation I wrote for this
# function. I really tried to make it return a queryset, but doing so
# with a *single* query isn't trivial with Django's ORM.
# ok, I DID manage to make Django's ORM return a queryset here,
# defining two querysets, passing one subquery in the tables parameters
# of .extra() of the second queryset, using the undocumented order_by
# feature, and using a HORRIBLE hack to avoid django quoting the
# subquery as a table, BUT (and there is always a but) the hack didn't
# survive turning the QuerySet into a ValuesQuerySet, so I just used
# good old SQL.
# NOTE: in case there is interest, the hack to avoid django quoting the
# subquery as a table, was adding the subquery to the alias cache of
# the queryset's query object:
#
# qset.query.quote_cache[subquery] = subquery
#
# If there is a better way to do this in an UNMODIFIED django 1.0, let
# me know.
#~
if parent:
depth = parent.depth + 1
params = cls._get_children_path_interval(parent.path)
extrand = 'AND path BETWEEN %s AND %s'
else:
depth = 1
params = []
extrand = ''
sql = 'SELECT * FROM %(table)s AS t1 INNER JOIN '\
' (SELECT '\
' SUBSTR(path, 1, %(subpathlen)s) AS subpath, '\
' COUNT(1)-1 AS count '\
' FROM %(table)s '\
' WHERE depth >= %(depth)s %(extrand)s'\
' GROUP BY subpath) AS t2 '\
' ON t1.path=t2.subpath '\
' ORDER BY t1.path' % {
'table': connection.ops.quote_name(cls._meta.db_table),
'subpathlen': depth * cls.steplen,
'depth': depth,
'extrand': extrand}
cursor = cls._get_database_cursor('write')
cursor.execute(sql, params)
ret = []
field_names = [field[0] for field in cursor.description]
for node_data in cursor.fetchall():
node = cls(**dict(zip(field_names, node_data[:-2])))
node.descendants_count = node_data[-1]
ret.append(node)
transaction.commit_unless_managed()
return ret
def get_depth(self):
""":returns: the depth (level) of the node"""
return self.depth
def get_siblings(self):
"""
:returns: A queryset of all the node's siblings, including the node
itself.
"""
qset = self.__class__.objects.filter(depth=self.depth)
if self.depth > 1:
# making sure the non-root nodes share a parent
parentpath = self._get_basepath(self.path, self.depth - 1)
qset = qset.filter(
path__range=self._get_children_path_interval(parentpath))
return qset
def get_children(self):
""":returns: A queryset of all the node's children"""
if self.is_leaf():
return self.__class__.objects.none()
return self.__class__.objects.filter(
depth=self.depth + 1,
path__range=self._get_children_path_interval(self.path)
)
def get_next_sibling(self):
"""
:returns: The node's next sibling, or None if it is the rightmost
sibling.
"""
try:
return self.get_siblings().filter(path__gt=self.path)[0]
except IndexError:
return None
def get_descendants(self):
"""
:returns: A queryset of all the node's descendants as DFS, doesn't
include the node itself
"""
return self.__class__.get_tree(self).exclude(pk=self.pk)
def get_prev_sibling(self):
"""
:returns: The node's previous sibling, or None if it is the leftmost
sibling.
"""
try:
return self.get_siblings().filter(path__lt=self.path).reverse()[0]
except IndexError:
return None
def get_children_count(self):
"""
:returns: The number of the node's children, calculated in the most
efficient possible way.
"""
return self.numchild
def is_sibling_of(self, node):
"""
:returns: ``True`` if the node is a sibling of another node given as an
argument, else, returns ``False``
"""
aux = self.depth == node.depth
# Check non-root nodes share a parent only if they have the same depth
if aux and self.depth > 1:
# making sure the non-root nodes share a parent
parentpath = self._get_basepath(self.path, self.depth - 1)
return aux and node.path.startswith(parentpath)
return aux
def is_child_of(self, node):
"""
:returns: ``True`` if the node is a child of another node given as an
argument, else, returns ``False``
"""
return (self.path.startswith(node.path) and
self.depth == node.depth + 1)
def is_descendant_of(self, node):
"""
:returns: ``True`` if the node is a descendant of another node given
as an argument, else, returns ``False``
"""
return self.path.startswith(node.path) and self.depth > node.depth
def add_child(self, **kwargs):
"""
Adds a child to the node.
:raise PathOverflow: when no more child nodes can be added
"""
if not self.is_leaf() and self.node_order_by:
# there are child nodes and node_order_by has been set
# delegate sorted insertion to add_sibling
self.numchild += 1
return self.get_last_child().add_sibling('sorted-sibling',
**kwargs)
# creating a new object
newobj = self.__class__(**kwargs)
newobj.depth = self.depth + 1
if not self.is_leaf():
# adding the new child as the last one
newobj.path = self._inc_path(self.get_last_child().path)
else:
# the node had no children, adding the first child
newobj.path = self._get_path(self.path, newobj.depth, 1)
max_length = newobj.__class__._meta.get_field('path').max_length
if len(newobj.path) > max_length:
raise PathOverflow(
_('The new node is too deep in the tree, try'
' increasing the path.max_length property'
' and UPDATE your database'))
# saving the instance before returning it
newobj.save()
newobj._cached_parent_obj = self
self.__class__.objects.filter(path=self.path).update(numchild=F(
'numchild')+1)
# we increase the numchild value of the object in memory
self.numchild += 1
transaction.commit_unless_managed()
return newobj
def add_sibling(self, pos=None, **kwargs):
"""
Adds a new node as a sibling to the current node object.
:raise PathOverflow: when the library can't make room for the
node's new position
"""
pos = self._prepare_pos_var_for_add_sibling(pos)
# creating a new object
newobj = self.__class__(**kwargs)
newobj.depth = self.depth
if pos == 'sorted-sibling':
siblings = self.get_sorted_pos_queryset(
self.get_siblings(), newobj)
try:
newpos = self._get_lastpos_in_path(siblings.all()[0].path)
except IndexError:
newpos = None
if newpos is None:
pos = 'last-sibling'
else:
newpos, siblings = None, []
stmts = []
_, newpath = self._move_add_sibling_aux(pos, newpos,
self.depth, self, siblings,
stmts, None, False)
parentpath = self._get_basepath(newpath, self.depth - 1)
if parentpath:
stmts.append(self._get_sql_update_numchild(parentpath, 'inc'))
cursor = self._get_database_cursor('write')
for sql, vals in stmts:
cursor.execute(sql, vals)
# saving the instance before returning it
newobj.path = newpath
newobj.save()
transaction.commit_unless_managed()
return newobj
def get_root(self):
""":returns: the root node for the current node object."""
return self.__class__.objects.get(path=self.path[0:self.steplen])
def is_leaf(self):
""":returns: True if the node is a leaf node (else, returns False)"""
return self.numchild == 0
def get_ancestors(self):
"""
:returns: A queryset containing the current node object's ancestors,
starting with the root node and descending to the parent.
"""
paths = [self.path[0:pos]
for pos in range(0, len(self.path), self.steplen)[1:]]
return self.__class__.objects.filter(path__in=paths).order_by('depth')
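# Worked sketch (illustration only, default steplen=4): for a node whose
# path is '000100020003' the slices taken above are path[0:4] and
# path[0:8], so the ancestor paths queried are '0001' (the root) and
# '00010002' (the parent), ordered by depth.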
def get_parent(self, update=False):
"""
:returns: the parent node of the current node object.
Caches the result in the object itself to help in loops.
"""
depth = int(len(self.path) / self.steplen)
if depth <= 1:
return
try:
if update:
del self._cached_parent_obj
else:
return self._cached_parent_obj
except AttributeError:
pass
parentpath = self._get_basepath(self.path, depth - 1)
self._cached_parent_obj = self.__class__.objects.get(path=parentpath)
return self._cached_parent_obj
def move(self, target, pos=None):
"""
Moves the current node and all its descendants to a new position
relative to another node.
:raise PathOverflow: when the library can't make room for the
node's new position
"""
pos = self._prepare_pos_var_for_move(pos)
oldpath = self.path
# initialize variables and, if moving to a child, update "move to
# child" to become a "move to sibling" if possible (if it can't
# be done, it means that we are adding the first child)
(pos, target, newdepth, siblings, newpos) = (
self._fix_move_to_child(pos, target)
)
if target.is_descendant_of(self):
raise InvalidMoveToDescendant(
_("Can't move node to a descendant."))
if oldpath == target.path and (
(pos == 'left') or
(pos in ('right', 'last-sibling') and
target.path == target.get_last_sibling().path) or
(pos == 'first-sibling' and
target.path == target.get_first_sibling().path)):
# special cases, not actually moving the node so no need to UPDATE
return
if pos == 'sorted-sibling':
siblings = self.get_sorted_pos_queryset(
target.get_siblings(), self)
try:
newpos = self._get_lastpos_in_path(siblings.all()[0].path)
except IndexError:
newpos = None
if newpos is None:
pos = 'last-sibling'
stmts = []
# generate the sql that will do the actual moving of nodes
oldpath, newpath = self._move_add_sibling_aux(pos, newpos, newdepth,
target, siblings, stmts,
oldpath, True)
# updates needed for mysql and children count in parents
self._updates_after_move(oldpath, newpath, stmts)
cursor = self._get_database_cursor('write')
for sql, vals in stmts:
cursor.execute(sql, vals)
transaction.commit_unless_managed()
@classmethod
def _get_basepath(cls, path, depth):
""":returns: The base path of another path up to a given depth"""
if path:
return path[0:depth * cls.steplen]
return ''
@classmethod
def _get_path(cls, path, depth, newstep):
"""
Builds a path given some values
:param path: the base path
:param depth: the depth of the node
:param newstep: the value (integer) of the new step
"""
parentpath = cls._get_basepath(path, depth - 1)
key = cls._int2str(newstep)
return '%s%s%s' % (parentpath,
'0' * (cls.steplen - len(key)),
key)
@classmethod
def _inc_path(cls, path):
""":returns: The path of the next sibling of a given node path."""
newpos = cls._str2int(path[-cls.steplen:]) + 1
key = cls._int2str(newpos)
if len(key) > cls.steplen:
raise PathOverflow(_("Path Overflow from: '%s'" % (path, )))
return '%s%s%s' % (path[:-cls.steplen],
'0' * (cls.steplen - len(key)),
key)
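# Worked examples (illustration only, assuming the default steplen=4 and
# the base-36 alphabet '0'..'9', 'A'..'Z'):
#
#   cls._get_path('0001', 2, 1)       # -> '00010001' (first child of '0001')
#   cls._inc_path('00010001')         # -> '00010002' (next sibling)
#   cls._inc_path('0001000Z')         # -> '00010010' (carry into next digit)
#   cls._get_basepath('00010002', 1)  # -> '0001'     (the parent path)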
@classmethod
def _get_lastpos_in_path(cls, path):
""":returns: The integer value of the last step in a path."""
return cls._str2int(path[-cls.steplen:])
@classmethod
def _get_parent_path_from_path(cls, path):
""":returns: The parent path for a given path"""
if path:
return path[0:len(path) - cls.steplen]
return ''
@classmethod
def _get_children_path_interval(cls, path):
""":returns: An interval of all possible children paths for a node."""
return (path + cls.alphabet[0] * cls.steplen,
path + cls.alphabet[-1] * cls.steplen)
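# Worked example (illustration only, default steplen=4 and base-36
# alphabet): for a node with path '0001' the interval is
# ('00010000', '0001ZZZZ'), which get_children() uses above in a
# path__range filter to match every possible direct child path.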
@classmethod
def _move_add_sibling_aux(cls, pos, newpos, newdepth, target, siblings,
stmts, oldpath=None, movebranch=False):
"""
Handles the reordering of nodes and branches when adding/moving
nodes.
:returns: A tuple containing the old path and the new path.
"""
if (
(pos == 'last-sibling') or
(pos == 'right' and target == target.get_last_sibling())
):
# easy, the last node
last = target.get_last_sibling()
newpath = cls._inc_path(last.path)
if movebranch:
stmts.append(cls._get_sql_newpath_in_branches(oldpath,
newpath))
else:
# do the UPDATE dance
if newpos is None:
siblings = target.get_siblings()
siblings = {'left': siblings.filter(path__gte=target.path),
'right': siblings.filter(path__gt=target.path),
'first-sibling': siblings}[pos]
basenum = cls._get_lastpos_in_path(target.path)
newpos = {'first-sibling': 1,
'left': basenum,
'right': basenum + 1}[pos]
newpath = cls._get_path(target.path, newdepth, newpos)
# If the move is amongst siblings, is to the left, and there
# are siblings to the right of its new position, then to be on
# the safe side we temporarily dump it at the end of the list
tempnewpath = None
if movebranch and len(oldpath) == len(newpath):
parentoldpath = cls._get_basepath(
oldpath,
int(len(oldpath) / cls.steplen) - 1
)
parentnewpath = cls._get_basepath(newpath, newdepth - 1)
if (
parentoldpath == parentnewpath and
siblings and
newpath < oldpath
):
last = target.get_last_sibling()
basenum = cls._get_lastpos_in_path(last.path)
tempnewpath = cls._get_path(newpath, newdepth, basenum + 2)
stmts.append(cls._get_sql_newpath_in_branches(oldpath,
tempnewpath))
# Optimisation to only move siblings which need moving
# (i.e. if we've got holes, allow them to compress)
movesiblings = []
priorpath = newpath
for node in siblings:
# If the path of the node is already greater than the path
# of the previous node it doesn't need shifting
if node.path > priorpath:
break
# It does need shifting, so add to the list
movesiblings.append(node)
# Calculate the path that it would be moved to, as that's
# the next "priorpath"
priorpath = cls._inc_path(node.path)
movesiblings.reverse()
for node in movesiblings:
# moving the siblings (and their branches) at the right of the
# related position one step to the right
sql, vals = cls._get_sql_newpath_in_branches(node.path,
cls._inc_path(
node.path))
stmts.append((sql, vals))
if movebranch:
if oldpath.startswith(node.path):
# if moving to a parent, update oldpath since we just
# increased the path of the entire branch
oldpath = vals[0] + oldpath[len(vals[0]):]
if target.path.startswith(node.path):
# and if we moved the target, update the object
# django made for us, since the update won't do it
# maybe useful in loops
target.path = vals[0] + target.path[len(vals[0]):]
if movebranch:
# node to move
if tempnewpath:
stmts.append(cls._get_sql_newpath_in_branches(tempnewpath,
newpath))
else:
stmts.append(cls._get_sql_newpath_in_branches(oldpath,
newpath))
return oldpath, newpath
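# Worked sketch (illustration only, default steplen=4): adding a node to
# the 'left' of a target whose path is '00010002', with an existing
# sibling '00010003', computes newpath='00010002' and shifts the affected
# siblings right-to-left ('00010003' -> '00010004', then
# '00010002' -> '00010003') so the new path is free and no UPDATE
# collides with an existing path.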
def _fix_move_to_child(self, pos, target):
"""Update preliminar vars in :meth:`move` when moving to a child"""
newdepth = target.depth
newpos = None
siblings = []
if pos in ('first-child', 'last-child', 'sorted-child'):
# moving to a child
parent = target
newdepth += 1
if target.is_leaf():
# moving as a target's first child
newpos = 1
pos = 'first-sibling'
siblings = self.__class__.objects.none()
else:
target = target.get_last_child()
pos = {'first-child': 'first-sibling',
'last-child': 'last-sibling',
'sorted-child': 'sorted-sibling'}[pos]
# this is not meant for save(): if needed, it will be handled with a
# custom UPDATE; this is only here to keep django's in-memory object
# in sync, which should be useful in loops
parent.numchild += 1
return pos, target, newdepth, siblings, newpos
@classmethod
def _updates_after_move(cls, oldpath, newpath, stmts):
"""
Updates the list of sql statements needed after moving nodes.
1. :attr:`depth` updates *ONLY* needed by mysql databases (*sigh*)
2. update the number of children of parent nodes
"""
if (
cls.get_database_vendor('write') == 'mysql' and
len(oldpath) != len(newpath)
):
# no words can describe how dumb mysql is
# we must update the depth of the branch in a different query
stmts.append(cls._get_sql_update_depth_in_branch(newpath))
oldparentpath = cls._get_parent_path_from_path(oldpath)
newparentpath = cls._get_parent_path_from_path(newpath)
if (not oldparentpath and newparentpath) or\
(oldparentpath and not newparentpath) or\
(oldparentpath != newparentpath):
# node changed parent, updating count
if oldparentpath:
stmts.append(cls._get_sql_update_numchild(oldparentpath,
'dec'))
if newparentpath:
stmts.append(cls._get_sql_update_numchild(newparentpath,
'inc'))
@classmethod
def _get_sql_newpath_in_branches(cls, oldpath, newpath):
"""
:returns" The sql needed to move a branch to another position.
.. note::
The generated sql will only update the depth values if needed.
"""
vendor = cls.get_database_vendor('write')
sql1 = "UPDATE %s SET" % (
connection.ops.quote_name(cls._meta.db_table), )
# <3 "standard" sql
if vendor == 'sqlite':
# I know that the third argument in SUBSTR (LENGTH(path)) is
# awful, but sqlite fails without it:
# OperationalError: wrong number of arguments to function substr()
# even when the documentation says that 2 arguments are valid:
# http://www.sqlite.org/lang_corefunc.html
sqlpath = "%s||SUBSTR(path, %s, LENGTH(path))"
elif vendor == 'mysql':
# hooray for mysql ignoring standards in their default
# configuration!
# to make || work as it should, enable ansi mode
# http://dev.mysql.com/doc/refman/5.0/en/ansi-mode.html
sqlpath = "CONCAT(%s, SUBSTR(path, %s))"
else:
sqlpath = "%s||SUBSTR(path, %s)"
sql2 = ["path=%s" % (sqlpath, )]
vals = [newpath, len(oldpath) + 1]
if len(oldpath) != len(newpath) and vendor != 'mysql':
# when using mysql, this won't update the depth and it has to be
# done in another query
# doesn't even work with sql_mode='ANSI,TRADITIONAL'
# TODO: FIND OUT WHY?!?? right now I'm just blaming mysql
sql2.append("depth=LENGTH(%s)/%%s" % (sqlpath, ))
vals.extend([newpath, len(oldpath) + 1, cls.steplen])
sql3 = "WHERE path LIKE %s"
vals.extend([oldpath + '%'])
sql = '%s %s %s' % (sql1, ', '.join(sql2), sql3)
return sql, vals
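# Example of the generated SQL (illustration only; "myapp_category" is a
# hypothetical table, vendor is postgresql, oldpath='0002',
# newpath='00010002', steplen=4):
#
#   UPDATE "myapp_category"
#   SET path=%s||SUBSTR(path, %s), depth=LENGTH(%s||SUBSTR(path, %s))/%s
#   WHERE path LIKE %s
#
# with vals = ['00010002', 5, '00010002', 5, 4, '0002%'], so the whole
# '0002' branch is re-rooted under '0001' and its depth recomputed.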
@classmethod
def _get_sql_update_depth_in_branch(cls, path):
"""
:returns: The sql needed to update the depth of all the nodes in a
branch.
"""
# Right now this is only used by *sigh* mysql.
sql = "UPDATE %s SET depth=LENGTH(path)/%%s"\
" WHERE path LIKE %%s" % (
connection.ops.quote_name(cls._meta.db_table), )
vals = [cls.steplen, path + '%']
return sql, vals
@classmethod
def _get_sql_update_numchild(cls, path, incdec='inc'):
""":returns: The sql needed the numchild value of a node"""
sql = "UPDATE %s SET numchild=numchild%s1"\
" WHERE path=%%s" % (
connection.ops.quote_name(cls._meta.db_table),
{'inc': '+', 'dec': '-'}[incdec])
vals = [path]
return sql, vals
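# Example of the generated SQL (illustration only; "myapp_category" is a
# hypothetical table):
#
#   cls._get_sql_update_numchild('0001', 'inc')
#   # -> ('UPDATE "myapp_category" SET numchild=numchild+1 WHERE path=%s',
#   #     ['0001'])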
class Meta:
"""Abstract model."""
abstract = True
django-treebeard-2.0b1/treebeard/ns_tree.py 0000644 0000765 0000024 00000053327 12151051647 021025 0 ustar tabo staff 0000000 0000000 """Nested Sets"""
import sys
import operator
if sys.version_info >= (3, 0):
from functools import reduce
from django.core import serializers
from django.db import connection, models, transaction
from django.db.models import Q
from django.utils.translation import ugettext_noop as _
from treebeard.exceptions import InvalidMoveToDescendant
from treebeard.models import Node
class NS_NodeQuerySet(models.query.QuerySet):
"""
Custom queryset for the tree node manager.
Needed only for the customized delete method.
"""
def delete(self, removed_ranges=None):
"""
Custom delete method, will remove all descendant nodes to ensure a
consistent tree (no orphans)
:returns: ``None``
"""
if removed_ranges is not None:
# we already know the children, let's call the default django
# delete method and let it handle the removal of the user's
# foreign keys...
super(NS_NodeQuerySet, self).delete()
cursor = self.model._get_database_cursor('write')
# Now closing the gap (Celko's trees book, page 62)
# We do this for every gap that was left in the tree when the nodes
# were removed. If many nodes were removed, we're going to update
the same nodes over and over again. It would probably be
cheaper to precalculate the gapsize per interval, or to do a
complete reordering of the tree (uses COUNT)...
for tree_id, drop_lft, drop_rgt in sorted(removed_ranges,
reverse=True):
sql, params = self.model._get_close_gap_sql(drop_lft, drop_rgt,
tree_id)
cursor.execute(sql, params)
else:
# we'll have to manually run through all the nodes that are going
# to be deleted and remove nodes from the list if an ancestor is
# already getting removed, since that would be redundant
removed = {}
for node in self.order_by('tree_id', 'lft'):
found = False
for rid, rnode in removed.items():
if node.is_descendant_of(rnode):
found = True
break
if not found:
removed[node.pk] = node
# ok, got the minimal list of nodes to remove...
# we must also remove their descendants
toremove = []
ranges = []
for id, node in removed.items():
toremove.append(Q(lft__range=(node.lft, node.rgt)) &
Q(tree_id=node.tree_id))
ranges.append((node.tree_id, node.lft, node.rgt))
if toremove:
self.model.objects.filter(
reduce(operator.or_,
toremove)
).delete(removed_ranges=ranges)
transaction.commit_unless_managed()
class NS_NodeManager(models.Manager):
""" Custom manager for nodes.
"""
def get_query_set(self):
"""Sets the custom queryset as the default."""
return NS_NodeQuerySet(self.model).order_by('tree_id', 'lft')
class NS_Node(Node):
"""Abstract model to create your own Nested Sets Trees."""
node_order_by = []
lft = models.PositiveIntegerField(db_index=True)
rgt = models.PositiveIntegerField(db_index=True)
tree_id = models.PositiveIntegerField(db_index=True)
depth = models.PositiveIntegerField(db_index=True)
objects = NS_NodeManager()
@classmethod
def add_root(cls, **kwargs):
"""Adds a root node to the tree."""
# do we have a root node already?
last_root = cls.get_last_root_node()
if last_root and last_root.node_order_by:
# there are root nodes and node_order_by has been set
# delegate sorted insertion to add_sibling
return last_root.add_sibling('sorted-sibling', **kwargs)
if last_root:
# adding the new root node as the last one
newtree_id = last_root.tree_id + 1
else:
# adding the first root node
newtree_id = 1
# creating the new object
newobj = cls(**kwargs)
newobj.depth = 1
newobj.tree_id = newtree_id
newobj.lft = 1
newobj.rgt = 2
# saving the instance before returning it
newobj.save()
transaction.commit_unless_managed()
return newobj
@classmethod
def _move_right(cls, tree_id, rgt, lftmove=False, incdec=2):
if lftmove:
lftop = '>='
else:
lftop = '>'
sql = 'UPDATE %(table)s '\
' SET lft = CASE WHEN lft %(lftop)s %(parent_rgt)d '\
' THEN lft %(incdec)+d '\
' ELSE lft END, '\
' rgt = CASE WHEN rgt >= %(parent_rgt)d '\
' THEN rgt %(incdec)+d '\
' ELSE rgt END '\
' WHERE rgt >= %(parent_rgt)d AND '\
' tree_id = %(tree_id)s' % {
'table': connection.ops.quote_name(cls._meta.db_table),
'parent_rgt': rgt,
'tree_id': tree_id,
'lftop': lftop,
'incdec': incdec}
return sql, []
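# Worked sketch (illustration only): to open a 2-unit gap at rgt=6 in
# tree_id=1 (e.g. before adding a first child to a node whose rgt is 6),
# _move_right(1, 6, False, 2) produces an UPDATE that adds 2 to every
# lft > 6 and every rgt >= 6 in that tree, leaving positions 6 and 7
# free for the new node's lft/rgt.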
@classmethod
def _move_tree_right(cls, tree_id):
sql = 'UPDATE %(table)s '\
' SET tree_id = tree_id+1 '\
' WHERE tree_id >= %(tree_id)d' % {
'table': connection.ops.quote_name(cls._meta.db_table),
'tree_id': tree_id}
return sql, []
def add_child(self, **kwargs):
"""Adds a child to the node."""
if not self.is_leaf():
# there are child nodes, delegate insertion to add_sibling
if self.node_order_by:
pos = 'sorted-sibling'
else:
pos = 'last-sibling'
last_child = self.get_last_child()
last_child._cached_parent_obj = self
return last_child.add_sibling(pos, **kwargs)
# we're adding the first child of this node
sql, params = self.__class__._move_right(self.tree_id,
self.rgt, False, 2)
# creating a new object
newobj = self.__class__(**kwargs)
newobj.tree_id = self.tree_id
newobj.depth = self.depth + 1
newobj.lft = self.lft + 1
newobj.rgt = self.lft + 2
# this is just to update the cache
self.rgt += 2
newobj._cached_parent_obj = self
cursor = self._get_database_cursor('write')
cursor.execute(sql, params)
# saving the instance before returning it
newobj.save()
transaction.commit_unless_managed()
return newobj
def add_sibling(self, pos=None, **kwargs):
"""Adds a new node as a sibling to the current node object."""
pos = self._prepare_pos_var_for_add_sibling(pos)
# creating a new object
newobj = self.__class__(**kwargs)
newobj.depth = self.depth
sql = None
target = self
if target.is_root():
newobj.lft = 1
newobj.rgt = 2
if pos == 'sorted-sibling':
siblings = list(target.get_sorted_pos_queryset(
target.get_siblings(), newobj))
if siblings:
pos = 'left'
target = siblings[0]
else:
pos = 'last-sibling'
last_root = target.__class__.get_last_root_node()
if (
(pos == 'last-sibling') or
(pos == 'right' and target == last_root)
):
newobj.tree_id = last_root.tree_id + 1
else:
newpos = {'first-sibling': 1,
'left': target.tree_id,
'right': target.tree_id + 1}[pos]
sql, params = target.__class__._move_tree_right(newpos)
newobj.tree_id = newpos
else:
newobj.tree_id = target.tree_id
if pos == 'sorted-sibling':
siblings = list(target.get_sorted_pos_queryset(
target.get_siblings(), newobj))
if siblings:
pos = 'left'
target = siblings[0]
else:
pos = 'last-sibling'
if pos in ('left', 'right', 'first-sibling'):
siblings = list(target.get_siblings())
if pos == 'right':
if target == siblings[-1]:
pos = 'last-sibling'
else:
pos = 'left'
found = False
for node in siblings:
if found:
target = node
break
elif node == target:
found = True
if pos == 'left':
if target == siblings[0]:
pos = 'first-sibling'
if pos == 'first-sibling':
target = siblings[0]
move_right = self.__class__._move_right
if pos == 'last-sibling':
newpos = target.get_parent().rgt
sql, params = move_right(target.tree_id, newpos, False, 2)
elif pos == 'first-sibling':
newpos = target.lft
sql, params = move_right(target.tree_id, newpos - 1, False, 2)
elif pos == 'left':
newpos = target.lft
sql, params = move_right(target.tree_id, newpos, True, 2)
newobj.lft = newpos
newobj.rgt = newpos + 1
# saving the instance before returning it
if sql:
cursor = self._get_database_cursor('write')
cursor.execute(sql, params)
newobj.save()
transaction.commit_unless_managed()
return newobj
def move(self, target, pos=None):
"""
Moves the current node and all its descendants to a new position
relative to another node.
"""
pos = self._prepare_pos_var_for_move(pos)
cls = self.__class__
parent = None
if pos in ('first-child', 'last-child', 'sorted-child'):
# moving to a child
if target.is_leaf():
parent = target
pos = 'last-child'
else:
target = target.get_last_child()
pos = {'first-child': 'first-sibling',
'last-child': 'last-sibling',
'sorted-child': 'sorted-sibling'}[pos]
if target.is_descendant_of(self):
raise InvalidMoveToDescendant(
_("Can't move node to a descendant."))
if self == target and (
(pos == 'left') or
(pos in ('right', 'last-sibling') and
target == target.get_last_sibling()) or
(pos == 'first-sibling' and
target == target.get_first_sibling())):
# special cases, not actually moving the node so no need to UPDATE
return
if pos == 'sorted-sibling':
siblings = list(target.get_sorted_pos_queryset(
target.get_siblings(), self))
if siblings:
pos = 'left'
target = siblings[0]
else:
pos = 'last-sibling'
if pos in ('left', 'right', 'first-sibling'):
siblings = list(target.get_siblings())
if pos == 'right':
if target == siblings[-1]:
pos = 'last-sibling'
else:
pos = 'left'
found = False
for node in siblings:
if found:
target = node
break
elif node == target:
found = True
if pos == 'left':
if target == siblings[0]:
pos = 'first-sibling'
if pos == 'first-sibling':
target = siblings[0]
# ok let's move this
cursor = self._get_database_cursor('write')
move_right = cls._move_right
gap = self.rgt - self.lft + 1
sql = None
target_tree = target.tree_id
# first make a hole
if pos == 'last-child':
newpos = parent.rgt
sql, params = move_right(target.tree_id, newpos, False, gap)
elif target.is_root():
newpos = 1
if pos == 'last-sibling':
target_tree = target.get_siblings().reverse()[0].tree_id + 1
elif pos == 'first-sibling':
target_tree = 1
sql, params = cls._move_tree_right(1)
elif pos == 'left':
sql, params = cls._move_tree_right(target.tree_id)
else:
if pos == 'last-sibling':
newpos = target.get_parent().rgt
sql, params = move_right(target.tree_id, newpos, False, gap)
elif pos == 'first-sibling':
newpos = target.lft
sql, params = move_right(target.tree_id,
newpos - 1, False, gap)
elif pos == 'left':
newpos = target.lft
sql, params = move_right(target.tree_id, newpos, True, gap)
if sql:
cursor.execute(sql, params)
# we reload 'self' because lft/rgt may have changed
fromobj = cls.objects.get(pk=self.pk)
depthdiff = target.depth - fromobj.depth
if parent:
depthdiff += 1
# move the tree to the hole
sql = "UPDATE %(table)s "\
" SET tree_id = %(target_tree)d, "\
" lft = lft + %(jump)d , "\
" rgt = rgt + %(jump)d , "\
" depth = depth + %(depthdiff)d "\
" WHERE tree_id = %(from_tree)d AND "\
" lft BETWEEN %(fromlft)d AND %(fromrgt)d" % {
'table': connection.ops.quote_name(cls._meta.db_table),
'from_tree': fromobj.tree_id,
'target_tree': target_tree,
'jump': newpos - fromobj.lft,
'depthdiff': depthdiff,
'fromlft': fromobj.lft,
'fromrgt': fromobj.rgt}
cursor.execute(sql, [])
# close the gap
sql, params = cls._get_close_gap_sql(fromobj.lft,
fromobj.rgt, fromobj.tree_id)
cursor.execute(sql, params)
transaction.commit_unless_managed()
@classmethod
def _get_close_gap_sql(cls, drop_lft, drop_rgt, tree_id):
sql = 'UPDATE %(table)s '\
' SET lft = CASE '\
' WHEN lft > %(drop_lft)d '\
' THEN lft - %(gapsize)d '\
' ELSE lft END, '\
' rgt = CASE '\
' WHEN rgt > %(drop_lft)d '\
' THEN rgt - %(gapsize)d '\
' ELSE rgt END '\
' WHERE (lft > %(drop_lft)d '\
' OR rgt > %(drop_lft)d) AND '\
' tree_id=%(tree_id)d' % {
'table': connection.ops.quote_name(cls._meta.db_table),
'gapsize': drop_rgt - drop_lft + 1,
'drop_lft': drop_lft,
'tree_id': tree_id}
return sql, []
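# Worked sketch (illustration only): after removing a branch that
# occupied lft=4..rgt=7 in tree_id=1, the gap size is 7 - 4 + 1 = 4, and
# the UPDATE above subtracts 4 from every lft > 4 and every rgt > 4 in
# that tree, restoring a contiguous nested-sets numbering.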
@classmethod
def load_bulk(cls, bulk_data, parent=None, keep_ids=False):
"""Loads a list/dictionary structure to the tree."""
# tree, iterative preorder
added = []
if parent:
parent_id = parent.pk
else:
parent_id = None
# stack of nodes to analyze
stack = [(parent_id, node) for node in bulk_data[::-1]]
foreign_keys = cls.get_foreign_keys()
while stack:
parent_id, node_struct = stack.pop()
# shallow copy of the data structure so it doesn't persist...
node_data = node_struct['data'].copy()
cls._process_foreign_keys(foreign_keys, node_data)
if keep_ids:
node_data['id'] = node_struct['id']
if parent_id:
parent = cls.objects.get(pk=parent_id)
node_obj = parent.add_child(**node_data)
else:
node_obj = cls.add_root(**node_data)
added.append(node_obj.pk)
if 'children' in node_struct:
# extending the stack with the current node as the parent of
# the new nodes
stack.extend([
(node_obj.pk, node)
for node in node_struct['children'][::-1]
])
transaction.commit_unless_managed()
return added
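# Minimal sketch of the expected bulk_data structure (illustration only;
# ``MyNode`` is a hypothetical concrete NS_Node subclass with a ``desc``
# field):
#
#   data = [
#       {'data': {'desc': '1'}},
#       {'data': {'desc': '2'}, 'children': [
#           {'data': {'desc': '21'}},
#       ]},
#   ]
#   MyNode.load_bulk(data)   # -> list of the pks of the nodes added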
def get_children(self):
""":returns: A queryset of all the node's children"""
return self.get_descendants().filter(depth=self.depth + 1)
def get_depth(self):
""":returns: the depth (level) of the node"""
return self.depth
def is_leaf(self):
""":returns: True if the node is a leaf node (else, returns False)"""
return self.rgt - self.lft == 1
def get_root(self):
""":returns: the root node for the current node object."""
if self.lft == 1:
return self
return self.__class__.objects.get(tree_id=self.tree_id, lft=1)
def is_root(self):
""":returns: True if the node is a root node (else, returns False)"""
return self.lft == 1
def get_siblings(self):
"""
:returns: A queryset of all the node's siblings, including the node
itself.
"""
if self.lft == 1:
return self.get_root_nodes()
return self.get_parent(True).get_children()
@classmethod
def dump_bulk(cls, parent=None, keep_ids=True):
"""Dumps a tree branch to a python data structure."""
qset = cls._get_serializable_model().get_tree(parent)
ret, lnk = [], {}
for pyobj in qset:
serobj = serializers.serialize('python', [pyobj])[0]
# django's serializer stores the attributes in 'fields'
fields = serobj['fields']
depth = fields['depth']
# this will be useless in load_bulk
del fields['lft']
del fields['rgt']
del fields['depth']
del fields['tree_id']
if 'id' in fields:
# this happens immediately after a load_bulk
del fields['id']
newobj = {'data': fields}
if keep_ids:
newobj['id'] = serobj['pk']
if (not parent and depth == 1) or\
(parent and depth == parent.depth):
ret.append(newobj)
else:
parentobj = pyobj.get_parent()
parentser = lnk[parentobj.pk]
if 'children' not in parentser:
parentser['children'] = []
parentser['children'].append(newobj)
lnk[pyobj.pk] = newobj
return ret
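# Sketch of the returned structure (illustration only): dump_bulk()
# mirrors the load_bulk() input format, e.g.
#
#   [{'id': 1, 'data': {'desc': '1'}},
#    {'id': 2, 'data': {'desc': '2'},
#     'children': [{'id': 3, 'data': {'desc': '21'}}]}]
#
# with the 'id' keys present only when keep_ids is True.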
@classmethod
def get_tree(cls, parent=None):
"""
:returns:
A *queryset* of nodes ordered as DFS, including the parent.
If no parent is given, all trees are returned.
"""
if parent is None:
# return the entire tree
return cls.objects.all()
if parent.is_leaf():
return cls.objects.filter(pk=parent.pk)
return cls.objects.filter(
tree_id=parent.tree_id,
lft__range=(parent.lft, parent.rgt - 1))
def get_descendants(self):
"""
:returns: A queryset of all the node's descendants, in DFS order; doesn't
include the node itself
"""
if self.is_leaf():
return self.__class__.objects.none()
return self.__class__.get_tree(self).exclude(pk=self.pk)
def get_descendant_count(self):
""":returns: the number of descendants of a node."""
return (self.rgt - self.lft - 1) / 2
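# Worked example (illustration only): a leaf has rgt - lft == 1, so the
# count is (1 - 1) / 2 = 0; a node with lft=2 and rgt=7 encloses the
# numbers 3..6, i.e. two descendants, and (7 - 2 - 1) / 2 = 2.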
def get_ancestors(self):
"""
:returns: A queryset containing the current node object's ancestors,
starting with the root node and descending to the parent.
"""
if self.is_root():
return self.__class__.objects.none()
return self.__class__.objects.filter(
tree_id=self.tree_id,
lft__lt=self.lft,
rgt__gt=self.rgt)
def is_descendant_of(self, node):
"""
:returns: ``True`` if the node is a descendant of another node given
as an argument, else, returns ``False``
"""
return (
self.tree_id == node.tree_id and
self.lft > node.lft and
self.rgt < node.rgt
)
def get_parent(self, update=False):
"""
:returns: the parent node of the current node object.
Caches the result in the object itself to help in loops.
"""
if self.is_root():
return
try:
if update:
del self._cached_parent_obj
else:
return self._cached_parent_obj
except AttributeError:
pass
# parent = our most direct ancestor
self._cached_parent_obj = self.get_ancestors().reverse()[0]
return self._cached_parent_obj
@classmethod
def get_root_nodes(cls):
""":returns: A queryset containing the root nodes in the tree."""
return cls.objects.filter(lft=1)
class Meta:
"""Abstract model."""
abstract = True
django-treebeard-2.0b1/treebeard/numconv.py 0000644 0000765 0000024 00000007513 12150510123 021033 0 ustar tabo staff 0000000 0000000 """Convert strings to numbers and numbers to strings.
Gustavo Picon
https://tabo.pe/projects/numconv/
"""
__version__ = '2.1.1'
# from april fool's rfc 1924
BASE85 = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz' \
'!#$%&()*+-;<=>?@^_`{|}~'
# rfc4648 alphabets
BASE16 = BASE85[:16]
BASE32 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ234567'
BASE32HEX = BASE85[:32]
BASE64 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
BASE64URL = BASE64[:62] + '-_'
# http://en.wikipedia.org/wiki/Base_62 useful for url shorteners
BASE62 = BASE85[:62]
class NumConv(object):
"""Class to create converter objects.
:param radix: The base that will be used in the conversions.
The default value is 10 for decimal conversions.
:param alphabet: A string that will be used as an encoding alphabet.
The length of the alphabet can be longer than the radix. In this
case the alphabet will be internally truncated.
The default value is :data:`numconv.BASE85`
:raise TypeError: when *radix* isn't an integer
:raise ValueError: when *radix* is invalid
:raise ValueError: when *alphabet* has duplicated characters
"""
def __init__(self, radix=10, alphabet=BASE85):
"""basic validation and cached_map storage"""
if int(radix) != radix:
raise TypeError('radix must be an integer')
if not 2 <= radix <= len(alphabet):
raise ValueError('radix must be >= 2 and <= %d' % (
len(alphabet), ))
self.radix = radix
self.alphabet = alphabet
self.cached_map = dict(zip(self.alphabet, range(len(self.alphabet))))
if len(self.cached_map) != len(self.alphabet):
raise ValueError("duplicate characters found in '%s'" % (
self.alphabet, ))
def int2str(self, num):
"""Converts an integer into a string.
:param num: A numeric value to be converted to another base as a
string.
:rtype: string
:raise TypeError: when *num* isn't an integer
:raise ValueError: when *num* isn't positive
"""
if int(num) != num:
raise TypeError('number must be an integer')
if num < 0:
raise ValueError('number must be positive')
radix, alphabet = self.radix, self.alphabet
if radix in (8, 10, 16) and \
alphabet[:radix].lower() == BASE85[:radix].lower():
return ({8: '%o', 10: '%d', 16: '%x'}[radix] % num).upper()
ret = ''
while True:
ret = alphabet[num % radix] + ret
if num < radix:
break
num //= radix
return ret
def str2int(self, num):
"""Converts a string into an integer.
If possible, the built-in python conversion will be used for speed
purposes.
:param num: A string that will be converted to an integer.
:rtype: integer
:raise ValueError: when *num* is invalid
"""
radix, alphabet = self.radix, self.alphabet
if radix <= 36 and alphabet[:radix].lower() == BASE85[:radix].lower():
return int(num, radix)
ret = 0
lalphabet = alphabet[:radix]
for char in num:
if char not in lalphabet:
raise ValueError("invalid literal for radix2int() with radix "
"%d: '%s'" % (radix, num))
ret = ret * radix + self.cached_map[char]
return ret
def int2str(num, radix=10, alphabet=BASE85):
"""helper function for quick base conversions from integers to strings"""
return NumConv(radix, alphabet).int2str(num)
def str2int(num, radix=10, alphabet=BASE85):
"""helper function for quick base conversions from strings to integers"""
return NumConv(radix, alphabet).str2int(num)
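# Minimal usage sketch (illustration only):
#
#   NumConv(16).int2str(3735928559)        # -> 'DEADBEEF'
#   NumConv(36).str2int('A')               # -> 10
#   int2str(255, 16)                       # -> 'FF'
#   str2int('FF', 16)                      # -> 255
#   NumConv(2, alphabet='OI').int2str(5)   # -> 'IOI' (custom alphabet)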
django-treebeard-2.0b1/treebeard/static/ 0000755 0000765 0000024 00000000000 12151313146 020264 5 ustar tabo staff 0000000 0000000 django-treebeard-2.0b1/treebeard/static/treebeard/ 0000755 0000765 0000024 00000000000 12151313146 022221 5 ustar tabo staff 0000000 0000000 django-treebeard-2.0b1/treebeard/static/treebeard/expand-collapse.png 0000644 0000765 0000024 00000001774 12150510123 026010 0 ustar tabo staff 0000000 0000000 [binary PNG data omitted]