././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8720984 drf-haystack-1.8.13/0000775000175000017500000000000014507202601012361 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/LICENSE.txt0000664000175000017500000000206414476342707014227 0ustar00mmmmmmThe MIT License (MIT) Copyright (c) 2015 Inonit AS Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/MANIFEST.in0000664000175000017500000000024114476342707014135 0ustar00mmmmmminclude README.md include LICENSE.txt include Pipfile include tox.ini recursive-include docs Makefile *.rst *.py *.bat recursive-include tests *.py *.json *.txt ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8720984 drf-haystack-1.8.13/PKG-INFO0000644000175000017500000000175014507202601013457 0ustar00mmmmmmMetadata-Version: 2.1 Name: drf-haystack Version: 1.8.13 Summary: Makes Haystack play nice with Django REST Framework Home-page: https://github.com/rhblind/drf-haystack Download-URL: https://github.com/rhblind/drf-haystack.git Author: Rolf Håvard Blindheim Author-email: rhblind@gmail.com License: MIT License Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Libraries :: Python Modules Requires-Python: >=3.7 License-File: LICENSE.txt Requires-Dist: Django<4.3,>=2.2 Requires-Dist: djangorestframework<=3.14,>=3.7 Requires-Dist: django-haystack<=3.2.1,>=2.8 Requires-Dist: python-dateutil Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack. 
drf-haystack-1.8.13/Pipfile

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = ">=2.2,<=4.2.3"
django-haystack = ">=2.8,<=3.2.1"
djangorestframework = ">=3.7.0,<=3.14"
python-dateutil = "*"

[dev-packages]
coverage = "*"
nose = "*"
sphinx = "*"
sphinx-rtd-theme = "*"
"urllib3" = "*"
geopy = "*"
tox = "*"
wheel = "*"
elasticsearch = ">=2.0.0,<=8.3.3"

[requires]
python_version = "3"

drf-haystack-1.8.13/README.md

Haystack for Django REST Framework
==================================

Build status
------------

[![Build Status](https://travis-ci.org/rhblind/drf-haystack.svg?branch=master)](https://travis-ci.org/rhblind/drf-haystack)
[![Coverage Status](https://coveralls.io/repos/github/rhblind/drf-haystack/badge.svg?branch=master)](https://coveralls.io/github/rhblind/drf-haystack?branch=master)
[![PyPI version](https://badge.fury.io/py/drf-haystack.svg)](https://badge.fury.io/py/drf-haystack)
[![Documentation Status](https://readthedocs.org/projects/drf-haystack/badge/?version=latest)](http://drf-haystack.readthedocs.io/en/latest/?badge=latest)

About
-----
A small library which simplifies integrating Haystack with Django REST Framework.
Fresh [documentation available](https://drf-haystack.readthedocs.io/en/latest/) on Read the Docs!

Supported versions
------------------
- Python 3.7 and above
- Django >=2.2,<4.3
- Haystack >=2.8,<=3.2.1
- Django REST Framework >=3.7.0,<=3.14
- elasticsearch >=2.0.0,<=8.3.3

Installation
------------

    $ pip install drf-haystack

Supported features
------------------
We aim to support most features Haystack does (or at least those which can be used in a REST API).
Currently, we support:

- Autocomplete
- Boost (Experimental)
- Faceting
- Geo Spatial Search
- Highlighting
- More Like This

Show me more!
-------------

```python
from drf_haystack.serializers import HaystackSerializer
from drf_haystack.viewsets import HaystackViewSet

from myapp.models import Person
from myapp.search_indexes import PersonIndex  # You would define this Index normally as per Haystack's documentation

# Serializer
class PersonSearchSerializer(HaystackSerializer):

    class Meta:
        index_classes = [PersonIndex]
        fields = ["firstname", "lastname", "full_name"]

# ViewSet
class PersonSearchViewSet(HaystackViewSet):
    index_models = [Person]
    serializer_class = PersonSearchSerializer
```

That's it, you're good to go. Hook it up to a DRF router and happy searching!
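The README assumes a working Haystack setup. For completeness, a minimal configuration might look like the sketch below; the Whoosh backend, the index path and the app list are placeholder choices for this example, and any Haystack-supported backend will do.

```python
# settings.py (sketch) -- backend choice and paths are placeholders
import os

BASE_DIR = os.path.dirname(os.path.abspath(__file__))

INSTALLED_APPS = [
    # ...
    "haystack",         # django-haystack itself
    "rest_framework",   # Django REST Framework
    # plus your own apps containing the models and search indexes
]

HAYSTACK_CONNECTIONS = {
    "default": {
        "ENGINE": "haystack.backends.whoosh_backend.WhooshEngine",
        "PATH": os.path.join(BASE_DIR, "whoosh_index"),
    },
}
```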
drf-haystack-1.8.13/docs/01_intro.rst

.. _basic-usage-label:

===========
Basic Usage
===========

Usage is best demonstrated with some simple examples.

.. warning::

    The code here is for demonstration purposes only! It might work (or not, I haven't tested),
    but as always, don't blindly copy code from the internet.

Examples
========

models.py
---------

Let's say we have an app which contains a model `Location`. It could look something like this.

.. code-block:: python

    #
    # models.py
    #

    from django.db import models
    from haystack.utils.geo import Point


    class Location(models.Model):

        latitude = models.FloatField()
        longitude = models.FloatField()
        address = models.CharField(max_length=100)
        city = models.CharField(max_length=30)
        zip_code = models.CharField(max_length=10)

        created = models.DateTimeField(auto_now_add=True)
        updated = models.DateTimeField(auto_now=True)

        def __str__(self):
            return self.address

        @property
        def coordinates(self):
            return Point(self.longitude, self.latitude)

.. _search-index-example-label:

search_indexes.py
-----------------

We would have to make a ``search_indexes.py`` file for haystack to pick it up.

.. code-block:: python

    #
    # search_indexes.py
    #

    from django.utils import timezone
    from haystack import indexes
    from .models import Location


    class LocationIndex(indexes.SearchIndex, indexes.Indexable):

        text = indexes.CharField(document=True, use_template=True)
        address = indexes.CharField(model_attr="address")
        city = indexes.CharField(model_attr="city")
        zip_code = indexes.CharField(model_attr="zip_code")

        autocomplete = indexes.EdgeNgramField()
        coordinates = indexes.LocationField(model_attr="coordinates")

        @staticmethod
        def prepare_autocomplete(obj):
            return " ".join((
                obj.address, obj.city, obj.zip_code
            ))

        def get_model(self):
            return Location

        def index_queryset(self, using=None):
            return self.get_model().objects.filter(
                created__lte=timezone.now()
            )

views.py
--------

For a generic Django REST Framework view, you could do something like this.

.. code-block:: python

    #
    # views.py
    #

    from drf_haystack.serializers import HaystackSerializer
    from drf_haystack.viewsets import HaystackViewSet

    from .models import Location
    from .search_indexes import LocationIndex


    class LocationSerializer(HaystackSerializer):

        class Meta:
            # The `index_classes` attribute is a list of which search indexes
            # we want to include in the search.
            index_classes = [LocationIndex]

            # The `fields` contains all the fields we want to include.
            # NOTE: Make sure you don't confuse these with model attributes. These
            # fields belong to the search index!
            fields = [
                "text", "address", "city", "zip_code", "autocomplete"
            ]


    class LocationSearchView(HaystackViewSet):

        # `index_models` is an optional list of which models you would like to include
        # in the search result. You might have several models indexed, and this provides
        # a way to filter out those of no interest for this particular view.
        # (Translates to `SearchQuerySet().models(*index_models)` behind the scenes.
        index_models = [Location]

        serializer_class = LocationSerializer

urls.py
-------

Finally, hook up the views in your `urls.py` file.

.. note::

    Make sure you specify the `basename` attribute when wiring up the view in the router.
    Since we don't have any single `model` for the view, it is impossible for the router
    to automatically figure out the base name for the view.

.. code-block:: python

    #
    # urls.py
    #

    from django.conf.urls import patterns, url, include
    from rest_framework import routers

    from .views import LocationSearchView

    router = routers.DefaultRouter()
    router.register("location/search", LocationSearchView, basename="location-search")

    urlpatterns = patterns(
        "",
        url(r"/api/v1/", include(router.urls)),
    )
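With the router in place, a quick smoke test from Python can confirm the wiring end to end. The snippet below is only a sketch; the host, port and URL prefix are assumptions based on the examples above, and it requires a running dev server with a populated index (``manage.py rebuild_index``) plus the ``requests`` package.

.. code-block:: python

    # Smoke test against a local dev server (assumed to run at http://localhost:8000).
    import requests

    response = requests.get(
        "http://localhost:8000/api/v1/location/search/",
        params={"city": "Oslo"},  # query parameters are explained in the next section
    )
    response.raise_for_status()

    # Without pagination configured, the payload is a plain list of results.
    for result in response.json():
        print(result["address"], result["city"])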
Query time!
-----------

Now that we have a view wired up, we can start using it.

By default, the `HaystackViewSet` (which, more importantly, inherits the `HaystackGenericAPIView` class)
is set up to use the `HaystackFilter`. This is the most basic filter included and can do basic searches
by querying any of the fields included in the `fields` attribute on the `Serializer`.

.. code-block:: none

    http://example.com/api/v1/location/search/?city=Oslo

Would perform a query looking up all documents where the `city field` equals "Oslo".

Field Lookups
.............

You can also use field lookups in your field queries. See the Haystack `field lookups `_
documentation for info on what lookups are available. A query using a lookup might look like the following:

.. code-block:: none

    http://example.com/api/v1/location/search/?city__startswith=Os

This would perform a query looking up all documents where the `city field` starts with "Os".
You might get "Oslo", "Osaka", and "Ostrava".

Term Negation
.............

You can also specify terms to exclude from the search results using the negation keyword.
The default keyword is ``not``, but it is configurable via settings using ``DRF_HAYSTACK_NEGATION_KEYWORD``.

.. code-block:: none

    http://example.com/api/v1/location/search/?city__not=Oslo
    http://example.com/api/v1/location/search/?city__not__contains=Los
    http://example.com/api/v1/location/search/?city__contains=Los&city__not__contains=Angeles

drf-haystack-1.8.13/docs/02_autocomplete.rst

.. _autocomplete-label:

Autocomplete
============

Some kinds of data, such as cities and zip codes, are natural candidates for autocompletion.
We have a Django REST Framework filter for performing autocomplete queries. It works much like the regular
:class:`drf_haystack.filters.HaystackFilter`, but *must* be run against an ``NgramField`` or
``EdgeNgramField`` in order to work properly. The main difference is that while the HaystackFilter performs
a bitwise ``OR`` on terms for the same parameters, the :class:`drf_haystack.filters.HaystackAutocompleteFilter`
reduces query parameters down to a single filter (using an ``SQ`` object), and performs a bitwise ``AND``.

By adding a list or tuple of ``ignore_fields`` to the serializer's Meta class, we can tell the REST framework
to ignore these fields. This is handy in cases where you do not want to serialize and transfer the content
of a text, or n-gram index down to the client.

An example using the autocomplete filter might look something like this.

.. code-block:: python

    from drf_haystack.filters import HaystackAutocompleteFilter
    from drf_haystack.serializers import HaystackSerializer
    from drf_haystack.viewsets import HaystackViewSet


    class AutocompleteSerializer(HaystackSerializer):

        class Meta:
            index_classes = [LocationIndex]
            fields = ["address", "city", "zip_code", "autocomplete"]
            ignore_fields = ["autocomplete"]

            # The `field_aliases` attribute can be used in order to alias a
            # query parameter to a field attribute. In this case a query like
            # /search/?q=oslo would alias the `q` parameter to the `autocomplete`
            # field on the index.
            field_aliases = {
                "q": "autocomplete"
            }


    class AutocompleteSearchViewSet(HaystackViewSet):

        index_models = [Location]
        serializer_class = AutocompleteSerializer
        filter_backends = [HaystackAutocompleteFilter]
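To put the autocomplete view into context, the sketch below wires it into the same router as the basic search view; the URL prefix and basename are arbitrary names chosen for this example.

.. code-block:: python

    #
    # urls.py (sketch) -- prefixes and basenames are illustrative only
    #

    from rest_framework import routers

    from .views import AutocompleteSearchViewSet, LocationSearchView

    router = routers.DefaultRouter()
    router.register("location/search", LocationSearchView, basename="location-search")
    router.register("location/autocomplete", AutocompleteSearchViewSet, basename="location-autocomplete")

    # A request such as /location/autocomplete/?q=osl will then AND the n-gram terms
    # together and return matching locations, thanks to the `q` -> `autocomplete` alias.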
drf-haystack-1.8.13/docs/03_geospatial.rst

.. _geospatial-label:

GEO spatial locations
=====================

Some search backends support geo spatial searching. In order to take advantage of this
we have the :class:`drf_haystack.filters.HaystackGEOSpatialFilter`.

.. note::

    The ``HaystackGEOSpatialFilter`` depends on ``geopy`` and ``libgeos``. Make sure to install these
    libraries in order to use this filter.

    .. code-block:: none

        $ pip install geopy
        $ apt-get install libgeos-c1 (for debian based linux distros)
        or
        $ brew install geos (for homebrew on OS X)

The geospatial filter is somewhat special, and for the time being, relies on a few assumptions.

#. The index model **must** have a ``LocationField`` (see :ref:`search-index-example-label` for an example).
   If your ``LocationField`` is named something other than ``coordinates``, subclass the
   ``HaystackGEOSpatialFilter`` and make sure to set the
   :attr:`drf_haystack.filters.HaystackGEOSpatialFilter.point_field` to the name of the field.
#. The query **must** contain a ``unit`` parameter where the unit is a valid ``UNIT`` in the
   ``django.contrib.gis.measure.Distance`` class.
#. The query **must** contain a ``from`` parameter which is a comma separated longitude and latitude value.
   You may also change the ``from`` query param by defining ``DRF_HAYSTACK_SPATIAL_QUERY_PARAM`` in your settings.

**Example Geospatial view**

.. code-block:: python

    class DistanceSerializer(serializers.Serializer):
        m = serializers.FloatField()
        km = serializers.FloatField()


    class LocationSerializer(HaystackSerializer):

        distance = SerializerMethodField()

        class Meta:
            index_classes = [LocationIndex]
            fields = ["address", "city", "zip_code"]

        def get_distance(self, obj):
            if hasattr(obj, "distance"):
                return DistanceSerializer(obj.distance, many=False).data


    class LocationGeoSearchViewSet(HaystackViewSet):

        index_models = [Location]
        serializer_class = LocationSerializer
        filter_backends = [HaystackGEOSpatialFilter]

**Example subclassing the HaystackGEOSpatialFilter**

Assuming that your ``LocationField`` is named ``location``.

.. code-block:: python

    from drf_haystack.filters import HaystackGEOSpatialFilter

    class CustomHaystackGEOSpatialFilter(HaystackGEOSpatialFilter):
        point_field = 'location'


    class LocationGeoSearchViewSet(HaystackViewSet):

        index_models = [Location]
        serializer_class = LocationSerializer
        filter_backends = [CustomHaystackGEOSpatialFilter]

Assuming the above code works as it should, we would be able to do queries like this:

.. code-block:: none

    /api/v1/search/?zip_code=0351&km=10&from=59.744076,10.152045

The above query would return all entries with zip_code 0351 within 10 kilometers
from the location with latitude 59.744076 and longitude 10.152045.
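Since ``geopy`` is already a dependency of this filter, it can also be handy on the client side for turning an address into the ``from`` parameter. The snippet below is only a sketch mirroring the worked example above (latitude first); the Nominatim user agent and the address are placeholders.

.. code-block:: python

    # Build a `from=<lat>,<lon>` query string value with geopy (sketch).
    from geopy.geocoders import Nominatim

    geolocator = Nominatim(user_agent="drf-haystack-example")  # placeholder user agent
    point = geolocator.geocode("Drammen, Norway")              # placeholder address; may return None

    if point is not None:
        query = "/api/v1/search/?zip_code=0351&km=10&from={0},{1}".format(
            point.latitude, point.longitude
        )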
drf-haystack-1.8.13/docs/04_highlighting.rst

.. _highlighting-label:

Highlighting
============

Haystack supports two kinds of `Highlighting `_, and we support them both.

#. SearchQuerySet highlighting. This kind of highlighting requires a search backend which has support
   for highlighting, such as Elasticsearch or Solr.
#. Pure python highlighting. This implementation is somewhat slower, but enables highlighting support
   even if your search backend does not support it.

.. note::

    The highlighter will always use the ``document=True`` field on your index to highlight on.
    See examples below.

SearchQuerySet Highlighting
---------------------------

In order to add support for ``SearchQuerySet().highlight()``, all you have to do is to add the
:class:`drf_haystack.filters.HaystackHighlightFilter` to the ``filter_backends`` in your view.
The ``HaystackSerializer`` will check if your queryset has highlighting enabled, and render an additional
``highlighted`` field to your result. The highlighted words will be encapsulated in an
``<em>words go here</em>`` html tag.

**Example view with highlighting enabled**

.. code-block:: python

    from drf_haystack.viewsets import HaystackViewSet
    from drf_haystack.filters import HaystackHighlightFilter

    from .models import Person
    from .serializers import PersonSerializer


    class SearchViewSet(HaystackViewSet):
        index_models = [Person]
        serializer_class = PersonSerializer
        filter_backends = [HaystackHighlightFilter]

Given a query like below

.. code-block:: none

    /api/v1/search/?firstname=jeremy

We would get a result like this

.. code-block:: json

    [
        {
            "lastname": "Rowland",
            "full_name": "Jeremy Rowland",
            "firstname": "Jeremy",
            "highlighted": "<em>Jeremy</em> Rowland\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n"
        },
        {
            "lastname": "Fowler",
            "full_name": "Jeremy Fowler",
            "firstname": "Jeremy",
            "highlighted": "<em>Jeremy</em> Fowler\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n"
        }
    ]

Pure Python Highlighting
------------------------

This implementation makes use of the haystack ``Highlighter()`` class. It is implemented as the
:class:`drf_haystack.serializers.HighlighterMixin` mixin class, and must be applied on the ``Serializer``.
This is somewhat slower, but more configurable than the
:class:`drf_haystack.filters.HaystackHighlightFilter` filter class.

The Highlighter class will be initialized with the following default options, but they can be overridden by
changing any of the following class attributes.

.. code-block:: python

    highlighter_class = Highlighter
    highlighter_css_class = "highlighted"
    highlighter_html_tag = "span"
    highlighter_max_length = 200
    highlighter_field = None

The Highlighter class will usually highlight the ``document_field`` (the field marked ``document=True`` on
your search index class), but this may be overridden by changing the ``highlighter_field``.

You can of course also use your own ``Highlighter`` class by overriding the
``highlighter_class = MyFancyHighLighter`` class attribute.

**Example serializer with highlighter support**

.. code-block:: python

    from drf_haystack.serializers import HighlighterMixin, HaystackSerializer


    class PersonSerializer(HighlighterMixin, HaystackSerializer):

        highlighter_css_class = "my-highlighter-class"
        highlighter_html_tag = "em"

        class Meta:
            index_classes = [PersonIndex]
            fields = ["firstname", "lastname", "full_name"]

Response

.. code-block:: json

    [
        {
            "full_name": "Jeremy Rowland",
            "lastname": "Rowland",
            "firstname": "Jeremy",
            "highlighted": "<em class=\"my-highlighter-class\">Jeremy</em> Rowland\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n"
        },
        {
            "full_name": "Jeremy Fowler",
            "lastname": "Fowler",
            "firstname": "Jeremy",
            "highlighted": "<em class=\"my-highlighter-class\">Jeremy</em> Fowler\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n"
        }
    ]
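If the class attributes above are not flexible enough, a custom ``Highlighter`` subclass can be plugged in via ``highlighter_class``. The subclass below is only a sketch of one possible customization (post-processing the highlighted excerpt); the class name and the whitespace normalization are made up for this example.

.. code-block:: python

    # Sketch of a custom Highlighter -- the behaviour chosen here is arbitrary.
    from haystack.utils.highlighting import Highlighter

    from drf_haystack.serializers import HighlighterMixin, HaystackSerializer


    class CompactHighlighter(Highlighter):

        def highlight(self, text_block):
            # Delegate to haystack's implementation, then collapse the newlines and
            # repeated whitespace seen in the example responses above.
            highlighted = super().highlight(text_block)
            return " ".join(highlighted.split())


    class CompactPersonSerializer(HighlighterMixin, HaystackSerializer):

        highlighter_class = CompactHighlighter

        class Meta:
            index_classes = [PersonIndex]
            fields = ["firstname", "lastname", "full_name"]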
drf-haystack-1.8.13/docs/05_more_like_this.rst

.. _more-like-this-label:

More Like This
==============

Some search backends support ``More Like This`` features. In order to take advantage of this, we have a
mixin class :class:`drf_haystack.mixins.MoreLikeThisMixin`, which will append a ``more-like-this`` detail
route to the base name of the ViewSet. Let's say you have a router which looks like this:

.. code-block:: python

    router = routers.DefaultRouter()
    router.register("search", viewset=SearchViewSet, basename="search")  # MLT name will be 'search-more-like-this'.

    urlpatterns = patterns(
        "",
        url(r"^", include(router.urls))
    )

The important thing here is that the ``SearchViewSet`` class inherits from the
:class:`drf_haystack.mixins.MoreLikeThisMixin` class in order to get the ``more-like-this`` route
automatically added. The view name will be ``{basename}-more-like-this``, which in this case would be
for example ``search-more-like-this``.

Serializing the More Like This URL
----------------------------------

In order to include the ``more-like-this`` url in your result you only have to add a
``HyperlinkedIdentityField`` to your serializer. Something like this should work okay.

**Example serializer with More Like This**

.. code-block:: python

    class SearchSerializer(HaystackSerializer):

        more_like_this = serializers.HyperlinkedIdentityField(view_name="search-more-like-this", read_only=True)

        class Meta:
            index_classes = [PersonIndex]
            fields = ["firstname", "lastname", "full_name"]


    class SearchViewSet(MoreLikeThisMixin, HaystackViewSet):
        index_models = [Person]
        serializer_class = SearchSerializer

Now, every result you render with this serializer will include a ``more_like_this`` field containing the
url for similar results.

Example response

.. code-block:: json

    [
        {
            "full_name": "Jeremy Rowland",
            "lastname": "Rowland",
            "firstname": "Jeremy",
            "more_like_this": "http://example.com/search/5/more-like-this/"
        }
    ]

drf-haystack-1.8.13/docs/06_term_boost.rst

.. _term-boost-label:

Term Boost
==========

.. warning::

    **BIG FAT WARNING**

    As far as I can see, the term boost functionality is implemented according to the specs in the
    `Haystack documentation `_, however it does not really work as it should!

    When applying term boost, results are discarded from the search result, and not re-ordered by boost
    weight as they should be. These are known problems and there are open issues for them:

        - https://github.com/inonit/drf-haystack/issues/21
        - https://github.com/django-haystack/django-haystack/issues/1235
        - https://github.com/django-haystack/django-haystack/issues/508

    **Please do not use this unless you really know what you are doing!**

    (And please let me know if you know how to fix it!)

Term boost is achieved on the SearchQuerySet level by calling ``SearchQuerySet().boost()``. It is
implemented as a :class:`drf_haystack.filters.HaystackBoostFilter` filter backend.
The ``HaystackBoostFilter`` does not perform any filtering by itself, and should therefore be combined
with some other filter that does, for example the :class:`drf_haystack.filters.HaystackFilter`.

.. code-block:: python

    from drf_haystack.filters import HaystackBoostFilter

    class SearchViewSet(HaystackViewSet):
        ...
        filter_backends = [HaystackFilter, HaystackBoostFilter]

The filter expects the query string to contain a ``boost`` parameter, which is a comma separated string of
the term to boost and the boost value. The boost value must be either an integer or float value.

**Example query**

.. code-block:: none

    /api/v1/search/?firstname=robin&boost=hood,1.1

The query above will first filter on ``firstname=robin`` and next apply a slight boost on any document
containing the word ``hood``.

.. note::

    Term boost is only applied to terms existing in the ``document field``.
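Given the caveats above, if you still want to experiment with boosting without exposing it through a query parameter, one option is to apply it when building the queryset, since overriding ``get_queryset()`` on the view is a supported pattern. The snippet below is only a sketch; the boost term, the weight and the override signature are illustrative, and the same known issues apply.

.. code-block:: python

    # Sketch: applying a fixed boost server-side instead of reading it from the query string.
    from drf_haystack.viewsets import HaystackViewSet


    class BoostedSearchViewSet(HaystackViewSet):
        index_models = [Person]
        serializer_class = SearchSerializer

        def get_queryset(self, index_models=None):
            queryset = super().get_queryset(index_models or [])
            # `SearchQuerySet.boost(term, value)` is standard Haystack API;
            # the term "hood" and weight 1.1 are arbitrary examples.
            return queryset.boost("hood", 1.1)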
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/07_faceting.rst0000664000175000017500000003324514476342707016161 0ustar00mmmmmm.. _faceting-label: Faceting ======== Faceting is a way of grouping and narrowing search results by a common factor, for example we can group all results which are registered on a certain date. Similar to :ref:`more-like-this-label`, the faceting functionality is implemented by setting up a special ``^search/facets/$`` route on any view which inherits from the :class:`drf_haystack.mixins.FacetMixin` class. .. note:: Options used for faceting is **not** portable across search backends. Make sure to provide options suitable for the backend you're using. First, read the `Haystack faceting docs `_ and set up your search index for faceting. Serializing faceted counts -------------------------- Faceting is a little special in terms that it *does not* care about SearchQuerySet filtering. Faceting is performed by calling the ``SearchQuerySet().facet(field, **options)`` and ``SearchQuerySet().date_facet(field, **options)`` methods, which will apply facets to the SearchQuerySet. Next we need to call the ``SearchQuerySet().facet_counts()`` in order to retrieve a dictionary with all the *counts* for the faceted fields. We have a special :class:`drf_haystack.serializers.HaystackFacetSerializer` class which is designed to serialize these results. .. tip:: It *is* possible to perform faceting on a subset of the queryset, in which case you'd have to override the ``get_queryset()`` method of the view to limit the queryset before it is passed on to the ``filter_facet_queryset()`` method. Any serializer subclassed from the ``HaystackFacetSerializer`` is expected to have a ``field_options`` dictionary containing a set of default options passed to ``facet()`` and ``date_facet()``. **Facet serializer example** .. code-block:: python class PersonFacetSerializer(HaystackFacetSerializer): serialize_objects = False # Setting this to True will serialize the # queryset into an `objects` list. This # is useful if you need to display the faceted # results. Defaults to False. class Meta: index_classes = [PersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "month", "gap_amount": 3 } } The declared ``field_options`` will be used as default options when faceting is applied to the queryset, but can be overridden by supplying query string parameters in the following format. .. code-block:: none ?firstname=limit:1&created=start_date:20th May 2014,gap_by:year Each field can be fed options as ``key:value`` pairs. Multiple ``key:value`` pairs can be supplied and will be separated by the ``view.lookup_sep`` attribute (which defaults to comma). Any ``start_date`` and ``end_date`` parameters will be parsed by the python-dateutil `parser() `_ (which can handle most common date formats). .. note:: - The ``HaystackFacetFilter`` parses query string parameter options, separated with the ``view.lookup_sep`` attribute. Each option is parsed as ``key:value`` pairs where the ``:`` is a hardcoded separator. Setting the ``view.lookup_sep`` attribute to ``":"`` will raise an AttributeError. - The date parsing in the ``HaystackFacetFilter`` does intentionally blow up if fed a string format it can't handle. 
No exception handling is done, so make sure to convert values to a format you know it can handle before passing it to the filter. Ie., don't let your users feed their own values in here ;) .. warning:: Do *not* use the ``HaystackFacetFilter`` in the regular ``filter_backends`` list on the serializer. It will almost certainly produce errors or weird results. Faceting filters should go in the ``facet_filter_backends`` list. **Example serialized content** The serialized content will look a little different than the default Haystack faceted output. The top level items will *always* be **queries**, **fields** and **dates**, each containing a subset of fields matching the category. In the example below, we have faceted on the fields *firstname* and *lastname*, which will make them appear under the **fields** category. We also have faceted on the date field *created*, which will show up under the **dates** category. Next, each faceted result will have a ``text``, ``count`` and ``narrow_url`` attribute which should be quite self explaining. .. code-block:: json { "queries": {}, "fields": { "firstname": [ { "text": "John", "count": 3, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn" }, { "text": "Randall", "count": 2, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3ARandall" }, { "text": "Nehru", "count": 2, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3ANehru" } ], "lastname": [ { "text": "Porter", "count": 2, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=lastname_exact%3APorter" }, { "text": "Odonnell", "count": 2, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=lastname_exact%3AOdonnell" }, { "text": "Hood", "count": 2, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=lastname_exact%3AHood" } ] }, "dates": { "created": [ { "text": "2015-05-15T00:00:00", "count": 100, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=created_exact%3A2015-05-15+00%3A00%3A00" } ] } } Serializing faceted results --------------------------- When a ``HaystackFacetSerializer`` class determines what fields to serialize, it will check the ``serialize_objects`` class attribute to see if it is ``True`` or ``False``. Setting this value to ``True`` will add an additional ``objects`` field to the serialized results, which will contain the results for the faceted ``SearchQuerySet``. The results will by default be serialized using the view's ``serializer_class``. If you wish to use a different serializer for serializing the results, set the :attr:`drf_haystack.mixins.FacetMixin.facet_objects_serializer_class` class attribute to whatever serializer you want to use, or override the :meth:`drf_haystack.mixins.FacetMixin.get_facet_objects_serializer_class` method. **Example faceted results with paginated serialized objects** .. 
code-block:: json { "fields": { "firstname": [ {"...": "..."} ], "lastname": [ {"...": "..."} ] }, "dates": { "created": [ {"...": "..."} ] }, "queries": {}, "objects": { "count": 3, "next": "http://example.com/api/v1/search/facets/?page=2&selected_facets=firstname_exact%3AJohn", "previous": null, "results": [ { "lastname": "Baker", "firstname": "John", "full_name": "John Baker", "text": "John Baker\n" }, { "lastname": "McClane", "firstname": "John", "full_name": "John McClane", "text": "John McClane\n" } ] } } Setting up the view ------------------- Any view that inherits the :class:`drf_haystack.mixins.FacetMixin` will have a special `action route `_ added as ``^/facets/$``. This view action will not care about regular filtering but will by default use the ``HaystackFacetFilter`` to perform filtering. .. note:: In order to avoid confusing the filtering mechanisms in Django Rest Framework, the ``FacetMixin`` class has a couple of hooks for dealing with faceting, namely: - :attr:`drf_haystack.mixins.FacetMixin.facet_filter_backends` - A list of filter backends that will be used to apply faceting to the queryset. Defaults to :class:drf_haystack.filters.HaystackFacetFilter`, which should be sufficient in most cases. - :attr:`drf_haystack.mixins.FacetMixin.facet_serializer_class` - The :class:`drf_haystack.serializers.HaystackFacetSerializer` instance that will be used for serializing the result. - :attr:`drf_haystack.mixins.FacetMixin.facet_objects_serializer_class` - Optional. Set to the serializer class which should be used for serializing faceted objects. If not set, defaults to ``self.serializer_class``. - :attr:`drf_haystack.mixins.FacetMixin.filter_facet_queryset()` - Works exactly as the normal :meth:`drf_haystack.generics.HaystackGenericAPIView.filter_queryset` method, but will only filter on backends in the ``self.facet_filter_backends`` list. - :meth:`drf_haystack.mixins.FacetMixin.get_facet_serializer_class` - Returns the ``self.facet_serializer_class`` class attribute. - :meth:`drf_haystack.mixins.FacetMixin.get_facet_serializer` - Instantiates and returns the :class:`drf_haystack.serializers.HaystackFacetSerializer` class returned from :meth:`drf_haystack.mixins.FacetMixin.get_facet_serializer_class` method. - :meth:`drf_haystack.mixins.FacetMixin.get_facet_objects_serializer` - Instantiates and returns the serializer class which will be used to serialize faceted objects. - :meth:`drf_haystack.mixins.FacetMixin.get_facet_objects_serializer_class` - Returns the ``self.facet_objects_serializer_class``, or if not set, the ``self.serializer_class``. In order to set up a view which can respond to regular queries under ie ``^search/$`` and faceted queries under ``^search/facets/$``, we could do something like this. We can also change the query param text from ``selected_facets`` to our own choice like ``params`` or ``p``. For this to make happen please provide ``facet_query_params_text`` attribute as shown in the example. .. code-block:: python class SearchPersonViewSet(FacetMixin, HaystackViewSet): index_models = [MockPerson] # This will be used to filter and serialize regular queries as well # as the results if the `facet_serializer_class` has the # `serialize_objects = True` set. serializer_class = SearchSerializer filter_backends = [HaystackHighlightFilter, HaystackAutocompleteFilter] # This will be used to filter and serialize faceted results facet_serializer_class = PersonFacetSerializer # See example above! 
facet_filter_backends = [HaystackFacetFilter] # This is the default facet filter, and # can be left out. facet_query_params_text = 'params' #Default is 'selected_facets' Narrowing --------- As we have seen in the examples above, the ``HaystackFacetSerializer`` will add a ``narrow_url`` attribute to each result it serializes. Follow that link to narrow the search result. The ``narrow_url`` is constructed like this: - Read all query parameters from the request - Get a list of ``selected_facets`` - Update the query parameters by adding the current item to ``selected_facets`` - Pop the :attr:`drf_haystack.serializers.HaystackFacetSerializer.paginate_by_param` parameter if any in order to always start at the first page if returning a paginated result. - Return a ``serializers.Hyperlink`` with URL encoded query parameters This means that for each drill-down performed, the original query parameters will be kept in order to make the ``HaystackFacetFilter`` happy. Additionally, all the previous ``selected_facets`` will be kept and applied to narrow the ``SearchQuerySet`` properly. **Example narrowed result** .. code-block:: json { "queries": {}, "fields": { "firstname": [ { "text": "John", "count": 1, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin" } ], "lastname": [ { "text": "McLaughlin", "count": 1, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin" } ] }, "dates": { "created": [ { "text": "2015-05-15T00:00:00", "count": 1, "narrow_url": "http://example.com/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin&selected_facets=created_exact%3A2015-05-15+00%3A00%3A00" } ] } } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/08_permissions.rst0000664000175000017500000000241614476342707016751 0ustar00mmmmmm.. _permissions-label: Permissions =========== Django REST Framework allows setting certain ``permission_classes`` in order to control access to views. The generic ``HaystackGenericAPIView`` defaults to ``rest_framework.permissions.AllowAny`` which enforce no restrictions on the views. This can be overridden on a per-view basis as you would normally do in a regular `REST Framework APIView `_. .. note:: Since we have no Django model or queryset, the following permission classes are *not* supported: - ``rest_framework.permissions.DjangoModelPermissions`` - ``rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly`` - ``rest_framework.permissions.DjangoObjectPermissions`` ``POST``, ``PUT``, ``PATCH`` and ``DELETE`` are not supported since Haystack Views are read-only. So if you are using the ``rest_framework.permissions.IsAuthenticatedOrReadOnly`` , this will act just as the ``AllowAny`` permission. **Example overriding permission classes** .. code-block:: python ... from rest_framework.permissions import IsAuthenticated class SearchViewSet(HaystackViewSet): ... permission_classes = [IsAuthenticated] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/09_multiple_indexes.rst0000664000175000017500000001255714476342707017760 0ustar00mmmmmm.. _multiple-indexes-label: Multiple search indexes ======================= So far, we have only used one class in the ``index_classes`` attribute of our serializers. 
However, you are able to specify a list of them. This can be useful when your search engine has indexed multiple models and you want to provide aggregate results across two or more of them. To use the default multiple index support, simply add multiple indexes the ``index_classes`` list .. code-block:: python class PersonIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) firstname = indexes.CharField(model_attr="first_name") lastname = indexes.CharField(model_attr="last_name") def get_model(self): return Person class PlaceIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) address = indexes.CharField(model_attr="address") def get_model(self): return Place class ThingIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) name = indexes.CharField(model_attr="name") def get_model(self): return Thing class AggregateSerializer(HaystackSerializer): class Meta: index_classes = [PersonIndex, PlaceIndex, ThingIndex] fields = ["firstname", "lastname", "address", "name"] class AggregateSearchViewSet(HaystackViewSet): serializer_class = AggregateSerializer .. note:: The ``AggregateSearchViewSet`` class above omits the optional ``index_models`` attribute. This way results from all the models are returned. The result from searches using multiple indexes is a list of objects, each of which contains only the fields appropriate to the model from which the result came. For instance if a search returned a list containing one each of the above models, it might look like the following: .. code-block:: javascript [ { "text": "John Doe", "firstname": "John", "lastname": "Doe" }, { "text": "123 Doe Street", "address": "123 Doe Street" }, { "text": "Doe", "name": "Doe" } ] Declared fields --------------- You can include field declarations in the serializer class like normal. Depending on how they are named, they will be treated as common fields and added to every result or as specific to results from a particular index. Common fields are declared as you would any serializer field. Index-specific fields must be prefixed with "___". The following example illustrates this usage: .. code-block:: python class AggregateSerializer(HaystackSerializer): extra = serializers.SerializerMethodField() _ThingIndex__number = serializers.SerializerMethodField() class Meta: index_classes = [PersonIndex, PlaceIndex, ThingIndex] fields = ["firstname", "lastname", "address", "name"] def get_extra(self,instance): return "whatever" def get__ThingIndex__number(self,instance): return 42 The results of a search might then look like the following: .. code-block:: javascript [ { "text": "John Doe", "firstname": "John", "lastname": "Doe", "extra": "whatever" }, { "text": "123 Doe Street", "address": "123 Doe Street", "extra": "whatever" }, { "text": "Doe", "name": "Doe", "extra": "whatever", "number": 42 } ] Multiple Serializers -------------------- Alternatively, you can specify a 'serializers' attribute on your Meta class to use a different serializer class for different indexes as show below: .. code-block:: python class AggregateSearchSerializer(HaystackSerializer): class Meta: serializers = { PersonIndex: PersonSearchSerializer, PlaceIndex: PlaceSearchSerializer, ThingIndex: ThingSearchSerializer } The ``serializers`` attribute is the important thing here, It's a dictionary with ``SearchIndex`` classes as keys and ``Serializer`` classes as values. 
Each result in the list of results from a search that contained items from multiple indexes would be serialized according to the appropriate serializer. .. warning:: If a field name is shared across serializers, and one serializer overrides the field mapping, the overridden mapping will be used for *all* serializers. See the example below for more details. .. code-block:: python from rest_framework import serializers class PersonSearchSerializer(HaystackSerializer): # NOTE: This override will be used for both Person and Place objects. name = serializers.SerializerMethodField() class Meta: fields = ['name'] class PlaceSearchSerializer(HaystackSerializer): class Meta: fields = ['name'] class AggregateSearchSerializer(HaystackSerializer): class Meta: serializers = { PersonIndex: PersonSearchSerializer, PlaceIndex: PlaceSearchSerializer, ThingIndex: ThingSearchSerializer } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/10_tips_n_tricks.rst0000664000175000017500000000605014476342707017240 0ustar00mmmmmm.. _tips-n-tricks-label: Tips'n Tricks ============= Reusing Model serializers ------------------------- It may be useful to be able to use existing model serializers to return data from search requests in the same format as used elsewhere in your API. This can be done by modifying the ``to_representation`` method of your serializer to use the ``instance.object`` instead of the search result instance. As a convenience, a mixin class is provided that does just that. .. class:: drf_haystack.serializers.HaystackSerializerMixin An example using the mixin might look like the following: .. code-block:: python class PersonSerializer(serializers.ModelSerializer): class Meta: model = Person fields = ("id", "firstname", "lastname") class PersonSearchSerializer(HaystackSerializerMixin, PersonSerializer): class Meta(PersonSerializer.Meta): search_fields = ("text", ) The results from a search would then contain the fields from the ``PersonSerializer`` rather than fields from the search index. .. note:: If your model serializer specifies a ``fields`` attribute in its Meta class, then the search serializer must specify a ``search_fields`` attribute in its Meta class if you intend to search on any search index fields that are not in the model serializer fields (e.g. 'text') .. warning:: It should be noted that doing this will retrieve the underlying object which means a database hit. Thus, it will not be as performant as only retrieving data from the search index. If performance is a concern, it would be better to recreate the desired data structure and store it in the search index. Regular Search View ------------------- Sometimes you might not need all the bells and whistles of a ``ViewSet``, but can do with a regular view. In such scenario you could do something like this. .. code-block:: python # # views.py # from rest_framework.mixins import ListModelMixin from drf_haystack.generics import HaystackGenericAPIView class SearchView(ListModelMixin, HaystackGenericAPIView): serializer_class = LocationSerializer def get(self, request, *args, **kwargs): return self.list(request, *args, **kwargs) # # urls.py # urlpatterns = ( ... url(r'^search/', SearchView.as_view()), ... ) You can also use `FacetMixin` or `MoreLikeThisMixin` in your regular views as well. .. 
code-block:: python # # views.py # from rest_framework.mixins import ListModelMixin from drf_haystack.mixins import FacetMixin from drf_haystack.generics import HaystackGenericAPIView class SearchView(ListModelMixin, FacetMixin, HaystackGenericAPIView): index_models = [Project] serializer_class = ProjectListSerializer facet_serializer_class = ProjectListFacetSerializer pagination_class = BasicPagination permission_classes = (AllowAny,) def get(self, request, *args, **kwargs): return self.facets(request, *args, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/Makefile0000664000175000017500000001520614476342707014776 0ustar00mmmmmm# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." 
pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/django-pushit.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/django-pushit.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/django-pushit" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/django-pushit" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." 
pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8640983 drf-haystack-1.8.13/docs/apidoc/0000775000175000017500000000000014507202601014550 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/apidoc/drf_haystack.rst0000664000175000017500000000212014476342707017760 0ustar00mmmmmm drf_haystack.fields ------------------- .. automodule:: drf_haystack.fields :members: :undoc-members: :show-inheritance: drf_haystack.filters -------------------- .. automodule:: drf_haystack.filters :members: :undoc-members: :show-inheritance: drf_haystack.generics --------------------- .. automodule:: drf_haystack.generics :members: :undoc-members: :show-inheritance: drf_haystack.mixins ------------------- .. automodule:: drf_haystack.mixins :members: :undoc-members: :show-inheritance: drf_haystack.query ------------------ .. automodule:: drf_haystack.query :members: :undoc-members: :show-inheritance: drf_haystack.serializers ------------------------ .. automodule:: drf_haystack.serializers :members: :undoc-members: :show-inheritance: drf_haystack.utils ------------------ .. automodule:: drf_haystack.utils :members: :undoc-members: :show-inheritance: drf_haystack.viewsets --------------------- .. automodule:: drf_haystack.viewsets :members: :undoc-members: :show-inheritance: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/apidoc/modules.rst0000664000175000017500000000010114476342707016763 0ustar00mmmmmmAPI Docs ======== .. toctree:: :maxdepth: 4 drf_haystack ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/conf.py0000664000175000017500000001321514476342707014633 0ustar00mmmmmm#!/usr/bin/env python3 # -*- coding: utf-8 -*- import os import re import sys from datetime import date # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. sys.path.insert(0, os.path.abspath('..')) os.environ['RUNTIME_ENV'] = 'TESTSUITE' os.environ['DJANGO_SETTINGS_MODULE'] = 'tests.settings' try: import django import sphinx_rtd_theme use_sphinx_rtd_theme = True if hasattr(django, "setup"): django.setup() except ImportError: use_sphinx_rtd_theme = os.environ.get('READTHEDOCS', False) def get_version(package): """ Return package version as listed in `__version__` in `init.py`. """ init_py = open(os.path.join(package, '__init__.py')).read() return re.search("__version__ = ['\"]([^'\"]+)['\"]", init_py).group(1) # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. # # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.extlinks', 'sphinx.ext.todo', 'sphinx.ext.viewcode'] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix(es) of source filenames. 
# You can specify multiple suffix as a list of string: # # source_suffix = ['.rst', '.md'] source_suffix = '.rst' # The master toctree document. master_doc = 'index' # General information about the project. project = 'drf_haystack' copyright = '%d, Rolf Håvard Blindheim' % date.today().year author = 'Rolf Håvard Blindheim' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The full version, including alpha/beta/rc tags. version = release = get_version(os.path.join('..', 'drf_haystack')) # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. language = None # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. # This patterns also effect to html_static_path and html_extra_path exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # If true, `todo` and `todoList` produce output, else they produce nothing. todo_include_todos = True # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. # html_theme = 'sphinx_rtd_theme' if use_sphinx_rtd_theme else 'alabaster' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. # # html_theme_options = {} # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # -- Options for HTMLHelp output ------------------------------------------ # Output file base name for HTML help builder. htmlhelp_basename = 'drfhaystackdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). 'papersize': 'a4paper', # The font size ('10pt', '11pt' or '12pt'). 'pointsize': '11pt', # Additional stuff for the LaTeX preamble. # # 'preamble': '', # Latex figure (float) alignment # # 'figure_align': 'htbp', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', 'drf-haystack.tex', 'drf-haystack documentation', 'Inonit', 'manual'), ] # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'drf-haystack', 'drf-haystack documentation', ['Inonit'], 1) ] # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. 
List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'drf-haystack', 'drf-haystack documentation', 'Inonit', 'drf-haystack', 'Haystack for Django REST Framework', 'Miscellaneous'), ] # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = { 'http://docs.python.org/': None, 'django': ('https://django.readthedocs.io/en/latest/', None), 'haystack': ('https://django-haystack.readthedocs.io/en/latest/', None) } # Configurations for extlinks extlinks = { 'drf-pr': ('https://github.com/rhblind/drf-haystack/pull/%s', 'PR#'), 'drf-issue': ('https://github.com/rhblind/drf-haystack/issues/%s', '#'), 'haystack-issue': ('https://github.com/django-haystack/django-haystack/issues/%s', '#') } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/index.rst0000664000175000017500000002562614476342707015206 0ustar00mmmmmm================================== Haystack for Django REST Framework ================================== Contents: .. toctree:: :maxdepth: 2 01_intro 02_autocomplete 03_geospatial 04_highlighting 05_more_like_this 06_term_boost 07_faceting 08_permissions 09_multiple_indexes 10_tips_n_tricks apidoc/modules About ===== Small library aiming to simplify using Haystack with Django REST Framework Features ======== Supported Python and Django versions: - Python 3.6+ - `All supported versions of Django `_ Installation ============ It's in the cheese shop! .. code-block:: none $ pip install drf-haystack Requirements ============ - A Supported Django install - Django REST Framework v3.2.0 and later - Haystack v2.5 and later - A supported search engine such as Solr, Elasticsearch, Whoosh, etc. - Python bindings for the chosen backend (see below). - (geopy and libgeos if you want to use geo spatial filtering) Python bindings --------------- You will also need to install python bindings for the search engine you'll use. Elasticsearch ............. See haystack `Elasticsearch `_ docs for details .. code-block:: none $ pip install elasticsearch<2.0.0 # For Elasticsearch 1.x $ pip install elasticsearch>=2.0.0,<3.0.0 # For Elasticsearch 2.x Solr .... See haystack `Solr `_ docs for details. .. code-block:: none $ pip install pysolr Whoosh ...... See haystack `Whoosh `_ docs for details. .. code-block:: none $ pip install whoosh Xapian ...... See haystack `Xapian `_ docs for details. Contributors ============ This library has mainly been written by `me `_ while working at `Inonit `_. I have also had some help from these amazing people! Thanks guys! - See the full list of `contributors `_. Changelog ========= v1.8.12 ------ *Release date: 2023-01-03* - Updated supported Django-Haystack versions v1.8.11 ------ *Release date: 2021-08-27* - Updated supported Django-Haystack versions v1.8.10 ------ *Release date: 2021-04-12* - Updated supported Django versions - Updated supported Python versions v1.8.9 ------ *Release date: 2020-10-05* - Updated supported Django versions v1.8.8 ------ *Release date: 2020-09-28* - Updated supported DRF versions v1.8.7 ------ *Release date: 2020-08-01* - Updated supported Python, Haystack and DRF versions v1.8.6 ------ *Release date: 2019-10-13* - Fixed :drf-issue:`139`. Overriding declared fields must now use ``serializers.SerializerMethodField()`` and are handled by stock DRF. We don't need any custom functionality for this. 
- Added support for Django REST Framework v3.10.x - Dropped Python 2.x support v1.8.5 ------ *Release date: 2019-05-21* - Django2.2 support - Django REST Framework 3.9.x support v1.8.4 ------ *Release date: 2018-08-15* - Fixed Django2.1 support - Replaced `requirements.txt` with `Pipfile` for development dependencies management v1.8.3 ------ *Release date: 2018-06-16* - Fixed issues with ``__in=[...]`` and ``__range=[...]`` filters. Closes :drf-issue:`128`. v1.8.2 ------ *Release date: 2018-05-22* - Fixed issue with ``_get_count`` for DRF v3.8 v1.8.1 ------ *Release date: 2018-04-20* - Fixed errors in test suite which caused all tests to run on Elasticsearch 1.x v1.8.0 ------ *Release date: 2018-04-16* **This release was pulled because of critical errors in the test suite.** - Dropped support for Django v1.10.x and added support for Django v2.0.x - Updated minimum Django REST Framework requirement to v3.7 - Updated minimum Haystack requirements to v2.8 v1.7.1rc2 --------- *Release date: 2018-01-30* - Fixes issues with building documentation. - Fixed some minor typos in documentation. - Dropped unittest2 in favor of standard lib unittest v1.7.1rc1 --------- *Release date: 2018-01-06* - Locked Django versions in order to comply with Haystack requirements. - Requires development release of Haystack (v2.7.1dev0). v1.7.0 ------ *Release date: 2018-01-06 (Removed from pypi due to critical bugs)* - Bumping minimum support for Django to v1.10. - Bumping minimum support for Django REST Framework to v1.6.0 - Adding support for Elasticsearch 2.x Haystack backend v1.6.1 ------ *Release date: 2017-01-13* - Updated docs with correct name for libgeos-c1. - Updated ``.travis.yml`` with correct name for libgeos-c1. - Fixed an issue where queryset in the whould be evaluated if attribute is set but has no results, thus triggering the wrong clause in condition check. :drf-pr:`88` closes :drf-issue:`86`. v1.6.0 ------ *Release date: 2016-11-08* - Added Django 1.10 compatibility. - Fixed multiple minor issues. v1.6.0rc3 --------- *Release date: 2016-06-29* - Fixed :drf-issue:`61`. Introduction of custom serializers for serializing faceted objects contained a breaking change. v1.6.0rc2 --------- *Release date: 2016-06-28* - Restructured and updated documentation - Added support for using a custom serializer when serializing faceted objects. v1.6.0rc1 --------- *Release date: 2016-06-24* .. note:: **This release include breaking changes to the API** - Dropped support for Python 2.6, Django 1.5, 1.6 and 1.7 - Will follow `Haystack's `_ supported versions - Removed deprecated ``SQHighlighterMixin``. - Removed redundant ``BaseHaystackGEOSpatialFilter``. If name of ``indexes.LocationField`` needs to be changed, subclass the ``HaystackGEOSpatialFilter`` directly. - Reworked filters: - More consistent naming of methods. - All filters follow the same logic for building and applying filters and exclusions. - All filter classes use a ``QueryBuilder`` class for working out validation and building queries which are to be passed to the ``SearchQuerySet``. - Most filters does *not* inherit from ``HaystackFilter`` anymore (except ``HaystackAutocompleteFilter`` and ``HaystackHighlightFilter``) and will no longer do basic field filtering. Filters should be properly placed in the ``filter_backends`` class attribute in their respective order to be applied. This solves issues where inherited filters responds to query parameters they should ignore. - HaystackFacetSerializer ``narrow_url`` now returns an absolute url. 
- HaystackFacetSerializer now properly serializes ``MultiValueField`` and ``FacetMultiValueField`` items as a JSON Array. - ``HaystackGenericAPIView.get_object()`` optional ``model`` query parameter now requires a ``app_label.model`` instead of just the ``model``. - Extracted internal fields and serializer from the ``HaystackFacetSerializer`` in order to ease customization. - ``HaystackFacetSerializer`` now supports all three `builtin `_ pagination classes, and a hook to support custom pagination classes. - Extracted the ``more-like-this`` detail route and ``facets`` list route from the generic HaystackViewSet. - Support for ``more-like-this`` is available as a :class:`drf_haystack.mixins.MoreLikeThisMixin` class. - Support for ``facets`` is available as a :class:`drf_haystack.mixins.FacetMixin` class. v1.5.6 ------ *Release date: 2015-12-02* - Fixed a bug where ``ignore_fields`` on ``HaystackSerializer`` did not work unless ``exclude`` evaluates to ``True``. - Removed ``elasticsearch`` from ``install_requires``. Elasticsearch should not be a mandatory requirement, since it's useless if not using Elasticsearch as backend. v1.5.5 ------ *Release date: 2015-10-31* - Added support for Django REST Framework 3.3.0 (Only for Python 2.7/Django 1.7 and above) - Locked elasticsearch < 2.0.0 (See :drf-issue:`29`) v1.5.4 ------ *Release date: 2015-10-08* - Added support for serializing faceted results. Closing :drf-issue:`27`. v1.5.3 ------ *Release date: 2015-10-05* - Added support for :ref:`faceting-label` (Github :drf-issue:`11`). v1.5.2 ------ *Release date: 2015-08-23* - Proper support for :ref:`multiple-indexes-label` (Github :drf-issue:`22`). - Experimental support for :ref:`term-boost-label` (This seems to have some issues upstreams, so unfortunately it does not really work as expected). - Support for negate in filters. v1.5.1 ------ *Release date: 2015-07-28* - Support for More Like This results (Github :drf-issue:`10`). - Deprecated ``SQHighlighterMixin`` in favor of ``HaystackHighlightFilter``. - ``HaystackGenericAPIView`` now returns 404 for detail views if more than one entry is found (Github :drf-issue:`19`). v1.5.0 ------ *Release date: 2015-06-29* - Added support for field lookups in queries, such as ``field__contains=foobar``. Check out `Haystack docs `_ for details. - Added default ``permission_classes`` on ``HaystackGenericAPIView`` in order to avoid crash when using global permission classes on REST Framework. See :ref:`permissions-label` for details. v1.4 ---- *Release date: 2015-06-15* - Fixed issues for Geo spatial filtering on django-haystack v2.4.x with Elasticsearch. - A serializer class now accepts a list or tuple of ``ignore_field`` to bypass serialization. - Added support for Highlighting. v1.3 ---- *Release date: 2015-05-19* - ``HaystackGenericAPIView().get_object()`` now returns Http404 instead of an empty ``SearchQueryset`` if no object is found. This mimics the behaviour from ``GenericAPIView().get_object()``. - Removed hard dependencies for ``geopy`` and ``libgeos`` (See Github :drf-issue:`5`). This means that if you want to use the ``HaystackGEOSpatialFilter``, you have to install these libraries manually. v1.2 ---- *Release date: 2015-03-23* - Fixed ``MissingDependency`` error when using another search backend than Elasticsearch. - Fixed converting distance to D object before filtering in HaystackGEOSpatialFilter. - Added Python 3 classifier. 
v1.1 ---- *Release date: 2015-02-16* - Full coverage (almost) test suite - Documentation - Beta release Development classifier v1.0 ---- *Release date: 2015-02-14* - Initial release. Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/docs/make.bat0000664000175000017500000001507314476342707014745 0ustar00mmmmmm@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. xml to make Docutils-native XML files echo. pseudoxml to make pseudoxml-XML files for display purposes echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) %SPHINXBUILD% 2> nul if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. 
goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\django-pushit.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\django-pushit.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdf" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdfja" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf-ja cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) if "%1" == "xml" ( %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml if errorlevel 1 exit /b 1 echo. echo.Build finished. The XML files are in %BUILDDIR%/xml. goto end ) if "%1" == "pseudoxml" ( %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml if errorlevel 1 exit /b 1 echo. echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. 
goto end ) :end ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8640983 drf-haystack-1.8.13/drf_haystack/0000775000175000017500000000000014507202601015023 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694091443.0 drf-haystack-1.8.13/drf_haystack/__init__.py0000664000175000017500000000031514476344263017153 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import unicode_literals __title__ = "drf-haystack" __version__ = "1.8.13" __author__ = "Rolf Haavard Blindheim" __license__ = "MIT License" VERSION = __version__ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/constants.py0000664000175000017500000000041414476342707017431 0ustar00mmmmmmfrom django.conf import settings DRF_HAYSTACK_NEGATION_KEYWORD = getattr(settings, "DRF_HAYSTACK_NEGATION_KEYWORD", "not") GEO_SRID = getattr(settings, "GEO_SRID", 4326) DRF_HAYSTACK_SPATIAL_QUERY_PARAM = getattr(settings, "DRF_HAYSTACK_SPATIAL_QUERY_PARAM", "from") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/fields.py0000664000175000017500000000653314476342707016673 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import six from rest_framework import fields class DRFHaystackFieldMixin(object): prefix_field_names = False def __init__(self, **kwargs): self.prefix_field_names = kwargs.pop('prefix_field_names', False) super(DRFHaystackFieldMixin, self).__init__(**kwargs) def bind(self, field_name, parent): """ Initializes the field name and parent for the field instance. Called when a field is added to the parent serializer instance. Taken from DRF and modified to support drf_haystack multiple index functionality. """ # In order to enforce a consistent style, we error if a redundant # 'source' argument has been used. For example: # my_field = serializer.CharField(source='my_field') assert self.source != field_name, ( "It is redundant to specify `source='%s'` on field '%s' in " "serializer '%s', because it is the same as the field name. " "Remove the `source` keyword argument." % (field_name, self.__class__.__name__, parent.__class__.__name__) ) self.field_name = field_name self.parent = parent # `self.label` should default to being based on the field name. if self.label is None: self.label = field_name.replace('_', ' ').capitalize() # self.source should default to being the same as the field name. if self.source is None: self.source = self.convert_field_name(field_name) # self.source_attrs is a list of attributes that need to be looked up # when serializing the instance, or populating the validated data. 
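        # For example (illustrative values only): with ``prefix_field_names=True`` a
        # field bound as "_PersonIndex__firstname" gets the source "firstname" via
        # convert_field_name(), and a dotted source such as "author.first_name" is
        # split into ["author", "first_name"] below.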
if self.source == '*': self.source_attrs = [] else: self.source_attrs = self.source.split('.') def convert_field_name(self, field_name): if not self.prefix_field_names: return field_name return field_name.split("__")[-1] class HaystackBooleanField(DRFHaystackFieldMixin, fields.BooleanField): pass class HaystackCharField(DRFHaystackFieldMixin, fields.CharField): pass class HaystackDateField(DRFHaystackFieldMixin, fields.DateField): pass class HaystackDateTimeField(DRFHaystackFieldMixin, fields.DateTimeField): pass class HaystackDecimalField(DRFHaystackFieldMixin, fields.DecimalField): pass class HaystackFloatField(DRFHaystackFieldMixin, fields.FloatField): pass class HaystackIntegerField(DRFHaystackFieldMixin, fields.IntegerField): pass class HaystackMultiValueField(DRFHaystackFieldMixin, fields.ListField): pass class FacetDictField(fields.DictField): """ A special DictField which passes the key attribute down to the children's ``to_representation()`` in order to let the serializer know what field they're currently processing. """ def to_representation(self, value): return dict( [(six.text_type(key), self.child.to_representation(key, val)) for key, val in value.items() ] ) class FacetListField(fields.ListField): """ The ``FacetListField`` just pass along the key derived from ``FacetDictField``. """ def to_representation(self, key, data): return [self.child.to_representation(key, item) for item in data] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/filters.py0000664000175000017500000002352514476342707017075 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import operator import six from functools import reduce from django.core.exceptions import ImproperlyConfigured from haystack.query import SearchQuerySet from rest_framework.filters import BaseFilterBackend, OrderingFilter from drf_haystack.query import BoostQueryBuilder, FilterQueryBuilder, FacetQueryBuilder, SpatialQueryBuilder class BaseHaystackFilterBackend(BaseFilterBackend): """ A base class from which all Haystack filter backend classes should inherit. """ query_builder_class = None @staticmethod def get_request_filters(request): return request.query_params.copy() def apply_filters(self, queryset, applicable_filters=None, applicable_exclusions=None): """ Apply constructed filters and excludes and return the queryset :param queryset: queryset to filter :param applicable_filters: filters which are passed directly to queryset.filter() :param applicable_exclusions: filters which are passed directly to queryset.exclude() :returns filtered queryset """ if applicable_filters: queryset = queryset.filter(applicable_filters) if applicable_exclusions: queryset = queryset.exclude(applicable_exclusions) return queryset def build_filters(self, view, filters=None): """ Get the query builder instance and return constructed query filters. """ query_builder = self.get_query_builder(backend=self, view=view) return query_builder.build_query(**(filters if filters else {})) def process_filters(self, filters, queryset, view): """ Convenient hook to do any post-processing of the filters before they are applied to the queryset. """ return filters def filter_queryset(self, request, queryset, view): """ Return the filtered queryset. 
""" applicable_filters, applicable_exclusions = self.build_filters(view, filters=self.get_request_filters(request)) return self.apply_filters( queryset=queryset, applicable_filters=self.process_filters(applicable_filters, queryset, view), applicable_exclusions=self.process_filters(applicable_exclusions, queryset, view) ) def get_query_builder(self, *args, **kwargs): """ Return the query builder class instance that should be used to build the query which is passed to the search engine backend. """ query_builder = self.get_query_builder_class() return query_builder(*args, **kwargs) def get_query_builder_class(self): """ Return the class to use for building the query. Defaults to using `self.query_builder_class`. You may want to override this if you need to provide different methods of building the query sent to the search engine backend. """ assert self.query_builder_class is not None, ( "'%s' should either include a `query_builder_class` attribute, " "or override the `get_query_builder_class()` method." % self.__class__.__name__ ) return self.query_builder_class class HaystackFilter(BaseHaystackFilterBackend): """ A filter backend that compiles a haystack compatible filtering query. """ query_builder_class = FilterQueryBuilder default_operator = operator.and_ default_same_param_operator = operator.or_ class HaystackAutocompleteFilter(HaystackFilter): """ A filter backend to perform autocomplete search. Must be run against fields that are either `NgramField` or `EdgeNgramField`. """ def process_filters(self, filters, queryset, view): if not filters: return filters query_bits = [] for field_name, query in filters.children: for word in query.split(" "): bit = queryset.query.clean(word.strip()) kwargs = { field_name: bit } query_bits.append(view.query_object(**kwargs)) return six.moves.reduce(operator.and_, filter(lambda x: x, query_bits)) class HaystackGEOSpatialFilter(BaseHaystackFilterBackend): """ A base filter backend for doing geo spatial filtering. If using this filter make sure to provide a `point_field` with the name of your the `LocationField` of your index. We'll always do the somewhat slower but more accurate `dwithin` (radius) filter. """ query_builder_class = SpatialQueryBuilder point_field = "coordinates" def apply_filters(self, queryset, applicable_filters=None, applicable_exclusions=None): if applicable_filters: queryset = queryset.dwithin(**applicable_filters["dwithin"]).distance(**applicable_filters["distance"]) return queryset def filter_queryset(self, request, queryset, view): return self.apply_filters(queryset, self.build_filters(view, filters=self.get_request_filters(request))) class HaystackHighlightFilter(HaystackFilter): """ A filter backend which adds support for ``highlighting`` on the SearchQuerySet level (the fast one). Note that you need to use a search backend which supports highlighting in order to use this. This will add a ``hightlighted`` entry to your response, encapsulating the highlighted words in an `highlighted results` block. """ def filter_queryset(self, request, queryset, view): queryset = super(HaystackHighlightFilter, self).filter_queryset(request, queryset, view) if self.get_request_filters(request) and isinstance(queryset, SearchQuerySet): queryset = queryset.highlight() return queryset class HaystackBoostFilter(BaseHaystackFilterBackend): """ Filter backend for applying term boost on query time. Apply by adding a comma separated ``boost`` query parameter containing a the term you want to boost and a floating point or integer for the boost value. 
The boost value is based around ``1.0`` as 100% - no boost. Gives a slight increase in relevance for documents that include "banana": /api/v1/search/?boost=banana,1.1 """ query_builder_class = BoostQueryBuilder query_param = "boost" def apply_filters(self, queryset, applicable_filters=None, applicable_exclusions=None): if applicable_filters: queryset = queryset.boost(**applicable_filters) return queryset def filter_queryset(self, request, queryset, view): return self.apply_filters(queryset, self.build_filters(view, filters=self.get_request_filters(request))) class HaystackFacetFilter(BaseHaystackFilterBackend): """ Filter backend for faceting search results. This backend does not apply regular filtering. Faceting field options can be set by using the ``field_options`` attribute on the serializer, and can be overridden by query parameters. Dates will be parsed by the ``python-dateutil.parser()`` which can handle most date formats. Query parameters is parsed in the following format: ?field1=option1:value1,option2:value2&field2=option1:value1,option2:value2 where each options ``key:value`` pair is separated by the ``view.lookup_sep`` attribute. """ query_builder_class = FacetQueryBuilder def apply_filters(self, queryset, applicable_filters=None, applicable_exclusions=None): """ Apply faceting to the queryset """ for field, options in applicable_filters["field_facets"].items(): queryset = queryset.facet(field, **options) for field, options in applicable_filters["date_facets"].items(): queryset = queryset.date_facet(field, **options) for field, options in applicable_filters["query_facets"].items(): queryset = queryset.query_facet(field, **options) return queryset def filter_queryset(self, request, queryset, view): return self.apply_filters(queryset, self.build_filters(view, filters=self.get_request_filters(request))) class HaystackOrderingFilter(OrderingFilter): """ Some docstring here! """ def get_default_valid_fields(self, queryset, view, context={}): valid_fields = super(HaystackOrderingFilter, self).get_default_valid_fields(queryset, view, context) # Check if we need to support aggregate serializers serializer_class = view.get_serializer_class() if hasattr(serializer_class.Meta, "serializers"): raise NotImplementedError("Ordering on aggregate serializers is not yet implemented.") return valid_fields def get_valid_fields(self, queryset, view, context={}): valid_fields = getattr(view, "ordering_fields", self.ordering_fields) if valid_fields is None: return self.get_default_valid_fields(queryset, view, context) elif valid_fields == "__all__": # View explicitly allows filtering on all model fields. if not queryset.query.models: raise ImproperlyConfigured( "Cannot use %s with '__all__' as 'ordering_fields' attribute on a view " "which has no 'index_models' set. Either specify some 'ordering_fields', " "set the 'index_models' attribute or override the 'get_queryset' " "method and pass some 'index_models'." 
% self.__class__.__name__) model_fields = map(lambda model: [(field.name, field.verbose_name) for field in model._meta.fields], queryset.query.models) valid_fields = list(set(reduce(operator.concat, model_fields))) else: valid_fields = [ (item, item) if isinstance(item, six.string_types) else item for item in valid_fields ] return valid_fields ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/generics.py0000664000175000017500000000775714476342707017235 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import six from django.contrib.contenttypes.models import ContentType from django.http import Http404 from haystack.backends import SQ from haystack.query import SearchQuerySet from rest_framework.generics import GenericAPIView from drf_haystack.filters import HaystackFilter class HaystackGenericAPIView(GenericAPIView): """ Base class for all haystack generic views. """ # Use `index_models` to filter on which search index models we # should include in the search result. index_models = [] object_class = SearchQuerySet query_object = SQ # Override document_uid_field with whatever field in your index # you use to uniquely identify a single document. This value will be # used wherever the view references the `lookup_field` kwarg. document_uid_field = "id" lookup_sep = "," # If set to False, DB lookups are done on a per-object basis, # resulting in in many individual trips to the database. If True, # the SearchQuerySet will group similar objects into a single query. load_all = False filter_backends = [HaystackFilter] def get_queryset(self, index_models=[]): """ Get the list of items for this view. Returns ``self.queryset`` if defined and is a ``self.object_class`` instance. @:param index_models: override `self.index_models` """ if self.queryset is not None and isinstance(self.queryset, self.object_class): queryset = self.queryset.all() else: queryset = self.object_class()._clone() if len(index_models): queryset = queryset.models(*index_models) elif len(self.index_models): queryset = queryset.models(*self.index_models) return queryset def get_object(self): """ Fetch a single document from the data store according to whatever unique identifier is available for that document in the SearchIndex. In cases where the view has multiple ``index_models``, add a ``model`` query parameter containing a single `app_label.model` name to the request in order to override which model to include in the SearchQuerySet. Example: /api/v1/search/42/?model=myapp.person """ queryset = self.get_queryset() if "model" in self.request.query_params: try: app_label, model = map(six.text_type.lower, self.request.query_params["model"].split(".", 1)) ctype = ContentType.objects.get(app_label=app_label, model=model) queryset = self.get_queryset(index_models=[ctype.model_class()]) except (ValueError, ContentType.DoesNotExist): raise Http404("Could not find any models matching '%s'. Make sure to use a valid " "'app_label.model' name for the 'model' query parameter." % self.request.query_params["model"]) lookup_url_kwarg = self.lookup_url_kwarg or self.lookup_field if lookup_url_kwarg not in self.kwargs: raise AttributeError( "Expected view %s to be called with a URL keyword argument " "named '%s'. Fix your URL conf, or set the `.lookup_field` " "attribute on the view correctly." 
% (self.__class__.__name__, lookup_url_kwarg) ) queryset = queryset.filter(self.query_object((self.document_uid_field, self.kwargs[lookup_url_kwarg]))) count = queryset.count() if count == 1: return queryset[0] elif count > 1: raise Http404("Multiple results matches the given query. Expected a single result.") raise Http404("No result matches the given query.") def filter_queryset(self, queryset): queryset = super(HaystackGenericAPIView, self).filter_queryset(queryset) if self.load_all: queryset = queryset.load_all() return queryset ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/mixins.py0000664000175000017500000001062414476342707016730 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.decorators import action from rest_framework.response import Response from drf_haystack.filters import HaystackFacetFilter class MoreLikeThisMixin(object): """ Mixin class for supporting "more like this" on an API View. """ @action(detail=True, methods=["get"], url_path="more-like-this") def more_like_this(self, request, pk=None): """ Sets up a detail route for ``more-like-this`` results. Note that you'll need backend support in order to take advantage of this. This will add ie. ^search/{pk}/more-like-this/$ to your existing ^search pattern. """ obj = self.get_object().object queryset = self.filter_queryset(self.get_queryset()).more_like_this(obj) page = self.paginate_queryset(queryset) if page is not None: serializer = self.get_serializer(page, many=True) return self.get_paginated_response(serializer.data) serializer = self.get_serializer(queryset, many=True) return Response(serializer.data) class FacetMixin(object): """ Mixin class for supporting faceting on an API View. """ facet_filter_backends = [HaystackFacetFilter] facet_serializer_class = None facet_objects_serializer_class = None facet_query_params_text = 'selected_facets' @action(detail=False, methods=["get"], url_path="facets") def facets(self, request): """ Sets up a list route for ``faceted`` results. This will add ie ^search/facets/$ to your existing ^search pattern. """ queryset = self.filter_facet_queryset(self.get_queryset()) for facet in request.query_params.getlist(self.facet_query_params_text): if ":" not in facet: continue field, value = facet.split(":", 1) if value: queryset = queryset.narrow('%s:"%s"' % (field, queryset.query.clean(value))) serializer = self.get_facet_serializer(queryset.facet_counts(), objects=queryset, many=False) return Response(serializer.data) def filter_facet_queryset(self, queryset): """ Given a search queryset, filter it with whichever facet filter backends in use. """ for backend in list(self.facet_filter_backends): queryset = backend().filter_queryset(self.request, queryset, self) if self.load_all: queryset = queryset.load_all() return queryset def get_facet_serializer(self, *args, **kwargs): """ Return the facet serializer instance that should be used for serializing faceted output. """ assert "objects" in kwargs, "`objects` is a required argument to `get_facet_serializer()`" facet_serializer_class = self.get_facet_serializer_class() kwargs["context"] = self.get_serializer_context() kwargs["context"].update({ "objects": kwargs.pop("objects"), "facet_query_params_text": self.facet_query_params_text, }) return facet_serializer_class(*args, **kwargs) def get_facet_serializer_class(self): """ Return the class to use for serializing facets. 
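        A minimal sketch of wiring this up on a view (``PersonIndex``, ``Person``
        and ``PersonSearchSerializer`` are assumed to exist in your project and
        are not part of this module)::

            from drf_haystack.mixins import FacetMixin
            from drf_haystack.serializers import HaystackFacetSerializer
            from drf_haystack.viewsets import HaystackViewSet

            class PersonFacetSerializer(HaystackFacetSerializer):
                serialize_objects = True

                class Meta:
                    index_classes = [PersonIndex]
                    fields = ["firstname", "lastname"]
                    field_options = {"firstname": {}, "lastname": {}}

            class PersonSearchViewSet(FacetMixin, HaystackViewSet):
                index_models = [Person]
                serializer_class = PersonSearchSerializer
                facet_serializer_class = PersonFacetSerializer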
Defaults to using ``self.facet_serializer_class``. """ if self.facet_serializer_class is None: raise AttributeError( "%(cls)s should either include a `facet_serializer_class` attribute, " "or override %(cls)s.get_facet_serializer_class() method." % {"cls": self.__class__.__name__} ) return self.facet_serializer_class def get_facet_objects_serializer(self, *args, **kwargs): """ Return the serializer instance which should be used for serializing faceted objects. """ facet_objects_serializer_class = self.get_facet_objects_serializer_class() kwargs["context"] = self.get_serializer_context() return facet_objects_serializer_class(*args, **kwargs) def get_facet_objects_serializer_class(self): """ Return the class to use for serializing faceted objects. Defaults to using the views ``self.serializer_class`` if not ``self.facet_objects_serializer_class`` is set. """ return self.facet_objects_serializer_class or super(FacetMixin, self).get_serializer_class() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/query.py0000664000175000017500000003151114476342707016564 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import operator import six import warnings from itertools import chain from six.moves import zip from dateutil import parser from drf_haystack import constants from drf_haystack.utils import merge_dict class BaseQueryBuilder(object): """ Query builder base class. """ def __init__(self, backend, view): self.backend = backend self.view = view def build_query(self, **filters): """ :param dict[str, list[str]] filters: is an expanded QueryDict or a mapping of keys to a list of parameters. """ raise NotImplementedError("You should override this method in subclasses.") @staticmethod def tokenize(stream, separator): """ Tokenize and yield query parameter values. :param stream: Input value :param separator: Character to use to separate the tokens. :return: """ for value in stream: for token in value.split(separator): if token: yield token.strip() class BoostQueryBuilder(BaseQueryBuilder): """ Query builder class for adding boost to queries. """ def build_query(self, **filters): applicable_filters = None query_param = getattr(self.backend, "query_param", None) value = filters.pop(query_param, None) if value: try: term, val = chain.from_iterable(zip(self.tokenize(value, self.view.lookup_sep))) except ValueError: raise ValueError("Cannot convert the '%s' query parameter to a valid boost filter." % query_param) else: try: applicable_filters = {"term": term, "boost": float(val)} except ValueError: raise ValueError("Cannot convert boost to float value. Make sure to provide a " "numerical boost value.") return applicable_filters class FilterQueryBuilder(BaseQueryBuilder): """ Query builder class suitable for doing basic filtering. """ def __init__(self, backend, view): super(FilterQueryBuilder, self).__init__(backend, view) assert getattr(self.backend, "default_operator", None) in (operator.and_, operator.or_), ( "%(cls)s.default_operator must be either 'operator.and_' or 'operator.or_'." % { "cls": self.backend.__class__.__name__ }) self.default_operator = self.backend.default_operator self.default_same_param_operator = getattr(self.backend, "default_same_param_operator", self.default_operator) def get_same_param_operator(self, param): """ Helper method to allow per param configuration of which operator should be used when multiple filters for the same param are found. 
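        A sketch of forcing AND semantics for a single parameter (the ``tags``
        parameter name is only illustrative)::

            import operator

            from drf_haystack.filters import HaystackFilter
            from drf_haystack.query import FilterQueryBuilder

            class StrictTagsQueryBuilder(FilterQueryBuilder):
                def get_same_param_operator(self, param):
                    # AND repeated values of ?tags=...&tags=... together
                    if param == "tags":
                        return operator.and_
                    return super(StrictTagsQueryBuilder, self).get_same_param_operator(param)

            class StrictTagsFilter(HaystackFilter):
                query_builder_class = StrictTagsQueryBuilder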
:param str param: is the param for which you want to get the operator :return: Either operator.or_ or operator.and_ """ return self.default_same_param_operator def build_query(self, **filters): """ Creates a single SQ filter from querystring parameters that correspond to the SearchIndex fields that have been "registered" in `view.fields`. Default behavior is to `OR` terms for the same parameters, and `AND` between parameters. Any querystring parameters that are not registered in `view.fields` will be ignored. :param dict[str, list[str]] filters: is an expanded QueryDict or a mapping of keys to a list of parameters. """ applicable_filters = [] applicable_exclusions = [] for param, value in filters.items(): excluding_term = False param_parts = param.split("__") base_param = param_parts[0] # only test against field without lookup negation_keyword = constants.DRF_HAYSTACK_NEGATION_KEYWORD if len(param_parts) > 1 and param_parts[1] == negation_keyword: excluding_term = True param = param.replace("__%s" % negation_keyword, "") # haystack wouldn't understand our negation if self.view.serializer_class: if hasattr(self.view.serializer_class.Meta, 'field_aliases'): old_base = base_param base_param = self.view.serializer_class.Meta.field_aliases.get(base_param, base_param) param = param.replace(old_base, base_param) # need to replace the alias fields = getattr(self.view.serializer_class.Meta, 'fields', []) exclude = getattr(self.view.serializer_class.Meta, 'exclude', []) search_fields = getattr(self.view.serializer_class.Meta, 'search_fields', []) # Skip if the parameter is not listed in the serializer's `fields` # or if it's in the `exclude` list. if ((fields or search_fields) and base_param not in chain(fields, search_fields)) or base_param in exclude or not value: continue param_queries = [] if len(param_parts) > 1 and param_parts[-1] in ('in', 'range'): # `in` and `range` filters expects a list of values param_queries.append(self.view.query_object((param, list(self.tokenize(value, self.view.lookup_sep))))) else: for token in self.tokenize(value, self.view.lookup_sep): param_queries.append(self.view.query_object((param, token))) param_queries = [pq for pq in param_queries if pq] if len(param_queries) > 0: term = six.moves.reduce(self.get_same_param_operator(param), param_queries) if excluding_term: applicable_exclusions.append(term) else: applicable_filters.append(term) applicable_filters = six.moves.reduce( self.default_operator, filter(lambda x: x, applicable_filters)) if applicable_filters else [] applicable_exclusions = six.moves.reduce( self.default_operator, filter(lambda x: x, applicable_exclusions)) if applicable_exclusions else [] return applicable_filters, applicable_exclusions class FacetQueryBuilder(BaseQueryBuilder): """ Query builder class suitable for constructing faceted queries. """ def build_query(self, **filters): """ Creates a dict of dictionaries suitable for passing to the SearchQuerySet `facet`, `date_facet` or `query_facet` method. All key word arguments should be wrapped in a list. :param view: API View :param dict[str, list[str]] filters: is an expanded QueryDict or a mapping of keys to a list of parameters. """ field_facets = {} date_facets = {} query_facets = {} facet_serializer_cls = self.view.get_facet_serializer_class() if self.view.lookup_sep == ":": raise AttributeError("The %(cls)s.lookup_sep attribute conflicts with the HaystackFacetFilter " "query parameter parser. Please choose another `lookup_sep` attribute " "for %(cls)s." 
% {"cls": self.view.__class__.__name__}) fields = facet_serializer_cls.Meta.fields exclude = facet_serializer_cls.Meta.exclude field_options = facet_serializer_cls.Meta.field_options for field, options in filters.items(): if field not in fields or field in exclude: continue field_options = merge_dict(field_options, {field: self.parse_field_options(self.view.lookup_sep, *options)}) valid_gap = ("year", "month", "day", "hour", "minute", "second") for field, options in field_options.items(): if any([k in options for k in ("start_date", "end_date", "gap_by", "gap_amount")]): if not all(("start_date", "end_date", "gap_by" in options)): raise ValueError("Date faceting requires at least 'start_date', 'end_date' " "and 'gap_by' to be set.") if not options["gap_by"] in valid_gap: raise ValueError("The 'gap_by' parameter must be one of %s." % ", ".join(valid_gap)) options.setdefault("gap_amount", 1) date_facets[field] = field_options[field] else: field_facets[field] = field_options[field] return { "date_facets": date_facets, "field_facets": field_facets, "query_facets": query_facets } def parse_field_options(self, *options): """ Parse the field options query string and return it as a dictionary. """ defaults = {} for option in options: if isinstance(option, six.text_type): tokens = [token.strip() for token in option.split(self.view.lookup_sep)] for token in tokens: if not len(token.split(":")) == 2: warnings.warn("The %s token is not properly formatted. Tokens need to be " "formatted as 'token:value' pairs." % token) continue param, value = token.split(":", 1) if any([k == param for k in ("start_date", "end_date", "gap_amount")]): if param in ("start_date", "end_date"): value = parser.parse(value) if param == "gap_amount": value = int(value) defaults[param] = value return defaults class SpatialQueryBuilder(BaseQueryBuilder): """ Query builder class suitable for construction spatial queries. """ def __init__(self, backend, view): super(SpatialQueryBuilder, self).__init__(backend, view) assert getattr(self.backend, "point_field", None) is not None, ( "%(cls)s.point_field cannot be None. Set the %(cls)s.point_field " "to the name of the `LocationField` you want to filter on your index class." % { "cls": self.backend.__class__.__name__ }) try: from haystack.utils.geo import D, Point self.D = D self.Point = Point except ImportError: warnings.warn("Make sure you've installed the `libgeos` library. " "Run `apt-get install libgeos` on debian based linux systems, " "or `brew install geos` on OS X.") raise def build_query(self, **filters): """ Build queries for geo spatial filtering. Expected query parameters are: - a `unit=value` parameter where the unit is a valid UNIT in the `django.contrib.gis.measure.Distance` class. - `from` which must be a comma separated latitude and longitude. Example query: /api/v1/search/?km=10&from=59.744076,10.152045 Will perform a `dwithin` query within 10 km from the point with latitude 59.744076 and longitude 10.152045. """ applicable_filters = None filters = dict((k, filters[k]) for k in chain(self.D.UNITS.keys(), [constants.DRF_HAYSTACK_SPATIAL_QUERY_PARAM]) if k in filters) distance = dict((k, v) for k, v in filters.items() if k in self.D.UNITS.keys()) try: latitude, longitude = map(float, self.tokenize(filters[constants.DRF_HAYSTACK_SPATIAL_QUERY_PARAM], self.view.lookup_sep)) point = self.Point(longitude, latitude, srid=constants.GEO_SRID) except ValueError: raise ValueError("Cannot convert `from=latitude,longitude` query parameter to " "float values. 
Make sure to provide numerical values only!") except KeyError: # If the user has not provided any `from` query string parameter, # just return. pass else: for unit in distance.keys(): if not len(distance[unit]) == 1: raise ValueError("Each unit must have exactly one value.") distance[unit] = float(distance[unit][0]) if point and distance: applicable_filters = { "dwithin": { "field": self.backend.point_field, "point": point, "distance": self.D(**distance) }, "distance": { "field": self.backend.point_field, "point": point } } return applicable_filters ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/serializers.py0000664000175000017500000004472414476342707017765 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import copy import six import warnings from itertools import chain from datetime import datetime try: from collections import OrderedDict except ImportError: from django.utils.datastructures import SortedDict as OrderedDict from django.core.exceptions import ImproperlyConfigured, FieldDoesNotExist from haystack import fields as haystack_fields from haystack.query import EmptySearchQuerySet from haystack.utils.highlighting import Highlighter from rest_framework import serializers from rest_framework.fields import empty from rest_framework.utils.field_mapping import ClassLookupDict, get_field_kwargs from drf_haystack.fields import ( HaystackBooleanField, HaystackCharField, HaystackDateField, HaystackDateTimeField, HaystackDecimalField, HaystackFloatField, HaystackIntegerField, HaystackMultiValueField, FacetDictField, FacetListField ) class Meta(type): """ Template for the HaystackSerializerMeta.Meta class. """ fields = tuple() exclude = tuple() search_fields = tuple() index_classes = tuple() serializers = tuple() ignore_fields = tuple() field_aliases = {} field_options = {} index_aliases = {} def __new__(mcs, name, bases, attrs): cls = super(Meta, mcs).__new__(mcs, str(name), bases, attrs) if cls.fields and cls.exclude: raise ImproperlyConfigured("%s cannot define both 'fields' and 'exclude'." % name) return cls def __setattr__(cls, key, value): raise AttributeError("Meta class is immutable.") def __delattr__(cls, key, value): raise AttributeError("Meta class is immutable.") class HaystackSerializerMeta(serializers.SerializerMetaclass): """ Metaclass for the HaystackSerializer that ensures that all declared subclasses implemented a Meta. """ def __new__(mcs, name, bases, attrs): attrs.setdefault("_abstract", False) cls = super(HaystackSerializerMeta, mcs).__new__(mcs, str(name), bases, attrs) if getattr(cls, "Meta", None): cls.Meta = Meta("Meta", (Meta,), dict(cls.Meta.__dict__)) elif not cls._abstract: raise ImproperlyConfigured("%s must implement a Meta class or have the property _abstract" % name) return cls class HaystackSerializer(six.with_metaclass(HaystackSerializerMeta, serializers.Serializer)): """ A `HaystackSerializer` which populates fields based on which models that are available in the SearchQueryset. 
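    When aggregating results from several indexes, a ``serializers`` mapping of
    index class to serializer class may be declared instead of ``index_classes``
    (a sketch; the index and serializer names are assumed)::

        class AggregateSearchSerializer(HaystackSerializer):
            class Meta:
                serializers = {
                    PersonIndex: PersonSearchSerializer,
                    CompanyIndex: CompanySearchSerializer,
                }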
""" _abstract = True _field_mapping = ClassLookupDict({ haystack_fields.BooleanField: HaystackBooleanField, haystack_fields.CharField: HaystackCharField, haystack_fields.DateField: HaystackDateField, haystack_fields.DateTimeField: HaystackDateTimeField, haystack_fields.DecimalField: HaystackDecimalField, haystack_fields.EdgeNgramField: HaystackCharField, haystack_fields.FacetBooleanField: HaystackBooleanField, haystack_fields.FacetCharField: HaystackCharField, haystack_fields.FacetDateField: HaystackDateField, haystack_fields.FacetDateTimeField: HaystackDateTimeField, haystack_fields.FacetDecimalField: HaystackDecimalField, haystack_fields.FacetFloatField: HaystackFloatField, haystack_fields.FacetIntegerField: HaystackIntegerField, haystack_fields.FacetMultiValueField: HaystackMultiValueField, haystack_fields.FloatField: HaystackFloatField, haystack_fields.IntegerField: HaystackIntegerField, haystack_fields.LocationField: HaystackCharField, haystack_fields.MultiValueField: HaystackMultiValueField, haystack_fields.NgramField: HaystackCharField, }) def __init__(self, instance=None, data=empty, **kwargs): super(HaystackSerializer, self).__init__(instance, data, **kwargs) if not self.Meta.index_classes and not self.Meta.serializers: raise ImproperlyConfigured("You must set either the 'index_classes' or 'serializers' " "attribute on the serializer Meta class.") if not self.instance: self.instance = EmptySearchQuerySet() @staticmethod def _get_default_field_kwargs(model, field): """ Get the required attributes from the model field in order to instantiate a REST Framework serializer field. """ kwargs = {} try: field_name = field.model_attr or field.index_fieldname model_field = model._meta.get_field(field_name) kwargs.update(get_field_kwargs(field_name, model_field)) # Remove stuff we don't care about! delete_attrs = [ "allow_blank", "choices", "model_field", "allow_unicode", ] for attr in delete_attrs: if attr in kwargs: del kwargs[attr] except FieldDoesNotExist: pass return kwargs def _get_index_field(self, field_name): """ Returns the correct index field. """ return field_name def _get_index_class_name(self, index_cls): """ Converts in index model class to a name suitable for use as a field name prefix. A user may optionally specify custom aliases via an 'index_aliases' attribute on the Meta class """ cls_name = index_cls.__name__ aliases = self.Meta.index_aliases return aliases.get(cls_name, cls_name.split('.')[-1]) def get_fields(self): """ Get the required fields for serializing the result. """ fields = self.Meta.fields exclude = self.Meta.exclude ignore_fields = self.Meta.ignore_fields indices = self.Meta.index_classes declared_fields = copy.deepcopy(self._declared_fields) prefix_field_names = len(indices) > 1 field_mapping = OrderedDict() # overlapping fields on multiple indices is supported by internally prefixing the field # names with the index class to which they belong or, optionally, a user-provided alias # for the index. 
for index_cls in self.Meta.index_classes: prefix = "" if prefix_field_names: prefix = "_%s__" % self._get_index_class_name(index_cls) for field_name, field_type in six.iteritems(index_cls.fields): orig_name = field_name field_name = "%s%s" % (prefix, field_name) # Don't use this field if it is in `ignore_fields` if orig_name in ignore_fields or field_name in ignore_fields: continue # When fields to include are decided by `exclude` if exclude: if orig_name in exclude or field_name in exclude: continue # When fields to include are decided by `fields` if fields: if orig_name not in fields and field_name not in fields: continue # Look up the field attributes on the current index model, # in order to correctly instantiate the serializer field. model = index_cls().get_model() kwargs = self._get_default_field_kwargs(model, field_type) kwargs['prefix_field_names'] = prefix_field_names field_mapping[field_name] = self._field_mapping[field_type](**kwargs) # Add any explicitly declared fields. They *will* override any index fields # in case of naming collision!. if declared_fields: for field_name in declared_fields: field_mapping[field_name] = declared_fields[field_name] return field_mapping def to_representation(self, instance): """ If we have a serializer mapping, use that. Otherwise, use standard serializer behavior Since we might be dealing with multiple indexes, some fields might not be valid for all results. Do not render the fields which don't belong to the search result. """ if self.Meta.serializers: ret = self.multi_serializer_representation(instance) else: ret = super(HaystackSerializer, self).to_representation(instance) prefix_field_names = len(getattr(self.Meta, "index_classes")) > 1 current_index = self._get_index_class_name(type(instance.searchindex)) for field in self.fields.keys(): orig_field = field if prefix_field_names: parts = field.split("__") if len(parts) > 1: index = parts[0][1:] # trim the preceding '_' field = parts[1] if index == current_index: ret[field] = ret[orig_field] del ret[orig_field] elif field not in chain(instance.searchindex.fields.keys(), self._declared_fields.keys()): del ret[orig_field] # include the highlighted field in either case if getattr(instance, "highlighted", None): ret["highlighted"] = instance.highlighted[0] return ret def multi_serializer_representation(self, instance): serializers = self.Meta.serializers index = instance.searchindex serializer_class = serializers.get(type(index), None) if not serializer_class: raise ImproperlyConfigured("Could not find serializer for %s in mapping" % index) return serializer_class(context=self._context).to_representation(instance) class FacetFieldSerializer(serializers.Serializer): """ Responsible for serializing a faceted result. """ text = serializers.SerializerMethodField() count = serializers.SerializerMethodField() narrow_url = serializers.SerializerMethodField() def __init__(self, *args, **kwargs): self._parent_field = None super(FacetFieldSerializer, self).__init__(*args, **kwargs) @property def parent_field(self): return self._parent_field @parent_field.setter def parent_field(self, value): self._parent_field = value def get_paginate_by_param(self): """ Returns the ``paginate_by_param`` for the (root) view paginator class. This is needed in order to remove the query parameter from faceted narrow urls. If using a custom pagination class, this class attribute needs to be set manually. 
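        A minimal sketch, assuming the custom paginator exposes its page number
        through a ``p`` query parameter::

            class PersonFacetSerializer(HaystackFacetSerializer):
                paginate_by_param = "p"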
""" if hasattr(self.root, "paginate_by_param") and self.root.paginate_by_param: return self.root.paginate_by_param pagination_class = self.context["view"].pagination_class if not pagination_class: return None # PageNumberPagination if hasattr(pagination_class, "page_query_param"): return pagination_class.page_query_param # LimitOffsetPagination elif hasattr(pagination_class, "offset_query_param"): return pagination_class.offset_query_param # CursorPagination elif hasattr(pagination_class, "cursor_query_param"): return pagination_class.cursor_query_param else: raise AttributeError( "%(root_cls)s is missing a `paginate_by_param` attribute. " "Define a %(root_cls)s.paginate_by_param or override " "%(cls)s.get_paginate_by_param()." % { "root_cls": self.root.__class__.__name__, "cls": self.__class__.__name__ }) def get_text(self, instance): """ Haystack facets are returned as a two-tuple (value, count). The text field should contain the faceted value. """ instance = instance[0] if isinstance(instance, (six.text_type, six.string_types)): return serializers.CharField(read_only=True).to_representation(instance) elif isinstance(instance, datetime): return serializers.DateTimeField(read_only=True).to_representation(instance) return instance def get_count(self, instance): """ Haystack facets are returned as a two-tuple (value, count). The count field should contain the faceted count. """ instance = instance[1] return serializers.IntegerField(read_only=True).to_representation(instance) def get_narrow_url(self, instance): """ Return a link suitable for narrowing on the current item. """ text = instance[0] request = self.context["request"] query_params = request.GET.copy() # Never keep the page query parameter in narrowing urls. # It will raise a NotFound exception when trying to paginate a narrowed queryset. page_query_param = self.get_paginate_by_param() if page_query_param and page_query_param in query_params: del query_params[page_query_param] selected_facets = set(query_params.pop(self.root.facet_query_params_text, [])) selected_facets.add("%(field)s_exact:%(text)s" % {"field": self.parent_field, "text": text}) query_params.setlist(self.root.facet_query_params_text, sorted(selected_facets)) path = "%(path)s?%(query)s" % {"path": request.path_info, "query": query_params.urlencode()} url = request.build_absolute_uri(path) return serializers.Hyperlink(url, "narrow-url") def to_representation(self, field, instance): """ Set the ``parent_field`` property equal to the current field on the serializer class, so that each field can query it to see what kind of attribute they are processing. """ self.parent_field = field return super(FacetFieldSerializer, self).to_representation(instance) class HaystackFacetSerializer(six.with_metaclass(HaystackSerializerMeta, serializers.Serializer)): """ The ``HaystackFacetSerializer`` is used to serialize the ``facet_counts()`` dictionary results on a ``SearchQuerySet`` instance. """ _abstract = True serialize_objects = False paginate_by_param = None facet_dict_field_class = FacetDictField facet_list_field_class = FacetListField facet_field_serializer_class = FacetFieldSerializer def get_fields(self): """ This returns a dictionary containing the top most fields, ``dates``, ``fields`` and ``queries``. 
""" field_mapping = OrderedDict() for field, data in self.instance.items(): field_mapping.update( {field: self.facet_dict_field_class( child=self.facet_list_field_class(child=self.facet_field_serializer_class(data)), required=False)} ) if self.serialize_objects is True: field_mapping["objects"] = serializers.SerializerMethodField() return field_mapping def get_objects(self, instance): """ Return a list of objects matching the faceted result. """ view = self.context["view"] queryset = self.context["objects"] page = view.paginate_queryset(queryset) if page is not None: serializer = view.get_facet_objects_serializer(page, many=True) return OrderedDict([ ("count", self.get_count(queryset)), ("next", view.paginator.get_next_link()), ("previous", view.paginator.get_previous_link()), ("results", serializer.data) ]) serializer = view.get_serializer(queryset, many=True) return serializer.data def get_count(self, queryset): """ Determine an object count, supporting either querysets or regular lists. """ try: return queryset.count() except (AttributeError, TypeError): return len(queryset) @property def facet_query_params_text(self): return self.context["facet_query_params_text"] class HaystackSerializerMixin(object): """ This mixin can be added to a serializer to use the actual object as the data source for serialization rather than the data stored in the search index fields. This makes it easy to return data from search results in the same format as elsewhere in your API and reuse your existing serializers """ def to_representation(self, instance): obj = instance.object return super(HaystackSerializerMixin, self).to_representation(obj) class HighlighterMixin(object): """ This mixin adds support for ``highlighting`` (the pure python, portable version, not SearchQuerySet().highlight()). See Haystack docs for more info). """ highlighter_class = Highlighter highlighter_css_class = "highlighted" highlighter_html_tag = "span" highlighter_max_length = 200 highlighter_field = None def get_highlighter(self): if not self.highlighter_class: raise ImproperlyConfigured( "%(cls)s is missing a highlighter_class. Define %(cls)s.highlighter_class, " "or override %(cls)s.get_highlighter()." % {"cls": self.__class__.__name__} ) return self.highlighter_class @staticmethod def get_document_field(instance): """ Returns which field the search index has marked as it's `document=True` field. 
""" for name, field in instance.searchindex.fields.items(): if field.document is True: return name def get_terms(self, data): """ Returns the terms to be highlighted """ terms = " ".join(six.itervalues(self.context["request"].GET)) return terms def to_representation(self, instance): ret = super(HighlighterMixin, self).to_representation(instance) terms = self.get_terms(ret) if terms: highlighter = self.get_highlighter()(terms, **{ "html_tag": self.highlighter_html_tag, "css_class": self.highlighter_css_class, "max_length": self.highlighter_max_length }) document_field = self.get_document_field(instance) if highlighter and document_field: # Handle case where this data is None, but highlight expects it to be a string data_to_highlight = getattr(instance, self.highlighter_field or document_field) or '' ret["highlighted"] = highlighter.highlight(data_to_highlight) return ret ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/utils.py0000664000175000017500000000145014476342707016556 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import six from copy import deepcopy def merge_dict(a, b): """ Recursively merges and returns dict a with dict b. Any list values will be combined and returned sorted. :param a: dictionary object :param b: dictionary object :return: merged dictionary object """ if not isinstance(b, dict): return b result = deepcopy(a) for key, val in six.iteritems(b): if key in result and isinstance(result[key], dict): result[key] = merge_dict(result[key], val) elif key in result and isinstance(result[key], list): result[key] = sorted(list(set(val) | set(result[key]))) else: result[key] = deepcopy(val) return result ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/drf_haystack/viewsets.py0000664000175000017500000000100714476342707017265 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.mixins import ListModelMixin, RetrieveModelMixin from rest_framework.viewsets import ViewSetMixin from drf_haystack.generics import HaystackGenericAPIView class HaystackViewSet(RetrieveModelMixin, ListModelMixin, ViewSetMixin, HaystackGenericAPIView): """ The HaystackViewSet class provides the default ``list()`` and ``retrieve()`` actions with a haystack index as it's data source. 
""" pass ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8640983 drf-haystack-1.8.13/drf_haystack.egg-info/0000775000175000017500000000000014507202601016515 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400768.0 drf-haystack-1.8.13/drf_haystack.egg-info/PKG-INFO0000644000175000017500000000175014507202600017612 0ustar00mmmmmmMetadata-Version: 2.1 Name: drf-haystack Version: 1.8.13 Summary: Makes Haystack play nice with Django REST Framework Home-page: https://github.com/rhblind/drf-haystack Download-URL: https://github.com/rhblind/drf-haystack.git Author: Rolf Håvard Blindheim Author-email: rhblind@gmail.com License: MIT License Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Libraries :: Python Modules Requires-Python: >=3.7 License-File: LICENSE.txt Requires-Dist: Django<4.3,>=2.2 Requires-Dist: djangorestframework<=3.14,>=3.7 Requires-Dist: django-haystack<=3.2.1,>=2.8 Requires-Dist: python-dateutil Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400768.0 drf-haystack-1.8.13/drf_haystack.egg-info/SOURCES.txt0000664000175000017500000000360214507202600020401 0ustar00mmmmmmLICENSE.txt MANIFEST.in Pipfile README.md setup.cfg setup.py tox.ini docs/01_intro.rst docs/02_autocomplete.rst docs/03_geospatial.rst docs/04_highlighting.rst docs/05_more_like_this.rst docs/06_term_boost.rst docs/07_faceting.rst docs/08_permissions.rst docs/09_multiple_indexes.rst docs/10_tips_n_tricks.rst docs/Makefile docs/conf.py docs/index.rst docs/make.bat docs/apidoc/drf_haystack.rst docs/apidoc/modules.rst drf_haystack/__init__.py drf_haystack/constants.py drf_haystack/fields.py drf_haystack/filters.py drf_haystack/generics.py drf_haystack/mixins.py drf_haystack/query.py drf_haystack/serializers.py drf_haystack/utils.py drf_haystack/viewsets.py drf_haystack.egg-info/PKG-INFO drf_haystack.egg-info/SOURCES.txt drf_haystack.egg-info/dependency_links.txt drf_haystack.egg-info/not-zip-safe drf_haystack.egg-info/requires.txt drf_haystack.egg-info/top_level.txt tests/__init__.py tests/apps.py tests/constants.py tests/mixins.py tests/run_tests.py tests/settings.py tests/test_filters.py tests/test_serializers.py tests/test_utils.py tests/test_viewsets.py tests/urls.py tests/wsgi.py tests/mockapp/__init__.py tests/mockapp/admin.py tests/mockapp/models.py tests/mockapp/search_indexes.py tests/mockapp/serializers.py tests/mockapp/views.py tests/mockapp/fixtures/mockallfield.json tests/mockapp/fixtures/mocklocation.json tests/mockapp/fixtures/mockperson.json tests/mockapp/fixtures/mockpet.json tests/mockapp/migrations/0001_initial.py tests/mockapp/migrations/0002_mockperson.py tests/mockapp/migrations/0003_mockpet.py tests/mockapp/migrations/0004_load_fixtures.py tests/mockapp/migrations/0005_mockperson_birthdate.py tests/mockapp/migrations/0006_mockallfield.py tests/mockapp/migrations/__init__.py tests/mockapp/templates/search/indexes/mockapp/mocklocation_text.txt tests/mockapp/templates/search/indexes/mockapp/mockperson_text.txt 
tests/mockapp/templates/search/indexes/mockapp/mockpet_text.txt././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400768.0 drf-haystack-1.8.13/drf_haystack.egg-info/dependency_links.txt0000664000175000017500000000000114507202600022562 0ustar00mmmmmm ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694091208.0 drf-haystack-1.8.13/drf_haystack.egg-info/not-zip-safe0000664000175000017500000000000114476343710020757 0ustar00mmmmmm ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400768.0 drf-haystack-1.8.13/drf_haystack.egg-info/requires.txt0000664000175000017500000000013614507202600021114 0ustar00mmmmmmDjango<4.3,>=2.2 djangorestframework<=3.14,>=3.7 django-haystack<=3.2.1,>=2.8 python-dateutil ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400768.0 drf-haystack-1.8.13/drf_haystack.egg-info/top_level.txt0000664000175000017500000000001514507202600021242 0ustar00mmmmmmdrf_haystack ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8720984 drf-haystack-1.8.13/setup.cfg0000664000175000017500000000036614507202601014207 0ustar00mmmmmm[bdist_wheel] universal = 1 [metadata] license_file = LICENSE.txt [pep8] max-line-length = 120 exclude = docs [flake8] max-line-length = 120 exclude = docs [frosted] max-line-length = 120 exclude = docs [egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1696400347.0 drf-haystack-1.8.13/setup.py0000664000175000017500000000330614507201733014102 0ustar00mmmmmm# -*- coding: utf-8 -*- import re import os try: from setuptools import setup except ImportError: from ez_setup import use_setuptools use_setuptools() from setuptools import setup def get_version(package): """ Return package version as listed in `__version__` in `init.py`. 
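    For example, given ``__version__ = "1.8.13"`` in ``drf_haystack/__init__.py``,
    ``get_version("drf_haystack")`` returns ``"1.8.13"``.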
""" init_py = open(os.path.join(package, "__init__.py")).read() return re.search("__version__ = ['\"]([^'\"]+)['\"]", init_py).group(1) setup( name="drf-haystack", version=get_version("drf_haystack"), description="Makes Haystack play nice with Django REST Framework", long_description="Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack.", author="Rolf Håvard Blindheim", author_email="rhblind@gmail.com", url="https://github.com/rhblind/drf-haystack", download_url="https://github.com/rhblind/drf-haystack.git", license="MIT License", packages=[ "drf_haystack", ], include_package_data=True, install_requires=[ "Django>=2.2,<4.3", "djangorestframework>=3.7,<=3.14", "django-haystack>=2.8,<=3.2.1", "python-dateutil" ], tests_require=[ "nose", "coverage" ], zip_safe=False, test_suite="tests.run_tests.start", classifiers=[ "Operating System :: OS Independent", "Development Status :: 5 - Production/Stable", "Environment :: Web Environment", "Framework :: Django", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 3", "Topic :: Software Development :: Libraries :: Python Modules", ], python_requires=">=3.7" ) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8680983 drf-haystack-1.8.13/tests/0000775000175000017500000000000014507202601013523 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/__init__.py0000664000175000017500000000226514476342707015662 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os from django.core.exceptions import ImproperlyConfigured test_runner = None old_config = None os.environ["DJANGO_SETTINGS_MODULE"] = "tests.settings" import django if hasattr(django, "setup"): django.setup() def _geospatial_support(): try: import geopy from haystack.utils.geo import Point except (ImportError, ImproperlyConfigured): return False else: return True geospatial_support = _geospatial_support() def _restframework_version(): import rest_framework return tuple(map(int, rest_framework.VERSION.split("."))) restframework_version = _restframework_version() def _elasticsearch_version(): import elasticsearch return elasticsearch.VERSION elasticsearch_version = _elasticsearch_version() def setup(): from django.test.runner import DiscoverRunner global test_runner global old_config test_runner = DiscoverRunner() test_runner.setup_test_environment() old_config = test_runner.setup_databases() def teardown(): test_runner.teardown_databases(old_config) test_runner.teardown_test_environment() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/apps.py0000664000175000017500000000023114476342707015055 0ustar00mmmmmm# -*- coding: utf-8 -*- from django.apps import AppConfig class MockappConfig(AppConfig): name = "mockapp" verbose_name = "Mock Application" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/constants.py0000664000175000017500000000101414476342707016126 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os import json from django.conf import settings with open(os.path.join(settings.BASE_DIR, "mockapp", "fixtures", "mocklocation.json"), "r") as f: mocklocation_size = len(json.loads(f.read())) MOCKLOCATION_DATA_SET_SIZE = mocklocation_size with 
open(os.path.join(settings.BASE_DIR, "mockapp", "fixtures", "mockperson.json"), "r") as f: mockperson_size = len(json.loads(f.read())) MOCKPERSON_DATA_SET_SIZE = mockperson_size ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mixins.py0000664000175000017500000000076014476342707015430 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import warnings class WarningTestCaseMixin(object): """ TestCase mixin to catch warnings """ def assertWarning(self, warning, callable, *args, **kwargs): with warnings.catch_warnings(record=True) as warning_list: warnings.simplefilter(action="always") callable(*args, **kwargs) self.assertTrue(any(item.category == warning for item in warning_list)) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8680983 drf-haystack-1.8.13/tests/mockapp/0000775000175000017500000000000014507202601015155 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/__init__.py0000664000175000017500000000000014476342707017275 0ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/admin.py0000664000175000017500000000007714476342707016644 0ustar00mmmmmmfrom django.contrib import admin # Register your models here. ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8680983 drf-haystack-1.8.13/tests/mockapp/fixtures/0000775000175000017500000000000014507202601017026 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/fixtures/mockallfield.json0000664000175000017500000000405314476342707022372 0ustar00mmmmmm[ { "model": "mockapp.mockallfield", "fields": { "charfield": "Shauna Robinson", "integerfield": 674, "floatfield": 756.1749, "decimalfield": "21.69", "boolfield": true }, "pk": 1 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Zimmerman Warren", "integerfield": 379, "floatfield": -386.574, "decimalfield": "44.87", "boolfield": true }, "pk": 2 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Juliana Cooper", "integerfield": 357, "floatfield": 666.5266, "decimalfield": "-57.46", "boolfield": false }, "pk": 3 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Wiggins Mccormick", "integerfield": 441, "floatfield": 216.1145, "decimalfield": "192.59", "boolfield": false }, "pk": 4 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Rebekah Mclaughlin", "integerfield": 480, "floatfield": -976.415, "decimalfield": "21.86", "boolfield": true }, "pk": 5 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Karla Hawkins", "integerfield": 153, "floatfield": -506.917, "decimalfield": "199.86", "boolfield": false }, "pk": 6 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Shelly Calhoun", "integerfield": 528, "floatfield": 88.5078, "decimalfield": "-21.48", "boolfield": false }, "pk": 7 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Miranda Collins", "integerfield": 57, "floatfield": 236.1656, "decimalfield": "-74.10", "boolfield": false }, "pk": 8 }, { "model": "mockapp.mockallfield", "fields": { "charfield": "Kathryn Oliver", "integerfield": 48, "floatfield": 551.5019, "decimalfield": "-49.64", "boolfield": true }, "pk": 9 } ] 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/fixtures/mocklocation.json0000664000175000017500000007355614476342707022444 0ustar00mmmmmm[ { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.87566954010669, "address": "Andersenhagen 8", "zip_code": "0289", "longitude": 10.878157588853062, "updated": "2015-02-13T21:39:42.245Z", "city": "Oslo" }, "pk": 1 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.869983759842185, "address": "Edvardsen\u00e5sen 6", "zip_code": "0202", "longitude": 10.759558525204776, "updated": "2015-02-13T21:39:42.245Z", "city": "Oslo" }, "pk": 2 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.844410359198534, "address": "Gundersenholtet 68", "zip_code": "0215", "longitude": 10.860611682103695, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 3 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.98769224959462, "address": "Solbergjordet 29", "zip_code": "0254", "longitude": 10.73730212224833, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 4 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.879738785452595, "address": "Simonsenstien 60", "zip_code": "0102", "longitude": 10.875848906700593, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 5 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.90536711350938, "address": "Olsenstykket 0", "zip_code": "0212", "longitude": 10.91876267185026, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 6 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.9583137867979, "address": "Nilsenstubben 9", "zip_code": "0213", "longitude": 10.879477132609468, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 7 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.8122568752108, "address": "Fredriksenstranda 2", "zip_code": "0128", "longitude": 10.858514787949423, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 8 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.874409235929434, "address": "Johannessengrenda 06", "zip_code": "0238", "longitude": 10.723069271842252, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 9 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.90183752568133, "address": "R\u00f8nninggrenda 2", "zip_code": "0119", "longitude": 10.793385350548396, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 10 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.948077110460844, "address": "Bergeall\u00e9en 73", "zip_code": "0271", "longitude": 10.766157403874733, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 11 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 60.06698731196217, "address": "Holmtoppen 73", "zip_code": "0143", "longitude": 10.77073988514214, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 12 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", 
"latitude": 59.826546236660405, "address": "Nilsenholtet 3", "zip_code": "0261", "longitude": 10.753602675761355, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 13 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 60.002705305741614, "address": "Simonsenroa 3", "zip_code": "0241", "longitude": 10.82821194011631, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 14 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.88006926289976, "address": "Ahmedflata 0", "zip_code": "0269", "longitude": 10.804450883729185, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 15 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 60.07785179851334, "address": "Andreasseneggen 68", "zip_code": "0194", "longitude": 10.8169080528998, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 16 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.83607620356919, "address": "Jensenroa 00", "zip_code": "0128", "longitude": 10.78535037196415, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 17 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.92658986005112, "address": "Tangenstykket 79", "zip_code": "0145", "longitude": 10.740875983170062, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 18 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.84403819590268, "address": "J\u00f8rgensenl\u00f8kka 98", "zip_code": "0124", "longitude": 10.768651841870405, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 19 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.829511912876946, "address": "Myklebustvollen 94", "zip_code": "0215", "longitude": 10.877066582376077, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 20 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 60.02706647066823, "address": "Hauglandr\u00f8a 6", "zip_code": "0137", "longitude": 10.6415654062034, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 21 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.985606573966486, "address": "Pettersenstien 1", "zip_code": "0185", "longitude": 10.74427374271031, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 22 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.978323278289515, "address": "Haugenlyngen 00", "zip_code": "0168", "longitude": 10.767461131389409, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 23 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.864337993875836, "address": "Andersentoppen 44", "zip_code": "0178", "longitude": 10.664264362593675, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 24 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.81819275751251, "address": "Pettersensvingen 7", "zip_code": "0240", "longitude": 10.843381478117784, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 25 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.94413644053888, 
"address": "Andresenringen 65", "zip_code": "0214", "longitude": 10.936694075863995, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 26 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.89692955711872, "address": "Hauger\u00f8a 1", "zip_code": "0282", "longitude": 10.707543212194091, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 27 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.83062460411829, "address": "Abrahamsenhagen 8", "zip_code": "0221", "longitude": 10.676561970075774, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 28 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.98915872890549, "address": "Rasmussengropa 8", "zip_code": "0119", "longitude": 10.589869453283573, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 29 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.08927375077344, "address": "Myklebustgrenda 0", "zip_code": "0232", "longitude": 10.657923054883481, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 30 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.98446663030189, "address": "Nguyenengen 7", "zip_code": "0170", "longitude": 10.845671559789183, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 31 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.94018500082587, "address": "Tveithagen 97", "zip_code": "0218", "longitude": 10.67927891526063, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 32 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.001933056984385, "address": "S\u00e6therholtet 7", "zip_code": "0251", "longitude": 10.902943242976246, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 33 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.99064929424015, "address": "Aasenh\u00f8gda 7", "zip_code": "0187", "longitude": 10.698916574006395, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 34 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95239176850915, "address": "Haugeringen 60", "zip_code": "0258", "longitude": 10.8108034171511, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 35 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95047538076824, "address": "Isaksenfaret 0", "zip_code": "0285", "longitude": 10.702264455322045, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 36 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.87152653174937, "address": "Aaseggen 9", "zip_code": "0185", "longitude": 10.601282479870813, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 37 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.75664623062497, "address": "Hanssenmoen 40", "zip_code": "0181", "longitude": 10.695768093659206, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 38 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.89882906294937, "address": "Sandvikstien 40", "zip_code": "0156", 
"longitude": 10.706022843738628, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 39 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95190215981165, "address": "J\u00f8rgensenall\u00e9en 0", "zip_code": "0151", "longitude": 10.845651584873082, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 40 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.063223660182516, "address": "Johannessentunet 2", "zip_code": "0226", "longitude": 10.78422680396959, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 41 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.92306173399562, "address": "Aunelyngen 7", "zip_code": "0100", "longitude": 10.662778259175045, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 42 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.84634507117154, "address": "Ahmedroa 2", "zip_code": "0269", "longitude": 10.693610021735212, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 43 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.89429607822404, "address": "Nilsenekra 1", "zip_code": "0175", "longitude": 10.733483224706205, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 44 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.99164686399774, "address": "Holmvollen 4", "zip_code": "0160", "longitude": 10.759823276550938, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 45 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9126155758038, "address": "Eideall\u00e9en 79", "zip_code": "0283", "longitude": 10.596025893278515, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 46 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.94908089577484, "address": "J\u00f8rgensenlia 2", "zip_code": "0198", "longitude": 10.918990673477044, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 47 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.982510362064396, "address": "Johnsenh\u00f8gda 4", "zip_code": "0199", "longitude": 10.826963074336764, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 48 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.97734865198299, "address": "Moeflata 28", "zip_code": "0125", "longitude": 10.723849656760295, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 49 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 60.00059654366635, "address": "Ahmedlia 61", "zip_code": "0135", "longitude": 10.908539294534814, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 50 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.970932832038336, "address": "Antonsenplassen 8", "zip_code": "0229", "longitude": 10.953581147660746, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 51 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9638956777966, "address": "Rasmussenhagen 71", "zip_code": "0293", "longitude": 10.958498294466715, "updated": 
"2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 52 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.75359135925595, "address": "Johnsenflata 8", "zip_code": "0200", "longitude": 10.647355007841071, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 53 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9875252004295, "address": "N\u00e6ssgata 1", "zip_code": "0185", "longitude": 10.62897271234992, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 54 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.82344446227308, "address": "Fredriksenmyra 47", "zip_code": "0179", "longitude": 10.900930645475936, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 55 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 60.0449028505056, "address": "Hauglandstranda 60", "zip_code": "0187", "longitude": 10.700919248639527, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 56 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.895087574947716, "address": "Hellandbakken 7", "zip_code": "0266", "longitude": 10.6954241816232, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 57 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.91832235151213, "address": "Bergtoppen 51", "zip_code": "0213", "longitude": 10.783205112962843, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 58 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.870190926776665, "address": "Sandvikbr\u00e5ten 3", "zip_code": "0204", "longitude": 10.736731793637208, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 59 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.95383238850984, "address": "Lundtjernet 42", "zip_code": "0158", "longitude": 10.696440218567446, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 60 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.84776600279412, "address": "Kristoffersenbr\u00e5ten 6", "zip_code": "0265", "longitude": 10.858949104566072, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 61 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.97793872518526, "address": "Halvorsenr\u00f8a 05", "zip_code": "0114", "longitude": 10.732623595464785, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 62 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.89328159393844, "address": "Moentoppen 9", "zip_code": "0235", "longitude": 10.89958795492445, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 63 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.936203316140215, "address": "Vikstubben 0", "zip_code": "0280", "longitude": 10.804538287413278, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 64 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.91628034434794, "address": "Jensenkroken 1", "zip_code": "0274", "longitude": 10.668646013047587, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, 
"pk": 65 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.93638433102333, "address": "Danielsenl\u00f8kka 5", "zip_code": "0183", "longitude": 10.846395825794241, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 66 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 60.01146872045944, "address": "Fredriksenskogen 04", "zip_code": "0289", "longitude": 10.786193719338806, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 67 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.91742251441682, "address": "Gundersenbr\u00e5ten 4", "zip_code": "0289", "longitude": 10.813481002560126, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 68 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.85863052939531, "address": "Bakkenhagen 5", "zip_code": "0141", "longitude": 10.82213945646575, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 69 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 60.03792173804758, "address": "Hellandberget 1", "zip_code": "0130", "longitude": 10.791889868477469, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 70 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.83062929125564, "address": "Christenseneggen 7", "zip_code": "0120", "longitude": 10.907517800935755, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 71 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.98282468198254, "address": "Halvorsentjernet 4", "zip_code": "0177", "longitude": 10.759850950149907, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 72 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.926942607598704, "address": "Karlsenroa 45", "zip_code": "0278", "longitude": 10.8256581721645, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 73 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.81183021555159, "address": "Martinsenkollen 4", "zip_code": "0148", "longitude": 10.687672024340516, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 74 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 60.053001056564945, "address": "Myklebustr\u00f8a 89", "zip_code": "0273", "longitude": 10.811937383749598, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 75 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 60.01216118524038, "address": "Birkelandplassen 12", "zip_code": "0210", "longitude": 10.830278113681507, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 76 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.872862737063954, "address": "Christensenvollen 1", "zip_code": "0222", "longitude": 10.60413225834586, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 77 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.776869523305855, "address": "Bergemarka 7", "zip_code": "0109", "longitude": 10.924164479198348, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 78 }, { "model": 
"mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.87624098669839, "address": "Jacobsengrenda 6", "zip_code": "0188", "longitude": 10.706955510850085, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 79 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.909292753079754, "address": "Mikkelseneggen 6", "zip_code": "0241", "longitude": 10.699892028390964, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 80 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.92435859208912, "address": "Brekkemarka 79", "zip_code": "0204", "longitude": 10.917460845391144, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 81 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.83890918391743, "address": "Johnsenstykket 8", "zip_code": "0291", "longitude": 10.721964696792304, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 82 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.89793503164266, "address": "Andresengrenda 21", "zip_code": "0187", "longitude": 10.749307895780507, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 83 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.809240949970764, "address": "Hagentoppen 73", "zip_code": "0211", "longitude": 10.746657782878637, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 84 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.91023398739465, "address": "Lier\u00f8a 1", "zip_code": "0279", "longitude": 10.56075654122111, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 85 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.962776041928805, "address": "Antonsenbakken 1", "zip_code": "0159", "longitude": 10.741624622650262, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 86 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.955723818265724, "address": "Alikollen 07", "zip_code": "0275", "longitude": 10.747885118677889, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 87 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.85536001517987, "address": "Bergebr\u00e5ten 32", "zip_code": "0255", "longitude": 10.921264935541542, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 88 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.87014023036983, "address": "Hansenr\u00f8a 21", "zip_code": "0249", "longitude": 10.590000534675852, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 89 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.84835215092349, "address": "Hanssenhaugen 84", "zip_code": "0184", "longitude": 10.695848580676747, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 90 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.88772645155173, "address": "Nilsenbr\u00e5ten 8", "zip_code": "0256", "longitude": 10.954550411242598, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 91 }, { "model": "mockapp.mocklocation", "fields": { 
"created": "2015-02-13T21:39:42.252Z", "latitude": 60.05893819077175, "address": "Engenlunden 6", "zip_code": "0169", "longitude": 10.739877317798895, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 92 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.90718585715958, "address": "Bakkegrenda 72", "zip_code": "0217", "longitude": 10.637367253615418, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 93 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.78781948947378, "address": "Rasmusseneggen 1", "zip_code": "0132", "longitude": 10.766861073375164, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 94 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.909749850974066, "address": "Lundekroken 73", "zip_code": "0294", "longitude": 10.697207568337102, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 95 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.83822910281552, "address": "Kristoffersenvollen 16", "zip_code": "0284", "longitude": 10.888289065648543, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 96 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.93090005761206, "address": "Ellingsenvollen 1", "zip_code": "0179", "longitude": 10.568415840971076, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 97 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.923396, "address": "Waldemar Thranes gate 8", "zip_code": "0107", "longitude": 10.739370, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 98 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.928354, "address": "Waldemar Thranes gate 86", "zip_code": "0175", "longitude": 10.752732, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 99 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.926835, "address": "Uelands gate 22", "zip_code": "0175", "longitude": 10.749738, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 100 } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/fixtures/mockperson.json0000664000175000017500000005616314476342707022135 0ustar00mmmmmm[ { "model": "mockapp.mockperson", "pk": 1, "fields": { "firstname": "Abel", "lastname": "Foreman", "birthdate": "1966-06-06", "created": "2015-05-19T10:48:08.686Z", "updated": "2016-04-24T16:02:59.378Z" } }, { "model": "mockapp.mockperson", "pk": 2, "fields": { "firstname": "Dylan", "lastname": "Mckay", "birthdate": "1974-09-23", "created": "2015-05-19T10:48:08.688Z", "updated": "2016-04-24T16:02:59.380Z" } }, { "model": "mockapp.mockperson", "pk": 3, "fields": { "firstname": "Raymond", "lastname": "Mcleod", "birthdate": "1986-06-20", "created": "2015-05-19T10:48:08.689Z", "updated": "2016-04-24T16:02:59.381Z" } }, { "model": "mockapp.mockperson", "pk": 4, "fields": { "firstname": "Zane", "lastname": "Griffith", "birthdate": "2011-07-19", "created": "2015-05-19T10:48:08.690Z", "updated": "2016-04-24T16:02:59.382Z" } }, { "model": "mockapp.mockperson", "pk": 5, "fields": { "firstname": "Jeremy", "lastname": "Fowler", "birthdate": "2011-03-10", "created": 
"2015-05-19T10:48:08.691Z", "updated": "2016-04-24T16:02:59.383Z" } }, { "model": "mockapp.mockperson", "pk": 6, "fields": { "firstname": "Norman", "lastname": "Lara", "birthdate": "1991-06-13", "created": "2015-05-19T10:48:08.693Z", "updated": "2016-04-24T16:02:59.384Z" } }, { "model": "mockapp.mockperson", "pk": 7, "fields": { "firstname": "Scott", "lastname": "Barnett", "birthdate": "1969-11-10", "created": "2015-05-19T10:48:08.694Z", "updated": "2016-04-24T16:02:59.385Z" } }, { "model": "mockapp.mockperson", "pk": 8, "fields": { "firstname": "Brady", "lastname": "Gibbs", "birthdate": "2004-09-25", "created": "2015-05-19T10:48:08.695Z", "updated": "2016-04-24T16:02:59.386Z" } }, { "model": "mockapp.mockperson", "pk": 9, "fields": { "firstname": "Jesse", "lastname": "Woodward", "birthdate": "1955-05-27", "created": "2015-05-19T10:48:08.696Z", "updated": "2016-04-24T16:02:59.388Z" } }, { "model": "mockapp.mockperson", "pk": 10, "fields": { "firstname": "John", "lastname": "McClane", "birthdate": "1986-08-05", "created": "2015-05-19T10:48:08.697Z", "updated": "2016-04-24T16:02:59.389Z" } }, { "model": "mockapp.mockperson", "pk": 11, "fields": { "firstname": "John", "lastname": "McLaughlin", "birthdate": "1952-07-26", "created": "2015-05-19T10:48:08.698Z", "updated": "2016-04-24T16:02:59.389Z" } }, { "model": "mockapp.mockperson", "pk": 12, "fields": { "firstname": "Mason", "lastname": "George", "birthdate": "2002-09-05", "created": "2015-05-19T10:48:08.699Z", "updated": "2016-04-24T16:02:59.390Z" } }, { "model": "mockapp.mockperson", "pk": 13, "fields": { "firstname": "Kadeem", "lastname": "Hickman", "birthdate": "1984-08-29", "created": "2015-05-19T10:48:08.700Z", "updated": "2016-04-24T16:02:59.391Z" } }, { "model": "mockapp.mockperson", "pk": 14, "fields": { "firstname": "Prescott", "lastname": "Clemons", "birthdate": "2014-10-19", "created": "2015-05-19T10:48:08.701Z", "updated": "2016-04-24T16:02:59.392Z" } }, { "model": "mockapp.mockperson", "pk": 15, "fields": { "firstname": "Ciaran", "lastname": "Sykes", "birthdate": "1995-03-28", "created": "2015-05-19T10:48:08.702Z", "updated": "2016-04-24T16:02:59.393Z" } }, { "model": "mockapp.mockperson", "pk": 16, "fields": { "firstname": "Bruno", "lastname": "Hood", "birthdate": "1995-07-02", "created": "2015-05-19T10:48:08.703Z", "updated": "2016-04-24T16:02:59.394Z" } }, { "model": "mockapp.mockperson", "pk": 17, "fields": { "firstname": "Xander", "lastname": "Gould", "birthdate": "2006-11-09", "created": "2015-05-19T10:48:08.704Z", "updated": "2016-04-24T16:02:59.395Z" } }, { "model": "mockapp.mockperson", "pk": 18, "fields": { "firstname": "Odysseus", "lastname": "Cooley", "birthdate": "1991-03-18", "created": "2015-05-19T10:48:08.705Z", "updated": "2016-04-24T16:02:59.396Z" } }, { "model": "mockapp.mockperson", "pk": 19, "fields": { "firstname": "Randall", "lastname": "Ray", "birthdate": "1992-12-04", "created": "2015-05-19T10:48:08.705Z", "updated": "2016-04-24T16:02:59.397Z" } }, { "model": "mockapp.mockperson", "pk": 20, "fields": { "firstname": "Brandon", "lastname": "Richards", "birthdate": "2007-04-10", "created": "2015-05-19T10:48:08.706Z", "updated": "2016-04-24T16:02:59.398Z" } }, { "model": "mockapp.mockperson", "pk": 21, "fields": { "firstname": "Myles", "lastname": "Mullen", "birthdate": "1983-08-11", "created": "2015-05-19T10:48:08.707Z", "updated": "2016-04-24T16:02:59.399Z" } }, { "model": "mockapp.mockperson", "pk": 22, "fields": { "firstname": "Ryder", "lastname": "Walton", "birthdate": "1972-08-09", "created": 
"2015-05-19T10:48:08.708Z", "updated": "2016-04-24T16:02:59.400Z" } }, { "model": "mockapp.mockperson", "pk": 23, "fields": { "firstname": "Alec", "lastname": "Brooks", "birthdate": "2008-09-12", "created": "2015-05-19T10:48:08.709Z", "updated": "2016-04-24T16:02:59.401Z" } }, { "model": "mockapp.mockperson", "pk": 24, "fields": { "firstname": "Zeph", "lastname": "Hinton", "birthdate": "2003-04-27", "created": "2015-05-19T10:48:08.710Z", "updated": "2016-04-24T16:02:59.402Z" } }, { "model": "mockapp.mockperson", "pk": 25, "fields": { "firstname": "Carl", "lastname": "Drake", "birthdate": "2003-04-20", "created": "2015-05-19T10:48:08.711Z", "updated": "2016-04-24T16:02:59.403Z" } }, { "model": "mockapp.mockperson", "pk": 26, "fields": { "firstname": "Lamar", "lastname": "Hahn", "birthdate": "2006-07-22", "created": "2015-05-19T10:48:08.711Z", "updated": "2016-04-24T16:02:59.404Z" } }, { "model": "mockapp.mockperson", "pk": 27, "fields": { "firstname": "Aladdin", "lastname": "Buchanan", "birthdate": "1978-02-17", "created": "2015-05-19T10:48:08.712Z", "updated": "2016-04-24T16:02:59.405Z" } }, { "model": "mockapp.mockperson", "pk": 28, "fields": { "firstname": "Ashton", "lastname": "Odonnell", "birthdate": "1960-09-14", "created": "2015-05-19T10:48:08.713Z", "updated": "2016-04-24T16:02:59.406Z" } }, { "model": "mockapp.mockperson", "pk": 29, "fields": { "firstname": "Alvin", "lastname": "Hoffman", "birthdate": "1981-05-28", "created": "2015-05-19T10:48:08.714Z", "updated": "2016-04-24T16:02:59.408Z" } }, { "model": "mockapp.mockperson", "pk": 30, "fields": { "firstname": "Preston", "lastname": "Mejia", "birthdate": "1962-10-29", "created": "2015-05-19T10:48:08.715Z", "updated": "2016-04-24T16:02:59.409Z" } }, { "model": "mockapp.mockperson", "pk": 31, "fields": { "firstname": "Jameson", "lastname": "Hayes", "birthdate": "1952-06-09", "created": "2015-05-19T10:48:08.716Z", "updated": "2016-04-24T16:02:59.410Z" } }, { "model": "mockapp.mockperson", "pk": 32, "fields": { "firstname": "Mark", "lastname": "Fox", "birthdate": "1998-10-13", "created": "2015-05-19T10:48:08.717Z", "updated": "2016-04-24T16:02:59.411Z" } }, { "model": "mockapp.mockperson", "pk": 33, "fields": { "firstname": "Garth", "lastname": "Spears", "birthdate": "1954-04-05", "created": "2015-05-19T10:48:08.717Z", "updated": "2016-04-24T16:02:59.412Z" } }, { "model": "mockapp.mockperson", "pk": 34, "fields": { "firstname": "Justin", "lastname": "Herrera", "birthdate": "1970-08-11", "created": "2015-05-19T10:48:08.718Z", "updated": "2016-04-24T16:02:59.413Z" } }, { "model": "mockapp.mockperson", "pk": 35, "fields": { "firstname": "Rigel", "lastname": "Crane", "birthdate": "1994-07-29", "created": "2015-05-19T10:48:08.719Z", "updated": "2016-04-24T16:02:59.414Z" } }, { "model": "mockapp.mockperson", "pk": 36, "fields": { "firstname": "Ezra", "lastname": "Mosley", "birthdate": "1962-07-01", "created": "2015-05-19T10:48:08.720Z", "updated": "2016-04-24T16:02:59.415Z" } }, { "model": "mockapp.mockperson", "pk": 37, "fields": { "firstname": "Isaac", "lastname": "Ballard", "birthdate": "1990-08-07", "created": "2015-05-19T10:48:08.721Z", "updated": "2016-04-24T16:02:59.416Z" } }, { "model": "mockapp.mockperson", "pk": 38, "fields": { "firstname": "Kennan", "lastname": "Holman", "birthdate": "2006-03-21", "created": "2015-05-19T10:48:08.722Z", "updated": "2016-04-24T16:02:59.417Z" } }, { "model": "mockapp.mockperson", "pk": 39, "fields": { "firstname": "Bradley", "lastname": "Cantrell", "birthdate": "1974-05-09", "created": 
"2015-05-19T10:48:08.723Z", "updated": "2016-04-24T16:02:59.418Z" } }, { "model": "mockapp.mockperson", "pk": 40, "fields": { "firstname": "Eagan", "lastname": "Green", "birthdate": "1959-08-18", "created": "2015-05-19T10:48:08.723Z", "updated": "2016-04-24T16:02:59.419Z" } }, { "model": "mockapp.mockperson", "pk": 41, "fields": { "firstname": "Davis", "lastname": "Fields", "birthdate": "1983-11-07", "created": "2015-05-19T10:48:08.724Z", "updated": "2016-04-24T16:02:59.420Z" } }, { "model": "mockapp.mockperson", "pk": 42, "fields": { "firstname": "Edward", "lastname": "Munoz", "birthdate": "1962-12-30", "created": "2015-05-19T10:48:08.725Z", "updated": "2016-04-24T16:02:59.421Z" } }, { "model": "mockapp.mockperson", "pk": 43, "fields": { "firstname": "Raphael", "lastname": "Baird", "birthdate": "1950-03-02", "created": "2015-05-19T10:48:08.726Z", "updated": "2016-04-24T16:02:59.422Z" } }, { "model": "mockapp.mockperson", "pk": 44, "fields": { "firstname": "Clinton", "lastname": "Ward", "birthdate": "1998-01-18", "created": "2015-05-19T10:48:08.726Z", "updated": "2016-04-24T16:02:59.423Z" } }, { "model": "mockapp.mockperson", "pk": 45, "fields": { "firstname": "Judah", "lastname": "Odonnell", "birthdate": "2004-08-11", "created": "2015-05-19T10:48:08.727Z", "updated": "2016-04-24T16:02:59.424Z" } }, { "model": "mockapp.mockperson", "pk": 46, "fields": { "firstname": "Zachery", "lastname": "Humphrey", "birthdate": "1988-06-04", "created": "2015-05-19T10:48:08.728Z", "updated": "2016-04-24T16:02:59.425Z" } }, { "model": "mockapp.mockperson", "pk": 47, "fields": { "firstname": "Mark", "lastname": "Church", "birthdate": "1962-03-05", "created": "2015-05-19T10:48:08.728Z", "updated": "2016-04-24T16:02:59.426Z" } }, { "model": "mockapp.mockperson", "pk": 48, "fields": { "firstname": "Ronan", "lastname": "Rose", "birthdate": "1975-06-05", "created": "2015-05-19T10:48:08.729Z", "updated": "2016-04-24T16:02:59.428Z" } }, { "model": "mockapp.mockperson", "pk": 49, "fields": { "firstname": "Craig", "lastname": "Nicholson", "birthdate": "1951-12-01", "created": "2015-05-19T10:48:08.730Z", "updated": "2016-04-24T16:02:59.429Z" } }, { "model": "mockapp.mockperson", "pk": 50, "fields": { "firstname": "Benedict", "lastname": "Copeland", "birthdate": "1993-05-04", "created": "2015-05-19T10:48:08.731Z", "updated": "2016-04-24T16:02:59.430Z" } }, { "model": "mockapp.mockperson", "pk": 51, "fields": { "firstname": "Oleg", "lastname": "Decker", "birthdate": "2001-11-28", "created": "2015-05-19T10:48:08.731Z", "updated": "2016-04-24T16:02:59.431Z" } }, { "model": "mockapp.mockperson", "pk": 52, "fields": { "firstname": "Carlos", "lastname": "Trujillo", "birthdate": "1971-11-15", "created": "2015-05-19T10:48:08.732Z", "updated": "2016-04-24T16:02:59.432Z" } }, { "model": "mockapp.mockperson", "pk": 53, "fields": { "firstname": "Solomon", "lastname": "Acosta", "birthdate": "2013-06-14", "created": "2015-05-19T10:48:08.733Z", "updated": "2016-04-24T16:02:59.433Z" } }, { "model": "mockapp.mockperson", "pk": 54, "fields": { "firstname": "Coby", "lastname": "Gray", "birthdate": "1999-09-19", "created": "2015-05-19T10:48:08.733Z", "updated": "2016-04-24T16:02:59.434Z" } }, { "model": "mockapp.mockperson", "pk": 55, "fields": { "firstname": "Randall", "lastname": "Reilly", "birthdate": "1977-11-18", "created": "2015-05-19T10:48:08.734Z", "updated": "2016-04-24T16:02:59.435Z" } }, { "model": "mockapp.mockperson", "pk": 56, "fields": { "firstname": "Davis", "lastname": "Caldwell", "birthdate": "1972-04-30", "created": 
"2015-05-19T10:48:08.735Z", "updated": "2016-04-24T16:02:59.436Z" } }, { "model": "mockapp.mockperson", "pk": 57, "fields": { "firstname": "Wing", "lastname": "Daugherty", "birthdate": "1994-09-27", "created": "2015-05-19T10:48:08.736Z", "updated": "2016-04-24T16:02:59.437Z" } }, { "model": "mockapp.mockperson", "pk": 58, "fields": { "firstname": "Dane", "lastname": "Mcdowell", "birthdate": "2010-09-06", "created": "2015-05-19T10:48:08.737Z", "updated": "2016-04-24T16:02:59.437Z" } }, { "model": "mockapp.mockperson", "pk": 59, "fields": { "firstname": "Theodore", "lastname": "Bentley", "birthdate": "2010-12-17", "created": "2015-05-19T10:48:08.737Z", "updated": "2016-04-24T16:02:59.438Z" } }, { "model": "mockapp.mockperson", "pk": 60, "fields": { "firstname": "Nehru", "lastname": "Pace", "birthdate": "2007-12-02", "created": "2015-05-19T10:48:08.738Z", "updated": "2016-04-24T16:02:59.439Z" } }, { "model": "mockapp.mockperson", "pk": 61, "fields": { "firstname": "Ferris", "lastname": "Owen", "birthdate": "1966-07-16", "created": "2015-05-19T10:48:08.739Z", "updated": "2016-04-24T16:02:59.440Z" } }, { "model": "mockapp.mockperson", "pk": 62, "fields": { "firstname": "Holmes", "lastname": "Hill", "birthdate": "1970-07-03", "created": "2015-05-19T10:48:08.740Z", "updated": "2016-04-24T16:02:59.441Z" } }, { "model": "mockapp.mockperson", "pk": 63, "fields": { "firstname": "Nehru", "lastname": "Harrell", "birthdate": "1998-04-26", "created": "2015-05-19T10:48:08.741Z", "updated": "2016-04-24T16:02:59.442Z" } }, { "model": "mockapp.mockperson", "pk": 64, "fields": { "firstname": "John", "lastname": "Baker", "birthdate": "1963-12-18", "created": "2015-05-19T10:48:08.741Z", "updated": "2016-04-24T16:02:59.442Z" } }, { "model": "mockapp.mockperson", "pk": 65, "fields": { "firstname": "Wesley", "lastname": "Gill", "birthdate": "2004-10-07", "created": "2015-05-19T10:48:08.742Z", "updated": "2016-04-24T16:02:59.443Z" } }, { "model": "mockapp.mockperson", "pk": 66, "fields": { "firstname": "Moses", "lastname": "Jarvis", "birthdate": "2004-01-29", "created": "2015-05-19T10:48:08.743Z", "updated": "2016-04-24T16:02:59.444Z" } }, { "model": "mockapp.mockperson", "pk": 67, "fields": { "firstname": "Fuller", "lastname": "Conway", "birthdate": "1962-02-20", "created": "2015-05-19T10:48:08.743Z", "updated": "2016-04-24T16:02:59.445Z" } }, { "model": "mockapp.mockperson", "pk": 68, "fields": { "firstname": "Octavius", "lastname": "Mckinney", "birthdate": "2006-07-13", "created": "2015-05-19T10:48:08.744Z", "updated": "2016-04-24T16:02:59.446Z" } }, { "model": "mockapp.mockperson", "pk": 69, "fields": { "firstname": "Raja", "lastname": "Horne", "birthdate": "1952-09-11", "created": "2015-05-19T10:48:08.745Z", "updated": "2016-04-24T16:02:59.447Z" } }, { "model": "mockapp.mockperson", "pk": 70, "fields": { "firstname": "Keegan", "lastname": "Dorsey", "birthdate": "2006-06-06", "created": "2015-05-19T10:48:08.745Z", "updated": "2016-04-24T16:02:59.448Z" } }, { "model": "mockapp.mockperson", "pk": 71, "fields": { "firstname": "Wang", "lastname": "Jimenez", "birthdate": "1960-06-19", "created": "2015-05-19T10:48:08.746Z", "updated": "2016-04-24T16:02:59.449Z" } }, { "model": "mockapp.mockperson", "pk": 72, "fields": { "firstname": "Kermit", "lastname": "Galloway", "birthdate": "1976-05-22", "created": "2015-05-19T10:48:08.747Z", "updated": "2016-04-24T16:02:59.450Z" } }, { "model": "mockapp.mockperson", "pk": 73, "fields": { "firstname": "Lucian", "lastname": "Banks", "birthdate": "2004-06-28", "created": 
"2015-05-19T10:48:08.748Z", "updated": "2016-04-24T16:02:59.450Z" } }, { "model": "mockapp.mockperson", "pk": 74, "fields": { "firstname": "Lewis", "lastname": "Porter", "birthdate": "1967-08-12", "created": "2015-05-19T10:48:08.748Z", "updated": "2016-04-24T16:02:59.451Z" } }, { "model": "mockapp.mockperson", "pk": 75, "fields": { "firstname": "Simon", "lastname": "Noel", "birthdate": "1961-12-07", "created": "2015-05-19T10:48:08.749Z", "updated": "2016-04-24T16:02:59.452Z" } }, { "model": "mockapp.mockperson", "pk": 76, "fields": { "firstname": "Carl", "lastname": "Hodge", "birthdate": "1980-10-24", "created": "2015-05-19T10:48:08.750Z", "updated": "2016-04-24T16:02:59.453Z" } }, { "model": "mockapp.mockperson", "pk": 77, "fields": { "firstname": "Abel", "lastname": "Reynolds", "birthdate": "1953-03-18", "created": "2015-05-19T10:48:08.751Z", "updated": "2016-04-24T16:02:59.454Z" } }, { "model": "mockapp.mockperson", "pk": 78, "fields": { "firstname": "Kaseem", "lastname": "Porter", "birthdate": "1958-09-02", "created": "2015-05-19T10:48:08.751Z", "updated": "2016-04-24T16:02:59.455Z" } }, { "model": "mockapp.mockperson", "pk": 79, "fields": { "firstname": "Tucker", "lastname": "Parks", "birthdate": "1958-12-22", "created": "2015-05-19T10:48:08.752Z", "updated": "2016-04-24T16:02:59.456Z" } }, { "model": "mockapp.mockperson", "pk": 80, "fields": { "firstname": "Caldwell", "lastname": "Jensen", "birthdate": "1951-07-04", "created": "2015-05-19T10:48:08.753Z", "updated": "2016-04-24T16:02:59.457Z" } }, { "model": "mockapp.mockperson", "pk": 81, "fields": { "firstname": "Melvin", "lastname": "Tyson", "birthdate": "1955-10-17", "created": "2015-05-19T10:48:08.753Z", "updated": "2016-04-24T16:02:59.458Z" } }, { "model": "mockapp.mockperson", "pk": 82, "fields": { "firstname": "Walker", "lastname": "Hood", "birthdate": "1962-09-21", "created": "2015-05-19T10:48:08.754Z", "updated": "2016-04-24T16:02:59.459Z" } }, { "model": "mockapp.mockperson", "pk": 83, "fields": { "firstname": "Brenden", "lastname": "Dillon", "birthdate": "2004-12-21", "created": "2015-05-19T10:48:08.755Z", "updated": "2016-04-24T16:02:59.460Z" } }, { "model": "mockapp.mockperson", "pk": 84, "fields": { "firstname": "Avram", "lastname": "Young", "birthdate": "1997-08-18", "created": "2015-05-19T10:48:08.755Z", "updated": "2016-04-24T16:02:59.461Z" } }, { "model": "mockapp.mockperson", "pk": 85, "fields": { "firstname": "Lev", "lastname": "Guerrero", "birthdate": "2002-05-17", "created": "2015-05-19T10:48:08.756Z", "updated": "2016-04-24T16:02:59.461Z" } }, { "model": "mockapp.mockperson", "pk": 86, "fields": { "firstname": "Leroy", "lastname": "Wade", "birthdate": "2003-08-31", "created": "2015-05-19T10:48:08.757Z", "updated": "2016-04-24T16:02:59.462Z" } }, { "model": "mockapp.mockperson", "pk": 87, "fields": { "firstname": "Kuame", "lastname": "Gardner", "birthdate": "1970-05-31", "created": "2015-05-19T10:48:08.757Z", "updated": "2016-04-24T16:02:59.463Z" } }, { "model": "mockapp.mockperson", "pk": 88, "fields": { "firstname": "Lewis", "lastname": "David", "birthdate": "2011-01-30", "created": "2015-05-19T10:48:08.758Z", "updated": "2016-04-24T16:02:59.464Z" } }, { "model": "mockapp.mockperson", "pk": 89, "fields": { "firstname": "Ezra", "lastname": "Carrillo", "birthdate": "2001-04-16", "created": "2015-05-19T10:48:08.759Z", "updated": "2016-04-24T16:02:59.465Z" } }, { "model": "mockapp.mockperson", "pk": 90, "fields": { "firstname": "Barrett", "lastname": "Giles", "birthdate": "1955-07-20", "created": 
"2015-05-19T10:48:08.759Z", "updated": "2016-04-24T16:02:59.466Z" } }, { "model": "mockapp.mockperson", "pk": 91, "fields": { "firstname": "Adrian", "lastname": "Garner", "birthdate": "1979-11-22", "created": "2015-05-19T10:48:08.760Z", "updated": "2016-04-24T16:02:59.466Z" } }, { "model": "mockapp.mockperson", "pk": 92, "fields": { "firstname": "Vladimir", "lastname": "Stewart", "birthdate": "1995-08-21", "created": "2015-05-19T10:48:08.761Z", "updated": "2016-04-24T16:02:59.467Z" } }, { "model": "mockapp.mockperson", "pk": 93, "fields": { "firstname": "Burke", "lastname": "Aguirre", "birthdate": "1978-03-14", "created": "2015-05-19T10:48:08.762Z", "updated": "2016-04-24T16:02:59.468Z" } }, { "model": "mockapp.mockperson", "pk": 94, "fields": { "firstname": "Slade", "lastname": "Johnson", "birthdate": "2010-10-26", "created": "2015-05-19T10:48:08.762Z", "updated": "2016-04-24T16:02:59.469Z" } }, { "model": "mockapp.mockperson", "pk": 95, "fields": { "firstname": "Justin", "lastname": "Hodges", "birthdate": "2001-06-06", "created": "2015-05-19T10:48:08.763Z", "updated": "2016-04-24T16:02:59.470Z" } }, { "model": "mockapp.mockperson", "pk": 96, "fields": { "firstname": "Elton", "lastname": "Wolfe", "birthdate": "1959-09-21", "created": "2015-05-19T10:48:08.764Z", "updated": "2016-04-24T16:02:59.471Z" } }, { "model": "mockapp.mockperson", "pk": 97, "fields": { "firstname": "Kamal", "lastname": "Parker", "birthdate": "1980-01-31", "created": "2015-05-19T10:48:08.764Z", "updated": "2016-04-24T16:02:59.471Z" } }, { "model": "mockapp.mockperson", "pk": 98, "fields": { "firstname": "Jeremy", "lastname": "Rowland", "birthdate": "1981-10-26", "created": "2015-05-19T10:48:08.765Z", "updated": "2016-04-24T16:02:59.472Z" } }, { "model": "mockapp.mockperson", "pk": 99, "fields": { "firstname": "Derek", "lastname": "Anthony", "birthdate": "1980-04-18", "created": "2015-05-19T10:48:08.766Z", "updated": "2016-04-24T16:02:59.473Z" } }, { "model": "mockapp.mockperson", "pk": 100, "fields": { "firstname": "\u00c5smund", "lastname": "S\u00f8rensen", "birthdate": "1978-12-24", "created": "2015-05-19T10:48:08.767Z", "updated": "2016-04-24T16:02:59.474Z" } } ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/fixtures/mockpet.json0000664000175000017500000000362014476342707021405 0ustar00mmmmmm[ { "pk": 1, "model": "mockapp.mockpet", "fields": { "name": "John", "created": "2015-05-19T10:48:08.686Z", "species": "Iguana", "updated": "2015-05-19T10:48:08.686Z" } }, { "pk": 2, "model": "mockapp.mockpet", "fields": { "name": "Fido", "created": "2015-05-19T10:48:08.688Z", "species": "Dog", "updated": "2015-05-19T10:48:08.688Z" } }, { "pk": 3, "model": "mockapp.mockpet", "fields": { "name": "Zane", "created": "2015-05-19T10:48:08.689Z", "species": "Dog", "updated": "2015-05-19T10:48:08.689Z" } }, { "pk": 4, "model": "mockapp.mockpet", "fields": { "name": "Spot", "created": "2015-05-19T10:48:08.690Z", "species": "Dog", "updated": "2015-05-19T10:48:08.690Z" } }, { "pk": 5, "model": "mockapp.mockpet", "fields": { "name": "Sam", "created": "2015-05-19T10:48:08.691Z", "species": "Bird", "updated": "2015-05-19T10:48:08.691Z" } }, { "pk": 6, "model": "mockapp.mockpet", "fields": { "name": "Jenny", "created": "2015-05-19T10:48:08.693Z", "species": "Cat", "updated": "2015-05-19T10:48:08.693Z" } }, { "pk": 7, "model": "mockapp.mockpet", "fields": { "name": "Squiggles", "created": "2015-05-19T10:48:08.694Z", "species": "Snake", "updated": 
"2015-05-19T10:48:08.694Z" } }, { "pk": 8, "model": "mockapp.mockpet", "fields": { "name": "Buttersworth", "created": "2015-05-19T10:48:08.695Z", "species": "Dog", "updated": "2015-05-19T10:48:08.695Z" } }, { "pk": 9, "model": "mockapp.mockpet", "fields": { "name": "Tweety", "created": "2015-05-19T10:48:08.696Z", "species": "Bird", "updated": "2015-05-19T10:48:08.696Z" } }, { "pk": 10, "model": "mockapp.mockpet", "fields": { "name": "Jake", "created": "2015-05-19T10:48:08.697Z", "species": "Cat", "updated": "2015-05-19T10:48:08.697Z" } } ] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8680983 drf-haystack-1.8.13/tests/mockapp/migrations/0000775000175000017500000000000014507202601017331 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0001_initial.py0000664000175000017500000000163414476342707022021 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ] operations = [ migrations.CreateModel( name='MockLocation', fields=[ ('id', models.AutoField(primary_key=True, auto_created=True, serialize=False, verbose_name='ID')), ('latitude', models.FloatField()), ('longitude', models.FloatField()), ('address', models.CharField(max_length=100)), ('city', models.CharField(max_length=30)), ('zip_code', models.CharField(max_length=10)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0002_mockperson.py0000664000175000017500000000143614476342707022551 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ('mockapp', '0001_initial'), ] operations = [ migrations.CreateModel( name='MockPerson', fields=[ ('id', models.AutoField(auto_created=True, verbose_name='ID', primary_key=True, serialize=False)), ('firstname', models.CharField(max_length=20)), ('lastname', models.CharField(max_length=20)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0003_mockpet.py0000664000175000017500000000147514476342707022037 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ('mockapp', '0001_initial'), ('mockapp', '0002_mockperson'), ] operations = [ migrations.CreateModel( name='MockPet', fields=[ ('id', models.AutoField(auto_created=True, verbose_name='ID', primary_key=True, serialize=False)), ('name', models.CharField(max_length=20)), ('species', models.CharField(max_length=20)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0004_load_fixtures.py0000664000175000017500000000326614476342707023246 0ustar00mmmmmm# -*- 
coding: utf-8 -*- from __future__ import unicode_literals import os from django.core import serializers from django.db import models, migrations def load_data(apps, schema_editor): """ Load fixtures for MockPerson, MockPet and MockLocation """ fixtures = os.path.abspath(os.path.join(os.path.dirname(__file__), os.path.pardir, "fixtures")) with open(os.path.join(fixtures, "mockperson.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() with open(os.path.join(fixtures, "mocklocation.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() with open(os.path.join(fixtures, "mockpet.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() def unload_data(apps, schema_editor): """ Unload fixtures for MockPerson, MockPet and MockLocation """ MockPerson = apps.get_model("mockapp", "MockPerson") MockLocation = apps.get_model("mockapp", "MockLocation") MockPet = apps.get_model("mockapp", "MockPet") MockPerson.objects.all().delete() MockLocation.objects.all().delete() MockPet.objects.all().delete() class Migration(migrations.Migration): dependencies = [ ("mockapp", "0001_initial"), ("mockapp", "0002_mockperson"), ("mockapp", "0005_mockperson_birthdate"), ("mockapp", "0003_mockpet"), ] operations = [ migrations.RunPython(load_data, reverse_code=unload_data) ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0005_mockperson_birthdate.py0000664000175000017500000000100514476342707024572 0ustar00mmmmmm# -*- coding: utf-8 -*- # Generated by Django 1.9.4 on 2016-04-16 07:05 from __future__ import unicode_literals from django.db import migrations, models import tests.mockapp.models class Migration(migrations.Migration): dependencies = [ ('mockapp', '0002_mockperson'), ] operations = [ migrations.AddField( model_name='mockperson', name='birthdate', field=models.DateField(default=tests.mockapp.models.get_random_date, null=True), ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/0006_mockallfield.py0000664000175000017500000000206714476342707023024 0ustar00mmmmmm# -*- coding: utf-8 -*- # Generated by Django 1.11.9 on 2018-03-22 22:46 from __future__ import unicode_literals from django.db import migrations, models import tests.mockapp.models class Migration(migrations.Migration): dependencies = [ ('mockapp', '0004_load_fixtures'), ] operations = [ migrations.CreateModel( name='MockAllField', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('charfield', models.CharField(max_length=100)), ('integerfield', models.IntegerField()), ('floatfield', models.FloatField()), ('decimalfield', models.DecimalField(decimal_places=2, max_digits=5)), ('boolfield', models.BooleanField(default=False)), ('datefield', models.DateField(default=tests.mockapp.models.get_random_date)), ('datetimefield', models.DateTimeField(default=tests.mockapp.models.get_random_datetime)), ], ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/migrations/__init__.py0000664000175000017500000000000014476342707021451 0ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 
mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/models.py0000664000175000017500000000465214476342707017042 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import pytz from datetime import date, datetime, timedelta from random import randrange, randint from django.db import models def get_random_date(start=date(1950, 1, 1), end=date.today()): """ :return a random date between `start` and `end` """ delta = ((end - start).days * 24 * 60 * 60) return start + timedelta(seconds=randrange(delta)) def get_random_datetime(start=datetime(1950, 1, 1, 0, 0), end=datetime.today()): """ :return a random datetime """ delta = ((end - start).total_seconds()) return (start + timedelta(seconds=randint(0, int(delta)))).replace(tzinfo=pytz.UTC) class MockLocation(models.Model): latitude = models.FloatField() longitude = models.FloatField() address = models.CharField(max_length=100) city = models.CharField(max_length=30) zip_code = models.CharField(max_length=10) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return self.address @property def coordinates(self): try: from haystack.utils.geo import Point except ImportError: return None else: return Point(self.longitude, self.latitude, srid=4326) class MockPerson(models.Model): firstname = models.CharField(max_length=20) lastname = models.CharField(max_length=20) birthdate = models.DateField(null=True, default=get_random_date) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return "%s %s" % (self.firstname, self.lastname) class MockPet(models.Model): name = models.CharField(max_length=20) species = models.CharField(max_length=20) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return self.name class MockAllField(models.Model): charfield = models.CharField(max_length=100) integerfield = models.IntegerField() floatfield = models.FloatField() decimalfield = models.DecimalField(max_digits=5, decimal_places=2) boolfield = models.BooleanField(default=False) datefield = models.DateField(default=get_random_date) datetimefield = models.DateTimeField(default=get_random_datetime) def __str__(self): return self.charfield ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/search_indexes.py0000664000175000017500000000704514476342707020542 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from django.utils import timezone from haystack import indexes from .models import MockLocation, MockPerson, MockPet, MockAllField class MockLocationIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) address = indexes.CharField(model_attr="address") city = indexes.CharField(model_attr="city") zip_code = indexes.CharField(model_attr="zip_code") autocomplete = indexes.EdgeNgramField() coordinates = indexes.LocationField(model_attr="coordinates") @staticmethod def prepare_autocomplete(obj): return " ".join(( obj.address, obj.city, obj.zip_code )) def get_model(self): return MockLocation def index_queryset(self, using=None): return self.get_model().objects.filter( created__lte=timezone.now() ) class MockPersonIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) firstname = indexes.CharField(model_attr="firstname", 
faceted=True) lastname = indexes.CharField(model_attr="lastname", faceted=True) birthdate = indexes.DateField(model_attr="birthdate", faceted=True) letters = indexes.MultiValueField(faceted=True) full_name = indexes.CharField() description = indexes.CharField() autocomplete = indexes.EdgeNgramField() created = indexes.FacetDateTimeField(model_attr="created") @staticmethod def prepare_full_name(obj): return " ".join((obj.firstname, obj.lastname)) @staticmethod def prepare_letters(obj): return [x for x in obj.firstname] @staticmethod def prepare_description(obj): return " ".join((obj.firstname, "is a nice chap!")) @staticmethod def prepare_autocomplete(obj): return " ".join((obj.firstname, obj.lastname)) def get_model(self): return MockPerson def index_queryset(self, using=None): return self.get_model().objects.filter( created__lte=timezone.now() ) class MockPetIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) name = indexes.CharField(model_attr="name") species = indexes.CharField(model_attr="species") has_rabies = indexes.BooleanField(indexed=False) description = indexes.CharField() autocomplete = indexes.EdgeNgramField() @staticmethod def prepare_description(obj): return " ".join((obj.name, "the", obj.species)) @staticmethod def prepare_has_rabies(obj): return True if obj.species == "Dog" else False @staticmethod def prepare_autocomplete(obj): return obj.name def get_model(self): return MockPet class MockAllFieldIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=False) charfield = indexes.CharField(model_attr="charfield") integerfield = indexes.IntegerField(model_attr="integerfield") floatfield = indexes.FloatField(model_attr="floatfield") decimalfield = indexes.DecimalField(model_attr="decimalfield") boolfield = indexes.BooleanField(model_attr="boolfield") datefield = indexes.DateField(model_attr="datefield") datetimefield = indexes.DateTimeField(model_attr="datetimefield") multivaluefield = indexes.MultiValueField() @staticmethod def prepare_multivaluefield(obj): return obj.charfield.split(' ', 1) def get_model(self): return MockAllField ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/serializers.py0000664000175000017500000000347514476342707020115 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from datetime import datetime, timedelta from rest_framework.serializers import HyperlinkedIdentityField from drf_haystack.serializers import HaystackSerializer, HaystackFacetSerializer, HighlighterMixin from .search_indexes import MockPersonIndex, MockLocationIndex class SearchSerializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex, MockLocationIndex] fields = [ "firstname", "lastname", "birthdate", "full_name", "text", "address", "city", "zip_code", "highlighted" ] class HighlighterSerializer(HighlighterMixin, HaystackSerializer): highlighter_css_class = "my-highlighter-class" highlighter_html_tag = "em" class Meta: index_classes = [MockPersonIndex, MockLocationIndex] fields = [ "firstname", "lastname", "full_name", "address", "city", "zip_code", "coordinates" ] class MoreLikeThisSerializer(HaystackSerializer): more_like_this = HyperlinkedIdentityField(view_name="search-person-mlt-more-like-this", read_only=True) class Meta: index_classes = [MockPersonIndex] fields = [ "firstname", "lastname", "full_name", "autocomplete" ] class 
MockPersonFacetSerializer(HaystackFacetSerializer): serialize_objects = True class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created", "letters"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "day", "gap_amount": 10 } } ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8600981 drf-haystack-1.8.13/tests/mockapp/templates/0000775000175000017500000000000014507202601017153 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8600981 drf-haystack-1.8.13/tests/mockapp/templates/search/0000775000175000017500000000000014507202601020420 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8600981 drf-haystack-1.8.13/tests/mockapp/templates/search/indexes/0000775000175000017500000000000014507202601022057 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1696400768.8720984 drf-haystack-1.8.13/tests/mockapp/templates/search/indexes/mockapp/0000775000175000017500000000000014507202601023511 5ustar00mmmmmm././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/templates/search/indexes/mockapp/mocklocation_text.txt0000664000175000017500000000017614476342707030025 0ustar00mmmmmm{{ object.address }} {{ object.zip_code }} {{ object.city }} Created: {{ object.created }} Last modified: {{ object.updated }}././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/templates/search/indexes/mockapp/mockperson_text.txt0000664000175000017500000000005514476342707027517 0ustar00mmmmmm{{ object.firstname }} {{ object.lastname }} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/templates/search/indexes/mockapp/mockpet_text.txt0000664000175000017500000000002114476342707026772 0ustar00mmmmmm{{ object.name }}././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/mockapp/views.py0000664000175000017500000000232714476342707016711 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.pagination import PageNumberPagination, LimitOffsetPagination from drf_haystack.filters import HaystackFilter, HaystackBoostFilter, HaystackHighlightFilter, HaystackAutocompleteFilter, HaystackGEOSpatialFilter from drf_haystack.viewsets import HaystackViewSet from drf_haystack.mixins import FacetMixin, MoreLikeThisMixin from .models import MockPerson, MockLocation from .serializers import ( SearchSerializer, HighlighterSerializer, MoreLikeThisSerializer, MockPersonFacetSerializer ) class BasicPageNumberPagination(PageNumberPagination): page_size = 20 page_size_query_param = "page_size" class BasicLimitOffsetPagination(LimitOffsetPagination): default_limit = 20 class SearchPersonFacetViewSet(FacetMixin, HaystackViewSet): index_models = [MockPerson] pagination_class = BasicLimitOffsetPagination serializer_class = SearchSerializer filter_backends = [HaystackFilter] # Faceting facet_serializer_class = MockPersonFacetSerializer class SearchPersonMLTViewSet(MoreLikeThisMixin, HaystackViewSet): index_models = [MockPerson] serializer_class = MoreLikeThisSerializer 
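The mock viewsets defined in `views.py` above are, elsewhere in this test suite, exposed through a DRF `DefaultRouter` (see `tests/test_serializers.py` further down, which registers equivalent viewsets under the `search-person-facet` and `search-person-mlt` basenames). A minimal sketch of that wiring is shown below; the import path and URL prefixes are illustrative assumptions and this is not a reproduction of the package's actual `tests/urls.py`:

```python
# Illustrative sketch only: how the mock viewsets above could be hooked up to a
# DRF router. The import path and URL prefixes are assumptions, not the
# package's real tests/urls.py.
from django.conf.urls import include, url  # pre-Django 4.0 style, matching the rest of this test suite
from rest_framework.routers import DefaultRouter

from tests.mockapp.views import SearchPersonFacetViewSet, SearchPersonMLTViewSet

router = DefaultRouter()
# FacetMixin adds a /facets/ list route, e.g. GET /search-person-facet/facets/
router.register("search-person-facet", viewset=SearchPersonFacetViewSet, basename="search-person-facet")
# MoreLikeThisMixin adds a detail route, e.g. GET /search-person-mlt/<pk>/more-like-this/
router.register("search-person-mlt", viewset=SearchPersonMLTViewSet, basename="search-person-mlt")

urlpatterns = [
    url(r"^", include(router.urls)),
]
```

The `search-person-mlt` basename matters here because `MoreLikeThisSerializer` declares a `HyperlinkedIdentityField` with `view_name="search-person-mlt-more-like-this"`, so the reversed route name must line up with the registered basename for the more-like-this links to resolve.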
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/run_tests.py0000664000175000017500000000114714476342707016147 0ustar00mmmmmm#!/usr/bin/env python # -*- coding: utf-8 -*- from __future__ import absolute_import, print_function, unicode_literals import os import sys import nose def start(argv=None): sys.exitfunc = lambda: sys.stderr.write("Shutting down...\n") if argv is None: argv = [ "nosetests", "--verbose", "--with-coverage", "--cover-erase", "--cover-branches", "--cover-package=drf_haystack", ] nose.run_exit(argv=argv, defaultTest=os.path.abspath(os.path.dirname(__file__))) if __name__ == "__main__": start(sys.argv) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/settings.py0000664000175000017500000000632214476342707015761 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(os.path.dirname(__file__)), 'tests')) SECRET_KEY = 'NOBODY expects the Spanish Inquisition!' DEBUG = True ALLOWED_HOSTS = ["*"] DATABASES = { 'default': { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': os.path.join(BASE_DIR, os.pardir, 'test.db') } } INSTALLED_APPS = ( 'django.contrib.auth', 'django.contrib.contenttypes', 'django.contrib.sessions', 'django.contrib.staticfiles', 'haystack', 'rest_framework', 'tests.mockapp', ) MIDDLEWARE_CLASSES = ( 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.auth.middleware.SessionAuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware', ) TEMPLATES = [ { 'BACKEND': 'django.template.backends.django.DjangoTemplates', 'OPTIONS': {'debug': True}, 'APP_DIRS': True, }, ] REST_FRAMEWORK = { 'DEFAULT_PERMISSION_CLASSES': ( 'rest_framework.permissions.AllowAny', ) } ROOT_URLCONF = 'tests.urls' WSGI_APPLICATION = 'tests.wsgi.application' LANGUAGE_CODE = 'en-us' TIME_ZONE = 'UTC' USE_I18N = True USE_L10N = True USE_TZ = True STATIC_URL = '/static/' HAYSTACK_CONNECTIONS = { 'default': { 'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine', 'URL': os.environ.get('ELASTICSEARCH_URL', 'http://localhost:9200/'), 'INDEX_NAME': 'drf-haystack-test', 'INCLUDE_SPELLING': True, 'TIMEOUT': 300, }, } DEFAULT_LOG_DIR = os.path.join(BASE_DIR, 'logs') LOGGING = { 'version': 1, 'disable_existing_loggers': False, 'formatters': { 'standard': { 'format': '%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s' }, }, 'handlers': { 'console_handler': { 'level': 'DEBUG', 'class': 'logging.StreamHandler', 'formatter': 'standard' }, 'file_handler': { 'level': 'DEBUG', 'class': 'logging.FileHandler', 'filename': os.path.join(DEFAULT_LOG_DIR, 'tests.log'), }, }, 'loggers': { 'default': { 'handlers': ['file_handler'], 'level': 'INFO', 'propagate': True, }, 'elasticsearch': { 'handlers': ['file_handler'], 'level': 'ERROR', 'propagate': True, }, 'elasticsearch.trace': { 'handlers': ['file_handler'], 'level': 'ERROR', 'propagate': True, }, }, } try: import elasticsearch if (2, ) <= elasticsearch.VERSION <= (3, ): HAYSTACK_CONNECTIONS['default'].update({ 'ENGINE': 'haystack.backends.elasticsearch2_backend.Elasticsearch2SearchEngine' }) except ImportError as e: del 
HAYSTACK_CONNECTIONS['default'] # This will intentionally cause everything to break! ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/test_filters.py0000664000175000017500000006107014476342707016631 0ustar00mmmmmm# -*- coding: utf-8 -*- # # Unit tests for the `drf_haystack.filters` classes. # from __future__ import absolute_import, unicode_literals import json from datetime import date, datetime, timedelta from unittest import skipIf from django.test import TestCase from rest_framework import status from rest_framework import serializers from rest_framework.test import APIRequestFactory from drf_haystack.viewsets import HaystackViewSet from drf_haystack.serializers import HaystackSerializer, HaystackFacetSerializer from drf_haystack.filters import ( HaystackAutocompleteFilter, HaystackBoostFilter, HaystackFacetFilter, HaystackFilter, HaystackGEOSpatialFilter, HaystackHighlightFilter, HaystackOrderingFilter ) from drf_haystack.mixins import FacetMixin from . import geospatial_support, elasticsearch_version from .constants import MOCKLOCATION_DATA_SET_SIZE, MOCKPERSON_DATA_SET_SIZE from .mixins import WarningTestCaseMixin from .mockapp.models import MockAllField, MockLocation, MockPerson from .mockapp.search_indexes import MockAllFieldIndex, MockLocationIndex, MockPersonIndex factory = APIRequestFactory() class HaystackFilterTestCase(TestCase): fixtures = ["mockperson", "mockallfield"] def setUp(self): MockAllFieldIndex().reindex() MockPersonIndex().reindex() class Serializer1(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "full_name", "birthdate", "autocomplete"] field_aliases = { "q": "autocomplete", "name": "full_name" } class Serializer2(HaystackSerializer): class Meta: index_classes = [MockLocationIndex] exclude = ["lastname"] class Serializer4(serializers.Serializer): # This is not allowed. Must implement a `Meta` class. 
pass class Serializer5(HaystackSerializer): class Meta: index_classes = [MockAllFieldIndex] fields = ["integerfield"] class ViewSet1(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer1 # No need to specify `filter_backends`, defaults to HaystackFilter class ViewSet2(ViewSet1): serializer_class = Serializer2 class ViewSet3(ViewSet1): serializer_class = Serializer4 class ViewSet4(HaystackViewSet): serializer_class = Serializer5 self.view1 = ViewSet1 self.view2 = ViewSet2 self.view3 = ViewSet3 self.view4 = ViewSet4 def tearDown(self): MockPersonIndex().clear() def test_filter_view_has_default_filter(self): self.assertEqual(self.view1.filter_backends, [HaystackFilter]) def test_filter_no_query_parameters(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKPERSON_DATA_SET_SIZE) def test_filter_single_field(self): request = factory.get(path="/", data={"firstname": "John"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_single_field_with_lookup(self): request = factory.get(path="/", data={"firstname__startswith": "John"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_aliased_field(self): request = factory.get(path="/", data={"name": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_aliased_field_with_lookup(self): request = factory.get(path="/", data={"name__contains": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_single_field_OR(self): # Test filtering a single field for multiple values. The parameters should be OR'ed request = factory.get(path="/", data={"lastname": "Hickman,Hood"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_single_field_OR_custom_lookup_sep(self): setattr(self.view1, "lookup_sep", ";") request = factory.get(path="/", data={"lastname": "Hickman;Hood"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) # Reset the `lookup_sep` setattr(self.view1, "lookup_sep", ",") def test_filter_multiple_fields(self): # Test filtering multiple fields. The parameters should be AND'ed request = factory.get(path="/", data={"lastname": "Hood", "firstname": "Bruno"}) # Should return 1 result response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_multiple_fields_OR_same_fields(self): # Test filtering multiple fields for multiple values. 
The values should be OR'ed between # same parameters, and AND'ed between them request = factory.get(path="/", data={ "lastname": "Hickman,Hood", "firstname": "Walker,Bruno" }) # Should return 2 result response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) def test_filter_excluded_field(self): request = factory.get(path="/", data={"lastname": "Hood"}, content_type="application/json") response = self.view2.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKPERSON_DATA_SET_SIZE) # Should return all results since, field is ignored def test_filter_with_non_searched_excluded_field(self): request = factory.get(path="/", data={"text": "John"}, content_type="application/json") response = self.view2.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_unicode_characters(self): request = factory.get(path="/", data={"firstname": "åsmund", "lastname": "sørensen"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), 1) def test_filter_negated_field(self): request = factory.get(path="/", data={"text__not": "John"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 97) def test_filter_negated_field_with_lookup(self): request = factory.get(path="/", data={"name__not__contains": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 99) def test_filter_negated_field_with_other_field(self): request = factory.get(path="/", data={"firstname": "John", "lastname__not": "McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) def test_filter_gt_date_field(self): request = factory.get(path="/", data={"birthdate__gt": "1980-01-01"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), MockPerson.objects.filter(birthdate__gt=date(1980, 1, 1)).count()) def test_filter_lt_date_field(self): request = factory.get(path="/", data={"birthdate__lt": "1980-01-01"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), MockPerson.objects.filter(birthdate__lt=date(1980, 1, 1)).count()) def test_filter_in_integerfield(self): request = factory.get(path="/", data={"integerfield__in": "48,57"}, content_type="application/json") response = self.view4.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), MockAllField.objects.filter(integerfield__in=[48, 57]).count()) def test_filter_range_integerfield(self): request = factory.get(path="/", data={"integerfield__range": "300,500"}, content_type="application/json") response = self.view4.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), MockAllField.objects.filter(integerfield__range=[300, 500]).count()) class HaystackAutocompleteFilterTestCase(TestCase): 
fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackAutocompleteFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() def test_filter_autocomplete_single_term(self): # Test querying the autocomplete field for a partial term. Should return 4 results request = factory.get(path="/", data={"autocomplete": "jer"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 4) def test_filter_autocomplete_multiple_terms(self): # Test querying the autocomplete field for multiple terms. # Make sure the filter AND's the terms on spaces, thus reduce the results. request = factory.get(path="/", data={"autocomplete": "joh mc"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) def test_filter_autocomplete_multiple_parameters(self): request = factory.get(path="/", data={"autocomplete": "jer fowler", "firstname": "jeremy"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_autocomplete_single_field_OR(self): request = factory.get(path="/", data={"autocomplete": "jer,fowl"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) @skipIf(not geospatial_support, "Skipped due to lack of GEO spatial features") class HaystackGEOSpatialFilterTestCase(TestCase): fixtures = ["mocklocation"] def setUp(self): MockLocationIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockLocationIndex] fields = [ "text", "address", "city", "zip_code", "coordinates", ] class ViewSet(HaystackViewSet): index_models = [MockLocation] serializer_class = Serializer filter_backends = [HaystackGEOSpatialFilter] self.view = ViewSet def tearDown(self): MockLocationIndex().clear() def test_filter_dwithin(self): request = factory.get(path="/", data={"from": "59.923396,10.739370", "km": 1}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 4) def test_filter_dwithin_without_range_unit(self): # If no range unit is supplied, no filtering will occur. Make sure we # get the entire data set. 
request = factory.get(path="/", data={"from": "59.923396,10.739370"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKLOCATION_DATA_SET_SIZE) def test_filter_dwithin_invalid_params(self): request = factory.get(path="/", data={"from": "i am not numeric,10.739370", "km": 1}, content_type="application/json") self.assertRaises( ValueError, self.view.as_view(actions={"get": "list"}), request ) class HaystackHighlightFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackHighlightFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() @skipIf(not elasticsearch_version < (2, ), "Highlighting is not yet supported for the Elasticsearch2 backend") def test_filter_highlighter_filter(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) response.render() for result in json.loads(response.content.decode()): self.assertTrue("highlighted" in result) self.assertEqual( result["highlighted"], " ".join(("Jeremy", "%s\n" % result["lastname"])) ) class HaystackBoostFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackBoostFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() # Skipping the boost filter test case because it fails. # I strongly believe that this has to be fixed upstream, and # that the drf-haystack code works as it should. 
# def test_filter_boost(self): # # # This test will fail # # See https://github.com/django-haystack/django-haystack/issues/1235 # # request = factory.get(path="/", data={"lastname": "hood"}, content_type="application/json") # response = self.view.as_view(actions={"get": "list"})(request) # response.render() # data = json.loads(response.content.decode()) # self.assertEqual(len(response.data), 2) # self.assertEqual(data[0]["firstname"], "Bruno") # self.assertEqual(data[1]["firstname"], "Walker") # # # We're boosting walter slightly which should put him first in the results # request = factory.get(path="/", data={"lastname": "hood", "boost": "walker,1.1"}, # content_type="application/json") # response = self.view.as_view(actions={"get": "list"})(request) # response.render() # data = json.loads(response.content.decode()) # self.assertEqual(len(response.data), 2) # self.assertEqual(data[0]["firstname"], "Walker") # self.assertEqual(data[1]["firstname"], "Bruno") def test_filter_boost_valid_params(self): request = factory.get(path="/", data={"boost": "bruno,1.2"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_boost_invalid_non_numeric(self): request = factory.get(path="/", data={"boost": "bruno,i am not numeric!"}, content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not raise ValueError when called with a non-numeric boost value.") except ValueError as e: self.assertEqual( str(e), "Cannot convert boost to float value. Make sure to provide a numerical boost value." ) def test_filter_boost_invalid_malformed_query_params(self): request = factory.get(path="/", data={"boost": "bruno"}, content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not raise ValueError when called with a malformed query parameters.") except ValueError as e: self.assertEqual( str(e), "Cannot convert the '%s' query parameter to a valid boost filter." 
% HaystackBoostFilter.query_param ) class HaystackFacetFilterTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class FacetSerializer1(HaystackFacetSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] class FacetSerializer2(HaystackFacetSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "day", "gap_amount": 10 } } class ViewSet1(FacetMixin, HaystackViewSet): index_models = [MockPerson] facet_serializer_class = FacetSerializer1 class ViewSet2(FacetMixin, HaystackViewSet): index_models = [MockPerson] facet_serializer_class = FacetSerializer2 self.view1 = ViewSet1 self.view2 = ViewSet2 def tearDown(self): MockPersonIndex().clear() def test_filter_view_has_default_facet_filter(self): self.assertEqual(self.view1.facet_filter_backends, [HaystackFacetFilter]) def test_filter_facet_no_field_options(self): request = factory.get("/", data={}, content_type="application/json") response = self.view1.as_view(actions={"get": "facets"})(request) response.render() self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(json.loads(response.content.decode()), {}) def test_filter_facet_serializer_no_field_options_missing_required_query_parameters(self): request = factory.get("/", data={"created": "start_date:Oct 3rd 2015"}, content_type="application/json") try: self.view1.as_view(actions={"get": "facets"})(request) self.fail("Did not raise ValueError when called without all required " "attributes and no default field_options is set.") except ValueError as e: self.assertEqual( str(e), "Date faceting requires at least 'start_date', 'end_date' and 'gap_by' to be set." 
) def test_filter_facet_no_field_options_valid_required_query_parameters(self): request = factory.get( "/", data={"created": "start_date:Jan 1th 2010,end_date:Dec 31th 2020,gap_by:month,gap_amount:1"}, content_type="application/json" ) response = self.view1.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_facet_with_field_options(self): request = factory.get("/", data={}, content_type="application/json") response = self.view2.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_facet_warn_on_inproperly_formatted_token(self): request = factory.get("/", data={"firstname": "token"}, content_type="application/json") self.assertWarning(UserWarning, self.view2.as_view(actions={"get": "facets"}), request) class OrderedHaystackViewSetTestCase(TestCase): fixtures = ["mockallfield"] def setUp(self): MockAllFieldIndex().reindex() class Serializer(HaystackSerializer): class Meta: fields = ("charfield", "integerfield", "floatfield", "decimalfield", "boolfield") index_classes = [MockAllFieldIndex] class ViewSet1(HaystackViewSet): index_models = [MockAllField] serializer_class = Serializer filter_backends = (HaystackOrderingFilter,) ordering_fields = "__all__" ordering = ("integerfield",) class ViewSet2(HaystackViewSet): index_models = [MockAllField] serializer_class = Serializer filter_backends = (HaystackOrderingFilter,) ordering_fields = "__all__" ordering = ("-integerfield",) self.view1 = ViewSet1 self.view2 = ViewSet2 def tearDown(self): MockAllFieldIndex().clear() def test_viewset_default_ordering(self): request = factory.get(path="/", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertEqual( [result["integerfield"] for result in content], list(MockAllField.objects.values_list("integerfield", flat=True).order_by("integerfield")) ) def test_viewset_default_reverse_ordering(self): request = factory.get(path="/", content_type="application/json") response = self.view2.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertEqual( [result["integerfield"] for result in content], list(MockAllField.objects.values_list("integerfield", flat=True).order_by("-integerfield")) ) def test_viewset_order_by_single_query_param(self): request = factory.get(path="/", data={"ordering": "integerfield"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertEqual( [result["integerfield"] for result in content], list(MockAllField.objects.values_list("integerfield", flat=True).order_by("integerfield")) ) def test_viewset_order_by_multiple_query_params(self): request = factory.get(path="/", data={"ordering": "integerfield,boolfield"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertEqual( [result["integerfield"] for result in content], list(MockAllField.objects.values_list("integerfield", flat=True).order_by("integerfield", "boolfield")) ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/test_serializers.py0000664000175000017500000006210514476342707017515 0ustar00mmmmmm# -*- coding: utf-8 -*- # # 
Unit tests for the `drf_haystack.serializers` classes. # from __future__ import absolute_import, unicode_literals import json from datetime import datetime, timedelta import six from django.conf.urls import url, include from django.core.exceptions import ImproperlyConfigured from django.http import QueryDict from django.test import TestCase, SimpleTestCase, override_settings from haystack.query import SearchQuerySet from rest_framework import serializers from rest_framework.fields import CharField, IntegerField from rest_framework.routers import DefaultRouter from rest_framework.test import APIRequestFactory, APITestCase from drf_haystack import fields from drf_haystack.serializers import ( HighlighterMixin, HaystackSerializer, HaystackSerializerMixin, HaystackFacetSerializer, HaystackSerializerMeta) from drf_haystack.viewsets import HaystackViewSet from drf_haystack.mixins import MoreLikeThisMixin, FacetMixin from .mixins import WarningTestCaseMixin from .mockapp.models import MockPerson, MockAllField from .mockapp.search_indexes import MockPersonIndex, MockPetIndex, MockAllFieldIndex factory = APIRequestFactory() # More like this stuff class SearchPersonMLTSerializer(HaystackSerializer): more_like_this = serializers.HyperlinkedIdentityField(view_name="search-person-mlt-more-like-this", read_only=True) class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "full_name"] class SearchPersonMLTViewSet(MoreLikeThisMixin, HaystackViewSet): serializer_class = SearchPersonMLTSerializer class Meta: index_models = [MockPerson] # Faceting stuff class SearchPersonFSerializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "full_name"] class SearchPersonFacetSerializer(HaystackFacetSerializer): serialize_objects = True class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=10 * 365), "end_date": datetime.now(), "gap_by": "month", "gap_amount": 1 } } class SearchPersonFacetViewSet(FacetMixin, HaystackViewSet): serializer_class = SearchPersonFSerializer facet_serializer_class = SearchPersonFacetSerializer class Meta: index_models = [MockPerson] router = DefaultRouter() router.register("search-person-mlt", viewset=SearchPersonMLTViewSet, basename="search-person-mlt") router.register("search-person-facet", viewset=SearchPersonFacetViewSet, basename="search-person-facet") urlpatterns = [ url(r"^", include(router.urls)) ] class HaystackSerializerTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() class Serializer1(HaystackSerializer): integer_field = serializers.IntegerField() city = serializers.SerializerMethodField() class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] def get_integer_field(self, instance): return 1 def get_city(self, instance): return "Declared overriding field" class Serializer2(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] exclude = ["firstname"] class Serializer3(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] ignore_fields = ["autocomplete"] class Serializer7(HaystackSerializer): class Meta: index_classes = [MockPetIndex] class ViewSet1(HaystackViewSet): serializer_class = Serializer1 class Meta: index_models = 
[MockPerson] self.serializer1 = Serializer1 self.serializer2 = Serializer2 self.serializer3 = Serializer3 self.serializer7 = Serializer7 self.view1 = ViewSet1 def tearDown(self): MockPersonIndex().clear() def test_serializer_raise_without_meta_class(self): try: class Serializer(HaystackSerializer): pass self.fail("Did not fail when defining a Serializer without a Meta class") except ImproperlyConfigured as e: self.assertEqual(str(e), "%s must implement a Meta class or have the property _abstract" % "Serializer") def test_serializer_gets_default_instance(self): serializer = self.serializer1(instance=None) self.assertIsInstance(serializer.instance, SearchQuerySet, "Did not get default instance of type SearchQuerySet") def test_serializer_get_fields(self): obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer1(instance=obj) fields = serializer.get_fields() self.assertIsInstance(fields, dict) self.assertIsInstance(fields["integer_field"], IntegerField) self.assertIsInstance(fields["text"], CharField) self.assertIsInstance(fields["firstname"], CharField) self.assertIsInstance(fields["lastname"], CharField) self.assertIsInstance(fields["autocomplete"], CharField) def test_serializer_get_fields_with_exclude(self): obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer2(instance=obj) fields = serializer.get_fields() self.assertIsInstance(fields, dict) self.assertIsInstance(fields["text"], CharField) self.assertIsInstance(fields["lastname"], CharField) self.assertIsInstance(fields["autocomplete"], CharField) self.assertFalse("firstname" in fields) def test_serializer_get_fields_with_ignore_fields(self): obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer3(instance=obj) fields = serializer.get_fields() self.assertIsInstance(fields, dict) self.assertIsInstance(fields["text"], CharField) self.assertIsInstance(fields["firstname"], CharField) self.assertIsInstance(fields["lastname"], CharField) self.assertFalse("autocomplete" in fields) def test_serializer_boolean_field(self): dog = self.serializer7(instance=SearchQuerySet().filter(species="Dog")[0]) iguana = self.serializer7(instance=SearchQuerySet().filter(species="Iguana")[0]) self.assertTrue(dog.data["has_rabies"]) self.assertFalse(iguana.data["has_rabies"]) def test_serializer_declared_field_overrides(self): obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer1(instance=obj) self.assertEqual(serializer.data['city'], "Declared overriding field") class HaystackSerializerAllFieldsTestCase(TestCase): fixtures = ["mockallfield"] def setUp(self): MockAllFieldIndex().reindex() class Serializer1(HaystackSerializer): class Meta: index_classes = [MockAllFieldIndex] fields = ["charfield", "integerfield", "floatfield", "decimalfield", "boolfield", "datefield", "datetimefield", "multivaluefield"] self.serializer1 = Serializer1 def test_serialize_field_is_correct_type(self): obj = SearchQuerySet().models(MockAllField).latest('datetimefield') serializer = self.serializer1(instance=obj, many=False) self.assertIsInstance(serializer.fields['charfield'], fields.HaystackCharField) self.assertIsInstance(serializer.fields['integerfield'], fields.HaystackIntegerField) self.assertIsInstance(serializer.fields['floatfield'], fields.HaystackFloatField) self.assertIsInstance(serializer.fields['decimalfield'], fields.HaystackDecimalField) self.assertIsInstance(serializer.fields['boolfield'], fields.HaystackBooleanField) 
self.assertIsInstance(serializer.fields['datefield'], fields.HaystackDateField) self.assertIsInstance(serializer.fields['datetimefield'], fields.HaystackDateTimeField) self.assertIsInstance(serializer.fields['multivaluefield'], fields.HaystackMultiValueField) class HaystackSerializerMultipleIndexTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() class Serializer1(HaystackSerializer): """ Regular multiple index serializer """ class Meta: index_classes = [MockPersonIndex, MockPetIndex] fields = ["text", "firstname", "lastname", "name", "species", "autocomplete"] class Serializer2(HaystackSerializer): """ Multiple index serializer with declared fields """ _MockPersonIndex__hair_color = serializers.SerializerMethodField() extra = serializers.SerializerMethodField() class Meta: index_classes = [MockPersonIndex, MockPetIndex] exclude = ["firstname"] def get__MockPersonIndex__hair_color(self, instance): return "black" def get_extra(self, instance): return 1 class Serializer3(HaystackSerializer): """ Multiple index serializer with index aliases """ class Meta: index_classes = [MockPersonIndex, MockPetIndex] exclude = ["firstname"] index_aliases = { 'mockapp.MockPersonIndex': 'People' } class ViewSet1(HaystackViewSet): serializer_class = Serializer1 class ViewSet2(HaystackViewSet): serializer_class = Serializer2 class ViewSet3(HaystackViewSet): serializer_class = Serializer3 self.serializer1 = Serializer1 self.serializer2 = Serializer2 self.serializer3 = Serializer3 self.view1 = ViewSet1 self.view2 = ViewSet2 self.view3 = ViewSet3 def tearDown(self): MockPersonIndex().clear() MockPetIndex().clear() def test_serializer_multiple_index_data(self): objs = SearchQuerySet().filter(text="John") serializer = self.serializer1(instance=objs, many=True) data = serializer.data self.assertEqual(len(data), 4) for result in data: if "name" in result: self.assertTrue("species" in result, "Pet results should have 'species' and 'name' fields") self.assertTrue("firstname" not in result, "Pet results should have 'species' and 'name' fields") self.assertTrue("lastname" not in result, "Pet results should have 'species' and 'name' fields") elif "firstname" in result: self.assertTrue("lastname" in result, "Person results should have 'firstname' and 'lastname' fields") self.assertTrue("name" not in result, "Person results should have 'firstname' and 'lastname' fields") self.assertTrue("species" not in result, "Person results should have 'firstname' and 'lastname' fields") else: self.fail("Result should contain either Pet or Person fields") def test_serializer_multiple_index_declared_fields(self): objs = SearchQuerySet().filter(text="John") serializer = self.serializer2(instance=objs, many=True) data = serializer.data self.assertEqual(len(data), 4) for result in data: if "name" in result: self.assertTrue("extra" in result, "'extra' should be present in Pet results") self.assertEqual(result["extra"], 1, "The value of 'extra' should be 1") self.assertTrue("hair_color" not in result, "'hair_color' should not be present in Pet results") elif "lastname" in result: self.assertTrue("extra" in result, "'extra' should be present in Person results") self.assertEqual(result["extra"], 1, "The value of 'extra' should be 1") self.assertTrue("hair_color" in result, "'hair_color' should be present in Person results") self.assertEqual(result["hair_color"], "black", "The value of 'hair_color' should be 'black'") else: self.fail("Result should 
contain either Pet or Person fields") class HaystackSerializerHighlighterMixinTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer2(HighlighterMixin, HaystackSerializer): highlighter_html_tag = "div" highlighter_css_class = "my-fancy-highlighter" highlighter_field = "description" class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "description"] class Serializer3(Serializer2): highlighter_class = None class ViewSet1(HaystackViewSet): serializer_class = Serializer2 class ViewSet2(HaystackViewSet): serializer_class = Serializer3 self.view1 = ViewSet1 self.view2 = ViewSet2 def tearDown(self): MockPersonIndex().clear() def test_serializer_highlighting(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() for result in json.loads(response.content.decode()): self.assertTrue("highlighted" in result) self.assertEqual( result["highlighted"], " ".join(('<%(tag)s class="%(css_class)s">Jeremy' % { "tag": self.view1.serializer_class.highlighter_html_tag, "css_class": self.view1.serializer_class.highlighter_css_class }, "%s" % "is a nice chap!")) ) def test_serializer_highlighter_raise_no_highlighter_class(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") try: self.view2.as_view(actions={"get": "list"})(request) self.fail("Did not raise ImproperlyConfigured error when called without a serializer_class") except ImproperlyConfigured as e: self.assertEqual( str(e), "%(cls)s is missing a highlighter_class. Define %(cls)s.highlighter_class, " "or override %(cls)s.get_highlighter()." % {"cls": self.view2.serializer_class.__name__} ) @override_settings(ROOT_URLCONF="tests.test_serializers") class HaystackSerializerMoreLikeThisTestCase(APITestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() def tearDown(self): MockPersonIndex().clear() def test_serializer_more_like_this_link(self): response = self.client.get( path="/search-person-mlt/", data={"firstname": "odysseus", "lastname": "cooley"}, format="json" ) self.assertEqual( response.data, [{ "lastname": "Cooley", "full_name": "Odysseus Cooley", "firstname": "Odysseus", "more_like_this": "http://testserver/search-person-mlt/18/more-like-this/" }] ) @override_settings(ROOT_URLCONF="tests.test_serializers") class HaystackFacetSerializerTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() self.response = self.client.get( path="/search-person-facet/facets/", data={}, format="json" ) def tearDown(self): MockPersonIndex().clear() @staticmethod def build_absolute_uri(location): """ Builds an absolute URI using the test server's domain and the specified location. """ location = location.lstrip("/") return "http://testserver/{location}".format(location=location) @staticmethod def is_paginated_facet_response(response): """ Returns True if the response.data seems like a faceted result. Only works for responses created with the test client. 
""" return "objects" in response.data and \ all([k in response.data["objects"] for k in ("count", "next", "previous", "results")]) def test_serializer_facet_top_level_structure(self): for key in ("fields", "dates", "queries"): self.assertContains(self.response, key, count=1) def test_serializer_facet_field_result(self): fields = self.response.data["fields"] for field in ("firstname", "lastname"): self.assertTrue(field in fields) self.assertTrue(isinstance(fields[field], list)) firstname = fields["firstname"][0] self.assertTrue({"text", "count", "narrow_url"} <= set(firstname)) self.assertEqual( firstname["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=firstname_exact%3A{term}".format( term=firstname["text"])) ) lastname = fields["lastname"][0] self.assertTrue({"text", "count", "narrow_url"} <= set(lastname)) self.assertEqual( lastname["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=lastname_exact%3A{term}".format( term=lastname["text"] )) ) def test_serializer_facet_date_result(self): dates = self.response.data["dates"] self.assertTrue("created" in dates) self.assertEqual(len(dates["created"]), 1) created = dates["created"][0] self.assertTrue(all([k in created for k in ("text", "count", "narrow_url")])) self.assertEqual(created["text"], "2015-05-01T00:00:00Z") self.assertEqual(created["count"], 100) self.assertEqual( created["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=created_exact%3A2015-05-01+00%3A00%3A00") ) def test_serializer_facet_queries_result(self): # Not Implemented pass def test_serializer_facet_narrow(self): response = self.client.get( path="/search-person-facet/facets/", data=QueryDict("selected_facets=firstname_exact:John&selected_facets=lastname_exact:McClane"), format="json" ) self.assertEqual(response.data["queries"], {}) self.assertTrue([all(("firstname", "lastname" in response.data["fields"]))]) self.assertEqual(len(response.data["fields"]["firstname"]), 1) self.assertEqual(response.data["fields"]["firstname"][0]["text"], "John") self.assertEqual(response.data["fields"]["firstname"][0]["count"], 1) self.assertEqual( response.data["fields"]["firstname"][0]["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=firstname_exact%3AJohn" "&selected_facets=lastname_exact%3AMcClane") ) self.assertEqual(len(response.data["fields"]["lastname"]), 1) self.assertEqual(response.data["fields"]["lastname"][0]["text"], "McClane") self.assertEqual(response.data["fields"]["lastname"][0]["count"], 1) self.assertEqual( response.data["fields"]["lastname"][0]["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=firstname_exact%3AJohn" "&selected_facets=lastname_exact%3AMcClane") ) self.assertTrue("created" in response.data["dates"]) self.assertEqual(len(response.data["dates"]), 1) self.assertEqual(response.data["dates"]["created"][0]["text"], "2015-05-01T00:00:00Z") self.assertEqual(response.data["dates"]["created"][0]["count"], 1) self.assertEqual( response.data["dates"]["created"][0]["narrow_url"], self.build_absolute_uri("/search-person-facet/facets/?selected_facets=created_exact%3A2015-05-01+00%3A00%3A00" "&selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcClane" ) ) def test_serializer_raise_without_meta_class(self): try: class FacetSerializer(HaystackFacetSerializer): pass self.fail("Did not fail when defining a Serializer without a Meta class") except ImproperlyConfigured as e: 
self.assertEqual(str(e), "%s must implement a Meta class or have the property _abstract" % "FacetSerializer") class HaystackSerializerMixinTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class MockPersonSerializer(serializers.ModelSerializer): class Meta: model = MockPerson fields = ('id', 'firstname', 'lastname', 'created', 'updated') read_only_fields = ('created', 'updated') class Serializer1(HaystackSerializerMixin, MockPersonSerializer): class Meta(MockPersonSerializer.Meta): search_fields = ['text', ] class Viewset1(HaystackViewSet): serializer_class = Serializer1 self.serializer1 = Serializer1 self.viewset1 = Viewset1 def tearDown(self): MockPersonIndex().clear() def test_serializer_mixin(self): objs = SearchQuerySet().filter(text="Foreman") serializer = self.serializer1(instance=objs, many=True) self.assertEqual( json.loads(json.dumps(serializer.data)), [{ "id": 1, "firstname": "Abel", "lastname": "Foreman", "created": "2015-05-19T10:48:08.686000Z", "updated": "2016-04-24T16:02:59.378000Z" }] ) class HaystackMultiSerializerTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() class MockPersonSerializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ('text', 'firstname', 'lastname', 'description') class MockPetSerializer(HaystackSerializer): class Meta: index_classes = [MockPetIndex] exclude = ('description', 'autocomplete') class Serializer1(HaystackSerializer): class Meta: serializers = { MockPersonIndex: MockPersonSerializer, MockPetIndex: MockPetSerializer } self.serializer1 = Serializer1 def tearDown(self): MockPersonIndex().clear() MockPetIndex().clear() def test_multi_serializer(self): objs = SearchQuerySet().filter(text="Zane") serializer = self.serializer1(instance=objs, many=True) self.assertEqual( json.loads(json.dumps(serializer.data)), [{ "has_rabies": True, "text": "Zane", "name": "Zane", "species": "Dog" }, { "text": "Zane Griffith\n", "firstname": "Zane", "lastname": "Griffith", "description": "Zane is a nice chap!" }] ) class TestHaystackSerializerMeta(SimpleTestCase): def test_abstract_not_inherited(self): class Base(six.with_metaclass(HaystackSerializerMeta, serializers.Serializer)): _abstract = True def create_subclass(): class Sub(HaystackSerializer): pass self.assertRaises(ImproperlyConfigured, create_subclass) class TestMeta(SimpleTestCase): def test_inheritance(self): """ Tests that Meta fields are correctly overriden by subclasses. 
""" class Serializer(HaystackSerializer): class Meta: fields = ('overriden_fields',) self.assertEqual(Serializer.Meta.fields, ('overriden_fields',)) def test_default_attrs(self): class Serializer(HaystackSerializer): class Meta: fields = ('overriden_fields',) self.assertEqual(Serializer.Meta.exclude, tuple()) def test_raises_if_fields_and_exclude_defined(self): def create_subclass(): class Serializer(HaystackSerializer): class Meta: fields = ('include_field',) exclude = ('exclude_field',) return Serializer self.assertRaises(ImproperlyConfigured, create_subclass) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/test_utils.py0000664000175000017500000000274014476342707016320 0ustar00mmmmmm# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from django.test import TestCase from drf_haystack.utils import merge_dict class MergeDictTestCase(TestCase): def setUp(self): self.dict_a = { "person": { "lastname": "Holmes", "combat_proficiency": [ "Pistol", "boxing" ] }, } self.dict_b = { "person": { "gender": "male", "firstname": "Sherlock", "location": { "address": "221B Baker Street" }, "combat_proficiency": [ "sword", "Martial arts", ] } } def test_utils_merge_dict(self): self.assertEqual(merge_dict(self.dict_a, self.dict_b), { "person": { "gender": "male", "firstname": "Sherlock", "lastname": "Holmes", "location": { "address": "221B Baker Street" }, "combat_proficiency": [ "Martial arts", "Pistol", "boxing", "sword", ] } }) def test_utils_merge_dict_invalid_input(self): self.assertEqual(merge_dict(self.dict_a, "I'm not a dict!"), "I'm not a dict!") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/test_viewsets.py0000664000175000017500000003346214476342707017036 0ustar00mmmmmm# -*- coding: utf-8 -*- # # Unit tests for the `drf_haystack.viewsets` classes. # from __future__ import absolute_import, unicode_literals import json from unittest import skipIf from django.test import TestCase from django.contrib.auth.models import User from haystack.query import SearchQuerySet from rest_framework import status from rest_framework.pagination import PageNumberPagination from rest_framework.routers import SimpleRouter from rest_framework.serializers import Serializer from rest_framework.test import force_authenticate, APIRequestFactory from drf_haystack.viewsets import HaystackViewSet from drf_haystack.serializers import HaystackSerializer, HaystackFacetSerializer from drf_haystack.mixins import MoreLikeThisMixin, FacetMixin from . 
import restframework_version from .mockapp.models import MockPerson, MockPet from .mockapp.search_indexes import MockPersonIndex, MockPetIndex factory = APIRequestFactory() class HaystackViewSetTestCase(TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() self.router = SimpleRouter() class FacetSerializer(HaystackFacetSerializer): class Meta: fields = ["firstname", "lastname", "created"] class ViewSet1(FacetMixin, HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer facet_serializer_class = FacetSerializer class ViewSet2(MoreLikeThisMixin, HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer class ViewSet3(HaystackViewSet): index_models = [MockPerson, MockPet] serializer_class = Serializer self.view1 = ViewSet1 self.view2 = ViewSet2 self.view3 = ViewSet3 def tearDown(self): MockPersonIndex().clear() def test_viewset_get_queryset_no_queryset(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_queryset(self): setattr(self.view1, "queryset", SearchQuerySet().all()) request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_object_single_index(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "retrieve"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_object_multiple_indices(self): request = factory.get(path="/", data={"model": "mockapp.mockperson"}, content_type="application/json") response = self.view3.as_view(actions={"get": "retrieve"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_object_multiple_indices_no_model_query_param(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view3.as_view(actions={"get": "retrieve"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) def test_viewset_get_object_multiple_indices_invalid_modelname(self): request = factory.get(path="/", data={"model": "spam"}, content_type="application/json") response = self.view3.as_view(actions={"get": "retrieve"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) def test_viewset_get_obj_raise_404(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "retrieve"})(request, pk=100000) self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) def test_viewset_get_object_invalid_lookup_field(self): request = factory.get(path="/", data="", content_type="application/json") self.assertRaises( AttributeError, self.view1.as_view(actions={"get": "retrieve"}), request, invalid_lookup=1 ) def test_viewset_get_obj_override_lookup_field(self): setattr(self.view1, "lookup_field", "custom_lookup") request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "retrieve"})(request, custom_lookup=1) setattr(self.view1, "lookup_field", "pk") self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_more_like_this_decorator(self): route = 
self.router.get_routes(self.view2)[2:].pop() self.assertEqual(route.url, "^{prefix}/{lookup}/more-like-this{trailing_slash}$") self.assertEqual(route.mapping, {"get": "more_like_this"}) def test_viewset_more_like_this_action_route(self): request = factory.get(path="/", data={}, content_type="application/json") response = self.view2.as_view(actions={"get": "more_like_this"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_facets_action_route(self): request = factory.get(path="/", data={}, content_type="application/json") response = self.view1.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) class HaystackViewSetPermissionsTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class ViewSet(HaystackViewSet): serializer_class = Serializer self.view = ViewSet self.user = User.objects.create_user(username="user", email="user@example.com", password="user") self.admin_user = User.objects.create_superuser(username="admin", email="admin@example.com", password="admin") def tearDown(self): MockPersonIndex().clear() def test_viewset_get_queryset_with_no_permsission(self): setattr(self.view, "permission_classes", []) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_AllowAny_permission(self): from rest_framework.permissions import AllowAny setattr(self.view, "permission_classes", (AllowAny, )) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAuthenticated_permission(self): from rest_framework.permissions import IsAuthenticated setattr(self.view, "permission_classes", (IsAuthenticated, )) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN) force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAdminUser_permission(self): from rest_framework.permissions import IsAdminUser setattr(self.view, "permission_classes", (IsAdminUser,)) request = factory.get(path="/", data="", content_type="application/json") force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN) force_authenticate(request, user=self.admin_user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAuthenticatedOrReadOnly_permission(self): from rest_framework.permissions import IsAuthenticatedOrReadOnly setattr(self.view, "permission_classes", (IsAuthenticatedOrReadOnly,)) # Unauthenticated GET requests should pass request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) # Authenticated GET requests should pass request = factory.get(path="/", data="", content_type="application/json") 
force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) # POST, PUT, PATCH and DELETE requests are not supported, so they will # raise an error. No need to test the permission. @skipIf(not restframework_version < (3, 7), "Skipped due to fix in django-rest-framework > 3.6") def test_viewset_get_queryset_with_DjangoModelPermissions_permission(self): from rest_framework.permissions import DjangoModelPermissions setattr(self.view, "permission_classes", (DjangoModelPermissions,)) # The `DjangoModelPermissions` is not supported and should raise an # AssertionError from rest_framework.permissions. request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError or AttributeError " "when calling HaystackView with DjangoModelPermissions") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") def test_viewset_get_queryset_with_DjangoModelPermissionsOrAnonReadOnly_permission(self): from rest_framework.permissions import DjangoModelPermissionsOrAnonReadOnly setattr(self.view, "permission_classes", (DjangoModelPermissionsOrAnonReadOnly,)) # The `DjangoModelPermissionsOrAnonReadOnly` is not supported and should raise an # AssertionError from rest_framework.permissions. request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError when calling HaystackView " "with DjangoModelPermissionsOrAnonReadOnly") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") @skipIf(not restframework_version < (3, 7), "Skipped due to fix in django-rest-framework > 3.6") def test_viewset_get_queryset_with_DjangoObjectPermissions_permission(self): from rest_framework.permissions import DjangoObjectPermissions setattr(self.view, "permission_classes", (DjangoObjectPermissions,)) # The `DjangoObjectPermissions` is a subclass of `DjangoModelPermissions` and # therefore unsupported. 
request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError when calling HaystackView with DjangoModelPermissions") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") class PaginatedHaystackViewSetTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer1(HaystackSerializer): class Meta: fields = ["firstname", "lastname"] index_classes = [MockPersonIndex] class NumberPagination(PageNumberPagination): page_size = 5 class ViewSet1(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer1 pagination_class = NumberPagination self.view1 = ViewSet1 def tearDown(self): MockPersonIndex().clear() def test_viewset_PageNumberPagination_results(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertTrue(all(k in content for k in ("count", "next", "previous", "results"))) self.assertEqual(len(content["results"]), 5) def test_viewset_PageNumberPagination_navigation_urls(self): request = factory.get(path="/", data={"page": 2}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) response.render() content = json.loads(response.content.decode()) self.assertEqual(content["previous"], "http://testserver/") self.assertEqual(content["next"], "http://testserver/?page=3") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/urls.py0000664000175000017500000000073614476342707015111 0ustar00mmmmmm# -*- coding: utf-8 -*- from django.conf.urls import include, url from rest_framework import routers from tests.mockapp.views import SearchPersonFacetViewSet, SearchPersonMLTViewSet router = routers.DefaultRouter() router.register("search-person-facet", viewset=SearchPersonFacetViewSet, basename="search-person-facet") router.register("search-person-mlt", viewset=SearchPersonMLTViewSet, basename="search-person-mlt") urlpatterns = [ url(r"^", include(router.urls)) ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1694090695.0 drf-haystack-1.8.13/tests/wsgi.py0000664000175000017500000000060114476342707015064 0ustar00mmmmmm""" WSGI config for tests project. It exposes the WSGI callable as a module-level variable named ``application``. 
For more information on this file, see
https://docs.djangoproject.com/en/1.7/howto/deployment/wsgi/
"""
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "tests.settings")

from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()


drf-haystack-1.8.13/tox.ini

[tox]
envlist =
    docs,
    pypy3-django2.2-es1.x,
    pypy3-django2.2-es2.x,
    py36-django2.2-es1.x,
    py36-django2.2-es2.x,
    py36-django3.0-es1.x,
    py36-django3.0-es2.x,
    py36-django3.1-es1.x,
    py36-django3.1-es2.x,
    py36-django3.2-es1.x,
    py36-django3.2-es2.x,
    py37-django2.2-es1.x,
    py37-django2.2-es2.x,
    py37-django3.0-es1.x,
    py37-django3.0-es2.x,
    py37-django3.1-es1.x,
    py37-django3.1-es2.x,
    py37-django3.2-es1.x,
    py37-django3.2-es2.x,
    py38-django2.2-es1.x,
    py38-django2.2-es2.x,
    py38-django3.0-es1.x,
    py38-django3.0-es2.x,
    py38-django3.1-es1.x,
    py38-django3.1-es2.x,
    py38-django3.2-es1.x,
    py38-django3.2-es2.x,
    py38-django4.1-es1.x,
    py38-django4.1-es2.x,
    py39-django2.2-es1.x,
    py39-django2.2-es2.x,
    py39-django3.0-es1.x,
    py39-django3.0-es2.x,
    py39-django3.1-es1.x,
    py39-django3.1-es2.x,
    py39-django3.2-es1.x,
    py39-django3.2-es2.x,
    py39-django4.1-es1.x,
    py39-django4.1-es2.x,

[base]
deps =
    geopy

[django2.2]
deps =
    Django>=2.2,<=2.3

[django3.0]
deps =
    Django>=3.0,<3.1

[django3.1]
deps =
    Django>=3.1,<3.2

[django3.2]
deps =
    Django>=3.2,<3.3

[django4.1]
deps =
    Django>=4.0,<=4.2

[es1.x]
setenv =
    VERSION_ES=>=1.0.0,<2.0.0
    ELASTICSEARCH_URL=http://localhost:9200/
deps =
    elasticsearch>=1.0.0,<2.0.0

[es2.x]
setenv =
    VERSION_ES=>=2.0.0,<3.0.0
    ELASTICSEARCH_URL=http://localhost:9201/
deps =
    elasticsearch>=2.0.0,<3.0.0

[testenv]
basepython =
    py36: python3.6
    py37: python3.7
    py38: python3.8
    py39: python3.9
    pypy3: pypy3
passenv =
    TRAVIS
    TRAVIS_JOB_ID
    TRAVIS_BRANCH
commands =
    python {toxinidir}/setup.py test

[testenv:docs]
changedir = docs
deps =
    sphinx
    sphinx-rtd-theme
commands =
    sphinx-build -W -b html -d {envtmpdir}/doctrees .
{envtmpdir}/html # # PyPy 3.6 # [testenv:pypy3-django2.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django2.2]deps} {[base]deps} [testenv:pypy3-django2.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django2.2]deps} {[base]deps} # # CPython3.6 # [testenv:py36-django2.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django2.2]deps} {[base]deps} [testenv:py36-django2.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django2.2]deps} {[base]deps} [testenv:py36-django3.0-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.0]deps} {[base]deps} [testenv:py36-django3.0-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.0]deps} {[base]deps} [testenv:py36-django3.1-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.1]deps} {[base]deps} [testenv:py36-django3.1-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.1]deps} {[base]deps} [testenv:py36-django3.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.2]deps} {[base]deps} [testenv:py36-django3.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.2]deps} {[base]deps} # # CPython3.7 # [testenv:py37-django2.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django2.2]deps} {[base]deps} [testenv:py37-django2.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django2.2]deps} {[base]deps} [testenv:py37-django3.0-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.0]deps} {[base]deps} [testenv:py37-django3.0-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.0]deps} {[base]deps} [testenv:py37-django3.1-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.1]deps} {[base]deps} [testenv:py37-django3.1-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.1]deps} {[base]deps} [testenv:py37-django3.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.2]deps} {[base]deps} [testenv:py37-django3.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.2]deps} {[base]deps} # # CPython3.8 # [testenv:py38-django2.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django2.2]deps} {[base]deps} [testenv:py38-django2.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django2.2]deps} {[base]deps} [testenv:py38-django3.0-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.0]deps} {[base]deps} [testenv:py38-django3.0-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.0]deps} {[base]deps} [testenv:py38-django3.1-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.1]deps} {[base]deps} [testenv:py38-django3.1-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.1]deps} {[base]deps} [testenv:py38-django3.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.2]deps} {[base]deps} [testenv:py38-django3.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.2]deps} {[base]deps} [testenv:py38-django4.1-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django4.1]deps} {[base]deps} [testenv:py38-django4.1-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django4.1]deps} {[base]deps} # # CPython3.9 # [testenv:py39-django2.2-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django2.2]deps} {[base]deps} [testenv:py39-django2.2-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django2.2]deps} {[base]deps} [testenv:py39-django3.0-es1.x] setenv = {[es1.x]setenv} deps = {[es1.x]deps} {[django3.0]deps} {[base]deps} [testenv:py39-django3.0-es2.x] setenv = {[es2.x]setenv} deps = {[es2.x]deps} {[django3.0]deps} 
    {[base]deps}

[testenv:py39-django3.1-es1.x]
setenv =
    {[es1.x]setenv}
deps =
    {[es1.x]deps}
    {[django3.1]deps}
    {[base]deps}

[testenv:py39-django3.1-es2.x]
setenv =
    {[es2.x]setenv}
deps =
    {[es2.x]deps}
    {[django3.1]deps}
    {[base]deps}

[testenv:py39-django3.2-es1.x]
setenv =
    {[es1.x]setenv}
deps =
    {[es1.x]deps}
    {[django3.2]deps}
    {[base]deps}

[testenv:py39-django3.2-es2.x]
setenv =
    {[es2.x]setenv}
deps =
    {[es2.x]deps}
    {[django3.2]deps}
    {[base]deps}

[testenv:py39-django4.1-es1.x]
setenv =
    {[es1.x]setenv}
deps =
    {[es1.x]deps}
    {[django4.1]deps}
    {[base]deps}

[testenv:py39-django4.1-es2.x]
setenv =
    {[es2.x]setenv}
deps =
    {[es2.x]deps}
    {[django4.1]deps}
    {[base]deps}
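The [es1.x] and [es2.x] sections above export ELASTICSEARCH_URL so each tox environment can point the test suite at the matching Elasticsearch instance. The tests/settings.py module that consumes this variable is not part of this listing; the fragment below is only a minimal sketch of how such a settings file could wire it into Haystack, assuming the stock django-haystack Elasticsearch backend and a made-up index name.

# Hypothetical settings fragment -- not the project's actual tests/settings.py.
import os

# Fall back to the [es1.x] default when the variable is not exported.
ELASTICSEARCH_URL = os.environ.get("ELASTICSEARCH_URL", "http://localhost:9200/")

HAYSTACK_CONNECTIONS = {
    "default": {
        # Classic django-haystack Elasticsearch 1.x backend; for the 2.x client
        # pinned by [es2.x], django-haystack ships a separate
        # elasticsearch2_backend module instead.
        "ENGINE": "haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine",
        "URL": ELASTICSEARCH_URL,
        "INDEX_NAME": "drf_haystack_tests",  # arbitrary example index name
    },
}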