drf-haystack-1.5.6/0000755000076500000240000000000012627546270014203 5ustar rolfstaff00000000000000drf-haystack-1.5.6/docs/0000755000076500000240000000000012627546270015133 5ustar rolfstaff00000000000000drf-haystack-1.5.6/docs/advanced_usage.rst0000644000076500000240000010167112627527535020627 0ustar rolfstaff00000000000000.. _advanced-usage-label: ============== Advanced Usage ============== Make sure you've read through the :ref:`basic-usage-label`. Query Field Lookups =================== You can also use field lookups in your field queries. See the Haystack `field lookups `_ documentation for info on what lookups are available. A query using a lookup might look like the following: .. code-block:: none http://example.com/api/v1/location/search/?city__startswith=Os This would perform a query looking up all documents where the `city field` started with "Os". You might get "Oslo", "Osaka", and "Ostrava". Query Term Negation ------------------- You can also specify terms to exclude from the search results using the negation keyword. The default keyword is "not", but is configurable via settings using ``DRF_HAYSTACK_NEGATION_KEYWORD``. .. code-block:: none http://example.com/api/v1/location/search/?city__not=Oslo http://example.com/api/v1/location/search/?city__not__contains=Los http://example.com/api/v1/location/search/?city__contains=Los&city__not__contains=Angeles Autocomplete ============ Some kind of data such as ie. cities and zip codes could be useful to autocomplete. We have a Django REST Framework filter for performing autocomplete queries. It works quite like the regular ``HaystackFilter`` but *must* be run against an ``NgramField`` or ``EdgeNgramField`` in order to work properly. The main difference is that while the HaystackFilter performs a bitwise ``OR`` on terms for the same parameters, the ``HaystackAutocompleteFilter`` reduce query parameters down to a single filter (using an ``SQ`` object), and performs a bitwise ``AND``. .. class:: drf_haystack.filters.HaystackAutocompleteFilter By adding a list or tuple of ``ignore_fields`` to the serializer's Meta class, we can tell the REST framework to ignore these fields. This is handy in cases, where you do not want to serialize and transfer the content of a text, or n-gram index down to the client. An example using the autocomplete filter might look something like this. .. code-block:: python class AutocompleteSerializer(HaystackSerializer): class Meta: index_classes = [LocationIndex] fields = ["address", "city", "zip_code", "autocomplete"] ignore_fields = ["autocomplete"] # The `field_aliases` attribute can be used in order to alias a # query parameter to a field attribute. In this case a query like # /search/?q=oslo would alias the `q` parameter to the `autocomplete` # field on the index. field_aliases = { "q": "autocomplete" } class AutocompleteSearchViewSet(HaystackViewSet): index_models = [Location] serializer_class = AutocompleteSerializer filter_backends = [HaystackAutocompleteFilter] GEO Locations ============= Some search backends support geo spatial searching. In order to take advantage of this we have the ``HaystackGEOSpatialFilter``. .. class:: drf_haystack.filters.HaystackGEOSpatialFilter .. note:: The ``HaystackGEOSpatialFilter`` depends on ``geopy`` and ``libgeos``. Make sure to install these libraries in order to use this filter. .. code-block:: none $ pip install geopy $ apt-get install libgeos (for debian based linux distros) or $ brew install geos (for homebrew on OS X) The geospatial filter is somewhat special, and for the time being, relies on a few assumptions. #. The index model **must** to have a ``LocationField`` named ``coordinates`` (See :ref:`search-index-example-label` for example). If your ``LocationField`` is named differently, instead of using the ``HaystackGEOSpatialFilter``, subclass the ``BaseHaystackGEOSpatialFilter`` and provide the name of your ``LocationField`` in ``point_field`` (string). #. The query **must** contain a ``unit`` parameter where the unit is a valid ``UNIT`` in the ``django.contrib.gis.measure.Distance`` class. #. The query **must** contain a ``from`` parameter which is a comma separated longitude and latitude value. **Example Geospatial view** .. code-block:: python class DistanceSerializer(serializers.Serializer): m = serializers.FloatField() km = serializers.FloatField() class LocationSerializer(HaystackSerializer): distance = SerializerMethodField() class Meta: index_classes = [LocationIndex] fields = ["address", "city", "zip_code", "location"] def get_distance(self, obj): if hasattr(obj, "distance"): return DistanceSerializer(obj.distance, many=False).data class LocationGeoSearchViewSet(HaystackViewSet): index_models = [Location] serializer_class = LocationSerializer filter_backends = [HaystackGEOSpatialFilter] **Example subclassing the ``BaseHaystackGEOSpatialFilter``** Assuming that your ``LocationField`` is named ``location``. .. code-block:: python from drf_haystack.filters import BaseHaystackGEOSpatialFilter class CustomHaystackGEOSpatialFilter(BaseHaystackGEOSpatialFilter): point_field = 'location' class LocationGeoSearchViewSet(HaystackViewSet): index_models = [Location] serializer_class = LocationSerializer filter_backends = [CustomHaystackGEOSpatialFilter] Assuming the above code works as it should, we would be able to do queries like this: .. code-block:: none /api/v1/search/?zip_code=0351&km=10&from=59.744076,10.152045 The above query would return all entries with zip_code 0351 within 10 kilometers from the location with latitude 59.744076 and longitude 10.152045. Highlighting ============ Haystack supports two kinds of `Highlighting `_, and we support them both. #. SearchQuerySet highlighting. This kind of highlighting requires a search backend which has support for highlighting, such as Elasticsearch or Solr. #. Pure python highlighting. This implementation is somewhat slower, but enables highlighting support even if your search backend does not support it. .. note:: The highlighter will always use the ``document=True`` field on your index to hightlight on. See examples below. SearchQuerySet Highlighting --------------------------- In order to add support for ``SearchQuerySet().highlight()``, all you have to do is to add the ``HaystackHighlightFilter`` to the ``filter_backends`` in your view. The ``HaystackSerializer`` will check if your queryset has highlighting enabled, and render an additional ``highlighted`` field to your result. The highlighted words will be encapsulated in an ``words go here`` html tag. .. warning:: The ``SQHighlighterMixin()`` is deprecated in favor of the ``HaystackHighlightFilter()`` filter backend. .. class:: drf_haystack.filters.HaystackHighlightFilter **Example view with highlighting enabled** .. code-block:: python from drf_haystack.viewsets import HaystackViewSet from drf_haystack.filters import HaystackHighlightFilter from .models import Person from .serializers import PersonSerializer class SearchViewSet(HaystackViewSet): index_models = [Person] serializer_class = PersonSerializer filter_backends = [HaystackHighlightFilter] Given a query like below .. code-block:: none /api/v1/search/?firstname=jeremy We would get a result like this .. code-block:: json [ { "lastname": "Rowland", "full_name": "Jeremy Rowland", "firstname": "Jeremy", "highlighted": "Jeremy Rowland\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n" }, { "lastname": "Fowler", "full_name": "Jeremy Fowler", "firstname": "Jeremy", "highlighted": "Jeremy Fowler\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n" } ] Pure Python Highlighting ------------------------ This implementation make use of the haystack ``Highlighter()`` class. It is implemented as a mixin class, and must be applied on the ``Serializer``. This is somewhat slower, but more configurable than the ``SQHighlighterMixin()``. .. class:: drf_haystack.serializers.HighlighterMixin The Highlighter class will be initialized with the following default options, but can be overridden by changing any of the following class attributes. .. code-block:: python highlighter_class = Highlighter highlighter_css_class = "highlighted" highlighter_html_tag = "span" highlighter_max_length = 200 highlighter_field = None The Highlighter class will usually highlight the ``document_field`` (the field marked ``document=True`` on your search index class), but this may be overridden by changing the ``highlighter_field``. You can of course also use your own ``Highlighter`` class by overriding the ``highlighter_class = MyFancyHighLighter`` class attribute. **Example serializer with highlighter support** .. code-block:: python from drf_haystack.serializers import HighlighterMixin, HaystackSerializer class PersonSerializer(HighlighterMixin, HaystackSerializer): highlighter_css_class = "my-highlighter-class" highlighter_html_tag = "em" class Meta: index_classes = [PersonIndex] fields = ["firstname", "lastname", "full_name"] Response .. code-block:: json [ { "full_name": "Jeremy Rowland", "lastname": "Rowland", "firstname": "Jeremy", "highlighted": "Jeremy Rowland\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n" }, { "full_name": "Jeremy Fowler", "lastname": "Fowler", "firstname": "Jeremy", "highlighted": "Jeremy Fowler\nCreated: May 19, 2015, 10:48 a.m.\nLast modified: May 19, 2015, 10:48 a.m.\n" } ] .. _more-like-this-label: More Like This ============== Some search backends supports ``More Like This`` features. In order to take advantage of this, the ``HaystackViewSet`` includes a ``more-like-this`` detail route which is appended to the base name of the ViewSet. Lets say you have a router which looks like this: .. code-block:: python router = routers.DefaultRouter() router.register("search", viewset=SearchViewSet, base_name="search") # MLT name will be 'search-more-like-this'. urlpatterns = patterns( "", url(r"^", include(router.urls)) ) The important thing here is that the ``SearchViewSet`` class inherits from the ``HaystackViewSet`` class in order to get the ``more-like-this`` route automatically added. The view name will be ``{base_name}-more-like-this``, which in this case would be for example ``search-more-like-this``. Serializing the More Like This URL ---------------------------------- In order to include the ``more-like-this`` url in your result you only have to add a ``HyperlinkedIdentityField`` to your serializer. Something like this should work okay. **Example serializer with More Like This** .. code-block:: python class SearchSerializer(HaystackSerializer): more_like_this = serializers.HyperlinkedIdentityField(view_name="search-more-like-this", read_only=True) class Meta: index_classes = [PersonIndex] fields = ["firstname", "lastname", "full_name"] Now, every result you render with this serializer will include a ``more_like_this`` field containing the url for similar results. Example response .. code-block:: json [ { "full_name": "Jeremy Rowland", "lastname": "Rowland", "firstname": "Jeremy", "more_like_this": "http://example.com/search/5/more-like-this/" } ] .. _term-boost-label: Term Boost ========== .. warning:: **BIG FAT WARNING** As far as I can see, the term boost functionality is implemented by the specs in the `Haystack documentation `_, however it does not really work as it should! When applying term boost, results are discarded from the search result, and not re-ordered by boost weight as they should. These are known problems and there exists open issues for them: - https://github.com/inonit/drf-haystack/issues/21 - https://github.com/django-haystack/django-haystack/issues/1235 - https://github.com/django-haystack/django-haystack/issues/508 **Please do not use this unless you really know what you are doing!** (And please let me know if you know how to fix it!) Term boost is achieved on the SearchQuerySet level by calling ``SearchQuerySet().boost()``. It is implemented as a filter backend, and applies boost **after** regular filtering has occurred. .. class:: drf_haystack.filters.HaystackBoostFilter .. code-block:: python from drf_haystack.filters import HaystackBoostFilter class SearchViewSet(HaystackViewSet): ... filter_backends = [HaystackBoostFilter] The filter expects the query string to contain a ``boost`` parameter, which is a comma separated string of the term to boost and the boost value. The boost value must be either an integer or float value. **Example query** .. code-block:: none /api/v1/search/?firstname=robin&boost=hood,1.1 The query above will first filter on ``firstname=robin`` and next apply a slight boost on any document containing the word ``hood``. .. note:: Term boost are only applied on terms existing in the ``document field``. .. _faceting-label: Faceting ======== Faceting is a way of grouping and narrowing search results by a common factor, for example we can group all results which are registered on a certain date. Similar to :ref:`more-like-this-label`, the faceting functionality is implemented by setting up a special ``^search/facets/$`` route on any view which inherits from the ``HaystackViewSet`` class. .. note:: Options used for faceting is **not** portable across search backends. Make sure to provide options suitable for the backend you're using. First, read the `Haystack faceting docs `_ and set up your search index for faceting. Serializing faceted counts -------------------------- Faceting is a little special in terms that it *does not* care about SearchQuerySet filtering. Faceting is performed by calling the ``SearchQuerySet().facet(field, **options)`` and ``SearchQuerySet().date_facet(field, **options)`` methods, which will apply facets to the SearchQuerySet. Next we need to call the ``SearchQuerySet().facet_counts()`` in order to retrieve a dictionary with all the *counts* for the faceted fields. We have a special ``HaystackFacetSerializer`` class which is designed to serialize these results. .. tip:: It *is* possible to perform faceting on a subset of the queryset, in which case you'd have to override the ``get_queryset()`` method of the view to limit the queryset before it is passed on to the ``filter_facet_queryset()`` method. The ``HaystackFacetSerializer`` overrides a number of methods and is customized to only serialize facets in a very specific format. Using this serializer for other stuff will probably not work very good. Consider yourself warned! Any serializer subclassed from the ``HaystackFacetSerializer`` is expected to have a ``field_options`` dictionary containing a set of default options passed to ``facet()`` and ``date_facet()``. **Facet serializer example** .. code-block:: python class PersonFacetSerializer(HaystackFacetSerializer): serialize_objects = False # Setting this to True will serialize the # queryset into an `objects` list. This # is useful if you need to display the faceted # results. Defaults to False. class Meta: index_classes = [PersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "month", "gap_amount": 3 } } The declared ``field_options`` will be used as default options when faceting is applied to the queryset, but can be overridden by supplying query string parameters in the following format. .. code-block:: none ?firstname=limit:1&created=start_date:20th May 2014,gap_by:year Each field can be fed options as ``key:value`` pairs. Multiple ``key:value`` pairs can be supplied and will be separated by the ``view.lookup_sep`` attribute (which defaults to comma). Any ``start_date`` and ``end_date`` parameters will be parsed by the python-dateutil `parser() `_ (which can handle most common date formats). .. note:: - The ``HaystackFacetFilter`` parses query string parameter options, separated with the ``view.lookup_sep`` attribute. Each option is parsed as ``key:value`` pairs where the ``:`` is a hardcoded separator. Setting the ``view.lookup_sep`` attribute to ``":"`` will raise an AttributeError. - The date parsing in the ``HaystackFacetFilter`` does intentionally blow up if fed a string format it can't handle. No exception handling is done, so make sure to convert values to a format you know it can handle before passing it to the filter. Ie., don't let your users feed their own values in here ;) .. warning:: Do *not* use the ``HaystackFacetFilter`` in the regular ``filter_backends`` list on the serializer. It will almost certainly produce errors or weird results. Faceting filters should go in the ``facet_filter_backends`` list. **Example serialized content** The serialized content will look a little different than the default Haystack faceted output. The top level items will *always* be **queries**, **fields** and **dates**, each containing a subset of fields matching the category. In the example below, we have faceted on the fields *firstname* and *lastname*, which will make them appear under the **fields** category. We also have faceted on the date field *created*, which will show up under the **dates** category. Next, each faceted result will have a ``text``, ``count`` and ``narrow_url`` attribute which should be quite self explaining. .. code-block:: json { "queries": {}, "fields": { "firstname": [ { "text": "John", "count": 3, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn" }, { "text": "Randall", "count": 2, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3ARandall" }, { "text": "Nehru", "count": 2, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3ANehru" } ], "lastname": [ { "text": "Porter", "count": 2, "narrow_url": "/api/v1/search/facets/?selected_facets=lastname_exact%3APorter" }, { "text": "Odonnell", "count": 2, "narrow_url": "/api/v1/search/facets/?selected_facets=lastname_exact%3AOdonnell" }, { "text": "Hood", "count": 2, "narrow_url": "/api/v1/search/facets/?selected_facets=lastname_exact%3AHood" } ] }, "dates": { "created": [ { "text": "2015-05-15T00:00:00", "count": 100, "narrow_url": "/api/v1/search/facets/?selected_facets=created_exact%3A2015-05-15+00%3A00%3A00" } ] } } Serializing faceted results --------------------------- When a ``HaystackFacetSerializer`` class determines what fields to serialize, it will check the ``serialize_objects`` class attribute to see if it is ``True`` or ``False``. Setting this value to ``True`` will add an additional ``objects`` field to the serialized results, which will contain the results for the faceted ``SearchQuerySet``. The results will be serialized using the view's ``serializer_class``. **Example faceted results with paginated serialized objects** .. code-block:: json { "fields": { "firstname": [ {"...": "..."} ], "lastname": [ {"...": "..."} ] }, "dates": { "created": [ {"...": "..."} ] }, "queries": {}, "objects": { "count": 3, "next": "http://example.com/api/v1/search/facets/?page=2&selected_facets=firstname_exact%3AJohn", "previous": null, "results": [ { "lastname": "Baker", "firstname": "John", "full_name": "John Baker", "text": "John Baker\n" }, { "lastname": "McClane", "firstname": "John", "full_name": "John McClane", "text": "John McClane\n" } ] } } Setting up the view ------------------- Any view that inherits the ``HaystackViewSet`` will have a special `action route `_ added as ``^/facets/$``. This view action will not care about regular filtering but will by default use the ``HaystackFacetFilter`` to perform filtering. .. note:: In order to avoid confusing the filtering mechanisms in Django Rest Framework, the ``HaystackGenericAPIView`` base class has a couple of new hooks for dealing with faceting, namely: - ``facet_filter_backends`` - A list of filter backends that will be used to apply faceting to the queryset. Defaults to ``HaystackFacetFilter``, which should be sufficient in most cases. - ``facet_serializer_class`` - The ``HaystackFacetSerializer`` subclass instance that will be used for serializing the result. - ``filter_facet_queryset()`` - Works exactly as the normal ``filter_queryset()`` method, but will only filter on backends in the ``facet_filter_backends`` list. - ``get_facet_serializer_class()`` - Returns the ``facet_serializer_class`` class attribute. - ``get_facet_serializer()`` - Instantiates and returns the ``HaystackFacetSerializer`` class returned from ``get_facet_serializer_class()``. In order to set up a view which can respond to regular queries under ie ``^search/$`` and faceted queries under ``^search/facets/$``, we could do something like this. .. code-block:: python class SearchPersonViewSet(HaystackViewSet): index_models = [MockPerson] # This will be used to filter and serialize regular queries as well # as the results if the `facet_serializer_class` has the # `serialize_objects = True` set. serializer_class = SearchSerializer filter_backends = [HaystackHighlightFilter, HaystackAutocompleteFilter] # This will be used to filter and serialize faceted results facet_serializer_class = PersonFacetSerializer # See example above! facet_filter_backends = [HaystackFacetFilter] # This is the default facet filter, and # can be left out. Narrowing --------- As we have seen in the examples above, the ``HaystackFacetSerializer`` will add a ``narrow_url`` attribute to each result it serializes. Follow that link to narrow the search result. The ``narrow_url`` is constructed like this: - Read all query parameters from the request - Get a list of ``selected_facets`` - Update the query parameters by adding the current item to ``selected_facets`` - Return a ``serializers.Hyperlink`` with URL encoded query parameters This means that for each drill-down performed, the original query parameters will be kept in order to make the ``HaystackFacetFilter`` happy. Additionally, all the previous ``selected_facets`` will be kept and applied to narrow the ``SearchQuerySet`` properly. **Example narrowed result** .. code-block:: json { "queries": {}, "fields": { "firstname": [ { "text": "John", "count": 1, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin" } ], "lastname": [ { "text": "McLaughlin", "count": 1, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin" } ] }, "dates": { "created": [ { "text": "2015-05-15T00:00:00", "count": 1, "narrow_url": "/api/v1/search/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcLaughlin&selected_facets=created_exact%3A2015-05-15+00%3A00%3A00" } ] } } .. _permission-classes-label: Permission Classes ================== Django REST Framework allows setting certain ``permission_classes`` in order to control access to views. The generic ``HaystackGenericAPIView`` defaults to ``rest_framework.permissions.AllowAny`` which enforce no restrictions on the views. This can be overridden on a per-view basis as you would normally do in a regular `REST Framework APIView `_. .. note:: Since we have no Django model or queryset, the following permission classes are *not* supported: - ``rest_framework.permissions.DjangoModelPermissions`` - ``rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly`` - ``rest_framework.permissions.DjangoObjectPermissions`` ``POST``, ``PUT``, ``PATCH`` and ``DELETE`` are not supported since Haystack Views are read-only. So if you are using the ``rest_framework.permissions.IsAuthenticatedOrReadOnly`` , this will act just as the ``AllowAny`` permission. **Example overriding permission classes** .. code-block:: python ... from rest_framework.permissions import IsAuthenticated class SearchViewSet(HaystackViewSet): ... permission_classes = [IsAuthenticated] Reusing Model serializers ========================= It may be useful to be able to use existing model serializers to return data from search requests in the same format as used elsewhere in your API. This can be done by modifying the ``to_representation`` method of your serializer to use the ``instance.object`` instead of the search result instance. As a convenience, a mixin class is provided that does just that. .. class:: drf_haystack.serializers.HaystackSerializerMixin An example using the mixin might look like the following: .. code-block:: python class PersonSerializer(serializers.ModelSerializer): class Meta: model = Person fields = ("id", "firstname", "lastname") class PersonSearchSerializer(HaystackSerializerMixin, PersonSerializer): class Meta(PersonSerializer.Meta): search_fields = ("text", ) The results from a search would then contain the fields from the ``PersonSerializer`` rather than fields from the search index. .. note:: If your model serializer specifies a ``fields`` attribute in its Meta class, then the search serializer must specify a ``search_fields`` attribute in its Meta class if you intend to search on any search index fields that are not in the model serializer fields (e.g. 'text') .. warning:: It should be noted that doing this will retrieve the underlying object which means a database hit. Thus, it will not be as performant as only retrieving data from the search index. If performance is a concern, it would be better to recreate the desired data structure and store it in the search index. .. _multiple-search-indexes-label: Multiple Search indexes ======================= So far, we have only used one class in the ``index_classes`` attribute of our serializers. However, you are able to specify a list of them. This can be useful when your search engine has indexed multiple models and you want to provide aggregate results across two or more of them. To use the default multiple index support, simply add multiple indexes the ``index_classes`` list .. code-block:: python class PersonIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) firstname = indexes.CharField(model_attr="first_name") lastname = indexes.CharField(model_attr="last_name") def get_model(self): return Person class PlaceIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) address = indexes.CharField(model_attr="address") def get_model(self): return Place class ThingIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) name = indexes.CharField(model_attr="name") def get_model(self): return Thing class AggregateSerializer(HaystackSerializer): class Meta: index_classes = [PersonIndex, PlaceIndex, ThingIndex] fields = ["firstname", "lastname", "address", "name"] class AggregateSearchViewSet(HaystackViewSet): serializer_class = AggregateSerializer .. note:: The ``AggregateSearchViewSet`` class above omits the optional ``index_models`` attribute. This way results from all the models are returned. The result from searches using multiple indexes is a list of objects, each of which contains only the fields appropriate to the model from which the result came. For instance if a search returned a list containing one each of the above models, it might look like the following: .. code-block:: javascript [ { "text": "John Doe", "firstname": "John", "lastname": "Doe" }, { "text": "123 Doe Street", "address": "123 Doe Street" }, { "text": "Doe", "name": "Doe" } ] Declared fields --------------- You can include field declarations in the serializer class like normal. Depending on how they are named, they will be treated as common fields and added to every result or as specific to results from a particular index. Common fields are declared as you would any serializer field. Index-specific fields must be prefixed with "___". The following example illustrates this usage: .. code-block:: python class AggregateSerializer(HaystackSerializer): extra = serializers.CharField() _ThingIndex__number = serializers.IntegerField() class Meta: index_classes = [PersonIndex, PlaceIndex, ThingIndex] fields = ["firstname", "lastname", "address", "name"] def get_extra(self): return "whatever" def get__ThingIndex__number(self): return 42 The results of a search might then look like the following: .. code-block:: javascript [ { "text": "John Doe", "firstname": "John", "lastname": "Doe", "extra": "whatever" }, { "text": "123 Doe Street", "address": "123 Doe Street", "extra": "whatever" }, { "text": "Doe", "name": "Doe", "extra": "whatever", "number": 42 } ] Multiple Serializers -------------------- Alternatively, you can specify a 'serializers' attribute on your Meta class to use a different serializer class for different indexes as show below: .. code-block:: python class AggregateSearchSerializer(HaystackSerializer): class Meta: serializers = { PersonIndex: PersonSearchSerializer, PlaceIndex: PlaceSearchSerializer, ThingIndex: ThingSearchSerializer } The ``serializers`` attribute is the important thing here, It's a dictionary with ``SearchIndex`` classes as keys and ``Serializer`` classes as values. Each result in the list of results from a search that contained items from multiple indexes would be serialized according to the appropriate serializer. drf-haystack-1.5.6/docs/basic_usage.rst0000644000076500000240000001276412537476535020152 0ustar rolfstaff00000000000000.. _basic-usage-label: =========== Basic Usage =========== Usage is best demonstrated with some simple examples. .. warning:: The code here is for demonstration purposes only! It might work (or not, I haven't tested), but as always, don't blindly copy code from the internet. Examples ======== models.py --------- Let's say we have an app which contains a model `Location`. It could look something like this. .. code-block:: python # # models.py # from django.db import models from haystack.utils.geo import Point class Location(models.Model): latitude = models.FloatField() longitude = models.FloatField() address = models.CharField(max_length=100) city = models.CharField(max_length=30) zip_code = models.CharField(max_length=10) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return self.address @property def coordinates(self): return Point(self.longitude, self.latitude) .. _search-index-example-label: search_indexes.py ----------------- We would have to make a `search_indexes.py` file for haystack to pick it up. .. code-block:: python # # search_indexes.py # from django.utils import timezone from haystack import indexes from .models import Location class LocationIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) address = indexes.CharField(model_attr="address") city = indexes.CharField(model_attr="city") zip_code = indexes.CharField(model_attr="zip_code") autocomplete = indexes.EdgeNgramField() coordinates = indexes.LocationField(model_attr="coordinates") @staticmethod def prepare_autocomplete(obj): return " ".join(( obj.address, obj.city, obj.zip_code )) def get_model(self): return Location def index_queryset(self, using=None): return self.get_model().objects.filter( created__lte=timezone.now() ) views.py -------- For a generic Django REST Framework view, you could do something like this. .. code-block:: python # # views.py # from drf_haystack.serializers import HaystackSerializer from drf_haystack.viewsets import HaystackViewSet from .models import Location from .search_indexes import LocationIndex class LocationSerializer(HaystackSerializer): class Meta: # The `index_classes` attribute is a list of which search indexes # we want to include in the search. index_classes = [LocationIndex] # The `fields` contains all the fields we want to include. # NOTE: Make sure you don't confuse these with model attributes. These # fields belong to the search index! fields = [ "text", "address", "city", "zip_code", "autocomplete" ] class LocationSearchView(HaystackViewSet): # `index_models` is an optional list of which models you would like to include # in the search result. You might have several models indexed, and this provides # a way to filter out those of no interest for this particular view. # (Translates to `SearchQuerySet().models(*index_models)` behind the scenes. index_models = [Location] serializer_class = LocationSerializer urls.py ------- Finally, hook up the views in your `urls.py` file. .. note:: Make sure you specify the `base_name` attribute when wiring up the view in the router. Since we don't have any single `model` for the view, it is impossible for the router to automatically figure out the base name for the view. .. code-block:: python # # urls.py # from django.conf.urls import patterns, url, include from rest_framework import routers from .views import LocationSearchView router = routers.DefaultRouter() router.register("location/search", LocationSearchView, base_name="location-search") urlpatterns = patterns( "", url(r"/api/v1/", include(router.urls)), ) Query time! ----------- Now that we have a view wired up, we can start using it. By default, the `HaystackViewSet` (which, more importantly inherits the `HaystackGenericAPIView` class) is set up to use the `HaystackFilter`. This is the most basic filter included and can do basic search by querying any of the field included in the `fields` attribute on the `Serializer`. .. code-block:: none http://example.com/api/v1/location/search/?city=Oslo Would perform a query looking up all documents where the `city field` equals "Oslo". Regular Search View =================== Sometimes you might not need all the bells and whistles of a `ViewSet`, but can do with a regular view. In such scenario you could do something like this. .. code-block:: python # # views.py # from rest_framework.mixins import ListModelMixin from drf_haystack.generics import HaystackGenericAPIView class SearchView(ListModelMixin, HaystackGenericAPIView): serializer_class = LocationSerializer def get(self, request, *args, **kwargs): return self.list(request, *args, **kwargs) # # urls.py # urlpatterns = ( ... url(r'^search/', SearchView.as_view()), ... ) Next, check out the :ref:`advanced-usage-label`. drf-haystack-1.5.6/docs/conf.py0000644000076500000240000002237512627546164016445 0ustar rolfstaff00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- # # drf-haystack documentation build configuration file, created by # sphinx-quickstart on Thu Nov 27 20:42:57 2014. # # This file is execfile()d with the current directory set to its # containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. from __future__ import absolute_import, unicode_literals import os import sys from datetime import date try: import sphinx_rtd_theme use_sphinx_rtd_theme = True except ImportError: use_sphinx_rtd_theme = False # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. sys.path.insert(0, os.path.abspath('.')) # sys.path.insert(1, os.path.abspath(os.path.pardir)) # os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tests.settings') # # import django # if hasattr(django, "setup"): # django.setup() # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.extlinks', 'sphinx.ext.todo', 'sphinx.ext.viewcode', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = 'drf-haystack' copyright = '%d, Inonit AS' % date.today().year # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = '1.5' # The full version, including alpha/beta/rc tags. release = '1.5.6' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all # documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. #keep_warnings = False # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'sphinx_rtd_theme' if use_sphinx_rtd_theme else 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. html_theme_path = [sphinx_rtd_theme.get_html_theme_path()] if use_sphinx_rtd_theme else [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. #html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, todos are shown in the HTML output. Default is False. todo_include_todos = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'drfhaystackdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). 'papersize': 'a4paper', # The font size ('10pt', '11pt' or '12pt'). 'pointsize': '11pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', 'drf-haystack.tex', 'drf-haystack documentation', 'Inonit', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'drf-haystack', 'drf-haystack documentation', ['Inonit'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'drf-haystack', 'drf-haystack documentation', 'Inonit', 'drf-haystack', 'Haystack for Django REST Framework', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' # If true, do not generate a @detailmenu in the "Top" node's menu. #texinfo_no_detailmenu = False # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = { 'http://docs.python.org/': None, 'django': ('http://django.readthedocs.org/en/latest/', None), 'haystack': ('http://django-haystack.readthedocs.org/en/latest/', None) } # Configurations for extlinks extlinks = { 'drf-issue': ('https://github.com/inonit/drf-haystack/issues/%s', '#'), 'haystack-issue': ('https://github.com/django-haystack/django-haystack/issues/%s', '#') } drf-haystack-1.5.6/docs/index.rst0000644000076500000240000001251612627546164017003 0ustar rolfstaff00000000000000================================== Haystack for Django REST Framework ================================== Contents: .. toctree:: :maxdepth: 2 basic_usage advanced_usage ===== About ===== Small library aiming to simplify using Haystack with Django REST Framework Features ======== Supported Python versions: - Python 2.6 - Django 1.5 and 1.6 - Python 2.7, 3.3, 3.4 and 3.5 - Django 1.5, 1.6, 1.7 and 1.8 Installation ============ It's in the cheese shop! .. code-block:: none $ pip install drf-haystack Requirements ============ - Django - Django REST Framework - Haystack (and a supported search engine such as Solr, Elasticsearch, Whoosh, etc.) - Python bindings for the chosen backend (see below). - (geopy and libgeos if you want to use geo spatial filtering) Python bindings --------------- You will also need to install python bindings for the search engine you'll use. Elasticsearch ............. See haystack `Elasticsearch `_ docs for details .. warning:: ``django-haystack`` does not yet support Elasticsearch v2 yet, so we need to use the latest 1.x branch. .. code-block:: none $ pip install elasticsearch<2.0.0 Solr .... See haystack `Solr `_ docs for details. .. code-block:: none $ pip install pysolr Whoosh ...... See haystack `Whoosh `_ docs for details. .. code-block:: none $ pip install whoosh Xapian ...... See haystack `Xapian `_ docs for details. Contributors ============ This library has mainly been written by `me `_ while working at `Inonit `_. I have also had some help by these amazing people! Thanks guys! - `Jacob Rief `_ - `Jannon Frank `_ - `Michael Fladischer `_ (Debian package) Changelog ========= v1.5.6 ------ *Release date: 2015-12-02* - Fixed a bug where ``ignore_fields`` on ``HaystackSerializer`` did not work unless ``exclude`` evaluates to ``True``. - Removed ``elasticsearch`` from ``install_requires``. Elasticsearch should not be a mandatory requirement, since it's useless if not using Elasticsearch as backend. v1.5.5 ------ *Release date: 2015-10-31* - Added support for Django REST Framework 3.3.0 (Only for Python 2.7/Django 1.7 and above) - Locked elasticsearch < 2.0.0 (See :drf-issue:`29`) v1.5.4 ------ *Release date: 2015-10-08* - Added support for serializing faceted results. Closing :drf-issue:`27`. v1.5.3 ------ *Release date: 2015-10-05* - Added support for :ref:`faceting-label` (Github :drf-issue:`11`). v1.5.2 ------ *Release date: 2015-08-23* - Proper support for :ref:`multiple-search-indexes-label` (Github :drf-issue:`22`). - Experimental support for :ref:`term-boost-label` (This seems to have some issues upstreams, so unfortunately it does not really work as expected). - Support for negate in filters. v1.5.1 ------ *Release date: 2015-07-28* - Support for More Like This results (Github :drf-issue:`10`). - Deprecated ``SQHighlighterMixin`` in favor of ``HaystackHighlightFilter``. - ``HaystackGenericAPIView`` now returns 404 for detail views if more than one entry is found (Github :drf-issue:`19`). v1.5.0 ------ *Release date: 2015-06-29* - Added support for field lookups in queries, such as ``field__contains=foobar``. Check out `Haystack docs `_ for details. - Added default ``permission_classes`` on ``HaystackGenericAPIView`` in order to avoid crash when using global permission classes on REST Framework. See :ref:`permission-classes-label` for details. v1.4 ---- *Release date: 2015-06-15* - Fixed issues for Geo spatial filtering on django-haystack v2.4.x with Elasticsearch. - A serializer class now accepts a list or tuple of ``ignore_field`` to bypass serialization. - Added support for Highlighting. v1.3 ---- *Release date: 2015-05-19* - ``HaystackGenericAPIView().get_object()`` now returns Http404 instead of an empty ``SearchQueryset`` if no object is found. This mimics the behaviour from ``GenericAPIView().get_object()``. - Removed hard dependencies for ``geopy`` and ``libgeos`` (See Github :drf-issue:`5`). This means that if you want to use the ``HaystackGEOSpatialFilter``, you have to install these libraries manually. v1.2 ---- *Release date: 2015-03-23* - Fixed ``MissingDependency`` error when using another search backend than Elasticsearch. - Fixed converting distance to D object before filtering in HaystackGEOSpatialFilter. - Added Python 3 classifier. v1.1 ---- *Release date: 2015-02-16* - Full coverage (almost) test suite - Documentation - Beta release Development classifier v1.0 ---- *Release date: 2015-02-14* - Initial release. Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` drf-haystack-1.5.6/docs/make.bat0000644000076500000240000001507312435677301016543 0ustar rolfstaff00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. xml to make Docutils-native XML files echo. pseudoxml to make pseudoxml-XML files for display purposes echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) %SPHINXBUILD% 2> nul if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\django-pushit.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\django-pushit.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdf" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdfja" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf-ja cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) if "%1" == "xml" ( %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml if errorlevel 1 exit /b 1 echo. echo.Build finished. The XML files are in %BUILDDIR%/xml. goto end ) if "%1" == "pseudoxml" ( %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml if errorlevel 1 exit /b 1 echo. echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. goto end ) :end drf-haystack-1.5.6/docs/Makefile0000644000076500000240000001520612435677301016574 0ustar rolfstaff00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/django-pushit.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/django-pushit.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/django-pushit" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/django-pushit" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." drf-haystack-1.5.6/drf_haystack/0000755000076500000240000000000012627546270016645 5ustar rolfstaff00000000000000drf-haystack-1.5.6/drf_haystack/__init__.py0000644000076500000240000000031412627546164020756 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals __title__ = "drf-haystack" __version__ = "1.5.6" __author__ = "Rolf Haavard Blindheim" __license__ = "MIT License" VERSION = __version__ drf-haystack-1.5.6/drf_haystack/fields.py0000644000076500000240000000514612604427315020464 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.fields import ( BooleanField, CharField, DateField, DateTimeField, DecimalField, FloatField, IntegerField ) class DRFHaystackFieldMixin(object): prefix_field_names = False def __init__(self, **kwargs): self.prefix_field_names = kwargs.pop('prefix_field_names', False) super(DRFHaystackFieldMixin, self).__init__(**kwargs) def bind(self, field_name, parent): """ Initializes the field name and parent for the field instance. Called when a field is added to the parent serializer instance. Taken from DRF and modified to support drf_haystack multiple index functionality. """ # In order to enforce a consistent style, we error if a redundant # 'source' argument has been used. For example: # my_field = serializer.CharField(source='my_field') assert self.source != field_name, ( "It is redundant to specify `source='%s'` on field '%s' in " "serializer '%s', because it is the same as the field name. " "Remove the `source` keyword argument." % (field_name, self.__class__.__name__, parent.__class__.__name__) ) self.field_name = field_name self.parent = parent # `self.label` should default to being based on the field name. if self.label is None: self.label = field_name.replace('_', ' ').capitalize() # self.source should default to being the same as the field name. if self.source is None: self.source = self.convert_field_name(field_name) # self.source_attrs is a list of attributes that need to be looked up # when serializing the instance, or populating the validated data. if self.source == '*': self.source_attrs = [] else: self.source_attrs = self.source.split('.') def convert_field_name(self, field_name): if not self.prefix_field_names: return field_name return field_name.split("__")[-1] class HaystackBooleanField(DRFHaystackFieldMixin, BooleanField): pass class HaystackCharField(DRFHaystackFieldMixin, CharField): pass class HaystackDateField(DRFHaystackFieldMixin, DateField): pass class HaystackDateTimeField(DRFHaystackFieldMixin, DateTimeField): pass class HaystackDecimalField(DRFHaystackFieldMixin, DecimalField): pass class HaystackFloatField(DRFHaystackFieldMixin, FloatField): pass class HaystackIntegerField(DRFHaystackFieldMixin, IntegerField): pass drf-haystack-1.5.6/drf_haystack/filters.py0000644000076500000240000003707112627527535020702 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import operator import warnings from itertools import chain from dateutil import parser from django.conf import settings from django.core.exceptions import ImproperlyConfigured from django.utils import six import haystack from haystack.query import SearchQuerySet from rest_framework.filters import BaseFilterBackend from .utils import merge_dict class HaystackFilter(BaseFilterBackend): """ A filter backend that compiles a haystack compatible filtering query. """ @staticmethod def get_request_filters(request): return request.GET.copy() @staticmethod def build_filter(view, filters=None): """ Creates a single SQ filter from querystring parameters that correspond to the SearchIndex fields that have been "registered" in `view.fields`. Default behavior is to `OR` terms for the same parameters, and `AND` between parameters. Any querystring parameters that are not registered in `view.fields` will be ignored. """ terms = [] exclude_terms = [] if filters is None: filters = {} # pragma: no cover for param, value in filters.items(): # Skip if the parameter is not listed in the serializer's `fields` # or if it's in the `exclude` list. excluding_term = False param_parts = param.split("__") base_param = param_parts[0] # only test against field without lookup negation_keyword = getattr(settings, "DRF_HAYSTACK_NEGATION_KEYWORD", "not") if len(param_parts) > 1 and param_parts[1] == negation_keyword: excluding_term = True param = param.replace("__%s" % negation_keyword, "") # haystack wouldn't understand our negation if view.serializer_class: try: if hasattr(view.serializer_class.Meta, "field_aliases"): old_base = base_param base_param = view.serializer_class.Meta.field_aliases.get(base_param, base_param) param = param.replace(old_base, base_param) # need to replace the alias fields = getattr(view.serializer_class.Meta, "fields", []) exclude = getattr(view.serializer_class.Meta, "exclude", []) search_fields = getattr(view.serializer_class.Meta, "search_fields", []) if ((fields or search_fields) and base_param not in chain(fields, search_fields)) or base_param in exclude or not value: continue except AttributeError: raise ImproperlyConfigured("%s must implement a Meta class." % view.serializer_class.__class__.__name__) tokens = [token.strip() for token in value.split(view.lookup_sep)] field_queries = [] for token in tokens: if token: field_queries.append(view.query_object((param, token))) term = six.moves.reduce(operator.or_, filter(lambda x: x, field_queries)) if excluding_term: exclude_terms.append(term) else: terms.append(term) terms = six.moves.reduce(operator.and_, filter(lambda x: x, terms)) if terms else [] exclude_terms = six.moves.reduce(operator.and_, filter(lambda x: x, exclude_terms)) if exclude_terms else [] return terms, exclude_terms def filter_queryset(self, request, queryset, view): applicable_filters, applicable_exclusions = self.build_filter(view, filters=self.get_request_filters(request)) if applicable_filters: queryset = queryset.filter(applicable_filters) if applicable_exclusions: queryset = queryset.exclude(applicable_exclusions) return queryset class HaystackAutocompleteFilter(HaystackFilter): """ A filter backend to perform autocomplete search. Must be run against fields that are either `NgramField` or `EdgeNgramField`. """ def filter_queryset(self, request, queryset, view): """ Applying `applicable_filters` to the queryset by creating a single SQ filter using `AND`. """ applicable_filters, applicable_exclusions = self.build_filter(view, filters=self.get_request_filters(request)) if applicable_filters: queryset = queryset.filter(self._construct_query(applicable_filters, queryset, view)) if applicable_exclusions: queryset = queryset.exclude(self._construct_query(applicable_exclusions, queryset, view)) return queryset def _construct_query(self, terms, queryset, view): query_bits = [] for field_name, query in terms.children: for word in query.split(" "): bit = queryset.query.clean(word.strip()) kwargs = { field_name: bit } query_bits.append(view.query_object(**kwargs)) return six.moves.reduce(operator.and_, filter(lambda x: x, query_bits)) class BaseHaystackGEOSpatialFilter(HaystackFilter): """ A base filter backend for doing geospatial filtering. If using this filter make sure to provide a `point_field` with the name of your the `LocationField` of your index. We'll always do the somewhat slower but more accurate `dwithin` (radius) filter. """ point_field = None def __init__(self, *args, **kwargs): if not self.point_field: raise ImproperlyConfigured("You should provide `point_field` in " "your subclassed geo-spatial filter " "class.") try: from haystack.utils.geo import D, Point self.D = D self.Point = Point except ImportError as e: # pragma: no cover warnings.warn("Make sure you've installed the `libgeos` library.\n " "(`apt-get install libgeos` on linux, or `brew install geos` on OS X.)") raise e def unit_to_meters(self, distance_obj): """ Emergency fix for https://github.com/toastdriven/django-haystack/issues/957 According to Elasticsearch documentation, units are always measured in meters unless explicitly declared otherwise. It seems that the unit description is lost somewhere, so everything ends up in the query without any unit values, thus the value is calculated in meters. """ return self.D(m=distance_obj.m * 1000) # pragma: no cover def geo_filter(self, queryset, filters=None): """ Filter the queryset by looking up parameters from the query parameters. Expected query parameters are: - a `unit=value` parameter where the unit is a valid UNIT in the `django.contrib.gis.measure.Distance` class. - `from` which must be a comma separated longitude and latitude. Example query: /api/v1/search/?km=10&from=59.744076,10.152045 Will perform a `dwithin` query within 10 km from the point with latitude 59.744076 and longitude 10.152045. """ filters = dict((k, filters[k]) for k in chain(self.D.UNITS.keys(), ["from"]) if k in filters) distance = dict((k, v) for k, v in filters.items() if k in self.D.UNITS.keys()) if "from" in filters and len(filters["from"].split(",")) == 2: try: latitude, longitude = map(float, filters["from"].split(",")) point = self.Point(longitude, latitude, srid=getattr(settings, "GEO_SRID", 4326)) if point and distance: major, minor, _ = haystack.__version__ if queryset.query.backend.__class__.__name__ == "ElasticsearchSearchBackend" \ and (major == 2 and minor < 4): distance = self.unit_to_meters(self.D(**distance)) # pragma: no cover else: distance = self.D(**distance) queryset = queryset.dwithin(self.point_field, point, distance).distance(self.point_field, point) except ValueError: raise ValueError("Cannot convert `from=latitude,longitude` query parameter to " "float values. Make sure to provide numerical values only!") return queryset def filter_queryset(self, request, queryset, view): queryset = self.geo_filter(queryset, filters=self.get_request_filters(request)) return super(BaseHaystackGEOSpatialFilter, self).filter_queryset(request, queryset, view) class HaystackGEOSpatialFilter(BaseHaystackGEOSpatialFilter): """ If using this filter make sure your index has a `LocationField` named `coordinates`. """ point_field = 'coordinates' class HaystackHighlightFilter(HaystackFilter): """ A filter backend which adds support for ``highlighting`` on the SearchQuerySet level (the fast one). Note that you need to use a search backend which supports highlighting in order to use this. This will add a ``hightlighted`` entry to your response, encapsulating the highlighted words in an `highlighted results` block. """ def filter_queryset(self, request, queryset, view): queryset = super(HaystackHighlightFilter, self).filter_queryset(request, queryset, view) if request.GET and isinstance(queryset, SearchQuerySet): queryset = queryset.highlight() return queryset class HaystackBoostFilter(HaystackFilter): """ Filter backend for applying term boost on query time. Apply by adding a comma separated ``boost`` query parameter containing a the term you want to boost and a floating point or integer for the boost value. The boost value is based around ``1.0`` as 100% - no boost. Gives a slight increase in relevance for documents that include "banana": /api/v1/search/?boost=banana,1.1 The boost is applied *after* regular filtering has occurred. """ @staticmethod def apply_boost(queryset, filters): if "boost" in filters and len(filters["boost"].split(",")) == 2: term, boost = iter(filters["boost"].split(",")) try: queryset = queryset.boost(term, float(boost)) except ValueError: raise ValueError("Cannot convert boost to float value. Make sure to provide a " "numerical boost value.") return queryset def filter_queryset(self, request, queryset, view): queryset = super(HaystackBoostFilter, self).filter_queryset(request, queryset, view) return self.apply_boost(queryset, filters=self.get_request_filters(request)) class HaystackFacetFilter(HaystackFilter): """ Filter backend for faceting search results. This backend does not apply regular filtering. Faceting field options can be set by using the ``field_options`` attribute on the serializer, and can be overridden by query parameters. Dates will be parsed by the ``python-dateutil.parser()`` which can handle most date formats. Query parameters is parsed in the following format: ?field1=option1:value1,option2:value2&field2=option1:value1,option2:value2 where each options ``key:value`` pair is separated by the ``view.lookup_sep`` attribute. """ # TODO: Support multiple indexes/serializers @staticmethod def parse(lookup_sep, options): """ Parse the field options query string and return it as a dictionary. """ defaults = {} if isinstance(options, six.text_type): tokens = [token.strip() for token in options.split(lookup_sep)] for token in tokens: if not len(token.split(":")) == 2: warnings.warn("The %s token is not properly formatted. Tokens need to be " "formatted as 'token:value' pairs." % token) continue param, value = token.split(":") if any([k == param for k in ("start_date", "end_date", "gap_amount")]): if param in ("start_date", "end_date"): value = parser.parse(value) if param == "gap_amount": value = int(value) defaults[param] = value return defaults def build_facet_filter(self, view, filters=None): """ Creates a dict of dictionaries suitable for passing to the SearchQuerySet ``facet``, ``date_facet`` or ``query_facet`` method. """ field_facets = {} date_facets = {} query_facets = {} facet_serializer_cls = view.get_facet_serializer_class() if filters is None: filters = {} # pragma: no cover if view.lookup_sep == ":": raise AttributeError("The %(cls)s.lookup_sep attribute conflicts with the HaystackFacetFilter " "query parameter parser. Please choose another `lookup_sep` attribute " "for %(cls)s." % {"cls": view.__class__.__name__}) try: fields = getattr(facet_serializer_cls.Meta, "fields", []) exclude = getattr(facet_serializer_cls.Meta, "exclude", []) field_options = getattr(facet_serializer_cls.Meta, "field_options", {}) for field, options in filters.items(): if field not in fields or field in exclude: continue field_options = merge_dict(field_options, {field: self.parse(view.lookup_sep, options)}) valid_gap = ("year", "month", "day", "hour", "minute", "second") for field, options in field_options.items(): if any([k in options for k in ("start_date", "end_date", "gap_by", "gap_amount")]): if not all(("start_date", "end_date", "gap_by" in options)): raise ValueError("Date faceting requires at least 'start_date', 'end_date' " "and 'gap_by' to be set.") if not options["gap_by"] in valid_gap: raise ValueError("The 'gap_by' parameter must be one of %s." % ", ".join(valid_gap)) options.setdefault("gap_amount", 1) date_facets[field] = field_options[field] else: field_facets[field] = field_options[field] except AttributeError: raise ImproperlyConfigured("%s must implement a Meta class." % facet_serializer_cls.__class__.__name__) return { "date_facets": date_facets, "field_facets": field_facets, "query_facets": query_facets } @staticmethod def apply_facets(queryset, filters): """ Apply faceting to the queryset """ for field, options in filters["field_facets"].items(): queryset = queryset.facet(field, **options) for field, options in filters["date_facets"].items(): queryset = queryset.date_facet(field, **options) # TODO: Implement support for query faceting # for field, options in filters["query_facets"].items(): # continue return queryset def filter_queryset(self, request, queryset, view): return self.apply_facets(queryset, filters=self.build_facet_filter(view, self.get_request_filters(request))) drf-haystack-1.5.6/drf_haystack/generics.py0000644000076500000240000001310312605427351021006 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import warnings from django.http import Http404 from haystack.backends import SQ from haystack.query import SearchQuerySet from rest_framework.generics import GenericAPIView from rest_framework.permissions import AllowAny from .filters import HaystackFilter, HaystackFacetFilter class HaystackGenericAPIView(GenericAPIView): """ Base class for all haystack generic views. """ # Use `index_models` to filter on which search index models we # should include in the search result. index_models = [] object_class = SearchQuerySet query_object = SQ # Override document_uid_field with whatever field in your index # you use to uniquely identify a single document. This value will be # used wherever the view references the `lookup_field` kwarg. document_uid_field = "id" lookup_sep = "," # If set to False, DB lookups are done on a per-object basis, # resulting in in many individual trips to the database. If True, # the SearchQuerySet will group similar objects into a single query. load_all = False filter_backends = [HaystackFilter] permission_classes = [AllowAny] facet_filter_backends = [HaystackFacetFilter] facet_serializer_class = None def get_queryset(self): """ Get the list of items for this view. Returns ``self.queryset`` if defined and is a ``self.object_class`` instance. """ if self.queryset and isinstance(self.queryset, self.object_class): queryset = self.queryset.all() else: queryset = self.object_class()._clone() if len(self.index_models): queryset = queryset.models(*self.index_models) return queryset def get_object(self): """ Fetch a single document from the data store according to whatever unique identifier is available for that document in the SearchIndex. """ queryset = self.filter_queryset(self.get_queryset()) lookup_url_kwarg = self.lookup_url_kwarg or self.lookup_field if lookup_url_kwarg not in self.kwargs: raise AttributeError( "Expected view %s to be called with a URL keyword argument " "named '%s'. Fix your URL conf, or set the `.lookup_field` " "attribute on the view correctly." % (self.__class__.__name__, lookup_url_kwarg) ) queryset = queryset.filter(self.query_object((self.document_uid_field, self.kwargs[lookup_url_kwarg]))) if queryset and len(queryset) == 1: return queryset[0] elif queryset and len(queryset) > 1: raise Http404("Multiple results matches the given query. Expected a single result.") raise Http404("No result matches the given query.") def filter_queryset(self, queryset): queryset = super(HaystackGenericAPIView, self).filter_queryset(queryset) if self.load_all: queryset = queryset.load_all() return queryset def filter_facet_queryset(self, queryset): """ Given a search queryset, filter it with whichever facet filter backends in use. """ for backend in list(self.facet_filter_backends): queryset = backend().filter_queryset(self.request, queryset, self) if self.load_all: queryset = queryset.load_all() return queryset def get_facet_serializer(self, *args, **kwargs): """ Return the facet serializer instance that should be used for serializing faceted output. """ assert "objects" in kwargs, "`objects` is a required argument to `get_facet_serializer()`" facet_serializer_class = self.get_facet_serializer_class() kwargs["context"] = self.get_serializer_context() kwargs["context"].update({ "objects": kwargs.pop("objects") }) return facet_serializer_class(*args, **kwargs) def get_facet_serializer_class(self): """ Return the class to use for serializing facets. Defaults to using ``self.facet_serializer_class``. """ if self.facet_serializer_class is None: raise AttributeError( "%(cls)s should either include a `facet_serializer_class` attribute, " "or override %(cls)s.get_facet_serializer_class() method." % {"cls": self.__class__.__name__} ) return self.facet_serializer_class class SQHighlighterMixin(object): """ DEPRECATED! Remove in v1.6.0. Please use the HaystackHighlightFilter instead. This mixin adds support for highlighting on the SearchQuerySet level (the fast one). Note that you need to use a backend which supports hightlighting in order to use this. This will add a `hightlighted` entry to your response, encapsulating the highlighted words in an `highlighted results` block. """ def filter_queryset(self, queryset): warnings.warn( "The SQHighlighterMixin is marked for deprecation, and has been re-written " "as a filter backend. Please remove SQHighlighterMixin from the " "%(cls)s, and add HaystackHighlightFilter to %(cls)s.filter_backends." % {"cls": self.__class__.__name__}, DeprecationWarning ) queryset = super(SQHighlighterMixin, self).filter_queryset(queryset) if self.request.GET and isinstance(queryset, SearchQuerySet): queryset = queryset.highlight() return queryset drf-haystack-1.5.6/drf_haystack/serializers.py0000644000076500000240000004164312627527535021566 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import copy import warnings from itertools import chain from datetime import datetime try: from collections import OrderedDict except ImportError: from django.utils.datastructures import SortedDict as OrderedDict from django.core.exceptions import ImproperlyConfigured from django.utils import six from haystack import fields as haystack_fields from haystack.query import EmptySearchQuerySet from haystack.utils import Highlighter from rest_framework import serializers from rest_framework.fields import empty from rest_framework.utils.field_mapping import ClassLookupDict, get_field_kwargs from .fields import ( HaystackBooleanField, HaystackCharField, HaystackDateField, HaystackDateTimeField, HaystackDecimalField, HaystackFloatField, HaystackIntegerField ) class HaystackSerializer(serializers.Serializer): """ A `HaystackSerializer` which populates fields based on which models that are available in the SearchQueryset. """ _field_mapping = ClassLookupDict({ haystack_fields.BooleanField: HaystackBooleanField, haystack_fields.CharField: HaystackCharField, haystack_fields.DateField: HaystackDateField, haystack_fields.DateTimeField: HaystackDateTimeField, haystack_fields.DecimalField: HaystackDecimalField, haystack_fields.EdgeNgramField: HaystackCharField, haystack_fields.FacetBooleanField: HaystackBooleanField, haystack_fields.FacetCharField: HaystackCharField, haystack_fields.FacetDateField: HaystackDateField, haystack_fields.FacetDateTimeField: HaystackDateTimeField, haystack_fields.FacetDecimalField: HaystackDecimalField, haystack_fields.FacetFloatField: HaystackFloatField, haystack_fields.FacetIntegerField: HaystackIntegerField, haystack_fields.FacetMultiValueField: HaystackCharField, haystack_fields.FloatField: HaystackFloatField, haystack_fields.IntegerField: HaystackIntegerField, haystack_fields.LocationField: HaystackCharField, haystack_fields.MultiValueField: HaystackCharField, haystack_fields.NgramField: HaystackCharField, }) def __init__(self, instance=None, data=empty, **kwargs): super(HaystackSerializer, self).__init__(instance, data, **kwargs) try: if not hasattr(self.Meta, "index_classes") and not hasattr(self.Meta, "serializers"): raise ImproperlyConfigured("You must set either the 'index_classes' or 'serializers' " "attribute on the serializer Meta class.") except AttributeError: raise ImproperlyConfigured("%s must implement a Meta class." % self.__class__.__name__) if not self.instance: self.instance = EmptySearchQuerySet() @staticmethod def _get_default_field_kwargs(model, field): """ Get the required attributes from the model field in order to instantiate a REST Framework serializer field. """ kwargs = {} if field.model_attr in model._meta.get_all_field_names(): model_field = model._meta.get_field_by_name(field.model_attr)[0] kwargs = get_field_kwargs(field.model_attr, model_field) # Remove stuff we don't care about! delete_attrs = [ "allow_blank", "choices", "model_field", ] for attr in delete_attrs: if attr in kwargs: del kwargs[attr] return kwargs def _get_index_field(self, field_name): """ Returns the correct index field. """ return field_name def _get_index_class_name(self, index_cls): """ Converts in index model class to a name suitable for use as a field name prefix. A user may optionally specify custom aliases via an 'index_aliases' attribute on the Meta class """ cls_name = index_cls.__name__ aliases = getattr(self.Meta, "index_aliases", {}) return aliases.get(cls_name, cls_name.split('.')[-1]) def get_fields(self): """ Get the required fields for serializing the result. """ fields = getattr(self.Meta, "fields", []) exclude = getattr(self.Meta, "exclude", []) if fields and exclude: raise ImproperlyConfigured("Cannot set both `fields` and `exclude`.") ignore_fields = getattr(self.Meta, "ignore_fields", []) indices = getattr(self.Meta, "index_classes") declared_fields = copy.deepcopy(self._declared_fields) prefix_field_names = len(indices) > 1 field_mapping = OrderedDict() # overlapping fields on multiple indices is supported by internally prefixing the field # names with the index class to which they belong or, optionally, a user-provided alias # for the index. for index_cls in self.Meta.index_classes: prefix = "" if prefix_field_names: prefix = "_%s__" % self._get_index_class_name(index_cls) for field_name, field_type in six.iteritems(index_cls.fields): orig_name = field_name field_name = "%s%s" % (prefix, field_name) # Don't use this field if it is in `ignore_fields` if orig_name in ignore_fields or field_name in ignore_fields: continue # When fields to include are decided by `exclude` if exclude: if orig_name in exclude or field_name in exclude: continue # When fields to include are decided by `fields` if fields: if orig_name not in fields and field_name not in fields: continue # Look up the field attributes on the current index model, # in order to correctly instantiate the serializer field. model = index_cls().get_model() kwargs = self._get_default_field_kwargs(model, field_type) kwargs['prefix_field_names'] = prefix_field_names field_mapping[field_name] = self._field_mapping[field_type](**kwargs) # Add any explicitly declared fields. They *will* override any index fields # in case of naming collision!. if declared_fields: for field_name in declared_fields: if field_name in field_mapping: warnings.warn("Field '{field}' already exists in the field list. This *will* " "overwrite existing field '{field}'".format(field=field_name)) field_mapping[field_name] = declared_fields[field_name] return field_mapping def to_representation(self, instance): """ If we have a serializer mapping, use that. Otherwise, use standard serializer behavior Since we might be dealing with multiple indexes, some fields might not be valid for all results. Do not render the fields which don't belong to the search result. """ if getattr(self.Meta, "serializers", None): ret = self.multi_serializer_representation(instance) else: ret = super(HaystackSerializer, self).to_representation(instance) prefix_field_names = len(getattr(self.Meta, "index_classes")) > 1 current_index = self._get_index_class_name(type(instance.searchindex)) for field in self.fields.keys(): orig_field = field if prefix_field_names: parts = field.split("__") if len(parts) > 1: index = parts[0][1:] # trim the preceding '_' field = parts[1] if index == current_index: ret[field] = ret[orig_field] del ret[orig_field] elif field not in chain(instance.searchindex.fields.keys(), self._declared_fields.keys()): del ret[orig_field] # include the highlighted field in either case if getattr(instance, "highlighted", None): ret["highlighted"] = instance.highlighted[0] return ret def multi_serializer_representation(self, instance): serializers = self.Meta.serializers index = instance.searchindex serializer_class = serializers.get(type(index), None) if not serializer_class: raise ImproperlyConfigured("Could not find serializer for %s in mapping" % index) return serializer_class(context=self._context).to_representation(instance) class HaystackFacetSerializer(serializers.Serializer): """ The ``HaystackFacetSerializer`` is used to serialize the ``facet_counts()`` dictionary results on a ``SearchQuerySet`` instance. """ # Setting the ``serialize_objects`` to True, # will serialize the faceted queryset using the ``view.serializer_class``. serialize_objects = False def __init__(self, *args, **kwargs): super(HaystackFacetSerializer, self).__init__(*args, **kwargs) class FacetDictField(serializers.DictField): """ A special DictField which passes the key attribute down to the children's ``to_representation()`` in order to let the serializer know what field they're currently processing. """ def to_representation(self, value): return dict([ (six.text_type(key), self.child.to_representation(key, val)) for key, val in value.items() ]) class FacetListField(serializers.ListField): """ The ``FacetListField`` just pass along the key derived from ``FacetDictField``. """ def to_representation(self, key, data): return [self.child.to_representation(key, item) for item in data] class FacetFieldSerializer(serializers.Serializer): """ Responsible for serializing each faceted result. """ text = serializers.SerializerMethodField() count = serializers.SerializerMethodField() narrow_url = serializers.SerializerMethodField() def __init__(self, *args, **kwargs): self._parent_field = None super(FacetFieldSerializer, self).__init__(*args, **kwargs) @property def parent_field(self): return self._parent_field @parent_field.setter def parent_field(self, value): self._parent_field = value def get_text(self, instance): """ Haystack facets are returned as a two-tuple (value, count). The text field should contain the faceted value. """ instance = instance[0] if isinstance(instance, (six.text_type, six.string_types)): return serializers.CharField(read_only=True).to_representation(instance) elif isinstance(instance, datetime): return serializers.DateTimeField(read_only=True).to_representation(instance) return instance def get_count(self, instance): """ Haystack facets are returned as a two-tuple (value, count). The count field should contain the faceted count. """ instance = instance[1] return serializers.IntegerField(read_only=True).to_representation(instance) def get_narrow_url(self, instance): """ Return a link suitable for narrowing on the current item. Since we don't have any means of getting the ``view name`` from here, we can only return relative paths. """ text = instance[0] query_params = self.context["request"].GET.copy() # Never keep the page query parameter in narrowing urls. # It will raise a NotFound exception when trying to paginate # a narrowed queryset. page_query_param = self.context["view"].paginator.page_query_param if page_query_param in query_params: del query_params[page_query_param] selected_facets = set(query_params.pop("selected_facets", [])) selected_facets.add("%(field)s_exact:%(text)s" % {"field": self.parent_field, "text": text}) query_params.setlist("selected_facets", sorted(selected_facets)) return serializers.Hyperlink("%(path)s?%(query)s" % { "path": self.context["request"].path_info, "query": query_params.urlencode() }, name="narrow-url") def to_representation(self, field, instance): """ Set the ``parent_field`` property equal to the current field on the serializer class, so that each field can query it to see what kind of attribute they are processing. """ self.parent_field = field return super(FacetFieldSerializer, self).to_representation(instance) self.FacetDictField = FacetDictField self.FacetListField = FacetListField self.FacetFieldSerializer = FacetFieldSerializer def get_fields(self): """ This returns a dictionary containing the top most fields, ``dates``, ``fields`` and ``queries``. """ field_mapping = OrderedDict() for field, data in self.instance.items(): field_mapping.update( {field: self.FacetDictField( child=self.FacetListField(child=self.FacetFieldSerializer(data)), required=False)} ) if self.serialize_objects is True: field_mapping["objects"] = serializers.SerializerMethodField() return field_mapping def get_objects(self, instance): """ Return a list of objects matching the faceted result. """ view = self.context["view"] queryset = self.context["objects"] page = getattr(view.paginator, "page", None) if page is not None: serializer = view.get_serializer(page, many=True) return OrderedDict([ ("count", view.paginator.page.paginator.count), ("next", view.paginator.get_next_link()), ("previous", view.paginator.get_previous_link()), ("results", serializer.data) ]) serializer = view.get_serializer(queryset, many=True) return serializer.data class HaystackSerializerMixin(object): """ This mixin can be added to a serializer to use the actual object as the data source for serialization rather than the data stored in the search index fields. This makes it easy to return data from search results in the same format as elsewhere in your API and reuse your existing serializers """ def to_representation(self, instance): obj = instance.object return super(HaystackSerializerMixin, self).to_representation(obj) class HighlighterMixin(object): """ This mixin adds support for ``highlighting`` (the pure python, portable version, not SearchQuerySet().highlight()). See Haystack docs for more info). """ highlighter_class = Highlighter highlighter_css_class = "highlighted" highlighter_html_tag = "span" highlighter_max_length = 200 highlighter_field = None def get_highlighter(self): if not self.highlighter_class: raise ImproperlyConfigured( "%(cls)s is missing a highlighter_class. Define %(cls)s.highlighter_class, " "or override %(cls)s.get_highlighter()." % {"cls": self.__class__.__name__} ) return self.highlighter_class @staticmethod def get_document_field(instance): """ Returns which field the search index has marked as it's `document=True` field. """ for name, field in instance.searchindex.fields.items(): if field.document is True: return name def to_representation(self, instance): ret = super(HighlighterMixin, self).to_representation(instance) terms = " ".join(six.itervalues(self.context["request"].GET)) if terms: highlighter = self.get_highlighter()(terms, **{ "html_tag": self.highlighter_html_tag, "css_class": self.highlighter_css_class, "max_length": self.highlighter_max_length }) document_field = self.get_document_field(instance) if highlighter and document_field: ret["highlighted"] = highlighter.highlight(getattr(instance, self.highlighter_field or document_field)) return ret drf-haystack-1.5.6/drf_haystack/utils.py0000644000076500000240000000147212604553472020360 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from copy import deepcopy from django.utils import six def merge_dict(a, b): """ Recursively merges and returns dict a with dict b. Any list values will be combined and returned sorted. :param a: dictionary object :param b: dictionary object :return: merged dictionary object """ if not isinstance(b, dict): return b result = deepcopy(a) for key, val in six.iteritems(b): if key in result and isinstance(result[key], dict): result[key] = merge_dict(result[key], val) elif key in result and isinstance(result[key], list): result[key] = sorted(list(set(val) | set(result[key]))) else: result[key] = deepcopy(val) return result drf-haystack-1.5.6/drf_haystack/viewsets.py0000644000076500000240000000463612605427351021073 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.decorators import detail_route, list_route from rest_framework.mixins import ListModelMixin, RetrieveModelMixin from rest_framework.response import Response from rest_framework.viewsets import ViewSetMixin from .generics import HaystackGenericAPIView class HaystackViewSet(RetrieveModelMixin, ListModelMixin, ViewSetMixin, HaystackGenericAPIView): """ The HaystackViewSet class provides the default ``list()`` and ``retrieve()`` actions with a haystack index as it's data source. Additionally it sets up a detail route for ``more-like-this`` results. """ @detail_route(methods=["get"], url_path="more-like-this") def more_like_this(self, request, pk=None): """ Sets up a detail route for ``more-like-this`` results. Note that you'll need backend support in order to take advantage of this. This will add ie. ^search/{pk}/more-like-this/$ to your existing ^search pattern. """ queryset = self.filter_queryset(self.get_queryset()) mlt_queryset = queryset.more_like_this(self.get_object().object) page = self.paginate_queryset(mlt_queryset) if page is not None: serializer = self.get_serializer(page, many=True) return self.get_paginated_response(serializer.data) serializer = self.get_serializer(mlt_queryset, many=True) return Response(serializer.data) @list_route(methods=["get"], url_path="facets") def facets(self, request): """ Sets up a list route for ``faceted`` results. This will add ie ^search/facets/$ to your existing ^search pattern. """ queryset = self.filter_facet_queryset(self.get_queryset()) for facet in request.GET.getlist("selected_facets"): if ":" not in facet: continue field, value = facet.split(":", 1) if value: queryset = queryset.narrow('%s:"%s"' % (field, queryset.query.clean(value))) page = self.paginate_queryset(queryset) if page is not None: serializer = self.get_facet_serializer(queryset.facet_counts(), objects=page, many=False) return Response(serializer.data) serializer = self.get_facet_serializer(queryset.facet_counts(), objects=queryset, many=False) return Response(serializer.data) drf-haystack-1.5.6/drf_haystack.egg-info/0000755000076500000240000000000012627546270020337 5ustar rolfstaff00000000000000drf-haystack-1.5.6/drf_haystack.egg-info/dependency_links.txt0000644000076500000240000000000112627546266024412 0ustar rolfstaff00000000000000 drf-haystack-1.5.6/drf_haystack.egg-info/not-zip-safe0000644000076500000240000000000112615117766022566 0ustar rolfstaff00000000000000 drf-haystack-1.5.6/drf_haystack.egg-info/PKG-INFO0000644000076500000240000000163112627546266021442 0ustar rolfstaff00000000000000Metadata-Version: 1.1 Name: drf-haystack Version: 1.5.6 Summary: Makes Haystack play nice with Django REST Framework Home-page: https://github.com/inonit/drf-haystack Author: Rolf Håvard Blindheim, Eirik Krogstad Author-email: rolf.blindheim@inonit.no, eirik.krogstad@inonit.no License: MIT License Download-URL: https://github.com/inonit/drf-haystack.git Description: Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack. Platform: UNKNOWN Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Libraries :: Python Modules drf-haystack-1.5.6/drf_haystack.egg-info/requires.txt0000644000076500000240000000012012627546266022735 0ustar rolfstaff00000000000000Django>=1.5.0 djangorestframework>=2.4.4 django-haystack>=2.3.1 python-dateutil drf-haystack-1.5.6/drf_haystack.egg-info/SOURCES.txt0000644000076500000240000000264612627546270022233 0ustar rolfstaff00000000000000LICENSE.txt MANIFEST.in README.rst requirements.txt setup.py tox.ini docs/Makefile docs/advanced_usage.rst docs/basic_usage.rst docs/conf.py docs/index.rst docs/make.bat drf_haystack/__init__.py drf_haystack/fields.py drf_haystack/filters.py drf_haystack/generics.py drf_haystack/serializers.py drf_haystack/utils.py drf_haystack/viewsets.py drf_haystack.egg-info/PKG-INFO drf_haystack.egg-info/SOURCES.txt drf_haystack.egg-info/dependency_links.txt drf_haystack.egg-info/not-zip-safe drf_haystack.egg-info/requires.txt drf_haystack.egg-info/top_level.txt tests/__init__.py tests/constants.py tests/mixins.py tests/runtests.py tests/settings.py tests/test_filters.py tests/test_serializers.py tests/test_utils.py tests/test_viewsets.py tests/urls.py tests/wsgi.py tests/mockapp/__init__.py tests/mockapp/admin.py tests/mockapp/models.py tests/mockapp/search_indexes.py tests/mockapp/serializers.py tests/mockapp/views.py tests/mockapp/fixtures/mocklocation.json tests/mockapp/fixtures/mockperson.json tests/mockapp/fixtures/mockpet.json tests/mockapp/migrations/0001_initial.py tests/mockapp/migrations/0002_mockperson.py tests/mockapp/migrations/0003_mockpet.py tests/mockapp/migrations/0004_load_fixtures.py tests/mockapp/migrations/__init__.py tests/mockapp/templates/search/indexes/mockapp/mocklocation_text.txt tests/mockapp/templates/search/indexes/mockapp/mockperson_text.txt tests/mockapp/templates/search/indexes/mockapp/mockpet_text.txtdrf-haystack-1.5.6/drf_haystack.egg-info/top_level.txt0000644000076500000240000000001512627546266023072 0ustar rolfstaff00000000000000drf_haystack drf-haystack-1.5.6/LICENSE.txt0000644000076500000240000000206412601231056016011 0ustar rolfstaff00000000000000The MIT License (MIT) Copyright (c) 2015 Inonit AS Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. drf-haystack-1.5.6/MANIFEST.in0000644000076500000240000000025312601231056015722 0ustar rolfstaff00000000000000include README.rst include LICENSE.txt include requirements.txt include tox.ini recursive-include docs Makefile *.rst *.py *.bat recursive-include tests *.py *.json *.txt drf-haystack-1.5.6/PKG-INFO0000644000076500000240000000163112627546270015301 0ustar rolfstaff00000000000000Metadata-Version: 1.1 Name: drf-haystack Version: 1.5.6 Summary: Makes Haystack play nice with Django REST Framework Home-page: https://github.com/inonit/drf-haystack Author: Rolf Håvard Blindheim, Eirik Krogstad Author-email: rolf.blindheim@inonit.no, eirik.krogstad@inonit.no License: MIT License Download-URL: https://github.com/inonit/drf-haystack.git Description: Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack. Platform: UNKNOWN Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Libraries :: Python Modules drf-haystack-1.5.6/README.rst0000644000076500000240000000256612627527535015706 0ustar rolfstaff00000000000000Haystack for Django REST Framework ================================== Build status ------------ .. image:: https://travis-ci.org/inonit/drf-haystack.svg?branch=master :target: https://travis-ci.org/inonit/drf-haystack .. image:: https://readthedocs.org/projects/drf-haystack/badge/?version=latest :target: https://readthedocs.org/projects/drf-haystack/?badge=latest :alt: Documentation Status .. image:: https://pypip.in/d/drf-haystack/badge.png :target: https://pypi.python.org/pypi/drf-haystack About ----- Small library which tries to simplify integration of Haystack with Django REST Framework. Contains a Generic ViewSet, a Serializer and a couple of Filters in order to make search as painless as possible. Fresh `documentation available `_ on Read the docs! Supported Python and Django versions ------------------------------------ Tested with the following configurations: - Python 2.6 - Django 1.5 and 1.6 - Python 2.7, 3.3 and 3.4 - Django 1.5, 1.6, 1.7 and 1.8 Installation ------------ $ pip install drf-haystack Supported features ------------------ We aim to support most features Haystack does (or at least those which can be used in a REST API). Currently we support: * Autocomplete * GEO Spatial searching * Highlighting * More Like This * Faceting drf-haystack-1.5.6/requirements.txt0000644000076500000240000000022112627546164017464 0ustar rolfstaff00000000000000Django Sphinx coverage django-haystack djangorestframework elasticsearch<2.0.0 nose sphinx-rtd-theme urllib3 geopy tox unittest2 python-dateutil drf-haystack-1.5.6/setup.cfg0000644000076500000240000000007312627546270016024 0ustar rolfstaff00000000000000[egg_info] tag_build = tag_svn_revision = 0 tag_date = 0 drf-haystack-1.5.6/setup.py0000644000076500000240000000345512627546164015726 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- import re import os try: from setuptools import setup except ImportError: from ez_setup import use_setuptools use_setuptools() from setuptools import setup def get_version(package): """ Return package version as listed in `__version__` in `init.py`. """ init_py = open(os.path.join(package, "__init__.py")).read() return re.search("__version__ = ['\"]([^'\"]+)['\"]", init_py).group(1) setup( name="drf-haystack", version=get_version("drf_haystack"), description="Makes Haystack play nice with Django REST Framework", long_description="Implements a ViewSet, FiltersBackends and Serializers in order to play nice with Haystack.", author="Rolf Håvard Blindheim, Eirik Krogstad", author_email="rolf.blindheim@inonit.no, eirik.krogstad@inonit.no", url="https://github.com/inonit/drf-haystack", download_url="https://github.com/inonit/drf-haystack.git", license="MIT License", packages=[ "drf_haystack", ], include_package_data=True, install_requires=[ "Django>=1.5.0", "djangorestframework>=2.4.4", "django-haystack>=2.3.1", "python-dateutil" ], tests_require=[ "nose", "coverage", "unittest2", "elasticsearch<2.0.0", ], zip_safe=False, test_suite="tests.runtests.start", classifiers=[ "Operating System :: OS Independent", "Development Status :: 5 - Production/Stable", "Environment :: Web Environment", "Framework :: Django", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 2", "Programming Language :: Python :: 3", "Topic :: Software Development :: Libraries :: Python Modules", ] ) drf-haystack-1.5.6/tests/0000755000076500000240000000000012627546270015345 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/__init__.py0000644000076500000240000000177412537476535017475 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os test_runner = None old_config = None os.environ["DJANGO_SETTINGS_MODULE"] = "tests.settings" import django if hasattr(django, "setup"): django.setup() def _geospatial_support(): try: import geopy from haystack.utils.geo import Point except ImportError: return False else: return True geospatial_support = _geospatial_support() def setup(): global test_runner global old_config try: from django.test.runner import DiscoverRunner test_runner = DiscoverRunner() except ImportError: # Django 1.5 did not have the DiscoverRunner from django.test.simple import DjangoTestSuiteRunner test_runner = DjangoTestSuiteRunner() test_runner.setup_test_environment() old_config = test_runner.setup_databases() def teardown(): test_runner.teardown_databases(old_config) test_runner.teardown_test_environment() drf-haystack-1.5.6/tests/constants.py0000644000076500000240000000101412537476535017735 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os import json from django.conf import settings with open(os.path.join(settings.BASE_DIR, "mockapp", "fixtures", "mocklocation.json"), "r") as f: mocklocation_size = len(json.loads(f.read())) MOCKLOCATION_DATA_SET_SIZE = mocklocation_size with open(os.path.join(settings.BASE_DIR, "mockapp", "fixtures", "mockperson.json"), "r") as f: mockperson_size = len(json.loads(f.read())) MOCKPERSON_DATA_SET_SIZE = mockperson_size drf-haystack-1.5.6/tests/mixins.py0000644000076500000240000000076012604553472017226 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import warnings class WarningTestCaseMixin(object): """ TestCase mixin to catch warnings """ def assertWarning(self, warning, callable, *args, **kwargs): with warnings.catch_warnings(record=True) as warning_list: warnings.simplefilter(action="always") callable(*args, **kwargs) self.assertTrue(any(item.category == warning for item in warning_list)) drf-haystack-1.5.6/tests/mockapp/0000755000076500000240000000000012627546270016777 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/__init__.py0000644000076500000240000000000012467207001021062 0ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/admin.py0000644000076500000240000000007712467177130020442 0ustar rolfstaff00000000000000from django.contrib import admin # Register your models here. drf-haystack-1.5.6/tests/mockapp/fixtures/0000755000076500000240000000000012627546270020650 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/fixtures/mocklocation.json0000644000076500000240000007355612467752150024243 0ustar rolfstaff00000000000000[ { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.87566954010669, "address": "Andersenhagen 8", "zip_code": "0289", "longitude": 10.878157588853062, "updated": "2015-02-13T21:39:42.245Z", "city": "Oslo" }, "pk": 1 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.869983759842185, "address": "Edvardsen\u00e5sen 6", "zip_code": "0202", "longitude": 10.759558525204776, "updated": "2015-02-13T21:39:42.245Z", "city": "Oslo" }, "pk": 2 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.245Z", "latitude": 59.844410359198534, "address": "Gundersenholtet 68", "zip_code": "0215", "longitude": 10.860611682103695, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 3 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.98769224959462, "address": "Solbergjordet 29", "zip_code": "0254", "longitude": 10.73730212224833, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 4 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.879738785452595, "address": "Simonsenstien 60", "zip_code": "0102", "longitude": 10.875848906700593, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 5 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.90536711350938, "address": "Olsenstykket 0", "zip_code": "0212", "longitude": 10.91876267185026, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 6 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.9583137867979, "address": "Nilsenstubben 9", "zip_code": "0213", "longitude": 10.879477132609468, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 7 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.8122568752108, "address": "Fredriksenstranda 2", "zip_code": "0128", "longitude": 10.858514787949423, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 8 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.874409235929434, "address": "Johannessengrenda 06", "zip_code": "0238", "longitude": 10.723069271842252, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 9 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.90183752568133, "address": "R\u00f8nninggrenda 2", "zip_code": "0119", "longitude": 10.793385350548396, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 10 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.948077110460844, "address": "Bergeall\u00e9en 73", "zip_code": "0271", "longitude": 10.766157403874733, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 11 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 60.06698731196217, "address": "Holmtoppen 73", "zip_code": "0143", "longitude": 10.77073988514214, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 12 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.826546236660405, "address": "Nilsenholtet 3", "zip_code": "0261", "longitude": 10.753602675761355, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 13 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 60.002705305741614, "address": "Simonsenroa 3", "zip_code": "0241", "longitude": 10.82821194011631, "updated": "2015-02-13T21:39:42.246Z", "city": "Oslo" }, "pk": 14 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.246Z", "latitude": 59.88006926289976, "address": "Ahmedflata 0", "zip_code": "0269", "longitude": 10.804450883729185, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 15 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 60.07785179851334, "address": "Andreasseneggen 68", "zip_code": "0194", "longitude": 10.8169080528998, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 16 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.83607620356919, "address": "Jensenroa 00", "zip_code": "0128", "longitude": 10.78535037196415, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 17 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.92658986005112, "address": "Tangenstykket 79", "zip_code": "0145", "longitude": 10.740875983170062, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 18 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.84403819590268, "address": "J\u00f8rgensenl\u00f8kka 98", "zip_code": "0124", "longitude": 10.768651841870405, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 19 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.829511912876946, "address": "Myklebustvollen 94", "zip_code": "0215", "longitude": 10.877066582376077, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 20 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 60.02706647066823, "address": "Hauglandr\u00f8a 6", "zip_code": "0137", "longitude": 10.6415654062034, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 21 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.985606573966486, "address": "Pettersenstien 1", "zip_code": "0185", "longitude": 10.74427374271031, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 22 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.978323278289515, "address": "Haugenlyngen 00", "zip_code": "0168", "longitude": 10.767461131389409, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 23 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.864337993875836, "address": "Andersentoppen 44", "zip_code": "0178", "longitude": 10.664264362593675, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 24 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.81819275751251, "address": "Pettersensvingen 7", "zip_code": "0240", "longitude": 10.843381478117784, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 25 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.94413644053888, "address": "Andresenringen 65", "zip_code": "0214", "longitude": 10.936694075863995, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 26 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.89692955711872, "address": "Hauger\u00f8a 1", "zip_code": "0282", "longitude": 10.707543212194091, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 27 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.83062460411829, "address": "Abrahamsenhagen 8", "zip_code": "0221", "longitude": 10.676561970075774, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 28 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.247Z", "latitude": 59.98915872890549, "address": "Rasmussengropa 8", "zip_code": "0119", "longitude": 10.589869453283573, "updated": "2015-02-13T21:39:42.247Z", "city": "Oslo" }, "pk": 29 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.08927375077344, "address": "Myklebustgrenda 0", "zip_code": "0232", "longitude": 10.657923054883481, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 30 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.98446663030189, "address": "Nguyenengen 7", "zip_code": "0170", "longitude": 10.845671559789183, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 31 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.94018500082587, "address": "Tveithagen 97", "zip_code": "0218", "longitude": 10.67927891526063, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 32 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.001933056984385, "address": "S\u00e6therholtet 7", "zip_code": "0251", "longitude": 10.902943242976246, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 33 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.99064929424015, "address": "Aasenh\u00f8gda 7", "zip_code": "0187", "longitude": 10.698916574006395, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 34 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95239176850915, "address": "Haugeringen 60", "zip_code": "0258", "longitude": 10.8108034171511, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 35 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95047538076824, "address": "Isaksenfaret 0", "zip_code": "0285", "longitude": 10.702264455322045, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 36 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.87152653174937, "address": "Aaseggen 9", "zip_code": "0185", "longitude": 10.601282479870813, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 37 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.75664623062497, "address": "Hanssenmoen 40", "zip_code": "0181", "longitude": 10.695768093659206, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 38 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.89882906294937, "address": "Sandvikstien 40", "zip_code": "0156", "longitude": 10.706022843738628, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 39 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.95190215981165, "address": "J\u00f8rgensenall\u00e9en 0", "zip_code": "0151", "longitude": 10.845651584873082, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 40 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 60.063223660182516, "address": "Johannessentunet 2", "zip_code": "0226", "longitude": 10.78422680396959, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 41 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.92306173399562, "address": "Aunelyngen 7", "zip_code": "0100", "longitude": 10.662778259175045, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 42 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.84634507117154, "address": "Ahmedroa 2", "zip_code": "0269", "longitude": 10.693610021735212, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 43 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.248Z", "latitude": 59.89429607822404, "address": "Nilsenekra 1", "zip_code": "0175", "longitude": 10.733483224706205, "updated": "2015-02-13T21:39:42.248Z", "city": "Oslo" }, "pk": 44 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.99164686399774, "address": "Holmvollen 4", "zip_code": "0160", "longitude": 10.759823276550938, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 45 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9126155758038, "address": "Eideall\u00e9en 79", "zip_code": "0283", "longitude": 10.596025893278515, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 46 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.94908089577484, "address": "J\u00f8rgensenlia 2", "zip_code": "0198", "longitude": 10.918990673477044, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 47 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.982510362064396, "address": "Johnsenh\u00f8gda 4", "zip_code": "0199", "longitude": 10.826963074336764, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 48 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.97734865198299, "address": "Moeflata 28", "zip_code": "0125", "longitude": 10.723849656760295, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 49 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 60.00059654366635, "address": "Ahmedlia 61", "zip_code": "0135", "longitude": 10.908539294534814, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 50 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.970932832038336, "address": "Antonsenplassen 8", "zip_code": "0229", "longitude": 10.953581147660746, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 51 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9638956777966, "address": "Rasmussenhagen 71", "zip_code": "0293", "longitude": 10.958498294466715, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 52 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.75359135925595, "address": "Johnsenflata 8", "zip_code": "0200", "longitude": 10.647355007841071, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 53 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.9875252004295, "address": "N\u00e6ssgata 1", "zip_code": "0185", "longitude": 10.62897271234992, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 54 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.82344446227308, "address": "Fredriksenmyra 47", "zip_code": "0179", "longitude": 10.900930645475936, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 55 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 60.0449028505056, "address": "Hauglandstranda 60", "zip_code": "0187", "longitude": 10.700919248639527, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 56 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.895087574947716, "address": "Hellandbakken 7", "zip_code": "0266", "longitude": 10.6954241816232, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 57 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.91832235151213, "address": "Bergtoppen 51", "zip_code": "0213", "longitude": 10.783205112962843, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 58 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.249Z", "latitude": 59.870190926776665, "address": "Sandvikbr\u00e5ten 3", "zip_code": "0204", "longitude": 10.736731793637208, "updated": "2015-02-13T21:39:42.249Z", "city": "Oslo" }, "pk": 59 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.95383238850984, "address": "Lundtjernet 42", "zip_code": "0158", "longitude": 10.696440218567446, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 60 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.84776600279412, "address": "Kristoffersenbr\u00e5ten 6", "zip_code": "0265", "longitude": 10.858949104566072, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 61 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.97793872518526, "address": "Halvorsenr\u00f8a 05", "zip_code": "0114", "longitude": 10.732623595464785, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 62 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.89328159393844, "address": "Moentoppen 9", "zip_code": "0235", "longitude": 10.89958795492445, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 63 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.936203316140215, "address": "Vikstubben 0", "zip_code": "0280", "longitude": 10.804538287413278, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 64 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.91628034434794, "address": "Jensenkroken 1", "zip_code": "0274", "longitude": 10.668646013047587, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 65 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.93638433102333, "address": "Danielsenl\u00f8kka 5", "zip_code": "0183", "longitude": 10.846395825794241, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 66 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 60.01146872045944, "address": "Fredriksenskogen 04", "zip_code": "0289", "longitude": 10.786193719338806, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 67 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.91742251441682, "address": "Gundersenbr\u00e5ten 4", "zip_code": "0289", "longitude": 10.813481002560126, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 68 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.85863052939531, "address": "Bakkenhagen 5", "zip_code": "0141", "longitude": 10.82213945646575, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 69 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 60.03792173804758, "address": "Hellandberget 1", "zip_code": "0130", "longitude": 10.791889868477469, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 70 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.83062929125564, "address": "Christenseneggen 7", "zip_code": "0120", "longitude": 10.907517800935755, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 71 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.98282468198254, "address": "Halvorsentjernet 4", "zip_code": "0177", "longitude": 10.759850950149907, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 72 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.926942607598704, "address": "Karlsenroa 45", "zip_code": "0278", "longitude": 10.8256581721645, "updated": "2015-02-13T21:39:42.250Z", "city": "Oslo" }, "pk": 73 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.250Z", "latitude": 59.81183021555159, "address": "Martinsenkollen 4", "zip_code": "0148", "longitude": 10.687672024340516, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 74 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 60.053001056564945, "address": "Myklebustr\u00f8a 89", "zip_code": "0273", "longitude": 10.811937383749598, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 75 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 60.01216118524038, "address": "Birkelandplassen 12", "zip_code": "0210", "longitude": 10.830278113681507, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 76 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.872862737063954, "address": "Christensenvollen 1", "zip_code": "0222", "longitude": 10.60413225834586, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 77 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.776869523305855, "address": "Bergemarka 7", "zip_code": "0109", "longitude": 10.924164479198348, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 78 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.87624098669839, "address": "Jacobsengrenda 6", "zip_code": "0188", "longitude": 10.706955510850085, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 79 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.909292753079754, "address": "Mikkelseneggen 6", "zip_code": "0241", "longitude": 10.699892028390964, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 80 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.92435859208912, "address": "Brekkemarka 79", "zip_code": "0204", "longitude": 10.917460845391144, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 81 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.83890918391743, "address": "Johnsenstykket 8", "zip_code": "0291", "longitude": 10.721964696792304, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 82 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.89793503164266, "address": "Andresengrenda 21", "zip_code": "0187", "longitude": 10.749307895780507, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 83 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.809240949970764, "address": "Hagentoppen 73", "zip_code": "0211", "longitude": 10.746657782878637, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 84 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.91023398739465, "address": "Lier\u00f8a 1", "zip_code": "0279", "longitude": 10.56075654122111, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 85 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.962776041928805, "address": "Antonsenbakken 1", "zip_code": "0159", "longitude": 10.741624622650262, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 86 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.955723818265724, "address": "Alikollen 07", "zip_code": "0275", "longitude": 10.747885118677889, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 87 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.85536001517987, "address": "Bergebr\u00e5ten 32", "zip_code": "0255", "longitude": 10.921264935541542, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 88 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.251Z", "latitude": 59.87014023036983, "address": "Hansenr\u00f8a 21", "zip_code": "0249", "longitude": 10.590000534675852, "updated": "2015-02-13T21:39:42.251Z", "city": "Oslo" }, "pk": 89 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.84835215092349, "address": "Hanssenhaugen 84", "zip_code": "0184", "longitude": 10.695848580676747, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 90 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.88772645155173, "address": "Nilsenbr\u00e5ten 8", "zip_code": "0256", "longitude": 10.954550411242598, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 91 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 60.05893819077175, "address": "Engenlunden 6", "zip_code": "0169", "longitude": 10.739877317798895, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 92 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.90718585715958, "address": "Bakkegrenda 72", "zip_code": "0217", "longitude": 10.637367253615418, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 93 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.78781948947378, "address": "Rasmusseneggen 1", "zip_code": "0132", "longitude": 10.766861073375164, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 94 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.909749850974066, "address": "Lundekroken 73", "zip_code": "0294", "longitude": 10.697207568337102, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 95 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.83822910281552, "address": "Kristoffersenvollen 16", "zip_code": "0284", "longitude": 10.888289065648543, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 96 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.93090005761206, "address": "Ellingsenvollen 1", "zip_code": "0179", "longitude": 10.568415840971076, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 97 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.923396, "address": "Waldemar Thranes gate 8", "zip_code": "0107", "longitude": 10.739370, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 98 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.928354, "address": "Waldemar Thranes gate 86", "zip_code": "0175", "longitude": 10.752732, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 99 }, { "model": "mockapp.mocklocation", "fields": { "created": "2015-02-13T21:39:42.252Z", "latitude": 59.926835, "address": "Uelands gate 22", "zip_code": "0175", "longitude": 10.749738, "updated": "2015-02-13T21:39:42.252Z", "city": "Oslo" }, "pk": 100 } ] drf-haystack-1.5.6/tests/mockapp/fixtures/mockperson.json0000644000076500000240000005011712555631440023721 0ustar rolfstaff00000000000000[ { "pk": 1, "model": "mockapp.mockperson", "fields": { "lastname": "Foreman", "created": "2015-05-19T10:48:08.686Z", "firstname": "Abel", "updated": "2015-05-19T10:48:08.686Z" } }, { "pk": 2, "model": "mockapp.mockperson", "fields": { "lastname": "Mckay", "created": "2015-05-19T10:48:08.688Z", "firstname": "Dylan", "updated": "2015-05-19T10:48:08.688Z" } }, { "pk": 3, "model": "mockapp.mockperson", "fields": { "lastname": "Mcleod", "created": "2015-05-19T10:48:08.689Z", "firstname": "Raymond", "updated": "2015-05-19T10:48:08.689Z" } }, { "pk": 4, "model": "mockapp.mockperson", "fields": { "lastname": "Griffith", "created": "2015-05-19T10:48:08.690Z", "firstname": "Zane", "updated": "2015-05-19T10:48:08.690Z" } }, { "pk": 5, "model": "mockapp.mockperson", "fields": { "lastname": "Fowler", "created": "2015-05-19T10:48:08.691Z", "firstname": "Jeremy", "updated": "2015-05-19T10:48:08.691Z" } }, { "pk": 6, "model": "mockapp.mockperson", "fields": { "lastname": "Lara", "created": "2015-05-19T10:48:08.693Z", "firstname": "Norman", "updated": "2015-05-19T10:48:08.693Z" } }, { "pk": 7, "model": "mockapp.mockperson", "fields": { "lastname": "Barnett", "created": "2015-05-19T10:48:08.694Z", "firstname": "Scott", "updated": "2015-05-19T10:48:08.694Z" } }, { "pk": 8, "model": "mockapp.mockperson", "fields": { "lastname": "Gibbs", "created": "2015-05-19T10:48:08.695Z", "firstname": "Brady", "updated": "2015-05-19T10:48:08.695Z" } }, { "pk": 9, "model": "mockapp.mockperson", "fields": { "lastname": "Woodward", "created": "2015-05-19T10:48:08.696Z", "firstname": "Jesse", "updated": "2015-05-19T10:48:08.696Z" } }, { "pk": 10, "model": "mockapp.mockperson", "fields": { "lastname": "McClane", "created": "2015-05-19T10:48:08.697Z", "firstname": "John", "updated": "2015-05-19T10:48:08.697Z" } }, { "pk": 11, "model": "mockapp.mockperson", "fields": { "lastname": "McLaughlin", "created": "2015-05-19T10:48:08.698Z", "firstname": "John", "updated": "2015-05-19T10:48:08.698Z" } }, { "pk": 12, "model": "mockapp.mockperson", "fields": { "lastname": "George", "created": "2015-05-19T10:48:08.699Z", "firstname": "Mason", "updated": "2015-05-19T10:48:08.699Z" } }, { "pk": 13, "model": "mockapp.mockperson", "fields": { "lastname": "Hickman", "created": "2015-05-19T10:48:08.700Z", "firstname": "Kadeem", "updated": "2015-05-19T10:48:08.700Z" } }, { "pk": 14, "model": "mockapp.mockperson", "fields": { "lastname": "Clemons", "created": "2015-05-19T10:48:08.701Z", "firstname": "Prescott", "updated": "2015-05-19T10:48:08.701Z" } }, { "pk": 15, "model": "mockapp.mockperson", "fields": { "lastname": "Sykes", "created": "2015-05-19T10:48:08.702Z", "firstname": "Ciaran", "updated": "2015-05-19T10:48:08.702Z" } }, { "pk": 16, "model": "mockapp.mockperson", "fields": { "lastname": "Hood", "created": "2015-05-19T10:48:08.703Z", "firstname": "Bruno", "updated": "2015-05-19T10:48:08.703Z" } }, { "pk": 17, "model": "mockapp.mockperson", "fields": { "lastname": "Gould", "created": "2015-05-19T10:48:08.704Z", "firstname": "Xander", "updated": "2015-05-19T10:48:08.704Z" } }, { "pk": 18, "model": "mockapp.mockperson", "fields": { "lastname": "Cooley", "created": "2015-05-19T10:48:08.705Z", "firstname": "Odysseus", "updated": "2015-05-19T10:48:08.705Z" } }, { "pk": 19, "model": "mockapp.mockperson", "fields": { "lastname": "Ray", "created": "2015-05-19T10:48:08.705Z", "firstname": "Randall", "updated": "2015-05-19T10:48:08.705Z" } }, { "pk": 20, "model": "mockapp.mockperson", "fields": { "lastname": "Richards", "created": "2015-05-19T10:48:08.706Z", "firstname": "Brandon", "updated": "2015-05-19T10:48:08.706Z" } }, { "pk": 21, "model": "mockapp.mockperson", "fields": { "lastname": "Mullen", "created": "2015-05-19T10:48:08.707Z", "firstname": "Myles", "updated": "2015-05-19T10:48:08.707Z" } }, { "pk": 22, "model": "mockapp.mockperson", "fields": { "lastname": "Walton", "created": "2015-05-19T10:48:08.708Z", "firstname": "Ryder", "updated": "2015-05-19T10:48:08.708Z" } }, { "pk": 23, "model": "mockapp.mockperson", "fields": { "lastname": "Brooks", "created": "2015-05-19T10:48:08.709Z", "firstname": "Alec", "updated": "2015-05-19T10:48:08.709Z" } }, { "pk": 24, "model": "mockapp.mockperson", "fields": { "lastname": "Hinton", "created": "2015-05-19T10:48:08.710Z", "firstname": "Zeph", "updated": "2015-05-19T10:48:08.710Z" } }, { "pk": 25, "model": "mockapp.mockperson", "fields": { "lastname": "Drake", "created": "2015-05-19T10:48:08.711Z", "firstname": "Carl", "updated": "2015-05-19T10:48:08.711Z" } }, { "pk": 26, "model": "mockapp.mockperson", "fields": { "lastname": "Hahn", "created": "2015-05-19T10:48:08.711Z", "firstname": "Lamar", "updated": "2015-05-19T10:48:08.711Z" } }, { "pk": 27, "model": "mockapp.mockperson", "fields": { "lastname": "Buchanan", "created": "2015-05-19T10:48:08.712Z", "firstname": "Aladdin", "updated": "2015-05-19T10:48:08.712Z" } }, { "pk": 28, "model": "mockapp.mockperson", "fields": { "lastname": "Odonnell", "created": "2015-05-19T10:48:08.713Z", "firstname": "Ashton", "updated": "2015-05-19T10:48:08.713Z" } }, { "pk": 29, "model": "mockapp.mockperson", "fields": { "lastname": "Hoffman", "created": "2015-05-19T10:48:08.714Z", "firstname": "Alvin", "updated": "2015-05-19T10:48:08.714Z" } }, { "pk": 30, "model": "mockapp.mockperson", "fields": { "lastname": "Mejia", "created": "2015-05-19T10:48:08.715Z", "firstname": "Preston", "updated": "2015-05-19T10:48:08.715Z" } }, { "pk": 31, "model": "mockapp.mockperson", "fields": { "lastname": "Hayes", "created": "2015-05-19T10:48:08.716Z", "firstname": "Jameson", "updated": "2015-05-19T10:48:08.716Z" } }, { "pk": 32, "model": "mockapp.mockperson", "fields": { "lastname": "Fox", "created": "2015-05-19T10:48:08.717Z", "firstname": "Mark", "updated": "2015-05-19T10:48:08.717Z" } }, { "pk": 33, "model": "mockapp.mockperson", "fields": { "lastname": "Spears", "created": "2015-05-19T10:48:08.717Z", "firstname": "Garth", "updated": "2015-05-19T10:48:08.717Z" } }, { "pk": 34, "model": "mockapp.mockperson", "fields": { "lastname": "Herrera", "created": "2015-05-19T10:48:08.718Z", "firstname": "Justin", "updated": "2015-05-19T10:48:08.718Z" } }, { "pk": 35, "model": "mockapp.mockperson", "fields": { "lastname": "Crane", "created": "2015-05-19T10:48:08.719Z", "firstname": "Rigel", "updated": "2015-05-19T10:48:08.719Z" } }, { "pk": 36, "model": "mockapp.mockperson", "fields": { "lastname": "Mosley", "created": "2015-05-19T10:48:08.720Z", "firstname": "Ezra", "updated": "2015-05-19T10:48:08.720Z" } }, { "pk": 37, "model": "mockapp.mockperson", "fields": { "lastname": "Ballard", "created": "2015-05-19T10:48:08.721Z", "firstname": "Isaac", "updated": "2015-05-19T10:48:08.721Z" } }, { "pk": 38, "model": "mockapp.mockperson", "fields": { "lastname": "Holman", "created": "2015-05-19T10:48:08.722Z", "firstname": "Kennan", "updated": "2015-05-19T10:48:08.722Z" } }, { "pk": 39, "model": "mockapp.mockperson", "fields": { "lastname": "Cantrell", "created": "2015-05-19T10:48:08.723Z", "firstname": "Bradley", "updated": "2015-05-19T10:48:08.723Z" } }, { "pk": 40, "model": "mockapp.mockperson", "fields": { "lastname": "Green", "created": "2015-05-19T10:48:08.723Z", "firstname": "Eagan", "updated": "2015-05-19T10:48:08.723Z" } }, { "pk": 41, "model": "mockapp.mockperson", "fields": { "lastname": "Fields", "created": "2015-05-19T10:48:08.724Z", "firstname": "Davis", "updated": "2015-05-19T10:48:08.724Z" } }, { "pk": 42, "model": "mockapp.mockperson", "fields": { "lastname": "Munoz", "created": "2015-05-19T10:48:08.725Z", "firstname": "Edward", "updated": "2015-05-19T10:48:08.725Z" } }, { "pk": 43, "model": "mockapp.mockperson", "fields": { "lastname": "Baird", "created": "2015-05-19T10:48:08.726Z", "firstname": "Raphael", "updated": "2015-05-19T10:48:08.726Z" } }, { "pk": 44, "model": "mockapp.mockperson", "fields": { "lastname": "Ward", "created": "2015-05-19T10:48:08.726Z", "firstname": "Clinton", "updated": "2015-05-19T10:48:08.726Z" } }, { "pk": 45, "model": "mockapp.mockperson", "fields": { "lastname": "Odonnell", "created": "2015-05-19T10:48:08.727Z", "firstname": "Judah", "updated": "2015-05-19T10:48:08.727Z" } }, { "pk": 46, "model": "mockapp.mockperson", "fields": { "lastname": "Humphrey", "created": "2015-05-19T10:48:08.728Z", "firstname": "Zachery", "updated": "2015-05-19T10:48:08.728Z" } }, { "pk": 47, "model": "mockapp.mockperson", "fields": { "lastname": "Church", "created": "2015-05-19T10:48:08.728Z", "firstname": "Mark", "updated": "2015-05-19T10:48:08.729Z" } }, { "pk": 48, "model": "mockapp.mockperson", "fields": { "lastname": "Rose", "created": "2015-05-19T10:48:08.729Z", "firstname": "Ronan", "updated": "2015-05-19T10:48:08.729Z" } }, { "pk": 49, "model": "mockapp.mockperson", "fields": { "lastname": "Nicholson", "created": "2015-05-19T10:48:08.730Z", "firstname": "Craig", "updated": "2015-05-19T10:48:08.730Z" } }, { "pk": 50, "model": "mockapp.mockperson", "fields": { "lastname": "Copeland", "created": "2015-05-19T10:48:08.731Z", "firstname": "Benedict", "updated": "2015-05-19T10:48:08.731Z" } }, { "pk": 51, "model": "mockapp.mockperson", "fields": { "lastname": "Decker", "created": "2015-05-19T10:48:08.731Z", "firstname": "Oleg", "updated": "2015-05-19T10:48:08.731Z" } }, { "pk": 52, "model": "mockapp.mockperson", "fields": { "lastname": "Trujillo", "created": "2015-05-19T10:48:08.732Z", "firstname": "Carlos", "updated": "2015-05-19T10:48:08.732Z" } }, { "pk": 53, "model": "mockapp.mockperson", "fields": { "lastname": "Acosta", "created": "2015-05-19T10:48:08.733Z", "firstname": "Solomon", "updated": "2015-05-19T10:48:08.733Z" } }, { "pk": 54, "model": "mockapp.mockperson", "fields": { "lastname": "Gray", "created": "2015-05-19T10:48:08.733Z", "firstname": "Coby", "updated": "2015-05-19T10:48:08.733Z" } }, { "pk": 55, "model": "mockapp.mockperson", "fields": { "lastname": "Reilly", "created": "2015-05-19T10:48:08.734Z", "firstname": "Randall", "updated": "2015-05-19T10:48:08.734Z" } }, { "pk": 56, "model": "mockapp.mockperson", "fields": { "lastname": "Caldwell", "created": "2015-05-19T10:48:08.735Z", "firstname": "Davis", "updated": "2015-05-19T10:48:08.735Z" } }, { "pk": 57, "model": "mockapp.mockperson", "fields": { "lastname": "Daugherty", "created": "2015-05-19T10:48:08.736Z", "firstname": "Wing", "updated": "2015-05-19T10:48:08.736Z" } }, { "pk": 58, "model": "mockapp.mockperson", "fields": { "lastname": "Mcdowell", "created": "2015-05-19T10:48:08.737Z", "firstname": "Dane", "updated": "2015-05-19T10:48:08.737Z" } }, { "pk": 59, "model": "mockapp.mockperson", "fields": { "lastname": "Bentley", "created": "2015-05-19T10:48:08.737Z", "firstname": "Theodore", "updated": "2015-05-19T10:48:08.737Z" } }, { "pk": 60, "model": "mockapp.mockperson", "fields": { "lastname": "Pace", "created": "2015-05-19T10:48:08.738Z", "firstname": "Nehru", "updated": "2015-05-19T10:48:08.738Z" } }, { "pk": 61, "model": "mockapp.mockperson", "fields": { "lastname": "Owen", "created": "2015-05-19T10:48:08.739Z", "firstname": "Ferris", "updated": "2015-05-19T10:48:08.739Z" } }, { "pk": 62, "model": "mockapp.mockperson", "fields": { "lastname": "Hill", "created": "2015-05-19T10:48:08.740Z", "firstname": "Holmes", "updated": "2015-05-19T10:48:08.740Z" } }, { "pk": 63, "model": "mockapp.mockperson", "fields": { "lastname": "Harrell", "created": "2015-05-19T10:48:08.741Z", "firstname": "Nehru", "updated": "2015-05-19T10:48:08.741Z" } }, { "pk": 64, "model": "mockapp.mockperson", "fields": { "lastname": "Baker", "created": "2015-05-19T10:48:08.741Z", "firstname": "John", "updated": "2015-05-19T10:48:08.741Z" } }, { "pk": 65, "model": "mockapp.mockperson", "fields": { "lastname": "Gill", "created": "2015-05-19T10:48:08.742Z", "firstname": "Wesley", "updated": "2015-05-19T10:48:08.742Z" } }, { "pk": 66, "model": "mockapp.mockperson", "fields": { "lastname": "Jarvis", "created": "2015-05-19T10:48:08.743Z", "firstname": "Moses", "updated": "2015-05-19T10:48:08.743Z" } }, { "pk": 67, "model": "mockapp.mockperson", "fields": { "lastname": "Conway", "created": "2015-05-19T10:48:08.743Z", "firstname": "Fuller", "updated": "2015-05-19T10:48:08.743Z" } }, { "pk": 68, "model": "mockapp.mockperson", "fields": { "lastname": "Mckinney", "created": "2015-05-19T10:48:08.744Z", "firstname": "Octavius", "updated": "2015-05-19T10:48:08.744Z" } }, { "pk": 69, "model": "mockapp.mockperson", "fields": { "lastname": "Horne", "created": "2015-05-19T10:48:08.745Z", "firstname": "Raja", "updated": "2015-05-19T10:48:08.745Z" } }, { "pk": 70, "model": "mockapp.mockperson", "fields": { "lastname": "Dorsey", "created": "2015-05-19T10:48:08.745Z", "firstname": "Keegan", "updated": "2015-05-19T10:48:08.745Z" } }, { "pk": 71, "model": "mockapp.mockperson", "fields": { "lastname": "Jimenez", "created": "2015-05-19T10:48:08.746Z", "firstname": "Wang", "updated": "2015-05-19T10:48:08.746Z" } }, { "pk": 72, "model": "mockapp.mockperson", "fields": { "lastname": "Galloway", "created": "2015-05-19T10:48:08.747Z", "firstname": "Kermit", "updated": "2015-05-19T10:48:08.747Z" } }, { "pk": 73, "model": "mockapp.mockperson", "fields": { "lastname": "Banks", "created": "2015-05-19T10:48:08.748Z", "firstname": "Lucian", "updated": "2015-05-19T10:48:08.748Z" } }, { "pk": 74, "model": "mockapp.mockperson", "fields": { "lastname": "Porter", "created": "2015-05-19T10:48:08.748Z", "firstname": "Lewis", "updated": "2015-05-19T10:48:08.748Z" } }, { "pk": 75, "model": "mockapp.mockperson", "fields": { "lastname": "Noel", "created": "2015-05-19T10:48:08.749Z", "firstname": "Simon", "updated": "2015-05-19T10:48:08.749Z" } }, { "pk": 76, "model": "mockapp.mockperson", "fields": { "lastname": "Hodge", "created": "2015-05-19T10:48:08.750Z", "firstname": "Carl", "updated": "2015-05-19T10:48:08.750Z" } }, { "pk": 77, "model": "mockapp.mockperson", "fields": { "lastname": "Reynolds", "created": "2015-05-19T10:48:08.751Z", "firstname": "Abel", "updated": "2015-05-19T10:48:08.751Z" } }, { "pk": 78, "model": "mockapp.mockperson", "fields": { "lastname": "Porter", "created": "2015-05-19T10:48:08.751Z", "firstname": "Kaseem", "updated": "2015-05-19T10:48:08.751Z" } }, { "pk": 79, "model": "mockapp.mockperson", "fields": { "lastname": "Parks", "created": "2015-05-19T10:48:08.752Z", "firstname": "Tucker", "updated": "2015-05-19T10:48:08.752Z" } }, { "pk": 80, "model": "mockapp.mockperson", "fields": { "lastname": "Jensen", "created": "2015-05-19T10:48:08.753Z", "firstname": "Caldwell", "updated": "2015-05-19T10:48:08.753Z" } }, { "pk": 81, "model": "mockapp.mockperson", "fields": { "lastname": "Tyson", "created": "2015-05-19T10:48:08.753Z", "firstname": "Melvin", "updated": "2015-05-19T10:48:08.753Z" } }, { "pk": 82, "model": "mockapp.mockperson", "fields": { "lastname": "Hood", "created": "2015-05-19T10:48:08.754Z", "firstname": "Walker", "updated": "2015-05-19T10:48:08.754Z" } }, { "pk": 83, "model": "mockapp.mockperson", "fields": { "lastname": "Dillon", "created": "2015-05-19T10:48:08.755Z", "firstname": "Brenden", "updated": "2015-05-19T10:48:08.755Z" } }, { "pk": 84, "model": "mockapp.mockperson", "fields": { "lastname": "Young", "created": "2015-05-19T10:48:08.755Z", "firstname": "Avram", "updated": "2015-05-19T10:48:08.755Z" } }, { "pk": 85, "model": "mockapp.mockperson", "fields": { "lastname": "Guerrero", "created": "2015-05-19T10:48:08.756Z", "firstname": "Lev", "updated": "2015-05-19T10:48:08.756Z" } }, { "pk": 86, "model": "mockapp.mockperson", "fields": { "lastname": "Wade", "created": "2015-05-19T10:48:08.757Z", "firstname": "Leroy", "updated": "2015-05-19T10:48:08.757Z" } }, { "pk": 87, "model": "mockapp.mockperson", "fields": { "lastname": "Gardner", "created": "2015-05-19T10:48:08.757Z", "firstname": "Kuame", "updated": "2015-05-19T10:48:08.757Z" } }, { "pk": 88, "model": "mockapp.mockperson", "fields": { "lastname": "David", "created": "2015-05-19T10:48:08.758Z", "firstname": "Lewis", "updated": "2015-05-19T10:48:08.758Z" } }, { "pk": 89, "model": "mockapp.mockperson", "fields": { "lastname": "Carrillo", "created": "2015-05-19T10:48:08.759Z", "firstname": "Ezra", "updated": "2015-05-19T10:48:08.759Z" } }, { "pk": 90, "model": "mockapp.mockperson", "fields": { "lastname": "Giles", "created": "2015-05-19T10:48:08.759Z", "firstname": "Barrett", "updated": "2015-05-19T10:48:08.759Z" } }, { "pk": 91, "model": "mockapp.mockperson", "fields": { "lastname": "Garner", "created": "2015-05-19T10:48:08.760Z", "firstname": "Adrian", "updated": "2015-05-19T10:48:08.760Z" } }, { "pk": 92, "model": "mockapp.mockperson", "fields": { "lastname": "Stewart", "created": "2015-05-19T10:48:08.761Z", "firstname": "Vladimir", "updated": "2015-05-19T10:48:08.761Z" } }, { "pk": 93, "model": "mockapp.mockperson", "fields": { "lastname": "Aguirre", "created": "2015-05-19T10:48:08.762Z", "firstname": "Burke", "updated": "2015-05-19T10:48:08.762Z" } }, { "pk": 94, "model": "mockapp.mockperson", "fields": { "lastname": "Johnson", "created": "2015-05-19T10:48:08.762Z", "firstname": "Slade", "updated": "2015-05-19T10:48:08.762Z" } }, { "pk": 95, "model": "mockapp.mockperson", "fields": { "lastname": "Hodges", "created": "2015-05-19T10:48:08.763Z", "firstname": "Justin", "updated": "2015-05-19T10:48:08.763Z" } }, { "pk": 96, "model": "mockapp.mockperson", "fields": { "lastname": "Wolfe", "created": "2015-05-19T10:48:08.764Z", "firstname": "Elton", "updated": "2015-05-19T10:48:08.764Z" } }, { "pk": 97, "model": "mockapp.mockperson", "fields": { "lastname": "Parker", "created": "2015-05-19T10:48:08.764Z", "firstname": "Kamal", "updated": "2015-05-19T10:48:08.764Z" } }, { "pk": 98, "model": "mockapp.mockperson", "fields": { "lastname": "Rowland", "created": "2015-05-19T10:48:08.765Z", "firstname": "Jeremy", "updated": "2015-05-19T10:48:08.765Z" } }, { "pk": 99, "model": "mockapp.mockperson", "fields": { "lastname": "Anthony", "created": "2015-05-19T10:48:08.766Z", "firstname": "Derek", "updated": "2015-05-19T10:48:08.766Z" } }, { "pk": 100, "model": "mockapp.mockperson", "fields": { "lastname": "Sørensen", "created": "2015-05-19T10:48:08.767Z", "firstname": "Åsmund", "updated": "2015-05-19T10:48:08.767Z" } } ] drf-haystack-1.5.6/tests/mockapp/fixtures/mockpet.json0000644000076500000240000000362712567013134023204 0ustar rolfstaff00000000000000[ { "pk": 1, "model": "mockapp.mockpet", "fields": { "name": "John", "created": "2015-05-19T10:48:08.686Z", "species": "Iguana", "updated": "2015-05-19T10:48:08.686Z" } }, { "pk": 2, "model": "mockapp.mockpet", "fields": { "name": "Fido", "created": "2015-05-19T10:48:08.688Z", "species": "Dog", "updated": "2015-05-19T10:48:08.688Z" } }, { "pk": 3, "model": "mockapp.mockpet", "fields": { "name": "Zane", "created": "2015-05-19T10:48:08.689Z", "species": "Dog", "updated": "2015-05-19T10:48:08.689Z" } }, { "pk": 4, "model": "mockapp.mockpet", "fields": { "name": "Spot", "created": "2015-05-19T10:48:08.690Z", "species": "Dog", "updated": "2015-05-19T10:48:08.690Z" } }, { "pk": 5, "model": "mockapp.mockpet", "fields": { "name": "Sam", "created": "2015-05-19T10:48:08.691Z", "species": "Bird", "updated": "2015-05-19T10:48:08.691Z" } }, { "pk": 6, "model": "mockapp.mockpet", "fields": { "name": "Jenny", "created": "2015-05-19T10:48:08.693Z", "species": "Cat", "updated": "2015-05-19T10:48:08.693Z" } }, { "pk": 7, "model": "mockapp.mockpet", "fields": { "name": "Mr. Squiggles", "created": "2015-05-19T10:48:08.694Z", "species": "Cat", "updated": "2015-05-19T10:48:08.694Z" } }, { "pk": 8, "model": "mockapp.mockpet", "fields": { "name": "Mrs. Buttersworth", "created": "2015-05-19T10:48:08.695Z", "species": "Dog", "updated": "2015-05-19T10:48:08.695Z" } }, { "pk": 9, "model": "mockapp.mockpet", "fields": { "name": "Tweety", "created": "2015-05-19T10:48:08.696Z", "species": "Bird", "updated": "2015-05-19T10:48:08.696Z" } }, { "pk": 10, "model": "mockapp.mockpet", "fields": { "name": "Jake", "created": "2015-05-19T10:48:08.697Z", "species": "Cat", "updated": "2015-05-19T10:48:08.697Z" } } ] drf-haystack-1.5.6/tests/mockapp/migrations/0000755000076500000240000000000012627546270021153 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/migrations/0001_initial.py0000644000076500000240000000163412467205345023617 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ] operations = [ migrations.CreateModel( name='MockLocation', fields=[ ('id', models.AutoField(primary_key=True, auto_created=True, serialize=False, verbose_name='ID')), ('latitude', models.FloatField()), ('longitude', models.FloatField()), ('address', models.CharField(max_length=100)), ('city', models.CharField(max_length=30)), ('zip_code', models.CharField(max_length=10)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] drf-haystack-1.5.6/tests/mockapp/migrations/0002_mockperson.py0000644000076500000240000000143612537476535024360 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ('mockapp', '0001_initial'), ] operations = [ migrations.CreateModel( name='MockPerson', fields=[ ('id', models.AutoField(auto_created=True, verbose_name='ID', primary_key=True, serialize=False)), ('firstname', models.CharField(max_length=20)), ('lastname', models.CharField(max_length=20)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] drf-haystack-1.5.6/tests/mockapp/migrations/0003_mockpet.py0000644000076500000240000000147512567013134023627 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ('mockapp', '0001_initial'), ('mockapp', '0002_mockperson'), ] operations = [ migrations.CreateModel( name='MockPet', fields=[ ('id', models.AutoField(auto_created=True, verbose_name='ID', primary_key=True, serialize=False)), ('name', models.CharField(max_length=20)), ('species', models.CharField(max_length=20)), ('created', models.DateTimeField(auto_now_add=True)), ('updated', models.DateTimeField(auto_now=True)), ], options={ }, bases=(models.Model,), ), ] drf-haystack-1.5.6/tests/mockapp/migrations/0004_load_fixtures.py0000644000076500000240000000320412567013134025026 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals import os from django.core import serializers from django.db import models, migrations def load_data(apps, schema_editor): """ Load fixtures for MockPerson, MockPet and MockLocation """ fixtures = os.path.abspath(os.path.join(os.path.dirname(__file__), os.path.pardir, "fixtures")) with open(os.path.join(fixtures, "mockperson.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() with open(os.path.join(fixtures, "mocklocation.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() with open(os.path.join(fixtures, "mockpet.json"), "r") as fixture: objects = serializers.deserialize("json", fixture, ignorenonexistent=True) for obj in objects: obj.save() def unload_data(apps, schema_editor): """ Unload fixtures for MockPerson, MockPet and MockLocation """ MockPerson = apps.get_model("mockapp", "MockPerson") MockLocation = apps.get_model("mockapp", "MockLocation") MockPet = apps.get_model("mockapp", "MockPet") MockPerson.objects.all().delete() MockLocation.objects.all().delete() MockPet.objects.all().delete() class Migration(migrations.Migration): dependencies = [ ("mockapp", "0001_initial"), ("mockapp", "0002_mockperson"), ("mockapp", "0003_mockpet"), ] operations = [ migrations.RunPython(load_data, reverse_code=unload_data) ] drf-haystack-1.5.6/tests/mockapp/migrations/__init__.py0000644000076500000240000000000012467177130023247 0ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/models.py0000644000076500000240000000272312567013134020627 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from django.db import models from django.utils.encoding import python_2_unicode_compatible @python_2_unicode_compatible class MockLocation(models.Model): latitude = models.FloatField() longitude = models.FloatField() address = models.CharField(max_length=100) city = models.CharField(max_length=30) zip_code = models.CharField(max_length=10) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return self.address @property def coordinates(self): try: from haystack.utils.geo import Point except ImportError: return None else: return Point(self.longitude, self.latitude, srid=4326) @python_2_unicode_compatible class MockPerson(models.Model): firstname = models.CharField(max_length=20) lastname = models.CharField(max_length=20) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return "%s %s" % (self.firstname, self.lastname) @python_2_unicode_compatible class MockPet(models.Model): name = models.CharField(max_length=20) species = models.CharField(max_length=20) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) def __str__(self): return self.name drf-haystack-1.5.6/tests/mockapp/search_indexes.py0000644000076500000240000000463012604553472022335 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from django.utils import timezone from haystack import indexes from .models import MockLocation, MockPerson, MockPet class MockLocationIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) address = indexes.CharField(model_attr="address") city = indexes.CharField(model_attr="city") zip_code = indexes.CharField(model_attr="zip_code") autocomplete = indexes.EdgeNgramField() coordinates = indexes.LocationField(model_attr="coordinates") @staticmethod def prepare_autocomplete(obj): return " ".join(( obj.address, obj.city, obj.zip_code )) def get_model(self): return MockLocation def index_queryset(self, using=None): return self.get_model().objects.filter( created__lte=timezone.now() ) class MockPersonIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) firstname = indexes.CharField(model_attr="firstname", faceted=True) lastname = indexes.CharField(model_attr="lastname", faceted=True) full_name = indexes.CharField() description = indexes.CharField() autocomplete = indexes.EdgeNgramField() created = indexes.FacetDateTimeField(model_attr="created") @staticmethod def prepare_full_name(obj): return " ".join((obj.firstname, obj.lastname)) @staticmethod def prepare_description(obj): return " ".join((obj.firstname, "is a nice chap!")) @staticmethod def prepare_autocomplete(obj): return " ".join((obj.firstname, obj.lastname)) def get_model(self): return MockPerson def index_queryset(self, using=None): return self.get_model().objects.filter( created__lte=timezone.now() ) class MockPetIndex(indexes.SearchIndex, indexes.Indexable): text = indexes.CharField(document=True, use_template=True) name = indexes.CharField(model_attr="name") species = indexes.CharField(model_attr="species") description = indexes.CharField() autocomplete = indexes.EdgeNgramField() @staticmethod def prepare_description(obj): return " ".join((obj.name, "the", obj.species)) @staticmethod def prepare_autocomplete(obj): return obj.name def get_model(self): return MockPet drf-haystack-1.5.6/tests/mockapp/serializers.py0000644000076500000240000000337712605427351021711 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from datetime import datetime, timedelta from rest_framework.serializers import HyperlinkedIdentityField from drf_haystack.serializers import HaystackSerializer, HaystackFacetSerializer, HighlighterMixin from .search_indexes import MockPersonIndex, MockLocationIndex class SearchSerializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex, MockLocationIndex] fields = [ "firstname", "lastname", "full_name", "text", "address", "city", "zip_code", ] class HighlighterSerializer(HighlighterMixin, HaystackSerializer): highlighter_css_class = "my-highlighter-class" highlighter_html_tag = "em" class Meta: index_classes = [MockPersonIndex, MockLocationIndex] fields = [ "firstname", "lastname", "full_name", "address", "city", "zip_code", ] class MoreLikeThisSerializer(HaystackSerializer): more_like_this = HyperlinkedIdentityField(view_name="search3-more-like-this", read_only=True) class Meta: index_classes = [MockPersonIndex] fields = [ "firstname", "lastname", "full_name", "autocomplete" ] class MockPersonFacetSerializer(HaystackFacetSerializer): serialize_objects = True class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "day", "gap_amount": 10 } } drf-haystack-1.5.6/tests/mockapp/templates/0000755000076500000240000000000012627546270020775 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/templates/search/0000755000076500000240000000000012627546270022242 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/templates/search/indexes/0000755000076500000240000000000012627546270023701 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/templates/search/indexes/mockapp/0000755000076500000240000000000012627546270025333 5ustar rolfstaff00000000000000drf-haystack-1.5.6/tests/mockapp/templates/search/indexes/mockapp/mocklocation_text.txt0000644000076500000240000000017612467214551031622 0ustar rolfstaff00000000000000{{ object.address }} {{ object.zip_code }} {{ object.city }} Created: {{ object.created }} Last modified: {{ object.updated }}drf-haystack-1.5.6/tests/mockapp/templates/search/indexes/mockapp/mockperson_text.txt0000644000076500000240000000005512560376163031316 0ustar rolfstaff00000000000000{{ object.firstname }} {{ object.lastname }} drf-haystack-1.5.6/tests/mockapp/templates/search/indexes/mockapp/mockpet_text.txt0000644000076500000240000000002112567013134030562 0ustar rolfstaff00000000000000{{ object.name }}drf-haystack-1.5.6/tests/mockapp/views.py0000644000076500000240000000223412605427351020501 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from rest_framework.pagination import PageNumberPagination from drf_haystack.filters import HaystackBoostFilter, HaystackHighlightFilter, HaystackAutocompleteFilter from drf_haystack.generics import SQHighlighterMixin from drf_haystack.viewsets import HaystackViewSet from .models import MockPerson, MockLocation from .serializers import ( SearchSerializer, HighlighterSerializer, MoreLikeThisSerializer, MockPersonFacetSerializer ) class BasicPagination(PageNumberPagination): page_size = 2 page_size_query_param = "page_size" class SearchViewSet1(HaystackViewSet): index_models = [MockPerson] serializer_class = SearchSerializer filter_backends = [HaystackHighlightFilter, HaystackAutocompleteFilter] # Faceting facet_serializer_class = MockPersonFacetSerializer pagination_class = BasicPagination class SearchViewSet2(HaystackViewSet): index_models = [MockPerson, MockLocation] serializer_class = HighlighterSerializer class SearchViewSet3(HaystackViewSet): index_models = [MockPerson] serializer_class = MoreLikeThisSerializer drf-haystack-1.5.6/tests/runtests.py0000644000076500000240000000110212466653133017576 0ustar rolfstaff00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- from __future__ import absolute_import, print_function, unicode_literals import os import sys import nose def start(argv=None): sys.exitfunc = lambda: sys.stderr.write("Shutting down...\n") if argv is None: argv = [ "nosetests", "--cover-branches", "--with-coverage", "--cover-erase", "--verbose", "--cover-package=drf_haystack", ] nose.run_exit(argv=argv, defaultTest=os.path.abspath(os.path.dirname(__file__))) if __name__ == "__main__": start(sys.argv) drf-haystack-1.5.6/tests/settings.py0000644000076500000240000000540012605414301017537 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals import os BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(os.path.dirname(__file__)), "tests")) SECRET_KEY = 'NOBODY expects the Spanish Inquisition!' DEBUG = True TEMPLATE_DEBUG = True ALLOWED_HOSTS = ["*"] # Application definition INSTALLED_APPS = ( 'django.contrib.auth', 'django.contrib.contenttypes', 'django.contrib.sessions', 'django.contrib.staticfiles', 'haystack', 'rest_framework', 'tests.mockapp', ) MIDDLEWARE_CLASSES = ( 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware', ) ROOT_URLCONF = 'tests.urls' WSGI_APPLICATION = 'tests.wsgi.application' # Database # https://docs.djangoproject.com/en/1.7/ref/settings/#databases DATABASES = { 'default': { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': os.path.join(BASE_DIR, 'test.db'), } } # Internationalization # https://docs.djangoproject.com/en/1.7/topics/i18n/ LANGUAGE_CODE = 'en-us' TIME_ZONE = 'UTC' USE_I18N = True USE_L10N = True USE_TZ = True # Static files (CSS, JavaScript, Images) # https://docs.djangoproject.com/en/1.7/howto/static-files/ STATIC_URL = '/static/' HAYSTACK_CONNECTIONS = { 'default': { 'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine', 'URL': 'http://localhost:9200/', 'INDEX_NAME': 'drf-haystack-test', 'TIMEOUT': 300, }, } DEFAULT_LOG_DIR = os.path.join(BASE_DIR, 'logs') LOGGING = { 'version': 1, 'disable_existing_loggers': False, 'formatters': { 'standard': { 'format': '%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s' }, }, 'handlers': { 'console_handler': { 'level': 'DEBUG', 'class': 'logging.StreamHandler', 'formatter': 'standard' }, 'file_handler': { 'level': 'DEBUG', 'class': 'logging.FileHandler', 'filename': os.path.join(DEFAULT_LOG_DIR, 'tests.log'), }, }, 'loggers': { 'default': { 'handlers': ['file_handler'], 'level': 'DEBUG', 'propagate': True, }, 'elasticsearch': { 'handlers': ['file_handler'], 'level': 'DEBUG', 'propagate': True, }, 'elasticsearch.trace': { 'handlers': ['file_handler'], 'level': 'DEBUG', 'propagate': True, }, }, } drf-haystack-1.5.6/tests/test_filters.py0000644000076500000240000004540012604553472020426 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- # # Unit tests for the `drf_haystack.filters` classes. # from __future__ import absolute_import, unicode_literals import json from datetime import datetime, timedelta from unittest2 import skipIf from django.core.exceptions import ImproperlyConfigured from django.test import TestCase from rest_framework import status from rest_framework import serializers from rest_framework.test import APIRequestFactory from drf_haystack.viewsets import HaystackViewSet from drf_haystack.serializers import HaystackSerializer, HaystackFacetSerializer from drf_haystack.filters import ( HaystackAutocompleteFilter, HaystackBoostFilter, HaystackGEOSpatialFilter, HaystackHighlightFilter, HaystackFacetFilter) from . import geospatial_support from .mixins import WarningTestCaseMixin from .constants import MOCKLOCATION_DATA_SET_SIZE, MOCKPERSON_DATA_SET_SIZE from .mockapp.models import MockLocation, MockPerson from .mockapp.search_indexes import MockLocationIndex, MockPersonIndex factory = APIRequestFactory() class HaystackFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer1(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "full_name", "autocomplete"] field_aliases = { "q": "autocomplete", "name": "full_name" } class Serializer2(HaystackSerializer): class Meta: index_classes = [MockLocationIndex] exclude = ["lastname"] class Serializer4(serializers.Serializer): # This is not allowed. Must implement a `Meta` class. pass class ViewSet1(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer1 # No need to specify `filter_backends`, defaults to HaystackFilter class ViewSet2(ViewSet1): serializer_class = Serializer2 class ViewSet3(ViewSet1): serializer_class = Serializer4 self.view1 = ViewSet1 self.view2 = ViewSet2 self.view3 = ViewSet3 def tearDown(self): MockPersonIndex().clear() def test_filter_no_query_parameters(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKPERSON_DATA_SET_SIZE) def test_filter_single_field(self): request = factory.get(path="/", data={"firstname": "John"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_single_field_with_lookup(self): request = factory.get(path="/", data={"firstname__startswith": "John"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_aliased_field(self): request = factory.get(path="/", data={"name": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_aliased_field_with_lookup(self): request = factory.get(path="/", data={"name__contains": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_single_field_OR(self): # Test filtering a single field for multiple values. The parameters should be OR'ed request = factory.get(path="/", data={"lastname": "Hickman,Hood"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_single_field_OR_custom_lookup_sep(self): setattr(self.view1, "lookup_sep", ";") request = factory.get(path="/", data={"lastname": "Hickman;Hood"}) # Should return 3 results response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) # Reset the `lookup_sep` setattr(self.view1, "lookup_sep", ",") def test_filter_multiple_fields(self): # Test filtering multiple fields. The parameters should be AND'ed request = factory.get(path="/", data={"lastname": "Hood", "firstname": "Bruno"}) # Should return 1 result response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 1) def test_filter_multiple_fields_OR_same_fields(self): # Test filtering multiple fields for multiple values. The values should be OR'ed between # same parameters, and AND'ed between them request = factory.get(path="/", data={ "lastname": "Hickman,Hood", "firstname": "Walker,Bruno" }) # Should return 2 result response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) def test_filter_excluded_field(self): request = factory.get(path="/", data={"lastname": "Hood"}, content_type="application/json") response = self.view2.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKPERSON_DATA_SET_SIZE) # Should return all results since, field is ignored def test_filter_with_non_searched_excluded_field(self): request = factory.get(path="/", data={"text": "John"}, content_type="application/json") response = self.view2.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 3) def test_filter_raise_on_serializer_without_meta_class(self): # Make sure we're getting an ImproperlyConfigured when trying to filter on a viewset # with a serializer without `Meta` class. request = factory.get(path="/", data={"lastname": "Hood"}, content_type="application/json") self.assertRaises( ImproperlyConfigured, self.view3.as_view(actions={"get": "list"}), request ) def test_filter_unicode_characters(self): request = factory.get(path="/", data={"firstname": "åsmund", "lastname": "sørensen"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(len(response.data), 1) def test_filter_negated_field(self): request = factory.get(path="/", data={"text__not": "John"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 97) def test_filter_negated_field_with_lookup(self): request = factory.get(path="/", data={"name__not__contains": "John McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 99) def test_filter_negated_field_with_other_field(self): request = factory.get(path="/", data={"firstname": "John", "lastname__not": "McClane"}, content_type="application/json") response = self.view1.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) class HaystackAutocompleteFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackAutocompleteFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() def test_filter_autocomplete_single_term(self): # Test querying the autocomplete field for a partial term. Should return 4 results request = factory.get(path="/", data={"autocomplete": "jer"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 4) def test_filter_autocomplete_multiple_terms(self): # Test querying the autocomplete field for multiple terms. # Make sure the filter AND's the terms on spaces, thus reduce the results. request = factory.get(path="/", data={"autocomplete": "joh mc"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 2) def test_filter_autocomplete_multiple_parameters(self): request = factory.get(path="/", data={"autocomplete": "jer fowler", "firstname": "jeremy"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_autocomplete_single_field_OR(self): request = factory.get(path="/", data={"autocomplete": "jer,fowl"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) @skipIf(not geospatial_support, "Skipped due to lack of GEO spatial features") class HaystackGEOSpatialFilterTestCase(TestCase): fixtures = ["mocklocation"] def setUp(self): MockLocationIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockLocationIndex] fields = [ "text", "address", "city", "zip_code", "coordinates", ] class ViewSet(HaystackViewSet): index_models = [MockLocation] serializer_class = Serializer filter_backends = [HaystackGEOSpatialFilter] self.view = ViewSet def tearDown(self): MockLocationIndex().clear() def test_filter_dwithin(self): request = factory.get(path="/", data={"from": "59.923396,10.739370", "km": 1}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), 4) def test_filter_dwithin_without_range_unit(self): # If no range unit is supplied, no filtering will occur. Make sure we # get the entire data set. request = factory.get(path="/", data={"from": "59.923396,10.739370"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(len(response.data), MOCKLOCATION_DATA_SET_SIZE) def test_filter_dwithin_invalid_params(self): request = factory.get(path="/", data={"from": "i am not numeric,10.739370", "km": 1}, content_type="application/json") self.assertRaises( ValueError, self.view.as_view(actions={"get": "list"}), request ) class HaystackHighlightFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackHighlightFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() def test_filter_sq_highlighter_filter(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) response.render() for result in json.loads(response.content.decode()): self.assertTrue("highlighted" in result) self.assertEqual( result["highlighted"], " ".join(("Jeremy", "%s\n" % result["lastname"])) ) class HaystackBoostFilterTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname"] class ViewSet(HaystackViewSet): index_models = [MockPerson] serializer_class = Serializer filter_backends = [HaystackBoostFilter] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() # Skipping the boost filter test case because it fails. # I strongly believe that this has to be fixed upstream, and # that the drf-haystack code works as it should. # def test_filter_boost(self): # # # This test will fail # # See https://github.com/django-haystack/django-haystack/issues/1235 # # request = factory.get(path="/", data={"lastname": "hood"}, content_type="application/json") # response = self.view.as_view(actions={"get": "list"})(request) # response.render() # data = json.loads(response.content.decode()) # self.assertEqual(len(response.data), 2) # self.assertEqual(data[0]["firstname"], "Bruno") # self.assertEqual(data[1]["firstname"], "Walker") # # # We're boosting walter slightly which should put him first in the results # request = factory.get(path="/", data={"lastname": "hood", "boost": "walker,1.1"}, # content_type="application/json") # response = self.view.as_view(actions={"get": "list"})(request) # response.render() # data = json.loads(response.content.decode()) # self.assertEqual(len(response.data), 2) # self.assertEqual(data[0]["firstname"], "Walker") # self.assertEqual(data[1]["firstname"], "Bruno") def test_filter_boost_invalid_params(self): request = factory.get(path="/", data={"boost": "bruno,i am not numeric!"}, content_type="application/json") self.assertRaises( ValueError, self.view.as_view(actions={"get": "list"}), request ) class HaystackFacetFilterTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class FacetSerializer1(HaystackFacetSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] class FacetSerializer2(HaystackFacetSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=3 * 365), "end_date": datetime.now(), "gap_by": "day", "gap_amount": 10 } } class ViewSet1(HaystackViewSet): index_models = [MockPerson] facet_serializer_class = FacetSerializer1 class ViewSet2(HaystackViewSet): index_models = [MockPerson] facet_serializer_class = FacetSerializer2 self.view1 = ViewSet1 self.view2 = ViewSet2 def tearDown(self): MockPersonIndex().clear() def test_filter_facet_no_field_options(self): request = factory.get("/", data={}, content_type="application/json") response = self.view1.as_view(actions={"get": "facets"})(request) response.render() self.assertEqual(response.status_code, status.HTTP_200_OK) self.assertEqual(json.loads(response.content.decode()), {}) def test_filter_facet_serializer_no_field_options_missing_required_query_parameters(self): request = factory.get("/", data={"created": "start_date:Oct 3rd 2015"}, content_type="application/json") try: self.view1.as_view(actions={"get": "facets"})(request) self.fail("Did not raise ValueError when called without all required " "attributes and no default field_options is set.") except ValueError as e: self.assertEqual( str(e), "Date faceting requires at least 'start_date', 'end_date' and 'gap_by' to be set." ) def test_filter_facet_no_field_options_valid_required_query_parameters(self): request = factory.get( "/", data={"created": "start_date:Jan 1th 2010,end_date:Dec 31th 2020,gap_by:month,gap_amount:1"}, content_type="application/json" ) response = self.view1.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_facet_with_field_options(self): request = factory.get("/", data={}, content_type="application/json") response = self.view2.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_filter_facet_warn_on_inproperly_formatted_token(self): request = factory.get("/", data={"firstname": "token"}, content_type="application/json") self.assertWarning(UserWarning, self.view2.as_view(actions={"get": "facets"}), request) drf-haystack-1.5.6/tests/test_serializers.py0000644000076500000240000006557412627527535021336 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- # # Unit tests for the `drf_haystack.serializers` classes. # from __future__ import absolute_import, unicode_literals import json from datetime import datetime, timedelta from django.conf.urls import url, include from django.core.exceptions import ImproperlyConfigured from django.http import QueryDict from django.test import TestCase from haystack.query import SearchQuerySet from rest_framework import serializers from rest_framework.pagination import PageNumberPagination from rest_framework.routers import DefaultRouter from rest_framework.test import APIRequestFactory, APITestCase from drf_haystack.generics import SQHighlighterMixin from drf_haystack.serializers import ( HighlighterMixin, HaystackSerializer, HaystackSerializerMixin, HaystackFacetSerializer ) from drf_haystack.viewsets import HaystackViewSet from .mixins import WarningTestCaseMixin from .mockapp.models import MockPerson from .mockapp.search_indexes import MockPersonIndex, MockPetIndex factory = APIRequestFactory() class SearchPersonSerializer(HaystackSerializer): more_like_this = serializers.HyperlinkedIdentityField(view_name="search-person-more-like-this", read_only=True) class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "full_name"] class SearchPersonFacetSerializer(HaystackFacetSerializer): serialize_objects = True class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "created"] field_options = { "firstname": {}, "lastname": {}, "created": { "start_date": datetime.now() - timedelta(days=10 * 365), "end_date": datetime.now(), "gap_by": "month", "gap_amount": 1 } } class BasicPagination(PageNumberPagination): page_size = 2 page_size_query_param = "page_size" class SearchPersonViewSet(HaystackViewSet): serializer_class = SearchPersonSerializer facet_serializer_class = SearchPersonFacetSerializer pagination_class = BasicPagination class Meta: index_models = [MockPerson] router = DefaultRouter() router.register("search-person", viewset=SearchPersonViewSet, base_name="search-person") urlpatterns = [ url(r"^", include(router.urls)) ] class HaystackSerializerTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer1(HaystackSerializer): # This is not allowed. Serializer must implement a # `Meta` class pass class Serializer2(HaystackSerializer): class Meta: # This is not allowed. The Meta class must implement # a `index_classes` attribute pass class Serializer3(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["some_field"] exclude = ["another_field"] # This is not allowed. Can't set both `fields` and `exclude` class Serializer4(HaystackSerializer): integer_field = serializers.IntegerField() city = serializers.CharField() class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] def get_integer_field(self, obj): return 1 def get_city(self, obj): return "Declared overriding field" class Serializer5(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] exclude = ["firstname"] class Serializer6(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["text", "firstname", "lastname", "autocomplete"] ignore_fields = ["autocomplete"] class ViewSet1(HaystackViewSet): serializer_class = Serializer3 class ViewSet2(HaystackViewSet): serializer_class = Serializer4 class Meta: index_models = [MockPerson] self.serializer1 = Serializer1 self.serializer2 = Serializer2 self.serializer3 = Serializer3 self.serializer4 = Serializer4 self.serializer5 = Serializer5 self.serializer6 = Serializer6 self.view1 = ViewSet1 self.view2 = ViewSet2 def tearDown(self): MockPersonIndex().clear() def test_serializer_raise_without_meta_class(self): try: self.serializer1() self.fail("Did not fail when initialized serializer with no Meta class") except ImproperlyConfigured as e: self.assertEqual(str(e), "%s must implement a Meta class." % self.serializer1.__name__) def test_serializer_raise_without_index_models(self): try: self.serializer2() self.fail("Did not fail when initialized serializer with no 'index_classes' attribute") except ImproperlyConfigured as e: self.assertEqual(str(e), "You must set either the 'index_classes' or 'serializers' " "attribute on the serializer Meta class.") def test_serializer_raise_on_both_fields_and_exclude(self): # Make sure we're getting an ImproperlyConfigured when trying to call a viewset # which has both `fields` and `exclude` set. request = factory.get(path="/", data="", content_type="application/json") try: self.view1.as_view(actions={"get": "list"})(request) self.fail("Did not fail when serializer has both 'fields' and 'exclude' attributes") except ImproperlyConfigured as e: self.assertEqual(str(e), "Cannot set both `fields` and `exclude`.") def test_serializer_gets_default_instance(self): serializer = self.serializer4(instance=None) assert isinstance(serializer.instance, SearchQuerySet), self.fail("Did not get default instance " "of type SearchQuerySet") def test_serializer_get_fields(self): from rest_framework.fields import CharField, IntegerField obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer4(instance=obj) fields = serializer.get_fields() assert isinstance(fields, dict), self.fail("serializer.data is not a dict") assert isinstance(fields["integer_field"], IntegerField), self.fail("serializer 'integer_field' field is not a IntegerField instance") assert isinstance(fields["text"], CharField), self.fail("serializer 'text' field is not a CharField instance") assert isinstance(fields["firstname"], CharField), self.fail("serializer 'firstname' field is not a CharField instance") assert isinstance(fields["lastname"], CharField), self.fail("serializer 'lastname' field is not a CharField instance") assert isinstance(fields["autocomplete"], CharField), self.fail("serializer 'autocomplete' field is not a CharField instance") def test_serializer_get_fields_with_exclude(self): from rest_framework.fields import CharField obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer5(instance=obj) fields = serializer.get_fields() assert isinstance(fields, dict), self.fail("serializer.data is not a dict") assert isinstance(fields["text"], CharField), self.fail("serializer 'text' field is not a CharField instance") assert "firstname" not in fields, self.fail("serializer 'firstname' should not be present") assert isinstance(fields["lastname"], CharField), self.fail("serializer 'lastname' field is not a CharField instance") assert isinstance(fields["autocomplete"], CharField), self.fail("serializer 'autocomplete' field is not a CharField instance") def test_serializer_get_fields_with_ignore_fields(self): from rest_framework.fields import CharField obj = SearchQuerySet().filter(lastname="Foreman")[0] serializer = self.serializer6(instance=obj) fields = serializer.get_fields() assert isinstance(fields, dict), self.fail("serializer.data is not a dict") assert isinstance(fields["text"], CharField), self.fail("serializer 'text' field is not a CharField instance") assert isinstance(fields["firstname"], CharField), self.fail("serializer 'firtname' field is not a CharField instance") assert isinstance(fields["lastname"], CharField), self.fail("serializer 'lastname' field is not a CharField instance") assert "autocomplete" not in fields, self.fail("serializer 'autocomplete' should not be present") class HaystackSerializerMultipleIndexTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() class Serializer1(HaystackSerializer): """ Regular multiple index serializer """ class Meta: index_classes = [MockPersonIndex, MockPetIndex] fields = ["text", "firstname", "lastname", "name", "species", "autocomplete"] class Serializer2(HaystackSerializer): """ Multiple index serializer with declared fields """ _MockPersonIndex__hair_color = serializers.CharField() extra = serializers.IntegerField() class Meta: index_classes = [MockPersonIndex, MockPetIndex] exclude = ["firstname"] def get__MockPersonIndex__hair_color(self): return "black" def get_extra(self): return 1 class Serializer3(HaystackSerializer): """ Multiple index serializer with index aliases """ class Meta: index_classes = [MockPersonIndex, MockPetIndex] exclude = ["firstname"] index_aliases = { 'mockapp.MockPersonIndex': 'People' } class ViewSet1(HaystackViewSet): serializer_class = Serializer1 class ViewSet2(HaystackViewSet): serializer_class = Serializer2 class ViewSet3(HaystackViewSet): serializer_class = Serializer3 self.serializer1 = Serializer1 self.serializer2 = Serializer2 self.serializer3 = Serializer3 self.view1 = ViewSet1 self.view2 = ViewSet2 self.view3 = ViewSet3 def tearDown(self): MockPersonIndex().clear() MockPetIndex().clear() def test_serializer_multiple_index_data(self): objs = SearchQuerySet().filter(text="John") serializer = self.serializer1(instance=objs, many=True) data = serializer.data assert len(data) == 4, self.fail("all objects are not present") for result in data: if "name" in result: assert "species" in result, self.fail("Pet results should have 'species' and 'name' fields") assert "firstname" not in result, self.fail("Pet results should have 'species' and 'name' fields") assert "lastname" not in result, self.fail("Pet results should have 'species' and 'name' fields") elif "firstname" in result: assert "lastname" in result, self.fail("Person results should have 'firstname' and 'lastname' fields") assert "name" not in result, self.fail("Person results should have 'firstname' and 'lastname' fields") assert "species" not in result, self.fail("Person results should have 'firstname' and 'lastname' fields") else: self.fail("Result should contain either Pet or Person fields") def test_serializer_multiple_index_declared_fields(self): objs = SearchQuerySet().filter(text="John") serializer = self.serializer2(instance=objs, many=True) data = serializer.data assert len(data) == 4, self.fail("all objects are not present") for result in data: if "name" in result: assert "extra" in result, self.fail("'extra' should be present in Pet results") assert "hair_color" not in result, self.fail("'hair_color' should not be present in Pet results") elif "lastname" in result: assert "extra" in result, self.fail("'extra' should be present in Person results") assert "hair_color" in result, self.fail("'hair_color' should be present in Person results") else: self.fail("Result should contain either Pet or Person fields") class HaystackSerializerHighlighterMixinTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class Serializer1(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "full_name"] class Serializer2(HighlighterMixin, HaystackSerializer): highlighter_html_tag = "div" highlighter_css_class = "my-fancy-highlighter" highlighter_field = "description" class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "description"] class Serializer3(Serializer2): highlighter_class = None class ViewSet1(SQHighlighterMixin, HaystackViewSet): serializer_class = Serializer1 class ViewSet2(HaystackViewSet): serializer_class = Serializer2 class ViewSet3(HaystackViewSet): serializer_class = Serializer3 self.viewset1 = ViewSet1 self.viewset2 = ViewSet2 self.viewset3 = ViewSet3 def tearDown(self): MockPersonIndex().clear() def test_serializer_qs_highlighting(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") response = self.viewset1.as_view(actions={"get": "list"})(request) response.render() for result in json.loads(response.content.decode()): self.assertTrue("highlighted" in result) self.assertEqual( result["highlighted"], " ".join(("Jeremy", "%s\n" % result["lastname"])) ) def test_serializer_qs_highlighter_gives_deprecation_warning(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") self.assertWarning(DeprecationWarning, self.viewset1.as_view(actions={"get": "list"}), request) def test_serializer_highlighting(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") response = self.viewset2.as_view(actions={"get": "list"})(request) response.render() for result in json.loads(response.content.decode()): self.assertTrue("highlighted" in result) self.assertEqual( result["highlighted"], " ".join(('<%(tag)s class="%(css_class)s">Jeremy' % { "tag": self.viewset2.serializer_class.highlighter_html_tag, "css_class": self.viewset2.serializer_class.highlighter_css_class }, "%s" % "is a nice chap!")) ) def test_serializer_highlighter_raise_no_highlighter_class(self): request = factory.get(path="/", data={"firstname": "jeremy"}, content_type="application/json") try: self.viewset3.as_view(actions={"get": "list"})(request) self.fail("Did not raise ImproperlyConfigured error when called without a serializer_class") except ImproperlyConfigured as e: self.assertEqual( str(e), "%(cls)s is missing a highlighter_class. Define %(cls)s.highlighter_class, " "or override %(cls)s.get_highlighter()." % {"cls": self.viewset3.serializer_class.__name__} ) class HaystackSerializerMoreLikeThisTestCase(APITestCase): fixtures = ["mockperson"] urls = "tests.test_serializers" def setUp(self): MockPersonIndex().reindex() def tearDown(self): MockPersonIndex().clear() def test_serializer_more_like_this_link(self): response = self.client.get( path="/search-person/", data={"firstname": "odysseus", "lastname": "cooley"}, format="json" ) self.assertTrue("results" in response.data) self.assertEqual( response.data["results"], [{ "lastname": "Cooley", "full_name": "Odysseus Cooley", "firstname": "Odysseus", "more_like_this": "http://testserver/search-person/18/more-like-this/" }] ) class HaystackFacetSerializerTestCase(TestCase): fixtures = ["mockperson"] urls = "tests.test_serializers" def setUp(self): MockPersonIndex().reindex() self.response = self.client.get( path="/search-person/facets/", data={}, format=json ) class FacetSerializer(HaystackFacetSerializer): serialize_objects = True class Meta: fields = ["firstname", "lastname"] field_options = { "firstname": {}, "lastname": {} } class Serializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ["firstname", "lastname", "full_name"] class ViewSet(HaystackViewSet): serializer_class = Serializer facet_serializer_class = FacetSerializer pagination_class = PageNumberPagination class Meta: index_models = [MockPerson] self.view = ViewSet def tearDown(self): MockPersonIndex().clear() @staticmethod def is_paginated_facet_response(response): """ Returns True if the response.data seems like a faceted result. Only works for responses created with the test client. """ return "objects" in response.data and \ all([k in response.data["objects"] for k in ("count", "next", "previous", "results")]) def test_serializer_facet_top_level_structure(self): for key in ("fields", "dates", "queries"): self.assertContains(self.response, key, count=1) def test_serializer_facet_field_result(self): fields = self.response.data["fields"] for field in ("firstname", "lastname"): self.assertTrue(field in fields) self.assertTrue(isinstance(fields[field], list)) self.assertEqual(len(fields["firstname"]), 88) self.assertEqual(len(fields["lastname"]), 97) firstname = fields["firstname"][0] self.assertTrue(all([k in firstname for k in ("text", "count", "narrow_url")])) self.assertEqual(firstname["text"], "John") self.assertEqual(firstname["count"], 3) self.assertEqual(firstname["narrow_url"], "/search-person/facets/?selected_facets=firstname_exact%3AJohn") lastname = fields["lastname"][0] self.assertTrue(all([k in lastname for k in ("text", "count", "narrow_url")])) self.assertEqual(lastname["text"], "Porter") self.assertEqual(lastname["count"], 2) self.assertEqual(lastname["narrow_url"], "/search-person/facets/?selected_facets=lastname_exact%3APorter") def test_serializer_facet_date_result(self): dates = self.response.data["dates"] self.assertTrue("created" in dates) self.assertEqual(len(dates["created"]), 1) created = dates["created"][0] self.assertTrue(all([k in created for k in ("text", "count", "narrow_url")])) self.assertEqual(created["text"], "2015-05-01T00:00:00") self.assertEqual(created["count"], 100) self.assertEqual(created["narrow_url"], "/search-person/facets/?selected_facets=created_exact%3A2015-05-01+00%3A00%3A00") def test_serializer_facet_queries_result(self): # Not Implemented pass def test_serializer_facet_narrow(self): response = self.client.get( path="/search-person/facets/", data=QueryDict("selected_facets=firstname_exact:John&selected_facets=lastname_exact:McClane"), format="json" ) self.assertEqual(response.data["queries"], {}) self.assertTrue([all(("firstname", "lastname" in response.data["fields"]))]) self.assertEqual(len(response.data["fields"]["firstname"]), 1) self.assertEqual(response.data["fields"]["firstname"][0]["text"], "John") self.assertEqual(response.data["fields"]["firstname"][0]["count"], 1) self.assertEqual(response.data["fields"]["firstname"][0]["narrow_url"], ( "/search-person/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcClane" )) self.assertEqual(len(response.data["fields"]["lastname"]), 1) self.assertEqual(response.data["fields"]["lastname"][0]["text"], "McClane") self.assertEqual(response.data["fields"]["lastname"][0]["count"], 1) self.assertEqual(response.data["fields"]["lastname"][0]["narrow_url"], ( "/search-person/facets/?selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcClane" )) self.assertTrue("created" in response.data["dates"]) self.assertEqual(len(response.data["dates"]), 1) self.assertEqual(response.data["dates"]["created"][0]["text"], "2015-05-01T00:00:00") self.assertEqual(response.data["dates"]["created"][0]["count"], 1) self.assertEqual(response.data["dates"]["created"][0]["narrow_url"], ( "/search-person/facets/?selected_facets=created_exact%3A2015-05-01+00%3A00%3A00" "&selected_facets=firstname_exact%3AJohn&selected_facets=lastname_exact%3AMcClane" )) def test_serializer_facet_include_objects(self): self.assertContains(self.response, "objects", count=1) def test_serializer_facet_include_paginated_objects(self): self.assertTrue(self.is_paginated_facet_response(self.response)) self.assertEqual(self.response.data["objects"]["next"], "http://testserver/search-person/facets/?page=2") self.assertEqual(self.response.data["objects"]["previous"], None) self.assertEqual(len(self.response.data["objects"]["results"]), 2) # `page_size` def test_serializer_faceted_and_paginated_response(self): response = self.client.get( path="/search-person/facets/", data=QueryDict("selected_facets=firstname_exact:John"), format="json" ) self.assertTrue(self.is_paginated_facet_response(response)) self.assertEqual(len(response.data["objects"]["results"]), 2) self.assertEqual(response.data["objects"]["count"], 3) self.assertEqual(response.data["objects"]["previous"], None) self.assertEqual(response.data["objects"]["next"], "http://testserver/search-person/facets/?page=2&selected_facets=firstname_exact%3AJohn") response = self.client.get( path="/search-person/facets/", data=QueryDict("page=2&selected_facets=firstname_exact:John") ) self.assertTrue(self.is_paginated_facet_response(response)) self.assertEqual(len(response.data["objects"]["results"]), 1) self.assertEqual(response.data["objects"]["count"], 3) self.assertEqual(response.data["objects"]["previous"], "http://testserver/search-person/facets/?selected_facets=firstname_exact%3AJohn") self.assertEqual(response.data["objects"]["next"], None) # Make sure that `page_query_param` is not included in the `narrow_url`. # It will make the pagination fail because when we narrow the queryset, the # pagination will have to be re-calculated. fields = response.data["fields"] firstname = fields["firstname"][0] self.assertFalse("page=2" in firstname["narrow_url"]) class HaystackSerializerMixinTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class MockPersonSerializer(serializers.ModelSerializer): class Meta: model = MockPerson fields = ('id', 'firstname', 'lastname', 'created', 'updated') read_only_fields = ('created', 'updated') class Serializer1(HaystackSerializerMixin, MockPersonSerializer): class Meta(MockPersonSerializer.Meta): search_fields = ['text', ] class Viewset1(HaystackViewSet): serializer_class = Serializer1 self.serializer1 = Serializer1 self.viewset1 = Viewset1 def tearDown(self): MockPersonIndex().clear() def test_serializer_mixin(self): objs = SearchQuerySet().filter(text="Foreman") serializer = self.serializer1(instance=objs, many=True) self.assertEqual( json.loads(json.dumps(serializer.data)), [{ "id": 1, "firstname": "Abel", "lastname": "Foreman", "created": "2015-05-19T10:48:08.686000Z", "updated": "2015-05-19T10:48:08.686000Z" }] ) class HaystackMultiSerializerTestCase(WarningTestCaseMixin, TestCase): fixtures = ["mockperson", "mockpet"] def setUp(self): MockPersonIndex().reindex() MockPetIndex().reindex() class MockPersonSerializer(HaystackSerializer): class Meta: index_classes = [MockPersonIndex] fields = ('text', 'firstname', 'lastname', 'description') class MockPetSerializer(HaystackSerializer): class Meta: index_classes = [MockPetIndex] exclude = ('description', 'autocomplete') class Serializer1(HaystackSerializer): class Meta: serializers = { MockPersonIndex: MockPersonSerializer, MockPetIndex: MockPetSerializer } self.serializer1 = Serializer1 def tearDown(self): MockPersonIndex().clear() MockPetIndex().clear() def test_multi_serializer(self): objs = SearchQuerySet().filter(text="Zane") serializer = self.serializer1(instance=objs, many=True) self.assertEqual( json.loads(json.dumps(serializer.data)), [{ "text": "Zane", "name": "Zane", "species": "Dog" }, { "text": "Zane Griffith\n", "firstname": "Zane", "lastname": "Griffith", "description": "Zane is a nice chap!" }] ) drf-haystack-1.5.6/tests/test_utils.py0000644000076500000240000000274012604553472020116 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from __future__ import absolute_import, unicode_literals from django.test import TestCase from drf_haystack.utils import merge_dict class MergeDictTestCase(TestCase): def setUp(self): self.dict_a = { "person": { "lastname": "Holmes", "combat_proficiency": [ "Pistol", "boxing" ] }, } self.dict_b = { "person": { "gender": "male", "firstname": "Sherlock", "location": { "address": "221B Baker Street" }, "combat_proficiency": [ "sword", "Martial arts", ] } } def test_utils_merge_dict(self): self.assertEqual(merge_dict(self.dict_a, self.dict_b), { "person": { "gender": "male", "firstname": "Sherlock", "lastname": "Holmes", "location": { "address": "221B Baker Street" }, "combat_proficiency": [ "Martial arts", "Pistol", "boxing", "sword", ] } }) def test_utils_merge_dict_invalid_input(self): self.assertEqual(merge_dict(self.dict_a, "I'm not a dict!"), "I'm not a dict!") drf-haystack-1.5.6/tests/test_viewsets.py0000644000076500000240000002472712604553472020640 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- # # Unit tests for the `drf_haystack.viewsets` classes. # from __future__ import absolute_import, unicode_literals from django.test import TestCase from django.contrib.auth.models import User from haystack.query import SearchQuerySet from rest_framework import status from rest_framework.routers import SimpleRouter from rest_framework.serializers import Serializer from rest_framework.test import force_authenticate, APIRequestFactory from drf_haystack.viewsets import HaystackViewSet from drf_haystack.serializers import HaystackFacetSerializer from .mockapp.models import MockPerson from .mockapp.search_indexes import MockPersonIndex factory = APIRequestFactory() class HaystackViewSetTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() self.router = SimpleRouter() class FacetSerializer(HaystackFacetSerializer): class Meta: fields = ["firstname", "lastname", "created"] class ViewSet(HaystackViewSet): serializer_class = Serializer facet_serializer_class = FacetSerializer self.view = ViewSet def tearDown(self): MockPersonIndex().clear() def test_viewset_get_queryset_no_queryset(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_queryset(self): setattr(self.view, "queryset", SearchQuerySet().all()) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_index_models(self): setattr(self.view, "index_models", [MockPerson]) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_object(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "retrieve"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_obj_raise_404(self): request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "retrieve"})(request, pk=100000) self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) def test_viewset_get_object_invalid_lookup_field(self): request = factory.get(path="/", data="", content_type="application/json") self.assertRaises( AttributeError, self.view.as_view(actions={"get": "retrieve"}), request, invalid_lookup=1 ) def test_viewset_get_obj_override_lookup_field(self): setattr(self.view, "lookup_field", "custom_lookup") request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "retrieve"})(request, custom_lookup=1) setattr(self.view, "lookup_field", "pk") self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_more_like_this_decorator(self): route = self.router.get_routes(self.view)[2:].pop() self.assertEqual(route.url, "^{prefix}/{lookup}/more-like-this{trailing_slash}$") self.assertEqual(route.mapping, {"get": "more_like_this"}) def test_viewset_more_like_this_action_route(self): request = factory.get(path="/", data={}, content_type="application/json") response = self.view.as_view(actions={"get": "more_like_this"})(request, pk=1) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_facets_action_route(self): request = factory.get(path="/", data={}, content_type="application/json") response = self.view.as_view(actions={"get": "facets"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) class HaystackViewSetPermissionsTestCase(TestCase): fixtures = ["mockperson"] def setUp(self): MockPersonIndex().reindex() class ViewSet(HaystackViewSet): serializer_class = Serializer self.view = ViewSet self.user = User.objects.create_user(username="user", email="user@example.com", password="user") self.admin_user = User.objects.create_superuser(username="admin", email="admin@example.com", password="admin") def tearDown(self): MockPersonIndex().clear() def test_viewset_get_queryset_with_AllowAny_permission(self): from rest_framework.permissions import AllowAny setattr(self.view, "permission_classes", (AllowAny, )) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAuthenticated_permission(self): from rest_framework.permissions import IsAuthenticated setattr(self.view, "permission_classes", (IsAuthenticated, )) request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN) force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAdminUser_permission(self): from rest_framework.permissions import IsAdminUser setattr(self.view, "permission_classes", (IsAdminUser,)) request = factory.get(path="/", data="", content_type="application/json") force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN) force_authenticate(request, user=self.admin_user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) def test_viewset_get_queryset_with_IsAuthenticatedOrReadOnly_permission(self): from rest_framework.permissions import IsAuthenticatedOrReadOnly setattr(self.view, "permission_classes", (IsAuthenticatedOrReadOnly,)) # Unauthenticated GET requests should pass request = factory.get(path="/", data="", content_type="application/json") response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) # Authenticated GET requests should pass request = factory.get(path="/", data="", content_type="application/json") force_authenticate(request, user=self.user) response = self.view.as_view(actions={"get": "list"})(request) self.assertEqual(response.status_code, status.HTTP_200_OK) # POST, PUT, PATCH and DELETE requests are not supported, so they will # raise an error. No need to test the permission. def test_viewset_get_queryset_with_DjangoModelPermissions_permission(self): from rest_framework.permissions import DjangoModelPermissions setattr(self.view, "permission_classes", (DjangoModelPermissions,)) # The `DjangoModelPermissions` is not supported and should raise an # AssertionError from rest_framework.permissions. request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError or AttributeError " "when calling HaystackView with DjangoModelPermissions") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") def test_viewset_get_queryset_with_DjangoModelPermissionsOrAnonReadOnly_permission(self): from rest_framework.permissions import DjangoModelPermissionsOrAnonReadOnly setattr(self.view, "permission_classes", (DjangoModelPermissionsOrAnonReadOnly,)) # The `DjangoModelPermissionsOrAnonReadOnly` is not supported and should raise an # AssertionError from rest_framework.permissions. request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError when calling HaystackView " "with DjangoModelPermissionsOrAnonReadOnly") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") def test_viewset_get_queryset_with_DjangoObjectPermissions_permission(self): from rest_framework.permissions import DjangoObjectPermissions setattr(self.view, "permission_classes", (DjangoObjectPermissions,)) # The `DjangoObjectPermissions` is a subclass of `DjangoModelPermissions` and # therefore unsupported. request = factory.get(path="/", data="", content_type="application/json") try: self.view.as_view(actions={"get": "list"})(request) self.fail("Did not fail with AssertionError when calling HaystackView with DjangoModelPermissions") except (AttributeError, AssertionError) as e: if isinstance(e, AttributeError): self.assertEqual(str(e), "'SearchQuerySet' object has no attribute 'model'") else: self.assertEqual(str(e), "Cannot apply DjangoModelPermissions on a view that does " "not have `.model` or `.queryset` property.") drf-haystack-1.5.6/tests/urls.py0000644000076500000240000000077512567040266016712 0ustar rolfstaff00000000000000# -*- coding: utf-8 -*- from django.conf.urls import patterns, include, url from rest_framework import routers from .mockapp.views import SearchViewSet1, SearchViewSet2, SearchViewSet3 router = routers.DefaultRouter() router.register("search1", viewset=SearchViewSet1, base_name="search1") router.register("search2", viewset=SearchViewSet2, base_name="search2") router.register("search3", viewset=SearchViewSet3, base_name="search3") urlpatterns = patterns( "", url(r"^", include(router.urls)) ) drf-haystack-1.5.6/tests/wsgi.py0000644000076500000240000000060112537476535016673 0ustar rolfstaff00000000000000""" WSGI config for tests project. It exposes the WSGI callable as a module-level variable named ``application``. For more information on this file, see https://docs.djangoproject.com/en/1.7/howto/deployment/wsgi/ """ import os os.environ.setdefault("DJANGO_SETTINGS_MODULE", "tests.settings") from django.core.wsgi import get_wsgi_application application = get_wsgi_application() drf-haystack-1.5.6/tox.ini0000644000076500000240000000433612627546164015526 0ustar rolfstaff00000000000000[tox] envlist = docs, py26-django1.5, py26-django1.6, py27-django1.5, py27-django1.6, py27-django1.7, py27-django1.8, py33-django1.5, py33-django1.6, py33-django1.7, py33-django1.8, py34-django1.5, py34-django1.6, py34-django1.7, py34-django1.8, [base] deps = elasticsearch<2.0.0 [django1.5] deps = Django>=1.5,<1.6 djangorestframework<=3.2.0 [django1.6] deps = Django>=1.6,<1.7 djangorestframework<=3.2.0 [django1.7] deps = Django>=1.7,<1.8 [django1.8] deps = Django>=1.8,<1.9 [testenv] commands = python {toxinidir}/setup.py test [testenv:docs] changedir = docs deps = sphinx sphinx-rtd-theme commands = sphinx-build -W -b html -d {envtmpdir}/doctrees . {envtmpdir}/html [testenv:py26-django1.5] basepython = python2.6 deps = geopy==0.99 {[django1.5]deps} {[base]deps} [testenv:py26-django1.6] basepython = python2.6 deps = geopy==0.99 {[django1.6]deps} {[base]deps} [testenv:py27-django1.5] basepython = python2.7 deps = geopy {[django1.5]deps} {[base]deps} [testenv:py27-django1.6] basepython = python2.7 deps = geopy {[django1.6]deps} {[base]deps} [testenv:py27-django1.7] basepython = python2.7 deps = geopy {[django1.7]deps} {[base]deps} [testenv:py27-django1.8] basepython = python2.7 deps = geopy {[django1.8]deps} {[base]deps} [testenv:py33-django1.5] basepython = python3.3 deps = geopy {[django1.5]deps} {[base]deps} [testenv:py33-django1.6] basepython = python3.3 deps = geopy {[django1.6]deps} {[base]deps} [testenv:py33-django1.7] basepython = python3.3 deps = geopy {[django1.7]deps} {[base]deps} [testenv:py33-django1.8] basepython = python3.3 deps = geopy {[django1.8]deps} {[base]deps} [testenv:py34-django1.5] basepython = python3.4 deps = geopy {[django1.5]deps} {[base]deps} [testenv:py34-django1.6] basepython = python3.4 deps = geopy {[django1.6]deps} {[base]deps} [testenv:py34-django1.7] basepython = python3.4 deps = geopy {[django1.7]deps} {[base]deps} [testenv:py34-django1.8] basepython = python3.4 deps = geopy {[django1.8]deps} {[base]deps}