
I'm using the elasticsearch_autocomplete gem for an autocomplete feature.

I have a problem with special characters: ñ and accented vowels (áéíóú).

Model:

class Car
  include Mongoid::Document  # using Mongoid rather than ActiveRecord

  ac_field :name, :description, :city, :skip_settings => true

  def self.ac_search(params, options={})
    tire.search load: true, page: params[:page], per_page: 9 do
      query do
        boolean do
          must { string params[:query], default_operator: "AND" } if params[:query].present?
          must { term :city, params[:city] } if params[:city].present?
        end
      end
      filter :term, city: params[:city] if params[:city].present?
      facet "city" do
        terms :city
      end
    end
  end
end

This version works fine with special characters.

E.g., querying for "Martin" returns all results containing Martín, martín, martin, and Martin.
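(That behaviour comes from asciifolding being applied at index time. I haven't inspected the gem's AC_BASE settings, but conceptually they include something like the following Tire settings block; the analyzer name "folded" and the exact filter list are my assumptions, not the gem's actual configuration:)

# Sketch only: an analyzer that lowercases and strips accents,
# so "Martín" is indexed as "martin"
settings analysis: {
  analyzer: {
    folded: {
      type:      'custom',
      tokenizer: 'standard',
      filter:    %w(lowercase asciifolding)
    }
  }
} do
  mapping do
    indexes :city, :type => 'string', :analyzer => 'folded'
  end
end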

The problem with this approach is that multi-word values get split into individual words. E.g., a city tagged ["San Francisco", "Madrid"] ends up as three separate terms. Similarly, a term query on "san francisco" (must { term :city, params[:city] }) fails, while a query on "San" or "Francisco" succeeds. The desired behaviour is for the tag to be atomic, so only a "San Francisco" (or "Madrid") tag search should succeed.
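(You can see the tokenization directly by asking Elasticsearch's analyze API; a quick check, assuming ES runs on localhost:9200:)

require 'net/http'
require 'json'

# Ask ES how the standard analyzer splits the value
uri = URI('http://localhost:9200/_analyze?analyzer=standard&text=San%20Francisco')
tokens = JSON.parse(Net::HTTP.get(uri))['tokens'].map { |t| t['token'] }
# => ["san", "francisco"] -- two separate terms, so an exact term lookup
#    on "San Francisco" never matches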

To fix this problem I created a custom mapping:

model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :index => :not_analyzed
  end
end
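One caveat worth noting (my addition, not from the original post): mapping changes only apply to a freshly created index, so the index has to be rebuilt after this change, e.g. with Tire's import task:

rake environment tire:import CLASS='Car' FORCE=true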

With this mapping the multi-word problem is fixed, and facets on the city field now work correctly:

Instead of getting the facets San and Francisco, I now get San Francisco.

Now the problem is that, with this mapping in the model, the search no longer finds results with special characters.

E.g., querying for "Martin" returns only results containing Martin and martin.

I'm using Mongoid instead of ActiveRecord.

How can I fix this problem? I think the issue is with the asciifolding token filter.
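One way to get both behaviours at once (an untested sketch on my part, not something from the post) is a custom analyzer that keeps the whole value as a single token via the keyword tokenizer but still applies lowercase and asciifolding. Assuming AC_BASE is a plain settings hash and ActiveSupport's deep_merge is available:

model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE.deep_merge(
  analysis: {
    analyzer: {
      # one token for the whole value, lowercased and accent-folded;
      # the analyzer name "city_exact" is my own
      city_exact: {
        type:      'custom',
        tokenizer: 'keyword',
        filter:    %w(lowercase asciifolding)
      }
    }
  }
) do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :analyzer => 'city_exact'
  end
end

Note that a term query bypasses analysis, so the query side needs the same folding: either downcase and transliterate params[:city] before the term query, or use an analyzed query type against the field.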

Did you solve your problem yet? Are you plugging in test words in Ruby, or could this be a POST issue? – tim peterson

1 Answer


I fixed the problem with:

class User
  include Mongoid::Document
  field :city, :type => String
  has_one :car
end

class Car
  include Mongoid::Document
  belongs_to :user  # needed so user_city below can call user.city

  ac_field :name, :description, :user_city, :skip_settings => true

  def self.ac_search(params, options={})
    tire.search load: true, page: params[:page], per_page: 9 do
      query do
        boolean do
          must { term :user_city, params[:user_city] } if params[:user_city].present?
        end
      end
      facet "cities" do
        terms :user_city
      end
    end
  end

  model = self
  settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
    mapping _source: {enabled: true, includes: %w(user_city name description)} do
      indexes :user_city, :type => 'string', :index => :not_analyzed
    end
  end

  # Serialize the associated user's city into the index document
  def to_indexed_json
    to_json(methods: [:user_city])
  end

  def user_city
    user.city
  end
end
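For reference, a search with this setup would look something like this (parameter names taken from the code above):

Car.ac_search(user_city: 'San Francisco', page: 1)
# the not_analyzed mapping makes the term query and the "cities" facet
# treat "San Francisco" as one atomic value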