I'm using the elasticsearch_autocomplete gem for an autocomplete feature. I have a problem with special characters like ñ and accents (áéíóú).
Model:
class Car
  ac_field :name, :description, :city, :skip_settings => true

  def self.ac_search(params, options={})
    tire.search load: true, page: params[:page], per_page: 9 do
      query do
        boolean do
          must { string params[:query], default_operator: "AND" } if params[:query].present?
          must { term :city, params[:city] } if params[:city].present?
        end
      end
      filter :term, city: params[:city] if params[:city].present?
      facet "city" do
        terms :city
      end
    end
  end
end
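For reference, I call it from the controller roughly like this (hypothetical action, params come straight from the request):

# Hypothetical controller usage; params may contain :query, :city and :page
@cars = Car.ac_search(params)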
This version works fine with special characters. For example, a query for Martin returns all results containing Martín, martín, martin and Martin.
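I assume this works because the gem's default autocomplete analyzer runs lowercase and asciifolding filters at index and search time, so Martín and martin end up as the same token. Roughly this kind of analysis settings (my own sketch for illustration, not the gem's actual AC_BASE constant):

# Sketch of folded-analyzer settings (assumed, for illustration only)
settings analysis: {
  analyzer: {
    folded: {
      type:      "custom",
      tokenizer: "standard",
      filter:    ["lowercase", "asciifolding"]  # "Martín" -> "martin"
    }
  }
}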
The problem with this approach is that the city values are indexed as individual words. For example, a city tagged ["San Francisco", "Madrid"] will end up having three separate tags. Similarly, a term query on "san francisco" (must { term 'city', params[:city] }) will fail, while a query on "San" or "Francisco" will succeed. The desired behaviour here is that the tag should be atomic, so only a "San Francisco" (or "Madrid") tag search should succeed.
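To make the mismatch concrete, this is how I understand it (illustrative values only):

# With an analyzed city field the value is split into separate tokens:
#   "San Francisco"  ->  ["san", "francisco"]
# A term query is not analyzed, so it has to match one whole token exactly:
must { term :city, "san francisco" }   # looks for a single token "san francisco" -> no match
must { term :city, "francisco" }       # matches the "francisco" token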
To fix this problem I created a custom mapping in the model:
model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :index => :not_analyzed
  end
end
With this mapping the multi-word problem is fixed, and facets on the city field now work correctly: instead of getting the facet terms San and Francisco, I now get San Francisco.
Now the problem is that, with this mapping in the model, the search no longer finds results with special characters. For example, a query for Martin only returns results containing Martin and martin; Martín is no longer matched.
I'm using Mongoid instead of ActiveRecord.
How can I fix this problem? I think the issue is with the asciifolding token filter.
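Is the right direction something like keeping the city value as a single token but still folding accents? For example (untested sketch, the analyzer name city_folded_keyword is mine, and I'm merging into the gem's AC_BASE settings with ActiveSupport's deep_merge):

# Untested sketch: the "keyword" tokenizer keeps "San Francisco" as one token,
# while lowercase + asciifolding still normalize "Martín" to "martin".
city_analysis = {
  analysis: {
    analyzer: {
      city_folded_keyword: {
        type:      "custom",
        tokenizer: "keyword",
        filter:    ["lowercase", "asciifolding"]
      }
    }
  }
}

model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE.deep_merge(city_analysis) do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :analyzer => 'city_folded_keyword'
  end
end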