0 votes

I am new to the ELK stack and have it set up with Elasticsearch 1.4.4, Logstash 1.4.2, and Kibana 4. I am able to pull a CSV file into Elasticsearch using Logstash and have it display in Kibana.

When displaying a date from the file, the value is split apart as if the dash it contains were a separator (e.g., a field value of 01-01-2015 is displayed in Kibana, regardless of display type, as three field entries: 01, 01, and 2015). Kibana gives a message that this is due to it being an analyzed field.
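(This splitting can be reproduced outside Kibana with the analyze API; a minimal sketch, assuming the field goes through the default standard analyzer:)

curl -XGET 'http://localhost:9200/_analyze?analyzer=standard' -d '01-01-2015'
# returns three tokens: 01, 01, and 2015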

Kibana 4 has a feature to apply raw JSON directly from the dashboard builder (Visualization), which should change this to a non-analyzed field so that the entire string is used rather than split apart.

I have tried multiple formats, but this is the one that seems like it should work, as Kibana recognizes it as valid syntax:

{ "index" : "not_analyzed" }

but when attempting to apply the change, the visualization does not change its structure and Kibana generates the following exception:

Visualize: Request to Elasticsearch failed: {"error":"SearchPhaseExecutionException[Failed to execute phase [query], all shards failed; shardFailures {[ftpEMbcOTxu0Tdf0e8i-Ig][csvtest][0]: SearchParseException[[csvtest][0]: query[ConstantScore(BooleanFilter(+cache(@timestamp:[1420092000000 TO 1451627999999])))],from[-1],size[0]: Parse Failure [Failed to parse source [{\"query\":{\"filtered\":{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":{\"bool\":{\"must\":[{\"range\":{\"@timestamp\":{\"gte\":1420092000000,\"lte\":1451627999999}}}],\"must_not\":[]}}}},\"size\":0,\"aggs\":{\"2\":{\"terms\":{\"field\":\"Conn Dt\",\"size\":100,\"order\":{\"1\":\"desc\"},\"index\":\"not_analyzed\"},\"aggs\":{\"1\":{\"cardinality\":{\"field\":\"Area Cd\"}}}}}}]]]; nested: SearchParseException[[csvtest][0]: query[ConstantScore(BooleanFilter(+cache(@timestamp:[1420092000000 TO 1451627999999])))],from[-1],size[0]: Parse Failure [Unknown key for a VALUE_STRING in [2]: [index].]]; }]"}

(identical SearchParseException failures follow for shards [1] through [4])
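The relevant part of the failed request is the terms aggregation, into which the JSON Input was merged verbatim (reformatted from the error above):

"aggs": {
  "2": {
    "terms": {
      "field": "Conn Dt",
      "size": 100,
      "order": { "1": "desc" },
      "index": "not_analyzed"
    },
    "aggs": { "1": { "cardinality": { "field": "Area Cd" } } }
  }
}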

Within the request it can be seen that index was changed from analyzed to not_analyzed. The analyze_wildcard: true setting was also changed to false within the advanced object configuration, with the same result.


2 Answers

1 vote

Try setting the index mapping so that the date field is not_analyzed.

For example:

"<index name>": {
   "mappings": {
      "<Mapping type>": {
         "properties": {
            "City": {
               "type": "string",
               "index": "not_analyzed"
            },
            "Date": {
               "type": "string",
               "index": "not_analyzed"
            }
         }
      }
   }
}
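As a sketch of how to apply it (the index name csvtest comes from the error in the question; the mapping type logs is a placeholder): the mapping has to be in place before the data is indexed, since an existing field cannot be switched from analyzed to not_analyzed without reindexing.

curl -XPUT 'http://localhost:9200/csvtest' -d '{
  "mappings": {
    "logs": {
      "properties": {
        "Conn Dt": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'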
0 votes

I had a similar issue today with the following message:

Parse Failure [Unknown key for a VALUE_STRING in [logTime]: [offset].]]; }]

I was sending a date histogram aggregation request against Elasticsearch 1.4.5 with the following payload:

['logTime'].forEach(function (field) {
    body.aggregations[field] = {
        date_histogram: {
            field: field,
            interval: 'week',
            time_zone: '+00:00',
            offset: '15h', // this parameter requires Elasticsearch >= 1.5.0
            min_doc_count: 0,
            extended_bounds: {
                min: 1440946800000,
                max: 1441551599999
            }
        }
    };
});

Note the use of the offset parameter in the date_histogram. This parameter was only introduced in Elasticsearch 1.5.0, so my 1.4.5 ES was complaining that the offset key was unknown.

Replacing it with post_offset as follows solved the problem, though I had to adjust the value of the time_zone parameter as well. As a side note, post_offset is deprecated and replaced by offset as of v1.5.

['logTime'].forEach(function (field) {
    body.aggregations[field] = {
        date_histogram: {
            field: field,
            interval: 'week',
            time_zone: '+09:00', // adjusted to compensate for the post_offset shift
            post_offset: '-9h',  // pre-1.5 equivalent of the offset parameter
            min_doc_count: 0,
            extended_bounds: {
                min: 1440946800000,
                max: 1441551599999
            }
        }
    };
});
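
If you are unsure which parameter your cluster supports, the root endpoint reports the Elasticsearch version (assuming it listens on the default localhost:9200):

curl -XGET 'http://localhost:9200/'
# the response includes "version": { "number": "1.4.5", ... }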