I have a simple logstash grok filter:
filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:name} %{WORD:level} %{SPACE} %{GREEDYDATA:message}$" }
    overwrite => [ "message" ]
  }
}
This works and parses my logs, but according to Kibana, the parsed timestamp values are output with data type string. The Logstash @timestamp field, by contrast, has data type date.
The grok documentation says you can specify a data type conversion, but only int and float are supported:
If you wish to convert a semantic’s data type, for example change a string to an integer then suffix it with the target data type. For example %{NUMBER:num:int} which converts the num semantic from a string to an integer. Currently the only supported conversions are int and float.
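For illustration, the documented suffix syntax looks like this in a filter (a sketch only; the num field and pattern are hypothetical examples, not my actual config):

```
filter {
  grok {
    # :int suffix converts the captured value from string to integer
    match => { "message" => "^%{NUMBER:num:int} %{GREEDYDATA:rest}$" }
  }
}
```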
That suggests I'm supposed to leave it as a string. However, if the index supports datetime values, why would I not want the timestamp properly stored and sortable as a datetime?