5
votes

I'm new to ElasticSearch and Kibana and am having trouble getting Kibana to recognise my timestamps.

I have a JSON file with lots of data that I wish to insert into Elasticsearch using Curl. Here is an example of one of the JSON entries.

{"index":{"_id":"63"}}
{"account_number":63,"firstname":"Hughes","lastname":"Owens", "email":"[email protected]", "_timestamp":"2013-07-05T08:49:30.123"}

I have tried to create an index in Elasticsearch using the command:

curl -XPUT 'http://localhost:9200/test/'

I have then tried to set up an appropriate mapping for the timestamp:

curl -XPUT 'http://localhost:9200/test/container/_mapping' -d'
{
"container" : {
"_timestamp" : {"enabled": true, "type":"date", "format": "date_hour_minute_second_fraction", "store":true}
}
}'

// format of timestamp from http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/mapping-date-format.html

I then have tried to bulk insert my data:

curl -XPOST 'localhost:9200/test/container/_bulk?pretty' --data-binary @myfile.json

All of these commands run without fault, however when the data is viewed in Kibana the _timestamp field is not recognised. Sorting by the timestamp does not work, and filtering the data by different time periods does not work either. Any ideas on why this problem is occurring would be appreciated.


3 Answers

7
votes

Managed to solve the problem. So for anyone else having this problem:

The format we had saved our date in was incorrect; it needed to be:

"_timestamp":"2013-07-05 08:49:30.123"

then our mapping needed to be:

curl -XPUT 'http://localhost:9200/test/container/_mapping' -d'
{
"container" : {
"_timestamp" : {"enabled": true, "type":"date", "format": "yyyy-MM-dd HH:mm:ss.SSS", "store":true, "path" : "_timestamp"}
}
}'
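
For completeness, a matching bulk entry based on the question's sample document would then look like this (same fields as the question; only the _timestamp value is reformatted):

{"index":{"_id":"63"}}
{"account_number":63,"firstname":"Hughes","lastname":"Owens", "email":"[email protected]", "_timestamp":"2013-07-05 08:49:30.123"}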

Hope this helps someone.

6
votes

There is no need to build an ISO 8601 date if you have an epoch timestamp. For Kibana to recognize the field as a date, though, it has to be mapped as a date field.

Please note that you have to set the field as a date type BEFORE you put any data into the /index/type. Otherwise it will be stored as a long and cannot be changed.

A simple example that can be pasted into the Marvel/Sense plugin:

# Make sure the index isn't there
DELETE /logger

# Create the index
PUT /logger

# Add the mapping of properties to the document type `mem`
PUT /logger/_mapping/mem
{
  "mem": {
    "properties": {
      "timestamp": {
        "type": "date"
      },
      "free": {
         "type": "long"
      }
    }
  }
}

# Inspect the newly created mapping
GET /logger/_mapping/mem

Run each of these commands in sequence.

Generate free mem logs

Here is a simple script that echoes to your terminal and logs to your local Elasticsearch:

while (( 1==1 )); do memfree=`free -b|tail -n 1|tr -s ' ' ' '|cut -d ' ' -f4`; echo $memfree; curl -XPOST "localhost:9200/logger/mem" -d "{ \"timestamp\": `date +%s%3N`, \"free\": $memfree }"; sleep 1; done
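
Each iteration indexes a document of roughly this shape (illustrative values; `date +%s%3N` yields epoch milliseconds, which the date field accepts):

{ "timestamp": 1404810660123, "free": 2147483648 }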

Inspect the data in Elasticsearch

Paste this into Marvel/Sense:

GET /logger/mem/_search

Now you can move to Kibana and do some graphs. Kibana will autodetect your date field.

-1
votes

This solution works for older ES versions (< 2.4). For newer versions of ES you can use the "date" field type along with the parameters described here: https://www.elastic.co/guide/en/elasticsearch/reference/current/date.html
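
As a rough sketch for ES 7+ (where mapping types are gone), a date field that accepts both the question's format and epoch milliseconds might be mapped like this (index and field names are illustrative):

PUT /test
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss.SSS||epoch_millis"
      }
    }
  }
}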