
Can somebody tell me what I'm doing wrong, or why Logstash doesn't want to parse an ISO8601 timestamp?

The error message I get is:

Failed action ... "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2017-03-24 12:14:50\" is malformed at \"17-03-24 12:14:50\""}}
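For context: Elasticsearch's default date mapping format (strict_date_optional_time||epoch_millis) expects a T between the date and the time, so a space-separated value like the one in the error is rejected on the Elasticsearch side even though Logstash's own date filter accepts it. A minimal Python sketch of the mismatch (Python stands in for Elasticsearch's Joda-Time parser here; the patterns below are illustrative assumptions, not the exact server-side formats):

```python
from datetime import datetime

ts = "2017-03-24 12:14:50"

# Roughly what Elasticsearch's strict ISO 8601 parser expects: a 'T' separator.
try:
    datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S")
    iso_ok = True
except ValueError:
    iso_ok = False  # rejected: the space is not a 'T'

# Roughly what the Logstash date filter pattern "yyyy-MM-dd HH:mm:ss" accepts.
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
```

So the Logstash side parses the value fine; it is the Elasticsearch mapping of the timestamp field that fails.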

Sample log file line (last octet of the IP address replaced with 000 on purpose):

2017-03-24 12:14:50 87.123.123.000 12345678.domain.com GET /smil:stream_17.smil/chunk_ctvideo_ridp0va0r600115_cs211711500_mpd.m4s - HTTP/1.1 200 750584 0.714 "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36" https://referrer.domain.com/video/2107 https fra1 "HIT, MISS" 12345678.domain.com

Grok pattern (verified with http://grokconstructor.appspot.com/do/match):

RAW %{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{IPV4:clientip}%{SPACE}%{HOSTNAME:http_host}%{SPACE}%{WORD:verb}%{SPACE}\/(.*:)?%{WORD:stream}%{NOTSPACE}%{SPACE}%{NOTSPACE}%{SPACE}%{WORD:protocol}\/%{NUMBER:httpversion}%{SPACE}%{NUMBER:response}%{SPACE}%{NUMBER:bytes}%{SPACE}%{SECOND:request_time}%{SPACE}%{QUOTEDSTRING:agent}%{SPACE}%{URI:referrer}%{SPACE}%{WORD}%{SPACE}%{WORD:location}%{SPACE}%{QUOTEDSTRING:cache_status}%{SPACE}%{WORD:account}%{GREEDYDATA}

Logstash configuration (input side):

input {
    file {
      path => "/subfolder/logs/*"
      type => "access_logs"
      start_position => "beginning"
    }
}
filter {
    # skip first two lines in log file with comments
    if [message] =~ /^#/ {
        drop { }
    }

    grok {
        patterns_dir => ["/opt/logstash/patterns"]
        match => { "message" => "%{RAW}" }
    }

    date {
        match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
        locale => "en"
    }

    # ... (rest of the config omitted for readability)
}
Can you post your Elasticsearch mapping for that index (specifically the field timestamp)? – fylie

1 Answer


I am pretty sure this is caused by the timestamp field being mapped in Elasticsearch to a type (or date format) that the value does not parse into. If you post your index mapping, I'd be happy to look at it.

A note: you can quickly work around this by adding remove_field. When the date filter succeeds, the value of that field is copied into @timestamp, so right now you are storing the same value in two fields. Dropping the original field means you no longer have to worry about its mapping. :)

date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
    locale => "en"
    remove_field => [ "timestamp" ]
}
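If you would rather keep the timestamp field, another option is to declare its format explicitly in the index mapping (or index template). A sketch, assuming the field is meant to be a date and using the question's access_logs type name; adapt the index/type names to your setup:

```json
{
  "mappings": {
    "access_logs": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}
```

Note that mapping changes on an existing field require reindexing; the remove_field approach above avoids that entirely.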