4
votes

The timestamps in my logs are in the format below:

2016-04-07 18:11:38.169, which is yyyy-MM-dd HH:mm:ss.SSS

This log file is not a live one (it is a stored/old file), and I am trying to replace the Logstash @timestamp value with this timestamp so that the Kibana visualizations reflect the actual log time.

My filter in Logstash is as below:

grok {
    match => {
        "message" => [ "(?<timestamp>(\d){4}-(\d){2}-(\d){2} (\d){2}:(\d){2}:(\d){2}.(\d){3}) %{SYSLOG5424SD} ERROR u%{BASE16FLOAT}.%{JAVACLASS} - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process \:\: %{NUMBER:responseTime:int}" ]
    }
}

date {
    match => [ "timestamp:date" , "yyyy-MM-dd HH:mm:ss.SSS Z" ]
    timezone => "UTC"
    target => "@timestamp"
}

But it is not replacing the @timestamp value. The resulting JSON is:

{
  "_index": "logstash-2017.02.09",
  "_type": "logs",
  "_id": "AVoiZq2ITxwgj2avgkZa",
  "_score": null,
  "_source": {
    "path": "D:\\SoftsandTools\\Kibana\\Logs_ActualTimetakentoprocess.log",
    "@timestamp": "2017-02-09T10:23:58.778Z", **logstash @timestamp**
    "responseTime": 43,
    "@version": "1",
    "host": "4637",
    "message": "2016-04-07 18:07:01.809 [SimpleAsyncTaskExecutor-3] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process :: 43",
    "timestamp": "2016-04-07 18:07:01.809"   **Mine time stamp**
  }

Sample log line -

2016-04-07 18:11:38.171 [SimpleAsyncTaskExecutor-1] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::"2b948ed5-12c0-4ae0-9b99-f1ee01191001"- Actual Time taken to process :: 521

Could you please help and let me know where I am going wrong here?

2 Comments

Please add a sample log line. - Fairy
Thanks for the response; updated. - Vishwa

2 Answers

7
votes

You should basically have a grok match in order to use the timestamp of your log line:

grok {
    patterns_dir => ["give your path/patterns"]
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}%{GREEDYDATA}" }          
}

In your pattern file, make sure to have the pattern which matches your timestamp in the log, which could look something like this:

LOGTIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}
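
For instance, if patterns_dir points at a directory such as ./patterns (a hypothetical path used here for illustration), the pattern file is just a plain-text file inside that directory, one NAME PATTERN pair per line:

# ./patterns/custom_patterns  (hypothetical file name; grok loads every file found in patterns_dir)
LOGTIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}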

And then, once you've done the grok filtering, you can use the extracted value like this:

mutate {
    add_field => { "newtimestamp" => "%{logtimestamp}" }
    remove_field => ["logtimestamp"]
}
date {
    match => [ "newtimestamp" , "ISO8601" , "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"  <-- the timestamp which you wanted to apply on
    locale => "en"
    timezone => "UTC"
}
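
Putting the pieces together for the sample log line in the question, a minimal end-to-end filter might look like the sketch below (untested; it assumes the hypothetical ./patterns directory with the LOGTIMESTAMP pattern from above, and it only extracts the timestamp):

filter {
    grok {
        patterns_dir => ["./patterns"]   # hypothetical directory containing the LOGTIMESTAMP pattern
        match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}%{GREEDYDATA}" }
    }
    mutate {
        add_field => { "newtimestamp" => "%{logtimestamp}" }
        remove_field => ["logtimestamp"]
    }
    date {
        # parse the captured text and overwrite @timestamp with it
        match => [ "newtimestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
        target => "@timestamp"
        locale => "en"
        timezone => "UTC"
    }
}

With that in place, @timestamp should reflect the time in the log line rather than the time of ingestion.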

Hope this helps!

0
votes

You can use the date filter plugin of Logstash:

date {
    match => ["timestamp", "UNIX"]
}
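
One hedged caveat: UNIX matches epoch-seconds values, so for the yyyy-MM-dd HH:mm:ss.SSS timestamps shown in the question the explicit format would presumably be needed instead, along the lines of:

date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
    timezone => "UTC"
}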