0 votes

I am using Logstash to read some logs. I have a log file whose timestamp consists of a time field only, i.e. 08:28:20,500, with no date field. I would like to map it to today's date. How should I do that with the date filter?

A line of my log file is like this.

 08:28:20,500 INFO  [org.jboss.as.connector.subsystems.datasources] (ServerService Thread Pool -- 27) JBAS010403: Deploying JDBC-compliant driver class org.h2.Driver (version 1.3)>>"C:\CIGNA\jboss\jboss.log"

Can anyone help with this issue? Many thanks in advance.

EDIT: After using a ruby filter, I have managed to solve the issue. However, there is occasionally a ruby exception. As seen below, the first message hits a ruby exception while the second one runs fine. I wonder how this happens; any advice would be appreciated. Thanks.

{
       "message" => "10:30:39 FATAL [org.jboss.as.server] (default task-1) JBAS015957: Server boot has failed in an unre
coverable manner; exiting. See previous messages for details.\r",
      "@version" => "1",
    "@timestamp" => "2016-07-26T02:43:17.379Z",
          "path" => "C:/CIGNA/jboss/jboss.log",
          "host" => "SIMSPad",
          "type" => "txt",
          "Time" => "10:30:39",
         "Level" => "FATAL",
     "JavaClass" => "org.jboss.as.server",
       "Message" => "(default task-1) JBAS015957: Server boot has failed in an unrecoverable manner; exiting. See previo
us messages for details.\r",
          "tags" => [
        [0] "_rubyexception"
    ]
}
{
       "message" => "10:30:39 DEBUG [org.jboss.as.quickstarts.logging.LoggingExample] (default task-1) Settings reconfig
ured: JBOSS EAP Resettlement\r",
      "@version" => "1",
    "@timestamp" => "2016-07-26T02:30:39.000Z",
          "path" => "C:/CIGNA/jboss/jboss.log",
          "host" => "SIMSPad",
          "type" => "txt",
          "Time" => "10:30:39",
         "Level" => "DEBUG",
     "JavaClass" => "org.jboss.as.quickstarts.logging.LoggingExample",
       "Message" => "(default task-1) Settings reconfigured: JBOSS EAP Resettlement\r"
}

My updated filter section in my Logstash .conf file is shown below.

filter {
  grok {
    match => { "message" => '\A%{TIME:Time}%{SPACE}%{WORD:Level}%{SPACE}\[%{PROG:JavaClass}]%{SPACE}%{JAVALOGMESSAGE:Message}'}
  }
   ruby {
      code => "
        p = Time.parse(event['message']);
        event['@timestamp'] = LogStash::Timestamp.new(p);
      "
    }
}
I don't see any timestamp (date or time) in your sample log line. - Val
Sorry, I miscopied. I have edited it now. Thanks for reminding. - Kennedy Kan
I assume that the date filter is matching "Time" not "Timestamp" ? Or is "Timestamp" another field somewhere? - pandaadb
The match in the date filter has to be done on the Time field, not Timestamp, which does not exist in your example. Also the time pattern is missing ,SSS for the milliseconds - baudsp
@pandaadb I have corrected the question. It is just a typo. Sorry for confusing. - Kennedy Kan

1 Answer

3 votes

You can do that via a ruby filter; Ruby can parse this out of the box. Sorry, I have not tried it with the date filter (it might work as well). Here is my example:

My configuration:

input {
  stdin {
  }
}


filter {

    ruby {
      code => "
        p = Time.parse(event['message']);
        event['myTime'] = p;
      "
    }

}


output {
          stdout { codec => rubydebug }
}

Input and output:


artur@pandaadb:~/dev/logstash$ ./logstash-2.3.2/bin/logstash -f conf2/
Settings: Default pipeline workers: 8
Pipeline main started

08:28:20

{
       "message" => "08:28:20",
      "@version" => "1",
    "@timestamp" => "2016-07-25T09:43:28.814Z",
          "host" => "pandaadb",
        "myTime" => 2016-07-25 08:28:20 +0100
}

I am simply passing your string here; in your case, use the field you parsed with grok, e.g. "Time", in the ruby code.

Ruby is quite smart when parsing dates and recognises that the string is a time rather than a full date, so it uses today's date and sets only the time.
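You can verify this behaviour in plain Ruby, outside of Logstash:

```ruby
require 'time'
require 'date'

# Parsing a time-only string: Ruby fills in today's date automatically.
t = Time.parse('08:28:20')

puts t            # today's date, at 08:28:20 local time
puts t.to_date == Date.today
```

The same thing happens inside the ruby filter, which is why `myTime` above carries today's date.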

Hope that helps!

EDIT:

I tried the date filter just now and it works differently: it sets the date to the 1st of the current year. So the ruby filter appears to be your solution, as the date filter offers no way (that I know of) to modify the date after it has been matched.
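For reference, the date filter attempt would look roughly like this (note the ,SSS for the milliseconds, as pointed out in the comments); since the pattern contains no date components, the missing parts default to the 1st of the year as described above:

```
date {
  match => [ "Time", "HH:mm:ss,SSS" ]
}
```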

EDIT 2:

In the comments you asked how to write it into the @timestamp field. @timestamp is a predefined field that expects a LogStash::Timestamp object (not a string or a Ruby Time object). So you can write directly into that field, but you must create such an object first. (Alternatively this would also work via the date filter, but why double the filters?)

Here is the necessary code:

   ruby {
      code => "
        p = Time.parse(event['message']);
        event['@timestamp'] = LogStash::Timestamp.new(p);
      "
    }

EDIT:

With regards to the updated question, your issue is that you are referencing the wrong field of your event.

From your log update you can see that your grok is parsing things correctly, e.g.:

"message" => "10:30:39 FATAL [org.jboss.as.server] (default task-1) JBAS015957: Server boot has failed in an unre ...",
"Time" => "10:30:39"

In your filter, however, you reference the event's "message" field, not its "Time" field, so ruby attempts to parse the entire message string as a date. Why this works for the second log line is a mystery to me :D
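If you want to protect against the _rubyexception tag entirely, you can also wrap the parse in a rescue. A minimal sketch in plain Ruby (the helper name `safe_parse_time` is my own; in the filter you would apply the same begin/rescue around the `Time.parse` call):

```ruby
require 'time'

# Parse a time string defensively: return nil instead of raising
# when the value is missing or not a valid time.
def safe_parse_time(value)
  return nil if value.nil? || value.empty?
  Time.parse(value)
rescue ArgumentError
  nil
end

puts safe_parse_time('10:30:39')         # today at 10:30:39
puts safe_parse_time('not a time').inspect # nil, no exception raised
```

With a guard like this, an unparseable field leaves @timestamp at its default instead of tagging the event with _rubyexception.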

You need to change your filter to:

filter {
  grok {
    match => { "message" => '\A%{TIME:Time}%{SPACE}%{WORD:Level}%{SPACE}\[%{PROG:JavaClass}]%{SPACE}%{JAVALOGMESSAGE:Message}'}
  }
   ruby {
      code => "
        p = Time.parse(event['Time']);
        event['@timestamp'] = LogStash::Timestamp.new(p);
      "
    }
}

This tells the ruby filter to parse the time from the event's "Time" field.

Regards, Artur