
I have a configuration in which Filebeat fetches logs from some files (using a custom format) and sends them to a Logstash instance.

In Logstash I apply a grok filter to split some of the fields, and then I send the output to my Elasticsearch instance.

The pipeline works fine and the data is correctly loaded into Elasticsearch, but no event metadata is present (such as event.dataset or event.module). So I'm looking for the piece of configuration that adds such information to my events.

Here is my Filebeat configuration:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.inputs:
  - type: log
    paths:
      - /var/log/*/info.log
      - /var/log/*/warning.log
      - /var/log/*/error.log

output.logstash:
  hosts: '${ELK_HOST:logstash}:5044'


Here is my Logstash pipeline:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "MY PATTERN" }
  }
  mutate {
    add_field => { "logLevelLower" => "%{logLevel}" }
  }
  mutate {
    lowercase => [ "logLevelLower" ]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "USER"
    password => "PASSWORD"
    index => "%{[@metadata][beat]}-%{logLevelLower}-%{[@metadata][version]}"
  }
}

1 Answer


You can easily do this with a mutate/add_field filter:

filter {
  mutate {
    add_field => {
      "[ecs][version]" => "1.5.0"
      "[event][kind]" => "event"
      "[event][category]" => "host"
      "[event][type]" => ["info"]
      "[event][dataset]" => "module.dataset"
    }
  }
}

The Elastic Common Schema documentation explains how to pick appropriate values for event.kind, event.category, and event.type.
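
For completeness, here is a minimal sketch of how these fields could be merged into your existing filter block. It assumes your grok pattern extracts a logLevel field (as your existing mutate suggests) and that "module.dataset" is a placeholder you replace with your actual module and dataset names; instead of hard-coding [event][type], it derives it from the lowercased log level:

filter {
  grok {
    match => { "message" => "MY PATTERN" }
  }

  # Your existing lowercased copy of the log level, used for the index name
  mutate {
    add_field => { "logLevelLower" => "%{logLevel}" }
  }
  mutate {
    lowercase => [ "logLevelLower" ]
  }

  # Static ECS metadata added to every event in this pipeline
  mutate {
    add_field => {
      "[ecs][version]" => "1.5.0"
      "[event][kind]" => "event"
      "[event][category]" => "host"
      "[event][dataset]" => "module.dataset"
    }
  }

  # Derive event.type from the lowercased log level produced above
  if [logLevelLower] == "error" {
    mutate { add_field => { "[event][type]" => "error" } }
  } else {
    mutate { add_field => { "[event][type]" => "info" } }
  }
}

This leaves the index naming based on logLevelLower unchanged while every indexed document also carries the ECS event fields.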