I have an issue where my Metricbeat events are caught by my http pipeline.

Logstash, Elasticsearch, and Metricbeat are all running in Kubernetes.

Metricbeat is set up to send to Logstash on port 5044, and Logstash logs the events to a file in /tmp. This works fine. But whenever I create a pipeline with an http input, that pipeline also catches the Metricbeat events and sends them to the test2 index in Elasticsearch, as defined in the http pipeline.

Why does it behave like this?

/usr/share/logstash/pipeline/http.conf

input {
  http {
    port => "8080"
  }
}

output {

  #stdout { codec => rubydebug }

  elasticsearch {

    hosts => ["http://my-host.com:9200"]
    index => "test2"
  }
}

/usr/share/logstash/pipeline/beats.conf

input {
    beats {
        port => "5044"
    }
}

output {
    file {
        path => "/tmp/beats.log"
        codec => "json"
    }
}

/usr/share/logstash/config/logstash.yml

pipeline.id: main
pipeline.workers: 1
pipeline.batch.size: 125
pipeline.batch.delay: 50
http.host: "0.0.0.0"
http.port: 9600
config.reload.automatic: true
config.reload.interval: 3s

/usr/share/logstash/config/pipelines.yml

- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline"
How are you starting Logstash? Are you using multiple pipelines with a pipelines.yml configuration? – leandrojmp
Updated my question. Basically, pipelines.yml points to a pipeline directory containing both conf files. Everything runs in Kubernetes (and is thus started automatically via the Docker image). – d00dle

1 Answer


Even if you have multiple config files, Logstash reads them all as a single pipeline, concatenating the inputs, filters, and outputs.
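In other words, with both files in /usr/share/logstash/pipeline, your setup behaves as if you had written a single config like the sketch below (assembled from the two files in your question). Because the outputs have no conditionals, every event from either input reaches both outputs, which is why your Metricbeat events end up in the test2 index:

input {
  http {
    port => "8080"
  }
  beats {
    port => "5044"
  }
}

output {
  # no conditionals: every event goes to both outputs
  elasticsearch {
    hosts => ["http://my-host.com:9200"]
    index => "test2"
  }
  file {
    path => "/tmp/beats.log"
    codec => "json"
  }
}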

If you need to run them as separate pipelines, you have two options. The first is to change your pipelines.yml and create a different pipeline.id for each config file:

- pipeline.id: beats
  path.config: "/usr/share/logstash/pipeline/beats.conf"
- pipeline.id: http
  path.config: "/usr/share/logstash/pipeline/http.conf"
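
Note that pipelines.yml accepts the same settings you have in logstash.yml, so values such as pipeline.workers can also be set per pipeline if you want to tune each one independently (a sketch; the values here are illustrative):

- pipeline.id: beats
  path.config: "/usr/share/logstash/pipeline/beats.conf"
  pipeline.workers: 1
- pipeline.id: http
  path.config: "/usr/share/logstash/pipeline/http.conf"
  pipeline.workers: 1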

The second option is to keep a single pipeline but use tags in your inputs, filters, and outputs to route events. For example:

input {
  http {
    port => "8080"
    tags => ["http"]
  }
  beats {
    port => "5044"
    tags => ["beats"]
  }
}

output {
  if "http" in [tags] {
    elasticsearch {
      hosts => ["http://my-host.com:9200"]
      index => "test2"
    }
  }
  if "beats" in [tags] {
    file {
      path => "/tmp/beats.log"
      codec => "json"
    }
  }
}

Using the pipelines.yml file is the recommended way to run multiple pipelines.
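
Since everything runs in Kubernetes, one way to ship the pipelines.yml shown above into the container is a ConfigMap mounted over the default file. This is a minimal sketch under that assumption; the ConfigMap name and the Deployment fragment are hypothetical and need to be adapted to your manifests:

apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-pipelines   # hypothetical name
data:
  pipelines.yml: |
    - pipeline.id: beats
      path.config: "/usr/share/logstash/pipeline/beats.conf"
    - pipeline.id: http
      path.config: "/usr/share/logstash/pipeline/http.conf"
---
# Fragment of the Logstash Deployment's pod spec:
#   volumes:
#     - name: pipelines
#       configMap:
#         name: logstash-pipelines
#   containers:
#     - name: logstash
#       volumeMounts:
#         - name: pipelines
#           mountPath: /usr/share/logstash/config/pipelines.yml
#           subPath: pipelines.yml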