
I have written pipeline files for Logstash, but my current client is opposed to using Logstash and wants to ingest Filebeat-generated logs directly into Elasticsearch.

Fine, if that is really what he wants. But I cannot find a complementary pipeline file for Elasticsearch. I want to COPY config files into an image with a Dockerfile, then build the stack with Compose, making a nice deployment pattern for the client going forward.

I am using version 7.11 of the stack, and I have a good start on the Compose file for Elasticsearch and Kibana and another Compose file for Filebeat. What I cannot find is syntax that allows placing the pipelines into the Elasticsearch image.
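For reference, a minimal sketch of the pattern I am after (the image tag and file names are just placeholders):

FROM docker.elastic.co/elasticsearch/elasticsearch:7.11.2
# Copying ordinary config files into the image works fine...
COPY elasticsearch.yml /usr/share/elasticsearch/config/elasticsearch.yml
# ...but I cannot find an equivalent location to COPY ingest pipelines into:
# COPY pipelines/*.json /usr/share/elasticsearch/???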

Can someone point me in the right direction?

Thanks!


1 Answer


I do not see how you would load the pipelines while starting up Elasticsearch. You can either do it via the API after the cluster has started, or have Filebeat load them itself.

For most of the pipelines we use, which do not change very often after the initial setup, we decided on a very simple bash script that iterates through a folder of pipeline JSON files and posts each one to the API via cURL:

curl -H "Content-Type: application/json" -XPUT "http://${ELASTIC_URL}:9200/_ingest/pipeline/some-pipeline" -d @some-pipeline.json
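A minimal sketch of that script (the pipelines/ folder name and the convention that each file is named after its pipeline id are assumptions):

#!/usr/bin/env bash
# Push every pipeline JSON in ./pipelines to the Elasticsearch ingest API.
# Assumes ELASTIC_URL is set in the environment.
set -euo pipefail

for f in pipelines/*.json; do
  name="$(basename "$f" .json)"  # pipeline id taken from the file name
  curl -sS -H "Content-Type: application/json" \
    -XPUT "http://${ELASTIC_URL}:9200/_ingest/pipeline/${name}" \
    -d @"$f"
  echo  # newline between API responses
done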

For other apps, though, we had to create custom Filebeat modules that have the pipelines built in.

To load pipelines via Filebeat, you need to create a custom module that already contains the pipeline JSON.

See the module development guide for more details.
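As a rough sketch (module and fileset names are hypothetical), the directory layout such a module needs looks something like:

module/my-custom-module/
    my-fileset/
        manifest.yml          # declares the input config and the ingest pipeline
        config/
            my-fileset.yml    # input configuration template
        ingest/
            pipeline.json     # the pipeline JSON that setup pushes to Elasticsearch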

Once created, you can run ./filebeat setup --pipelines --modules my-custom-module to push the pipelines to Elasticsearch.