0
votes

I am sending logs to an Azure Event Hub with Serilog (using WriteTo.AzureEventHub(eventHubClient)). After that I run a Filebeat process with the azure module enabled, so these logs are sent to Elasticsearch and I can explore them with Kibana.

The problem I have is that all the information goes into the "message" field. I need to split the information from my logs into separate fields to be able to run good queries.

The approach I found was to create an ingest pipeline in Kibana and, through a grok processor, split the contents of "message" into multiple fields. In filebeat.yml I set the pipeline name, but nothing happens; it seems the pipeline is not being applied:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  pipeline: "filebeat-otc"

Does anybody know what I am missing? Thanks in advance.

EDIT: I will add an example of my pipeline and my data. In the simulation it works properly:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{TIME:timestamp}\\s%{LOGLEVEL}\\s{[a-zA-Z]*:%{UUID:CorrelationID},[a-zA-Z]*:%{TEXT:OperationTittle},[a-zA-Z]*:%{TEXT:OriginSystemName},[a-zA-Z]*:%{TEXT:TargetSystemName},[a-zA-Z]*:%{TEXT:OperationProcess},[a-zA-Z]*:%{TEXT:LogMessage},[a-zA-Z]*:%{TEXT:ErrorMessage}}"
          ],
          "pattern_definitions": {
            "LOGLEVEL" : "\\[[^\\]]*\\]",
            "TEXT" : "[a-zA-Z0-9- ]*"
          }
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "15:13:59 [INF] {CorrelationId:83355884-a351-4c8b-af8d-b77c48462f36,OperationTittle:Operation1,OriginSystemName:Fexa,TargetSystemName:Usina,OperationProcess:Testing Log Data,LogMessage:Esto es una buena prueba,ErrorMessage:null}"
      }
    },
    {
      "_source": {
        "message": "20:13:48 [INF] {CorrelationId:8451ee54-efca-40be-91c8-8c8e18e33f58,OperationTittle:null,OriginSystemName:Fexa,TargetSystemName:Donna,OperationProcess:Testing Log Data,LogMessage:null,ErrorMessage:null}"
      }
    }
  ]
}
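
For reference, the same pipeline can be stored under the name used in filebeat.yml with a PUT. This is a sketch of the Dev Tools equivalent of creating it in the Kibana UI; the processors are copied from the simulation above:

PUT _ingest/pipeline/filebeat-otc
{
  "description": "Split Serilog messages into separate fields",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIME:timestamp}\\s%{LOGLEVEL}\\s{[a-zA-Z]*:%{UUID:CorrelationID},[a-zA-Z]*:%{TEXT:OperationTittle},[a-zA-Z]*:%{TEXT:OriginSystemName},[a-zA-Z]*:%{TEXT:TargetSystemName},[a-zA-Z]*:%{TEXT:OperationProcess},[a-zA-Z]*:%{TEXT:LogMessage},[a-zA-Z]*:%{TEXT:ErrorMessage}}"
        ],
        "pattern_definitions": {
          "LOGLEVEL": "\\[[^\\]]*\\]",
          "TEXT": "[a-zA-Z0-9- ]*"
        }
      }
    }
  ]
}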
Please update your question with an example of your message and the ingest pipeline you are using. When you simulate the ingest pipeline, does it work OK? Are you using any Filebeat module? – leandrojmp
@leandrojmp Yes, I am using the azure Filebeat module. And yes, when I simulate the ingest pipeline it works OK. I will add an example as you asked. Thanks! – Daniel Silva
If I'm not wrong, when you use a module it will create and use its own ingest pipeline in Elasticsearch, and the pipeline option in the output is ignored. You would need to edit that pipeline or use the index.final_pipeline setting on your index. You can check it in Kibana by going to Stack Management / Ingest Node Pipelines. – leandrojmp
Do you have Kibana? It is easier to edit there; go to Stack Management / Ingest Node Pipelines. I would recommend using the index.final_pipeline setting and adding your ingest pipeline there; it will then be processed after the module pipeline. – leandrojmp
@leandrojmp As you suggested, I set index.final_pipeline to my pipeline and IT WORKS!! THANKS!! I spent a lot of time trying to get this working. – Daniel Silva

1 Answer

1
votes

It seems that when you use a module, it creates and uses its own ingest pipeline in Elasticsearch, and the pipeline option in the output is ignored. So my solution was to modify index.final_pipeline. To do this, in Kibana I went to Stack Management / Index Management, found my index, went to Edit Settings, and set "index.final_pipeline": "the-name-of-my-pipeline". I hope this helps somebody.
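
If you prefer the Dev Tools console over the UI, the same setting can be applied with a request like the sketch below (my-filebeat-index and the-name-of-my-pipeline are placeholders for your own index and pipeline names):

PUT my-filebeat-index/_settings
{
  "index.final_pipeline": "the-name-of-my-pipeline"
}

Elasticsearch runs the final pipeline after any default or request pipeline, so the module's own pipeline still processes the event first.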

This was thanks to leandrojmp.