
I am using the Logstash Elasticsearch output to publish data to Elasticsearch. A request record and a response record are merged into a single document. This configuration works with no issues:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }'
}

Now I am trying to add a new field to the above record update using an ingest pipeline.

PUT _ingest/pipeline/ingest_pipe2
{
  "description" : "describe pipeline",
  "processors" : [
    {
      "set" : {
        "field": "api-test",
        "value": "new"
      }
    }
  ]
}
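Before wiring the pipeline into Logstash, it can be checked with the simulate API; the sample document below is made up for illustration:

POST _ingest/pipeline/ingest_pipe2/_simulate
{
  "docs": [
    {
      "_source": {
        "transaction": "request",
        "tid": "123"
      }
    }
  ]
}

The response should show the document with the extra "api-test": "new" field added by the set processor.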

This adds a new field to the incoming event, and it works fine with the following configuration:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  pipeline => "ingest_pipe2"  
}

The problem is that the Logstash scripted update and the ingest pipeline update don't work together:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  pipeline => "ingest_pipe2"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }'
}

1 Answer


It is not possible to use an ingest pipeline together with doc_as_upsert; the Elasticsearch output plugin does not support this combination.

You can find more info here and here
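One possible workaround (a sketch, not tested) is to drop the ingest pipeline and fold the set processor's logic into the update script itself, since the scripted upsert already runs for every event:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    // same merge logic as in the question
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }
    // equivalent of the set processor in ingest_pipe2
    ctx._source["api-test"] = "new";'
}

Alternatively, if the new field's value is static or derived from the event, a mutate filter with add_field in the Logstash filter section can attach it before the event reaches the output.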