
I have an index like the one below:

    {
      "_index": "mydata",
      "_type": "_doc",
      "_id": "PuhnbG0B1IIlyY9-ArdR",
      "_score": 1,
      "_source": {
        "age": 9,
        "@version": "1",
        "updated_on": "2019-01-01T00:00:00.000Z",
        "id": 4,
        "name": "Emma",
        "@timestamp": "2019-09-26T07:09:11.947Z"
      }
    }

So my Logstash conf for updating the data is:

    input {
        jdbc {
            jdbc_connection_string => "***"
            jdbc_driver_class => "***"
            jdbc_driver_library => "***"
            jdbc_user => ***
            statement => "SELECT * from agedata WHERE updated_on > :sql_last_value ORDER BY updated_on"
            use_column_value => true
            tracking_column => "updated_on"
            tracking_column_type => "timestamp"
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "mydata"
            action => update
            document_id => "{_id}"
            doc_as_upsert => true
        }
        stdout { codec => rubydebug }
    }

So, when I run this after updating the same row, my expected output is that the existing document (same _id) is updated with the changes I made. But Elasticsearch indexes it as a new document, where my _id is taken as a literal string:

    "_index": "agesep",
    "_type": "_doc",
    "_id": "%{_id}"

A duplicate also occurs when I use document_id => "%{id}". Actual:

    {
      "_index": "mydata",
      "_type": "_doc",
      "_id": "BuilbG0B1IIlyY9-4P7t",
      "_score": 1,
      "_source": {
        "id": 1,
        "age": 13,
        "name": "Greg",
        "updated_on": "2019-09-26T08:11:00.000Z",
        "@timestamp": "2019-09-26T08:17:52.974Z",
        "@version": "1"
      }
    }

duplicate:

    {
      "_index": "mydata",
      "_type": "_doc",
      "_id": "1",
      "_score": 1,
      "_source": {
        "age": 56,
        "@version": "1",
        "id": 1,
        "name": "Greg",
        "updated_on": "2019-09-26T08:18:00.000Z",
        "@timestamp": "2019-09-26T08:20:14.561Z"
      }
    }

How do I get it to consider the existing _id and not create a duplicate when I make updates in ES? My expectation is to update documents in the index based on _id, not to add a new document per update.

- You're missing a % in "{_id}". Also, have you tried %{id} instead of %{_id}? – Val
- @Val, I tried %{id}, which is a column in my table, but it still created a new _id value for every update, creating a duplicate row per ID. – SPishere
- @Val, sorry, it worked the first time, but now it's again `"_index": "agesep", "_type": "_doc", "_id": "%{_id}",` and I'm not sure why. – SPishere
- See my answer, you need to use id, not _id. There's no _id field coming from your DB. – Val
- Yes, that's the file Logstash uses to store the sql_last_value for the next run. Just delete it, delete your agesep index, and restart Logstash. – Val
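Building on that last comment: the JDBC input stores the tracking state in a file on disk (by default `~/.logstash_jdbc_last_run`, unless `last_run_metadata_path` overrides it), and removing it resets `sql_last_value`. A minimal sketch, using a demo path (an assumption) so the commands are self-contained:

```shell
# Stop Logstash first, then remove the JDBC tracking file so the next
# run re-reads everything. Real path is ~/.logstash_jdbc_last_run by
# default; a demo path is used here for illustration.
TRACK_FILE=/tmp/.logstash_jdbc_last_run_demo
echo "--- 2019-09-26 08:18:00.000000000 Z" > "$TRACK_FILE"  # simulated stored timestamp
rm -f "$TRACK_FILE"                                         # reset sql_last_value
```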

1 Answer


I suggest using id instead of _id:

    document_id => "%{id}"
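For completeness, the full corrected output section might look like the sketch below (hosts and index taken from the question; untested against your setup). With `%{id}` the document ID comes from the `id` column of your table, so repeated runs upsert the same document instead of creating new ones:

```
output {
    elasticsearch {
        hosts         => ["localhost:9200"]
        index         => "mydata"
        action        => "update"
        document_id   => "%{id}"
        doc_as_upsert => true
    }
}
```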