I'm trying to keep a MongoDB collection in sync with an Elasticsearch index using Logstash.
I'm using the Logstash JDBC plugin with the DBSchema JDBC Driver Library for this.
This is the configuration file I'm using for Logstash:
input {
    jdbc {
        jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
        jdbc_driver_library => "/path/to/mongojdbc1.8.jar"
        jdbc_user => ""
        jdbc_password => ""
        jdbc_connection_string => "jdbc:mongodb://127.0.0.1:27017/db1"
        statement => "db.collection1.find({ }, { '_id': false })"
    }
}
output {
    elasticsearch {
        hosts => ["http://127.0.0.1:9200"]
        index => "testing"
        user => ""
        password => ""
    }
}
This works alright, but every time I run Logstash, the records are inserted into Elasticsearch again, creating duplicates. I don't want records to be re-written. Also, if I modify a document and run Logstash again, it should update the same record in Elasticsearch instead of creating a new document. How do I go about achieving this?
Set document_id in your logstash output with some unique identifier from your source. If you do not use it, elasticsearch will generate a unique id for every document. Do you have a unique identifier field in your source? - leandrojmp

Add record_last_run => true and last_run_metadata_path => "/usr/share/logstash/bin/since" to your jdbc section in logstash. You can read more about it here: elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html - eladyanai
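Following the document_id suggestion above, a minimal sketch of the output section is shown below. It assumes the source documents have some stable unique field, here called item_id (a hypothetical name; substitute your own). Note that the query's projection `{ '_id': false }` strips MongoDB's own `_id`, so either remove that projection and key on `_id`, or use another field that is guaranteed unique:

```
output {
    elasticsearch {
        hosts => ["http://127.0.0.1:9200"]
        index => "testing"
        # Reuse the source's unique key as the Elasticsearch document id,
        # so re-running the pipeline overwrites instead of duplicating.
        # "item_id" is a placeholder for whatever unique field your
        # collection actually has.
        document_id => "%{item_id}"
        user => ""
        password => ""
    }
}
```

With a fixed document_id, repeated runs index the same id and Elasticsearch treats each write as an update of the existing document rather than an insert.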