Using SQLAlchemy with a large dataset, I would like to insert all rows efficiently with something like session.add_all() followed by a single session.commit(), while skipping any rows that raise duplicate / unique key errors. The problem is that these errors only surface on the session.commit() call, so there is no way to fail that specific row and move on to the next.
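Here is a minimal sketch of the problem, using a hypothetical `User` model with a unique `email` column against an in-memory SQLite database:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    email = Column(String, unique=True)  # unique constraint triggers the error

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    rows = [
        User(email="a@example.com"),
        User(email="a@example.com"),  # duplicate
        User(email="b@example.com"),  # valid, but lost with the batch
    ]
    session.add_all(rows)   # no error raised here
    try:
        session.commit()    # IntegrityError only surfaces now
    except IntegrityError:
        session.rollback()  # the whole batch is rolled back,
                            # including the valid b@example.com row
```

After the rollback, none of the rows are persisted, not even the valid ones, which is exactly the behavior I want to avoid.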
The closest question I have seen is here: SQLAlchemy - bulk insert ignore: "Duplicate entry"; however, the accepted answer proposes abandoning the bulk method and committing after every single row insert, which is extremely slow and causes a huge amount of I/O, so I am looking for a better solution.