My business scenario is to read a CSV file and persist its contents to an Oracle database.
Suppose the CSV file has 1000 rows. I iterate over each line, set its column values on a PreparedStatement, add it to the batch with preparedStatement.addBatch(), and finally execute the batch. See the example below.
try {
    ps = conn.prepareStatement(insertQuery); // prepare once, outside the loop
    for (List<String> mainList : dataList) { // lines
        int i = 1;
        for (String value : mainList) { // column values
            ps.setString(i++, value);
        }
        ps.addBatch();
    }
    int[] updateCounts = ps.executeBatch(); // 50th record not valid
} catch (BatchUpdateException e) {
    int[] updateCounts = e.getUpdateCounts();
}
Here the batch holds all 1000 lines and their values. Say the 50th record is invalid. Currently the batch inserts the first 49 records into the database table correctly, but when it reaches the 50th record, executeBatch() throws a BatchUpdateException (whatever the underlying cause), which lands in the catch block because the record is not valid.
After that, execution stops and the remaining 950 records in the batch are never inserted. What I want is that once the exception is thrown, my code should continue and insert the remaining 950 records into the database.
Please help me resolve this, or suggest an alternative way to achieve this.
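One workaround I'm considering is to fall back to inserting the rows one at a time and collect the failures, so a single bad record no longer aborts everything. The sketch below simulates that control flow without a database: `insertRow` is a hypothetical stand-in for binding a row to the PreparedStatement and calling `executeUpdate()`, and its numeric Id check mimics the Oracle NUMBER column rejecting a non-numeric value.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ContinueOnError {

    // Hypothetical stand-in for a single-row insert: the real code would bind
    // the columns to a PreparedStatement and call executeUpdate().
    static void insertRow(List<String> row) {
        // Mimics the Oracle NUMBER column rejecting a non-numeric Id.
        Integer.parseInt(row.get(1));
    }

    // Try every row; collect the indexes of failed rows instead of
    // stopping at the first bad record.
    static List<Integer> insertAll(List<List<String>> rows) {
        List<Integer> failed = new ArrayList<>();
        for (int i = 0; i < rows.size(); i++) {
            try {
                insertRow(rows.get(i));
            } catch (RuntimeException e) { // the real code would catch SQLException
                failed.add(i);             // remember the bad row and keep going
            }
        }
        return failed;
    }

    public static void main(String[] args) {
        List<List<String>> rows = Arrays.asList(
            Arrays.asList("Bharathi", "2070", "Bharathi", "Raja"),
            Arrays.asList("Craig", "Text", "Craig", "Johnson"), // invalid Id
            Arrays.asList("Mary", "9346", "Mary", "Jenk"));
        System.out.println("Failed rows: " + insertAll(rows)); // prints "Failed rows: [1]"
    }
}
```

In the real code this per-row path could be the fallback after the fast executeBatch() fails, trading some performance on the retry for completeness.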
CSV sample:
Name,Id,Firstname,Lastname
Bharathi,2070,Bharathi,Raja --> row 1
Craig,Text,Craig,Johnson --> row 2
Mary,9346,Mary,Jenk --> row 3
Suppose row 2 is the 50th record: its Id value is "Text", which is not valid (it should be a number), so the exception is thrown.
I have the following dependency in my pom.xml and I'm using a Spring Boot application:
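An alternative I'm also considering is to validate each line before it ever reaches addBatch(), so an invalid row is skipped (and logged) up front instead of failing the whole batch. A minimal sketch of that check, assuming Id is always the second CSV column:

```java
public class IdCheck {

    // Returns true when the Id column parses as an integer; rows failing this
    // check would be skipped (and logged) rather than added to the batch.
    static boolean hasNumericId(String[] columns) {
        try {
            Integer.parseInt(columns[1].trim());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasNumericId("Bharathi,2070,Bharathi,Raja".split(","))); // true
        System.out.println(hasNumericId("Craig,Text,Craig,Johnson".split(",")));    // false
    }
}
```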
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.0.2</version>
</dependency>