0
votes

My business scenario is to read a CSV file and persist its contents into an Oracle database.

Consider a CSV file with 1000 rows. I iterate over each line, set its column values on a PreparedStatement, add it to the batch with preparedStatement.addBatch(), and then execute the batch. See the example below.

try {
  ps = conn.prepareStatement(insertQuery); // prepare once, so every row goes into the same batch
  for(List<String> mainList:dataList) { //lines
     int i=1;
     for(String value:mainList) { //column values
        ps.setString(i++, value);
     }
     ps.addBatch();
  }
  int[] updateCounts = ps.executeBatch(); //50th record not valid
} catch(BatchUpdateException e){
     int[] updateCounts = e.getUpdateCounts();
}

Here the batch holds all 1000 lines and their values. Say, for example, the 50th record is not valid. The batch inserts the first 49 records into the database table properly, but at the 50th record executeBatch() throws a BatchUpdateException (it could be any issue), which lands in the catch block, since the record is not valid.

After this, the flow is blocked and does not continue with the remaining 950 records in the batch. What I want is that once the exception is thrown, my code should carry on and insert the remaining 950 records into the database.

Please help me resolve this issue, or suggest an alternate way to achieve this logic.
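One possible workaround is sketched below: batch the rows, and on failure use getUpdateCounts() to locate the bad row, skip it, and resubmit the rest. This is only a sketch, not a definitive fix; it assumes the Oracle driver stops the batch at the first failed statement, so the length of getUpdateCounts() gives the index of the failing row.

int startIndex = 0;
while (startIndex < dataList.size()) {
    try (PreparedStatement ps = conn.prepareStatement(insertQuery)) {
        for (int row = startIndex; row < dataList.size(); row++) {
            int i = 1;
            for (String value : dataList.get(row)) { // column values
                ps.setString(i++, value);
            }
            ps.addBatch();
        }
        ps.executeBatch();            // success: all remaining rows inserted
        startIndex = dataList.size();
    } catch (BatchUpdateException e) {
        // Rows before the failure were executed; skip the bad row and retry from the next one
        int failedRow = startIndex + e.getUpdateCounts().length;
        System.err.println("Skipping invalid row " + failedRow);
        startIndex = failedRow + 1;
    }
}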

CSV sample:

Name,Id,Firstname,Lastname
Bharathi,2070,Bharathi,Raja   --> row 1
Craig,Text,Craig,Johnson      --> row 2
Mary,9346,Mary,Jenk           --> row 3

Consider that row 2 is the 50th record: the value for Id is Text, which is not valid because it should be a number, so the exception is thrown.
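For example, such a value could be detected up front with a small helper (a hypothetical sketch, not part of the original code):

private static boolean isNumeric(String value) {
    try {
        Integer.parseInt(value); // "Text" throws NumberFormatException here
        return true;
    } catch (NumberFormatException e) {
        return false;
    }
}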

I have the below dependency in my pom.xml, and I am using a Spring Boot application:

<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.0.2</version>
</dependency>

2 Answers

1
votes

The official Oracle JDBC drivers are on Maven Central. A sample GAV is shown below.

<dependency>
    <groupId>com.oracle.database.jdbc</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.0.2</version>
</dependency>
0
votes

I found an alternate way to solve this kind of problem.

I validate each value before setting it on the PreparedStatement and adding the row to the batch, so only valid records ever enter the batch.

I read the column names and data types from the table and keep them in a Map (columnDataTypeMap) for the validation (a sketch of building this map follows the code below).

try {
  ps = conn.prepareStatement(insertQuery); // prepare once, reuse for every row
  for(List<String> mainList:dataList) { //lines
     boolean rowValid = true; // reset per row, not once for the whole file
     int i=1;
     int columnCount=0; // index into the CSV header
     for(String value:mainList) { //column values
        try {
               String columnName = csvHeader.split(",")[columnCount++];
               String dataType = columnDataTypeMap.get(columnName.toUpperCase());
               if("VARCHAR2".equalsIgnoreCase(dataType)) {
                    ps.setString(i++, value);
               } else if("NUMBER".equalsIgnoreCase(dataType)) {
                    ps.setInt(i++, Integer.valueOf(value)); // throws NumberFormatException for values like "Text"
               }
        }catch(Exception e) {
            rowValid = false; // invalid value: skip this whole row
        }
     }
     if(rowValid) {
        ps.addBatch(); // only valid rows reach the batch
     }
  }
  int[] updateCounts = ps.executeBatch();
} catch(BatchUpdateException e){
     int[] updateCounts = e.getUpdateCounts();
}
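For completeness, columnDataTypeMap could be built from the table's metadata via the standard DatabaseMetaData API. A minimal sketch, assuming conn is the open connection and schemaName/tableName are hypothetical variables:

Map<String, String> columnDataTypeMap = new HashMap<>();
DatabaseMetaData meta = conn.getMetaData();
try (ResultSet rs = meta.getColumns(null, schemaName, tableName, null)) {
    while (rs.next()) {
        // TYPE_NAME is the database-specific type name, e.g. VARCHAR2 or NUMBER
        columnDataTypeMap.put(rs.getString("COLUMN_NAME").toUpperCase(),
                              rs.getString("TYPE_NAME"));
    }
}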