1
votes

I'm setting up a Spring Batch Job with ItemReader/ItemWriter to export datasets from an Oracle DB to CSV/Excel/PDF. For the reader part I'm using the JdbcCursorItemReader.

The reader requires the SQL statement that extracts the datasets to be passed in - this is my reader configuration:

<bean id="readErgebnis" class="org.springframework.batch.item.database.JdbcCursorItemReader" scope="step">
    <property name="dataSource" ref="dataSource" />
    <property name="sql" value="#{jobParameters['stmt']}" />
    <property name="rowMapper">
        <bean class="myFancyRowMapper" />
    </property>
</bean>

So I added the SQL statement to the JobParametersBuilder...

JobParametersBuilder builder = new JobParametersBuilder(); 
builder.addString("stmt", sql);

That should be fine. However, executing the job fails with an SQLException.

Now I know that Spring Batch logs everything to its metadata tables, and that is where the problem lies - the parameter string (containing the SQL) is limited to 250 characters. See the definition of the BATCH_JOB_PARAMS table:

CREATE TABLE BATCH_JOB_PARAMS  (
    JOB_INSTANCE_ID NUMBER(19,0) NOT NULL ,
    TYPE_CD VARCHAR2(6) NOT NULL ,
    KEY_NAME VARCHAR2(100) NOT NULL , 
    STRING_VAL VARCHAR2(250), 
    DATE_VAL TIMESTAMP DEFAULT NULL ,
    LONG_VAL NUMBER(19,0) ,
    DOUBLE_VAL NUMBER ,
    constraint JOB_INST_PARAMS_FK foreign key (JOB_INSTANCE_ID)
    references BATCH_JOB_INSTANCE(JOB_INSTANCE_ID)
);

Spring Batch stores every job parameter in a separate row, so the SQL statement goes into the STRING_VAL column, which is limited to 250 characters. I checked the DDLs for the other database systems (MySQL and DB2); the column seems to be limited to 250 characters everywhere. I have changed that column to CLOB for now; it works fine and my job runs.
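If you don't want to touch the schema, one workaround is to fail fast on the client side. Below is a minimal sketch of a hypothetical helper (`JobParamGuard` is not part of Spring Batch) that rejects over-long string parameters before the job is launched, so you get a clear error instead of an SQLException from the metadata insert:

```java
// Hypothetical helper: fail fast when a string job parameter exceeds
// the default STRING_VAL column size (250 characters) instead of
// letting the metadata insert fail with an SQLException.
public class JobParamGuard {

    // Default VARCHAR size of STRING_VAL in the Spring Batch schema
    static final int STRING_VAL_LIMIT = 250;

    public static void check(String name, String value) {
        if (value != null && value.length() > STRING_VAL_LIMIT) {
            throw new IllegalArgumentException("Job parameter '" + name
                    + "' is " + value.length() + " characters long, but STRING_VAL only holds "
                    + STRING_VAL_LIMIT + " characters");
        }
    }
}
```

Call `JobParamGuard.check("stmt", sql)` right before `builder.addString("stmt", sql)`.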

My question: I can hardly believe that job parameters are intentionally limited to 250 characters. So is there a better implementation for my scenario, or is it intended that you change the default column sizes in the BATCH_JOB_PARAMS table?

No, you should never modify the metadata schema. See my response. – Cygnusx1

1 Answer

3
votes

Do not pass the SQL as a parameter of the job. Instead, create a class that extends JdbcCursorItemReader and implements InitializingBean.

package xxx.readers;

import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.beans.factory.InitializingBean;

public class MyReader extends JdbcCursorItemReader<AnObjet> implements InitializingBean {

    @Override
    public void afterPropertiesSet() throws Exception {
        // Set the SQL before the reader's own property checks run
        String SELECT_PAYMENT = "SELECT * FROM table";
        super.setSql(SELECT_PAYMENT);
        // Let the parent validate that dataSource, sql etc. are set
        super.afterPropertiesSet();
    }
}

This way your SQL is set when your beans are initialized at startup by the Spring context.

If you need to add a WHERE clause, use a PreparedStatementSetter and add it to the configuration.

<bean id="myReader" class="xxx.readers.MyReader">
    <property name="dataSource" ref="aDataSource" />
    <property name="rowMapper" ref="aMapper" />
    <property name="preparedStatementSetter" ref="yourPreparedStatementSetter" />
</bean>

A PreparedStatementSetter example:

package fcdq.iemt.batch.concilliation.utils;

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;

import org.springframework.jdbc.core.PreparedStatementSetter;
import org.springframework.stereotype.Component;

@Component("aPrStatSetter")
public class RunNoSetter implements PreparedStatementSetter {

    // Value bound to the third placeholder
    private Timestamp aTimestamp = new Timestamp(System.currentTimeMillis());

    @Override
    public void setValues(PreparedStatement ps) throws SQLException {
        ps.setString(1, "hello");
        ps.setString(2, "Hola");
        ps.setTimestamp(3, aTimestamp);
    }
}

The PreparedStatementSetter is called at runtime to bind values to the ? placeholders in your WHERE clause.
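As a quick sanity check, you can count the ? placeholders in the statement and compare against the number of values your setter binds. This is a hypothetical helper (`PlaceholderCounter` is not part of Spring or JDBC), and it is naive - it does not skip ? characters inside string literals:

```java
// Hypothetical helper: naively count JDBC '?' placeholders in a
// statement so you can verify the PreparedStatementSetter binds the
// same number of values. (Does not skip '?' inside string literals.)
public class PlaceholderCounter {
    public static int count(String sql) {
        int n = 0;
        for (int i = 0; i < sql.length(); i++) {
            if (sql.charAt(i) == '?') {
                n++;
            }
        }
        return n;
    }
}
```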

Hope it helps.