I'm setting up a Spring Batch job with an ItemReader/ItemWriter to export datasets from an Oracle DB to CSV/Excel/PDF. For the reader I'm using a JdbcCursorItemReader.
By default you have to pass the reader the SQL statement that extracts the datasets - this is my reader configuration:
<bean id="readErgebnis" class="org.springframework.batch.item.database.JdbcCursorItemReader" scope="step">
    <property name="dataSource" ref="dataSource" />
    <property name="sql" value="#{jobParameters['stmt']}" />
    <property name="rowMapper">
        <bean class="myFancyRowMapper" />
    </property>
</bean>
So I added the SQL statement via the JobParametersBuilder:
JobParametersBuilder builder = new JobParametersBuilder();
builder.addString("stmt", sql);
That should be fine. However, executing the job fails with an SQLException.
Now I know that Spring Batch logs everything to its metadata tables, and that's where the problem lies: the parameter string (containing the SQL) is limited to 250 characters. See the definition of the BATCH_JOB_PARAMS table:
CREATE TABLE BATCH_JOB_PARAMS (
    JOB_INSTANCE_ID NUMBER(19,0) NOT NULL,
    TYPE_CD VARCHAR2(6) NOT NULL,
    KEY_NAME VARCHAR2(100) NOT NULL,
    STRING_VAL VARCHAR2(250),
    DATE_VAL TIMESTAMP DEFAULT NULL,
    LONG_VAL NUMBER(19,0),
    DOUBLE_VAL NUMBER,
    constraint JOB_INST_PARAMS_FK foreign key (JOB_INSTANCE_ID)
        references BATCH_JOB_INSTANCE(JOB_INSTANCE_ID)
);
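Given that limit, a pre-flight check before launching would have made the failure obvious. This is a hypothetical helper of my own (`JobParamCheck` is not part of Spring Batch), just to illustrate where the constraint bites:

```java
// Hypothetical pre-flight check: the default BATCH_JOB_PARAMS schema stores
// string parameters in STRING_VAL VARCHAR2(250), so any longer value fails
// at insert time with an SQLException instead of a readable error.
public class JobParamCheck {

    static final int MAX_STRING_VAL = 250; // matches VARCHAR2(250) above

    static void requireFits(String name, String value) {
        if (value.length() > MAX_STRING_VAL) {
            throw new IllegalArgumentException(
                "Job parameter '" + name + "' is " + value.length()
                + " chars; STRING_VAL only holds " + MAX_STRING_VAL);
        }
    }
}
```

Calling `JobParamCheck.requireFits("stmt", sql)` right before building the job parameters would turn the obscure SQLException into an explicit error.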
Spring Batch stores each job parameter in its own row, so the SQL statement goes into the STRING_VAL column, which is limited to 250 characters. I checked the DDL for the other database systems (MySQL and DB2) and it seems to be limited to 250 characters everywhere. I changed that column to CLOB, and now my job runs fine.
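For reference, this is roughly the migration I ran. Note that Oracle does not allow changing a VARCHAR2 column to CLOB directly with `MODIFY`, so the column has to be swapped (sketch only, do this on a quiesced job repository):

```sql
-- Oracle can't MODIFY VARCHAR2 -> CLOB in place:
-- add a CLOB column, copy the data, then swap the columns.
ALTER TABLE BATCH_JOB_PARAMS ADD (STRING_VAL_TMP CLOB);
UPDATE BATCH_JOB_PARAMS SET STRING_VAL_TMP = STRING_VAL;
ALTER TABLE BATCH_JOB_PARAMS DROP COLUMN STRING_VAL;
ALTER TABLE BATCH_JOB_PARAMS RENAME COLUMN STRING_VAL_TMP TO STRING_VAL;
```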
My question: I can hardly believe that job parameters are intentionally limited to 250 characters. So is there a better implementation for my scenario, or is it intended that you change the default column sizes in the BATCH_JOB_PARAMS table?