
In Spring Batch, reading from and writing to the database works for up to about 31,000 records. After 31,000 records I get the exception below:

java.sql.SQLRecoverableException:
 Encountered an error executing the step
org.springframework.transaction.CannotCreateTransactionException: Could not open JDBC Connection for transaction; nested exception is java.sql.SQLRecoverableException: IO Error: Socket read timed out
    at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:240)
    at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:371)
    at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:127)
    at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:264)
    at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
    at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$ExecutingRunnable.run(TaskExecutorRepeatTemplate.java:258)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLRecoverableException: IO Error: Socket read timed out
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:458)
    at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:546)
    at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:236)
    at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:32)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:521)
    at oracle.jdbc.pool.OracleDataSource.getPhysicalConnection(OracleDataSource.java:280)
    at oracle.jdbc.pool.OracleDataSource.getConnection(OracleDataSource.java:207)
    at oracle.jdbc.pool.OracleDataSource.getConnection(OracleDataSource.java:157)
    at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:202)
    ... 8 more
Caused by: oracle.net.ns.NetException: Socket read timed out
    at oracle.net.ns.Packet.receive(Packet.java:339)
    at oracle.net.ns.NSProtocol.connect(NSProtocol.java:296)
    at oracle.jdbc.driver.T4CConnection.connect(T4CConnection.java:1102)
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:320)
    ... 16 more

DataSource:
    <bean id="dataSource" class="oracle.jdbc.pool.OracleDataSource"
        destroy-method="close">
        <property name="explicitCachingEnabled" value="true" />
        <property name="URL" value="${batch.jdbc.url}"></property>
        <property name="password" value="${batch.jdbc.password}"></property>
        <property name="user" value="${batch.jdbc.user}"></property>
        <property name="connectionCachingEnabled" value="true" />
        <property name="connectionCacheName" value="ImplicitCache01" />
        <property name="maxStatements" value="2000000" />
        <property name="connectionCacheProperties">
            <props merge="default">
                <!-- <prop key="MinLimit">5</prop> -->
                <!-- <prop key="MaxLimit">50</prop> -->
                <prop key="InitialLimit">1</prop>
                <prop key="MinLimit">1</prop>
                <prop key="MaxLimit">2000</prop>            
                <prop key="ConnectionWaitTimeout">11800000</prop>
                <prop key="InactivityTimeout">11800000</prop>
                <prop key="TimeToLiveConnectionTimeout">18000</prop>
                <prop key="ValidateConnection">true</prop>
                <prop key="PropertyCheckInterval">300000</prop>
            </props>
        </property>
    </bean>

Even though I raised the timeouts in the connection cache properties, I am still getting the same exception. I have also raised TimeToLiveConnectionTimeout, but the issue persists.
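
For context, the "Socket read timed out" in the trace happens inside T4CConnection.logon, i.e. while the pool is opening a new physical connection, which the cache-level timeouts above do not govern. Below is a minimal sketch of supplying driver-level network timeouts instead; oracle.net.CONNECT_TIMEOUT and oracle.jdbc.ReadTimeout are standard Oracle JDBC connection properties, but the values and the factory class are illustrative only, not what the job currently uses (the same properties could also be injected through a connectionProperties property on the bean):

    import java.sql.SQLException;
    import java.util.Properties;
    import oracle.jdbc.pool.OracleDataSource;

    // Illustrative only: the same OracleDataSource as the XML config, built programmatically
    // so the driver-level network timeouts can be shown explicitly.
    public class BatchDataSourceFactory {

        public static OracleDataSource create(String url, String user, String password) throws SQLException {
            OracleDataSource ds = new OracleDataSource();
            ds.setURL(url);
            ds.setUser(user);
            ds.setPassword(password);

            Properties props = new Properties();
            // Milliseconds allowed for the TCP connect/logon phase (where the trace times out).
            props.setProperty("oracle.net.CONNECT_TIMEOUT", "60000");
            // Milliseconds the driver waits on the socket for a response before "Socket read timed out".
            props.setProperty("oracle.jdbc.ReadTimeout", "600000");
            ds.setConnectionProperties(props);
            return ds;
        }
    }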

Query:

select e.id, e.tpin, e.res_blob.getClobVal() clobvalue from ecrnt e where e.id  >= 688370 AND e.id <= 788370

I am querying xmltype in oracle db. The batch job runs successfully for 100k records when I query without xmlType. I am facing the below exception when I try to query xmltype.

    org.springframework.dao.DataAccessResourceFailureException: Attempt to process next row failed;
    SQL [select e.id, e.tpin, e.res_blob.getClobVal() clobvalue from ecrnt e where e.id >= 688370 AND e.id <= 788370];
    ORA-00028: your session has been killed
    ORA-00028: your session has been killed
    ORA-00028: your session has been killed
    ORA-04036: PGA memory used by the instance exceeds PGA_AGGREGATE_LIMIT
    ORA-06512: at "SYS.XMLTYPE", line 138

Nested exception:

nested exception is java.sql.SQLRecoverableException: ORA-00028: your session has been killed
ORA-00028: your session has been killed
ORA-00028: your session has been killed
ORA-04036: PGA memory used by the instance exceeds PGA_AGGREGATE_LIMIT
ORA-06512: at "SYS.XMLTYPE", line 138
    at org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:104)
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
    at org.springframework.batch.item.database.AbstractCursorItemReader.doRead(AbstractCursorItemReader.java:450)
    at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:85)
    at sun.reflect.GeneratedMethodAccessor962.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:318)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
    at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
    at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
    at com.sun.proxy.$Proxy467.read(Unknown Source)
    at org.springframework.batch.core.step.item.Simpl
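
The ORA-04036 raised from SYS.XMLTYPE points at session memory rather than the network: e.res_blob.getClobVal() creates a temporary LOB for every row, and temporary LOBs that are never freed accumulate for the life of the cursor. As a rough sketch (EcrntRecord and its setters are hypothetical names matching the query's columns, not the actual classes in the job), the row mapper used by the reader could free each CLOB as soon as its content has been copied:

    import java.sql.Clob;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import org.springframework.jdbc.core.RowMapper;

    // Hypothetical row mapper for the query above; EcrntRecord is an assumed simple POJO.
    public class EcrntRowMapper implements RowMapper<EcrntRecord> {

        @Override
        public EcrntRecord mapRow(ResultSet rs, int rowNum) throws SQLException {
            EcrntRecord record = new EcrntRecord();
            record.setId(rs.getLong("id"));
            record.setTpin(rs.getString("tpin"));

            Clob clob = rs.getClob("clobvalue");
            try {
                // Copy the XML out of the temporary CLOB into plain Java memory.
                record.setXml(clob.getSubString(1, (int) clob.length()));
            } finally {
                // Release the temporary LOB created by getClobVal() instead of letting it
                // accumulate in the session for the duration of the cursor.
                clob.free();
            }
            return record;
        }
    }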

Looks like the connection to the database was lost after reading 31,000 records, probably due to a transient network problem. Have you tried to rerun your job? Is this problem happening repeatedly? – Mahmoud Ben Hassine
Yes, I face the same issue every time. I initially tried with a lakh (100,000) records; the Spring Batch job actually has to handle millions of records from the DB. I have searched a lot on the internet. Any solution would be helpful. – Divya

1 Answer


Increasing the timeout would not solve the issue, since the problem is not that a timeout is too short: the connection itself is being closed. You need to set the connection time-to-live to a high value so that the Oracle DB server does not close your connection while the job is still running. See here: https://docs.oracle.com/cd/B28359_01/java.111/e10788/optimize.htm#CFHBJBCI
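
A minimal sketch of that suggestion, assuming the same implicit connection cache used in the question's configuration (the class name and values are illustrative; TimeToLiveConnectionTimeout and InactivityTimeout are expressed in seconds for the implicit connection cache):

    import java.sql.SQLException;
    import java.util.Properties;
    import oracle.jdbc.pool.OracleDataSource;

    // Illustrative only: raise the cache's time-to-live well above the expected job duration
    // so connections are not retired while the step is still reading.
    public class ConnectionCacheSettings {

        public static void apply(OracleDataSource ds) throws SQLException {
            Properties cacheProps = new Properties();
            cacheProps.setProperty("InitialLimit", "1");
            cacheProps.setProperty("MinLimit", "1");
            cacheProps.setProperty("MaxLimit", "50");
            // Seconds a connection may live before the cache closes it: keep it longer than the job.
            cacheProps.setProperty("TimeToLiveConnectionTimeout", "28800");
            // Seconds an idle connection may sit in the cache before being closed.
            cacheProps.setProperty("InactivityTimeout", "28800");
            ds.setConnectionCacheProperties(cacheProps);
        }
    }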