I am using spring-batch with an in-memory job repository, as a project-specific requirement (this is only for the test environment, not for production). Below are my configuration classes:
// Batch Scheduler class
package org.learning.scheduler;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.SimpleJobExplorer;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.scheduling.annotation.EnableScheduling;
/**
 * In-memory job repository configuration.
 */
@EnableScheduling
@Configuration
public class InmemoryJobConfig {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean(ResourcelessTransactionManager resourcelessTransactionManager) throws Exception {
        MapJobRepositoryFactoryBean factoryBean = new MapJobRepositoryFactoryBean(resourcelessTransactionManager);
        factoryBean.afterPropertiesSet();
        return factoryBean;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factoryBean) throws Exception {
        return (JobRepository) factoryBean.getObject();
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean repositoryFactory) {
        return new SimpleJobExplorer(repositoryFactory.getJobInstanceDao(), repositoryFactory.getJobExecutionDao(),
                repositoryFactory.getStepExecutionDao(), repositoryFactory.getExecutionContextDao());
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(jobRepository);
        // jobs are launched asynchronously, so the @Scheduled methods return immediately
        simpleJobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        return simpleJobLauncher;
    }
}
// Job configuration class
/**
 * Batch entry point for the scheduler for all Jobs.
 */
@Import({InmemoryJobConfig.class})
@EnableBatchProcessing
@Configuration
@Slf4j
public class BatchScheduler {
    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Autowired
    private SimpleJobLauncher jobLauncher;

    @Autowired
    private JobExplorer jobExplorer;

    @Autowired
    private MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean;

    @Bean
    public ItemReader<UserDTO> userReader() {
        return new UserReader();
    }

    @Bean
    public ItemWriter<User> userWriter() {
        return new UserWriter();
    }

    @Bean
    public ItemReader<OrderDTO> orderReader() {
        return new OrderReader();
    }

    @Bean
    public ItemWriter<Order> orderWriter() {
        return new OrderWriter();
    }
    @Bean
    public Step userStep(ItemReader<UserDTO> reader, ItemWriter<User> writer) {
        return steps.get("userStep")
                .<UserDTO, User>chunk(20)
                .reader(reader)
                .processor(new UserProcessor())
                .writer(writer)
                .build();
    }

    @Bean
    public Step orderStep(ItemReader<OrderDTO> reader, ItemWriter<Order> writer) {
        return steps.get("orderStep")
                .<OrderDTO, Order>chunk(20)
                .reader(reader)
                .processor(new OrderProcessor())
                .writer(writer)
                .build();
    }
    @Bean
    public Job userJob() {
        return jobs.get("userJob")
                .incrementer(new RunIdIncrementer())
                .start(userStep(userReader(), userWriter()))
                .build();
    }

    @Bean
    public Job orderJob() {
        return jobs.get("orderJob")
                .incrementer(new RunIdIncrementer())
                .start(orderStep(orderReader(), orderWriter()))
                .build();
    }
@Scheduled(cron = "0 0/15 * * * ?")
public void scheduleUserJob() throws JobExecutionException {
Set<JobExecution> runningJob = jobExplorer.findRunningJobExecutions("userJob");
if (!runningJob.isEmpty()) {
throw new JobExecutionException(" User Job is already in Start State ");
}
JobParameters userParam =
new JobParametersBuilder().addLong("date", System.currentTimeMillis())
.toJobParameters();
jobLauncher.run(userJob(), userParam);
}
@Scheduled(cron = "0 0/15 * * * ?")
public void scheduleOrderJob() throws JobExecutionException {
Set<JobExecution> runningJob = jobExplorer.findRunningJobExecutions("orderJob");
if (!runningJob.isEmpty()) {
throw new JobExecutionException(" Order Job is already in Start State ");
}
JobParameters orderParam =
new JobParametersBuilder().addLong("date", System.currentTimeMillis())
.toJobParameters();
jobLauncher.run(orderJob(), orderParam);
}
@Scheduled(cron = "0 0/30 * * * ?")
public void scheduleCleanupMemoryJob() throws BatchException {
Set<JobExecution> orderRunningJob = jobExplorer.findRunningJobExecutions("orderJob");
Set<JobExecution> userRunningJob = jobExplorer.findRunningJobExecutions("userJob");
if (!orderRunningJob.isEmpty() || !userRunningJob.isEmpty()) {
throw new BatchException(" Order/user Job is running state , cleanup job is aborted ");
}
mapJobRepositoryFactoryBean.clear();
}
}
I have two Jobs scheduled every 15 minutes that perform some business logic, and I have scheduled an in-memory cleanup job that clears the job data held by the "mapJobRepositoryFactoryBean" bean, but only if neither of the two Jobs is in a running state.
I am looking for suggestions on the best approach to delete old Jobs that have already executed; with the approach above, the old Job details are never deleted as long as either Job happens to be running when the cleanup fires.
Alternatively, is there any API in spring-batch to clear a specific Job's details from memory once that Job has executed, i.e. clear the in-memory data by JobId?
Note: I want to stick with MapJobRepositoryFactoryBean only, not a persistent database or an embedded database (H2).
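For reference, the only variant I have come up with myself is a sketch of the cleanup method that skips and retries on the next scheduled run instead of aborting with an exception, so the maps at least get cleared eventually. It reuses the same jobExplorer and mapJobRepositoryFactoryBean beans from above, and as far as I can tell it is still all-or-nothing, since MapJobRepositoryFactoryBean only seems to expose clear(), nothing per JobId:
@Scheduled(cron = "0 0/30 * * * ?")
public void scheduleCleanupMemoryJob() {
    boolean anyJobRunning = !jobExplorer.findRunningJobExecutions("orderJob").isEmpty()
            || !jobExplorer.findRunningJobExecutions("userJob").isEmpty();
    if (anyJobRunning) {
        // skip this run instead of throwing; the next 30-minute tick retries the cleanup
        log.info("orderJob/userJob still running, skipping cleanup until next schedule");
        return;
    }
    // still wipes everything: clear() is the only cleanup method I can find on MapJobRepositoryFactoryBean
    mapJobRepositoryFactoryBean.clear();
}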