I just switched from triggering a Spring Batch job through a REST endpoint to running it from the command line, and removed the controller class that held the JobLauncher.
Now I am trying to run the job with the following command:
java -Dspring.batch.job.names="SOME JOBNAME" -jar somejar.jar
I now see run.id and time job parameters being passed by default and recorded in the BATCH_JOB_EXECUTION_PARAMS table.
I didn't see these parameters being passed by default when I triggered the job through the REST endpoint launcher. Moreover, the same values are passed for both parameters on every run: run.id=1 and time=1612515999654.
As expected, this results in a JobInstanceAlreadyCompleteException.
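For reference, my understanding is that Spring Batch identifies a job instance by the job name plus the identifying job parameters, so launching again with identical run.id and time values hits the same completed instance. A simplified plain-Java sketch of that idea (the InstanceKey record and the Map representation are mine, not Spring Batch's API):

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class InstanceIdentityDemo {
    // Simplified model: a job instance is keyed by the job name plus its
    // identifying parameters (the values below are from my actual runs).
    record InstanceKey(String jobName, Map<String, String> identifyingParams) {}

    public static void main(String[] args) {
        Set<InstanceKey> completedInstances = new HashSet<>();

        // First launch: the instance is new, runs, and completes.
        completedInstances.add(new InstanceKey("SOME JOBNAME",
                Map.of("run.id", "1", "time", "1612515999654")));

        // Second launch with the SAME parameter values produces an equal key,
        // which is (roughly) the situation that triggers
        // JobInstanceAlreadyCompleteException in the real framework.
        InstanceKey second = new InstanceKey("SOME JOBNAME",
                Map.of("run.id", "1", "time", "1612515999654"));
        System.out.println(completedInstances.contains(second)); // prints true
    }
}
```

So the surprising part is not the exception itself, but why the parameter values never change between runs.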
My job bean is something like this:
@Bean
public Job job() throws Exception {
    return this.jobBuilderFactory.get("SOME JOBNAME")
            .incrementer(new RunIdIncrementer())
            .start(someStep())
            .build();
}
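My understanding of RunIdIncrementer is that it takes the parameters of the most recent execution and bumps run.id by one, starting at 1 when there is no previous run. A plain-Java sketch of that behaviour (the next method and the Map representation are mine for illustration, not Spring Batch's API):

```java
import java.util.HashMap;
import java.util.Map;

public class RunIdIncrementerSketch {
    // Simplified mimic of what I believe RunIdIncrementer.getNext() does:
    // copy the previous run's parameters and increment run.id by 1,
    // defaulting to 1 when no previous parameters exist.
    static Map<String, Long> next(Map<String, Long> previous) {
        Map<String, Long> params =
                previous == null ? new HashMap<>() : new HashMap<>(previous);
        params.merge("run.id", 1L, Long::sum);
        return params;
    }

    public static void main(String[] args) {
        Map<String, Long> first = next(null);   // no previous run -> run.id=1
        Map<String, Long> second = next(first); // next run -> run.id=2
        System.out.println(first.get("run.id") + " " + second.get("run.id")); // prints 1 2
    }
}
```

If that understanding is right, run.id staying at 1 on every launch would mean the incrementer never sees the previous run's parameters at all.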
I have the createJobRepository() method overridden by extending the DefaultBatchConfigurer class, because my batch metadata tables live in a different SQL Server schema than the default dbo schema.
@Override
protected JobRepository createJobRepository() throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setTablePrefix("someSchema" + ".BATCH_");
    factory.afterPropertiesSet();
    return factory.getObject();
}
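For completeness, one thing I have been wondering about (this is NOT in my current code): my override only changes the repository's table prefix, so other components such as the JobExplorer might still be reading the default dbo tables. Assuming JobExplorerFactoryBean accepts the same prefix configuration as JobRepositoryFactoryBean, a hypothetical matching override would look like:

```java
// Hypothetical sketch, not part of my current configuration: point the
// JobExplorer at the same schema-qualified tables as the JobRepository.
@Override
protected JobExplorer createJobExplorer() throws Exception {
    JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTablePrefix("someSchema" + ".BATCH_");
    factory.afterPropertiesSet();
    return factory.getObject();
}
```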
When I remove the createJobRepository() override and thus use the batch metadata tables in the default dbo schema, only run.id is passed by default, and it is incremented as expected; no time parameter is passed in that case.
My questions are:

- Why are the time and run.id job parameters being passed by default when I trigger the job from the command line?
- Why is the same value passed for both parameters on every run, even though I am using new RunIdIncrementer()?
- How does the createJobRepository() override make the difference?