I'd go for the second approach, suggested by Faiz Pinkman. It's simply closer to the way Spring Batch works.
first step
- reader for your simple sql -> use the standard db reader
- processor -> your own implementation of your simple logic
- writer to a file -> use the standard FlatFileItemWriter
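The first step could look roughly like this in Spring Batch 5 style Java config. This is a sketch, not your actual code: `MyRow`, the SQL, the delimiter logic in the processor, and the file path are all placeholders you'd replace with your own.

```java
// Sketch of step 1: db reader -> simple processor -> FlatFileItemWriter.
// MyRow, the SQL and the output path are hypothetical placeholders.
@Bean
public Step simpleExportStep(JobRepository jobRepository,
                             PlatformTransactionManager txManager,
                             DataSource dataSource) {
    JdbcCursorItemReader<MyRow> reader = new JdbcCursorItemReaderBuilder<MyRow>()
            .name("simpleReader")
            .dataSource(dataSource)
            .sql("SELECT id, name FROM my_table")                 // placeholder SQL
            .rowMapper(new BeanPropertyRowMapper<>(MyRow.class))
            .build();

    FlatFileItemWriter<String> writer = new FlatFileItemWriterBuilder<String>()
            .name("simpleFileWriter")
            .resource(new FileSystemResource("out/simple.txt"))   // placeholder path
            .lineAggregator(new PassThroughLineAggregator<>())
            .build();

    return new StepBuilder("simpleExportStep", jobRepository)
            .<MyRow, String>chunk(100, txManager)
            .reader(reader)
            .processor(row -> row.getId() + ";" + row.getName())  // your simple logic
            .writer(writer)
            .build();
}
```

If you are still on an older Spring Batch version, the same structure works with `StepBuilderFactory` instead of `new StepBuilder(...)`.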
second step
I don't understand exactly what you mean by "process update and insert logic behind". I assume that you read data from a db and, based on that data, you have to execute inserts and updates in a table.
- reader for your more complex data -> again, use the standard db reader
- processor ->
- prepare the string for the text file
  - prepare the new inserts and updates
- writer -> use a composite writer with the following delegates
  - FlatFileItemWriter for your text file
- DbWriter depending on your inserts and update needs
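A possible shape for that composite writer, again as a sketch: `ComplexItem` is a hypothetical holder type that your processor would produce, carrying both the prepared file line and the values for the db statement; the SQL is a placeholder you'd replace with your real insert/update logic.

```java
// Sketch of the step-2 writer: one CompositeItemWriter with a file delegate
// and a db delegate. ComplexItem and the SQL are hypothetical placeholders.
@Bean
public CompositeItemWriter<ComplexItem> complexWriter(DataSource dataSource) throws Exception {
    FlatFileItemWriter<ComplexItem> fileWriter = new FlatFileItemWriterBuilder<ComplexItem>()
            .name("complexFileWriter")
            .resource(new FileSystemResource("out/complex.txt"))  // placeholder path
            .lineAggregator(ComplexItem::getFileLine)             // the prepared string
            .build();

    JdbcBatchItemWriter<ComplexItem> dbWriter = new JdbcBatchItemWriterBuilder<ComplexItem>()
            .dataSource(dataSource)
            // placeholder statement; use separate delegates if your database
            // needs distinct INSERT and UPDATE handling
            .sql("UPDATE target_table SET value = :value WHERE id = :id")
            .beanMapped()
            .build();
    dbWriter.afterPropertiesSet();

    return new CompositeItemWriterBuilder<ComplexItem>()
            .delegates(fileWriter, dbWriter)
            .build();
}
```

One practical note: since the `FlatFileItemWriter` is hidden behind the composite, register it as a stream on the step (`.stream(fileWriter)`) so its file handle is opened and closed correctly.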
This way, you have clear transaction boundaries and can be sure that the content of the file and the inserts and updates are "in sync".
note: first and second step can run in parallel
third step
- reader -> use a MultiResourceItemReader to read from the two files
- writer -> use a FlatFileItemWriter to write both contents into one file
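A sketch of that merge step, assuming the two intermediate files from steps 1 and 2; the paths and chunk size are placeholders.

```java
// Sketch of step 3: MultiResourceItemReader over the two intermediate files,
// passing lines through unchanged into one combined file. Paths are placeholders.
@Bean
public Step mergeStep(JobRepository jobRepository, PlatformTransactionManager txManager) {
    MultiResourceItemReader<String> reader = new MultiResourceItemReaderBuilder<String>()
            .name("mergeReader")
            .resources(new FileSystemResource("out/simple.txt"),
                       new FileSystemResource("out/complex.txt"))
            .delegate(new FlatFileItemReaderBuilder<String>()
                    .name("lineReader")
                    .lineMapper((line, lineNumber) -> line)  // pass each line through as-is
                    .build())
            .build();

    FlatFileItemWriter<String> writer = new FlatFileItemWriterBuilder<String>()
            .name("mergeWriter")
            .resource(new FileSystemResource("out/combined.txt"))
            .lineAggregator(new PassThroughLineAggregator<>())
            .build();

    return new StepBuilder("mergeStep", jobRepository)
            .<String, String>chunk(100, txManager)
            .reader(reader)
            .writer(writer)
            .build();
}
```

The delegate reader is deliberately built without a resource; the `MultiResourceItemReader` sets the current resource on it for each file in turn.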
Of course, if you don't need the content in one file, you can skip step 3.
You could also execute steps 1 and 2 one after the other and write to the same file. But depending on the execution times of steps 1 and 2, that could perform worse than running them in parallel and using a third step to combine the data.
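Running steps 1 and 2 in parallel is usually done with a split flow. A sketch, assuming three step beans named `simpleExportStep`, `complexStep` and `mergeStep` defined elsewhere:

```java
// Sketch of a job that runs the first two steps in parallel, then merges.
// The three step beans are assumed to exist under these hypothetical names.
@Bean
public Job exportJob(JobRepository jobRepository,
                     Step simpleExportStep, Step complexStep, Step mergeStep) {
    Flow flow1 = new FlowBuilder<SimpleFlow>("flow1").start(simpleExportStep).build();
    Flow flow2 = new FlowBuilder<SimpleFlow>("flow2").start(complexStep).build();

    Flow parallel = new FlowBuilder<SimpleFlow>("parallel")
            .split(new SimpleAsyncTaskExecutor())  // each flow gets its own thread
            .add(flow1, flow2)
            .build();

    return new JobBuilder("exportJob", jobRepository)
            .start(parallel)
            .next(mergeStep)   // runs only after both flows have finished
            .end()
            .build();
}
```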