Feb 16th, 2013, 12:18 PM
Converting old job
Hi, I'm trying to convert some old jobs to use Spring Batch. Unfortunately the old jobs are somewhat monolithic and are quite tightly coupled, so pulling the pieces out to fit into the read-write pattern has been tricky.
The job flow is like this:
1) Update header record status to in-process
2) Read detail records
3) Process/write detail records
4) Update header and send an email noting results (this requires counters accumulated in the previous phase)
The issue is with steps 1 and 4. Where does that logic belong? Reader? Writer? BeforeStep/AfterStep? Somewhere else? It's not really before or after the step, it's part of the step; it just happens after the reads/writes are done.
Thanks for any help/advice.
Feb 19th, 2013, 09:27 AM
Is this the processing of a flat file? If so, do you actually update the header record in the file or generate a new file with a new header record?
Feb 19th, 2013, 09:36 AM
Actually it's processing a database "header" record, as we call it, from the db, and it does update that record. The header record is like a parent record holding summary information for a process; after it is updated, the actual detail db records get processed.
Thanks again for your help.
Feb 19th, 2013, 09:46 AM
Ah. That makes life much easier. In that case, I would recommend a StepExecutionListener (assuming 1-4 are all executed within a single step). You can update the header record in the StepExecutionListener#beforeStep method and again in StepExecutionListener#afterStep.
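A minimal sketch of that listener, assuming a hypothetical HeaderDao and status values standing in for your own data access code (the StepExecutionListener callbacks and the StepExecution counters are real Spring Batch API; everything else here is a placeholder):

```java
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

public class HeaderStatusListener implements StepExecutionListener {

    // Hypothetical DAO for the header record; substitute your own repository
    private final HeaderDao headerDao;

    public HeaderStatusListener(HeaderDao headerDao) {
        this.headerDao = headerDao;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // Phase 1: mark the header record in-process before any reads happen
        headerDao.updateStatus("IN_PROCESS");
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // Phase 4 (first half): update the header using the counters
        // Spring Batch accumulated while processing the detail records
        headerDao.updateResults("COMPLETE",
                stepExecution.getReadCount(),
                stepExecution.getWriteCount());
        return stepExecution.getExitStatus();
    }
}
```

One nice side effect of pulling the counts from StepExecution is that you don't have to thread your own counters through the reader/writer; the framework already tracks read, write, and skip counts for you.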
With regards to number 4 specifically, I would actually break that up into two things. First do all your processing (1-3 and the update of the header record in 4). I would then use a second step to send the emails. This allows you to consider the processing of the records as complete (and not need to rerun that logic) even if the sending of the email fails.
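The two-step layout could look something like this in XML config (step names, bean ids, and the email tasklet are all placeholders for illustration):

```xml
<!-- Step 1 does phases 1-4a via the chunk plus the header listener;
     step 2 sends the email, so a mail failure doesn't force a rerun
     of the record processing. -->
<job id="detailProcessingJob">
    <step id="processDetails" next="sendEmail">
        <tasklet>
            <chunk reader="detailReader" writer="detailWriter"
                   commit-interval="10"/>
        </tasklet>
        <listeners>
            <listener ref="headerStatusListener"/>
        </listeners>
    </step>
    <step id="sendEmail">
        <tasklet ref="emailTasklet"/>
    </step>
</job>
```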