Sep 14th, 2011, 07:48 AM
Thread safety of job and step execution contexts
I have a job that may have multiple executions running concurrently. I need to share data between steps in the job. I'm seeing threading issues with putting objects in the step execution contexts and promoting them to the job execution context.
Could someone briefly explain the threading model for the execution contexts? Are they shared or separate? If they are not safe to use across concurrent job executions, what is the recommended way to share data between steps in this case? And how do I ensure that the data gets persisted properly in the batch tables as well (for restart, etc.)?
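For context, the promotion mechanism referred to above is typically wired up with an ExecutionContextPromotionListener registered on the step. A minimal sketch in XML config (the bean id and the "sharedData" key are placeholders):

```xml
<!-- Copies the named keys from the step execution context into the
     job execution context when the step completes. -->
<bean id="promotionListener"
      class="org.springframework.batch.core.listener.ExecutionContextPromotionListener">
    <property name="keys" value="sharedData"/>
</bean>
```

The listener would then be referenced from the step's listeners element so the promoted keys become visible to subsequent steps.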
Sep 14th, 2011, 08:18 AM
I think the answer to my issue may be to add scope="step" to all my readers, processors, and writers. If someone could confirm or deny that, I'd appreciate it.
Sep 14th, 2011, 09:58 AM
The step scope can indeed help you to confine the state of job artifacts (readers, writers, listeners) to a specific execution. Each step-scoped object is instantiated per step execution and exists only for the duration of that execution. This should avoid collisions between concurrent executions.
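As an illustration, a step-scoped reader can also use late binding to pull per-execution values out of the context. A sketch (the bean id, reader class, and context key are placeholders):

```xml
<!-- A new reader instance is created for each step execution, and the
     resource is resolved from that execution's context at creation time. -->
<bean id="itemReader"
      class="org.springframework.batch.item.file.FlatFileItemReader"
      scope="step">
    <property name="resource" value="#{stepExecutionContext['inputFile']}"/>
    <property name="lineMapper" ref="lineMapper"/>
</bean>
```

Because each execution gets its own instance, state held in the reader's fields cannot leak between concurrent executions.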
Sep 15th, 2011, 11:53 AM
FYI, I changed all my beans to scope="step" and it started working.
Sep 22nd, 2011, 06:42 PM
What about the case of reading multiple XML files in parallel using partitioning? All the partitioned steps share the same ItemProcessor bean. While processing each chunk item I need to store some information in the stepExecutionContext, and it has to be different for each file, i.e. for each step. What do I do in this situation? Since the StepExecution is separate for each step, why doesn't ItemProcessor have a method like
public String process(Object item, StepExecution stepExecution) so that it would be thread safe? As it stands, the StepExecution is only available in a listener via @BeforeStep, and it has to be saved to an instance variable to be used during processing, which is not thread safe. Please let me know if there is a better way to handle this.
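One likely answer, following the earlier posts in this thread: declare the processor itself with scope="step". Then each partition's step execution gets its own processor instance, so the StepExecution field saved in the @BeforeStep method is per-instance rather than shared, and the thread-safety problem goes away. A sketch in XML config (bean names and the processor class are placeholders):

```xml
<!-- One processor instance per partition: the @BeforeStep-injected
     StepExecution field is private to that partition's execution. -->
<bean id="fileProcessor" class="com.example.FileItemProcessor" scope="step"/>

<step id="readFileStep">
    <tasklet>
        <chunk reader="fileReader" processor="fileProcessor"
               writer="fileWriter" commit-interval="100"/>
    </tasklet>
</step>
```

The reader and writer referenced by each partition would usually be step-scoped as well, for the same reason.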