Jul 8th, 2011, 08:31 AM
Spring Batch: Step Context Memory Is Creeping to Max Limit in Chunk-Oriented Process
I have implemented a batch job using Spring Batch 2.1.
The job functionality:
1) Read data from a database
2) Process the data
3) Write it into a different database
I have implemented a remote chunk processing mechanism, with multiple threads processing each chunk, to achieve better performance. For faster processing, the batch data is stored in memory between steps.
Problem: the batch job is launched by the job launcher and then runs in a loop. After a couple of cycles through the loop, the memory used by the batch job slowly creeps up and is never released. After around 2 hours of running, the memory allocated to the process reaches its maximum limit and the batch application exits with an Out of Memory error.
I have used profiler tools to analyse the problem and identified that some of the context objects are sitting in memory and the context is never cleared. I use a ConcurrentLinkedQueue to store the items to be processed.
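For what it's worth, items parked in a shared collection such as a ConcurrentLinkedQueue stay strongly reachable until they are explicitly removed, so the garbage collector cannot reclaim them even after the step that produced them finishes. A minimal plain-Java sketch of that mechanic (class and method names are illustrative, not from the actual job):

```java
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative only: shows that items left in a shared queue remain
// strongly reachable (and so cannot be garbage collected) until they
// are removed, e.g. by draining the queue with poll().
public class QueueDrainDemo {

    static final ConcurrentLinkedQueue<byte[]> pending = new ConcurrentLinkedQueue<>();

    // Simulates a loading step parking chunk data in memory.
    static void load(int chunks) {
        for (int i = 0; i < chunks; i++) {
            pending.add(new byte[1024 * 1024]); // ~1 MB per item
        }
    }

    // Simulates a consuming step: after poll() returns an item and the
    // local reference goes out of scope, nothing pins it in memory.
    static int drain() {
        int processed = 0;
        while (pending.poll() != null) {
            processed++;
        }
        return processed;
    }

    public static void main(String[] args) {
        load(5);
        System.out.println("queued=" + pending.size());
        System.out.println("processed=" + drain());
        System.out.println("remaining=" + pending.size());
    }
}
```

If the consuming step reads the items without removing them (or the queue itself stays referenced by a promoted context entry), every loop iteration adds more unreclaimable data, which matches the slow creep described above.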
After loading the data from the database, I store it in the step execution context so it can be passed between steps. The step context data is promoted between steps using a StepExecutionListenerSupport, and the next step then reads the data back out of the execution context.
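For reference, this kind of promotion is usually done either with the stock ExecutionContextPromotionListener (configured with the keys to copy) or with a hand-rolled StepExecutionListenerSupport subclass along these lines (a sketch only; the key name "loadedItems" is illustrative, not from the actual job):

```java
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;

// Sketch of a promotion listener: after the step finishes, copy an entry
// from the step's ExecutionContext into the job's ExecutionContext so
// the next step can read it. The key "loadedItems" is illustrative.
public class PromotionListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        Object data = stepExecution.getExecutionContext().get("loadedItems");
        if (data != null) {
            stepExecution.getJobExecution().getExecutionContext()
                         .put("loadedItems", data);
        }
        return null; // null keeps the step's own exit status
    }
}
```

Note that anything put into an ExecutionContext is also held by the JobRepository, which is relevant to the memory behaviour discussed in this thread.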
Below is the step and job configuration:
<batch:job id="Job1" job-repository="jobRepository">
    <batch:step id="dataLoading">
        <batch:tasklet transaction-manager="transactionManager" allow-start-if-complete="true"
                       task-executor="dataLoadingStepTaskExecutor" throttle-limit="5">
            <batch:chunk reader="loadingReader" processor="loadingProcessor"
                         writer="loadingWriter" commit-interval="5"/>
        </batch:tasklet>
        <batch:next on="END" to="dataLoading"/>
        <batch:next on="*" to="processing"/>
    </batch:step>
    <batch:step id="processing" next="inserting">
        <batch:tasklet allow-start-if-complete="true"
                       task-executor="dataProcessingStepTaskExecutor" throttle-limit="5">
            <batch:chunk reader="processingReader" processor="dataProcessor"
                         writer="processingWriter"
                         chunk-completion-policy="dataProcessingPolicy"/>
        </batch:tasklet>
    </batch:step>
    <batch:step id="inserting">
        <batch:tasklet transaction-manager="transactionManager" allow-start-if-complete="true"
                       task-executor="dataInsertingStepTaskExecutor" throttle-limit="5">
            <batch:chunk reader="insertingReader" processor="insertingProcessor"
                         writer="insertingWriter" commit-interval="5"/>
        </batch:tasklet>
        <batch:end on="*" exit-code="COMPLETED"/>
    </batch:step>
</batch:job>
Can somebody explain how Spring Batch handles clearing the step context and freeing that memory?
Jul 11th, 2011, 02:36 AM
All objects created directly by Spring Batch should be garbage collected as long as the Job executions complete in some way. Are you using an in-memory repository?
Jul 11th, 2011, 03:18 AM
I am using the in-memory repository.
I keep data in the step context and promote it to the next step using promotion listeners. My job runs in a loop, and each iteration processes a chunk.
I believe some of that memory is still not being released.
I have since changed my job configuration to keep the data in the job context instead of the step context, which does not require any listener promotion because the data is then available to all steps in the job. This worked fine; the problem is solved.
But can you explain why promoting data through the step context makes memory creep up?
Jul 11th, 2011, 05:32 AM
Because you are storing all your intermediate state in the JobRepository, which itself is in memory. You need to use an external repository, or store the state somewhere else (like your own volatile storage). Or, I guess, purge the in-memory repository manually.
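One way to purge manually, assuming the data was promoted under a known key, is to remove that entry from the job's ExecutionContext once the last consuming step is done. A sketch (the key name "loadedItems" is illustrative, and whether this fully releases memory depends on how the in-memory repository copies context data):

```java
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;

// Sketch: attach this listener to the last step that consumes the
// promoted data. Removing the entry drops the context's reference to
// it, so the items become eligible for garbage collection.
// The key "loadedItems" is illustrative.
public class CleanupListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        stepExecution.getJobExecution().getExecutionContext().remove("loadedItems");
        return null; // null keeps the step's own exit status
    }
}
```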