For example, the following bean types are in my parent context:
These bean types are in my prototype context:
Chudak, Thanks for the reply.
You had mentioned that we need not explicitly specify "prototype" scope in the bean definitions since we are reloading the context for each run. I just wanted to know whether this will cause any problem, as it will keep creating these objects again and again for every job. Will they get garbage collected once the job has run?
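Just to make sure I understand the pattern, is it roughly the following? Each run builds a fresh child context against the shared parent and closes it afterwards. (A rough sketch only; the file names, bean id, and class name here are made up.)

Code:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class ReloadPerRunSketch {

    // Shared, long-lived beans (data sources, DAOs) live in the parent.
    private final ApplicationContext parent =
            new ClassPathXmlApplicationContext("parent-context.xml");

    public void runOnce(JobLauncher launcher, JobParameters params)
            throws Exception {
        // A fresh child context per run; the job beans exist only for
        // this run, even though they are plain singletons in the XML.
        ClassPathXmlApplicationContext child =
                new ClassPathXmlApplicationContext(
                        new String[] { "job-context.xml" }, parent);
        try {
            Job job = (Job) child.getBean("job");
            launcher.run(job, params);
        } finally {
            // close() destroys the child's singletons; once nothing else
            // references them, they are eligible for garbage collection.
            child.close();
        }
    }
}

If that's right, I would expect the objects from each run to become collectable as soon as that run's child context is closed.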
I have configured my application according to your posting. My job processes about 600k records per run, loading a fixed-length flat file of about 300MB.
After the 4th run, I got this out-of-memory error. I have started my Tomcat server with -Xmx1024m.

Code:
java.lang.OutOfMemoryError: Java heap space
    at org.apache.catalina.loader.WebappClassLoader.findResourceInternal(WebappClassLoader.java:2053)
    at org.apache.catalina.loader.WebappClassLoader.findResource(WebappClassLoader.java:934)
    at org.apache.catalina.loader.WebappClassLoader.getResource(WebappClassLoader.java:1069)
    at org.springframework.core.io.ClassPathResource.getURL(ClassPathResource.java:159)
    at org.springframework.core.io.ClassPathResource.getFile(ClassPathResource.java:174)
    at org.springframework.core.io.AbstractResource.exists(AbstractResource.java:51)
    at org.springframework.batch.core.resource.StepExecutionResourceProxy.exists(StepExecutionResourceProxy.java:112)
    at org.springframework.batch.item.file.FlatFileItemReader.doOpen(FlatFileItemReader.java:226)
    at org.springframework.batch.item.support.AbstractBufferedItemReaderItemStream.open(AbstractBufferedItemReaderItemStream.java:154)
    at org.springframework.batch.item.support.CompositeItemStream.open(CompositeItemStream.java:103)
    at org.springframework.batch.core.step.item.ItemOrientedStep.open(ItemOrientedStep.java:462)
    at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:167)
    at org.springframework.batch.core.job.SimpleJob.execute(SimpleJob.java:100)
    at org.springframework.batch.core.configuration.support.ClassPathXmlApplicationContextJobFactory$ContextClosingJob.execute(ClassPathXmlApplicationContextJobFactory.java:107)
    at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:86)
    at java.lang.Thread.run(Thread.java:619)
If I run the job from the command line (in a separate JVM), the process is fine for any number of runs.
Do you have any advice on where the issue might be?
But when you run from the command line, it only processes ONE run each time you run it, correct (versus multiple runs inside the Tomcat container)?
I run one job each time in the Tomcat container too. I am using some beans from the parent context for data access within a step. Is there a way to check whether the child context is really destroyed?
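For instance, would it be a sensible check to drop a listener bean like this into the child context and watch the log for the close? (Just a sketch; the class name is made up.)

Code:
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextClosedEvent;

// Declared as a bean in the child context only; logs when that
// context is actually closed.
public class ContextCloseLogger implements ApplicationListener {
    public void onApplicationEvent(ApplicationEvent event) {
        if (event instanceof ContextClosedEvent) {
            System.out.println("Child context closed: " + event.getSource());
        }
    }
}

Though I guess even if close() fires, something in the web app could still be holding references to the old beans.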
Here's the thing: if you set your context up like I suggested, there should only be a handful of beans in the child context. Running your job a half dozen times shouldn't create enough extra objects to blow the heap. Sounds like you may have a memory leak. I'd suggest that you run a profiler against your application and see where the objects are coming from...
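If you don't have a full profiler handy, the standard HotSpot tooling will at least tell you what's filling the heap (nothing Spring-specific here; adjust the paths and pid to your setup):

Code:
# Dump the heap automatically when the OOM happens:
CATALINA_OPTS="-Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp"

# Or take a class histogram of the live Tomcat process between runs:
jmap -histo:live <tomcat-pid>

If the same classes keep growing in the histogram run after run, that's where your leak is.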
BTW, what version of Spring Batch are you using?