Feb 17th, 2013, 05:36 PM
Performance not improving with partitioning
I have a requirement where we receive huge files, on the order of multiple GB, and the processing of each file is considered a single job. The records read have to be pushed to a web service or to a JMS queue depending on the job type. After looking at the scaling section of the Spring Batch documentation, we decided that the partitioning approach best fits the bill. To that effect, I ran some performance tests: first processing the single file and measuring throughput over a 5-minute sample, then splitting the input file into 2 files of equal record counts and running the same test with partitioning. However, I did not notice any significant improvement in the total records processed over the 5-minute sample. I use a FlatFileItemReader and a chunk size of 50, if anybody is interested. Has anybody seen significant improvements with this method, using a large file split by a Linux script? Any help appreciated!
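For reference, the setup described above can be sketched in plain Java (outside Spring Batch) as one worker thread per split file, each reading its partition in chunks of 50. This is a hypothetical illustration of the partitioning idea, not the actual job configuration; the file contents and record counts are made up:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PartitionSketch {
    // Process one partition file in chunks of 50, roughly like a
    // FlatFileItemReader-backed step would.
    static int processPartition(Path file, int chunkSize) throws IOException {
        List<String> records = Files.readAllLines(file);
        int processed = 0;
        for (int i = 0; i < records.size(); i += chunkSize) {
            List<String> chunk =
                records.subList(i, Math.min(i + chunkSize, records.size()));
            // In the real job each chunk would be pushed to a web service
            // or a JMS queue here; this sketch just counts records.
            processed += chunk.size();
        }
        return processed;
    }

    public static void main(String[] args) throws Exception {
        // Simulate splitting one large input into 2 partition files
        // of equal record counts (500 records each, made up for the demo).
        List<Path> parts = new ArrayList<>();
        for (int p = 0; p < 2; p++) {
            Path f = Files.createTempFile("part" + p, ".txt");
            List<String> lines = new ArrayList<>();
            for (int i = 0; i < 500; i++) lines.add("record-" + p + "-" + i);
            Files.write(f, lines);
            parts.add(f);
        }

        // One worker thread per partition, mirroring a partitioned step.
        ExecutorService pool = Executors.newFixedThreadPool(parts.size());
        List<Future<Integer>> results = new ArrayList<>();
        for (Path f : parts) {
            results.add(pool.submit(() -> processPartition(f, 50)));
        }

        int total = 0;
        for (Future<Integer> r : results) total += r.get();
        pool.shutdown();
        System.out.println("total processed: " + total); // prints "total processed: 1000"
    }
}
```

Note that if the downstream web service or JMS queue (or the disk) is the bottleneck, adding a second reader thread like this won't raise throughput, which may explain the flat 5-minute numbers.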