Dec 17th, 2012, 03:47 AM
Step processing in Chunks Vs Bulk
Consider the following requirement:
1. Data size to read: 1 lakh (100,000) records
2. The data must be read and then processed to group the unique portions together
For example, the step results in:
Total read: 100K records; processed and grouped final count: 80K
Which approach will be better?
1. ItemReader and ItemWriter with a commit interval of 10K?
2. A Tasklet with a customized DAO implementation and handler to read the data and then group it for further processing?
What is your opinion?
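To make the grouping step concrete, here is a minimal sketch of the "group the unique portion together" logic in plain Java. The `Item` record and the merge-by-summing rule are assumptions for illustration only; in a chunk-oriented step this logic would typically live in an ItemProcessor or the writer, while a Tasklet could run it over the whole data set in one pass.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingSketch {
    // Hypothetical record; "key" stands for the unique portion of each row
    record Item(String key, int value) {}

    // Collapse duplicate keys into one entry, merging values by summing.
    // 100K input items with 80K unique keys would yield an 80K-entry map.
    static Map<String, Integer> group(List<Item> items) {
        return items.stream()
                .collect(Collectors.toMap(Item::key, Item::value, Integer::sum));
    }

    public static void main(String[] args) {
        List<Item> read = List.of(
                new Item("A", 1), new Item("B", 2), new Item("A", 3));
        Map<String, Integer> grouped = group(read);
        System.out.println(grouped.size()); // 3 records read, 2 unique keys
    }
}
```

Note that with a 10K commit interval, each chunk only sees its own 10K records, so grouping that must span the full data set would need state carried across chunks (or a staging table), whereas a Tasklet can hold the whole map at once.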