Jul 12th, 2011, 10:49 PM
Performance implications: high-volume transactions with frequent database lookups on input
I am trying to read 3,000,000 records from database tables and process each record one by one.
For each record read, I have to do an additional lookup in a database table, and based on the result of that lookup I want to create the output records.
These additional lookups for every record will definitely cause performance issues.
So instead of querying the database for each record, I will cache the record in memory and wait until I get the next input record containing the desired information.
Now my question is: is Spring Batch an appropriate answer for this problem?
If yes, how do I skip records read from the input before they are written to the output?
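(On the skipping part of the question: in Spring Batch, an `ItemProcessor` that returns `null` causes the item to be filtered out, so it never reaches the writer. A minimal self-contained sketch; the `InputRecord` type and the "already seen" criterion are hypothetical stand-ins, and the interface below mimics `org.springframework.batch.item.ItemProcessor<I, O>` so the example runs without the library:)

```java
import java.util.HashSet;
import java.util.Set;

// Stand-in for org.springframework.batch.item.ItemProcessor<I, O>:
// in Spring Batch, returning null from process() filters the item out.
interface ItemProcessor<I, O> {
    O process(I item) throws Exception;
}

// Hypothetical input record type for illustration.
record InputRecord(long id, String payload) {}

// Processor that filters out records whose id was already processed.
class DuplicateFilteringProcessor implements ItemProcessor<InputRecord, InputRecord> {
    private final Set<Long> alreadyProcessed = new HashSet<>();

    @Override
    public InputRecord process(InputRecord item) {
        // Set.add returns false if the id was already present;
        // returning null tells the framework to skip this item.
        if (!alreadyProcessed.add(item.id())) {
            return null;
        }
        return item;
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        DuplicateFilteringProcessor p = new DuplicateFilteringProcessor();
        System.out.println(p.process(new InputRecord(1, "a")) != null); // kept
        System.out.println(p.process(new InputRecord(1, "b")) == null); // filtered
    }
}
```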
Jul 15th, 2011, 04:01 AM
Spring Batch isn't going to implement your business logic, or your caching (there are other tools for that). It's not really clear to me what it is you propose to cache. Is it just a case of using a service or DAO abstraction to do your lookups, and putting a cache layer in place in front of that? (Not really anything to do with Spring Batch.)
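(One way to read that suggestion: wrap the lookup DAO in a caching decorator, so the batch step never knows the cache exists. A sketch in plain Java, assuming a hypothetical `CustomerDao` lookup abstraction; `ConcurrentHashMap.computeIfAbsent` does the memoization:)

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical lookup abstraction used by the batch step.
interface CustomerDao {
    String findNameById(long id);
}

// Decorator that memoizes lookups so each key hits the database at most once.
class CachingCustomerDao implements CustomerDao {
    private final CustomerDao delegate;
    private final Map<Long, String> cache = new ConcurrentHashMap<>();

    CachingCustomerDao(CustomerDao delegate) {
        this.delegate = delegate;
    }

    @Override
    public String findNameById(long id) {
        // computeIfAbsent invokes the delegate only on a cache miss.
        return cache.computeIfAbsent(id, delegate::findNameById);
    }
}
```

With 3,000,000 input records an unbounded map can grow large, so in practice you would bound the cache (a dedicated caching library such as Caffeine or Ehcache handles eviction), but the decorator shape stays the same.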