Performance implications: high-volume transactions with frequent database lookups on input
I am trying to read 3,000,000 records from database tables and process each record one by one.
For each record read, I have to do an additional lookup in a database table, and based on the result of that lookup I want to create the output records.
An additional lookup made for each record will definitely cause performance issues.
So instead of querying the database for each record, I plan to cache the record in memory and wait until a later input record arrives that contains the desired information.
My question is: is Spring Batch an appropriate answer to this problem?
If yes, how do I skip the records read from the input so that they are not written to the output?
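To make the caching-and-skipping idea concrete, here is a rough sketch of what I have in mind. The record types and the `CachingProcessor` class are hypothetical names of my own; in Spring Batch this logic would presumably live in an `ItemProcessor`, where returning `null` tells the framework to filter (skip) the current item instead of writing it:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-ins for my input/output records (names are made up).
record InputRecord(String key, String value) {}
record OutputRecord(String key, String combined) {}

// Sketch of the caching idea: hold each record in memory until a later
// record with the same key supplies the information needed to emit output.
// Returning null is how an ItemProcessor in Spring Batch signals
// "skip this item"; here it just means "nothing to write yet".
class CachingProcessor {
    private final Map<String, InputRecord> cache = new HashMap<>();

    OutputRecord process(InputRecord item) {
        InputRecord earlier = cache.remove(item.key());
        if (earlier == null) {
            cache.put(item.key(), item); // wait for the matching record
            return null;                 // filtered: not written to output
        }
        // Matching record arrived: build the output from both records.
        return new OutputRecord(item.key(), earlier.value() + "+" + item.value());
    }
}

public class Main {
    public static void main(String[] args) {
        CachingProcessor p = new CachingProcessor();
        System.out.println(p.process(new InputRecord("A", "first")));  // cached, returns null
        System.out.println(p.process(new InputRecord("A", "second"))); // emits an OutputRecord
    }
}
```

This avoids the per-record database lookup entirely, at the cost of holding unmatched records in memory until their counterpart arrives.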