I have the following process:
1. I read data from a file (10000 records).
2. I optimize each record based upon a model that specifies constraints and an objective function.
3. I write output data satisfying the constraints and maximizing the objective function to another file (see the sketch just after this list).
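In plain Spring Batch terms, I picture step 1 as an ItemReader (e.g. a FlatFileItemReader), step 2 as an ItemProcessor wrapping the optimization model, and step 3 as an ItemWriter (e.g. a FlatFileItemWriter). Here is a minimal, purely sequential sketch of that wiring; the record types, chunk size, and bean names are placeholders of mine, not working configuration:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OptimizationJobConfig {

    // Placeholder record types for this sketch.
    public static class InputRecord {}
    public static class OptimizedRecord {}

    @Bean
    public Step optimizeStep(StepBuilderFactory steps,
                             ItemReader<InputRecord> reader,                         // step 1: read the records
                             ItemProcessor<InputRecord, OptimizedRecord> optimizer,  // step 2: CPU-intensive optimization
                             ItemWriter<OptimizedRecord> writer) {                   // step 3: write the optimized output
        return steps.get("optimizeStep")
                .<InputRecord, OptimizedRecord>chunk(100)  // chunk size is a guess
                .reader(reader)
                .processor(optimizer)
                .writer(writer)
                .build();
    }
}
```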
Because step 2 is very CPU-intensive, it has to be executed in parallel over multiple threads, or even on a grid.
Because it is a lot of data in flat files, Spring Batch seemed an obvious choice. However, because we want to optimize multiple records in parallel, Spring Integration also seemed an obvious choice.
During a Spring training, Iwein (of Spring Integration) told me that there is such a thing as Spring Batch Integration, which is not commonly used yet but has a lot of potential.
The feature that seems to apply to this problem is the chunking support in Spring Batch Integration. However, I have read the code and the configuration files, and it still isn't clear to me how it works exactly.
Can you use an ItemReader to read records from the file, pass them to multiple ItemProcessors (via Spring Integration), collect the results from those ItemProcessors, and write them to a file using an ItemWriter?
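To make concrete what I mean by "pass them to multiple ItemProcessors and collect the results": the closest shape I can express it in is the AsyncItemProcessor/AsyncItemWriter pair from spring-batch-integration running over a local TaskExecutor, sketched below. This is only an illustration of the data flow I am after (record types, chunk size, and executor are placeholders of mine), not the grid-style chunking setup I am asking about:

```java
import java.util.concurrent.Future;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class ParallelOptimizationConfig {

    // Same placeholder record types as in the earlier sketch.
    public static class InputRecord {}
    public static class OptimizedRecord {}

    @Bean
    public Step parallelOptimizeStep(StepBuilderFactory steps,
                                     ItemReader<InputRecord> reader,
                                     ItemProcessor<InputRecord, OptimizedRecord> optimizer,
                                     ItemWriter<OptimizedRecord> writer) {
        // Fan each record out to its own worker thread; process() returns a Future.
        AsyncItemProcessor<InputRecord, OptimizedRecord> asyncProcessor = new AsyncItemProcessor<>();
        asyncProcessor.setDelegate(optimizer);
        asyncProcessor.setTaskExecutor(new SimpleAsyncTaskExecutor("optimizer-"));

        // Unwrap the Futures again so the single delegate writer sees plain records.
        AsyncItemWriter<OptimizedRecord> asyncWriter = new AsyncItemWriter<>();
        asyncWriter.setDelegate(writer);

        return steps.get("parallelOptimizeStep")
                .<InputRecord, Future<OptimizedRecord>>chunk(100)  // chunk size is a guess
                .reader(reader)             // one reader reads from the flat file
                .processor(asyncProcessor)  // many processors run concurrently
                .writer(asyncWriter)        // one writer collects the results
                .build();
    }
}
```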