Feb 11th, 2008, 08:52 AM
Master/Worker Pattern & JVM
1) Question concerning the master/worker pattern
I would like to know whether Spring Batch implements the Master/Worker pattern, which is commonly used to parallelize work. If so, how can we achieve the following scenario using Spring Batch?
Scenario: A file copied into a directory contains thousands of sales instructions. To process it quickly and efficiently, the instructions must be split into groups of one hundred and assigned to different worker threads/queues. The master process, which is probably a task (created through jBPM), will wait until the workers have finished their work.
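Outside of any particular framework, the scenario above can be sketched with plain `java.util.concurrent`: the master splits the instructions into chunks of 100, submits each chunk to a worker pool, and blocks on the `Future`s until every worker is done. This is only an illustration of the pattern, not Spring Batch's API; the class and method names are made up for the example.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MasterWorkerDemo {
    static final int CHUNK_SIZE = 100;

    // Master: split the instructions into chunks of CHUNK_SIZE, hand each
    // chunk to a worker thread, then block until every worker has finished.
    // Returns the total number of instructions processed.
    static int processAll(int instructionCount) {
        // Stand-in for the sales instructions read from the input file.
        List<String> instructions = new ArrayList<>();
        for (int i = 0; i < instructionCount; i++) {
            instructions.add("sale-" + i);
        }

        ExecutorService workers = Executors.newFixedThreadPool(4);
        List<Future<Integer>> results = new ArrayList<>();
        for (int start = 0; start < instructions.size(); start += CHUNK_SIZE) {
            final List<String> chunk = instructions.subList(
                    start, Math.min(start + CHUNK_SIZE, instructions.size()));
            // Each worker "processes" its chunk; here it just reports the size.
            results.add(workers.submit(() -> chunk.size()));
        }

        int processed = 0;
        try {
            // The master waits here: Future.get blocks until the worker is done.
            for (Future<Integer> f : results) {
                processed += f.get();
            }
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            workers.shutdown();
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println("Processed: " + processAll(1000));
    }
}
```

The same shape carries over to any master/worker implementation: the only framework-specific parts are how chunks are distributed (threads, queues, or remote nodes) and how the master detects completion.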
2) Can we split the work across different Java virtual machines, as is possible with the Terracotta product, which is based on a hub-and-spoke architecture?
Feb 11th, 2008, 10:31 AM
Not sure if this exactly fits the bill, but check out the parallelJob example in the samples package; it spawns several threads within the same JVM to accomplish its work.
Feb 11th, 2008, 11:06 AM
The 'chunk-oriented processing' changes that are currently being polished for M5 should help as well. The basic principle is that the incoming file can be split into 'chunks' (say, 100 lines per chunk), and each chunk can then be handed to a different thread for processing. Ideally, the best way to do this will probably be JMS: putting the chunks on a queue and reading them off elsewhere. GigaSpaces even supports this out of the box. With release 1.0 we should have the basics in place by supporting chunk-oriented processing, but we won't have the JMS infrastructure in place and tested until post-1.0.