I have a simple job that reads from one database and writes to another.
The reader is a JdbcCursorItemReader that reads only records that have not yet been processed (processed = 0), and the writer runs an insert statement.
My chunk size is 100. I mark each record in the source table as processed = 1
in the writer's beforeWrite() method. This update of the processed flag happens during writing and within the chunk transaction, so if anything fails it will be rolled back.
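For context, the setup described above looks roughly like the listener below. This is only a sketch: the class, table, and column names are placeholders I made up, and it assumes the Spring Batch 4.x ItemWriteListener API, where beforeWrite() receives the chunk's item list and runs inside the chunk transaction.

```java
import java.util.List;

import org.springframework.batch.core.ItemWriteListener;
import org.springframework.jdbc.core.JdbcTemplate;

// Hypothetical listener that marks source rows as processed inside the
// chunk transaction, as described in the question.
public class MarkProcessedListener implements ItemWriteListener<Record> {

    private final JdbcTemplate jdbcTemplate;

    public MarkProcessedListener(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void beforeWrite(List<? extends Record> items) {
        // Runs in the same transaction as the chunk's insert into the
        // destination table: if the insert fails, this update rolls back too.
        for (Record r : items) {
            jdbcTemplate.update(
                "UPDATE source_table SET processed = 1 WHERE id = ?",
                r.getId());
        }
    }

    @Override
    public void afterWrite(List<? extends Record> items) { }

    @Override
    public void onWriteError(Exception ex, List<? extends Record> items) { }
}
```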
I tested this job by running a single instance, and it works fine without any problem.
Now my question: I want to run multiple instances of the same job at the same time by changing a version parameter. If I start, say, 3 instances of the same job, they will all be working on the same set of data.
Is there a possibility that different instances will pick up the same chunk? If 2 instances pick up the same record, they will both add it to the destination table, creating duplicates, since there are no constraints on the destination table.
Correct me if I am thinking along the wrong path.
Let's say the first instance picks up the first 100 records and starts processing them. It has processed and marked 80 records, but the commit interval is 100, so nothing is committed to the database yet. If a second instance of the same job kicks in, will it pick up the same first 100 records as its chunk?
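To make the concern concrete, here is a toy simulation (plain Java, no Spring Batch or database involved) of the scenario above: because instance A's processed = 1 updates are uncommitted, instance B's processed = 0 query would still see the same rows, so both claim the full chunk.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class ChunkRaceDemo {
    public static void main(String[] args) {
        // "Source table": ids 1..100, none marked processed yet.
        Set<Integer> unprocessed = new LinkedHashSet<>();
        for (int i = 1; i <= 100; i++) unprocessed.add(i);

        // Instance A reads its chunk (all rows where processed = 0).
        List<Integer> chunkA = new ArrayList<>(unprocessed);

        // Instance A marks rows processed = 1 but has NOT committed, so
        // under read-committed isolation instance B's query sees the
        // same unprocessed rows.
        List<Integer> chunkB = new ArrayList<>(unprocessed);

        // Both instances claimed the same rows -> duplicate inserts later.
        Set<Integer> overlap = new HashSet<>(chunkA);
        overlap.retainAll(chunkB);
        System.out.println("Rows claimed by both instances: " + overlap.size());
    }
}
```

This is of course a simplification; the real behavior depends on the database's isolation level and on when each instance's cursor is opened.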
Please help, because we are planning to run multiple instances of this job in production for better performance. Is it advisable to run multiple instances of the same job for faster performance, or should I use multithreading within the job itself?