Apr 2nd, 2010, 07:02 AM
Duplicates with retry-limit
A step in the Job reads from a ListReader and, for each entry, a call is made to a Web Service. The responses are collected in a list in the Writer.
Occasionally, the WS returns an error, and we find the retry-limit very useful.
What I found is that whenever there is an error, the retry seems to reprocess certain entries that never failed even once. I changed the collection type of the responses from List to Set, but I still wonder why this is happening.
Does the retry happen for all items in that Chunk?
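For context, a sketch of what such a step might look like in the Spring Batch XML namespace. The bean names (listReader, webServiceProcessor, responseWriter) and the retried exception class are assumptions, not taken from the actual job:

```xml
<step id="callWebService">
    <tasklet>
        <!-- Assumed setup: read entries, call the WS in the processor,
             collect responses in the writer; retry WS failures up to 3 times. -->
        <chunk reader="listReader"
               processor="webServiceProcessor"
               writer="responseWriter"
               commit-interval="5"
               retry-limit="3">
            <retryable-exception-classes>
                <include class="org.springframework.ws.client.WebServiceIOException"/>
            </retryable-exception-classes>
        </chunk>
    </tasklet>
</step>
```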
Apr 3rd, 2010, 02:24 AM
If you have a chunk of 5 elements, it means a transaction is started, 5 elements are read and processed, and then the writer is called with all 5 elements.
If there is an error, the chunk is rolled back (it can't commit because of the failure), so those items are reprocessed together. If you want to avoid this, simply set commit-interval=1 on your step.
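As a sketch, the change amounts to shrinking the chunk to a single item, so a failure only ever causes that one item to be retried (bean names here are assumed placeholders):

```xml
<!-- With commit-interval="1", each item is its own transaction,
     so a retry can only reprocess the item that actually failed. -->
<chunk reader="listReader"
       processor="webServiceProcessor"
       writer="responseWriter"
       commit-interval="1"
       retry-limit="3"/>
```

The trade-off is throughput: one transaction per item is considerably slower than committing in batches.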
Apr 5th, 2010, 03:15 PM
Thanks for the explanation.
We would prefer not to reduce the commit interval, so the application will handle the duplicates instead.
Could 'no-rollback-exception-classes' be used with the exception class that drives the retry logic? Or would that defeat the purpose of the retry?
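For reference, this is roughly what the proposal would look like; the exception class here is a hypothetical stand-in for whichever exception the retry is configured on:

```xml
<step id="callWebService">
    <tasklet>
        <chunk reader="listReader"
               processor="webServiceProcessor"
               writer="responseWriter"
               commit-interval="5"
               retry-limit="3">
            <retryable-exception-classes>
                <include class="org.example.TransientWsException"/>
            </retryable-exception-classes>
        </chunk>
        <!-- Marking the same exception as non-rolling-back means the
             transaction is not rolled back when it is thrown. -->
        <no-rollback-exception-classes>
            <include class="org.example.TransientWsException"/>
        </no-rollback-exception-classes>
    </tasklet>
</step>
```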