Jul 31st, 2009, 12:23 PM
Batch Job use case
I have a business use case wherein we query the database for records matching certain business criteria. The result set count now exceeds 50K. We have to read each record, XML-ise the object, and generate PDFs for reports.
The idea here is not to load the whole result set into memory, as it is anticipated to grow over time. Does JdbcCursorItemReader give me the option of reading a record, processing it up to the PDF step, and then fetching the next one (reusing the DB resources)? Can I also define steps to accomplish the transformation to XML, the PDF generation, etc.?
If you can point me towards a sample in this regard, it would be a great help.
Aug 3rd, 2009, 01:45 AM
The memory footprint of JdbcCursorItemReader depends a bit on the JDBC driver (so vendor dependent). JdbcPagingItemReader is more predictable, but may not be as efficient if you have a good cursor implementation. The idea anyway is that you have at least one option that doesn't require all items to be in memory.
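To make that concrete, a chunk-oriented step wired to a JdbcCursorItemReader might look like the sketch below. The bean names, the SQL, and the `xmlProcessor`/`pdfWriter` beans are all hypothetical placeholders: you would implement an ItemProcessor that turns a row object into its XML form and an ItemWriter that renders the PDFs. Only `commit-interval` items are held in memory at a time; `fetchSize` is just a hint to the driver, so streaming behaviour remains vendor dependent as noted above.

```xml
<job id="reportJob" xmlns="http://www.springframework.org/schema/batch">
    <step id="generateReports">
        <tasklet>
            <chunk reader="jdbcReader" processor="xmlProcessor"
                   writer="pdfWriter" commit-interval="100"/>
        </tasklet>
    </step>
</job>

<bean id="jdbcReader"
      class="org.springframework.batch.item.database.JdbcCursorItemReader">
    <property name="dataSource" ref="dataSource"/>
    <!-- placeholder: substitute your business query -->
    <property name="sql" value="SELECT ..."/>
    <!-- hint to the driver; actual streaming is vendor dependent -->
    <property name="fetchSize" value="100"/>
    <property name="rowMapper" ref="recordRowMapper"/>
</bean>
```

With this layout there is one step, not one step per transformation: reader, processor, and writer run per chunk, so a record flows from cursor to PDF without the whole 50K result set being materialised.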
I'm a little bit hazy on the XML and PDF steps. Is there a danger of everything ending up in memory no matter what you do in the JDBC piece?
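The memory-boundedness the chunk model gives you can be illustrated in plain Java. The interfaces below are illustrative stand-ins, not the actual Spring Batch API, but the loop is the same shape: read one item at a time, buffer at most one chunk of processed output, write it, and release it before reading more.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ChunkDemo {
    // Illustrative stand-ins for reader/processor/writer contracts.
    interface ItemReader<T> { T read(); }                 // returns null when exhausted
    interface ItemProcessor<I, O> { O process(I item); }
    interface ItemWriter<T> { void write(List<T> items); }

    // Chunk-oriented loop: at most chunkSize processed items live in memory at once.
    static <I, O> int run(ItemReader<I> reader, ItemProcessor<I, O> processor,
                          ItemWriter<O> writer, int chunkSize) {
        List<O> buffer = new ArrayList<>();
        int total = 0;
        I item;
        while ((item = reader.read()) != null) {
            buffer.add(processor.process(item));
            if (buffer.size() == chunkSize) {
                writer.write(buffer);   // e.g. render the PDFs for this chunk
                total += buffer.size();
                buffer.clear();         // release the chunk before reading more
            }
        }
        if (!buffer.isEmpty()) {        // flush the final partial chunk
            writer.write(buffer);
            total += buffer.size();
        }
        return total;
    }

    public static void main(String[] args) {
        // Simulate a 50K-row cursor without ever holding all rows in memory.
        Iterator<Integer> rows = java.util.stream.IntStream.range(0, 50_000).iterator();
        ItemReader<Integer> reader = () -> rows.hasNext() ? rows.next() : null;
        ItemProcessor<Integer, String> toXml = id -> "<record id=\"" + id + "\"/>";
        final int[] written = {0};
        ItemWriter<String> pdfWriter = items -> written[0] += items.size();

        int total = run(reader, toXml, pdfWriter, 100);
        System.out.println(total + " items processed");
    }
}
```

So the JDBC piece is only safe end to end if the XML and PDF stages also work per item or per chunk; an XML step that accumulates one big document, or a PDF library that buffers everything before writing, would pull the whole result set back into memory regardless of the reader.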