Apr 14th, 2008, 09:47 AM
BatchUpdate in MySQL - Performance
I am reading from a file and writing to a database. As I am in the process of creating a prototype with Spring Batch, I am using the tradeDAO sample that ships with the Spring Batch samples.
I am using MySQL, and my file has about 50,000 records. Just reading all the data in the file takes about 10-12 minutes, which I believe is too long.
Reading the file and inserting the records into the MySQL database takes 40 minutes.
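One thing that often dominates MySQL insert time is that, by default, Connector/J sends each INSERT in a JDBC batch as a separate statement. Enabling statement rewriting makes the driver collapse a batch into one multi-row INSERT. This is a hedged sketch: the host, port, and schema name (`batchdb`) are placeholders, and the property is only honored by MySQL Connector/J (3.1.13+):

```
jdbc:mysql://localhost:3306/batchdb?rewriteBatchedStatements=true
```

With this set, a `PreparedStatement.executeBatch()` of 1,000 inserts goes over the wire as roughly one statement instead of 1,000 round trips, which can cut insert time dramatically.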
This is just sample data; eventually we will have millions of records.
My question is: is this the normal performance I can expect with Spring Batch, or is there some fine-tuning mechanism available? If so, how much improvement could I expect?
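For what it's worth, the main Spring Batch tuning knob for this kind of job is the commit interval, i.e. how many items are processed per transaction/chunk. Committing every item is far slower than committing every few hundred. A minimal sketch in Spring Batch XML style (bean names `fileReader` and `tradeWriter` are assumptions, not from the samples):

```xml
<job id="tradeLoadJob">
    <step id="loadStep">
        <!-- commit-interval controls how many items are written
             per transaction; larger chunks mean fewer commits -->
        <tasklet>
            <chunk reader="fileReader" writer="tradeWriter"
                   commit-interval="500"/>
        </tasklet>
    </step>
</job>
```

The exact attribute names depend on your Spring Batch version (older milestone releases configure the same idea through a `commitInterval` property on the step factory bean), but the principle is the same: batch many inserts per transaction rather than one.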
Currently the whole batch process on the mainframe completes in 1 hour 30 minutes (including the file read and other business-logic processing, for millions of records), so I am a little concerned about the performance here.
Any help or input to improve performance would be GREAT.