May 29th, 2007, 02:04 PM
Whenever I write flat files, I'd *like* to make sure no other process writes to the file while I do, or moves the file while I'm writing, etc.
Since Linux doesn't really support mandatory file locking (well, not reliably, anyway), and I sometimes even have to write to NFS, I usually go with lock files and a protocol ("don't touch xy.dat while xy.dat.lock exists!") and hope for the best.
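For context, a minimal sketch of that lock-file protocol (class and method names are made up for illustration). `File.createNewFile()` creates the file atomically on a local POSIX filesystem, so whichever process gets `true` holds the lock; that atomicity guarantee does *not* hold over NFS, which is exactly the "hope for the best" part:

```java
import java.io.File;
import java.io.IOException;

// Illustrative sketch of the "xy.dat.lock" convention described above.
// acquire() must succeed before anyone touches xy.dat; release() deletes
// the lock file afterwards. Only safe on local filesystems, not NFS.
public class LockFileGuard {
    private final File lockFile;

    public LockFileGuard(File dataFile) {
        this.lockFile = new File(dataFile.getPath() + ".lock");
    }

    // true: we created the lock file and hold the lock.
    // false: the lock file already exists, someone else holds it.
    public boolean acquire() throws IOException {
        return lockFile.createNewFile();
    }

    public void release() {
        lockFile.delete();
    }

    public static void main(String[] args) throws IOException {
        File data = File.createTempFile("xy", ".dat");
        LockFileGuard guard = new LockFileGuard(data);
        System.out.println(guard.acquire()); // first caller wins the lock
        System.out.println(guard.acquire()); // second attempt is refused
        guard.release();
        data.delete();
    }
}
```

If the process dies without calling `release()`, the lock file is left behind and the protocol deadlocks, which is another reason this is a convention rather than a guarantee.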
Will Spring Batch ease the pain? Please?
I haven't found anything in the javadocs - what's everybody else doing?!
May 31st, 2007, 02:59 AM
We might think about file locking later - it's not planned for 1.0-m2. If anyone has time or code to contribute, we might squeeze something into 1.0. It's a pain, as you say.
Since there are no proper guarantees from a filesystem, we prefer to use business processes to ensure that we use the right data in a batch. Our recommendation for best practice is to never get into a situation where the data for a job could change in the middle of a run. E.g. use a naming convention for the input file that matches the job/stream ID and start date, so there can be no ambiguity about what the contents are.
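That naming convention might look something like this (the method name and pattern are just an assumption, not anything Spring Batch mandates):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Sketch of the recommended convention: the input file name encodes the
// job id and start date, so a job instance can only ever read the file
// that was staged for it - the data can't silently change identity mid-run.
public class InputFileNames {
    private static final DateTimeFormatter STAMP = DateTimeFormatter.BASIC_ISO_DATE;

    public static String inputFileFor(String jobId, LocalDate startDate) {
        return jobId + "-" + startDate.format(STAMP) + ".dat";
    }

    public static void main(String[] args) {
        // A trade-import job started on 2007-05-31 reads only this file:
        System.out.println(inputFileFor("trade-import", LocalDate.of(2007, 5, 31)));
        // -> trade-import-20070531.dat
    }
}
```

The upstream process writes the file under this name and never touches it again; the batch job resolves the same name from its own job parameters, so there's no window in which "today's file" could be swapped underneath it.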
May 31st, 2007, 04:20 AM
Thanks for the info, Dave.