Aug 13th, 2012, 07:25 AM
Calling a job from within a tasklet of another job
I need to call a job from a tasklet of another job.
Job A, Step A:
Tasklet A: gets all the filenames within a folder; in a loop, instantiates a job (Job B) with a filepath parameter for each file.
Job B: processes each file.
But when launching Job B for the first time, I get an exception: "Existing transaction detected in JobRepository. Please fix this and try again (e.g. remove @Transactional annotations from client)."
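The exception suggests the child job is being launched inside the parent step's transaction. One way around this (a sketch only, assuming Spring Batch's JobStep class; the bean names jobB, jobLauncher, and jobRepository are illustrative, not from the thread) is to nest Job B as a step of Job A rather than launching it from inside Tasklet A:

```java
// Sketch: run Job B as a dedicated step of Job A via JobStep, so the
// framework manages the launch outside the tasklet's transaction.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.job.JobStep;

public class NestedJobConfig {

    public JobStep jobBStep(Job jobB, JobLauncher jobLauncher,
                            JobRepository jobRepository) throws Exception {
        JobStep step = new JobStep();
        step.setName("runJobB");
        step.setJob(jobB);                  // the child job that processes one file
        step.setJobLauncher(jobLauncher);
        step.setJobRepository(jobRepository);
        step.afterPropertiesSet();          // validates the mandatory properties
        return step;
    }
}
```

This keeps the parent/child structure but hands the launch to the framework; passing a different filepath per execution would still need a JobParametersExtractor, which is not shown here.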
Aug 15th, 2012, 01:40 PM
Why not have job A --> step A --> reader (reads the file names in) --> processor (itemProcessor processes each file) --> writer (if necessary)?
Why do you need two jobs?
Sep 6th, 2012, 09:15 AM
Call another batch job in a shell
The answer to Jeff is that processing a file is itself a batch job, because the processing can be quite complicated: reading a file, processing and transforming each line, then creating an entry in a db, for instance.
Originally Posted by visualjeff
I had a similar need, and the only way I could find to do it was to have the writer create a new shell and execute another batch job in it:
Runtime rt = Runtime.getRuntime();
String command = "cmd /c ";
command += "java -jar " + jarFile
        + " launch-context-export.xml " + jobName; // .... etc (job parameters)
Process pr = rt.exec(command);
int exitVal = pr.waitFor(); // block until the child batch job finishes
Is there a better way to do it?
I tried using the SimpleJobLauncher instead of a shell, but I also got the IllegalStateException about an existing transaction in the repository.
I would have thought that this is quite a common scenario and had hoped for better support from Spring Batch.
Sep 6th, 2012, 12:34 PM
Forking a JVM may not be necessary for complex batch processing, but if you have to, check out SystemCommandTasklet.
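For reference, a minimal sketch of SystemCommandTasklet (part of Spring Batch core); the command line below is an assumption modelled on the Runtime.exec() snippet quoted earlier in the thread, not a tested invocation:

```java
// Sketch: fork a child JVM as a batch step via SystemCommandTasklet
// instead of hand-rolling Runtime.exec() in an ItemWriter.
import org.springframework.batch.core.step.tasklet.SystemCommandTasklet;

public class ForkJvmConfig {

    public SystemCommandTasklet exportTasklet(String jarFile, String jobName) throws Exception {
        SystemCommandTasklet tasklet = new SystemCommandTasklet();
        tasklet.setCommand("java -jar " + jarFile + " launch-context-export.xml " + jobName);
        tasklet.setTimeout(600000);   // fail the step if the child runs longer than 10 minutes
        tasklet.afterPropertiesSet(); // validates that command and timeout are set
        return tasklet;
    }
}
```

The child's exit code is mapped to the step's exit status, so a failed child job fails the parent step rather than being silently ignored.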
Here is a suggestion: think of each file as an item and break the job out over a number of steps.
Step 1:
- itemReader: reads the files within a directory
- itemWriter: writes a flat file that lists every file that needs to be processed (a snapshot of the directory's current state)
Step 2:
- itemReader: reads the flat file listing the files to be processed
- itemProcessor: processes each file (could even be a CompositeItemProcessor)
- itemWriter: writes the records to the datasource
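The snapshot step above can be sketched in plain Java (file and method names here are illustrative; in Spring Batch this logic would sit in an ItemReader over the directory feeding a FlatFileItemWriter):

```java
// Sketch: snapshot a directory's current contents into a flat file,
// one file name per line, so a later step can process a stable list.
import java.io.File;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class DirectorySnapshot {

    // Writes the sorted names of the regular files in dir to snapshotFile
    // and returns the list that was written.
    public static List<String> snapshot(File dir, File snapshotFile) throws IOException {
        List<String> names = new ArrayList<>();
        File[] entries = dir.listFiles();
        if (entries != null) {
            Arrays.sort(entries); // deterministic processing order
            for (File f : entries) {
                if (f.isFile()) {
                    names.add(f.getName());
                }
            }
        }
        try (PrintWriter out = new PrintWriter(snapshotFile, "UTF-8")) {
            for (String name : names) {
                out.println(name);
            }
        }
        return names;
    }
}
```

Taking the snapshot first means files dropped into the directory mid-run are picked up by the next execution rather than changing the current one underfoot.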
Sep 7th, 2012, 04:05 AM
A file is not an item
Originally Posted by visualjeff
I think it is a bit of a stretch to use a composite item processor (which, if I understand correctly, is intended to apply separate transformations to the same item) to convert one input item (a file containing many lines, i.e. items) into n output items.
I suppose it could be used, but it seems to me that this throws away all the advantages of processing a single file as a batch job, with each item handled in a repeated step.
In my specific case I already had a working batch job (one that actually processes a group of db records to produce a flat file, rather than the other way around). So, when it became necessary to process n groups, it seemed logical to create a job which performed the query to get the list of groups of db items and then called the existing batch job to process each group, producing one file per group.
A more logical solution would be the possibility of specifying a batch job as an item processor.
Another practical, but less than optimal, solution is for the parent job to use an itemWriter that creates a shell script in which each line is a call to a child batch executable with the appropriate parameters, and then to execute this script when the first batch terminates.
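That shell-script writer can be sketched in plain Java (the jar name export.jar, job name exportJob, and the group= parameter are hypothetical; in Spring Batch this would be the write() method of a custom ItemWriter):

```java
// Sketch: for each group id, append one line to a shell script that
// invokes the child batch executable with that group as its parameter.
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;

public class ShellScriptWriter {

    private final File script;

    public ShellScriptWriter(File script) {
        this.script = script;
    }

    // Appends one invocation of the child batch job per item; append mode
    // lets the parent job call write() once per chunk.
    public void write(List<String> groupIds) throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter(script, true))) {
            for (String groupId : groupIds) {
                out.println("java -jar export.jar launch-context-export.xml exportJob group=" + groupId);
            }
        }
    }
}
```

Running the generated script after the parent job finishes serializes the child runs, which sidesteps the transaction clash at the cost of losing restartability and status tracking across the whole set.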