Increasing duration between steps over a long-running job
My job does some file analysis over quite a long period of time. There are ~1200 files to read and analyse.
Within the job specification, I iterate over a set of 10 steps required to analyse a single file.
Now I am facing an increasing duration per file. At the start, a single iteration over the 10 steps took ~10 seconds; by the halfway point the same iteration took ~7 minutes. While the execution time of each step stays approximately constant, the time between the end of one step and the start of the next has grown from 500 ms to 11 seconds!! :confused:
Could this be a memory issue?
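To narrow this down, I am thinking of logging the gap between steps and the process memory as the job runs, roughly like the sketch below. This is only an illustration (Python chosen arbitrarily); `run_steps`, the `steps` list, and the file path are placeholders, not my real job code.

```python
import gc
import time
import tracemalloc


def run_steps(file_path, steps):
    """Run the analysis steps for one file, logging the gap between steps
    and the traced memory usage (diagnostic sketch, placeholder names)."""
    prev_end = None
    for i, step in enumerate(steps, start=1):
        start = time.perf_counter()
        if prev_end is not None:
            # This is the suspicious "between steps" time that keeps growing.
            print(f"gap before step {i}: {start - prev_end:.3f} s")
        step(file_path)
        prev_end = time.perf_counter()
        current, peak = tracemalloc.get_traced_memory()
        print(f"step {i}: {prev_end - start:.3f} s, "
              f"mem current={current / 1e6:.1f} MB, peak={peak / 1e6:.1f} MB")
    # Force a collection to see whether the next file's gaps shrink again.
    gc.collect()


if __name__ == "__main__":
    tracemalloc.start()
    # Dummy stand-ins for the real 10 analysis steps.
    dummy_steps = [lambda path: sum(range(100_000)) for _ in range(10)]
    run_steps("example.dat", dummy_steps)
```

If the memory figures climb steadily while the gaps grow, that would point towards something accumulating between iterations rather than the steps themselves.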
Has anyone experienced similar behaviour?
If you need code, please let me know!
Thank you very much in advance for your help!!