
Execute multiple jobs using a sequence

Posted: Thu Oct 16, 2008 10:47 pm
by AttitudeBaz
I have 70 jobs that all have the same parameters except for one, the file name. Each job does the same thing: it loads a data file into a database table. I want to loop through all of the jobs and execute them from a sequence. Is this possible? If not, does anyone have any other ideas?

Posted: Thu Oct 16, 2008 11:48 pm
by chulett
You'd need to write your own looping job control as I don't see a Sequence being able to handle this gracefully unless you are willing to string 70 jobs together in a row.

Another thought: a job that reads the filenames from a flat file, along with the name of the job that processes each one, and calls the UtilityRunJob function for each record that flows through the job. Each call will run the job that processes that particular file.

Posted: Fri Oct 17, 2008 1:19 am
by hamzaqk
I would have made one job to read all 70 files...

Posted: Fri Oct 17, 2008 1:32 am
by ray.wurlod
A job sequence can use a StartLoop activity to set up a "list of things" loop to process the file names. But why not stream the data from all files into the job via a filter command (such as the TYPE command) in the Sequential File stage, as hamzaqk suggested?
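The filter-command approach only works if the files share a layout, but it is worth seeing the shape of it. A minimal sketch of the idea: the Sequential File stage's Filter property would hold a command like `cat /data/jade/*.dat` (or `TYPE` on Windows) so every extract file streams into the one job as a single input. The directory and file names below are made-up examples; the sketch creates sample files just to show what the filter command would emit.

```shell
# Sketch of the single-job, many-files idea. In the Sequential File
# stage, the Filter property would hold "cat /data/jade/*.dat"
# (TYPE C:\data\jade\*.dat on Windows). All paths here are examples.
mkdir -p /tmp/jade_demo
printf 'row1\n' > /tmp/jade_demo/file_a.dat
printf 'row2\n' > /tmp/jade_demo/file_b.dat

# This is what the stage's filter command would execute: every file's
# rows arrive in the job as one continuous stream.
cat /tmp/jade_demo/*.dat
```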

Posted: Fri Oct 17, 2008 7:57 am
by chulett
I made the silly assumption that the metadata is different across the 70 files. Otherwise, there's no reason for anything more than 1 job and a parameter for the filename. Or a fixed work file name and a 'before job' concatenation. [shrug]

Posted: Sun Oct 19, 2008 7:40 pm
by AttitudeBaz
The extract is run monthly and is a complete system extract. The source system is Jade (Object Oriented) therefore we need to extract to files and then load the files. Each file equates to a single database table in the staging area. We use CRCs to load the data efficiently.

I don't think we can use one job, as the structure of each file is very different. I liked the idea of executing a job in a loop, but that adds time to the execution. Given that the files contain a complete system, ultimately I would like to run the jobs in parallel. That way the time to load all files is equivalent to the time to load the slowest file, rather than the sum of the times to load all files.

Unfortunately, the only way I think I can do this is by adding all the jobs to the sequence. Any other better ideas?

Posted: Sun Oct 19, 2008 7:59 pm
by chulett
I'd still look at the UtilityRunJob mechanism, but make it a multi-instance job run X ways, with each instance handling one of every X records in the driving source file via the mod() function.
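The partitioning described above can be illustrated outside DataStage: each of X instances keeps only the driving-file rows whose row number mod X matches its instance number. Inside the job this would be a Transformer constraint along the lines of `Mod(@INROWNUM, X) = InstanceNum`; the awk sketch below, with a made-up driving file and X=3, just shows that the split covers every row exactly once.

```shell
# Illustration of splitting one driving file across X parallel
# instances: instance N keeps every row where (row number mod X) == N.
# driver.txt and X=3 are illustrative, not from the original post.
printf 'file01\nfile02\nfile03\nfile04\nfile05\nfile06\n' > /tmp/driver.txt

X=3
for inst in 0 1 2; do
  echo "instance $inst handles:"
  awk -v x="$X" -v n="$inst" 'NR % x == n' /tmp/driver.txt
done
```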

Posted: Wed Oct 22, 2008 2:46 am
by hamzaqk
If you want to run them in parallel then the only way is putting them all in the sequence, as you mentioned. If you want to try a sequential run, this may sound a bit daft, but you can store the names of the jobs in a file and then use the Execute Command stage to run the jobs through the command line with "dsjob -run". You can loop through the job names by passing them as a parameter between the StartLoop and EndLoop activities. :roll:
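The driving-file idea above can be sketched as a small loop: read each job name (and its file parameter) from the list and build the dsjob command an Execute Command activity would fire. The project name, job names, and parameter name below are hypothetical, and the commands are echoed rather than executed, since dsjob only exists on a machine with a DataStage engine.

```shell
# Sketch of the "job names in a file" idea. MyProject, the job names,
# and the FileName parameter are made-up examples; we echo the dsjob
# commands instead of running them, as dsjob needs a DataStage engine.
printf 'LoadCustomers cust.dat\nLoadOrders ord.dat\n' > /tmp/joblist.txt

while read -r job file; do
  # An Execute Command activity would run this line for real:
  echo dsjob -run -param FileName="$file" -jobstatus MyProject "$job"
done < /tmp/joblist.txt
```

Note that `-jobstatus` makes dsjob wait for the job to finish and return its status as the exit code, which is what lets the sequence detect a failed load before moving to the next iteration.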

Posted: Wed Oct 22, 2008 7:50 am
by rakeshreddy29
Did you try using the sequence? What difficulty are you finding with it? Let me know.

Posted: Wed Oct 22, 2008 7:59 am
by chulett
chulett wrote: I'd still look at the UtilityRunJob mechanism, but make it a multi-instance job run X ways, with each instance handling one of every X records in the driving source file via the mod() function.
Is this your "workaround"?

The Sequence "difficulty" is the fact that each iteration of the loop would require a different job be run, the one matched to incoming filename. So the Job Activity stage is out and you'd need write a Routine to take its place and dynamically handle the job being run and monitored. Not really all that tough but adding parallelism would be problematic.

Still think the solution I outlined is best. [shrug]

Posted: Wed Oct 22, 2008 8:18 am
by rakeshreddy29
I was thinking it was 70 different jobs, so in this case there should not be any problem in using the sequence. I misread the question. I think you are correct, Craig.