Execute multiple jobs using a sequence
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 9
- Joined: Wed Mar 21, 2007 11:25 pm
- Location: Australia
I have 70 jobs that all have the same parameters except for one, the file name. Each job does the same thing: it loads a data file into a database table. I want to loop through all of the jobs and execute them using a sequence. Is this possible? If not, does anyone have any other ideas?
You'd need to write your own looping job control, as I don't see a Sequence being able to handle this gracefully unless you are willing to string 70 jobs together in a row.
Another thought: a job that reads the filenames in from a flat file, along with the name of the job that processes each one, and calls the UtilityRunJob function for each record that flows through the job. Each call will run the job that processes that particular file.
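To make the driving-file idea concrete: UtilityRunJob is a DataStage BASIC function, so this Python stand-in only sketches the shape of the control file and the per-record dispatch loop. The `run_job` callable and the `FileName` parameter name are illustrative assumptions, not the real DataStage API.

```python
import csv
import io

def dispatch(control_file, run_job):
    """Read (job_name, file_name) pairs and launch one job per record,
    the way each row flowing through the driving job would trigger one
    UtilityRunJob call."""
    results = []
    for job_name, file_name in csv.reader(control_file):
        # Each record drives one job run, with the file name as a parameter.
        results.append(run_job(job_name, {"FileName": file_name}))
    return results

# Example driving file: one processing job per source file (names invented).
control = io.StringIO("LoadCustomers,customers.dat\nLoadOrders,orders.dat\n")
launched = dispatch(control, lambda job, params: (job, params["FileName"]))
```

Adding a new file then means adding one line to the control file rather than wiring another activity into the Sequence.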
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
A job sequence can use a StartLoop activity to set up a "list of things" loop to process the file names. But why not stream the data from all files into the job via a filter command (such as the TYPE command) in the Sequential File stage, as hamzaqk suggested?
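The "stream the data from all files" idea works when every file shares one record layout: a filter command (cat on UNIX, TYPE on Windows) feeds a single Sequential File stage one continuous stream. A minimal Python sketch of that behaviour, with invented file names:

```python
from pathlib import Path

def stream_all(file_names):
    """Yield every line from every file as one continuous stream,
    mimicking what a cat/TYPE filter command gives a single reader."""
    for name in file_names:
        with open(name) as f:
            yield from f

# Two sample extract files with the same layout, read as one stream.
Path("extract_a.dat").write_text("1,alpha\n2,beta\n")
Path("extract_b.dat").write_text("3,gamma\n")
rows = list(stream_all(["extract_a.dat", "extract_b.dat"]))
```

With identical metadata, one job plus one filter command replaces all 70 jobs.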
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I made the silly assumption that the metadata is different across the 70 files. Otherwise, there's no reason for anything more than 1 job and a parameter for the filename. Or a fixed work file name and a 'before job' concatenation. [shrug]
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
-
- Premium Member
- Posts: 9
- Joined: Wed Mar 21, 2007 11:25 pm
- Location: Australia
The extract is run monthly and is a complete system extract. The source system is Jade (Object Oriented) therefore we need to extract to files and then load the files. Each file equates to a single database table in the staging area. We use CRCs to load the data efficiently.
I don't think we can use one job, as the structure of each file is very different. I liked the idea of executing a job in a loop, but that adds time to the execution. Given that the files contain a complete system, ultimately I would like to run the jobs in parallel. That way the time to load all files is equivalent to the time to load the slowest file, rather than the sum of the time to load all files.
Unfortunately, the only way I think I can do this is by adding all the jobs to the sequence. Any other better ideas?
If you want to run them in parallel, then the only way is putting them in the sequence as you mentioned. If you want to try the sequential run, this may sound a bit daft, but you can store the names of the jobs in a file and then use the Execute Command stage to run the jobs through the command line with "dsjob -run". You can loop through the job names by passing them as a parameter to the Start Loop and End Loop activities.
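A sketch of that approach: the command lines are only built here, not executed, since running them needs a DataStage engine, and the project and job names are invented. In a Sequence, each name from the loop would be substituted into an Execute Command activity inside a StartLoop/EndLoop pair.

```python
def build_dsjob_commands(project, job_names, wait=True):
    """Return one dsjob command line per job name read from the driving file."""
    flags = ["-run"]
    if wait:
        # -jobstatus makes dsjob block until the job finishes and
        # return its status, so jobs run one after another.
        flags.append("-jobstatus")
    return [f"dsjob {' '.join(flags)} {project} {job}" for job in job_names]

# Job names as they might appear, one per line, in the driving file.
cmds = build_dsjob_commands("DWProject", ["LoadCustomers", "LoadOrders"])
```

Dropping the wait flag would fire the jobs without blocking, which is one crude way to get them running concurrently from a single loop.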
Teradata Certified Master V2R5
-
- Participant
- Posts: 23
- Joined: Wed Feb 13, 2008 10:53 am
Is this your "workaround"?chulett wrote:I'd still look at the UtilityRunJob mechanism, but make it a multi-instance job run X ways, with each instance handling one of every X records in the driving source file via the mod() function.
The Sequence "difficulty" is the fact that each iteration of the loop would require a different job be run, the one matched to incoming filename. So the Job Activity stage is out and you'd need write a Routine to take its place and dynamically handle the job being run and monitored. Not really all that tough but adding parallelism would be problematic.
Still think the solution I outlined is best. [shrug]
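The multi-instance split described above can be sketched like this: instance k of X handles every record whose row number mod X equals k, mirroring a mod() constraint in the driving job. The file names and instance count are illustrative.

```python
def records_for_instance(records, instance, total_instances):
    """Return the slice of the driving file this instance should process,
    using the same row-number mod() test each job instance would apply."""
    return [rec for i, rec in enumerate(records)
            if i % total_instances == instance]

files = ["f00.dat", "f01.dat", "f02.dat", "f03.dat", "f04.dat"]
# Three instances split the work; together they cover every file exactly once.
parts = [records_for_instance(files, k, 3) for k in range(3)]
```

Because each instance filters on a disjoint residue class, the X instances can run concurrently without two of them ever picking up the same file.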
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers