
Dynamic Job Execution

Posted: Wed Apr 23, 2008 5:20 am
by ann_nalinee
Hi All Gurus,

I have a new requirement to create sequence jobs that call DataStage jobs based on the data in a control table. The control table holds the name of each job to run per subject area, for example:

SUBJECT_AREA JOB_NAME
xxx job1
xxx job2
xxx job3
yyy job4

I need to read this data from the control table and execute the jobs based on the result. My idea is to loop with a Loop activity and execute the jobs one by one via the dsjob command until all the jobs in each subject area are complete. However, with this solution the jobs run sequentially.
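The sequential approach described above can be sketched in shell. This is an illustration only: dsjob below is a stub standing in for the real DataStage command-line client, and the project name and the exported control-table file are assumptions.

```shell
#!/bin/sh
# Stub standing in for the real DataStage dsjob client (illustration only)
dsjob() { echo "dsjob $*"; }

SUBJECT_AREA="xxx"
CONTROL_FILE="control_table.txt"

# Assume the control table has been exported as "SUBJECT_AREA JOB_NAME" lines
cat > "$CONTROL_FILE" <<EOF
xxx job1
xxx job2
xxx job3
yyy job4
EOF

# Run each job for the chosen subject area one at a time (sequential)
grep "^$SUBJECT_AREA " "$CONTROL_FILE" | while read -r area job; do
    dsjob -run -wait MyProject "$job"
done
```

Each iteration blocks until the previous job finishes (note the -wait flag), which is exactly the sequential behaviour being described.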

But I want to execute the jobs within each subject area in parallel: if I pass parameter xxx, it should execute job1, job2 and job3 at the same time. Please advise if you have any ideas on this.

Thanks in advance

Posted: Wed Apr 23, 2008 6:18 am
by hamzaqk
Would it not be a good idea to have the names of the jobs belonging to a particular subject area written from the table to separate files, i.e. subjectAreaA.txt, subjectAreaB.txt, with the job names stored as comma-separated values? You could then read each file with an Execute Command stage and pass its content to a Start Loop activity as the counter list from the previous stage. This way you would be able to run jobs from different subject areas in parallel and jobs within the same subject area sequentially. I hope that makes sense.
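The file-splitting step suggested here could be done with standard UNIX tools. A minimal sketch, assuming the control table has been exported to a flat file and that the output file naming (subjectArea_xxx.txt, etc.) is acceptable:

```shell
#!/bin/sh
# Split the control table into one file per subject area, each holding
# a comma-separated job list. File names and export format are assumptions.
CONTROL_FILE="control_table.txt"
cat > "$CONTROL_FILE" <<EOF
xxx job1
xxx job2
xxx job3
yyy job4
EOF

# One output file per distinct subject area, e.g. subjectArea_xxx.txt
for area in $(cut -d' ' -f1 "$CONTROL_FILE" | sort -u); do
    grep "^$area " "$CONTROL_FILE" | cut -d' ' -f2 | paste -sd, - \
        > "subjectArea_$area.txt"
done
```

The resulting subjectArea_xxx.txt contains "job1,job2,job3", which is the comma-separated form a Start Loop activity can consume as its counter list.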

Posted: Wed Apr 23, 2008 7:12 am
by ray.wurlod
You could do this with custom job control code, but I can't envisage a tidy way to do it with a job sequence, unless there were execution paths in the job sequence for every possible job activity. My reason for making this assertion is that the job name is hard coded in a Job activity; there is (presently, at least) no scope for making it a parameter or any other form of variable.

Posted: Wed Apr 23, 2008 7:18 am
by keshav0307
Call the jobs sequentially from UNIX only (using dsjob -run) for the subject area selected from the table.

Posted: Wed Apr 23, 2008 7:37 am
by Sravani
Create a shell script to call the jobs depending on the area name selected from the table.
For example:
If Area_Name value is 'XXX' Then
Run job1
Run job2
Run job3

If you do so, when Area_Name is 'XXX', the control will trigger job1 and then move on to the next statement, i.e. it runs job2 and then job3. The script just triggers each job and moves on; it will not wait until the job completes its execution.

I think this will solve your problem.

Sravani.

Run jobs in parallel using a UNIX script

Posted: Fri Aug 22, 2008 11:14 pm
by rohanf
Though this post is marked resolved, I believe that running the jobs in the following manner:

Run job1
Run job2
Run job3

will still cause the jobs to run sequentially, since as soon as the first run command executes, it waits until that job finishes before moving on to the second run command.

We have to run them in the background for them to run in parallel:

Run job1 &
Run job2 &
Run job3 &

Sravani wrote: Create a shell script to call the jobs depending on the area name selected from the table.
For example:
If Area_Name value is 'XXX' Then
Run job1
Run job2
Run job3

If you do so, when Area_Name is 'XXX', the control will trigger job1 and then move on to the next statement, i.e. it runs job2 and then job3. The script just triggers each job and moves on; it will not wait until the job completes its execution.

I think this will solve your problem.

Sravani.
Please correct me if I am wrong.
Thanks & Regards
Rohan
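The background-execution idea above can be sketched concretely. In this illustration dsjob is a stub that sleeps for a second to simulate a running job, and the project and job names are assumptions; the point is that the three runs overlap rather than queue:

```shell
#!/bin/sh
# Stub simulating a job that takes one second to run (illustration only)
dsjob() { sleep 1; echo "finished: $3"; }

start=$(date +%s)
dsjob -run MyProject job1 &   # each & puts the run in the background...
dsjob -run MyProject job2 &
dsjob -run MyProject job3 &
wait                          # ...and wait blocks until all three finish
elapsed=$(( $(date +%s) - start ))
echo "elapsed: ${elapsed}s"   # roughly 1s, not 3s, because the runs overlap
```

Because wait blocks until every background run completes, the script as a whole still behaves like one sequential step to its caller, while the jobs inside it run in parallel.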

Posted: Fri Aug 22, 2008 11:36 pm
by ray.wurlod
Of course you could use a Routine activity to invoke UtilityRunJob (an SDK routine).

Rohan, you're only right in a very limited sense. DataStage jobs always run in the background so that a run request (which starts that background processing) returns immediately even if it itself is not a background process. Therefore even the first alternative you suggested will have all the jobs running simultaneously, with only a slight offset in their start times.
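The distinction Ray draws maps onto the -wait option of the dsjob client: dsjob -run alone returns as soon as the run request is accepted, while -run -wait blocks until the job finishes (and -jobstatus makes the exit code reflect the job's final status). A minimal sketch, with dsjob stubbed out for illustration and the project and job names assumed:

```shell
#!/bin/sh
# Stub that echoes its arguments so the two call forms can be compared
dsjob() { echo "dsjob $*"; }

# Returns immediately; the job itself runs in the background on the server
dsjob -run MyProject job1

# Blocks until job1 finishes; -jobstatus sets the exit code from the job status
dsjob -run -wait -jobstatus MyProject job1
```

So a script full of plain "dsjob -run" lines already fires everything off nearly simultaneously, which is exactly the behaviour described above.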