Running server jobs in parallel using Sequence Job

Post questions here related to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.


dhiraj
Participant
Posts: 68
Joined: Sat Dec 06, 2003 7:03 am

Running server jobs in parallel using Sequence Job

Post by dhiraj »

I have a sequence job which executes five server jobs in parallel. An external scheduler triggers this sequence job. Now, if one of the five server jobs aborts and the rest run to completion, when I restart the sequence job it runs all five jobs all over again. How can I restrict it to run only the aborted job?

I could think of one solution in which every job creates a marker file to indicate that it completed successfully. We could then check for the existence of this file and decide whether the job should be run or not, and the sequence job, upon successful completion, should delete all these marker files.
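
For illustration, a minimal sketch of the marker-creation step in DS BASIC, written in the before/after-subroutine style (the routine name, argument usage, and marker path are only examples; making sure it fires only on a successful run is a design choice not shown here):

   * Sketch of a subroutine that drops a marker file once a job has
   * finished successfully. InputArg is assumed to carry the marker
   * path, e.g. /tmp/markers/MyJob.ok (illustrative names only).
   SUBROUTINE CreateMarkerFile(InputArg, ErrorCode)
      ErrorCode = 0
      MarkerPath = InputArg

      OpenSeq MarkerPath To MarkerFile Then
         * Marker already there from an earlier run - nothing to do
         CloseSeq MarkerFile
      End Else
         * File does not exist yet, so create it and stamp it
         Create MarkerFile Else
            Call DSLogWarn("Cannot create marker file ":MarkerPath, "CreateMarkerFile")
            ErrorCode = 1
            RETURN
         End
         WriteSeq TimeDate() To MarkerFile Else
            Call DSLogWarn("Cannot write to marker file ":MarkerPath, "CreateMarkerFile")
         End
         CloseSeq MarkerFile
      End
   RETURN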

I want to know if there is a better and easier solution for this.


Thanks in advance

Dhiraj
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

If you're running version 7.1 or later, there are checkboxes in the Job Sequence properties, applied when the sequence is compiled, that provide exactly this functionality. On re-run, the jobs that completed successfully last time are skipped (with a log entry indicating that they have been skipped).

In earlier releases, you're up for a DIY approach; generate flag files and use a Command or Routine activity to determine whether any of these exist; if the sequence finishes with no jobs aborting, then another Command or Routine activity ensures that all flag files are removed.
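
For the check itself, a minimal sketch of a flag-file test in DS BASIC, assuming a function-style routine whose return value the sequence can branch on (the name CheckFlagFile and passing the path as Arg1 are illustrative):

   * Sketch: return 1 if the flag file exists (the job completed on the
   * previous run), 0 if it does not (the job still needs to run).
   FUNCTION CheckFlagFile(Arg1)
      FlagPath = Arg1
      OpenSeq FlagPath To FlagFile Then
         CloseSeq FlagFile
         Ans = 1
      End Else
         Ans = 0
      End
   RETURN(Ans)

If the sequence finishes with no aborts, the cleanup can be as simple as a Command activity running something like rm -f /tmp/markers/*.ok (path illustrative) so the next cycle starts with a clean slate.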
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.