
Job execution order

Posted: Sun Apr 19, 2009 6:46 am
by DS_SUPPORT
Is there a way to know the job execution order? Say, for example, if I have 10 stages, which stage will be executed first, and what is the sequential order of execution?

I did find something in DS_TEMPnn and RT_CONFIGnn (like all the stages), but wanted to know their execution order.

Can we get this information somehow?

Posted: Sun Apr 19, 2009 7:11 am
by chulett
You mean 'stage execution order', it seems.

Being serious here with a somewhat silly answer, but... it starts with the first stage and continues following the links until the last stage. That's really what each record does; inside the job, all of the stages are basically running simultaneously. You'll see them all start up in the log, rows get processed, and then they'll all shut down, 'Finish'.

Of course, things that introduce a 'process break' (not sure what the official term is, that's what I call it) into the flow, such as a passive stage, affect that. Because all input must complete before the output side can start, you'll see the same thing going on as noted above, only in discrete segments. So, at its most simple...

Seq -> Trans -> Seq

One basic 'process' with everything running. However...

Seq -> Trans -> Seq -> Trans -> Seq

will run the first 'Seq to Seq' segment to completion before the second 'Seq to Seq' segment starts and runs. Hope that helps.
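
To make that segmenting concrete outside of DataStage, here is a small illustrative Python sketch (plain Python, not DataStage code; the stage names and the helper function are invented for this example). It splits a linear design into the 'process' segments that run one after another, treating passive stages (Sequential File, Hashed File) as the break points:

Code:

# Illustrative sketch only: model the idea that a passive stage acts as a
# "process break", so each Seq -> ... -> Seq segment runs to completion
# before the next segment starts. All names here are hypothetical.
from typing import List

PASSIVE_TYPES = {"SequentialFile", "HashedFile"}   # simplified set of passive stage types

def split_into_segments(stages: List[str]) -> List[List[str]]:
    """Split a linear chain of stages into segments bounded by passive stages."""
    segments, current = [], []
    for stage in stages:
        current.append(stage)
        stage_type = stage.split(":")[0]
        # A passive stage that is not the first stage in the current segment
        # closes that segment and also opens the next one.
        if stage_type in PASSIVE_TYPES and len(current) > 1:
            segments.append(current)
            current = [stage]
    if len(current) > 1:
        segments.append(current)
    return segments

# Seq -> Trans -> Seq -> Trans -> Seq
design = ["SequentialFile:src", "Transformer:xfm1",
          "SequentialFile:intermediate", "Transformer:xfm2",
          "SequentialFile:tgt"]

for i, segment in enumerate(split_into_segments(design), start=1):
    print(f"segment {i} runs to completion: {' -> '.join(segment)}")

For the Seq -> Trans -> Seq -> Trans -> Seq design it prints two segments, which matches the behaviour described above: the first segment finishes before the second one starts.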

Posted: Sun Apr 19, 2009 8:20 am
by eostic
If you spend a little bit of time learning the Server Job Debugger, you can actually "watch" the stage-stepping that Craig is talking about. Put a breakpoint on a variety of output links and then step through the job and see where it stops as you move from breakpoint to breakpoint. Link order coming from a Transformer can be easily visualized, as can the "process boundaries" (if you have any) that Craig mentions.

By the way, we're talking about "default" behavior here: IPC stages and settings will alter this behavior dramatically, sometimes to your advantage and sometimes not.

Ernie

Posted: Sun Apr 19, 2009 9:09 am
by chulett
Yup, 'IPC stages and settings' is a game changer here. :wink:

On the debugger front, it can certainly be enlightening to step through a job. You just need one breakpoint to get that initial stop; after that you can step through the job one link at a time without needing any more breakpoints. Just an FYI.

Posted: Sun Apr 19, 2009 8:52 pm
by DS_SUPPORT
Thanks for the answers

Code:

                       Hashed File <------- LKP DB
                          |
                          |
                          |
                          |
                          |
                          |
                          v
Source DB ------> Transformer ------> Target

In the above job design, the Hashed File will be populated first, so the link from LKP DB will be executed first.

By looking at the design we can identify this ourselves, but how and where is this stored, from the DataStage perspective?

Edit : Corrected the spelling mistakes

Posted: Sun Apr 19, 2009 9:46 pm
by chulett
Stored in the DataStage perspective? Hmmm... basically in the RT_BPnnn directory, which is where the code the compiler generates goes. Probably other places as well; I've never really had a need to get all that intimate with the gory details of how things are stored in the DataStage perspective, as you put it.

Posted: Mon Apr 20, 2009 3:31 pm
by ray.wurlod
Stage dependencies are stored in the RT_CONFIGnnn table for the job and, while the job is being designed, in DS_TEMPnnn. The structures of these tables are not documented.