Is there a way to know the job execution order? Say, for example, I have 10 stages: which stage will be executed first, and what is the sequential order of execution?
I did find something on DS_TEMPnn and RT_CONFIGnn (they list all the stages), but I wanted to know their execution order.
Can we get this information somehow?
Job execution order
You mean 'stage execution order', it seems.
Being serious here with a somewhat silly answer, but... it starts with the first stage and continues following the links until the last stage. That's really what each record does; inside the job, all of the stages are basically running simultaneously. You'll see them all start up in the log, rows get processed, and then they'll all shut down, 'Finish'.
Of course, anything that introduces a 'process break' (not sure what the official term is, that's just what I call it) into the flow - a passive stage - changes that. Because all input to a passive stage must complete before its output side can start, you'll see the same thing going on as noted above, only in discrete segments. So, at its most simple...
Seq -> Trans -> Seq
One basic 'process' with everything running. However...
Seq -> Trans -> Seq -> Trans -> Seq
Will run the first 'seq to seq' segment to completion before the second 'seq to seq' segment starts and runs. Hope that helps.
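The two behaviours Craig describes can be sketched in plain Python generators (this is an illustration only, not DataStage code; the stage names are made up for the example). Active stages pipeline rows one at a time, while a passive stage must consume all of its input before emitting anything, which is what creates the process break:

```python
def seq_source(rows):
    """Active source stage: yields rows one at a time."""
    for row in rows:
        yield row

def transformer(rows):
    """Active Transformer: processes each row as it arrives."""
    for row in rows:
        yield row.upper()

def passive_stage(rows):
    """Passive stage: buffers ALL input before any output starts,
    creating the 'process break' described above."""
    buffered = list(rows)   # the input side runs to completion here
    yield from buffered     # only then does the output side begin

# Seq -> Trans -> Seq: one pipelined 'process'; rows flow straight through.
pipelined = transformer(seq_source(["a", "b", "c"]))
print(list(pipelined))  # ['A', 'B', 'C']

# Seq -> Trans -> Seq -> Trans -> Seq: the passive stage in the middle
# forces the first segment to finish before the second one starts.
segmented = transformer(passive_stage(transformer(seq_source(["a", "b", "c"]))))
print(list(segmented))
```

The `list(rows)` call inside `passive_stage` is the whole point: nothing downstream of it can see a row until the upstream segment has completely drained.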
-craig
"You can never have too many knives" -- Logan Nine Fingers
If you spend a little bit of time learning the Server Job Debugger, you can actually "watch" the stage-stepping that Craig is talking about. Put a breakpoint on a variety of output links, then step through the job and see where it stops as you move from breakpoint to breakpoint. Link order coming out of a Transformer can be easily visualized, as can the "process boundaries" (if you have any) that Craig mentions. By the way, we're talking about "default" behavior here --- IPC stages and settings will alter this behavior dramatically, sometimes to your advantage and sometimes not.
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
Yup, 'IPC stages and settings' is a game changer here. :wink:
On the debugger front, it can certainly be enlightening to step through a job. You only need one breakpoint to get that initial stoppage; after that you can step through the job one link at a time without needing any more breakpoints. Just an FYI.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Thanks for the answers.
In the job design below, the Hashed File will be populated first, so the link from LKP DB will be executed first.
By looking at the design we can identify this ourselves, but how and where is this information stored, from a DataStage perspective?
Edit : Corrected the spelling mistakes
Code: Select all
Hashed File <------- LKP DB
|
|
|
|
|
|
v
Source DB ------> Transformer ------> Target
Last edited by DS_SUPPORT on Sun Apr 19, 2009 10:39 pm, edited 1 time in total.
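The reference-link behaviour in the design above can be sketched in plain Python (an illustration only, not DataStage; the row data is made up). The lookup must be fully populated before the Transformer can process its first source row, which is exactly why the LKP DB link runs first:

```python
def populate_hashed_file(lkp_db_rows):
    """Runs to completion first, like the LKP DB -> Hashed File link."""
    return {key: value for key, value in lkp_db_rows}

def run_transformer(source_rows, hashed_file):
    """Starts only once the hashed file exists; looks up each source row."""
    for key, data in source_rows:
        yield (key, data, hashed_file.get(key, "NOT FOUND"))

# The hashed-file load is a prerequisite of the main Source -> Target flow.
lookup = populate_hashed_file([(1, "Bangalore"), (2, "Sydney")])
target = list(run_transformer([(1, "row1"), (3, "row3")], lookup))
print(target)  # [(1, 'row1', 'Bangalore'), (3, 'row3', 'NOT FOUND')]
```

The dependency is structural: `run_transformer` takes the finished `lookup` as an argument, so there is no way for the main flow to start before the lookup load completes.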
Stored "in DataStage", from that perspective? Hmmm... basically in the RT_BPnnn directory, which is where the code the compiler generates goes. Probably other places as well; I've never really had a need to get all that intimate with the gory details of how things are stored in the DataStage perspective, as you put it.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Stage dependencies are stored in the RT_CONFIGnnn table for the job and, while the job is being designed, in DS_TEMPnnn. The structures of these tables are not documented.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.