Running Routing from Parallel Job

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Munish
Participant
Posts: 89
Joined: Sun Nov 19, 2006 10:34 pm

Running Routing from Parallel Job

Post by Munish »

Hi All,

I have a scenario where a Parallel Routine (a Transform Function) backs out the data after a fatal error. It is a SQL script that deletes the rows before fresh data is loaded.

I need to run it from within my pre-existing job(s).

How can I get it working?

What I tried so far:
I added this code in the Job Control section of my job properties:
Answer = BackOutScript(USERNAME, PASSWORD, INSTANCE, PATH, TABLENAME)
Is this the wrong way to do it? Should I be running it from some other place? Is there anything else I am missing?
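A hedged sketch of how a call like the one above is normally written in DataStage BASIC job control: the routine has to be declared with DEFFUN before it is used, otherwise the compiler treats its name as an undeclared variable. The argument names and the "DSU." catalog name below are placeholders based on the post, not confirmed details of this routine.

```
* Sketch only: declare the transform routine before calling it.
* Routines created in the Designer are catalogued under "DSU.<Name>".
DEFFUN BackOutScript(UserName, Password, Instance, Path, TableName) CALLING "DSU.BackOutScript"

Answer = BackOutScript(USERNAME, PASSWORD, INSTANCE, PATH, TABLENAME)
If Answer <> 0 Then
   * Log a warning rather than silently continuing (assumed convention)
   Call DSLogWarn("Back-out script returned " : Answer, "JobControl")
End
```

Without the DEFFUN line, a "Variable not declared" compile error is exactly what you would expect to see.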

Any input will be appreciated.

Thanks and regards,
Munish
MK
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

First of all, what kind of routine is it? Is it a C routine or a BASIC routine?
You mentioned Job Control, so I am guessing it's a BASIC routine, since a C routine you would have to execute at the OS level with DSExecute(). Fill in some blanks for us, will ya.
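For the OS-level alternative mentioned above, a hedged sketch of DSExecute() from DataStage BASIC follows. The script path is a placeholder; "UNIX" runs the command through the operating-system shell ("NT" on Windows, "UV" through the engine shell).

```
* Sketch only: run an external back-out command at the OS level.
Cmd = "/path/to/backout.sh"   ;* placeholder path, not from the post
Call DSExecute("UNIX", Cmd, Output, SystemReturnCode)
If SystemReturnCode <> 0 Then
   Call DSLogWarn("Back-out command failed, rc=" : SystemReturnCode, "JobControl")
End
```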
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Also, if it is not working, do let us know what exactly is failing, along with the error code.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
Munish
Participant
Posts: 89
Joined: Sun Nov 19, 2006 10:34 pm

Post by Munish »

Hi There,
1. It is a BASIC routine.
2. It was not compiling: the compiler treated my routine name as a variable and said "Variable not declared".

I have got things running. I needed to run this in a job stage and wanted to avoid a Sequencer.
What I wanted:
to run a SQL script to delete the records >>> Dataset (with records) >>> load the database.

What I did:
First: I used a Routine stage, followed by my job which loads the detail to the DB.
Second: I copied the code from the JOBCONTROL tab of the Sequencer's property window.
Third: I pasted the same into my job's JOBCONTROL window.

It worked fine.

Is this the right way?
By doing this, can there be a chance of error?
I needed to run this in a job stage and cannot use a sequencer....

Your comments are appreciated.

Thanks and regards,
Munish
MK
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Munish wrote: I needed to run this in job stage and wanted to avoid Sequencer.
I needed to run this in job stage and can not use sequencer....
Why can't you use a job sequence? It would make it easier to test the condition under which you want to unwind the load, given that the job that performs the load can finish before you check.

There's nothing to stop you using an after-job subroutine, of course. And that can be written in DataStage BASIC.

But there's no such thing as an after-stage subroutine in parallel jobs, except in the BASIC Transformer stage. In a parallel Transformer stage you can have an after-stage trigger, but that has to be written in C++.
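For the after-job subroutine route Ray mentions, a hedged skeleton in DataStage BASIC: before/after-job subroutines take exactly two arguments, an input string and an ErrorCode that you set non-zero to mark the job as aborted. The name, script path, and use of InputArg below are illustrative assumptions only.

```
* Sketch only: a BASIC after-job subroutine (hypothetical name).
SUBROUTINE BackOutAfterJob(InputArg, ErrorCode)
   ErrorCode = 0
   * InputArg could carry e.g. "user,password,instance,path,table" (assumption)
   Call DSExecute("UNIX", "/path/to/backout.sh " : InputArg, Output, Rc)
   If Rc <> 0 Then ErrorCode = Rc   ;* non-zero ErrorCode aborts the job
RETURN
```

You would attach this in the job properties under "After-job subroutine", passing whatever the script needs in the input value field.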
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.