How to define the sequence order for the link???

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

ICE
Participant
Posts: 249
Joined: Tue Oct 25, 2005 12:15 am

How to define the sequence order for the link???

Post by ICE »

Dear All,


I have a problem here; please help me.
I want to extract data from 3 files into the same hashed file, using 3 ODBC links into the same stage; that is, 3 ODBC sources linking to one hashed file at the same time. Before I insert the data into the hashed file, I want to clear it every time I run the job. The clear should happen on the first ODBC link, and the data from the other 2 ODBC files should then be appended. So what I would like to know is: is there any way to define the order of the links, so that I can make sure the first ODBC link loads to the hashed file first?
Do you have any ideas?


Thanks,
loveojha2
Participant
Posts: 362
Joined: Thu May 26, 2005 12:59 am

Post by loveojha2 »

Create two different server jobs: one with the first ODBC link to the Hashed File stage, with the appropriate transformations.

And the other with the other two ODBC links populating the hashed file.

Call these two jobs from a Sequence job using Job Activity stages, and put a link between the Job Activity stages (make sure you choose the correct trigger option on the link between the job activities).

Moreover, I don't think that what you are trying to achieve is easily possible within a single server job.
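
For illustration, a rough DataStage BASIC job-control sketch of the same ordering (run the clear-and-load job, check that it finished OK, then run the append job) might look like the following; the job names ClearAndLoadJob and AppendJob are placeholders, not jobs from this thread:

* A rough job-control sketch in DataStage BASIC (job names are hypothetical).
* Run the job that clears the hashed file and loads the first ODBC source.
hJob1 = DSAttachJob("ClearAndLoadJob", DSJ.ERRFATAL)
ErrCode = DSRunJob(hJob1, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob1)
Status1 = DSGetJobInfo(hJob1, DSJ.JOBSTATUS)
ErrCode = DSDetachJob(hJob1)

* Only when the first job finished cleanly, run the job that appends
* the other two ODBC sources to the same hashed file.
If Status1 = DSJS.RUNOK Or Status1 = DSJS.RUNWARN Then
   hJob2 = DSAttachJob("AppendJob", DSJ.ERRFATAL)
   ErrCode = DSRunJob(hJob2, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob2)
   ErrCode = DSDetachJob(hJob2)
End

This only illustrates the run order; the Sequence job with two Job Activity stages and an OK trigger achieves the same thing without hand-written job control.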
Success consists of getting up just one more time than you fall.
sb_akarmarkar
Participant
Posts: 232
Joined: Fri Sep 30, 2005 4:52 am

Post by sb_akarmarkar »

Hi,
Why don't you try a UNION of the three queries in user-defined SQL, and then load the hashed file with one link?


Thanks,
Anupam
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Umm... you don't use the ODBC stage type to access hashed files. You can use a UV stage if you want to use SQL, or you can use a Hashed File stage. The easiest mechanism to clear the hashed file is a before-stage subroutine on the Transformer stage (this is executed before any inputs are opened) that issues a CLEAR.FILE command against the hashed file.
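
As a rough sketch of that approach (a hedged example, assuming the hashed file was created in the project account so that CLEAR.FILE can see it; the routine and file names are placeholders), a before-stage subroutine in DataStage BASIC could look something like this:

* Before/after subroutine sketch in DataStage BASIC (hypothetical names).
* InputArg carries the stage's Input Value; a non-zero ErrorCode aborts the stage.
Subroutine ClearHashedFile(InputArg, ErrorCode)
   ErrorCode = 0

   * InputArg is expected to hold the hashed file name, e.g. "MyHashedFile".
   * Assumes the hashed file is account-based (has a VOC entry), so TCL can see it.
   Command = "CLEAR.FILE " : InputArg

   * Execute the command in the engine's UniVerse (TCL) environment.
   Call DSExecute("UV", Command, Output, SystemReturnCode)

   If SystemReturnCode <> 0 Then
      Call DSLogWarn("CLEAR.FILE failed: " : Output, "ClearHashedFile")
      ErrorCode = 1
   End

Return

The routine name would go in the Transformer stage's before-stage subroutine property, with the hashed file name as its input value.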
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
katz
Charter Member
Posts: 52
Joined: Thu Jan 20, 2005 8:13 am

Post by katz »

Hi,

It is possible to use the property of passive stages (i.e. Hashed File and ODBC) that no output links are opened until all input links are closed, in order to force the execution sequence you want. In my opinion it would be a bit of a kludge, and it will require an extra Hashed File stage and a Transformer stage in the job design, but it is possible. Whether it is an advisable job design is another question.

katz