Dear All,
I have a problem here and could use some help.
I want to extract data from three files into the same hashed file, using three ODBC links in the same job. Before inserting the data into the hashed file, I want to clear it each time the job runs. The clear should happen with the first ODBC link's load, and the data from the other two ODBC links should then be appended. So what I would like to know is: is there any way to define the execution sequence of the links, so that I can make sure the first ODBC link loads the hashed file first?
Does anyone have an idea?
Thanks,
How do I define the sequence order for the links?
Create two server jobs: one with the first ODBC link feeding the Hashed File stage, with the appropriate transformations.
And the other with the remaining two ODBC links populating the hashed file.
Call these two jobs from a Sequence job using Job Activity stages, and put a link between the Job Activity stages (make sure the link between the job activities has the correct trigger option, e.g. an "OK" trigger so the second job runs only after the first finishes successfully).
Moreover, I don't think what you are trying to achieve is easily possible within a single server job.
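A rough sketch of that layout (job and stage names are only illustrative, not from the original post):

```
Sequence job:

  [Job Activity: Load_First_ODBC] --(OK trigger)--> [Job Activity: Append_Other_ODBC]

Server job 1 (Load_First_ODBC):
  ODBC_1 ---> Transformer ---> Hashed File      (clear the file before writing)

Server job 2 (Append_Other_ODBC):
  ODBC_2 ---> Transformer ---> Hashed File      (append only, no clear)
  ODBC_3 ---> Transformer ---> Hashed File      (append only, no clear)
```

The OK trigger on the link guarantees the appends never start before the initial load has completed.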
Success consists of getting up just one more time than you fall.
Umm... you don't use ODBC stage type to access hashed files. You can use UV stage if you want to use SQL, or you can use a Hashed File stage. The easiest mechanism to clear the hashed file is to use a before-stage subroutine in the Transformer stage (this is executed prior to any inputs being opened) to issue a CLEAR.FILE command against the hashed file.
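One way to implement that suggestion is a small server routine called as the Transformer's before-stage subroutine. The sketch below is a hypothetical routine (the routine name and the idea of passing the hashed file name as the input argument are my assumptions, not from the post); it uses the standard DSExecute and DSLogWarn calls from DataStage BASIC:

```
* Hypothetical before-stage subroutine sketch (DataStage BASIC).
* InputArg is assumed to carry the hashed file name, e.g. "MyHashedFile".
SUBROUTINE ClearHashedFile(InputArg, ErrorCode)
   ErrorCode = 0
   Cmd = "CLEAR.FILE " : InputArg
   * Run the TCL command in the UV shell
   Call DSExecute("UV", Cmd, Output, SystemReturnCode)
   If SystemReturnCode <> 0 Then
      Call DSLogWarn("CLEAR.FILE failed: " : Output, "ClearHashedFile")
      ErrorCode = 1
   End
RETURN
```

Because a before-stage subroutine runs before any input links are opened, the hashed file is emptied before the first row is written.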
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi,
It is possible to use a property of passive stages (i.e. Hashed File and ODBC), namely that no output links are opened until all input links have closed, to force the execution sequence you want. In my opinion it would be a bit of a kludge and will require an "extra" Hashed File stage and a Transformer stage in the job design, but it is possible. Whether it is an advisable job design is another question.
katz
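The full three-link design is not spelled out in the post, but the ordering property itself can be illustrated with a minimal chain (stage names are examples only):

```
  ODBC_1 ---> Transformer ---> HashedFile_A ---> Transformer ---> (downstream)
```

HashedFile_A is passive, so it does not open its output link until its input link has closed. Everything downstream of it therefore runs only after the first ODBC extract has completely finished, which is the sequencing guarantee being exploited here.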