Hash File Usage

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
mcolen
Premium Member
Posts: 31
Joined: Wed Aug 11, 2004 8:59 am
Location: Florida

Hash File Usage

Post by mcolen »

Can a hash file be created and used in the same job stream, i.e. create the hash file from Oracle 9i and use it as a lookup in a Transformer loading Teradata?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

mcolen,

yes, you can write to and read from a hash file in the same job; just make sure that buffering is turned off on both the writes and the reads.
mcolen
Premium Member
Posts: 31
Joined: Wed Aug 11, 2004 8:59 am
Location: Florida

Hash File Usage

Post by mcolen »

Arnd

While I know you can read and write it in the same job, this is the first time I am seeing jobs that create a file and try to use it at the same time, and I was wondering whether this is correct.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Yes, even if the file is being created in the same job it will work, provided the stage that creates the file is in the stream before the stage that reads from it.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

And there's a little trick you can play with a Transformer as a source when your lookup comes first. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
neena
Participant
Posts: 90
Joined: Mon Mar 31, 2003 4:32 pm

Post by neena »

chulett wrote:And there's a little trick you can play with a Transformer as a source when your lookup comes first. :wink:

Can you please reveal the trick?

Thanks
RR
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You can start an input stream with a Transformer under certain circumstances and that's what you need to do here.

Add a Transformer and a link to the reference hash. A Transformer can be used to generate data as long as you define a stage variable in it (you don't need to use it, just define it) and then constrain the number of output rows, or it will run forever. So...

Populate the link with the same column metadata that your hash needs. It doesn't matter what you put in the derivation fields, but obviously you need to put something there. Set an @FALSE constraint on the link so no rows ever go down it. However, its presence tells DataStage that the write needs to happen before the reads, and you can use that link to create the hashed file on the first run or to clear it at the beginning of a run.
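A minimal sketch of what that Transformer's properties might look like, based on the description above (the stage variable and column names here are illustrative, not required names; @FALSE is the standard DataStage system variable):

```
* Stage variable -- must exist so the Transformer can act as a source;
* it never has to be referenced anywhere (svDummy is a made-up name):
svDummy = 1

* Output link column derivations -- dummy values matching the hashed
* file's column metadata (KEY_COL / DATA_COL are hypothetical columns):
KEY_COL  = "X"
DATA_COL = ""

* Output link constraint -- @FALSE means no rows ever flow down the
* link, but its presence makes DataStage open the hashed file for
* write (creating or clearing it) before the reference reads happen:
@FALSE
```

With the constraint permanently false, the link contributes zero rows, so the "constrain the number of output rows" requirement is satisfied while still forcing the hashed file to exist before the lookup runs.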

Hope that makes sense.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Post Reply