UtilityHashLookup

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

tracy
Participant
Posts: 47
Joined: Mon Aug 07, 2006 9:19 am

UtilityHashLookup

Post by tracy »

I have created a sequence that does the following:

1. Calls a job that creates a hashed file containing a single record... it basically just holds an id (like a customer number) and a "key" which is hardcoded to "DUMMYKEY".
2. Calls another job that selects data from the database for that id. I use the UtilityHashLookup to pass the value of the Hashed File from step #1 as a parameter to this job.
3. Then I do some miscellaneous stuff with the data that was extracted in step #2.
4. Then I loop around and start again... selecting the next Id and putting it into the Hashed File, selecting the data for that new Id, etc.
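In case it helps, the parameter in step #2 is set in the Job Activity's parameter grid with an expression roughly like this (the hashed file name here is made up, and I'm going from memory on the argument order, so check the UtilityHashLookup routine definition in your project's SDK category):

```
* Job Activity parameter value expression in the sequence:
* Arg1 = hashed file name, Arg2 = key value, Arg3 = column position to return
UtilityHashLookup("IdHashFile", "DUMMYKEY", 1)
```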

The first thing I'm noticing is that I get stuck in a never-ending loop: for some reason the id that is being used is not getting incremented... it keeps using the same one in each leg of the loop. So my first question is: once you use UtilityHashLookup to pass a value to a job, does that job no longer allow you to substitute any other values? For instance, if I pass 1 to the job in the first leg of the loop, is it possible to pass 2 to the job in the second leg of the loop?

I also notice that the Hashed File created in step 1 looks funny when we get to the second leg of the loop. On the first leg, the Hashed File looks fine... it's got one record with the right id (the id is the number 1). But on the second leg, I see 3 records in the Hashed File:

One of these three records looks fine: it's got the "DUMMYKEY" key and the id value is 2.

But then there is another record whose key is "DATA.30" and a third record whose key is "OVER.30".

So I'm then wondering if this strangeness in the hashed file is causing the problem. Has anybody seen this before?
rcanaran
Premium Member
Posts: 64
Joined: Wed Jun 14, 2006 3:51 pm
Location: CANADA

Re: UtilityHashLookup

Post by rcanaran »

Are you clearing the hash file before each write?
tracy
Participant
Posts: 47
Joined: Mon Aug 07, 2006 9:19 am

Post by tracy »

Yes, I am clearing the file... actually deleting and recreating the file. I'm also noticing that when I monitor it, it's showing 1 record going into the hashed file and 3 coming out. It doesn't make much sense.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

If you only need to pass a single value, then set UserStatus in a Transformer of the first job and use it as a parameter to the second job.
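Roughly like this (the routine and activity names below are just examples; many projects wrap DSSetUserStatus in a small transform routine along these lines):

```
* Body of a server routine (transform) called from a Transformer
* output derivation, e.g. SetUserStatus(InLink.Id):
Call DSSetUserStatus(Arg1)
Ans = Arg1

* Then in the sequence, feed the second job's parameter with the
* first Job Activity's user status, e.g.:
*   JobActivity1.$UserStatus
```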
tracy
Participant
Posts: 47
Joined: Mon Aug 07, 2006 9:19 am

Post by tracy »

I seem to have got it to work by unchecking the "Create file" box and then deleting the files/folder on the server associated with the hashed file.

I honestly don't understand why it works like this and why it didn't work the other way, so if anybody has any insight, I'd love to hear it.

I also don't understand why it created the file automatically after I unchecked the box and deleted the file. What the heck is this checkbox for, then?
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The stage will always attempt to create a hashed file being written to if it doesn't exist. That option lets you override the default values used for the creation and/or delete the hashed file each run.
-craig

"You can never have too many knives" -- Logan Nine Fingers