
UtilityHashLookup

Posted: Mon Jun 04, 2007 3:01 am
by tracy
I have created a sequence that does the following:

1. Calls a job that creates a hashed file containing a single record... it basically just holds an id (like a customer number) and a key that is hardcoded to "DUMMYKEY".
2. Calls another job that selects data from the database for that id. I use UtilityHashLookup to pass the value from the hashed file in step #1 as a parameter to this job (see the sketch after this list).
3. Then I do some miscellaneous stuff with the data that was extracted in step #2.
4. Then I loop around and start again... selecting the next id and putting it into the hashed file, selecting the data for that new id, etc.
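
For reference, the parameter value in the sequence's Job Activity is derived with a call like the one below (a sketch; the file name "MyIdHash" is made up, and I'm assuming the routine's two-argument form of hashed file name first, key second):

    UtilityHashLookup("MyIdHash", "DUMMYKEY")

That should hand back whatever id is sitting under DUMMYKEY at the time the activity fires, which then gets plugged into the second job's parameter.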

The first thing I'm noticing is that I get stuck in a never-ending loop, because for some reason the id being used is not getting incremented... it keeps using the same one in each leg of the loop. So my first question is: once you use UtilityHashLookup to pass a value to a job, does that job no longer let you pass any other values to it? For instance, if I pass 1 to the job in the first leg of the loop, is it possible to pass 2 to the job in the second leg of the loop?

I also notice that the hashed file created in step 1 looks funny when we get to the second leg of the loop. On the first leg, the hashed file looks fine... it's got one record with the right id (the id is the number 1). But on the second leg, I see 3 records in the hashed file:

One of these three records looks fine: it's got the "DUMMYKEY" key and the id value is 2.

But then there is another record whose key is "DATA.30" and a third record whose key is "OVER.30".

So I'm then wondering if this strangeness in the hashed file is causing the problem. Has anybody seen this before?

Re: UtilityHashLookup

Posted: Mon Jun 04, 2007 8:05 am
by rcanaran
Are you clearing the hash file before each write?

Posted: Mon Jun 04, 2007 9:08 am
by tracy
Yes, I am clearing the file... actually deleting and recreating the file. I'm also noticing that when I monitor it, it's showing 1 record going into the hashed file and 3 coming out. It doesn't make much sense.

Posted: Mon Jun 04, 2007 11:38 am
by swades
If you want to pass only one value, then set the UserStatus in a Transformer in the first job and use it as a parameter to the second job, something like the sketch below.
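
Since DSSetUserStatus is a subroutine and can't sit directly in a column derivation, the usual trick is a tiny wrapper transform routine. A minimal sketch (the routine name SetUserStatus is made up):

    * Transform function "SetUserStatus", one argument (Arg1).
    * Stores the value in the job's user status area and passes
    * it through unchanged so it can live in a derivation.
          Call DSSetUserStatus(Arg1)
          Ans = Arg1

Call SetUserStatus(YourIdColumn) in a Transformer output derivation in the first job, then in the sequence reference the value as <JobActivityName>.$UserStatus when setting the second job's parameter.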

Posted: Mon Jun 04, 2007 12:06 pm
by tracy
I seem to have got it to work by unchecking the "Create file" box and then deleting the files/folder on the server associated with the hashed file.

I honestly don't understand why it works like this and why it didn't work the other way, so if anybody has any insight, I'd love to hear it.

I also don't understand why it created the file automatically after I unchecked the box and deleted the file. What the heck is this checkbox for, then?

Posted: Mon Jun 04, 2007 12:27 pm
by chulett
The stage will always attempt to create a hashed file it is writing to if that file doesn't exist. The option simply lets you override the default values used for the creation and/or delete and recreate the hashed file each run.
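
In other words, roughly this is happening under the covers each run (a sketch of the behaviour in DataStage BASIC, not the actual stage source):

    * Illustrative only: open the target hashed file; if it is not
    * there, create it with the default (or overridden) settings.
          Open HashedFileName To HashF Else
             Execute "CREATE.FILE " : HashedFileName : " DYNAMIC"
             Open HashedFileName To HashF Else
                Call DSLogFatal("Could not create hashed file", "Sketch")
             End
          End

As for the DATA.30 and OVER.30 "records" you saw: a dynamic (Type 30) hashed file is physically an OS directory containing files named DATA.30 and OVER.30, so my guess is that at some point the path was being opened as a plain directory-type (Type 19) file, which makes those internal files show up as records.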