Lookup & update of HashFile from same transformer

Post questions here relating to DataStage Server Edition, covering areas such as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Lookup & update of HashFile from same transformer

Post by vdr123 »

Can we have a lookup and an update of the same hashfile from the same transformer?

From a transformer I have to "lookup" a hashfile and also "update" the same hashfile if the key is not already there (one dashed link for the lookup and one link for the update to the hashfile).
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Yes, just keep in mind that read and write caching will prevent a subsequent row from "seeing" the preceding row if it's trying to reference it. You must not use caching if you need to reference a row written earlier in the same run. If you don't care, then use write-delay caching, and if necessary, you can use a read cache.
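Ken's caching point can be sketched in plain Python (not DS Basic; the function and names are hypothetical). A read cache behaves like a snapshot of the hashed file taken at job start, so a lookup through the cache never sees a row written by an earlier row in the same run:

```python
# Hypothetical simulation of hashed-file caching behavior.
# With a read cache, lookups hit a snapshot taken at job start;
# without it, each lookup goes to the live file.

def run_job(rows, store, use_read_cache):
    cache = dict(store) if use_read_cache else store
    found = []
    for key in rows:
        found.append(key in cache)   # the "lookup" link
        store[key] = True            # the "update" link writes through to the file
    return found

# Key "B" arrives twice: the second occurrence should find the first.
print(run_job(["B", "B"], {"A": True}, use_read_cache=True))   # [False, False]
print(run_job(["B", "B"], {"A": True}, use_read_cache=False))  # [False, True]
```

With the cache enabled, the second "B" still misses, because the snapshot was taken before the first "B" was written.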
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Post by vdr123 »

Ken, thank you for the reply...
I tried doing it and it gives a "cyclic or linear dependencies" error on the transformer.
I need to keep the hashfile up to date so that the next record processed does its lookup against the updated hashfile (so I don't miss the lookup for a record already processed earlier in the flow).
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

vdr123 wrote:I tried doing it and it gives "cyclic or linear dependencies" error on the transformer.
This error means you've created a 'loop' with your job stream. What you are trying to do is pretty basic stuff and should work fine when properly laid out. If you can't get past this error, please post a "picture" of your job flow. Use the "code" tags and make use of the Preview button to ensure that everything lines up and makes sense. Then we should be able to tell you how to adjust it to correct the error.
-craig

"You can never have too many knives" -- Logan Nine Fingers
vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Post by vdr123 »

I know the thing I am doing is basic...but it's useful for the flow design...I am not able to post a pic of the job...

Here is what I am trying to do...

Code: Select all

input----> Transform(lookup/update hashfile) ---->output
                     ^    |
                     |    |
             (lookup)|    |(not  a dotted line)
                     |    |(update hash)
                     |    V
                   HashFile

PS: Tried to make it make sense
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You did great with the picture... and it shows the loop in your job, from transformer to hash and back again.

You need to use two separate Hash Stages, both pointing to the same hash file - this will break the loop. More like:

Code: Select all

input----> Transform(lookup/update hashfile) ---->output 
                     ^              | 
                     |              | 
             (lookup)|              |(not  a dotted line) 
                     |              |(update hash) 
                     |              V  
                   HashFile     HashFile
And then you'll be fine. :)
-craig

"You can never have too many knives" -- Logan Nine Fingers
vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Post by vdr123 »

Thank You...
I have tried it and it works fine with different hash-files (as in your picture)...
but I was looking to do it using one hashfile (the same one)...
it will make it easier, as I will have many lookup links and many update links from the transformer depending on the link constraints.

And that was what I was trying to ask (from/to the same COPY of the hashfile).
mhester
Participant
Posts: 622
Joined: Tue Mar 04, 2003 5:26 am
Location: Phoenix, AZ
Contact:

Post by mhester »

You certainly can read and write to the same hash file from the same transformer, although it may not give you the result you are expecting if you have things like caching enabled.

If it is an SCD then this may be what you want, and in this case it will work as designed.

Regards,

Michael
Last edited by mhester on Wed Mar 31, 2004 2:29 pm, edited 1 time in total.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

vdr123 wrote:I have tried it and it works fine with different hash-files(as in ur picture)... but i was looking to do it using one hashfile(same one)...
:? It is one physical hash file. You just need to reference it using two different Hash stages, both pointing to the same hash file. As you've found, you cannot use the same Stage to do both the writing and the lookup.

And for Michael, in my experience it is expected (and desired) that the first lookup fails for a new piece of data in a job like this. You then write that record to the Hash so that subsequent lookups do not fail.
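A minimal sketch of that lookup-then-write pattern, in plain Python rather than DS Basic (the function name and the surrogate-key use case are illustrative assumptions, not anything from the job above):

```python
# Sketch of the pattern: the first lookup for a new key is expected
# to fail; we then write the key back so every later row finds it.

def assign_keys(rows, hashed_file, next_key):
    out = []
    for natural_key in rows:
        if natural_key not in hashed_file:       # lookup link: fails for new data
            hashed_file[natural_key] = next_key  # update link: write it back
            next_key += 1
        out.append((natural_key, hashed_file[natural_key]))
    return out

print(assign_keys(["x", "y", "x"], {}, 100))
# [('x', 100), ('y', 101), ('x', 100)]
```

The second "x" finds the key the first "x" wrote, which is exactly what breaks if a read cache snapshot sits between the lookup and the write.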
-craig

"You can never have too many knives" -- Logan Nine Fingers
mhester
Participant
Posts: 622
Joined: Tue Mar 04, 2003 5:26 am
Location: Phoenix, AZ
Contact:

Post by mhester »

Craig,

Good call. I always thought you could use the same stage for both reading and writing but you are correct and this should solve their problem.

Regards,

Michael
mhester
Participant
Posts: 622
Joined: Tue Mar 04, 2003 5:26 am
Location: Phoenix, AZ
Contact:

Post by mhester »

And for Michael, in my experience it is expected (and desired) that the first lookup fails for a new piece of data in a job like this. You then write that record to the Hash so that subsequent lookups do not fail.
Craig,

You are correct. I was thinking of something totally different when I wrote this, and then tried to correct it before anyone read it (too slow :-)).

Regards,

Michael
vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Post by vdr123 »

True...it is one physical hash-file...so why did Ascential provide the option to have both a reference link and a normal link to the hashfile stage???
Do you think there was a reason for that???

All I am trying to do is avoid caching the hashfile...just doing a lookup and updating it...it's a trade-off to write each row to a physical file and use it for lookups.

Just wanted to try, if there was an option!!!

Thank You all for feedback.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You could do it but for a fundamental rule in DataStage: no passive stage can open any of its outputs until all of its inputs are closed.

It has always been thus.

The pre-compiler detects this situation and reports a cyclic dependency if the input and output links connect to the same active stage.

That's why you need separate stages - each establishes a separate connection (to the hashed file, in this case).
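The loop the compiler objects to is visible if you model the job as a directed graph of links between stages. This is a toy illustration in Python (the stage names and the cycle-check routine are mine, not anything DataStage exposes):

```python
# Toy model: links form a directed graph between stages. One hash stage
# used for both lookup and update creates a cycle through the transformer;
# two stages over the same physical file do not.

def has_cycle(edges):
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)
    visiting, done = set(), set()

    def dfs(node):
        if node in visiting:
            return True                  # back edge: cycle found
        if node in done:
            return False
        visiting.add(node)
        if any(dfs(n) for n in graph.get(node, [])):
            return True
        visiting.discard(node)
        done.add(node)
        return False

    return any(dfs(n) for n in list(graph))

one_stage  = [("input", "xfm"), ("xfm", "output"),
              ("hash", "xfm"), ("xfm", "hash")]        # same stage both ways
two_stages = [("input", "xfm"), ("xfm", "output"),
              ("hash_ref", "xfm"), ("xfm", "hash_upd")]

print(has_cycle(one_stage), has_cycle(two_stages))  # True False
```

Splitting the reference and update links across two stages removes the hash→transformer→hash cycle, which is the fix Craig drew earlier in the thread.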
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.