Lookup & update of HashFile from same transformer
Moderators: chulett, rschirm, roy
Can we have lookup & update of the same hashfile from the same transformer?
From a transformer I have to "lookup" a hashfile and also "update" the same hashfile if the key is not already in it (one dashed link for lookup & one link for update to the hashfile).
Yes, just keep in mind that read and write caching will prevent a subsequent row from "seeing" the preceding row if it tries to reference it. You must not use caching if you need to reference a row written earlier in the same run. If you don't care, then use write-delay caching, and if necessary, you can use a read cache.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
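Ken's caching caveat can be illustrated with a toy model. This is plain Python, not DataStage; the class and method names below are invented purely for illustration. The point is that a write which lands in a write cache is invisible to lookups until the cache is flushed, and in a real job that flush happens too late for row-by-row self-reference.

```python
# Toy Python analogy (not DataStage code): the class and names below are
# invented to illustrate why write caching hides freshly written rows.

class HashedFile:
    """Stand-in for a hashed file with an optional write cache."""
    def __init__(self):
        self.disk = {}          # rows actually committed to the file
        self.write_cache = []   # rows buffered while write caching is on

    def write(self, key, row, cached=False):
        if cached:
            self.write_cache.append((key, row))  # deferred, not yet visible
        else:
            self.disk[key] = row                 # visible immediately

    def lookup(self, key):
        return self.disk.get(key)  # lookups see only committed rows

    def flush(self):
        # In the analogy this is the end-of-job cache flush -- too late
        # for a later row in the same run to have seen the data.
        for key, row in self.write_cache:
            self.disk[key] = row
        self.write_cache.clear()

hf = HashedFile()
hf.write("A", {"id": "A"}, cached=True)
print(hf.lookup("A"))   # None: the cached write is invisible to the next row
hf.flush()
print(hf.lookup("A"))   # {'id': 'A'}: visible only after the flush
```

With `cached=False` the second lookup would succeed immediately, which is the behaviour the original poster needs.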
Ken, thank you for the reply...
I tried doing it and it gives a "cyclic or linear dependencies" error on the transformer.
I need to keep the hashed file up to date so that the next record processed does its lookup against the updated hashfile (and doesn't miss a lookup for a record already processed earlier in the flow).
vdr123 wrote: I tried doing it and it gives "cyclic or linear dependencies" error on the transformer.

This error means you've created a 'loop' in your job stream. What you are trying to do is pretty basic stuff and should work fine when properly laid out. If you can't get past this error, please post a "picture" of your job flow. Use the "code" tags and make use of the Preview button to ensure that everything lines up and makes sense. Then we should be able to tell you how to adjust it to correct the error.
-craig
"You can never have too many knives" -- Logan Nine Fingers
I know the thing I am doing is basic... but it's useful to show the flow design... I am not able to post a pic of the job...
Here is what I am trying to do...
Code:

    input ----> Transform(lookup/update hashfile) ----> output
                   ^                 |
                   |                 |
          (lookup) |                 | (not a dotted line)
                   |                 | (update hash)
                   |                 V
                        HashFile
PS: Tried to make it make sense
You did great with the picture... and it shows the loop in your job, from transformer to hash and back again.
You need to use two separate Hash stages, both pointing to the same hash file - this will break the loop. More like:

Code:

    input ----> Transform(lookup/update hashfile) ----> output
                   ^                 |
                   |                 |
          (lookup) |                 | (not a dotted line)
                   |                 | (update hash)
                   |                 V
               HashFile           HashFile

And then you'll be fine. :)
-craig
"You can never have too many knives" -- Logan Nine Fingers
Thank You...
I have tried it and it works fine with different hash files (as in your picture)...
but I was looking to do it using one hashfile (the same one)...
it will make things easier, as I will have many lookup links and many update links from the transformer depending on the link constraint.
And that was what I was trying to ask (from/to the same COPY of the hashfile).
You certainly can read and write to the same hash from the same transformer, although it may not give you the result you are expecting if you have things like caching enabled.
If it is an SCD then this may be what you want, and in this case it will work as designed.
Regards,
Michael
Last edited by mhester on Wed Mar 31, 2004 2:29 pm, edited 1 time in total.
Mike Hester
mhester@petra-ps.com
vdr123 wrote:I have tried it and it works fine with different hash-files(as in ur picture)... but i was looking to do it using one hashfile(same one)...
:?
And for Michael, in my experience it is expected (and desired) that the first lookup fails for a new piece of data in a job like this. You then write that record to the Hash so that subsequent lookups do not fail.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
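The pattern Craig describes (the first lookup for a new key is expected to fail, you write the row back, and later rows then find it) can be sketched in plain Python. This is an analogy only; the dict stands in for the uncached hashed file, and the field names are invented for the example.

```python
# Analogy only: the dict stands in for an uncached hashed file that a
# transformer both references (lookup) and writes back to on a miss.

def process(rows):
    hashed = {}   # the "hashed file": no read or write cache
    out = []
    for row in rows:
        key = row["key"]
        if key not in hashed:      # first lookup for a new key fails...
            hashed[key] = row      # ...so write it for subsequent rows
            out.append((key, "new"))
        else:
            out.append((key, "seen"))
    return out

print(process([{"key": "A"}, {"key": "A"}, {"key": "B"}]))
# [('A', 'new'), ('A', 'seen'), ('B', 'new')]
```

With caching enabled, the second "A" row would also report "new", which is exactly the surprise Michael warns about above.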
Craig,
Good call. I always thought you could use the same stage for both reading and writing but you are correct and this should solve their problem.
Regards,
Michael
Mike Hester
mhester@petra-ps.com
Craig,
chulett wrote: And for Michael, in my experience it is expected (and desired) that the first lookup fails for a new piece of data in a job like this. You then write that record to the Hash so that subsequent lookups do not fail.

You are correct; I was thinking of something else when I wrote this, and then tried to correct it prior to anyone reading (too slow :-) ).
Regards,
Michael
Mike Hester
mhester@petra-ps.com
True... it is one physical hash-file... so why did Ascential provide an option to have both a reference link and a normal link to the hashfile stage?
Do you think there was a reason for that?
All I am trying to do is avoid caching the hashfile... just doing the lookup and updating it... it's a trade-off to write each row to a physical file and use it for lookup.
Just wanted to try... if there was an option!
Thank you all for the feedback.
You could do it but for a fundamental rule in DataStage: no passive stage can open any of its outputs until all of its inputs are closed.
It has always been thus.
The pre-compiler detects this situation and reports a cyclic dependency if the input and output links connect to the same active stage.
That's why you need separate stages - each establishes a separate connection (to the hashed file, in this case).
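The pre-compiler check Ray describes amounts to cycle detection on the stage/link graph. Here is a minimal sketch of that idea; the stage names are invented, and this is an illustration of the general technique, not the actual DataStage implementation.

```python
# Illustration only: depth-first search for a cycle in a stage/link graph.
# Stage names are invented; this is not DataStage's actual compiler code.

def has_cycle(links):
    """links: dict mapping a stage name to its downstream stages."""
    WHITE, GREY, BLACK = 0, 1, 2    # unvisited, in progress, done
    state = {}

    def visit(node):
        state[node] = GREY
        for nxt in links.get(node, []):
            s = state.get(nxt, WHITE)
            if s == GREY:            # back edge: we found a loop
                return True
            if s == WHITE and visit(nxt):
                return True
        state[node] = BLACK
        return False

    return any(visit(n) for n in links if state.get(n, WHITE) == WHITE)

# One hashed-file stage both feeding and fed by the transformer: a loop.
single_stage = {"input": ["xfm"], "xfm": ["out", "hash"], "hash": ["xfm"]}
# Two separate stages pointing at the same physical file: no loop.
two_stages = {"input": ["xfm"], "hash_read": ["xfm"], "xfm": ["out", "hash_write"]}

print(has_cycle(single_stage))   # True
print(has_cycle(two_stages))     # False
```

The second graph is exactly Craig's two-stage layout: both hashed-file stages name the same physical file, but because they are distinct nodes in the graph, no cycle exists and the job compiles.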
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.