Hashed file issue
Moderators: chulett, rschirm, roy
Hi,
I used the same hashed file twice in my job: once as a reference and once as a write target. The problem is that records written to the hashed file are not available to the reference lookup in the same run; only on the next run are those records usable as a reference.
However, I need the records loaded into the hashed file to be usable as a reference within a single run.
Any suggestions???
Thanks in advance
D.N .MURTHY
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
That's got nothing to do with read and write cache.
Also, you need to ensure that the read and the write are in the same Transformer stage. "Lock for update" is also recommended when reading: it sets a row-level lock in anticipation of the write when the record is not found.
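To illustrate the caching point above, here is a toy sketch (Python, not DataStage code; the class and names are invented for illustration) of why a write-cached hashed file cannot serve same-run reference lookups: cached writes only reach the file when the cache flushes, so the reference link never sees them during the run.

```python
class HashedFile:
    """Toy stand-in for a hashed file keyed on one column."""

    def __init__(self, write_cached=False):
        self.disk = {}           # rows actually in the file
        self.cache = {}          # pending writes when caching is enabled
        self.write_cached = write_cached

    def write(self, key, row):
        if self.write_cached:
            self.cache[key] = row    # deferred: flushed at end of job
        else:
            self.disk[key] = row     # visible to lookups immediately

    def lookup(self, key):
        return self.disk.get(key)    # reference link reads the file only

    def flush(self):
        self.disk.update(self.cache)
        self.cache.clear()

cached = HashedFile(write_cached=True)
cached.write("A1", {"D": "x"})
print(cached.lookup("A1"))           # None - write not yet visible

direct = HashedFile(write_cached=False)
direct.write("A1", {"D": "x"})
print(direct.lookup("A1"))           # {'D': 'x'} - same-run lookup works
```

With caching enabled, the row only becomes visible after `flush()`, which in a real job happens at end-of-job; that matches the "works on the next run" symptom described in the question.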
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Then you need to fully explain your job design / layout in gory detail, particularly how you are reading from and writing to the hashed file in question, including all settings in both stages. Reading from and writing to the same hashed file in a Server job is fundamental stuff and works just fine when set up properly, so we need to see where you've gone wrong here.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Yes, my job has one source file, one Transformer, one target file, and two hashed file stages (the same hashed file used twice, once for read and once for write).
Three fields, say A, B and C, come from my source. The hashed file has three fields: A, D (populated from B with some transformations), and E (a key-management ID). My target file has three fields: A, E, and F (populated from D with some transformations), where E and F come from the same hashed file.
In this design, A is the key field.
On the first run, only field A is populated in my target; E and F come out as blank values, because those two fields should be populated from the hashed file,
but none of the records hit in the same run.
D.N .MURTHY
You said one transformer - are the target file and the hashed file being populated from the same transformer? In that case, the first time a new key value is encountered it would not already be in the hashed file and will get written to the hashed file & target at the same time. The fields that you are trying to lookup would only return data the second time the key is encountered in your data within that run.
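The behaviour described above can be sketched in a few lines (Python, not DataStage; the sample rows and the `upper()` transformation are made up for illustration). When the lookup and the write happen in the same pass, a key's first row always misses the lookup, because it is written only after its own lookup; only later rows with the same key find it.

```python
hashed = {}            # stands in for the hashed file, keyed on A
target = []            # rows landing in the target file

source = [("A1", "b1"), ("A2", "b2"), ("A1", "b3")]  # made-up sample rows

for a, b in source:
    hit = hashed.get(a)        # reference lookup happens first
    target.append((a, hit))    # lookup fields are blank on a miss
    hashed[a] = b.upper()      # then the same row is written

print(target)   # [('A1', None), ('A2', None), ('A1', 'B1')]
```

Each distinct key misses once (its first occurrence) and hits from its second occurrence onward within the run, which is why a first run with all-unique keys produces entirely blank lookup columns.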
Can you provide some sample data along with what you are trying to accomplish?
- Participant
- Posts: 30
- Joined: Tue Jan 30, 2007 5:21 am
- Location: bangalore