
Hashed file issue

Posted: Tue Nov 30, 2010 9:39 am
by dr.murthy
Hi,

I used the same hashed file twice in my job, once as a reference (lookup) and once as a write target. The issue is that the records written to the hashed file are not available as reference records during the same run; they only become usable as reference data on the next run.
However, I need to use the records loaded into the hashed file as a reference within the same run.

Any suggestions???

Thanks in advance

Posted: Tue Nov 30, 2010 10:50 am
by chulett
Don't cache the hashed file for reading. That or read about what the 'Locked for update' option entails.

Posted: Tue Nov 30, 2010 10:59 am
by dr.murthy
For the update action, both options ('Clear file before writing' and 'Backup existing file') are unchecked.

Posted: Tue Nov 30, 2010 1:12 pm
by ray.wurlod
That's got nothing to do with read and write cache.

You also need to ensure that the read and the write are handled in the same Transformer stage. "Lock for update" when reading is also recommended; it sets a row-level lock in anticipation of the write when the record is not found.

Posted: Tue Nov 30, 2010 9:58 pm
by dr.murthy
Hi ray,

Where can I set this "lock for update" option? I am not able to see your reply fully.

Posted: Tue Nov 30, 2010 10:13 pm
by chulett
Before we go down that potentially rocky path, why not just make sure the lookup's Pre-load file to memory option is set to Disabled and see if that "fixes" it.
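To make the point concrete, here is a rough sketch in plain Python (not DataStage, and with made-up names and data) of the difference between a lookup against a pre-loaded in-memory copy and a lookup that goes back to the file on every row:

    # Rough illustration (not DataStage itself) of the effect of the
    # "Pre-load file to memory" lookup option on a hashed file that is
    # also being written during the same run.

    hashed_file = {}                 # stands in for the on-disk hashed file

    # With pre-load enabled, the lookup works against a snapshot taken
    # before any rows are processed, so rows written during the run
    # are never seen by the lookup.
    preloaded_copy = dict(hashed_file)

    def lookup(key, use_cache):
        source = preloaded_copy if use_cache else hashed_file
        return source.get(key)       # None models a lookup miss (blank output)

    for key in ["A1", "A1"]:
        hit = lookup(key, use_cache=True)       # flip to False to model "Disabled"
        if hit is None:
            hashed_file[key] = "row for " + key  # write to the hashed file
        print(key, "->", hit)

    # With use_cache=True both lookups miss; with use_cache=False the
    # second "A1" finds the row written by the first one.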

Posted: Tue Nov 30, 2010 11:15 pm
by dr.murthy
Yes, I tried setting the 'Pre-load file to memory' option to Disabled, but it still isn't working.
Also, the record-level read option is disabled, so I am not able to check it.

Posted: Wed Dec 01, 2010 8:27 am
by chulett
Then you need to fully explain your job design / layout in gory detail, particularly how you are reading from and writing to the hashed file in question, including all settings in both stages. Reading from and writing to the same hashed file in a Server job is fundamental stuff and works just fine when set up properly, so we need to see where you've gone wrong here.

Posted: Wed Dec 01, 2010 11:12 am
by dr.murthy
Yes, in my job I have one source file, one Transformer, one target file, and two hashed file stages (the same hashed file used twice, once for reading and once for writing).

I have three fields, say A, B, and C, coming from my source. The hashed file has three fields: A, D (populated from B with some transformations), and E (a key management ID). My target file has three fields: A, E, and F (populated from D with some transformations), where E and F both come from that hashed file.
In this design, A is the key field.

On the first run, only field A is populated in my target; E and F come out as blank values, because those two fields should be populated from the hashed file,
but none of the records get a lookup hit within the same run.

Posted: Wed Dec 01, 2010 1:15 pm
by ray.wurlod
Then the hashed file was not populated correctly. Check your work. Also you should use consistent column naming.

Posted: Wed Dec 01, 2010 2:39 pm
by vinnz
You said one transformer - are the target file and the hashed file being populated from the same transformer? In that case, the first time a new key value is encountered it would not already be in the hashed file, and it gets written to the hashed file and the target at the same time. The fields that you are trying to look up would only return data the second time the key is encountered in your data within that run.
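As a rough illustration of that ordering (plain Python with made-up sample data, not the actual job):

    # The lookup and the write happen in the same pass, so a key only
    # gets lookup data back the second time it appears within the run.

    source_rows = [("K1", "b1"), ("K2", "b2"), ("K1", "b3")]   # (A, B) sample rows

    hashed_file = {}        # same hashed file, read and written per row
    target = []

    for a, b in source_rows:
        found = hashed_file.get(a)          # lookup on key A
        d = "derived(" + b + ")"            # stand-in for the B -> D transformation
        hashed_file[a] = d                  # write/refresh the reference row
        target.append((a, found))           # F would be derived from the looked-up D

    print(target)
    # [('K1', None), ('K2', None), ('K1', 'derived(b1)')]
    # The first occurrence of each key misses (blank E/F); only the
    # repeated key K1 gets data back within the same run.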

Can you provide some sample data along with what you are trying to accomplish?

Posted: Wed Dec 01, 2010 10:33 pm
by prasad.bodduluri
Split the job into two parts.