HASHED FILE PROPERTIES
Posted: Mon Dec 11, 2006 3:48 am
Hi all,
I have a hashed file in my job with TableName and System as the primary key columns. When I run the job once, I get one entry in the hashed file. I have unchecked the "Create file" option because I need to keep the entries for all the tables that I run.
The first time I ran the job, I got no error.
The second time I ran it, the data was loaded into the target, but I got a warning like this:
Aesamplejobname..Hsh.WriteMaxvalue_Hsh: ds_uvput() - Write failed for record id 'RDM_SAMPLE_TABLE'
RAM
Can't I update the hashed file for the key columns mentioned, like 'RDM_SAMPLE_TABLE'RAM'?
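For context on the composite key above: a minimal sketch of how a multi-column hashed-file record id is commonly understood to be formed. This assumes (it is not confirmed in the post) that DataStage concatenates the key columns with the UniVerse text mark character (@TM, ASCII 251), which would explain why the warning shows only the first key part, 'RDM_SAMPLE_TABLE'. The function and values here are illustrative only.

```python
# Hedged sketch: composite record id for a hashed file with key columns
# TableName and System. Assumption: key parts are joined with the UniVerse
# text mark character (@TM, chr(251)); this is not stated in the post.
TM = chr(251)  # assumed text-mark separator between key columns

def record_id(*key_parts: str) -> str:
    """Join key column values into one hashed-file record id (illustrative)."""
    return TM.join(key_parts)

rid = record_id("RDM_SAMPLE_TABLE", "RAM")
print(rid.split(TM))  # recover the two key parts from the record id
```

If this assumption holds, the record written for TableName = 'RDM_SAMPLE_TABLE' and System = 'RAM' is one record keyed on both values together, not two separate keys.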
I even get the following phantom error:
DataStage Job 4703 Phantom 3205
Program "DSP.ActiveRun": Line 51, WRITE attempt on read-only file.
DataStage Phantom Finished
What should I do to update the record in the hashed file?