Hi all,

I have a hashed file in my job with table name and system as the primary key columns. Each run of the job adds one entry to the hashed file, and I have unchecked the "Create file" option because I need to keep the entries for all the tables I run.

The first run completes with no errors. On the second run the data is loaded into the target, but I get a warning like this:

Aesamplejobname..Hsh.WriteMaxvalue_Hsh: ds_uvput() - Write failed for record id 'RDM_SAMPLE_TABLE'
RAM

Can't I update the hashed file for the key values mentioned, i.e. 'RDM_SAMPLE_TABLE' and 'RAM'?

I even get the following phantom error:

DataStage Job 4703 Phantom 3205
Program "DSP.ActiveRun": Line 51, WRITE attempt on read-only file.
DataStage Phantom Finished

What should I do to update the record in the hashed file?
Hashed File Properties
Has the file system become full? Has a disk quota been exceeded? Is the hashed file intact (what happens when you attempt to count the records in it)? Please confirm that your column definitions show two Key columns.
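The first two checks can be done from a shell on the DataStage server. A minimal sketch, assuming a dynamic (type 30) hashed file, which is stored as an operating-system directory containing DATA.30 and OVER.30; the project path and the hashed file name `Hsh.WriteMaxvalue` below are placeholders taken from the warning message, so substitute your own:

```shell
# Placeholder: set this to the directory that contains the hashed file
# (typically the DataStage project directory, or the pathed location).
PROJ=${PROJ:-.}

# 1. Is the file system full? Hashed file writes fail on a full disk.
df -k "$PROJ"

# 2. Does the user running the job have write permission? A dynamic
#    (type 30) hashed file is a directory holding DATA.30 and OVER.30,
#    and write permission is needed on the directory and both files.
ls -ld "$PROJ"/Hsh.WriteMaxvalue 2>/dev/null \
  || echo "Hsh.WriteMaxvalue not found under $PROJ"
```

If the permissions show the file (or its DATA.30/OVER.30 members) as not writable by the job's user, that would explain the "WRITE attempt on read-only file" phantom message.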
Please execute the following command from the DataStage command prompt and post the result. Note: if your hashed file is in a directory you will need to execute a SETFILE command first.

Code:
LIST DICT <<HashedFileName>>
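For reference, a sketch of the SETFILE step at the TCL prompt, for the case where the hashed file lives in a directory rather than in the project's account. The path and the name `Hsh.WriteMaxvalue` are hypothetical, taken from the warning message; substitute your own. SETFILE creates a VOC pointer so that TCL commands such as LIST and COUNT can address the file by name:

```
SETFILE /u1/dsadm/Projects/MyProject/Hsh.WriteMaxvalue Hsh.WriteMaxvalue OVERWRITING
LIST DICT Hsh.WriteMaxvalue
COUNT Hsh.WriteMaxvalue
```

The COUNT also answers the earlier question about whether the hashed file is intact: if it is corrupted, COUNT will typically fail or report an error rather than a record count.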
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.