HASHED FILE PROPERTIES

Posted: Mon Dec 11, 2006 3:48 am
by parvathi
Hi all,
I have a hashed file in my job with table name and system as the primary key columns. When I run the job once, I get an entry in the hashed file. I have unchecked the "Create file" option because I need to keep the entries for all the tables I run.

The first time I ran the job there was no error. The second time, the data loaded into the target but I got a warning like this:

Aesamplejobname..Hsh.WriteMaxvalue_Hsh: ds_uvput() - Write failed for record id 'RDM_SAMPLE_TABLE'
RAM

Can't I update the hashed file for the key columns mentioned, like 'RDM_SAMPLE_TABLE'RAM'?

I even get the following phantom error:

DataStage Job 4703 Phantom 3205
Program "DSP.ActiveRun": Line 51, WRITE attempt on read-only file.
DataStage Phantom Finished

What should I do to update the record in the hashed file?

Posted: Mon Dec 11, 2006 4:35 am
by ray.wurlod
Has the file system become full? Has a disk quota been exceeded? Is the hashed file intact (what happens when you attempt to count the records in it)? Please confirm that your column definitions show two Key columns.
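One quick way to run the record-count check suggested above: at the DataStage (UniVerse) command prompt, COUNT reads every record in the file and reports the total, so it will also surface corruption. The file name here is a placeholder, not taken from the thread:

```
COUNT MyHashedFile
```

If the count hangs or reports an error rather than a number, the hashed file itself is likely damaged rather than merely unwritable.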

Please execute the following command from the DataStage command prompt and post the result:

Code:

LIST DICT <<HashedFileName>>
Note: if your hashed file is in a directory you will need to execute a SETFILE command first.
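As a hedged sketch of that note: SETFILE creates a VOC pointer so that TCL commands such as LIST DICT can see a pathed hashed file. The path and name below are placeholders, not values from this thread:

```
SETFILE /path/to/Project/MyHashedFile MyHashedFile OVERWRITING
LIST DICT MyHashedFile
```

OVERWRITING simply replaces any existing VOC entry of the same name.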

Posted: Mon Dec 11, 2006 7:01 am
by parvathi
The reason, I have found, is that DataStage is on a Unix server and I have my own login. If I log in to DataStage with my own login I cannot create the hashed file, but if I connect with the generic account I can.

Posted: Mon Dec 11, 2006 9:11 am
by chulett
In other words, you have a permissions problem.
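The quickest way to confirm that on the Unix side is to look at the mode bits of the hashed file's components. A Type 30 hashed file is a directory containing DATA.30 and OVER.30, and the job's login must be able to write both. The sketch below fabricates one in a temporary directory (the real project path and file name are not given in the thread) just to show the check:

```shell
#!/bin/sh
# Illustrative only: paths and the hashed file name are hypothetical.
demo=$(mktemp -d)
mkdir "$demo/MyHashedFile"               # stand-in for the hashed file directory
: > "$demo/MyHashedFile/DATA.30"         # data portion of a Type 30 file
: > "$demo/MyHashedFile/OVER.30"         # overflow portion
chmod 444 "$demo/MyHashedFile/DATA.30"   # simulate a file left read-only by another account

# The check: the mode string must show write permission for the job's user.
ls -l "$demo/MyHashedFile/DATA.30" | cut -c1-10

# A "-r--r--r--" here for the job's user explains a
# "WRITE attempt on read-only file"; the fix is chmod/chown
# by the owning (generic) account or an administrator.
chmod -R u+w "$demo" && rm -rf "$demo"
```

Run the same `ls -l` against the real hashed file directory under the project; if the owner is the generic account with no group or world write permission, your own login will be able to read the file but every write will fail exactly as in the phantom error.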