Hashed file corruption
Posted: Thu Mar 24, 2011 9:18 am
Hello all,
We have a job that has been running for months without issue.
This morning the job fails while updating a hashed file.
The log has this entry:
"Job_DR07BR1D_ext..hf_DR07BU0D_cnts.ds_household_tbl_cnt_out: ds_uvput() - Write failed for record id 'HOUSEHOLD_TBL'"
When I try to view the hashed file in the job, I get this error:
"Error calling subroutine:
DSD.Browse (Action=3); check DataStage is set up correctly in project ...."
I logged into the Administrator client and tried to run uvfixfile, but I get the message:
"hf_DR07BU0D_cnts is not a filename in voc"
We use a directory path, not an account name, when we create hashed files.
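My understanding is that because the file lives at a directory path rather than in the account, TCL-based tools can't see it until a VOC pointer exists. If SETFILE works the way the UniVerse docs describe, I'm guessing something like this from the TCL prompt would create the pointer so uvfixfile could then be run against it (untested, and the path below is just a placeholder for wherever the file actually sits):

"SETFILE /path/to/hashed/hf_DR07BU0D_cnts hf_DR07BU0D_cnts OVERWRITING"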
This hashed file is created and initialized by another job and then updated by a series of jobs with load counts.
Is there another way to run a fix on this file?
Thanks,
DaleK