
Fatal error in Server job due to ds_uvput()

Posted: Thu Jul 19, 2007 12:33 pm
by DS4DRIVER
While the job was running on the DataStage server, it aborted with the following fatal error. The job reads a Sequential file and writes to a Hash file.

stg_SUMRY_HST_ins: ds_uvput() - Write failed for record id 'FNAA3070B
1
-70001
2007-07-03
9999-01-01 00:00:00.000000'


The job finished successfully in the next run.
Is there any explanation for why the job failed the first time?

Posted: Thu Jul 19, 2007 3:16 pm
by ray.wurlod
Possibly because there was a null in one of the key columns being written to the hashed file.

Hashed file, not hash file.
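If you want to rule that out on the next run, a Transformer constraint on the link feeding the hashed file can reject any row with a null in a key column before the write is attempted. A rough sketch, where the link name "In" and the column names KEY1 through KEY5 are placeholders for your actual five keys:

    Not(IsNull(In.KEY1)) And Not(IsNull(In.KEY2)) And Not(IsNull(In.KEY3)) And Not(IsNull(In.KEY4)) And Not(IsNull(In.KEY5))

Rows failing the constraint can be sent down a reject link, so you can see exactly which record (if any) is the offender.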

Posted: Thu Jul 19, 2007 3:46 pm
by DS4DRIVER
That is not the case.
There are 5 key fields in the hashed file, and all of them have values for this record.

ds_uvput() - Write failed for record id 'FNAA3070B
1
-70001
2007-07-03
9999-01-01 00:00:00.000000'

Posted: Thu Jul 19, 2007 3:47 pm
by DS4DRIVER
Also, the next time the job ran with the same source file, it finished successfully.

Posted: Thu Jul 19, 2007 6:21 pm
by chulett
Does another run of the 'same file' guarantee that the exact same records will be written to the hashed file? If that's the case and no job changes were made between runs, is there any chance that you ran out of disk space where that hashed file lives on the first run?
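For what it's worth, both possibilities are cheap to check; the path and file name below are examples, so substitute your own. Free space on the filesystem holding the hashed file can be checked with an OS-level command such as

    df -k /path/to/hashed/file

and, if it is a local (account) hashed file, its structure and overflow can be inspected from the DataStage engine shell with

    ANALYZE.FILE YourHashedFile

A full filesystem at write time would explain a one-off ds_uvput() failure that disappears on a re-run after space is freed.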