While DataStage server was running, the job aborted due to the following Fatal error. The job reads a Sequential file and writes to a Hash file.
stg_SUMRY_HST_ins: ds_uvput() - Write failed for record id 'FNAA3070B
1
-70001
2007-07-03
9999-01-01 00:00:00.000000'
The job successfully finished in the next run.
Is there any explanation why the job failed the first time?
Fatal error in Server job due to ds_uvput()
Does another run of the 'same file' guarantee that the exact same records will be written to the hashed file? If that's the case and no job changes were made between runs, is there any chance that you ran out of disk space where that hashed file lives on the first run?
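If you want to rule out the disk-space theory, one quick check is free space on the filesystem holding the hashed file. A minimal sketch (the HASH_DIR path is hypothetical; point it at your actual hashed file directory):

```shell
# Hypothetical path -- set HASH_DIR to the directory containing the hashed file.
HASH_DIR="${HASH_DIR:-.}"
# POSIX df -Pk: second row, fourth column is available space in 1K blocks.
df -Pk "$HASH_DIR" | awk 'NR==2 {print $4 " KB available"}'
```

If available space is near zero at the time of the first run, a failed ds_uvput() write would be consistent with running out of room.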
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers