Hash file error

Mark j
Participant
Posts: 20
Joined: Tue Apr 20, 2004 9:26 am

Post by Mark j »

Hi all,
I am getting these warnings when I try to write to a hashed file. What might be the reason for this error? I am getting 8 million warnings, one for each record:

ds_uvput() - Write failed for record id

Thanks
Mark
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

There are many reasons a write to a hashed file might fail. You can search the Forum for some of them, which include operating system permissions, file full (there is a 2GB limit by default, but in this case you would not get warnings for the first 2GB worth of records), disk quota exceeded, file corrupted, and so on.
Analyze the hashed file to determine which, if any, of these is occurring and post the results. You may, in the process, solve it yourself but post the result anyway; this forum is about sharing such things.
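One way to run that analysis and keep the output with the job is a small DS Basic routine that issues ANALYZE.FILE through DSExecute and copies the result into the job log. This is only a sketch; the routine name and the hashed file name are placeholders, not anything from your project:

    * AnalyzeHashedFile - illustrative routine (names are placeholders)
    * InputArg: the hashed file name as the project knows it, e.g. "MyHashedFile"
    SUBROUTINE AnalyzeHashedFile(InputArg, ErrorCode)
       ErrorCode = 0

       * Run the UniVerse ANALYZE.FILE command against the hashed file
       CALL DSExecute("UV", "ANALYZE.FILE " : InputArg, Output, SystemReturnCode)

       * Output comes back as a dynamic array, one field mark per line;
       * write each line to the job log so it is captured with the run
       FOR LineNo = 1 TO DCOUNT(Output, @FM)
          CALL DSLogInfo(FIELD(Output, @FM, LineNo), "AnalyzeHashedFile")
       NEXT LineNo

       IF SystemReturnCode <> 0 THEN
          CALL DSLogWarn("ANALYZE.FILE returned status " : SystemReturnCode, "AnalyzeHashedFile")
          ErrorCode = SystemReturnCode
       END
    RETURN

If the analysis will not run at all, check operating system permissions on the file's directory and the free space on that file system, from the same account that runs the job.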
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Mark j
Participant
Posts: 20
Joined: Tue Apr 20, 2004 9:26 am

Post by Mark j »

Thanks for your feedback, Ray.

Since we are testing with a full load of data, the disk space was taken up by other programs. A full disk turned out to be the reason my writes to the hashed file were failing.
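For anyone who hits this later: since the cause was a full disk, one guard worth considering is a before-job routine that checks free space and aborts the job early instead of logging millions of warnings. A rough sketch only; the routine name, file system, and threshold below are placeholders, and the df column position may differ on your platform:

    * CheckFreeSpace - hypothetical before-job subroutine
    * InputArg: "<file system> <minimum free KB>", e.g. "/data 500000"
    SUBROUTINE CheckFreeSpace(InputArg, ErrorCode)
       ErrorCode = 0
       FileSystem = FIELD(InputArg, " ", 1)
       MinFreeKB  = FIELD(InputArg, " ", 2)

       * Ask the operating system how much space is left
       CALL DSExecute("UNIX", "df -k " : FileSystem, Output, SystemReturnCode)

       * The second line of df output holds the numbers; with repeated
       * blanks collapsed, the available KB is usually column 4
       DataLine = TRIM(FIELD(Output, @FM, 2))
       FreeKB   = FIELD(DataLine, " ", 4)

       IF SystemReturnCode <> 0 OR NUM(FreeKB) = 0 THEN
          CALL DSLogWarn("Could not read df output for " : FileSystem, "CheckFreeSpace")
       END ELSE
          IF FreeKB < MinFreeKB THEN
             * A non-zero ErrorCode stops the job before any rows are written
             CALL DSLogWarn("Only " : FreeKB : " KB free on " : FileSystem, "CheckFreeSpace")
             ErrorCode = 1
          END
       END
    RETURN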


Thanks
Mark