WriteHash() - Write failed for record id

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

attu
Participant
Posts: 225
Joined: Sat Oct 23, 2004 8:45 pm
Location: Texas

WriteHash() - Write failed for record id

Post by attu »

When we move our job from dev to test and try to run it, we get this fatal error message:

Code:

WriteHash() - Write failed for record id 'BRT019453K 53425'
After we re-compile the job it runs fine. It does not look like a data issue, because the same data loads cleanly once the job is recompiled. The fatal error comes from two of the hashed file stages: we are writing to a hashed file and doing a lookup against it at the same time. The update action is "Clear file before writing", and "Allow stage write cache" is enabled on the hashed file stage.
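One thing worth doing when the job aborts (before recompiling) is to inspect the hashed file itself from the UniVerse/TCL prompt in the project directory, to see whether its internal structure is damaged. A hedged example; MyHashedFile is a placeholder for your own file name:

Code:

ANALYZE.FILE MyHashedFile
FILE.STAT MyHashedFile

ANALYZE.FILE reports the file type and sizing, and FILE.STAT reports group and record statistics; both tend to complain visibly if the file is corrupted.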

What could be the reason for the job abort, and why does it run fine after compiling?

Please provide feedback. Thanks.
Vinodanand
Premium Member
Posts: 112
Joined: Mon Jul 11, 2005 7:54 am

Post by Vinodanand »

A search on "Write failed for record id" would have provided you with the solution. Here is my bit to help you out:

http://dsxchange.com/viewtopic.php?t=11 ... ce9937713d
dsvsinformatica
Participant
Posts: 2
Joined: Fri Aug 15, 2008 9:51 am
Location: bay area

Post by dsvsinformatica »

Is the same user running the job in both environments, or different users? Also, what are the ulimit settings for those users?
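For what it's worth, the limits can be checked from the shell as the user that actually runs the DataStage processes (often dsadm, but that's an assumption about your setup):

Code:

# run as (or su to) the user that runs the DataStage jobs
ulimit -a     # show all current limits
ulimit -f     # max file size in 512-byte blocks; keep this high or unlimited for hashed files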
sysmq
Premium Member
Posts: 29
Joined: Wed Aug 22, 2007 12:58 am

Post by sysmq »

Hi,
I have the same error. I went through the forum looking for ideas and checked all the recommendations mentioned there (data issues, file size overflow, etc.), yet, as others have said, only a re-compile solves the issue. I cannot do that on a regular basis on a production server.

Any other ideas?

Here is the error message:

Code:

job.pearson.LT_After_Reject: WriteHash() - Write failed for record id '4597291'
Sathishkumarins
Premium Member
Posts: 41
Joined: Tue Jul 08, 2008 5:45 am
Location: Columbus

Post by Sathishkumarins »

Sometimes the hashed file size may be the problem: a standard 32-bit hashed file cannot grow past 2 GB. Try using a 64-bit hashed file; this might solve your problem, because in a 64-bit hashed file the largest address that can be represented is around 19 million TB.
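If you want to try that, an existing hashed file can be converted in place from the UniVerse/TCL prompt (take a backup first). The file name is a placeholder; the three asterisks keep the current file type, modulo, and separation, so only the addressing changes:

Code:

RESIZE MyHashedFile * * * 64BIT

If your version exposes it, the Hashed File stage's create-file options can also create the file as 64-bit in the first place.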
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

It's definitely not a "64-bit" error; those give different messages. A "Write failed" is usually because the key is illegal (it contains mark characters or is null) or because the hashed file is internally corrupted.
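For the illegal-key case, a minimal DS BASIC sketch of a pre-write check, written as a transform function with an illustrative argument name (not taken from the original job):

Code:

* Returns @TRUE if Arg1 would be an illegal hashed file key
Ans = @FALSE
If IsNull(Arg1) Then Ans = @TRUE
If Index(Arg1, @FM, 1) > 0 Then Ans = @TRUE  ;* field mark in key
If Index(Arg1, @VM, 1) > 0 Then Ans = @TRUE  ;* value mark in key
If Index(Arg1, @SM, 1) > 0 Then Ans = @TRUE  ;* subvalue mark in key

Rows it flags can be sent down a reject link rather than into the hashed file write.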
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
swordmood
Participant
Posts: 1
Joined: Thu Nov 27, 2008 1:05 am

Post by swordmood »

Hi,
I had the same error. Maybe there is not enough memory or disk space; that is worth checking before anything else.
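To rule that out, check free space on the filesystem that holds the project (and therefore the hashed file). The path below is a placeholder:

Code:

df -k /path/to/datastage/Projects/MyProject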