Write Failed for Hashed File

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

raj9176
Participant
Posts: 10
Joined: Tue Dec 06, 2005 3:46 am

Write Failed for Hashed File

Post by raj9176 »

Hi,
The job was running fine, but after I migrated it I am getting the error "ds_uvput() - Write failed for record id ' 880'". Can anyone please help me out with this problem?
Thanks
Raj
loveojha2
Participant
Posts: 362
Joined: Thu May 26, 2005 12:59 am

Post by loveojha2 »

You cannot insert a NULL value into a key column of a Hashed File. This error is pointing to the 880th record; check whether that row from the source contains a NULL in the key column.
Success consists of getting up just one more time than you fall.
raj9176
Participant
Posts: 10
Joined: Tue Dec 06, 2005 3:46 am

Hash Failed

Post by raj9176 »

I did check, and there are no NULL values in the key column. Please let me know what else could be the problem.
Thanks
Raj
loveojha2
Participant
Posts: 362
Joined: Thu May 26, 2005 12:59 am

Re: Hash Failed

Post by loveojha2 »

raj9176 wrote: I did check, and there are no NULL values in the key column. Please let me know what else could be the problem.
Thanks
Raj
Raj, can you tell us how you checked for nulls in the source? Go to the Director, pick up the SELECT statement it shows for the source data, execute it directly against your database, and look for any NULLs or empty strings in the key columns (a sketch follows at the end of this post).
Is there any transformation happening between the source and the write to the Hashed File? Are you applying any derivation to your key columns?
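A minimal sketch of such a check, with hypothetical table and column names (SRC_TABLE, KEY_COL) standing in for whatever your Director-extracted query uses:

SELECT KEY_COL, COUNT(*)
FROM   SRC_TABLE
WHERE  KEY_COL IS NULL                 -- true NULLs
   OR  LTRIM(RTRIM(KEY_COL)) = ''     -- empty or all-blank strings
GROUP  BY KEY_COL;

Any rows this returns are candidates for the failing write.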
Success consists of getting up just one more time than you fall.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

As a test, can you put a derivation of OCONV(In.Key,'MCP') on your key? This will replace unprintable characters with a period ('.'), which would show whether your key contains a field mark or value mark (which is not allowed) or a null.
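For anyone who wants to see what the MCP conversion does before wiring it into a Transformer, here is a minimal DataStage BASIC sketch with a made-up key value:

Key = 'AB' : CHAR(253) : '880'  ;* hypothetical key containing a value mark
Masked = OCONV(Key, 'MCP')      ;* unprintable characters become '.'
PRINT Masked                    ;* prints AB.880

A '.' in the masked output where you expected a printable character marks the offending byte.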
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

loveojha2 wrote:You can not insert NULL value into a key column of a Hashed File, this error is pointing to the 880th record, check whether this row from source is containing any NULL in the key column or not.
Just a slight correction: it's not the 880th record. The message
Write failed for record id ' 880'
means that the primary key value (the record id) is invalid for some reason. See that space before the 880? I suspect that the space is really a reserved character, something very low or very high on the ASCII chart. Arnd has suggested that the poster investigate the row whose primary key contains reserved characters, and that is the proper solution.
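To pin down exactly which character that leading byte is, you can loop over the key in DataStage BASIC and print each character's numeric code with SEQ(). A minimal sketch, assuming KeyVal has been loaded with the suspect key:

FOR I = 1 TO LEN(KeyVal)
   PRINT I : ' = ' : SEQ(KeyVal[I, 1])  ;* numeric code of each character
NEXT I

A plain space prints as 32; field, value and subvalue marks print as 254, 253 and 252.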
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
jinm
Premium Member
Posts: 47
Joined: Tue Feb 24, 2004 1:59 am

Post by jinm »

kcbland wrote: Just a slight correction: it's not the 880th record. The message
Write failed for record id ' 880'
means that the primary key value (the record id) is invalid for some reason.

Correction: something somewhere in the record with ID = '880' is invalid. It need not be in the key column; it can be in any of the columns being written.

Have you posted the full error message?
sb_akarmarkar
Participant
Posts: 232
Joined: Fri Sep 30, 2005 4:52 am
Contact:

Post by sb_akarmarkar »

Why don't you try adding a condition to the source query so that the key column is not null?
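Something along these lines in the user-defined SQL, with hypothetical names (SRC_TABLE, KEY_COL, COL_A, COL_B) in place of the real ones:

SELECT KEY_COL, COL_A, COL_B
FROM   SRC_TABLE
WHERE  KEY_COL IS NOT NULL;

Note that this masks the bad rows rather than explaining them, so treat it as a workaround rather than a fix.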

Thanks,
Anupam
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Write the data to a text file, then use a hex editor (UltraEdit is good, and free for 45 days) to see exactly what you're trying to write. Post the results here.
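If no hex editor is to hand, a rough equivalent inside DataStage BASIC is the MX0C conversion, which I believe renders a string as its hex byte values on UniVerse-based engines (treat this as an assumption and check the OCONV documentation on your release):

PRINT OCONV(' 880', 'MX0C')  ;* prints 20383830 if the lead byte is a plain space

A lead byte other than 20 identifies the reserved character kcbland described.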
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.