Write failed for record

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
igorlp
Participant
Posts: 10
Joined: Thu Mar 09, 2006 2:13 am

Write failed for record

Post by igorlp »

What does this mean?:

JOB_LKP_FACT_LAB_RESULT..LKP_FACT_LAB_RESULT.LNK_TO_LKP: ds_uvput() - Write failed for record id '3040298
20030122
4801
4801
0'

Thanks for the help...
meena
Participant
Posts: 430
Joined: Tue Sep 13, 2005 12:17 pm

Re: Write failed for record

Post by meena »

Hi,
This error is usually because of size. If this is a hashed file, check how big it is; it is probably exceeding 2GB. You can get the size of the hashed file by executing "ANALYZE.FILE hashfilename" from the Command Execute window in DS Administrator for your project.
Also do a search in the forum; this has been discussed many times here.
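For example, a minimal check from DS Administrator > Projects > Command (MY_HASHED_FILE is just a placeholder for your hashed file's name; if the file was created by pathname rather than in the project account, you may first need a VOC pointer, e.g. via SETFILE):

Code: Select all

ANALYZE.FILE MY_HASHED_FILE
The report shows the file's structure and size figures, which should tell you whether it is approaching the 2GB limit.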
igorlp wrote:What does this mean?:

JOB_LKP_FACT_LAB_RESULT..LKP_FACT_LAB_RESULT.LNK_TO_LKP: ds_uvput() - Write failed for record id '3040298
20030122
4801
4801
0'

Thanks for the help...
igorlp
Participant
Posts: 10
Joined: Thu Mar 09, 2006 2:13 am

Post by igorlp »

Thanks, I am going to search...

And... what can I do to get a hashed file with a larger size limit?
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hmmm... typically, an issue around the 2GB barrier would actually corrupt the hashed file and give you fun things like 'blink' errors. This is more indicative of either a space problem or something funky in the data you are writing. It can also depend on whether you received just this single error or many similar ones.

Since nothing looks amiss in the key values it logged, how about space? One possibility is disk space - was the area where the hashed file lives running low? The other can be cache space - do you have 'Write cache enabled' for this stage? If so, try it with that turned off to see if that resolves the issue.
-craig

"You can never have too many knives" -- Logan Nine Fingers
igorlp
Participant
Posts: 10
Joined: Thu Mar 09, 2006 2:13 am

Post by igorlp »

There are 50GB free on the disk, and 'Write cache enabled' was disabled...
The job aborts at the same row count every time...
narasimha
Charter Member
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

Like Craig said, check whether the disk is full.

Another guess would be that you are loading non-printable/foreign/special characters into your hashed file
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
rameshrr3
Premium Member
Premium Member
Posts: 609
Joined: Mon May 10, 2004 3:32 am
Location: BRENTWOOD, TN

Post by rameshrr3 »

I got a similar error while trying to write nulls to a hashed file. Check the data that is sourced to populate your hashed file.

The other possible reason is a disk space problem.

Thanks
Ramesh
igorlp
Participant
Posts: 10
Joined: Thu Mar 09, 2006 2:13 am

Post by igorlp »

Yes, there are nulls in the hashed file...
Why is that a problem? And what do I need to do to resolve it?
rameshrr3
Premium Member
Premium Member
Posts: 609
Joined: Mon May 10, 2004 3:32 am
Location: BRENTWOOD, TN

Post by rameshrr3 »

A key field in a hashed file cannot be populated with nulls unless you want a warning message in your log. Hence you need to check the key columns for nulls before loading the hashed file, maybe by using a Transformer stage prior to the Hashed File stage and using the function

Code: Select all

IsNull()
to derive a constraint expression. That way you can prevent rows with NULL key field values from being written to the hashed file (see the sketch below).
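As a minimal sketch, the Transformer output constraint could look something like this (the link name LNK_IN and the key column names are assumed examples only; substitute your own):

Code: Select all

Not(IsNull(LNK_IN.RESULT_ID)) And Not(IsNull(LNK_IN.RESULT_DATE))
Rows failing the constraint are simply not written; if you want to see what was dropped, you can also direct them down a reject link.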

HTH
Ramesh
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

I don't believe "from OLE DB to Seq.file" is pertinent; the problem is populating a hashed file. Therefore your job contains either a Hashed File stage or a UV stage with an input link.

How large is your hashed file? (How much free space you have is irrelevant; a hashed file cannot exceed 2GB without special pre-tuning.) Do you really need all those keys? Do you really need all those columns? Do you really need all those rows?
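If the volume genuinely cannot be reduced and the file must exceed 2GB, the usual pre-tuning is to make it a 64-bit hashed file from the engine's command environment, roughly like this (a sketch only; MY_HASHED_FILE is a placeholder and you should verify the exact syntax for your release):

Code: Select all

RESIZE MY_HASHED_FILE * * * 64BIT
Before going that route, though, trimming keys, columns or rows to keep the file under 2GB is almost always the better option.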
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Post Reply