
Write failed for record

Posted: Sat Oct 28, 2006 11:55 pm
by igorlp
What does this mean?

JOB_LKP_FACT_LAB_RESULT..LKP_FACT_LAB_RESULT.LNK_TO_LKP: ds_uvput() - Write failed for record id '3040298
20030122
4801
4801
0'

Thanks for your help...

Re: Write failed for record

Posted: Sun Oct 29, 2006 12:53 am
by meena
Hi,
This error is usually because of size. If this is a hashed file, check its size; it is probably exceeding 2GB. You can get the size of the hashed file by executing "Analyze.file hashfilename" from the Command window of the DataStage Administrator for your project.
Also do a search in the forum; this has been discussed many times here.
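For example, from the Administrator's Command window (the hashed file name below is only illustrative - substitute the name of your own lookup hashed file):

Code:

Analyze.file LKP_FACT_LAB_RESULT

Compare what it reports against the 2GB limit for a standard (32-bit) hashed file.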

Posted: Sun Oct 29, 2006 1:16 am
by igorlp
Thanks, I am going to search...

And... what can I do to get a hashed file with a larger size limit?

Posted: Sun Oct 29, 2006 8:29 am
by chulett
Hmmm... typically, an issue around the 2GB barrier would actually corrupt the hashed file and give you fun things like 'blink' errors. This is more indicative of either a space problem or something funky in the data you are writing. It also depends on whether you received just this single error or many similar ones.

Since nothing looks amiss in the key values it logged, how about space? One kind is disk space - was the area where the hashed file lives running low on disk space? The other can be cache space - do you have 'Write cache enabled' for this stage? If so, try it with the cache turned off to see if that resolves the issue.

Posted: Mon Oct 30, 2006 11:31 am
by igorlp
There are 50GB free on the disk, and 'Write cache enabled' was disabled...
The job aborts at the same row count every time...

Posted: Mon Oct 30, 2006 12:29 pm
by narasimha
As Craig said, check whether the disk is full.

Another guess would be that you are loading non-printable/foreign/special characters into your hashed file.

Posted: Tue Oct 31, 2006 11:13 am
by rameshrr3
I got a similar error while trying to write nulls to a hashed file. Check the data that is sourced to populate your hashed file.

The other possible cause could be disk space problems.

Thanks
Ramesh

Posted: Tue Oct 31, 2006 1:49 pm
by igorlp
Yes, there are nulls in the hashed file...
Why is that a problem? And what do I need to do to resolve it?

Posted: Mon Nov 13, 2006 2:31 am
by rameshrr3
A key field in a hashed file cannot be populated with nulls unless you want warning messages in your log. Hence you need to check the key columns for nulls before loading the hashed file (maybe using a Transformer stage prior to the Hashed File stage, and using the function

Code:

IsNull()

to derive a constraint expression). That way you can prevent rows with NULL key field values from being written to the hashed file.
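For example, a constraint on the Transformer output link that feeds the hashed file might look like the sketch below (the link and column names are hypothetical - use your own key columns):

Code:

Not(IsNull(InLink.RECORD_ID)) And Not(IsNull(InLink.SAMPLE_DATE))

Rows for which the constraint evaluates to false are not written to the hashed file; if you want to inspect them, you could route them to a reject link instead.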

HTH
Ramesh

Posted: Mon Nov 13, 2006 7:29 am
by ray.wurlod
I don't believe "from OLE DB to Seq.file" is pertinent; the problem is populating a hashed file. Therefore your job contains either a Hashed File stage or a UV stage with an input link.

How large is your hashed file? (How much free space you have is irrelevant; a hashed file cannot exceed 2GB without special pre-tuning.) Do you really need all those keys? Do you really need all those columns? Do you really need all those rows?
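If you really do need a hashed file bigger than 2GB, the "special pre-tuning" usually means converting the file to 64-bit addressing. A sketch only, run from the project's Command window - check the UniVerse documentation for your release before relying on it, and note the file name is illustrative:

Code:

RESIZE LKP_FACT_LAB_RESULT * * * 64BIT

Even then, trimming unneeded keys, columns and rows first is the better answer.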