Write failed for record
What does this error mean?
JOB_LKP_FACT_LAB_RESULT..LKP_FACT_LAB_RESULT.LNK_TO_LKP: ds_uvput() - Write failed for record id '3040298
20030122
4801
4801
0'
Thanks for the help...
Re: Write failed for record
Hi,
This error is because of the file size. If it is a hashed file, you need to check its size; it is probably exceeding 2GB. You can get the size of the hashed file by attaching to the project in DS Administrator and executing "ANALYZE.FILE hashfilename" in the Command window.
And do a search of the forum; this has been discussed many times here.
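For example, a minimal sketch, assuming the hashed file is named LKP_FACT_LAB_RESULT (a guess based on the stage name in the log; substitute your actual hashed file name), run from the Administrator client's Command window:
Code:
ANALYZE.FILE LKP_FACT_LAB_RESULT
For a dynamic (Type 30) hashed file you can also look at the DATA.30 and OVER.30 files inside the hashed file's directory at the operating system level and compare their sizes against the 2GB limit of a default 32-bit hashed file.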
igorlp wrote: What does this error mean?
JOB_LKP_FACT_LAB_RESULT..LKP_FACT_LAB_RESULT.LNK_TO_LKP: ds_uvput() - Write failed for record id '3040298
20030122
4801
4801
0'
Thanks for the help...
Hmmm... typically, an issue around the 2GB barrier would actually corrupt the hashed file and give you fun things like 'blink' errors. This is more indicative of either a space problem or something funky in the data you are writing. It can also depend on whether you received just this single error or many similar ones.
Since nothing looks amiss in the key values it logged, how about space? One is disk space - was the area where the hashed file lives running low on disk space? The other can be cache space - do you have 'Write cache enabled' for this stage? If so, try it with that turned off to see if that resolves the issue.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
A key field in a hashed file cannot be populated with nulls unless you want a warning message in your log, so you need to check the key columns for nulls before loading the hashed file. One way is to put a Transformer stage in front of the Hashed File stage and use the function below to derive a constraint expression, so that rows with NULL key field values are never written to the hashed file.
Code:
IsNull()
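For instance, a minimal constraint sketch, assuming the input link into the Hashed File stage is called LNK_IN and the key column is RECORD_ID (both hypothetical names; substitute your own):
Code:
Not(IsNull(LNK_IN.RECORD_ID))
Put this in the constraint of the output link feeding the hashed file, so only rows with a non-null key are written.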
HTH
Ramesh
I don't believe "from OLE DB to Seq.file" is pertinent; the problem is populating a hashed file. Therefore your job contains either a Hashed File stage or a UV stage with an input link.
How large is your hashed file? (How much free space you have is irrelevant; a hashed file cannot exceed 2GB without special pre-tuning.) Do you really need all those keys? Do you really need all those columns? Do you really need all those rows?
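For reference, the 'special pre-tuning' is creating or resizing the hashed file with 64-bit internal addressing, which is only available where the underlying UniVerse build supports 64-bit files. A minimal sketch, assuming an existing hashed file named MyHashedFile (hypothetical) in the project account, run from the Administrator Command window:
Code:
RESIZE MyHashedFile * * * 64BIT
The asterisks keep the existing type, modulus and separation; only the addressing changes, after which the file can grow past 2GB. Even so, the questions above about keys, columns and rows are the better place to start.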
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.