
Log report

Posted: Tue Jul 22, 2008 9:38 pm
by veera24
Hi all,

I got the following warning when I am trying to write data from a sequential file to a hashed file.

ds_uvput() - Write failed for record id ''

For the same job I also got a phantom error saying:

"Attempted WRITE with record ID larger than file/table maximum
record ID size of 768 characters."


When I view it in the RT_BP subdirectory, it points to a column with a length of 1 (I mean the column has a length of 1). My source data for that column is also of length 1.
So I am wondering why it is complaining about the record ID size?

Could anyone suggest how to resolve this?

Your time will be highly appreciated...

Thanks in advance...
veera...

Posted: Tue Jul 22, 2008 10:02 pm
by ray.wurlod
What is it you are viewing in "RT_BP subdirectory" (and, for that matter, which RT_BP subdirectory)?

Where does the Hashed File stage specify that the hashed file is?

This error occurs only when a job attempts to write a record into a hashed file whose key value (the totality of all key columns plus one separator character between each) is more than 768 characters long.
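
To illustrate the point (this is only a sketch - the column names and the separator character below are made up for illustration, not what the engine actually uses internally): the record ID is the concatenation of every key column value with one separator between each, so a single long key column, or several modest ones combined, can push the total past 768 even when each column's metadata looks short.

Code:
# Sketch only: shows how a hashed file record ID is formed from the key
# columns and why its total length can exceed the 768-character maximum.
# KEY_SEPARATOR and the column names are stand-ins, not the engine's own.

KEY_SEPARATOR = "|"   # placeholder for the internal separator character
MAX_KEY_SIZE = 768    # maximum record ID size reported in the error

def record_id(key_values):
    """Concatenate all key column values with one separator between each."""
    return KEY_SEPARATOR.join(str(v) for v in key_values)

# Hypothetical row: one short key column plus one unexpectedly long one
row = {"CUST_ID": "A", "ORDER_REF": "x" * 900}
rid = record_id([row["CUST_ID"], row["ORDER_REF"]])

if len(rid) > MAX_KEY_SIZE:
    print(f"Record ID is {len(rid)} characters - write would fail "
          f"(limit is {MAX_KEY_SIZE})")

So even if the column you are looking at really is one character long, check every column marked as a key in the Hashed File stage metadata and what is actually being sent to each of them.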

Posted: Tue Jul 22, 2008 10:10 pm
by chulett
The error message says it all. What is the metadata for the hashed file being written to? Have you checked what you are sending to the key fields?

Posted: Wed Jul 23, 2008 3:28 am
by sachin1
The key of a hashed file cannot have a length greater than 768 characters.

Posted: Wed Jul 23, 2008 3:30 am
by ray.wurlod
It can, but only if one of the uvconfig parameters is changed to permit it.

Posted: Wed Jul 23, 2008 3:50 am
by sachin1
Could you please let us know which parameter that is?

Posted: Wed Jul 23, 2008 4:26 am
by ArndW
sachin1 - the parameter is "MAXKEYSIZE" and it shouldn't be changed without good reason as changes can impact performance adversely.
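
As a read-only check (a sketch only - the uvconfig path below is an assumption and depends on where your DataStage engine is installed), you could look up the current setting like this. Actually changing it involves regenerating the engine configuration and restarting the engine, so it should not be edited casually.

Code:
# Sketch only: inspect the current MAXKEYSIZE value by reading uvconfig.
# The path is an assumption - point it at your own DSEngine/UV home directory.

from pathlib import Path

UVCONFIG = Path("/opt/IBM/InformationServer/Server/DSEngine/uvconfig")  # assumed location

for line in UVCONFIG.read_text().splitlines():
    stripped = line.strip()
    if stripped.startswith("MAXKEYSIZE"):
        print(stripped)   # e.g. "MAXKEYSIZE 768"
        break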

Posted: Wed Jul 23, 2008 6:45 am
by sachin1
thanks a lot ArndW.