Hi all,
I got the following warning when trying to write data from a sequential file to a hashed file:
ds_uvput() - Write failed for record id ''
For the same job I also got a phantom error saying:
"Attempted WRITE with record ID larger than file/table maximum
record ID size of 768 characters."
When I view it in the RT_BP subdirectory, it points to a column with a length of 1 (I mean the column has a length of 1). My source data for that column is also of length 1.
So I am wondering why it is complaining about the record ID size.
Could anyone suggest how to resolve this?
Your time will be highly appreciated...
Thanks in advance...
veera...
What is it you are viewing in "RT_BP subdirectory" (and, for that matter, which RT_BP subdirectory)?
Where does the Hashed File stage specify that the hashed file is?
This error would not have occurred if a job had not attempted to write a record into a hashed file for which the key value (the totality of all key columns plus one separator character between each) had more than 768 characters in it.
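Here is a minimal sketch (plain Python, not DataStage BASIC) of how that composite record ID length adds up. The column values and the separator character used here are purely hypothetical; the point is that the 768-character limit applies to all key columns joined together, not to any single column.

```python
MAX_RECORD_ID = 768  # file/table maximum record ID size from the error message

def record_id_length(key_values, separator="|"):
    """Length of the composite record ID: every key column value joined
    with a single separator character between each pair."""
    record_id = separator.join(str(v) for v in key_values)
    return len(record_id)

# Hypothetical example: three key columns; one unexpectedly long value
# pushes the combined record ID past the limit even though the others are short.
keys = ["A", "2023-01-15", "X" * 800]
length = record_id_length(keys)
if length > MAX_RECORD_ID:
    print(f"Record ID is {length} characters - exceeds the {MAX_RECORD_ID} limit")
```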
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.