I have somewhat resolved the issue, but I still have a problem with a single record.
My total number of records is 224400. Only one record is causing a problem, because that record is five times bigger than all the others. I am able to write the record to a sequential file, but I am not able to write it to a Dataset.
How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
miwinter wrote:How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
Finding the cause of the issue means it is somewhat resolved.
miwinter wrote:How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
It is not aborting due to a single record. It aborts only for a couple of records which exceed the length.
I guess you cannot write records to a dataset that are longer than the blocksize of the dataset, which is 128K by default. You can change that, though, by setting APT_PHYSICAL_DATASET_BLOCK_SIZE.
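A rough sketch of how that might look, assuming a record of roughly 640K (five times a typical 128K record) needs to fit in one block; in DataStage this variable is normally set at the project or job level, and the 1 MB value here is an illustrative choice, not a recommendation:

```shell
# Hypothetical example: raise the dataset block size so the oversized
# record fits. Value is in bytes; 1048576 = 1 MB, comfortably above the
# default 128K block size mentioned above.
export APT_PHYSICAL_DATASET_BLOCK_SIZE=1048576

# Confirm the setting before running the job.
echo "$APT_PHYSICAL_DATASET_BLOCK_SIZE"
```

Note that a larger block size applies to every block in the dataset, so setting it much higher than the largest record wastes space.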