the record is too big to fit into block

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

thumati.praveen
Participant
Posts: 106
Joined: Wed Oct 04, 2006 5:21 am

Post by thumati.praveen »

chulett wrote:There are other APT variables that may help; check this post to see if anything there works for you. ...
APT_AUTO_TRANSPORT_BLOCK_SIZE = False
APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 400000
APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576
APT_LATENCY_COEFFICIENT = 5

I have set all of these variables at the project level, but APT_DEFAULT_TRANSPORT_BLOCK_SIZE is more than 400000.

Thanks,
Praveen.
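
For reference, a minimal sketch of how those transport-block variables might be exported for a session, for example in $DSHOME/dsenv (whether you set them there or as project-level user-defined environment variables in Administrator is an assumption; the values are the ones from the quoted post):

export APT_AUTO_TRANSPORT_BLOCK_SIZE=False       # turn off automatic transport block sizing
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=400000   # default transport block size in bytes (value from the quoted post)
export APT_MAX_TRANSPORT_BLOCK_SIZE=1048576      # upper bound the engine may grow a block to
export APT_LATENCY_COEFFICIENT=5                 # value taken from the quoted post

The general idea is that the transport block has to be big enough to hold the largest single record moved between operators, so a record several times the normal size may need a correspondingly larger default and maximum block size.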
thumati.praveen
Participant
Posts: 106
Joined: Wed Oct 04, 2006 5:21 am

Post by thumati.praveen »

Hi,

I have resolved the issue somewhat, but I still have an issue with a single record.


My total number of records is 224400. Only one record is giving a problem, because it is five times bigger than all the others. I am able to write that record to a sequential file, but I am not able to write it to a Dataset.

Could you please advise me on this?

Thanks,
Praveen.
miwinter
Participant
Posts: 396
Joined: Thu Jun 22, 2006 7:00 am
Location: England, UK

Post by miwinter »

How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
thumati.praveen
Participant
Posts: 106
Joined: Wed Oct 04, 2006 5:21 am

Post by thumati.praveen »

miwinter wrote:How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
Finding the cause of the issue is what I meant by "resolved somewhat".
thumati.praveen
Participant
Posts: 106
Joined: Wed Oct 04, 2006 5:21 am

Post by thumati.praveen »

miwinter wrote:How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
It is not aborting due to just a single record. It aborts only for a couple of records that exceed the length.
thumati.praveen
Participant
Posts: 106
Joined: Wed Oct 04, 2006 5:21 am

Post by thumati.praveen »

Hi,

Can I resolve this issue by converting the data into binary format using the Column Export stage?

Thanks,
Praveen.
tjr
Participant
Posts: 19
Joined: Wed Jun 15, 2005 6:37 am

Post by tjr »

I guess you cannot write records to a dataset that are longer than the dataset's block size, which is 128 KB by default. You can change that, though, by setting APT_PHYSICAL_DATASET_BLOCK_SIZE.
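
For example, a minimal sketch of overriding that default, again as an exported environment variable (the 1048576 value is only an assumption and should be sized to comfortably hold the longest record being written):

export APT_PHYSICAL_DATASET_BLOCK_SIZE=1048576   # raise the Data Set block size from the 128 KB default to 1 MB

If the oversized record fits within this block size, that would be consistent with the sequential file write succeeding while the Data Set write aborts, since flat sequential files are not subject to the dataset block limit.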