Value above the Suggested Range limit

Amit_111
Participant
Posts: 134
Joined: Sat Mar 24, 2007 11:37 am

Post by Amit_111 »

Hi,

In our DataStage job we are receiving the error "the record is too big to fit in a block; the length requested is: 107732, the max block length is: 1048576"

As per the IBM website, for DataStage version 8.7 the environment variable APT_DEFAULT_TRANSPORT_BLOCK_SIZE can be set to a value between 8192 and 1048576.

In my case I simply increased the value above the limit, setting it to 107733, and my DataStage job executed fine.

Here I specified a value higher than the suggested range, yet the job still ran successfully. I am confused: if the job works fine with a value above the range, why is a range specified at all?
Will it have any impact on the data that I am not able to foresee right now?

Thank you for your suggestions.
Amit_111
Participant
Posts: 134
Joined: Sat Mar 24, 2007 11:37 am

Post by Amit_111 »

Sorry, a small correction: the requested length is not 107732.
It is actually greater than the upper limit of the range, i.e. 1077732.

We modified APT_DEFAULT_TRANSPORT_BLOCK_SIZE to 1077733 and the job executed successfully.

I just want to know whether there is any issue with specifying a value greater than the range given in the documentation (maximum 1048576).
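To make the numbers concrete, here is a rough sketch of the arithmetic (my own illustration, not taken from the IBM documentation): the reported record length of 1077732 bytes simply does not fit in the documented maximum block of 1048576 bytes (1 MiB), so any value that makes the job run has to exceed the documented range. Instead of using record length + 1, one could round the record length up to a multiple of the 8192-byte lower bound; the rounding unit is just an assumption for illustration.

    # Rough illustration of the block-size arithmetic (my own sketch,
    # not taken from the IBM documentation).

    DOCUMENTED_MIN = 8192       # documented lower limit for the variable
    DOCUMENTED_MAX = 1048576    # documented upper limit (1 MiB)

    record_length = 1077732     # record length reported in the job log

    def candidate_block_size(record_len, granularity=DOCUMENTED_MIN):
        # Round the record length up to the next multiple of `granularity`
        # so the longest record fits in a single transport block.
        return ((record_len + granularity - 1) // granularity) * granularity

    needed = candidate_block_size(record_length)
    print(needed)                           # 1081344
    print(record_length > DOCUMENTED_MAX)   # True -> the record cannot fit in 1 MiB
    print(needed > DOCUMENTED_MAX)          # True -> any working value exceeds the range

Either way, the chosen value ends up above the documented ceiling, which is exactly the situation I am asking about.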