Job aborting due to block size

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

RK72
Participant
Posts: 154
Joined: Wed Sep 29, 2010 4:10 pm

Job aborting due to block size

Post by RK72 »

One of my jobs is aborting with the following error message:

Join,0: Fatal Error: Virtual data set.; output of "APT_JoinSubOperatorNC in Join": the record is too big to fit in a block; the length requested is: 4147970, the max block length is: 3002368.

I set APT_DEFAULT_TRANSPORT_BLOCK_SIZE to "3002368". What other parameter do I need to set so that this never fails, and what should its value be?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Let's see: you set the maximum block size to 3,002,368 and are wondering why you get an error on a block of 4,147,970? Since we don't know the maximum block size your application can create, we cannot answer your question as to what value you need for $APT_MAX_TRANSPORT_BLOCK_SIZE. Note that the actual sizes used internally are the next lower power of two of whatever parameter value is used.
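
To make that concrete: under the power-of-two rounding just described, the safe choice is the smallest power of two that is at least as large as the biggest record the job can produce. A small illustrative Python sketch (not anything DataStage runs itself; the record length is simply the one from the error above):

def min_block_setting(max_record_length):
    # Smallest power of two >= max_record_length. A power-of-two setting
    # is unaffected by "round down to the next lower power of two".
    block = 1
    while block < max_record_length:
        block *= 2
    return block

print(min_block_setting(4147970))  # 4194304, i.e. 2**22

So, if the rounding works as described, anything below 4,194,304 risks being rounded down below the 4,147,970-byte record.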
sjfearnside
Premium Member
Posts: 278
Joined: Wed Oct 03, 2007 8:45 am

Post by sjfearnside »

Try setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE to "4147970" instead of "3002368".
RK72
Participant
Posts: 154
Joined: Wed Sep 29, 2010 4:10 pm

Post by RK72 »

For some jobs the block size is larger than that. What is the permanent solution for all the jobs? Is there a variable like APT_MAX_TRANSPORT_BLOCK_SIZE that can be set once and would take care of it?
RK72
Participant
Posts: 154
Joined: Wed Sep 29, 2010 4:10 pm

Post by RK72 »

I looked at a lot of posts and found that we should set APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576, but I am still getting the error:

Seq_Main1,0: Fatal Error: File data set, file "E:/DataStage/Data/in/PNP/Main_Test_All.ds".; output of "Seq_Main1": the record is too big to fit in a block; the length requested is: 2104822, the max block length is: 131072.

Do I need to stop/start the engine for the change to take effect, since it still shows 131072?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE does not change your APT_MAX_TRANSPORT_BLOCK_SIZE!
DST
Participant
Posts: 7
Joined: Wed Nov 02, 2005 6:40 am

Post by DST »

RK72 wrote: I looked at a lot of posts and found that we should set APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576, but I am still getting the error ... the record is too big to fit in a block; the length requested is: 2104822, the max block length is: 131072. ...
Try increasing APT_PHYSICAL_DATASET_BLOCK_SIZE. The current limit is 131072 = 8192 * 16; you need 4194304 to resolve your problem.
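
To spell out the arithmetic (an illustrative Python check, nothing more): 2**21 = 2097152 is still smaller than the 2,104,822-byte record, so 4194304 (2**22) is the first block size that fits.

needed = 2104822       # length requested in the error message
block = 131072         # current max block length (8192 * 16)
while block < needed:  # keep doubling until the record fits
    block *= 2
print(block)           # 4194304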
RK72
Participant
Posts: 154
Joined: Wed Sep 29, 2010 4:10 pm

Post by RK72 »

It worked with these values:

APT_DELIMITED_READ_SIZE=20000000
APT_MAX_DELIMITED_READ_SIZE=20000000
APT_DEFAULT_TRANSPORT_BLOCK_SIZE=20000000
APT_PHYSICAL_DATASET_BLOCK_SIZE=20000000
APT_TSORT_STRESS_BLOCKSIZE=100000000
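
As a side note, and assuming the power-of-two rounding ArndW mentioned earlier applies, a quick illustrative Python check that a chosen value still covers the largest record after rounding down:

value = 20000000
largest_record = 4147970                 # biggest record seen in this thread
rounded = 1 << (value.bit_length() - 1)  # next lower power of two: 16777216
print(rounded >= largest_record)         # True, so 20000000 is comfortably large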
ruf888
Participant
Posts: 20
Joined: Wed May 13, 2009 1:14 am
Location: Germany

Post by ruf888 »

Changing APT_PHYSICAL_DATASET_BLOCK_SIZE in addition to APT_DEFAULT_TRANSPORT_BLOCK_SIZE has solved the problem for me as well. Thanks a lot.