
the record is too big to fit into block

Posted: Tue Jun 23, 2009 5:05 am
by thumati.praveen
Hi,

Has anyone come across the problem below before?


APT_CombinedOperatorController(5),0: Fatal Error: File data set, file "/etl/IS/Datasets/ds_isap_irde020_am_gm_gbo_final.ds".; output of "cpyROw": the record is too big to fit in a block; the length requested is: 145247, the max block length is: 131072.


The parameters for the job are:

$APT_CONFIG_FILE = /opt/IBM/InformationServer/Server/Configurations/default.apt
$APT_RECORD_COUNTS = False
$APT_DUMP_SCORE = False
$APT_MONITOR_SIZE = 500000
$APT_MONITOR_TIME = 20
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 78904532


My input record length is 4000033960.


Thanks,
Praveen.

Posted: Tue Jun 23, 2009 5:24 am
by nagarjuna
Try to increase the value of the variable APT_DEFAULT_TRANSPORT_BLOCK_SIZE

Posted: Tue Jun 23, 2009 5:31 am
by thumati.praveen
I had already specified an 8-digit number, but the job is still using 131072.

Posted: Tue Jun 23, 2009 5:53 am
by ArndW
I seem to recall that the block size is allocated as a power of 2, rounded up from the number specified. Try specifying 1048576.
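As a quick sanity check of that rounding rule (a sketch only, not DataStage's actual allocation code), rounding up to the next power of two gives exactly the numbers in the error message:

```python
def round_up_pow2(n: int) -> int:
    """Smallest power of two greater than or equal to n."""
    p = 1
    while p < n:
        p *= 2
    return p

# The failing job's max block length, 131072, is already a power of two (2**17):
print(round_up_pow2(131072))   # 131072
# A record of 145247 bytes would need the next power of two up:
print(round_up_pow2(145247))   # 262144
```

So a block size of at least 262144 (or a round figure like 1048576) should cover a 145247-byte record, assuming the rounding recollection is right.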

Posted: Tue Jun 23, 2009 6:32 am
by thumati.praveen
ArndW wrote:I seem to recall that the block size is allocated as a power of 2 and rounded from the number specified in the block size. Try specifying 1048576 ...
I am getting the same problem.

I even gave a value larger than the one you specified.

Thanks,
Praveen.

Posted: Tue Jun 23, 2009 6:51 am
by chulett
Your input record length is over 4 billion bytes? :?

Posted: Tue Jun 23, 2009 6:59 am
by thumati.praveen
chulett wrote:Your input record length is over 4 billion bytes? :?
Yes boss, you identified my actual problem.

Could you please advise how to resolve the problem?

Whenever a small number of records comes through (fewer than 50,000) it works; when more than 100,000 records come through, I face this problem.

Thanks,
Praveen.

Posted: Tue Jun 23, 2009 7:04 am
by chulett
I have no idea how to handle something that big, sorry. Hopefully someone else does.

Posted: Tue Jun 23, 2009 7:07 am
by thumati.praveen
My input record is a combination of XML chunks.

I am taking many XML chunks together as a single column in my input row.

Thanks,
Praveen.

Posted: Tue Jun 23, 2009 7:17 am
by ArndW
If you are getting the same error message, then obviously the change isn't being accepted. Can you check the value in the director log for your run?

Posted: Tue Jun 23, 2009 7:22 am
by thumati.praveen
ArndW wrote:If you are getting the same error message, then obviously the change isn't being accepted. Can you check the value in the director log for your run? ...
I checked the logs in DataStage. It is taking whatever values we pass, but when it reaches record count 22389 the job fails with the above error.

I am still puzzled as to why it is not honouring the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE.

Could you let me know if there are any other settings I need to specify in the job?

I am assuming the problem mainly occurs while writing the output dataset to disk.

Thanks,
Praveen.

Posted: Tue Jun 23, 2009 7:28 am
by chulett
Just how same is the "same error" you get after changing the APT variable? Exactly the same as in it still says "the max block length is: 131072" or just that it is still "too big" but you see the "max" changing?

Posted: Tue Jun 23, 2009 7:33 am
by thumati.praveen
chulett wrote:Just how same is the "same error" you get after changing the APT variable? Exactly the same as in it still says "the max block length is: 131072" or just that it is still "too big" but you see ...
It always gives the error below:

APT_CombinedOperatorController(4),1: Fatal Error: File data set, file "/etl/IS/Datasets/ds_isap_irde020_am_gm_gbo_final.ds".; output of "cpyROw": the record is too big to fit in a block; the length requested is: 145247, the max block length is: 131072.

Whatever value you set for the environment variable, the error is the same.

Thanks,
Praveen.

Posted: Tue Jun 23, 2009 7:34 am
by chulett
There are other APT variables that may help; check this post to see if anything there works for you.
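One that may be relevant here: the error comes from writing a File data set, and if I recall correctly the block size of persistent datasets is controlled separately from the transport block size, by APT_PHYSICAL_DATASET_BLOCK_SIZE (whose default would explain the 131072 in the message never changing). Worth trying alongside the transport setting, in the same job-parameter style shown above:

```
$APT_PHYSICAL_DATASET_BLOCK_SIZE = 1048576
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 1048576
```

Treat the variable name as a suggestion to verify against your version's documentation, not a guaranteed fix.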

Posted: Tue Jun 23, 2009 7:38 am
by ArndW
Are you just setting this in the Administrator, or are you also passing it to the job as a parameter (which is what you should be doing)?