the record is too big to fit in a block
Posted: Thu Apr 23, 2009 3:32 pm
Hello
I have a job design that reads a sequential file, treating each record as a single line. We break those records into multiple columns with Column Import stages, build an XML chunk for every record, and finally join all of the chunks to produce one big XML file.
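For illustration only, here is a minimal Python sketch of that per-record flow; the delimiter, column names, and element names are placeholders, not the actual job metadata:

# Illustrative sketch (placeholder field names, not the real job metadata):
# split a line into columns, wrap each record in an XML chunk,
# then concatenate all chunks into one document.
from xml.sax.saxutils import escape

def record_to_chunk(line, delimiter="|", fields=("CustomerNumber", "Name")):
    """Split one input line into columns and emit one XML chunk."""
    values = line.rstrip("\n").split(delimiter)
    cells = "".join(
        f"<{name}>{escape(value)}</{name}>"
        for name, value in zip(fields, values)
    )
    return f"<Record>{cells}</Record>"

def build_document(lines):
    """Join the per-record chunks into a single XML document."""
    return "<Records>" + "".join(record_to_chunk(l) for l in lines) + "</Records>"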
When I run the job with a small volume of data it works fine, but with more data DataStage aborts. The error from the DataStage log is below:
APT_CombinedOperatorController(14),0: Internal Error: (!(stat.statusBits() & APT_DMStatus::eRecordTooBig)):api/dataset_rep1.C: 1685: Virtual data set.; output of "inserted tsort operator {key={value=CustomerNumber, subArgs={asc, cs}}}": the record is too big to fit in a block;
the length requested is: 142471.
Traceback: msgAssertion__13APT_FatalPathFPCcRC11APT_UStringPCci() at 0xd47ffd70
putRecordToPartition_grow__14APT_DataSetRepFUi() at 0xd6d28a38
putRecord_nonCombined__14APT_DataSetRepFb() at 0xd6d25d94
putRecord__16APT_OutputCursorFv() at 0xd6f1a208
writeOutputRecord__17APT_TSortOperatorFv() at 0xd4c0c46c
runLocally__30APT_CombinedOperatorControllerFv() at 0xd6f49c28
run__15APT_OperatorRepFv() at 0xd6e8720c
runLocally__14APT_OperatorSCFv() at 0xd6e73bbc
runLocally__Q2_6APT_SC8OperatorFUi() at 0xd6efea44
runLocally__Q2_6APT_IR7ProcessFv() at 0xd6f818c8
Is there any setting that I need to add? While building the XML chunks I used LongVarChar as the datatype, with the length left empty.
I have gone through the forums, and it was discussed there that the APT_DEFAULT_TRANSPORT_BLOCK_SIZE environment variable needs to be set.
If so, where should I define this environment variable in Administrator?
There are sections for Reporting, Operator-specific, Compiler-specific, and User-defined environment variables. In which section should I define it, and to what value should I set it?
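As a rough sizing sketch (assuming the default transport block size is 131072 bytes / 128 KB, which the failing record length of 142471 from the log would exceed), the value would need to be at least the largest record size; rounding up to the next power of two is one way to pick a candidate:

# Rough sizing sketch. Assumption: the default transport block size is
# 131072 bytes (128 KB), which the failing record length exceeds.
DEFAULT_BLOCK_SIZE = 131072          # assumed default, in bytes
failing_record_length = 142471       # from the log message above

def next_power_of_two(n):
    """Smallest power of two >= n."""
    size = 1
    while size < n:
        size *= 2
    return size

# 142471 > 131072, so the default block cannot hold the record;
# the next power of two, 262144 (256 KB), would be a candidate value
# for APT_DEFAULT_TRANSPORT_BLOCK_SIZE.
print(next_power_of_two(failing_record_length))  # 262144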
Thanks for your help