Teradata error: Request size exceeds maximum

Posted: Wed May 23, 2012 3:24 pm
by devnhi
Hi

Recently we migrated to DataStage 8.5 and also to Teradata. I am using the Teradata Connector stage and want to load data into a table. The table has 118 columns and the job is failing with:

Copy_3_of_Teradata_Connector_85,0: RDBMS code 350: CLI2: REQOVFLOW(350): Request size exceeds maximum.

As per this discussion I have added:
APT_TERA_64K_BUFFERS=1
APT_TERA_64K_BUFFERSIZE=64000

and reran the job. The job failed again. Can somebody help us resolve the issue?

Posted: Wed May 23, 2012 3:37 pm
by chulett
Split to your own post, linked back to original.

Posted: Wed May 23, 2012 4:30 pm
by PaulVL
How did you pull the metadata that defines your table structure on TD?

Re: Teradata error: Request size exceeds maximum

Posted: Wed May 23, 2012 4:56 pm
by kwwilliams
Is this an insert/update or a select?

How big is the buffer in your Teradata system? (It has nothing to do with the APT settings you have set.)

Is your row size (in bytes) * the array size greater than the Teradata buffer setting?

I have been working with a Teradata 13 system that has a buffer of roughly 1,000,000 bytes. With wide rows and the Immediate setting, it is fairly easy to cross the Teradata buffer threshold. One option is to use Bulk mode, which is not hindered by the Teradata buffer.

Posted: Thu May 24, 2012 3:47 pm
by devnhi
Hi All

Thanks a lot for all your responses. I used the DataStage metadata import
to import the metadata, but I am not sure how to find out the limit on the buffer size. Is there any way I can find that limit?


Thanks for all your time .

Posted: Thu May 24, 2012 6:03 pm
by chulett
I would imagine you could find the buffer size limit by asking your Teradata DBA.

Posted: Thu May 31, 2012 5:50 pm
by devnhi
I have found out that in our environment the buffer size is 1,000,000 bytes, and per the calculation I am exceeding this limit.

I am performing an insert into a Teradata table.

How should I overcome this and deal with these wider rows?

Thanks for all your inputs.

Posted: Fri Jun 01, 2012 7:13 am
by kwwilliams
Decrease your array size so that (bytes per row * array size) is less than your buffer ... or switch to Bulk load instead of Immediate, which is not affected by the Teradata buffer.
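The calculation above can be sketched as follows. This is a minimal illustration only; the buffer size and bytes-per-row figures are assumed example values (the buffer limit comes from your DBA, the row width from your table's DDL), not numbers taken from any particular system.

```python
# Hypothetical example figures -- substitute your own environment's values.
TERADATA_BUFFER_BYTES = 1_000_000  # buffer limit reported by the DBA (assumed)
ROW_BYTES = 8_500                  # bytes per row for a wide table (assumed)

# Largest array size whose total request stays strictly under the buffer:
# (bytes per row * array size) < buffer size.
max_array_size = (TERADATA_BUFFER_BYTES - 1) // ROW_BYTES

print(max_array_size)              # 117 with these example figures
print(max_array_size * ROW_BYTES)  # 994500 bytes, under the 1,000,000 limit
```

Setting the connector's array size at or below this value keeps each request under the Teradata buffer; Bulk mode sidesteps the limit entirely.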

Posted: Sat Jun 02, 2012 6:10 am
by devnhi
Thanks a lot. Using the Bulk load option, I was able to resolve the issue.

Thanks Keith,

Posted: Tue Jun 04, 2013 1:05 am
by kshah9
Reducing the array size and record count worked for me, and my job runs without the two parameters below as well.

APT_TERA_64K_BUFFERS=1
APT_TERA_64K_BUFFERSIZE=64000

Thanks and regards,
Kunal Shah