Teradata error: Request size exceeds maximum

devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Teradata error: Request size exceeds maximum

Post by devnhi »

Hi

We recently migrated to DataStage 8.5 and also to Teradata. I am using the Teradata Connector stage to load data into a table. The table has 118 columns, and the job is failing with:

Copy_3_of_Teradata_Connector_85,0: RDBMS code 350: CLI2: REQOVFLOW(350): Request size exceeds maximum.

As per this discussion, I have added:
APT_TERA_64K_BUFFERS=1
APT_TERA_64K_BUFFERSIZE=64000

and reran the job. It failed again. Can somebody help us resolve the issue?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Split to your own post, linked back to original.
-craig

"You can never have too many knives" -- Logan Nine Fingers
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

How did you pull the metadata that defines your table structure on TD?
kwwilliams
Participant
Posts: 437
Joined: Fri Oct 21, 2005 10:00 pm

Re: Teradata error: Request size exceeds maximum

Post by kwwilliams »

Is this an insert/update or select?

How big is the buffer in your Teradata system? (This has nothing to do with the APT settings you have set.)

Is your row size (in bytes) multiplied by the array size greater than the Teradata buffer setting?

I have been working with a Teradata 13 system that has a buffer of roughly 1,000,000 bytes. On wide rows, using the Immediate access method, it is fairly easy to cross the Teradata buffer threshold. One option is to use Bulk, which is not constrained by the Teradata buffer.
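
To make the check concrete, here is a minimal sketch in Python; the row size, array size, and request limit are assumed illustrative values, not figures confirmed in this thread:

ROW_BYTES = 8500              # assumed: estimated bytes per row across all columns
ARRAY_SIZE = 200              # assumed: the connector's Array size property
TD_REQUEST_LIMIT = 1_000_000  # assumed: request/buffer limit on the Teradata system

request_bytes = ROW_BYTES * ARRAY_SIZE
if request_bytes > TD_REQUEST_LIMIT:
    # 1,700,000 bytes with these numbers -- over the limit, so expect RDBMS code 350
    print(f"{request_bytes:,} bytes per request exceeds {TD_REQUEST_LIMIT:,} -- REQOVFLOW")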
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

Hi All

Thanks a lot for all your responses. I used the DataStage metadata import to bring in the table definition, but I am not sure how to find out the limit on the buffer size. Is there any way I can find that limit?


Thanks for all your time .
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I would imagine you could find the buffer size limit by asking your Teradata DBA.
-craig

"You can never have too many knives" -- Logan Nine Fingers
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

I have found out that in our environment the buffer size is 1,000,000 bytes, and per the calculation I am exceeding this limit.
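
For illustration, that calculation amounts to summing the byte widths of the table's columns and multiplying by the array size. A minimal Python sketch with hypothetical column names and types (not the actual 118-column table):

column_bytes = {
    "order_id": 8,         # e.g. BIGINT -> 8 bytes
    "order_date": 4,       # e.g. DATE -> 4 bytes
    "customer_name": 202,  # e.g. VARCHAR(100) UNICODE -> up to 200 data + 2 length bytes
    # ... one entry per column, 118 in total for this table
}
row_bytes = sum(column_bytes.values())
print(f"Estimated row size: {row_bytes} bytes")
print(f"Request size at array size 200: {row_bytes * 200:,} bytes")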

I am performing an insert into a Teradata table.

How should I overcome this and deal with wide rows like these?

Thanks for all your inputs.
kwwilliams
Participant
Posts: 437
Joined: Fri Oct 21, 2005 10:00 pm

Post by kwwilliams »

Decrease your array size so that (bytes per row * array size) is less than your buffer, or switch to bulk load instead of immediate, which is not affected by the Teradata buffer.
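
As a rough sketch of the first option, assuming the 1,000,000-byte limit reported above and a hypothetical 8,500-byte row (substitute your table's actual row width):

TD_REQUEST_LIMIT = 1_000_000  # bytes, as reported earlier in the thread
ROW_BYTES = 8500              # assumed estimate; use your real row width

# Largest whole number of rows per request that stays under the limit
max_array_size = TD_REQUEST_LIMIT // ROW_BYTES
print(f"Set Array size to at most {max_array_size}")  # 117 with these assumed numbers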
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

Thanks a lot. Using the bulk load option, I was able to resolve the issue.
kshah9
Participant
Posts: 7
Joined: Wed Oct 06, 2010 11:32 am
Location: Pune

Thanks Keith,

Post by kshah9 »

Reducing the array size and record count worked for me, and my job now runs without the two parameters below as well:

APT_TERA_64K_BUFFERS=1
APT_TERA_64K_BUFFERSIZE=64000

Thanks and regards,
Kunal Shah