Job aborts when array size and record count is set to 65000

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

lathalr
Participant
Posts: 16
Joined: Thu Feb 14, 2013 6:00 am

Job aborts when array size and record count is set to 65000

Post by lathalr »

Hi All,

Our job aborts when Array Size and Record Count are set to 65000 on the ODBC stage, with the error: "Insert Bulk failed due to schema change on target table".

Observation: new nullable columns were added to the table after we developed the job. These columns are not defined in the job, but because they are nullable, the job kept working while Array Size was 1. When we raised Array Size to 65000 to improve performance, the job aborts unless we also define those new nullable columns in the job.

Thanks in advance.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Lower it. That value is high enough to not even make any sense, I'm afraid.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Code:

Array Size = N * INT((packet size) / (row length)) 
where N is a small integer. Initially try it with N = 1.
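Ray's rule of thumb above can be sketched as a small helper. This is illustrative only: the function name is hypothetical, the 4096-byte packet size is just a common network packet default (check your actual DSN/driver setting), and the row length must be estimated from your target table's columns.

```python
def suggested_array_size(packet_size: int, row_length: int, n: int = 1) -> int:
    """Apply the rule of thumb: Array Size = N * INT(packet_size / row_length).

    packet_size: network packet size in bytes (assumed; verify against
                 your ODBC driver / database configuration).
    row_length:  average row length in bytes for the target table.
    n:           small integer multiplier; start with N = 1.
    """
    if row_length <= 0:
        raise ValueError("row_length must be positive")
    # Integer division mirrors the INT() in the formula; never go below 1.
    return max(1, n * (packet_size // row_length))

# Example: 4096-byte packets and 200-byte rows suggest an Array Size of 20.
print(suggested_array_size(4096, 200))  # -> 20
```

Note how far this lands from 65000: for that value to make sense, rows would have to be a tiny fraction of a packet, which is why very large settings rarely help.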
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

For a more detailed explanation of array size and transaction size, look at Arndt's answer at the bottom of this topic:

viewtopic.php?t=151063&highlight=array+size
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Without a specific plan, and while making sure you understand what each option controls, you could always start back at 1 and raise the value in much smaller increments rather than going 'all in' right away. Increase it until the job aborts, then back it down slightly.
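That "raise until it breaks, then back off" approach can be sketched as a simple doubling search. Everything here is hypothetical: `run_job` stands in for whatever mechanism you use to run the DataStage job with a given Array Size and report success or abort.

```python
def tune_array_size(run_job, start: int = 1, factor: int = 2,
                    max_size: int = 65000) -> int:
    """Illustrative sketch: grow Array Size geometrically until the job
    aborts (or a ceiling is reached), then report the last working value.

    run_job: hypothetical callable taking an Array Size and returning
             True if the job succeeds, False if it aborts.
    """
    size = start
    last_good = start
    while size <= max_size and run_job(size):
        last_good = size   # remember the largest size that still worked
        size *= factor     # try a bigger value next
    return last_good

# Example with a stand-in for the job: pretend it aborts above 5000.
print(tune_array_size(lambda s: s <= 5000))  # -> 4096
```

In practice you would then settle slightly below the last working value rather than sit right at the edge, since row lengths and server conditions vary between runs.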
-craig

"You can never have too many knives" -- Logan Nine Fingers