Job aborts when array size and record count is set to 65000

Posted: Wed Oct 09, 2013 8:22 am
by lathalr
Hi All,

We get the following error and the job aborts when the array size and record count are set to 65000 on the ODBC stage: "Insert Bulk failed due to schema change on target table".

Observation: new nullable columns were added to the table after we developed the job. These new columns are not defined in the job, but because they are nullable, the job kept working while the array size was set to 1. When we changed the array size to 65000 to improve performance, the job aborts unless we define those new nullable columns in the job.

Thanks in advance.

Posted: Wed Oct 09, 2013 8:23 am
by chulett
Lower it. That value is high enough to not even make any sense, I'm afraid.

Posted: Wed Oct 09, 2013 3:00 pm
by ray.wurlod

Code: Select all

Array Size = N * INT((packet size) / (row length)) 
where N is a small integer. Initially try it with N = 1.
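Ray's formula can be illustrated with a quick calculation. The packet size and row length below are made-up example values (both depend on your network configuration and table definition), chosen only to show the arithmetic:

```python
# Worked example of: Array Size = N * INT((packet size) / (row length))
packet_size = 32768   # bytes per network packet (assumed example value)
row_length = 400      # bytes per row (assumed example value)
N = 1                 # start with N = 1, as advised above

array_size = N * (packet_size // row_length)
print(array_size)     # -> 81 rows per array
```

The idea is that an array close to a whole number of packets avoids sending mostly-empty packets, which is why huge values like 65000 buy nothing once the packet is already full.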

Posted: Thu Oct 10, 2013 7:47 am
by asorrell
For a more detailed explanation of array size and transaction size, look at Arndt's answer at the bottom of this topic:

viewtopic.php?t=151063&highlight=array+size

Posted: Thu Oct 10, 2013 7:52 am
by chulett
Without a specific plan, and while understanding what each option controls, you could always start back at 1 and raise the value in much smaller increments rather than going 'all in' right away. Increment it until the job aborts and then back it down slightly.
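That increment-until-abort strategy can be sketched as a simple loop. Here `run_job` is a hypothetical stand-in for running the DataStage job with a given array size and reporting whether it completed; the doubling factor is just one reasonable choice of "smaller increments":

```python
def tune_array_size(run_job, start=1, factor=2):
    """Raise the array size until the job aborts, then return the
    last size that worked (run_job returns True on success)."""
    size = start
    last_good = start
    while run_job(size):
        last_good = size
        size *= factor   # modest increments, not 'all in' at 65000
    return last_good

# Example with a fake job that aborts above 512 rows per array:
print(tune_array_size(lambda s: s <= 512))  # -> 512
```

In practice you would stop tuning once the runtime improvement flattens out, well before the abort point.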