Hi All,
We get the following error and the job aborts when Array Size and Record Count are set to 65000 on the ODBC stage: "Insert Bulk failed due to schema change on target table."
Observation: new nullable columns were added to the table after we developed the job. These new columns are not defined in the job, but because they are nullable, the job kept working while the array size was set to 1. When we raised the array size to 65000 to improve performance, the job aborts unless we define those new nullable columns in the job.
Thanks in advance.
Job aborts when array size and record count is set to 65000
Array Size = N * INT((packet size) / (row length))
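As a hedged illustration of that rule of thumb, a quick calculation might look like the sketch below; the packet size and row length values are assumed example numbers, not values taken from any particular system:

```python
import math

def suggested_array_size(packet_size_bytes, row_length_bytes, n=1):
    """Apply the rule of thumb above:
    Array Size = N * INT(packet_size / row_length).
    The inputs here are hypothetical example values."""
    return n * math.floor(packet_size_bytes / row_length_bytes)

# Example: a 32 KB network packet and a 200-byte row (assumed values)
print(suggested_array_size(32768, 200))  # -> 163
```

The idea is simply to pick an array size that fills whole network packets with rows rather than choosing an arbitrary large number.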
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
For a more detailed explanation of array size and transaction size look at Arndt's answer at the bottom of this topic:
viewtopic.php?t=151063&highlight=array+size
Absent a specific plan, and until you understand what each option controls, you could always start back at 1 and then raise the value in much smaller increments rather than going 'all in' right away. Increase it until the job aborts, then back it down slightly.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers