On Array Size!!

Posted: Wed Mar 15, 2006 3:35 pm
by gateleys
I was going through some previous postings on analyzing your input data and trialling values for Array Size and Rows per Transaction to achieve optimal performance with OCI9. I came across a posting by Ray:
ray.wurlod wrote:I just had a report from a customer as follows. Job design is

Code: Select all

OCI9 ----->  HashedFile
There are 34756 rows to move. Array size is set to 1000. The job loaded 34000 rows, and apparently discarded the incomplete array.
Further investigation is required.
So, why was the incomplete array discarded? Don't tell me that the array size should be a factor of Rows per Transaction!!!
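To make the symptom concrete, here is a minimal sketch (plain Python, not DataStage or OCI code; the function name and loop are purely illustrative) of a loader that only flushes *full* arrays. With 34756 rows and an array size of 1000, the trailing partial array of 756 rows is never flushed, matching the 34000-row result Ray reported:

```python
# Illustrative sketch only: a buggy loader that flushes an array
# to the target only when it is completely full.
def load_full_arrays_only(total_rows, array_size):
    loaded = 0
    batch = 0
    for _ in range(total_rows):
        batch += 1
        if batch == array_size:  # flush only when the array is full
            loaded += batch
            batch = 0
    # The bug being illustrated: any rows left in the final partial
    # array (here, `batch` rows) are silently discarded.
    return loaded

print(load_full_arrays_only(34756, 1000))  # -> 34000; the last 756 rows vanish
```

A correct loader would add a final flush of the partial array after the loop; whether the OCI9 stage actually behaves like this sketch is exactly the open question in Ray's report.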

Cheers,
gateleys

Re: On Array Size!!

Posted: Wed Mar 15, 2006 3:50 pm
by ogmios
I always set array size to 1. We got bitten once: when an error occurred in a job using array size > 1, all rows in the same "array" got the same treatment, not only the row that caused the error (in our case, all of them were rolled back).

This behaviour changed between DataStage versions, so we just started using 1 as the transaction size and kept it that way.
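A small sketch (plain Python, not DataStage; the function and the `is_bad` predicate are illustrative assumptions based on the behaviour described above) of what whole-array rollback means for the rows sharing an array with a bad one:

```python
# Illustrative sketch: with array size > 1, one bad row rolls back
# every row in the same array, not just the offending row.
def load_with_array_rollback(rows, array_size, is_bad):
    committed = []
    for i in range(0, len(rows), array_size):
        array = rows[i:i + array_size]
        if any(is_bad(r) for r in array):
            continue  # the entire array is rolled back
        committed.extend(array)
    return committed

rows = list(range(10))
# Row 3 is "bad". With array size 5, the whole first array (rows 0-4) is lost:
print(load_with_array_rollback(rows, 5, lambda r: r == 3))  # -> [5, 6, 7, 8, 9]
# With array size 1, only the bad row itself is lost:
print(load_with_array_rollback(rows, 1, lambda r: r == 3))
```

This is why an array size of 1 trades throughput for row-level error isolation.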

Ogmios