Tuning the array size of OCI stages

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

michaelsarsnorum
Participant
Posts: 31
Joined: Wed Aug 31, 2005 1:55 am

Tuning the array size of OCI stages

Post by michaelsarsnorum »

Hi

We tried different values for the array size in one of the jobs in our data warehouse ETL. We found that we could double the job's performance by setting the array size to 20 (higher than that and performance dropped again).

What I'd like to know is whether there are any formulas for choosing array sizes. We have inherited a large system from someone else, and it contains many jobs that still use the default array size of 1. We do not have the resources to go through every job and test it with different values to see what gives the best performance.

Any help is appreciated.

Michael
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

It's a function of the average number of characters in a typical row, packet sizes, etc. You can hit the Help button on the ODBC/OCI stages for a clearer explanation. You're going to need to profile your data to work out the average row size and how many rows will fit into the array optimally. That value is a production-side tunable and relies on a certain level of stability in the data's composition.
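As a rough illustration of that profiling step, here is a small sketch of the arithmetic involved: pick an array size so that one array of rows roughly fills a target transfer buffer. The 32 KB buffer size and the function name are assumptions for the example, not a DataStage or Oracle constant.

```python
def suggest_array_size(avg_row_bytes: int,
                       target_buffer_bytes: int = 32 * 1024,
                       max_array_size: int = 5000) -> int:
    """Suggest an array size from the profiled average row width.

    avg_row_bytes       -- average row size found by profiling the data
    target_buffer_bytes -- assumed transfer buffer to fill per array
                           (32 KB is an illustrative guess)
    max_array_size      -- cap to keep very narrow rows from producing
                           an unreasonably large array
    """
    if avg_row_bytes <= 0:
        raise ValueError("average row size must be positive")
    return max(1, min(max_array_size, target_buffer_bytes // avg_row_bytes))

# Rows averaging 1.5 KB suggest an array size around 21, close to the
# value of 20 that the original poster found by trial and error.
print(suggest_array_size(1536))  # -> 21
```

In practice you would still verify the suggested value against the job's actual throughput, since the optimum also depends on network packet sizes and how stable the row widths are.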

Your best option is not to go for the high-water mark, but to go a little under the average. You'll probably find it's worth just making a quick guess and raising the number from the default of 1. You may also consider the bulk-loader route for pure inserts, which usually yields far more return than any array tweaking.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle