Performance

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

c341
Participant
Posts: 39
Joined: Mon Jan 26, 2004 8:43 pm

Performance

Post by c341 »

Hi
My source is a Siebel business component and I'm transferring the records into an Oracle table. One of the sources has around 200,000 records, and it takes more than 3 hours to run this job.
I've set the transaction size to 5000 and the array size to 1000 in the OCI plug-in.
Please suggest how I can improve the performance.

In the same situation, if I transfer the records to a hash file instead, it takes around 2 hours, which is less than the first case, but I still want to reduce the run time. For hash files, please suggest values for the group size, record size, etc.

Thank You
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Re: Performance

Post by chulett »

c341 wrote: Please suggest how I can improve the performance.
Does 'transferring into Oracle' mean inserting? If that's the case, drop your extracted records to a Sequential file and then bulk load them. SQL*Loader, especially a direct path load, will beat the pants off an OCI insert any day of the week.
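Something along these lines is usually all it takes (the data file, table and column names below are made up; substitute your own extract and target):

    -- load_target.ctl : a minimal sketch of a control file for the load
    LOAD DATA
    INFILE 'siebel_extract.dat'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY '|'
    (account_id, account_name, last_updated DATE "YYYY-MM-DD")

    sqlldr userid=your_user/your_password control=load_target.ctl log=load_target.log direct=true

The direct=true flag is what gives you the direct path load; leave it off and you get a conventional path load, which is still usually faster than row-by-row OCI inserts.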

Otherwise, Array Size is highly dependent on factors such as whether you are dragging the data across a network to your DataStage server, the speed of that network, how 'wide' your source table is, etc. Sometimes you'll find that changing the Array Size won't really help you at all.
c341 wrote: In the same situation, if I transfer the records to a hash file instead, it takes around 2 hours, which is less than the first case, but I still want to reduce the run time. For hash files, please suggest values for the group size, record size, etc.
Can't. You'll need to help yourself here. Install the Hashed File Calculator from the 'Unsupported' Utilities folder on the Client CD-ROM. With information about your data that you input, it will suggest creation values for a hashed file that are appropriate for your needs. A properly pre-created hashed file can speed up a process like that enormously. :wink:
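Just to illustrate (the file name, type, modulus and separation here are invented numbers; use whatever the calculator actually suggests for your volumes), pre-creating a static hashed file from the Administrator command window is a one-liner:

    CREATE.FILE MyHashFile 18 40009 4

That creates the file in the project (account) directory so the job can write straight into it instead of creating a default one on the fly.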
-craig

"You can never have too many knives" -- Logan Nine Fingers
wdudek
Participant
Posts: 66
Joined: Mon Dec 08, 2003 10:44 am

Post by wdudek »

We have an Oracle table that is updated daily with about 100,000 records, doing an update/insert, and this takes us about 15 minutes. Are you doing updates/inserts or just inserts? Do you have indexes on the table? If you are doing updates, indexes will greatly improve performance, but if you are only doing inserts they may slow it down. On many of our jobs we replace the entire table nightly, and in those cases we drop the indexes beforehand and recreate them afterwards (sketched below).

As for the bulk loader, we have found it excellent for loading large amounts of data, but we only use it when the data amounts to several million rows. One of the biggest problems we have encountered has been the network our servers are on; making some minor changes there has shown significant improvements where there were apparently previously unknown problems.
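For what it's worth, the drop/recreate around a full reload is nothing fancy, something like this (index, table and column names are just placeholders):

    DROP INDEX target_table_ix1;
    -- truncate and reload target_table here
    CREATE INDEX target_table_ix1 ON target_table (account_id);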