ODBC Connector Stage for DB2 Performance issue

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


hi_manoj
Participant
Posts: 56
Joined: Sat Aug 13, 2011 2:00 pm
Location: BLR

ODBC Connector Stage for DB2 Performance issue

Post by hi_manoj »

Hi,

I have an ODBC Connector stage that I am using to read data from a DB2 table. The table has many millions of rows.

When reading from the table, the rows/sec rate is very low; it starts at around 1000 and goes up to only about 7000.
Is there a way to increase the performance of the ODBC Connector stage?

I have tried changing the array size and record count (from 2000 up to 200000), but the result is the same.
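For reference only, here is a minimal sketch of the kind of read-only source SQL that usually gives the best DB2 extract throughput; the schema, table, and column names are placeholders, not taken from the job:

-- Sketch with placeholder names (MYSCHEMA.MYTABLE, COL1..COL3).
-- FOR FETCH ONLY declares the cursor read-only; WITH UR reads
-- uncommitted data so the extract does not wait on row locks.
SELECT COL1, COL2, COL3
FROM MYSCHEMA.MYTABLE
FOR FETCH ONLY
WITH UR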

Please help.
Regards
Manoj
hi_manoj
Participant
Posts: 56
Joined: Sat Aug 13, 2011 2:00 pm
Location: BLR

Post by hi_manoj »

For one of the tables, I tried the modulus partitioned read (enabled partitioned reads with a partition column name), but it seems like all of the records are going to every partition, as if it were doing an Entire partitioning.

For example, if my source table has 5 records, the output link shows 20 records. Since I am using a 4-node configuration file, that is every record read once per node (5 x 4 = 20).
Do I need to set any specific property?

Regards
Manoj
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Is your source table actually partitioned? Sounds like the answer is no.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

The DB2 Connector can do partitioned reads even when the underlying source table is not partitioned.

I always prefer the DB2 Connector over the ODBC Connector, so I don't know about the ODBC Connector's parallel read capabilities. The fact that you're seeing all rows broadcast to every processing node implies either a bug or a "feature" of that stage.
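For illustration, here is a sketch of what a modulus partitioned read does conceptually; this is not the exact SQL either connector generates, and PART_COL and MYSCHEMA.MYTABLE are placeholders. Each of the four nodes runs the base query with its own MOD predicate on the chosen partition column, so every source row is read by exactly one node:

-- Query run by node 0 of a 4-node configuration (conceptual sketch):
SELECT COL1, COL2, PART_COL
FROM MYSCHEMA.MYTABLE
WHERE MOD(PART_COL, 4) = 0
-- Nodes 1, 2 and 3 run the same query with = 1, = 2 and = 3.
-- If every node instead runs the unrestricted SELECT, each row is read
-- once per node, which matches the 5-rows-in, 20-rows-out symptom above.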

Mike
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Just checked the docs and it doesn't seem that the ODBC Connector requires that either. So I would just double-check the partitioning method and column that you've chosen in the stage and then compare that to the partitioning that you are doing in the job itself.
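As an additional sanity check outside DataStage (again a sketch with placeholder names), the spread of the candidate partition column across a 4-node modulus split can be checked directly in DB2 to confirm the column divides the rows reasonably evenly:

-- Placeholder names; 4 matches the node count in the configuration file.
SELECT MOD(PART_COL, 4) AS node_bucket,
       COUNT(*) AS row_count
FROM MYSCHEMA.MYTABLE
GROUP BY MOD(PART_COL, 4)
ORDER BY node_bucket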
-craig

"You can never have too many knives" -- Logan Nine Fingers