
Import Table definition - heavy response time

Posted: Tue Oct 21, 2003 3:46 am
by Kim Bundgaard
When I import a table definition (Table Definitions -> Plug-in Metadata Definitions -> DSDB2 -> DB2 z/OS datasource), I can see that the processing in DB2 z/OS takes approx. 10 seconds, but the DataStage Server uses about 300 seconds to process the 8,000 tables/views and present them in a window (the DataStage Server is not CPU busy and has no memory problem).
Does anyone have a circumvention for this, or any suggestion for an alternative DataStage configuration to speed up the function?

Posted: Tue Oct 21, 2003 4:29 am
by Amos.Rosmarin

Did you try using ODBC import? Sometimes it works faster.

Yes I've tried ODBC import

Posted: Tue Oct 21, 2003 4:39 am
by Kim Bundgaard
And it is faster, but I'm not sure that all datatypes are correct with the ODBC interface (though of course that can be checked), and what about accessing the data afterwards?
If I use ODBC for the table import, will the access to the table be via ODBC or the native DB2 CLI driver?
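
One quick way to check the datatypes outside DataStage is to compare what the ODBC driver reports for a table with what the DB2 z/OS catalog says. A rough sketch, assuming a DB2 ODBC DSN and the pyodbc package; the DSN, schema and table names below are only placeholders:

# Sketch: inspect the column types the ODBC driver reports for one table,
# so they can be compared with the native definitions in SYSIBM.SYSCOLUMNS.
# Assumes an ODBC DSN "DB2ZOS" and the pyodbc package; names are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=DB2ZOS;UID=myuser;PWD=mypassword")
cur = conn.cursor()

# What the ODBC driver reports (SQLColumns call)
print("ODBC view of MYSCHEMA.MYTABLE:")
for col in cur.columns(table="MYTABLE", schema="MYSCHEMA"):
    print(f"  {col.column_name:<20} {col.type_name:<12} "
          f"len={col.column_size} scale={col.decimal_digits}")

# What the DB2 z/OS catalog says natively, for comparison
print("Catalog view (SYSIBM.SYSCOLUMNS):")
cur.execute(
    "SELECT NAME, COLTYPE, LENGTH, SCALE "
    "FROM SYSIBM.SYSCOLUMNS "
    "WHERE TBCREATOR = ? AND TBNAME = ? "
    "ORDER BY COLNO",
    "MYSCHEMA", "MYTABLE",
)
for name, coltype, length, scale in cur.fetchall():
    print(f"  {name:<20} {coltype.strip():<12} len={length} scale={scale}")

conn.close()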

Posted: Tue Oct 21, 2003 4:43 am
by vmcburney
Usually you get this type of problem with client/server when it performs a large number of small select statements, which creates a lot of round trips. If DB2 is not busy and your DataStage server is not busy, then I would point the finger at the network, especially the network traffic of other applications using that database.

I don't know DB2 that well, but a possible workaround is to create some dummy database logins that can each see only a subset of the tables, and use these logins when importing tables. You can divide the tables up by function or by name so that each login sees about 1,000 tables/views. That will hopefully save you a few minutes on each import.
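
The round-trip effect is easy to demonstrate outside DataStage: one bulk catalog query is far cheaper than one small query per table. A rough sketch, assuming a DB2 ODBC DSN and the pyodbc package (the DSN name is a placeholder; SYSIBM.SYSTABLES and SYSIBM.SYSCOLUMNS are the z/OS catalog tables):

# Sketch: compare one bulk catalog query against per-table queries
# to show how per-object round trips dominate the elapsed time.
# Assumes an ODBC DSN "DB2ZOS" and the pyodbc package; names are placeholders.
import time
import pyodbc

conn = pyodbc.connect("DSN=DB2ZOS;UID=myuser;PWD=mypassword")
cur = conn.cursor()

# One round trip: fetch every table/view name in a single statement.
start = time.perf_counter()
cur.execute("SELECT CREATOR, NAME FROM SYSIBM.SYSTABLES WHERE TYPE IN ('T', 'V')")
tables = cur.fetchall()
print(f"bulk query: {len(tables)} objects in {time.perf_counter() - start:.1f}s")

# One round trip per table: the pattern that makes a metadata import crawl.
# Only the first 500 objects, to keep the demonstration short.
start = time.perf_counter()
for creator, name in tables[:500]:
    cur.execute(
        "SELECT COUNT(*) FROM SYSIBM.SYSCOLUMNS WHERE TBCREATOR = ? AND TBNAME = ?",
        creator.strip(), name.strip(),
    )
    cur.fetchone()
print(f"per-table queries (first 500): {time.perf_counter() - start:.1f}s")

conn.close()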

Posted: Tue Oct 21, 2003 4:49 am
by Kim Bundgaard
Sorry, I haven't explained ...

All the processing I'm waiting for is happening on the DataStage Server. There is no network traffic (Gigabit network) - DataStage just sits there for a couple of minutes until it is ready to show the tables to me.

Posted: Tue Oct 21, 2003 5:22 am
by peterbaun
Hi Kim,

Sorry - I can't really answer your question. I have actually seen the same problem but never measured it in any detail - I always thought it was response time from the server.

It probably won't help with your current problem, but you can limit the number of schemas to be processed by setting the SCHEMALIST variable in DB2 Connect (I am assuming that you are using DB2 Connect) - it will reduce the number of tables returned.
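
As far as I recall SCHEMALIST is a CLI/ODBC keyword set on the client (in db2cli.ini). The effect of restricting the schemas can be seen with a quick ODBC test like the sketch below, assuming a DB2 ODBC DSN and the pyodbc package; the DSN and schema names are placeholders:

# Sketch: enumerate only one schema's tables via ODBC, to see how much the
# table list shrinks when schemas are restricted (the same idea as SCHEMALIST).
# Assumes an ODBC DSN "DB2ZOS" and the pyodbc package; names are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=DB2ZOS;UID=myuser;PWD=mypassword")
cur = conn.cursor()

# SQLTables restricted to a single schema - comparable to what the import
# has to walk through when SCHEMALIST limits the schemas on the client.
rows = cur.tables(schema="MYSCHEMA", tableType="TABLE,VIEW").fetchall()
print(f"MYSCHEMA only: {len(rows)} tables/views")

# Unrestricted - every schema in the catalog.
rows = cur.tables(tableType="TABLE,VIEW").fetchall()
print(f"all schemas:   {len(rows)} tables/views")

conn.close()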

Unless it has been changed in version 6, be aware that if you import table definitions with ODBC and then use them in the 390 environment, you will probably have to change some of the column definitions, e.g. the length of the integer fields is larger than the maximum allowed length of integers in DS390 (as I recall, version 6 will give you an error about this - earlier versions didn't, which caused a lot of fun...).

Best regards,
Peter Baun

Re: Import Table definition - heavy response time

Posted: Tue Oct 21, 2003 7:28 am
by peterbaun
Hi Kim,

I just want to add to my previous answer - it seems to be a known issue with the DB2 plug-in.

If you go to the Ascential Knowledge Base you will find article G10221, which describes exactly your problem. The problem is described for version 5.2r1 but no resolution is given, so I believe it hasn't been fixed - I would contact Ascential support and ask them for a status on this.


Performance issues importing metadata via DB2 plug-in

Long Description
Importing metadata via the DB2 plug-in takes a long time when the volume of table definitions to be imported is high.

Importing 20,000 table definitions via the DB2 plug-in takes about 2 hours, as opposed to 2 minutes when the ODBC plug-in (via wire protocol) is used against the same database.

We have tried importing metadata from DB2 databases on NT and AS/400 systems; the performance is the same. Please see the figures below:

Definitions   DB2 plug-in (h:mm:ss)   ODBC plug-in (h:mm:ss)
 5,000        0:06:35                 0:01:02
12,000        0:39:00                 0:01:18
22,000        2:00:06                 0:02:20

The times shown above are the time it takes DS Manager to build the list of tables available in the database. It does not matter whether the DB2 database resides on AS/400 or NT. For all tests DataStage 5.2 was installed on Solaris 2.8 and AIX 5.1; performance figures were similar. The DB2 client was 7.2.

Steps to replicate the problem:
- We created the tables (5,000, 10,000 & 20,000) via a SQL script in the target database.
- Once the tables were created, we imported metadata via the DB2 plug-in and the ODBC plug-in and compared the time it took DS Manager to build the list of available tables.
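
As a rough way to reproduce the ODBC side of these figures outside DataStage, a script along the following lines generates the CREATE TABLE script and times how long the ODBC driver takes to enumerate the table list, which is essentially what DS Manager does when it builds the list of available tables. This is only a sketch, assuming a working ODBC DSN and the pyodbc package; the DSN and schema names are placeholders:

# Sketch: generate a SQL script that creates N dummy tables, and time how
# long the ODBC driver takes to enumerate the resulting table list -
# roughly what DS Manager does when it builds the list of available tables.
# Assumes an ODBC DSN "TESTDB" and the pyodbc package; names are placeholders.
import time
import pyodbc

N_TABLES = 5000

# Write the CREATE TABLE script (run it against the target database first).
with open("create_test_tables.sql", "w") as f:
    for i in range(N_TABLES):
        f.write(f"CREATE TABLE TESTSCH.T{i:05d} (C1 INTEGER, C2 CHAR(10));\n")

# Time the table-list enumeration through ODBC (SQLTables).
conn = pyodbc.connect("DSN=TESTDB;UID=myuser;PWD=mypassword")
cur = conn.cursor()
start = time.perf_counter()
rows = cur.tables(schema="TESTSCH", tableType="TABLE").fetchall()
print(f"ODBC listed {len(rows)} tables in {time.perf_counter() - start:.1f}s")
conn.close()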