
DB2 Connector

Posted: Thu Oct 31, 2013 9:51 pm
by Curious George
Hi All,

I have a parallel job in which a Sequential File stage feeds data to a DB2 Connector stage that updates a table.
On the Partitioning tab of the DB2 Connector stage, the Collector Type is set to 'Auto'.
Could you tell me whether the DB2 update will run in parallel or in sequential mode? If it runs in parallel mode, which partitioning method will it use?
The DB2 table is partitioned on a particular column.

Posted: Thu Oct 31, 2013 11:56 pm
by ray.wurlod
Welcome aboard.

If it runs in parallel it will use the DB2 partitioning algorithm.

If you should wish to enforce this, set the execution mode to parallel and the partitioning algorithm to DB2 - you will be prompted for the name of the table that defines the partitioning.
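To picture what "the DB2 partitioning algorithm" means here: a DPF database distributes rows by hashing the distribution key into a partition map that assigns each hash bucket to a database partition, and the DB2 partitioning method routes the data stream the same way so each player writes only to its own partition. Here is a rough conceptual sketch in Python; the bucket count, hash function, and partition map are illustrative placeholders, not DB2's actual internals.

Code:
# Conceptual sketch of hash distribution for a DPF-style partitioned table.
# NOTE: NUM_BUCKETS, the hash function, and PARTITION_MAP are placeholders;
# DB2's real hashing function and partition map are internal to the database.
import hashlib

NUM_BUCKETS = 4096                      # size of the (hypothetical) partition map
DB_PARTITIONS = [0, 1, 2, 3]            # assume a four-partition DPF database
# A partition map assigns every hash bucket to one database partition (round-robin here).
PARTITION_MAP = [DB_PARTITIONS[b % len(DB_PARTITIONS)] for b in range(NUM_BUCKETS)]

def target_partition(dist_key_value: str) -> int:
    """Return the database partition a row with this distribution-key value lands on."""
    bucket = int(hashlib.md5(dist_key_value.encode()).hexdigest(), 16) % NUM_BUCKETS
    return PARTITION_MAP[bucket]

# Rows with the same key value always map to the same partition, which is why
# partitioning the DataStage link the same way avoids cross-partition writes.
print(target_partition("CUST_1001"))
print(target_partition("CUST_1002"))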

Posted: Fri Nov 01, 2013 1:04 pm
by MT
Hi,

You can see whether the DB2 Connector already runs in parallel by looking at the partitioning icon.
I recommend using the "DB2 Connector" partitioning method within the DB2 Connector stage; it is different from "Auto" and will provide the correct partitioning if your target is a DB2 DPF (partitioned) database.

The "DB2" partitioning method is the old one that was used for the DB2EE stage; it is no longer appropriate for the DB2 Connector.

Posted: Fri Nov 01, 2013 3:51 pm
by Curious George
Hi Ray,

Thank you for your reply.

Hi MT,
My job flow is below:
Sequential File --> DB2 Connector
I looked at the partitioning icon on the link that flows from the Sequential File stage to the DB2 Connector, and it shows the symbol where data converges into the DB2 Connector.
I opened the different tabs in the DB2 Connector stage and checked the Stage page, but there is no Advanced tab for the DB2 Connector stage. It is present for other stages, where it shows the following information:
Execution Mode:
Default(Parallel)

I don't have access to view the job score or anything like that.

Thank you.

Posted: Fri Nov 01, 2013 7:19 pm
by ray.wurlod
If what you say is true then your Sequential File stage is running in parallel, which seems doubly odd. Are you sure you're not seeing a "fan out" icon, which indicates partitioning rather than collecting?

You do have access to view the score. Request the score be logged using the APT_DUMP_SCORE environment variable as a job parameter, then look in the log for an entry of the form "This step has one dataset".
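If reading the score in the Director client is awkward, the logged score can also be inspected from an exported copy of the job log. Below is a minimal sketch; the log file name and the blank-line end-of-block heuristic are assumptions, with only the "This step has ..." phrase taken from the score entry described above.

Code:
# Scan an exported DataStage job log for the dumped score (APT_DUMP_SCORE output)
# and print the lines from the score entry onward.
LOG_FILE = "job_log_export.txt"   # hypothetical export of the Director log

def print_score_summary(path: str) -> None:
    """Print the score block starting at the "This step has" line."""
    in_score = False
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "This step has" in line:   # start of the score entry
                in_score = True
            if in_score:
                if not line.strip():      # crude heuristic: stop at the first blank line
                    break
                print(line.rstrip())

if __name__ == "__main__":
    print_score_summary(LOG_FILE)

The operator and player listing in that block shows whether the DB2 Connector operator is running with a single player (sequential) or one player per node (parallel).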

Posted: Sat Nov 02, 2013 1:27 pm
by MT
Hi Curious George
Curious George wrote:
"I opened the different tabs in the DB2 Connector stage and checked the Stage page, but there is no Advanced tab for the DB2 Connector stage. It is present for other stages, where it shows the following information:
Execution Mode:
Default(Parallel)

I don't have access to view the job score or anything like that."
There is an Advanced tab, just as in the other stages, but you have to click the stage icon in the upper-left corner once the connector window is open.

Unless you specified multiple readers per node when reading the file, I agree with Ray that what you described seems a little strange, and I would have a look at the score as Ray suggested.

Posted: Mon Nov 04, 2013 12:09 pm
by Curious George
Hi Ray/MT,

Thanks a lot for your responses. I checked as MT suggested, and the Execution Mode parameter shows 'Sequential'.

Thanks for your valuable inputs.