Distributed Transaction stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

sheema
Premium Member
Posts: 204
Joined: Sat Jul 22, 2006 10:20 am

Distributed Transaction stage

Post by sheema »

Hi,

I have a requirement where I have to load data from a dataset into 4 different Oracle tables (Master, Detail) which have constraints such that detail rows cannot be loaded until the master is loaded. This has to be achieved in a single DataStage job, as the load has to happen in a single commit after the data has been inserted into all 4 tables.

I have seen posts saying the Distributed Transaction stage can be used if the source is an MQ stage. I was just curious whether we can use the Distributed Transaction stage in my case, where the source is a dataset and the load goes into 4 different Oracle tables.

Thanks
Sheema
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

MQ (the product) is required to support the Distributed Transaction Stage. I'm not sure whether MQ (the stage) is required to be the source or not.

Mike
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

I haven't had a chance to test it extensively, but in 8.5 the Connectors offer this capability directly. Alternatively, it can be done prior to 8.5 in a Server Job: each link coming from a single Transformer can go to a separate table with its own SQL, and then a single commit can be utilized. That's been there since about release 4.

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
sheema
Premium Member
Posts: 204
Joined: Sat Jul 22, 2006 10:20 am

Post by sheema »

Thanks for your reply.

If we use a server job to build this, I have two questions:

1. How can we ensure that it loads data into the master first and then the detail?

2. How do we do a single commit for all the tables at once?
Thanks

Sheema
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You use a single target stage and multiple links from the transformer, then enable 'Transaction Grouping'. Master first via proper link ordering.
-craig

"You can never have too many knives" -- Logan Nine Fingers
sheema
Premium Member
Posts: 204
Joined: Sat Jul 22, 2006 10:20 am

Post by sheema »

Thanks for the reply. I will try this option.
Thanks

Sheema
sheema
Premium Member
Posts: 204
Joined: Sat Jul 22, 2006 10:20 am

Post by sheema »

That option is working, but the job is slow; any suggestions to improve the performance would be great. As Craig suggested, it is a server job:

Seq File ----> Transformer ----> Oracle OCI stage

From the transformer I have 4 different links going into the Oracle stage, and I am inserting data using custom SQL statements. I have enabled transaction grouping.
The job is inserting data at 300 rows/sec.
Thanks

Sheema
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That's because, in order to do the 'all or nothing' rollback for each 'transaction group', it forces the Transaction Size to 1. It's the slowest way to put data in tables, but also the only way to accomplish this. Seems like your only opportunity for improvement comes with the 8.5 release... perhaps.
-craig

"You can never have too many knives" -- Logan Nine Fingers