Before-SQL - To be committed only after the job finishes

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vigneshra
Participant
Posts: 86
Joined: Wed Jun 09, 2004 6:07 am
Location: Chennai


Post by vigneshra »

Hi

We have a problem. We need to load a table using a user-defined insert query, and we do it by passing a dummy row to the DB2 plugin to trigger the query. Before inserting, we have to disable logging for the table, because the volume of the insert is large enough to overflow the transaction log, so we run an SQL statement that disables logging temporarily. Once a commit is issued, logging is re-enabled automatically.

We therefore need the disable-logging SQL and the user-defined insert SQL to run as a single unit of work, with the commit issued only at the end. When we tried the API stage's Before-SQL, the behaviour was strange and it brought down the database. So we are looking for a way to do this in the DB2 Enterprise stage itself. Please help us in this regard!
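For reference, the "logging re-enabled at commit" behaviour described above matches DB2's NOT LOGGED INITIALLY feature, which stays in effect only until the end of the current unit of work. A minimal sketch of that single unit of work, assuming a hypothetical table MYSCHEMA.TARGET_TBL (not named in this thread) and autocommit turned off:

```
-- Run with autocommit off so both statements share one unit of work.
-- MYSCHEMA.TARGET_TBL and the SELECT source are placeholders.
ALTER TABLE MYSCHEMA.TARGET_TBL ACTIVATE NOT LOGGED INITIALLY;

INSERT INTO MYSCHEMA.TARGET_TBL
SELECT ...;   -- the user-defined insert query goes here

COMMIT;       -- NOT LOGGED INITIALLY deactivates here; logging resumes
```

Note that if a statement fails while NOT LOGGED INITIALLY is active, DB2 can mark the table inaccessible, which is consistent with the corruption reported later in this thread.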
Vignesh.

"A conclusion is simply the place where you got tired of thinking."
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
Please define "weird behaviour"?

Enterprise stages don't always offer Before/After SQL the way the API plugins do.

Have you consulted your DBAs for alternative solutions?

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting :)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
vigneshra
Participant
Posts: 86
Joined: Wed Jun 09, 2004 6:07 am
Location: Chennai

Post by vigneshra »

Thanks Roy! Actually, if we run the job with the DB2 API stage, the job runs forever, and if we try to abort it, the table space gets corrupted and we have to rebuild the table. When we consulted with ETL, the suggestion was to go with the Enterprise stage. So I was asking whether there is any option to achieve this!
Vignesh.

"A conclusion is simply the place where you got tired of thinking."
mujeebur
Participant
Posts: 46
Joined: Sun Mar 06, 2005 3:02 pm
Location: Philly,USA

Post by mujeebur »

What about switching back to ARCHIVELOG mode in the post-session, that is, after the SQLs are committed?
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

I would not recommend using the DB2 Enterprise stage for this. For one thing, if you run it on multiple nodes, there is no real control over the before-stage/after-stage routines, and they get run multiple times.

This may be the actual cause of your weird behavior.

Suggestion: Use Before-Job and After-Job to call the DB2 routines via the command line.
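As a sketch of that suggestion: a before-job ExecSH command could drive the whole unit of work through the DB2 command-line processor. This is a hypothetical script, assuming the db2 CLP is on the server's PATH and using placeholder database, table, and credential names; the CLP keeps one back-end connection across invocations in the same shell session, and +c disables autocommit per statement:

```
#!/bin/sh
# before_load.sh -- hypothetical before-job script, not from the thread.
# One connection; +c keeps autocommit off so the ALTER and the INSERT
# fall in a single unit of work, committed explicitly at the end.
db2 connect to MYDB user loader using 'xxxxxxxx'
db2 +c "ALTER TABLE MYSCHEMA.TARGET_TBL ACTIVATE NOT LOGGED INITIALLY"
db2 +c "INSERT INTO MYSCHEMA.TARGET_TBL SELECT ..."
db2 commit
db2 connect reset
```

Because the script runs once from the conductor, it avoids the per-node duplication problem described above.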
mujeebur
Participant
Posts: 46
Joined: Sun Mar 06, 2005 3:02 pm
Location: Philly,USA

Post by mujeebur »

Hi, what Ascential says in the manual is:

"The Open command property allows you to specify a command (for example some SQL) that will be executed by the database before it processes any data from the stage. There is also a Close property that allows you to specify a command to execute after the data from the stage has been processed."

So, can this be implemented?
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

Yes, but it suffers from the problem I mentioned above:

Each node is largely independent of the others and has little idea of what the others are doing, so Open and Close basically get run by every node. In a 4-node job, for example, you would have 4 DB2 stages each asking to drop and recreate the same index...

That is definitely not what you want, I would presume.

So what to do? A Before/After-Job routine is your best bet, as I suggested earlier.