Abort the job from "After job subroutine"

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sudhindra_ps wrote:But still no luck as job used to try to insert data first and then try to delete. Which resulted in Database constraint violation.
Used to try? :?

You need to do a more thorough job of detailing your requirements. Describe your complete job flow. What exactly are you trying to delete? What exactly are you trying to insert? What kind of constraint was violated? Describe the target table (or tables) involved. All we know at this point is that you are having a problem, but we don't really have enough information to do anything more than guess (as I did before) at what might help you.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Sudhindra_ps
Participant
Posts: 45
Joined: Thu Aug 31, 2006 3:13 am
Location: Bangalore

Post by Sudhindra_ps »

Hi Chulett,

I am trying to delete and insert a huge volume of data on the same table in the same job. I extract all the data from the source system and compare it against the existing data in the target table. Based on that comparison, I delete all the matching existing records from the target table and insert all the records flowing from the source system into the target table.

As you suggested, I sent both the Insert link and the Delete link into a single DRS (Dynamic RDBMS) stage and ensured the link ordering from the Transformer stage was correct, so that the Delete operation runs first, followed by the Insert operation on the target system. But all the records from the Insert link were rejected with a "Unique Key Constraint" violation on the target table, as the Delete operation had not completed before the Insert operation started.
Do you have any other suggestions on how we can handle this kind of situation?
Please let me know if you need any more details on this.

Thanks & regards
Sudhindra P S
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Change your Transaction Size and Array Size to 1. This sends a COMMIT after every row.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Which will kill your performance but prove (if it then works) that your design is correct.
Sudhindra_ps wrote:As you suggested, I sent both the Insert link and the Delete link into a single DRS (Dynamic RDBMS) stage and ensured the link ordering from the Transformer stage was correct, so that the Delete operation runs first, followed by the Insert operation on the target system. But all the records from the Insert link were rejected with a "Unique Key Constraint" violation on the target table, as the Delete operation had not completed before the Insert operation started.
Not possible. The second link cannot start before the first link is complete. So, two things. One - you may not be deleting all the records associated with the constraint in one statement. You'd have to determine whether the keys being deleted match the keys being inserted. Two - what Array Size settings are you using on the two links? Do they match? Same question for Transaction Size on each link.

If they don't match link to link, what you could be seeing are records that were processed in the proper order inside the job but were not sent to the database in the proper sequence.

Something else to consider: if the DRS stage supports this Update Action, you might try the 'Replace Completely' option. That would allow you to have one link where every record processed first deletes itself and then inserts the latest information, rather than doing an update. Or simply perform an update - is the delete really necessary? It can be a pretty expensive operation, and I'd hate to see you do it for no good reason, especially with 'huge' volumes.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

If DRS cannot do "replace existing rows completely" (which means "delete then insert"), consider using an ODBC stage, which can.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
getanjalip
Participant
Posts: 5
Joined: Tue Mar 07, 2006 2:20 am

Post by getanjalip »

Hi,

I have a similar situation.

I have a simple job which creates an output file depending on some constraints. If the output file is empty, I want my DS job to be set to an Aborted state.
For this I planned to call an after-job subroutine which uses ExecSH to check whether the output file size > 0:

if [ -s #OutFileName# ]; then exit 0; else exit 998; fi   # have to check syntax

When I tried this option, it just sent out a warning message in the DS log and the job showed a Finished state.

Can this be done in any other way?

Thanks.
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You can't use the exit status for this; you need to explicitly abort the job - DSLogFatal() will do that for you.
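
Something along these lines should do it as a custom Before/After routine in DataStage BASIC (an untested sketch - CheckOutputFile is just a name I made up, and I'm assuming the output file's path is passed in via the routine's InputArg):

Subroutine CheckOutputFile(InputArg, ErrorCode)
* After-job routine: abort the job if the file named in InputArg is empty.
      ErrorCode = 0
      OpenSeq InputArg To FileVar Then
         ReadSeq FirstLine From FileVar Then
            * At least one line exists, so the job can finish normally
            CloseSeq FileVar
         End Else
            * Empty file - logging a fatal message aborts the job
            CloseSeq FileVar
            Call DSLogFatal("Output file " : InputArg : " is empty", "CheckOutputFile")
         End
      End Else
         Call DSLogFatal("Cannot open output file " : InputArg, "CheckOutputFile")
      End
Return

Compile it, select it as the job's after-job subroutine and pass #OutFileName# as the input value.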
-craig

"You can never have too many knives" -- Logan Nine Fingers