abort the job if lookup succeeds

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

dodda
Premium Member
Posts: 244
Joined: Tue May 29, 2007 11:31 am

abort the job if lookup succeeds

Post by dodda »

Hello,

We have two jobs. The requirement is that in Job 1 we need to read the first record of a file and, based on certain key fields, do a lookup against an Oracle table. If the key values exist in the Oracle table then I need to abort the job, and then as part of Job 2 I can load the file into the database.

So I am wondering whether there is a way to abort the job if the lookup succeeds.

The other way I have is to set the Lookup Failure action to Fail in Job 1, which means that when there is no matching record Job 1 fails. In the sequence, based on the status of Job 1 (aborted), I can run the second job. I don't know how valid that is.

Any ideas?

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You NEVER need to abort. Based on the lookup succeeding you need to make a decision about whether to initiate the second job. That is all.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
dodda
Premium Member
Posts: 244
Joined: Tue May 29, 2007 11:31 am

Post by dodda »

Thanks Ray,

I am wondering how to make the second job run, because if the first job succeeds (lookup success) that means a matching record exists, and if a matching record exists I don't want to go to the second job.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Use a sequence. The first job records the result of the lookup somewhere (a file, its user status area), the sequence checks that and makes the decision about whether or not to run the second job.
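A rough sketch of that pattern (the routine call placement and the activity name below are assumptions, not from this thread): Job1 records the number of lookup matches in its user status area, and the sequence tests that value in a custom trigger before running Job2.

Code: Select all

* DataStage BASIC, e.g. in an after-job subroutine of Job1
* MatchedRows is assumed to hold the lookup match count captured by the job
Call DSSetUserStatus(MatchedRows)

* Custom trigger expression on the Job1 activity link that runs Job2
* (the activity name Job1_Activity is hypothetical)
Job1_Activity.$UserStatus = "0"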
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
srinivas.g
Participant
Posts: 251
Joined: Mon Jun 09, 2008 5:52 am

Post by srinivas.g »

In the Lookup stage, use the Condition tab: put in the equality condition and set the failure action to Fail so that Job1 aborts, then trigger Job2 based on the trigger condition.
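In sequence terms that would look roughly like this (a sketch; the activity names are hypothetical):

Code: Select all

Job1_Activity ----> Job2_Activity    (Expression Type = Failed - Conditional)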
Srinu Gadipudi
thanush9sep
Premium Member
Posts: 54
Joined: Thu Oct 18, 2007 4:20 am
Location: Chennai

Post by thanush9sep »

You will need 3 jobs:

1. Job1, which has the Lookup stage and captures the rejected data in Dataset1 and the matched records in Dataset2
2. Job2
3. Job3, in which you take Dataset2, connect it to a BASIC Transformer and use (If @INROWNUM > 0 Then UtilityAbortToLog('aborted') Else 1) in a stage variable (note that the execution mode should be Sequential)
4. This is how your job sequence should look:

Job1
|
|
Job3---->Terminator Activity(In triggers, Expression Type should be Failed for this link)
|
|
Job2 (Expression type should be OK)

If there are any records in Dataset2 then Job3 will fail and go to the Terminator Activity; if it runs OK then Job2 will run successfully.
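A sketch of how that could be wired (the stage-variable name svAbort is an assumption; the rest follows the post):

Code: Select all

svAbort derivation in the BASIC Transformer of Job3 (execution mode Sequential):
    If @INROWNUM > 0 Then UtilityAbortToLog('aborted') Else 1

Triggers on the Job3 activity in the sequence:
    Job3 ----> Terminator Activity    (Expression Type = Failed)
    Job3 ----> Job2                   (Expression Type = OK)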
Regards
LakshmiNarayanan
dodda
Premium Member
Posts: 244
Joined: Tue May 29, 2007 11:31 am

Post by dodda »

Thanks Thanush
vinnz
Participant
Posts: 92
Joined: Tue Feb 17, 2004 9:23 pm

Post by vinnz »

As Ray suggested, it may be better to make your decision without having to abort. If you can use a file, store the results of the lookup in a sequential file in Job1, use a command activity to see whether any records were written to the file, and invoke Job2 based on the result using a Nested Condition activity.

Code: Select all

Job1 -> Command -> NestedCondition -> Job2
HTH
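A sketch of that sequence (the activity name CheckMatches and the file path are assumptions): the Execute Command activity counts the rows Job1 wrote, and the Nested Condition runs Job2 only when the count is zero. The exact string handling of $CommandOutput may need adjusting.

Code: Select all

Execute Command activity (CheckMatches):
    wc -l < /path/to/lookup_matches.txt

Nested Condition trigger expression for the link to Job2:
    Trim(Convert(@FM, '', CheckMatches.$CommandOutput)) = '0'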