Prevent a Routine from Aborting a Job

Post questions here related to DataStage Server Edition, in areas such as Server job design, DS BASIC, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

NEO
Premium Member
Posts: 163
Joined: Mon Mar 22, 2004 5:49 pm

Prevent a Routine from Aborting a Job

Post by NEO »

Folks,
I have a routine that aborts on certain write failures under some conditions. It works fine for all other values. The routine is used in a job that takes its arguments from a file. When there is a write failure for those conditions, the routine aborts and in turn aborts the job. When I run the routine standalone, it returns <<ERROR>> for certain arguments. How can I make sure a routine abort doesn't abort the job, and that the job instead goes on to process the next set of values coming in?
Is there anything I can change in the routine to handle exceptions and make sure it doesn't abort on a write failure inside?
Thanks.
Jay
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Use all the error trapping available on every command. A failure to open a file should not allow processing to continue to the point where a write to that file causes a failure; in fact, write failures can be trapped as well. Verify all of the DS BASIC functions/statements against the manual. There's only a handful of situations that are unrecoverable.
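
For example, a minimal sketch of that kind of trapping around a sequential file write (the file, record, and routine names here are illustrative, not from your job):

Code: Select all

      Function WriteWithTrap(FilePath, RecordData)
* Sketch only: trap the open and the write so neither aborts the job.
* FilePath, RecordData and the name WriteWithTrap are illustrative.
      Ans = 0
      OpenSeq FilePath To FileVar Then
         WriteSeq RecordData To FileVar Else
            Call DSLogWarn("Write failed for " : FilePath, "WriteWithTrap")
            Ans = -1
         End
         CloseSeq FileVar
      End Else
         Call DSLogWarn("Cannot open " : FilePath, "WriteWithTrap")
         Ans = -1
      End
      Return(Ans)

The routine logs a warning and returns -1 instead of aborting, so the calling job can decide to skip that set of values and carry on.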
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
NEO
Premium Member
Posts: 163
Joined: Mon Mar 22, 2004 5:49 pm

Post by NEO »

kcbland wrote: Use all the error trapping available on every command. A failure to open a file should not allow processing to continue to the point where a write to that file causes a failure; in fact, write failures can be trapped as well. Verify all of the DS BASIC functions/statements against the manual. There's only a handful of situations that are unrecoverable.
It's actually an external call to an internal DataStage subroutine:

Code: Select all

* Purge this job's log up to the last N runs (undocumented internal call)
ExecAction = DSR.SUB.LOG.PURGE
CALL @DSR.SUB.LOG(ExecAction, JobName, Upto_Last_N_Run)
The log file for this particular job name has a permissions issue, and hence the routine aborts.
Any ideas on how to capture the status of this and somehow handle the error instead of my routine aborting?
Thanks.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

You're doing something unsupported, using knowledge of undocumented APIs. You have no control over what happens inside. There's no global solution; once inside that subroutine, it can do anything it wants.

Would you consider writing your own subroutine to clear the log? You're already doing things outside the published APIs, so why not write your own subroutine at this point?
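
A rough sketch of what that could look like (the DS_JOBS field position and the "//" purge-control record key are assumptions about undocumented internals, so verify them on your release):

Code: Select all

      Subroutine ClearJobLog(ErrorCode, JobName)
* Clear a job's log without aborting on failure; ErrorCode reports the
* problem instead. Field 5 of the DS_JOBS record and the "//" control
* record are assumptions about internals, not published interfaces.
      ErrorCode = 0
      Open "DS_JOBS" To JobsFile Then
         Read JobRec From JobsFile, JobName Then
            Open "RT_LOG" : JobRec<5> To LogFile Then
               Read Control From LogFile, "//" Else Control = ""
               ClearFile LogFile
               Write Control To LogFile, "//" On Error ErrorCode = 4
            End Else
               ErrorCode = 3 ;* no access to the log file - your case above
            End
         End Else
            ErrorCode = 2 ;* job name not found in DS_JOBS
         End
      End Else
         ErrorCode = 1 ;* cannot open DS_JOBS
      End
      Return

Because every open and write is trapped, a permissions problem comes back as a non-zero ErrorCode for the caller to handle rather than as an abort.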
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
NEO
Premium Member
Posts: 163
Joined: Mon Mar 22, 2004 5:49 pm

Post by NEO »

kcbland wrote: You're doing something unsupported, using knowledge of undocumented APIs. You have no control over what happens inside. There's no global solution; once inside that subroutine, it can do anything it wants.

Would you consider writing your own subroutine to clear the log? You're already doing things outside the published APIs, so why not write your own subroutine at this point?
That is true. I have kind of been using pointers from this awesome forum, looking into the header files and connecting the dots to get things done with the internal subroutines. I found doing it this way very fast. A CLEAR.FILE approach, or any other way of doing it, seems to make my code very long and also very slow. I was amazed how fast I could clear the logs across the project using the above call, and my code is only a few lines long. It's just a little tempting. But I guess, like some say, half knowledge is more dangerous than no knowledge :)

Just a side note: I have been trying, unsuccessfully, to unlock jobs using the following approach.

Code: Select all

ExecAction = DSR.SUB.EXE.UNLOCKL
Dummy = ""
CALL @DSR.SUB.EXECJOB(ExecAction, JobName, 1, Dummy)

OR

ExecAction = DSR.SUB.EXE.RELEASE
Dummy = ""
CALL @DSR.SUB.EXECJOB(ExecAction, JobName, 1, Dummy)
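The documented alternative I'm considering instead is the engine's UNLOCK verb; a rough sketch of driving it from BASIC (finding the owning user number via LIST.READU is not shown, and the command needs sufficient privilege):

Code: Select all

* Sketch using the documented UniVerse UNLOCK verb instead of the
* internal DSR.SUB.EXECJOB call. UserNo is the process number that
* holds the lock, as reported by LIST.READU (lookup not shown).
      Cmd = "UNLOCK USER " : UserNo : " ALL"
      Execute Cmd Capturing Output Returning RtnCode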
Thanks for your response.
Jay
alanwms
Charter Member
Posts: 28
Joined: Wed Feb 26, 2003 2:51 pm
Location: Atlanta/UK

Post by alanwms »

Jay,

You've stumbled onto the great debate over using custom routines versus configuring the tool to perform the proper action. Using the tool's capabilities will better ensure that your jobs keep working on future releases of DS. Your custom routines, while very fast, may wreak havoc on a future release. For development stuff it's OK to write the routines, but for production code it's better to use the tool.

Ken has pointed out some perfect examples of that argument in his earlier post.

Alan