
Prevent a Routine from Aborting a Job

Posted: Fri Dec 16, 2005 8:46 am
by NEO
Folks,
I have a routine that aborts on write failures under certain conditions; it works fine for all other values. The routine is used in a job that takes its arguments from a file. When a write failure occurs for some of those values, the routine aborts and in turn aborts the job. When I run the routine stand-alone it returns <<ERROR>> for those arguments. How can I make sure that an abort in the routine doesn't abort the job, and that processing instead moves on to the next set of incoming values?
Is there anything I can change in the routine to handle the exception so it doesn't abort on a write failure?
Thanks.
Jay

Posted: Fri Dec 16, 2005 9:19 am
by kcbland
Use all the error trapping available on every command. A failure to open a file should not allow processing to continue to the point where a write to that file causes a failure; in fact, write failures can be trapped as well. Check each of the DS BASIC functions/statements you use against the manual. There's only a handful of situations that are unrecoverable.
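
For example, here is a minimal sketch of that kind of trapping in DataStage BASIC. The file, routine, and argument names are made up for illustration; the point is that both the OPEN and the WRITE carry clauses that turn a failure into a logged warning and a return status instead of an abort.

Code: Select all

RoutineName = 'SafeWrite'             ;* illustrative names only
RecordKey   = Arg1
OutRec      = Arg2
Ans         = 0

* Trap the open: if the file cannot be opened, warn and return a status.
OPEN 'MyHashedFile' TO FileVar ELSE
   Call DSLogWarn('Cannot open MyHashedFile', RoutineName)
   Ans = -1
   GOTO Finish
END

* Trap the write: ON ERROR catches I/O errors, ELSE catches other failures.
WRITE OutRec ON FileVar, RecordKey ON ERROR
   Call DSLogWarn('I/O error writing key ':RecordKey, RoutineName)
   Ans = -1
END ELSE
   Call DSLogWarn('Write of key ':RecordKey:' failed', RoutineName)
   Ans = -1
END

Finish:
* Ans is 0 on success, -1 on a trapped failure.
The calling job can then test the routine's return value and decide whether to skip that set of values or stop, rather than being aborted from inside the routine.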

Posted: Fri Dec 16, 2005 9:53 am
by NEO
kcbland wrote: Use all the error trapping available on every command. A failure to open a file should not allow processing to continue to the point where a write to that file causes a failure; in fact, write failures can be trapped as well. Check each of the DS BASIC functions/statements you use against the manual. There's only a handful of situations that are unrecoverable.
It's actually an external call to a DataStage internal subroutine:

Code: Select all

ExecAction = DSR.SUB.LOG.PURGE
CALL @DSR.SUB.LOG(ExecAction, JobName, Upto_Last_N_Run)
The log file for this particular job name has a permissions problem, and hence the routine aborts.
Any ideas on how to capture the status of this call and handle the error instead of having my routine abort?
Thanks.

Posted: Fri Dec 16, 2005 10:03 am
by kcbland
You're doing something unsupported, using knowledge of undocumented APIs. You have no control over what happens inside, and there's no global solution: once you are inside that subroutine, it can do anything it wants.

Would you consider writing your own subroutine to clear the log? You're already working outside the published APIs, so why not write your own at this point?
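
For what it's worth, here is a rough sketch of what such a routine might look like, using only documented DS BASIC statements so every failure can be trapped. Everything in it is illustrative rather than gospel: the routine name, the assumption that field 5 of the DS_JOBS record holds the internal job number, and the choice to issue CLEAR.FILE through EXECUTE.

Code: Select all

RoutineName = 'ClearJobLog'           ;* hypothetical name
Ans = 0

* Look up the job's internal number in DS_JOBS, trapping every failure.
OPEN 'DS_JOBS' TO DSJobs ELSE
   Call DSLogWarn('Cannot open DS_JOBS', RoutineName)
   Ans = -1
   GOTO Finish
END

READ JobRec FROM DSJobs, JobName ELSE
   Call DSLogWarn('Job ':JobName:' not found in DS_JOBS', RoutineName)
   Ans = -1
   GOTO Finish
END

JobNo = JobRec<5>                     ;* assumption: field 5 = job number

* Clear the job's log file (RT_LOGnnn). Note this clears everything,
* including the '//'-prefixed control records kept in the log.
EXECUTE 'CLEAR.FILE RT_LOG':JobNo CAPTURING CmdOutput

IF @SYSTEM.RETURN.CODE < 0 THEN       ;* negative code = command failed
   Call DSLogWarn('CLEAR.FILE failed: ':CmdOutput, RoutineName)
   Ans = -1
END

Finish:
* Ans is 0 on success, -1 on any trapped failure; the caller decides what to do.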

Posted: Fri Dec 16, 2005 10:22 am
by NEO
kcbland wrote: You're doing something unsupported, using knowledge of undocumented APIs. You have no control over what happens inside, and there's no global solution: once you are inside that subroutine, it can do anything it wants.

Would you consider writing your own subroutine to clear the log? You're already working outside the published APIs, so why not write your own at this point?
That is true. I have been using some of the pointers from this awesome forum, looking into the header files and connecting the dots, to get things done through the internal subroutines, and I found it very fast this way. A CLEAR.FILE approach, or any other way of doing it, seems to make my code very long and also very slow. I was amazed how fast I could clear the logs across the project using the call above, and my code is only a few lines long, so it's just a little tempting. But I guess, as some say, half knowledge is more dangerous than no knowledge :) As a side note, I have been trying, unsuccessfully, to unlock jobs using the following approach.

Code: Select all

ExecAction = DSR.SUB.EXE.UNLOCKL
Dummy = ""
CALL @DSR.SUB.EXECJOB(ExecAction, JobName, 1, Dummy)

OR

ExecAction = DSR.SUB.EXE.RELEASE
Dummy = ""
CALL @DSR.SUB.EXECJOB(ExecAction, JobName, 1, Dummy)
Thanks for your response.
Jay

Posted: Fri Dec 16, 2005 5:44 pm
by alanwms
Jay,

You've stumbled on the great debate over using custom routines versus configuring the tool to perform the proper action. Using the tool's capabilities will better ensure that your jobs keep working on future releases of DataStage; your custom routines, while very fast, may wreak havoc on a future release. For development work it's OK to write such routines, but for production code it's better to use the tool.

Ken has pointed out some perfect examples of that argument in his earlier post.

Alan