
Expect to abort job after ...

Posted: Tue May 25, 2010 11:38 pm
by videsh77
In our job design, we are filtering out offending entries from the DB after the completion of the previous ETL operation. These filtered entries are expected to be directed to a sequential file.
If this sequential file is non-empty at the end of the job, is there any way we can abort the job so that the further stream of execution does not progress?

Posted: Tue May 25, 2010 11:51 pm
by chulett
After-job routine, check filesize, call DSLogFatal() if non-empty.

Re: Expect to abort job after ...

Posted: Tue May 25, 2010 11:57 pm
by gaurav_shukla
videsh77 wrote:In our job design, we are filtering out offending entries from the DB after the completion of the previous ETL operation. These filtered entries are expected to be directed to a sequential file.
If this sequential file is non-empty at the end of the job, is there any way we can abort the job so that the further stream of execution does not progress?
I think you can use a Transformer; it has an 'Abort After Rows' constraint.

Use that for your Sequential file link, and set it to 1.

That will abort the job if any row passes through that link.

Re: Expect to abort job after ...

Posted: Wed May 26, 2010 12:21 am
by truenorth
videsh77 wrote:If this sequential file is non-empty at the end of the job, is there any way we can abort the job so that the further stream of execution does not progress?
I'm confused by this. If you reach the end of the job, wouldn't it be too late to abort it? Unless you're talking about a job sequence?

Posted: Wed May 26, 2010 3:43 am
by videsh77
chulett -
Are you referring to writing a shell script which checks the file size and calls a DS routine?

Or to writing a DS routine itself, like ExecSH?

Gaurav -
We already have that solution elsewhere, but the limitation with that approach is that you will never know which entries are offending, because the job is going to abort after the very first row.

Truenorth -
I understand your concern. But the problem is that we have seen our DB operation fail while the batch progressed as normal. So this is going to be a counter-check at this logical checkpoint, to ensure we have addressed the anomalies.

Posted: Wed May 26, 2010 3:58 am
by priyadarshikunal
It has to be a custom after-job subroutine which checks the size of the reject file, calls DSLogFatal() if it is not empty, and returns a non-zero value as the error code.
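
A minimal sketch of what such an after-job subroutine could look like in DataStage BASIC, assuming the reject file path is passed in through the routine's input value (the routine name AbortIfRejectsExist and the wc-based size check are illustrative, not actual project code):

   * Hypothetical after-job subroutine: aborts the job when the reject file is non-empty.
   * InputArg is expected to carry the path of the sequential reject file.
   Subroutine AbortIfRejectsExist(InputArg, ErrorCode)
      ErrorCode = 0
      RejectFile = Trim(InputArg)

      * Ask the OS for the byte count of the reject file
      Call DSExecute("UNIX", "wc -c < ":RejectFile, Output, SystemReturnCode)
      Bytes = Trim(Output<1>)

      If SystemReturnCode <> 0 Then
         * Could not check the file at all - treat that as an error as well
         ErrorCode = SystemReturnCode
         Call DSLogFatal("Unable to check reject file ":RejectFile, "AbortIfRejectsExist")
      End Else
         If Bytes > 0 Then
            * Non-empty reject file: return a non-zero error code and log a fatal
            * message, which aborts the job
            ErrorCode = 1
            Call DSLogFatal("Reject file ":RejectFile:" contains ":Bytes:" bytes", "AbortIfRejectsExist")
         End
      End
   Return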

Posted: Wed May 26, 2010 6:45 am
by chulett
Yes, that would be best. A script can also be used to pass back a non-zero return / exit code, and the job will recognize it; however (from memory) it will log a warning rather than abort. :?

Posted: Thu May 27, 2010 4:04 am
by videsh77
I have adopted the path suggested by chulett, and I have modified the ExecSH subroutine to raise a fatal error in case the script happens to fail.

This worked for me ...

Thanks all for your contribution.

Posted: Thu May 27, 2010 7:12 am
by chulett
Hopefully you created a modified version of ExecSH (like ExecSHFatal) rather than changing the delivered one that everyone uses, yes?
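
For reference, a minimal sketch of what such a copied routine could look like, using the ExecSHFatal name mentioned above (an illustrative outline, not the delivered ExecSH source):

   * Hypothetical ExecSHFatal: a copy of the delivered ExecSH before/after-job
   * subroutine, modified so that a failing command aborts the job instead of
   * only logging its output.
   Subroutine ExecSHFatal(InputArg, ErrorCode)
      ErrorCode = 0

      * Run the command passed in through the routine's input argument
      Call DSExecute("UNIX", InputArg, Output, SystemReturnCode)
      Call DSLogInfo("Command output: ":Output, "ExecSHFatal")

      If SystemReturnCode <> 0 Then
         * Return a non-zero error code and raise a fatal message, which aborts the job
         ErrorCode = SystemReturnCode
         Call DSLogFatal("Command failed with return code ":SystemReturnCode, "ExecSHFatal")
      End
   Return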