Expect to abort job after ...
Moderators: chulett, rschirm, roy
In our job design, we filter out offending entries from the DB after completion of the previous ETL operation. These filtered entries are directed to a sequential file.
If this sequential file is non-empty at the end of the job, is there any way to abort the job, so the further stream of execution does not progress?
Thanks with regards,
videsh.
-
- Participant
- Posts: 12
- Joined: Wed Jun 13, 2007 2:12 am
Re: Expect to abort job after ...
videsh77 wrote:
> In our job design, we filter out offending entries from the DB after completion of the previous ETL operation. These filtered entries are directed to a sequential file.
> If this sequential file is non-empty at the end of the job, is there any way to abort the job, so the further stream of execution does not progress?

I think you can use a Transformer; it has an 'Abort After Rows' constraint.
Use that for your Sequential file link, and set it to 1.
That will abort the job if any row passes through that link.
Re: Expect to abort job after ...
videsh77 wrote:
> If this sequential file is non-empty at the end of the job, is there any way to abort the job, so the further stream of execution does not progress?

I'm confused by this. If you reach the end of the job, wouldn't it be too late to abort it? Unless you're talking about a job sequence?
Todd Ramirez
Sr Consultant, Data Quality
San Antonio TX
chulett -
Are you referring to writing a shell script that checks the file size and calls a DS routine, or writing a DS routine itself with ExecSH?

Gaurav -
We already have that solution elsewhere, but the limitation with this approach is that you will never find out which entries are offending, because the job aborts after the very first row.

Truenorth -
I understand your concern. But the problem is that we have seen our DB operation fail while the batch progressed as normal. So this is going to be a counter-check to ensure we have addressed anomalies at this logical checkpoint.
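To illustrate the shell-script approach mentioned above, here is a minimal sketch of an after-job check that aborts the sequence only after all rejects have been written out, so every offending entry is still visible in the file. The file path and script name are assumptions for illustration, not part of the original thread:

```shell
#!/bin/sh
# check_rejects.sh -- hypothetical after-job subroutine (e.g. via ExecSH).
# If the reject file is non-empty, print its contents and exit non-zero,
# which the calling job sequence can treat as a checkpoint failure.
REJECT_FILE="${1:-/tmp/rejects.txt}"

# -s is true when the file exists and has a size greater than zero
if [ -s "$REJECT_FILE" ]; then
    echo "Reject file $REJECT_FILE is non-empty; offending entries:"
    cat "$REJECT_FILE"   # all rejects are listed before the abort
    exit 1               # non-zero status signals the sequence to stop
fi
exit 0
```

Unlike the 'Abort After Rows = 1' constraint, this check runs after the job has finished writing the sequential file, so the full set of offending entries is preserved for inspection.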
Thanks with regards,
videsh.
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI