Abort After Rows - Write to Sequential File Not working

Posted: Mon Mar 30, 2009 9:44 am
by vnspn
Hi,

We have a job that we want to abort when we get invalid values in a column. We handle this using the "Abort After Rows" option in the Transformer. The output of that link is sent to a Sequential File stage to capture the invalid value.

The limit is set to '1'. When we run the job, it aborts on the required condition, but when we view the sequential file, it is empty; no value has been written to it. If we run the job with the 'Abort After Rows' limit set to '0', then we do get the invalid value in the sequential file.

We actually want to abort the job and at the same time capture the invalid value in an output file. Could this be done?

Thanks.

Posted: Mon Mar 30, 2009 12:45 pm
by ArndW
Could you put a Transformer stage between your "real" Transformer and the output Sequential File stage? How many rows does the log show going through that intermediate stage with the limit set to "0" and with it set to "1"?

Re: Abort After Rows - Write to Sequential File Not working

Posted: Tue Mar 31, 2009 2:11 am
by Pagadrai
vnspn wrote: We actually want to abort the job and at the same time capture the invalid value in an output file. Could this be done?
Hi,
I don't think
1) capturing the invalid values in a file and
2) aborting the job
can be done in a single job.

Re: Abort After Rows - Write to Sequential File Not working

Posted: Tue Mar 31, 2009 3:08 am
by BugFree
Pagadrai wrote:
vnspn wrote: We actually want to abort the job and at the same time capture the invalid value in an output file. Could this be done?
Hi,
I don't think
1) capturing the invalid values in a file and
2) aborting the job
can be done in a single job.
Yes... it is not possible to capture the records if we set the limit to 1.
The only way I can think of is to let the job finish successfully and also collect the errors in the file.
Then, from the sequence, check for error records in that file (you can use Unix commands). Based on the result, trigger the next process or abort the sequence with a notification.

Posted: Tue Mar 31, 2009 3:56 am
by v2kmadhav
Why can't you write to two streams? Anything that passes goes into your actual output, while the failure link is dealt with appropriately, rather than aborting the job or writing Unix scripts, etc.

Posted: Tue Mar 31, 2009 4:32 am
by BugFree
v2kmadhav wrote: Why can't you write to two streams? Anything that passes goes into your actual output, while the failure link is dealt with appropriately, rather than aborting the job or writing Unix scripts, etc.
You are right, v2kmadhav 8). But to deal with the failure records, we need some condition check.
I thought a simple Unix command, "wc -l #FilePath#/#FileName#", would do the job.
Based on the count it reports, we can decide whether we really have error records and then proceed to the next process / business requirement.
That's what I meant in my earlier post.
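
For illustration, here is a minimal shell sketch of that check, run for example from an Execute Command activity in the sequence. ERR_FILE is a hypothetical stand-in for the #FilePath#/#FileName# job parameters; the exact path and wiring are assumptions, not part of the original job.

    # Sketch only: count the rows the job wrote to the error file
    # and abort the sequence if there are any.
    ERR_FILE=/path/to/error_file.txt   # stand-in for #FilePath#/#FileName#

    # Reading from stdin makes wc -l print just the number, without the filename.
    count=$(wc -l < "$ERR_FILE")

    if [ "$count" -gt 0 ]; then
        echo "$count invalid record(s) found - aborting" >&2
        exit 1   # a non-zero exit lets the sequence take its failure branch
    else
        echo "No invalid records - continuing to the next process"
    fi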

Posted: Tue Mar 31, 2009 12:24 pm
by vnspn
Thanks for all your responses. Our requirement was to make both of these happen:
1) Abort the job immediately if an invalid value comes through.
2) Capture just the one record (the first record) that passed through the abort condition.

We were able to achieve this in the end. Initially, the output error file was always empty even when the job aborted with an error record passing through that link. This was with all default settings in the stage.

Then we tried selecting the "Node pool and resource constraint" option in the Transformer stage, with only the conductor node selected. Running the job with this option always writes the record to the output error file whenever the abort takes place. So we were able to achieve it the way we wanted.

Thanks to all for your valuable inputs!