Abort After Rows - Write to Sequential File Not working

Post questions here relating to DataStage Enterprise/PX Edition, for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Abort After Rows - Write to Sequential File Not working

Post by vnspn »

Hi,

We have a job that we want to abort when invalid values come through for a column. We handle this using the "Abort After Rows" option in the Transformer. The output of that link is sent to a Sequential File stage to capture the invalid value.

The limit is set to '1'. When we run the job, it aborts on the required condition, but when we view the sequential file, it is empty; no value has been written to it. If we run the job with the 'Abort After Rows' limit set to '0', then we do get the invalid value in the sequential file.

We actually want to abort the job and, at the same time, capture the invalid value in an output file. Can this be done?

Thanks.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Could you put a Transformer stage between your "real" Transformer and the output Sequential File? How many rows does the log show going through that stage with the limit set to "0" and to "1"?
Pagadrai
Participant
Posts: 111
Joined: Fri Dec 31, 2004 1:16 am
Location: Chennai

Re: Abort After Rows - Write to Sequential File Not working

Post by Pagadrai »

vnspn wrote: We actually want to abort the job and, at the same time, capture the invalid value in an output file. Can this be done?
Hi,
I don't think
1) capturing the invalid values in a file and
2) aborting the job
can be done in a single job.
BugFree
Participant
Posts: 82
Joined: Wed Dec 13, 2006 6:02 am

Re: Abort After Rows - Write to Sequential File Not working

Post by BugFree »

Pagadrai wrote:
vnspn wrote: We actually want to abort the job and, at the same time, capture the invalid value in an output file. Can this be done?
Hi,
I don't think
1) capturing the invalid values in a file and
2) aborting the job
can be done in a single job.
Yes... it is not possible to capture the records if the limit is set to 1.
The only way I can think of is to let the job finish successfully and also collect the errors in the file.
Next, from the sequence, check for error records in that file (you can use Unix commands). Based on the result, trigger the next process or abort the sequence with a notification.
Ping me if I am wrong...
v2kmadhav
Premium Member
Posts: 78
Joined: Fri May 26, 2006 7:31 am
Location: London

Post by v2kmadhav »

Why can't you write to two streams? Anything that passes goes into your actual output, while the failure link is dealt with appropriately, rather than aborting the job or writing Unix scripts, etc.
BugFree
Participant
Posts: 82
Joined: Wed Dec 13, 2006 6:02 am

Post by BugFree »

v2kmadhav wrote: Why can't you write to two streams? Anything that passes goes into your actual output, while the failure link is dealt with appropriately, rather than aborting the job or writing Unix scripts, etc.
You are right, v2kmadhav 8). But to deal with the failure records, we need some condition check.
I thought a simple Unix command, "wc -l #FilePath#/#FileName#", would serve the purpose.
Based on the line count it reports, we can decide whether we really have error records, and then go on to the next process / business requirement.
That's what I meant in my earlier post.
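A minimal sketch of that check, assuming the reject file path is passed in as an argument (the script name check_rejects.sh is illustrative, not from this thread). A sequence could run it from an Execute Command activity and branch on the exit status:

Code: Select all

#!/bin/sh
# check_rejects.sh - sketch: signal the sequence when the reject file has rows.
ERRFILE="$1"
if [ ! -f "$ERRFILE" ]; then
    exit 0    # treating a missing file as "no rejects" (an assumption)
fi
# Reading from stdin makes wc print the count alone, without the file name.
COUNT=$(wc -l < "$ERRFILE")
if [ "$COUNT" -gt 0 ]; then
    echo "Found $COUNT reject row(s) in $ERRFILE" >&2
    exit 1    # non-zero exit: the sequence trigger can branch to an abort
fi
exit 0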
Ping me if I am wrong...
vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Post by vnspn »

Thanks for all your responses. Our requirement was to make both of these happen:
1) Abort the job immediately when an invalid value comes through.
2) Capture just the one record (the first record) that passed through the abort condition.

We were able to achieve this in the end. Initially, the output error file was always empty even when the job aborted with an error record passing through that link. This was with all default settings in the stage.

Then we tried selecting the "Node pool and resource constraint" option in the Transformer stage to have the conductor node alone selected. With this option, the record is always written to the output error file whenever the abort takes place. So we were able to achieve it the way we wanted.
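For reference, the node pool a stage can be constrained to is defined in the parallel configuration file. A minimal sketch of a single-node pool (the host name, paths, and pool name are illustrative assumptions, not taken from this thread):

Code: Select all

{
    node "node1"
    {
        fastname "etlhost"
        pools "" "conductor"
        resource disk "/ds/data" {pools ""}
        resource scratchdisk "/ds/scratch" {pools ""}
    }
}

Constraining the Transformer to that one-node pool on its Advanced tab forces the stage to run on a single node, which is the effect described above.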

Thanks to all for your valuable inputs!