Abort the job after a certain number of rejects
Posted: Thu Dec 23, 2010 11:50 pm
I have a source file A containing one million records, which I am reading through the Sequential File stage SF_A.
In SF_A I set the reject mode to "Output" so that reject records are directed to another Sequential File stage, SF_Rejects, through a Transformer stage TF_A. On the Transformer's output link I set the option to abort after 101 records.
The problem is that the job does abort after 100 rows, but the reject file is never created; no file appears at the specified path. Below is the logical flow of the job I created.
SF_A -> Copy Stage -> Peek stage
 |
 | (reject flow)
 |
TF_A (Transformer) -> SF_Rejects (Sequential File stage)
How can I overcome this problem, so that the job aborts and the 100 reject records are still written to the sequential file? That way the client saves the processing time on the rest of the one million records whenever the file contains 100 rejects.
Any suggestions?
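To make the intended behavior concrete, here is a minimal sketch in plain Python (not DataStage; the `is_reject` rule and file layout are hypothetical). The point it illustrates: each reject is flushed to the reject file as it arrives, so the records already written survive even when processing is aborted at the threshold. That is the outcome wanted from the job above.

```python
REJECT_LIMIT = 100  # abort once this many rejects have been seen

def is_reject(rec):
    # hypothetical reject rule: rows without exactly 3 comma-separated fields
    return len(rec.split(",")) != 3

def process(records, reject_path):
    """Load good rows, write rejects to reject_path, abort at REJECT_LIMIT."""
    loaded, rejects = 0, 0
    with open(reject_path, "w") as reject_file:
        for rec in records:
            if is_reject(rec):
                reject_file.write(rec + "\n")
                reject_file.flush()  # persist the reject before any abort
                rejects += 1
                if rejects >= REJECT_LIMIT:
                    raise RuntimeError(f"aborting after {rejects} rejects")
            else:
                loaded += 1  # stand-in for loading the good row
    return loaded, rejects
```

Because the file is flushed on every reject, aborting mid-run still leaves all 100 reject rows on disk, which is the behavior the DataStage job should ideally reproduce.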