My job design looks like this:
Code: Select all
DRS1 ------> IPC ------> Transformer1 ------> Separate process (no issues in this path)
                              |
                              | input
                              v
                      Hashed File Stage1
                              |
                              | output
                              v
            +------------ Transformer2 ------------+
            |                 |                     |
            | output          | input               | input
            v                 v                     v
          DRS2          Hashed File2          Hashed File3
The output link from Hashed File Stage1 should fire only after all the rows have been processed (i.e. after Transformer1 has finished loading it). In my testing, however, I aborted the job in Transformer1 by calling the UtilityAbortToLog routine (roughly as sketched below), and the output link from Hashed File Stage1 still opened and processed 100 rows. I confirmed this by checking the output in Hashed File2 and Hashed File3, but no changes were made in the backend (DRS2).
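For reference, the abort was forced with something like the following in Transformer1 (a minimal sketch only; the stage variable name svForceAbort, the link name DSLink3 and the column TEST_ABORT_FLAG are made-up placeholders for my real test condition):

Code: Select all
Stage variable: svForceAbort   (hypothetical name)
Derivation:     If DSLink3.TEST_ABORT_FLAG = 1
                Then UtilityAbortToLog("Forcing abort for testing")
                Else 0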
I don't want this to happen, because the other two hashed files will be used in other jobs with the wrong results. I expect the output link to open only if all the rows are processed successfully.
Is there any way to stop this from happening, so that if the job aborts the output link does not open? Is there any setting I need to change to achieve this?