Consider the following job scenario:
Code:
Oci_Src--> Tfm1--> Dataset1
Dataset2--> Tfm2--> Oci_Tgt
In the above situation:
> The first flow uses Dataset1 to write to a file, f1.ds
> Dataset2 is used to read the data written to f1.ds and perform any
subsequent functions as part of the second flow
> Oci_Src reads data in volumes of millions of rows
> The host system uses a 4-node configuration file
The question is:
If the first node reads, say, 20,000 records and writes them to f1.ds via Dataset1, the second flow automatically starts to execute. But by the time the second flow completes, f1.ds will have been updated with the next node's buffer read from Oci_Src. So the second flow will always execute against a source whose data volume keeps changing throughout the lifetime of the process. In such a case, will the job not abort because the second flow is trying to read from a file whose data volume is dynamic?
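To make the concern concrete, here is a minimal sketch (plain Python standing in for DataStage's dataset handling, with hypothetical helper names) of a reader taking snapshots of a file that another flow is still appending to, so that two reads of the "same" source see different volumes:

```python
# Sketch of the race described above: the first flow appends batches to
# f1.ds while the second flow reads it, so each read sees a different
# record count. write_batch/count_records are illustrative helpers, not
# DataStage APIs.
import os
import tempfile

def write_batch(path, start, n):
    # Append n records, as the first flow's Dataset1 would per node buffer.
    with open(path, "a") as f:
        for i in range(start, start + n):
            f.write(f"record_{i}\n")

def count_records(path):
    # Read the whole file, as the second flow's Dataset2 would.
    with open(path) as f:
        return sum(1 for _ in f)

path = os.path.join(tempfile.mkdtemp(), "f1.ds")

write_batch(path, 0, 20000)        # first node's 20,000-record buffer lands
snapshot1 = count_records(path)    # second flow reads at this point

write_batch(path, 20000, 20000)    # next node's buffer arrives meanwhile
snapshot2 = count_records(path)    # a later read sees a larger volume

print(snapshot1, snapshot2)        # 20000 40000 - the source has changed
```

This is only the asker's scenario restated as code; whether DataStage actually interleaves the two flows this way (rather than finishing the write to the dataset before the second flow starts) is exactly the question.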
Thanks
Kumarjit.
Pain is the best teacher, but very few attend his class..