Hi all,
We have designed two jobs in a parent-child relationship, where the parent job calls the child job via UtilityJobRun for each row extracted in the parent job.
We are using the "Multiple Instance" option in both the child and the parent job to achieve parallelism.
Our requirement is to run the child job in parallel for multiple countries.
Each country's data is spread across multiple partitions, so the parent job extracts the partitions present in a single country and calls the child job once for each partition of that country.
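For reference, the per-row child invocation described above is usually done through the DataStage BASIC job-control API, which routines such as UtilityJobRun wrap. A minimal sketch, assuming a child job named Childjob and hypothetical parameter names pCountryCd and pPartition; the key point is that each call uses a unique invocation ID so the multiple-instance runs do not collide:

```
* Hedged sketch only: job name and parameter names are assumptions.
* Build a unique invocation ID per country/partition pair.
InvocationId = CountryCd : "_" : PartitionId

* Attach the multiple-instance child as "Childjob.<invocation id>".
hJob = DSAttachJob("Childjob." : InvocationId, DSJ.ERRFATAL)

* Pass the routing key and the partition down as job parameters.
ErrCode = DSSetParam(hJob, "pCountryCd", CountryCd)
ErrCode = DSSetParam(hJob, "pPartition", PartitionId)

* Start this invocation and wait for it to complete.
ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob)

* Check the result, then release the handle.
Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
ErrCode = DSDetachJob(hJob)
```

Each invocation ID must be distinct across concurrently running instances, or two runs will fight over the same runtime files.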
The parent job is designed in the following way:
||DB2||-->||Tfm||-->||File||
The child job is designed as follows:
||DB2||-->||Tfm (acting like a router)||--> several downstream ||Tfm||'s, which are explained below.
The child job extracts data from a DB2 database table; one ||Tfm|| acts as a router, followed by a number of ||Tfm||'s, one per route (a route depends on Country_Cd), and each route finally writes to a different file.
In other words, the child job is designed so that it routes each request to the appropriate branch depending on the Country_Cd.
If we run the parent job for one country (a single instance), it works fine.
But if we run the parent job for multiple countries (i.e. multiple instances; we tried two), we get the following error:
Childjob.2#5.BASIC_Transformer_Router.DSLink_SpainInput: ds_ipcopen() - Error in
open(/tmp/ade.SCIPROD.Childjob.2#5.BASIC_Transformer_Router-BASIC_Tfm_Spain.DSLink_S
painInput) - No such file or directory
In the above, BASIC_Transformer_Router.DSLink_SpainInput represents the link between the router and the BASIC_Tfm_Spain Transformer.
Note: we are already taking care of the multiple parent instances calling multiple child instances.
Any help in this regard would be greatly appreciated.
Thanks in advance,
Sai
Multiple Instances in both Parent-Child Jobs
-
- Participant
- Posts: 158
- Joined: Tue Mar 15, 2005 3:16 am
I guess the temp file produced by one BASIC Transformer is getting cleared by another one running in parallel. That would be the main reason for the ds_ipcopen() error.
But why a BASIC Transformer in PX? Why can't the same routing be done in a PX Transformer?
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Instead of running multiple instances of parallel jobs, why not just use a configuration file with more processing nodes? You can have more than one processing node on the same machine.
Make sure, however, that you don't force sequential-mode operation by writing via a Sequential File stage; use a File Set or Data Set stage instead, and write subsequently from that to the final result file.
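To illustrate, a parallel configuration file defining multiple logical processing nodes on a single physical machine might look like the sketch below. The host name, directory paths, and node count are assumptions you would adapt to your own environment:

```
{
    node "node1"
    {
        fastname "etl_server"
        pools ""
        resource disk "/data/datasets" {pools ""}
        resource scratchdisk "/data/scratch" {pools ""}
    }
    node "node2"
    {
        fastname "etl_server"
        pools ""
        resource disk "/data/datasets" {pools ""}
        resource scratchdisk "/data/scratch" {pools ""}
    }
}
```

Point APT_CONFIG_FILE at this file and the parallel engine will run the job's operators across both logical nodes, which gives you country-level parallelism without multiple job instances.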
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.