Hi all,
I am running two jobs in parallel from a sequence. Both jobs extract data and use the same hashed file (file name: hsh_max_batch_number) as a lookup. However, it always turns out that one of the jobs gets all its rows back and the other gets 0 rows back.
In both of the jobs' properties, I checked "Enable Hashed File cache sharing".
What else should I do to allow the hashed file to be accessed by different jobs running in parallel?
Thanks.
Kathy
Access same hashed file as lookup from two different jobs
Cache sharing requires other things to be configured; you can read about these in dsdskche.pdf. Meanwhile, do not enable cache sharing - the separate processes will create separate caches and, provided the design is appropriate, both jobs will work OK.
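As an analogy only (this is plain Python, not DataStage, and the file name is just a stand-in for the hashed file), the point above is that concurrent read-only readers do not interfere with each other: each reader gets its own handle, position, and buffer, so both "jobs" see every row. Here two threads stand in for the two parallel jobs:

```python
# Analogy only (not DataStage): concurrent read-only readers of the same
# file do not conflict, because each reader opens its own handle and keeps
# its own buffer - the file-level equivalent of "separate caches".
from concurrent.futures import ThreadPoolExecutor
import os
import tempfile

def count_rows(path):
    # Each "job" opens the lookup file independently, read-only.
    with open(path) as f:
        return len(f.read().splitlines())

def run_two_jobs(path):
    # Launch both lookups at the same time, as the sequence does.
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(count_rows, [path, path]))

if __name__ == "__main__":
    # Stand-in for the shared hashed file (hypothetical contents).
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "w") as f:
        f.write("\n".join(f"key{i},value{i}" for i in range(100)))
    print(run_two_jobs(path))  # both readers see all 100 rows
    os.unlink(path)
```

If one job sees 0 rows while this pattern holds, the cause is in that job's own logic (e.g. its lookup keys), not in the concurrent access itself.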
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett wrote: There's nothing required to make this work if both are just doing lookups. You've got a problem with the job itself that got back 0 rows. ...

I am sure the two jobs should both return rows. Actually, in the job that returned 0 rows, all the records went down the "not found" output link.
When I run the two jobs sequentially, they are both O.K.
Kwang
ray.wurlod wrote: Cache sharing requires other things to be configured. You can read about these in dsdskche.pdf. Meanwhile do not enable cache sharing - the separate processes will create separate caches and, design ...

O.K., I will change that. Then what should I do to be able to read the same hashed file from two different jobs at the same time?
Thanks.
Kathy
Kwang