Access same hash file as look up from two different jobs

Kwang
Participant
Posts: 20
Joined: Tue Nov 04, 2003 4:27 pm
Location: Canada

Access same hash file as look up from two different jobs

Post by Kwang »

Hi all,

I am running two jobs in parallel from a sequence. Both jobs extract data and use the same hashed file (file name: hsh_max_batch_number) as a lookup. However, it always turns out that one of the jobs gets all rows back and the other gets 0 rows back.

In both of the job properties, I checked the "Enable Hashed File cache sharing" option.

What else should I do to allow the hashed file to be accessed by two jobs running in parallel?

Thanks.

Kathy
Kwang
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

There's nothing required to make this work if both are just doing lookups. You've got a problem with the job itself that got back 0 rows.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Cache sharing requires other things to be configured; you can read about these in dsdskche.pdf. Meanwhile, do not enable cache sharing - the separate processes will create their own separate caches and, provided the design is appropriate, both jobs will work OK.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
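To illustrate the general idea (outside DataStage): below is a minimal Python sketch, using shelve as a stand-in for a hashed lookup file, of how two independent reader processes can look up against the same file at the same time, each keeping its own private cache, with no shared-cache configuration needed. Only the file name is borrowed from the thread; the keys, job names and helper functions are hypothetical, for illustration only.

    # Illustrative sketch only - Python, not DataStage BASIC.
    # Two separate processes read the same key/value file concurrently,
    # each building its own private in-process cache.
    import multiprocessing as mp
    import shelve  # stand-in for a hashed (key/value) lookup file

    LOOKUP_FILE = "hsh_max_batch_number"  # name taken from the thread

    def build_lookup():
        """Create a small key/value file once, before the readers start."""
        with shelve.open(LOOKUP_FILE, flag="n") as db:
            db["BATCH"] = "42"

    def reader(job_name, keys):
        """Each 'job' opens the file read-only and keeps its own cache."""
        cache = {}  # private, per-process cache
        with shelve.open(LOOKUP_FILE, flag="r") as db:
            for key in keys:
                if key not in cache:          # cache miss: read from file
                    cache[key] = db.get(key)  # None if the key is not found
                print(job_name, key, "->", cache[key])

    if __name__ == "__main__":
        build_lookup()
        jobs = [mp.Process(target=reader, args=(name, ["BATCH"] * 3))
                for name in ("job_A", "job_B")]
        for j in jobs:
            j.start()
        for j in jobs:
            j.join()

Because both readers open the file read-only, they never conflict; each simply pays the cost of its own cache, which is what Ray describes happening when cache sharing is left disabled.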
Kwang
Participant
Posts: 20
Joined: Tue Nov 04, 2003 4:27 pm
Location: Canada

Post by Kwang »

chulett wrote:There's nothing required to make this work if both are just doing lookups. You've got a problem with the job itself that got back 0 rows. ...
I am sure both jobs should return rows. Actually, in the job that returned 0 rows, all the records went down the "not found" output link.

When I run the two jobs sequentially, they both work O.K.
Kwang
Kwang
Participant
Posts: 20
Joined: Tue Nov 04, 2003 4:27 pm
Location: Canada

Post by Kwang »

ray.wurlod wrote:Cache sharing requires other things to be configured. You can read about these in dsdskche.pdf. Meanwhile do not enable cache sharing - the separate processes will create separate caches and, design ...
O.K., I will change that. Then what should I do to be able to read the same hashed file from two different jobs at the same time?

Thanks.

Kathy
Kwang
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Nothing. Just use them.
-craig

"You can never have too many knives" -- Logan Nine Fingers