concurrent jobs accessing hash lookup for reading & updating

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

hiltsmi
Participant
Posts: 20
Joined: Thu Aug 04, 2005 9:03 am

concurrent jobs accessing hash lookup for reading & updating

Post by hiltsmi »

I have a common hash file that will be used as a lookup by a number of different jobs. The jobs will look up data from the file, update the contents of the record, and write it back to the hash file in the same job.

I have tested it with a single job, and reading and writing to the hash file appear to work okay.

But I am concerned about concurrent use of the hash file by multiple jobs at the same time, which will probably be the case in production.

Can multiple jobs safely read and write to a hash file lookup at the same time?
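
For illustration, here is a minimal DataStage BASIC sketch of that read-modify-write pattern. The hashed file name CUST.SUMMARY, the key value, and the field used are made up for the example; in the actual jobs this would normally be built with Hashed File stages rather than a routine.

   * Sketch only; file name, key and field are placeholders.
   OPEN 'CUST.SUMMARY' TO HashFile ELSE
      Call DSLogFatal('Cannot open hashed file CUST.SUMMARY', 'ReadWriteSketch')
   END

   Key = 'CUST001'

   * Plain READ then WRITE: works for one job, but two jobs doing this
   * on the same key at the same time can overwrite each other's change.
   READ Rec FROM HashFile, Key ELSE Rec = ''   ;* missing key starts an empty record

   Rec<2> = Rec<2> + 1                         ;* example change: bump a count in field 2

   WRITE Rec TO HashFile, Key                  ;* write the updated record back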
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

hiltsmi,

The hashed file mechanism can handle many concurrent reads and writes to files, just as any database system does.

If one process READs a record that might have been updated by another process (or job), then you need to ensure that all of your WRITEs and READs are done directly against the hashed file(s) and are not cached or loaded into memory.

If you allow caching then you might have stale records read by one process while another has updated the record.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Do NOT use read cache or write cache, and ensure that every read from the hashed file has "lock for update" set. Under these circumstances you will not experience any concurrency issues.
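
In DS BASIC terms, "lock for update" corresponds roughly to READU. A minimal sketch of the same read-modify-write done with an update lock might look like this; the file name, key and field are again placeholders:

   * Sketch only; same placeholder names as in the earlier example.
   OPEN 'CUST.SUMMARY' TO HashFile ELSE
      Call DSLogFatal('Cannot open hashed file CUST.SUMMARY', 'LockedUpdateSketch')
   END

   Key = 'CUST001'

   * READU takes an update lock on the key: a second job doing READU on the
   * same key waits here until this job WRITEs (or RELEASEs), so no update is lost.
   READU Rec FROM HashFile, Key ELSE Rec = ''  ;* the lock is held even if the record is new

   Rec<2> = Rec<2> + 1                         ;* example update

   WRITE Rec TO HashFile, Key                  ;* WRITE stores the record and releases the lock

The stage-level settings described above (no read or write cache, lock for update on every reference lookup) give the equivalent behaviour in a Server job without writing any BASIC.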
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.