Minimum Modulus Issue in Hashed Files

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Amit_111
Participant
Posts: 134
Joined: Sat Mar 24, 2007 11:37 am

Minimum Modulus Issue in Hashed Files

Post by Amit_111 »

Dear All,

I have created a hashed file which stores the primary key and an amount value that I accumulate whenever an instance is found in the source.
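For reference, the update pattern described here looks roughly like the following in DataStage BASIC; the file, key and amount names are purely illustrative, and in a real Server job the Hashed File stage would normally perform this read/add/write itself:

      * Minimal sketch of accumulating an amount per key in a hashed file.
      * HF_CUMUL_AMT, KeyValue and SourceAmount are illustrative names only.
      OPEN "", "HF_CUMUL_AMT" TO HashFile ELSE ABORT 201, "HF_CUMUL_AMT"

      READ Rec FROM HashFile, KeyValue THEN
         Amount = Rec<1>                  ;* field 1 holds the running total
      END ELSE
         Rec    = ""                      ;* no row yet for this key
         Amount = 0
      END
      Rec<1> = Amount + SourceAmount
      WRITE Rec TO HashFile, KeyValue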

The minimum modulus of my file is set to 1, and the strange thing is that after approximately 100,000 records have been processed from the source, the records already in the hashed file are overwritten and the amount field starts again from zero, so the earlier values are lost.

When I increased the minimum modulus, the job worked fine and no records were lost. I made no changes other than increasing the minimum modulus in the job where the hashed file is created.
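A minimum modulus is normally estimated from the expected data volume rather than left at 1. A rough sketch of that arithmetic, assuming around 100,000 rows of about 100 bytes each and the usual dynamic-file defaults (2048-byte group size, 80 per cent split load) - every figure here is an assumption for illustration only:

      * Rough MINIMUM.MODULUS estimate; all numbers below are assumptions.
      Rows      = 100000      ;* expected row count
      AvgBytes  = 100         ;* guessed average record size in bytes
      GroupSize = 2048        ;* default group size for a dynamic hashed file
      LoadPct   = 0.8         ;* default 80% split load
      MinMod    = INT((Rows * AvgBytes) / (GroupSize * LoadPct)) + 1
      PRINT "Suggested MINIMUM.MODULUS : " : MinMod   ;* about 6104 with these figures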

How can this happen? As I understand DataStage, it should create an overflow file; records should not be overwritten and the cumulative value should not be lost. I cannot work out why it is behaving this way.

Please provide your valuable inputs regarding this.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

This shouldn't happen.

How big is the hashed file (or how big is the DATA.30 file within it)?
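One way to get that number, assuming the hashed file was created in the account so it has a VOC entry (called HF_CUMUL_AMT here purely for illustration), is to run ANALYZE.FILE from the engine command line or from a routine; alternatively, look at the sizes of DATA.30 and OVER.30 in the hashed file's directory at operating-system level.

      * Capture the ANALYZE.FILE report for the hashed file.
      * HF_CUMUL_AMT is an illustrative name; the file needs a VOC entry.
      EXECUTE "ANALYZE.FILE HF_CUMUL_AMT" CAPTURING Report
      PRINT Report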

MINIMUM.MODULUS is not an upper limit on the size of the hashed file; it's only a lower limit.

Can you check your design to make sure that the key values aren't being reset somewhere in the design itself?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.