hash file parameters

Post questions here relating to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

parsi_cnu
Charter Member
Posts: 43
Joined: Thu Dec 04, 2003 4:26 pm

hash file parameters

Post by parsi_cnu »

What parameters do I have to give when creating a hashed file if the file size is 1 GB? I mean the minimum modulus, split load, merge load, large record, and record length. I have 1 million records.

Thanks
marsboy
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Find your client CD-ROM and look on it for the 'unsupported utility' HFC, the Hashed File Calculator. That will help you answer questions like this.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The default settings for a hashed file will quite happily handle 1 million records totalling 1 GB.

Minimum modulus will allow you to pre-allocate disk space.

If you set that accurately then split load is extraneous; split load is the threshold (% full) above which the hashed file expands to hold more data.

Merge load is the threshold (% full) below which the file shrinks, so as to hold a reduced amount of data more compactly.

Large record is a threshold (% of a group size) above which record data are stored in a separate buffer, so as to reduce the overall average size of records actually in group buffers.

The above words, other than the first sentence, are taken from one of my (copyright) training manuals.
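
As a rough illustration of how these numbers fit together, here is a back-of-the-envelope sketch in DS Basic. It is only a sketch: it assumes a dynamic (type 30) file with the default 2048-byte group size and the default 80% split load, derives an average record size from the figures in the question (1 GB across 1 million records), and uses a hypothetical file name MYHASH. It also assumes the UniVerse-style CREATE.FILE command is available, as it is from the DataStage engine's TCL prompt.

    * Back-of-the-envelope sizing for a dynamic (type 30) hashed file.
    * Assumptions: default 2048-byte group size, default SPLIT.LOAD of
    * 80%, and the 1 GB / 1 million record figures from the question.
    RecordCount = 1000000
    AvgRecBytes = 1000        ;* roughly 1 GB / 1,000,000 records
    GroupBytes = 2048         ;* default dynamic file group size
    SplitLoad = 0.8           ;* default SPLIT.LOAD (80%)
    TotalBytes = RecordCount * AvgRecBytes
    MinModulus = INT(TotalBytes / (GroupBytes * SplitLoad)) + 1
    PRINT "Suggested MINIMUM.MODULUS: ":MinModulus

    * Pre-create the file with that modulus; MYHASH is a hypothetical
    * name, and the remaining parameters are left at their defaults.
    Cmd = "CREATE.FILE MYHASH 30 MINIMUM.MODULUS ":MinModulus
    EXECUTE Cmd

For the numbers above this works out to a minimum modulus of a little over 610,000 groups. This is essentially the arithmetic that the HFC utility chulett mentions will do for you, with rather more finesse.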
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.