Hash File Issues

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Aravind
Participant
Posts: 16
Joined: Mon Dec 27, 2004 4:17 pm

Hash File Issues

Post by Aravind »

Hi,

I have a job which starts at a fairly good rate of around 120 rows/sec, but as it runs the rate drops to 10-15. The job used to take around 2-4 hours, but on certain days it has run for 6-8 hours. I checked with my DBA for any locks, but as per the DBA everything looks OK.

The job reads from a hash file, does about 10 lookups, and then inserts/updates to the table.
Insert record count: 60,000 on average
Update record count: 250,000 on average

The hash file it reads from is Type 30 (Dynamic) with minimum modulus 1 and group size 1.

Hash file sizes:
31993856 Apr 12 02:46 OVER.30
108572672 Apr 12 16:08 DATA.30

I hope to gain some performance by tuning the hash file properties. Can someone guide me on this?
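
(A rough sizing sketch for context, assuming the 2048 bytes per group that GROUP.SIZE 1 implies for a UniVerse dynamic file:

    108572672 bytes in DATA.30 / 2048 bytes per group = 53,014 groups

So with MINIMUM.MODULUS left at 1, the file has had to split its way up to roughly that many groups as it was loaded, and the 32 MB OVER.30 suggests substantial overflow; both tend to degrade throughput progressively, which would match the falling rows/sec.)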
daignault
Premium Member
Posts: 165
Joined: Tue Mar 30, 2004 2:44 pm

Post by daignault »

There have been a number of posts on this topic.

On your DataStage client CD you will find a tool called HFC.exe (written by Ray Wurlod). It will guide you in how to set up your dynamic file.

Click on your hash file (target) stage and click the "Create file" button; this opens the options for file creation. Then fill in the blanks in the HFC tool and copy the resulting numbers into this window.
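
For reference, the options HFC produces map onto a UniVerse dynamic-file creation command along these lines (a sketch only: the file name and modulus here are illustrative, and HFC computes the real values from your row count and average record size):

    CREATE.FILE MyLookupFile DYNAMIC MINIMUM.MODULUS 53014 GROUP.SIZE 1
    ANALYZE.FILE MyLookupFile

Pre-setting MINIMUM.MODULUS to the expected size allocates the groups up front instead of letting the file split repeatedly as it is loaded; ANALYZE.FILE, run from the DS Engine (UniVerse) shell, lets you confirm the resulting modulus and overflow afterwards.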

Cheers,

Ray D
wnogalski
Charter Member
Posts: 54
Joined: Thu Jan 06, 2005 10:49 am
Location: Warsaw

Post by wnogalski »

This one should help you:
viewtopic.php?t=82343
Regards,
Wojciech Nogalski