
Hash File Issues

Posted: Tue Apr 12, 2005 3:32 pm
by Aravind
Hi,

I have a job which starts at a pretty good rate of around 120 rows/sec, but as the job runs the rate keeps dropping to 10-15 rows/sec. The job used to take around 2-4 hours, but on certain days it has run for 6-8 hours. I checked with my DBA for any locks, but as per the DBA everything looks OK.

The job reads from a hashed file, does about 10 lookups, and then inserts/updates to the table.
Insert record count: on average 60,000
Update record count: on average 250,000

The hashed file it reads from is Type 30 (Dynamic) with minimum modulus 1 and group size 1.

Hashed file sizes:
31993856 Apr 12 02:46 OVER.30
108572672 Apr 12 16:08 DATA.30

I hope to gain some performance by tuning the hashed file properties. Can someone guide me on this?

Posted: Tue Apr 12, 2005 3:37 pm
by daignault
There have been a number of posts on this topic.

On your DataStage client CD you will find a tool called HFC.exe (written by Ray Wurlod). This will guide you in how to set up your dynamic file.

Click on your hashed file (target) stage and click the "Create file" button. This will let you enter the create-file options. Then just fill in the blanks in the HFC tool and copy the numbers across to this window.
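For a rough feel of what HFC works out for you: with a dynamic file, a common rule of thumb is to size MINIMUM.MODULUS so that the data fits in the groups at the split load, i.e. total data bytes divided by (group buffer size x split load). A back-of-the-envelope sketch in Python, using the DATA.30/OVER.30 sizes from the post above; the 2048-byte buffer for group size 1 and the default 80% split load are assumptions on my part, and HFC does this calculation properly from your record counts and sizes:

```python
import math

def estimate_min_modulus(data_bytes, over_bytes, group_size=1, split_load=0.8):
    """Rough MINIMUM.MODULUS estimate for a Type 30 (dynamic) hashed file.

    Assumes a 2048-byte group buffer at group size 1 and the default
    80% split load -- a sketch, not a substitute for HFC.exe.
    """
    group_buffer = 2048 * group_size      # bytes per group
    total = data_bytes + over_bytes       # data currently held in the file
    return math.ceil(total / (group_buffer * split_load))

# Sizes from the original post:
print(estimate_min_modulus(data_bytes=108572672, over_bytes=31993856))
# roughly 85,795 groups -- far from the current minimum modulus of 1
```

The point is only that a minimum modulus of 1 forces the file to split thousands of times as it grows, which is one likely cause of the slowdown; pre-sizing it (via HFC) avoids that.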

Cheers,

Ray D

Posted: Wed Apr 13, 2005 1:23 am
by wnogalski
This one should help you:
viewtopic.php?t=82343