HFC.exe -> need it but don't have the Datastage CD

Posted: Wed Dec 07, 2005 1:57 pm
by ippie02
good afternoon DSers.

I would like to use the HFC application to tune my hash files, but I don't have access to the DataStage CD. Is there another way I can get hold of it?

thank you

Posted: Wed Dec 07, 2005 3:28 pm
by ray.wurlod
That's the only official place it resides. Maybe someone would be good enough to post it in a downloadable area (such as ADN). I don't have access to a DataStage CD at the moment.

Posted: Wed Dec 07, 2005 3:29 pm
by narasimha
Give me your email and I can mail it to you. Zipped, I guess it is only about 29 KB.

Posted: Wed Dec 07, 2005 3:31 pm
by ippie02
hi

epmenard@gmail.com

thank you very much!

ticket still open

Posted: Thu Dec 08, 2005 9:01 am
by ippie02
good morning,

narasimha was kind enough to offer to send the application to me by email, but I haven't received it yet.

If anybody else knows of a way I can get the file, please let me know. Thank you.

EP

got it

Posted: Thu Dec 08, 2005 12:50 pm
by ippie02
I have received the file.

thank you everybody.

Posted: Mon Dec 12, 2005 6:17 am
by rkdatastage
Hi
If you feel comfortable, can you share some information on how effective HFC.exe is and how to use it? How can I estimate the values needed to work out a hash file configuration?
Thanks in advance; awaiting your valuable response.

RK

Posted: Mon Dec 12, 2005 4:01 pm
by ray.wurlod
If you use HFC you will realise that it's quite easy. You enter the total number of records and the average record size - and choose a hashing algorithm from a drop-down list. HFC does the rest.
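
As a rough illustration of the arithmetic it does for you (made-up numbers): 1,000,000 records at an average of 200 bytes is about 200 MB of data. With a separation of 4 (groups of 4 x 512 = 2,048 bytes) that is on the order of 100,000 groups, so the modulo in the generated command will end up somewhere above that once record overheads and free space are allowed for. HFC works all of that out, and picks a sensible modulo, from the two figures you type in.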

Posted: Wed Dec 14, 2005 3:34 am
by rkdatastage
Hi Ray

Thanks a lot for your reply.
Can you tell me, if the hash file is huge, say more than 50 million records or more than 2 GB in size, what type of hash file I should use: static or dynamic? If static, which static algorithm should I use? Static has 17 different types plus one B-tree type, so which one should I choose? Are all 17 static algorithms the same, or do they differ?

Thanks in Advance

RK

Posted: Wed Dec 14, 2005 9:07 am
by chulett
Why would there be different types if they were all the same? :wink:

If you actually use the HFC, you'll see that each static type relates to a specific key pattern. Fill in the other variables, then switch through the key patterns and watch the generated command line to see which file type each one corresponds to.
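
For example (file name, modulo and separation invented for illustration, and the type numbers quoted from memory, so verify against what HFC itself shows): switching the key pattern from general text keys to wholly numeric keys might change the generated line from something like

    CREATE.FILE MyHash 18 4007 4

to something like

    CREATE.FILE MyHash 2 4007 4

with only the file type, the number straight after the file name, moving as you change the pattern.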

Posted: Wed Dec 14, 2005 3:23 pm
by ray.wurlod
Static or dynamic - doesn't make any difference for the total volume of data to be stored. Choose dynamic for ease of maintenance. Choose static for optimum performance IFF (if and only if) the volume of data is precisely known and unchanging.
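
To make the difference concrete (illustrative names and numbers only): a dynamic file is created with just a type, for example

    CREATE.FILE MyHash 30

and looks after its own modulus by splitting and merging groups as data arrives, while a static file is created with a fixed type, modulo and separation worked out from the known volume, for example

    CREATE.FILE MyHash 18 4007 4

and has to be resized by hand (RESIZE) if the volume changes significantly.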

Posted: Tue Feb 28, 2006 1:06 pm
by somu_june
Hi Narasimha,

I have seen your post about converting a 32-bit type 30 hash file to a 64-bit static hash file. I have the same requirement, and I am using DataStage version 7.5.1A. Can I use HFC.exe to determine the file type, modulo and separation for version 7.5.1A? If so, please help me. I don't have HFC.exe with me; if you do, could you mail it to me? I would also like to know what HFC actually does. My mail ID is somu.june@gmail.com

Thanks,
Somaraju

Posted: Tue Feb 28, 2006 2:24 pm
by narasimha
HFC (Hash File Calculator) is used to create static or dynamic hash files.
HFC calculates the parameters required to create a hash file from inputs such as the average record size and the number of records.
It also gives you the total size that the newly created hash file could store.
The key pattern depends on the structure of the data you are storing in the hash file.
The output of HFC is the command used to create the required hash file.
You can use that command to create the hash file either in DataStage or at the OS level.
Note: only 32-bit hash files can be created through the DataStage GUI; if you want 64-bit, run the command at the OS level.
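For example (syntax quoted from memory, so treat this as a sketch and take the exact line from HFC's output): if HFC generates CREATE.FILE MyHash 18 4007 4, you can run the generated command yourself, outside the GUI, with the 64BIT keyword appended, i.e. CREATE.FILE MyHash 18 4007 4 64BIT; an existing 32-bit file can usually be converted in place with something like RESIZE MyHash * * * 64BIT.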
As for HFC.exe, I tried sending it to the OP, but due to tight security the mail did not go through. :(
I will try to upload it to a common place where everybody can access it.

Posted: Tue Feb 28, 2006 4:11 pm
by ray.wurlod
A couple of corrections. HFC does not actually create the hashed file; it generates a command that allows you to create the hashed file either from the DataStage environment (a CREATE.FILE command) or from the operating system environment (a mkdbfile command). It does not calculate the average record size and number of records; it expects you to provide that information, and uses it to estimate the total volume of data to be stored, allowing for storage overheads. The key pattern that you choose from the drop-down list determines the file type (hashing algorithm).
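
As a sketch (names and numbers invented, and the mkdbfile argument order quoted from memory - the line HFC prints is the authoritative one), the same file might be created either way:

    CREATE.FILE MyHash 18 4007 4
    $DSHOME/bin/mkdbfile MyHash 18 4007 4

the first from the DataStage/UniVerse command prompt, the second from the operating system shell in the directory where you want the file to live.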

HFC works with hashed files for all versions of DataStage from 1.0 through 7.5.1A. Since nothing to do with hashed files changes with the Hawk release, it should give correct answers in future releases also.

I have low-priority plans to enhance HFC to be able to generate command options for enabling public or shared caching. But, as they say, there's no money in it, so it has to wait till a time when there's nothing more important for me to do.

Posted: Tue Feb 28, 2006 4:12 pm
by somu_june
Hi Narasimha,

Thanks for explaining what HFC does. As I already stated, my requirement is that I am reading from a DB2 table and writing the records to a hash file. The contents of the DB2 table are not constant: the number of records grows, and after the job runs we load the records back into the same table. What do I have to do to achieve this? Which hash file should I use? I have three key columns, all char. Should I take a static type 18 and resize it, or can I stick with type 30 and increase it to 64-bit? Once again, thanks for trying to upload HFC to a common place; please let me know when you have uploaded it. I am new to DataStage, so can you tell me what command I have to use to create a 64-bit hash file, or can I follow the method you stated in the posts above?

Thanks,
Somaraju.