2GB Limit on Hash File regarding blink error

Posted: Mon Mar 26, 2007 12:01 pm
by ggarze
Even if you choose the option "Use directory path", in which the hashed file is now created outside the project, does the 2GB limit still apply?

Posted: Mon Mar 26, 2007 12:04 pm
by DSguru2B
Yes.

Posted: Mon Mar 26, 2007 12:24 pm
by ravibabu
If your data is more than 2 GB, go to PX jobs. That's good for your project.

Posted: Mon Mar 26, 2007 12:51 pm
by ggarze
Thanks for the responses. Unfortunately we don't have Parallel Extender.

Posted: Mon Mar 26, 2007 12:53 pm
by DSguru2B
Load your data into a work table (temp table) and do a database join. You don't need PX just for handling huge reference data.
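A hedged sketch of this suggestion: stage the reference set into a work table and let the database do the join, instead of a hashed file lookup. Here sqlite3 stands in for whatever target database you actually use, and the table and column names are invented for illustration.

```python
# Sketch: replace a huge hashed-file lookup with a work-table join in the
# database. sqlite3 is only a stand-in; ref_work / input_rows are made up.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE ref_work (cust_id INTEGER PRIMARY KEY, region TEXT);
    INSERT INTO ref_work VALUES (1, 'EMEA'), (2, 'APAC');
    CREATE TABLE input_rows (cust_id INTEGER, amount REAL);
    INSERT INTO input_rows VALUES (1, 10.0), (2, 20.0), (3, 5.0);
""")

# The hashed-file lookup becomes a LEFT JOIN; unmatched keys yield NULL,
# just as a failed hashed-file lookup yields no reference columns.
rows = con.execute("""
    SELECT i.cust_id, i.amount, r.region
    FROM input_rows i LEFT JOIN ref_work r ON r.cust_id = i.cust_id
    ORDER BY i.cust_id
""").fetchall()
for row in rows:
    print(row)
```

There is no 2 GB ceiling on the work table, since the database's own storage limits apply instead of the hashed file's 32-bit addressing.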

Posted: Mon Mar 26, 2007 1:04 pm
by kcbland
The 2 GB limit applies to either the data or the overflow file. You can find that your hashed file uses more than 2 GB in total if you add the DATA.30 and the OVER.30 together. But if either one reaches 2.2 GB, the file will corrupt and the job loading it will blow up. You should try to stay under 2.2 GB, but I'm not recommending an inefficient file size as a means to stretch the capabilities of 32-bit addressing.
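A minimal sketch of that size check, assuming a pathed dynamic hashed file whose directory holds DATA.30 and OVER.30 (the directory name and warning threshold here are made up for the example):

```python
# Sketch: sum DATA.30 and OVER.30 for a pathed hashed file and warn as the
# total approaches the 32-bit ceiling. Demo files are faked so this runs
# anywhere; point hashed_file_bytes() at a real hashed file directory instead.
import os
import tempfile

LIMIT = 2 * 1024**3  # ~2 GB, the signed 32-bit addressing ceiling

def hashed_file_bytes(path):
    """Return (data, over, total) byte counts for a dynamic hashed file dir."""
    data = os.path.getsize(os.path.join(path, "DATA.30"))
    over = os.path.getsize(os.path.join(path, "OVER.30"))
    return data, over, data + over

# Demo setup: fake a tiny hashed file on disk.
demo = tempfile.mkdtemp()
with open(os.path.join(demo, "DATA.30"), "wb") as f:
    f.write(b"x" * 1024)
with open(os.path.join(demo, "OVER.30"), "wb") as f:
    f.write(b"x" * 512)

data, over, total = hashed_file_bytes(demo)
print(f"DATA.30={data} OVER.30={over} total={total} (limit {LIMIT})")
if data >= LIMIT or over >= LIMIT:
    print("WARNING: a component file is at the 32-bit limit")
```

Note the check is per component file, matching the point above that either DATA.30 or OVER.30 hitting the ceiling corrupts the file, even if the other is small.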

Posted: Mon Mar 26, 2007 2:05 pm
by lfong
If you do not have Parallel Extender (EE) you can always create a 64-bit hashed file.

Posted: Mon Mar 26, 2007 5:31 pm
by ray.wurlod
Hashed files are created by default using 32-bit internal pointers. That is the source of the 2GB limit - it is the largest address that can be accessed with a signed 32-bit integer.

You can create hashed files with 64-bit internal pointers, or convert existing hashed files to 64-bit internal pointers, to overcome this limit. The maximum size of such a hashed file is theoretically around 9 million TB, though most operating systems will not let you create a file this size.
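The arithmetic behind those two figures, as a quick worked check:

```python
# Signed 32-bit pointers address 2^31 bytes (the ~2 GB limit); signed 64-bit
# pointers address 2^63 bytes, which works out to roughly 9 million TB.
limit_32 = 2**31  # bytes addressable with a signed 32-bit internal pointer
limit_64 = 2**63  # bytes addressable with a signed 64-bit internal pointer

print(limit_32, "bytes =", limit_32 / 1024**3, "GB")
print(limit_64, "bytes =", round(limit_64 / 10**12 / 10**6, 1), "million TB")
```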

Posted: Tue Mar 27, 2007 3:29 am
by Cr.Cezon
You can use a 64-bit hashed file by resizing the hashed file created by the job.

Execute the TCL command:
RESIZE HashFile * * * 64
before writing to the hashed file.

regards,
Cristina.

Posted: Tue Mar 27, 2007 4:13 am
by ravibabu
Hi,

Please split the file using the LP (Link Partitioner) stage. I think this can be helpful to you.

Please revert back to me if I am wrong.

Posted: Tue Mar 27, 2007 6:54 am
by ray.wurlod
You're wrong. Splitting the data stream will not overcome any 2GB storage limit unless two separate Hashed File stages are used, referring to two separate hashed files. Even then, there's no guarantee that a key being looked up will be in the correct processing stream.