2GB Limit on Hash File regarding blink error
Even if you choose the "Use directory path" option, so that the hashed file is created outside the project, does the 2GB limit still apply?
The 2 GB limit applies to either the data file or the overflow file individually. You may find that your hashed file uses more than 2 GB in total if you add DATA.30 and OVER.30 together. But if either file reaches 2.2 GB, the file will corrupt and any job loading it will abort. You should try to stay under 2.2 GB, but I'm not recommending an inefficient file size as a means to stretch the capabilities of 32-bit addressing.
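As a quick sanity check, the two component files can be measured from the shell and compared against the 32-bit offset boundary (2^31 bytes, roughly 2.1 GB). A minimal sketch; the hashed-file path below is a placeholder, and only the DATA.30/OVER.30 names come from the post above:

```shell
#!/bin/sh
# Compare a byte count against the 32-bit file-offset boundary (2^31 bytes).
LIMIT=2147483648

check_size() {
  # $1 = size in bytes; prints OVER once the 32-bit boundary is reached
  if [ "$1" -ge "$LIMIT" ]; then
    echo "OVER"
  else
    echo "OK"
  fi
}

# Usage against a real hashed file (path is a placeholder):
# for f in /path/to/MyHashedFile/DATA.30 /path/to/MyHashedFile/OVER.30; do
#   printf '%s: %s\n' "$f" "$(check_size "$(wc -c < "$f")")"
# done
```

Note that the check is per file: either component crossing the boundary is enough to corrupt the hashed file, regardless of what the two sizes sum to.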
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Hashed files are created by default using 32-bit internal pointers. That is the source of the 2GB limit - it is the largest address that can be accessed with a signed 32-bit integer.
You can create hashed files with 64-bit internal pointers, or convert existing hashed files to 64-bit internal pointers, to overcome this limit. The maximum size of such a hashed file is theoretically around 9 million TB, though most operating systems will not let you create a file this size.
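For reference, the conversion described above is typically done at the engine's TCL prompt. A sketch only; the hashed file name is a placeholder, and exact syntax can vary by UniVerse/DataStage release:

```
RESIZE MyHashedFile * * * 64BIT
```

The three asterisks keep the file's existing type, modulus, and separation unchanged; only the internal pointer size is converted from 32-bit to 64-bit.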
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
You're wrong. Splitting the data stream will not overcome the 2GB storage limit unless two separate Hashed File stages are used, referring to two separate hashed files. Even then, there's no guarantee that a key being looked up will be in the correct processing stream.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.