Hi Gurus,
I have a 64-bit hashed file, created with the command:
RESIZE hashedfile_name 18 13671881 2 64BIT
After loading 800,000 records, the job aborts with the warning "add_to_heap() - Unable to allocate memory". I have sufficient space on the disk.
Any help is appreciated.
Thank you
Turn Write cache off. This hashed file (14GB if no overflow) will never fit even in the maximum possible write cache (999MB).
The message is simply alerting you to this fact.
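The 14 GB figure can be sanity-checked from the RESIZE parameters themselves. Assuming the usual UniVerse convention that separation is measured in 512-byte disk blocks, the minimum file size (no overflow) is modulus × separation × 512 bytes. A quick sketch:

```python
# Rough on-disk size of a UniVerse/DataStage hashed file, assuming
# separation is in 512-byte disk blocks and there is no overflow.
def hashed_file_bytes(modulus: int, separation: int) -> int:
    return modulus * separation * 512

# Parameters from the RESIZE command in the question:
# RESIZE hashedfile_name 18 13671881 2 64BIT
size = hashed_file_bytes(13671881, 2)
print(f"{size / 1e9:.1f} GB")  # about 14.0 GB - far beyond the 999 MB write cache
```

This is only an estimate of the primary (group) space; any overflow makes the file larger still, so the conclusion is unchanged.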
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Possibly. It rather depends on what the job does. But there's no flexibility with a hashed file of this size. You've already indicated in another post that you need every column and every row that you're loading into the hashed file - but you might like to verify this assertion.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.