
add_to_heap() - Unable to allocate memory

Posted: Mon Jun 18, 2007 12:23 pm
by gsym
Hi Gurus,
I have a 64-bit hashed file, created using the command
RESIZE hashedfile_name 18 13671881 2 64BIT.
After loading 800,000 records, the job aborts with the warning "add_to_heap() - Unable to allocate memory". I have sufficient space on the disk.

Any help is appreciated.

Thank you

Posted: Mon Jun 18, 2007 12:40 pm
by chulett
Memory <> disk space. Memory = RAM. If you have Write Cache enabled, turn it off, or try bumping up its value in the Administrator.

Posted: Mon Jun 18, 2007 2:54 pm
by ray.wurlod
Turn Write cache off. This hashed file (14GB if no overflow) will never fit even in the maximum possible write cache (999MB).

The message is simply alerting you to this fact.
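As a rough sketch of the arithmetic behind that 14GB figure: in UniVerse-style hashed files the separation is measured in 512-byte units, so the minimum file size (before any overflow) is modulo × separation × 512 bytes. The variable names below are illustrative only, taken from the RESIZE command quoted earlier, not from any DataStage API.

```python
# Estimate the minimum on-disk size of the hashed file from the
# RESIZE parameters quoted above (RESIZE hashedfile_name 18 13671881 2 64BIT).
modulo = 13_671_881            # number of groups
separation = 2                 # group size in 512-byte units
group_size = separation * 512  # 1024 bytes per group

min_file_bytes = modulo * group_size  # data area only, no overflow
print(f"{min_file_bytes / 1e9:.1f} GB")  # → 14.0 GB
```

Since the maximum write cache is 999MB, a file of this size can never be fully cached, which is why turning the cache off is the right move here.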

Posted: Tue Jun 19, 2007 7:49 am
by gsym
Thanks Craig and Ray,

Okay, I will turn Write cache off.
Should I set the write cache size to 999MB?
Is there any other way to improve the performance of this job?

Thanks

Posted: Tue Jun 19, 2007 2:38 pm
by ray.wurlod
Possibly. It rather depends on what the job does, but there's no flexibility with a hashed file of this size. You've already indicated in another post that you need every column and every row that you're loading into the hashed file - but you might like to verify that assertion.