
add_to_heap() - Unable to allocate memory

Posted: Tue Oct 25, 2005 12:12 pm
by Titto
We are getting a warning message in version 7.5:
XXXHashFiles.ABC: add_to_heap() - Unable to allocate memory
It started appearing when we upgraded from 7.1 to 7.5; the same job ran without this warning in 7.1. Has anyone come across this issue in 7.5?

Any help is appreciated.

Thanks,

Posted: Tue Oct 25, 2005 12:20 pm
by kcbland
Use the Search facility. Paste "add_to_heap" into the box and select exact match. You will find plenty of discussion about this error. Short story: the job either ran out of write cache or ran out of disk space.

Since you upgraded, perhaps your project default write/read cache settings are now different.
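Two quick things worth checking from the command line (a hedged sketch; `PROJECT_DIR` and the hashed-file name pattern are assumptions for illustration, not details from this thread):

```shell
# Hypothetical quick checks when add_to_heap() appears.
# PROJECT_DIR should point at the DataStage project directory that
# holds the hashed files; it defaults to the current directory here
# so the sketch runs anywhere.
PROJECT_DIR="${PROJECT_DIR:-.}"

# 1. Is the filesystem holding the hashed files short on space?
df -k "$PROJECT_DIR"

# 2. How large are the hashed files themselves? A dynamic (type 30)
#    hashed file's DATA.30/OVER.30 files grow as rows are written.
du -sk "$PROJECT_DIR"/*HashFile* 2>/dev/null || true
```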

Posted: Tue Oct 25, 2005 1:06 pm
by Titto
Thank you!
Thank you!
I searched the forum but did not find any version-related information, which is why I posted this. I need to check with the admin to compare the previous settings with the current ones.

Thanks,

Posted: Tue Oct 25, 2005 1:55 pm
by ray.wurlod
It may also be the case that you're trying to get more data into the hashed file than formerly. That becomes the more likely explanation if you find that the cache sizes have not been changed.

Posted: Wed Oct 26, 2005 9:04 am
by Titto
Hi Ray,

We did not change any parameters in that job. We are using an Array Size of 50000 in the Oracle (ORAOCI9) stage, with "Allow Stage Write Cache" checked in the Hashed File stage.
Do you think 50000 is too large, and does "Allow Stage Write Cache" add overhead in the Hashed File stage?
What are the advantages of using "Allow Stage Write Cache" in a Hashed File stage?
Here is my job flow:

Code:

Oracle Stage ------> Transformer -------> Hashed File Stage
(1 query)            (3 constraints)      (3 hashed files)
Thanks

Posted: Wed Oct 26, 2005 9:24 am
by gpatton
Did the memory configuration parameters in the uvconfig file change when you upgraded from 7.1 to 7.5?

Posted: Wed Oct 26, 2005 10:34 am
by Titto
How do I check that?
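For anyone landing here later: the uvconfig file typically lives in the engine directory ($DSHOME), `bin/smat -t` prints the tunables currently in effect, and `bin/uvregen` must be run after editing uvconfig for changes to take hold. If a copy of the 7.1 uvconfig was saved before the upgrade, a diff will show what changed. A minimal sketch, using made-up sample files (the tunable values below are illustrative, not real settings):

```shell
# Illustrative only: simulate an old and a new uvconfig with made-up
# tunables, then diff them to spot what an upgrade changed.
cat > uvconfig.71.bak <<'EOF'
MFILES 100
T30FILE 200
EOF
cat > uvconfig <<'EOF'
MFILES 150
T30FILE 200
EOF

# On a real server it would be something like:
#   cd $DSHOME
#   bin/smat -t                      # show tunables currently in effect
#   diff /path/to/uvconfig.71.bak uvconfig
diff uvconfig.71.bak uvconfig || true
```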