add_to_heap() - Unable to allocate memory

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Titto
Participant
Posts: 148
Joined: Tue Jun 21, 2005 7:49 am

add_to_heap() - Unable to allocate memory

Post by Titto »

We are getting a warning message in version 7.5:
XXXHashFiles.ABC: add_to_heap() - Unable to allocate memory
It started appearing when we upgraded the system from 7.1 to 7.5; the same job ran without this warning in 7.1. Has anyone come across this issue in version 7.5?

Any help is appreciated.

Thanks,
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Use the Search facility. Paste "add_to_heap" in the box and select exact match. You can read a lot of discussion about this error. Short story: you either ran out of write cache or ran out of disk space.

Since you upgraded, perhaps your project default write/read cache settings are now different.
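The disk-space half of the diagnosis above is easy to verify from the shell. A minimal sketch, assuming a hypothetical path to wherever the hashed files are written (substitute your real project directory):

```shell
# Check free disk space where the hashed files live.
# HASHDIR is a placeholder -- point it at your actual project/hashed file path.
HASHDIR=${HASHDIR:-/tmp}
df -Pk "$HASHDIR" | awk -v d="$HASHDIR" 'NR==2 { print "Free KB in " d ": " $4 }'
```

If the free space is healthy, the write-cache side (project defaults in DataStage Administrator) becomes the more likely culprit.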
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Titto
Participant
Posts: 148
Joined: Tue Jun 21, 2005 7:49 am

Post by Titto »

Thank you!
I searched the forum but did not find any version-related information, so I posted this. I need to check with the admin to compare the previous settings with the current ones.

Thanks,
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

It may also be the case that you're trying to get more data into the hashed file than formerly. That becomes the more likely explanation if you find that the cache sizes have not been changed.
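The data-volume explanation can also be checked directly: a dynamic (type 30) hashed file is an operating-system directory containing DATA.30 and OVER.30, so its growth is visible with ordinary tools. A sketch, using a hypothetical path:

```shell
# Report the on-disk size of a dynamic hashed file (a directory holding
# DATA.30 and OVER.30). HF is a placeholder -- use your real hashed file path.
HF=${HF:-.}
du -sk "$HF"
# For non-64-bit hashed files, watch for DATA.30 or OVER.30 nearing the 2 GB limit:
ls -l "$HF"/DATA.30 "$HF"/OVER.30 2>/dev/null || true
```

Comparing this size before and after the upgrade (or between runs) shows whether the job is simply pushing more data through the hashed file than it used to.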
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Titto
Participant
Posts: 148
Joined: Tue Jun 21, 2005 7:49 am

Post by Titto »

Hi Ray,

We did not change any parameters in that job. We are (and have been) using 50000 as the Array Size in the Oracle (ORAOCI9) stage, with "Allow Stage Cache" checked in the hashed file stage.
Do you think 50000 is too large, and does "Allow Stage Cache" add overhead at the hashed file stage?
What are the advantages of using "Allow Stage Cache" in a hashed file stage?
Here is my job flow:

Code: Select all

Oracle Stage ------> Transformer-------> Hash File Stage
(1 Query)               (3 constraints)       (3 hash files)
Thanks
gpatton
Premium Member
Premium Member
Posts: 47
Joined: Mon Jan 05, 2004 8:21 am

Post by gpatton »

Did the memory configuration parameters in the uvconfig file change when you upgraded from 7.1 to 7.5?
Titto
Participant
Posts: 148
Joined: Tue Jun 21, 2005 7:49 am

Post by Titto »

How do I check that?
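To answer the open question: the engine tunables live in the uvconfig file under $DSHOME (the DataStage engine directory), and `bin/smat -t` reports the values the running engine is actually using; after editing uvconfig you must run `bin/uvregen` and restart the engine. A sketch of pulling out the memory-related tunables; the commands below grep a small illustrative sample so they are self-contained, and the parameter names and values are examples, not recommendations:

```shell
# The real file is $DSHOME/uvconfig; we build a tiny sample here so the
# grep is runnable anywhere. Values are illustrative only.
cat > /tmp/uvconfig.sample <<'EOF'
MFILES 50
T30FILE 200
GLTABSZ 75
RLTABSZ 75
EOF
grep -E '^(MFILES|T30FILE|GLTABSZ|RLTABSZ)' /tmp/uvconfig.sample

# On a real system (paths assume a default engine layout):
#   cd "$DSHOME" && grep -E '^(MFILES|T30FILE|GLTABSZ|RLTABSZ)' uvconfig
#   bin/smat -t     # shows the values currently in effect
#   # after editing uvconfig: bin/uvregen, then restart the engine
```

Comparing the 7.1 and 7.5 copies of uvconfig side by side would show whether the upgrade changed any of these settings.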