Heap error

Post questions here related to DataStage Server Edition in such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Heap error

Post by Xanadu »

Hello,
I am getting this warning when trying to write to a hashed file:

add_to_heap() - Unable to allocate memory


But the job continues to run (the table has about 18 million rows). It gives me this warning but continues to load rows into the hashed file. Can anyone shed light on this? This is on an AIX machine.
Job structure :

DRS stage (DB2) --> IPC --> Hash file

(This is a 4-processor machine with 2 processors allocated to Ascential, and a 128 KB cache. I put 256 earlier, but when I got this warning I thought it was because of the bigger cache size and changed it to 128; I am still getting this warning.) I am getting a speed of about 1500 rows/sec. Is that unreasonable? I was getting better speeds on a single-processor Windows box with no inter-process row buffering, though.

I was going to change the hashed file setting from 32-bit to 64-bit but refrained, as this same job ran successfully on a Windows box.
Actually, I have the hashed file in the project on the Windows box. I initially thought I would create the hashed file directories using create.file <filename> in Admin and move the type.30, data.30, and over.30 files into this directory on AIX from Windows.
But I was not sure I was doing the right thing, so I started this job on AIX.

Any input is greatly appreciated.
Thanks
-Xan
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

You ran out of disk space, and yes, the job fails to blow up: it keeps running instead of aborting. Just kill it.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

kcbland wrote:You ran out of disk space, and yes the job fails to blow up. Just kill it.
You mean the job just shows that it is writing the data, but it isn't?
Man, that is bad news. I just checked the disk space; in fact, the size is constantly increasing... If the disk space is full, why is the size increasing (though it's giving me the warnings...)?
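Whether the disk really is full, and whether the hashed file is still growing, can be checked from the shell. A minimal sketch (the project path and the file name MyHashFile are assumptions; substitute your own):

```shell
# Sketch: check free space and hashed-file growth. The path is an
# assumption -- run from (or point PROJECT at) your DataStage project.
PROJECT=.

# Free space, in KB, on the filesystem holding the project (AIX: df -k)
df -k "$PROJECT"

# A type 30 (dynamic) hashed file is a directory whose DATA.30 and
# OVER.30 component files grow as rows are written; repeat this to see
# whether writes are really landing on disk:
# ls -l "$PROJECT/MyHashFile"
```

If `df -k` reports the filesystem at or near 100%, the "growth" you see may just be overflow activity that can never complete.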

Is it a good idea to just copy the already existing hashed file from the Windows box to AIX?

-Xan
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

No, you can't copy the file.
Kenneth Bland

ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

While there are tools for moving hashed files between operating systems, it's much easier to have DataStage create them in the new environment than to learn the tools.

To save anyone asking, the tools include uvbackup/uvrestore and format.conv/fnuxi with -export and -import options. The answer to the next question is "no".
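Following that advice, the re-create step the original poster mentioned would look roughly like this from the Administrator's Command window, once the project is selected. This is a sketch: MyHashFile is a placeholder name, and 30 is the dynamic file type implied by the DATA.30/OVER.30 component files; verify the exact syntax on your release.

```text
* MyHashFile is a placeholder; 30 requests a dynamic (type 30) hashed file
CREATE.FILE MyHashFile 30
```

Then rerun the load job on AIX so DataStage repopulates the file natively, rather than copying component files across operating systems.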
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.