Heap Allocation Failed

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

kittu.raja
Premium Member
Posts: 175
Joined: Tue Oct 14, 2008 1:48 pm

Heap Allocation Failed

Post by kittu.raja »

Hi,
I am getting an error saying that "scd_Application_auto_dim,0: Caught exception from runLocally(): APT_BadAlloc: Heap allocation failed.."
In this job I am reading two datasets, passing them into a Slowly Changing Dimension stage, and loading the result into a table.
I searched the forum and ran ulimit -a in a before-job subroutine; these are the values I am getting.

*** Output from command was: ***
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1048576
stack(kbytes) 131072
memory(kbytes) unlimited
coredump(blocks) 4194303
nofiles(descriptors) 8192

Can anybody help me? Do I need to change anything to solve this issue?

Thanks,
Rajesh Kumar
keshav0307
Premium Member
Posts: 783
Joined: Mon Jan 16, 2006 10:17 pm
Location: Sydney, Australia

Post by keshav0307 »

Did you try a search? It's a very common problem:

viewtopic.php?t=109330
kittu.raja
Premium Member
Posts: 175
Joined: Tue Oct 14, 2008 1:48 pm

Post by kittu.raja »

keshav0307 wrote:Did you try a search? It's a very common problem:

viewtopic.php?t=109330
How do I view my hard limit and soft limit values using ulimit?
Rajesh Kumar
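In ksh and most POSIX-style shells (assuming that is what your DataStage server uses), the -S and -H flags select which set ulimit displays:

ulimit -Sa    # soft limits: what a process actually runs with ("ulimit -a" alone usually shows these)
ulimit -Ha    # hard limits: the ceilings a non-root user may raise the soft limits to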
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

In the before-job routine, do an execute shell of "ulimit -a" and you will see the run-time values displayed in your log file.
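For example, with the built-in ExecSH routine (the exact property labels may differ slightly by version):

Before-job subroutine:  ExecSH
Input value:            ulimit -a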
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

They've already done that. Aren't the "hard" values those configured in the kernel? There are different ways to find those; it depends on what flavor of UNIX you are running.
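As a rough starting point, where the exact file and command names vary by OS release, so treat these as hints rather than gospel:

HP-UX:    kmtune -q maxdsiz             (11i v1 and earlier)
          kctune maxdsiz                (11i v2 and later)
Solaris:  grep rlim /etc/system
AIX:      cat /etc/security/limits
Linux:    cat /etc/security/limits.conf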
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

I should have read the whole thread, not just the last post.

Kittu - which UNIX are you running? Each flavour has a different way of specifying the hard and soft limits.
kittu.raja
Premium Member
Posts: 175
Joined: Tue Oct 14, 2008 1:48 pm

Post by kittu.raja »

ArndW wrote:I should have read the whole thread, not just the last post.

Kittu - which UNIX are you running? Each flavour has a different way of specifying the hard and soft limits.
Hi,

I am running on HP-UX. I found the hard limits by using ulimit -Ha.
These are the values I got:
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1048576
stack(kbytes) 131072
memory(kbytes) unlimited
coredump(blocks) unlimited

Can you tell me, is there anything I need to tell my admin to change?
Rajesh Kumar
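Worth noting: your soft and hard data limits are both 1048576 kB (1 GB), so raising the soft limit will not buy anything here. On HP-UX the hard ceiling for the data segment (heap) usually comes from the maxdsiz kernel tunable, so that is what to ask the admin about. A sketch of the commands involved, where the exact syntax depends on the HP-UX release and the 2 GB figure is only an example value:

kctune maxdsiz maxdsiz_64bit     # query current values (11i v2 and later)
kctune maxdsiz=2147483648        # example: raise the 32-bit ceiling to 2 GB
kmtune -q maxdsiz                # query on older releases; changes made with
                                 # kmtune -s may require a kernel rebuild and reboot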
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Didn't you get other messages along with this one? One like this, for instance:

The current soft limit on the data segment (heap) size (XXX) is less than the hard limit (YYY), consider increasing the heap size limit
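If that message does appear, the usual remedy is to raise the soft data limit in the environment that spawns the jobs, commonly $DSHOME/dsenv. A minimal sketch, assuming a ksh/POSIX shell sources it before the engine starts:

# Lift the soft data-segment (heap) limit up to the hard limit.
ulimit -d `ulimit -Hd` 2>/dev/null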
-craig

"You can never have too many knives" -- Logan Nine Fingers
kittu.raja
Premium Member
Posts: 175
Joined: Tue Oct 14, 2008 1:48 pm

Post by kittu.raja »

chulett wrote:Didn't you get other messages along with this one? One like this, for instance:

The current soft limit on the data segment (heap) size (XXX) is less than the hard limit (YYY), consider increasing the heap size limit
No, I did not get that. I am also looking for that.
Rajesh Kumar
kittu.raja
Premium Member
Posts: 175
Joined: Tue Oct 14, 2008 1:48 pm

Post by kittu.raja »

chulett wrote:Didn't you get other messages along with this one? One like this, for instance:

We are running the same job with the same files in PreProd and there is no abort; the jobs run fine there. So why do they give this error here?
I checked the ulimit parameters and they are the same for both environments.
Can you tell me what the cause may be?

Thanks,
Rajesh Kumar
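Since the limits match, the next step could be to capture the full runtime environment on both servers and diff them; the file names below are only illustrative:

( ulimit -Sa; ulimit -Ha; env | sort ) > /tmp/limits.`hostname`
# copy both files to one machine, then:
diff /tmp/limits.prodhost /tmp/limits.preprodhost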