XML Stage - Heap allocation error

Posted: Mon Nov 23, 2009 1:27 pm
by bluezed28
When I tried to run a large file (> 200 MB) through the XML Plugin Stage, I got this error:

APT_CombinedOperatorController,0: Fatal Error: Throwing exception: APT_BadAlloc: Heap allocation failed.


I confirmed that all the hard and soft ulimits were set to unlimited, but the 'data' resource still reported being constrained. I racked my brain for days on this and IBM couldn't find a solution, but I finally found the problem and the fix:

In the dsenv file there are a couple of lines near the bottom:
LDR_CNTRL=MAXDATA=0x60000000@USERREGS
export LDR_CNTRL

You need to comment these lines out; otherwise the number of memory segments allocated is 6 (per the 0x60000000 setting) rather than unlimited. I can't say what the downsides are, but this is what I needed to do to get my jobs working in the test environment.
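For reference, the change in dsenv is just commenting out those two lines so the AIX loader falls back to its default data-segment policy. A minimal sketch of the edited section (assuming a standard AIX dsenv; the exact hex value may differ on your system):

```shell
# In $DSHOME/dsenv on AIX, comment out the LDR_CNTRL lines so the
# loader no longer caps the process data area at 6 x 256 MB segments:
# LDR_CNTRL=MAXDATA=0x60000000@USERREGS
# export LDR_CNTRL
```

After editing dsenv you would normally restart the DataStage engine so running processes pick up the new environment.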

Posted: Mon Nov 23, 2009 2:17 pm
by chulett
It's been mentioned here before, but one thing to keep in mind is the fact that it is an AIX-specific solution.

Posted: Mon Nov 23, 2009 3:21 pm
by eostic
...and be careful... you may not get a whole lot larger than what you have already. At about 500 MB, things hit limits in the current Stage.

Posted: Mon Nov 23, 2009 3:26 pm
by chulett
True, there's still an upper limit of around 400 or 500 MB as noted, and it seems to vary by platform. You really should be getting many small XML files rather than a small number of really big ones. :wink:

Posted: Thu May 02, 2013 8:20 am
by skp
But if you comment it out in the dsenv file, it will affect the whole project, right?
Is there any way to apply this to only one job and not at the project level?

Thanks,
Pani Kumar

Posted: Thu May 02, 2013 3:06 pm
by ray.wurlod
You can bring the environment variable into the job as a parameter and change its default value to the special token $UNSET.
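The point of the $UNSET default is that the variable is removed from that one job's environment instead of being given a value, so only that job escapes the project-level setting. As a generic (not DataStage-specific) way to confirm what a process actually sees, you can check its environment from a shell:

```shell
# Report whether LDR_CNTRL is present in the current environment.
# A job launched with the parameter defaulted to $UNSET should
# report "unset" here, while other jobs still inherit the dsenv value.
if env | grep -q '^LDR_CNTRL='; then
    echo "LDR_CNTRL is set: $(env | grep '^LDR_CNTRL=')"
else
    echo "LDR_CNTRL is unset"
fi
```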

Posted: Fri May 03, 2013 10:28 pm
by skp
Hi Ray,
Could you tell me which environment variable we need to use at the job level?

Posted: Fri May 03, 2013 10:48 pm
by chulett
The one mentioned in the first post.