
Aggregator failed

Posted: Mon Apr 23, 2007 1:55 am
by Cr.Cezon
Hello,

I have a parallel job with one Aggregator stage that aggregates 5,000,000 records.
It fails with the error:
Aggregator_30,0: Failure during execution of operator logic.
Aggregator_30,0: Fatal Error: Throwing exception: APT_BadAlloc: Heap allocation failed.
Before that there is a warning in the log:
Aggregator_30,0: The current soft limit on the data segment (heap) size (2147483645) is less than the hard limit (2147483647), consider increasing the heap size limit

It seems to be a size problem.

I have increased the kernel data segment parameter as root, but it doesn't help.

Can someone help me?

regards,
Cristina

Re: Aggregator failed

Posted: Mon Apr 23, 2007 4:35 am
by sudeepmantri
Hi Cristina,
Whenever u perform Aggregation based on the key fields, u also make a Heap sort based on the key fields correct. For this the system internally creates a Heap table. For Larger volume of data this table grows in size exponentially, cause of which u might run into the issue. Ask ur admin to increase the size of the heap allocation.

Alternatively, you can try this: open the Aggregator stage you are using and go to the Stage ---> Properties tab. Under the Options node, find the Method property and change it from Hash to Sort. This will do, provided the input is sorted on the grouping keys.

Thanks and regards :)
Sudeep

Posted: Mon Apr 23, 2007 4:38 am
by nick.bond
You're hitting the 2 GB data segment (heap) limit imposed on you by the UNIX setup.

I think this is set through ulimit. I can't remember now whether it is just set up once or whether it is specified in dsenv, and I don't have a system to look at right now. I will have to check in the morning unless someone else answers first.

Posted: Mon Apr 23, 2007 6:34 am
by chulett
AFAIK, only an SA can change the 'hard limit' for something like that. Seeing as the soft limit is only 2 bytes below the hard limit, raising it via ulimit won't be possible until the hard limit itself is raised.

Posted: Mon Apr 23, 2007 3:53 pm
by nick.bond
Have a look at this link, which explains how to check and change the soft and hard limits.

http://www.ss64.com/bash/ulimit.html

As Craig said, you probably need to get your SA to do this for you.
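To make the soft/hard distinction concrete, here is a small sketch using Python's standard resource module; it reads the same RLIMIT_DATA pair that appears in the warning message. This is only an illustration of the mechanism, not part of any DataStage job; the bash equivalents are `ulimit -Sd` and `ulimit -Hd`.

```python
import resource

# Inspect the soft and hard limits on the data segment (heap) size --
# the same pair of numbers reported in the DataStage warning message.
soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
print(f"soft={soft} hard={hard}")

# An unprivileged process may raise its own soft limit up to the hard
# limit; raising the hard limit itself requires root (i.e. your SA).
if soft != resource.RLIM_INFINITY and (
    hard == resource.RLIM_INFINITY or soft < hard
):
    resource.setrlimit(resource.RLIMIT_DATA, (hard, hard))
    print("soft limit raised to the hard limit")
```

Since the soft limit here is already only 2 bytes below the hard limit, the raise does nothing useful on Cristina's box until the SA lifts the hard limit.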

Posted: Mon Apr 23, 2007 6:25 pm
by ray.wurlod
If you can pre-sort the data by the grouping columns, and change the Aggregator method from Hash to Sort, you should not encounter any memory issues.
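Ray's point can be illustrated with a small, hypothetical Python sketch (the data and field names are invented for illustration): when the input arrives already sorted on the grouping columns, each group can be reduced and emitted as soon as the key changes, so memory use is bounded by one group at a time rather than by a table of all distinct keys, which is what the Hash method must hold in the heap.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical sample rows: (group_key, value). In the real job these
# would arrive pre-sorted on the grouping columns from a Sort stage.
rows = [("A", 1), ("A", 2), ("B", 5), ("B", 7), ("C", 3)]

# Sort-method aggregation: because the input is sorted on the key,
# groupby yields each group once, and it can be summed and discarded
# before the next group is read -- constant memory per group.
totals = {key: sum(v for _, v in grp)
          for key, grp in groupby(rows, key=itemgetter(0))}
print(totals)  # {'A': 3, 'B': 12, 'C': 3}
```

This is why the Sort method sidesteps the heap exhaustion: nothing proportional to the number of distinct keys is ever kept in memory.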

Posted: Tue Apr 24, 2007 9:08 am
by Cr.Cezon
Thanks a lot