Aggregator failed

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Cr.Cezon
Participant
Posts: 101
Joined: Mon Mar 05, 2007 4:59 am
Location: Madrid

Aggregator failed

Post by Cr.Cezon »

Hello,

I have a parallel job with one Aggregator that aggregates 5,000,000 records.
It fails with the error:
Aggregator_30,0: Failure during execution of operator logic.
Aggregator_30,0: Fatal Error: Throwing exception: APT_BadAlloc: Heap allocation failed.
Before that there is a warning in the log:
Aggregator_30,0: The current soft limit on the data segment (heap) size (2147483645) is less than the hard limit (2147483647), consider increasing the heap size limit

It seems to be a problem with size.

I have increased the kernel "data" parameter as root and it doesn't work.

Can someone help me?

regards,
Cristina
sudeepmantri
Participant
Posts: 54
Joined: Wed Oct 25, 2006 11:07 pm
Location: Hyderabad

Re: Aggregator failed

Post by sudeepmantri »

Hi Cristina,
Whenever you perform an aggregation based on key fields, the system internally builds a hash table on those key fields. For larger volumes of data this table grows with the number of distinct groups, which is why you might be running into this issue. Ask your admin to increase the heap allocation limit.

Otherwise, you can try this: open the Aggregator stage you are using and go to the Stage ---> Properties tab. Under the Options node, find the Method property and change it from Hash to Sort. This should do it.

Thanks and regards :)
Sudeep
nick.bond
Charter Member
Posts: 230
Joined: Thu Jan 15, 2004 12:00 pm
Location: London

Post by nick.bond »

You're hitting the 2 GB data segment (heap) size limit imposed on you by the UNIX setup.

I think this is set through ulimit. I can't remember whether it is set up once or specified in dsenv, and I don't have a system to look at right now. I'll have to check in the morning unless someone else answers this first.
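
In the meantime, a minimal sketch of how you might check and raise the limits from the shell (assuming bash; the -d flag works on the data segment size, reported in kB):

    # Check the current data segment (heap) limits, in kB
    ulimit -Sd    # soft limit
    ulimit -Hd    # hard limit

    # Raise the soft limit for the current shell, up to whatever
    # the hard limit allows ("unlimited" only works if the hard
    # limit permits it)
    ulimit -Sd unlimited

If DataStage picks its limits up from dsenv, the same ulimit line could go there so every job inherits it.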
Regards,

Nick.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

AFAIK, only an SA can change the 'hard limit' for something like that. Seeing as how the soft limit is only 2 bytes below that, raising it via ulimit won't be possible until the hard limit is raised.
-craig

"You can never have too many knives" -- Logan Nine Fingers
nick.bond
Charter Member
Posts: 230
Joined: Thu Jan 15, 2004 12:00 pm
Location: London

Post by nick.bond »

Have a look at this link, which explains how to check and change the soft and hard limits.

http://www.ss64.com/bash/ulimit.html

As Craig said, you probably need to get your SA to do this for you.
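
On Linux, for example, the SA would typically raise the hard limit in /etc/security/limits.conf (a sketch only; the exact file and syntax vary by UNIX flavour, and the dsadm username is just an assumption for the DataStage user):

    # /etc/security/limits.conf -- data segment limits for the
    # DataStage user, in kB ("unlimited" removes the cap)
    dsadm    soft    data    unlimited
    dsadm    hard    data    unlimited

A fresh login is needed for the new limits to take effect.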
Regards,

Nick.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

If you can pre-sort the data by the grouping columns, and change the Aggregator method from Hash to Sort, you should not encounter any memory issues.
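
To see why the Sort method avoids the memory problem: once the input is sorted on the grouping keys, the aggregator only has to hold one group in memory at a time, whereas the Hash method keeps a table entry for every distinct group. A rough command-line analogy (not DataStage syntax; input.csv, the key in column 1 and the value in column 2 are made up for the example):

    # Sort-based aggregation: sum column 2 per key in column 1.
    # Memory use is bounded because only the current group is held.
    sort -t, -k1,1 input.csv |
    awk -F, '{
        if (NR > 1 && $1 != key) { print key "," sum; sum = 0 }
        key = $1; sum += $2
    } END { if (NR > 0) print key "," sum }'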
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Cr.Cezon
Participant
Posts: 101
Joined: Mon Mar 05, 2007 4:59 am
Location: Madrid

Post by Cr.Cezon »

Thanks a lot