
job aborted because of heap size

Posted: Wed Dec 19, 2007 1:14 am
by aaha_naga
Hi,
My job aborted with the error "The current soft limit on the data segment (heap) size is less than the hard limit, consider increasing the heap size limit".

Where (and how) can I handle this?

Posted: Wed Dec 19, 2007 4:18 am
by ArndW
Increase that limit with the ulimit -d {size} command (-d governs the data segment/heap size; -s, by contrast, governs the stack); this can be done in the dsenv file so that it affects all DataStage users.
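
For reference, a minimal sketch of what that change could look like in dsenv (the file path and the size value are illustrative only; pick a value that suits your machine and stays at or below the hard limit):

    # In $DSHOME/dsenv -- raise the soft limit on the data segment (heap)
    # for every DataStage process started through this environment file.
    # The value is in KB on most shells; 1048576 KB = 1 GB.
    ulimit -d 1048576

Restart the DataStage engine afterwards so that running processes pick up the new limit.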

Posted: Wed Dec 19, 2007 5:23 am
by aaha_naga
Thanks a lot!

Posted: Wed Dec 19, 2007 3:53 pm
by ray.wurlod
Time to mark the thread as Resolved?

Posted: Thu Dec 27, 2007 5:57 am
by Kirtikumar
Arnd, just wondering - can DataStage have any problems due to 32-bit limits when we make this change?

Our DS admins say that going beyond 2 GB might cause problems with other settings because of 32-bit constraints. Could there be any such problem?

Posted: Thu Dec 27, 2007 6:07 am
by ArndW
No, DS will not have issues due to this change.
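
If in doubt, you can verify what the process actually sees. A quick check, assuming a ksh or bash shell (-S reports the soft value, -H the hard value):

    # Current soft and hard data segment (heap) limits, in KB
    ulimit -S -d
    ulimit -H -d

    # All limits at once
    ulimit -a

Note that a 32-bit process can address at most 4 GB in total, so raising the heap limit only helps up to that architectural ceiling.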

Posted: Thu Jan 03, 2008 2:27 am
by rubik
ArndW wrote:No, DS will not have issues due to this change. ...
ArndW,

Since DataStage is a 32-bit application, wouldn't this mean that it can only utilize up to 2 GB of real memory? We ran into the same heap allocation problem with DS EE 8.x on AIX 5.3. Even after changing the data segment limit (both hard and soft) to unlimited, we still face the same problem.

Posted: Thu Jan 03, 2008 3:09 am
by ArndW
rubik - the original error posted was "The current soft limit on the data segment (heap) size is less than the hard limit, consider increasing the heap size limit"; so how can you be getting the same error message after setting it to unlimited? Or is this another error that might best be covered in a new thread?
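
One thing worth double-checking on AIX: the per-user hard limits are read from /etc/security/limits, so confirm the change really took effect there for the DataStage user. A sketch of an unlimited data segment (the stanza name dsadm is an assumption; -1 means unlimited):

    dsadm:
            data = -1
            data_hard = -1

Changes there only apply to new logins, so the engine must be restarted from a fresh session.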

Posted: Thu Jan 03, 2008 3:56 am
by rubik
ArndW wrote:rubik - the original error posted was "The current soft limit on the data segment (heap) size is less than the hard limit, consider increasing the heap size limit"; so how can you be getting the same error message after setting it to unlimited? Or is this another error that might best be covered in a new thread?
Sorry for hijacking, but it's the same error. Nevertheless, please see the new thread I've just created, viewtopic.php?p=264643, for more details.