Heap Memory Size Issue

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Sourav
Participant
Posts: 17
Joined: Tue Sep 04, 2007 5:34 pm
Location: Delhi

Heap Memory Size Issue

Post by Sourav »

Hi Friends,

I have a heap memory size issue: my DataStage jobs are aborting because there is not enough heap memory to propagate a large volume of records from one dataset to another dataset/fileset. The jobs also use two lookup tables through which we retrieve particular data as per the business logic.

Kindly give your full support and do the needful from your end.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

PX is usually quite graceful in how it degrades from using memory to disk. Could you post your exact error - it is likely that you ran out of disk space and not memory.
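As an illustration only (the paths below are placeholders; use the resource disk and scratchdisk entries from your own APT configuration file), the free space on those file systems can be checked with standard OS commands:

# Check free space on the dataset and scratch disk file systems
# named in the APT configuration file (example paths only).
df -k /ds/datasets /ds/scratch1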
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Re: Heap Memory Size Issue

Post by ray.wurlod »

Sourav wrote:Kindly give your full support and do the needful from your end.
Surely you're paying maintenance on the licence, which entitles you to support from your registered support provider?
DSXchange does not exist as a substitute for proper support. Your support provider is (or needs to be) aware of the particular issues at your site, what patches have been installed, and so on. We on DSXchange obviously can not be thus aware.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Sourav
Participant
Posts: 17
Joined: Tue Sep 04, 2007 5:34 pm
Location: Delhi

Heap Memory Size Issue

Post by Sourav »

ArndW wrote:PX is usually quite graceful in how it degrades from using memory to disk. Could you post your exact error - it is likely that you ran out of disk space and not memory. ...


The exact error is given below; please do the needful from your end.
---------------------------------------------------------------------------

Join_36,5: The current soft limit on the data segment (heap) size
(805306368) is less than the hard limit (2147483647), consider increasing
the heap size limit
Join_36,5: Current heap size: 276,559,192 bytes in 13,936 blocks
Join_36,5: Failure during execution of operator logic .
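For reference, this message refers to the operating system limit on the process data segment. As an illustration only (the exact values and the place to set them depend on your platform and on how the engine is started, for example in dsenv), the soft limit can be inspected and raised towards the hard limit in the shell that starts the DataStage engine:

# Show the current soft and hard limits on the data segment (heap) size;
# most shells report these values in kilobytes.
ulimit -Sd
ulimit -Hd

# Raise the soft limit to the hard limit for this shell and its children
# (for example, before restarting the engine).
ulimit -Sd $(ulimit -Hd)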
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Allocate more space (or more file systems) to scratch disk in your configuration file.
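For illustration only (the host name and directory paths are placeholders), a single-node configuration file with an additional scratch disk file system might look like this:

{
  node "node1"
  {
    fastname "etl_server"
    pools ""
    resource disk "/ds/datasets" {pools ""}
    resource scratchdisk "/ds/scratch1" {pools ""}
    resource scratchdisk "/ds/scratch2" {pools ""}
  }
}

Each resource scratchdisk entry should point to a file system with enough free space for the sorting and buffering activity of the job.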
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Ray - you've done the needful. Thanks.