data segment (heap) size

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bgs
Participant
Posts: 22
Joined: Sat Feb 05, 2005 9:43 pm


Post by bgs »

I am getting the following error in one of my jobs (in a Join stage):


join_records,0:The current soft limit on the data segment (heap) size (2147483645) is less than the hard limit (2147483647), consider increasing the heap size limit


In the Join stage I am using Entire partitioning on one link (one record through this link) and Round Robin on the other (1470610 records through this link). Is this error related to space on the device?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

This is an annoying message, which is really an informational message but displays as a warning on the grounds that it's "unusual". It stems from a difference (of two bytes?!!) between what your ulimit specifies (for data) and the hard limit coded in DataStage. Get your UNIX administrator to increase the ulimit value so that the message goes away. "unlimited" is a good value, or the value that DataStage appears to expect.
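The check and fix can be sketched at the shell prompt before starting the DataStage engine (a minimal illustration, assuming a POSIX shell such as ksh or bash; actual limit values will vary by system, and raising the hard limit itself may require root or an edit to /etc/security/limits or the AIX/Solaris equivalent):

```shell
# Show the current soft and hard limits on the data segment (heap),
# in kilobytes or "unlimited"
ulimit -S -d
ulimit -H -d

# Raise the soft limit to match the hard limit for this session,
# which removes the soft/hard mismatch the warning complains about
ulimit -S -d "$(ulimit -H -d)"

# Confirm the two now agree
ulimit -S -d
```

For the change to take effect in DataStage jobs, it must be in place in the environment that spawns the engine processes (e.g. set before `uv -admin -start`, or in the dsenv-sourcing shell), not just in an interactive login shell.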
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.