
What does this error message mean?

Posted: Thu Jan 27, 2005 1:16 pm
by cbres00
node_node1: Fatal Error: Unable to start ORCHESTRATE process on node node1 (slsudv18): APT_PMPlayer::APT_PMPlayer: fork() failed, Not enough space

Regards,
cbres00

Posted: Thu Jan 27, 2005 1:43 pm
by Amos.Rosmarin
Hi,

Check the disk space where your datasets live (according to the APT_CONFIG_FILE).
Check the disk space on your tmp directory (if you have not changed it in uvconfig, maybe it's time to do so).
Check the number of processes by executing

Code: Select all

ulimit -a
from a DataStage server routine; it should be high (much more than the default of 100).
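As a rough illustration (the ulimit -u flag and the values below are assumptions about your shell and environment, not settings from this thread), the per-user process limit is the one that makes fork() fail:

Code: Select all

# Limits seen by the user that starts the DataStage engine
ulimit -a

# In bash/ksh93, -u shows the maximum number of user processes;
# fork() starts failing once this limit is reached
ulimit -u

# Illustrative only: raise the soft limits in the session (or in dsenv)
# before the engine starts, within the hard limits set by root
ulimit -n 2048      # open file descriptors
ulimit -u 1024      # user processes (example value)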

You did not say what kind of unix you're using. It's very important.


HTH,
Amos

Posted: Thu Jan 27, 2005 4:11 pm
by cbres00
Strangely enough this is a near duplicate of a job I ran just a few minutes before...but it worked that time.

Where would I find APT_CONFIG_FILE? In Administrator?

We're using Solaris.

Thanks
cbres00

Config File Locations

Posted: Thu Jan 27, 2005 6:36 pm
by trokosz
You find the APT_CONFIG_FILE in one of two places:

1. In Manager, go to Tools | Configurations and there they are....

2. cd $DSHOME, then up one level: cd ../Configurations, and there they are where you can cat or vi them...
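For reference, a parallel configuration file typically looks something like the sketch below (the node name is taken from the error message above, but the directory paths are made-up examples; use the ones in your own file):

Code: Select all

{
    node "node1"
    {
        fastname "slsudv18"
        pools ""
        resource disk "/ds/data/datasets" {pools ""}
        resource scratchdisk "/ds/scratch" {pools ""}
    }
}

The resource disk and resource scratchdisk entries are the directories whose filesystems you need to check for free space.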

Posted: Tue Feb 01, 2005 1:06 pm
by T42
Do a 'df -k' and see if any filesystem is at or near 100% utilization. Check the configuration file to see which mount points you are mapping to, and go from there.
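For example (the paths here are assumptions; substitute the resource disk, scratchdisk and tmp directories from your own configuration file):

Code: Select all

# Overall picture of all mounted filesystems
df -k

# Just the filesystems the configuration file points at
df -k /ds/data/datasets /ds/scratch /tmp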

Posted: Fri Feb 04, 2005 12:23 pm
by dsxuserrio
Cbres00,
This is clearly a memory issue, as pointed out by others.
A few more things:
How many sorts are you using in your job?
Check the config file and see how much space is allocated for scratch and data using df -k.

Sometimes after the job fails the scratch disk is cleaned up, so df -k will not give the correct picture because the space has already been freed.
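A rough way around that (the scratch path and log file name are example values) is to sample the scratch filesystem from another session while the job is still running, rather than after it fails:

Code: Select all

# Record scratch disk usage every 10 seconds while the job runs
while true
do
    date >> /tmp/scratch_usage.log
    df -k /ds/scratch >> /tmp/scratch_usage.log
    sleep 10
done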
Thanks
dsxuserrio

Posted: Wed Apr 06, 2005 5:40 pm
by raviyn
Hi,

We are also getting the same error. We monitored df -k and we are not hitting 100%. Also, this is the output of ulimit -a:

Code: Select all

time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1048576
stack(kbytes) 392192
memory(kbytes) unlimited
coredump(blocks) 4194303
nofiles(descriptors) 2048

Please let me know.
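One more thing that may be worth checking, offered as a hedged suggestion: on Solaris, fork() failing with "Not enough space" (ENOMEM) often means the system could not reserve enough swap for the new process, even when no filesystem is full. The swap figures can be watched while the job runs:

Code: Select all

# Summary of allocated/reserved/available virtual memory
swap -s

# Swap devices and free blocks
swap -l

# The "swap" column shows available swap while the job runs
vmstat 5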