
Output file full Error

Posted: Thu Jul 05, 2007 4:03 am
by scorpion
Hi Everybody,

When I try to run one of my jobs, I get the fatal errors below and the job aborts.

Sequential_File_77,0: Output file full
Sequential_File_77,0: Failure during execution of operator logic.
Sequential_File_77,0: Input 0 consumed 1295395 records.
APT_CombinedOperatorController,0: Fatal Error: No file descriptor was supplied.
node_node1: Player 1 terminated unexpectedly.


Could anyone help me with this?

Posted: Thu Jul 05, 2007 4:06 am
by ArndW
Hmm... if I got a 'file full' message while trying to write to a sequential file, I would look to see how much space I had on that device (particularly while the job is running, as the output file gets deleted by default upon job abort).
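
For example (the path below is only a placeholder; substitute the directory your sequential file is written to), you can watch the free space on that device while the job runs:

    # free space on the file system holding the output file
    df -k /data/output

    # repeat every few seconds while the job is running
    while true; do df -k /data/output; sleep 5; done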

Posted: Thu Jul 05, 2007 9:38 am
by ray.wurlod
It might also be that the inode table for that particular file system is full, or that the file has reached 2GB and large file support is not enabled. Check with your UNIX administrator.
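
For instance (again, /data/output and the file name are just placeholders, and the exact df options and column names vary between UNIX flavours):

    # inode usage for the file system -- watch the ifree / %iused columns
    df -i /data/output

    # has the file hit the 2GB mark? (2147483648 bytes)
    ls -l /data/output/target_file.txt

    # per-process file size limit for the current shell
    ulimit -f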

Fork Failure error

Posted: Tue Oct 16, 2007 8:41 am
by DSkkk
ray.wurlod wrote: It might also be that the inode table for that particular file system is full.
Can you please tell me whether there is a command we can execute from the administrator command-line prompt to find out if the inode table for a file system is full?

The issue I am facing is that a job sometimes runs fine but other times fails with a fork failure error. I am trying to resolve this. Any advice is greatly appreciated.

Posted: Tue Oct 16, 2007 9:30 am
by ray.wurlod
A fork failure is an inability to create a new process. So it may be that the process table (rather than the inode table) is the problem in this case, or that there are no inodes left on the device with which to create communication channels.

As for commands you can run from the Administrator, most of these need superuser privilege.
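
That said, a few checks that usually do not need root (exact options differ between UNIX flavours, so treat these as illustrative):

    # rough count of entries currently in the process table
    ps -ef | wc -l

    # your per-user limits, including maximum user processes
    ulimit -a

    # free inodes on the scratch file system (example path)
    df -i /tmp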

You have marked the thread as Resolved. Can you please post what the resolution was?