My DataStage job (v8.1) is aborting with the following error message:
Fatal Error: Need to be able to open at least 16 files; please check your ulimit setting for number of file descriptors
Failure during execution of operator logic.
I verified the ulimit (ulimit -a) and it shows:
nofiles(descriptors) unlimited
Can anyone please suggest what to check?
Thanks,
Mark
ulimit error
How did you verify the ulimit value? The only way you can really know what the current value is at runtime is to put a before-job shell call into your job that lists the job's actual runtime limits.
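As a sketch of that suggestion: DataStage's built-in ExecSH before-job subroutine can run a shell command and write its output to the job log, so a command like the one below records the limits the job process actually inherits rather than the limits of your login shell (the exact wording of the echoes is illustrative, not required):

```shell
# Before-job check: log the file-descriptor limits of the job's own
# process. The soft limit is the one the kernel actually enforces;
# the hard limit is only the ceiling the soft limit may be raised to.
echo "soft nofiles: $(ulimit -Sn)"
echo "hard nofiles: $(ulimit -Hn)"
ulimit -a    # full listing, for the record
```

If the soft value printed here differs from what `ulimit -a` shows in your interactive shell, the discrepancy usually comes from how the DataStage engine (dsrpcd) was started.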
Any news?
Hi,
I have the same problem. We added the command ulimit -Ha in a before-job subroutine and the file descriptors are reported as unlimited. Any idea why we still receive this error message? Thanks.
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 4194304
memory(kbytes) unlimited
coredump(blocks) unlimited
nofiles(descriptors) unlimited
best regards
betty
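One thing worth noting about the output above: `ulimit -Ha` reports only the *hard* limits, but the kernel enforces the *soft* limit, which can be far lower (1024 is a common default). The sketch below, with 4096 as an example target value, shows how to inspect both and raise the soft limit toward the hard limit in the same shell that launches the job:

```shell
# Hard vs. soft file-descriptor limits: the hard limit is only a
# ceiling; the soft limit is what open() is actually checked against.
hard=$(ulimit -Hn)
soft=$(ulimit -Sn)
echo "hard=$hard soft=$soft"

# Raise the soft limit to 4096 (an example value) if it is currently
# lower and the hard limit permits it; a non-root user may raise the
# soft limit only up to the hard limit.
if [ "$soft" != "unlimited" ] && [ "$soft" -lt 4096 ]; then
    if [ "$hard" = "unlimited" ] || [ "$hard" -ge 4096 ]; then
        ulimit -n 4096
    fi
fi
```

Because limits are inherited per process, this only helps if it runs in the environment that starts the job, not in a separate login shell.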
Betty, please start your own post rather than jump on someone else's. That way you can include all of your information including the error(s) you are seeing, even if they are "the same problem" as this poster's. Also make sure you let us know what flavor of UNIX you are running.
-craig
"You can never have too many knives" -- Logan Nine Fingers