The current limit on number of open files should be raised

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

satheesh.mohandass
Participant
Posts: 14
Joined: Wed Dec 26, 2007 10:56 am

The current limit on number of open files should be raised

Post by satheesh.mohandass »

One of my DataStage 8.0 parallel jobs aborts with the following message:

main_program: Fatal Error: Could not open file "/tmp/dynLUT15464f5607f5": Too many open files; The current limit on number of open files (1024) should be raised (both hard and soft)

I set the open-files limits as shown below in the /etc/security/limits.conf file on the Red Hat Linux server:

* hard nofile 4096
* soft nofile 63536

But when I re-run the job I still get the same error. Any quick response will be deeply appreciated.
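For reference, the limits actually in effect can be checked from the shell. Note also that the kernel will not accept a soft limit above the hard limit, so the snippet above (soft 63536 against hard 4096) likely needs the two values swapped or the hard limit raised. A minimal check, runnable on any Linux box:

```shell
# Print the open-file limits in effect for the current shell.
# The soft limit is what a process actually hits; the hard limit is the
# ceiling the soft limit can be raised to (soft can never exceed hard).
ulimit -Sn   # soft limit on open file descriptors
ulimit -Hn   # hard limit
```

Also remember that limits.conf is applied by PAM at login, so only sessions started after the change (and the processes they spawn) see the new values.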
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Perhaps this will help:

viewtopic.php?t=111077
-craig

"You can never have too many knives" -- Logan Nine Fingers
satheesh.mohandass
Participant
Posts: 14
Joined: Wed Dec 26, 2007 10:56 am

Post by satheesh.mohandass »

Hi Craig,

Thanks for your reply, but the link you provided didn't help much in resolving the issue.

I tried modifying the open-files ulimit in the user's .profile, in the dsenv file, and in limits.conf, but none of these settings take effect when I run the DataStage job. I still get the same error: "The current limit on number of open files (1024) should be raised (both hard and soft)".

I also placed ulimit -a in a before-job subroutine to check the open-files value, and it always shows 1024.
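One common explanation, offered here as an assumption about the setup: limits.conf is applied only to PAM login sessions, so a DataStage engine started at boot (or from a shell opened before the change) keeps the lower limit it inherited, and every job it spawns does too. Adding a ulimit call to the engine's dsenv file works because children inherit the limit set by their parent shell. A small demonstration of that inheritance (the 512 value is arbitrary):

```shell
# A child shell's ulimit call governs everything that shell runs -- the same
# mechanism by which a "ulimit -n <N>" line in dsenv affects engine-spawned jobs.
# Lowering the soft limit never requires raising the hard limit, so this runs anywhere.
bash -c 'ulimit -Sn 512; ulimit -Sn'   # prints 512
```

After adding such a line to dsenv, the engine must be restarted (e.g. `uv -admin -stop` then `uv -admin -start` under $DSHOME) so the new limit is picked up.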
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

You need to change the limits of the user under which the DataStage jobs run and then restart the DataStage engine.

<If your platform is LINUX, ignore this post. I didn't read the full problem description. >
Last edited by ArndW on Thu Aug 20, 2009 10:18 am, edited 1 time in total.
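To confirm whether the running engine actually has the raised limit, Linux exposes each process's effective limits under /proc. A sketch, assuming the engine daemon is named dsrpcd (adjust for your install):

```shell
# Look up the oldest dsrpcd process and print its effective open-files limit.
pid=$(pgrep -o dsrpcd 2>/dev/null)
if [ -n "$pid" ]; then
    grep 'open files' "/proc/$pid/limits"
else
    echo "dsrpcd not running"
fi
```

If this still shows 1024 after editing dsenv or limits.conf, the engine has not been restarted since the change.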
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

It wasn't really meant to resolve anything, per se. What I took from the linked post was that this isn't fixable by a config change alone; you need to restructure your process to do "fewer things" at a time.
-craig

"You can never have too many knives" -- Logan Nine Fingers