VOC Error

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

VOC Error

Post by mjgmc »

Hello,

For about a month now, we have had problems importing DataStage components into our acceptance environment.

We couldn't pinpoint the problem at first, but we figured out that the UniVerse database was unstable.

Yesterday we ran a rebuild of the indexes via DS.TOOLS in the Administrator, and it seemed to work. But this morning one of our processes (which had already run three times before) aborted in several jobs with different errors. I'm posting them here; each error came from a different job:

Program "DSD.StageRun": pc = 1E44, Unable to open the operating system file "DSG_BP.O/DSR_TIMESTAMP.B".
[ENFILE] File table overflow
Program "DSD.StageRun": pc = 1E44, Unable to load file "DSR_TIMESTAMP".
Program "DSD.StageRun": pc = 1E44, Unable to load subroutine.
Attempting to Cleanup after ABORT raised in stage PhaOgAlimDtwTabRefEga3ColA..InterProcess_7.IDENT17
DataStage Phantom Aborting with @ABORT.CODE = 3


Program "DSD.StageRun": pc = 70C, "$DS.GETPID" is not in the CATALOG space.
[ENFILE] File table overflow
Program "DSD.StageRun": Line 233, Incorrect VOC entry for $DS.GETPID.
Program "DSD.StageRun": Line 233, Unable to load subroutine.
Cannot find a job number 0
Attempting to Cleanup after ABORT raised in stage 0
DataStage Phantom Aborting with @ABORT.CODE = 3


Program "DSD.LinkReport": pc = 2BA, Unable to open the operating system file "DSD_BP.O/DSD_AddLinkEvent.B".
[ENFILE] File table overflow
Program "DSD.LinkReport": pc = 2BA, Unable to load file "DSD.AddLinkEvent".
Program "DSD.LinkReport": pc = 2BA, Unable to load subroutine.
Attempting to Cleanup after ABORT raised in stage PhaOgAlimDtwTabRefEga3ColF..REF_OG
DataStage Phantom Aborting with @ABORT.CODE = 3


ds_loadlibrary: error in dlopen of oraoci9.so - libwtc9.so: cannot open shared object file: No such file or directory

The last message is fatal; the others are just warnings.

And to finish, one of the jobs just aborted without any warning or error.

What should we do to fix this problem?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

I wonder if you have hit the system file table limit, given this message:

[ENFILE] File table overflow

The parameter and location where this system limit is set depends upon which flavor of UNIX you are using.
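On Linux specifically, a quick way to check the kernel-wide file table is the proc interface; a minimal sketch (assumes a Linux system, which the poster confirms later in the thread):

```shell
# System-wide file handle usage: /proc/sys/fs/file-nr reports three
# numbers -- allocated handles, free handles, and the maximum.
cat /proc/sys/fs/file-nr

# The configured system-wide maximum on its own:
cat /proc/sys/fs/file-max
```

If the first number in file-nr is close to file-max, ENFILE errors like the ones above are exactly what you would expect to see.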
mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

Post by mjgmc »

I can't read the whole of your message, Arnd, but I think that's exactly it, because the "[ENFILE] File table overflow" message appeared several times whenever we tried to import our jobs.

What do you think we can do?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

This system limit needs to be reconfigured if it is too restrictive. Which UNIX are you running?
mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

Post by mjgmc »

We work with Linux RedHat 2.1
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

What does "ulimit -a" show for your DataStage user? And what is the NFILES setting in your sysconfig?
mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

Post by mjgmc »

ulimit -a shows:

time(cpu-seconds) unlimited
file(blocks) unlimited
coredump(blocks) 0
data(kbytes) unlimited
stack(kbytes) 8192
lockedmem(kbytes) unlimited
memory(kbytes) unlimited
nofiles(descriptors) 1024
processes 15231



How can I see the sysconfig information? Is it NFILES or NOFILES that you need?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Your user process can only open 1024 files at a time; I'm not sure if the error message is the same, though. Your user stack is pretty small as well.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

I think you can use "sysctl" on Red Hat, but I am not sure.
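As a sketch of what that would look like (root access assumed, and the value 65536 is only an example, not a recommendation for your system):

```shell
# Raise the kernel-wide open-file limit at runtime (root only):
#   sysctl -w fs.file-max=65536
#
# Make the change persistent across reboots by adding a line to
# /etc/sysctl.conf and reloading:
#   echo "fs.file-max = 65536" >> /etc/sysctl.conf
#   sysctl -p
```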
mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

Post by mjgmc »

I passed the information to a UNIX administrator, who proposed modifying this:

@dstage soft nofile 1024

@dstage hard nofile 65536


in the file /etc/security/limits.conf. They changed this, but when I do ulimit -a, I still get the same:

time(cpu-seconds) unlimited
file(blocks) unlimited
coredump(blocks) 0
data(kbytes) unlimited
stack(kbytes) 8192
lockedmem(kbytes) unlimited
memory(kbytes) unlimited
nofiles(descriptors) 1024
processes 15231

Is there something else we could do?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Why not try setting the soft file limit to 2048 and checking to see if it makes a difference?
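One thing to note: /etc/security/limits.conf is applied by PAM at login, so the DataStage user has to start a fresh session before "ulimit" reflects the new values; that would explain why your output is unchanged. A quick sketch for checking and raising the per-session limit (2048 is just an example):

```shell
# The soft limit is what currently applies; a non-root user may raise
# it in the running session, but only up to the hard limit.
ulimit -Sn    # soft limit on open file descriptors
ulimit -Hn    # hard ceiling (set by root, e.g. via limits.conf)

# Raise the soft limit for this session only (example value):
# ulimit -n 2048
```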
mjgmc
Participant
Posts: 52
Joined: Thu Nov 25, 2004 8:06 am

Post by mjgmc »

ArndW wrote:Why not try setting the soft file limit to 2048 and checking to see if it makes a difference? ...
I proposed raising that value, but it seems the "UNIX experts" didn't like the idea. What would be the negative impact of raising that limit?
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

None that I am aware of. Why not ask your "UNIX experts" that question? Did you get any reason as to why they didn't like it?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

No negative impact unless the system is already overloaded.
Post Reply