Random Link Collector Error

Post questions here related to DataStage Server Edition in such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Random Link Collector Error

Post by fgallet »

Hi,

I'm having great trouble with a job that fails randomly with this message:

"DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization.DSLink178: ds_ipcopen() - Error in open(/tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178) - No such file or directory"

Could anybody help me?

Thx a lot
Last edited by fgallet on Mon Jan 23, 2006 9:28 am, edited 1 time in total.
MaheshKumar Sugunaraj
Participant
Posts: 84
Joined: Thu Dec 04, 2003 9:55 pm

Post by MaheshKumar Sugunaraj »

Hi,

Could you please send some more details as to what you're trying to do with the Link Collector stage?

As far as the message you have sent goes, I guess you are taking a couple of sequential files and using a Link Collector stage to put them into a single file (correct me if I am wrong :D )

**
ds_ipcopen() - Error in open(/tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178) - No such file or directory
**

Could you please check the permissions of the input files (read access)?
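
For instance (a sketch; the path below is a placeholder for your actual input files):

    ls -l /path/to/input/files    # the "r" bits in the mode column show read access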

With Regards
M
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

You're wrong! :D
There are no sequential files.
I'm using the Link Collector to "normalize" several data flows (Ora9i) and build a datamart table (Ora9i).
That error message makes me think there's a problem with the temp files, but I have no clue ....

Fred
MaheshKumar Sugunaraj
Participant
Posts: 84
Joined: Thu Dec 04, 2003 9:55 pm

Post by MaheshKumar Sugunaraj »

**
You're wrong! :D There are no sequential files. I'm using the Link Collector to "normalize" several data flows (Ora9i) and build a datamart table (Ora9i). That error message makes me think there's a problem with the temp files, but I have no clue ....
**


Thanks for confirming the above. Then you should be checking the permissions on the temp files being created.

Anyway, good on you for giving some information as to what you're trying to do instead of just the error message.

M
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

OK! Thanks ...
Next time I'll try to be more precise ...
:)
Fred
MaheshKumar Sugunaraj
Participant
Posts: 84
Joined: Thu Dec 04, 2003 9:55 pm

Post by MaheshKumar Sugunaraj »

No problem, it's always a learning curve. Best of luck.

:)
M
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Perhaps your /tmp file system has become full during the job run?
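
A quick way to check this (a sketch; df is standard, though the inode flag varies by Unix flavor — on some systems it is "df -o i" or part of the plain "df" output):

    df -k /tmp     # free disk space on /tmp
    df -i /tmp     # free inodes on /tmp (Linux-style flag)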
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

I've just checked my /tmp directories.
Access rights are good.

Strange thing:
the file mentioned in the error message has been created!
And it is owned by dsadm .... with, of course, access rights OK.

I still don't understand ....

Help! :'(

Fred
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

If you are running your jobs as user dsadm then the permissions are most likely not your problem. The most likely cause is what Ray has already mentioned - /tmp either running out of space or inodes. These temporary files are, I think, named pipes (use ls -al and see if the first character is a "p" in the attribute list). The fact that this error is sporadic makes it unlikely to be access rights and more likely to be disk space.
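
For example, using the path from the error message (a sketch; adjust to your own file name, and note that df flags vary by Unix flavor):

    # First character "p" in the mode column means a named pipe (FIFO)
    ls -al /tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178
    # Space and inode usage on /tmp
    df -k /tmp
    df -i /tmp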
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

There's no 'p' when I execute "ls -al".
I don't think there's a disk space or an inode problem:
they're at 4% and 1% used, respectively.

Fred
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

And are you running your jobs as user "dsadm" as well? Also try to measure your /tmp use during a run; 4% sounds like you have a lot of space left, but if your /tmp only has a couple of MB allocated it will fill up quickly, and DataStage does try to clean up after itself.
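
Something like this left running during the job would catch a short-lived spike (a rough sketch; adjust the interval to taste):

    # Print /tmp usage every 5 seconds while the job runs
    while true
    do
        df -k /tmp | tail -1
        sleep 5
    done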

p.s. Please edit your topic - it is used in searching so "any ideas" ranks pretty low :? Perhaps "Sporadic problems on link join sequential files" or something similar might be more descriptive.
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

Yes, jobs are executed as the dsadm user.
I've monitored /tmp space usage ... it never exceeds 5%
(something like 1 GB available).
But I noticed that the file specified in the error message had been created by DS at runtime before it aborted.
And the file was not deleted when DS exited ....
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

The error is in an ipcopen() call, so the files would need to be pipes. Could you delete the temporary files in question before trying a new run? It might be that the error comes because DS is trying to do IPC calls on an incorrect file type.
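
For example (a sketch; the file name is taken from your error message):

    # Confirm what the leftover file actually is, then remove it
    ls -al /tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178
    rm /tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178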
fgallet
Participant
Posts: 10
Joined: Thu Mar 31, 2005 3:09 am

Post by fgallet »

When I delete this file and relaunch the job, it works fine ...
But the problem reappears randomly and I don't know why ....

Fred
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

That means that at least the cause of the error is explained - the files are being created somewhere in your environment as normal sequential files instead of as pipes. What kind of collection algorithm are you using? If you change the name of the stage, does the error recur? (If not, then it is some other job creating these files; if it still occurs, it is something to do with the stage itself.)
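
If the bad file reappears, something like this could help identify what created it (a sketch; "file" and "fuser" are standard Unix tools, but their options vary by platform):

    # "file" reports "fifo" for a named pipe, "empty" or "data" for an ordinary file
    file /tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178
    # "fuser -u" lists the processes (and owning users) that have the file open
    fuser -u /tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178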
Post Reply