Page 1 of 2

Random Link Collector Error

Posted: Mon Jan 23, 2006 3:05 am
by fgallet
Hi,

I'm having great trouble with a job that fails randomly with this message:

"DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization.DSLink178: ds_ipcopen() - Error in open(/tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178) - No such file or directory"

Could anybody help me?

Thanks a lot

Posted: Mon Jan 23, 2006 3:15 am
by MaheshKumar Sugunaraj
Hi,

Could you please send some more details about what you're trying to do with the Link Collector stage?

From the message you have sent, I guess you are taking a couple of sequential files and using a Link Collector stage to put them into a single file (correct me if I am wrong :D ).

"ds_ipcopen() - Error in open(/tmp/ade.ReportingPrism.DWHAlimACFO.Load_ACFO_Daily_w.ACPRNormalization-Link_Collector_167.DSLink178) - No such file or directory"

Could you please check the permissions on the input files (read access)?

With Regards
M

Posted: Mon Jan 23, 2006 3:35 am
by fgallet
You're wrong! :D
There are no sequential files.
I'm using the Link Collector to "normalize" several data flows (Oracle 9i) and build a datamart table (Oracle 9i).
That error message makes me think there's a problem with the temp files, but I have no clue ...

Fred

Posted: Mon Jan 23, 2006 3:40 am
by MaheshKumar Sugunaraj
fgallet wrote: You're wrong! :D There are no sequential files. I'm using the Link Collector to "normalize" several data flows (Oracle 9i) and build a datamart table (Oracle 9i). That error message makes me think there's a problem with the temp files, but I have no clue ...


Thanks for confirming the above; then you should be checking the permissions on the temp files being created.

Anyway, it's good that you gave some information about what you're trying to do instead of just the error message.

M

Posted: Mon Jan 23, 2006 3:45 am
by fgallet
OK! Thanks ...
Next time I'll try to be more precise ...
:)
Fred

Posted: Mon Jan 23, 2006 3:50 am
by MaheshKumar Sugunaraj
No problem, it's always a learning curve. Best of luck.

:)
M

Posted: Mon Jan 23, 2006 3:58 am
by ray.wurlod
Perhaps your /tmp file system has become full during the job run?

Posted: Mon Jan 23, 2006 4:54 am
by fgallet
I've just checked my tmp directories.
The access rights are good.

Strange thing: the file mentioned in the error message has been created!
It is owned by dsadm ... with, of course, the access rights OK.

I still don't understand ...

Help! :'(

Fred

Posted: Mon Jan 23, 2006 5:42 am
by ArndW
If you are running your jobs as user dsadm then the permissions are most likely not your problem. The most likely cause is what Ray has already mentioned - /tmp either running out of space or inodes. These temporary files are, I think, named pipes (use ls -al and see if the first character is a "p" in the attribute list). The fact that this error is sporadic makes it unlikely to be access rights and more likely to be disk space.
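The two checks ArndW describes (is the temp file a named pipe, and how full is /tmp in blocks and inodes) can be sketched in Python. This is an illustrative sketch, not part of DataStage; the function names are my own:

```python
import os
import stat

def describe_tmp_file(path):
    """Report whether a file is a named pipe (FIFO) - the same
    thing 'ls -al' shows with a leading 'p' - or a regular file."""
    st = os.stat(path)
    return "named pipe (FIFO)" if stat.S_ISFIFO(st.st_mode) else "regular file"

def tmp_capacity(mount="/tmp"):
    """Return (percent of blocks used, percent of inodes used),
    the figures 'df /tmp' and 'df -i /tmp' report."""
    vfs = os.statvfs(mount)
    blocks_used = 100 * (1 - vfs.f_bfree / vfs.f_blocks) if vfs.f_blocks else 0.0
    inodes_used = 100 * (1 - vfs.f_ffree / vfs.f_files) if vfs.f_files else 0.0
    return blocks_used, inodes_used
```

Running `tmp_capacity()` periodically during a job run would show whether /tmp briefly fills up even when it looks nearly empty afterwards.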

Posted: Mon Jan 23, 2006 6:08 am
by fgallet
There's no 'p' when I execute "ls -al".
I don't think there's a disk space or an inode problem;
they're at 4% and 1% used, respectively.

fred

Posted: Mon Jan 23, 2006 8:05 am
by ArndW
And are you running your jobs as user "dsadm" as well? Also, try to measure your /tmp use during a run; 4% sounds like you have a lot of space left, but if your /tmp only has a couple of MB allocated it will fill up quickly, even though DataStage does try to clean up after itself.

p.s. Please edit your topic - it is used in searching so "any ideas" ranks pretty low :? Perhaps "Sporadic problems on link join sequential files" or something similar might be more descriptive.

Posted: Mon Jan 23, 2006 9:33 am
by fgallet
Yes, the jobs are executed as the dsadm user.
I've monitored /tmp space usage ... it never exceeds 5%
(something like 1 GB available).
But I noticed that the file specified in the error message had been created by DS at runtime before it aborted.
And the file was not deleted when DS exited ...

Posted: Mon Jan 23, 2006 10:13 am
by ArndW
The error is in an ipcopen() call, so the files would need to be pipes. Could you delete the temporary files in question before trying a new run? It might be that the error occurs because DS is trying to do IPC calls on an incorrect file type.
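The manual cleanup ArndW suggests could be scripted along these lines. This is a sketch under assumptions: the glob pattern is inferred from the file name in the error message (it is not a documented DataStage convention), and only regular files are removed so that any genuine pipes are left alone:

```python
import glob
import os
import stat

def remove_stale_collector_files(pattern="/tmp/ade.*Link_Collector*"):
    """Delete leftover Link Collector temp files that are *regular*
    files; a healthy run should recreate them as named pipes.
    The default pattern is an assumption based on the error message."""
    removed = []
    for path in glob.glob(pattern):
        st = os.lstat(path)
        if not stat.S_ISFIFO(st.st_mode):  # leave real pipes untouched
            os.remove(path)
            removed.append(path)
    return removed
```

Running this before each job launch would at least prevent a stale regular file from tripping up the next ipcopen() call.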

Posted: Mon Jan 23, 2006 10:29 am
by fgallet
When I delete this file and relaunch the job, it works fine ...
But the problem reappears randomly and I don't know why ...

Fred

Posted: Mon Jan 23, 2006 10:34 am
by ArndW
That means that at least the cause of the error is explained - the files are being created somewhere in your environment as normal sequential files instead of as pipes. What kind of collection algorithm are you using? If you change the name of the stage, does the error re-occur? (If not, then it is some other job creating these files; if it still occurs, it is something to do with the stage itself.)
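To find out which process creates the file as the wrong type, one option is a small watcher that polls for the file and reports its type the instant it appears. A hypothetical helper, not a DataStage facility:

```python
import os
import stat
import time

def watch_for_file(path, timeout=60.0, interval=0.1):
    """Poll until `path` appears, then report whether it was created
    as a 'fifo' or a 'regular' file; return None on timeout.
    Run this alongside the job to catch the creator red-handed."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            st = os.lstat(path)
        except FileNotFoundError:
            time.sleep(interval)
            continue
        return "fifo" if stat.S_ISFIFO(st.st_mode) else "regular"
    return None
```

If the watcher reports "regular" the moment the job starts, the file is being created wrongly from the outset rather than being corrupted later.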