Hi gurus,
For the past few days we have been randomly getting the following error:
Failed to open file in DataStage, status=13
It happens inside jobs that write and then read a file: the write succeeds but the read fails.
After the abort we can relaunch the job and it works.
Can you help me?
Thank you
Failed to open file in datastage status=13
Moderators: chulett, rschirm, roy
Hope This Helps
Regards
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Check the permissions on the file and its parent folder, and determine under which user ID the jobs run (you may also need to check group membership).
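On UNIX systems, status 13 commonly corresponds to errno 13 (EACCES, permission denied), which fits the advice above. Here is a minimal shell sketch of those checks; the `/tmp` path and filename are placeholders, not the job's real file:

```shell
#!/bin/sh
# Placeholder file -- substitute the pathname from the job log.
FILE=/tmp/ds13_demo.txt
touch "$FILE"

# 1. Which user and groups do the jobs actually run as?
#    Run this as the DataStage job user.
id

# 2. The file AND its parent directory must both grant that user
#    read access; a missing read bit on either gives EACCES.
ls -ld "$FILE" /tmp

# 3. A write-only mode (e.g. 0200) lets the write link succeed while
#    the read link fails -- consistent with status=13.
chmod 0200 "$FILE"
stat -c '%a' "$FILE"    # prints 200: no read bit for anyone
chmod 0644 "$FILE"      # restore a readable mode
rm -f "$FILE"
```

If the modes look correct, also check group membership of the job user against the group owner shown by `ls -ld`.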
Good luck!
Last edited by ray.wurlod on Wed Jun 08, 2016 10:54 pm, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Actually, Teej, such blocking operations are perfectly possible, feasible and even normal in server jobs.
Put another way, in server jobs a passive stage can have an input link (writing) AND an output link (reading). When the job runs, the output link is not opened until the input link has been closed.
One use case is pre-populating hashed files in the same job in which they're used to deliver reference data.
Check also that the file pathname is EXACTLY the same on the input and the output link; otherwise you may be trying to open a non-existent file. That it works sometimes suggests a job parameter that occasionally resolves to an incorrect value, or a timing issue in the underlying file system, which may be slow to effect the close because it has to update the modified time in a large directory and the accessed time in all parent directories, for example.
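A quick shell sketch of both checks above, using made-up pathnames and a made-up parameter value in place of the job's real ones: compare the pathname the writing link built against the one the reading link will open, then confirm the file actually exists before the read:

```shell
#!/bin/sh
# Made-up values -- in practice these come from the job's parameters.
WRITE_PATH="/tmp/landing/extract_20160608.txt"   # path built by the input link
READ_PATH="/tmp/landing/extract_20160608.txt"    # path built by the output link

# Any difference here (case, trailing spaces, a parameter resolving
# differently on each link) means the read opens a non-existent file.
if [ "$WRITE_PATH" = "$READ_PATH" ]; then
    echo "paths match"
else
    echo "paths differ -- the read link would open a non-existent file"
fi

# Timing check: the file must exist (and be fully closed/flushed)
# before the reading link opens it.
mkdir -p "$(dirname "$WRITE_PATH")"
printf 'row1\n' > "$WRITE_PATH"
[ -s "$WRITE_PATH" ] && echo "file present and non-empty"
rm -f "$WRITE_PATH"
```

Logging both resolved pathnames from the job (for example via a job parameter echoed to the log) makes the intermittent case much easier to catch than eyeballing the stage properties.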
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.