File access problem - Process cannot access file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

File access problem - Process cannot access file

Post by Xanadu »

Hello,

This is my job:

Table ---> TFM ---> Hash file.
When I run this job I get the following error message:

DSD.UVOpen
rm: (DATA.30) The process cannot access the file because it is being used by another process.
Unable to DELETE file "Filename".
"Filename" is already in your VOC file as a file definition record.
File name =
File not created.

I ran DS.TOOLS and there is no other process using this file, nor any active lock from this job.
Any solution to this problem?

Thanks
-Xan
Amos.Rosmarin
Premium Member
Posts: 385
Joined: Tue Oct 07, 2003 4:55 am

Post by Amos.Rosmarin »

Hi

The quickest and easiest way ... close the DS server and kill all uvsh processes, or just reboot :lol:

You will not see it in DS.TOOLS, but if you insist, look at the OS process list for uvsh PIDs that DS.TOOLS doesn't show ... not very scientific, but it works.
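On a UNIX server, something like this does it (dsapi_slave is the client connection agent; on Windows look in the Task Manager instead):

ps -ef | grep uvsh          # engine sessions
ps -ef | grep dsapi_slave   # client connections
kill <pid>                  # plain kill; avoid kill -9 on DataStage processes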


HTH
Amos
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Make sure you don't have an Explorer session looking in the hash file directory, as you won't be able to delete and recreate the hash file.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

Thanks Amos.
Actually I already did that and still have the problem.
In fact, after I bounced the machine even the earlier error message doesn't come up. The job just enters the running state and stays like that unless I explicitly stop it. When this happened after my earlier reboot, I deleted the job and rewrote it (took me a couple of tries.. it's a simple one..)
..and again I get that error message. Now when I reboot the machine again, the same thing happens - the job just stays in the running state with the link in blue (no error message in the job log..) but no output..

Any inputs?
-Xan
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

No Ken..
I don't have any Explorer session open...
In fact, after I reboot the machine that error message doesn't come up, but the job just stays in the running state (something like an infinite loop..)

Thanks Ken
-Xan
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco
Contact:

Post by ketfos »

Hi,
You mentioned that the job remains in the running state, something like an infinite loop.

1. Kill the job and related process on the server and client.
2. Open the job in Designer.

Can you open the job in Designer?
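You can also ask the server what it thinks from the command line, using the dsjob client (project and job names below are placeholders):

dsjob -jobinfo YourProject YourJob

If it reports a running status while nothing is actually executing, the job's status record is stale.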

Ketfos
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

It says the job is still running.
If I recompile, it works fine. Now if I run it again, it shows me that error message again (it doesn't go into the "infinite loop" this time..)
I try to delete the file explicitly using DELETE.FILE filename
and the system throws a DATA.30 error that the file is already in use.
Funny part is there is another job that writes to this file and it works fine.
Wait a min.. let me try something
*Comes back*
You know what, I write to a new file "delete_file" and it still throws the same error. So it's something with the job, not that particular hash file.
In fact I tried deleting this file using DELETE.FILE and again got the same error, that delete_file is in use.
Hmmm..... any other inputs? The exact commands I'm running are below.
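(File names here are just my examples, typed at the TCL prompt:)

>DELETE.FILE MyHashFile
rm: (DATA.30) The process cannot access the file because it is being used by another process.
>DELETE.FILE DELETE_FILE
rm: (DATA.30) The process cannot access the file because it is being used by another process.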

Thanks for your response
-Xan
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco
Contact:

Post by ketfos »

Hi,
What are you doing in the TFM and in the hash file stage?

Ketfos
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco
Contact:

Post by ketfos »

Hi,
Could you kill the job and the related server processes successfully?

After this, did you check the status of the job in Director?

Was it still showing the job status as running?

If it was still showing the job status as running after you killed the processes, then you need to look for remaining processes related to this job.

Let us know.


Ketfos
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

I actually don't have a TFM.... (I don't know what I was thinking when I wrote that.. I am working on tons of jobs with that setup... so I guess I got confused)
This is the structure:
Table --> Hash file.

(I just read from the table and write to the hash file. As simple as that..)
Yeah.. it's showing the job status as running even after killing..
When I look at the list of processes from DS.TOOLS I see a process like this:

SSELECT RT_LOG871 WITH @ID LIKE '1N0N' COUNT.SUP

It exists even after I "log out" the process ... Is this a normal process? What's with "SSELECT"?
(My job ID is 871 (notice this ID in RT_LOG871). I think this is the reason why it's showing that the job is running...)

thanks ket..
-Xan
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco
Contact:

Post by ketfos »

Hi,
What I understand is: even after you kill the job and the related processes, it still shows the status as 'RUNNING'.
That was the reason I specifically asked you to check the status of the job.
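About that SSELECT: it's just the engine's sorted-SELECT verb. Director runs queries like that against a job's log file (RT_LOG871 for your job 871) when it reads or purges the log - the '1N0N' pattern simply picks out the all-numeric record IDs, i.e. the log events - so the command itself is normal. The problem is that the session running it seems to have died without releasing its locks, which keeps the file "in use". You can look for leftover locks at the TCL prompt, something like this (the account name is a placeholder):

LOGTO YourProject
LIST.READU EVERY

Any locks listed against RT_LOG871 or your hash file belong to the dead session; UNLOCK can release them, but only once you are sure the owning process is really gone.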

Before doing anything further, you need to kill all remaining processes related to this job using the Clean Up Resources option in Director.


Ketfos
Xanadu
Participant
Posts: 61
Joined: Thu Jul 22, 2004 9:29 am

Post by Xanadu »

Thanks again Ket.
I always use DS.TOOLS for cleanup.. in fact I just checked, and Clean Up Resources isn't even enabled for my job (don't know why?)
Man, this is crazy..
I just tried deleting the job and it says:
"Error while deleting job ..
RT_CONFIG871 has apparently not been deleted from directory
RT_STATUS871 has apparently not been deleted from directory
RT_LOG871 has apparently not been deleted from directory"

What could have gone wrong? :-S
I deleted the job, but I am sure this problem would persist even when I rewrite the job...

-Xan
dhiraj
Participant
Posts: 68
Joined: Sat Dec 06, 2003 7:03 am

Post by dhiraj »

Clean Up Resources is not enabled in your Director because you've not enabled run-time job administration in your project. To do this, log on to the Administrator, select the project, go to Properties, and select the check box stating "enable run-time job administration in Director". Click OK, close the Administrator and connect to your Director again. Voila, you have Clean Up Resources enabled.

Coming to the real problem: I have encountered this a couple of times. Director shows the job status as running, but the job isn't actually running. To overcome this, open the job using Designer and simply press compile.
Doing this solved my problems more than a few times. I have no justification as to why this happens.
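If you prefer the command line, a reset does a similar cleanup of the job's state; something like this with the dsjob client (project and job names are placeholders):

dsjob -run -mode RESET YourProject YourJob

I can't say why the stale status happens either, but recompiling or resetting rewrites the job's status records, which is presumably why it clears the problem.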

Dhiraj
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The job status you see in the Director status view is the most recent entry the job process (DSD.RUN) was able to make in its RT_STATUSnnn file.

Presumably it was aborted in such an abrupt fashion that it never got the chance to update the RT_STATUSnnn file with a "stopped" or "aborted" or "crashed" status.

This is one of the reasons you should never use kill -9 to stop a DataStage DSD.RUN process.
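For the curious, those status records can be inspected at the TCL prompt (871 being the job number from this thread):

LIST RT_STATUS871

Clearing a stale entry is what the Clear Status File option on Director's Job menu does; that is the safe way to get rid of a phantom "running" status once the processes are truly gone.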
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.