
Job is been accessed by another user

Posted: Mon Dec 19, 2005 11:09 pm
by antojj
Hi All,

I am getting the following error when trying to access a Job.

Problem Desc :
I created a job "Testjob" and have been using it for the past three days, designing various transformations without any problem.

When I tried to access it today, I could not open the job.
Instead, I got the following error message:
"Job TestJob is been accessed by another user".

Can anyone suggest what the problem is and what went wrong?
As far as I know, no other user is accessing the job.

Thanks,

Posted: Tue Dec 20, 2005 12:40 am
by kcshankar
Hi,
To see what's happening, view the job status of "Testjob" in the DataStage Director.
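For a quick look from the command line, the dsjob utility that ships with the DataStage engine can report a job's status (you may need to source dsenv first; the project and job names below are placeholders):

```shell
# Placeholder project/job names -- substitute your own.
PROJECT=dstage
JOB=Testjob

# dsjob ships with the DataStage engine; guard in case it is not on PATH.
if command -v dsjob >/dev/null 2>&1; then
    STATUS=$(dsjob -jobinfo "$PROJECT" "$JOB" 2>&1)
else
    STATUS="dsjob not found -- check the status in Director instead"
fi
echo "$STATUS"
```

This is only a sketch; on a real install dsjob lives under the DSEngine bin directory.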


regards
kcs

Posted: Tue Dec 20, 2005 1:10 am
by Andal
Maybe you closed the Designer with "End Now" when it hung. You can release the job by doing "Cleanup Resources" from the Director.

Posted: Tue Dec 20, 2005 1:17 am
by loveojha2
You can use DS.TOOLS from the DS Administrator to release the locks.

Posted: Tue Dec 20, 2005 11:29 am
by trokosz
The unlock suggestions are good, but just to close the loop: you may also have phantom processes left out there. Even after removing locks you may still get the exclusive-access error, so look for phantoms:

ps -ef | grep phan

kill -9 each phantom
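A slightly safer variant of the above, sketched in shell: list the candidate PIDs first and review them before sending any signal (the `[p]han` bracket trick keeps grep from matching its own process line):

```shell
# Collect candidate phantom PIDs without killing anything yet.
# The [p]han pattern stops grep from matching its own process line.
PIDS=$(ps -ef | grep -i '[p]han' | awk '{print $2}')

if [ -z "$PIDS" ]; then
    echo "no phantom processes found"
else
    # Review the list before acting; kill -9 skips cleanup
    # and can leave locks behind (see the warnings below).
    echo "candidate phantom PIDs: $PIDS"
fi
```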

Posted: Tue Dec 20, 2005 11:41 am
by ml
If this is on a Windows platform,
ps -ef | grep phan
kill -9 each phantom
won't help.

Antojj, you should talk to your DS administrator. If you have admin rights you should be able to close the session from Director/Cleanup Resources or Administrator/Command/DS.TOOLS; but be careful, because you can kill other things as well.

Use the forum search utility to find more detail on how to unlock a job.

good luck,

Posted: Tue Dec 20, 2005 12:17 pm
by ak77
Hi -

I used Cleanup Resources,
I used kill -9,
and I also tried DS.TOOLS, though I don't know how far I did it right.
But some of my jobs and the table definitions are still locked.

I didn't find any phantoms on these jobs.

What else can I do?

Kishan

Posted: Tue Dec 20, 2005 12:57 pm
by ArndW
ak77,

If you used kill -9 then you are guaranteed to leave locks open on jobs and other objects. Even experienced people have trouble locating those locks on busy systems using LIST.READU or the client cleanup tools. Your best bet for the future is to ensure that the deadlock daemon is up and running, so the longest you will have to wait is the default 15 minutes (which you can decrease). Avoid doing a kill -9 at all costs.
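ArndW's warning can be turned into a habit: always send a catchable signal first and escalate only if the process ignores it. A minimal sketch (the function name is made up for illustration):

```shell
# Send TERM first; fall back to KILL only if the process ignores it.
graceful_kill() {
    pid=$1
    kill -TERM "$pid" 2>/dev/null || return 0    # already gone
    for _ in 1 2 3; do
        kill -0 "$pid" 2>/dev/null || return 0   # no such process any more
        sleep 1
    done
    kill -KILL "$pid" 2>/dev/null                # last resort; locks may remain
}

# Demo on a throwaway background process. (A dead child of this shell
# stays visible until wait reaps it, so the loop runs its full course here;
# against an unrelated process it would return as soon as the PID vanished.)
sleep 60 &
BGPID=$!
graceful_kill "$BGPID"
wait "$BGPID" 2>/dev/null || true
```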

Posted: Thu Dec 22, 2005 12:43 am
by ak77
Hi Arnd,

If a job gets locked, does the table definition linked with that job also get locked?

I read before that kill -9 should be avoided, but my system analyst did that when I asked her about a "deadlock waiting for resource" problem.

I will try to avoid this.

Thanks again

Kishan

Posted: Thu Dec 22, 2005 2:55 am
by ArndW
Jobs will get and hold certain locks, but I don't think a metadata table definition gets locked by a job when it uses it.

A normal kill signal is a catchable one, which allows a process that receives it to clean up after itself. A kill -9 signal says "abort right now" and does not give the process time to close files correctly and release locks. A kill -9 should only be used as a last resort, such as when the normal kill modes haven't worked after a couple of minutes, and even then it should be clear that some locks will remain. The easiest solution is to let the deadlock daemon take care of these.
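The difference is easy to see in a small sketch: a worker that traps the normal TERM signal gets to remove its own "lock file" before exiting, which is exactly the cleanup step a kill -9 would skip:

```shell
# A worker that cleans up its lock file when it receives a catchable TERM.
LOCKFILE=$(mktemp)
(
    trap 'rm -f "$LOCKFILE"; exit 0' TERM   # cleanup runs on the catchable signal
    while :; do sleep 1; done
) &
WORKER=$!

sleep 1                  # give the subshell time to install its trap
kill -TERM "$WORKER"     # interruptible: the trap fires and the lock is removed
wait "$WORKER" 2>/dev/null || true

ls "$LOCKFILE" 2>/dev/null || echo "lock released cleanly"
# A kill -KILL (i.e. kill -9) would have left $LOCKFILE behind.
```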