Cannot get exclusive access to executable file for job

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

Ok - I'll try that again. I've just set it off now. I'll let you know if and when it completes.

Thanks
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

I've left the Director running (hanging?) for 1 hour now. I'll check again in another hour.
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

Let me know the size of the RT_LOGNNN file for the job.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

I got the number of the job by doing option 5-11 in Administrator - it told me that 'File "RT_STATUS1361" has been cleared'. I then did SELECT COUNT(*) FROM RT_LOG1361 and Administrator is now hung.
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

I've found the log through unix. It has the following sizes

RT_LOG1361:
total 129816
7421952 DATA.30
59027456 OVER.30


Can I just delete this file through unix?


I also have the following files - should I be getting rid of any of these?
DS_TEMP136 :total 128
RT_BP1361 total 112
RT_BP1361.O total 80
RT_CONFIG1361 total 152
RT_LOG1361 total 129816
RT_STATUS1361 total 16
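For reference, a quick UNIX sketch for spotting oversized RT_LOG hash files like the one above. The function name and the example project path are my own illustrations, not anything from the thread:

```shell
# list_big_logs: report the sizes (in KB) of RT_LOG* hash file directories
# under a project directory, largest first, so oversized job logs stand out.
# DATA.30 is the primary group file and OVER.30 the overflow; a huge OVER.30
# is the usual sign that a log needs clearing.
list_big_logs() {
  du -sk "$1"/RT_LOG* 2>/dev/null | sort -nr | head
}

# Usage (the project path is an assumption; substitute your own):
# list_big_logs /opt/datastage/Projects/MyProject
```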
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

No, you cannot just delete these files, and there is no file you should get rid of. All of these files hold information about the job, and you should not delete any of them at the UNIX level.

The size of RT_LOG1361 is very big, so you have to delete the entries from within this file.

Using uv you might have some alternative for deleting the old, unwanted records. You could use a DELETE SQL statement, but I would not recommend it at all as I have not used it.

What you can do is take an export of the job, delete the job from your project, and then import it again.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

I can't delete the job. I get the message 'Cannot get exclusive access to log for job MyJobName'

I've exported the job ok though.
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

You are unable to get exclusive access, and you know how to get rid of that: go to DS.TOOLS, clear the status file, and release the lock on RT_CONFIGNNN, if one exists at all.

Then, after taking the export, you should be able to delete the job.

I hope this will solve your problem; if not, then tomorrow we will try the last option, which is not at all recommended ......
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

I'd tried that earlier - I tried it again but still no luck.

I think that the problem has now been fixed however.
I reasoned that since I have already got my export done I can take a chance by messing with the hash file for RT_LOG1361.

I deleted the DATA.30 and the OVER.30 and copied in the DATA.30 and OVER.30 from an empty hash file.
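For anyone reading later, here is roughly what that hack looks like as a script. To be clear, this is the poster's unsupported last resort, not a recommended procedure, and the function name and the name of the empty hash file are illustrative:

```shell
# reset_log_hashfile: overwrite a job log's DATA.30 and OVER.30 with copies
# taken from a known-empty type-30 hash file. UNSUPPORTED last resort:
# export the job first, and keep a backup of the broken log directory.
reset_log_hashfile() {
  log_dir=$1     # e.g. RT_LOG1361
  empty_dir=$2   # any empty hash file to copy from (name is hypothetical)
  cp -r "$log_dir" "$log_dir.bak"             # backup of the broken log
  cp "$empty_dir/DATA.30" "$log_dir/DATA.30"  # replace primary group file
  cp "$empty_dir/OVER.30" "$log_dir/OVER.30"  # replace overflow file
}
```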

I then went back into Director and was able to clear the log and open up the empty log without Director hanging.

I then ran my sequence through and it has worked fine (this time).

Thanks for all your help and all your time.

P.S. What was your 'last option which is not at all recommended'?
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

Will discuss the last option later, in case you face this kind of problem again :P

But what you did was also not right ......
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

You knew I'd be back...

I had my job working again for a short time and successfully did some low-volume testing.

I then tried to run it with my full size file and it ran for a couple of hours before it was killed by a dba.

I then tried to access the job this morning on Designer and it hung. I had to end task to get out of it. I also had one other job open for editing at the time.

I then went into Director and found that whenever I click on the category that contains the two jobs Director hangs.

Also, in manager if I try to export the jobs it hangs.

Also, in Designer when I try to edit or copy the jobs I get the 'accessed by another user' message.

I've retried all the techniques mentioned earlier in this topic (DS.TOOLS and killing processes on UNIX).

What should I try next?

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

What do you mean by "killed by DBA"?

If the DBA is killing processes that have open cursors, how is DataStage informed that this is the case? There's a right way and a wrong way. You need to educate the DBA.

Are you trying to do too many insert operations before a commit? Try reducing the rows per transaction. Even better, switch to a bulk load strategy.

When defunct processes hold locks (which leads to "locked by other process") you need to determine which locks (for example through LIST.READU or DS.TOOLS) and which defunct processes (for example through ipcs on UNIX or shrdump on Windows), then use the Administrator command UNLOCK to release the locks.
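The UNIX half of that check can be sketched as below; the function name is my own, and the lock release itself still happens in Administrator with LIST.READU and UNLOCK:

```shell
# find_defunct: list defunct (zombie) processes, whose held locks are the
# usual cause of "locked by other process" errors. A 'Z' in the STAT
# column marks a defunct process.
find_defunct() {
  ps -eo pid,stat,comm | awk 'NR > 1 && $2 ~ /Z/'
}

# Usage: find_defunct        # then inspect shared resources with: ipcs -a
```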

After that you should be able to recover things in the Repository.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
trokosz
Premium Member
Posts: 188
Joined: Thu Sep 16, 2004 6:38 pm
Contact:

Post by trokosz »

You have orphaned processes holding locks which you now must release. The very best way to do this is via Administrator, which yes means dsadm rights. Never just kill the UNIX process; that's a fallacy!

Do this.....

1. Ask the person (login Id) having the issue to get out of DataStage all together....
2. In Administrator click on the Command button
3. Execute LIST.READU and note every Inode and User No for the Login Id that is having the issue....
4. Execute SET.FILE UV VOC UV.VOC
5. Execute COPY FROM UV.VOC TO VOC UNLOCK
6. Execute UNLOCK INODE <Inode Number> ALL and/or UNLOCK USER <user no> ALL
7. Repeat 6 for all Inodes and User Nos
8. Execute 3 and see if the Login Id is gone (if not do 6 until it happens)
9. Call the person and you're all set.....

10. Or enable the dsdlockd daemon by editing its dsdlockd.config file, changing start=0 to start=1. It will start the next time DataStage is started and will automatically do the above periodically.

Hope this helps....
HSBCdev
Premium Member
Posts: 141
Joined: Tue Mar 16, 2004 8:22 am
Location: HSBC - UK and India
Contact:

Post by HSBCdev »

Thanks Ray and Trokosz. I've been away from work for a week and so didn't see your replies until today.

The week off seems to have fixed my job's problems as well - for the time being anyway. The next time I get it into a state I'll refer back to your messages.

Cheers
trokosz
Premium Member
Posts: 188
Joined: Thu Sep 16, 2004 6:38 pm
Contact:

Post by trokosz »

Don't export and re-import; it doesn't solve the problem.

You need to clear ALL outstanding locks for a user (or maybe all users, if they're sharing). Locks do not all have a specific job name associated with them, so you need to find the User Number for the user in question and unlock every lock for that user. MAKE SURE THE USER HAS LOGGED OUT OF ALL DATASTAGE CLIENTS, MEANING DESIGNER, DIRECTOR, MANAGER AND ADMINISTRATOR.

You can do this in Administrator Command by:

LIST.READU EVERY = You will see INODE and USER NO; write down the USER NO for each lock held by the user in question, page by page

Now do SET.FILE UV VOC UV.VOC
COPY FROM UV.VOC TO VOC UNLOCK

Now do UNLOCK USER <User No> ALL
Repeat UNLOCK for all User No's you wrote down

Go and redo a LIST.READU EVERY and that User should be gone.

Now that person is all set.