Unable to compile or open job.

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
arun_im4u
Premium Member
Posts: 104
Joined: Mon Nov 08, 2004 8:42 am

Unable to compile or open job.

Post by arun_im4u »

Hi,

I had a problem opening one of my jobs in DataStage. I cleared the status file, cleaned up resources, and also killed the process in Unix, but that didn't work.

Then I went ahead and deleted the rt_config, rt_status, rt_bp and rt_log files, but that didn't work either.

Now my admin said he can restart the server and everything should be fine.

My question is: will there be any problem if you delete those files? In earlier posts I see people saying never to delete those files.

Is this going to be a problem when we restart the server?

Thanks,
arun.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

What RT_Config files did you delete?

Do you have any backup of the job? You may have deleted your job design, and hence the job itself.
arun_im4u
Premium Member
Posts: 104
Joined: Mon Nov 08, 2004 8:42 am

Post by arun_im4u »

Hi Sainath,

I am actually trying to delete that job, because I have a copy of it. The job does not exist in the Director, but when I click on the category (in DataStage Director) under which this job used to be, it throws a message saying the configuration file is missing for that job name.

In the DataStage Designer, I can see the job. If I try to open it, it gives me the message that the job is being accessed by another user.

All I want is to delete this job, but I am not able to.

The files which i cleared in unix are:
RT_status
rt_log
rt_config
rt_bp
temp_rt

All the files which point to this job. Is there something else that has to be deleted on the server, or will it be fine if I restart the server?

Thanks,
arun.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

NEVER delete the RT_... objects. Or the DS_... objects.

These are tables in the Repository database that store the run-time (and design time) information for the job. They are created when the job is created, and no mechanism exists for creating them subsequently (apart from manual intervention).

You will need to restore them from backup if you want to have any chance at all of compiling the job.

The "locked by another user" condition would have disappeared when DataStage was restarted.

Depending on your version, there may be a button in Administrator called "Cleanup" for the project - I think this was taken out in version 7.1.

On servers prior to version 7.5, DS.TOOLS has an option (option 4) called "Check integrity of job files" which, among other things, deletes orphans caused by incomplete job deletions.

While that option was removed in version 7.5 the underlying command can still be executed. It is executed from the Administrator client command window or within the dssh environment on the server:

Code: Select all

DS.CHECKER JOBS
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
trokosz
Premium Member
Posts: 188
Joined: Thu Sep 16, 2004 6:38 pm
Contact:

Post by trokosz »

You need to remove/delete the locks on the jobs. Ask everyone to log out, and search here for how to remove locks; it has been discussed many times.

In addition:

1. Go to the DataStage Administrator GUI command line and enter "LIST.READU EVERY". This shows you all user locks currently held (implying developers are still connected or phantoms exist).
2. In Unix you can execute "netstat -a | grep dsr" and/or "ps -ef | grep ds", which tells you whether DataStage developers are still connected. Ask them to log off, then proceed to Step 3.

3. Execute a "ps -ef | grep -i osh" to determine whether any phantom parallel jobs are running, and a "ps -ef | grep -i pha" to determine whether any phantom server jobs are running. If there are phantoms, do a kill -9 on them. Once this is done, stop the DataStage Engine.

4. If you stop the DataStage Engine after the due diligence above, and by chance a job was still running, then even after killing the orphaned parallel and server processes there may still be an issue restarting DataStage. This is most likely because one or more orphaned shared memory segments still exist. You would know this because "uv -admin -info" says the DataStage Engine is running, yet when you attempt to log on you get the message "the dsrpcd daemon is not running". To see whether such segments exist, and to remove them, perform the following:

ipcs | grep ade
ipcrm -m {shared memory id}

Once you have performed the commands above, stop and restart the DataStage Engine.
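The shared-memory cleanup in step 4 can be scripted defensively. Below is a minimal sketch (the emit_ipcrm function name is my own, and the ade key prefix is an assumption for illustration, since the prefix varies between installations): it only prints the ipcrm commands so you can review them before actually removing anything.

```shell
#!/bin/sh
# Print (do not run) ipcrm commands for shared memory segments whose
# key starts with a given hex prefix. Reads `ipcs -m`-style output on
# stdin. The prefix ("ade" in the usage example) is an assumption;
# check uvconfig for the key actually configured on your engine.
emit_ipcrm() {
    prefix="$1"
    awk -v p="$prefix" '
        # ipcs -m data lines look like: key shmid owner perms bytes ...
        $1 ~ "^0x" p { print "ipcrm -m " $2 }
    '
}

# Typical use on the server (review the output before executing it):
#   ipcs -m | emit_ipcrm ade
```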
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Keep in mind that the first three hex characters of the signature may not be ade. In releases earlier than 6.0 it was ace, and a different key can be specified in uvconfig; I have seen dae used, for example.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.