
Error Message

Posted: Fri Nov 12, 2004 4:42 pm
by DSmamta
Hello All:

I am able to compile and run my job successfully, and the output is fine. But after some time, when I go back and try to open that job, I get the following error.

"Failed to open RT_LOG2085 file."

This has begun to happen to some of my saved jobs, and to some of my colleagues' jobs as well.

Thinking my job design had become corrupted, I saved it under another file name, compiled it, and ran it. Again it compiled and ran successfully, and the output looked fine. But after some time, when I went back and tried to open it, I got the same error as before.

Failed to open RT_LOG2085 file.

I am unable to see the log in the Director, because when I try to open the Director I get the following error.

"Cannot open executable job file RT_CONFIG2083. "

I click OK. Then I get the next error message:

"Failed to open RT_LOG2085 file."

I click OK again.

I can see other jobs in the Director, but not the ones I got error messages for.

In Director, under Job --> Cleanup Resources --> Processes and Locks (Show All), I do not see any of my jobs, nor anything for RT_LOG2085 or RT_CONFIG2083.

I would appreciate any insight into why I am getting these errors in my Designer and Director.

Regards,
MJ


Posted: Fri Nov 12, 2004 5:16 pm
by ketfos
Hi,
Try the following:
Save the job under a different name.
Run it with a small row limit of 10 rows.
Then check the results.
Try opening the log.


Ketfos

Posted: Fri Nov 12, 2004 5:42 pm
by chulett
The first check I'd make is to ensure you haven't run out of space wherever you have DataStage installed.
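On a Unix-style install, a quick way to do that check is df against the filesystem holding the DataStage engine and project directories. A minimal sketch (the path passed in is a placeholder; point it at your actual install and project locations):

```shell
# Report available space (in KB) on the filesystem holding a given path.
# Pass your DataStage install or project directory; "/" is just a placeholder.
avail_kb() {
  df -Pk "$1" | awk 'NR==2 {print $4}'
}

avail_kb /
```

A full filesystem can leave hashed files such as RT_LOG2085 half-written, which would be consistent with the "Failed to open" symptom.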

Posted: Fri Nov 12, 2004 6:56 pm
by ray.wurlod
Another thing that can cause this error is when the log file for the job has received so many entries that it has reached 2GB.

This is why you should purge log files. Once a log file reaches 2GB it becomes structurally corrupted, unless this has been prevented administratively by converting the file to 64-bit internal addressing.

Sometimes you can use CLEAR.FILE on the log file to recover the situation; however, this does not work in 100% of cases.

Posted: Mon Nov 15, 2004 10:11 pm
by T42
You can also use the Administrator's option to limit the number or age (in days) of job runs kept, and allow DataStage to automatically purge old log entries for you every time you run a job.

You will get an informative message that you can safely ignore.

Posted: Tue Nov 16, 2004 8:28 am
by DSmamta
Hello All:

Thank you all SO much.

It was a combination of all that was mentioned.

Regards,
MJ

Posted: Mon Feb 28, 2005 1:37 am
by Christina Lim
Hello all,

Our DataStage server is shared by many developers accessing different projects.

We've been getting a lot of "Cannot open executable job file RT_CONFIG2083." errors recently.

I suspect it's because the log file is reaching 2GB, as mentioned by Ray.
I would like to clarify some of the things here.

Since the option to automatically purge old log files on each run is controlled at the project level, is it possible that when such an error occurs in one project, it will appear in other projects as well, even though that particular project automates the purge of old log files?