I have a small clarification to make. I deployed a new category of jobs from one environment to another, and I ran a multiple-job compile from DataStage Manager by selecting the jobs under that specific category. I force-compiled all of them. After running the jobs, I could see that they were all still in "Compiled" status even though they ran to "Finished" status. Only the sequence showed "Finished" status; the corresponding jobs it ran showed "Compiled" status even though they completed successfully. I was also unable to see the logs for the individual jobs, whereas I could see the log for the sequence.
My first thought is that you didn't actually run the jobs you thought you had run. What happens if you start one of these jobs manually from the Director - does the status change, and do the log entries show up?
No, these are not multiple-instance jobs, and DataStage Director is connected to the same project where we executed the jobs. Our sequence finished successfully and Director shows its status as successful; it is only the individual jobs executed by the sequence that show "Compiled" status. Also, I could not retrieve the job logs for those jobs, whereas I could retrieve the job log for the sequence.
I have run the sequence from the script again. The sequence shows "Finished" status and the individual jobs show "Compiled" status, but when I looked at the log it says "clearing file RT_STATUS178". Am I missing something?
You have automated log clearing enabled.
Now that you have a job that has already run, if you re-run the script what is the last-run-time of the job you manually started? Has it changed or does it remain the same?
Finished, with the previous run date/time? Please add an after-job ExecSH call with the command "touch /tmp/ThisIsAnEmptyFile.txt". Then compile the job and run the job sequence. Is the state of the job still "Compiled", and does that file exist in /tmp?
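For reference, that check can be sketched as a small shell snippet. The marker-file path is the one named above; the echo messages are illustrative, not DataStage output:

```shell
# After-job ExecSH command suggested above: drop a marker file so we can
# prove the job's after-job subroutine actually executed.
touch /tmp/ThisIsAnEmptyFile.txt

# After the sequence run, check from the command line whether the
# marker file was created:
if [ -f /tmp/ThisIsAnEmptyFile.txt ]; then
    echo "after-job routine ran"
else
    echo "after-job routine did not run"
fi
```

If the file exists but the job still shows "Compiled", the job did run and the problem is with how status/logs are being recorded or read, not with the run itself.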