Incorrect Job Status
Posted: Mon Feb 16, 2004 4:26 am
Hi,
The job design includes a Perl program that calls two jobs serially. The first job reads a couple of sequential files and some Red Brick database tables, and creates output sequential files. These files are used as input by the second job, which reads them and loads a Red Brick table. The Perl program uses the dsjob command to execute each job, and the completion code is used to check the job execution.
my ($DsCommand) = "$DsPath/$DsJob -server aaa -user $DsUser -password $DsPassword -run -mode NORMAL -wait -jobstatus $ParamList $DsProject $JobName > $DwLog/$JobName.log";
system($DsCommand);

# With -jobstatus, dsjob exits with the job's status code. system()
# returns the raw wait status, so shift out the high byte to get it.
$RetValue = $? >> 8;

if ($RetValue == 1) {    # 1 = job finished OK
    return $RetValue;
}
else {                   # anything else is treated as a failure
    $Subject = "Job Failed.";
    $Content = "return code $RetValue.\n";
    $Content .= "Please check the logfile for further details.\n";
    SendMail("pageoper", "$Subject", "$Content");
    return $RetValue;
}
-----------------------
Job status problem 1:
The Perl program starts Job1 and waits for the return code (job status). Job1's status is 'Running' in Director and the output files are not yet fully populated. But sometimes the Perl routine returns the job status as 'Job Finished' and Job2 starts processing. Job2 then loads the Red Brick table with incorrect data because the files were not completely populated by Job1.
The observed pattern is that this situation arises if the DataStage server has been restarted before running the Perl script.
Has anyone come across this kind of situation? Is there anything wrong in the return-code check routine?
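For reference, this is the kind of defensive cross-check I could add before starting Job2: re-query Job1 with dsjob -jobinfo after -wait -jobstatus returns, and refuse to continue unless it also reports the job finished OK. This is a sketch only; the 'Job Status' line pattern and the numeric code in parentheses are assumptions about the -jobinfo output format on my install, not something I have verified:

# Sketch: cross-check Job1's status via dsjob -jobinfo before
# starting Job2. The regex on the 'Job Status' line is an assumption.
my $Info = `$DsPath/$DsJob -server aaa -user $DsUser -password $DsPassword -jobinfo $DsProject $JobName`;
if ($Info =~ /Job Status\s*:.*\((\d+)\)/) {
    my $Status = $1;              # numeric status code from -jobinfo
    die "Job1 not in RUN OK state (status $Status)\n" unless $Status == 1;
}
else {
    die "Could not parse job status from -jobinfo output\n";
}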
Problem 2:
The job log file shows a fatal error followed by a 'job aborted' line, but the Director job view shows the job status as 'Finished' instead of 'Aborted'.