Page 1 of 1

Abnormal Termination

Posted: Tue May 25, 2004 7:49 am
by jenkinsrob
Hi,

I am getting an error saying 'Abnormal Termination of Stage...'. I reset the job as already advised in this forum and I was indeed given some additional information about the error.

This was...

From previous run
DataStage Job 539 Phantom 32704
Job Aborted after Fatal Error logged.
Program "DSD.WriteLog": Line 239, Abort.
Attempting to Cleanup after ABORT raised in stage SC5CSMServiceDayJob..CheckRelation

DataStage Phantom Aborting with @ABORT.CODE = 1

Any ideas what the solution to this is??

Re: Abnormal Termination

Posted: Tue May 25, 2004 8:11 am
by ogmios
Give some information about what you're doing in job SC5CSMServiceDayJob stage CheckRelation. It's probably a coding error.

Ogmios

Posted: Tue May 25, 2004 8:16 am
by jenkinsrob
It's definitely not a coding error - this job has been running on a daily basis for over 3 months now and we have never had this error before...

We are also getting this error in other jobs, and I think it might be a space issue on the file system, but I'm not sure which files are safe to delete other than our data files...
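The low-disk-space suspicion is easy to verify from the OS side before deleting anything. Here is a minimal sketch in Python; the path is a placeholder you would replace with your DataStage project directory:

```python
import shutil

# Placeholder path: point this at the filesystem holding your
# DataStage project (e.g. the directory containing the RT_LOG* files).
project_path = "."

total, used, free = shutil.disk_usage(project_path)
pct_used = 100 * used / total

print(f"Used: {pct_used:.1f}%  Free: {free // (1024 ** 2)} MiB")
if pct_used >= 90:
    # Nearly-full filesystems commonly cause log writes to fail mid-run.
    print("WARNING: filesystem nearly full")
```

If usage is near 100%, that alone can make log writes (like the one in DSD.WriteLog above) abort.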

Posted: Tue May 25, 2004 8:21 am
by roy
Hi,
Welcome aboard :),
Well, it seems like your failure has something to do with your job's log.
The reason could be log corruption, sometimes disk space or permissions (which I doubt), and sometimes even a simple DSLogWarn call after a standard wait-for-job in BASIC can cause something like this.

On Windows, open the DataStage Administrator, click the Projects tab, select your project, and click the Command button.
This will open a new window.
In there, type CLEAR.FILE RT_LOG539
Execute this command to clear your log file and fix any corruption.
Be aware: this log file belongs only to the job that generated this error message (job 539 here)!

Good Luck,

try this?

Posted: Tue May 25, 2004 12:57 pm
by larryoceanview
I have received this error many times when using Aggregator stages. What causes an existing job to blow up is that the input has grown over time and no longer fits in memory.

You can try sorting your data and writing it to a sequential file before using it as input to the Aggregator.
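The reason pre-sorting helps is general: once the input is ordered by the grouping key, an aggregator can finish each group as it streams past instead of holding every group in memory at once. A small Python sketch of that idea (the sample rows are made up for illustration):

```python
from itertools import groupby

# Input already sorted by the grouping key (first field) -
# this is what sorting to a sequential file buys you.
rows = [("a", 1), ("a", 2), ("b", 3), ("b", 4)]

# groupby only ever buffers one group at a time, so memory stays
# bounded by the largest single group, not the whole input.
totals = {
    key: sum(value for _, value in group)
    for key, group in groupby(rows, key=lambda r: r[0])
}

print(totals)  # {'a': 3, 'b': 7}
```

With unsorted input, the same aggregation would need a hash table holding one entry per distinct key for the entire run, which is exactly what grows until it no longer fits.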