Job aborting randomly, sometimes running successfully

Post questions here relating to DataStage Server Edition, for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.


vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

The error message will hopefully also display a list of field values somewhere in the message log, which may help you debug the problem. For example, this is a normal record:
Id=2324
FirstName="Fred"
LastName="Fruit"
This is a record where a non-numeric value has been passed into a numeric field:
Id=AB124
FirstName="Fred"
LastName="Fruit"

Have a look through and see if you can spot a character value put into a numeric field; the numeric fields are easy to find, as their values are not enclosed in quotes. This may help you pin down the cause.
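
If you want the job to catch these rows itself, the check is easy to write as a server routine. Here's a minimal sketch in DS BASIC; the routine name IsValidNumber and its single argument are my own illustration, not anything from your job:

Code:
* IsValidNumber(Arg1) -- illustrative routine, name and argument assumed
* Returns 1 when the value can safely be loaded into a numeric field.
If Num(Arg1) Then
   Ans = 1   ;* numeric strings such as "2324" pass
End Else
   Ans = 0   ;* values such as "AB124" get flagged for rejection
End

You could call this from a Transformer constraint and divert failing rows down a reject link before they abort the job.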

Vincent McBurney
Data Integration Services
www.intramatix.com
spracht
Participant
Posts: 105
Joined: Tue Apr 15, 2003 11:30 pm
Location: Germany

Post by spracht »

Malathi,

We have the same problem sometimes, and in our case it seems to be related to the 'Pre-load file to memory' option in Hashed File stages, especially for larger lookups (500,000 records and more). After disabling this option, the 'Abnormal Termination' no longer occurred.

Stephan
degraciavg
Premium Member
Posts: 39
Joined: Tue May 20, 2003 3:36 am
Location: Singapore

Post by degraciavg »

I encountered this problem before with the Aggregator stage on 4.2. I think this stage has a memory-leak problem. I once filed a case with Ascential tech support but didn't pursue it because I was not able to send them the test data... perhaps you would want to escalate your case to them...

Your error does not look like a memory-leak problem, though; it seems more likely that some of the records you're aggregating are coming in as NULLs. In DataStage, any arithmetic operation involving a NULL value results in NULL, i.e., 1 + NULL = NULL. This scenario might have caused your problem.
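
If that's the cause, a common guard is to substitute zero for NULL before the Aggregator. A minimal sketch as a DS BASIC routine; the name NullToZero and its argument are assumptions of mine, not from your job:

Code:
* NullToZero(Arg1) -- hypothetical helper, name assumed
* Replaces @NULL with zero so that 1 + NULL no longer yields NULL.
If IsNull(Arg1) Then
   Ans = 0
End Else
   Ans = Arg1
End

The same test can sit directly in a Transformer derivation, e.g. If IsNull(InLink.Amount) Then 0 Else InLink.Amount (the link and column names there are assumed too).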

Anyway, here's a tip: the best way to do aggregation is in the database (with the right indexes), not in DataStage.

Cheers,
vladimir
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

Smart work. The Aggregator stage is very dodgy in those old versions and worth avoiding. Another way to replace an Aggregator stage is to load the data into a local UniVerse table and then do a group by against that table.
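
For what it's worth, once the rows are in the UniVerse table, the grouped select can even be fired from a BASIC routine; the table and column names below (STG_SALES, ACCT, AMOUNT) are assumptions for illustration only:

Code:
* Illustrative only -- table and column names are assumed
* After loading rows into a local UniVerse table via a UV stage:
EXECUTE 'SELECT ACCT, SUM(AMOUNT) FROM STG_SALES GROUP BY ACCT;'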

Vincent McBurney
Data Integration Services
www.intramatix.com
spracht
Participant
Posts: 105
Joined: Tue Apr 15, 2003 11:30 pm
Location: Germany

Post by spracht »

Strange: we are on DS 5.2 and encountered frequent 'Abnormal Termination' aborts when aggregating data within UV stages. We worked around it by using Aggregator stages!

Stephan