Abort Code

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
EJRoufs
Participant
Posts: 73
Joined: Tue Aug 19, 2003 2:12 pm
Location: USA

Abort Code

Post by EJRoufs »

I am pulling data off a SQL Server database, writing it to a temporary file, and then trying to summarize that file. I am getting an error during the summarize step:

DataStage Job 200 Phantom 2456
Program "DSP.ActiveRun": Line 51, Exception raised in GCI subroutine:
Access violation.
Attempting to Cleanup after ABORT raised in stage PullSPC1998SS..SumHistory
DataStage Phantom Aborting with @ABORT.CODE = 3

My first question is: why is my program blowing up? I am looking into this at the moment, hoping I can figure it out quickly on my own. My second question, which is maybe more the one I'm hoping to get answered here in the forum: where is a good source of documentation? For example, where do I find out what "ABORT.CODE = 3" is telling me? The help built into DataStage doesn't seem to have the answer, nor does the documentation that came with the install. Is there a good place to search for this type of information?

Thanks for any help. :)
Eric
chucksmith
Premium Member
Posts: 385
Joined: Wed Jun 16, 2004 12:43 pm
Location: Virginia, USA
Contact:

Post by chucksmith »

Could you have mistyped the name of a function in a derivation or routine, or have an incorrect argument list in a call?
EJRoufs
Participant
Posts: 73
Joined: Tue Aug 19, 2003 2:12 pm
Location: USA

Post by EJRoufs »

chucksmith wrote:Could you have mistyped the name of a function in a derivation or routine, or have an incorrect argument list in a call?

It's possible; I'll double-check that. I wouldn't think so in this case, though. I had the entire routine working fine against DB2. All I did was switch the input to a SQL Server table instead, with the same setup, and that part works fine. It doesn't blow up until it gets about halfway through loading the records into the Summing step, which is identical to the Summing step I used with my DB2 read.
Eric
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

After it aborts, have you tried Resetting the job? If you do, is there anything in the log labelled 'From previous run'? This can give more clues as to the cause of the abort.
-craig

"You can never have too many knives" -- Logan Nine Fingers
EJRoufs
Participant
Posts: 73
Joined: Tue Aug 19, 2003 2:12 pm
Location: USA

Post by EJRoufs »

chulett wrote:After it aborts, have you tried Resetting the job? If you do, is there anything in the log labelled 'From previous run'? This can give more clues as to the cause of the abort.
I just reset it, but I'm not getting any more helpful information. I basically have 2 million records I'm trying to sum down. It sends about 1.25 million of those records to the Sum step before it blows up. :(
Eric
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Probably memory / resource related. There are definitely limits to the amount of un-ordered data the Aggregator can handle on any given system, but I don't know of any hard-and-fast rules on how to determine what that limit might be.

Any chance you can sort the incoming data to support your aggregation? That would help the Aggregator out tremendously. Make sure you tell it the 'sort order' of the incoming data - and don't lie! :wink: It will blow up in a different way if you do...
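To illustrate why sorted input matters so much here - this is a hypothetical Python sketch, not DataStage code - an aggregator fed unsorted rows must keep a running total for every distinct key in memory until end of input, while an aggregator fed key-sorted rows can emit each group as soon as the key changes, holding only one running total at a time:

```python
def aggregate_unsorted(rows):
    """Hash aggregation: memory grows with the number of distinct keys."""
    totals = {}
    for key, value in rows:
        totals[key] = totals.get(key, 0) + value
    return totals

def aggregate_sorted(rows):
    """Streaming aggregation over key-sorted input: one group in memory at a time."""
    current_key, running = None, 0
    for key, value in rows:
        if key != current_key:
            if current_key is not None:
                yield current_key, running   # group complete, emit and forget it
            current_key, running = key, 0
        running += value
    if current_key is not None:
        yield current_key, running

rows = [("a", 1), ("a", 2), ("b", 5)]        # already sorted by key
print(aggregate_unsorted(rows))              # {'a': 3, 'b': 5}
print(dict(aggregate_sorted(rows)))          # {'a': 3, 'b': 5}
```

This is also why lying about the sort order breaks things: the streaming version silently closes a group the moment it sees a new key, so an out-of-order row would produce a duplicate, partial group instead of being folded into the right total.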
-craig

"You can never have too many knives" -- Logan Nine Fingers
EJRoufs
Participant
Posts: 73
Joined: Tue Aug 19, 2003 2:12 pm
Location: USA

Post by EJRoufs »

chulett wrote:Probably memory / resource related. There are definitely limits to the amount of un-ordered data the Aggregator can handle on any given system, but I don't know of any hard-and-fast rules on how to determine what that limit might be.

Yes, I believe you are correct. Before, I was pulling in a month at a time and everything was just peachy. Now I am pulling in a year at a time. The smaller amounts seem to work just fine. Thanks! :)

Anybody have any ideas on how to determine my limits, or how to adjust them to handle more?
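One workaround, sketched below in Python purely as an illustration (hypothetical names, not DataStage-specific): since a sum is decomposable, you can aggregate each manageable chunk (e.g. one month) separately and then merge the partial totals into the yearly summary, so no single pass has to hold the whole year:

```python
from collections import Counter

def partial_sums(chunk):
    """Aggregate one manageable chunk (e.g. one month) of (key, value) rows."""
    totals = Counter()
    for key, value in chunk:
        totals[key] += value
    return totals

def combine(partials):
    """Merge per-chunk totals into the final summary; sums merge cleanly."""
    final = Counter()
    for p in partials:
        final.update(p)   # Counter.update adds counts rather than replacing
    return final

jan = [("acct1", 10), ("acct2", 5)]
feb = [("acct1", 3)]
print(combine([partial_sums(jan), partial_sums(feb)]))
# acct1 -> 13, acct2 -> 5
```

Note this only works because addition is associative; an aggregation like a median cannot be split and recombined this way.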
Eric
Post Reply