RANDOM "GCI subroutine: Access violation" WARNING

dec1177
Premium Member
Posts: 22
Joined: Mon Aug 06, 2007 2:26 pm

RANDOM "GCI subroutine: Access violation" WARNING

Post by dec1177 »

All:

I have 3 very simple jobs, each designed like this:

Seq. File ------> Transformer ------> Seq. File

The purpose of these 3 jobs is to take 3 separate flat files, all with the same file format, and merge them into one flat file. The first job clears out the target file, while the second and third jobs append their source file data to the target file.
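Just to spell out what the three jobs do at the file level, here is a rough Python sketch (the file names are made up, and the real work is of course done by the Sequential File and Transformer stages):

# Rough sketch of the file handling the three jobs perform.
# The file names below are hypothetical; all sources share one layout.
SOURCES = ["source_1.txt", "source_2.txt", "source_3.txt"]
TARGET = "target_merged.txt"

for i, src in enumerate(SOURCES):
    mode = "w" if i == 0 else "a"   # job 1 clears the target, jobs 2 and 3 append
    with open(src) as fin, open(TARGET, mode) as fout:
        for line in fin:
            fout.write(line)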

The problem happens at random, and not with just one of the 3 jobs but with any of them. Here is the warning message that is generated:

DataStage Job 40 Phantom 6040
Program "DSD.StageRun": Line 676, Exception raised in GCI subroutine:
Access violation.
Attempting to Cleanup after ABORT raised in stage gp_export_SAP_ARMAST_3000..ARMAST_SAP
DataStage Phantom Aborting with @ABORT.CODE = 3


I have gone into the code for Job 40 and there is no line 676; the job only contains 337 lines.

The result is that the target file ends up with MOST of the data, but not all. For instance, last night when this job sequence ran, the first job threw the above warning. The second and third jobs ran successfully. The resulting target file is missing only 10 rows of data...from the first job's source file, I would presume. It's not the last 10 rows, because a quick search of the target file revealed that those rows were all present.

Any ideas would be much appreciated.
Thanks in advance.
I don't know signatures...
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Line 676 is not from the job, it's from the program noted in the log message - an internal DataStage program called DSD.StageRun, whose source we don't have access to. So...

Doing anything... strange... in the transformer in your derivations? Using stage variables? Etc. If you rerun the jobs with the same input data, does the same error occur? Trying to ascertain just how random your random is.
-craig

"You can never have too many knives" -- Logan Nine Fingers
dec1177
Premium Member
Posts: 22
Joined: Mon Aug 06, 2007 2:26 pm

Post by dec1177 »

chulett wrote: Doing anything... strange... in the transformer in your derivations? Using stage variables? Etc. If you rerun the jobs with the same input data, does the same error occur? Trying to ascertain just how random your random is.
No, the transformer is simply passing along the fields from source to target, except for 4 fields that are not in the source but must be populated in the target; to those I am passing empty strings.
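To be concrete, the derivations amount to nothing more than this (a rough Python sketch; the extra field names are made up):

# Sketch of the transformer logic: pass every source field straight through
# and populate four target-only fields with empty strings.
# The extra field names below are hypothetical.
EXTRA_FIELDS = ["EXTRA_1", "EXTRA_2", "EXTRA_3", "EXTRA_4"]

def transform(source_row):
    target_row = dict(source_row)   # straight pass-through of the source fields
    for name in EXTRA_FIELDS:
        target_row[name] = ""       # not in the source, so populate with an empty string
    return target_row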

I should have mentioned this earlier, but resetting and rerunning the job seems to do the trick MOST of the time. Sometimes it will throw the same warning again, but resetting and rerunning a second time has always gotten it to process all the data.
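For what it's worth, that workaround could be scripted along these lines (a rough Python sketch; the project name is made up, and the exact dsjob switches and exit codes should be checked against your own installation):

import subprocess

# Hypothetical project name; the job name is the one from the log above.
PROJECT = "MyProject"
JOB = "gp_export_SAP_ARMAST_3000"
MAX_ATTEMPTS = 3

def reset_and_run():
    # Run the job in RESET mode to clear the abort, then run it for real.
    subprocess.run(["dsjob", "-run", "-mode", "RESET", PROJECT, JOB], check=True)
    result = subprocess.run(["dsjob", "-run", "-jobstatus", PROJECT, JOB])
    # With -jobstatus, dsjob exits with the job's status; 1 usually means
    # "finished OK" - verify the codes on your release.
    return result.returncode == 1

for attempt in range(MAX_ATTEMPTS):
    if reset_and_run():
        break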
I don't know signatures...