
"DSP.ActiveRun": Line 51, Exception raised in GCI

Posted: Thu Nov 01, 2007 4:34 am
by dsx_newbie
Hi experts,

Has anybody encountered this type of error when reading from an XML file?

"DSP.ActiveRun": Line 51, Exception raised in GCI subroutine:
Access violation.

TIA

Posted: Thu Nov 01, 2007 6:34 am
by chulett
Not that I recall. How large is the file? What is your job design? Reset the job and see if there is a 'From previous run...' message in the log, if so post it.

Posted: Fri Nov 02, 2007 1:21 am
by ray.wurlod
How much physical memory is in your DataStage server machine?

Posted: Sat Nov 03, 2007 3:49 am
by dsx_newbie
ray.wurlod wrote:How much physical memory is in your DataStage server machine? ...
We are using 4 GB RAM for our local development server.

I'm not sure if the error is due to the way I am reading the XML:

Folder stage -> XML Input stage -> Transformer -> flat file. Is this the correct way of reading XML?

Thanks

Posted: Sat Nov 03, 2007 4:12 am
by ArndW
Doing a reset and looking at the log entries as already recommended by Craig will help most at present.

Posted: Tue Sep 08, 2009 10:52 pm
by SHARAD123
Hi All,

I am stuck with the same error.

The log after resetting the job is,

DataStage Job 297 Phantom 6020
Unhandled exception raised at address 0x1005AAD6 : Access violation
Attempted to write to address 0x00000000

Aborting DataStage...

Could someone tell me what the problem could be? The job design reads from a table, aggregates the data, and writes to a sequential file. The volume of data is approximately 131,000,000 rows.

Thanks,
Sharad

Posted: Wed Sep 09, 2009 1:18 am
by ArndW
The error is a null pointer somewhere in the program. That makes it more difficult to analyze. Does the error occur immediately or after running for some time? Does it make a difference if you split the large XML into separate files?

Posted: Fri Sep 11, 2009 2:28 am
by SHARAD123
Hi,

The error is thrown only in the Aggregator stage.

The extract happens from a database through OCI, and then a lookup is done on the data. The processed data are passed to an Aggregator, after which we get the error.

The job runs for an hour and a half and then aborts in the Aggregator stage.

We do not have an XML file in the design. The source is an OCI stage.

Posted: Fri Sep 11, 2009 5:45 am
by chulett
:!: Next time, start your own topic please.

That's too much data for the Aggregator to process in memory. You'll need to sort your data first on the grouping keys and then assert that sort order in the Aggregator stage. Do that properly and it should be able to handle pretty much any volume.
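To illustrate why pre-sorting matters (this is not DataStage code, just a minimal Python sketch; the function name `aggregate_sorted` is hypothetical): when the input arrives sorted on the grouping keys, the aggregator can emit each group as soon as the key changes, holding only one group's running total in memory at a time, instead of accumulating every group until end-of-data.

```python
import itertools

def aggregate_sorted(rows, key_idx=0, val_idx=1):
    """Stream-aggregate rows that are already sorted on the grouping key.

    Because the input is sorted, each group is contiguous, so only the
    current group's running total is held in memory; total row volume
    no longer drives memory use.
    """
    for key, group in itertools.groupby(rows, key=lambda r: r[key_idx]):
        yield key, sum(r[val_idx] for r in group)

# Usage: rows sorted on the first column, as the Aggregator would require.
rows = [("A", 10), ("A", 5), ("B", 7), ("C", 1), ("C", 2)]
print(list(aggregate_sorted(rows)))
# [('A', 15), ('B', 7), ('C', 3)]
```

If the input were unsorted, the same aggregation would have to keep a running total for every distinct key simultaneously, which is what exhausts memory at 131 million rows.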