Memory Fault Error...Parallel job reports failure (code 139)

Posted: Wed Aug 25, 2004 10:18 pm
by akash_nitj
Hi,
I have just installed DataStage 7.1 for our project.
When I try to run a simple job that reads from a sequential file and writes to a sequential file, I get a memory fault error.

The error is Parallel job reports failure (code 139).

Are there any extra settings that need to be configured before running jobs?

........Waiting for a reply
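For context, exit status 139 is the Unix convention for a process killed by signal 11 (SIGSEGV), i.e. a segmentation fault, which matches the "memory fault" wording. A minimal shell sketch of the decoding:

```shell
# A process killed by signal N exits with status 128 + N,
# so 139 = 128 + 11, and signal 11 is SIGSEGV (segmentation fault).
code=139
sig=$((code - 128))
echo "killed by signal $sig"
```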

Posted: Wed Aug 25, 2004 11:31 pm
by ray.wurlod
Are any other errors or warnings reported/logged? What volume of data (approximately) are you trying to process? Do any other applications on that machine generate memory errors? Have you logged a call with your support provider?

Posted: Thu Aug 26, 2004 1:40 am
by richdhan
Hi,

We came across this problem.
Ray wrote: Are any other errors or warnings reported/logged?
Ray, the problem is quite strange. You don't get any other errors or warnings. You get one fatal error stating:

Parallel job reports failure (code 139)

In one of our jobs we were reading from a dataset, but the dataset was empty. We tried to view the data and it failed. Once the dataset was loaded with data, the job ran fine.

Just check that the sequential file you are using is accessible and that you can view its data from DataStage Designer before running the job.

Alternatively, instead of using a Sequential File stage, use a database stage to load the data into a dataset and check what happens.
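A minimal sketch of that pre-run sanity check, assuming a POSIX shell (the helper name and demo file are made up; point it at the actual input file your job reads):

```shell
# Hypothetical helper: verify an input file is readable and non-empty
# before the parallel job touches it (an empty dataset crashed our job).
check_input() {
  [ -r "$1" ] || { echo "unreadable: $1"; return 1; }
  [ -s "$1" ] || { echo "empty: $1"; return 1; }
  echo "ok: $1"
}

tmp=$(mktemp)                  # stand-in for the job's sequential input file
printf 'row1\n' > "$tmp"
check_input "$tmp"
```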

HTH
--Rich

Posted: Thu Sep 08, 2005 7:49 pm
by BobCothroll
I encountered this problem with the same release of DataStage. It was caused by an Oracle 9i extract with the same target field specified twice. Perhaps that was just an "instigator", but when you get one of these esoteric error messages, start with your latest changes. DataStage error messages are often somewhere between nebulous and misleading.

Posted: Fri Sep 09, 2005 2:13 am
by mpouet
Hi,

Are you sure your input file exists?

Matthieu