Parallel job getting aborted

Post questions here related to DataStage Enterprise/PX Edition, covering such areas as parallel job design, parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

ketanshah123
Participant
Posts: 88
Joined: Wed Apr 05, 2006 1:04 am

Parallel job getting aborted

Post by ketanshah123 »

Hi,

A parallel job is aborting with the following error:


Parallel job reports failure (code 139)

Contents of phantom output file =>
RT_SC8832/OshExecuter.sh[20]: 1978564 Segmentation fault
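For reference, "code 139" is the shell's way of reporting death by signal: an exit status above 128 means the process was killed, and the signal number is the status minus 128, so 139 - 128 = 11, which is SIGSEGV (segmentation fault). A quick sanity check from any Unix shell (the exact name printed by `kill -l` can vary slightly between shells):

```shell
# Exit status above 128 = killed by a signal; signal number = status - 128.
echo $((139 - 128))   # prints 11
kill -l 11            # prints the name of signal 11 (SEGV, i.e. segmentation fault)
```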


Job design is

oracle stage ----> transformer -----> lookup ------> transformer -----> oracle stage


Any suggestions on how I can resolve this?

Thanks in advance.
hamzaqk
Participant
Posts: 249
Joined: Tue Apr 17, 2007 5:50 am
Location: islamabad

Post by hamzaqk »

Check whether the NLS setting of the job is the same as the project's, or set it to the project default. If that does not work out, do a search on "segmentation fault" here; it will turn up results pertaining to the same problem...
Teradata Certified Master V2R5
srinivas.g
Participant
Posts: 251
Joined: Mon Jun 09, 2008 5:52 am

Post by srinivas.g »

This is related to a memory allocation problem.

Please check whether null handling is used properly in the transformer stage.
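As a sketch of what "proper null handling" means here: a common cause of Transformer segfaults is a nullable input column reaching a derivation or function that cannot accept a null. A defensive derivation tests for null explicitly before using the value (the link and column names below are hypothetical, for illustration only):

```
If IsNull(lnk_in.AMOUNT) Then 0 Else lnk_in.AMOUNT
```

Any nullable column used unguarded in a derivation, constraint, or function argument is a candidate for this kind of abort, so it is worth auditing every derivation in both transformers.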
Srinu Gadipudi
sjfearnside
Premium Member
Posts: 278
Joined: Wed Oct 03, 2007 8:45 am

Post by sjfearnside »

I am using 8.0.1 and have seen this error message on two different occasions; both times it was caused by a missing or invalid entry in my odbc.ini file.
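That point can be checked directly: every DSN the job references must have a complete, well-formed entry in the odbc.ini file the engine actually reads. A sketch of what a minimal DataDirect-style Oracle entry might look like (the DSN name, driver path, host, port, and SID below are all placeholders, and the exact driver file name depends on your installation):

```
[OracleDSN]
Driver=/opt/IBM/InformationServer/Server/branded_odbc/lib/VMora22.so
Description=DataDirect Oracle Wire Protocol
HostName=dbhost.example.com
PortNumber=1521
SID=ORCL
```

A missing section, a Driver= line pointing at a shared library that does not exist, or a stray character in the file can be enough to crash the connector process with exactly this code 139.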
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Why do you have two transformers? Can they be consolidated into one? Can you split up the job to see where the issue is happening?
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ETLJOB
Participant
Posts: 87
Joined: Thu May 01, 2008 1:15 pm
Location: INDIA

Post by ETLJOB »

I made a copy of my original job and applied a few modifications.

When I ran the job, I got the following error message:

"Contents of phantom output file =>
RT_SC3666/OshExecuter.sh[20]: 2994302 Segmentation fault(coredump)"
"Parallel job reports failure (code 139)"

Finally, I managed to get rid of this weird error by fixing my Oracle query! So I guess there are multiple causes for this code 139 failure, and you can't guess at solutions unless you know what is in the job.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

ETLJOB wrote: So I guess there are more reasons for this failure and you can't guess some solutions unless you know what is there in the job.
I modified your quote slightly. It's a nice observation about troubleshooting in general that some people don't seem to realize, especially from where we sit on the other side of the glass. Sometimes you just gotta be there and have eyes on the target. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers