Hash file making problem

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
rohit.ka07
Participant
Posts: 24
Joined: Thu Aug 12, 2010 11:22 pm
Location: Bangalore

Hash file making problem

Post by rohit.ka07 »

Hi,

I have a job with ODBC as the source, performing a lookup using a hashed file.
The job was running fine yesterday.
Today when I run it, the job aborts with the following error.

Abnormal termination of stage XXXX..TrfDevice detected.

This is the error message after resetting the job:

From previous run
DataStage Job 1209 Phantom 12704
Abnormal termination of DataStage.
Fault type is 11. Layer type is BASIC run machine.
Fault occurred in BASIC program JOB.899766562.DT.1560382767.TRANS3 at address 11e.

Actually, I am mapping four columns from the hashed file to the target, and I am using a fifth column for the lookup.
When I experimented with this, I found that when I map only two columns and hard-code a value (1) for the other two, the job runs successfully.
So I thought there was a problem with a particular column, and I tried mapping only a single column while hard-coding a default value for the rest, to identify which column was causing the problem.
The strange thing is, the job succeeds no matter which single column I map.
The job succeeds if I map any two columns.
The job succeeds if I map any three columns.
The job fails only when I map all four columns.

Please add suggestions on this.
Thanks...
ROHIT K A
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

What is your actual target? Did you search the forums for that "Fault Type is 11" error message?
-craig

"You can never have too many knives" -- Logan Nine Fingers
rohit.ka07
Participant
Posts: 24
Joined: Thu Aug 12, 2010 11:22 pm
Location: Bangalore

Post by rohit.ka07 »

No problem on the target side, I guess...
There are two more transformers after this transformer.
The final transformer feeds an Oracle Bulk Loader stage.

The error is occurring in the Transformer stage reported in the error log,
i.e. the first transformer.
ROHIT K A
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Errors in Server jobs always come from 'active' stages, typically Transformers but that doesn't always mean they are at fault. Me, I would concentrate on the target. Did you search for your error here? You are not the first person to see that particular issue, hence the suggestion.
-craig

"You can never have too many knives" -- Logan Nine Fingers
rohit.ka07
Participant
Posts: 24
Joined: Thu Aug 12, 2010 11:22 pm
Location: Bangalore

Post by rohit.ka07 »

But I suspect this issue is with the hashed file which is used for the lookup.
As I mentioned earlier, the job succeeds when we map any one single column to the next transformer,
and aborts with the error message when I map all columns.

This is a somewhat different issue compared to previous posts, isn't it?

FYI, I checked all previous posts related to the "Fault type is 11. Layer type is BASIC run machine" error message.

Thanks...
ROHIT K A
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Never seen hashed files cause an issue like that. As I said, in your shoes I'd concentrate on your target: the stage and the Oracle client version. That, and involve your official support provider for any low-level fault like that.
-craig

"You can never have too many knives" -- Logan Nine Fingers
rohit.ka07
Participant
Posts: 24
Joined: Thu Aug 12, 2010 11:22 pm
Location: Bangalore

Post by rohit.ka07 »

As I mentioned earlier, my target is an Oracle Bulk Loader stage.

The job design is like this:

ODBC---->Trnsf--->Transf---->Transformer------>Ora_Bulk_Loader

Each transformer is connected to a hashed file for a lookup.

The problem is with the first transformer (the one reported in the error log), which is fed by a hashed file.
ROHIT K A
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York
Contact:

Post by arunkumarmm »

Do you have any stage variables or column derivations that have complex IF..THEN..ELSE logic? If so, can you post them all, from all of the transformers?
Arun
rohit.ka07
Participant
Posts: 24
Joined: Thu Aug 12, 2010 11:22 pm
Location: Bangalore

Post by rohit.ka07 »

Yes, there are two If/Else conditions in the first transformer:

IF NOT(ISNULL(LnkInXXXX.YYYY)) Then 1 Else 0

IF NOT(ISNULL(LnkInXXXX.YYYY)) Then LnkInXXXX.ZZZZ Else 'Undefined'

No other transformations.
Also, there are no transformations in the other two transformers; they are direct mappings.
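For readers unfamiliar with DataStage BASIC derivations, the two expressions above behave roughly like the following Python sketch (purely an illustration, not DataStage code; the function and argument names are hypothetical, and None stands in for a null returned by a failed hashed-file lookup):

```python
# Sketch of the two transformer derivations: a null-flag column
# and a null-safe pass-through with a default value.

def derive(yyyy, zzzz):
    """Mimic the two IF..THEN..ELSE derivations for one row."""
    # IF NOT(ISNULL(LnkInXXXX.YYYY)) Then 1 Else 0
    flag = 1 if yyyy is not None else 0
    # IF NOT(ISNULL(LnkInXXXX.YYYY)) Then LnkInXXXX.ZZZZ Else 'Undefined'
    value = zzzz if yyyy is not None else 'Undefined'
    return flag, value

print(derive('key1', 'dev42'))  # matched lookup row -> (1, 'dev42')
print(derive(None, None))       # lookup miss -> (0, 'Undefined')
```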
ROHIT K A
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York
Contact:

Post by arunkumarmm »

rohit.ka07 wrote:Yes, there are two If/Else conditions in the first transformer:

IF NOT(ISNULL(LnkInXXXX.YYYY)) Then 1 Else 0

IF NOT(ISNULL(LnkInXXXX.YYYY)) Then LnkInXXXX.ZZZZ Else 'Undefined'

No other transformations.
Also, there are no transformations in the other two transformers; they are direct mappings.
These are not complex, though. Can you remove them from the transformer and try running the job?

Are you sure you don't have any other conditions in your transformers, even in the constraints?
Arun
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I'm going to repeat one last time that, IMHO, concentrating on the transformer and/or hashed file lookup is a mistake. Those kinds of errors come from the target, typically Oracle client / driver generated errors, and they can be very odd and very subtle: working for 99.9% of what you do in your jobs and blowing up on the remaining 0.1%. Been there, done that, got the t-shirt, hence the suggestion to check your driver versions and involve support.
-craig

"You can never have too many knives" -- Logan Nine Fingers
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York
Contact:

Post by arunkumarmm »

I had the exact same error, which was due to an incorrect constraint. I corrected that and never had the issue again. This may not be relevant, but that's what I did.
Arun
Post Reply