Hash file making problem
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 24
- Joined: Thu Aug 12, 2010 11:22 pm
- Location: Bangalore
Hi,
I have a job with ODBC as the source, performing a lookup against a hashed file. The job was running fine yesterday; today it aborts with the following error:
Abnormal termination of stage XXXX..TrfDevice detected.
Error message after resetting the job (from previous run):
DataStage Job 1209 Phantom 12704
Abnormal termination of DataStage.
Fault type is 11. Layer type is BASIC run machine.
Fault occurred in BASIC program JOB.899766562.DT.1560382767.TRANS3 at address 11e.
I am mapping four columns from the hashed file to the target, and a fifth column is used as the lookup key.
While working around this, I found that when I map only two columns and hard-code a value (1) for the other two, the job runs successfully. I then suspected a problem with one particular column, so I tried mapping a single column at a time, hard-coding default values for the rest, to identify which column was causing the problem. The strange thing is:
- The job succeeds when I map any single column.
- The job succeeds when I map any two columns.
- The job succeeds when I map any three columns.
- The job fails only when I map all four columns.
Please share any suggestions on this.
Thanks...
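To make the symptom concrete, here is a minimal Python sketch of the lookup pattern described above: four columns mapped from a hashed file plus one lookup key. All column names, key values, and the dictionary contents are invented for illustration; they are not from the actual job. Note that on a lookup miss every mapped column comes back NULL, which hard-coded defaults would mask.

```python
# Hypothetical stand-in for the hashed file: key -> four columns.
hashed_file = {
    "DEV001": {"col1": "a", "col2": "b", "col3": "c", "col4": "d"},
}

def lookup(key):
    """Return the four mapped columns for a key, or None per column
    when the key is absent (a lookup miss)."""
    row = hashed_file.get(key)
    if row is None:
        # On a miss, every mapped column is NULL; hard-coding a
        # default value (as tried above) hides this from the target.
        return {f"col{i}": None for i in range(1, 5)}
    return {f"col{i}": row[f"col{i}"] for i in range(1, 5)}

print(lookup("DEV001"))  # all four columns populated
print(lookup("DEV999"))  # miss: all four columns are None
```

The point of the sketch: mapping fewer real columns does not change which rows miss the lookup, only how many NULLs reach the downstream stage per row.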
ROHIT K A
Errors in Server jobs always come from 'active' stages, typically Transformers, but that doesn't always mean they are at fault. Me, I would concentrate on the target. Did you search for your error here? You are not the first person to see that particular issue, hence the suggestion.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
But I suspect the issue is with the hashed file used for the lookup. As I mentioned earlier, the job runs successfully when I map any single column to the next transformer, and it aborts with the error only when I map all the columns. That seems somewhat different from the issues in the previous posts, doesn't it?
FYI, I have checked all the previous posts related to the "Fault type is 11. Layer type is BASIC run machine" error message.
Thanks...
ROHIT K A
As I mentioned earlier, my target is the Oracle Bulk Loader stage. The job design is like this:
ODBC ----> Transformer ----> Transformer ----> Transformer ----> Ora_Bulk_Loader
Each transformer is connected to a hashed file for a lookup. The problem is with the first transformer (the one reported in the error log), which is connected to a hashed file.
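The job shape described above can be sketched in Python as a source row passed through three lookup stages in order. The stage names, keys, and column values below are invented for illustration only; in the real job each stage is a Transformer with its own hashed file.

```python
# Hypothetical hashed files, one per transformer stage.
# Each lookup merges its matched columns into the row; a miss adds nothing.
lookups = [
    {"A": {"col1": "x"}},   # hashed file for transformer 1
    {"A": {"col2": "y"}},   # hashed file for transformer 2
    {"B": {"col3": "z"}},   # hashed file for transformer 3
]

def run_row(row):
    """Pass one source row through the three lookup stages in order."""
    for table in lookups:
        row.update(table.get(row["key"], {}))
    return row

out = run_row({"key": "A"})
print(out)  # {'key': 'A', 'col1': 'x', 'col2': 'y'}  (third lookup misses)
```

Because the stages run in sequence, a fault reported against the first transformer can still surface only when its full set of mapped columns flows onward to later stages and the target.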
ROHIT K A
-
- Participant
- Posts: 246
- Joined: Mon Jun 30, 2008 3:22 am
- Location: New York
- Contact:
rohit.ka07 wrote: Yes, there are two If-Else conditions in the first transformer:
IF NOT(ISNULL(LnkIn.XXXX.YYYY)) Then 1 Else 0
IF NOT(ISNULL(LnkInXXXX.YYYY)) Then LnkInXXXX.ZZZZ Else 'Undefined'
No other transformations. Also, there are no transformations in the other two transformers; it is direct mapping.
These are not complex, though. Can you remove them from the transformer and try running the job? Are you sure you don't have any other condition in your transformers? Even in the constraints?
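For clarity, the two quoted derivations amount to the following null handling, sketched here in Python. The column names are kept as the placeholders from the post; this is an analogue of the logic, not DataStage BASIC itself.

```python
def flag(yyyy):
    # IF NOT(ISNULL(LnkIn.XXXX.YYYY)) Then 1 Else 0
    return 1 if yyyy is not None else 0

def mapped(yyyy, zzzz):
    # IF NOT(ISNULL(LnkInXXXX.YYYY)) Then LnkInXXXX.ZZZZ Else 'Undefined'
    return zzzz if yyyy is not None else "Undefined"

print(flag(None), mapped(None, "dev"))  # 0 Undefined  (lookup miss)
print(flag("v"), mapped("v", "dev"))    # 1 dev        (lookup hit)
```

Logic this simple should not crash on its own, which is why removing it is a quick way to rule the derivations in or out.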
Arun
I'm going to repeat one last time that, IMHO, concentrating on the transformer and/or the hashed file lookup is a mistake. Those kinds of errors come from the target, typically Oracle client / driver generated errors, and they can be very odd and very subtle, working for 99.9% of what you do in your jobs and blowing up on 0.1%. Been there, done that, got the t-shirt, hence the suggestion to check your driver versions and involve support.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers