Weird Errors

Post questions here relating to DataStage Server Edition, for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

bdstage
Charter Member
Posts: 59
Joined: Mon Apr 03, 2006 4:59 pm

Weird Errors

Post by bdstage »

Hi all,

We have a job that has been failing for the past couple of days. On the first run I got the following error:

Code: Select all

J_Fact_PS_F_CI_BILLING_FAR..HASH_PS_D_BUSINESS_UNIT_1.DRS_PS_D_BUSINESS_UNIT_in: WriteHash() - Write failed for record id '03687
GL
NAFTA'
When I recompiled the job and ran it again, I got the following errors:

Code: Select all

J_Fact_PS_F_CI_BILLING_FAR..DRS_PS_CI_FAR_BIL_LKP: Using NLS map MS1252

J_Fact_PS_F_CI_BILLING_FAR..DRS_PS_CI_FAR_BIL_LKP: Client Library property required for stage J_Fact_PS_F_CI_BILLING_FAR.DRS_PS_CI_FAR_BIL_LKP

Attempting to Cleanup after ABORT raised in stage J_Fact_PS_F_CI_BILLING_FAR..DRS_PS_CI_FAR_BIL_LKP

Is this something to do with NLS?

After I imported the job from another environment and restarted the DataStage services, the job ran fine for a couple of runs.

But then the same sequence of errors repeated.

Also, FYI, the same job runs fine in another environment with the same data.

Please help me. Thanks in advance for the reply.

Thanks,
Pavan.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

The first error, a failed write to a hashed file, is most likely caused by the file exceeding the approximate 2GB limit. Can you check the size of this file? (If it is a default dynamic file, the file will actually be a directory containing two visible files and one hidden one; the DATA.30 file is most likely going to be the large one.)
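If it helps, something like this quick sketch (the account-relative path "HASH_PS_D_BUSINESS_UNIT" is only my guess at your file name - substitute your own) will report the size from a server routine:

Code: Select all

* Sketch: report the on-disk size of the hashed file from DS BASIC.
* "HASH_PS_D_BUSINESS_UNIT" is an assumed path - substitute your own.
Cmd = "du -sk HASH_PS_D_BUSINESS_UNIT"
Call DSExecute("UNIX", Cmd, Output, SystemReturnCode)
Call DSLogInfo("Hashed file size (KB): " : Output, "SizeCheck")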
I think the second issue should be handled once you've fixed the first.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Not necessarily. The layout of the error message suggests that the key value used contains two "mark characters". By default, hashed file keys are not permitted to have mark characters (dynamic array delimiter characters) in them, with the exception of the "separator character" for hashed files with a multi-column key (and even then, only the character specified in the @KEY_SEPARATOR entry in the dictionary, by default @TM, is permitted).
If you did not put them in explicitly, you may have done so implicitly, for example by using the Fmt() function. Or you've tried to use three key columns when the hashed file is defined as having some number other than three.
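For example, a width format folds an over-length value, inserting text marks between the segments - a quick sketch you can run in a test routine to see it:

Code: Select all

* Fmt() with a width format folds an over-length string, inserting
* text marks (@TM, CHAR(251)) between the folded segments.
Folded = Fmt("NAFTA-GL", "3L")
* Folded is now "NAF" : @TM : "TA-" : @TM : "GL" - two marks appear.
Call DSLogInfo("Marks inserted: " : Count(Folded, @TM), "FmtDemo")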
Last edited by ray.wurlod on Tue Oct 10, 2006 2:00 pm, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bdstage
Charter Member
Posts: 59
Joined: Mon Apr 03, 2006 4:59 pm

Post by bdstage »

Hello ArndW,

I checked the hashed file size and it is only 88 KB. We are not loading even 1,000 records into the hashed file. I am not sure what is causing the problem here. One thing that confuses me is that the same job works fine in another environment.

Thanks,
Pavan.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Ray - that normally happens in DS when you have a composite key; the @TM separators inserted by DS tend to display as separate lines.
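A quick sketch using the values from the error message shows why - the @TM between key parts breaks the record id across lines in the log:

Code: Select all

* A composite key is the column values joined by @TM, so the log
* prints the record id across one line per key column.
Key = "03687" : @TM : "GL" : @TM : "NAFTA"
Call DSLogInfo("Write failed for record id '" : Key : "'", "KeyDemo")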
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

I just saw your reply, bdstage. In your DS job, do you have a composite key (i.e. more than one column marked "key")? If not, then you have illegal characters in your key. If yes, does this error happen on the first write (turn off write caching, if enabled, to check)? And if so, are you certain you have the OS permissions to write to this file?
bdstage
Charter Member
Posts: 59
Joined: Mon Apr 03, 2006 4:59 pm

Post by bdstage »

Yes, ArndW. I have a composite key for the hashed file. I am not getting this error every time. Also, I have the required OS permissions to write to the file.

Thanks,
Pavan.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Just out of curiosity, do you have write cache enabled in the hashed file?
-craig

"You can never have too many knives" -- Logan Nine Fingers
bdstage
Charter Member
Posts: 59
Joined: Mon Apr 03, 2006 4:59 pm

Post by bdstage »

Yes, Chulett. I have write cache enabled. Will it make any difference?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sure, it makes a difference - that's why it's there as an option. :wink:

In my experience, you can see messages like this logged as warnings in a job when the memory allocated to the write cache is depleted. They are just warnings, however, and not a sign of a failure to write the records to the actual hashed file. Interestingly enough, they also don't seem to count against whatever warning threshold you have set for the job. :?

Just last weekend I killed off a stray UNIX phantom that, three hours after the job completed, was still writing thousands of messages like this to its log.

Turn it off, see if the warnings go away.
-craig

"You can never have too many knives" -- Logan Nine Fingers
DS_SUPPORT
Premium Member
Posts: 232
Joined: Fri Aug 04, 2006 1:20 am
Location: Bangalore

Re: Weird Errors

Post by DS_SUPPORT »

bdstage wrote:
J_Fact_PS_F_CI_BILLING_FAR..DRS_PS_CI_FAR_BIL_LKP: Client Library property required for stage J_Fact_PS_F_CI_BILLING_FAR.DRS_PS_CI_FAR_BIL_LKP
I suspect that in your DRS stage you have used some job parameters, and those parameters may not be defined, or may not contain any value, in the user-defined environment variables.
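One way to check (a sketch only - "DSN" is a hypothetical parameter name, use the ones your DRS stage actually references) is a before-job subroutine like this:

Code: Select all

* Sketch: verify that a parameter the DRS stage needs has a value.
* "DSN" is a hypothetical parameter name - use your real ones.
ParamValue = DSGetParamInfo(DSJ.ME, "DSN", DSJ.PARAMVALUE)
If ParamValue = "" Then
   Call DSLogWarn("Parameter DSN has no value", "ParamCheck")
End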
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

ArndW wrote:...(turn off write caching, if enabled, to check)...
The same thing as Craig said - turn the write cache off to see where your error is really occurring.
bdstage
Charter Member
Posts: 59
Joined: Mon Apr 03, 2006 4:59 pm

Post by bdstage »

Yes, Craig and ArndW. I understand that it makes a difference. I changed the job accordingly and it is running fine now.

Thanks a lot for the replies.

Thanks,
Pavan.