UVOpen Unable to Open File Hash File Error

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

UVOpen Unable to Open File Hash File Error

Post by yinyin61 »

hi there,

I'm currently having this problem when running jobs. It appears every time I run a sequence of jobs, and it occurs randomly across the different jobs from time to time. Please refer to the screen below for the abort message box:

[Screenshot of the abort message box]

As you can see, it states that it is unable to open the hash file Hash_Before_BK_C_1, which serves as an output reference file.

The weird thing is, there is another file with a similar name, earlier in the job, that serves as an input hash file, called Hash_Before_BK_C. Both files reside in the same directory. I opened that Hash_Before_BK_C input hash file and it is indeed created, with data inside.

When I tried opening Hash_Before_BK_C_1, sometimes it gives an error that the file does not exist and sometimes it opens fine (the latter suggesting the error resolves itself)...

In short, the only difference between these two is that one is an input file while the other is used as an output reference file.

These hash file stages reside in a shared container, which different jobs with similar processing access, each passing in a specific name for the files.

My problem is that I am getting the error randomly from job to job. Sometimes it occurs for job A, sometimes it doesn't but happens for job B, and so on...

This is pretty unnerving because this random error is making the data migration very unstable.

Please advise.
Thank you.

Regards,
Aileen Chong
Software Engineer
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

The first likely cause is access rights - if another user created this hashed file and the umask wasn't set correctly, the userid trying to read/write the file might not have sufficient UNIX permissions on the hashed file (if it is a type 30 Dynamic file, the path shown will be a directory under which you will have 3 files). The error message you pasted into your question - does it come from a read, a write (with or without a create) or a reference lookup?
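For illustration, a quick permission/existence check could be done with a small DS BASIC sketch along the lines below (the path and the routine name "CheckHashPerms" are examples only, not the poster's real objects; a plain ls -ld on the directory at the UNIX prompt tells you the same thing):

* Minimal sketch, assuming the hashed file is pathed - the path is an example only.
HashPath = "/data/hash/Hash_Before_BK_C_1"

* OPENPATH opens a hashed file directly by its OS path rather than via a VOC entry.
OPENPATH HashPath TO HashFile THEN
   Call DSLogInfo("Open OK - this userid can open " : HashPath, "CheckHashPerms")
   CLOSE HashFile
END ELSE
   Call DSLogWarn("Cannot open " : HashPath : " - check the directory exists and this userid has rwx permission on it", "CheckHashPerms")
END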
yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

UVOpen Unable to Open File Hash File Error

Post by yinyin61 »

Hi Andrew,

Thanks for the prompt reply. That error comes from a reference lookup, where the hash file is looked up and the details are sent to and compared in a transformer.

The input hash file stage counterpart has 'Create file' enabled, is set as a Type 30 Dynamic file, and clears the file before writing.

For the output hash file stage (Reference lookup) the pre-load to memory has been disabled.

And this is randomly occurring... it may happen to one job at one time and, at another time, to other jobs...

Thanks in advance.
ArndW wrote:The first likely cause is access rights - if another user created this hashed file and the umask wasn't set correctly, the userid trying to read/write the file might not have sufficient UNIX permissions on the hashed file (if it is a type 30 Dynamic file, the path shown will be a directory under which you will have 3 files). The error message you pasted into your question - does it come from a read, a write (with or without a create) or a reference lookup?
Last edited by yinyin61 on Thu Jan 05, 2006 2:23 am, edited 1 time in total.
Thank you.

Regards,
Aileen Chong
Software Engineer
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Then I am almost certain that access rights are the culprit, especially since you have a shared container that can be used by different userids.
yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

UVOpen Unable to Open File Hash File Error

Post by yinyin61 »

Hi again Andrew,

I disagree about the userid, because there is basically only one user accessing that environment... it's just different jobs...

And this error is very random, varying from one run to the next in which job it hits... The jobs that do not hit the error obtain the reference successfully.

E.g. Run session 1

1) Job 1 - SUCCESS
2) Job 2 - FAIL
3) Job 3

session 1 re-run

1) Job 1 - FAIL
2) Job 2
3) Job 3

Thanks in advance. ;-)
ArndW wrote:Then I am almost certain that access rights are the culprit, especially since you have a shared container that can be used by different userids.
Thank you.

Regards,
Aileen Chong
Software Engineer
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

I re-read your original post; originally I just remembered your statement "it happens every time I run..." but then I saw that it fails at different places in the same sequence.

This leads me to look at a system-level restriction on the number of files that can be open concurrently at any given time. Look into your uvconfig file and check what MFILES and T30FILES are set to. If you are using the default Type 30 (Dynamic) files in your jobs, then the value of T30FILES might not be large enough - which could explain the seemingly random behaviour encountered when opening your files. The MFILES parameter can be raised at the same time as T30FILES; if it is set too low the effect is seen in performance, and there are no errors directly associated with having a minimal setting.
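If it helps, one quick way to log the current values from inside DataStage is a small DS BASIC sketch like the one below (it assumes $DSHOME is set in the environment of the userid running the job and that the "UNIX" shell type is appropriate for your install; at the UNIX prompt a plain grep of $DSHOME/uvconfig does the same job):

* Minimal sketch - grep the two tunables out of uvconfig and write them to the job log.
Cmd = "grep -E '^(MFILES|T30FILES)' $DSHOME/uvconfig"

* DSExecute runs the command in an OS shell and captures its output.
Call DSExecute("UNIX", Cmd, Output, SysRet)
Call DSLogInfo("uvconfig tunables:" : Char(10) : Output, "CheckUvconfig")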
yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

Post by yinyin61 »

ArndW wrote:I re-read your original post; originally I just remembered your statement "it happens every time I run..." but then I saw that it fails at different places in the same sequence.

This leads me to look at a system-level restriction on the number of files that can be open concurrently at any given time. Look into your uvconfig file and check what MFILES and T30FILES are set to. If you are using the default Type 30 (Dynamic) files in your jobs, then the value of T30FILES might not be large enough - which could explain the seemingly random behaviour encountered when opening your files. The MFILES parameter can be raised at the same time as T30FILES; if it is set too low the effect is seen in performance, and there are no errors directly associated with having a minimal setting.
Hi Andrew,

I've checked the T30FILES and MFILES settings; they are 200 and 50 respectively.

I have compared the development environment (which has no such issues occurring) with the production environment, and they are set the same.

Could a difference in resource allocation affect the way these settings behave? My speculation is that an overloaded server like production has more processes competing for those resources, hence the failure, whereas development does not have as many processes running, hence no errors. Is that a possibility?

I actually reviewed this factor (the file settings) last night, but I had more or less ruled it out as the culprit because the DataStage settings in prod and dev are the same.

Thanks in advance.
Thank you.

Regards,
Aileen Chong
Software Engineer
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Both those values are very small and should be raised immediately to improve performance, if nothing else. These numbers are not per-process but per-system so if you have more than a couple of jobs running at the same time they will quickly consume your available resources. Since you've stated that your production machine has more DS activity than development I would recommend you make this change first and see if your error persists.
yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

UVOpen Unable to Open File Hash File Error

Post by yinyin61 »

ArndW wrote:Both those values are very small and should be raised immediately to improve performance, if nothing else. These numbers are not per-process but per-system so if you have more than a couple of jobs running at the same time they will quickly consume your available resources. Since you've stated that your production machine has more DS activity than development I would recommend you make this change first and see if your error persists.
Hi ArndW,

I have tried increasing them and it still fails. I have noticed something, though; could you verify this?

Before a job runs, does the DS system check whether the hash file resource is there in the given directory, and only then run the job? Since the hash files are created during the job run, and I had emptied the hash directory prior to the run, could this cause the error?

Thanks in advance.
Thank you.

Regards,
Aileen Chong
Software Engineer
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

In order for changes to the uvconfig file to take effect, you need to run uvregen and restart the DataStage server - have you done that?

One possible cause of the error message is trying to open a hashed file that is not there; so your deleting the hashed files in that directory could be a cause. If your DataStage jobs are written so that they create them before trying to write to them (this is an option in the hashed file stage) then that shouldn't be a problem.

Please refrain from deleting OS objects such as hashed files unless you are sure you know what you are doing. If you were to delete a hashed file that is not declared with a path in a DS job it would cause the job to fail the next run - since the VOC entry would remain and DS would think that the file exists when in reality it has been deleted.
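To make that distinction concrete, here is an illustrative DS BASIC sketch (the file name and path are examples only, and the DSExecute shell types "UV" for engine commands and "UNIX" for OS commands are assumed to match your install):

* Illustrative only - the hashed file name and path are examples, not real objects.

* Account-based hashed file (declared in the VOC, no path given in the stage):
* DELETE.FILE removes the data file, its dictionary and the VOC entry together,
* so DataStage is not left with a VOC pointer to a file that no longer exists.
Call DSExecute("UV", "DELETE.FILE Hash_Before_BK_C_1", Output, SysRet)

* Pathed hashed file (full directory path given in the stage): removing the
* directory at OS level is only safe if the job recreates it, e.g. via the
* stage's 'Create file' option.
* Call DSExecute("UNIX", "rm -rf /data/hash/Hash_Before_BK_C_1", Output, SysRet)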
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Re: UVOpen Unable to Open File Hash File Error

Post by chulett »

yinyin61 wrote: Before a job runs, does the DS system check whether the hash file resource is there in the given directory, and only then run the job? Since the hash files are created during the job run, and I had emptied the hash directory prior to the run, could this cause the error?
Depending on your exact job design, this could very well be the cause of your issue. When using a hashed file as a Reference Lookup, yes - one of the first things the job will do is try to open it. If it doesn't exist, your job will abort with that error.

Some people are fooled because they are writing to (and creating) that same hashed file in the same job but downstream of the first lookup. This is too late and will still generate the error unless the hashed file is precreated. There is a 'trick', using a Transformer as a source, that you can use in conjunction with that reference lookup to create it in that job - if that is the issue you are having.
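As a complement to (or instead of) that trick, one option is a small before-job routine that checks the lookup file up front, so the job fails fast with a clear message rather than the UVOpen abort. This is purely a sketch with hypothetical names, using the standard before/after subroutine arguments:

* Hypothetical before-job routine body. InputArg and ErrorCode are the standard
* before/after subroutine arguments; pass the full hashed file path as the input value.
HashPath  = InputArg
ErrorCode = 0                 ;* zero lets the job continue

OPENPATH HashPath TO HashFile THEN
   CLOSE HashFile             ;* the file exists and this userid can open it
END ELSE
   Call DSLogWarn("Reference hashed file not found or not openable: " : HashPath, "CheckHashExists")
   ErrorCode = 1              ;* non-zero stops the job before it runs
END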
-craig

"You can never have too many knives" -- Logan Nine Fingers
yinyin61
Participant
Posts: 28
Joined: Mon Nov 07, 2005 7:40 pm

Re: UVOpen Unable to Open File Hash File Error

Post by yinyin61 »

chulett wrote:
yinyin61 wrote: Before a job runs, does the DS system check whether the hash file resource is there in the given directory, and only then run the job? Since the hash files are created during the job run, and I had emptied the hash directory prior to the run, could this cause the error?
Depending on your exact job design, this could very well be the cause of your issue. When using a hashed file as a Reference Lookup, yes - one of the first things the job will do is try to open it. If it doesn't exist, your job will abort with that error.

Some people are fooled because they are writing to (and creating) that same hashed file in the same job but downstream of the first lookup. This is too late and will still generate the error unless the hashed file is precreated. There is a 'trick', using a Transformer as a source, that you can use in conjunction with that reference lookup to create it in that job - if that is the issue you are having.
Hi Ray and Arnd,

Thanks for your replies.

As Ray has pointed out... as well as another source I have found... it's not advisable to have both an input stream into a hash stage and a reference off the same file in the same job. I have modified the job as instructed by Ray as well as another friend online, and placed a source Transformer before the hash reference. However, I am unable to determine the result because another error occurs, one that I have also posted about just recently. It has to do with this:

JOB_STG_Staff_Contact_C..ShrContSTGcurrentM.SRT_01.Lnk_SRT_01: dsintbuf_getrow() - row has 66 columns when 6 expected

I just don't understand why these errors are appearing on the server I am on now... when I previously tried this out on the test and development server, no such error occurred. And why do these errors appear randomly among the 30+ jobs I have? All 30 jobs share the same container but take turns accessing it.

However, I shall discuss that in the posted topic. Thanks guys! ;-)
Thank you.

Regards,
Aileen Chong
Software Engineer
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Well, I guess if it ain't Arnd helping out, it must be Ray. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers