
Unable to create operating system file

Posted: Thu Oct 07, 2004 11:54 am
by sumitgulati
Hi All,

I am facing a strange problem. I have a job that creates a hash file. The job was working fine until yesterday, but now it aborts with the following error message:
DSD.UVOpen Creating file "PS_GL_ACCOUNT_TBL_FIN" as Type 30.
Unable to create operating system file "PS_GL_ACCOUNT_TBL_FIN".
Any idea why this is happening?

Regards,
Sumit

Posted: Thu Oct 07, 2004 12:01 pm
by sumitgulati
The job's behaviour is actually unpredictable. At times it aborts with the hash file creation error message, and at times it runs successfully.

This suggests there is no problem with the job itself. What else could be the problem?

Thanks and Regards,
-Sumit

Posted: Thu Oct 07, 2004 12:22 pm
by kcbland
Are you running a lot of jobs at that particular moment that are also creating hash files?

Posted: Thu Oct 07, 2004 12:22 pm
by ketfos
Hi,
Have you tried using a different name?
Does a file with this name already exist?
Have your permissions changed?
Do you have enough disk space?
Are you creating the hash file using an account or a directory path? (A quick shell check for the first four is sketched below.)
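
For the first four questions, a quick pass from the Unix shell usually settles them. A minimal sketch, run from the project directory (the file name is the one from the error; everything else is standard Unix):

Code: Select all

df -k .                          # free space on the filesystem holding the project
ls -ld PS_GL_ACCOUNT_TBL_FIN     # does the directory already exist, and with what owner/permissions?
ls -l PS_GL_ACCOUNT_TBL_FIN      # a healthy Type 30 file contains DATA.30 and OVER.30
id                               # the user and groups the job actually runs as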

Ketfos

Re: Unable to create operating system file

Posted: Thu Oct 07, 2004 12:47 pm
by mayank007
Hi Sumit,

I was also getting this problem. This is a problem with the server machine. The DataStage server would need to be restarted (not just the services). You may have to raise a ticket with your sysadmin for a hard restart of the machine. This solution solved my problem and I guess it would be helpful for you as well. You could also have a look at unnecessary files (logs and hash files) and try to delete them. At times the error log will also mention VOC entries in the operating system.

Mayank :D

Posted: Thu Oct 07, 2004 2:46 pm
by sumitgulati
There does not seem to be any problem with the hash file name or the permissions, because the job runs successfully at times.
The disk space still available is close to 6 GB.
I am creating the hash file using an account.
ketfos wrote:Hi,
Have you tried using a different name?
Does a file with this name already exist?
Have your permissions changed?
Do you have enough disk space?
Are you creating the hash file using an account or a directory path?

Ketfos

Re: Unable to create operating system file

Posted: Thu Oct 07, 2004 2:48 pm
by sumitgulati
Mayank, I tried restarting the system but the problem still persists. It still behaves inconsistently - runs successfully at times and aborts at times.

Regards,
-Sumit

mayank007 wrote:Hi Sumit,

I was also getting this problem. This is a problem with the server machine. The DataStage server would need to be restarted (not just the services). You may have to raise a ticket with your sysadmin for a hard restart of the machine. This solution solved my problem and I guess it would be helpful for you as well. You could also have a look at unnecessary files (logs and hash files) and try to delete them. At times the error log will also mention VOC entries in the operating system.

Mayank :D

Posted: Thu Oct 07, 2004 3:23 pm
by Athorne
On a somewhat related note, is there a way to increase the MINIMUM.MODULUS of a hash file without removing and recreating it? The HFC program suggests the command:

Code: Select all

CREATE.FILE <FILE NAME> DYNAMIC MINIMUM.MODULUS 23 32BIT

I try that and it says the file already exists, which is true: it does exist. Is there an UPDATE.FILE-type command or something similar that does this, or do I have to remove the hash file and create it anew on the command line? I know you are thinking RTFM, :oops: but I have searched several places and can't find it, plus for some reason my xterm display is not showing the content when I do a HELP from the uvsh command line.

Thanks,
Andy

Posted: Thu Oct 07, 2004 3:38 pm
by kcbland
Thread hijacker! DYNAMIC hash files do not use the resizing commands, only static hashed files do. You must whack the file and recreate it. If the data is important enough to save, rename the hash file directory and recreate the new file under the old name. Then use a job to copy the data from the old file to the new one.

Posted: Thu Oct 07, 2004 3:41 pm
by ketfos
Hi,
Use the CONFIGURE.FILE command to change dynamic file parameters for existing dynamic files.

Use the ANALYZE.FILE command to verify the changes.
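
For example, to raise the minimum modulus of an existing dynamic file and then inspect the result (the file name and value below are only illustrative):

Code: Select all

CONFIGURE.FILE PS_GL_ACCOUNT_TBL_FIN MINIMUM.MODULUS 23
ANALYZE.FILE PS_GL_ACCOUNT_TBL_FIN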

Ketfos

Posted: Thu Oct 07, 2004 3:43 pm
by Athorne
kcbland wrote:Thread hijacker!
:roll: Guilty as charged... :P

Thanks for the answer.

Andy

Posted: Thu Oct 07, 2004 4:33 pm
by ray.wurlod
Hijacker hijack! 8)
DYNAMIC hash files do not use the resizing commands, only static hashed files do.
Dynamic hashed files DO use the RESIZE command. The "modulo" figure is interpreted as minimum modulus, the "separation figure" is mapped to group size (sep 8 = group size 2, any other sep = group size 1).

Good practice for persistent dynamic hashed files (for example in a UniVerse database) is occasionally to pack up the groups using:

Code: Select all

RESIZE filename * * *

One can also convert a dynamic hashed file to 64-bit addressing using

Code: Select all

RESIZE filename * * * 64BIT

Posted: Thu Oct 07, 2004 5:12 pm
by sumitgulati
We figured out the problem.

Since so many users were on this server, we were sometimes exceeding the maximum number of files the DataStage server can have open at a time. The limit can be raised by increasing the MFILES setting in uvconfig. The other way to resolve the issue is to limit the number of concurrent users.

We increased the MFILES setting and it is now working fine.
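
For anyone hitting the same wall, here is a sketch of the procedure we followed, assuming a Unix install with the standard $DSHOME layout (the editor and values are only examples; T30FILE, the limit on concurrently open dynamic files, is often raised at the same time):

Code: Select all

cd $DSHOME
bin/uv -admin -stop     # stop the DataStage server first
vi uvconfig             # raise MFILES (e.g. 50 -> 200); consider raising T30FILE too
bin/uv -admin -regen    # regenerate the binary configuration from uvconfig
bin/uv -admin -start    # restart the server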

Thanks for your suggestions.

Regards,
-Sumit

Posted: Thu Oct 07, 2004 8:28 pm
by kcbland
:oops: Whoa, did I ever get that one wrong... :oops:

Sorry folks, for some reason it stuck in my brain that DYNAMIC hash files aren't resizable. I'm suffering from a severe lack of donuts...

Posted: Thu Oct 07, 2004 8:30 pm
by kcbland
sumitgulati wrote:We figured out the problem.

Since so many users were on this server, we were sometimes exceeding the maximum number of files the DataStage server can have open at a time. The limit can be raised by increasing the MFILES setting in uvconfig. The other way to resolve the issue is to limit the number of concurrent users.

We increased the MFILES setting and it is now working fine.

Thanks for your suggestions.

Regards,
-Sumit
Hey, that's where I was heading with my previous question. When you get a lot of jobs running (Server or PX, it makes no difference) and clients connected, you will reach certain limits and weird things start to happen.
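
If you suspect you are getting close to those limits, a few quick checks are worth running first. A minimal sketch, assuming a Unix host and dsadm as the DataStage administrative user (both assumptions; adjust for your site):

Code: Select all

ulimit -n                        # per-process open-file limit for the current user
grep MFILES $DSHOME/uvconfig     # current rotating file pool size
grep T30FILE $DSHOME/uvconfig    # maximum concurrently open dynamic (Type 30) files
lsof -u dsadm | wc -l            # rough count of files currently open by dsadm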