I am facing a strange problem. I have a job that is creating a hash file. The job was working fine until yesterday but now it is aborting with the following error message:
```
DSD.UVOpen Creating file "PS_GL_ACCOUNT_TBL_FIN" as Type 30.
Unable to create operating system file "PS_GL_ACCOUNT_TBL_FIN".
```
Are you running a lot of jobs at that particular moment that are also creating hash files?
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Hi,
Have you tried using a different name?
Does a file with this name already exist?
Have your permissions changed?
Do you have enough disk space?
Are you creating the hash file using an account or a directory path?
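For what it's worth, the first few of those checks can be scripted from the shell. A minimal sketch — the project path is a placeholder, not something from this thread:

```shell
# Quick environment checks for the UVOpen/Type 30 create failure.
# PROJ is a placeholder -- point it at your real project (account) directory.
PROJ="${PROJ:-/tmp}"

df -k "$PROJ"        # enough free space on that filesystem?
ls -ld "$PROJ"       # do you still have write permission there?

# Does a hash file of that name already exist as an OS-level directory?
if [ -d "$PROJ/PS_GL_ACCOUNT_TBL_FIN" ]; then
    echo "hash file directory already exists"
fi
```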
I was also getting this problem. It is a problem with the server machine: the DataStage server itself would need to be restarted (not just the services). You may have to raise a ticket with your sysadmin for a hard restart of the machine. This solution solved my problem and I guess it would be helpful for you as well. You could also have a look at the unnecessary files (logs and hash files) and try to delete them. At times the error log will also talk about VOC entries in the operating system.
There does not seem to be any problem with hash file name and the permissions because the job runs successfully at times.
The disk space still available is close to 6 GB.
I am creating the hash file using account.
ketfos wrote:Hi,
Have you tried using a different name?
Does a file with this name already exist?
Have your permissions changed?
Do you have enough disk space?
Are you creating the hash file using an account or a directory path?
Mayank, I tried restarting the system but the problem still persists. It still behaves inconsistently - runs successfully at times and aborts at times.
Regards,
-Sumit
mayank007 wrote:Hi Sumit,
I was also getting this problem. It is a problem with the server machine: the DataStage server itself would need to be restarted (not just the services). You may have to raise a ticket with your sysadmin for a hard restart of the machine. This solution solved my problem and I guess it would be helpful for you as well. You could also have a look at the unnecessary files (logs and hash files) and try to delete them. At times the error log will also talk about VOC entries in the operating system.
On a somewhat related note, is there a way to increase the MINIMUM.MODULUS of a hash file without removing and recreating it? The HFC program suggests the command CREATE.FILE <FILE NAME> DYNAMIC MINIMUM.MODULUS 23 32BIT
I tried that and it says the file already exists, which is a fact: it does exist. Is there an UPDATE.FILE-type command or something similar that does this, or do I have to remove the hash file and create it anew on the command line? I know you are thinking RTFM, but I have searched several places and can't find it; plus, for some reason my xterm display is not showing the content when I do a HELP from the uvsh command line.
Thread hijacker, DYNAMIC hash files do not use the resizing commands; only the static hash files do. You must whack the file and recreate it. If it's important enough to save the data, rename the hash file's directory and recreate a new file under the old name, then use a job to copy the data from the old file to the new one.
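A hedged sketch of that rename-and-recreate dance (file and path names here are examples, not from the thread; note that because the hash file was created "using account", a VOC pointer still names the old file and has to be dropped before CREATE.FILE will accept the name again):

```
$ cd /path/to/Project                # your project (account) directory
$ mv MYHASH MYHASH.OLD               # set the old data aside

$ $DSHOME/bin/uvsh                   # TCL prompt in the account
>DELETE VOC MYHASH                   # drop the now-dangling file pointer
>CREATE.FILE MYHASH DYNAMIC MINIMUM.MODULUS 23 32BIT
>SETFILE /path/to/Project/MYHASH.OLD MYHASH.OLD OVERWRITING
>QUIT
```

Then use a DataStage job to copy the rows from MYHASH.OLD into MYHASH, and delete the old file when you are satisfied.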
Kenneth Bland
DYNAMIC hash files do not use the resizing commands; only the static hash files do.
Dynamic hashed files DO use the RESIZE command. The "modulo" figure is interpreted as minimum modulus, the "separation figure" is mapped to group size (sep 8 = group size 2, any other sep = group size 1).
Good practice for persistent dynamic hashed files (for example in a UniVerse database) is occasionally to pack up the groups using RESIZE filename * * * (the asterisks mean "keep the current type, modulus and separation").
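As a concrete uvsh transcript (file name borrowed from the error at the top of the thread; the file must not be open by any job or client while you do this):

```
>RESIZE PS_GL_ACCOUNT_TBL_FIN * * *
```

Because all three parameters are "keep current", this simply rebuilds the dynamic file in place, compacting overflowed groups without changing its type, minimum modulus or group size.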
Since so many users were on this server, we were sometimes exceeding the maximum number of files the DataStage server can open at a time. The limit can be raised by increasing the MFILES value; the other way to resolve the issue is to limit the number of users.
We increased the MFILES value and it is now working fine.
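For reference, a hedged sketch of how that limit is raised (the tunable is MFILES in $DSHOME/uvconfig; the value shown is only an example, and the engine must be stopped and the configuration regenerated before it takes effect):

```
# $DSHOME/uvconfig -- rotating file pool size (default is typically 50)
MFILES 250

# then, as the DataStage administrator:
#   cd $DSHOME
#   bin/uv -admin -stop
#   bin/uvregen
#   bin/uv -admin -start
```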
Sorry folks, for some reason it was stuck in my brain that DYNAMIC hash files aren't resizable. I'm suffering from a severe lack of donuts...
Kenneth Bland
Since so many users were on this server, we were sometimes exceeding the maximum number of files the DataStage server can open at a time. The limit can be raised by increasing the MFILES value; the other way to resolve the issue is to limit the number of users.
We increased the MFILES value and it is now working fine.
Thanks for your suggestions.
Regards,
-Sumit
Hey, that's where I was heading with my previous question. When you get a lot of jobs running (Server or PX, it doesn't make a difference) and a lot of clients connected, you will reach certain limits and weird things start to happen.
Kenneth Bland