UVOpen mkdbfile: cannot create file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

UVOpen mkdbfile: cannot create file

Post by maheshsada »

Hi

In my job, I select data from a table (1,077,070 rows) and write it to a hashed file, which another job then uses as a lookup.
This job had been running fine for more than a year; suddenly it fails with the error DSD.UVOpen mkdbfile: cannot create file /opt/etlbatch/hashfiles/HAdcCharge_026

I have checked the free space on the filesystem:
13631488 11554281 1948288 86% /opt/etlbatch

I have searched the forum but could not get a clear picture.

regards
Magesh S
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Verify the permissions on the directory, the umask setting in the ds.rc script, and the userid and its group.
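A minimal sketch of that checklist as shell commands. Note the directory below is a stand-in created with mktemp so the sketch runs anywhere; on the real server, set HASHDIR to /opt/etlbatch/hashfiles (the path from the error message) and run it as the userid that executes the job:

```shell
# Diagnostic sketch of the checklist above. HASHDIR is a stand-in (mktemp)
# so the sketch is runnable as-is; on the real server set it to
# /opt/etlbatch/hashfiles and run as the job's userid.
HASHDIR=$(mktemp -d)

id                    # effective user and group memberships
umask                 # file-creation mask (ds.rc can override the login value)
ls -ld "$HASHDIR"     # directory owner, group, and mode
df -k "$HASHDIR"      # free space: mkdbfile also fails on a full filesystem

# The decisive test: can this user create a file where mkdbfile will try to?
if touch "$HASHDIR/.perm_probe" 2>/dev/null; then
    rm -f "$HASHDIR/.perm_probe"
    RESULT="writable"
else
    RESULT="NOT writable by $(id -un)"
fi
echo "$HASHDIR is $RESULT"
rmdir "$HASHDIR"      # clean up the stand-in directory
```

If the probe fails while space looks fine, the problem is ownership, mode, or a umask change rather than disk usage.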
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

What has changed?

Is there any other message, such as failed to delete /opt/etlbatch/hashfiles/HAdcCharge_026 ?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

Post by maheshsada »

No, it is not giving any other message.

The day before, the job had failed because of a space problem on the server, so we deleted some unwanted files to free up space. When we restarted the job, it failed with the error below.
We reset the job, and the error message is:

DSD.UVOpen Unable to open file '/opt/etlbatch/hashfiles/HAdcCharge_026'.

Do we need to delete the hashed files in the hash directory on the server and then continue? Any suggestions welcome.

regards
Magesh S
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Magesh - note that you are now showing a different error than originally, which is why Ray asked the question. You have a pointer in your VOC file that points to a non-existent file on the system. I think that if you modify your hashed file settings to also "Delete file before create", it will remove the VOC pointer to the file as well, and then there will be no problem re-creating the UNIX objects.

You can also take a small hashed file and copy it to the '/opt/etlbatch/hashfiles/HAdcCharge_026' location if this file is being cleared before use.
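If the stale pointer needs to be removed by hand instead, one way (a sketch, assuming a default server install where the engine shell is reachable via $DSHOME; the record name is taken from the error message, and the project path is a placeholder) is to delete the dangling VOC record from the project:

```
# From the UNIX prompt, as a DataStage user:
cd /path/to/your/project      # the project directory, not the hashfiles dir
. $DSHOME/dsenv               # source the DataStage environment
$DSHOME/bin/uvsh              # start the engine shell

# At the engine (TCL) prompt:
DELETE VOC HAdcCharge_026     # remove the dangling file pointer
QUIT
```

After that, a run with the "Create file" option checked can rebuild both the VOC entry and the UNIX-level file.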
maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

Post by maheshsada »

Hi

I am showing the error that I got in the Director log; there is no log entry saying that a delete failed. Moreover, in the job design for the hashed file stage, in the update action section:

"Clear file before writing" is checked, and

the "Create file" option is unchecked.

Magesh S
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

To get back into sync, check both the "Create file" and "Delete file before create" options.
maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

Post by maheshsada »

Hi
Thank you. I checked the create and delete options, and the job completed.
Magesh S