UVOpen mkdbfile: cannot create file
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 69
- Joined: Tue Jan 18, 2005 12:15 am
UVOpen mkdbfile: cannot create file
Hi,
My job selects data from a table (1077070 rows) and writes it to a hashed file, which is then used as a lookup by another job.
The job has been running fine for more than a year, but it suddenly started failing with the error: DSD.UVOpen mkdbfile: cannot create file /opt/etlbatch/hashfiles/HAdcCharge_026
I have checked the free space in the directory:
13631488 11554281 1948288 86% /opt/etlbatch
I have searched the forum but could not get a clear picture.
Regards,
Magesh S
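For anyone hitting this later: "cannot create file" can occur even when df shows free blocks, because an exhausted inode table also stops file creation. A quick shell-side sanity check (it defaults to the current directory so it runs anywhere; on the server you would point it at /opt/etlbatch/hashfiles):

```shell
# Check both free blocks and free inodes on the filesystem that holds the
# hashed files. Pass the hashed-file directory as the first argument.
DIR=${1:-.}

df -k "$DIR"   # free 1K blocks (the figures quoted in the post)
df -i "$DIR"   # free inodes -- a full inode table also blocks file creation
```

If `df -i` shows the filesystem at or near 100% IUse, freeing space alone will not fix the error; some files (inodes) have to be deleted.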
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
-
- Participant
- Posts: 69
- Joined: Tue Jan 18, 2005 12:15 am
No, it is not giving any other message.
The day before, the job had failed because of a space issue on the server, so we deleted some unwanted files to free up space. When we restarted the job, it failed with the error above.
After we reset the job, the error message is:
DSD.UVOpen Unable to open file '/opt/etlbatch/hashfiles/HAdcCharge_026'.
Do we need to delete the hashed files in the hash directory on the server and then continue? Any suggestions welcome.
Regards,
Magesh S
Magesh - note that you are now showing a different error than originally, which is why Ray asked the question. You have a pointer in your VOC file that points to a non-existent file on the system. I think if you modify your hashed file settings to also "Delete file before create", it will remove the VOC pointer to the file as well, and you should then have no problem re-creating the UNIX objects.
You can also copy a small hashed file to your '/opt/etlbatch/hashfiles/HAdcCharge_026' location if this file is being cleared before use.
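This diagnosis can be confirmed from the shell: if the OS-level hashed file is gone while the VOC entry survives, you get exactly this mismatch. A minimal sketch (the path is the one from the error message; nothing DataStage-specific is assumed):

```shell
# If the UNIX object is missing while the UniVerse VOC still holds a pointer
# to it, opens fail until the two are brought back in sync.
check_hashed_file() {
    if [ -e "$1" ]; then
        echo "OS object exists: $1"
    else
        echo "OS object missing: $1 -- a stale VOC pointer would explain the DSD.UVOpen error"
    fi
}

check_hashed_file "/opt/etlbatch/hashfiles/HAdcCharge_026"
```

As I understand it, from the UniVerse TCL prompt the DELETE.FILE command removes both the VOC entry and the OS-level files in one step, which is effectively what the "Delete file before create" job option does for you.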
-
- Participant
- Posts: 69
- Joined: Tue Jan 18, 2005 12:15 am
In order to get back in sync, check both the create option and the delete-before-create option.