I am running a job which loads 5 hashed files from a single Oracle table.
The job aborted. The reason I found in Director is that it couldn't open a hashed file. The path I gave is correct.
The same thing is happening in another job where I am loading a hashed file from a sequential file. There it is unable to open the sequential file.
Can anyone help me?
thanks,
Praneeth
unable to open a hash file
Is the hashed file supposed to exist prior to these jobs running? If so then make sure that the job that creates the hashed file does so with the appropriate directory path and permissions prior to this job running.
If the file exists and this job can't see it, then it's either the file name, path or the permissions.
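Those three checks (name, path, permissions) can be done from the Unix shell before digging into the job design. A minimal sketch, assuming a dynamic (type 30) hashed file, which at the O/S level is a directory holding DATA.30 and OVER.30 files; the path `/tmp/demo_hashed_file` is invented for illustration, so substitute your real path:

```shell
#!/bin/sh
# Hypothetical path for illustration; substitute your real hashed file path.
HF=/tmp/demo_hashed_file

# Stand-in setup so the sketch is self-contained: a dynamic hashed file
# is a directory containing DATA.30 and OVER.30 files.
mkdir -p "$HF" && touch "$HF/DATA.30" "$HF/OVER.30"

# The checks the job effectively needs to pass: the directory exists,
# is searchable, and its data file is readable by the running user.
if [ -d "$HF" ] && [ -x "$HF" ] && [ -r "$HF/DATA.30" ]; then
  echo "hashed file reachable"
else
  echo "unable to open hashed file"
fi
```

Run the equivalent checks (`ls -ld` on the directory and its contents) as the user the DataStage jobs actually run under, not as your own login.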
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Also, if this job is supposed to create the hashed file, make sure the Create File checkbox is checked.
Welcome aboard!
Kenneth Bland
creating in the same job
I am creating the file in the same job.
It worked fine until yesterday, before I reset the job. But after the job was reset, it shows the error:
DSD.UVOpen Unable to open file
I even tried deleting the files that were created and starting over from the beginning, but no use.
Re: creating in the same job
Does it give the same error if you view data in the hashed file? (Right-click and choose View Data on both the input and the output link.) That way at least we will know that the hashed file is created and there is some problem accessing it while running the job.
Re: creating in the same job
pkomalla wrote: I even tried by deleting the files which are created and started from first ..But no use....

Are the hashed files in an external path, or did you leave the project as the default option? If you used the project as the default option, you CANNOT remove the hashed file's contents from Unix using the "rm" command.
Consider reading the fall newsletter to learn more about hashed files and how they work.
If a job is supposed to create a hashed file and can't, your issue is permissions or another process created the same named hashed file at that exact same moment.
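To make the "rm" point concrete: an account-local (project-default) hashed file has both a VOC entry inside the project and an O/S-level directory, and `rm -rf` removes only the latter, leaving a dangling pointer, which is one way to end up at DSD.UVOpen "Unable to open file". The sketch below only simulates that layout with plain files and made-up paths; it is not real engine internals. The supported cleanup is from inside the engine (for example the UniVerse TCL DELETE.FILE command, which drops the data and the VOC entry together).

```shell
#!/bin/sh
# Rough simulation of a project-default hashed file: a VOC entry inside
# the project plus an O/S-level directory. All paths are invented.
PROJECT=/tmp/demo_project
mkdir -p "$PROJECT/VOC" "$PROJECT/MYHASH"
echo "F MYHASH" > "$PROJECT/VOC/MYHASH"   # stand-in for the VOC pointer

# What "rm" from Unix actually removes: only the O/S-level directory.
rm -rf "$PROJECT/MYHASH"

# The pointer survives, but its target is gone -- the engine still
# "knows" the file name yet cannot open it:
if [ -f "$PROJECT/VOC/MYHASH" ] && [ ! -d "$PROJECT/MYHASH" ]; then
  echo "dangling VOC entry: unable to open file"
fi
```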
Kenneth Bland
1. If you are not able to do a View Data on the hashed file, then the process creating that hashed file is definitely wrong. Maybe you gave totally different file names on the output tab and the input tab. Make sure you have given the same file name for writing and for reading.
2. As far as I know, Cleanup Resources is for releasing job locks and the like, not for stopping the job. If the job is not stopping, click the Stop button more than once.