unable to open a hash file

Post questions here relating to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

pkomalla
Premium Member
Posts: 44
Joined: Tue Mar 21, 2006 6:18 pm

unable to open a hash file

Post by pkomalla »

I am running a job that loads 5 hashed files from a single Oracle table.

The job aborted. The reason I found in Director is that it couldn't open a hashed file. The path I gave is correct.

The same thing is happening in another job where I am loading a hashed file from a sequential file. There it is unable to open the sequential file.


Can anyone help me?

thanks,

Praneeth
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Is the hashed file supposed to exist prior to these jobs running? If so, make sure the job that creates the hashed file does so with the appropriate directory path and permissions before this job runs.

If the file exists and this job can't see it, then it's either the file name, the path, or the permissions.
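
One quick way to rule out the path and permission possibilities is to look at the hashed file from the Unix side. A rough sketch, assuming a dynamic (type 30) hashed file written to a made-up path; substitute your own path and run it as the user the job runs under:

Code: Select all

    # Placeholder path: does the hashed file exist where the stage points?
    ls -ld /data/myproject/hash/MY_HASHED_FILE
    # A dynamic hashed file is a directory containing DATA.30 and OVER.30
    ls -l /data/myproject/hash/MY_HASHED_FILE
    # The parent directory must be writable to create the file in the first place
    ls -ld /data/myproject/hash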
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Also, if this job is supposed to create the hashed file, make sure the Create File checkbox is checked.

Welcome aboard!
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
pkomalla
Premium Member
Posts: 44
Joined: Tue Mar 21, 2006 6:18 pm

creating in the same job

Post by pkomalla »

I am creating the file in the same job.

It worked fine yesterday, before I reset the job.

But after the job was reset, it shows the error:

DSD.UVOpen Unable to open file

I even tried deleting the files that were created and starting over from the beginning, but no use.
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Re: creating in the same job

Post by DeepakCorning »

Does it give the same error when you view data in the hashed file? (Right-click and select View Data on both the input and the output.) By this, at least we will know whether the hashed file is created and there is just some problem accessing it while running the job.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Re: creating in the same job

Post by kcbland »

pkomalla wrote: I even tried deleting the files that were created and starting over from the beginning, but no use.
Are the hashed files created with an external path, or did you leave the project as the default option? If you used the project as the default option, you CANNOT remove the hashed file from Unix using the "rm" command.

Consider reading the fall newsletter to learn more about hashed files and how they work.

If a job is supposed to create a hashed file and can't, your issue is permissions, or another process created a hashed file with the same name at that exact moment.
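
For a hashed file created in the project (the account) rather than at an external path, the supported way to drop it is from the engine's TCL prompt, not "rm", because the file also has a VOC entry that "rm" leaves behind. A minimal sketch, assuming a UniVerse-based server engine; the project path and the hashed file name LOOKUP_HASH are placeholders:

Code: Select all

    # Run as the DataStage administrator on the server
    cd /your/projects/MyProject        # the DataStage project directory
    . $DSHOME/dsenv                    # source the engine environment
    $DSHOME/bin/uvsh                   # enter the engine's TCL prompt
    # then, at the TCL prompt:
    #   DELETE.FILE LOOKUP_HASH
    #   QUIT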
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
pkomalla
Premium Member
Posts: 44
Joined: Tue Mar 21, 2006 6:18 pm

Post by pkomalla »

The same error is shown while trying to view data.

And one more thing: the job keeps running even after I stopped it from Director.

I tried using the compile option in Designer.

In Director there is an option called Cleanup Resources which helps to kill a job, but I couldn't understand how to use it.
DeepakCorning
Premium Member
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Post by DeepakCorning »

1. If you are not able to do a View Data on the hashed file, then the process creating that hashed file is definitely not correct. Maybe you gave completely different file names in the output tab and the input tab. Make sure you have given the same file name for both writing and reading.

2. As far as I know, Cleanup Resources is for "releasing" jobs and locks, not for stopping the job. If the job is not stopping, click the Stop button more than once.
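
On point 2, when a server job refuses to stop from Director, the leftover engine process can usually be spotted from Unix as well. A hedged sketch; the job name MyJob is only a placeholder, and killing processes is best left to the DataStage administrator:

Code: Select all

    # Server jobs run as "phantom" processes; the command line normally shows DSD.RUN plus the job name
    ps -ef | grep DSD.RUN | grep MyJob | grep -v grep
    # If Cleanup Resources and repeated Stop clicks do not clear it, the administrator
    # can kill the listed PID and then Reset the job in Director:
    #   kill <pid>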