Clearing hash file with CLEAR.FILE command

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

SonShe
Premium Member
Posts: 65
Joined: Mon Aug 09, 2004 1:48 pm


Post by SonShe »

I want to clear a hash file with a before-job command "CLEAR.FILE hashfilename". When the hash file was created in the default project directory, this worked. However, now that the hash file is created in a separate directory outside the project directory, it looks like CLEAR.FILE is no longer working.

I need to clear the file. How can I make the CLEAR.FILE command work when the hash file is in a separate directory? I will appreciate any help.

Thanks.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

You can tick the box saying 'clear before writing'.
SonShe
Premium Member
Posts: 65
Joined: Mon Aug 09, 2004 1:48 pm

Post by SonShe »

Sainath, thanks for the reply. There are three input links writing data to the hash file, which is why I clear the file using the CLEAR.FILE command before the job starts. I believe in this case I cannot tick the clear box in the hash file stage. Please clarify.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Have a pre-job clear it for you: a job that writes to the hash file with 'clear before writing' ticked but with no rows passed to it, so only the clear takes effect.
SonShe
Premium Member
Posts: 65
Joined: Mon Aug 09, 2004 1:48 pm

Post by SonShe »

Thanks again, Sainath. I liked the idea!
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Another option would be to establish a VOC record for the hash file so that CLEAR.FILE works on your pathed hash file. Search the forum for the syntax, as it's been posted many times here.
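For reference, a sketch of that approach: the VOC pointer is usually created with the UniVerse SETFILE verb from the Administrator command window (or uvsh), after which the pathed file can be addressed by name. The path and file name below are hypothetical; substitute your own.

```
SETFILE /data/hash/MyHash MyHash OVERWRITING
CLEAR.FILE MyHash
```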

Yet another option would be to simply leverage your O/S. Since pathed hash files are created by the mkdbfile command, which works strictly at the O/S level, you could write a script that is called pre-job to delete the existing structures and recreate them with the same command. You could also just do the delete and let the job recreate them, if taking the defaults is OK.
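A minimal sketch of that pre-job script, under stated assumptions: the hash file path and the engine home are hypothetical, and `30 1 4 20 50 80 1628` are the commonly quoted mkdbfile defaults for a type-30 (dynamic) file; check your own install before relying on them.

```shell
#!/bin/sh
# Pre-job cleanup sketch -- path, file name, and engine home are hypothetical.
# A pathed hash file is just an O/S directory (a dynamic file keeps its
# DATA.30 and OVER.30 inside it), so it can be deleted and recreated directly.

DSHOME=${DSHOME:-/opt/dsengine}   # assumed DataStage engine home

recreate_hash() {
    hashdir=$1
    rm -rf "$hashdir"             # drop the existing file structures
    if [ -x "$DSHOME/bin/mkdbfile" ]; then
        # Recreate with the commonly quoted defaults for a dynamic file;
        # adjust the tuning values to suit your data.
        "$DSHOME/bin/mkdbfile" "$hashdir" 30 1 4 20 50 80 1628
    else
        echo "mkdbfile not found; letting the job recreate $hashdir"
    fi
}

# Example call against a hypothetical hash file location
recreate_hash /data/hash/MyHash
```

Called via an ExecSH before-job subroutine or from a sequence, this keeps the clear entirely at the O/S level.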

Food for thought...
-craig

"You can never have too many knives" -- Logan Nine Fingers
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

I agree with Craig's idea but did not mention it in my reply because there is a degree of risk involved in performing operations outside DataStage - for instance, when you migrate jobs from dev to test to prod.