
Hash file Overwrite or Append by Default ?

Posted: Tue Nov 04, 2003 9:14 am
by raju_chvr
I have hash-files all over in my jobs.

*) I want to know whether, by default, these hash files are overwritten or appended to every time those jobs are executed.

*) Also, is there any harm in checking the 'Clear file before writing' box on the Input tab for the Hash File stage?

*) And why is there a 'Create File' option, and the 'Options' tab that goes with it?

Posted: Tue Nov 04, 2003 9:34 am
by kcbland
Check out this post; I believe it answers all of your questions and more:

viewtopic.php?t=85364&highlight=hash+files+abused

Re: Hash file Overwrite or Append by Default ?

Posted: Tue Nov 04, 2003 10:12 am
by raju_chvr
Kenneth,

That was a wonderful explanation of hash file usage.

NEVER MIND, I FOUND THE ANSWER!!

Thanks for your help ANYWAY.

Re: Hash file Overwrite or Append by Default ?

Posted: Tue Nov 04, 2003 10:56 am
by kcbland
raju_chvr wrote:I have hash-files all over in my jobs.

*) I want to know whether, by default, these hash files are overwritten or appended to every time those jobs are executed.

*) Also, is there any harm in checking the 'Clear file before writing' box on the Input tab for the Hash File stage?

*) And why is there a 'Create File' option, and the 'Options' tab that goes with it?
1. If you don't check the clear box, the default is not to clear. This means that existing rows whose primary key matches the incoming rows will be overwritten, and non-matching rows will be added to the hash file (see the sketch after this list). You really should read through what I wrote in that post, because it explains this in detail.

2. Clearing the file before writing is okay if you want an empty file. If there's data in it that you need, well, it's gone. If you have multiple instances of a job simultaneously accessing the hash file and clear file is checked, each instance will clear the file regardless of whether another instance has already cleared it and started adding data. This would be bad.

3. 'Create File' is there to automatically create the hash file if it isn't there, and the 'Options' tab alongside it holds the tuning settings for that creation. Creating a hash file in an instantiated job without parameterizing the hash file name or location/path is an unwise choice, as more than one simultaneous instance could "see" that there's no hash file and each issue the create file statement.
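
A minimal sketch of the write semantics in points 1 and 2, treating the hash file as a keyed store. This is an editorial illustration in Python, not DataStage code; the function and data are hypothetical.

# Illustration only: a hash file behaves like a keyed store, so a write whose
# key already exists replaces that row (destructive overwrite) and a write
# with a new key is appended.
def write_rows(hash_file, rows, clear_before_writing=False):
    """Model an input link writing (key, data) rows into a hash file."""
    if clear_before_writing:
        hash_file.clear()        # 'Clear file before writing' wipes the file first
    for key, data in rows:
        hash_file[key] = data    # matching key -> overwritten, new key -> added
    return hash_file

# Default (clear unchecked): the row for key 1 is replaced, key 3 is added.
print(write_rows({1: "old", 2: "kept"}, [(1, "new"), (3, "added")]))
# {1: 'new', 2: 'kept', 3: 'added'}

# With clear checked, everything already in the file is gone before the load,
# which is why multiple instances each clearing the same file is dangerous.
print(write_rows({1: "old", 2: "kept"}, [(1, "new")], clear_before_writing=True))
# {1: 'new'}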

Aborted: How many rows will be committed?

Posted: Tue Nov 04, 2003 3:50 pm
by raju_chvr
I wanted to know how many rows will be committed to the database when the job aborts. I know this will be a multiple of the Transaction Number set in my target OCI/ODBC stage.

I want to clarify this before I come to my own conclusion.

I am sure this is a duplicate posting. Please excuse me for it.
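
Assuming, as the post says, that the target stage commits once every N rows where N is the Transaction Number (rows per transaction), the arithmetic would look roughly like this. The numbers and function name are hypothetical, for illustration only.

# Sketch: only whole transactions are committed; the partial batch in flight
# when the job aborts is rolled back.
def rows_committed(rows_processed, transaction_size):
    return (rows_processed // transaction_size) * transaction_size

# e.g. the job aborts after processing 12345 rows with rows per transaction = 1000:
print(rows_committed(12345, 1000))   # 12000 rows remain committed in the target

A transaction size of 0 typically means commit only at end of job, in which case an abort should leave nothing committed; confirm against the stage's documentation.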