Hash file, reading and writing in same job
-
- Participant
- Posts: 437
- Joined: Fri Oct 15, 2004 6:13 am
- Location: Pune, India
Hi,
Sorry for posting a question that has been discussed so many times; I have searched a lot on this topic.
I am trying to use a hashed file for reference lookups and to update the same hashed file when no match is found. The requirement is also that, if two rows on the stream link have the same key and that key does not exist in the hashed file, only the first row should be inserted; the second should not.
Some posts say that preload in the reference hashed file stage should be set to 'Disabled, lock for updates', while others say it should be 'Enabled, lock for updates'.
For the target hashed file, write cache should be disabled.
Can anyone tell me the correct settings for the reference and target hashed file stages for this sort of requirement?
Also, if anyone can give the reason, or a reference to the reason, it would be a great help.
Thanks in advance.
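The requirement above can be sketched in plain Python, with a dict standing in for the hashed file (this is an illustrative model only, not DataStage API; the function and variable names are made up):

```python
# Simulate a hashed-file reference lookup with write-back in the same pass.
# The dict stands in for the hashed file: a key not found is inserted, and
# a second stream row with the same key then counts as "found", so it is
# not inserted again.
def process(stream_rows, hashed_file):
    inserted = []
    for key, value in stream_rows:
        if key in hashed_file:        # reference lookup: match found
            continue                  # second row with same key: skip
        hashed_file[key] = value      # no match: write back to hashed file
        inserted.append((key, value))
    return inserted

rows = [("A", 1), ("B", 2), ("A", 3)]   # two stream rows share key "A"
hf = {}
out = process(rows, hf)
print(out)   # only the first "A" row is inserted
```

The whole question in the thread is what stage settings make the real hashed file behave like this single shared dict, i.e. so the write on row one is visible to the lookup on row three.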
Regards,
S. Kirtikumar.
Thanks Kenneth!!!
kcbland wrote: No read or write caching, turn off interprocess and row buffering.
So I should go for the following settings:
Off - inter-process and row buffering in job properties.
But what should the preload setting be in the reference hashed file stage?
- Enabled
- Enabled, lock for updates
- Disabled
- Disabled, lock for updates
Regards,
S. Kirtikumar.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Disabled, lock for updates.
This sets a record level update lock if the key is not found, on the assumption that you're going to write that key into the hashed file.
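A hypothetical sketch of that 'Disabled, lock for updates' behaviour, using a per-key `threading.Lock` to stand in for the record-level update lock (the class and method names here are invented for illustration, not DataStage internals):

```python
# Model of 'Disabled, lock for updates': a lookup miss takes a
# record-level update lock on the key, on the assumption that the
# caller is about to write that key; the write releases the lock.
import threading

class LockedHashedFile:
    def __init__(self):
        self.data = {}
        self.locks = {}            # one record-level lock per key

    def lookup(self, key):
        """Return the record, or None after taking an update lock."""
        if key in self.data:
            return self.data[key]
        lock = self.locks.setdefault(key, threading.Lock())
        lock.acquire()             # update lock taken on the miss
        return None

    def write(self, key, value):
        """Write the record and release its update lock."""
        self.data[key] = value
        lock = self.locks.get(key)
        if lock is not None and lock.locked():
            lock.release()

hf = LockedHashedFile()
assert hf.lookup("K1") is None     # miss: lock taken
hf.write("K1", "row")              # write releases the lock
assert hf.lookup("K1") == "row"    # found: no lock needed
```

Note the implicit contract: every miss must be followed by a write, or the lock stays held, which is exactly the failure mode Kenneth describes below.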
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
For what it's worth, the last time I had to do this I found that plain old Disabled is what worked for me. I don't recall exactly why, but I had some strange behaviour when I tried 'Disabled, lock for updates'. Switching back to Disabled made it do exactly what it needed to.
Always best, in my opinion, to build little jobs to specifically test things like this. Switch it around and note how each option change affects the job. Then you'll know which way is right for your particular situation.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Not to argue with Ray, but 'Disabled, lock for updates' means that referencing a row puts a lock on that row. Failing to write to that row then leaves the lock hanging. The job's performance will degrade as the internal lock table progressively fills with unreleased locks, until the job basically freezes.
Only use locking if you absolutely need the row locked, meaning that some other job could be accessing and modifying the same row at the same time, which in my opinion is a BAD DESIGN for a lot of reasons.
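The lock-leak failure mode Kenneth describes can be sketched like this (again purely illustrative; a set models the internal lock table):

```python
# Illustrative sketch of the lock-leak problem: every lookup miss takes
# an update lock, but a lock is only released by a write to that key.
# Rows that are referenced and never written back leave locks hanging,
# so the lock table grows for the life of the job.
lock_table = set()

def lookup_with_lock(key, data):
    if key in data:
        return data[key]
    lock_table.add(key)        # update lock taken on the miss
    return None

def write(key, value, data):
    data[key] = value
    lock_table.discard(key)    # write releases the lock

data = {}
for key in ("A", "B", "C"):
    lookup_with_lock(key, data)   # three misses: three locks taken
write("A", 1, data)               # only one row actually written back
print(len(lock_table))            # two locks left hanging -> prints 2
```

In a real job with millions of reference rows that never get written, this table fills steadily, which matches the progressive slowdown described above.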
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Read and write the same hashed file: be careful about performance.
I have a job that reads and writes the same hashed file. I disabled caching and enabled lock for updates, and the performance was bad (not acceptable).
Be careful when using that lock for updates option.