Row locking problem in Hash File
Posted: Sun Sep 25, 2005 8:43 pm
I have a routine that creates, opens, reads, and writes a hash file. It is called from a Transformer (i.e. once for each row). The hash file is created on the first row. On each call, the routine writes to the file only if the key does not already exist.
The routine works perfectly for the first 600 rows (writing about 50-100 new keys to the hash file) and then the job hangs. I know the routine is at fault because I removed it and the job ran to completion.
The commands I am using to create, open, read, and write the hash file are as follows (the Else clauses are the minimum needed for the statements to compile; the mkdbfile line was missing its closing quote when I first pasted it):

Code:

    * Create a dynamic (type 30) hash file
    StrCommand = "$DSHOME/bin/mkdbfile " : HashFile : " 30 1 4 20 50 80 1628"
    Call DSExecute("UNIX", StrCommand, OutPut, RetCode)

    * Open by path, then read and write holding update locks
    Openpath HashFile To UniqFileHandle Else Call DSLogFatal("Cannot open " : HashFile, "Routine")
    Readu ValExists From UniqFileHandle, Val Else ValExists = ""
    Writeu Val On UniqFileHandle, Val
When I change the Readu and Writeu to Read and Write, the job runs to completion. This leads me to believe that I was locking myself out of the file. I am actually happy not to lock the records, because the hash file is unique to a single-instance job (no sharing required), but I am concerned that my understanding of locking is wrong.
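For comparison, the working variant is the same pair of statements without the U suffix, so no update locks are taken:

Code:

    * Non-locking variant: plain Read/Write take no update locks
    Read ValExists From UniqFileHandle, Val Else ValExists = ""
    Write Val On UniqFileHandle, Val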
I thought that if my session holds a lock, then I can Readu and even Writeu to my heart's content in the same session. Am I wrong?
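For what it's worth, the BASIC manual describes a Locked clause on Readu that fires when a different session holds the lock, which would at least log the conflict instead of hanging. A minimal sketch (the routine name and log messages here are just illustrative):

Code:

    * The Locked clause fires only when a DIFFERENT session already
    * holds a lock on this key; without it the Readu blocks and waits
    Readu ValExists From UniqFileHandle, Val Locked
        * Status() identifies the user that owns the conflicting lock
        Call DSLogWarn("Key " : Val : " locked by user " : Status(), "UniqCheck")
    End Then
        * Key already exists and we now hold the update lock
        Release UniqFileHandle, Val
    End Else
        * Key is new; a plain Write stores it and releases our lock
        Write Val On UniqFileHandle, Val
    End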
I suspect it is something to do with dynamic hash file re-sizing, or possibly caching, because there are definitely duplicates in the first 600 rows where I would be re-locking a row, and it gets past these fine.
Some further info:
* I am not using inter-process communication in the job - there is definitely only one session.
* There is no one else trying to use the file; I am the only one on the machine, and this is the only job I am running.
* Between runs I had to kill the jobs from Unix and then stop/restart DataStage (see the lock-check sketch after this list).
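If it helps with diagnosis, the lock table can be dumped from a test routine along these lines. A sketch, assuming this release's DSExecute accepts the "UV" shell type (LockOutput and LockCode are illustrative names):

Code:

    * LIST.READU EVERY lists outstanding file, group, and record locks
    Call DSExecute("UV", "LIST.READU EVERY", LockOutput, LockCode)
    Call DSLogInfo(LockOutput, "LockCheck")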
Any help or ideas would be greatly appreciated.