The routine works perfectly for the first 600 rows (writing about 50-100 new keys to the hash file) and then the job hangs. I know the routine is at fault because I removed it and the job ran to completion.
The commands I am using to create, open, read, and write the hash file are as follows:
Code:
* Create the hashed file (type 30 = dynamic; Then/Else error handling trimmed for brevity)
StrCommand = "$DSHOME/bin/mkdbfile " : HashFile : " 30 1 4 20 50 80 1628"
Call DSExecute("UNIX", StrCommand, OutPut, RetCode)

* Open it, then for each incoming key: take an update lock with Readu, write the key back with Writeu
Openpath HashFile To UniqFileHandle Else Null
Readu ValExists From UniqFileHandle, Val Else Null
Writeu Val On UniqFileHandle, Val
I thought that if my session already holds the lock on a key, then I can Readu and even Writeu that key to my heart's content within the same session. Am I wrong?
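For what it's worth, here is the lock behaviour I was expecting, sketched out with a Locked clause added purely so I could see whether the routine is actually sitting on a lock wait (the Locked branch, the log message, and the "KeyCheckRoutine" name are just illustration, not what the real routine does):

Code:
* Take the update lock; if another session already holds it, the Locked
* clause fires instead of the statement blocking (Status() returns the owner)
Readu ValExists From UniqFileHandle, Val Locked
   Call DSLogWarn("Key " : Val : " locked by user " : Status(), "KeyCheckRoutine")
End Else
   ValExists = ""   ;* key not yet in the file; Readu still takes the lock
End

* Writeu keeps the lock in this session; a plain Write (or Release) would free it
Writeu Val On UniqFileHandle, Val

My assumption was that the Locked clause would never fire when the same session re-reads a key it already locked, which is why I did not think duplicates could be the problem.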
I suspect it is something to do with dynamic hash files re-sizing, or possibly write caching, rather than re-locking: there are definitely duplicate keys within the first 600 rows, so rows do get re-locked, and the job gets past those fine.
Some further info:
* I am not using inter-process communication in the job - there is definitely only one session.
* There is no-one else trying to use the file; I am the only one on the machine, and this is the only job I am running.
* Between runs I had to kill the hung job from Unix and then stop and restart DataStage.
Any help or ideas would be greatly appreciated.