
Problem reading and writing the same hash file at the same time?

Posted: Tue Jul 03, 2007 8:21 am
by mallikharjuna
When I tried to read and write data from the same hash file at the same time, the following warning came up:

"Abnormal termination of stage detected"

Please tell me how to solve this problem.

Posted: Tue Jul 03, 2007 8:41 am
by JoshGeorge
If you are writing to and reading from the same file, and you want to read the updated records concurrently, choose 'Disabled, Lock for updates' for 'Pre-load file to memory'. 'Allow stage write cache' should be unchecked in this case. You might want to cross-check this.

Are there any more entries in the error log? 'Reset' your job after it aborts and you might get more error information.

Posted: Tue Jul 03, 2007 9:47 am
by ray.wurlod
First, prove that it is the Hashed File stage that is generating the error. Hashed File stages are passive stages; they do not generate processes, and therefore cannot terminate, abnormally or otherwise.

Second, it's hashed file, not hash file.

Posted: Tue Jul 03, 2007 9:13 pm
by kduke
It is not wise to read and write the same hashed file at the same time. Hashing distributes records based on the algorithm chosen. One new record might cause a group to split, so records in that group get processed twice. An even worse scenario is when you change keys or add new records: you might process the new records again.
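To see why a group split can make a scan revisit records, here is a minimal sketch in plain Python (not DataStage; the bucket structure, load-factor threshold, and method names are all invented for illustration). A scan walks the buckets in order while writes keep landing in the same table; when a write triggers a rehash, already-visited records can move into buckets the scan has not reached yet, so they come back a second time:

```python
# Toy bucketed hash table (illustration only, not DataStage internals).
class Bucketed:
    def __init__(self, nbuckets=2):
        self.nbuckets = nbuckets
        self.buckets = [[] for _ in range(nbuckets)]
        self.count = 0

    def _idx(self, key):
        return hash(key) % self.nbuckets

    def put(self, key):
        self.buckets[self._idx(key)].append(key)
        self.count += 1
        if self.count > 2 * self.nbuckets:   # assumed load-factor threshold
            self._rehash()

    def _rehash(self):
        # Like a group split: grow the bucket count and redistribute keys.
        old = [k for bucket in self.buckets for k in bucket]
        self.nbuckets *= 2
        self.buckets = [[] for _ in range(self.nbuckets)]
        for k in old:
            self.buckets[self._idx(k)].append(k)

    def scan(self):
        # Walk buckets in order; the bucket count can grow mid-scan.
        i = 0
        while i < self.nbuckets:
            for k in list(self.buckets[i]):
                yield k
            i += 1

table = Bucketed()
for k in range(4):
    table.put(k)

seen = []
for k in table.scan():
    seen.append(k)
    table.put(k + 100)   # write to the table while reading it

# After a mid-scan rehash, some keys are yielded more than once
# (and writes that land in already-scanned buckets are missed).
dupes = len(seen) != len(set(seen))
```

With this toy table the scan yields some keys twice, because the rehash moved them into a later bucket; the same structural hazard is what makes concurrent reads and writes against one hashed file unsafe.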

Posted: Wed Jul 04, 2007 12:57 pm
by ray.wurlod
Reset the job in Director. Post whatever is in the "from previous run..." log event.