Posted: Mon Jul 18, 2011 3:19 pm
by chulett
Yes, I've seen it personally and I do believe it has been reported here by several others as well. All it takes is the loss of the hidden .Type30 file inside the directory: without it, the hashed file is no longer recognized as a dynamic one, and it falls back to being treated as a funky directory-style "hashed file" (type 1? 19?) where every record is written out as a separate operating system file.
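If you want to check whether that's what happened, a quick look inside the directory tells the story. A minimal sketch, assuming a Unix box and a hashed file directory called MyHash (a hypothetical name) - a healthy dynamic file shows the hidden marker alongside its data and overflow files:

ls -a MyHash
.  ..  .Type30  DATA.30  OVER.30

If .Type30 is gone and you instead see one operating system file per record, you've hit this problem.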

As you noted, the correct solution is to nuke it and let it rebuild from scratch.
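At TCL that's just the following, assuming the file has a VOC entry in the account and using MyHash as a hypothetical name again - then let the job repopulate it:

DELETE.FILE MyHash
CREATE.FILE MyHash DYNAMIC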

Edit: found another post on the subject.

Posted: Tue Jul 19, 2011 8:37 am
by jdsmith575210
Thanks for the info, chulett. I had suspected the hashed files were being created as something other than Type 30, but I've never used anything else in five years, and figuring that out took a back seat to recovering.

Posted: Thu Jul 21, 2011 8:49 am
by roy
When you create the hashed file(s), you are better off deleting and recreating them (if possible), for this reason amongst others.
IHTH... (I Hope This Helps)

Posted: Thu Jul 21, 2011 9:05 am
by daignault
Backing up a Type30 causes the ".Type30" to go missing. One solution is to add a .Type19 hidden file to the directory and, at the ">" prompt, create a VOC entry that looks like this:

F
PATHNAME
D_VOC

Then do a COPYI FROM PATHNAME TO REPAIREDHASH ALL
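
To make that concrete, here's a sketch of the full sequence, using BADHASH and REPAIREDHASH as hypothetical names and /path/to/BadHash as the broken directory. The VOC record ID is whatever name you want to reference the file by; field 1 is the file type, field 2 the data path, and field 3 a dictionary to borrow (D_VOC). So a VOC record named BADHASH would contain:

F
/path/to/BadHash
D_VOC

and then, at TCL, create the replacement file and copy everything across:

CREATE.FILE REPAIREDHASH DYNAMIC
COPYI FROM BADHASH TO REPAIREDHASH ALL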


--------

Cheers,

Ray D

Posted: Thu Jul 21, 2011 10:38 am
by chulett
daignault wrote:Backing up a Type30 causes the ".Type30" to go missing.
That would depend entirely on how one backs it up, yes? :wink:
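For example, on Unix (paths hypothetical), a glob copy silently drops the hidden marker while a whole-directory archive keeps it:

cp /data/MyHash/* /backup/MyHash/    # loses .Type30: * doesn't match dot files
tar -cf MyHash.tar -C /data MyHash   # keeps .Type30: archives the whole directory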

Posted: Thu Jul 21, 2011 10:44 am
by chulett
roy wrote:When you create the hashed file/s you are better off deleting and recreating them (if possible) for this (amongst other) reason.
Whereas in my experience, deleting and recreating them each time has been at the root of this very issue. I tended to stick with "Clear" unless I had a good reason to nuke them each run, and that would generally involve setting the Minimum Modulus dynamically based on wildly variable volumes. Different strokes. :wink:
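
At TCL the two approaches boil down to this, with MyHash and the modulus value being hypothetical:

CLEAR.FILE MyHash

versus

DELETE.FILE MyHash
CREATE.FILE MyHash DYNAMIC MINIMUM.MODULUS 5003

where the modulus would be computed from the expected volume each run.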

Posted: Thu Jul 21, 2011 1:55 pm
by roy
Craig,
I beg to differ!
In my humble opinion (from experience), any hashed file that is used for a long period with the clear option, never being deleted and created anew, is bound to eventually get corrupted.

Posted: Thu Jul 21, 2011 2:24 pm
by chulett
And in my experience I've found them to be quite resilient; I really only saw "corruption" from issues like a lack of disk space, going over the dreaded 2GB barrier with 32-bit addressing, or funky data being written to them. Otherwise, solid. Like I said, different strokes.
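
For the 2GB barrier specifically, the usual escape (on UniVerse releases that support it, and with MyHash again a hypothetical name) is 64-bit addressing, either at creation:

CREATE.FILE MyHash DYNAMIC 64BIT

or, if memory serves, by converting an existing file in place:

RESIZE MyHash * * * 64BIT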