Over 500,000 objects in hashed file directory

patonp
Premium Member
Posts: 110
Joined: Thu Mar 11, 2004 7:59 am
Location: Toronto, ON

Post by patonp »

Hi Folks,

One of our jobs failed in production last night, and the Director log shows that an abort was raised in a Hashed File stage.

In Windows, the directory for the hashed file in question contains over 500K objects, each of which seems to correspond to a single record. I've never seen this kind of behaviour before. Any ideas about what's going on?

Thanks!

Peter
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sure. A dynamic hashed file gets 'corrupted', typically resulting in the loss of its .Type30 file. The result is that it suddenly becomes a Type 19 file (I believe) - one in which each record becomes a separate operating system file. :shock:

Delete the directory and it should 'rebuild' fine. This sometimes happens when using the 'Delete' option.
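
If you'd rather script that cleanup than do it by hand in Explorer, an untested DS Basic sketch along these lines should get you most of the way - the path and names below are only placeholders, so point them at your own file:

    * Remove the corrupted hashed file directory so the Hashed File stage
    * (with its create-file option set) can rebuild it on the next run.
    * The path here is an assumed example.
    HashDir = 'D:\Projects\MyProject\MyHash'
    Call DSExecute('DOS', 'rmdir /s /q "' : HashDir : '"', Output, SysRet)
    If SysRet <> 0 Then Call DSLogWarn('Could not remove ' : HashDir : ': ' : Output, 'CleanupHash')

Just make sure no jobs have the file open before you remove it.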
-craig

"You can never have too many knives" -- Logan Nine Fingers
patonp
Premium Member
Posts: 110
Joined: Thu Mar 11, 2004 7:59 am
Location: Toronto, ON

Post by patonp »

Thanks Craig. Would we need to delete the D_ file as well as the directory, or just the directory?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Just the directory would suffice. Doing both wouldn't hurt.
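
If you're scripting it, the only extra step is removing the D_ dictionary file, which (for a typical pathed hashed file) sits alongside the data directory - again untested, placeholder path:

    * Also remove the D_ dictionary file next to the hashed file directory
    * so both pieces get recreated together on the next run.
    Call DSExecute('DOS', 'del /q "D:\Projects\MyProject\D_MyHash"', Output, SysRet)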
-craig

"You can never have too many knives" -- Logan Nine Fingers