I am facing a strange issue that I've never come across before, around hash file creation. I have a process that does two SQL extracts from a database, performs some basic transformations and outputs to a hash file. Here is a screenshot of the job:
![Image](http://iforce.co.nz/i/h4mjjr1k.arl.png)
The problem is that the other day one of the transforms (the bottom one in the image above) wrote individual records to separate files, and I cannot make sense of it. Instead of the transformer outputting records to a single hash file, it ended up writing roughly 150,000 individual files to the hash directory (the 176K in the image above is from the latest run; the count read around 150K on the problematic run in question). The impact was that we ran out of inodes on our UNIX box, because the file creation pushed us over the inode threshold. As a result, other DS jobs fell over since there was no space to write logs, temp directories, etc.
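Until the root cause is clear, one stopgap is to watch the hashed-file directory for runaway file counts before the filesystem's inodes are exhausted. A minimal sketch, assuming a POSIX shell: the directory path and threshold here are placeholders (a healthy Type 30 hashed file directory normally holds only a handful of files, e.g. DATA.30 and OVER.30), and the demo setup lines just simulate a normal run.

```shell
# Hypothetical monitoring sketch -- path and threshold are assumptions,
# not values from the original job.
HASH_DIR="${HASH_DIR:-/tmp/demo_hashfile}"
THRESHOLD=100   # far above the few files a healthy hashed file contains

# Demo setup: mimic a normal hashed-file directory.
mkdir -p "$HASH_DIR"
touch "$HASH_DIR/DATA.30" "$HASH_DIR/OVER.30"

# Count regular files directly inside the hashed-file directory.
count=$(find "$HASH_DIR" -maxdepth 1 -type f | wc -l)

if [ "$count" -gt "$THRESHOLD" ]; then
    echo "WARNING: $HASH_DIR contains $count files"
else
    echo "OK: $HASH_DIR contains $count files"
fi
```

Run from cron alongside `df -i` on the same filesystem, this would flag the 150K-file condition long before other jobs start failing for lack of inodes.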
After a recompile and restart, the job ran as normal, creating a typical hash file with a DATA.30 of 85MB and an OVER.30 of 26MB.
If anyone has encountered a similar issue, I'd really like to hear what you found the cause to be and how you ensured it didn't happen again. I'm hesitant to keep this job running (even though it is seemingly running fine now) in case it topples every job in production.
Thanks in advance.