I'm having trouble creating a large dynamic hashed file (approx. 50 bytes/record, 67,000,000 records) in a job that reads its data from an Oracle DB. The first error I got was an "unable to allocate memory" error, which I got around by disabling cache loading for the file. Now I'm running into a "ds_uvput() - Write failed for record" error in the same job.
From what I can gather, it looks like I'm running out of disk space: I estimate the file should be roughly 4 GB on disk when fully created. So, can someone help me answer the following?
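My rough arithmetic for that estimate:

    67,000,000 records x 50 bytes/record = 3,350,000,000 bytes (about 3.35 GB of data)
    plus dynamic file group/header overhead, which is how I get to roughly 4 GB on disk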
- Firstly, should I create this hashed file "manually" instead of letting DataStage create it for me? I've used the Hashed File Calculator (HFC) for guidance, and the command DataStage runs is not the same as what HFC suggests (see the example command after this list).
- Which directories should I look at to make sure there is enough space available? The ones I've checked all seem to have sufficient free space (see the df commands after this list).
- Which entries in the uvconfig file come into play in this case that I should be aware of (UVTEMP, SCRMIN, etc.)?
- Also, the DS server has a lot of free memory available; can I consider loading this file into memory (i.e. turning caching on)?
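For reference, HFC suggests a CREATE.FILE command along these lines (the file name and sizing numbers below are placeholders for illustration, not the exact values HFC produced):

    CREATE.FILE MyBigHash DYNAMIC MINIMUM.MODULUS 2000003 GROUP.SIZE 2

which is not the same as the command DataStage itself issues when it creates the file at run time.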
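This is how I've been checking free space so far (the paths here are examples; I used my actual project and temp directories):

    df -k /u1/dsadm/Ascential/DataStage/Projects/MyProject   # example project directory
    df -k /tmp                                               # example UVTEMP location

and they all report plenty of free space.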
Any suggestions will be greatly appreciated.
Thanks.