Hi everyone,
I have a job in my production environment which loads around 40 million records from a sequential file to a hashed file. It works fine in production, but when I imported the same job into my QA environment it gives me the following error after loading around 95% of the records.
Error : JobDs155SeqToAverageCostHash..LatestPrdtAvgCost.DSSJU155_AvgCost: WriteHash() - Write failed for record id '7008503420
10998002'
I checked the disk space and the disk is only 56% full, so that is not the problem.
Here are the key hashed file settings (the same in both QA and production):
Allow stage write cache, Create file, and Clear file before writing are enabled.
File creation type: Type 30 (Dynamic)
Minimum modulus: 531253
Group size: 1
Split load: 80
Merge load: 50
Large record: 1628
Hash algorithm: GENERAL
Caching attributes: NONE
Delete file before create is also enabled.
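A quick back-of-envelope check (the 40 million figure is from the load above; the 2 GB figure is the classic 32-bit file size ceiling, assumed here, not measured from the job):

```shell
# Rough capacity check: how many bytes per record fit, on average,
# in a single 32-bit data file before the 2 GB size limit is reached.
# Figures are illustrative only.
records=40000000
limit=$((2 * 1024 * 1024 * 1024))   # 2 GB in bytes
echo "$((limit / records)) bytes per record on average"
# prints "53 bytes per record on average"
```

With only about 53 bytes of headroom per record, a load that succeeds in one environment can plausibly overflow near the end of the run in another if record sizes or overflow usage differ slightly.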
Any help would be appreciated,
thanks,
Abhi
WriteHash() - Write failed error when loading hashfile
How large are the DATA.30 and OVER.30 files when the error occurs? Anywhere close to 2 GB?
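A type 30 (dynamic) hashed file is an OS-level directory containing the DATA.30 and OVER.30 files, so their sizes can be checked from the shell. A minimal sketch, assuming a hypothetical path (substitute your project's hashed file directory):

```shell
#!/bin/sh
# Sketch, not verified against your install: report the sizes of the
# DATA.30 and OVER.30 files inside a dynamic hashed file's directory
# and flag anything at the 2 GB (32-bit) size ceiling.

check_hash_sizes() {
    hashdir="$1"
    limit=$((2 * 1024 * 1024 * 1024))   # 2 GB in bytes
    for f in DATA.30 OVER.30; do
        if [ -f "$hashdir/$f" ]; then
            # $(( )) normalizes any whitespace wc emits on some platforms
            size=$(($(wc -c < "$hashdir/$f")))
            echo "$f: $size bytes"
            if [ "$size" -ge "$limit" ]; then
                echo "WARNING: $f is at the 2 GB limit"
            fi
        fi
    done
}

# Hypothetical location; point this at your project's hashed file directory.
check_hash_sizes "/Projects/QA/DSSJU155_AvgCost"
```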