The job is unable to write data to a hash file after roughly 3,100,000 rows. Does a type 30 hash file have a memory limit? I don't think it should have such a constraint.
The error:
"WriteHash() - Write failed"
The job fails at the 3,100,000th row of data.
Hi Deepak,

DeepakCorning wrote: Does anyone know if I can run the job from a particular row, so that I can see whether it is a data problem or not?

You can put a Transformer stage before the hash file and give it a constraint like @INROWNUM >= 3100000, so that it will not write the first 3,099,999 rows and will start writing from row 3,100,000. That way you will be able to tell whether the data is the problem or not.
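If it helps, here is the same idea sketched in Python outside DataStage (the input file name and the write_to_hash_file() helper are made up for illustration): skip the first 3,099,999 rows, then attempt the writes from row 3,100,000 onward, so the first failure tells you whether that row's data is the problem or whether the file itself is hitting a limit.

SKIP_ROWS = 3_099_999  # write rows 3,100,000 onward, mirroring @INROWNUM >= 3100000

def write_to_hash_file(row: str) -> None:
    """Hypothetical stand-in for the hashed-file write; replace with the real target."""
    ...

with open("input_data.txt", "r", encoding="utf-8") as source:
    for row_number, row in enumerate(source, start=1):
        if row_number <= SKIP_ROWS:
            continue  # skip rows before the previous failure point
        try:
            write_to_hash_file(row)
        except Exception as exc:
            # If the very first attempted row fails, the data is suspect;
            # if many rows succeed first, a capacity limit is more likely.
            print(f"Write failed at row {row_number}: {exc}")
            raise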
Thanks,
Naveen