
Job failure at 3100000th row of data

Posted: Tue Jul 12, 2005 7:02 am
by DeepakCorning
The job is unable to write data to a hash file after roughly 3,100,000 rows. Does a Type 30 hash file have a size limit? I don't think it should have such a constraint.

The error-->

"WriteHash() - Write failed"

Posted: Tue Jul 12, 2005 7:31 am
by Viswanath
Hi Deepak,

1.) What is the configuration of the file?
2.) Is the hash file pre-created? If yes, what hash parameters are you using?
3.) What else does the error message say? I would guess there is more than what you have shown so far.

Cheers,
Vishy

Posted: Tue Jul 12, 2005 7:37 am
by DeepakCorning
It's a dynamic hash file, and yes, it is pre-created rather than created while this job runs.

That's the error, along with the data on which it failed.

:-(

Posted: Tue Jul 12, 2005 7:48 am
by Sainath.Srinivasan
Did you check whether you have enough space in the file system?

What is the file size when it aborts?
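
A quick way to check both (the path below is just a placeholder for wherever your hash file lives): a dynamic (Type 30) hash file is an operating-system directory containing DATA.30 and OVER.30, so something like

df -k /path/to/HashFileDir
ls -l /path/to/HashFileDir

will show the free space in that file system and the current sizes of DATA.30 and OVER.30.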

Posted: Tue Jul 12, 2005 3:14 pm
by DeepakCorning
Yep, there is plenty of space available... I don't think it's a space problem.

Does anyone know if I can run the job from a particular row, so I can see whether or not it is a data problem?

Posted: Tue Jul 12, 2005 3:22 pm
by snassimr
If your system is 32-bit, the hash file size is limited to about 2.2 GB.
Multiply 3,100,000 by the size of each row you write to the hash file.

Does it exceed 2.2 GB?
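
As a rough worked example (the 700 bytes is only an assumed average record size, substitute your own): 3,100,000 rows x 700 bytes/row is about 2.17 GB, which is already at the 32-bit ceiling of 2^31 bytes (roughly 2.1 GB). The file on disk can be even larger than the raw data because of group and overflow (OVER.30) overhead.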

Posted: Tue Jul 12, 2005 3:39 pm
by pnchowdary
Hi Deepak,
DeepakCorning wrote: Does anyone know if I can run the job from a particular row, so I can see whether or not it is a data problem?
You can put a Transformer stage before the write to the hash file and give it a constraint like @INROWNUM >= 3100000, so that it skips the first 3,099,999 rows and starts writing from row 3,100,000. That way you will be able to tell whether or not the data is the problem.
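
If you want to isolate the suspect record even further, you could bracket it with a window constraint, for example (the row numbers are only illustrative, and double-check the operator spelling in your DataStage version):

@INROWNUM >= 3099990 And @INROWNUM <= 3100010

so only the handful of rows around the failure point gets written.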

Thanks,
Naveen