Job failure at 3100000th row of data

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.


DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Job failure at 3100000th row of data

Post by DeepakCorning »

The job is unable to write data to a hash file after roughly 3100000 rows. Does the type 30 hash file have a size limit? I don't think it should have such a constraint.

The error:

"WriteHash() - Write failed"
Viswanath
Participant
Posts: 68
Joined: Tue Jul 08, 2003 10:46 pm

Post by Viswanath »

Hi Deepak,

1.) What is the configuration of the file?
2.) Is the hash file pre-created? If yes, what hash parameters are you giving it?
3.) What else does the error message say? I would guess there is more to it than what you have posted so far.

Cheers,
Vishy
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Post by DeepakCorning »

It's a dynamic hash file, and yes, it is pre-created rather than created while running this job.

That's the whole error, along with the data row for which it failed.

:-(
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Did you check whether you have enough space in the file system?

What is the file size when it aborts?
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Post by DeepakCorning »

Yep, there is plenty of space available... I don't think it's a space problem.

Does anyone know if I can run the job from a particular row, so that I can see whether it is a data problem or not?
snassimr
Premium Member
Posts: 281
Joined: Tue May 17, 2005 5:27 am

Post by snassimr »

If your system is 32-bit, the hash file size is limited to about 2 GB (2,147,483,648 bytes).
Multiply 3100000 by the size of each row you write into the hash file.

Does it exceed that limit?
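
For example (the 700-byte average record size here is purely a hypothetical figure for illustration):

3100000 rows x 700 bytes/row = 2,170,000,000 bytes

which is just over the 32-bit limit of 2,147,483,648 bytes, so a failure at around that row count would be consistent with hitting the file size ceiling.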
pnchowdary
Participant
Posts: 232
Joined: Sat May 07, 2005 2:49 pm
Location: USA

Post by pnchowdary »

Hi Deepak,
DeepakCorning wrote: Does anyone know if I can run the job from a particular row, so that I can see whether it is a data problem or not?
You can put a Transformer stage before the hash file and, in the Transformer, set an output link constraint like @INROWNUM > 3100000, so that it skips the first 3100000 rows and starts writing from row 3100001. That way you will be able to tell whether the data is the problem or not.
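
A minimal sketch of that constraint, assuming a single output link from the Transformer to the Hashed File stage (the link name here is hypothetical):

Output link "ToHashFile" constraint: @INROWNUM > 3100000

Rows 1 through 3100000 are dropped by the constraint, and rows from 3100001 onwards are written to the hash file.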

Thanks,
Naveen