job fails while loading into target table

Posted: Mon May 04, 2009 2:40 am
by monaz
Hi All,

Can anyone please guide me on how to restart a job in case it fails?

For example, I have 2 million records to be loaded into the table.
The input is a sequential file or a dataset, and it needs to be loaded into a Sybase database.

I want to just use the insert command, after deleting the previous records.

While loading the 2 million records, if the job fails at 10,000 records, how do I handle such a scenario?

I checked the forum; it talks about using a hashed file or capturing the records.

But I am not sure how exactly to work on this. As I am new to DataStage, I am not able to get the required results.
Please guide.

Posted: Mon May 04, 2009 4:16 pm
by ray.wurlod
Whatever you want to do in terms of restarting from a point of failure, you must design yourself. And that involves keeping track of, or discovering subsequently, which rows were successfully committed to the database.

The only automatic restart in DataStage is from the start of the job that failed. You must design any cleanup that has to occur before that, should you wish to take this route.
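
For illustration only, here is a minimal sketch of that kind of design, written in Python against a generic DB-API connection rather than as a DataStage job. The checkpoint file name, target table, column count, and batch size are all hypothetical, and the "?" parameter style depends on the driver:

Code:

import os

CHECKPOINT = "load_checkpoint.txt"  # hypothetical file recording committed row count
BATCH_SIZE = 1000

def rows_already_committed():
    """Return how many rows a previous (failed) run committed."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def load(conn, source_rows):
    """Insert in batches, recording progress after every commit."""
    done = rows_already_committed()
    cur = conn.cursor()
    batch = []
    for i, row in enumerate(source_rows):
        if i < done:          # skip rows a previous run already committed
            continue
        batch.append(row)
        if len(batch) == BATCH_SIZE:
            cur.executemany("INSERT INTO target VALUES (?, ?)", batch)
            conn.commit()
            done += len(batch)
            batch = []
            with open(CHECKPOINT, "w") as f:  # checkpoint only after the commit
                f.write(str(done))
    if batch:                 # final partial batch
        cur.executemany("INSERT INTO target VALUES (?, ?)", batch)
        conn.commit()
        with open(CHECKPOINT, "w") as f:
            f.write(str(done + len(batch)))

Note that if the job dies between a commit and the checkpoint write, that batch would be re-inserted on restart, so a unique key on the target table is a sensible safety net.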

Posted: Mon May 04, 2009 4:17 pm
by ray.wurlod
Please delete your duplicate post before anyone responds to it.

Posted: Tue May 05, 2009 3:13 am
by monaz
ray.wurlod wrote: Please delete your duplicate post before anyone responds to it.
Thanks, Ray.

Can you please guide me if my understanding is wrong.

I thought I would store the output in a file, then do a lookup against the same file, and then insert into the final table.
Whichever records match I won't insert, and I will insert the remaining records into the table.

Is this the correct way to do it?

Or shall I re-run the job in a job sequence? But again, to achieve this I need your help.

Please suggest...
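
For illustration, the lookup approach described above amounts to an anti-join: keep the keys of the rows that already reached the target, and on the re-run insert only the rows whose keys are absent. A rough sketch in Python (the key position, file name, and table name are hypothetical; in an actual DataStage job this would be a Lookup stage, with only the unmatched rows flowing on to the insert link):

Code:

def load_remaining(conn, source_rows, loaded_keys_file):
    """Insert only the rows whose key is not already in the target."""
    with open(loaded_keys_file) as f:
        loaded = {line.strip() for line in f}  # keys committed before the failure
    cur = conn.cursor()
    for row in source_rows:
        if str(row[0]) in loaded:  # assume the first column is the key
            continue               # matched: already in the target, skip it
        cur.execute("INSERT INTO target VALUES (?, ?)", row)
    conn.commit()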