Job Restart from the abort point

Posted: Sun Jul 11, 2004 6:34 am
by bapajju
Hi All,
I am using Parallel Extender 7. I am extracting data from DB2 and the number of records is 5500000000 (550M). I am putting this data into DB2. Now my job aborts at 250M (250000000) the record. Now when doing a reload I want to start it from 2500000001 th record. Kindly let me know how to achieve this.

Posted: Sun Jul 11, 2004 12:54 pm
by kduke
You could use a constraint to start at @INPUTROW > StartRow where StartRow is a parameter. I am not sure if this works in PX.
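The constraint idea can be sketched outside DataStage as a plain row filter; `resume_rows` and `start_row` are illustrative names standing in for the job's constraint and the StartRow parameter, not anything DataStage provides.

```python
def resume_rows(rows, start_row):
    """Yield only rows whose 1-based position exceeds start_row,
    mimicking a constraint like @INPUTROW > StartRow."""
    for input_row, row in enumerate(rows, start=1):
        if input_row > start_row:
            yield row

# A restart would skip the already-loaded rows: here, skip the first 3 of 5.
remaining = list(resume_rows(["a", "b", "c", "d", "e"], start_row=3))
```

Note that every skipped row is still read from the source; the filter only suppresses the write, so the extract itself is not shortened.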

Posted: Sun Jul 11, 2004 8:30 pm
by ray.wurlod
I'd be more concerned with finding out why it aborts, and correcting that situation. How do you know that all of the 250M records were successfully loaded?
That said, the approach is basically to determine the row count (there are many ways to do that) and to use the appropriate one as the start point for your next run.
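One way to get that row count is to ask the target table directly and use the answer as the restart point. A minimal sketch, using the standard-library `sqlite3` as a stand-in for DB2 (the `target` table and its columns are illustrative):

```python
import sqlite3

# sqlite3 stands in for DB2 here; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
# Simulate 250 rows having been committed before the abort.
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 251)])
conn.commit()

# Count what actually landed in the target, then restart just after it.
loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
start_row = loaded  # the next run begins at row start_row + 1
```

This only works as a restart point if the load preserves source order and the count reflects committed rows, which is exactly why the commit behaviour matters.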

Posted: Thu Jul 15, 2004 11:04 pm
by bapajju
kduke wrote:You could use a constraint to start at @INPUTROW > StartRow where StartRow is a parameter. I am not sure if this works in PX.
Thanks, Duke. We're trying the same.

Posted: Wed Aug 25, 2004 1:13 pm
by gh_amitava
Hi,

Do not use @INPUTROW in partitioned PX jobs, because its value differs from partition to partition. You can use a Surrogate Key Generator stage to create a runtime number that is consistent across partitions.

You can also control the commit level in the DB2 stage.
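The effect of a commit level can be sketched as batched commits: an abort then loses at most one uncommitted batch, which is what makes a row-count-based restart meaningful. Again `sqlite3` stands in for DB2, and `load_in_batches` and `commit_every` are illustrative names, not DataStage options.

```python
import sqlite3

def load_in_batches(conn, rows, commit_every=100):
    """Insert rows, committing every commit_every rows so an abort
    loses at most one uncommitted batch (akin to a commit interval)."""
    cur = conn.cursor()
    for n, row in enumerate(rows, start=1):
        cur.execute("INSERT INTO target (id, val) VALUES (?, ?)", row)
        if n % commit_every == 0:
            conn.commit()
    conn.commit()  # commit any trailing partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
load_in_batches(conn, [(i, f"row{i}") for i in range(1, 1001)],
                commit_every=250)
```

A smaller interval wastes less work on restart but commits more often; a larger one loads faster but leaves a bigger gap between the last commit and the abort point.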

Regards
Amitava