
Re-running the job

Posted: Fri Nov 18, 2005 12:37 am
by kkreddy
Hi DSGurus,

I have a job which extracts 3 lakh (300,000) records. While loading, the job aborts after 55,000 rows, and 50,000 of those records have already been inserted into the target. How can I load the target starting from the 50,001st record? I am using DataStage 7.0, and the transaction grouping checkbox is not available in the ODBC stage. In which cases is it visible?


Thanks in advance,
Kiran Kumar

Posted: Fri Nov 18, 2005 1:04 am
by Kirtikumar
I doubt you can do this sort of thing in a single job. You may have to get the details from the logs as to which record the job aborted at, and then code your job so that it restarts from that point.
Alternatively, you can set the transaction size to 0, so that the records are committed only once all of them (3 lakh) have been inserted successfully.
That way, even if the job fails partway through, nothing is committed, and the next run can simply start from the first row again.

Transaction grouping is available in the ODBC stage only when two input links are coming into it as the target.

Re: Re-running the job

Posted: Fri Nov 18, 2005 2:53 am
by loveojha2
kkreddy wrote: ...how can I load the target starting from the 50,001st record?...
Instead, you could load all the records again, but perhaps with a different load strategy: use Update else Insert or Insert else Update (if you are not already using one of them). If you want to commit only after all rows have been processed successfully, then use a Transaction Size of 0.
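
As a rough illustration of what an Update else Insert (upsert) strategy boils down to at the database end, here is a minimal sketch; the table and column names are hypothetical and not from the original job:

    -- Hypothetical table/column names, for illustration only.
    -- Try the update first...
    UPDATE target_table SET col1 = ?, col2 = ? WHERE key_col = ?

    -- ...and only if no row was updated, insert it instead.
    INSERT INTO target_table (key_col, col1, col2) VALUES (?, ?, ?)

The point for a rerun is that rows which were already loaded simply get updated to the same values instead of failing on duplicate keys, so the whole input can safely be reprocessed.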

Hope this is what you were looking for.

Re: Re-running the job

Posted: Fri Nov 18, 2005 5:08 am
by Kirtikumar
loveojha2 wrote: Instead, you could load all the records again, but perhaps with a different load strategy: use Update else Insert or Insert else Update
One point worth mentioning: when dealing with a large number of rows, the Insert/Update option has performance implications, since each row may need two statements against the database.

Posted: Fri Nov 18, 2005 8:34 am
by chulett
Set up a job parameter that holds the number of rows to skip during the load. Default it to zero. Then add a constraint that says, in essence:

@INROWNUM > JOB_PARAMETER

Set it to 50000 and rerun. Note that there is a little bit more to it than that, but that is the gist of it.