Re-running the job

Post questions here related to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kkreddy
Participant
Posts: 28
Joined: Tue May 10, 2005 6:00 am

Re-running the job

Post by kkreddy »

Hi DS gurus,

I have a job that extracts 3 lakh (300,000) records. While loading, the job aborts after about 55,000 rows, and 50,000 of those records have already been committed to the target. How can I load the target starting from the 50,001st record? I am using DataStage 7.0, and the transaction grouping checkbox is not available in the ODBC stage. In which cases is it visible?


Thanks in advance
Kiran Kumar
Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

Post by Kirtikumar »

I doubt you can get this sort of functionality in a single job. You may have to get the details from the logs as to which record the job aborted at, and then code your job so that it restarts from where it aborted.
Alternatively, you can set the transaction size to 0, so that the records are committed only once all of them (3 lakh) have been inserted successfully.
That way, even if the job fails partway through, nothing is committed, and the next run simply starts from the first row again rather than somewhere in the middle.

Transaction grouping is available in the ODBC stage only when more than one input link comes into it as a target.
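
If you go the restart route, the committed row count from the failed run can be picked up from the job's link statistics rather than read off the log by hand. A minimal job-control BASIC sketch, assuming the job exposes a SKIP_ROWS parameter for the restart logic; the job, stage, link and parameter names ("LoadJob", "ODBC_Target", "ToTarget", SKIP_ROWS) are placeholders, not anything from your design:

* Attach to the aborted job and read how many rows went down
* the insert link in its last run.
hJob = DSAttachJob("LoadJob", DSJ.ERRFATAL)
RowsCommitted = DSGetLinkInfo(hJob, "ODBC_Target", "ToTarget", DSJ.LINKROWCOUNT)

* An aborted job must be reset before it can be rerun.
ErrCode = DSRunJob(hJob, DSJ.RUNRESET)
ErrCode = DSWaitForJob(hJob)
ErrCode = DSDetachJob(hJob)

* Rerun, passing the committed count as the number of rows to skip.
hJob = DSAttachJob("LoadJob", DSJ.ERRFATAL)
ErrCode = DSSetParam(hJob, "SKIP_ROWS", RowsCommitted)
ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob)
ErrCode = DSDetachJob(hJob)

One caveat: after an abort, the link count reports rows sent down the link (55,000 in your case), which with a non-zero transaction size can be more than what was actually committed (50,000), so round the skip count down to the last commit point.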
Regards,
S. Kirtikumar.
loveojha2
Participant
Posts: 362
Joined: Thu May 26, 2005 12:59 am

Re: Re-running the job

Post by loveojha2 »

kkreddy wrote: ...how can I load the target starting from the 50,001st record?

Instead, you could load all the records again, but with a different load strategy: use Update else Insert or Insert else Update (if you are not already doing so), so the rows that were already committed are updated rather than duplicated. If you want a commit only after all rows are successfully processed, use a transaction size of 0.

Hope this is what you were looking for.
Last edited by loveojha2 on Fri Nov 18, 2005 5:36 am, edited 1 time in total.
Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

Re: Re-running the job

Post by Kirtikumar »

loveojha2 wrote: Instead, you could load all the records again, but with a different load strategy: use Update else Insert or Insert else Update
One point worth mentioning: when dealing with a large number of rows, the insert/update options carry a performance penalty, because each row can require two statements against the database (an update attempt followed by an insert, or vice versa).
Regards,
S. Kirtikumar.
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Set a job parameter that is the number of rows to skip during the load. Default it to zero. Put a constraint in that says, in essence:

@INROWNUM > JOB_PARAMETER

Set it to 50000 and rerun. Note that there is a little bit more to it than that, but that is the gist of it.
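
As a concrete sketch, with the parameter named (hypothetically) SKIP_ROWS and defaulted to 0, the Transformer output-link constraint is just:

@INROWNUM > SKIP_ROWS

With the default of 0 every row passes. After the abort, rerun with SKIP_ROWS set to 50000 and the first 50,000 input rows never reach the target link. Part of the "little bit more to it" is presumably that the source has to deliver rows in the same order on every run, so an ORDER BY in the extraction SQL (or a pre-sorted source file) is needed for the skip to line up with what was already committed.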
-craig

"You can never have too many knives" -- Logan Nine Fingers