Hi DSgurus,

I have a job that extracts 3 lakh (300,000) records. During the load, the job aborts after about 55,000 rows, and 50,000 of those records have already been inserted into the target. How can I load the target starting from record 50,001? I am using DataStage 7.0, and the Transaction Grouping checkbox is not available in the ODBC stage. In which cases is it visible?

Thanks in advance,
Kiran Kumar
re running the job
Moderators: chulett, rschirm, roy
- Participant
- Posts: 437
- Joined: Fri Oct 15, 2004 6:13 am
- Location: Pune, India
I doubt you can achieve this sort of thing in a single job. You would have to get from the logs the record at which the job aborted, and then code your job so that it starts from where it left off.
Alternatively, you can set the transaction size to 0, so that all the records (3 lakh) are committed only once every one of them has been inserted successfully.
That way, even if the job fails part-way through, the next run can simply start from the first row again instead of trying to resume in the middle.
Transaction grouping is available in the ODBC stage only when two or more input links come into the stage as a target.
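The "transaction size 0" idea above can be sketched outside DataStage. This is a minimal illustration using Python's stdlib sqlite3 (the function and table names are made up for the example): every row is inserted inside a single transaction and committed only after all rows have loaded, so a mid-load failure leaves the target untouched and a rerun can safely start from row 1.

```python
import sqlite3

def load_all_or_nothing(rows):
    # One connection, one transaction: commit happens exactly once,
    # after every row has been inserted (the "transaction size 0" behaviour).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
    try:
        conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
        conn.commit()      # single commit at the very end
    except Exception:
        conn.rollback()    # on failure, nothing persists in the target
        raise
    return conn

conn = load_all_or_nothing([(1, "a"), (2, "b"), (3, "c")])
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 3
```

The trade-off is the same one discussed in this thread: a single 3-lakh-row transaction holds locks and rollback space for the whole load, but it makes the rerun logic trivial.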
Regards,
S. Kirtikumar.
Re: re running the job
kkreddy wrote:
hi DSguru's ,
i have a job which extracts 3 lakh records, when loading, after 55,000 rows the job gets aborting ...

Instead, you can load all the records again, but perhaps with a different load strategy: use Update Else Insert or Insert Else Update (if you are not already using one). If you want to commit only after all rows are successfully processed, use a Transaction Size of 0.

Hope this is what you were looking for.
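The Update Else Insert strategy suggested here can be sketched in a few lines. This is a hedged illustration using Python's stdlib sqlite3, not the DataStage implementation; the `upsert` helper and the `target` table are invented for the example. Each row first tries an UPDATE on the key and falls back to an INSERT if nothing matched, so re-running the whole load after an abort is safe: the 50,000 already-loaded rows are simply updated again rather than duplicated.

```python
import sqlite3

def upsert(conn, key, val):
    # Update Else Insert: try the update first; if no row matched the
    # key, fall back to an insert. rowcount tells us how many rows the
    # UPDATE touched.
    cur = conn.execute("UPDATE target SET val = ? WHERE id = ?", (val, key))
    if cur.rowcount == 0:
        conn.execute("INSERT INTO target (id, val) VALUES (?, ?)", (key, val))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

for k, v in [(1, "a"), (2, "b")]:          # first (aborted) load
    upsert(conn, k, v)
for k, v in [(1, "a"), (2, "b"), (3, "c")]:  # full rerun: no duplicates
    upsert(conn, k, v)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 3
```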
Last edited by loveojha2 on Fri Nov 18, 2005 5:36 am, edited 1 time in total.
Re: re running the job
loveojha2 wrote:
Instead you can load all the records but now may be with a different load strategy, Use Update Else Insert or Insert Else Update

One point worth mentioning: when dealing with a large number of rows, the Insert/Update option has performance implications, since each row may need two database operations (a failed update followed by an insert, or vice versa).
Regards,
S. Kirtikumar.
Set a job parameter for the number of rows to skip during the load, defaulting it to zero. Put a constraint in the Transformer that says, in essence:
@INROWNUM > JOB_PARAMETER
Set it to 50000 and rerun. Note that there is a little bit more to it than that, but that is the gist of it.
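The skip-N constraint above translates directly into a row filter. Here is a minimal sketch in Python (the real constraint would be written in the DataStage Transformer, and `rows_to_skip` is an invented stand-in for the job parameter): a 1-based row counter mirrors @INROWNUM, and only rows past the parameter are passed through to the load.

```python
def rows_to_load(rows, rows_to_skip=0):
    # inrownum plays the role of @INROWNUM: 1-based position of the
    # row on the input link. The constraint lets a row through only
    # when inrownum > rows_to_skip.
    for inrownum, row in enumerate(rows, start=1):
        if inrownum > rows_to_skip:
            yield row

all_rows = list(range(1, 11))                          # stand-in source
print(list(rows_to_load(all_rows)))                    # [1, ..., 10]
print(list(rows_to_load(all_rows, rows_to_skip=5)))    # [6, 7, 8, 9, 10]
```

With the parameter defaulted to 0 the job behaves normally; setting it to 50000 on the rerun loads only rows 50,001 onward. As craig notes, this assumes the source delivers rows in the same order on the rerun.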
-craig
"You can never have too many knives" -- Logan Nine Fingers