Roll Back in ODBC

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
kumar66
Participant
Posts: 265
Joined: Thu Jul 26, 2007 12:14 am

Roll Back in ODBC

Post by kumar66 »

Hi All,

I am loading data from DB2 to SQL Server, and for the SQL Server target I am using the ODBC stage.

I want to roll back the rows if the job gets aborted in the middle.

Is there any way to do that with the ODBC stage in a parallel job? Or is there any other approach?

Please advise.

Thanks & Regards,
Kumar66.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That should happen automatically as long as two things happen:

1. No intermediate commits are done
2. Job actually aborts rather than finishes with warnings
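The single-transaction behaviour described above can be sketched in Python. This is illustrative only: it uses the stdlib sqlite3 module as a stand-in for the ODBC/SQL Server target (the actual DataStage setup cannot be reproduced here), and the table name and simulated abort are hypothetical.

```python
import sqlite3

# Hypothetical target table standing in for the SQL Server table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
conn.commit()

source = [(i, f"row{i}") for i in range(1, 11)]  # 10 source records

try:
    # Condition 1: no intermediate commits -- every insert rides
    # in one open transaction.
    for rec in source:
        if rec[0] == 6:
            # Condition 2: the job actually aborts mid-load.
            raise RuntimeError("simulated abort on record 6")
        conn.execute("INSERT INTO target VALUES (?, ?)", rec)
    conn.commit()
except RuntimeError:
    conn.rollback()  # abort => the whole batch rolls back

loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(loaded)  # 0: nothing was committed
```

If either condition is broken (an intermediate commit fires, or the job finishes with warnings instead of aborting), the rows committed so far stay in the table.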
-craig

"You can never have too many knives" -- Logan Nine Fingers
kumar66
Participant
Posts: 265
Joined: Thu Jul 26, 2007 12:14 am

Post by kumar66 »

Hi Chulett,

I am not doing any commits or setting any row commit interval, and the job is getting aborted. Still, the rows are not getting rolled back.

Please advise.

Thanks & Regards,
Kumar66
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Then they are getting committed. Are any rolled back at all? What 8.x version? Have you asked your official support provider the same question?
-craig

"You can never have too many knives" -- Logan Nine Fingers
kumar66
Participant
Posts: 265
Joined: Thu Jul 26, 2007 12:14 am

Post by kumar66 »

Hi Chulett,

It's version 8.0.1.

Thanks ,
Kumar66
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

kumar66 wrote: It's version 8.0.1.
Known to be... buggy. Contact your official support provider and see if there's a patch you're missing.

:!: Seriously consider upgrading ASAP to the latest 8.1.x release.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kumar66
Participant
Posts: 265
Joined: Thu Jul 26, 2007 12:14 am

Post by kumar66 »

Hi Chulett,

Is there any other approach or design that could be used?
I have set Method = Write and Mode = Append.

Whenever my job fails, the rerun should not load the data that was already loaded.
For example, if my source has 10 records and my job aborted at record 6, then when I rerun the job it should load from the 7th record.


Please advise.

Thanks & Regards,
Kumar66.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You need to design that capability / smarts into your job. Sorry, no time for a proper explanation, perhaps others can chime in.
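One common way to design that restart capability is a checkpoint on the target's key: on rerun, skip every source record whose key is already loaded. A minimal runnable sketch, again using sqlite3 as a stand-in for the ODBC target (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

source = [(i, f"row{i}") for i in range(1, 11)]  # 10 source records

def load(rows):
    # Checkpoint: the highest key already committed to the target.
    last = conn.execute("SELECT COALESCE(MAX(id), 0) FROM target").fetchone()[0]
    for rec in rows:
        if rec[0] > last:  # resume after the checkpoint
            conn.execute("INSERT INTO target VALUES (?, ?)", rec)
    conn.commit()

# First run: rows 1-6 get committed, then the job aborts.
load(source[:6])
# Rerun: only rows 7-10 are loaded, no duplicates.
load(source)

ids = [r[0] for r in conn.execute("SELECT id FROM target ORDER BY id")]
print(ids)  # [1, 2, ..., 10]
```

This assumes the source has a monotonically increasing key and arrives in key order; without that, an upsert or a lookup against the target is the safer design.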
-craig

"You can never have too many knives" -- Logan Nine Fingers
vinnz
Participant
Posts: 92
Joined: Tue Feb 17, 2004 9:23 pm

Post by vinnz »

kumar66 wrote: I have set Method = Write and Mode = Append.

Whenever my job fails, the rerun should not load the data that was already loaded. ...
Have you tried using the Upsert method?
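The idea behind the Upsert suggestion: if reloading an already-loaded row updates it instead of failing or duplicating, the whole job becomes safe to rerun from the start. A runnable sketch of that idempotency, using sqlite3's `INSERT ... ON CONFLICT` (SQLite 3.24+) as a stand-in for the stage's upsert against SQL Server; table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

def upsert(rows):
    # Upsert: insert new keys, update existing ones, so reruns are idempotent.
    conn.executemany(
        "INSERT INTO target VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET val = excluded.val",
        rows,
    )
    conn.commit()

source = [(i, f"row{i}") for i in range(1, 11)]
upsert(source[:6])   # first run: 6 rows committed before the job aborted
upsert(source)       # rerun the whole load -- no duplicate-key errors

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 10
```

The trade-off versus a resume-from-checkpoint design is that every rerun reprocesses the full source, but no restart bookkeeping is needed.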
probal
Participant
Posts: 24
Joined: Sun Oct 25, 2009 10:17 am
Location: kolkata

Post by probal »

There is some confusion here.
I don't know how 'Append' mode would check for the data that was previously inserted. It shouldn't, unless there is a unique constraint on that specific column. Use 'Truncate' instead.

Ideally, in DataStage, the rows should be rolled back if the job aborts, but this behaviour is inconsistent (possibly due to a missing patch). In that case you have to manually truncate the table.

P.S. Use the ODBC Enterprise stage.
Probal
Post Reply