writing log file after aborting job

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

stivazzi
Participant
Posts: 52
Joined: Tue May 02, 2006 3:53 am

writing log file after aborting job

Post by stivazzi »

Good morning to everyone... I have a job that starts in parallel mode and then calls a server shared container which writes to 3 tables with "enable transaction grouping". To roll back all the tables I call UtilityAbortToLog (please don't tell me I can't use it in production; I MUST roll back, and it's the only way I found!). The problem is that I have to save the log (or rejected records) to a file to be sent to the user... and the system rolls back even the file :D
How can I do this?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

One of the options when writing to a sequential file is "Cleanup on failure", which defaults to "True". If you change that value, the file won't be removed upon job failure.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Split the job so that rows are written to a staging area (text files) by one job, then loaded (by a server job?) from those files. That way you have preserved the data to be loaded even in cases where you need to roll back.

You can monitor the load in your load job so that, if you want to restart from a known point (say 50,000 rows have been committed), you can restart the recovery load from row number 50,001.
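The restart-from-a-known-point idea can be sketched outside DataStage as a skip-ahead loader. This is only an illustration of the pattern, not DataStage code: the staging rows, the committed-row count, and the load callback here are all hypothetical stand-ins for the staging file and target-table load described above.

```python
# Sketch of the restart pattern: rows up to the last committed row
# number are skipped, and only the remainder is re-loaded.
# `staging_rows`, `committed_count`, and `load` are hypothetical
# stand-ins for the staging text file and the target-table load.

def recovery_load(staging_rows, committed_count, load):
    """Re-run a load, skipping the first `committed_count` rows
    that are already safely committed in the target tables."""
    loaded = 0
    for row_number, row in enumerate(staging_rows, start=1):
        if row_number <= committed_count:
            continue  # already committed in the earlier run
        load(row)
        loaded += 1
    return loaded

# Example: 5 staged rows, 3 already committed -> rows 4 and 5 load.
target = []
reloaded = recovery_load(["r1", "r2", "r3", "r4", "r5"], 3, target.append)
```

Because the staging files survive the rollback, the recovery run only re-reads them and skips ahead; nothing has to be re-extracted from the source.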
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.