Data loss in the sequential file when job aborts

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Why should the job abort? Kill anyone who stops it manually; we find that this prevents a second offence.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

This is a stupid requirement. What if the job aborts because the disk has filled and you cannot write to the sequential file? There is no way to avoid "data loss in the sequential file" in that case.
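To see why the loss is unavoidable, consider what happens to any plain sequential file when its writer dies mid-stream: whatever was flushed before the failure stays on disk, and the rest of the rows simply never arrive. There is no rollback for an ordinary file. A minimal sketch (plain Python, not DataStage code; the failure at row 4 is a simulated abort such as a disk-full error):

```python
# Hypothetical illustration: a writer that aborts partway through
# leaves the sequential file truncated -- plain files have no rollback.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "target.txt")

try:
    with open(path, "w") as f:
        for i in range(10):
            f.write(f"row {i}\n")
            f.flush()
            if i == 4:                      # simulate an abort (e.g. ENOSPC)
                raise OSError("no space left on device")
except OSError:
    pass                                    # the "job" has aborted here

with open(path) as f:
    rows = f.readlines()
print(len(rows))  # only 5 of the 10 intended rows survive
```

The usual mitigation is to write to a temporary file and rename it into place only on success, so downstream readers see either the complete old file or the complete new one, never a partial write; but that prevents readers from consuming a half-written file, it does not recover the rows the aborted job never produced.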

Resist stupid requirements.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
MarkB
Premium Member
Posts: 95
Joined: Fri Oct 27, 2006 9:13 am

Post by MarkB »

ray.wurlod wrote:Why should the job abort? Kill anyone who stops it manually; we find that this prevents a second offence.
... and subsequent offenses as well. :D

Additionally, I would revisit those who wrote these 'specs' and give them a dose of reality.