Aborting job if too few rows are processed
Posted: Tue Aug 28, 2007 1:05 pm
Does anyone have an elegant way to do this?
I have a job that pulls data from a database, runs through a single transform to do some formatting, then writes to a dataset. I would like this job to abort if the number of records passed is less than a certain amount.
I just don't see any way to make this happen directly in the existing transform (the amount of data is small enough I'm willing to force the job to process sequentially if need be).
What I've done is have that transform also write to an Aggregator stage for a total row count, which then links to a second transform. That transform has an output link with a constraint that fires when the count is below the cut-off point - and that constraint has "Abort after 1 row" set.
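For anyone following along, the logic of that workaround - count everything, then fail hard if the total is under a cut-off - can be sketched outside DataStage as plain Python. This is only an illustration of the control flow, not DataStage code; the function name, formatting step, and threshold are all hypothetical:

```python
# Illustrative sketch (not DataStage code) of the abort-if-under-threshold
# pattern: process all rows, tally them, then fail hard if the total falls
# below a minimum. Names and the threshold value are hypothetical.

ROW_THRESHOLD = 100  # hypothetical cut-off


def process(rows, threshold=ROW_THRESHOLD):
    """Format each row, then abort if too few rows were processed."""
    out = []
    for row in rows:
        out.append(row.strip().upper())  # stand-in for the formatting transform
    if len(out) < threshold:             # the "aggregator + constraint" check
        raise RuntimeError(
            f"Aborting: only {len(out)} rows processed (minimum {threshold})"
        )
    return out
```

The key point the sketch captures is that the check has to happen after the full row count is known, which is why the job needs the separate aggregator/constraint leg rather than a test inside the row-by-row transform itself.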
This works well enough, but it seems a bit slap-dash.
Any other ideas/suggestions on a cleaner way to make this happen? Or some obvious approach that I'm overlooking?
-> Richard