
Job finishes but the status becomes aborted

Posted: Tue Aug 26, 2008 6:47 pm
by rajeshknl
Hi,

My parallel job runs successfully: it loads the expected 100 rows to the target, which I have verified in the target table. But the status of the job becomes 'Aborted' after the run, so I can't re-run the job.

I am using a Netezza database as the target, and I am getting a weird error:
"count of bad input rows reached maxerrors limit".

I don't understand what this means. I am sure my target table has the data I am expecting.

Posted: Tue Aug 26, 2008 7:12 pm
by keshav0307
It's not a weird error.
What bad-record (error) limit have you set?

Posted: Tue Aug 26, 2008 7:53 pm
by ray.wurlod
What warnings/errors are logged (a) by the DataStage job and (b) in the Netezza environment?

Posted: Tue Aug 26, 2008 9:37 pm
by rajeshknl
By "weird" I meant weird in this context. The maxerrors count has the default value of 1.

If all the records are loaded into the target, where is the scope for bad/error records?
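One way to see how a limit of 1 can trip even when the target looks complete: a single rejected record (for example, a non-numeric value in a numeric column) is enough to hit the limit, while every other record still loads. A minimal sketch of counting such candidate rows outside DataStage; the file name data.csv and the assumption that column 2 must be numeric are purely illustrative:

```shell
# Sketch only: count rows whose second comma-delimited field is not
# numeric (data.csv and the column rule are assumptions, not from the job).
bad=$(awk -F',' '$2 !~ /^[0-9]+$/ { n++ } END { print n+0 }' data.csv)
echo "candidate bad rows: $bad"
# With maxerrors at its default of 1, even one such row aborts the load.
if [ "$bad" -ge 1 ]; then
  echo "load would abort (maxerrors = 1)"
fi
```

If this prints a non-zero count, the rejected row is the "scope" for bad records even though the remaining rows made it into the table.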

Posted: Tue Aug 26, 2008 11:19 pm
by ray.wurlod
Is the first line (column names) not being handled properly? I recall one job where the "first line is column names" option was off even though the first line did contain column names. The file was sorted as it went through DataStage, and that caused the job to abort (from memory, with an "Invalid number" error).
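A quick check along those lines: if the first record really is a header of column names, strip it before the load (or enable the stage's first-line-is-column-names option). A minimal sketch, with data.csv as a stand-in file name:

```shell
# Drop the first line (assumed to be a column-name header) before loading.
tail -n +2 data.csv > data_nohdr.csv
# data_nohdr.csv now has one fewer line than the original file.
wc -l < data_nohdr.csv
```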

Posted: Tue Aug 26, 2008 11:27 pm
by tkbharani
I have also faced this problem.
Check whether the file count matches the number of inserted records.

A. Maybe the file contains a header or footer.
B. Check whether your job is overwriting the error count setting.

Also check whether any after-job subroutine is being called in the job that is making it fail.
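The file-count check above can be sketched in a couple of lines; data.csv, the one-line header, and the loaded count of 100 (the figure from the original post) are assumptions for illustration:

```shell
# Compare the source-file record count (minus an assumed 1-line header)
# with the number of rows the job log reported as inserted.
file_rows=$(( $(wc -l < data.csv) - 1 ))
loaded_rows=100   # value taken from the DataStage job log
if [ "$file_rows" -ne "$loaded_rows" ]; then
  echo "mismatch: $file_rows rows in file vs $loaded_rows loaded"
fi
```

Any mismatch here points to a header/footer record or a reject being counted against the maxerrors limit.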

Posted: Wed Aug 27, 2008 10:07 am
by rajeshknl
I don't have any header/footer. Also, I don't have any after/before-job subroutines.