Hi,
My parallel job runs successfully and loads the expected 100 rows into the target; I have verified the target table. But the status of the job becomes 'Aborted' after the run, so I can't re-run the job.
I am using a Netezza database as the target, and I am getting a strange error:
"count of bad input rows reached maxerrors limit".
I don't understand what this means. I am sure my target table has the data I am expecting.
Job finishes but the status becomes aborted
Is the "first line is column names" setting not being handled properly? I recall one job where this setting was off even though the first line of the file did contain column names; the file was sorted as it went through DataStage, and the header row caused the job to abort (from memory, with an 'Invalid number' error).
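As a hedged sketch of that failure mode (the column values and file contents here are hypothetical, not from the original job): once a header line is sorted in among the data rows, it fails numeric conversion downstream and is counted as a bad input row, which is exactly what the maxerrors limit tracks.

```python
# Hypothetical example: a header line that was not stripped, so after
# sorting it is treated as an ordinary data row.
rows = ["amount", "100", "250", "37"]  # "amount" is the stray header

bad = 0
loaded = []
for r in rows:
    try:
        loaded.append(int(r))  # the target column expects a number
    except ValueError:
        bad += 1               # the header row counts as a bad input row

print(loaded, bad)  # → [100, 250, 37] 1
```

All 100 real rows can still load correctly while the single rejected header row pushes the bad-row count toward the maxerrors limit.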
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I have also faced this problem.
Check whether the file's row count matches the number of inserted records:
A. Maybe the file contains a header or footer row.
B. Check whether your job is overwriting the error-count setting.
Also check whether any after-job subroutine is being called in the job that is causing it to fail.
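The count comparison in point A above can be sketched as follows (a minimal example with a hypothetical file path and header/footer counts; adjust them to your own layout):

```python
# Hypothetical input file: 1 header line followed by 3 data rows.
from pathlib import Path

path = Path("/tmp/sample_input.dat")
path.write_text("COL_A,COL_B\n1,x\n2,y\n3,z\n")

HEADER_LINES = 1  # lines to exclude at the top of the file
FOOTER_LINES = 0  # lines to exclude at the bottom, if any

total = sum(1 for _ in path.open())
expected = total - HEADER_LINES - FOOTER_LINES
print(expected)  # → 3
```

Compare `expected` with a `SELECT COUNT(*)` on the Netezza target table; if they match, the "bad input rows" are likely just the header/footer lines being rejected rather than real data loss.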
Thanks, BK