Hi All
I have 10,000 rows of data, and the job gets aborted while loading into the warehouse after loading 5,000 rows. How do I handle this situation so that it loads from record 5,001, in DataStage PX (not Server)?
Thanks
Sanjay
Jobs aborted in the middle of dataloading into warehouse
Moderators: chulett, rschirm, roy
Re: Jobs aborted in the middle of dataloading into warehouse
First, find out why the job is aborting, and fix that. How are you handling your rejects? Second, make sure that the 5,000 rows passed by your job have actually been committed to the target, so that you can restart from row 5,001. If the rows-per-transaction setting was not set to facilitate this, you will find that the rows prior to the abort have rolled back, and you might have to rerun the entire job.

sanjay wrote:
> I have 10,000 rows of data, and the job gets aborted while loading into the warehouse after loading 5,000 rows. How do I handle this situation so that it loads from record 5,001, in DataStage PX (not Server)?
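To illustrate the point above about rows-per-transaction: whether you can restart from row 5,001 depends entirely on where the last commit landed before the abort. This is a minimal sketch (plain Python, not DataStage syntax; the function names and the list-as-target are hypothetical) showing why a commit interval makes a restart possible, and why uncommitted rows roll back.

```python
# Sketch: a loader that "commits" every commit_size rows. Rows pending at the
# moment of an abort are lost (rolled back); committed rows survive, so a
# rerun can skip them and resume from the next uncommitted row.

def rows_already_loaded(target):
    """Count rows already committed to the (hypothetical) target."""
    return len(target)

def load_rows(rows, target, commit_size=1000, fail_at=None):
    """Append rows to target, committing every commit_size rows.
    Raises to simulate a mid-load abort when row index fail_at is reached."""
    committed = rows_already_loaded(target)
    pending = []
    for i, row in enumerate(rows[committed:], start=committed):
        if fail_at is not None and i == fail_at:
            raise RuntimeError("job aborted")  # pending rows are lost
        pending.append(row)
        if len(pending) == commit_size:
            target.extend(pending)  # commit point
            pending = []
    target.extend(pending)          # final commit

target = []
rows = list(range(10_000))
try:
    load_rows(rows, target, commit_size=1000, fail_at=5500)
except RuntimeError:
    pass

# Only the 5 full commits (5,000 rows) survived; rows 5000-5499 rolled back.
restart_from = rows_already_loaded(target)
load_rows(rows, target, commit_size=1000)  # rerun resumes from there
```

If the commit interval had been larger than 5,000, nothing would have survived the abort and a full rerun would be the only option, which is exactly the situation described above.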
gateleys
Re: Jobs aborted in the middle of dataloading into warehouse
Oops!! This is a post for a parallel job. I wouldn't know exactly how that works there, but the database principles should remain the same.
gateleys
gateleys
Re: Jobs aborted in the middle of dataloading into warehouse
More information needed. Could you post the logs? That might help in resolving the issue.
Re: Jobs aborted in the middle of dataloading into warehouse
Why? Why would you not just send the rows to the database again? Set your database stage to insert-then-update. If your table has a primary key or unique index on it, then the insert will fail due to a constraint violation, and the stage will then update your data with the same values that are already in your table. No harm, no foul.

sanjay wrote:
> How do I handle this situation so that it loads from record 5,001, in DataStage PX (not Server)?

Now, if you don't have a unique index, you would use update-then-insert instead. But I would strongly recommend your table have a unique index.
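The insert-then-update approach above can be sketched as follows. This is a minimal illustration using sqlite3 as a stand-in target database; the table and column names are hypothetical, not anything from the original job. The key point is that with a primary key in place, re-sending already-loaded rows is harmless: the insert fails on the key constraint and falls back to an update with identical values.

```python
# Sketch of "insert then update": try the insert; on a primary-key
# violation, the row was already loaded, so update it instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (id INTEGER PRIMARY KEY, val TEXT)")

def insert_then_update(conn, row_id, val):
    try:
        conn.execute("INSERT INTO warehouse (id, val) VALUES (?, ?)",
                     (row_id, val))
    except sqlite3.IntegrityError:
        # Primary-key violation: row already loaded, update it instead.
        conn.execute("UPDATE warehouse SET val = ? WHERE id = ?",
                     (val, row_id))

# First run loaded rows 1-5000 before aborting; the rerun simply
# re-sends all 10,000 rows with no duplicates and no harm done.
for i in range(1, 5001):
    insert_then_update(conn, i, f"row{i}")
for i in range(1, 10001):
    insert_then_update(conn, i, f"row{i}")
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM warehouse").fetchone()[0]
```

Without a unique index the insert never fails, so the same logic would silently load rows 1-5,000 twice; that is why the reply recommends the index so strongly.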
Keith Williams
keith@peacefieldinc.com
Did the 5000 rows actually make it into the target table? Or did the entire transaction roll back? Do you use a staging area of any kind between your ETL and your actual load process?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
First, try loading into a sequential file, and make sure that you don't have any constraints set in your job, such as 'Abort after' or 'Read first n'.
If not, check whether you are able to load the same set of data into the database manually.
Make sure that the records after 5,000 don't violate any referential or uniqueness constraints.
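The constraint checks suggested above can be done on the data itself before retrying the load. This is a hedged sketch (plain Python; the field names, the sample rows, and the parent-key set are all hypothetical) of scanning a batch for uniqueness and referential violations:

```python
# Sketch: scan a batch of rows (as dicts) for two common load killers:
# duplicate keys (uniqueness violation) and references to keys that do
# not exist in the parent table (referential-integrity violation).

def find_violations(rows, key_field, parent_keys, ref_field):
    """Return (duplicate_keys, orphan_refs) found in the batch."""
    seen, duplicates, orphans = set(), [], []
    for row in rows:
        key = row[key_field]
        if key in seen:
            duplicates.append(key)          # would violate unique index
        seen.add(key)
        if row[ref_field] not in parent_keys:
            orphans.append(row[ref_field])  # would violate foreign key
    return duplicates, orphans

# Hypothetical records after row 5,000, checked against a parent table
# that only contains departments "A" and "B".
rows = [
    {"id": 5001, "dept": "A"},
    {"id": 5002, "dept": "B"},
    {"id": 5002, "dept": "A"},  # duplicate id
    {"id": 5003, "dept": "Z"},  # dept "Z" has no parent row
]
dups, orphans = find_violations(rows, "id", {"A", "B"}, "dept")
```

Finding the offending records this way is usually faster than letting the database reject them one load attempt at a time.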
Impossible doesn't mean 'it is not possible'; it actually means 'NOBODY HAS DONE IT SO FAR'.