Jobs aborted in the middle of dataloading into warehouse

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

sanjay
Premium Member
Posts: 203
Joined: Fri Apr 23, 2004 2:22 am

Jobs aborted in the middle of dataloading into warehouse

Post by sanjay »

Hi All

I have 10,000 rows of data, and the job aborted while loading into the warehouse after loading 5,000 rows. How do I handle this situation and load from the 5,001st record in DataStage PX (not Server)?


Thanks
Sanjay
gateleys
Premium Member
Posts: 992
Joined: Mon Aug 08, 2005 5:08 pm
Location: USA

Re: Jobs aborted in the middle of dataloading into warehouse

Post by gateleys »

sanjay wrote: Hi All

I have 10,000 rows of data, and the job aborted while loading into the warehouse after loading 5,000 rows. How do I handle this situation and load from the 5,001st record in DataStage PX (not Server)?

Thanks
Sanjay
First, find out why the job is aborting, and fix that. How are you handling your rejects? Second, make sure that the 5,000 rows passed by your job were actually committed to the target, so that you can restart from row 5,001. If the rows-per-transaction setting was not chosen to facilitate this, you will find that the rows loaded before the abort have rolled back, and you may have to rerun the entire job.
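The commit-size point above can be sketched outside DataStage. This is a minimal illustration in Python with SQLite (not DataStage stage syntax; the `warehouse` table, column names, and batch size are hypothetical): committing every batch means an abort only rolls back the uncommitted tail, and a rerun can resume from the last committed row rather than row 1.

```python
import sqlite3

def load_rows(conn, rows, batch_size=1000, start_at=0):
    """Load rows in committed batches; return the count of rows committed.

    If a failure occurs mid-load, only the uncommitted batch rolls back,
    so a rerun can pass start_at=<returned count> to resume there.
    """
    cur = conn.cursor()
    loaded = start_at
    try:
        for i in range(start_at, len(rows), batch_size):
            batch = rows[i:i + batch_size]
            cur.executemany("INSERT INTO warehouse (id, val) VALUES (?, ?)", batch)
            conn.commit()          # rows up to here survive a later abort
            loaded = i + len(batch)
    except sqlite3.Error:
        conn.rollback()            # discard only the partial batch
    return loaded

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (id INTEGER PRIMARY KEY, val TEXT)")
rows = [(i, f"row{i}") for i in range(10000)]
done = load_rows(conn, rows, batch_size=1000)
# After an abort, rerun with load_rows(conn, rows, start_at=done).
```

If rows-per-transaction is instead set to commit only at end-of-job, the whole load rolls back on an abort and there is nothing to resume from.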

gateleys
gateleys
Premium Member
Posts: 992
Joined: Mon Aug 08, 2005 5:08 pm
Location: USA

Re: Jobs aborted in the middle of dataloading into warehouse

Post by gateleys »

Oops! This is a post about a parallel job, so I would not know how that works there. But the database principles should remain the same.

gateleys
vcannadevula
Charter Member
Posts: 143
Joined: Thu Nov 04, 2004 6:53 am

Re: Jobs aborted in the middle of dataloading into warehouse

Post by vcannadevula »

More information needed. Could you post the logs? That might help in resolving the issue.
kwwilliams
Participant
Posts: 437
Joined: Fri Oct 21, 2005 10:00 pm

Re: Jobs aborted in the middle of dataloading into warehouse

Post by kwwilliams »

sanjay wrote: How do I handle this situation and load from the 5,001st record in DataStage PX (not Server)?
Why? Why not just send all the rows to the database again? Set your database stage to insert then update. If your table has a primary key or unique index on it, the insert of an already-loaded row will fail with a constraint violation, and the stage will then update that row with the same values it already holds. No harm, no foul.

Now, if you don't have a unique index, you would use update then insert instead. But I would strongly recommend that your table have a unique index.
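The insert-then-update behaviour described above can be sketched generically. This is an illustrative Python/SQLite example, not the DataStage stage implementation (the `warehouse` table and `upsert` helper are made up for the sketch): a primary-key violation on the insert falls through to an update with the same values, so resending all 10,000 rows is safe.

```python
import sqlite3

def upsert(conn, rows):
    """Insert each row; on a primary-key violation, update instead.

    Re-sending rows that were already committed is harmless: the failed
    insert falls back to an update that writes identical values.
    """
    cur = conn.cursor()
    for pk, val in rows:
        try:
            cur.execute("INSERT INTO warehouse (id, val) VALUES (?, ?)", (pk, val))
        except sqlite3.IntegrityError:   # duplicate key: row already loaded
            cur.execute("UPDATE warehouse SET val = ? WHERE id = ?", (val, pk))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (id INTEGER PRIMARY KEY, val TEXT)")
upsert(conn, [(1, "a"), (2, "b")])              # first, partial load
upsert(conn, [(1, "a"), (2, "b"), (3, "c")])    # rerun: 1 and 2 take the update path
```

Without a unique index, the insert never fails, which is why the reversed order (update then insert) is needed in that case.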
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Did the 5000 rows actually make it into the target table? Or did the entire transaction roll back? Do you use a staging area of any kind between your ETL and your actual load process?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

First, try to load into a sequential file, and make sure you don't have any constraints set in your jobs, like 'Abort after' or 'Read first n'.
If not, check whether you can load the same set of data into the database manually.
Make sure that, after the first 5,000 records, the data doesn't violate any referential or uniqueness constraints.
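The uniqueness check suggested above can be done on the staged file before loading. A minimal sketch in Python (the key-extraction and the synthetic data are assumptions for illustration): scan the rows for keys that occur more than once, since those are exactly the rows that would fail a unique index partway through the load.

```python
from collections import Counter

def duplicate_keys(rows):
    """Return keys appearing more than once; each would violate a unique
    index at load time."""
    counts = Counter(key for key, _ in rows)
    return [k for k, n in counts.items() if n > 1]

# Synthetic 10,000-row dataset where rows 7000..9999 reuse keys 0..2999.
rows = [(i % 7000, f"row{i}") for i in range(10000)]
dups = duplicate_keys(rows)
```

Running such a check against the sequential file narrows the abort down to either bad data (duplicates found) or a job/database setting (none found).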
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'