
while loading to target Abnormal termination of stage

Posted: Tue Aug 24, 2004 2:16 am
by K.A.R.PAULSON
Hi,

I am loading 172,000 records from a sequential file into an Oracle 9i database. At the end of the job I get the error below:

“Abnormal termination of stage Dim_Customer_FMT_PCARD_TRANS_Update_2..Transformer_58 detected”

The source, transformer, and target column data types are listed below:

No.  Column name     Sequential file   Transformer_58   Oracle 9i
 1   cust_surr_key   -                 @INROWNUM        decimal(6)
 2   card_num        varchar(19)       varchar(19)      varchar2(19)
 3   cust_name       varchar(40)       varchar(40)      varchar2(40)
 4   busin_type      varchar(40)       varchar(40)      varchar2(40)
 5   category        varchar(18)       varchar(18)      varchar2(18)
 6   territory       varchar(40)       varchar(40)      varchar2(40)
 7   salesman        varchar(11)       varchar(11)      varchar2(11)
 8   balance         decimal(8,2)      decimal(8,2)     decimal(8,2)
 9   update_date     timestamp         timestamp        timestamp
10   last_updt_user  -                 @LOGNAME         varchar2(20)


Even though the job aborted, I can see data in the target.

The job is just a direct mapping. Can anyone help me in detail?

Thanks

Regi

Posted: Tue Aug 24, 2004 2:49 am
by anupam
If your job has aborted, then it has not loaded all 172,000 records into the database; it may have loaded fewer. Please check the row count in the target table.

If the table has fewer records than expected, that may be due to erroneous records, and the job aborted after hitting the warning limit (50 by default).

Please check the Director log for warnings.
Then reset the job and paste the message from the previous run here; it should give some information about why the job aborted.
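The reset-and-check cycle above can also be scripted with the `dsjob` command-line client that ships with DataStage. A minimal sketch; the project name `myproject` and the event id are placeholders, and the job name is taken from the error message in the first post:

```shell
# Summarize the most recent warning entries in the job log
dsjob -logsum -type WARNING -max 20 myproject Dim_Customer_FMT_PCARD_TRANS_Update_2

# Show the full text of one log event (use an event id from the summary)
dsjob -logdetail myproject Dim_Customer_FMT_PCARD_TRANS_Update_2 42

# Reset the aborted job so the next run starts clean
dsjob -run -mode RESET myproject Dim_Customer_FMT_PCARD_TRANS_Update_2
```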

Re: while loading to target Abnormal termination of stage

Posted: Tue Aug 24, 2004 3:33 pm
by ksmurthys
Hi Paul,

I got the same "Abnormal termination" error when I tried to load around 6,900,000 records from a staging Oracle table into a production table. 4,200,000 records were added to the target table, and then I got the abnormal termination error. I thought it was a DataStage resource problem, so I loaded 4,000,000 records first and then loaded the remaining 2,900,000 records using a filter on a date field. You can try the same approach; it definitely works.
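The split-load workaround above amounts to partitioning the source extract into two date-bounded batches. A minimal sketch of the idea using SQL*Plus; the connect string, table name, column name, and cut-over date are all assumptions for illustration, not taken from the original jobs:

```shell
# First batch: rows before a chosen cut-over date (hypothetical names)
sqlplus -s user/password@stagedb <<'SQL'
SELECT * FROM stg_pcard_trans WHERE load_date < DATE '2004-08-01';
SQL

# Second batch: the remaining rows
sqlplus -s user/password@stagedb <<'SQL'
SELECT * FROM stg_pcard_trans WHERE load_date >= DATE '2004-08-01';
SQL
```

In the actual jobs, the same filter would go into the Oracle source stage's SQL, with the cut-over date supplied as a job parameter so both batches run from one job design.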

Thanks,
Murthy.

Posted: Tue Aug 24, 2004 7:13 pm
by dhiraj
Paul,

Is inter-process row buffering turned on in your job?
If so, try running the job after disabling it.

I have jobs that extract over 12 million records from a table (approximately 6 GB of data in the flat file), so I don't think DataStage resources are the problem.

dhiraj