
MultiLoad Job Hangs Near Completion of Record Load

Posted: Tue Jul 17, 2007 2:33 am
by reddy.cba
Hi all,
I am running a MultiLoad job in DataStage Parallel Edition.
The input file is a huge dataset containing more than 12 million records, and I am writing it to Teradata using the MultiLoad option.

After writing nearly 12 million records, the job stops doing anything... it just hangs.

Can anyone tell me why this is happening? Do I need to add any parameters when the input file is this large?

Thanks In Advance
Vamshi

Money has not created Man, Man has created Money... so don't die for Money...

Posted: Tue Jul 17, 2007 3:54 am
by hamzaqk
Not a lot of information to draw any conclusions from... have you tried running the MultiLoad script manually?

    mload < abc.mld

Does it run OK?

It seems more like a network issue to me than an issue pertaining to MultiLoad. Try running the job at a different time of day.
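If you need a starting point for a standalone test, a bare-bones script looks something like this; the log table, logon string, layout, column names, and file name below are just placeholders, so adjust them for your environment:

    .LOGTABLE mydb.mld_restartlog;     /* restart/checkpoint log table */
    .LOGON tdpid/username,password;
    .BEGIN MLOAD TABLES mydb.target_table;
    .LAYOUT infile_layout;             /* describes one input record */
    .FIELD col1 * VARCHAR(10);
    .FIELD col2 * VARCHAR(50);
    .DML LABEL ins_dml;
    INSERT INTO mydb.target_table (col1, col2)
    VALUES (:col1, :col2);
    .IMPORT INFILE input.dat           /* pipe-delimited text input */
        FORMAT VARTEXT '|'
        LAYOUT infile_layout
        APPLY ins_dml;
    .END MLOAD;
    .LOGOFF;

Run it with mload < abc.mld and watch the phase messages; if it stalls in the acquisition or application phase, you will at least know where it is getting stuck.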

Teradata Certified Master V2R5

Posted: Tue Jul 17, 2007 4:00 am
by reddy.cba
I haven't tried running it manually yet, but I will...
Just in case: are there any parameters I need to add when reading from a big dataset and writing into Teradata, so the job runs smoothly?

The same job ran fine when I tested with fewer than 100 records, so I assume nothing is wrong with the job design; it's just the volume. Maybe I need to add some parameters to the job?

Thanks In Advance
Vamshi

Posted: Wed Jul 18, 2007 4:43 am
by reddy.cba
Found the problem...
The primary index was created on the wrong column, and as a result the job took ages to finish. (Teradata distributes rows across AMPs by hashing the primary index, so a badly skewed primary index leaves a few AMPs doing almost all the work.)

After changing the primary index column, it worked.
If any of you face a problem like this, check the DDL for the target table and also check the indexing.
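A quick way to check on Teradata is SHOW TABLE, or a lookup against the DBC dictionary views; the database, table, and column names below are placeholders:

    SHOW TABLE MyDB.MyTargetTable;  -- prints the full DDL, including the PRIMARY INDEX clause

    -- or pull just the primary index columns from the data dictionary
    SELECT ColumnName, IndexType, UniqueFlag
    FROM DBC.Indices
    WHERE DatabaseName = 'MyDB'
      AND TableName    = 'MyTargetTable'
      AND IndexType    = 'P';

    -- rough skew check: rows per AMP for a candidate primary index column
    SELECT HASHAMP(HASHBUCKET(HASHROW(pi_column))) AS amp_no, COUNT(*)
    FROM MyDB.MyTargetTable
    GROUP BY 1
    ORDER BY 2 DESC;

If the last query shows most rows landing on a handful of AMPs, that column is a poor primary index choice.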

Many thanks to all
Vamshi