Load Problem
Moderators: chulett, rschirm, roy
hi all
I have one query.
I have data of 1 crore records. While loading, the job aborted after 50k records. What should I do about the remaining records?
1. Re-run the whole job again, or
2. Load only the remaining records? In this case, how would I go about it?
Thanks in advance
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Not enough information.
What stage types are you using? What is the source? Are there any constraints in the source select statement that limit the row count to 5 crore? What is the "rows per transaction" setting? Do any error/warning messages from Oracle appear in the job log? Are you using the OCI stage or the bulk loader? Reset the job in Director - does a message "from previous run..." appear?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
- Participant
- Posts: 24
- Joined: Fri Aug 26, 2005 3:52 pm
Re: Load Problem
Hi
Solutions:
1. Delete the records already loaded from the target and rerun the job.
2. Add a hash file to store the record count and use it as a reference on the second run. (Example: if you loaded 50,000 rows, the hash file will hold 50000; on rerun, skip any row where @INROWNUM <= the hash file count, else send it to the target.)
3. Set the transaction size to a number so that rows are committed periodically; that way a failure won't lose everything already loaded.
HTH
Thanks
Raghu
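Raghu's solution 2 is essentially a row-count checkpoint. Here is a minimal sketch of the idea in Python (not DataStage BASIC): a plain text file stands in for the hash file, and an ordinary row counter stands in for @INROWNUM. All names here are illustrative assumptions, not part of any DataStage API.

```python
import os

CHECKPOINT = "load_checkpoint.txt"  # stands in for the hash file

def read_checkpoint(path=CHECKPOINT):
    """Return the number of rows committed by the previous run (0 if none)."""
    if not os.path.exists(path):
        return 0
    with open(path) as f:
        return int(f.read().strip() or 0)

def write_checkpoint(count, path=CHECKPOINT):
    """Record how many input rows have been committed so far."""
    with open(path, "w") as f:
        f.write(str(count))

def load(rows, target, path=CHECKPOINT):
    """Load rows into target, skipping rows a previous run already committed."""
    already_done = read_checkpoint(path)
    for rownum, row in enumerate(rows, start=1):
        if rownum <= already_done:      # equivalent of @INROWNUM <= hash file count
            continue
        target.append(row)
        write_checkpoint(rownum, path)  # update the checkpoint after each commit
```

On a rerun with the same input in the same order, rows up to the checkpoint are skipped and loading resumes where the aborted run stopped. Note this only works if the source returns rows in a stable order between runs.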
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Common on the subcontinent (India, Sri Lanka, Pakistan, Bangladesh). "Lakh" and "crore" are counting units, just as you use "thousands" and "millions".
One lakh = 100,000 (but usually written 1,00,000).
One crore = 100 lakh = 10 million (usually written 1,00,00,000)
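The values and the Indian digit-grouping style (groups of two after the last three digits) are easy to check with a short sketch; the helper function here is purely illustrative:

```python
LAKH = 100_000            # one lakh
CRORE = 100 * LAKH       # one crore = 100 lakh = 10 million

def indian_grouping(n):
    """Format a non-negative integer in Indian style: 1,00,00,000 for one crore."""
    s = str(n)
    if len(s) <= 3:
        return s
    head, tail = s[:-3], s[-3:]   # last three digits form the first group
    groups = []
    while len(head) > 2:          # remaining digits in groups of two
        groups.insert(0, head[-2:])
        head = head[:-2]
    if head:
        groups.insert(0, head)
    return ",".join(groups + [tail])
```

So `indian_grouping(LAKH)` gives `1,00,000` and `indian_grouping(CRORE)` gives `1,00,00,000`, matching the conventional written forms above.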
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.