G'day,
I am trying to load a DB2 database with on the order of 7.5 million rows.
When using DB2 'load' (as opposed to 'write'), not all of my rows are inserted into the database. From what I can tell, exactly the first 70,000 rows are loaded. The DataStage statistics and the Director log both indicate that all 7.5 million rows were passed to the stage.
Note: when using 'write' to insert the rows there is no problem; however, the job runs too slowly for our needs.
Has anyone encountered this problem using DB2 load and what was your final solution?
Zac.
DB2 Load row limit
Is there some limiting setting for the DB2 bulk loader? DataStage has sent all the rows; one can only assume that there's something in DB2 blocking them. Ask your DBA to check.
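One setting worth ruling out (an assumption, not something confirmed in this thread): the DB2 LOAD utility accepts options such as ROWCOUNT, SAVECOUNT and WARNINGCOUNT that can cap or stop a load early. If the stage generates a load script or command file, a quick grep shows whether any such limit is present; the file name `load_script.clp` below is hypothetical, so adjust it to wherever the stage writes its generated load command.

```shell
# Hypothetical file name: point this at the generated DB2 load command.
# ROWCOUNT caps the rows loaded; SAVECOUNT/WARNINGCOUNT can also end a load early.
grep -inE 'ROWCOUNT|SAVECOUNT|WARNINGCOUNT' load_script.clp \
  || echo "no explicit row limits found in load_script.clp"
```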
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Doma,
this might be the DB2 equivalent of an Oracle rollback segment filling up. What is your commit frequency for this job? Also check the generated sequential data file(s): do they stop at a logical size limit such as 2 GB (perhaps the DB/2 read program uses only a 32-bit file pointer)?
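To test the 2 GB hypothesis, compare the size of each generated data file against the 32-bit signed file-offset ceiling (2^31 = 2,147,483,648 bytes). This is a generic sketch, not DataStage- or DB2-specific; the helper name and the example path are made up for illustration.

```python
import os

TWO_GIB = 2 ** 31  # 2147483648 bytes: the classic 32-bit file-offset ceiling


def near_2gb_limit(path, tolerance=1024 * 1024):
    """Return True if the file's size sits at (or within `tolerance` bytes of)
    2 GiB, which would suggest the writer stopped at a 32-bit size limit."""
    size = os.path.getsize(path)
    return abs(size - TWO_GIB) <= tolerance


# Example (path is hypothetical):
# near_2gb_limit("/data/ds_load/part_0001.dat")
```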