Hi,
I am trying to import around 60 million records from SAP R/3 into a DB2 database. The source is an ABAP Program stage, which pulls the records from R/3 via an ABAP program; the data is then loaded into DB2 with no transformation in between.
The complete load takes around 40 hours, which works out to roughly 450 rows per second and is pretty slow. I am also unsure whether the ABAP Program stage is able to utilize DataStage job parallelism.
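As a quick sanity check on those numbers (plain shell arithmetic, nothing DataStage-specific), 60 million rows over 40 hours does land right around the quoted rate:

```shell
# Back-of-the-envelope check: 60M rows in 40 hours, in rows per second.
total_rows=60000000
hours=40
echo $(( total_rows / (hours * 3600) ))   # prints 416 (integer division), close to the ~450 quoted
```

So the 40-hour figure and the ~450 rows/s figure are consistent with each other; the bottleneck is the per-row rate itself.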
Please provide suggestions and feedback on how the job's performance can be improved, and please also comment on the parallelism of the ABAP Program stage.
Regards
Abhijit
Load 60 Million records from SAP to DB2
Re: Load 60 Million records from SAP to DB2
I am assuming you are doing a normal "Insert" operation. This is a logged activity and hence it consumes a lot of time. You could try using Bulk loading.
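To illustrate the difference (every database, schema, table, and file name below is made up, and the exact options depend on your DB2 version and platform, so check with your DBA first): a normal insert pushes every row through the transaction log, while the DB2 LOAD utility writes data pages directly and skips per-row logging, which is why it is so much faster for a one-off 60M-row load. A CLP invocation looks something like:

```shell
# Illustrative only -- names and paths are assumptions, not from the thread.
db2 connect to MYDB
db2 "LOAD FROM /data/sap_extract.del OF DEL
     MESSAGES /data/load.msg
     REPLACE INTO MYSCHEMA.TARGET_TABLE
     NONRECOVERABLE"
db2 terminate
```

Note that NONRECOVERABLE trades away rollforward recoverability for the loaded table until the next backup, so it is something to agree on with your DBA rather than apply blindly.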
Vivek Gadwal
Experience is what you get when you didn't get what you wanted
Re: Load 60 Million records from SAP to DB2
Can you please let me know a bit more about how to do the bulk loading?
Re: Load 60 Million records from SAP to DB2
creatingfusion wrote: can u please let me know a bit more on how to do the bulk loading
That's a DB2 question, not a DataStage one. It depends on the details of your DB2 installation. For example, the first question might be whether you have the BMC utilities available. If you do, it might be a lot faster to FTP your data from Unix to the mainframe first, then run a mainframe job for the bulk load. DataStage might be the wrong approach, especially since you have no transformations involved.
I agree with Vivek. Your biggest performance hit is likely the logging. Anything you can do to avoid it is a good first step.
Re: Load 60 Million records from SAP to DB2
Bulk load is another way of loading rows into a table. It is very handy for huge loads like yours, writing into the table from the back end. You can learn about bulk loading in detail from the many resources available online and by talking to your DBA.
However, you need to learn about the DB2 Bulk Load stage and how to configure it. The DataStage documentation for this stage gives detailed set-up and configuration instructions. The stage creates three files (it doesn't load directly into the table), so you will need a script that triggers the load after the files are created. I would recommend talking to your DBA about this script, as your shop may already have one.
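A rough sketch of what such a trigger script might look like (purely hypothetical: the directory, database, table, and file names are assumptions, and your shop's existing script will differ):

```shell
#!/bin/sh
# Hypothetical after-job wrapper: run once the Bulk Load stage has written
# its files. All names and paths below are assumptions for illustration.
DATA_DIR=/staging/bulkload
DATA_FILE="$DATA_DIR/target_table.del"

# Refuse to run if the stage did not produce a non-empty data file.
[ -s "$DATA_FILE" ] || { echo "no data file found, aborting" >&2; exit 1; }

db2 connect to MYDB || exit 1
db2 "LOAD FROM $DATA_FILE OF DEL
     MESSAGES $DATA_DIR/load.msg
     REPLACE INTO MYSCHEMA.TARGET_TABLE"
rc=$?
db2 terminate
exit $rc
```

The script would typically be attached as an after-job routine so the load fires automatically when the DataStage job finishes writing the files.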
Vivek Gadwal
Experience is what you get when you didn't get what you wanted