
Loading SAP data through DataStage

Posted: Sun Mar 02, 2008 3:42 pm
by mydsworld
Please let me know the methods (DataStage plug-ins) for loading SAP data, keeping in mind:

1. Huge volume of data
2. Small volume of data

Please let me know which SAP plug-in is to be used in which case.

Thanks.

Posted: Sun Mar 02, 2008 9:59 pm
by vmcburney
There is a SAP plugin stage for DataStage parallel jobs that can read from a SAS dataset, write to a SAS dataset and run a SAS program. So you could for example read from an external data source, pass it to some SAS code and write the results to a SAS dataset from a parallel DataStage job.

Posted: Sun Mar 02, 2008 11:24 pm
by ray.wurlod
Umm... SAS and SAP are very different beasts!

Different loading options for SAP data through DataStage

Posted: Mon Mar 03, 2008 9:56 am
by mydsworld
I am looking for SAP data loading, not SAS.

Basically, I am looking for which SAP plug-in (in DataStage) to use in which case (small data volume / huge data volume and other relevant considerations).

Posted: Mon Mar 03, 2008 10:09 am
by chulett
Umm... typo?

Posted: Mon Mar 03, 2008 3:36 pm
by ray.wurlod
Nice try but, if so, what's a SAP dataset?
:?

Posted: Mon Mar 03, 2008 3:46 pm
by chulett
Hell if I know, me no do SAP. A dataset you've created from SAP data?

Posted: Mon Mar 03, 2008 4:15 pm
by mansoor_nb
There are several ways to load data into an SAP system.
1) If the data you are trying to load goes into SAP standard tables, then you need to write a BDC (see the sketch after this post).
a) First, prepare the data and place the file on the SAP system.
b) Once the file is in place, the BDC (ABAP code) should be triggered independently.
2) If the data is being loaded into SAP custom tables, then you can write a BAPI. Point a) remains the same; once the file is created, the BAPI should be called from DataStage. The import parameters of the BAPI should be the file name and file path on the SAP system (see the wrapper sketch after this post).
OR
You can write independent ABAP code that loads the data into the SAP custom table.

The above two approaches can be followed for any data volume.

If the data volume is small, say in the thousands of records, then you can go for the IDoc Load stage, but the IDoc Load stage does not give a complete error message when it fails to execute, which makes issues very hard to debug or fix.
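
For the BDC route, here is a minimal ABAP sketch, assuming DataStage has already written a flat file to the application server. The file path, transaction code, module pool and screen field names below are placeholders, not details taken from this thread.

* Sketch only: replay legacy records through a transaction via BDC.
REPORT zlegacy_bdc_load.

DATA: lv_file    TYPE string VALUE '/interfaces/in/legacy_load.txt',  " placeholder path
      lv_line    TYPE string,
      lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
      ls_bdcdata TYPE bdcdata.

START-OF-SELECTION.
  OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    MESSAGE 'File not found on the application server' TYPE 'E'.
  ENDIF.

  DO.
    READ DATASET lv_file INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.

    " Map each record onto the screens of the target transaction.
    CLEAR: lt_bdcdata, ls_bdcdata.
    ls_bdcdata-program  = 'SAPMF02K'.   " placeholder module pool
    ls_bdcdata-dynpro   = '0100'.       " placeholder screen number
    ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata.

    CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'RF02K-LIFNR'.    " placeholder screen field
    ls_bdcdata-fval = lv_line.
    APPEND ls_bdcdata TO lt_bdcdata.

    " Background mode, synchronous update.
    CALL TRANSACTION 'XK01' USING lt_bdcdata MODE 'N' UPDATE 'S'.
  ENDDO.

  CLOSE DATASET lv_file.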
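
For the custom-table route, the post above suggests a BAPI whose import parameters are the file name and file path on the SAP system. The following is only a sketch of what such an RFC-enabled wrapper could look like; the function name, parameters and Z-table are hypothetical.

FUNCTION z_load_legacy_file.
*"----------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(IV_FILEPATH) TYPE STRING
*"     VALUE(IV_FILENAME) TYPE STRING
*"  EXPORTING
*"     VALUE(EV_RECORDS)  TYPE I
*"     VALUE(EV_MESSAGE)  TYPE STRING
*"----------------------------------------------------------------------
  DATA: lv_fullpath TYPE string,
        lv_line     TYPE string,
        ls_row      TYPE zlegacy_stage.   " hypothetical custom table

  CONCATENATE iv_filepath iv_filename INTO lv_fullpath.

  OPEN DATASET lv_fullpath FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    ev_message = 'File could not be opened on the application server'.
    RETURN.
  ENDIF.

  DO.
    READ DATASET lv_fullpath INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    " ... parse lv_line into ls_row here ...
    INSERT zlegacy_stage FROM ls_row.
    ev_records = ev_records + 1.
  ENDDO.

  CLOSE DATASET lv_fullpath.
  COMMIT WORK.
ENDFUNCTION.

A DataStage job would then call this function, passing the path and name of the file it wrote earlier as the two import parameters.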

Posted: Mon Mar 03, 2008 4:34 pm
by ShaneMuir
You could also create a file with DataStage in the required format and load it using LSMW. This of course means not using the plug-ins.

Posted: Mon Mar 03, 2008 4:59 pm
by hondaccord94
If you are talking plug-ins, then DataStage provides you with an SAP IDoc Load stage plug-in. This is probably the right one for your requirement. We used it in one of our data migration projects. It worked fairly well, but it has quite a few limitations and not enough support.

Posted: Mon Mar 03, 2008 5:18 pm
by ray.wurlod
Let's try to get one thing clear from the outset - by "loading SAP data" do you mean loading data from SAP into DataStage, or loading data from DataStage into SAP?

Should have asked this question earlier.

There are plug-in stages (and other techniques) for each operation.

Loading SAP data

Posted: Tue Mar 04, 2008 9:32 pm
by mydsworld
By 'loading SAP data', I meant loading data from a legacy system into SAP using DataStage.

Re: Loading SAP data

Posted: Wed Mar 05, 2008 12:31 am
by Shadab_Farooque
Use the SAP Load Pack for loading data into SAP.

Posted: Tue Apr 08, 2008 3:31 am
by seemamone
I will post my query in a separate post.
Thanks.