DS Job Design
All,
I have a staging table from which data has to be loaded into 6 different tables, straight inserts only.
The six tables have a few attributes common among them.
I have 2 job designs in mind and I wanted your suggestions on the best strategy.
Strategy 1:
One DS job loading all the data into the six different tables, involving the staging table, a transformer, and the six tables.
Strategy 2:
Separate DS jobs, each selecting from the staging table only the fields specific to one target table and loading that table individually.
Thanks
Aditya.
Is it one staging table to 6 different tables, or 6 staging tables to 6 other tables?
If it's the first case, both options speak about populating the 6 target tables.
If I understand properly, though, the first approach involves loading the staging table as well. Please clarify.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
One job to select the data, do all the transformations, and then load to 6 flat files, identical to your target tables.
Then a job for each table.
Seven jobs all together. Keep it modular. It will help with restartability, and it also keeps the segments well defined.
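As a sketch of the 7-job layout above, the extract job runs first and the six load jobs run in parallel via the dsjob command line. The project and job names here are hypothetical, and each invocation is recorded in a small run log so a failed run can be restarted per table; the sketch only writes the plan when the dsjob CLI is not on PATH.

```shell
PROJECT=DW_PROJ              # hypothetical DataStage project name
rm -f runplan.log

run_job() {
  # Record each invocation so a failed run can be restarted per table.
  echo "dsjob -run -jobstatus $PROJECT $1" >> runplan.log
  if command -v dsjob >/dev/null 2>&1; then
    dsjob -run -jobstatus "$PROJECT" "$1"
  fi
}

run_job ExtractStaging       # job 1: staging table -> six flat files

# Jobs 2-7: one load job per target table, launched in parallel.
for t in Tbl1 Tbl2 Tbl3 Tbl4 Tbl5 Tbl6; do
  run_job "Load_$t" &
done
wait
```

Because the six load jobs are independent, a failure in one does not force a rerun of the extract or of the other five.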
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
If performance is an issue, stage into text files and use sqlldr to get the data from there into the Oracle staging tables, perhaps in an after-job subroutine running ExecSH. You'll be amazed how much faster this can go. Don't use the ORABULK stage, though; its performance is abysmal.
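For illustration, here is a minimal sketch of that sqlldr approach, as it might be invoked from an after-job ExecSH routine. The table name, data file, delimiter, and column list are assumptions, not from the original post; adjust them to your own layout.

```shell
# Hypothetical control file for one of the six targets; the table name,
# columns, and delimiter are illustrative assumptions.
cat > cust_dim.ctl <<'EOF'
LOAD DATA
INFILE 'cust_dim.dat'
APPEND
INTO TABLE cust_dim
FIELDS TERMINATED BY '|'
(cust_id, cust_name, load_date)
EOF

# Direct path load (direct=true) bypasses conventional SQL inserts and is
# usually much faster for straight inserts. Credentials come from the
# environment; the call is guarded so the sketch degrades gracefully when
# sqlldr is not on PATH.
if command -v sqlldr >/dev/null 2>&1; then
  sqlldr userid="$ORA_USER/$ORA_PASS" control=cust_dim.ctl \
         log=cust_dim.log direct=true
else
  echo "sqlldr not found; control file written to cust_dim.ctl"
fi
```

One control file per target table keeps each load independently restartable, which matches the modular 7-job design suggested earlier in the thread.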
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Is it a direct load from staging to all 6 tables, or are you doing any transformations in the middle? If there are no transformations in between, use 6 select reads and insert into the tables. Do it in parallel, so that performance might improve if the DB doesn't have a bottleneck. My suggestion is to write 6 different jobs and call them in parallel, so that you have restartability if the load fails for one table. Try to use ORABULK, but I'm not sure of its availability in version 4.
Six repeated reads on the single staging table is not a wise approach for restartability.
Read once and write into 6 flat files, then use SQL*Loader to load them as noted, or read and write in the same job, as in one of the options you mentioned.