Hi All,
I have two Oracle schemas and need to do ETL from one (a relational schema) to the other (a star schema). In the source database I have 3 base tables from which 16 views are derived, and I need to move the data from these 16 source objects into 19 target tables.
Do I need to design 19 jobs to move the data between source and target (one table at a time), or is there a better way to do this? The mapping is what is taking the most time in the ETL work.
Please help me with this issue.
Thanks & Regards,
K.S.Rajan.
Urgent....Please Help Me!!!
You need to import your table definitions for the source and destination tables into the DataStage repository using the Manager. This will speed up the process of creating the DataStage jobs. It is best to do one job per table, as that makes management and debugging easier. You mention 16 sources and 19 destinations; some of your jobs could have one input loading more than one destination.
The mapping should be fast; don't forget there is an automap button in the Transformer stage that will map columns with identical names.
Creating a job should take just a few minutes: you load the table definitions into the source and destination stages, auto-map the columns, and drag and drop the rest.
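The automap idea is simple to picture outside DataStage: pair up any columns whose names match exactly, and only the leftovers need hand mapping. A minimal sketch in Python, using hypothetical column lists (the names here are made up, not from your schemas):

```python
# Hypothetical source view and target table column lists.
source_cols = ["CUST_ID", "CUST_NAME", "REGION", "CREATED_DT"]
target_cols = ["CUST_ID", "CUST_NAME", "REGION", "LOAD_TS"]

# Auto-map: keep only columns present on both sides, in target order.
automap = {col: col for col in target_cols if col in set(source_cols)}

# Anything left over is what you map by hand (drag and drop in the Transformer).
unmapped = [col for col in target_cols if col not in automap]

print(automap)
print(unmapped)
```

With 16 sources and 19 targets, the fraction of identically named columns is what determines how much of the mapping work the automap button saves you.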
You then need a Sequence job that runs the 19 jobs using a combination of parallel and sequential streams; the number of parallel streams depends on your data dependencies and the number of processors on your server.
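The parallel/sequential combination above can be sketched as "waves" ordered by data dependency: independent loads (typically dimensions) run in parallel, and dependent loads (facts) wait for them. A rough Python sketch, assuming made-up job names and a placeholder in place of actually invoking DataStage jobs:

```python
from concurrent.futures import ThreadPoolExecutor

def run_job(name):
    # Placeholder: a real sequence would invoke the DataStage job here
    # (for example via the dsjob command-line tool).
    return f"{name}: OK"

# Waves ordered by data dependency: dimensions before the facts that reference them.
# Job names are hypothetical.
waves = [
    ["LoadDimCustomer", "LoadDimProduct", "LoadDimDate"],  # independent: run in parallel
    ["LoadFactSales"],                                     # depends on the wave above
]

results = []
for wave in waves:
    # Parallel streams within a wave; cap workers at your server's processor count.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results.extend(pool.map(run_job, wave))
print(results)
```

Each wave finishes completely before the next starts, which is exactly the guarantee a fact load needs from its dimension loads.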
You also need to consider whether you will bulk load the data into Oracle or use OCI inserts.
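The trade-off is between issuing one statement per row (as OCI inserts do) and handing the database a batch in one call (closer in spirit to a bulk load). A small illustrative sketch using sqlite3 purely as a stand-in for Oracle; the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL)")

rows = [(i, i * 1.5) for i in range(1000)]

# Row-by-row (analogous to single OCI inserts): one round trip per row.
for r in rows[:500]:
    conn.execute("INSERT INTO fact_sales VALUES (?, ?)", r)

# Batched (analogous in spirit to a bulk load): one call for many rows.
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows[500:])

conn.commit()
count = conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
print(count)
```

For large star-schema fact loads the batched/bulk path usually wins by a wide margin, at the cost of less flexibility in per-row error handling.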
Vincent McBurney
Data Integration Services
www.intramatix.com
devaraj4u, I strongly recommend you get some training from Ray. If you are not in Mumbai, get in your car and start driving! Your process of loading a star schema will benefit greatly from DataStage best practices, which Ray knows a thing or two about.
Vincent McBurney
Data Integration Services
www.intramatix.com