
Posted: Sun Mar 11, 2007 7:07 pm
by DSguru2B
....and hence my suggestion of reading it once, transforming it, and loading it into flat files with metadata identical to their respective tables. One hit on the table, six flat files for six tables, and it's even great for restartability.
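
For illustration only, here is a minimal Python sketch of the read-once, fan-out pattern being described. The table names, column lists, and the pass-through "transform" are hypothetical stand-ins, and sqlite3 stands in for the real source driver; in DataStage this would simply be one extract job with six output links writing flat files that match the target tables' metadata.

Code: Select all
# Illustrative sketch only: read the source table once and fan the rows out
# to six flat files whose layouts match the six target tables.
# Table and column names are hypothetical; sqlite3 stands in for the real
# source database driver.
import csv
import sqlite3

# one flat file per target table, columns matching that table's metadata
TARGETS = {
    "target_table_1": ["col_a", "col_b", "col_c"],
    "target_table_2": ["col_a", "col_d"],
    # ... the remaining four target tables would be listed the same way
}

conn = sqlite3.connect("source.db")            # hypothetical source database
cur = conn.cursor()
cur.execute("SELECT * FROM source_table")      # the single hit on the table

# open one writer per target flat file
files = {t: open(f"{t}.dat", "w", newline="") for t in TARGETS}
writers = {t: csv.writer(files[t], delimiter="|") for t in TARGETS}

for row in cur:
    for table, cols in TARGETS.items():
        # placeholder transform: project the source row onto the target layout
        writers[table].writerow(row[:len(cols)])

for f in files.values():
    f.close()
conn.close()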

Posted: Sun Mar 11, 2007 7:12 pm
by kumar_s
DSguru2B wrote:....and hence my suggestion of reading it once, transforming it, and loading it into flat files with metadata identical to their respective tables. One hit on the table, six flat files for six tables, and it's even great for restartability.
Absolutely agreed. :)

Posted: Sun Mar 11, 2007 7:14 pm
by oacvb
You might be right, but we need to think about the restart option as well. In a real scenario, sometimes all tables except one might have gone through fine. When you re-execute, it might be difficult to clean up the old records, and some manual process may be required. To avoid that, he can go ahead with 6 jobs. The single-job approach is easy to develop but hard to maintain. We always need to think about all the options while developing.

Posted: Sun Mar 11, 2007 7:19 pm
by kumar_s
oacvb wrote:You might be right, but we need to think about the restart option as well. In a real scenario, sometimes all tables except one might have gone through fine. When you re-execute, it might be difficult to clean up the old records, and some manual process may be required. To avoid that, he can go ahead with 6 jobs. The single-job approach is easy to develop but hard to maintain. We always need to think about all the options while developing.
Not sure if you were able to comprehend what's been said. One job extracts the data from the table and writes it into 6 different flat files (a single read).
Six other jobs each read one flat file and load it using SQL*Loader (hence restartability is achieved as well).
As noted, writing into a flat file is comparatively efficient, and loading from a flat file with SQL*Loader is also comparatively efficient.
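
As a rough sketch of the second half of that design (not the poster's actual jobs), the six loads could be driven as below. The table list, control files, and userid string are assumptions; each sqlldr run is independent, so on a restart only the failed table needs to be reloaded.

Code: Select all
# Illustrative sketch only: one independent SQL*Loader run per flat file, so
# a single failed table can be reloaded on restart without touching the rest.
# Table names, control files, and the userid string are hypothetical.
import subprocess
import sys

TABLES = ["target_table_1", "target_table_2", "target_table_3",
          "target_table_4", "target_table_5", "target_table_6"]

failed = []
for table in TABLES:
    # each table has its own control file describing its flat-file layout;
    # using TRUNCATE in the control file keeps a re-run idempotent
    result = subprocess.run(
        ["sqlldr", "userid=scott/tiger",
         f"control={table}.ctl", f"data={table}.dat", f"log={table}.log"],
    )
    if result.returncode != 0:
        failed.append(table)
        print(f"Load of {table} failed; see {table}.log", file=sys.stderr)

# on restart, only the tables recorded in 'failed' need to be loaded again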

Posted: Sun Mar 11, 2007 7:29 pm
by oacvb
DSGuru's solution is really a good one. I started writing my previous message as a reply to kumar_s's post of Sun Mar 11, 2007 7:05 pm:

kumar_s wrote:Even if the database is very finely tuned and has high bandwidth, it's not worth hitting the database again and again for the same data. IMHO.

But unfortunately it missed the sequence.