How to handle many records in datasets?
Posted: Thu Jan 03, 2008 2:15 pm
I am extracting data from two views based on key column values.
The thing is, both views have more than 100 million records. I am joining the views on some key columns, which produces several million records.
The source is an Oracle table (about 40,000 records), and I am pulling records from the views based on the key column.
1) I can't use a sparse lookup, because only part of the column is used as the key (lookup key length is 21, source key length is 10, so I need to take a substring in the lookup).
2) If I use a normal lookup, reading the data from the view itself takes too long (approximately 30 hours).
3) Finally, I am writing the records into a dataset. After some time the job aborts with the error message "Not enough space
Error finalizing / saving table /tmp/dynLUT1183942da1c5888".
4) If I add a condition in the lookup such as ROWNUM < 1000000, it works.
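To clarify point 1, the substring match I need is the equivalent of pushing the join down into Oracle along these lines (table and column names here are made up for illustration; they are not my real object names):

```sql
-- Hypothetical names: SRC_TABLE holds the ~40,000 source rows,
-- BIG_VIEW stands for the joined 100M+ row views. The view key is
-- 21 characters, but only its first 10 characters match the source key.
SELECT s.*,
       v.some_col
FROM   src_table s
JOIN   big_view  v
  ON   SUBSTR(v.lookup_key, 1, 10) = s.src_key;
```

This is why a sparse lookup is not an option for me: the stage would need the whole key column to match, but only a substring of it is the real key.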
I am planning to write the records into an Oracle temporary table, then read them from that table for further use instead of reading from the dataset. Is that a good move?
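If it helps, the staging approach I have in mind is roughly this (again with hypothetical names; a global temporary table with ON COMMIT PRESERVE ROWS so the rows stay for the whole session and are cleaned up automatically):

```sql
-- Hypothetical sketch: land the joined view output once in a global
-- temporary table, then have the job read from it instead of the
-- dataset that is running out of space.
CREATE GLOBAL TEMPORARY TABLE tmp_lookup_stage
ON COMMIT PRESERVE ROWS
AS
SELECT SUBSTR(v.lookup_key, 1, 10) AS join_key,
       v.some_col
FROM   big_view v;
```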
What is your suggestion on this issue?
Note :-
Source :- Oracle table (40,000 records or more, approximately)
Lookup :- Join of two Oracle views on a key (both views have more than 100 million records)
Lookup key :- Need to take a substring in the lookup