How to handle huge historical record loads in Azure DataLake
Posted: Sat Jan 11, 2020 7:11 am
Hi All,
We have a scenario where we need to load 500 million historical records from SQL Server into Azure Data Lake, followed by roughly 200k daily incremental records, capturing the changed records and loading them into the data lake as well.
The catch is that the source tables have no timestamp columns. We are looking for your thoughts and assistance, particularly w.r.t. the DataStage jobs.
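For context, one approach we have seen suggested for change capture without timestamp columns is row hashing: hash the non-key columns of each row, keep the hashes from the previous load, and treat any key whose hash differs (or is new) as changed. A minimal sketch below, with placeholder table/column data (none of these names come from our actual source):

```python
import hashlib

# Hypothetical snapshots keyed by primary key; values are the non-key columns.
previous = {
    1: ("alice", "NY"),
    2: ("bob", "LA"),
}
current = {
    1: ("alice", "NY"),
    2: ("bob", "SF"),      # modified row
    3: ("carol", "TX"),    # new row
}

def row_hash(values):
    # Concatenate column values with a delimiter and hash them; a DataStage
    # equivalent would be a Checksum stage over the non-key columns.
    return hashlib.sha256("|".join(map(str, values)).encode()).hexdigest()

prev_hashes = {k: row_hash(v) for k, v in previous.items()}

# Keys that are new or whose non-key columns changed since the last load.
changed = sorted(k for k, v in current.items()
                 if prev_hashes.get(k) != row_hash(v))
print(changed)  # -> [2, 3]
```

The cost is that every incremental run still has to read and hash the full source table, so this only helps with what gets written downstream, not with the extract volume.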
Thanks & Regards,
S.R