
Design Advice - Large number of servers

Posted: Mon Mar 28, 2011 2:54 am
by myukassign
I have a requirement to connect to 2000 servers and extract data from each. The point is, every server hosts the same type of database, and the metadata is also the same.

I am just wondering whether creating 2000 separate DSNs to connect to all the servers and extract the data is a usual thing in the ETL space. The data I need to pull from each server is less than 0.5 million rows.

Please advise: is ETL a good fit here? If yes, is there any method to dynamically create DSNs, or is there a better approach people follow in such cases?

Many thanks in advance.
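(For anyone searching later: one common way to avoid registering 2000 static DSNs is to build DSN-less ODBC connection strings at runtime from a server list. A minimal sketch in Python; the server names, database name, and credentials below are hypothetical placeholders, not values from this thread:)

```python
# Build DSN-less ODBC connection strings from a server list,
# so no per-server DSN needs to be registered on the ETL host.
def build_conn_string(server, database="branch_db", user="etl_user", password="secret"):
    # DRIVER/SERVER/DATABASE/UID/PWD is the standard ODBC keyword form
    # accepted by SQL Server ODBC drivers.
    return (
        "DRIVER={SQL Server};"
        f"SERVER={server};"
        f"DATABASE={database};"
        f"UID={user};"
        f"PWD={password};"
    )

# Illustrative list; in practice this would be read from a table or file
# listing all 2000 servers.
servers = ["branch001.example.com", "branch002.example.com"]
conn_strings = [build_conn_string(s) for s in servers]
```

Each string can then be handed to an ODBC connect call (e.g. `pyodbc.connect(conn_string)`) inside the extraction loop, so only the server name varies per iteration.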

Posted: Mon Mar 28, 2011 3:09 am
by blewip
It does seem an odd requirement.

You need to connect to 2000 database servers?

Why are you using ODBC? What database is it, MySQL?

Are these remote sites, each branch has a database?

Could each branch not FTP an update file instead?

Perhaps a bit more info regarding the scenario?

Posted: Mon Mar 28, 2011 4:56 am
by myukassign
What option do I have other than ODBC if my database is SQL Server? I am using 7.5.x.

Yes, it's remote, and each server has a database. The database name is the same and the tables are also the same on each server; they are just physically separated. Each physical location maintains a local server. The task is to integrate the whole.

Sorry, there is no option of FTP to get updates; that's out of scope for me.

Posted: Mon Mar 28, 2011 6:48 am
by chulett
There's no magic here: you'll need however many DSNs it takes to get this done. After that, it seems like a multi-instance job would be the way to go. Out of curiosity, is this a one-time task or something you'll need to repeat on a regular basis?
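(To make the multi-instance idea concrete: the usual pattern is to split the server list into N slices and run one job instance per slice, with the server name passed in as a job parameter. A rough illustrative sketch; the instance count and server names are made up:)

```python
# Partition a server list into n roughly equal slices,
# one slice per parallel job instance.
def partition(servers, n_instances):
    # Round-robin assignment: instance k gets servers k, k+n, k+2n, ...
    return [servers[i::n_instances] for i in range(n_instances)]

servers = [f"server{i:04d}" for i in range(2000)]  # illustrative names
slices = partition(servers, 10)  # e.g. 10 parallel instances
# Instance k then loops over slices[k], connecting to one server at a time.
```

With 2000 servers and 10 instances, each instance handles 200 servers; tune the instance count against what the network and the ETL host can sustain.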