Hi all,
I have an unusual requirement: connect to more than 300 SQL Server databases and run a simple query on each one.
I have created a job that can connect to all of them. All the DSN/connection entries were added to the .odbc.ini file, and I parameterise the connection name when running the job. The user ID and password are the same for every database.
The issue is keeping the .odbc.ini file up to date: given the nature of the requirement, we may need to add or remove database connections weekly.
Is there any way to connect to SQL Server without making an entry in the .odbc.ini file, something like building the DSN at runtime?
I am assuming there might be some system functions that can achieve this.
Any thoughts?
Thanks
RPatra
ODBC.ini file Alternatives
I had a similar, though not identical, issue; maybe it will help.
I had to grab data from various FoxPro files over ODBC. This client was unusual: instead of storing all of a facility's data in one file, they made a separate file for each department.
So the number of ODBC locations could grow rapidly, because each department gets a new folder every year and departments come and go. What I did was mount the disk on my (Linux) server and create a "staging folder". In my DataStage job I execute a command to look up how many facilities there are today, then in a sub-loop I look up the department folders for that facility and loop through that list. In the sub-loop I create a Linux symbolic link from each department location into my staging folder. My ODBC configuration points at the staging folder, so when the DataStage job loads the data, it follows the symbolic link back to the actual data.
And that is how I got around having to declare numerous ODBC configurations: created a staging location, and used symbolic links back to the real data files.
Their file structure looked like this:
Code:
/<Facility>
//<Dept>
///<Year>
///<Year>
//<Dept>
///<Year>
///<Year>
//<Dept>
///<Year>
///<Year>
//<Dept>
///<Year>
///<Year>
/Building 1
//Sales
///2010
///2011
///2012
//HR
///2010
///2011
///2012
//Admin
///2010
///2011
///2012
/Building 2
//Sales
///2010
///2011
///2012
//HR
///2010
///2011
///2012
//Admin
///2010
///2011
///2012
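The looping approach above can be sketched in shell. This is a minimal demo, not the actual DataStage command: it builds a throwaway directory tree in place of the real mounted disk, and all facility/department names and paths are made up:

```shell
#!/bin/sh
set -e
# Stand-ins for the real mounted FoxPro disk and the staging folder.
DATA_ROOT=$(mktemp -d)
STAGE=$(mktemp -d)

# Fake a couple of facility/department folders for the demo.
mkdir -p "$DATA_ROOT/Building1/Sales" "$DATA_ROOT/Building1/HR" \
         "$DATA_ROOT/Building2/Sales"

# Outer loop: one directory per facility; inner loop: its departments.
for facility in "$DATA_ROOT"/*/; do
    fac=$(basename "$facility")
    for dept in "$facility"*/; do
        d=$(basename "$dept")
        # Link each department folder into the one staging folder the
        # ODBC configuration points at, under a unique name.
        ln -sfn "$dept" "$STAGE/${fac}_${d}"
    done
done

ls "$STAGE"   # Building1_HR, Building1_Sales, Building2_Sales
```

The single ODBC entry only ever sees the staging folder; adding a facility or department means adding a link, not an ODBC configuration.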
Here is how we fixed this:
1. Create a shell script that generates a file containing the ODBC entries. We parameterised the database names and server names to the script.
2. In the parallel job that connects to the database, add the existing environment variable that holds the default odbc.ini path.
3. Change the value of that path to your custom location holding the custom ODBC entries.
That's all. Whenever the job executes, it refers to this custom odbc.ini location and connects only to the required databases.
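The steps above might be sketched like this in shell. The DSN names, server names, and driver path are made-up placeholders; `ODBCINI` is the standard environment variable that ODBC driver managers read to locate the ini file:

```shell
#!/bin/sh
set -e
# Step 1: generate a custom odbc.ini from a parameterised list of
# "DSN server database" triples (sample values made up for the demo).
DB_LIST=$(mktemp)
cat > "$DB_LIST" <<'EOF'
SalesDSN sqlhost1 Sales
HRDSN sqlhost2 HR
EOF

CUSTOM_INI=$(mktemp)
{
    printf '[ODBC Data Sources]\n'
    while read -r dsn server db; do
        printf '%s=SQL Server\n' "$dsn"
    done < "$DB_LIST"

    while read -r dsn server db; do
        printf '\n[%s]\n' "$dsn"
        printf 'Driver=/opt/odbc/lib/sqlserver.so\n'   # assumed driver path
        printf 'Address=%s,1433\n' "$server"
        printf 'Database=%s\n' "$db"
    done < "$DB_LIST"
} > "$CUSTOM_INI"

# Steps 2-3: point the job at the custom file instead of the default.
export ODBCINI="$CUSTOM_INI"
```

Regenerating the file weekly from the current database list replaces hand-editing .odbc.ini entirely.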
Thanks
RP