
Iterating a single server job over multiple schemes

Posted: Wed Sep 19, 2007 5:00 am
by vpauls
Hi.

We are trying to make a server job iterate over several schemas. Logging in to a specific schema is done by username and password alone. We have currently come up with the following methods.

1. One sequence where the job is called once per schema with hardcoded usernames/passwords. Works, but is not very good when you want to add, remove or change a schema.

2. Concatenate the usernames/passwords into a comma-separated list, use it in a StartLoop, then use loop.counter$[a,b] to get the username/password substrings. Works, but is easy to get wrong, and the passwords must either be clear text or the whole comma-separated list treated as encrypted.

3. Storing username/passwords in a table and using a transformer to call the job with the appropriate parameters. Works but somewhat hides what is actually going on. Somehow feels like a hack.
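Not DataStage code, but the string handling behind method 2 can be sketched in Python (the names and the sample credentials here are purely illustrative, not actual job parameters):

```python
def split_credentials(cred_list):
    """Split a "user|pwd,user|pwd,..." list into (user, password) pairs."""
    return [tuple(entry.split("|", 1)) for entry in cred_list.split(",")]

# Each pair would become one StartLoop iteration's job parameters.
for user, password in split_credentials("alice|secret1,bob|secret2"):
    print(user, password)
```

The fragility mentioned above lives in the delimiters: a stray comma or pipe in a password silently shifts every following field.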

We have tried having two loops, one with the usernames and one with the passwords, but cannot find a way to make them both advance once each time the job is run.

Anyone have any other ideas? We do not really like any of methods 1-3.

Would greatly appreciate any help :)

Posted: Wed Sep 19, 2007 4:34 pm
by ArndW
vpauls,

I'm not sure I understand what you mean by "schema" in this context. Does this mean the job itself is on different servers with different userid/passwords? Or that the job runs on one machine, but you login to different database schemas? Or something entirely different?

Posted: Wed Sep 19, 2007 4:35 pm
by ray.wurlod
Can you create the loop in a job sequence, passing schema name, user ID, password and any other required information to the job on each iteration?
The lists of parameters (connection strings, effectively) might be read from a file into user variables from which the StartLoop activity can set up the loop conditions.
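As a rough illustration of that approach (the file name and one-connection-per-line format are assumptions, not from the post), the file-reading half of the sequence amounts to:

```python
def read_connections(path):
    """Parse a parameter file: one "schema,user,password" line per connection."""
    with open(path) as f:
        return [tuple(line.strip().split(",")) for line in f if line.strip()]

# A StartLoop in the sequence would then take one tuple per iteration, e.g.:
# for schema, user, pwd in read_connections("connections.txt"):
#     run_job(schema, user, pwd)   # run_job is hypothetical
```

Adding or removing a schema then means editing one line of a file rather than touching the sequence itself.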

Posted: Fri Sep 21, 2007 1:45 am
by vpauls
ArndW wrote:vpauls,

I'm not sure I understand what you mean by "schema" in this context. Does this mean the job itself is on different servers with different userid/passwords? Or that the job runs on one machine, but you login to different database schemas? Or something entirely different?
Sorry if I was a bit unclear. I meant database schemas on the same server.

I am trying to loop a job with paramA1/paramB1, then paramA2/paramB2, and so on. I have several solutions; the problem is that none of them are very easy to modify. And we try to use routines as little as possible.

Posted: Fri Sep 21, 2007 4:11 am
by ameyvaidya
I would recommend a combination of your options 2, 3 and an additional Job.

Store the user names and passwords in a table.

Use the following sequence design:

Code: Select all

Job1---> UserVariables1---> StartLoop---> UserVariables2---> Your JobHere---> EndLoop---> Exec command(rm sequential file)

Job1:

Code: Select all

Table---> Transformer---> Aggregator---> Sequential file
The Transformer will use a stage variable to build up the following string for every row of input:
Sv1 = Sv1 : in.User : "|" : in.PWD : ","

Output one column with derivation Sv1[1, Len(Sv1) - 1] to drop the trailing comma.

The aggregator will write only the last row to file.

The sequential file should be written without Quotes or delimiters.

You will need to write a routine that reads this sequential file and returns the one line written. Call it in UserVariables1.

Use this for the StartLoop/EndLoop iterator.

In UserVariables2, define 2 variables for user/pwd. Use the Field function to separate them.

Use these 2 in your job.

Finally, delete the sequential file created in Job1.
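The two halves of this design (Job1 folding all rows into one delimited line, then the loop splitting it back apart) can be sketched in Python for clarity; the names are illustrative, and `field` is only a rough stand-in for the BASIC Field() function:

```python
def build_line(rows):
    """Job1's Transformer/Aggregator step: fold all rows into one delimited line."""
    return ",".join(user + "|" + pwd for user, pwd in rows)

def field(s, delim, n):
    """Rough Python equivalent of the BASIC Field() function (1-based)."""
    return s.split(delim)[n - 1]

line = build_line([("alice", "pw1"), ("bob", "pw2")])   # -> "alice|pw1,bob|pw2"
for entry in line.split(","):                           # StartLoop over the line
    user = field(entry, "|", 1)                         # UserVariables2
    pwd = field(entry, "|", 2)
```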


Hope this helps.

Posted: Mon Sep 24, 2007 1:02 am
by vpauls
Thanks for the input; it is an interesting solution that would work. But combining two solutions I do not really want to use, plus routines, kind of sums up what I do not want to do.

The real problem is not doing it, but making it as easy to understand as possible while still keeping it quick to change. I will probably use a "superuser" that has full grants on all schemas, deriving the schema names from a single list.

Posted: Thu Sep 27, 2007 4:32 pm
by jdmiceli
We do very similar processing of our databases here at work. I have 18 companies whose databases share the same structure. All of my jobs are parameterized as much as possible, and the appropriate values are passed in when the jobs are called. They are all multi-instance, and this seems to work pretty well in the limited testing I've done so far.

We will be rolling this out to production after we do a few months of parallel testing with the existing system.

Bestest,