Project Level Environment Variables
Moderators: chulett, rschirm, roy
Hi,
One question: I defined some user-defined variables through DataStage Administrator, and I am using those variables in DataStage jobs. My question is how to define dynamic variables in Administrator, so that whenever I change a variable's value in Administrator, the change takes effect in the DataStage jobs.
Ram
Re: Project Level Environment Variables
trammohan wrote: whenever I change the variable value in Administrator, that should take effect in DataStage jobs
And it will, provided you are running a recent enough version. I think it was in 7.0.1 that environment variables started working the way you would expect. Before that, you needed to recompile the job before it would recognize that you had changed the value. This is documented in the Readme.
Not sure if you can use them in PX jobs, and not sure what you are asking about with 'dynamic variables'. From what I recall from the documentation, you can set their value to $ENV and they will pick up the current value of the environment variable. Is that what you meant by dynamic?
-craig
"You can never have too many knives" -- Logan Nine Fingers
Add an environment variable to your job parameters. Set it to $ENV to dynamically retrieve the environment variable's value for that login, or to $PROJDEF to retrieve the value defined for that project in the User Defined environment variable section of project details in the DataStage Administrator.
If you maintain multiple environments on the one machine such as dev, test and UAT then you may want to use project defaults which makes $PROJDEF the more reliable value.
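To make the distinction concrete, here is a minimal Python sketch of how a job parameter default of $ENV differs from $PROJDEF. The `PROJECT_DEFAULTS` dictionary and the variable names are hypothetical stand-ins for the values defined under User Defined in DataStage Administrator; this is an illustration of the resolution behavior, not DataStage's actual implementation.

```python
import os

# Hypothetical stand-in for the project-level values defined in
# DataStage Administrator (what $PROJDEF would resolve to).
PROJECT_DEFAULTS = {"DB_SERVER": "proj_db01", "DB_USER": "etl_user"}

def resolve_parameter(name: str, setting: str) -> str:
    """Mimic how a job parameter default of $ENV or $PROJDEF resolves.

    $ENV     -> take the current value from the login environment
    $PROJDEF -> take the value defined at the project level
    anything else is used literally.
    """
    if setting == "$ENV":
        return os.environ.get(name, "")
    if setting == "$PROJDEF":
        return PROJECT_DEFAULTS.get(name, "")
    return setting

# This login's environment has a different value than the project default.
os.environ["DB_SERVER"] = "dev_db02"
print(resolve_parameter("DB_SERVER", "$ENV"))      # value from this login's environment
print(resolve_parameter("DB_SERVER", "$PROJDEF"))  # value from the project defaults
```

With dev, test and UAT projects on one machine, $PROJDEF wins because each project carries its own defaults, whereas $ENV depends on whichever login environment happens to run the job.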
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
Possible bug with Encrypted Variables.
Has anyone seen an issue with Encrypted variables used for passwords not resolving at job run time? For some reason the variables are not resolving. If I type the values in manually at run time the job runs fine, but if I rely on the variable to pick up the value it fails with a username/password error.
I've double- and triple-checked the encrypted values for the passwords by typing them in again, but it still doesn't work. If I change the parameter type from Encrypted to String it works fine, but then any user can look in the log and see the passwords in plain text.
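As a general pattern (not DataStage-specific), one way to mitigate the plain-text fallback is to redact the secret before anything reaches the log. The `redact` helper and the example command string below are hypothetical, just to show the idea:

```python
import logging

def redact(text: str, secret: str) -> str:
    """Replace every occurrence of the secret with a fixed mask."""
    return text.replace(secret, "********") if secret else text

# Hypothetical example: a connect string that would otherwise expose
# the password to anyone reading the job log.
password = "s3cret!"
command = f"sqlplus etl_user/{password}@PRODDB @load.sql"

logging.basicConfig(level=logging.INFO, format="%(message)s")
logging.info(redact(command, password))  # logs: sqlplus etl_user/********@PRODDB @load.sql
```

This only helps where you control what gets written; it does not change what DataStage itself records when a String parameter's value is logged.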
Really? I'll have to double-check as I don't recall seeing that. It would be nice, as I too have seen this problem. You aren't doing anything wrong, it's just a plain old B-U-G that's been around for several versions now. No clue what triggers it, but when it happens it can be extremely frustrating to deal with.
-craig
"You can never have too many knives" -- Logan Nine Fingers
How about encrypted environment variables?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
andru wrote: Just FYI, I'm working on version 7.1r1. Encrypted parameters work fine for me.
In general, they do work just fine. It's just that, once in a great while, it will break for a particular job. When it does, nothing short of changing the parameter type to String seems to help. It's... very odd.
-craig
"You can never have too many knives" -- Logan Nine Fingers
I'm also running 7.1r1, so it looks like the issue is still lingering. My main concern is our developers getting the passwords out of the logs; if I can't rely on encrypted variables I'll have to do something else. Logically I shouldn't be too concerned, since the variable would still let a developer create a job that destroys my database when they run it, they just wouldn't be able to read the password of the ID that did it. haha. I'll work something out.
Sorry to resurrect an old thread but this is applicable
We are using 7.1 and had already been using environment variables set to $PROJDEF for things like server name, username, password, etc. We changed all of the password variables from String to Encrypted, reloaded the parameters in every job, and did a multi-job compile afterward.
In our DEV, TST, and QA projects we had one or two minor hiccups, easily dealt with. When we did the same process in Production, every map that actually moved rows was aborting, while the sequencers were completing successfully. We could run an individual job by itself, and even sequencers up to a certain point would run jobs successfully, but the highest-level sequencers caused all jobs run through them to abort with an error about failing to connect to either the source or the target DB.
The only way we were able to make our regularly scheduled process work was to remove all of the parameters from each stage in the sequencers, since the individual jobs appeared to be handling them correctly.
Anybody know why this happened, or have recommendations for changing how we do things to avoid this kind of issue?
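The symptom described above is consistent with parameter precedence: a value passed down from a sequencer stage overrides the job's own default, so a bad or unresolved value at the sequencer level can break jobs that run fine standalone. A hypothetical sketch of that precedence, under the assumption that this is what was happening:

```python
from typing import Optional

def effective_parameter(job_default: str, sequencer_value: Optional[str]) -> str:
    """Return the value a job actually runs with: a value passed down
    from the calling sequencer overrides the job's own default."""
    return sequencer_value if sequencer_value is not None else job_default

# Run standalone: the job falls back to its own (resolved) default.
print(effective_parameter("resolved_password", None))        # resolved_password
# Run from a sequencer passing an unresolved token: the job gets the
# broken value and fails to connect.
print(effective_parameter("resolved_password", "$PROJDEF"))  # $PROJDEF
```

That would explain why removing the parameters from the sequencer stages fixed it: the individual jobs then fell back to their own defaults, which resolved correctly.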