Environment Variables Usage

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

s_avneet
Participant
Posts: 22
Joined: Wed Aug 31, 2016 8:28 am

Environment Variables Usage

Post by s_avneet »

Hi All,

I have a quick question about the usage of environment variables and Parameters.

I will be building around 35 jobs, all performing the same function of moving files from one location to another with minimal transformation. The directories will be common across all the jobs (say inbound, outbound, error, archive, etc.) and only the file names will differ. Would it be a good approach to use a single set of environment variables common across all the jobs? Can I call a common set of environment variables from all the job sequences?

Here is my design:

Job sequencer -> Px Jobs
Avneet
tradersjoe57
Premium Member
Posts: 13
Joined: Mon Oct 24, 2016 7:03 am

Post by tradersjoe57 »

Define the environment variables in the Administrator client, add them to a parameter set, and then use that same parameter set across all the jobs. That will make it much easier than defining the variables in each job individually.
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

TradersJoe57 is right, a parameter set is the way to go. Make sure to set the environment variables in the parameter set to $PROJDEF so they pick up the default values you configured in the Administrator client.
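
For illustration only (the set and parameter names here are made up), the set might look something like this, with jobs referencing the values via the usual #set.parameter# syntax - note that environment variables keep their $ prefix inside the set:

Code:

Parameter set: psCommonDirs          <- hypothetical name
  $INBOUND_DIR    Default value: $PROJDEF
  $OUTBOUND_DIR   Default value: $PROJDEF
  $ERROR_DIR      Default value: $PROJDEF
  $ARCHIVE_DIR    Default value: $PROJDEF

Referenced in a stage property, e.g. a Sequential File path:
  #psCommonDirs.$INBOUND_DIR##FileName#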
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
s_avneet
Participant
Posts: 22
Joined: Wed Aug 31, 2016 8:28 am

Post by s_avneet »

OK, so as I understand it: I create the environment variables in the Administrator client, then create a parameter set containing those variables, with each value set to $PROJDEF.

One last question: from a performance perspective, will this ever be a problem? I would be sharing the same set across many jobs running at the same time.
Avneet
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

Nope - it performs extremely well. I've had thousands of jobs running that were referencing several shared parameter sets with dozens of environment variables.

Be sure to have your parameter set structure completely figured out before you start using it in jobs. If you have to add or delete an environment variable later, you'll need to recompile every job that references the parameter set.
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
s_avneet
Participant
Posts: 22
Joined: Wed Aug 31, 2016 8:28 am

Post by s_avneet »

Thanks a lot, I will go ahead with this approach.

Another point I need to ask:

All the incoming files will arrive in the Inbound folder, and I will move them to the Inprocess folder. I plan to use an Execute Command activity for this. Can I use a single script for all the job sequences, or do I need a separate script per job?

The script's functionality will be the same in every case, since it just moves a file from one location to another; the file name will be parameterised from the job ($1).

Sample Script:

Code:

#!/bin/bash
# Move the newest file matching the given pattern from Inbound to Inprocess.
var_src_dir="/dev/data/Inbound/"
var_tgt_dir="/dev/data/Inprocess/"
var_file_pattern="$1"

# Newest matching file ($var_file_pattern left unquoted so the shell expands it).
var_file_name=$(ls -t "$var_src_dir"$var_file_pattern | head -n 1 | xargs -n1 basename)

mv "$var_src_dir$var_file_name" "$var_tgt_dir$var_file_name"

# Echo the file name so the calling sequence can pick it up.
echo "$var_file_name"
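
In the sequence I would then call it from the Execute Command activity roughly like this (the script path and pattern are just examples), and I assume the echoed name comes back in the activity's $CommandOutput:

Code:

# Execute Command activity (illustrative settings):
#   Command:    /dev/scripts/move_to_inprocess.sh
#   Parameters: 'CUSTOMER_*.csv'
#
# Downstream activities can then read the echoed file name via
# Execute_Command_0.$CommandOutput (it carries a trailing newline).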
Avneet
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You can certainly use a single script if it is parameterized appropriately.
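
For instance - just a sketch, with the argument order being my assumption - pass the directories in as well and the same script can serve every sequence:

Code:

#!/bin/bash
# Usage: move_newest.sh <source_dir> <target_dir> <file_pattern>
# Moves the newest file matching <file_pattern> from <source_dir> to
# <target_dir> and echoes its name back to the calling sequence.

src_dir="$1"
tgt_dir="$2"
pattern="$3"

# Newest matching file; $pattern stays unquoted so the shell expands it.
newest=$(ls -t "$src_dir"/$pattern 2>/dev/null | head -n 1)
file_name="${newest##*/}"    # strip the directory portion (like basename)

# Fail loudly if nothing matched so the sequence can trap the error.
if [ -z "$file_name" ]; then
    echo "No file matching $pattern in $src_dir" >&2
    exit 1
fi

mv "$src_dir/$file_name" "$tgt_dir/$file_name"
echo "$file_name"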
-craig

"You can never have too many knives" -- Logan Nine Fingers
s_avneet
Participant
Posts: 22
Joined: Wed Aug 31, 2016 8:28 am

Post by s_avneet »

Thanks a lot :)

Marked as resolved
Avneet