
Changing case of parameters on the fly

Posted: Mon Mar 02, 2009 9:44 am
by jdmiceli
Hi all!

I have done some searching and haven't found anything like this question yet. If it is out there and I missed it, I offer my apologies in advance.

I have a large project that runs common code for 19 different companies at the moment (and growing). Everything that I can parameterize to make things dynamic, to prevent or minimize the need for company-specific code, has been done, including the naming of files created during processing. This is where my problem lies.

I have a simple job that pulls data from a SQL Server 2000 table and writes it to a sequential file, and one of the parameters is called 'CompanyName'. It is passed in from the shell script that runs job control. It is capitalized when passed in and is used in that format most of the time. The one place that it isn't is in a file name that is defined as:

Code:

       '#DirScripts#/#CompanyName#_#Environment#.csv'  
The rest of the code is expecting the filename to look like this:

Code:

      '/datastage/int/bts/ldr/scripts/lmc_int.csv'
but the file name ends up like this:

Code:

      '/datastage/int/bts/ldr/scripts/LMC_int.csv'
On a case-sensitive system, this is a problem. Is there a way to apply the Downcase() function to the CompanyName parameter only where it needs to be lower case? Most of the time it needs to be upper case, as it is passed in by job control, but for the filename it needs to be lower case.

Trying this:

Code:

         '#DirScripts#/downcase(#CompanyName#)_#Environment#.csv'
gives me this:

Code:

        'downcase(LMC)_int.csv'.
I am probably just having a mental speedbump with this, but how do I get that stupid thing lower case on the fly, as I need it to be, without affecting its overall state within the job?

Thanks in advance!

Posted: Mon Mar 02, 2009 9:54 am
by Sainath.Srinivasan

Code:

'#DirScripts#/' : downcase(#CompanyName#) : '_#Environment#.csv' 

Posted: Mon Mar 02, 2009 11:18 am
by jdmiceli
Thanks Sainath for the suggestion, but setting it up that way got me a file called:

Code:

      : downcase(LMC) : _int.csv
Is there something I have to do to get the interpreter to actually process the function in this location, since it isn't a normal Derivation spot? Maybe enclose it in {} or something? I'll try that, but :?

Any other ideas to try?

Posted: Mon Mar 02, 2009 11:40 am
by chulett
Things like the concatenation operator or functions are not supported in the Filename property of the Sequential File stage, so you can't concatenate or downcase there. So, you have a couple of options as I see it...

First, is this parameter in this job used in multiple places or just in the filename? If the latter, you'll need to downcase it before you pass it to the job parameter. If you need to use both flavors in the same job and the other areas (like derivations) support functions, then downcase the passed in value and then upcase it in your derivations.
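One sketch of that first option, done in the calling shell script rather than in the job itself. The `dsjob` invocation is only printed here, and the project and job names (`MyProject`, `MyExtractJob`) are made up for illustration; substitute your own:

```shell
#!/bin/sh
# Company name as supplied by job control, upper case
COMPANY_NAME="LMC"
ENVIRONMENT="int"

# Downcase it before it ever reaches the job parameter
company_lower=$(printf '%s' "$COMPANY_NAME" | tr '[:upper:]' '[:lower:]')

# Illustrative dsjob run -- project and job names are hypothetical
echo dsjob -run \
    -param CompanyName="$company_lower" \
    -param Environment="$ENVIRONMENT" \
    MyProject MyExtractJob
```

Inside the job, derivations that need the upper-case flavor can then apply Upcase() to the parameter, since Transformer derivations do support functions.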

Seems to me if you need both flavors in the job in areas that do not support functions then you'll need to pass both into the job as separate parameters.

Posted: Mon Mar 02, 2009 12:10 pm
by jdmiceli
Hi Craig,

I have been experimenting with downcasing the variable as it is passed in from the shell script, and that part works OK now. However, I do still have another place that requires the upper-case version in the same job. Changing the case for passing from the shell is no biggie; it is just when I need both in the same job, because I did not allow for multiple CompanyName parameters in the same job. I'll keep pounding.

Thanks for your input!

Posted: Mon Mar 02, 2009 1:14 pm
by ray.wurlod
Sounds like what you really need is two job parameters - one for the file name and one for "elsewhere in the job". Your script/job sequence can take care of the casing.

Re: Changing case of parameters on the fly

Posted: Mon Mar 02, 2009 3:07 pm
by bollinenik
Hi,
The easiest way is to save the values for the source file name and the company name in a file, then read that file either before triggering the job (passing the values in as parameters) or from within the job via a routine that sets the parameters on the fly. Either approach works.
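A rough sketch of the file-driven variant. The file format (one `name=value` pair per line) and all names here are assumptions for illustration, not something prescribed by DataStage:

```shell
#!/bin/sh
# Build a sample parameter file: one name=value pair per line
cat > params.txt <<'EOF'
CompanyName=lmc
SrcFileName=lmc_int.csv
EOF

# Turn each pair into a -param argument for dsjob
args=""
while IFS='=' read -r name value; do
    args="$args -param $name=$value"
done < params.txt

# Illustrative invocation; project and job names are made up here
echo dsjob -run $args MyProject MyExtractJob
```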

Posted: Tue Mar 03, 2009 6:54 pm
by sbass1
You could split the job logically as follows:

1) GetParameters (read your metadata to get your parameters)
2) UseParameters (your current "downstream" job)

Then call both jobs within a sequencer job.

1) would read your table, set UserStatus based on your table data, the job sequencer would parse UserStatus and dynamically set the parameters to 2).

See viewtopic.php?t=125264&highlight= for more details.

I think it would require 2 parameters though.

However, as Craig said, you may be able to use one parameter, as long as the parameter in the file name is in the format you want (since functions aren't available at that point), and use functions to, for example, uppercase your parameter within your transformations.

HTH,
Scott
