Parameter Passing through Object?

Post questions here related to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

calvinlo
Participant
Posts: 31
Joined: Thu Jul 17, 2003 2:55 am

Parameter Passing through Object?

Post by calvinlo »

Hi all,

I would like to know whether there is an object-oriented concept in DataStage.
I currently have a generic job sequence model, and every job passes the necessary parameters to this model to trigger the corresponding jobs. However, I find that there are too many parameters, and every job may have a different number of parameters to pass, too. I would like to have some kind of object, or a "Value String", so that I can "set" and "get" values from this object/string easily.
Anyone try that before?

Thanks,
Cal
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne

Post by vmcburney »

This is a very common problem for DataStage sites, since parameter handling in DataStage can be quite a labour-intensive process. Two options are to buy the Parameter Manager product, which has a link at the top of this web site, or to put your parameters in a file or database table and retrieve them using a routine.

See the following topic for a discussion on setting parameters:
http://www.tools4datastage.com/forum/to ... C_ID=84590
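
A minimal sketch of the file-based approach follows. The routine name, file location and name=value layout are my own assumptions, not a product feature; this would be the body of a transform routine with arguments ParamFile and ParamName.

Code:

      * Look up a single value by name from a plain name=value text file.
      Ans = ''
      OpenSeq ParamFile To FileVar Then
         Loop
            ReadSeq Line From FileVar Else Exit
            If Trim(Field(Line, '=', 1)) = ParamName Then
               Ans = Trim(Field(Line, '=', 2))
               Exit
            End
         Repeat
         CloseSeq FileVar
      End Else
         Call DSLogWarn('Cannot open parameter file ':ParamFile, 'GetParamFromFile')
      End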

Vincent McBurney
Data Integration Services
www.intramatix.com
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Parameters are properties of job objects (which are instances of the job class). So the only methods that are available are DSGetParamInfo and DSSetParam, or their implicit or inherited equivalents in Job Activities in job sequences or in stage or pin objects.

Because of the way that DataStage is designed, the only place from which an unmodified job sequence can obtain parameter values is from its own collection of parameters. You can modify the code generated by a job sequence to read parameter values from a file or table, of course; examples of this can be found in the archives.
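
For what it's worth, a minimal job-control sketch of those two methods looks like the following; the job name, parameter name and value are placeholders.

Code:

      * Attach the controlled job, read a parameter's design-time default,
      * override it for this run, then run the job and wait for it.
      hJob = DSAttachJob('LoadCustomers', DSJ.ERRFATAL)
      DefaultDir = DSGetParamInfo(hJob, 'SourceDir', DSJ.PARAMDEFAULT)
      ErrCode = DSSetParam(hJob, 'SourceDir', '/data/incoming')
      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)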

I would second Vincent's suggestion that Parameter Manager is the best way to manage the parameters in a DataStage project.


Ray Wurlod
Education and Consulting Services
ABN 57 092 448 518
calvinlo
Participant
Posts: 31
Joined: Thu Jul 17, 2003 2:55 am

Post by calvinlo »

Um... what I actually want to do is group several parameters into one parameter and pass it to another job as a string, then write some method that can easily build this "string" and get values from this "string". Is that possible?

Cal
mharkema
Participant
Posts: 11
Joined: Thu Mar 20, 2003 4:23 am

Post by mharkema »

Hi Cal,

It is possible to concatenate parameter values into one parameter and to read this one parameter as separate parameters again. For example, I have created a Before/After subroutine using only one parameter (InputArg). When I call the routine, I pass one parameter that is built up from several parameter values, separated by a ":" sign (or whatever sign you like). When the routine receives the parameter (InputArg), I use the FIELD function to separate the parameter values again. Please refer to the DataStage documentation for this function. The routine stores the separated parameters in strings that are used throughout the routine.
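
A minimal sketch of such a Before/After subroutine body is below. The InputArg/ErrorCode signature is the standard one for before/after routines; the particular values packed into the string are only examples.

Code:

      * InputArg is expected to hold e.g. "/data/in:customers.dat:2003-11-12",
      * built in the job as #SourceDir#:#FileName#:#LoadDate#.
      ErrorCode = 0                         ;* 0 = success, non-zero aborts the job
      SourceDir = Field(InputArg, ':', 1)
      FileName  = Field(InputArg, ':', 2)
      LoadDate  = Field(InputArg, ':', 3)
      Call DSLogInfo('Processing ':FileName:' from ':SourceDir, 'SplitArgsExample')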

[;)] Hope this helps in your case as well.
calvinlo
Participant
Posts: 31
Joined: Thu Jul 17, 2003 2:55 am

Post by calvinlo »

Thank you very much. That is what I want. And I am trying these functions now!

Cal
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

And to pass this between jobs, check out the SetUserStatus and GetUserStatus functions. Those names may not be *exactly* right, but should be close enough for you to find them. [:)]
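
For the record, the documented names are DSSetUserStatus (called inside the controlled job) and DSJ.USERSTATUS, read via DSGetJobInfo in the controlling job, or as StageName.$UserStatus in a job sequence. A rough sketch, with job and activity names assumed:

Code:

      * In the controlled job (e.g. an after-job subroutine or a routine
      * called from a Transformer derivation):
      Call DSSetUserStatus(SourceDir : ':' : FileName)

      * In the controlling job's job control code, after the job has run:
      hJob = DSAttachJob('ExtractJob', DSJ.ERRFATAL)
      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)
      PassedValues = DSGetJobInfo(hJob, DSJ.USERSTATUS)

      * In a job sequence, the same value is available to later activities
      * as the expression ExtractActivity.$UserStatus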

-craig
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

THERE IS NO WAY TO MANIPULATE EMBEDDED STRING PARAMETERS!


For example, if you create a single parameter in a job that is actually an embedded string of parameter values (directories, DSNs, user IDs, etc.) in an attempt not to define ALL the parameters your job requires as discrete parameters, you will not be able to parse this MOTHER OF ALL PARAMETERS.

For example, anywhere you use it as a FIXED parameter reference (the #MotherOfAllParameters# notation), you have no DataStage BASIC capability to manipulate it. If you're using it in a directory path, you cannot manipulate the value. If it's the DSN, USERID or PASSWORD field in any ODBC/OCI stage, you cannot manipulate the value.

The only way you could manipulate a #MotherOfAllParameters# value used that way is if it appears in a SQL-type query, where you can use SQL functions to substring or parse it.

You can also manipulate it where you use it in an expression, because there it's used as a variable: MotherOfAllParameters. You would use BASIC to manipulate it using the FIELD function or substring notation.
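
In other words, where the parameter is referenced as a BASIC variable (a Transformer derivation, a routine, job control code) you can pull the pieces out. Assuming the value holds dsn:userid:password, something like:

Code:

      * MotherOfAllParameters is assumed to hold 'dsnname:userid:password'
      TheDSN    = Field(MotherOfAllParameters, ':', 1)
      TheUserID = Field(MotherOfAllParameters, ':', 2)
      ThePasswd = Field(MotherOfAllParameters, ':', 3)

None of that helps with the fixed #MotherOfAllParameters# references Ken describes, which is the point being made.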

Kenneth Bland
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

I'm with Ken, based on many years of experience (his and mine).

Always prefer discrete parameters, as it makes the job design easier to understand and therefore to maintain.

Further, the effort involved in composing the mother of all parameters and subsequently decomposing it means that CPU cycles are being wasted that could be devoted to performing ETL. That is, your throughput performance will suffer.

There's no reason that the controlling job cannot work with multiple parameter values; if you need to pass them between jobs, use a file system object (for example a hashed file (easy) or a named pipe (less easy)).
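
A rough sketch of the hashed file hand-off follows. The file name, record key and field layout are all assumptions; the hashed file must already exist (created with CREATE.FILE or a Hashed File stage) and be visible from the project account.

Code:

      * Writing job (e.g. in an after-job routine):
      Open 'PARAM_HANDOFF' To ParamFile Then
         Rec = ''
         Rec<1> = SourceDir
         Rec<2> = FileName
         Write Rec To ParamFile, 'CURRENT_RUN'
         Close ParamFile
      End Else
         Call DSLogWarn('Cannot open PARAM_HANDOFF', 'WriteHandoff')
      End

      * Reading job (e.g. in a before-job routine):
      Open 'PARAM_HANDOFF' To ParamFile Then
         Read Rec From ParamFile, 'CURRENT_RUN' Then
            SourceDir = Rec<1>
            FileName  = Rec<2>
         End
         Close ParamFile
      End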

Ray Wurlod
Education and Consulting Services
ABN 57 092 448 518
mharkema
Participant
Posts: 11
Joined: Thu Mar 20, 2003 4:23 am

Post by mharkema »

Hi Guys,

Just a question (I agree with the previous postings about avoiding "the Mother of all parameters"): how would you cope with a before/after routine that can handle only one parameter/input argument (as far as I know)? In this case I found it handy to use a concatenated parameter string and, in my opinion, there is no other way to pass more than one argument to such a routine(?). Please correct me if I am wrong [8)]
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

You gotta do what you gotta do. However, whatever you are doing in the before/after routine has to have knowledge of how to unnest that concatenated string. If you are executing commands and not scripts or routines, you're outta luck. But I've had to do what you describe.

This is different; the question posed was about not creating blocks of parameters in jobs. It should be highlighting an issue with DataStage. Parameters are fixed variables that are used as one-way gateways for values between controlling and controlled jobs. I personally wish DataStage didn't have embedded parameters as we currently see them, but instead had a "pointer" name to a parameter object in DS Manager. That way you could manage an object in one place, and all a job has to do is point to that object. Kind of like an INCLUDE file. It sure would make adding parameters to jobs easier: you just update that object, and all jobs would inherit that parameter set.

This is why so many of us have had to find ways to deal with heavily parameterized job streams. ParameterManager is a prime example of a gap-fill utility, because the base product doesn't support what most of us do out here.

Kenneth Bland