
Issue: Pass the record value to job parameter

Posted: Fri Dec 07, 2007 5:17 pm
by chenyuan
Based on the data, each client_Id's records have to be output to a separate new file.
Using a job parameter, I can output to a different file name for each, but the issue is that DataStage can't pass a record's value to a job parameter sequentially, the way a for loop would.

Posted: Sat Dec 08, 2007 4:24 am
by ray.wurlod
Welcome aboard.

You could do this with a server job (rather than a parallel job) using the UtilityRunJob function. By this means you could pass the record, or some part of it, to a job parameter.
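
For illustration only, here is a minimal sketch of that kind of server routine. It uses the lower-level DataStage BASIC job-control functions (DSAttachJob / DSSetParam / DSRunJob) rather than the UtilityRunJob wrapper itself, to avoid guessing at its parameter-list format. The target job name "WriteClientFile" and the parameter name "ClientId" are assumptions; the routine would be called once per record (for example from a Transformer derivation) with the record's client_Id as its single argument, Arg1:

      * Sketch: run a target job once for the supplied client id.
      * "WriteClientFile" and "ClientId" are hypothetical names.
      hJob = DSAttachJob("WriteClientFile", DSJ.ERRNONE)
      If NOT(hJob) Then
         Ans = "ATTACH FAILED"
      End Else
         ErrCode = DSSetParam(hJob, "ClientId", Arg1)   ;* record value becomes the job parameter
         ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
         ErrCode = DSWaitForJob(hJob)                   ;* run the jobs one after another
         Ans     = DSGetJobInfo(hJob, DSJ.JOBSTATUS)    ;* return the finishing status to the caller
         ErrCode = DSDetachJob(hJob)
      End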

Posted: Mon Dec 10, 2007 1:52 pm
by chenyuan
The UtilityRunJob function only returns the job summary status, not a value from the record.
The following is copied from the routine description.
--------------------------------------------------------------------------------------
The routine runs a job. Job parameters may be supplied. The result is a dynamic array containing the job status, and row count information for each link. The routine UtilityGetRunJobInfo can be used to interpret this result.

As well as the job name and job parameters, the routine parameters allow the job warning limit and row count limit to be set.

Format of returned dynamic array:

Status<1>=Jobname=FinishStatus
Status<2>=Jobname
Status<3>=JobStartTimeStamp
Status<4>=JobStopTimeStamp
Status<5>=LinkNames (value mark @VM delimited)
Status<6>=RowCount (value mark @VM delimited)
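
For reference, the pieces of that dynamic array can be picked apart with ordinary field and value-mark extraction (variable names below are illustrative); as the format shows, none of them carries a value from the source record:

      * Status is the dynamic array returned by UtilityRunJob (format above)
      FinishStatus = Field(Status<1>, "=", 2)    ;* FinishStatus part of "Jobname=FinishStatus"
      JobName      = Status<2>
      StartedAt    = Status<3>
      LinkNames    = Status<5>                   ;* @VM-delimited list of link names
      FirstLink    = Status<5,1>
      FirstRows    = Status<6,1>                 ;* row count for FirstLink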


ray.wurlod wrote:Welcome aboard.

You could do this with a server job (rather than a parallel job) using the UtilityRunJob function. By this means you could pass the record, or some part of it, to a job parameter.

Posted: Tue Dec 11, 2007 1:53 am
by ray.wurlod
You can return anything you want, perhaps by making use of the invoked job's user status area. Research the DSSetUserStatus() and DSGetJobInfo() functions. Or you could write your own version of UtilityRunJob.
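
As a rough sketch of that idea (the job handle and variable names are illustrative, not from the thread): inside the invoked job a routine stores the value you want to hand back, and the controlling code reads it once the job has finished:

      * Inside the invoked job, e.g. in a routine called from a Transformer:
      Call DSSetUserStatus(SomeValue)            ;* stash the value in the job's user status area

      * In the controlling code, after attaching and running the job as in the earlier sketch:
      ErrCode   = DSWaitForJob(hJob)
      UserValue = DSGetJobInfo(hJob, DSJ.USERSTATUS)   ;* whatever DSSetUserStatus stored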

Posted: Thu Dec 13, 2007 3:17 pm
by chenyuan
The problem is that a job parameter can neither be changed at "run time" nor set as a "function". The utility function's return value still can't be passed to a parameter on its own or at run time. The only feasible solution so far is to use a shell script to read each record sequentially and pass its value to the job parameter. However, that's the worst solution, because the number of job runs (and the total run time) will depend on how many records the file has.

Re: Issue: Pass the record value to job parameter

Posted: Fri Dec 14, 2007 3:44 am
by JoshGeorge
I recently posted a parallel routine which does exactly what you need. You can find it by searching my previous posts; alternatively, it is available in this LINK.

Main highlights of this C++ function are:
  --> Creates and writes a text file for each record.
  --> From a stream from a single source or multiple sources, creates a file for each input record, or for a set of records according to a condition.
  --> You can dynamically pass the file path, file name and extension, as well as the records to be written into the file.
  --> If you want multiple records written into one file, store each record in a stage variable with a new-line character and finally pass that to the routine call.
  --> Records can be of different metadata.

chenyuan wrote:Based on the data, each client_Id's records have to be output to a separate new file.
Using a job parameter, I can output to a different file name for each, but the issue is that DataStage can't pass a record's value to a job parameter sequentially, the way a for loop would.

Re: Issue: Pass the record value to job parameter

Posted: Fri Dec 14, 2007 3:54 pm
by chenyuan
I believe that is an external before/after job routine. Unfortunately, there is a limitation on privileges for accessing the DataStage server: external functions are not allowed except for shell scripts, and the server doesn't have the library for C++ either.
JoshGeorge wrote:I recently posted a parallel routine which does exactly what you need. You can find it by searching my previous posts; alternatively, it is available in this LINK.


Posted: Fri Dec 14, 2007 5:22 pm
by JoshGeorge
It is a parallel routine (an external function), not a before/after job routine. There shouldn't be any "limitation on privileges for accessing the DataStage server" in any case; that is something you might have to cross-check. You have posted this in the parallel forum, and the DataStage parallel server does have all the libraries required for C++.