
Dynamically Naming Sequential File

Posted: Thu May 22, 2008 10:43 am
by msacks
Hi-

I have a Server Job that needs to create two Sequential Files. The two files are identical, except that one of them needs to be dynamically named. In order to name the file dynamically, I need to execute a query against a table in our database as follows:

select to_char( max(pay_end_dt), 'YYYYMMDD') || '.csv'
from my_table;


I need to take the result of this query and use it as the file name when I create my Sequential File. For example, if the query returned the value "20081212.csv", I would want to pass that value to my Sequential File stage and use it as the file name.

Is this possible? If so, can you provide detailed instructions on how to go about this?

Thanks in advance.

Posted: Thu May 22, 2008 11:12 am
by paddu
If I followed you correctly, here is what you can try:

1) Create a job which pulls the date value:

select to_char( max(pay_end_dt), 'YYYYMMDD') || '.csv'
from my_table;

and save the result in a flat file, say FileDate.txt.

2) Create a job sequence with an Execute Command stage that reads the FileDate.txt file. Pass the Execute Command output to the FileDate parameter in the job that needs it (define a FileDate parameter in that job with a blank default value).

The filename in the Sequential File stage will then look something like:

Filename_#FileDate#.CSV
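The two steps above can be sketched from the command line. This is a minimal simulation assuming a Unix engine host; the path /tmp/FileDate.txt and the date value are hypothetical stand-ins for what the first job would actually write:

```shell
# Step 1 (simulated): the first job writes the query result to a flat file.
printf '20080523.csv\n' > /tmp/FileDate.txt

# Step 2 (simulated): the Execute Command stage runs "cat" on that file;
# whatever cat prints becomes the stage's $CommandOutput.
cat /tmp/FileDate.txt
```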

Posted: Fri May 23, 2008 7:34 am
by msacks
Hi-

Two questions:

1. I know how to create an Execute Command stage in a Job Sequence, but I'm not entirely sure how to get it to read a text file, as paddu suggests, and have it populate the FileDate variable. Can someone provide an example?

2. Once the Execute Command stage has produced a value, how do I pass that value on and use it in the Sequential File stage?

I'm fairly new to DataStage, so please forgive me if these are stupid questions.

Posted: Fri May 23, 2008 11:43 am
by paddu
1) First, define a variable in the job, something like FileDate, on the job's Parameters tab, and leave the default value blank.


2) In the job sequence, connect an Execute Command stage to a Job Activity stage (which calls the job that needs the value).

Execute Command stage:

In the Command field, enter cat.

In the Parameters field, give the path and filename of FileDate.txt.

Job Activity stage (after selecting the job to call here):

You will see the FileDate variable in the Parameters grid.

Right-click the Value Expression for the FileDate parameter, select Activity Variable, then select $CommandOutput.


Don't forget to set the filename in the Sequential File stage of the job to:

Filename_#FileDate#.CSV

By doing all of the above, the result will look something like this (hope this is what you want):

test_20080523.CSV
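The #FileDate# substitution behaves like ordinary variable expansion. A shell analogy (the value is hypothetical; also note that if your query already appends '.csv', the template's extra .CSV suffix would be doubled, so use one or the other):

```shell
# FileDate holds whatever the Execute Command stage captured.
FileDate='20080523'

# The Sequential File stage expands Filename_#FileDate#.CSV the same way
# the shell expands this:
echo "Filename_${FileDate}.CSV"
# prints: Filename_20080523.CSV
```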

Posted: Fri May 23, 2008 1:18 pm
by sachin1
Thanks paddu, that was a very good explanation. I gained some knowledge.

Posted: Fri May 23, 2008 2:31 pm
by msacks
Hi-

Thanks to your help, I finally got this working! Just one last question.

I was getting the following error:
pass_param..JobControl (@recv_param): Controller problem: Error calling DSSetParam(FILE_DATE), code=-4
[ParamValue/Limitvalue is not appropriate]


I did some searching on DSXchange and discovered I needed to use the EReplace function in the Job Activity stage as follows:

EReplace(Execute_Command_0.$CommandOutput, @FM, "")

What exactly is this stripping off of the value of $CommandOutput, and why do I need to do it?

Posted: Fri May 23, 2008 3:06 pm
by chulett
Screen output is captured in a dynamic array, not a string. The Field Mark delimiter needs to be removed so that you pass in a simple string to the parameter. That field mark is the 'not appropriate' part of the error message.
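As an analogy: shell command substitution has the same issue with cat's trailing newline that DataStage has with the trailing field mark (@FM, CHAR(254)) in $CommandOutput. A sketch, assuming a Unix host and a hypothetical /tmp/FileDate.txt:

```shell
# cat's output ends with a newline; when DataStage captures the output
# into $CommandOutput, each newline becomes an @FM delimiter, turning the
# value into a dynamic array rather than a plain string.
printf '20080523.csv\n' > /tmp/FileDate.txt

# Deleting the delimiter, the way EReplace(..., @FM, "") does, leaves a
# plain single-valued string that is safe to pass as a parameter value:
cat /tmp/FileDate.txt | tr -d '\n'
```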

The posts where you found this technique should also have explained this.

Posted: Fri May 23, 2008 4:32 pm
by msacks
Thanks for the fast response. I wasn't aware of the dynamic array; are there any documents I can review to learn more about this?

I do want to say that if it wasn't for the DSXchange forums, I never would have known to use the EReplace function. Thanks to all who helped!