
I want to get the summary of a job

Posted: Mon Mar 14, 2005 4:36 am
by Shri
Hi group,

I am trying to get a summary of a job: the job name, source, target, number of records passed, and whether it passed or failed. How can I get this?

Alternatively, can anyone send me a routine to get just the records passed into a target stage in a particular job, using GetLinkInfo or something similar?

Thanks in advance.

Posted: Mon Mar 14, 2005 5:00 am
by ArndW
Hello Shri,

Getting the summary information you requested is not very difficult; it is a matter of calling the DSGetJobInfo function with the keys you want. Usually it is more work to figure out what to do with all the information gleaned from the log files: where and how are you going to output it? You also have a number of options from the dsjob command-line interface.

In order to get just the number of rows passed through to a given stage, you can use DSGetLinkInfo - but you need to know the job, stage and link names. I don't think you need someone to write & send a routine to you; that is only two lines of code.
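A minimal sketch of that kind of after-job summary might look like the following (the stage and link names here are placeholders, not taken from any real job):

Code: Select all

   $IFNDEF JOBCONTROL.H
   $INCLUDE DSINCLUDE JOBCONTROL.H
   $ENDIF

   * Job-level information for the currently running job
   JobName   = DSGetJobInfo(DSJ.ME, DSJ.JOBNAME)
   JobStatus = DSGetJobInfo(DSJ.ME, DSJ.JOBSTATUS)
   Started   = DSGetJobInfo(DSJ.ME, DSJ.JOBSTARTTIMESTAMP)
   Finished  = DSGetJobInfo(DSJ.ME, DSJ.JOBLASTTIMESTAMP)

   * Row count for one link into a target stage (names are examples only)
   RowsOut   = DSGetLinkInfo(DSJ.ME, "Trg_Target", "Ln_Input", DSJ.LINKROWCOUNT)

   Call DSLogInfo("Job ":JobName:" status ":JobStatus:" rows ":RowsOut, "JobSummary")

Code like this belongs in an after-job subroutine or in job control code, since DSJ.ME only has meaning inside a running job.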

Posted: Mon Mar 14, 2005 5:10 am
by ArndW
Shri,

I neglected to include some code that I had once written to get rows written to a specific link in an after-job routine:

Code: Select all

   EQUATE LinkToCheck  TO "Ln_Out_JC_TABLETIMESTAMPS"
   EQUATE StageToCheck TO "Hf_JC_TABLETIMESTAMPS"
   EQUATE JobToCall    TO "JdDSSJOBSynchronizeJobControlTables"
   EQUATE ProgramName  TO "RbaJobCheckReadControlTables"

   *******************************************************************
   ** Check the number of rows that went through the "changed" link **
   *******************************************************************
   RowsOutput  = DSGetLinkInfo(DSJ.ME,StageToCheck,LinkToCheck,DSJ.LINKROWCOUNT)

Posted: Mon Mar 14, 2005 7:25 am
by kduke
Do a search on EtlStats. You can download it from ADN or my tips page. Lots of prewritten code to do this.

Error with LINKROWCOUNT

Posted: Mon Mar 14, 2005 9:26 pm
by Shri
Hi ArndW,

I tried your routine and it is throwing this error, which I cannot rectify. Can you please help out?

Compiling: Source = 'DSU_BP/DSU.linkrownum', Object = 'DSU_BP.O/DSU.linkrownum'
*
0010 RowsOutput = DSGetLinkInfo(DSJ.ME,StageToCheck,LinkToCheck,DSJ.LINKROWCOUNT)

^
',' unexpected, Was expecting: '!', ')', '=', "AND", "OR", "LT", "LE",
"GT", "GE", "NE", "EQ", "MATCH"
Array 'DSGetLinkInfo' never dimensioned.
WARNING: Variable 'DSJ.ME' never assigned a value.
WARNING: Variable 'Ans' never assigned a value.

2 Errors detected, No Object Code Produced.

Posted: Mon Mar 14, 2005 10:07 pm
by ray.wurlod
You need the following declaration at/near the beginning of your routine, certainly ahead of any executable statement.

Code: Select all

$IFNDEF JOBCONTROL.H
$INCLUDE DSINCLUDE JOBCONTROL.H
$ENDIF
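For completeness, a minimal after-job routine built around ArndW's call might be laid out like this (the stage and link names are the ones from his example; the routine name is only illustrative):

Code: Select all

   $IFNDEF JOBCONTROL.H
   $INCLUDE DSINCLUDE JOBCONTROL.H
   $ENDIF
   * The declaration above must precede any executable statement.
   * After-job subroutines receive two arguments, InputArg and ErrorCode;
   * leave ErrorCode at 0 so the job is not flagged as failed.
   ErrorCode = 0

   RowsOutput = DSGetLinkInfo(DSJ.ME, "Hf_JC_TABLETIMESTAMPS", "Ln_Out_JC_TABLETIMESTAMPS", DSJ.LINKROWCOUNT)
   Call DSLogInfo("Rows through link: " : RowsOutput, "AfterJobRowCount")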

Posted: Mon May 16, 2005 12:59 am
by Madhav_M
Hi,
I am basically looking to document the link info.
Where will you see the return from the "DSGetLinkInfo" function by default?

I tried to write out the result of the function, but I was not successful.

I got an error like the following:
End of Line unexpected, Was expecting: Assignment Operator
0023 WRITESEQ "Records Outputted: ":r2 TO SeqFilePtr ELSE Err1 END

Please help me out with this.

Thanks
Maddi

Posted: Mon May 16, 2005 1:25 am
by ArndW
Hello Madhav_M,

Just remove the END from your line.

Posted: Mon May 16, 2005 2:31 am
by ray.wurlod
Unless Err1 is a macro (which I doubt), you can't use it the way you have in the WRITESEQ statement. Try the following (assuming variable r2 has been assigned a value):

Code: Select all

WriteSeq "Records outputted: " : r2
Else
   Call DSLogWarn("Unable to write to file.", "WriteSeq")
End
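If it helps, a fuller sketch of writing the count to a sequential file might look like this (the file path, stage, link and routine names are examples only):

Code: Select all

   $IFNDEF JOBCONTROL.H
   $INCLUDE DSINCLUDE JOBCONTROL.H
   $ENDIF

   MetricsFile = "/tmp/link_metrics.txt"        ;* example path only
   r2 = DSGetLinkInfo(DSJ.ME, "Trg_Target", "Ln_Input", DSJ.LINKROWCOUNT)

   OpenSeq MetricsFile To SeqFilePtr Else
      * OpenSeq takes the Else branch if the file does not exist yet,
      * so create it through the returned file variable.
      Create SeqFilePtr Else
         Call DSLogWarn("Cannot create " : MetricsFile, "WriteLinkMetrics")
      End
   End

   WriteSeq "Records outputted: " : r2 To SeqFilePtr Else
      Call DSLogWarn("Unable to write to file.", "WriteLinkMetrics")
   End

   CloseSeq SeqFilePtr

Note that OpenSeq positions the pointer at the start of an existing file, so to append rather than overwrite you would first read or Seek to the end.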

Posted: Tue May 17, 2005 6:31 am
by Madhav_M
Thanks for your help.
I was able to write the link information to a sequential file.

Is there any simple way to write this information into a table?

FYI, I am calling this as an after-job routine. Once the routine has executed, can the information be written to a table?

Thanks

Posted: Tue May 17, 2005 6:41 am
by chulett
Madhav_M wrote: Is there any simple way to write this information into a table?
Sure! Write a DataStage job to source the flat file(s) and load them into your table. :wink: Anything else wouldn't really fall into the 'simple' category.

Posted: Tue May 17, 2005 7:03 am
by Madhav_M
Hi chulett,

Thanks for the input; I had thought of that option.
If I have 100 jobs I would need 100 sequential files (since the number of source and target links may vary from job to job), and then I would still need another job to update the table.

Instead, can you write directly to a table from the routine?

Thx.

Posted: Tue May 17, 2005 7:43 am
by chulett
You don't necessarily need 100 files, assuming the metadata is identical between them. Depending on your situation, you can write to a single file, appending as you go. Or do what we typically do in this situation: a 'pre-job' step that cats all the files together into a single file. Either way, it is then just one job, run once, to get everything loaded. The final file can have a fixed name to simplify its processing even further; there's no need to worry about passing filename parameters in that case.
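A before-job step for that 'cat all the files together' idea can be as small as one DSExecute call; a sketch might look like the following (the paths are examples only), and the built-in ExecSH before-job subroutine with the same command as its input value would achieve the same thing:

Code: Select all

   * Hypothetical before-job routine body: concatenate the per-job metric
   * files into one file for the load job to read. Paths are examples only.
   ErrorCode = 0
   Cmd = "cat /tmp/metrics/job_*.txt > /tmp/metrics/all_metrics.txt"
   Call DSExecute("UNIX", Cmd, ShellOutput, SystemReturnCode)
   If SystemReturnCode <> 0 Then ErrorCode = SystemReturnCode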

As to your 'can you?' question - sure. As noted, it's not simple and you would need to learn how to use the 'BCI' functions over ODBC. And buy drivers, as the branded ones can't be used in this manner on a UNIX server... at least not for more than 30 days. :wink:

Or am I missing something here, guys? Is there a 'simple' option to write directly into a table from a routine? :?

Posted: Tue May 17, 2005 11:02 pm
by Madhav_M
If you run multiple jobs at the same time, how is the same sequential file updated if you go with a single sequential file?

Posted: Tue May 17, 2005 11:36 pm
by chulett
You can't. You could only append to a single file if the jobs are run in a serial fashion.

Or you could get more sophisticated and separate the jobs from the metrics-gathering steps: gather all of the statistics at once, once all the jobs complete.

Or fall back on the 'concatenate multiple files into one for processing' idea. More than one way to skin this cat. :wink:
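A sketch of that 'gather everything afterwards' idea, using the job control API to attach each finished job and pull its counts, might look like this (the job, stage and link names are examples only):

Code: Select all

   $IFNDEF JOBCONTROL.H
   $INCLUDE DSINCLUDE JOBCONTROL.H
   $ENDIF

   JobList = "JobLoadCustomers" : @FM : "JobLoadOrders"   ;* example job names
   For J = 1 To Dcount(JobList, @FM)
      JobName = JobList<J>
      hJob   = DSAttachJob(JobName, DSJ.ERRFATAL)
      Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
      Rows   = DSGetLinkInfo(hJob, "Trg_Target", "Ln_Input", DSJ.LINKROWCOUNT)
      Call DSLogInfo(JobName : " status " : Status : " rows " : Rows, "GatherMetrics")
      ErrCode = DSDetachJob(hJob)
   Next J

Since stage and link names vary from job to job, DSJ.STAGELIST (via DSGetJobInfo) and DSJ.LINKLIST (via DSGetStageInfo) can be used to discover them instead of hard-coding each one.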