how to get end timestamp in DSJOB

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

bobbysridhar
Premium Member
Posts: 41
Joined: Sun Mar 09, 2008 8:12 pm

how to get end timestamp in DSJOB

Post by bobbysridhar »

Hi,
I need to capture the job end timestamp once the job has completed successfully.
Is there anything we can do about it in the Transformer stage?

Thanks,
Sridhar
bobbysridhar
Premium Member
Premium Member
Posts: 41
Joined: Sun Mar 09, 2008 8:12 pm

Post by bobbysridhar »

Could somebody help me in this regard? Is there something we can do either in a Transformer or in an after-job subroutine?
k.v.sreedhar
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The transformer would be 'too soon' as the job would still be running. Technically, the job is still running when it hits the 'After Job' section but you could get it there using the API - look at the documentation for DSGetJobInfo() and one of the InfoType values should get you the ending timestamp.
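For illustration only, a minimal sketch of that after-job approach in DataStage BASIC. The routine name and the logging call are my own placeholders, not anything from this thread; DSGetJobInfo() and the DSJ.* constants come from the BASIC API (JOBCONTROL.H), and if you create the routine through the Designer/Manager editor the Subroutine declaration is generated for you. As noted above, the job is technically still running at this point, so verify on your release whether DSJ.JOBLASTTIMESTAMP already reflects the current run:

Subroutine GetEndTimestamp(InputArg, ErrorCode)
* Sketch of an after-job subroutine; names are illustrative only.
$INCLUDE DSINCLUDE JOBCONTROL.H

   ErrorCode = 0   ;* non-zero would abort the calling job

   * Start timestamp of the current run.
   StartTS = DSGetJobInfo(DSJ.ME, DSJ.JOBSTARTTIMESTAMP)

   * Candidate end timestamp. The job is still running here, so check
   * whether this reflects the current run or the previous one.
   EndTS = DSGetJobInfo(DSJ.ME, DSJ.JOBLASTTIMESTAMP)

   Call DSLogInfo("Start: " : StartTS : "  End: " : EndTS, "GetEndTimestamp")

Return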

Or you could do it any way you like in a separate process that runs after the other job completes. It all depends on exactly what you need this timestamp for... I suspect it is related to your other question about partitioning by timestamp or renaming a file after the job.
-craig

"You can never have too many knives" -- Logan Nine Fingers
bobbysridhar
Premium Member
Posts: 41
Joined: Sun Mar 09, 2008 8:12 pm

Post by bobbysridhar »

It is not related to any of my previous posts. I have a target table (Oracle) which keeps track of all loaded accounts, and it has a column for the end timestamp.
k.v.sreedhar
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Me, personally, I would never build anything like that into a job. I prefer a separate process that runs after all jobs have completed, one that queries them for things like their ending timestamp, link counts, etc., and outputs all of that to a flat file. Then another process can simply pick up that file and load all of the 'audit records' (or whatever you want to call them) into the appropriate place.

If you went that separate path, then you could do this gathering of information in a Transformer routine, since it would be checking other jobs. It would also be pretty straightforward to do all of that in a UNIX script, as in the sketch below.
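A rough sketch of that kind of UNIX audit script follows. Everything in it is an assumption to adapt: the engine bin path, the project and job names, the audit file location, and the exact labels that dsjob -jobinfo prints (they can vary slightly by release):

#!/bin/sh
# Sketch only: collect run information for a list of jobs and append
# pipe-delimited audit records to a flat file for a later load process.

DSBIN=/opt/IBM/InformationServer/Server/DSEngine/bin   # adjust for your install
PROJECT=MyProject                                      # hypothetical project
AUDIT_FILE=/tmp/job_audit.dat                          # hypothetical audit file

for JOB in LoadAccounts LoadCustomers                  # hypothetical job list
do
   INFO=`$DSBIN/dsjob -jobinfo $PROJECT $JOB`

   # -jobinfo prints labelled lines such as "Job Start Time : ..." and
   # "Last Run Time : ..."; cut keeps everything after the first colon.
   START=`echo "$INFO" | grep "Job Start Time" | cut -d: -f2-`
   END=`echo "$INFO" | grep "Last Run Time" | cut -d: -f2-`
   STATUS=`echo "$INFO" | grep "Job Status" | cut -d: -f2-`

   echo "$JOB|$START|$END|$STATUS" >> $AUDIT_FILE
done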

Multiple ways to skin this cat.
-craig

"You can never have too many knives" -- Logan Nine Fingers
bobbysridhar
Premium Member
Posts: 41
Joined: Sun Mar 09, 2008 8:12 pm

Post by bobbysridhar »

Could you please let me know what kind of process you implemented that runs after all the jobs have completed?
Please also let me know how to do it in UNIX.
k.v.sreedhar
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

In UNIX, use the dsjob -jobinfo command. This reports about five lines of information, including the start and end timestamps. You can parse these with UNIX commands such as grep and cut.
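For example (the project and job names are placeholders, DSHOME is assumed to point at the DSEngine directory, and the label on the end-timestamp line, "Last Run Time" here, may differ by release):

# Show everything -jobinfo reports for one job.
$DSHOME/bin/dsjob -jobinfo MyProject MyJob

# Keep just the end timestamp: grep the labelled line, cut off the label.
$DSHOME/bin/dsjob -jobinfo MyProject MyJob | grep "Last Run Time" | cut -d: -f2-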
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
qt_ky
Premium Member
Posts: 2895
Joined: Wed Aug 03, 2011 6:16 am
Location: USA

Post by qt_ky »

I agree with Craig's approach. The last thing I would want is for the external job tracking table to have some problem, such as the database going down or the ID expiring, and bring your production ETL to a halt.

FYI: DataStage 8.7 has a web-based operations console (read-only).
Choose a job you love, and you will never have to work a day in your life. - Confucius