Operational Metadata Capture

Post questions here related to DataStage Enterprise/PX Edition in areas such as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

gagan8877
Premium Member
Posts: 77
Joined: Mon Jun 19, 2006 1:30 pm

Operational Metadata Capture

Post by gagan8877 »

Hi Gurus

In version 7.x I relied mostly on DSJobReport and its XML output to extract job run stats and load them into the audit dimension and Error Event Schema.

1. Is there a better or easier way in v 8.5? Non-XML way?

2. I know we can generate reports in Metadata Workbench manually through the console, and they can be exported to Excel, but can Metadata Workbench (or another tool) capture operational metadata automatically on a scheduled basis?

3. Can we extract from the repository database directly? Is that recommended?

4. What's the best way? Please don't say "whatever you are comfortable with"; please assume I am looking for alternatives.

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

DataStage jobs can generate operational metadata (there are various ways to enable this, including an option in the Job Run Options dialog). It can then be captured and accessed using Metadata Workbench.

Once you get to version 8.7 there's an even better way, called the Operations Console.

I never recommend extracting directly from the repository, as you're never really sure what you're dealing with. Instead, create reports using Metadata Workbench.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
gagan8877
Premium Member
Posts: 77
Joined: Mon Jun 19, 2006 1:30 pm

In Cognos too?

Post by gagan8877 »

Thanks Ray. My requirement is that users need to access the report inside Cognos, where all the other reports are (a single location). Can Workbench reports be surfaced in the Cognos layer? Or will I have to use DSJobReport to shred the XML and then load it into the dimensional schema for Cognos to access?
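For context, the XML-shredding step being discussed can be sketched in Python. The element and attribute names below (`JobReport`, `JobName`, `Component`, etc.) are illustrative assumptions, not the actual DSJobReport output schema, so check them against the XML your jobs really emit:

```python
# Sketch: shred a DSJobReport-style XML report into flat rows suitable
# for loading into a dimensional audit schema.
# NOTE: the element/attribute names here are HYPOTHETICAL stand-ins for
# whatever your DSJobReport routine actually produces.
import xml.etree.ElementTree as ET

SAMPLE = """<JobReport JobName="LoadCustomers" Status="Finished OK">
  <ComponentSet>
    <Component Name="xfm_Clean" Type="Transformer"/>
  </ComponentSet>
</JobReport>"""

def shred_job_report(xml_text):
    """Return one dict per component, repeating the job-level attributes."""
    root = ET.fromstring(xml_text)
    job = {"job_name": root.get("JobName"), "status": root.get("Status")}
    rows = []
    for comp in root.iter("Component"):
        row = dict(job)
        row["component"] = comp.get("Name")
        row["component_type"] = comp.get("Type")
        rows.append(row)
    return rows

print(shred_job_report(SAMPLE))
```

Each flattened row can then be bulk-loaded into the audit dimension that Cognos reports against.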
Gary
"A journey of a thousand miles begins with one step"
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne

Post by vmcburney »

In version 8.5 and earlier, the easiest way to get operational reporting running in Cognos is to build your own operational repository and populate it from the output of the job reports. Version 8.5 made this a bit easier by providing an environment variable that lets you switch XML job reporting on and off for a project, so all the job reports end up in a directory on the DataStage server and need to be ingested into database tables for reporting. Kim Duke's ETL Stats is an example of a database schema for this type of reporting.

The operational metadata reporting in 8.5 requires you to schedule and run imports into XMETA. That metadata can then be queried directly against XMETA, but it is not very user friendly.

In version 8.7 IBM provides this reporting database for you with the new Operations Console. Unlike XMETA, it is a user-friendly, intuitive schema, and it comes with an agent to load it. Somewhere on developerWorks there is a long description of all the tables in that schema and how to use them; it is a perfect reporting schema for Cognos, and it can drill right down to the level of jobs, links, row counts, job parameters, etc.

The question you have to ask is whether to build your own operational reporting in 8.5 - potentially over $20K in work - or upgrade to 8.7, or wait for the release coming this October, rumoured to be version 9.1. I would expect more operational reporting in the next version, possibly even some exception-management reporting, but don't quote me on that.
gagan8877
Premium Member
Posts: 77
Joined: Mon Jun 19, 2006 1:30 pm

Post by gagan8877 »

Thanks a lot Vincent! That was thorough and detailed.