Easy access to Job Description

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

DS_JOBS, DS_JOBOBJECTS, RT_LOG and other similar hashed files are where the actual data is stored. Arnd is correct that these tend to change from one release to the next. You can generally read from them in routines, but DataStage normally will not let you write to them except from jobs. The SDK routines like DSGetJobInfo() and others do exactly what I do directly, so it is generally faster to go straight to the source data. Knowing where things are stored has other advantages: you can copy parameters from one job to all other jobs, or update the long description directly. I have used UniVerse for a long time, so I am very comfortable doing this.
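
To make that concrete, here is a minimal, untested sketch in DataStage/UniVerse BASIC of reading a job's record straight out of DS_JOBS. The job name and routine name are placeholders, and because field positions in these files can change between releases, the sketch just logs the whole record rather than relying on a particular field number.

      JobName = "MyJob"                              ;* placeholder job name
      Open "DS_JOBS" To DsJobs Else
         Call DSLogWarn("Cannot open DS_JOBS", "ReadJobRec")
         GoTo Done
      End
      Read JobRec From DsJobs, JobName Then
         Call DSLogInfo("DS_JOBS record: " : JobRec, "ReadJobRec")
      End Else
         Call DSLogWarn("Job not found: " : JobName, "ReadJobRec")
      End
Done: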

I really hate attaching to jobs. This command was designed to run jobs after you attached to them. Now it is required to get link names, and most of the SDK routines require attaching first. I think this is dumb. If all I want is information, then do not give me a job handle that lets me run the job as well. If my routines break in the next release, then within a few days they will be fixed and I will post the solutions. Most of my customers and friends know I am that way. They count on it. Even customers who treated me poorly can count on me giving them answers long after they quit paying me. I do what I believe is correct no matter how others may treat me. Life is easier that way. Do the generous thing. Freely give and freely you will receive. I like free, especially when it comes to wisdom.
Mamu Kim
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

A job can't attach to itself for the simple reason that it already is attached to itself! You can use DSJ.ME for the job handle argument (always the first argument) in the DSGet... discovery functions.
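
For example, a quick untested sketch of a routine running inside the job itself, using DSJ.ME as the handle instead of attaching (the stage inspected below is simply whichever one comes first in the list):

      $INCLUDE DSINCLUDE JOBCONTROL.H
      JobName   = DSGetJobInfo(DSJ.ME, DSJ.JOBNAME)
      StageList = DSGetJobInfo(DSJ.ME, DSJ.STAGELIST)       ;* comma-separated stage names
      AStage    = Field(StageList, ",", 1)
      LinkList  = DSGetStageInfo(DSJ.ME, AStage, DSJ.LINKLIST)
      Call DSLogInfo("Links on " : AStage : ": " : LinkList, "DSJMEDemo")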

The structure of the repository is scheduled to change in the Hawk release (September 2005), if all goes well in beta testing.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Ray,

I know about the attaching-to-itself limitation; but I wrote some code that would get a list of all parent processes (the DS calls don't do that) and inadvertently issued a DSAttachJob('myjobname'... call, and it ended up hanging the process in such a way that I had to go through UV to clean up the mess. DS should be more graceful and either add a line of code to return DSJ.ME as the handle or issue an error message. (It took me a whole day to trace the cause of the error, so I feel emotional about this issue!)
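
For what it's worth, a small untested sketch of the kind of guard that avoids the hang, assuming the target job name is held in a placeholder variable TargetJob: compare it against the current job's name and fall back to DSJ.ME rather than attaching.

      $INCLUDE DSINCLUDE JOBCONTROL.H
      If TargetJob = DSGetJobInfo(DSJ.ME, DSJ.JOBNAME) Then
         hJob = DSJ.ME                          ;* already attached to ourselves
      End Else
         hJob = DSAttachJob(TargetJob, DSJ.ERRNONE)
         If Not(hJob) Then
            Call DSLogWarn("Cannot attach to " : TargetJob, "ParentList")
         End
      End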
StefL
Participant
Posts: 47
Joined: Fri Feb 25, 2005 3:55 am
Location: Stockholm
Contact:

Post by StefL »

Actually, one thing that strikes me is that there seems to be no way of getting information about which category (i.e. which folder in the folder structure) a certain job belongs to.
In my present solution I'm using DSGetProjectInfo(DSJ.JOBLIST) and then filtering out the relevant jobs to add to the Data Dictionary based on their names. It seems, however, that I'll have to name them according to category if I want to be able to distinguish the categories. Neither DSGetProjectInfo nor DSGetJobInfo seems to offer any way to get information about the folder structure in the repository.
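
Roughly what I'm doing, as an untested sketch (the "DD_" prefix is just a placeholder for whatever naming convention ends up distinguishing the categories):

      $INCLUDE DSINCLUDE JOBCONTROL.H
      JobList = DSGetProjectInfo(DSJ.JOBLIST)          ;* comma-delimited list of job names
      NumJobs = Dcount(JobList, ",")
      For I = 1 To NumJobs
         JobName = Field(JobList, ",", I)
         If JobName[1,3] = "DD_" Then                   ;* placeholder naming convention
            Call DSLogInfo("Including in Data Dictionary: " : JobName, "BuildDD")
         End
      Next I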
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

StefL,

the category information doesn't come through the standard calls, as you've discovered. Categories were added sometime in the earlier DS versions (as a result of enhancement requests) and are therefore not really part of the file name (i.e. the same job or routine name cannot exist in multiple categories).

-Arnd.
StefL
Participant
Posts: 47
Joined: Fri Feb 25, 2005 3:55 am
Location: Stockholm
Contact:

Post by StefL »

Yes, I've noticed that a job name has to be unique within the entire repository. Still, the category information has to be stored somewhere in order for the repository to be presented as it is, with the jobs placed in a folder structure. In my case the category affects one attribute in the Data Dictionary I'm creating.

It's no big deal; I'll just put a reference to the category in the Description that I'm getting out anyway and then scan it for the substring, but I always like to get a full understanding of things...
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

There are several posts on this forum with routines to get the category. Do a search. Mine is called GetCategory().
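
For reference, a rough, untested sketch of the general idea only (this is not the actual GetCategory() code): the category is typically pulled from the DS_JOBS hashed file, but the column that holds it is an assumption here and may differ by release, so check DICT DS_JOBS in your own project first.

      * Hypothetical sketch -- the CATEGORY column name is assumed; verify it first.
      JobName = "MyJob"                                ;* placeholder job name
      Cmd = "SELECT CATEGORY FROM DS_JOBS WHERE NAME = '" : JobName : "';"
      Call DSExecute("UV", Cmd, Output, RetCode)
      Call DSLogInfo("Query output: " : Output, "CategoryLookup")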
Mamu Kim