Discovering Where Hashed Files are used

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

danjm
Participant
Posts: 34
Joined: Tue Sep 09, 2003 8:44 am
Location: Canada
Contact:

Discovering Where Hashed Files are used

Post by danjm »

Can someone tell me if there is a straightforward way to find out which jobs are creating or using a specific hashed file, by name?
When I see a hashed file stage in a job that is under review, I'd like to be able to quickly identify the job that created the hashed file and any/all jobs that may be referencing it.
Dan Marshall
Alberta Education
Edmonton, AB
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Hello Dan,

If the files are created within a job and people also use the column definition metadata, then I would go to the Manager, click on the metadata for that file, and do a "usage analysis" on it.
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Where's Chuck when you need him? :wink:

To quote Mr Smith:
Try my "List all files and tables used by jobs in a dsx file" tool.

You will find it on the DataStage Tools page of www.anotheritco.com
-craig

"You can never have too many knives" -- Logan Nine Fingers
danjm
Participant
Posts: 34
Joined: Tue Sep 09, 2003 8:44 am
Location: Canada
Contact:

Post by danjm »

ArndW wrote:Hello Dan,

If the files are created within a job and people also use the column definition metadata, then I would go to the Manager, click on the metadata for that file, and do a "usage analysis" on it.
I was pursuing that idea before I asked this question, but if I go into Manager and select the file or its column definitions, the "usage analysis" operation is not available (dimmed)... I only see "usage analysis" as an option at the 'job' level of the tree. Perhaps I'm misunderstanding or not using the client tool correctly?
ONE MOMENT... I 'think' I found it now, under "Table Definitions" --> "Hashfiles"... this seems to be what I'm looking for. Thanks so much! Have a great day.
Dan Marshall
Alberta Education
Edmonton, AB
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The problem is that the Usage Analysis data is easily broken, so things don't link up properly. Also, if people don't pull the metadata from the repository and just type it in, for instance, there is no linkage. Something to keep in mind.

I use Chuck's job to farm this information from an export and then load it into an Oracle table. A simple query can then get me a list of who reads from or writes to specific hashed files.
-craig

"You can never have too many knives" -- Logan Nine Fingers
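[Editor's note: the export-to-table-to-query workflow Craig describes above can be sketched with an in-memory SQLite table standing in for Oracle. The table and column names here (job_file_usage, job, link, file, mode) are made up for illustration; they are not the actual output layout of Chuck's tool.]

```python
import sqlite3

# Rows as they might come out of the tool's .csv output; the column
# layout (job, link, file, mode) is an assumption for illustration.
rows = [
    ("LoadCustomers", "lnk_write", "HF_CUSTOMER", "write"),
    ("BuildLookup",   "lnk_ref",   "HF_CUSTOMER", "read"),
    ("BuildLookup",   "lnk_write", "HF_PRODUCT",  "write"),
]

# SQLite stands in for the Oracle table mentioned in the post.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE job_file_usage (job TEXT, link TEXT, file TEXT, mode TEXT)")
conn.executemany("INSERT INTO job_file_usage VALUES (?, ?, ?, ?)", rows)

def jobs_using(file_name):
    """Return (job, mode) pairs for every job touching the given hashed file."""
    cur = conn.execute(
        "SELECT job, mode FROM job_file_usage WHERE file = ? ORDER BY job",
        (file_name,))
    return cur.fetchall()

usage = jobs_using("HF_CUSTOMER")
```

Once the rows are in a table, "who writes this hashed file?" is a one-line WHERE clause rather than a spreadsheet search.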
danjm
Participant
Posts: 34
Joined: Tue Sep 09, 2003 8:44 am
Location: Canada
Contact:

Post by danjm »

chulett wrote:The problem is that the Usage Analysis data is easily broken, so things don't link up properly. Also, if people don't pull the metadata from the repository and just type it in, for instance, there is no linkage. Something to keep in mind.

I use Chuck's job to farm this information from an export and then load it into an Oracle table. A simple query can then get me a list of who reads from or writes to specific hashed files.
Thanks for the extra information; I hadn't considered that someone might manually enter metadata, but of course that is a very real possibility. Could you tell me a little more about "Chuck's job", though? I don't understand that part of your comment.
Dan Marshall
Alberta Education
Edmonton, AB
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Go to his website and download the job; it's very well documented. You feed an export (.dsx file) in one end and a .csv file comes out the other. 8)

We used to keep the output in a spreadsheet, but pushing it up into a table made things easier.
-craig

"You can never have too many knives" -- Logan Nine Fingers
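[Editor's note: Chuck's tool itself isn't reproduced here. As a rough illustration of the approach it implies (scan a .dsx export, pair each job name with the file names it references), here is a minimal Python sketch. The record markers used below ('BEGIN DSJOB', 'Identifier "..."', 'FileName "..."') are assumptions made for illustration, not the documented .dsx format.]

```python
import re

def hashed_files_by_job(dsx_text):
    """Map each job found in an export to the set of file names it references.

    The markers scanned for here are illustrative assumptions about the
    export layout, not a format specification.
    """
    jobs = {}
    current = None
    for raw in dsx_text.splitlines():
        line = raw.strip()
        if line == "BEGIN DSJOB":
            current = None  # the next Identifier record names the job
            continue
        m = re.match(r'Identifier "(.+)"$', line)
        if m and current is None:
            current = m.group(1)
            jobs[current] = set()
            continue
        m = re.match(r'FileName "(.+)"$', line)
        if m and current:
            jobs[current].add(m.group(1))
    return jobs

# Tiny hand-made export fragment in the assumed layout.
sample = """BEGIN DSJOB
   Identifier "LoadCustomers"
   FileName "HF_CUSTOMER"
END DSJOB
BEGIN DSJOB
   Identifier "BuildLookup"
   FileName "HF_CUSTOMER"
   FileName "HF_PRODUCT"
END DSJOB"""

refs = hashed_files_by_job(sample)
```

The real tool emits a .csv rather than a dict, but the core idea is the same: one pass over the export, keyed by the current job.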
danjm
Participant
Posts: 34
Joined: Tue Sep 09, 2003 8:44 am
Location: Canada
Contact:

Post by danjm »

chulett wrote:Go to his website and download the job; it's very well documented. You feed an export (.dsx file) in one end and a .csv file comes out the other. 8)

We used to keep the output in a spreadsheet, but pushing it up into a table made things easier.
Thank you, Craig, I will definitely look into this as it sounds very useful. Have a nice day and thanks again to everyone for the assist. This group has been so very helpful to me in so many ways!!
Dan Marshall
Alberta Education
Edmonton, AB
kduke
Charter Member
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

At TCL run:

select 
   EVAL DS_JOBOBJECTS."@RECORD<6>" AS FILE_NAME FMT '35L',
   DS_JOBS.NAME AS JOB_NAME FMT '35L', 
   DS_JOBOBJECTS.NAME AS LINK_NAME FMT '35L'
from 
   DS_JOBOBJECTS, 
   DS_JOBS
where 
   DS_JOBOBJECTS.OLETYPE in ('CHashedOutput','CHashedInput') 
   and DS_JOBOBJECTS.OBJIDNO = DS_JOBS.JOBNO 
group by 
   FILE_NAME, 
   JOB_NAME,
   LINK_NAME
;
Mamu Kim
danjm
Participant
Posts: 34
Joined: Tue Sep 09, 2003 8:44 am
Location: Canada
Contact:

Post by danjm »

kduke wrote:At TCL run:

select 
   EVAL DS_JOBOBJECTS."@RECORD<6>" AS FILE_NAME FMT '35L',
   DS_JOBS.NAME AS JOB_NAME FMT '35L', 
   DS_JOBOBJECTS.NAME AS LINK_NAME FMT '35L'
from 
   DS_JOBOBJECTS, 
   DS_JOBS
where 
   DS_JOBOBJECTS.OLETYPE in ('CHashedOutput','CHashedInput') 
   and DS_JOBOBJECTS.OBJIDNO = DS_JOBS.JOBNO 
group by 
   FILE_NAME, 
   JOB_NAME,
   LINK_NAME
;
Thanks for this as well, Kim. Works like a charm too! Have a great day. Thanks again to everyone for taking the time to assist me with this!
Dan Marshall
Alberta Education
Edmonton, AB
kduke
Charter Member
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

Dan

You are welcome.
Mamu Kim