EtlStats

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

har
Participant
Posts: 118
Joined: Tue Feb 17, 2004 6:23 pm
Location: cincinnati
Contact:

EtlStats

Post by har »

Hi Kim,
I imported the dsx, and I also exported and created all the files and folders that are necessary as per your Copy2Project and install.txt instructions.
When I try to run DSJobReportDb by passing a job name, I get this error:
Job stopped - before-job routine returned error: DSU.ExecDOS is not cataloged
I don't have ExecDOS functionality in the before- or after-job routines.
How can I modify this job?

Thanks,
Har
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Search
I know for a fact that "DSU.ExecDOS not cataloged" has been answered in the past.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
har
Participant
Posts: 118
Joined: Tue Feb 17, 2004 6:23 pm
Location: cincinnati
Contact:

Post by har »

ray,
I checked the VOC file for ExecDOS and it's not there.
I am planning to add the following record to the VOC file. Can you correct the record if there are any mistakes in it?
DSU.ExecDOS
001 V
002 DSU_BP.O/DSU.ExecDOS
003 B
004 BN
009 DSU_BP.O

Thanks,
Har
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Seems like you need to replace all occurrences of ExecDOS with ExecSH and convert the DOS commands to UNIX ones as well.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kduke
Charter Member
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

Craig is correct. There is a before-job routine which runs the dsjob command. You need to fix the path to make this work in your project. If you are on UNIX then you need ExecSH instead of ExecDOS, and you also need to change \ to /. This outputs to an XML file below the project, in the KimD folder. You can change all of this here.

../../DSEngine/bin/dsjob -report #Project# #JobName# XML >./KimD/#JobName#.xml

is the command that it wants to execute. This XML file becomes the source file for all three tables:

ETL_JOB_HIST
ETL_ROW_HIST
ETL_PARAM_HIST
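For a UNIX install, the conversion Craig and I describe can be sketched like this. The Windows form of the command shown here is an assumption (a mirror image of the UNIX one above), used only to illustrate flipping the slashes:

```shell
#!/bin/sh
# Sketch: convert the assumed Windows (ExecDOS) command into the UNIX
# (ExecSH) form by flipping backslashes to forward slashes.
DOS_CMD='..\..\DSEngine\bin\dsjob -report #Project# #JobName# XML >.\KimD\#JobName#.xml'
UNIX_CMD=$(printf '%s\n' "$DOS_CMD" | tr '\\' '/')
echo "$UNIX_CMD"
```

DataStage substitutes #Project# and #JobName# before the shell ever sees the command, so the placeholders are left as-is here.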
Mamu Kim
kduke
Charter Member
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

For the PX users I have a much improved version of this job and these tables. The original ETL_ROW_HIST was aggregated to remove the partitions; we found that row counts at the partition level are valuable for seeing whether a hash partition is unbalanced.

We also found that we are RAM bound. The XML report contains the process ID of each stage. If you run ps -elm you can look up the sizes and add up the process sizes for a job. This takes some work if you are on a grid, or in our case a cluster, because it requires figuring out which server each PID is on. That required Kevin to parse the config file.

If we can add up all the processes running for each job, then we can figure out when we start paging and do better load balancing. We are very close to getting all of this into tables. I will see if they will let me post it.

I have a script now which does the same thing. It adds up all the process sizes for one job and displays the process names and sizes. Way cool.
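A minimal sketch of such a script might look like the following; the PID list here is a hypothetical stand-in, since the real script parses the PIDs out of the job's XML report (and, on a cluster, the config file):

```shell
#!/bin/sh
# Sketch: sum the virtual sizes (VSZ, in KB) of the processes for one job.
# PIDS is a hypothetical stand-in -- the real list comes from the XML report.
PIDS="$$"
for pid in $PIDS; do
    ps -p "$pid" -o comm= -o vsz=     # process name and size
done | awk '{ kb += $NF; print } END { print "TOTAL_KB", kb }'
```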
Mamu Kim
stefanfrost1
Premium Member
Premium Member
Posts: 99
Joined: Mon Sep 03, 2007 7:49 am
Location: Stockholm, Sweden

Post by stefanfrost1 »

I am trying to run the command (I changed #Project# and #JobName# to actual values and tried it at the UNIX prompt):
../../DSEngine/bin/dsjob -report #Project# #JobName# XML >./KimD/#JobName#.xml
I am sure that I am typing the correct names and using the correct caps, but I receive the following message....

Code:

ERROR: Failed to open project

Status code = 81016
What's wrong? And yes, I am running as dsadm.
-------------------------------------
http://it.toolbox.com/blogs/bi-aj
my blog on delivering business intelligence using agile principles
kduke
Charter Member
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

The #'s surround parameter names, and their values get substituted at run time, just like $Project would be in a shell script. All jobs run in the project directory; the relative path to dsjob works because the Projects directory sits next to the DSEngine directory on most DataStage servers.
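In other words, at a shell prompt you supply real values yourself. Here is a sketch with hypothetical stand-ins for the project and job name:

```shell
#!/bin/sh
# Hypothetical stand-ins for the #Project# and #JobName# parameters:
PROJECT=myproject
JOBNAME=DSJobReportDb
# Build the command exactly as DataStage would after substitution:
CMD="../../DSEngine/bin/dsjob -report $PROJECT $JOBNAME XML >./KimD/$JOBNAME.xml"
echo "$CMD"
```

Run it from the project directory (or use absolute paths), since both the dsjob path and the KimD folder are relative.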

I think you need to go to class and learn the basics. These jobs require basic knowledge of parameters and of DataStage in general.
Mamu Kim