
Determining work load / hardware resource demands

Posted: Tue Oct 26, 2010 3:17 pm
by vivekgadwal
Hello,

We are a relatively small DataStage shop (about 10 - 12 users), and our team uses DataStage, QualityStage and Information Analyzer for Business Intelligence work. There are also other, non-BI DataStage jobs running in the Development, Test and Production environments. Needless to say, with the jobs being quite complicated, we very frequently step on each other's toes with regard to resources.

This brings me to the question (actually it is bothering us as a team):
:?: How are people in other shops determining their work load/hardware resource demands from a DataStage point of view?

I understand that a lot of factors need to be taken into account, such as data partitioning, sorts, the number of joins/lookups, and transformers.

:?: Is there a good way to determine the workload on the CPU and disk space (DASD), such as a utility or other software?

Please advise. If you would like any additional details, I would be glad to provide them to the best of my knowledge.
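
To give a sense of what we are after, below is a minimal sketch of the kind of per-run snapshot we could take ourselves on the engine host while a job runs. This assumes Python with the third-party psutil package is available on that host; the sample interval and the file systems being watched are placeholders, not anything DataStage-specific.

Code:

# Rough sketch: sample CPU, memory and disk usage while a DataStage job runs.
# Assumes the third-party psutil package is installed on the engine host.
import time
import psutil

SAMPLE_SECONDS = 10              # how often to take a reading
PATHS_TO_WATCH = ["/tmp"]        # placeholder: your scratch/dataset file systems

def snapshot():
    """Return one reading of CPU, memory and per-path disk usage."""
    cpu = psutil.cpu_percent(interval=1)      # % CPU averaged over a 1-second window
    mem = psutil.virtual_memory().percent     # % physical memory in use
    disks = {p: psutil.disk_usage(p).percent for p in PATHS_TO_WATCH}
    return cpu, mem, disks

if __name__ == "__main__":
    # Run this alongside a job and compare the peaks between runs afterwards.
    while True:
        cpu, mem, disks = snapshot()
        stamp = time.strftime("%Y-%m-%d %H:%M:%S")
        print("{0}  cpu={1:.1f}%  mem={2:.1f}%  disks={3}".format(stamp, cpu, mem, disks))
        time.sleep(SAMPLE_SECONDS)

This is obviously no substitute for a proper capacity-planning tool, which is why I am asking what other shops use.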

*Edited this post to have visual clarity for the questions*

Posted: Thu Oct 28, 2010 3:21 pm
by vivekgadwal
Has anybody in this forum worked with IBM Tivoli Workload Scheduler? Does that help in my case, or is it more like a conventional (external) scheduler?

Posted: Thu Oct 28, 2010 3:26 pm
by chulett
Isn't there a tool in the 8.x release for this exact purpose? Or was it added as part of the 8.1 release?

Posted: Thu Oct 28, 2010 4:57 pm
by ray.wurlod
It was added in the 8.0 release if memory serves (maybe even in 7.5.3?) and there are two: Resource Estimation tool and Performance Analysis tool. Read about them in the DataStage Designer Guide.

Posted: Thu Oct 28, 2010 4:59 pm
by ray.wurlod
In the roadmap for DataStage/Information Server (subject to the usual caveat: this is stuff they're thinking about or working on, with no promise of what will be delivered or when), a web console for monitoring production environments has been proposed.

Posted: Fri Oct 29, 2010 1:28 am
by Sreenivasulu
You can use the Performance Analysis tool, but it shows data in KB even when the figures run into thousands of KB. That may be a one-off bug, but I was not satisfied with this feature.

Regards
Sreeni

Posted: Fri Oct 29, 2010 8:15 am
by vivekgadwal
Thanks for your replies.

I could not find any documentation on these tools in the Designer Guide. I launched the Resource Estimation tool and there is some built-in help for it. However, the tool asks me to "Run" the job (according to the documentation, this creates a static model of the job), and when I run it, nothing happens. It is a very small job (DRS --> XFM --> CSV file) and I waited a long time for it to return something, in vain.

A little tab opens on the taskbar when I "run" the job for resource estimation. Because it was not responding, I tried to close it; it then said the run had aborted and a "Resource Estimation" box popped up, but the window contains nothing except the icons.

Could you shed more light on running this utility, please?
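
My next step is to pull the job's log from the command line to see what the run actually did. Below is only a sketch of what I have in mind: it assumes the dsjob client is on the PATH on the engine host, and on 8.x installations you may also need the -domain/-user/-password/-server options, which I have left out; the project and job names are placeholders.

Code:

# Sketch: fetch the Director log summary for a job via the dsjob client.
# Assumes dsjob is on the PATH; authentication options are omitted.
import subprocess

PROJECT = "MyProject"    # placeholder: your DataStage project name
JOB = "MyTestJob"        # placeholder: the job being estimated

def log_summary(project, job):
    """Return the Director log summary for the given job."""
    cmd = ["dsjob", "-logsum", project, job]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    print(log_summary(PROJECT, JOB))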

Posted: Fri Oct 29, 2010 8:38 am
by vivekgadwal
*Update*
I did not realize that the Resource Estimation "Run" actually performs a run and logs entries in Director. When I looked there, here is what is logged:

Code:

Entry-1)
main_program: Unexpected tokens: -f RT_SC2944/OshScript.osh -escaped -pf RT_SC2944/jpfile -impexp_charset UTF-8 -string_charset UTF-8 -input_charset UTF-8 -output_charset UTF-8 -collation_sequence en -default_timestamp_format %y.

Entry-2)
main_program: Failing to create a step.

Entry-3)
main_program: Orchresest error: Parsing OSH job failed.

What do these errors mean?