SIMPLE DS job taking 90% CPU??

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

wwalker
Premium Member
Posts: 40
Joined: Thu Mar 30, 2006 6:30 am
Location: Near Geneva, Switzerland

SIMPLE DS job taking 90% CPU??

Post by wwalker »

Hello, all,

I have a simple (like...SIMPLE) job:

ORA8 -->IP-->TF-->SEQ-->IP-->ODBC

It takes 10 attributes from an Oracle DB, passes them through a Transformer (no transformation), writes them to a sequential staging file, then writes to the AS/400 via ODBC.

The AS/400 record is a bit long, but not unusually so: 58 attributes, most of which are effectively empty (i.e. populated with ' ' or 0). Nothing overly special or unusual about attribute length or type...typical....

The problem is this: throughput is 20 rec/sec, and the Transformer stage is taking 99% CPU.

The only thing that, in my opinion, makes this job special...is that it is so simple.

Checked the server load...nothing special happening on the DEV server currently...but lots of paging going on....

Does this seem as weird to my esteemed colleagues as it does to me?!?
Wade Walker
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Set a constraint of "1=2" on the output of your Transformer stage. Does it still process only 20 rows/sec at high CPU utilization? If yes, then you have an issue. In general, high CPU utilization in DataStage is a good thing, not a problem.
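
A minimal sketch of that test (the link name here is only an illustration, not something from the actual job): in the Transformer's constraint grid, give the output link that feeds the sequential stage, say ToStaging, the constraint

    1 = 2

(or equivalently the BASIC system variable @FALSE). With that constraint nothing is written downstream, so if the job still crawls at ~20 rows/sec and ~99% CPU, the cost is in the Oracle read or the Transformer itself rather than in the sequential/ODBC writes.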
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Any user-written functions being used on a column? Maybe those functions are the issue.
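
Purely for illustration (the routine, its argument and the file it reads are hypothetical, not anything from the job above), the kind of user-written routine that typically cripples a Transformer is one that does file I/O or an OS call on every row. A transform function body along these lines would do it:

    * Hypothetical DataStage BASIC transform function body, argument InKey.
    * Opening, scanning and closing a file for every row processed is a
    * classic way to end up CPU-bound at a handful of rows per second.
    Ans = ""
    OpenSeq "/tmp/lookup.txt" To LookupFile Then
       Loop
          ReadSeq Line From LookupFile Else Exit
          If Field(Line, ",", 1) = InKey Then
             Ans = Field(Line, ",", 2)
             Exit
          End
       Repeat
       CloseSeq LookupFile
    End

If anything of that shape is hiding in a column derivation, that would explain the symptom.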
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
jdmiceli
Premium Member
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

Post by jdmiceli »

Just as an experiment, try landing the data instead of sending it directly from source to target in the same job. Send the source data to a sequential file and then put together a job to push the landed data to the target.
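
In the same notation as the original flow, the split might look something like this (a sketch only; where the IPC stages end up is an assumption):

    Job 1:  ORA8 --> IP --> TF --> SEQ     (extract from Oracle, land to a flat file)
    Job 2:  SEQ --> IP --> ODBC            (read the landed file, load the AS/400)

Timing the two halves separately also tells you straight away whether the slow part is the Oracle read or the ODBC write.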

If you are using the ODBC Stage, consider trying the RDBMS stage. I know it is still ODBC under the covers, but it seems to work better for us.

Hope that helps!
Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
wwalker
Premium Member
Posts: 40
Joined: Thu Mar 30, 2006 6:30 am
Location: Near Geneva, Switzerland

Post by wwalker »

Hi,

Thanks for your responses. In short:

- I have increased the transaction size in the Oracle stage to 2000, which has substantially increased throughput...

- There are NO interesting user functions

- I have already landed the data - I was also thinking it might be best to break the flow to see the result...at least there is now a staging point....

- ODBC is the only option - the client does not want a DB/400 client on the DS server.

I would say I have reasonable throughput now...~1000 rec/second, so given the simplicity, requirements and scope of the job, it is deliverable.... However, I continue to be surprised by the processor consumption/throughput ratio.

Thanks for your input to date

W
Wade Walker