Hi All,
I have a datastage job which completes successfully in less than 5 minutes.
Can anyone please let me know how I can find out the memory usage and CPU consumption of an ETL job at run time?
Thanks
Mark
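Since a parallel job runs as ordinary operating-system processes, one simple way to watch its memory while it runs (outside any DataStage tooling) is to sample `/proc` for the job's process IDs. A minimal Linux-only sketch, assuming you have already found the relevant PIDs (e.g. with `ps` or `pgrep`):

```python
import os

def rss_kb(pid):
    """Resident set size in kB for a process, read from /proc (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # /proc reports the value in kB
    return 0

# Sample our own process as a demonstration; in practice you would pass
# the PIDs of the job's engine processes and poll in a loop while it runs.
print(rss_kb(os.getpid()))
```

CPU time per operator is also written to the job log when the job finishes, so polling is only needed if you want figures during the run.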
Search found 262 matches
- Fri May 06, 2011 5:57 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: datastage memory usage
- Replies: 3
- Views: 4729
- Sun May 01, 2011 3:33 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: datastage log
- Replies: 6
- Views: 2653
datastage log
Thanks Ray and James for clarifying.
- Sun May 01, 2011 1:02 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: datastage log
- Replies: 6
- Views: 2653
datastage log
Thank you. The total CPU is 163.23. Does this mean the total CPU usage is more than 100%? How do I bring the usage down to less than 100%?
Thanks
Mark
- Sun May 01, 2011 10:59 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: datastage log
- Replies: 6
- Views: 2653
datastage log
Hi All, Can someone please let me know what these numbers mean? Job design: Fileset ---> transformer --> seq stage. Tfm,1: Operator completed. status: APT_StatusOk elapsed: 1885.14 user: 158.51 sys: 4.72 (total CPU: 163.23) tfm,0: Operator completed. status: APT_StatusOk elapsed: 1885.22 user: 157.80 sys:...
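A note on reading these numbers: `user` and `sys` are CPU *seconds*, not percentages, so `total CPU: 163.23` is seconds of CPU time, and there is nothing to bring "below 100%". Utilization is the CPU time divided by the elapsed wall-clock time. A quick check using the figures from the log above:

```python
# Figures reported for tfm,1 in the operator-completed message above.
elapsed = 1885.14            # wall-clock seconds
user, sys_ = 158.51, 4.72    # CPU seconds in user and kernel mode

total_cpu = user + sys_               # matches the reported "total CPU: 163.23"
utilization = 100.0 * total_cpu / elapsed
print(round(total_cpu, 2), round(utilization, 1))  # 163.23 8.7
```

So this operator was busy on CPU for only about 9% of its elapsed time; the rest was presumably spent waiting on I/O or upstream operators.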
- Fri Apr 15, 2011 11:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Internal Error
- Replies: 2
- Views: 1460
data stage error
I saved the job to another name, recompiled and it ran successfully.
- Fri Apr 15, 2011 10:58 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Internal Error
- Replies: 2
- Views: 1460
Internal Error
Can anyone please tell me what this error message means? main_program: Internal Error: (nP==nC): sc/sc.C: 4050 Traceback: assert.APT_FatalPath::pureAssertion(const char*,const char*,int)() at 0x90000000208b0d8 APT_SC::IPC_DataSet::computeStraightThroughOptimization()() at 0x9000000017630a4 APT_IR::Sc...
- Fri Mar 11, 2011 12:08 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Text Data
- Replies: 1
- Views: 975
Text Data
I have incoming data as 8772794193.
The expected output is 877-279-4193
Can anyone please suggest how I can accomplish this?
Regards
Mark
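The string logic for this is just three substrings joined with hyphens; inside a Transformer stage the same thing is typically expressed with substring derivations. A minimal Python sketch of the logic (the function name is mine, purely for illustration):

```python
def format_phone(digits):
    """Format a 10-digit number string as NNN-NNN-NNNN."""
    assert len(digits) == 10 and digits.isdigit(), "expected exactly 10 digits"
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

print(format_phone("8772794193"))  # 877-279-4193
```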
- Wed Feb 23, 2011 12:17 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: transformer error message
- Replies: 0
- Views: 1038
transformer error message
Hi all, I am seeing this error message at a transformer stage in my parallel job. Can anyone please suggest what might be the issue here? The current soft limit on the data segment (heap) size (1610612736) is less than the hard limit (9223372036854775807), consider increasing the heap size limit ...
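The message itself points at the fix: the soft data-segment (heap) limit for the user running the engine is well below the hard limit, and the usual remedy is to raise it (e.g. `ulimit -d` in that user's shell profile). A sketch of the same soft/hard limit mechanics using Python's stdlib `resource` module, shown only to illustrate how the two limits relate:

```python
import resource

# The soft limit is what processes actually hit; the hard limit is the
# ceiling an unprivileged user may raise the soft limit up to.
soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
print("soft:", soft, "hard:", hard)

# Raising the soft limit to the hard limit needs no special privileges;
# for DataStage this is normally done via ulimit for the engine's user.
resource.setrlimit(resource.RLIMIT_DATA, (hard, hard))
```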
- Fri Feb 11, 2011 10:05 am
- Forum: General
- Topic: Datastage jobs Using Oracle Table
- Replies: 4
- Views: 4071
Datastage jobs Using Oracle Table
Actually we haven't saved the metadata for this particular table. There are around 500 DataStage jobs in the project, so I am trying to find out if there is any way I can use SQL to query and find the list of all DataStage jobs using a particular Oracle table.
- Fri Feb 11, 2011 9:40 am
- Forum: General
- Topic: Datastage jobs Using Oracle Table
- Replies: 4
- Views: 4071
Datastage jobs Using Oracle Table
Can anyone please suggest how I can find the list of all DataStage jobs using a particular Oracle table, say "employee"?
Thanks
Mark
- Wed Nov 10, 2010 4:23 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: datastage warnings
- Replies: 1
- Views: 1896
datastage warnings
Hi All,
I find these warnings in my job log from DataStage Director. Can someone please tell me how I can fix these warnings, and what is causing them?
Failed to initialize job monitoring. Monitor information will not be generated.
Failed to connect to JobMonApp on port 13401
Thanks
Mark
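These warnings generally mean the job monitor process (JobMonApp) is not running or not reachable on its configured port, so the job runs fine but produces no monitor data; restarting the job monitor usually clears it. A first diagnostic step is simply checking whether anything is listening on that port, sketched here in Python (host and port are the ones from the warning above):

```python
import socket

def port_listening(host, port, timeout=1.0):
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the port from the "Failed to connect to JobMonApp" warning.
print(port_listening("localhost", 13401))
```

If this prints False on the engine host, the monitor process is down (or firewalled) and restarting it is the place to start.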
- Wed Oct 27, 2010 12:57 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: output
- Replies: 5
- Views: 2046
output
I received output as pe
desired output: sony xxxxxx europe (after eliminating comma and extra spaces)
Thanks
Mark
- Wed Oct 27, 2010 12:27 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: output
- Replies: 5
- Views: 2046
output
input: , sony xxxxxx europe
desired output: sony xxxxxx europe (after eliminating comma and extra spaces)
- Wed Oct 27, 2010 12:19 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: output
- Replies: 5
- Views: 2046
output
I have input data as , sony xxxxxx europe
desired output: sony xxxxxx europe
Can anyone please suggest how to accomplish this?
Thanks
Mark
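The string logic is: drop the comma, then collapse runs of whitespace and trim the ends. Inside a Transformer this is typically done with the built-in Convert and Trim functions; the Python sketch below (function name mine) just shows the transformation itself:

```python
import re

def clean_name(s):
    """Drop commas and collapse runs of whitespace to single spaces."""
    return re.sub(r"\s+", " ", s.replace(",", " ")).strip()

print(clean_name(" , sony   xxxxxx  europe "))  # sony xxxxxx europe
```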
- Thu Jul 01, 2010 2:04 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: dummy column
- Replies: 4
- Views: 3163
Re: dummy column
Can I create a column "dummy" using the Sequential File stage and assign it the value 'z' (or something else)?
In the Sequential File stage I was able to create a column, but I could not assign a value to that column.
thanks
Mark