Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.
Moderators: chulett , rschirm , roy
thepakks
Participant
Posts: 22 Joined: Mon Aug 13, 2007 4:28 am
Location: Delhi
Post
by thepakks » Tue Aug 11, 2009 7:41 am
I have a simple job that reads a sequential file, passes it through a transformer, and then writes to two sequential files. The sequential file has about 250 million rows. I'm experiencing very low performance reading and writing the file.
I am using 5 nodes in the configuration file.
Is there any parameter to increase the performance of the DataStage job?
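For context, a 5-node parallel configuration is defined in an APT configuration file along these lines. This is only an illustrative sketch: the hostname and disk paths below are assumptions, not taken from the poster's environment. Note also that a Sequential File stage reads on a single node by default regardless of the node count, so the degree of parallelism here may not apply to the file read itself.

```
{
  node "node1"
  {
    fastname "etl_server"  /* hypothetical server hostname */
    pools ""
    resource disk "/data/ds/d1" {pools ""}          /* illustrative path */
    resource scratchdisk "/scratch/ds/s1" {pools ""} /* illustrative path */
  }
  /* nodes 2 through 5 declared the same way; spreading their
     resource and scratch disks across separate physical disks
     generally helps I/O-bound jobs like this one */
}
```

Spreading scratch and resource disks across spindles matters most when the job is I/O-bound, as a large sequential-file read/write typically is.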
Thanks in advance.
Deepak
keshav0307
Premium Member
Posts: 783 Joined: Mon Jan 16, 2006 10:17 pm
Location: Sydney, Australia
Post
by keshav0307 » Tue Aug 11, 2009 7:50 am
What do you mean by very low performance reading and writing the file?
How many records per second?
How much are you expecting?
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
Post
by chulett » Tue Aug 11, 2009 7:50 am
Define 'very low performance' and describe in detail what you are doing in the transformer.
-craig
"You can never have too many knives" -- Logan Nine Fingers
priyadarshikunal
Premium Member
Posts: 1735 Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI
Post
by priyadarshikunal » Tue Aug 11, 2009 9:28 am
In addition to that, explain the type of your file (i.e. fixed width or delimited) and
what properties are being used while reading and writing.
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.