Search found 90 matches

by theverma
Sat Nov 11, 2006 8:33 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Processing a Sequential file and Database
Replies: 7
Views: 1738

Hi Craig,
Can I use the DB2 Bulk Loader stage to upload the file created after doing all the validation/processing?
I went through the DB2 Bulk Loader stage properties, but there are a lot of parameters that we have to specify.
Also, can you give me any documentation on the DB2 Bulk Loader stage?

Thanks!
by theverma
Sat Nov 11, 2006 6:34 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job runs on a single processor
Replies: 10
Views: 2687

So Ray, are you saying that because I have a server job it will not use all the processors, even when it runs on a multiprocessor server?
by theverma
Sat Nov 11, 2006 6:32 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Processing a Sequential file and Database
Replies: 7
Views: 1738

Processing a Sequential file and Database

Hello friends, I have a job that reads a sequential file containing 2 million rows and, after some processing/validations, inserts the data into the DB2 database (DB2 stage). Can anybody suggest which option would be the better one: reading the data from the sequential file and doing all the valida...
by theverma
Fri Nov 10, 2006 2:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job runs on a single processor
Replies: 10
Views: 2687

Hi Ray,
Can you tell me how to find out which link in a job is slowing down the job's processing? I am using Hashed File stages for some lookups.

Thanks
by theverma
Fri Nov 10, 2006 11:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job runs on a single processor
Replies: 10
Views: 2687

In the Monitor window, the %CP shown is less than 100%, and for some stages it is very low, around 20%. Can you tell me whether my perception is right or wrong that Ascential works row by row? I mean that data flows row by row, not in batches of 5 or 10 rows or something. Is there any other way to flow t...
by theverma
Fri Nov 10, 2006 11:22 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job runs on a single processor
Replies: 10
Views: 2687

Hi. But I am using 6 transformers in my job. The job design is something like this: Seq File -> xformer1 -> xformer2 -> xformer3 -> xformer4 -> DB2 Table. At xformer1, xformer2, xformer3 and xformer4 I am doing some lookups to fetch some values. So according to you it should use 4 CPUs, but the full job is runn...
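The reasoning behind "it should use 4 CPUs" is that each active stage could run as its own operating-system process and so land on its own processor; whether Server job stages actually split into separate processes depends on settings such as inter-process row buffering. A rough sketch of that idea outside DataStage, using Python processes connected by queues (the stage functions here are stand-ins, not DataStage code):

Code:
from multiprocessing import Process, Queue

SENTINEL = None  # marks end of data on a queue

def reader(out_q):
    """Stand-in for the sequential file stage."""
    for i in range(10):
        out_q.put(i)
    out_q.put(SENTINEL)

def transformer(in_q, out_q):
    """Stand-in for one transformer stage; runs as its own process."""
    while (row := in_q.get()) is not SENTINEL:
        out_q.put(row * 2)  # placeholder transformation
    out_q.put(SENTINEL)

def writer(in_q):
    """Stand-in for the DB2 stage."""
    while (row := in_q.get()) is not SENTINEL:
        print("insert", row)

if __name__ == "__main__":
    q1, q2 = Queue(), Queue()
    stages = [Process(target=reader, args=(q1,)),
              Process(target=transformer, args=(q1, q2)),
              Process(target=writer, args=(q2,))]
    for p in stages:
        p.start()
    for p in stages:
        p.join()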
by theverma
Fri Nov 10, 2006 10:50 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job runs on a single processor
Replies: 10
Views: 2687

Job runs on a single processor

Hello friends,
My job uses only a single processor, even while running on a multiprocessor server.
Do I have to check some checkbox to run my job on all the available processors? I have already checked the 'Multiple Instance' checkbox in job properties.

Thanks in advance!
by theverma
Thu Nov 09, 2006 3:59 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Performance issue with DB2 Stage
Replies: 4
Views: 1413

Performance issue with DB2 Stage

Hello friends, I am loading a sequential file input into DB2 tables after some lookups and validations. The job starts at a good rate, but as the size of the table (the data in the table) increases, the rate decreases, and it has now dropped to around 45 rows/sec. My job is handling 3 million rows. Also made m...
by theverma
Tue Nov 07, 2006 1:07 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Value of Array Size while inserting into a DB2 Table
Replies: 5
Views: 1269

OK.
Can you suggest a value for Array Size/Transaction Size?
My job has to handle approx. 30 million rows.

Thanks!
by theverma
Tue Nov 07, 2006 1:02 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Value of Array Size while inserting into a DB2 Table
Replies: 5
Views: 1269

Value of Array Size while inserting into a DB2 Table

Hello friends, I am inserting data from a sequential file into the database using the DB2 stage (SQL Action: Insert rows without clearing). I am using an Array Size value of 1 and a Transaction Size value of 100. The rate of insert into the table is not very promising. Should I increase the value of the Array Size a...
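For context on these two settings: Array Size is roughly how many rows are sent to the database per operation, and Transaction Size is how many rows are processed between commits. A minimal sketch of the same idea outside DataStage, assuming a generic Python DB-API connection (the table and column names are made up for illustration):

Code:
ARRAY_SIZE = 500          # rows sent to the database per execute call
TRANSACTION_SIZE = 5000   # rows between commits

def load(conn, rows):
    """Insert rows in batches, committing every TRANSACTION_SIZE rows."""
    cur = conn.cursor()
    batch, since_commit = [], 0
    for row in rows:
        batch.append(row)
        if len(batch) == ARRAY_SIZE:
            cur.executemany("INSERT INTO my_table (col1, col2) VALUES (?, ?)", batch)
            since_commit += len(batch)
            batch = []
            if since_commit >= TRANSACTION_SIZE:
                conn.commit()
                since_commit = 0
    if batch:  # flush the final partial batch
        cur.executemany("INSERT INTO my_table (col1, col2) VALUES (?, ?)", batch)
    conn.commit()

With an Array Size of 1, every row is a separate round trip to DB2, which is usually the main reason the insert rate looks poor.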
by theverma
Fri Nov 03, 2006 9:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading file from Mainframes
Replies: 1
Views: 638

Reading file from Mainframes

Hi,
Here we have Ascential on Windows and we want to read a file from the mainframe while running jobs on Windows. How do we specify the pathname in the Sequential File stage when the file resides on the mainframe?
Can anybody tell me how to do this?

Thanks in advance!
by theverma
Fri Nov 03, 2006 6:50 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Speed limit while reading a sequential file.
Replies: 5
Views: 3427

Speed limit while reading a sequential file.

Hello friends, I ran my job for performance testing purposes. The job reads from a sequential file and loads it into a DB2 table. The job ran at a rate of 300 rows/sec. I want to know whether there is any limit on the number of rows read per second from a sequential file, or whether it depends on the hardware on which we are r...
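One way to tell whether the 300 rows/sec ceiling comes from reading the sequential file or from the DB2 insert is to time the raw read on its own. A minimal sketch, assuming a plain text file at a hypothetical path:

Code:
import time

def measure_read_rate(path):
    """Time a plain sequential read of the file, with no database work at all."""
    start = time.time()
    rows = 0
    with open(path, "r") as f:
        for _ in f:
            rows += 1
    elapsed = max(time.time() - start, 1e-6)
    print(f"{rows} rows in {elapsed:.1f}s -> {rows / elapsed:.0f} rows/sec")

measure_read_rate("C:/data/input_file.txt")  # hypothetical path, for illustration only

If the raw read is orders of magnitude faster than 300 rows/sec, the sequential file is not the bottleneck and the DB2 insert side is where to look.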
by theverma
Wed Nov 01, 2006 7:57 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Aggregator Functioning
Replies: 3
Views: 940

Thanks, KcBland!
by theverma
Wed Nov 01, 2006 6:28 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Aggregator Functioning
Replies: 3
Views: 940

Aggregator Functioning

Hello friends, can you tell me how much data the Aggregator processes for aggregation (sorting, computing totals, etc.)? As far as I know, the Aggregator stores the incoming data in memory and then processes that data accordingly for the output. My doubt is that if I am using the Aggregator for summing/grouping then...
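The memory concern here usually comes down to whether the input is sorted on the grouping keys: if it is not, an aggregator has to hold every group's running total in memory until the last row arrives; if it is, each group can be emitted as soon as the key changes. A minimal sketch of the two behaviours in plain Python (the data and field layout are made up for illustration):

Code:
from collections import defaultdict
from itertools import groupby
from operator import itemgetter

def aggregate_unsorted(rows):
    """Unsorted input: every group's total stays in memory until the end."""
    totals = defaultdict(float)
    for key, amount in rows:
        totals[key] += amount        # memory grows with the number of distinct groups
    return dict(totals)

def aggregate_sorted(rows):
    """Input pre-sorted on the key: each group is emitted as soon as it completes."""
    for key, group in groupby(rows, key=itemgetter(0)):
        yield key, sum(amount for _, amount in group)   # only one group held at a time

rows = [("A", 10.0), ("B", 2.5), ("A", 5.0)]            # illustrative data
print(aggregate_unsorted(rows))
print(list(aggregate_sorted(sorted(rows))))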
by theverma
Tue Oct 31, 2006 9:09 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Avoid 'Run Stop' even on fatal Error
Replies: 5
Views: 2234

Thanks Ray.
Can you please explain this, as I am not able to see your full message.