Hello friend,
I ran my job for a performance test.
The job reads from a sequential file and loads the data into a DB2 table. The job ran at a rate of 300 rows/sec.
I want to know whether there is a limit on the number of rows read per second from a sequential file, or whether it depends on the hardware on which we run our jobs.
How can I increase the rows-per-second rate while reading from the file?
Thanx
Speed limit while reading a sequential file.
Arun Verma
Hi,
Going by the info, it looks like a simple job reading data from a Sequential File stage and writing to a DB2 stage as the only target.
The low record-processing speed could be caused by the response time of the database.
If the database can process the records faster, the rows/sec rate will increase.
Happy DataStaging
Here's a good chance for you to experiment. Copy your job and change it to write to another sequential file instead of a database. After you run it you'll find that it is thousands of times faster.
It's funny that you assume DataStage is the slow part. You need to understand that in data processing, reading and writing files is the fastest storage method in the galaxy, while using a database for selects/inserts/updates is the slowest.
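You can see the gap outside DataStage, too. The sketch below (plain Python, using sqlite3 only as a stand-in for DB2, and per-row commits to mimic a simple OLTP-style insert job) times the same rows going to a flat file versus a database table. The exact numbers are machine-dependent, and a networked database would typically be slower still.

```python
# Compare the same rows written to a flat file vs. inserted one at a
# time into a database. sqlite3 is only a local stand-in for DB2 here.
import os
import sqlite3
import tempfile
import time

N = 100
rows = [(i, f"name_{i}") for i in range(N)]
tmpdir = tempfile.mkdtemp()

# 1) Sequential file target: plain buffered text writes.
t0 = time.perf_counter()
with open(os.path.join(tmpdir, "target.txt"), "w") as f:
    for rid, name in rows:
        f.write(f"{rid},{name}\n")
file_secs = time.perf_counter() - t0

# 2) Database target: one INSERT and one COMMIT per row.
conn = sqlite3.connect(os.path.join(tmpdir, "target.db"))
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
t0 = time.perf_counter()
for rid, name in rows:
    conn.execute("INSERT INTO t VALUES (?, ?)", (rid, name))
    conn.commit()  # each commit forces the row to disk
db_secs = time.perf_counter() - t0
conn.close()

print(f"flat file: {N / file_secs:,.0f} rows/sec")
print(f"database : {N / db_secs:,.0f} rows/sec")
```

On any ordinary machine the flat-file path comes out orders of magnitude faster, which is the point of the experiment above.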
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Or don't write it at all. Set the Transformer stage output link constraint to @FALSE, so the job is purely reading. I would expect (given that your row size is not over-large) thousands of rows per second. There are quite a few speed-ups in the functions used to access sequential files.
The throughput penalty is in DB2, not in reading the sequential file. For each row inserted, DB2 must check that no security or integrity constraints are violated, and it must update any indexes that exist on the table, not to mention maintaining the transaction log. Consider using a bulk loader instead.
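A rough illustration of why bulk/batched loading helps: same rows, same table, but per-row commits versus one batched transaction. Again sqlite3 is just a stand-in; DB2's actual LOAD utility bypasses even more per-row work than batching does.

```python
# Per-row transactions vs. one batched transaction on the same table.
# sqlite3 is a hedged stand-in for a real DB2 target.
import os
import sqlite3
import tempfile
import time

N = 100
rows = [(i, f"name_{i}") for i in range(N)]
conn = sqlite3.connect(os.path.join(tempfile.mkdtemp(), "bulk.db"))
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

# Per-row commits: every insert pays the commit-and-sync cost.
t0 = time.perf_counter()
for r in rows:
    conn.execute("INSERT INTO t VALUES (?, ?)", r)
    conn.commit()
per_row_secs = time.perf_counter() - t0

# Batched: all inserts in one transaction, a single commit at the end.
t0 = time.perf_counter()
conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
conn.commit()
batched_secs = time.perf_counter() - t0
conn.close()

print(f"per-row commits: {N / per_row_secs:,.0f} rows/sec")
print(f"single commit  : {N / batched_secs:,.0f} rows/sec")
```

The batched path wins by a wide margin because the transaction overhead is paid once rather than once per row, which is the same effect a bulk loader exploits.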
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I've managed 80,000 rows/second on my laptop with a read-only job from a sequential file. The downside is that everything else I add to the job will reduce that rate.
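If you want to try the same read-only measurement yourself outside DataStage, a minimal sketch looks like this: write a sequential file, then time a pure read pass that parses fields and counts rows but writes nothing. The rate you see is entirely machine-dependent.

```python
# Time a pure sequential-file read: parse each line, keep a count,
# produce no output. A rough analogue of a read-only DataStage job.
import os
import tempfile
import time

N = 100_000
path = os.path.join(tempfile.mkdtemp(), "input.txt")
with open(path, "w") as f:
    for i in range(N):
        f.write(f"{i},name_{i},2005-01-01\n")

t0 = time.perf_counter()
count = 0
with open(path) as f:
    for line in f:
        fields = line.rstrip("\n").split(",")  # per-row parse work
        count += 1
secs = time.perf_counter() - t0

print(f"read {count} rows in {secs:.3f}s ({count / secs:,.0f} rows/sec)")
```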
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.