Poor performance when updating SQL Server table
Posted: Tue Jul 14, 2009 11:21 pm
Hi,
I'm implementing SCD2 processing against a target SQL Server table.
Excerpt from the job:
Code: Select all
xfm --> Update --> Target Table
xfm --> Close --> Target Table
xfm --> Insert --> Target Table
The Update/Close/Insert links all come from the same xfm and go to the same Target Table stage. In other words: one xfm stage, three links, one target table stage.
I had to set the transaction size on the target table from 0 to 1 to prevent deadlocks (see viewtopic.php?t=128156&highlight=, although that was a different job and issue).
The target table currently holds around 172K rows, and it will only get larger in production. In the scenario described here, every transaction is an update, i.e. all rows go down the Update link. I've just run the job twice: the first run was all inserts (good performance), the second all updates (horrid performance).
I'm getting horrendous performance in this scenario: around 5 rows/sec, which for 172K rows works out to roughly 34,400 sec = 573 min, or about 9.5 hours per run!
Any advice? Should I write each link to a sequential file or a working table in SQL Server, then run the Update/Close/Insert from those? Or, since I'm updating every column in the target table, would Replace Existing Rows Completely give better performance?
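To make the working-table idea concrete, here's the kind of thing I'm picturing for the Update link: bulk-load the link's rows into a working table, then apply them in one set-based statement instead of 172K single-row updates. All table and column names below are placeholders, not my real schema:

```sql
-- Rough sketch of the working-table approach (hypothetical names).
-- Step 1 (not shown): bulk-load the Update link's rows into dbo.Work_Update.
-- Step 2: apply all updates in a single set-based statement,
--         matching on the SCD2 key (business key + effective start date).
UPDATE tgt
SET    tgt.col1       = wrk.col1,
       tgt.col2       = wrk.col2,
       tgt.eff_end_dt = wrk.eff_end_dt
FROM   dbo.TargetTable AS tgt
       INNER JOIN dbo.Work_Update AS wrk
               ON wrk.business_key = tgt.business_key
              AND wrk.eff_start_dt = tgt.eff_start_dt;
```

Presumably the Close link could use the same pattern, and the Insert link could stay a straight bulk insert.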
Thanks,
Scott