processing large resultsets
Posted: Tue Jun 03, 2008 3:26 pm
Hi there,
I am processing a large result set (6M+ records) through the USPREP ruleset.
My question is: what is the best way to run such a result set through the Standardize stage efficiently? Right now I'm processing at 1,240 rec/sec.
It appears that the Standardize stage is really slow. I can understand why; it's doing a lot of work. I'm just wondering what some of the best practices are for making this run efficiently.
Thanks.