Performance for multiple Lookups
Posted: Thu Apr 03, 2008 1:26 am
Hi,
My job starts with a flat file and enriches the data through 20-30 lookups.
For performance reasons I have extracted the reference data into hashed files, so the lookup source is always a hashed file.
Because I have so many lookups and hashed files, the configured memory is not big enough to hold all the data in RAM.
The pre-load to memory option is checked for every hashed file, but of course the log shows that not all of the files can be cached.
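(For context, a fully cached lookup costs just one in-memory hash probe per row, which is what pre-load to memory buys; once a file no longer fits, every probe can hit disk instead. A hypothetical Python sketch of the enrichment pattern, with made-up field names, just to illustrate the per-row cost:)

```python
# Each lookup is a key -> value map preloaded into memory,
# analogous to a hashed file cached in RAM: O(1) probe per row.
def build_lookup(rows):
    """Index reference rows by key, like loading a hashed file."""
    return {r["key"]: r["value"] for r in rows}

# Two tiny example reference sets (hypothetical data)
lookups = {
    "country": build_lookup([{"key": "DE", "value": "Germany"}]),
    "currency": build_lookup([{"key": "DE", "value": "EUR"}]),
}

def enrich(row):
    # one probe per lookup; a miss yields None instead of failing the row
    for name, table in lookups.items():
        row[name] = table.get(row["code"])
    return row

print(enrich({"code": "DE"}))  # adds country and currency fields
```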
Any ideas how I could increase the throughput for that job?
Currently it is processing 30-40 rows/s.
I tried to split the job into smaller ones with 10-15 lookups with no success.
I tried to use multiple processes for that job - with even worse performance.
Which memory setting should I increase for the best performance gain?
Any help and ideas are appreciated.
Kind regards
Michael