vimali balakrishnan wrote:
Can anyone clarify Transaction Size and Array Size for me?
Details about them can be found in the PDF documents as well as in the DataStage help. You should try to find this kind of simple detail yourself and post a thread when you have a genuine technical difficulty.
Transaction Size: This is the number of rows written before the data is committed to the target table. The default value is 0, meaning that all the rows are written before a single commit at the end.
Array Size: This is the number of rows written in a single operation. The default is 1, meaning that each row is written in a separate operation. If the current Array Size setting would cause the available storage space to be exceeded at run time, you are informed when you compile the job.
Note: If the Array Size setting conflicts with the Rows per transaction setting, the former takes precedence.
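To make the interaction between the two settings concrete, here is a small sketch (not DataStage itself, just an analogy): rows are sent to the database in batches of Array Size, and a commit is issued once a Transaction Size's worth of rows has been written. It uses sqlite3 as a stand-in target table, and the names and values are illustrative assumptions.

```python
import sqlite3

# Illustrative analogy for the two settings (assumed values, not defaults):
ARRAY_SIZE = 100         # rows sent per write operation (executemany call)
TRANSACTION_SIZE = 1000  # rows per commit; 0 means commit only at the end

def load(rows, conn, array_size=ARRAY_SIZE, txn_size=TRANSACTION_SIZE):
    cur = conn.cursor()
    written = 0
    for i in range(0, len(rows), array_size):
        batch = rows[i:i + array_size]            # one "array" of rows
        cur.executemany("INSERT INTO t VALUES (?)", batch)
        written += len(batch)
        # Commit whenever a full transaction's worth of rows has been written
        # (assumes txn_size is a multiple of array_size for simplicity).
        if txn_size and written % txn_size == 0:
            conn.commit()
    conn.commit()  # final commit; this is the only commit when txn_size == 0
    return written

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
n = load([(i,) for i in range(2500)], conn)
```

The trade-off this models: a larger array size reduces the number of round trips to the database, while a larger transaction size reduces commit overhead at the cost of holding more uncommitted work (and redoing more rows if the job fails mid-run).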
vimali balakrishnan wrote:
What values should be given to these parameters to improve performance?
There are no fixed values for these parameters; finding the optimum is a matter of analysis and experimentation.
By analyzing the record length and the average data volume, set an initial value and measure the resulting performance.
Run the job several times, increasing or decreasing the value of each parameter on each run, and maintain a table of statistics for each run.
Finally, decide on an optimum value from the list of runs.
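The tuning procedure above can be sketched as a simple loop: try several combinations of the two parameters, record the elapsed time of each run in a statistics table, and pick the fastest. This is a hedged illustration only; the `run_job` function below is a toy sqlite3 load standing in for a real DataStage job, and the candidate values are assumptions.

```python
import itertools
import sqlite3
import time

def run_job(array_size, txn_size, n_rows=5000):
    """Toy stand-in for one run of the real job: batch-insert n_rows
    with the given array size and transaction size, return elapsed seconds."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (x INTEGER)")
    rows = [(i,) for i in range(n_rows)]
    start = time.perf_counter()
    cur = conn.cursor()
    for i in range(0, n_rows, array_size):
        cur.executemany("INSERT INTO t VALUES (?)", rows[i:i + array_size])
        if txn_size and (i + array_size) % txn_size == 0:
            conn.commit()
    conn.commit()
    return time.perf_counter() - start

stats = []  # the "table with statistics for each run"
for array_size, txn_size in itertools.product([1, 50, 500], [0, 1000]):
    elapsed = run_job(array_size, txn_size)
    stats.append((array_size, txn_size, elapsed))

# Choose the combination with the lowest elapsed time.
best = min(stats, key=lambda row: row[2])
```

In practice you would also want to repeat each combination a few times and average, since single-run timings against a real database are noisy.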