The job processes around 10,000,000 records. There are 5 columns and their data types are very small.
Update Action: Insert new rows or update existing rows
I have tried an Array Size of 1 and Transaction Size values from 1 to 1000, still no luck.
When the job was running, there was enough space on the scratch disk.
Design:
Code:
Oracle --> Transformer --> DRS[MS SQL Server]
Error:
Code:
DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: Operator terminated abnormally: received signal SIGSEGV
Code:
VIRT was 4057M and RES was 3.9G when the job aborted.
Here VIRT is the virtual size of the task, which includes the size of the process's executable binary, its data area, and all loaded shared libraries.
RES is resident memory: the amount of the process's memory that currently resides in physical memory.
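For reference, these two figures can also be read outside of top with ps. This is just a sketch for checking any process; substitute the PID of the DataStage player process (the PID shown here is the current shell, used only as a stand-in):

```shell
# Print virtual size (VSZ, in KB) and resident set size (RSS, in KB)
# for a given process. "$$" is this shell's own PID, used as an example;
# replace it with the PID of the aborting DRS_STAGE player process.
ps -o vsz=,rss= -p $$
```

Watching these values grow as the job runs can show whether memory climbs steadily with row count (a leak/buffering issue) or spikes at commit time.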
Is there any way the memory usage can be optimized?
Thanks
Prasad