Memory Error

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Memory Error

Post by prasad111 »

Hi,

The job processes around 10,000,000 records. There are 5 columns and their data types are quite small.
Update Action: Insert new rows or update existing rows
I have tried an Array Size of 1 and Transaction Sizes from 1 to 1000; still no luck.
When the job was running, there was enough space on the scratch disk.
Design:

Code: Select all

Oracle --> Transformer --> DRS[MS SQL Server]
I set APT_DISABLE_COMBINATION to True, and the error is reported by the DRS stage.
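For reference, a minimal sketch of enabling these diagnostics from the shell that launches the job; APT_DISABLE_COMBINATION and APT_DUMP_SCORE are standard PX environment variables, but the dsjob invocation, project, and job names below are hypothetical examples:

```shell
# Export PX diagnostics in the session that starts the job.
# APT_DISABLE_COMBINATION keeps operators uncombined so an error is
# attributed to the real stage; APT_DUMP_SCORE writes the runtime score
# (inserted sorts, partitioners, operator-to-node mapping) to the log.
export APT_DISABLE_COMBINATION=True
export APT_DUMP_SCORE=True

# Example invocation (project/job names are hypothetical):
# $DSHOME/bin/dsjob -run -jobstatus MyProject MyLoadJob

echo "$APT_DISABLE_COMBINATION $APT_DUMP_SCORE"
```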
Error:

Code: Select all

DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: clntudp_create: out of memory
DRS_STAGE,0: Operator terminated abnormally: received signal SIGSEGV
When I monitored the memory usage from the top command:

Code: Select all

VIRT was 4057M and RES was 3.9G when the job aborted.
VIRT is the virtual size of the task. This includes the size of the process's executable binary, the data area, and all loaded shared libraries.
RES is resident memory: the amount of process memory that currently resides in physical memory.
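As a side note, this kind of sampling can be scripted rather than watched interactively in top; a minimal sketch, using the current shell's PID as a stand-in for the operator process:

```shell
# Sample the virtual (VSZ) and resident (RSS) sizes, in KiB, of a
# process with ps. Replace $$ with the PID of the DRS operator
# process (e.g. found via ps -ef | grep osh).
pid=$$
ps -o vsz=,rss= -p "$pid"
```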
Should the virtual memory keep growing until the job finishes? As the job is already using most of the memory, I don't want to increase the APT_BUFFER* sizes and crash the system.
Is there any way the memory usage can be optimized?

Thanks
Prasad
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Although I haven't had my morning coffee yet, it would seem that 4057M is quite close to 3.9G: your resident (physical) memory is nearly equal to your virtual memory, meaning no swap is being used!
Does your job repartition or sort the data? This would explain larger memory use.
prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Post by prasad111 »

Thanks for the response.

I have tried both ways:
1) Partitioned the data with the default 2-node configuration
2) Ran the job sequentially
In both cases it errors out at the same point.

Thanks
Prasad
sjfearnside
Premium Member
Posts: 278
Joined: Wed Oct 03, 2007 8:45 am

Post by sjfearnside »

What is the layout of your scratch space?

When you run on 2 nodes, what is the definition for the configuration file?

Running on one node will put all the sort data on one scratch space/mountpoint.

I have noticed that when there is more than one node in the config file and the partition definition is Auto, it does not always distribute the data evenly.

How many mountpoints are there for the scratch space?

If you have multiple mountpoints defined for the scratch space, then define a configuration file that allocates all of them, set the partitioning to round robin to ensure equal distribution of data across the available nodes, and see if the job will run. Monitor the scratch space during execution.
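To illustrate, a hedged sketch of what such a configuration file might look like; the hostname and mountpoint paths below are examples, not taken from this thread:

```shell
# Write an example 2-node configuration that spreads scratch space
# across two mountpoints (all hostnames and paths are hypothetical).
cat > /tmp/two_node_scratch.apt <<'EOF'
{
  node "node1"
  {
    fastname "dshost"
    pools ""
    resource disk "/data1/datasets" {pools ""}
    resource scratchdisk "/scratch1" {pools ""}
  }
  node "node2"
  {
    fastname "dshost"
    pools ""
    resource disk "/data2/datasets" {pools ""}
    resource scratchdisk "/scratch2" {pools ""}
  }
}
EOF
# Point APT_CONFIG_FILE at it for the job run:
# export APT_CONFIG_FILE=/tmp/two_node_scratch.apt
grep -c scratchdisk /tmp/two_node_scratch.apt   # prints 2: one scratch entry per node
```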
prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Post by prasad111 »

Thanks for the reply.

There is no sort in the job; it is a straight data load: Oracle --> Transformer --> DRS Stage [SQL Server].

There is only one mountpoint for the scratch space. I was also monitoring it: there was enough space, and no files were created in scratch or the /tmp folder.

My concern is: when we do an upsert [update or insert], each commit should release the memory [virtual memory space]... is my understanding correct?
miwinter
Participant
Posts: 396
Joined: Thu Jun 22, 2006 7:00 am
Location: England, UK

Post by miwinter »

Where does your SQL Server install reside? Is it on the same box as the DataStage install? When you were monitoring your VM use, where were you doing so?
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
sjfearnside
Premium Member
Posts: 278
Joined: Wed Oct 03, 2007 8:45 am

Post by sjfearnside »

Did you do a dump score to verify that DataStage is not inserting any sorts?
prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Post by prasad111 »

DataStage is on Linux and SQL Server is on Windows; I was monitoring the VM usage with the top command.

Yes, I have looked into the dump score; it was not inserting any sorts.

Any suggestions are highly appreciated.

Thanks,
Prasad
sjfearnside
Premium Member
Posts: 278
Joined: Wed Oct 03, 2007 8:45 am

Post by sjfearnside »

Ask the DBA to watch the job execution to verify there is no problem with the DB running out of memory.
prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Post by prasad111 »

No problem on the database side; everything is running fine...
prasad111
Premium Member
Posts: 173
Joined: Fri May 19, 2006 10:53 am

Post by prasad111 »

We started using the ODBC stage instead, and it works fine.