Hi all,
I have a very simple job that reads a sequential file (40 million rows at roughly 1,500 bytes/row, about 60 GB in total). It goes through a Modify stage to eliminate a few rows and convert a date into a timestamp, and then loads the data into SQL Server using the ODBC Enterprise stage.
As the job runs it takes more and more memory (8 GB of swap) until it runs out of memory and is killed by the OS (AIX).
Is there a way to set a limit on the amount of memory DataStage will use?
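One partial workaround, while waiting for a real fix, is to cap the memory of the shell that launches the job so the leak kills only that job instead of dragging the whole box into swap. A minimal sketch, assuming the job is started from a shell (e.g. via dsjob) and that the 4 GB figure is just a placeholder to adjust for your machine:

```shell
# Cap the data segment of processes started from this shell (values in KB).
# This does not fix the leak; it only makes the job fail with a memory
# error earlier, before the OS starts killing things.
ulimit -d 4194304   # hypothetical 4 GB data-segment cap; tune for your box
ulimit -d           # echo the limit back to confirm it took effect
# The job launcher (e.g. dsjob -run ...) started from this shell would
# inherit the limit.
```

Note that limits can only be lowered, not raised, by an unprivileged shell, and the exact flags honoured (`-d`, `-m`, `-v`) vary by OS.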
thanks all
Datastage is eating all my memory
kduke wrote: You can work around it by splitting the feed into multiple files or feeds.

I thought about that. It is a real pain to have to split the file into five parts and run five separate jobs, though. Ideally you would want to split based on the size of the file, because the input file is likely to grow over time. That means you would have to create some kind of script to handle the splitting.
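The "some kind of script" part is straightforward if the file is line-oriented: count the lines, divide by the number of parts, and let `split` do the rest, so the part size tracks the input size automatically. A minimal sketch, with a hypothetical file name and part count, using a tiny generated file as a stand-in for the real 40-million-row feed:

```shell
# Split a sequential file into PARTS roughly equal pieces by line count,
# so each piece can feed a separate job run.
PARTS=5
INPUT=demo_input.txt
seq 1 40 > "$INPUT"                      # stand-in for the real input file
total=$(wc -l < "$INPUT")
per=$(( (total + PARTS - 1) / PARTS ))   # ceiling division: lines per part
split -l "$per" "$INPUT" part_           # emits part_aa, part_ab, ...
ls part_* | wc -l                        # → 5
```

Because the chunk size is computed from the current line count, the same script keeps producing five parts as the feed grows.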
Ultramundane wrote: Don't know if this is related or not... I just got a patch for memory leaks in the datetime and numeric/decimal datatypes. We encountered the leaks when working with the Sybase, Teradata, and ODBC stages in PX. The patch was for DS751A on AIX, Ecase 76017.

It is AIX + ODBC + Datetime in our case too, but the leaky code is apparently used in other stages as well. We talked to a tech and he promised to send us a patch; I am hoping to get it soon.