Hi,
We have been getting the following error for quite some time.
We have a 4-node config file (4NodeMPP.apt). The source file has around 10 million records. The job reads from a flat file and writes to a flat file. The design includes a Lookup stage that looks up a table to get an ID; apart from that there isn't much to the design.
Earlier, as a workaround, we split the files and loaded the data in pieces.
Can anyone suggest where we need to look further? Any ideas?
APT_CombinedOperatorController(2),0: Caught exception from runLocally(): APT_BadAlloc: Heap allocation failed..
As of now, we are working with the DBAs as well. Let me know if you need more inputs.
Thanks,
Heap Allocation Error
You can search this forum to get better results.
"Heap allocation failed" means that it couldn't get more memory when it demanded it. So it has spilled to disk, and continued to process, albeit not as fast.
Check the disk space and memory using commands like df and top. Also try running as root, since root is generally granted more authority.
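For example, something along these lines (the /tmp path is only an example; substitute the resource and scratch disk paths from your 4NodeMPP.apt config, and note that on AIX the usual tool is topas rather than top):

```shell
# Free disk space on the scratch/resource disks used by the parallel engine
# (/tmp here is an example path -- use the disks named in your config file)
df -k /tmp

# Snapshot of memory usage and the busiest processes
# ("top" run in batch mode for a single iteration; on AIX use topas)
top -b -n 1 | head -20
```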
Especially check the ulimit settings for data and stack size for the user ID under which DataStage jobs run. Get your UNIX administrator to set both (and the file size ulimit) to unlimited.
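A minimal sketch of checking and raising those limits for the current session (the limits.conf paths shown are the usual Linux/AIX locations, but confirm the right mechanism with your administrator):

```shell
# Show all current (soft) limits for this user
ulimit -a

# The three limits most relevant to a heap allocation failure:
ulimit -d   # max data segment size
ulimit -s   # max stack size
ulimit -f   # max file size

# Try to raise them for this session; the hard limit may cap this, in which
# case a permanent change (e.g. /etc/security/limits.conf on Linux, or
# /etc/security/limits on AIX) needs the UNIX administrator.
ulimit -d unlimited 2>/dev/null || echo "could not raise data limit (hard limit?)"
ulimit -s unlimited 2>/dev/null || echo "could not raise stack limit (hard limit?)"
ulimit -f unlimited 2>/dev/null || echo "could not raise file size limit (hard limit?)"
```

Remember these apply per login session, so they must be in effect for the user ID that actually starts the DataStage jobs.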
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'