Heap allocation error - ODBC stage

Rohit_ben
Participant
Posts: 19
Joined: Sat Sep 22, 2007 4:55 am

Post by Rohit_ben »

Hi

One of our parallel jobs fetches data from Oracle 10g and inserts it into a SQL Server table. The job design is as follows:

Oracle Enterprise > Transformer > ODBC Enterprise

The job needs to fetch and populate approximately 20 million records. The job is failing with the following error:

odbcwrt_EX_POST_RESULTS,0: Operator terminated abnormally: Terminating with exception:APT_BadAlloc: Heap allocation failed.

I have searched the forum for similar errors and have applied some of the suggestions to the ODBC stage, as follows:

I have set APT_DISABLE_COMBINATION to True.
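
For reference, a sketch of how this might look in the dsenv file on a Unix installation (it can also be set at project or job level through the Administrator client):

# Disable operator combination so each operator runs in its own process
APT_DISABLE_COMBINATION=True; export APT_DISABLE_COMBINATION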

1) Set the insert array size to a low value - I set it to 10, but the job still fails.
2) Set the insert array size to 1 and the row commit interval to a positive integer - I set the array size to 1 and tried commit intervals of 10, 1,000, 2,000 and 50,000. The job still fails on all occasions.
3) Set the execution mode to sequential, but with no success.

Could you please let me know how I can proceed to find the cause and resolve the issue?

Regards
Rohit
suse_dk
Participant
Posts: 93
Joined: Thu Aug 11, 2011 6:18 am
Location: Denmark

Post by suse_dk »

Are you able to run your job on just a single node?
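
For example, by pointing APT_CONFIG_FILE at a one-node configuration file along these lines (the fastname and resource paths below are placeholders for your environment):

{
    node "node1"
    {
        fastname "your_server_hostname"
        pools ""
        resource disk "/opt/IBM/InformationServer/Server/Datasets" {pools ""}
        resource scratchdisk "/tmp" {pools ""}
    }
}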
_________________
- Susanne
Rohit_ben
Participant
Posts: 19
Joined: Sat Sep 22, 2007 4:55 am

Post by Rohit_ben »

We have created a server job with the same logic, and it works fine.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

How wide are your rows? There is a configurable maximum number of bytes per row - from memory it's 8192. There's also a configurable maximum number of columns per row - from memory that one is 400. Exceeding either of these might cause a heap allocation error.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Rohit_ben
Participant
Posts: 19
Joined: Sat Sep 22, 2007 4:55 am

Post by Rohit_ben »

Sorry Ray, I was not able to view the premium content :(

The number of columns is 120. Are these 'configurable maximum number of bytes per row' and 'configurable maximum number of columns per row' set through some environment variables?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Search the forum for MAXFETCHBUFF and MAXFETCHCOLS. These are entries that can be made in the uvodbc.config file.
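
As a sketch, a DSN entry in uvodbc.config looks roughly like this (the DSN name is a placeholder; 8192 bytes and 400 columns are the defaults mentioned above):

<Your_DSN_Name>
DBMSTYPE = ODBC
MAXFETCHBUFF = 16384
MAXFETCHCOLS = 400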
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Rohit_ben
Participant
Posts: 19
Joined: Sat Sep 22, 2007 4:55 am

Post by Rohit_ben »

We changed the uvodbc.config file in the project directory to include the following entry:

<DPServer32>
DBMSTYPE = ODBC
MAXFETCHBUFF = 16384

and set the ODBC stage properties to:

Array size = 1
Row commit interval = 5000

The job still failed. We did not change MAXFETCHCOLS, as it defaults to 400 columns.
The job fails at the same record count each time.