Compilation Error in DataStage

Post questions here related to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
jwiles
Premium Member
Posts: 1274
Joined: Sun Nov 14, 2004 8:50 pm
Contact:

Post by jwiles »

You have three options: add the options to $APT_COMPILEOPT or $APT_LINKOPT at the project level in DataStage Administrator (consult the compiler documentation for which options are required); add those environment variables to the job properties and set the options there; or set the compiler options at the stage level, on the Build tab of the transformer.
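For illustration, the two environment variables might be exported like this in a shell session (the option values shown are only examples for a 64-bit AIX compiler, taken from later in this thread; the correct ones come from your compiler's documentation):

```shell
# Illustrative values only -- consult your compiler documentation for the
# options your platform actually needs.
export APT_COMPILEOPT="-O -q64 -c"   # options passed to the C++ compiler
export APT_LINKOPT="-G -q64"         # options passed to the linker
echo "$APT_COMPILEOPT"
echo "$APT_LINKOPT"
```

Setting them in DataStage Administrator applies them project-wide; adding them as job parameters scopes them to a single job.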

The environment variables are documented within the Information Server documentation. The compiler options are documented in the compiler's man pages or online documentation.

Regards,
- james wiles


All generalizations are false, including this one - Mark Twain.
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

Thanks James. I will use your suggestions.
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

These are the values I have tried for compiling the Job.

APT_COMPILEOPT=-O -q64 -c
APT_LINKOPT=-G -q64

The job is showing the same error. I am trying to work out what different values I can give.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You might want to open a case with your support provider; it may be more of a bug with the stage than any need to tweak the compiler options.
-craig

"You can never have too many knives" -- Logan Nine Fingers
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

This job was working fine until last week. I made a few metadata changes, and since then I am getting the above compilation error.

My job reads 47 different record types, and I suspect it might be a memory issue because of the large number of columns and operators.

The total number of columns read from the CFF stage is 2573 (47 different records and 47 record IDs). I recently updated the metadata for one of the records from 50 columns to 250 columns (a filler column expanded into multiple columns). If I change that record back to 50 columns, the job compiles properly.
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

Other jobs in the project are compiling properly. I am getting this error only for the one job, and that job was running successfully last week.
jwiles
Premium Member
Posts: 1274
Joined: Sun Nov 14, 2004 8:50 pm
Contact:

Post by jwiles »

The documentation for VisualAge C++ shows how to specify the maxmem and spill options: -qmaxmem=value and -qspill=value.

You may also consider disabling compiler optimization if modifying the maxmem and spill settings doesn't resolve the issue. I think the option is -qnooptimize, but verify that for your version of VisualAge.

With this many columns and links (and not knowing what your derivations look like), you've likely simply exceeded the default capacity of some of the internal tables the compiler builds while compiling the transformer's C++ code. The same happened to me back in v7.0 on AIX, with an input stream carrying a very high number of columns (somewhere in the 1700+ range). Disabling compiler optimization resolved that particular issue.
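As a sketch, the compile options could be extended along these lines. The specific values -qmaxmem=-1 (no limit on memory for memory-intensive optimizations) and -qspill=32704 are illustrative assumptions, not values from this thread; check the documentation for your VisualAge/XL C++ version before using them:

```shell
# Option A (illustrative): raise the compiler's internal limits while
# keeping optimization on.
export APT_COMPILEOPT="-O -q64 -c -qmaxmem=-1 -qspill=32704"

# Option B (illustrative): disable optimization entirely if Option A
# is not enough for a very wide transformer.
export APT_COMPILEOPT="-q64 -c -qnooptimize"
echo "$APT_COMPILEOPT"
```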

Regards,
- james wiles


All generalizations are false, including this one - Mark Twain.
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

The following settings are in place at the project level:
APT_COMPILEOPT=-O -q64 -c
APT_LINKOPT=-G -q64

I added the $APT_COMPILEOPT environment variable to the job parameters of this specific job and assigned the value as follows:
APT_COMPILEOPT=-O0 -q64 -c

The job has now compiled successfully, and as far as I know this should not have any performance implications at run time. Any advice on how best to use these parameters? Would they be better at the job level or at the project level? I have kept the change at the job level so that it applies only to this job.

Another option I am considering is adding these values on the "Build" tab of the Transformer properties.
raghav_ds
Premium Member
Posts: 40
Joined: Wed May 04, 2011 2:21 am

Post by raghav_ds »

I have now added the parameters on the "Build" tab of the Transformer.
Post Reply