importing Oracle table definition changes data type

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


iq_etl
Premium Member
Posts: 105
Joined: Tue Feb 08, 2011 9:26 am


Post by iq_etl »

Hello,

We just converted from 7.5 to 8.7 and have noticed that when we import table definitions using the Oracle Connector Import Wizard, it changes Char columns to NChar and VarChar2 columns to NVarChar. I have found references to this in previous posts and in the DataStage documentation, but I am still not clear on *why* it does this conversion and whether it is a problem. So far we have been manually changing each column back to its original type.

Can anyone advise? Our database's NLS_CHARACTERSET is AL32UTF8, so it is multi-byte, but our DBA is concerned that the NChar/NVarChar settings will cause the database to use more storage for the same data.
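To put numbers on the multi-byte point: under AL32UTF8 (Oracle's UTF-8) a character occupies one to four bytes, while the NCHAR character set (typically AL16UTF16, a UTF-16 form) uses at least two bytes per character. A quick Python sketch of the byte cost (illustrative only, not DataStage code; the sample strings are made up):

```python
# Byte cost of the same strings under encodings comparable to
# Oracle's AL32UTF8 (UTF-8) and the usual NCHAR set AL16UTF16 (UTF-16).
samples = ["abc", "café", "日本語"]
for s in samples:
    utf8_bytes = len(s.encode("utf-8"))
    utf16_bytes = len(s.encode("utf-16-be"))  # big-endian, no BOM marker
    print(f"{s!r}: {len(s)} chars, {utf8_bytes} bytes UTF-8, {utf16_bytes} bytes UTF-16")
```

For ASCII-heavy data UTF-8 is the smaller of the two, which is where a storage concern about NChar/NVarChar would come from; for CJK data UTF-16 can actually be smaller.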

Thanks in advance for any input.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Oracle is... different.

Nowhere else in the world has VARCHAR2 as a data type. When you import this into DataStage it has to be translated into VarChar, as used by the rest of the world.

With NLS enabled and an NLS character set specified for Oracle, DataStage uses NVarChar, by which it means VarChar with the Unicode extension enabled. In some cases it will also appear to make the string sizes two, three or even four times what they are in Oracle, depending on the character-set map used.

Your DBA should be unconcerned if all the VARCHAR2 string lengths are specified as CHARACTER rather than BYTE.
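To make the BYTE vs. CHARACTER distinction concrete: a VARCHAR2(10 BYTE) column limits the encoded byte length, while VARCHAR2(10 CHAR) limits the character count, so only the former shrinks in capacity as characters get wider. A small Python sketch of that semantics difference (the `fits` helper is hypothetical, purely for illustration):

```python
# Sketch of Oracle length semantics in an AL32UTF8 (UTF-8) database:
# BYTE semantics limits encoded bytes, CHAR semantics limits characters.
def fits(value: str, limit: int, semantics: str = "BYTE") -> bool:
    if semantics == "BYTE":
        return len(value.encode("utf-8")) <= limit  # byte count under UTF-8
    return len(value) <= limit                      # character count

s = "étudiants"                 # 9 characters, 10 bytes (é takes 2 bytes)
print(fits(s, 10, "BYTE"))      # fits a VARCHAR2(10 BYTE)
print(fits(s, 9, "BYTE"))       # overflows a VARCHAR2(9 BYTE)
print(fits(s, 9, "CHAR"))       # fits a VARCHAR2(9 CHAR)
```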
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.