
Data type mapping from Oracle to DataStage

Posted: Tue Oct 04, 2011 5:53 am
by deepticr
Hi,

We are using Oracle database as the source/target DB in our extraction process.

I have a particular column - ColA of datatype VARCHAR2(40 CHAR). The NLS setting of the database is UTF8.

When I use the ODBC Table Definitions to import the metadata, this column translates to NVARCHAR(160). But, when I use the Orchestrate Schema Definitions the column is read as VARCHAR(40).

Which of the above datatypes is the correct translation? Currently, we are using the development database, which doesn't contain multibyte data, so either way my job runs fine.

But, when we migrate the code to higher environments, we run the risk of encountering multibyte data. So, which of the above two options is capable of handling multibyte data?

I also went through the IBM documentation on this, which states that VARCHAR2(n CHAR) multibyte = VARCHAR(n) in DataStage.

http://publib.boulder.ibm.com/infocente ... oracc.html

- Deepti

Posted: Tue Oct 04, 2011 2:59 pm
by ray.wurlod
They are both correct. When you import via the ODBC Connector, the ODBC driver converts the data type to what is appropriate for ODBC. When you import directly (or via the orchdbutil facility), you get the data type that the OCI reports. As a general rule, oversizing VARCHAR data types is less likely to cause problems than undersizing them.
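For anyone puzzled by the 40 vs. 160 difference: VARCHAR2(40 CHAR) limits the column to 40 *characters*, while the ODBC driver appears to size the field for the worst case in *bytes* (UTF-8 uses up to 4 bytes per character, and 40 × 4 = 160). A minimal Python sketch illustrating the character-vs-byte distinction (the strings chosen are just illustrative examples):

```python
# Character length vs. UTF-8 byte length for the same logical string.
# A VARCHAR2(40 CHAR) column constrains characters; a byte-sized
# buffer must allow for the widest possible encoding of each one.

ascii_s = "A" * 40            # 40 ASCII characters
euro_s  = "\u20ac" * 40       # 40 euro signs (3 bytes each in UTF-8)
emoji_s = "\U0001F600" * 40   # 40 supplementary-plane chars (4 bytes each)

for s in (ascii_s, euro_s, emoji_s):
    print(len(s), len(s.encode("utf-8")))
```

All three strings are 40 characters and would fit a VARCHAR2(40 CHAR) column, but their UTF-8 byte lengths are 40, 120, and 160 respectively; only a 160-byte buffer is safe for every case, which matches the NVARCHAR(160) the ODBC import reports.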