NLS characters not supported by Oracle Enterprise Stage?

kannantechno
Participant
Posts: 9
Joined: Tue Feb 16, 2010 1:18 am

NLS characters not supported by Oracle Enterprise Stage?

Post by kannantechno »

Hi All,

Earlier I loaded data with the ODBC stage into SQL Server. Now, for performance tuning, I have replaced the ODBC stage with the Oracle Enterprise stage. I am able to load data, but it does not handle Latin characters correctly: for example, the apostrophe (') comes out as a question mark (?). I have tried all the NLS setting options, but in vain.

Some of the options I tried:
1. Changing the NLS map to the project default, ISO8859-1
2. Changing the NLS map to UTF-8
3. Changing the NLS map to ISO 8859 1997
4. Changing the NLS map to IBM037
5. Changing the column data type from NVarChar to VarChar; even then the characters were not supported.

Any suggestion or help will be great!
Thanks in advance :)
Regards,
Kannan
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

NLS is most definitely supported by the Oracle Enterprise Stage. The "?" more likely means the data is being converted through an incompatible character map somewhere along the way.

When transferring NLS data between two different sources, you may have to "re-map" the data. Any time the conversion cannot find a corresponding character in the new map, it replaces it with a question mark.
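A quick way to see that substitution in action — a rough analogy (in Python, purely for illustration) to what a character-map conversion does when a character has no equivalent in the target set:

```python
# A typographic apostrophe (U+2019) has no slot in ISO-8859-1, so encoding
# with replacement turns it into a literal "?" -- and the original is lost.
text = "it\u2019s"  # "it's" written with a curly apostrophe
encoded = text.encode("iso-8859-1", errors="replace")
print(encoded)  # b"it?s"
```

Once the "?" byte is written, no later conversion can recover the original character.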

Things to check:

1) Make sure the NLS settings match for both source and target. This probably means checking with both of your admins (Oracle and SQL Server) to determine what character sets the databases use, then setting your Oracle and ODBC (SQL Server) stages to match.
2) Make sure any Oracle environment variables are also set correctly. There are a couple that specify character sets (I can't remember them, but a search in the forums will turn them up).
3) The default for most projects/jobs is UTF-8. If you aren't careful, you might be mapping the characters twice (incoming Latin-9, mapped to UTF-8, then out to whatever SQL Server is set to). Each mapping takes time and adds another chance for a special character to be unsupported (once it becomes a "?", it doesn't go back later).
4) Make sure the Latin-9 characters are really supported in your target character set. If SQL Server doesn't support them, then the "?" is technically correct.
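For point 4, here is a small sketch (Python again, only illustrative — the sample string and charset names are assumptions you would replace with your real data and your actual target character set) that lists which characters of a sample string would not survive the mapping:

```python
def unsupported_chars(sample: str, target_charset: str) -> list:
    """Return the characters in `sample` that cannot be encoded in `target_charset`."""
    bad = []
    for ch in sample:
        try:
            ch.encode(target_charset)
        except UnicodeEncodeError:
            bad.append(ch)  # this character would come out as "?" after mapping
    return bad

# Example: text containing Latin-9-style characters checked against a Latin-1 target.
# The i-with-diaeresis survives; the en dash and curly apostrophe do not.
print(unsupported_chars("na\u00efve \u2013 it\u2019s", "iso-8859-1"))
```

Running the equivalent check against your target's actual character set tells you whether the "?" is a mapping bug or simply a character the target cannot hold.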
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
kannantechno
Participant
Posts: 9
Joined: Tue Feb 16, 2010 1:18 am

Post by kannantechno »

Hi,

Thanks for your reply.

I checked the NLS settings on the Oracle DB with:

select * from nls_database_parameters

NLS_CHARACTERSET AL32UTF8

The characters do not display correctly even when the data is viewed from the Oracle Enterprise stage itself.

It would be very helpful to get this issue resolved.


NLS Details in Datastage
----------------------------


NLS section of DS Admin
********************
In project level it is ISO8859-1

DataStage
*********

Default map for stages: UTF-8

Default collation locale for stages: Project (OFF)


NLS Map tab of Oracle Enterprise Stage:
********************************

Select category to set map for: NCHAR/NVARCHAR2

Map name for 'NCHAR/NVARCHAR2': Project default (UTF-8)


In oracle DB
----------------

NLS_LANGUAGE - AMERICAN
NLS_TERRITORY - AMERICA
NLS_CURRENCY - $
NLS_ISO_CURRENCY - AMERICA
NLS_NUMERIC_CHARACTERS - .,
NLS_CHARACTERSET - AL32UTF8
NLS_CALENDAR - GREGORIAN
NLS_DATE_FORMAT - DD-MON-RR
NLS_DATE_LANGUAGE - AMERICAN
NLS_SORT - BINARY
NLS_TIME_FORMAT - HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT - DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT - HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT - DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY - $
NLS_COMP - BINARY
NLS_LENGTH_SEMANTICS - BYTE
NLS_NCHAR_CONV_EXCP - FALSE
NLS_NCHAR_CHARACTERSET - AL16UTF16
NLS_RDBMS_VERSION - 10.2.0.4.0
Regards,
Kannan