
Posted: Wed Apr 26, 2006 4:21 pm
by ray.wurlod
The NLS map associated with a DataStage hashed file should always be NONE, because the hashed file is "inside" DataStage; characters have already been mapped at the boundaries between external data and DataStage.
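Ray's boundary-mapping idea can be illustrated outside DataStage. In the sketch below (plain Python, purely illustrative; "latin-1" stands in for whatever NLS map is set on the boundary stages), bytes are mapped once on the way in and once on the way out, and the internal steps pass the data around unchanged, which is why the hashed file itself needs no map:

```python
# Illustrative only: 'latin-1' stands in for the NLS map applied at the
# boundary between external data and the engine.
raw = b"caf\xe9"                  # external bytes (0xE9 = 'e-acute' in latin-1)

internal = raw.decode("latin-1")  # mapped ONCE at the input boundary
# ... internal stages (where the hashed file lives) pass the decoded text
# around unchanged, so no further mapping (map = NONE) is needed ...
out = internal.encode("latin-1")  # mapped ONCE at the output boundary

assert out == raw                 # the round trip is lossless
```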

Posted: Wed May 03, 2006 1:58 am
by jinm
ray.wurlod wrote:characters have already been mapped at the boundaries between external data and DataStage.
Hi Ray and thanks for the reply.
Well, apparently not all characters have been mapped.
The job worked somewhat OK on the "NON-NLS-Enabled" server, mapping all "funny" characters to ?.
What needs to be done to include a default character for all characters that cannot be mapped?
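For what it's worth, the substitute-a-default behaviour Jan is asking about is what generic codecs do when a character has no mapping in the target code page. The sketch below (plain Python, not DataStage) shows both the built-in "?" substitution and a custom default character:

```python
import codecs

text = "\u017elu\u0165"  # "zlut'" with haceks: no ASCII mapping for 2 chars

# Built-in behaviour: unmappable characters become '?'
print(text.encode("ascii", errors="replace"))  # b'?lu?'

# A custom default character, registered as an error handler
def use_underscore(err):
    # err is a UnicodeEncodeError; return the replacement text and
    # the position at which encoding should resume
    return ("_", err.end)

codecs.register_error("use_underscore", use_underscore)
print(text.encode("ascii", errors="use_underscore"))  # b'_lu_'
```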

This issue is only seen when writing to hashed files, not in DB-to-DB mappings, which make up the vast majority of our jobs.

Rgds Jan

Re: NLS setting for Hashed file??

Posted: Wed May 17, 2006 6:43 am
by vjeran
Hi Jan,

I have almost the same problem with a HASH file and the QS plug-in. Here is what I have found:

* the HASH file uses the code page that the Windows server is set to
- go to DS Administrator : click NLS... : the current ANSI code page shown there is what DS sees and uses on your server
- if this is not correct (e.g. 1252 instead of 1250), go to Windows Regional and Language Options : Advanced tab : select your language correctly : and restart the server
- at this point you know that the HASH file will use the correct NLS

* do NOT link directly from Oracle to the HASH file : use a Transformer stage

* check the NLS setting for the Oracle CLIENT if the DS server is not on the same machine : HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\KEY_[OraDBHOME]\NLS_LANG : it should be [language]_[country].WE8ISO8859P1

* for the Oracle stage use ISO8859-1, because that way you tell DS that the source is in that code page

* the HASH file is UTF-8 by default (on Server Edition)

* use the View Data button to check results; until you see the special characters displayed correctly, the mission is not finished
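The 1252-vs-1250 mismatch from the first bullet is easy to reproduce outside DataStage. In this sketch (plain Python, purely illustrative), Central European text written under code page 1250 but read back as 1252 silently turns into different characters, which is exactly the kind of "funny character" corruption a wrong ANSI code page produces:

```python
text = "\u011b"                 # 'e' with hacek, valid in cp1250

stored = text.encode("cp1250")  # what a server on code page 1250 would write
assert stored == b"\xec"        # single byte 0xEC

# A server (or client) wrongly set to cp1252 reads the same byte back:
garbled = stored.decode("cp1252")
assert garbled == "\xec"        # now 'i-grave': wrong character, no error raised
```

Note that nothing fails here; the bytes decode cleanly under the wrong code page, so only a visual check (the View Data step above) catches it.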

BR Vjeran K.