Posted: Mon May 18, 2009 9:23 am
by chulett
Sure, with Convert() for example... but why? They're not really equivalent and a better solution would be to load them as is. What issue are you having?
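For example, a Transformer derivation along these lines, where the column name In.Desc and the two character lists are just placeholders for whatever is actually in your data:

    Convert("éü", "eu", In.Desc)

Each character in the first list is replaced by the character in the same position in the second list. But as noted, you lose information doing that, which is why loading them as-is is the better answer.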

Posted: Mon May 18, 2009 9:33 am
by senthilmp
chulett wrote:Sure, with Convert() for example... but why? They're not really equivalent and a better solution would be to load them as is. What issue are you having? ...
The issue I am facing is: the special character is read from an Oracle table and written as-is to the Oracle target. But in between we do a dynamic hashed file lookup. The special character is not being written to the hashed file; it throws an error stating the character is not defined under the NLS setting.

The Oracle source and target stages are able to read the special character with the NLS setting UTF-8, yet the special character is not being written to the hashed file. Also, I couldn't see an NLS setting for the hashed file as we see for other stages.

Posted: Mon May 18, 2009 3:16 pm
by asorrell
Try setting the map for the job to "NONE" - that way it will not attempt to map the characters to UTF-8.

Note: This will only work correctly if the character is supported in the default map for the target (it is obviously supported in the source!).

Posted: Mon May 18, 2009 4:00 pm
by ray.wurlod
The map for hashed files must be NONE if they are only to be used within DataStage. A map of NONE will handle every character "as is".

Posted: Mon May 18, 2009 11:33 pm
by senthilmp
Where specifically should I set the map for the hashed file? I don't see any option for setting the NLS map specifically for the Hashed File stage.

Posted: Tue May 19, 2009 12:10 am
by ray.wurlod
You won't. You won't see it for HASHED file stage either. That's how "they" make sure you can't change it from NONE in the GUI.

Posted: Tue May 19, 2009 12:14 am
by senthilmp
So by default the NLS setting for HASHED file is NONE?

Posted: Tue May 19, 2009 12:19 am
by ray.wurlod
Correct. This can only be changed (deliberately) at the TCL command line, and you would only do that if other applications were going to access the hashed file using non-UV-UTF8 character encoding. This was more important prior to version 6.0, when other UniVerse applications might have accessed data in DataStage hashed files.
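For example, at the TCL prompt (from the Administrator command line or a telnet session into the project account) it would look something like this; the file name MyHashedFile and the map name MS1252 are just examples, and you should verify the exact SET.FILE.MAP syntax against the UniVerse NLS documentation for your release:

    >SET.FILE.MAP MyHashedFile MS1252
    >SET.FILE.MAP MyHashedFile NONE

The first form assigns a map for the benefit of an external application; the second restores the DataStage default of no mapping.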

Posted: Tue May 19, 2009 12:23 am
by senthilmp
OK, but then why is the special character not being written to the hashed file if its map is NONE by default? Actually we have three different DataStage environments (Dev, QA & Prod). In two of the environments the special character is written into the hashed file as-is, but in one environment (Prod) we are facing this issue of the special character not being written to the hashed file.

All the settings are the same across the three environments, and the job is the same as well.

Posted: Tue May 19, 2009 5:29 am
by chulett
Well, something is obviously different there. You need to keep digging and find it. Oracle client version, perhaps?
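For example, you could run these on each of the three servers and compare the output; the dsenv location assumes a typical Unix install under $DSHOME:

    # Oracle client version banner
    sqlplus -V
    # client-side character set handling
    echo $NLS_LANG
    # what the DataStage engine environment actually sets
    grep -Ei 'NLS_LANG|ORACLE_HOME' $DSHOME/dsenv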