NLS mapping problem
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 158
- Joined: Tue Mar 15, 2005 3:16 am
NLS mapping problem
Hi
We recently changed our NLS mapping from ISO8859-1 to UTF8. We did this so that all the international characters specified in UTF8 would show correctly in the table.
Because of this change, the following job behaves differently:
SEQ FILE -> TFM -> TABLE
Earlier:
All the records were loading to the table, but some records showed weird characters in the table.
Now:
Some of the records are not loading to the table; they fail with an NLS mapping error as follows:
load_stgf_msgr_imv_d_max..feed_msgr_imv_d.feed_msgr_imv_d: nls_map_buffer_in() - NLS mapping error, row 13804 (approx), row = "us6.000VNh?nh n?y d?p qu? co ph?i kh?ng h?u: http://vietgiaitri.com/ind100000010000011"
Can anyone please help us find the solution to this problem?
thanks
Sai
-
- Premium Member
- Posts: 1044
- Joined: Wed Sep 29, 2004 3:30 am
- Location: Nottingham, UK
- Contact:
Re: NLS mapping problem
saikrishna wrote: Because of this change, the following job behaves differently:
SEQ FILE -> TFM -> TABLE

What character set is the file? Is it a UTF8 file or an ISO8859-1 file? If it is an ISO8859-1 file, then set the Sequential File stage itself to that NLS map and your job will convert it to UTF8. I find that ISO8859-1+MARKS handles a wider variety of characters correctly than plain old ISO8859-1.
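Roughly, the conversion described here behaves like this Python sketch (a sketch only, outside DataStage; the file names are hypothetical):

    # Decode the bytes using the map that matches the file (ISO8859-1),
    # then re-encode them to the project's UTF8 map.
    # File names are hypothetical.
    with open("feed.txt", "r", encoding="iso8859-1") as src:
        text = src.read()
    with open("feed_utf8.txt", "w", encoding="utf-8") as dst:
        dst.write(text)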
Phil Hibbs | Capgemini
Technical Consultant
-
- Participant
- Posts: 158
- Joined: Tue Mar 15, 2005 3:16 am
It is an ASCII file which contains Chinese characters. We could not load it using DataStage, but we could load it correctly using SQL*Loader.
I tried setting UTF8 and ISO8859-1+MARKS in the NLS settings of DataStage, but could not load it correctly.
For Chinese characters, DataStage is reading more characters than are actually there.
Can any one of you please help us resolve the NLS problem in DataStage with Chinese characters?
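(For what it's worth: "reading more characters than actual" is what a byte-oriented read does to multi-byte UTF8 data, since each Chinese character occupies three bytes in UTF8. A small Python illustration:)

    # Each Chinese character is 3 bytes in UTF8, so byte counts run
    # higher than character counts.
    s = "中文测试"              # four Chinese characters
    b = s.encode("utf-8")
    print(len(s))              # 4 characters
    print(len(b))              # 12 bytes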
Note: The data I pasted in my first post was a direct copy, so please ignore any garbling there.
Thanks for the help
Sai
-
- Participant
- Posts: 158
- Joined: Tue Mar 15, 2005 3:16 am
Hi PhilHibbs
It could be a non-ASCII file... we get it from an upstream system as a regular file.
Please don't concentrate on the type of file; instead, can you please tell us, if possible, why the Chinese characters are not converting using UTF8?
It works fine with the normal SQL*Loader utility in Oracle.
Thanks
Sai
-
- Premium Member
- Posts: 1044
- Joined: Wed Sep 29, 2004 3:30 am
- Location: Nottingham, UK
- Contact:
saikrishna wrote: Please don't concentrate on the type of file; instead, can you please tell us, if possible, why the Chinese characters are not converting using UTF8?

Well, the file type is pretty important: if your Sequential File stage specifies UTF8 and the file isn't a UTF8 file, then you will have real problems.
In the SQL*Loader definition, do you specify the file type as UTF8 in the CTL or does it detect it? I'm a bit rusty with SQL*Loader and I've never had to worry about file types as I've only ever used it for ISO8859-1 files and that worked by default.
Can you tell me what the first few bytes of your source file are in hex? That will tell me whether it is UTF8 or not, providing it has a Byte Order Mark which most UTF8 files should have (but not always).
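If it helps, a quick Python sketch for checking this (the file path is hypothetical):

    # Print the first bytes in hex and test for a UTF8 Byte Order Mark
    # (EF BB BF). The file path is hypothetical.
    with open("feed.txt", "rb") as f:
        head = f.read(16)
    print(head.hex(" "))       # needs Python 3.8+ for the separator
    print("UTF8 BOM present" if head.startswith(b"\xef\xbb\xbf") else "no BOM")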
Phil Hibbs | Capgemini
Technical Consultant
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
saikrishna wrote: Please don't concentrate on the type of file; instead, can you please tell us, if possible, why the Chinese characters are not converting using UTF8?

Your thinking is "off base" here. The NLS map must accurately describe how the data in the file are encoded.
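A small Python illustration of what happens when the map and the data disagree:

    # Decoding bytes with the wrong map either produces mojibake or
    # fails outright, much like the nls_map_buffer_in() error above.
    utf8_bytes = "中文".encode("utf-8")
    print(utf8_bytes.decode("utf-8"))      # correct map: the two characters
    print(utf8_bytes.decode("iso8859-1"))  # wrong map: mojibake

    latin_bytes = "café".encode("iso8859-1")
    try:
        latin_bytes.decode("utf-8")        # wrong map: hard failure
    except UnicodeDecodeError as err:
        print(err)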
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.