Hi,
Our source Oracle 10g database was created with the US7ASCII NLS character set, whereas the target Oracle 10g database was created with the AL32UTF8 NLS character set.
I have a DataStage job that extracts the data from the source database into a sequential file, and another job that loads the data from the sequential file into the target database. One of the columns in a source table, say col1, contains some multi-byte characters. When I try to load this data into the target database, the insert fails.
I tried running the jobs after changing the NLS character set from the default ISO-8859-1 to UTF-8; I tried this at the project level, the job level, and the stage level. But somehow DataStage is unable to interpret the multi-byte characters coming from column col1.
How do we resolve the issue of loading multi-byte characters into the target database? We have this issue not just for one column in one table but for multiple columns in several tables.
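To make the symptom concrete, here is a minimal sketch (plain Python, not DataStage) of what is likely happening: a US7ASCII database will happily store bytes above 127 as-is, but those raw bytes are not valid UTF-8, so they are rejected on the AL32UTF8 side unless they are first decoded with the encoding the data is actually in. The choice of "latin-1" below is an assumption for illustration; the real source encoding has to be determined from the data.

```python
# Hypothetical illustration (not DataStage code): bytes stored in a US7ASCII
# database can contain 8-bit values that are not valid UTF-8 on their own.
raw = "café".encode("latin-1")  # b'caf\xe9' -- 0xE9 is outside 7-bit ASCII

# Treating the raw bytes as UTF-8 fails: a lone 0xE9 is an invalid sequence.
try:
    raw.decode("utf-8")
except UnicodeDecodeError as err:
    print("load fails:", err)

# Decoding with the encoding the data is actually in (assumed latin-1 here),
# then re-encoding as UTF-8, yields bytes an AL32UTF8 target will accept.
fixed = raw.decode("latin-1").encode("utf-8")
print(fixed)  # b'caf\xc3\xa9' -- valid UTF-8
```

The practical upshot is that switching the job's NLS map to UTF-8 only tells DataStage how to interpret the incoming bytes; if the bytes were never UTF-8 in the first place, the map has to match the true source encoding instead.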
source and target database with different NLS character sets