Hi All,
We need to load data from a source Oracle system whose NLS character set is US7ASCII into our target Oracle database. One column at the source contains Arabic data, and we are facing a problem loading this column. Please advise.
We tried the following method:
1) Extracted from the source table and loaded into a DataSet, passing US7ASCII as the job-level NLS parameter (NLS ASL_ASCII).
2) Read the data from the DataSet and loaded it into the target table (NLS ISO8859-6) using the project default NLS, ISO8859-6.
But we are not able to load it; the job throws a large number of NLS-related warnings and fails.
The source contains data like this:
ALI H. AL-HADDAD.
ر��ش� �م �سم �م
ر���ص ��ِ�م� �م� ر�ّ
��لو� ���فّ �م زف
NASER M.S. AL-AJMI
�������� �� �ص�
����� ز��� �ـص �ّ
��شمـف ر�ى�فّ �ّ
��شمـف ر�ى�فّ �
But this data is recognized correctly by the Beam10 application.
Please let me know how to resolve this issue.
Character set issue
Moderators: chulett, rschirm, roy
2 B 1 4 ALL
You need to start at the beginning and take this step-by-step to find the source of the conversion issues.
How are these characters actually stored in Oracle? If you use US7ASCII as your character set then no Arabic characters are representable.
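That constraint is easy to demonstrate. A minimal Python sketch, under the assumption (hypothetical here) that the stored bytes are really ISO-8859-6 Arabic sitting in a database that merely declares US7ASCII:

```python
# US7ASCII is 7-bit: no Arabic letter is representable in it.
text = "علي"  # hypothetical sample value

try:
    text.encode("ascii")
except UnicodeEncodeError:
    print("not representable in US7ASCII")

# If the bytes were written as ISO-8859-6 and passed through untouched,
# every byte has the high bit set and decodes cleanly with that map:
raw = text.encode("iso8859_6")
print(all(b >= 0x80 for b in raw))       # every byte is outside 7-bit ASCII
print(raw.decode("iso8859_6") == text)   # round-trips with the right map
```

In other words, if the data really is Arabic, the bytes in that column cannot legitimately be US7ASCII; they are 8-bit bytes that Oracle is passing through without conversion.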
Does NLS ISO8859-6 support these Arabic characters? If not, then there's no amount of magic you can do to make this work. As noted, US7ASCII sure won't.
Is that your character set at both the source and the target? If so, you'll need to override the setting of any other character set while the job runs so that no conversion takes place in the job. If the source and target are different and both support your data, then it is a matter of setting NLS properly in the job so that Oracle knows what conversion to make.
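The "no conversion" point also explains the boxes in the posted sample. A short Python sketch (again assuming, hypothetically, ISO-8859-6 bytes behind a US7ASCII declaration):

```python
raw = "ناصر".encode("iso8859_6")  # hypothetical stored bytes

# Forcing the 8-bit bytes through a US7ASCII view: every high byte is
# unmappable and becomes a replacement character -- the "boxes".
mangled = raw.decode("ascii", errors="replace")
print(mangled)

# Passing the bytes through untouched and applying the right map once:
print(raw.decode("iso8859_6"))
```

The fix is therefore not more conversion but less: carry the bytes unchanged end to end and declare the one map that actually matches them.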
-craig
"You can never have too many knives" -- Logan Nine Fingers
Thanks for the reply.
The data at the source looks like this:
QxwTv +e )Se we
Qv{wU 6vp R{e{ wew Qvq
)~dhv 6vvaq we Ra
(+{xwv~v '+ xUw
We are able to read and load other tables from the same source; we are facing a problem with only this table (the data above has boxes in between).
The source database is US7ASCII, and in the first step we extract and load into a DataSet using US7ASCII as the job-level NLS parameter, to be compatible with the source.
Thanks Andrew
We load a number of tables from this source database (NLS US7ASCII) and have no problem with any of them; we are able to load Arabic data from source to target successfully.
Other tables have data like this, which converts to Arabic correctly in the target:
dbOJZjjQGdeSeiGdiTQcgGHfGAYeQeMeOHdMeQHehLH NWGH
GdehGabg Ydi JLGhR MOhO GdJSgjdGJ GdKGHJg hGdehbJg
But we are facing a problem with only this table, which has data like:
QxwTv +e )Se we
Qv{wU 6vp R{e{ wew Qvq
I guess these boxes or brackets are creating the problem.
Please help me resolve this.
They are NOT "boxes or brackets" - that's just how the tool you are using to display them represents those particular code values.
View Data in DataStage is notoriously bad at displaying non-ASCII characters - it usually just gives up and displays "?" to indicate an unmappable character.
You need to find out WHAT data are in this source, and how they are encoded (that is, what character map was used when they were written).
In the Administrator client execute the command SELECT * FROM NLS.MAP.DESCS and look for a map name that looks like it might handle Arabic satisfactorily.
I guess you even need to determine whether the data in the file are, in fact, Arabic. You might have an "interloper" - for example (not a good example, maybe, but it makes the point) a file of Hebrew data.
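The Arabic-versus-Hebrew question can be probed mechanically: decode the raw bytes with each candidate single-byte map and see which one yields characters in the expected Unicode block. A rough heuristic sketch in Python (the codec list and scoring are assumptions, not a definitive test):

```python
def guess_map(raw: bytes) -> str:
    # Candidate single-byte maps and the Unicode block each should produce.
    candidates = {
        "iso8859_6": (0x0600, 0x06FF),  # Arabic
        "iso8859_8": (0x0590, 0x05FF),  # Hebrew
    }
    best, best_hits = "unknown", 0
    for codec, (lo, hi) in candidates.items():
        try:
            decoded = raw.decode(codec)  # undefined byte positions raise here
        except UnicodeDecodeError:
            continue
        hits = sum(lo <= ord(ch) <= hi for ch in decoded)
        if hits > best_hits:
            best, best_hits = codec, hits
    return best

print(guess_map("سالم".encode("iso8859_6")))  # iso8859_6
```

This only discriminates between the maps you list as candidates, but it is a quick way to rule an "interloper" encoding in or out before touching any job settings.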
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.