This probably means that while you have used MS-1256 within DataStage, the Oracle database or target table has another character set defined that cannot hold some character available in MS-1256. What are your Oracle NLS settings (check in Oracle, not in DataStage)?
[quote="userasif"]After some changing now I am getting this message:
At row 1, link "OutLink"
Inserted value too large for column, row rejected.[/quote]
Try tripling the size of the original column for the Arabic columns, because Arabic characters do not occupy the same number of bytes as English characters.
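A quick way to see the difference outside the database is to compare encoded lengths. This is just a sketch; the Arabic word below is an arbitrary 5-letter example, and the character sets assumed are Windows-1256 (MS-1256) and AL32UTF8 (UTF-8):

```python
# Byte length of the same Arabic string in two encodings.
word = "مرحبا"  # arbitrary 5-character Arabic example

print(len(word))                    # 5 characters
print(len(word.encode("cp1256")))   # 5 bytes in Windows-1256 (MS-1256)
print(len(word.encode("utf-8")))    # 10 bytes in UTF-8 (AL32UTF8)
```

So a column sized for MS-1256 bytes can easily reject the same data once it is stored in a UTF-8 database, which is why padding the column size helps.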
For NLS applications you should always make sure that, in Oracle, you declare strings using VarChar2(32 Char) so that 32 characters can be stored. If you use the default (byte semantics), only 32 bytes are reserved and you couldn't store 32 multibyte characters in that string.
Varchar2 is the appropriate type for string values up to 4000 bytes. I used "32" only as an example - you might need a longer string - what is your input string's maximum length?
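Before reloading, you can roughly pre-check whether your values will fit a column declared with byte semantics by comparing the encoded length against the byte limit. A minimal sketch, assuming the database character set is AL32UTF8 (the `fits_byte_column` helper is hypothetical, not a DataStage or Oracle function):

```python
def fits_byte_column(value: str, max_bytes: int, db_charset: str = "utf-8") -> bool:
    """Return True if value fits a VARCHAR2(max_bytes BYTE) column
    in the given (assumed) database character set."""
    return len(value.encode(db_charset)) <= max_bytes

print(fits_byte_column("hello", 5))            # fits: 5 bytes
print(fits_byte_column("مرحبا", 5))            # does not fit: 10 bytes in UTF-8
print(fits_byte_column("مرحبا", 5, "cp1256"))  # fits: 5 bytes in MS-1256
```

Running this over a sample of your data shows quickly which rows would be rejected as "value too large for column".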
Maximum length is just 500.
Please consider a new scenario:
I have created table t1: c1 number(10), c2 varchar2(1000)
I have inserted an Arabic word into c2 and it is fine; the display in SQL*Plus is also correct as an Arabic word.
I have also created a new ETL job that loads data from table t1 to table t2 in target database just to check NLS Mapping.
I am getting the following message from the input stage:
"Oracle data type not presently supported".
He's looking for clarification on how it was created in Oracle: as 1000 bytes or 1000 characters? Makes a huge difference when dealing with multibyte character sets.
-craig
"You can never have too many knives" -- Logan Nine Fingers