Reading & Writing 8-bit chars (NLS) on DataStage and Oracle
Posted: Wed Jun 01, 2005 12:12 am
Hi All,
We have an input file containing an 8-bit character field, essentially used to identify a transaction type (they have used values 0 to 127, and are now sending us the character equivalents of values up to 150 or so).
At the moment we are running DataStage 7.1 without NLS support, and our target is an Oracle 8.1.7 database with an NLS_CHARACTERSET of US7ASCII. The concern is that we will not be able to hold these new characters in the database.
I see that I have two choices available to me.
1. Interpret the incoming data item as its numeric equivalent and store that number, converting it back to a character where needed with Oracle's CHR(numeric_equivalent) function.
2. Allow the data to pass through both DataStage and Oracle untouched, with the data being loaded as-is into the target database.
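For what it's worth, option 1 amounts to treating the transaction-type byte as a number rather than a character, so the 7-bit database never has to hold a character above 127. A rough sketch of the idea (Python used purely for illustration; the field position and sample record are hypothetical, and in the actual job this would be a transformer derivation):

```python
# Sketch of option 1: read the record as raw bytes and extract the
# transaction-type byte as a number (0-255), suitable for a NUMBER
# column rather than a character column in a US7ASCII database.
# The field position and sample record below are made up.

def transaction_type_code(record: bytes, pos: int = 0) -> int:
    """Return the numeric value (0-255) of the transaction-type byte."""
    return record[pos]  # indexing a bytes object yields the raw byte value

sample = bytes([150]) + b"rest-of-record"
code = transaction_type_code(sample)
print(code)  # 150 -- safe to store as a number, no charset conversion involved
```

The key point is that the byte is never interpreted through a character set at all; it only becomes a character again (via CHR) if and when someone needs to render it.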
In situation 1, I believe that I would need to run NLS on DataStage, setting the character set on DataStage to match that of the incoming data for this to work.
In situation 2, I believe that I would also need to run NLS on both DataStage and the target database, with both using the same character set as the incoming data.
I believe the option I would prefer is option 1. But at this point I am getting rather strange results, with the numeric values of the character equivalents coming back in the range of 50000.
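One plausible cause of those ~50000 values, assuming the engine converts the input to a multi-byte internal format such as UTF-8 before the numeric conversion (an assumption, not something confirmed here): characters 128-150 become a two-byte sequence, and if that byte pair is read together as a single 16-bit number the result lands right around 50000. A small illustration:

```python
# Characters 128-150 encode to two bytes in UTF-8 (0xC2 0x80 .. 0xC2 0x96).
# Reading such a byte pair as one big-endian 16-bit integer produces a
# value near 50000 -- consistent with the "strange results" described above.

for value in (128, 150):
    two_bytes = chr(value).encode("utf-8")  # e.g. chr(150) -> b'\xc2\x96'
    combined = int.from_bytes(two_bytes, "big")
    print(value, "->", two_bytes.hex(), "->", combined)
# 128 -> c280 -> 49792
# 150 -> c296 -> 49814
```

If that is what is happening, the fix would be to take the numeric value of the original single byte before any character-set conversion occurs, rather than after.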
Has anyone had any similar experience or thoughts?
Regards,
IanG.