
NLS Setting - Difference between Server job and Parallel job

Posted: Wed Apr 22, 2009 10:27 am
by dsuser_cai
Hi

I'm in the process of converting server jobs to parallel jobs. In the server job we have a simple design (ODBC source ---> Transformer ---> Oracle target) with no transformations in between; it's just a direct pull. I have a column in the source and in the target (say COL1 - VARCHAR2(20)). When I checked the NLS settings in Oracle, the source has US7ASCII and the target has AL32UTF8. What happens is that when I run the parallel job, COL1 contains some special characters that do not have a proper character set mapping, or they occupy more bytes/space, and the job aborts with the following error message:

sqlcode is: -12899
esql complaint: ORA-12899: value too large for column "schema"."table_name"."COL1" (actual: 37, maximum: 20)

But when I run the server job it doesn't abort. Can anybody help me fix this?
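
To see up front which rows would blow past the 20-byte limit once the data is converted to the target character set, a query along these lines can be run against the source table first (a sketch only; the table name, column name and the AL32UTF8 target character set are assumptions based on the description above):

-- rows whose COL1 would exceed 20 bytes after conversion to AL32UTF8
SELECT COL1, LENGTHB(CONVERT(COL1, 'AL32UTF8', 'US7ASCII')) AS bytes_in_target
FROM table_name
WHERE LENGTHB(CONVERT(COL1, 'AL32UTF8', 'US7ASCII')) > 20;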

NLS setting in server job: NLS_LANG=AMERICAN_AMERICA.US7ASCII

NLS setting in parallel job: NLS_LANG=AMERICAN_AMERICA.US7ASCII
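
For reference, the database character set on each side can be confirmed with a standard dictionary query in Oracle, independent of the NLS_LANG client settings above:

SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';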

Posted: Wed Apr 22, 2009 1:48 pm
by dsuser_cai
Hi

The issue is resolved now.

The source database has an NLS setting of US7ASCII and the target DB has UTF8, and the column width is only 3 in both DBs, so I was not able to insert the data. Using the following conversion function fixed this:

SELECT CONVERT(Col1, 'US7ASCII', 'UTF8') FROM table_name
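
For the record, Oracle's CONVERT() takes the destination character set as its second argument and the source character set as its third, and characters that cannot be represented in the destination set are replaced. To sanity-check what the conversion does to the data, a quick comparison of byte lengths can help (a sketch only; the table and column names are assumed from the posts above):

SELECT COL1,
       LENGTHB(COL1) AS original_bytes,
       LENGTHB(CONVERT(COL1, 'US7ASCII', 'UTF8')) AS converted_bytes
FROM table_name;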

Posted: Wed Apr 22, 2009 2:41 pm
by ray.wurlod
Note that this CONVERT() function is SQL, not the DataStage function.

And, in passing, NLS uses a completely different mechanism in parallel jobs from that used in server jobs.