teradata conversion warning

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

madannitjam
Participant
Posts: 7
Joined: Mon Feb 09, 2009 11:25 pm
Location: India

teradata conversion warning

Post by madannitjam »

Hi all,

I am getting the warning below:

CDC_Order: When checking operator: On input data set 0:
When binding input interface field "ord_id" to field "ord_id": Implicit conversion from source type "ustring[max=40]" to result type "string[max=40]": Converting ustring to string using codepage ASCL_MS1252.

Please help.
Madan
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Finding tera_cs.txt may help this one also. However, you have ustring on one link and string on another link; DataStage is alerting you to the fact of the metadata mismatch. Remove the metadata mismatch and this particular alert will magically vanish!
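For example (a hypothetical sketch in Orchestrate schema notation — your column name and length will differ), the warning arises when the two links disagree like this:

```
// Source link carries Unicode data:
record ( ord_id: ustring[max=40]; )

// Target link expects a non-Unicode string -- triggers the
// implicit ustring-to-string conversion warning:
record ( ord_id: string[max=40]; )
```

Making both links use the same type (ustring on both sides, which in the Designer means setting Extended = Unicode on the VarChar column) removes the conversion.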
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
etlbets
Premium Member
Posts: 25
Joined: Wed Jul 25, 2007 8:51 am

How do you remove the metadata mismatch?

Post by etlbets »

Hi Ray - I have the same problem... getting this implicit conversion warning:

When checking operator: When binding output schema variable "outRec":
When binding output interface field "UPSTRM_SYS_CD" to field "UPSTRM_SYS_CD": Implicit conversion from source type "ustring[max=4]" to result type "string[max=4]":
Converting ustring to string using codepage ASCL_ISO8859-1.

How would I remove the metadata mismatch and make it magically vanish?
ray.wurlod wrote:Finding tera_cs.txt may help this one also. However, you have ustring on one link and string on another link; DataStage is alerting you to the fact of the metadata mismatch. Remove the metadata mismatch and this particular alert will magically vanish!
etlbets
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

Assuming you are outputting to an NLS-compliant source and the map is set correctly for the target, then you should use NCHAR or NVARCHAR (the NLS versions of CHAR and VARCHAR) datatypes. This will keep the string in "NLS-mode" throughout the job, eliminating the problem.
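(Roughly, and assuming default NLS settings, the SQL types map to the parallel engine's internal types like this — the N* types keep the data as ustring end to end:)

```
Char(n)      -> string[n]
VarChar(n)   -> string[max=n]
NChar(n)     -> ustring[n]        // Extended = Unicode
NVarChar(n)  -> ustring[max=n]    // Extended = Unicode
```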
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
etlbets
Premium Member
Posts: 25
Joined: Wed Jul 25, 2007 8:51 am

Post by etlbets »

Thanks for the info. Is there a way to set the NLS map correctly at the job or stage level, or do I need to change all the columns to NVARCHAR in the transformer?
asorrell wrote:Assuming you are outputting to an NLS-compliant source and the map is set correctly for the target, then you should use NCHAR or NVARCHAR (the NLS versions of CHAR and VARCHAR) datatypes. This will keep the string in "NLS-mode" throughout the job, eliminating the problem.
etlbets
etlbets
Premium Member
Posts: 25
Joined: Wed Jul 25, 2007 8:51 am

Post by etlbets »

Thanks for the help. Going from VARCHAR to NVARCHAR eliminated 99% of my warnings, but I'm still seeing this one, which doesn't make sense because both input and output are defined as Nullable = No.

When checking operator: When binding output schema variable "outRec": When binding output interface field "MAP_SUR_KEY" to field "MAP_SUR_KEY": Converting a nullable source to a non-nullable result;
a fatal runtime error could occur; use the modify operator to
specify a value to which the null should be converted.
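(As a hypothetical illustration of the warning's own suggestion — a Modify stage Specification that replaces nulls with a sentinel before the non-nullable output; -1 is an assumed placeholder value, pick one safe for MAP_SUR_KEY:)

```
MAP_SUR_KEY = handle_null(MAP_SUR_KEY, -1)
```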

Any ideas?

Thanks
asorrell wrote:Assuming you are outputting to an NLS-compliant source and the map is set correctly for the target, then you should use NCHAR or NVARCHAR (the NLS versions of CHAR and VARCHAR) datatypes. This will keep the string in "NLS-mode" throughout the job, eliminating the problem.
etlbets