U_TRUNCATED_CHAR_FOUND encountered
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 54
- Joined: Thu Oct 18, 2007 4:20 am
- Location: Chennai
Hi,

While loading a table (DS -> CPY -> ORA) I encounter this fatal error. I have set the Default Map for Stages to UTF-8 in the NLS tab of Job Properties.

APT_CombinedOperatorController,0: U_TRUNCATED_CHAR_FOUND encountered.

Please help me with this.
Thanks in advance
Regards
LakshmiNarayanan
Thanks for your reply, Craig.

I have searched the previous posts and tried their suggestion: set the APT_disable_combinational_operator parameter to false, run the job, and then rebuild the indexes on the target table. The same error still persists.

I have done many searches on this error, but found no clue. Please help; I have been stuck with this for two days.

Thanks,
Regards
LakshmiNarayanan
You should add $APT_DISABLE_COMBINATION and set it to True before running the job again. The error will 'persist' but will tell you where (what stage) it is actually coming from, which can help with the diagnosis. Do that and repost the error.
P.S. No reason to be 'stuck for two days'; hopefully you've also contacted your official support provider in the meantime. Assuming you have one, your company is paying them good money, so make them earn it once in a while.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Does your Oracle load have many columns? It would be best to narrow the problem down to a single column, and then to the row causing the error.
Hi ArndW & chulett,

Yes, there are around 120 columns in the Oracle load. I have narrowed it down to 6 columns which cause this error; those columns are all VARCHAR2(50).

Example:
LAST_NAME (column name, VARCHAR2(50)):
"Объяснение проведенных работ проведенных работa_2"
Length: 49

While loading this data into the target table using DataStage, the length exceeds 50, so we are unable to perform the load properly; it reports that the value is too large. But when we calculate the length of this column using Oracle, it shows only 49. We only load and append in this loading process.

I have also set UTF-8 as the NLS map in Job Properties, since my Oracle character set is AL32UTF8. I think it is because of this kind of multi-language data that I am unable to load it; if I increase the length to VARCHAR2(100) I am able to load the data and no error occurs, but I feel this shouldn't be the solution.
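For what it's worth, the 49-vs-50 mismatch above can be reproduced outside DataStage. Oracle's LENGTH() counts characters, but each Cyrillic letter occupies two bytes in UTF-8, so the byte length of this value is well above 50. A minimal Python sketch, using the example value from this post:

```python
# Character count vs UTF-8 byte count for the example LAST_NAME value.
# Oracle's LENGTH() reports characters; a VARCHAR2(50) column created
# with BYTE length semantics enforces the *byte* length instead.
last_name = "Объяснение проведенных работ проведенных работa_2"

char_len = len(last_name)                  # what Oracle LENGTH() reports
byte_len = len(last_name.encode("utf-8"))  # what a 50-BYTE column enforces

print(char_len)  # 49
print(byte_len)  # well over 50, since Cyrillic letters are 2 bytes each
```

So the value "fits" when counted in characters but overflows when counted in bytes, which matches the too-large error seen only on multi-byte rows.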
Thanks,
Regards
LakshmiNarayanan
Hi,

I think it is the Russian language (Cyrillic). I also tried changing the NLS charset in Job Properties to Russian according to the link below, but it is not working:
http://www.w3.org/International/O-charset-lang.html

Now I am sure it is Russian. In my Oracle, NLS_CHARACTERSET is AL32UTF8. If I try the UTF8 charset instead, it still does not work.

Thanks,
Regards
LakshmiNarayanan
thanush9sep wrote: i think due to this type of multilanguage character im unable to load it, but if i increase the length to varchar2(100) im able to load these data and no error occurs, but i feel this shouldn't be the solution.

To me, that's a perfectly valid solution. Is your VARCHAR2(50) field actually defined as 50 BYTES or 50 CHARACTERS?
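If the column turns out to use byte semantics, that would explain the whole thread: the value fits as 49 characters but not as UTF-8 bytes. A hedged Python sketch of the two Oracle length-semantics modes (the fits_varchar2 helper is purely illustrative, not an Oracle API):

```python
# Illustrative simulation of Oracle VARCHAR2 length semantics (not an
# Oracle API): a VARCHAR2(50 BYTE) column checks the encoded byte
# length, while a VARCHAR2(50 CHAR) column checks the character count.
def fits_varchar2(value: str, limit: int, semantics: str = "BYTE") -> bool:
    """Return True if `value` would fit in VARCHAR2(limit <semantics>)."""
    if semantics == "BYTE":
        # AL32UTF8 database: byte length of the UTF-8 encoding
        return len(value.encode("utf-8")) <= limit
    if semantics == "CHAR":
        return len(value) <= limit
    raise ValueError("semantics must be 'BYTE' or 'CHAR'")

last_name = "Объяснение проведенных работ проведенных работa_2"

print(fits_varchar2(last_name, 50, "BYTE"))  # False: too many bytes
print(fits_varchar2(last_name, 50, "CHAR"))  # True: only 49 characters
```

In Oracle itself, the usual fix would be declaring the column as VARCHAR2(50 CHAR) (or setting NLS_LENGTH_SEMANTICS=CHAR) rather than widening it to VARCHAR2(100), which only papers over the byte/character mismatch.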
-craig
"You can never have too many knives" -- Logan Nine Fingers