Hi,
I am trying to load data into Oracle using the DRS stage in Upsert mode (update then insert). While loading, rows get rejected with a warning like "Inserted value too large for column" on IPC_out. I have compared the columns and the data type sizes and both are in sync, yet I am still getting the error. When I load to a sequential file instead of the DRS stage, it loads successfully.
Please give me any suggestion to rectify the problem.
Thanks in advance,
Manu
DRS stage: Inserted value too large for column, row rejected
Moderators: chulett, rschirm, roy
Re: DRS stage: Inserted value too large for column, row rejected
This is most likely lossy data at the source that has been compressed.
Is your source database's NLS character set 7-bit?
The warning on the IPC_out link should give the row number that is being rejected.
Do a SQL SELECT on the source DB using the SQL from the Director log for the problem job, and locate the row number given by the error message. Look at the data in each field; you will most likely find an upside-down question mark or other characters that look like hieroglyphics or wingdings. This is usually caused by someone doing data entry on a keyboard set up for international characters such as Latin accents. The GUI accepts the data, and then the target DB (your source) compresses the data to fit the field. The ETL then processes the data without a problem, but the target DB reads it as "too large". I have no way around it other than what I stated above: cleansing the source.
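To see why a value that "fits" by character count can still be rejected as too large, here is a minimal Python sketch (not DataStage code; the function names and the byte limit are hypothetical). It mimics a column whose length is enforced in bytes under a multibyte encoding: accented or special characters like the upside-down question mark take two bytes each in UTF-8, so a value can pass a character-length comparison and still overflow the column.

```python
# Hypothetical illustration: a column limit enforced in BYTES can reject a
# value whose CHARACTER count fits, once multibyte characters are involved.

def byte_len(value: str, encoding: str = "utf-8") -> int:
    """Length of the value as a byte-semantics column would store it."""
    return len(value.encode(encoding))

def find_oversized(rows, limits, encoding="utf-8"):
    """Return (row_number, field, char_len, byte_len) for every field whose
    byte length exceeds the column limit even though its char length fits."""
    bad = []
    for i, row in enumerate(rows, start=1):
        for field, value in row.items():
            limit = limits[field]
            if len(value) <= limit < byte_len(value, encoding):
                bad.append((i, field, len(value), byte_len(value, encoding)))
    return bad

rows = [
    {"NAME": "Muller"},       # plain ASCII: 6 chars, 6 bytes -> fits
    {"NAME": "\u00bf" * 5},   # five upside-down question marks: 5 chars, 10 bytes
]
limits = {"NAME": 8}          # hypothetical 8-byte column limit
print(find_oversized(rows, limits))  # row 2 fits by chars but not by bytes
```

Running the same kind of check against the rejected row numbers from the Director log is one way to confirm whether stray international characters are the cause.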
And it could simply be actual data that is too large when compared to the target field, something that isn't enforced until it goes to the database. Is your message from DataStage or from the target database? The former means the job noticed the data was larger than the metadata and didn't even send it to the database, whereas the latter means the job thought it was fine but the database didn't.
-craig
"You can never have too many knives" -- Logan Nine Fingers
This is why it's so important to post the exact error/warning message, rather than "something like". Without the exact error message we're only guessing at best.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.