Inserted value too large for column, row rejected.

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

The more likely cause is that your data contains more characters than the table allows, e.g. you have a CHAR(10) definition but are inserting 11 characters into that column. DataStage Server will let you get away with that, but the database won't and issues an error.
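To pin down which rows are too long before the database sees them, one option is a quick length check in the Transformer, either as a constraint or via a small server routine. The sketch below is only illustrative; the routine idea and the argument names InValue and MaxLen are assumptions, not something taken from this job.

* Sketch of a server routine a Transformer could call to flag values that
* exceed the target column's declared length (argument names are assumed).
If Len(InValue) > MaxLen Then
   Ans = @TRUE   ;* value would be rejected by the database
End Else
   Ans = @FALSE
End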
taazeez
Participant
Posts: 7
Joined: Fri Jan 19, 2007 9:16 am

Post by taazeez »

I checked the length of the field, and it is still within the normal length range for the field. The record could not pass through the DataStage Transformer stage being used. I have tried removing some of the characters in the field, but I am still getting the same problem.

The warning message is from DataStage and not Oracle.

I also compared the length of the field in SQL Server and Oracle to be sure that there is no mismatch anywhere.


Tai
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Re: Inserted value too large for column, row rejected.

Post by DeepakCorning »

A simple thing to do would be to write this record to a Sequential File and then view the data in it. I am pretty sure you will catch the problem then and there.
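If you go that route, it can also help to expose the field's length as an extra output column, and optionally to constrain the debug link so only over-length rows land in the file. The link name, column name and limit below are placeholders, just a sketch:

* Extra output column derivation (assumed names InLink and SuspectCol):
Len(InLink.SuspectCol)
* Optional constraint on the debug link, keeping only over-length rows
* (the limit of 255 is an assumed example):
Len(InLink.SuspectCol) > 255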
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

taazeez wrote: ...The record could not pass through the DataStage Transformer stage being used... The warning message is from DataStage and not Oracle...
DataStage Server transform stages do not limit the length (or even contents) of columns; the error message is being generated when DS tries to output to SQL Server. As DeepakCorning has recommended, if you redirect the output to a sequential file you will not only be able to see the actual contents of the column(s) in question but the error message will also go away.

I'm a bit confused: you mention both "SQL Server 2005" and "Oracle" in your original post. Which one are you using?

Also, what is the datatype for the column in question, both in DS and in the database?
taazeez
Participant
Posts: 7
Joined: Fri Jan 19, 2007 9:16 am

Post by taazeez »

I am extracting records from SQL Server and loading them into Oracle.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Then it is the Oracle definition for the column that you want to check. Is it different from the SQL Server one?
solaik
Participant
Posts: 12
Joined: Sun Jan 23, 2005 10:33 pm

Re: Inserted value too large for column, row rejected.

Post by solaik »

Tai,

Change that column's data type in the DS input and output stages to VARBINARY.

Thanks
Solai.K
taazeez
Participant
Posts: 7
Joined: Fri Jan 19, 2007 9:16 am

Post by taazeez »

I have tried redirecting the job to a Sequential File; all the records got loaded, but DataStage still gave a warning message on the particular record as follows:


nls_map_buffer_out() - NLS mapping error.

The sizes of the two fields causing the problem are 1024 characters and 255 characters.
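An nls_map_buffer_out() warning generally points at character-set mapping rather than raw length: a character that the output NLS map cannot represent (or that expands when mapped) can trigger it even though the character count fits the column. One rough way to locate such rows is a routine that counts characters above 7-bit ASCII; the sketch below and the argument name InValue are assumptions, not part of this thread.

* Sketch: count characters above 7-bit ASCII in a value, to spot rows that
* are likely to upset the output NLS map (argument name InValue is assumed).
NonAscii = 0
For i = 1 To Len(InValue)
   If Seq(InValue[i, 1]) > 127 Then
      NonAscii = NonAscii + 1
   End
Next i
Ans = NonAscii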
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Great. Thanks for sharing that with us. Now you can mark your post as resolved.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.