
Posted: Fri Jan 19, 2007 10:54 am
by ArndW
The more likely cause is that your data contains more characters than the table allows; i.e. you have a char(10) definition but are inserting 11 characters into that column. DataStage server will let you get away with that, but the database won't and issues an error.
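
For illustration, one quick way to confirm this is to check the incoming values against the target length outside the job. A minimal Python sketch (the column length and sample values are invented for illustration, not taken from this job):

# Flag values that will not fit the target column.
TARGET_LEN = 10  # e.g. the target column is defined as CHAR(10)

values = ["ABCDEFGHIJ", "ABCDEFGHIJK"]  # 10 characters fit, 11 do not

for value in values:
    if len(value) > TARGET_LEN:
        print(f"Too long ({len(value)} > {TARGET_LEN}): {value!r}")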

Posted: Fri Jan 19, 2007 11:06 am
by taazeez
I checked the length of the field, and it is still within the normal length range for the field. The record could not pass through the DataStage Transformer stage being used. I have tried removing some of the characters from the field, but I am still getting the same problem.

The warning message is from DataStage and not Oracle.

I also compared the length of the field in SQL Server and Oracle to be sure that there is no mismatch anywhere.


Tai

Re: Inserted value too large for column, row rejected.

Posted: Fri Jan 19, 2007 11:20 am
by DeepakCorning
A simple thing to do would be to write this record to a Sequential File and then view the data in it. I am pretty sure you will catch the problem then and there.
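
Once the rows are in a flat file, a small script can report the longest value seen in each column. A minimal Python sketch, assuming a pipe-delimited dump at an invented path:

import csv

SEQ_FILE = "/tmp/reject_check.txt"  # hypothetical path of the sequential file dump
DELIM = "|"                         # hypothetical field delimiter

max_len = {}
with open(SEQ_FILE, newline="") as fh:
    for row in csv.reader(fh, delimiter=DELIM):
        for col, value in enumerate(row):
            if len(value) > max_len.get(col, 0):
                max_len[col] = len(value)

for col, length in sorted(max_len.items()):
    print(f"column {col}: longest value is {length} characters")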

Posted: Fri Jan 19, 2007 11:39 am
by ArndW
taazeez wrote: ...The record could not pass through the DataStage Transformer stage being used... The warning message is from DataStage and not Oracle...
DataStage Server transform stages do not limit the length (or even contents) of columns; the error message is being generated when DS tries to output to SQL Server. As DeepakCorning has recommended, if you redirect the output to a sequential file you will not only be able to see the actual contents of the column(s) in question but the error message will also go away.

I'm a bit confused: you mention both "SQL Server 2005" and "Oracle" in your original post - which one are you using?

Also, what is the datatype of the column in question, both in DS and in the database?

Posted: Fri Jan 19, 2007 12:10 pm
by taazeez
I am extracting records from SQL Server and loading them into Oracle.

Posted: Fri Jan 19, 2007 12:31 pm
by ArndW
Then the Oracle definition for the column is the one you want to check. Is it different from the SQL Server one?
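
If it helps, the two definitions can be pulled from each database's catalogue and compared side by side. A minimal Python sketch (the connection details and table name are placeholders, and it assumes the pyodbc and cx_Oracle drivers are available):

import cx_Oracle  # Oracle target
import pyodbc     # SQL Server source

mssql = pyodbc.connect("DSN=SourceSQLServer;UID=user;PWD=pass")  # placeholder DSN
oracle = cx_Oracle.connect("user", "pass", "orahost/orcl")       # placeholder connect string

src_sql = ("SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH "
           "FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'MY_TABLE'")
tgt_sql = ("SELECT COLUMN_NAME, DATA_TYPE, CHAR_LENGTH "
           "FROM USER_TAB_COLUMNS WHERE TABLE_NAME = 'MY_TABLE'")

# Oracle stores names in upper case, so normalise before comparing.
src = {name.upper(): (dtype, length) for name, dtype, length in mssql.cursor().execute(src_sql)}
tgt = {name.upper(): (dtype, length) for name, dtype, length in oracle.cursor().execute(tgt_sql)}

for name, (dtype, length) in src.items():
    if name in tgt and tgt[name][1] != length:
        print(f"{name}: SQL Server {dtype}({length}) vs Oracle {tgt[name][0]}({tgt[name][1]})")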

Re: Inserted value too large for column, row rejected.

Posted: Fri Jan 19, 2007 1:03 pm
by solaik
Tai,

Change that column's data type in the DS input and output stages to VARBINARY.

Thanks
Solai.K

Posted: Fri Jan 19, 2007 1:20 pm
by taazeez
I redirected the job to a sequential file and all the records got loaded, but DataStage still gave a warning message on the particular record, as follows:


nls_map_buffer_out() - NLS mapping error.

The sizes of the two fields causing the problem are 1024 characters and 255 characters.
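
That NLS mapping warning suggests a character-set conversion issue: a value that fits its character limit in SQL Server may still overflow an Oracle column whose length is defined in bytes once it is mapped to a multi-byte character set. A minimal Python sketch of the effect (the sample string and the 255 limit are only for illustration):

field = "é" * 255                       # 255 characters

char_len = len(field)                   # 255 - looks fine against a 255-character limit
byte_len = len(field.encode("utf-8"))   # 510 - too large for a VARCHAR2(255 BYTE) column

print(f"characters: {char_len}, bytes after conversion: {byte_len}")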

Posted: Fri Jan 19, 2007 3:03 pm
by DSguru2B
Great. Thanks for sharing that with us. Now you can mark your post as resolved.