
Column size is too large

Posted: Mon Mar 21, 2011 7:28 am
by vitumati
Hi Friends,

I have a column "Comment" in my source, and I'm receiving approximately 10,000 characters for that column. While reading the data from a sequential file, I get the error below.


<SF_Profiledb,0> Error reading on import.
<SF_Profiledb,0> Consumed more than 100000 bytes looking for record delimiter; aborting
<SF_Profiledb,0> Import error at record 0.

Can you please help?
Do I need to set any environment variable to increase the column size limit?

Can DataStage support 10,000 characters in one column?

Posted: Mon Mar 21, 2011 7:57 am
by chulett
The error actually says it consumed more than 100,000 bytes without finding a record delimiter, not that anything was 'too large'. Check your metadata.

Posted: Mon Mar 21, 2011 4:52 pm
by ray.wurlod
Which, in turn, means that you have probably specified the wrong delimiter. Did you import the table definition for the file and load that into your job design (including Load on the Format tab)?
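A quick way to confirm what the advice above suggests is to inspect the raw bytes of the file and count which record terminator it actually contains. The sketch below is not from the thread; it is a minimal, hedged example in Python that tallies DOS (`\r\n`), UNIX (`\n`), and old Mac (`\r`) terminators in a sample of the file, so you can compare against the delimiter specified on the Sequential File stage's Format tab. The file path is whatever your source file is; nothing here is DataStage-specific.

```python
def detect_record_delimiter(path, sample_size=200_000):
    """Count candidate record terminators in the first sample_size bytes.

    Returns a dict mapping a terminator label to its occurrence count.
    Plain \n and \r counts are adjusted so \r\n pairs are not
    double-counted as both a UNIX and a Mac terminator.
    """
    with open(path, "rb") as f:
        sample = f.read(sample_size)
    crlf = sample.count(b"\r\n")
    return {
        r"\r\n (DOS)": crlf,
        r"\n (UNIX)": sample.count(b"\n") - crlf,
        r"\r (Mac)": sample.count(b"\r") - crlf,
    }
```

If the dominant terminator reported here differs from the record delimiter set in the stage (for example, the file uses `\r\n` but the stage expects `\n`, or the file has no terminator at all within the first 100,000 bytes), that would explain the "Consumed more than 100000 bytes looking for record delimiter" abort.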