
Error when using CLOB data type in Job

Posted: Mon Mar 12, 2007 11:07 pm
by subhashini
I am facing a problem when using the CLOB datatype for one column in a parallel job. The column datatype is CLOB in both the source and target tables.

When I import the metadata for this column, DataStage converts it to LongVarChar. When I run the job, I get the following error:



main_program: Internal Error: (colDesc):orautils.C: 1714: getColDesc failed.
Traceback: [6] APT_FatalPath::msgAssertion(0xfdcbacff, 0xffbfbe38, 0xfdcbad07, 0x6b2, 0x0, 0xccc70), at 0xff0dad18
[7] APT_OraUtils::querySchema(0x0, 0xfdcdd2e0, 0xffbfbdc8, 0xffbfbf18, 0xffbfbf24, 0xfdcd5498), at 0xfdc23dc0
[8] APT_OraReadOperator::describeOperator(0xaec58, 0x0, 0x167cb0, 0xffbfc470, 0xfdcdcce0, 0xfdc05ae4), at 0xfdbfe788
[9] APT_OperatorRep::wrapDescribeOperator(0x140458, 0x0, 0x1, 0x7088f4, 0xffbfc6bc, 0xb4), at 0xfe70b354
[10] APT_OperatorRep::check1a(0x140458, 0x1800, 0xfed6db08, 0xff12d168, 0xfee138b8, 0xfee4e50c), at 0xfe70b8bc
[11] APT_StepRep::sequenceAndCheck1Operators(0x127ea0, 0x0, 0x0, 0xfed79076, 0xfee138b8, 0x0), at 0xfe788510
[12] APT_StepRep::check(0x127ea0, 0x68c264, 0x0, 0x0, 0xfee138b8, 0x127eac), at 0xfe787858
[13] APT_Step::check(0xcbdb0, 0xffbfce80, 0xffbfcdc4, 0xff134488, 0x0, 0xfee138b8), at 0xfe77b964
[14] APT_OSL::createAndCheckStep(0x1305f0, 0xffbfe218, 0x5d530, 0xffbfce48, 0xffbfce80, 0x3c78c), at 0xff220d90
[15] APT_ORCHESTRATE_main(0x1, 0x2b8f2, 0xffbfe350, 0x4, 0xffbfe4b8, 0xffbfe4d4), at 0x1fcd4


Please help me resolve this problem.

Thanks in advance.

Posted: Tue Mar 13, 2007 2:49 am
by kumar_s
As per the documentation, the supported LOB types are CLOB and BLOB.

Posted: Tue Mar 13, 2007 2:52 am
by kumar_s
And CLOB will be mapped to SQL_LONGVARCHAR. The other restriction is a maximum size of 2GB, and the length needs to be set to 4KB in the Columns tab.
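Given that 4KB length restriction, it may be worth checking whether the actual CLOB values would fit. A minimal check against the source table (the table and column names below are hypothetical, substitute your own):

```sql
-- Hypothetical names: replace MY_TABLE / MY_CLOB_COL with the real ones.
-- LENGTH on a CLOB returns the length in characters.
SELECT MAX(LENGTH(MY_CLOB_COL)) AS max_clob_len
FROM   MY_TABLE;
```

Any row longer than 4000 would not fit in a LongVarChar(4000) column definition.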

Posted: Wed Mar 14, 2007 9:39 pm
by subhashini
When I imported the metadata, CLOB was converted to LongVarChar with a size of 4000 bytes. But the source table doesn't have any data; I developed the job and tried to run it without any data.

Posted: Thu Mar 15, 2007 6:53 am
by chulett
Any idea what the max data size is you'll be pulling from the CLOB for this job? Is your data source 10g by any chance?

Posted: Thu Mar 15, 2007 10:16 pm
by subhashini
We don't have any source data yet, but the source system is not 10g, it is 9i.

Posted: Fri Mar 16, 2007 2:23 am
by kumar_s
So you are getting this error even when no data flows through the job.
How did you import the metadata?
Internal Error: (colDesc):orautils.C: 1714: getColDesc failed.
Try the orchbuild utility to import the metadata and check if that helps by any chance.

Posted: Fri Mar 16, 2007 6:46 am
by chulett
subhashini wrote: We don't have any source data yet, but the source system is not 10g, it is 9i.
Too bad, there's a lovely little package in 10g, the DBMS_LOB package, that makes working with those field types much easier. Which is why I asked.
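For what it's worth, one common workaround at the SQL level is to have Oracle cast the CLOB down to a plain VARCHAR2 in the SELECT itself, so DataStage only ever sees a string type. A hedged sketch (the table and column names are hypothetical, and this of course truncates anything beyond 4000 characters):

```sql
-- DBMS_LOB.SUBSTR(lob, amount, offset) returns up to 4000 chars as VARCHAR2.
SELECT DBMS_LOB.SUBSTR(MY_CLOB_COL, 4000, 1) AS clob_as_varchar
FROM   MY_TABLE;
```

Such a query could then be supplied as user-defined SQL in the source stage instead of the generated SELECT.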

Posted: Sat Mar 17, 2007 2:38 am
by kumar_s
The option Craig suggested will be helpful for converting the data types from Oracle and loading back into Oracle. I'm not sure it will help much in your case if you cannot access the data from Oracle through DataStage, but perhaps it can be used to convert the data to another format that DataStage can access more easily.
Have you tried running with data in that field? Does that field involve any transformation?