Reading 2GB CLOB Data from Oracle table through DataStage

Posted: Thu May 12, 2011 7:16 am
by sur_ds
I need to read 2 GB (Length = 2000000000) of data from a CLOB column in an Oracle table into DataStage. I am using the LongVarchar option, and any length above 479999 results in an abort with 'Abnormal termination of job' in the Director log.
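
For comparison, outside DataStage a read like this works when the CLOB is fetched as a locator and streamed in pieces instead of as a single 2 GB buffer. A rough sketch using the cx_Oracle driver; the connection details, table, and column names are made-up placeholders:

import cx_Oracle

conn = cx_Oracle.connect("ds_user", "secret", "dbhost/orcl")
cur = conn.cursor()

cur.execute("SELECT doc_body FROM src_docs WHERE doc_id = :1", [42])
lob, = cur.fetchone()        # a CLOB locator, not the full 2 GB value

chunk = 1024 * 1024          # stream about 1 MB at a time
size = lob.size()            # CLOB length in characters
offset = 1                   # Oracle LOB offsets are 1-based
while offset <= size:
    piece = lob.read(offset, chunk)
    offset += len(piece)
    # ... hand each piece to downstream processing ...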

Please help.

Thanks,
Surajit

Posted: Thu May 12, 2011 7:22 am
by chulett
You do know CLOBs are not supported, yes? However, that doesn't mean they can't be used, although 2GB might be a wee bit on the 'too big' side. Can you leave the length empty? I seem to recall that as one trick. Also make sure you use an Array Size of 1.

Posted: Thu May 12, 2011 7:43 am
by sur_ds
Thanks Craig!

I am using a maximum data size of 167384 bytes for testing. With the approach you suggested, DataStage is reading values from the CLOB column of the Oracle table. These rows need to be inserted into a target Oracle table which also has a CLOB column.

In other words, I need to transfer CLOB data from one table to another. The job reads the data but is not able to insert it into the target table. The error in the Director log is 'Inserted value too large for column'.

Please suggest the changes I would need to make to the target OCI stage for correct data insertion. In this stage, too, I have left the column length empty (NULL). Could this be the cause of the error?
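
For reference, the equivalent insert outside DataStage binds the value explicitly as a CLOB rather than as a length-limited string. Again a cx_Oracle sketch with placeholder names, not the actual job:

import cx_Oracle

conn = cx_Oracle.connect("ds_user", "secret", "dbhost/orcl")
cur = conn.cursor()

big_text = "x" * 10000000             # stand-in for a real document
clob_var = cur.var(cx_Oracle.CLOB)    # bind as CLOB, not VARCHAR2
clob_var.setvalue(0, big_text)

cur.execute("INSERT INTO tgt_docs (doc_id, doc_body) VALUES (:1, :2)",
            [42, clob_var])
conn.commit()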

Thanks,
Surajit

Posted: Thu May 12, 2011 7:54 am
by sur_ds
Also, after specifying the length as 2000000000 in the target stage, the job ran successfully for 10 records. I am now running the job for the complete set of 21000 records.

Please suggest what I should set for the following (with regard to any considerations for CLOB):
1. Rows per transaction
2. Array size

I have set the above as:
1. Rows per transaction = 5000
2. Array size = 5000

Thanks,
Surajit

Posted: Thu May 12, 2011 7:58 am
by chulett
chulett wrote: Also make sure you use an Array Size of 1.
What is 5000 x 2000000000? Stick with 1.
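
For scale (illustrative arithmetic only), the array buffer those settings imply is roughly the array size multiplied by the declared column length:

array_size = 5000
declared_length = 2000000000          # bytes per value at the declared length
print(array_size * declared_length)   # 10000000000000 bytes -- about 10 TB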

Posted: Thu May 12, 2011 8:16 am
by sur_ds
Thanks for your suggestions. I am preparing 2GB of test data and shall test and share the results.

Thanks,
Surajit

Posted: Wed May 25, 2011 10:56 pm
by sarjushah
I used ODBC ---> ODBC stages for the CLOB transfer.

Posted: Fri May 27, 2011 11:20 am
by arunkumarmm
We define the column as LongVarChar with a length of 999999999 to read and write the CLOB data, and it's working fine.