Reading 2GB CLOB Data from Oracle table through DataStage

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

sur_ds
Participant
Posts: 33
Joined: Mon Jun 11, 2007 8:47 am
Location: kolkata,India

Reading 2GB CLOB Data from Oracle table through DataStage

Post by sur_ds »

I need to read 2GB (length = 2000000000) of data from a CLOB column in an Oracle table into DataStage. I am using the LongVarchar data type, and any length above 479999 results in an abort, with 'Abnormal termination of job' reported in the Director log.
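
For reference, the actual lengths in the source can be confirmed outside DataStage with something like this (a minimal Python/cx_Oracle sketch; the connect string and the SRC_TABLE/DOC_CLOB names are placeholders, not my real ones):

Code:

# Minimal sketch: report the largest CLOB in the source table so the
# DataStage column length can be sized realistically.
# SRC_TABLE, DOC_CLOB and the connect string are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("user/password@orcl")
cur = conn.cursor()
# DBMS_LOB.GETLENGTH returns the length in characters for a CLOB
cur.execute("SELECT MAX(DBMS_LOB.GETLENGTH(DOC_CLOB)) FROM SRC_TABLE")
print("Largest CLOB length:", cur.fetchone()[0])
cur.close()
conn.close()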

Please help.

Thanks,
Surajit
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You do know they are not supported, yes? However, that doesn't mean they can't be used, although 2GB might be a wee bit on the 'too big' side. Can you leave the length empty? I seem to recall that as one trick. Also make sure you use an Array Size of 1.
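
For comparison outside DataStage, fetching one row at a time and streaming each LOB in chunks (rather than as one 2GB buffer) looks roughly like this Python/cx_Oracle sketch; the connect string and the SRC_TABLE/DOC_CLOB names are placeholders:

Code:

# Sketch: fetch one row per round trip and read each CLOB in chunks,
# the moral equivalent of Array Size = 1 in the OCI stage.
import cx_Oracle

conn = cx_Oracle.connect("user/password@orcl")
cur = conn.cursor()
cur.arraysize = 1  # one row per fetch: only one LOB locator in flight
cur.execute("SELECT DOC_CLOB FROM SRC_TABLE")
for (lob,) in cur:
    size, offset, chunk = lob.size(), 1, 1024 * 1024  # LOB offsets are 1-based
    while offset <= size:
        data = lob.read(offset, chunk)
        offset += len(data)
        # ... process the chunk here instead of holding 2GB in memory
cur.close()
conn.close()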
-craig

"You can never have too many knives" -- Logan Nine Fingers
sur_ds
Participant
Posts: 33
Joined: Mon Jun 11, 2007 8:47 am
Location: kolkata,India

Post by sur_ds »

Thanks Craig!

I am using a maximum data size of 167384 bytes for testing. With the approach you suggested, DataStage is reading values from the CLOB column of the Oracle table. These rows need to be inserted into a target Oracle table which also has a CLOB column.

In other words, I need to transfer CLOB data from one table to another. The job reads the data but is not able to insert it into the target table. The error in the Director's log is 'Inserted value too large for column'.

Please suggest the changes I would need to make to the target OCI stage for correct data insertion. In this stage too I have left the column length empty (NULL). Could this be the cause of the error?
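
For what it's worth, the same insert done outside DataStage binds the value as a CLOB rather than a plain string, which avoids the usual string bind limits (a Python/cx_Oracle sketch; TGT_TABLE, ID, DOC_CLOB and the connect string are placeholders):

Code:

# Sketch: insert a large value into a CLOB column by binding it as a
# CLOB instead of a plain VARCHAR bind, committing one row at a time.
import cx_Oracle

conn = cx_Oracle.connect("user/password@orcl")
cur = conn.cursor()
big_text = "x" * 10_000_000  # stand-in for the real CLOB payload
# Tell the driver the second bind variable is a CLOB, not a limited string
cur.setinputsizes(None, cx_Oracle.CLOB)
cur.execute("INSERT INTO TGT_TABLE (ID, DOC_CLOB) VALUES (:1, :2)",
            (1, big_text))
conn.commit()
cur.close()
conn.close()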

Thanks,
Surajit
sur_ds
Participant
Posts: 33
Joined: Mon Jun 11, 2007 8:47 am
Location: kolkata,India

Post by sur_ds »

Also, with the length in the target stage specified as 2000000000, the job ran successfully for 10 records. I am now running the job for the complete set of 21000 records.

Please advise what I should set for the following (with regard to any CLOB considerations):
1. Rows per transaction
2. Array size

I have set the above as:
1. Rows per transaction = 5000
2. Array size = 5000

Thanks,
Surajit
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

chulett wrote:Also make sure you use an Array Size of 1.
What is 5000 x 2000000000? That's a potential fetch buffer of 10,000,000,000,000 bytes, roughly 10 terabytes. Stick with 1.
-craig

"You can never have too many knives" -- Logan Nine Fingers
sur_ds
Participant
Posts: 33
Joined: Mon Jun 11, 2007 8:47 am
Location: kolkata,India

Post by sur_ds »

Thanks for your suggestions. I am preparing 2GB of test data and shall test and share the results.

Thanks,
Surajit
sarjushah
Premium Member
Posts: 40
Joined: Thu May 12, 2005 3:59 pm

Post by sarjushah »

I used ODBC ---> ODBC stages for the CLOB transfer.
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York

Post by arunkumarmm »

We define the column as LongVarChar with a length of 999999999 to read and write CLOB data, and it works fine.
Arun