Handling CLOB datatype in DataStage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bikan
Premium Member
Posts: 128
Joined: Thu Jun 08, 2006 5:27 am

Handling CLOB datatype in DataStage

Post by bikan »

Hi,

I have a requirement where I have to fetch a column with a CLOB datatype from an Oracle database. I wanted to know whether I can read the column directly in DataStage, or whether I have to use some other method. Please advise.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Resist the requirement. There's nothing sensible that an ETL tool can do with a CLOB.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Have your SQL split the CLOB into n columns of VarChar(2000) each, then use either a Transformer stage or a Column Export stage to concatenate the list of VarChar columns back into one large one.
This method is being used on my current project because we have CLOBs and BLOBs of up to 60 KB (very wordy XML) and no choice but to store them in Oracle. It is very unwieldy and, if there is any way for you to avoid it, you should stick to supported datatypes.
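A minimal sketch of the splitting SELECT, assuming a hypothetical table MY_TABLE with a key column ID and a CLOB column DOC (all names illustrative); DBMS_LOB.SUBSTR takes the amount and then the offset, both in characters for a CLOB:

    SELECT id,
           DBMS_LOB.SUBSTR(doc, 2000, 1)    AS doc_part1,  -- characters 1-2000
           DBMS_LOB.SUBSTR(doc, 2000, 2001) AS doc_part2,  -- characters 2001-4000
           DBMS_LOB.SUBSTR(doc, 2000, 4001) AS doc_part3   -- and so on for n parts
    FROM   my_table;

Each DOC_PARTn can then be read as VarChar(2000) and the parts concatenated back together downstream, e.g. doc_part1 : doc_part2 : doc_part3 in a Transformer derivation.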
Jasti
Participant
Posts: 44
Joined: Sat Apr 14, 2007 6:34 am
Location: Hyderabad, India

Post by Jasti »

Hi,
Try using the LongVarChar datatype in the DataStage job. We are using LongVarChar[4000] to read some CLOB-type data in our project.
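A hedged sketch of how that column can be produced on the Oracle side, reusing the illustrative MY_TABLE and DOC names from the post above; the first 4000 characters come back as a plain string that maps onto LongVarChar[4000]:

    SELECT id,
           DBMS_LOB.SUBSTR(doc, 4000, 1) AS doc_text  -- first 4000 characters as VARCHAR2
    FROM   my_table;

Anything past the first 4000 characters is silently dropped, so this only suits data known to stay under that limit.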
Thanks,
Mohan.A.Jasti.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Note that the 4000 is a byte count. If you are using NLS, 4000 bytes hold fewer than 4000 characters, and rows that exceed the buffer will cause the job to abort.
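A quick way to gauge the character-to-byte ratio before settling on a length, again with the illustrative MY_TABLE and DOC names; the SUBSTR is capped at 1000 characters so the intermediate string stays under SQL's 4000-byte VARCHAR2 limit even with 4-byte characters:

    -- in AL32UTF8 a single character can occupy up to 4 bytes
    SELECT MAX(LENGTH(s))  AS max_chars,
           MAX(LENGTHB(s)) AS max_bytes
    FROM   (SELECT DBMS_LOB.SUBSTR(doc, 1000, 1) AS s FROM my_table);

If max_bytes approaches 4 * max_chars, a buffer sized in characters needs to be correspondingly smaller.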
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Do you have any idea what the max size of the CLOB data is? I've had good luck using the built-in DBMS_LOB package to process ours, but they're not mongo huge like Arnd's were.
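For anyone unsure of the answer to that question, a small sketch using DBMS_LOB.GETLENGTH (table and column names hypothetical, as before) reports the largest CLOB; for a CLOB the length comes back in characters:

    SELECT MAX(DBMS_LOB.GETLENGTH(doc)) AS max_chars
    FROM   my_table;

That number is a reasonable starting point for choosing between the single-column read and the split approach above.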
-craig

"You can never have too many knives" -- Logan Nine Fingers
nasimul
Participant
Posts: 37
Joined: Wed Jan 25, 2006 1:38 am

Post by nasimul »

I have a similar kind of requirement. I need to load a field in Oracle that has a CLOB datatype. While reading from the source, I am using LongVarChar(2147483647), because the length can be up to 2147483647 bytes.

But while loading into the table using upsert in the Oracle Enterprise stage, I get an error with SQLCODE -1480 (ORA-01480: trailing null missing from STR bind value).

While trying to load using the LOAD method, I get an error saying the CLOB datatype is not supported: "main_program: Unsupported SQL type CLOB or NCLOB for column MEMO1."

Can you please suggest a solution?

Thanks,
Nasimul
rameshrr3
Premium Member
Posts: 609
Joined: Mon May 10, 2004 3:32 am
Location: BRENTWOOD, TN

Post by rameshrr3 »

I'm loading a CLOB column as LongVarChar(100000). The source is an XML chunk from MQ Series.