
LongVarChar source and DS job fails to recognize it!!!

Posted: Wed Jun 15, 2005 1:30 pm
by vinaymanchinila
Getting a column with LongVarChar and no length specified, and the target column is the same, but when I select the LongVarChar in DS it fails at runtime, giving me an ORA handle error.
Is there any way I can find the max length of the column, or how should I go about it?
Thanks,

Posted: Wed Jun 15, 2005 3:23 pm
by ray.wurlod
Oracle is complaining that you're specifying LongVarChar(0), since "" is treated by DataStage as 0 in a numeric context.

What was the maximum length (precision) when you imported the table definition from Oracle? Use that value.
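If the imported definition doesn't carry a length, one way to find the longest value actually stored is to ask the database directly. A sketch, assuming the underlying Oracle column is a CLOB; the table and column names here (my_table, clob_col) are placeholders:

```sql
-- Placeholder names: replace my_table / clob_col with the real ones.
-- DBMS_LOB.GETLENGTH returns the length of each CLOB value,
-- so MAX() over it gives the largest value currently in the table.
SELECT MAX(DBMS_LOB.GETLENGTH(clob_col)) AS max_len
  FROM my_table;
```

Note this only reports what is there today, not what might arrive tomorrow.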

Posted: Wed Jun 15, 2005 3:28 pm
by vinaymanchinila
Hi Ray,
When I imported the metadata I had no number in the length column, and I even did a "DESC" on the table; it just says LongVarChar.

Thanks,

Posted: Wed Jun 15, 2005 3:32 pm
by ray.wurlod
Try 4000. If that gives truncation errors, double it until the errors go away.


Posted: Wed Jun 15, 2005 3:40 pm
by vinaymanchinila
I tried with 20,000 and it works, but what if some value tomorrow is greater than 20K and the job fails in production? That's the concern.
Thanks

Posted: Wed Jun 15, 2005 4:44 pm
by amsh76
Someone in operations should know the maximum length possible... there has to be a limit. Why don't you ask them?

Posted: Thu Jun 16, 2005 7:14 am
by vinaymanchinila
Thanks a bunch Craig, so to specify it as 2GB, what should I give as the column length?

Posted: Sun Sep 18, 2005 11:05 am
by chulett
I get the dubious distinction of resurrecting this thread as - now that I actually need to do the same thing - I can't make it work. :lol:

We were able to successfully load a CLOB field from XML into Oracle using the LongVarchar datatype. However, success trying to select the darn thing back out is eluding us!

First attempts to simply select the CLOB into a large LongVarChar field resulted in an ORA-00932: inconsistent datatypes: expected - got CLOB error. Wrapping the CLOB field in a TO_CHAR() during the select got past that error, but results in an Oracle Error - OCI_INVALID_HANDLE error. Can't afford to CAST this as a VARCHAR2, as the contents we need to extract are upwards of 16K, more than the 4K max a VARCHAR2 can deliver. And here we sit.
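For anyone hitting the same 4K wall: one possible workaround, sketched here with placeholder names (my_table, clob_col, pk), is to pull the CLOB out in 4000-byte pieces with DBMS_LOB.SUBSTR, since each call returns at most 4000 bytes in a SQL context, and reassemble the pieces downstream:

```sql
-- Placeholder names throughout; four 4000-byte pieces cover up to 16K.
-- DBMS_LOB.SUBSTR(lob, amount, offset) returns a VARCHAR2 slice of the CLOB.
SELECT pk,
       DBMS_LOB.SUBSTR(clob_col, 4000, 1)     AS piece1,
       DBMS_LOB.SUBSTR(clob_col, 4000, 4001)  AS piece2,
       DBMS_LOB.SUBSTR(clob_col, 4000, 8001)  AS piece3,
       DBMS_LOB.SUBSTR(clob_col, 4000, 12001) AS piece4
  FROM my_table;
```

The pieces can then be concatenated back together in the job, at the cost of fixing an upper bound on the CLOB size up front.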

Anyone had success doing this? Thoughts / tips / tricks?

Gory details: DS 7.0.1 on HP-UX 11.11 using OCI9 and a 9i client to access a 10g database on an AIX box. Array Size is always set to 1. I can successfully select smaller, varchar sized CLOB data elements using a CAST AS VARCHAR2 in the source select. For whatever that is worth.

ps. The whole 10g/AIX aspect of this hasn't been an issue to date for the tons of other jobs we're running. FYI.

Posted: Sun Sep 18, 2005 4:37 pm
by ray.wurlod
If you can tolerate a data type mismatch warning, try VarChar with precision up to 65536.

Posted: Sun Sep 18, 2005 5:41 pm
by chulett
Very interesting... changed LongVarChar to VarChar of 20K in size and (with the to_char function still in place on the select) - it worked! Without any kind of data type mismatch complaint, either. :shock:

Thanks!

Posted: Mon Sep 19, 2005 9:34 am
by chulett
Ok, spoke too soon. It's not "working working", it is however no longer failing. :wink:

The CLOB is being silently truncated to 4K as it makes its way through the job, and we're not really sure why right now. We'll keep messing with this and I'll update the post with anything pertinent we discover.

Of course, any thoughts on the issue - feel free to chime in.

Posted: Mon Sep 19, 2005 11:17 am
by kcbland
If your data has any chance of containing a lot of characters in the CLOB column, you're better off not sending it thru DataStage. Think of it like sending an email with an attached file. Your email may be small, but the attachment can clog the network if it's of sufficient size. The same holds true with CLOBs and BLOBs, they're not worth sending thru the ETL stream.

Your best bet is to route the CLOB/BLOBs around the ETL and bulk-load/merge on the backend. It's brutal to insert a bunch of rows and then come back and update them with the CLOB/BLOB data, but you'll be able to bulk load the CLOB/BLOB data into a work table and then merge it on the backside without suffering thru the ETL.
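The bulk-load-then-merge idea above could look something like this on the database side. A sketch only, with placeholder names: clob_work is the work table the CLOB data was bulk-loaded into, target is the table the ETL inserted the non-LOB rows into, and pk is the shared key.

```sql
-- Placeholder names: target, clob_work, pk, clob_col.
-- After the ETL has inserted the rows and the CLOBs have been
-- bulk-loaded into clob_work, attach the CLOBs in one backend pass.
UPDATE target t
   SET t.clob_col = (SELECT w.clob_col
                       FROM clob_work w
                      WHERE w.pk = t.pk)
 WHERE EXISTS (SELECT 1
                 FROM clob_work w
                WHERE w.pk = t.pk);
```

The EXISTS guard keeps rows without a matching work-table entry from being nulled out.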

Now, if you need to be parsing the CLOB/BLOB data, I feel for you. You're probably going to find that DS will choke on the large string data.

Posted: Mon Sep 19, 2005 12:00 pm
by Bala R
Are you trying to insert the selected CLOB value into another Oracle table downstream? Possibly the OCI_HANDLE_ERROR occurs because the pointer to the LOB locator exists in the session or OCI environment where you did the select, and becomes invalid when moved to another session. If that's the case, maybe you could dump the data to some intermediary (sequential?) file and see if you can get past the select with TO_CHAR.

Just a thought.

Posted: Thu Sep 22, 2005 6:34 pm
by logic
Hi Craig,
Any developments on the CLOB issue? If so, PLEASE update. I am facing a similar situation and have tried all the above solutions, but nothing seems to be working. Stopped thinking logically and tried the trial-and-error path. :(
We were initially using a DRS stage. Then we switched to OCI. When we use LongVarChar we get the "OCI Invalid Handle" error. Using VarChar 4000 we get the "OCI has fetched Truncated Data" error (irrespective of array size). For an Array Size of 1 and VarChar 4000 the row is rejected with the error "Inserted value too large for column". This error disappears on making the array size 1000 and VarChar 502000, but then we get a "not all variables bound" error.
What approach should we take to solve this issue? Is there something we are missing?
THANKS.

Posted: Thu Sep 22, 2005 7:14 pm
by chulett
Are you trying to insert into a CLOB or select the contents of one? And I'm assuming you are only having problems because you are dealing with more than 4K of data in the field, yes?