LongVarChar source and DS job fails to recognize it!!!

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

LongVarChar source and DS job fails to recognize it!!!

Post by vinaymanchinila »

I have a source column defined as LongVarChar with no length specified, and the target column is the same, but when I select LongVarChar in DS the job fails at runtime with an ORA handle error.
Is there any way I can find the max length of the column, or how should I go about this?
Thanks,
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Oracle is complaining that you're specifying LongVarChar(0), since "" is treated by DataStage as 0 in a numeric context.

What was the maximum length (precision) when you imported the table definition from Oracle? Use that value.
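If the underlying Oracle column is actually a CLOB rather than a LONG, one way to check the real maximum stored length is a query along these lines (the table and column names here are just placeholders):

Code: Select all

-- Placeholder names: my_table / big_text_col.
-- DBMS_LOB.GETLENGTH works on CLOB columns; it does not work on LONG columns.
SELECT MAX(DBMS_LOB.GETLENGTH(big_text_col)) AS max_len
  FROM my_table;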
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

Post by vinaymanchinila »

Hi Ray,
When I imported the metadata there was no number in the length column, and I even did a "DESC" on the table; it just says LongVarChar.

Thanks,
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Try 4000. If that gives truncation errors, double it until the errors go away.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

tried with 20,000

Post by vinaymanchinila »

I tried with 20,000 and it works, but what if tomorrow there is some value greater than 20K and the job fails in production? That's the concern.
Thanks
amsh76
Charter Member
Posts: 118
Joined: Wed Mar 10, 2004 10:58 pm

Post by amsh76 »

Someone in operations should know the maximum length possible... there has to be a limit. Why don't you ask them?
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

Post by vinaymanchinila »

Thanks a bunch, Craig. So to allow for 2GB, what should I give as the column length?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I get the dubious distinction of resurrecting this thread as - now that I actually need to do the same thing - I can't make it work. :lol:

We were able to successfully load a CLOB field from XML into Oracle using the LongVarchar datatype. However, success trying to select the darn thing back out is eluding us!

First attempts to simply select the CLOB into a large LongVarChar field resulted in an ORA-00932: inconsistent datatypes: expected - got CLOB error. Wrapping the CLOB field in a TO_CHAR() during the select got past that error but results in an Oracle Error - OCI_INVALID_HANDLE error. Can't afford to CAST this as a VARCHAR2 as the contents we need to extract are upwards of 16K, more than the 4K max a VARCHAR2 can deliver. And here we sit.
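
For reference, the variations tried so far look roughly like this (table and column names changed for illustration):

Code: Select all

-- Fails with ORA-00932: inconsistent datatypes: expected - got CLOB
SELECT clob_col FROM my_table;

-- Gets past ORA-00932, but the stage then throws OCI_INVALID_HANDLE
SELECT TO_CHAR(clob_col) FROM my_table;

-- Works, but only for contents within the 4000-byte VARCHAR2 limit
SELECT CAST(clob_col AS VARCHAR2(4000)) FROM my_table;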

Anyone had success doing this? Thoughts / tips / tricks?

Gory details: DS 7.0.1 on HP-UX 11.11 using OCI9 and a 9i client to access a 10g database on an AIX box. Array Size is always set to 1. I can successfully select smaller, varchar-sized CLOB data elements using a CAST AS VARCHAR2 in the source select. For whatever that is worth.

ps. The whole 10g/AIX aspect of this hasn't been an issue to date for the tons of other jobs we're running. FYI.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

If you can tolerate a data type mismatch warning, try VarChar with precision up to 65536.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Very interesting... changed LongVarChar to a VarChar of 20K in size and (with the to_char function still in place on the select) it worked! Without any kind of data type mismatch complaint, either. :shock:

Thanks!
-craig

"You can never have too many knives" -- Logan Nine Fingers
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Ok, spoke too soon. It's not "working working"; it is, however, no longer failing. :wink:

The CLOB is being silently truncated to 4K as it makes its way through the job, and we're not really sure why right now. We'll keep messing with this and I'll update the post with anything pertinent we discover.

Of course, any thoughts on the issue - feel free to chime in.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

If your data has any chance of containing a lot of characters in the CLOB column, you're better off not sending it thru DataStage. Think of it like sending an email with an attached file. Your email may be small, but the attachment can clog the network if it's of sufficient size. The same holds true with CLOBs and BLOBs, they're not worth sending thru the ETL stream.

Your best bet is to route the CLOB/BLOBs around the ETL and bulk-load/merge on the backend. It's brutal to insert a bunch of rows and then come back and update them with the CLOB/BLOB data, but you'll be able to bulk load the CLOB/BLOB data into a work table and then merge it on the backside without suffering thru the ETL.
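
As a rough sketch of that backend merge (table, key and column names are purely illustrative):

Code: Select all

-- Sketch only: assumes the CLOBs were bulk-loaded into a work table
-- (clob_work) outside the ETL stream, keyed the same as the target.
UPDATE target_tbl t
   SET t.clob_col = (SELECT w.clob_col FROM clob_work w WHERE w.pk = t.pk)
 WHERE EXISTS (SELECT 1 FROM clob_work w WHERE w.pk = t.pk);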

Now, if you need to be parsing the CLOB/BLOB data, I feel for you. You're probably going to find that DS will choke on the large string data.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Bala R
Participant
Posts: 66
Joined: Mon May 30, 2005 9:52 pm

Post by Bala R »

Are you trying to insert the selected CLOB value into another Oracle table downstream? Possibly the OCI_HANDLE_ERROR occurs because the pointer to the LOB locator exists only in the session or OCI environment where you did the select, and becomes invalid when moved to another session. If that's the case, maybe you could dump the data into some intermediary (sequential?) file and see if you could get past the select with TO_CHAR.

Just a thought.
Last edited by Bala R on Sat Sep 24, 2005 12:26 pm, edited 1 time in total.
logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Post by logic »

Hi Craig,
Any developments on the CLOB issue? If so, PLEASE update. I am facing a similar situation and have tried all the above solutions, but nothing seems to be working. I've stopped thinking logically and tried the trial-and-error path. :(
We were initially using a DRS stage, then we switched to OCI. When we use LongVarChar we get the

Code: Select all

OCI Invalid Handle error

Using VarChar 4000 we get

Code: Select all

OCI has fetched Truncated Data error

(irrespective of array size). For an Array Size of 1 and VARCHAR 4000 the row is rejected with the error "Inserted value too large for column". This error disappears on making the array size 1000 and VARCHAR 502000, but then we get a "not all variables bound" error.
What approach should we take to solve this issue? Is there something we are missing?
THANKS.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Are you trying to insert into a CLOB or select the contents of one? And I'm assuming you are only having problems because you are dealing with more than 4K of data in the field, yes?
-craig

"You can never have too many knives" -- Logan Nine Fingers