LongVarChar source and DS job fails to recognize it!!!
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
I have a source column of type LongVarChar with no length specified, and the target column is the same. But when I select LongVarChar in DataStage, the job fails at runtime with an ORA handle error.
Is there any way I can find the maximum length of the column, or how should I go about this?
Thanks,
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Oracle is complaining that you're specifying LongVarChar(0), since "" is treated by DataStage as 0 in a numeric context.
What was the maximum length (precision) when you imported the table definition from Oracle? Use that value.
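One way to answer the "what's the maximum length" question is to ask Oracle directly before setting the DataStage column length. A minimal sketch, assuming a query like the one below is run against the source database; the table and column names (`source_table`, `description`) are illustrative, not from this thread:

```python
# Probe query you would run against Oracle (names are illustrative):
PROBE_SQL = "SELECT NVL(MAX(LENGTH(description)), 0) FROM source_table"

def max_column_length(values):
    """DB-free equivalent of the probe: longest length among fetched values.
    NULLs are skipped, and an empty column yields 0 (like the NVL above)."""
    return max((len(v) for v in values if v is not None), default=0)

# With a real connection you would do roughly:
#   cursor.execute(PROBE_SQL); (max_len,) = cursor.fetchone()
sample = ["short", "x" * 20000, None]
print(max_column_length(sample))  # 20000
```

Note this only reports today's maximum; it cannot guarantee that tomorrow's data will fit.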
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
tried with 20,000
I tried with 20,000 and it works, but what if some value tomorrow is greater than 20K and the job fails in production? That's the concern.
Thanks
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
I get the dubious distinction of resurrecting this thread as - now that I actually need to do the same thing - I can't make it work.
We were able to successfully load a CLOB field from XML into Oracle using the LongVarchar datatype. However, success trying to select the darn thing back out is eluding us!
First attempts to simply select the CLOB into a large LongVarChar field resulted in an "ORA-00932: inconsistent datatypes: expected - got CLOB" error. Wrapping the CLOB field in a TO_CHAR() during the select got past that error but results in an Oracle OCI_INVALID_HANDLE error. We can't afford to CAST this as a VARCHAR2, as the contents we need to extract are upwards of 16K, more than the 4K max VARCHAR2 can deliver. And here we sit.
Anyone had success doing this? Thoughts / tips / tricks?
Gory details: DS 7.0.1 on HP-UX 11.11 using OCI9 and a 9i client to access a 10g database on an AIX box. Array Size is always set to 1. I can successfully select smaller, varchar sized CLOB data elements using a CAST AS VARCHAR2 in the source select. For whatever that is worth.
ps. The whole 10g/AIX aspect of this hasn't been an issue to date for the tons of other jobs we're running. FYI.
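One workaround sometimes used for the 4K VARCHAR2 ceiling is to slice the CLOB server-side with DBMS_LOB.SUBSTR into 4000-character pieces and reassemble them downstream. This is a sketch of the idea, not something confirmed to work in this thread; the SQL in the comment uses illustrative table and column names, and the Python merely mirrors the slicing and reassembly:

```python
CHUNK = 4000  # VARCHAR2 limit in SQL on Oracle 9i/10g

# Server-side, a select along these lines returns fixed-size slices that each
# fit in a VARCHAR2 (table/column names are illustrative, not from the thread):
#   SELECT DBMS_LOB.SUBSTR(doc, 4000, 1)    AS p1,
#          DBMS_LOB.SUBSTR(doc, 4000, 4001) AS p2,
#          DBMS_LOB.SUBSTR(doc, 4000, 8001) AS p3
#   FROM   docs

def split_into_chunks(text, size=CHUNK):
    """Mirror of the slicing DBMS_LOB.SUBSTR would perform on the server."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def reassemble(chunks):
    """Downstream (e.g. in a Transformer), concatenate the slices back."""
    return "".join(c for c in chunks if c)

doc = "x" * 16500          # roughly the 16K payload mentioned above
parts = split_into_chunks(doc)
print(len(parts))          # 5 slices
assert reassemble(parts) == doc
```

The trade-off is that the number of slice columns must be fixed up front, so you still need a known upper bound on the CLOB size.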
-craig
"You can never have too many knives" -- Logan Nine Fingers
Ok, spoke too soon. It's not "working working"; it is, however, no longer failing.
The CLOB is being silently truncated to 4K as it makes its way through the job, and we're not really sure why right now. We'll keep messing with this and I'll update the post with anything pertinent we discover.
Of course, any thoughts on the issue - feel free to chime in.
-craig
"You can never have too many knives" -- Logan Nine Fingers
If your data has any chance of containing a lot of characters in the CLOB column, you're better off not sending it thru DataStage. Think of it like sending an email with an attached file. Your email may be small, but the attachment can clog the network if it's of sufficient size. The same holds true with CLOBs and BLOBs, they're not worth sending thru the ETL stream.
Your best bet is to route the CLOB/BLOBs around the ETL and bulk-load/merge on the backend. It's brutal to insert a bunch of rows and then come back and update them with the CLOB/BLOB data, but you'll be able to bulk load the CLOB/BLOB data into a work table and then merge it on the backside without suffering thru the ETL.
Now, if you need to be parsing the CLOB/BLOB data, I feel for you. You're probably going to find that DS will choke on the large string data.
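The route-around described above amounts to two phases: load the skinny rows (everything except the LOB) through the ETL, bulk-load the LOB payloads into a work table, then merge by key on the database side. A toy simulation of that merge step, with the database-side MERGE sketched in a comment using illustrative names:

```python
# Database-side, the backside merge would look roughly like (names illustrative):
#   MERGE INTO target t USING clob_work w ON (t.id = w.id)
#   WHEN MATCHED THEN UPDATE SET t.doc = w.doc

def merge_clobs(target_rows, work_rows):
    """Attach LOB payloads from the work table onto rows already loaded
    by the ETL (which carried everything except the LOB column)."""
    payload_by_key = {r["id"]: r["doc"] for r in work_rows}
    for row in target_rows:
        if row["id"] in payload_by_key:
            row["doc"] = payload_by_key[row["id"]]
    return target_rows

target = [{"id": 1, "doc": None}, {"id": 2, "doc": None}]
work = [{"id": 1, "doc": "A" * 16000}]   # LOB payload bypassed the ETL stream
merged = merge_clobs(target, work)
print(len(merged[0]["doc"]))  # 16000
```

The point is that the wide data never enters the job stream, so DataStage's string-length limits never come into play.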
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Are you trying to insert the selected CLOB value into another Oracle table downstream? Possibly the OCI_INVALID_HANDLE error occurs because the pointer to the LOB locator exists in the session or OCI environment where you did the select, and becomes invalid when moved to another session. If that's the case, maybe you could dump the data into some intermediary (sequential?) file and see if you can get past the select with TO_CHAR.
Just a thought.
Last edited by Bala R on Sat Sep 24, 2005 12:26 pm, edited 1 time in total.
Hi Craig,
Any developments on the CLOB issue? If so, please update. I am facing a similar situation and have tried all of the above solutions, but nothing seems to work. I've stopped thinking logically and taken the trial-and-error path.
We were initially using a DRS stage, then switched to OCI. When we use LongVarChar we get the
Code: Select all
OCI Invalid Handle error
Using VarChar(4000) we get (irrespective of array size) the
Code: Select all
OCI has fetched Truncated Data error
With an array size of 1 and VARCHAR(4000), the row is rejected with the error "Inserted value too large for column". That error disappears on making the array size 1000 and the length 502000, but then we get a "not all variables bound" error.
What approach should we take to solve this issue? Is there something we are missing?
Thanks.