Inserting into Oracle
Moderators: chulett, rschirm, roy
Hi,
I need to insert data into an Oracle table.
I am getting an input string more than 30,000 characters long (because the string contains an email body field).
I am formatting the string in a Transformer stage and inserting it into the Oracle table.
What datatypes should I use in the Transformer stage and the Oracle stage for this input string, and how much length can those datatypes hold?
Thanks.
What does your target table look like? What kind of errors are you getting?
Your biggest problem is going to be the fact that 30,000 characters is way too big for a single field of any normal type. You need to go with something like CLOB on the Oracle side, and then you'll find that DataStage doesn't really handle CLOB data all that well.
You could also look at hacking it up into x number of chunks. I believe the biggest VARCHAR2 allowed is 4,000 bytes - but that can depend on your version of Oracle, which you don't mention.
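The chunking idea above can be sketched in a few lines. This is a minimal Python illustration, not DataStage code; the 4,000-character piece size matches the classic VARCHAR2 limit mentioned above, and the names are invented:

```python
def chunk_string(s, size=4000):
    """Split a long string into pieces no larger than `size`,
    so each piece fits a classic VARCHAR2(4000) column."""
    return [s[i:i + size] for i in range(0, len(s), size)]

body = "x" * 30000  # e.g. an email body longer than any single VARCHAR2
chunks = chunk_string(body)
print(len(chunks))                         # 8 pieces
print(all(len(c) <= 4000 for c in chunks)) # True
```

Each piece would then go into its own VARCHAR2 column (or row), to be reassembled on the way out.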
-craig
"You can never have too many knives" -- Logan Nine Fingers
If you are at 7.5, or will be there soon, there is the STP (STored Procedure) stage, which currently supports only Oracle, and you could write your own procedure like this:
Code: Select all
create procedure p ( p_id in number, p_string in varchar2 )
as
    l_clob clob;
begin
    insert into t ( id, text ) values ( p_id, empty_clob() )
    returning text into l_clob;
    dbms_lob.writeappend( l_clob, length(p_string), p_string );
end;
The above procedure converts a long string into a CLOB which you can insert, and since you're doing it from a procedure maybe you could get away with it in DataStage. Never tried it, but you never know...
I'm not 100% sure this would work, but I would certainly give it a try.
Regards,
Mike Hester
mhester@petra-ps.com
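To see why the procedure sidesteps the column limit, here is a rough Python trace of what it does; the in-memory dict stands in for table t, and this is only an illustration of the pattern, not Oracle code. (In PL/SQL a VARCHAR2 *parameter* can hold up to 32,767 bytes, so the 30K body fits in p_string even though a VARCHAR2 *column* could not hold it.)

```python
# A toy stand-in for table t(id, text) to trace the procedure's steps.
table = {}

def p(p_id, p_string):
    """Mimic the procedure: insert an empty CLOB, then append the payload."""
    table[p_id] = ""         # insert into t values ( p_id, empty_clob() )
    table[p_id] += p_string  # dbms_lob.writeappend( l_clob, length, buffer )

p(1, "x" * 30000)
print(len(table[1]))  # 30000
```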
The worst problem is that 30K of data per row is going to bog down the processing. It will be like a pineapple going through a garden hose. I GUARANTEE you will have terrible performance; this is too much data to juggle through DataStage. Yes, it will work, but what a performance penalty. This is simply a matter of characters per second.
You will want to seriously consider bypassing this column through all manipulations. I suggest parking the large column data into a work table with a surrogate key on each row, then using that surrogate key as a token throughout processing. Then load the target table and post-load update it with the large column information. This gives two distinct advantages:
1. You will high-performance load the work table using sqlldr for the large columns.
2. You will accelerate all transformation processing because you are not carrying up to 30K of data through the ether.
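The surrogate-key token flow described above can be sketched in plain Python; the row layout and field names here are invented for illustration, and the dicts stand in for the work table and the job's data stream:

```python
# Incoming rows, each dragging a large email body through the job.
rows = [
    {"email_body": "A" * 30000, "sender": "alice"},
    {"email_body": "B" * 25000, "sender": "bob"},
]

big_column = {}   # work table: surrogate key -> parked large payload
slim_rows = []    # what actually flows through the transformations

for key, row in enumerate(rows, start=1):
    big_column[key] = row["email_body"]          # park the 30K column
    slim_rows.append({"body_key": key,           # carry only the token
                      "sender": row["sender"].upper()})  # sample transform

# Post-load "update": reattach the large column using the token.
loaded = [dict(r, email_body=big_column[r["body_key"]]) for r in slim_rows]
print(loaded[0]["sender"], len(loaded[0]["email_body"]))  # ALICE 30000
```

The transformations only ever touch the slim rows, which is the whole point of the token approach.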
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle