blob, clob data types

Posted: Fri Feb 03, 2012 8:47 am
by iskapalli
Hi,

My job: Oracle Connector stage -> Transformer -> Teradata Connector stage.

The table contains BLOB and CLOB data types. I tried the Fastload, Multiload, and TPump options, but none of them worked. I came to know that BLOB and CLOB data types are not supported by any of the Teradata bulk load methods. The Immediate option does work, but it takes 2 hours for 2.4 million (24 lakh) records.

I searched on Google and found that I should use the Teradata Parallel Transporter (SQL Inserter operator).

Could you please advise on this?

Posted: Fri Feb 03, 2012 12:45 pm
by bobyon
I'm not sure what you found in your Google search, nor what TPT has to do with BLOB/CLOB data, but what I can tell you is that TPT is required to use the Teradata Connector.

Posted: Mon Feb 06, 2012 12:08 am
by kandyshandy
Take a glance at the advanced PX developer guide.

Posted: Mon Feb 06, 2012 8:35 am
by iskapalli
Hi,

My job: Oracle Connector stage -> Transformer -> Teradata Connector stage.

Source DDL (Oracle):

COL1 CLOB
COL2 CLOB


Target DDL (Teradata):
COL1 CLOB(1048544000) CHARACTER SET LATIN
COL2 CLOB(1048544000) CHARACTER SET LATIN


DataStage (source):

COL1 LongVarChar (length not specified)
COL2 LongVarChar (length not specified)

DataStage (target):

COL1 LongVarChar(1048544000)
COL2 LongVarChar(1048544000)

I am using Fastload.

I am getting the error below.
Error:
RDBMS code 3798: A column or character expression is larger than the max size
Warning:
The estimated row length of 130,606 bytes multiplied by the array size of 2 is 261,212 which exceeds the maximum buffer size of 64,260


If I use Immediate it works, but it is very slow. Can you please advise?

Posted: Mon Feb 06, 2012 10:07 pm
by kandyshandy
The message clearly says that your data is exceeding the allowed buffer size. So check whether you can increase the buffer size by explicitly defining the relevant environment variable in the job.
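
As a rough sanity check using the numbers from that warning (taking the reported figures at face value):

130,606 bytes per row x array size 2 = 261,212 bytes, against a 64,260-byte maximum buffer.

Even with the array size reduced to 1, a single 130,606-byte row is still roughly twice the 64,260-byte limit, so the row itself would need to shrink (or the buffer would need to grow) before the bulk load can work.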

Posted: Tue Feb 07, 2012 3:28 am
by iskapalli
Hi,
This is the error I am getting:


TGT_CSPRC_BDNDETL,1: RDBMS code 3798: A column or character expression is larger than the max size. SQL statement: USING (NUMCHILD INTEGER, BDN_LEVEL SMALLINT, SRC_SYS_ID VARCHAR(15), LOAD_ERROR CHAR(3), DATA_ORIGIN CHAR(3), CREATED_EW_DTTM TIMESTAMP(0), LASTUPD_EW_DTTM TIMESTAMP(0), BATCH_SID DECIMAL(10), BREAKFILE CHAR(24), CODE CHAR(177), TAG CHAR(180), CODEDESC CHAR(180), OLDTAG CHAR(30), TH_SPVF FLOAT, TH_SPVU FLOAT, TH_SPPF FLOAT, TH_SPPU FLOAT, TH_SCVF FLOAT, TH_SCVU FLOAT, TH_SCPF FLOAT, TH_SCPU FLOAT, TH_CPVF FLOAT, TH_CPVU FLOAT, TH_CPPF FLOAT, TH_CPPU FLOAT, TH_CCVF FLOAT, TH_CCVU FLOAT, TH_CCPF FLOAT, TH_CCPU FLOAT, TH_CAVF FLOAT, TH_CAVU FLOAT, TH_CAPF FLOAT, TH_CAPU FLOAT, D1 CHAR(177), D2 CHAR(177), D3 CHAR(177), D4 CHAR(177), D5 CHAR(177), D6 CHAR(177), D7 CHAR(177), D8 CHAR(177), D9 CHAR(177), PARENT CHAR(177), SOW VARCHAR(64000), BOE VARCHAR(64000)) INSERT INTO EFBI_DEV1_STG_S.CSPRC_BDNDETL(NUMCHILD,BDN_LEVEL,SRC_SYS_ID,LOAD_ERROR,DATA_ORIGIN,CREATED_EW_DTTM,LASTUPD_EW_DTTM,BATCH_SID,BREAKFILE,CODE,TAG,CODEDESC,OLDTAG,TH_SPVF,TH_SPVU,TH_SPPF,TH_SPPU,TH_SCVF,TH_SCVU,TH_SCPF,TH_SCPU,TH_CPVF,TH_CPVU,TH_CPPF,TH_CPPU,TH_CCVF,TH_CCVU,TH_CCPF,TH_CCPU,TH_CAVF,TH_CAVU,TH_CAPF,TH_CAPU,D1,D2,D3,D4,D5,D6,D7,D8,D9,PARENT,SOW,BOE) VALUES(:NUMCHILD,:BDN_LEVEL,:SRC_SYS_ID,:LOAD_ERROR,:DATA_ORIGIN,:CREATED_EW_DTTM,:LASTUPD_EW_DTTM,:BATCH_SID,:BREAKFILE,:CODE,:TAG,:CODEDESC,:OLDTAG,:TH_SPVF,:TH_SPVU,:TH_SPPF,:TH_SPPU,:TH_SCVF,:TH_SCVU,:TH_SCPF,:TH_SCPU,:TH_CPVF,:TH_CPVU,:TH_CPPF,:TH_CPPU,:TH_CCVF,:TH_CCVU,:TH_CCPF,:TH_CCPU,:TH_CAVF,:TH_CAVU,:TH_CAPF,:TH_CAPU,:D1,:D2,:D3,:D4,:D5,:D6,:D7,:D8,:D9,:PARENT,:SOW,:BOE) (CC_TeraDBStatement::dbsBindInputs, file CC_TeraDBStatement.cpp, line 627)
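
For reference, in the USING clause above the two VARCHAR(64000) parameters (SOW and BOE) alone account for 64,000 + 64,000 = 128,000 of the estimated 130,606 bytes per row, so it is those two columns that push a single row past the 64,260-byte limit regardless of the array size.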