DataStage int32 vs decimal(10,0)
Posted: Fri Aug 29, 2014 2:28 am
Hi,
Memory-wise, which gives the best performance: declaring a column as Int32 or as Decimal(10,0)?
I'm asking because I need to load a SQL Server table into Oracle. The table's columns use the SQL Server int datatype, which corresponds roughly to NUMBER(10) in Oracle. In my job, which approach is better, both for performance and as best practice? (A sketch of the source and target DDL follows the two options.)
1) Read the SQL Server column with the DataStage Integer (Int32) datatype, then write it to Oracle with the DataStage Decimal(10,0) datatype.
or
2) Read the SQL Server column with the DataStage Decimal(10,0) datatype, then write it to Oracle with the Decimal(10,0) datatype.
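For context, here is a minimal sketch of the kind of source and target DDL I mean (the table and column names are made up for illustration). SQL Server's int is a signed 32-bit integer, so its largest value (2,147,483,647) always fits within an Oracle NUMBER(10):

Code:
-- SQL Server source (hypothetical names)
CREATE TABLE dbo.orders (
    order_id INT NOT NULL          -- 32-bit signed integer, max 2,147,483,647
);

-- Oracle target (hypothetical names)
CREATE TABLE orders (
    order_id NUMBER(10) NOT NULL   -- 10 digits comfortably holds any 32-bit int value
);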
Thanks.