Microseconds getting truncated while loading into Oracle 11g

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bicap
Premium Member
Posts: 95
Joined: Mon May 03, 2010 10:22 pm
Location: India

Microseconds getting truncated while loading into Oracle 11g

Post by bicap »

Hi All,

I am on DataStage 7.5 and I have a file from which I load a process_date_time column.

In the input file the value is 23-MAY-11 02.50.21.689000

In the DataStage job, the column at the target stage is defined as timestamp(26) with the extended property set to Microseconds.

When my job runs and loads data into the Oracle 11g table, the value of process_date_time becomes 23-MAY-11 02.50.21.000000, i.e. the microseconds are not preserved.
In the Oracle table the column is defined as timestamp(6).

If I use the same value and the same DataStage job to load data into the same table on 10g, the data is inserted correctly, with microseconds. There too the column is defined as timestamp(6).

In the DataStage job I use a simple insert query to load the table (target stage).

Is this a compatibility issue between DataStage 7.5 and Oracle 11g?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Oracle Timestamp data types are not supported, but you can generate your own SQL and use TO_TIMESTAMP() rather than the TO_DATE() the stage will automatically generate for you.
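
For example (a sketch only; the table name is hypothetical, and the format mask is an assumption based on the value shown in the first post):

```sql
-- Hypothetical user-defined INSERT for the target stage.
-- Oracle DATE values carry no fractional seconds, so TO_DATE drops them;
-- TO_TIMESTAMP with FF6 in the mask keeps all six microsecond digits.
INSERT INTO my_table (process_date_time)
VALUES (TO_TIMESTAMP(:process_date_time, 'DD-MON-RR HH24.MI.SS.FF6'));
```

Passing an explicit format mask also avoids depending on the session's NLS_TIMESTAMP_FORMAT setting.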
-craig

"You can never have too many knives" -- Logan Nine Fingers
bicap
Premium Member
Posts: 95
Joined: Mon May 03, 2010 10:22 pm
Location: India

Post by bicap »

Hi,
I have also tried adding the TO_TIMESTAMP function to the SQL query:

insert into table X (process_date_time) values (to_timestamp(:process_date_time))

but it has not corrected the result.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

OK, straighten out something for us. You've posted in the PX forum but marked your post as 'Server' -- which one is it? And what does your job design look like, in particular which target stage are you using? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
bicap
Premium Member
Posts: 95
Joined: Mon May 03, 2010 10:22 pm
Location: India

Post by bicap »

Our DataStage version is 7.5, parallel edition only.

My job design: I have a Sequential File stage as input, where I read the process_datetime column as varchar(26). Then I have a Transformer in which I apply this transformation to the process_datetime column:
StringToTimestamp(PROCESS_DATETIME, "%yyyy-%mm-%dd %hh:%nn:%ss.6")
After that I have an Oracle Enterprise stage which loads data into the table with the query below; the column is defined as timestamp(26) with the Microseconds option.

INSERT INTO <schema name>.<tablename>
  (BATCH_ID, PROCESS_DATETIME)
VALUES
  (ORCHESTRATE.BATCH_ID, ORCHESTRATE.PROCESS_DATETIME)

As the data flows:
source value: 2011-05-26-18.54.42.442000 (varchar(26))
target value (when seen by writing to a file): 2011-05-26 18:54:42.442000
datatype: timestamp(26), Microseconds checked

When the data is loaded into the Oracle 11g table, the value becomes
2011-05-26 18:54:42.000000
On the database the column is defined as timestamp(6)
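
Following chulett's earlier suggestion, a user-defined version of that insert might look like the sketch below. This assumes PROCESS_DATETIME is passed to the stage as the varchar(26) string (rather than converted to a timestamp in the Transformer), and the format mask is an assumption based on the target value shown above:

```sql
-- Hypothetical user-defined SQL for the Oracle Enterprise stage.
-- An explicit TO_TIMESTAMP with FF6 parses all six microsecond digits
-- instead of letting the stage generate a TO_DATE, which cannot hold them.
INSERT INTO <schema name>.<tablename>
  (BATCH_ID, PROCESS_DATETIME)
VALUES
  (ORCHESTRATE.BATCH_ID,
   TO_TIMESTAMP(ORCHESTRATE.PROCESS_DATETIME, 'YYYY-MM-DD HH24:MI:SS.FF6'))
```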