ORA OCI stage appends ".FF19" to date format on timestamp columns
Posted: Sun Sep 17, 2006 8:18 pm
Hi All,
I have a group of very simple server jobs which currently source data from one Oracle 9i Database and write it to another Oracle 9i Database.
Basically, all of these jobs are structured as follows:
OCI -> Transformer -> OCI
The source database for these jobs is being upgraded from 9i to 10g. Therefore these jobs will soon be sourcing data from an Oracle 10g database and writing data to an Oracle 9i database.
As a result, I am in the process of testing the impact of this upgrade on my server jobs.
According to the DataStage Plugin Documentation, the ORAOCI9 stages can connect to a 10g database.
When I try to source data from the 10g database, I see some very strange behaviour with columns specified as "Timestamp". When you view the SQL from the SQL tab within the OCI stage, the timestamp columns are wrapped in TO_CHAR (which I would expect), as shown below:
SELECT TO_CHAR(Date_Column, 'YYYY-MM-DD HH24:MI:SS') FROM TABLE
However, when I run these jobs, this is not the SQL that is actually executed against the database. The SQL executed against the 10g database is as follows:
SELECT TO_CHAR(Date_Column, 'YYYY-MM-DD HH24:MI:SS.FF19') FROM TABLE
Note the ".FF19" which has been appended to the format specification in the TO_CHAR call. This causes the jobs to abort with the Oracle error "ORA-01821: date format not recognized".
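For what it's worth, my reading of this (my own assumption, not something from the plugin documentation) is that FF is Oracle's fractional-seconds format element: it is only meaningful for TIMESTAMP values and takes at most a single optional digit from 1 to 9, so "FF19" is not a valid mask, and fractional seconds cannot be applied to a DATE in any case. Roughly (ts_column and some_table are placeholder names, not my real schema):

```sql
-- Valid against a TIMESTAMP column: FF9 asks for nine fractional digits
SELECT TO_CHAR(ts_column, 'YYYY-MM-DD HH24:MI:SS.FF9') FROM some_table;

-- Aborts with ORA-01821: "FF19" is not a recognised format element
SELECT TO_CHAR(ts_column, 'YYYY-MM-DD HH24:MI:SS.FF19') FROM some_table;
```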
As a test, I changed the data type from Timestamp to VarChar and manually specified the TO_CHAR in the column derivation... and the job ran fine.
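In case it helps anyone reproducing this, the workaround amounts to supplying the conversion yourself so the plugin never rewrites the format mask. A sketch of the kind of user-defined SQL involved (ts_column and some_table are placeholders; the CAST to DATE is my own addition to drop fractional seconds before formatting):

```sql
-- Define the column as VarChar in the stage, then convert explicitly;
-- CAST-ing the TIMESTAMP to DATE discards fractional seconds so the
-- plain DATE format mask applies cleanly.
SELECT TO_CHAR(CAST(ts_column AS DATE), 'YYYY-MM-DD HH24:MI:SS')
FROM some_table;
```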
I'm currently thinking this behaviour may have something to do with the Oracle client software (or its configuration) installed on our DataStage application server and how the OCI plugin interacts with it... but has anyone encountered this sort of OCI behaviour before?
Our DataStage application environment is as follows:
Operating System = Red Hat Linux 3.2.3-54
DataStage = Enterprise Edition 7.5.1.A
Oracle Client = 10.2.0
Any input would be greatly appreciated!