ORA OCI stage appends ".FF19" to date format on ti
Moderators: chulett, rschirm, roy
ORA OCI stage appends ".FF19" to date format on ti
Hi All,
I have a group of very simple server jobs which currently source data from one Oracle 9i Database and write it to another Oracle 9i Database.
Basically, all of these jobs are structured as follows:
OCI -> Transformer -> OCI
The source database for these jobs is being upgraded from 9i to 10g. Therefore these jobs will soon be sourcing data from an Oracle 10g database and writing data to an Oracle 9i database.
As a result, I am in the process of testing the impact of this upgrade on my server jobs.
According to the DataStage Plugin Documentation, the ORAOCI9 stages can connect to a 10g database.
When I try to source data from the 10g source, I am seeing some very strange behaviour with regard to columns specified as "Timestamp". When you view the SQL from the SQL tab within the OCI stage, the timestamp columns are TO_CHAR'd (which I would expect) as shown below:
SELECT TO_CHAR(Date_Column, 'YYYY-MM-DD HH24:MI:SS') FROM TABLE
However, when I run these jobs, this isn't the SQL which is actually executed against the database. The SQL executed against the 10g database is as follows:
SELECT TO_CHAR(Date_Column, 'YYYY-MM-DD HH24:MI:SS.FF19') FROM TABLE
Note the ".FF19" which has been appended to the format specification in the TO_CHAR command. This causes the jobs to abort with the Oracle error "ORA-01821: date format not recognized"
As a test, I changed the data type from timestamp to varchar and manually specified the TO_CHAR in the derivation of the column... and the job ran fine.
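The workaround above can be sketched as user-defined SQL in the source OCI stage; the table and column names here are placeholders, not the actual schema from the jobs:

```sql
-- Sketch of the workaround: declare the output column as VarChar in the
-- stage metadata and supply the TO_CHAR yourself in user-defined SQL,
-- so the plugin cannot rewrite the format mask.
SELECT TO_CHAR(Date_Column, 'YYYY-MM-DD HH24:MI:SS') AS Date_Column
FROM   My_Table
```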
I'm currently thinking this behaviour may have something to do with the Oracle client software (or its configuration) installed on our DataStage application server and the OCI plugin's interaction with it... but has anyone encountered this sort of OCI behaviour before?
Our DataStage application environment is as follows:
Operating System = Red Hat Linux 3.2.3-54
DataStage = Enterprise Edition 7.5.1.A
Oracle Client = 10.2.0
Any input would be greatly appreciated!
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Welcome aboard! :D
I have not seen that behaviour, so tend to agree that it may have something to do with the manner in which your Oracle client software is configured.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
As suspected....
It is indeed a problem with the configuration of the Oracle Client software.
IBM Ascential support provided us with a patch a few weeks back, which replaced the library file oraoci9.so in the <install_dir>/DSEngine/lib directory. That patch sorted out a separate issue, but caused this one... so: solve one problem and get another in its place.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
That will suffice. The number is the primary key to the table in which support cases are recorded (the "*AUA" suffix indicates that it was first accepted by the Australian support group).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 3
- Joined: Tue Nov 14, 2006 4:51 pm
This can be rectified by adding a new Environment variable
This can be rectified by adding a new environment variable to the user-defined category in DataStage Administrator (environment variables section).
The new variable name is DS_NO_FF; it is string type and should be set to the value 1.
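For what it's worth, a project-independent alternative is to export the same variable engine-wide in the dsenv file; this is only a sketch under the assumption of a default engine install path, and the Administrator route above is the supported one:

```shell
# Sketch: set DS_NO_FF for the whole engine in $DSHOME/dsenv
# (the path below is an assumption; adjust for your install).
# Restart the DataStage engine after editing for it to take effect.
DS_NO_FF=1
export DS_NO_FF
```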