
Posted: Sun Dec 14, 2008 8:41 pm
by chulett
What format do you actually want in the sequential file? If you want the YYYY-MM-DD format shown in your SQL, the data type would need to be either Varchar or Timestamp. The former you could use directly, since everything in a sequential file is effectively text anyway, while the latter would require removing the time portion that all Oracle dates carry.
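As a sketch of either approach (the table MY_TABLE and column CREATED_DT here are hypothetical, just for illustration):

```sql
-- Varchar approach: format the DATE as a string in the SELECT,
-- so the sequential file receives exactly YYYY-MM-DD.
SELECT TO_CHAR(created_dt, 'YYYY-MM-DD') AS created_dt
  FROM my_table;

-- Timestamp approach: TRUNC() zeroes the time portion that every
-- Oracle DATE carries; the time part still has to be dropped
-- downstream before the value is written to the file.
SELECT TRUNC(created_dt) AS created_dt
  FROM my_table;
```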

Me, I wouldn't advocate touching the so-called 'SQL Builder' with a 10 foot pole unless you have absolutely no idea how to construct a SQL query. Much more efficient to use the 'Generate SELECT clause from column list; enter other clauses' Query Type. For this you input the column metadata from which the stage builds the 'select' list, the tables so it knows what to put in the 'from' then any 'where' or other clauses like group or order by.

:idea: FWIW, I've standardized all Oracle DATE fields as Timestamp data types in our jobs. Do that, learn how to handle the time portion and you'll never have another Oracle "date issue" in your jobs.

Posted: Sun Dec 14, 2008 11:37 pm
by Elozm
Thanks Craig.

Yes, I have tried both of those types, but the problem is we need to import a lot of jobs and changing this would take a lot of time (mainly with Timestamp). So, why does this work well on our current server with the Date data type, but fail in this new environment? Maybe a configuration variable we have omitted? Something in the ORAOCI9 stage?

For your information we have executed this with the same result, in both environments, in sqlplus:

Code:

SQL> select to_char(sysdate,'YYYY-MM-DD') from dual;
TO_CHAR(SY
----------
2008-12-14
And the NLS_DATE_FORMAT environment variable has the same format too.

Thanks for your help.

Posted: Mon Dec 15, 2008 8:01 am
by chulett
Yes, doing it right after the fact can be a little painful but would mean the problem would be solved forever, immune to any issues you are struggling with right now.

Sure, that TO_CHAR() would force the results to be the same in both environments. You need to determine what is different between the two DataStage "environments" - and in this case, "environment" means what? A new project on the same server? A new installation of DataStage on a different server? Are both running the same DataStage version? Oracle client version? NLS enabled? Using the same NLS map?
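One quick check, plain SQL with nothing DataStage-specific, would be to dump the NLS settings on both servers and diff the output:

```sql
-- Run in sqlplus on both servers and compare the results.
SELECT parameter, value
  FROM nls_session_parameters
 WHERE parameter IN ('NLS_DATE_FORMAT', 'NLS_TIMESTAMP_FORMAT',
                     'NLS_LANGUAGE', 'NLS_TERRITORY');

SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter LIKE 'NLS%';
```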

And no, not something in the OCI stage.

Re: OCI Date conversion failed

Posted: Tue Dec 16, 2008 9:16 am
by Klaus Schaefer
About the Date conversion you can try this:

1. If you want the pre-Hawk behaviour, set DS_NO_FF to 1 at the Project level or in the dsenv file. After this, both DATE and TIMESTAMP columns will be imported as Timestamp and fractional seconds are not supported, the same as in 7.5.2.

2. If you want fractional seconds (Oracle 9i and above), DS_NO_FF should not be set. DATE is then imported as Date and TIMESTAMP as Timestamp with fractional seconds. Since the length of a Date is fixed, it is not set during the import; the same is true for Timestamp, whose length is calculated internally.
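As a sketch, restoring the pre-Hawk (7.5.2) behaviour from option 1 would mean adding something like this to dsenv (or defining DS_NO_FF as a project-level environment variable in the Administrator):

```shell
# dsenv fragment: disable fractional-second support so both DATE and
# TIMESTAMP columns are imported as Timestamp, as in DataStage 7.5.2.
DS_NO_FF=1
export DS_NO_FF
```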

Klaus

Posted: Tue Dec 16, 2008 10:24 am
by chulett
Hmmm... obviously, something is different or there wouldn't be a change in behaviour of a job between environments. Sounds like it's not an obvious difference, however. :?

Start with one job, verify that your changes will indeed allow it to work in both environments without change, then go all cookbook and bang out the rest of them.

Posted: Tue Dec 16, 2008 10:26 am
by Elozm
Thanks Craig / Klaus.

These environments are two servers, but with the same IS and Oracle client versions. We have verified this and there seems to be no difference between them.

So, as Craig suggested, we are modifying the jobs :( I'll report back with the results, but I would really like to know if anybody has a specific reason why this error is generated.

Thanks.

Posted: Tue Dec 16, 2008 10:28 am
by chulett
Wait, what the heck? There was another reply here I was replying to but when I actually posted my response, it was gone.

And now it's back but after mine. Odd. :?

Posted: Mon Dec 22, 2008 10:39 am
by Elozm
Since it works now this way (changing the data type), we have a workaround.... :)

Thanks for all the help!!

Posted: Mon Dec 22, 2008 10:48 am
by chulett
I think you actually have a resolution rather than a workaround, but that's just me. :P