Hi all,
I have a job in DS 7.5 that reads from a sequential file and loads an Oracle table; the file contains a few timestamp columns in the format May 1 2011 12:00AM.
The job runs fine and loads the data to Oracle in 7.5, whereas in 8.7 I am unable to read the file column as a timestamp (I tried changing the default timestamp format to %mmm %d %yyyy %hh:%nn:%ss%aa).
I tried changing the datatype to VARCHAR and was able to read from the file, but I am unable to load the data to Oracle because the StringToTimestamp function cannot convert the value to a timestamp during the load.
Please advise
Loading Timestamp to Oracle
Yes it does and yes it is.
You should be able to read it as a string and use the StringToTimestamp() function to match the "default format" of a date/time in your database; talk to your DBA if you don't know exactly what that is. Or take the value as you have it now and use a TO_DATE() in your insert DML, supplying a format picture / mask that matches your incoming format so the insert itself converts it properly.
Or play the "standardize all Oracle DATE values" game, something I always did back in the day. I made sure my values in the job matched an "ISO" timestamp format, YYYY-MM-DD HH24:MI:SS, and then allowed the target stage to generate the DML; it knew to wrap any DATE values in a TO_DATE() automatically. Now, this was well before Connectors came on the scene, so no clue if their "Generate SQL at runtime" option does any automatic conversions like that.
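To make the conversion concrete outside of DataStage, here is a small Python sketch of the same idea: parse the file's May 1 2011 12:00AM style value and re-emit it in the ISO-style YYYY-MM-DD HH24:MI:SS shape that the generated TO_DATE() expects. The sample value and variable names are illustrative, not from the original job.

```python
from datetime import datetime

# Hypothetical sample value in the file's format
raw = "May 1 2011 12:00AM"

# Parse it: %b = abbreviated month name, %I = 12-hour clock, %p = AM/PM marker.
# This mirrors the DataStage mask %mmm %d %yyyy %hh:%nn%aa (note: no seconds
# component in the data itself).
parsed = datetime.strptime(raw, "%b %d %Y %I:%M%p")

# Re-emit in the ISO style that matches Oracle's YYYY-MM-DD HH24:MI:SS mask.
iso = parsed.strftime("%Y-%m-%d %H:%M:%S")
print(iso)  # 2011-05-01 00:00:00
```

Note that 12:00AM correctly comes out as hour 00 on the 24-hour clock, which is an easy spot to get a mask wrong.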
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hi chulett,
I found the issue now: when I move the file from Windows to the Unix environment, a special character is placed in the file for null values, and that's the reason the job is unable to load the data.
,^@,^@,
It's appending the characters shown above wherever a column's value is null. I tried transferring the file in binary mode, but I can still see these characters and the job aborts with this issue.
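The ^@ that vi shows is a NUL byte (0x00), which often means the Windows side wrote the file in a two-byte (UTF-16) encoding rather than plain ASCII. One way to clean the file before the job reads it is to strip the NUL bytes; a minimal Python sketch (the sample bytes and function name are hypothetical):

```python
def strip_nuls(data: bytes) -> bytes:
    """Remove NUL bytes (displayed as ^@ in vi) from raw file content."""
    return data.replace(b"\x00", b"")

# What ,^@,^@, looks like as raw bytes, with surrounding columns for context
sample = b"col1,\x00,\x00,col4\n"
print(strip_nuls(sample))  # b'col1,,,col4\n'
```

On Unix the same cleanup can be done in one line with tr -d '\000' < infile > outfile, though fixing the export encoding on the Windows side is the cleaner long-term solution.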