Timestamp

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kate_gkl
Participant
Posts: 22
Joined: Mon Jul 19, 2004 2:33 am
Location: Kuala Lumpur, Malaysia


Post by kate_gkl »

Hi all.

I have encountered a problem with the timestamp type. I have a field (with SQL type varchar) whose source data looks like this: 2002-01-11-10.04.50.000000. The field represents a date and time.

I created a job that transforms this data to a timestamp using: StringToTimestamp(DSLink2.DATIME,"%yyyy-%mm-%dd %hh:%nn:%ss")
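As a side note, the mask above uses a space and colons, while the source value uses a dash between date and time and dots between the time parts. A quick Python sketch (not DataStage; the value is the one from this post) shows a mask that actually matches the 26-character source, and how it reformats to the 19-character shape the mask above expects:

```python
from datetime import datetime

# The 26-character source value, fractional seconds included.
raw = "2002-01-11-10.04.50.000000"

# A mask matching the actual separators: '-' between date and time,
# '.' between hours/minutes/seconds, '.%f' for the microseconds.
ts = datetime.strptime(raw, "%Y-%m-%d-%H.%M.%S.%f")

# Reformatted to the 19-character "YYYY-MM-DD HH:MM:SS" form
# that the StringToTimestamp mask in this post expects.
print(ts.strftime("%Y-%m-%d %H:%M:%S"))  # 2002-01-11 10:04:50
```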

After that, I used the output of this data (setting the SQL type to timestamp) as a sequential file input to load into Oracle in another parallel job. Unfortunately, I could not view the data, and got this error message:
##W TOIX 000000 11:20:46(000) <Sequential_File_7,0> Field "DATIME" delimiter not seen, at offset: 218
##W TOIX 000154 11:20:46(001) <Sequential_File_7,0> Import warning at record 0:
##W TOIX 000018 11:20:46(002) <Sequential_File_7,0> Import unsuccessful at record 0:
##I TOIX 000163 11:20:46(003) <Sequential_File_7,0> Import complete. 0 records imported successfully, 1 rejected.
**VIEWDATA ROW LIMITER HIT**
However, if I set the SQL type to varchar, then I can view the data successfully.

I am wondering whether I have done this correctly, and how I can view the data with the type set to timestamp. Thanks in advance for your opinions and help.
rasi
Participant
Posts: 464
Joined: Fri Oct 25, 2002 1:33 am
Location: Australia, Sydney

Post by rasi »

First, check the timestamp format of your Oracle environment.

Do an export from the Oracle table that has the timestamp field, and check whether the sequential file you are creating uses the same format as Oracle. Then change the format in your transformation accordingly. If you are viewing the sequential file inside DataStage with timestamp as the data type, you can view the file regardless of whether it is in the correct format.

Thanks
Rasi
kate_gkl
Participant
Posts: 22
Joined: Mon Jul 19, 2004 2:33 am
Location: Kuala Lumpur, Malaysia

Post by kate_gkl »

Thank you, Rasi, for your reply. Would you mind explaining this sentence further:
If you are viewing the sequential file inside DataStage with timestamp as the data type, you can view the file regardless of whether it is in the correct format.
Do you mean we can set it to any type in order to view the data?

Thanks very much! :)
rasi
Participant
Posts: 464
Joined: Fri Oct 25, 2002 1:33 am
Location: Australia, Sydney

Post by rasi »

Yes, you are correct; it won't give any error while reading inside DataStage. Just make sure you give the field enough length in the Sequential File stage for that type.

Cheers
Rasi
kate_gkl
Participant
Posts: 22
Joined: Mon Jul 19, 2004 2:33 am
Location: Kuala Lumpur, Malaysia

Post by kate_gkl »

Oh, so if the data type in Oracle is datetime for this field and my DataStage sequential file has varchar for it, I can still successfully load the data into the Oracle database?

By the way, you mentioned that I can make the changes accordingly during transformation. May I know which data type I should use as the primary one? Oracle's or DataStage's?

Thanks again. Cheers!
rasi
Participant
Posts: 464
Joined: Fri Oct 25, 2002 1:33 am
Location: Australia, Sydney

Post by rasi »

Hi

I think you haven't quite got my point.

The fact is that reading a sequential file in a DataStage job won't give you any error, so the data type doesn't really matter while reading from the sequential file. But when you use the Oracle stage, the source field has to be transformed to match your Oracle timestamp format before the timestamp is inserted.

Cheers
Rasi
leo_t_nice
Participant
Posts: 25
Joined: Thu Oct 02, 2003 8:57 am

Post by leo_t_nice »

Hi

If you read the sequential file containing the timestamp value in your post with the column defined as timestamp (which PX expects to be 19 characters long), then you will get an error. I can replicate your error message using the timestamp you posted, which is 26 characters long (it contains the fractional part of the second).

I am not sure whether your source is sequential or a database. If it is a database, there will probably be a way to ensure the fractional part of the seconds is not output (for example, in Teradata we can use something like timestamp(0)). If your source is a sequential file, then perhaps you could set the field as varchar/char and limit its size to 19 in your transformer (or output stage). Because your output stage is sequential, you can then, in the next job, read this data as a timestamp without a problem.
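To illustrate the trimming step above (plain Python rather than DataStage, using the value from this thread): limiting the field to 19 characters drops the fractional seconds, leaving a value that fits a 19-character timestamp column.

```python
from datetime import datetime

raw = "2002-01-11-10.04.50.000000"  # 26 characters, fractional seconds included

# Limit the field to 19 characters, as suggested for the transformer
# or output stage; this drops the ".000000" fractional part.
trimmed = raw[:19]
print(trimmed)       # 2002-01-11-10.04.50
print(len(trimmed))  # 19

# The 19-character value now parses cleanly as a timestamp
# (using a mask that matches its '-' and '.' separators).
ts = datetime.strptime(trimmed, "%Y-%m-%d-%H.%M.%S")
```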

Hope this helps.