Blank "Timestamp" field shows *****

ady
Premium Member
Posts: 189
Joined: Thu Oct 12, 2006 12:08 am

Blank "Timestamp" field shows *****

Post by ady »

Hi,

I have a job where I receive blank data from the source for a column with a "Timestamp" datatype in a sequential file. The source file was created by a server job.

When I view the data in the parallel job the column shows as blank, but when I write the data to a sequential file or a dataset, the data shows as "*******".

Please help me solve this.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Are you doing any type conversions for this particular column? Give us more info on your design.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ady
Premium Member
Posts: 189
Joined: Thu Oct 12, 2006 12:08 am

Post by ady »

No type conversions in the job. I am performing two joins on the data, but no transformations or type conversions.



Seq File --+
           +--> Join1 --+
Seq File --+            +--> Join2 --> Output Seq File
              Seq File --+
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

You need to do type conversions. If the file was created by a server job, everything is a string. For Date fields you need to specify StringToDate(). Handle nulls before that.
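If you keep a parallel Transformer for this, the derivation for the timestamp column could look something like the following. This is only a sketch: In_Lnk.TS_COL stands for whatever your link and column are actually called, and the output column has to be nullable.

    If Trim(In_Lnk.TS_COL) = '' Then SetNull() Else StringToTimestamp(In_Lnk.TS_COL, "%yyyy-%mm-%dd %hh:%nn:%ss")

The Trim/SetNull part deals with the blanks first, since a blank is not a valid timestamp; StringToTimestamp() then does the actual conversion.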
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ady
Premium Member
Posts: 189
Joined: Thu Oct 12, 2006 12:08 am

Post by ady »

I do not use a parallel Transformer in my jobs; I use a BASIC Transformer in my parallel jobs too. Where can I specify type conversions?
ady
Premium Member
Posts: 189
Joined: Thu Oct 12, 2006 12:08 am

Post by ady »

When I specify a default value in the table definition, I don't get the **** anymore. Can't I insert a blank space in a "Timestamp" field?
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Well then you need to use a Modify stage. Type conversions are needed in parallel jobs.
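A minimal Specification in the Modify stage would be something along these lines (TS_COL is a placeholder for your column name, and the exact conversion name is worth checking against the documentation for your version):

    TS_COL:timestamp = timestamp_from_string(TS_COL)

You still have to deal with the blank values before or as part of this conversion; an empty string is not a valid timestamp.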
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Blank is not a valid timestamp, therefore you cannot insert it into a timestamp field.

If you are writing to a text file, you can replace NULL with a Null Field Value, but blank is not the same as NULL; you might need to convert blank to NULL in an upstream stage.

If the text file is fixed width format, the Null Field Value must have the same number of characters as the Field Width. For example, for a timestamp with no fractional seconds, this means 19 characters. All blanks would be OK, so long as there were 19 of them.
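For illustration only, on the Format tab of the target Sequential File stage that could come down to something like this (an assumption about your layout, not a prescription):

    Field defaults           : Null field value = "                   "   (19 spaces)
    Type defaults, Timestamp : Format string    = %yyyy-%mm-%dd %hh:%nn:%ss   (19 characters)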
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ady
Premium Member
Posts: 189
Joined: Thu Oct 12, 2006 12:08 am

Post by ady »

@DSguru

So my job will look like:

seq file > Modify > ...rest of the job...

Would that solve my problem?
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

First create a test job that reads your source, changes the datatypes using a Modify stage, and then loads the data back into a sequential file stage. If this works out, you can replace the target sequential file stage with the "rest of the job".
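As a sketch, the test job is just (stage names here are only placeholders):

    Sequential File (source, columns as strings) > Modify (string to timestamp) > Sequential File (target)

If the target file shows proper timestamps, and your chosen null/blank handling, instead of "*******", swap the target stage for the rest of the real job.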
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.