Compress Stage making Timestamp field invalid.

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

sridharvis
Premium Member
Posts: 26
Joined: Thu Apr 17, 2008 1:38 pm
Location: Chennai

Compress Stage making Timestamp field invalid.

Post by sridharvis »

Hi,
I am using the Compress and Expand stages with Datasets, since we do not have much space on the server.
When we compress a Dataset using the Compress stage and then, in a subsequent job, read that compressed Dataset through an Expand stage, we face an issue in the output: the first field with a Timestamp data type gets corrupted and comes out as '************' (asterisks). I say only the first Timestamp field because the other Timestamp fields come out fine.

1st Job :

Dataset1-----> Compress-------->Dataset2

2nd Job :

Dataset2-----> Compress-------->Dataset3

We tested this using both the GZIP and COMPRESS utilities that come with the Compress and Expand stages, but the issue still persists.

E.g.:
Source Dataset1:
Col1,Col2,Col3,Col4,Col5,Col6
-----------------------------------------------------------------------
Deb,123,2009-08-08 03:06:04,Seal,2005-01-01 02:56:54,India

Target Dataset3:
Col1,Col2,Col3,Col4,Col5,Col6
-----------------------------------------------------------------------
Deb,123,************,Seal,2005-01-01 02:56:54,India
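As background for the example above: byte-level compression itself is lossless, so a corrupted field points at record formatting/parsing rather than the compression step. A minimal Python sketch (an illustration only, not DataStage code) shows a record like the one above surviving a gzip round-trip byte-for-byte:

```python
import gzip

# A sample record like the one above (illustrative only).
record = b"Deb,123,2009-08-08 03:06:04,Seal,2005-01-01 02:56:54,India"

# Round-trip: compress, then decompress.
compressed = gzip.compress(record)
restored = gzip.decompress(compressed)

# Compression is lossless: the restored bytes match exactly.
assert restored == record
print(restored.decode())
```

If the bytes come back intact but the Timestamp field still shows asterisks, the suspect is how that field's value is serialized and re-parsed around the Compress/Expand stages.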


Has anyone already faced this issue?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So, Job 2 actually has an Expand stage in it, yes? Your "picture" shows Compress for both, so best to just go back and edit that post to correct it.

Any idea what the actual 'corrupted' timestamp value is? Perhaps add a link to a Peek stage to find out. And double-check that your metadata for that field is in fact correct in both jobs.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

In order for a timestamp to be compressed, it needs to be converted to a string. Likewise, on expansion the string value is decompressed and then parsed back into a timestamp format. I would wager that the problem here is the default string formatting of the timestamp field, since the "*"s you see usually appear when a value cannot be parsed.
Is the date format used always the same, or are your first column's definitions perhaps different?
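To illustrate that point about format mismatches, here is a hedged Python sketch (not DataStage internals): a timestamp serialized with one string format round-trips fine with the same format, but parsing it back with a different format fails, which is the kind of error a stage would then render as an invalid (asterisked) timestamp:

```python
from datetime import datetime

ts = datetime(2009, 8, 8, 3, 6, 4)

# Serialize with one format...
as_string = ts.strftime("%Y-%m-%d %H:%M:%S")

# ...and parse back with the SAME format: round-trip succeeds.
assert datetime.strptime(as_string, "%Y-%m-%d %H:%M:%S") == ts

# Parse with a MISMATCHED format: raises ValueError, analogous
# to the unparseable field shown as '************'.
try:
    datetime.strptime(as_string, "%d/%m/%Y %H:%M:%S")
except ValueError as err:
    print("parse failed:", err)
```

So it is worth comparing the timestamp format string in the column definition of Col3 across both jobs, not just the data type.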