Data getting set to zero

Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia

Data getting set to zero

Post by Novak »

Hi,

We have a CSV file with about 20 columns. When viewed through the Sequential File stage we can see lots of non-zero values in our last column. This data gets passed through a Transformer stage, where we use the StringToDecimal function to convert these values to the decimal data type to match our target table fields.

The problem is that when the records are read and the table populated, this last column contains zeros only. To test a few things out, we halved the number of columns so that the one in the middle, which was being properly populated earlier, became the last column in the source file. After running, this column is now populated with zeros as well.
The source file we are running is about 60 thousand records, so certainly not big.

Any ideas?

Thanks,

Novak
Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia

Post by Novak »

Just to improve on that earlier explanation.

The column that was in the middle of the original source file had many non-zero values in the target column after running the load. After modifying the source file so that the same column is the last one, the resulting target column contains zeros ONLY.

Thanks,

Novak
peddakkagari
Participant
Posts: 26
Joined: Thu Aug 12, 2010 12:07 am

Post by peddakkagari »

Novak,

Just edit the source data once. By 'edit' I mean you don't need to change anything - just remove and re-add any one record, and the values will be loaded without any issues. This is a problem with CSV files; we faced the same issue yesterday.

Thanks,
Sreekanth
Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia

Post by Novak »

Hi Sreekanth,

If you mean just deleting any one record - I have done that as well. The file was much bigger at first, so I cut it down for the purposes of quicker testing.

And this file is going to arrive regularly, so we can't really manipulate it each time just so that we can process it.

Thanks,

Novak
ShaneMuir
Premium Member
Posts: 508
Joined: Tue Jun 15, 2004 5:00 am
Location: London

Post by ShaneMuir »

If your target is getting populated with zero values, that is most likely the default value for the target column, which indicates that something has probably gone wrong with the string to decimal conversion.
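Purely as an analogy (Python, not DataStage internals - the strict parse and the zero default here are assumptions for illustration), this is the behaviour being described: any unexpected byte rejects the whole value and the column silently receives the default instead:

import re
from decimal import Decimal

NUMERIC = re.compile(r"^[+-]?\d+(\.\d+)?$")  # strict: digits only, no stray bytes

def string_to_decimal(value, default=Decimal("0")):
    # Reject anything that is not purely numeric and fall back to the default.
    if NUMERIC.match(value):
        return Decimal(value)
    return default

print(string_to_decimal("123.45"))    # 123.45 - clean value converts
print(string_to_decimal("123.45\r"))  # 0      - one stray byte fails the parse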

Are there any warnings or informational lines regarding the conversion of that field in the logs?
prakashdasika
Premium Member
Premium Member
Posts: 72
Joined: Mon Jul 06, 2009 9:34 pm
Location: Sydney

Post by prakashdasika »

Debug the job by adding a Peek stage from the Transformer. The last column should use the same function used to load the database; then view the data in Director. If you are seeing the same data as in the database, then your function is the problem. (DataStage adds leading and trailing zeroes to the Decimal field as per the precision and scale defined.)
If you see the expected data in the Peek, then the database loader has some problem.

What are the precision and scale of the last field in the database? Can you post an example?
Prakash Dasika
ETL Consultant
Sydney
Australia
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

You might try sorting the source so that the non-empty values appear first.

Otherwise run the file through a script (or a stage) that appends a dummy 21st column that is always populated.
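A minimal sketch of such a pre-processing script (Python here; the file names and the constant 'X' are placeholders - a one-line awk or sed would do the same job):

# Append a constant dummy column so the real data is never the last field.
with open("source.csv") as src, open("source_padded.csv", "w") as out:
    for line in src:
        # rstrip also normalises any stray carriage returns at end of record
        out.write(line.rstrip("\r\n") + ",X\n")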
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia

Post by Novak »

Thanks guys,

It was easy enough to convince the source application team to generate the dummy 21st column, and it worked fine.
It's a workaround though, and I will come back with a resolution when I get some more time to look into this (if I do find one).

Regards,

Novak
suse_dk
Participant
Posts: 93
Joined: Thu Aug 11, 2011 6:18 am
Location: Denmark

Post by suse_dk »

Try looking at your file in a hex editor - maybe there is a stray character being read with the last column that causes it to default in the Transformer?
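If no hex editor is to hand, a quick sketch like this (Python, file name is a placeholder) shows the raw bytes at the end of each record:

# Dump the last few bytes of each record to expose invisible characters.
with open("source.csv", "rb") as f:
    for lineno, raw in enumerate(f, start=1):
        print(lineno, " ".join(f"{b:02x}" for b in raw[-4:]))
        # A DOS-format file ends each record with 0d 0a (CR LF); a job
        # expecting only 0a (LF) reads the CR in as part of the last column.
        if lineno == 3:
            break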
_________________
- Susanne
Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia

Post by Novak »

This has been fixed.
What we had been missing all along was the 'Record delimiter string' option in the Sequential File stage. Once it was set and the 'DOS format' value chosen, all was fine.
The warnings generated for the StringToDecimal conversion were gone and the target column was populated correctly.
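For anyone hitting the same symptom, the mechanics are easy to reproduce outside DataStage (a Python analogy with made-up data, not the actual stage internals): read a DOS-format file as if records ended in a bare LF and every record keeps a trailing CR, which lands in the last column:

# A DOS-format record read with a bare LF record delimiter keeps the CR.
record = b"1,Smith,123.45\r\n"

fields_wrong = record.rstrip(b"\n").split(b",")    # record delimiter assumed LF only
fields_right = record.rstrip(b"\r\n").split(b",")  # record delimiter string = CR LF

print(fields_wrong[-1])   # b'123.45\r' - stray CR fails the decimal conversion -> 0
print(fields_right[-1])   # b'123.45'   - converts cleanly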

Thanks a lot for your inputs.

Regards,

Novak