Decimal problem

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan
Contact:

Decimal problem

Post by Ultramundane »

Using the ODBC Enterprise stage, if I use the metadata that was imported, Ascential corrupts my decimal values that have no scale. That is, I have decimal(18, 0) columns and Ascential corrupts the values, setting them to 2147483647 or -2147483648. If I change the datatype in Ascential to Decimal(20, 2), I get the same issue. However, if I use user-defined SQL and do a convert to DECIMAL(20,2) in the SQL, then it works. Columns that have a scale greater than 0 in the database, such as (18, 1), (20, 2), etc., seem to work fine. It does not work when the scale = 0.

What is wrong?
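For reference, the working user-defined SQL described above can be sketched roughly as follows. The table and column names here are hypothetical; the only detail taken from the report is the explicit convert to DECIMAL(20,2), which gives the value a non-zero scale:

```python
# Sketch of the user-defined SQL workaround described above.
# Table and column names are hypothetical; the point is the explicit
# CONVERT to DECIMAL(20,2) instead of selecting the raw column.
table = "MY_TABLE"    # hypothetical table name
column = "AMOUNT"     # hypothetical DECIMAL(18,0) column
user_defined_sql = (
    f"SELECT CONVERT(DECIMAL(20,2), {column}) AS {column} FROM {table}"
)
print(user_defined_sql)
```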
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Is Scale 0 or is Scale empty? (If the latter, there's a problem with the import process - try importing an Orchestrate schema definition using dbutil to access the table.)

The two values you've cited are, of course, the minimum and maximum possible int32 values; DataStage treats Decimal without a scale (that is, with scale empty) as int32. Try putting an explicit 0 into the Scale column and let us know whether that helps.
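The saturation described here can be illustrated with a short Python sketch (illustrative only, not DataStage internals): any decimal(18, 0) value outside the 32-bit signed range clamps to exactly the two corrupt values reported above.

```python
# Illustrative sketch, not DataStage code: a Decimal with an empty Scale,
# read as a 32-bit signed integer, can only represent values in
# [-2147483648, 2147483647]; anything outside saturates to the limits.
INT32_MAX = 2**31 - 1    # 2147483647
INT32_MIN = -(2**31)     # -2147483648

def saturate_int32(value: int) -> int:
    """Clamp an out-of-range value to the int32 limits."""
    return max(INT32_MIN, min(INT32_MAX, value))

# A decimal(18,0) column can hold up to 18 digits, far beyond int32:
print(saturate_int32(999_999_999_999_999_999))   # 2147483647
print(saturate_int32(-999_999_999_999_999_999))  # -2147483648
```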
Last edited by ray.wurlod on Thu Mar 23, 2006 11:23 pm, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.

Post by Ultramundane »

Scale is empty. I have tried to specify a scale, but Ascential blanks out the 0. I tried to re-import and got the same result. I have placed a call with Ascential and they thought it was pretty serious. They gave me a P1 case.

Thanks for your help,
Ryan

Post by Ultramundane »

Just got a patch that fixed it. Ecase 91460.

Tx.

Post by Ultramundane »

Ultramundane wrote:Just got a patch that fixed it. Ecase 91460.

Tx.
The patch did not fix the issue. It caused another bug where decimals would get truncated both before and after the decimal point. For example,
123456.9999 would get loaded as 123450.9900.

Ascential released another patch.
Ecase 91572.

This caused another problem with extreme rounding, this time with what appears to be an int64 instead of an int32.

Ascential is working on yet another Ecase to fix.
Ecase 92288.

I think it is crazy that I need 3 different Ecases to fix one reported issue because Ascential cannot test properly.

Post by ray.wurlod »

Perhaps you could raise another request with IBM (it's not Ascential any longer) to test their patches? :lol:
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.

Post by Ultramundane »

Got the patch for Ecase 92288. I still get warnings about implicit datatype conversions from decimal to int that Ascential may not be able to make properly when storing decimal(x,0) as decimal(x,0), where x is a natural number less than 39.

I also wanted to test the decimal type over ODBC further. I found that a decimal(38, 38) does not work either. Ascential is trying to convert an incoming decimal(38, 38), defined as decimal(38, 38), to a decimal(35, 38) and then to store that as a decimal(38, 38).
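A rough illustration of why that intermediate decimal(35, 38) step loses data, using Python's decimal module as a stand-in for the database types (this is not DataStage code): a value carrying 38 significant fraction digits cannot survive a pass through only 35 digits of precision.

```python
from decimal import Decimal, Context

# Illustrative only: a decimal(38,38) value can carry 38 significant
# fraction digits; squeezing it through 35 digits of precision, as the
# reported decimal(35,38) intermediate does, drops the low-order digits.
full = Decimal("0." + "1" * 37 + "9")     # 38 fraction digits
narrowed = Context(prec=35).plus(full)    # round to 35 significant digits
print(full)
print(narrowed)
print(full == narrowed)   # False: the trailing digits were lost
```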