Decimal problem
Moderators: chulett, rschirm, roy
Using the ODBC Enterprise stage with the metadata that was imported, Ascential corrupts decimal values that do not have any scale. That is, I have decimal(18,0) columns and Ascential corrupts the values, setting them to 2147483647 or -2147483648. If I change the datatype in Ascential to Decimal(20,2), I get the same issue. However, if I use user-defined SQL and CONVERT to DECIMAL(20,2) in the SQL, it works. Columns that have a scale greater than 0 in the database, such as (18,1) or (20,2), seem to work fine; it fails only when the scale is 0.
What is wrong?
Is Scale 0 or is Scale empty? (If the latter, there's a problem with the import process; try importing an Orchestrate schema definition, using dbutil to access the table.)
The two values you've cited are, of course, the minimum and maximum possible int32 values; DataStage treats Decimal without a scale (that is, with scale empty) as int32. Try putting an explicit 0 into the Scale column and let us know whether that helps.
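The symptom described above is consistent with a saturating conversion to int32. A minimal Python sketch, purely an illustration of the suspected behaviour and not DataStage's actual code:

```python
# Hypothetical sketch of a saturating int32 conversion, which would explain
# why large decimal(18,0) values come out as 2147483647 or -2147483648.
INT32_MAX = 2_147_483_647
INT32_MIN = -2_147_483_648

def saturate_int32(value: int) -> int:
    """Clamp an arbitrary integer into the int32 range."""
    return max(INT32_MIN, min(INT32_MAX, value))

print(saturate_int32(999_999_999_999_999_999))   # 2147483647
print(saturate_int32(-999_999_999_999_999_999))  # -2147483648
print(saturate_int32(12345))                     # 12345: in-range values survive
```

Any decimal(18,0) value outside the 32-bit range would clip to exactly the two values reported.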
Last edited by ray.wurlod on Thu Mar 23, 2006 11:23 pm, edited 1 time in total.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Ultramundane wrote: Just got a patch that fixed it. Ecase 91460. Tx.
The patch did not fix the issue. It introduced another bug where decimals get truncated both before and after the decimal point. For example, 123456.9999 would get loaded as 123450.9900.
Ascential released another patch (Ecase 91572).
This caused another problem with extreme rounding, this time with what appears to be an int64 instead of an int32.
Ascential is working on yet another Ecase (92288) to fix it.
I think it is crazy that I need three different Ecases to fix one reported issue because Ascential cannot test properly.
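If the patched build now routes the value through an int64 rather than an int32, the clip points would simply move to the 64-bit limits. A hedged sketch, again an assumption about the mechanism rather than confirmed vendor code:

```python
# Hypothetical: the same saturating conversion as before, but against the
# 64-bit limits, which would change where out-of-range values get clipped.
INT64_MAX = 9_223_372_036_854_775_807
INT64_MIN = -9_223_372_036_854_775_808

def saturate_int64(value: int) -> int:
    """Clamp an arbitrary integer into the int64 range."""
    return max(INT64_MIN, min(INT64_MAX, value))

print(saturate_int64(10**25))   # 9223372036854775807
print(saturate_int64(-10**25))  # -9223372036854775808
```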
Got the patch for Ecase 92288. I still get warnings about implicit datatype conversions from decimal to int that Ascential may not be able to make properly when storing decimal(x,0) as decimal(x,0), where x is a natural number less than 39.
I also wanted to test the decimal and ODBC behaviour further. I found that decimal(38,38) does not work either: Ascential is trying to convert an incoming decimal(38,38), defined as decimal(38,38), to a decimal(35,38) and then to store that as a decimal(38,38).
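The effect of squeezing 38 fractional digits through 35 digits of precision can be reproduced with Python's decimal module. This is a sketch of the precision loss itself, not of Ascential's internals:

```python
from decimal import Decimal, getcontext

# A decimal(38,38) value: 38 significant digits, all after the decimal point.
original = Decimal("0.12345678901234567890123456789012345678")

# Simulate an intermediate stage that only carries 35 digits of precision,
# as an implicit conversion to decimal(35,38) would.
getcontext().prec = 35
narrowed = +original  # unary plus applies the current context's precision

print(original)  # 0.12345678901234567890123456789012345678
print(narrowed)  # 0.12345678901234567890123456789012346 (last 3 digits lost)
```

Once the last three digits are gone, storing the result back into a decimal(38,38) column cannot recover them, which matches the corruption observed.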