Problem with using BigInt datatype with ODBC Enterprise

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

sjordery
Premium Member
Posts: 202
Joined: Thu Jun 08, 2006 5:58 am

Post by sjordery »

Hi All,

IBM Support has confirmed that BigInt is not supported in DataStage. As a workaround, we are doing the relevant processing in a SQL stored procedure outside of DataStage, triggered by an event within our DataStage flow.
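For illustration, a minimal sketch of that kind of workaround, with hypothetical object names (the actual procedure and tables are not part of this thread):

-- Hypothetical names throughout; the idea is to keep all bigint handling
-- inside SQL Server so the DataStage job never carries an int64 column.
CREATE PROCEDURE dbo.usp_ProcessBigIntBatch
AS
BEGIN
    INSERT INTO dbo.BigIntTarget (BigIntCol)
    SELECT BigIntCol
    FROM dbo.BigIntStaging;
END

The DataStage flow then only has to trigger the procedure (for example, from an after-job routine) rather than move the bigint values through the job itself.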

Thanks,
sjordery
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Thanks for posting the workaround.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

sjordery wrote: Hi All,

IBM Support has confirmed that BigInt is not supported in DataStage. As a workaround, we are doing the relevant processing in a SQL stored procedure outside of DataStage, triggered by an event within our DataStage flow.

Thanks,
sjordery

There are patches out there that fix this issue. If you like, I can try to track them down.
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Ultramundane wrote: There are patches out there that fix this issue. If you like, I can try to track them down.
Hi,
I have a similar issue that I would like a patch/fix for. Could you provide any links/information that you might have?

Many thanks in advance for your help,
-V
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

VCInDSX wrote: Hi,
I have a similar issue that I would like a patch/fix for. Could you provide any links/information that you might have?

Many thanks in advance for your help,
Sure (AIX and 7.5.2).

After going through the series of patches I received, it appears that all of them were for problems in orchodbc.o.

Thus, the latest patch for that operator should fix the issue.
The latest patch I have is ECASE 118069. There are many fixes in that patch for decimal corruption and datatype conversions related to ODBC.
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Ultramundane wrote: Sure (AIX and 7.5.2).
Hi,
Thanks for the quick response; I appreciate it.
Is this specific to AIX, or would it apply to other platforms as well? I believe our server is 7.5.2 on a Linux box. I will forward this to the Admin group and have them check this patch (ECASE 118069).

Thanks again,
-V
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

VCInDSX wrote: Hi,
Thanks for the quick response; I appreciate it.
Is this specific to AIX, or would it apply to other platforms as well? I believe our server is 7.5.2 on a Linux box. I will forward this to the Admin group and have them check this patch (ECASE 118069).

Thanks again,
I am not sure if the patch is specific to AIX. Your admins should be able to find out easily by asking IBM about the ecase. Good luck.
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Hi,
It appears that this ECase was specific to "ODBC Stage doesn't get Database schema from Netezza properly when RCP is used". Is that what it was meant for?
I wonder if other users have the same issue of not being able to process a BigInt into a SQL Server DB using ODBC.

Thanks
-V
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

I had this problem a couple of years ago. I just changed the datatype within my job to integer as a workaround.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

VCInDSX wrote: Hi,
It appears that this ECase was specific to "ODBC Stage doesn't get Database schema from Netezza properly when RCP is used". Is that what it was meant for?
I wonder if other users have the same issue of not being able to process a BigInt into a SQL Server DB using ODBC.

Thanks
That is why I mentioned:
The latest patch I have is ECASE 118069. There are many fixes in that patch for decimal corruption and datatype conversions related to ODBC.

This patch has many other fixes rolled into it that IBM does not keep track of when releasing patches. They only track what a patch was specifically created to do, and they only release that information. They have very bad patch management practices.

If you like, I can list all of the other patches I have received for orchodbc.o.
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

Well, it wouldn't surprise me if they delivered that patch for a different OS as a point patch.

So, here are the cases affecting orchodbc.o
$ find . -name orchodbc.o -print
./P002_COMPOUND/patch_meijer/root/orchodbc.o
./P004_E97821/patch_97821/root/orchodbc.o
./P007_E91572/patch_91572/root/orchodbc.o
./P011_E118069/patch_118069/root/orchodbc.o

So, the ecases are:
Ecase: meijer (contains nearly 20 patches)
Ecase: 97821
Ecase: 91572
Ecase: 118069
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Hi Ultramundane,
Thanks for taking the time to gather all this information; I appreciate it. I will pass this to our Admin group, see what they find, and let you/the group know.
I wish they had more descriptive information about the fixes in these patches.

DSGuru,
Wouldn't that result in loss of data? BigInt ==> Integer might work as long as the BigInt values stay within the limits of Integer, but as soon as you exceed the Integer range, you are going to get unwanted results, correct? Maybe I am missing something. I should check the ranges for each type within DataStage (see the reference below).
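For reference, the documented SQL Server limits (DataStage's Integer/BigInt are the same 32-bit/64-bit signed types):

-- int    : -2,147,483,648 to 2,147,483,647 (4 bytes)
-- bigint : -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807 (8 bytes)
SELECT 2147483647 AS max_int, 9223372036854775807 AS max_bigint;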

Thanks,
-V
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

True, that might be the case in the long run. But this was short term with relatively small amounts of data, and hence an acceptable workaround.
You should try a test: send a large integer, one that's larger than the integer range, into a BIGINT column, but mask it as integer or numeric in DataStage, and see if you have any loss of data.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Hi DSGuru,
Thanks for your time and suggestions. I was held up with another issue and could not try this out earlier to provide feedback.
I tried a few things and wanted to share them here so that anyone who comes searching will get more info (hopefully...).
I created a test table and sample data to try this. Here is the info.

Source Table DDL (SQL Server 2005)
CREATE TABLE BigIntTest(
[EmpIdBigInt] [bigint] NULL, /*Actual BigInt Values*/
[EmpIdInt] [int] NULL, /*Actual Integer values*/
[EmpIdDec] [decimal] NULL, /*If storing BigInt into Decimal*/
[EmpIdIntFromBigInt] [int] NULL /*To transform BigInt to Int*/
)

Target Table DDL (SQL Server 2005)
CREATE TABLE BigIntTestCopy(
[EmpIdBigInt] [bigint] NULL, /*Actual BigInt Values*/
[EmpIdInt] [int] NULL, /*Actual Integer values*/
[EmpIdDec] [decimal] NULL, /*If storing BigInt into Decimal*/
[EmpIdIntFromBigInt] [int] NULL /*To transform BigInt to Int*/
)

Sample test data (Loaded using SQL Server Client UI)
insert into BigIntTest values (1000000000000000000, 2147483643, 0,0)
insert into BigIntTest values (9223372036854775804, 2147483644, 0,0)
insert into BigIntTest values (9223372036854775805, 2147483645, 0,0)
insert into BigIntTest values (9223372036854775806, 2147483646, 0,0)
insert into BigIntTest values (9223372036854775807, 2147483647, 0,0)

BigInt Range overflow test

NOTE: From the docs, the max value for BigInt is 9223372036854775807.
If I insert the next BigInt value (9223372036854775808) into this table:
insert into BigIntTest values (9223372036854775808, 2147483647, 0,0)
I get an error message and the row does not get inserted:
Msg 8115, Level 16, State 2, Line 1
Arithmetic overflow error converting expression to data type bigint.
The statement has been terminated.
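A plausible reading of that error, assuming standard SQL Server literal typing: a literal too large for bigint is parsed as numeric(19,0), and it is the implicit conversion back to bigint for the column that overflows. One way to check:

-- The out-of-range literal itself is typed numeric, not bigint.
SELECT SQL_VARIANT_PROPERTY(9223372036854775808, 'BaseType') AS literal_type;
-- returns: numeric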


Integer Range overflow test
NOTE: From the docs, the max value for Integer is 2147483647.
Similarly, if I try to insert the next Int value (2147483648):
insert into BigIntTest values (9223372036854775807, 2147483648, 0,0)
I get an error message and the row does not get inserted:
Msg 8115, Level 16, State 2, Line 1
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.


DataStage client tests (BigInt ==> BigInt)
Next, I created a simple parallel job as follows:
ODBC (Read from BigIntTest table) ==> XFMer (nothing much for now) ==> ODBC (Write to BigIntTestCopy table)
The job compiled successfully.

When I validate the job, I see the following messages:
ODB_BigIntTest_Src: When checking operator: When binding output interface field "EmpIdBigInt" to field "EmpIdBigInt": Implicit conversion from source type "decimal[19,0]" to result type "int64": Possible range limitation.
ODB_BigIntTest_Src: When checking operator: When binding output interface field "EmpIdDec" to field "EmpIdDec": Implicit conversion from source type "int64" to result type "decimal[18,0]": Possible range limitation.


I ran the job and found the same two warnings in the log. When I checked the target table, here is what I found:
EmpIdBigInt|EmpIdInt|EmpIdDec|EmpIdIntFromBigInt
1000000000000000000|2147483643|0|0
NULL|2147483644|0|0
NULL|2147483645|0|0
NULL|2147483646|0|0
NULL|2147483647|0|0

DataStage client tests (BigInt ==> BigInt using Integer in XFM)
In the transformer, I changed the target type to Integer and ran the same job. I got the same warnings. The output table is given below:
EmpIdBigInt|EmpIdInt|EmpIdDec|EmpIdIntFromBigInt
-1486618624|2147483643|0|0
-4|2147483644|0|0
-3|2147483645|0|0
-2|2147483646|0|0
-1|2147483647|0|0
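Those negative values are consistent with plain int64-to-int32 truncation: only the low 32 bits survive and are reinterpreted as a signed (two's complement) value. A hedged T-SQL re-creation of the first row:

-- Keep only the low 32 bits of the bigint, then reinterpret them as a
-- signed 32-bit value, which is what the truncation in the job did.
DECLARE @v BIGINT, @low32 BIGINT;
SET @v = 1000000000000000000;
SET @low32 = @v & 4294967295;
SELECT CASE WHEN @low32 > 2147483647 THEN @low32 - 4294967296
       ELSE @low32 END AS truncated_int32;
-- returns -1486618624, the value that landed in the target table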

DataStage client tests (BigInt ==> BigInt using Decimal in XFM)
From the warnings, it seemed that the BigInt from the source table is treated as a decimal[19,0], so I decided to try Decimal as the intermediate type. In the transformer, I changed the target type to Decimal(19,0) and ran the same job. I got the same warnings, but this time the output table had the expected values, matching the source:
EmpIdBigInt|EmpIdInt|EmpIdDec|EmpIdIntFromBigInt
1000000000000000000|2147483643|0|0
9223372036854775804|2147483644|0|0
9223372036854775805|2147483645|0|0
9223372036854775806|2147483646|0|0
9223372036854775807|2147483647|0|0
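That result makes sense: decimal(19,0) holds up to 19 digits, which covers the full bigint range, so the intermediate type loses nothing. A quick illustrative round trip:

-- 19 digits are enough for any 64-bit integer, so the cast is lossless.
SELECT CAST(CAST(9223372036854775807 AS DECIMAL(19,0)) AS BIGINT) AS round_trip;
-- returns 9223372036854775807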

While looking at the OSH output, here is what I found:
## General options
[ident('ODB_BigIntTest_Src'); jobmon_ident('ODB_BigIntTest_Src')]
## Outputs
0> [modify (
EmpIdBigInt:nullable int64=EmpIdBigInt;
EmpIdInt:nullable int32=EmpIdInt;
EmpIdDec:nullable decimal[18,0]=EmpIdDec;
EmpIdIntFromBigInt:nullable int32=EmpIdIntFromBigInt;
keep
EmpIdBigInt,EmpIdInt,EmpIdDec,EmpIdIntFromBigInt;
)] 'ODB_BigIntTest_Src:BigIntTest_Src_to_XFMer_Lnk.v'
;


Generally, Integer is 4 bytes and BigInt is 8 bytes. If decimal(19,0) in the XFM provides another workaround, I wonder what the effective space consumption on the DataStage side might be.

As per the second warning message, why is the database Decimal column showing as source type "int64", with a possible range limitation converting to decimal[18,0]?
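One likely answer, assuming standard SQL Server defaults: a column declared as plain [decimal] defaults to decimal(18,0), one digit short of the 19 that the full int64 range needs, hence the "possible range limitation" warning. Declaring the precision explicitly would avoid the mismatch, e.g.:

-- [decimal] with no precision means decimal(18,0) in SQL Server;
-- decimal(19,0) gives the column enough digits for any int64 value.
ALTER TABLE BigIntTest ALTER COLUMN [EmpIdDec] [decimal](19,0) NULL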

What other options do you think would help in this R&D?

I tried this using Server jobs and have some interesting results. As this is already a long post, I did not want to include them here; I can post them if anyone is interested.

Thanks for your time,
-V
Ultramundane
Participant
Posts: 407
Joined: Mon Jun 27, 2005 8:54 am
Location: Walker, Michigan

Post by Ultramundane »

VCInDSX wrote: What other options do you think would help in this R&D?
Getting and applying the bug fixes.