Teradata RDBMS code 2617: Overflow occurred
Posted: Wed Sep 08, 2010 8:55 am
We have a job failing in production (and elsewhere) with the following error message:
9713 FATAL Sat Sep 4 08:21:03 2010 heidswk_dly_cust_acct_drv,0: RDBMS code 2617: Overflow occurred computing an expression involving dly_cust_acct_drv_E2.CUST_ACCT_ID SQL statement: LOCK ROW FOR ACCESS SELECT CAST(cust_acct_id AS DECIMAL(20)), CAST(cust_acct_typ_cd AS CHAR(5)), ... etc. ... CAST(stat_au_chnl_cd AS CHAR(4)) FROM P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2 (CC_TeraConnection::executeSelect, file CC_TeraConnection.cpp, line 2,458) [pxbridge.C:5949]

We are using DS 8.1 and Teradata 13. We are using the Teradata Connector to read/write from the database. In this case, it is doing a bulk write to the target table.
The E2 table referenced is the error table used by FastLoad to capture uniqueness violations. The input has 3.5M records, and the target table actually gets 100% of these records. Yet the E2 table gets loaded with 5.1M records from who knows where. From our testing, we cannot find where the duplicates originate. If I create a test job that writes to a dataset instead of the target table, the dataset is 100% unique.
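One check worth running directly in Teradata is to see which key values the E2 table actually captured. The table and column names below are taken from the error log above; adjust them if your schema differs. This assumes the E2 table has the same layout as the target, which is how I understand FastLoad's second error table to work:

SELECT cust_acct_id, COUNT(*) AS row_cnt
FROM P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2
GROUP BY cust_acct_id
HAVING COUNT(*) > 1
ORDER BY row_cnt DESC;

If a plain SELECT like this succeeds but the connector's CAST(cust_acct_id AS DECIMAL(20)) version fails, that would suggest the overflow comes from the stored values in E2 rather than from the load itself.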
Even more confusing is the error message itself. The cust_acct_id field is DECIMAL(20,0) in both the input datastream and the target table. There is no datatype conversion going on, at least not in our code (I understand the underlying storage in DataStage is probably string, and this gets converted to a Teradata decimal datatype as the data is loaded).
I have a test job in development that fails the same way. It reads a dataset, passes it through a Copy stage (no column modifications), and writes to the target table. No logic or transformations, and still it fails with this 2617 error. The only real difference is that there were duplicates in the input dataset. But why a datatype error instead of a warning/error for the duplicates?
Any suggestions?
Brad.