Search found 7 matches
- Fri Aug 08, 2008 12:57 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error inserting into Netezza table using ODBC Stage
- Replies: 4
- Views: 2541
Thanks Aruna, I've checked and RCP is not enabled. Also, we don't drop or even create any fields in the transform. The Debug shows that the successful queries pull the c0, c1, c2... variables from an external table. Then there's that one point where the variables are referred to as stand-alone value...
- Mon Aug 04, 2008 1:01 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Problem with unidentifiable (junk/garbage) characters
- Replies: 1
- Views: 3246
I didn't see this until today but the solution is to simply look at the end of the nzlog file if you are using a Netezza Enterprise stage. Normally this file is in the /tmp directory but it can be redirected. The name should be <database>.<table name>.log and it should show something like this: Foun...
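The diagnostic step above can be sketched as a quick shell check. The database and table names here are hypothetical placeholders, and the path assumes the default /tmp location mentioned in the post (the file can be redirected elsewhere):

```shell
# Hypothetical database/table names -- substitute your job's target.
DB=MYDB
TABLE=MY_TABLE

# The Netezza Enterprise stage writes its load log as
# <database>.<table name>.log, normally under /tmp unless redirected.
LOGFILE="/tmp/${DB}.${TABLE}.log"

# Show the end of the log, where the error/reject summary appears.
if [ -f "$LOGFILE" ]; then
    tail -n 20 "$LOGFILE"
else
    echo "No load log found at $LOGFILE"
fi
```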
- Tue Jul 29, 2008 12:00 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error inserting into Netezza table using ODBC Stage
- Replies: 4
- Views: 2541
Error inserting into Netezza table using ODBC Stage
We've got a very simple job (OS is AIX) that had been working fine until this month when it began failing on our test system with the following error: ERROR: Attribute 'C0' not found The job reads from a DataSet, goes through a Transform which does Null handling on those fields that are nullable and...
- Tue Oct 02, 2007 9:37 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ERROR: reserve exceeded : Out of memory
- Replies: 2
- Views: 2043
ERROR: reserve exceeded : Out of memory
We're running DS 7.5 and have started getting this error with jobs that have been running for months without any problems. Does anyone know the cause and solution?
Mark
- Wed Sep 12, 2007 1:03 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bigint values changed writing to Netezza table
- Replies: 3
- Views: 1762
- Tue Sep 11, 2007 4:40 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bigint values changed writing to Netezza table
- Replies: 3
- Views: 1762
- Tue Sep 11, 2007 3:54 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bigint values changed writing to Netezza table
- Replies: 3
- Views: 1762
Bigint values changed writing to Netezza table
We're testing an upcoming data migration and have come across something odd. Several jobs that use ODBC stages to write to Netezza tables are changing very large bigint values to even numbers. The values came from Oracle tables where the fields were defined as Number(17,0). Example: 9999904067010175...