Search found 7 matches

by nbd4id1
Fri Aug 08, 2008 12:57 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error inserting into Netezza table using ODBC Stage
Replies: 4
Views: 2541

Thanks Aruna, I've checked and RCP is not enabled. Also, we don't drop or even create any fields in the transform. The debug output shows that the successful queries pull the c0, c1, c2... variables from an external table. Then there's that one point where the variables are referred to as stand-alone value...
by nbd4id1
Mon Aug 04, 2008 1:01 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Problem with unidentifiable (junk/garbage) characters
Replies: 1
Views: 3246

I didn't see this until today but the solution is to simply look at the end of the nzlog file if you are using a Netezza Enterprise stage. Normally this file is in the /tmp directory but it can be redirected. The name should be <database>.<table name>.log and it should show something like this: Foun...
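The check described above can be sketched as a small script. This is a hedged illustration only: the log location, the `<database>.<table name>.log` naming convention, and the sample contents are taken from the post, but the `db`/`table` names here are placeholders you would substitute with your own.

```python
import os

# Placeholder names -- substitute your actual database and table.
db, table = "testdb", "customers"
log_path = f"/tmp/{db}.{table}.log"

# For a self-contained demo, write hypothetical nzlog-style contents;
# in practice the Netezza Enterprise stage produces this file itself.
with open(log_path, "w") as f:
    f.write("Found bad records\nRow 42: invalid character in column 3\n")

# Look at the end of the log, as the post suggests.
with open(log_path) as f:
    tail = f.readlines()[-20:]
print("".join(tail))

os.remove(log_path)
```

The useful part for the original problem is simply the last few lines of that file, which identify the rows and columns containing the rejected characters.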
by nbd4id1
Tue Jul 29, 2008 12:00 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error inserting into Netezza table using ODBC Stage
Replies: 4
Views: 2541

Error inserting into Netezza table using ODBC Stage

We've got a very simple job (OS is AIX) that had been working fine until this month when it began failing on our test system with the following error: ERROR: Attribute 'C0' not found The job reads from a DataSet, goes through a Transform which does Null handling on those fields that are nullable and...
by nbd4id1
Tue Oct 02, 2007 9:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ERROR: reserve exceeded : Out of memory
Replies: 2
Views: 2043

ERROR: reserve exceeded : Out of memory

We're running DS 7.5 and have started getting this error with jobs that have been running for months without any problems. Does anyone know the cause and solution?

Mark
by nbd4id1
Wed Sep 12, 2007 1:03 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Bigint values changed writing to Netezza table
Replies: 3
Views: 1762

OK. Found a thread from May-June 2007 that covered the problem. It is not resolved but there does appear to be a workaround.
by nbd4id1
Tue Sep 11, 2007 4:40 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Bigint values changed writing to Netezza table
Replies: 3
Views: 1762

We're using a 4 node configuration file. A Peek stage was substituted for the ODBC stage and a single record processed. The correct value was shown for the bigint field in the Peek stage.
Mark
by nbd4id1
Tue Sep 11, 2007 3:54 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Bigint values changed writing to Netezza table
Replies: 3
Views: 1762

Bigint values changed writing to Netezza table

We're testing an upcoming data migration and have come across something odd. Several jobs that use ODBC stages to write to Netezza tables are changing very large bigint values to even numbers. The values came from Oracle tables where the fields were defined as Number(17,0). Example: 9999904067010175...
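One plausible explanation for the symptom above (not confirmed in the thread, just an assumption worth checking) is floating-point rounding: a Number(17,0) value can exceed the 53-bit significand of an IEEE-754 double, so any layer that binds a BIGINT through a double will round it, typically to an even value. The minimal sketch below uses an illustrative 17-digit value, not the one truncated in the post.

```python
# Illustrative 17-digit value (hypothetical, not the post's actual value).
value = 99999040670101751

# Simulate an int -> double -> int hop, as an ODBC layer binding
# BIGINT through a C double would do.
roundtrip = int(float(value))

print(value)       # 99999040670101751
print(roundtrip)   # a nearby even value, not equal to the original

# Values up to 2**53 survive the round trip exactly.
assert int(float(2**53)) == 2**53
```

If this is the cause, the Peek-stage test described in the follow-up post would show the correct value (DataStage carries it as an integer), while the write path that converts through a double would corrupt it.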