Search found 86 matches
- Tue Mar 06, 2007 9:23 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Not being able to import data from a fixed width flat
- Replies: 6
- Views: 2093
- Tue Mar 06, 2007 9:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimal value gives "Input buffer overrun at field"
- Replies: 4
- Views: 10192
- Mon Feb 12, 2007 7:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: writing a warning to the log in a parallel job
- Replies: 5
- Views: 1767
Background: I need to write this warning when the Compare stage has two rows that do not match. Both input datasets will have one row each. The purpose is to see if the data matches the balance row at the end of the file. The balance row would have a row count and a sum of some of the columns. Then I...
- Mon Feb 12, 2007 7:43 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Compare stage, output columns
- Replies: 16
- Views: 18947
- Sun Feb 11, 2007 10:41 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: I am getting warning while loading data into sequential file
- Replies: 5
- Views: 1818
You should use a MODIFY stage to specify explicit values for nullable columns before you send them to a sequential file. If you don't need nullable columns in the target (assuming that your sequential file is a reject file or something like that and your target is a dataset or DB stage) then you sho...
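As a rough sketch, the advice above translates into a Modify stage specification using the `handle_null` conversion, which substitutes an explicit value wherever the source column is null (the column names `cust_name` and `amount` here are hypothetical, not from the original post):

```
cust_name = handle_null(cust_name, 'UNKNOWN')
amount = handle_null(amount, 0)
```

With these conversions in place, the downstream Sequential File stage never receives a null and the "field has null value" warnings go away.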
- Sun Feb 11, 2007 10:38 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Importing Decimal field with commas
- Replies: 4
- Views: 1039
- Sun Feb 11, 2007 10:35 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DS questions
- Replies: 3
- Views: 1226
>1. How can we perform an incremental data load in DataStage? This is more of an ETL question than a DataStage question. One method is to maintain a copy of the target table as a dataset and check for the difference against your input file every time the job starts. That will get you an incremental data set....
- Sat Feb 03, 2007 6:43 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: writing a warning to the log in a parallel job
- Replies: 5
- Views: 1767
writing a warning to the log in a parallel job
Is there any way to write a warning to the log in a parallel job? Not a server job or a sequence; I know how to do that in those. I need to know how to do it in a parallel job. I tried the following, but they don't work in a parallel job transformer: UtilityWarningToLog(), DSLogWarn(). Maybe you know of something tha...
- Fri Feb 02, 2007 3:01 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Compare stage, output columns
- Replies: 16
- Views: 18947
Compare stage, output columns
For the Compare stage, all you have to do is define the following output columns: result: tinyint; first: unknown; second: unknown (first and second are subrecords). If you don't want to use the data because all you want is the result code, then you can add a MODIFY stage and specify "KEEP result". I h...
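The "KEEP result" advice above can be sketched as a one-line Modify stage specification that retains only the Compare stage's result code and discards the subrecord payload:

```
KEEP result
```

An equivalent alternative, if you prefer to name what goes rather than what stays, would be a DROP specification listing the first and second subrecord columns instead.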
- Sun Jan 28, 2007 5:03 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Best practices for business logic
- Replies: 3
- Views: 1060
Best practices for business logic
What is the best way to implement lengthy business logic in a parallel job? My specific problem is that I have an IF statement with 100 conditions, each performing a different calculation. Also, the IF condition stays the same for each column, but the calculations change. Ex: In col1 IF col0=1 then col...
- Sat May 06, 2006 5:45 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: main program: segmentation fault
- Replies: 1
- Views: 1041
main program: segmentation fault
I am getting a "main program: segmentation fault" error in my jobs when I run them. Sometimes it gives an unhandled exception error. I have 2 lookup stages and two join stages. All lookups are done with an ODBC stage. input and output source is a sequential file stage. The funny thing is t...