Search found 81 matches
- Mon Jul 27, 2009 6:43 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Concatenation of two command outputs as a filepath
- Replies: 4
- Views: 2006
tanmaya, Can you just check your Director log for these two command outputs? I think the command output will include a newline character at the end. If so, your concatenation will not give the correct result. Use the Field function to extract the correct value and then concatenate. Let ...
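The advice above can be sketched outside DataStage. This is a minimal Python illustration (not DataStage BASIC) of why concatenating raw command output breaks a filepath, and how stripping the trailing newline fixes it; the path and filename are hypothetical.

```python
# Command output typically ends with a newline; concatenating it raw
# puts the newline in the middle of the assembled path.
dir_part = "/data/landing\n"   # stand-in for a command's captured output
file_part = "extract.csv"

broken = dir_part + file_part                 # newline ends up mid-path
fixed = dir_part.rstrip("\n") + "/" + file_part

print(repr(broken))  # '/data/landing\nextract.csv'
print(fixed)         # /data/landing/extract.csv
```

In DataStage the same cleanup is what the Field function call accomplishes: it takes the text before the newline delimiter instead of the whole captured output.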
- Thu Apr 16, 2009 7:28 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimal warning
- Replies: 2
- Views: 1063
Re: Decimal warning
REJECTS: When checking operator: When validating export schema: At field "LOSGR": "null_field" length (1) must match field's fixed width (15). The decimal field "LOSGR" is nullable. I think you might have specified the null field value as '' (a single character)....
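The error above says the null representation (length 1) does not fill the field's fixed width (15). A minimal Python sketch of the idea, assuming a fixed-width export where every value, including the null marker, must occupy the declared width; the space-padding choice is an assumption:

```python
FIELD_WIDTH = 15  # fixed width declared for field "LOSGR"

def format_losgr(value):
    # In a fixed-width export, the null representation must occupy the
    # full field width; a '' or single-character null value is shorter
    # than 15 bytes and gets rejected, as in the warning above.
    if value is None:
        return " " * FIELD_WIDTH          # null marker padded to width
    return str(value).rjust(FIELD_WIDTH)  # right-justified numeric text

print(repr(format_losgr(None)))
print(repr(format_losgr(123.45)))
```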
- Thu Apr 16, 2009 5:12 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error in control file when using Oracle Load
- Replies: 9
- Views: 3610
- Thu Apr 16, 2009 5:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error in control file when using Oracle Load
- Replies: 9
- Views: 3610
hi, I believe the job failure is not because of the number of fields. I assume the load option is rejecting some records, and hence the error. Can you check whether the Oracle BAD files were created for that job run? I got a similar problem using the Oracle load option. The bulk load option will convert...
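The check suggested above (look for reject files left by the load) can be sketched in Python; the directory and file names are hypothetical stand-ins for wherever the load writes its .bad files.

```python
import glob
import os
import tempfile

def bad_files(load_dir):
    """Return any SQL*Loader-style .bad reject files in load_dir."""
    return sorted(glob.glob(os.path.join(load_dir, "*.bad")))

# Demo with a throwaway directory standing in for the load directory.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "orders.bad"), "w").close()  # simulated reject file
    open(os.path.join(d, "orders.dat"), "w").close()  # ordinary data file
    print([os.path.basename(p) for p in bad_files(d)])
```

If the list is non-empty after a run, the load rejected records, which matches the failure mode described above.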
- Thu Apr 16, 2009 4:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Configuration file in DataStage
- Replies: 2
- Views: 3728
- Thu Apr 09, 2009 3:30 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Merge Duplicate Records
- Replies: 7
- Views: 4307
- Thu Apr 09, 2009 3:18 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Merge Duplicate Records
- Replies: 7
- Views: 4307
- Mon Apr 06, 2009 6:42 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Convert chr(0) to chr(32) (or how to trim them)
- Replies: 6
- Views: 17403
The source system is Oracle. What seems to have happened is that the source is all declared as CHAR, which automatically pads it with chr(0). Then we end up having to apply a bunch of trims in the Oracle SELECT during our transformations. I tried convert(chr(0),chr(32),input_string) in the transform...
- Mon Apr 06, 2009 2:48 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Convert chr(0) to chr(32) (or how to trim them)
- Replies: 6
- Views: 17403
hi, Which is the source system for these data? Char(0) is nothing but the null character, so the TrimB function alone will not work, as it only removes the space character. How did you try to convert char(0) to char(32)? The reason I am asking this question is, you said the data conta...
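The point above, that a space-trim cannot touch NUL padding and the chr(0) characters must first be replaced, can be demonstrated with a short Python sketch (not DataStage syntax); the sample value is hypothetical.

```python
def clean_char(s):
    # chr(0) is not whitespace, so a plain trim leaves it in place;
    # replace NUL with a space first, then trim.
    return s.replace("\x00", " ").strip()

padded = "ABC" + "\x00" * 5       # CHAR-style value padded with NULs
print(repr(padded.strip()))       # trim alone leaves the NULs behind
print(repr(clean_char(padded)))   # 'ABC'
```

The same two-step idea (convert chr(0) to chr(32), then trim) is what the posts above are working toward in the DataStage Transformer.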
- Tue Mar 31, 2009 8:35 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: External Source stage
- Replies: 12
- Views: 5750
Re: External Source stage
Are you using the dsrecords function to get the count from the file? I thought this function only relates to a DataSet.
- Tue Mar 31, 2009 4:32 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Abort After Rows - Write to Sequential File Not working
- Replies: 7
- Views: 5855
Why can't you write into two streams? Anything that passes goes into your actual output, while the failure link is dealt with appropriately, rather than aborting the job or writing Unix scripts, etc. You are right v2kmadhav 8). But to deal with failure records, we should have some conditio...
- Tue Mar 31, 2009 3:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Abort After Rows - Write to Sequential File Not working
- Replies: 7
- Views: 5855
Re: Abort After Rows - Write to Sequential File Not working
We actually want to abort the job and at the same time capture the invalid value in an output file. Could this be done? Hi, I don't think 1) capturing the invalid values in a file and 2) aborting the job can be done in a single job. Yes... It is not possible to capture the records if we set the limit...
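The two-stream idea discussed above can be sketched in Python: route failures to a reject stream while good rows pass through, and abort only once the rejects exceed a limit, so the invalid values are captured before the abort. The validity rule and limit here are hypothetical stand-ins.

```python
def process(rows, reject_limit=1):
    """Split rows into good/reject streams; abort past the reject limit."""
    good, rejects = [], []
    for row in rows:
        if row >= 0:                  # stand-in for the real validity check
            good.append(row)
        else:
            rejects.append(row)       # captured before any abort happens
            if len(rejects) > reject_limit:
                raise RuntimeError("reject limit exceeded, aborting")
    return good, rejects

print(process([1, -2, 3]))  # one reject, within the limit
```

Because each reject is recorded before the limit check, the reject stream retains the invalid values even when the run is aborted, which is the behavior the posters were trying to get from a single job.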
- Tue Mar 31, 2009 1:09 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to achieve other than Pivot
- Replies: 10
- Views: 3272
- Tue Mar 31, 2009 12:19 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Upgrading the SAP R/3 PACKs
- Replies: 2
- Views: 896
- Mon Mar 30, 2009 11:35 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer
- Replies: 11
- Views: 2682
And don't worry about extra stages; that's what pipeline parallelism is all about... And just for knowledge I am asking: can't we count the number of rows in a Transformer? I thought we can do any transformation or process in the Transformer. Thanks. Yes, we can count the number of rows in a Transformer...
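The row-counting idea above can be sketched in Python: a counter incremented once per row as it passes through the transform, analogous to bumping a stage variable per input row in a Transformer. The transformation itself (upper-casing) is a hypothetical stand-in.

```python
def transform(rows):
    """Apply a per-row transformation while counting the rows processed."""
    count = 0                     # plays the role of a stage variable
    out = []
    for row in rows:
        count += 1                # incremented once per input row
        out.append(row.upper())   # the actual per-row transformation
    return out, count

result, n = transform(["a", "b", "c"])
print(result, n)  # ['A', 'B', 'C'] 3
```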