Search found 40 matches

by priyadharsini
Fri Mar 25, 2011 3:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Can aggregator stage be of some help
Replies: 21
Views: 10836

Do you want to calculate the profit percentage based on Product id and country id together, or using Product id and country id separately?
by priyadharsini
Fri Mar 25, 2011 2:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: convert rows in to column
Replies: 7
Views: 4966

Are you doing this based on any key column?
by priyadharsini
Thu Mar 17, 2011 6:00 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unable to read data from Sequential stage
Replies: 4
Views: 2568

Please provide some sample data showing how the split is happening.
by priyadharsini
Fri Feb 25, 2011 3:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Passing Dataset value to Oracle Database stage
Replies: 4
Views: 2510

Use a sparse lookup if you want to pass the value dynamically, but whether that is preferable depends on the volume of data on your source and reference links.
by priyadharsini
Fri Feb 25, 2011 2:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need your help on using Dataset stage
Replies: 4
Views: 2893

A Dataset does not allow two columns with the same name. To be clear: are you going to have the same value in two columns?
by priyadharsini
Wed Feb 23, 2011 12:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture reject record from DB2 stage
Replies: 2
Views: 1877

Output Reject option is available only for Write method = Upsert in DB2 Enterprise stage.
by priyadharsini
Mon Feb 21, 2011 5:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unable to convert the Varchar(9) to Numeric(9)
Replies: 7
Views: 4601

Check the Director log for any warnings.
by priyadharsini
Thu Feb 17, 2011 3:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate job reports in v8
Replies: 2
Views: 1716

Tools --> Reporting Console
by priyadharsini
Tue Feb 15, 2011 3:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat File and multiple, conditional record types.
Replies: 6
Views: 5344

Try reading the file as a single column and process the records sequentially. Using stage variables you can combine the transaction and addenda records, as in the sketch below.
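A rough sketch of the stage-variable idea, assuming the whole record lands in a single VarChar column lnk_in.rec and the record type is given by its first two characters (the column name, the "TX"/"AD" type codes and the "|" separator are all hypothetical):

    Stage variable svType:    lnk_in.rec[1,2]
    Stage variable svTxn:     If svType = "TX" Then lnk_in.rec Else svTxn
    Stage variable svOut:     If svType = "AD" Then svTxn : "|" : lnk_in.rec Else ""
    Output constraint:        svOut <> ""

Stage variables keep their values across rows, so svTxn always holds the last transaction record seen and svOut produces one combined row for each addenda record that follows it. This only works while the Transformer processes the records sequentially, as noted above.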
by priyadharsini
Fri Feb 11, 2011 4:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Validate timestamp with milliseconds in Transformer
Replies: 4
Views: 12983

Your timestamp string should be VarChar(26) to include the milliseconds.
Convert the VarChar to a timestamp with StringToTimestamp(i/p col, "%yyyy-%mm-%dd %hh:%nn:%ss.6") and then apply the IsValid function, as sketched below.
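A minimal sketch of how that could look in a parallel Transformer, following the recipe above and assuming the string arrives on an input column lnk_in.ts_str and is staged in a stage variable svTs (both names hypothetical):

    Stage variable svTs:    StringToTimestamp(lnk_in.ts_str, "%yyyy-%mm-%dd %hh:%nn:%ss.6")
    Output constraint:      IsValid("timestamp", svTs)

Rows that fail the constraint can be routed down a reject link rather than being silently dropped.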
by priyadharsini
Fri Feb 11, 2011 4:25 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Timestamp issue inserting into Oracle table from flat file
Replies: 5
Views: 2507

Is your job a server job or a parallel job?
If it is parallel, enable the Microseconds checkbox in the column's extended properties and propagate it through to your target.
by priyadharsini
Thu Feb 10, 2011 2:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage job still in "Running" state -- after us
Replies: 3
Views: 5274

Use Cleanup Resources in the Director and clear the job's status file.
by priyadharsini
Thu Feb 10, 2011 2:00 am
Forum: General
Topic: Issue in DB2
Replies: 3
Views: 11385

There is a high possibility that the load from /dev/null command has locked the table. Contact your DBA to release the lock and try again.
by priyadharsini
Thu Feb 10, 2011 1:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Delta Run(Partial Run)
Replies: 3
Views: 1989

The constraint should be based on the number of nodes the job is running on.