Search found 41 matches

by dsuser7
Wed Nov 20, 2013 3:58 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How can I output fields not grouped in aggregator?
Replies: 1
Views: 974

How can I output fields not grouped in aggregator?

Hi, I'm trying to output fields other than those used for grouping/keys in an aggregator stage. I'm using this stage to find the maximum value of one field. Is this possible? ex: Max on 4th field; keys are 1st and 2nd fields input: 123,a,asdf,45,ghk 123,a,ghg,55,dbd 234,b,olkd,90,gjsl 234,b,tklo,60,...
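The intent here, keeping whole rows rather than only the grouped keys and the aggregate, can be sketched outside DataStage. A minimal Python illustration using the sample data from the post (the last row's fifth field is truncated in the original, so "xyz" below is just a placeholder):

```python
# For each (field1, field2) key, keep the entire row whose 4th field is the
# maximum -- the effect the poster wants from the Aggregator stage.
rows = [
    ["123", "a", "asdf", 45, "ghk"],
    ["123", "a", "ghg", 55, "dbd"],
    ["234", "b", "olkd", 90, "gjsl"],
    ["234", "b", "tklo", 60, "xyz"],  # "xyz" is a placeholder for the truncated value
]

best = {}
for row in rows:
    key = (row[0], row[1])
    # Replace the stored row whenever this row's 4th field is larger.
    if key not in best or row[3] > best[key][3]:
        best[key] = row

for row in best.values():
    print(row)
```

In a parallel job the usual equivalent is to aggregate for the max per key and then join that result back to the input on the keys plus the max value.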
by dsuser7
Mon Nov 11, 2013 11:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

This really breaks my heart, because the time difference between 3.3 min and 7 min may not seem like much; but the actual job I'm concerned with joins 70 mil records with a 7 mil dataset, and it takes around 40 min.
by dsuser7
Mon Nov 11, 2013 11:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

Yikes, just realized I inverted the rows/sec stats for the jobs; it should be

1. Job with 6 columns runs 3 min 3 sec [368,061 rows/sec] --- Avg record length in bytes = 57
2. Job with 30 columns runs 7 min 26 secs [148,720 rows/sec] --- Avg record length in bytes = 143
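The corrected figures are self-consistent: multiplying each job's elapsed time by its throughput gives roughly the same total row count for both jobs, which lines up with the ~70 mil source table mentioned elsewhere in the thread. A quick arithmetic check:

```python
# Sanity check: elapsed time x throughput should give roughly the same
# total row count for both jobs.
job1_secs = 3 * 60 + 3    # 3 min 3 sec
job2_secs = 7 * 60 + 26   # 7 min 26 sec

rows_job1 = job1_secs * 368_061   # ~67.4 million rows
rows_job2 = job2_secs * 148_720   # ~66.3 million rows

print(rows_job1, rows_job2)
```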
by dsuser7
Mon Nov 11, 2013 11:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

There is no where clause in the select query. So the increase in the job's run time is just because more columns are selected (more bytes per record)?
by dsuser7
Mon Nov 11, 2013 10:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

Would creating indexes on the other columns help performance of the job?
by dsuser7
Thu Nov 07, 2013 7:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

Thank you for your reply. I have run the jobs as suggested, and here are the results: 1. Job with 6 columns runs 3 min 3 sec [148720 rows/sec] --- Avg record length in bytes = 57 2. Job with 30 columns runs 7 min 26 secs [368061 rows/sec] --- Avg record length in bytes = 143 Is the difference due to inde...
by dsuser7
Wed Nov 06, 2013 8:28 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue reading data from Oracle connector
Replies: 16
Views: 10221

Performance issue reading data from Oracle connector

Hi All, I'm selecting a few columns from a table with around 70 mil records (all records are being selected, no where clause), and joining that data with a data set of around 7 mil records. When 6 fields are selected from the Oracle stage, the job finishes in less than 5 mins, whereas the job tak...
by dsuser7
Wed Nov 06, 2013 8:00 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp comparison in Change Capture stage
Replies: 2
Views: 1483

Thank you for your reply.
I missed converting the timestamp format to the DataStage timestamp format while reading from the Oracle connector stage. Now that both timestamps being compared are in the same format, it is working.
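The fix described here, normalizing both timestamps to one string format before comparing, can be sketched in Python. The input format below ("DD-MON-YY HH.MI.SS AM") is only an assumption, a common Oracle session default; the actual formats depend on the job's and database's NLS settings:

```python
from datetime import datetime

# Hypothetical normalizer: parse an Oracle-style timestamp string and
# re-emit it in the "YYYY-MM-DD HH:MM:SS" format DataStage uses, so that
# both sides of a Change Capture comparison use the same representation.
def normalize(ts: str) -> str:
    parsed = datetime.strptime(ts, "%d-%b-%y %I.%M.%S %p")
    return parsed.strftime("%Y-%m-%d %H:%M:%S")

print(normalize("06-NOV-13 08.28.00 PM"))
```

Comparing the raw strings without this step fails even when the instants are equal, because "06-NOV-13 08.28.00 PM" and "2013-11-06 20:28:00" differ as text.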
by dsuser7
Wed Oct 30, 2013 9:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp comparison in Change Capture stage
Replies: 2
Views: 1483

Timestamp comparison in Change Capture stage

Hi All, I have a parallel job where the before dataset is an Oracle table and the after dataset is a Dataset. There are Timestamp fields to be compared, and these fields are nullable. As long as there are no nulls in either dataset, the comparison happens correctly. So I have converted the timestamp to varchar usin...
by dsuser7
Tue Jun 04, 2013 8:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Upsert LOB values to Oracle table using Oracle Connector
Replies: 8
Views: 3760

I'm sorry, I misunderstood. There are 2 separate jobs (with 2 separate source and target tables):

1. source has CLOB which is mapped to target CLOB
2. source has BLOB which is mapped to target BLOB.
by dsuser7
Tue Jun 04, 2013 7:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Upsert LOB values to Oracle table using Oracle Connector
Replies: 8
Views: 3760

There are 2 tables: one has a CLOB and the other has a BLOB datatype. The corresponding DataStage data types for these LOBs are LongVarChar and LongVarBinary respectively. I haven't defined a size for these fields in the job.

Is there an alternate stage, or are there any properties, with which I can improve the upsert performance?
by dsuser7
Mon Jun 03, 2013 4:26 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Upsert LOB values to Oracle table using Oracle Connector
Replies: 8
Views: 3760

Thanks Craig for the quick reply.

These 2 tables are not in the same database. Can't this be handled in Datastage?


-Thanks
by dsuser7
Mon Jun 03, 2013 3:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Upsert LOB values to Oracle table using Oracle Connector
Replies: 8
Views: 3760

Upsert LOB values to Oracle table using Oracle Connector

Hi, I'm trying to read from one Oracle table and write to another Oracle table. The source and target tables both contain LOB fields. The Oracle connector initially gave an error that the array size should be set to 1 for it to be able to process the LOB values. There are around 30 million records to be p...
by dsuser7
Wed Mar 09, 2011 10:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pivoting Dynamic number of columns
Replies: 5
Views: 3398

I think similar requirements have been discussed here just recently, but: Read the record as a single varchar column. (8.5) Use a transformer with looping to build the pivoted output records. (pre-8.5) Use a transformer with multiple output links to build the pivoted records and funnel them together, ...
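The looping approach described above can be sketched outside DataStage: each input record arrives as a single delimited string holding a key plus a variable number of values, and the loop emits one (key, value) output row per value. The sample data and comma delimiter below are assumptions for illustration only:

```python
# Pivot a dynamic number of columns: one output row per value,
# mimicking a transformer loop over the fields of each record.
records = [
    "cust1,10,20,30",   # three values -> three output rows
    "cust2,40",         # one value   -> one output row
]

pivoted = []
for record in records:              # record read as a single varchar column
    fields = record.split(",")
    key, values = fields[0], fields[1:]
    for value in values:            # the transformer-loop equivalent
        pivoted.append((key, value))

print(pivoted)
```

The pre-8.5 variant with multiple output links is the same idea unrolled: one link per possible value position, with empty positions filtered out before the funnel.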