Search found 14 matches
- Thu Nov 02, 2006 1:01 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Lookup or Join or Merge
- Replies: 10
- Views: 6102
Oh yeah. I don't know if this holds for other databases, but for DB2, if the input and lookup tables are in the same database, it's most efficient to do the join in the DB2 read itself (in this case an outer join), since DB2 does a better job of optimizing data retrieval from its own database than DataStage does...
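The idea above can be sketched with SQLite standing in for DB2 (table and column names are made up for the demo): instead of reading one table and doing a row-by-row lookup against the other from the client side, push a single outer join into the database read and let the engine optimize it.

```python
import sqlite3

# SQLite as a stand-in for DB2: both tables live in the same database,
# so one LEFT OUTER JOIN in the read replaces a client-side lookup.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER);
    CREATE TABLE customers (cust_id INTEGER, name TEXT);
    INSERT INTO orders VALUES (1, 10), (2, 20), (3, 99);
    INSERT INTO customers VALUES (10, 'Alice'), (20, 'Bob');
""")

# The outer join keeps unmatched rows from the stream side (order 3
# has no matching customer, so its name comes back as NULL/None).
rows = conn.execute("""
    SELECT o.order_id, c.name
    FROM orders o
    LEFT OUTER JOIN customers c ON o.cust_id = c.cust_id
    ORDER BY o.order_id
""").fetchall()
print(rows)  # → [(1, 'Alice'), (2, 'Bob'), (3, None)]
```

The same SELECT, placed in the DB2 read stage's user-defined SQL, would make the database do the matching work before any rows reach the job.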
- Fri Oct 27, 2006 5:07 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: how to replace special characters with N in a string
- Replies: 9
- Views: 4617
- Thu Oct 26, 2006 4:42 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: how to replace special characters with N in a string
- Replies: 9
- Views: 4617
Re: how to replace special characters with N in a string
You can use the Convert function in a Transformer stage. Syntax: Convert('FromList', 'ToList', Expression). In FromList, you specify the list of special characters that need to be replaced; in ToList, the list of new characters that will replace the special chars. In your case...
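Convert maps each character in FromList to the character at the same position in ToList. Python's `str.translate` does the same positional character-for-character mapping, sketched below (the character sets and the helper name `convert` are just examples, not DataStage code):

```python
def convert(from_list: str, to_list: str, expression: str) -> str:
    """Mimic DataStage Convert: replace each char in from_list with the
    char at the same position in to_list (lists assumed equal length here)."""
    return expression.translate(str.maketrans(from_list, to_list))

# Replace a few special characters with 'N', as in the question:
cleaned = convert("#$%", "NNN", "AB#CD$EF%")
print(cleaned)  # → ABNCDNEFN
```

Note one difference: real DataStage Convert deletes characters when ToList is shorter than FromList, which this equal-length sketch does not cover.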
- Mon Oct 23, 2006 1:57 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Passing UserID and Password for DB connection
- Replies: 10
- Views: 3897
- Thu Sep 07, 2006 3:53 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: look up failed
- Replies: 6
- Views: 2976
- Thu Aug 24, 2006 3:03 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Not able to fetch any record.
- Replies: 7
- Views: 3158
- Thu Aug 24, 2006 2:23 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage server Maintenance
- Replies: 24
- Views: 12970
We too faced a similar issue. We were creating temporary datasets in our jobs which were not being deleted and started eating up disk space. You can have a UNIX script delete such temp files or datasets, and run it as the last step in your sequence via an Execute Command stage. This way you can ensure t...
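A minimal sketch of that cleanup step, shown in Python (the directory, suffix, and seven-day cutoff are assumptions for the demo; real parallel datasets are multi-file and are normally removed with the orchadmin utility rather than a plain delete):

```python
import os
import tempfile
import time

def purge_old_temp_files(directory: str, suffix: str = ".tmp",
                         max_age_days: float = 7.0) -> int:
    """Delete files ending in `suffix` older than `max_age_days`;
    return how many were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if (name.endswith(suffix) and os.path.isfile(path)
                and os.path.getmtime(path) < cutoff):
            os.remove(path)
            removed += 1
    return removed

# Demo: one stale file (backdated 10 days) and one fresh file.
with tempfile.TemporaryDirectory() as d:
    stale = os.path.join(d, "old.tmp")
    fresh = os.path.join(d, "new.tmp")
    for p in (stale, fresh):
        open(p, "w").close()
    os.utime(stale, (time.time() - 10 * 86400,) * 2)
    print(purge_old_temp_files(d))  # → 1 (only the stale file is removed)
```

In the scenario from the post, the equivalent script would be invoked from an Execute Command stage as the final activity in the sequence, so every run cleans up after itself.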
- Tue Aug 08, 2006 11:14 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2UDBEnterprise Load against DB2 without DPF?
- Replies: 3
- Views: 2822
- Tue Aug 08, 2006 10:04 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to find the last 10 runs time of a job
- Replies: 5
- Views: 2587
- Mon Aug 07, 2006 4:36 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Agg,1: Hash table has grown to 16384 entries.
- Replies: 9
- Views: 35751
- Mon Aug 07, 2006 4:06 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2UDBEnterprise Load against DB2 without DPF?
- Replies: 3
- Views: 2822
Re: DB2UDBEnterprise Load against DB2 without DPF?
You could run through the following checks: 1 - Is your target database partitioned? If not, the DB2EE stage may not work correctly, as it will not be able to load data in parallel; you should be using a DB2 API stage instead. 2 - If your target database is partitioned, check whether the partitioning type in the DB2EE stage...
- Thu Aug 03, 2006 3:15 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job aborting when running with large data sets
- Replies: 7
- Views: 3188
- Wed Aug 02, 2006 4:48 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job aborting when running with large data sets
- Replies: 7
- Views: 3188
- Wed Aug 02, 2006 4:35 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to prevent the join stage from sorting the records
- Replies: 18
- Views: 7847
Re: How to prevent the join stage from sorting the record
You can check the partitioning type specified on the link.