Search found 233 matches
- Tue Jan 15, 2008 4:04 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: dataset
- Replies: 1
- Views: 592
dataset
There are only 3 records in the Dataset, which acts as a lookup file. When I look up with 1000 input records, the output link from the dataset shows 1000 rows. If the input is 20000, the output link from the dataset shows 20000 rows. The dataset output is always equal to the input record count. What is this behaviour?
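The behaviour described is standard lookup semantics: the lookup's output link carries one row per input (stream) row, enriched from the reference data, so the output count tracks the input count, not the reference count. A minimal sketch of that idea in plain Python (hypothetical data; DataStage itself is configured in the GUI, this is only an illustration):

```python
# Illustration of normal lookup semantics: the output carries one row per
# INPUT row (enriched from the reference data), not one row per reference
# row. Hypothetical data, not DataStage code.

reference = {  # the 3-row Dataset acting as the reference/lookup data
    "A": "alpha",
    "B": "beta",
    "C": "gamma",
}

def lookup(input_rows, ref):
    """Return one output row per input row, joined against the reference."""
    return [(key, ref.get(key)) for key in input_rows]

input_rows = ["A", "B", "A", "C"] * 250   # 1000 input rows
output = lookup(input_rows, reference)

# Output row count equals the input row count, even though the reference
# has only 3 rows -- exactly the behaviour observed in the post.
assert len(output) == 1000
```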
- Tue Jan 15, 2008 3:17 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: partition
- Replies: 3
- Views: 1090
Re: partition
I have a job like SORT ---> Filter ---> Join (the filter has 2 outputs, one to each of two joins). Now my problem is that I am hash partitioning and sorting on the same keys in the Sort stage as well as in the Join stage (because the Filter is not preserving the partitioning). How can I solve this? I want to use the sa...
- Tue Jan 15, 2008 12:54 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: partition
- Replies: 3
- Views: 1090
partition
I have a job like SORT ---> Filter ---> Join (the filter has 2 outputs, one to each of two joins). Now my problem is that I am hash partitioning and sorting on the same keys in the Sort stage as well as in the Join stage (because the Filter is not preserving the partitioning). How can I solve this? I want to use the sam...
- Mon Jan 14, 2008 8:35 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Agg,1: Hash table has grown to 16384 entries.
- Replies: 9
- Views: 35754
aggregator
How can we know how many MB our Aggregator stage is using? If we know that, we can account for the record length and choose between the hash and sort methods.
ray.wurlod wrote: Hash table aggregation method is recommended for 1000 or fewer distinct grouping values per MB. Looks like you've exceeded this limit.
Please reply.
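The rule of thumb quoted above (1000 or fewer distinct grouping values per MB for the hash method) can be turned into a back-of-the-envelope memory estimate. A rough sketch, assuming only that rule of thumb; actual usage also depends on record and key width:

```python
# Rough sizing sketch for the hash-aggregation rule of thumb:
# "1000 or fewer distinct grouping values per MB". This is an estimate
# only; real memory use also depends on record/key width.

def estimated_hash_mb(distinct_groups, groups_per_mb=1000):
    """Estimate the MB the hash table needs for this many distinct groups."""
    return distinct_groups / groups_per_mb

# The warning in the related thread reported 16384 hash-table entries,
# which by this rule of thumb suggests roughly 16 MB:
estimate = estimated_hash_mb(16384)
```

If the distinct group count is large or unbounded, the sort aggregation method (with input pre-sorted on the grouping keys) avoids the growing hash table entirely.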
- Mon Jan 14, 2008 8:14 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Inline sort or external sort
- Replies: 7
- Views: 3965
hash table
An inline sort is where you specify sorting on the Partitioning tab of a stage's input link. An external sort is, presumably, where you use an explicit Sort stage. The latter gives greater flexibility, including controlling memory allocated to sorting, generation of key change indicators, and perfo...
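One of the capabilities the excerpt attributes to the explicit Sort stage is generation of key change indicators: a flag column that is 1 on the first row of each key group and 0 on the rest. A minimal sketch of that idea in plain Python (an illustration of the concept, not DataStage code):

```python
# Sketch of a key-change indicator: after sorting on the key, flag the
# first row of each key group with 1 and subsequent rows with 0.

def with_key_change(rows, key):
    """Sort rows by key and tag each with a key-change flag."""
    out = []
    prev = object()  # sentinel that matches no real key
    for row in sorted(rows, key=key):
        out.append((row, 1 if key(row) != prev else 0))
        prev = key(row)
    return out

rows = [("B", 2), ("A", 1), ("B", 3), ("A", 4)]
flagged = with_key_change(rows, key=lambda r: r[0])
# The first row of each key group carries flag 1, the rest carry 0.
```

Downstream logic (e.g. remove-duplicates or group-boundary processing) can then branch on the flag instead of re-comparing keys.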
- Mon Jan 14, 2008 8:04 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Hash table (hash partition)
- Replies: 4
- Views: 1169
Hash table (hash partition)
In the aggregation I have used hash partitioning. Now my warning is
"Hash table has grown to 32768 entries". This is only a warning, but what should I do now to eliminate these warnings?
- Mon Jan 14, 2008 8:02 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequential file in PX
- Replies: 5
- Views: 3358
Sequential file px
In Parallel jobs you need to handle NULLs explicitly. This doesn't mean that you should do null handling only in the Transformer. Moreover, you don't have to do null handling on a field if you don't transform that field or if it is not used in any stage variables. Now you are having this p...
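The advice above amounts to: substitute a value for NULLs in nullable fields before writing them to a flat file (in DataStage this is done with null-handling properties on the Sequential File stage or in a Transformer). A plain-Python sketch of the same idea, with hypothetical data:

```python
# Sketch of null handling before a flat-file export: replace NULL (None)
# fields with a default so every column has a writable value.
# Hypothetical data; in DataStage this corresponds to null-field
# properties or a NullToValue-style Transformer derivation.

def null_to_value(row, default=""):
    """Substitute a default for any NULL (None) field before export."""
    return [default if field is None else field for field in row]

record = ["42", None, "smith", None]
safe = null_to_value(record, default="")
line = "|".join(safe)   # now safe to write to the sequential file
```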
- Sun Jan 13, 2008 9:23 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequential file in PX
- Replies: 5
- Views: 3358
Sequential file in PX
I have a record in which some of the columns are null. When I export this record to a sequential file, the message is "exporting nullable fields without null handling properties". Then I used a Transformer and handled all nulls, and in the sequential file I left the nullable columns as YES. Still the er...
- Sat Jan 12, 2008 3:34 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to create Sequence file with date stamp?
- Replies: 3
- Views: 1360
- Sat Jan 12, 2008 3:23 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: sequence issue
- Replies: 15
- Views: 5064
sequence issue
Call your routine from "User Variable Activity". Use the value of the user variable to trigger downstream activities. Routine activity is not required. I tried to use the User Variables activity, but the trigger expression type is always Unconditional. How can we give a trigger condition in...
- Fri Jan 11, 2008 12:49 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: sequence issue
- Replies: 15
- Views: 5064
sequencer issue
Apparently what you have not done is read the docs. Yes, it's true. So if any condition returns a value other than zero, then I should not use a routine? Can I do it like this: my trigger condition returns a value other than 0, and I do not uncheck those 2 options. Can I still use a routine for this? Can you expl...
- Thu Jan 10, 2008 7:06 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: sequence issue
- Replies: 15
- Views: 5064
sequence issue
Have you tried using a custom trigger based upon the routine's return value? Or an explicit Failure trigger, so that you're explicitly handling the "failure"? What I have done is: in the Routine activity I called a user-defined routine which will return a value greater than 0. In the trigger p...
- Thu Jan 10, 2008 2:06 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: sequence issue
- Replies: 15
- Views: 5064
sequence issue
It is considered to have failed if the 'Automatically handle activities that fail' option is enabled, yes. Simple enough to 'fix', however - the documentation clearly states how you can handle it so the 'automatically' part doesn't kick in. In other words, it only handles it if it thinks you haven'...