Job aborts after 754K recs with Aggregator: %s
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
Hi,
My job aborts after 754,519 records.
I have a job that extracts data from an Oracle source (17 columns), performs lookups against 8 hashed files, passes the rows through an Aggregator (to get the sum of POs), and finally loads them into the target database.
                   8 LOOKUP FILES
                         |
                         |
ORACLE/SOURCE---TRANSFORMER---AGGREGATOR---ORACLE/TARGET
                         \
                          \
                        REJECT FILE (based on condition in Tx)
I have observed that all the records accumulate on the input link of the Aggregator (I guess because it needs to group the columns and sum one of them) and then the job fails. Is this because of some space/memory issue?
When I run the job for fewer than 750,000 records it works fine!
The two warnings that appear in the Director are:
AggregatorPOBalance: %s
Abnormal termination of stage LdPOFact..XfmDWIDProcess detected
Thanks,
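The observation that everything piles up on the Aggregator's input link matches how hash-based aggregation behaves on unsorted input: every distinct group key must stay in memory until the last row arrives. A minimal sketch (the row layout and amounts are hypothetical, not from this job):

```python
# Hash-based aggregation on UNSORTED input: one in-memory entry per
# distinct group key, all held until the last row is read, so memory
# grows with the number of groups -- a plausible way to die near 754K rows.
from collections import defaultdict

def aggregate_unsorted(rows):
    sums = defaultdict(float)          # every group lives in memory at once
    for po_key, amount in rows:
        sums[po_key] += amount
    return dict(sums)                  # nothing can be emitted early

rows = [("PO-1", 10.0), ("PO-2", 5.0), ("PO-1", 2.5)]
print(aggregate_unsorted(rows))        # {'PO-1': 12.5, 'PO-2': 5.0}
```

With many distinct PO keys, the table above keeps growing until the process hits a memory or workspace limit, regardless of how the rows are ordered.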
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
We faced a similar problem once. It was because one of the lookup links was fetching a large amount of data and the operations defined in the Transformer were failing. Try watching the performance statistics in the Designer while the job runs, and see whether any of your lookup links really gets stuck while processing the 754,519th record from the input.
Smitha Jacob
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
Hi Sai,
I did not check the memory or disk usage; let me know where to look for it. It might be the cause, as a lot of jobs are operating under the same file system.
Hi sjacobk,
I am using the utility "UtilityHashLookup('HshLkpPOXref', PO_SRC_OUTPUT.POKEY, 1)". Anyway, I will try to build a hashed file on the target to process the 754,519th record from the input, but I will still be extracting all the records again. Is there a way I can restart from this point? I guess I need to search this forum for checkpoints.
Thanks,
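On the restart question: one common approach (sketched here as an assumption, not a built-in DataStage feature) is to persist the last committed key after each run and use it to restrict the next extract, instead of re-reading the whole source. The file name and column name below are hypothetical:

```python
# Minimal checkpoint sketch: save the last committed key, then use it
# to filter the next source extract (e.g. via a WHERE clause).
import os

CKPT = "po_checkpoint.txt"             # hypothetical checkpoint file

def save_checkpoint(last_key):
    with open(CKPT, "w") as f:
        f.write(str(last_key))

def load_checkpoint(default=0):
    if not os.path.exists(CKPT):
        return default
    with open(CKPT) as f:
        return int(f.read().strip())

save_checkpoint(754000)
start = load_checkpoint()
print(f"SELECT ... WHERE po_key > {start}")   # would drive the source SQL
```

This only works if the source has a monotonically increasing key (or a reliable timestamp) to filter on; restarting by raw record position is not safe if the extract order can change between runs.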
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
lebos wrote:
    Check your /tmp directory also.
    But I couldn't get around this problem and had to design a solution that did not include the Aggravator stage.
    Good luck. (Great error msg, isn't it?)
    Larry
Yep... it is the 'Aggregator' stage and not the 'Aggravator' stage. Why do you want to aggravate his problem? (Just kidding.)
-
- Premium Member
- Posts: 353
- Joined: Wed Apr 06, 2005 8:45 am
I have encountered this problem in the past when I didn't have my data sorted before aggregation.
Larry, sorting the data before aggregation is always more efficient, and there is a column available in the Aggregator stage where you specify how the data is sorted.
Now getting back to the problem: since you tried sorting the data and are still getting the error, my guess would be a problem within the data itself, I mean its values. Is it possible that any of the columns have a NULL?
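A quick way to test the NULL theory is to scan the source rows for NULLs in the grouping columns before they reach the Aggregator. The column names here are hypothetical; in the actual job you would test IsNull() in the Transformer and route such rows to the reject link:

```python
# Sanity check: find rows whose grouping-key columns contain NULL (None),
# since such rows can break grouping or make a sum misbehave.
def rows_with_null_keys(rows, key_cols):
    return [r for r in rows if any(r.get(c) is None for c in key_cols)]

rows = [
    {"po_key": "PO-1", "amount": 10.0},
    {"po_key": None,   "amount": 3.0},   # suspect row
]
print(rows_with_null_keys(rows, ["po_key"]))
```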
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Sort on ALL grouping columns, and tell the Aggregator stage that you have done so (on the Inputs properties). There's no point sorting otherwise.
This allows the Aggregator stage to free up memory as soon as any change occurs in a grouping column.
At a pinch, you can tune the memory consumption of the Aggregator stage in DS.TOOLS, but let's not go there just yet.
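The memory benefit of declaring sorted input can be sketched with a streaming aggregation: once the grouping key changes, the finished group is emitted and its memory released, so only one group is held at a time. A minimal illustration (row layout is hypothetical):

```python
# Streaming aggregation over input SORTED by the grouping key: each group
# is emitted as soon as the key changes, so memory stays constant instead
# of growing with the number of distinct keys.
from itertools import groupby
from operator import itemgetter

def aggregate_sorted(rows):
    # rows must already be ordered by po_key (e.g. ORDER BY in the source SQL)
    for po_key, group in groupby(rows, key=itemgetter(0)):
        yield po_key, sum(amount for _, amount in group)  # one group at a time

rows = [("PO-1", 10.0), ("PO-1", 2.5), ("PO-2", 5.0)]
print(list(aggregate_sorted(rows)))    # [('PO-1', 12.5), ('PO-2', 5.0)]
```

Note that `groupby`, like the Aggregator told its input is sorted, silently produces wrong results if the input is not actually ordered by the key; that is why the sort must cover all grouping columns.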
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.