Aggregator Limit
Posted: Fri May 23, 2008 7:00 am
I receive the following error during aggregation:
Aggregator_37,0: Failure during execution of operator logic.
Aggregator_37,0: Input 0 consumed 9614492 records.
Aggregator_37,0: Output 0 produced 9614492 records.
Aggregator_37,0: Fatal Error: pipe write failed: Broken pipe
Aggregator_37,0: Failure during execution of operator logic.
Aggregator_37,0: Input 0 consumed 9614046 records.
Aggregator_37,0: Output 0 produced 6683230 records.
Aggregator_37,0: Fatal Error: sendReadAck(): write failed on node SASHQOKWSDA Broken pipe
node_node2: Player 18 terminated unexpectedly.
I assume it's a space issue, as the job ran fine with a 5 million row output. I ran it multiple times and it keeps aborting at around the same record count. Is there a limit I should change?
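If it is a space issue, one quick thing to rule out is the scratch disk filling up mid-run. A minimal sketch, assuming the scratch directory is whatever "scratch" resource your APT configuration file ($APT_CONFIG_FILE) lists; the /tmp path below is only a placeholder, not necessarily your actual scratch location:

```shell
# One-shot check of free space on the parallel engine's scratch directory.
# Substitute the real scratch path from your APT configuration file;
# /tmp here is just a stand-in.
SCRATCH_DIR=/tmp
df -kP "$SCRATCH_DIR" | awk 'NR==2 {print "free KB on " $6 ": " $4}'
```

Running this (or `df` in a loop) on each node while the job executes shows whether free space collapses right before the broken-pipe error appears.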
Thanks,