Merge Stage Fails

Posted: Sun Jun 20, 2004 8:31 am
by roy
Hi all,
Here's a good one for you: I've got 8 GB of memory plus 17 GB of swap on a 4-CPU Alpha Marvel, and I want to merge two large files. The leading file has over 57 million rows (about 20 GB); the file outer-joined to it has over 25 million rows (over 11 GB).
After some time the merge aborts with:
Invalid row termination character configuration
Function 'input_srt_to_row' failed
Function 'hashToRow' Failed
Function 'get_next_output_row' Failed
Since I use the Attunity ODBC driver, and experience has shown that a join query is not likely to finish in my lifetime, I'm trying to extract the files to my DataStage server and merge them there.
I've checked file integrity using OS utilities (awk and sort) and found the same number of fields in every row of each file.
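For anyone wanting to repeat that integrity check, a field-count audit can be sketched in awk like this (file name and comma delimiter are illustrative; change -F to match your actual delimiter):

```shell
# Tally how many rows have each field count (NF).
# A clean file prints exactly one line; extra lines reveal malformed rows.
awk -F',' '{ counts[NF]++ } END { for (n in counts) print n " fields: " counts[n] " rows" }' leading_file.txt
```

If more than one field count appears, piping through `awk -F',' 'NF != 20' leading_file.txt` (with 20 replaced by the expected count) will show the offending rows.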

Does anyone have any ideas, know of relevant limitations, or can suggest a solution or alternative to what I'm trying?

Thanks in advance,

Posted: Mon Jun 21, 2004 3:17 pm
by roy
Well, since there are no replies yet, I intend to try splitting the leading file into smaller files and running them against the 11 GB file. Since the 20 GB file is itself the product of a merge, my guess is that an 8 GB main file in a merge doing a left outer join to the 11 GB file should work.
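The splitting step can be sketched with the standard Unix split utility (file names and the 20-million-row chunk size are assumptions; -l splits on line boundaries so no record is cut in half):

```shell
# Split the ~57-million-row leading file into chunks of at most
# 20 million rows each, producing leading_chunk_aa, leading_chunk_ab, ...
# Each chunk is then merged separately against the 11 GB file.
split -l 20000000 leading_file.txt leading_chunk_
```

The per-chunk merge outputs would then be concatenated to rebuild the full result; since each input row lands in exactly one chunk, a left outer join per chunk gives the same rows as one big join.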

If I'm satisfied with the result, I might consider, as Ken says, the "divide and conquer" approach from step 1 in this case.

I'll try to update this thread if any interesting info turns up.

Cheers,

Change the escape character to 0

Posted: Thu Apr 20, 2006 2:35 am
by changming
That worked well for me. Don't ask me why; I learned it from someone else's post.