Here's a good one for you:
I've got 8 GB of memory plus 17 GB of swap
on a 4-CPU Alpha Marvel.
I want to merge two large files:
the leading file has over 57 million rows and is about 20 GB;
the file outer-joined to the leading one has over 25 million rows and is over 11 GB.
Since I use the Attunity ODBC driver, and experience has shown that a join query is not likely to finish in my lifetime, I'm trying to bring the files over to my DataStage (DS) server and merge them there.
After some time the merge aborts with:
Invalid row termination character configuration
Function 'input_srt_to_row' failed
Function 'hashToRow' Failed
Function 'get_next_output_row' Failed
I've checked file integrity with the OS utilities (awk and sort)
and got the same number of fields in every row of each file.
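The check was essentially of this form (a sketch; the pipe delimiter and file names are just placeholders for the real ones):

    # print the distinct field counts per row; a single output value
    # means every row in the file has the same number of fields
    awk -F'|' '{ print NF }' leading_file.dat | sort -u
    awk -F'|' '{ print NF }' outer_join_file.dat | sort -u

Each command returned a single value per file, so the column counts look consistent.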
Does anyone have any idea or knowledge about limitations, a solution, or any alternative for what I'm trying to do?
Thanks in advance,