I have two text files of a hundred-plus megabytes each, written with std::ofstream, and I want to concatenate them. Using fstreams to pass the data through memory and write out a single file usually ends up being slow.
Is there any way of merging them faster than O(n)?
That would mean processing the data without passing over it even once. You cannot merge the files without reading each one at least once, so the short answer is no.
For reading the data, consider unbuffered reads in large blocks (see std::istream::read, which std::ifstream inherits) rather than line-by-line or character-by-character extraction.