I have two text files of a hundred-plus megabytes each, both written with std::ofstream, and I want to concatenate them. Using fstreams to store the data to create a single file usually ends up with an
It really depends on whether you wish to use "pure" C++ for this. Personally, at the cost of portability, I would be tempted to write:
#include <cstdlib>
#include <sstream>

// Usage: concat <outfile> <infile1> [<infile2> ...]
int main(int argc, char* argv[]) {
    if (argc < 3) { return 1; } // need an output file and at least one input
    std::ostringstream command;
    command << "cat "; // Linux only; the equivalent Windows command differs slightly
    for (int i = 2; i < argc; ++i) { command << argv[i] << " "; }
    command << "> ";
    command << argv[1];
    return std::system(command.str().c_str());
}
Is it good C++ code? No, not really (it is non-portable and does not escape command arguments).
But it'll get you way ahead of where you are standing now.
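For reference, here is a rough sketch of the Windows flavor mentioned in the comment above, assuming the program runs under cmd.exe, whose copy /b command joins binary files with a "+" between the inputs; everything else stays the same:

#include <cstdlib>
#include <sstream>

int main(int argc, char* argv[]) {
    if (argc < 3) { return 1; } // need an output file and at least one input
    std::ostringstream command;
    command << "copy /b ";
    // cmd.exe expects "copy /b in1+in2+... out"
    for (int i = 2; i < argc; ++i) {
        command << argv[i];
        if (i + 1 < argc) { command << "+"; }
    }
    command << " " << argv[1];
    return std::system(command.str().c_str());
}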
As for a "real" C++ solution, with all the ugliness that streams can muster...
#include <fstream>
#include <string>
#include <vector>

static size_t const BufferSize = 8192; // 8 KB

void appendFile(std::string const& outFile, std::string const& inFile) {
    std::ofstream out(outFile, std::ios_base::app |
                               std::ios_base::binary |
                               std::ios_base::out);
    std::ifstream in(inFile, std::ios_base::binary |
                             std::ios_base::in);

    std::vector<char> buffer(BufferSize);
    while (in.read(buffer.data(), buffer.size())) {
        out.write(buffer.data(), buffer.size());
    }

    // The loop exits when "read" hits EOF before filling the buffer,
    // but it may still have placed *some* bytes in it: write those too.
    out.write(buffer.data(), in.gcount());
}