I have multiple files of text that I need to read and combine into one file. The files are of varying size: 1 - 50 MB each. What's the most efficient way to combine these files?
This code targets .NET 4.0, but it is also compatible with .NET 2.0 (for text files):
// Note: the output file must live outside the folder being enumerated,
// or GetFiles will pick it up as one of the inputs.
using (var output = new StreamWriter("D:\\output.txt"))
{
    foreach (var file in Directory.GetFiles("D:\\TMP", "*.*"))
    {
        using (var input = new StreamReader(file))
        {
            // Reads the whole file into memory, then writes it
            // (plus a trailing newline) to the output.
            output.WriteLine(input.ReadToEnd());
        }
    }
}
Please note that this reads each entire file into memory at once. This means that large files will cause a lot of memory to be used (and if not enough memory is available, it may fail altogether).
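If you want to stay with readers and writers but avoid loading whole files, one option is to stream line by line. A minimal sketch, assuming using System.IO (the paths and the *.txt pattern are placeholders; note this also normalizes line endings to the writer's default):

using (var output = new StreamWriter("D:\\combined.txt"))
{
    foreach (var file in Directory.GetFiles("D:\\TMP", "*.txt"))
    {
        using (var input = new StreamReader(file))
        {
            // Read one line at a time so memory use stays small
            // even for very large files.
            string line;
            while ((line = input.ReadLine()) != null)
            {
                output.WriteLine(line);
            }
        }
    }
}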
Darin is on the right track. My tweak would be:
using (var output = File.Create("output"))
{
    foreach (var file in new[] { "file1", "file2" })
    {
        using (var input = File.OpenRead(file))
        {
            // Stream.CopyTo (available from .NET 4.0) copies the stream
            // in buffered chunks, so the whole file is never held in memory.
            input.CopyTo(output);
        }
    }
}
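If you want a larger or smaller copy buffer without writing the loop yourself, CopyTo also has an overload that takes a buffer size (the 81920 below is just an illustrative value, not a recommendation):

input.CopyTo(output, 81920); // copy in 80 KB chunks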
Do it in chunks:
const int chunkSize = 2 * 1024; // 2KB buffer
var inputFiles = new[] { "file1.dat", "file2.dat", "file3.dat" };
using (var output = File.Create("output.dat"))
{
    foreach (var file in inputFiles)
    {
        using (var input = File.OpenRead(file))
        {
            // Copy the file chunkSize bytes at a time so memory use
            // stays constant regardless of file size.
            var buffer = new byte[chunkSize];
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, bytesRead);
            }
        }
    }
}
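For reuse, the same loop can be wrapped in a small helper method. A minimal sketch, assuming using System.IO (ConcatenateFiles is a hypothetical name, and the 2 KB chunk size is just the value from above):

static void ConcatenateFiles(string outputPath, params string[] inputPaths)
{
    const int chunkSize = 2 * 1024; // 2KB; tune for your workload
    using (var output = File.Create(outputPath))
    {
        foreach (var file in inputPaths)
        {
            using (var input = File.OpenRead(file))
            {
                var buffer = new byte[chunkSize];
                int bytesRead;
                while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                }
            }
        }
    }
}

// Usage:
ConcatenateFiles("output.dat", "file1.dat", "file2.dat", "file3.dat");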