Question
This question is a follow-up to Efficient way to transfer many binary files into SQL Server database.
I originally asked why using File.ReadAllBytes was causing rapid memory use, and the conclusion was that the method puts the data on the large object heap, which cannot easily be reclaimed at run time.
My question now is: how do I avoid that situation? The following code was intended to get around the problem by reading the file in chunks instead of all at once, but it seems to have the same problem.
using (var fs = new FileStream(path, FileMode.Open))
{
    using (var ms = new MemoryStream())
    {
        byte[] buffer = new byte[2048];
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, bytesRead);
        }
        return new CustomFile { FileValue = ms.ToArray() };
    }
}
Answer 1:
The memory stream holds an internal array of the whole data (which you return in the end). It doesn't matter that you read in chunks of 2048 bytes as long as you keep concatenating the chunks into the memory stream. If you need to return the data as a single array containing the entire file, then you will often end up creating that array on the large object heap.
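To make the threshold concrete, here is a minimal sketch (not from the original answer) showing that any byte[] of roughly 85,000 bytes or more is allocated directly on the large object heap; the runtime reports such arrays as generation 2:

using System;

class LohDemo
{
    static void Main()
    {
        byte[] small = new byte[1024];             // normal small-object heap allocation
        byte[] large = new byte[10 * 1024 * 1024]; // at/above ~85,000 bytes: allocated on the LOH

        Console.WriteLine(GC.GetGeneration(small)); // typically prints 0
        Console.WriteLine(GC.GetGeneration(large)); // prints 2: LOH objects are treated as generation 2
    }
}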
If the destination (a BLOB field or similar) does not allow you to pass the data in any form other than a single byte array, then you can't get around allocating a byte array that holds all the data.
The best way to transfer the data is, of course, when the destination also supports stream semantics:
int Transfer(Stream source, Stream target)
{
    const int BufSize = 8192;
    byte[] buffer = new byte[BufSize];
    int bytesRead;
    int totalBytesTransferred = 0;
    while ((bytesRead = source.Read(buffer, 0, BufSize)) > 0)
    {
        target.Write(buffer, 0, bytesRead);
        totalBytesTransferred += bytesRead;
    }
    return totalBytesTransferred;
}
Whether this is possible depends on whether the target (a database BLOB, for example) supports opening a stream to it.
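For the SQL Server case from the original question, streaming to the target is possible: since .NET Framework 4.5, SqlClient lets a SqlParameter bound to a varbinary(max) column take a Stream as its value, so the file is pushed to the server in chunks instead of as one large byte array. A minimal sketch, assuming a hypothetical Files table with a Data varbinary(max) column:

using System.Data;
using System.Data.SqlClient;
using System.IO;

static void InsertFile(string connectionString, string path)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("INSERT INTO Files (Data) VALUES (@data)", conn)) // hypothetical table/column
    using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
    {
        conn.Open();
        // Passing the stream itself lets ADO.NET read and send the data in chunks,
        // avoiding a single byte[] for the whole file.
        cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = fs;
        cmd.ExecuteNonQuery();
    }
}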
Source: https://stackoverflow.com/questions/15781521/avoiding-the-loh-when-reading-a-binary