Question
TL;DR: Is it possible to upload a big MemoryStream to Azure as chunks on the fly while zipping?
I have files which get saved into a MemoryStream. I add these files to a ZipArchive backed by another MemoryStream, and that MemoryStream is what I want to upload to Azure Block Blob Storage using

blockBlob.UploadFromStream(zipMemoryStream);
So far so good.
The problem now is that the zip archive might get bigger than 8 GB, which is a problem with the MemoryStream.
Is it possible to upload parts of the memory stream to Azure as chunks and remove those bytes from the stream?
Or is there a better approach for dealing with ZipArchive and Azure?
For zipping I am using the ZipArchive class from the System.IO.Compression package.
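For context, a minimal sketch of the current approach (fileMemoryStream, the entry name, and the blockBlob reference are placeholders for what is described above):

// The whole archive is buffered in zipMemoryStream, which becomes a
// problem once it grows to several GB.
using (var zipMemoryStream = new MemoryStream())
{
    using (var zip = new ZipArchive(zipMemoryStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        var entry = zip.CreateEntry("file1.txt");
        using (var entryStream = entry.Open())
        {
            fileMemoryStream.Position = 0;
            fileMemoryStream.CopyTo(entryStream);
        }
    }

    zipMemoryStream.Position = 0;
    blockBlob.UploadFromStream(zipMemoryStream);
}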
Best Regards,
Flo
Answer 1:
It may not be exactly what you are looking for, but did you try something like this:
var blob = container.GetBlockBlobReference("zipped.zip");

// The ZipArchive writes directly into the blob's write stream, so entries
// are compressed and uploaded as they are created.
using (var stream = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
{
    var entry = stream.CreateEntry("entry1");
    using (var es = entry.Open())
    {
        // Fill entry with data
    }

    // Other code
}
When you call OpenWrite on CloudBlockBlob, it creates an instance of CloudBlobStream, which works differently from MemoryStream: CloudBlobStream sends data to the Azure Storage service in 4 MB chunks, and as far as I remember it doesn't keep the already-uploaded chunks in memory.
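Applied to the question's scenario, a rough sketch of that approach (sourceFiles, a name-to-MemoryStream dictionary, is a placeholder for however the per-file streams are produced; not tested against a real storage account):

var blob = container.GetBlockBlobReference("zipped.zip");

using (var zip = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
{
    // sourceFiles: IDictionary<string, MemoryStream>, placeholder name
    foreach (var sourceFile in sourceFiles)
    {
        var entry = zip.CreateEntry(sourceFile.Key);
        using (var entryStream = entry.Open())
        {
            sourceFile.Value.Position = 0;
            // The compressed bytes flow straight into the blob's write stream,
            // so the full archive never has to fit in memory at once.
            sourceFile.Value.CopyTo(entryStream);
        }
    }
}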
Source: https://stackoverflow.com/questions/40302243/upload-big-ziparchive-memorystream-to-azure