Upload Big ZipArchive-MemoryStream to Azure


Question


TL;DR: Is it possible to upload a big MemoryStream to Azure in chunks, on the fly, while zipping?

I have files that get saved into a MemoryStream, and I add these files to a ZipArchive in another MemoryStream.
I want to upload that MemoryStream to Azure Block Blob Storage using

blockBlob.UploadFromStream(zipMemoryStream);

So far so good.
The problem now is that the zip archive might get bigger than 8 GB, which is a problem for the MemoryStream.
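A simplified sketch of this setup (files and blockBlob are placeholder names, not the actual variables from the project):

using System.IO;
using System.IO.Compression;

using (var zipMemoryStream = new MemoryStream())
{
    // leaveOpen: true so that disposing the archive flushes the zip directory
    // without closing the underlying MemoryStream.
    using (var archive = new ZipArchive(zipMemoryStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (var file in files)   // files: hypothetical (Name, Bytes) pairs
        {
            var entry = archive.CreateEntry(file.Name);
            using (var entryStream = entry.Open())
            {
                entryStream.Write(file.Bytes, 0, file.Bytes.Length);
            }
        }
    }

    zipMemoryStream.Position = 0;                 // rewind before uploading
    blockBlob.UploadFromStream(zipMemoryStream);  // the whole archive sits in memory here
}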

Is it possible to upload parts of the MemoryStream to Azure as chunks and remove those bytes from the stream as I go?

Or is there a better approach for dealing with ZipArchive and Azure?

For zipping I am using the ZipArchive class from the System.IO.Compression package.

Best Regards,
Flo


Answer 1:


It may not be exactly what you are looking for, but have you tried doing something like this:

var blob = container.GetBlockBlobReference("zipped.zip");

// OpenWrite returns a stream that uploads to blob storage as it is written,
// so the archive is never buffered in memory as a whole.
using (var zipArchive = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
{
    var entry = zipArchive.CreateEntry("entry1");
    using (var es = entry.Open())
    {
        // Fill entry with data
    }

    // Other code
}

When you call OpenWrite on a CloudBlockBlob, it creates a CloudBlobStream instance that works differently from a MemoryStream: CloudBlobStream sends data to the Azure Storage Service in 4 MB chunks and, as far as I remember, does not keep the already-uploaded chunks in memory.
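If you do want the explicit chunking the question asks about, the block blob API also exposes it directly. Below is a rough sketch, assuming the same SDK, a CloudBlockBlob named blob, and an arbitrary readable sourceStream (both placeholders): it uploads the stream block by block with PutBlock and then commits the list with PutBlockList. The 4 MB buffer and the block-ID scheme are illustrative choices, not requirements.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

var blockIds = new List<string>();
var buffer = new byte[4 * 1024 * 1024];   // 4 MB, matching CloudBlobStream's chunk size
int bytesRead;
int blockNumber = 0;

while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
{
    // Block IDs must be Base64-encoded and of equal length within one blob.
    var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockNumber.ToString("D6")));

    using (var chunk = new MemoryStream(buffer, 0, bytesRead, writable: false))
    {
        blob.PutBlock(blockId, chunk, null);   // upload one block
    }

    blockIds.Add(blockId);
    blockNumber++;
}

blob.PutBlockList(blockIds);   // commit the uploaded blocks as the blob's content

If you stay with OpenWrite instead, the chunk size CloudBlobStream uses can, as far as I know, be tuned through the blob's StreamWriteSizeInBytes property.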



Source: https://stackoverflow.com/questions/40302243/upload-big-ziparchive-memorystream-to-azure
