How do I pass a stream from Web API to Azure Blob Storage without temp files?

萌比男神i 2021-02-03 12:32

I am working on an application where file uploads happen often and can be quite large.

Those files are being uploaded to a Web API endpoint, which then gets the stream and needs to pass it on to Azure Blob Storage, without first buffering the whole file in memory or writing it to a temp file.

2 Answers
  • 2021-02-03 13:13

    I think a better approach is for you to go directly to Azure Blob Storage from your client. By leveraging the CORS support in Azure Storage, you eliminate load on your Web API server, which gives your application better overall scalability.

    Basically, you create a Shared Access Signature (SAS) URL that your client can use to upload the file directly to Azure Storage. For security reasons, it is recommended that you limit the time period for which the SAS is valid. Best practices guidance for generating the SAS URL is available here.

    For your specific scenario, check out this blog post from the Azure Storage team where they discuss using CORS and SAS for this exact scenario. There is also a sample application, so this should give you everything you need.
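
    To make the idea concrete, here is a minimal sketch of issuing a short-lived, write-only SAS with the Azure.Storage.Blobs v12 SDK. The SDK version is an assumption, and the account name, key, container, and blob name are all placeholders; the blog's sample may do this differently.

    // Minimal sketch: issue a short-lived, write-only SAS for a single blob.
    using System;
    using Azure.Storage;
    using Azure.Storage.Sas;

    var credential = new StorageSharedKeyCredential("myaccount", "<account-key>"); // placeholders

    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = "uploads",   // placeholder container
        BlobName = "myfile.bin",         // placeholder blob name
        Resource = "b",                  // "b" = SAS scoped to a single blob
        ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15) // keep the validity window short
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Create | BlobSasPermissions.Write);

    var sasToken = sasBuilder.ToSasQueryParameters(credential).ToString();
    var sasUrl = $"https://myaccount.blob.core.windows.net/uploads/myfile.bin?{sasToken}";

    // Hand sasUrl to the client; it can now PUT the file bytes directly to Blob Storage.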

  • 2021-02-03 13:19

    Solved it, with the help of this Gist.
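
    In case the link dies, the provider boils down to something like the following minimal sketch. This is a paraphrase rather than the Gist verbatim; it assumes the classic Microsoft.WindowsAzure.Storage SDK, and the connection string and container name are placeholders.

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using Microsoft.WindowsAzure.Storage;

    public class BlobStorageMultipartStreamProvider : MultipartStreamProvider
    {
        public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
        {
            // Name the blob after the uploaded file, falling back to a GUID.
            var fileName = headers.ContentDisposition != null
                ? headers.ContentDisposition.FileName.Trim('"')
                : Guid.NewGuid().ToString();

            var account = CloudStorageAccount.Parse("<connection-string>"); // placeholder
            var container = account.CreateCloudBlobClient().GetContainerReference("uploads"); // placeholder
            var blob = container.GetBlockBlobReference(fileName);

            // OpenWrite returns a stream that pushes blocks to Azure as it is written to,
            // so the multipart body is never buffered in memory or on disk.
            return blob.OpenWrite();
        }
    }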

    Here's how I am using it, along with a clever "hack" to get the actual file size without copying the file into memory first. Oh, and it's twice as fast (obviously, since the bytes are streamed once instead of being buffered and then uploaded).

    // Create an instance of our provider.
    // See https://gist.github.com/JamesRandall/11088079#file-blobstoragemultipartstreamprovider-cs for implementation.
    var provider = new BlobStorageMultipartStreamProvider();
    
    // This is where the uploading is happening, by writing to the Azure stream
    // as the file stream from the request is being read, leaving almost no memory footprint.
    await this.Request.Content.ReadAsMultipartAsync(provider);
    
    // We want to know the exact size of the file, but this info is not available to us before
    // we've uploaded everything - which has just happened.
    // We get the stream from the content (and that stream is the same instance we wrote to).
    var stream = await provider.Contents.First().ReadAsStreamAsync();
    
    // Problem: If you try to use stream.Length, you'll get an exception, because BlobWriteStream
    // does not support it.
    
    // But this is where we get fancy.
    
    // Position == size, because the file has just been written to it, leaving the
    // position at the end of the file.
    var sizeInBytes = stream.Position;
    

    Voilà, you have your uploaded file's size, without having to copy the file into your web instance's memory.

    As for getting the file length before the file is uploaded, that's not as easy, and I had to resort to some rather unpleasant methods to get even an approximation.

    In the BlobStorageMultipartStreamProvider:

    var approxSize = parent.Headers.ContentLength.Value - parent.Headers.ToString().Length;
    

    This gives me a pretty close file size, off by a few hundred bytes (depending on the HTTP headers, I guess). That's good enough for me, as my quota enforcement can tolerate a few bytes being shaved off.
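
    For placement, here is a hypothetical sketch of how that line could sit inside the provider's GetStream override from the sketch above; MaxQuotaBytes and OpenBlobWriteStream are placeholders, not part of the Gist.

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        // parent.Headers.ContentLength covers the entire multipart body, including
        // part headers and boundary markers, so this is only an approximation.
        var approxSize = parent.Headers.ContentLength.Value - parent.Headers.ToString().Length;

        if (approxSize > MaxQuotaBytes) // placeholder for your quota limit
            throw new InvalidOperationException("Upload exceeds quota.");

        return OpenBlobWriteStream(headers); // placeholder for the blob-stream creation shown earlier
    }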

    Just to show off, here's the memory footprint, as reported by the insanely accurate and advanced Performance tab in Task Manager.

    Before (using MemoryStream, reading the file into memory before uploading): [screenshot]

    After (writing directly to Blob Storage): [screenshot]
