How to track progress of an async file upload to Azure Storage

我寻月下人不归 · asked 2020-12-03 01:56

Is there a way to track the progress of a file upload to an Azure storage container? I am trying to make a console application for uploading data to Azure using C#.


4 Answers
  • 2020-12-03 02:39

    The code below uses the Azure Blob Storage SDK v12. Reference: https://www.craftedforeveryone.com/upload-or-download-file-from-azure-blob-storage-with-progress-percentage-csharp/

    //Note: connectionString and containerName are assumed to be class-level fields holding your storage settings
    private long uploadFileSize; //Set when the upload starts; read by the progress callback

    public void UploadBlob(string fileToUploadPath)
    {
        var file = new FileInfo(fileToUploadPath);
        uploadFileSize = file.Length; //Get the file size. This is needed to calculate the upload progress

        //Initialize a progress handler. While the file is being uploaded, the Blob Storage service
        //publishes the number of bytes transferred so far back to us through this handler.
        var progressHandler = new Progress<long>();
        progressHandler.ProgressChanged += UploadProgressChanged;

        var blob = new BlobClient(connectionString, containerName, file.Name); //Initialize the blob client
        blob.Upload(fileToUploadPath, progressHandler: progressHandler); //Make sure to pass the progress handler here
    }

    private void UploadProgressChanged(object sender, long bytesUploaded)
    {
        //Calculate the progress and update the progress display.
        //Note: the uploaded byte count is a long; it is implicitly converted to double
        //in the call below so the division does not truncate.
        Console.WriteLine(GetProgressPercentage(uploadFileSize, bytesUploaded));
    }

    private double GetProgressPercentage(double totalSize, double currentSize)
    {
        return (currentSize / totalSize) * 100;
    }
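
    If you need the asynchronous variant, the same handler can be passed through
    BlobUploadOptions.ProgressHandler. Below is a minimal sketch, assuming the v12
    Azure.Storage.Blobs package; the connection string, container name, and file path
    parameters are placeholders you would supply yourself:

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    public static class AsyncUploadExample
    {
        public static async Task UploadWithProgressAsync(string connectionString, string containerName, string filePath)
        {
            var fileSize = new FileInfo(filePath).Length;
            var blob = new BlobClient(connectionString, containerName, Path.GetFileName(filePath));

            //Progress<long> receives the total number of bytes transferred so far
            var progress = new Progress<long>(bytesTransferred =>
                Console.WriteLine($"Uploaded {(double)bytesTransferred / fileSize * 100:F1}%"));

            await blob.UploadAsync(filePath, new BlobUploadOptions { ProgressHandler = progress });
        }
    }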
    
  • 2020-12-03 02:44

    How about this: wrap the file in a FileStream subclass that reports its position back through a callback as the SDK reads from it.

    public class ObservableFileStream : FileStream
    {
        private readonly Action<long> _callback;

        public ObservableFileStream(String fileName, FileMode mode, Action<long> callback) : base(fileName, mode)
        {
            _callback = callback;
        }

        public override void Write(byte[] array, int offset, int count)
        {
            base.Write(array, offset, count);
            _callback?.Invoke(Position); //Report how far into the stream has been written
        }

        public override int Read(byte[] array, int offset, int count)
        {
            int read = base.Read(array, offset, count);
            _callback?.Invoke(Position); //Report how far into the stream the uploader has read
            return read;
        }
    }
    
    public class Test
    {
        private async Task Upload(String filePath, CloudBlockBlob blob)
        {
            ObservableFileStream fs = null;

            using (fs = new ObservableFileStream(filePath, FileMode.Open, (current) =>
            {
                Console.WriteLine("Uploading " + ((double)current / (double)fs.Length) * 100d);
            }))
            {
                await blob.UploadFromStreamAsync(fs);
            }
        }
    }
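
    Note: depending on the SDK and framework version, UploadFromStreamAsync may read the
    stream through ReadAsync rather than the synchronous Read override, in which case the
    callback above never fires. If you hit that, adding an async override to
    ObservableFileStream should cover that path too (my own addition, needs
    System.Threading and System.Threading.Tasks):

    public override async Task<int> ReadAsync(byte[] array, int offset, int count, CancellationToken cancellationToken)
    {
        int read = await base.ReadAsync(array, offset, count, cancellationToken);
        _callback?.Invoke(Position); //Report progress after the asynchronous read completes
        return read;
    }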
    
  • 2020-12-03 02:46

    I don't think the single-call upload can report progress out of the box: even though the file is internally split into multiple chunks and those chunks are uploaded one by one, your code simply waits for the whole task to finish.

    One possibility would be to manually split the file into chunks and upload those chunks asynchronously using the PutBlockAsync method. Once all chunks are uploaded, you can then call the PutBlockListAsync method to commit the blob. Please see the code below, which accomplishes that:

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;
    
    namespace ConsoleApplication1
    {
        class Program
        {
            static CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
            static void Main(string[] args)
            {
                CloudBlobClient myBlobClient = storageAccount.CreateCloudBlobClient();
                myBlobClient.SingleBlobUploadThresholdInBytes = 1024 * 1024;
                CloudBlobContainer container = myBlobClient.GetContainerReference("adokontajnerneki");
                //container.CreateIfNotExists();
                CloudBlockBlob myBlob = container.GetBlockBlobReference("cfx.zip");
                var blockSize = 256 * 1024;
                myBlob.StreamWriteSizeInBytes = blockSize;
                var fileName = @"D:\cfx.zip";
                long bytesToUpload = (new FileInfo(fileName)).Length;
                long fileSize = bytesToUpload;
    
                if (bytesToUpload < blockSize)
                {
                    CancellationToken ca = new CancellationToken();
                    var ado = myBlob.UploadFromFileAsync(fileName, FileMode.Open, ca);
                    Console.WriteLine(ado.Status); //Does Not Help Much
                    ado.ContinueWith(t =>
                    {
                        Console.WriteLine("Status = " + t.Status);
                        Console.WriteLine("It is over"); //this is working OK
                    });
                }
                else
                {
                    List<string> blockIds = new List<string>();
                    int index = 1;
                    long startPosition = 0;
                    long bytesUploaded = 0;
                    do
                    {
                        var bytesToRead = Math.Min(blockSize, bytesToUpload);
                        var blobContents = new byte[bytesToRead];
                        using (FileStream fs = new FileStream(fileName, FileMode.Open))
                        {
                            fs.Position = startPosition;
                            fs.Read(blobContents, 0, (int)bytesToRead);
                        }
                        ManualResetEvent mre = new ManualResetEvent(false);
                        //Block IDs must be Base64-encoded strings of equal length for every block
                        var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
                        Console.WriteLine("Now uploading block # " + index.ToString("d6"));
                        blockIds.Add(blockId);
                        var ado = myBlob.PutBlockAsync(blockId, new MemoryStream(blobContents), null);
                        ado.ContinueWith(t =>
                        {
                            //Update the counters and report progress once this block has been uploaded
                            bytesUploaded += bytesToRead;
                            bytesToUpload -= bytesToRead;
                            startPosition += bytesToRead;
                            index++;
                            double percentComplete = (double)bytesUploaded / (double)fileSize;
                            Console.WriteLine("Percent complete = " + percentComplete.ToString("P"));
                            mre.Set();
                        });
                        mre.WaitOne(); //Wait for the continuation so the next chunk reads from the right position
                    }
                    while (bytesToUpload > 0);
                    Console.WriteLine("Now committing block list");
                    var pbl = myBlob.PutBlockListAsync(blockIds);
                    pbl.ContinueWith(t =>
                    {
                        Console.WriteLine("Blob uploaded completely.");
                    });
                }
                Console.ReadKey();
            }
        }
    }
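
    For anyone on the current SDK: the same block-by-block pattern maps to
    BlockBlobClient.StageBlockAsync and CommitBlockListAsync in Azure.Storage.Blobs v12.
    A minimal sketch under that assumption (connection string, container name, and file
    path are placeholders; the block size is illustrative):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs.Specialized;

    public static class ChunkedUploadV12
    {
        public static async Task UploadInBlocksAsync(string connectionString, string containerName, string filePath)
        {
            const int blockSize = 256 * 1024;
            var blob = new BlockBlobClient(connectionString, containerName, Path.GetFileName(filePath));
            var blockIds = new List<string>();
            long fileSize = new FileInfo(filePath).Length;
            long bytesUploaded = 0;
            int index = 0;

            using (var fs = File.OpenRead(filePath))
            {
                var buffer = new byte[blockSize];
                int read;
                while ((read = await fs.ReadAsync(buffer, 0, blockSize)) > 0)
                {
                    //Stage one block; block IDs must be Base64 strings of equal length
                    var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
                    blockIds.Add(blockId);
                    using (var blockData = new MemoryStream(buffer, 0, read, writable: false))
                    {
                        await blob.StageBlockAsync(blockId, blockData);
                    }

                    bytesUploaded += read;
                    index++;
                    Console.WriteLine("Percent complete = " + ((double)bytesUploaded / fileSize).ToString("P"));
                }
            }

            //Commit the staged blocks in order to create the final blob
            await blob.CommitBlockListAsync(blockIds);
            Console.WriteLine("Blob uploaded completely.");
        }
    }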
    
  • 2020-12-03 02:47

    Gaurav's solution works well and is very similar to http://blogs.msdn.com/b/kwill/archive/2011/05/30/asynchronous-parallel-block-blob-transfers-with-progress-change-notification.aspx. The challenge with this code is that you are doing a lot of complex work with very little error handling. I am not saying there is anything wrong with Gaurav's code - it looks solid - but especially with network-related communication code there are a lot of variables and a lot of failure modes that you have to account for.

    For this reason I modified my original blog post to use the upload code from the storage client library (on the assumption that the code coming from the Azure Storage team is more robust than anything I could write) and to track progress using a ProgressStream class. You can see the updated code at http://blogs.msdn.com/b/kwill/archive/2013/03/06/asynchronous-parallel-block-blob-transfers-with-progress-change-notification-2-0.aspx.
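
    The ProgressStream approach mentioned there is essentially a pass-through stream that reports
    how many bytes the uploader has pulled through it. A conceptual sketch of that idea (my own
    illustration, not the code from the linked post; needs System and System.IO):

    public class ProgressStream : Stream
    {
        private readonly Stream _inner;
        private readonly IProgress<long> _progress;
        private long _totalRead;

        public ProgressStream(Stream inner, IProgress<long> progress)
        {
            _inner = inner;
            _progress = progress;
        }

        public override int Read(byte[] buffer, int offset, int count)
        {
            int read = _inner.Read(buffer, offset, count);
            _totalRead += read;
            _progress.Report(_totalRead); //Total bytes handed to the uploader so far
            return read;
        }

        //The remaining members simply delegate to the wrapped stream
        public override bool CanRead => _inner.CanRead;
        public override bool CanSeek => _inner.CanSeek;
        public override bool CanWrite => _inner.CanWrite;
        public override long Length => _inner.Length;
        public override long Position { get => _inner.Position; set => _inner.Position = value; }
        public override void Flush() => _inner.Flush();
        public override long Seek(long offset, SeekOrigin origin) => _inner.Seek(offset, origin);
        public override void SetLength(long value) => _inner.SetLength(value);
        public override void Write(byte[] buffer, int offset, int count) => _inner.Write(buffer, offset, count);
    }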
