I have a custom stream that is used to perform write operations directly into the cloud page blob.
public sealed class WindowsAzureCloudPageBlobStream : Stream
{
If you don't mind working from a file instead of a stream (or perhaps it has stream support and I don't know about it), have a look at the Azure Storage Data Movement Library. It's the best I've seen so far.
It's relatively new (at the time of writing) but has very good support for moving large files in chunks and maximizing throughput (I use it for nightly copying of SQL backups, many exceeding 1GB in size).
https://azure.microsoft.com/en-us/blog/announcing-azure-storage-data-movement-library-0-2-0/
Usage is quite easy. Here's an example:
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.DataMovement;

namespace BlobUploader
{
    public class Uploader
    {
        public string ConnectionString { get; set; }
        public string ContainerName { get; set; }
        public string BlobName { get; set; }

        public void UploadFile(string filePath)
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.GetContainerReference(ContainerName);
            blobContainer.CreateIfNotExists();

            CloudBlockBlob destinationBlob = blobContainer.GetBlockBlobReference(BlobName);

            // Let the library split the file into chunks and upload them in parallel.
            TransferManager.Configurations.ParallelOperations = 64;

            // Report progress as bytes are transferred.
            TransferContext context = new TransferContext();
            context.ProgressHandler = new Progress<TransferStatus>((progress) =>
            {
                Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
            });

            var task = TransferManager.UploadAsync(filePath, destinationBlob, null, context, CancellationToken.None);
            task.Wait();
        }
    }
}
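For completeness, calling it might look like the sketch below. The container name, blob name, file path, and the environment variable holding the connection string are placeholders for illustration, not anything the library requires:

// Hypothetical usage of the Uploader class above; names and paths are placeholders.
var uploader = new Uploader
{
    ConnectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING"),
    ContainerName = "backups",
    BlobName = "nightly.bak"
};
uploader.UploadFile(@"C:\backups\nightly.bak");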
The following preview blog post gives some information on how this came about and how it approaches things in general:
https://azure.microsoft.com/en-us/blog/introducing-azure-storage-data-movement-library-preview-2/