I have some experience with S3, and in the past have used s3-parallel-put to put many (millions of) small files there. Compared to Azure, S3 has an expensive PUT price.
https://github.com/Azure/azure-sdk-tools-xplat is the source of azure-cli, so more details can be found there. You may want to open issues on that repo :)
To upload files in bulk to blob storage, there is a tool provided by Microsoft: check out Storage Explorer, which lets you do the required task.
If you prefer the command line and have a recent Python interpreter, the Azure Batch and HPC team has released a code sample with some AzCopy-like functionality, written in Python, called blobxfer. It allows full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [full disclosure: I'm a contributor to this code]
To answer your questions: