Internally, ASP.NET has a 2 GB address space, but in reality you have less than 1 GB free for uploads (see http://support.microsoft.com/?id=295626 ). In addition, IIS enforces its own request limits no matter what code pattern you use. If you write server-side code, the file first travels through your web role, and then you run into pains such as role recycling and retrying failed uploads. I removed these issues with a client-side Silverlight control that not only did fault-tolerant uploads but also did them at great speed. You can download my sample and read how I built it from: Pick Your Azure File Upload Control: Silverlight and TPL or HTML5 and AJAX
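For reference, both request-size limits mentioned in this thread live in web.config, and they use different units, which is a common source of confusion. A hedged sketch (the 100 MB values are illustrative, not a recommendation):

```xml
<!-- web.config fragment (illustrative values, both set to 100 MB) -->
<configuration>
  <system.web>
    <!-- ASP.NET limit: maxRequestLength is in KILOBYTES (102400 KB = 100 MB) -->
    <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS limit: maxAllowedContentLength is in BYTES (104857600 = 100 MB) -->
        <requestLimits maxAllowedContentLength="104857600" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

A request must pass both limits; raising only one of them still rejects large uploads.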
We can upload very large files into Azure storage using parallel upload. That means we split the large file into small packets and upload those packets in parallel. Once the upload completes, we commit the packets so they are joined back into the original file (Azure block blobs support this via the Put Block and Put Block List operations). For complete code please refer to the following link: http://tuvian.wordpress.com/2011/06/28/how-to-upload-large-size-fileblob-to-azure-storage-using-asp-netc/
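The split / parallel-upload / rejoin flow described above can be sketched language-agnostically. The snippet below is an illustrative Python sketch, not the linked article's code: `upload_packet` is a stand-in for a real Put Block call, and the final join stands in for Put Block List; in a real client you would retry individual failed packets rather than the whole file.

```python
import concurrent.futures

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB packets (illustrative packet size)

def split_into_packets(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Yield (index, packet) pairs so each packet can be uploaded independently."""
    for offset in range(0, len(data), chunk_size):
        yield offset // chunk_size, data[offset:offset + chunk_size]

def upload_packet(store: dict, block_id: int, packet: bytes) -> int:
    """Stand-in for a real 'Put Block' call: records the packet under its id.
    A real implementation would PUT the bytes and retry on transient failure."""
    store[block_id] = packet
    return block_id

def parallel_upload(data: bytes, chunk_size: int = CHUNK_SIZE,
                    max_workers: int = 8) -> bytes:
    """Upload packets concurrently, then commit them in order ('Put Block List')."""
    store: dict = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(upload_packet, store, i, packet)
                   for i, packet in split_into_packets(data, chunk_size)]
        for future in concurrent.futures.as_completed(futures):
            future.result()  # surfaces any per-packet failure for retry logic
    # Commit step: join the packets in block-id order to rebuild the original
    return b"".join(store[i] for i in sorted(store))
```

The key point is that only the commit step cares about order; the uploads themselves can complete in any order, which is what makes the parallelism (and per-packet retry) possible.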
I actually did the exact same thing recently: I created a Silverlight client app to handle chopping up the data and sending it to Azure. This is a working example that I followed that does exactly that; follow it and most of the work is already done for you.
For this part of the question:
appcmd set config "My Site/MyApp" -section:requestFiltering -requestLimits.maxAllowedContentLength:104857600 -commit:apphost
on the server to go beyond this 30 MB limit. But how can I run this on my Azure servers?
You can do this using startup tasks - see http://richardprodger.wordpress.com/2011/03/22/azure-iis7-configuration-with-appcmd/
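Concretely, the approach in the linked post boils down to two pieces: a small batch file deployed with your role, and a startup task registered in ServiceDefinition.csdef that runs it elevated when the instance boots. A hedged sketch (file and site names are illustrative):

```shell
REM startup.cmd - deployed in the web role project; runs once at instance start
%windir%\system32\inetsrv\appcmd set config "My Site/MyApp" -section:requestFiltering -requestLimits.maxAllowedContentLength:104857600 -commit:apphost
exit /b 0
```

Register it inside the <WebRole> element of ServiceDefinition.csdef, e.g. <Startup><Task commandLine="startup.cmd" executionContext="elevated" taskType="simple" /></Startup>, so the appcmd call runs with administrator rights before the role starts accepting traffic.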