As @Gaurav mentions, unzipping is not natively supported. There was a feedback item to include this as a feature, but it was declined. I can think of a few alternatives that may be of interest.
1) Build an Azure Data Factory custom activity that does the unzipping. As files are uploaded to a temporary location, you can unzip them in a pipeline and write them to your application container. This requires a Batch service instance, but Data Factory takes care of all the orchestration and gives you a management facility to alert on failures, etc.
2) Move your blobs from Azure Blob Storage to Azure Data Lake Store using AdlCopy.exe. Once in Data Lake Store, you can build your own custom extractor to query the zip/gzip files. After another look through the documentation, it seems U-SQL may be able to do this natively: look for the section "Extraction from compressed data" under the EXTRACT expression.
3) Use PolyBase with SQL Data Warehouse, which can read zip/gzip files natively. This is the easiest but probably the most expensive option. See CREATE EXTERNAL TABLE and CREATE EXTERNAL FILE FORMAT.
4) And as @EvertonMc just mentioned, you could do it with an Azure Function on a blob trigger, which is also a good option.
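Whichever route you take (custom activity or Azure Function), the core decompression step looks the same. Here's a minimal sketch in plain Python using only the standard library, assuming the archives are regular zip files; the actual Blob Storage download/upload calls are omitted, and the function name `unzip_blob_bytes` is just an illustrative placeholder:

```python
import io
import zipfile


def unzip_blob_bytes(zip_bytes):
    """Unpack an in-memory zip archive (e.g. bytes downloaded from a
    source blob) and return a dict of {member_name: file_bytes}.

    In a real custom activity or Azure Function you would first read
    zip_bytes from the trigger/source container, then upload each
    returned file to your application container.
    """
    extracted = {}
    # io.BytesIO lets ZipFile read the archive without touching disk.
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():
            extracted[name] = archive.read(name)
    return extracted
```

For gzip (single-file) blobs you would use `gzip.decompress(blob_bytes)` instead, since gzip has no archive/member structure.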
Good luck and let us know how you get on.