Painfully slow Azure table insert and delete batch operations

隐瞒了意图╮ 2021-01-30 05:12

I am running into a huge performance bottleneck when using Azure table storage. My desire is to use tables as a sort of cache, so a long process may result in anywhere from hund…

4 Answers
  •  北海茫月
    2021-01-30 05:40

    After a lot of pain and experimentation, I was finally able to get optimal throughput for a single table partition (2,000+ batch write operations per second) and much better throughput across the storage account (3,500+ batch write operations per second) with Azure Table storage. I tried many different approaches, but setting the .NET connection limit programmatically (the configuration sample didn't work for me) solved the problem, based on a white paper provided by Microsoft, as shown below:

    // By default, .NET allows only 2 concurrent HTTP connections per endpoint.
    // This is a notorious issue that has affected many developers: with more than
    // 2 concurrent requests it manifests as "The underlying connection was
    // closed..." errors and throttled throughput.
    ServicePoint tableServicePoint = ServicePointManager
        .FindServicePoint(_StorageAccount.TableEndpoint);

    tableServicePoint.ConnectionLimit = 1000;
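
    Building on the snippet above, here is a minimal sketch (not part of the original answer) of how the connection-limit tweak can be combined with parallel batch writes using the legacy WindowsAzure.Storage SDK. The CacheEntity type, the "cache" table name, and the connection string are hypothetical placeholders; note that the SDK caps an entity group transaction at 100 entities and requires every entity in a batch to share the same PartitionKey.

    using System.Collections.Generic;
    using System.Linq;
    using System.Net;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    // Hypothetical entity; any ITableEntity with PartitionKey/RowKey works.
    public class CacheEntity : TableEntity
    {
        public CacheEntity() { }
        public CacheEntity(string partitionKey, string rowKey)
            : base(partitionKey, rowKey) { }
        public string Payload { get; set; }
    }

    public static class BatchWriter
    {
        public static async Task InsertAllAsync(
            string connectionString, IEnumerable<CacheEntity> entities)
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);

            // Same tuning as above: raise the per-endpoint connection limit so
            // many batch requests can be in flight at the same time.
            ServicePoint tableServicePoint =
                ServicePointManager.FindServicePoint(account.TableEndpoint);
            tableServicePoint.ConnectionLimit = 1000;

            CloudTable table = account.CreateCloudTableClient().GetTableReference("cache");
            await table.CreateIfNotExistsAsync();

            // Group by partition key, then split each group into chunks of 100
            // (the maximum size of a single entity group transaction).
            var chunks = entities
                .GroupBy(e => e.PartitionKey)
                .SelectMany(g => g
                    .Select((e, i) => new { e, i })
                    .GroupBy(x => x.i / 100, x => x.e));

            var tasks = new List<Task>();
            foreach (var chunk in chunks)
            {
                var batch = new TableBatchOperation();
                foreach (var entity in chunk)
                    batch.InsertOrReplace(entity);
                tasks.Add(table.ExecuteBatchAsync(batch));
            }

            // Run all batches concurrently; the raised connection limit is what
            // lets these requests actually execute in parallel.
            await Task.WhenAll(tasks);
        }
    }

    Whether this reaches the throughput quoted above depends on partition design, entity size, and how many batches are kept in flight, so treat it as a starting point rather than a benchmark.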
    

    If anyone else has reached 20K+ batch write operations per storage account, please share your experience.
