Cosmos DB Mongo API: How to manage the “Request Rate is Large” condition

星月不相逢 2021-02-14 12:26

I have the following code:

async function bulkInsert(db, collectionName, documents) {
  try {
    const cosmosResults = await db.collection(collectionName).insertMany(documents);
    return cosmosResults;
  } catch (e) { console.error(e); }
}

1 Answer
  • 2021-02-14 13:04

    Requests against Cosmos DB consume Request Units (RUs). Your insert request exceeded the provisioned RU throughput, so the operation failed with error code 16500.

    Applications that exceed the provisioned request units for a collection are throttled until the rate drops below the reserved level. When a throttle occurs, the backend preemptively ends the request with error code 16500 - Too Many Requests. By default, the API for MongoDB automatically retries up to 10 times before returning the Too Many Requests error to the client.

    You can find more details in the official documentation.

    You could try the following approaches to resolve the issue:

    1. Import your data in smaller batches to reduce the throughput consumed at any one time (see the sketch after this list).

    2. Add your own retry logic in your application (also covered in the sketch below).

    3. Increase the reserved throughput for the collection. Of course, this increases your cost.
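
    As a rough illustration of points 1 and 2, here is a minimal sketch (not taken from any official sample) that splits the documents into batches and retries a batch with exponential backoff whenever Cosmos DB answers with error code 16500. The batch size and delay values are placeholders to tune against your provisioned RUs:

        async function bulkInsertWithRetry(db, collectionName, documents, batchSize = 100) {
          const collection = db.collection(collectionName);
          for (let i = 0; i < documents.length; i += batchSize) {
            const batch = documents.slice(i, i + batchSize);
            for (let attempt = 0; ; attempt++) {
              try {
                await collection.insertMany(batch, { ordered: false });
                break; // batch accepted, move on to the next slice
              } catch (err) {
                // 16500 is the Too Many Requests code described above
                if (err.code !== 16500 || attempt >= 5) throw err;
                // simple exponential backoff before retrying the same batch
                await new Promise(resolve => setTimeout(resolve, 100 * 2 ** attempt));
              }
            }
          }
        }

    One caveat: if a throttled batch was partially written, re-inserting it can create duplicates, which is exactly the situation discussed in the update below.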

    You could refer to this article.

    Hope it helps you.


    Update:

    It looks like your documents are not uniquely identifiable, so the "_id" attribute that Cosmos DB generates automatically cannot be used to determine which documents have already been inserted and which have not.

    I suggest you increase the throughput settings, empty the database, and then bulk import the data.
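
    If your documents do carry some natural key, one alternative to emptying the database is to make the import idempotent. The sketch below assumes a hypothetical externalId field and derives "_id" from it, so a retried batch replaces documents instead of duplicating them:

        async function idempotentImport(db, collectionName, documents) {
          const operations = documents.map(doc => ({
            replaceOne: {
              filter: { _id: doc.externalId },              // deterministic _id
              replacement: { ...doc, _id: doc.externalId }, // hypothetical natural key
              upsert: true,
            },
          }));
          await db.collection(collectionName).bulkWrite(operations, { ordered: false });
        }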

    Considering the cost, please refer to this document for guidance on setting an appropriate RU value.
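
    For reference, with the API for MongoDB you can also change the provisioned throughput programmatically through Cosmos DB's custom commands. A minimal sketch (the RU value passed in is only an example; size it per the guidance above):

        async function setCollectionThroughput(db, collectionName, requestUnits) {
          // Cosmos DB extension command, not part of standard MongoDB
          await db.command({
            customAction: "UpdateCollection",
            collection: collectionName,
            offerThroughput: requestUnits, // e.g. 1000 RU/s
          });
        }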

    Or you could test the bulk import operation locally via the Cosmos DB Emulator.
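
    If you try the emulator, connecting looks roughly like the sketch below. It assumes the emulator was started with its MongoDB endpoint enabled and uses the emulator's fixed, publicly documented account key; double-check the connection string against the emulator documentation for your version:

        const { MongoClient } = require("mongodb");

        // Well-known emulator key and default Mongo endpoint port (10255);
        // both are assumptions to verify against your emulator setup.
        const emulatorUri =
          "mongodb://localhost:C2y6yDjf5%2FR%2Bob0N8A7Cgv30VRDJIWEHLM%2B4QDU5DD2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw%2FJw%3D%3D@localhost:10255/admin?ssl=true";

        async function testLocally(documents) {
          const client = await MongoClient.connect(emulatorUri);
          try {
            await bulkInsertWithRetry(client.db("test"), "testCollection", documents);
          } finally {
            await client.close();
          }
        }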
