How to limit the number of async IO tasks to a database?

Submitted by 不羁的心 on 2020-06-11 12:53:40

Question


I have a list of ids, and I want to get the data for each of those ids from the database in parallel. My ExecuteAsync method below is called at very high throughput, and each request carries around 500 ids for which I need to extract data.

So I have the code below, where I loop over the list of ids and make an async call for each id in parallel, and it works fine.

private async Task<List<T>> ExecuteAsync<T>(IList<int> ids, IPollyPolicy policy,
    Func<CancellationToken, int, Task<T>> mapper) where T : class
{
    var tasks = new List<Task<T>>(ids.Count);
    // invoking multiple id in parallel to get data for each id from database
    for (int i = 0; i < ids.Count; i++)
    {
        var id = ids[i]; // copy into a local so the closure doesn't capture the loop index
        tasks.Add(Execute(policy, ct => mapper(ct, id)));
    }

    // wait for all id response to come back
    var responses = await Task.WhenAll(tasks);

    var excludeNull = new List<T>(ids.Count);
    for (int i = 0; i < responses.Length; i++)
    {
        var response = responses[i];
        if (response != null)
        {
            excludeNull.Add(response);
        }
    }
    return excludeNull;
}

private async Task<T> Execute<T>(IPollyPolicy policy,
    Func<CancellationToken, Task<T>> requestExecuter) where T : class
{
    var response = await policy.Policy.ExecuteAndCaptureAsync(
        ct => requestExecuter(ct), CancellationToken.None);
    if (response.Outcome == OutcomeType.Failure)
    {
        if (response.FinalException != null)
        {
            // log error
            throw response.FinalException;
        }
    }

    return response?.Result;
}

Question:

Now, as you can see, I am looping over all the ids and making a bunch of async calls to the database in parallel, one per id, which can put a lot of load on the database (depending on how many requests are coming in). So I want to limit the number of async calls we make to the database. I modified ExecuteAsync to use SemaphoreSlim as shown below, but it doesn't look like it does what I want it to do:

private async Task<List<T>> ExecuteAsync<T>(IList<int> ids, IPollyPolicy policy,
    Func<CancellationToken, int, Task<T>> mapper) where T : class
{
    var throttler = new SemaphoreSlim(250);
    var tasks = new List<Task<T>>(ids.Count);
    // invoking multiple id in parallel to get data for each id from database
    for (int i = 0; i < ids.Count; i++)
    {
        var id = ids[i]; // copy into a local so the closure doesn't capture the loop index
        await throttler.WaitAsync().ConfigureAwait(false);
        try
        {
            tasks.Add(Execute(policy, ct => mapper(ct, id)));
        }
        finally
        {
            throttler.Release();
        }
    }

    // wait for all id response to come back
    var responses = await Task.WhenAll(tasks);

    // same excludeNull code check here

    return excludeNull;
}

Does Semaphore work on threads or tasks? From what I've read, it looks like Semaphore is for threads and SemaphoreSlim is for tasks.

Is this correct? If so, what's the best way to limit the number of async IO calls we make to the database here?


Answer 1:


"Task is an abstraction over threads, and doesn't necessarily create a new thread. Semaphore limits the number of threads that can access that for loop. Execute returns a Task, which isn't a thread. If there's only 1 request, there will be only 1 thread inside that for loop, even if it is asking for 500 ids. That 1 thread sends off all the async IO tasks itself."

Sort of. I would not say that tasks are related to threads at all. There are actually two kinds of tasks: delegate tasks (which are kind of an abstraction of a thread) and promise tasks (which have nothing to do with threads).
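The two kinds of tasks can be illustrated with a short sketch (the class and variable names are just for illustration): a delegate task wraps code that runs on a thread-pool thread, while a promise task merely represents a future result that something else completes later.

```csharp
using System;
using System.Threading.Tasks;

class TaskKindsDemo
{
    public static async Task Main()
    {
        // Delegate task: wraps code that executes on a thread pool thread.
        Task<int> delegateTask = Task.Run(() => 21 + 21);

        // Promise task: represents a future result; no thread runs "inside" it.
        var tcs = new TaskCompletionSource<int>();
        Task<int> promiseTask = tcs.Task;

        // Something later completes the promise, e.g. an I/O completion callback.
        tcs.SetResult(42);

        Console.WriteLine(await delegateTask); // 42
        Console.WriteLine(await promiseTask);  // 42
    }
}
```

Async database calls produce promise tasks: while the query is in flight, no thread is consumed waiting for it.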

Regarding the SemaphoreSlim, it does limit the concurrency of a block of code (not threads).
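A minimal sketch of that point (illustrative names; the delay stands in for a database call): twenty concurrent async workflows gated by a SemaphoreSlim of 3, with a counter recording how many are ever inside the gated block at once.

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ThrottleDemo
{
    static int _current, _maxObserved;

    public static async Task Main()
    {
        var throttler = new SemaphoreSlim(3); // at most 3 workflows inside the block

        var tasks = Enumerable.Range(0, 20).Select(async i =>
        {
            await throttler.WaitAsync();
            try
            {
                // Track how many workflows are inside the gated block right now.
                int now = Interlocked.Increment(ref _current);
                InterlockedMax(ref _maxObserved, now);
                await Task.Delay(50); // simulate an async database call
            }
            finally
            {
                Interlocked.Decrement(ref _current);
                throttler.Release();
            }
        }).ToArray();

        await Task.WhenAll(tasks);
        Console.WriteLine($"Max concurrency observed: {_maxObserved}"); // never above 3
    }

    // Lock-free "store value if it is larger than the current maximum".
    static void InterlockedMax(ref int target, int value)
    {
        int snapshot;
        while ((snapshot = Volatile.Read(ref target)) < value &&
               Interlocked.CompareExchange(ref target, value, snapshot) != snapshot) { }
    }
}
```

Note that the semaphore is counting concurrent entries into the code block, not threads; the 20 workflows share a handful of thread-pool threads.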

"I recently started playing with C#, so it looks like my understanding of threads and tasks is not right."

I recommend reading my async intro and best practices. Follow up with There Is No Thread if you're interested in learning more about how threads aren't really involved.

"I modified ExecuteAsync to use Semaphore as shown below, but it doesn't look like it does what I want it to do."

The current code only throttles the adding of the tasks to the list, which happens one at a time anyway. What you want to do is throttle the execution itself:

private async Task<List<T>> ExecuteAsync<T>(IList<int> ids, IPollyPolicy policy, Func<CancellationToken, int, Task<T>> mapper) where T : class
{
  var throttler = new SemaphoreSlim(250);
  var tasks = new List<Task<T>>(ids.Count);

  // invoking multiple id in parallel to get data for each id from database
  for (int i = 0; i < ids.Count; i++)
    tasks.Add(ThrottledExecute(ids[i]));

  // wait for all id response to come back
  var responses = await Task.WhenAll(tasks);

  // same excludeNull code check here
  return excludeNull;

  async Task<T> ThrottledExecute(int id)
  {
    await throttler.WaitAsync().ConfigureAwait(false);
    try {
      return await Execute(policy, ct => mapper(ct, id)).ConfigureAwait(false);
    } finally {
      throttler.Release();
    }
  }
}



Answer 2:


Your colleague probably has in mind the Semaphore class, which is indeed a thread-centric throttler with no asynchronous capabilities:

Limits the number of threads that can access a resource or pool of resources concurrently.

The SemaphoreSlim class is a lightweight alternative to Semaphore, and it includes the asynchronous method WaitAsync, which makes all the difference in the world. WaitAsync doesn't block a thread; it suspends an asynchronous workflow. Asynchronous workflows are cheap (usually less than 1000 bytes each), so you can have millions of them "running" concurrently at any given moment. This is not the case with threads, because of the 1 MB of memory that each thread reserves for its stack.
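A rough way to see this difference in scale (an illustrative sketch, not a benchmark): ten thousand async workflows gated by one SemaphoreSlim, while recording how many distinct thread-pool threads ever touch them. Because WaitAsync suspends the workflow instead of blocking a thread, they all complete on a handful of threads.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ScaleDemo
{
    public static async Task Main()
    {
        var throttler = new SemaphoreSlim(100);
        var threadIds = new ConcurrentDictionary<int, bool>();

        var tasks = Enumerable.Range(0, 10_000).Select(async i =>
        {
            await throttler.WaitAsync(); // suspends the workflow, not a thread
            try
            {
                threadIds.TryAdd(Environment.CurrentManagedThreadId, true);
                await Task.Delay(1); // simulate async I/O
                return i;
            }
            finally
            {
                throttler.Release();
            }
        }).ToArray();

        int[] results = await Task.WhenAll(tasks);
        Console.WriteLine($"{results.Length} workflows ran on {threadIds.Count} threads");
    }
}
```

With 10,000 blocking Semaphore.Wait calls instead, you would need 10,000 threads (roughly 10 GB of reserved stack space) to get the same concurrency.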

As for the ExecuteAsync method, here is how you could refactor it by using the LINQ methods Select, Where, ToArray and ToList:

private async Task<List<T>> ExecuteAsync<T>(IList<int> ids, IPollyPolicy policy,
    Func<CancellationToken, int, Task<T>> mapper) where T : class
{
    var throttler = new SemaphoreSlim(10); // 250 is not realistic!
    Task<T>[] tasks = ids.Select(async id =>
    {
        await throttler.WaitAsync().ConfigureAwait(false);
        try
        {
            // await here, so the semaphore is held until the call completes
            return await Execute(policy, ct => mapper(ct, id)).ConfigureAwait(false);
        }
        finally
        {
            throttler.Release();
        }
    }).ToArray();

    T[] results = await Task.WhenAll(tasks).ConfigureAwait(false);

    return results.Where(r => r != null).ToList();
}



Answer 3:


You are throttling the rate at which you add tasks to the list, not the rate at which the tasks execute. To throttle execution, you'd probably have to put the semaphore calls inside the Execute method itself.

If you can't modify Execute, another way to do it is to poll for completed tasks, sort of like this:

for (int i = 0; i < ids.Count; i++)
{
    // re-count pending tasks on every iteration, otherwise this loops forever
    while (tasks.Count(t => !t.IsCompleted) >= 500) await Task.Yield();
    var id = ids[i]; // copy into a local so the closure doesn't capture the loop index
    tasks.Add(Execute(policy, ct => mapper(ct, id)));
}
await Task.WhenAll(tasks);



Answer 4:


Actually, the TPL is capable of controlling task execution and limiting concurrency. You can test how many parallel tasks are suitable for your use case. There is no need to think about threads; the TPL will manage everything for you.

To use limited concurrency, see this answer (credit to @panagiotis-kanavos):

.Net TPL: Limited Concurrency Level Task scheduler with task priority?

The example code is below (it even uses different priorities; you can strip that out):

QueuedTaskScheduler qts = new QueuedTaskScheduler(TaskScheduler.Default,4);
TaskScheduler pri0 = qts.ActivateNewQueue(priority: 0);
TaskScheduler pri1 = qts.ActivateNewQueue(priority: 1);

Task.Factory.StartNew(()=>{ }, 
                  CancellationToken.None, 
                  TaskCreationOptions.None, 
                  pri0);

Just add all your tasks to the queue, and with Task.WhenAll you can wait until everything is done.
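As a side note, if .NET 6 or later is available (an assumption; the original question doesn't state the runtime version), the same throttling can be expressed without a custom scheduler via Parallel.ForEachAsync with MaxDegreeOfParallelism. The ids and the delay below are placeholders for the real database call:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

class ForEachAsyncDemo
{
    public static async Task Main()
    {
        var ids = Enumerable.Range(1, 500).ToList();
        var results = new ConcurrentBag<string>();

        // At most 10 database calls are in flight at any moment.
        var options = new ParallelOptions { MaxDegreeOfParallelism = 10 };
        await Parallel.ForEachAsync(ids, options, async (id, ct) =>
        {
            // Hypothetical database call, replaced by a delay for this sketch.
            await Task.Delay(1, ct);
            results.Add($"row-{id}");
        });

        Console.WriteLine(results.Count); // 500
    }
}
```

Unlike the semaphore approach, the degree of parallelism is declared once in ParallelOptions rather than enforced manually in each task body.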



Source: https://stackoverflow.com/questions/62288411/how-to-limit-number-of-async-io-tasks-to-database
