I have a question about Web API 2. My application is fully async/await, but I want to optimize the last part. I'm having a hard time finding out: is there a way to return the response to the client without awaiting the remaining background tasks that the controller action kicks off?
Assuming you are trying to parallelize a number of asynchronous tasks invoked by the controller action, and that you want to return a response to the client after just one (definite) task completes, without awaiting the rest (fire and forget), you can simply invoke the asynchronous methods without awaiting them:
// Random async method here ... (requires using System.Diagnostics and System.Threading.Tasks)
private async Task<int> DelayAsync(int seconds)
{
    await Task.Delay(seconds * 1000);
    Trace.WriteLine($"Done waiting {seconds} seconds");
    return seconds;
}

[HttpGet]
public async Task<IHttpActionResult> ParallelBackgroundTasks()
{
    var firstResult = await DelayAsync(6);

    // Initiate unawaited background tasks ...
#pragma warning disable 4014
    // Calls will return immediately
    DelayAsync(100);
    DelayAsync(111);
    // ...
#pragma warning restore 4014

    // Return the first result to the client without waiting for the background tasks to complete
    return Ok(firstResult);
}
If you need to do further processing after all the background tasks complete, even after the original request thread has completed, it is still possible to schedule a continuation upon completion:
#pragma warning disable 4014
var backgroundTasks = Enumerable.Range(1, 5)
                                .Select(DelayAsync);

// Not awaited
Task.WhenAll(backgroundTasks)
    .ContinueWith(t =>
    {
        if (t.IsFaulted)
        {
            // Exception handler here
            return;
        }
        Trace.WriteLine($"Done waiting for a total of {t.Result.Sum()} seconds");
    });
#pragma warning restore 4014
Better still would be to refactor the background work into its own async method, where the benefits of exception handling are available:
private async Task ScheduleBackGroundWork()
{
    try
    {
        // Initiate unawaited background tasks
        var backgroundTasks = Enumerable.Range(1, 5)
                                        .Select(DelayAsync);
        var results = await Task.WhenAll(backgroundTasks)
                                .ConfigureAwait(false);
        Trace.WriteLine($"Done waiting for a total of {results.Sum()} seconds");
    }
    catch (Exception)
    {
        Trace.WriteLine("Oops");
    }
}
The invocation of the background work would still be unawaited, viz:
#pragma warning disable 4014
ScheduleBackGroundWork();
#pragma warning restore 4014
Notes
Assuming there isn't CPU-bound work done prior to the innermost await, this approach has the advantage over using Task.Run() in that it uses fewer thread-pool threads.
Even so, the wisdom of doing this needs to be considered: although the tasks are created serially on the controller's thread-pool thread, when the IO-bound work completes, each continuation (the Trace.WriteLine) will require a thread to run, which can still cause thread-pool starvation if many continuations complete simultaneously. For scalability reasons, you won't want multiple clients invoking these kinds of actions.
Obviously, the client doesn't actually know the final outcome of all the tasks, so you may need to add extra state to notify the client once the actual work is done (e.g. via SignalR). Also, if the app pool dies or is recycled, the result will be lost.
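For example, with SignalR 2 on classic ASP.NET you could push a notification to connected clients from the background continuation once the work finishes. A minimal sketch (NotificationHub and the workDone callback name are hypothetical):

// Hypothetical hub; clients connect to it to be told when background work completes.
public class NotificationHub : Hub
{
}

// Somewhere in the background continuation, after all tasks have completed:
var hubContext = GlobalHost.ConnectionManager.GetHubContext<NotificationHub>();
hubContext.Clients.All.workDone(totalSeconds);   // e.g. totalSeconds = t.Result.Sum(); "workDone" is the client-side callback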
You'll also get a compiler warning (CS4014) when you don't await the result of an async method; this can be suppressed with a pragma, as shown above.
When using unawaited tasks, you'll also want to put in a global unobserved-task-exception handler (TaskScheduler.UnobservedTaskException) when invoking async code without await.
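A minimal sketch of such a handler, assuming classic ASP.NET hosting (register it once at startup; the logging is just a placeholder):

// In Global.asax.cs, register once at application start:
protected void Application_Start()
{
    TaskScheduler.UnobservedTaskException += (sender, e) =>
    {
        // Log and mark the exception as observed so the finalizer doesn't rethrow it.
        Trace.WriteLine($"Unobserved task exception: {e.Exception}");
        e.SetObserved();
    };

    // ... existing Web API configuration ...
}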
If you use dependency injection and the continuation to be executed after an unawaited task has any dependencies, especially ones registered per-request that are IDisposable, you'll need to fudge your container to convince it not to dispose of those dependencies when the request completes (since your continuation will run some time in the future).
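Alternatively, you can sidestep the per-request lifetime entirely by giving the background work its own scope. A sketch assuming Autofac (the ILifetimeScope parameter and the IWorkItemProcessor dependency are illustrative):

// Requires the Autofac package. The background work owns its own lifetime scope,
// so nothing it resolves is disposed when the originating HTTP request completes.
private async Task ScheduleBackGroundWork(ILifetimeScope rootScope)
{
    using (var scope = rootScope.BeginLifetimeScope())
    {
        // IWorkItemProcessor is a hypothetical dependency normally registered per-request.
        var processor = scope.Resolve<IWorkItemProcessor>();
        await processor.ProcessAsync().ConfigureAwait(false);
    }
}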
Edit - Re Scalability
To be honest, it's going to depend a lot on exactly what you intend to do with the 'background' tasks. Consider this updated 'background task':
private async Task<int> DelayAsync(int seconds)
{
    // Case 1 : There's a lot of CPU-bound work BEFORE the innermost await
    Thread.Sleep(1000);

    await Task.Delay(seconds * 1000)
              .ConfigureAwait(false);

    // Case 2 : There's long-duration CPU-bound work in the continuation task
    Thread.Sleep(1000);
    Trace.WriteLine($"Done waiting {seconds} seconds");
    return seconds;
}
If there is significant CPU-bound work before the innermost await (Case 1, above), you would have needed to resort to Jonathan's Task.Run() strategy to decouple the waiting client accessing the controller from the 'Case 1' work (otherwise the client would be forced to wait). Doing so will chew up roughly one thread per task.

If there is long-duration CPU-bound work after the innermost await (Case 2), then the scheduled continuation will consume a thread for the duration of the remaining work. Although this won't affect the original client call duration, it will affect overall process thread and CPU usage.

So I guess the answer is "it depends". You can probably get away with a few unawaited tasks with no pre- and post-processing on a self-hosted OWIN service, but if you're using Azure, then something like Azure Functions or the older Azure WebJobs sounds like a better bet for background processing.
What you're looking for is "fire and forget" - which is inherently dangerous on ASP.NET.
The proper solution is to have an independent worker process (Azure function / Win32 service) that is connected to your WebAPI using a reliable queue (Azure queue / MSMQ). Your WebAPI should write to the queue and then return the response. The worker process (outside of ASP.NET) should read from the queue and process the work items.
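A minimal sketch of the WebAPI side of that pattern, assuming Azure Storage Queues (the "work-items" queue name, the WorkItem DTO, and the connection-string key are illustrative; the independent worker that drains the queue is not shown):

// Requires the WindowsAzure.Storage and Newtonsoft.Json packages.
public class WorkItem
{
    public string Payload { get; set; }   // hypothetical DTO describing the background job
}

public class WorkController : ApiController
{
    [HttpPost]
    public async Task<IHttpActionResult> Post(WorkItem item)
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        var queue = account.CreateCloudQueueClient().GetQueueReference("work-items");
        await queue.CreateIfNotExistsAsync();

        // Enqueue the work and return immediately; the worker process handles it later.
        await queue.AddMessageAsync(new CloudQueueMessage(JsonConvert.SerializeObject(item)));
        return Ok();
    }
}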
The code after the return statement will not be executed in your second example, but these tasks can be offloaded to the ThreadPool:
public async Task<IHttpActionResult> Foo(Bar bar)
{
    List<Task> tasks = new List<Task>();
    var actualresult = await Barfoo(bar.Bar);

    foreach (var foobar in bar.Foo)
    {
        // some stuff which fills tasks for extra logic, not important for the client
        tasks.Add(Task.Run(() => { /* foobar task creation, queued on worker threads */ }));
    }

    // this will execute without waiting for the foobar logic to finish
    return Ok(actualresult);
}
If you want to later check the 'extra logic' tasks for completion or errors, you may want to look into the Task Parallel Library; a sketch follows.
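For instance (a sketch; ProcessFoobar stands in for whatever the per-item work actually is), you could keep hold of the Task handles and attach a continuation:

// Keep the Task handles so completion and errors can be inspected later.
var tasks = bar.Foo
               .Select(foobar => Task.Run(() => ProcessFoobar(foobar)))   // ProcessFoobar is hypothetical
               .ToList();

// Still fire-and-forget from the client's point of view:
Task.WhenAll(tasks).ContinueWith(t =>
{
    if (t.IsFaulted)
    {
        // t.Exception is an AggregateException containing every failure.
        Trace.WriteLine($"Background work failed: {t.Exception}");
    }
    else
    {
        Trace.WriteLine("All background work completed.");
    }
});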