Using a variable delay in Task.Delay randomly takes seconds instead of milliseconds when combined with an IO-like operation.
Code to reproduce:
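(The question's actual reproduction code isn't included in this excerpt. Purely as an illustration of the kind of pattern being described, with every name, count and size assumed, it might resemble the following: many concurrent workers, each interleaving a small awaited delay with a burst of blocking IO-like work, while timing each delay.)

```
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    // One worker: alternate a small awaited delay with a short burst of IO-like work,
    // and log whenever the 20 ms delay takes dramatically longer than requested.
    static async Task Worker(int id)
    {
        var buffer = new byte[64 * 1024];
        new Random(id).NextBytes(buffer);

        for (var i = 0; i < 100; i++)
        {
            var sw = Stopwatch.StartNew();
            await Task.Delay(20);
            sw.Stop();

            if (sw.ElapsedMilliseconds > 500)
                Console.WriteLine($"Worker {id}: Task.Delay(20) took {sw.ElapsedMilliseconds} ms");

            // IO-like operation: synchronous file write/delete on a pool thread.
            var path = Path.GetTempFileName();
            File.WriteAllBytes(path, buffer);
            File.Delete(path);
        }
    }

    static async Task Main() =>
        await Task.WhenAll(Enumerable.Range(0, 100).Select(Worker));
}
```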
Without seeing more code, it's hard to make further guesses, but I'd like to summarize the comments; it may help someone else in the future:
We've figured out that ThreadPool stuttering is not an issue here, as `ThreadPool.SetMinThreads(500, 500)` didn't help.
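For reference, that call would typically go at the very start of the program, before any load is generated; a minimal sketch (assuming a console app):

```
using System;
using System.Threading;

class Startup
{
    static void Main()
    {
        // Raise the minimum number of worker and IOCP threads so the pool
        // doesn't throttle thread injection under a sudden burst of work.
        ThreadPool.SetMinThreads(500, 500);

        // Optional sanity check: see how many threads are currently available.
        ThreadPool.GetAvailableThreads(out var worker, out var iocp);
        Console.WriteLine($"Available threads: worker={worker}, IOCP={iocp}");

        // ... run the repro from here ...
    }
}
```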
Is there any `SynchronizationContext` in place anywhere in your task workflow? Place `Debug.Assert(SynchronizationContext.Current == null)` everywhere to check for that. Use `ConfigureAwait(false)` with every `await`.
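A sketch of what that kind of check might look like (the method name here is a placeholder):

```
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class Checks
{
    // Hypothetical worker method illustrating both suggestions above.
    static async Task DoWorkAsync()
    {
        // If this fires, some captured SynchronizationContext is involved,
        // and awaited continuations may be funneled through it.
        Debug.Assert(SynchronizationContext.Current == null);

        // ConfigureAwait(false) tells the awaiter not to capture the current
        // context for the continuation.
        await Task.Delay(20).ConfigureAwait(false);

        Debug.Assert(SynchronizationContext.Current == null);
    }
}
```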
Is there any `.Wait`, `.WaitOne`, `.WaitAll`, `WaitAny`, `.Result` used anywhere in your code? Any `lock () { ... }` constructs? `Monitor.Enter/Exit` or any other blocking synchronization primitives?
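For example, the classic pattern to look for is sync-over-async, where a thread blocks on a task's result (a hypothetical illustration, not code from the question):

```
using System.Threading.Tasks;

class BlockingExamples
{
    static Task<int> GetValueAsync() => Task.Run(() => 42); // placeholder async work

    static int BadSyncOverAsync()
    {
        // Blocks the calling (possibly thread-pool) thread until the task finishes.
        // Enough of these at once can starve the pool, which delays any continuations
        // scheduled on it, including the ones that resume after await Task.Delay(...).
        return GetValueAsync().Result;
    }

    static async Task<int> GoodAsync()
    {
        // Awaiting keeps the thread free while the work is pending.
        return await GetValueAsync();
    }
}
```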
Regarding this comment: "I've already replaced `Task.Delay(20)` with `Task.Yield(); Thread.Sleep(20)` as a workaround, and that works. But yeah, I continue trying to figure out what's going on here, because the idea that `Task.Delay(20)` can shoot this far out of line makes it totally unusable."
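For clarity, that workaround wrapped into a hypothetical helper might look like this:

```
using System.Threading;
using System.Threading.Tasks;

static class DelayWorkaround
{
    // Hypothetical helper: yield back to the scheduler first, then block the
    // current (pool) thread with Thread.Sleep for the requested time.
    // Unlike Task.Delay, this ties up a thread for the whole delay.
    public static async Task Delay(int milliseconds)
    {
        await Task.Yield();
        Thread.Sleep(milliseconds);
    }
}
```

Usage: `await DelayWorkaround.Delay(20);` in place of `await Task.Delay(20);`.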
That `Task.Delay(20)` can drift this far out of line does sound worrying, indeed. It's very unlikely there's a bug in `Task.Delay`, but everything is possible. For the sake of experimenting, try replacing `await Task.Delay(20)` with `await Task.Run(() => Thread.Sleep(20))`, keeping `ThreadPool.SetMinThreads(500, 500)` still in place.
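That is, something along these lines (the timing code is just an assumption added to make the comparison visible):

```
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class Experiment
{
    static async Task DelayViaThreadPool()
    {
        var sw = Stopwatch.StartNew();

        // Instead of: await Task.Delay(20);
        // Block a dedicated pool thread for 20 ms. If this reliably stays near
        // 20 ms while Task.Delay(20) does not, that suggests the delay is lost
        // somewhere in the timer machinery rather than in the thread pool itself.
        await Task.Run(() => Thread.Sleep(20));

        Console.WriteLine($"Took {sw.ElapsedMilliseconds} ms");
    }
}
```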
I also have an experimental implementation of `Delay` which uses the unmanaged `CreateTimerQueueTimer` API (unlike `Task.Delay`, which uses `System.Threading.Timer`, which in turn uses the managed `TimerQueue`). It's available here as a gist. Feel free to try it as `TaskExt.Delay` instead of the standard `Task.Delay`. The timer callbacks are posted to `ThreadPool`, so `ThreadPool.SetMinThreads(500, 500)` should still be used for this experiment. I doubt it could make any difference, but I'd be interested to know.
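The gist itself isn't reproduced here; purely as an illustration of that general approach (and not the linked code), a rough, Windows-only sketch of a `CreateTimerQueueTimer`-based delay might look like this:

```
using System;
using System.ComponentModel;
using System.Runtime.InteropServices;
using System.Threading;
using System.Threading.Tasks;

public static class TaskExt
{
    private delegate void WaitOrTimerCallbackProc(IntPtr lpParameter, bool timerOrWaitFired);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CreateTimerQueueTimer(
        out IntPtr phNewTimer, IntPtr timerQueue,
        WaitOrTimerCallbackProc callback, IntPtr parameter,
        uint dueTime, uint period, uint flags);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool DeleteTimerQueueTimer(
        IntPtr timerQueue, IntPtr timer, IntPtr completionEvent);

    private const uint WT_EXECUTEONLYONCE = 0x00000008;

    // One-shot delay backed by the OS timer queue rather than the managed TimerQueue.
    public static Task Delay(int milliseconds)
    {
        var tcs = new TaskCompletionSource<bool>();
        var timer = IntPtr.Zero;
        var callbackHandle = default(GCHandle);

        WaitOrTimerCallbackProc callback = (state, timedOut) =>
        {
            // Hand completion over to the .NET thread pool so the awaiting
            // continuation doesn't run on the timer-queue callback thread.
            ThreadPool.UnsafeQueueUserWorkItem(_ =>
            {
                DeleteTimerQueueTimer(IntPtr.Zero, timer, IntPtr.Zero); // best-effort cleanup
                callbackHandle.Free(); // the delegate may be collected now
                tcs.TrySetResult(true);
            }, null);
        };

        // Root the delegate so the GC can't collect it while native code holds it.
        callbackHandle = GCHandle.Alloc(callback);

        if (!CreateTimerQueueTimer(out timer, IntPtr.Zero, callback, IntPtr.Zero,
                (uint)milliseconds, 0, WT_EXECUTEONLYONCE))
        {
            callbackHandle.Free();
            throw new Win32Exception(Marshal.GetLastWin32Error());
        }

        // Note: a production version would also guard against the timer firing
        // before the assignments above complete, and would support cancellation.
        return tcs.Task;
    }
}
```

Usage would be `await TaskExt.Delay(20)` in place of `await Task.Delay(20)`, with `ThreadPool.SetMinThreads(500, 500)` still applied.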