Timer firing Tick event with 15 milliseconds delay

Submitted by 此生再无相见时 on 2019-12-25 14:13:09

Question


I'm having a weird problem with timers. As far as I know, the Interval property of a timer specifies the delay between consecutive firings of the timer's Tick event.

I had this problem, with exactly the same symptoms (a 15 to 16 millisecond delay), back when I was programming in Visual Basic. Every timer I create fires its Tick event with a 15 or 16 millisecond delay. For instance, if I set the timer's interval to 1 (which means its Tick event should fire 1000 times per second), the event instead fires only 62 to 66 times per second (that's 1000/16 to 1000/15).

I've been developing VB applications for 5 years and have always had this problem (which also means I've had it on several different systems, with both AMD and Intel processors), and now I'm running into it again with C#.

I managed to work around the problem by calculating the time difference between consecutive firings of the Tick event based on the tick count (the GetTickCount API in VB and Environment.TickCount in C#).

*TickCount is the number of milliseconds that have elapsed since the system was started.
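For illustration, here's a minimal sketch of that workaround (not the exact code I used; the class and member names are just placeholders): accumulate the milliseconds that really elapsed between ticks instead of counting how many times the event fired.

using System;

// Sketch of the workaround: accumulate the milliseconds that really elapsed
// between ticks instead of counting how many times the Tick event fired.
class TickCountClock
{
    private int lastTick = Environment.TickCount;
    private int elapsedMs = 0;

    // Call this from the timer's Tick handler.
    public void OnTick()
    {
        int now = Environment.TickCount;
        elapsedMs += now - lastTick;   // real time passed, regardless of timer resolution
        lastTick = now;
    }

    // Elapsed time formatted as minutes:seconds.
    public string Display
    {
        get { return (elapsedMs / 1000 / 60) + ":" + (elapsedMs / 1000 % 60); }
    }
}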

To understand the problem better, I created a Windows application that counts the seconds since it was started (like a stopwatch). It keeps time in two ways: with TickCount, and by simply incrementing a counter each time the Tick event fires. It also calculates the timer's delay by subtracting the previous value of TickCount from the current value each time the event fires. If the timer really fired 1000 times per second, the difference between consecutive TickCount values would be 1 each time, meaning there is no delay; if the difference is greater than 1, there is some delay between firings of the Tick event.

Here's the code:

public partial class Form1 : Form
{
    int localTime = 0, systemTime = 0, baseSystemTime = 0, lastSystemTime = 0;
    public Form1()
    {
        InitializeComponent();
    }

    private void timer1_Tick(object sender, EventArgs e)
    {
        // Calculate time based on TickCount
        if (baseSystemTime == 0)
            baseSystemTime = Environment.TickCount;

        systemTime = Environment.TickCount - baseSystemTime;
        label2.Text ="System Time: " + ((systemTime / 1000) / 60).ToString() + ":" + ((systemTime / 1000) % 60).ToString();

        // Calculate time based on timer1_Tick
        localTime++;
        label1.Text = "Application Time: " + ((localTime / 1000) / 60).ToString() + ":" + ((localTime / 1000) % 60).ToString();

        // Calculate the delay
        if (lastSystemTime > 0)
        {
            label3.Text = "Delay: " + (Environment.TickCount - lastSystemTime).ToString() + " ms";
        }

        lastSystemTime = Environment.TickCount;
    }
}

I've also uploaded the whole solution here: http://ramt.in/test/TimerDelay.zip

Here's a screenshot of the application, showing a 15 millisecond delay and the application having counted only 1 second while 17 seconds have actually passed!

The solution is only 50 KB, so feel free to download it and run it to see if you get the same result as I do. If it's the same, then there's something wrong with the Timer class in the Microsoft world!

But more importantly, if anyone knows anything about what might cause this delay, please share your knowledge with me.


Answer 1:


This is a system problem, not a C# or VB one. To check how accurate your system's timer is, you can use the Stopwatch class and two of its properties:

  1. IsHighResolution - The timer used by the Stopwatch class depends on the system hardware and operating system. IsHighResolution is true if the Stopwatch timer is based on a high-resolution performance counter. Otherwise, IsHighResolution is false, which indicates that the Stopwatch timer is based on the system timer.
  2. Frequency

The following code from MSDN shows how it works

public static void DisplayTimerProperties()
{
    // Display the timer frequency and resolution. 
    if (Stopwatch.IsHighResolution)
    {
        Console.WriteLine("Operations timed using the system's high-resolution performance counter.");
    }
    else 
    {
        Console.WriteLine("Operations timed using the DateTime class.");
    }

    long frequency = Stopwatch.Frequency;
    Console.WriteLine("  Timer frequency in ticks per second = {0}",
        frequency);
    long nanosecPerTick = (1000L*1000L*1000L) / frequency;
    Console.WriteLine("  Timer is accurate within {0} nanoseconds", 
        nanosecPerTick);
}
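
As a related sketch (not part of the MSDN sample; the class and handler names are placeholders), you can also use a Stopwatch inside the Tick handler to measure how long each interval really took:

using System;
using System.Diagnostics;
using System.Windows.Forms;

// Sketch: measure the real interval between Tick events with a Stopwatch.
// Note that a System.Windows.Forms.Timer only fires while a message loop
// is running (e.g. inside Application.Run).
class TimerProbe
{
    private readonly Stopwatch stopwatch = Stopwatch.StartNew();
    private readonly Timer timer = new Timer();

    public TimerProbe()
    {
        timer.Interval = 1;        // requested: 1 ms
        timer.Tick += OnTick;
        timer.Start();
    }

    private void OnTick(object sender, EventArgs e)
    {
        // With the default 15.625 ms system timer this typically prints ~15-16 ms.
        Console.WriteLine("Actual interval: {0:F3} ms", stopwatch.Elapsed.TotalMilliseconds);
        stopwatch.Restart();
    }
}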

UPDATE

You also made an error in your code:

        // Calculate time based on timer1_Tick
        localTime++;
        label1.Text = "Application Time: " + ((localTime / 1000) / 60).ToString() + ":" + ((localTime / 1000) % 60).ToString();

These lines have nothing to do with the time that has passed; they only count how many times timer1_Tick has run. A 1 millisecond interval is simply too small for a Windows Forms timer. You can read about this here: Timer takes 10 ms more than interval
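
One way to fix that (a sketch only; StopwatchClock is a made-up name) is to derive the displayed application time from measured elapsed time instead of from the number of Tick events:

using System;
using System.Diagnostics;

// Sketch: the displayed time comes from measured elapsed time,
// so it stays correct even when Tick events arrive late.
class StopwatchClock
{
    private readonly Stopwatch stopwatch = Stopwatch.StartNew();

    // Call from the Tick handler to get the text for the label.
    public string ApplicationTime
    {
        get
        {
            TimeSpan t = stopwatch.Elapsed;
            return string.Format("Application Time: {0}:{1:D2}", (int)t.TotalMinutes, t.Seconds);
        }
    }
}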

If you need a more precise timer, you could look at this article: Microsecond and Millisecond C# Timer




Answer 2:


This has nothing to do with real time or not: the Windows default timer resolution is 64 ticks/s, or 15.625 ms. However, the system's timer resolution may be modified to operate at a higher resolution, e.g. 1 ms. See this answer to the question "Why are .NET timers limited to 15 ms resolution?" to get an idea of how to modify the system's timer resolution.
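
For illustration, a minimal sketch of requesting a higher system timer resolution through the multimedia timer API in winmm.dll (whether the requested 1 ms is actually granted depends on the system; see the linked answer for the details and drawbacks):

using System;
using System.Runtime.InteropServices;

// Sketch: ask Windows for a 1 ms system timer resolution via winmm.dll.
// Always pair timeBeginPeriod with timeEndPeriod, and keep the higher
// resolution active only as long as it is really needed (it costs power).
static class HighResolutionTimerPeriod
{
    [DllImport("winmm.dll")]
    private static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    private static extern uint timeEndPeriod(uint uMilliseconds);

    public static void Run(Action action)
    {
        timeBeginPeriod(1);      // request 1 ms resolution
        try
        {
            action();            // timers fire closer to their Interval here
        }
        finally
        {
            timeEndPeriod(1);    // restore the default resolution
        }
    }
}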



Source: https://stackoverflow.com/questions/29999274/timer-firing-tick-event-with-15-milliseconds-delay
