Explanation for Timespan Differences Between C# and JavaScript

无人及你 2021-02-14 09:41

This is based on "Computing milliseconds since 1970 in C# yields different date than JavaScript" and "C# version of Javascript Date.getTime()".

For all of these calculations …

2 Answers
  •  逝去的感伤
    2021-02-14 10:14

    As you correctly point out, .getTime() returns "the number of milliseconds since 1 January 1970 00:00:00 UTC."

    This means that .getTime() (as you noticed) includes the offset from UTC in the calculation.

    In order to make the C# code reflect this, the time you're subtracting from must include time zone information, while 1 January 1970 00:00:00 must be a UTC time.

    This might be easier to understand with a few examples. Given:

    DateTime e = new DateTime(2014, 2, 28, 0, 0, 0); // a local time (Kind is Unspecified, treated as local)
    DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);  // the Unix epoch, intended as a UTC time
    
    1. e - s is incorrect, because s is not a UTC time.
    2. e.ToUniversalTime() - s.ToUniversalTime() is incorrect, because e no longer includes the offset from UTC (which the JavaScript calculation does include).
    3. e.ToUniversalTime() - s is correct, because converting e to UTC applies its offset from UTC, while s already represents the UTC epoch (see the sketch below).
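
    To make option 3 concrete, here is a minimal sketch of the conversion, assuming the goal is to reproduce JavaScript's .getTime() value (the TotalMilliseconds step is my addition, not part of the original question):

    // Convert the local time to UTC, then subtract the UTC epoch.
    // The result matches new Date(2014, 1, 28).getTime() in JavaScript,
    // since both depend on the machine's local offset from UTC.
    long millis = (long)(e.ToUniversalTime() - s).TotalMilliseconds;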

    This was easier for me to see when I dealt with DateTime.Ticks directly:

    e.Ticks // 635291424000000000
    s.Ticks // 621355968000000000
    
    e.Ticks - s.Ticks // 13935456000000000 ("naive" implementation)
    e.ToUniversalTime().Ticks - s.Ticks // 13935636000000000 (correct output)
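
    A tick is 100 nanoseconds, so dividing by TimeSpan.TicksPerMillisecond (10,000) turns the correct tick difference into the millisecond count .getTime() reports; this conversion is my sketch, not part of the original answer:

    // 13935636000000000 ticks / 10000 = 1393563600000 ms, which is what
    // new Date(2014, 1, 28).getTime() returns in a UTC-5 time zone.
    long millis = (e.ToUniversalTime().Ticks - s.Ticks) / TimeSpan.TicksPerMillisecond;

    Note that the gap between the two results, 180,000,000,000 ticks, works out to exactly 5 hours, which is presumably the local offset from UTC of the machine that produced these numbers.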
    

    Again, the Ticks example meets all of our requirements: the Unix epoch is in UTC, while the time we're dealing with still carries its original offset from UTC.
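
    As an aside (my addition, not from the original answer): on .NET 4.6 and later the same conversion is built in via DateTimeOffset.ToUnixTimeMilliseconds(), which handles the offset bookkeeping for you:

    // A DateTimeOffset pins down the offset from UTC explicitly
    // (UTC-5 here, matching the numbers above).
    var d = new DateTimeOffset(2014, 2, 28, 0, 0, 0, TimeSpan.FromHours(-5));
    long millis = d.ToUnixTimeMilliseconds(); // 1393563600000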
