This is based on "Computing milliseconds since 1970 in C# yields different date than JavaScript" and "C# version of Javascript Date.getTime()".
For all of these calculations, the key is how the offset from UTC is handled.

As you correctly point out, .getTime() returns "the number of milliseconds since 1 January 1970 00:00:00 UTC."

Which means that .getTime() is (as you noticed) including the offset from UTC in the calculation.
In order to make the C# code reflect this, the time you're subtracting from must include time zone information, while 1 January 1970 00:00:00 must be a UTC time.
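One way to make both requirements explicit in code (my own sketch, not code from the question) is to construct the epoch with DateTimeKind.Utc and convert the other time to UTC before subtracting:

// Sketch only: the epoch is explicitly UTC, and ToUniversalTime() applies the local offset.
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime local = new DateTime(2014, 2, 28, 0, 0, 0); // an unspecified "wall clock" time, assumed local
double millisecondsSinceEpoch = (local.ToUniversalTime() - epoch).TotalMilliseconds;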
This might be easier to understand with a few examples. Given:
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);
e - s
is incorrect because s is not a UTC time.

e.ToUniversalTime() - s.ToUniversalTime()
is incorrect because e no longer includes the offset from UTC (like the calculation in JavaScript does).

e.ToUniversalTime() - s
is correct because the epoch is being treated as UTC, and the time we're subtracting from still includes its offset from UTC.

This was easier for me to see when I dealt with DateTime.Ticks directly:
e.Ticks // 635291424000000000
s.Ticks // 621355968000000000
e.Ticks - s.Ticks // 13935456000000000 ("naive" implementation)
e.ToUniversalTime().Ticks - s.Ticks // 13935636000000000 (correct output for a UTC-05:00 local zone)
Again, the last example meets both of our requirements: the Unix epoch is treated as UTC, while the offset of the time we're dealing with is still reflected in the calculation.
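Putting it together, here is a minimal console sketch of the same calculation (my own illustration; the exact output depends on the machine's local time zone, e.g. 1393563600000 for UTC-05:00):

using System;

class UnixMillisecondsDemo
{
    static void Main()
    {
        // The same values as above: a "wall clock" time and the Unix epoch.
        DateTime e = new DateTime(2014, 2, 28, 0, 0, 0);
        DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);

        // Convert e to UTC (this applies the local offset), then subtract the epoch, treated as UTC.
        TimeSpan sinceEpoch = e.ToUniversalTime() - s;

        // Both lines print the same millisecond count; a tick is 100 nanoseconds,
        // so dividing ticks by TimeSpan.TicksPerMillisecond (10,000) gives milliseconds.
        Console.WriteLine(sinceEpoch.TotalMilliseconds);
        Console.WriteLine((e.ToUniversalTime().Ticks - s.Ticks) / TimeSpan.TicksPerMillisecond);
    }
}

On a UTC-05:00 machine this should match new Date(2014, 1, 28).getTime() in JavaScript run on the same machine (JavaScript months are zero-based, so 1 is February).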