This is based on "Computing milliseconds since 1970 in C# yields different date than JavaScript" and "C# version of Javascript Date.getTime()".
For all of these calculations, keep in mind the following. As you correctly point out, .getTime() returns "the number of milliseconds since 1 January 1970 00:00:00 UTC." This means that .getTime() (as you noticed) includes the offset from UTC in the calculation.
In order to make the C# code reflect this, the time you're subtracting from must include time zone information, while 1 January 1970 00:00:00 must be a UTC time.
This might be easier to understand with a few examples. Given:
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0); // Kind is Unspecified; ToUniversalTime() treats it as local time
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);  // the Unix epoch, intended to represent UTC
- e - s is incorrect because s is not a UTC time.
- e.ToUniversalTime() - s.ToUniversalTime() is incorrect because e no longer includes the offset from UTC (like the calculation in JavaScript does).
- e.ToUniversalTime() - s is correct because we're using the UTC epoch and the time we're subtracting from includes the offset from UTC.

This was easier for me to see when I dealt with DateTime.Ticks directly:
e.Ticks // 635291424000000000
s.Ticks // 621355968000000000
e.Ticks - s.Ticks // 13935456000000000 ("naive" implementation; ignores the offset from UTC)
e.ToUniversalTime().Ticks - s.Ticks // 13935636000000000 (correct output; the 5-hour difference reflects a local time zone of UTC-05:00)
Again, the last example meets all of our requirements. The Unix epoch is in UTC, while the time we're dealing with still has its original offset.
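
Putting that together, here is a minimal C# sketch of a getTime()-style helper. The JsTime and JsGetTime names are just for illustration, and the DateTimeOffset line assumes .NET 4.6 or later, where ToUnixTimeMilliseconds() is available:

using System;

static class JsTime
{
    // The Unix epoch, explicitly marked as UTC so the subtraction is unambiguous.
    private static readonly DateTime Epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    // Milliseconds since the epoch, mirroring JavaScript's date.getTime():
    // convert the local (or unspecified) DateTime to UTC first, then subtract the UTC epoch.
    public static long JsGetTime(DateTime local)
    {
        return (long)(local.ToUniversalTime() - Epoch).TotalMilliseconds;
    }

    static void Main()
    {
        DateTime e = new DateTime(2014, 2, 28, 0, 0, 0);

        // Matches new Date(2014, 1, 28).getTime() when both run in the same time zone;
        // for example, 1393563600000 when the local time zone is UTC-05:00.
        Console.WriteLine(JsGetTime(e));

        // On .NET 4.6+, DateTimeOffset provides the same conversion out of the box.
        Console.WriteLine(new DateTimeOffset(e).ToUnixTimeMilliseconds());
    }
}

The key point of the sketch is that only the epoch is constructed with DateTimeKind.Utc, so ToUniversalTime() is applied to the one value that actually carries a local offset.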
"I understand that JavaScript Date objects are based on the Unix Epoch (Midnight on Jan 1, 1970)."
Yes, they are. Internally, it's just a number of milliseconds from the epoch. But when you call the Date constructor, or look at the output from .toString(), it is using the local time of where the code is running.
If you want the input to be specified in UTC, then you have to use a different incantation:
var ts = Date.UTC(2014,1,28); // returns a numeric timestamp, not a Date object
var dt = new Date(ts); // if you want a date object
var s = dt.toUTCString(); // if you want the output to be in UTC
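
For comparison, here is a small C# sketch (the UtcDemo name is just for this example) of the matching calculation on the C# side: when the input is declared as UTC in both languages, the millisecond values agree regardless of the machine's time zone.

using System;

class UtcDemo
{
    static void Main()
    {
        // The Unix epoch in UTC.
        DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

        // Declare the input as UTC, the C# counterpart of Date.UTC(2014, 1, 28).
        DateTime e = new DateTime(2014, 2, 28, 0, 0, 0, DateTimeKind.Utc);

        long ms = (long)(e - epoch).TotalMilliseconds;
        Console.WriteLine(ms); // 1393545600000, the same value Date.UTC(2014, 1, 28) returns

        // Format it as UTC, analogous to dt.toUTCString() in the JavaScript above.
        Console.WriteLine(e.ToString("R")); // Fri, 28 Feb 2014 00:00:00 GMT
    }
}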